All Learning Resources

  • DCC Curation Webinar: Customising DMPonline

    Demonstration of DMPonline functionality, followed by a demonstration of how to customize DMPonline for your institution. DMPonline helps you to create, review, and share data management plans that meet institutional and funder requirements.

  • DMPonline: Recent Updates and New Functionality

    This presentation covers updates and enhancements to DMPonline, a tool that helps you create, review, and share data management plans that meet institutional and funder requirements. Six areas are discussed:

    1. Usability improvements
    2. Lifecycle and review
    3. API for systems integration (see the sketch after this list)
    4. Institutional enhancements 
    5. Locale-aware support
    6. Maintenance
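
    Item 3 above refers to the DMPonline API for systems integration. As a rough sketch of what a client integration might look like (the base URL, endpoint path, token scheme, and response fields below are assumptions for illustration, not the documented API), a script could request a list of plans and read the JSON response:

      # Hypothetical sketch of pulling plans from a DMPonline-style API.
      # The base URL, endpoint path, and auth header format are assumptions;
      # consult the DMPonline API documentation for the real interface.
      import requests

      BASE_URL = "https://dmponline.example.org/api/v1"   # placeholder host and path
      API_TOKEN = "your-api-token-here"                    # issued by the service

      def list_plans():
          """Request data management plans as JSON (assumed response shape)."""
          response = requests.get(
              f"{BASE_URL}/plans",
              headers={"Authorization": f"Bearer {API_TOKEN}",
                       "Accept": "application/json"},
              timeout=30,
          )
          response.raise_for_status()
          return response.json()

      if __name__ == "__main__":
          for plan in list_plans().get("items", []):
              print(plan.get("title"))
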
  • Organizing and Modeling Data

    This presentation is part of the World Geodetic System (WGS) Data Management Planning Course. It provides a brief overview of data management, databases, and data modeling. Presentation sections include:

    • Why manage data?
    • What is a database?
    • Information systems cycle and models
    • Database design in research
    • A database modeling exercise
    • Structured Query Language (SQL)
    • Tips and tricks/good practice

    Access this downloadable PowerPoint presentation from the "Data Management for Library Ph.D." course agenda, in the 13.15 - 14.15 time slot.
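
    As a small companion to the SQL and database-design material, the sketch below (table and column names are invented, not taken from the course) models a one-to-many relationship between projects and datasets using Python's built-in sqlite3 module:

      # Minimal illustration of relational modeling and SQL; all names are hypothetical.
      import sqlite3

      conn = sqlite3.connect(":memory:")   # throwaway in-memory database
      cur = conn.cursor()

      # One project can have many datasets (a one-to-many relationship).
      cur.executescript("""
      CREATE TABLE project (
          project_id INTEGER PRIMARY KEY,
          title      TEXT NOT NULL
      );
      CREATE TABLE dataset (
          dataset_id INTEGER PRIMARY KEY,
          project_id INTEGER NOT NULL REFERENCES project(project_id),
          name       TEXT NOT NULL,
          format     TEXT
      );
      """)

      cur.execute("INSERT INTO project (title) VALUES (?)", ("Coastal survey",))
      project_id = cur.lastrowid
      cur.executemany(
          "INSERT INTO dataset (project_id, name, format) VALUES (?, ?, ?)",
          [(project_id, "ctd_profiles", "csv"), (project_id, "bathymetry", "netcdf")],
      )

      # Join the two tables back together with a SQL query.
      for row in cur.execute(
          "SELECT p.title, d.name, d.format "
          "FROM project p JOIN dataset d ON d.project_id = p.project_id"
      ):
          print(row)
      conn.close()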

  • Providing and Using (Open) Biodiversity Data through the Infrastructure of the Global Biodiversity Information Facility (GBIF)

    Introduction to the Global Biodiversity Information Facility (GBIF), including lessons and achievements from the first ten years of using GBIF. This presentation was part of the Conference Connecting Data for Research held at VU University in Amsterdam. Topics include:

    • What is biodiversity?
    • Biodiversity data
    • History of GBIF
    • GBIF usage trends
    • Primary GBIF data
    • Using GBIF
    • Darwin Core (see the record sketch after this list)
    • Integrated Publishing Toolkit (IPT)
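
    As a small illustration of the Darwin Core standard listed above, the sketch below writes a single, invented occurrence record using a handful of common Darwin Core terms (real GBIF/IPT publishing normally involves many more terms packaged as a Darwin Core Archive):

      # A single, invented occurrence record expressed with common Darwin Core terms.
      import csv

      record = {
          "occurrenceID": "urn:example:occurrence:0001",   # placeholder identifier
          "basisOfRecord": "HumanObservation",
          "scientificName": "Delphinus delphis",
          "eventDate": "2012-06-15",
          "decimalLatitude": 52.37,
          "decimalLongitude": 4.89,
          "countryCode": "NL",
      }

      with open("occurrence.csv", "w", newline="", encoding="utf-8") as fh:
          writer = csv.DictWriter(fh, fieldnames=list(record))
          writer.writeheader()
          writer.writerow(record)
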
  • OpenEarth: A Flood of Dutch Coastal Data in Your Browser

    OpenEarth is a free and open source initiative to deal with data, models, and tools in earth science and engineering projects, currently mainly marine and coastal. This presentation was part of the Conference Connecting Data for Research held at VU University in Amsterdam. Presentation topics include:

    • What is OpenEarth?
    • Community
    • Philosophy
    • Features
  • Open Access Publishing: A User Perspective

    Part of the Embracing Data Management, Bridging the Gap Between Theory and Practice workshop, held in Brussels, this talk provides an introduction to open access publishing. Topics include:

    • Benefits
    • Philosophy
    • Requirements
    • Preprints and postprints
    • Tips
  • Modern Research Management Workshop

    Introduction to research data management (RDM). Topics include:

    • Data organization, storage, backup, security, and deposit 
    • Benefits of managing and sharing data
    • File formats
    • File naming
    • Data publishing
    • Data reuse
    • Open data
    • Sensitive data
    • Data management plans and the DMPonline tool
  • Data Management Tools

    An overview of research management tools:

    • Re3data to find repositories
    • FAIRsharing and RDA Data Standards Catalogue
    • DMPonline for writing data management plans
    • OpenAIRE for managing outputs and reporting results
  • How NOT to Share Your Data: Avoiding Data Horror Stories

    This presentation is designed to encourage best practice from researchers when sharing their data. It covers basic issues such as repositories, file formats, and cleaning spreadsheets. It was designed for researchers in the sciences who already have some basic awareness that data sharing has many benefits and is expected by many UK research funders. Topics include:

    • Where you should and should not share your data
    • What data should you include?
    • Choosing a file format
    • Spreadsheet use (see the sketch after this list)
    • How you should and should not describe your data
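
    As a companion to the spreadsheet and file-format points above, the sketch below (file and column handling invented for illustration; it assumes pandas and openpyxl are installed) tidies a spreadsheet and saves it in an open, plain-text format before sharing:

      # Illustrative only: tidy a spreadsheet and export to an open format (CSV).
      # The file and column names are invented placeholders.
      import pandas as pd

      df = pd.read_excel("field_measurements.xlsx")    # proprietary source format

      df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]  # consistent headers
      df = df.dropna(how="all")                        # drop completely empty rows
      df = df.drop_duplicates()                        # remove accidental duplicates

      # Plain CSV is widely readable and better suited to long-term sharing.
      df.to_csv("field_measurements_clean.csv", index=False)
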
  • Research Data Lifecycle

    Data often have a longer lifespan than the research project that creates them. Researchers may continue to work on data after funding has ceased, follow-up projects may analyse or add to the data, and data may be re-used by other researchers.

    Adhering to a well-defined research data lifecycle results in organised, well documented, preserved and shared data that advance scientific inquiry and increase opportunities for learning and innovation.

  • Bringing Research Data into the Library: Expanding the Horizons of Institutional Repositories

    The focus of library-managed institutional repositories has so far been on document-like items (published articles, preprints, theses, reports, working papers, etc.), but there is growing demand to expand their use into new genres such as scientific research datasets (sensor readings, genomics data, neuroimages, etc.). This webcast explains how institutional repositories are including this type of collection and what librarians need to know in order to manage such collections, and presents a few case studies from the MIT Libraries.

  • Marine GIS Applications (using QGIS)

    This course provides an in-depth overview of the application of Geographic Information Systems (GIS) to the marine environment using QGIS. All resources can be accessed from the provided URL. 

    Topic 1: Aims and Objectives

    • Provide an introduction to GIS for marine applications
    • Focus on some publicly available marine datasets
    • Show the potential applications of GIS for the marine environment

    Learning Outcomes:

    • Knowledge and understanding of GIS, spatial data, raster, and vector models
    • Core tasks involved in the GIS analysis process including data acquisition, management, manipulation and analysis, and presentation and output
    • Core functionality of QGIS Desktop and QGIS Browser
    • Creating and editing spatial data
    • Appreciation of coastal and marine GIS data applications

    Topic 2: Introduction to Marine GIS

    • A brief explanation of GIS
    • GIS components
    • GIS data models
    • Marine GIS applications
    • Spatial analysis

    Topic 3: Introduction to QGIS

    • Why QGIS?
    • Exercise: Introduction To QGIS

    Topic 4: View data in QGIS Desktop

    • QGIS provides several windows that are convenient for the user
    • Exercise

    Topic 5: Map Projections and Coordinate Systems

    • Geographic Coordinate Systems
    • Coordinate Systems And Map Projections
    • Video: Map projections 
    • Exercise

    Topic 6: Create Base Map in QGIS

    • Define the Area of Interest (AOI)
    • Exercise

    Topic 7: Creating Data Collection from the World Ocean Database

    • Exercise: Obtaining Marine Data from the World Ocean Database

    Topic 8: Introduction to Ocean Data View

    • Video: Ocean Data View (ODV) by Reiner Schlitzer
    • Exercise: Creating Data Collection from the World Ocean Database
    • Exercise: Export Marine Data from the Ocean Data View

    Topic 9: Working with Spreadsheet Data

    • Exercise: Adding Spreadsheet Data

    Topic 10: Edit Data in QGIS

    • Exercise

    Topic 11: Edit Data: Area of Interest and Analysis Mask

    • Exercise

    Topic 12: Interpolating surfaces

    • Map Interpolation
    • Interpolating Surfaces: Summary 
    • Exercise: Interpolate to Raster

    Topic 13: Rendering Raster Data

    • Exercise

    Topic 14: Raster Calculator

    • Exercise: Using the Raster Calculator
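
    The raster calculator exercise is essentially per-cell map algebra. Outside QGIS, the same idea can be sketched in Python with the GDAL bindings and NumPy (the file name, variable, and threshold below are placeholders):

      # Sketch of a raster calculation outside QGIS; file name and threshold are placeholders.
      import numpy as np
      from osgeo import gdal

      ds = gdal.Open("sea_surface_temperature.tif")    # hypothetical input raster
      band = ds.GetRasterBand(1)
      sst = band.ReadAsArray().astype(float)

      # Example per-cell calculation: flag cells warmer than 20 degrees C.
      warm_mask = np.where(sst > 20.0, 1, 0)

      print("warm cells:", int(warm_mask.sum()))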

    Topic 15: Working with NetCDF

    • Exercise
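
    NetCDF is a self-describing binary format widely used for gridded ocean and atmosphere data. A minimal way to inspect a file in Python uses the netCDF4 library, as sketched below (the file name and variable name are assumptions):

      # Inspect a NetCDF file and pull one variable into an array.
      # The file name and variable name are placeholders for illustration.
      from netCDF4 import Dataset

      with Dataset("ocean_temperature.nc") as ds:
          print(ds.dimensions.keys())    # e.g. time, depth, lat, lon
          print(ds.variables.keys())     # lists every variable in the file
          temperature = ds.variables["temperature"][:]   # read values as an array
          print(temperature.shape)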

    Topic 16: Plotting Vector Arrows from U and V Component Grids

    • Marine Geospatial Ecology Tools (MGET)

    Topic 17: Downloading species observations from the Ocean Biogeographic Information System (OBIS)

    • Exercise: Downloading OBIS Data in CSV Format

    Topic 18: Creating KML files for Google Earth

    • Example KML document
    • Exercise
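
    KML is an XML dialect, so a minimal Google Earth file can be produced with the Python standard library alone. The placemark below is an invented example rather than part of the course materials:

      # Write a minimal KML file containing one placemark (coordinates are invented).
      import xml.etree.ElementTree as ET

      NS = "http://www.opengis.net/kml/2.2"
      ET.register_namespace("", NS)

      kml = ET.Element(f"{{{NS}}}kml")
      doc = ET.SubElement(kml, f"{{{NS}}}Document")
      placemark = ET.SubElement(doc, f"{{{NS}}}Placemark")
      ET.SubElement(placemark, f"{{{NS}}}name").text = "Sample station"
      point = ET.SubElement(placemark, f"{{{NS}}}Point")
      # KML coordinates are longitude,latitude[,altitude]
      ET.SubElement(point, f"{{{NS}}}coordinates").text = "4.89,52.37,0"

      ET.ElementTree(kml).write("station.kml", xml_declaration=True, encoding="utf-8")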

    Topic 19: Publication Quality Maps

    • Exercise: Create Publication Quality Maps
  • Metadata Recommendations, Dialects, Evaluation & Improvement

    This webinar describes a flexible, multifaceted approach to evaluating and improving metadata collections (in multiple dialects).

    The initial goal of many metadata efforts was discoverable data, but like many other elements of data management, the metadata landscape has evolved considerably over the last decade to include new use cases and requirements. The approach that has been developed includes web-based tools for understanding and comparing recommendations and dialects; flexible comparisons of the completeness of metadata collections (in multiple dialects) with respect to particular recommendations; evaluation of the completeness of single metadata records; identification of specific metadata improvement needs; and an open forum for sharing information, experiences, and examples. Recommendations for metadata requirements and metadata improvement needs are discussed and shared.

  • FAIR Data in Trustworthy Data Repositories

    Everybody wants to play FAIR, but how do we put the principles into practice?

    There is a growing demand for quality criteria for research datasets. The presenters argue that the DSA (Data Seal of Approval for data repositories) and the FAIR principles come as close as possible to providing quality criteria for research data. They do this not by making value judgments about the content of datasets, but by qualifying fitness for data reuse in an impartial and measurable way. By bringing the ideas of the DSA and FAIR together, we will be able to offer an operationalization that can be implemented in any certified Trustworthy Digital Repository.

    The FAIR Guiding Principles (Findable, Accessible, Interoperable and Reusable) were formulated in 2014. The well-chosen FAIR acronym is highly attractive: it is one of those ideas that almost automatically sticks in your mind once you have heard it. In a relatively short time, the FAIR data principles have been adopted by many stakeholder groups, including research funders. The FAIR principles are remarkably similar to the underlying principles of the DSA (2005): the data can be found on the Internet, are accessible (clear rights and licenses), are in a usable format, are reliable, and are identified in a unique and persistent way so that they can be referred to. Essentially, the DSA presents quality criteria for digital repositories, whereas the FAIR principles target individual datasets. In this webinar the two sets of principles will be discussed and compared, and a tangible operationalization will be presented.

  • How to Import Works into Your ORCID Record Using a Search & Link Wizard

    Several ORCID (Open Researcher and Contributor ID) member organizations have built search and link tools that allow you to import information about publications and other works into your ORCID record from other databases. The linking process can begin on the ORCID site or at the organization's website. Note that ORCID does not store preprints or content. Rather, ORCID requires that works data added to your record include a link that allows users to easily navigate to the source document.


  • ORCID Registry: How to Group Works on Your ORCID Record

    Learn how and why to group works that have been added to your ORCID (Open Researcher and Contributor ID) record from different sources, so that they are displayed together on your record.

  • Getting Started with ORCID & ORCID API

    This presentation introduces ORCID (Open Researcher and Contributor ID), ORCID features, and ORCID's approach to API development.
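
    As a small illustration of the API side of this introduction, a client can fetch the works on a public ORCID record as JSON. The sketch below uses the public API host and the well-known example iD; treat the exact path and response shape as assumptions and check the ORCID API documentation:

      # Fetch the public works summary for an ORCID iD as JSON.
      # The host/path and response shape should be verified against ORCID's API docs.
      import requests

      orcid_id = "0000-0002-1825-0097"   # ORCID's published example iD
      url = f"https://pub.orcid.org/v3.0/{orcid_id}/works"

      response = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
      response.raise_for_status()
      works = response.json()

      for group in works.get("group", []):
          for summary in group.get("work-summary", []):
              title = summary.get("title", {}).get("title", {}).get("value")
              print(title)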

  • DMP Assistant: bilingual tool for preparing data management plans (DMPs)

    The DMP Assistant is a bilingual tool to assist in the preparation of a Data Management Plan (DMP). This tool, which is based on international standards and best practices in data management, guides the researcher step by step through the key questions needed to develop their plan. DMP Assistant is powered by an open source application called DMPonline, which is developed by the Digital Curation Centre (DCC). Site registration is required. Data management planning templates are available in the DMP Assistant after registration and sign-in.

  • Data Management for Clinical Research MOOC

    This course presents critical concepts and practical methods to support planning, collection, storage, and dissemination of data in clinical research.
     
    Understanding and implementing solid data management principles is critical for any scientific domain. Regardless of your current (or anticipated) role in the research enterprise, a strong working knowledge and skill set in data management principles and practice will increase your productivity and improve your science. The instructors' goal is to use these modules to help you learn and practice this skill set.

    This course assumes very little current knowledge of technology other than how to operate a web browser. The course will focus on practical lessons, short quizzes, and hands-on exercises as we explore together best practices for data management.

    The six modules cover these topics:

    • Research Data Collection Strategy
    • Electronic Data Capture Fundamentals
    • Planning a Data Strategy for a Prospective Study
    • Practicing What We've Learned: Implementation
    • Post-Study Activities and Other Considerations
    • Data Collection with Surveys
  • Overview of Interdisciplinary Earth Data Alliance (IEDA) Data Management Resources

    In the digital era, documenting and sharing our scientific data is growing increasingly important as an integral part of the scientific process. Data Management not only makes our data resources available for others to build upon, but it also enables data syntheses and new analyses that hold the potential for significant scientific advancement. Effective data management begins during the planning stages of a project and continues throughout the research process from field and/or laboratory work, through analysis, and culminating with scientific literature and data publication. By planning ahead, and following some best practices along the way, the process of data management can be simple and relatively low-effort, enabling rapid contribution and publication of data in the appropriate data systems at the conclusion of a project.

    IEDA offers a variety of tools to support investigators along the full continuum of their data management efforts. Links to these tools and resources are available from the landing page for this resource.

    Pre-Award

    • IEDA Data Discovery Tools
    • IEDA Data Management Plan (DMP) Tool

    Research & Analysis

    • Register sample-based data sets and samples 
      • Register sample metadata and get a unique sample identifier (IGSN)
      • Download Templates for Analytical Data
      • Learn about contributing Analytical Data to the EarthChem Library
    • Register sensor-based data sets and samples 
      • Contribute sensor data files (e.g. geophysical data) and supporting metadata to MGDS
    • IEDA Analysis Tools
      • GeoMapApp earth science exploration and visualization application 
        • Analyze your own geospatial data within the context of other available datasets

    Synthesis & Publication

    • Register final data sets with IEDA 
    • Publish your data 
      • Publishing your data with a DOI ensures that it can be directly referenced in your paper and cited by others.
    • IEDA Data Compliance Reporting (DCR) Tool 
      • Rapidly generate a Data Compliance Report (DCR) based on your NSF award number to demonstrate that your data are registered with IEDA systems and you are compliant with NSF Data Policies.
  • Simplifying the Reuse and Interoperability of Hydrologic Data Sets and Models with Semantic Metadata that is Human-Readable & Machine-Actionable

    This slide set discusses the big, generic problem facing geoscientists today that stems from a lack of interoperability across a huge number of heterogeneous resources, and how to solve it. Practical solutions to tame the inherent heterogeneity involve the collection of standardized, "deep-description" metadata for resources that are then wrapped with standardized APIs that provide callers with access to both the data and the metadata.

  • ORNL DAAC Data Recipes

    A collection of tutorials, called "data recipes," that describe how to use Earth science data from NASA's Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) with easily available tools and commonly used formats, focusing on biogeochemical dynamics data. These tutorials are available to assist those wishing to learn or teach how to obtain and view these data.

  • EarthChem Library: How to Complete a Data Submission Template

    Learn how to complete a data submission template for the EarthChem Library (www.earthchem.org/library). You can access existing templates at www.earthchem.org/data/templates. If you do not see a template appropriate for your data type, please contact EarthChem at [email protected].

  • iData Tutorial

    A brief tutorial that shows how to upload, preview, and publish data from iData. To use the accompanying My Geo Hub tutorial exercises, go to https://mygeohub.org/resources/1217. Note that the number before each step is the time in the YouTube video at which that step is demonstrated. Also, note that the video does not contain audio content.

  • EarthChem Library: Submission Guidelines

    Learn general guidelines for data submission to the EarthChem Library (www.earthchem.org/library), including the data types and formats accepted and additional best practices for submission.

  • How to Manage Your Samples in MySESAR

    The System for Earth Sample Registration (SESAR) operates a registry that distributes the International Geo Sample Number (IGSN). SESAR catalogs and preserves sample metadata profiles and provides access to the sample catalog via the Global Sample Search.

    MySESAR provides a private working space in the System for Earth Sample Registration. This tutorial introduces how to manage samples in MySESAR, including how to search the sample catalog, how to view and edit samples, how to print labels, how to group samples, and how to transfer ownership of samples. For details relating to sample registration, please see the tutorials for individual and batch sample registration here: http://www.geosamples.org/help/registration.

    MySESAR allows you to:

    • obtain IGSNs for your samples by registering them with SESAR.
    • register samples one at a time by entering metadata into a web form.
    • register multiple samples by uploading metadata in a SESAR spreadsheet form.
    • generate customized SESAR spreadsheet forms.
    • view lists of samples that you registered.
    • edit sample metadata profiles.
    • upload images and other documents such as field notes, maps, or links to publications to a sample profile.
    • restrict access to metadata profiles of your samples.
    • transfer ownership of a sample to another SESAR user.
  • GeoBuilder for Exploring Geospatial Data

    A step-by-step tutorial for GeoBuilder. The GeoBuilder tool provides a wizard-type interface that guides users through several steps for loading, selecting, configuring, and analyzing geo-referenced tabular data. As a result, the data is presented on an OpenStreetMap with customized annotation, station/site popups, and dynamic filtering and plotting. The tool can be used in two ways: first, an end user can use it to dynamically load and explore a CSV file of interest; second, a data owner can use it to build a customized view of the data they want to share, save the configuration, and publish the data, configuration, and viewer as a new “tool” specifically for their data. With this, any scientist can develop an interactive, web-enabled GIS interface to share their data within minutes, compared to the past, when they would have needed to hire a web developer and spend months to achieve the same result.

    Note that My Geo Hub registration is required to access the GeoBuilder tool.
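
    GeoBuilder itself is a hosted, wizard-driven tool, but the underlying idea described above (loading geo-referenced tabular data and presenting it on an OpenStreetMap basemap) can be sketched generically in Python with pandas and folium. This is not GeoBuilder code; the file and column names are invented:

      # Generic illustration (not GeoBuilder): plot geo-referenced CSV rows on an
      # OpenStreetMap basemap. File and column names are invented for this sketch.
      import folium
      import pandas as pd

      stations = pd.read_csv("stations.csv")   # expects latitude/longitude/name columns

      m = folium.Map(location=[stations["latitude"].mean(),
                               stations["longitude"].mean()],
                     zoom_start=6)             # folium uses OpenStreetMap tiles by default

      for _, row in stations.iterrows():
          folium.Marker([row["latitude"], row["longitude"]],
                        popup=str(row["name"])).add_to(m)

      m.save("stations_map.html")              # open the HTML file in a browser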

  • GeoBuilder - How to Share My Session

    A brief tutorial that shows how to share a GeoBuilder session. The GeoBuilder tool provides a wizard-type interface that guides users through several steps for loading, selecting, configuring, and analyzing geo-referenced tabular data. To use the accompanying My Geo Hub tutorial exercises, go to https://mygeohub.org/resources/1219. Note that the number before each step is the time in the YouTube video at which that step is demonstrated. Also, note that the video does not contain audio content.

    For more information about GeoBuilder, go to https://mygeohub.org/resources/geobuilder.

  • Introduction to Lidar

    This course provides an overview of Lidar technology; data collection workflow; data products, formats, and metadata; Lidar and vegetation; QA/QC, artifacts, and issues to keep in mind; and DEM generation from Lidar point cloud data.

  • Genomics Curriculum

    The focus of this workshop is on working with genomics data and on data management and analysis for genomics research. It covers best practices for the organization of bioinformatics projects and data, use of command-line utilities, use of command-line tools to analyze sequence quality and perform variant calling, and connecting to and using cloud computing.
    Lessons:

    • Project organization and management
    • Introduction to the command line
    • Data wrangling and processing
    • Introduction to cloud computing for genomics
    • Data analysis and visualization in R *beta*
  • New England Collaborative Data Management Curriculum

    NECDMC is an instructional tool for teaching data management best practices to undergraduates, graduate students, and researchers in the health sciences, sciences, and engineering disciplines. Each of the curriculum’s seven online instructional modules aligns with the National Science Foundation’s data management plan recommendations and addresses universal data management challenges. Included in the curriculum is a collection of actual research cases that provides a discipline specific context to the content of the instructional modules. These cases come from a range of research settings such as clinical research, biomedical labs, an engineering project, and a qualitative behavioral health study. Additional research cases will be added to the collection on an ongoing basis. Each of the modules can be taught as a stand-alone class or as part of a series of classes. Instructors are welcome to customize the content of the instructional modules to meet the learning needs of their students and the policies and resources at their institutions.

  • Imaging and Analyzing Southern California’s Active Faults with High-Resolution Lidar Topography

    Over the past 5+ years, many of Southern California’s active faults have been scanned with airborne lidar through various community and PI-data collection efforts (e.g., the B4 Project, EarthScope, and the post-El Mayor–Cucapah earthquake). All of these community datasets are publicly available (via OpenTopography: https://www.opentopography.org) and powerfully depict the effect of repeated slip along these active faults as well as surface processes in a range of climatic regimes. These datasets are of great interest to the Southern California Earthquake Center (SCEC) research and greater academic communities and have already yielded important new insights into earthquake processes in southern California.

    This is a short course on LiDAR technology, data processing, and analysis techniques. The foci of the course are fault trace and geomorphic mapping applications, integration with other geospatial data, and data visualization and analysis approaches. Course materials include slide presentations, video demonstrations, and text-based software application tutorials.

  • GODAN Webinar Series

    A series of webinars organised by the GODAN Working Group on Capacity Development in collaboration with CTA. The Global Open Data for Agriculture and Nutrition (GODAN) supports the proactive sharing of open data to make information about agriculture and nutrition available, accessible and usable to deal with the urgent challenge of ensuring world food security. A core principle behind GODAN is that a solution to Zero Hunger lies within existing, but often unavailable, agriculture and nutrition data. At the GODAN Summit in September 2016, GODAN launched a new Working Group on Capacity Development. More info here: https://www.godan.info/news/leveraging-power-webinars-support-open-data-...

  • Sustaining Science Gateways—Finding your "best fit" model

    Digital projects (science gateways, data repositories, educational websites, and others) have a few things in common. They can deliver a great deal of value to users by widely sharing sophisticated tools, large data sets, or access to computing capacity among those in the academic sector who really need them to advance their work. But they share something else in common, too: they are devilishly hard to run in a way that permits ongoing growth and expansion.

    In this webinar, Nancy Maron, a lead instructor in the Science Gateways Bootcamp, introduces participants to the key elements of sustainability planning – the building blocks for developing Science Gateways that have the best chance for ongoing growth.

    The webinar will introduce sustainability models and share some key tactics for identifying the models that are most likely to work for your gateway. We will touch upon funding models, the competitive environment, and audience assessment, to show how these need to be considered in tandem with any plan.

  • Working in the R Ecosystem: Building Applications & Content for Your Gateway

    The R programming language first appeared on the scene in the 1990s as an open source environment for statistical modeling and data analysis. Throughout the last decade, interest in the language has grown alongside researchers' abilities to collect and store larger amounts of data. Today, scientific and business decisions increasingly rely on the interpretation of this data. New libraries for processing data and communicating results are being debuted in ways that break down traditional language silos. Technologies like interactive documents, HTML-based applications, and RESTful APIs have exposed capability gaps between R's interfaces to numerical analysis libraries and its built-in abilities for graphical display. In this webinar, Derrick Kearney surveys several R libraries that are helping people bridge the gap between their R-based analysis and the numerous ways people represent results today, all of which can be published on your science gateway, thus extending your research impact to others in a reproducible way.

  • Webinar: National Data Service (NDS) Labs Workbench

    The growing size and complexity of high-value scientific datasets are pushing the boundaries of traditional models of data access and discovery. Many large datasets are only accessible through the systems on which they were created or require specialized software or computational resources for re-use. In response to this growing need, the National Data Service (NDS) consortium is developing the Labs Workbench platform, a scalable, web-based system intended to support turn-key deployment of encapsulated data management and analysis tools for exploratory analysis and development on cloud resources that are physically "near" the data and associated high-performance computing (HPC) systems. The Labs Workbench may complement existing science gateways by enabling exploratory analysis of data and the ability for users to deploy and share their own tools. The Labs Workbench platform has also been used to support a variety of training and workshop environments.

    This webinar includes a demonstration of the Labs Workbench platform and a discussion of several key use cases. A presentation of findings from the recent Workshop on Container Based Analysis Environments for Research Data Access and Computing further highlights compatibilities between science gateways and interactive analysis platforms such as Labs Workbench.

  • Facing the data challenge: Developing data policy and services

    Overview of research data management (RDM), who is responsible for RDM, the components of a research data service, and policy and research activity roadmap development in compliance with Engineering and Physical Sciences Research Council (EPSRC) funding expectations in the UK.

  • Research Data Management and Sharing MOOC

    This course will provide learners with an introduction to research data management and sharing. After completing this course, learners will understand the diversity of data and their management needs across the research data lifecycle, be able to identify the components of good data management plans and be familiar with best practices for working with data including the organization, documentation, and storage and security of data. Learners will also understand the impetus and importance of archiving and sharing data as well as how to assess the trustworthiness of repositories.

    Note: The course is free to access. However, if you pay for the course, you will have access to all of the features and content you need to earn a Course Certificate from Coursera. If you complete the course successfully, your electronic Certificate will be added to your Coursera Accomplishments page - from there, you can print your Certificate or add it to your LinkedIn profile. Note that the Course Certificate does not represent official academic credit from the partner institution offering the course.
    Also, note that the course is offered on a regular basis. For information about the next enrollment, go to the provided URL.

  • GeoMapApp Tutorials

    The video tutorials are available from the home page under the general topics listed below, and also on the GeoMapApp YouTube channel at:  https://www.youtube.com/user/GeoMapApp. The tutorials demonstrate how to perform common tasks with GeoMapApp. Full information on the functions is available at the provided web address. General topics include:
     - Introduction
     - Import Your Own Data
     - Analyze Data
     - Working with Gridded Data
     - Available Data and examples
     - Portals (including, for example, Ocean Floor Drilling, Multibeam Swath Bathymetry Data, Seismic Data, and Earthquake Data)
     - In-Depth Webinars

    GeoMapApp is an earth science exploration and visualization application that is continually being expanded as part of the Marine Geoscience Data System (MGDS) at the Lamont-Doherty Earth Observatory of Columbia University. The application provides direct access to the Global Multi-Resolution Topography (GMRT) compilation that hosts high resolution (~100 m node spacing) bathymetry from multibeam data for ocean areas and ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) and NED (National Elevation Dataset) topography datasets for the global land masses.  

  • DataONE Data Management Module 03: Data Management Planning

    Data management planning is the starting point in the data life cycle. Creating a formal document that outlines what you will do with the data during and after the completion of research helps to ensure that the data is safe for current and future use. This 30-40 minute lesson describes the benefits of a data management plan (DMP), outlines the components of a DMP, details tools for creating a DMP, provides NSF DMP information, and demonstrates the use of an example DMP. It includes a downloadable presentation (PPT or PDF) with a supporting hands-on exercise and handout.