All Learning Resources

  • Modern Research Management Workshop

    Introduction to research data management (RDM). Topics include:

    • Data organization, storage, backup, security, and deposit 
    • Benefits of managing and sharing data
    • File formats
    • File naming
    • Data publishing
    • Data reuse
    • Open data
    • Sensitive data
    • Data management plans and the DMPonline tool
  • Data Management Tools

    An overview of research data management tools:

    • Re3data to find repositories
    • FAIRsharing and RDA Data Standards Catalogue
    • DMPonline for writing data management plans
    • OpenAIRE for managing outputs and reporting results
  • How NOT to Share Your Data: Avoiding Data Horror Stories

    This presentation is designed to encourage best practice from researchers when sharing their data. It covers basic issues such as repositories, file formats, and cleaning spreadsheets. It was designed for researchers in the sciences who already have some basic awareness that data sharing has many benefits and is expected by many UK research funders. Topics include:

    • Where you should and should not share your data
    • What data you should include
    • Choosing a file format
    • Spreadsheet use
    • How you should and should not describe your data
  • Dash: Making Data Sharing Easier

    Dash is a self-service tool for researchers to select, describe, identify, upload, update, and share their research data. 

    For more information about Dash go to https://www.cdlib.org/services/uc3/dash.html.

  • Research Data Lifecycle

    Data often have a longer lifespan than the research project that creates them. Researchers may continue to work on data after funding has ceased, follow-up projects may analyse or add to the data, and data may be re-used by other researchers.

    Adhering to a well-defined research data lifecycle results in organised, well documented, preserved and shared data that advance scientific inquiry and increase opportunities for learning and innovation.

  • Bringing Research Data into the Library: Expanding the Horizons of Institutional Repositories

    The focus of library-managed institutional repositories has so far been on document-like items (published articles, preprints, theses, reports, working papers, etc.) but there is growing demand to expand their use into new genres such as scientific research datasets (sensor readings, genomics data, neuroimages, etc.). This webcast explains how institutional repositories are including this type of collection, what librarians need to know in order to manage such collections, and a few case studies from the MIT Libraries.

  • Marine GIS Applications (using QGIS)

    This course provides an in-depth overview of the application of Geographic Information Systems (GIS) to the marine environment using QGIS. All resources can be accessed from the provided URL. 

    Topic 1:
    Aims and Objectives:

    • Provide an introduction to GIS for marine applications
    • Focus on some publicly available marine datasets
    • Show the potential applications of GIS for the marine environment

    Learning Outcomes:

    • Knowledge and understanding of GIS, spatial data, raster, and vector models
    • Core tasks involved in the GIS analysis process including data acquisition, management, manipulation and analysis, and presentation and output
    • Core functionality of QGIS Desktop and QGIS Browser
    • Creating and editing spatial data
    • Appreciation of coastal and marine GIS data applications

    Topic 2: Introduction to Marine GIS

    • A brief explanation of GIS
    • GIS components
    • GIS data models
    • Marine GIS applications
    • Spatial analysis

    Topic 3: Introduction to QGIS

    • Why QGIS?
    • Exercise: Introduction To QGIS

    Topic 4: View data in QGIS Desktop

    • An overview of the several windows that QGIS Desktop provides for viewing data
    • Exercise

    Topic 5: Map Projections and Coordinate Systems

    • Geographic Coordinate Systems
    • Coordinate Systems And Map Projections
    • Video: Map projections 
    • Exercise
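    The projection concepts in Topic 5 can also be explored outside QGIS. The minimal Python sketch below reprojects a single longitude/latitude point into a projected coordinate system using the pyproj library; the EPSG codes and sample coordinates are illustrative assumptions, not part of the course materials.

      # Minimal sketch: reproject a lon/lat point (WGS 84, EPSG:4326)
      # to Web Mercator (EPSG:3857) with pyproj.
      from pyproj import Transformer

      # always_xy=True keeps the (longitude, latitude) axis order.
      transformer = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)

      lon, lat = -9.14, 38.71  # illustrative coastal point near Lisbon
      x, y = transformer.transform(lon, lat)
      print(f"Projected coordinates: x={x:.1f} m, y={y:.1f} m")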

    Topic 6: Create Base Map in QGIS

    • Define the Area of Interest (AOI)
    • Exercise

    Topic 7: Creating a Data Collection from the World Ocean Database

    • Exercise: Obtaining Marine Data from the World Ocean Database

    Topic 8: Introduction to Ocean Data View

    • Video: Ocean Data View (ODV) by Reiner Schlitzer
    • Exercise: Creating a Data Collection from the World Ocean Database
    • Exercise: Export Marine Data from the Ocean Data View

    Topic 9: Working with Spreadsheet Data

    • Exercise: Adding Spreadsheet Data

    Topic 10: Edit Data in QGIS

    • Exercise

    Topic 11: Edit Data: Area of Interest and Analysis Mask

    • Exercise

    Topic 12: Interpolating surfaces

    • Map Interpolation
    • Interpolating Surfaces: Summary 
    • Exercise: Interpolate to Raster
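    The QGIS exercise above uses built-in interpolation tools; purely to illustrate the underlying idea, the short Python sketch below computes a crude inverse-distance-weighted (IDW) estimate at one location from a handful of invented sample points (all coordinates and values are hypothetical).

      # Crude inverse-distance-weighted (IDW) interpolation at a single point.
      import numpy as np

      # Hypothetical sample locations (x, y) and measured values.
      points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
      values = np.array([10.0, 12.0, 11.0, 14.0])

      target = np.array([0.4, 0.6])  # location to estimate
      power = 2.0                    # IDW power parameter

      dists = np.linalg.norm(points - target, axis=1)
      weights = 1.0 / dists ** power  # closer points get larger weights
      estimate = np.sum(weights * values) / np.sum(weights)
      print(f"IDW estimate at {target.tolist()}: {estimate:.2f}")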

    Topic 13: Rendering Raster Data

    • Exercise

    Topic 14: Raster Calculator

    • Exercise: Using the Raster Calculator

    Topic 15: Working with NetCDF

    • Exercise
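    Topic 15 handles NetCDF within QGIS; for readers who also want to inspect a NetCDF file programmatically, the sketch below uses the netCDF4 Python library (the file name and variable name are hypothetical placeholders).

      # Minimal sketch: list the variables in a NetCDF file and read one of them.
      from netCDF4 import Dataset

      with Dataset("ocean_temperature.nc") as ds:  # hypothetical file name
          print(list(ds.variables))                # see what the file contains
          sst = ds.variables["sst"][:]             # hypothetical variable name
          print("Shape:", sst.shape, "Mean:", sst.mean())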

    Topic 16: Plotting Vector Arrows from U and V Component Grids

    • Marine Geospatial Ecology Tools (MGET)

    Topic 17: Downloading species observations from the Ocean Biogeographic Information System (OBIS)

    • Exercise: Downloading OBIS Data in CSV Format
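    The exercise above downloads OBIS records as CSV through the web interface. As a rough alternative sketch (assuming the OBIS v3 REST endpoint and its scientificname/size parameters; check https://api.obis.org for the current documentation), a similar query can be made from Python:

      # Rough sketch: fetch a few species occurrence records from the OBIS API.
      # The endpoint and parameter names are assumptions about the v3 API.
      import requests

      resp = requests.get(
          "https://api.obis.org/v3/occurrence",
          params={"scientificname": "Mola mola", "size": 10},
          timeout=30,
      )
      resp.raise_for_status()
      for rec in resp.json().get("results", []):
          print(rec.get("scientificName"), rec.get("decimalLongitude"), rec.get("decimalLatitude"))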

    Topic 18: Creating KML files for Google Earth

    • Example KML document
    • Exercise

    Topic 19: Publication Quality Maps

    • Exercise: Create Publication Quality Maps
  • Quality Management System Essentials for IODE National Oceanographic Data Centres (NODC) and Associate Data Units (ADU)

    Course overview

    The International Oceanographic Data and Information Exchange (IODE) maintains a global network of National Oceanographic Data Centres (NODC) and Associate Data Units (ADU) responsible for the collection, quality control, archiving, and online publication of many millions of ocean observations. The concept of quality management has become increasingly significant for these centres to meet national and international competency standards for delivery of data products and services. The IODE Quality Management Framework encourages NODCs and ADUs to implement a quality management system, which can lead to accreditation.

    This workshop provides an introduction for NODCs and ADUs involved in the development, implementation, and management of a Quality Management System based on ISO 9001:2015.

    Aims and objectives

    • To introduce the IODE Quality Management Framework
    • To introduce the ISO 9000 series of standards
    • To provide a description of a Quality Management System
    • To describe the importance of quality management for oceanographic data
    • To describe the accreditation process for NODCs and ADUs

    Note that the exercises are no longer accessible.

    Topics include:

    • Introduction to Quality Management Systems
    • QMS Implementation in Meteorological Services
    • Introduction to ISO standards
    • Understanding ISO 9001:2015
      • Overview
      • ISO 9001:2015 Clause 4. Context of the Organization
      • ISO 9001:2015 Clause 5. Leadership
      • ISO 9001:2015 Clause 6. Planning
      • ISO 9001:2015 Clause 7. Support
      • ISO 9001:2015 Clause 8. Operation
      • ISO 9001:2015 Clause 9. Performance Evaluation
      • ISO 9001:2015 Clause 10. Improvement
    • Developing a quality system manual
    • Experiences and lessons learned from implementing a QMS: SISMER
    • Implementing the Quality Management System
    • IODE Quality Management Framework and Accreditation
  • Administración de Datos Biogeográficos Marinos (Contribuyendo al Uso de OBIS) (2016)

    The course provides an introduction to the Ocean Biogeographic Information System (OBIS). It includes best practices in the management of marine biogeographic data, publication of data for free access (IPT), access to data, organization, analysis, and visualization. 

    Goals:

    • Expand the network of OBIS collaborators.
    • Improve the quality of marine biogeographic data.
    • Increase knowledge of international standards and best practices related to marine biogeographic data.
    • Increase the amount of freely accessible data published through OBIS and its OBIS nodes.
    • Increase the use of OBIS data for science, species conservation, and area-based management applications.

    There are four modules consisting of Spanish language slide presentations and videos:

    • MODULE 1 - General concepts: introduction to IOC, IODE, OTGA, and OBIS, and to related resources such as WoRMS, Marine Regions, the Darwin Core biodiversity data standard, and metadata
    • MODULE 2 - Data Quality Control Procedures
    • MODULE 3 - Best practices in the management and policy of marine biogeographic data; access, organization, analysis, and visualization of OBIS data
    • MODULE 4 - Publication of data for free access (Integrated Publishing Toolkit, IPT)
  • Marine Biogeographic Data Management (Contributing and Using Ocean Biogeographic Information System) (2015)

    The course provided an introduction to the Ocean Biogeographic Information System (OBIS). This includes best practices in marine biogeographic data management, data publication, data access, data analysis, and data visualization. Content consists of slide presentations and videos.

    Aims and Objectives

    • Expand the OBIS network of collaborators
    • Improve marine biogeographic data quality
    • Increase awareness of international standards and best practices related to marine biogeographic data
    • Increase the amount of open access data published through OBIS and its OBIS nodes
    • Increase the use of data from OBIS for science, species conservation, and area-based management applications

    Learning Outcomes

    • Knowledge and understanding of OBIS structure, mission, and objectives
    • Installation and management of IPT
    • Use of Darwin Core standards for species occurrence records, taxonomy, event/sample records, and additional biological and environmental parameters (a minimal example record is sketched below)
    • Data quality control tools
    • Publishing data through IPT and contributing datasets to OBIS
    • Use of OBIS data access (SQL, web service, API/R)
    • Data visualization tools (ArcGIS Online, CartoDB, QGIS, …)
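    To make the Darwin Core outcome above concrete, the sketch below builds one hypothetical species occurrence record using a few widely used Darwin Core terms and writes it to a CSV file; the values are invented for illustration only.

      # One hypothetical Darwin Core occurrence record written to CSV.
      import csv

      record = {
          "occurrenceID": "urn:example:occ:0001",  # invented identifier
          "scientificName": "Mola mola",
          "eventDate": "2015-06-01",
          "decimalLatitude": 41.2,
          "decimalLongitude": -9.5,
          "basisOfRecord": "HumanObservation",
      }

      with open("occurrence.csv", "w", newline="") as f:
          writer = csv.DictWriter(f, fieldnames=list(record))
          writer.writeheader()
          writer.writerow(record)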

    Target Audience

    • Marine data managers
    • Staff of NODCs or ADUs/OBIS nodes working with marine biodiversity data
    • Principal Investigators of major marine biodiversity expeditions
    • National marine biodiversity focal points

    Sections 

    • Introductions to IOC, IODE, OTGA, and OBIS
    • Biodiversity Data Standards
    • Data Quality Control Procedures
    • Data Access and Visualisation
    • Social Aspects of Data Management
  • Research Data Management

    Marine information managers are increasingly seen as major contributors to research data management (RDM) activities in general and in the design of research data services (RDS) in particular. They promote research by providing services for storage, discovery, and access, and they liaise and partner with researchers and data centers to foster an interoperable infrastructure for these services.

    The series of units within this training course recognizes the potential contributions that librarians/information managers can offer and hence the need to develop their skills in the research data management process. Course materials consist of slide presentations and student activities. Topics include:

    • Data and information management in International Indian Ocean Expedition-2 (IIOE-2)
    • Open science data
    • Research data and publication lifecycles
    • Research data organization and standards
    • Data management plans
    • Data publication and data citation
    • Access to research data
    • Management of sensitive data
    • Repositories for data management
    • Data management resources
  • Making Research Data Available

    There is a growing awareness of the importance of research data. Elsevier is committed to encouraging and supporting researchers who want to store, share, discover and reuse data. To this end, Elsevier has set up several initiatives that allow authors to make their data available when they publish with Elsevier. The webinars in the collection (located on the bottom half of the web page) cover:

    • Ways for researchers to store, share, discover, and use data
    • How to create a good research data management plan  
    • Data Citation: How can you as a researcher benefit from citing data? 
  • Metadata Recommendations, Dialects, Evaluation & Improvement

    This webinar describes a flexible, multifaceted approach to evaluating and improving metadata collections (in multiple dialects).

    The initial goal of many metadata efforts was discoverable data, but, like many other elements of data management, the metadata landscape has evolved considerably over the last decade to include new use cases and requirements. The approach that has been developed includes web-based tools for understanding and comparing recommendations and dialects; flexible comparisons of the completeness of metadata collections (in multiple dialects) with respect to particular recommendations; evaluation of the completeness of single metadata records; identification of specific metadata improvement needs; and an open forum for sharing information, experiences, and examples. Recommendations for metadata requirements and metadata improvement needs are discussed and shared.
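    As a very rough illustration of the kind of completeness check described above (the element names and the recommendation are invented for this sketch, not those used by the presenters), a metadata record can be scored against a list of required elements:

      # Toy completeness check: what fraction of a recommendation's required
      # elements are present and non-empty in a metadata record?
      def completeness(record, required):
          present = [name for name in required if record.get(name)]
          return len(present) / len(required)

      # Hypothetical recommendation and record, for illustration only.
      required_elements = ["title", "abstract", "keywords", "spatial_extent", "contact"]
      record = {"title": "Sea surface temperature, 2010-2015", "abstract": "...", "contact": ""}

      print(f"Completeness: {completeness(record, required_elements):.0%}")  # 40%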

  • FAIR Data in Trustworthy Data Repositories

    Everybody wants to play FAIR, but how do we put the principles into practice?

    There is a growing demand for quality criteria for research datasets. The presenters argue that the DSA (Data Seal of Approval for data repositories) and FAIR principles get as close as possible to giving quality criteria for research data. They do not do this by trying to make value judgments about the content of datasets, but rather by qualifying the fitness for data reuse in an impartial and measurable way. By bringing the ideas of the DSA and FAIR together, we will be able to offer an operationalization that can be implemented in any certified Trustworthy Digital Repository.

    In 2014 the FAIR Guiding Principles (Findable, Accessible, Interoperable and Reusable) were formulated. The well-chosen FAIR acronym is highly attractive: it is one of these ideas that almost automatically get stuck in your mind once you have heard it. In a relatively short term, the FAIR data principles have been adopted by many stakeholder groups, including research funders. The FAIR principles are remarkably similar to the underlying principles of DSA (2005): the data can be found on the Internet, are accessible (clear rights and licenses), in a usable format, reliable, and are identified in a unique and persistent way so that they can be referred to. Essentially, the DSA presents quality criteria for digital repositories, whereas the FAIR principles target individual datasets. In this webinar the two sets of principles will be discussed and compared, and a tangible operationalization will be presented.

  • ORCID Communications Toolkit: Interactive Outreach Presentation

    This presentation includes information about ORCID (Open Researcher and Contributor ID), short quizzes, and workshop activities. Select content for your presentation according to how much time you have, what is suitable for your audience's level of knowledge, and whether you want to include practical tasks. After downloading, the slides can be edited to include your institution's name and presentation details.

  • How to Import Works into Your ORCID Record Using a Search & Link Wizard

    Several ORCID (Open Researcher and Contributor ID) member organizations have built search and link tools that allow you to import information about publications and other works into your ORCID record from other databases. The linking process can begin on the ORCID site or at the organization's website. Note that ORCID does not store preprints or content; rather, ORCID requires that works data added to your record include a link that allows users to easily navigate to the source document.


  • ORCID Registry: How to Group Works on Your ORCID Record

    Learn how and why to group works that have been added to your ORCID (Open Researcher and Contributor ID) record from different sources, so that they are displayed together on your record.

  • Getting Started with ORCID & ORCID API

    This presentation introduces ORCID (Open Researcher and Contributor ID), ORCID features, and ORCID's approach to API development.
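    For a sense of what working against the ORCID API looks like, the sketch below reads a public record as JSON; it assumes the v3.0 public API base URL and response layout and uses ORCID's well-known example iD, so consult the presentation and ORCID's API documentation for authoritative details.

      # Rough sketch: fetch a public ORCID record as JSON.
      # Base URL and response structure are assumptions about the public v3.0 API.
      import requests

      orcid_id = "0000-0002-1825-0097"  # ORCID's example iD
      resp = requests.get(
          f"https://pub.orcid.org/v3.0/{orcid_id}/record",
          headers={"Accept": "application/json"},
          timeout=30,
      )
      resp.raise_for_status()
      record = resp.json()
      print(record["orcid-identifier"]["path"])  # assumed field layout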

  • Data Management Support for Researchers

    Tips and advice from a variety of researchers, data managers, and service providers to help with data management. Titles include:

    • Sharing data: good for science, good for you
    • What support needs to be provided to assist researchers with data management?
    • How can choices about data capture open up, or limit, opportunities for researchers?
    • What should researchers do to help their data survive?
    • Why should researchers share their data?
    • How can repositories and data centres help researchers?
  • DMP Assistant: bilingual tool for preparing data management plans (DMPs)

    The DMP Assistant is a bilingual tool to assist in the preparation of a Data Management Plan (DMP). This tool, which is based on international standards and best practices in data management, guides researchers step by step through the key questions needed to develop their plan. DMP Assistant is powered by an open source application called DMPonline, which is developed by the Digital Curation Centre (DCC). Site registration is required. Data management planning templates are available in the DMP Assistant after registration and sign-in.

  • Data Management for Clinical Research MOOC

    This course presents critical concepts and practical methods to support planning, collection, storage, and dissemination of data in clinical research.
     
    Understanding and implementing solid data management principles is critical for any scientific domain. Regardless of your current (or anticipated) role in the research enterprise, a strong working knowledge and skill set in data management principles and practice will increase your productivity and improve your science. The instructors' goal is to use these modules to help you learn and practice this skill set.

    This course assumes very little current knowledge of technology other than how to operate a web browser. The course will focus on practical lessons, short quizzes, and hands-on exercises as we explore together best practices for data management.

    The six modules cover these topics:

    • Research Data Collection Strategy
    • Electronic Data Capture Fundamentals
    • Planning a Data Strategy for a Prospective Study
    • Practicing What We've Learned: Implementation
    • Post-Study Activities and Other Considerations
    • Data Collection with Surveys
  • National Network of Libraries of Medicine (NNLM) Research Data Management Webinar Series

    The National Network of Libraries of Medicine (NNLM) Research Data Management (RDM) webinar series is a collaborative, bimonthly series intended to increase awareness of research data management topics and resources.  The series aims to support RDM within the library to better serve librarians and their institutional communities. Topics include, but are not limited to, understanding a library’s role in RDM, getting started, data management planning, and different RDM tools.

    Several NNLM Regional Medical Libraries will collaborate and combine efforts to feature experts from the field for this national webinar series. Each session will include separate objectives based on the featured webinar presenter. Attendee participation will be possible through the WebEx platform chat features and other electronic methods designed by the guest presenter. Sessions are recorded, closed-captioned, and posted for later viewing.

    Each session will last approximately 1 hour and 1 MLA CE contact hour will be offered per session. CE contact hours will only be available during the live presentations of the webinar.

    Watch the webpage for upcoming webinars.

  • Overview of Interdisciplinary Earth Data Alliance (IEDA) Data Management Resources

    In the digital era, documenting and sharing our scientific data is growing increasingly important as an integral part of the scientific process. Data Management not only makes our data resources available for others to build upon, but it also enables data syntheses and new analyses that hold the potential for significant scientific advancement. Effective data management begins during the planning stages of a project and continues throughout the research process from field and/or laboratory work, through analysis, and culminating with scientific literature and data publication. By planning ahead, and following some best practices along the way, the process of data management can be simple and relatively low-effort, enabling rapid contribution and publication of data in the appropriate data systems at the conclusion of a project.

    IEDA offers a variety of tools to support investigators along the full continuum of their data management efforts. Links to these tools and resources are available from the landing page for this resource.

    Pre-Award

    • IEDA Data Discovery Tools
    • IEDA Data Management Plan (DMP) Tool

    Research & Analysis

    • Register sample-based data sets and samples 
      • Register sample metadata and get a unique sample identifier (IGSN)
      • Download Templates for Analytical Data
      • Learn about contributing Analytical Data to the EarthChem Library
    • Register sensor-based data sets and samples 
      • Contribute sensor data files (e.g. geophysical data) and supporting metadata to MGDS
    • IEDA Analysis Tools
      • GeoMapApp earth science exploration and visualization application 
        • Analyze your own geospatial data within the context of other available datasets

    Synthesis & Publication

    • Register final data sets with IEDA 
    • Publish your data 
      • Publishing your data with a DOI ensures that it can be directly referenced in your paper and cited by others.
    • IEDA Data Compliance Reporting (DCR) Tool 
      • Rapidly generate a Data Compliance Report (DCR) based on your NSF award number to demonstrate that your data are registered with IEDA systems and you are compliant with NSF Data Policies.
  • Introduction to Scientific Visualization

    Scientific Visualization transforms numerical data sets obtained through measurements or computations into graphical representations. Interactive visualization systems allow scientists, engineers, and biomedical researchers to explore and analyze a variety of phenomena in an intuitive and effective way. The course provides an introduction to the principles and techniques of Scientific Visualization. It covers methods corresponding to the visualization of the most common data types, as well as higher-dimensional, so-called multi-field problems. It combines a description of visualization algorithms with a presentation of their practical application. Basic notions of computer graphics and human visual perception are introduced early on for completeness. Simple but very instructive programming assignments offer a hands-on exposure to the most widely used visualization techniques.

    Note that the lectures, demonstrations, and tutorial content require a Purdue Credentials, HydroShare, or CILogon account.

    Access the CCSM Portal/ESG/ESGC Integration slide presentation at  https://mygeohub.org/resources/50/download/ccsm.pdf. The CCSM/ESG/ESGC collaboration provides a semantically enabled environment that includes modeling, simulated and observed data, visualization, and analysis.
    Topics include:

    • CCSM Overview
    • CCSM on the TeraGrid
    • Challenges
    • Steps in a typical CCSM Simulation
    • Climate Modeling Portal: Community Climate System Model (CCSM) to simulate climate change on Earth
    • CCSM Self-Describing Workflows 
    • Provenance metadata collection
    • Metadata

     

  • Simplifying the Reuse and Interoperability of Hydrologic Data Sets and Models with Semantic Metadata that is Human-Readable & Machine-Actionable

    This slide set discusses the big, generic problem facing geoscientists today that stems from a lack of interoperability across a huge number of heterogeneous resources, and how to solve it. Practical solutions to tame the inherent heterogeneity involve the collection of standardized, "deep-description" metadata for resources that are then wrapped with standardized APIs that provide callers with access to both the data and the metadata.

  • ORNL DAAC Data Recipes

    A collection of tutorials, called "data recipes," that describe how to use Earth science data from NASA's Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC), which focuses on biogeochemical dynamics data, using easily available tools and commonly used formats for Earth science data. These tutorials are available to assist those wishing to learn or teach how to obtain and view these data.

  • EarthChem Library: How to Complete a Data Submission Template

    Learn how to complete a data submission template for the EarthChem Library (www.earthchem.org/library). You can access existing templates at www.earthchem.org/data/templates. If you do not see a template appropriate for your data type, please contact EarthChem at info@earthchem.org.

  • iData Tutorial

    A brief tutorial that shows how to upload, preview, and publish data from iData. To use the accompanying My Geo Hub tutorial exercises, go to https://mygeohub.org/resources/1217. Note that the number before each step is the time in the YouTube video at which that step is demonstrated. Also note that the video does not contain audio.

  • EarthChem Library: Submission Guidelines

    Learn general guidelines for data submission to the EarthChem Library (www.earthchem.org/library), including the data types and formats accepted and additional best practices for submission.

  • How to Manage Your Samples in MySESAR

    The System for Earth Sample Registration (SESAR) operates a registry that distributes the International Geo Sample Number (IGSN). SESAR catalogs and preserves sample metadata profiles, and provides access to the sample catalog via the Global Sample Search.

    MySESAR provides a private working space in the System for Earth Sample Registration. This tutorial introduces how to manage samples in MySESAR, including how to search the sample catalog, how to view and edit samples, how to print labels, how to group samples, and how to transfer ownership of samples. For details relating to sample registration, please see the tutorials for individual sample and batch sample registration here: http://www.geosamples.org/help/registration.

    MySESAR allows you to:

    • obtain IGSNs for your samples by registering them with SESAR.
    • register samples one at a time by entering metadata into a web form.
    • register multiple samples by uploading metadata in a SESAR spreadsheet form (an illustrative example row is sketched after this list).
    • generate customized SESAR spreadsheet forms.
    • view lists of samples that you registered.
    • edit sample metadata profiles.
    • upload images and other documents such as field notes, maps, or links to publications to a sample profile.
    • restrict access to metadata profiles of your samples.
    • transfer ownership of a sample to another SESAR user.
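    Purely to make the batch spreadsheet registration mentioned above concrete, the sketch below writes a one-row sample description to CSV. The column names are illustrative guesses, not the actual SESAR template headers, which should be taken from the official template.

      # Toy one-row "sample spreadsheet" with made-up column names.
      # The real SESAR batch template defines its own required headers.
      import csv

      sample = {
          "Sample Name": "EX-2018-001",  # hypothetical values throughout
          "Material": "Rock",
          "Latitude": 19.5,
          "Longitude": -155.6,
          "Collection Date": "2018-07-14",
      }

      with open("samples.csv", "w", newline="") as f:
          writer = csv.DictWriter(f, fieldnames=list(sample))
          writer.writeheader()
          writer.writerow(sample)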
  • GeoBuilder for Exploring Geospatial Data

    A step-by-step tutorial for GeoBuilder. The GeoBuilder tool provides a wizard-type interface that guides users through several steps for loading, selecting, configuring, and analyzing geo-referenced tabular data. As a result, the data is presented on an Open Street Map with customized annotation, station/site popups, and dynamic filtering and plotting. The tool can be used in two ways: first, an end user can use it to dynamically load and explore a CSV file of interest; second, a data owner can use it to build a customized view of the data they want to share, save the configuration, and publish the data, configuration, and viewer as a new “tool” specifically for their data. With this, any scientist can easily develop an interactive web-enabled GIS interface to share their data within minutes, rather than hiring a web developer and spending months to achieve the same result.
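    GeoBuilder itself is used through the My Geo Hub web interface. As a loose stand-in that shows the same load-a-CSV-and-map-it idea in Python (using pandas and folium, with a hypothetical input file and column names; this is not GeoBuilder), one might write:

      # Loose stand-in for the CSV-to-web-map idea (not GeoBuilder itself).
      # File name and column names are hypothetical.
      import folium
      import pandas as pd

      stations = pd.read_csv("stations.csv")  # expects lat/lon columns
      m = folium.Map(location=[stations["lat"].mean(), stations["lon"].mean()], zoom_start=6)

      for _, row in stations.iterrows():
          folium.Marker(
              [row["lat"], row["lon"]],
              popup=str(row["station_name"]),  # popup per station/site
          ).add_to(m)

      m.save("stations_map.html")  # open the saved map in a web browser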

    Note that My Geo Hub registration is required to access the GeoBuilder tool.

  • GeoBuilder - How to Share My Session

    A brief tutorial that shows how to share a GeoBuilder session. The GeoBuilder tool provides a wizard-type interface that guides users through several steps for loading, selecting, configuring, and analyzing geo-referenced tabular data. To use the accompanying My Geo Hub tutorial exercises, go to https://mygeohub.org/resources/1219. Note that the number before each step is the time in the YouTube video at which that step is demonstrated. Also note that the video does not contain audio.

    For more information about GeoBuilder, go to https://mygeohub.org/resources/geobuilder.

  • Introduction to Lidar

    This self-paced, online training introduces several fundamental concepts of lidar and demonstrates how high-accuracy lidar-derived elevation data support natural resource and emergency management applications in the coastal zone.

    Learning objectives:

    • Define lidar
    • Select different types of elevation data for specific coastal applications
    • Describe how lidar data are collected
    • Identify the important characteristics of lidar data
    • Distinguish between different lidar data products
    • Recognize aspects of data quality that impact data usability
    • Locate sources of lidar data
    • Discover additional information and additional educational resources

    Note: requires Flash Plugin.

  • Introduction to Lidar

    This course provides an overview of Lidar technology; the data collection workflow; data products, formats, and metadata; Lidar and vegetation; QA/QC, artifacts, and issues to keep in mind; and DEM generation from Lidar point cloud data.

  • Genomics Curriculum

    The focus of this workshop is on working with genomics data, and on data management and analysis for genomics research. It covers best practices for the organization of bioinformatics projects and data, the use of command-line utilities, the use of command-line tools to analyze sequence quality and perform variant calling, and connecting to and using cloud computing.
    Lessons:

    • Project organization and management
    • Introduction to the command line
    • Data wrangling and processing
    • Introduction to cloud computing for genomics
    • Data analysis and visualization in R *beta*
  • Ecology Curriculum

    This workshop uses a tabular ecology dataset from the Portal Project Teaching Database and teaches data cleaning, management, analysis, and visualization. There are no prerequisites, and the materials assume no prior knowledge about the tools. We use a single dataset throughout the workshop to model the data management and analysis workflow that a researcher would use.
    Lessons:

    • Data Organization in Spreadsheets
    • Data Cleaning with OpenRefine
    • Data Management with SQL
    • Data Analysis and Visualization in R
    • Data Analysis and Visualization in Python


    The Ecology workshop can be taught using R or Python as the base language.
    Portal Project Teaching Dataset: the Portal Project Teaching Database is a simplified version of the Portal Project Database designed for teaching. It provides a real-world example of life-history, population, and ecological data, with sufficient complexity to teach many aspects of data analysis and management, but with many complexities removed to allow students to focus on the core ideas and skills being taught.
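    As a taste of the workshop's data analysis lessons, the sketch below takes a first look at the teaching dataset with pandas; it assumes a local copy of the commonly used surveys.csv table, and the column names should be checked against the lesson materials.

      # First look at the Portal Project Teaching Database surveys table.
      # Assumes a local copy of surveys.csv from the teaching dataset.
      import pandas as pd

      surveys = pd.read_csv("surveys.csv")
      print(surveys.head())                               # first few rows
      print(surveys["species_id"].nunique(), "species")   # assumed column name
      print(surveys.groupby("year")["weight"].mean())     # mean weight per year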

  • The Agriculture Open Data Package

    The third GODAN Capacity Development Working Group webinar, supported by GODAN Action, focused on the Agriculture Open Data Package (AgPack).
    In 2016 GODAN, ODI, the Open Data Charter and OD4D developed the Agricultural Open Data Package (AgPack) to help governments to realize impact with open data in the agriculture sector and food security. Details at http://www.agpack.info 
    During the webinar the speakers outlined examples and use cases of governments using open data in support of their agricultural sector and food security. They also discussed the different roles a government can take on to facilitate such developments, and how open data can support government policy objectives on agriculture and food security.

  • Publishing Open Data from an Organisational Point of View

    The second GODAN Capacity Building webinar was on “Publishing open data from an organisational point of view” and was led by GODAN Action colleagues from the Open Data Institute in London.
    This webinar focused on key aspects:
    - Why publish open data
    - What benefit can publishing open data bring
    - Why licenses are the most important aspect of publishing open data
    - How to start with publishing open data

  • GODAN Working Group on Capacity Development

    The first webinar organized by the GODAN (Global Open Data for Agriculture & Nutrition) Working Group on Capacity Development gave an overview of GODAN, its objectives and how people can get involved. The webinar also provided information on the purpose of the GODAN Working Group on Capacity Development and explained how to join and get involved in the activities.

  • Curriculum on Open Data and Research Data Management in Agriculture and Nutrition

    This paper details the curriculum for the Open Data Management in Agriculture and Nutrition e-learning course, including background to the course, course design, target audiences, and lesson objectives and outcomes. 
    This free online course aims to strengthen the capacity of data producers and data consumers to manage and use open data in agriculture and nutrition. One of the main learning objectives is for the course to be used widely within agricultural and nutrition knowledge networks, in different institutions. The course also aims to raise awareness of different types of data formats and uses, and to highlight how important it is for data to be reliable, accessible and transparent.
    The course is delivered through the Moodle e-learning platform. Course units include:
    Unit 1:  Open data principles
    Unit 2:  Using open data
    Unit 3:  Making data open
    Unit 4:  Sharing open data
    Unit 5:  IPR and Licensing
    By the end of the course, participants will be able to:
    - Understand the principles and benefits of open data
    - Understand ethics and responsible use of data
    - Identify the steps to advocate for open data policies
    - Understand how and where to find open data
    - Apply techniques to data analysis and visualisation
    - Recognise the necessary steps to set up an open data repository
    - Define the FAIR data principles
    - Understand the basics of copyright and database rights
    - Apply open licenses to data
    The course is open to infomediaries (including ICT workers, technologists, journalists, communication officers, librarians, and extensionists); policy makers, administrators, and project managers; and researchers, academics, and scientists working in the areas of agriculture, nutrition, weather and climate, and land data.