All Learning Resources

  • Understanding Phenology with Remote Sensing [Introductory]

    This training will focus on the use of remote sensing to understand phenology: the study of life-cycle events. Phenological patterns and processes can vary greatly across a range of spatial and temporal scales and can provide insights about ecological processes like invasive species encroachment, drought, wildlife habitat, and wildfire potential. This training will highlight NASA-funded tools to observe and study phenology across a range of scales. Attendees will be exposed to the latest in phenological observatory networks and science, and how these observations relate to ecosystem services, the carbon cycle, biodiversity, and conservation.

    Learning Objectives: 
    By the end of this training series, attendees will be able to:

    • Summarize NASA satellites and sensors that can be used for monitoring global phenology patterns
    • Outline the benefits and limitations of NASA data for phenology
    • Describe the multi-scalar approach to vegetation life cycle analyses
    • Compare and contrast data from multiple phenology networks
    • Evaluate various projects and case-study examples of phenological data


    Course Format: 

    • Three, one-hour sessions


    Prerequisites: Attendees who have not completed the course(s) below may be unprepared for the pace of this training.
    Fundamentals of Remote Sensing  

    Part 1: Overview of Phenology and Remote Sensing

    • Introduction to NASA data and Phenology
    • Land Surface Phenology from MODIS and VIIRS
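    Land surface phenology products such as those from MODIS and VIIRS are typically derived from vegetation index time series. As a rough illustration of the underlying quantity, the sketch below computes NDVI = (NIR - Red) / (NIR + Red) for a hypothetical single-pixel seasonal trajectory; all reflectance values are invented for illustration:

```python
# Sketch: NDVI from red and near-infrared reflectance.
# Land surface phenology products are typically built from vegetation index
# time series such as NDVI; the values below are invented, not MODIS/VIIRS data.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red)

# A hypothetical growing-season trajectory for one pixel:
red = [0.12, 0.08, 0.05, 0.09]   # red reflectance, spring through autumn
nir = [0.30, 0.45, 0.55, 0.35]   # near-infrared reflectance

series = [round(ndvi(n, r), 2) for n, r in zip(nir, red)]
print(series)  # a greener canopy yields a higher NDVI
```

    Phenology metrics such as green-up and senescence dates are then estimated from the shape of curves like this one.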


    Part 2: Scales of Phenology

    • Resolving challenges associated with variability in space, time, and resolution for phenology research and applications
    • USA-National Phenology Network (NPN) and The National Ecological Observatory Network (NEON) 
    • Phenocam: Near-surface phenology
    • Conservation Science Partners


    Part 3: Utility and Advantage of Multi-Scale Analysis

    • Field-based phenology and gridded products
    • Case-study examples:
    • Integration of PhenoCam near-surface remote sensing and satellite phenological data
    • Greenwave modeling
    • Urbanization and plant phenology


    Each of the three parts includes links to the recordings, presentation slides, and Question & Answer Transcripts.
     

  • Using Earth Observations to Monitor Water Budgets for River Basin Management II [Advanced]

    Rivers are a major source of fresh water. They support aquatic and terrestrial ecosystems, provide transportation, generate hydropower, and when treated, provide drinking and agricultural water. Estimating and monitoring water budgets within a river basin is required for sustainable management of water resources and flooding within watersheds. This advanced-level webinar series will focus on the use of NASA Earth observations and Earth system-modeled data for estimating water budgets in river basins.
    Past ARSET training on monitoring water budgets for river basins focused on data sources relevant for river basin monitoring and management and provided case studies for estimating the water budget of a watershed using remote sensing products. This advanced webinar will include lectures and hands-on exercises for participants to estimate water budgets for a given river basin.

    Learning Objectives:
     By the end of this training, attendees will be able to:


    • Identify and access remote sensing and Earth system-modeled data for estimating water budgets in a river basin
    • Explain the uncertainties involved in estimating water budgets for river basins
    • Replicate the steps for estimating water budgets for a river basin and sub-watersheds using remote sensing products and GIS


    Course Format: 


    • Three, two-hour webinars 
    • A certificate of completion will also be available to participants who attend all sessions and complete the homework assignment, which will be based on the webinar sessions.
    • NOTE: Certificates of completion only indicate participation in all aspects of the training; they do not imply proficiency in the subject matter, nor should they be seen as a professional certification.



    Prerequisites: Attendees who have not completed the following may not be prepared for the pace of the training:



    Portions of the series will include data import to QGIS. If you wish to follow along with those steps, please install using the instructions here:



    Part 1: Review and Access of Earth Observations and Earth System-Modeled Data for River Basin Monitoring and Management


    This session will provide an overview of data sources relevant to estimating water budgets for a river basin. There will be a demonstration and guided exercise to download water budget component data to estimate the water budget of a given watershed using remote sensing products.


    Part 2: Water Budget Estimation using Remote Sensing Observations
    This session will include a demonstration and step-by-step exercise to estimate an integrated water budget over a river basin using Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (IMERG) precipitation data, Atmosphere Land Exchange Inverse (ALEXI) evapotranspiration data, and Gravity Recovery and Climate Experiment (GRACE) terrestrial water storage data, all analyzed with QGIS.
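    The budget computed in this exercise follows the basic water balance, runoff Q = P - ET - dS. The sketch below applies it with invented basin-average values; IMERG, ALEXI, and GRACE are mentioned only as the typical sources of each term:

```python
# Sketch of a basin water budget: runoff Q = P - ET - dS, where
# P is precipitation (e.g., from IMERG), ET is evapotranspiration (e.g., from
# ALEXI), and dS is the change in terrestrial water storage (e.g., from GRACE).
# All numbers below are illustrative basin-average values in mm/month.

precip = [120.0, 95.0, 60.0]         # P
evap = [70.0, 65.0, 55.0]            # ET
storage_change = [20.0, 5.0, -10.0]  # dS

runoff = [p - et - ds for p, et, ds in zip(precip, evap, storage_change)]
print(runoff)  # [30.0, 25.0, 15.0]
```

    In practice the residual reflects both actual runoff and the combined uncertainty of the inputs, which is why the series devotes attention to uncertainties in the budget components.
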
    Part 3: Water Budget Estimation using the Global Land Data Assimilation Model
    The final session will include a demonstration and step-by-step exercise to estimate water budgets at a sub-watershed level within a river basin using water budget components from the latest version of the Global Land Data Assimilation System (GLDAS v2.2), which includes assimilation of groundwater data.
     


    Each of the three parts includes links to the recordings, presentation slides, exercises, and Question & Answer Transcripts.
     

  • An Inside Look at how NASA Measures Air Pollution [Introductory]

    Would you like to learn how to access and visualize NASA satellite imagery? With the world’s eyes and media coverage turned to recent global changes in air pollution from the economic downturn, this two-part webinar series provides a primer for the novice and a good refresher course for all others. You will learn which pollutants can be measured from space, how satellites make these measurements, the do’s and don’ts in interpreting satellite data, and how to download and create your own visualizations.

    Learning Objectives: By the end of this training, attendees will be able to:

    • List the pollutants that can be observed by NASA satellites
    • Find and download imagery for NO2 and aerosols/particles
    • Describe the capabilities and limitations of NASA NO2 and aerosol measurements



    Prerequisites: Fundamentals of Remote Sensing (recommended but not required)

    Part One: Nitrogen Dioxide (NO2)
    • What is NO2?
    • NASA Remote Sensing Basics
    • Interpreting NO2 Imagery: Dos and Don’ts
    • Downloading Data and Creating Imagery

    Part Two: Particulate Matter (Aerosols)
    • What are Aerosols?
    • Interpreting Aerosol Imagery: Dos and Don’ts
    • A Tour of NASA Resources for Generating Your Own Visualizations

    Each of the two parts includes links to the recordings, presentation slides, and Question & Answer Transcripts.

  • Un Vistazo a Cómo la NASA Mide la Contaminación del Aire [Introductory] (Spanish version of "An Inside Look at how NASA Measures Air Pollution")

    Would you like to learn how to access and visualize NASA satellite imagery? The recent worldwide decline in air pollution caused by the economic downturn has captured the world's attention and received extensive media coverage. Drawing on this, this two-part webinar series provides foundational knowledge for newcomers and a refresher course for everyone else. You will learn which pollutants can be measured from space, how satellites make these measurements, the dos and don'ts of interpreting satellite data, and how to download and create your own visualizations.

    Learning Objectives: By the end of this training, attendees will be able to:

    • List the pollutants that can be observed by NASA satellites
    • Find and download imagery for NO2 and aerosols/particles
    • Describe the capabilities and limitations of NASA NO2 and aerosol measurements



    Prerequisites: Fundamentals of Remote Sensing (recommended but not required)

    Part 1: Nitrogen Dioxide (NO2)

    • What is NO2?
    • NASA Remote Sensing Basics
    • Interpreting NO2 Imagery: Dos and Don'ts
    • Downloading Data and Creating Imagery


    Part 2: Particulate Matter (Aerosols)

    • What are Aerosols?
    • Interpreting Aerosol Imagery: Dos and Don'ts
    • A Tour of NASA Resources for Generating Your Own Visualizations

  • Earth Observations for Disaster Risk Assessment & Resilience [Introductory]

    According to a UN report, between 1998 and 2017 the U.S. alone lost $944.8 billion from disasters, and losses from extreme weather events rose by 251 percent compared with the preceding twenty years (1978-1997). It is critical to develop disaster management strategies to reduce and mitigate disaster risks. A major factor in regional risk assessment is evaluating the vulnerability of lives and property to disasters. Environmental information about disasters, their spatial impact, and their temporal evolution can play an important role as well.
    This webinar series will focus on Earth observation (EO) data useful for disaster risk assessment. The series will cover disasters including tropical cyclones, flooding, wildfires, and heat stress. The training will also include access to socioeconomic and disaster damage data. Sessions 3 & 4 will cover case studies and operational applications of EO for disaster risk assessment.

    Learning Objectives: By the end of this training, attendees will: 


    • learn about available NASA remote sensing and socioeconomic data and how to combine them for assessing risk
    • understand how to apply these data for assessing risk from floods and tropical cyclones in specific regions
    • learn how operational agencies are using NASA data for risk management



    Course Format:


    • Four, two-hour parts that include lectures, demonstrations, and question and answer sessions
    • Both Session A & B will be broadcast in English
    • A certificate of completion will also be available to participants who attend all four parts and complete all homework assignments. Note: Certificates of completion only indicate that the attendee participated in all aspects of the training; they do not imply proficiency in the subject matter, nor should they be seen as a professional certification.



    Prerequisites: 



    Part One: NASA Remote Sensing and Socioeconomic Data for Disaster Risk Assessment
    Attendees will learn basic concepts and definitions in disaster risk management. Attendees will also learn about the types of satellite and socioeconomic data available through NASA for disaster risk management.


    Part Two: Assessing the Risk of Floods and Cyclones Using NASA Data
    Attendees will learn a methodology for analyzing remote sensing and socioeconomic data to assess flood and cyclone risk. Examples will be shown for an urban area (Houston, TX, USA) and a country (Mozambique). These case studies will use both historical and forecast data.
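    One common conceptual framing for combining hazard and socioeconomic data (not necessarily the exact methodology taught in this session) treats risk as the product of hazard, exposure, and vulnerability. A minimal sketch with invented, normalized scores:

```python
# Sketch of a simple composite risk index: risk = hazard * exposure * vulnerability.
# This is a common conceptual framing in disaster risk assessment; the district
# names and scores below are invented for illustration, not the course's method.

def risk_index(hazard, exposure, vulnerability):
    """All inputs normalized to [0, 1]; returns a 0-1 composite index."""
    return hazard * exposure * vulnerability

# Hypothetical normalized scores for two districts:
districts = {
    "district_a": risk_index(hazard=0.8, exposure=0.9, vulnerability=0.5),
    "district_b": risk_index(hazard=0.4, exposure=0.3, vulnerability=0.7),
}
print(districts)  # district_a carries the higher composite risk
```

    In a real assessment, each factor would itself be derived from layers such as flood extent (hazard), population and assets (exposure), and socioeconomic indicators (vulnerability).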

    Part Three: Disaster Risk Assessment Case Studies Using Remote Sensing Data
    This session will cover two case studies using remote sensing data: one on how New York State is using NASA data for heat wave risk assessment, and another on the freely available online tools from the World Resources Institute for visualizing NASA remote sensing and socioeconomic data.

    Part Four: Operational Application of Remote Sensing for Disaster Management
    The Pacific Disaster Center will describe the data, applications, and strategies they use for disaster risk reduction, response, and relief operations.




    Each of the four parts includes links to the recordings, presentation slides, and Question & Answer Transcripts.

  • Remote Sensing for Conservation & Biodiversity [Introductory]

    The United Nations Millennium Ecosystem Assessment states: “ecosystems are critical to human well-being - to our health, our prosperity, our security, and to our social and cultural identity.” Conservation and biodiversity management play important roles in maintaining healthy ecosystems. Earth observations can help with these efforts. This online webinar series introduces participants to the use of satellite data for conservation and biodiversity applications. The series will highlight specific projects that have successfully used satellite data. Examples include:

    • monitoring chimpanzee habitat loss
    • decreasing whale mortality
    • detecting penguins
    • monitoring wildfires
    • biodiversity observation networks


    Learning Objectives: By the end of this training, attendees will: 

    • be able to outline uses of remote sensing for habitat suitability, species population dynamics, and monitoring wildfires
    • learn about the Group on Earth Observations Biodiversity Observation Network (GEOBON), Marine Biodiversity Observation Network (MBON), and essential biodiversity variables


    Course Format: 

    • Two, one-hour sessions
    • The same session will be broadcast at both times, both in English


    Prerequisites: Fundamentals of Remote Sensing or equivalent knowledge
    If you do not complete the prerequisite, you may not be adequately prepared for the pace of the training.

    Session One: Remote Sensing for Conservation 
    This session will focus on remote sensing for habitat suitability, species population dynamics, and monitoring wildfires.

    Session Two: Remote Sensing for Biodiversity 
    This session will focus on the Group on Earth Observations Biodiversity Observation Network (GEOBON), Marine Biodiversity Observation Network (MBON), and essential biodiversity variables.

    Each of the two parts includes links to the recordings, presentation slides, exercises, and Question & Answer Transcripts, in English and in Spanish. There is no link to a landing page in Spanish for this resource.

  • An Introduction to Humanities Data Curation

    This webpage is a compilation of articles that address aspects of data curation in the digital humanities. Its goal is to direct readers to trusted resources, with enough context from expert editors and other members of the research community to indicate how those resources might help with their own data curation challenges.
    Each article provides a short introduction to a topic and a list of linked resources. Structuring articles in this way acknowledges the many excellent resources that already exist to provide guidance on subjects relevant to curation, such as data formats, legal policies, description, and more.
    The table of contents:
    -An Introduction to Humanities Data Curation
    -Classics, “Digital Classics” and Issues for Data Curation
    -Data Representation
    -Digital Collections and Aggregations
    -Policy, Practice, and Law
    -Standards

  • Data Management using NEON Small Mammal Data

    Undergraduate STEM students are graduating into professions that require them to manage and work with data at many points of a data management lifecycle. Within ecology, students are presented not only with many opportunities to collect data themselves but increasingly to access and use public data collected by others. This activity introduces the basic concept of data management from the field through to data analysis. The accompanying presentation materials mention the importance of considering long-term data storage and data analysis using public data.

    Content page: https://github.com/NEONScience/NEON-Data-Skills/blob/master/tutorials/te...

  • Introduction To The Principles Of Linked Open Data

    This lesson offers a brief and concise introduction to Linked Open Data (LOD). No prior knowledge is assumed. Readers should gain a clear understanding of the concepts behind linked open data, how it is used, and how it is created. The tutorial is split into five parts, plus further reading:
    -Linked open data: what is it?
    -The role of the Uniform Resource Identifier (URI)
    -How LOD organizes knowledge: ontologies
    -The Resource Description Framework (RDF) and data formats
    -Querying linked open data with SPARQL
    -Further reading and resources
    The tutorial should take a couple of hours to complete, and you may find it helpful to re-read sections to solidify your understanding. Technical terms have been linked to their corresponding page on Wikipedia, and you are encouraged to pause and read about terms that you find challenging. After having learned some of the key principles of LOD, the best way to improve and solidify that knowledge is to practice. This tutorial provides opportunities to do so. By the end of the course, you should understand the basics of LOD, including key terms and concepts.
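    The core ideas listed above can be sketched in a few lines of plain Python: a triple links a subject and an object via a predicate, each part ideally a URI, and queries are pattern matches over a set of triples. The URIs and the toy matcher below are illustrative stand-ins for RDF and SPARQL, using the George V example discussed in this lesson:

```python
# Sketch: Linked Open Data represents knowledge as subject-predicate-object
# triples whose parts are URIs. The example.org URIs and the tiny pattern
# matcher below are illustrative stand-ins for a real RDF store and SPARQL.

triples = [
    ("http://example.org/person/GeorgeVI", "http://example.org/rel/sonOf",
     "http://example.org/person/GeorgeV"),
    ("http://example.org/person/EdwardVIII", "http://example.org/rel/sonOf",
     "http://example.org/person/GeorgeV"),
]

def match(triples, s=None, p=None, o=None):
    """Return the triples matching the given pattern; None is a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Who are the sons of George V?" (analogous to a SPARQL SELECT query):
sons = [s for s, _, _ in match(triples, p="http://example.org/rel/sonOf",
                               o="http://example.org/person/GeorgeV")]
print(sons)
```

    A real workflow would store such triples in an RDF triplestore and query them with SPARQL, as the lesson goes on to explain.
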
    In order to provide readers with a solid grounding in the basic principles of LOD, this tutorial will not be able to offer comprehensive coverage of all LOD concepts. The following two LOD concepts will not be the focus of this lesson:
    -The semantic web and semantic reasoning of datasets. A semantic reasoner would deduce that George VI is the brother or half-brother of Edward VIII, given the fact that a) Edward VIII is the son of George V and b) George VI is the son of George V. This tutorial does not focus on this type of task.
    -Creating and uploading linked open datasets to the linked data cloud. Sharing your LOD is an important principle, which is encouraged below. However, the practicalities of contributing your LOD to the linked data cloud are beyond the scope of this lesson. Some resources that can help you get started with this task are available at the end of this tutorial.

    This tutorial is also available in Spanish at:  https://programminghistorian.org/es/lecciones/introduccion-datos-abiertos-enlazados

  • From Hermeneutics To Data To Networks: Data Extraction And Network Visualization Of Historical Sources

    Network visualizations can help humanities scholars reveal hidden and complex patterns and structures in textual sources. This tutorial explains how to extract network data (people, institutions, places, etc) from historical sources through the use of non-technical methods developed in Qualitative Data Analysis (QDA) and Social Network Analysis (SNA), and how to visualize this data with the platform-independent and particularly easy-to-use Palladio.

    This tutorial focuses on data extraction from unstructured text and shows one way to visualize the result using Palladio. It is purposefully designed to be as simple and robust as possible. For the limited scope of this tutorial, it suffices to say that an actor refers to the persons, institutions, etc., which are the objects of study and which are connected by relations. Within the context of a network visualization or computation (also called a graph), actors are called nodes and the connections between them are called ties. In all cases it is important to remember that nodes and ties are drastically simplified models of the complexities of past events, and in themselves do not always suffice to generate insight. But a graph is likely to highlight interesting aspects, challenge your hypotheses, and/or lead you to generate new ones. Network diagrams become meaningful when they are part of a dialogue with data and other sources of information.
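    The coding step described above ultimately yields a table of ties between nodes, from which simple structural measures can already be computed before visualizing in Palladio. A sketch with invented actors:

```python
# Sketch: once actors and relations have been coded from a source text, network
# data is just a list of ties (edges) between nodes. The actor names below are
# invented; a tool like Palladio would consume a similar two-column table.
from collections import Counter

ties = [
    ("Person A", "Institution X"),
    ("Person A", "Person B"),
    ("Person B", "Institution X"),
    ("Person C", "Person A"),
]

# Degree = number of ties touching each node, a basic structural measure.
degree = Counter()
for a, b in ties:
    degree[a] += 1
    degree[b] += 1

print(degree.most_common(1))  # the most connected actor
```
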

    Topics include:  

    • Introduction
    • About the case study
    • Developing a coding scheme
    • Visualize network data in Palladio
    • The added value of network visualizations
    • Other network visualization tools to consider

    This tutorial is also available in Spanish at:  https://programminghistorian.org/es/lecciones/creando-diagramas-de-redes-desde-fuentes-historicas.

  • Ocean Data Management for Researchers

    This training course is aimed at researchers at the post-graduate level and provides a comprehensive introduction to a variety of marine datasets and formats and the use of software for synthesis and analysis of marine data. The importance of good research data management practices and the role of researchers will also be highlighted. Personal projects are presented by the students at the end of the course. 

    To receive a Certificate of Participation, this course requires an application and, once approved, member login. Guest access is available to review course slides, video presentations, exercises, class activities, and supplementary materials.

    Aims and Objectives
    -Provide an introduction to the use of software for synthesis and analysis of marine data
    -Introduction to the FAIR Guiding Principles for scientific data management and stewardship
    -Understand best practice for management and analysis of marine data
     
    The learning outcomes of this course include:
    -Knowledge and understanding of the importance of management of ocean data
    -Experience in the use of data analysis and visualization tools
    -Recognize the importance of good research data management practice
    -Awareness of European based marine research projects and data repositories

    Preparation 
    Participants must download the latest version of ODV (5.1.5) from https://odv.awi.de/software/download/ and install the software on their laptops. If they have not already done so, participants must register as non-commercial users before getting access to the software.
    Participants must also download the course material package from https://drive.google.com/file/d/1SojaNEPE3uI5zUN2gib7SfPI8f319ILQ/view?u... and unzip it to the desktop.

  • Centre Of Excellence Ocean Data Management

    This course provides a comprehensive introduction to a wide variety of earth science datasets, formats and analysis software. Students will learn and practice methods using a common ocean area, and they are expected to create a personal project of data products for a marine region of their own choosing. Personal projects are presented by the students at the end of the course.  This course requires either guest or POGO Scholar login, and is hosted on a Moodle platform.

    Aims and Objectives:
    -Recognize the importance of good research data management practice
    -Provide an introduction to the use of free software for synthesis of marine data and analyses
    -Creation and use of multi-parameter marine data collections to prepare and publish standard data products
    -Develop marine data and products from multiple sources using selected software programs

    Course overview
    1. Course outline and summary
    2. Pre-course reading (optional)
    3. Introduction to IODE and data management
    4. Research Data Management
    5. Ocean Data Collections using Ocean Data View
    6. Introduction to Marine Metadata
    7. Managing Operational Data using Integrated Data Viewer
    8. Marine GIS operations using Saga
    9. Student Project - Marine data products for selected project areas

     

  • Train-the-Trainer Concept on Research Data Management

    Within the project FDMentor, a German Train-the-Trainer Programme on Research Data Management (RDM) was developed and piloted in a series of workshops. The topics cover many aspects of research data management, such as data management plans and the publication of research data, as well as didactic units on learning concepts, workshop design, and a range of didactic methods.
    After the end of the project, the concept was supplemented and updated by members of the Sub-Working Group Training/Further Education (UAG Schulungen/Fortbildungen) of the DINI/Nestor Working Group Research Data (DINI/Nestor-AG Forschungsdaten). The newly published English version of the Train-the-Trainer Concept contains the translated concept, the materials, and all methods of the Train-the-Trainer Programme. Furthermore, additional English references and materials complement this version.
    This document is primarily intended for trainers who want to conduct a Train-the-Trainer workshop on research data management. It contains background knowledge on the PowerPoint slides and teaching scripts as well as further information on the individual subject areas required for reuse and implementation of a two-day workshop of seven and a half hours a day.

    Each unit of this guide contains information about how to teach it, including the unit's learning objectives, key aspects, contents, didactic methods and exercises, training materials, additional sources, templates, and teaching scripts.

    Topics of the units include orientation, didactic approach, digital research data, research data policies, data management plans, order and structure, documentation and metadata, storage and backup, long-term archiving, access control, formal framework, data publication, re-use of research data, legal aspects, institutional infrastructure, training exercises, concept development, and didactic methods.
     

  • De bonnes pratiques en gestion des données de recherche: Un guide sommaire pour gens occupés (French version of the 'Good Enough' RDM)

    This short guide presents a set of good practices that researchers can adopt regardless of their skills or level of expertise.

  • Environmental Data Initiative Five Phases of Data Publishing Webinar - What are metadata and structured metadata?

    Metadata are essential to understanding a dataset. The talk covers:

    • How structured metadata are used to document, discover, and analyze ecological datasets.
    • Tips on creating quality metadata content.
    • An introduction to the metadata language used by the Environmental Data Initiative, Ecological Metadata Language (EML). EML is written in XML, a general purpose mechanism for describing hierarchical information, so some general XML features and how these apply to EML are covered.
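    Because EML is XML, standard XML tooling can read it. The sketch below parses an invented, heavily simplified EML-like snippet with Python's standard library; the element names imitate EML conventions, but this is not a complete or validated EML document:

```python
# Sketch: EML is XML, so standard XML tooling can read it. The snippet below
# imitates EML's dataset/title/creator structure but is invented for
# illustration and is not a complete or schema-valid EML document.
import xml.etree.ElementTree as ET

snippet = """
<eml>
  <dataset>
    <title>Small mammal trapping, 2019</title>
    <creator>
      <individualName><surName>Doe</surName></individualName>
    </creator>
  </dataset>
</eml>
"""

root = ET.fromstring(snippet)
title = root.findtext("dataset/title")
surname = root.findtext("dataset/creator/individualName/surName")
print(title, surname)
```

    Because the metadata are structured, the same fields can be extracted programmatically from every dataset in a repository, which is what makes discovery and analysis possible.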

    This video in the Environmental Data Initiative (EDI) "Five Phases of Data Publishing" tutorial series covers the third phase of data publishing, describing.

     

  • Environmental Data Initiative Five Phases of Data Publishing Webinar - Creating "clean" data for archiving

    Not all data are easy to use, and some are nearly impossible to use effectively. This presentation lays out the principles and some best practices for creating data that will be easy to document and use. It identifies many of the pitfalls in data preparation and formatting that cause problems further down the line, and shows how to avoid them.
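    Typical cleaning steps of the kind the presentation describes include trimming stray whitespace, normalizing the case of categorical codes, and standardizing missing-value markers. A sketch with invented rows:

```python
# Sketch: typical cleaning steps before archiving tabular data: trim stray
# whitespace, normalize the case of categorical codes, and standardize missing
# values. The rows and column names below are invented for illustration.

raw = [
    {"site": " A1 ", "species": "Peromyscus", "count": "3"},
    {"site": "a1",   "species": "peromyscus", "count": "NA"},
]

MISSING = {"", "NA", "na", "n/a"}

def clean(row):
    return {
        "site": row["site"].strip().upper(),
        "species": row["species"].strip().capitalize(),
        "count": None if row["count"].strip() in MISSING else int(row["count"]),
    }

cleaned = [clean(r) for r in raw]
print(cleaned)  # both rows now refer to site "A1" with a consistent species code
```
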

    This video in the Environmental Data Initiative (EDI) "Five Phases of Data Publishing" tutorial series covers the second phase of data publishing, cleaning data. For more guidance from EDI on data cleaning, also see "How to clean and format data using Excel, OpenRefine, and Excel," located here: https://www.youtube.com/watch?v=tRk01ytRXjE.

  • Environmental Data Initiative Five Phases of Data Publishing Webinar - How to clean and format data using Excel, OpenRefine, and Excel

    This webinar provides an overview of some of the tools available for formatting and cleaning data, guidance on tool suitability and limitations, and an example dataset with instructions for working with those tools.

    This video in the Environmental Data Initiative (EDI) "Five Phases of Data Publishing" tutorial series covers the second phase of data publishing, cleaning data.

    For more guidance from EDI on data cleaning, also see " Creating 'clean' data for archiving," located here:  https://www.youtube.com/watch?v=gW_-XTwJ1OA.

  • Introduction to Scientific Visualization

    Scientific Visualization transforms numerical data sets obtained through measurements or computations into graphical representations. Interactive visualization systems allow scientists, engineers, and biomedical researchers to explore and analyze a variety of phenomena in an intuitive and effective way. The course provides an introduction to the principles and techniques of Scientific Visualization. It covers methods corresponding to the visualization of the most common data types, as well as higher-dimensional, so-called multi-field problems. It combines a description of visualization algorithms with a presentation of their practical application. Basic notions of computer graphics and human visual perception are introduced early on for completeness. Simple but very instructive programming assignments offer a hands-on exposure to the most widely used visualization techniques.

    Note that the lectures, demonstrations, and tutorial content require a Purdue Credentials, HydroShare, or CILogon account.

    Access the CCSM Portal/ESG/ESGC Integration slide presentation at  https://mygeohub.org/resources/50/download/ccsm.pdf. The CCSM/ESG/ESGC collaboration provides a semantically enabled environment that includes modeling, simulated and observed data, visualization, and analysis.
    Topics include:

    • CCSM Overview
    • CCSM on the TeraGrid
    • Challenges
    • Steps in a typical CCSM Simulation
    • Climate Modeling Portal: Community Climate System Model (CCSM) to simulate climate change on Earth
    • CCSM Self-Describing Workflows 
    • Provenance metadata collection
    • Metadata

     

  • 23 (research data) Things

    23 (research data) Things is self-directed learning for anybody who wants to know more about research data. Anyone can do 23 (research data) Things at any time.  Do them all, do some, cherry-pick the Things you need or want to know about. Do them on your own, or get together a Group and share the learning.  The program is intended to be flexible, adaptable and fun!

    Each of the 23 Things offers a variety of learning opportunities with activities at three levels of complexity: ‘Getting started’, ‘Learn more’ and ‘Challenge me’. All resources used in the program are online and free to use.

  • Introduction: FAIR Principles and Management Plans

    This presentation introducing the FAIR (Findable, Accessible, Interoperable, Re-usable) data principles and management plans is one of nine webinars on topics related to FAIR data and software offered at a Carpentries-based workshop in Hannover, Germany, July 9-13, 2018. Presentation slides are available in addition to the recorded presentation.

    Other topics included in the series include:
    - Findability of Research Data and Software through PIDs and FAIR
    - Accessibility through Git, Python Functions and Their Documentation
    - Interoperability through Python Modules, Unit-Testing and Continuous Integration
    - Reusability through Community Standards, Tidy Data Formats and R Functions, their Documentation, Packaging, and Unit-Testing
    - Reusability:  Data Licensing
    - Reusability:  Software Licensing
    - Reusability:  Software Publication
    - FAIR Data and Software - Summary

    The other modules in the webinar series can be found at the URL above.
     

  • IOCCP & BONUS INTEGRAL Training Course on "Instrumenting our oceans for better observation: a training course on a suite of biogeochemical sensors"

    Building on the success of prior training courses, the International Ocean Carbon Coordination Project (IOCCP) and the EU BONUS INTEGRAL Project (Integrated carboN and TracE Gas monitoRing for the bALtic sea) organized an international training course, "Instrumenting our ocean for better observation: a training course on a suite of biogeochemical sensors." The course was held June 10-19, 2019, at the Sven Lovén Center for Marine Sciences in Kristineberg, Sweden. This course responded to the growing demand from the global ocean observing system and the marine biogeochemistry community for expanding the correct usage of, and the generation of information from, a suite of autonomous biogeochemical sensors.

    The goal of the course was to train the new generation of marine biogeochemists in the use of a suite of biogeochemical sensors and to assure the best possible quality of the data produced. This intensive training course provided trainees with lectures and hands-on field and laboratory experience with sensors (deployment, interfacing, troubleshooting, and calibration), and provided in-depth knowledge on data reduction and quality control as well as data management. This course also offered an overview of the use of remote sensing, modeling, and intelligent data extrapolation techniques.

    The course provides a comprehensive set of training materials divided into several topics. The materials include video-recorded lectures and/or lecture slideshows in PDF, supplemented with links and references to materials such as manuals, guides, and best practices.

    Note: you are welcome to explore the contents of this course as a self-learning course. Note, however, that the contents were designed for a face-to-face context, so some features (assignments, discussion fora, etc.) may not work properly and tutor support cannot be ensured. For any queries, please contact [email protected] and we will do our best to redirect you to an expert who can assist you. Thank you for your understanding.
    Topic 1: Scientific importance of instrumenting our ocean
    Topic 2: Coordinated global observing networks for marine biogeochemistry
    Topic 3: Sensors inside out
    Topic 4: Interfacing sensors
    Topic 5: Calibration and validation: what are the needs?
    Topic 6: The carbonate system: assessing and controlling measurement uncertainty in estimating the seawater CO2 system
    Topic 7: Equilibrator-based surface measurements
    Topic 8: How to choose the right sensor depending on your circumstances?
    Topic 9: Theory of data processing
    Topic 10: Combining remote sensing and in situ biogeochemical observations
    Topic 11: How to take care of data?
    Topic 12: Modelling for best observations design
    Topic 13: "Smart" data extrapolation
    Topic 14: From surface measurements to ocean-atmosphere fluxes
    Topic 15: Emerging technologies
    Topic 16: Ocean Best Practices (OBP) Initiative and Repository
     

  • MBON Pole to Pole Of The Americas: Tools For The Analysis Of Biodiversity Data Using OBIS And Remote Sensing Data

    The Marine Biodiversity Observation Network (MBON) Pole to Pole organized a second Marine Biodiversity Workshop - From the Sea to the Cloud - following a successful first workshop held during the 2018 AmeriGEOSS Week in Brazil. This activity advanced the implementation of the MBON Pole to Pole network by enhancing knowledge of field data collection methods and the use of information technologies for data management and analysis.

    Github site: marinebon.github.io/p2p-mexico-workshop/index.html

    The purpose was to continue developing a community of practice dedicated to understanding change in marine biodiversity and to generating knowledge and products that inform conservation and management strategies for marine living resources, by engaging researchers, managers, and policy-makers with an interest in biodiversity monitoring, data synthesis, and analysis. During this workshop, participants:
    -Refined previously agreed-upon field sampling protocols for rocky shores and sandy beaches;
    -Converted tabular and spatial data already collected at their study sites into standardized data formats using Darwin Core vocabularies and quality controls;
    -Developed specific vocabularies for the flora and fauna of rocky shores and sandy beaches measured during field surveys;
    -Published survey datasets to the Ocean Biogeographic Information System (OBIS) using data-sharing tools;
    -Advanced their knowledge of data science tools (R, R Markdown, GitHub) to mine, visualize, and analyze data, and to produce reproducible research documents with interactive visualizations on the web.
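    The Darwin Core standardization step mentioned above can be sketched in a few lines. The local column names, the species record, and the term mapping below are hypothetical illustrations (the mapping covers only a small subset of the Darwin Core vocabulary), and the workshop itself used R; this Python sketch merely shows the idea of renaming survey columns to standard terms before publishing to OBIS:

    ```python
    import csv
    import io

    # Hypothetical field-survey rows; the column names are illustrative,
    # not taken from the workshop materials.
    raw = io.StringIO(
        "site,species,lat,lon,date,count\n"
        "Playa Norte,Echinolittorina ziczac,21.07,-86.78,2019-08-02,14\n"
    )

    # Minimal mapping from local column names to Darwin Core occurrence terms.
    DWC_MAP = {
        "site": "locality",
        "species": "scientificName",
        "lat": "decimalLatitude",
        "lon": "decimalLongitude",
        "date": "eventDate",
        "count": "individualCount",
    }

    def to_darwin_core(rows):
        """Rename columns to Darwin Core terms and add a required constant term."""
        records = []
        for row in csv.DictReader(rows):
            rec = {DWC_MAP[k]: v for k, v in row.items()}
            rec["basisOfRecord"] = "HumanObservation"  # standard DwC term
            records.append(rec)
        return records

    records = to_darwin_core(raw)
    print(records[0]["scientificName"])  # -> Echinolittorina ziczac
    ```

    In practice this step would be followed by vocabulary checks and quality control (e.g. coordinate range tests) before the dataset is published to OBIS.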

    The MBON Pole to Pole workshops are designed to:

    • enhance coordination of data collection among nations;
    • improve the collection of harmonized data, developing data standards and methodologies for data management and dissemination without compromising national concerns;
    • support the integration of biodiversity information with physical and chemical data over time (status and trends); and
    • generate products needed for informed policy and management of the ocean.
    The workshop targeted investigators and resource managers dedicated to studying and conserving biodiversity of invertebrates in two important coastal habitats: rocky shore intertidal zone and sandy beaches. This activity targeted participants from all nations in the Americas, from pole to pole.

    Note: you are welcome to explore the contents of this course as a self-learning course. Note, however, that the contents were designed for a face-to-face context, so some features (assignments, discussion fora, etc.) may not work properly and tutor support cannot be ensured. For any queries, please contact [email protected] and we will do our best to redirect you to an expert who can assist you. Thank you for your understanding.

  • Ocean-Colour Data In Climate Studies

    The course will deliver training in ocean-colour data and their applications in climate studies. Remote sensing experts from the Plymouth Marine Laboratory (PML) will guide students through a combination of lectures and computer-based exercises covering the following topics:
    -Introduction to ocean colour;
    -Modelling primary production;
    -Ocean colour applications for ecosystem state assessment;
    -Climate impacts and feedbacks;
    -Ocean colour in data assimilation;
    -Dataset archive, management, visualization, and analysis.
    Objectives
    By the end of this course, students will be able to:
    • Understand the fundamentals of ocean colour;
    • Understand the principles for modeling primary production, for detecting phytoplankton size structure, Harmful Algal Blooms, and for estimating phytoplankton phenology;
    • Conduct research with ocean-colour data for ecosystem state assessment, model validation, data assimilation, and climate research;
    • Plan requirements for large datasets processing, management and archiving;
    • Apply tools and statistical methods for visualization and analysis of ocean-colour data with their associated uncertainty.

  • Research Design, Data Management & Data Communication in Marine Sciences - Module 3 - Data Communication

    The contents of the course are:
    1 - Scientific paper writing
    2 - Communication with the media
    3 - Communication via online platforms: Academic sites and profiling
    4 - Communication via online platforms: social media
    5.1 - Visualizations (option 1)
    5.2 - Presentations & public speaking (option 2)
    5.3 - Storytelling (option 3)
    6 – Assignments

     

  • Data Visualization of Marine Met Data (using FERRET)

    Data visualization is the science of conveying the significance of data by placing it in a visual context. It is particularly useful for marine met data, as patterns, trends, and correlations that might go undetected in text-based data can be exposed and recognized more easily with data visualization software.
    This course demonstrates the use of the open-source software FERRET to generate NetCDF data and to visualize various types of plots, which can be saved and reused at a later stage. The course is designed as a mix of practical and theoretical sessions.
    Aims and Objectives:
    -Provide exposure to Data Visualization using FERRET.
    -Generation of scripts to visualize various types of data sets (1D, 2D, 3D, etc.).
    -Perform data analysis, generate added-value products.
    Learning Outcomes:
    -Knowledge and understanding of the FERRET software.
    -Generation of different types of JNL scripts for visualization and analysis.
    -Ability to visualize various data sets: in situ, remote sensing, and model outputs.
    -Tools for visualizing different types of ocean data and data products.
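    As a rough illustration of the JNL scripts the outcomes above refer to, here is a minimal sketch following standard Ferret conventions; the demo dataset (coads_climatology, bundled with Ferret) and the variable name (sst) are common Ferret examples, not taken from the course materials:

    ```
    ! minimal.jnl - a hedged sketch of a Ferret script (not from the course)
    use coads_climatology        ! open Ferret's bundled sample NetCDF dataset
    show data                    ! list the variables it contains
    shade/l=1 sst                ! shaded plot of sea surface temperature, first time step
    contour/over/l=1 sst         ! overlay contour lines on the same plot
    frame/file=sst_jan.gif       ! save the current plot to a GIF file
    ```

    A script like this can be run from the Ferret prompt with `go minimal.jnl` and adapted to point at your own NetCDF files.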
    Course Pre-requisites:
    -Candidates should have knowledge of data formats in which most of the oceanographic data sets are available;
    -Candidates should preferably be working in institutions responsible for the management of oceanographic and/or atmospheric data.

    Note: you are welcome to explore the contents of this course as a self-learning course. Note, however, that the contents were designed for a face-to-face context, so some features (assignments, discussion fora, etc.) may not work properly and tutor support cannot be ensured. For any queries, please contact [email protected] and we will do our best to redirect you to an expert who can assist you. Thank you for your understanding.

  • Health And Medical Short Bites #1 - Funders And Publishers

    This short webinar is the first of five in the Health and Medical Short Bites webinar series, which aims to support better management and publication of health and medical data.

    The funder perspective: Dr. Wee-Ming Boon speaks about the NHMRC's statement on data sharing, and Jeremy Kenner reviews the National Statement on Ethical Conduct in Human Research.
    The publisher perspective on data: Peter D’Onghia, Senior Journal Publishing Manager at Wiley, who manages a portfolio of journals in health and life sciences, discusses the new Wiley data policies.

    Find other parts of this webinar:
    Health and Medical Short Bites #2 - Storing and publishing health and medical data

    Health and medical data #3 - Ethics, legal issues, and data sharing

    Health & Medical Data Short Bites #4 - Patient views on data sharing

    Health & Medical Data Short Bites #5 - Data linkage and the Australian Health Thesaurus

    Recording, slides, transcripts, and links are available for all health and medical webinars.

  • Health And Medical Short Bites #2 - Storing And Publishing Health And Medical Data

    This short webinar is the second of five in the Health and Medical Short Bites webinar series, which aims to support better management and publication of health and medical data.

    Kate LeMay covered general storage and repository options for health and medical data, including institutional, discipline-specific, and non-specific repositories. Jeff Christiansen introduced Med.data.edu.au, a national facility that provides petabyte-scale research data storage, and related high-speed networked computational services, to Australian medical and health research organizations, where you can find, use, and store data.
     

    Find other parts of this webinar:

    Health And Medical Short Bites #1 - Funders And Publishers

    Health and medical data #3 - Ethics, legal issues, and data sharing

    Health & Medical Data Short Bites #4 - Patient views on data sharing

    Health & Medical Data Short Bites #5 - Data linkage and the Australian Health Thesaurus

    Recordings, slides, transcripts, and links are available for all health and medical webinars.

  • Health and Medical Data #3 - Ethics, Legal Issues and Data Sharing

    This short webinar is the third of five in the Health and Medical Short Bites webinar series, which aims to support better management and publication of health and medical data.

    The legal framework around privacy in Australia is complex and differs between states. Many Acts regulate the collection, use, disclosure and handling of private data. Principles to follow around sensitive data include the management of personal information openly and transparently, only collecting necessary information, and adequate de-identification of data when possible. There are also many ethical considerations around the management and sharing of sensitive data. Informed consent by research participants is essential for the collection, use, and sharing of sensitive data. Storage, access, de-identification, and plans for sharing are very important considerations.
    Topics include:
    1) Legal issues and data sharing
    2) Ethics and data sharing

    Find other parts of this webinar:
    Health and Medical Short Bites #1 - Funders and publishers 

    Health and Medical Short Bites #2 - Storing and publishing health and medical data

    Health & Medical Data Short Bites #4 - Patient views on data sharing

    Health & Medical Data Short Bites #5 - Data linkage and the Australian Health Thesaurus

    Recordings, slides, transcripts and links are available for all webinars in this series.

  • Health & Medical Data Short Bites #4 - Patient Views on Data Sharing

    This short webinar is the fourth in the Health and Medical Short Bites webinar series which aims to support better management and publication of Health and Medical data.

    Informed consent by research participants is essential for the collection, use, and sharing of sensitive data. Storage, access, de-identification, and plans for sharing are important considerations when gaining patient consent.

    Find other parts of this webinar:
    Health and Medical Short Bites #1 - Funders and publishers 

    Health and Medical Short Bites #2 - Storing and publishing health and medical data

    Health and medical data #3 - Ethics, legal issues, and data sharing

    Health & Medical Data Short Bites #5 - Data linkage and the Australian Health Thesaurus

    Recordings, slides, transcripts and links are available for all health and medical webinars.  

  • Health & Medical Data Short Bites #5 - Data linkage and the Australian Health Thesaurus

    This short webinar is the fifth in the Health and Medical Short Bites webinar series which aims to support better management and publication of Health and Medical data.

    Topics and speakers include:
    1) Data linkage: processes, how it’s done, and data availability statements. Dr. Trisha Johnston, Director of the Statistical Analysis Linkage Team at the Queensland Department of Health, points out that linkage is an efficient way to enhance existing data and increase its usefulness for informing population health and clinical research, policy development, and health service planning, management, monitoring, and evaluation.
    2) Australian Health Thesaurus: James Humffray, Manager of the Australian Health Thesaurus, talks about how Health Direct uses the Thesaurus to improve the user’s search experience through auto-suggestion of search terms, alternative terms or synonyms for finding content, ranking of the most relevant content in search results, and facets or filters to narrow down a user’s search results.

    Find other parts of this webinar:
    Health and Medical Short Bites #1 - Funders and publishers 

    Health and Medical Short Bites #2 - Storing and publishing health and medical data

    Health and medical data #3 - Ethics, legal issues, and data sharing

    Health & Medical Data Short Bites #4 - Patient views on data sharing

    Recordings, slides, transcripts, and links are available for all health and medical webinars.

  • ORCID Communications Toolkit: Interactive Outreach Presentation

    This presentation includes information about ORCID (Open Researcher and Contributor ID), short quizzes, and workshop activities. Select content for your presentation according to how much time you have, what suits your audience’s level of knowledge, and whether you want to include practical tasks. After downloading, the slides can be edited to include your institution's name and presentation details.

  • Research Data Management and Open Data

    This presentation was given during the Julius Symposium 2017 on Open Science, focusing in particular on open data and/or FAIR data.  Examples are given from medical and health research data.

  • Sociolegal and Empirical Legal Research - Research Data Management

    An overview presentation of research data management methods and practices - planning, file organization, storage, security, ethics, archives - with examples drawn from legal research.

    Much of the presentation is in English, but examples and some specific explanations of concepts are in Swedish.

  • The Geoscience Paper of the Future: OntoSoft Training

    This presentation was developed to train scientists on best practices for digital scholarship, reproducibility, and data and software sharing.  It was developed as part of the NSF EarthCube Initiative and funded under the OntoSoft project.  More details about the project can be found at https://www.scientificpaperofthefuture.org/gpf/index.html.

    A powerpoint version of the slides is available upon request from [email protected].

    These OntoSoft GPF training materials were developed and edited by Yolanda Gil (USC), with contributions from the OntoSoft team, including Chris Duffy (PSU), Chris Mattmann (JPL), Scott Peckham (CU), Ji-Hyun Oh (USC), Varun Ratnakar (USC), and Erin Robinson (ESIP).  They were significantly improved through input from GPF pioneers Cedric David (JPL), Ibrahim Demir (UI), Bakinam Essawy (UV), Robinson W. Fulweiler (BU), Jon Goodall (UV), Leif Karlstrom (UO), Kyo Lee (JPL), Heath Mills (UH), Suzanne Pierce (UT), Allen Pope (CU), Mimi Tzeng (DISL), Karan Venayagamoorthy (CSU), Sandra Villamizar (UC), and Xuan Yu (UD).  Others contributed feedback on best practices, including Ruth Duerr (NSIDC), James Howison (UT), Matt Jones (UCSB), Lisa Kempler (MathWorks), Kerstin Lehnert (LDEO), Matt Mayernik (NCAR), and Greg Wilson (Software Carpentry).  These materials were also improved thanks to the many scientists and colleagues who have taken the training and asked hard questions about GPFs.

  • Teledetección de Ecosistemas Costeros

    Los ecosistemas marinos y costeros tienen roles vitales en el almacenamiento de carbono, reciclaje de nutrientes y otros materiales, al igual que sirven de reservorios de biodiversidad. Además, proveen servicios ecosistémicos tales como comida para millones de personas, protección costera contra el oleaje, y actividades recreativas. La teledetección de los ecosistemas costeros y marinos es particularmente difícil. Hasta el 80% de la señal recibida por los sensores en órbita proviene de la atmósfera. Además, los componentes de la columna de agua (disueltos y suspendidos) atenúan la mayor parte de la luz mediante absorción o dispersión. Cuando se trata de recuperar información del fondo del océano, incluso en las aguas más claras, solo menos del 10% de la señal proviene de el fondo marino. Los usuarios, particularmente aquellos con poca experiencia en teledetección, pueden beneficiarse de esta capacitación que cubre algunas de las dificultades asociadas con la teledetección de ecosistemas costeros, particularmente playas y comunidades bentónicas tales como arrecifes de coral y yerbas marinas.




    OBJETIVOS DE APRENDIZAJE
    Al final de esta capacitación, los asistentes podrán:

    • Identificar los diferentes componentes de la columna de agua y cómo afectan la señal de teledetección remota de los ecosistemas de aguas poco profundas.
    • Describir los sensores satelitales existentes utilizados para analizar el color del océano y en la caracterización de ecosistemas de aguas poco profundas.
    • Comprender la interacción entre los componentes del agua, el espectro electromagnético y la señal de detección remota.
    • Reconocer los diferentes procesos utilizados para eliminar la atenuación de la columna de agua de la señal de teledetección remota para caracterizar los componentes bentónicos.
    • Resumir las técnicas para caracterizar los entornos de playas costeras con datos de teledetección remota y métodos de campo para el perfil de playas.



    FORMATO DEL CURSO

    • Tres sesiones de una hora cada una con presentaciones en inglés y español
    • Una tarea a someter usando Google Forms 
    • English



    Parte Uno: Una Mirada a los Ecosistemas Costeros y la Teledetección

    • Introducción a ecosistemas costeros 
    • Un resumen de los sensores más utilizados para la teledetección de áreas costeras 
    • Preguntas y Respuestas



    Parte Dos: Penetración de la Luz en la Columna de Agua

    • Propiedades Aparentes e Inherentes 
    • Medidas de Campo Bio-ópticas 
    • Correcciones de la Columna de Agua 
    • Derivación de Batimetría y Caracterización Béntica Usando Datos Multiespectrales 
    • Calibración y Validación de Datos de Color del Océano 
    • Preguntas y Respuestas


    Parte Tres: Teledetección de Componentes de la Línea de Costa

    • Componentes Geofísicos de la Línea de Costa
    • Las Partes de una Playa
    • Medidas de Campo en la Línea de Costa Necesarias para Validar Imágenes
    • Procesamiento y Análisis de Imágenes para la Caracterización de la Línea de Costa
    • Preguntas y Respuestas


    Materiales:


    • Ver Grabación
    • Diapositivas de la Presentación
    • Tarea 
    • Transcripción de Preguntas y Respuestas

  • Remote Sensing of Coastal Ecosystems [Introductory]

    Coastal and marine ecosystems serve key roles for carbon storage, nutrients, and materials cycling, as well as reservoirs of biodiversity. They also provide ecosystem services such as sustenance for millions of people, coastal protection against wave action, and recreational activities. Remote sensing of coastal and marine ecosystems is particularly challenging. Up to 90% of the signal received by the sensors in orbit comes from the atmosphere. Additionally, dissolved and suspended constituents in the water column attenuate most of the light received through absorption or scattering. When it comes to retrieving information about shallow-water ecosystems, even in the clearest waters under the clearest skies, less than 10% of the signal originates from the water and its bottom surface. Users, particularly those with little remote sensing experience, stand to benefit from this training covering some of the difficulties associated with remote sensing of coastal ecosystems, particularly beaches and benthic communities such as coral reefs and seagrass.



    OBJECTIVES
    By the end of this training, attendees will be able to:

    • Identify the different water column components and how they affect the remote sensing signal of shallow-water ecosystems
    • Outline existing satellite sensors used for ocean color and shallow-water ecosystem characterization
    • Understand the interaction between water constituents, the electromagnetic spectrum, and the remote sensing signal
    • Recognize the different processes used to remove the water column attenuation from the remotely-sensed signal to characterize benthic components
    • Summarize techniques for characterizing shoreline beach environments with remotely-sensed data and field methods for beach profiling


    COURSE FORMAT


    • Three one-hour sessions with presentations in English and Spanish
    • One Google Form homework
    • Spanish sessions 



    PREREQUISITES




    Part One: Overview of Coastal Ecosystems and Remote Sensing


    • Introduction to coastal and marine ecosystems
    • Overview of sensors for remote sensing of coastal areas
    • Q&A


    Part Two: Penetration of Light in the Water Column


    • Apparent and inherent optical properties 
    • Field bio-optical measurements 
    • Water column corrections 
    • Deriving bathymetry and benthic characterization from multispectral data 
    • Validation and calibration of ocean color data 
    • Q&A


    Part Three: Remote Sensing of Shorelines


    • Geophysical components of shorelines 
    • The parts of a beach 
    • Field-based measurements in shorelines for image validation 
    • Image processing and analysis for shoreline characterization 
    • Q&A


    Each of the three parts includes links to the recording, presentation slides, and Question & Answer transcript.

  • Fundamentals of Remote Sensing [Introductory]

    These webinars are available for viewing at any time. They provide basic information about the fundamentals of remote sensing and are often a prerequisite for other ARSET training.

    OBJECTIVE
    Participants will become familiar with satellite orbits, types, resolutions, sensors, and processing levels. In addition to gaining a conceptual understanding of remote sensing, attendees will be able to articulate its advantages and disadvantages. Participants will also gain a basic understanding of NASA satellites, sensors, data, tools, portals, and applications for environmental monitoring and management.

    SESSIONS
    Session 1: Fundamentals of Remote Sensing
    A general overview of remote sensing and its application to disasters, health & air quality, land, water resource, and wildfire management.
    Session 1A: NASA's Earth Observing Fleet
    Get familiar with Earth-observing satellites in NASA's fleet, sensors that collect data you can use in ARSET training, and potential applications. 
    Session 2A: Satellites, Sensors, Data and Tools for Land Management and Wildfire Applications
    Specific satellites, sensors, and resources for remote sensing in land management and wildfires. This includes land cover mapping and products, fire detection products, detecting land cover change, and NDVI and EVI. 
    Session 2B: Satellites, Sensors, and Earth Systems Models for Water Resources Management
    Water resources management, an overview of relevant satellites and sensors, an overview of relevant Earth system models, and data and tools for water resources management. 
    Session 2C: Fundamentals of Aquatic Remote Sensing
    Overview of relevant satellites and sensors, and data and tools for aquatic environmental management. 

  • Dendro Open-Source Dropbox

    Dendro is a collaborative file storage and description platform designed to support users in collecting and describing data, with its roots in research data management. It is not intended to replace existing research data repositories, because it sits before the moment of deposit in a data repository.  The Dendro platform is open source, built entirely on Linked Open Data, and designed to help researchers describe their datasets. Whenever researchers want to publish a dataset, they can export it to repositories such as CKAN, DSpace, Invenio, or EUDAT's B2SHARE.

    It is designed to support the work of research groups with collaborative features such as:

    File metadata versioning
    Permissions management
    Editing and rollback
    Public/Private/Metadata Only project visibility

    You start by creating a “Project”, which is like a Dropbox shared folder. Projects can be private (completely invisible to non-collaborators), metadata-only (only metadata is visible, but data is not), or public (everyone can read both data and metadata). Project members can then upload files and folders and describe those resources using domain-specific and generic metadata, so the platform can suit a broad spectrum of data description needs. The contents of files that contain data (Excel or CSV, for example) are automatically extracted, as is text from others (PDF, Word, TXT, etc.), to assist discovery.

    Dendro provides a flexible data description framework with Linked Open Data at its core (backed by a triple store), scalable file storage for handling big files, BagIt-represented backups, authentication with ORCID, and sharing to practically any repository platform.

    Further information about Dendro can be found in its GitHub repository at https://github.com/feup-infolab/dendro.  Documentation and descriptions of Dendro in other languages can be found via the primary URL home page.

  • Overview of RSpace Electronic Lab Notebook (ELN) for Researchers

    A demonstration of RSpace, a software system for documenting your research work.

    More information about electronic lab notebooks (ELNs) can be found on the Gurdon Institute website at https://www.gurdon.cam.ac.uk/institute-life/computing/elnguidance. The site includes:

    • What is an Electronic Lab Notebook, and why should I use one?
    • A note about DIY documentation systems
    • Electronic Lab Notebook (ELN) vs Laboratory Information Management System (LIMS)
    • Disengagement - what if I want to change systems?
    • Which ELN would be best for me/my group?
    • Evaluating the product
    • Some current ELN products (suitability, platform, storage, and other details)


     
     

  • Using LabArchives Electronic Laboratory Notebook (ELN) in Your Lab

    LabArchives® is the leading secure and intuitive cloud-based Electronic Lab Notebook (ELN) application, enabling researchers to easily create, store, share, and manage their research data. Far more than an “ELN”, LabArchives provides a flexible, extensible platform that can be easily customized to match your lab's workflow, providing benefits to Principal Investigators, lab managers, staff, postdoctoral fellows, and graduate students.

    LabArchives ELN benefits and features:

    • Monitor, engage, and evaluate your teams' lab work
    • Notebook user access can be managed to allow access rights to certain notebooks, pages and/or entries
    • Interconnect all your lab data and image files to your observations and notes
    • Create and adhere to funding agency data management plans which require data sharing (via public URL or DOI)
    • Easily import and access digital experimental data captured by lab instruments and their hardware/software
    • Lab team can easily upload images and videos directly to their notebook while conducting lab experiments
    • Complete audit control: tracks and stores all revisions, by user, for every entry; no entry can be deleted, protecting your IP
    • Publish and share selected data or entire notebooks to specific individuals or the public


    More information about electronic lab notebooks (ELNs) can be found on the Gurdon Institute website at https://www.gurdon.cam.ac.uk/institute-life/computing/elnguidance. The site includes:

    • What is an Electronic Lab Notebook, and why should I use one?
    • A note about DIY documentation systems
    • Electronic Lab Notebook (ELN) vs Laboratory Information Management System (LIMS)
    • Disengagement - what if I want to change systems?
    • Which ELN would be best for me/my group?
    • Evaluating the product
    • Some current ELN products (suitability, platform, storage, and other details)