All Learning Resources

  • Centre Of Excellence Ocean Data Management

    This course provides a comprehensive introduction to a wide variety of Earth science datasets, formats, and analysis software. Students learn and practice methods using a common ocean area and are expected to create a personal project of data products for a marine region of their own choosing. Personal projects are presented by the students at the end of the course. This course requires either a guest or POGO Scholar login and is hosted on a Moodle platform.

    Aims and Objectives:
    -Recognize the importance of good research data management practice
    -Gain an introduction to the use of free software for the synthesis and analysis of marine data
    -Create and use multi-parameter marine data collections to prepare and publish standard data products
    -Develop marine data and products from multiple sources using selected software programs

    Course overview
    1. Course outline and summary
    2. Pre-course reading (optional)
    3. Introduction to IODE and data management
    4. Research Data Management
    5. Ocean Data Collections using Ocean Data View
    6. Introduction to Marine Metadata
    7. Managing Operational Data using Integrated Data Viewer
    8. Marine GIS operations using Saga
    9. Student Project - Marine data products for selected project areas

     

  • Train-the-Trainer Concept on Research Data Management

    Within the project FDMentor, a German Train-the-Trainer Programme on Research Data Management (RDM) was developed and piloted in a series of workshops. The topics cover many aspects of research data management, such as data management plans and the publication of research data, as well as didactic units on learning concepts, workshop design, and a range of didactic methods.
    After the end of the project, the concept was supplemented and updated by members of the Sub-Working Group Training/Further Education (UAG Schulungen/Fortbildungen) of the DINI/Nestor Working Group Research Data (DINI/Nestor-AG Forschungsdaten). The newly published English version of the Train-the-Trainer Concept contains the translated concept, the materials, and all methods of the Train-the-Trainer Programme. Furthermore, additional English references and materials complement this version.
    This document is primarily intended for trainers who want to conduct a Train-the-Trainer workshop on research data management. It contains background knowledge for the PowerPoint slides and teaching scripts, as well as further information on the individual subject areas, as required for reuse and implementation of a two-day workshop of seven and a half hours per day.

    Each unit of this guide contains information about how to teach the unit, including the unit's learning objectives, key aspects, contents, didactic methods and exercises, training materials, additional sources, templates, and teaching scripts.

    Topics of the units include orientation, didactic approach, digital research data, research data policies, data management plans, order and structure, documentation and metadata, storage and backup, long-term archiving, access control, formal framework, data publication, re-use of research data, legal aspects, institutional infrastructure, training exercises, concept development, and didactic methods.
     

  • Environmental Data Initiative Five Phases of Data Publishing Webinar - What are metadata and structured metadata?

    Metadata are essential to understanding a dataset. The talk covers:

    • How structured metadata are used to document, discover, and analyze ecological datasets.
    • Tips on creating quality metadata content.
    • An introduction to the metadata language used by the Environmental Data Initiative, Ecological Metadata Language (EML). EML is written in XML, a general purpose mechanism for describing hierarchical information, so some general XML features and how these apply to EML are covered.

    This video in the Environmental Data Initiative (EDI) "Five Phases of Data Publishing" tutorial series covers the third phase of data publishing, describing.
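
    As a rough illustration of what the webinar means by structured, hierarchical metadata, the short Python sketch below parses a tiny EML-like XML fragment with the standard library. The element names are illustrative only and do not form a complete or schema-validated EML record.

      # Illustrative only: a tiny EML-like XML fragment and how hierarchical
      # metadata can be read with Python's standard library.
      import xml.etree.ElementTree as ET

      snippet = """
      <dataset>
        <title>Lake temperature observations, 2015-2020</title>
        <creator>
          <individualName>
            <givenName>Ada</givenName>
            <surName>Lovelace</surName>
          </individualName>
        </creator>
        <coverage>
          <temporalCoverage>
            <beginDate>2015-01-01</beginDate>
            <endDate>2020-12-31</endDate>
          </temporalCoverage>
        </coverage>
      </dataset>
      """

      root = ET.fromstring(snippet)
      print(root.findtext("title"))                           # dataset title
      print(root.findtext("creator/individualName/surName"))  # nested (hierarchical) element
      print(root.findtext("coverage/temporalCoverage/beginDate"))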

     

  • Environmental Data Initiative Five Phases of Data Publishing Webinar - Creating "clean" data for archiving

    Not all data are easy to use, and some are nearly impossible to use effectively. This presentation lays out the principles and some best practices for creating data that will be easy to document and use. It identifies many of the pitfalls in data preparation and formatting that cause problems further down the line and shows how to avoid them.

    This video in the Environmental Data Initiative (EDI) "Five Phases of Data Publishing" tutorial series covers the second phase of data publishing, cleaning data. For more guidance from EDI on data cleaning, also see "How to clean and format data using Excel, OpenRefine, and Excel," located here: https://www.youtube.com/watch?v=tRk01ytRXjE.

  • Environmental Data Initiative Five Phases of Data Publishing Webinar - How to clean and format data using Excel, OpenRefine, and Excel

    This webinar provides an overview of some of the tools available for formatting and cleaning data,  guidance on tool suitability and limitations, and an example dataset and instructions for working with those tools.

    This video in the Environmental Data Initiative (EDI) "Five Phases of Data Publishing" tutorial series covers the second phase of data publishing, cleaning data.

    For more guidance from EDI on data cleaning, also see "Creating 'clean' data for archiving," located here: https://www.youtube.com/watch?v=gW_-XTwJ1OA.
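
    The short Python sketch below (using pandas; it is not taken from either webinar) shows the kind of clean-up steps these tools automate: normalizing column names, treating sentinel codes such as -999 as missing values, and parsing dates. The file contents are invented for illustration.

      # Hypothetical example data; -999 is used here as a missing-value code.
      import io
      import pandas as pd

      raw = io.StringIO(
          "Site Name,Sample Date,Temp (C)\n"
          "Lake A,2019-06-01,14.2\n"
          "Lake B,2019-06-02,-999\n"
      )

      df = pd.read_csv(raw, na_values=[-999])           # sentinel code -> NaN
      df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
      df["sample_date"] = pd.to_datetime(df["sample_date"])
      print(df.dtypes)
      print(df)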

  • Introduction to Scientific Visualization

    Scientific Visualization transforms numerical data sets obtained through measurements or computations into graphical representations. Interactive visualization systems allow scientists, engineers, and biomedical researchers to explore and analyze a variety of phenomena in an intuitive and effective way. The course provides an introduction to the principles and techniques of Scientific Visualization. It covers methods corresponding to the visualization of the most common data types, as well as higher-dimensional, so-called multi-field problems. It combines a description of visualization algorithms with a presentation of their practical application. Basic notions of computer graphics and human visual perception are introduced early on for completeness. Simple but very instructive programming assignments offer a hands-on exposure to the most widely used visualization techniques.
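
    As a flavor of the techniques such a course covers, the following self-contained Python sketch (not course material) renders one of the most common scientific-visualization primitives, a filled contour plot of a two-dimensional scalar field, using NumPy and Matplotlib.

      # Contouring a synthetic 2D scalar field.
      import numpy as np
      import matplotlib.pyplot as plt

      x = np.linspace(-2, 2, 200)
      y = np.linspace(-2, 2, 200)
      X, Y = np.meshgrid(x, y)
      Z = np.exp(-(X**2 + Y**2)) * np.cos(3 * X)   # synthetic scalar field

      fig, ax = plt.subplots()
      filled = ax.contourf(X, Y, Z, levels=20, cmap="viridis")
      ax.contour(X, Y, Z, levels=10, colors="k", linewidths=0.4)
      fig.colorbar(filled, ax=ax, label="field value")
      ax.set_title("Filled contours of a synthetic 2D scalar field")
      plt.show()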

    Note that the lectures, demonstration, and tutorial content require a Purdue, HydroShare, or CILogon account.

    Access the CCSM Portal/ESG/ESGC Integration slide presentation at  https://mygeohub.org/resources/50/download/ccsm.pdf. The CCSM/ESG/ESGC collaboration provides a semantically enabled environment that includes modeling, simulated and observed data, visualization, and analysis.
    Topics include:

    • CCSM Overview
    • CCSM on the TeraGrid
    • Challenges
    • Steps in a typical CCSM Simulation
    • Climate Modeling Portal: Community Climate System Model (CCSM) to simulate climate change on Earth
    • CCSM Self-Describing Workflows 
    • Provenance metadata collection
    • Metadata

     

  • Introduction: FAIR Principles and Management Plans

    This presentation introducing the FAIR (Findable, Accessible, Interoperable, Re-usable) data principles and management plans is one of 9 webinars on topics related to FAIR Data and Software that were offered at a Carpentries-based workshop in Hannover, Germany, July 9-13, 2018. Presentation slides are also available in addition to the recorded presentation.

    Other topics in the series include:
    - Findability of Research Data and Software through PIDs and FAIR
    - Accessibility through Git, Python Functions and Their Documentation
    - Interoperability through Python Modules, Unit-Testing and Continuous Integration
    - Reusability through Community Standards, Tidy Data Formats and R Functions, their Documentation, Packaging, and Unit-Testing
    - Reusability:  Data Licensing
    - Reusability:  Software Licensing
    - Reusability:  Software Publication
    - FAIR Data and Software - Summary

    URL locations for the other modules in the webinar can be found at the URL above.
     

  • IOCCP & BONUS INTEGRAL Training Course on "Instrumenting our oceans for better observation: a training course on a suite of biogeochemical sensors"

    Building on the success of prior training courses, the International Ocean Carbon Coordination Project (IOCCP) and the EU BONUS INTEGRAL project (Integrated carboN and TracE Gas monitoRing for the bALtic sea) organized an international training course on "Instrumenting our ocean for better observation: a training course on a suite of biogeochemical sensors." The course was held June 10-19, 2019, at the Sven Lovén Center for Marine Sciences in Kristineberg, Sweden. This course responded to the growing demand from the global ocean observing system and the marine biogeochemistry community for expanding the correct use and generation of information from a suite of autonomous biogeochemical sensors.

    The goal of the course was to train the new generation of marine biogeochemists in the use of a suite of biogeochemical sensors and to assure the best possible quality of the data produced. This intensive training course provided trainees with lectures and hands-on field and laboratory experience with sensors (deployment, interfacing, troubleshooting, and calibration), and provided in-depth knowledge on data reduction and quality control as well as data management. This course also offered an overview of the use of remote sensing, modeling, and intelligent data extrapolation techniques.

    It provides a comprehensive set of training materials divided into several topics. The course materials include video-recorded lectures and/or lecture slideshows in PDF supplemented with links and references to various materials such as manuals, guides, and best practices. 

    Note: Please explore the contents of this course as a self-learning course. Note, however, that the contents of this training course were designed for a face-to-face context. As such, some features (assignments, discussion fora, etc.) may not work properly and we cannot ensure tutor support. For any queries please contact [email protected] and we will do our best to redirect you to an expert who can assist you. Thank you for your understanding.
    Topic 1: Scientific importance of instrumenting our ocean
    Topic 2: Coordinated global observing networks for marine biogeochemistry
    Topic 3: Sensors inside out
    Topic 4: Interfacing sensors
    Topic 5: Calibration and validation: what are the needs?
    Topic 6: The carbonate system: assessing and controlling measurement uncertainty in estimating the seawater CO2 system
    Topic 7: Equilibrator-based surface measurements
    Topic 8: How to choose the right sensor depending on your circumstances?
    Topic 9: Theory of data processing
    Topic 10: Combining remote sensing and in situ biogeochemical observations
    Topic 11: How to take care of data?
    Topic 12: Modelling for best observations design
    Topic 13: "Smart" data extrapolation
    Topic 14: From surface measurements to ocean-atmosphere fluxes
    Topic 15: Emerging technologies
    Topic 16: Ocean Best Practices (OBP) Initiative and Repository
     

  • MBON Pole to Pole Of The Americas: Tools For The Analysis Of Biodiversity Data Using OBIS And Remote Sensing Data

    The Marine Biodiversity Observation Network (MBON) Pole to Pole organized a second Marine Biodiversity Workshop - From the Sea to the Cloud - after a successful first workshop held during the 2018 AmeriGEOSS Week in Brazil. This activity advanced the implementation of the MBON Pole to Pole network by enhancing knowledge on field data collection methods and the use of informatic technologies for data management and analysis.

    Github site: marinebon.github.io/p2p-mexico-workshop/index.html

    The purpose was to continue the development of a community of practice dedicated to understanding change in marine biodiversity and generating knowledge and products that inform conservation and management strategies of marine living resources by engaging researchers, managers, and policy-makers with interest in biodiversity monitoring and data synthesis and analysis. During this workshop, participants:
    -Advanced previously agreed-upon field sampling protocols for rocky shores and sandy beaches;
    -Converted tabular and spatial data already collected at their study sites into standardized formats using Darwin Core vocabularies and quality controls (a minimal mapping sketch follows this list);
    -Developed specific vocabularies for flora and fauna of rocky shore and sandy beach measured during field surveys;
    -Published survey datasets to the Ocean Biogeographic Information System (OBIS) using tools for sharing data;
    -Advanced knowledge of data science tools (R, Rmarkdown, GitHub) to mine, visualize, and analyze data, and to produce reproducible research documents with interactive visualizations on the web.
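
    The following minimal Python sketch (hypothetical, not workshop material) illustrates the Darwin Core standardization step referenced above: renaming survey-table columns to Darwin Core terms such as scientificName and eventDate and adding the constant fields an occurrence record needs before publication to OBIS. The rows and identifiers are invented.

      import io
      import pandas as pd

      field = io.StringIO(
          "site,date,species,count\n"
          "RockyShore-01,2019-04-12,Tegula funebralis,37\n"
          "SandyBeach-02,2019-04-13,Emerita analoga,112\n"
      )
      raw = pd.read_csv(field)

      # Rename columns to Darwin Core terms and add required constant fields.
      occ = raw.rename(columns={
          "species": "scientificName",
          "date": "eventDate",
          "count": "individualCount",
          "site": "locationID",
      })
      occ["basisOfRecord"] = "HumanObservation"
      occ["occurrenceID"] = [f"p2p:{i:04d}" for i in range(1, len(occ) + 1)]

      print(occ[["occurrenceID", "eventDate", "locationID",
                 "scientificName", "individualCount", "basisOfRecord"]])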

    The MBON Pole to Pole workshops are designed to:

    • enhance coordination of data collection among nations;
    • improve the collection of harmonized data, developing data standards and methodologies for data management and dissemination without compromising national concerns;
    • support the integration of biodiversity information with physical and chemical data over time (status and trends); and
    • generate products needed for informed policy and management of the ocean.
    The workshop targeted investigators and resource managers dedicated to studying and conserving biodiversity of invertebrates in two important coastal habitats: rocky shore intertidal zone and sandy beaches. This activity targeted participants from all nations in the Americas, from pole to pole.

    Note: Please explore the contents of this course as a self-learning course. Note, however, that the contents of this training course were designed for a face-to-face context. As such, some features (assignments, discussion fora, etc.) may not work properly and we cannot ensure tutor support. For any queries please contact [email protected] and we will do our best to redirect you to an expert who can assist you. Thank you for your understanding.

  • Ocean-Colour Data In Climate Studies

    The course will deliver training in ocean-colour data and their applications in climate studies. Remote sensing experts from the Plymouth Marine Laboratory (PML) will guide students through a combination of lectures and computer-based exercises covering the following topics:
    -Introduction to ocean colour;
    -Modelling primary production;
    -Ocean colour applications for ecosystem state assessment;
    -Climate impacts and feedbacks;
    -Ocean colour in data assimilation;
    -Dataset archive, management, visualization, and analysis.
    Objectives
    The objectives and learning outcomes of this course are for students to be able to:
    • Understand the fundamentals of ocean colour;
    • Understand the principles for modeling primary production, for detecting phytoplankton size structure and Harmful Algal Blooms, and for estimating phytoplankton phenology;
    • Conduct research with ocean-colour data for ecosystem state assessment, model validation, data assimilation, and climate research;
    • Plan requirements for processing, managing, and archiving large datasets;
    • Apply tools and statistical methods for visualization and analysis of ocean-colour data with their associated uncertainty.

  • Research Design, Data Management & Data Communication in Marine Sciences - Module 3 - Data Communication

    The contents of the course are:
    1 - Scientific paper writing
    2 - Communication with the media
    3 - Communication via online platforms: Academic sites and profiling
    4 - Communication via online platforms: social media
    5.1 - Visualizations (option 1)
    5.2 - Presentations & public speaking (option 2)
    5.3 - Storytelling (option 3)
    6 – Assignments

     

  • Data Visualization of Marine Met Data (using FERRET)

    Data visualization is the science of describing the significance of data by placing it in a visual context. It is especially useful for marine meteorological (met) data, since patterns, trends, and correlations that might go undetected in text-based data can be exposed and recognized more easily with data visualization software.
    This course demonstrates the use of the open-source software FERRET to generate NetCDF data, visualize various types of plots, and save and reuse them at a later stage. The course is designed to be a mix of practical and theoretical sessions.
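
    The course itself uses FERRET and its JNL scripts; as a rough analogue only, the Python sketch below opens a NetCDF file and maps one two-dimensional field with xarray and Matplotlib. The file name "sst.nc" and the variable name "sst" are placeholders for whatever dataset you are working with.

      import xarray as xr
      import matplotlib.pyplot as plt

      ds = xr.open_dataset("sst.nc")     # any CF-style NetCDF file (placeholder name)
      field = ds["sst"].isel(time=0)     # first time step of a (time, lat, lon) variable

      field.plot(cmap="viridis")         # axes are labeled from coordinate metadata
      plt.title("Sea surface temperature, first time step")
      plt.savefig("sst_map.png", dpi=150)
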
    Aims and Objectives:
    -Provide exposure to Data Visualization using FERRET.
    -Generation of scripts to visualize various types of data sets (1D, 2D, 3D, etc.).
    -Perform data analysis, generate added-value products.
     Learning Outcomes:
    -Knowledge and understanding of FERRET software.
    -Generation of different types of JNL scripts for visualization and analysis.
    -Ability to visualize datasets such as in situ, remote sensing, and model outputs.
    -Tools for visualizing different types of ocean data and data products.
    Course Pre-requisites:
    -Candidates should have knowledge of the data formats in which most oceanographic data sets are available;
    -Candidates should preferably be working in institutions responsible for the management of oceanographic and/or atmospheric data.

    Note: Please explore the contents of this course as a self-learning course. Note, however, that the contents of this training course were designed for a face-to-face context. As such, some features (assignments, discussion fora, etc.) may not work properly and we cannot ensure tutor support. For any queries please contact [email protected] and we will do our best to redirect you to an expert who can assist you. Thank you for your understanding.

  • Health And Medical Short Bites #1 - Funders And Publishers

    This short webinar is the first of five in the Health and Medical Short Bites webinar series, which aims to support better management and publication of Health and Medical data.

    The webinar presents a funder perspective: Dr. Wee-Ming Boon speaks about the NHMRC's statement on data sharing, and Jeremy Kenner reviews the National Statement on Ethical Conduct in Human Research.
    It also presents a publisher perspective on data: Peter D'Onghia, Senior Journal Publishing Manager at Wiley with a portfolio of journals in health and life sciences, discusses the new Wiley data policies.

    Find other parts of this webinar:
    Health and Medical Short Bites #2 - Storing and publishing health and medical data

    Health and medical data #3 - Ethics, legal issues, and data sharing

    Health & Medical Data Short Bites #4 - Patient views on data sharing

    Health & Medical Data Short Bites #5 - Data linkage and the Australian Health Thesaurus

    Recording, slides, transcripts, and links are available for all health and medical webinars.

  • Health And Medical Short Bites #2 - Storing And Publishing Health And Medical Data

    This short webinar is the second of five in the Health and Medical Short Bites webinar series, which aims to support better management and publication of Health and Medical data.

    Kate LeMay covered general storage and repository options for health and medical data, including institutional, discipline-specific, and non-specific repositories. Jeff Christiansen introduced Med.data.edu.au, a national facility that provides petabyte-scale research data storage and related high-speed networked computational services to Australian medical and health research organizations, covering how to find, use, and store data.
     

    Find other parts of this webinar:

    Health And Medical Short Bites #1 - Funders And Publishers

    Health and medical data #3 - Ethics, legal issues, and data sharing

    Health & Medical Data Short Bites #4 - Patient views on data sharing

    Health & Medical Data Short Bites #5 - Data linkage and the Australian Health Thesaurus

    Recordings, slides, transcripts, and links for all health and medical webinars are available.

  • Health and Medical Data #3 - Ethics, Legal Issues and Data Sharing

    This short webinar is the third of five in the Health and Medical Short Bites webinar series, which aims to support better management and publication of Health and Medical data.

    The legal framework around privacy in Australia is complex and differs between states. Many Acts regulate the collection, use, disclosure, and handling of private data. Principles to follow around sensitive data include managing personal information openly and transparently, collecting only necessary information, and adequately de-identifying data when possible. There are also many ethical considerations around the management and sharing of sensitive data. Informed consent by research participants is essential for the collection, use, and sharing of sensitive data. Storage, access, de-identification, and plans for sharing are very important considerations.
    Topics include:
    1) Legal issues and data sharing
    2) Ethics and data sharing

    Find other parts of this webinar:
    Health and Medical Short Bites #1 - Funders and publishers 

    Health and Medical Short Bites #2 - Storing and publishing health and medical data

    Health & Medical Data Short Bites #4 - Patient views on data sharing

    Health & Medical Data Short Bites #5 - Data linkage and the Australian Health Thesaurus

    Recordings, slides, transcripts and links are available for all webinars in this series.

  • Health & Medical Data Short Bites #4 - Patient Views on Data Sharing

    This short webinar is the fourth in the Health and Medical Short Bites webinar series which aims to support better management and publication of Health and Medical data.

    Informed consent by research participants is essential for the collection, use, and sharing of sensitive data. Storage, access, de-identification, and plans for sharing are important considerations when gaining patient consent.

    Find other parts of this webinar:
    Health and Medical Short Bites #1 - Funders and publishers 

    Health and Medical Short Bites #2 - Storing and publishing health and medical data

    Health and medical data #3 - Ethics, legal issues, and data sharing

    Health & Medical Data Short Bites #5 - Data linkage and the Australian Health Thesaurus

    Recordings, slides, transcripts and links are available for all health and medical webinars.  

  • Health & Medical Data Short Bites #5 - Data linkage and the Australian Health Thesaurus

    This short webinar is the fifth in the Health and Medical Short Bites webinar series which aims to support better management and publication of Health and Medical data.

     Topics and Speakers include:
    1) Data linkage: processes, how it's done, and data availability statements. Dr. Trisha Johnston, Director of the Statistical Analysis Linkage Team, Queensland Department of Health, points out that data linkage is an efficient way to enhance existing data to increase its usefulness for informing population health and clinical research, policy development, and health service planning, management, monitoring, and evaluation.
    2) Australian Health Thesaurus: James Humffray, Manager of the Australian Health Thesaurus, talks about how Health Direct uses the thesaurus to improve the user's search experience through auto-suggestion of search terms, alternative terms or synonyms for finding content, ranking of the most relevant content in search results, and facets or filters to narrow down a user's search results.

    Find other parts of this webinar:
    Health and Medical Short Bites #1 - Funders and publishers 

    Health and Medical Short Bites #2 - Storing and publishing health and medical data

    Health and medical data #3 - Ethics, legal issues, and data sharing

    Health & Medical Data Short Bites #4 - Patient views on data sharing

    Recordings, slides, transcripts, and links are available for all health and medical webinars.

  • ORCID Communications Toolkit: Interactive Outreach Presentation

    This presentation includes information about ORCID (Open Researcher and Contributor ID), short quizzes, and workshop activities. Select content for your presentation according to how much time you have, what suits your audience's level of knowledge, and whether you want to include practical tasks. After downloading, the slides can be edited to include your institution's name and presentation details.

  • Research Data Management and Open Data

    This was a presentation during the Julius Symposium 2017 on Open Science, in particular on open data and/or FAIR data. Examples are drawn from medical and health research data.

  • Sociolegal and Empirical Legal Research - Research Data Management

    An overview presentation of research data management methods and practices - planning, file organization, storage, security, ethics, archives - with examples drawn from legal research.

    Much of the presentation is in English, but examples and some specific explanations of concepts are in Swedish.

  • The Geoscience Paper of the Future: OntoSoft Training

    This presentation was developed to train scientists on best practices for digital scholarship, reproducibility, and data and software sharing.  It was developed as part of the NSF EarthCube Initiative and funded under the OntoSoft project.  More details about the project can be found at https://www.scientificpaperofthefuture.org/gpf/index.html.

    A PowerPoint version of the slides is available upon request from [email protected].

    These OntoSoft GPF training materials were developed and edited by Yolanda Gil (USC), with contributions from the OntoSoft team including Chris Duffy (PSU), Chris Mattmann (JPL), Scott Pechkam (CU), Ji-Hyun Oh (USC), Varun Ratnakar (USC), and Erin Robinson (ESIP). They were significantly improved through input from GPF pioneers Cedric David (JPL), Ibrahim Demir (UI), Bakinam Essawy (UV), Robinson W. Fulweiler (BU), Jon Goodall (UV), Leif Karlstrom (UO), Kyo Lee (JPL), Heath Mills (UH), Suzanne Pierce (UT), Allen Pope (CU), Mimi Tzeng (DISL), Karan Venayagamoorthy (CSU), Sandra Villamizar (UC), and Xuan Yu (UD). Others contributed with feedback on best practices, including Ruth Duerr (NSIDC), James Howison (UT), Matt Jones (UCSB), Lisa Kempler (MathWorks), Kerstin Lehnert (LDEO), Matt Meyernick (NCAR), and Greg Wilson (Software Carpentry). These materials were also improved thanks to the many scientists and colleagues who have taken the training and asked hard questions about GPFs.

  • Teledetección de Ecosistemas Costeros

    Marine and coastal ecosystems play vital roles in carbon storage and in the recycling of nutrients and other materials, and they serve as reservoirs of biodiversity. They also provide ecosystem services such as food for millions of people, coastal protection against wave action, and recreational activities. Remote sensing of coastal and marine ecosystems is particularly difficult. Up to 80% of the signal received by sensors in orbit comes from the atmosphere. In addition, the dissolved and suspended components of the water column attenuate most of the light through absorption or scattering. When it comes to retrieving information about the ocean bottom, even in the clearest waters, less than 10% of the signal comes from the seafloor. Users, particularly those with little remote sensing experience, can benefit from this training, which covers some of the difficulties associated with remote sensing of coastal ecosystems, particularly beaches and benthic communities such as coral reefs and seagrasses.




    LEARNING OBJECTIVES
    By the end of this training, attendees will be able to:

    • Identify the different components of the water column and how they affect the remote sensing signal of shallow-water ecosystems.
    • Describe the existing satellite sensors used to analyze ocean color and to characterize shallow-water ecosystems.
    • Understand the interaction between water constituents, the electromagnetic spectrum, and the remote sensing signal.
    • Recognize the different processes used to remove water-column attenuation from the remote sensing signal in order to characterize benthic components.
    • Summarize techniques for characterizing coastal beach environments with remote sensing data and with field methods for beach profiling.



    COURSE FORMAT

    • Three one-hour sessions, each with presentations in English and Spanish
    • One homework assignment to be submitted via Google Forms
    • English



    Part One: A Look at Coastal Ecosystems and Remote Sensing

    • Introduction to coastal ecosystems
    • An overview of the sensors most commonly used for remote sensing of coastal areas
    • Questions and answers



    Part Two: Penetration of Light in the Water Column

    • Apparent and inherent properties
    • Bio-optical field measurements
    • Water column corrections
    • Derivation of bathymetry and benthic characterization using multispectral data
    • Calibration and validation of ocean color data
    • Questions and answers


    Part Three: Remote Sensing of Shoreline Components

    • Geophysical components of the shoreline
    • The parts of a beach
    • Field measurements along the shoreline needed to validate imagery
    • Image processing and analysis for shoreline characterization
    • Questions and answers


    Materials:


    • View the recording
    • Presentation slides
    • Homework assignment
    • Transcript of questions and answers

  • Remote Sensing of Coastal Ecosystems [Introductory]

    Coastal and marine ecosystems serve key roles for carbon storage, nutrients, and materials cycling, as well as reservoirs of biodiversity. They also provide ecosystem services such as sustenance for millions of people, coastal protection against wave action, and recreational activities. Remote sensing of coastal and marine ecosystems is particularly challenging. Up to 90% of the signal received by the sensors in orbit comes from the atmosphere. Additionally, dissolved and suspended constituents in the water column attenuate most of the light received through absorption or scattering. When it comes to retrieving information about shallow-water ecosystems, even in the clearest waters under the clearest skies, less than 10% of the signal originates from the water and its bottom surface. Users, particularly those with little remote sensing experience, stand to benefit from this training covering some of the difficulties associated with remote sensing of coastal ecosystems, particularly beaches and benthic communities such as coral reefs and seagrass.



    OBJECTIVES
    By the end of this training, attendees will be able to:

    • Identify the different water column components and how they affect the remote sensing signal of shallow-water ecosystems
    • Outline existing satellite sensors used for ocean color and shallow-water ecosystem characterization
    • Understand the interaction between water constituents, the electromagnetic spectrum, and the remote sensing signal
    • Recognize the different processes used to remove the water column attenuation from the remotely-sensed signal to characterize benthic components
    • Summarize techniques for characterizing shoreline beach environments with remotely-sensed data and field methods for beach profiling


    COURSE FORMAT


    • Three one-hour sessions with presentations in English and Spanish
    • One Google Form homework
    • Spanish sessions 



    PREREQUISITES




    Part One: Overview of Coastal Ecosystems and Remote Sensing


    • Introduction to coastal and marine ecosystems
    • Overview of sensors for remote sensing of coastal areas
    • Q&A


    Part Two: Penetration of Light in the Water Column


    • Apparent and inherent optical properties 
    • Field bio-optical measurements 
    • Water column corrections 
    • Deriving bathymetry and benthic characterization from multispectral data 
    • Validation and calibration of ocean color data 
    • Q&A


    Part Three: Remote Sensing of Shorelines


    • Geophysical components of shorelines 
    • The parts of a beach 
    • Field-based measurements in shorelines for image validation 
    • Image processing and analysis for shoreline characterization 
    • Q&A


    Each of the three parts includes links to the recording, presentation slides, and a question-and-answer transcript.

  • Fundamentals of Remote Sensing [Introductory]

    These webinars are available for viewing at any time. They provide basic information about the fundamentals of remote sensing and are often a prerequisite for other ARSET training.

    OBJECTIVE
    Participants will become familiar with satellite orbits, types, resolutions, sensors, and processing levels. In addition to a conceptual understanding of remote sensing, attendees will also be able to articulate its advantages and disadvantages. Participants will also have a basic understanding of NASA satellites, sensors, data, tools, portals, and applications to environmental monitoring and management.

    SESSIONS
    Session 1: Fundamentals of Remote Sensing
    A general overview of remote sensing and its application to disasters, health & air quality, land, water resource, and wildfire management.
    Session 1A: NASA's Earth Observing Fleet
    Get familiar with Earth-observing satellites in NASA's fleet, sensors that collect data you can use in ARSET training, and potential applications. 
    Session 2A: Satellites, Sensors, Data and Tools for Land Management and Wildfire Applications
    Specific satellites, sensors, and resources for remote sensing in land management and wildfires. This includes land cover mapping and products, fire detection products, detecting land cover change, and NDVI and EVI. 
    Session 2B: Satellites, Sensors, and Earth Systems Models for Water Resources Management
    Water resources management, an overview of relevant satellites and sensors, an overview of relevant Earth system models, and data and tools for water resources management. 
    Session 2C: Fundamentals of Aquatic Remote Sensing
    Overview of relevant satellites and sensors, and data and tools for aquatic environmental management. 

  • Dendro Open-Source Dropbox

    Dendro is a collaborative file storage and description platform designed to support users in collecting and describing data, with its roots in research data management. It is not intended to replace existing research data repositories; it sits before the moment of deposit in a data repository. The Dendro platform is open source, designed to help researchers describe their datasets, and fully built on Linked Open Data. Whenever researchers want to publish a dataset, they can export it to repositories such as CKAN, DSpace, Invenio, or EUDAT's B2SHARE.

    It is designed to support the work of research groups with collaborative features such as:

    File metadata versioning
    Permissions management
    Editing and rollback
    Public/Private/Metadata Only project visibility

    You start by creating a “Project”, which is like a Dropbox shared folder. Projects can be private (completely invisible to non-collaborators), metadata-only (only metadata is visible, but data is not), or public (everyone can read both data and metadata). Project members can then upload files and folders and describe those resources using domain-specific and generic metadata, so the platform can suit a broad spectrum of data description needs. The contents of some data files (Excel and CSV, for example) are automatically extracted, as is text from others (PDF, Word, TXT, etc.), to assist discovery.

    Dendro provides a flexible data description framework built on Linked Open Data at its core (a triple store), scalable file storage for handling big files, BagIt-represented backups, authentication with ORCID, and sharing to practically any repository platform.

    Further information about Dendro can be found on its Github repository at:  https://github.com/feup-infolab/dendro.  Documentation and descriptions of Dendro can be found in other languages from the primary URL home page.

  • Overview of RSpace Electronic Lab Notebook (ELN) for Researchers

    A demonstration of RSpace, a software system for documenting your research work.

    More information about electronic lab notebooks (ELNs) can be found on the Gurdon Institute website at https://www.gurdon.cam.ac.uk/institute-life/computing/elnguidance. The site includes:

    • What is an Electronic Lab Notebook, and why should I use one?
    • A note about DIY documentation systems
    • Electronic Lab Notebook (ELN) vs Laboratory Information Management System (LIMS)
    • Disengagement - what if I want to change systems?
    • Which ELN would be best for me/my group?
    • Evaluating the product
    • Some current ELN products (suitability, platform, storage, and other details)


     
     

  • Using LabArchives Electronic Laboratory Notebook (ELN) in Your Lab

    LabArchives® is the leading secure and intuitive cloud-based Electronic Lab Notebook (ELN) application enabling researchers to easily create, store, share and manage their research data. Far more than an “ELN”, LabArchives provides a flexible, extensible platform that can be easily customized to match your lab's workflow providing benefits to Principal Investigators, lab managers, staff, postdoctoral fellows, and graduate students.

    LabArchives ELN benefits and features:

    • Monitor, engage, and evaluate your teams' lab work
    • Notebook user access can be managed to allow access rights to certain notebooks, pages and/or entries
    • Interconnect all your lab data and image files to your observations and notes
    • Create and adhere to funding agency data management plans which require data sharing (via public URL or DOI)
    • Easily import and access digital experimental data captured from original lab machines produced by hardware/software
    • Lab team can easily upload images and videos directly to their notebook while conducting lab experiments
    • Complete audit control - tracks and stores ALL revisions, by users, for every entry - NO entry can be deleted - Protect IP
    • Publish and share selected data or entire notebooks to specific individuals or the public


    More information about electronic lab notebooks (ELNs) can be found on the Gurdon Institute website at https://www.gurdon.cam.ac.uk/institute-life/computing/elnguidance. The site includes:

    • What is an Electronic Lab Notebook, and why should I use one?
    • A note about DIY documentation systems
    • Electronic Lab Notebook (ELN) vs Laboratory Information Management System (LIMS)
    • Disengagement - what if I want to change systems?
    • Which ELN would be best for me/my group?
    • Evaluating the product
    • Some current ELN products (suitability, platform, storage, and other details)

  • LabArchives Electronic Lab Notebook (ELN): Introduction to Professional Edition

    Demonstration of the LabArchives Professional Edition Electronic Lab Notebook (ELN).

    A wide range of resources for prospective electronic lab notebook (ELN) users can be found on the Gurdon Institute website at https://www.gurdon.cam.ac.uk/institute-life/computing/elnguidance. The site includes:

    • What is an Electronic Lab Notebook, and why should I use one?
    • A note about DIY documentation systems
    • Electronic Lab Notebook (ELN) vs Laboratory Information Management System (LIMS)
    • Disengagement - what if I want to change systems?
    • Which ELN would be best for me/my group?
    • Evaluating the product
    • Some current ELN products (suitability, platform, storage, and other details)

  • Introduction to sciNote Electronic Lab Notebook (ELN) and its Main Functionalities

    Whether you are just beginning your scientific career or already have vast experience, sciNote Electronic Lab Notebook enables you to organize your projects, experiments, and protocols and keep track of it all. It is easy to use, reliable and flexible. During this tutorial we will give you a detailed overview of sciNote functionalities, from experiment design to electronic signatures.

    A wide range of resources for prospective electronic lab notebook (ELN) users can be found on the Gurdon Institute website at https://www.gurdon.cam.ac.uk/institute-life/computing/elnguidance. The site includes:

    • What is an Electronic Lab Notebook, and why should I use one?
    • A note about DIY documentation systems
    • Electronic Lab Notebook (ELN) vs Laboratory Information Management System (LIMS)
    • Disengagement - what if I want to change systems?
    • Which ELN would be best for me/my group?
    • Evaluating the product
    • Some current ELN products (suitability, platform, storage, and other details)

  • sciNote Electronic Lab Notebook (ELN) Features Walkthrough

    sciNote electronic lab notebook enables you to organize your scientific data in a systematic way, so that you can quickly find it or cluster it together within a report. Every scientist has their own unique way of writing things down in the notebook, and sciNote allows this necessary flexibility. In this video, we will walk you through the main sciNote functionalities, from experiment design to creating a report.

    A wide range of resources for prospective electronic lab notebook (ELN) users can be found on the Gurdon Institute website at https://www.gurdon.cam.ac.uk/institute-life/computing/elnguidance. The site includes:

    • What is an Electronic Lab Notebook, and why should I use one?
    • A note about DIY documentation systems
    • Electronic Lab Notebook (ELN) vs Laboratory Information Management System (LIMS)
    • Disengagement - what if I want to change systems?
    • Which ELN would be best for me/my group?
    • Evaluating the product
    • Some current ELN products (suitability, platform, storage, and other details)

  • Reference Management Tools

    Reference Management Tools help scholars to create and manage their lists of references for research projects. Most tools are designed to organize citations into specific formats for the preparation of manuscripts and bibliographies. Many search tools provide ways to download references into reference management tools. 
    This module will help you develop the skills necessary to use three of the most widely used reference management tools: Mendeley, Zotero, and EndNote Web. Exercises using each tool are included in the downloadable slides.
    Part A: Mendeley
    Mendeley is a free reference manager and academic social network that can help you organize your research, collaborate with others online, and discover the latest research. In this module you will learn how to:
    • Register to Mendeley
    • Create your Mendeley Library
    • Manage your documents and references
    • Cite references
    • Share Documents and References
    Part B: Zotero
    Zotero is a free and open-source reference management software to manage bibliographic data and related research materials. In this module you will learn how to:
    • Install Zotero
    • Create your Zotero library
    • Add references to your Zotero library
    • Manage your documents and references
    • Create bibliographies
    • Use Zotero with MS Word
     

  • OSTI Lecture 5: The Changing Face of Publication

    The following video is an original recording from the OSTI pilot initiative. Part 5 of the Graduate Training in Open Science series entitled "The Changing Face of Publication", the seminar introduces students to the concept of Open Access publishing and contrasts this approach with traditional publishing models. This lecture proved one of the most popular in the course, based on student feedback at the end of the pilot initiative.
    The Open Science Training Initiative provides a series of lectures in open science, data management, licensing and reproducibility, for use with graduate students and postdoctoral researchers. The lectures can be used individually as one-off information lectures in aspects of open science or can be integrated into existing course provision at your institution as a lecture series, forming part of a hands-on exercise in producing a coherent research story.
    The raw materials have already been released online in the GitHub repository.

     

  • OSTI Lecture 1: Reproducibility and Open Science

    This video is an original recording of the opening lecture to the OSTI pilot initiative, hosted by the Doctoral Training Centers in Systems Biology, Life Sciences and the Industrial Doctorate at the University of Oxford. Part 1 of the Graduate Training in Open Science series entitled "Reproducibility and Open Science", the seminar identifies some of the current issues facing scientific research and introduces the theme of open science as a possible solution. At the end of the lecture, the speaker provides an explanation of the Rotation Based Learning (RBL) implementation, an educational pattern that can be applied to any existing taught course in the sciences. RBL is designed to increase participants' awareness of the future uses of their research outputs and aims to foster increased reproducibility in scientific research.

    The Open Science Training Initiative provides a series of lectures in open science, data management, licensing and reproducibility, for use with graduate students and postdoctoral researchers. The lectures can be used individually as one-off information lectures in aspects of open science or can be integrated into existing course provision at your institution as a lecture series, forming part of a hands-on exercise in producing a coherent research story.
    The raw materials have already been released online in the GitHub repository.

  • OSTI Lecture 2: Managing Your Code in GitHub

    This video is an original recording from the OSTI pilot initiative. Part 2 of the Graduate Training in Open Science series entitled "Version Control Using GitHub", the seminar aims to introduce the students to version control of code and writing, outlining the motivation and advantages of this approach. Following on from student feedback and the analysis provided in our Post-Pilot Report, this material will be offered as a mini-workshop, rather than a traditional lecture, in the official release of the course.
    The Open Science Training Initiative provides a series of lectures in open science, data management, licensing and reproducibility, for use with graduate students and postdoctoral researchers. The lectures can be used individually as one-off information lectures in aspects of open science or can be integrated into existing course provision at your institution as a lecture series, forming part of a hands-on exercise in producing a coherent research story.
    The raw materials have already been released online in the GitHub repository.
     

  • OSTI Lecture 3: Data, Code and Content Licensing

    The following video is an original recording from the OSTI pilot initiative. Part 3 of the Graduate Training in Open Science series entitled "Data, Code and Content Licensing", the seminar introduces the theme of licensing, outlines the advantages of this approach, and takes students through the main steps of implementation.

    The Open Science Training Initiative provides a series of lectures in open science, data management, licensing and reproducibility, for use with graduate students and postdoctoral researchers. The lectures can be used individually as one-off information lectures in aspects of open science or can be integrated into existing course provision at your institution as a lecture series, forming part of a hands-on exercise in producing a coherent research story.
    The raw materials have already been released online in the GitHub repository.

  • OSTI Lecture 4: Data Management Planning

    What is a Data Management Plan? Why do we need them and how do they relate to our day-to-day research work? This short lecture, Part 4 of the Graduate Training in Open Science series, introduces the concept of the DMP, set within the context of the scope and scale of data produced in modern scientific research, and breaks the how-to process down into short-, medium- and long-term project management stages. Please note that, as a result of student feedback and the analysis undertaken for the Post-Pilot Report, this lecture will be offered as a mini-workshop in the release of the official material.

    The Open Science Training Initiative provides a series of lectures in open science, data management, licensing and reproducibility, for use with graduate students and postdoctoral researchers. The lectures can be used individually as one-off information lectures in aspects of open science or can be integrated into existing course provision at your institution as a lecture series, forming part of a hands-on exercise in producing a coherent research story.
    The raw materials have already been released online in the GitHub repository.

  • Introduction To Databases And WoSIS

    The module provides an introduction to databases and general soil database design, outlines problems of data standardization and harmonization, and concludes with practical sessions on these issues.
    Overview:      
    -Introduction to relational and spatial databases
    -Introduction to soil data modeling in databases
    -World Soil Information Service (WoSIS) database structure
    -Practical sessions on data access, query, and manipulation
    Objective:       
    -Enable users to understand database usage, design, and access
    -Enable users to understand how soil data is modeled in databases
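
    To illustrate the kind of relational soil-data modeling this module introduces, here is a minimal, hypothetical Python/SQLite sketch (not the actual WoSIS schema): one table of soil profiles and one of depth layers linked by a foreign key, followed by a simple access query.

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE profile (
          profile_id INTEGER PRIMARY KEY,
          latitude   REAL NOT NULL,
          longitude  REAL NOT NULL,
          country    TEXT
      );
      CREATE TABLE layer (
          layer_id    INTEGER PRIMARY KEY,
          profile_id  INTEGER NOT NULL REFERENCES profile(profile_id),
          top_cm      REAL NOT NULL,   -- upper depth of the layer
          bottom_cm   REAL NOT NULL,   -- lower depth of the layer
          ph_h2o      REAL,            -- pH measured in water
          organic_c   REAL             -- organic carbon (g/kg)
      );
      """)
      con.execute("INSERT INTO profile VALUES (1, -1.95, 30.06, 'Rwanda')")
      con.executemany("INSERT INTO layer VALUES (?, ?, ?, ?, ?, ?)",
                      [(1, 1, 0, 20, 5.6, 18.4), (2, 1, 20, 50, 5.9, 9.1)])

      # Typical access query: mean pH per profile.
      for row in con.execute("""
          SELECT p.profile_id, p.country, ROUND(AVG(l.ph_h2o), 2)
          FROM profile p JOIN layer l USING (profile_id)
          GROUP BY p.profile_id"""):
          print(row)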

  • Data: Changing Paradigms for Data Management, Publication and Sharing #1

    In this webinar, part of the Research Data Information Integration series, Professor William Michener provides a historical overview of data management and data sharing, focusing on lessons learned from past and emerging large ecological and environmental research programs; reviews some of the current impediments to data management, publication, and sharing; discusses solutions to these challenges, including various tools that support management of data throughout the data lifecycle from planning through analysis; explores new approaches to publishing and sharing data, such as the Dryad digital repository and DataONE; and glimpses a future where informatics can better enable science, highlighting some of the activities underway to change the scientific culture (e.g., altmetrics, semantic annotation, and provenance tracking).

     

  • Web Tutorials - Using the GLOBE Website

    The Global Learning and Observations to Benefit the Environment (GLOBE) Program is an international science and education program that provides students and the public worldwide with the opportunity to participate in data collection and the scientific process and contribute meaningfully to our understanding of the Earth system and global environment.
    The tutorials on this page are here to help you understand and work with the various parts of the GLOBE website. Each area is listed on the left; click on an item to see a video and/or to download step-by-step guides.
    Tutorials cover the main areas of the site, including:
    -GLOBE Data User Guide, to help scientists and researchers understand, access, and use available GLOBE data.
    -GLOBE Data Fundamentals, a webinar that allows one to understand how to access GLOBE data and includes a demonstration of the various tools.
    -The main GLOBE website, which allows you to engage in The GLOBE Program, as well as to collaborate and access training and educational resources.
    -The Data Entry System, which allows you to enter scientific data that school organizations have collected.
    -The Visualization and Data Retrieval System, which provides tools for interacting with and retrieving scientific data that has been entered by trained community members that belong to school organizations.

  • Excel for Chemical Engineers

    This great Excel series is a perennial favorite. Learn about custom functions using VBA, pivot tables, macros, indirect references, and more.
    The content includes:
    -Excelling with Excel #1 – Custom Functions Using VBA
    -Excelling with Excel #2 – Pivot Tables
    -Excelling with Excel #3 – Macros
    -Excelling with Excel #4 – Indirect References
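
    The series itself works in Excel and VBA; as an analogous illustration only, the short Python sketch below builds the same kind of pivot-table summary with pandas, using made-up reactor-run data.

      import pandas as pd

      runs = pd.DataFrame({
          "unit":      ["R-101", "R-101", "R-102", "R-102", "R-101"],
          "catalyst":  ["A", "B", "A", "B", "A"],
          "yield_pct": [82.1, 78.4, 85.0, 80.2, 83.7],
      })

      # Mean yield by reactor unit and catalyst, analogous to an Excel pivot table.
      pivot = runs.pivot_table(values="yield_pct", index="unit",
                               columns="catalyst", aggfunc="mean")
      print(pivot)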