Dat-in-the-Lab: Announcing UC3 research collaboration
We are excited to announce that the Gordon and Betty Moore Foundation has awarded a research grant to the California Digital Library and Code for Science & Society (CSS) for the Dat-in-the-Lab project to develop practical new techniques for effective data management in the academic research environment.
Dat-in-the-Lab
The project will pilot the use of CSS’s Dat system to streamline data preservation, publication, sharing, and reuse in two UC research laboratories: the Evolution: Ecology, Environment lab at UC Merced, focused on basic ecological and evolutionary research under the direction of Michael Dawson; and the Center for Watershed Sciences at UC Davis, dedicated to the interdisciplinary study of water challenges. UC researchers are increasingly faced with demands for proactive and sustainable management of their research data with respect to funder mandates, publication requirements, institutional policies, and evolving norms of scholarly best practice. With the support of the UC Davis and UC Merced Libraries, the project team will conduct a series of site visits to the two UC labs in order to create, deploy, evaluate, and refactor Dat-based data management solutions built for real-world data collection and management contexts, along with outreach and training materials that can be repurposed for wider UC or non-UC use.
What is Dat?
The Dat system enables effective research data management (RDM) through continuous data versioning, efficient distribution and synchronization, and verified replication. Dat lets researchers continue to work with the familiar paradigm of file folders and directories yet still have access to rich, robust, and cryptographically-secure peer-to-peer networking functions. You can think of Dat as doing for data what Git has done for distributed source code control. Details of how the system works are explained in the Dat whitepaper.
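To make the Git analogy a little more concrete, here is a toy Python sketch of the core idea behind content-addressed, verifiable versioning: each version of a data folder is recorded as a log of file hashes, so any replica can be checked against the published log. This is a conceptual illustration only, with hypothetical file and directory names; it does not use the Dat protocol or its APIs, which are described in the whitepaper.

```python
import hashlib
import json
from pathlib import Path

def file_digest(path: Path) -> str:
    """Content hash of a file, the building block of verifiable versioning."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def snapshot(data_dir: Path, log_path: Path) -> dict:
    """Record a new version as an append-only log entry mapping filenames to hashes."""
    entry = {p.name: file_digest(p) for p in sorted(data_dir.glob("*")) if p.is_file()}
    log = json.loads(log_path.read_text()) if log_path.exists() else []
    log.append(entry)
    log_path.write_text(json.dumps(log, indent=2))
    return entry

def verify_replica(replica_dir: Path, entry: dict) -> bool:
    """A replica is trustworthy only if every file matches the published hashes."""
    return all(
        (replica_dir / name).exists() and file_digest(replica_dir / name) == digest
        for name, digest in entry.items()
    )

if __name__ == "__main__":
    # "lab-data" and "lab-data-copy" are hypothetical directories for this example.
    latest = snapshot(Path("lab-data"), Path("versions.json"))
    print("replica verified:", verify_replica(Path("lab-data-copy"), latest))
```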
Project partners
Dat-in-the-Lab is the latest expression of CDL’s longstanding interest in supporting RDM at the University of California, and is complementary to other initiatives such as the DMPTool for data management planning, the Dash data publication service, and active collaboration with local campus-based RDM efforts. CSS is a non-profit organization committed to improving access to research data for the public good, and works at the intersection of technology with science, journalism, and government to promote openness, transparency, and collaboration. Dat-in-the-Lab activities will be coordinated by Max Ogden, CSS founder and director; Danielle Robinson, CSS scientific and partnerships director; and Stephen Abrams, associate director of the CDL’s UC Curation Center (UC3).
Learn more
Stay tuned for monthly updates on the project. You can bookmark Dat-in-the-Lab on GitHub for access to code, curricula, and other project outputs. Also follow along as the project evolves on our roadmap, chat with the project team, and keep up to date through the project Twitter feed. For more information about UC3, contact us at uc3@ucop.edu and follow us on Twitter.
NSF EAGER Grant for Actionable DMPs
We’re delighted to announce that the California Digital Library has been awarded a 2-year NSF EAGER grant to support active, machine-actionable data management plans (DMPs). The vision is to convert DMPs from a compliance exercise based on static text documents into a key component of a networked research data management ecosystem that not only facilitates, but improves the research process for all stakeholders.
Machine-actionable “refers to information that is structured in a consistent way so that machines, or computers, can be programmed against the structure” (DDI definition). Through prototyping and pilot projects we will experiment with making DMPs machine-actionable.
Imagine if the information contained in a DMP could flow across other systems automatically (e.g., to populate faculty profiles, monitor grants, notify repositories of data in the pipeline) and reduce administrative burdens. What if DMPs were part of active research workflows, and served to connect researchers with tailored guidance and resources at appropriate points over the course of a project? The grant will enable us to extend ongoing work with researchers, institutions, data repositories, funders, and international organizations (e.g., Research Data Alliance, Force11) to define a vision of machine-actionable DMPs and explore this enhanced DMP future. Working with a broad coalition of stakeholders, we will implement, test, and refine machine-actionable DMP use cases. The work plan also involves outreach to domain-specific research communities (environmental science, biomedical science) and pilot projects with various partners (full proposal text).
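As a rough illustration of what being able to program against the structure might look like, the sketch below parses a small, hypothetical machine-actionable DMP and derives repository notifications from it. The field names and values are invented for the example; they are not an official schema or actual DMPTool output.

```python
import json

# A hypothetical machine-actionable DMP fragment; the field names are illustrative only.
dmp_json = """
{
  "project": "Example watershed monitoring study",
  "funder": "NSF",
  "datasets": [
    {"title": "Sensor time series", "expected_size_gb": 40,
     "repository": "Dash", "release_date": "2018-12-01"},
    {"title": "Field observations", "expected_size_gb": 1,
     "repository": "Dash", "release_date": "2019-06-01"}
  ]
}
"""

def notify_repositories(dmp):
    """Because the plan is structured, other systems can act on it, for example by
    telling each repository what data to expect and when."""
    return [
        f"{d['repository']}: expect '{d['title']}' (~{d['expected_size_gb']} GB) by {d['release_date']}"
        for d in dmp["datasets"]
    ]

for message in notify_repositories(json.loads(dmp_json)):
    print(message)
```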
Active DMP community
Building on our existing partnership with the Digital Curation Centre, we look forward to incorporating new collaborators and aligning our work with wider community efforts to create a future world of machine-actionable DMPs. We’re aware that many of you are already experimenting in this arena and are energized to connect the dots, share experiences, and help carry things forward. These next-generation DMPs are a key component in the globally networked research data management ecosystem. We also plan to provide a neutral forum (not tied to any particular tool or project or working group) to ground conversations and community efforts.
Follow the conversation at @ActiveDMPs, #ActiveDMPs, and activedmps.org (forthcoming). You can also join the active, machine-actionable DMP community (live or remote participation) at the RDA plenary in Montreal and the Force11 meeting in Berlin to contribute to next steps.
Contact us to get involved!
cross-posted from https://blog.dmptool.org/2017/09/18/nsf-eager-grant-for-making-dmps-actionable/
Co-Author ORCiDs in Dash
Recently, the Dash team enabled ORCiD login. While this capability is important for primary authors, the Dash team feels strongly that all contributors to data publications should get credit for their work.
All co-authors of a published dataset now have the ability to authenticate and attach their ORCiD in Dash.
How this works:
- Data are published by a corresponding author, who can authenticate their own ORCiD iD but cannot enter ORCiD iDs on behalf of co-authors. For this reason, Dash provides a space to enter co-author email addresses.
- If co-author email addresses are entered, each co-author receives an email notification when the data are published. The notification includes a note about ORCiD iDs and a URL that directs back to Dash.
- Co-authors who click this URL are directed to a pop-up box over the dataset landing page, which sends them to ORCiD for login and authentication (a sketch of this handoff follows the list).
- After the ORCiD iD is entered and authenticated, the author is returned to the Dash landing page for their dataset, where their ORCiD iD appears next to their name.
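For the technically curious, the pop-up step is essentially a standard ORCiD OAuth sign-in handoff. The sketch below shows how such a login URL is typically constructed; the client ID, redirect URI, and use of the dataset identifier as OAuth state are placeholders for illustration, not the actual Dash implementation.

```python
from urllib.parse import urlencode

# Placeholder values: a real integration uses its own registered ORCiD client ID
# and redirect URI. This is a generic ORCiD OAuth handoff, not Dash's code.
ORCID_AUTHORIZE_URL = "https://orcid.org/oauth/authorize"
CLIENT_ID = "APP-XXXXXXXXXXXXXXXX"                     # placeholder client ID
REDIRECT_URI = "https://example.org/orcid/callback"    # placeholder callback

def orcid_login_url(dataset_doi: str) -> str:
    """Build the URL a co-author is sent to for ORCiD login and authentication."""
    params = {
        "client_id": CLIENT_ID,
        "response_type": "code",
        "scope": "/authenticate",
        "redirect_uri": REDIRECT_URI,
        "state": dataset_doi,  # lets the callback return to the right dataset landing page
    }
    return f"{ORCID_AUTHORIZE_URL}?{urlencode(params)}"

print(orcid_login_url("doi:10.5072/example"))
```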
Managing the new NIH requirements for clinical trials
As part of an effort to enhance transparency in biomedical research, the National Institutes of Health (NIH) has, over the last few years, announced a series of policy changes related to clinical trials. Though there is still a great deal of uncertainty about which studies do and do not qualify, these changes may have significant consequences for researchers who do not necessarily consider their work to be clinical or part of a trial.
Last September, the NIH announced a series of requirements for studies that meet the agency’s revised and expanded definition of a clinical trial. Soon after, it was revealed that many of these requirements may apply to large swaths of NIH-funded behavioral, social science, and neuroscience research that, historically, has not been considered clinical in nature. This was affirmed several weeks ago when the agency released a list of case studies that included, as an example of a clinical trial, a brain imaging study in which healthy participants completed a memory task.

What exactly constitutes a clinical trial now?
Because many investigators doing behavioral, social science, and neuroscience research consider their work to be basic research and not a part of a clinical trial, it is worth taking a step back to consider how NIH now defines the term.
According to the NIH, clinical trials are “studies involving human participants assigned to an intervention in which the study is designed to evaluate the effect(s) of the intervention on the participant and the effect being evaluated is a health-related biomedical or behavioral outcome.” In an NIH context, intervention refers to “a manipulation of the subject or subject’s environment for the purpose of modifying one or more health-related biomedical or behavioral processes and/or endpoints.” Because the agency considers all of the studies it funds that investigate biomedical or behavioral outcomes to be health-related, this definition includes mechanistic or exploratory work that does not have direct clinical implications.
Basically, if you are working on an NIH-funded study that involves biomedical or behavioral variables, you should be paying attention to the new requirements about clinical trials.
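Read as a checklist, the NIH definition quoted above boils down to four yes/no questions, and a study meets the definition only if the answer to all four is yes. The sketch below is an informal distillation meant as a reading aid, not an official NIH determination tool.

```python
def looks_like_nih_clinical_trial(
    involves_human_participants: bool,
    participants_assigned_to_intervention: bool,
    designed_to_evaluate_effect_of_intervention: bool,
    outcome_is_health_related_biomedical_or_behavioral: bool,
) -> bool:
    """Informal checklist distilled from the NIH definition quoted above.
    A study meets the definition only if all four answers are yes.
    This is a reading aid, not an official NIH determination tool."""
    return all([
        involves_human_participants,
        participants_assigned_to_intervention,
        designed_to_evaluate_effect_of_intervention,
        outcome_is_health_related_biomedical_or_behavioral,
    ])

# Example: the brain imaging study mentioned above, in which healthy participants
# complete a memory task, answers yes to all four and so counts as a clinical trial.
print(looks_like_nih_clinical_trial(True, True, True, True))  # True
```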
What do I need to do now that my study is considered a clinical trial?
If you think your work may be reclassified as a clinical trial, it’s probably worth getting a head start on meeting the new requirements. Here is some practical advice about getting started.

Applying for Funding
NIH has specified new requirements about how research involving clinical trials can be funded. For example, NIH will soon require that any application involving a clinical trial be submitted in response to a funding opportunity announcement (FOA) or request for proposal (RFP) that explicitly states that it will accept clinical trials. This means that if you are a researcher whose work involves biomedical or behavioral measures, you may have to apply to funding mechanisms that your peers have argued are not necessarily optimal or appropriate. Get in touch with your program officer and watch this space.
Grant applications will also feature a new form that consolidates the human subjects and clinical trial information previously collected across multiple forms into one structured form. For a walkthrough of the new form, check out this video.
Human Subjects Training
Investigators involved in a clinical trial must complete Good Clinical Practice (GCP) training. GCP training addresses elements related to the design, conduct, and reporting of clinical trials and can be completed via a class or course, academic training program, or certification from a recognized clinical research professional organization.
In practice, if you have already completed human subjects training (e.g. via CITI) and believe your research may soon be classified as a clinical trial, you may want to be proactive about completing the few additional GCP modules.
Getting IRB Approval
Good news if you work on a multi-site study: NIH now expects that you will use a single Institutional Review Board (sIRB) for ethical review. This should help streamline the review process, since it will no longer be necessary to submit an application to each site’s individual IRB. This requirement also applies to studies that are not clinical trials.
Registration and Reporting
NIH-funded projects involving clinical trials must be registered on ClinicalTrials.gov. In practice, this means that the principal investigator or grant awardee is responsible for registering the trial no later than 21 days after the enrollment of the first participant and is required to submit results information no later than a year after the study’s completion date. Registration involves supplying a significant amount of information about a study’s planned design and participants, while results reporting involves supplying information about the participants recruited, the data collected, and the statistical tests applied. For more information about ClinicalTrials.gov, check out this paper.
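To keep the two clocks straight, here is a small sketch (with made-up dates) of how the registration and results-reporting deadlines described above are computed.

```python
from datetime import date, timedelta

def registration_deadline(first_enrollment: date) -> date:
    """Registration is due no later than 21 days after enrollment of the first participant."""
    return first_enrollment + timedelta(days=21)

def results_deadline(completion_date: date) -> date:
    """Results information is due no later than one year after the study's completion date.
    (Simplified: ignores the February 29 edge case.)"""
    return completion_date.replace(year=completion_date.year + 1)

# Hypothetical study dates, purely for illustration.
print(registration_deadline(date(2018, 3, 1)))  # 2018-03-22
print(results_deadline(date(2019, 9, 30)))      # 2020-09-30
```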
If you believe your research may soon be reclassified as a clinical trial, now is probably a good time to take a hard look at how you and your lab handle research data management. The best way to relieve the administrative burden of these new requirements is to plan ahead and ensure that your materials are well organized, your data are securely saved, and your decisions are well documented. The more you think through how you’re going to manage your data and analyses now, the less you’ll have to scramble to get everything together when the report is due. If you haven’t already, now would be a good time to get in touch with the data management, scholarly communications, and research IT professionals at your institution.