Dash: 2017 in Review
The goal for Dash in 2017 was to build out features that would make Dash a desirable place to publish data. While we continue to work with the research community to find incentives to publish data generally, the small team of us working on Dash wanted to take a moment to thank everyone who published data this year.
In 2017 we worked in two-week sprints to release 26 features and new instances (not including fixes).

In 2018 we have one major focus: integrate into researcher workflows to make publishing data a more common practice.
To do so we will be working with the community to:
- Release a read-only API for downloading datasets
- Release a submission REST API for publishing and versioning datasets
- Implement "Make Data Count": standardized data usage metrics that give researchers standard counts of views, downloads, and citations for published data
- Integrate with publishers (e.g. submit data to Dash while submitting an article to UC Press)
- Integrate with online lab notebooks (e.g. right-click and submit data after analysis, with accompanying metadata, from Jupyter notebooks)
- Talk to as many labs and researchers as possible to educate them on data publishing and to better understand incentives and needs
Follow along on our GitHub and Twitter, and please get in touch if you have ideas or experiences to share for making data publishing a more common practice in the research environment.
Test-driving the Dash read-only API
The Dash data publication service now provides access to dataset metadata and public files through a read-only RESTful API. Documentation is available at https://dash.ucop.edu/api/docs/index.html.

There are a number of ways to access this API, such as through programming-language libraries or with the curl command. This short tutorial gives examples of accessing the API using Postman, an easy GUI for testing and browsing APIs that is available for the major desktop operating systems. If you’d like to follow along, please download Postman from https://www.getpostman.com/.
We are looking for feedback on this first Dash API before we embark on building our submission API. Please get in touch with feedback, or if you would be interested in setting up an API integration with the Dash service.
Create a Dash Collection in Postman
After you’ve installed Postman, open it and create a Dash collection to hold your queries against the API.
1. Open Postman.
2. Click New > Collection.

3. Enter the collection name and click Create.

Set Up Your First Request
1. Click the folder icon for the collection you just set up.

2. Click Add requests under this collection.

3. Fill in a name for your request, select the Dash collection you created earlier, and click Save to Dash to create it.

4. Click the request you just created in the left bar, then click the Headers tab.

5. Enter the following key and value in the header list: Key: Content-Type, Value: application/json. This header ensures that you’ll receive JSON data.

6. Enter the request URL in the box toward the top of the page. Leave the request type on “GET.” Enter https://dash.ucop.edu/api for the URL and click Save.
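The same request that Postman builds in the steps above can be sketched with Python’s standard library. This is only an illustration of the setup, not an official client; it constructs the GET request to the API root with the Content-Type header and prints what will be sent:

```python
import urllib.request

# Build the same GET request Postman sends: the API root
# with a Content-Type header asking for JSON.
req = urllib.request.Request(
    "https://dash.ucop.edu/api",
    headers={"Content-Type": "application/json"},
)

print(req.full_url)                    # https://dash.ucop.edu/api
print(req.get_header("Content-type"))  # application/json

# urllib.request.urlopen(req) would send it and return the JSON
# response body (requires network access to dash.ucop.edu).
```

Calling `urllib.request.urlopen(req)` then plays the role of Postman’s Send button in the next section.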
Try Your Request
1. Test out your request by clicking the Send button.
2. If everything is set up correctly, you’ll see JSON results returned from the API.

Information about the API is being returned in JavaScript Object Notation (JSON) and includes a few features to become familiar with.
– A links section in the JSON exposes Hypertext Application Language (HAL) links that can guide you to other parts of the API, much like links in a web page allow you to browse other parts of a site.
– The self link refers to the current request.
– Other links can allow you to get further information to create other requests in the API.
– The curies section leads to some basic documentation that may be used by some software.
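As a sketch of reading those sections in code, the snippet below parses an illustrative HAL-style payload. The link relations (self, stash:datasets, curies) follow the description above, but the sample document itself is made up for this example; the live API’s exact fields may differ.

```python
import json

# Illustrative HAL-style response: the relation names follow the
# description above, but this payload is a stand-in, not live output.
sample = """
{
  "_links": {
    "self": {"href": "/api"},
    "stash:datasets": {"href": "/api/datasets"},
    "curies": [
      {"name": "stash",
       "href": "https://dash.ucop.edu/api/docs/{rel}.html",
       "templated": true}
    ]
  }
}
"""

links = json.loads(sample)["_links"]
print(links["self"]["href"])            # /api
print(links["stash:datasets"]["href"])  # /api/datasets
```

In other words, a client navigates the API by reading `_links` from each response rather than hard-coding paths.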
Following Links and Viewing Dataset Information
Postman has a nice feature that allows you to follow links in an API to create additional requests.
1. Try it out by clicking the URL path associated with stash:datasets, which shows as /api/datasets.

2. You’ll see a new tab open for your new request toward the top of the screen and then you can submit or save the new request.

3. If you send this request you will see a lot of information about datasets in Dash.

Some things to point out about this request:
– The top-level links section contains paging links because this request returns a list of datasets. Not all datasets are returned at once, but if you needed to see more you could go to the next page.

– The list contains a count of items in the current page and a total for all items.
– When you look at the embedded datasets you’ll see additional links for each individual dataset, which you could also follow.

– The dataset information shown here is the metadata for the most recent successfully submitted version of each dataset.
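The paging behavior described above can be sketched as a small loop that follows the "next" link until the last page. Here `fetch` is a stand-in for an HTTP GET that returns parsed JSON, and the `_embedded`/`stash:datasets` field names are assumptions based on the HAL layout the API uses; the canned pages below exist only to exercise the loop.

```python
# Sketch: collect items across pages by following "next" links.
# `fetch` stands in for an HTTP GET returning parsed JSON.
def all_datasets(fetch, start="/api/datasets"):
    items, path = [], start
    while path:
        page = fetch(path)
        items.extend(page["_embedded"]["stash:datasets"])
        nxt = page["_links"].get("next")  # absent on the last page
        path = nxt["href"] if nxt else None
    return items

# Usage with two canned pages standing in for the live API:
pages = {
    "/api/datasets": {
        "_links": {"next": {"href": "/api/datasets?page=2"}},
        "_embedded": {"stash:datasets": [{"title": "Dataset A"}]},
    },
    "/api/datasets?page=2": {
        "_links": {},
        "_embedded": {"stash:datasets": [{"title": "Dataset B"}]},
    },
}
print(len(all_datasets(pages.get)))  # 2
```

Against the real service, `fetch` would issue the GET request shown earlier and decode the JSON response.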

Hopefully this gives a general idea of how the API can be used; from here you can create additional requests to browse the Dash API.
Hints for Navigating the Dash Data Model
As you browse through the different links in the Dash API, keep the following in mind.
– A dataset may have multiple versions. If it has only been edited or submitted once in the UI, it will still have one version.
– The metadata shown at the dataset level is based on the latest published version and the dataset indicates the version number it is using to derive this metadata.
– Each version has descriptive metadata and files associated with it.
– To download files, look for the stash:download links. There are downloads for a dataset, a version, and an individual file. These are standard HTTP downloads that can be retrieved with a web browser or any other HTTP client.
– If you know the DOI of the dataset you wish to view, use a GET request for /api/datasets/<doi>.
– The DOI will be in a format such as doi:10.5072/FKK2K64GZ22 and must be URL-encoded when included in the URL.
– See for example https://www.w3schools.com/tags/ref_urlencode.asp or https://www.urlencoder.org/ or use the URL encoding methods available in most programming languages.
– For datasets that are currently private for peer review, downloads will not become available until the privacy period has passed.
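The URL-encoding step above can be done with the methods built into most languages; for example, in Python with the sample DOI shown earlier:

```python
from urllib.parse import quote

# URL-encode the DOI before placing it in the request path;
# safe="" ensures the ":" and "/" inside the DOI are encoded too.
doi = "doi:10.5072/FKK2K64GZ22"
path = "/api/datasets/" + quote(doi, safe="")
print(path)  # /api/datasets/doi%3A10.5072%2FFKK2K64GZ22
```

The encoded path can then be used in a GET request just like the other requests in this tutorial.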
Dash Updates: Fall 2017
Throughout the summer the Dash team has focused on features that better integrate with researcher workflows. The goal: make data publishing as easy as possible.
With that, here are the releases now up on the Dash site. Please feel free to use our demo site dashdemo.ucop.edu to test features and practice submitting data.
- Dash enabled co-author ORCIDs: all listed co-authors now have the ability to link their ORCID iD with their data publication.
- Dash notifies administrators (set for each instance: campus data librarians and publishing staff) when data are deposited, so researchers can get assistance enhancing their metadata (to make data more reproducible, transparent, and discoverable).
- Dash has rich text editing. The abstract, methods, and usage notes fields now have HTML text editors that allow for stylistic text editing to properly format information about the data publication.
- Dash allows for individual file download. All versions of the datasets may now be downloaded at the file-level and not just the entire dataset.
- Dash welcomes UC Davis. Researchers at UC Davis may now publish and share their research data at dash.ucdavis.edu.
- Dash welcomes UC Press journal Elementa. Authors submitting to Elementa may now use UC Press Dash for all data supporting journal publications.
So, what is Dash working on now?
In order to integrate with various aspects of researcher workflows, Dash needs an open REST API. The first API being built is a new deposit API. The team is talking with the repository community and gathering use cases to map out how Dash can integrate with journals and online lab notebooks, providing alternate ways of submitting data that are more in line with researcher workflows.