diff --git a/doc/release-notes/5.1.1-release-notes.md b/doc/release-notes/5.1.1-release-notes.md index f5243aebc8f..739bd3da800 100644 --- a/doc/release-notes/5.1.1-release-notes.md +++ b/doc/release-notes/5.1.1-release-notes.md @@ -24,9 +24,12 @@ As mentioned above, we encourage 5.1.1 instead of 5.1 for production use. Larger installations may want to increase the number of open S3 connections allowed (default is 256). For example, to set the value to 4096: -``./asadmin create-jvm-options "-Ddataverse.files..connection-pool-size=4096"` +`./asadmin create-jvm-options "-Ddataverse.files.<id>.connection-pool-size=4096"` +(where `<id>` is the identifier of your S3 file store, likely `"s3"`). The JVM Options section of the [Configuration Guide](http://guides.dataverse.org/en/5.1.1/installation/config/) has more information. -The JVM Options section of the [Configuration Guide](http://guides.dataverse.org/en/5.1.1/installation/config/) has more information. +### New S3 Bucket CORS setting for Direct Upload/Download + +When using S3 storage with direct upload and/or download enabled, one must now expose the ETag header as documented in the [updated cors.json example](https://guides.dataverse.org/en/5.1.1/developers/big-data-support.html?highlight=etag#s3-direct-upload-and-download). ## Complete List of Changes @@ -44,16 +47,17 @@ If this is a new installation, please see our [Installation Guide](http://guides 1. Undeploy the previous version. -/payara/bin/asadmin list-applications -/payara/bin/asadmin undeploy dataverse +`<payara install path>/bin/asadmin list-applications` +`<payara install path>/bin/asadmin undeploy dataverse<-version>` 2. Stop payara and remove the generated directory, start. -- service payara stop -- remove the generated directory: rm -rf payara/payara/domains/domain1/generated -- service payara start +- `service payara stop` +- remove the generated directory: +`rm -rf <payara install path>/glassfish/domains/domain1/generated` +- `service payara start` 3. Deploy this version.
-/payara/bin/asadmin deploy dataverse-5.1.1.war +`<payara install path>/bin/asadmin deploy dataverse-5.1.1.war` 4. Restart payara diff --git a/doc/release-notes/6919-preview-tools.md b/doc/release-notes/6919-preview-tools.md new file mode 100644 index 00000000000..567ce2311c7 --- /dev/null +++ b/doc/release-notes/6919-preview-tools.md @@ -0,0 +1,38 @@ +## Release Highlights + +### File Preview When Guestbooks or Terms Exist + +Previously, file preview was only available when files were publicly downloadable. Now if a guestbook or terms (or both) are configured for the dataset, they will be shown in the Preview tab and once they are agreed to, the file preview will appear (#6919). + +### Preview Only External Tools + +A new external tool type called "preview" has been added that prevents the tool from being displayed under "Explore Options" under the "Access File" button on the file landing page (#6919). + + +## Major Use Cases + +Newly-supported use cases in this release include: + +- Users can now preview files that have a guestbook or terms. (Issue #6919) +- External tool authors can indicate that their tool is "preview only". (Issue #6919) + + +## Notes for Dataverse Installation Administrators + +### Converting Explore External Tools to Preview Only + +When the war file is deployed, a SQL migration script will convert [dataverse-previewers][] to have both "explore" and "preview" types so that they will continue to be displayed in the Preview tab. + +If you would prefer that these tools be preview only, you can delete the tools, adjust the JSON manifests (changing "explore" to "preview"), and re-add them. + +[dataverse-previewers]: https://github.com/GlobalDataverseCommunityConsortium/dataverse-previewers + +## Notes for Tool Developers and Integrators + +### Preview Only External Tools + +A new external tool type called "preview" has been added that prevents the tool from being displayed under "Explore Options" under the "Access File" button on the file landing page (#6919).
This "preview" type replaces "hasPreviewMode", which has been removed. + +### Multiple Types for External Tools + +External tools now support multiple types. In practice, the types "explore" and "preview" are the only combination that makes a difference in the UI as opposed to having only one or the other type (see "preview only" above). Multiple types are specified in the JSON manifest with an array in "types". The older, single "type" is still supported but should be considered deprecated. diff --git a/doc/sphinx-guides/source/_static/admin/dataverse-external-tools.tsv b/doc/sphinx-guides/source/_static/admin/dataverse-external-tools.tsv index 556a17ef0eb..881cf448221 100644 --- a/doc/sphinx-guides/source/_static/admin/dataverse-external-tools.tsv +++ b/doc/sphinx-guides/source/_static/admin/dataverse-external-tools.tsv @@ -1,3 +1,4 @@ +Tool Type Scope Description TwoRavens explore file A system of interlocking statistical tools for data exploration, analysis, and meta-analysis: http://2ra.vn. See the :doc:`/user/data-exploration/tworavens` section of the User Guide for more information on TwoRavens from the user perspective and the :doc:`/installation/r-rapache-tworavens` section of the Installation Guide. Data Explorer explore file A GUI which lists the variables in a tabular data file allowing searching, charting and cross tabulation analysis. See the README.md file at https://github.com/scholarsportal/Dataverse-Data-Explorer for the instructions on adding Data Explorer to your Dataverse; and the :doc:`/installation/prerequisites` section of the Installation Guide for the instructions on how to set up **basic R configuration required** (specifically, Dataverse uses R to generate .prep metadata files that are needed to run Data Explorer). Whole Tale explore dataset A platform for the creation of reproducible research packages that allows users to launch containerized interactive analysis environments based on popular tools such as Jupyter and RStudio.
Using this integration, Dataverse users can launch Jupyter and RStudio environments to analyze published datasets. For more information, see the `Whole Tale User Guide `_. diff --git a/doc/sphinx-guides/source/_static/installation/files/root/external-tools/dynamicDatasetTool.json b/doc/sphinx-guides/source/_static/installation/files/root/external-tools/dynamicDatasetTool.json index 2a9a888dddf..e30c067a86b 100644 --- a/doc/sphinx-guides/source/_static/installation/files/root/external-tools/dynamicDatasetTool.json +++ b/doc/sphinx-guides/source/_static/installation/files/root/external-tools/dynamicDatasetTool.json @@ -2,7 +2,9 @@ "displayName": "Dynamic Dataset Tool", "description": "Dazzles! Dizzying!", "scope": "dataset", - "type": "explore", + "types": [ + "explore" + ], "toolUrl": "https://dynamicdatasettool.com/v2", "toolParameters": { "queryParameters": [ diff --git a/doc/sphinx-guides/source/_static/installation/files/root/external-tools/fabulousFileTool.json b/doc/sphinx-guides/source/_static/installation/files/root/external-tools/fabulousFileTool.json index bc7f2b72f2e..14f71a280b3 100644 --- a/doc/sphinx-guides/source/_static/installation/files/root/external-tools/fabulousFileTool.json +++ b/doc/sphinx-guides/source/_static/installation/files/root/external-tools/fabulousFileTool.json @@ -3,7 +3,10 @@ "description": "Fabulous Fun for Files!", "toolName": "fabulous", "scope": "file", - "type": "explore", + "types": [ + "explore", + "preview" + ], "toolUrl": "https://fabulousfiletool.com", "contentType": "text/tab-separated-values", "toolParameters": { diff --git a/doc/sphinx-guides/source/admin/external-tools.rst b/doc/sphinx-guides/source/admin/external-tools.rst index 405c710d07e..3e1965c48ea 100644 --- a/doc/sphinx-guides/source/admin/external-tools.rst +++ b/doc/sphinx-guides/source/admin/external-tools.rst @@ -1,7 +1,7 @@ External Tools ============== -External tools can provide additional features that are not part of Dataverse itself, such as data 
exploration. +External tools can provide additional features that are not part of Dataverse itself, such as data file previews, visualization, and curation. .. contents:: |toctitle| :local: @@ -12,7 +12,7 @@ Inventory of External Tools --------------------------- .. csv-table:: - :header: "Tool", "Type", "Scope", "Description" + :header-rows: 1 :widths: 20, 10, 5, 65 :delim: tab :file: ../_static/admin/dataverse-external-tools.tsv @@ -31,14 +31,12 @@ To add an external tool to your installation of Dataverse you must first downloa Go to :ref:`inventory-of-external-tools` and download a JSON manifest for one of the tools by following links in the description to installation instructions. -In the curl command below, replace the placeholder "fabulousFileTool.json" placeholder for the actual name of the JSON file you downloaded. +Configure the tool with the curl command below, making sure to replace the ``fabulousFileTool.json`` placeholder with the name of the JSON manifest file you downloaded. .. code-block:: bash curl -X POST -H 'Content-type: application/json' http://localhost:8080/api/admin/externalTools --upload-file fabulousFileTool.json -Note that some tools will provide a preview mode, which provides an embedded, simplified view of the tool on the file pages of your installation. This is controlled by the `hasPreviewMode` parameter. - Listing All External Tools in Dataverse +++++++++++++++++++++++++++++++++++++++ @@ -75,17 +73,26 @@ Testing External Tools Once you have added an external tool to your installation of Dataverse, you will probably want to test it to make sure it is functioning properly. +File Level vs. Dataset Level +++++++++++++++++++++++++++++ + +File level tools are specific to the file type (content type or MIME type). For example, a tool may work with PDFs, which have a content type of "application/pdf". + +In contrast, dataset level tools are always available no matter what file types are within the dataset.
+ File Level Explore Tools ++++++++++++++++++++++++ -File level explore tools are specific to the file type (content type or MIME type) of the file. For example, Data Explorer is tool for exploring tabular data files. +File level explore tools provide a variety of features from data visualization to statistical analysis. -An "Explore" button will appear (on both the dataset page and the file landing page) for files that match the type that the tool has been built for. When there are multiple explore tools for a filetype, the button becomes a dropdown. +For each supported file type, file level explore tools appear in the file listing of the dataset page as well as under the "Access" button on each file page. File Level Preview Tools ++++++++++++++++++++++++ -File level explore tools can be set up to display in preview mode, which is a simplified view of an explore tool designed specifically for embedding in the file page. +File level preview tools allow the user to see a preview of the file contents without having to download it. + +When a file has a preview available, a preview icon will appear next to that file in the file listing on the dataset page. On the file page itself, the preview will appear in a Preview tab either immediately or once a guestbook has been filled in or terms, if any, have been agreed to. File Level Configure Tools ++++++++++++++++++++++++++ @@ -95,12 +102,12 @@ File level configure tools are only available when you log in and have write acc Dataset Level Explore Tools +++++++++++++++++++++++++++ -When a dataset level explore tool is added, an "Explore" button on the dataset page will appear. This button becomes a drop down when there are multiple tools. +Dataset level explore tools allow the user to explore all the files in a dataset. Dataset Level Configure Tools +++++++++++++++++++++++++++++ -Configure tools at the dataset level are not currently supported. No button appears in the GUI if you add this type of tool. 
+Configure tools at the dataset level are not currently supported. Writing Your Own External Tool ------------------------------ diff --git a/doc/sphinx-guides/source/admin/geoconnect-worldmap.rst b/doc/sphinx-guides/source/admin/geoconnect-worldmap.rst index 0af163a2916..fb8250c0650 100644 --- a/doc/sphinx-guides/source/admin/geoconnect-worldmap.rst +++ b/doc/sphinx-guides/source/admin/geoconnect-worldmap.rst @@ -35,7 +35,7 @@ SQL commands to point a Dataverse installation at different Geoconnect servers: Removing Dead Explore Links --------------------------- -After a map has been created in WorldMap (assuming all the setup has been done), an "Explore" button will appear next to the name of the file in Dataverse. The "Explore" button should open the map in WorldMap. In rare occasions, the map has been deleted on the WorldMap side such that the "Explore" button goes nowhere, resulting in a dead link, a 404. +After a map has been created in WorldMap (assuming all the setup has been done), the file will display WorldMap as an explore option in Dataverse. On rare occasions, the map has been deleted on the WorldMap side, resulting in a dead link (a 404 Not Found error). Functionality has been added on the Dataverse side to iterate through all the maps Dataverse knows about (stored in the ``maplayermetadata`` database table) and to check for the existence of each map in WorldMap. The status code returned from WorldMap (200, 404, etc.) is recorded in Dataverse along with a timestamp of when the check was performed. To perform this check, you can execute the following ``curl`` command: @@ -43,7 +43,7 @@ Functionality has been added on the Dataverse side to iterate through all the ma The output above will contain the ``layerLink`` being checked as well as the HTTP response status code (200, 404, etc.) in the ``lastVerifiedStatus`` field. 200 means OK and 404 means not found. 500 might indicate that the map is only temporarily unavailable.
The ``lastVerifiedStatus`` and ``lastVerifiedTime`` will be persisted to the ``maplayermetadata`` database table. -Armed with this information about WorldMap returning a 404 for a map, you may want to delete any record of the map on the Dataverse side so that the "Explore" button goes away (and so that thumbnail files are cleaned up). To accomplish this, use the following ``curl`` command, substituting the id of the file: +Armed with this information about WorldMap returning a 404 for a map, you may want to delete any record of the map on the Dataverse side, removing WorldMap as an explore option (and cleaning up thumbnail files). To accomplish this, use the following ``curl`` command, substituting the id of the file: ``curl -H "X-Dataverse-key: $API_TOKEN" -X DELETE http://localhost:8080/api/files/{file_id}/map`` diff --git a/doc/sphinx-guides/source/admin/mail-groups.rst b/doc/sphinx-guides/source/admin/mail-groups.rst index 862fe088818..3df529e4112 100644 --- a/doc/sphinx-guides/source/admin/mail-groups.rst +++ b/doc/sphinx-guides/source/admin/mail-groups.rst @@ -55,6 +55,32 @@ To load it into your Dataverse installation, either use a ``POST`` or ``PUT`` re ``curl -X POST -H 'Content-type: application/json' http://localhost:8080/api/admin/groups/domain --upload-file domainGroup1.json`` +Matching with Domains or Regular Expressions +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +Adding simple domain names requires exact matches of user email domains and the configured domains of a group. Although you could add multiple domains to a group, those still require exact matches. + +You can also use one or multiple regular expressions instead of simple domains for a group. Plain domains and regular expressions should not be mixed within a single group, although mixing would work. Regular expressions must still match the full domain, but are much more flexible and are designed to support installation-specific use cases for group management.
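As an illustrative sketch (not Dataverse's actual Java implementation, and the helper name `domain_matches` is hypothetical), the full-match semantics described above for mail domain groups can be modeled like this:

```python
import re

def domain_matches(email, domains, regex=False):
    """Check a user's email domain against a group's configured domains,
    mirroring the exact-match vs. regular-expression behavior described
    in the guide. Patterns must match the WHOLE domain."""
    domain = email.rsplit("@", 1)[-1].lower()
    if regex:
        # re.fullmatch: "example\.org" will NOT match "sub.example.org".
        return any(re.fullmatch(pattern, domain) for pattern in domains)
    return domain in domains

# The example group from the guide (JSON "\\." arrives in code as "\."):
group = {"regex": True, "domains": ["example\\.org", "[a-z]+\\.example\\.org"]}
```

With this sketch, `x@example.org` and `x@sub.example.org` are included, while `x@evilexample.org` is rejected because neither pattern covers the full domain.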
+ +Some hints: + +- Because evaluating them can be CPU-intensive, regular expressions should be used sparingly. +- Remember to properly escape "\" in your regular expression. Both Java and JSON are a bit picky about this. E.g. a character class "\d" would have to be escaped as "\\d". Plenty of tutorials on the web explain this in more detail. +- There is no way Dataverse can detect a wrong regular expression for you. Be sure to do extensive testing, as a misconfigured group could result in privilege escalation or an unexpected influx of support contacts. +- Remember to enable regular expression support for a group by adding ``"regex": true``! + +A short example for a group using regular expressions: + +.. code-block:: json + + { + "name": "Users from @example.org", + "alias": "exampleorg-regex", + "description": "Any verified user from x@example.org or x@sub.example.org will be included.", + "regex": true, + "domains": ["example\\.org", "[a-z]+\\.example\\.org"] + } + Updating a Mail Domain Group ---------------------------- diff --git a/doc/sphinx-guides/source/api/apps.rst b/doc/sphinx-guides/source/api/apps.rst index c1d3a0a5395..dca641b1feb 100755 --- a/doc/sphinx-guides/source/api/apps.rst +++ b/doc/sphinx-guides/source/api/apps.rst @@ -65,10 +65,10 @@ OSF allows you to view, download, and upload files to and from a Dataverse datas https://github.com/CenterForOpenScience/osf.io/tree/develop/addons/dataverse -GeoConnect +Geoconnect ~~~~~~~~~~ -GeoConnect allows Dataverse files to be visualized on http://worldmap.harvard.edu with the "Explore" button. Read more about it in the :doc:`/user/data-exploration/worldmap` section of the User Guide. +Geoconnect allows Dataverse files to be visualized and explored on `WorldMap <http://worldmap.harvard.edu>`_. Read more about it in the :doc:`/user/data-exploration/worldmap` section of the User Guide.
https://github.com/IQSS/geoconnect diff --git a/doc/sphinx-guides/source/api/external-tools.rst b/doc/sphinx-guides/source/api/external-tools.rst index f9231f55359..9c261154150 100644 --- a/doc/sphinx-guides/source/api/external-tools.rst +++ b/doc/sphinx-guides/source/api/external-tools.rst @@ -9,11 +9,19 @@ External tools can provide additional features that are not part of Dataverse it Introduction ------------ -You can think of a external tool as **a glorified hyperlink** that opens a browser window in a new tab on some other website. The term "external" is used to indicate that the user has left the Dataverse web interface. For example, perhaps the user is looking at a dataset on https://demo.dataverse.org . They click "Explore" and are brought to https://fabulousfiletool.com?fileId=42&siteUrl=http://demo.dataverse.org +External tools are additional applications the user can access or open from Dataverse to preview, explore, and manipulate data files and datasets. The term "external" is used to indicate that the tool is not part of the main Dataverse application. -The "other website" (fabulousfiletool.com in the example above) is probably part of the same ecosystem of scholarly publishing that Dataverse itself participates in. Sometimes the other website runs entirely in the browser. Sometimes the other website is a full blown server side web application like Dataverse itself. +Once you have created the external tool itself (which is most of the work!), you need to teach Dataverse how to construct URLs that your tool needs to operate. For example, if you've deployed your tool to fabulousfiletool.com your tool might want the ID of a file and the siteUrl of the Dataverse installation like this: https://fabulousfiletool.com?fileId=42&siteUrl=http://demo.dataverse.org -The possibilities for external tools are endless. Let's look at some examples to get your creative juices flowing. 
+In short, you will be creating a manifest in JSON format that describes not only how to construct URLs for your tool, but also what types of files your tool operates on, where it should appear in the Dataverse web interfaces, etc. + +The possibilities for external tools are endless. Let's look at some examples to get your creative juices flowing. Then we'll look at a complete list of parameters you can use when creating the manifest file for your tool. + +If you're still looking for more information on external tools, you can also watch a video introduction called `Background on the External Tool Framework`_ (slides_) from the 2020 Dataverse Community Meeting. + +.. _Background on the External Tool Framework: https://youtu.be/YH4I_kldmGI?t=159 + +.. _slides: https://osf.io/xjdfw/ Examples of External Tools -------------------------- @@ -21,7 +29,7 @@ Examples of External Tools Note: This is the same list that appears in the :doc:`/admin/external-tools` section of the Admin Guide. .. csv-table:: - :header: "Tool", "Type", "Scope", "Description" + :header-rows: 1 :widths: 20, 10, 5, 65 :delim: tab :file: ../_static/admin/dataverse-external-tools.tsv @@ -29,10 +37,10 @@ Note: This is the same list that appears in the :doc:`/admin/external-tools` sec How External Tools Are Presented to Users ----------------------------------------- -An external tool can appear in Dataverse in one of three ways: +An external tool can appear in Dataverse in a variety of ways: -- under an "Explore" or "Configure" button either on a dataset landing page -- under an "Explore" or "Configure" button on a file landing page +- as an explore, preview, or configure option for a file +- as an explore option for a dataset - as an embedded preview on the file landing page See also the :ref:`testing-external-tools` section of the Admin Guide for some perspective on how installations of Dataverse will expect to test your tool before announcing it to their users. 
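The URL-construction idea described above (a manifest's `toolUrl` plus query parameters with `{fileId}`-style placeholders) can be sketched as follows. This is an illustration only, not Dataverse's actual code, and the helper name `build_tool_url` is hypothetical; the parameter names follow the fabulousFileTool example:

```python
from urllib.parse import urlencode

def build_tool_url(tool_url, query_parameters, context):
    """Append each manifest queryParameter to the tool's base URL,
    substituting {fileId}-style placeholders from the given context."""
    pairs = []
    for param in query_parameters:  # e.g. [{"fileid": "{fileId}"}, ...]
        for name, template in param.items():
            pairs.append((name, template.format(**context)))
    return tool_url + "?" + urlencode(pairs)

url = build_tool_url(
    "https://fabulousfiletool.com",
    [{"fileid": "{fileId}"}, {"siteUrl": "{siteUrl}"}],
    {"fileId": 42, "siteUrl": "http://demo.dataverse.org"},
)
```

Note that `urlencode` percent-encodes the `siteUrl` value, which is what a tool should expect to receive and decode.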
@@ -50,7 +58,7 @@ Let's look at two examples of external tool manifests (one at the file level and External Tools for Files ^^^^^^^^^^^^^^^^^^^^^^^^ -:download:`fabulousFileTool.json <../_static/installation/files/root/external-tools/fabulousFileTool.json>` is a file level explore tool that operates on tabular files: +:download:`fabulousFileTool.json <../_static/installation/files/root/external-tools/fabulousFileTool.json>` is a file level tool that is both an "explore" tool and a "preview" tool and operates on tabular files: .. literalinclude:: ../_static/installation/files/root/external-tools/fabulousFileTool.json @@ -70,7 +78,7 @@ Terminology =========================== ========== Term Definition =========================== ========== - external tool manifest A **JSON file** the defines the URL constructed by Dataverse when users click "Explore" or "Configure" buttons. External tool makers are asked to host this JSON file on a website (no app store yet, sorry) and explain how to use install and use the tool. Examples include :download:`fabulousFileTool.json <../_static/installation/files/root/external-tools/fabulousFileTool.json>` and :download:`dynamicDatasetTool.json <../_static/installation/files/root/external-tools/dynamicDatasetTool.json>` as well as the real world examples above such as Data Explorer. + external tool manifest A **JSON file** that defines the URL constructed by Dataverse when users click explore or configure tool options. External tool makers are asked to host this JSON file on a website (no app store yet, sorry) and explain how to install and use the tool. Examples include :download:`fabulousFileTool.json <../_static/installation/files/root/external-tools/fabulousFileTool.json>` and :download:`dynamicDatasetTool.json <../_static/installation/files/root/external-tools/dynamicDatasetTool.json>` as well as the real world examples above such as Data Explorer. displayName The **name** of the tool in the Dataverse web interface.
For example, "Data Explorer". @@ -78,12 +86,10 @@ Terminology scope Whether the external tool appears and operates at the **file** level or the **dataset** level. Note that a file level tool much also specify the type of file it operates on (see "contentType" below). - type Whether the external tool is an **explore** tool or a **configure** tool. Configure tools require an API token because they make changes to data files (files within datasets). Configure tools are currently not supported at the dataset level (no "Configure" button appears in the GUI for datasets). + types Whether the external tool is an **explore** tool, a **preview** tool, a **configure** tool or any combination of these (multiple types are supported for a single tool). Configure tools require an API token because they make changes to data files (files within datasets). Configure tools are currently not supported at the dataset level. The older "type" keyword that allows you to pass a single type as a string is deprecated but still supported. toolUrl The **base URL** of the tool before query parameters are added. - hasPreviewMode A boolean that indicates whether tool has a preview mode which can be embedded in the File Page. Since this view is designed for embedding within Dataverse, the preview mode for a tool will typically be a view without headers or other options that may be included with a tool that is designed to be launched in a new window. Sometimes, a tool will exist solely to preview files in Dataverse and the preview mode will be the same as the regular view. - contentType File level tools operate on a specific **file type** (content type or MIME type such as "application/pdf") and this must be specified. Dataset level tools do not use contentType. toolParameters **Query parameters** are supported and described below. 
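The backward compatibility described above (the newer "types" array versus the deprecated single "type" string) could be normalized as in this sketch. The function name `tool_types` is hypothetical and this is not Dataverse's actual code:

```python
def tool_types(manifest):
    """Return the list of types declared by an external tool manifest,
    accepting both the newer "types" array and the deprecated
    single-string "type" keyword."""
    if "types" in manifest:
        return list(manifest["types"])
    if "type" in manifest:
        return [manifest["type"]]  # deprecated but still supported
    raise ValueError("manifest must declare 'types' or 'type'")

# fabulousFileTool.json declares both explore and preview:
fabulous = {"types": ["explore", "preview"]}
legacy = {"type": "explore"}
```

A consumer written this way keeps working with older manifests while supporting multi-type tools such as the explore-plus-preview combination.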
diff --git a/doc/sphinx-guides/source/developers/geospatial.rst b/doc/sphinx-guides/source/developers/geospatial.rst index 8a19a0b11f2..16527b00aab 100644 --- a/doc/sphinx-guides/source/developers/geospatial.rst +++ b/doc/sphinx-guides/source/developers/geospatial.rst @@ -186,7 +186,7 @@ The ``get_latest_jointarget_information()`` in ``utils.py`` retrieves recent Joi Setting Up WorldMap Test Data ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -For the dataset page, this script gives a query to add test WorldMap map data. After the query is run, the "Explore Map" button should appear for a tabular file or shapefile. In the example SQL queries below, substitute ``$DATASET_ID`` and ``$DATAFILE_ID`` with the appropriate ID's. +For the dataset page, this script gives a query to add test WorldMap map data. After the query is run, the WorldMap explore option should appear for tabular files or shapefiles. In the example SQL queries below, substitute ``$DATASET_ID`` and ``$DATAFILE_ID`` with the appropriate IDs. To add sample map data for a tabular file: diff --git a/doc/sphinx-guides/source/developers/troubleshooting.rst b/doc/sphinx-guides/source/developers/troubleshooting.rst index 63f00ce02f6..2964684f7a9 100755 --- a/doc/sphinx-guides/source/developers/troubleshooting.rst +++ b/doc/sphinx-guides/source/developers/troubleshooting.rst @@ -89,16 +89,18 @@ As another example, here is how to create a Mail Host via command line for Amazo Rebuilding Your Dev Environment ------------------------------- -If you have an old copy of the database and old Solr data and want to start fresh, here are the recommended steps: - -- drop your old database -- clear out your existing Solr index: ``scripts/search/clear`` -- run the installer script above - it will create the db, deploy the app, populate the db with reference data and run all the scripts that create the domain metadata fields. You no longer need to perform these steps separately.
-- confirm you are using the latest Dataverse-specific Solr schema.xml and included XML files (schema_dv_cmb_[copies|fields].xml) -- confirm http://localhost:8080 is up -- If you want to set some dataset-specific facets, go to the root dataverse (or any dataverse; the selections can be inherited) and click "General Information" and make choices under "Select Facets". There is a ticket to automate this: https://github.com/IQSS/dataverse/issues/619 - -You may also find https://github.com/IQSS/dataverse/blob/develop/scripts/deploy/phoenix.dataverse.org/deploy and related scripts interesting because they demonstrate how we have at least partially automated the process of tearing down a Dataverse installation and having it rise again, hence the name "phoenix." See :ref:`fresh-reinstall` section of the Installation Guide. +A script called :download:`dev-rebuild.sh <../../../../scripts/dev/dev-rebuild.sh>` is available that does the following: + +- Drops the database. +- Clears out Solr. +- Deletes all data files uploaded by users (assuming you are using the default directory). +- Deploys the war file located in the ``target`` directory. +- Runs ``setup-all.sh`` in insecure mode so tests will pass. +- Runs post-install SQL statements. +- Publishes the root dataverse. +- Adjusts permissions on the root dataverse so tests will pass. + +To execute the script, make sure you have built a war file already and then ``cd`` to the root of the source tree and run ``scripts/dev/dev-rebuild.sh``. Feedback on this script is welcome!
DataCite -------- diff --git a/doc/sphinx-guides/source/installation/config.rst b/doc/sphinx-guides/source/installation/config.rst index 82149662733..6e7b1a2c8f5 100644 --- a/doc/sphinx-guides/source/installation/config.rst +++ b/doc/sphinx-guides/source/installation/config.rst @@ -26,11 +26,11 @@ The default password for the "dataverseAdmin" superuser account is "admin", as m Blocking API Endpoints ++++++++++++++++++++++ -The :doc:`/api/native-api` contains a useful but potentially dangerous API endpoint called "admin" that allows you to change system settings, make ordinary users into superusers, and more. The "builtin-users" endpoint lets people create a local/builtin user account if they know the key defined in :ref:`BuiltinUsers.KEY`. The endpoint "test" is not used but is where testing code maybe be added (see https://github.com/IQSS/dataverse/issues/4137 ). +The :doc:`/api/native-api` contains a useful but potentially dangerous API endpoint called "admin" that allows you to change system settings, make ordinary users into superusers, and more. The "builtin-users" endpoint lets admins create a local/builtin user account if they know the key defined in :ref:`BuiltinUsers.KEY`. -By default, most APIs can be operated on remotely and a number of endpoints do not require authentication. The endpoints "admin" and "test" are limited to localhost out of the box by the settings :ref:`:BlockedApiEndpoints` and :ref:`:BlockedApiPolicy`. +By default, most APIs can be operated on remotely and a number of endpoints do not require authentication. The endpoints "admin" and "builtin-users" are limited to localhost out of the box by the settings :ref:`:BlockedApiEndpoints` and :ref:`:BlockedApiPolicy`. -It is very important to keep the block in place for the "admin" endpoint (and at least consider blocking "builtin-users"). Please note that documentation for the "admin" endpoint is spread across the :doc:`/api/native-api` section of the API Guide and the :doc:`/admin/index`. 
+It is very important to keep the block in place for the "admin" endpoint, and to leave the "builtin-users" endpoint blocked unless you need to access it remotely. Documentation for the "admin" endpoint is spread across the :doc:`/api/native-api` section of the API Guide and the :doc:`/admin/index`. It's also possible to prevent file uploads via API by adjusting the :ref:`:UploadMethods` database setting. @@ -225,7 +225,7 @@ Both Local and Remote Auth The ``authenticationproviderrow`` database table controls which "authentication providers" are available within Dataverse. Out of the box, a single row with an id of "builtin" will be present. For each user in Dataverse, the ``authenticateduserlookup`` table will have a value under ``authenticationproviderid`` that matches this id. For example, the default "dataverseAdmin" user will have the value "builtin" under ``authenticationproviderid``. Why is this important? Users are tied to a specific authentication provider but conversion mechanisms are available to switch a user from one authentication provider to the other. As explained in the :doc:`/user/account` section of the User Guide, a graphical workflow is provided for end users to convert from the "builtin" authentication provider to a remote provider. Conversion from a remote authentication provider to the builtin provider can be performed by a sysadmin with access to the "admin" API. See the :doc:`/api/native-api` section of the API Guide for how to list users and authentication providers as JSON. -Adding and enabling a second authentication provider (:ref:`native-api-add-auth-provider` and :ref:`api-toggle-auth-provider`) will result in the Log In page showing additional providers for your users to choose from. By default, the Log In page will show the "builtin" provider, but you can adjust this via the :ref:`conf-default-auth-provider` configuration option. 
Further customization can be achieved by setting :ref:`conf-allow-signup` to "false", thus preventing users from creating local accounts via the web interface. Please note that local accounts can also be created via API, and the way to prevent this is to block the ``builtin-users`` endpoint (:ref:`:BlockedApiEndpoints`) or scramble (or remove) the ``BuiltinUsers.KEY`` database setting (:ref:`BuiltinUsers.KEY`).
+Adding and enabling a second authentication provider (:ref:`native-api-add-auth-provider` and :ref:`api-toggle-auth-provider`) will result in the Log In page showing additional providers for your users to choose from. By default, the Log In page will show the "builtin" provider, but you can adjust this via the :ref:`conf-default-auth-provider` configuration option. Further customization can be achieved by setting :ref:`conf-allow-signup` to "false", thus preventing users from creating local accounts via the web interface. Please note that local accounts can also be created through the API by enabling the ``builtin-users`` endpoint (:ref:`:BlockedApiEndpoints`) and setting the ``BuiltinUsers.KEY`` database setting (:ref:`BuiltinUsers.KEY`).

 To configure Shibboleth see the :doc:`shibboleth` section and to configure OAuth see the :doc:`oauth2` section.

@@ -1279,7 +1279,7 @@ Below is an example of setting ``localhost-only``.

 :BlockedApiEndpoints
 ++++++++++++++++++++

-A comma separated list of API endpoints to be blocked. For a production installation, "admin" should be blocked (and perhaps "builtin-users" as well), as mentioned in the section on security above:
+A comma-separated list of API endpoints to be blocked. For a standard production installation, the installer blocks both "admin" and "builtin-users" by default per the security section above:

 ``curl -X PUT -d "admin,builtin-users" http://localhost:8080/api/admin/settings/:BlockedApiEndpoints``

@@ -1718,14 +1718,14 @@ The ``:TwoRavensTabularView`` option is no longer valid.
See :doc:`r-rapache-tworavens`.

 :GeoconnectCreateEditMaps
 +++++++++++++++++++++++++

-Set ``GeoconnectCreateEditMaps`` to true to allow the user to create GeoConnect Maps. This boolean effects whether the user sees the map button on the dataset page and if the ingest will create a shape file.
+Set ``GeoconnectCreateEditMaps`` to true to allow the user to create maps using Geoconnect. This boolean enables the map configure tool option for a data file and the ingest to create a shape file.

 ``curl -X PUT -d true http://localhost:8080/api/admin/settings/:GeoconnectCreateEditMaps``

 :GeoconnectViewMaps
 +++++++++++++++++++

-Set ``GeoconnectViewMaps`` to true to allow a user to view existing maps. This boolean effects whether a user will see the "Explore" button.
+Set ``GeoconnectViewMaps`` to true to allow a user to view existing maps. This boolean enables the map explore tool option for a data file.

 ``curl -X PUT -d true http://localhost:8080/api/admin/settings/:GeoconnectViewMaps``

diff --git a/doc/sphinx-guides/source/installation/prep.rst b/doc/sphinx-guides/source/installation/prep.rst
index a792c5a4e9c..a32ab4ebeb1 100644
--- a/doc/sphinx-guides/source/installation/prep.rst
+++ b/doc/sphinx-guides/source/installation/prep.rst
@@ -52,11 +52,12 @@ Required Components

 When planning your installation you should be aware of the following components of the Dataverse architecture:

 - Linux: RHEL/CentOS is highly recommended since all development and QA happens on this distribution.
-- App server: Payara is the recommended Jakarta EE application server
+- App server: Payara is the recommended Jakarta EE application server.
 - PostgreSQL: a relational database.
 - Solr: a search engine. A Dataverse-specific schema is provided.
 - SMTP server: for sending mail for password resets and other notifications.
 - Persistent identifier service: DOI and Handle support are provided. Production use requires a registered DOI or Handle.net authority.
+- Rserve: runs as a daemon to execute R code.
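The two Geoconnect switches above are applied the same way as any other database setting, through the "admin" API with curl. A minimal sketch, assuming the stock ``localhost:8080`` endpoint and that the commands are run from the server itself (the "admin" endpoint should stay blocked for remote traffic):

```shell
# Turn on map creation and map viewing for Geoconnect
# (run from localhost, since "admin" should be blocked remotely):
curl -X PUT -d true http://localhost:8080/api/admin/settings/:GeoconnectCreateEditMaps
curl -X PUT -d true http://localhost:8080/api/admin/settings/:GeoconnectViewMaps

# List all current database settings to confirm the values took effect:
curl http://localhost:8080/api/admin/settings

# A setting can be removed again with DELETE, reverting to its default:
curl -X DELETE http://localhost:8080/api/admin/settings/:GeoconnectViewMaps
```

The same PUT/DELETE pattern works for the :ref:`:BlockedApiEndpoints` and Geoconnect settings alike, so a configuration script can treat them uniformly.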
 Optional Components
 +++++++++++++++++++

@@ -64,7 +65,6 @@ Optional Components

 There are a number of optional components you may choose to install or configure, including:

 - External Tools: Third party tools for data exploration can be added to Dataverse by following the instructions in the :doc:`/admin/external-tools` section of the Admin Guide.
-- R, rApache, Zelig, and TwoRavens: :doc:`/user/data-exploration/tworavens` describes the feature and :doc:`r-rapache-tworavens` describes how to install these components. :doc:`/admin/external-tools` explains how third-party tools like TwoRavens can be added to Dataverse.
 - Dropbox integration :ref:`dataverse.dropbox.key`: for uploading files from the Dropbox API.
 - Apache: a web server that can "reverse proxy" Jakarta EE applications (like Dataverse) and rewrite HTTP traffic.
 - Shibboleth: an authentication system described in :doc:`shibboleth`. Its use with Dataverse requires Apache.

diff --git a/doc/sphinx-guides/source/installation/r-rapache-tworavens.rst b/doc/sphinx-guides/source/installation/r-rapache-tworavens.rst
index 1ba0bd083ec..3498fdcca85 100644
--- a/doc/sphinx-guides/source/installation/r-rapache-tworavens.rst
+++ b/doc/sphinx-guides/source/installation/r-rapache-tworavens.rst
@@ -342,8 +342,8 @@ Compare the two files. **It is important that the two copies are identical**.

 *(Yes, this is a HACK! We are working on finding a better way to ensure this compatibility between TwoRavens and Dataverse!)*

-e. Enable TwoRavens Button in Dataverse
----------------------------------------
+e. Enable TwoRavens in Dataverse
+--------------------------------

 Now that you have installed TwoRavens, you can make it available to your users by adding it as an "external tool" for your Dataverse installation. (For more on external tools in general, see the :doc:`/admin/external-tools` section of the Admin Guide.)
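The registration step below uploads a ``twoRavens.json`` manifest. As a hedged sketch only, the manifest might look like the following; the field names follow the external-tools section of the Admin Guide, but the tool URL and query-parameter names here are placeholders you would adapt to your installation:

```shell
# Hypothetical twoRavens.json; check the external-tools section of the
# Admin Guide for the exact fields your Dataverse version expects.
cat > twoRavens.json <<'EOF'
{
  "displayName": "TwoRavens",
  "description": "Explore and run statistical analyses on tabular files.",
  "type": "explore",
  "toolUrl": "https://tworavens.example.edu/dataexplore/gui.html",
  "toolParameters": {
    "queryParameters": [
      {"dfId": "{fileId}"},
      {"key": "{apiToken}"}
    ]
  }
}
EOF

# Register the tool via the admin API (run from localhost,
# since the "admin" endpoint should be blocked remotely):
curl -X POST -H 'Content-type: application/json' \
  --upload-file twoRavens.json \
  http://localhost:8080/api/admin/externalTools
```

The ``{fileId}`` and ``{apiToken}`` tokens are substituted by Dataverse at click time, which is how the tool receives the selected file and the user's credentials.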
@@ -353,8 +353,7 @@ Once you have made your edits, make the tool available within Dataverse with the

 ``curl -X POST -H 'Content-type: application/json' --upload-file twoRavens.json http://localhost:8080/api/admin/externalTools``

-Once enabled, an "Explore" dropdown will appear next to ingested tabular data files a "TwoRavens" button; clicking it will redirect
-the user to the instance of TwoRavens, initialized with the data variables from the selected file.
+Once enabled, TwoRavens will display as an explore tool option for tabular data files. Clicking it will redirect the user to the instance of TwoRavens, initialized with the data variables from the selected file.

 f. Perform a quick test of TwoRavens functionality
 --------------------------------------------------

@@ -373,10 +372,10 @@ change the view to Tabular, it likely means that something went very wrong with

 tabular ingest. Consult the app server log for any error messages that may explain the failure.

-If the file is showing as Tabular Data, but the ``Explore`` button isn't present,
+If the file type is tabular data, but TwoRavens is not displayed as an explore tool option,
 double-check that the steps in ``e.``, above, were correctly performed.

-Otherwise, click on the ``Explore`` button. This will open TwoRavens in a new browser window.
+Selecting the TwoRavens explore tool option will open TwoRavens in a new browser window.

 If the application initializes successfully, you should see the "data pebbles" representing the first 3 variables in the file:

diff --git a/doc/sphinx-guides/source/style/patterns.rst b/doc/sphinx-guides/source/style/patterns.rst
index 96648817cd6..92397910c14 100644
--- a/doc/sphinx-guides/source/style/patterns.rst
+++ b/doc/sphinx-guides/source/style/patterns.rst
@@ -15,43 +15,41 @@ When logged in, the account name is a dropdown menu, linking the user to account

 .. raw:: html
-(removed raw HTML example for the header pattern; markup elided)
+(added raw HTML example for the header pattern; markup elided)

 .. code-block:: html

@@ -80,28 +78,26 @@ The breadcrumbs are displayed under the header, and provide a trail of links for users

 .. raw:: html

-(removed raw HTML example for the breadcrumbs pattern; markup elided)

 .. code-block:: html