diff --git a/.github/mkdocs/mkdocs.yaml b/.github/mkdocs/mkdocs.yaml index 7abee96..602a8e2 100644 --- a/.github/mkdocs/mkdocs.yaml +++ b/.github/mkdocs/mkdocs.yaml @@ -1,42 +1,34 @@ +INHERIT: mkdocs-default.yml site_name: SciCat Documentation -docs_dir: ../../docs - nav: - Home: index.md - SciCat User Guide: - - user-manual/index.md + - user-guide/index.md - Login: - login/index.md - Anonymous: login/Anonymous.md - Dashboard: login/Dashboard.md - Datasets: - datasets/index.md - - Register DOIs: datasets/Publishing.md - - Proposals: proposals.md - - Samples: samples.md - - Instruments: instruments.md + - Publishing data: datasets/Publishing.md + - Publishing data Advanced: datasets/PublishingAdvanced.md + - Data Retrieval: datasets/jobs.md + - Proposals: + - proposals/index.md + - Samples: samples/index.md + - Instruments: instruments/index.md - Troubleshooting: - troubleshoot/index.md - SciCat Operator Guide: - - operator-manual/index.md - - sites/DESY/index.md + - operator-guide/index.md - swagger/index.md - backendconfig/index.md + - backendconfig/authorization/index.md - backendconfig/dois.md + - frontendconfig/index.md - About: - about/index.md - -theme: material - -plugins: - - search - - glightbox - - section-index - - -extra_css: - - custom.css diff --git a/.github/workflows/mkdocs.yaml b/.github/workflows/publish-docs.yml similarity index 76% rename from .github/workflows/mkdocs.yaml rename to .github/workflows/publish-docs.yml index aadf1a8..4f9d760 100644 --- a/.github/workflows/mkdocs.yaml +++ b/.github/workflows/publish-docs.yml @@ -6,11 +6,12 @@ on: # yamllint disable-line rule:truthy - main tags: - v* + jobs: build: runs-on: ubuntu-latest steps: - - uses: SciCatProject/scicatlive/.github/actions/mkdocs-pages@tree + - uses: SciCatProject/docs-template/.github/actions/mkdocs-pages@main with: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} push: true diff --git a/docs/backendconfig/authorization/authorization.md 
b/docs/backendconfig/authorization/authorization.md new file mode 100644 index 0000000..4db6881 --- /dev/null +++ b/docs/backendconfig/authorization/authorization.md @@ -0,0 +1,69 @@ +# Authorization +### Permission settings, or who can do what? +  +SciCat backend v4.x relies on [CASL](https://casl.js.org) to manage permissions. +The default vanilla installation of the backend is configured with the permissions described and linked below. +To avoid confusion and clarify the terminology used below, the term _User_ indicates a normal authenticated user with no elevated permissions, while _Admin_ indicates any user who belongs to a group listed in the environment variable ADMIN_GROUPS. +By default, ADMIN_GROUPS is set to the groups admin, ingestor, and archivemanager. +Deleting items in SciCat is a special case: only users belonging to a group listed in DELETE_GROUPS are allowed to delete. The default value is archivemanager. + +___IMPORTANT___ In v3.x, permissions were managed through roles. In v4.x, roles are no longer used and are converted to user groups. + +In the vanilla installation, the default functional accounts are assigned to groups as follows: +- user: admin + group: admin + +- user: ingestor + group: ingestor + +- user: archiveManager + group: archivemanager + +This allows for the flexibility required by many installations in different facilities with different needs. + + +## Group Lists available in Vanilla Configuration +The vanilla installation provides a set of user group lists, each of which grants a specific set of permissions. To assign a set of permissions to a specific group of users, add that group to the corresponding list indicated below.
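The group-list mechanism lends itself to a simple membership test. The sketch below is illustrative only, not SciCat's actual implementation; the group names mirror the defaults described on this page, and the helper functions are hypothetical:

```python
# Illustrative sketch only -- not the SciCat implementation.
# Default group lists as described on this page; a real deployment would
# read these from the ADMIN_GROUPS and DELETE_GROUPS environment variables.
ADMIN_GROUPS = {"admin", "ingestor", "archivemanager"}
DELETE_GROUPS = {"archivemanager"}

def is_admin(user_groups):
    """A user counts as Admin if any of their groups appears in ADMIN_GROUPS."""
    return not ADMIN_GROUPS.isdisjoint(user_groups)

def can_delete(user_groups):
    """Only members of a group listed in DELETE_GROUPS may delete items."""
    return not DELETE_GROUPS.isdisjoint(user_groups)

print(is_admin(["ingestor"]))          # True: ingestor is in ADMIN_GROUPS
print(can_delete(["ingestor"]))        # False: only archivemanager may delete
print(can_delete(["archivemanager"]))  # True
```

The same pattern applies to the other group lists described in the table that follows.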
+ +| Configuration Group List | Description | CASL ability actions | +| ------------------------ | ----------- | ------------------- | +| _authenticated users_ | Authenticated users can view/access all datasets that belong to one of the groups they belong to | DatasetReadOwn | +| | Users can view attachments for datasets belonging to one of their groups | DatasetAttachmentReadOwn | +| | Users are allowed to view origdatablocks for datasets belonging to one of their groups | DatasetOrigdatablockReadOwn | +| | Users are allowed to view datablocks for datasets belonging to one of their groups | DatasetDatablockReadOwn | +| | Users can view the logbook of the datasets that belong to one of their groups | DatasetLogbookReadOwn | +| | | +| CREATE_DATASET_GROUPS | Users of the listed groups can create and modify datasets for any of the groups they belong to. At creation time, the system assigns a pid to the new dataset. If the user assigns one, the system will ignore it. | DatasetCreateOwn , DatasetReadOwn , DatasetUpdateOwn | +| | Users are allowed to perform all operations on attachments for datasets belonging to one of their groups | DatasetAttachmentCreateOwn , DatasetAttachmentReadOwn , DatasetAttachmentUpdateOwn , DatasetAttachmentDeleteOwn | +| | Users are allowed to create and update origdatablocks for datasets belonging to one of their groups | DatasetOrigdatablockCreateOwn , DatasetOrigdatablockReadOwn , DatasetOrigdatablockUpdateOwn | +| | Users are allowed to create and update datablocks for datasets belonging to one of their groups | DatasetDatablockCreateOwn , DatasetDatablockReadOwn , DatasetDatablockUpdateOwn | +| | Users can view the logbook of the datasets that belong to one of their groups | DatasetLogbookReadOwn | +| | | +| CREATE_DATASET_WITH_PID_GROUPS | Users of the listed groups can create and modify datasets for any of the groups they belong to. They are allowed to specify the dataset pid.
If they decide not to specify a pid, the system will assign one. | DatasetCreateOwn , DatasetReadOwn , DatasetUpdateOwn | +| | Users are allowed to perform all operations on attachments for datasets belonging to one of their groups | DatasetAttachmentCreateOwn , DatasetAttachmentReadOwn , DatasetAttachmentUpdateOwn , DatasetAttachmentDeleteOwn | +| | Users are allowed to create and update origdatablocks for datasets belonging to one of their groups | DatasetOrigdatablockCreateOwn , DatasetOrigdatablockReadOwn , DatasetOrigdatablockUpdateOwn | +| | Users are allowed to create and update datablocks for datasets belonging to one of their groups | DatasetDatablockCreateOwn , DatasetDatablockReadOwn , DatasetDatablockUpdateOwn | +| | Users can view the logbook of the datasets that belong to one of their groups | DatasetLogbookReadOwn | +| | | +| CREATE_DATASET_PRIVILEGED_GROUPS | Users of the listed groups can create datasets for any group, but can only modify datasets belonging to one of the groups they belong to. They are allowed to specify pids for new datasets.
These settings are suggested for ingestion functional accounts | DatasetCreateAll , DatasetReadOwn , DatasetUpdateOwn | +| | Users are allowed to perform all operations on attachments for datasets belonging to one of their groups | DatasetAttachmentCreateOwn , DatasetAttachmentReadOwn , DatasetAttachmentUpdateOwn , DatasetAttachmentDeleteOwn | +| | Users are allowed to create origdatablocks for any dataset, but can only update them for datasets belonging to one of their groups | DatasetOrigdatablockCreateAny , DatasetOrigdatablockReadOwn , DatasetOrigdatablockUpdateOwn | +| | Users are allowed to create and update datablocks for datasets belonging to one of their groups | DatasetDatablockCreateOwn , DatasetDatablockReadOwn , DatasetDatablockUpdateOwn | +| | Users can view the logbook of the datasets that belong to one of their groups | DatasetLogbookReadOwn | +| | | +| ADMIN_GROUPS | Users of the listed groups can create and modify datasets belonging to any group. They are allowed to specify the dataset's pid at creation time | DatasetCreateAny , DatasetReadAny , DatasetUpdateAny | +| | Users are allowed to perform all operations on attachments for any dataset | DatasetAttachmentCreateAny , DatasetAttachmentReadAny , DatasetAttachmentUpdateAny , DatasetAttachmentDeleteAny | +| | Users are allowed to perform all operations on origdatablocks for any dataset, except delete | DatasetOrigdatablockCreateAny , DatasetOrigdatablockReadAny , DatasetOrigdatablockUpdateAny | +| | Users are allowed to perform all operations on datablocks for any dataset, except delete | DatasetDatablockCreateAny , DatasetDatablockReadAny , DatasetDatablockUpdateAny | +| | Users can view the logbook for any dataset | DatasetLogbookReadAny | +| | | +| DELETE_GROUPS | Users whose group is listed here are allowed to delete datasets, origdatablocks, or datablocks | DatasetDeleteAny , DatasetOrigdatablockDeleteAny , DatasetDatablockDeleteAny | + +## Subsystems +- [Datasets](./authorization_datasets.md) +-
[OrigDatablocks](./authorization_origdatablocks.md) +- [Jobs](./authorization_jobs.md) +- [Users](./authorization_users.md) + +___N.B.___: we know that many subsystems are still missing. We are working on reviewing the authorization model for each of them and producing the related documentation. We welcome any contribution. + diff --git a/docs/backendconfig/authorization/authorization_datasets.md b/docs/backendconfig/authorization/authorization_datasets.md new file mode 100644 index 0000000..aae63fe --- /dev/null +++ b/docs/backendconfig/authorization/authorization_datasets.md @@ -0,0 +1,155 @@ +# Datasets Authorization +## CASL ability actions +This is the list of the permission methods available for datasets, covering both their endpoints and the more fine-grained instance authorization. + + +### Endpoint authorization +1. DatasetCreate +2. DatasetRead +- DatasetUpdate +- DatasetDelete +- DatasetAttachmentCreate +- DatasetAttachmentRead +- DatasetAttachmentUpdate +- DatasetAttachmentDelete +- DatasetOrigdatablockCreate +- DatasetOrigdatablockRead +- DatasetOrigdatablockUpdate +- DatasetOrigdatablockDelete +- DatasetDatablockCreate +- DatasetDatablockRead +- DatasetDatablockUpdate +- DatasetDatablockDelete +- DatasetLogbookRead +### Instance authorization +1. DatasetCreateOwnerNoPid +2.
DatasetCreateOwnerWithPid +- DatasetCreateAny +- DatasetReadManyPublic +- DatasetReadManyAccess +- DatasetReadManyOwner +- DatasetReadOnePublic +- DatasetReadOneAccess +- DatasetReadOneOwner +- DatasetReadAny +- DatasetUpdateOwner +- DatasetUpdateAny +- DatasetDeleteOwner +- DatasetDeleteAny +- DatasetAttachmentCreateOwner +- DatasetAttachmentCreateAny +- DatasetAttachmentReadPublic +- DatasetAttachmentReadAccess +- DatasetAttachmentReadOwner +- DatasetAttachmentReadAny +- DatasetAttachmentUpdateOwner +- DatasetAttachmentUpdateAny +- DatasetAttachmentDeleteOwner +- DatasetAttachmentDeleteAny +- DatasetOrigdatablockCreateOwner +- DatasetOrigdatablockCreateAny +- DatasetOrigdatablockReadPublic +- DatasetOrigdatablockReadAccess +- DatasetOrigdatablockReadOwner +- DatasetOrigdatablockReadAny +- DatasetOrigdatablockUpdateOwner +- DatasetOrigdatablockUpdateAny +- DatasetOrigdatablockDeleteAny +- DatasetDatablockCreateOwner +- DatasetDatablockCreateAny +- DatasetDatablockReadPublic +- DatasetDatablockReadAccess +- DatasetDatablockReadOwner +- DatasetDatablockReadAny +- DatasetDatablockUpdateOwner +- DatasetDatablockUpdateAny +- DatasetDatablockDeleteOwner +- DatasetDatablockDeleteAny +- DatasetLogbookReadOwner +- DatasetLogbookReadAny + +### Implementation +How the different levels of authorization translate into data conditions applied by the backend.
+ +- **Public** +  - `isPublished = true` +- **Access** (conditions are applied in logical _or_) +  - `isPublished = true` +  - `ownerGroup` is one of the groups the user belongs to +  - `accessGroups` contains one of the groups the user belongs to +  - `sharedWith` contains the user's email +- **Owner** +  - `ownerGroup` is one of the groups the user belongs to +- **Any** +  - The user can perform the action on any dataset + + +### Priority +``` + DatasetCreate-->DatasetCreateOwnerNoPid; + DatasetCreateOwnerNoPid-->DatasetCreateOwnerWithPid; + DatasetCreateOwnerWithPid-->DatasetCreateAny; +``` +``` + DatasetRead-->DatasetReadManyPublic; + DatasetReadManyPublic-->DatasetReadManyAccess; + DatasetReadManyAccess-->DatasetReadManyOwner; + DatasetReadManyOwner-->DatasetReadAny; + DatasetRead-->DatasetReadOnePublic; + DatasetReadOnePublic-->DatasetReadOneAccess; + DatasetReadOneAccess-->DatasetReadOneOwner; + DatasetReadOneOwner-->DatasetReadAny; +``` +``` + DatasetUpdate-->DatasetUpdateOwner; + DatasetUpdateOwner-->DatasetUpdateAny; + DatasetDelete-->DatasetDeleteOwner; + DatasetDeleteOwner-->DatasetDeleteAny; +``` + +### Authorization table +Note: merely for readability, the table has been split. Hierarchically, `OrigDatablocks` and `Datablocks` belong to `Datasets`. +#### Datasets +| HTTP method | Endpoint | Endpoint Authorization | Anonymous | Authenticated User | Create Dataset Groups | Create Dataset with Pid Groups | Create Dataset Privileged Groups | Admin Groups | Delete Groups | Notes | +| -------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | +| POST | Datasets | _DatasetCreate_ | __no__ | __no__ | Owner, w/o PID<br>
_DatasetCreateOwnerNoPid_ | Owner, w/ PID
_DatasetCreateOwnerWithPid_ | Any
_DatasetCreateAny_ | Any
_DatasetCreateAny_ | __no__ | +| POST | Datasets/isValid | _DatasetCreate_ | __no__ | __no__ | Owner, w/o PID
_DatasetCreateOwnerNoPid_ | Owner, w/ PID<br>
_DatasetCreateOwnerWithPid_ | Any
_DatasetCreateAny_ | Any
_DatasetCreateAny_ | __no__ | +| GET | Datasets | _DatasetRead_ | Public
_DatasetReadManyPublic_ | Has Access<br>
_DatasetReadManyAccess_ | Has Access<br>
_DatasetReadManyAccess_ | Has Access<br>
_DatasetReadManyAccess_ | Has Access<br>
_DatasetReadManyAccess_ | Any<br>
_DatasetReadAny_ | __no__ | +| GET | Datasets/fullquery | _DatasetRead_ | Public<br>
_DatasetReadManyPublic_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Any
_DatasetReadAny_ | __no__ | +| GET | Datasets/fullfacet | _DatasetRead_ | Public
_DatasetReadManyPublic_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Any
_DatasetReadAny_ | __no__ | +| GET | Datasets/metadataKeys | _DatasetRead_ | Public
_DatasetReadManyPublic_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Any
_DatasetReadAny_ | __no__ | +| GET | Datasets/count | _DatasetRead_ | Public
_DatasetReadManyPublic_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Has Access
_DatasetReadManyAccess_ | Any
_DatasetReadAny_ | __no__ | +| GET | Datasets/findOne | _DatasetRead_ | Public
_DatasetReadOnePublic_ | Has Access
_DatasetReadOneAccess_ | Has Access
_DatasetReadOneAccess_ | Has Access
_DatasetReadOneAccess_ | Has Access
_DatasetReadOneAccess_ | Any
_DatasetReadAny_ | __no__ | +| GET | Datasets/_pid_ | _DatasetRead_ | Public
_DatasetReadOnePublic_ | Has Access
_DatasetReadOneAccess_ | Has Access
_DatasetReadOneAccess_ | Has Access
_DatasetReadOneAccess_ | Has Access
_DatasetReadOneAccess_ | Any
_DatasetReadAny_ | __no__ | +| PATCH | Datasets/_pid_ | _DatasetUpdate_ | __no__ | __no__ | Owner
_DatasetUpdateOwner_ | Owner
_DatasetUpdateOwner_ | Owner
_DatasetUpdateOwner_ | Any
_DatasetUpdateAny_ | __no__ | +| PUT | Datasets/_pid_ | _DatasetUpdate_ |__no__ | __no__ | Owner
_DatasetUpdateOwner_ | Owner
_DatasetUpdateOwner_ | Owner
_DatasetUpdateOwner_ | Any
_DatasetUpdateAny_ | __no__ | +| POST | Datasets/_pid_/appendToArrayField | _DatasetUpdate_ |__no__ | __no__ | Owner
_DatasetUpdateOwner_ | Owner
_DatasetUpdateOwner_ | Owner
_DatasetUpdateOwner_ | Any
_DatasetUpdateAny_ | __no__ | +| | | | | | | | | | +| DELETE | Datasets/_pid_ | _DatasetDelete_ | __no__ | __no__ | __no__ | __no__ | __no__ | __no__ | Any
_DatasetDeleteAny_ | +| | | | | | | | | | +| GET | Datasets/_pid_/thumbnail | _DatasetRead_ | Public
_DatasetReadOnePublic_ | Has Access<br>
_DatasetReadOneAccess_ | Has Access<br>
_DatasetReadOneAccess_ | Has Access<br>
_DatasetReadOneAccess_ | Has Access<br>
_DatasetReadOneAccess_ | Any<br>
_DatasetReadAny_ | __no__ | +| | | | | | | | | | +| POST | Datasets/_pid_/attachments | _DatasetAttachmentCreate_ | __no__ | __no__ | Owner
_DatasetAttachmentCreateOwner_ | Owner
_DatasetAttachmentCreateOwner_ | Any
_DatasetAttachmentCreateAny_ | Any
_DatasetAttachmentCreateAny_ | __no__ | +| GET | Datasets/_pid_/attachments | _DatasetAttachmentRead_ | Public<br>
_DatasetAttachmentReadPublic_ | Has Access
_DatasetAttachmentReadAccess_ | Has Access
_DatasetAttachmentReadAccess_ | Has Access
_DatasetAttachmentReadAccess_ | Has Access
_DatasetAttachmentReadAccess_ | Any
_DatasetAttachmentReadAny_ | __no__ | +| PUT | Datasets/_pid_/attachments/_aid_ | _DatasetAttachmentUpdate_ | __no__ | __no__ | Owner<br>
_DatasetAttachmentUpdateOwner_ | Owner
_DatasetAttachmentUpdateOwner_ | Owner
_DatasetAttachmentUpdateOwner_ | Any
_DatasetAttachmentUpdateAny_ | __no__ | +| DELETE | Datasets/_pid_/attachments/_aid_ | _DatasetAttachmentDelete_ | __no__ | __no__ | Owner<br>
_DatasetAttachmentDeleteOwner_ | Owner
_DatasetAttachmentDeleteOwner_ | Owner
_DatasetAttachmentDeleteOwner_ | Any
_DatasetAttachmentDeleteAny_ | __no__ | + +#### OrigDatablock +| HTTP method | Endpoint | Endpoint Authorization | Anonymous | Authenticated User | Create Dataset Groups | Create Dataset with Pid Groups | Create Dataset Privileged Groups | Admin Groups | Delete Groups | Notes | +| -------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | +| POST | Datasets/_pid_/origdatablocks | _DatasetOrigdatablocksCreate_ | __no__ | __no__ | Owner
_DatasetOrigdatablockCreateOwner_ | Owner
_DatasetOrigdatablockCreateOwner_ | Any
_DatasetOrigdatablockCreateAny_ | Any
_DatasetOrigdatablockCreateAny_ | __no__ | +| POST | Datasets/_pid_/origdatablocks/isValid | _DatasetOrigdatablocksCreate_ | __no__ | __no__ | Owner
_DatasetOrigdatablockCreateOwner_ | Owner
_DatasetOrigdatablockCreateOwner_ | Any
_DatasetOrigdatablockCreateAny_ | Any
_DatasetOrigdatablockCreateAny_ | __no__ | +| GET | Datasets/_pid_/origdatablocks | _DatasetOrigdatablocksRead_ | Public
_DatasetOrigdatablockReadPublic_ | Has Access
_DatasetOrigdatablockReadAccess_ | Has Access<br>
_DatasetOrigdatablockReadAccess_ | Has Access
_DatasetOrigdatablockReadAccess_ | Has Access
_DatasetOrigdatablockReadAccess_ | Any
_DatasetOrigdatablockReadAny_ | __no__ | +| PATCH | Datasets/_pid_/origdatablocks/_oid_ | _DatasetOrigdatablocksUpdate_ | __no__ | __no__ | Owner
_DatasetOrigdatablockUpdateOwner_ | Owner
_DatasetOrigdatablockUpdateOwner_ | Owner
_DatasetOrigdatablockUpdateOwner_ | Any
_DatasetOrigdatablockUpdateAny_ | __no__ | | +| DELETE | Datasets/_pid_/origdatablocks/_oid_ | _DatasetOrigdatablocksDelete_ | __no__ | __no__ | __no__ | __no__ | __no__ | __no__ | Any<br>
_DatasetOrigdatablockDeleteAny_ | | + + +#### Datablocks +| HTTP method | Endpoint | Endpoint Authorization | Anonymous | Authenticated User | Create Dataset Groups | Create Dataset with Pid Groups | Create Dataset Privileged Groups | Admin Groups | Delete Groups | Notes | +| -------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | +| POST | Datasets/_pid_/datablocks | _DatasetDatablocksCreate_ | __no__ | __no__ | Owner
_DatasetDatablockCreateOwner_ | Owner
_DatasetDatablockCreateOwner_ | Owner
_DatasetDatablockCreateOwner_ | Any
_DatasetDatablockCreateAny_ | __no__ | | +| GET | Datasets/_pid_/datablocks | _DatasetOrigdatablocksRead_ | Public
_DatasetDatablockReadPublic_ | Has Access
_DatasetDatablockReadAccess_ | Has Access
_DatasetDatablockReadAccess_ | Has Access
_DatasetDatablockReadAccess_ | Has Access
_DatasetDatablockReadAccess_ | Any
_DatasetDatablockReadAny_ | __no__ | | +| PATCH | Datasets/_pid_/datablocks/_oid_ | _DatasetDatablocksUpdate_ | __no__ | __no__ | Owner
_DatasetDatablockUpdateOwner_ | Owner
_DatasetDatablockUpdateOwner_ | Owner
_DatasetDatablockUpdateOwner_ | Any
_DatasetDatablockUpdateAny_ | __no__ | | +| DELETE | Datasets/_pid_/datablocks/_oid_ | _DatasetDatablocksDelete_ | __no__ | __no__ | __no__ | __no__ | __no__ | __no__ | Any<br>
_DatasetDatablockDeleteAny_ | +| | | | | | | | | | +| GET | Datasets/_pid_/logbook | _DatasetLogbookRead_ | __no__ | Owner
_DatasetLogbookReadOwner_ | Owner
_DatasetLogbookReadOwner_ | Owner
_DatasetLogbookReadOwner_ | Owner
_DatasetLogbookReadOwner_ | Any
_DatasetLogbookReadAny_ | __no__ | | \ No newline at end of file diff --git a/docs/backendconfig/authorization/authorization_instruments.md b/docs/backendconfig/authorization/authorization_instruments.md new file mode 100644 index 0000000..c483c10 --- /dev/null +++ b/docs/backendconfig/authorization/authorization_instruments.md @@ -0,0 +1,31 @@ +# Instruments Authorization + +## CASL ability actions + +This is the list of the permission methods available for Instruments and all their endpoints. + +### Endpoint Authorization + +1. InstrumentCreate +2. InstrumentRead +- InstrumentUpdate +- InstrumentDelete + +#### Priority + +``` + InstrumentCreate; + InstrumentRead; + InstrumentUpdate; + InstrumentDelete; +``` + +#### Authorization table + +| HTTP method | Endpoint | Endpoint Authorization | Anonymous | Authenticated User | Admin Groups | Delete Groups | Notes | +| ----------- | ---------------- | ----------------------- | ---------------- | ------------------ | ------------------ | ------------------ | ----- | +| POST | instruments | _InstrumentCreate_ | **no** | **no** | _InstrumentCreate_ | **no** | - | +| GET | instruments | _InstrumentRead_ | _InstrumentRead_ | _InstrumentRead_ | _InstrumentRead_ | **no** | - | +| GET | instruments/_id_ | _InstrumentRead_ | _InstrumentRead_ | _InstrumentRead_ | _InstrumentRead_ | **no** | - | +| PATCH | instruments/_id_ | _InstrumentUpdate_ | **no** | **no** | _InstrumentUpdate_ | **no** | - | +| DELETE | instruments/_id_ | _InstrumentDelete_ | **no** | **no** | **no** | _InstrumentDelete_ | - | diff --git a/docs/backendconfig/authorization/authorization_jobs.md b/docs/backendconfig/authorization/authorization_jobs.md new file mode 100644 index 0000000..57ea1ea --- /dev/null +++ b/docs/backendconfig/authorization/authorization_jobs.md @@ -0,0 +1,97 @@ +# Jobs Authorization + +The Jobs subsystem relies on groups defined in the configuration file for the backend: + + +| Configuration Group List | Description | +|
------------------------ | ----------- | +| ADMIN_GROUPS | Users of the listed groups can create, modify, and read any job. They cannot delete jobs. | +| | | +| CREATE_JOB_PRIVILEGED_GROUPS | Users of the listed groups can create and read any job. They can only modify jobs that belong to their user or group, depending on the configuration of the given job (see the Job Update Authorization Table). They cannot delete jobs. | +| | | +| UPDATE_JOB_PRIVILEGED_GROUPS | Users of the listed groups can modify and read any job. They can only create jobs that belong to their user or group, depending on the configuration of the given job (see the Job Create Authorization Table). They cannot delete jobs. | +| | | +| DELETE_JOB_GROUPS | Users whose group is listed here are allowed to delete any job | + + +## CASL ability actions +This is the list of the permission methods available for Jobs and all their endpoints. +The authorization for jobs differs considerably from all the other endpoints. + +### Endpoint Authorization +1. JobCreate +2. JobRead +- JobUpdate +- JobDelete + +### (Data) Instance Authorization +1. JobCreateConfiguration (The job's create section of the configuration dictates if the user can create the job) +2.
JobCreateOwner (Users with this privilege can create jobs for others) +- JobCreateAny (Users with this privilege can create jobs for any of the users that are defined in the create section of the job configuration) +- JobReadAccess +- JobReadAny +- JobUpdateConfiguration (The job's update section in the configuration dictates if the user can update the job) +- JobUpdateOwner (Users with this privilege can update jobs belonging to others) +- JobUpdateAny (Users with this privilege can update any job) + +#### Priority +```mermaid + graph TD; JobCreate-->JobCreateConfiguration; + JobCreateConfiguration-->JobCreateAny; + JobRead-->JobReadAccess; + JobReadAccess-->JobReadAny; + JobUpdate-->JobUpdateConfiguration; + JobUpdateConfiguration-->JobUpdateAny; + JobDelete; +``` + +#### Authorization table +| HTTP method | Endpoint | Endpoint Authorization | Anonymous | Authenticated | Create Jobs Groups | Update Jobs Groups | Admin Groups | Delete Groups | +| -------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | +| POST | Jobs | _JobCreate_ | _JobCreateConfiguration_ | _JobCreateConfiguration_ | Any<br>
_JobCreateOwner_ | __no__ | Any<br>
_JobCreateAny_ | __no__ | +| GET | Jobs | _JobReadMany_ | __no__ | Has Access<br>
_JobReadAccess_ | Has Access
_JobReadAccess_ | __no__ | Any
_JobReadAny_ | __no__ | +| GET | Jobs/_jid_ | _JobReadOne_ | __no__ | Has Access
_JobReadAccess_ | Has Access
_JobReadAccess_ | __no__ | Any
_JobReadAny_ | __no__ | +| PATCH | Jobs/_jid_ | _JobUpdate_ | __no__ | _JobUpdateConfiguration_ | __no__ | Owner
_JobUpdateOwner_ | Any
_JobUpdateAny_ | __no__ | +| DELETE | Jobs/_jid_ | _JobDelete_ | __no__ | __no__ | __no__ | __no__ | __no__ | __no__ | + +#### Job Create Authorization Table +The _JobCreateConfiguration_ authorization permissions are configured directly in the __*create*__ section of the job configuration. +Any positive match will result in the user acquiring _JobCreate_ endpoint authorization, which applies to the jobs endpoint `POST:Jobs` + +| Job Create Authorization | Endpoint Authentication Translation | Endpoint Authentication Description | Instance Authentication Translation | Instance Authentication Description | +| --- | --- | --- | --- | --- | +| _#all_ | _#all_ | any user can access this endpoint, both anonymous and authenticated | _#all_ | Any user can create this instance of the job | +| _#datasetPublic_ | _#all_ | any user can access this endpoint, both anonymous and authenticated | _#datasetPublic_ | the job instance will be created only if all the datasets listed are __public__ | +| _#authenticated_ | _#user_ | any valid users can access the endpoint, independently from their groups | _#user_ | any valid users can create this instance of the job | +| _#datasetAccess_ | _#all_ | any user can access this endpoint, both anonymous and authenticated | _#datasetAccess_ | the job instance will be created only if the specified user group or otherwise any of the user's groups has access to all the datasets listed | +| _#datasetOwner_ | _#all_ | any user can access this endpoint, both anonymous and authenticated | _#datasetOwner_ | the job instance will be created only if the specified user group or otherwise any of the user's groups is part of all the datasets' owner group | +| __*@GROUP*__ | _#all_ | any user can access this endpoint, both anonymous and authenticated | __*GROUP*__ | the job instance will be created only if the user belongs to the group specified | +| __*USER*__ | _#all_ | any user can access this endpoint, both anonymous and authenticated | __*USER*__ | 
the job instance can be created only by the user indicated | +| #jobAdmin | #all | any user can access this endpoint, both anonymous and authenticated | _#jobAdmin_ | the job instance can be created by users of ADMIN_GROUPS and CREATE_JOB_PRIVILEGED_GROUPS only | +__IMPORTANT__: use option _#all_ carefully, as it allows anybody to create a new job. It is mostly used for debugging and testing. + +#### Job Update Authorization Table +The _JobUpdateConfiguration_ authorization permissions are configured directly in the __*update*__ section of the job configuration. +Any positive match will result in the user acquiring _JobUpdate_ endpoint authorization, which applies to the jobs endpoint `PATCH:Jobs/id`. + +| Job Update Authorization | Endpoint Authentication Translation | Endpoint Authentication Description | Instance Authentication Translation | Instance Authentication Description | +| --- | --- | --- | --- | --- | +| _#all_ | _#all_ | any user can access this endpoint, both anonymous and authenticated | _#all_ | Any user can update this job instance | +| _#jobOwnerUser_ | _#user_ | any authenticated user can access this endpoint | _#jobOwnerUser_ | only the user that is listed in field _ownerUser_ can perform the update | +| _#jobOwnerGroup_ | _#user_ | any authenticated user can access this endpoint | _#jobOwnerGroup_ | any user that belongs to the group listed in field _ownerGroup_ can perform the update | +| __*@GROUP*__ | __*GROUP*__ | only users who belong to the group specified can access this endpoint | __*GROUP*__ | the job can be updated only by users who belong to the group specified | +| __*USER*__ | __*USER*__ | only the user indicated can access this endpoint | __*USER*__ | the job can be updated only by the user indicated | +| #jobAdmin | #all | any user can access this endpoint, both anonymous and authenticated | _#jobAdmin_ | the job instance can be updated by users of ADMIN_GROUPS and
UPDATE_JOB_PRIVILEGED_GROUPS only | + +__IMPORTANT__: use option _#all_ carefully, as it allows anybody to update the job. It is mostly used for debugging and testing. + +#### Job Authorization priority +The endpoint authorization is the most permissive authorization across all the jobs defined. +The priority between job create and update authorization is as follows: + +```mermaid + graph TD; all-->user; + user-->GROUP; + GROUP-->USER; + USER-->ADMIN_GROUPS; +``` diff --git a/docs/backendconfig/authorization/authorization_origdatablocks.md b/docs/backendconfig/authorization/authorization_origdatablocks.md new file mode 100644 index 0000000..0451573 --- /dev/null +++ b/docs/backendconfig/authorization/authorization_origdatablocks.md @@ -0,0 +1,59 @@ +# OrigDatablock Authorization +## CASL ability actions +This is the list of the permission methods available for origdatablocks and all their endpoints. + +### Endpoint Authorization +1. OrigdatablockCreate +2. OrigdatablockRead +- OrigdatablockUpdate +- OrigdatablockDelete + +### (Data) Instance Authorization +1. OrigdatablockCreateOwner +2.
OrigdatablockCreateAny +- OrigdatablockReadManyPublic +- OrigdatablockReadManyAccess +- OrigdatablockReadManyOwner +- OrigdatablockReadOnePublic +- OrigdatablockReadOneAccess +- OrigdatablockReadOneOwner +- OrigdatablockReadAny +- OrigdatablockUpdateOwner +- OrigdatablockUpdateAny +- OrigdatablockDeleteAny + +#### Priority +``` + DatasetOrigdatablockCreate-->DatasetOrigdatablockCreateOwner; + DatasetOrigdatablockCreateOwner-->DatasetOrigdatablockCreateAny; +``` +``` + DatasetOrigdatablockRead-->DatasetOrigdatablockReadManyPublic; + DatasetOrigdatablockReadManyPublic-->DatasetOrigdatablockReadManyAccess; + DatasetOrigdatablockReadManyAccess-->DatasetOrigdatablockReadAny; + DatasetOrigdatablockRead-->DatasetOrigdatablockReadOnePublic; + DatasetOrigdatablockReadOnePublic-->DatasetOrigdatablockReadOneAccess; + DatasetOrigdatablockReadOneAccess-->DatasetOrigdatablockReadAny; +``` +``` + DatasetOrigdatablockUpdate-->DatasetOrigdatablockUpdateOwner; + DatasetOrigdatablockUpdateOwner-->DatasetOrigdatablockUpdateAny; +``` +``` + DatasetOrigdatablockDelete-->DatasetOrigdatablockDeleteOwner; + DatasetOrigdatablockDeleteOwner-->DatasetOrigdatablockDeleteAny; +``` + +#### Authorization table +| HTTP method | Endpoint | Endpoint Authorization | Anonymous | Authenticated User | Create Dataset Groups | Create Dataset with Pid Groups | Create Dataset Privileged Groups | Admin Groups | Delete Groups | Notes | +| -------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | +| POST | origdatablocks | _OrigdatablockCreate_ | __no__ | __no__ | Owner<br>
_OrigdatablockCreateOwner_ | Owner<br>
_OrigdatablockCreateOwner_ | Any<br>
_OrigdatablockCreateAny_ | Any<br>_OrigdatablockCreateAny_ | __no__ | +| POST | origdatablocks/isValid | _OrigdatablockCreate_ | __no__ | __no__ | Owner<br>
_OrigdatablockCreateOwner_ | Owner
_OrigdatablockCreateOwner_ | Any
_OrigdatablockCreateAny_ | Any
_OrigdatablockCreateAny_ | __no__ | +| GET | origdatablocks | _OrigdatablockRead_ | Public
_OrigdatablockReadManyPublic_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Any
_OrigdatablockReadAny_ | __no__ | +| GET | origdatablocks/fullquery | _OrigdatablockRead_ | Public
_OrigdatablockReadManyPublic_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Any
_OrigdatablockReadAny_ | __no__ | +| GET | origdatablocks/fullquery/files | _OrigdatablockRead_ | Public
_OrigdatablockReadManyPublic_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Any
_OrigdatablockReadAny_ | __no__ | +| GET | origdatablocks/fullfacet | _OrigdatablockRead_ | Public
_OrigdatablockReadManyPublic_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Has Access
_OrigdatablockReadManyAccess_ | Any
_OrigdatablockReadAny_ | __no__ | +| GET | origdatablocks/_oid_ | _OrigdatablockRead_ | Public
_OrigdatablockReadOnePublic_ | Has Access
_OrigdatablockReadOneAccess_ | Has Access
_OrigdatablockReadOneAccess_ | Has Access
_OrigdatablockReadOneAccess_ | Has Access
_OrigdatablockReadOneAccess_ | Any
_OrigdatablockReadAny_ | __no__ | +| PATCH | origdatablocks/_oid_ | _OrigdatablockUpdate_ | __no__ | __no__ | Owner
_OrigdatablockUpdateOwner_ | Owner
_OrigdatablockUpdateOwner_ | Owner
_OrigdatablockUpdateOwner_ | Any
_OrigdatablockUpdateAny_ | __no__ | +| DELETE | origdatablocks/_oid_ | _OrigdatablockDelete_ | __no__ | __no__ | __no__ | __no__ | __no__ | __no__ | Any
_OrigdatablockDeleteAny_ | + diff --git a/docs/backendconfig/authorization/authorization_proposals.md b/docs/backendconfig/authorization/authorization_proposals.md new file mode 100644 index 0000000..6be99d0 --- /dev/null +++ b/docs/backendconfig/authorization/authorization_proposals.md @@ -0,0 +1,104 @@ +# Proposals Authorization +## CASL ability actions +This is the list of the permission methods available for Proposals and all of their endpoints. + +### Endpoint Authorization +- ProposalCreate +- ProposalRead +- ProposalUpdate +- ProposalDelete +- ProposalAttachmentCreate +- ProposalAttachmentRead +- ProposalAttachmentUpdate +- ProposalAttachmentDelete +- ProposalDatasetRead + + +### (Data) Instance Authorization +- ProposalCreateOwner +- ProposalCreateAny +- ProposalReadManyPublic +- ProposalReadManyAccess +- ProposalReadManyOwner +- ProposalReadOnePublic +- ProposalReadOneAccess +- ProposalReadOneOwner +- ProposalReadAny +- ProposalUpdateOwner +- ProposalUpdateAny +- ProposalDeleteOwner +- ProposalDeleteAny +- ProposalAttachmentCreateOwner +- ProposalAttachmentCreateAny +- ProposalAttachmentReadManyPublic +- ProposalAttachmentReadManyAccess +- ProposalAttachmentReadManyOwner +- ProposalAttachmentReadManyAny +- ProposalAttachmentUpdateOwner +- ProposalAttachmentUpdateAny +- ProposalAttachmentDeleteOwner +- ProposalAttachmentDeleteAny +- ProposalDatasetReadPublic +- ProposalDatasetReadAccess +- ProposalDatasetReadOwner +- ProposalDatasetReadAny + + +#### Priority +``` + ProposalCreate-->ProposalCreateOwner; + ProposalCreateOwner-->ProposalCreateAny; +``` +``` + ProposalRead-->ProposalReadManyPublic; + ProposalReadManyPublic-->ProposalReadManyAccess; + ProposalReadManyAccess-->ProposalReadManyOwner; + ProposalReadManyOwner-->ProposalReadAny; + ProposalRead-->ProposalReadOnePublic; + ProposalReadOnePublic-->ProposalReadOneAccess; + ProposalReadOneAccess-->ProposalReadOneOwner; + ProposalReadOneOwner-->ProposalReadAny; +``` +``` + 
ProposalUpdate-->ProposalUpdateOwner; + ProposalUpdateOwner-->ProposalUpdateAny; + ProposalDelete-->ProposalDeleteOwner; + ProposalDeleteOwner-->ProposalDeleteAny; +``` +``` + ProposalAttachmentCreate-->ProposalAttachmentCreateOwner; + ProposalAttachmentCreateOwner-->ProposalAttachmentCreateAny; + ProposalAttachmentRead-->ProposalAttachmentReadManyPublic; + ProposalAttachmentReadManyPublic-->ProposalAttachmentReadManyAccess; + ProposalAttachmentReadManyAccess-->ProposalAttachmentReadManyOwner; + ProposalAttachmentReadManyOwner-->ProposalAttachmentReadManyAny; + ProposalAttachmentUpdate-->ProposalAttachmentUpdateOwner; + ProposalAttachmentUpdateOwner-->ProposalAttachmentUpdateAny; + ProposalAttachmentDelete-->ProposalAttachmentDeleteOwner; + ProposalAttachmentDeleteOwner-->ProposalAttachmentDeleteAny; +``` +``` + ProposalDatasetRead-->ProposalDatasetReadPublic; + ProposalDatasetReadPublic-->ProposalDatasetReadAccess; + ProposalDatasetReadAccess-->ProposalDatasetReadOwner; + ProposalDatasetReadOwner-->ProposalDatasetReadAny; +``` + +#### Authorization table +| HTTP method | Endpoint | Endpoint Authorization | Anonymous | Authenticated User | Proposals Groups | Admin Groups | Delete Groups | Notes | +| -------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | +| POST | Proposals | _ProposalCreate_ | __no__ | __no__ | Any
_ProposalCreateAny_ | Any
_ProposalCreateAny_ | __no__ | | +| GET | Proposals | _ProposalRead_ | Public
_ProposalReadManyPublic_ | Has Access
_ProposalReadManyAccess_ | Has Access
_ProposalReadManyAccess_ | Any
_ProposalReadAny_ | __no__ | | +| GET | Proposals/fullquery | _ProposalRead_ | Public
_ProposalReadManyPublic_ | Has Access
_ProposalReadManyAccess_ | Has Access
_ProposalReadManyAccess_ | Any
_ProposalReadAny_ | __no__ | | +| GET | Proposals/fullfacet | _ProposalRead_ | Public
_ProposalReadManyPublic_ | Has Access
_ProposalReadManyAccess_ | Has Access
_ProposalReadManyAccess_ | Any
_ProposalReadAny_ | __no__ | | +| GET | Proposals/_pid_ | _ProposalRead_ | Public
_ProposalReadOnePublic_ | Has Access
_ProposalReadOneAccess_ | Has Access
_ProposalReadOneAccess_ | Any
_ProposalReadAny_ | __no__ | | +| GET | Proposals/fullquery | _ProposalRead_ | Public
_ProposalReadOnePublic_ | Has Access
_ProposalReadOneAccess_ | Has Access
_ProposalReadOneAccess_ | Any
_ProposalReadAny_ | __no__ | | +| PATCH | Proposals/_pid_ | _ProposalUpdate_ | __no__ | __no__ | Any
_ProposalUpdateAny_ | Any
_ProposalUpdateAny_ | __no__ | | +| DELETE | Proposals/_pid_ | _ProposalDelete_ | __no__ | __no__ | __no__ | __no__ | Any
_ProposalDeleteAny_ | | +||||| +| POST | Proposals/_pid_/attachments | _ProposalAttachmentCreate_ | __no__ | __no__ | Any
_ProposalAttachmentCreateAny_ | Any
_ProposalAttachmentCreateAny_ | __no__ | | +| GET | Proposals/_pid_/attachments | _ProposalAttachmentRead_ | Public
_ProposalAttachmentReadManyPublic_ | Has Access
_ProposalAttachmentReadManyAccess_ | Has Access
_ProposalAttachmentReadManyAccess_ | Any
_ProposalAttachmentReadManyAny_ | __no__ | | +| PATCH | Proposals/_pid_/attachments/_aid_ | _ProposalAttachmentUpdate_ | __no__ | __no__ | Owner
_ProposalAttachmentUpdateOwner_ | Any
_ProposalAttachmentUpdateAny_ | __no__ | | +| DELETE | Proposals/_pid_/attachments/_aid_ | _ProposalAttachmentDelete_ | __no__ | __no__ | Owner
_ProposalAttachmentDeleteOwner_ | Any
_ProposalAttachmentDeleteAny_ | __no__ | | +||||| +| GET | Proposals/_pid_/datasets | _ProposalDatasetRead_ | Public
_ProposalDatasetReadPublic_ | Has Access
_ProposalDatasetReadAccess_ | Has Access
_ProposalDatasetReadAccess_ | Any
_ProposalDatasetReadAny_ | __no__ | | diff --git a/docs/backendconfig/authorization/authorization_samples.md b/docs/backendconfig/authorization/authorization_samples.md new file mode 100644 index 0000000..e0be98e --- /dev/null +++ b/docs/backendconfig/authorization/authorization_samples.md @@ -0,0 +1,94 @@ +# Samples Authorization +## CASL ability actions +This is the list of the permission methods available for Samples and all of their endpoints. + +### Endpoint Authorization +- SampleCreate +- SampleRead +- SampleUpdate +- SampleDelete +- SampleAttachmentCreate +- SampleAttachmentRead +- SampleAttachmentUpdate +- SampleAttachmentDelete +- SampleDatasetRead + +### (Data) Instance Authorization +- SampleCreateOwner +- SampleCreateAny +- SampleReadManyPublic +- SampleReadManyAccess +- SampleReadManyOwner +- SampleReadOnePublic +- SampleReadOneAccess +- SampleReadOneOwner +- SampleReadAny +- SampleUpdateOwner +- SampleUpdateAny +- SampleDeleteOwner +- SampleDeleteAny +- SampleAttachmentCreateOwner +- SampleAttachmentCreateAny +- SampleAttachmentReadManyPublic +- SampleAttachmentReadManyAccess +- SampleAttachmentReadManyOwner +- SampleAttachmentReadManyAny +- SampleAttachmentUpdateOwner +- SampleAttachmentUpdateAny +- SampleAttachmentDeleteOwner +- SampleAttachmentDeleteAny +- SampleDatasetReadPublic +- SampleDatasetReadAccess +- SampleDatasetReadOwner +- SampleDatasetReadAny + +#### Priority +```mermaid +graph LR; + SampleCreate-->SampleCreateOwner; + SampleCreateOwner-->SampleCreateAny; + SampleRead-->SampleReadManyPublic; + SampleReadManyPublic-->SampleReadManyAccess; + SampleReadManyAccess-->SampleReadManyOwner; + SampleReadManyOwner-->SampleReadAny; + SampleRead-->SampleReadOnePublic; + SampleReadOnePublic-->SampleReadOneAccess; + SampleReadOneAccess-->SampleReadOneOwner; + SampleReadOneOwner-->SampleReadAny; + SampleUpdate-->SampleUpdateOwner; + SampleUpdateOwner-->SampleUpdateAny; + SampleDelete-->SampleDeleteOwner; + SampleDeleteOwner-->SampleDeleteAny; + 
SampleAttachmentCreate-->SampleAttachmentCreateOwner; + SampleAttachmentCreateOwner-->SampleAttachmentCreateAny; + SampleAttachmentRead-->SampleAttachmentReadManyPublic; + SampleAttachmentReadManyPublic-->SampleAttachmentReadManyAccess; + SampleAttachmentReadManyAccess-->SampleAttachmentReadManyOwner; + SampleAttachmentReadManyOwner-->SampleAttachmentReadManyAny; + SampleAttachmentUpdate-->SampleAttachmentUpdateOwner; + SampleAttachmentUpdateOwner-->SampleAttachmentUpdateAny; + SampleAttachmentDelete-->SampleAttachmentDeleteOwner; + SampleAttachmentDeleteOwner-->SampleAttachmentDeleteAny; + SampleDatasetRead-->SampleDatasetReadPublic; + SampleDatasetReadPublic-->SampleDatasetReadAccess; + SampleDatasetReadAccess-->SampleDatasetReadOwner; + SampleDatasetReadOwner-->SampleDatasetReadAny; +``` + +#### Authorization table +| HTTP method | Endpoint | Endpoint Authorization | Anonymous | Authenticated User | Sample Groups | Sample Privileged Groups | Admin Groups | Delete Groups | Notes | +| -------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | +| POST | Samples | _SampleCreate_ | __no__ | __no__ | Owner
_SampleCreateOwner_ | Any
_SampleCreateAny_ | Any
_SampleCreateAny_ | __no__ | | +| GET | Samples | _SampleRead_ | Public
_SampleReadManyPublic_ | Has Access
_SampleReadManyAccess_ | Has Access
_SampleReadManyAccess_ | Has Access
_SampleReadManyAccess_ | Any
_SampleReadAny_ | __no__ | | +| GET | Samples/fullquery | _SampleRead_ | Public
_SampleReadManyPublic_ | Has Access
_SampleReadManyAccess_ | Has Access
_SampleReadManyAccess_ | Has Access
_SampleReadManyAccess_ | Any
_SampleReadAny_ | __no__ | | +| GET | Samples/fullfacet | _SampleRead_ | Public
_SampleReadManyPublic_ | Has Access
_SampleReadManyAccess_ | Has Access
_SampleReadManyAccess_ | Has Access
_SampleReadManyAccess_ | Any
_SampleReadAny_ | __no__ | | +| GET | Samples/_pid_ | _SampleRead_ | Public
_SampleReadOnePublic_ | Has Access
_SampleReadOneAccess_ | Has Access
_SampleReadOneAccess_ | Has Access
_SampleReadOneAccess_ | Any
_SampleReadAny_ | __no__ | | +| GET | Samples/fullquery | _SampleRead_ | Public
_SampleReadOnePublic_ | Has Access
_SampleReadOneAccess_ | Has Access
_SampleReadOneAccess_ | Has Access
_SampleReadOneAccess_ | Any
_SampleReadAny_ | __no__ | | +| PATCH | Samples/_pid_ | _SampleUpdate_ | __no__ | __no__ | Owner
_SampleUpdateOwner_ | Owner
_SampleUpdateOwner_ | Any
_SampleUpdateAny_ | __no__ | | +| DELETE | Samples/_pid_ | _SampleDelete_ | __no__ | __no__ | __no__ | __no__ | __no__ | Any
_SampleDeleteAny_ | | +||||| +| POST | Samples/_pid_/attachments | _SampleAttachmentCreate_ | __no__ | __no__ | Owner
_SampleAttachmentCreateOwner_ | Any
_SampleAttachmentCreateAny_ | Any
_SampleAttachmentCreateAny_ | __no__ | | +| GET | Samples/_pid_/attachments | _SampleAttachmentRead_ | Public
_SampleAttachmentReadManyPublic_ | Has Access
_SampleAttachmentReadManyAccess_ | Has Access
_SampleAttachmentReadManyAccess_ | Has Access
_SampleAttachmentReadManyAccess_ | Any
_SampleAttachmentReadManyAny_ | __no__ | | +| DELETE | Samples/_pid_/attachments/_aid_ | _SampleAttachmentDelete_ | __no__ | __no__ | Owner
_SampleAttachmentDeleteOwner_ | Owner
_SampleAttachmentDeleteOwner_ | Any
_SampleAttachmentDeleteAny_ | Any
_SampleAttachmentDeleteAny_ | | +||||| +| GET | Samples/_pid_/datasets | _SampleDatasetRead_ | Public
_SampleDatasetReadPublic_ | Has Access
_SampleDatasetReadAccess_ | Has Access
_SampleDatasetReadAccess_ | Has Access
_SampleDatasetReadAccess_ | Any
_SampleDatasetReadAny_ | __no__ | | diff --git a/docs/backendconfig/authorization/authorization_users.md b/docs/backendconfig/authorization/authorization_users.md new file mode 100644 index 0000000..d54b9cd --- /dev/null +++ b/docs/backendconfig/authorization/authorization_users.md @@ -0,0 +1,51 @@ +# Users Authorization +## CASL ability actions +This is the list of the permission methods available for users and all of their endpoints. +### Endpoint Authorization +- UserLogin +- UserRead +- UserCreate +- UserUpdate +- UserPassword +- UserDelete + +### Instance Authorization +- UserReadOwn +- UserReadAny +- UserCreateOwn +- UserCreateAny +- UserUpdateOwn +- UserUpdateAny +- UserPasswordOwn +- UserPasswordAny +- UserDeleteAny + +#### Priority +``` + UserLogin(E) + UserCreate(E)-->UserCreateOwn(I)-->UserCreateAny(I); + UserRead(E)-->UserReadOwn(I)-->UserReadAny(I); + UserUpdate(E)-->UserUpdateOwn(I)-->UserUpdateAny(I); + UserPassword(E)-->UserPasswordOwn(I)-->UserPasswordAny(I); + UserDelete(E)-->UserDeleteOwn(I)-->UserDeleteAny(I); +``` + +#### Authorization table +| HTTP method | Endpoint | Endpoint Authorization | Anonymous | Authenticated User | User Privileged Groups | Admin Groups | User Delete Groups | +| ----------- | -------- | --------- | ------------------ | ---------------------- | ------------ | ------------- | ------------- | +| POST | Users/jwt | _UserRead_ | __no__ | Own
_UserReadOwn_ | __no__ | __no__ | __no__ | +| POST | Users/login | _UserLogin_ | __no__ | __no__ | __no__ | __no__ | __no__ | +| GET | Users/_id_ | _UserRead_ | __no__ | Own
_UserReadOwn_ | Any
_UserReadAny_ | Any
_UserReadAny_ | __no__ | +| GET | Users/_id_/userIdentity | _UserRead_ | __no__ | Own
_UserReadOwn_ | Any
_UserReadAny_ | Any
_UserReadAny_ | __no__ | +| POST | Users/_id_/settings | _UserCreate_ | __no__ | Own
_UserCreateOwn_ | Any
_UserCreateAny_ | Any
_UserCreateAny_ | __no__ | +| GET | Users/_id_/settings | _UserUpdate_ | __no__ | Own
_UserReadOwn_ | Any
_UserReadAny_ | Any
_UserReadAny_ | __no__ | +| PUT | Users/_id_/settings | _UserUpdate_ | __no__ | Own
_UserUpdateOwn_ | Any
_UserUpdateAny_ | Any
_UserUpdateAny_ | __no__ | +| PATCH | Users/_id_/settings | _UserUpdate_ | __no__ | Own
_UserUpdateOwn_ | Any
_UserUpdateAny_ | Any
_UserUpdateAny_ | __no__ | +| PATCH | Users/_id_/password | _UserPassword_ | __no__ | Own
_UserPasswordOwn_ | Any
_UserPasswordAny_ | Any
_UserPasswordAny_ | __no__ | +| DELETE | Users/_id_ | _UserDelete_ | __no__ | __no__ | __no__ | __no__ | Any
_UserDeleteAny_ | +| DELETE | Users/_id_/settings | _UserDelete_ | __no__ | __no__ | __no__ | __no__ | Any
_UserDeleteAny_ | +| GET | Users/_id_/authorization/dataset/create | _UserRead_ | __no__ | Own
_UserReadOwn_ | Own
_UserReadOwn_ | Any
_UserReadAny_ | __no__ | +| GET | Users/logout | _UserLogout_ | __no__ | Own
_UserLogoutOwn_ | __no__ | __no__ | __no__ | +| GET | useridentities/findOne | _UserRead_ | __no__ | Own
_UserReadOwn_ | Any
_UserReadAny_ | Any
_UserReadAny_ | __no__ | + + diff --git a/docs/backendconfig/authorization/index.md b/docs/backendconfig/authorization/index.md new file mode 100644 index 0000000..aaaba7c --- /dev/null +++ b/docs/backendconfig/authorization/index.md @@ -0,0 +1,31 @@ +# Authorization Model + +## General +For how authorization is handled in SciCat, see the [general description](./authorization.md). +For developer information, see [github](https://github.com/SciCatProject/documentation/blob/master/Development/v4.x/backend/authorization/authorization.md). + +## Authorization Datasets + +For developer information, see [github](https://github.com/SciCatProject/documentation/blob/master/Development/v4.x/backend/authorization/authorization_datasets.md). + + +## Authorization OrigDatablocks + +For developer information, see [github](https://github.com/SciCatProject/documentation/blob/master/Development/v4.x/backend/authorization/authorization_origdatablocks.md). + +## Authorization Jobs + +For developer information, see [github](https://github.com/SciCatProject/documentation/blob/master/Development/v4.x/backend/authorization/authorization_jobs.md). + +## Authorization Users + +For developer information, see [github](https://github.com/SciCatProject/documentation/blob/master/Development/v4.x/backend/authorization/authorization_users.md). + +## Authorization Proposals + +For developer information, see [github](https://github.com/SciCatProject/documentation/blob/master/Development/v4.x/backend/authorization/authorization_proposals.md). + +## Authorization Instruments + +For developer information, see [github](https://github.com/SciCatProject/documentation/blob/master/Development/v4.x/backend/authorization/authorization_instruments.md).
+ diff --git a/docs/backendconfig/dois.md b/docs/backendconfig/dois.md index 91f0724..b9304e4 100644 --- a/docs/backendconfig/dois.md +++ b/docs/backendconfig/dois.md @@ -1,8 +1,43 @@ -# DOI minting in SciCat - How to publish datasets +# DOI minting in SciCat - How to set up publication of datasets ## Introduction -User introduction can be found [here](../doisIntro.md). +User introduction can be found [here](../datasets/Publishing.md). +## Variables to configure +We repeat here the relevant parts of the [```.env``` file](../backendconfig/index.md#environment-variables) that the administrator needs to set: +* REGISTER_DOI_URI="https://mds.test.datacite.org/doi" +* REGISTER_METADATA_URI="https://mds.test.datacite.org/metadata" +* DOI_USERNAME="username" +* DOI_PASSWORD="password" +The separate landing page server that has acted as the main frontend client so far becomes redundant: besides datasets, one can also use the publishedData main page as an entry point to display all externally accessible DOIs, while still benefiting from the nice search on datasets. 
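For illustration, here is a minimal, hypothetical sketch of the DOI registration call that these variables parameterize, against the DataCite MDS test endpoint. The helper names and the example DOI are assumptions; only the endpoint URL and the credentials come from the ```.env``` entries above.

```python
import base64
import os
import urllib.request

# Values fall back to the documented .env defaults shown above.
REGISTER_DOI_URI = os.environ.get("REGISTER_DOI_URI", "https://mds.test.datacite.org/doi")
DOI_USERNAME = os.environ.get("DOI_USERNAME", "username")
DOI_PASSWORD = os.environ.get("DOI_PASSWORD", "password")


def doi_endpoint(base: str, doi: str) -> str:
    """Per-DOI MDS URL, e.g. https://mds.test.datacite.org/doi/10.5072/example."""
    return f"{base.rstrip('/')}/{doi}"


def register_doi(doi: str, landing_page: str) -> int:
    """PUT the doi/url pair with HTTP basic auth; returns the HTTP status code."""
    credentials = base64.b64encode(f"{DOI_USERNAME}:{DOI_PASSWORD}".encode()).decode()
    request = urllib.request.Request(
        doi_endpoint(REGISTER_DOI_URI, doi),
        data=f"doi={doi}\nurl={landing_page}".encode("utf-8"),
        method="PUT",
        headers={
            "Content-Type": "text/plain;charset=UTF-8",
            "Authorization": f"Basic {credentials}",
        },
    )
    with urllib.request.urlopen(request) as response:  # network call
        return response.status
```

The backend performs the equivalent calls itself; this sketch is only meant to make the role of the four variables concrete.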
+ +## Full potential with SciCat's APIs + +The respective endpoints can be viewed in swagger. + +List of API endpoints one can access: +![swagger screenshot](../swagger/img/swagger_publishedData.png) + +### Endpoints + +#### post +The main one is the ```post``` endpoint: + +![```post```](../swagger/img/swagger_publishedData_post.png) + +The others are: + +#### count +![```count```](../swagger/img/swagger_publishedData_count.png) + +#### register +![```register```](../swagger/img/swagger_publishedData_register.png) + +#### form populate +![```form populate```](../swagger/img/swagger_publishedData_formpopulate.png) + +#### resync +![```resync```](../swagger/img/swagger_publishedData_resync.png) diff --git a/docs/backendconfig/index.md b/docs/backendconfig/index.md index ea0298d..b8cb627 100644 --- a/docs/backendconfig/index.md +++ b/docs/backendconfig/index.md @@ -4,7 +4,7 @@ The configuration file ```.env``` allows the systems administrator to configure There are currently many configurable additions to SciCat which makes it very flexible these are: -* OIDC for authenticatoin +* OIDC for identification * LDAP for authentication * Elastic Search * SMTP for sending emails to notify users of SciCat jobs @@ -13,7 +13,7 @@ There are currently many configurable additions to SciCat which makes it very fl ## Environment Variables All environment variables can be used in the ```.env``` filee. The current source code contains an example .env file, named _.env.example_, listing all (79) environment variables available to configure the backend. They can be found [here](https://github.com/SciCatProject/scicat-backend-next/blob/master/.env.example) and define -* How SciCat handles [access rights](#how-to-handle-access-rights) and connects to identity providers - such as [LDAP](#how-to-configure-ldap) or [OIDC](#how-to-configure-oidc) +* How SciCat handles access rights and connects to other services, e.g. identity providers such as LDAP or OIDC for authentication. 
* How to configure [DOIs](#how-to-configure-doi-minting). * How to configure elasitc search (ES) * How to configure jobs @@ -499,17 +499,12 @@ LOGGERS_CONFIG_FILE="loggers.json" DATASET_TYPES_FILE="datasetTypes.json" PROPOSAL_TYPES_FILE="proposalTypes.json" ``` -### How to configure LDAP -Here are some details that are currently unknown to the author. - -### How to configure OIDC -Here are some details that are currently unknown to the author. +### How to connect the backend to other services +In [scicatlive](https://www.scicatproject.org/scicatlive/latest/services/backend/) you can find documentation on how to integrate your SciCat system with services providing identities (e.g. Keycloak) and authentication (OpenLDAP). ### How to configure DOI minting -In SciCat one can publish selected datasets that triggers a DOI minting process. Find [here](dois.md) a short introduction and instructions how to set up such a service. SciCat also has the option to make datasets publicly available, if you wish to do that follow [this Link](toBeWritten.md) - +In SciCat one can publish selected datasets, which triggers a DOI minting process. Find [here](../datasets/Publishing.md) a short introduction to SciCat's Published Data class. For instructions on how to configure this DOI minting service and, in addition, make datasets publicly available via the APIs, follow [this link](dois.md). ## More advanced options - If you are compiling the application from source, you can edit the file _src/config/configuration.ts_ with the correct values for your infrastructure. **This option is still undocumented, although it is our intention to provide a detailed how-to guide as soon as we can.** diff --git a/docs/backendconfig/jobs.md b/docs/backendconfig/jobs.md new file mode 100644 index 0000000..f08f86a --- /dev/null +++ b/docs/backendconfig/jobs.md @@ -0,0 +1,36 @@ +# SciCat jobs - how to use them + +If you want to post a job, check the structure of the job's body. 
From the swagger endpoints we see that there are 4 fields: + + * types (mandatory) + * jobsParams (mandatory) + * ownerUser + * ownerGroup + + +## **jobTypes** + +Possible values can be seen from the [example config file](https://github.com/SciCatProject/scicat-backend-next/blob/master/jobConfig.example.yaml): +
1. jobType: template_job +2. jobType: archive +3. jobType: retrieve +4. jobType: public +5. jobType: email_demo +6. jobType: url_demo +7. jobType: rabbitmq_demo +8. jobType: switch_demo +9. jobType: validate_demo + + +## **actionTypes** + +For each job one can execute several actions. These are the available actionTypes: + +1. actionType: log +2. actionType: validate +3. actionType: rabbitmq +4. actionType: url +5. actionType: switch +6. actionType: email +7. actionType: error diff --git a/docs/datasets/Publishing.md b/docs/datasets/Publishing.md index af42052..fb1aec3 100644 --- a/docs/datasets/Publishing.md +++ b/docs/datasets/Publishing.md @@ -1,6 +1,9 @@ -# Publishing datasets +# Publishing SciCat datasets -There are two ways of _publishing_ datasets in SciCat, one via the "publish button" for each dataset, and secondly when adding to it to a selection of datasets for which a DOI is registered. +There are two ways of _publishing_ datasets in SciCat via the GUI: one using the "publish button" for each dataset, and the other when registering a selection of datasets for a DOI according to DataCite standards. The first step is included in the second one. +If you want a more advanced option that fully exploits the available endpoints of SciCat, see [here](../backendconfig/dois.md). + +A more technical description of the workflow can be found [here](PublishingAdvanced.md). ## Publish without DOI registration @@ -10,13 +13,13 @@ Each dataset has this button on the top right. 
## Publish with DOI registeration -The user can select one or several datasets for DOI (**Digital Object Identifier**) registration which means that a record in DataCite, a DOI provider, will be made that points to a DESY landing page. SciCat offers a DataCite conform schema during the workflow. Any data that is known to the data catalog can be published. The publication workflow does the following: +The user can select one or several datasets for DOI (**Digital Object Identifier**) registration, producing a record in DataCite, a DOI provider, that points to a local detailed landing page. SciCat offers a DataCite-conform schema during the workflow. Any data that is known to the data catalog can be published, and the publication workflow goes as follows: 1. The logged in user can define a **set** of datasets to be published. -2. That person assigns metadata relevant to the publication of the datasets, such as title, author (currently the name(s) under 'Creator'), abstract etc. One can work on it at a later stage, too and re-edit the registration. Note, that editing will be allowed once the registration request has been sent. +2. That user assigns metadata relevant to the publication of the datasets, such as title, author (currently the name(s) under 'Creator'), abstract, etc. One can also work on it at a later stage and re-edit the registration. Note that **no** editing is allowed once the registration request has been sent. 3. A DOI is assigned to the published data which can e.g. be used to link from a journal article to the data. -4. It makes the data publicly available by providing a _landing page_ that describes the data. +4. It makes the data publicly available by providing a detailed _landing page_ that describes the data. -5. It publishes the DOI to the worldwide DOI system , e.g. from Datacite +5. It publishes the DOI to the worldwide DOI system from DataCite. 
So the first step is to **select the datasets** that should be published: @@ -43,8 +46,7 @@ Once this is finished one can hit the "register" button (not shown in previous s ![Landing page of published data](img/landingpage.png) -Finally you can have a look at all the published data by going to the Published Data menu item (again by clicking the user icon at the top right corner and choosing "Published Data"): +Finally you can have a look at all the published data by going to the Published Data menu item (click the user icon at the top right corner and choose "Published Data"): ![Landing page of published data](img/published_datasets.png) -This [short video](https://scicatproject.github.io/img/attach_and_publish.mp4) demonstrates how you can add an attachment to your dataset and publish the data. \ No newline at end of file diff --git a/docs/datasets/PublishingAdvanced.md b/docs/datasets/PublishingAdvanced.md new file mode 100644 index 0000000..b46da43 --- /dev/null +++ b/docs/datasets/PublishingAdvanced.md @@ -0,0 +1,31 @@ +# Publishing SciCat datasets Advanced + +The previously described options to [publish datasets](Publishing.md) in SciCat - the process of registration of a selection of datasets - are here outlined in a more technical way. + +## Implementation workflow target + +Please note that only metadata is stored in SciCat; there is no direct coupling between the software and the storage system. If you wish to publish both the metadata and the data, please speak to your operator or consult the developer documentation for examples. + +### 1. Create a list of selected datasets +You can select datasets to create a **dataset list**; more datasets can be added and removed in several sessions. You can cancel the process at any time. A new feature will be that, while examining a single dataset, you can directly add it to or remove it from the selection in the cart. Before proceeding, you will be asked to verify the selection of datasets. 
With this, you as user have finalized the dataset selection that you want to register with a DOI. + +##### Internal review (to be implemented) +Some institutions may introduce an internal review step at this point: other authenticated users (members of a dedicated reviewer group) review the selected datasets. If approved, the initiating user can continue the DOI minting process with the next form. + +### 2. Fill the form for this dataset selection +You will be forwarded to a form where you provide **metadata specific to this selection**, already conforming to the DataCite metadata fields, together with site-specific information about e.g. grants, associated projects, etc. All selected datasets will be made public. Owners (and Admins) are allowed to update this form. Again, this shall be possible within several sessions. + +### 3. Publish the selection +After hitting the button, all selected datasets become publicly visible: not only the owner but anybody can view all the metadata, date of creation, associated file names, location, PI, etc. This is a **prerequisite** for DOI registration. + +### 4. DOI registration +Before hitting the registration button, the data selection has an "internal" DOI, i.e. an unregistered DOI, clearly indicated by the state of this registration request. +When hitting the register button, all the metadata will be forwarded to the DOI provider DataCite, if configured, see [backend config](../backendconfig/dois.md). For quality control, your site may run the request through an external service before forwarding it. *Pending request* is indicated until the request is forwarded to DataCite. Note that from then on no more changes are possible for the requester: the concept of DOIs is to never change the metadata/data of a DOI. + +### 5. For Admins only +In extremely rare cases, and only if justified, i.e. in case of serious errors, an update can be made, by admins only. 
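As a rough illustration of the workflow steps, here is a hypothetical sketch of the payload that a registration request could assemble for the backend's publishedData endpoint. All field names and values are assumptions for illustration; consult the swagger documentation of your own instance for the authoritative schema.

```python
def build_published_data(pids, title, creators, abstract):
    """Assemble a hypothetical PublishedData body for POST /api/v3/publisheddata."""
    return {
        "pidArray": pids,          # step 1: the finalized dataset selection
        "title": title,            # step 2: metadata specific to this selection
        "creator": creators,
        "abstract": abstract,
        "status": "pending_registration",  # step 4: unregistered/"internal" DOI state
    }


payload = build_published_data(
    pids=["20.500.12269/example-pid"],
    title="Example measurement campaign",
    creators=["A. Scientist"],
    abstract="A selection of datasets published together.",
)
```

Posting such a body, followed by the register step, corresponds to the transition from the internal DOI to the pending registration request described above.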
+ + +![workflow diagram](img/published_data_workflow_1.png) + + diff --git a/docs/datasets/grouping_tagging_ds.md b/docs/datasets/grouping_tagging_ds.md index bea2e48..9b83d7c 100644 --- a/docs/datasets/grouping_tagging_ds.md +++ b/docs/datasets/grouping_tagging_ds.md @@ -4,10 +4,9 @@ An easy way of quickly grouping some of the datasets is the option to tag each d Here one sees the EDIT button (bottom right). ![first photo](img/groupingByTagging/groupingByTagging_editbutton.png) -// erscheint noch nicht mittig... -Having added your keywords you can use them to search: -![second one](img/groupingByTagging/groupingByTagging_searchresults.png){align=center} +Having added your keywords you can use them to search. SciCat directly indicates the number of datasets in the database visible to you +![second one](img/groupingByTagging/groupingByTagging_searchresults.png) -On the left you see the filters of your datasets, all results with e.g. "myowntagtest". +on which one can filter (left sidebar) showing all results with e.g. 
"myowntagtest": ![same topic](img/groupingByTagging_searchresults2.png) diff --git a/docs/datasets/img/dataset_details.png b/docs/datasets/img/dataset_details.png new file mode 100644 index 0000000..93bd61f Binary files /dev/null and b/docs/datasets/img/dataset_details.png differ diff --git a/docs/datasets/img/datasets_SearchBar.png b/docs/datasets/img/datasets_SearchBar.png new file mode 100644 index 0000000..9befcf8 Binary files /dev/null and b/docs/datasets/img/datasets_SearchBar.png differ diff --git a/docs/datasets/img/datasets_detailedView.png b/docs/datasets/img/datasets_detailedView.png new file mode 100644 index 0000000..986bf50 Binary files /dev/null and b/docs/datasets/img/datasets_detailedView.png differ diff --git a/docs/datasets/img/datasets_filterNConditions_1.png b/docs/datasets/img/datasets_filterNConditions_1.png new file mode 100644 index 0000000..8afe064 Binary files /dev/null and b/docs/datasets/img/datasets_filterNConditions_1.png differ diff --git a/docs/datasets/img/datasets_filterNConditions_2.png b/docs/datasets/img/datasets_filterNConditions_2.png new file mode 100644 index 0000000..3ed816c Binary files /dev/null and b/docs/datasets/img/datasets_filterNConditions_2.png differ diff --git a/docs/datasets/img/publish_button.png b/docs/datasets/img/publish_button.png new file mode 100644 index 0000000..e98f778 Binary files /dev/null and b/docs/datasets/img/publish_button.png differ diff --git a/docs/datasets/img/published_data_workflow_1.png b/docs/datasets/img/published_data_workflow_1.png new file mode 100644 index 0000000..7f73208 Binary files /dev/null and b/docs/datasets/img/published_data_workflow_1.png differ diff --git a/docs/datasets/index.md b/docs/datasets/index.md index 2d9c169..3a92635 100644 --- a/docs/datasets/index.md +++ b/docs/datasets/index.md @@ -1,22 +1,44 @@ # Datasets -Datasets can include several files which e.g. comprise a self-contained measurement - which is fully customizeable during ingestion of meta data. 
Users can search, view, list the meta data of a dataset. +SciCat datasets are sets of metadata that can include several files which e.g. comprise a self-contained measurement; the metadata is fully customizable during ingestion. Users can search datasets, view their metadata in different formats (e.g. as a tree, table or JSON) and list them. -To group and tag datasets is depicted [here](grouping_tagging_ds.md). Datasets can also be issued to be published: either removing the restricted view or triggering the process of obtaining a DOI for the selected datasets, see [old description](Publishing.md). +## Features +A very handy feature is to **group and tag** datasets. Find more details on how to group datasets using tags [here](grouping_tagging_ds.md). +Datasets can also be selected for **several actions**: +One such action is the *publication* of that selection. For more details see [publication of SciCat datasets](Publishing.md). +Generally, actions depend on what is implemented at your site and can cover a wide range, from +combining them into a *new data collection of a custom type* [(see advanced documentation)](../datasets/datasetTypes.md) to +using that selection of datasets to *run an analysis* on them. -### How to query datasets +## How to search for datasets +Datasets can be queried in several places: +* The search bar at the top of the page provides a quick free-text search. +* The filter & conditions column on the left allows you to customize your filters and conditions: adjust the filters to those that you find interesting and define your own conditions making use of your specific scientific metadata. + +The bar looks like this: +![search bar](img/datasets_SearchBar.png) + +Filtering by conditions can be applied through the option box on the left. + +![filterColumn](img/datasets_filterNConditions_1.png) + +If you choose "More Filters", a pop-up window appears where you can choose which of the filters you want to display. 
You can add your own conditions as well (visible in the background under Conditions): +![apply filters and conditions](../datasets/img/datasets_filterNConditions_2.png) + +## Dataset details +The main tab shows the details of a dataset. + +![example of dataset details page](img/dataset_details_PSI.png) ## Dataset file listing -Here is the view of files belonging to a dataset: Below the PID on the top, one finds the tab **Datafiles**: +A dataset can have several files associated with it. They can be listed by clicking on the **Datafiles** tab just right of the Details tab: ![list](img/dataset_details_filelist.png) - ## Dataset attachments -What kind of attachement can be saved? Will they be searchable? Can also other formats be attached than pngs? +Another tab is for the attachments of a dataset, e.g. PNG or TIFF images. -On the dataset details page, you can click on the Attachments tab ![Choose an image file, must be under 16 MB limit](img/dataset_attachments_PSI.png) Simply follow the instructions to upload an image. The size is restricted to be below 16 MB. @@ -31,13 +53,6 @@ Scientific meta data is shown in JSON under its section and looks like this: One can also get the JSON file via the swagger API. If set up, one can directly access the API endpoints of SciCat backend. Usually the address is in the form: ```my-scicat-instance.country/explorer```, swagger is accessible via the explorer. One needs to authenticate by copying the token from the GUI into the field **authorize**, then find the dataset of interest, by trying it out it will display you dataset and you can download it in JSON format. ## Edit Scientific meta data -// WARNING: this is the old text! -If enabled, fields in the scientific metadata can be modified and edited by the owner of the data by hitting the "Edit" Icon. The user can add,remove or change metadata fields, every change will create a new record in the databse with it's history 
+If enabled, fields in the scientific metadata can be modified and edited by any member of the [Owner Group](../backendconfig/authorization/authorization_datasets.md) of the data by hitting the "Edit" icon. The user can add, remove or change metadata fields; every change will create a new record in the database with its history [feature is soon available again from 2025-07-02]. ![Image edit metadata](img/editMetadata.png) - - -## New developments on dataset types -Generalize datatypes to remove restrictions of ```raw``` and ```derived``` types (difference was a set of dataset properties). - -[datasetTypes](datasetTypes.md) \ No newline at end of file diff --git a/docs/datasets/jobs.md b/docs/datasets/jobs.md new file mode 100644 index 0000000..3e14145 --- /dev/null +++ b/docs/datasets/jobs.md @@ -0,0 +1,6 @@ +# Archival and retrieval of experimental data + +SciCat can be set up to connect to your local storage system, which allows you to do the following: +* Send datasets to your site's archive via a click of a button +* Retrieve your datasets asynchronously via a button click. +For more information please contact your site administrators. diff --git a/docs/doisIntro.md b/docs/doisIntro.md deleted file mode 100644 index 28ebf1d..0000000 --- a/docs/doisIntro.md +++ /dev/null @@ -1,39 +0,0 @@ -# Publishing SciCat datasets - -A **D**igital **O**bject **I**dentifier (DOI) is issued to uniquely identify some object, e.g. data, used for reference e.g. in journal publications. -In SciCat, if one selects his datasets, puts them into the cart and clicks ```publish```, DOI minting process is triggered. This process compises the following steps: - -1. It defines a **set** of datasets to be published -2. It assigns metadata relevant to the publication of the datasets, such as author, abstract etc -3. It assigns a **digital object identifier** DOI to the published data, which can e.g. be used to link from a journal article to the data -4. 
It makes the data publicly available by providing a **landing page** that describes the data. -5. It publishes the DOI to the worldwide DOI system , e.g. from Datacite - -So the first step is to select the datasets to be published - -![Selecting datasets for publication](img/publish_select.png) - -Put them into the cart by hitting the "Add to Cart" button. This step can be repeated to add further datasets. Once the selection is finished you open the cart (click on the cart symbol) to see the selected datasets: - -![Prepare datasets for publication](img/publish_show_selection.png) - -Here you can still change your selection, remove datasets etc. Once this is finished simply hit the publish button. This leads you to the following screen: - -![Adding metadata for publication](img/publish_edit_metadata.png) - -Some of these fields will be pre-filled with information derived from the proposal data, such as the abstract. Independent of the pre-filling you can change the contents as you like until you are satisfied. Then hit the publish button, which leads you to the resulting display page: - -![Showing entered metadata for publication](img/published_data_details.png) - -Initially the status field is in state "pending". This means, the published data information has been stored, but not yet made public to the worldwide DOI system, and no landing page has been created yet. This gives the possibility, that (potentially another person) can have a look at the data and do further editing by hitting the Edit button: - -![Editing metadata for publication](img/published_data_edit.png) - -Once this is finished one can hit the "register" button (not shown in previous screenshot, because already in state registered)to register the DOI and thus making the data public. 
The resulting public landing page for this data then looks somethink like this - -![Landing page of published data](img/landingpage.png) - -Finally you can have a look at all the published data by going to the Published Data menu item (again by clicking the user icon at the top right corner and choosing "Published Data"): - -![Landing page of published data](img/published_datasets.png) - diff --git a/docs/frontendconfig/index.md b/docs/frontendconfig/index.md new file mode 100644 index 0000000..6a3133d --- /dev/null +++ b/docs/frontendconfig/index.md @@ -0,0 +1,64 @@ +# Frontend configuration + +SciCat's frontend configuration is managed in just two JSON files, both part of the backend configuration: + +1. [frontend.config.json](https://github.com/SciCatProject/scicat-backend-next/blob/master/src/config/frontend.config.json) and +2. [frontend.theme.json](https://github.com/SciCatProject/scicat-backend-next/blob/master/src/config/frontend.theme.json) + +TODO: + - Why is the frontend config part of the backend config? + - What is the scope of these two files? + - Provide example code snippets for enabling/disabling buttons, ideally with screenshots. + +# Default List & Filter Configuration Pattern - Frontend Configuration Guide + +## Overview + +This guide explains how to configure the List & Side-Panel Configuration Pattern used on the frontend. +It allows customizing how list-based components (e.g., datasets, proposals) display table columns, side-panel filters, and optional query conditions. + +The configuration should be defined or mounted at the location specified by the environment variable `FRONTEND_CONFIG_FILE` (default: `src/config/frontend.config.json`). + +## Configuration Details + +### **Columns** + +Defines how each field is displayed in the list table. 
+ +| **Property** | **Type** | **Description** | **Example / Notes** | +| ------------ | --------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------- | +| `name` | `string` | Object key whose value will be displayed in the column. | `"datasetName"` | +| `order` | `number` | Position of the column in the table. | `2` | +| `type` | `string` | How the value is rendered:
• `standard` – plain text (default)
• `hoverContent` – shows an icon with a popup/modal on mouseover (for long text)<br>
• `date` – formats ISO date strings; can include a `format` (e.g. `yyyy-MM-dd`) | `"date"` | +| `width` | `number` | Default width of the column. | `200` | +| `format` | `string` | Optional property used **only** when `type` is set to `date`. Defines how ISO date strings are displayed (e.g. `yyyy-MM-dd`).
It falls back to `dateFormat`, or to `yyyy-MM-dd HH:mm` for datasets and `yyyy-MM-dd` for proposals | `"yyyy-MM-dd"` | +| `enabled` | `boolean` | Whether the column is displayed by default. | `true` | + +--- + +### **Filters** + +Defines which filters appear in the side panel and how they behave. + +| **Property** | **Type** | **Description** | **Example / Notes** | +| ------------- | --------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------- | +| `key` | `string` | Object key used for filtering. | `"creationTime"` | +| `label` | `string` | Custom label for the filter. If not provided, it falls back to `labelLocalization` or `key`. | `"Creation Time"` | +| `type` | `string` | Filter input type:<br>
• `text` – _deprecated_ (use search box)
• `multiSelect` – dropdown with multiple options; supports `checkBoxFilterClickTrigger` for auto-apply
• `dateRange` – calendar or manual from–to input
• `checkbox` – pre-populated list; supports `checkBoxFilterClickTrigger` for auto-apply | `"multiSelect"` | +| `description` | `string` | Tooltip text for the filter. | `"Filter by dataset type"` | +| `enabled` | `boolean` | Whether the filter is active by default. | `true` | + +--- + +### **Conditions** + +Defines predefined condition filters in the side panel (currently supported only for the dataset table). + +| **Property** | **Type** | **Description** | **Example / Notes** | +| ------------ | -------- | ------------------------------------------------- | ------------------- | +| `lhs` | `string` | Metadata key to filter on | `"outgassing_values_after_1h"` | +| `relation` | `string` | Comparison operator:<br>
• `GREATER_THAN`
• `GREATER_THAN_OR_EQUAL`
• `LESS_THAN`
• `LESS_THAN_OR_EQUAL`
• `EQUAL_TO`
• `RANGE` | `"EQUAL_TO"` | +| `rhs` | `string` | Value to compare against | `"3.1e4"` | +| `unit` | `string` | **Optional** unit for the value | `"mbar l/s/cm^2"` | +| `unitsOptions`| `string[]`| **Optional** A list of allowed units for this condition. When provided, the unit dropdown will be restricted to only these options | `["mbar l/s/cm^2", "Pa m^3/s/m^2"]` \ No newline at end of file diff --git a/docs/img/SciCatATPSI.pdf b/docs/img/SciCatATPSI.pdf deleted file mode 100644 index 7c38837..0000000 Binary files a/docs/img/SciCatATPSI.pdf and /dev/null differ diff --git a/docs/img/SciCatATPSI.png b/docs/img/SciCatATPSI.png deleted file mode 100644 index 6dfd874..0000000 Binary files a/docs/img/SciCatATPSI.png and /dev/null differ diff --git a/docs/img/publish_edit_metadata.png b/docs/img/publish_edit_metadata.png deleted file mode 100644 index 3d19ab2..0000000 Binary files a/docs/img/publish_edit_metadata.png and /dev/null differ diff --git a/docs/img/publish_select.png b/docs/img/publish_select.png deleted file mode 100644 index bc1add9..0000000 Binary files a/docs/img/publish_select.png and /dev/null differ diff --git a/docs/img/publish_show_selection.png b/docs/img/publish_show_selection.png deleted file mode 100644 index 08310a1..0000000 Binary files a/docs/img/publish_show_selection.png and /dev/null differ diff --git a/docs/img/published_data_details.png b/docs/img/published_data_details.png deleted file mode 100644 index 7c8a587..0000000 Binary files a/docs/img/published_data_details.png and /dev/null differ diff --git a/docs/img/published_data_edit.png b/docs/img/published_data_edit.png deleted file mode 100644 index b89ef8a..0000000 Binary files a/docs/img/published_data_edit.png and /dev/null differ diff --git a/docs/img/published_datasets.png b/docs/img/published_datasets.png deleted file mode 100644 index 9a5f8df..0000000 Binary files a/docs/img/published_datasets.png and /dev/null differ diff --git a/docs/index.md b/docs/index.md index 
dc746f4..d711027 100644 --- a/docs/index.md +++ b/docs/index.md @@ -1,4 +1,4 @@ # Welcome to SciCat Documentation -Find [**SciCat USERS Guide**](user-manual/index.md) or [**SciCat Operator's Guide**](operator-manual/index.md). -Developers can read long the ```READMEs``` in github of the [projects page](https://www.scicatproject.org) and in both guides as well. +Find the [**SciCat Users Guide**](user-guide/index.md) or the [**SciCat Operator's Guide**](operator-guide/index.md). +Developers can also read the ```READMEs``` on GitHub, linked from the [projects page](https://www.scicatproject.org), and in both guides. diff --git a/docs/instruments.md b/docs/instruments.md deleted file mode 100644 index e69de29..0000000 diff --git a/docs/instruments/index.md b/docs/instruments/index.md new file mode 100644 index 0000000..4dcd95b --- /dev/null +++ b/docs/instruments/index.md @@ -0,0 +1,3 @@ +## Instruments + +The Instruments section contains the metadata of the different instruments available at the facility. \ No newline at end of file diff --git a/docs/login/Dashboard.md b/docs/login/Dashboard.md index a868eae..9aef31f 100644 --- a/docs/login/Dashboard.md +++ b/docs/login/Dashboard.md @@ -1,44 +1,57 @@ -## Dashboard +# Dashboard -The dashboard is the first page that you see after being logged in. It contains an overview of all datasets that you have access to. +SciCat's Dashboard, sometimes also called the Landing Page, is the first page you see, whether or not you are logged in. To log in, click the *Sign in* button at the top right; see [here](./index.md) for more information. When datasets are set as the main access point, the Dashboard shows an overview of all datasets that you have access to. If you do not log in, you see only those that are public. ![dashboard](img/dashboard.png) -## Menu access to different information pages +SciCat now offers new features for viewing metadata as you like, with adjustable columns. 
-You can always navigate to other parts of the application, simply by clicking on the user icon on the top right corner +![dashboard_adjustableColumns](img/dashboard_adjustableColumns_rk.png) -![Overall Menu](img/menu_dropdown.png) +You can change the columns to be shown by choosing "Column settings" from the three dots on the right and selecting those you would like. You can also drag columns: hover over the dots that appear just next to the label, click, pull the column where you want to place it, then release. -## Filtering Datasets +![dashboard_newFeatures](img/dashboard_optionsPerColumn.png) -You can currently filter across 5 different fields: -1. Location (= field creationLocation) -2. Groups (= field ownerGroup) -3. Type (=field type - e.g. raw data or derived data ) -4. Keywords =field keywords, the tags added to the datasets) -5. Start - End Date ( = field createdAt, show datasets captured between the dates that you have set) +You can -The text fields provide an auto completion, which becomes visible as you type. +1. sort columns (click on the name and pull) +2. adjust the width of columns (left block of dots) +3. remove or add columns (selection from Column settings) +4. invert the order of display (click the arrow next to the name) +5. apply a filter directly on that column with various options ("contains", "equals", "startsWith", "endsWith", "empty", "notEmpty") and either add (+), or (||) or exclude (x) another filter. + +![Overall Menu](img/dashboard_filterOnColumns.png) + +## Menu access to different information pages + +You can always navigate to other parts of the application, simply by clicking on the user icon on the top right corner -One click on the date calendar selects the start date and a second selects the end date. Make sure you select 2 dates. +![Overall Menu](img/menu_dropdown.png) -In the following screenshot the datasets are filterd by the condition ownerGroup="p17301" +## Finding Datasets +SciCat provides several possibilities for finding the right datasets. 
You can use the top search bar, narrow down your selection by applying filters and/or conditions, and search on scientific metadata as well. -![filters](img/dashboard_PSI_pgroup_public.png) +### Using Filters and Conditions +On the left you can apply the most common filters. Currently there are -## Searching +1. Location: location of creation of the dataset. +2. PID: identifier of the dataset. +3. Groups: who owns the dataset. +4. Type: data type, e.g. raw or derived data. +5. Keywords: tags added to the dataset. +6. Start - End Date: show datasets captured between the dates that you have set. +7. Text: searches across dataset name and description. -The text field at the top of the navigation bar allows you to search the metadata for any word contained in the metadata (but not arbitrary substrings). The search starts automatically when to start to type in this textfield, so better type fast ;-) +The text fields provide auto-completion, which becomes visible as you type. -## Configure table columns +You can click on the date calendar to select the start date and a second time to select the end date. Make sure you select 2 dates. -The cog wheel symbol on the top right allows to define the columns, that you want to see in the table +You can configure the selection of filters and add specific _conditions_. 
An example shows two additional conditions added: +![filters](./img/dashboard_filters.png) ## View Details -To view a dataset simply click on it in the table and a more detailed view will load (this is covered in the next section) +To view a dataset, simply click on it in the table and a more detailed view will load (this is covered in the datasets section). diff --git a/docs/login/img/dashboard.png b/docs/login/img/dashboard.png index 4d84bb3..e779983 100644 Binary files a/docs/login/img/dashboard.png and b/docs/login/img/dashboard.png differ diff --git a/docs/login/img/dashboard_adjustableColumns_rk.png b/docs/login/img/dashboard_adjustableColumns_rk.png new file mode 100644 index 0000000..f6f34c1 Binary files /dev/null and b/docs/login/img/dashboard_adjustableColumns_rk.png differ diff --git a/docs/login/img/dashboard_filterOnColumns.png b/docs/login/img/dashboard_filterOnColumns.png new file mode 100644 index 0000000..5e787ad Binary files /dev/null and b/docs/login/img/dashboard_filterOnColumns.png differ diff --git a/docs/login/img/dashboard_filters.png b/docs/login/img/dashboard_filters.png new file mode 100644 index 0000000..c21f466 Binary files /dev/null and b/docs/login/img/dashboard_filters.png differ diff --git a/docs/login/img/dashboard_optionsPerColumn.png b/docs/login/img/dashboard_optionsPerColumn.png new file mode 100644 index 0000000..783d578 Binary files /dev/null and b/docs/login/img/dashboard_optionsPerColumn.png differ diff --git a/docs/login/index.md b/docs/login/index.md index ed26518..6383761 100644 --- a/docs/login/index.md +++ b/docs/login/index.md @@ -2,17 +2,14 @@ To get access to **all** the data, for which you have read access, you first have to login. Otherwise you can browse only public datasets, see [anonymous browsing](Anonymous.md). -To login hit the "Sign in" Icon at the top right corner. +To log in, hit the "Sign in" icon at the top right corner. 
![Login to ESS](img/login.png) -There are two types of account associated with the DataCatalog: *Functional* and *User*. A *functional* account will primarily be used by software and system administrators to deal with backups and other tasks. - -*User* accounts are tied into the login system that is used by your institution, for example: Active Directory. You are able to log in to the system using the same credentials you use on that account. This process is called *authenitication* in IT tech terminology +User accounts are tied into the login system that is used by your institution, for example: Active Directory. You are able to log in to the system using the same credentials you use on that account. This process is called *authentication* in IT terminology. When you login as a user your user management system will assign groups to the logged in user. Each dataset is also assigned to one such group (via the so called ownerGroup field), and you can view the datasets only, if you are member of the corresponding group. The logic that defines, what parts of the data you can see, is called "authorization" in IT terminology. The first page you'll see after login is the ["Dashboard"](Dashboard.md). 
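The ownerGroup logic described above can be sketched as follows. This is a minimal illustration only, not the backend's actual authorization rules; in particular the `isPublished` flag used for the anonymous/public case is an assumption:

```python
# Illustrative sketch of group-based dataset authorization: a dataset is
# visible if it is public, or if the user belongs to its ownerGroup.
# NOT the backend's real rule set; `isPublished` is an assumed field name.

def can_view(dataset, user_groups):
    """user_groups is a set of group names, or None for an anonymous user."""
    if dataset.get("isPublished"):   # public datasets are visible to everyone
        return True
    if not user_groups:              # anonymous users see only public data
        return False
    return dataset.get("ownerGroup") in user_groups  # group membership grants access

ds = {"ownerGroup": "p17301", "isPublished": False}
print(can_view(ds, {"p17301", "a-team"}))  # True: member of the ownerGroup
print(can_view(ds, None))                  # False: anonymous, dataset not public
```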
Another example login page from PSI is here - ![Login to PSI](img/login-psi.png) diff --git a/docs/operator-guide/img/DacatDataflowV3.png b/docs/operator-guide/img/DacatDataflowV3.png new file mode 100644 index 0000000..27ed5bc Binary files /dev/null and b/docs/operator-guide/img/DacatDataflowV3.png differ diff --git a/docs/operator-guide/img/job-assembler.png b/docs/operator-guide/img/job-assembler.png new file mode 100644 index 0000000..5edde42 Binary files /dev/null and b/docs/operator-guide/img/job-assembler.png differ diff --git a/docs/operator-guide/index.md b/docs/operator-guide/index.md new file mode 100644 index 0000000..3a72c07 --- /dev/null +++ b/docs/operator-guide/index.md @@ -0,0 +1,101 @@ +# Welcome to SciCat Operator's Guide + +## Overview +SciCat is a flexible metadata catalogue designed to be easily interfaced into most existing infrastructure. The following guide will introduce the configuration required to integrate SciCat into common technologies. + +Ingesting metadata and deploying SciCat are best approached by understanding its core systems (backend and frontend), its features and configurations, and what else one can do to fully exploit SciCat's capabilities. + +SciCat consists of a backend application that is connected to the database (a MongoDB) and a frontend client exposing database content through a GUI to a user. At large-scale facilities SciCat handles about 30 PB of data. + +## Features + +SciCat covers these core aspects in a flexible way: + +1. Searchable metadata fields, both common and highly specific ones. SciCat was developed by the PaNOSC community and has been successfully used more widely. This is because SciCat is highly configurable. +2. Provision of unique persistent identifiers, not only for the internal catalogue, but also connecting to the global DOI system through e.g. a ready pathway to publication via [DataCite](https://datacite.org/). 
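To make point 1 concrete, here is a hedged sketch of what such a metadata record might look like in MongoDB. Field names are taken from elsewhere in these docs (ownerGroup, creationLocation, keywords, scientificMetadata); the exact schema is defined by the backend models, and the scientific metadata block is free-form per facility:

```python
import json

# Hypothetical dataset metadata document; the real schema is defined by
# the backend models, and scientificMetadata is free-form per facility.
dataset = {
    "datasetName": "sample-scan-001",
    "ownerGroup": "p17301",                      # controls who may see the dataset
    "creationLocation": "/PSI/SLS/TOMCAT",       # where the data was taken
    "type": "raw",                               # e.g. raw or derived
    "keywords": ["tomography", "myowntagtest"],  # tags used for grouping and search
    "scientificMetadata": {                      # fully customizable during ingestion
        "outgassing_values_after_1h": {"value": 3.1e4, "unit": "mbar l/s/cm^2"},
    },
}

print(json.dumps(dataset, indent=2))
```

Every field shown here (common or facility-specific) is searchable via the filters and conditions described in the user guide.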
+ +SciCat is an open source project and can be developed in accordance with our [license](https://github.com/SciCatProject/scicat-backend-next?tab=BSD-3-Clause-1-ov-file#readme). + +## Dataset ingestion +You find here a pythonic way of metadata ingestion using SciCat's API, based on the PySciCat client: +See this [how-to-ingest doc](https://www.scicatproject.org/pyscicat/howto/ingest.html) to get started. + +Another example that uses a Jupyter Notebook in SciCatLive (see below) can be found [here](https://github.com/SciCatProject/scicatlive/blob/main/services/jupyter/config/notebooks/pyscicat.ipynb), which includes how to authenticate, create a dataset, add datablocks and upload an attachment. + +## Up-to-date operator's information +Generally, the [**scicatlive**](https://www.scicatproject.org/scicatlive/latest/) documentation contains up-to-date information on how to set up and run ```SciCat``` and interface it with various external, site-specific services. For troubleshooting issues, please refer to [the User's Guide](../troubleshoot/index.md). + +## Backend +At the heart of the SciCat architecture there is the **REST API server**. This is a NodeJS application that uses the NestJS framework to generate RESTful APIs from JSON files that define the models: Users, Datasets, Instruments, Proposals, etc. Following the Swagger/OpenAPI format, SDKs can be generated in almost any language. You can explore the backend APIs directly via the [Swagger](../swagger/index.md) interface. + +The persistence layer behind this API server is a **MongoDB** instance, i.e. an open source, NoSQL, document-based database solution. The API server handles all the bi-directional communication from the REST interface to the database. + +These two components together comprise the "backend" of the architecture. 
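Outside of Swagger, the same APIs can be called from any HTTP client. A minimal sketch follows; the `/api/v3/datasets/{pid}` route and Bearer-token header reflect common SciCat deployments and are assumptions here, so check your instance's `/explorer` page for the exact routes:

```python
from urllib.parse import quote
from urllib.request import Request

# Hypothetical helper that builds an authenticated request for one dataset.
# Dataset PIDs typically contain slashes, so they must be percent-encoded
# before being placed in the URL path.
def dataset_request(base_url, pid, token):
    url = f"{base_url}/api/v3/datasets/{quote(pid, safe='')}"
    return Request(url, headers={"Authorization": f"Bearer {token}"})

req = dataset_request("https://my-scicat-instance.country",
                      "20.500.12345/abc-001",
                      "<token copied from the GUI>")
print(req.full_url)
# https://my-scicat-instance.country/api/v3/datasets/20.500.12345%2Fabc-001
```

The token is the same one the swagger **authorize** field expects, as described in the user guide.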
+ +### Configuration of the backend +There is one central place where one has a handle on how the backend is configured in SciCat: the [dotenv](../backendconfig/index.md) file. + + +### Example: How to connect your SciCat to an external service +One useful feature of SciCat is the ability to connect your SciCat to some external service via "SciCat jobs". Traditionally there were three fixed types (a job to archive, to retrieve and to publish), visible from the GUI through the respective buttons. Since recently (summer 2025) there are in total 9 types available. To use them at your site, start with the admin documentation [here](../backendconfig/jobs.md). For now, stick to the developers' documentation directly in the [code repo](https://github.com/SciCatProject/documentation/blob/master/Development/v4.x/backend/configuration/jobconfig.md). + + +### Example: How to integrate OIDC using Keycloak + +Integration with an identity provider, Keycloak, can be done using OpenID Connect, a protocol for authentication. +See the [scicatlive manual](https://www.scicatproject.org/scicatlive/latest/services/backend/services/keycloak/) for more information on the integration setup in the SciCat backend. + +## Frontend + +To the REST server an arbitrary number of "clients" (frontends) can be connected. One of the most important clients is the web-based GUI frontend, which allows communicating with the data catalog in a user-friendly way. It is based on Angular (9+) technology and uses ngrx to communicate with the SciCat API and provide a searchable interface for datasets, as well as the option to carry out actions (e.g. archiving). + +In addition to the GUI, other clients exist, such as command line (CLI) clients (examples exist written in Go and Python) or desktop GUI applications based on Qt. The CLI tools are especially useful for automated workflows, e.g. to get the data into the data catalog. This process is termed "ingestion" of the data. 
But they can also be used to add the data manually, especially for derived data, since this part of the workflow is often not possible to automate, in particular in truly experimental setups. + +### Start up a frontend client + +To start a local instance of the frontend, follow the recipe: install the requirements, esp. Angular, git clone the [code](https://github.com/SciCatProject/frontend), go to the directory and run "npm run start". Then you can launch it by entering "localhost:4200". + +### Configuration of the frontend +[Jays Link](https://github.com/SciCatProject/scicat-backend-next/pull/2306/files) Find a guide to the frontend configuration [here](../frontendconfig/index.md). + +### How to include site-specific logos +See [here](https://github.com/SciCatProject/frontend/blob/master/SITE-LOGO-CONFIGURATION.md) for an example procedure of how to include your logo. + +### Messaging infrastructure + +SciCat's strength is to integrate into almost any existing infrastructure, because **messaging systems** that take over the communication to other services and systems can easily be interfaced to SciCat. + +A detailed description of jobs for the new backend can be found [here](https://github.com/SciCatProject/documentation/blob/master/Development/v4.x/backend/configuration/jobconfig.md). + + +### Different entry points to SciCat + +Usually one sees SciCat datasets, that is, the metadata of the data taken. It will be possible to sort according to samples, proposals, instruments and published data. Integration and generalisation of these entry points to the catalogue is currently in development. Another strength of SciCat is that it provides a publishing server. + +#### Publishing Server + +In order to publish data you need to run a landing page server and you need to assign DOIs to your published data. 
Since the API server may be operated in an intranet with no access to the internet, the following architecture was chosen at PSI: + +An OAI-PMH server is running in a DMZ connected to a local Mongo instance. At publication time the data from SciCat is pushed to the external OAI-PMH server. From this server the landing page server can fetch the information about the published data. Also external DOI systems connect to this OAI-PMH server to synchronize the data with the world wide DOI system. + +If a user wants to download the full datasets of the published data, the data is copied from the internal file server to an HTTPS file server (acting as a cache file server), which subsequently allows anonymous download of the data. + +### Underlying Infrastructure of SciCat as a Service + +You may or may not run the infrastructure as part of a Kubernetes cluster. E.g. at PSI the API server, the GUI application, RabbitMQ and the Node-RED instances are all deployed to a Kubernetes cluster, whereas the MongoDB is kept outside Kubernetes. Kubernetes is not necessary to have, but can simplify operations. The same goes for "helm charts" or similar tools for managing software applications as a service. + +## Who uses SciCat? + +Traditionally, PSI, ESS and MAX IV developed and deployed SciCat. More institutions have since joined the effort and pushed its development, and SciCat is deployed at many photon and neutron labs in Europe and worldwide; see our project page for [all facilities](https://www.scicatproject.org/#facilities), contributors and users of SciCat. + +Below is a list of their documentation with more details on their deployment. 
+ +* ESS - European Spallation Source +* [PSI - Paul-Scherrer-Institute](../sites/PSI/index.md) +* MAXIV +* RFI +* ALS +* SOLEIL +* [DESY](../sites/DESY/index.md) + + diff --git a/docs/operator-manual/index.md b/docs/operator-manual/index.md deleted file mode 100644 index 17a28d1..0000000 --- a/docs/operator-manual/index.md +++ /dev/null @@ -1,25 +0,0 @@ -# Welcome to SciCat Operator's Manual - -General manual for site-administrators can be found in the [**scicatlive**](https://www.scicatproject.org/scicatlive/latest/) documentation, it contains information how to set up and run a SciCat instance. For troublshooting issues, please see [the User's Guide](../troubleshoot/index.md). - -## Configuration of the Backend -There is one central place where one has a handle on how the Backend is configured in SciCat: the [dotenv](../backendconfig/index.md) file. - -### Hands-on SciCat -For getting familiar with SciCat's APIs, you can explore via the [Swagger](../swagger/index.md) interface. - -## Configuration of the Frontend - - - -Here we link to site-specific set ups. - -## Links to site-specific SciCat documentation of user sites - -* ESS -* [PSI](../sites/PSI/index.md) -* MAXIV -* SOLEIL -* [DESY](../sites/DESY/index.md) - - diff --git a/docs/operator/datasetsv4/index.md b/docs/operator/datasetsv4/index.md deleted file mode 100644 index 9fb1243..0000000 --- a/docs/operator/datasetsv4/index.md +++ /dev/null @@ -1,44 +0,0 @@ -# How to run new dataset endpoint in v4 - -For now one can run and play with the new version of dataset endpoints v4. Here is how to set it up: - -1. Pre-requisits: admin rights, git, docker, nodejs (v.20.18.2) and npm (v10.8.2). -2. For frontend to run, install angular like this: "npm install -g @angular" -3. Git clone frontend and backend repository. -4. Launch "npm install" in each dir by entering "npm install", respectively. -5. 
Configuration of backend: add "loggers.json", "functionalAccounts.json", -"proposalTypes.json", "datasetTypes.json" and ".env" files (there are example files named *.example). -6. Start MongoDB: one way is a Docker container. Recipe: get an image, e.g. bitnami/mongodb:latest, create a volume, e.g. "mongodb", mount it in the container at /bitnami/mongodb, and open container port 27017 on host port 27017. -7. (Optional?) Create the database: log into the running MongoDB container via Docker and run "mongosh dacat" in the shell, which creates the DB "dacat" (another name works too, but then it must also be changed in the .env). According to the old documentation you then have to run "db.Dataset.createIndex( { "$**" : "text" } )" in mongosh to enable text indexing; it is not 100% certain that this is still required. -8. Start backend: in the backend dir run "npm run start". It should be up after about 10s; test by opening "localhost:3000/explorer" in the browser - if Swagger shows up, it works. -9. Start frontend: once the backend runs, execute "npm run start" in the frontend dir. After compilation (about 40s) the SciCat frontend should be visible under "localhost:4200". \ No newline at end of file diff --git a/docs/operator/index.md b/docs/operator/index.md deleted file mode 100644 index 1ff4d93..0000000 --- a/docs/operator/index.md +++ /dev/null @@ -1,9 +0,0 @@ -# Welcome to SciCat Manual for Admins - -This is a short guide containing most of how SciCat can be operated. Many technical aspects are already described in the [scicatlive](https://www.scicatproject.org/scicatlive/latest/) documentation. - -We highlight in more detail aspects for someone exploring core SciCat software. - -* How to set up development versions of the new version of [dataset endpoints](datasetsv4/index.md). -* How to configure SciCat to mint DOIs. -* How to integrate SciCat with other systems. \ No newline at end of file diff --git a/docs/proposals.md b/docs/proposals.md deleted file mode 100644 index a397d75..0000000 --- a/docs/proposals.md +++ /dev/null @@ -1,3 +0,0 @@ -## Proposals - -Here shall be a description of all meta data fields describing the proposal and how they are or can be related to datasets and samples.
\ No newline at end of file diff --git a/docs/proposals/img/proposallist.png b/docs/proposals/img/proposallist.png new file mode 100644 index 0000000..c406e80 Binary files /dev/null and b/docs/proposals/img/proposallist.png differ diff --git a/docs/proposals/index.md b/docs/proposals/index.md new file mode 100644 index 0000000..4a63b90 --- /dev/null +++ b/docs/proposals/index.md @@ -0,0 +1,7 @@ +## Proposals + +If proposals have been entered, they can be listed as well (select *Proposals* from your login icon at the top right). *Parent proposals* can now be handled: your run number or beamtime ID is associated, as *proposal ID*, with a child proposal of the parent proposal. + +With the new configurable table view you can select which columns to display and sort by, just like for datasets. + +![proposals](./img/proposallist.png) \ No newline at end of file diff --git a/docs/samples.md b/docs/samples.md deleted file mode 100644 index 93e4378..0000000 --- a/docs/samples.md +++ /dev/null @@ -1,4 +0,0 @@ -## Samples - -A dscription of which meta data fields can be expected in SciCats sample description will follow.
- diff --git a/docs/samples/img/sampleDetails.png b/docs/samples/img/sampleDetails.png new file mode 100644 index 0000000..b0739a2 Binary files /dev/null and b/docs/samples/img/sampleDetails.png differ diff --git a/docs/samples/img/sample_linkedFromDataset.png b/docs/samples/img/sample_linkedFromDataset.png new file mode 100644 index 0000000..2fbe350 Binary files /dev/null and b/docs/samples/img/sample_linkedFromDataset.png differ diff --git a/docs/samples/img/sample_popUp.png b/docs/samples/img/sample_popUp.png new file mode 100644 index 0000000..9142636 Binary files /dev/null and b/docs/samples/img/sample_popUp.png differ diff --git a/docs/samples/index.md b/docs/samples/index.md new file mode 100644 index 0000000..b1424e0 --- /dev/null +++ b/docs/samples/index.md @@ -0,0 +1,12 @@ +## Samples + +If set up at your site, you can list, search and view samples that have been entered into SciCat. This screenshot shows how to filter for specific characteristics of a sample: + +![samples](./img/sample_popUp.png) + + +When selecting such a sample, one sees its details: +![samples](./img/sampleDetails.png) + +Once a sample entry exists in SciCat, one can link datasets to it. The sample then appears under "related information": +![samples](./img/sample_linkedFromDataset.png) diff --git a/docs/sites/DESY/index.md b/docs/sites/DESY/index.md index 1ccc154..16771a6 100644 --- a/docs/sites/DESY/index.md +++ b/docs/sites/DESY/index.md @@ -1,9 +1,3 @@ # SciCat at DESY -Here we plan to describe how SciCat is set up and used at DESY. -DESY has a proposal system called ```DOOR```, more than 20 beamlines at [
PETRA
](https://photon-science.desy.de/facilities/petra_iii/index_eng.html) and also laser beam experiments at [
FLASH.
](https://photon-science.desy.de/facilities/flash/index_eng.html) - -Setup: -1. Structure of SciCats dataset are used - -Usage: \ No newline at end of file +Here we plan to provide more details on how SciCat is set up and used at DESY that might be beneficial for other institutions. For now we link to DESY's [(internal) IT documentation](https://xwiki.desy.de/xwiki/bin/view/IT/SciCat/) on SciCat. diff --git a/docs/swagger/img/swagger_publishedData.png b/docs/swagger/img/swagger_publishedData.png new file mode 100644 index 0000000..2dad104 Binary files /dev/null and b/docs/swagger/img/swagger_publishedData.png differ diff --git a/docs/swagger/img/swagger_publishedData_count.png b/docs/swagger/img/swagger_publishedData_count.png new file mode 100644 index 0000000..a1d6d75 Binary files /dev/null and b/docs/swagger/img/swagger_publishedData_count.png differ diff --git a/docs/swagger/img/swagger_publishedData_formpopulate.png b/docs/swagger/img/swagger_publishedData_formpopulate.png new file mode 100644 index 0000000..a0f2eac Binary files /dev/null and b/docs/swagger/img/swagger_publishedData_formpopulate.png differ diff --git a/docs/swagger/img/swagger_publishedData_post.png b/docs/swagger/img/swagger_publishedData_post.png new file mode 100644 index 0000000..b8769be Binary files /dev/null and b/docs/swagger/img/swagger_publishedData_post.png differ diff --git a/docs/swagger/img/swagger_publishedData_register.png b/docs/swagger/img/swagger_publishedData_register.png new file mode 100644 index 0000000..986227b Binary files /dev/null and b/docs/swagger/img/swagger_publishedData_register.png differ diff --git a/docs/swagger/img/swagger_publishedData_resync.png b/docs/swagger/img/swagger_publishedData_resync.png new file mode 100644 index 0000000..7d5eeba Binary files /dev/null and b/docs/swagger/img/swagger_publishedData_resync.png differ diff --git a/docs/swagger/index.md b/docs/swagger/index.md index 07ae923..6ae34fd 100644 --- a/docs/swagger/index.md +++ 
b/docs/swagger/index.md @@ -1,4 +1,4 @@ -# Swagger or Explorer - The Backend API +# Swagger - explore the Backend API If SciCat has been set up and runs, one has direct access to the backend through the APIs via the Swagger tool or Explorer interface. Often, you can simply extend the ```url``` by ```/explorer```, e.g. ```https://myscicat.mydomain.de/explorer```. You will see a list of all APIs of that instance. diff --git a/docs/user-guide/index.md b/docs/user-guide/index.md new file mode 100644 index 0000000..a0f792e --- /dev/null +++ b/docs/user-guide/index.md @@ -0,0 +1,42 @@ +# Welcome to the SciCat User Guide + +SciCat, a scientific metadata catalogue, allows you to explore your datasets through their metadata. SciCat also has mechanisms to interact with the datasets themselves through its flexible integration with most storage systems. + +SciCat is a data management tool accompanying critical steps of the entire data life cycle: getting an overview of datasets for data analysis and re-analysis, and in particular for publishing datasets. + +Advantages of SciCat: + +* It can be integrated with almost any other service that has a REST API. Therefore, site-specific applications can be integrated easily. +* The data model of SciCat foresees schemaless fields for quite different use cases. This concept has been implemented for the main class, Datasets, and is being extended to work the same way for the other classes, e.g. Proposals, Samples, Instruments and Published Data. +* Its components are based on open-source software projects and state-of-the-art technologies, using MongoDB as the backbone database and NestJS as the backend framework. + +In the past five years (since about 2020) SciCat has undergone major improvements in key areas for a better user experience, and has been re-structured to meet the various needs of photon science labs. The collaboration has grown, and governance will soon be established.
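The schemaless-metadata idea can be sketched with a small, hypothetical example (not the actual SciCat schema; the field name `scientificMetadata` follows common SciCat usage, while the PIDs, field values and helper function are purely illustrative):

```python
# Sketch: two datasets whose schemaless "scientificMetadata" blocks carry
# entirely different fields, yet both can be stored and queried side by side.

xrd_dataset = {
    "pid": "20.500.12345/sample-xrd-001",   # hypothetical PID
    "datasetName": "XRD scan of sample A",
    "scientificMetadata": {                 # free-form, per use case
        "wavelength": {"value": 1.54, "unit": "angstrom"},
        "scanRange": {"start": 10, "stop": 90, "unit": "deg"},
    },
}

tomo_dataset = {
    "pid": "20.500.12345/tomo-042",
    "datasetName": "Tomography of sample B",
    "scientificMetadata": {                 # completely different keys
        "projections": 1800,
        "pixelSize": {"value": 0.65, "unit": "um"},
    },
}

def find_by_metadata(datasets, key):
    """Return the datasets whose scientific metadata contains a given key."""
    return [d for d in datasets if key in d["scientificMetadata"]]

hits = find_by_metadata([xrd_dataset, tomo_dataset], "wavelength")
print([d["datasetName"] for d in hits])  # -> ['XRD scan of sample A']
```

No shared schema is imposed on the two metadata blocks; queries simply pick out whichever datasets carry the fields of interest.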
+ +## How to run SciCat For more detailed information on how to run SciCat, see the [scicatlive documentation](https://www.scicatproject.org/scicatlive/latest/). For more details on how to ingest into, set up and deploy SciCat, see the [operator's guide](../operator-guide/index.md). + +## How to use SciCat +Once metadata is ingested into SciCat, the user can log in to view and edit the metadata, and to list, filter and select interesting datasets, also using scientific metadata. There are four main areas of SciCat where metadata can be explored: + +1. [Datasets](../datasets/index.md): Metadata in SciCat is ideally organised per dataset. A dataset can have several files attached that share its metadata, such as a thumbnail or common image files. +2. [Proposals](../proposals/index.md): are used to link datasets to the proposal under which beamtime was granted. +3. [Instruments](../instruments/index.md): a library of the instruments available at your institute, which can be linked to datasets. +4. [Samples](../samples/index.md): Here you can add metadata describing a physical sample, which can be linked to its experimental use captured in datasets. + +For many, the SciCat datasets are the entry point to the catalogue, but soon it will be possible to start from samples or published data records (registered metadata sets). +You can just browse any published datasets in the catalogue. Alternatively, you can list all datasets that you own or have access to. Here is how to find out more on how to proceed: + +## How-Tos for Users (Quick links) + +* [Login](../login/index.md) +* Search and find your data, see [Datasets - How to query](../datasets/index.md#how-to-query-datasets) +* How to change some fields after ingestion +* How to view the history of changes to a dataset +* How to [group and tag datasets](../datasets/grouping_tagging_ds.md).
+* How to group and tag grouped datasets + +## A few more How-To's for users and site-admins +* Where to find the version of the deployed SciCat Frontend? Check [here](../about/operatorHowTos.md). + + + diff --git a/docs/user-manual/index.md b/docs/user-manual/index.md deleted file mode 100644 index dd0cfd6..0000000 --- a/docs/user-manual/index.md +++ /dev/null @@ -1,31 +0,0 @@ -# Welcome to SciCat Users Manual - -This is a short guide to most common questions users of SciCat can face. First of all: What is SciCat? And why should I use it? SciCat, a _science catalogue_, should serve you to find back your data. SciCat is a bookkeeping tool accompanying some critical steps during the entire data life cycle which are: getting an *overview of datasets* for data analysis, for re-analysis, for publishing datasets, and in particular for publication. In SciCat, only the meta data is stored, descriptions to identify a certain measurements. - -We highlight most promiment features that SciCat offers, but note that SciCat is a general software layer which when integrated with site-specific application develops its full potential. For more detailed information on how to run scicat, see [scicatlive documentation](https://www.scicatproject.org/scicatlive/latest/). For more details on how to ingest, setup and deploy information from SciCat, see the [site admin manual](operator/index.md). - -## Features: Quick links to How-To's for users - -Your data may be of type raw or derived, you may want to login or just browse what's in the catalogue. Here is how to find more on how to proceed: - -* [Login](../login/index.md) -* Search and find your data, see [Datasets How to query](datasets/index.md#how-to-query-datasets) -* How to change some fields after ingestion -* How to view history of changes to a dataset -* How to [group and tag datasets](../datasets/grouping_tagging_ds.md). 
-* How to group and tag grouped datasets - -## A few more How-To's for users and site-admins -* Where to find the version of the deployed SciCat Frontend? Check [here](about/operatorHowTos.md). - - - -## Structure of SciCat - -These main classes determine the functionality of SciCat: - -1. [Datasets](datasets/index.md): This class is the most elaborated one. -2. [Proposals](proposals.md): are used to link datasets to the proposal under which beamtime was granted. -3. [Instruments](instruments.md): Depending on the science background `instruments` can mean something different. -4. [Samples](samples.md): -