From 2a6783bb892cc48d0bfb9aae35f0526b0beb2d96 Mon Sep 17 00:00:00 2001 From: Herpes3000 <61895715+Herpes3000@users.noreply.github.com> Date: Wed, 12 Jul 2023 16:44:04 +0300 Subject: [PATCH] Squashed commit of the following: commit 67d4f9729a74d8c7c55dce356bf9a44eac9f9f49 Author: WithoutPants <53250216+WithoutPants@users.noreply.github.com> Date: Wed Jul 12 11:51:52 2023 +1000 Multiple scene URLs (#3852) * Add URLs scene relationship * Update unit tests * Update scene edit and details pages * Update scrapers to use urls * Post-process scenes during query scrape * Update UI for URLs * Change urls label commit 76a4bfa49ac5ea6650eb6d644cb3fb0dc8178975 Author: chickenwingavalanche <138962341+chickenwingavalanche@users.noreply.github.com> Date: Tue Jul 11 19:25:24 2023 -0600 Add keyboard shortcut to toggle video looping in scene player (#3902) * Use shift+L to toggle video looping in scene player commit 3e810cf8b108c81a86071b0980c143447874d74f Author: WithoutPants <53250216+WithoutPants@users.noreply.github.com> Date: Wed Jul 12 10:53:46 2023 +1000 Add nil checks in identify (#3905) commit c1352f9048ceabd2160e8ec1ad13fb1e10f952d1 Author: NodudeWasTaken <75137537+NodudeWasTaken@users.noreply.github.com> Date: Wed Jul 12 02:45:33 2023 +0200 Safari video height css fix (#3882) commit f665aa8bc2dcbd977105abc35106ddfdd98a9916 Author: WithoutPants <53250216+WithoutPants@users.noreply.github.com> Date: Wed Jul 12 10:38:52 2023 +1000 Update make target in Dockerfile-CUDA commit b2b52bcc417a1d420e6fc360bac512d9dcb477c9 Author: chickenwingavalanche <138962341+chickenwingavalanche@users.noreply.github.com> Date: Tue Jul 11 18:37:46 2023 -0600 Add missing scene player shortcuts to Help -> Keyboard Shortcuts (#3903) Co-authored-by: chickenwingavalanche commit 96f222997a99f43cd226e4530aa9289961ddcf65 Author: DingDongSoLong4 <99329275+DingDongSoLong4@users.noreply.github.com> Date: Wed Jul 12 02:05:35 2023 +0200 Improve Makefile (#3901) * Improve Makefile * Make ui targets consistent --------- Co-authored-by: WithoutPants <53250216+WithoutPants@users.noreply.github.com> commit 278a0642f42b68319f0b45c7f3fd8926e728c007 Author: WithoutPants <53250216+WithoutPants@users.noreply.github.com> Date: Tue Jul 11 19:16:22 2023 +1000 Revert "Add AirPlay and Chromecast support (#2872)" (#3898) This reverts commit 8e235a26eedccd7a97a9aa6a3478620a95a57979. commit 0c0ba19a238912a068f60c2ea5ff57b16ffb4c9d Author: Csaba Maulis Date: Tue Jul 11 13:54:42 2023 +0800 Add `-v/--version` flag to print version string (#3883) * Add `-v/--version` flag to print version string - Created a new flag `-v/--version` in the command-line interface to display the version number and exit. - Moved all version-related functions inside the config package to the new file `manager/config/version.go` to avoid circular dependencies. - Added a new `GetVersionString()` function to generate a formatted version string. - Updated references to the moved version functions. - Updated references in the `Makefile`. * Move version embeds to build package * Remove githash var --------- Co-authored-by: WithoutPants <53250216+WithoutPants@users.noreply.github.com> commit 969af2ab6985f900f750c50d18f2c58fb3012527 Author: A Ghoul Coder Date: Tue Jul 11 07:53:53 2023 +0200 add phasher (#3864) * add phasher A simple `phasher` program that accepts a video file as a command line argument and calculates and prints its PHASH. 
The goal of this separate executable is to have a simple way to calculate phashes that doesn't depend on a full stash instance so that third-party systems and tools can independently generate PHASHes which can be used for interacting with stash and stash-box APIs and data. Currently `phasher` is built in the default make target along with `stash` by simply running `make`. Cross-platform targets have not been considered. Concurrency is intentionally not implemented because it is simpler to use [GNU Parallel](https://www.gnu.org/software/parallel/). For example: ``` parallel phasher {} ::: *.mp4 ``` * standard dir structure for phasher and separate make target The make target still needs to be integrated into the rest of the Makefile so it can be built as part of normal releases. * phasher: basic usage output and quiet option * phasher: allow and process multiple command line arguments * phasher: camelCase identifiers * phasher: initialize ffmpeg and ffprobe only once commit cbdd4d3cbf596cbc485e24dfd677e8ccccfc66e1 Author: Flashy78 <90150289+Flashy78@users.noreply.github.com> Date: Mon Jul 10 21:37:00 2023 -0700 Identify: Options to skip multiple results and single name performers (#3707) Co-authored-by: WithoutPants <53250216+WithoutPants@users.noreply.github.com> commit ff22577ce09feecb16c0fb94f506cdeb9780eae6 Author: hontheinternet <121332499+hontheinternet@users.noreply.github.com> Date: Tue Jul 11 13:32:42 2023 +0900 Add additional stats to the Stats page (#3812) * Add o_counter, play_duration, play_count, unique_play_count stats commit 4f0e0e1d99c62f1cd14a064663df72cfd36a7bf9 Author: hontheinternet <121332499+hontheinternet@users.noreply.github.com> Date: Tue Jul 11 13:02:09 2023 +0900 Allow serving of interactive CSVs directly to Handy (#3756) * allow direct serve interactive CSVs to Handy --------- Co-authored-by: kermieisinthehouse commit 8e235a26eedccd7a97a9aa6a3478620a95a57979 Author: CJ <72030708+Teda1@users.noreply.github.com> Date: Mon Jul 10 22:47:11 2023 -0500 Add AirPlay and Chromecast support (#2872) * dynamically load cast_sender.js * add https://www.gstatic.com to connectableOrigins * Add toggle for chromecast commit c499c20a7b370c96a5b4ea66fca47a80865cb83b Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue Jul 11 13:40:29 2023 +1000 Bump semver from 5.7.1 to 5.7.2 in /ui/v2.5 (#3896) Bumps [semver](https://github.com/npm/node-semver) from 5.7.1 to 5.7.2. - [Release notes](https://github.com/npm/node-semver/releases) - [Changelog](https://github.com/npm/node-semver/blob/v5.7.2/CHANGELOG.md) - [Commits](https://github.com/npm/node-semver/compare/v5.7.1...v5.7.2) --- updated-dependencies: - dependency-name: semver dependency-type: indirect ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> commit f0d901a6979bd262abb1effe04e5299a129b911f Author: plato178 <137155614+plato178@users.noreply.github.com> Date: Tue Jul 11 03:45:20 2023 +0100 Add codec filters (#3843) * Add video_codec and audio_codec filter criteria * Add Audio Codec and Video Codec UI filters commit 93b41fb6506d8ebe4fe432dd5b45f5bc36a63e01 Author: WithoutPants <53250216+WithoutPants@users.noreply.github.com> Date: Tue Jul 11 11:53:49 2023 +1000 Add folder rename detection (#3817) commit 5c38836ade07e0b97314c728cf5f13d5d8f4fb2b Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue Jul 11 11:40:49 2023 +1000 Bump stylelint from 15.1.0 to 15.10.1 in /ui/v2.5 (#3889) Bumps [stylelint](https://github.com/stylelint/stylelint) from 15.1.0 to 15.10.1. - [Release notes](https://github.com/stylelint/stylelint/releases) - [Changelog](https://github.com/stylelint/stylelint/blob/main/CHANGELOG.md) - [Commits](https://github.com/stylelint/stylelint/compare/15.1.0...15.10.1) --- updated-dependencies: - dependency-name: stylelint dependency-type: direct:development ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> commit cec919554364daf11a5737b7a40d5eea394eb4e3 Author: DingDongSoLong4 <99329275+DingDongSoLong4@users.noreply.github.com> Date: Tue Jul 11 03:40:20 2023 +0200 Fix scene missing flicker on scene page (#3857) * use useLayoutEffect * Remove unnecessary nullability in ScenePlayer commit 02685650999920c8e851ffe8c2028f3d7fdff5c9 Author: DingDongSoLong4 <99329275+DingDongSoLong4@users.noreply.github.com> Date: Tue Jul 11 03:36:57 2023 +0200 Makefile cleanup (#3876) --- .github/workflows/build.yml | 2 +- .gitignore | 3 +- Makefile | 247 +++++++++----- cmd/phasher/main.go | 83 +++++ docker/build/x86_64/Dockerfile | 6 +- docker/build/x86_64/Dockerfile-CUDA | 6 +- docs/DEVELOPMENT.md | 57 ++-- graphql/documents/data/config.graphql | 5 + graphql/documents/data/scene-slim.graphql | 2 +- graphql/documents/data/scene.graphql | 2 +- graphql/documents/data/scrapers.graphql | 2 +- graphql/documents/queries/misc.graphql | 22 +- graphql/schema/types/config.graphql | 4 + graphql/schema/types/filters.graphql | 4 + graphql/schema/types/metadata.graphql | 16 + graphql/schema/types/scene.graphql | 12 +- graphql/schema/types/scraper.graphql | 6 +- graphql/schema/types/stats.graphql | 4 + internal/api/check_version.go | 7 +- internal/api/resolver.go | 31 +- internal/api/resolver_model_scene.go | 29 ++ internal/api/resolver_mutation_configure.go | 4 + internal/api/resolver_mutation_scene.go | 33 +- internal/api/resolver_mutation_stash_box.go | 4 + internal/api/resolver_query_configuration.go | 6 +- internal/api/routes_scene.go | 15 + internal/api/server.go | 51 +-- internal/autotag/integration_test.go | 4 +- internal/build/version.go | 57 ++++ internal/desktop/desktop.go | 3 +- internal/identify/identify.go | 239 ++++++++++---- internal/identify/identify_test.go | 158 +++++++-- internal/identify/options.go | 8 + internal/identify/performer.go | 31 +- internal/identify/performer_test.go | 108 +++--- internal/identify/scene.go | 37 ++- internal/identify/scene_test.go | 6 +- internal/identify/studio.go | 15 + internal/manager/config/config.go | 16 +- .../manager/config/config_concurrency_test.go | 1 + internal/manager/config/init.go | 8 + .../generator_interactive_heatmap_speed.go | 61 ++++ internal/manager/repository.go | 1 + 
internal/manager/task_identify.go | 10 +- pkg/file/file.go | 2 + pkg/file/folder_rename_detect.go | 195 +++++++++++ pkg/file/scan.go | 76 ++++- pkg/models/jsonschema/scene.go | 8 +- pkg/models/mocks/SceneReaderWriter.go | 107 ++++++ pkg/models/model_scene.go | 17 +- pkg/models/model_scene_test.go | 15 +- pkg/models/relationships.go | 4 + pkg/models/scene.go | 9 + pkg/models/update.go | 8 + pkg/scene/export.go | 49 +-- pkg/scene/export_test.go | 6 +- pkg/scene/import.go | 8 +- pkg/scraper/cache.go | 23 +- pkg/scraper/postprocessing.go | 8 + pkg/scraper/query_url.go | 10 +- pkg/scraper/scene.go | 28 +- pkg/scraper/stash.go | 2 +- pkg/scraper/stashbox/stash_box.go | 17 +- pkg/sliceutil/collections.go | 47 +++ pkg/sqlite/anonymise.go | 70 +++- pkg/sqlite/file.go | 27 ++ pkg/sqlite/migrations/47_scene_urls.up.sql | 94 ++++++ pkg/sqlite/scene.go | 116 ++++++- pkg/sqlite/scene_test.go | 44 ++- pkg/sqlite/setup_test.go | 8 +- pkg/sqlite/table.go | 108 ++++++ pkg/sqlite/tables.go | 9 + ui/v2.5/package.json | 2 +- .../Dialogs/IdentifyDialog/FieldOptions.tsx | 2 +- .../Dialogs/IdentifyDialog/IdentifyDialog.tsx | 6 + .../Dialogs/IdentifyDialog/Options.tsx | 119 ++++++- .../IdentifyDialog/ThreeStateBoolean.tsx | 5 +- ui/v2.5/src/components/Dialogs/styles.scss | 10 + .../components/ScenePlayer/ScenePlayer.tsx | 40 ++- .../src/components/ScenePlayer/styles.scss | 4 +- .../components/Scenes/SceneDetails/Scene.tsx | 64 ++-- .../Scenes/SceneDetails/SceneEditPanel.tsx | 58 +++- .../SceneDetails/SceneFileInfoPanel.tsx | 9 +- .../Scenes/SceneDetails/SceneScrapeDialog.tsx | 25 +- .../components/Scenes/SceneMergeDialog.tsx | 13 +- .../SettingsInterfacePanel.tsx | 8 + .../src/components/Shared/ScrapeDialog.tsx | 69 +++- .../src/components/Shared/StringListInput.tsx | 69 +++- ui/v2.5/src/components/Shared/URLField.tsx | 35 ++ ui/v2.5/src/components/Shared/styles.scss | 4 + ui/v2.5/src/components/Stats.tsx | 37 +++ ui/v2.5/src/components/Tagger/context.tsx | 2 +- .../Tagger/scenes/StashSearchResult.tsx | 20 +- ui/v2.5/src/docs/en/Manual/Identify.md | 4 + .../src/docs/en/Manual/KeyboardShortcuts.md | 6 + ui/v2.5/src/hooks/Interactive/context.tsx | 22 +- ui/v2.5/src/hooks/Interactive/interactive.ts | 37 ++- ui/v2.5/src/locales/en-GB.json | 28 +- .../models/list-filter/criteria/factory.ts | 4 + ui/v2.5/src/models/list-filter/scenes.ts | 2 + ui/v2.5/src/models/list-filter/types.ts | 2 + ui/v2.5/src/utils/field.tsx | 47 +++ ui/v2.5/yarn.lock | 310 +++++++++++------- 103 files changed, 2876 insertions(+), 728 deletions(-) create mode 100644 cmd/phasher/main.go create mode 100644 internal/build/version.go create mode 100644 pkg/file/folder_rename_detect.go create mode 100644 pkg/sqlite/migrations/47_scene_urls.up.sql diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml index 7657059898a..3927d9438f0 100644 --- a/.github/workflows/build.yml +++ b/.github/workflows/build.yml @@ -68,7 +68,7 @@ jobs: - name: Validate UI # skip UI validation for pull requests if UI is unchanged if: ${{ github.event_name != 'pull_request' || steps.cache-ui.outputs.cache-hit != 'true' }} - run: docker exec -t build /bin/bash -c "make validate-frontend" + run: docker exec -t build /bin/bash -c "make validate-ui" # Static validation happens in the linter workflow in parallel to this workflow # Run Dynamic validation here, to make sure we pass all the projects integration tests diff --git a/.gitignore b/.gitignore index 197fd730287..ead0b09f953 100644 --- a/.gitignore +++ b/.gitignore @@ -61,6 +61,7 @@ node_modules *.db /stash 
+/phasher dist .DS_Store -/.local \ No newline at end of file +/.local* diff --git a/Makefile b/Makefile index 79551c2bab9..70154258553 100644 --- a/Makefile +++ b/Makefile @@ -15,18 +15,28 @@ else endif # set LDFLAGS environment variable to any extra ldflags required -# set OUTPUT to generate a specific binary name LDFLAGS := $(LDFLAGS) + +# set OUTPUT environment variable to generate a specific binary name +# this will apply to both `stash` and `phasher`, so build them separately +# alternatively use STASH_OUTPUT or PHASHER_OUTPUT to set the value individually ifdef OUTPUT - OUTPUT := -o $(OUTPUT) + STASH_OUTPUT := $(OUTPUT) + PHASHER_OUTPUT := $(OUTPUT) +endif +ifdef STASH_OUTPUT + STASH_OUTPUT := -o $(STASH_OUTPUT) +endif +ifdef PHASHER_OUTPUT + PHASHER_OUTPUT := -o $(PHASHER_OUTPUT) endif -export CGO_ENABLED = 1 +# set GO_BUILD_FLAGS environment variable to any extra build flags required +GO_BUILD_FLAGS := $(GO_BUILD_FLAGS) -# including netgo causes name resolution to go through the Go resolver -# and isn't necessary for static builds on Windows -GO_BUILD_TAGS_WINDOWS := sqlite_omit_load_extension sqlite_stat4 osusergo -GO_BUILD_TAGS_DEFAULT = $(GO_BUILD_TAGS_WINDOWS) netgo +# set GO_BUILD_TAGS environment variable to any extra build tags required +GO_BUILD_TAGS := $(GO_BUILD_TAGS) +GO_BUILD_TAGS += sqlite_stat4 # set STASH_NOLEGACY environment variable or uncomment to disable legacy browser support # STASH_NOLEGACY := true @@ -34,55 +44,116 @@ GO_BUILD_TAGS_DEFAULT = $(GO_BUILD_TAGS_WINDOWS) netgo # set STASH_SOURCEMAPS environment variable or uncomment to enable UI sourcemaps # STASH_SOURCEMAPS := true +export CGO_ENABLED := 1 + .PHONY: release release: pre-ui generate ui build-release -.PHONY: pre-build -pre-build: +# targets to set various build flags + +.PHONY: flags-release +flags-release: + $(eval LDFLAGS += -s -w) + $(eval GO_BUILD_FLAGS += -trimpath) + +.PHONY: flags-pie +flags-pie: + $(eval GO_BUILD_FLAGS += -buildmode=pie) + +.PHONY: flags-static +flags-static: + $(eval LDFLAGS += -extldflags=-static) + $(eval GO_BUILD_TAGS += sqlite_omit_load_extension osusergo netgo) + +.PHONY: flags-static-pie +flags-static-pie: + $(eval LDFLAGS += -extldflags=-static-pie) + $(eval GO_BUILD_FLAGS += -buildmode=pie) + $(eval GO_BUILD_TAGS += sqlite_omit_load_extension osusergo netgo) + +.PHONY: flags-static-windows +flags-static-windows: + $(eval LDFLAGS += -extldflags=-static-pie) + $(eval GO_BUILD_FLAGS += -buildmode=pie) + $(eval GO_BUILD_TAGS += sqlite_omit_load_extension osusergo) + +.PHONY: build-info +build-info: ifndef BUILD_DATE - $(eval BUILD_DATE := $(shell go run -mod=vendor scripts/getDate.go)) + $(eval BUILD_DATE := $(shell go run scripts/getDate.go)) endif - ifndef GITHASH $(eval GITHASH := $(shell git rev-parse --short HEAD)) endif - ifndef STASH_VERSION $(eval STASH_VERSION := $(shell git describe --tags --exclude latest_develop)) endif - ifndef OFFICIAL_BUILD $(eval OFFICIAL_BUILD := false) endif .PHONY: build-flags -build-flags: pre-build - $(eval LDFLAGS := $(LDFLAGS) -X 'github.com/stashapp/stash/internal/api.buildstamp=$(BUILD_DATE)') - $(eval LDFLAGS := $(LDFLAGS) -X 'github.com/stashapp/stash/internal/api.githash=$(GITHASH)') - $(eval LDFLAGS := $(LDFLAGS) -X 'github.com/stashapp/stash/internal/api.version=$(STASH_VERSION)') - $(eval LDFLAGS := $(LDFLAGS) -X 'github.com/stashapp/stash/internal/manager/config.officialBuild=$(OFFICIAL_BUILD)') -ifndef GO_BUILD_TAGS - $(eval GO_BUILD_TAGS := $(GO_BUILD_TAGS_DEFAULT)) -endif - $(eval BUILD_FLAGS := -mod=vendor -v -tags 
"$(GO_BUILD_TAGS)" $(GO_BUILD_FLAGS) -ldflags "$(LDFLAGS) $(EXTRA_LDFLAGS)") - -# NOTE: the build target still includes netgo because we cannot detect -# Windows easily from the Makefile. +build-flags: build-info + $(eval BUILD_LDFLAGS := $(LDFLAGS)) + $(eval BUILD_LDFLAGS += -X 'github.com/stashapp/stash/internal/build.buildstamp=$(BUILD_DATE)') + $(eval BUILD_LDFLAGS += -X 'github.com/stashapp/stash/internal/build.githash=$(GITHASH)') + $(eval BUILD_LDFLAGS += -X 'github.com/stashapp/stash/internal/build.version=$(STASH_VERSION)') + $(eval BUILD_LDFLAGS += -X 'github.com/stashapp/stash/internal/build.officialBuild=$(OFFICIAL_BUILD)') + $(eval BUILD_FLAGS := -v -tags "$(GO_BUILD_TAGS)" $(GO_BUILD_FLAGS) -ldflags "$(BUILD_LDFLAGS)") + +.PHONY: stash +stash: build-flags + go build $(STASH_OUTPUT) $(BUILD_FLAGS) ./cmd/stash + +.PHONY: stash-release +stash-release: flags-release +stash-release: flags-pie +stash-release: stash + +.PHONY: stash-release-static +stash-release-static: flags-release +stash-release-static: flags-static-pie +stash-release-static: stash + +.PHONY: stash-release-static-windows +stash-release-static-windows: flags-release +stash-release-static-windows: flags-static-windows +stash-release-static-windows: stash + +.PHONY: phasher +phasher: build-flags + go build $(PHASHER_OUTPUT) $(BUILD_FLAGS) ./cmd/phasher + +.PHONY: phasher-release +phasher-release: flags-release +phasher-release: flags-pie +phasher-release: phasher + +.PHONY: phasher-release-static +phasher-release-static: flags-release +phasher-release-static: flags-static-pie +phasher-release-static: phasher + +.PHONY: phasher-release-static-windows +phasher-release-static-windows: flags-release +phasher-release-static-windows: flags-static-windows +phasher-release-static-windows: phasher + +# builds dynamically-linked debug binaries .PHONY: build -build: build-flags -build: - go build $(OUTPUT) $(BUILD_FLAGS) ./cmd/stash +build: stash phasher -# strips debug symbols from the release build +# builds dynamically-linked release binaries .PHONY: build-release -build-release: EXTRA_LDFLAGS := -s -w -build-release: GO_BUILD_FLAGS := -trimpath -build-release: build +build-release: stash-release phasher-release +# builds statically-linked release binaries .PHONY: build-release-static -build-release-static: EXTRA_LDFLAGS := -extldflags=-static -s -w -build-release-static: GO_BUILD_FLAGS := -trimpath -build-release-static: build +build-release-static: stash-release-static phasher-release-static + +# build-release-static, but excluding netgo, which is not needed on windows +.PHONY: build-release-static-windows +build-release-static-windows: stash-release-static-windows phasher-release-static-windows # cross-compile- targets should be run within the compiler docker container .PHONY: cross-compile-windows @@ -90,29 +161,35 @@ cross-compile-windows: export GOOS := windows cross-compile-windows: export GOARCH := amd64 cross-compile-windows: export CC := x86_64-w64-mingw32-gcc cross-compile-windows: export CXX := x86_64-w64-mingw32-g++ -cross-compile-windows: OUTPUT := -o dist/stash-win.exe -cross-compile-windows: GO_BUILD_TAGS := $(GO_BUILD_TAGS_WINDOWS) -cross-compile-windows: build-release-static +cross-compile-windows: STASH_OUTPUT := -o dist/stash-win.exe +cross-compile-windows: PHASHER_OUTPUT := -o dist/phasher-win.exe +cross-compile-windows: flags-release +cross-compile-windows: flags-static-windows +cross-compile-windows: build .PHONY: cross-compile-macos-intel cross-compile-macos-intel: export GOOS := darwin 
cross-compile-macos-intel: export GOARCH := amd64 cross-compile-macos-intel: export CC := o64-clang cross-compile-macos-intel: export CXX := o64-clang++ -cross-compile-macos-intel: OUTPUT := -o dist/stash-macos-intel -cross-compile-macos-intel: GO_BUILD_TAGS := $(GO_BUILD_TAGS_DEFAULT) +cross-compile-macos-intel: STASH_OUTPUT := -o dist/stash-macos-intel +cross-compile-macos-intel: PHASHER_OUTPUT := -o dist/phasher-macos-intel +cross-compile-macos-intel: flags-release # can't use static build for OSX -cross-compile-macos-intel: build-release +cross-compile-macos-intel: flags-pie +cross-compile-macos-intel: build .PHONY: cross-compile-macos-applesilicon cross-compile-macos-applesilicon: export GOOS := darwin cross-compile-macos-applesilicon: export GOARCH := arm64 cross-compile-macos-applesilicon: export CC := oa64e-clang cross-compile-macos-applesilicon: export CXX := oa64e-clang++ -cross-compile-macos-applesilicon: OUTPUT := -o dist/stash-macos-applesilicon -cross-compile-macos-applesilicon: GO_BUILD_TAGS := $(GO_BUILD_TAGS_DEFAULT) +cross-compile-macos-applesilicon: STASH_OUTPUT := -o dist/stash-macos-applesilicon +cross-compile-macos-applesilicon: PHASHER_OUTPUT := -o dist/phasher-macos-applesilicon +cross-compile-macos-applesilicon: flags-release # can't use static build for OSX -cross-compile-macos-applesilicon: build-release +cross-compile-macos-applesilicon: flags-pie +cross-compile-macos-applesilicon: build .PHONY: cross-compile-macos cross-compile-macos: @@ -132,42 +209,52 @@ cross-compile-macos: .PHONY: cross-compile-freebsd cross-compile-freebsd: export GOOS := freebsd cross-compile-freebsd: export GOARCH := amd64 -cross-compile-freebsd: OUTPUT := -o dist/stash-freebsd -cross-compile-freebsd: GO_BUILD_TAGS += netgo -cross-compile-freebsd: build-release-static +cross-compile-freebsd: STASH_OUTPUT := -o dist/stash-freebsd +cross-compile-freebsd: PHASHER_OUTPUT := -o dist/phasher-freebsd +cross-compile-freebsd: flags-release +cross-compile-freebsd: flags-static-pie +cross-compile-freebsd: build .PHONY: cross-compile-linux cross-compile-linux: export GOOS := linux cross-compile-linux: export GOARCH := amd64 -cross-compile-linux: OUTPUT := -o dist/stash-linux -cross-compile-linux: GO_BUILD_TAGS := $(GO_BUILD_TAGS_DEFAULT) -cross-compile-linux: build-release-static +cross-compile-linux: STASH_OUTPUT := -o dist/stash-linux +cross-compile-linux: PHASHER_OUTPUT := -o dist/phasher-linux +cross-compile-linux: flags-release +cross-compile-linux: flags-static-pie +cross-compile-linux: build .PHONY: cross-compile-linux-arm64v8 cross-compile-linux-arm64v8: export GOOS := linux cross-compile-linux-arm64v8: export GOARCH := arm64 cross-compile-linux-arm64v8: export CC := aarch64-linux-gnu-gcc -cross-compile-linux-arm64v8: OUTPUT := -o dist/stash-linux-arm64v8 -cross-compile-linux-arm64v8: GO_BUILD_TAGS := $(GO_BUILD_TAGS_DEFAULT) -cross-compile-linux-arm64v8: build-release-static +cross-compile-linux-arm64v8: STASH_OUTPUT := -o dist/stash-linux-arm64v8 +cross-compile-linux-arm64v8: PHASHER_OUTPUT := -o dist/phasher-linux-arm64v8 +cross-compile-linux-arm64v8: flags-release +cross-compile-linux-arm64v8: flags-static-pie +cross-compile-linux-arm64v8: build .PHONY: cross-compile-linux-arm32v7 cross-compile-linux-arm32v7: export GOOS := linux cross-compile-linux-arm32v7: export GOARCH := arm cross-compile-linux-arm32v7: export GOARM := 7 cross-compile-linux-arm32v7: export CC := arm-linux-gnueabihf-gcc -cross-compile-linux-arm32v7: OUTPUT := -o dist/stash-linux-arm32v7 -cross-compile-linux-arm32v7: 
GO_BUILD_TAGS := $(GO_BUILD_TAGS_DEFAULT) -cross-compile-linux-arm32v7: build-release-static +cross-compile-linux-arm32v7: STASH_OUTPUT := -o dist/stash-linux-arm32v7 +cross-compile-linux-arm32v7: PHASHER_OUTPUT := -o dist/phasher-linux-arm32v7 +cross-compile-linux-arm32v7: flags-release +cross-compile-linux-arm32v7: flags-static +cross-compile-linux-arm32v7: build .PHONY: cross-compile-linux-arm32v6 cross-compile-linux-arm32v6: export GOOS := linux cross-compile-linux-arm32v6: export GOARCH := arm cross-compile-linux-arm32v6: export GOARM := 6 cross-compile-linux-arm32v6: export CC := arm-linux-gnueabi-gcc -cross-compile-linux-arm32v6: OUTPUT := -o dist/stash-linux-arm32v6 -cross-compile-linux-arm32v6: GO_BUILD_TAGS := $(GO_BUILD_TAGS_DEFAULT) -cross-compile-linux-arm32v6: build-release-static +cross-compile-linux-arm32v6: STASH_OUTPUT := -o dist/stash-linux-arm32v6 +cross-compile-linux-arm32v6: PHASHER_OUTPUT := -o dist/phasher-linux-arm32v6 +cross-compile-linux-arm32v6: flags-release +cross-compile-linux-arm32v6: flags-static +cross-compile-linux-arm32v6: build .PHONY: cross-compile-all cross-compile-all: @@ -191,24 +278,24 @@ endif # Regenerates GraphQL files .PHONY: generate -generate: generate-backend generate-frontend +generate: generate-backend generate-ui -.PHONY: generate-frontend -generate-frontend: +.PHONY: generate-ui +generate-ui: cd ui/v2.5 && yarn run gqlgen .PHONY: generate-backend generate-backend: touch-ui - go generate -mod=vendor ./cmd/stash + go generate ./cmd/stash .PHONY: generate-dataloaders generate-dataloaders: - go generate -mod=vendor ./internal/api/loaders + go generate ./internal/api/loaders # Regenerates stash-box client files .PHONY: generate-stash-box-client generate-stash-box-client: - go run -mod=vendor github.com/Yamashou/gqlgenc + go run github.com/Yamashou/gqlgenc # Runs gofmt -w on the project's source code, modifying any files that do not match its style. .PHONY: fmt @@ -222,17 +309,17 @@ lint: # runs unit tests - excluding integration tests .PHONY: test test: - go test -mod=vendor ./... + go test ./... # runs all tests - including integration tests .PHONY: it it: - go test -mod=vendor -tags=integration ./... + go test -tags=integration ./... 
# generates test mocks .PHONY: generate-test-mocks generate-test-mocks: - go run -mod=vendor github.com/vektra/mockery/v2 --dir ./pkg/models --name '.*ReaderWriter' --outpkg mocks --output ./pkg/models/mocks + go run github.com/vektra/mockery/v2 --dir ./pkg/models --name '.*ReaderWriter' --outpkg mocks --output ./pkg/models/mocks # runs server # sets the config file to use the local dev config @@ -258,7 +345,7 @@ pre-ui: cd ui/v2.5 && yarn install --frozen-lockfile .PHONY: ui-env -ui-env: pre-build +ui-env: build-info $(eval export VITE_APP_DATE := $(BUILD_DATE)) $(eval export VITE_APP_GITHASH := $(GITHASH)) $(eval export VITE_APP_STASH_VERSION := $(STASH_VERSION)) @@ -289,29 +376,25 @@ ui-start: ui-env fmt-ui: cd ui/v2.5 && yarn format -# runs tests and checks on the UI and builds it -.PHONY: ui-validate -ui-validate: - cd ui/v2.5 && yarn run validate - -# runs all of the tests and checks required for a PR to be accepted -.PHONY: validate -validate: validate-frontend validate-backend - # runs all of the frontend PR-acceptance steps -.PHONY: validate-frontend -validate-frontend: ui-validate +.PHONY: validate-ui +validate-ui: + cd ui/v2.5 && yarn run validate # runs all of the backend PR-acceptance steps .PHONY: validate-backend validate-backend: lint it +# runs all of the tests and checks required for a PR to be accepted +.PHONY: validate +validate: validate-ui validate-backend + # locally builds and tags a 'stash/build' docker image .PHONY: docker-build -docker-build: pre-build +docker-build: build-info docker build --build-arg GITHASH=$(GITHASH) --build-arg STASH_VERSION=$(STASH_VERSION) -t stash/build -f docker/build/x86_64/Dockerfile . # locally builds and tags a 'stash/cuda-build' docker image .PHONY: docker-cuda-build -docker-cuda-build: pre-build +docker-cuda-build: build-info docker build --build-arg GITHASH=$(GITHASH) --build-arg STASH_VERSION=$(STASH_VERSION) -t stash/cuda-build -f docker/build/x86_64/Dockerfile-CUDA . diff --git a/cmd/phasher/main.go b/cmd/phasher/main.go new file mode 100644 index 00000000000..f4648b74e2f --- /dev/null +++ b/cmd/phasher/main.go @@ -0,0 +1,83 @@ +// TODO: document in README.md +package main + +import ( + "context" + "fmt" + "os" + + flag "github.com/spf13/pflag" + "github.com/stashapp/stash/pkg/ffmpeg" + "github.com/stashapp/stash/pkg/file" + "github.com/stashapp/stash/pkg/hash/videophash" +) + +func customUsage() { + fmt.Fprintf(os.Stderr, "Usage:\n") + fmt.Fprintf(os.Stderr, "%s [OPTIONS] VIDEOFILE...\n\nOptions:\n", os.Args[0]) + flag.PrintDefaults() +} + +func printPhash(ff *ffmpeg.FFMpeg, ffp ffmpeg.FFProbe, inputfile string, quiet *bool) error { + ffvideoFile, err := ffp.NewVideoFile(inputfile) + if err != nil { + return err + } + + // All we need for videophash.Generate() is + // videoFile.Path (from BaseFile) + // videoFile.Duration + // The rest of the struct isn't needed. 
+ vf := &file.VideoFile{ + BaseFile: &file.BaseFile{Path: inputfile}, + Duration: ffvideoFile.FileDuration, + } + + phash, err := videophash.Generate(ff, vf) + if err != nil { + return err + } + + if *quiet { + fmt.Printf("%x\n", *phash) + } else { + fmt.Printf("%x %v\n", *phash, vf.Path) + } + return nil +} + +func main() { + flag.Usage = customUsage + quiet := flag.BoolP("quiet", "q", false, "print only the phash") + help := flag.BoolP("help", "h", false, "print this help output") + flag.Parse() + + if *help { + flag.Usage() + os.Exit(2) + } + + args := flag.Args() + + if len(args) < 1 { + fmt.Fprintf(os.Stderr, "Missing VIDEOFILE argument.\n") + flag.Usage() + os.Exit(2) + } + + if len(args) > 1 { + fmt.Fprintln(os.Stderr, "Files will be processed sequentially! Consider using GNU Parallel.") + fmt.Fprintf(os.Stderr, "Example: parallel %v ::: *.mp4\n", os.Args[0]) + } + + ffmpegPath, ffprobePath := ffmpeg.GetPaths(nil) + encoder := ffmpeg.NewEncoder(ffmpegPath) + encoder.InitHWSupport(context.TODO()) + ffprobe := ffmpeg.FFProbe(ffprobePath) + + for _, item := range args { + if err := printPhash(encoder, ffprobe, item, quiet); err != nil { + fmt.Fprintln(os.Stderr, err) + } + } +} diff --git a/docker/build/x86_64/Dockerfile b/docker/build/x86_64/Dockerfile index 5133eba36f9..3e8b70d5be9 100644 --- a/docker/build/x86_64/Dockerfile +++ b/docker/build/x86_64/Dockerfile @@ -6,11 +6,11 @@ RUN apk add --no-cache make ## cache node_modules separately COPY ./ui/v2.5/package.json ./ui/v2.5/yarn.lock /stash/ui/v2.5/ WORKDIR /stash -RUN yarn --cwd ui/v2.5 install --frozen-lockfile. +RUN make pre-ui COPY Makefile /stash/ COPY ./graphql /stash/graphql/ COPY ./ui /stash/ui/ -RUN make generate-frontend +RUN make generate-ui ARG GITHASH ARG STASH_VERSION RUN BUILD_DATE=$(date +"%Y-%m-%d %H:%M:%S") make ui @@ -29,7 +29,7 @@ COPY --from=frontend /stash /stash/ RUN make generate-backend ARG GITHASH ARG STASH_VERSION -RUN make build +RUN make stash-release # Final Runnable Image FROM alpine:latest diff --git a/docker/build/x86_64/Dockerfile-CUDA b/docker/build/x86_64/Dockerfile-CUDA index 676e2ead915..3f8ed603cfd 100644 --- a/docker/build/x86_64/Dockerfile-CUDA +++ b/docker/build/x86_64/Dockerfile-CUDA @@ -6,11 +6,11 @@ RUN apk add --no-cache make ## cache node_modules separately COPY ./ui/v2.5/package.json ./ui/v2.5/yarn.lock /stash/ui/v2.5/ WORKDIR /stash -RUN yarn --cwd ui/v2.5 install --frozen-lockfile. +RUN make pre-ui COPY Makefile /stash/ COPY ./graphql /stash/graphql/ COPY ./ui /stash/ui/ -RUN make generate-frontend +RUN make generate-ui ARG GITHASH ARG STASH_VERSION RUN BUILD_DATE=$(date +"%Y-%m-%d %H:%M:%S") make ui @@ -29,7 +29,7 @@ COPY --from=frontend /stash /stash/ RUN make generate-backend ARG GITHASH ARG STASH_VERSION -RUN make build +RUN make stash-release # Final Runnable Image FROM nvidia/cuda:12.0.1-base-ubuntu22.04 diff --git a/docs/DEVELOPMENT.md b/docs/DEVELOPMENT.md index 94882efa53b..a19b9bc2acc 100644 --- a/docs/DEVELOPMENT.md +++ b/docs/DEVELOPMENT.md @@ -6,21 +6,18 @@ * [GolangCI](https://golangci-lint.run/) - A meta-linter which runs several linters in parallel * To install, follow the [local installation instructions](https://golangci-lint.run/usage/install/#local-installation) * [Yarn](https://yarnpkg.com/en/docs/install) - Yarn package manager - * Run `yarn install --frozen-lockfile` in the `stash/ui/v2.5` folder (before running make generate for first time). 
- -NOTE: You may need to run the `go get` commands outside the project directory to avoid modifying the projects module file. ## Environment ### Windows 1. Download and install [Go for Windows](https://golang.org/dl/) -2. Download and extract [MingW64](https://sourceforge.net/projects/mingw-w64/files/) (scroll down and select x86_64-posix-seh, dont use the autoinstaller it doesnt work) -3. Search for "advanced system settings" and open the system properties dialog. +2. Download and extract [MinGW64](https://sourceforge.net/projects/mingw-w64/files/) (scroll down and select x86_64-posix-seh, don't use the autoinstaller, it doesn't work) +3. Search for "Advanced System Settings" and open the System Properties dialog. 1. Click the `Environment Variables` button - 2. Under system variables find the `Path`. Edit and add `C:\MinGW\bin` (replace with the correct path to where you extracted MingW64). + 2. Under System Variables find `Path`. Edit and add `C:\MinGW\bin` (replace with the correct path to where you extracted MinGW64). -NOTE: The `make` command in Windows will be `mingw32-make` with MingW. For example `make pre-ui` will be `mingw32-make pre-ui` +NOTE: The `make` command in Windows will be `mingw32-make` with MinGW. For example, `make pre-ui` will be `mingw32-make pre-ui`. ### macOS @@ -30,28 +27,36 @@ NOTE: The `make` command in Windows will be `mingw32-make` with MingW. For examp ### Linux #### Arch Linux + 1. Install dependencies: `sudo pacman -S go git yarn gcc make nodejs ffmpeg --needed` #### Ubuntu + 1. Install dependencies: `sudo apt-get install golang git gcc nodejs ffmpeg -y` 2. Enable corepack in Node.js: `corepack enable` 3. Install yarn: `corepack prepare yarn@stable --activate` ## Commands -* `make pre-ui` - Installs the UI dependencies. Only needs to be run once before building the UI for the first time, or if the dependencies are updated -* `make generate` - Generate Go and UI GraphQL files -* `make fmt-ui` - Formats the UI source code -* `make ui` - Builds the frontend -* `make build` - Builds the binary (make sure to build the UI as well... see below) +* `make pre-ui` - Installs the UI dependencies. This only needs to be run once after cloning the repository, or if the dependencies are updated. +* `make generate` - Generates Go and UI GraphQL files. Requires `make pre-ui` to have been run. +* `make ui` - Builds the UI. Requires `make pre-ui` to have been run. +* `make stash` - Builds the `stash` binary (make sure to build the UI as well... see below) +* `make stash-release` - Builds a release version of the `stash` binary, with debug information removed +* `make phasher` - Builds the `phasher` binary +* `make phasher-release` - Builds a release version of the `phasher` binary, with debug information removed +* `make build` - Builds both the `stash` and `phasher` binaries +* `make build-release` - Builds release versions of both the `stash` and `phasher` binaries * `make docker-build` - Locally builds and tags a complete 'stash/build' docker image -* `make lint` - Run the linter on the backend -* `make fmt` - Run `go fmt` -* `make it` - Run the unit and integration tests -* `make validate` - Run all of the tests and checks required to submit a PR -* `make server-start` - Runs an instance of the server in the `.local` directory. -* `make server-clean` - Removes the `.local` directory and all of its contents. -* `make ui-start` - Runs the UI in development mode. Requires a running stash server to connect to. 
Stash server port can be changed from the default of `9999` using environment variable `VITE_APP_PLATFORM_PORT`. UI runs on port `3000` or the next available port. +* `make docker-cuda-build` - Locally builds and tags a complete 'stash/cuda-build' docker image +* `make validate` - Runs all of the tests and checks required to submit a PR +* `make lint` - Runs `golangci-lint` on the backend +* `make it` - Runs all unit and integration tests +* `make fmt` - Formats the Go source code +* `make fmt-ui` - Formats the UI source code +* `make server-start` - Runs a development stash server in the `.local` directory +* `make server-clean` - Removes the `.local` directory and all of its contents +* `make ui-start` - Runs the UI in development mode. Requires a running Stash server to connect to. The server port can be changed from the default of `9999` using the environment variable `VITE_APP_PLATFORM_PORT`. The UI runs on port `3000` or the next available port. ## Local development quickstart @@ -59,13 +64,14 @@ NOTE: The `make` command in Windows will be `mingw32-make` with MingW. For examp 2. Run `make generate` to create generated files 3. In one terminal, run `make server-start` to run the server code 4. In a separate terminal, run `make ui-start` to run the UI in development mode -5. Open the UI in a browser `http://localhost:3000/` +5. Open the UI in a browser: `http://localhost:3000/` Changes to the UI code can be seen by reloading the browser page. -Changes to the server code requires a restart (`CTRL-C` in the server terminal). +Changes to the backend code require a server restart (`CTRL-C` in the server terminal, followed by `make server-start` again) to be seen. On first launch: + 1. On the "Stash Setup Wizard" screen, choose a directory with some files to test with 2. Press "Next" to use the default locations for the database and generated content 3. Press the "Confirm" and "Finish" buttons to get into the UI @@ -73,17 +79,20 @@ On first launch: 5. You're all set! Set any other configurations you'd like and test your code changes. To start fresh with new configuration: + 1. Stop the server (`CTRL-C` in the server terminal) -2. Run `make server-clean` to clear all config, database, and generated files (under `.local/`) +2. Run `make server-clean` to clear all config, database, and generated files (under `.local`) 3. Run `make server-start` to restart the server 4. Follow the "On first launch" steps above ## Building a release +Simply run `make` or `make release`, or equivalently: + 1. Run `make pre-ui` to install UI dependencies 2. Run `make generate` to create generated files -3. Run `make ui` to compile the frontend -4. Run `make build` to build the executable for your current platform +3. Run `make ui` to build the frontend +4. 
Run `make build-release` to build a release executable for your current platform ## Cross compiling diff --git a/graphql/documents/data/config.graphql b/graphql/documents/data/config.graphql index 2a56e951252..048bb481567 100644 --- a/graphql/documents/data/config.graphql +++ b/graphql/documents/data/config.graphql @@ -93,6 +93,7 @@ fragment ConfigInterfaceData on ConfigInterfaceResult { } handyKey funscriptOffset + useStashHostedFunscript } fragment ConfigDLNAData on ConfigDLNAResult { @@ -123,6 +124,10 @@ fragment IdentifyMetadataOptionsData on IdentifyMetadataOptions { setCoverImage setOrganized includeMalePerformers + skipMultipleMatches + skipMultipleMatchTag + skipSingleNamePerformers + skipSingleNamePerformerTag } fragment ScraperSourceData on ScraperSource { diff --git a/graphql/documents/data/scene-slim.graphql b/graphql/documents/data/scene-slim.graphql index b6ed326e022..09db76bb7a9 100644 --- a/graphql/documents/data/scene-slim.graphql +++ b/graphql/documents/data/scene-slim.graphql @@ -4,7 +4,7 @@ fragment SlimSceneData on Scene { code details director - url + urls date rating100 o_counter diff --git a/graphql/documents/data/scene.graphql b/graphql/documents/data/scene.graphql index 8b0a664d50a..3f26856a33d 100644 --- a/graphql/documents/data/scene.graphql +++ b/graphql/documents/data/scene.graphql @@ -4,7 +4,7 @@ fragment SceneData on Scene { code details director - url + urls date rating100 o_counter diff --git a/graphql/documents/data/scrapers.graphql b/graphql/documents/data/scrapers.graphql index 1d4553a97c2..3af1d2868e8 100644 --- a/graphql/documents/data/scrapers.graphql +++ b/graphql/documents/data/scrapers.graphql @@ -114,7 +114,7 @@ fragment ScrapedSceneData on ScrapedScene { code details director - url + urls date image remote_site_id diff --git a/graphql/documents/queries/misc.graphql b/graphql/documents/queries/misc.graphql index e653635dc8d..791392fb00d 100644 --- a/graphql/documents/queries/misc.graphql +++ b/graphql/documents/queries/misc.graphql @@ -40,16 +40,20 @@ query AllTagsForFilter { query Stats { stats { - scene_count, - scenes_size, - scenes_duration, - image_count, - images_size, - gallery_count, - performer_count, - studio_count, - movie_count, + scene_count + scenes_size + scenes_duration + image_count + images_size + gallery_count + performer_count + studio_count + movie_count tag_count + total_o_count + total_play_duration + total_play_count + scenes_played } } diff --git a/graphql/schema/types/config.graphql b/graphql/schema/types/config.graphql index 6c99393858e..08fa4e21756 100644 --- a/graphql/schema/types/config.graphql +++ b/graphql/schema/types/config.graphql @@ -354,6 +354,8 @@ input ConfigInterfaceInput { handyKey: String """Funscript Time Offset""" funscriptOffset: Int + """Whether to use Stash Hosted Funscript""" + useStashHostedFunscript: Boolean """True if we should not auto-open a browser window on startup""" noBrowser: Boolean """True if we should send notifications to the desktop""" @@ -425,6 +427,8 @@ type ConfigInterfaceResult { handyKey: String """Funscript Time Offset""" funscriptOffset: Int + """Whether to use Stash Hosted Funscript""" + useStashHostedFunscript: Boolean } input ConfigDLNAInput { diff --git a/graphql/schema/types/filters.graphql b/graphql/schema/types/filters.graphql index 190432594d4..b80cf851bf3 100644 --- a/graphql/schema/types/filters.graphql +++ b/graphql/schema/types/filters.graphql @@ -194,6 +194,10 @@ input SceneFilterType { duplicated: PHashDuplicationCriterionInput """Filter by resolution""" 
resolution: ResolutionCriterionInput + """Filter by video codec""" + video_codec: StringCriterionInput + """Filter by audio codec""" + audio_codec: StringCriterionInput """Filter by duration (in seconds)""" duration: IntCriterionInput """Filter to only include scenes which have markers. `true` or `false`""" diff --git a/graphql/schema/types/metadata.graphql b/graphql/schema/types/metadata.graphql index 8e575b3ece8..7cd89202b0f 100644 --- a/graphql/schema/types/metadata.graphql +++ b/graphql/schema/types/metadata.graphql @@ -185,6 +185,14 @@ input IdentifyMetadataOptionsInput { setOrganized: Boolean """defaults to true if not provided""" includeMalePerformers: Boolean + """defaults to true if not provided""" + skipMultipleMatches: Boolean + """tag to tag skipped multiple matches with""" + skipMultipleMatchTag: String + """defaults to true if not provided""" + skipSingleNamePerformers: Boolean + """tag to tag skipped single name performers with""" + skipSingleNamePerformerTag: String } input IdentifySourceInput { @@ -222,6 +230,14 @@ type IdentifyMetadataOptions { setOrganized: Boolean """defaults to true if not provided""" includeMalePerformers: Boolean + """defaults to true if not provided""" + skipMultipleMatches: Boolean + """tag to tag skipped multiple matches with""" + skipMultipleMatchTag: String + """defaults to true if not provided""" + skipSingleNamePerformers: Boolean + """tag to tag skipped single name performers with""" + skipSingleNamePerformerTag: String } type IdentifySource { diff --git a/graphql/schema/types/scene.graphql b/graphql/schema/types/scene.graphql index 7ec2134c9e4..ef538b22f9b 100644 --- a/graphql/schema/types/scene.graphql +++ b/graphql/schema/types/scene.graphql @@ -40,7 +40,8 @@ type Scene { code: String details: String director: String - url: String + url: String @deprecated(reason: "Use urls") + urls: [String!] date: String # rating expressed as 1-5 rating: Int @deprecated(reason: "Use 1-100 range with rating100") @@ -91,7 +92,8 @@ input SceneCreateInput { code: String details: String director: String - url: String + url: String @deprecated(reason: "Use urls") + urls: [String!] date: String # rating expressed as 1-5 rating: Int @deprecated(reason: "Use 1-100 range with rating100") @@ -119,7 +121,8 @@ input SceneUpdateInput { code: String details: String director: String - url: String + url: String @deprecated(reason: "Use urls") + urls: [String!] date: String # rating expressed as 1-5 rating: Int @deprecated(reason: "Use 1-100 range with rating100") @@ -164,7 +167,8 @@ input BulkSceneUpdateInput { code: String details: String director: String - url: String + url: String @deprecated(reason: "Use urls") + urls: BulkUpdateStrings date: String # rating expressed as 1-5 rating: Int @deprecated(reason: "Use 1-100 range with rating100") diff --git a/graphql/schema/types/scraper.graphql b/graphql/schema/types/scraper.graphql index 1230fde32c8..f04eb2b3726 100644 --- a/graphql/schema/types/scraper.graphql +++ b/graphql/schema/types/scraper.graphql @@ -64,7 +64,8 @@ type ScrapedScene { code: String details: String director: String - url: String + url: String @deprecated(reason: "use urls") + urls: [String!] date: String """This should be a base64 encoded data URL""" @@ -87,7 +88,8 @@ input ScrapedSceneInput { code: String details: String director: String - url: String + url: String @deprecated(reason: "use urls") + urls: [String!] 
date: String # no image, file, duration or relationships diff --git a/graphql/schema/types/stats.graphql b/graphql/schema/types/stats.graphql index fcadd54a78a..3675c2a6bb2 100644 --- a/graphql/schema/types/stats.graphql +++ b/graphql/schema/types/stats.graphql @@ -9,4 +9,8 @@ type StatsResultType { studio_count: Int! movie_count: Int! tag_count: Int! + total_o_count: Int! + total_play_duration: Float! + total_play_count: Int! + scenes_played: Int! } diff --git a/internal/api/check_version.go b/internal/api/check_version.go index a2da99c9a06..b19727ab8c6 100644 --- a/internal/api/check_version.go +++ b/internal/api/check_version.go @@ -13,6 +13,7 @@ import ( "golang.org/x/sys/cpu" + "github.com/stashapp/stash/internal/build" "github.com/stashapp/stash/pkg/logger" ) @@ -170,7 +171,7 @@ func GetLatestRelease(ctx context.Context) (*LatestRelease, error) { wantedRelease := stashReleases()[platform] url := apiReleases - if IsDevelop() { + if build.IsDevelop() { // get the release tagged with the development tag url += "/tags/" + developmentTag } else { @@ -213,7 +214,7 @@ func GetLatestRelease(ctx context.Context) (*LatestRelease, error) { } } - _, githash, _ := GetVersion() + _, githash, _ := build.Version() shLength := len(githash) if shLength == 0 { shLength = defaultSHLength @@ -273,7 +274,7 @@ func printLatestVersion(ctx context.Context) { if err != nil { logger.Errorf("Couldn't retrieve latest version: %v", err) } else { - _, githash, _ = GetVersion() + _, githash, _ := build.Version() switch { case githash == "": logger.Infof("Latest version: %s (%s)", latestRelease.Version, latestRelease.ShortHash) diff --git a/internal/api/resolver.go b/internal/api/resolver.go index af26bef4dd5..3932270b794 100644 --- a/internal/api/resolver.go +++ b/internal/api/resolver.go @@ -7,6 +7,7 @@ import ( "sort" "strconv" + "github.com/stashapp/stash/internal/build" "github.com/stashapp/stash/internal/manager" "github.com/stashapp/stash/pkg/logger" "github.com/stashapp/stash/pkg/models" @@ -157,18 +158,26 @@ func (r *queryResolver) Stats(ctx context.Context) (*StatsResultType, error) { studiosCount, _ := studiosQB.Count(ctx) moviesCount, _ := moviesQB.Count(ctx) tagsCount, _ := tagsQB.Count(ctx) + totalOCount, _ := scenesQB.OCount(ctx) + totalPlayDuration, _ := scenesQB.PlayDuration(ctx) + totalPlayCount, _ := scenesQB.PlayCount(ctx) + uniqueScenePlayCount, _ := scenesQB.UniqueScenePlayCount(ctx) ret = StatsResultType{ - SceneCount: scenesCount, - ScenesSize: scenesSize, - ScenesDuration: scenesDuration, - ImageCount: imageCount, - ImagesSize: imageSize, - GalleryCount: galleryCount, - PerformerCount: performersCount, - StudioCount: studiosCount, - MovieCount: moviesCount, - TagCount: tagsCount, + SceneCount: scenesCount, + ScenesSize: scenesSize, + ScenesDuration: scenesDuration, + ImageCount: imageCount, + ImagesSize: imageSize, + GalleryCount: galleryCount, + PerformerCount: performersCount, + StudioCount: studiosCount, + MovieCount: moviesCount, + TagCount: tagsCount, + TotalOCount: totalOCount, + TotalPlayDuration: totalPlayDuration, + TotalPlayCount: totalPlayCount, + ScenesPlayed: uniqueScenePlayCount, } return nil @@ -180,7 +189,7 @@ func (r *queryResolver) Stats(ctx context.Context) (*StatsResultType, error) { } func (r *queryResolver) Version(ctx context.Context) (*Version, error) { - version, hash, buildtime := GetVersion() + version, hash, buildtime := build.Version() return &Version{ Version: &version, diff --git a/internal/api/resolver_model_scene.go b/internal/api/resolver_model_scene.go 
index cd6f16a5785..9d5b41725ce 100644 --- a/internal/api/resolver_model_scene.go +++ b/internal/api/resolver_model_scene.go @@ -405,3 +405,32 @@ func (r *sceneResolver) InteractiveSpeed(ctx context.Context, obj *models.Scene) return primaryFile.InteractiveSpeed, nil } + +func (r *sceneResolver) URL(ctx context.Context, obj *models.Scene) (*string, error) { + if !obj.URLs.Loaded() { + if err := r.withReadTxn(ctx, func(ctx context.Context) error { + return obj.LoadURLs(ctx, r.repository.Scene) + }); err != nil { + return nil, err + } + } + + urls := obj.URLs.List() + if len(urls) == 0 { + return nil, nil + } + + return &urls[0], nil +} + +func (r *sceneResolver) Urls(ctx context.Context, obj *models.Scene) ([]string, error) { + if !obj.URLs.Loaded() { + if err := r.withReadTxn(ctx, func(ctx context.Context) error { + return obj.LoadURLs(ctx, r.repository.Scene) + }); err != nil { + return nil, err + } + } + + return obj.URLs.List(), nil +} diff --git a/internal/api/resolver_mutation_configure.go b/internal/api/resolver_mutation_configure.go index bdc93137f17..177b444277c 100644 --- a/internal/api/resolver_mutation_configure.go +++ b/internal/api/resolver_mutation_configure.go @@ -479,6 +479,10 @@ func (r *mutationResolver) ConfigureInterface(ctx context.Context, input ConfigI c.Set(config.FunscriptOffset, *input.FunscriptOffset) } + if input.UseStashHostedFunscript != nil { + c.Set(config.UseStashHostedFunscript, *input.UseStashHostedFunscript) + } + if err := c.Write(); err != nil { return makeConfigInterfaceResult(), err } diff --git a/internal/api/resolver_mutation_scene.go b/internal/api/resolver_mutation_scene.go index 45f4b1e54d8..6eeb9f4819d 100644 --- a/internal/api/resolver_mutation_scene.go +++ b/internal/api/resolver_mutation_scene.go @@ -67,7 +67,6 @@ func (r *mutationResolver) SceneCreate(ctx context.Context, input SceneCreateInp Code: translator.string(input.Code, "code"), Details: translator.string(input.Details, "details"), Director: translator.string(input.Director, "director"), - URL: translator.string(input.URL, "url"), Date: translator.datePtr(input.Date, "date"), Rating: translator.ratingConversionInt(input.Rating, input.Rating100), Organized: translator.bool(input.Organized, "organized"), @@ -83,6 +82,12 @@ func (r *mutationResolver) SceneCreate(ctx context.Context, input SceneCreateInp return nil, fmt.Errorf("converting studio id: %w", err) } + if input.Urls != nil { + newScene.URLs = models.NewRelatedStrings(input.Urls) + } else if input.URL != nil { + newScene.URLs = models.NewRelatedStrings([]string{*input.URL}) + } + var coverImageData []byte if input.CoverImage != nil && *input.CoverImage != "" { var err error @@ -168,7 +173,6 @@ func scenePartialFromInput(input models.SceneUpdateInput, translator changesetTr updatedScene.Code = translator.optionalString(input.Code, "code") updatedScene.Details = translator.optionalString(input.Details, "details") updatedScene.Director = translator.optionalString(input.Director, "director") - updatedScene.URL = translator.optionalString(input.URL, "url") updatedScene.Date = translator.optionalDate(input.Date, "date") updatedScene.Rating = translator.ratingConversionOptional(input.Rating, input.Rating100) updatedScene.OCounter = translator.optionalInt(input.OCounter, "o_counter") @@ -182,6 +186,18 @@ func scenePartialFromInput(input models.SceneUpdateInput, translator changesetTr updatedScene.Organized = translator.optionalBool(input.Organized, "organized") + if translator.hasField("urls") { + updatedScene.URLs = 
&models.UpdateStrings{ + Values: input.Urls, + Mode: models.RelationshipUpdateModeSet, + } + } else if translator.hasField("url") { + updatedScene.URLs = &models.UpdateStrings{ + Values: []string{*input.URL}, + Mode: models.RelationshipUpdateModeSet, + } + } + if input.PrimaryFileID != nil { primaryFileID, err := strconv.Atoi(*input.PrimaryFileID) if err != nil { @@ -339,7 +355,6 @@ func (r *mutationResolver) BulkSceneUpdate(ctx context.Context, input BulkSceneU updatedScene.Code = translator.optionalString(input.Code, "code") updatedScene.Details = translator.optionalString(input.Details, "details") updatedScene.Director = translator.optionalString(input.Director, "director") - updatedScene.URL = translator.optionalString(input.URL, "url") updatedScene.Date = translator.optionalDate(input.Date, "date") updatedScene.Rating = translator.ratingConversionOptional(input.Rating, input.Rating100) updatedScene.StudioID, err = translator.optionalIntFromString(input.StudioID, "studio_id") @@ -349,6 +364,18 @@ func (r *mutationResolver) BulkSceneUpdate(ctx context.Context, input BulkSceneU updatedScene.Organized = translator.optionalBool(input.Organized, "organized") + if translator.hasField("urls") { + updatedScene.URLs = &models.UpdateStrings{ + Values: input.Urls.Values, + Mode: input.Urls.Mode, + } + } else if translator.hasField("url") { + updatedScene.URLs = &models.UpdateStrings{ + Values: []string{*input.URL}, + Mode: models.RelationshipUpdateModeSet, + } + } + if translator.hasField("performer_ids") { updatedScene.PerformerIDs, err = translateUpdateIDs(input.PerformerIds.Ids, input.PerformerIds.Mode) if err != nil { diff --git a/internal/api/resolver_mutation_stash_box.go b/internal/api/resolver_mutation_stash_box.go index 8f6753f5bd1..ccd57dd0938 100644 --- a/internal/api/resolver_mutation_stash_box.go +++ b/internal/api/resolver_mutation_stash_box.go @@ -68,6 +68,10 @@ func (r *mutationResolver) SubmitStashBoxSceneDraft(ctx context.Context, input S logger.Errorf("Error getting scene cover: %v", err) } + if err := scene.LoadURLs(ctx, r.repository.Scene); err != nil { + return fmt.Errorf("loading scene URLs: %w", err) + } + res, err = client.SubmitSceneDraft(ctx, scene, boxes[input.StashBoxIndex].Endpoint, cover) return err }) diff --git a/internal/api/resolver_query_configuration.go b/internal/api/resolver_query_configuration.go index 4c9f00aea0d..7de9bda0da6 100644 --- a/internal/api/resolver_query_configuration.go +++ b/internal/api/resolver_query_configuration.go @@ -159,6 +159,7 @@ func makeConfigInterfaceResult() *ConfigInterfaceResult { language := config.GetLanguage() handyKey := config.GetHandyKey() scriptOffset := config.GetFunscriptOffset() + useStashHostedFunscript := config.GetUseStashHostedFunscript() imageLightboxOptions := config.GetImageLightboxOptions() // FIXME - misnamed output field means we have redundant fields disableDropdownCreate := config.GetDisableDropdownCreate() @@ -190,8 +191,9 @@ func makeConfigInterfaceResult() *ConfigInterfaceResult { DisabledDropdownCreate: disableDropdownCreate, DisableDropdownCreate: disableDropdownCreate, - HandyKey: &handyKey, - FunscriptOffset: &scriptOffset, + HandyKey: &handyKey, + FunscriptOffset: &scriptOffset, + UseStashHostedFunscript: &useStashHostedFunscript, } } diff --git a/internal/api/routes_scene.go b/internal/api/routes_scene.go index 9a5e8149657..43d37da36e0 100644 --- a/internal/api/routes_scene.go +++ b/internal/api/routes_scene.go @@ -72,6 +72,7 @@ func (rs sceneRoutes) Routes() chi.Router { r.Get("/vtt/thumbs", 
rs.VttThumbs) r.Get("/vtt/sprite", rs.VttSprite) r.Get("/funscript", rs.Funscript) + r.Get("/interactive_csv", rs.InteractiveCSV) r.Get("/interactive_heatmap", rs.InteractiveHeatmap) r.Get("/caption", rs.CaptionLang) @@ -374,6 +375,20 @@ func (rs sceneRoutes) Funscript(w http.ResponseWriter, r *http.Request) { utils.ServeStaticFile(w, r, filepath) } +func (rs sceneRoutes) InteractiveCSV(w http.ResponseWriter, r *http.Request) { + s := r.Context().Value(sceneKey).(*models.Scene) + filepath := video.GetFunscriptPath(s.Path) + + // TheHandy directly only accepts interactive CSVs + csvBytes, err := manager.ConvertFunscriptToCSV(filepath) + + if err != nil { + http.Error(w, err.Error(), http.StatusInternalServerError) + return + } + utils.ServeStaticContent(w, r, csvBytes) +} + func (rs sceneRoutes) InteractiveHeatmap(w http.ResponseWriter, r *http.Request) { scene := r.Context().Value(sceneKey).(*models.Scene) sceneHash := scene.GetHash(config.GetInstance().GetVideoFileNamingAlgorithm()) diff --git a/internal/api/server.go b/internal/api/server.go index cfc57b3dd62..516505fb186 100644 --- a/internal/api/server.go +++ b/internal/api/server.go @@ -11,7 +11,6 @@ import ( "net/http" "os" "path" - "regexp" "runtime/debug" "strconv" "strings" @@ -30,6 +29,7 @@ import ( "github.com/go-chi/cors" "github.com/go-chi/httplog" "github.com/stashapp/stash/internal/api/loaders" + "github.com/stashapp/stash/internal/build" "github.com/stashapp/stash/internal/manager" "github.com/stashapp/stash/internal/manager/config" "github.com/stashapp/stash/pkg/fsutil" @@ -46,10 +46,6 @@ const ( playgroundEndpoint = "/playground" ) -var version string -var buildstamp string -var githash string - var uiBox = ui.UIBox var loginUIBox = ui.LoginUIBox @@ -270,7 +266,7 @@ func Start() error { TLSNextProto: make(map[string]func(*http.Server, *tls.Conn, http.Handler)), } - printVersion() + logger.Infof("stash version: %s\n", build.VersionString()) go printLatestVersion(context.TODO()) logger.Infof("stash is listening on " + address) if tlsConfig != nil { @@ -390,49 +386,6 @@ func customLocalesHandler(c *config.Instance) func(w http.ResponseWriter, r *htt } } -func printVersion() { - var versionString string - switch { - case version != "": - if githash != "" && !IsDevelop() { - versionString = version + " (" + githash + ")" - } else { - versionString = version - } - case githash != "": - versionString = githash - default: - versionString = "unknown" - } - if config.IsOfficialBuild() { - versionString += " - Official Build" - } else { - versionString += " - Unofficial Build" - } - if buildstamp != "" { - versionString += " - " + buildstamp - } - logger.Infof("stash version: %s\n", versionString) -} - -func GetVersion() (string, string, string) { - return version, githash, buildstamp -} - -func IsDevelop() bool { - if githash == "" { - return false - } - - // if the version is suffixed with -x-xxxx, then we are running a development build - develop := false - re := regexp.MustCompile(`-\d+-g\w+$`) - if re.MatchString(version) { - develop = true - } - return develop -} - func makeTLSConfig(c *config.Instance) (*tls.Config, error) { c.InitTLS() certFile, keyFile := c.GetTLSFiles() diff --git a/internal/autotag/integration_test.go b/internal/autotag/integration_test.go index cb7aa08b6ac..b31d656664b 100644 --- a/internal/autotag/integration_test.go +++ b/internal/autotag/integration_test.go @@ -176,7 +176,7 @@ func createScenes(ctx context.Context, sqb models.SceneReaderWriter, folderStore s := &models.Scene{ Title: 
expectedMatchTitle, - URL: existingStudioSceneName, + Code: existingStudioSceneName, StudioID: &existingStudioID, } if err := createScene(ctx, sqb, s, f); err != nil { @@ -625,7 +625,7 @@ func TestParseStudioScenes(t *testing.T) { for _, scene := range scenes { // check for existing studio id scene first - if scene.URL == existingStudioSceneName { + if scene.Code == existingStudioSceneName { if scene.StudioID == nil || *scene.StudioID != existingStudioID { t.Error("Incorrectly overwrote studio ID for scene with existing studio ID") } diff --git a/internal/build/version.go b/internal/build/version.go new file mode 100644 index 00000000000..84c5f819f4f --- /dev/null +++ b/internal/build/version.go @@ -0,0 +1,57 @@ +package build + +import ( + "regexp" +) + +var version string +var buildstamp string +var githash string +var officialBuild string + +func Version() (string, string, string) { + return version, githash, buildstamp +} + +func VersionString() string { + var versionString string + switch { + case version != "": + if githash != "" && !IsDevelop() { + versionString = version + " (" + githash + ")" + } else { + versionString = version + } + case githash != "": + versionString = githash + default: + versionString = "unknown" + } + if IsOfficial() { + versionString += " - Official Build" + } else { + versionString += " - Unofficial Build" + } + if buildstamp != "" { + versionString += " - " + buildstamp + } + return versionString +} + +func IsOfficial() bool { + return officialBuild == "true" +} + +func IsDevelop() bool { + if githash == "" { + return false + } + + // if the version is suffixed with -x-xxxx, then we are running a development build + develop := false + re := regexp.MustCompile(`-\d+-g\w+$`) + if re.MatchString(version) { + develop = true + } + return develop +} diff --git a/internal/desktop/desktop.go b/internal/desktop/desktop.go index 91d87ac10be..1e69a6c76cf 100644 --- a/internal/desktop/desktop.go +++ b/internal/desktop/desktop.go @@ -9,6 +9,7 @@ import ( "strings" "github.com/pkg/browser" + "github.com/stashapp/stash/internal/build" "github.com/stashapp/stash/internal/manager/config" "github.com/stashapp/stash/pkg/fsutil" "github.com/stashapp/stash/pkg/logger" @@ -104,7 +105,7 @@ func writeStashIcon(faviconProvider FaviconProvider) { func IsAllowedAutoUpdate() bool { // Only try to update if downloaded from official sources - if !config.IsOfficialBuild() { + if !build.IsOfficial() { return false } diff --git a/internal/identify/identify.go b/internal/identify/identify.go index 04eccb7b096..8d45e100967 100644 --- a/internal/identify/identify.go +++ b/internal/identify/identify.go @@ -2,18 +2,33 @@ package identify import ( "context" + "errors" "fmt" + "strconv" "github.com/stashapp/stash/pkg/logger" "github.com/stashapp/stash/pkg/models" "github.com/stashapp/stash/pkg/scene" "github.com/stashapp/stash/pkg/scraper" + "github.com/stashapp/stash/pkg/sliceutil" "github.com/stashapp/stash/pkg/txn" "github.com/stashapp/stash/pkg/utils" ) +var ( + ErrSkipSingleNamePerformer = errors.New("a performer was skipped because they only had a single name and no disambiguation") +) + +type MultipleMatchesFoundError struct { + Source ScraperSource +} + +func (e *MultipleMatchesFoundError) Error() string { + return fmt.Sprintf("multiple matches found for %s", e.Source.Name) +} + type SceneScraper interface { - ScrapeScene(ctx context.Context, sceneID int) (*scraper.ScrapedScene, error) + ScrapeScenes(ctx context.Context, sceneID int) ([]*scraper.ScrapedScene, error) } type 
SceneUpdatePostHookExecutor interface { @@ -31,7 +46,7 @@ type SceneIdentifier struct { SceneReaderUpdater SceneReaderUpdater StudioCreator StudioCreator PerformerCreator PerformerCreator - TagCreator TagCreator + TagCreatorFinder TagCreatorFinder DefaultOptions *MetadataOptions Sources []ScraperSource @@ -39,13 +54,31 @@ type SceneIdentifier struct { } func (t *SceneIdentifier) Identify(ctx context.Context, txnManager txn.Manager, scene *models.Scene) error { - result, err := t.scrapeScene(ctx, scene) + result, err := t.scrapeScene(ctx, txnManager, scene) + var multipleMatchErr *MultipleMatchesFoundError if err != nil { - return err + if !errors.As(err, &multipleMatchErr) { + return err + } } if result == nil { - logger.Debugf("Unable to identify %s", scene.Path) + if multipleMatchErr != nil { + logger.Debugf("Identify skipped because multiple results returned for %s", scene.Path) + + // find if the scene should be tagged for multiple results + options := t.getOptions(multipleMatchErr.Source) + if options.SkipMultipleMatchTag != nil && len(*options.SkipMultipleMatchTag) > 0 { + // Tag it with the multiple results tag + err := t.addTagToScene(ctx, txnManager, scene, *options.SkipMultipleMatchTag) + if err != nil { + return err + } + return nil + } + } else { + logger.Debugf("Unable to identify %s", scene.Path) + } return nil } @@ -62,63 +95,95 @@ type scrapeResult struct { source ScraperSource } -func (t *SceneIdentifier) scrapeScene(ctx context.Context, scene *models.Scene) (*scrapeResult, error) { +func (t *SceneIdentifier) scrapeScene(ctx context.Context, txnManager txn.Manager, scene *models.Scene) (*scrapeResult, error) { // iterate through the input sources for _, source := range t.Sources { // scrape using the source - scraped, err := source.Scraper.ScrapeScene(ctx, scene.ID) + results, err := source.Scraper.ScrapeScenes(ctx, scene.ID) if err != nil { logger.Errorf("error scraping from %v: %v", source.Scraper, err) continue } - // if results were found then return - if scraped != nil { - return &scrapeResult{ - result: scraped, - source: source, - }, nil + if len(results) > 0 { + options := t.getOptions(source) + if len(results) > 1 && utils.IsTrue(options.SkipMultipleMatches) { + return nil, &MultipleMatchesFoundError{ + Source: source, + } + } else { + // if results were found then return + return &scrapeResult{ + result: results[0], + source: source, + }, nil + } } } return nil, nil } +// Returns a MetadataOptions object with any default options overwritten by source specific options +func (t *SceneIdentifier) getOptions(source ScraperSource) MetadataOptions { + options := *t.DefaultOptions + if source.Options == nil { + return options + } + if source.Options.SetCoverImage != nil { + options.SetCoverImage = source.Options.SetCoverImage + } + if source.Options.SetOrganized != nil { + options.SetOrganized = source.Options.SetOrganized + } + if source.Options.IncludeMalePerformers != nil { + options.IncludeMalePerformers = source.Options.IncludeMalePerformers + } + if source.Options.SkipMultipleMatches != nil { + options.SkipMultipleMatches = source.Options.SkipMultipleMatches + } + if source.Options.SkipMultipleMatchTag != nil && len(*source.Options.SkipMultipleMatchTag) > 0 { + options.SkipMultipleMatchTag = source.Options.SkipMultipleMatchTag + } + if source.Options.SkipSingleNamePerformers != nil { + options.SkipSingleNamePerformers = source.Options.SkipSingleNamePerformers + } + if source.Options.SkipSingleNamePerformerTag != nil && 
len(*source.Options.SkipSingleNamePerformerTag) > 0 { + options.SkipSingleNamePerformerTag = source.Options.SkipSingleNamePerformerTag + } + return options +} + func (t *SceneIdentifier) getSceneUpdater(ctx context.Context, s *models.Scene, result *scrapeResult) (*scene.UpdateSet, error) { ret := &scene.UpdateSet{ ID: s.ID, } - options := []MetadataOptions{} + allOptions := []MetadataOptions{} if result.source.Options != nil { - options = append(options, *result.source.Options) + allOptions = append(allOptions, *result.source.Options) } if t.DefaultOptions != nil { - options = append(options, *t.DefaultOptions) + allOptions = append(allOptions, *t.DefaultOptions) } - fieldOptions := getFieldOptions(options) - - setOrganized := false - for _, o := range options { - if o.SetOrganized != nil { - setOrganized = *o.SetOrganized - break - } - } + fieldOptions := getFieldOptions(allOptions) + options := t.getOptions(result.source) scraped := result.result rel := sceneRelationships{ - sceneReader: t.SceneReaderUpdater, - studioCreator: t.StudioCreator, - performerCreator: t.PerformerCreator, - tagCreator: t.TagCreator, - scene: s, - result: result, - fieldOptions: fieldOptions, + sceneReader: t.SceneReaderUpdater, + studioCreator: t.StudioCreator, + performerCreator: t.PerformerCreator, + tagCreatorFinder: t.TagCreatorFinder, + scene: s, + result: result, + fieldOptions: fieldOptions, + skipSingleNamePerformers: utils.IsTrue(options.SkipSingleNamePerformers), } + setOrganized := utils.IsTrue(options.SetOrganized) ret.Partial = getScenePartial(s, scraped, fieldOptions, setOrganized) studioID, err := rel.studio(ctx) @@ -130,17 +195,19 @@ func (t *SceneIdentifier) getSceneUpdater(ctx context.Context, s *models.Scene, ret.Partial.StudioID = models.NewOptionalInt(*studioID) } - ignoreMale := false - for _, o := range options { - if o.IncludeMalePerformers != nil { - ignoreMale = !*o.IncludeMalePerformers - break - } + includeMalePerformers := true + if options.IncludeMalePerformers != nil { + includeMalePerformers = *options.IncludeMalePerformers } - performerIDs, err := rel.performers(ctx, ignoreMale) + addSkipSingleNamePerformerTag := false + performerIDs, err := rel.performers(ctx, !includeMalePerformers) if err != nil { - return nil, err + if errors.Is(err, ErrSkipSingleNamePerformer) { + addSkipSingleNamePerformerTag = true + } else { + return nil, err + } } if performerIDs != nil { ret.Partial.PerformerIDs = &models.UpdateIDs{ @@ -153,6 +220,14 @@ func (t *SceneIdentifier) getSceneUpdater(ctx context.Context, s *models.Scene, if err != nil { return nil, err } + if addSkipSingleNamePerformerTag && options.SkipSingleNamePerformerTag != nil { + tagID, err := strconv.ParseInt(*options.SkipSingleNamePerformerTag, 10, 64) + if err != nil { + return nil, fmt.Errorf("error converting tag ID %s: %w", *options.SkipSingleNamePerformerTag, err) + } + + tagIDs = sliceutil.AppendUnique(tagIDs, int(tagID)) + } if tagIDs != nil { ret.Partial.TagIDs = &models.UpdateIDs{ IDs: tagIDs, @@ -171,15 +246,7 @@ func (t *SceneIdentifier) getSceneUpdater(ctx context.Context, s *models.Scene, } } - setCoverImage := false - for _, o := range options { - if o.SetCoverImage != nil { - setCoverImage = *o.SetCoverImage - break - } - } - - if setCoverImage { + if utils.IsTrue(options.SetCoverImage) { ret.CoverImage, err = rel.cover(ctx) if err != nil { return nil, err @@ -193,6 +260,9 @@ func (t *SceneIdentifier) modifyScene(ctx context.Context, txnManager txn.Manage var updater *scene.UpdateSet if err := txn.WithTxn(ctx, 
txnManager, func(ctx context.Context) error { // load scene relationships + if err := s.LoadURLs(ctx, t.SceneReaderUpdater); err != nil { + return err + } if err := s.LoadPerformerIDs(ctx, t.SceneReaderUpdater); err != nil { return err } @@ -241,6 +311,41 @@ func (t *SceneIdentifier) modifyScene(ctx context.Context, txnManager txn.Manage return nil } +func (t *SceneIdentifier) addTagToScene(ctx context.Context, txnManager txn.Manager, s *models.Scene, tagToAdd string) error { + if err := txn.WithTxn(ctx, txnManager, func(ctx context.Context) error { + tagID, err := strconv.Atoi(tagToAdd) + if err != nil { + return fmt.Errorf("error converting tag ID %s: %w", tagToAdd, err) + } + + if err := s.LoadTagIDs(ctx, t.SceneReaderUpdater); err != nil { + return err + } + existing := s.TagIDs.List() + + if sliceutil.Include(existing, tagID) { + // skip if the scene was already tagged + return nil + } + + if err := scene.AddTag(ctx, t.SceneReaderUpdater, s, tagID); err != nil { + return err + } + + ret, err := t.TagCreatorFinder.Find(ctx, tagID) + if err != nil { + logger.Infof("Added tag id %s to skipped scene %s", tagToAdd, s.Path) + } else { + logger.Infof("Added tag %s to skipped scene %s", ret.Name, s.Path) + } + + return nil + }); err != nil { + return err + } + return nil +} + func getFieldOptions(options []MetadataOptions) map[string]*FieldOptions { // prefer source-specific field strategies, then the defaults ret := make(map[string]*FieldOptions) @@ -274,9 +379,27 @@ func getScenePartial(scene *models.Scene, scraped *scraper.ScrapedScene, fieldOp partial.Details = models.NewOptionalString(*scraped.Details) } } - if scraped.URL != nil && (scene.URL != *scraped.URL) { - if shouldSetSingleValueField(fieldOptions["url"], scene.URL != "") { - partial.URL = models.NewOptionalString(*scraped.URL) + if len(scraped.URLs) > 0 && shouldSetSingleValueField(fieldOptions["url"], false) { + // if overwrite, then set over the top + switch getFieldStrategy(fieldOptions["url"]) { + case FieldStrategyOverwrite: + // only overwrite if not equal + if len(sliceutil.Exclude(scene.URLs.List(), scraped.URLs)) != 0 { + partial.URLs = &models.UpdateStrings{ + Values: scraped.URLs, + Mode: models.RelationshipUpdateModeSet, + } + } + case FieldStrategyMerge: + // if merge, add if not already present + urls := sliceutil.AppendUniques(scene.URLs.List(), scraped.URLs) + + if len(urls) != len(scene.URLs.List()) { + partial.URLs = &models.UpdateStrings{ + Values: urls, + Mode: models.RelationshipUpdateModeSet, + } + } } } if scraped.Director != nil && (scene.Director != *scraped.Director) { @@ -291,14 +414,13 @@ func getScenePartial(scene *models.Scene, scraped *scraper.ScrapedScene, fieldOp } if setOrganized && !scene.Organized { - // just reuse the boolean since we know it's true - partial.Organized = models.NewOptionalBool(setOrganized) + partial.Organized = models.NewOptionalBool(true) } return partial } -func shouldSetSingleValueField(strategy *FieldOptions, hasExistingValue bool) bool { +func getFieldStrategy(strategy *FieldOptions) FieldStrategy { // if unset then default to MERGE fs := FieldStrategyMerge @@ -306,6 +428,13 @@ func shouldSetSingleValueField(strategy *FieldOptions, hasExistingValue bool) bo fs = strategy.Strategy } + return fs +} + +func shouldSetSingleValueField(strategy *FieldOptions, hasExistingValue bool) bool { + // if unset then default to MERGE + fs := getFieldStrategy(strategy) + if fs == FieldStrategyIgnore { return false } diff --git a/internal/identify/identify_test.go 
b/internal/identify/identify_test.go index 751f9bf4cfd..6c9f92cb249 100644 --- a/internal/identify/identify_test.go +++ b/internal/identify/identify_test.go @@ -4,12 +4,14 @@ import ( "context" "errors" "reflect" + "strconv" "testing" "github.com/stashapp/stash/pkg/models" "github.com/stashapp/stash/pkg/models/mocks" "github.com/stashapp/stash/pkg/scraper" "github.com/stashapp/stash/pkg/sliceutil/intslice" + "github.com/stretchr/testify/assert" "github.com/stretchr/testify/mock" ) @@ -17,10 +19,10 @@ var testCtx = context.Background() type mockSceneScraper struct { errIDs []int - results map[int]*scraper.ScrapedScene + results map[int][]*scraper.ScrapedScene } -func (s mockSceneScraper) ScrapeScene(ctx context.Context, sceneID int) (*scraper.ScrapedScene, error) { +func (s mockSceneScraper) ScrapeScenes(ctx context.Context, sceneID int) ([]*scraper.ScrapedScene, error) { if intslice.IntInclude(s.errIDs, sceneID) { return nil, errors.New("scrape scene error") } @@ -40,32 +42,66 @@ func TestSceneIdentifier_Identify(t *testing.T) { missingID found1ID found2ID + multiFoundID + multiFound2ID errUpdateID ) - var scrapedTitle = "scrapedTitle" + var ( + skipMultipleTagID = 1 + skipMultipleTagIDStr = strconv.Itoa(skipMultipleTagID) + ) + + var ( + scrapedTitle = "scrapedTitle" + scrapedTitle2 = "scrapedTitle2" + + boolFalse = false + boolTrue = true + ) - defaultOptions := &MetadataOptions{} + defaultOptions := &MetadataOptions{ + SetOrganized: &boolFalse, + SetCoverImage: &boolFalse, + IncludeMalePerformers: &boolFalse, + SkipSingleNamePerformers: &boolFalse, + } sources := []ScraperSource{ { Scraper: mockSceneScraper{ errIDs: []int{errID1}, - results: map[int]*scraper.ScrapedScene{ - found1ID: { + results: map[int][]*scraper.ScrapedScene{ + found1ID: {{ Title: &scrapedTitle, - }, + }}, }, }, }, { Scraper: mockSceneScraper{ errIDs: []int{errID2}, - results: map[int]*scraper.ScrapedScene{ - found2ID: { + results: map[int][]*scraper.ScrapedScene{ + found2ID: {{ Title: &scrapedTitle, - }, - errUpdateID: { + }}, + errUpdateID: {{ Title: &scrapedTitle, + }}, + multiFoundID: { + { + Title: &scrapedTitle, + }, + { + Title: &scrapedTitle2, + }, + }, + multiFound2ID: { + { + Title: &scrapedTitle, + }, + { + Title: &scrapedTitle2, + }, }, }, }, @@ -73,7 +109,7 @@ func TestSceneIdentifier_Identify(t *testing.T) { } mockSceneReaderWriter := &mocks.SceneReaderWriter{} - + mockSceneReaderWriter.On("GetURLs", mock.Anything, mock.Anything).Return(nil, nil) mockSceneReaderWriter.On("UpdatePartial", mock.Anything, mock.MatchedBy(func(id int) bool { return id == errUpdateID }), mock.Anything).Return(nil, errors.New("update error")) @@ -81,52 +117,85 @@ func TestSceneIdentifier_Identify(t *testing.T) { return id != errUpdateID }), mock.Anything).Return(nil, nil) + mockTagFinderCreator := &mocks.TagReaderWriter{} + mockTagFinderCreator.On("Find", mock.Anything, skipMultipleTagID).Return(&models.Tag{ + ID: skipMultipleTagID, + Name: skipMultipleTagIDStr, + }, nil) + tests := []struct { name string sceneID int + options *MetadataOptions wantErr bool }{ { "error scraping", errID1, + nil, false, }, { "error scraping from second", errID2, + nil, false, }, { "found in first scraper", found1ID, + nil, false, }, { "found in second scraper", found2ID, + nil, false, }, { "not found", missingID, + nil, false, }, { "error modifying", errUpdateID, + nil, true, }, - } - - identifier := SceneIdentifier{ - SceneReaderUpdater: mockSceneReaderWriter, - DefaultOptions: defaultOptions, - Sources: sources, - SceneUpdatePostHookExecutor: 
mockHookExecutor{}, + { + "multiple found", + multiFoundID, + nil, + false, + }, + { + "multiple found - set tag", + multiFound2ID, + &MetadataOptions{ + SkipMultipleMatches: &boolTrue, + SkipMultipleMatchTag: &skipMultipleTagIDStr, + }, + false, + }, } for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { + identifier := SceneIdentifier{ + SceneReaderUpdater: mockSceneReaderWriter, + TagCreatorFinder: mockTagFinderCreator, + DefaultOptions: defaultOptions, + Sources: sources, + SceneUpdatePostHookExecutor: mockHookExecutor{}, + } + + if tt.options != nil { + identifier.DefaultOptions = tt.options + } + scene := &models.Scene{ ID: tt.sceneID, PerformerIDs: models.NewRelatedIDs([]int{}), @@ -144,7 +213,16 @@ func TestSceneIdentifier_modifyScene(t *testing.T) { repo := models.Repository{ TxnManager: &mocks.TxnManager{}, } - tr := &SceneIdentifier{} + boolFalse := false + defaultOptions := &MetadataOptions{ + SetOrganized: &boolFalse, + SetCoverImage: &boolFalse, + IncludeMalePerformers: &boolFalse, + SkipSingleNamePerformers: &boolFalse, + } + tr := &SceneIdentifier{ + DefaultOptions: defaultOptions, + } type args struct { scene *models.Scene @@ -159,12 +237,16 @@ func TestSceneIdentifier_modifyScene(t *testing.T) { "empty update", args{ &models.Scene{ + URLs: models.NewRelatedStrings([]string{}), PerformerIDs: models.NewRelatedIDs([]int{}), TagIDs: models.NewRelatedIDs([]int{}), StashIDs: models.NewRelatedStashIDs([]models.StashID{}), }, &scrapeResult{ result: &scraper.ScrapedScene{}, + source: ScraperSource{ + Options: defaultOptions, + }, }, }, false, @@ -271,33 +353,44 @@ func Test_getScenePartial(t *testing.T) { Title: originalTitle, Date: &originalDateObj, Details: originalDetails, - URL: originalURL, + URLs: models.NewRelatedStrings([]string{originalURL}), } organisedScene := *originalScene organisedScene.Organized = true - emptyScene := &models.Scene{} + emptyScene := &models.Scene{ + URLs: models.NewRelatedStrings([]string{}), + } postPartial := models.ScenePartial{ Title: models.NewOptionalString(scrapedTitle), Date: models.NewOptionalDate(scrapedDateObj), Details: models.NewOptionalString(scrapedDetails), - URL: models.NewOptionalString(scrapedURL), + URLs: &models.UpdateStrings{ + Values: []string{scrapedURL}, + Mode: models.RelationshipUpdateModeSet, + }, + } + + postPartialMerge := postPartial + postPartialMerge.URLs = &models.UpdateStrings{ + Values: []string{scrapedURL}, + Mode: models.RelationshipUpdateModeSet, } scrapedScene := &scraper.ScrapedScene{ Title: &scrapedTitle, Date: &scrapedDate, Details: &scrapedDetails, - URL: &scrapedURL, + URLs: []string{scrapedURL}, } scrapedUnchangedScene := &scraper.ScrapedScene{ Title: &originalTitle, Date: &originalDate, Details: &originalDetails, - URL: &originalURL, + URLs: []string{originalURL}, } makeFieldOptions := func(input *FieldOptions) map[string]*FieldOptions { @@ -360,7 +453,12 @@ func Test_getScenePartial(t *testing.T) { mergeAll, false, }, - models.ScenePartial{}, + models.ScenePartial{ + URLs: &models.UpdateStrings{ + Values: []string{originalURL, scrapedURL}, + Mode: models.RelationshipUpdateModeSet, + }, + }, }, { "merge (empty values)", @@ -370,7 +468,7 @@ func Test_getScenePartial(t *testing.T) { mergeAll, false, }, - postPartial, + postPartialMerge, }, { "unchanged", @@ -407,9 +505,9 @@ func Test_getScenePartial(t *testing.T) { } for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { - if got := getScenePartial(tt.args.scene, tt.args.scraped, tt.args.fieldOptions, tt.args.setOrganized); 
!reflect.DeepEqual(got, tt.want) { - t.Errorf("getScenePartial() = %v, want %v", got, tt.want) - } + got := getScenePartial(tt.args.scene, tt.args.scraped, tt.args.fieldOptions, tt.args.setOrganized) + + assert.Equal(t, tt.want, got) }) } } diff --git a/internal/identify/options.go b/internal/identify/options.go index 84530e5fc5a..b4954a1f18b 100644 --- a/internal/identify/options.go +++ b/internal/identify/options.go @@ -33,6 +33,14 @@ type MetadataOptions struct { SetOrganized *bool `json:"setOrganized"` // defaults to true if not provided IncludeMalePerformers *bool `json:"includeMalePerformers"` + // defaults to true if not provided + SkipMultipleMatches *bool `json:"skipMultipleMatches"` + // ID of tag to tag skipped multiple matches with + SkipMultipleMatchTag *string `json:"skipMultipleMatchTag"` + // defaults to true if not provided + SkipSingleNamePerformers *bool `json:"skipSingleNamePerformers"` + // ID of tag to tag skipped single name performers with + SkipSingleNamePerformerTag *string `json:"skipSingleNamePerformerTag"` } type FieldOptions struct { diff --git a/internal/identify/performer.go b/internal/identify/performer.go index cb16f2a83d2..a75bfb024b0 100644 --- a/internal/identify/performer.go +++ b/internal/identify/performer.go @@ -4,17 +4,20 @@ import ( "context" "fmt" "strconv" + "strings" "time" "github.com/stashapp/stash/pkg/models" "github.com/stashapp/stash/pkg/sliceutil/stringslice" + "github.com/stashapp/stash/pkg/utils" ) type PerformerCreator interface { Create(ctx context.Context, newPerformer *models.Performer) error + UpdateImage(ctx context.Context, performerID int, image []byte) error } -func getPerformerID(ctx context.Context, endpoint string, w PerformerCreator, p *models.ScrapedPerformer, createMissing bool) (*int, error) { +func getPerformerID(ctx context.Context, endpoint string, w PerformerCreator, p *models.ScrapedPerformer, createMissing bool, skipSingleNamePerformers bool) (*int, error) { if p.StoredID != nil { // existing performer, just add it performerID, err := strconv.Atoi(*p.StoredID) @@ -24,6 +27,10 @@ func getPerformerID(ctx context.Context, endpoint string, w PerformerCreator, p return &performerID, nil } else if createMissing && p.Name != nil { // name is mandatory + // skip single name performers with no disambiguation + if skipSingleNamePerformers && !strings.Contains(*p.Name, " ") && (p.Disambiguation == nil || len(*p.Disambiguation) == 0) { + return nil, ErrSkipSingleNamePerformer + } return createMissingPerformer(ctx, endpoint, w, p) } @@ -46,6 +53,19 @@ func createMissingPerformer(ctx context.Context, endpoint string, w PerformerCre return nil, fmt.Errorf("error creating performer: %w", err) } + // update image table + if p.Image != nil && len(*p.Image) > 0 { + imageData, err := utils.ReadImageFromURL(ctx, *p.Image) + if err != nil { + return nil, err + } + + err = w.UpdateImage(ctx, performerInput.ID, imageData) + if err != nil { + return nil, err + } + } + return &performerInput.ID, nil } @@ -56,6 +76,9 @@ func scrapedToPerformerInput(performer *models.ScrapedPerformer) models.Performe CreatedAt: currentTime, UpdatedAt: currentTime, } + if performer.Disambiguation != nil { + ret.Disambiguation = *performer.Disambiguation + } if performer.Birthdate != nil { d := models.NewDate(*performer.Birthdate) ret.Birthdate = &d @@ -126,6 +149,12 @@ func scrapedToPerformerInput(performer *models.ScrapedPerformer) models.Performe if performer.Instagram != nil { ret.Instagram = *performer.Instagram } + if performer.URL != nil { + ret.URL = 
*performer.URL + } + if performer.Details != nil { + ret.Details = *performer.Details + } return ret } diff --git a/internal/identify/performer_test.go b/internal/identify/performer_test.go index 9ba1018c783..2e22837c49f 100644 --- a/internal/identify/performer_test.go +++ b/internal/identify/performer_test.go @@ -31,9 +31,10 @@ func Test_getPerformerID(t *testing.T) { }).Return(nil) type args struct { - endpoint string - p *models.ScrapedPerformer - createMissing bool + endpoint string + p *models.ScrapedPerformer + createMissing bool + skipSingleName bool } tests := []struct { name string @@ -47,6 +48,7 @@ func Test_getPerformerID(t *testing.T) { emptyEndpoint, &models.ScrapedPerformer{}, false, + false, }, nil, false, @@ -59,6 +61,7 @@ func Test_getPerformerID(t *testing.T) { StoredID: &invalidStoredID, }, false, + false, }, nil, true, @@ -71,6 +74,7 @@ func Test_getPerformerID(t *testing.T) { StoredID: &validStoredIDStr, }, false, + false, }, &validStoredID, false, @@ -83,6 +87,7 @@ func Test_getPerformerID(t *testing.T) { Name: &name, }, false, + false, }, nil, false, @@ -93,10 +98,24 @@ func Test_getPerformerID(t *testing.T) { emptyEndpoint, &models.ScrapedPerformer{}, true, + false, }, nil, false, }, + { + "single name no disambig creating", + args{ + emptyEndpoint, + &models.ScrapedPerformer{ + Name: &name, + }, + true, + true, + }, + nil, + true, + }, { "valid name creating", args{ @@ -105,6 +124,7 @@ func Test_getPerformerID(t *testing.T) { Name: &name, }, true, + false, }, &validStoredID, false, @@ -112,7 +132,7 @@ func Test_getPerformerID(t *testing.T) { } for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { - got, err := getPerformerID(testCtx, tt.args.endpoint, &mockPerformerReaderWriter, tt.args.p, tt.args.createMissing) + got, err := getPerformerID(testCtx, tt.args.endpoint, &mockPerformerReaderWriter, tt.args.p, tt.args.createMissing, tt.args.skipSingleName) if (err != nil) != tt.wantErr { t.Errorf("getPerformerID() error = %v, wantErr %v", err, tt.wantErr) return @@ -207,7 +227,7 @@ func Test_scrapedToPerformerInput(t *testing.T) { name := "name" var stringValues []string - for i := 0; i < 17; i++ { + for i := 0; i < 20; i++ { stringValues = append(stringValues, strconv.Itoa(i)) } @@ -240,44 +260,50 @@ func Test_scrapedToPerformerInput(t *testing.T) { { "set all", &models.ScrapedPerformer{ - Name: &name, - Birthdate: nextVal(), - DeathDate: nextVal(), - Gender: nextVal(), - Ethnicity: nextVal(), - Country: nextVal(), - EyeColor: nextVal(), - HairColor: nextVal(), - Height: nextVal(), - Weight: nextVal(), - Measurements: nextVal(), - FakeTits: nextVal(), - CareerLength: nextVal(), - Tattoos: nextVal(), - Piercings: nextVal(), - Aliases: nextVal(), - Twitter: nextVal(), - Instagram: nextVal(), + Name: &name, + Disambiguation: nextVal(), + Birthdate: nextVal(), + DeathDate: nextVal(), + Gender: nextVal(), + Ethnicity: nextVal(), + Country: nextVal(), + EyeColor: nextVal(), + HairColor: nextVal(), + Height: nextVal(), + Weight: nextVal(), + Measurements: nextVal(), + FakeTits: nextVal(), + CareerLength: nextVal(), + Tattoos: nextVal(), + Piercings: nextVal(), + Aliases: nextVal(), + Twitter: nextVal(), + Instagram: nextVal(), + URL: nextVal(), + Details: nextVal(), }, models.Performer{ - Name: name, - Birthdate: dateToDatePtr(models.NewDate(*nextVal())), - DeathDate: dateToDatePtr(models.NewDate(*nextVal())), - Gender: genderPtr(models.GenderEnum(*nextVal())), - Ethnicity: *nextVal(), - Country: *nextVal(), - EyeColor: *nextVal(), - HairColor: *nextVal(), - Height: 
nextIntVal(), - Weight: nextIntVal(), - Measurements: *nextVal(), - FakeTits: *nextVal(), - CareerLength: *nextVal(), - Tattoos: *nextVal(), - Piercings: *nextVal(), - Aliases: models.NewRelatedStrings([]string{*nextVal()}), - Twitter: *nextVal(), - Instagram: *nextVal(), + Name: name, + Disambiguation: *nextVal(), + Birthdate: dateToDatePtr(models.NewDate(*nextVal())), + DeathDate: dateToDatePtr(models.NewDate(*nextVal())), + Gender: genderPtr(models.GenderEnum(*nextVal())), + Ethnicity: *nextVal(), + Country: *nextVal(), + EyeColor: *nextVal(), + HairColor: *nextVal(), + Height: nextIntVal(), + Weight: nextIntVal(), + Measurements: *nextVal(), + FakeTits: *nextVal(), + CareerLength: *nextVal(), + Tattoos: *nextVal(), + Piercings: *nextVal(), + Aliases: models.NewRelatedStrings([]string{*nextVal()}), + Twitter: *nextVal(), + Instagram: *nextVal(), + URL: *nextVal(), + Details: *nextVal(), }, }, { diff --git a/internal/identify/scene.go b/internal/identify/scene.go index 9f99f67dcd3..7568d1b1f0a 100644 --- a/internal/identify/scene.go +++ b/internal/identify/scene.go @@ -3,6 +3,7 @@ package identify import ( "bytes" "context" + "errors" "fmt" "strconv" "strings" @@ -13,6 +14,7 @@ import ( "github.com/stashapp/stash/pkg/scene" "github.com/stashapp/stash/pkg/sliceutil" "github.com/stashapp/stash/pkg/sliceutil/intslice" + "github.com/stashapp/stash/pkg/tag" "github.com/stashapp/stash/pkg/utils" ) @@ -22,20 +24,23 @@ type SceneReaderUpdater interface { models.PerformerIDLoader models.TagIDLoader models.StashIDLoader + models.URLLoader } -type TagCreator interface { +type TagCreatorFinder interface { Create(ctx context.Context, newTag *models.Tag) error + tag.Finder } type sceneRelationships struct { - sceneReader SceneReaderUpdater - studioCreator StudioCreator - performerCreator PerformerCreator - tagCreator TagCreator - scene *models.Scene - result *scrapeResult - fieldOptions map[string]*FieldOptions + sceneReader SceneReaderUpdater + studioCreator StudioCreator + performerCreator PerformerCreator + tagCreatorFinder TagCreatorFinder + scene *models.Scene + result *scrapeResult + fieldOptions map[string]*FieldOptions + skipSingleNamePerformers bool } func (g sceneRelationships) studio(ctx context.Context) (*int, error) { @@ -93,13 +98,19 @@ func (g sceneRelationships) performers(ctx context.Context, ignoreMale bool) ([] performerIDs = originalPerformerIDs } + singleNamePerformerSkipped := false + for _, p := range scraped { if ignoreMale && p.Gender != nil && strings.EqualFold(*p.Gender, models.GenderEnumMale.String()) { continue } - performerID, err := getPerformerID(ctx, endpoint, g.performerCreator, p, createMissing) + performerID, err := getPerformerID(ctx, endpoint, g.performerCreator, p, createMissing, g.skipSingleNamePerformers) if err != nil { + if errors.Is(err, ErrSkipSingleNamePerformer) { + singleNamePerformerSkipped = true + continue + } return nil, err } @@ -110,9 +121,15 @@ func (g sceneRelationships) performers(ctx context.Context, ignoreMale bool) ([] // don't return if nothing was added if sliceutil.SliceSame(originalPerformerIDs, performerIDs) { + if singleNamePerformerSkipped { + return nil, ErrSkipSingleNamePerformer + } return nil, nil } + if singleNamePerformerSkipped { + return performerIDs, ErrSkipSingleNamePerformer + } return performerIDs, nil } @@ -156,7 +173,7 @@ func (g sceneRelationships) tags(ctx context.Context) ([]int, error) { CreatedAt: now, UpdatedAt: now, } - err := g.tagCreator.Create(ctx, &newTag) + err := g.tagCreatorFinder.Create(ctx, &newTag) if err 
!= nil { return nil, fmt.Errorf("error creating tag: %w", err) } diff --git a/internal/identify/scene_test.go b/internal/identify/scene_test.go index b91220f9f8f..714b559ce15 100644 --- a/internal/identify/scene_test.go +++ b/internal/identify/scene_test.go @@ -374,9 +374,9 @@ func Test_sceneRelationships_tags(t *testing.T) { })).Return(errors.New("error creating tag")) tr := sceneRelationships{ - sceneReader: mockSceneReaderWriter, - tagCreator: mockTagReaderWriter, - fieldOptions: make(map[string]*FieldOptions), + sceneReader: mockSceneReaderWriter, + tagCreatorFinder: mockTagReaderWriter, + fieldOptions: make(map[string]*FieldOptions), } tests := []struct { diff --git a/internal/identify/studio.go b/internal/identify/studio.go index e90864b1162..682245d5b15 100644 --- a/internal/identify/studio.go +++ b/internal/identify/studio.go @@ -7,11 +7,13 @@ import ( "github.com/stashapp/stash/pkg/hash/md5" "github.com/stashapp/stash/pkg/models" + "github.com/stashapp/stash/pkg/utils" ) type StudioCreator interface { Create(ctx context.Context, newStudio *models.Studio) error UpdateStashIDs(ctx context.Context, studioID int, stashIDs []models.StashID) error + UpdateImage(ctx context.Context, studioID int, image []byte) error } func createMissingStudio(ctx context.Context, endpoint string, w StudioCreator, studio *models.ScrapedStudio) (*int, error) { @@ -21,6 +23,19 @@ func createMissingStudio(ctx context.Context, endpoint string, w StudioCreator, return nil, fmt.Errorf("error creating studio: %w", err) } + // update image table + if studio.Image != nil && len(*studio.Image) > 0 { + imageData, err := utils.ReadImageFromURL(ctx, *studio.Image) + if err != nil { + return nil, err + } + + err = w.UpdateImage(ctx, studioInput.ID, imageData) + if err != nil { + return nil, err + } + } + if endpoint != "" && studio.RemoteSiteID != nil { if err := w.UpdateStashIDs(ctx, studioInput.ID, []models.StashID{ { diff --git a/internal/manager/config/config.go b/internal/manager/config/config.go index 44c64392515..3a339a13687 100644 --- a/internal/manager/config/config.go +++ b/internal/manager/config/config.go @@ -23,8 +23,6 @@ import ( "github.com/stashapp/stash/pkg/models/paths" ) -var officialBuild string - const ( Stash = "stash" Cache = "cache" @@ -192,8 +190,10 @@ const ( DisableDropdownCreateStudio = "disable_dropdown_create.studio" DisableDropdownCreateTag = "disable_dropdown_create.tag" - HandyKey = "handy_key" - FunscriptOffset = "funscript_offset" + HandyKey = "handy_key" + FunscriptOffset = "funscript_offset" + UseStashHostedFunscript = "use_stash_hosted_funscript" + useStashHostedFunscriptDefault = false DrawFunscriptHeatmapRange = "draw_funscript_heatmap_range" drawFunscriptHeatmapRangeDefault = true @@ -273,10 +273,6 @@ func (s *StashBoxError) Error() string { return "Stash-box: " + s.msg } -func IsOfficialBuild() bool { - return officialBuild == "true" -} - type Instance struct { // main instance - backed by config file main *viper.Viper @@ -1260,6 +1256,10 @@ func (i *Instance) GetFunscriptOffset() int { return i.getInt(FunscriptOffset) } +func (i *Instance) GetUseStashHostedFunscript() bool { + return i.getBoolDefault(UseStashHostedFunscript, useStashHostedFunscriptDefault) +} + func (i *Instance) GetDeleteFileDefault() bool { return i.getBool(DeleteFileDefault) } diff --git a/internal/manager/config/config_concurrency_test.go b/internal/manager/config/config_concurrency_test.go index 81bb7e81687..0ede5f05518 100644 --- a/internal/manager/config/config_concurrency_test.go +++ 
b/internal/manager/config/config_concurrency_test.go @@ -93,6 +93,7 @@ func TestConcurrentConfigAccess(t *testing.T) { i.Set(CSSEnabled, i.GetCSSEnabled()) i.Set(CSSEnabled, i.GetCustomLocalesEnabled()) i.Set(HandyKey, i.GetHandyKey()) + i.Set(UseStashHostedFunscript, i.GetUseStashHostedFunscript()) i.Set(DLNAServerName, i.GetDLNAServerName()) i.Set(DLNADefaultEnabled, i.GetDLNADefaultEnabled()) i.Set(DLNADefaultIPWhitelist, i.GetDLNADefaultIPWhitelist()) diff --git a/internal/manager/config/init.go b/internal/manager/config/init.go index 37a19143692..18cda5aa764 100644 --- a/internal/manager/config/init.go +++ b/internal/manager/config/init.go @@ -11,6 +11,7 @@ import ( "github.com/spf13/pflag" "github.com/spf13/viper" + "github.com/stashapp/stash/internal/build" "github.com/stashapp/stash/pkg/fsutil" "github.com/stashapp/stash/pkg/logger" ) @@ -25,6 +26,7 @@ type flagStruct struct { cpuProfilePath string nobrowser bool helpFlag bool + versionFlag bool } func GetInstance() *Instance { @@ -47,6 +49,11 @@ func Initialize() (*Instance, error) { os.Exit(0) } + if flags.versionFlag { + fmt.Printf(build.VersionString() + "\n") + os.Exit(0) + } + overrides := makeOverrideConfig() _ = GetInstance() @@ -134,6 +141,7 @@ func initFlags() flagStruct { pflag.StringVar(&flags.cpuProfilePath, "cpuprofile", "", "write cpu profile to file") pflag.BoolVar(&flags.nobrowser, "nobrowser", false, "Don't open a browser window after launch") pflag.BoolVarP(&flags.helpFlag, "help", "h", false, "show this help text and exit") + pflag.BoolVarP(&flags.versionFlag, "version", "v", false, "show version number and exit") pflag.Parse() diff --git a/internal/manager/generator_interactive_heatmap_speed.go b/internal/manager/generator_interactive_heatmap_speed.go index 3cae5f5621e..17f8c2a8a02 100644 --- a/internal/manager/generator_interactive_heatmap_speed.go +++ b/internal/manager/generator_interactive_heatmap_speed.go @@ -1,6 +1,7 @@ package manager import ( + "bytes" "encoding/json" "fmt" "image" @@ -11,6 +12,7 @@ import ( "sort" "github.com/lucasb-eyer/go-colorful" + "github.com/stashapp/stash/pkg/fsutil" "github.com/stashapp/stash/pkg/logger" ) @@ -365,3 +367,62 @@ func getSegmentColor(intensity float64) colorful.Color { return c } + +func LoadFunscriptData(path string) (Script, error) { + data, err := os.ReadFile(path) + if err != nil { + return Script{}, err + } + + var funscript Script + err = json.Unmarshal(data, &funscript) + if err != nil { + return Script{}, err + } + + if funscript.Actions == nil { + return Script{}, fmt.Errorf("actions list missing in %s", path) + } + + sort.SliceStable(funscript.Actions, func(i, j int) bool { return funscript.Actions[i].At < funscript.Actions[j].At }) + + return funscript, nil +} + +func convertRange(value int, fromLow int, fromHigh int, toLow int, toHigh int) int { + return ((value-fromLow)*(toHigh-toLow))/(fromHigh-fromLow) + toLow +} + +func ConvertFunscriptToCSV(funscriptPath string) ([]byte, error) { + funscript, err := LoadFunscriptData(funscriptPath) + + if err != nil { + return nil, err + } + + var buffer bytes.Buffer + for _, action := range funscript.Actions { + pos := action.Pos + + if funscript.Inverted { + pos = convertRange(pos, 0, 100, 100, 0) + } + + if funscript.Range > 0 { + pos = convertRange(pos, 0, funscript.Range, 0, 100) + } + + buffer.WriteString(fmt.Sprintf("%d,%d\r\n", action.At, pos)) + } + return buffer.Bytes(), nil +} + +func ConvertFunscriptToCSVFile(funscriptPath string, csvPath string) error { + csvBytes, err := 
ConvertFunscriptToCSV(funscriptPath) + + if err != nil { + return err + } + + return fsutil.WriteFile(csvPath, csvBytes) +} diff --git a/internal/manager/repository.go b/internal/manager/repository.go index 55fea16724a..a8f0b9e7d05 100644 --- a/internal/manager/repository.go +++ b/internal/manager/repository.go @@ -29,6 +29,7 @@ type GalleryReaderWriter interface { type SceneReaderWriter interface { models.SceneReaderWriter scene.CreatorUpdater + models.URLLoader GetManyFileIDs(ctx context.Context, ids []int) ([][]file.ID, error) } diff --git a/internal/manager/task_identify.go b/internal/manager/task_identify.go index 4cbacde2b2c..2a0c942f22d 100644 --- a/internal/manager/task_identify.go +++ b/internal/manager/task_identify.go @@ -136,7 +136,7 @@ func (j *IdentifyJob) identifyScene(ctx context.Context, s *models.Scene, source SceneReaderUpdater: instance.Repository.Scene, StudioCreator: instance.Repository.Studio, PerformerCreator: instance.Repository.Performer, - TagCreator: instance.Repository.Tag, + TagCreatorFinder: instance.Repository.Tag, DefaultOptions: j.input.Options, Sources: sources, @@ -248,14 +248,14 @@ type stashboxSource struct { endpoint string } -func (s stashboxSource) ScrapeScene(ctx context.Context, sceneID int) (*scraper.ScrapedScene, error) { +func (s stashboxSource) ScrapeScenes(ctx context.Context, sceneID int) ([]*scraper.ScrapedScene, error) { results, err := s.FindStashBoxSceneByFingerprints(ctx, sceneID) if err != nil { return nil, fmt.Errorf("error querying stash-box using scene ID %d: %w", sceneID, err) } if len(results) > 0 { - return results[0], nil + return results, nil } return nil, nil @@ -270,7 +270,7 @@ type scraperSource struct { scraperID string } -func (s scraperSource) ScrapeScene(ctx context.Context, sceneID int) (*scraper.ScrapedScene, error) { +func (s scraperSource) ScrapeScenes(ctx context.Context, sceneID int) ([]*scraper.ScrapedScene, error) { content, err := s.cache.ScrapeID(ctx, s.scraperID, sceneID, scraper.ScrapeContentTypeScene) if err != nil { return nil, err @@ -282,7 +282,7 @@ func (s scraperSource) ScrapeScene(ctx context.Context, sceneID int) (*scraper.S } if scene, ok := content.(scraper.ScrapedScene); ok { - return &scene, nil + return []*scraper.ScrapedScene{&scene}, nil } return nil, errors.New("could not convert content to scene") diff --git a/pkg/file/file.go b/pkg/file/file.go index 5b6f8d44776..50a2d613868 100644 --- a/pkg/file/file.go +++ b/pkg/file/file.go @@ -154,10 +154,12 @@ type Getter interface { FindByFingerprint(ctx context.Context, fp Fingerprint) ([]File, error) FindByZipFileID(ctx context.Context, zipFileID ID) ([]File, error) FindAllInPaths(ctx context.Context, p []string, limit, offset int) ([]File, error) + FindByFileInfo(ctx context.Context, info fs.FileInfo, size int64) ([]File, error) } type Counter interface { CountAllInPaths(ctx context.Context, p []string) (int, error) + CountByFolderID(ctx context.Context, folderID FolderID) (int, error) } // Creator provides methods to create Files. 
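
The new `/scene/{id}/interactive_csv` route earlier in this patch serves the output of `ConvertFunscriptToCSV` directly to the Handy, and the conversion itself is a small transform over the funscript's action list: positions are optionally inverted, rescaled from a custom range onto 0-100, and written out as CRLF-terminated `at,pos` rows. The sketch below is a standalone approximation of that transform for readers who want to see it in isolation; the struct and helper names are simplified stand-ins, not the types used in the patch.

```go
// Illustrative sketch only: an approximation of the funscript-to-CSV conversion
// added in generator_interactive_heatmap_speed.go. Types here are stand-ins.
package main

import (
	"bytes"
	"fmt"
)

type action struct {
	At  int // timestamp in milliseconds
	Pos int // position, nominally 0-100
}

type script struct {
	Inverted bool
	Range    int // if > 0, positions are scaled from [0, Range] to [0, 100]
	Actions  []action
}

// convertRange linearly remaps value from [fromLow, fromHigh] to [toLow, toHigh],
// mirroring the helper introduced in the patch.
func convertRange(value, fromLow, fromHigh, toLow, toHigh int) int {
	return ((value-fromLow)*(toHigh-toLow))/(fromHigh-fromLow) + toLow
}

func toCSV(s script) []byte {
	var buf bytes.Buffer
	for _, a := range s.Actions {
		pos := a.Pos
		if s.Inverted {
			pos = convertRange(pos, 0, 100, 100, 0)
		}
		if s.Range > 0 {
			pos = convertRange(pos, 0, s.Range, 0, 100)
		}
		// TheHandy expects CRLF-terminated "timestamp,position" rows.
		buf.WriteString(fmt.Sprintf("%d,%d\r\n", a.At, pos))
	}
	return buf.Bytes()
}

func main() {
	s := script{Actions: []action{{At: 0, Pos: 10}, {At: 500, Pos: 90}}}
	fmt.Printf("%q\n", toCSV(s)) // "0,10\r\n500,90\r\n"
}
```
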
diff --git a/pkg/file/folder_rename_detect.go b/pkg/file/folder_rename_detect.go new file mode 100644 index 00000000000..0e52eb7854c --- /dev/null +++ b/pkg/file/folder_rename_detect.go @@ -0,0 +1,195 @@ +package file + +import ( + "context" + "errors" + "fmt" + "io/fs" + + "github.com/stashapp/stash/pkg/logger" +) + +type folderRenameCandidate struct { + folder *Folder + found int + files int +} + +type folderRenameDetector struct { + // candidates is a map of folder id to the number of files that match + candidates map[FolderID]folderRenameCandidate + // rejects is a set of folder ids which were found to still exist + rejects map[FolderID]struct{} +} + +func (d *folderRenameDetector) isReject(id FolderID) bool { + _, ok := d.rejects[id] + return ok +} + +func (d *folderRenameDetector) getCandidate(id FolderID) *folderRenameCandidate { + c, ok := d.candidates[id] + if !ok { + return nil + } + + return &c +} + +func (d *folderRenameDetector) setCandidate(c folderRenameCandidate) { + d.candidates[c.folder.ID] = c +} + +func (d *folderRenameDetector) reject(id FolderID) { + d.rejects[id] = struct{}{} +} + +// bestCandidate returns the folder that is the best candidate for a rename. +// This is the folder that has the largest number of its original files that +// are still present in the new location. +func (d *folderRenameDetector) bestCandidate() *Folder { + if len(d.candidates) == 0 { + return nil + } + + var best *folderRenameCandidate + + for _, c := range d.candidates { + // ignore folders that have less than 50% of their original files + if c.found < c.files/2 { + continue + } + + // prefer the folder with the most files if the ratio is the same + if best == nil || c.found > best.found { + cc := c + best = &cc + } + } + + if best == nil { + return nil + } + + return best.folder +} + +func (s *scanJob) detectFolderMove(ctx context.Context, file scanFile) (*Folder, error) { + // in order for a folder to be considered moved, the existing folder must be + // missing, and the majority of the old folder's files must be present, unchanged, + // in the new folder. 
+ + detector := folderRenameDetector{ + candidates: make(map[FolderID]folderRenameCandidate), + rejects: make(map[FolderID]struct{}), + } + // rejects is a set of folder ids which were found to still exist + + if err := symWalk(file.fs, file.Path, func(path string, d fs.DirEntry, err error) error { + if err != nil { + // don't let errors prevent scanning + logger.Errorf("error scanning %s: %v", path, err) + return nil + } + + // ignore root + if path == file.Path { + return nil + } + + // ignore directories + if d.IsDir() { + return fs.SkipDir + } + + info, err := d.Info() + if err != nil { + return fmt.Errorf("reading info for %q: %w", path, err) + } + + if !s.acceptEntry(ctx, path, info) { + return nil + } + + size, err := getFileSize(file.fs, path, info) + if err != nil { + return fmt.Errorf("getting file size for %q: %w", path, err) + } + + // check if the file exists in the database based on basename, size and mod time + existing, err := s.Repository.Store.FindByFileInfo(ctx, info, size) + if err != nil { + return fmt.Errorf("checking for existing file %q: %w", path, err) + } + + for _, e := range existing { + // ignore files in zip files + if e.Base().ZipFileID != nil { + continue + } + + parentFolderID := e.Base().ParentFolderID + + if detector.isReject(parentFolderID) { + // folder was found to still exist, not a candidate + continue + } + + c := detector.getCandidate(parentFolderID) + + if c == nil { + // need to check if the folder exists in the filesystem + pf, err := s.Repository.FolderStore.Find(ctx, e.Base().ParentFolderID) + if err != nil { + return fmt.Errorf("getting parent folder %d: %w", e.Base().ParentFolderID, err) + } + + if pf == nil { + // shouldn't happen, but just in case + continue + } + + // parent folder must be missing + _, err = file.fs.Lstat(pf.Path) + if err == nil { + // parent folder exists, not a candidate + detector.reject(parentFolderID) + continue + } + + if !errors.Is(err, fs.ErrNotExist) { + return fmt.Errorf("checking for parent folder %q: %w", pf.Path, err) + } + + // parent folder is missing, possible candidate + // count the total number of files in the existing folder + count, err := s.Repository.Store.CountByFolderID(ctx, parentFolderID) + if err != nil { + return fmt.Errorf("counting files in folder %d: %w", parentFolderID, err) + } + + if count == 0 { + // no files in the folder, not a candidate + detector.reject(parentFolderID) + continue + } + + c = &folderRenameCandidate{ + folder: pf, + found: 0, + files: count, + } + } + + // increment the count and set it in the map + c.found++ + detector.setCandidate(*c) + } + + return nil + }); err != nil { + return nil, fmt.Errorf("walking filesystem for folder rename detection: %w", err) + } + + return detector.bestCandidate(), nil +} diff --git a/pkg/file/scan.go b/pkg/file/scan.go index dcd625ff667..badb5ab23e5 100644 --- a/pkg/file/scan.go +++ b/pkg/file/scan.go @@ -215,19 +215,6 @@ func (s *scanJob) queueFileFunc(ctx context.Context, f FS, zipFile *scanFile) fs return fmt.Errorf("reading info for %q: %w", path, err) } - var size int64 - - // #2196/#3042 - replace size with target size if file is a symlink - if info.Mode()&os.ModeSymlink == os.ModeSymlink { - targetInfo, err := f.Stat(path) - if err != nil { - return fmt.Errorf("reading info for symlink %q: %w", path, err) - } - size = targetInfo.Size() - } else { - size = info.Size() - } - if !s.acceptEntry(ctx, path, info) { if info.IsDir() { return fs.SkipDir @@ -236,6 +223,11 @@ func (s *scanJob) queueFileFunc(ctx context.Context, f FS, 
zipFile *scanFile) fs return nil } + size, err := getFileSize(f, path, info) + if err != nil { + return err + } + ff := scanFile{ BaseFile: &BaseFile{ DirEntry: DirEntry{ @@ -294,6 +286,19 @@ func (s *scanJob) queueFileFunc(ctx context.Context, f FS, zipFile *scanFile) fs } } +func getFileSize(f FS, path string, info fs.FileInfo) (int64, error) { + // #2196/#3042 - replace size with target size if file is a symlink + if info.Mode()&os.ModeSymlink == os.ModeSymlink { + targetInfo, err := f.Stat(path) + if err != nil { + return 0, fmt.Errorf("reading info for symlink %q: %w", path, err) + } + return targetInfo.Size(), nil + } + + return info.Size(), nil +} + func (s *scanJob) acceptEntry(ctx context.Context, path string, info fs.FileInfo) bool { // always accept if there's no filters accept := len(s.options.ScanFilters) == 0 @@ -485,6 +490,15 @@ func (s *scanJob) handleFolder(ctx context.Context, file scanFile) error { } func (s *scanJob) onNewFolder(ctx context.Context, file scanFile) (*Folder, error) { + renamed, err := s.handleFolderRename(ctx, file) + if err != nil { + return nil, err + } + + if renamed != nil { + return renamed, nil + } + now := time.Now() toCreate := &Folder{ @@ -522,6 +536,42 @@ func (s *scanJob) onNewFolder(ctx context.Context, file scanFile) (*Folder, erro return toCreate, nil } +func (s *scanJob) handleFolderRename(ctx context.Context, file scanFile) (*Folder, error) { + // ignore folders in zip files + if file.ZipFileID != nil { + return nil, nil + } + + // check if the folder was moved from elsewhere + renamedFrom, err := s.detectFolderMove(ctx, file) + if err != nil { + return nil, fmt.Errorf("detecting folder move: %w", err) + } + + if renamedFrom == nil { + return nil, nil + } + + // if the folder was moved, update the existing folder + logger.Infof("%s moved to %s. 
Updating path...", renamedFrom.Path, file.Path) + renamedFrom.Path = file.Path + + // update the parent folder ID + // find the parent folder + parentFolderID, err := s.getFolderID(ctx, filepath.Dir(file.Path)) + if err != nil { + return nil, fmt.Errorf("getting parent folder for %q: %w", file.Path, err) + } + + renamedFrom.ParentFolderID = parentFolderID + + if err := s.Repository.FolderStore.Update(ctx, renamedFrom); err != nil { + return nil, fmt.Errorf("updating folder for rename %q: %w", renamedFrom.Path, err) + } + + return renamedFrom, nil +} + func (s *scanJob) onExistingFolder(ctx context.Context, f scanFile, existing *Folder) (*Folder, error) { update := false diff --git a/pkg/models/jsonschema/scene.go b/pkg/models/jsonschema/scene.go index 850b16c27bc..38aa6ed24fb 100644 --- a/pkg/models/jsonschema/scene.go +++ b/pkg/models/jsonschema/scene.go @@ -47,10 +47,12 @@ type SceneMovie struct { } type Scene struct { - Title string `json:"title,omitempty"` - Code string `json:"code,omitempty"` - Studio string `json:"studio,omitempty"` + Title string `json:"title,omitempty"` + Code string `json:"code,omitempty"` + Studio string `json:"studio,omitempty"` + // deprecated - for import only URL string `json:"url,omitempty"` + URLs []string `json:"urls,omitempty"` Date string `json:"date,omitempty"` Rating int `json:"rating,omitempty"` Organized bool `json:"organized,omitempty"` diff --git a/pkg/models/mocks/SceneReaderWriter.go b/pkg/models/mocks/SceneReaderWriter.go index ee0c124963d..8d7245ee9ea 100644 --- a/pkg/models/mocks/SceneReaderWriter.go +++ b/pkg/models/mocks/SceneReaderWriter.go @@ -624,6 +624,29 @@ func (_m *SceneReaderWriter) GetTagIDs(ctx context.Context, relatedID int) ([]in return r0, r1 } +// GetURLs provides a mock function with given fields: ctx, relatedID +func (_m *SceneReaderWriter) GetURLs(ctx context.Context, relatedID int) ([]string, error) { + ret := _m.Called(ctx, relatedID) + + var r0 []string + if rf, ok := ret.Get(0).(func(context.Context, int) []string); ok { + r0 = rf(ctx, relatedID) + } else { + if ret.Get(0) != nil { + r0 = ret.Get(0).([]string) + } + } + + var r1 error + if rf, ok := ret.Get(1).(func(context.Context, int) error); ok { + r1 = rf(ctx, relatedID) + } else { + r1 = ret.Error(1) + } + + return r0, r1 +} + // HasCover provides a mock function with given fields: ctx, sceneID func (_m *SceneReaderWriter) HasCover(ctx context.Context, sceneID int) (bool, error) { ret := _m.Called(ctx, sceneID) @@ -687,6 +710,27 @@ func (_m *SceneReaderWriter) IncrementWatchCount(ctx context.Context, id int) (i return r0, r1 } +// OCount provides a mock function with given fields: ctx +func (_m *SceneReaderWriter) OCount(ctx context.Context) (int, error) { + ret := _m.Called(ctx) + + var r0 int + if rf, ok := ret.Get(0).(func(context.Context) int); ok { + r0 = rf(ctx) + } else { + r0 = ret.Get(0).(int) + } + + var r1 error + if rf, ok := ret.Get(1).(func(context.Context) error); ok { + r1 = rf(ctx) + } else { + r1 = ret.Error(1) + } + + return r0, r1 +} + // OCountByPerformerID provides a mock function with given fields: ctx, performerID func (_m *SceneReaderWriter) OCountByPerformerID(ctx context.Context, performerID int) (int, error) { ret := _m.Called(ctx, performerID) @@ -708,6 +752,48 @@ func (_m *SceneReaderWriter) OCountByPerformerID(ctx context.Context, performerI return r0, r1 } +// PlayCount provides a mock function with given fields: ctx +func (_m *SceneReaderWriter) PlayCount(ctx context.Context) (int, error) { + ret := _m.Called(ctx) + + var r0 int + 
if rf, ok := ret.Get(0).(func(context.Context) int); ok { + r0 = rf(ctx) + } else { + r0 = ret.Get(0).(int) + } + + var r1 error + if rf, ok := ret.Get(1).(func(context.Context) error); ok { + r1 = rf(ctx) + } else { + r1 = ret.Error(1) + } + + return r0, r1 +} + +// PlayDuration provides a mock function with given fields: ctx +func (_m *SceneReaderWriter) PlayDuration(ctx context.Context) (float64, error) { + ret := _m.Called(ctx) + + var r0 float64 + if rf, ok := ret.Get(0).(func(context.Context) float64); ok { + r0 = rf(ctx) + } else { + r0 = ret.Get(0).(float64) + } + + var r1 error + if rf, ok := ret.Get(1).(func(context.Context) error); ok { + r1 = rf(ctx) + } else { + r1 = ret.Error(1) + } + + return r0, r1 +} + // Query provides a mock function with given fields: ctx, options func (_m *SceneReaderWriter) Query(ctx context.Context, options models.SceneQueryOptions) (*models.SceneQueryResult, error) { ret := _m.Called(ctx, options) @@ -815,6 +901,27 @@ func (_m *SceneReaderWriter) Size(ctx context.Context) (float64, error) { return r0, r1 } +// UniqueScenePlayCount provides a mock function with given fields: ctx +func (_m *SceneReaderWriter) UniqueScenePlayCount(ctx context.Context) (int, error) { + ret := _m.Called(ctx) + + var r0 int + if rf, ok := ret.Get(0).(func(context.Context) int); ok { + r0 = rf(ctx) + } else { + r0 = ret.Get(0).(int) + } + + var r1 error + if rf, ok := ret.Get(1).(func(context.Context) error); ok { + r1 = rf(ctx) + } else { + r1 = ret.Error(1) + } + + return r0, r1 +} + // Update provides a mock function with given fields: ctx, updatedScene func (_m *SceneReaderWriter) Update(ctx context.Context, updatedScene *models.Scene) error { ret := _m.Called(ctx, updatedScene) diff --git a/pkg/models/model_scene.go b/pkg/models/model_scene.go index 79c865ed2a1..f19113f499a 100644 --- a/pkg/models/model_scene.go +++ b/pkg/models/model_scene.go @@ -17,7 +17,6 @@ type Scene struct { Code string `json:"code"` Details string `json:"details"` Director string `json:"director"` - URL string `json:"url"` Date *Date `json:"date"` // Rating expressed in 1-100 scale Rating *int `json:"rating"` @@ -43,6 +42,7 @@ type Scene struct { PlayDuration float64 `json:"play_duration"` PlayCount int `json:"play_count"` + URLs RelatedStrings `json:"urls"` GalleryIDs RelatedIDs `json:"gallery_ids"` TagIDs RelatedIDs `json:"tag_ids"` PerformerIDs RelatedIDs `json:"performer_ids"` @@ -50,6 +50,12 @@ type Scene struct { StashIDs RelatedStashIDs `json:"stash_ids"` } +func (s *Scene) LoadURLs(ctx context.Context, l URLLoader) error { + return s.URLs.load(func() ([]string, error) { + return l.GetURLs(ctx, s.ID) + }) +} + func (s *Scene) LoadFiles(ctx context.Context, l VideoFileLoader) error { return s.Files.load(func() ([]*file.VideoFile, error) { return l.GetFiles(ctx, s.ID) @@ -110,6 +116,10 @@ func (s *Scene) LoadStashIDs(ctx context.Context, l StashIDLoader) error { } func (s *Scene) LoadRelationships(ctx context.Context, l SceneReader) error { + if err := s.LoadURLs(ctx, l); err != nil { + return err + } + if err := s.LoadGalleryIDs(ctx, l); err != nil { return err } @@ -144,7 +154,6 @@ type ScenePartial struct { Code OptionalString Details OptionalString Director OptionalString - URL OptionalString Date OptionalDate // Rating expressed in 1-100 scale Rating OptionalInt @@ -158,6 +167,7 @@ type ScenePartial struct { PlayCount OptionalInt LastPlayedAt OptionalTime + URLs *UpdateStrings GalleryIDs *UpdateIDs TagIDs *UpdateIDs PerformerIDs *UpdateIDs @@ -193,6 +203,7 @@ type SceneUpdateInput 
struct { Rating100 *int `json:"rating100"` OCounter *int `json:"o_counter"` Organized *bool `json:"organized"` + Urls []string `json:"urls"` StudioID *string `json:"studio_id"` GalleryIds []string `json:"gallery_ids"` PerformerIds []string `json:"performer_ids"` @@ -227,7 +238,7 @@ func (s ScenePartial) UpdateInput(id int) SceneUpdateInput { Code: s.Code.Ptr(), Details: s.Details.Ptr(), Director: s.Director.Ptr(), - URL: s.URL.Ptr(), + Urls: s.URLs.Strings(), Date: dateStr, Rating100: s.Rating.Ptr(), Organized: s.Organized.Ptr(), diff --git a/pkg/models/model_scene_test.go b/pkg/models/model_scene_test.go index 91099197131..e3126acaea1 100644 --- a/pkg/models/model_scene_test.go +++ b/pkg/models/model_scene_test.go @@ -37,11 +37,14 @@ func TestScenePartial_UpdateInput(t *testing.T) { "full", id, ScenePartial{ - Title: NewOptionalString(title), - Code: NewOptionalString(code), - Details: NewOptionalString(details), - Director: NewOptionalString(director), - URL: NewOptionalString(url), + Title: NewOptionalString(title), + Code: NewOptionalString(code), + Details: NewOptionalString(details), + Director: NewOptionalString(director), + URLs: &UpdateStrings{ + Values: []string{url}, + Mode: RelationshipUpdateModeSet, + }, Date: NewOptionalDate(dateObj), Rating: NewOptionalInt(rating100), Organized: NewOptionalBool(organized), @@ -53,7 +56,7 @@ func TestScenePartial_UpdateInput(t *testing.T) { Code: &code, Details: &details, Director: &director, - URL: &url, + Urls: []string{url}, Date: &date, Rating: &ratingLegacy, Rating100: &rating100, diff --git a/pkg/models/relationships.go b/pkg/models/relationships.go index 3975bffc3e6..f59e7d92e06 100644 --- a/pkg/models/relationships.go +++ b/pkg/models/relationships.go @@ -42,6 +42,10 @@ type AliasLoader interface { GetAliases(ctx context.Context, relatedID int) ([]string, error) } +type URLLoader interface { + GetURLs(ctx context.Context, relatedID int) ([]string, error) +} + // RelatedIDs represents a list of related IDs. // TODO - this can be made generic type RelatedIDs struct { diff --git a/pkg/models/scene.go b/pkg/models/scene.go index 7652545b5b8..4b7a537e5f2 100644 --- a/pkg/models/scene.go +++ b/pkg/models/scene.go @@ -45,6 +45,10 @@ type SceneFilterType struct { Duplicated *PHashDuplicationCriterionInput `json:"duplicated"` // Filter by resolution Resolution *ResolutionCriterionInput `json:"resolution"` + // Filter by video codec + VideoCodec *StringCriterionInput `json:"video_codec"` + // Filter by audio codec + AudioCodec *StringCriterionInput `json:"audio_codec"` // Filter by duration (in seconds) Duration *IntCriterionInput `json:"duration"` // Filter to only include scenes which have markers. 
`true` or `false` @@ -155,6 +159,7 @@ type SceneReader interface { FindByGalleryID(ctx context.Context, performerID int) ([]*Scene, error) FindDuplicates(ctx context.Context, distance int, durationDiff float64) ([][]*Scene, error) + URLLoader GalleryIDLoader PerformerIDLoader TagIDLoader @@ -164,12 +169,16 @@ type SceneReader interface { CountByPerformerID(ctx context.Context, performerID int) (int, error) OCountByPerformerID(ctx context.Context, performerID int) (int, error) + OCount(ctx context.Context) (int, error) // FindByStudioID(studioID int) ([]*Scene, error) FindByMovieID(ctx context.Context, movieID int) ([]*Scene, error) CountByMovieID(ctx context.Context, movieID int) (int, error) Count(ctx context.Context) (int, error) + PlayCount(ctx context.Context) (int, error) + UniqueScenePlayCount(ctx context.Context) (int, error) Size(ctx context.Context) (float64, error) Duration(ctx context.Context) (float64, error) + PlayDuration(ctx context.Context) (float64, error) // SizeCount() (string, error) CountByStudioID(ctx context.Context, studioID int) (int, error) CountByTagID(ctx context.Context, tagID int) (int, error) diff --git a/pkg/models/update.go b/pkg/models/update.go index ffa793bdaad..31d8bd21d07 100644 --- a/pkg/models/update.go +++ b/pkg/models/update.go @@ -110,3 +110,11 @@ type UpdateStrings struct { Values []string `json:"values"` Mode RelationshipUpdateMode `json:"mode"` } + +func (u *UpdateStrings) Strings() []string { + if u == nil { + return nil + } + + return u.Values +} diff --git a/pkg/scene/export.go b/pkg/scene/export.go index 2696adb06c6..5fa3b8b2df5 100644 --- a/pkg/scene/export.go +++ b/pkg/scene/export.go @@ -41,7 +41,7 @@ func ToBasicJSON(ctx context.Context, reader CoverGetter, scene *models.Scene) ( newSceneJSON := jsonschema.Scene{ Title: scene.Title, Code: scene.Code, - URL: scene.URL, + URLs: scene.URLs.List(), Details: scene.Details, Director: scene.Director, CreatedAt: json.JSONTime{Time: scene.CreatedAt}, @@ -86,53 +86,6 @@ func ToBasicJSON(ctx context.Context, reader CoverGetter, scene *models.Scene) ( return &newSceneJSON, nil } -// func getSceneFileJSON(scene *models.Scene) *jsonschema.SceneFile { -// ret := &jsonschema.SceneFile{} - -// TODO -// if scene.FileModTime != nil { -// ret.ModTime = json.JSONTime{Time: *scene.FileModTime} -// } - -// if scene.Size != nil { -// ret.Size = *scene.Size -// } - -// if scene.Duration != nil { -// ret.Duration = getDecimalString(*scene.Duration) -// } - -// if scene.VideoCodec != nil { -// ret.VideoCodec = *scene.VideoCodec -// } - -// if scene.AudioCodec != nil { -// ret.AudioCodec = *scene.AudioCodec -// } - -// if scene.Format != nil { -// ret.Format = *scene.Format -// } - -// if scene.Width != nil { -// ret.Width = *scene.Width -// } - -// if scene.Height != nil { -// ret.Height = *scene.Height -// } - -// if scene.Framerate != nil { -// ret.Framerate = getDecimalString(*scene.Framerate) -// } - -// if scene.Bitrate != nil { -// ret.Bitrate = int(*scene.Bitrate) -// } - -// return ret -// } - // GetStudioName returns the name of the provided scene's studio. It returns an // empty string if there is no studio assigned to the scene. 
func GetStudioName(ctx context.Context, reader studio.Finder, scene *models.Scene) (string, error) { diff --git a/pkg/scene/export_test.go b/pkg/scene/export_test.go index d02109d6eaf..224a30a2d16 100644 --- a/pkg/scene/export_test.go +++ b/pkg/scene/export_test.go @@ -92,7 +92,7 @@ func createFullScene(id int) models.Scene { OCounter: ocounter, Rating: &rating, Organized: organized, - URL: url, + URLs: models.NewRelatedStrings([]string{url}), Files: models.NewRelatedVideoFiles([]*file.VideoFile{ { BaseFile: &file.BaseFile{ @@ -118,6 +118,7 @@ func createEmptyScene(id int) models.Scene { }, }, }), + URLs: models.NewRelatedStrings([]string{}), StashIDs: models.NewRelatedStashIDs([]models.StashID{}), CreatedAt: createTime, UpdatedAt: updateTime, @@ -133,7 +134,7 @@ func createFullJSONScene(image string) *jsonschema.Scene { OCounter: ocounter, Rating: rating, Organized: organized, - URL: url, + URLs: []string{url}, CreatedAt: json.JSONTime{ Time: createTime, }, @@ -149,6 +150,7 @@ func createFullJSONScene(image string) *jsonschema.Scene { func createEmptyJSONScene() *jsonschema.Scene { return &jsonschema.Scene{ + URLs: []string{}, Files: []string{path}, CreatedAt: json.JSONTime{ Time: createTime, diff --git a/pkg/scene/import.go b/pkg/scene/import.go index d90c8c4b913..1c00f801519 100644 --- a/pkg/scene/import.go +++ b/pkg/scene/import.go @@ -80,12 +80,10 @@ func (i *Importer) PreImport(ctx context.Context) error { func (i *Importer) sceneJSONToScene(sceneJSON jsonschema.Scene) models.Scene { newScene := models.Scene{ - // Path: i.Path, Title: sceneJSON.Title, Code: sceneJSON.Code, Details: sceneJSON.Details, Director: sceneJSON.Director, - URL: sceneJSON.URL, PerformerIDs: models.NewRelatedIDs([]int{}), TagIDs: models.NewRelatedIDs([]int{}), GalleryIDs: models.NewRelatedIDs([]int{}), @@ -93,6 +91,12 @@ func (i *Importer) sceneJSONToScene(sceneJSON jsonschema.Scene) models.Scene { StashIDs: models.NewRelatedStashIDs(sceneJSON.StashIDs), } + if len(sceneJSON.URLs) > 0 { + newScene.URLs = models.NewRelatedStrings(sceneJSON.URLs) + } else if sceneJSON.URL != "" { + newScene.URLs = models.NewRelatedStrings([]string{sceneJSON.URL}) + } + if sceneJSON.Date != "" { d := models.NewDate(sceneJSON.Date) newScene.Date = &d diff --git a/pkg/scraper/cache.go b/pkg/scraper/cache.go index 4c40c95c2aa..81607cd4474 100644 --- a/pkg/scraper/cache.go +++ b/pkg/scraper/cache.go @@ -52,6 +52,11 @@ func isCDPPathWS(c GlobalConfig) bool { return strings.HasPrefix(c.GetScraperCDPPath(), "ws://") } +type SceneFinder interface { + scene.IDFinder + models.URLLoader +} + type PerformerFinder interface { match.PerformerAutoTagQueryer match.PerformerFinder @@ -73,7 +78,7 @@ type GalleryFinder interface { } type Repository struct { - SceneFinder scene.IDFinder + SceneFinder SceneFinder GalleryFinder GalleryFinder TagFinder TagFinder PerformerFinder PerformerFinder @@ -240,7 +245,19 @@ func (c Cache) ScrapeName(ctx context.Context, id, query string, ty ScrapeConten return nil, fmt.Errorf("%w: cannot use scraper %s to scrape by name", ErrNotSupported, id) } - return ns.viaName(ctx, c.client, query, ty) + content, err := ns.viaName(ctx, c.client, query, ty) + if err != nil { + return nil, fmt.Errorf("error while name scraping with scraper %s: %w", id, err) + } + + for i, cc := range content { + content[i], err = c.postScrape(ctx, cc) + if err != nil { + return nil, fmt.Errorf("error while post-scraping with scraper %s: %w", id, err) + } + } + + return content, nil } // ScrapeFragment uses the given fragment input to scrape @@ 
-361,7 +378,7 @@ func (c Cache) getScene(ctx context.Context, sceneID int) (*models.Scene, error) return fmt.Errorf("scene with id %d not found", sceneID) } - return nil + return ret.LoadURLs(ctx, c.repository.SceneFinder) }); err != nil { return nil, err } diff --git a/pkg/scraper/postprocessing.go b/pkg/scraper/postprocessing.go index cf8cac1eb34..e2d404d7c19 100644 --- a/pkg/scraper/postprocessing.go +++ b/pkg/scraper/postprocessing.go @@ -106,6 +106,14 @@ func (c Cache) postScrapeScenePerformer(ctx context.Context, p models.ScrapedPer } func (c Cache) postScrapeScene(ctx context.Context, scene ScrapedScene) (ScrapedContent, error) { + // set the URL/URLs field + if scene.URL == nil && len(scene.URLs) > 0 { + scene.URL = &scene.URLs[0] + } + if scene.URL != nil && len(scene.URLs) == 0 { + scene.URLs = []string{*scene.URL} + } + if err := txn.WithReadTxn(ctx, c.txnManager, func(ctx context.Context) error { pqb := c.repository.PerformerFinder mqb := c.repository.MovieFinder diff --git a/pkg/scraper/query_url.go b/pkg/scraper/query_url.go index 0ad4aa7e9e1..49cd08cf717 100644 --- a/pkg/scraper/query_url.go +++ b/pkg/scraper/query_url.go @@ -20,8 +20,8 @@ func queryURLParametersFromScene(scene *models.Scene) queryURLParameters { if scene.Title != "" { ret["title"] = scene.Title } - if scene.URL != "" { - ret["url"] = scene.URL + if len(scene.URLs.List()) > 0 { + ret["url"] = scene.URLs.List()[0] } return ret } @@ -37,7 +37,11 @@ func queryURLParametersFromScrapedScene(scene ScrapedSceneInput) queryURLParamet setField("title", scene.Title) setField("code", scene.Code) - setField("url", scene.URL) + if len(scene.URLs) > 0 { + setField("url", &scene.URLs[0]) + } else { + setField("url", scene.URL) + } setField("date", scene.Date) setField("details", scene.Details) setField("director", scene.Director) diff --git a/pkg/scraper/scene.go b/pkg/scraper/scene.go index 517f2a31899..e5de74a23f1 100644 --- a/pkg/scraper/scene.go +++ b/pkg/scraper/scene.go @@ -5,12 +5,13 @@ import ( ) type ScrapedScene struct { - Title *string `json:"title"` - Code *string `json:"code"` - Details *string `json:"details"` - Director *string `json:"director"` - URL *string `json:"url"` - Date *string `json:"date"` + Title *string `json:"title"` + Code *string `json:"code"` + Details *string `json:"details"` + Director *string `json:"director"` + URL *string `json:"url"` + URLs []string `json:"urls"` + Date *string `json:"date"` // This should be a base64 encoded data URL Image *string `json:"image"` File *models.SceneFileType `json:"file"` @@ -26,11 +27,12 @@ type ScrapedScene struct { func (ScrapedScene) IsScrapedContent() {} type ScrapedSceneInput struct { - Title *string `json:"title"` - Code *string `json:"code"` - Details *string `json:"details"` - Director *string `json:"director"` - URL *string `json:"url"` - Date *string `json:"date"` - RemoteSiteID *string `json:"remote_site_id"` + Title *string `json:"title"` + Code *string `json:"code"` + Details *string `json:"details"` + Director *string `json:"director"` + URL *string `json:"url"` + URLs []string `json:"urls"` + Date *string `json:"date"` + RemoteSiteID *string `json:"remote_site_id"` } diff --git a/pkg/scraper/stash.go b/pkg/scraper/stash.go index 652a9de0ac4..f616789c407 100644 --- a/pkg/scraper/stash.go +++ b/pkg/scraper/stash.go @@ -328,7 +328,7 @@ func sceneToUpdateInput(scene *models.Scene) models.SceneUpdateInput { ID: strconv.Itoa(scene.ID), Title: &title, Details: &scene.Details, - URL: &scene.URL, + Urls: scene.URLs.List(), Date: 
dateToStringPtr(scene.Date), } } diff --git a/pkg/scraper/stashbox/stash_box.go b/pkg/scraper/stashbox/stash_box.go index 65176bbeaf6..54f11363847 100644 --- a/pkg/scraper/stashbox/stash_box.go +++ b/pkg/scraper/stashbox/stash_box.go @@ -684,6 +684,7 @@ func getFingerprints(scene *graphql.SceneFragment) []*models.StashBoxFingerprint func (c Client) sceneFragmentToScrapedScene(ctx context.Context, s *graphql.SceneFragment) (*scraper.ScrapedScene, error) { stashID := s.ID + ss := &scraper.ScrapedScene{ Title: s.Title, Code: s.Code, @@ -698,6 +699,14 @@ func (c Client) sceneFragmentToScrapedScene(ctx context.Context, s *graphql.Scen // stash_id } + for _, u := range s.Urls { + ss.URLs = append(ss.URLs, u.URL) + } + + if len(ss.URLs) > 0 { + ss.URL = &ss.URLs[0] + } + if len(s.Images) > 0 { // TODO - #454 code sorts images by aspect ratio according to a wanted // orientation. I'm just grabbing the first for now @@ -722,6 +731,9 @@ func (c Client) sceneFragmentToScrapedScene(ctx context.Context, s *graphql.Scen URL: findURL(s.Studio.Urls, "HOME"), RemoteSiteID: &studioID, } + if s.Studio.Images != nil && len(s.Studio.Images) > 0 { + ss.Studio.Image = &s.Studio.Images[0].URL + } err := match.ScrapedStudio(ctx, c.repository.Studio, ss.Studio, &c.box.Endpoint) if err != nil { @@ -820,8 +832,9 @@ func (c Client) SubmitSceneDraft(ctx context.Context, scene *models.Scene, endpo if scene.Director != "" { draft.Director = &scene.Director } - if scene.URL != "" && len(strings.TrimSpace(scene.URL)) > 0 { - url := strings.TrimSpace(scene.URL) + // TODO - draft does not accept multiple URLs. Use single URL for now. + if len(scene.URLs.List()) > 0 { + url := strings.TrimSpace(scene.URLs.List()[0]) draft.URL = &url } if scene.Date != nil { diff --git a/pkg/sliceutil/collections.go b/pkg/sliceutil/collections.go index 5a271268cb8..454038e170c 100644 --- a/pkg/sliceutil/collections.go +++ b/pkg/sliceutil/collections.go @@ -2,6 +2,53 @@ package sliceutil import "reflect" +// Exclude removes all instances of any value in toExclude from the vs +// slice. It returns the new or unchanged slice. +func Exclude[T comparable](vs []T, toExclude []T) []T { + var ret []T + for _, v := range vs { + if !Include(toExclude, v) { + ret = append(ret, v) + } + } + + return ret +} + +func Index[T comparable](vs []T, t T) int { + for i, v := range vs { + if v == t { + return i + } + } + return -1 +} + +func Include[T comparable](vs []T, t T) bool { + return Index(vs, t) >= 0 +} + +// IntAppendUnique appends toAdd to the vs int slice if toAdd does not already +// exist in the slice. It returns the new or unchanged int slice. +func AppendUnique[T comparable](vs []T, toAdd T) []T { + if Include(vs, toAdd) { + return vs + } + + return append(vs, toAdd) +} + +// IntAppendUniques appends a slice of values to the vs slice. It only +// appends values that do not already exist in the slice. It returns the new or +// unchanged slice. +func AppendUniques[T comparable](vs []T, toAdd []T) []T { + for _, v := range toAdd { + vs = AppendUnique(vs, v) + } + + return vs +} + // SliceSame returns true if the two provided lists have the same elements, // regardless of order. Panics if either parameter is not a slice. 
func SliceSame(a, b interface{}) bool { diff --git a/pkg/sqlite/anonymise.go b/pkg/sqlite/anonymise.go index c16d1160d8a..a62b8d3c5ec 100644 --- a/pkg/sqlite/anonymise.go +++ b/pkg/sqlite/anonymise.go @@ -230,7 +230,6 @@ func (db *Anonymiser) anonymiseScenes(ctx context.Context) error { table.Col(idColumn), table.Col("title"), table.Col("details"), - table.Col("url"), table.Col("code"), table.Col("director"), ).Where(table.Col(idColumn).Gt(lastID)).Limit(1000) @@ -243,7 +242,6 @@ func (db *Anonymiser) anonymiseScenes(ctx context.Context) error { id int title sql.NullString details sql.NullString - url sql.NullString code sql.NullString director sql.NullString ) @@ -252,7 +250,6 @@ func (db *Anonymiser) anonymiseScenes(ctx context.Context) error { &id, &title, &details, - &url, &code, &director, ); err != nil { @@ -264,7 +261,6 @@ func (db *Anonymiser) anonymiseScenes(ctx context.Context) error { // if title set set new title db.obfuscateNullString(set, "title", title) db.obfuscateNullString(set, "details", details) - db.obfuscateNullString(set, "url", url) if len(set) > 0 { stmt := dialect.Update(table).Set(set).Where(table.Col(idColumn).Eq(id)) @@ -301,6 +297,10 @@ func (db *Anonymiser) anonymiseScenes(ctx context.Context) error { } } + if err := db.anonymiseURLs(ctx, goqu.T(scenesURLsTable), "scene_id"); err != nil { + return err + } + return nil } @@ -704,6 +704,68 @@ func (db *Anonymiser) anonymiseAliases(ctx context.Context, table exp.Identifier return nil } +func (db *Anonymiser) anonymiseURLs(ctx context.Context, table exp.IdentifierExpression, idColumn string) error { + lastID := 0 + lastURL := "" + total := 0 + const logEvery = 10000 + + for gotSome := true; gotSome; { + if err := txn.WithTxn(ctx, db, func(ctx context.Context) error { + query := dialect.From(table).Select( + table.Col(idColumn), + table.Col("url"), + ).Where(goqu.L("(" + idColumn + ", url)").Gt(goqu.L("(?, ?)", lastID, lastURL))).Limit(1000) + + gotSome = false + + const single = false + return queryFunc(ctx, query, single, func(rows *sqlx.Rows) error { + var ( + id int + url sql.NullString + ) + + if err := rows.Scan( + &id, + &url, + ); err != nil { + return err + } + + set := goqu.Record{} + db.obfuscateNullString(set, "url", url) + + if len(set) > 0 { + stmt := dialect.Update(table).Set(set).Where( + table.Col(idColumn).Eq(id), + table.Col("url").Eq(url), + ) + + if _, err := exec(ctx, stmt); err != nil { + return fmt.Errorf("anonymising %s: %w", table.GetTable(), err) + } + } + + lastID = id + lastURL = url.String + gotSome = true + total++ + + if total%logEvery == 0 { + logger.Infof("Anonymised %d %s URLs", total, table.GetTable()) + } + + return nil + }) + }); err != nil { + return err + } + } + + return nil +} + func (db *Anonymiser) anonymiseTags(ctx context.Context) error { logger.Infof("Anonymising tags") table := tagTableMgr.table diff --git a/pkg/sqlite/file.go b/pkg/sqlite/file.go index 87834a2df11..760a7746558 100644 --- a/pkg/sqlite/file.go +++ b/pkg/sqlite/file.go @@ -5,8 +5,10 @@ import ( "database/sql" "errors" "fmt" + "io/fs" "path/filepath" "strings" + "time" "github.com/doug-martin/goqu/v9" "github.com/doug-martin/goqu/v9/exp" @@ -713,6 +715,31 @@ func (qb *FileStore) FindByZipFileID(ctx context.Context, zipFileID file.ID) ([] return qb.getMany(ctx, q) } +// FindByFileInfo finds files that match the base name, size, and mod time of the given file. 
+func (qb *FileStore) FindByFileInfo(ctx context.Context, info fs.FileInfo, size int64) ([]file.File, error) { + table := qb.table() + + modTime := info.ModTime().Format(time.RFC3339) + + q := qb.selectDataset().Prepared(true).Where( + table.Col("basename").Eq(info.Name()), + table.Col("size").Eq(size), + table.Col("mod_time").Eq(modTime), + ) + + return qb.getMany(ctx, q) +} + +func (qb *FileStore) CountByFolderID(ctx context.Context, folderID file.FolderID) (int, error) { + table := qb.table() + + q := qb.countDataset().Prepared(true).Where( + table.Col("parent_folder_id").Eq(folderID), + ) + + return count(ctx, q) +} + func (qb *FileStore) IsPrimary(ctx context.Context, fileID file.ID) (bool, error) { joinTables := []exp.IdentifierExpression{ scenesFilesJoinTable, diff --git a/pkg/sqlite/migrations/47_scene_urls.up.sql b/pkg/sqlite/migrations/47_scene_urls.up.sql new file mode 100644 index 00000000000..1334ffe2a10 --- /dev/null +++ b/pkg/sqlite/migrations/47_scene_urls.up.sql @@ -0,0 +1,94 @@ +PRAGMA foreign_keys=OFF; + +CREATE TABLE `scene_urls` ( + `scene_id` integer NOT NULL, + `position` integer NOT NULL, + `url` varchar(255) NOT NULL, + foreign key(`scene_id`) references `scenes`(`id`) on delete CASCADE, + PRIMARY KEY(`scene_id`, `position`, `url`) +); + +CREATE INDEX `scene_urls_url` on `scene_urls` (`url`); + +-- drop url +CREATE TABLE "scenes_new" ( + `id` integer not null primary key autoincrement, + `title` varchar(255), + `details` text, + `date` date, + `rating` tinyint, + `studio_id` integer, + `o_counter` tinyint not null default 0, + `organized` boolean not null default '0', + `created_at` datetime not null, + `updated_at` datetime not null, + `code` text, + `director` text, + `resume_time` float not null default 0, + `last_played_at` datetime default null, + `play_count` tinyint not null default 0, + `play_duration` float not null default 0, + `cover_blob` varchar(255) REFERENCES `blobs`(`checksum`), + foreign key(`studio_id`) references `studios`(`id`) on delete SET NULL +); + +INSERT INTO `scenes_new` + ( + `id`, + `title`, + `details`, + `date`, + `rating`, + `studio_id`, + `o_counter`, + `organized`, + `created_at`, + `updated_at`, + `code`, + `director`, + `resume_time`, + `last_played_at`, + `play_count`, + `play_duration`, + `cover_blob` + ) + SELECT + `id`, + `title`, + `details`, + `date`, + `rating`, + `studio_id`, + `o_counter`, + `organized`, + `created_at`, + `updated_at`, + `code`, + `director`, + `resume_time`, + `last_played_at`, + `play_count`, + `play_duration`, + `cover_blob` + FROM `scenes`; + +INSERT INTO `scene_urls` + ( + `scene_id`, + `position`, + `url` + ) + SELECT + `id`, + '0', + `url` + FROM `scenes` + WHERE `scenes`.`url` IS NOT NULL AND `scenes`.`url` != ''; + +DROP INDEX `index_scenes_on_studio_id`; +DROP TABLE `scenes`; +ALTER TABLE `scenes_new` rename to `scenes`; + +CREATE INDEX `index_scenes_on_studio_id` on `scenes` (`studio_id`); + +PRAGMA foreign_keys=ON; diff --git a/pkg/sqlite/scene.go b/pkg/sqlite/scene.go index 0bfa6157f21..ef0ef928e7d 100644 --- a/pkg/sqlite/scene.go +++ b/pkg/sqlite/scene.go @@ -9,6 +9,7 @@ import ( "sort" "strconv" "strings" + "time" "github.com/doug-martin/goqu/v9" "github.com/doug-martin/goqu/v9/exp" @@ -30,6 +31,8 @@ const ( scenesTagsTable = "scenes_tags" scenesGalleriesTable = "scenes_galleries" moviesScenesTable = "movies_scenes" + scenesURLsTable = "scene_urls" + sceneURLColumn = "url" sceneCoverBlobColumn = "cover_blob" ) @@ -75,7 +78,6 @@ type sceneRow struct { Code zero.String `db:"code"` Details 
zero.String `db:"details"` Director zero.String `db:"director"` - URL zero.String `db:"url"` Date NullDate `db:"date"` // expressed as 1-100 Rating null.Int `db:"rating"` @@ -99,7 +101,6 @@ func (r *sceneRow) fromScene(o models.Scene) { r.Code = zero.StringFrom(o.Code) r.Details = zero.StringFrom(o.Details) r.Director = zero.StringFrom(o.Director) - r.URL = zero.StringFrom(o.URL) r.Date = NullDateFromDatePtr(o.Date) r.Rating = intFromPtr(o.Rating) r.Organized = o.Organized @@ -129,7 +130,6 @@ func (r *sceneQueryRow) resolve() *models.Scene { Code: r.Code.String, Details: r.Details.String, Director: r.Director.String, - URL: r.URL.String, Date: r.Date.DatePtr(), Rating: nullIntPtr(r.Rating), Organized: r.Organized, @@ -165,7 +165,6 @@ func (r *sceneRowRecord) fromPartial(o models.ScenePartial) { r.setNullString("code", o.Code) r.setNullString("details", o.Details) r.setNullString("director", o.Director) - r.setNullString("url", o.URL) r.setNullDate("date", o.Date) r.setNullInt("rating", o.Rating) r.setBool("organized", o.Organized) @@ -269,6 +268,13 @@ func (qb *SceneStore) Create(ctx context.Context, newObject *models.Scene, fileI } } + if newObject.URLs.Loaded() { + const startPos = 0 + if err := scenesURLsTableMgr.insertJoins(ctx, id, startPos, newObject.URLs.List()); err != nil { + return err + } + } + if newObject.PerformerIDs.Loaded() { if err := scenesPerformersTableMgr.insertJoins(ctx, id, newObject.PerformerIDs.List()); err != nil { return err @@ -323,6 +329,11 @@ func (qb *SceneStore) UpdatePartial(ctx context.Context, id int, partial models. } } + if partial.URLs != nil { + if err := scenesURLsTableMgr.modifyJoins(ctx, id, partial.URLs.Values, partial.URLs.Mode); err != nil { + return nil, err + } + } if partial.PerformerIDs != nil { if err := scenesPerformersTableMgr.modifyJoins(ctx, id, partial.PerformerIDs.IDs, partial.PerformerIDs.Mode); err != nil { return nil, err @@ -365,6 +376,12 @@ func (qb *SceneStore) Update(ctx context.Context, updatedObject *models.Scene) e return err } + if updatedObject.URLs.Loaded() { + if err := scenesURLsTableMgr.replaceJoins(ctx, updatedObject.ID, updatedObject.URLs.List()); err != nil { + return err + } + } + if updatedObject.PerformerIDs.Loaded() { if err := scenesPerformersTableMgr.replaceJoins(ctx, updatedObject.ID, updatedObject.PerformerIDs.List()); err != nil { return err @@ -706,6 +723,18 @@ func (qb *SceneStore) OCountByPerformerID(ctx context.Context, performerID int) return ret, nil } +func (qb *SceneStore) OCount(ctx context.Context) (int, error) { + table := qb.table() + + q := dialect.Select(goqu.COALESCE(goqu.SUM("o_counter"), 0)).From(table) + var ret int + if err := querySimple(ctx, q, &ret); err != nil { + return 0, err + } + + return ret, nil +} + func (qb *SceneStore) FindByMovieID(ctx context.Context, movieID int) ([]*models.Scene, error) { sq := dialect.From(scenesMoviesJoinTable).Select(scenesMoviesJoinTable.Col(sceneIDColumn)).Where( scenesMoviesJoinTable.Col(movieIDColumn).Eq(movieID), @@ -731,6 +760,24 @@ func (qb *SceneStore) Count(ctx context.Context) (int, error) { return count(ctx, q) } +func (qb *SceneStore) PlayCount(ctx context.Context) (int, error) { + q := dialect.Select(goqu.COALESCE(goqu.SUM("play_count"), 0)).From(qb.table()) + + var ret int + if err := querySimple(ctx, q, &ret); err != nil { + return 0, err + } + + return ret, nil +} + +func (qb *SceneStore) UniqueScenePlayCount(ctx context.Context) (int, error) { + table := qb.table() + q := 
dialect.Select(goqu.COUNT("*")).From(table).Where(table.Col("play_count").Gt(0)) + + return count(ctx, q) +} + func (qb *SceneStore) Size(ctx context.Context) (float64, error) { table := qb.table() fileTable := fileTableMgr.table @@ -772,6 +819,19 @@ func (qb *SceneStore) Duration(ctx context.Context) (float64, error) { return ret, nil } +func (qb *SceneStore) PlayDuration(ctx context.Context) (float64, error) { + table := qb.table() + + q := dialect.Select(goqu.COALESCE(goqu.SUM("play_duration"), 0)).From(table) + + var ret float64 + if err := querySimple(ctx, q, &ret); err != nil { + return 0, err + } + + return ret, nil +} + func (qb *SceneStore) CountByStudioID(ctx context.Context, studioID int) (int, error) { table := qb.table() @@ -927,9 +987,12 @@ func (qb *SceneStore) makeFilter(ctx context.Context, sceneFilter *models.SceneF query.handleCriterion(ctx, floatIntCriterionHandler(sceneFilter.Duration, "video_files.duration", qb.addVideoFilesTable)) query.handleCriterion(ctx, resolutionCriterionHandler(sceneFilter.Resolution, "video_files.height", "video_files.width", qb.addVideoFilesTable)) + query.handleCriterion(ctx, codecCriterionHandler(sceneFilter.VideoCodec, "video_files.video_codec", qb.addVideoFilesTable)) + query.handleCriterion(ctx, codecCriterionHandler(sceneFilter.AudioCodec, "video_files.audio_codec", qb.addVideoFilesTable)) + query.handleCriterion(ctx, hasMarkersCriterionHandler(sceneFilter.HasMarkers)) query.handleCriterion(ctx, sceneIsMissingCriterionHandler(qb, sceneFilter.IsMissing)) - query.handleCriterion(ctx, stringCriterionHandler(sceneFilter.URL, "scenes.url")) + query.handleCriterion(ctx, sceneURLsCriterionHandler(sceneFilter.URL)) query.handleCriterion(ctx, criterionHandlerFunc(func(ctx context.Context, f *filterBuilder) { if sceneFilter.StashID != nil { @@ -1202,6 +1265,18 @@ func resolutionCriterionHandler(resolution *models.ResolutionCriterionInput, hei } } +func codecCriterionHandler(codec *models.StringCriterionInput, codecColumn string, addJoinFn func(f *filterBuilder)) criterionHandlerFunc { + return func(ctx context.Context, f *filterBuilder) { + if codec != nil { + if addJoinFn != nil { + addJoinFn(f) + } + + stringCriterionHandler(codec, codecColumn)(ctx, f) + } + } +} + func hasMarkersCriterionHandler(hasMarkers *string) criterionHandlerFunc { return func(ctx context.Context, f *filterBuilder) { if hasMarkers != nil { @@ -1251,6 +1326,18 @@ func sceneIsMissingCriterionHandler(qb *SceneStore, isMissing *string) criterion } } +func sceneURLsCriterionHandler(url *models.StringCriterionInput) criterionHandlerFunc { + h := stringListCriterionHandlerBuilder{ + joinTable: scenesURLsTable, + stringColumn: sceneURLColumn, + addJoinTable: func(f *filterBuilder) { + scenesURLsTableMgr.join(f, "", "scenes.id") + }, + } + + return h.handler(url) +} + func (qb *SceneStore) getMultiCriterionHandlerBuilder(foreignTable, joinTable, foreignFK string, addJoinsFunc func(f *filterBuilder)) multiCriterionHandlerBuilder { return multiCriterionHandlerBuilder{ primaryTable: sceneTable, @@ -1548,6 +1635,25 @@ func (qb *SceneStore) SaveActivity(ctx context.Context, id int, resumeTime *floa return true, nil } +func (qb *SceneStore) IncrementWatchCount(ctx context.Context, id int) (int, error) { + if err := qb.tableMgr.checkIDExists(ctx, id); err != nil { + return 0, err + } + + if err := qb.tableMgr.updateByID(ctx, id, goqu.Record{ + "play_count": goqu.L("play_count + 1"), + "last_played_at": time.Now(), + }); err != nil { + return 0, err + } + + return qb.getPlayCount(ctx, 
id) +} + +func (qb *SceneStore) GetURLs(ctx context.Context, sceneID int) ([]string, error) { + return scenesURLsTableMgr.get(ctx, sceneID) +} + func (qb *SceneStore) GetCover(ctx context.Context, sceneID int) ([]byte, error) { return qb.GetImage(ctx, sceneID, sceneCoverBlobColumn) } diff --git a/pkg/sqlite/scene_test.go b/pkg/sqlite/scene_test.go index 50691437dbb..11085a0c182 100644 --- a/pkg/sqlite/scene_test.go +++ b/pkg/sqlite/scene_test.go @@ -21,6 +21,12 @@ import ( ) func loadSceneRelationships(ctx context.Context, expected models.Scene, actual *models.Scene) error { + if expected.URLs.Loaded() { + if err := actual.LoadURLs(ctx, db.Scene); err != nil { + return err + } + } + if expected.GalleryIDs.Loaded() { if err := actual.LoadGalleryIDs(ctx, db.Scene); err != nil { return err @@ -108,7 +114,7 @@ func Test_sceneQueryBuilder_Create(t *testing.T) { Code: code, Details: details, Director: director, - URL: url, + URLs: models.NewRelatedStrings([]string{url}), Date: &date, Rating: &rating, Organized: true, @@ -153,7 +159,7 @@ func Test_sceneQueryBuilder_Create(t *testing.T) { Code: code, Details: details, Director: director, - URL: url, + URLs: models.NewRelatedStrings([]string{url}), Date: &date, Rating: &rating, Organized: true, @@ -346,7 +352,7 @@ func Test_sceneQueryBuilder_Update(t *testing.T) { Code: code, Details: details, Director: director, - URL: url, + URLs: models.NewRelatedStrings([]string{url}), Date: &date, Rating: &rating, Organized: true, @@ -513,7 +519,7 @@ func clearScenePartial() models.ScenePartial { Code: models.OptionalString{Set: true, Null: true}, Details: models.OptionalString{Set: true, Null: true}, Director: models.OptionalString{Set: true, Null: true}, - URL: models.OptionalString{Set: true, Null: true}, + URLs: &models.UpdateStrings{Mode: models.RelationshipUpdateModeSet}, Date: models.OptionalDate{Set: true, Null: true}, Rating: models.OptionalInt{Set: true, Null: true}, StudioID: models.OptionalInt{Set: true, Null: true}, @@ -560,11 +566,14 @@ func Test_sceneQueryBuilder_UpdatePartial(t *testing.T) { "full", sceneIDs[sceneIdxWithSpacedName], models.ScenePartial{ - Title: models.NewOptionalString(title), - Code: models.NewOptionalString(code), - Details: models.NewOptionalString(details), - Director: models.NewOptionalString(director), - URL: models.NewOptionalString(url), + Title: models.NewOptionalString(title), + Code: models.NewOptionalString(code), + Details: models.NewOptionalString(details), + Director: models.NewOptionalString(director), + URLs: &models.UpdateStrings{ + Values: []string{url}, + Mode: models.RelationshipUpdateModeSet, + }, Date: models.NewOptionalDate(date), Rating: models.NewOptionalInt(rating), Organized: models.NewOptionalBool(true), @@ -624,7 +633,7 @@ func Test_sceneQueryBuilder_UpdatePartial(t *testing.T) { Code: code, Details: details, Director: director, - URL: url, + URLs: models.NewRelatedStrings([]string{url}), Date: &date, Rating: &rating, Organized: true, @@ -2400,7 +2409,14 @@ func TestSceneQueryURL(t *testing.T) { verifyFn := func(s *models.Scene) { t.Helper() - verifyString(t, s.URL, urlCriterion) + + urls := s.URLs.List() + var url string + if len(urls) > 0 { + url = urls[0] + } + + verifyString(t, url, urlCriterion) } verifySceneQuery(t, filter, verifyFn) @@ -2576,6 +2592,12 @@ func verifySceneQuery(t *testing.T, filter models.SceneFilterType, verifyFn func scenes := queryScene(ctx, t, sqb, &filter, nil) + for _, scene := range scenes { + if err := scene.LoadRelationships(ctx, sqb); err != nil { + t.Errorf("Error 
loading scene relationships: %v", err) + } + } + // assume it should find at least one assert.Greater(t, len(scenes), 0) diff --git a/pkg/sqlite/setup_test.go b/pkg/sqlite/setup_test.go index e5b56efadf7..d869a35bafc 100644 --- a/pkg/sqlite/setup_test.go +++ b/pkg/sqlite/setup_test.go @@ -1065,9 +1065,11 @@ func makeScene(i int) *models.Scene { rating := getRating(i) return &models.Scene{ - Title: title, - Details: details, - URL: getSceneEmptyString(i, urlField), + Title: title, + Details: details, + URLs: models.NewRelatedStrings([]string{ + getSceneEmptyString(i, urlField), + }), Rating: getIntPtr(rating), OCounter: getOCounter(i), Date: getObjectDate(i), diff --git a/pkg/sqlite/table.go b/pkg/sqlite/table.go index 3980968f2d3..3cd9c37ef96 100644 --- a/pkg/sqlite/table.go +++ b/pkg/sqlite/table.go @@ -14,6 +14,7 @@ import ( "github.com/stashapp/stash/pkg/file" "github.com/stashapp/stash/pkg/logger" "github.com/stashapp/stash/pkg/models" + "github.com/stashapp/stash/pkg/sliceutil" "github.com/stashapp/stash/pkg/sliceutil/intslice" "github.com/stashapp/stash/pkg/sliceutil/stringslice" ) @@ -534,6 +535,113 @@ func (t *stringTable) modifyJoins(ctx context.Context, id int, v []string, mode return nil } +type orderedValueTable[T comparable] struct { + table + valueColumn exp.IdentifierExpression +} + +func (t *orderedValueTable[T]) positionColumn() exp.IdentifierExpression { + const positionColumn = "position" + return t.table.table.Col(positionColumn) +} + +func (t *orderedValueTable[T]) get(ctx context.Context, id int) ([]T, error) { + q := dialect.Select(t.valueColumn).From(t.table.table).Where(t.idColumn.Eq(id)).Order(t.positionColumn().Asc()) + + const single = false + var ret []T + if err := queryFunc(ctx, q, single, func(rows *sqlx.Rows) error { + var v T + if err := rows.Scan(&v); err != nil { + return err + } + + ret = append(ret, v) + + return nil + }); err != nil { + return nil, fmt.Errorf("getting stash ids from %s: %w", t.table.table.GetTable(), err) + } + + return ret, nil +} + +func (t *orderedValueTable[T]) insertJoin(ctx context.Context, id int, position int, v T) (sql.Result, error) { + q := dialect.Insert(t.table.table).Cols(t.idColumn.GetCol(), t.positionColumn().GetCol(), t.valueColumn.GetCol()).Vals( + goqu.Vals{id, position, v}, + ) + ret, err := exec(ctx, q) + if err != nil { + return nil, fmt.Errorf("inserting into %s: %w", t.table.table.GetTable(), err) + } + + return ret, nil +} + +func (t *orderedValueTable[T]) insertJoins(ctx context.Context, id int, startPos int, v []T) error { + for i, fk := range v { + if _, err := t.insertJoin(ctx, id, i+startPos, fk); err != nil { + return err + } + } + + return nil +} + +func (t *orderedValueTable[T]) replaceJoins(ctx context.Context, id int, v []T) error { + if err := t.destroy(ctx, []int{id}); err != nil { + return err + } + + const startPos = 0 + return t.insertJoins(ctx, id, startPos, v) +} + +func (t *orderedValueTable[T]) addJoins(ctx context.Context, id int, v []T) error { + // get existing foreign keys + existing, err := t.get(ctx, id) + if err != nil { + return err + } + + // only add values that are not already present + filtered := sliceutil.Exclude(v, existing) + + if len(filtered) == 0 { + return nil + } + + startPos := len(existing) + return t.insertJoins(ctx, id, startPos, filtered) +} + +func (t *orderedValueTable[T]) destroyJoins(ctx context.Context, id int, v []T) error { + existing, err := t.get(ctx, id) + if err != nil { + return fmt.Errorf("getting existing %s: %w", t.table.table.GetTable(), err) + } + + 
newValue := sliceutil.Exclude(existing, v) + if len(newValue) == len(existing) { + return nil + } + + return t.replaceJoins(ctx, id, newValue) +} + +func (t *orderedValueTable[T]) modifyJoins(ctx context.Context, id int, v []T, mode models.RelationshipUpdateMode) error { + switch mode { + case models.RelationshipUpdateModeSet: + return t.replaceJoins(ctx, id, v) + case models.RelationshipUpdateModeAdd: + return t.addJoins(ctx, id, v) + case models.RelationshipUpdateModeRemove: + return t.destroyJoins(ctx, id, v) + } + + return nil +} + type scenesMoviesTable struct { table } diff --git a/pkg/sqlite/tables.go b/pkg/sqlite/tables.go index 1cc9ac4e732..e93d38a8b92 100644 --- a/pkg/sqlite/tables.go +++ b/pkg/sqlite/tables.go @@ -24,6 +24,7 @@ var ( scenesPerformersJoinTable = goqu.T(performersScenesTable) scenesStashIDsJoinTable = goqu.T("scene_stash_ids") scenesMoviesJoinTable = goqu.T(moviesScenesTable) + scenesURLsJoinTable = goqu.T(scenesURLsTable) performersAliasesJoinTable = goqu.T(performersAliasesTable) performersTagsJoinTable = goqu.T(performersTagsTable) @@ -170,6 +171,14 @@ var ( idColumn: scenesMoviesJoinTable.Col(sceneIDColumn), }, } + + scenesURLsTableMgr = &orderedValueTable[string]{ + table: table{ + table: scenesURLsJoinTable, + idColumn: scenesURLsJoinTable.Col(sceneIDColumn), + }, + valueColumn: scenesURLsJoinTable.Col(sceneURLColumn), + } ) var ( diff --git a/ui/v2.5/package.json b/ui/v2.5/package.json index 5beca06f01e..dd59c944c0b 100644 --- a/ui/v2.5/package.json +++ b/ui/v2.5/package.json @@ -112,7 +112,7 @@ "postcss-scss": "^4.0.6", "prettier": "^2.8.4", "sass": "^1.58.1", - "stylelint": "^15.1.0", + "stylelint": "^15.10.1", "stylelint-order": "^6.0.2", "terser": "^5.9.0", "ts-node": "^10.9.1", diff --git a/ui/v2.5/src/components/Dialogs/IdentifyDialog/FieldOptions.tsx b/ui/v2.5/src/components/Dialogs/IdentifyDialog/FieldOptions.tsx index 68a31fe6b32..79fe2b6c1d9 100644 --- a/ui/v2.5/src/components/Dialogs/IdentifyDialog/FieldOptions.tsx +++ b/ui/v2.5/src/components/Dialogs/IdentifyDialog/FieldOptions.tsx @@ -311,7 +311,7 @@ export const FieldOptionsList: React.FC = ({ } return ( - +
diff --git a/ui/v2.5/src/components/Dialogs/IdentifyDialog/IdentifyDialog.tsx b/ui/v2.5/src/components/Dialogs/IdentifyDialog/IdentifyDialog.tsx index 7c5207f4403..ece7589dcfb 100644 --- a/ui/v2.5/src/components/Dialogs/IdentifyDialog/IdentifyDialog.tsx +++ b/ui/v2.5/src/components/Dialogs/IdentifyDialog/IdentifyDialog.tsx @@ -50,6 +50,10 @@ export const IdentifyDialog: React.FC = ({ includeMalePerformers: true, setCoverImage: true, setOrganized: false, + skipMultipleMatches: true, + skipMultipleMatchTag: undefined, + skipSingleNamePerformers: true, + skipSingleNamePerformerTag: undefined, }; } @@ -240,6 +244,8 @@ export const IdentifyDialog: React.FC = ({ const autoTagCopy = { ...autoTag }; autoTagCopy.options = { setOrganized: false, + skipMultipleMatches: true, + skipSingleNamePerformers: true, }; newSources.push(autoTagCopy); } diff --git a/ui/v2.5/src/components/Dialogs/IdentifyDialog/Options.tsx b/ui/v2.5/src/components/Dialogs/IdentifyDialog/Options.tsx index 88655c860e6..0bc31e6ae47 100644 --- a/ui/v2.5/src/components/Dialogs/IdentifyDialog/Options.tsx +++ b/ui/v2.5/src/components/Dialogs/IdentifyDialog/Options.tsx @@ -1,10 +1,11 @@ import React from "react"; -import { Form } from "react-bootstrap"; +import { Col, Form, Row } from "react-bootstrap"; import * as GQL from "src/core/generated-graphql"; import { FormattedMessage, useIntl } from "react-intl"; import { IScraperSource } from "./constants"; import { FieldOptionsList } from "./FieldOptions"; import { ThreeStateBoolean } from "./ThreeStateBoolean"; +import { TagSelect } from "src/components/Shared/Select"; interface IOptionsEditor { options: GQL.IdentifyMetadataOptionsInput; @@ -35,8 +36,76 @@ export const OptionsEditor: React.FC = ({ indeterminateClassname: "text-muted", }; + function maybeRenderMultipleMatchesTag() { + if (!options.skipMultipleMatches) { + return; + } + + return ( + + + + + + + setOptions({ + skipMultipleMatchTag: tags[0]?.id, + }) + } + ids={ + options.skipMultipleMatchTag ? [options.skipMultipleMatchTag] : [] + } + noSelectionString="Select/create tag..." + /> + + + ); + } + + function maybeRenderPerformersTag() { + if (!options.skipSingleNamePerformers) { + return; + } + + return ( + + + + + + + setOptions({ + skipSingleNamePerformerTag: tags[0]?.id, + }) + } + ids={ + options.skipSingleNamePerformerTag + ? [options.skipSingleNamePerformerTag] + : [] + } + noSelectionString="Select/create tag..." + /> + + + ); + } + return ( - +
= ({ )} - + = ({ {...checkboxProps} /> + + setOptions({ + skipMultipleMatches: v, + }) + } + label={intl.formatMessage({ + id: "config.tasks.identify.skip_multiple_matches", + })} + defaultValue={defaultOptions?.skipMultipleMatches ?? undefined} + tooltip={intl.formatMessage({ + id: "config.tasks.identify.skip_multiple_matches_tooltip", + })} + {...checkboxProps} + /> + {maybeRenderMultipleMatchesTag()} + + setOptions({ + skipSingleNamePerformers: v, + }) + } + label={intl.formatMessage({ + id: "config.tasks.identify.skip_single_name_performers", + })} + defaultValue={defaultOptions?.skipSingleNamePerformers ?? undefined} + tooltip={intl.formatMessage({ + id: "config.tasks.identify.skip_single_name_performers_tooltip", + })} + {...checkboxProps} + /> + {maybeRenderPerformersTag()} = ({ @@ -20,6 +21,7 @@ export const ThreeStateBoolean: React.FC = ({ label, disabled, defaultValue, + tooltip, }) => { const intl = useIntl(); @@ -31,6 +33,7 @@ export const ThreeStateBoolean: React.FC = ({ checked={value} label={label} onChange={() => setValue(!value)} + title={tooltip} /> ); } @@ -79,7 +82,7 @@ export const ThreeStateBoolean: React.FC = ({ return ( -
      <h6>{label}</h6>
+      <h6 title={tooltip}>{label}</h6>
{renderModeButton(undefined)} {renderModeButton(false)} diff --git a/ui/v2.5/src/components/Dialogs/styles.scss b/ui/v2.5/src/components/Dialogs/styles.scss index 36c350c2d75..dfd8b555fc1 100644 --- a/ui/v2.5/src/components/Dialogs/styles.scss +++ b/ui/v2.5/src/components/Dialogs/styles.scss @@ -6,3 +6,13 @@ justify-content: space-between; } } + +.form-group { + h6, + label { + &[title]:not([title=""]) { + cursor: help; + text-decoration: underline dotted; + } + } +} diff --git a/ui/v2.5/src/components/ScenePlayer/ScenePlayer.tsx b/ui/v2.5/src/components/ScenePlayer/ScenePlayer.tsx index 249111b92b4..8797a38bec3 100644 --- a/ui/v2.5/src/components/ScenePlayer/ScenePlayer.tsx +++ b/ui/v2.5/src/components/ScenePlayer/ScenePlayer.tsx @@ -81,6 +81,12 @@ function handleHotkeys(player: VideoJsPlayer, event: videojs.KeyboardEvent) { break; } + // toggle player looping with shift+l + if (event.shiftKey && event.which === 76) { + player.loop(!player.loop()); + return; + } + if (event.altKey || event.ctrlKey || event.metaKey || event.shiftKey) { return; } @@ -162,7 +168,7 @@ function getMarkerTitle(marker: MarkerFragment) { } interface IScenePlayerProps { - scene: GQL.SceneDataFragment | undefined | null; + scene: GQL.SceneDataFragment; hideScrubberOverride: boolean; autoplay?: boolean; permitLoop?: boolean; @@ -217,7 +223,7 @@ export const ScenePlayer: React.FC = ({ const vrTag = uiConfig?.vrTag ?? undefined; const file = useMemo( - () => ((scene?.files.length ?? 0) > 0 ? scene?.files[0] : undefined), + () => (scene.files.length > 0 ? scene.files[0] : undefined), [scene] ); @@ -363,7 +369,7 @@ export const ScenePlayer: React.FC = ({ }, [getPlayer, onNext, onPrevious]); useEffect(() => { - if (scene?.interactive && interactiveInitialised) { + if (scene.interactive && interactiveInitialised) { interactiveReady.current = false; uploadScript(scene.paths.funscript || "").then(() => { interactiveReady.current = true; @@ -372,8 +378,8 @@ export const ScenePlayer: React.FC = ({ }, [ uploadScript, interactiveInitialised, - scene?.interactive, - scene?.paths.funscript, + scene.interactive, + scene.paths.funscript, ]); useEffect(() => { @@ -384,7 +390,7 @@ export const ScenePlayer: React.FC = ({ let showButton = false; - if (scene && vrTag) { + if (vrTag) { showButton = scene.tags.some((tag) => vrTag === tag.name); } @@ -438,7 +444,7 @@ export const ScenePlayer: React.FC = ({ function onplay(this: VideoJsPlayer) { this.persistVolume().enabled = true; - if (scene?.interactive && interactiveReady.current) { + if (scene.interactive && interactiveReady.current) { interactiveClient.play(this.currentTime()); } } @@ -449,14 +455,14 @@ export const ScenePlayer: React.FC = ({ function seeking(this: VideoJsPlayer) { if (this.paused()) return; - if (scene?.interactive && interactiveReady.current) { + if (scene.interactive && interactiveReady.current) { interactiveClient.play(this.currentTime()); } } function timeupdate(this: VideoJsPlayer) { if (this.paused()) return; - if (scene?.interactive && interactiveReady.current) { + if (scene.interactive && interactiveReady.current) { interactiveClient.ensurePlaying(this.currentTime()); } setTime(this.currentTime()); @@ -480,7 +486,7 @@ export const ScenePlayer: React.FC = ({ if (!player) return; // don't re-initialise the player unless the scene has changed - if (!scene || !file || scene.id === sceneId.current) return; + if (!file || scene.id === sceneId.current) return; sceneId.current = scene.id; @@ -629,7 +635,7 @@ export const ScenePlayer: React.FC = ({ useEffect(() => { 
const player = getPlayer(); - if (!player || !scene) return; + if (!player) return; const markers = player.markers(); markers.clearMarkers(); @@ -652,7 +658,7 @@ export const ScenePlayer: React.FC = ({ if (!player) return; async function saveActivity(resumeTime: number, playDuration: number) { - if (!scene?.id) return; + if (!scene.id) return; await sceneSaveActivity({ variables: { @@ -664,7 +670,7 @@ export const ScenePlayer: React.FC = ({ } async function incrementPlayCount() { - if (!scene?.id) return; + if (!scene.id) return; await sceneIncrementPlayCount({ variables: { @@ -698,7 +704,7 @@ export const ScenePlayer: React.FC = ({ useEffect(() => { const player = getPlayer(); - if (!player || !scene || !ready || !auto.current) { + if (!player || !ready || !auto.current) { return; } @@ -766,7 +772,7 @@ export const ScenePlayer: React.FC = ({ } const isPortrait = - scene && file && file.height && file.width && file.height > file.width; + file && file.height && file.width && file.height > file.width; return (
= ({ onKeyDownCapture={onKeyDown} >
- {scene?.interactive && + {scene.interactive && (interactiveState !== ConnectionState.Ready || getPlayer()?.paused()) && } - {scene && file && showScrubber && ( + {file && showScrubber && ( { const { configuration } = useContext(ConfigurationContext); const { data, loading, error } = useFindScene(id ?? ""); - const [scene, setScene] = useState( - data?.findScene ?? undefined - ); + const [scene, setScene] = useState(); - // only update scene when loading is done - useEffect(() => { + // useLayoutEffect to update before paint + useLayoutEffect(() => { + // only update scene when loading is done if (!loading) { setScene(data?.findScene ?? undefined); } @@ -819,34 +825,32 @@ const SceneLoader: React.FC = () => { } } - if (!scene && loading) return ; - if (error) return ; - - if (!loading && !scene) + if (!scene) { + if (loading) return ; + if (error) return ; return ; + } return (
- {scene && ( - - )} +
= ({ const schema = yup.object({ title: yup.string().ensure(), code: yup.string().ensure(), - url: yup.string().ensure(), + urls: yup + .array(yup.string().required()) + .defined() + .test({ + name: "unique", + test: (value) => { + const dupes = value + .map((e, i, a) => { + if (a.indexOf(e) !== i) { + return String(i - 1); + } else { + return null; + } + }) + .filter((e) => e !== null) as string[]; + if (dupes.length === 0) return true; + return new yup.ValidationError(dupes.join(" "), value, "urls"); + }, + }), date: yup .string() .ensure() @@ -143,7 +161,7 @@ export const SceneEditPanel: React.FC = ({ () => ({ title: scene.title ?? "", code: scene.code ?? "", - url: scene.url ?? "", + urls: scene.urls ?? [], date: scene.date ?? "", director: scene.director ?? "", rating100: scene.rating100 ?? null, @@ -333,7 +351,7 @@ export const SceneEditPanel: React.FC = ({ director: fragment.director, remote_site_id: fragment.remote_site_id, title: fragment.title, - url: fragment.url, + urls: fragment.urls, }; const result = await queryScrapeSceneQueryFragment(s, input); @@ -549,8 +567,8 @@ export const SceneEditPanel: React.FC = ({ formik.setFieldValue("date", updatedScene.date); } - if (updatedScene.url) { - formik.setFieldValue("url", updatedScene.url); + if (updatedScene.urls) { + formik.setFieldValue("urls", updatedScene.urls); } if (updatedScene.studio && updatedScene.studio.stored_id) { @@ -624,13 +642,13 @@ export const SceneEditPanel: React.FC = ({ } } - async function onScrapeSceneURL() { - if (!formik.values.url) { + async function onScrapeSceneURL(url: string) { + if (!url) { return; } setIsLoading(true); try { - const result = await queryScrapeSceneURL(formik.values.url); + const result = await queryScrapeSceneURL(url); if (!result.data || !result.data.scrapeSceneURL) { return; } @@ -683,6 +701,14 @@ export const SceneEditPanel: React.FC = ({ if (isLoading) return ; + const urlsErrors = Array.isArray(formik.errors.urls) + ? formik.errors.urls[0] + : formik.errors.urls; + const urlsErrorMsg = urlsErrors + ? intl.formatMessage({ id: "validation.urls_must_be_unique" }) + : undefined; + const urlsErrorIdx = urlsErrors?.split(" ").map((e) => parseInt(e)); + return (
= ({
{renderTextField("title", intl.formatMessage({ id: "title" }))} {renderTextField("code", intl.formatMessage({ id: "scene_code" }))} - + - + - formik.setFieldValue("urls", value)} + errors={urlsErrorMsg} + errorIdx={urlsErrorIdx} + onScrapeClick={(url) => onScrapeSceneURL(url)} urlScrapable={urlScrapable} - isInvalid={!!formik.getFieldMeta("url").error} /> diff --git a/ui/v2.5/src/components/Scenes/SceneDetails/SceneFileInfoPanel.tsx b/ui/v2.5/src/components/Scenes/SceneDetails/SceneFileInfoPanel.tsx index a3ead85a1d8..4de616dbe68 100644 --- a/ui/v2.5/src/components/Scenes/SceneDetails/SceneFileInfoPanel.tsx +++ b/ui/v2.5/src/components/Scenes/SceneDetails/SceneFileInfoPanel.tsx @@ -16,7 +16,7 @@ import { useToast } from "src/hooks/Toast"; import NavUtils from "src/utils/navigation"; import TextUtils from "src/utils/text"; import { getStashboxBase } from "src/utils/stashbox"; -import { TextField, URLField } from "src/utils/field"; +import { TextField, URLField, URLsField } from "src/utils/field"; interface IFileInfoPanelProps { sceneID: string; @@ -316,12 +316,7 @@ export const SceneFileInfoPanel: React.FC = ( )} {renderFunscript()} {renderInteractiveSpeed()} - + {renderStashIDs()} {/* = ({ const [code, setCode] = useState>( new ScrapeResult(scene.code, scraped.code) ); - const [url, setURL] = useState>( - new ScrapeResult(scene.url, scraped.url) + + const [urls, setURLs] = useState>( + new ScrapeResult( + scene.urls, + scraped.urls + ? uniq((scene.urls ?? []).concat(scraped.urls ?? [])) + : undefined + ) ); + const [date, setDate] = useState>( new ScrapeResult(scene.date, scraped.date) ); @@ -407,7 +416,7 @@ export const SceneScrapeDialog: React.FC = ({ [ title, code, - url, + urls, date, director, studio, @@ -581,7 +590,7 @@ export const SceneScrapeDialog: React.FC = ({ return { title: title.getNewValue(), code: code.getNewValue(), - url: url.getNewValue(), + urls: urls.getNewValue(), date: date.getNewValue(), director: director.getNewValue(), studio: newStudioValue @@ -627,10 +636,10 @@ export const SceneScrapeDialog: React.FC = ({ result={code} onChange={(value) => setCode(value)} /> - setURL(value)} + setURLs(value)} /> = ({ const [code, setCode] = useState>( new ScrapeResult(dest.code) ); - const [url, setURL] = useState>( - new ScrapeResult(dest.url) + const [url, setURL] = useState>( + new ScrapeResult(dest.urls) ); const [date, setDate] = useState>( new ScrapeResult(dest.date) @@ -164,7 +165,7 @@ const SceneMergeDetails: React.FC = ({ new ScrapeResult(dest.code, sources.find((s) => s.code)?.code, !dest.code) ); setURL( - new ScrapeResult(dest.url, sources.find((s) => s.url)?.url, !dest.url) + new ScrapeResult(dest.urls, sources.find((s) => s.urls)?.urls, !dest.urls) ); setDate( new ScrapeResult(dest.date, sources.find((s) => s.date)?.date, !dest.date) @@ -361,8 +362,8 @@ const SceneMergeDetails: React.FC = ({ result={code} onChange={(value) => setCode(value)} /> - setURL(value)} /> @@ -546,7 +547,7 @@ const SceneMergeDetails: React.FC = ({ id: dest.id, title: title.getNewValue(), code: code.getNewValue(), - url: url.getNewValue(), + urls: url.getNewValue(), date: date.getNewValue(), rating100: rating.getNewValue(), o_counter: oCounter.getNewValue(), diff --git a/ui/v2.5/src/components/Settings/SettingsInterfacePanel/SettingsInterfacePanel.tsx b/ui/v2.5/src/components/Settings/SettingsInterfacePanel/SettingsInterfacePanel.tsx index c44f3ab78b3..dd4dce07021 100644 --- a/ui/v2.5/src/components/Settings/SettingsInterfacePanel/SettingsInterfacePanel.tsx +++ 
b/ui/v2.5/src/components/Settings/SettingsInterfacePanel/SettingsInterfacePanel.tsx @@ -756,6 +756,14 @@ export const SettingsInterfacePanel: React.FC = () => { value={iface.funscriptOffset ?? undefined} onChange={(v) => saveInterface({ funscriptOffset: v })} /> + + saveInterface({ useStashHostedFunscript: v })} + /> ); diff --git a/ui/v2.5/src/components/Shared/ScrapeDialog.tsx b/ui/v2.5/src/components/Shared/ScrapeDialog.tsx index 0a5b1b9b226..0c4753f74ef 100644 --- a/ui/v2.5/src/components/Shared/ScrapeDialog.tsx +++ b/ui/v2.5/src/components/Shared/ScrapeDialog.tsx @@ -22,6 +22,7 @@ import { } from "@fortawesome/free-solid-svg-icons"; import { getCountryByISO } from "src/utils/country"; import { CountrySelect } from "./CountrySelect"; +import { StringListInput } from "./StringListInput"; export class ScrapeResult { public newValue?: T; @@ -102,6 +103,7 @@ interface IScrapedFieldProps { interface IScrapedRowProps extends IScrapedFieldProps { + className?: string; title: string; renderOriginalField: (result: ScrapeResult) => JSX.Element | undefined; renderNewField: (result: ScrapeResult) => JSX.Element | undefined; @@ -175,7 +177,7 @@ export const ScrapeDialogRow = ( } return ( - + {props.title} @@ -276,6 +278,71 @@ export const ScrapedInputGroupRow: React.FC = ( ); }; +interface IScrapedStringListProps { + isNew?: boolean; + placeholder?: string; + locked?: boolean; + result: ScrapeResult; + onChange?: (value: string[]) => void; +} + +const ScrapedStringList: React.FC = (props) => { + const value = props.isNew + ? props.result.newValue + : props.result.originalValue; + + return ( + { + if (props.isNew && props.onChange) { + props.onChange(v); + } + }} + placeholder={props.placeholder} + readOnly={!props.isNew || props.locked} + /> + ); +}; + +interface IScrapedStringListRowProps { + title: string; + placeholder?: string; + result: ScrapeResult; + locked?: boolean; + onChange: (value: ScrapeResult) => void; +} + +export const ScrapedStringListRow: React.FC = ( + props +) => { + return ( + ( + + )} + renderNewField={() => ( + + props.onChange(props.result.cloneWithValue(value)) + } + /> + )} + onChange={props.onChange} + /> + ); +}; + const ScrapedTextArea: React.FC = (props) => { return ( void; + placeholder?: string; + className?: string; + readOnly?: boolean; +} + +interface IListInputAppendProps { + value: string; +} + +export interface IStringListInputProps { value: string[]; setValue: (value: string[]) => void; + inputComponent?: ComponentType; + appendComponent?: ComponentType; placeholder?: string; className?: string; errors?: string; errorIdx?: number[]; + readOnly?: boolean; } +export const StringInput: React.FC = ({ + className, + placeholder, + value, + setValue, + readOnly = false, +}) => { + return ( + ) => + setValue(e.currentTarget.value) + } + placeholder={placeholder} + readOnly={readOnly} + /> + ); +}; + export const StringListInput: React.FC = (props) => { + const Input = props.inputComponent ?? StringInput; + const AppendComponent = props.appendComponent; const values = props.value.concat(""); function valueChanged(idx: number, value: string) { @@ -37,24 +74,24 @@ export const StringListInput: React.FC = (props) => { {values.map((v, i) => ( - ) => - valueChanged(i, e.currentTarget.value) - } + setValue={(value) => valueChanged(i, value)} placeholder={props.placeholder} + className={props.errorIdx?.includes(i) ? 
"is-invalid" : ""} + readOnly={props.readOnly} /> - + {AppendComponent && } + {!props.readOnly && ( + + )} ))} diff --git a/ui/v2.5/src/components/Shared/URLField.tsx b/ui/v2.5/src/components/Shared/URLField.tsx index 413f24fd93b..9cea50e7c7a 100644 --- a/ui/v2.5/src/components/Shared/URLField.tsx +++ b/ui/v2.5/src/components/Shared/URLField.tsx @@ -4,6 +4,11 @@ import { Button, InputGroup, Form } from "react-bootstrap"; import { Icon } from "./Icon"; import { FormikHandlers } from "formik"; import { faFileDownload } from "@fortawesome/free-solid-svg-icons"; +import { + IStringListInputProps, + StringInput, + StringListInput, +} from "./StringListInput"; interface IProps { value: string; @@ -43,3 +48,33 @@ export const URLField: React.FC = (props: IProps) => { ); }; + +interface IURLListProps extends IStringListInputProps { + onScrapeClick(url: string): void; + urlScrapable(url: string): boolean; +} + +export const URLListInput: React.FC = ( + listProps: IURLListProps +) => { + const intl = useIntl(); + const { onScrapeClick, urlScrapable } = listProps; + return ( + ( + + )} + /> + ); +}; diff --git a/ui/v2.5/src/components/Shared/styles.scss b/ui/v2.5/src/components/Shared/styles.scss index 4d166878f0a..fb7c51c2bad 100644 --- a/ui/v2.5/src/components/Shared/styles.scss +++ b/ui/v2.5/src/components/Shared/styles.scss @@ -441,3 +441,7 @@ div.react-datepicker { right: 0; z-index: 4; } + +.string-list-row .input-group { + flex-wrap: nowrap; +} diff --git a/ui/v2.5/src/components/Stats.tsx b/ui/v2.5/src/components/Stats.tsx index 79e2062827b..f177aa46194 100644 --- a/ui/v2.5/src/components/Stats.tsx +++ b/ui/v2.5/src/components/Stats.tsx @@ -18,6 +18,11 @@ export const Stats: React.FC = () => { 3 ); + const totalPlayDuration = TextUtils.secondsAsTimeString( + data.stats.total_play_duration, + 3 + ); + return (
@@ -114,6 +119,38 @@ export const Stats: React.FC = () => {
+        [added JSX stripped in extraction: four new stat blocks, one showing {totalPlayDuration || "-"} for Total Play Duration and the others showing the new Scenes Played, Total Play Count and Total O-Count values, each labelled via its corresponding stats.* message]
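For orientation, a minimal sketch of the shape one of those blocks plausibly takes; only the `stats.total_play_duration` message id and the `totalPlayDuration || "-"` expression come from this patch, while the component name, class names and props below are hypothetical:

```
import React from "react";
import { FormattedMessage } from "react-intl";

// Hypothetical reconstruction of one of the four stat blocks; not the
// committed markup. Only the message id and the `totalPlayDuration || "-"`
// fallback are taken from the hunk above.
export const TotalPlayDurationStat: React.FC<{ totalPlayDuration: string }> = ({
  totalPlayDuration,
}) => (
  <div className="stats-element">
    <p className="title">{totalPlayDuration || "-"}</p>
    <p className="heading">
      <FormattedMessage id="stats.total_play_duration" />
    </p>
  </div>
);
```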
); }; diff --git a/ui/v2.5/src/components/Tagger/context.tsx b/ui/v2.5/src/components/Tagger/context.tsx index 095c77a396b..a497f020159 100644 --- a/ui/v2.5/src/components/Tagger/context.tsx +++ b/ui/v2.5/src/components/Tagger/context.tsx @@ -408,7 +408,7 @@ export const TaggerContext: React.FC = ({ children }) => { details: scene.details, remote_site_id: scene.remote_site_id, title: scene.title, - url: scene.url, + urls: scene.urls, }; const result = await queryScrapeSceneQueryFragment( diff --git a/ui/v2.5/src/components/Tagger/scenes/StashSearchResult.tsx b/ui/v2.5/src/components/Tagger/scenes/StashSearchResult.tsx index 155f4950f15..db117c3e90b 100755 --- a/ui/v2.5/src/components/Tagger/scenes/StashSearchResult.tsx +++ b/ui/v2.5/src/components/Tagger/scenes/StashSearchResult.tsx @@ -360,7 +360,7 @@ const StashSearchResult: React.FC = ({ ), studio_id: studioID, cover_image: resolveField("cover_image", undefined, imgData), - url: resolveField("url", stashScene.url, scene.url), + urls: resolveField("url", stashScene.urls, scene.urls), tag_ids: tagIDs, stash_ids: stashScene.stash_ids ?? [], code: resolveField("code", stashScene.code, scene.code), @@ -462,9 +462,11 @@ const StashSearchResult: React.FC = ({ ); } - const sceneTitleEl = scene.url ? ( + const url = scene.urls?.length ? scene.urls[0] : null; + + const sceneTitleEl = url ? ( = ({ }; const maybeRenderURL = () => { - if (scene.url) { + if (scene.urls) { return ( ); diff --git a/ui/v2.5/src/docs/en/Manual/Identify.md b/ui/v2.5/src/docs/en/Manual/Identify.md index dceb8c1dc90..8d0b3af2ea4 100644 --- a/ui/v2.5/src/docs/en/Manual/Identify.md +++ b/ui/v2.5/src/docs/en/Manual/Identify.md @@ -15,6 +15,10 @@ The following options can be set: | Include male performers | If false, then male performers will not be created or set on scenes. | | Set cover images | If false, then scene cover images will not be modified. | | Set organised flag | If true, the organised flag is set to true when a scene is organised. | +| Skip matches that have more than one result | If this is not enabled and more than one result is returned, one will be randomly chosen to match | +| Tag skipped matches with | If the above option is set and a scene is skipped, this will add the tag so that you can filter for it in the Scene Tagger view and choose the correct match by hand | +| Skip single name performers with no disambiguation | If this is not enabled, performers that are often generic like Samantha or Olga will be matched | +| Tag skipped performers with | If the above option is set and a performer is skipped, this will add the tag so that you can filter for it in the Scene Tagger view and choose how you want to handle those performers | Field specific options may be set as well. Each field may have a Strategy.
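To make the interaction of these four options concrete, here is a minimal TypeScript sketch of the decision logic the table describes; the option names, function names and the tagging callback are hypothetical stand-ins, only the behaviour (skip and optionally tag, otherwise choose a result at random or match the performer as usual) follows the documentation above.

```
// Illustrative sketch only: the option names, types and helpers here are
// hypothetical stand-ins, not the identify task's actual internals.
interface SkipOptions {
  skipMultipleMatches: boolean;
  skipMultipleMatchesTag?: string; // e.g. "Identify: Multiple Matches"
  skipSingleNamePerformers: boolean;
  skipSingleNamePerformersTag?: string; // e.g. "Identify: Single Name Performer"
}

// More than one scraper result: either skip (and optionally tag) the scene,
// or fall back to picking one of the results at random.
function pickSceneMatch<T>(
  results: T[],
  options: SkipOptions,
  tagScene: (tagName: string) => void
): T | undefined {
  if (results.length > 1 && options.skipMultipleMatches) {
    if (options.skipMultipleMatchesTag) {
      tagScene(options.skipMultipleMatchesTag);
    }
    return undefined;
  }
  return results[Math.floor(Math.random() * results.length)];
}

// Performers with a single name and no disambiguation: either skip them
// (and optionally tag the scene for later review) or match them as usual.
function shouldSkipPerformer(
  name: string,
  disambiguation: string | undefined,
  options: SkipOptions,
  tagScene: (tagName: string) => void
): boolean {
  const isSingleName = !name.trim().includes(" ");
  if (isSingleName && !disambiguation && options.skipSingleNamePerformers) {
    if (options.skipSingleNamePerformersTag) {
      tagScene(options.skipSingleNamePerformersTag);
    }
    return true;
  }
  return false;
}
```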
The behaviour for each strategy value is as follows: diff --git a/ui/v2.5/src/docs/en/Manual/KeyboardShortcuts.md b/ui/v2.5/src/docs/en/Manual/KeyboardShortcuts.md index ca1545486c0..e67b693694c 100644 --- a/ui/v2.5/src/docs/en/Manual/KeyboardShortcuts.md +++ b/ui/v2.5/src/docs/en/Manual/KeyboardShortcuts.md @@ -65,6 +65,8 @@ | `p n` | Play next scene in queue | | `p p` | Play previous scene in queue | | `p r` | Play random scene in queue | +| `Space` | Play/pause player | +| `Enter` | Play/pause player | | `←` | Seek backwards by 10 seconds | | `→` | Seek forwards by 10 seconds | | `Shift + ←` | Seek backwards by 5 seconds | @@ -74,6 +76,10 @@ | `{1-9}` | Seek to 10-90% duration | | `[` | Scrub backwards 10% duration | | `]` | Scrub forwards 10% duration | +| `↑` | Increase volume 10% | +| `↓` | Decrease volume 10% | +| `m` | Toggle mute | +| `Shift + l` | Toggle player looping | ### Scene Markers tab shortcuts diff --git a/ui/v2.5/src/hooks/Interactive/context.tsx b/ui/v2.5/src/hooks/Interactive/context.tsx index c3c5ccb86b5..487fa8468f5 100644 --- a/ui/v2.5/src/hooks/Interactive/context.tsx +++ b/ui/v2.5/src/hooks/Interactive/context.tsx @@ -81,6 +81,8 @@ export const InteractiveProvider: React.FC = ({ children }) => { undefined ); const [scriptOffset, setScriptOffset] = useState(0); + const [useStashHostedFunscript, setUseStashHostedFunscript] = + useState(false); const [interactive] = useState(new InteractiveAPI("", 0)); const [initialised, setInitialised] = useState(false); @@ -118,6 +120,9 @@ export const InteractiveProvider: React.FC = ({ children }) => { setHandyKey(stashConfig.interface.handyKey ?? undefined); setScriptOffset(stashConfig.interface.funscriptOffset ?? 0); + setUseStashHostedFunscript( + stashConfig.interface.useStashHostedFunscript ?? false + ); }, [stashConfig]); useEffect(() => { @@ -129,11 +134,19 @@ export const InteractiveProvider: React.FC = ({ children }) => { interactive.handyKey = handyKey ?? 
""; interactive.scriptOffset = scriptOffset; + interactive.useStashHostedFunscript = useStashHostedFunscript; if (oldKey !== interactive.handyKey && interactive.handyKey) { initialise(); } - }, [handyKey, scriptOffset, config, interactive, initialise]); + }, [ + handyKey, + scriptOffset, + useStashHostedFunscript, + config, + interactive, + initialise, + ]); const sync = useCallback(async () => { if ( @@ -163,14 +176,17 @@ export const InteractiveProvider: React.FC = ({ children }) => { setState(ConnectionState.Uploading); try { - await interactive.uploadScript(funscriptPath); + await interactive.uploadScript( + funscriptPath, + stashConfig?.general?.apiKey + ); setCurrentScript(funscriptPath); setState(ConnectionState.Ready); } catch (e) { setState(ConnectionState.Error); } }, - [interactive, currentScript] + [interactive, currentScript, stashConfig] ); return ( diff --git a/ui/v2.5/src/hooks/Interactive/interactive.ts b/ui/v2.5/src/hooks/Interactive/interactive.ts index 1198ac59e6b..ef34bd2ef6a 100644 --- a/ui/v2.5/src/hooks/Interactive/interactive.ts +++ b/ui/v2.5/src/hooks/Interactive/interactive.ts @@ -97,11 +97,13 @@ export class Interactive { _playing: boolean; _scriptOffset: number; _handy: Handy; + _useStashHostedFunscript: boolean; constructor(handyKey: string, scriptOffset: number) { this._handy = new Handy(); this._handy.connectionKey = handyKey; this._scriptOffset = scriptOffset; + this._useStashHostedFunscript = false; this._connected = false; this._playing = false; } @@ -127,27 +129,46 @@ export class Interactive { return this._handy.connectionKey; } + set useStashHostedFunscript(useStashHostedFunscript: boolean) { + this._useStashHostedFunscript = useStashHostedFunscript; + } + + get useStashHostedFunscript(): boolean { + return this._useStashHostedFunscript; + } + set scriptOffset(offset: number) { this._scriptOffset = offset; } - async uploadScript(funscriptPath: string) { + async uploadScript(funscriptPath: string, apiKey?: string) { if (!(this._handy.connectionKey && funscriptPath)) { return; } - const csv = await fetch(funscriptPath) - .then((response) => response.json()) - .then((json) => convertFunscriptToCSV(json)); - const fileName = `${Math.round(Math.random() * 100000000)}.csv`; - const csvFile = new File([csv], fileName); + var funscriptUrl; - const tempURL = await uploadCsv(csvFile).then((response) => response.url); + if (this._useStashHostedFunscript) { + funscriptUrl = funscriptPath.replace("/funscript", "/interactive_csv"); + if (typeof apiKey !== "undefined" && apiKey !== "") { + var url = new URL(funscriptUrl); + url.searchParams.append("apikey", apiKey); + funscriptUrl = url.toString(); + } + } else { + const csv = await fetch(funscriptPath) + .then((response) => response.json()) + .then((json) => convertFunscriptToCSV(json)); + const fileName = `${Math.round(Math.random() * 100000000)}.csv`; + const csvFile = new File([csv], fileName); + + funscriptUrl = await uploadCsv(csvFile).then((response) => response.url); + } await this._handy.setMode(HandyMode.hssp); this._connected = await this._handy - .setHsspSetup(tempURL) + .setHsspSetup(funscriptUrl) .then((result) => result === HsspSetupResult.downloaded); } diff --git a/ui/v2.5/src/locales/en-GB.json b/ui/v2.5/src/locales/en-GB.json index 3a17758565a..b398757ac92 100644 --- a/ui/v2.5/src/locales/en-GB.json +++ b/ui/v2.5/src/locales/en-GB.json @@ -129,6 +129,7 @@ "also_known_as": "Also known as", "appears_with": "Appears With", "ascending": "Ascending", + "audio_codec": "Audio Codec", "average_resolution": 
"Average Resolution", "between_and": "and", "birth_year": "Birth Year", @@ -454,10 +455,18 @@ "include_male_performers": "Include male performers", "set_cover_images": "Set cover images", "set_organized": "Set organised flag", + "skip_multiple_matches": "Skip matches that have more than one result", + "skip_multiple_matches_tooltip": "If this is not enabled and more than one result is returned, one will be randomly chosen to match", + "skip_single_name_performers": "Skip single name performers with no disambiguation", + "skip_single_name_performers_tooltip": "If this is not enabled, performers that are often generic like Samantha or Olga will be matched", "source": "Source", "source_options": "{source} Options", "sources": "Sources", - "strategy": "Strategy" + "strategy": "Strategy", + "tag_skipped_matches": "Tag skipped matches with", + "tag_skipped_matches_tooltip": "Create a tag like 'Identify: Multiple Matches' that you can filter for in the Scene Tagger view and choose the correct match by hand", + "tag_skipped_performers": "Tag skipped performers with", + "tag_skipped_performer_tooltip": "Create a tag like 'Identify: Single Name Performer' that you can filter for in the Scene Tagger view and choose how you want to handle these performers" }, "import_from_exported_json": "Import from exported JSON in the metadata directory. Wipes the existing database.", "incremental_import": "Incremental import from a supplied export zip file.", @@ -702,7 +711,11 @@ } } }, - "title": "User Interface" + "title": "User Interface", + "use_stash_hosted_funscript": { + "description": "When enabled, funscripts will be served directly from Stash to your Handy device without using the third party Handy server. Requires that Stash be accessible from your Handy device.", + "heading": "Serve funscripts directly" + } } }, "configuration": "Configuration", @@ -1219,7 +1232,11 @@ "stats": { "image_size": "Images size", "scenes_duration": "Scenes duration", - "scenes_size": "Scenes size" + "scenes_size": "Scenes size", + "scenes_played": "Scenes Played", + "total_play_duration": "Total Play Duration", + "total_play_count": "Total Play Count", + "total_o_count": "Total O-Count" }, "status": "Status: {statusText}", "studio": "Studio", @@ -1260,12 +1277,15 @@ "type": "Type", "updated_at": "Updated At", "url": "URL", + "urls": "URLs", "validation": { "aliases_must_be_unique": "aliases must be unique", "date_invalid_form": "${path} must be in YYYY-MM-DD form", - "required": "${path} is a required field" + "required": "${path} is a required field", + "urls_must_be_unique": "URLs must be unique" }, "videos": "Videos", + "video_codec": "Video Codec", "view_all": "View All", "weight": "Weight", "weight_kg": "Weight (kg)", diff --git a/ui/v2.5/src/models/list-filter/criteria/factory.ts b/ui/v2.5/src/models/list-filter/criteria/factory.ts index 5096a14b0f3..d45042aa0ee 100644 --- a/ui/v2.5/src/models/list-filter/criteria/factory.ts +++ b/ui/v2.5/src/models/list-filter/criteria/factory.ts @@ -107,6 +107,10 @@ export function makeCriteria( return new ResolutionCriterion(); case "average_resolution": return new AverageResolutionCriterion(); + case "video_codec": + return new StringCriterion(new StringCriterionOption(type, type)); + case "audio_codec": + return new StringCriterion(new StringCriterionOption(type, type)); case "resume_time": case "duration": case "play_duration": diff --git a/ui/v2.5/src/models/list-filter/scenes.ts b/ui/v2.5/src/models/list-filter/scenes.ts index f4be9f78a97..106875a1251 100644 --- 
a/ui/v2.5/src/models/list-filter/scenes.ts +++ b/ui/v2.5/src/models/list-filter/scenes.ts @@ -75,6 +75,8 @@ const criterionOptions = [ new NullNumberCriterionOption("rating", "rating100"), createMandatoryNumberCriterionOption("o_counter"), ResolutionCriterionOption, + createStringCriterionOption("video_codec"), + createStringCriterionOption("audio_codec"), createMandatoryNumberCriterionOption("duration"), createMandatoryNumberCriterionOption("resume_time"), createMandatoryNumberCriterionOption("play_duration"), diff --git a/ui/v2.5/src/models/list-filter/types.ts b/ui/v2.5/src/models/list-filter/types.ts index 79731eb3bfe..a200024f743 100644 --- a/ui/v2.5/src/models/list-filter/types.ts +++ b/ui/v2.5/src/models/list-filter/types.ts @@ -109,6 +109,8 @@ export type CriterionType = | "o_counter" | "resolution" | "average_resolution" + | "video_codec" + | "audio_codec" | "duration" | "favorite" | "hasMarkers" diff --git a/ui/v2.5/src/utils/field.tsx b/ui/v2.5/src/utils/field.tsx index c947b835500..26ef6d86ccc 100644 --- a/ui/v2.5/src/utils/field.tsx +++ b/ui/v2.5/src/utils/field.tsx @@ -90,3 +90,50 @@ export const URLField: React.FC = ({ ); }; + +interface IURLsField { + id?: string; + name?: string; + abbr?: string | null; + urls?: string[] | null; + truncate?: boolean | null; + target?: string; + // use for internal links + trusted?: boolean; +} + +export const URLsField: React.FC = ({ + id, + name, + urls, + abbr, + truncate, + target, + trusted, +}) => { + const values = urls ?? []; + if (!values.length) { + return null; + } + + const message = ( + <>{id ? : name}: + ); + + const rel = !trusted ? "noopener noreferrer" : undefined; + + return ( + <> +
+      [added JSX stripped in extraction: renders the message as the label (wrapped in an abbr element when abbr is set), followed by a list of the urls as links using the computed rel and target, truncating each URL when truncate is set]
+ + ); +}; diff --git a/ui/v2.5/yarn.lock b/ui/v2.5/yarn.lock index 18322543cc9..6fe09bc33c9 100644 --- a/ui/v2.5/yarn.lock +++ b/ui/v2.5/yarn.lock @@ -1154,25 +1154,25 @@ dependencies: "@jridgewell/trace-mapping" "0.3.9" -"@csstools/css-parser-algorithms@^2.0.1": - version "2.0.1" - resolved "https://registry.yarnpkg.com/@csstools/css-parser-algorithms/-/css-parser-algorithms-2.0.1.tgz#ff02629c7c95d1f4f8ea84d5ef1173461610535e" - integrity sha512-B9/8PmOtU6nBiibJg0glnNktQDZ3rZnGn/7UmDfrm2vMtrdlXO3p7ErE95N0up80IRk9YEtB5jyj/TmQ1WH3dw== +"@csstools/css-parser-algorithms@^2.3.0": + version "2.3.0" + resolved "https://registry.yarnpkg.com/@csstools/css-parser-algorithms/-/css-parser-algorithms-2.3.0.tgz#0cc3a656dc2d638370ecf6f98358973bfbd00141" + integrity sha512-dTKSIHHWc0zPvcS5cqGP+/TPFUJB0ekJ9dGKvMAFoNuBFhDPBt9OMGNZiIA5vTiNdGHHBeScYPXIGBMnVOahsA== -"@csstools/css-tokenizer@^2.0.1": - version "2.0.2" - resolved "https://registry.yarnpkg.com/@csstools/css-tokenizer/-/css-tokenizer-2.0.2.tgz#3635560ffc8f1994295d7ce3482e14f956d3f9e1" - integrity sha512-prUTipz0NZH7Lc5wyBUy93NFy3QYDMVEQgSeZzNdpMbKRd6V2bgRFyJ+O0S0Dw0MXWuE/H9WXlJk3kzMZRHZ/g== +"@csstools/css-tokenizer@^2.1.1": + version "2.1.1" + resolved "https://registry.yarnpkg.com/@csstools/css-tokenizer/-/css-tokenizer-2.1.1.tgz#07ae11a0a06365d7ec686549db7b729bc036528e" + integrity sha512-GbrTj2Z8MCTUv+52GE0RbFGM527xuXZ0Xa5g0Z+YN573uveS4G0qi6WNOMyz3yrFM/jaILTTwJ0+umx81EzqfA== -"@csstools/media-query-list-parser@^2.0.1": - version "2.0.1" - resolved "https://registry.yarnpkg.com/@csstools/media-query-list-parser/-/media-query-list-parser-2.0.1.tgz#d85a366811563a5d002755ed10e5212a1613c91d" - integrity sha512-X2/OuzEbjaxhzm97UJ+95GrMeT29d1Ib+Pu+paGLuRWZnWRK9sI9r3ikmKXPWGA1C4y4JEdBEFpp9jEqCvLeRA== +"@csstools/media-query-list-parser@^2.1.2": + version "2.1.2" + resolved "https://registry.yarnpkg.com/@csstools/media-query-list-parser/-/media-query-list-parser-2.1.2.tgz#6ef642b728d30c1009bfbba3211c7e4c11302728" + integrity sha512-M8cFGGwl866o6++vIY7j1AKuq9v57cf+dGepScwCcbut9ypJNr4Cj+LLTWligYUZ0uyhEoJDKt5lvyBfh2L3ZQ== -"@csstools/selector-specificity@^2.1.1": - version "2.1.1" - resolved "https://registry.yarnpkg.com/@csstools/selector-specificity/-/selector-specificity-2.1.1.tgz#c9c61d9fe5ca5ac664e1153bb0aa0eba1c6d6308" - integrity sha512-jwx+WCqszn53YHOfvFMJJRd/B2GqkCBt+1MJSG6o5/s8+ytHMvDZXsJgUEWLk12UnLd7HYKac4BYU5i/Ron1Cw== +"@csstools/selector-specificity@^3.0.0": + version "3.0.0" + resolved "https://registry.yarnpkg.com/@csstools/selector-specificity/-/selector-specificity-3.0.0.tgz#798622546b63847e82389e473fd67f2707d82247" + integrity sha512-hBI9tfBtuPIi885ZsZ32IMEU/5nlZH/KOVYJCOh7gyMxaVLGmLedYqFN6Ui1LXkI8JlC8IsuC0rF0btcRZKd5g== "@emotion/babel-plugin@^11.10.5": version "11.10.5" @@ -2309,7 +2309,7 @@ dependencies: "@types/unist" "*" -"@types/minimist@^1.2.0": +"@types/minimist@^1.2.0", "@types/minimist@^1.2.2": version "1.2.2" resolved "https://registry.yarnpkg.com/@types/minimist/-/minimist-1.2.2.tgz#ee771e2ba4b3dc5b372935d549fd9617bf345b8c" integrity sha512-jhuKLIRrhvCPLqwPcx6INqmKeiA5EWrsCOPhrlFSrbrmU4ZMPjj5Ul/oLCMDO98XRUIwVm78xICz4EPCektzeQ== @@ -3192,11 +3192,26 @@ camelcase-keys@^6.2.2: map-obj "^4.0.0" quick-lru "^4.0.1" +camelcase-keys@^7.0.0: + version "7.0.2" + resolved "https://registry.yarnpkg.com/camelcase-keys/-/camelcase-keys-7.0.2.tgz#d048d8c69448745bb0de6fc4c1c52a30dfbe7252" + integrity sha512-Rjs1H+A9R+Ig+4E/9oyB66UC5Mj9Xq3N//vcLf2WzgdTi/3gUu3Z9KoqmlrEG4VuuLK8wJHofxzdQXz/knhiYg== + dependencies: + camelcase "^6.3.0" + 
map-obj "^4.1.0" + quick-lru "^5.1.1" + type-fest "^1.2.1" + camelcase@^5.0.0, camelcase@^5.3.1: version "5.3.1" resolved "https://registry.yarnpkg.com/camelcase/-/camelcase-5.3.1.tgz#e3c9b31569e106811df242f715725a1f4c494320" integrity sha512-L28STB170nwWS63UjtlEOE3dldQApaJXZkOI1uMFfzf3rRuPegHaHesyee+YxQ+W6SvRDQV6UrdOdRiR153wJg== +camelcase@^6.3.0: + version "6.3.0" + resolved "https://registry.yarnpkg.com/camelcase/-/camelcase-6.3.0.tgz#5685b95eb209ac9c0c177467778c9c84df58ba9a" + integrity sha512-Gmy6FhYlCY7uOElZUSbxo2UCDH8owEk996gkbrpsgGtrJLM3J7jGxl9Ic7Qwwj4ivOE5AWZWRMecDdF7hqGjFA== + caniuse-lite@^1.0.30001449: version "1.0.30001453" resolved "https://registry.yarnpkg.com/caniuse-lite/-/caniuse-lite-1.0.30001453.tgz#6d3a1501622bf424a3cee5ad9550e640b0de3de8" @@ -3492,7 +3507,7 @@ cosmiconfig-typescript-loader@^4.3.0: resolved "https://registry.yarnpkg.com/cosmiconfig-typescript-loader/-/cosmiconfig-typescript-loader-4.3.0.tgz#c4259ce474c9df0f32274ed162c0447c951ef073" integrity sha512-NTxV1MFfZDLPiBMjxbHRwSh5LaLcPMwNdCutmnHJCKoVnlvldPWlllonKwrsRJ5pYZBIBGRWWU2tfvzxgeSW5Q== -cosmiconfig@8.0.0, cosmiconfig@^8.0.0: +cosmiconfig@8.0.0: version "8.0.0" resolved "https://registry.yarnpkg.com/cosmiconfig/-/cosmiconfig-8.0.0.tgz#e9feae014eab580f858f8a0288f38997a7bebe97" integrity sha512-da1EafcpH6b/TD8vDRaWV7xFINlHlF6zKsGwS1TsuVJTZRkquaS5HTMq7uq6h31619QjbsYl21gVDOm32KM1vQ== @@ -3513,6 +3528,16 @@ cosmiconfig@^7.0.0: path-type "^4.0.0" yaml "^1.10.0" +cosmiconfig@^8.2.0: + version "8.2.0" + resolved "https://registry.yarnpkg.com/cosmiconfig/-/cosmiconfig-8.2.0.tgz#f7d17c56a590856cd1e7cee98734dca272b0d8fd" + integrity sha512-3rTMnFJA1tCOPwRxtgF4wd7Ab2qvDbL8jX+3smjIbS4HlZBagTlpERbdN7iAbWlrfxE3M8c27kTwTawQ7st+OQ== + dependencies: + import-fresh "^3.2.1" + js-yaml "^4.1.0" + parse-json "^5.0.0" + path-type "^4.0.0" + create-require@^1.1.0: version "1.1.1" resolved "https://registry.yarnpkg.com/create-require/-/create-require-1.1.1.tgz#c1d7e8f1e5f6cfc9ff65f9cd352d37348756c333" @@ -3620,6 +3645,11 @@ decamelize@^1.1.0, decamelize@^1.2.0: resolved "https://registry.yarnpkg.com/decamelize/-/decamelize-1.2.0.tgz#f6534d15148269b20352e7bee26f501f9a191290" integrity sha512-z2S+W9X73hAUUki+N+9Za2lBlun89zigOyGrsax+KUQ6wKW4ZoWpEYBkGhQjwAjjDCkWxhY0VKEhk8wzY7F5cA== +decamelize@^5.0.0: + version "5.0.1" + resolved "https://registry.yarnpkg.com/decamelize/-/decamelize-5.0.1.tgz#db11a92e58c741ef339fb0a2868d8a06a9a7b1e9" + integrity sha512-VfxadyCECXgQlkoEAjeghAr5gY3Hf+IKjKb+X8tGVDtveCjN+USwprd2q3QXBR9T1+x2DG0XZF5/w+7HAtSaXA== + deep-equal@^2.0.5: version "2.2.0" resolved "https://registry.yarnpkg.com/deep-equal/-/deep-equal-2.2.0.tgz#5caeace9c781028b9ff459f33b779346637c43e6" @@ -4203,10 +4233,10 @@ fast-deep-equal@^3.1.1, fast-deep-equal@^3.1.3: resolved "https://registry.yarnpkg.com/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz#3a7d56b559d6cbc3eb512325244e619a65c6c525" integrity sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q== -fast-glob@^3.2.12, fast-glob@^3.2.9: - version "3.2.12" - resolved "https://registry.yarnpkg.com/fast-glob/-/fast-glob-3.2.12.tgz#7f39ec99c2e6ab030337142da9e0c18f37afae80" - integrity sha512-DVj4CQIYYow0BlaelwK1pHl5n5cRSJfM60UA0zK891sVInoPri2Ekj7+e1CT3/3qxXenpI+nBBmQAcJPJgaj4w== +fast-glob@^3.2.9, fast-glob@^3.3.0: + version "3.3.0" + resolved "https://registry.yarnpkg.com/fast-glob/-/fast-glob-3.3.0.tgz#7c40cb491e1e2ed5664749e87bfb516dbe8727c0" + integrity 
sha512-ChDuvbOypPuNjO8yIDf36x7BlZX1smcUMTTcyoIjycexOxd6DFsKsg21qVBzEmr3G7fUKIRy2/psii+CIUt7FA== dependencies: "@nodelib/fs.stat" "^2.0.2" "@nodelib/fs.walk" "^1.2.3" @@ -4728,10 +4758,10 @@ html-entities@^1.2.1: resolved "https://registry.yarnpkg.com/html-entities/-/html-entities-1.4.0.tgz#cfbd1b01d2afaf9adca1b10ae7dffab98c71d2dc" integrity sha512-8nxjcBcd8wovbeKx7h3wTji4e6+rhaVuPNpMqwWgnHh+N9ToqsCs6XztWRBPQ+UtzsoMAdKZtUENoVzU/EMtZA== -html-tags@^3.2.0: - version "3.2.0" - resolved "https://registry.yarnpkg.com/html-tags/-/html-tags-3.2.0.tgz#dbb3518d20b726524e4dd43de397eb0a95726961" - integrity sha512-vy7ClnArOZwCnqZgvv+ddgHgJiAFXe3Ge9ML5/mBctVJoUoYPCdxVucOywjDARn6CVoh3dRSFdPHy2sX80L0Wg== +html-tags@^3.3.1: + version "3.3.1" + resolved "https://registry.yarnpkg.com/html-tags/-/html-tags-3.3.1.tgz#a04026a18c882e4bba8a01a3d39cfe465d40b5ce" + integrity sha512-ztqyC3kLto0e9WbNp0aeP+M3kTt+nbaIveGmUxAtZa+8iFgKLUOD4YKM5j+f3QD89bra7UeumolZHKuOXnTmeQ== http-proxy-agent@^5.0.0: version "5.0.0" @@ -4824,6 +4854,11 @@ indent-string@^4.0.0: resolved "https://registry.yarnpkg.com/indent-string/-/indent-string-4.0.0.tgz#624f8f4497d619b2d9768531d58f4122854d7251" integrity sha512-EdDDZu4A2OyIK7Lr/2zG+w5jmbuk1DVBnEwREQvBzspBJkCEbRa8GxU1lghYcaGJCnRWibjDXlq779X1/y5xwg== +indent-string@^5.0.0: + version "5.0.0" + resolved "https://registry.yarnpkg.com/indent-string/-/indent-string-5.0.0.tgz#4fd2980fccaf8622d14c64d694f4cf33c81951a5" + integrity sha512-m6FAo/spmsW2Ab2fU35JTYwtOKa2yAwXSwgjSv1TJzh4Mh7mC3lzAOVLBprb72XsTrgkEIsl7YrFNAiDiRhIGg== + individual@^2.0.0: version "2.0.0" resolved "https://registry.yarnpkg.com/individual/-/individual-2.0.0.tgz#833b097dad23294e76117a98fb38e0d9ad61bb97" @@ -5371,10 +5406,10 @@ kind-of@^6.0.2, kind-of@^6.0.3: resolved "https://registry.yarnpkg.com/kind-of/-/kind-of-6.0.3.tgz#07c05034a6c349fa06e24fa35aa76db4580ce4dd" integrity sha512-dcS1ul+9tmeD95T+x28/ehLgd9mENa3LsvDTtzm3vyBEO7RPptvAD+t44WVXaUjTBRcrpFeFlC8WCruUR456hw== -known-css-properties@^0.26.0: - version "0.26.0" - resolved "https://registry.yarnpkg.com/known-css-properties/-/known-css-properties-0.26.0.tgz#008295115abddc045a9f4ed7e2a84dc8b3a77649" - integrity sha512-5FZRzrZzNTBruuurWpvZnvP9pum+fe0HcK8z/ooo+U+Hmp4vtbyp1/QDsqmufirXy4egGzbaH/y2uCZf+6W5Kg== +known-css-properties@^0.27.0: + version "0.27.0" + resolved "https://registry.yarnpkg.com/known-css-properties/-/known-css-properties-0.27.0.tgz#82a9358dda5fe7f7bd12b5e7142c0a205393c0c5" + integrity sha512-uMCj6+hZYDoffuvAJjFAPz56E9uoowFHmTkqRtRq5WyC5Q6Cu/fTZKNQpX/RbzChBYLLl3lo8CjFZBAZXq9qFg== language-subtag-registry@~0.3.2: version "0.3.22" @@ -5584,7 +5619,7 @@ map-obj@^1.0.0: resolved "https://registry.yarnpkg.com/map-obj/-/map-obj-1.0.1.tgz#d933ceb9205d82bdcf4886f6742bdc2b4dea146d" integrity sha512-7N/q3lyZ+LVCp7PzuxrJr4KMbBE2hW7BT7YNia330OFxIf4d3r5zVpicP2650l7CPN6RM9zOJRl3NGpqSiw3Eg== -map-obj@^4.0.0: +map-obj@^4.0.0, map-obj@^4.1.0: version "4.3.0" resolved "https://registry.yarnpkg.com/map-obj/-/map-obj-4.3.0.tgz#9304f906e93faae70880da102a9f1df0ea8bb05a" integrity sha512-hdN1wVrZbb29eBGiGjJbeP8JbKjq1urkHJ/LIP/NY48MZ1QVXUsQBV1G1zvYFHn1XE06cwjBsOI2K3Ulnj1YXQ== @@ -5716,6 +5751,24 @@ memoize-one@^6.0.0: resolved "https://registry.yarnpkg.com/memoize-one/-/memoize-one-6.0.0.tgz#b2591b871ed82948aee4727dc6abceeeac8c1045" integrity sha512-rkpe71W0N0c0Xz6QD0eJETuWAJGnJ9afsl1srmwPrI+yBCkge5EycXXbYRyvL29zZVUWQCY7InPRCv3GDXuZNw== +meow@^10.1.5: + version "10.1.5" + resolved "https://registry.yarnpkg.com/meow/-/meow-10.1.5.tgz#be52a1d87b5f5698602b0f32875ee5940904aa7f" + 
integrity sha512-/d+PQ4GKmGvM9Bee/DPa8z3mXs/pkvJE2KEThngVNOqtmljC6K7NMPxtc2JeZYTmpWb9k/TmxjeL18ez3h7vCw== + dependencies: + "@types/minimist" "^1.2.2" + camelcase-keys "^7.0.0" + decamelize "^5.0.0" + decamelize-keys "^1.1.0" + hard-rejection "^2.1.0" + minimist-options "4.1.0" + normalize-package-data "^3.0.2" + read-pkg-up "^8.0.0" + redent "^4.0.0" + trim-newlines "^4.0.2" + type-fest "^1.2.2" + yargs-parser "^20.2.9" + meow@^6.1.0: version "6.1.1" resolved "https://registry.yarnpkg.com/meow/-/meow-6.1.1.tgz#1ad64c4b76b2a24dfb2f635fddcadf320d251467" @@ -5733,24 +5786,6 @@ meow@^6.1.0: type-fest "^0.13.1" yargs-parser "^18.1.3" -meow@^9.0.0: - version "9.0.0" - resolved "https://registry.yarnpkg.com/meow/-/meow-9.0.0.tgz#cd9510bc5cac9dee7d03c73ee1f9ad959f4ea364" - integrity sha512-+obSblOQmRhcyBt62furQqRAQpNyWXo8BuQ5bN7dG8wmwQ+vwHKp/rCFD4CrTP8CsDQD1sjoZ94K417XEUk8IQ== - dependencies: - "@types/minimist" "^1.2.0" - camelcase-keys "^6.2.2" - decamelize "^1.2.0" - decamelize-keys "^1.1.0" - hard-rejection "^2.1.0" - minimist-options "4.1.0" - normalize-package-data "^3.0.0" - read-pkg-up "^7.0.1" - redent "^3.0.0" - trim-newlines "^3.0.0" - type-fest "^0.18.0" - yargs-parser "^20.2.3" - merge2@^1.3.0, merge2@^1.4.1: version "1.4.1" resolved "https://registry.yarnpkg.com/merge2/-/merge2-1.4.1.tgz#4368892f885e907455a6fd7dc55c0c9d404990ae" @@ -5846,7 +5881,7 @@ min-document@^2.19.0: dependencies: dom-walk "^0.1.0" -min-indent@^1.0.0: +min-indent@^1.0.0, min-indent@^1.0.1: version "1.0.1" resolved "https://registry.yarnpkg.com/min-indent/-/min-indent-1.0.1.tgz#a63f681673b30571fbe8bc25686ae746eefa9869" integrity sha512-I9jwMn07Sy/IwOj3zVkVik2JTvgpaykDZEigL6Rx6N9LbMywwUSMtxET+7lVoDLLd3O3IXwJwvuuns8UB/HeAg== @@ -5932,10 +5967,10 @@ mux.js@6.0.1: "@babel/runtime" "^7.11.2" global "^4.4.0" -nanoid@^3.3.4: - version "3.3.4" - resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.3.4.tgz#730b67e3cd09e2deacf03c027c81c9d9dbc5e8ab" - integrity sha512-MqBkQh/OHTS2egovRtLk45wEyNXwF+cokD+1YPf9u5VfJiRdAiRwB2froX5Co9Rh20xs4siNPm8naNotSD6RBw== +nanoid@^3.3.6: + version "3.3.6" + resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.3.6.tgz#443380c856d6e9f9824267d960b4236ad583ea4c" + integrity sha512-BGcqMMJuToF7i1rt+2PWSNVnWIkGCU78jBG3RxO/bZlnZPK2Cmi2QaffxGO/2RvWi9sL+FAiRiXMgsyxQ1DIDA== natural-compare-lite@^1.4.0: version "1.4.0" @@ -5989,7 +6024,7 @@ normalize-package-data@^2.5.0: semver "2 || 3 || 4 || 5" validate-npm-package-license "^3.0.1" -normalize-package-data@^3.0.0: +normalize-package-data@^3.0.2: version "3.0.3" resolved "https://registry.yarnpkg.com/normalize-package-data/-/normalize-package-data-3.0.3.tgz#dbcc3e2da59509a0983422884cd172eefdfa525e" integrity sha512-p2W1sgqij3zMMyRC067Dg16bfzVH+w7hyegmpIvZ4JNjqtGOVAIvLmjBx3yP7YTe9vKJgkoNOPjwQGogDoMXFA== @@ -6224,7 +6259,7 @@ parse-filepath@^1.0.2: map-cache "^0.2.0" path-root "^0.1.1" -parse-json@^5.0.0: +parse-json@^5.0.0, parse-json@^5.2.0: version "5.2.0" resolved "https://registry.yarnpkg.com/parse-json/-/parse-json-5.2.0.tgz#c76fc66dee54231c962b22bcc8a72cf2f99753cd" integrity sha512-ayCKvm/phCGxOkYRSCM82iDwct8/EonSEgCSxWxD7ve6jHggsFl4fZVQBPRNgQoKiuV/odhFrGzQXZwbifC8Rg== @@ -6321,11 +6356,6 @@ pkcs7@^1.0.4: dependencies: "@babel/runtime" "^7.5.5" -postcss-media-query-parser@^0.2.3: - version "0.2.3" - resolved "https://registry.yarnpkg.com/postcss-media-query-parser/-/postcss-media-query-parser-0.2.3.tgz#27b39c6f4d94f81b1a73b8f76351c609e5cef244" - integrity 
sha512-3sOlxmbKcSHMjlUXQZKQ06jOswE7oVkXPxmZdoB1r5l0q6gTFTQSHxNxOrCccElbW7dxNytifNEo8qidX2Vsig== - postcss-resolve-nested-selector@^0.1.1: version "0.1.1" resolved "https://registry.yarnpkg.com/postcss-resolve-nested-selector/-/postcss-resolve-nested-selector-0.1.1.tgz#29ccbc7c37dedfac304e9fff0bf1596b3f6a0e4e" @@ -6341,10 +6371,10 @@ postcss-scss@^4.0.6: resolved "https://registry.yarnpkg.com/postcss-scss/-/postcss-scss-4.0.6.tgz#5d62a574b950a6ae12f2aa89b60d63d9e4432bfd" integrity sha512-rLDPhJY4z/i4nVFZ27j9GqLxj1pwxE80eAzUNRMXtcpipFYIeowerzBgG3yJhMtObGEXidtIgbUpQ3eLDsf5OQ== -postcss-selector-parser@^6.0.11: - version "6.0.11" - resolved "https://registry.yarnpkg.com/postcss-selector-parser/-/postcss-selector-parser-6.0.11.tgz#2e41dc39b7ad74046e1615185185cd0b17d0c8dc" - integrity sha512-zbARubNdogI9j7WY4nQJBiNqQf3sLS3wCP4WfOidu+p28LofJqDH1tcXypGrcmMHhDk2t9wGhCsYe/+szLTy1g== +postcss-selector-parser@^6.0.13: + version "6.0.13" + resolved "https://registry.yarnpkg.com/postcss-selector-parser/-/postcss-selector-parser-6.0.13.tgz#d05d8d76b1e8e173257ef9d60b706a8e5e99bf1b" + integrity sha512-EaV1Gl4mUEV4ddhDnv/xtj7sxwrwxdetHdWUGnT4VJQf+4d05v6lHYZr8N573k5Z0BViss7BDhfWtKS3+sfAqQ== dependencies: cssesc "^3.0.0" util-deprecate "^1.0.2" @@ -6359,12 +6389,12 @@ postcss-value-parser@^4.2.0: resolved "https://registry.yarnpkg.com/postcss-value-parser/-/postcss-value-parser-4.2.0.tgz#723c09920836ba6d3e5af019f92bc0971c02e514" integrity sha512-1NNCs6uurfkVbeXG4S8JFT9t19m45ICnif8zWLd5oPSZ50QnwMfK+H3jv408d4jw/7Bttv5axS5IiHoLaVNHeQ== -postcss@^8.4.21: - version "8.4.21" - resolved "https://registry.yarnpkg.com/postcss/-/postcss-8.4.21.tgz#c639b719a57efc3187b13a1d765675485f4134f4" - integrity sha512-tP7u/Sn/dVxK2NnruI4H9BG+x+Wxz6oeZ1cJ8P6G/PZY0IKk4k/63TDsQf2kQq3+qoJeLm2kIBUNlZe3zgb4Zg== +postcss@^8.4.21, postcss@^8.4.24: + version "8.4.25" + resolved "https://registry.yarnpkg.com/postcss/-/postcss-8.4.25.tgz#4a133f5e379eda7f61e906c3b1aaa9b81292726f" + integrity sha512-7taJ/8t2av0Z+sQEvNzCkpDynl0tX3uJMCODi6nT3PfASC7dYCWV9aQ+uiCf+KBD4SEFcu+GvJdGdwzQ6OSjCw== dependencies: - nanoid "^3.3.4" + nanoid "^3.3.6" picocolors "^1.0.0" source-map-js "^1.0.2" @@ -6465,6 +6495,11 @@ quick-lru@^4.0.1: resolved "https://registry.yarnpkg.com/quick-lru/-/quick-lru-4.0.1.tgz#5b8878f113a58217848c6482026c73e1ba57727f" integrity sha512-ARhCpm70fzdcvNQfPoy49IaanKkTlRWF2JMzqhcJbhSFRZv7nPTvZJdcY7301IPmvW+/p0RgIWnQDLJxifsQ7g== +quick-lru@^5.1.1: + version "5.1.1" + resolved "https://registry.yarnpkg.com/quick-lru/-/quick-lru-5.1.1.tgz#366493e6b3e42a3a6885e2e99d18f80fb7a8c932" + integrity sha512-WuyALRjWPDGtt/wzJiadO5AXY+8hZ80hVpe6MyivgraREW751X3SbhRvG3eLKOYN+8VEvqLcf3wdnt44Z4S4SA== + react-bootstrap@^1.6.6: version "1.6.6" resolved "https://registry.yarnpkg.com/react-bootstrap/-/react-bootstrap-1.6.6.tgz#3f3b274f8923b9886008a0e61485b5ac9a2b3073" @@ -6707,6 +6742,15 @@ read-pkg-up@^7.0.1: read-pkg "^5.2.0" type-fest "^0.8.1" +read-pkg-up@^8.0.0: + version "8.0.0" + resolved "https://registry.yarnpkg.com/read-pkg-up/-/read-pkg-up-8.0.0.tgz#72f595b65e66110f43b052dd9af4de6b10534670" + integrity sha512-snVCqPczksT0HS2EC+SxUndvSzn6LRCwpfSvLrIfR5BKDQQZMaI6jPRC9dYvYFDRAuFEAnkwww8kBBNE/3VvzQ== + dependencies: + find-up "^5.0.0" + read-pkg "^6.0.0" + type-fest "^1.0.1" + read-pkg@^5.2.0: version "5.2.0" resolved "https://registry.yarnpkg.com/read-pkg/-/read-pkg-5.2.0.tgz#7bf295438ca5a33e56cd30e053b34ee7250c93cc" @@ -6717,6 +6761,16 @@ read-pkg@^5.2.0: parse-json "^5.0.0" type-fest "^0.6.0" +read-pkg@^6.0.0: + version "6.0.0" + resolved 
"https://registry.yarnpkg.com/read-pkg/-/read-pkg-6.0.0.tgz#a67a7d6a1c2b0c3cd6aa2ea521f40c458a4a504c" + integrity sha512-X1Fu3dPuk/8ZLsMhEj5f4wFAF0DWoK7qhGJvgaijocXxBmSToKfbFtqbxMO7bVjNA1dmE5huAzjXj/ey86iw9Q== + dependencies: + "@types/normalize-package-data" "^2.4.0" + normalize-package-data "^3.0.2" + parse-json "^5.2.0" + type-fest "^1.0.1" + readable-stream@^3.4.0: version "3.6.0" resolved "https://registry.yarnpkg.com/readable-stream/-/readable-stream-3.6.0.tgz#337bbda3adc0706bd3e024426a286d4b4b2c9198" @@ -6741,6 +6795,14 @@ redent@^3.0.0: indent-string "^4.0.0" strip-indent "^3.0.0" +redent@^4.0.0: + version "4.0.0" + resolved "https://registry.yarnpkg.com/redent/-/redent-4.0.0.tgz#0c0ba7caabb24257ab3bb7a4fd95dd1d5c5681f9" + integrity sha512-tYkDkVVtYkSVhuQ4zBgfvciymHaeuel+zFKXShfDnFP5SyVEP7qo70Rf1jTOTCx3vGNAbnEi/xFkcfQVMIBWag== + dependencies: + indent-string "^5.0.0" + strip-indent "^4.0.0" + regenerate-unicode-properties@^10.1.0: version "10.1.0" resolved "https://registry.yarnpkg.com/regenerate-unicode-properties/-/regenerate-unicode-properties-10.1.0.tgz#7c3192cab6dd24e21cb4461e5ddd7dd24fa8374c" @@ -7046,19 +7108,19 @@ scuid@^1.1.0: integrity sha512-MuCAyrGZcTLfQoH2XoBlQ8C6bzwN88XT/0slOGz0pn8+gIP85BOAfYa44ZXQUTOwRwPU0QvgU+V+OSajl/59Xg== "semver@2 || 3 || 4 || 5": - version "5.7.1" - resolved "https://registry.yarnpkg.com/semver/-/semver-5.7.1.tgz#a954f931aeba508d307bbf069eff0c01c96116f7" - integrity sha512-sauaDf/PZdVgrLTNYHRtpXa1iRiKcaebiKQ1BJdpQlWH2lCvexQdX55snPFyK7QzpudqbCI0qXFfOasHdyNDGQ== + version "5.7.2" + resolved "https://registry.yarnpkg.com/semver/-/semver-5.7.2.tgz#48d55db737c3287cd4835e17fa13feace1c41ef8" + integrity sha512-cBznnQ9KjJqU67B52RMC65CMarK2600WFnbkcaiwWq3xy/5haFJlshgnpjovMVJ+Hff49d8GEn0b87C5pDQ10g== semver@^6.0.0, semver@^6.1.1, semver@^6.1.2, semver@^6.3.0: - version "6.3.0" - resolved "https://registry.yarnpkg.com/semver/-/semver-6.3.0.tgz#ee0a64c8af5e8ceea67687b133761e1becbd1d3d" - integrity sha512-b39TBaTSfV6yBrapU89p5fKekE2m/NwnDocOVruQFS1/veMgdzuPcnOM34M6CwxW8jH/lxEa5rBoDeUwu5HHTw== + version "6.3.1" + resolved "https://registry.yarnpkg.com/semver/-/semver-6.3.1.tgz#556d2ef8689146e46dcea4bfdd095f3434dffcb4" + integrity sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA== semver@^7.3.4, semver@^7.3.7, semver@^7.3.8: - version "7.3.8" - resolved "https://registry.yarnpkg.com/semver/-/semver-7.3.8.tgz#07a78feafb3f7b32347d725e33de7e2a2df67798" - integrity sha512-NB1ctGL5rlHrPJtFDVIVzTyQylMLu9N9VICA6HSFJo8MCGVTMW6gfpicwKmmK/dAjTOrqu5l63JJOpDSrAis3A== + version "7.5.4" + resolved "https://registry.yarnpkg.com/semver/-/semver-7.5.4.tgz#483986ec4ed38e1c6c48c34894a9182dbff68a6e" + integrity sha512-1bCSESV6Pv+i21Hvpxp3Dx+pSD8lIPt8uVjRrxAUt/nbswYc+tK6Y2btiULjd4+fnq15PX+nqQDC7Oft7WkwcA== dependencies: lru-cache "^6.0.0" @@ -7107,11 +7169,16 @@ side-channel@^1.0.4: get-intrinsic "^1.0.2" object-inspect "^1.9.0" -signal-exit@^3.0.2, signal-exit@^3.0.7: +signal-exit@^3.0.2: version "3.0.7" resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-3.0.7.tgz#a9a1767f8af84155114eaabd73f99273c8f59ad9" integrity sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ== +signal-exit@^4.0.1: + version "4.0.2" + resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-4.0.2.tgz#ff55bb1d9ff2114c13b400688fa544ac63c36967" + integrity sha512-MY2/qGx4enyjprQnFaZsHib3Yadh3IXyV2C321GY0pjGfVBu4un0uDJkwgdxqO+Rdx8JMT8IfJIRwbYVz3Ob3Q== + signedsource@^1.0.0: version "1.0.0" resolved 
"https://registry.yarnpkg.com/signedsource/-/signedsource-1.0.0.tgz#1ddace4981798f93bd833973803d80d52e93ad6a" @@ -7332,6 +7399,13 @@ strip-indent@^3.0.0: dependencies: min-indent "^1.0.0" +strip-indent@^4.0.0: + version "4.0.0" + resolved "https://registry.yarnpkg.com/strip-indent/-/strip-indent-4.0.0.tgz#b41379433dd06f5eae805e21d631e07ee670d853" + integrity sha512-mnVSV2l+Zv6BLpSD/8V87CW/y9EmmbYzGCIavsnsI6/nwn26DwffM/yztm30Z/I2DY9wdS3vXVCMnHDgZaVNoA== + dependencies: + min-indent "^1.0.1" + strip-json-comments@^3.1.0, strip-json-comments@^3.1.1: version "3.1.1" resolved "https://registry.yarnpkg.com/strip-json-comments/-/strip-json-comments-3.1.1.tgz#31f1281b3832630434831c310c01cccda8cbe006" @@ -7357,53 +7431,51 @@ stylelint-order@^6.0.2: postcss "^8.4.21" postcss-sorting "^8.0.1" -stylelint@^15.1.0: - version "15.1.0" - resolved "https://registry.yarnpkg.com/stylelint/-/stylelint-15.1.0.tgz#24d7cbe06250ceca3b276393bfdeaaaba4356195" - integrity sha512-Tw8OyIiYhxnIHUzgoLlCyWgCUKsPYiP3TDgs7M1VbayS+q5qZly2yxABg+YPe/hFRWiu0cOtptCtpyrn1CrnYw== +stylelint@^15.10.1: + version "15.10.1" + resolved "https://registry.yarnpkg.com/stylelint/-/stylelint-15.10.1.tgz#93f189958687e330c106b010cbec0c41dcae506d" + integrity sha512-CYkzYrCFfA/gnOR+u9kJ1PpzwG10WLVnoxHDuBA/JiwGqdM9+yx9+ou6SE/y9YHtfv1mcLo06fdadHTOx4gBZQ== dependencies: - "@csstools/css-parser-algorithms" "^2.0.1" - "@csstools/css-tokenizer" "^2.0.1" - "@csstools/media-query-list-parser" "^2.0.1" - "@csstools/selector-specificity" "^2.1.1" + "@csstools/css-parser-algorithms" "^2.3.0" + "@csstools/css-tokenizer" "^2.1.1" + "@csstools/media-query-list-parser" "^2.1.2" + "@csstools/selector-specificity" "^3.0.0" balanced-match "^2.0.0" colord "^2.9.3" - cosmiconfig "^8.0.0" + cosmiconfig "^8.2.0" css-functions-list "^3.1.0" css-tree "^2.3.1" debug "^4.3.4" - fast-glob "^3.2.12" + fast-glob "^3.3.0" fastest-levenshtein "^1.0.16" file-entry-cache "^6.0.1" global-modules "^2.0.0" globby "^11.1.0" globjoin "^0.1.4" - html-tags "^3.2.0" + html-tags "^3.3.1" ignore "^5.2.4" import-lazy "^4.0.0" imurmurhash "^0.1.4" is-plain-object "^5.0.0" - known-css-properties "^0.26.0" + known-css-properties "^0.27.0" mathml-tag-names "^2.1.3" - meow "^9.0.0" + meow "^10.1.5" micromatch "^4.0.5" normalize-path "^3.0.0" picocolors "^1.0.0" - postcss "^8.4.21" - postcss-media-query-parser "^0.2.3" + postcss "^8.4.24" postcss-resolve-nested-selector "^0.1.1" postcss-safe-parser "^6.0.0" - postcss-selector-parser "^6.0.11" + postcss-selector-parser "^6.0.13" postcss-value-parser "^4.2.0" resolve-from "^5.0.0" string-width "^4.2.3" strip-ansi "^6.0.1" style-search "^0.1.0" - supports-hyperlinks "^2.3.0" + supports-hyperlinks "^3.0.0" svg-tags "^1.0.0" table "^6.8.1" - v8-compile-cache "^2.3.0" - write-file-atomic "^5.0.0" + write-file-atomic "^5.0.1" stylis@4.1.3: version "4.1.3" @@ -7424,10 +7496,10 @@ supports-color@^7.0.0, supports-color@^7.1.0: dependencies: has-flag "^4.0.0" -supports-hyperlinks@^2.3.0: - version "2.3.0" - resolved "https://registry.yarnpkg.com/supports-hyperlinks/-/supports-hyperlinks-2.3.0.tgz#3943544347c1ff90b15effb03fc14ae45ec10624" - integrity sha512-RpsAZlpWcDwOPQA22aCH4J0t7L8JmAvsCxfOSEwm7cQs3LshN36QaTkwd70DnBOXDWGssw2eUoc8CaRWT0XunA== +supports-hyperlinks@^3.0.0: + version "3.0.0" + resolved "https://registry.yarnpkg.com/supports-hyperlinks/-/supports-hyperlinks-3.0.0.tgz#c711352a5c89070779b4dad54c05a2f14b15c94b" + integrity sha512-QBDPHyPQDRTy9ku4URNGY5Lah8PAaXs6tAAwp55sL5WCsSW7GIfdf6W5ixfziW+t7wh3GVvHyHHyQ1ESsoRvaA== dependencies: has-flag 
"^4.0.0" supports-color "^7.0.0" @@ -7561,6 +7633,11 @@ trim-newlines@^3.0.0: resolved "https://registry.yarnpkg.com/trim-newlines/-/trim-newlines-3.0.1.tgz#260a5d962d8b752425b32f3a7db0dcacd176c144" integrity sha512-c1PTsA3tYrIsLGkJkzHF+w9F2EyxfXGo4UyJc4pFL++FMjnq0HJS69T3M7d//gKrFKwy429bouPescbjecU+Zw== +trim-newlines@^4.0.2: + version "4.1.1" + resolved "https://registry.yarnpkg.com/trim-newlines/-/trim-newlines-4.1.1.tgz#28c88deb50ed10c7ba6dc2474421904a00139125" + integrity sha512-jRKj0n0jXWo6kh62nA5TEh3+4igKDXLvzBJcPpiizP7oOolUrYIxmVBG9TOtHYFHoddUk6YvAkGeGoSVTXfQXQ== + trough@^1.0.0: version "1.0.5" resolved "https://registry.yarnpkg.com/trough/-/trough-1.0.5.tgz#b8b639cefad7d0bb2abd37d433ff8293efa5f406" @@ -7646,11 +7723,6 @@ type-fest@^0.13.1: resolved "https://registry.yarnpkg.com/type-fest/-/type-fest-0.13.1.tgz#0172cb5bce80b0bd542ea348db50c7e21834d934" integrity sha512-34R7HTnG0XIJcBSn5XhDd7nNFPRcXYRZrBB2O2jdKqYODldSzBAqzsWoZYYvduky73toYS/ESqxPvkDf/F0XMg== -type-fest@^0.18.0: - version "0.18.1" - resolved "https://registry.yarnpkg.com/type-fest/-/type-fest-0.18.1.tgz#db4bc151a4a2cf4eebf9add5db75508db6cc841f" - integrity sha512-OIAYXk8+ISY+qTOwkHtKqzAuxchoMiD9Udx+FSGQDuiRR+PJKJHc2NJAXlbhkGwTt/4/nKZxELY1w3ReWOL8mw== - type-fest@^0.20.2: version "0.20.2" resolved "https://registry.yarnpkg.com/type-fest/-/type-fest-0.20.2.tgz#1bf207f4b28f91583666cb5fbd327887301cd5f4" @@ -7671,6 +7743,11 @@ type-fest@^0.8.1: resolved "https://registry.yarnpkg.com/type-fest/-/type-fest-0.8.1.tgz#09e249ebde851d3b1e48d27c105444667f17b83d" integrity sha512-4dbzIzqvjtgiM5rw1k5rEHtBANKmdudhGyBEajN01fEyhaAIhsoKNy6y7+IN93IfpFtwY9iqi7kD+xwKhQsNJA== +type-fest@^1.0.1, type-fest@^1.2.1, type-fest@^1.2.2: + version "1.4.0" + resolved "https://registry.yarnpkg.com/type-fest/-/type-fest-1.4.0.tgz#e9fb813fe3bf1744ec359d55d1affefa76f14be1" + integrity sha512-yGSza74xk0UG8k+pLh5oeoYirvIiWo5t0/o3zHHAO2tRDiZcxWP7fywNlXhqb6/r6sWvwi+RsyQMWhVLe4BVuA== + type-fest@^2.19.0: version "2.19.0" resolved "https://registry.yarnpkg.com/type-fest/-/type-fest-2.19.0.tgz#88068015bb33036a598b952e55e9311a60fd3a9b" @@ -7911,11 +7988,6 @@ v8-compile-cache-lib@^3.0.1: resolved "https://registry.yarnpkg.com/v8-compile-cache-lib/-/v8-compile-cache-lib-3.0.1.tgz#6336e8d71965cb3d35a1bbb7868445a7c05264bf" integrity sha512-wa7YjyUGfNZngI/vtK0UHAN+lgDCxBPCylVXGp0zu59Fz5aiGtNXaq3DhIov063MorB+VfufLh3JlF2KdTK3xg== -v8-compile-cache@^2.3.0: - version "2.3.0" - resolved "https://registry.yarnpkg.com/v8-compile-cache/-/v8-compile-cache-2.3.0.tgz#2de19618c66dc247dcfb6f99338035d8245a2cee" - integrity sha512-l8lCEmLcLYZh4nbunNZvQCJc5pv7+RCwa8q/LdUx8u7lsWvPDKmpodJAJNwkAhJC//dFY48KuIEmjtd4RViDrA== - validate-npm-package-license@^3.0.1: version "3.0.4" resolved "https://registry.yarnpkg.com/validate-npm-package-license/-/validate-npm-package-license-3.0.4.tgz#fc91f6b9c7ba15c857f4cb2c5defeec39d4f410a" @@ -8202,13 +8274,13 @@ write-file-atomic@^3.0.0: signal-exit "^3.0.2" typedarray-to-buffer "^3.1.5" -write-file-atomic@^5.0.0: - version "5.0.0" - resolved "https://registry.yarnpkg.com/write-file-atomic/-/write-file-atomic-5.0.0.tgz#54303f117e109bf3d540261125c8ea5a7320fab0" - integrity sha512-R7NYMnHSlV42K54lwY9lvW6MnSm1HSJqZL3xiSgi9E7//FYaI74r2G0rd+/X6VAMkHEdzxQaU5HUOXWUz5kA/w== +write-file-atomic@^5.0.1: + version "5.0.1" + resolved "https://registry.yarnpkg.com/write-file-atomic/-/write-file-atomic-5.0.1.tgz#68df4717c55c6fa4281a7860b4c2ba0a6d2b11e7" + integrity sha512-+QU2zd6OTD8XWIJCbffaiQeH9U73qIqafo1x6V1snCWYGJf6cVE0cDR4D8xRzcEnfI21IFrUPzPGtcPf8AC+Rw== 
dependencies: imurmurhash "^0.1.4" - signal-exit "^3.0.7" + signal-exit "^4.0.1" write-json-file@^4.3.0: version "4.3.0" @@ -8270,7 +8342,7 @@ yargs-parser@^18.1.2, yargs-parser@^18.1.3: camelcase "^5.0.0" decamelize "^1.2.0" -yargs-parser@^20.2.3: +yargs-parser@^20.2.9: version "20.2.9" resolved "https://registry.yarnpkg.com/yargs-parser/-/yargs-parser-20.2.9.tgz#2eb7dc3b0289718fc295f362753845c41a0c94ee" integrity sha512-y11nGElTIV+CT3Zv9t7VKl+Q3hTQoT9a1Qzezhhl6Rp21gJ/IVTW7Z3y9EWXhuUBC2Shnf+DX0antecpAwSP8w==