Compare commits

...

62 Commits

Author SHA1 Message Date
dependabot[bot]
8da2db42f1 Bump docker/setup-buildx-action from 3 to 4
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3 to 4.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3...v4)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-version: '4'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-03-11 16:57:34 +00:00
Maximilian Dorninger
f5253990e0 switch from APScheduler to Taskiq (#461)
This PR replaces the APScheduler lib with the Taskiq task queuing lib.

# why

APScheduler doesn't support FastAPI's dependency injection (DI) in tasks,
which makes them quite cumbersome to read and write, since the DB,
repositories, and services all need to be instantiated manually.

Moreover, Taskiq makes it easier to start background tasks from FastAPI
requests. This enables MM to move to a more event-based architecture.
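To make the DI point concrete, here is a rough, stdlib-only sketch of the mechanism FastAPI-style dependency injection provides in tasks. The names `Depends`, `run_task`, `get_db`, and `update_metadata` are hypothetical illustrations for this sketch, not Taskiq's or MM's actual API (Taskiq's real marker is `TaskiqDepends`):

```python
import inspect


class Depends:
    """Marker mimicking FastAPI/Taskiq-style dependency defaults (illustrative only)."""

    def __init__(self, factory):
        self.factory = factory


def run_task(func, **overrides):
    """Call a task, resolving any Depends(...) defaults that weren't passed in."""
    kwargs = dict(overrides)
    for name, param in inspect.signature(func).parameters.items():
        if name not in kwargs and isinstance(param.default, Depends):
            # The framework, not the task author, builds the dependency.
            kwargs[name] = param.default.factory()
    return func(**kwargs)


def get_db():
    # Stand-in for constructing a real DB session.
    return "db-session"


def update_metadata(item_id: int, db: str = Depends(get_db)) -> str:
    # The task declares what it needs; it never instantiates the DB itself.
    return f"updated {item_id} using {db}"


print(run_task(update_metadata, item_id=42))
# -> updated 42 using db-session
```

Without this pattern (as with APScheduler), every task body would open with boilerplate that manually constructs the DB session, repositories, and services before any real work starts.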

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
  * App now uses an orchestrated async startup/shutdown and runs
background scheduling via a database-backed task queue; startup enqueues
pre-load/import/update tasks.

* **Bug Fixes**
  * Improved torrent client handling with clearer conflict messages and
guidance for manual resolution.
  * Enhanced logging around season, episode, and metadata update
operations; minor regex/behaviour formatting preserved.

* **Chores**
  * Updated dependencies to support the new task queue and connection
pooling.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-02-26 21:23:24 +01:00
dependabot[bot]
e529e0c0a3 Bump @eslint/compat from 1.4.1 to 2.0.2 in /web (#465)
Bumps
[@eslint/compat](https://github.com/eslint/rewrite/tree/HEAD/packages/compat)
from 1.4.1 to 2.0.2.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/eslint/rewrite/releases"><code>@​eslint/compat</code>'s
releases</a>.</em></p>
<blockquote>
<h2>compat: v2.0.2</h2>
<h2><a
href="https://github.com/eslint/rewrite/compare/compat-v2.0.1...compat-v2.0.2">2.0.2</a>
(2026-01-29)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>add eslint 10 as peer dependency (<a
href="https://redirect.github.com/eslint/rewrite/issues/361">#361</a>)
(<a
href="ecb37dcafc">ecb37dc</a>)</li>
</ul>
<h3>Dependencies</h3>
<ul>
<li>The following workspace dependencies were updated
<ul>
<li>dependencies
<ul>
<li><code>@​eslint/core</code> bumped from ^1.0.1 to ^1.1.0</li>
</ul>
</li>
</ul>
</li>
</ul>
<h2>migrate-config: v2.0.2</h2>
<h2><a
href="https://github.com/eslint/rewrite/compare/migrate-config-v2.0.1...migrate-config-v2.0.2">2.0.2</a>
(2026-01-29)</h2>
<h3>Dependencies</h3>
<ul>
<li>The following workspace dependencies were updated
<ul>
<li>dependencies
<ul>
<li><code>@​eslint/compat</code> bumped from ^2.0.1 to ^2.0.2</li>
</ul>
</li>
<li>devDependencies
<ul>
<li><code>@​eslint/core</code> bumped from ^1.0.1 to ^1.1.0</li>
</ul>
</li>
</ul>
</li>
</ul>
<h2>compat: v2.0.1</h2>
<h2><a
href="https://github.com/eslint/rewrite/compare/compat-v2.0.0...compat-v2.0.1">2.0.1</a>
(2026-01-08)</h2>
<h3>Dependencies</h3>
<ul>
<li>The following workspace dependencies were updated
<ul>
<li>dependencies
<ul>
<li><code>@​eslint/core</code> bumped from ^1.0.0 to ^1.0.1</li>
</ul>
</li>
</ul>
</li>
</ul>
<h2>migrate-config: v2.0.1</h2>
<h2><a
href="https://github.com/eslint/rewrite/compare/migrate-config-v2.0.0...migrate-config-v2.0.1">2.0.1</a>
(2026-01-08)</h2>
<h3>Dependencies</h3>
<ul>
<li>The following workspace dependencies were updated
<ul>
<li>dependencies
<ul>
<li><code>@​eslint/compat</code> bumped from ^2.0.0 to ^2.0.1</li>
</ul>
</li>
<li>devDependencies
<ul>
<li><code>@​eslint/core</code> bumped from ^1.0.0 to ^1.0.1</li>
</ul>
</li>
</ul>
</li>
</ul>
<h2>compat: v2.0.0</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/eslint/rewrite/blob/main/packages/compat/CHANGELOG.md"><code>@​eslint/compat</code>'s
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/eslint/rewrite/compare/compat-v2.0.1...compat-v2.0.2">2.0.2</a>
(2026-01-29)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>add eslint 10 as peer dependency (<a
href="https://redirect.github.com/eslint/rewrite/issues/361">#361</a>)
(<a
href="ecb37dcafc">ecb37dc</a>)</li>
</ul>
<h3>Dependencies</h3>
<ul>
<li>The following workspace dependencies were updated
<ul>
<li>dependencies
<ul>
<li><code>@​eslint/core</code> bumped from ^1.0.1 to ^1.1.0</li>
</ul>
</li>
</ul>
</li>
</ul>
<h2><a
href="https://github.com/eslint/rewrite/compare/compat-v2.0.0...compat-v2.0.1">2.0.1</a>
(2026-01-08)</h2>
<h3>Dependencies</h3>
<ul>
<li>The following workspace dependencies were updated
<ul>
<li>dependencies
<ul>
<li><code>@​eslint/core</code> bumped from ^1.0.0 to ^1.0.1</li>
</ul>
</li>
</ul>
</li>
</ul>
<h2><a
href="https://github.com/eslint/rewrite/compare/compat-v1.4.1...compat-v2.0.0">2.0.0</a>
(2025-11-14)</h2>
<h3>⚠ BREAKING CHANGES</h3>
<ul>
<li>Require Node.js ^20.19.0 || ^22.13.0 || &gt;=24 (<a
href="https://redirect.github.com/eslint/rewrite/issues/297">#297</a>)</li>
</ul>
<h3>Features</h3>
<ul>
<li>patch missing context and SourceCode methods for v10 (<a
href="https://redirect.github.com/eslint/rewrite/issues/311">#311</a>)
(<a
href="a40d8c60af">a40d8c6</a>)</li>
<li>Require Node.js ^20.19.0 || ^22.13.0 || &gt;=24 (<a
href="https://redirect.github.com/eslint/rewrite/issues/297">#297</a>)
(<a
href="acc623c807">acc623c</a>)</li>
</ul>
<h3>Dependencies</h3>
<ul>
<li>The following workspace dependencies were updated
<ul>
<li>dependencies
<ul>
<li><code>@​eslint/core</code> bumped from ^0.17.0 to ^1.0.0</li>
</ul>
</li>
</ul>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7960653fe6"><code>7960653</code></a>
chore: release main (<a
href="https://github.com/eslint/rewrite/tree/HEAD/packages/compat/issues/356">#356</a>)</li>
<li><a
href="ecb37dcafc"><code>ecb37dc</code></a>
fix: add eslint 10 as peer dependency (<a
href="https://github.com/eslint/rewrite/tree/HEAD/packages/compat/issues/361">#361</a>)</li>
<li><a
href="074cac2268"><code>074cac2</code></a>
docs: Update README sponsors</li>
<li><a
href="a3b0fd5102"><code>a3b0fd5</code></a>
docs: Update README sponsors</li>
<li><a
href="7abc05147e"><code>7abc051</code></a>
chore: release main (<a
href="https://github.com/eslint/rewrite/tree/HEAD/packages/compat/issues/336">#336</a>)</li>
<li><a
href="f0b5b68e6d"><code>f0b5b68</code></a>
docs: Update README sponsors</li>
<li><a
href="b65204d085"><code>b65204d</code></a>
docs: Update README sponsors</li>
<li><a
href="5f8bc5b872"><code>5f8bc5b</code></a>
ci: run <code>arethetypeswrong</code> on packages with types (<a
href="https://github.com/eslint/rewrite/tree/HEAD/packages/compat/issues/338">#338</a>)</li>
<li><a
href="d9eb64a30a"><code>d9eb64a</code></a>
docs: Update README sponsors</li>
<li><a
href="7444f36783"><code>7444f36</code></a>
docs: Update README sponsors</li>
<li>Additional commits viewable in <a
href="https://github.com/eslint/rewrite/commits/compat-v2.0.2/packages/compat">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@eslint/compat&package-manager=npm_and_yarn&previous-version=1.4.1&new-version=2.0.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-26 21:21:11 +01:00
dependabot[bot]
46a9760376 Bump ruff from 0.14.10 to 0.15.2 (#467)
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.14.10 to 0.15.2.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.15.2</h2>
<h2>Release Notes</h2>
<p>Released on 2026-02-19.</p>
<h3>Preview features</h3>
<ul>
<li>
<p>Expand the default rule set (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23385">#23385</a>)</p>
<p>In preview, Ruff now enables a significantly expanded default rule
set of 412 rules, up from the stable default set of 59 rules. The new
rules are mostly a superset of the stable defaults, with the exception
of these rules, which are removed from the preview defaults:</p>
<ul>
<li><a
href="https://docs.astral.sh/ruff/rules/multiple-imports-on-one-line"><code>multiple-imports-on-one-line</code></a>
(<code>E401</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/module-import-not-at-top-of-file"><code>module-import-not-at-top-of-file</code></a>
(<code>E402</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/multiple-statements-on-one-line-colon"><code>multiple-statements-on-one-line-colon</code></a>
(<code>E701</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/multiple-statements-on-one-line-semicolon"><code>multiple-statements-on-one-line-semicolon</code></a>
(<code>E702</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/useless-semicolon"><code>useless-semicolon</code></a>
(<code>E703</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/none-comparison"><code>none-comparison</code></a>
(<code>E711</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/true-false-comparison"><code>true-false-comparison</code></a>
(<code>E712</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/not-in-test"><code>not-in-test</code></a>
(<code>E713</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/not-is-test"><code>not-is-test</code></a>
(<code>E714</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/type-comparison"><code>type-comparison</code></a>
(<code>E721</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/lambda-assignment"><code>lambda-assignment</code></a>
(<code>E731</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/ambiguous-variable-name"><code>ambiguous-variable-name</code></a>
(<code>E741</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/ambiguous-class-name"><code>ambiguous-class-name</code></a>
(<code>E742</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/ambiguous-function-name"><code>ambiguous-function-name</code></a>
(<code>E743</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/undefined-local-with-import-star"><code>undefined-local-with-import-star</code></a>
(<code>F403</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/undefined-local-with-import-star-usage"><code>undefined-local-with-import-star-usage</code></a>
(<code>F405</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/undefined-local-with-nested-import-star-usage"><code>undefined-local-with-nested-import-star-usage</code></a>
(<code>F406</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/forward-annotation-syntax-error"><code>forward-annotation-syntax-error</code></a>
(<code>F722</code>)</li>
</ul>
<p>If you use preview and prefer the old defaults, you can restore them
with configuration like:</p>
<pre lang="toml"><code># ruff.toml
[lint]
select = [&quot;E4&quot;, &quot;E7&quot;, &quot;E9&quot;, &quot;F&quot;]

# pyproject.toml
[tool.ruff.lint]
select = [&quot;E4&quot;, &quot;E7&quot;, &quot;E9&quot;, &quot;F&quot;]
</code></pre>
<p>If you do give them a try, feel free to share your feedback in the <a
href="https://github.com/astral-sh/ruff/discussions/23203">GitHub
discussion</a>!</p>
</li>
<li>
<p>[<code>flake8-pyi</code>] Also check string annotations
(<code>PYI041</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/19023">#19023</a>)</p>
</li>
</ul>
<h3>Bug fixes</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.15.2</h2>
<p>Released on 2026-02-19.</p>
<h3>Preview features</h3>
<ul>
<li>
<p>Expand the default rule set (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23385">#23385</a>)</p>
<p>In preview, Ruff now enables a significantly expanded default rule
set of 412
rules, up from the stable default set of 59 rules. The new rules are
mostly a
superset of the stable defaults, with the exception of these rules,
which are
removed from the preview defaults:</p>
<ul>
<li><a
href="https://docs.astral.sh/ruff/rules/multiple-imports-on-one-line"><code>multiple-imports-on-one-line</code></a>
(<code>E401</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/module-import-not-at-top-of-file"><code>module-import-not-at-top-of-file</code></a>
(<code>E402</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/multiple-statements-on-one-line-colon"><code>multiple-statements-on-one-line-colon</code></a>
(<code>E701</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/multiple-statements-on-one-line-semicolon"><code>multiple-statements-on-one-line-semicolon</code></a>
(<code>E702</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/useless-semicolon"><code>useless-semicolon</code></a>
(<code>E703</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/none-comparison"><code>none-comparison</code></a>
(<code>E711</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/true-false-comparison"><code>true-false-comparison</code></a>
(<code>E712</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/not-in-test"><code>not-in-test</code></a>
(<code>E713</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/not-is-test"><code>not-is-test</code></a>
(<code>E714</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/type-comparison"><code>type-comparison</code></a>
(<code>E721</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/lambda-assignment"><code>lambda-assignment</code></a>
(<code>E731</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/ambiguous-variable-name"><code>ambiguous-variable-name</code></a>
(<code>E741</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/ambiguous-class-name"><code>ambiguous-class-name</code></a>
(<code>E742</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/ambiguous-function-name"><code>ambiguous-function-name</code></a>
(<code>E743</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/undefined-local-with-import-star"><code>undefined-local-with-import-star</code></a>
(<code>F403</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/undefined-local-with-import-star-usage"><code>undefined-local-with-import-star-usage</code></a>
(<code>F405</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/undefined-local-with-nested-import-star-usage"><code>undefined-local-with-nested-import-star-usage</code></a>
(<code>F406</code>)</li>
<li><a
href="https://docs.astral.sh/ruff/rules/forward-annotation-syntax-error"><code>forward-annotation-syntax-error</code></a>
(<code>F722</code>)</li>
</ul>
<p>If you use preview and prefer the old defaults, you can restore them
with
configuration like:</p>
<pre lang="toml"><code># ruff.toml
[lint]
select = [&quot;E4&quot;, &quot;E7&quot;, &quot;E9&quot;, &quot;F&quot;]

# pyproject.toml
[tool.ruff.lint]
select = [&quot;E4&quot;, &quot;E7&quot;, &quot;E9&quot;, &quot;F&quot;]
</code></pre>
<p>If you do give them a try, feel free to share your feedback in the <a
href="https://github.com/astral-sh/ruff/discussions/23203">GitHub
discussion</a>!</p>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9d18ee9115"><code>9d18ee9</code></a>
Hard code workflow name and <code>cancel-in-progress</code> only for PRs
(<a
href="https://redirect.github.com/astral-sh/ruff/issues/23431">#23431</a>)</li>
<li><a
href="7cc15f024b"><code>7cc15f0</code></a>
Bump 0.15.2 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/23430">#23430</a>)</li>
<li><a
href="d1b544393a"><code>d1b5443</code></a>
Add extension mapping to configuration file options (<a
href="https://redirect.github.com/astral-sh/ruff/issues/23384">#23384</a>)</li>
<li><a
href="222574af90"><code>222574a</code></a>
Expand the default rule set (<a
href="https://redirect.github.com/astral-sh/ruff/issues/23385">#23385</a>)</li>
<li><a
href="1465b5de38"><code>1465b5d</code></a>
[<code>flake8-async</code>] Fix <code>in_async_context</code> logic (<a
href="https://redirect.github.com/astral-sh/ruff/issues/23426">#23426</a>)</li>
<li><a
href="410902fa40"><code>410902f</code></a>
[<code>pyupgrade</code>] Fix handling of <code>typing.{io,re}</code>
(<code>UP035</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/23131">#23131</a>)</li>
<li><a
href="729610acd9"><code>729610a</code></a>
[ty] Fall back to ambiguous for large control flow graphs (<a
href="https://redirect.github.com/astral-sh/ruff/issues/23399">#23399</a>)</li>
<li><a
href="1425c185b0"><code>1425c18</code></a>
[ty] Add code folding support</li>
<li><a
href="97acaaea5f"><code>97acaae</code></a>
[ty] Fix stack overflow for self-referential <code>TypeOf</code> in
annotations (<a
href="https://redirect.github.com/astral-sh/ruff/issues/23407">#23407</a>)</li>
<li><a
href="1f380c8258"><code>1f380c8</code></a>
[ty] Update tests <code>reveal_type</code> and <code>Never</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/23418">#23418</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.14.10...0.15.2">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=uv&previous-version=0.14.10&new-version=0.15.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


---


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-26 21:20:43 +01:00
dependabot[bot]
8270d1d3ff Bump sqlalchemy from 2.0.45 to 2.0.47 (#471)
Bumps [sqlalchemy](https://github.com/sqlalchemy/sqlalchemy) from 2.0.45
to 2.0.47.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/sqlalchemy/sqlalchemy/releases">sqlalchemy's
releases</a>.</em></p>
<blockquote>
<h1>2.0.47</h1>
<p>Released: February 24, 2026</p>
<h2>orm</h2>
<ul>
<li>
<p><strong>[orm] [bug]</strong> Fixed issue when using ORM mappings with
Python 3.14's <a href="https://peps.python.org/pep-0649">PEP 649</a>
feature
that no longer requires &quot;future annotations&quot;, where the ORM's
introspection
of the <code>__init__</code> method of mapped classes would fail if
non-present
identifiers in annotations were present. The vendored
<code>getfullargspec()</code>
method has been amended to use <code>Format.FORWARDREF</code> under
Python 3.14 to
prevent resolution of names that aren't present.</p>
<p>References: <a
href="https://www.sqlalchemy.org/trac/ticket/13104">#13104</a></p>
</li>
</ul>
<h2>engine</h2>
<ul>
<li>
<p><strong>[engine] [usecase]</strong> The connection object returned by
<code>_engine.Engine.raw_connection()</code>
now supports the context manager protocol, automatically returning the
connection to the pool when exiting the context.</p>
<p>References: <a
href="https://www.sqlalchemy.org/trac/ticket/13116">#13116</a></p>
</li>
</ul>
<h2>postgresql</h2>
<ul>
<li>
<p><strong>[postgresql] [bug]</strong> Fixed an issue in the PostgreSQL
dialect where foreign key constraint
reflection would incorrectly swap or fail to capture
<code>onupdate</code> and
<code>ondelete</code> values when these clauses appeared in a different
order than
expected in the constraint definition. This issue primarily affected
PostgreSQL-compatible databases such as CockroachDB, which may return
<code>ON DELETE</code> before <code>ON UPDATE</code> in the constraint
definition string. The
reflection logic now correctly parses both clauses regardless of their
ordering.</p>
<p>References: <a
href="https://www.sqlalchemy.org/trac/ticket/13105">#13105</a></p>
</li>
<li>
<p><strong>[postgresql] [bug]</strong> Fixed issue in the
<code>engine_insertmanyvalues</code> feature where using
PostgreSQL's <code>ON CONFLICT</code> clause with
<code>_dml.Insert.returning.sort_by_parameter_order</code> enabled would
generate invalid SQL when the insert used an implicit sentinel
(server-side
autoincrement primary key). The generated SQL would incorrectly declare
a
sentinel counter column in the <code>imp_sen</code> table alias without
providing
corresponding values in the <code>VALUES</code> clause, leading to a
<code>ProgrammingError</code> indicating column count mismatch. The fix
allows batch
execution mode when <code>embed_values_counter</code> is active, as the
embedded
counter provides the ordering capability needed even with upsert
behaviors,
rather than unnecessarily downgrading to row-at-a-time execution.</p>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/sqlalchemy/sqlalchemy/commits">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sqlalchemy&package-manager=uv&previous-version=2.0.45&new-version=2.0.47)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


---


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-26 21:20:32 +01:00
dependabot[bot]
6ef200c558 Bump ty from 0.0.9 to 0.0.18 (#470)
Bumps [ty](https://github.com/astral-sh/ty) from 0.0.9 to 0.0.18.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ty/releases">ty's
releases</a>.</em></p>
<blockquote>
<h2>0.0.18</h2>
<h2>Release Notes</h2>
<p>Released on 2026-02-20.</p>
<h3>Bug fixes</h3>
<ul>
<li>Support classes dynamically created via <code>type(...)</code> with
cyclic bases (<a
href="https://redirect.github.com/astral-sh/ruff/pull/22792">#22792</a>)</li>
<li>Fix incorrect types inferred when unpacking mixed tuples (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23437">#23437</a>)</li>
<li>Fix stack overflow for self-referential <code>TypeOf</code> in
annotations (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23407">#23407</a>)</li>
<li>Fix several server panics that could occur when computing semantic
tokens for the current file (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23403">#23403</a>),
<a
href="https://redirect.github.com/astral-sh/ruff/pull/23398">#23398</a>,
<a
href="https://redirect.github.com/astral-sh/ruff/pull/23401">#23401</a>)</li>
</ul>
<h3>LSP server</h3>
<ul>
<li>Add code folding support (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23393">#23393</a>)</li>
<li>Add warning message when running <code>ty server</code>
interactively (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23416">#23416</a>)</li>
<li>Exclude test-related symbols from non-first-party packages in
auto-import completions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23252">#23252</a>)</li>
<li>Fix bug where diagnostics could disappear after opening an external
file (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23447">#23447</a>)</li>
<li>Remove spurious destination for Go-To Definition on variables
defined in a loop (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23391">#23391</a>)</li>
<li>Use the fully qualified name when &quot;baking&quot; an inlay hint
into the source code if the scope already contains a variable with the
same name as the unqualified name (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23265">#23265</a>)</li>
<li>Resolve TypeVars in <code>call_signature_details</code> parameter
types (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23149">#23149</a>)</li>
</ul>
<h3>CLI</h3>
<ul>
<li>Add <code>--output-format</code> to <code>ty version</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23387">#23387</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>Add <code>replace-imports-with-any</code> option (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23122">#23122</a>)</li>
<li>Support shellexpand for configuration paths (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23274">#23274</a>)</li>
</ul>
<h3>Type checking</h3>
<ul>
<li>Add a new diagnostic to detect invalid class patterns in
<code>match</code> statements (<a
href="https://redirect.github.com/astral-sh/ruff/pull/22939">#22939</a>)</li>
<li>Allow <code>Self</code> in <code>ClassVar</code> type annotations
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/23362">#23362</a>)</li>
<li>Consider synthesized methods and <code>ClassVar</code>-qualified
declarations when determining whether an abstract method has been
overridden in a subclass (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23381">#23381</a>)</li>
<li>Add a diagnostic when combining <code>Final</code> and
<code>ClassVar</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23365">#23365</a>)</li>
<li>Fix return type of <code>assert_never</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23389">#23389</a>)</li>
<li>Fix <code>assert_type</code> diagnostic messages (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23342">#23342</a>)</li>
<li>Ban PEP-613 type alias values from containing type-qualifier special
forms (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23444">#23444</a>)</li>
<li>Infer <code>LiteralString</code> for <code>f&quot;{literal_str_a}
{literal_str_b}&quot;</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23346">#23346</a>)</li>
<li>Infer precise types for bit-shift operations on integer literals (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23301">#23301</a>)</li>
<li>Make <code>[abstract-method-in-final-class]</code> diagnostics less
verbose for classes with many abstract methods (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23379">#23379</a>)</li>
<li>Improve diagnostics for abstract <code>@final</code> classes (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23376">#23376</a>)</li>
<li>Only perform literal promotion for implicitly inferred literals (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23107">#23107</a>)</li>
<li>Parenthesize callable types when they appear in the return
annotation of other callable types (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23327">#23327</a>)</li>
<li>Consider a call to a generic function returning <code>Never</code>
to terminate control flow (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23419">#23419</a>)</li>
<li>Support calls to intersection types (<a
href="https://redirect.github.com/astral-sh/ruff/pull/22469">#22469</a>)</li>
<li>Validate annotated assignments to attributes on self (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23388">#23388</a>)</li>
<li>Treat a bytes-literal type as a subtype of
<code>Sequence[&lt;constituent integers in the bytestring&gt;]</code>
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/23329">#23329</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ty/blob/main/CHANGELOG.md">ty's
changelog</a>.</em></p>
<blockquote>
<h2>0.0.18</h2>
<p>Released on 2026-02-20.</p>
<h3>Bug fixes</h3>
<ul>
<li>Support classes dynamically created via <code>type(...)</code> with
cyclic bases (<a
href="https://redirect.github.com/astral-sh/ruff/pull/22792">#22792</a>)</li>
<li>Fix incorrect types inferred when unpacking mixed tuples (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23437">#23437</a>)</li>
<li>Fix stack overflow for self-referential <code>TypeOf</code> in
annotations (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23407">#23407</a>)</li>
<li>Fix several server panics that could occur when computing semantic
tokens for the current file (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23403">#23403</a>),
<a
href="https://redirect.github.com/astral-sh/ruff/pull/23398">#23398</a>,
<a
href="https://redirect.github.com/astral-sh/ruff/pull/23401">#23401</a>)</li>
</ul>
<h3>LSP server</h3>
<ul>
<li>Add code folding support (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23393">#23393</a>)</li>
<li>Add warning message when running <code>ty server</code>
interactively (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23416">#23416</a>)</li>
<li>Exclude test-related symbols from non-first-party packages in
auto-import completions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23252">#23252</a>)</li>
<li>Fix bug where diagnostics could disappear after opening an external
file (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23447">#23447</a>)</li>
<li>Remove spurious destination for Go-To Definition on variables
defined in a loop (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23391">#23391</a>)</li>
<li>Use the fully qualified name when &quot;baking&quot; an inlay hint
into the source code if the scope already contains a variable with the
same name as the unqualified name (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23265">#23265</a>)</li>
<li>Resolve TypeVars in <code>call_signature_details</code> parameter
types (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23149">#23149</a>)</li>
</ul>
<h3>CLI</h3>
<ul>
<li>Add <code>--output-format</code> to <code>ty version</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23387">#23387</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>Add <code>replace-imports-with-any</code> option (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23122">#23122</a>)</li>
<li>Support shellexpand for configuration paths (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23274">#23274</a>)</li>
</ul>
<h3>Type checking</h3>
<ul>
<li>Add a new diagnostic to detect invalid class patterns in
<code>match</code> statements (<a
href="https://redirect.github.com/astral-sh/ruff/pull/22939">#22939</a>)</li>
<li>Allow <code>Self</code> in <code>ClassVar</code> type annotations
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/23362">#23362</a>)</li>
<li>Consider synthesized methods and <code>ClassVar</code>-qualified
declarations when determining whether an abstract method has been
overridden in a subclass (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23381">#23381</a>)</li>
<li>Add a diagnostic when combining <code>Final</code> and
<code>ClassVar</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23365">#23365</a>)</li>
<li>Fix return type of <code>assert_never</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23389">#23389</a>)</li>
<li>Fix <code>assert_type</code> diagnostic messages (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23342">#23342</a>)</li>
<li>Ban PEP-613 type alias values from containing type-qualifier special
forms (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23444">#23444</a>)</li>
<li>Infer <code>LiteralString</code> for <code>f&quot;{literal_str_a}
{literal_str_b}&quot;</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23346">#23346</a>)</li>
<li>Infer precise types for bit-shift operations on integer literals (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23301">#23301</a>)</li>
<li>Make <code>[abstract-method-in-final-class]</code> diagnostics less
verbose for classes with many abstract methods (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23379">#23379</a>)</li>
<li>Improve diagnostics for abstract <code>@final</code> classes (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23376">#23376</a>)</li>
<li>Only perform literal promotion for implicitly inferred literals (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23107">#23107</a>)</li>
<li>Parenthesize callable types when they appear in the return
annotation of other callable types (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23327">#23327</a>)</li>
<li>Consider a call to a generic function returning <code>Never</code>
to terminate control flow (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23419">#23419</a>)</li>
<li>Support calls to intersection types (<a
href="https://redirect.github.com/astral-sh/ruff/pull/22469">#22469</a>)</li>
<li>Validate annotated assignments to attributes on self (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23388">#23388</a>)</li>
<li>Treat a bytes-literal type as a subtype of
<code>Sequence[&lt;constituent integers in the bytestring&gt;]</code>
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/23329">#23329</a>)</li>
<li>Allow a string-literal argument to match against an
<code>Iterable</code> parameter in type variable inference. (<a
href="https://redirect.github.com/astral-sh/ruff/pull/23326">#23326</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="751672710c"><code>7516727</code></a>
Bump version to 0.0.18 (<a
href="https://redirect.github.com/astral-sh/ty/issues/2869">#2869</a>)</li>
<li><a
href="8e86a5684f"><code>8e86a56</code></a>
Mention code folding on language server feature page</li>
<li><a
href="ca49bf9199"><code>ca49bf9</code></a>
Add shellcheck to pre-commit configuration (<a
href="https://redirect.github.com/astral-sh/ty/issues/2845">#2845</a>)</li>
<li><a
href="532e761e8f"><code>532e761</code></a>
Update PyO3/maturin-action action to v1.50.0 (<a
href="https://redirect.github.com/astral-sh/ty/issues/2823">#2823</a>)</li>
<li><a
href="0211006eb6"><code>0211006</code></a>
Update prek dependencies (<a
href="https://redirect.github.com/astral-sh/ty/issues/2822">#2822</a>)</li>
<li><a
href="60ebbfe853"><code>60ebbfe</code></a>
docs: Add beta status notice to README (<a
href="https://redirect.github.com/astral-sh/ty/issues/2556">#2556</a>)</li>
<li><a
href="31b126a590"><code>31b126a</code></a>
docs: add link for the call hierarchy tracking issue (<a
href="https://redirect.github.com/astral-sh/ty/issues/2816">#2816</a>)</li>
<li><a
href="8cec857182"><code>8cec857</code></a>
[ty] Bump version to 0.0.17 (<a
href="https://redirect.github.com/astral-sh/ty/issues/2806">#2806</a>)</li>
<li><a
href="3650f58ffd"><code>3650f58</code></a>
docs: Clarify that nvim-lspconfig is the recommended way of using ty in
all v...</li>
<li><a
href="55b8ff2055"><code>55b8ff2</code></a>
Add note about fallback behavior to <code>python</code> in
<code>PATH</code> (<a
href="https://redirect.github.com/astral-sh/ty/issues/2787">#2787</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ty/compare/0.0.9...0.0.18">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ty&package-manager=uv&previous-version=0.0.9&new-version=0.0.18)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-26 21:20:22 +01:00
dependabot[bot]
47d35d4bd7 Bump tailwindcss from 4.2.0 to 4.2.1 in /web (#469)
Bumps
[tailwindcss](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/packages/tailwindcss)
from 4.2.0 to 4.2.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/tailwindlabs/tailwindcss/releases">tailwindcss's
releases</a>.</em></p>
<blockquote>
<h2>v4.2.1</h2>
<h3>Fixed</h3>
<ul>
<li>Allow trailing dash in functional utility names for backwards
compatibility (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/19696">#19696</a>)</li>
<li>Properly detect classes containing <code>.</code> characters within
curly braces in MDX files (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/19711">#19711</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md">tailwindcss's
changelog</a>.</em></p>
<blockquote>
<h2>[4.2.1] - 2026-02-23</h2>
<h3>Fixed</h3>
<ul>
<li>Allow trailing dash in functional utility names for backwards
compatibility (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/19696">#19696</a>)</li>
<li>Properly detect classes containing <code>.</code> characters within
curly braces in MDX files (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/19711">#19711</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="1dce64ee7e"><code>1dce64e</code></a>
4.2.1 (<a
href="https://github.com/tailwindlabs/tailwindcss/tree/HEAD/packages/tailwindcss/issues/19714">#19714</a>)</li>
<li><a
href="d15d92ca60"><code>d15d92c</code></a>
Allow trailing dash in functional utility names (<a
href="https://github.com/tailwindlabs/tailwindcss/tree/HEAD/packages/tailwindcss/issues/19696">#19696</a>)</li>
<li>See full diff in <a
href="https://github.com/tailwindlabs/tailwindcss/commits/v4.2.1/packages/tailwindcss">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=tailwindcss&package-manager=npm_and_yarn&previous-version=4.2.0&new-version=4.2.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-26 21:20:08 +01:00
Juan David Bermudez Celedon
cbd70bd6f3 Fix scoring rules keyword matching (#473)
Fixes #460

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

## Release Notes

* **Improvements**
* Refined keyword matching in indexer queries with case-insensitive,
word-boundary-aware search for more accurate results.

* **Bug Fixes**
  * Corrected torrent URL redirect resolution logic.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
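
The "case-insensitive, word-boundary-aware" matching described above can be sketched roughly like this (a minimal illustration only; the function name and call site are hypothetical, not MediaManager's actual code):

```python
import re


def keyword_matches(keyword: str, title: str) -> bool:
    """Match a scoring-rule keyword against a release title using a
    case-insensitive, word-boundary-aware search, so e.g. 'cam'
    matches 'CAM' but not the substring inside 'Camera'."""
    pattern = r"\b" + re.escape(keyword) + r"\b"
    return re.search(pattern, title, re.IGNORECASE) is not None
```

With word boundaries, `keyword_matches("cam", "Movie.2024.CAM.x264")` is true while `keyword_matches("cam", "Camera.Documentary.1080p")` is false, which is the kind of false positive the fix targets.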
2026-02-26 16:37:29 +01:00
Juan David Bermudez Celedon
d8405fd903 Extend AVOID_CAM scoring rule (#458)
The base title scoring rule **avoid_cam** can be extended with more
keywords related to low-quality videos similar to CAM.


| Type | Acronyms | Meaning |
|-----------|----------|---------|
| CAM | CAM<br>CAMRIP<br>HDCAM | A recording made with a handheld camera in a movie theater. |
| Screener | BDSCR<br>DDC<br>DVDSCR<br>DVDSCREENER<br>SCR<br>SCREENER<br>WEBSCREENER | Screeners are early DVD or BD releases of the theatrical version of a film, typically sent to movie reviewers, academy members and executives for review purposes. |
| Telecine | HDTC<br>TC<br>TELECINE | A digital scan of the film print. |
| Telesync | HDTS<br>TELESYNC<br>TS | Similar to *CAM*, but the camera is typically placed closer to the projector or on a tripod in the projection booth. Audio is captured directly from the sound system. |
| TV | TVRIP | |
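
The keyword set above could be compiled into a single word-boundary pattern, for example (a sketch under the assumption that the rule is regex-based; the names `AVOID_CAM_KEYWORDS` and `is_low_quality` are illustrative, not MediaManager's actual identifiers):

```python
import re

# Keywords from the table above, grouped by release type.
AVOID_CAM_KEYWORDS = [
    "CAM", "CAMRIP", "HDCAM",                     # CAM
    "BDSCR", "DDC", "DVDSCR", "DVDSCREENER",      # Screener
    "SCR", "SCREENER", "WEBSCREENER",
    "HDTC", "TC", "TELECINE",                     # Telecine
    "HDTS", "TELESYNC", "TS",                     # Telesync
    "TVRIP",                                      # TV
]

# One alternation with word boundaries, matched case-insensitively.
AVOID_CAM_RE = re.compile(
    r"\b(" + "|".join(AVOID_CAM_KEYWORDS) + r")\b", re.IGNORECASE
)


def is_low_quality(title: str) -> bool:
    return AVOID_CAM_RE.search(title) is not None
```

The word boundaries matter for short acronyms like `TS` and `TC`, which would otherwise match inside ordinary words.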

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

## Release Notes

* **Chores**
* Improved detection of additional camera-related content format
variants through updated filtering rules.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-02-23 23:21:48 +01:00
Maximilian Dorninger
7824891557 Fix qbittorrent category error (#456)
This PR fixes an error caused by MM trying to set the category save path
to None if the string is empty.
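The shape of the fix can be sketched as omitting the save path when the configured string is empty, rather than forwarding `None` to the client (names here are hypothetical; MediaManager's actual qBittorrent wrapper differs):

```python
def build_category_kwargs(name: str, save_path: str) -> dict:
    """Build keyword arguments for creating a qBittorrent category.

    An empty save_path string is treated as 'not configured' and the
    key is omitted entirely, so the client API is never asked to set
    the category save path to None.
    """
    kwargs = {"name": name}
    if save_path:  # skip empty strings
        kwargs["save_path"] = save_path
    return kwargs
```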
2026-02-22 19:51:10 +01:00
Maximilian Dorninger
a643c9426d remove everything related to requests (#455)
This PR removes the requests feature. The functionality will be replaced
either by Seerr or by reimplementing it in a better way.
2026-02-22 19:46:47 +01:00
GokuPlay609
c2645000e5 fix: improve quality detection regex to match 2160p, UHD, FullHD and other keywords (#450)
## What

Two-line fix to the quality detection regex in
`media_manager/indexer/schemas.py`.

**UHD pattern**: `\b(4k)\b` → `\b(4k|2160p|uhd)\b`  
**FullHD pattern**: `\b(1080p)\b` → `\b(1080p|fullhd|full\s*hd)\b`

## Why

The UHD regex only matched the literal keyword `4k`. Torrent titles
containing `2160p` or `UHD` (but not `4k`) were classified as
`Quality.unknown` (value 5) instead of `Quality.uhd` (value 1). Since
sorting uses quality as the primary key, these 4K releases ended up at
the bottom of search results.

### Example

| Title | Before | After |
|---|---|---|
| `Movie.2013.4K.HDR.2160p.x265` | `Quality.uhd` | `Quality.uhd` |
| `Movie.2013.UHD.BluRay.2160p.HDR10.x265` | `Quality.unknown` | `Quality.uhd` |
| `Movie.2013.2160p.WEBRip.DDP5.1.x264` | `Quality.unknown` | `Quality.uhd` |

All patterns already use `re.IGNORECASE`, so case variants are handled.

Fixes #449
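
The widened patterns behave as follows (the regexes are taken from the diff above; the surrounding `detect` helper and string return values are illustrative, since the real code lives in an enum in `media_manager/indexer/schemas.py`):

```python
import re

# Patterns as described in this PR.
UHD_RE = re.compile(r"\b(4k|2160p|uhd)\b", re.IGNORECASE)
FULLHD_RE = re.compile(r"\b(1080p|fullhd|full\s*hd)\b", re.IGNORECASE)


def detect(title: str) -> str:
    """Classify a release title; UHD is checked first as the higher quality."""
    if UHD_RE.search(title):
        return "uhd"
    if FULLHD_RE.search(title):
        return "fullhd"
    return "unknown"
```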

---------

Co-authored-by: GokuPlay609 <GokuPlay609@users.noreply.github.com>
Co-authored-by: Amp <amp@ampcode.com>
Co-authored-by: maxid <97409287+maxdorninger@users.noreply.github.com>
2026-02-22 16:25:36 +01:00
Maximilian Dorninger
b16f2dce92 migrate season files to episode files and drop legacy table (#454)
This pull request introduces a migration script to transition from
storing file information at the season level to the episode level in the
database.
2026-02-22 16:25:12 +01:00
natarelli22
d8a0ec66c3 Support for handling Single Episode Torrents (#331)
**Description**
As explained on #322, MediaManager currently only matches torrents that
represent full seasons or season packs.
As a result, valid episode-based releases — commonly returned by
indexers such as EZTV — are filtered out during scoring and never
considered for download.

Initial changes to the season parsing logic allow these torrents to be
discovered.
However, additional changes are required beyond season parsing to
properly support single-episode imports.

This PR is intended as a work-in-progress / RFC to discuss the required
changes and align on the correct approach before completing the
implementation.

**Things planned to do**
- [x] Update Web UI to better display episode-level details
- [ ] Update TV show import logic to handle single episode files, instead of assuming full season files (to avoid integrity errors when episodes are missing)
- [ ] Create episode file tables to store episode-level data, similar to season files
- [ ] Implement fetching and downloading logic for single-episode torrents

**Notes / current limitations**
At the moment, the database and import logic assume one file per season
per quality, which works for season packs but not for episode-based
releases.

These changes are intentionally not completed yet and are part of the
discussion this PR aims to start.

**Request for feedback**
This represents a significant change in how TV content is handled in
MediaManager.
Before proceeding further, feedback from @maxdorninger on the overall
direction and next steps would be greatly appreciated.

Once aligned, the remaining tasks can be implemented incrementally.

---------

Co-authored-by: Maximilian Dorninger <97409287+maxdorninger@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-22 15:21:19 +01:00
Maximilian Dorninger
094d0e4eb7 update logo
forgot to update this file
2026-02-21 21:48:37 +01:00
Maximilian Dorninger
4d7f596ffd Rebrand to new MediaManager logo (#452)
I made two new logos because the old one wasn't very recognizable at a
glance.


![1](https://github.com/user-attachments/assets/cb37a709-e80b-4c97-a4d8-cf9ba0dc1613)

![3](https://github.com/user-attachments/assets/c56ded5c-fe15-4c02-bc20-fe2bff06caf9)
2026-02-21 20:29:16 +01:00
dependabot[bot]
300df14c8c Bump svelte from 5.51.0 to 5.53.0 in /web in the npm_and_yarn group across 1 directory (#445)
Bumps the npm_and_yarn group with 1 update in the /web directory:
[svelte](https://github.com/sveltejs/svelte/tree/HEAD/packages/svelte).

Updates `svelte` from 5.51.0 to 5.53.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/sveltejs/svelte/releases">svelte's
releases</a>.</em></p>
<blockquote>
<h2>svelte@5.53.0</h2>
<h3>Minor Changes</h3>
<ul>
<li>
<p>feat: allow comments in tags (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17671">#17671</a>)</p>
</li>
<li>
<p>feat: allow error boundaries to work on the server (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17672">#17672</a>)</p>
</li>
</ul>
<h3>Patch Changes</h3>
<ul>
<li>
<p>fix: use TrustedHTML to test for customizable
<code>&lt;select&gt;</code> support, where necessary (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17743">#17743</a>)</p>
</li>
<li>
<p>fix: ensure head effects are kept in the effect tree (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17746">#17746</a>)</p>
</li>
<li>
<p>chore: deactivate current_batch by default in unset_context (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17738">#17738</a>)</p>
</li>
</ul>
<h2>svelte@5.52.0</h2>
<h3>Minor Changes</h3>
<ul>
<li>feat: support TrustedHTML in <code>{@html}</code> expressions (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17701">#17701</a>)</li>
</ul>
<h3>Patch Changes</h3>
<ul>
<li>
<p>fix: repair dynamic component truthy/falsy hydration mismatches (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17737">#17737</a>)</p>
</li>
<li>
<p>fix: re-run non-render-bound deriveds on the server (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17674">#17674</a>)</p>
</li>
</ul>
<h2>svelte@5.51.5</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p>fix: check to make sure <code>svelte:element</code> tags are valid
during SSR (<a
href="73098bb26c"><code>73098bb26c6f06e7fd1b0746d817d2c5ee90755f</code></a>)</p>
</li>
<li>
<p>fix: misc option escaping and backwards compatibility (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17741">#17741</a>)</p>
</li>
<li>
<p>fix: strip event handlers during SSR (<a
href="a0c7f28915"><code>a0c7f289156e9fafaeaf5ca14af6c06fe9b9eae5</code></a>)</p>
</li>
<li>
<p>fix: replace usage of <code>for in</code> with <code>for of
Object.keys</code> (<a
href="f89c7ddd7e"><code>f89c7ddd7eebaa1ef3cc540400bec2c9140b330c</code></a>)</p>
</li>
<li>
<p>fix: always escape option body in SSR (<a
href="f7c80da18c"><code>f7c80da18c215e3727c2a611b0b8744cc6e504c5</code></a>)</p>
</li>
<li>
<p>chore: upgrade <code>devalue</code> (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17739">#17739</a>)</p>
</li>
</ul>
<h2>svelte@5.51.4</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p>chore: proactively defer effects in pending boundary (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17734">#17734</a>)</p>
</li>
<li>
<p>fix: detect and error on non-idempotent each block keys in dev mode
(<a
href="https://redirect.github.com/sveltejs/svelte/pull/17732">#17732</a>)</p>
</li>
</ul>
<h2>svelte@5.51.3</h2>
<h3>Patch Changes</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/sveltejs/svelte/blob/main/packages/svelte/CHANGELOG.md">svelte's
changelog</a>.</em></p>
<blockquote>
<h2>5.53.0</h2>
<h3>Minor Changes</h3>
<ul>
<li>
<p>feat: allow comments in tags (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17671">#17671</a>)</p>
</li>
<li>
<p>feat: allow error boundaries to work on the server (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17672">#17672</a>)</p>
</li>
</ul>
<h3>Patch Changes</h3>
<ul>
<li>
<p>fix: use TrustedHTML to test for customizable
<code>&lt;select&gt;</code> support, where necessary (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17743">#17743</a>)</p>
</li>
<li>
<p>fix: ensure head effects are kept in the effect tree (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17746">#17746</a>)</p>
</li>
<li>
<p>chore: deactivate current_batch by default in unset_context (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17738">#17738</a>)</p>
</li>
</ul>
<h2>5.52.0</h2>
<h3>Minor Changes</h3>
<ul>
<li>feat: support TrustedHTML in <code>{@html}</code> expressions (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17701">#17701</a>)</li>
</ul>
<h3>Patch Changes</h3>
<ul>
<li>
<p>fix: repair dynamic component truthy/falsy hydration mismatches (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17737">#17737</a>)</p>
</li>
<li>
<p>fix: re-run non-render-bound deriveds on the server (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17674">#17674</a>)</p>
</li>
</ul>
<h2>5.51.5</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p>fix: check to make sure <code>svelte:element</code> tags are valid
during SSR (<a
href="73098bb26c"><code>73098bb26c6f06e7fd1b0746d817d2c5ee90755f</code></a>)</p>
</li>
<li>
<p>fix: misc option escaping and backwards compatibility (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17741">#17741</a>)</p>
</li>
<li>
<p>fix: strip event handlers during SSR (<a
href="a0c7f28915"><code>a0c7f289156e9fafaeaf5ca14af6c06fe9b9eae5</code></a>)</p>
</li>
<li>
<p>fix: replace usage of <code>for in</code> with <code>for of
Object.keys</code> (<a
href="f89c7ddd7e"><code>f89c7ddd7eebaa1ef3cc540400bec2c9140b330c</code></a>)</p>
</li>
<li>
<p>fix: always escape option body in SSR (<a
href="f7c80da18c"><code>f7c80da18c215e3727c2a611b0b8744cc6e504c5</code></a>)</p>
</li>
<li>
<p>chore: upgrade <code>devalue</code> (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17739">#17739</a>)</p>
</li>
</ul>
<h2>5.51.4</h2>
<h3>Patch Changes</h3>
<ul>
<li>chore: proactively defer effects in pending boundary (<a
href="https://redirect.github.com/sveltejs/svelte/pull/17734">#17734</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c2fc95a467"><code>c2fc95a</code></a>
Version Packages (<a
href="https://github.com/sveltejs/svelte/tree/HEAD/packages/svelte/issues/17747">#17747</a>)</li>
<li><a
href="92e2fc1209"><code>92e2fc1</code></a>
feat: allow comments in tags (<a
href="https://github.com/sveltejs/svelte/tree/HEAD/packages/svelte/issues/17671">#17671</a>)</li>
<li><a
href="2661513cd3"><code>2661513</code></a>
feat: allow error boundaries to work on the server (<a
href="https://github.com/sveltejs/svelte/tree/HEAD/packages/svelte/issues/17672">#17672</a>)</li>
<li><a
href="582e4443dc"><code>582e444</code></a>
fix: ensure head effects are kept in the effect tree (<a
href="https://github.com/sveltejs/svelte/tree/HEAD/packages/svelte/issues/17746">#17746</a>)</li>
<li><a
href="f8bf9bb461"><code>f8bf9bb</code></a>
chore: deactivate current_batch by default in unset_context (<a
href="https://github.com/sveltejs/svelte/tree/HEAD/packages/svelte/issues/17738">#17738</a>)</li>
<li><a
href="696d97ff3e"><code>696d97f</code></a>
fix: use TrustedHTML to test for customizable &lt;select&gt; support,
where necessa...</li>
<li><a
href="cbf4e246fc"><code>cbf4e24</code></a>
Version Packages (<a
href="https://github.com/sveltejs/svelte/tree/HEAD/packages/svelte/issues/17742">#17742</a>)</li>
<li><a
href="09c4cb5084"><code>09c4cb5</code></a>
fix: re-run non-render-bound deriveds on the server (<a
href="https://github.com/sveltejs/svelte/tree/HEAD/packages/svelte/issues/17674">#17674</a>)</li>
<li><a
href="be24b0dca7"><code>be24b0d</code></a>
feat: support TrustedHTML in {<a
href="https://github.com/html"><code>@​html</code></a>} expressions (<a
href="https://github.com/sveltejs/svelte/tree/HEAD/packages/svelte/issues/17701">#17701</a>)</li>
<li><a
href="9f48e7620f"><code>9f48e76</code></a>
fix: repair dynamic component truthy/falsy hydration mismatches (<a
href="https://github.com/sveltejs/svelte/tree/HEAD/packages/svelte/issues/17737">#17737</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/sveltejs/svelte/commits/svelte@5.53.0/packages/svelte">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=svelte&package-manager=npm_and_yarn&previous-version=5.51.0&new-version=5.53.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/maxdorninger/MediaManager/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 18:08:06 +01:00
dependabot[bot]
2f102d6c5d Bump the uv group across 1 directory with 2 updates (#446)
Bumps the uv group with 2 updates in the /metadata_relay directory:
[python-multipart](https://github.com/Kludex/python-multipart) and
[urllib3](https://github.com/urllib3/urllib3).

Updates `python-multipart` from 0.0.21 to 0.0.22
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/Kludex/python-multipart/releases">python-multipart's
releases</a>.</em></p>
<blockquote>
<h2>Version 0.0.22</h2>
<h2>What's Changed</h2>
<ul>
<li>Drop directory path from filename in <code>File</code> <a
href="9433f4bbc9">9433f4b</a>.</li>
</ul>
<hr />
<p><strong>Full Changelog</strong>: <a
href="https://github.com/Kludex/python-multipart/compare/0.0.21...0.0.22">https://github.com/Kludex/python-multipart/compare/0.0.21...0.0.22</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/Kludex/python-multipart/blob/master/CHANGELOG.md">python-multipart's
changelog</a>.</em></p>
<blockquote>
<h2>0.0.22 (2026-01-25)</h2>
<ul>
<li>Drop directory path from filename in <code>File</code> <a
href="9433f4bbc9">9433f4b</a>.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="bea7bbb290"><code>bea7bbb</code></a>
Version 0.0.22 (<a
href="https://redirect.github.com/Kludex/python-multipart/issues/222">#222</a>)</li>
<li><a
href="0fb59a9df0"><code>0fb59a9</code></a>
chore: add return type on test (<a
href="https://redirect.github.com/Kludex/python-multipart/issues/221">#221</a>)</li>
<li><a
href="9433f4bbc9"><code>9433f4b</code></a>
Merge commit from fork</li>
<li><a
href="d5c91ecb0a"><code>d5c91ec</code></a>
Bump the github-actions group with 2 updates (<a
href="https://redirect.github.com/Kludex/python-multipart/issues/219">#219</a>)</li>
<li><a
href="5a90631b48"><code>5a90631</code></a>
bump uv (<a
href="https://redirect.github.com/Kludex/python-multipart/issues/218">#218</a>)</li>
<li>See full diff in <a
href="https://github.com/Kludex/python-multipart/compare/0.0.21...0.0.22">compare
view</a></li>
</ul>
</details>
<br />

Updates `urllib3` from 2.6.2 to 2.6.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/urllib3/urllib3/releases">urllib3's
releases</a>.</em></p>
<blockquote>
<h2>2.6.3</h2>
<h2>🚀 urllib3 is fundraising for HTTP/2 support</h2>
<p><a
href="https://sethmlarson.dev/urllib3-is-fundraising-for-http2-support">urllib3
is raising ~$40,000 USD</a> to release HTTP/2 support and ensure
long-term sustainable maintenance of the project after a sharp decline
in financial support. If your company or organization uses Python and
would benefit from HTTP/2 support in Requests, pip, cloud SDKs, and
thousands of other projects <a
href="https://opencollective.com/urllib3">please consider contributing
financially</a> to ensure HTTP/2 support is developed sustainably and
maintained for the long-haul.</p>
<p>Thank you for your support.</p>
<h2>Changes</h2>
<ul>
<li>Fixed a security issue where decompression-bomb safeguards of the
streaming API were bypassed when HTTP redirects were followed.
(CVE-2026-21441 reported by <a
href="https://github.com/D47A"><code>@​D47A</code></a>, 8.9 High,
GHSA-38jv-5279-wg99)</li>
<li>Started treating <code>Retry-After</code> times greater than 6 hours
as 6 hours by default. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3743">urllib3/urllib3#3743</a>)</li>
<li>Fixed <code>urllib3.connection.VerifiedHTTPSConnection</code> on
Emscripten. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3752">urllib3/urllib3#3752</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/urllib3/urllib3/blob/main/CHANGES.rst">urllib3's
changelog</a>.</em></p>
<blockquote>
<h1>2.6.3 (2026-01-07)</h1>
<ul>
<li>Fixed a high-severity security issue where decompression-bomb
safeguards of the streaming API were bypassed when HTTP redirects were
followed. (<a
href="https://github.com/urllib3/urllib3/security/advisories/GHSA-38jv-5279-wg99">GHSA-38jv-5279-wg99</a>)</li>
<li>Started treating <code>Retry-After</code> times greater than 6 hours
as 6 hours by default. (<a
href="https://github.com/urllib3/urllib3/issues/3743">#3743</a>)</li>
<li>Fixed <code>urllib3.connection.VerifiedHTTPSConnection</code> on
Emscripten. (<a
href="https://github.com/urllib3/urllib3/issues/3752">#3752</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="0248277dd7"><code>0248277</code></a>
Release 2.6.3</li>
<li><a
href="8864ac407b"><code>8864ac4</code></a>
Merge commit from fork</li>
<li><a
href="70cecb27ca"><code>70cecb2</code></a>
Fix Scorecard issues related to vulnerable dev dependencies (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3755">#3755</a>)</li>
<li><a
href="41f249abe1"><code>41f249a</code></a>
Move &quot;v2.0 Migration Guide&quot; to the end of the table of
contents (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3747">#3747</a>)</li>
<li><a
href="fd4dffd2fc"><code>fd4dffd</code></a>
Patch <code>VerifiedHTTPSConnection</code> for Emscripten (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3752">#3752</a>)</li>
<li><a
href="13f0bfd55e"><code>13f0bfd</code></a>
Handle massive values in Retry-After when calculating time to sleep for
(<a
href="https://redirect.github.com/urllib3/urllib3/issues/3743">#3743</a>)</li>
<li><a
href="8c480bf87b"><code>8c480bf</code></a>
Bump actions/upload-artifact from 5.0.0 to 6.0.0 (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3748">#3748</a>)</li>
<li><a
href="4b40616e95"><code>4b40616</code></a>
Bump actions/cache from 4.3.0 to 5.0.1 (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3750">#3750</a>)</li>
<li><a
href="82b8479663"><code>82b8479</code></a>
Bump actions/download-artifact from 6.0.0 to 7.0.0 (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3749">#3749</a>)</li>
<li><a
href="34284cb017"><code>34284cb</code></a>
Mention experimental features in the security policy (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3746">#3746</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/urllib3/urllib3/compare/2.6.2...2.6.3">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/maxdorninger/MediaManager/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 18:07:53 +01:00
dependabot[bot]
3e696c463c Bump the uv group across 1 directory with 3 updates (#448)
Bumps the uv group with 3 updates in the / directory:
[cryptography](https://github.com/pyca/cryptography),
[python-multipart](https://github.com/Kludex/python-multipart) and
[urllib3](https://github.com/urllib3/urllib3).

Updates `cryptography` from 46.0.3 to 46.0.5
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pyca/cryptography/blob/main/CHANGELOG.rst">cryptography's
changelog</a>.</em></p>
<blockquote>
<p>46.0.5 - 2026-02-10</p>
<ul>
<li>An attacker could create a malicious public key that reveals
portions of your private key when using certain uncommon elliptic
curves (binary curves). This version now includes additional security
checks to prevent this attack. This issue only affects binary elliptic
curves, which are rarely used in real-world applications. Credit to
<strong>XlabAI Team of Tencent Xuanwu Lab and Atuin Automated
Vulnerability Discovery Engine</strong> for reporting the issue.
<strong>CVE-2026-26007</strong></li>
<li>Support for <code>SECT*</code> binary elliptic curves is deprecated
and will be removed in the next release.</li>
</ul>
<p>46.0.4 - 2026-01-27</p>
<ul>
<li>Dropped support for win_arm64 wheels.</li>
<li>Updated Windows, macOS, and Linux wheels to be compiled with OpenSSL
3.5.5.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="06e120e682"><code>06e120e</code></a>
bump version for 46.0.5 release (<a
href="https://redirect.github.com/pyca/cryptography/issues/14289">#14289</a>)</li>
<li><a
href="0eebb9dbb6"><code>0eebb9d</code></a>
EC check key on cofactor &gt; 1 (<a
href="https://redirect.github.com/pyca/cryptography/issues/14287">#14287</a>)</li>
<li><a
href="bedf6e186b"><code>bedf6e1</code></a>
fix openssl version on 46 branch (<a
href="https://redirect.github.com/pyca/cryptography/issues/14220">#14220</a>)</li>
<li><a
href="e6f44fc8e6"><code>e6f44fc</code></a>
bump for 46.0.4 and drop win arm64 due to CI issues (<a
href="https://redirect.github.com/pyca/cryptography/issues/14217">#14217</a>)</li>
<li>See full diff in <a
href="https://github.com/pyca/cryptography/compare/46.0.3...46.0.5">compare
view</a></li>
</ul>
</details>
<br />

Updates `python-multipart` from 0.0.21 to 0.0.22
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/Kludex/python-multipart/releases">python-multipart's
releases</a>.</em></p>
<blockquote>
<h2>Version 0.0.22</h2>
<h2>What's Changed</h2>
<ul>
<li>Drop directory path from filename in <code>File</code> <a
href="9433f4bbc9">9433f4b</a>.</li>
</ul>
<hr />
<p><strong>Full Changelog</strong>: <a
href="https://github.com/Kludex/python-multipart/compare/0.0.21...0.0.22">https://github.com/Kludex/python-multipart/compare/0.0.21...0.0.22</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/Kludex/python-multipart/blob/master/CHANGELOG.md">python-multipart's
changelog</a>.</em></p>
<blockquote>
<h2>0.0.22 (2026-01-25)</h2>
<ul>
<li>Drop directory path from filename in <code>File</code> <a
href="9433f4bbc9">9433f4b</a>.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="bea7bbb290"><code>bea7bbb</code></a>
Version 0.0.22 (<a
href="https://redirect.github.com/Kludex/python-multipart/issues/222">#222</a>)</li>
<li><a
href="0fb59a9df0"><code>0fb59a9</code></a>
chore: add return type on test (<a
href="https://redirect.github.com/Kludex/python-multipart/issues/221">#221</a>)</li>
<li><a
href="9433f4bbc9"><code>9433f4b</code></a>
Merge commit from fork</li>
<li><a
href="d5c91ecb0a"><code>d5c91ec</code></a>
Bump the github-actions group with 2 updates (<a
href="https://redirect.github.com/Kludex/python-multipart/issues/219">#219</a>)</li>
<li><a
href="5a90631b48"><code>5a90631</code></a>
bump uv (<a
href="https://redirect.github.com/Kludex/python-multipart/issues/218">#218</a>)</li>
<li>See full diff in <a
href="https://github.com/Kludex/python-multipart/compare/0.0.21...0.0.22">compare
view</a></li>
</ul>
</details>
<br />

Updates `urllib3` from 2.6.2 to 2.6.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/urllib3/urllib3/releases">urllib3's
releases</a>.</em></p>
<blockquote>
<h2>2.6.3</h2>
<h2>🚀 urllib3 is fundraising for HTTP/2 support</h2>
<p><a
href="https://sethmlarson.dev/urllib3-is-fundraising-for-http2-support">urllib3
is raising ~$40,000 USD</a> to release HTTP/2 support and ensure
long-term sustainable maintenance of the project after a sharp decline
in financial support. If your company or organization uses Python and
would benefit from HTTP/2 support in Requests, pip, cloud SDKs, and
thousands of other projects <a
href="https://opencollective.com/urllib3">please consider contributing
financially</a> to ensure HTTP/2 support is developed sustainably and
maintained for the long-haul.</p>
<p>Thank you for your support.</p>
<h2>Changes</h2>
<ul>
<li>Fixed a security issue where decompression-bomb safeguards of the
streaming API were bypassed when HTTP redirects were followed.
(CVE-2026-21441 reported by <a
href="https://github.com/D47A"><code>@​D47A</code></a>, 8.9 High,
GHSA-38jv-5279-wg99)</li>
<li>Started treating <code>Retry-After</code> times greater than 6 hours
as 6 hours by default. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3743">urllib3/urllib3#3743</a>)</li>
<li>Fixed <code>urllib3.connection.VerifiedHTTPSConnection</code> on
Emscripten. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3752">urllib3/urllib3#3752</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/urllib3/urllib3/blob/main/CHANGES.rst">urllib3's
changelog</a>.</em></p>
<blockquote>
<h1>2.6.3 (2026-01-07)</h1>
<ul>
<li>Fixed a high-severity security issue where decompression-bomb
safeguards of the streaming API were bypassed when HTTP redirects were
followed. (<a
href="https://github.com/urllib3/urllib3/security/advisories/GHSA-38jv-5279-wg99">GHSA-38jv-5279-wg99</a>)</li>
<li>Started treating <code>Retry-After</code> times greater than 6 hours
as 6 hours by default. (<a
href="https://github.com/urllib3/urllib3/issues/3743">#3743</a>)</li>
<li>Fixed <code>urllib3.connection.VerifiedHTTPSConnection</code> on
Emscripten. (<a
href="https://github.com/urllib3/urllib3/issues/3752">#3752</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="0248277dd7"><code>0248277</code></a>
Release 2.6.3</li>
<li><a
href="8864ac407b"><code>8864ac4</code></a>
Merge commit from fork</li>
<li><a
href="70cecb27ca"><code>70cecb2</code></a>
Fix Scorecard issues related to vulnerable dev dependencies (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3755">#3755</a>)</li>
<li><a
href="41f249abe1"><code>41f249a</code></a>
Move &quot;v2.0 Migration Guide&quot; to the end of the table of
contents (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3747">#3747</a>)</li>
<li><a
href="fd4dffd2fc"><code>fd4dffd</code></a>
Patch <code>VerifiedHTTPSConnection</code> for Emscripten (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3752">#3752</a>)</li>
<li><a
href="13f0bfd55e"><code>13f0bfd</code></a>
Handle massive values in Retry-After when calculating time to sleep for
(<a
href="https://redirect.github.com/urllib3/urllib3/issues/3743">#3743</a>)</li>
<li><a
href="8c480bf87b"><code>8c480bf</code></a>
Bump actions/upload-artifact from 5.0.0 to 6.0.0 (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3748">#3748</a>)</li>
<li><a
href="4b40616e95"><code>4b40616</code></a>
Bump actions/cache from 4.3.0 to 5.0.1 (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3750">#3750</a>)</li>
<li><a
href="82b8479663"><code>82b8479</code></a>
Bump actions/download-artifact from 6.0.0 to 7.0.0 (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3749">#3749</a>)</li>
<li><a
href="34284cb017"><code>34284cb</code></a>
Mention experimental features in the security policy (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3746">#3746</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/urllib3/urllib3/compare/2.6.2...2.6.3">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/maxdorninger/MediaManager/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 18:07:39 +01:00
dependabot[bot]
5adb88f9e0 Bump pillow from 12.1.0 to 12.1.1 (#443)
Bumps [pillow](https://github.com/python-pillow/Pillow) from 12.1.0 to
12.1.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/python-pillow/Pillow/releases">pillow's
releases</a>.</em></p>
<blockquote>
<h2>12.1.1</h2>
<p><a
href="https://pillow.readthedocs.io/en/stable/releasenotes/12.1.1.html">https://pillow.readthedocs.io/en/stable/releasenotes/12.1.1.html</a></p>
<h2>Dependencies</h2>
<ul>
<li>Patch libavif for svt-av1 4.0 compatibility <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9413">#9413</a>
[<a href="https://github.com/hugovk"><code>@​hugovk</code></a>]</li>
</ul>
<h2>Other changes</h2>
<ul>
<li>Fix OOB Write with invalid tile extents <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9427">#9427</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="5158d98c80"><code>5158d98</code></a>
12.1.1 version bump</li>
<li><a
href="9000313cc5"><code>9000313</code></a>
Fix OOB Write with invalid tile extents (<a
href="https://redirect.github.com/python-pillow/Pillow/issues/9427">#9427</a>)</li>
<li><a
href="cd0111849f"><code>cd01118</code></a>
Patch libavif for svt-av1 4.0 compatibility</li>
<li>See full diff in <a
href="https://github.com/python-pillow/Pillow/compare/12.1.0...12.1.1">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pillow&package-manager=uv&previous-version=12.1.0&new-version=12.1.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 17:35:11 +01:00
dependabot[bot]
eb277dddac Bump sveltekit-superforms from 2.28.1 to 2.29.1 in /web (#442)
Bumps
[sveltekit-superforms](https://github.com/ciscoheat/sveltekit-superforms)
from 2.28.1 to 2.29.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/ciscoheat/sveltekit-superforms/releases">sveltekit-superforms's
releases</a>.</em></p>
<blockquote>
<h2>v2.29.1</h2>
<h3>Fixed</h3>
<ul>
<li>Fixed TypeScript type inference for discriminated unions in
<code>ValidationErrors</code>. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/issues/653">#653</a></li>
<li>Fixed FormData parsing for discriminated unions, so they work
properly without requiring <code>dataType: 'json'</code>. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/issues/655">#655</a></li>
<li><code>reset()</code> function didn't preserve tainted state for
fields that are not being reset when using partial data. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/issues/656">#656</a></li>
<li>Fixed FormData parsing incorrectly coercing empty strings to literal
values (e.g., <code>z.literal(&quot;bar&quot;)</code>). Empty strings
now properly fail validation instead of being replaced with the literal
value. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/issues/664">#664</a></li>
<li>Fixed <code>ReferenceError</code> when using
<code>customValidity</code> with <code>validateForm({ update: true
})</code>. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/issues/669">#669</a></li>
</ul>
<h3>Changed</h3>
<ul>
<li>Replaced deprecated <code>@finom/zod-to-json-schema</code> with
<code>zod-v3-to-json-schema</code>. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/pull/660">#660</a></li>
<li>Migrated Valibot adapter to use the official
<code>@valibot/to-json-schema</code> package. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/pull/668">#668</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/ciscoheat/sveltekit-superforms/blob/main/CHANGELOG.md">sveltekit-superforms's
changelog</a>.</em></p>
<blockquote>
<h2>[2.29.1] - 2025-12-16</h2>
<h3>Fixed</h3>
<ul>
<li>Fixed TypeScript type inference for discriminated unions in
<code>ValidationErrors</code>. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/issues/653">#653</a></li>
<li>Fixed FormData parsing for discriminated unions, so they work
properly without requiring <code>dataType: 'json'</code>. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/issues/655">#655</a></li>
<li><code>reset()</code> function didn't preserve tainted state for
fields that are not being reset when using partial data. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/issues/656">#656</a></li>
<li>Fixed FormData parsing incorrectly coercing empty strings to literal
values (e.g., <code>z.literal(&quot;bar&quot;)</code>). Empty strings
now properly fail validation instead of being replaced with the literal
value. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/issues/664">#664</a></li>
<li>Fixed <code>ReferenceError</code> when using
<code>customValidity</code> with <code>validateForm({ update: true
})</code>. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/issues/669">#669</a></li>
</ul>
<h3>Changed</h3>
<ul>
<li>Replaced deprecated <code>@finom/zod-to-json-schema</code> with
<code>zod-v3-to-json-schema</code>. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/pull/660">#660</a></li>
<li>Migrated Valibot adapter to use the official
<code>@valibot/to-json-schema</code> package. <a
href="https://redirect.github.com/ciscoheat/sveltekit-superforms/pull/668">#668</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="22319df44e"><code>22319df</code></a>
2.29.1 changelog</li>
<li><a
href="c1b44308a8"><code>c1b4430</code></a>
2.29.1</li>
<li><a
href="9408124dc9"><code>9408124</code></a>
Incorrect dependency fix</li>
<li><a
href="887aeb6238"><code>887aeb6</code></a>
2.29.0</li>
<li><a
href="ce701fb6fc"><code>ce701fb</code></a>
2.29.0 changelog</li>
<li><a
href="58b41b1e84"><code>58b41b1</code></a>
Linter</li>
<li><a
href="fbbdb90ae7"><code>fbbdb90</code></a>
Fixed build warning</li>
<li><a
href="44a40c6b13"><code>44a40c6</code></a>
Fixed SvelteKit reference warnings</li>
<li><a
href="12bb4d5c32"><code>12bb4d5</code></a>
Using pnpm 10 for build</li>
<li><a
href="f7c87d8898"><code>f7c87d8</code></a>
Package updates</li>
<li>Additional commits viewable in <a
href="https://github.com/ciscoheat/sveltekit-superforms/compare/v2.28.1...v2.29.1">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sveltekit-superforms&package-manager=npm_and_yarn&previous-version=2.28.1&new-version=2.29.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 17:35:00 +01:00
dependabot[bot]
516d562bd8 Bump uvicorn from 0.40.0 to 0.41.0 (#441)
Bumps [uvicorn](https://github.com/Kludex/uvicorn) from 0.40.0 to
0.41.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/Kludex/uvicorn/releases">uvicorn's
releases</a>.</em></p>
<blockquote>
<h2>Version 0.41.0</h2>
<h2>Added</h2>
<ul>
<li>Add <code>--limit-max-requests-jitter</code> to stagger worker
restarts (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2707">#2707</a>)</li>
<li>Add socket path to <code>scope[&quot;server&quot;]</code> (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2561">#2561</a>)</li>
</ul>
<h2>Changed</h2>
<ul>
<li>Rename <code>LifespanOn.error_occured</code> to
<code>error_occurred</code> (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2776">#2776</a>)</li>
</ul>
<h2>Fixed</h2>
<ul>
<li>Ignore permission denied errors in watchfiles reloader (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2817">#2817</a>)</li>
<li>Ensure lifespan shutdown runs when <code>should_exit</code> is set
during startup (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2812">#2812</a>)</li>
<li>Reduce the log level of 'request limit exceeded' messages (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2788">#2788</a>)</li>
</ul>
<hr />
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/t-kawasumi"><code>@​t-kawasumi</code></a> made
their first contribution in <a
href="https://redirect.github.com/Kludex/uvicorn/pull/2776">Kludex/uvicorn#2776</a></li>
<li><a href="https://github.com/fardyn"><code>@​fardyn</code></a> made
their first contribution in <a
href="https://redirect.github.com/Kludex/uvicorn/pull/2800">Kludex/uvicorn#2800</a></li>
<li><a href="https://github.com/ewie"><code>@​ewie</code></a> made their
first contribution in <a
href="https://redirect.github.com/Kludex/uvicorn/pull/2807">Kludex/uvicorn#2807</a></li>
<li><a href="https://github.com/shevron"><code>@​shevron</code></a> made
their first contribution in <a
href="https://redirect.github.com/Kludex/uvicorn/pull/2788">Kludex/uvicorn#2788</a></li>
<li><a href="https://github.com/jonashaag"><code>@​jonashaag</code></a>
made their first contribution in <a
href="https://redirect.github.com/Kludex/uvicorn/pull/2707">Kludex/uvicorn#2707</a></li>
</ul>
<hr />
<p><strong>Full Changelog</strong>: <a
href="https://github.com/Kludex/uvicorn/compare/0.40.0...0.41.0">https://github.com/Kludex/uvicorn/compare/0.40.0...0.41.0</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/Kludex/uvicorn/blob/main/docs/release-notes.md">uvicorn's
changelog</a>.</em></p>
<blockquote>
<h2>0.41.0 (February 16, 2026)</h2>
<h3>Added</h3>
<ul>
<li>Add <code>--limit-max-requests-jitter</code> to stagger worker
restarts (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2707">#2707</a>)</li>
<li>Add socket path to <code>scope[&quot;server&quot;]</code> (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2561">#2561</a>)</li>
</ul>
<h3>Changed</h3>
<ul>
<li>Rename <code>LifespanOn.error_occured</code> to
<code>error_occurred</code> (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2776">#2776</a>)</li>
</ul>
<h3>Fixed</h3>
<ul>
<li>Ignore permission denied errors in watchfiles reloader (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2817">#2817</a>)</li>
<li>Ensure lifespan shutdown runs when <code>should_exit</code> is set
during startup (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2812">#2812</a>)</li>
<li>Reduce the log level of 'request limit exceeded' messages (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2788">#2788</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9283c0f15c"><code>9283c0f</code></a>
Version 0.41.0 (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2821">#2821</a>)</li>
<li><a
href="a01a33eb8f"><code>a01a33e</code></a>
Add <code>--limit-max-requests-jitter</code> to stagger worker restarts
(<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2707">#2707</a>)</li>
<li><a
href="2ce65bde15"><code>2ce65bd</code></a>
Ignore permission denied errors in watchfiles reloader (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2817">#2817</a>)</li>
<li><a
href="654f2ed7d7"><code>654f2ed</code></a>
Ensure lifespan shutdown runs when <code>should_exit</code> is set
during startup (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2812">#2812</a>)</li>
<li><a
href="a03d9f6f0e"><code>a03d9f6</code></a>
Reduce the log level of 'request limit exceeded' messages (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2788">#2788</a>)</li>
<li><a
href="e377de40d0"><code>e377de4</code></a>
Add socket path to scope[&quot;server&quot;] (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2561">#2561</a>)</li>
<li><a
href="0779f7f8a4"><code>0779f7f</code></a>
Poll for readiness in <code>test_multiprocess_health_check</code> and
<code>run_server</code> (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2816">#2816</a>)</li>
<li><a
href="7e9ce2c974"><code>7e9ce2c</code></a>
Poll for PID changes in <code>test_multiprocess_sighup</code> instead of
fixed sleep (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2">#2</a>...</li>
<li><a
href="99f0d8734d"><code>99f0d87</code></a>
Fix grep warning in scripts/sync-version (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2807">#2807</a>)</li>
<li><a
href="7ae2e6375a"><code>7ae2e63</code></a>
chore(deps): bump the python-packages group with 18 updates (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2801">#2801</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/Kludex/uvicorn/compare/0.40.0...0.41.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=uvicorn&package-manager=uv&previous-version=0.40.0&new-version=0.41.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 17:34:48 +01:00
dependabot[bot]
dea75841b2 Bump lucide-svelte from 0.564.0 to 0.574.0 in /web (#439)
Bumps
[lucide-svelte](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-svelte)
from 0.564.0 to 0.574.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/lucide-icons/lucide/releases">lucide-svelte's
releases</a>.</em></p>
<blockquote>
<h2>Version 0.574.0</h2>
<h2>What's Changed</h2>
<ul>
<li>fix(icons): changed <code>rocking-chair</code> icon by <a
href="https://github.com/jamiemlaw"><code>@​jamiemlaw</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/3445">lucide-icons/lucide#3445</a></li>
<li>fix(icons): flipped <code>coins</code> icon by <a
href="https://github.com/jguddas"><code>@​jguddas</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/3158">lucide-icons/lucide#3158</a></li>
<li>feat(icons): added <code>x-line-top</code> icon by <a
href="https://github.com/jguddas"><code>@​jguddas</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2838">lucide-icons/lucide#2838</a></li>
<li>feat(icons): added <code>mouse-left</code> icon by <a
href="https://github.com/marvfash"><code>@​marvfash</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2788">lucide-icons/lucide#2788</a></li>
<li>feat(icons): added <code>mouse-right</code> icon by <a
href="https://github.com/marvfash"><code>@​marvfash</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2787">lucide-icons/lucide#2787</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/marvfash"><code>@​marvfash</code></a>
made their first contribution in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2788">lucide-icons/lucide#2788</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/lucide-icons/lucide/compare/0.572.0...0.574.0">https://github.com/lucide-icons/lucide/compare/0.572.0...0.574.0</a></p>
<h2>Version 0.573.0</h2>
<h2>What's Changed</h2>
<ul>
<li>fix(icons): changed <code>rocking-chair</code> icon by <a
href="https://github.com/jamiemlaw"><code>@​jamiemlaw</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/3445">lucide-icons/lucide#3445</a></li>
<li>fix(icons): flipped <code>coins</code> icon by <a
href="https://github.com/jguddas"><code>@​jguddas</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/3158">lucide-icons/lucide#3158</a></li>
<li>feat(icons): added <code>x-line-top</code> icon by <a
href="https://github.com/jguddas"><code>@​jguddas</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2838">lucide-icons/lucide#2838</a></li>
<li>feat(icons): added <code>mouse-left</code> icon by <a
href="https://github.com/marvfash"><code>@​marvfash</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2788">lucide-icons/lucide#2788</a></li>
<li>feat(icons): added <code>mouse-right</code> icon by <a
href="https://github.com/marvfash"><code>@​marvfash</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2787">lucide-icons/lucide#2787</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/marvfash"><code>@​marvfash</code></a>
made their first contribution in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2788">lucide-icons/lucide#2788</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/lucide-icons/lucide/compare/0.572.0...0.573.0">https://github.com/lucide-icons/lucide/compare/0.572.0...0.573.0</a></p>
<h2>Version 0.572.0</h2>
<h2>What's Changed</h2>
<ul>
<li>feat(icons): added <code>message-circle-check</code> icon by <a
href="https://github.com/Shrinks99"><code>@​Shrinks99</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/3770">lucide-icons/lucide#3770</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/Shrinks99"><code>@​Shrinks99</code></a>
made their first contribution in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/3770">lucide-icons/lucide#3770</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/lucide-icons/lucide/compare/0.571.0...0.572.0">https://github.com/lucide-icons/lucide/compare/0.571.0...0.572.0</a></p>
<h2>Version 0.571.0</h2>
<h2>What's Changed</h2>
<ul>
<li>fix(icons): rearrange <code>circle</code>-icons path and circle order
by <a
href="https://github.com/adamlindqvist"><code>@​adamlindqvist</code></a>
in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/3746">lucide-icons/lucide#3746</a></li>
<li>feat(icons): added <code>shelving-unit</code> icon by <a
href="https://github.com/karsa-mistmere"><code>@​karsa-mistmere</code></a>
in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/3041">lucide-icons/lucide#3041</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/adamlindqvist"><code>@​adamlindqvist</code></a>
made their first contribution in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/3746">lucide-icons/lucide#3746</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/lucide-icons/lucide/compare/0.570.0...0.571.0">https://github.com/lucide-icons/lucide/compare/0.570.0...0.571.0</a></p>
<h2>Version 0.570.0</h2>
<h2>What's Changed</h2>
<ul>
<li>feat(icons): added <code>towel-rack</code> icon by <a
href="https://github.com/jguddas"><code>@​jguddas</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/3350">lucide-icons/lucide#3350</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/lucide-icons/lucide/commits/0.574.0/packages/lucide-svelte">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=lucide-svelte&package-manager=npm_and_yarn&previous-version=0.564.0&new-version=0.574.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 17:34:38 +01:00
dependabot[bot]
20e0dbf936 Bump uvicorn from 0.40.0 to 0.41.0 in /metadata_relay (#440)
Bumps [uvicorn](https://github.com/Kludex/uvicorn) from 0.40.0 to
0.41.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/Kludex/uvicorn/releases">uvicorn's
releases</a>.</em></p>
<blockquote>
<h2>Version 0.41.0</h2>
<h2>Added</h2>
<ul>
<li>Add <code>--limit-max-requests-jitter</code> to stagger worker
restarts (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2707">#2707</a>)</li>
<li>Add socket path to <code>scope[&quot;server&quot;]</code> (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2561">#2561</a>)</li>
</ul>
<h2>Changed</h2>
<ul>
<li>Rename <code>LifespanOn.error_occured</code> to
<code>error_occurred</code> (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2776">#2776</a>)</li>
</ul>
<h2>Fixed</h2>
<ul>
<li>Ignore permission denied errors in watchfiles reloader (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2817">#2817</a>)</li>
<li>Ensure lifespan shutdown runs when <code>should_exit</code> is set
during startup (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2812">#2812</a>)</li>
<li>Reduce the log level of 'request limit exceeded' messages (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2788">#2788</a>)</li>
</ul>
<hr />
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/t-kawasumi"><code>@​t-kawasumi</code></a> made
their first contribution in <a
href="https://redirect.github.com/Kludex/uvicorn/pull/2776">Kludex/uvicorn#2776</a></li>
<li><a href="https://github.com/fardyn"><code>@​fardyn</code></a> made
their first contribution in <a
href="https://redirect.github.com/Kludex/uvicorn/pull/2800">Kludex/uvicorn#2800</a></li>
<li><a href="https://github.com/ewie"><code>@​ewie</code></a> made their
first contribution in <a
href="https://redirect.github.com/Kludex/uvicorn/pull/2807">Kludex/uvicorn#2807</a></li>
<li><a href="https://github.com/shevron"><code>@​shevron</code></a> made
their first contribution in <a
href="https://redirect.github.com/Kludex/uvicorn/pull/2788">Kludex/uvicorn#2788</a></li>
<li><a href="https://github.com/jonashaag"><code>@​jonashaag</code></a>
made their first contribution in <a
href="https://redirect.github.com/Kludex/uvicorn/pull/2707">Kludex/uvicorn#2707</a></li>
</ul>
<hr />
<p><strong>Full Changelog</strong>: <a
href="https://github.com/Kludex/uvicorn/compare/0.40.0...0.41.0">https://github.com/Kludex/uvicorn/compare/0.40.0...0.41.0</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/Kludex/uvicorn/blob/main/docs/release-notes.md">uvicorn's
changelog</a>.</em></p>
<blockquote>
<h2>0.41.0 (February 16, 2026)</h2>
<h3>Added</h3>
<ul>
<li>Add <code>--limit-max-requests-jitter</code> to stagger worker
restarts (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2707">#2707</a>)</li>
<li>Add socket path to <code>scope[&quot;server&quot;]</code> (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2561">#2561</a>)</li>
</ul>
<h3>Changed</h3>
<ul>
<li>Rename <code>LifespanOn.error_occured</code> to
<code>error_occurred</code> (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2776">#2776</a>)</li>
</ul>
<h3>Fixed</h3>
<ul>
<li>Ignore permission denied errors in watchfiles reloader (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2817">#2817</a>)</li>
<li>Ensure lifespan shutdown runs when <code>should_exit</code> is set
during startup (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2812">#2812</a>)</li>
<li>Reduce the log level of 'request limit exceeded' messages (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2788">#2788</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9283c0f15c"><code>9283c0f</code></a>
Version 0.41.0 (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2821">#2821</a>)</li>
<li><a
href="a01a33eb8f"><code>a01a33e</code></a>
Add <code>--limit-max-requests-jitter</code> to stagger worker restarts
(<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2707">#2707</a>)</li>
<li><a
href="2ce65bde15"><code>2ce65bd</code></a>
Ignore permission denied errors in watchfiles reloader (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2817">#2817</a>)</li>
<li><a
href="654f2ed7d7"><code>654f2ed</code></a>
Ensure lifespan shutdown runs when <code>should_exit</code> is set
during startup (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2812">#2812</a>)</li>
<li><a
href="a03d9f6f0e"><code>a03d9f6</code></a>
Reduce the log level of 'request limit exceeded' messages (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2788">#2788</a>)</li>
<li><a
href="e377de40d0"><code>e377de4</code></a>
Add socket path to scope[&quot;server&quot;] (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2561">#2561</a>)</li>
<li><a
href="0779f7f8a4"><code>0779f7f</code></a>
Poll for readiness in <code>test_multiprocess_health_check</code> and
<code>run_server</code> (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2816">#2816</a>)</li>
<li><a
href="7e9ce2c974"><code>7e9ce2c</code></a>
Poll for PID changes in <code>test_multiprocess_sighup</code> instead of
fixed sleep (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2">#2</a>...</li>
<li><a
href="99f0d8734d"><code>99f0d87</code></a>
Fix grep warning in scripts/sync-version (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2807">#2807</a>)</li>
<li><a
href="7ae2e6375a"><code>7ae2e63</code></a>
chore(deps): bump the python-packages group with 18 updates (<a
href="https://redirect.github.com/Kludex/uvicorn/issues/2801">#2801</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/Kludex/uvicorn/compare/0.40.0...0.41.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=uvicorn&package-manager=uv&previous-version=0.40.0&new-version=0.41.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 17:34:27 +01:00
dependabot[bot]
c8f2a4316e Bump alembic from 1.17.2 to 1.18.4 (#438)
Bumps [alembic](https://github.com/sqlalchemy/alembic) from 1.17.2 to
1.18.4.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/sqlalchemy/alembic/releases">alembic's
releases</a>.</em></p>
<blockquote>
<h1>1.18.4</h1>
<p>Released: February 10, 2026</p>
<h2>bug</h2>
<ul>
<li>
<p><strong>[bug] [operations]</strong> Reverted the behavior of
<code>Operations.add_column()</code> that would
automatically render the &quot;PRIMARY KEY&quot; keyword inline when a
<code>Column</code> with <code>primary_key=True</code> is added. The
automatic
behavior, added in version 1.18.2, is now opt-in via the new
<code>Operations.add_column.inline_primary_key</code> parameter. This
change restores the ability to render a PostgreSQL SERIAL column, which
is
required to be <code>primary_key=True</code>, while not impacting the
ability to
render a separate primary key constraint. This also provides consistency
with the <code>Operations.add_column.inline_references</code> parameter
and
gives users explicit control over SQL generation.</p>
<p>To render PRIMARY KEY inline, use the
<code>Operations.add_column.inline_primary_key</code> parameter set to
<code>True</code>:</p>
<pre><code>op.add_column(
    &quot;my_table&quot;,
    Column(&quot;id&quot;, Integer, primary_key=True),
    inline_primary_key=True,
)</code></pre>
<p>References: <a
href="https://redirect.github.com/sqlalchemy/alembic/issues/1232">#1232</a></p>
</li>
</ul>
<h1>1.18.3</h1>
<p>Released: January 29, 2026</p>
<h2>bug</h2>
<ul>
<li>
<p><strong>[bug] [autogenerate]</strong> Fixed regression in version
1.18.0 due to <a
href="https://redirect.github.com/sqlalchemy/alembic/issues/1771">#1771</a>
where autogenerate
would raise <code>NoReferencedTableError</code> when a foreign key
constraint
referenced a table that was not part of the initial table load,
including
tables filtered out by the
<code>EnvironmentContext.configure.include_name</code> callable or
tables
in remote schemas that were not included in the initial reflection
run.</p>
<p>The change in <a
href="https://redirect.github.com/sqlalchemy/alembic/issues/1771">#1771</a>
was a performance optimization that eliminated
additional reflection queries for tables that were only referenced by
foreign keys but not explicitly included in the main reflection run.
However, this optimization inadvertently removed the creation of
<code>Table</code> objects for these referenced tables, causing
autogenerate
to fail when processing foreign key constraints that pointed to
them.</p>
<p>The fix creates placeholder <code>Table</code> objects for foreign
key targets</p>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/sqlalchemy/alembic/commits">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=alembic&package-manager=uv&previous-version=1.17.2&new-version=1.18.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 17:34:15 +01:00
dependabot[bot]
4836e3e188 Bump prettier-plugin-svelte from 3.4.0 to 3.4.1 in /web (#437)
Bumps
[prettier-plugin-svelte](https://github.com/sveltejs/prettier-plugin-svelte)
from 3.4.0 to 3.4.1.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/sveltejs/prettier-plugin-svelte/blob/v3.4.1/CHANGELOG.md">prettier-plugin-svelte's
changelog</a>.</em></p>
<blockquote>
<h2>3.4.1</h2>
<ul>
<li>(fix) externalize all prettier imports</li>
<li>(fix) don't remove parentheses of <code>bind:</code>ings with
<code>as</code> type casts</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/sveltejs/prettier-plugin-svelte/commits/v3.4.1">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=prettier-plugin-svelte&package-manager=npm_and_yarn&previous-version=3.4.0&new-version=3.4.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 17:33:54 +01:00
dependabot[bot]
7a6466ea9d Bump @sinclair/typebox from 0.34.41 to 0.34.48 in /web (#436)
Bumps
[@sinclair/typebox](https://github.com/sinclairzx81/typebox-legacy) from
0.34.41 to 0.34.48.
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/sinclairzx81/typebox-legacy/commits/0.34.48">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~GitHub">GitHub Actions</a>, a new releaser
for <code>@​sinclair/typebox</code> since your current version.</p>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@sinclair/typebox&package-manager=npm_and_yarn&previous-version=0.34.41&new-version=0.34.48)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 17:33:33 +01:00
Maximilian Dorninger
b427aa5723 Merge pull request #433 from maxdorninger/dependabot/npm_and_yarn/web/npm_and_yarn-e0ba90b5b1
Bump @sveltejs/kit from 2.49.2 to 2.51.0 in /web in the npm_and_yarn group across 1 directory
2026-02-13 19:39:57 +01:00
dependabot[bot]
82aa01a650 Bump @sveltejs/kit in /web in the npm_and_yarn group across 1 directory
Bumps the npm_and_yarn group with 1 update in the /web directory: [@sveltejs/kit](https://github.com/sveltejs/kit/tree/HEAD/packages/kit).


Updates `@sveltejs/kit` from 2.49.2 to 2.51.0
- [Release notes](https://github.com/sveltejs/kit/releases)
- [Changelog](https://github.com/sveltejs/kit/blob/main/packages/kit/CHANGELOG.md)
- [Commits](https://github.com/sveltejs/kit/commits/@sveltejs/kit@2.51.0/packages/kit)

---
updated-dependencies:
- dependency-name: "@sveltejs/kit"
  dependency-version: 2.51.0
  dependency-type: direct:development
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-13 18:25:01 +00:00
Maximilian Dorninger
bc3895ab40 Merge pull request #427 from maxdorninger/dependabot/npm_and_yarn/web/typeschema/class-validator-0.3.0
Bump @typeschema/class-validator from 0.2.0 to 0.3.0 in /web
2026-02-13 19:24:38 +01:00
Maximilian Dorninger
b7ed529f77 Merge pull request #428 from maxdorninger/dependabot/uv/starlette-0.52.1
Bump starlette from 0.50.0 to 0.52.1
2026-02-13 19:24:26 +01:00
Maximilian Dorninger
370df4efa0 Merge pull request #429 from maxdorninger/dependabot/npm_and_yarn/web/vite-7.3.1
Bump vite from 7.2.7 to 7.3.1 in /web
2026-02-13 19:24:12 +01:00
Maximilian Dorninger
a3e85d6338 Merge pull request #431 from maxdorninger/dependabot/npm_and_yarn/web/svelte-5.51.0
Bump svelte from 5.45.8 to 5.51.0 in /web
2026-02-13 19:24:01 +01:00
Maximilian Dorninger
a2816f2dfb Merge pull request #432 from maxdorninger/dependabot/npm_and_yarn/web/lucide-svelte-0.564.0
Bump lucide-svelte from 0.544.0 to 0.564.0 in /web
2026-02-13 19:23:48 +01:00
Maximilian Dorninger
0026b891f5 Merge pull request #430 from maxdorninger/dependabot/uv/cachetools-7.0.1
Bump cachetools from 6.2.4 to 7.0.1
2026-02-13 19:23:34 +01:00
dependabot[bot]
b312d880b7 Bump lucide-svelte from 0.544.0 to 0.564.0 in /web
Bumps [lucide-svelte](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-svelte) from 0.544.0 to 0.564.0.
- [Release notes](https://github.com/lucide-icons/lucide/releases)
- [Commits](https://github.com/lucide-icons/lucide/commits/0.564.0/packages/lucide-svelte)

---
updated-dependencies:
- dependency-name: lucide-svelte
  dependency-version: 0.564.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-13 18:08:18 +00:00
dependabot[bot]
71e2a08535 Bump svelte from 5.45.8 to 5.51.0 in /web
Bumps [svelte](https://github.com/sveltejs/svelte/tree/HEAD/packages/svelte) from 5.45.8 to 5.51.0.
- [Release notes](https://github.com/sveltejs/svelte/releases)
- [Changelog](https://github.com/sveltejs/svelte/blob/main/packages/svelte/CHANGELOG.md)
- [Commits](https://github.com/sveltejs/svelte/commits/svelte@5.51.0/packages/svelte)

---
updated-dependencies:
- dependency-name: svelte
  dependency-version: 5.51.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-13 18:08:10 +00:00
dependabot[bot]
f2bf1a2dae Bump cachetools from 6.2.4 to 7.0.1
Bumps [cachetools](https://github.com/tkem/cachetools) from 6.2.4 to 7.0.1.
- [Changelog](https://github.com/tkem/cachetools/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/tkem/cachetools/compare/v6.2.4...v7.0.1)

---
updated-dependencies:
- dependency-name: cachetools
  dependency-version: 7.0.1
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-13 18:08:06 +00:00
dependabot[bot]
6b70980c2a Bump vite from 7.2.7 to 7.3.1 in /web
Bumps [vite](https://github.com/vitejs/vite/tree/HEAD/packages/vite) from 7.2.7 to 7.3.1.
- [Release notes](https://github.com/vitejs/vite/releases)
- [Changelog](https://github.com/vitejs/vite/blob/main/packages/vite/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite/commits/v7.3.1/packages/vite)

---
updated-dependencies:
- dependency-name: vite
  dependency-version: 7.3.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-13 18:08:01 +00:00
dependabot[bot]
e80a516c23 Bump starlette from 0.50.0 to 0.52.1
Bumps [starlette](https://github.com/Kludex/starlette) from 0.50.0 to 0.52.1.
- [Release notes](https://github.com/Kludex/starlette/releases)
- [Changelog](https://github.com/Kludex/starlette/blob/main/docs/release-notes.md)
- [Commits](https://github.com/Kludex/starlette/compare/0.50.0...0.52.1)

---
updated-dependencies:
- dependency-name: starlette
  dependency-version: 0.52.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-13 18:07:55 +00:00
dependabot[bot]
280e136209 Bump @typeschema/class-validator from 0.2.0 to 0.3.0 in /web
Bumps [@typeschema/class-validator](https://github.com/decs/typeschema) from 0.2.0 to 0.3.0.
- [Release notes](https://github.com/decs/typeschema/releases)
- [Commits](https://github.com/decs/typeschema/compare/@typeschema/class-validator@0.2.0...@typeschema/class-validator@0.3.0)

---
updated-dependencies:
- dependency-name: "@typeschema/class-validator"
  dependency-version: 0.3.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-13 18:07:51 +00:00
maxid
5c62c9f5be Merge remote-tracking branch 'origin/master' 2026-02-13 19:06:44 +01:00
maxid
1e46cdc03b add metadata_relay to dependabot.yml 2026-02-13 19:06:36 +01:00
Maximilian Dorninger
18573fa7d9 Bump actions/setup-python from 5 to 6 (#412)
Bumps [actions/setup-python](https://github.com/actions/setup-python)
from 5 to 6.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/actions/setup-python/releases">actions/setup-python's
releases</a>.</em></p>
<blockquote>
<h2>v6.0.0</h2>
<h2>What's Changed</h2>
<h3>Breaking Changes</h3>
<ul>
<li>Upgrade to node 24 by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/setup-python/pull/1164">actions/setup-python#1164</a></li>
</ul>
<p>Make sure your runner is on version v2.327.1 or later to ensure
compatibility with this release. <a
href="https://github.com/actions/runner/releases/tag/v2.327.1">See
Release Notes</a></p>
<h3>Enhancements:</h3>
<ul>
<li>Add support for <code>pip-version</code> by <a
href="https://github.com/priyagupta108"><code>@​priyagupta108</code></a>
in <a
href="https://redirect.github.com/actions/setup-python/pull/1129">actions/setup-python#1129</a></li>
<li>Enhance reading from .python-version by <a
href="https://github.com/krystof-k"><code>@​krystof-k</code></a> in <a
href="https://redirect.github.com/actions/setup-python/pull/787">actions/setup-python#787</a></li>
<li>Add version parsing from Pipfile by <a
href="https://github.com/aradkdj"><code>@​aradkdj</code></a> in <a
href="https://redirect.github.com/actions/setup-python/pull/1067">actions/setup-python#1067</a></li>
</ul>
<h3>Bug fixes:</h3>
<ul>
<li>Clarify pythonLocation behaviour for PyPy and GraalPy in environment
variables by <a
href="https://github.com/aparnajyothi-y"><code>@​aparnajyothi-y</code></a>
in <a
href="https://redirect.github.com/actions/setup-python/pull/1183">actions/setup-python#1183</a></li>
<li>Change missing cache directory error to warning by <a
href="https://github.com/aparnajyothi-y"><code>@​aparnajyothi-y</code></a>
in <a
href="https://redirect.github.com/actions/setup-python/pull/1182">actions/setup-python#1182</a></li>
<li>Add Architecture-Specific PATH Management for Python with --user
Flag on Windows by <a
href="https://github.com/aparnajyothi-y"><code>@​aparnajyothi-y</code></a>
in <a
href="https://redirect.github.com/actions/setup-python/pull/1122">actions/setup-python#1122</a></li>
<li>Include python version in PyPy python-version output by <a
href="https://github.com/cdce8p"><code>@​cdce8p</code></a> in <a
href="https://redirect.github.com/actions/setup-python/pull/1110">actions/setup-python#1110</a></li>
<li>Update docs: clarification on pip authentication with setup-python
by <a
href="https://github.com/priya-kinthali"><code>@​priya-kinthali</code></a>
in <a
href="https://redirect.github.com/actions/setup-python/pull/1156">actions/setup-python#1156</a></li>
</ul>
<h3>Dependency updates:</h3>
<ul>
<li>Upgrade idna from 2.9 to 3.7 in /<strong>tests</strong>/data by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/actions/setup-python/pull/843">actions/setup-python#843</a></li>
<li>Upgrade form-data to fix critical vulnerabilities <a
href="https://redirect.github.com/actions/setup-python/issues/182">#182</a>
&amp; <a
href="https://redirect.github.com/actions/setup-python/issues/183">#183</a>
by <a
href="https://github.com/aparnajyothi-y"><code>@​aparnajyothi-y</code></a>
in <a
href="https://redirect.github.com/actions/setup-python/pull/1163">actions/setup-python#1163</a></li>
<li>Upgrade setuptools to 78.1.1 to fix path traversal vulnerability in
PackageIndex.download by <a
href="https://github.com/aparnajyothi-y"><code>@​aparnajyothi-y</code></a>
in <a
href="https://redirect.github.com/actions/setup-python/pull/1165">actions/setup-python#1165</a></li>
<li>Upgrade actions/checkout from 4 to 5 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/actions/setup-python/pull/1181">actions/setup-python#1181</a></li>
<li>Upgrade <code>@​actions/tool-cache</code> from 2.0.1 to 2.0.2 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/actions/setup-python/pull/1095">actions/setup-python#1095</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/krystof-k"><code>@​krystof-k</code></a>
made their first contribution in <a
href="https://redirect.github.com/actions/setup-python/pull/787">actions/setup-python#787</a></li>
<li><a href="https://github.com/cdce8p"><code>@​cdce8p</code></a> made
their first contribution in <a
href="https://redirect.github.com/actions/setup-python/pull/1110">actions/setup-python#1110</a></li>
<li><a href="https://github.com/aradkdj"><code>@​aradkdj</code></a> made
their first contribution in <a
href="https://redirect.github.com/actions/setup-python/pull/1067">actions/setup-python#1067</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/actions/setup-python/compare/v5...v6.0.0">https://github.com/actions/setup-python/compare/v5...v6.0.0</a></p>
<h2>v5.6.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Workflow updates related to Ubuntu 20.04 by <a
href="https://github.com/aparnajyothi-y"><code>@​aparnajyothi-y</code></a>
in <a
href="https://redirect.github.com/actions/setup-python/pull/1065">actions/setup-python#1065</a></li>
<li>Fix for Candidate Not Iterable Error by <a
href="https://github.com/aparnajyothi-y"><code>@​aparnajyothi-y</code></a>
in <a
href="https://redirect.github.com/actions/setup-python/pull/1082">actions/setup-python#1082</a></li>
<li>Upgrade semver and <code>@​types/semver</code> by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/actions/setup-python/pull/1091">actions/setup-python#1091</a></li>
<li>Upgrade prettier from 2.8.8 to 3.5.3 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/actions/setup-python/pull/1046">actions/setup-python#1046</a></li>
<li>Upgrade ts-jest from 29.1.2 to 29.3.2 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/actions/setup-python/pull/1081">actions/setup-python#1081</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/actions/setup-python/compare/v5...v5.6.0">https://github.com/actions/setup-python/compare/v5...v5.6.0</a></p>
<h2>v5.5.0</h2>
<h2>What's Changed</h2>
<h3>Enhancements:</h3>
<ul>
<li>Support free threaded Python versions like '3.13t' by <a
href="https://github.com/colesbury"><code>@​colesbury</code></a> in <a
href="https://redirect.github.com/actions/setup-python/pull/973">actions/setup-python#973</a></li>
<li>Enhance Workflows: Include ubuntu-arm runners, Add e2e Testing for
free threaded and Upgrade <code>@​action/cache</code> from 4.0.0 to
4.0.3 by <a
href="https://github.com/priya-kinthali"><code>@​priya-kinthali</code></a>
in <a
href="https://redirect.github.com/actions/setup-python/pull/1056">actions/setup-python#1056</a></li>
<li>Add support for .tool-versions file in setup-python by <a
href="https://github.com/mahabaleshwars"><code>@​mahabaleshwars</code></a>
in <a
href="https://redirect.github.com/actions/setup-python/pull/1043">actions/setup-python#1043</a></li>
</ul>
<h3>Bug fixes:</h3>
<ul>
<li>Fix architecture for pypy on Linux ARM64 by <a
href="https://github.com/mayeut"><code>@​mayeut</code></a> in <a
href="https://redirect.github.com/actions/setup-python/pull/1011">actions/setup-python#1011</a>
This update maps arm64 to aarch64 for Linux ARM64 PyPy
installations.</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="a309ff8b42"><code>a309ff8</code></a>
Bump urllib3 from 2.6.0 to 2.6.3 in /<strong>tests</strong>/data (<a
href="https://redirect.github.com/actions/setup-python/issues/1264">#1264</a>)</li>
<li><a
href="bfe8cc55a7"><code>bfe8cc5</code></a>
Upgrade <a href="https://github.com/actions"><code>@​actions</code></a>
dependencies to Node 24 compatible versions (<a
href="https://redirect.github.com/actions/setup-python/issues/1259">#1259</a>)</li>
<li><a
href="4f41a90a1f"><code>4f41a90</code></a>
Bump urllib3 from 2.5.0 to 2.6.0 in /<strong>tests</strong>/data (<a
href="https://redirect.github.com/actions/setup-python/issues/1253">#1253</a>)</li>
<li><a
href="83679a892e"><code>83679a8</code></a>
Bump <code>@​types/node</code> from 24.1.0 to 24.9.1 and update macos-13
to macos-15-intel ...</li>
<li><a
href="bfc4944b43"><code>bfc4944</code></a>
Bump prettier from 3.5.3 to 3.6.2 (<a
href="https://redirect.github.com/actions/setup-python/issues/1234">#1234</a>)</li>
<li><a
href="97aeb3efb8"><code>97aeb3e</code></a>
Bump requests from 2.32.2 to 2.32.4 in /<strong>tests</strong>/data (<a
href="https://redirect.github.com/actions/setup-python/issues/1130">#1130</a>)</li>
<li><a
href="443da59188"><code>443da59</code></a>
Bump actions/publish-action from 0.3.0 to 0.4.0 &amp; Documentation
update for pi...</li>
<li><a
href="cfd55ca824"><code>cfd55ca</code></a>
graalpy: add graalpy early-access and windows builds (<a
href="https://redirect.github.com/actions/setup-python/issues/880">#880</a>)</li>
<li><a
href="bba65e51ff"><code>bba65e5</code></a>
Bump typescript from 5.4.2 to 5.9.3 and update docs/advanced-usage.md
(<a
href="https://redirect.github.com/actions/setup-python/issues/1094">#1094</a>)</li>
<li><a
href="18566f86b3"><code>18566f8</code></a>
Improve wording and &quot;fix example&quot; (remove 3.13) on testing
against pre-releas...</li>
<li>Additional commits viewable in <a
href="https://github.com/actions/setup-python/compare/v5...v6">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/setup-python&package-manager=github_actions&previous-version=5&new-version=6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-13 19:04:05 +01:00
maxid
6debd7a42d update package-lock.json 2026-02-13 18:54:54 +01:00
dependabot[bot]
cd70ab8711 Bump actions/setup-python from 5 to 6
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5 to 6.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-13 17:08:28 +00:00
Maximilian Dorninger
51b8794e4d Merge pull request #411 from maxdorninger/Dependabot-auto-bump-deps
Configure Dependabot for multiple package ecosystems
2026-02-13 18:07:54 +01:00
Mark Riabov
0cfd1fa724 Fix suffix formatting for with_suffix call (#408)
Fixes `ValueError: Invalid suffix 'jpg'`, which completely prevented posters from being downloaded from the metadata provider.
2026-02-10 20:29:05 +01:00
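The bug above comes from `pathlib.Path.with_suffix`, which rejects bare extensions: the suffix must include the leading dot. A minimal sketch of the failure and the fix (the poster path is hypothetical):

```python
from pathlib import Path

poster = Path("/data/posters/tt0111161")  # hypothetical poster path

# Passing a bare extension raises ValueError: Invalid suffix 'jpg'
try:
    poster.with_suffix("jpg")
except ValueError as exc:
    print(exc)

# Prepending the dot yields the intended file name
print(poster.with_suffix(".jpg"))  # /data/posters/tt0111161.jpg
```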
Maximilian Dorninger
b5b297e99a add new sponsor syn (#405)
This PR adds the new sponsor, syn.
2026-02-08 20:10:06 +01:00
maxid
58414cadae update all links to docs 2026-02-08 19:47:17 +01:00
maxid
462794520e update docs workflow 2026-02-08 19:43:13 +01:00
maxid
59afba007d update docs workflow 2026-02-08 19:36:07 +01:00
Maximilian Dorninger
cfa303e4f3 Merge pull request #404 from maxdorninger/mkdocs
This PR replaces Gitbook with Mkdocs to provide documentation
2026-02-08 19:27:15 +01:00
maxid
d3dde9c7eb add docs workflow 2026-02-08 19:22:34 +01:00
maxid
9c94ef6de0 convert gitbook files to mkdocs 2026-02-08 19:16:38 +01:00
Maximilian Dorninger
2665106847 Merge pull request #401 from maxdorninger/fix-env-variables
Fix download clients config being read from env variables
2026-02-08 16:37:15 +01:00
maxid
d029177fc0 hot fix: fix search tag name for episode in jackett 2026-02-04 23:52:07 +01:00
Maximilian Dorninger
1698c404cd Merge pull request #400 from maxdorninger/add-search-by-id-support-to-jackett
Add search by id support to jackett
2026-02-04 23:00:00 +01:00
maxid
abac894a95 fix download clients config being read from env variables without the mediamanager prefix 2026-02-04 22:49:24 +01:00
maxid
12854ff661 format files 2026-02-04 21:34:37 +01:00
maxid
3d52a87302 add id search capabilities to jackett 2026-02-04 21:34:31 +01:00
Maximilian Dorninger
9ee5cc6895 make the container user configurable (#399)
This PR makes the user the container runs as configurable. Before, the
container always tried stepping down (from root) to the mediamanager
user. Now it detects if it's already running as a non-root user and
starts the server directly. Fixes #397
2026-02-04 19:01:18 +01:00
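The start-up decision described in that commit can be sketched as follows (illustrative only; the real entrypoint logic and names may differ):

```python
import os

def needs_privilege_drop() -> bool:
    """Return True when the process still runs as root and should
    step down to the mediamanager user before starting the server."""
    return os.getuid() == 0

def start() -> str:
    # Already non-root (e.g. started via `docker run --user`): run directly.
    if not needs_privilege_drop():
        return "exec server"
    # Root: drop privileges to the mediamanager user first, then start.
    return "drop to mediamanager, then exec server"
```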
126 changed files with 3879 additions and 4745 deletions


@@ -53,5 +53,5 @@ YOUR CONFIG HERE
 ```
 - [ ] I understand, that without logs and/or screenshots and a detailed description of the problem, it is very hard to fix bugs.
-- [ ] I have checked the [documentation](https://maximilian-dorninger.gitbook.io/mediamanager) for help.
+- [ ] I have checked the [documentation](https://maxdorninger.github.io/MediaManager/) for help.
 - [ ] I have searched the [issues](https://github.com/maxdorninger/MediaManager/issues) for similar issues and found none.

.github/dependabot.yml vendored Normal file

@@ -0,0 +1,31 @@
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file

version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5
  - package-ecosystem: "npm"
    directory: "/web"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5
  - package-ecosystem: "uv"
    directory: "/"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5
  - package-ecosystem: "uv"
    directory: "/metadata_relay"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5


@@ -85,7 +85,7 @@ jobs:
         run: echo "name=$(echo '${{ github.event.repository.name }}' | tr '[:upper:]' '[:lower:]')" >> $GITHUB_OUTPUT
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
+        uses: docker/setup-buildx-action@v4
         with:
           install: true
           driver-opts: image=moby/buildkit:rootless
@@ -171,7 +171,7 @@ jobs:
         run: echo "name=$(echo '${{ github.event.repository.name }}' | tr '[:upper:]' '[:lower:]')" >> $GITHUB_OUTPUT
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
+        uses: docker/setup-buildx-action@v4
       - name: Log in to GitHub Container Registry
         uses: docker/login-action@v3


@@ -50,7 +50,7 @@ jobs:
         run: echo "name=$(echo '${{ github.event.repository.name }}' | tr '[:upper:]' '[:lower:]')" >> $GITHUB_OUTPUT
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
+        uses: docker/setup-buildx-action@v4
         with:
           install: true
           driver-opts: image=moby/buildkit:rootless
@@ -135,7 +135,7 @@ jobs:
         run: echo "name=$(echo '${{ github.event.repository.name }}' | tr '[:upper:]' '[:lower:]')" >> $GITHUB_OUTPUT
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
+        uses: docker/setup-buildx-action@v4
       - name: Log in to GitHub Container Registry
         uses: docker/login-action@v3

.github/workflows/docs.yml vendored Normal file

@@ -0,0 +1,62 @@
name: Publish docs via GitHub Pages
on:
  push:
    branches:
      - master
    tags:
      - v*
  workflow_dispatch:
    inputs:
      set_default_alias:
        description: 'Alias to set as default (e.g. latest, master)'
        required: false
        default: 'latest'
permissions:
  contents: write
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Configure Git Credentials
        run: |
          git config user.name github-actions[bot]
          git config user.email 41898282+github-actions[bot]@users.noreply.github.com
      - uses: actions/setup-python@v6
        with:
          python-version: 3.x
      - run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV
      - uses: actions/cache@v4
        with:
          key: mkdocs-material-${{ env.cache_id }}
          path: .cache
          restore-keys: |
            mkdocs-material-
      - name: Install dependencies
        run: pip install mkdocs-material mike
      - name: Deploy (master)
        if: github.ref == 'refs/heads/master'
        run: |
          mike deploy --push --update-aliases master
      - name: Deploy (tag)
        if: startsWith(github.ref, 'refs/tags/v')
        run: |
          version=${GITHUB_REF#refs/tags/}
          mike deploy --push --update-aliases $version latest --title "$version"
          mike set-default --push latest
      - name: Set Default (Manual)
        if: github.event_name == 'workflow_dispatch' && github.event.inputs.set_default_alias != ''
        run: |
          mike set-default --push ${{ github.event.inputs.set_default_alias }}

.gitignore vendored

@@ -49,5 +49,5 @@ __pycache__
 # Postgres
 /postgres
-# Node modules
-/node_modules/*
+# MkDocs
+site/


@@ -18,7 +18,7 @@ Generally, if you have any questions or need help on the implementation side of
 just ask in the issue, or in a draft PR.
 Also, see the contribution guide in the docs for information on how to setup the dev environment:
-https://maximilian-dorninger.gitbook.io/mediamanager
+https://maxdorninger.github.io/MediaManager/
 ### For something that is a one or two line fix:


@@ -1,7 +1,7 @@
 <br />
 <div align="center">
-  <a href="https://maximilian-dorninger.gitbook.io/mediamanager">
-    <img src="https://github.com/maxdorninger/MediaManager/blob/master/web/static/logo.svg" alt="Logo" width="260" height="260">
+  <a href="https://maxdorninger.github.io/MediaManager/">
+    <img src="https://raw.githubusercontent.com/maxdorninger/MediaManager/refs/heads/master/docs/assets/logo-with-text.svg" alt="Logo" width="800">
   </a>
   <h3 align="center">MediaManager</h3>
@@ -9,7 +9,7 @@
   <p align="center">
     Modern management system for your media library
     <br />
-    <a href="https://maximilian-dorninger.gitbook.io/mediamanager"><strong>Explore the docs »</strong></a>
+    <a href="https://maxdorninger.github.io/MediaManager/"><strong>Explore the docs »</strong></a>
     <br />
     <a href="https://github.com/maxdorninger/MediaManager/issues/new?labels=bug&template=bug_report.md">Report Bug</a>
     &middot;
@@ -35,7 +35,7 @@ wget -O ./config/config.toml https://github.com/maxdorninger/MediaManager/releas
 docker compose up -d
 ```
-### [View the docs for installation instructions and more](https://maximilian-dorninger.gitbook.io/mediamanager)
+### [View the docs for installation instructions and more](https://maxdorninger.github.io/MediaManager/)
 ## Support MediaManager
@@ -60,6 +60,7 @@ docker compose up -d
 <a href="https://buymeacoffee.com/maxdorninger"><img src="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/DEBBB9/JO.png" width="80px" alt="Josh" /></a>&nbsp;&nbsp;
 <a href="https://buymeacoffee.com/maxdorninger"><img src="https://cdn.buymeacoffee.com/uploads/profile_pictures/2025/11/2VeQ8sTGPhj4tiLy.jpg" width="80px" alt="PuppiestDoggo" /></a>&nbsp;&nbsp;
 <a href="https://github.com/seferino-fernandez"><img src="https://avatars.githubusercontent.com/u/5546622" width="80px" alt="Seferino" /></a>&nbsp;&nbsp;
+<a href="https://buymeacoffee.com/maxdorninger"><img src="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/EC9689/SY.png" width="80px" alt="syn" /></a>&nbsp;&nbsp;
 ## Star History
@@ -80,7 +81,7 @@ docker compose up -d
 ## Developer Quick Start
-For the developer guide see the [Developer Guide](https://maximilian-dorninger.gitbook.io/mediamanager).
+For the developer guide see the [Developer Guide](https://maxdorninger.github.io/MediaManager/).
 <!-- LICENSE -->


@@ -30,14 +30,13 @@ from media_manager.auth.db import OAuthAccount, User # noqa: E402
 from media_manager.config import MediaManagerConfig # noqa: E402
 from media_manager.database import Base # noqa: E402
 from media_manager.indexer.models import IndexerQueryResult # noqa: E402
-from media_manager.movies.models import Movie, MovieFile, MovieRequest # noqa: E402
+from media_manager.movies.models import Movie, MovieFile # noqa: E402
 from media_manager.notification.models import Notification # noqa: E402
 from media_manager.torrent.models import Torrent # noqa: E402
 from media_manager.tv.models import ( # noqa: E402
     Episode,
+    EpisodeFile,
     Season,
-    SeasonFile,
-    SeasonRequest,
     Show,
 )
@@ -47,15 +46,13 @@ target_metadata = Base.metadata
 # noinspection PyStatementEffect
 __all__ = [
     "Episode",
+    "EpisodeFile",
     "IndexerQueryResult",
     "Movie",
     "MovieFile",
-    "MovieRequest",
     "Notification",
     "OAuthAccount",
     "Season",
-    "SeasonFile",
-    "SeasonRequest",
     "Show",
     "Torrent",
     "User",


@@ -0,0 +1,46 @@
"""create episode file table and add episode column to indexerqueryresult
Revision ID: 3a8fbd71e2c2
Revises: 9f3c1b2a4d8e
Create Date: 2026-01-08 13:43:00
"""
from typing import Sequence, Union
from alembic import op
from sqlalchemy.dialects import postgresql
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "3a8fbd71e2c2"
down_revision: Union[str, None] = "9f3c1b2a4d8e"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
quality_enum = postgresql.ENUM("uhd", "fullhd", "hd", "sd", "unknown", name="quality",
create_type=False,
)
# Create episode file table
op.create_table(
"episode_file",
sa.Column("episode_id", sa.UUID(), nullable=False),
sa.Column("torrent_id", sa.UUID(), nullable=True),
sa.Column("file_path_suffix", sa.String(), nullable=False),
sa.Column("quality", quality_enum, nullable=False),
sa.ForeignKeyConstraint(["episode_id"], ["episode.id"], ondelete="CASCADE"),
sa.ForeignKeyConstraint(["torrent_id"], ["torrent.id"], ondelete="SET NULL"),
sa.PrimaryKeyConstraint("episode_id", "file_path_suffix"),
)
# Add episode column to indexerqueryresult
op.add_column(
"indexer_query_result", sa.Column("episode", postgresql.ARRAY(sa.Integer()), nullable=True),
)
def downgrade() -> None:
op.drop_table("episode_file")
op.drop_column("indexer_query_result", "episode")


@@ -0,0 +1,31 @@
"""add overview column to episode table
Revision ID: 9f3c1b2a4d8e
Revises: 2c61f662ca9e
Create Date: 2025-12-29 21:45:00
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "9f3c1b2a4d8e"
down_revision: Union[str, None] = "2c61f662ca9e"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# Add overview to episode table
op.add_column(
"episode",
sa.Column("overview", sa.Text(), nullable=True),
)
def downgrade() -> None:
op.drop_column("episode", "overview")


@@ -0,0 +1,71 @@
"""migrate season files to episode files and drop the legacy table
Revision ID: a6f714d3c8b9
Revises: 16e78af9e5bf
Create Date: 2026-02-22 16:30:00
"""
from typing import Sequence, Union
import sqlalchemy as sa
from alembic import op
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision: str = "a6f714d3c8b9"
down_revision: Union[str, None] = "3a8fbd71e2c2"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Copy season_file records into episode_file and remove the legacy table."""
op.execute(
"""
INSERT INTO episode_file (episode_id, torrent_id, file_path_suffix, quality)
SELECT episode.id, season_file.torrent_id, season_file.file_path_suffix, season_file.quality
FROM season_file
JOIN season ON season.id = season_file.season_id
JOIN episode ON episode.season_id = season.id
LEFT JOIN episode_file ON
episode_file.episode_id = episode.id
AND episode_file.file_path_suffix = season_file.file_path_suffix
WHERE episode_file.episode_id IS NULL
"""
)
op.drop_table("season_file")
def downgrade() -> None:
"""Recreate season_file, repopulate it from episode_file, and keep both tables."""
quality_enum = postgresql.ENUM(
"uhd", "fullhd", "hd", "sd", "unknown", name="quality", create_type=False
)
op.create_table(
"season_file",
sa.Column("season_id", sa.UUID(), nullable=False),
sa.Column("torrent_id", sa.UUID(), nullable=True),
sa.Column("file_path_suffix", sa.String(), nullable=False),
sa.Column("quality", quality_enum, nullable=False),
sa.ForeignKeyConstraint(["season_id"], ["season.id"], ondelete="CASCADE"),
sa.ForeignKeyConstraint(["torrent_id"], ["torrent.id"], ondelete="SET NULL"),
sa.PrimaryKeyConstraint("season_id", "file_path_suffix"),
)
op.execute(
"""
INSERT INTO season_file (season_id, torrent_id, file_path_suffix, quality)
SELECT DISTINCT ON (episode.season_id, episode_file.file_path_suffix)
episode.season_id,
episode_file.torrent_id,
episode_file.file_path_suffix,
episode_file.quality
FROM episode_file
JOIN episode ON episode.id = episode_file.episode_id
ORDER BY episode.season_id, episode_file.file_path_suffix, episode_file.torrent_id, episode_file.quality
"""
)


@@ -0,0 +1,65 @@
"""remove requests
Revision ID: e60ae827ed98
Revises: a6f714d3c8b9
Create Date: 2026-02-22 18:07:12.866130
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision: str = 'e60ae827ed98'
down_revision: Union[str, None] = 'a6f714d3c8b9'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Upgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('movie_request')
op.drop_table('season_request')
op.alter_column('episode', 'overview',
existing_type=sa.TEXT(),
type_=sa.String(),
existing_nullable=True)
# ### end Alembic commands ###
def downgrade() -> None:
"""Downgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('season_request',
sa.Column('id', sa.UUID(), autoincrement=False, nullable=False),
sa.Column('season_id', sa.UUID(), autoincrement=False, nullable=False),
sa.Column('wanted_quality', postgresql.ENUM('uhd', 'fullhd', 'hd', 'sd', 'unknown', name='quality'), autoincrement=False, nullable=False),
sa.Column('min_quality', postgresql.ENUM('uhd', 'fullhd', 'hd', 'sd', 'unknown', name='quality'), autoincrement=False, nullable=False),
sa.Column('requested_by_id', sa.UUID(), autoincrement=False, nullable=True),
sa.Column('authorized', sa.BOOLEAN(), autoincrement=False, nullable=False),
sa.Column('authorized_by_id', sa.UUID(), autoincrement=False, nullable=True),
sa.ForeignKeyConstraint(['authorized_by_id'], ['user.id'], name=op.f('season_request_authorized_by_id_fkey'), ondelete='SET NULL'),
sa.ForeignKeyConstraint(['requested_by_id'], ['user.id'], name=op.f('season_request_requested_by_id_fkey'), ondelete='SET NULL'),
sa.ForeignKeyConstraint(['season_id'], ['season.id'], name=op.f('season_request_season_id_fkey'), ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id', name=op.f('season_request_pkey')),
sa.UniqueConstraint('season_id', 'wanted_quality', name=op.f('season_request_season_id_wanted_quality_key'), postgresql_include=[], postgresql_nulls_not_distinct=False)
)
op.create_table('movie_request',
sa.Column('id', sa.UUID(), autoincrement=False, nullable=False),
sa.Column('movie_id', sa.UUID(), autoincrement=False, nullable=False),
sa.Column('wanted_quality', postgresql.ENUM('uhd', 'fullhd', 'hd', 'sd', 'unknown', name='quality'), autoincrement=False, nullable=False),
sa.Column('min_quality', postgresql.ENUM('uhd', 'fullhd', 'hd', 'sd', 'unknown', name='quality'), autoincrement=False, nullable=False),
sa.Column('authorized', sa.BOOLEAN(), autoincrement=False, nullable=False),
sa.Column('requested_by_id', sa.UUID(), autoincrement=False, nullable=True),
sa.Column('authorized_by_id', sa.UUID(), autoincrement=False, nullable=True),
sa.ForeignKeyConstraint(['authorized_by_id'], ['user.id'], name=op.f('movie_request_authorized_by_id_fkey'), ondelete='SET NULL'),
sa.ForeignKeyConstraint(['movie_id'], ['movie.id'], name=op.f('movie_request_movie_id_fkey'), ondelete='CASCADE'),
sa.ForeignKeyConstraint(['requested_by_id'], ['user.id'], name=op.f('movie_request_requested_by_id_fkey'), ondelete='SET NULL'),
sa.PrimaryKeyConstraint('id', name=op.f('movie_request_pkey')),
sa.UniqueConstraint('movie_id', 'wanted_quality', name=op.f('movie_request_movie_id_wanted_quality_key'), postgresql_include=[], postgresql_nulls_not_distinct=False)
)
# ### end Alembic commands ###

View File

@@ -1,6 +1,6 @@
 # MediaManager Dev Configuration File
 # This file contains all available configuration options for MediaManager
-# Documentation: https://maximilian-dorninger.gitbook.io/mediamanager
+# Documentation: https://maxdorninger.github.io/MediaManager/
 #
 # This is an example configuration file that gets copied to your config folder
 # on first boot. You should modify the values below to match your setup.
@@ -138,7 +138,7 @@ negate = false
 [[indexers.title_scoring_rules]]
 name = "avoid_cam"
-keywords = ["cam", "ts"]
+keywords = ["cam", "camrip", "bdscr", "ddc", "dvdscreener","dvdscr", "hdcam", "hdtc", "hdts", "scr", "screener","telesync", "ts", "webscreener", "tc", "telecine", "tvrip"]
 score_modifier = -10000
 negate = false
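The `avoid_cam` rule above can be read as: if any keyword matches the release title, add `score_modifier` to that title's score. A minimal illustrative sketch of that behaviour (an assumption for clarity, not MediaManager's actual scoring code; `apply_rule` is a hypothetical helper):

```python
import re

def apply_rule(title: str, keywords: list[str], score_modifier: int, negate: bool = False) -> int:
    # Tokenize the release title on non-alphanumeric separators (dots, dashes, spaces).
    words = set(re.split(r"[^a-z0-9]+", title.lower()))
    # The rule fires when any keyword appears as a whole token; negate inverts that.
    matched = any(kw in words for kw in keywords)
    return score_modifier if matched != negate else 0

score = apply_rule("Some.Movie.2024.HDCAM.x264", ["cam", "camrip", "hdcam", "ts"], -10000)
print(score)  # -10000
```

The expanded keyword list in this commit catches more cam/telesync naming variants ("hdcam", "telecine", "tvrip", ...) that the old two-keyword list missed.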

View File

@@ -1,6 +1,6 @@
 # MediaManager Example Configuration File
 # This file contains all available configuration options for MediaManager
-# Documentation: https://maximilian-dorninger.gitbook.io/mediamanager
+# Documentation: https://maxdorninger.github.io/MediaManager/
 #
 # This is an example configuration file that gets copied to your config folder
 # on first boot. You should modify the values below to match your setup.
@@ -138,7 +138,7 @@ negate = false
 [[indexers.title_scoring_rules]]
 name = "avoid_cam"
-keywords = ["cam", "ts"]
+keywords = ["cam", "camrip", "bdscr", "ddc", "dvdscreener","dvdscr", "hdcam", "hdtc", "hdts", "scr", "screener","telesync", "ts", "webscreener", "tc", "telecine", "tvrip"]
 score_modifier = -10000
 negate = false

View File

@@ -56,6 +56,15 @@ services:
       - ./web:/app
     depends_on:
      - mediamanager
+  docs:
+    image: squidfunk/mkdocs-material:9
+    container_name: mediamanager-docs
+    volumes:
+      - .:/docs
+    ports:
+      - "9000:9000"
+    command: serve -w /docs -a 0.0.0.0:9000
 # ----------------------------
 # Additional services can be uncommented and configured as needed
@@ -130,17 +139,17 @@
 #    ports:
 #      - 8081:8080
 #    restart: unless-stopped
-#  jackett:
-#    image: lscr.io/linuxserver/jackett:latest
-#    container_name: jackett
-#    environment:
-#      - PUID=1000
-#      - PGID=1000
-#      - TZ=Etc/UTC
-#      - AUTO_UPDATE=true
-#    volumes:
-#      - ./res/jackett/data:/config
-#      - ./res/jackett/torrents:/downloads
-#    ports:
-#      - 9117:9117
-#    restart: unless-stopped
+  jackett:
+    image: lscr.io/linuxserver/jackett:latest
+    container_name: jackett
+    environment:
+      - PUID=1000
+      - PGID=1000
+      - TZ=Etc/UTC
+      - AUTO_UPDATE=true
+    volumes:
+      - ./res/jackett/data:/config
+      - ./res/jackett/torrents:/downloads
+    ports:
+      - 9117:9117
+    restart: unless-stopped

View File

@@ -1,34 +0,0 @@
----
-layout:
-  width: default
-  title:
-    visible: true
-  description:
-    visible: true
-  tableOfContents:
-    visible: true
-  outline:
-    visible: false
-  pagination:
-    visible: true
-  metadata:
-    visible: true
----
-# MediaManager
-MediaManager is the modern, easy-to-use successor to the fragmented "Arr" stack. Manage, discover, and automate your TV and movie collection in a single, simple interface.
-_Replaces Sonarr, Radarr, Seerr, and more._
-### Quick Links
-<table data-view="cards" data-full-width="false"><thead><tr><th align="center"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td align="center">Installation Guide</td><td><a href="installation/">installation</a></td></tr><tr><td align="center">Configuration</td><td><a href="configuration/">configuration</a></td></tr><tr><td align="center">Developer Guide</td><td><a href="contributing-to-mediamanager/developer-guide.md">developer-guide.md</a></td></tr><tr><td align="center">Troubleshooting</td><td><a href="troubleshooting.md">troubleshooting.md</a></td></tr><tr><td align="center">Advanced Features</td><td><a href="advanced-features/">advanced-features</a></td></tr><tr><td align="center">Import Existing Media</td><td><a href="importing-existing-media.md">importing-existing-media.md</a></td></tr></tbody></table>
-## Support MediaManager & Maximilian Dorninger
-<table data-card-size="large" data-view="cards" data-full-width="false"><thead><tr><th></th><th data-hidden data-card-target data-type="content-ref"></th><th data-hidden data-card-cover data-type="image">Cover image</th></tr></thead><tbody><tr><td>Sponsor me on GitHub Sponsors :)</td><td><a href="https://github.com/sponsors/maxdorninger">https://github.com/sponsors/maxdorninger</a></td><td></td></tr><tr><td>Buy me a coffee :)</td><td><a href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td></td></tr></tbody></table>
-### MediaManager Sponsors
-<table data-view="cards" data-full-width="false"><thead><tr><th>Sponsor</th><th data-hidden data-card-target data-type="content-ref"></th><th data-hidden data-card-cover data-type="image">Cover image</th></tr></thead><tbody><tr><td>Aljaž Mur Eržen</td><td><a href="https://fosstodon.org/@aljazmerzen">https://fosstodon.org/@aljazmerzen</a></td><td><a href="https://github.com/aljazerzen.png">https://github.com/aljazerzen.png</a></td></tr><tr><td>Luis Rodriguez</td><td><a href="https://github.com/ldrrp">https://github.com/ldrrp</a></td><td><a href="https://github.com/ldrrp.png">https://github.com/ldrrp.png</a></td></tr><tr><td>Brandon P.</td><td><a href="https://github.com/brandon-dacrib">https://github.com/brandon-dacrib</a></td><td><a href="https://github.com/brandon-dacrib.png">https://github.com/brandon-dacrib.png</a></td></tr><tr><td>SeimusS</td><td><a href="https://github.com/SeimusS">https://github.com/SeimusS</a></td><td><a href="https://github.com/SeimusS.png">https://github.com/SeimusS.png</a></td></tr><tr><td>HadrienKerlero</td><td><a href="https://github.com/HadrienKerlero">https://github.com/HadrienKerlero</a></td><td><a href="https://github.com/HadrienKerlero.png">https://github.com/HadrienKerlero.png</a></td></tr><tr><td>keyxmakerx</td><td><a href="https://github.com/keyxmakerx">https://github.com/keyxmakerx</a></td><td><a href="https://github.com/keyxmakerx.png">https://github.com/keyxmakerx.png</a></td></tr><tr><td>LITUATUI</td><td><a href="https://github.com/LITUATUI">https://github.com/LITUATUI</a></td><td><a href="https://github.com/LITUATUI.png">https://github.com/LITUATUI.png</a></td></tr><tr><td>Nicolas</td><td><a href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td><a href="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/B6CDBD/NI.png">https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/B6CDBD/NI.png</a></td></tr><tr><td>Josh</td><td><a href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td><a href="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/DEBBB9/JO.png">https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/DEBBB9/JO.png</a></td></tr><tr><td>PuppiestDoggo</td><td><a href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td><a href="https://cdn.buymeacoffee.com/uploads/profile_pictures/2025/11/2VeQ8sTGPhj4tiLy.jpg">https://cdn.buymeacoffee.com/uploads/profile_pictures/2025/11/2VeQ8sTGPhj4tiLy.jpg</a></td></tr><tr><td>Seferino</td><td><a href="https://github.com/seferino-fernandez">https://github.com/seferino-fernandez</a></td><td><a href="https://avatars.githubusercontent.com/u/5546622">https://avatars.githubusercontent.com/u/5546622</a></td></tr><tr><td>Powered by DigitalOcean</td><td><a href="https://m.do.co/c/4edf05429dca">https://m.do.co/c/4edf05429dca</a></td><td data-object-fit="contain"><a href="https://opensource.nyc3.cdn.digitaloceanspaces.com/attribution/assets/SVG/DO_Logo_vertical_blue.svg">https://opensource.nyc3.cdn.digitaloceanspaces.com/attribution/assets/SVG/DO_Logo_vertical_blue.svg</a></td></tr></tbody></table>

View File

@@ -1,33 +0,0 @@
-# Table of contents
-* [MediaManager](README.md)
-* [Installation Guide](installation/README.md)
-  * [Docker Compose](installation/docker.md)
-  * [Nix Flakes \[Community\]](installation/flakes.md)
-* [Importing existing media](importing-existing-media.md)
-* [Usage](usage.md)
-* [Configuration](configuration/README.md)
-  * [Backend](configuration/backend.md)
-  * [Authentication](configuration/authentication.md)
-  * [Database](configuration/database.md)
-  * [Download Clients](configuration/download-clients.md)
-  * [Indexers](configuration/indexers.md)
-  * [Scoring Rulesets](configuration/scoring-rulesets.md)
-  * [Notifications](configuration/notifications.md)
-  * [Custom Libraries](configuration/custom-libraries.md)
-  * [Logging](configuration/logging.md)
-* [Advanced Features](advanced-features/README.md)
-  * [qBittorrent Category](advanced-features/qbittorrent-category.md)
-  * [URL Prefix](advanced-features/url-prefix.md)
-  * [Metadata Provider Configuration](advanced-features/metadata-provider-configuration.md)
-  * [Custom port](advanced-features/custom-port.md)
-  * [Follow symlinks in frontend files](advanced-features/follow-symlinks-in-frontend-files.md)
-  * [Disable startup ascii art](advanced-features/disable-startup-ascii-art.md)
-* [Troubleshooting](troubleshooting.md)
-* [API Reference](api-reference.md)
-* [Screenshots](screenshots.md)
-## Contributing to MediaManager
-* [Developer Guide](contributing-to-mediamanager/developer-guide.md)
-* [Documentation](contributing-to-mediamanager/documentation.md)

View File

@@ -1,9 +0,0 @@
----
-description: >-
-  The features in this section are not required to run MediaManager and serve
-  their purpose in very specific environments, but they can enhance your
-  experience and provide additional functionality.
----
-# Advanced Features

View File

@@ -7,8 +7,6 @@ MediaManager can be configured to follow symlinks when serving frontend files. T
 * `FRONTEND_FOLLOW_SYMLINKS`\
   Set this environment variable to `true` to follow symlinks when serving frontend files. Default is `false`.
-{% code title=".env" %}
-```bash
+```bash title=".env"
 FRONTEND_FOLLOW_SYMLINKS=true
 ```
-{% endcode %}

View File

@@ -8,9 +8,8 @@ Metadata provider settings are configured in the `[metadata]` section of your `c
 TMDB (The Movie Database) is the primary metadata provider for MediaManager. It provides detailed information about movies and TV shows.
-{% hint style="info" %}
+!!! info
 Other software like Jellyfin use TMDB as well, so there won't be any metadata discrepancies.
-{% endhint %}
 * `tmdb_relay_url`\
   URL of the TMDB relay (MetadataRelay). Default is `https://metadata-relay.dorninger.co/tmdb`. Example: `https://your-own-relay.example.com/tmdb`.
@@ -19,24 +18,21 @@ Other software like Jellyfin use TMDB as well, so there won't be any metadata di
 * `default_language`\
   TMDB language parameter used when searching and adding. Default is `en`. Format: ISO 639-1 (2 letters).
-{% hint style="warning" %}
+!!! warning
 `default_language` sets the TMDB `language` parameter when searching and adding TV shows and movies. If TMDB does not find a matching translation, metadata in the original language will be fetched with no option for a fallback language. It is therefore highly advised to only use "broad" languages. For most use cases, the default setting is safest.
-{% endhint %}
 ### TVDB Settings (`[metadata.tvdb]`)
-{% hint style="warning" %}
+!!! warning
 The TVDB might provide false metadata and doesn't support some features of MediaManager like showing overviews. Therefore, TMDB is the preferred metadata provider.
-{% endhint %}
 * `tvdb_relay_url`\
   URL of the TVDB relay (MetadataRelay). Default is `https://metadata-relay.dorninger.co/tvdb`. Example: `https://your-own-relay.example.com/tvdb`.
 ### MetadataRelay
-{% hint style="info" %}
+!!! info
 To use MediaManager you don't need to set up your own MetadataRelay, as the default relay hosted by the developer should be sufficient for most purposes.
-{% endhint %}
 The MetadataRelay is a service that provides metadata for MediaManager. It acts as a proxy for TMDB and TVDB, allowing you to use your own API keys if needed, but the default relay means you don't need to create accounts for API keys yourself.
@@ -47,16 +43,14 @@ You might want to use your own relay if you want to avoid rate limits, protect y
 * Get a TMDB API key from [The Movie Database](https://www.themoviedb.org/settings/api)
 * Get a TVDB API key from [The TVDB](https://thetvdb.com/auth/register)
-{% hint style="info" %}
+!!! info
 If you want to use your own MetadataRelay, you can set the `tmdb_relay_url` and/or `tvdb_relay_url` to your own relay service.
-{% endhint %}
 ### Example Configuration
 Here's a complete example of the metadata section in your `config.toml`:
-{% code title="config.toml" %}
-```toml
+```toml title="config.toml"
 [metadata]
 # TMDB configuration
 [metadata.tmdb]
@@ -66,8 +60,6 @@ Here's a complete example of the metadata section in your `config.toml`:
 [metadata.tvdb]
 tvdb_relay_url = "https://metadata-relay.dorninger.co/tvdb"
 ```
-{% endcode %}
-{% hint style="info" %}
+!!! info
 In most cases, you can simply use the default values and don't need to specify these settings in your config file at all.
-{% endhint %}

View File

@@ -9,10 +9,8 @@ Use the following variables to customize behavior:
 * `torrents.qbittorrent.category_save_path`\
   Save path for the category in qBittorrent. By default, no subdirectory is used. Example: `/data/torrents/MediaManager`.
-{% hint style="info" %}
+!!! info
 qBittorrent saves torrents to the path specified by `torrents.qbittorrent.category_save_path`, so it must be a valid path that qBittorrent can write to.
-{% endhint %}
-{% hint style="warning" %}
+!!! warning
 For MediaManager to successfully import torrents, you must add the subdirectory to the `misc.torrent_directory` variable.
-{% endhint %}

View File

@@ -6,23 +6,20 @@ In order to run it on a prefixed path, like `maxdorninger.github.io/media`, the
 In short, clone the repository, then run:
-{% code title="Build Docker image" %}
-```none
+```none title="Build Docker image"
 docker build \
   --build-arg BASE_PATH=/media \
   --build-arg VERSION=my-custom-version \
   -t MediaManager:my-custom-version \
   -f Dockerfile .
 ```
-{% endcode %}
 You also need to set the `BASE_PATH` environment variable at runtime in `docker-compose.yaml`:
 * `BASE_PATH`\
   Base path prefix MediaManager is served under. Example: `/media`. This must match the `BASE_PATH` build arg.
-{% code title="docker-compose.yaml (excerpt)" %}
-```yaml
+```yaml title="docker-compose.yaml (excerpt)"
 services:
   mediamanager:
     image: MediaManager:my-custom-version
@@ -32,10 +29,8 @@ services:
       BASE_PATH: /media
 ...
 ```
-{% endcode %}
-{% hint style="info" %}
+!!! info
 Make sure to include the base path in the `frontend_url` field in the config file. See [Backend](../configuration/backend.md).
-{% endhint %}
 Finally, ensure that whatever reverse proxy you're using leaves the incoming path unchanged; that is, you should not strip the `/media` from `/media/web/`.

View File

@@ -1,8 +1,7 @@
 # API Reference
-{% hint style="info" %}
+!!! info
 Media Manager's backend is built with FastAPI, which automatically generates interactive API documentation.
-{% endhint %}
 * Swagger UI (typically available at `http://localhost:8000/docs`)
 * ReDoc (typically available at `http://localhost:8000/redoc`)

View File

(22 binary image diffs; for each file, the before and after width, height, and size are identical.)
docs/assets/favicon.ico (new binary file, 15 KiB; binary file not shown)

(new file, 108 KiB; diff suppressed because one or more lines are too long)

docs/assets/logo.svg (new file, 1 line, 110 KiB; diff suppressed because one or more lines are too long)

View File

@@ -6,9 +6,8 @@ Frontend settings are configured through environment variables in your `docker-c
 ## Configuration File Location
-{% hint style="warning" %}
+!!! warning
 Note that MediaManager may need to be restarted for changes in the config file to take effect.
-{% endhint %}
 Your `config.toml` file should be in the directory that's mounted to `/app/config/config.toml` inside the container:
@@ -66,6 +65,5 @@ MEDIAMANAGER_AUTH__OPENID_CONNECT__CLIENT_SECRET = "your_client_secret_from_prov
 So for every config "level", you basically have to take the name of the value and prepend it with the section names in uppercase with 2 underscores as delimiters and `MEDIAMANAGER_` as the prefix.
-{% hint style="warning" %}
+!!! warning
 Note that not every env variable starts with `MEDIAMANAGER_`; this prefix only applies to env variables which replace/overwrite values in the config file. Variables like the `CONFIG_DIR` env variable must not be prefixed.
-{% endhint %}

View File

@@ -20,13 +20,11 @@ All authentication settings are configured in the `[auth]` section of your `conf
 * `email_password_resets`\
   Enables password resets via email. Default is `false`.
-{% hint style="info" %}
+!!! info
 To use email password resets, you must also configure SMTP settings in the `[notifications.smtp_config]` section.
-{% endhint %}
-{% hint style="info" %}
+!!! info
 When setting up MediaManager for the first time, you should add your email to `admin_emails` in the `[auth]` config section. MediaManager will then use this email instead of the default admin email. Your account will automatically be created as an admin account, allowing you to manage other users, media and settings.
-{% endhint %}
 ## OpenID Connect Settings (`[auth.openid_connect]`)
@@ -53,22 +51,20 @@ The OpenID server will likely require a redirect URI. This URL will usually look
 {MEDIAMANAGER_URL}/api/v1/auth/oauth/callback
 ```
-{% hint style="warning" %}
+!!! warning
 It is very important that you set the correct callback URI, otherwise it won't work!
-{% endhint %}
 #### Authentik Example
 Here is an example configuration for the OpenID Connect provider for Authentik.
-![authentik-redirect-url-example](<../.gitbook/assets/authentik redirect url example.png>)
+![authentik-redirect-url-example](<../assets/assets/authentik redirect url example.png>)
 ## Example Configuration
 Here's a complete example of the authentication section in your `config.toml`:
-{% code title="config.toml" %}
-```toml
+```toml title="config.toml"
 [auth]
 token_secret = "a1b2c3d4e5f6g7h8i9j0k1l2m3n4o5p6q7r8s9t0u1v2w3x4y5z6"
 session_lifetime = 604800 # 1 week
@@ -82,4 +78,4 @@ client_secret = "your-secret-key-here"
 configuration_endpoint = "https://auth.example.com/.well-known/openid-configuration"
 name = "Authentik"
 ```
-{% endcode %}

View File

@@ -26,8 +26,7 @@ description: >-
 Here's a complete example of the general settings section in your `config.toml`:
-{% code title="config.toml" %}
-```toml
+```toml title="config.toml"
 [misc]
 # REQUIRED: Change this to match your actual frontend domain.
@@ -38,8 +37,6 @@ cors_urls = ["http://localhost:8000"]
 # Optional: Development mode (set to true for debugging)
 development = false
 ```
-{% endcode %}
-{% hint style="info" %}
+!!! info
 The `frontend_url` is the most important setting to configure correctly. Make sure it matches your actual deployment URLs.
-{% endhint %}

View File

@@ -6,9 +6,8 @@ MediaManager supports custom libraries, allowing you to add multiple folders for
 Custom libraries are configured in the `misc` section in the `config.toml` file. You can add as many libraries as you need.
-{% hint style="info" %}
+!!! info
 You are not limited to `/data/tv` or `/data/movies`, you can choose the entire path freely!
-{% endhint %}
 ### Movie Libraries
### Movie Libraries ### Movie Libraries

View File

@@ -19,8 +19,7 @@ Database settings are configured in the `[database]` section of your `config.tom
 Here's a complete example of the database section in your `config.toml`:
-{% code title="config.toml" %}
-```toml
+```toml title="config.toml"
 [database]
 host = "db"
 port = 5432
@@ -28,8 +27,6 @@ user = "MediaManager"
 password = "your_secure_password"
 dbname = "MediaManager"
 ```
-{% endcode %}
-{% hint style="info" %}
+!!! info
 In docker-compose deployments the container name is simultaneously its hostname, so you can use "db" or "postgres" as host.
-{% endhint %}

View File

@@ -19,9 +19,8 @@ qBittorrent is a popular BitTorrent client that MediaManager can integrate with
 ## Transmission Settings (`[torrents.transmission]`)
-{% hint style="info" %}
+!!! info
 The downloads path in Transmission and MediaManager must be the same, i.e. the path `/data/torrents` must link to the same volume for both containers.
-{% endhint %}
 Transmission is a BitTorrent client that MediaManager can integrate with for downloading torrents.
@@ -59,8 +58,7 @@ SABnzbd is a Usenet newsreader that MediaManager can integrate with for download
 Here's a complete example of the download clients section in your `config.toml`:
-{% code title="config.toml" %}
-```toml
+```toml title="config.toml"
 [torrents]
 # qBittorrent configuration
 [torrents.qbittorrent]
@@ -87,14 +85,12 @@ Here's a complete example of the download clients section in your `config.toml`:
 port = 8080
 api_key = "your_sabnzbd_api_key"
 ```
-{% endcode %}
 ## Docker Compose Integration
 When using Docker Compose, make sure your download clients are accessible from the MediaManager backend:
-{% code title="docker-compose.yml" %}
-```yaml
+```yaml title="docker-compose.yml"
 services:
   # MediaManager backend
   backend:
@@ -121,12 +117,9 @@ services:
       - ./data/usenet:/downloads
   # ... other configuration ...
 ```
-{% endcode %}
-{% hint style="warning" %}
+!!! warning
 You should enable only one BitTorrent and only one Usenet Download Client at any time.
-{% endhint %}
-{% hint style="info" %}
+!!! info
 Make sure the download directories in your download clients are accessible to MediaManager for proper file management and organization.
-{% endhint %}
@@ -13,9 +13,8 @@ Indexer settings are configured in the `[indexers]` section of your `config.toml
* `timeout_seconds`\
  Timeout in seconds for requests to Prowlarr. Default is `60`.

!!! warning
    Symptoms of timeouts are typically no search results ("No torrents found!") in conjunction with logs showing read timeouts.
<details>
@@ -50,8 +49,7 @@ DEBUG - media_manager.indexer.utils -
## Example Configuration

```toml title="config.toml"
[indexers]
[indexers.prowlarr]
enabled = true
@@ -66,4 +64,4 @@ api_key = "your_jackett_api_key"
indexers = ["1337x", "rarbg"]
timeout_seconds = 60
```
@@ -57,8 +57,7 @@ Controls which emails receive notifications.
Here's a complete example of the notifications section in your `config.toml`:

```toml title="config.toml"
[notifications]
# SMTP settings for email notifications and password resets
[notifications.smtp_config]
@@ -91,8 +90,7 @@ Here's a complete example of the notifications section in your `config.toml`:
api_key = "your_pushover_api_key"
user = "your_pushover_user_key"
```

!!! info
    You can enable multiple notification methods simultaneously. For example, you could have both email and Gotify notifications enabled at the same time.
@@ -17,9 +17,8 @@ Rules define how MediaManager scores releases based on their titles or indexer f
* Reject releases that do not meet certain criteria (e.g., non-freeleech releases).
* and more.

!!! info
    The keywords and flags are compared case-insensitively.

### Title Rules
@@ -38,8 +37,7 @@ Each title rule consists of:
Examples for Title Rules

```toml title="config.toml"
[[indexers.title_scoring_rules]]
name = "prefer_h265"
keywords = ["h265", "hevc", "x265"]
@@ -52,7 +50,6 @@ keywords = ["cam", "ts"]
score_modifier = -10000
negate = false
```

* The first rule increases the score for releases containing "h265", "hevc", or "x265".
* The second rule heavily penalizes releases containing "cam" or "ts".
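The rule semantics above can be sketched in a few lines of Python (illustrative only, not MediaManager's actual implementation; the `apply_title_rule` helper and its naive substring matching are assumptions):

```python
def apply_title_rule(
    title: str, keywords: list[str], score_modifier: int, negate: bool = False
) -> int:
    """Score contribution of one title rule for a release title.

    Keywords are compared case-insensitively (naive substring match here).
    With negate=True the modifier applies only if NO keyword matches.
    """
    lowered = title.lower()
    matched = any(keyword.lower() in lowered for keyword in keywords)
    return score_modifier if matched != negate else 0

# the two example rules from the config above
print(apply_title_rule("Show.S01.1080p.x265-GRP", ["h265", "hevc", "x265"], 5))  # → 5
print(apply_title_rule("Movie.2024.CAM.x264", ["cam", "ts"], -10000))            # → -10000
print(apply_title_rule("Movie.2024.1080p.WEB", ["cam", "ts"], -10000))           # → 0
```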
@@ -76,8 +73,7 @@ Each indexer flag rule consists of:
Examples for Indexer Flag Rules

```toml title="config.toml"
[[indexers.indexer_flag_scoring_rules]]
name = "reject_non_freeleech"
flags = ["freeleech", "freeleech75"]
@@ -90,7 +86,6 @@ flags = ["nuked"]
score_modifier = -10000
negate = false
```

* The first rule penalizes releases that do not have the "freeleech" or "freeleech75" flag.
* The second rule penalizes releases that are marked as "nuked".
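Flag rules follow the same pattern; a sketch (illustrative, with a hypothetical `apply_flag_rule` helper) showing how `negate` turns a rule into a "penalize if absent" check:

```python
def apply_flag_rule(
    release_flags: list[str], rule_flags: list[str], score_modifier: int, negate: bool = False
) -> int:
    """Score contribution of one indexer-flag rule (flags compared case-insensitively)."""
    present = {flag.lower() for flag in release_flags}
    matched = any(flag.lower() in present for flag in rule_flags)
    # negate=True: apply the modifier only if none of the rule's flags are present
    return score_modifier if matched != negate else 0

# reject_non_freeleech from above: with negate, releases WITHOUT a freeleech flag are penalized
print(apply_flag_rule(["internal"], ["freeleech", "freeleech75"], -10000, negate=True))   # → -10000
print(apply_flag_rule(["Freeleech"], ["freeleech", "freeleech75"], -10000, negate=True))  # → 0
```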
@@ -99,8 +94,7 @@ If `negate` is set to `true`, the `score_modifier` is applied only if none of th
## Example

```toml title="config.toml"
[[indexers.scoring_rule_sets]]
name = "default"
libraries = ["ALL_TV", "ALL_MOVIES"]
@@ -111,7 +105,6 @@ name = "strict_quality"
libraries = ["ALL_MOVIES"]
rule_names = ["prefer_h265", "avoid_cam", "reject_non_freeleech"]
```

## Libraries
@@ -127,9 +120,8 @@ You can use special library names in your rulesets:
This allows you to set global rules for all TV or movie content, or provide fallback rules for uncategorized media.

!!! info
    You don't need to create lots of libraries with different directories; multiple libraries can share the same directory. You can set multiple (unlimited) libraries to the default directory `/data/movies` or `/data/tv` and use different rulesets with them.
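A small sketch of how ruleset-to-library resolution could work with the special names above (illustrative; `rule_sets_for_library` is a hypothetical helper, not MediaManager's actual code):

```python
def rule_sets_for_library(library: str, is_tv: bool, rule_sets: list[dict]) -> list[str]:
    """Names of rule sets that apply to a library, honoring ALL_TV / ALL_MOVIES."""
    wildcard = "ALL_TV" if is_tv else "ALL_MOVIES"
    return [
        rule_set["name"]
        for rule_set in rule_sets
        if library in rule_set["libraries"] or wildcard in rule_set["libraries"]
    ]

rule_sets = [
    {"name": "default", "libraries": ["ALL_TV", "ALL_MOVIES"]},
    {"name": "strict_quality", "libraries": ["ALL_MOVIES"]},
]
print(rule_sets_for_library("4k-movies", is_tv=False, rule_sets=rule_sets))  # → ['default', 'strict_quality']
print(rule_sets_for_library("anime", is_tv=True, rule_sets=rule_sets))       # → ['default']
```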
## Relation to Sonarr/Radarr Profiles
@@ -10,7 +10,7 @@ description: >-
* `media_manager/`: Backend FastAPI application
* `web/`: Frontend SvelteKit application
* `docs/`: Documentation (MkDocs)
* `metadata_relay/`: Metadata relay service, also FastAPI

## Special Dev Configuration
@@ -44,9 +44,8 @@ MediaManager uses various environment variables for configuration. In the Docker
* `DISABLE_FRONTEND_MOUNT`\
  When `TRUE`, disables mounting built frontend files (allows separate frontend container).

!!! info
    This is automatically set in `docker-compose.dev.yaml` to enable the separate frontend development container.

#### Configuration Files
@@ -105,10 +104,9 @@ This means when your browser makes a request to `http://localhost:5173/api/v1/tv
### Setting up the full development environment with Docker (Recommended)

This is the easiest and recommended way to get started. Everything runs in Docker with hot-reloading enabled.

### Prepare config files

Create config directory (only needed on first run) and copy example config files:
@@ -118,9 +116,9 @@ mkdir -p res/config # Only needed on first run
cp config.dev.toml res/config/config.toml
cp web/.env.example web/.env
```

### Start all services

Recommended: Use make commands for easy development
@@ -135,9 +133,9 @@ Alternative: Use docker compose directly (if make is not available)
```bash
docker compose -f docker-compose.dev.yaml up
```

### Access the application

* Frontend (with HMR): http://localhost:5173
@@ -151,12 +149,10 @@ Now you can edit code and see changes instantly:
* Edit Python files → Backend auto-reloads
* Edit Svelte/TypeScript files → Frontend HMR updates in browser
* Edit config.toml → Changes apply immediately

!!! info
    Run `make help` to see all available development commands including `make down`, `make logs`, `make app` (shell into backend), and more.

## Setting up the backend development environment (Local)
@@ -217,18 +213,17 @@ ruff check .
## Setting up the frontend development environment (Local, Optional)

Using the Docker setup above is recommended. This section is for those who prefer to run the frontend locally outside of Docker.

### Clone & change dir

1. Clone the repository
2. cd into repo root
3. cd into `web` directory
### Install Node.js (example using nvm-windows)

I used nvm-windows:
@@ -243,9 +238,9 @@ If using PowerShell you may need:
```powershell
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
```

### Create .env for frontend

```bash
@@ -253,18 +248,18 @@ cp .env.example .env
```

Update `PUBLIC_API_URL` if your backend is not at `http://localhost:8000`

### Install dependencies and run dev server
```bash
npm install
npm run dev
```

### Format & lint

* Format:
@@ -278,12 +273,10 @@ npm run format
```bash
npm run lint
```

!!! info
    If running frontend locally, make sure to add `http://localhost:5173` to the `cors_urls` in your backend config file.

## Troubleshooting
@@ -1,11 +1,14 @@
# Documentation

MediaManager uses [MkDocs](https://www.mkdocs.org/) with the [Material for MkDocs](https://squidfunk.github.io/mkdocs-material/) theme for documentation.
The files for the documentation are in the `/docs` directory.
To preview the documentation locally, you need MkDocs or Docker installed.

## How to preview the documentation locally with Docker

1. Run the mkdocs container in `docker-compose.dev.yaml`
2. Open `http://127.0.0.1:9000/` in your browser.
docs/custom.css Normal file
@@ -0,0 +1,22 @@
/*.md-header__button.md-logo {*/
/* margin-top: 0;*/
/* margin-bottom: 0;*/
/* padding-top: 0;*/
/* padding-bottom: 0;*/
/*}*/
/*.md-header__button.md-logo img,*/
/*.md-header__button.md-logo svg {*/
/* height: 70%;*/
/* width: 70%;*/
/*}*/
/* Increase logo size */
.md-header__button.md-logo svg, .md-header__button.md-logo img {
height: 2.5rem; /* Increase height (default is usually ~1.2rem to 1.5rem) */
width: auto;
}
/* Adjust header height if necessary to fit the larger logo */
.md-header {
height: 4rem; /* Match or exceed your new logo height */
}
@@ -23,9 +23,8 @@ Here is an example, using these rules:
If your folder structure is in the correct format, you can start importing. To do this, log in as an administrator and go to the TV/movie dashboard.

!!! info
    After importing, MediaManager will automatically prefix the old root TV show/movie folders with a dot to mark them as "imported".

So after importing, the directory would look like this (using the above directory structure):
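The marking step can be sketched as a dot-prefix rename (a sketch with a hypothetical `mark_imported` helper and throwaway paths, not MediaManager's actual code):

```python
import tempfile
from pathlib import Path

def mark_imported(root_folder: Path) -> Path:
    """Prefix a root show/movie folder with a dot to mark it as imported."""
    target = root_folder.with_name("." + root_folder.name)
    root_folder.rename(target)
    return target

# usage sketch against a throwaway directory
base = Path(tempfile.mkdtemp())
show_dir = base / "My Show (2020)"
show_dir.mkdir()
marked = mark_imported(show_dir)
print(marked.name)  # → .My Show (2020)
```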
docs/index.md Normal file
@@ -0,0 +1,2 @@
--8<-- "README.md"
@@ -2,4 +2,5 @@
The recommended way to install and run Media Manager is using Docker and Docker Compose. Other installation methods are not officially supported, but listed here for convenience.
[Docker Compose (recommended)](docker.md){ .md-button .md-button--primary }
[Nix Flakes [Community]](flakes.md){ .md-button }
@@ -9,8 +9,8 @@
Follow these steps to get MediaManager running with Docker Compose:

#### Get the docker-compose file

Download the `docker-compose.yaml` from the MediaManager repo:
@@ -18,9 +18,9 @@ Download the `docker-compose.yaml` from the MediaManager repo:
```bash
wget -O docker-compose.yaml https://github.com/maxdorninger/MediaManager/releases/latest/download/docker-compose.yaml
```

#### Prepare configuration directory and example config

Create a config directory and download the example configuration:
@@ -29,15 +29,15 @@ Create a config directory and download the example configuration:
mkdir config
wget -O ./config/config.toml https://github.com/maxdorninger/MediaManager/releases/latest/download/config.example.toml
```

#### Edit configuration

You probably need to edit the `config.toml` file in the `./config` directory to suit your environment and preferences. [How to configure MediaManager.](../configuration/README.md)
#### Start MediaManager

Bring up the stack:
@@ -45,16 +45,15 @@ Bring up the stack:
```bash
docker compose up -d
```

* Upon first run, MediaManager will create a default `config.toml` file in the `./config` directory (if not already present).
* Upon first run, MediaManager will also create a default admin user. The credentials of the default admin user will be printed in the logs of the container — it's recommended to change the password of this user after the first login.
* [For more information on the available configuration options, see the Configuration section of the documentation.](../configuration/README.md)

!!! info
    When setting up MediaManager for the first time, you should add your email to `admin_emails` in the `[auth]` config section. MediaManager will then use this email instead of the default admin email. Your account will automatically be created as an admin account, allowing you to manage other users, media, and settings.
## Docker Images
@@ -70,9 +69,8 @@ MetadataRelay images are also available on both registries:
From v1.12.1 onwards, both MediaManager and MetadataRelay images are available on both Quay.io and GHCR. The reason for the switch to Quay.io as the primary image registry is due to [GHCR's continued slow performance.](https://github.com/orgs/community/discussions/173607)

!!! info
    You can use either the Quay.io or GHCR images interchangeably, as they are built from the same source and the tags are the same across both registries.
### Tags
@@ -1,11 +1,9 @@
# Nix Flakes

!!! note
    This is a community contribution and not officially supported by the MediaManager team, but included here for convenience.

*Please report issues with this method at the [corresponding GitHub repository](https://github.com/strangeglyph/mediamanager-nix).*
## Prerequisites
@@ -64,12 +62,11 @@ The host and port that MediaManager listens on can be set using `services.media-
To configure MediaManager, use `services.media-manager.settings`, which follows the same structure as the MediaManager
`config.toml`. To provision secrets, set `services.media-manager.environmentFile` to a protected file, for example one
provided by [agenix](https://github.com/ryantm/agenix) or [sops-nix](https://github.com/Mic92/sops-nix).
See [Configuration](../configuration/README.md#configuring-secrets) for guidance on using environment variables.

!!! warning
    Do not place secrets in the nix store, as it is world-readable.
## Automatic Postgres Setup
@@ -1,7 +1,6 @@
# Screenshots

!!! info
    MediaManager also supports dark mode!

![screenshot-dashboard.png](<assets/assets/screenshot dashboard.png>) ![screenshot-tv-dashboard.png](<assets/assets/screenshot tv dashboard.png>) ![screenshot-download-season.png](<assets/assets/screenshot download season.png>) ![screenshot-request-season.png](<assets/assets/screenshot request season.png>) ![screenshot-tv-torrents.png](<assets/assets/screenshot tv torrents.png>) ![screenshot-settings.png](<assets/assets/screenshot settings.png>) ![screenshot-login.png](<assets/assets/screenshot login.png>)
@@ -1,8 +1,7 @@
# Troubleshooting

!!! info
    Always check the container and browser logs for more specific error messages.
<details>
@@ -60,10 +59,9 @@ Switch to advanced tabTry switching to the advanced tab when searching for torre
#### Possible Fixes:

* [Unable to pull image from GitHub Container Registry (Stack Overflow)](https://stackoverflow.com/questions/74656167/unable-to-pull-image-from-github-container-registry-ghcr)
* [Try pulling the image from Quay.io](installation/docker.md#docker-images)

</details>

!!! info
    If it still doesn't work, [please open an Issue.](https://github.com/maxdorninger/MediaManager/issues) It is possible that a bug is causing the issue.
@@ -1,133 +0,0 @@
# Usage
If you are coming from Radarr or Sonarr you will find that MediaManager does things a bit differently. Instead of completely automatically downloading and managing your media, MediaManager focuses on providing an easy-to-use interface to guide you through the process of finding and downloading media. Advanced features like multiple qualities of a show/movie necessitate such a paradigm shift. So here is a quick step-by-step guide to get you started:
#### Downloading/Requesting a show
{% stepper %}
{% step %}
### Add the show
Add a show on the "Add Show" page. After adding the show you will be redirected to the show's page.
{% endstep %}
{% step %}
### Request season(s)
Click the "Request Season" button on the show's page. Select one or more seasons that you want to download.
{% endstep %}
{% step %}
### Select qualities
Select the "Min Quality" — the minimum resolution of the content to download.\
Select the "Wanted Quality" — the **maximum** resolution of the content to download.
{% endstep %}
{% step %}
### Submit request
Click "Submit request". This is not the last step: an administrator must first approve your request for download. Only after approval will the requested content be downloaded.
{% endstep %}
{% step %}
### Finished
Congratulations! You've downloaded a show (after admin approval).
{% endstep %}
{% endstepper %}
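The Min/Wanted pair above behaves as an inclusive resolution range; a sketch with hypothetical quality labels (not MediaManager's actual code):

```python
QUALITY_ORDER = ["480p", "720p", "1080p", "2160p"]  # hypothetical low-to-high ordering

def in_quality_range(quality: str, min_quality: str, wanted_quality: str) -> bool:
    """True if quality lies between Min Quality and Wanted Quality (both inclusive)."""
    rank = QUALITY_ORDER.index
    return rank(min_quality) <= rank(quality) <= rank(wanted_quality)

accepted = [q for q in QUALITY_ORDER if in_quality_range(q, "720p", "1080p")]
print(accepted)  # → ['720p', '1080p']
```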
#### Requesting a show (as an admin)
{% stepper %}
{% step %}
### Add the show
Add a show on the "Add Show" page. After adding the show you will be redirected to the show's page.
{% endstep %}
{% step %}
### Request season(s)
Click the "Request Season" button on the show's page. Select one or more seasons that you want to download.
{% endstep %}
{% step %}
### Select qualities
Select the "Min Quality" — the minimum resolution of the content to download.\
Select the "Wanted Quality" — the **maximum** resolution of the content to download.
{% endstep %}
{% step %}
### Submit request (auto-approved)
Click "Submit request". As an admin, your request will be automatically approved.
{% endstep %}
{% step %}
### Finished
Congratulations! You've downloaded a show.
{% endstep %}
{% endstepper %}
#### Downloading a show (admin-only)
You can only directly download a show if you are an admin!
{% stepper %}
{% step %}
### Go to the show's page
Open the show's page that contains the season you wish to download.
{% endstep %}
{% step %}
### Start download
Click the "Download Season" button.
{% endstep %}
{% step %}
### Enter season number
Enter the season number that you want to download.
{% endstep %}
{% step %}
### Optional file path suffix
Optionally select the "File Path Suffix". Note: **it needs to be unique per season per show!**
{% endstep %}
{% step %}
### Choose torrent and download
Click "Download" on the torrent that you want to download.
{% endstep %}
{% step %}
### Finished
Congratulations! You've downloaded a show.
{% endstep %}
{% endstepper %}
#### Managing requests
Users need their requests to be approved by an admin. To manage requests:
{% stepper %}
{% step %}
### Open Requests page
Go to the "Requests" page.
{% endstep %}
{% step %}
### Approve, delete or modify
From the Requests page you can approve, delete, or modify a user's request.
{% endstep %}
{% endstepper %}

View File

@@ -1,7 +1,9 @@
import concurrent
import concurrent.futures
import logging
import xml.etree.ElementTree as ET
from concurrent.futures.thread import ThreadPoolExecutor
from dataclasses import dataclass

import requests
@@ -15,6 +17,21 @@ from media_manager.tv.schemas import Show
log = logging.getLogger(__name__)


@dataclass
class IndexerInfo:
    supports_tv_search: bool
    supports_tv_search_tmdb: bool
    supports_tv_search_imdb: bool
    supports_tv_search_tvdb: bool
    supports_tv_search_season: bool
    supports_tv_search_episode: bool
    supports_movie_search: bool
    supports_movie_search_tmdb: bool
    supports_movie_search_imdb: bool
    supports_movie_search_tvdb: bool


class Jackett(GenericIndexer, TorznabMixin):
    def __init__(self) -> None:
        """
@@ -31,11 +48,16 @@ class Jackett(GenericIndexer, TorznabMixin):
    def search(self, query: str, is_tv: bool) -> list[IndexerQueryResult]:
        log.debug("Searching for " + query)
        params = {"q": query, "t": "tvsearch" if is_tv else "movie"}
        return self.__search_jackett(params)

    def __search_jackett(self, params: dict) -> list[IndexerQueryResult]:
        futures = []
        with ThreadPoolExecutor() as executor, requests.Session() as session:
            for indexer in self.indexers:
                future = executor.submit(
                    self.get_torrents_by_indexer, indexer, params, session
                )
                futures.append(future)
futures.append(future) futures.append(future)
@@ -51,14 +73,103 @@ class Jackett(GenericIndexer, TorznabMixin):
return responses return responses
def get_torrents_by_indexer( def __get_search_capabilities(
self, indexer: str, query: str, is_tv: bool, session: requests.Session self, indexer: str, session: requests.Session
) -> list[IndexerQueryResult]: ) -> IndexerInfo:
url = ( url = (
self.url self.url
+ f"/api/v2.0/indexers/{indexer}/results/torznab/api?apikey={self.api_key}&t={'tvsearch' if is_tv else 'movie'}&q={query}" + f"/api/v2.0/indexers/{indexer}/results/torznab/api?apikey={self.api_key}&t=caps"
) )
response = session.get(url, timeout=self.timeout_seconds) response = session.get(url, timeout=self.timeout_seconds)
if response.status_code != 200:
msg = f"Cannot get search capabilities for Indexer {indexer}"
log.error(msg)
raise RuntimeError(msg)
xml = response.text
xml_tree = ET.fromstring(xml) # noqa: S314 # trusted source, since it is user controlled
tv_search = xml_tree.find("./*/tv-search")
movie_search = xml_tree.find("./*/movie-search")
log.debug(tv_search.attrib)
log.debug(movie_search.attrib)
tv_search_capabilities = []
movie_search_capabilities = []
tv_search_available = (tv_search is not None) and (
tv_search.attrib["available"] == "yes"
)
movie_search_available = (movie_search is not None) and (
movie_search.attrib["available"] == "yes"
)
if tv_search_available:
tv_search_capabilities = tv_search.attrib["supportedParams"].split(",")
if movie_search_available:
movie_search_capabilities = movie_search.attrib["supportedParams"].split(
","
)
return IndexerInfo(
supports_tv_search=tv_search_available,
supports_tv_search_imdb="tmdbid" in tv_search_capabilities,
supports_tv_search_tmdb="tmdbid" in tv_search_capabilities,
supports_tv_search_tvdb="tvdbid" in tv_search_capabilities,
supports_tv_search_season="season" in tv_search_capabilities,
supports_tv_search_episode="ep" in tv_search_capabilities,
supports_movie_search=movie_search_available,
supports_movie_search_imdb="imdbid" in movie_search_capabilities,
supports_movie_search_tmdb="tmdbid" in movie_search_capabilities,
supports_movie_search_tvdb="tvdbid" in movie_search_capabilities,
)
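The `t=caps` payload parsed above is a small XML document; the same extraction against a sample response (the XML below is illustrative, shaped after the Torznab caps format):

```python
import xml.etree.ElementTree as ET

CAPS_XML = """\
<caps>
  <searching>
    <tv-search available="yes" supportedParams="q,season,ep,tvdbid"/>
    <movie-search available="no" supportedParams=""/>
  </searching>
</caps>
"""

tree = ET.fromstring(CAPS_XML)
tv_search = tree.find("./*/tv-search")
movie_search = tree.find("./*/movie-search")

tv_available = tv_search is not None and tv_search.attrib["available"] == "yes"
movie_available = movie_search is not None and movie_search.attrib["available"] == "yes"
tv_params = tv_search.attrib["supportedParams"].split(",") if tv_available else []

print(tv_available, movie_available, tv_params)  # → True False ['q', 'season', 'ep', 'tvdbid']
```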
def __get_optimal_query_parameters(
self, indexer: str, session: requests.Session, params: dict
) -> dict[str, str]:
query_params = {"apikey": self.api_key, "t": params["t"]}
search_capabilities = self.__get_search_capabilities(
indexer=indexer, session=session
)
if params["t"] == "tvsearch":
if not search_capabilities.supports_tv_search:
msg = f"Indexer {indexer} does not support TV search"
raise RuntimeError(msg)
if search_capabilities.supports_tv_search_season and "season" in params:
query_params["season"] = params["season"]
if search_capabilities.supports_tv_search_episode and "ep" in params:
query_params["ep"] = params["ep"]
if search_capabilities.supports_tv_search_imdb and "imdbid" in params:
query_params["imdbid"] = params["imdbid"]
elif search_capabilities.supports_tv_search_tvdb and "tvdbid" in params:
query_params["tvdbid"] = params["tvdbid"]
elif search_capabilities.supports_tv_search_tmdb and "tmdbid" in params:
query_params["tmdbid"] = params["tmdbid"]
else:
query_params["q"] = params["q"]
if params["t"] == "movie":
if not search_capabilities.supports_movie_search:
msg = f"Indexer {indexer} does not support Movie search"
raise RuntimeError(msg)
if search_capabilities.supports_movie_search_imdb and "imdbid" in params:
query_params["imdbid"] = params["imdbid"]
elif search_capabilities.supports_movie_search_tvdb and "tvdbid" in params:
query_params["tvdbid"] = params["tvdbid"]
elif search_capabilities.supports_movie_search_tmdb and "tmdbid" in params:
query_params["tmdbid"] = params["tmdbid"]
else:
query_params["q"] = params["q"]
return query_params
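The branching above encodes a preference order for ID-based lookups over free-text queries. A condensed sketch of that fallback chain (hypothetical helper, not project code):

```python
def pick_id_param(params: dict, supports: dict) -> dict:
    # Preference order mirrored from the method above:
    # imdbid first, then tvdbid, then tmdbid, else the free-text q query.
    for key in ("imdbid", "tvdbid", "tmdbid"):
        if supports.get(key) and key in params:
            return {key: params[key]}
    return {"q": params["q"]}

print(pick_id_param({"q": "The Wire", "tvdbid": "79126"}, {"tvdbid": True}))  # {'tvdbid': '79126'}
print(pick_id_param({"q": "The Wire"}, {}))  # {'q': 'The Wire'}
```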
def get_torrents_by_indexer(
self, indexer: str, params: dict, session: requests.Session
) -> list[IndexerQueryResult]:
url = f"{self.url}/api/v2.0/indexers/{indexer}/results/torznab/api"
query_params = self.__get_optimal_query_parameters(
indexer=indexer, session=session, params=params
)
response = session.get(url, timeout=self.timeout_seconds, params=query_params)
log.debug(f"Indexer {indexer} url: {response.url}")
if response.status_code != 200:
log.error(
@@ -75,8 +186,23 @@ class Jackett(GenericIndexer, TorznabMixin):
self, query: str, show: Show, season_number: int
) -> list[IndexerQueryResult]:
log.debug(f"Searching for season {season_number} of show {show.name}")
params = {
"t": "tvsearch",
"season": season_number,
"q": query,
}
if show.imdb_id:
params["imdbid"] = show.imdb_id
params[show.metadata_provider + "id"] = show.external_id
return self.__search_jackett(params=params)
def search_movie(self, query: str, movie: Movie) -> list[IndexerQueryResult]:
log.debug(f"Searching for movie {movie.name}")
params = {
"t": "movie",
"q": query,
}
if movie.imdb_id:
params["imdbid"] = movie.imdb_id
params[movie.metadata_provider + "id"] = movie.external_id
return self.__search_jackett(params=params)


@@ -64,16 +64,12 @@ class TorznabMixin:
title = item.find("title").text
size_str = item.find("size")
if size_str is None or size_str.text is None:
log.warning(f"Torznab item {title} has no size, skipping.")
continue
try:
size = int(size_str.text or "0")
except ValueError:
log.warning(f"Torznab item {title} has invalid size, skipping.")
continue
result = IndexerQueryResult(


@@ -18,6 +18,7 @@ class IndexerQueryResult(Base):
flags = mapped_column(ARRAY(String))
quality: Mapped[Quality]
season = mapped_column(ARRAY(Integer))
episode = mapped_column(ARRAY(Integer))
size = mapped_column(BigInteger)
usenet: Mapped[bool]
age: Mapped[int]


@@ -35,10 +35,10 @@ class IndexerQueryResult(BaseModel):
@computed_field
@property
def quality(self) -> Quality:
high_quality_pattern = r"\b(4k|2160p|uhd)\b"
medium_quality_pattern = r"\b(1080p|full[ ._-]?hd)\b"
low_quality_pattern = r"\b(720p|(?<!full[ ._-])hd(?![a-z]))\b"
very_low_quality_pattern = r"\b(480p|360p|sd)\b"
if re.search(high_quality_pattern, self.title, re.IGNORECASE):
return Quality.uhd
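The widened patterns can be exercised in isolation. A small sketch, with the `Quality` tiers reduced to plain strings for illustration, checks them in the same highest-first order:

```python
import re

# Same patterns as the property above; tier names are stand-ins for Quality members.
QUALITY_PATTERNS = [
    ("uhd", r"\b(4k|2160p|uhd)\b"),
    ("fullhd", r"\b(1080p|full[ ._-]?hd)\b"),
    ("hd", r"\b(720p|(?<!full[ ._-])hd(?![a-z]))\b"),
    ("sd", r"\b(480p|360p|sd)\b"),
]

def classify(title: str) -> str:
    for name, pattern in QUALITY_PATTERNS:
        if re.search(pattern, title, re.IGNORECASE):
            return name
    return "unknown"

print(classify("Movie.2160p.WEB"))    # uhd
print(classify("Movie FullHD x264"))  # fullhd
print(classify("Movie HD rip"))       # hd
```

Note the negative lookbehind in the `hd` pattern: it stops a bare "HD" match from firing inside "Full HD", which would otherwise be misclassified one tier too low.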
@@ -54,14 +54,55 @@ class IndexerQueryResult(BaseModel):
@computed_field
@property
def season(self) -> list[int]:
title = self.title.lower()
# 1) S01E01 / S1E2
m = re.search(r"s(\d{1,2})e\d{1,3}", title)
if m:
return [int(m.group(1))]
# 2) Range S01-S03 / S1-S3
m = re.search(r"s(\d{1,2})\s*(?:-|\u2013)\s*s?(\d{1,2})", title)
if m:
start, end = int(m.group(1)), int(m.group(2))
if start <= end:
return list(range(start, end + 1))
return []
# 3) Pack S01 / S1
m = re.search(r"\bs(\d{1,2})\b", title)
if m:
return [int(m.group(1))]
# 4) Season 01 / Season 1
m = re.search(r"\bseason\s*(\d{1,2})\b", title)
if m:
return [int(m.group(1))]
return []
@computed_field(return_type=list[int])
@property
def episode(self) -> list[int]:
title = self.title.lower()
result: list[int] = []
pattern = r"s\d{1,2}e(\d{1,3})(?:\s*-\s*(?:s?\d{1,2}e)?(\d{1,3}))?"
match = re.search(pattern, title)
if not match:
return result
start = int(match.group(1))
end = match.group(2)
if end:
end = int(end)
if end >= start:
result = list(range(start, end + 1))
else:
result = [start]
return result
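The ordered regex chain above can be sketched as two standalone helpers (illustrative reimplementation, not project code) and exercised against sample titles:

```python
import re

def parse_season(title: str) -> list[int]:
    # Ordered checks mirrored from the season property above.
    t = title.lower()
    m = re.search(r"s(\d{1,2})e\d{1,3}", t)  # 1) S01E01 / S1E2
    if m:
        return [int(m.group(1))]
    m = re.search(r"s(\d{1,2})\s*(?:-|\u2013)\s*s?(\d{1,2})", t)  # 2) S01-S03
    if m:
        start, end = int(m.group(1)), int(m.group(2))
        return list(range(start, end + 1)) if start <= end else []
    m = re.search(r"\bs(\d{1,2})\b", t)  # 3) pack S01
    if m:
        return [int(m.group(1))]
    m = re.search(r"\bseason\s*(\d{1,2})\b", t)  # 4) Season 01
    return [int(m.group(1))] if m else []

def parse_episodes(title: str) -> list[int]:
    # Same pattern as the episode property: single episode or E01-03 style range.
    m = re.search(r"s\d{1,2}e(\d{1,3})(?:\s*-\s*(?:s?\d{1,2}e)?(\d{1,3}))?", title.lower())
    if not m:
        return []
    start = int(m.group(1))
    if m.group(2):
        end = int(m.group(2))
        return list(range(start, end + 1)) if end >= start else []
    return [start]

print(parse_season("Show.Name.S02E05.1080p"))  # [2]
print(parse_season("Show Name S01-S03"))       # [1, 2, 3]
print(parse_season("Show Name Season 4"))      # [4]
print(parse_episodes("Show.Name.S01E01-03"))   # [1, 2, 3]
```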
def __gt__(self, other: "IndexerQueryResult") -> bool:


@@ -1,4 +1,5 @@
import logging
import re
from urllib.parse import urljoin
import requests
@@ -23,7 +24,11 @@ def evaluate_indexer_query_result(
log.debug(f"Applying rule {rule.name} to {query_result.title}")
if (
any(
re.search(
rf"\b{re.escape(keyword)}\b",
query_result.title,
re.IGNORECASE,
)
for keyword in rule.keywords
)
and not rule.negate
@@ -34,7 +39,11 @@ def evaluate_indexer_query_result(
query_result.score += rule.score_modifier
elif (
not any(
re.search(
rf"\b{re.escape(keyword)}\b",
query_result.title,
re.IGNORECASE,
)
for keyword in rule.keywords
)
and rule.negate
@@ -155,5 +164,3 @@ def follow_redirects_to_final_torrent_url(
)
msg = "An error occurred during the request"
raise RuntimeError(msg) from e
return current_url
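The switch from a plain substring check to `re.search` with `\b` word boundaries keeps keyword rules from firing inside longer tokens. A quick comparison with illustrative titles:

```python
import re

def keyword_hit(keyword: str, title: str) -> bool:
    # Word-bounded, case-insensitive match, as in the rule evaluation above.
    return bool(re.search(rf"\b{re.escape(keyword)}\b", title, re.IGNORECASE))

# Substring matching fires inside unrelated words; the bounded regex does not.
print("cam" in "Sitcam.Show.1080p".lower())     # True  (false positive)
print(keyword_hit("cam", "Sitcam.Show.1080p"))  # False
print(keyword_hit("cam", "Movie.CAM.720p"))     # True
```

`re.escape` also matters here: rule keywords may contain regex metacharacters (e.g. `5.1`), which would otherwise change the pattern's meaning.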


@@ -83,3 +83,4 @@ def setup_logging() -> None:
logging.getLogger("transmission_rpc").setLevel(logging.WARNING)
logging.getLogger("qbittorrentapi").setLevel(logging.WARNING)
logging.getLogger("sabnzbd_api").setLevel(logging.WARNING)
logging.getLogger("taskiq").setLevel(logging.WARNING)


@@ -1,5 +1,8 @@
import asyncio
import logging
import os
from collections.abc import AsyncGenerator
from contextlib import asynccontextmanager
import uvicorn
from asgi_correlation_id import CorrelationIdMiddleware
@@ -9,6 +12,8 @@ from fastapi.staticfiles import StaticFiles
from psycopg.errors import UniqueViolation
from sqlalchemy.exc import IntegrityError
from starlette.responses import FileResponse, RedirectResponse
from taskiq.receiver import Receiver
from taskiq_fastapi import populate_dependency_context
from uvicorn.middleware.proxy_headers import ProxyHeadersMiddleware
import media_manager.movies.router as movies_router
@@ -28,6 +33,7 @@ from media_manager.auth.users import (
fastapi_users,
)
from media_manager.config import MediaManagerConfig
from media_manager.database import init_engine
from media_manager.exceptions import (
ConflictError,
InvalidConfigError,
@@ -42,18 +48,24 @@ from media_manager.exceptions import (
from media_manager.filesystem_checks import run_filesystem_checks
from media_manager.logging import LOGGING_CONFIG, setup_logging
from media_manager.notification.router import router as notification_router
from media_manager.scheduler import (
broker,
build_scheduler_loop,
import_all_movie_torrents_task,
import_all_show_torrents_task,
update_all_movies_metadata_task,
update_all_non_ended_shows_metadata_task,
)
setup_logging()
config = MediaManagerConfig()
log = logging.getLogger(__name__)
if config.misc.development:
log.warning("Development Mode activated!")
scheduler = setup_scheduler(config)
run_filesystem_checks(config, log)
BASE_PATH = os.getenv("BASE_PATH", "")
@@ -62,7 +74,57 @@ DISABLE_FRONTEND_MOUNT = os.getenv("DISABLE_FRONTEND_MOUNT", "").lower() == "tru
FRONTEND_FOLLOW_SYMLINKS = os.getenv("FRONTEND_FOLLOW_SYMLINKS", "").lower() == "true"
log.info("Hello World!")
app = FastAPI(root_path=BASE_PATH)
@asynccontextmanager
async def lifespan(app: FastAPI) -> AsyncGenerator:
init_engine(config.database)
broker_started = False
started_sources: list = []
finish_event: asyncio.Event | None = None
receiver_task: asyncio.Task | None = None
loop_task: asyncio.Task | None = None
try:
if not broker.is_worker_process:
await broker.startup()
broker_started = True
populate_dependency_context(broker, app)
scheduler_loop = build_scheduler_loop()
for source in scheduler_loop.scheduler.sources:
await source.startup()
started_sources.append(source)
finish_event = asyncio.Event()
receiver = Receiver(broker, run_startup=False, max_async_tasks=10)
receiver_task = asyncio.create_task(receiver.listen(finish_event))
loop_task = asyncio.create_task(scheduler_loop.run(skip_first_run=True))
try:
await asyncio.gather(
import_all_movie_torrents_task.kiq(),
import_all_show_torrents_task.kiq(),
update_all_movies_metadata_task.kiq(),
update_all_non_ended_shows_metadata_task.kiq(),
)
except Exception:
log.exception("Failed to submit initial background tasks during startup.")
raise
yield
finally:
if loop_task is not None:
loop_task.cancel()
try:
await loop_task
except asyncio.CancelledError:
pass
if finish_event is not None and receiver_task is not None:
finish_event.set()
await receiver_task
for source in started_sources:
await source.shutdown()
if broker_started:
await broker.shutdown()
app = FastAPI(root_path=BASE_PATH, lifespan=lifespan)
app.add_middleware(ProxyHeadersMiddleware, trusted_hosts="*")
origins = config.misc.cors_urls
log.info(f"CORS URLs activated for following origins: {origins}")
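The lifespan above layers broker, receiver, scheduler loop and initial task submission, and the ordering matters: everything starts before `yield` and is torn down in reverse after it. A stripped-down sketch of that pattern, with a stand-in background task replacing the Taskiq receiver:

```python
import asyncio
from contextlib import asynccontextmanager

@asynccontextmanager
async def toy_lifespan(app: object):
    stop = asyncio.Event()

    async def worker() -> None:
        await stop.wait()  # stand-in for Receiver.listen(finish_event)

    task = asyncio.create_task(worker())
    try:
        yield  # the application serves requests here
    finally:
        stop.set()  # signal the worker to drain and exit...
        await task  # ...and wait for it before tearing down anything else

async def main() -> str:
    async with toy_lifespan(None):
        pass
    return "clean shutdown"

print(asyncio.run(main()))  # clean shutdown
```

This is the same shape FastAPI expects for `FastAPI(lifespan=...)`: an async context manager whose body before `yield` runs at startup and whose `finally` block runs at shutdown.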


@@ -15,7 +15,7 @@ def download_poster_image(storage_path: Path, poster_url: str, uuid: UUID) -> bo
res = requests.get(poster_url, stream=True, timeout=60)
if res.status_code == 200:
image_file_path = storage_path.joinpath(str(uuid)).with_suffix(".jpg")
image_file_path.write_bytes(res.content)
original_image = Image.open(image_file_path)
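The `with_suffix` fix in this hunk is more than cosmetic: `Path.with_suffix` requires the leading dot and raises `ValueError` otherwise, so the old call would have failed before the poster was written. A quick demonstration (using `PurePosixPath` so the output is platform-independent):

```python
from pathlib import PurePosixPath

p = PurePosixPath("/posters/abc-123")
print(p.with_suffix(".jpg"))  # /posters/abc-123.jpg
try:
    p.with_suffix("jpg")  # missing the leading dot, as in the old code
except ValueError:
    print("ValueError")  # ValueError
```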


@@ -3,7 +3,6 @@ from uuid import UUID
from sqlalchemy import ForeignKey, PrimaryKeyConstraint, UniqueConstraint
from sqlalchemy.orm import Mapped, mapped_column, relationship
from media_manager.auth.db import User
from media_manager.database import Base
from media_manager.torrent.models import Quality
@@ -22,10 +21,6 @@ class Movie(Base):
original_language: Mapped[str | None] = mapped_column(default=None)
imdb_id: Mapped[str | None] = mapped_column(default=None)
movie_requests: Mapped[list["MovieRequest"]] = relationship(
"MovieRequest", back_populates="movie", cascade="all, delete-orphan"
)
class MovieFile(Base):
__tablename__ = "movie_file"
@@ -42,31 +37,3 @@ class MovieFile(Base):
)
torrent = relationship("Torrent", back_populates="movie_files", uselist=False)
class MovieRequest(Base):
__tablename__ = "movie_request"
__table_args__ = (UniqueConstraint("movie_id", "wanted_quality"),)
id: Mapped[UUID] = mapped_column(primary_key=True)
movie_id: Mapped[UUID] = mapped_column(
ForeignKey(column="movie.id", ondelete="CASCADE"),
)
wanted_quality: Mapped[Quality]
min_quality: Mapped[Quality]
authorized: Mapped[bool] = mapped_column(default=False)
requested_by_id: Mapped[UUID | None] = mapped_column(
ForeignKey(column="user.id", ondelete="SET NULL"),
)
authorized_by_id: Mapped[UUID | None] = mapped_column(
ForeignKey(column="user.id", ondelete="SET NULL"),
)
requested_by: Mapped["User|None"] = relationship(
foreign_keys=[requested_by_id], uselist=False
)
authorized_by: Mapped["User|None"] = relationship(
foreign_keys=[authorized_by_id], uselist=False
)
movie = relationship("Movie", back_populates="movie_requests", uselist=False)


@@ -5,10 +5,10 @@ from sqlalchemy.exc import (
IntegrityError,
SQLAlchemyError,
)
from sqlalchemy.orm import Session
from media_manager.exceptions import ConflictError, NotFoundError
from media_manager.movies.models import Movie, MovieFile
from media_manager.movies.schemas import (
Movie as MovieSchema,
)
@@ -17,17 +17,10 @@ from media_manager.movies.schemas import (
)
from media_manager.movies.schemas import (
MovieId,
MovieRequestId,
)
from media_manager.movies.schemas import (
MovieRequest as MovieRequestSchema,
)
from media_manager.movies.schemas import (
MovieTorrent as MovieTorrentSchema,
)
from media_manager.movies.schemas import (
RichMovieRequest as RichMovieRequestSchema,
)
from media_manager.torrent.models import Torrent
from media_manager.torrent.schemas import TorrentId
@@ -173,46 +166,6 @@ class MovieRepository:
log.exception(f"Database error while deleting movie {movie_id}")
raise
def add_movie_request(
self, movie_request: MovieRequestSchema
) -> MovieRequestSchema:
"""
Adds a Movie to the MovieRequest table, which marks it as requested.
:param movie_request: The MovieRequest object to add.
:return: The added MovieRequest object.
:raises IntegrityError: If a similar request already exists or violates constraints.
:raises SQLAlchemyError: If a database error occurs.
"""
log.debug(f"Adding movie request: {movie_request.model_dump_json()}")
db_model = MovieRequest(
id=movie_request.id,
movie_id=movie_request.movie_id,
requested_by_id=movie_request.requested_by.id
if movie_request.requested_by
else None,
authorized_by_id=movie_request.authorized_by.id
if movie_request.authorized_by
else None,
wanted_quality=movie_request.wanted_quality,
min_quality=movie_request.min_quality,
authorized=movie_request.authorized,
)
try:
self.db.add(db_model)
self.db.commit()
self.db.refresh(db_model)
log.info(f"Successfully added movie request with id: {db_model.id}")
return MovieRequestSchema.model_validate(db_model)
except IntegrityError:
self.db.rollback()
log.exception("Integrity error while adding movie request")
raise
except SQLAlchemyError:
self.db.rollback()
log.exception("Database error while adding movie request")
raise
def set_movie_library(self, movie_id: MovieId, library: str) -> None:
"""
Sets the library for a movie.
@@ -234,49 +187,6 @@ class MovieRepository:
log.exception(f"Database error setting library for movie {movie_id}")
raise
def delete_movie_request(self, movie_request_id: MovieRequestId) -> None:
"""
Removes a MovieRequest by its ID.
:param movie_request_id: The ID of the movie request to delete.
:raises NotFoundError: If the movie request is not found.
:raises SQLAlchemyError: If a database error occurs.
"""
try:
stmt = delete(MovieRequest).where(MovieRequest.id == movie_request_id)
result = self.db.execute(stmt)
if result.rowcount == 0:
self.db.rollback()
msg = f"movie request with id {movie_request_id} not found."
raise NotFoundError(msg)
self.db.commit()
# Successfully deleted movie request with id: {movie_request_id}
except SQLAlchemyError:
self.db.rollback()
log.exception(
f"Database error while deleting movie request {movie_request_id}"
)
raise
def get_movie_requests(self) -> list[RichMovieRequestSchema]:
"""
Retrieve all movie requests.
:return: A list of RichMovieRequest objects.
:raises SQLAlchemyError: If a database error occurs.
"""
try:
stmt = select(MovieRequest).options(
joinedload(MovieRequest.requested_by),
joinedload(MovieRequest.authorized_by),
joinedload(MovieRequest.movie),
)
results = self.db.execute(stmt).scalars().unique().all()
return [RichMovieRequestSchema.model_validate(x) for x in results]
except SQLAlchemyError:
log.exception("Database error while retrieving movie requests")
raise
def add_movie_file(self, movie_file: MovieFileSchema) -> MovieFileSchema:
"""
Adds a movie file record to the database.
@@ -396,25 +306,6 @@ class MovieRepository:
log.exception("Database error retrieving all movies with torrents")
raise
def get_movie_request(self, movie_request_id: MovieRequestId) -> MovieRequestSchema:
"""
Retrieve a movie request by its ID.
:param movie_request_id: The ID of the movie request.
:return: A MovieRequest object.
:raises NotFoundError: If the movie request is not found.
:raises SQLAlchemyError: If a database error occurs.
"""
try:
request = self.db.get(MovieRequest, movie_request_id)
if not request:
msg = f"Movie request with id {movie_request_id} not found."
raise NotFoundError(msg)
return MovieRequestSchema.model_validate(request)
except SQLAlchemyError:
log.exception(f"Database error retrieving movie request {movie_request_id}")
raise
def get_movie_by_torrent_id(self, torrent_id: TorrentId) -> MovieSchema:
"""
Retrieve a movie by a torrent ID.


@@ -1,9 +1,7 @@
from pathlib import Path
from typing import Annotated
from fastapi import APIRouter, Depends, HTTPException, status
from media_manager.auth.schemas import UserRead
from media_manager.auth.users import current_active_user, current_superuser
from media_manager.config import LibraryItem, MediaManagerConfig
from media_manager.exceptions import ConflictError, NotFoundError
@@ -13,20 +11,14 @@ from media_manager.indexer.schemas import (
)
from media_manager.metadataProvider.dependencies import metadata_provider_dep
from media_manager.metadataProvider.schemas import MetaDataProviderSearchResult
from media_manager.movies import log
from media_manager.movies.dependencies import (
movie_dep,
movie_service_dep,
)
from media_manager.movies.schemas import (
CreateMovieRequest,
Movie,
MovieRequest,
MovieRequestBase,
MovieRequestId,
PublicMovie,
PublicMovieFile,
RichMovieRequest,
RichMovieTorrent,
)
from media_manager.schemas import MediaImportSuggestion
@@ -188,103 +180,6 @@ def get_available_libraries() -> list[LibraryItem]:
return MediaManagerConfig().misc.movie_libraries
# -----------------------------------------------------------------------------
# MOVIE REQUESTS
# -----------------------------------------------------------------------------
@router.get(
"/requests",
dependencies=[Depends(current_active_user)],
)
def get_all_movie_requests(movie_service: movie_service_dep) -> list[RichMovieRequest]:
"""
Get all movie requests.
"""
return movie_service.get_all_movie_requests()
@router.post(
"/requests",
status_code=status.HTTP_201_CREATED,
)
def create_movie_request(
movie_service: movie_service_dep,
movie_request: CreateMovieRequest,
user: Annotated[UserRead, Depends(current_active_user)],
) -> MovieRequest:
"""
Create a new movie request.
"""
log.info(
f"User {user.email} is creating a movie request for {movie_request.movie_id}"
)
movie_request: MovieRequest = MovieRequest.model_validate(movie_request)
movie_request.requested_by = user
if user.is_superuser:
movie_request.authorized = True
movie_request.authorized_by = user
return movie_service.add_movie_request(movie_request=movie_request)
@router.put(
"/requests/{movie_request_id}",
)
def update_movie_request(
movie_service: movie_service_dep,
movie_request_id: MovieRequestId,
update_movie_request: MovieRequestBase,
user: Annotated[UserRead, Depends(current_active_user)],
) -> MovieRequest:
"""
Update an existing movie request.
"""
movie_request = movie_service.get_movie_request_by_id(
movie_request_id=movie_request_id
)
if movie_request.requested_by.id != user.id or user.is_superuser:
movie_request.min_quality = update_movie_request.min_quality
movie_request.wanted_quality = update_movie_request.wanted_quality
return movie_service.update_movie_request(movie_request=movie_request)
@router.patch("/requests/{movie_request_id}", status_code=status.HTTP_204_NO_CONTENT)
def authorize_request(
movie_service: movie_service_dep,
movie_request_id: MovieRequestId,
user: Annotated[UserRead, Depends(current_superuser)],
authorized_status: bool = False,
) -> None:
"""
Authorize or de-authorize a movie request.
"""
movie_request = movie_service.get_movie_request_by_id(
movie_request_id=movie_request_id
)
movie_request.authorized = authorized_status
if authorized_status:
movie_request.authorized_by = user
else:
movie_request.authorized_by = None
movie_service.update_movie_request(movie_request=movie_request)
@router.delete(
"/requests/{movie_request_id}",
status_code=status.HTTP_204_NO_CONTENT,
dependencies=[Depends(current_superuser)],
)
def delete_movie_request(
movie_service: movie_service_dep, movie_request_id: MovieRequestId
) -> None:
"""
Delete a movie request.
"""
movie_service.delete_movie_request(movie_request_id=movie_request_id)
# -----------------------------------------------------------------------------
# MOVIES - SINGLE RESOURCE
# -----------------------------------------------------------------------------


@@ -2,14 +2,12 @@ import typing
import uuid
from uuid import UUID
from pydantic import BaseModel, ConfigDict, Field
from media_manager.auth.schemas import UserRead
from media_manager.torrent.models import Quality
from media_manager.torrent.schemas import TorrentId, TorrentStatus
MovieId = typing.NewType("MovieId", UUID)
MovieRequestId = typing.NewType("MovieRequestId", UUID)
class Movie(BaseModel):
@@ -40,38 +38,6 @@ class PublicMovieFile(MovieFile):
imported: bool = False
class MovieRequestBase(BaseModel):
min_quality: Quality
wanted_quality: Quality
@model_validator(mode="after")
def ensure_wanted_quality_is_eq_or_gt_min_quality(self) -> "MovieRequestBase":
if self.min_quality.value < self.wanted_quality.value:
msg = "wanted_quality must be equal to or lower than minimum_quality."
raise ValueError(msg)
return self
class CreateMovieRequest(MovieRequestBase):
movie_id: MovieId
class MovieRequest(MovieRequestBase):
model_config = ConfigDict(from_attributes=True)
id: MovieRequestId = Field(default_factory=lambda: MovieRequestId(uuid.uuid4()))
movie_id: MovieId
requested_by: UserRead | None = None
authorized: bool = False
authorized_by: UserRead | None = None
class RichMovieRequest(MovieRequest):
movie: Movie
class MovieTorrent(BaseModel):
model_config = ConfigDict(from_attributes=True)


@@ -4,12 +4,9 @@ from pathlib import Path
from typing import overload
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session
from media_manager.config import MediaManagerConfig
from media_manager.database import SessionLocal, get_session
from media_manager.exceptions import InvalidConfigError, NotFoundError, RenameError
from media_manager.indexer.repository import IndexerRepository
from media_manager.indexer.schemas import IndexerQueryResult, IndexerQueryResultId
from media_manager.indexer.service import IndexerService
from media_manager.indexer.utils import evaluate_indexer_query_results
@@ -25,20 +22,14 @@ from media_manager.movies.schemas import (
Movie,
MovieFile,
MovieId,
MovieRequest,
MovieRequestId,
PublicMovie,
PublicMovieFile,
RichMovieRequest,
RichMovieTorrent,
)
from media_manager.notification.repository import NotificationRepository
from media_manager.notification.service import NotificationService
from media_manager.schemas import MediaImportSuggestion
from media_manager.torrent.repository import TorrentRepository
from media_manager.torrent.schemas import (
Quality,
QualityStrings,
Torrent,
TorrentStatus,
)
@@ -89,44 +80,6 @@ class MovieService:
metadata_provider.download_movie_poster_image(movie=saved_movie)
return saved_movie
def add_movie_request(self, movie_request: MovieRequest) -> MovieRequest:
"""
Add a new movie request.
:param movie_request: The movie request to add.
:return: The added movie request.
"""
return self.movie_repository.add_movie_request(movie_request=movie_request)
def get_movie_request_by_id(self, movie_request_id: MovieRequestId) -> MovieRequest:
"""
Get a movie request by its ID.
:param movie_request_id: The ID of the movie request.
:return: The movie request or None if not found.
"""
return self.movie_repository.get_movie_request(
movie_request_id=movie_request_id
)
def update_movie_request(self, movie_request: MovieRequest) -> MovieRequest:
"""
Update an existing movie request.
:param movie_request: The movie request to update.
:return: The updated movie request.
"""
self.movie_repository.delete_movie_request(movie_request_id=movie_request.id)
return self.movie_repository.add_movie_request(movie_request=movie_request)
def delete_movie_request(self, movie_request_id: MovieRequestId) -> None:
"""
Delete a movie request by its ID.
:param movie_request_id: The ID of the movie request to delete.
"""
self.movie_repository.delete_movie_request(movie_request_id=movie_request_id)
def delete_movie(
self,
movie: Movie,
@@ -391,14 +344,6 @@ class MovieService:
external_id=external_id, metadata_provider=metadata_provider
)
def get_all_movie_requests(self) -> list[RichMovieRequest]:
"""
Get all movie requests.
:return: A list of rich movie requests.
"""
return self.movie_repository.get_movie_requests()
def set_movie_library(self, movie: Movie, library: str) -> None:
self.movie_repository.set_movie_library(movie_id=movie.id, library=library)
@@ -471,65 +416,6 @@ class MovieService:
self.torrent_service.resume_download(torrent=movie_torrent)
return movie_torrent
def download_approved_movie_request(
self, movie_request: MovieRequest, movie: Movie
) -> bool:
"""
Download an approved movie request.
:param movie_request: The movie request to download.
:param movie: The Movie object.
:return: True if the download was successful, False otherwise.
:raises ValueError: If the movie request is not authorized.
"""
if not movie_request.authorized:
msg = "Movie request is not authorized"
raise ValueError(msg)
log.info(f"Downloading approved movie request {movie_request.id}")
torrents = self.get_all_available_torrents_for_movie(movie=movie)
available_torrents: list[IndexerQueryResult] = []
for torrent in torrents:
if (
(torrent.quality.value < movie_request.wanted_quality.value)
or (torrent.quality.value > movie_request.min_quality.value)
or (torrent.seeders < 3)
):
log.debug(
f"Skipping torrent {torrent.title} with quality {torrent.quality} for movie {movie.id}, because it does not match the requested quality {movie_request.wanted_quality}"
)
else:
available_torrents.append(torrent)
log.debug(
f"Taking torrent {torrent.title} with quality {torrent.quality} for movie {movie.id} into consideration"
)
if len(available_torrents) == 0:
log.warning(
f"No torrents found for movie request {movie_request.id} with quality between {QualityStrings[movie_request.min_quality.name]} and {QualityStrings[movie_request.wanted_quality.name]}"
)
return False
available_torrents.sort()
torrent = self.torrent_service.download(indexer_result=available_torrents[0])
movie_file = MovieFile(
movie_id=movie.id,
quality=torrent.quality,
torrent_id=torrent.id,
file_path_suffix=QualityStrings[torrent.quality.name].value.upper(),
)
try:
self.movie_repository.add_movie_file(movie_file=movie_file)
except IntegrityError:
log.warning(
f"Movie file for movie {movie.name} and torrent {torrent.title} already exists"
)
self.delete_movie_request(movie_request.id)
return True
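The quality gate above keeps only torrents whose quality enum value lies between the wanted and minimum bounds and that have at least 3 seeders. A minimal standalone sketch of the same filter, using plain dicts and integer quality values in place of the project's enums (the `eligible_torrents` name and dict shape are illustrative, not from the codebase):

```python
def eligible_torrents(
    candidates: list[dict], wanted: int, minimum: int, min_seeders: int = 3
) -> list[dict]:
    # Mirror the guard above: keep candidates whose quality value falls in
    # [wanted, minimum] and that have enough seeders to download reliably.
    return [
        c
        for c in candidates
        if wanted <= c["quality"] <= minimum and c["seeders"] >= min_seeders
    ]
```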
    def get_movie_root_path(self, movie: Movie) -> Path:
        misc_config = MediaManagerConfig().misc
        movie_file_path = (
@@ -723,7 +609,7 @@ class MovieService:
            )
            if not fresh_movie_data:
                log.warning(
-                    f"Could not fetch fresh metadata for movie: {db_movie.name} (ID: {db_movie.external_id})"
+                    f"Could not fetch fresh metadata for movie: {db_movie.name} ({db_movie.year})"
                )
                return None
            log.debug(f"Fetched fresh metadata for movie: {fresh_movie_data.name}")
@@ -738,7 +624,9 @@ class MovieService:
        updated_movie = self.movie_repository.get_movie_by_id(movie_id=db_movie.id)
-        log.info(f"Successfully updated metadata for movie ID: {db_movie.id}")
+        log.info(
+            f"Successfully updated metadata for movie: {db_movie.name} ({db_movie.year})"
+        )
        metadata_provider.download_movie_poster_image(movie=updated_movie)
        return updated_movie
@@ -773,102 +661,29 @@ class MovieService:
        log.debug(f"Found {len(importable_movies)} importable movies.")
        return importable_movies
def import_all_torrents(self) -> None:
def auto_download_all_approved_movie_requests() -> None:
"""
Auto download all approved movie requests.
This is a standalone function as it creates its own DB session.
"""
db: Session = SessionLocal() if SessionLocal else next(get_session())
movie_repository = MovieRepository(db=db)
torrent_service = TorrentService(torrent_repository=TorrentRepository(db=db))
indexer_service = IndexerService(indexer_repository=IndexerRepository(db=db))
notification_service = NotificationService(
notification_repository=NotificationRepository(db=db)
)
movie_service = MovieService(
movie_repository=movie_repository,
torrent_service=torrent_service,
indexer_service=indexer_service,
notification_service=notification_service,
)
log.info("Auto downloading all approved movie requests")
movie_requests = movie_repository.get_movie_requests()
log.info(f"Found {len(movie_requests)} movie requests to process")
count = 0
for movie_request in movie_requests:
if movie_request.authorized:
movie = movie_repository.get_movie_by_id(movie_id=movie_request.movie_id)
if movie_service.download_approved_movie_request(
movie_request=movie_request, movie=movie
):
count += 1
else:
log.info(
f"Could not download movie request {movie_request.id} for movie {movie.name}"
)
log.info(f"Auto downloaded {count} approved movie requests")
db.commit()
db.close()
def import_all_movie_torrents() -> None:
with next(get_session()) as db:
movie_repository = MovieRepository(db=db)
torrent_service = TorrentService(torrent_repository=TorrentRepository(db=db))
indexer_service = IndexerService(indexer_repository=IndexerRepository(db=db))
notification_service = NotificationService(
notification_repository=NotificationRepository(db=db)
)
movie_service = MovieService(
movie_repository=movie_repository,
torrent_service=torrent_service,
indexer_service=indexer_service,
notification_service=notification_service,
)
        log.info("Importing all torrents")
-        torrents = torrent_service.get_all_torrents()
+        torrents = self.torrent_service.get_all_torrents()
        log.info("Found %d torrents to import", len(torrents))
        for t in torrents:
            try:
                if not t.imported and t.status == TorrentStatus.finished:
-                    movie = torrent_service.get_movie_of_torrent(torrent=t)
+                    movie = self.torrent_service.get_movie_of_torrent(torrent=t)
                    if movie is None:
                        log.warning(
                            f"torrent {t.title} is not a movie torrent, skipping import."
                        )
                        continue
-                    movie_service.import_torrent_files(torrent=t, movie=movie)
+                    self.import_torrent_files(torrent=t, movie=movie)
            except RuntimeError:
                log.exception(f"Failed to import torrent {t.title}")
        log.info("Finished importing all torrents")
db.commit()
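The import loop above only touches torrents that are both finished and not yet imported; a minimal sketch of that selection guard, with a hypothetical `TorrentInfo` dataclass standing in for the project's torrent schema:

```python
from dataclasses import dataclass


@dataclass
class TorrentInfo:
    title: str
    imported: bool
    finished: bool


def torrents_to_import(torrents: list[TorrentInfo]) -> list[TorrentInfo]:
    # Mirror the loop's guard: only finished torrents that have not
    # already been imported qualify for file import.
    return [t for t in torrents if t.finished and not t.imported]
```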
def update_all_movies_metadata() -> None:
"""
Updates the metadata of all movies.
"""
with next(get_session()) as db:
movie_repository = MovieRepository(db=db)
movie_service = MovieService(
movie_repository=movie_repository,
torrent_service=TorrentService(torrent_repository=TorrentRepository(db=db)),
indexer_service=IndexerService(indexer_repository=IndexerRepository(db=db)),
notification_service=NotificationService(
notification_repository=NotificationRepository(db=db)
),
)
def update_all_metadata(self) -> None:
"""Updates the metadata of all movies."""
    log.info("Updating metadata for all movies")
-    movies = movie_repository.get_movies()
+    movies = self.movie_repository.get_movies()
    log.info(f"Found {len(movies)} movies to update")
    for movie in movies:
        try:
            if movie.metadata_provider == "tmdb":
@@ -885,7 +700,6 @@ def update_all_movies_metadata() -> None:
                f"Error initializing metadata provider {movie.metadata_provider} for movie {movie.name}",
            )
            continue
-        movie_service.update_movie_metadata(
+        self.update_movie_metadata(
            db_movie=movie, metadata_provider=metadata_provider
        )
db.commit()


@@ -1,67 +1,91 @@
-from apscheduler.jobstores.sqlalchemy import SQLAlchemyJobStore
-from apscheduler.schedulers.background import BackgroundScheduler
-from apscheduler.triggers.cron import CronTrigger
-import media_manager.database
-from media_manager.config import MediaManagerConfig
-from media_manager.movies.service import (
-    auto_download_all_approved_movie_requests,
-    import_all_movie_torrents,
-    update_all_movies_metadata,
-)
-from media_manager.tv.service import (
-    auto_download_all_approved_season_requests,
-    import_all_show_torrents,
-    update_all_non_ended_shows_metadata,
-)
-def setup_scheduler(config: MediaManagerConfig) -> BackgroundScheduler:
-    from media_manager.database import init_engine
-    init_engine(config.database)
-    jobstores = {"default": SQLAlchemyJobStore(engine=media_manager.database.engine)}
-    scheduler = BackgroundScheduler(jobstores=jobstores)
-    every_15_minutes_trigger = CronTrigger(minute="*/15", hour="*")
-    daily_trigger = CronTrigger(hour=0, minute=0, jitter=60 * 60 * 24 * 2)
-    weekly_trigger = CronTrigger(
-        day_of_week="mon", hour=0, minute=0, jitter=60 * 60 * 24 * 2
-    )
-    scheduler.add_job(
-        import_all_movie_torrents,
-        every_15_minutes_trigger,
-        id="import_all_movie_torrents",
-        replace_existing=True,
-    )
-    scheduler.add_job(
-        import_all_show_torrents,
-        every_15_minutes_trigger,
-        id="import_all_show_torrents",
-        replace_existing=True,
-    )
-    scheduler.add_job(
-        auto_download_all_approved_season_requests,
-        daily_trigger,
-        id="auto_download_all_approved_season_requests",
-        replace_existing=True,
-    )
-    scheduler.add_job(
-        auto_download_all_approved_movie_requests,
-        daily_trigger,
-        id="auto_download_all_approved_movie_requests",
-        replace_existing=True,
-    )
-    scheduler.add_job(
-        update_all_movies_metadata,
-        weekly_trigger,
-        id="update_all_movies_metadata",
-        replace_existing=True,
-    )
-    scheduler.add_job(
-        update_all_non_ended_shows_metadata,
-        weekly_trigger,
-        id="update_all_non_ended_shows_metadata",
-        replace_existing=True,
-    )
-    scheduler.start()
-    return scheduler
+import asyncio
+import logging
+from urllib.parse import quote
+import taskiq_fastapi
+from taskiq import TaskiqDepends, TaskiqScheduler
+from taskiq.cli.scheduler.run import SchedulerLoop
+from taskiq_postgresql import PostgresqlBroker
+from taskiq_postgresql.scheduler_source import PostgresqlSchedulerSource
+from media_manager.movies.dependencies import get_movie_service
+from media_manager.movies.service import MovieService
+from media_manager.tv.dependencies import get_tv_service
+from media_manager.tv.service import TvService
+def _build_db_connection_string_for_taskiq() -> str:
+    from media_manager.config import MediaManagerConfig
+    db_config = MediaManagerConfig().database
+    user = quote(db_config.user, safe="")
+    password = quote(db_config.password, safe="")
+    dbname = quote(db_config.dbname, safe="")
+    host = quote(str(db_config.host), safe="")
+    port = quote(str(db_config.port), safe="")
+    return f"postgresql://{user}:{password}@{host}:{port}/{dbname}"
+broker = PostgresqlBroker(
+    dsn=_build_db_connection_string_for_taskiq,
+    driver="psycopg",
+    run_migrations=True,
+)
+# Register FastAPI app with the broker so worker processes can resolve FastAPI
+# dependencies. Using a string reference avoids circular imports.
+taskiq_fastapi.init(broker, "media_manager.main:app")
+log = logging.getLogger(__name__)
+@broker.task
+async def import_all_movie_torrents_task(
+    movie_service: MovieService = TaskiqDepends(get_movie_service),
+) -> None:
+    log.info("Importing all Movie torrents")
+    await asyncio.to_thread(movie_service.import_all_torrents)
+@broker.task
+async def import_all_show_torrents_task(
+    tv_service: TvService = TaskiqDepends(get_tv_service),
+) -> None:
+    log.info("Importing all Show torrents")
+    await asyncio.to_thread(tv_service.import_all_torrents)
+@broker.task
+async def update_all_movies_metadata_task(
+    movie_service: MovieService = TaskiqDepends(get_movie_service),
+) -> None:
+    await asyncio.to_thread(movie_service.update_all_metadata)
+@broker.task
+async def update_all_non_ended_shows_metadata_task(
+    tv_service: TvService = TaskiqDepends(get_tv_service),
+) -> None:
+    await asyncio.to_thread(tv_service.update_all_non_ended_shows_metadata)
+# Maps each task to its cron schedule so PostgresqlSchedulerSource can seed
+# the taskiq_schedulers table on first startup.
+_STARTUP_SCHEDULES: dict[str, list[dict[str, str]]] = {
+    import_all_movie_torrents_task.task_name: [{"cron": "*/2 * * * *"}],
+    import_all_show_torrents_task.task_name: [{"cron": "*/2 * * * *"}],
+    update_all_movies_metadata_task.task_name: [{"cron": "0 0 * * 1"}],
+    update_all_non_ended_shows_metadata_task.task_name: [{"cron": "0 0 * * 1"}],
+}
+def build_scheduler_loop() -> SchedulerLoop:
+    source = PostgresqlSchedulerSource(
+        dsn=_build_db_connection_string_for_taskiq,
+        driver="psycopg",
+        broker=broker,
+        run_migrations=True,
+        startup_schedule=_STARTUP_SCHEDULES,
+    )
+    scheduler = TaskiqScheduler(broker=broker, sources=[source])
+    return SchedulerLoop(scheduler)
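The `_build_db_connection_string_for_taskiq` helper percent-encodes every DSN component with `urllib.parse.quote` so special characters in credentials cannot corrupt the URL structure. A minimal standalone sketch of the same idea (the `build_postgres_dsn` name is illustrative, not from the codebase):

```python
from urllib.parse import quote


def build_postgres_dsn(
    user: str, password: str, host: str, port: int, dbname: str
) -> str:
    # Percent-encode each component so characters like "@", ":" or "/" in
    # credentials cannot be mistaken for DSN delimiters.
    return (
        f"postgresql://{quote(user, safe='')}:{quote(password, safe='')}"
        f"@{quote(host, safe='')}:{quote(str(port), safe='')}/{quote(dbname, safe='')}"
    )
```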


@@ -1,8 +1,7 @@
-from pydantic_settings import BaseSettings, SettingsConfigDict
+from pydantic_settings import BaseSettings
class QbittorrentConfig(BaseSettings):
-    model_config = SettingsConfigDict(env_prefix="QBITTORRENT_")
    host: str = "localhost"
    port: int = 8080
    username: str = "admin"
@@ -14,7 +13,6 @@ class QbittorrentConfig(BaseSettings):
class TransmissionConfig(BaseSettings):
-    model_config = SettingsConfigDict(env_prefix="TRANSMISSION_")
    path: str = "/transmission/rpc"
    https_enabled: bool = True
    host: str = "localhost"
@@ -25,7 +23,6 @@ class TransmissionConfig(BaseSettings):
class SabnzbdConfig(BaseSettings):
-    model_config = SettingsConfigDict(env_prefix="SABNZBD_")
    host: str = "localhost"
    port: int = 8080
    api_key: str = ""


@@ -57,23 +57,45 @@ class QbittorrentDownloadClient(AbstractDownloadClient):
            log.exception("Failed to log into qbittorrent")
            raise
-        try:
-            self.api_client.torrents_create_category(
-                name=self.config.category_name,
-                save_path=self.config.category_save_path
-                if self.config.category_save_path != ""
-                else None,
-            )
-        except Conflict409Error:
-            try:
-                self.api_client.torrents_edit_category(
-                    name=self.config.category_name,
-                    save_path=self.config.category_save_path
-                    if self.config.category_save_path != ""
-                    else None,
-                )
-            except Exception:
-                log.exception("Error on updating MediaManager category in qBittorrent")
+        categories = self.api_client.torrents_categories()
+        log.debug(f"Found following categories in qBittorrent: {categories}")
+        if self.config.category_name in categories:
+            category = categories.get(self.config.category_name)
+            if category.get("savePath") == self.config.category_save_path:
+                log.debug(
+                    f"Category '{self.config.category_name}' already exists in qBittorrent with the correct save path."
+                )
+                return
+            # category exists but with a different save path, attempt to update it
+            log.debug(
+                f"Category '{self.config.category_name}' already exists in qBittorrent but with a different save path. Attempting to update it."
+            )
+            try:
+                self.api_client.torrents_edit_category(
+                    name=self.config.category_name,
+                    save_path=self.config.category_save_path,
+                )
+            except Conflict409Error:
+                log.exception(
+                    f"Attempt to update category '{self.config.category_name}' in qBittorrent with a different save"
+                    f" path failed. The configured save path and the save path saved in qBittorrent differ,"
+                    f" manually update it in the qBittorrent WebUI or change the save path in the MediaManager"
+                    f" config to match the one in qBittorrent."
+                )
+        else:
+            # create category if it doesn't exist
+            log.debug(
+                f"Category '{self.config.category_name}' does not exist in qBittorrent. Attempting to create it."
+            )
+            try:
+                self.api_client.torrents_create_category(
+                    name=self.config.category_name,
+                    save_path=self.config.category_save_path,
+                )
+            except Conflict409Error:
+                log.exception(
+                    f"Attempt to create category '{self.config.category_name}' in qBittorrent failed. The category already exists but was not found in the initial category list, manually check if the category exists in the qBittorrent WebUI or change the category name in the MediaManager config."
+                )
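The new category setup follows a check-then-reconcile pattern: keep a category whose save path already matches, update one whose path differs, and create it when missing. A minimal sketch of that decision logic, using a plain dict in place of the qBittorrent API response (the `reconcile_category` name is illustrative, not from the codebase):

```python
def reconcile_category(categories: dict, name: str, save_path: str) -> str:
    """Decide what to do with a download category.

    Mirrors the check order above: "keep" when the save path already
    matches, "update" when the category exists with a different path,
    "create" when the category is missing entirely.
    """
    if name in categories:
        if categories[name].get("savePath") == save_path:
            return "keep"
        return "update"
    return "create"
```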
    def download_torrent(self, indexer_result: IndexerQueryResult) -> Torrent:
        """


@@ -16,5 +16,5 @@ class Torrent(Base):
    hash: Mapped[str]
    usenet: Mapped[bool]
-    season_files = relationship("SeasonFile", back_populates="torrent")
+    episode_files = relationship("EpisodeFile", back_populates="torrent")
    movie_files = relationship("MovieFile", back_populates="torrent")


@@ -3,17 +3,13 @@ from sqlalchemy import delete, select
from media_manager.database import DbSessionDependency
from media_manager.exceptions import NotFoundError
from media_manager.movies.models import Movie, MovieFile
-from media_manager.movies.schemas import (
-    Movie as MovieSchema,
-)
-from media_manager.movies.schemas import (
-    MovieFile as MovieFileSchema,
-)
+from media_manager.movies.schemas import Movie as MovieSchema
+from media_manager.movies.schemas import MovieFile as MovieFileSchema
from media_manager.torrent.models import Torrent
from media_manager.torrent.schemas import Torrent as TorrentSchema
from media_manager.torrent.schemas import TorrentId
-from media_manager.tv.models import Season, SeasonFile, Show
-from media_manager.tv.schemas import SeasonFile as SeasonFileSchema
+from media_manager.tv.models import Episode, EpisodeFile, Season, Show
+from media_manager.tv.schemas import EpisodeFile as EpisodeFileSchema
from media_manager.tv.schemas import Show as ShowSchema
@@ -21,19 +17,22 @@ class TorrentRepository:
    def __init__(self, db: DbSessionDependency) -> None:
        self.db = db
-    def get_seasons_files_of_torrent(
+    def get_episode_files_of_torrent(
        self, torrent_id: TorrentId
-    ) -> list[SeasonFileSchema]:
-        stmt = select(SeasonFile).where(SeasonFile.torrent_id == torrent_id)
+    ) -> list[EpisodeFileSchema]:
+        stmt = select(EpisodeFile).where(EpisodeFile.torrent_id == torrent_id)
        result = self.db.execute(stmt).scalars().all()
-        return [SeasonFileSchema.model_validate(season_file) for season_file in result]
+        return [
+            EpisodeFileSchema.model_validate(episode_file) for episode_file in result
+        ]
    def get_show_of_torrent(self, torrent_id: TorrentId) -> ShowSchema | None:
        stmt = (
            select(Show)
-            .join(SeasonFile.season)
-            .join(Season.show)
-            .where(SeasonFile.torrent_id == torrent_id)
+            .join(Show.seasons)
+            .join(Season.episodes)
+            .join(Episode.episode_files)
+            .where(EpisodeFile.torrent_id == torrent_id)
        )
        result = self.db.execute(stmt).unique().scalar_one_or_none()
        if result is None:
@@ -69,10 +68,10 @@ class TorrentRepository:
        )
        self.db.execute(movie_files_stmt)
-        season_files_stmt = delete(SeasonFile).where(
-            SeasonFile.torrent_id == torrent_id
-        )
-        self.db.execute(season_files_stmt)
+        episode_files_stmt = delete(EpisodeFile).where(
+            EpisodeFile.torrent_id == torrent_id
+        )
+        self.db.execute(episode_files_stmt)
        self.db.delete(self.db.get(Torrent, torrent_id))


@@ -5,7 +5,7 @@ from media_manager.movies.schemas import Movie, MovieFile
from media_manager.torrent.manager import DownloadManager
from media_manager.torrent.repository import TorrentRepository
from media_manager.torrent.schemas import Torrent, TorrentId
-from media_manager.tv.schemas import SeasonFile, Show
+from media_manager.tv.schemas import EpisodeFile, Show
log = logging.getLogger(__name__)
@@ -19,13 +19,13 @@ class TorrentService:
        self.torrent_repository = torrent_repository
        self.download_manager = download_manager or DownloadManager()
-    def get_season_files_of_torrent(self, torrent: Torrent) -> list[SeasonFile]:
+    def get_episode_files_of_torrent(self, torrent: Torrent) -> list[EpisodeFile]:
        """
-        Returns all season files of a torrent
-        :param torrent: the torrent to get the season files of
-        :return: list of season files
+        Returns all episode files of a torrent
+        :param torrent: the torrent to get the episode files of
+        :return: list of episode files
        """
-        return self.torrent_repository.get_seasons_files_of_torrent(
+        return self.torrent_repository.get_episode_files_of_torrent(
            torrent_id=torrent.id
        )


@@ -133,7 +133,8 @@ def get_torrent_hash(torrent: IndexerQueryResult) -> str:
    :return: The hash of the torrent.
    """
    torrent_filepath = (
-        MediaManagerConfig().misc.torrent_directory / f"{sanitize_filename(torrent.title)}.torrent"
+        MediaManagerConfig().misc.torrent_directory
+        / f"{sanitize_filename(torrent.title)}.torrent"
    )
    if torrent_filepath.exists():
        log.warning(f"Torrent file already exists at: {torrent_filepath}")


@@ -3,7 +3,6 @@ from uuid import UUID
from sqlalchemy import ForeignKey, PrimaryKeyConstraint, UniqueConstraint
from sqlalchemy.orm import Mapped, mapped_column, relationship
-from media_manager.auth.db import User
from media_manager.database import Base
from media_manager.torrent.models import Quality
@@ -48,13 +47,6 @@ class Season(Base):
        back_populates="season", cascade="all, delete"
    )
season_files = relationship(
"SeasonFile", back_populates="season", cascade="all, delete"
)
season_requests = relationship(
"SeasonRequest", back_populates="season", cascade="all, delete"
)
class Episode(Base):
    __tablename__ = "episode"
@@ -66,15 +58,19 @@ class Episode(Base):
    number: Mapped[int]
    external_id: Mapped[int]
    title: Mapped[str]
+    overview: Mapped[str | None] = mapped_column(nullable=True)
    season: Mapped["Season"] = relationship(back_populates="episodes")
+    episode_files = relationship(
+        "EpisodeFile", back_populates="episode", cascade="all, delete"
+    )
-class SeasonFile(Base):
-    __tablename__ = "season_file"
-    __table_args__ = (PrimaryKeyConstraint("season_id", "file_path_suffix"),)
-    season_id: Mapped[UUID] = mapped_column(
-        ForeignKey(column="season.id", ondelete="CASCADE"),
-    )
+class EpisodeFile(Base):
+    __tablename__ = "episode_file"
+    __table_args__ = (PrimaryKeyConstraint("episode_id", "file_path_suffix"),)
+    episode_id: Mapped[UUID] = mapped_column(
+        ForeignKey(column="episode.id", ondelete="CASCADE"),
+    )
    torrent_id: Mapped[UUID | None] = mapped_column(
        ForeignKey(column="torrent.id", ondelete="SET NULL"),
@@ -82,31 +78,5 @@ class SeasonFile(Base):
    file_path_suffix: Mapped[str]
    quality: Mapped[Quality]
-    torrent = relationship("Torrent", back_populates="season_files", uselist=False)
-    season = relationship("Season", back_populates="season_files", uselist=False)
+    torrent = relationship("Torrent", back_populates="episode_files", uselist=False)
+    episode = relationship("Episode", back_populates="episode_files", uselist=False)
class SeasonRequest(Base):
__tablename__ = "season_request"
__table_args__ = (UniqueConstraint("season_id", "wanted_quality"),)
id: Mapped[UUID] = mapped_column(primary_key=True)
season_id: Mapped[UUID] = mapped_column(
ForeignKey(column="season.id", ondelete="CASCADE"),
)
wanted_quality: Mapped[Quality]
min_quality: Mapped[Quality]
requested_by_id: Mapped[UUID | None] = mapped_column(
ForeignKey(column="user.id", ondelete="SET NULL"),
)
authorized: Mapped[bool] = mapped_column(default=False)
authorized_by_id: Mapped[UUID | None] = mapped_column(
ForeignKey(column="user.id", ondelete="SET NULL"),
)
requested_by: Mapped["User|None"] = relationship(
foreign_keys=[requested_by_id], uselist=False
)
authorized_by: Mapped["User|None"] = relationship(
foreign_keys=[authorized_by_id], uselist=False
)
season = relationship("Season", back_populates="season_requests", uselist=False)


@@ -1,8 +1,5 @@
from sqlalchemy import delete, func, select
-from sqlalchemy.exc import (
-    IntegrityError,
-    SQLAlchemyError,
-)
+from sqlalchemy.exc import IntegrityError, SQLAlchemyError
from sqlalchemy.orm import Session, joinedload
from media_manager.exceptions import ConflictError, NotFoundError
@@ -10,32 +7,18 @@ from media_manager.torrent.models import Torrent
from media_manager.torrent.schemas import Torrent as TorrentSchema
from media_manager.torrent.schemas import TorrentId
from media_manager.tv import log
-from media_manager.tv.models import Episode, Season, SeasonFile, SeasonRequest, Show
-from media_manager.tv.schemas import (
-    Episode as EpisodeSchema,
-)
+from media_manager.tv.models import Episode, EpisodeFile, Season, Show
+from media_manager.tv.schemas import Episode as EpisodeSchema
+from media_manager.tv.schemas import EpisodeFile as EpisodeFileSchema
from media_manager.tv.schemas import (
    EpisodeId,
-    EpisodeNumber,
    SeasonId,
    SeasonNumber,
-    SeasonRequestId,
    ShowId,
)
-from media_manager.tv.schemas import (
-    RichSeasonRequest as RichSeasonRequestSchema,
-)
-from media_manager.tv.schemas import (
-    Season as SeasonSchema,
-)
-from media_manager.tv.schemas import (
-    SeasonFile as SeasonFileSchema,
-)
-from media_manager.tv.schemas import (
-    SeasonRequest as SeasonRequestSchema,
-)
-from media_manager.tv.schemas import (
-    Show as ShowSchema,
-)
+from media_manager.tv.schemas import Season as SeasonSchema
+from media_manager.tv.schemas import Show as ShowSchema
class TvRepository:
@@ -120,9 +103,7 @@ class TvRepository:
    def get_total_downloaded_episodes_count(self) -> int:
        try:
-            stmt = (
-                select(func.count()).select_from(Episode).join(Season).join(SeasonFile)
-            )
+            stmt = select(func.count(Episode.id)).select_from(Episode).join(EpisodeFile)
            return self.db.execute(stmt).scalar_one_or_none()
        except SQLAlchemyError:
            log.exception("Database error while calculating downloaded episodes count")
@@ -173,6 +154,7 @@ class TvRepository:
                        number=episode.number,
                        external_id=episode.external_id,
                        title=episode.title,
+                        overview=episode.overview,
                    )
                    for episode in season.episodes
                ],
@@ -234,64 +216,40 @@ class TvRepository:
        log.exception(f"Database error while retrieving season {season_id}")
        raise
-    def add_season_request(
-        self, season_request: SeasonRequestSchema
-    ) -> SeasonRequestSchema:
-        """
-        Adds a Season to the SeasonRequest table, which marks it as requested.
-        :param season_request: The SeasonRequest object to add.
-        :return: The added SeasonRequest object.
-        :raises IntegrityError: If a similar request already exists or violates constraints.
-        :raises SQLAlchemyError: If a database error occurs.
-        """
-        db_model = SeasonRequest(
-            id=season_request.id,
-            season_id=season_request.season_id,
-            wanted_quality=season_request.wanted_quality,
-            min_quality=season_request.min_quality,
-            requested_by_id=season_request.requested_by.id
-            if season_request.requested_by
-            else None,
-            authorized=season_request.authorized,
-            authorized_by_id=season_request.authorized_by.id
-            if season_request.authorized_by
-            else None,
-        )
-        try:
-            self.db.add(db_model)
-            self.db.commit()
-            self.db.refresh(db_model)
-            return SeasonRequestSchema.model_validate(db_model)
-        except IntegrityError:
-            self.db.rollback()
-            log.exception("Integrity error while adding season request")
-            raise
-        except SQLAlchemyError:
-            self.db.rollback()
-            log.exception("Database error while adding season request")
-            raise
-    def delete_season_request(self, season_request_id: SeasonRequestId) -> None:
-        """
-        Removes a SeasonRequest by its ID.
-        :param season_request_id: The ID of the season request to delete.
-        :raises NotFoundError: If the season request is not found.
-        :raises SQLAlchemyError: If a database error occurs.
-        """
-        try:
-            stmt = delete(SeasonRequest).where(SeasonRequest.id == season_request_id)
-            result = self.db.execute(stmt)
-            if result.rowcount == 0:
-                self.db.rollback()
-                msg = f"SeasonRequest with id {season_request_id} not found."
-                raise NotFoundError(msg)
-            self.db.commit()
-        except SQLAlchemyError:
-            self.db.rollback()
-            log.exception(
-                f"Database error while deleting season request {season_request_id}"
-            )
-            raise
+    def get_episode(self, episode_id: EpisodeId) -> EpisodeSchema:
+        """
+        Retrieve an episode by its ID.
+        :param episode_id: The ID of the episode to get.
+        :return: An Episode object.
+        :raises NotFoundError: If the episode with the given ID is not found.
+        :raises SQLAlchemyError: If a database error occurs.
+        """
+        try:
+            episode = self.db.get(Episode, episode_id)
+            if not episode:
+                msg = f"Episode with id {episode_id} not found."
+                raise NotFoundError(msg)
+            return EpisodeSchema.model_validate(episode)
+        except SQLAlchemyError as e:
+            log.error(f"Database error while retrieving episode {episode_id}: {e}")
+            raise
+    def get_season_by_episode(self, episode_id: EpisodeId) -> SeasonSchema:
+        try:
+            stmt = select(Season).join(Season.episodes).where(Episode.id == episode_id)
+            season = self.db.scalar(stmt)
+            if not season:
+                msg = f"Season not found for episode {episode_id}"
+                raise NotFoundError(msg)
+            return SeasonSchema.model_validate(season)
+        except SQLAlchemyError as e:
+            log.error(
+                f"Database error while retrieving season for episode {episode_id}: {e}"
+            )
+            raise
@@ -323,78 +281,46 @@ class TvRepository:
             )
             raise
 
-    def get_season_requests(self) -> list[RichSeasonRequestSchema]:
-        """
-        Retrieve all season requests.
-
-        :return: A list of RichSeasonRequest objects.
-        :raises SQLAlchemyError: If a database error occurs.
-        """
-        try:
-            stmt = select(SeasonRequest).options(
-                joinedload(SeasonRequest.requested_by),
-                joinedload(SeasonRequest.authorized_by),
-                joinedload(SeasonRequest.season).joinedload(Season.show),
-            )
-            results = self.db.execute(stmt).scalars().unique().all()
-            return [
-                RichSeasonRequestSchema(
-                    id=SeasonRequestId(x.id),
-                    min_quality=x.min_quality,
-                    wanted_quality=x.wanted_quality,
-                    season_id=SeasonId(x.season_id),
-                    show=x.season.show,
-                    season=x.season,
-                    requested_by=x.requested_by,
-                    authorized_by=x.authorized_by,
-                    authorized=x.authorized,
-                )
-                for x in results
-            ]
-        except SQLAlchemyError:
-            log.exception("Database error while retrieving season requests")
-            raise
-
-    def add_season_file(self, season_file: SeasonFileSchema) -> SeasonFileSchema:
+    def add_episode_file(self, episode_file: EpisodeFileSchema) -> EpisodeFileSchema:
         """
-        Adds a season file record to the database.
+        Adds an episode file record to the database.
 
-        :param season_file: The SeasonFile object to add.
-        :return: The added SeasonFile object.
+        :param episode_file: The EpisodeFile object to add.
+        :return: The added EpisodeFile object.
         :raises IntegrityError: If the record violates constraints.
         :raises SQLAlchemyError: If a database error occurs.
        """
-        db_model = SeasonFile(**season_file.model_dump())
+        db_model = EpisodeFile(**episode_file.model_dump())
         try:
             self.db.add(db_model)
             self.db.commit()
             self.db.refresh(db_model)
-            return SeasonFileSchema.model_validate(db_model)
-        except IntegrityError:
+            return EpisodeFileSchema.model_validate(db_model)
+        except IntegrityError as e:
             self.db.rollback()
-            log.exception("Integrity error while adding season file")
+            log.error(f"Integrity error while adding episode file: {e}")
             raise
-        except SQLAlchemyError:
+        except SQLAlchemyError as e:
             self.db.rollback()
-            log.exception("Database error while adding season file")
+            log.error(f"Database error while adding episode file: {e}")
             raise
 
-    def remove_season_files_by_torrent_id(self, torrent_id: TorrentId) -> int:
+    def remove_episode_files_by_torrent_id(self, torrent_id: TorrentId) -> int:
         """
-        Removes season file records associated with a given torrent ID.
+        Removes episode file records associated with a given torrent ID.
 
-        :param torrent_id: The ID of the torrent whose season files are to be removed.
-        :return: The number of season files removed.
+        :param torrent_id: The ID of the torrent whose episode files are to be removed.
+        :return: The number of episode files removed.
        :raises SQLAlchemyError: If a database error occurs.
        """
        try:
-            stmt = delete(SeasonFile).where(SeasonFile.torrent_id == torrent_id)
+            stmt = delete(EpisodeFile).where(EpisodeFile.torrent_id == torrent_id)
            result = self.db.execute(stmt)
            self.db.commit()
        except SQLAlchemyError:
            self.db.rollback()
            log.exception(
-                f"Database error removing season files for torrent_id {torrent_id}"
+                f"Database error removing episode files for torrent_id {torrent_id}"
            )
            raise
        return result.rowcount
@@ -420,23 +346,45 @@ class TvRepository:
             log.exception(f"Database error setting library for show {show_id}")
             raise
 
-    def get_season_files_by_season_id(
-        self, season_id: SeasonId
-    ) -> list[SeasonFileSchema]:
+    def get_episode_files_by_season_id(
+        self, season_id: SeasonId
+    ) -> list[EpisodeFileSchema]:
         """
-        Retrieve all season files for a given season ID.
+        Retrieve all episode files for a given season ID.
 
         :param season_id: The ID of the season.
-        :return: A list of SeasonFile objects.
+        :return: A list of EpisodeFile objects.
         :raises SQLAlchemyError: If a database error occurs.
         """
         try:
-            stmt = select(SeasonFile).where(SeasonFile.season_id == season_id)
+            stmt = (
+                select(EpisodeFile).join(Episode).where(Episode.season_id == season_id)
+            )
             results = self.db.execute(stmt).scalars().all()
-            return [SeasonFileSchema.model_validate(sf) for sf in results]
+            return [EpisodeFileSchema.model_validate(ef) for ef in results]
         except SQLAlchemyError:
             log.exception(
-                f"Database error retrieving season files for season_id {season_id}"
+                f"Database error retrieving episode files for season_id {season_id}"
+            )
+            raise
+
+    def get_episode_files_by_episode_id(
+        self, episode_id: EpisodeId
+    ) -> list[EpisodeFileSchema]:
+        """
+        Retrieve all episode files for a given episode ID.
+
+        :param episode_id: The ID of the episode.
+        :return: A list of EpisodeFile objects.
+        :raises SQLAlchemyError: If a database error occurs.
+        """
+        try:
+            stmt = select(EpisodeFile).where(EpisodeFile.episode_id == episode_id)
+            results = self.db.execute(stmt).scalars().all()
+            return [EpisodeFileSchema.model_validate(sf) for sf in results]
+        except SQLAlchemyError as e:
+            log.error(
+                f"Database error retrieving episode files for episode_id {episode_id}: {e}"
             )
             raise
@@ -452,8 +400,9 @@ class TvRepository:
             stmt = (
                 select(Torrent)
                 .distinct()
-                .join(SeasonFile, SeasonFile.torrent_id == Torrent.id)
-                .join(Season, Season.id == SeasonFile.season_id)
+                .join(EpisodeFile, EpisodeFile.torrent_id == Torrent.id)
+                .join(Episode, Episode.id == EpisodeFile.episode_id)
+                .join(Season, Season.id == Episode.season_id)
                 .where(Season.show_id == show_id)
             )
             results = self.db.execute(stmt).scalars().unique().all()
@@ -474,8 +423,9 @@ class TvRepository:
                 select(Show)
                 .distinct()
                 .join(Season, Show.id == Season.show_id)
-                .join(SeasonFile, Season.id == SeasonFile.season_id)
-                .join(Torrent, SeasonFile.torrent_id == Torrent.id)
+                .join(Episode, Season.id == Episode.season_id)
+                .join(EpisodeFile, Episode.id == EpisodeFile.episode_id)
+                .join(Torrent, EpisodeFile.torrent_id == Torrent.id)
                 .options(joinedload(Show.seasons).joinedload(Season.episodes))
                 .order_by(Show.name)
             )
@@ -497,8 +447,9 @@ class TvRepository:
             stmt = (
                 select(Season.number)
                 .distinct()
-                .join(SeasonFile, Season.id == SeasonFile.season_id)
-                .where(SeasonFile.torrent_id == torrent_id)
+                .join(Episode, Episode.season_id == Season.id)
+                .join(EpisodeFile, EpisodeFile.episode_id == Episode.id)
+                .where(EpisodeFile.torrent_id == torrent_id)
             )
             results = self.db.execute(stmt).scalars().unique().all()
             return [SeasonNumber(x) for x in results]
@@ -508,27 +459,29 @@ class TvRepository:
             )
             raise
 
-    def get_season_request(
-        self, season_request_id: SeasonRequestId
-    ) -> SeasonRequestSchema:
+    def get_episodes_by_torrent_id(self, torrent_id: TorrentId) -> list[EpisodeNumber]:
         """
-        Retrieve a season request by its ID.
+        Retrieve episode numbers associated with a given torrent ID.
 
-        :param season_request_id: The ID of the season request.
-        :return: A SeasonRequest object.
-        :raises NotFoundError: If the season request is not found.
+        :param torrent_id: The ID of the torrent.
+        :return: A list of EpisodeNumber objects.
         :raises SQLAlchemyError: If a database error occurs.
         """
         try:
-            request = self.db.get(SeasonRequest, season_request_id)
-            if not request:
-                log.warning(f"Season request with id {season_request_id} not found.")
-                msg = f"Season request with id {season_request_id} not found."
-                raise NotFoundError(msg)
-            return SeasonRequestSchema.model_validate(request)
-        except SQLAlchemyError:
-            log.exception(
-                f"Database error retrieving season request {season_request_id}"
+            stmt = (
+                select(Episode.number)
+                .join(EpisodeFile, EpisodeFile.episode_id == Episode.id)
+                .where(EpisodeFile.torrent_id == torrent_id)
+                .order_by(Episode.number)
+            )
+            episode_numbers = self.db.execute(stmt).scalars().all()
+            return [EpisodeNumber(n) for n in sorted(set(episode_numbers))]
+        except SQLAlchemyError as e:
+            log.error(
+                f"Database error retrieving episodes for torrent_id {torrent_id}: {e}"
             )
             raise
@@ -731,14 +684,21 @@ class TvRepository:
         if updated:
             self.db.commit()
             self.db.refresh(db_season)
+            log.debug(
+                f"Updating existing season {db_season.number} for show {db_season.show.name}"
+            )
         return SeasonSchema.model_validate(db_season)
 
     def update_episode_attributes(
-        self, episode_id: EpisodeId, title: str | None = None
+        self,
+        episode_id: EpisodeId,
+        title: str | None = None,
+        overview: str | None = None,
     ) -> EpisodeSchema:
         """
         Update attributes of an existing episode.
 
+        :param overview: The new overview for the episode.
         :param episode_id: The ID of the episode to update.
         :param title: The new title for the episode.
         :param external_id: The new external ID for the episode.
@@ -755,8 +715,12 @@ class TvRepository:
             if title is not None and db_episode.title != title:
                 db_episode.title = title
                 updated = True
+            if overview is not None and db_episode.overview != overview:
+                db_episode.overview = overview
+                updated = True
             if updated:
                 self.db.commit()
                 self.db.refresh(db_episode)
+                log.info(f"Updating existing episode {db_episode.number}")
         return EpisodeSchema.model_validate(db_episode)

View File

@@ -1,10 +1,7 @@
 from pathlib import Path
-from typing import Annotated
 
 from fastapi import APIRouter, Depends, HTTPException, status
 
-from media_manager.auth.db import User
-from media_manager.auth.schemas import UserRead
 from media_manager.auth.users import current_active_user, current_superuser
 from media_manager.config import LibraryItem, MediaManagerConfig
 from media_manager.exceptions import MediaAlreadyExistsError, NotFoundError
@@ -17,24 +14,18 @@ from media_manager.metadataProvider.schemas import MetaDataProviderSearchResult
 from media_manager.schemas import MediaImportSuggestion
 from media_manager.torrent.schemas import Torrent
 from media_manager.torrent.utils import get_importable_media_directories
-from media_manager.tv import log
 from media_manager.tv.dependencies import (
     season_dep,
     show_dep,
     tv_service_dep,
 )
 from media_manager.tv.schemas import (
-    CreateSeasonRequest,
+    PublicEpisodeFile,
-    PublicSeasonFile,
     PublicShow,
-    RichSeasonRequest,
     RichShowTorrent,
     Season,
-    SeasonRequest,
-    SeasonRequestId,
     Show,
     ShowId,
-    UpdateSeasonRequest,
 )
 
 router = APIRouter()
@@ -278,110 +269,6 @@ def get_a_shows_torrents(show: show_dep, tv_service: tv_service_dep) -> RichShow
     return tv_service.get_torrents_for_show(show=show)
 
 
-# -----------------------------------------------------------------------------
-# SEASONS - REQUESTS
-# -----------------------------------------------------------------------------
-
-
-@router.get(
-    "/seasons/requests",
-    status_code=status.HTTP_200_OK,
-    dependencies=[Depends(current_active_user)],
-)
-def get_season_requests(tv_service: tv_service_dep) -> list[RichSeasonRequest]:
-    """
-    Get all season requests.
-    """
-    return tv_service.get_all_season_requests()
-
-
-@router.post("/seasons/requests", status_code=status.HTTP_204_NO_CONTENT)
-def request_a_season(
-    user: Annotated[User, Depends(current_active_user)],
-    season_request: CreateSeasonRequest,
-    tv_service: tv_service_dep,
-) -> None:
-    """
-    Create a new season request.
-    """
-    request: SeasonRequest = SeasonRequest.model_validate(season_request)
-    request.requested_by = UserRead.model_validate(user)
-    if user.is_superuser:
-        request.authorized = True
-        request.authorized_by = UserRead.model_validate(user)
-    tv_service.add_season_request(request)
-    return
-
-
-@router.put("/seasons/requests", status_code=status.HTTP_204_NO_CONTENT)
-def update_request(
-    tv_service: tv_service_dep,
-    user: Annotated[User, Depends(current_active_user)],
-    season_request: UpdateSeasonRequest,
-) -> None:
-    """
-    Update an existing season request.
-    """
-    updated_season_request: SeasonRequest = SeasonRequest.model_validate(season_request)
-    request = tv_service.get_season_request_by_id(
-        season_request_id=updated_season_request.id
-    )
-    if request.requested_by.id == user.id or user.is_superuser:
-        updated_season_request.requested_by = UserRead.model_validate(user)
-        tv_service.update_season_request(season_request=updated_season_request)
-    return
-
-
-@router.patch(
-    "/seasons/requests/{season_request_id}", status_code=status.HTTP_204_NO_CONTENT
-)
-def authorize_request(
-    tv_service: tv_service_dep,
-    user: Annotated[User, Depends(current_superuser)],
-    season_request_id: SeasonRequestId,
-    authorized_status: bool = False,
-) -> None:
-    """
-    Authorize or de-authorize a season request.
-    """
-    season_request = tv_service.get_season_request_by_id(
-        season_request_id=season_request_id
-    )
-    if not season_request:
-        raise NotFoundError
-    season_request.authorized_by = UserRead.model_validate(user)
-    season_request.authorized = authorized_status
-    if not authorized_status:
-        season_request.authorized_by = None
-    tv_service.update_season_request(season_request=season_request)
-
-
-@router.delete(
-    "/seasons/requests/{request_id}",
-    status_code=status.HTTP_204_NO_CONTENT,
-)
-def delete_season_request(
-    tv_service: tv_service_dep,
-    user: Annotated[User, Depends(current_active_user)],
-    request_id: SeasonRequestId,
-) -> None:
-    """
-    Delete a season request.
-    """
-    request = tv_service.get_season_request_by_id(season_request_id=request_id)
-    if user.is_superuser or request.requested_by.id == user.id:
-        tv_service.delete_season_request(season_request_id=request_id)
-        log.info(f"User {user.id} deleted season request {request_id}.")
-        return
-    log.warning(
-        f"User {user.id} tried to delete season request {request_id} but is not authorized."
-    )
-    raise HTTPException(
-        status_code=status.HTTP_403_FORBIDDEN,
-        detail="Not authorized to delete this request",
-    )
-
-
 # -----------------------------------------------------------------------------
 # SEASONS
 # -----------------------------------------------------------------------------
@@ -402,13 +289,13 @@ def get_season(season: season_dep) -> Season:
     "/seasons/{season_id}/files",
     dependencies=[Depends(current_active_user)],
 )
-def get_season_files(
+def get_episode_files(
     season: season_dep, tv_service: tv_service_dep
-) -> list[PublicSeasonFile]:
+) -> list[PublicEpisodeFile]:
     """
-    Get files associated with a specific season.
+    Get episode files associated with a specific season.
     """
-    return tv_service.get_public_season_files_by_season_id(season=season)
+    return tv_service.get_public_episode_files_by_season_id(season=season)
 
 
 # -----------------------------------------------------------------------------

View File

@@ -2,9 +2,8 @@ import typing
 import uuid
 from uuid import UUID
 
-from pydantic import BaseModel, ConfigDict, Field, model_validator
+from pydantic import BaseModel, ConfigDict, Field
 
-from media_manager.auth.schemas import UserRead
 from media_manager.torrent.models import Quality
 from media_manager.torrent.schemas import TorrentId, TorrentStatus
 
@@ -14,7 +13,6 @@ EpisodeId = typing.NewType("EpisodeId", UUID)
 SeasonNumber = typing.NewType("SeasonNumber", int)
 EpisodeNumber = typing.NewType("EpisodeNumber", int)
-SeasonRequestId = typing.NewType("SeasonRequestId", UUID)
 
 
 class Episode(BaseModel):
@@ -24,6 +22,7 @@ class Episode(BaseModel):
     number: EpisodeNumber
     external_id: int
     title: str
+    overview: str | None = None
 
 
 class Season(BaseModel):
@@ -62,52 +61,16 @@ class Show(BaseModel):
     seasons: list[Season]
 
 
-class SeasonRequestBase(BaseModel):
-    min_quality: Quality
-    wanted_quality: Quality
-
-    @model_validator(mode="after")
-    def ensure_wanted_quality_is_eq_or_gt_min_quality(self) -> "SeasonRequestBase":
-        if self.min_quality.value < self.wanted_quality.value:
-            msg = "wanted_quality must be equal to or lower than minimum_quality."
-            raise ValueError(msg)
-        return self
-
-
-class CreateSeasonRequest(SeasonRequestBase):
-    season_id: SeasonId
-
-
-class UpdateSeasonRequest(SeasonRequestBase):
-    id: SeasonRequestId
-
-
-class SeasonRequest(SeasonRequestBase):
+class EpisodeFile(BaseModel):
     model_config = ConfigDict(from_attributes=True)
-    id: SeasonRequestId = Field(default_factory=lambda: SeasonRequestId(uuid.uuid4()))
-    season_id: SeasonId
-    requested_by: UserRead | None = None
-    authorized: bool = False
-    authorized_by: UserRead | None = None
-
-
-class RichSeasonRequest(SeasonRequest):
-    show: Show
-    season: Season
-
-
-class SeasonFile(BaseModel):
-    model_config = ConfigDict(from_attributes=True)
-    season_id: SeasonId
+    episode_id: EpisodeId
     quality: Quality
     torrent_id: TorrentId | None
     file_path_suffix: str
 
 
-class PublicSeasonFile(SeasonFile):
+class PublicEpisodeFile(EpisodeFile):
     downloaded: bool = False
@@ -123,6 +86,7 @@ class RichSeasonTorrent(BaseModel):
     file_path_suffix: str
     seasons: list[SeasonNumber]
+    episodes: list[EpisodeNumber]
 
 
 class RichShowTorrent(BaseModel):
@@ -135,6 +99,18 @@ class RichShowTorrent(BaseModel):
     torrents: list[RichSeasonTorrent]
 
 
+class PublicEpisode(BaseModel):
+    model_config = ConfigDict(from_attributes=True)
+    id: EpisodeId
+    number: EpisodeNumber
+    downloaded: bool = False
+    title: str
+    overview: str | None = None
+    external_id: int
+
+
 class PublicSeason(BaseModel):
     model_config = ConfigDict(from_attributes=True)
@@ -147,7 +123,7 @@ class PublicSeason(BaseModel):
     external_id: int
-    episodes: list[Episode]
+    episodes: list[PublicEpisode]
 
 
 class PublicShow(BaseModel):

View File

@@ -7,9 +7,7 @@ from typing import overload
 from sqlalchemy.exc import IntegrityError
 
 from media_manager.config import MediaManagerConfig
-from media_manager.database import get_session
 from media_manager.exceptions import InvalidConfigError, NotFoundError, RenameError
-from media_manager.indexer.repository import IndexerRepository
 from media_manager.indexer.schemas import IndexerQueryResult, IndexerQueryResultId
 from media_manager.indexer.service import IndexerService
 from media_manager.indexer.utils import evaluate_indexer_query_results
@@ -19,13 +17,10 @@ from media_manager.metadataProvider.abstract_metadata_provider import (
 from media_manager.metadataProvider.schemas import MetaDataProviderSearchResult
 from media_manager.metadataProvider.tmdb import TmdbMetadataProvider
 from media_manager.metadataProvider.tvdb import TvdbMetadataProvider
-from media_manager.notification.repository import NotificationRepository
 from media_manager.notification.service import NotificationService
 from media_manager.schemas import MediaImportSuggestion
-from media_manager.torrent.repository import TorrentRepository
 from media_manager.torrent.schemas import (
     Quality,
-    QualityStrings,
     Torrent,
     TorrentStatus,
 )
@@ -41,21 +36,17 @@ from media_manager.torrent.utils import (
 from media_manager.tv import log
 from media_manager.tv.repository import TvRepository
 from media_manager.tv.schemas import (
-    Episode as EpisodeSchema,
-)
-from media_manager.tv.schemas import (
+    Episode,
+    EpisodeFile,
     EpisodeId,
+    EpisodeNumber,
+    PublicEpisodeFile,
     PublicSeason,
-    PublicSeasonFile,
     PublicShow,
-    RichSeasonRequest,
     RichSeasonTorrent,
     RichShowTorrent,
     Season,
-    SeasonFile,
     SeasonId,
-    SeasonRequest,
-    SeasonRequestId,
     Show,
     ShowId,
 )
@@ -94,28 +85,6 @@ class TvService:
             metadata_provider.download_show_poster_image(show=saved_show)
         return saved_show
 
-    def add_season_request(self, season_request: SeasonRequest) -> SeasonRequest:
-        """
-        Add a new season request.
-
-        :param season_request: The season request to add.
-        :return: The added season request.
-        """
-        return self.tv_repository.add_season_request(season_request=season_request)
-
-    def get_season_request_by_id(
-        self, season_request_id: SeasonRequestId
-    ) -> SeasonRequest | None:
-        """
-        Get a season request by its ID.
-
-        :param season_request_id: The ID of the season request.
-        :return: The season request or None if not found.
-        """
-        return self.tv_repository.get_season_request(
-            season_request_id=season_request_id
-        )
-
     def get_total_downloaded_episoded_count(self) -> int:
         """
         Get total number of downloaded episodes.
@@ -123,27 +92,9 @@ class TvService:
         """
         return self.tv_repository.get_total_downloaded_episodes_count()
 
-    def update_season_request(self, season_request: SeasonRequest) -> SeasonRequest:
-        """
-        Update an existing season request.
-
-        :param season_request: The season request to update.
-        :return: The updated season request.
-        """
-        self.tv_repository.delete_season_request(season_request_id=season_request.id)
-        return self.tv_repository.add_season_request(season_request=season_request)
-
     def set_show_library(self, show: Show, library: str) -> None:
         self.tv_repository.set_show_library(show_id=show.id, library=library)
 
-    def delete_season_request(self, season_request_id: SeasonRequestId) -> None:
-        """
-        Delete a season request by its ID.
-
-        :param season_request_id: The ID of the season request to delete.
-        """
-        self.tv_repository.delete_season_request(season_request_id=season_request_id)
-
     def delete_show(
         self,
         show: Show,
@@ -173,6 +124,7 @@ class TvService:
         for torrent in torrents:
             try:
                 self.torrent_service.cancel_download(torrent, delete_files=True)
+                self.torrent_service.delete_torrent(torrent_id=torrent.id)
                 log.info(f"Deleted torrent: {torrent.hash}")
             except Exception:
                 log.warning(
@@ -181,24 +133,26 @@ class TvService:
 
         self.tv_repository.delete_show(show_id=show.id)
 
-    def get_public_season_files_by_season_id(
+    def get_public_episode_files_by_season_id(
         self, season: Season
-    ) -> list[PublicSeasonFile]:
+    ) -> list[PublicEpisodeFile]:
         """
-        Get all public season files for a given season.
+        Get all public episode files for a given season.
 
         :param season: The season object.
-        :return: A list of public season files.
+        :return: A list of public episode files.
         """
-        season_files = self.tv_repository.get_season_files_by_season_id(
+        episode_files = self.tv_repository.get_episode_files_by_season_id(
             season_id=season.id
         )
-        public_season_files = [PublicSeasonFile.model_validate(x) for x in season_files]
+        public_episode_files = [
+            PublicEpisodeFile.model_validate(x) for x in episode_files
+        ]
         result = []
-        for season_file in public_season_files:
-            if self.season_file_exists_on_file(season_file=season_file):
-                season_file.downloaded = True
-            result.append(season_file)
+        for episode_file in public_episode_files:
+            if self.episode_file_exists_on_file(episode_file=episode_file):
+                episode_file.downloaded = True
+            result.append(episode_file)
         return result
 
     @overload
@@ -334,11 +288,27 @@ class TvService:
         :param show: The show object.
         :return: A public show.
         """
-        seasons = [PublicSeason.model_validate(season) for season in show.seasons]
-        for season in seasons:
-            season.downloaded = self.is_season_downloaded(season_id=season.id)
         public_show = PublicShow.model_validate(show)
-        public_show.seasons = seasons
+        public_seasons: list[PublicSeason] = []
+        for season in show.seasons:
+            public_season = PublicSeason.model_validate(season)
+            for episode in public_season.episodes:
+                episode.downloaded = self.is_episode_downloaded(
+                    episode=episode,
+                    season=season,
+                    show=show,
+                )
+            # A season is considered downloaded if it has episodes and all of them are downloaded,
+            # matching the behavior of is_season_downloaded.
+            public_season.downloaded = bool(public_season.episodes) and all(
+                episode.downloaded for episode in public_season.episodes
+            )
+            public_seasons.append(public_season)
+        public_show.seasons = public_seasons
         return public_show
 
     def get_show_by_id(self, show_id: ShowId) -> Show:
@@ -350,33 +320,85 @@ class TvService:
         """
         return self.tv_repository.get_show_by_id(show_id=show_id)
 
-    def is_season_downloaded(self, season_id: SeasonId) -> bool:
+    def is_season_downloaded(self, season: Season, show: Show) -> bool:
         """
         Check if a season is downloaded.
 
-        :param season_id: The ID of the season.
+        :param season: The season object.
+        :param show: The show object.
         :return: True if the season is downloaded, False otherwise.
         """
-        season_files = self.tv_repository.get_season_files_by_season_id(
-            season_id=season_id
-        )
-        for season_file in season_files:
-            if self.season_file_exists_on_file(season_file=season_file):
-                return True
-        return False
+        episodes = season.episodes
+        if not episodes:
+            return False
+        for episode in episodes:
+            if not self.is_episode_downloaded(
+                episode=episode, season=season, show=show
+            ):
+                return False
+        return True
 
-    def season_file_exists_on_file(self, season_file: SeasonFile) -> bool:
-        """
-        Check if a season file exists on the filesystem.
-
-        :param season_file: The season file to check.
+    def is_episode_downloaded(
+        self, episode: Episode, season: Season, show: Show
+    ) -> bool:
+        """
+        Check if an episode is downloaded and imported (file exists on disk).
+
+        An episode is considered downloaded if:
+        - There is at least one EpisodeFile in the database AND
+        - A matching episode file exists in the season directory on disk.
+
+        :param episode: The episode object.
+        :param season: The season object.
+        :param show: The show object.
+        :return: True if the episode is downloaded and imported, False otherwise.
+        """
+        episode_files = self.tv_repository.get_episode_files_by_episode_id(
+            episode_id=episode.id
+        )
+        if not episode_files:
+            return False
+
+        season_dir = self.get_root_season_directory(show, season.number)
+        if not season_dir.exists():
+            return False
+
+        episode_token = f"S{season.number:02d}E{episode.number:02d}"
+        video_extensions = {".mkv", ".mp4", ".avi", ".mov"}
+        try:
+            for file in season_dir.iterdir():
+                if (
+                    file.is_file()
+                    and episode_token.lower() in file.name.lower()
+                    and file.suffix.lower() in video_extensions
+                ):
+                    return True
+        except OSError as e:
+            log.error(
+                f"Disk check failed for episode {episode.id} in {season_dir}: {e}"
+            )
+        return False
+
+    def episode_file_exists_on_file(self, episode_file: EpisodeFile) -> bool:
+        """
+        Check if an episode file exists on the filesystem.
+
+        :param episode_file: The episode file to check.
         :return: True if the file exists, False otherwise.
         """
-        if season_file.torrent_id is None:
+        if episode_file.torrent_id is None:
             return True
         try:
             torrent_file = self.torrent_service.get_torrent_by_id(
-                torrent_id=season_file.torrent_id
+                torrent_id=episode_file.torrent_id
             )
             if torrent_file.imported:
@@ -409,13 +431,23 @@ class TvService:
         """
         return self.tv_repository.get_season(season_id=season_id)
 
-    def get_all_season_requests(self) -> list[RichSeasonRequest]:
+    def get_episode(self, episode_id: EpisodeId) -> Episode:
         """
-        Get all season requests.
+        Get an episode by its ID.
 
-        :return: A list of rich season requests.
+        :param episode_id: The ID of the episode.
+        :return: The episode.
         """
-        return self.tv_repository.get_season_requests()
+        return self.tv_repository.get_episode(episode_id=episode_id)
+
+    def get_season_by_episode(self, episode_id: EpisodeId) -> Season:
+        """
+        Get a season by the episode ID.
+
+        :param episode_id: The ID of the episode.
+        :return: The season.
+        """
+        return self.tv_repository.get_season_by_episode(episode_id=episode_id)
 
     def get_torrents_for_show(self, show: Show) -> RichShowTorrent:
         """
@@ -430,10 +462,16 @@ class TvService:
             seasons = self.tv_repository.get_seasons_by_torrent_id(
                 torrent_id=show_torrent.id
             )
-            season_files = self.torrent_service.get_season_files_of_torrent(
+            episodes = self.tv_repository.get_episodes_by_torrent_id(
+                torrent_id=show_torrent.id
+            )
+            episode_files = self.torrent_service.get_episode_files_of_torrent(
                 torrent=show_torrent
             )
-            file_path_suffix = season_files[0].file_path_suffix if season_files else ""
+            file_path_suffix = (
+                episode_files[0].file_path_suffix if episode_files else ""
+            )
             season_torrent = RichSeasonTorrent(
                 torrent_id=show_torrent.id,
                 torrent_title=show_torrent.title,
@@ -441,10 +479,12 @@ class TvService:
                 quality=show_torrent.quality,
                 imported=show_torrent.imported,
                 seasons=seasons,
+                episodes=episodes if len(seasons) == 1 else [],
                 file_path_suffix=file_path_suffix,
                 usenet=show_torrent.usenet,
             )
             rich_season_torrents.append(season_torrent)
         return RichShowTorrent(
             show_id=show.id,
             name=show.name,
@@ -487,95 +527,54 @@ class TvService:
             season = self.tv_repository.get_season_by_number(
                 season_number=season_number, show_id=show_id
             )
-            season_file = SeasonFile(
-                season_id=season.id,
+            episodes = {episode.number: episode.id for episode in season.episodes}
+            if indexer_result.episode:
+                episode_ids = []
+                missing_episodes = []
+                for ep_number in indexer_result.episode:
+                    ep_id = episodes.get(EpisodeNumber(ep_number))
+                    if ep_id is None:
+                        missing_episodes.append(ep_number)
+                        continue
+                    episode_ids.append(ep_id)
+                if missing_episodes:
+                    log.warning(
+                        "Some episodes from indexer result were not found in season %s "
+                        "for show %s and will be skipped: %s",
+                        season.id,
+                        show_id,
+                        ", ".join(str(ep) for ep in missing_episodes),
+                    )
+            else:
+                episode_ids = [episode.id for episode in season.episodes]
+            for episode_id in episode_ids:
+                episode_file = EpisodeFile(
+                    episode_id=episode_id,
                     quality=indexer_result.quality,
                     torrent_id=show_torrent.id,
                     file_path_suffix=override_show_file_path_suffix,
                 )
-                self.tv_repository.add_season_file(season_file=season_file)
+                self.tv_repository.add_episode_file(episode_file=episode_file)
         except IntegrityError:
             log.error(
-                f"Season file for season {season.id} and quality {indexer_result.quality} already exists, skipping."
+                f"Episode file for episode {episode_id} of season {season.id} and quality {indexer_result.quality} already exists, skipping."
             )
+            self.tv_repository.remove_episode_files_by_torrent_id(show_torrent.id)
             self.torrent_service.cancel_download(
                 torrent=show_torrent, delete_files=True
             )
             raise
         else:
             log.info(
-                f"Successfully added season files for torrent {show_torrent.title} and show ID {show_id}"
+                f"Successfully added episode files for torrent {show_torrent.title} and show ID {show_id}"
             )
             self.torrent_service.resume_download(torrent=show_torrent)
         return show_torrent
-    def download_approved_season_request(
-        self, season_request: SeasonRequest, show: Show
-    ) -> bool:
-        """
-        Download an approved season request.
-        :param season_request: The season request to download.
-        :param show: The Show object.
-        :return: True if the download was successful, False otherwise.
-        :raises ValueError: If the season request is not authorized.
-        """
-        if not season_request.authorized:
-            msg = f"Season request {season_request.id} is not authorized for download"
-            raise ValueError(msg)
-        log.info(f"Downloading approved season request {season_request.id}")
-        season = self.get_season(season_id=season_request.season_id)
-        torrents = self.get_all_available_torrents_for_a_season(
-            season_number=season.number, show_id=show.id
-        )
-        available_torrents: list[IndexerQueryResult] = []
-        for torrent in torrents:
-            if (
-                (torrent.quality.value < season_request.wanted_quality.value)
-                or (torrent.quality.value > season_request.min_quality.value)
-                or (torrent.seeders < 3)
-            ):
-                log.info(
-                    f"Skipping torrent {torrent.title} with quality {torrent.quality} for season {season.id}, because it does not match the requested quality {season_request.wanted_quality}"
-                )
-            elif torrent.season != [season.number]:
-                log.info(
-                    f"Skipping torrent {torrent.title} with quality {torrent.quality} for season {season.id}, because it contains to many/wrong seasons {torrent.season} (wanted: {season.number})"
-                )
-            else:
-                available_torrents.append(torrent)
-                log.info(
-                    f"Taking torrent {torrent.title} with quality {torrent.quality} for season {season.id} into consideration"
-                )
-        if len(available_torrents) == 0:
-            log.warning(
-                f"No torrents matching criteria were found (wanted quality: {season_request.wanted_quality}, min_quality: {season_request.min_quality} for season {season.id})"
-            )
-            return False
-        available_torrents.sort()
-        torrent = self.torrent_service.download(indexer_result=available_torrents[0])
-        season_file = SeasonFile(
-            season_id=season.id,
-            quality=torrent.quality,
-            torrent_id=torrent.id,
-            file_path_suffix=QualityStrings[torrent.quality.name].value.upper(),
-        )
-        try:
-            self.tv_repository.add_season_file(season_file=season_file)
-        except IntegrityError:
-            log.warning(
-                f"Season file for season {season.id} and quality {torrent.quality} already exists, skipping."
-            )
-        self.delete_season_request(season_request.id)
-        return True
     def get_root_show_directory(self, show: Show) -> Path:
         misc_config = MediaManagerConfig().misc
         show_directory_name = f"{remove_special_characters(show.name)} ({show.year}) [{show.metadata_provider}id-{show.external_id}]"
@@ -653,12 +652,12 @@ class TvService:
         video_files: list[Path],
         subtitle_files: list[Path],
         file_path_suffix: str = "",
-    ) -> tuple[bool, int]:
+    ) -> tuple[bool, list[Episode]]:
         season_path = self.get_root_season_directory(
             show=show, season_number=season.number
         )
         success = True
-        imported_episodes_count = 0
+        imported_episodes = []
         try:
             season_path.mkdir(parents=True, exist_ok=True)
         except Exception as e:
@@ -677,7 +676,7 @@ class TvService:
                     file_path_suffix=file_path_suffix,
                 )
                 if imported:
-                    imported_episodes_count += 1
+                    imported_episodes.append(episode)
             except Exception:
                 # Send notification about missing episode file
@@ -690,11 +689,72 @@ class TvService:
                 log.warning(
                     f"S{season.number}E{episode.number} not found when trying to import episode for show {show.name}."
                 )
-        return success, imported_episodes_count
+        return success, imported_episodes
-    def import_torrent_files(self, torrent: Torrent, show: Show) -> None:
+    def import_episode_files(
+        self,
+        show: Show,
+        season: Season,
+        episode: Episode,
+        video_files: list[Path],
+        subtitle_files: list[Path],
+        file_path_suffix: str = "",
+    ) -> bool:
+        episode_file_name = f"{remove_special_characters(show.name)} S{season.number:02d}E{episode.number:02d}"
+        if file_path_suffix != "":
+            episode_file_name += f" - {file_path_suffix}"
+        pattern = (
+            r".*[. ]S0?" + str(season.number) + r"E0?" + str(episode.number) + r"[. ].*"
+        )
+        subtitle_pattern = pattern + r"[. ]([A-Za-z]{2})[. ]srt"
+        target_file_name = (
+            self.get_root_season_directory(show=show, season_number=season.number)
+            / episode_file_name
+        )
+        # import subtitle
+        for subtitle_file in subtitle_files:
+            regex_result = re.search(
+                subtitle_pattern, subtitle_file.name, re.IGNORECASE
+            )
+            if regex_result:
+                language_code = regex_result.group(1)
+                target_subtitle_file = target_file_name.with_suffix(
+                    f".{language_code}.srt"
+                )
+                import_file(target_file=target_subtitle_file, source_file=subtitle_file)
+            else:
+                log.debug(
+                    f"Didn't find any pattern {subtitle_pattern} in subtitle file: {subtitle_file.name}"
+                )
+        found_video = False
+        # import episode videos
+        for file in video_files:
+            if re.search(pattern, file.name, re.IGNORECASE):
+                target_video_file = target_file_name.with_suffix(file.suffix)
+                import_file(target_file=target_video_file, source_file=file)
+                found_video = True
+                break
+        if not found_video:
+            # Send notification about missing episode file
+            if self.notification_service:
+                self.notification_service.send_notification_to_all_providers(
+                    title="Missing Episode File",
+                    message=f"No video file found for S{season.number:02d}E{episode.number:02d} for show {show.name}. Manual intervention may be required.",
+                )
+            log.warning(
+                f"File for S{season.number}E{episode.number} not found when trying to import episode for show {show.name}."
+            )
+            return False
+        return True
+
+    def import_episode_files_from_torrent(self, torrent: Torrent, show: Show) -> None:
         """
-        Organizes files from a torrent into the TV directory structure, mapping them to seasons and episodes.
+        Organizes episode files from a torrent into the TV directory structure, mapping them to seasons and episodes.
         :param torrent: The Torrent object
         :param show: The Show object
         """
@@ -707,34 +767,69 @@ class TvService:
             f"Importing these {len(video_files)} files:\n" + pprint.pformat(video_files)
         )
-        season_files = self.torrent_service.get_season_files_of_torrent(torrent=torrent)
+        episode_files = self.torrent_service.get_episode_files_of_torrent(
+            torrent=torrent
+        )
+        if not episode_files:
+            log.warning(
+                f"No episode files associated with torrent {torrent.title}, skipping import."
+            )
+            return
         log.info(
-            f"Found {len(season_files)} season files associated with torrent {torrent.title}"
+            f"Found {len(episode_files)} episode files associated with torrent {torrent.title}"
         )
-        for season_file in season_files:
-            season = self.get_season(season_id=season_file.season_id)
-            season_import_success, _imported_episodes_count = self.import_season(
+        imported_episodes_by_season: dict[int, list[int]] = {}
+        for episode_file in episode_files:
+            season = self.get_season_by_episode(episode_id=episode_file.episode_id)
+            episode = self.get_episode(episode_file.episode_id)
+            season_path = self.get_root_season_directory(
+                show=show, season_number=season.number
+            )
+            if not season_path.exists():
+                try:
+                    season_path.mkdir(parents=True)
+                except Exception as e:
+                    log.warning(f"Could not create path {season_path}: {e}")
+                    msg = f"Could not create path {season_path}"
+                    raise Exception(msg) from e  # noqa: TRY002
+            episoded_import_success = self.import_episode_files(
                 show=show,
                 season=season,
+                episode=episode,
                 video_files=video_files,
                 subtitle_files=subtitle_files,
-                file_path_suffix=season_file.file_path_suffix,
+                file_path_suffix=episode_file.file_path_suffix,
             )
-            success.append(season_import_success)
-            if season_import_success:
+            success.append(episoded_import_success)
+            if episoded_import_success:
+                imported_episodes_by_season.setdefault(season.number, []).append(
+                    episode.number
+                )
                 log.info(
-                    f"Season {season.number} successfully imported from torrent {torrent.title}"
+                    f"Episode {episode.number} from Season {season.number} successfully imported from torrent {torrent.title}"
                 )
             else:
                 log.warning(
-                    f"Season {season.number} failed to import from torrent {torrent.title}"
+                    f"Episode {episode.number} from Season {season.number} failed to import from torrent {torrent.title}"
                 )
-        log.info(
-            f"Finished importing files for torrent {torrent.title} {'without' if all(success) else 'with'} errors"
-        )
+        success_messages: list[str] = []
+        for season_number, episodes in imported_episodes_by_season.items():
+            episode_list = ",".join(str(e) for e in sorted(episodes))
+            success_messages.append(
+                f"Episode(s): {episode_list} from Season {season_number}"
+            )
+        episodes_summary = "; ".join(success_messages)
         if all(success):
             torrent.imported = True
             self.torrent_service.torrent_repository.save_torrent(torrent=torrent)
@@ -743,7 +838,11 @@ class TvService:
         if self.notification_service:
             self.notification_service.send_notification_to_all_providers(
                 title="TV Show imported successfully",
-                message=f"Successfully imported {show.name} ({show.year}) from torrent {torrent.title}.",
+                message=(
+                    f"Successfully imported {episodes_summary} "
+                    f"of {show.name} ({show.year}) "
+                    f"from torrent {torrent.title}."
+                ),
             )
         else:
             if self.notification_service:
@@ -752,6 +851,10 @@ class TvService:
                     message=f"Importing {show.name} ({show.year}) from torrent {torrent.title} completed with errors. Please check the logs for details.",
                 )
+        log.info(
+            f"Finished importing files for torrent {torrent.title} {'without' if all(success) else 'with'} errors"
+        )
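The per-season summary assembled above groups the successfully imported episode numbers by season before building the notification text. Sketched with plain ints standing in for the ORM objects:

```python
# Illustrative sketch of the summary string built in the diff above.
def summarize_imported(imported_by_season: dict[int, list[int]]) -> str:
    messages = []
    for season_number, episodes in imported_by_season.items():
        episode_list = ",".join(str(e) for e in sorted(episodes))
        messages.append(f"Episode(s): {episode_list} from Season {season_number}")
    return "; ".join(messages)
```

Episode numbers are sorted within each season so the notification reads naturally even when files were imported out of order.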
     def update_show_metadata(
         self, db_show: Show, metadata_provider: AbstractMetadataProvider
     ) -> Show | None:
@@ -798,9 +901,7 @@ class TvService:
                     existing_season = existing_season_external_ids[
                         fresh_season_data.external_id
                     ]
-                    log.debug(
-                        f"Updating existing season {existing_season.number} for show {db_show.name}"
-                    )
                     self.tv_repository.update_season_attributes(
                         season_id=existing_season.id,
                         name=fresh_season_data.name,
@@ -812,28 +913,28 @@ class TvService:
                         ep.external_id: ep for ep in existing_season.episodes
                     }
                     for fresh_episode_data in fresh_season_data.episodes:
-                        if fresh_episode_data.number in existing_episode_external_ids:
+                        if fresh_episode_data.external_id in existing_episode_external_ids:
                             # Update existing episode
                             existing_episode = existing_episode_external_ids[
                                 fresh_episode_data.external_id
                             ]
-                            log.debug(
-                                f"Updating existing episode {existing_episode.number} for season {existing_season.number}"
-                            )
                             self.tv_repository.update_episode_attributes(
                                 episode_id=existing_episode.id,
                                 title=fresh_episode_data.title,
+                                overview=fresh_episode_data.overview,
                             )
                         else:
                             # Add new episode
                             log.debug(
                                 f"Adding new episode {fresh_episode_data.number} to season {existing_season.number}"
                             )
-                            episode_schema = EpisodeSchema(
+                            episode_schema = Episode(
                                 id=EpisodeId(fresh_episode_data.id),
                                 number=fresh_episode_data.number,
                                 external_id=fresh_episode_data.external_id,
                                 title=fresh_episode_data.title,
+                                overview=fresh_episode_data.overview,
                             )
                             self.tv_repository.add_episode_to_season(
                                 season_id=existing_season.id, episode_data=episode_schema
@@ -844,11 +945,12 @@ class TvService:
                         f"Adding new season {fresh_season_data.number} to show {db_show.name}"
                     )
                     episodes_for_schema = [
-                        EpisodeSchema(
+                        Episode(
                             id=EpisodeId(ep_data.id),
                             number=ep_data.number,
                             external_id=ep_data.external_id,
                             title=ep_data.title,
+                            overview=ep_data.overview,
                         )
                         for ep_data in fresh_season_data.episodes
                     ]
@@ -867,7 +969,7 @@ class TvService:
         updated_show = self.tv_repository.get_show_by_id(show_id=db_show.id)
-        log.info(f"Successfully updated metadata for show ID: {db_show.id}")
+        log.info(f"Successfully updated metadata for show: {updated_show.name}")
         metadata_provider.download_show_poster_image(show=updated_show)
         return updated_show
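One hunk above fixes a subtle lookup bug: membership was tested with `fresh_episode_data.number` against a dict keyed by `external_id`, so matches only happened when the two values coincided. A minimal sketch of the corrected merge, with plain dicts standing in for the episode models (the helper name is illustrative):

```python
# Sketch: the membership test and the index must use the same key.
def split_updates_and_adds(
    existing: list[dict], fresh: list[dict]
) -> tuple[list[dict], list[dict]]:
    by_external_id = {ep["external_id"]: ep for ep in existing}
    updates, adds = [], []
    for ep in fresh:
        if ep["external_id"] in by_external_id:  # was ep["number"] before the fix
            updates.append(ep)
        else:
            adds.append(ep)
    return updates, adds
```

With the old `number` test, an episode whose number changed upstream (or whose numbers simply did not overlap with external IDs) would have been re-added instead of updated.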
@@ -911,21 +1013,22 @@ class TvService:
             directory=new_source_path
         )
         for season in tv_show.seasons:
-            success, imported_episode_count = self.import_season(
+            _success, imported_episodes = self.import_season(
                 show=tv_show,
                 season=season,
                 video_files=video_files,
                 subtitle_files=subtitle_files,
                 file_path_suffix="IMPORTED",
             )
-            season_file = SeasonFile(
-                season_id=season.id,
-                quality=Quality.unknown,
-                file_path_suffix="IMPORTED",
-                torrent_id=None,
-            )
-            if success or imported_episode_count > (len(season.episodes) / 2):
-                self.tv_repository.add_season_file(season_file=season_file)
+            for episode in imported_episodes:
+                episode_file = EpisodeFile(
+                    episode_id=episode.id,
+                    quality=Quality.unknown,
+                    file_path_suffix="IMPORTED",
+                    torrent_id=None,
+                )
+                self.tv_repository.add_episode_file(episode_file=episode_file)

     def get_importable_tv_shows(
         self, metadata_provider: AbstractMetadataProvider
@@ -959,104 +1062,34 @@ class TvService:
         log.debug(f"Detected {len(import_suggestions)} importable TV shows.")
         return import_suggestions
-def auto_download_all_approved_season_requests() -> None:
-    """
-    Auto download all approved season requests.
-    This is a standalone function as it creates its own DB session.
-    """
-    with next(get_session()) as db:
-        tv_repository = TvRepository(db=db)
-        torrent_service = TorrentService(torrent_repository=TorrentRepository(db=db))
-        indexer_service = IndexerService(indexer_repository=IndexerRepository(db=db))
-        notification_service = NotificationService(
-            notification_repository=NotificationRepository(db=db)
-        )
-        tv_service = TvService(
-            tv_repository=tv_repository,
-            torrent_service=torrent_service,
-            indexer_service=indexer_service,
-            notification_service=notification_service,
-        )
-        log.info("Auto downloading all approved season requests")
-        season_requests = tv_repository.get_season_requests()
-        log.info(f"Found {len(season_requests)} season requests to process")
-        count = 0
-        for season_request in season_requests:
-            if season_request.authorized:
-                log.info(f"Processing season request {season_request.id} for download")
-                show = tv_repository.get_show_by_season_id(
-                    season_id=season_request.season_id
-                )
-                if tv_service.download_approved_season_request(
-                    season_request=season_request, show=show
-                ):
-                    count += 1
-                else:
-                    log.warning(
-                        f"Failed to download season request {season_request.id} for show {show.name}"
-                    )
-        log.info(f"Auto downloaded {count} approved season requests")
-        db.commit()
-
-def import_all_show_torrents() -> None:
-    with next(get_session()) as db:
-        tv_repository = TvRepository(db=db)
-        torrent_service = TorrentService(torrent_repository=TorrentRepository(db=db))
-        indexer_service = IndexerService(indexer_repository=IndexerRepository(db=db))
-        notification_service = NotificationService(
-            notification_repository=NotificationRepository(db=db)
-        )
-        tv_service = TvService(
-            tv_repository=tv_repository,
-            torrent_service=torrent_service,
-            indexer_service=indexer_service,
-            notification_service=notification_service,
-        )
+    def import_all_torrents(self) -> None:
         log.info("Importing all torrents")
-        torrents = torrent_service.get_all_torrents()
+        torrents = self.torrent_service.get_all_torrents()
         log.info("Found %d torrents to import", len(torrents))
         for t in torrents:
+            show = None
             try:
                 if not t.imported and t.status == TorrentStatus.finished:
-                    show = torrent_service.get_show_of_torrent(torrent=t)
+                    show = self.torrent_service.get_show_of_torrent(torrent=t)
                     if show is None:
                         log.warning(
                             f"torrent {t.title} is not a tv torrent, skipping import."
                         )
                         continue
-                    tv_service.import_torrent_files(torrent=t, show=show)
-            except RuntimeError:
-                log.exception(f"Error importing torrent {t.title} for show {show.name}")
-        log.info("Finished importing all torrents")
-        db.commit()
+                    self.import_episode_files_from_torrent(torrent=t, show=show)
+            except RuntimeError as e:
+                show_name = show.name if show is not None else "<unknown>"
+                log.error(
+                    f"Error importing torrent {t.title} for show {show_name}: {e}",
                    exc_info=True,
+                )
+        log.info("Finished importing all torrents")

-def update_all_non_ended_shows_metadata() -> None:
-    """
-    Updates the metadata of all non-ended shows.
-    """
-    with next(get_session()) as db:
-        tv_repository = TvRepository(db=db)
-        tv_service = TvService(
-            tv_repository=tv_repository,
-            torrent_service=TorrentService(torrent_repository=TorrentRepository(db=db)),
-            indexer_service=IndexerService(indexer_repository=IndexerRepository(db=db)),
-            notification_service=NotificationService(
-                notification_repository=NotificationRepository(db=db)
-            ),
-        )
+    def update_all_non_ended_shows_metadata(self) -> None:
+        """Updates the metadata of all non-ended shows."""
         log.info("Updating metadata for all non-ended shows")
-        shows = [show for show in tv_repository.get_shows() if not show.ended]
+        shows = [show for show in self.tv_repository.get_shows() if not show.ended]
         log.info(f"Found {len(shows)} non-ended shows to update")
         for show in shows:
             try:
                 if show.metadata_provider == "tmdb":
@@ -1073,34 +1106,10 @@ def update_all_non_ended_shows_metadata() -> None:
                     f"Error initializing metadata provider {show.metadata_provider} for show {show.name}"
                 )
                 continue
-            updated_show = tv_service.update_show_metadata(
+            updated_show = self.update_show_metadata(
                 db_show=show, metadata_provider=metadata_provider
             )
-            # Automatically add season requests for new seasons
-            existing_seasons = [x.id for x in show.seasons]
-            new_seasons = [
-                x for x in updated_show.seasons if x.id not in existing_seasons
-            ]
-            if show.continuous_download:
-                for new_season in new_seasons:
-                    log.info(
-                        f"Automatically adding season request for new season {new_season.number} of show {updated_show.name}"
-                    )
-                    tv_service.add_season_request(
-                        SeasonRequest(
-                            min_quality=Quality.sd,
-                            wanted_quality=Quality.uhd,
-                            season_id=new_season.id,
-                            authorized=True,
-                        )
-                    )
             if updated_show:
-                log.debug(
-                    f"Added new seasons: {len(new_seasons)} to show: {updated_show.name}"
-                )
+                log.debug("Updated show metadata", extra={"show": updated_show.name})
             else:
                 log.warning(f"Failed to update metadata for show: {show.name}")
-        db.commit()
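`import_all_torrents` above hardens its loop in two small ways: `show` is pre-bound to `None` so the error path can always name it, and a failing torrent logs an error instead of aborting the remaining imports. The shape of that pattern, with illustrative stand-in callables in place of the service calls:

```python
# Hypothetical stand-ins for the service calls; only the control flow matters.
def import_all(torrents, get_show, do_import, errors: list[str]) -> None:
    for title in torrents:
        show = None  # pre-bind so the except block can reference it safely
        try:
            show = get_show(title)
            if show is None:
                continue  # not a TV torrent, skip it
            do_import(title, show)
        except RuntimeError as e:
            name = show if show is not None else "<unknown>"
            errors.append(f"Error importing torrent {title} for show {name}: {e}")
```

Without the pre-binding, a `RuntimeError` raised before `get_show` returned would make the logging line itself fail with an `UnboundLocalError`, which is exactly what the diff guards against.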


@@ -145,21 +145,30 @@ else
    echo "Config file found at: $CONFIG_FILE"
 fi

-# permission fix
-echo "Ensuring file permissions for mediamanager user..."
-
-chown -R mediamanager:mediamanager "$CONFIG_DIR"
-if [ -d "/data" ]; then
-    if [ "$(stat -c '%U' /data)" != "mediamanager" ]; then
-        echo "Fixing ownership of /data (this may take a while for large libraries)..."
-        chown -R mediamanager:mediamanager /data
+# check if running as root, if yes, fix permissions
+if [ "$(id -u)" = '0' ]; then
+    echo "Running as root. Ensuring file permissions for mediamanager user..."
+    chown -R mediamanager:mediamanager "$CONFIG_DIR"
+    if [ -d "/data" ]; then
+        if [ "$(stat -c '%U' /data)" != "mediamanager" ]; then
+            echo "Fixing ownership of /data (this may take a while for large media libraries)..."
+            chown -R mediamanager:mediamanager /data
+        else
+            echo "/data ownership is already correct."
+        fi
     fi
+else
+    echo "Running as non-root user ($(id -u)). Skipping permission fixes."
+    echo "Note: Ensure your host volumes are manually set to the correct permissions."
 fi

 echo "Running DB migrations..."
-gosu mediamanager uv run alembic upgrade head
+if [ "$(id -u)" = '0' ]; then
+    gosu mediamanager uv run alembic upgrade head
+else
+    uv run alembic upgrade head
+fi

 echo "Starting MediaManager backend service..."
 echo ""
@@ -172,9 +181,16 @@ echo ""
 DEVELOPMENT_MODE=${MEDIAMANAGER_MISC__DEVELOPMENT:-FALSE}
 PORT=${PORT:-8000}

 if [ "$DEVELOPMENT_MODE" == "TRUE" ]; then
     echo "Development mode is enabled, enabling auto-reload..."
-    exec gosu mediamanager uv run fastapi run /app/media_manager/main.py --port "$PORT" --proxy-headers --reload
+    DEV_OPTIONS="--reload"
 else
-    exec gosu mediamanager uv run fastapi run /app/media_manager/main.py --port "$PORT" --proxy-headers
+    DEV_OPTIONS=""
+fi
+
+if [ "$(id -u)" = '0' ]; then
+    exec gosu mediamanager uv run fastapi run /app/media_manager/main.py --port "$PORT" --proxy-headers $DEV_OPTIONS
+else
+    exec uv run fastapi run /app/media_manager/main.py --port "$PORT" --proxy-headers $DEV_OPTIONS
 fi
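The entrypoint changes above make every privileged step conditional on uid 0, so the image keeps working when started as an arbitrary non-root user. The same branch logic sketched in Python (command names taken from the script, the helper name is illustrative):

```python
import os

def migration_command() -> list[str]:
    # Drop to the service user via gosu only when we actually are root (uid 0);
    # a non-root container user runs the migration directly.
    base = ["uv", "run", "alembic", "upgrade", "head"]
    if os.getuid() == 0:
        return ["gosu", "mediamanager", *base]
    return base
```

The `gosu` prefix would fail for a non-root user (it requires root to switch users), which is why the script now branches instead of always prefixing it.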

metadata_relay/uv.lock (generated)

@@ -547,11 +547,11 @@ wheels = [
 [[package]]
 name = "python-multipart"
-version = "0.0.21"
+version = "0.0.22"
 source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/78/96/804520d0850c7db98e5ccb70282e29208723f0964e88ffd9d0da2f52ea09/python_multipart-0.0.21.tar.gz", hash = "sha256:7137ebd4d3bbf70ea1622998f902b97a29434a9e8dc40eb203bbcf7c2a2cba92", size = 37196, upload-time = "2025-12-17T09:24:22.446Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/94/01/979e98d542a70714b0cb2b6728ed0b7c46792b695e3eaec3e20711271ca3/python_multipart-0.0.22.tar.gz", hash = "sha256:7340bef99a7e0032613f56dc36027b959fd3b30a787ed62d310e951f7c3a3a58", size = 37612, upload-time = "2026-01-25T10:15:56.219Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/aa/76/03af049af4dcee5d27442f71b6924f01f3efb5d2bd34f23fcd563f2cc5f5/python_multipart-0.0.21-py3-none-any.whl", hash = "sha256:cf7a6713e01c87aa35387f4774e812c4361150938d20d232800f75ffcf266090", size = 24541, upload-time = "2025-12-17T09:24:21.153Z" },
+    { url = "https://files.pythonhosted.org/packages/1b/d0/397f9626e711ff749a95d96b7af99b9c566a9bb5129b8e4c10fc4d100304/python_multipart-0.0.22-py3-none-any.whl", hash = "sha256:2b2cd894c83d21bf49d702499531c7bafd057d730c201782048f7945d82de155", size = 24579, upload-time = "2026-01-25T10:15:54.811Z" },
 ]

 [[package]]
@@ -791,24 +791,24 @@ wheels = [
 [[package]]
 name = "urllib3"
-version = "2.6.2"
+version = "2.6.3"
 source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/1e/24/a2a2ed9addd907787d7aa0355ba36a6cadf1768b934c652ea78acbd59dcd/urllib3-2.6.2.tar.gz", hash = "sha256:016f9c98bb7e98085cb2b4b17b87d2c702975664e4f060c6532e64d1c1a5e797", size = 432930, upload-time = "2025-12-11T15:56:40.252Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/c7/24/5f1b3bdffd70275f6661c76461e25f024d5a38a46f04aaca912426a2b1d3/urllib3-2.6.3.tar.gz", hash = "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed", size = 435556, upload-time = "2026-01-07T16:24:43.925Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/6d/b9/4095b668ea3678bf6a0af005527f39de12fb026516fb3df17495a733b7f8/urllib3-2.6.2-py3-none-any.whl", hash = "sha256:ec21cddfe7724fc7cb4ba4bea7aa8e2ef36f607a4bab81aa6ce42a13dc3f03dd", size = 131182, upload-time = "2025-12-11T15:56:38.584Z" },
+    { url = "https://files.pythonhosted.org/packages/39/08/aaaad47bc4e9dc8c725e68f9d04865dbcb2052843ff09c97b08904852d84/urllib3-2.6.3-py3-none-any.whl", hash = "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4", size = 131584, upload-time = "2026-01-07T16:24:42.685Z" },
 ]

 [[package]]
 name = "uvicorn"
-version = "0.40.0"
+version = "0.41.0"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "click" },
     { name = "h11" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/c3/d1/8f3c683c9561a4e6689dd3b1d345c815f10f86acd044ee1fb9a4dcd0b8c5/uvicorn-0.40.0.tar.gz", hash = "sha256:839676675e87e73694518b5574fd0f24c9d97b46bea16df7b8c05ea1a51071ea", size = 81761, upload-time = "2025-12-21T14:16:22.45Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/32/ce/eeb58ae4ac36fe09e3842eb02e0eb676bf2c53ae062b98f1b2531673efdd/uvicorn-0.41.0.tar.gz", hash = "sha256:09d11cf7008da33113824ee5a1c6422d89fbc2ff476540d69a34c87fab8b571a", size = 82633, upload-time = "2026-02-16T23:07:24.1Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/3d/d8/2083a1daa7439a66f3a48589a57d576aa117726762618f6bb09fe3798796/uvicorn-0.40.0-py3-none-any.whl", hash = "sha256:c6c8f55bc8bf13eb6fa9ff87ad62308bbbc33d0b67f84293151efe87e0d5f2ee", size = 68502, upload-time = "2025-12-21T14:16:21.041Z" },
+    { url = "https://files.pythonhosted.org/packages/83/e4/d04a086285c20886c0daad0e026f250869201013d18f81d9ff5eada73a88/uvicorn-0.41.0-py3-none-any.whl", hash = "sha256:29e35b1d2c36a04b9e180d4007ede3bcb32a85fbdfd6c6aeb3f26839de088187", size = 68783, upload-time = "2026-02-16T23:07:22.357Z" },
 ]

 [package.optional-dependencies]
mkdocs.yml (new file)

@@ -0,0 +1,72 @@
site_name: "MediaManager Documentation"
theme:
  name: "material"
  logo: "assets/logo.svg"
  favicon: "assets/favicon.ico"
  features:
    - navigation.sections
    - navigation.expand
    - navigation.indexes
    - content.code.copy
    - navigation.footer
  palette:
    - scheme: default
      primary: indigo
      accent: indigo
      toggle:
        icon: material/brightness-7
        name: Switch to dark mode
    - scheme: slate
      primary: black
      accent: black
      toggle:
        icon: material/brightness-4
        name: Switch to light mode
markdown_extensions:
  - admonition
  - pymdownx.details
  - pymdownx.superfences
  - attr_list
  - md_in_html
  - pymdownx.snippets:
      base_path: ["."]
nav:
  - Welcome: index.md
  - Installation:
      - installation/README.md
      - Docker Compose: installation/docker.md
      - Nix Flakes [Community]: installation/flakes.md
  - Usage:
      - Importing existing media: importing-existing-media.md
  - Configuration:
      - configuration/README.md
      - Backend: configuration/backend.md
      - Authentication: configuration/authentication.md
      - Database: configuration/database.md
      - Download Clients: configuration/download-clients.md
      - Indexers: configuration/indexers.md
      - Scoring Rulesets: configuration/scoring-rulesets.md
      - Notifications: configuration/notifications.md
      - Custom Libraries: configuration/custom-libraries.md
      - Logging: configuration/logging.md
  - Advanced Features:
      - qBittorrent Category: advanced-features/qbittorrent-category.md
      - URL Prefix: advanced-features/url-prefix.md
      - Metadata Provider Configuration: advanced-features/metadata-provider-configuration.md
      - Custom port: advanced-features/custom-port.md
      - Follow symlinks in frontend files: advanced-features/follow-symlinks-in-frontend-files.md
      - Disable startup ascii art: advanced-features/disable-startup-ascii-art.md
  - Troubleshooting: troubleshooting.md
  - API Reference: api-reference.md
  - Screenshots: screenshots.md
  - Contributing to MediaManager:
      - Developer Guide: contributing-to-mediamanager/developer-guide.md
      - Documentation: contributing-to-mediamanager/documentation.md
extra:
  version:
    provider: mike
extra_css:
  - custom.css


@@ -13,7 +13,7 @@ dependencies = [
     "httpx-oauth>=0.16.1",
     "jsonschema>=4.24.0",
     "patool>=4.0.1",
-    "psycopg[binary]>=3.2.9",
+    "psycopg[binary,pool]>=3.2.9",
     "pydantic>=2.11.5",
     "pydantic-settings[toml]>=2.9.1",
     "python-json-logger>=3.3.0",
@@ -26,7 +26,9 @@ dependencies = [
     "typing-inspect>=0.9.0",
     "uvicorn>=0.34.2",
     "fastapi-utils>=0.8.0",
-    "apscheduler>=3.11.0",
+    "taskiq>=0.12.0",
+    "taskiq-fastapi>=0.4.0",
+    "taskiq-postgresql[psycopg]>=0.4.0",
     "alembic>=1.16.1",
     "pytest>=8.4.0",
     "pillow>=11.3.0",


@@ -41,4 +41,4 @@ ignore = [
 ]

 [lint.flake8-bugbear]
-extend-immutable-calls = ["fastapi.Depends", "fastapi.Path"]
+extend-immutable-calls = ["fastapi.Depends", "fastapi.Path", "taskiq.TaskiqDepends"]

uv.lock (generated): diff suppressed because it is too large.

Some files were not shown because too many files have changed in this diff.