Compare commits

...

70 Commits

Author SHA1 Message Date
dependabot[bot]
cd70ab8711 Bump actions/setup-python from 5 to 6
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5 to 6.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-13 17:08:28 +00:00
Maximilian Dorninger
51b8794e4d Merge pull request #411 from maxdorninger/Dependabot-auto-bump-deps
Configure Dependabot for multiple package ecosystems
2026-02-13 18:07:54 +01:00
Mark Riabov
0cfd1fa724 Fix suffix formatting for with_suffix call (#408)
Fixes the error `ValueError: Invalid suffix 'jpg'`

The bug completely prevented downloading posters from the metadata provider
2026-02-10 20:29:05 +01:00
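The error above comes from `pathlib.Path.with_suffix()`, which requires the suffix to start with a dot; passing `"jpg"` instead of `".jpg"` raises `ValueError: Invalid suffix 'jpg'`. A minimal sketch of the kind of guard such a fix needs (the helper name is illustrative, not the repo's actual code):

```python
from pathlib import Path

def normalized_poster_path(base: Path, suffix: str) -> Path:
    """Return `base` with `suffix` applied; with_suffix() demands a leading dot."""
    if not suffix.startswith("."):
        # "jpg" -> ".jpg", avoiding ValueError: Invalid suffix 'jpg'
        suffix = "." + suffix
    return base.with_suffix(suffix)
```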
Maximilian Dorninger
b5b297e99a add new sponsor syn (#405)
This PR adds the new sponsor, syn.
2026-02-08 20:10:06 +01:00
maxid
58414cadae update all links to docs 2026-02-08 19:47:17 +01:00
maxid
462794520e update docs workflow 2026-02-08 19:43:13 +01:00
maxid
59afba007d update docs workflow 2026-02-08 19:36:07 +01:00
Maximilian Dorninger
cfa303e4f3 Merge pull request #404 from maxdorninger/mkdocs
This PR replaces GitBook with MkDocs for the documentation
2026-02-08 19:27:15 +01:00
maxid
d3dde9c7eb add docs workflow 2026-02-08 19:22:34 +01:00
maxid
9c94ef6de0 convert gitbook files to mkdocs 2026-02-08 19:16:38 +01:00
Maximilian Dorninger
2665106847 Merge pull request #401 from maxdorninger/fix-env-variables
Fix download clients config being read from env variables
2026-02-08 16:37:15 +01:00
maxid
d029177fc0 hot fix: fix search tag name for episode in jackett 2026-02-04 23:52:07 +01:00
Maximilian Dorninger
1698c404cd Merge pull request #400 from maxdorninger/add-search-by-id-support-to-jackett
Add search by id support to jackett
2026-02-04 23:00:00 +01:00
maxid
abac894a95 fix download clients config being read from env variables without the mediamanager prefix 2026-02-04 22:49:24 +01:00
maxid
12854ff661 format files 2026-02-04 21:34:37 +01:00
maxid
3d52a87302 add id search capabilities to jackett 2026-02-04 21:34:31 +01:00
Maximilian Dorninger
9ee5cc6895 make the container user configurable (#399)
This PR makes the user the container runs as configurable. Before, the
container always tried stepping down (from root) to the mediamanager
user. Now it detects if it's already running as a non-root user and
starts the server directly. Fixes #397
2026-02-04 19:01:18 +01:00
Maximilian Dorninger
c45c9e5873 add correlation id to logging (#398)
This PR adds Correlation IDs to logs and request responses.

```
2026-02-04 12:40:32,793 - [afd825081d874d6e835b5c59a6ddb371] DEBUG - media_manager.movies - get_importable_movies(): Found 5 importable movies.
2026-02-04 12:40:32,794 - [afd825081d874d6e835b5c59a6ddb371] INFO - uvicorn.access - send(): 172.19.0.1:64094 - "GET /api/v1/movies/importable HTTP/1.1" 200
2026-02-04 12:40:47,322 - [41d30b7003fd45288c6a4bb1cfba5e7a] INFO - uvicorn.access - send(): 127.0.0.1:52964 - "GET /api/v1/health HTTP/1.1" 200
2026-02-04 12:41:17,408 - [157027ea5dde472a9e620f53739ccd53] INFO - uvicorn.access - send(): 127.0.0.1:39850 - "GET /api/v1/health HTTP/1.1" 200
```
2026-02-04 13:55:05 +01:00
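A common way to implement per-request correlation IDs like the bracketed hex strings in the log excerpt above is a `contextvars` variable plus a logging filter that stamps every record; this is a sketch under assumptions, not necessarily the PR's actual middleware:

```python
import contextvars
import logging
import uuid

# Holds the ID for the current request context; "-" outside any request.
correlation_id = contextvars.ContextVar("correlation_id", default="-")

class CorrelationIdFilter(logging.Filter):
    """Copy the current correlation ID onto every log record."""
    def filter(self, record: logging.LogRecord) -> bool:
        record.correlation_id = correlation_id.get()
        return True

def new_request_id() -> str:
    """Generate and activate a fresh ID, e.g. at the start of each HTTP request."""
    cid = uuid.uuid4().hex
    correlation_id.set(cid)
    return cid
```

A formatter such as `"%(asctime)s - [%(correlation_id)s] %(levelname)s - %(name)s - %(message)s"` would then produce output in the shape shown above.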
Sergey Khruschak
24fcba6bee Torrent file name sanitizing (#390)
Hi, I've added file name sanitization when saving the torrent file, as
previously the import failed on torrents with special characters in their
names. This fixes #367
2026-02-03 17:09:36 +01:00
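A sanitizer like the one described might look like this; the exact character set and replacement are assumptions, not the code from #390:

```python
import re

# Characters that are invalid or risky in file names on common filesystems.
_UNSAFE = re.compile(r'[<>:"/\\|?*\x00-\x1f]')

def sanitize_filename(name: str, replacement: str = "_") -> str:
    """Replace unsafe characters so the torrent file can be saved to disk."""
    cleaned = _UNSAFE.sub(replacement, name).strip(" .")
    return cleaned or "unnamed"
```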
Maximilian Dorninger
d5994a9037 Fix docker permission issues (#395)
This PR fixes docker permission issues by first starting as root and
then chown-ing all the volumes. This should fix #388 #389
2026-02-03 13:06:18 +01:00
just_Bri
9e0d0c03c0 feat: add links to media detail pages in requests and torrent tables (#352)
Feature Request: https://github.com/maxdorninger/MediaManager/issues/351

[feat: add links to media detail pages in requests and torrent
tables](ac376c0d6d)
2026-02-02 22:48:14 +01:00
Maximilian Dorninger
70ff8f6ace Fix the broken link to the disable ascii art page (#396)
Fix the broken link to the disable ascii art page
2026-02-02 22:22:11 +01:00
Maximilian Dorninger
e347219721 Merge pull request #394 from juandbc/fix-torznab-process-and-jackett-movies-search
Fix torznab process and jackett movies search
2026-02-02 17:42:49 +01:00
strangeglyph
72a626cb1a Add flag to disable startup ascii art (#369)
Adds an environment variable to disable the colorized splash screen.
2026-02-02 17:39:47 +01:00
Juan David Bermudez Celedon
a1f3f92c10 Enhance size validation for indexer results 2026-02-01 22:14:04 -05:00
Juan David Bermudez Celedon
caaa08fbf4 Fix typo in Jackett log for search_movie 2026-02-01 22:01:42 -05:00
Juan David Bermudez Celedon
5db60141bb Fix bug by typo in jackett log message (#387)
Fix a typo in the `search_season` function's log message, which caused an error when searching for torrents.
2026-02-01 18:09:18 +01:00
Marcel Hellwig
96b84d45db Adding some more new lints (#393)
Enable `UP` and `TRY` lint
2026-02-01 18:04:15 +01:00
Marcel Hellwig
311e625eee two hotfixes (#392)
fixes issues that prevented the app from running correctly
2026-02-01 17:42:15 +01:00
maxidorninger
e22e0394bd GITBOOK-19: No subject 2026-01-09 20:13:39 +00:00
maxid
6377aa8b83 revert "add digital ocean attribution" in GitBook 2026-01-09 21:02:19 +01:00
Maximilian Dorninger
8855204930 add digital ocean attribution (#368) 2026-01-09 20:54:47 +01:00
maxidorninger
7a13326d87 GITBOOK-16: No subject 2026-01-07 19:10:20 +00:00
maxidorninger
15e9cd001f GITBOOK-15: No subject 2026-01-07 18:59:43 +00:00
maxidorninger
e52b84c3c7 GITBOOK-14: No subject 2026-01-07 18:58:37 +00:00
maxidorninger
84a430651f GITBOOK-13: No subject 2026-01-07 18:57:25 +00:00
maxidorninger
463e6914e3 GITBOOK-12: No subject 2026-01-07 18:56:20 +00:00
strangeglyph
e5e85077ae docs: add installation instructions for nix flake (#361)
Following the discussion in #329 and #115, here's a doc section on using
nix flakes to install MediaManager.

Co-authored-by: lschuetze <lschuetze@mpi-sws.org>
2026-01-07 19:45:47 +01:00
Maximilian Dorninger
a39e0d204a Ruff enable type annotations rule (#362)
This PR enables the ruff rule for return type annotations (ANN), and
adds the ty package for type checking.
2026-01-06 17:07:19 +01:00
Renan Greca
dd0b439bbe Fix logging bug in jackett indexer (#360)
fix MediaManager trying to access a non-existent attribute
2026-01-06 14:49:06 +01:00
Maximilian Dorninger
732b9c0970 make installation guides always link to files of latest release (#359)
make installation guides always link to files of latest release
2026-01-06 11:49:36 +01:00
Maximilian Dorninger
57028991df Merge pull request #341 from hellow554/ruff
enable more Ruff lints
2026-01-05 23:15:38 +01:00
maxid
d5c41430a6 add back hello world message 2026-01-05 23:05:46 +01:00
Maximilian Dorninger
5db3560e9a fix readme 2026-01-05 21:46:12 +01:00
Maximilian Dorninger
13ed291dd4 Revise MediaManager overview (#358)
Updated the description and key features of MediaManager.
2026-01-05 21:44:32 +01:00
Marcel Hellwig
75406cbc64 ruff: add RET lint
lints about assigning a variable and immediately returning it
2026-01-05 19:30:42 +01:00
Marcel Hellwig
805a6981a6 ruff: enable PTH lint 2026-01-05 19:30:42 +01:00
Marcel Hellwig
acd883df21 ruff: enable PIE lint
this just removes needless `pass` statements or `...`
2026-01-05 19:30:42 +01:00
Marcel Hellwig
f2141ca8b8 ruff: enable PERF lint
this complains a lot about using manual append where a list
comprehension would be more suitable
2026-01-05 19:30:42 +01:00
Marcel Hellwig
7182344036 create list from range directly instead of using append 2026-01-05 19:30:42 +01:00
Marcel Hellwig
a34b0f11a6 use single s since we're ignoring cases anyway 2026-01-05 19:30:42 +01:00
Marcel Hellwig
40812c6040 omit return_type in computed field
it's calculated from the function's return type, so there's no need to
specify it
2026-01-05 19:30:42 +01:00
Marcel Hellwig
29476e2008 ruff: enable INT and N lint
this renames some files to use snake_case and adds an Error suffix to custom
exceptions
2026-01-05 19:30:42 +01:00
Marcel Hellwig
29a0d8fe5d ruff: add INP lint
this checks for missing __init__.py files; there was one :)
2026-01-05 19:30:42 +01:00
Marcel Hellwig
55b2dd63d8 ruff: add ARG linter
this mostly either removes unused parameters, prefixes them with an
underscore, or uses the @override decorator to tell the linter that the
method comes from a superclass and can't be changed
2026-01-05 19:30:42 +01:00
Marcel Hellwig
6e46b482cb ruff: enable A lint 2026-01-05 19:30:42 +01:00
Marcel Hellwig
7824828bea ruff: enable T20 lint
and remove a print hello world :)
2026-01-05 19:30:42 +01:00
Marcel Hellwig
5368cad77a ruff: add S linter
this mostly adds a timeout=60 to all requests

The S linter mainly wants a timeout on all `requests` calls, since without
one they can hang indefinitely. I added a timeout of 60s, which is probably
way too high, but since there was none before, I guess it's an improvement?
2026-01-05 19:30:42 +01:00
Marcel Hellwig
1857cf501c ruff: enable RUF lint 2026-01-05 19:30:42 +01:00
Marcel Hellwig
a7bb5e1e04 Make proper use of function overloading
In preparation of the RUFF lint, I rewrote the function to use
typing.overload.
This is the proper way to accept either one or two arguments
2026-01-05 19:30:42 +01:00
Marcel Hellwig
ff013ac76e ruff: enable I lint 2026-01-05 19:30:42 +01:00
Marcel Hellwig
42502c93fc ruff: enable ISC lint 2026-01-05 19:30:42 +01:00
Marcel Hellwig
eac58d2843 ruff: enable FAST lint
this mostly replaces the response_model attribute with a return type
annotation on the function, since that's the more idiomatic way to do it
2026-01-05 19:30:42 +01:00
Marcel Hellwig
97cb3b5c1e ruff: enable EM lint 2026-01-05 19:30:42 +01:00
Marcel Hellwig
7ef4e52c81 ruff: enable C4 lint 2026-01-05 19:30:42 +01:00
Marcel Hellwig
2c36adfd75 ruff: Enable B lint 2026-01-05 19:30:42 +01:00
Marcel Hellwig
0f272052b3 ruff: enable lints that do not complain right now 2026-01-05 19:30:42 +01:00
Marcel Hellwig
0b4b84a4aa add ruff as dev-dependency
since it is mentioned in the development doc, it makes sense to install
it as such
2026-01-05 19:30:34 +01:00
Marcel Hellwig
9ff2dc4b92 rewrite downlaod_post_image function
this now uses the proper path-handling functions instead of manipulating strings
2026-01-05 19:30:00 +01:00
Marcel Hellwig
593e1828cc remove pillow-avif package
since Pillow 11.3 it is possible to use AVIF in the main pillow package,
so the separate avif package is no longer needed

https://github.com/python-pillow/Pillow/pull/5201#issuecomment-3023668716
2026-01-05 19:29:52 +01:00
153 changed files with 2621 additions and 1970 deletions

View File

@@ -53,5 +53,5 @@ YOUR CONFIG HERE
```
- [ ] I understand, that without logs and/or screenshots and a detailed description of the problem, it is very hard to fix bugs.
- [ ] I have checked the [documentation](https://maximilian-dorninger.gitbook.io/mediamanager) for help.
- [ ] I have checked the [documentation](https://maxdorninger.github.io/MediaManager/) for help.
- [ ] I have searched the [issues](https://github.com/maxdorninger/MediaManager/issues) for similar issues and found none.

.github/dependabot.yml vendored Normal file
View File

@@ -0,0 +1,25 @@
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file
version: 2
updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
open-pull-requests-limit: 5
- package-ecosystem: "npm"
directory: "/web"
schedule:
interval: "weekly"
open-pull-requests-limit: 5
- package-ecosystem: "uv"
directory: "/"
schedule:
interval: "weekly"
open-pull-requests-limit: 5

.github/workflows/docs.yml vendored Normal file
View File

@@ -0,0 +1,62 @@
name: Publish docs via GitHub Pages
on:
push:
branches:
- master
tags:
- v*
workflow_dispatch:
inputs:
set_default_alias:
description: 'Alias to set as default (e.g. latest, master)'
required: false
default: 'latest'
permissions:
contents: write
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Configure Git Credentials
run: |
git config user.name github-actions[bot]
git config user.email 41898282+github-actions[bot]@users.noreply.github.com
- uses: actions/setup-python@v6
with:
python-version: 3.x
- run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV
- uses: actions/cache@v4
with:
key: mkdocs-material-${{ env.cache_id }}
path: .cache
restore-keys: |
mkdocs-material-
- name: Install dependencies
run: pip install mkdocs-material mike
- name: Deploy (master)
if: github.ref == 'refs/heads/master'
run: |
mike deploy --push --update-aliases master
- name: Deploy (tag)
if: startsWith(github.ref, 'refs/tags/v')
run: |
version=${GITHUB_REF#refs/tags/}
mike deploy --push --update-aliases $version latest --title "$version"
mike set-default --push latest
- name: Set Default (Manual)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.set_default_alias != ''
run: |
mike set-default --push ${{ github.event.inputs.set_default_alias }}

.gitignore vendored
View File

@@ -49,5 +49,5 @@ __pycache__
# Postgres
/postgres
# Node modules
/node_modules/*
# MkDocs
site/

View File

@@ -18,7 +18,7 @@ Generally, if you have any questions or need help on the implementation side of
just ask in the issue, or in a draft PR.
Also, see the contribution guide in the docs for information on how to setup the dev environment:
https://maximilian-dorninger.gitbook.io/mediamanager
https://maxdorninger.github.io/MediaManager/
### For something that is a one or two line fix:

View File

@@ -13,7 +13,7 @@ RUN env PUBLIC_VERSION=${VERSION} PUBLIC_API_URL=${BASE_PATH} BASE_PATH=${BASE_P
FROM ghcr.io/astral-sh/uv:python3.13-trixie-slim AS base
RUN apt-get update && \
apt-get install -y ca-certificates bash libtorrent21 gcc bc locales postgresql media-types mailcap curl gzip unzip tar 7zip bzip2 unar && \
apt-get install -y ca-certificates bash libtorrent21 gcc bc locales postgresql media-types mailcap curl gzip unzip tar 7zip bzip2 unar gosu && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*
@@ -33,7 +33,6 @@ RUN chown -R mediamanager:mediamanager /app
USER mediamanager
# Set uv cache to a writable home directory and use copy mode for volume compatibility
ENV UV_CACHE_DIR=/home/mediamanager/.cache/uv \
UV_LINK_MODE=copy
@@ -47,6 +46,7 @@ ARG BASE_PATH=""
LABEL author="github.com/maxdorninger"
LABEL version=${VERSION}
LABEL description="Docker image for MediaManager"
USER root
ENV PUBLIC_VERSION=${VERSION} \
CONFIG_DIR="/app/config" \

View File

@@ -1,7 +1,7 @@
<br />
<div align="center">
<a href="https://maximilian-dorninger.gitbook.io/mediamanager">
<img src="https://raw.githubusercontent.com/maxdorninger/MediaManager/refs/heads/master/Writerside/images/logo.svg" alt="Logo" width="260" height="260">
<a href="https://maxdorninger.github.io/MediaManager/">
<img src="https://raw.githubusercontent.com/maxdorninger/MediaManager/refs/heads/master/web/static/logo.svg" alt="Logo" width="260" height="260">
</a>
<h3 align="center">MediaManager</h3>
@@ -9,7 +9,7 @@
<p align="center">
Modern management system for your media library
<br />
<a href="https://maximilian-dorninger.gitbook.io/mediamanager"><strong>Explore the docs »</strong></a>
<a href="https://maxdorninger.github.io/MediaManager/"><strong>Explore the docs »</strong></a>
<br />
<a href="https://github.com/maxdorninger/MediaManager/issues/new?labels=bug&template=bug_report.md">Report Bug</a>
&middot;
@@ -18,26 +18,24 @@
</div>
MediaManager is a modern software to manage your TV and movie library. It is designed to be a replacement for Sonarr,
Radarr, Overseer, and Jellyseer.
It supports TVDB and TMDB for metadata, supports OIDC and OAuth 2.0 for authentication and supports Prowlarr and
Jackett.
MediaManager is built first and foremost for deployment with Docker, making it easy to set up.
MediaManager is the modern, easy-to-use successor to the fragmented "Arr" stack. Manage, discover, and automate your TV and movie collection in a single, simple interface.
It also provides an API to interact with the software programmatically, allowing for automation and integration with
other services.
Key features:
- support for OAuth/OIDC
- support for TVDB and TMDB
- made to be deployed with Docker
## Quick Start
```sh
wget -O docker-compose.yaml https://raw.githubusercontent.com/maxdorninger/MediaManager/refs/heads/master/docker-compose.yaml
wget -O docker-compose.yaml https://github.com/maxdorninger/MediaManager/releases/latest/download/docker-compose.yaml
mkdir config
wget -O ./config/config.toml https://raw.githubusercontent.com/maxdorninger/MediaManager/refs/heads/master/config.example.toml
wget -O ./config/config.toml https://github.com/maxdorninger/MediaManager/releases/latest/download/config.example.toml
# you probably need to edit the config.toml file in the ./config directory, for more help see the documentation
docker compose up -d
```
### [View the docs for installation instructions and more](https://maximilian-dorninger.gitbook.io/mediamanager)
### [View the docs for installation instructions and more](https://maxdorninger.github.io/MediaManager/)
## Support MediaManager
@@ -62,6 +60,7 @@ docker compose up -d
<a href="https://buymeacoffee.com/maxdorninger"><img src="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/DEBBB9/JO.png" width="80px" alt="Josh" /></a>&nbsp;&nbsp;
<a href="https://buymeacoffee.com/maxdorninger"><img src="https://cdn.buymeacoffee.com/uploads/profile_pictures/2025/11/2VeQ8sTGPhj4tiLy.jpg" width="80px" alt="PuppiestDoggo" /></a>&nbsp;&nbsp;
<a href="https://github.com/seferino-fernandez"><img src="https://avatars.githubusercontent.com/u/5546622" width="80px" alt="Seferino" /></a>&nbsp;&nbsp;
<a href="https://buymeacoffee.com/maxdorninger"><img src="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/EC9689/SY.png" width="80px" alt="syn" /></a>&nbsp;&nbsp;
## Star History
@@ -82,7 +81,7 @@ docker compose up -d
## Developer Quick Start
For the developer guide see the [Developer Guide](https://maximilian-dorninger.gitbook.io/mediamanager).
For the developer guide see the [Developer Guide](https://maxdorninger.github.io/MediaManager/).
<!-- LICENSE -->
@@ -95,5 +94,9 @@ Distributed under the AGPL 3.0. See `LICENSE.txt` for more information.
## Acknowledgments
Thanks to DigitalOcean for sponsoring the project!
[![DigitalOcean Referral Badge](https://web-platforms.sfo2.cdn.digitaloceanspaces.com/WWW/Badge%201.svg)](https://www.digitalocean.com/?refcode=4edf05429dca&utm_campaign=Referral_Invite&utm_medium=Referral_Program&utm_source=badge)
* [Thanks to Pawel Czerwinski for the image on the login screen](https://unsplash.com/@pawel_czerwinski)

View File

@@ -1,13 +1,16 @@
import sys
sys.path = ["", ".."] + sys.path[1:]
sys.path = ["", "..", *sys.path[1:]]
from logging.config import fileConfig # noqa: E402
from sqlalchemy import ( # noqa: E402
engine_from_config,
pool,
)
from alembic import context # noqa: E402
from sqlalchemy import engine_from_config # noqa: E402
from sqlalchemy import pool # noqa: E402
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
@@ -23,34 +26,40 @@ if config.config_file_name is not None:
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
from media_manager.auth.db import User, OAuthAccount # noqa: E402
from media_manager.auth.db import OAuthAccount, User # noqa: E402
from media_manager.config import MediaManagerConfig # noqa: E402
from media_manager.database import Base # noqa: E402
from media_manager.indexer.models import IndexerQueryResult # noqa: E402
from media_manager.torrent.models import Torrent # noqa: E402
from media_manager.tv.models import Show, Season, Episode, SeasonFile, SeasonRequest # noqa: E402
from media_manager.movies.models import Movie, MovieFile, MovieRequest # noqa: E402
from media_manager.notification.models import Notification # noqa: E402
from media_manager.database import Base # noqa: E402
from media_manager.config import MediaManagerConfig # noqa: E402
from media_manager.torrent.models import Torrent # noqa: E402
from media_manager.tv.models import ( # noqa: E402
Episode,
Season,
SeasonFile,
SeasonRequest,
Show,
)
target_metadata = Base.metadata
# this is to keep pycharm from complaining about/optimizing unused imports
# noinspection PyStatementEffect
(
User,
OAuthAccount,
IndexerQueryResult,
Torrent,
Show,
Season,
Episode,
SeasonFile,
SeasonRequest,
Movie,
MovieFile,
MovieRequest,
Notification,
)
__all__ = [
"Episode",
"IndexerQueryResult",
"Movie",
"MovieFile",
"MovieRequest",
"Notification",
"OAuthAccount",
"Season",
"SeasonFile",
"SeasonRequest",
"Show",
"Torrent",
"User",
]
# other values from the config, defined by the needs of env.py,
@@ -60,19 +69,7 @@ target_metadata = Base.metadata
db_config = MediaManagerConfig().database
db_url = (
"postgresql+psycopg"
+ "://"
+ db_config.user
+ ":"
+ db_config.password
+ "@"
+ db_config.host
+ ":"
+ str(db_config.port)
+ "/"
+ db_config.dbname
)
db_url = f"postgresql+psycopg://{db_config.user}:{db_config.password}@{db_config.host}:{db_config.port}/{db_config.dbname}"
config.set_main_option("sqlalchemy.url", db_url)
@@ -109,7 +106,13 @@ def run_migrations_online() -> None:
"""
def include_object(object, name, type_, reflected, compare_to):
def include_object(
_object: object | None,
name: str | None,
type_: str | None,
_reflected: bool | None,
_compare_to: object | None,
) -> bool:
if type_ == "table" and name == "apscheduler_jobs":
return False
return True

View File

@@ -8,9 +8,9 @@ Create Date: 2025-12-13 18:47:02.146038
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "16e78af9e5bf"

View File

@@ -8,9 +8,9 @@ Create Date: 2025-07-16 01:09:44.045395
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "1801d9f5a275"

View File

@@ -8,9 +8,9 @@ Create Date: 2025-06-22 13:46:01.973406
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "1f340754640a"

View File

@@ -8,9 +8,9 @@ Create Date: 2025-07-06 10:49:08.814496
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "21a19f0675f9"

View File

@@ -8,9 +8,9 @@ Create Date: 2025-12-23 19:42:09.593945
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "2c61f662ca9e"

View File

@@ -8,9 +8,10 @@ Create Date: 2025-07-09 20:55:42.338629
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "333866afcd2c"
down_revision: Union[str, None] = "aa4689f80796"

View File

@@ -8,9 +8,9 @@ Create Date: 2025-07-16 23:24:37.931188
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "5299dfed220b"

View File

@@ -8,9 +8,9 @@ Create Date: 2025-06-10 21:25:27.871064
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "7508237d5bc2"

View File

@@ -8,10 +8,11 @@ Create Date: 2025-05-27 21:36:18.532068
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "93fb07842385"
down_revision: Union[str, None] = None

View File

@@ -8,9 +8,9 @@ Create Date: 2025-07-06 10:54:19.714809
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "aa4689f80796"

View File

@@ -8,9 +8,9 @@ Create Date: 2025-10-28 21:39:24.480466
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "eb0bd3cc1852"

View File

@@ -1,6 +1,6 @@
# MediaManager Dev Configuration File
# This file contains all available configuration options for MediaManager
# Documentation: https://maximilian-dorninger.gitbook.io/mediamanager
# Documentation: https://maxdorninger.github.io/MediaManager/
#
# This is an example configuration file that gets copied to your config folder
# on first boot. You should modify the values below to match your setup.

View File

@@ -1,6 +1,6 @@
# MediaManager Example Configuration File
# This file contains all available configuration options for MediaManager
# Documentation: https://maximilian-dorninger.gitbook.io/mediamanager
# Documentation: https://maxdorninger.github.io/MediaManager/
#
# This is an example configuration file that gets copied to your config folder
# on first boot. You should modify the values below to match your setup.

View File

@@ -56,6 +56,15 @@ services:
- ./web:/app
depends_on:
- mediamanager
docs:
image: squidfunk/mkdocs-material:9
container_name: mediamanager-docs
volumes:
- .:/docs
ports:
- "9000:9000"
command: serve -w /docs -a 0.0.0.0:9000
# ----------------------------
# Additional services can be uncommented and configured as needed
@@ -130,17 +139,17 @@ services:
# ports:
# - 8081:8080
# restart: unless-stopped
# jackett:
# image: lscr.io/linuxserver/jackett:latest
# container_name: jackett
# environment:
# - PUID=1000
# - PGID=1000
# - TZ=Etc/UTC
# - AUTO_UPDATE=true
# volumes:
# - ./res/jackett/data:/config
# - ./res/jackett/torrents:/downloads
# ports:
# - 9117:9117
# restart: unless-stopped
jackett:
image: lscr.io/linuxserver/jackett:latest
container_name: jackett
environment:
- PUID=1000
- PGID=1000
- TZ=Etc/UTC
- AUTO_UPDATE=true
volumes:
- ./res/jackett/data:/config
- ./res/jackett/torrents:/downloads
ports:
- 9117:9117
restart: unless-stopped

View File

@@ -1,38 +0,0 @@
---
layout:
width: wide
title:
visible: true
description:
visible: true
tableOfContents:
visible: true
outline:
visible: false
pagination:
visible: true
metadata:
visible: true
---
# MediaManager
MediaManager is the modern, easy-to-use successor to the fragmented "Arr" stack. Manage, discover, and automate your TV and movie collection in a single, simple interface.
_Replaces Sonarr, Radarr, Seerr, and more._
### Quick Links
<table data-view="cards" data-full-width="true"><thead><tr><th align="center"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td align="center">Installation Guide</td><td><a href="installation-guide.md">installation-guide.md</a></td></tr><tr><td align="center">Configuration</td><td><a href="configuration/">configuration</a></td></tr><tr><td align="center">Developer Guide</td><td><a href="developer-guide.md">developer-guide.md</a></td></tr><tr><td align="center">Troubleshooting</td><td><a href="troubleshooting.md">troubleshooting.md</a></td></tr><tr><td align="center">Advanced Features</td><td><a href="advanced-features/">advanced-features</a></td></tr><tr><td align="center">Import Existing Media</td><td><a href="importing-existing-media.md">importing-existing-media.md</a></td></tr></tbody></table>
## Support MediaManager & Maximilian Dorninger
<table data-card-size="large" data-view="cards" data-full-width="true"><thead><tr><th></th><th data-hidden data-card-target data-type="content-ref"></th><th data-hidden data-card-cover data-type="image">Cover image</th></tr></thead><tbody><tr><td>Sponsor me on GitHub Sponsors :)</td><td><a href="https://github.com/sponsors/maxdorninger">https://github.com/sponsors/maxdorninger</a></td><td></td></tr><tr><td>Buy me a coffee :)</td><td><a href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td></td></tr></tbody></table>
### MediaManager Sponsors
<table data-view="cards" data-full-width="true"><thead><tr><th>Sponsor</th><th data-hidden data-card-target data-type="content-ref"></th><th data-hidden data-card-cover data-type="image">Cover image</th></tr></thead><tbody><tr><td>Aljaž Mur Eržen</td><td><a href="https://fosstodon.org/@aljazmerzen">https://fosstodon.org/@aljazmerzen</a></td><td><a href="https://github.com/aljazerzen.png">https://github.com/aljazerzen.png</a></td></tr><tr><td>Luis Rodriguez</td><td><a href="https://github.com/ldrrp">https://github.com/ldrrp</a></td><td><a href="https://github.com/ldrrp.png">https://github.com/ldrrp.png</a></td></tr><tr><td>Brandon P.</td><td><a href="https://github.com/brandon-dacrib">https://github.com/brandon-dacrib</a></td><td><a href="https://github.com/brandon-dacrib.png">https://github.com/brandon-dacrib.png</a></td></tr><tr><td>SeimusS</td><td><a href="https://github.com/SeimusS">https://github.com/SeimusS</a></td><td><a href="https://github.com/SeimusS.png">https://github.com/SeimusS.png</a></td></tr><tr><td>HadrienKerlero</td><td><a href="https://github.com/HadrienKerlero">https://github.com/HadrienKerlero</a></td><td><a href="https://github.com/HadrienKerlero.png">https://github.com/HadrienKerlero.png</a></td></tr><tr><td>keyxmakerx</td><td><a href="https://github.com/keyxmakerx">https://github.com/keyxmakerx</a></td><td><a href="https://github.com/keyxmakerx.png">https://github.com/keyxmakerx.png</a></td></tr><tr><td>LITUATUI</td><td><a href="https://github.com/LITUATUI">https://github.com/LITUATUI</a></td><td><a href="https://github.com/LITUATUI.png">https://github.com/LITUATUI.png</a></td></tr><tr><td>Nicolas</td><td><a href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td><a href="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/B6CDBD/NI.png">https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/B6CDBD/NI.png</a></td></tr><tr><td>Josh</td><td><a href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td><a href="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/DEBBB9/JO.png">https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/DEBBB9/JO.png</a></td></tr><tr><td>PuppiestDoggo</td><td><a href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td><a href="https://cdn.buymeacoffee.com/uploads/profile_pictures/2025/11/2VeQ8sTGPhj4tiLy.jpg">https://cdn.buymeacoffee.com/uploads/profile_pictures/2025/11/2VeQ8sTGPhj4tiLy.jpg</a></td></tr><tr><td>Seferino</td><td><a href="https://github.com/seferino-fernandez">https://github.com/seferino-fernandez</a></td><td><a href="https://avatars.githubusercontent.com/u/5546622">https://avatars.githubusercontent.com/u/5546622</a></td></tr></tbody></table>
### MediaManager Repository
https://github.com/maxdorninger/MediaManager

View File

@@ -1,26 +0,0 @@
# Table of contents
* [MediaManager](README.md)
* [Installation Guide](installation-guide.md)
* [Importing existing media](importing-existing-media.md)
* [Usage](usage.md)
* [Configuration](configuration/README.md)
* [Backend](configuration/backend.md)
* [Authentication](configuration/authentication.md)
* [Database](configuration/database.md)
* [Download Clients](configuration/download-clients.md)
* [Indexers](configuration/indexers.md)
* [Scoring Rulesets](configuration/scoring-rulesets.md)
* [Notifications](configuration/notifications.md)
* [Custom Libraries](configuration/custom-libraries.md)
* [Logging](configuration/logging.md)
* [Advanced Features](advanced-features/README.md)
* [qBittorrent Category](advanced-features/qbittorrent-category.md)
* [URL Prefix](advanced-features/url-prefix.md)
* [Metadata Provider Configuration](advanced-features/metadata-provider-configuration.md)
* [Custom port](advanced-features/custom-port.md)
* [Follow symlinks in frontend files](advanced-features/follow-symlinks-in-frontend-files.md)
* [Troubleshooting](troubleshooting.md)
* [Developer Guide](developer-guide.md)
* [API Reference](api-reference.md)
* [Screenshots](screenshots.md)

View File

@@ -1,9 +0,0 @@
---
description: >-
The features in this section are not required to run MediaManager and serve
their purpose in very specific environments, but they can enhance your
experience and provide additional functionality.
---
# Advanced Features

View File

@@ -0,0 +1,4 @@
# Disable Startup ASCII Art
* `MEDIAMANAGER_NO_STARTUP_ART`: Set this environment variable (to any value) \
to disable the colorized startup splash screen. Unset it to re-enable the art.
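As a minimal sketch (illustrative only, not MediaManager's actual code), the "set to any value" semantics mean the variable's presence is what matters, not its value:

```python
import os

def startup_art_enabled(environ: dict = os.environ) -> bool:
    # Any value -- even "0", "false", or an empty string -- disables the
    # splash screen; only unsetting the variable re-enables it.
    return "MEDIAMANAGER_NO_STARTUP_ART" not in environ
```

So `MEDIAMANAGER_NO_STARTUP_ART=0` still disables the art, which can surprise users expecting a boolean.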

View File

@@ -7,8 +7,6 @@ MediaManager can be configured to follow symlinks when serving frontend files. T
* `FRONTEND_FOLLOW_SYMLINKS`\
Set this environment variable to `true` to follow symlinks when serving frontend files. Default is `false`.
{% code title=".env" %}
```bash
```bash title=".env"
FRONTEND_FOLLOW_SYMLINKS=true
```
{% endcode %}

View File

@@ -8,9 +8,8 @@ Metadata provider settings are configured in the `[metadata]` section of your `c
TMDB (The Movie Database) is the primary metadata provider for MediaManager. It provides detailed information about movies and TV shows.
{% hint style="info" %}
Other software like Jellyfin use TMDB as well, so there won't be any metadata discrepancies.
{% endhint %}
!!! info
Other software, such as Jellyfin, uses TMDB as well, so there won't be any metadata discrepancies.
* `tmdb_relay_url`\
URL of the TMDB relay (MetadataRelay). Default is `https://metadata-relay.dorninger.co/tmdb`. Example: `https://your-own-relay.example.com/tmdb`.
@@ -19,24 +18,21 @@ Other software like Jellyfin use TMDB as well, so there won't be any metadata di
* `default_language`\
TMDB language parameter used when searching and adding. Default is `en`. Format: ISO 639-1 (2 letters).
{% hint style="warning" %}
`default_language` sets the TMDB `language` parameter when searching and adding TV shows and movies. If TMDB does not find a matching translation, metadata in the original language will be fetched with no option for a fallback language. It is therefore highly advised to only use "broad" languages. For most use cases, the default setting is safest.
{% endhint %}
!!! warning
`default_language` sets the TMDB `language` parameter when searching and adding TV shows and movies. If TMDB does not find a matching translation, metadata in the original language will be fetched with no option for a fallback language. It is therefore highly advised to only use "broad" languages. For most use cases, the default setting is safest.
### TVDB Settings (`[metadata.tvdb]`)
{% hint style="warning" %}
The TVDB might provide false metadata and doesn't support some features of MediaManager like showing overviews. Therefore, TMDB is the preferred metadata provider.
{% endhint %}
!!! warning
The TVDB may return inaccurate metadata and doesn't support some MediaManager features, such as showing overviews. Therefore, TMDB is the preferred metadata provider.
* `tvdb_relay_url`\
URL of the TVDB relay (MetadataRelay). Default is `https://metadata-relay.dorninger.co/tvdb`. Example: `https://your-own-relay.example.com/tvdb`.
### MetadataRelay
{% hint style="info" %}
To use MediaManager you don't need to set up your own MetadataRelay, as the default relay hosted by the developer should be sufficient for most purposes.
{% endhint %}
!!! info
To use MediaManager you don't need to set up your own MetadataRelay, as the default relay hosted by the developer should be sufficient for most purposes.
The MetadataRelay is a service that provides metadata for MediaManager. It acts as a proxy for TMDB and TVDB, allowing you to use your own API keys if needed, but the default relay means you don't need to create accounts for API keys yourself.
@@ -47,16 +43,14 @@ You might want to use your own relay if you want to avoid rate limits, protect y
* Get a TMDB API key from [The Movie Database](https://www.themoviedb.org/settings/api)
* Get a TVDB API key from [The TVDB](https://thetvdb.com/auth/register)
{% hint style="info" %}
If you want to use your own MetadataRelay, you can set the `tmdb_relay_url` and/or `tvdb_relay_url` to your own relay service.
{% endhint %}
!!! info
If you want to use your own MetadataRelay, you can set the `tmdb_relay_url` and/or `tvdb_relay_url` to your own relay service.
### Example Configuration
Here's a complete example of the metadata section in your `config.toml`:
{% code title="config.toml" %}
```toml
```toml title="config.toml"
[metadata]
# TMDB configuration
[metadata.tmdb]
@@ -66,8 +60,6 @@ Here's a complete example of the metadata section in your `config.toml`:
[metadata.tvdb]
tvdb_relay_url = "https://metadata-relay.dorninger.co/tvdb"
```
{% endcode %}
{% hint style="info" %}
In most cases, you can simply use the default values and don't need to specify these settings in your config file at all.
{% endhint %}
!!! info
In most cases, you can simply use the default values and don't need to specify these settings in your config file at all.

View File

@@ -9,10 +9,8 @@ Use the following variables to customize behavior:
* `torrents.qbittorrent.category_save_path`\
Save path for the category in qBittorrent. By default, no subdirectory is used. Example: `/data/torrents/MediaManager`.
{% hint style="info" %}
qBittorrent saves torrents to the path specified by `torrents.qbittorrent.category_save_path`, so it must be a valid path that qBittorrent can write to.
{% endhint %}
!!! info
qBittorrent saves torrents to the path specified by `torrents.qbittorrent.category_save_path`, so it must be a valid path that qBittorrent can write to.
{% hint style="warning" %}
For MediaManager to successfully import torrents, you must add the subdirectory to the `misc.torrent_directory` variable.
{% endhint %}
!!! warning
For MediaManager to successfully import torrents, you must add the subdirectory to the `misc.torrent_directory` variable.
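For example, the two settings could line up like this (the paths are illustrative, not defaults):

```toml
[torrents.qbittorrent]
# qBittorrent writes finished torrents into this category path...
category_save_path = "/data/torrents/MediaManager"

[misc]
# ...so MediaManager must look for them in the same subdirectory.
torrent_directory = "/data/torrents/MediaManager"
```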

View File

@@ -6,23 +6,20 @@ In order to run it on a prefixed path, like `maxdorninger.github.io/media`, the
In short, clone the repository, then run:
{% code title="Build Docker image" %}
```none
```none title="Build Docker image"
docker build \
--build-arg BASE_PATH=/media \
--build-arg VERSION=my-custom-version \
-t MediaManager:my-custom-version \
-f Dockerfile .
```
{% endcode %}
You also need to set the `BASE_PATH` environment variable at runtime in `docker-compose.yaml`:
* `BASE_PATH`\
Base path prefix MediaManager is served under. Example: `/media`. This must match the `BASE_PATH` build arg.
{% code title="docker-compose.yaml (excerpt)" %}
```yaml
```yaml title="docker-compose.yaml (excerpt)"
services:
  mediamanager:
    image: MediaManager:my-custom-version
@@ -32,10 +29,8 @@ services:
      BASE_PATH: /media
...
```
{% endcode %}
{% hint style="info" %}
Make sure to include the base path in the `frontend_url` field in the config file. See [Backend](../configuration/backend.md).
{% endhint %}
!!! info
Make sure to include the base path in the `frontend_url` field in the config file. See [Backend](../configuration/backend.md).
Finally, ensure that whatever reverse proxy you're using leaves the incoming path unchanged; that is, you should not strip the `/media` from `/media/web/`.

View File

@@ -1,8 +1,7 @@
# API Reference
{% hint style="info" %}
Media Manager's backend is built with FastAPI, which automatically generates interactive API documentation.
{% endhint %}
!!! info
Media Manager's backend is built with FastAPI, which automatically generates interactive API documentation.
* Swagger UI (typically available at `http://localhost:8000/docs`)
* ReDoc (typically available at `http://localhost:8000/redoc`)

docs/assets/logo.svg Normal file
View File

@@ -0,0 +1,158 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:svg="http://www.w3.org/2000/svg"
version="1.1"
id="svg1"
width="2000"
height="2000"
viewBox="0 0 2000 2000"
sodipodi:docname="logo2.svg"
inkscape:version="1.4.2 (f4327f4, 2025-05-13)"
xmlns="http://www.w3.org/2000/svg">
<defs
id="defs1">
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath1">
<path
d="M 0,1500 H 1500 V 0 H 0 Z"
id="path1"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath3">
<path
d="M 0,0 H 1500 V 1500 H 0 Z"
transform="matrix(1.3333333,0,0,-1.3333333,0,2000)"
id="path3"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath4">
<path
d="M -17.6886,1032.99 H 1106.27 V 238.53 H -17.6886 Z"
transform="translate(-319.61281,-1032.9941)"
id="path4"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath6">
<path
d="M 0,0 H 1500 V 1500 H 0 Z"
transform="matrix(1.3333333,0,0,-1.3333333,0,2000)"
id="path6"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath7">
<path
d="M 223.314,1226.85 H 1182.49 V 548.867 H 223.314 Z"
transform="translate(-894.64255,-548.86681)"
id="path7"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath9">
<path
d="M 0,0 H 1500 V 1500 H 0 Z"
transform="matrix(1.3333333,0,0,-1.3333333,0,2000)"
id="path9"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath10">
<path
d="M 301.561,1098.17 H 1517.73 V 238.53 H 301.561 Z"
transform="translate(-666.53282,-1098.1678)"
id="path10"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath12">
<path
d="M 0,0 H 1500 V 1500 H 0 Z"
transform="matrix(1.3333333,0,0,-1.3333333,0,2000)"
id="path12"/>
</clipPath>
</defs>
<sodipodi:namedview
id="namedview1"
pagecolor="#ffffff"
bordercolor="#000000"
borderopacity="0.25"
inkscape:showpageshadow="2"
inkscape:pageopacity="0.0"
inkscape:pagecheckerboard="0"
inkscape:deskcolor="#d1d1d1"
inkscape:zoom="0.9075"
inkscape:cx="999.44904"
inkscape:cy="1000"
inkscape:window-width="3840"
inkscape:window-height="2054"
inkscape:window-x="3373"
inkscape:window-y="199"
inkscape:window-maximized="1"
inkscape:current-layer="g1">
<inkscape:page
x="0"
y="0"
inkscape:label="1"
id="page1"
width="2000"
height="2000"
margin="0"
bleed="0"/>
</sodipodi:namedview>
<g
id="g1"
inkscape:groupmode="layer"
inkscape:label="1">
<g
id="g2"
clip-path="url(#clipPath3)">
<path
d="M 0,0 H 1500 V 1500 H 0 Z"
style="fill:#9ed8f7;fill-opacity:0;fill-rule:nonzero;stroke:none"
transform="matrix(1.3333333,0,0,-1.3333333,0,2000)"
clip-path="url(#clipPath1)"
id="path2"/>
</g>
<g
opacity="0.720001"
id="g5"
clip-path="url(#clipPath6)">
<path
d="m 0,0 h 669.787 c 68.994,0 116.873,-68.746 92.95,-133.46 L 542.309,-729.728 c -14.382,-38.904 -51.472,-64.736 -92.95,-64.736 h -669.787 c -68.994,0 -116.873,68.746 -92.949,133.46 L -92.949,-64.736 C -78.567,-25.832 -41.478,0 0,0"
style="fill:#2842fc;fill-opacity:1;fill-rule:nonzero;stroke:none"
transform="matrix(1.3333333,0,0,-1.3333333,426.1504,622.67453)"
clip-path="url(#clipPath4)"
id="path5"/>
</g>
<g
opacity="0.720001"
id="g8"
clip-path="url(#clipPath9)">
<path
d="m 0,0 h -571.59 c -58.879,0 -99.738,58.667 -79.322,113.893 l 188.111,508.849 c 12.274,33.201 43.925,55.246 79.322,55.246 h 571.59 c 58.879,0 99.739,-58.667 79.322,-113.894 L 79.322,55.245 C 67.049,22.045 35.397,0 0,0"
style="fill:#ff5e00;fill-opacity:1;fill-rule:nonzero;stroke:none"
transform="matrix(1.3333333,0,0,-1.3333333,1192.8567,1268.1776)"
clip-path="url(#clipPath7)"
id="path8"/>
</g>
<g
opacity="0.75"
id="g11"
clip-path="url(#clipPath12)">
<path
d="m 0,0 h 724.733 c 74.654,0 126.46,-74.386 100.575,-144.408 L 586.797,-789.591 c -15.562,-42.096 -55.694,-70.047 -100.575,-70.047 h -724.733 c -74.654,0 -126.461,74.386 -100.574,144.409 l 238.511,645.182 C -85.013,-27.952 -44.88,0 0,0"
style="fill:#f20a4c;fill-opacity:1;fill-rule:nonzero;stroke:none"
transform="matrix(1.3333333,0,0,-1.3333333,888.7104,535.77627)"
clip-path="url(#clipPath10)"
id="path11"/>
</g>
</g>
</svg>


View File

@@ -6,9 +6,8 @@ Frontend settings are configured through environment variables in your `docker-c
## Configuration File Location
{% hint style="warning" %}
Note that MediaManager may need to be restarted for changes in the config file to take effect.
{% endhint %}
!!! warning
Note that MediaManager may need to be restarted for changes in the config file to take effect.
Your `config.toml` file should be in the directory that's mounted to `/app/config/config.toml` inside the container:
@@ -66,6 +65,5 @@ MEDIAMANAGER_AUTH__OPENID_CONNECT__CLIENT_SECRET = "your_client_secret_from_prov
So for every config "level", take the name of the value and prepend the section names in uppercase, using two underscores as delimiters and `MEDIAMANAGER_` as the prefix.
{% hint style="warning" %}
Note that not every env variable starts with `MEDIAMANAGER_`; this prefix only applies to env variables which replace/overwrite values in the config file. Variables like the `CONFIG_DIR` env variable must not be prefixed.
{% endhint %}
!!! warning
Note that not every env variable starts with `MEDIAMANAGER_`; this prefix only applies to env variables which replace/overwrite values in the config file. Variables like the `CONFIG_DIR` env variable must not be prefixed.
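The naming rule can be sketched in a few lines (illustrative only, not MediaManager's implementation):

```python
def config_key_to_env_var(*path: str) -> str:
    # Uppercase each section/value name, join them with two underscores,
    # and add the MEDIAMANAGER_ prefix.
    return "MEDIAMANAGER_" + "__".join(part.upper() for part in path)

# e.g. the client_secret value in [auth.openid_connect]:
# config_key_to_env_var("auth", "openid_connect", "client_secret")
```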

View File

@@ -20,13 +20,11 @@ All authentication settings are configured in the `[auth]` section of your `conf
* `email_password_resets`\
Enables password resets via email. Default is `false`.
{% hint style="info" %}
To use email password resets, you must also configure SMTP settings in the `[notifications.smtp_config]` section.
{% endhint %}
!!! info
To use email password resets, you must also configure SMTP settings in the `[notifications.smtp_config]` section.
{% hint style="info" %}
When setting up MediaManager for the first time, you should add your email to `admin_emails` in the `[auth]` config section. MediaManager will then use this email instead of the default admin email. Your account will automatically be created as an admin account, allowing you to manage other users, media and settings.
{% endhint %}
!!! info
When setting up MediaManager for the first time, you should add your email to `admin_emails` in the `[auth]` config section. MediaManager will then use this email instead of the default admin email. Your account will automatically be created as an admin account, allowing you to manage other users, media and settings.
## OpenID Connect Settings (`[auth.openid_connect]`)
@@ -53,22 +51,20 @@ The OpenID server will likely require a redirect URI. This URL will usually look
{MEDIAMANAGER_URL}/api/v1/auth/oauth/callback
```
{% hint style="warning" %}
It is very important that you set the correct callback URI, otherwise it won't work!
{% endhint %}
!!! warning
It is very important that you set the correct callback URI; otherwise, it won't work!
#### Authentik Example
Here is an example configuration for the OpenID Connect provider for Authentik.
![authentik-redirect-url-example](<../.gitbook/assets/authentik redirect url example.png>)
![authentik-redirect-url-example](<../assets/assets/authentik redirect url example.png>)
## Example Configuration
Here's a complete example of the authentication section in your `config.toml`:
{% code title="config.toml" %}
```toml
```toml title="config.toml"
[auth]
token_secret = "a1b2c3d4e5f6g7h8i9j0k1l2m3n4o5p6q7r8s9t0u1v2w3x4y5z6"
session_lifetime = 604800 # 1 week
@@ -82,4 +78,4 @@ client_secret = "your-secret-key-here"
configuration_endpoint = "https://auth.example.com/.well-known/openid-configuration"
name = "Authentik"
```
{% endcode %}

View File

@@ -26,8 +26,7 @@ description: >-
Here's a complete example of the general settings section in your `config.toml`:
{% code title="config.toml" %}
```toml
```toml title="config.toml"
[misc]
# REQUIRED: Change this to match your actual frontend domain.
@@ -38,8 +37,6 @@ cors_urls = ["http://localhost:8000"]
# Optional: Development mode (set to true for debugging)
development = false
```
{% endcode %}
{% hint style="info" %}
The `frontend_url` is the most important setting to configure correctly. Make sure it matches your actual deployment URLs.
{% endhint %}
!!! info
The `frontend_url` is the most important setting to configure correctly. Make sure it matches your actual deployment URLs.

View File

@@ -6,9 +6,8 @@ MediaManager supports custom libraries, allowing you to add multiple folders for
Custom libraries are configured in the `misc` section in the `config.toml` file. You can add as many libraries as you need.
{% hint style="info" %}
You are not limited to `/data/tv` or `/data/movies`, you can choose the entire path freely!
{% endhint %}
!!! info
You are not limited to `/data/tv` or `/data/movies`; you can choose any path freely!
### Movie Libraries

View File

@@ -19,8 +19,7 @@ Database settings are configured in the `[database]` section of your `config.tom
Here's a complete example of the database section in your `config.toml`:
{% code title="config.toml" %}
```toml
```toml title="config.toml"
[database]
host = "db"
port = 5432
@@ -28,8 +27,6 @@ user = "MediaManager"
password = "your_secure_password"
dbname = "MediaManager"
```
{% endcode %}
{% hint style="info" %}
In docker-compose deployments the container name is simultaneously its hostname, so you can use "db" or "postgres" as host.
{% endhint %}
!!! info
In docker-compose deployments, a container's name is also its hostname, so you can use "db" or "postgres" as the host.

View File

@@ -19,9 +19,8 @@ qBittorrent is a popular BitTorrent client that MediaManager can integrate with
## Transmission Settings (`[torrents.transmission]`)
{% hint style="info" %}
The downloads path in Transmission and MediaManager must be the same, i.e. the path `/data/torrents` must link to the same volume for both containers.
{% endhint %}
!!! info
The downloads path in Transmission and MediaManager must be the same, i.e. the path `/data/torrents` must link to the same volume for both containers.
Transmission is a BitTorrent client that MediaManager can integrate with for downloading torrents.
@@ -59,8 +58,7 @@ SABnzbd is a Usenet newsreader that MediaManager can integrate with for download
Here's a complete example of the download clients section in your `config.toml`:
{% code title="config.toml" %}
```toml
```toml title="config.toml"
[torrents]
# qBittorrent configuration
[torrents.qbittorrent]
@@ -87,14 +85,12 @@ Here's a complete example of the download clients section in your `config.toml`:
port = 8080
api_key = "your_sabnzbd_api_key"
```
{% endcode %}
## Docker Compose Integration
When using Docker Compose, make sure your download clients are accessible from the MediaManager backend:
{% code title="docker-compose.yml" %}
```yaml
```yaml title="docker-compose.yml"
services:
  # MediaManager backend
  backend:
@@ -121,12 +117,9 @@ services:
      - ./data/usenet:/downloads
      # ... other configuration ...
```
{% endcode %}
{% hint style="warning" %}
You should enable only one BitTorrent and only one Usenet Download Client at any time.
{% endhint %}
!!! warning
You should enable only one BitTorrent and only one Usenet Download Client at any time.
{% hint style="info" %}
Make sure the download directories in your download clients are accessible to MediaManager for proper file management and organization.
{% endhint %}
!!! info
Make sure the download directories in your download clients are accessible to MediaManager for proper file management and organization.

View File

@@ -13,9 +13,8 @@ Indexer settings are configured in the `[indexers]` section of your `config.toml
* `timeout_seconds`\
Timeout in seconds for requests to Prowlarr. Default is `60`.
{% hint style="warning" %}
Symptoms of timeouts are typically no search results ("No torrents found!") in conjunction with logs showing read timeouts.
{% endhint %}
!!! warning
Symptoms of timeouts are typically no search results ("No torrents found!") in conjunction with logs showing read timeouts.
<details>
@@ -50,8 +49,7 @@ DEBUG - media_manager.indexer.utils -
## Example Configuration
{% code title="config.toml" %}
```toml
```toml title="config.toml"
[indexers]
[indexers.prowlarr]
enabled = true
@@ -66,4 +64,4 @@ api_key = "your_jackett_api_key"
indexers = ["1337x", "rarbg"]
timeout_seconds = 60
```
{% endcode %}

View File

@@ -57,8 +57,7 @@ Controls which emails receive notifications.
Here's a complete example of the notifications section in your `config.toml`:
{% code title="config.toml" %}
```toml
```toml title="config.toml"
[notifications]
# SMTP settings for email notifications and password resets
[notifications.smtp_config]
@@ -91,8 +90,7 @@ Here's a complete example of the notifications section in your `config.toml`:
api_key = "your_pushover_api_key"
user = "your_pushover_user_key"
```
{% endcode %}
{% hint style="info" %}
You can enable multiple notification methods simultaneously. For example, you could have both email and Gotify notifications enabled at the same time.
{% endhint %}
!!! info
You can enable multiple notification methods simultaneously. For example, you could have both email and Gotify notifications enabled at the same time.

View File

@@ -17,9 +17,8 @@ Rules define how MediaManager scores releases based on their titles or indexer f
* Reject releases that do not meet certain criteria (e.g., non-freeleech releases).
* and more.
{% hint style="info" %}
The keywords and flags are compared case-insensitively.
{% endhint %}
!!! info
The keywords and flags are compared case-insensitively.
### Title Rules
@@ -38,8 +37,7 @@ Each title rule consists of:
Examples for Title Rules
{% code title="config.toml" %}
```toml
```toml title="config.toml"
[[indexers.title_scoring_rules]]
name = "prefer_h265"
keywords = ["h265", "hevc", "x265"]
@@ -52,7 +50,6 @@ keywords = ["cam", "ts"]
score_modifier = -10000
negate = false
```
{% endcode %}
* The first rule increases the score for releases containing "h265", "hevc", or "x265".
* The second rule heavily penalizes releases containing "cam" or "ts".
@@ -76,8 +73,7 @@ Each indexer flag rule consists of:
Examples for Indexer Flag Rules
{% code title="config.toml" %}
```toml
```toml title="config.toml"
[[indexers.indexer_flag_scoring_rules]]
name = "reject_non_freeleech"
flags = ["freeleech", "freeleech75"]
@@ -90,7 +86,6 @@ flags = ["nuked"]
score_modifier = -10000
negate = false
```
{% endcode %}
* The first rule penalizes releases that do not have the "freeleech" or "freeleech75" flag.
* The second rule penalizes releases that are marked as "nuked".
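The evaluation of a single flag rule can be sketched like this (illustrative logic, not MediaManager's actual code; flag comparison is case-insensitive, as noted above):

```python
def apply_flag_rule(release_flags, rule_flags, score_modifier, negate):
    # Case-insensitive comparison of the release's flags against the rule's.
    matched = bool({f.lower() for f in release_flags}
                   & {f.lower() for f in rule_flags})
    # negate=False: apply the modifier when at least one flag matches.
    # negate=True:  apply the modifier only when none of the flags match.
    return score_modifier if matched != negate else 0
```

Under this reading, a rule with `negate = true` and `flags = ["freeleech", "freeleech75"]` penalizes exactly those releases that carry neither flag, matching the first example above.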
@@ -99,8 +94,7 @@ If `negate` is set to `true`, the `score_modifier` is applied only if none of th
## Example
{% code title="config.toml" %}
```toml
```toml title="config.toml"
[[indexers.scoring_rule_sets]]
name = "default"
libraries = ["ALL_TV", "ALL_MOVIES"]
@@ -111,7 +105,6 @@ name = "strict_quality"
libraries = ["ALL_MOVIES"]
rule_names = ["prefer_h265", "avoid_cam", "reject_non_freeleech"]
```
{% endcode %}
## Libraries
@@ -127,9 +120,8 @@ You can use special library names in your rulesets:
This allows you to set global rules for all TV or movie content, or provide fallback rules for uncategorized media.
{% hint style="info" %}
You don't need to create lots of libraries with different directories, multiple libraries can share the same directory. You can set multiple (unlimited) libraries to the default directory `/data/movies` or `/data/tv` and use different rulesets with them.
{% endhint %}
!!! info
You don't need to create lots of libraries with different directories; multiple libraries can share the same directory. You can set multiple (unlimited) libraries to the default directory `/data/movies` or `/data/tv` and use different rulesets with them.
## Relation to Sonarr/Radarr Profiles

View File

@@ -1,12 +1,16 @@
# Developer Guide
---
description: >-
This section is for those who want to contribute to Media Manager or
understand its internals.
---
This section is for those who want to contribute to Media Manager or understand its internals.
# Developer Guide
## Source Code directory structure
* `media_manager/`: Backend FastAPI application
* `web/`: Frontend SvelteKit application
* `Writerside/`: Documentation
* `docs/`: Documentation (MkDocs)
* `metadata_relay/`: Metadata relay service, also FastAPI
## Special Dev Configuration
@@ -40,9 +44,8 @@ MediaManager uses various environment variables for configuration. In the Docker
* `DISABLE_FRONTEND_MOUNT`\
When `TRUE`, disables mounting built frontend files (allows separate frontend container).
{% hint style="info" %}
This is automatically set in `docker-compose.dev.yaml` to enable the separate frontend development container
{% endhint %}
!!! info
This is automatically set in `docker-compose.dev.yaml` to enable the separate frontend development container.
#### Configuration Files
@@ -69,7 +72,6 @@ I use IntellijIdea with the Pycharm and Webstorm plugins to develop this, but th
* Pydantic
* Ruff
* VirtualKit
* Writerside (for writing documentation)
### Recommended Development Workflow
@@ -102,10 +104,9 @@ This means when your browser makes a request to `http://localhost:5173/api/v1/tv
### Setting up the full development environment with Docker (Recommended)
This is the easiest and recommended way to get started. Everything runs in Docker with hot-reloading enabled.
{% stepper %}
{% step %}
### Prepare config files
Create config directory (only needed on first run) and copy example config files:
@@ -115,9 +116,9 @@ mkdir -p res/config # Only needed on first run
cp config.dev.toml res/config/config.toml
cp web/.env.example web/.env
```
{% endstep %}
{% step %}
### Start all services
Recommended: Use make commands for easy development
@@ -132,9 +133,9 @@ Alternative: Use docker compose directly (if make is not available)
```bash
docker compose -f docker-compose.dev.yaml up
```
{% endstep %}
{% step %}
### Access the application
* Frontend (with HMR): http://localhost:5173
@@ -148,12 +149,10 @@ Now you can edit code and see changes instantly:
* Edit Python files → Backend auto-reloads
* Edit Svelte/TypeScript files → Frontend HMR updates in browser
* Edit config.toml → Changes apply immediately
{% endstep %}
{% endstepper %}
{% hint style="info" %}
Run `make help` to see all available development commands including `make down`, `make logs`, `make app` (shell into backend), and more.
{% endhint %}
!!! info
Run `make help` to see all available development commands including `make down`, `make logs`, `make app` (shell into backend), and more.
## Setting up the backend development environment (Local)
@@ -203,29 +202,28 @@ uv run fastapi run media_manager/main.py --reload --port 8000
* Format code:
```bash
uv run ruff format .
ruff format .
```
* Lint code:
```bash
uv run ruff check .
ruff check .
```
## Setting up the frontend development environment (Local, Optional)
Using the Docker setup above is recommended. This section is for those who prefer to run the frontend locally outside of Docker.
{% stepper %}
{% step %}
### Clone & change dir
1. Clone the repository
2. cd into repo root
3. cd into `web` directory
{% endstep %}
{% step %}
### Install Node.js (example using nvm-windows)
I used nvm-windows:
@@ -240,9 +238,9 @@ If using PowerShell you may need:
```powershell
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
```
{% endstep %}
{% step %}
### Create .env for frontend
```bash
@@ -250,18 +248,18 @@ cp .env.example .env
```
Update `PUBLIC_API_URL` if your backend is not at `http://localhost:8000`
{% endstep %}
{% step %}
### Install dependencies and run dev server
```bash
npm install
npm run dev
```
{% endstep %}
{% step %}
### Format & lint
* Format:
@@ -275,12 +273,10 @@ npm run format
```bash
npm run lint
```
{% endstep %}
{% endstepper %}
{% hint style="info" %}
If running frontend locally, make sure to add `http://localhost:5173` to the `cors_urls` in your backend config file.
{% endhint %}
!!! info
If running frontend locally, make sure to add `http://localhost:5173` to the `cors_urls` in your backend config file.
## Troubleshooting

View File

@@ -0,0 +1,14 @@
# Documentation
MediaManager uses [MkDocs](https://www.mkdocs.org/) with
the [Material for MkDocs](https://squidfunk.github.io/mkdocs-material/) theme for documentation.
The files for the documentation are in the `/docs` directory.
To preview the documentation locally, you need to have MkDocs or Docker installed.
## How to preview the documentation locally with Docker
1. Run the mkdocs container defined in `docker-compose.dev.yaml`.
2. Open `http://127.0.0.1:9000/` in your browser.

View File

@@ -23,9 +23,8 @@ Here is an example, using these rules:
If your folder structure is in the correct format, you can start importing. To do this, log in as an administrator and go to the TV/movie dashboard.
{% hint style="info" %}
After importing, MediaManager will automatically prefix the old root TV show/movie folders with a dot to mark them as "imported".
{% endhint %}
!!! info
After importing, MediaManager will automatically prefix the old root TV show/movie folders with a dot to mark them as "imported".
So after importing, the directory would look like this (using the above directory structure):

docs/index.md Normal file
View File

@@ -0,0 +1,2 @@
--8<-- "README.md"

View File

@@ -0,0 +1,6 @@
# Installation Guide
The recommended way to install and run Media Manager is using Docker and Docker Compose. Other installation methods are not officially supported, but are listed here for convenience.
[Docker Compose (recommended)](docker.md){ .md-button .md-button--primary }
[Nix Flakes [Community]](flakes.md){ .md-button }


@@ -1,6 +1,4 @@
# Installation Guide
The recommended way to install and run Media Manager is using Docker and Docker Compose.
# Docker Compose
## Prerequisites
@@ -11,54 +9,53 @@ The recommended way to install and run Media Manager is using Docker and Docker
Follow these steps to get MediaManager running with Docker Compose:
{% stepper %}
{% step %}
### Get the docker-compose file
#### Get the docker-compose file
Download the `docker-compose.yaml` from the MediaManager repo:
```bash
wget -O docker-compose.yaml https://raw.githubusercontent.com/maxdorninger/MediaManager/refs/heads/master/docker-compose.yaml
wget -O docker-compose.yaml https://github.com/maxdorninger/MediaManager/releases/latest/download/docker-compose.yaml
```
{% endstep %}
{% step %}
### Prepare configuration directory and example config
#### Prepare configuration directory and example config
Create a config directory and download the example configuration:
```bash
mkdir config
wget -O ./config/config.toml https://raw.githubusercontent.com/maxdorninger/MediaManager/refs/heads/master/config.example.toml
wget -O ./config/config.toml https://github.com/maxdorninger/MediaManager/releases/latest/download/config.example.toml
```
{% endstep %}
{% step %}
### Edit configuration
You probably need to edit the `config.toml` file in the `./config` directory to suit your environment and preferences. [How to configure MediaManager.](configuration/)
{% endstep %}
{% step %}
### Start MediaManager
#### Edit configuration
You probably need to edit the `config.toml` file in the `./config` directory to suit your environment and preferences. [How to configure MediaManager.](../configuration/README.md)
#### Start MediaManager
Bring up the stack:
```bash
docker compose up -d
```
{% endstep %}
{% endstepper %}
* Upon first run, MediaManager will create a default `config.toml` file in the `./config` directory (if not already present).
* Upon first run, MediaManager will also create a default admin user. The credentials of the default admin user will be printed in the logs of the container — it's recommended to change the password of this user after the first login.
* [For more information on the available configuration options, see the Configuration section of the documentation.](configuration/)
* [For more information on the available configuration options, see the Configuration section of the documentation.](../configuration/README.md)
{% hint style="info" %}
When setting up MediaManager for the first time, you should add your email to `admin_emails` in the `[auth]` config section. MediaManager will then use this email instead of the default admin email. Your account will automatically be created as an admin account, allowing you to manage other users, media, and settings.
{% endhint %}
!!! info
When setting up MediaManager for the first time, you should add your email to `admin_emails` in the `[auth]` config section. MediaManager will then use this email instead of the default admin email. Your account will automatically be created as an admin account, allowing you to manage other users, media, and settings.
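As a sketch, such a `config.toml` entry might look like this (the email is a placeholder):

```toml
[auth]
admin_emails = ["you@example.com"]
```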
## MediaManager and MetadataRelay Docker Images
## Docker Images
MediaManager is available as a Docker image on both Red Hat Quay.io and GitHub Container Registry (GHCR):
@@ -70,11 +67,10 @@ MetadataRelay images are also available on both registries:
* quay.io/maxdorninger/metadata\_relay
* ghcr.io/maxdorninger/mediamanager/metadata\_relay
From v1.12.1 onwards, both MediaManager and MetadataRelay images are available on both Quay.io and GHCR. The reason for the switch to Quay.io as the primary image registry is due to GHCR's continued slow performance: https://github.com/orgs/community/discussions/173607
From v1.12.1 onwards, both MediaManager and MetadataRelay images are available on both Quay.io and GHCR. The reason for the switch to Quay.io as the primary image registry is due to [GHCR's continued slow performance.](https://github.com/orgs/community/discussions/173607)
{% hint style="info" %}
You can use either the Quay.io or GHCR images interchangeably, as they are built from the same source and the tags are the same across both registries.
{% endhint %}
!!! info
You can use either the Quay.io or GHCR images interchangeably, as they are built from the same source and the tags are the same across both registries.
### Tags
@@ -85,4 +81,4 @@ Both registries support the following tags:
* X.Y.Z: Specific version tags (e.g., `1.12.0`).
* X.Y: Points to the latest release in the X.Y series (e.g., `1.12`).
* X: Points to the latest release in the X series (e.g., `1`).
* pr-: Points to the latest commit in the specified pull request (e.g., `pr-67`).
* pr-\<number>: Points to the latest commit in the specified pull request (e.g., `pr-67`).

docs/installation/flakes.md Normal file

@@ -0,0 +1,124 @@
# Nix Flakes
!!! note
This is a community contribution and not officially supported by the MediaManager team, but included here for convenience.
*Please report issues with this method at the [corresponding GitHub repository](https://github.com/strangeglyph/mediamanager-nix).*
## Prerequisites
This guide assumes that your system is a flakes-based NixOS installation. Hosting MediaManager on a subpath (e.g. `yourdomain.com/mediamanager`) is currently not supported, though contributions to add support are welcome.
## Importing the community flake
To use the community-provided flake and module, first import it in your own flake, for example:
```nix
{
description = "An example NixOS configuration";
inputs = {
nixpkgs = { url = "github:nixos/nixpkgs/nixos-unstable"; };
mediamanager-nix = {
url = "github:strangeglyph/mediamanager-nix";
inputs.nixpkgs.follows = "nixpkgs";
};
};
outputs = inputs@{
nixpkgs,
mediamanager-nix,
...
}: {
nixosConfigurations.your-system = nixpkgs.lib.nixosSystem {
modules = [
mediamanager-nix.nixosModules.default
];
};
};
}
```
## Configuration
The flake provides a simple module to set up a MediaManager systemd service. To enable it, set
```nix
{
config = {
services.media-manager = {
enable = true;
};
};
}
```
You will either want to set `services.media-manager.dataDir`, which will provide sensible defaults for the settings
`misc.{image,movie,tv,torrent}_directory`, or provide specific paths yourself.
The host and port that MediaManager listens on can be set using `services.media-manager.{host,port}`.
To configure MediaManager, use `services.media-manager.settings`, which follows the same structure as the MediaManager
`config.toml`. To provision secrets, set `services.media-manager.environmentFile` to a protected file, for example one
provided by [agenix](https://github.com/ryantm/agenix) or [sops-nix](https://github.com/Mic92/sops-nix).
See [Configuration](../configuration/README.md#configuring-secrets) for guidance on using environment variables.
!!! warning
Do not place secrets in the nix store, as it is world-readable.
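Putting these together, wiring in a secrets file might look like this (the path is an example; the file would contain `KEY=value` lines passed to the service as environment variables):

```nix
{
  config = {
    services.media-manager.environmentFile = "/run/secrets/mediamanager.env";
  };
}
```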
## Automatic Postgres Setup
As a convenience feature, the module provides a simple Postgres setup that can be enabled with `services.media-manager.postgres.enable`. This creates a database user named after `services.media-manager.postgres.user` and a database with the same name. Provided the user of the systemd service wasn't changed, authentication should work automatically for Unix socket connections (the default mediamanager-nix settings).
For advanced setups, please refer to the NixOS manual.
## Example Configuration
Here is a minimal complete flake for a MediaManager setup:
```nix
{
description = "An example NixOS configuration";
inputs = {
nixpkgs = { url = "github:nixos/nixpkgs/nixos-unstable"; };
mediamanager-nix = {
url = "github:strangeglyph/mediamanager-nix";
inputs.nixpkgs.follows = "nixpkgs";
};
};
outputs = inputs@{
nixpkgs,
mediamanager-nix,
...
}: {
nixosConfigurations.your-system = nixpkgs.lib.nixosSystem {
imports = [
mediamanager-nix.nixosModules.default
];
config = {
services.media-manager = {
enable = true;
postgres.enable = true;
port = 12345;
dataDir = "/tmp";
settings = {
misc.frontend_url = "http://[::1]:12345";
};
};
systemd.tmpfiles.settings."10-mediamanager" = {
"/tmp/movies".d = { user = config.services.media-manager.user; };
"/tmp/shows".d = { user = config.services.media-manager.user; };
"/tmp/images".d = { user = config.services.media-manager.user; };
"/tmp/torrents".d = { user = config.services.media-manager.user; };
};
};
};
};
}
```


@@ -1,7 +1,6 @@
# Screenshots
{% hint style="info" %}
MediaManager also supports dark mode!
{% endhint %}
!!! info
MediaManager also supports dark mode!
![screenshot-dashboard.png](<.gitbook/assets/screenshot dashboard.png>) ![screenshot-tv-dashboard.png](<.gitbook/assets/screenshot tv dashboard.png>) ![screenshot-download-season.png](<.gitbook/assets/screenshot download season.png>) ![screenshot-request-season.png](<.gitbook/assets/screenshot request season.png>) ![screenshot-tv-torrents.png](<.gitbook/assets/screenshot tv torrents.png>) ![screenshot-settings.png](<.gitbook/assets/screenshot settings.png>) ![screenshot-login.png](<.gitbook/assets/screenshot login.png>)
![screenshot-dashboard.png](<assets/assets/screenshot dashboard.png>) ![screenshot-tv-dashboard.png](<assets/assets/screenshot tv dashboard.png>) ![screenshot-download-season.png](<assets/assets/screenshot download season.png>) ![screenshot-request-season.png](<assets/assets/screenshot request season.png>) ![screenshot-tv-torrents.png](<assets/assets/screenshot tv torrents.png>) ![screenshot-settings.png](<assets/assets/screenshot settings.png>) ![screenshot-login.png](<assets/assets/screenshot login.png>)


@@ -1,8 +1,7 @@
# Troubleshooting
{% hint style="info" %}
Always check the container and browser logs for more specific error messages
{% endhint %}
!!! info
Always check the container and browser logs for more specific error messages
<details>
@@ -60,10 +59,9 @@ Switch to advanced tab
#### Possible Fixes:
* [Unable to pull image from GitHub Container Registry (Stack Overflow)](https://stackoverflow.com/questions/74656167/unable-to-pull-image-from-github-container-registry-ghcr)
* [Try pulling the image from Quay.io](installation-guide.md#mediamanager-and-metadatarelay-docker-images)
* [Try pulling the image from Quay.io](installation/docker.md#docker-images)
</details>
{% hint style="info" %}
If it still doesn't work, [please open an Issue.](https://github.com/maxdorninger/MediaManager/issues) It is possible that a bug is causing the issue.
{% endhint %}
!!! info
If it still doesn't work, [please open an Issue.](https://github.com/maxdorninger/MediaManager/issues) It is possible that a bug is causing the issue.


@@ -1,133 +0,0 @@
# Usage
If you are coming from Radarr or Sonarr you will find that MediaManager does things a bit differently. Instead of completely automatically downloading and managing your media, MediaManager focuses on providing an easy-to-use interface to guide you through the process of finding and downloading media. Advanced features like multiple qualities of a show/movie necessitate such a paradigm shift. So here is a quick step-by-step guide to get you started:
#### Downloading/Requesting a show
{% stepper %}
{% step %}
### Add the show
Add a show on the "Add Show" page. After adding the show you will be redirected to the show's page.
{% endstep %}
{% step %}
### Request season(s)
Click the "Request Season" button on the show's page. Select one or more seasons that you want to download.
{% endstep %}
{% step %}
### Select qualities
Select the "Min Quality" — the minimum resolution of the content to download.\
Select the "Wanted Quality" — the **maximum** resolution of the content to download.
{% endstep %}
{% step %}
### Submit request
Click "Submit request". This is not the last step: an administrator must first approve your request for download. Only after approval will the requested content be downloaded.
{% endstep %}
{% step %}
### Finished
Congratulations! You've downloaded a show (after admin approval).
{% endstep %}
{% endstepper %}
#### Requesting a show (as an admin)
{% stepper %}
{% step %}
### Add the show
Add a show on the "Add Show" page. After adding the show you will be redirected to the show's page.
{% endstep %}
{% step %}
### Request season(s)
Click the "Request Season" button on the show's page. Select one or more seasons that you want to download.
{% endstep %}
{% step %}
### Select qualities
Select the "Min Quality" — the minimum resolution of the content to download.\
Select the "Wanted Quality" — the **maximum** resolution of the content to download.
{% endstep %}
{% step %}
### Submit request (auto-approved)
Click "Submit request". As an admin, your request will be automatically approved.
{% endstep %}
{% step %}
### Finished
Congratulations! You've downloaded a show.
{% endstep %}
{% endstepper %}
#### Downloading a show (admin-only)
You can only directly download a show if you are an admin!
{% stepper %}
{% step %}
### Go to the show's page
Open the show's page that contains the season you wish to download.
{% endstep %}
{% step %}
### Start download
Click the "Download Season" button.
{% endstep %}
{% step %}
### Enter season number
Enter the season number that you want to download.
{% endstep %}
{% step %}
### Optional file path suffix
Optionally select the "File Path Suffix". Note: **it needs to be unique per season per show!**
{% endstep %}
{% step %}
### Choose torrent and download
Click "Download" on the torrent that you want to download.
{% endstep %}
{% step %}
### Finished
Congratulations! You've downloaded a show.
{% endstep %}
{% endstepper %}
#### Managing requests
Users need their requests to be approved by an admin. To manage requests:
{% stepper %}
{% step %}
### Open Requests page
Go to the "Requests" page.
{% endstep %}
{% step %}
### Approve, delete or modify
From the Requests page you can approve, delete, or modify a user's request.
{% endstep %}
{% endstepper %}

@@ -1,7 +1,8 @@
from pydantic_settings import BaseSettings
from pydantic import Field
import secrets
from pydantic import Field
from pydantic_settings import BaseSettings
class OpenIdConfig(BaseSettings):
client_id: str = ""
@@ -19,7 +20,3 @@ class AuthConfig(BaseSettings):
admin_emails: list[str] = []
email_password_resets: bool = False
openid_connect: OpenIdConfig = OpenIdConfig()
@property
def jwt_signing_key(self):
return self._jwt_signing_key


@@ -1,26 +1,24 @@
from collections.abc import AsyncGenerator
from typing import Optional
from fastapi import Depends
from fastapi_users.db import (
SQLAlchemyBaseOAuthAccountTableUUID,
SQLAlchemyBaseUserTableUUID,
SQLAlchemyUserDatabase,
SQLAlchemyBaseOAuthAccountTableUUID,
)
from sqlalchemy import String
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine
from sqlalchemy.orm import Mapped, relationship, mapped_column
from sqlalchemy.orm import Mapped, mapped_column, relationship
from media_manager.database import Base, build_db_url
from media_manager.config import MediaManagerConfig
from media_manager.database import Base, build_db_url
class OAuthAccount(SQLAlchemyBaseOAuthAccountTableUUID, Base):
access_token: Mapped[str] = mapped_column(String(length=4096), nullable=False)
refresh_token: Mapped[Optional[str]] = mapped_column(
refresh_token: Mapped[str | None] = mapped_column(
String(length=4096), nullable=True
)
pass
class User(SQLAlchemyBaseUserTableUUID, Base):
@@ -35,10 +33,12 @@ engine = create_async_engine(
async_session_maker = async_sessionmaker(engine, expire_on_commit=False)
async def get_async_session() -> AsyncGenerator[AsyncSession, None]:
async def get_async_session() -> AsyncGenerator[AsyncSession]:
async with async_session_maker() as session:
yield session
async def get_user_db(session: AsyncSession = Depends(get_async_session)):
async def get_user_db(
session: AsyncSession = Depends(get_async_session),
) -> AsyncGenerator[SQLAlchemyUserDatabase]:
yield SQLAlchemyUserDatabase(session, User, OAuthAccount)


@@ -1,26 +1,36 @@
from fastapi import APIRouter, Depends
from fastapi import status
from collections.abc import AsyncGenerator
from contextlib import asynccontextmanager
from fastapi import APIRouter, Depends, FastAPI, status
from fastapi_users.router import get_oauth_router
from httpx_oauth.oauth2 import OAuth2
from sqlalchemy import select
from media_manager.config import MediaManagerConfig
from media_manager.auth.db import User
from media_manager.auth.schemas import UserRead, AuthMetadata
from media_manager.auth.schemas import AuthMetadata, UserRead
from media_manager.auth.users import (
SECRET,
create_default_admin_user,
current_superuser,
fastapi_users,
openid_client,
openid_cookie_auth_backend,
SECRET,
fastapi_users,
)
from media_manager.config import MediaManagerConfig
from media_manager.database import DbSessionDependency
users_router = APIRouter()
@asynccontextmanager
async def lifespan(_app: FastAPI) -> AsyncGenerator:
await create_default_admin_user()
yield
users_router = APIRouter(lifespan=lifespan)
auth_metadata_router = APIRouter()
def get_openid_router():
def get_openid_router() -> APIRouter:
if openid_client:
return get_oauth_router(
oauth_client=openid_client,
@@ -31,23 +41,22 @@ def get_openid_router():
is_verified_by_default=True,
redirect_url=None,
)
else:
# this is there, so that the appropriate routes are created even if OIDC is not configured,
# e.g. for generating the frontend's openapi client
return get_oauth_router(
oauth_client=OAuth2(
client_id="mock",
client_secret="mock",
authorize_endpoint="https://example.com/authorize",
access_token_endpoint="https://example.com/token",
),
backend=openid_cookie_auth_backend,
get_user_manager=fastapi_users.get_user_manager,
state_secret=SECRET,
associate_by_email=False,
is_verified_by_default=False,
redirect_url=None,
)
# this is there, so that the appropriate routes are created even if OIDC is not configured,
# e.g. for generating the frontend's openapi client
return get_oauth_router(
oauth_client=OAuth2(
client_id="mock",
client_secret="mock", # noqa: S106
authorize_endpoint="https://example.com/authorize",
access_token_endpoint="https://example.com/token", # noqa: S106
),
backend=openid_cookie_auth_backend,
get_user_manager=fastapi_users.get_user_manager,
state_secret=SECRET,
associate_by_email=False,
is_verified_by_default=False,
redirect_url=None,
)
openid_config = MediaManagerConfig().auth.openid_connect
@@ -68,5 +77,4 @@ def get_all_users(db: DbSessionDependency) -> list[UserRead]:
def get_auth_metadata() -> AuthMetadata:
if openid_config.enabled:
return AuthMetadata(oauth_providers=[openid_config.name])
else:
return AuthMetadata(oauth_providers=[])
return AuthMetadata(oauth_providers=[])


@@ -1,9 +1,11 @@
import contextlib
import logging
import uuid
from typing import Optional, Any
from collections.abc import AsyncGenerator
from typing import Any, override
from fastapi import Depends, Request
from fastapi.responses import RedirectResponse, Response
from fastapi_users import BaseUserManager, FastAPIUsers, UUIDIDMixin, models
from fastapi_users.authentication import (
AuthenticationBackend,
@@ -13,13 +15,12 @@ from fastapi_users.authentication import (
)
from fastapi_users.db import SQLAlchemyUserDatabase
from httpx_oauth.clients.openid import OpenID
from fastapi.responses import RedirectResponse, Response
from sqlalchemy import func, select
from starlette import status
from sqlalchemy import select, func
import media_manager.notification.utils
from media_manager.auth.db import User, get_user_db, get_async_session
from media_manager.auth.schemas import UserUpdate, UserCreate
from media_manager.auth.db import User, get_async_session, get_user_db
from media_manager.auth.schemas import UserCreate, UserUpdate
from media_manager.config import MediaManagerConfig
log = logging.getLogger(__name__)
@@ -44,28 +45,33 @@ class UserManager(UUIDIDMixin, BaseUserManager[User, uuid.UUID]):
reset_password_token_secret = SECRET
verification_token_secret = SECRET
@override
async def on_after_update(
self,
user: models.UP,
update_dict: dict[str, Any],
request: Optional[Request] = None,
request: Request | None = None,
) -> None:
log.info(f"User {user.id} has been updated.")
if "is_superuser" in update_dict and update_dict["is_superuser"]:
if update_dict.get("is_superuser"):
log.info(f"User {user.id} has been granted superuser privileges.")
if "email" in update_dict:
updated_user = UserUpdate(is_verified=True)
await self.update(user=user, user_update=updated_user)
async def on_after_register(self, user: User, request: Optional[Request] = None):
@override
async def on_after_register(
self, user: User, request: Request | None = None
) -> None:
log.info(f"User {user.id} has registered.")
if user.email in config.admin_emails:
updated_user = UserUpdate(is_superuser=True, is_verified=True)
await self.update(user=user, user_update=updated_user)
@override
async def on_after_forgot_password(
self, user: User, token: str, request: Optional[Request] = None
):
self, user: User, token: str, request: Request | None = None
) -> None:
link = f"{MediaManagerConfig().misc.frontend_url}web/login/reset-password?token={token}"
log.info(f"User {user.id} has forgot their password. Reset Link: {link}")
@@ -80,7 +86,7 @@ class UserManager(UUIDIDMixin, BaseUserManager[User, uuid.UUID]):
<p>Hi {user.email},
<br>
<br>
if you forgot your password, <a href="{link}">reset your password here</a>.<br>
if you forgot your password, <a href=\"{link}\">reset your password here</a>.<br>
If you did not request a password reset, you can ignore this email.</p>
<br>
<br>
@@ -93,23 +99,28 @@ class UserManager(UUIDIDMixin, BaseUserManager[User, uuid.UUID]):
)
log.info(f"Sent password reset email to {user.email}")
@override
async def on_after_reset_password(
self, user: User, request: Optional[Request] = None
):
self, user: User, request: Request | None = None
) -> None:
log.info(f"User {user.id} has reset their password.")
@override
async def on_after_request_verify(
self, user: User, token: str, request: Optional[Request] = None
):
self, user: User, token: str, request: Request | None = None
) -> None:
log.info(
f"Verification requested for user {user.id}. Verification token: {token}"
)
async def on_after_verify(self, user: User, request: Optional[Request] = None):
@override
async def on_after_verify(self, user: User, request: Request | None = None) -> None:
log.info(f"User {user.id} has been verified")
async def get_user_manager(user_db: SQLAlchemyUserDatabase = Depends(get_user_db)):
async def get_user_manager(
user_db: SQLAlchemyUserDatabase = Depends(get_user_db),
) -> AsyncGenerator[UserManager]:
yield UserManager(user_db)
@@ -118,7 +129,7 @@ get_user_db_context = contextlib.asynccontextmanager(get_user_db)
get_user_manager_context = contextlib.asynccontextmanager(get_user_manager)
async def create_default_admin_user():
async def create_default_admin_user() -> None:
"""Create a default admin user if no users exist in the database"""
try:
async with get_async_session_context() as session:
@@ -140,7 +151,7 @@ async def create_default_admin_user():
if config.auth.admin_emails
else "admin@example.com"
)
default_password = "admin" # Simple default password
default_password = "admin" # noqa: S105 # Simple default password
user_create = UserCreate(
email=admin_email,
@@ -164,17 +175,13 @@ async def create_default_admin_user():
log.info(
f"Found {user_count} existing users. Skipping default user creation."
)
except Exception as e:
log.error(f"Failed to create default admin user: {e}")
except Exception:
log.exception("Failed to create default admin user")
log.info(
"You can create an admin user manually by registering with an email from the admin_emails list in your config."
)
async def get_user_manager(user_db: SQLAlchemyUserDatabase = Depends(get_user_db)):
yield UserManager(user_db)
def get_jwt_strategy() -> JWTStrategy[models.UP, models.ID]:
return JWTStrategy(secret=SECRET, lifetime_seconds=LIFETIME)


@@ -1,13 +1,12 @@
import logging
import os
from pathlib import Path
from typing import Type, Tuple
from pydantic import AnyHttpUrl
from pydantic_settings import (
BaseSettings,
SettingsConfigDict,
PydanticBaseSettingsSource,
SettingsConfigDict,
TomlConfigSettingsSource,
)
@@ -41,7 +40,7 @@ class BasicConfig(BaseSettings):
movie_directory: Path = Path(__file__).parent.parent / "data" / "movies"
torrent_directory: Path = Path(__file__).parent.parent / "data" / "torrents"
frontend_url: AnyHttpUrl = "http://localhost:8000"
frontend_url: AnyHttpUrl = AnyHttpUrl("http://localhost:8000")
cors_urls: list[str] = []
development: bool = False
@@ -71,12 +70,12 @@ class MediaManagerConfig(BaseSettings):
@classmethod
def settings_customise_sources(
cls,
settings_cls: Type[BaseSettings],
settings_cls: type[BaseSettings],
init_settings: PydanticBaseSettingsSource,
env_settings: PydanticBaseSettingsSource,
dotenv_settings: PydanticBaseSettingsSource,
file_secret_settings: PydanticBaseSettingsSource,
) -> Tuple[PydanticBaseSettingsSource, ...]:
) -> tuple[PydanticBaseSettingsSource, ...]:
return (
init_settings,
env_settings,


@@ -1,7 +1,8 @@
import logging
import os
from collections.abc import Generator
from contextvars import ContextVar
from typing import Annotated, Any, Generator, Optional
from typing import Annotated
from fastapi import Depends
from sqlalchemy import create_engine
@@ -9,12 +10,14 @@ from sqlalchemy.engine import Engine
from sqlalchemy.engine.url import URL
from sqlalchemy.orm import Session, declarative_base, sessionmaker
from media_manager.database.config import DbConfig
log = logging.getLogger(__name__)
Base = declarative_base()
engine: Optional[Engine] = None
SessionLocal: Optional[sessionmaker] = None
engine: Engine | None = None
SessionLocal: sessionmaker | None = None
def build_db_url(
@@ -23,21 +26,20 @@ def build_db_url(
host: str,
port: int | str,
dbname: str,
) -> str:
db_url = URL.create(
) -> URL:
return URL.create(
"postgresql+psycopg",
user,
password,
host,
port,
int(port),
dbname,
)
return db_url
def init_engine(
db_config: Any | None = None,
url: str | None = None,
db_config: DbConfig | None = None,
url: str | URL | None = None,
) -> Engine:
"""
Initialize the global SQLAlchemy engine and session factory.
@@ -51,7 +53,8 @@ def init_engine(
if db_config is None:
url = os.getenv("DATABASE_URL")
if not url:
raise RuntimeError("DB config or `DATABASE_URL` must be provided")
msg = "DB config or `DATABASE_URL` must be provided"
raise RuntimeError(msg)
else:
url = build_db_url(
db_config.user,
@@ -76,22 +79,22 @@ def init_engine(
def get_engine() -> Engine:
if engine is None:
raise RuntimeError("Engine not initialized. Call init_engine(...) first.")
msg = "Engine not initialized. Call init_engine(...) first."
raise RuntimeError(msg)
return engine
def get_session() -> Generator[Session, Any, None]:
def get_session() -> Generator[Session]:
if SessionLocal is None:
raise RuntimeError(
"Session factory not initialized. Call init_engine(...) first."
)
msg = "Session factory not initialized. Call init_engine(...) first."
raise RuntimeError(msg)
db = SessionLocal()
try:
yield db
db.commit()
except Exception as e:
except Exception:
db.rollback()
log.critical(f"error occurred: {e}")
log.critical("", exc_info=True)
raise
finally:
db.close()


@@ -5,5 +5,5 @@ class DbConfig(BaseSettings):
host: str = "localhost"
port: int = 5432
user: str = "MediaManager"
password: str = "MediaManager"
password: str = "MediaManager" # noqa: S105
dbname: str = "MediaManager"


@@ -1,124 +1,131 @@
from fastapi import Request
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse
from sqlalchemy.exc import IntegrityError
from psycopg.errors import UniqueViolation
from sqlalchemy.exc import IntegrityError
class MediaManagerException(Exception):
class RenameError(Exception):
"""Error when renaming something"""
def __init__(self, message: str = "Failed to rename source directory") -> None:
super().__init__(message)
class MediaManagerError(Exception):
"""Base exception for MediaManager errors."""
def __init__(self, message: str = "An error occurred."):
def __init__(self, message: str = "An error occurred.") -> None:
super().__init__(message)
self.message = message
class MediaAlreadyExists(MediaManagerException):
class MediaAlreadyExistsError(MediaManagerError):
"""Raised when a media entity already exists (HTTP 409)."""
def __init__(
self, message: str = "Entity with this ID or other identifier already exists"
):
) -> None:
super().__init__(message)
class NotFoundError(MediaManagerException):
class NotFoundError(MediaManagerError):
"""Raised when an entity is not found (HTTP 404)."""
def __init__(self, message: str = "The requested entity was not found."):
def __init__(self, message: str = "The requested entity was not found.") -> None:
super().__init__(message)
class InvalidConfigError(MediaManagerException):
class InvalidConfigError(MediaManagerError):
"""Raised when the server is improperly configured (HTTP 500)."""
def __init__(self, message: str = "The server is improperly configured."):
def __init__(self, message: str = "The server is improperly configured.") -> None:
super().__init__(message)
class BadRequestError(MediaManagerException):
class BadRequestError(MediaManagerError):
"""Raised for invalid client requests (HTTP 400)."""
def __init__(self, message: str = "Bad request."):
def __init__(self, message: str = "Bad request.") -> None:
super().__init__(message)
class UnauthorizedError(MediaManagerException):
class UnauthorizedError(MediaManagerError):
"""Raised for authentication failures (HTTP 401)."""
def __init__(self, message: str = "Unauthorized."):
def __init__(self, message: str = "Unauthorized.") -> None:
super().__init__(message)
class ForbiddenError(MediaManagerException):
class ForbiddenError(MediaManagerError):
"""Raised for forbidden actions (HTTP 403)."""
def __init__(self, message: str = "Forbidden."):
def __init__(self, message: str = "Forbidden.") -> None:
super().__init__(message)
class ConflictError(MediaManagerException):
class ConflictError(MediaManagerError):
"""Raised for resource conflicts (HTTP 409)."""
def __init__(self, message: str = "Conflict."):
def __init__(self, message: str = "Conflict.") -> None:
super().__init__(message)
class UnprocessableEntityError(MediaManagerException):
class UnprocessableEntityError(MediaManagerError):
"""Raised for validation errors (HTTP 422)."""
def __init__(self, message: str = "Unprocessable entity."):
def __init__(self, message: str = "Unprocessable entity.") -> None:
super().__init__(message)
# Exception handlers
async def media_already_exists_exception_handler(
request: Request, exc: MediaAlreadyExists
_request: Request, _exc: Exception
) -> JSONResponse:
return JSONResponse(status_code=409, content={"detail": exc.message})
return JSONResponse(status_code=409, content={"detail": str(_exc)})
async def not_found_error_exception_handler(
request: Request, exc: NotFoundError
_request: Request, _exc: Exception
) -> JSONResponse:
return JSONResponse(status_code=404, content={"detail": exc.message})
return JSONResponse(status_code=404, content={"detail": str(_exc)})
async def invalid_config_error_exception_handler(
request: Request, exc: InvalidConfigError
_request: Request, _exc: Exception
) -> JSONResponse:
return JSONResponse(status_code=500, content={"detail": exc.message})
return JSONResponse(status_code=500, content={"detail": str(_exc)})
async def bad_request_error_handler(
request: Request, exc: BadRequestError
_request: Request, exc: BadRequestError
) -> JSONResponse:
return JSONResponse(status_code=400, content={"detail": exc.message})
async def unauthorized_error_handler(
request: Request, exc: UnauthorizedError
_request: Request, exc: UnauthorizedError
) -> JSONResponse:
return JSONResponse(status_code=401, content={"detail": exc.message})
async def forbidden_error_handler(
request: Request, exc: ForbiddenError
_request: Request, exc: ForbiddenError
) -> JSONResponse:
return JSONResponse(status_code=403, content={"detail": exc.message})
async def conflict_error_handler(request: Request, exc: ConflictError) -> JSONResponse:
return JSONResponse(status_code=409, content={"detail": exc.message})
async def conflict_error_handler(_request: Request, _exc: Exception) -> JSONResponse:
return JSONResponse(status_code=409, content={"detail": str(_exc)})
async def unprocessable_entity_error_handler(
request: Request, exc: UnprocessableEntityError
_request: Request, exc: UnprocessableEntityError
) -> JSONResponse:
return JSONResponse(status_code=422, content={"detail": exc.message})
async def sqlalchemy_integrity_error_handler(
request: Request, exc: Exception
_request: Request, _exc: Exception
) -> JSONResponse:
return JSONResponse(
status_code=409,
@@ -128,10 +135,10 @@ async def sqlalchemy_integrity_error_handler(
)
def register_exception_handlers(app):
def register_exception_handlers(app: FastAPI) -> None:
app.add_exception_handler(NotFoundError, not_found_error_exception_handler)
app.add_exception_handler(
MediaAlreadyExists, media_already_exists_exception_handler
MediaAlreadyExistsError, media_already_exists_exception_handler
)
app.add_exception_handler(
InvalidConfigError, invalid_config_error_exception_handler

View File

@@ -1,8 +1,11 @@
import shutil
from logging import Logger
from pathlib import Path
from media_manager.config import MediaManagerConfig
def run_filesystem_checks(config, log):
def run_filesystem_checks(config: MediaManagerConfig, log: Logger) -> None:
log.info("Creating directories if they don't exist...")
config.misc.tv_directory.mkdir(parents=True, exist_ok=True)
config.misc.movie_directory.mkdir(parents=True, exist_ok=True)
@@ -33,10 +36,8 @@ def run_filesystem_checks(config, log):
if not test_hardlink.samefile(test_torrent_file):
log.critical("Hardlink creation failed!")
log.info("Successfully created test hardlink in TV directory")
except OSError as e:
log.error(
f"Hardlink creation failed, falling back to copying files. Error: {e}"
)
except OSError:
log.exception("Hardlink creation failed, falling back to copying files")
shutil.copy(src=test_torrent_file, dst=test_hardlink)
finally:
test_hardlink.unlink()

View File

@@ -2,10 +2,9 @@ from typing import Annotated
from fastapi import Depends
from media_manager.database import DbSessionDependency
from media_manager.indexer.repository import IndexerRepository
from media_manager.indexer.service import IndexerService
from media_manager.database import DbSessionDependency
from media_manager.tv.service import TvService
def get_indexer_repository(db_session: DbSessionDependency) -> IndexerRepository:
@@ -21,4 +20,4 @@ def get_indexer_service(
return IndexerService(indexer_repository)
indexer_service_dep = Annotated[TvService, Depends(get_indexer_service)]
indexer_service_dep = Annotated[IndexerService, Depends(get_indexer_service)]

View File

@@ -1,4 +1,4 @@
from abc import abstractmethod, ABC
from abc import ABC, abstractmethod
from media_manager.indexer.schemas import IndexerQueryResult
from media_manager.movies.schemas import Movie
@@ -8,11 +8,8 @@ from media_manager.tv.schemas import Show
class GenericIndexer(ABC):
name: str
def __init__(self, name: str = None):
if name:
self.name = name
else:
raise ValueError("indexer name must not be None")
def __init__(self, name: str) -> None:
self.name = name
@abstractmethod
def search(self, query: str, is_tv: bool) -> list[IndexerQueryResult]:

View File

@@ -1,21 +1,39 @@
import concurrent
import concurrent.futures
import logging
import xml.etree.ElementTree as ET
from concurrent.futures.thread import ThreadPoolExecutor
from dataclasses import dataclass
import requests
from media_manager.config import MediaManagerConfig
from media_manager.indexer.indexers.generic import GenericIndexer
from media_manager.indexer.indexers.torznab_mixin import TorznabMixin
from media_manager.indexer.schemas import IndexerQueryResult
from media_manager.config import MediaManagerConfig
from media_manager.movies.schemas import Movie
from media_manager.tv.schemas import Show
log = logging.getLogger(__name__)
@dataclass
class IndexerInfo:
supports_tv_search: bool
supports_tv_search_tmdb: bool
supports_tv_search_imdb: bool
supports_tv_search_tvdb: bool
supports_tv_search_season: bool
supports_tv_search_episode: bool
supports_movie_search: bool
supports_movie_search_tmdb: bool
supports_movie_search_imdb: bool
supports_movie_search_tvdb: bool
class Jackett(GenericIndexer, TorznabMixin):
def __init__(self):
def __init__(self) -> None:
"""
A subclass of GenericIndexer for interacting with the Jackett API.
@@ -30,11 +48,16 @@ class Jackett(GenericIndexer, TorznabMixin):
def search(self, query: str, is_tv: bool) -> list[IndexerQueryResult]:
log.debug("Searching for " + query)
params = {"q": query, "t": "tvsearch" if is_tv else "movie"}
return self.__search_jackett(params)
def __search_jackett(self, params: dict) -> list[IndexerQueryResult]:
futures = []
with ThreadPoolExecutor() as executor, requests.Session() as session:
for indexer in self.indexers:
future = executor.submit(
self.get_torrents_by_indexer, indexer, query, is_tv, session
self.get_torrents_by_indexer, indexer, params, session
)
futures.append(future)
@@ -45,19 +68,108 @@ class Jackett(GenericIndexer, TorznabMixin):
result = future.result()
if result is not None:
responses.extend(result)
except Exception as e:
log.error(f"search result failed with: {e}")
except Exception:
log.exception("Searching failed")
return responses
def get_torrents_by_indexer(
self, indexer: str, query: str, is_tv: bool, session: requests.Session
) -> list[IndexerQueryResult]:
def __get_search_capabilities(
self, indexer: str, session: requests.Session
) -> IndexerInfo:
url = (
self.url
+ f"/api/v2.0/indexers/{indexer}/results/torznab/api?apikey={self.api_key}&t={'tvsearch' if is_tv else 'movie'}&q={query}"
+ f"/api/v2.0/indexers/{indexer}/results/torznab/api?apikey={self.api_key}&t=caps"
)
response = session.get(url, timeout=self.timeout_seconds)
if response.status_code != 200:
msg = f"Cannot get search capabilities for Indexer {indexer}"
log.error(msg)
raise RuntimeError(msg)
xml = response.text
xml_tree = ET.fromstring(xml)  # noqa: S314 # response body comes from an operator-configured indexer
tv_search = xml_tree.find("./*/tv-search")
movie_search = xml_tree.find("./*/movie-search")
if tv_search is not None:
log.debug(tv_search.attrib)
if movie_search is not None:
log.debug(movie_search.attrib)
tv_search_capabilities = []
movie_search_capabilities = []
tv_search_available = (tv_search is not None) and (
tv_search.attrib["available"] == "yes"
)
movie_search_available = (movie_search is not None) and (
movie_search.attrib["available"] == "yes"
)
if tv_search_available:
tv_search_capabilities = tv_search.attrib["supportedParams"].split(",")
if movie_search_available:
movie_search_capabilities = movie_search.attrib["supportedParams"].split(
","
)
return IndexerInfo(
supports_tv_search=tv_search_available,
supports_tv_search_imdb="imdbid" in tv_search_capabilities,
supports_tv_search_tmdb="tmdbid" in tv_search_capabilities,
supports_tv_search_tvdb="tvdbid" in tv_search_capabilities,
supports_tv_search_season="season" in tv_search_capabilities,
supports_tv_search_episode="ep" in tv_search_capabilities,
supports_movie_search=movie_search_available,
supports_movie_search_imdb="imdbid" in movie_search_capabilities,
supports_movie_search_tmdb="tmdbid" in movie_search_capabilities,
supports_movie_search_tvdb="tvdbid" in movie_search_capabilities,
)
def __get_optimal_query_parameters(
self, indexer: str, session: requests.Session, params: dict
) -> dict[str, str]:
query_params = {"apikey": self.api_key, "t": params["t"]}
search_capabilities = self.__get_search_capabilities(
indexer=indexer, session=session
)
if params["t"] == "tvsearch":
if not search_capabilities.supports_tv_search:
msg = f"Indexer {indexer} does not support TV search"
raise RuntimeError(msg)
if search_capabilities.supports_tv_search_season and "season" in params:
query_params["season"] = params["season"]
if search_capabilities.supports_tv_search_episode and "ep" in params:
query_params["ep"] = params["ep"]
if search_capabilities.supports_tv_search_imdb and "imdbid" in params:
query_params["imdbid"] = params["imdbid"]
elif search_capabilities.supports_tv_search_tvdb and "tvdbid" in params:
query_params["tvdbid"] = params["tvdbid"]
elif search_capabilities.supports_tv_search_tmdb and "tmdbid" in params:
query_params["tmdbid"] = params["tmdbid"]
else:
query_params["q"] = params["q"]
if params["t"] == "movie":
if not search_capabilities.supports_movie_search:
msg = f"Indexer {indexer} does not support Movie search"
raise RuntimeError(msg)
if search_capabilities.supports_movie_search_imdb and "imdbid" in params:
query_params["imdbid"] = params["imdbid"]
elif search_capabilities.supports_movie_search_tvdb and "tvdbid" in params:
query_params["tvdbid"] = params["tvdbid"]
elif search_capabilities.supports_movie_search_tmdb and "tmdbid" in params:
query_params["tmdbid"] = params["tmdbid"]
else:
query_params["q"] = params["q"]
return query_params
def get_torrents_by_indexer(
self, indexer: str, params: dict, session: requests.Session
) -> list[IndexerQueryResult]:
url = f"{self.url}/api/v2.0/indexers/{indexer}/results/torznab/api"
query_params = self.__get_optimal_query_parameters(
indexer=indexer, session=session, params=params
)
response = session.get(url, timeout=self.timeout_seconds, params=query_params)
log.debug(f"Indexer {indexer} url: {response.url}")
if response.status_code != 200:
log.error(
@@ -67,13 +179,30 @@ class Jackett(GenericIndexer, TorznabMixin):
results = self.process_search_result(response.content)
log.info(f"Indexer {indexer.name} returned {len(results)} results")
log.info(f"Indexer {indexer} returned {len(results)} results")
return results
def search_season(
self, query: str, show: Show, season_number: int
) -> list[IndexerQueryResult]:
pass
log.debug(f"Searching for season {season_number} of show {show.name}")
params = {
"t": "tvsearch",
"season": season_number,
"q": query,
}
if show.imdb_id:
params["imdbid"] = show.imdb_id
params[show.metadata_provider + "id"] = show.external_id
return self.__search_jackett(params=params)
def search_movie(self, query: str, movie: Movie) -> list[IndexerQueryResult]:
pass
log.debug(f"Searching for movie {movie.name}")
params = {
"t": "movie",
"q": query,
}
if movie.imdb_id:
params["imdbid"] = movie.imdb_id
params[movie.metadata_provider + "id"] = movie.external_id
return self.__search_jackett(params=params)
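The `t=caps` probe that `__get_search_capabilities` performs boils down to reading the `available` flag and `supportedParams` list off the `tv-search` and `movie-search` nodes. A network-free sketch of that parsing, against a hypothetical trimmed-down caps response (`SearchCaps` and `parse_caps` are illustrative names, not the codebase's):

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass

# Hypothetical, trimmed example of what a Torznab t=caps endpoint returns.
CAPS_XML = """
<caps>
  <searching>
    <tv-search available="yes" supportedParams="q,season,ep,imdbid,tvdbid"/>
    <movie-search available="yes" supportedParams="q,imdbid"/>
  </searching>
</caps>
"""


@dataclass
class SearchCaps:
    tv: bool
    tv_params: list
    movie: bool
    movie_params: list


def parse_caps(xml: str) -> SearchCaps:
    tree = ET.fromstring(xml)
    tv = tree.find("./*/tv-search")
    movie = tree.find("./*/movie-search")
    tv_ok = tv is not None and tv.attrib.get("available") == "yes"
    movie_ok = movie is not None and movie.attrib.get("available") == "yes"
    return SearchCaps(
        tv=tv_ok,
        tv_params=tv.attrib["supportedParams"].split(",") if tv_ok else [],
        movie=movie_ok,
        movie_params=movie.attrib["supportedParams"].split(",") if movie_ok else [],
    )


caps = parse_caps(CAPS_XML)
```

An indexer that advertises `season`/`ep` here is the one the `__get_optimal_query_parameters` branch can hand season and episode parameters to.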

View File

@@ -1,10 +1,10 @@
import logging
from dataclasses import dataclass
from requests import Session
from requests import Response, Session
from media_manager.indexer.indexers.generic import GenericIndexer
from media_manager.config import MediaManagerConfig
from media_manager.indexer.indexers.generic import GenericIndexer
from media_manager.indexer.indexers.torznab_mixin import TorznabMixin
from media_manager.indexer.schemas import IndexerQueryResult
from media_manager.movies.schemas import Movie
@@ -31,14 +31,14 @@ class IndexerInfo:
class Prowlarr(GenericIndexer, TorznabMixin):
def __init__(self):
def __init__(self) -> None:
"""
A subclass of GenericIndexer for interacting with the Prowlarr API.
"""
super().__init__(name="prowlarr")
self.config = MediaManagerConfig().indexers.prowlarr
def _call_prowlarr_api(self, path: str, parameters: dict = None):
def _call_prowlarr_api(self, path: str, parameters: dict | None = None) -> Response:
url = f"{self.config.url}/api/v1{path}"
headers = {"X-Api-Key": self.config.api_key}
with Session() as session:
@@ -50,7 +50,7 @@ class Prowlarr(GenericIndexer, TorznabMixin):
)
def _newznab_search(
self, indexer: IndexerInfo, parameters: dict = None
self, indexer: IndexerInfo, parameters: dict | None = None
) -> list[IndexerQueryResult]:
if parameters is None:
parameters = {}

View File

@@ -1,10 +1,9 @@
import logging
import xml.etree.ElementTree as ET
from datetime import datetime
from email.utils import parsedate_to_datetime
from media_manager.indexer.schemas import IndexerQueryResult
import xml.etree.ElementTree as ET
from xml.etree.ElementTree import Element
from email.utils import parsedate_to_datetime
from datetime import datetime, timezone
log = logging.getLogger(__name__)
@@ -12,7 +11,7 @@ log = logging.getLogger(__name__)
class TorznabMixin:
def process_search_result(self, xml: str) -> list[IndexerQueryResult]:
result_list: list[IndexerQueryResult] = []
xml_tree = ET.fromstring(xml)
xml_tree = ET.fromstring(xml)  # noqa: S314 # response body comes from an operator-configured indexer
xmlns = {
"torznab": "http://torznab.com/schemas/2015/feed",
"atom": "http://www.w3.org/2005/Atom",
@@ -33,16 +32,14 @@ class TorznabMixin:
item.find("enclosure").attrib["type"] != "application/x-bittorrent"
)
attributes: list[Element] = [
x for x in item.findall("torznab:attr", xmlns)
]
attributes = list(item.findall("torznab:attr", xmlns))
for attribute in attributes:
if is_usenet:
if attribute.attrib["name"] == "usenetdate":
posted_date = parsedate_to_datetime(
attribute.attrib["value"]
)
now = datetime.now(timezone.utc)
age = int((now - posted_date).total_seconds())
else:
if attribute.attrib["name"] == "seeders":
@@ -64,17 +61,28 @@ class TorznabMixin:
if upload_volume_factor == 2:
flags.append("doubleupload")
title = item.find("title").text
size_str = item.find("size")
if size_str is None or size_str.text is None:
log.warning(f"Torznab item {title} has no size, skipping.")
continue
try:
size = int(size_str.text or "0")
except ValueError:
log.warning(f"Torznab item {title} has invalid size, skipping.")
continue
result = IndexerQueryResult(
title=item.find("title").text,
title=title or "unknown",
download_url=str(item.find("enclosure").attrib["url"]),
seeders=seeders,
flags=flags,
size=int(item.find("size").text),
size=size,
usenet=is_usenet,
age=age,
indexer=indexer_name,
)
result_list.append(result)
except Exception as e:
log.error(f"1 Torznab search result errored with error: {e}")
except Exception:
log.exception("One Torznab search result failed to parse")
return result_list
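For usenet items the mixin derives the age from the RFC 2822 `usenetdate` attribute via `parsedate_to_datetime`. A small sketch of that computation with a fixed "now" (the date strings below are made up for illustration):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime


def usenet_age_seconds(usenetdate: str, now: datetime) -> int:
    """Age of a usenet post in whole seconds, from an RFC 2822 date string."""
    posted = parsedate_to_datetime(usenetdate)  # returns an aware datetime
    return int((now - posted).total_seconds())


now = datetime(2026, 2, 14, 12, 0, 0, tzinfo=timezone.utc)
age = usenet_age_seconds("Sat, 14 Feb 2026 11:00:00 +0000", now)
```

Both datetimes are timezone-aware, so the subtraction is well-defined regardless of the server's local zone.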

View File

@@ -1,6 +1,6 @@
from uuid import UUID
from sqlalchemy import String, Integer
from sqlalchemy import Integer, String
from sqlalchemy.dialects.postgresql import ARRAY
from sqlalchemy.orm import Mapped, mapped_column
from sqlalchemy.sql.sqltypes import BigInteger

View File

@@ -4,15 +4,17 @@ from sqlalchemy.orm import Session
from media_manager.indexer.models import IndexerQueryResult
from media_manager.indexer.schemas import (
IndexerQueryResultId,
IndexerQueryResult as IndexerQueryResultSchema,
)
from media_manager.indexer.schemas import (
IndexerQueryResultId,
)
log = logging.getLogger(__name__)
class IndexerRepository:
def __init__(self, db: Session):
def __init__(self, db: Session) -> None:
self.db = db
def get_result(self, result_id: IndexerQueryResultId) -> IndexerQueryResultSchema:

View File

@@ -3,7 +3,7 @@ import typing
from uuid import UUID, uuid4
import pydantic
from pydantic import BaseModel, computed_field, ConfigDict
from pydantic import BaseModel, ConfigDict, computed_field
from media_manager.torrent.models import Quality
@@ -13,7 +13,9 @@ IndexerQueryResultId = typing.NewType("IndexerQueryResultId", UUID)
class IndexerQueryResult(BaseModel):
model_config = ConfigDict(from_attributes=True)
id: IndexerQueryResultId = pydantic.Field(default_factory=uuid4)
id: IndexerQueryResultId = pydantic.Field(
default_factory=lambda: IndexerQueryResultId(uuid4())
)
title: str
download_url: str = pydantic.Field(
exclude=True,
@@ -30,7 +32,7 @@ class IndexerQueryResult(BaseModel):
indexer: str | None
@computed_field(return_type=Quality)
@computed_field
@property
def quality(self) -> Quality:
high_quality_pattern = r"\b(4k)\b"
@@ -40,31 +42,29 @@ class IndexerQueryResult(BaseModel):
if re.search(high_quality_pattern, self.title, re.IGNORECASE):
return Quality.uhd
elif re.search(medium_quality_pattern, self.title, re.IGNORECASE):
if re.search(medium_quality_pattern, self.title, re.IGNORECASE):
return Quality.fullhd
elif re.search(low_quality_pattern, self.title, re.IGNORECASE):
if re.search(low_quality_pattern, self.title, re.IGNORECASE):
return Quality.hd
elif re.search(very_low_quality_pattern, self.title, re.IGNORECASE):
if re.search(very_low_quality_pattern, self.title, re.IGNORECASE):
return Quality.sd
return Quality.unknown
@computed_field(return_type=list[int])
@computed_field
@property
def season(self) -> list[int]:
pattern = r"\b[sS](\d+)\b"
pattern = r"\bS(\d+)\b"
matches = re.findall(pattern, self.title, re.IGNORECASE)
if len(matches) == 2:
result = []
for i in range(int(matches[0]), int(matches[1]) + 1):
result.append(i)
result = list(range(int(matches[0]), int(matches[1]) + 1))
elif len(matches) == 1:
result = [int(matches[0])]
else:
result = []
return result
def __gt__(self, other) -> bool:
def __gt__(self, other: "IndexerQueryResult") -> bool:
if self.quality.value != other.quality.value:
return self.quality.value < other.quality.value
if self.score != other.score:
@@ -78,7 +78,7 @@ class IndexerQueryResult(BaseModel):
return self.size < other.size
def __lt__(self, other) -> bool:
def __lt__(self, other: "IndexerQueryResult") -> bool:
if self.quality.value != other.quality.value:
return self.quality.value > other.quality.value
if self.score != other.score:
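The `quality` computed field maps regex hits on the release title to a `Quality` rank, lowest value winning in comparisons. A standalone sketch of that pattern table; only the `\b(4k)\b` pattern appears verbatim in the diff, the remaining resolution patterns and enum values here are assumptions:

```python
import re
from enum import Enum


class Quality(Enum):
    # Lower value = better quality, matching the inverted comparisons above.
    uhd = 0
    fullhd = 1
    hd = 2
    sd = 3
    unknown = 4


# Illustrative patterns; the actual medium/low/very-low patterns are not shown in the diff.
PATTERNS = [
    (r"\b(4k|2160p)\b", Quality.uhd),
    (r"\b(1080p)\b", Quality.fullhd),
    (r"\b(720p)\b", Quality.hd),
    (r"\b(480p)\b", Quality.sd),
]


def detect_quality(title: str) -> Quality:
    for pattern, quality in PATTERNS:
        if re.search(pattern, title, re.IGNORECASE):
            return quality
    return Quality.unknown


q1 = detect_quality("Some.Show.S01.2160p.WEB-DL")
q2 = detect_quality("Some.Show.S01.HDTV")
```

Trying patterns best-first means a title carrying both "2160p" and "1080p" still resolves to the higher tier.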

View File

@@ -4,8 +4,8 @@ from media_manager.config import MediaManagerConfig
from media_manager.indexer.indexers.generic import GenericIndexer
from media_manager.indexer.indexers.jackett import Jackett
from media_manager.indexer.indexers.prowlarr import Prowlarr
from media_manager.indexer.schemas import IndexerQueryResultId, IndexerQueryResult
from media_manager.indexer.repository import IndexerRepository
from media_manager.indexer.schemas import IndexerQueryResult, IndexerQueryResultId
from media_manager.movies.schemas import Movie
from media_manager.torrent.utils import remove_special_chars_and_parentheses
from media_manager.tv.schemas import Show
@@ -14,7 +14,7 @@ log = logging.getLogger(__name__)
class IndexerService:
def __init__(self, indexer_repository: IndexerRepository):
def __init__(self, indexer_repository: IndexerRepository) -> None:
config = MediaManagerConfig()
self.repository = indexer_repository
self.indexers: list[GenericIndexer] = []
@@ -45,9 +45,9 @@ class IndexerService:
log.debug(
f"Indexer {indexer.__class__.__name__} returned {len(indexer_results)} results for query: {query}"
)
except Exception as e:
log.error(
f"Indexer {indexer.__class__.__name__} failed for query '{query}': {e}"
except Exception:
log.exception(
f"Indexer {indexer.__class__.__name__} failed for query '{query}'"
)
for result in results:
@@ -55,7 +55,7 @@ class IndexerService:
return results
def search_movie(self, movie: Movie):
def search_movie(self, movie: Movie) -> list[IndexerQueryResult]:
query = f"{movie.name} {movie.year}"
query = remove_special_chars_and_parentheses(query)
@@ -65,9 +65,9 @@ class IndexerService:
indexer_results = indexer.search_movie(query=query, movie=movie)
if indexer_results:
results.extend(indexer_results)
except Exception as e:
log.error(
f"Indexer {indexer.__class__.__name__} failed for movie search '{query}': {e}"
except Exception:
log.exception(
f"Indexer {indexer.__class__.__name__} failed for movie search '{query}'"
)
for result in results:
@@ -75,7 +75,7 @@ class IndexerService:
return results
def search_season(self, show: Show, season_number: int):
def search_season(self, show: Show, season_number: int) -> list[IndexerQueryResult]:
query = f"{show.name} {show.year} S{season_number:02d}"
query = remove_special_chars_and_parentheses(query)
@@ -87,9 +87,9 @@ class IndexerService:
)
if indexer_results:
results.extend(indexer_results)
except Exception as e:
log.error(
f"Indexer {indexer.__class__.__name__} failed for season search '{query}': {e}"
except Exception:
log.exception(
f"Indexer {indexer.__class__.__name__} failed for season search '{query}'"
)
for result in results:

View File

@@ -14,7 +14,7 @@ log = logging.getLogger(__name__)
def evaluate_indexer_query_result(
query_result: IndexerQueryResult, ruleset: ScoringRuleSet
) -> (IndexerQueryResult, bool):
) -> tuple[IndexerQueryResult, bool]:
title_rules = MediaManagerConfig().indexers.title_scoring_rules
indexer_flag_rules = MediaManagerConfig().indexers.indexer_flag_scoring_rules
for rule_name in ruleset.rule_names:
@@ -132,7 +132,8 @@ def follow_redirects_to_final_torrent_url(
if 300 <= response.status_code < 400:
redirect_url = response.headers.get("Location")
if not redirect_url:
raise RuntimeError("Redirect response without Location header")
msg = "Redirect response without Location header"
raise RuntimeError(msg)
# Resolve relative redirects against the last URL
current_url = urljoin(current_url, redirect_url)
@@ -144,10 +145,15 @@ def follow_redirects_to_final_torrent_url(
response.raise_for_status() # Raise an exception for bad status codes
return current_url
else:
raise RuntimeError("Exceeded maximum number of redirects")
msg = "Exceeded maximum number of redirects"
raise RuntimeError(msg)
except requests.exceptions.RequestException as e:
log.debug(f"An error occurred during the request for {initial_url}: {e}")
raise RuntimeError(f"An error occurred during the request: {e}") from e
log.debug(
f"An error occurred during the request for {initial_url}",
exc_info=True,
)
msg = "An error occurred during the request"
raise RuntimeError(msg) from e
return current_url
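The redirect loop above resolves relative `Location` headers against the previous URL via `urljoin` and caps the hop count. A network-free sketch of that resolution logic, substituting a stubbed url-to-Location map for live HTTP responses (the URLs are invented):

```python
from urllib.parse import urljoin


def resolve_redirect_chain(
    initial_url: str, redirects: dict, max_redirects: int = 5
) -> str:
    """Follow a precomputed url -> Location map, resolving relative redirects."""
    current = initial_url
    for _ in range(max_redirects):
        location = redirects.get(current)
        if location is None:
            return current  # no further redirect: this is the final URL
        # Relative Location headers are resolved against the last URL.
        current = urljoin(current, location)
    msg = "Exceeded maximum number of redirects"
    raise RuntimeError(msg)


final = resolve_redirect_chain(
    "https://indexer.example/dl/1",
    {
        "https://indexer.example/dl/1": "/dl/2",  # relative redirect
        "https://indexer.example/dl/2": "https://cdn.example/file.torrent",
    },
)
```

The same `urljoin` call is what lets the production code survive indexers that send path-only `Location` headers.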

View File

@@ -1,15 +1,18 @@
import logging
import os
import sys
from datetime import UTC, datetime
from logging.config import dictConfig
from pythonjsonlogger.json import JsonFormatter
from pathlib import Path
from datetime import datetime, timezone
from typing import override
from pythonjsonlogger.json import JsonFormatter
class ISOJsonFormatter(JsonFormatter):
def formatTime(self, record, datefmt=None):
dt = datetime.fromtimestamp(record.created, tz=timezone.utc)
@override
def formatTime(self, record: logging.LogRecord, datefmt: str | None = None) -> str:
dt = datetime.fromtimestamp(record.created, tz=UTC)
return dt.isoformat(timespec="milliseconds").replace("+00:00", "Z")
@@ -18,13 +21,20 @@ LOG_FILE = Path(os.getenv("LOG_FILE", "/app/config/media_manager.log"))
LOGGING_CONFIG = {
"version": 1,
"disable_existing_loggers": False,
"filters": {
"correlation_id": {
"()": "asgi_correlation_id.CorrelationIdFilter",
"uuid_length": 32,
"default_value": "-",
},
},
"formatters": {
"default": {
"format": "%(asctime)s - %(levelname)s - %(name)s - %(funcName)s(): %(message)s"
"format": "%(asctime)s - [%(correlation_id)s] %(levelname)s - %(name)s - %(funcName)s(): %(message)s"
},
"json": {
"()": ISOJsonFormatter,
"format": "%(asctime)s %(levelname)s %(name)s %(message)s",
"format": "%(asctime)s %(correlation_id)s %(levelname)s %(name)s %(message)s",
"rename_fields": {
"levelname": "level",
"asctime": "timestamp",
@@ -36,11 +46,13 @@ LOGGING_CONFIG = {
"console": {
"class": "logging.StreamHandler",
"formatter": "default",
"filters": ["correlation_id"],
"stream": sys.stdout,
},
"file": {
"class": "logging.handlers.RotatingFileHandler",
"formatter": "json",
"filters": ["correlation_id"],
"filename": str(LOG_FILE),
"maxBytes": 10485760,
"backupCount": 5,
@@ -59,7 +71,7 @@ LOGGING_CONFIG = {
}
def setup_logging():
def setup_logging() -> None:
dictConfig(LOGGING_CONFIG)
logging.basicConfig(
level=LOG_LEVEL,
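The `correlation_id` filter wired into the dictConfig above stamps each record before the formatter runs, which is why `%(correlation_id)s` is safe in the format strings. A sketch with a static stand-in filter (the real one is `asgi_correlation_id.CorrelationIdFilter`, which reads the id from request context):

```python
import logging


class StaticCorrelationId(logging.Filter):
    """Stand-in for CorrelationIdFilter: stamps a fixed id onto every record."""

    def filter(self, record: logging.LogRecord) -> bool:
        record.correlation_id = "abc123"
        return True


formatter = logging.Formatter(
    "[%(correlation_id)s] %(levelname)s - %(name)s: %(message)s"
)
record = logging.LogRecord(
    "media_manager", logging.INFO, "app.py", 1, "Hello World!", None, None
)
StaticCorrelationId().filter(record)  # filter runs before formatting
line = formatter.format(record)
```

Without the filter attached, the same format string would raise on the missing `correlation_id` attribute, which is why the config lists it under every handler's `filters`.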

View File

@@ -1,43 +1,48 @@
from media_manager.logging import setup_logging, LOGGING_CONFIG
from media_manager.scheduler import setup_scheduler
from media_manager.filesystem_checks import run_filesystem_checks
from media_manager.config import MediaManagerConfig
import uvicorn
import logging
import os
from fastapi import FastAPI, APIRouter
import uvicorn
from asgi_correlation_id import CorrelationIdMiddleware
from fastapi import APIRouter, FastAPI, Request, Response
from fastapi.middleware.cors import CORSMiddleware
from uvicorn.middleware.proxy_headers import ProxyHeadersMiddleware
from fastapi.staticfiles import StaticFiles
from starlette.responses import RedirectResponse, FileResponse, Response
from media_manager.auth.users import (
bearer_auth_backend,
fastapi_users,
cookie_auth_backend,
)
from psycopg.errors import UniqueViolation
from sqlalchemy.exc import IntegrityError
from starlette.responses import FileResponse, RedirectResponse
from uvicorn.middleware.proxy_headers import ProxyHeadersMiddleware
import media_manager.movies.router as movies_router
import media_manager.torrent.router as torrent_router
import media_manager.tv.router as tv_router
from media_manager.auth.router import (
users_router as custom_users_router,
auth_metadata_router,
get_openid_router,
)
from media_manager.auth.schemas import UserCreate, UserRead, UserUpdate
from media_manager.exceptions import (
NotFoundError,
not_found_error_exception_handler,
MediaAlreadyExists,
media_already_exists_exception_handler,
InvalidConfigError,
invalid_config_error_exception_handler,
sqlalchemy_integrity_error_handler,
ConflictError,
conflict_error_handler,
from media_manager.auth.router import (
users_router as custom_users_router,
)
from sqlalchemy.exc import IntegrityError
from psycopg.errors import UniqueViolation
import media_manager.torrent.router as torrent_router
import media_manager.movies.router as movies_router
import media_manager.tv.router as tv_router
from media_manager.auth.schemas import UserCreate, UserRead, UserUpdate
from media_manager.auth.users import (
bearer_auth_backend,
cookie_auth_backend,
fastapi_users,
)
from media_manager.config import MediaManagerConfig
from media_manager.exceptions import (
ConflictError,
InvalidConfigError,
MediaAlreadyExistsError,
NotFoundError,
conflict_error_handler,
invalid_config_error_exception_handler,
media_already_exists_exception_handler,
not_found_error_exception_handler,
sqlalchemy_integrity_error_handler,
)
from media_manager.filesystem_checks import run_filesystem_checks
from media_manager.logging import LOGGING_CONFIG, setup_logging
from media_manager.notification.router import router as notification_router
import logging
from media_manager.scheduler import setup_scheduler
setup_logging()
@@ -47,7 +52,7 @@ log = logging.getLogger(__name__)
if config.misc.development:
log.warning("Development Mode activated!")
scheduler = setup_scheduler(config, log)
scheduler = setup_scheduler(config)
run_filesystem_checks(config, log)
@@ -56,7 +61,7 @@ FRONTEND_FILES_DIR = os.getenv("FRONTEND_FILES_DIR")
DISABLE_FRONTEND_MOUNT = os.getenv("DISABLE_FRONTEND_MOUNT", "").lower() == "true"
FRONTEND_FOLLOW_SYMLINKS = os.getenv("FRONTEND_FOLLOW_SYMLINKS", "").lower() == "true"
log.info("Hello World!")
app = FastAPI(root_path=BASE_PATH)
app.add_middleware(ProxyHeadersMiddleware, trusted_hosts="*")
origins = config.misc.cors_urls
@@ -67,6 +72,7 @@ app.add_middleware(
allow_credentials=True,
allow_methods=["GET", "PUT", "POST", "DELETE", "PATCH", "HEAD", "OPTIONS"],
)
app.add_middleware(CorrelationIdMiddleware, header_name="X-Correlation-ID")
api_app = APIRouter(prefix="/api/v1")
@@ -139,23 +145,23 @@ else:
@app.get("/")
async def root():
async def root() -> RedirectResponse:
return RedirectResponse(url="/web/")
@app.get("/dashboard")
async def dashboard():
async def dashboard() -> RedirectResponse:
return RedirectResponse(url="/web/")
@app.get("/login")
async def login():
async def login() -> RedirectResponse:
return RedirectResponse(url="/web/")
# this will serve the custom 404 page for frontend routes, so SvelteKit can handle routing
@app.exception_handler(404)
async def not_found_handler(request, exc):
async def not_found_handler(request: Request, _exc: Exception) -> Response:
if not DISABLE_FRONTEND_MOUNT and any(
request.url.path.startswith(base_path) for base_path in ["/web", "/dashboard", "/login"]
):
@@ -165,7 +171,9 @@ async def not_found_handler(request, exc):
# Register exception handlers for custom exceptions
app.add_exception_handler(NotFoundError, not_found_error_exception_handler)
app.add_exception_handler(MediaAlreadyExists, media_already_exists_exception_handler)
app.add_exception_handler(
MediaAlreadyExistsError, media_already_exists_exception_handler
)
app.add_exception_handler(InvalidConfigError, invalid_config_error_exception_handler)
app.add_exception_handler(IntegrityError, sqlalchemy_integrity_error_handler)
app.add_exception_handler(UniqueViolation, sqlalchemy_integrity_error_handler)

View File

@@ -1,10 +1,10 @@
import logging
from abc import ABC, abstractmethod
from media_manager.metadataProvider.schemas import MetaDataProviderSearchResult
from media_manager.tv.schemas import Show
from media_manager.movies.schemas import Movie
from media_manager.config import MediaManagerConfig
from media_manager.metadataProvider.schemas import MetaDataProviderSearchResult
from media_manager.movies.schemas import Movie
from media_manager.tv.schemas import Show
log = logging.getLogger(__name__)
@@ -18,11 +18,11 @@ class AbstractMetadataProvider(ABC):
pass
@abstractmethod
def get_show_metadata(self, id: int = None, language: str | None = None) -> Show:
def get_show_metadata(self, show_id: int, language: str | None = None) -> Show:
raise NotImplementedError()
@abstractmethod
def get_movie_metadata(self, id: int = None, language: str | None = None) -> Movie:
def get_movie_metadata(self, movie_id: int, language: str | None = None) -> Movie:
raise NotImplementedError()
@abstractmethod

View File

@@ -1,12 +1,12 @@
from typing import Annotated, Literal
from fastapi import Depends
from fastapi.exceptions import HTTPException
from media_manager.metadataProvider.tmdb import TmdbMetadataProvider
from media_manager.metadataProvider.abstractMetaDataProvider import (
from media_manager.metadataProvider.abstract_metadata_provider import (
AbstractMetadataProvider,
)
from media_manager.metadataProvider.tmdb import TmdbMetadataProvider
from media_manager.metadataProvider.tvdb import TvdbMetadataProvider
@@ -15,13 +15,12 @@ def get_metadata_provider(
) -> AbstractMetadataProvider:
if metadata_provider == "tmdb":
return TmdbMetadataProvider()
elif metadata_provider == "tvdb":
if metadata_provider == "tvdb":
return TvdbMetadataProvider()
else:
raise HTTPException(
status_code=400,
detail=f"Invalid metadata provider: {metadata_provider}. Supported providers are 'tmdb' and 'tvdb'.",
)
raise HTTPException(
status_code=400,
detail=f"Invalid metadata provider: {metadata_provider}. Supported providers are 'tmdb' and 'tvdb'.",
)
metadata_provider_dep = Annotated[

View File

@@ -1,4 +1,5 @@
from pydantic import BaseModel
from media_manager.movies.schemas import MovieId
from media_manager.tv.schemas import ShowId

View File

@@ -1,17 +1,17 @@
import logging
from typing import override
import requests
import media_manager.metadataProvider.utils
from media_manager.config import MediaManagerConfig
from media_manager.metadataProvider.abstractMetaDataProvider import (
from media_manager.metadataProvider.abstract_metadata_provider import (
AbstractMetadataProvider,
)
from media_manager.metadataProvider.schemas import MetaDataProviderSearchResult
from media_manager.tv.schemas import Episode, Season, Show, SeasonNumber, EpisodeNumber
from media_manager.movies.schemas import Movie
from media_manager.notification.manager import notification_manager
from media_manager.tv.schemas import Episode, EpisodeNumber, Season, SeasonNumber, Show
ENDED_STATUS = {"Ended", "Canceled"}
@@ -21,7 +21,7 @@ log = logging.getLogger(__name__)
class TmdbMetadataProvider(AbstractMetadataProvider):
name = "tmdb"
def __init__(self):
def __init__(self) -> None:
config = MediaManagerConfig().metadata.tmdb
self.url = config.tmdb_relay_url
self.primary_languages = config.primary_languages
@@ -39,35 +39,40 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
return original_language
return self.default_language
def __get_show_metadata(self, id: int, language: str | None = None) -> dict:
def __get_show_metadata(self, show_id: int, language: str | None = None) -> dict:
if language is None:
language = self.default_language
try:
response = requests.get(
url=f"{self.url}/tv/shows/{id}", params={"language": language}
url=f"{self.url}/tv/shows/{show_id}",
params={"language": language},
timeout=60,
)
response.raise_for_status()
return response.json()
except requests.RequestException as e:
log.error(f"TMDB API error getting show metadata for ID {id}: {e}")
log.exception(f"TMDB API error getting show metadata for ID {show_id}")
if notification_manager.is_configured():
notification_manager.send_notification(
title="TMDB API Error",
message=f"Failed to fetch show metadata for ID {id} from TMDB. Error: {str(e)}",
message=f"Failed to fetch show metadata for ID {show_id} from TMDB. Error: {e}",
)
raise
def __get_show_external_ids(self, id: int) -> dict:
def __get_show_external_ids(self, show_id: int) -> dict:
try:
response = requests.get(url=f"{self.url}/tv/shows/{id}/external_ids")
response = requests.get(
url=f"{self.url}/tv/shows/{show_id}/external_ids",
timeout=60,
)
response.raise_for_status()
return response.json()
except requests.RequestException as e:
log.error(f"TMDB API error getting show external IDs for ID {id}: {e}")
log.exception(f"TMDB API error getting show external IDs for ID {show_id}")
if notification_manager.is_configured():
notification_manager.send_notification(
title="TMDB API Error",
message=f"Failed to fetch show external IDs for ID {id} from TMDB. Error: {str(e)}",
message=f"Failed to fetch show external IDs for ID {show_id} from TMDB. Error: {e}",
)
raise
@@ -80,17 +85,18 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
response = requests.get(
url=f"{self.url}/tv/shows/{show_id}/{season_number}",
params={"language": language},
timeout=60,
)
response.raise_for_status()
return response.json()
except requests.RequestException as e:
log.error(
f"TMDB API error getting season {season_number} metadata for show ID {show_id}: {e}"
log.exception(
f"TMDB API error getting season {season_number} metadata for show ID {show_id}"
)
if notification_manager.is_configured():
notification_manager.send_notification(
title="TMDB API Error",
message=f"Failed to fetch season {season_number} metadata for show ID {show_id} from TMDB. Error: {str(e)}",
message=f"Failed to fetch season {season_number} metadata for show ID {show_id} from TMDB. Error: {e}",
)
raise
@@ -102,15 +108,16 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
"query": query,
"page": page,
},
timeout=60,
)
response.raise_for_status()
return response.json()
except requests.RequestException as e:
log.error(f"TMDB API error searching TV shows with query '{query}': {e}")
log.exception(f"TMDB API error searching TV shows with query '{query}'")
if notification_manager.is_configured():
notification_manager.send_notification(
title="TMDB API Error",
message=f"Failed to search TV shows with query '{query}' on TMDB. Error: {str(e)}",
message=f"Failed to search TV shows with query '{query}' on TMDB. Error: {e}",
)
raise
@@ -119,47 +126,54 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
response = requests.get(
url=f"{self.url}/tv/trending",
params={"language": self.default_language},
timeout=60,
)
response.raise_for_status()
return response.json()
except requests.RequestException as e:
log.error(f"TMDB API error getting trending TV: {e}")
log.exception("TMDB API error getting trending TV")
if notification_manager.is_configured():
notification_manager.send_notification(
title="TMDB API Error",
message=f"Failed to fetch trending TV shows from TMDB. Error: {str(e)}",
message=f"Failed to fetch trending TV shows from TMDB. Error: {e}",
)
raise
def __get_movie_metadata(self, id: int, language: str | None = None) -> dict:
def __get_movie_metadata(self, movie_id: int, language: str | None = None) -> dict:
if language is None:
language = self.default_language
try:
response = requests.get(
url=f"{self.url}/movies/{id}", params={"language": language}
url=f"{self.url}/movies/{movie_id}",
params={"language": language},
timeout=60,
)
response.raise_for_status()
return response.json()
except requests.RequestException as e:
log.error(f"TMDB API error getting movie metadata for ID {id}: {e}")
log.exception(f"TMDB API error getting movie metadata for ID {movie_id}")
if notification_manager.is_configured():
notification_manager.send_notification(
title="TMDB API Error",
message=f"Failed to fetch movie metadata for ID {id} from TMDB. Error: {str(e)}",
message=f"Failed to fetch movie metadata for ID {movie_id} from TMDB. Error: {e}",
)
raise
def __get_movie_external_ids(self, id: int) -> dict:
def __get_movie_external_ids(self, movie_id: int) -> dict:
try:
response = requests.get(url=f"{self.url}/movies/{id}/external_ids")
response = requests.get(
url=f"{self.url}/movies/{movie_id}/external_ids", timeout=60
)
response.raise_for_status()
return response.json()
except requests.RequestException as e:
log.error(f"TMDB API error getting movie external IDs for ID {id}: {e}")
log.exception(
f"TMDB API error getting movie external IDs for ID {movie_id}"
)
if notification_manager.is_configured():
notification_manager.send_notification(
title="TMDB API Error",
message=f"Failed to fetch movie external IDs for ID {id} from TMDB. Error: {str(e)}",
message=f"Failed to fetch movie external IDs for ID {movie_id} from TMDB. Error: {e}",
)
raise
@@ -171,15 +185,16 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
"query": query,
"page": page,
},
timeout=60,
)
response.raise_for_status()
return response.json()
except requests.RequestException as e:
log.error(f"TMDB API error searching movies with query '{query}': {e}")
log.exception(f"TMDB API error searching movies with query '{query}'")
if notification_manager.is_configured():
notification_manager.send_notification(
title="TMDB API Error",
message=f"Failed to search movies with query '{query}' on TMDB. Error: {str(e)}",
message=f"Failed to search movies with query '{query}' on TMDB. Error: {e}",
)
raise
@@ -188,18 +203,20 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
response = requests.get(
url=f"{self.url}/movies/trending",
params={"language": self.default_language},
timeout=60,
)
response.raise_for_status()
return response.json()
except requests.RequestException as e:
log.error(f"TMDB API error getting trending movies: {e}")
log.exception("TMDB API error getting trending movies")
if notification_manager.is_configured():
notification_manager.send_notification(
title="TMDB API Error",
message=f"Failed to fetch trending movies from TMDB. Error: {str(e)}",
message=f"Failed to fetch trending movies from TMDB. Error: {e}",
)
raise
@override
def download_show_poster_image(self, show: Show) -> bool:
# Determine which language to use based on show's original_language
language = self.__get_language_param(show.original_language)
@@ -214,7 +231,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
"https://image.tmdb.org/t/p/original" + show_metadata["poster_path"]
)
if media_manager.metadataProvider.utils.download_poster_image(
storage_path=self.storage_path, poster_url=poster_url, id=show.id
storage_path=self.storage_path, poster_url=poster_url, uuid=show.id
):
log.info("Successfully downloaded poster image for show " + show.name)
else:
@@ -225,11 +242,12 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
return False
return True
def get_show_metadata(self, id: int = None, language: str | None = None) -> Show:
@override
def get_show_metadata(self, show_id: int, language: str | None = None) -> Show:
"""
:param id: the external id of the show
:type id: int
:param show_id: the external id of the show
:type show_id: int
:param language: optional language code (ISO 639-1) to fetch metadata in
:type language: str | None
:return: returns a Show object
@@ -237,17 +255,17 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
"""
# If language not provided, fetch once to determine original language
if language is None:
show_metadata = self.__get_show_metadata(id)
show_metadata = self.__get_show_metadata(show_id)
language = show_metadata.get("original_language")
# Determine which language to use for metadata
language = self.__get_language_param(language)
# Fetch show metadata in the appropriate language
show_metadata = self.__get_show_metadata(id, language=language)
show_metadata = self.__get_show_metadata(show_id, language=language)
# get imdb id
external_ids = self.__get_show_external_ids(id=id)
external_ids = self.__get_show_external_ids(show_id=show_id)
imdb_id = external_ids.get("imdb_id")
season_list = []
@@ -258,16 +276,14 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
season_number=season["season_number"],
language=language,
)
episode_list = []
for episode in season_metadata["episodes"]:
episode_list.append(
Episode(
external_id=int(episode["id"]),
title=episode["name"],
number=EpisodeNumber(episode["episode_number"]),
)
episode_list = [
Episode(
external_id=int(episode["id"]),
title=episode["name"],
number=EpisodeNumber(episode["episode_number"]),
)
for episode in season_metadata["episodes"]
]
season_list.append(
Season(
@@ -283,8 +299,8 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
show_metadata["first_air_date"]
)
show = Show(
external_id=id,
return Show(
external_id=show_id,
name=show_metadata["name"],
overview=show_metadata["overview"],
year=year,
@@ -295,8 +311,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
imdb_id=imdb_id,
)
return show
@override
def search_show(
self, query: str | None = None, max_pages: int = 5
) -> list[MetaDataProviderSearchResult]:
@@ -313,8 +328,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
if not result_page["results"]:
break
else:
results.extend(result_page["results"])
results.extend(result_page["results"])
formatted_results = []
for result in results:
@@ -352,16 +366,17 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
original_language=original_language,
)
)
except Exception as e:
log.warning(f"Error processing search result: {e}")
except Exception:
log.warning("Error processing search result", exc_info=True)
return formatted_results
def get_movie_metadata(self, id: int = None, language: str | None = None) -> Movie:
@override
def get_movie_metadata(self, movie_id: int, language: str | None = None) -> Movie:
"""
Get movie metadata with language-aware fetching.
:param id: the external id of the movie
:type id: int
:param movie_id: the external id of the movie
:type movie_id: int
:param language: optional language code (ISO 639-1) to fetch metadata in
:type language: str | None
:return: returns a Movie object
@@ -369,25 +384,25 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
"""
# If language not provided, fetch once to determine original language
if language is None:
movie_metadata = self.__get_movie_metadata(id=id)
movie_metadata = self.__get_movie_metadata(movie_id=movie_id)
language = movie_metadata.get("original_language")
# Determine which language to use for metadata
language = self.__get_language_param(language)
# Fetch movie metadata in the appropriate language
movie_metadata = self.__get_movie_metadata(id=id, language=language)
movie_metadata = self.__get_movie_metadata(movie_id=movie_id, language=language)
# get imdb id
external_ids = self.__get_movie_external_ids(id=id)
external_ids = self.__get_movie_external_ids(movie_id=movie_id)
imdb_id = external_ids.get("imdb_id")
year = media_manager.metadataProvider.utils.get_year_from_date(
movie_metadata["release_date"]
)
movie = Movie(
external_id=id,
return Movie(
external_id=movie_id,
name=movie_metadata["title"],
overview=movie_metadata["overview"],
year=year,
@@ -396,8 +411,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
imdb_id=imdb_id,
)
return movie
@override
def search_movie(
self, query: str | None = None, max_pages: int = 5
) -> list[MetaDataProviderSearchResult]:
@@ -414,8 +428,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
if not result_page["results"]:
break
else:
results.extend(result_page["results"])
results.extend(result_page["results"])
formatted_results = []
for result in results:
@@ -453,17 +466,18 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
original_language=original_language,
)
)
except Exception as e:
log.warning(f"Error processing search result: {e}")
except Exception:
log.warning("Error processing search result", exc_info=True)
return formatted_results
@override
def download_movie_poster_image(self, movie: Movie) -> bool:
# Determine which language to use based on movie's original_language
language = self.__get_language_param(movie.original_language)
# Fetch metadata in the appropriate language to get localized poster
movie_metadata = self.__get_movie_metadata(
id=movie.external_id, language=language
movie_id=movie.external_id, language=language
)
# downloading the poster
@@ -473,7 +487,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
"https://image.tmdb.org/t/p/original" + movie_metadata["poster_path"]
)
if media_manager.metadataProvider.utils.download_poster_image(
storage_path=self.storage_path, poster_url=poster_url, id=movie.id
storage_path=self.storage_path, poster_url=poster_url, uuid=movie.id
):
log.info("Successfully downloaded poster image for movie " + movie.name)
else:


@@ -1,14 +1,16 @@
import requests
import logging
from typing import override
import requests
import media_manager.metadataProvider.utils
from media_manager.config import MediaManagerConfig
from media_manager.metadataProvider.abstractMetaDataProvider import (
from media_manager.metadataProvider.abstract_metadata_provider import (
AbstractMetadataProvider,
)
from media_manager.metadataProvider.schemas import MetaDataProviderSearchResult
from media_manager.tv.schemas import Episode, Season, Show, SeasonNumber
from media_manager.movies.schemas import Movie
from media_manager.tv.schemas import Episode, Season, SeasonNumber, Show
log = logging.getLogger(__name__)
@@ -16,63 +18,58 @@ log = logging.getLogger(__name__)
class TvdbMetadataProvider(AbstractMetadataProvider):
name = "tvdb"
def __init__(self):
def __init__(self) -> None:
config = MediaManagerConfig().metadata.tvdb
self.url = config.tvdb_relay_url
def __get_show(self, id: int) -> dict:
return requests.get(f"{self.url}/tv/shows/{id}").json()
def __get_show(self, show_id: int) -> dict:
return requests.get(url=f"{self.url}/tv/shows/{show_id}", timeout=60).json()
def __get_season(self, id: int) -> dict:
return requests.get(f"{self.url}/tv/seasons/{id}").json()
def __get_season(self, show_id: int) -> dict:
return requests.get(url=f"{self.url}/tv/seasons/{show_id}", timeout=60).json()
def __search_tv(self, query: str) -> dict:
return requests.get(
f"{self.url}/tv/search",
params={"query": query},
url=f"{self.url}/tv/search", params={"query": query}, timeout=60
).json()
def __get_trending_tv(self) -> dict:
return requests.get(f"{self.url}/tv/trending").json()
return requests.get(url=f"{self.url}/tv/trending", timeout=60).json()
def __get_movie(self, id: int) -> dict:
return requests.get(f"{self.url}/movies/{id}").json()
def __get_movie(self, movie_id: int) -> dict:
return requests.get(url=f"{self.url}/movies/{movie_id}", timeout=60).json()
def __search_movie(self, query: str) -> dict:
return requests.get(
f"{self.url}/movies/search",
params={"query": query},
url=f"{self.url}/movies/search", params={"query": query}, timeout=60
).json()
def __get_trending_movies(self) -> dict:
return requests.get(f"{self.url}/movies/trending").json()
return requests.get(url=f"{self.url}/movies/trending", timeout=60).json()
@override
def download_show_poster_image(self, show: Show) -> bool:
show_metadata = self.__get_show(id=show.external_id)
show_metadata = self.__get_show(show_id=show.external_id)
if show_metadata["image"] is not None:
media_manager.metadataProvider.utils.download_poster_image(
storage_path=self.storage_path,
poster_url=show_metadata["image"],
id=show.id,
uuid=show.id,
)
log.debug("Successfully downloaded poster image for show " + show.name)
return True
else:
log.warning(f"image for show {show.name} could not be downloaded")
return False
log.warning(f"image for show {show.name} could not be downloaded")
return False
def get_show_metadata(self, id: int = None, language: str | None = None) -> Show:
@override
def get_show_metadata(self, show_id: int, language: str | None = None) -> Show:
"""
:param id: the external id of the show
:type id: int
:param show_id: The external id of the show
:param language: does nothing, TVDB does not support multiple languages
:type language: str | None
:return: returns a Show object
:rtype: Show
"""
series = self.__get_show(id=id)
series = self.__get_show(show_id)
seasons = []
seasons_ids = [season["id"] for season in series["seasons"]]
@@ -85,7 +82,7 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
imdb_id = remote_id.get("id")
for season in seasons_ids:
s = self.__get_season(id=season)
s = self.__get_season(show_id=season)
# the seasons need to be filtered to a certain type,
# otherwise the same season will be imported in aired and dvd order,
# which causes duplicate season number + show ids which in turn violates a unique constraint of the season table
@@ -112,15 +109,11 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
episodes=episodes,
)
)
try:
year = series["year"]
except KeyError:
year = None
show = Show(
return Show(
name=series["name"],
overview=series["overview"],
year=year,
year=series.get("year"),
external_id=series["id"],
metadata_provider=self.name,
seasons=seasons,
@@ -128,8 +121,7 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
imdb_id=imdb_id,
)
return show
@override
def search_show(
self, query: str | None = None
) -> list[MetaDataProviderSearchResult]:
@@ -156,39 +148,39 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
vote_average=None,
)
)
except Exception as e:
log.warning(f"Error processing search result: {e}")
except Exception:
log.warning("Error processing search result", exc_info=True)
return formatted_results
else:
results = self.__get_trending_tv()
formatted_results = []
for result in results:
try:
if result["type"] == "series":
try:
year = result["year"]
except KeyError:
year = None
results = self.__get_trending_tv()
formatted_results = []
for result in results:
try:
if result["type"] == "series":
try:
year = result["year"]
except KeyError:
year = None
formatted_results.append(
MetaDataProviderSearchResult(
poster_path="https://artworks.thetvdb.com"
+ result.get("image")
if result.get("image")
else None,
overview=result.get("overview"),
name=result["name"],
external_id=result["id"],
year=year,
metadata_provider=self.name,
added=False,
vote_average=None,
)
formatted_results.append(
MetaDataProviderSearchResult(
poster_path="https://artworks.thetvdb.com"
+ result.get("image")
if result.get("image")
else None,
overview=result.get("overview"),
name=result["name"],
external_id=result["id"],
year=year,
metadata_provider=self.name,
added=False,
vote_average=None,
)
except Exception as e:
log.warning(f"Error processing search result: {e}")
return formatted_results
)
except Exception:
log.warning("Error processing search result", exc_info=True)
return formatted_results
@override
def search_movie(
self, query: str | None = None
) -> list[MetaDataProviderSearchResult]:
@@ -221,41 +213,45 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
vote_average=None,
)
)
except Exception as e:
log.warning(f"Error processing search result: {e}")
except Exception:
log.warning("Error processing search result", exc_info=True)
return formatted_results
else:
results = self.__get_trending_movies()
results = results[0:20]
log.debug(f"got {len(results)} results from TVDB search")
formatted_results = []
for result in results:
result = self.__get_movie(result["id"])
results = self.__get_trending_movies()
results = results[0:20]
log.debug(f"got {len(results)} results from TVDB search")
formatted_results = []
for result in results:
result = self.__get_movie(result["id"])
try:
try:
try:
year = result["year"]
except KeyError:
year = None
year = result["year"]
except KeyError:
year = None
formatted_results.append(
MetaDataProviderSearchResult(
poster_path="https://artworks.thetvdb.com"
+ result.get("image")
if result.get("image")
else None,
overview=result.get("overview"),
name=result["name"],
external_id=result["id"],
year=year,
metadata_provider=self.name,
added=False,
vote_average=None,
)
if result.get("image"):
poster_path = "https://artworks.thetvdb.com" + str(
result.get("image")
)
except Exception as e:
log.warning(f"Error processing search result: {e}")
return formatted_results
else:
poster_path = None
formatted_results.append(
MetaDataProviderSearchResult(
poster_path=poster_path,
overview=result.get("overview"),
name=result["name"],
external_id=result["id"],
year=year,
metadata_provider=self.name,
added=False,
vote_average=None,
)
)
except Exception:
log.warning("Error processing search result", exc_info=True)
return formatted_results
@override
def download_movie_poster_image(self, movie: Movie) -> bool:
movie_metadata = self.__get_movie(movie.external_id)
@@ -263,25 +259,22 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
media_manager.metadataProvider.utils.download_poster_image(
storage_path=self.storage_path,
poster_url=movie_metadata["image"],
id=movie.id,
uuid=movie.id,
)
log.info("Successfully downloaded poster image for movie " + movie.name)
return True
else:
log.warning(f"image for show {movie.name} could not be downloaded")
return False
log.warning(f"image for movie {movie.name} could not be downloaded")
return False
def get_movie_metadata(self, id: int = None, language: str | None = None) -> Movie:
@override
def get_movie_metadata(self, movie_id: int, language: str | None = None) -> Movie:
"""
:param id: the external id of the movie
:type id: int
:param movie_id: the external id of the movie
:param language: does nothing, TVDB does not support multiple languages
:type language: str | None
:return: returns a Movie object
:rtype: Movie
"""
movie = self.__get_movie(id)
movie = self.__get_movie(movie_id=movie_id)
# get imdb id from remote ids
imdb_id = None
@@ -291,7 +284,7 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
if remote_id.get("type") == 2:
imdb_id = remote_id.get("id")
movie = Movie(
return Movie(
name=movie["name"],
overview="Overviews are not supported with TVDB",
year=movie.get("year"),
@@ -299,5 +292,3 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
metadata_provider=self.name,
imdb_id=imdb_id,
)
return movie
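Several hunks in this file swap a `try`/`except KeyError` dance for `dict.get()`, e.g. `year=series.get("year")`. A small before/after illustration with made-up data:

```python
def year_before(series: dict):
    # The pattern the diff removes: catch the KeyError by hand.
    try:
        return series["year"]
    except KeyError:
        return None


def year_after(series: dict):
    # The replacement: dict.get() returns None when the key is absent.
    return series.get("year")
```

Both behave identically for a missing key, but `get()` makes the "optional field" intent explicit in one line.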
