Compare commits

...

32 Commits

Author SHA1 Message Date
dependabot[bot]
cd70ab8711 Bump actions/setup-python from 5 to 6
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5 to 6.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-13 17:08:28 +00:00
Maximilian Dorninger
51b8794e4d Merge pull request #411 from maxdorninger/Dependabot-auto-bump-deps
Configure Dependabot for multiple package ecosystems
2026-02-13 18:07:54 +01:00
Mark Riabov
0cfd1fa724 Fix suffix formatting for with_suffix call (#408)
Fixes issue `ValueError: Invalid suffix 'jpg'`, which completely
prevented downloading posters from the metadata provider.
2026-02-10 20:29:05 +01:00
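For context, this `ValueError` comes from `pathlib` itself: `Path.with_suffix()` only accepts suffixes that begin with a dot. A minimal standalone reproduction (independent of MediaManager's actual code):

```python
from pathlib import Path

poster = Path("poster.tmp")

# Passing a bare extension raises the error seen in the bug report:
try:
    poster.with_suffix("jpg")
except ValueError as exc:
    print(exc)  # Invalid suffix 'jpg'

# The fix is to prepend the dot:
print(poster.with_suffix(".jpg"))  # poster.jpg
```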
Maximilian Dorninger
b5b297e99a add new sponsor syn (#405)
This PR adds the new sponsor, syn.
2026-02-08 20:10:06 +01:00
maxid
58414cadae update all links to docs 2026-02-08 19:47:17 +01:00
maxid
462794520e update docs workflow 2026-02-08 19:43:13 +01:00
maxid
59afba007d update docs workflow 2026-02-08 19:36:07 +01:00
Maximilian Dorninger
cfa303e4f3 Merge pull request #404 from maxdorninger/mkdocs
This PR replaces Gitbook with Mkdocs to provide documentation
2026-02-08 19:27:15 +01:00
maxid
d3dde9c7eb add docs workflow 2026-02-08 19:22:34 +01:00
maxid
9c94ef6de0 convert gitbook files to mkdocs 2026-02-08 19:16:38 +01:00
Maximilian Dorninger
2665106847 Merge pull request #401 from maxdorninger/fix-env-variables
Fix download clients config being read from env variables
2026-02-08 16:37:15 +01:00
maxid
d029177fc0 hot fix: fix search tag name for episode in jackett 2026-02-04 23:52:07 +01:00
Maximilian Dorninger
1698c404cd Merge pull request #400 from maxdorninger/add-search-by-id-support-to-jackett
Add search by id support to jackett
2026-02-04 23:00:00 +01:00
maxid
abac894a95 fix download clients config being read from env variables without the mediamanager prefix 2026-02-04 22:49:24 +01:00
maxid
12854ff661 format files 2026-02-04 21:34:37 +01:00
maxid
3d52a87302 add id search capabilities to jackett 2026-02-04 21:34:31 +01:00
Maximilian Dorninger
9ee5cc6895 make the container user configurable (#399)
This PR makes the user the container runs as configurable. Before, the
container always tried stepping down (from root) to the mediamanager
user. Now it detects if it's already running as a non-root user and
starts the server directly. Fixes #397
2026-02-04 19:01:18 +01:00
Maximilian Dorninger
c45c9e5873 add correlation id to logging (#398)
This PR adds Correlation IDs to logs and request responses.

```
2026-02-04 12:40:32,793 - [afd825081d874d6e835b5c59a6ddb371] DEBUG - media_manager.movies - get_importable_movies(): Found 5 importable movies.
2026-02-04 12:40:32,794 - [afd825081d874d6e835b5c59a6ddb371] INFO - uvicorn.access - send(): 172.19.0.1:64094 - "GET /api/v1/movies/importable HTTP/1.1" 200
2026-02-04 12:40:47,322 - [41d30b7003fd45288c6a4bb1cfba5e7a] INFO - uvicorn.access - send(): 127.0.0.1:52964 - "GET /api/v1/health HTTP/1.1" 200
2026-02-04 12:41:17,408 - [157027ea5dde472a9e620f53739ccd53] INFO - uvicorn.access - send(): 127.0.0.1:39850 - "GET /api/v1/health HTTP/1.1" 200
```
2026-02-04 13:55:05 +01:00
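The log format above can be reproduced with a `contextvars`-based logging filter. The sketch below is a hypothetical illustration of the technique only — names like `CorrelationIdFilter` are assumptions, not MediaManager's actual implementation:

```python
import contextvars
import logging
import uuid

# Hypothetical names; the real middleware in the PR may differ.
correlation_id = contextvars.ContextVar("correlation_id", default="-")

class CorrelationIdFilter(logging.Filter):
    """Copy the current correlation ID onto every log record."""
    def filter(self, record: logging.LogRecord) -> bool:
        record.correlation_id = correlation_id.get()
        return True

handler = logging.StreamHandler()
handler.addFilter(CorrelationIdFilter())
handler.setFormatter(logging.Formatter(
    "%(asctime)s - [%(correlation_id)s] %(levelname)s - %(name)s - %(message)s"
))
logger = logging.getLogger("media_manager.movies")
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)

# A request middleware would set a fresh ID per incoming request:
correlation_id.set(uuid.uuid4().hex)
logger.debug("get_importable_movies(): Found 5 importable movies.")
```

Because the ID lives in a `ContextVar`, it is isolated per async task, so concurrent requests each see their own ID in the log lines.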
Sergey Khruschak
24fcba6bee Torrent file name sanitizing (#390)
Hi, I've added file name sanitization when saving the torrent file, as
previously the import was failing on torrents with special characters in
their names. This fixes #367
2026-02-03 17:09:36 +01:00
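For illustration, sanitizing of this kind usually means replacing characters that are invalid in file names on common filesystems. The helper below is a hypothetical sketch, not the function added in #390:

```python
import re

def sanitize_filename(name: str, replacement: str = "_") -> str:
    """Replace characters that are unsafe in file names on common
    filesystems (hypothetical sketch, not MediaManager's actual code)."""
    cleaned = re.sub(r'[<>:"/\\|?*\x00-\x1f]', replacement, name)
    # Avoid names that end up empty or consist only of dots/spaces.
    cleaned = cleaned.strip(" .")
    return cleaned or "unnamed"

print(sanitize_filename('Show: S01/E02 "Pilot"?.torrent'))
# Show_ S01_E02 _Pilot__.torrent
```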
Maximilian Dorninger
d5994a9037 Fix docker permission issues (#395)
This PR fixes docker permission issues by first starting as root and
then chown-ing all the volumes. This should fix #388 #389
2026-02-03 13:06:18 +01:00
just_Bri
9e0d0c03c0 feat: add links to media detail pages in requests and torrent tables (#352)
Feature Request: https://github.com/maxdorninger/MediaManager/issues/351

[feat: add links to media detail pages in requests and torrent
tables](ac376c0d6d)
2026-02-02 22:48:14 +01:00
Maximilian Dorninger
70ff8f6ace Fix the broken link to the disable ascii art page (#396)
Fix the broken link to the disable ascii art page
2026-02-02 22:22:11 +01:00
Maximilian Dorninger
e347219721 Merge pull request #394 from juandbc/fix-torznab-process-and-jackett-movies-search
Fix torznab process and jackett movies search
2026-02-02 17:42:49 +01:00
strangeglyph
72a626cb1a Add flag to disable startup ascii art (#369)
Adds an environment variable to disable the colorized splash screen.
2026-02-02 17:39:47 +01:00
Juan David Bermudez Celedon
a1f3f92c10 Enhance size validation for indexer results 2026-02-01 22:14:04 -05:00
Juan David Bermudez Celedon
caaa08fbf4 Fix typo in Jackett log for search_movie 2026-02-01 22:01:42 -05:00
Juan David Bermudez Celedon
5db60141bb Fix bug by typo in jackett log message (#387)
fix a typo in the `search_season` function's log message, which caused an error when searching for torrents.
2026-02-01 18:09:18 +01:00
Marcel Hellwig
96b84d45db Adding some more new lints (#393)
Enable `UP` and `TRY` lint
2026-02-01 18:04:15 +01:00
Marcel Hellwig
311e625eee two hotfixes (#392)
Fixes two issues that prevented the app from running correctly.
2026-02-01 17:42:15 +01:00
maxidorninger
e22e0394bd GITBOOK-19: No subject 2026-01-09 20:13:39 +00:00
maxid
6377aa8b83 revert "add digital ocean attribution" in GitBook 2026-01-09 21:02:19 +01:00
Maximilian Dorninger
8855204930 add digital ocean attribution (#368) 2026-01-09 20:54:47 +01:00
113 changed files with 1245 additions and 905 deletions

@@ -53,5 +53,5 @@ YOUR CONFIG HERE
 ```
 - [ ] I understand, that without logs and/or screenshots and a detailed description of the problem, it is very hard to fix bugs.
-- [ ] I have checked the [documentation](https://maximilian-dorninger.gitbook.io/mediamanager) for help.
+- [ ] I have checked the [documentation](https://maxdorninger.github.io/MediaManager/) for help.
 - [ ] I have searched the [issues](https://github.com/maxdorninger/MediaManager/issues) for similar issues and found none.

.github/dependabot.yml (new file)

@@ -0,0 +1,25 @@
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5
  - package-ecosystem: "npm"
    directory: "/web"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5
  - package-ecosystem: "uv"
    directory: "/"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5

.github/workflows/docs.yml (new file)

@@ -0,0 +1,62 @@
name: Publish docs via GitHub Pages
on:
  push:
    branches:
      - master
    tags:
      - v*
  workflow_dispatch:
    inputs:
      set_default_alias:
        description: 'Alias to set as default (e.g. latest, master)'
        required: false
        default: 'latest'
permissions:
  contents: write
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Configure Git Credentials
        run: |
          git config user.name github-actions[bot]
          git config user.email 41898282+github-actions[bot]@users.noreply.github.com
      - uses: actions/setup-python@v6
        with:
          python-version: 3.x
      - run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV
      - uses: actions/cache@v4
        with:
          key: mkdocs-material-${{ env.cache_id }}
          path: .cache
          restore-keys: |
            mkdocs-material-
      - name: Install dependencies
        run: pip install mkdocs-material mike
      - name: Deploy (master)
        if: github.ref == 'refs/heads/master'
        run: |
          mike deploy --push --update-aliases master
      - name: Deploy (tag)
        if: startsWith(github.ref, 'refs/tags/v')
        run: |
          version=${GITHUB_REF#refs/tags/}
          mike deploy --push --update-aliases $version latest --title "$version"
          mike set-default --push latest
      - name: Set Default (Manual)
        if: github.event_name == 'workflow_dispatch' && github.event.inputs.set_default_alias != ''
        run: |
          mike set-default --push ${{ github.event.inputs.set_default_alias }}

.gitignore

@@ -49,5 +49,5 @@ __pycache__
 # Postgres
 /postgres
-# Node modules
-/node_modules/*
+# MkDocs
+site/

@@ -18,7 +18,7 @@ Generally, if you have any questions or need help on the implementation side of
 just ask in the issue, or in a draft PR.
 Also, see the contribution guide in the docs for information on how to setup the dev environment:
-https://maximilian-dorninger.gitbook.io/mediamanager
+https://maxdorninger.github.io/MediaManager/
 ### For something that is a one or two line fix:

@@ -13,7 +13,7 @@ RUN env PUBLIC_VERSION=${VERSION} PUBLIC_API_URL=${BASE_PATH} BASE_PATH=${BASE_P
 FROM ghcr.io/astral-sh/uv:python3.13-trixie-slim AS base
 RUN apt-get update && \
-    apt-get install -y ca-certificates bash libtorrent21 gcc bc locales postgresql media-types mailcap curl gzip unzip tar 7zip bzip2 unar && \
+    apt-get install -y ca-certificates bash libtorrent21 gcc bc locales postgresql media-types mailcap curl gzip unzip tar 7zip bzip2 unar gosu && \
     apt-get clean && \
     rm -rf /var/lib/apt/lists/*
@@ -33,7 +33,6 @@ RUN chown -R mediamanager:mediamanager /app
 USER mediamanager
-# Set uv cache to a writable home directory and use copy mode for volume compatibility
 ENV UV_CACHE_DIR=/home/mediamanager/.cache/uv \
     UV_LINK_MODE=copy
@@ -47,6 +46,7 @@ ARG BASE_PATH=""
 LABEL author="github.com/maxdorninger"
 LABEL version=${VERSION}
 LABEL description="Docker image for MediaManager"
+USER root
 ENV PUBLIC_VERSION=${VERSION} \
     CONFIG_DIR="/app/config" \

@@ -1,7 +1,7 @@
 <br />
 <div align="center">
-  <a href="https://maximilian-dorninger.gitbook.io/mediamanager">
-    <img src="https://github.com/maxdorninger/MediaManager/blob/master/web/static/logo.svg" alt="Logo" width="260" height="260">
+  <a href="https://maxdorninger.github.io/MediaManager/">
+    <img src="https://raw.githubusercontent.com/maxdorninger/MediaManager/refs/heads/master/web/static/logo.svg" alt="Logo" width="260" height="260">
   </a>
   <h3 align="center">MediaManager</h3>
@@ -9,7 +9,7 @@
   <p align="center">
     Modern management system for your media library
     <br />
-    <a href="https://maximilian-dorninger.gitbook.io/mediamanager"><strong>Explore the docs »</strong></a>
+    <a href="https://maxdorninger.github.io/MediaManager/"><strong>Explore the docs »</strong></a>
     <br />
     <a href="https://github.com/maxdorninger/MediaManager/issues/new?labels=bug&template=bug_report.md">Report Bug</a>
     &middot;
@@ -35,7 +35,7 @@ wget -O ./config/config.toml https://github.com/maxdorninger/MediaManager/releas
 docker compose up -d
 ```
-### [View the docs for installation instructions and more](https://maximilian-dorninger.gitbook.io/mediamanager)
+### [View the docs for installation instructions and more](https://maxdorninger.github.io/MediaManager/)
 ## Support MediaManager
@@ -60,6 +60,7 @@ docker compose up -d
 <a href="https://buymeacoffee.com/maxdorninger"><img src="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/DEBBB9/JO.png" width="80px" alt="Josh" /></a>&nbsp;&nbsp;
 <a href="https://buymeacoffee.com/maxdorninger"><img src="https://cdn.buymeacoffee.com/uploads/profile_pictures/2025/11/2VeQ8sTGPhj4tiLy.jpg" width="80px" alt="PuppiestDoggo" /></a>&nbsp;&nbsp;
 <a href="https://github.com/seferino-fernandez"><img src="https://avatars.githubusercontent.com/u/5546622" width="80px" alt="Seferino" /></a>&nbsp;&nbsp;
+<a href="https://buymeacoffee.com/maxdorninger"><img src="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/EC9689/SY.png" width="80px" alt="syn" /></a>&nbsp;&nbsp;
 ## Star History
@@ -80,7 +81,7 @@ docker compose up -d
 ## Developer Quick Start
-For the developer guide see the [Developer Guide](https://maximilian-dorninger.gitbook.io/mediamanager).
+For the developer guide see the [Developer Guide](https://maxdorninger.github.io/MediaManager/).
 <!-- LICENSE -->
@@ -93,5 +94,9 @@ Distributed under the AGPL 3.0. See `LICENSE.txt` for more information.
 ## Acknowledgments
+Thanks to DigitalOcean for sponsoring the project!
+[![DigitalOcean Referral Badge](https://web-platforms.sfo2.cdn.digitaloceanspaces.com/WWW/Badge%201.svg)](https://www.digitalocean.com/?refcode=4edf05429dca&utm_campaign=Referral_Invite&utm_medium=Referral_Program&utm_source=badge)
 * [Thanks to Pawel Czerwinski for the image on the login screen](https://unsplash.com/@pawel_czerwinski)

@@ -1,6 +1,6 @@
 # MediaManager Dev Configuration File
 # This file contains all available configuration options for MediaManager
-# Documentation: https://maximilian-dorninger.gitbook.io/mediamanager
+# Documentation: https://maxdorninger.github.io/MediaManager/
 #
 # This is an example configuration file that gets copied to your config folder
 # on first boot. You should modify the values below to match your setup.

@@ -1,6 +1,6 @@
 # MediaManager Example Configuration File
 # This file contains all available configuration options for MediaManager
-# Documentation: https://maximilian-dorninger.gitbook.io/mediamanager
+# Documentation: https://maxdorninger.github.io/MediaManager/
 #
 # This is an example configuration file that gets copied to your config folder
 # on first boot. You should modify the values below to match your setup.

@@ -56,6 +56,15 @@ services:
       - ./web:/app
     depends_on:
       - mediamanager
+  docs:
+    image: squidfunk/mkdocs-material:9
+    container_name: mediamanager-docs
+    volumes:
+      - .:/docs
+    ports:
+      - "9000:9000"
+    command: serve -w /docs -a 0.0.0.0:9000
   # ----------------------------
   # Additional services can be uncommented and configured as needed
@@ -130,17 +139,17 @@ services:
   # ports:
   # - 8081:8080
   # restart: unless-stopped
-  # jackett:
-  #   image: lscr.io/linuxserver/jackett:latest
-  #   container_name: jackett
-  #   environment:
-  #     - PUID=1000
-  #     - PGID=1000
-  #     - TZ=Etc/UTC
-  #     - AUTO_UPDATE=true
-  #   volumes:
-  #     - ./res/jackett/data:/config
-  #     - ./res/jackett/torrents:/downloads
-  #   ports:
-  #     - 9117:9117
-  #   restart: unless-stopped
+  jackett:
+    image: lscr.io/linuxserver/jackett:latest
+    container_name: jackett
+    environment:
+      - PUID=1000
+      - PGID=1000
+      - TZ=Etc/UTC
+      - AUTO_UPDATE=true
+    volumes:
+      - ./res/jackett/data:/config
+      - ./res/jackett/torrents:/downloads
+    ports:
+      - 9117:9117
+    restart: unless-stopped


@@ -1,34 +0,0 @@
---
layout:
  width: default
  title:
    visible: true
  description:
    visible: true
  tableOfContents:
    visible: true
  outline:
    visible: false
  pagination:
    visible: true
  metadata:
    visible: true
---
# MediaManager
MediaManager is the modern, easy-to-use successor to the fragmented "Arr" stack. Manage, discover, and automate your TV and movie collection in a single, simple interface.
_Replaces Sonarr, Radarr, Seerr, and more._
### Quick Links
<table data-view="cards" data-full-width="false"><thead><tr><th align="center"></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td align="center">Installation Guide</td><td><a href="installation/">installation</a></td></tr><tr><td align="center">Configuration</td><td><a href="configuration/">configuration</a></td></tr><tr><td align="center">Developer Guide</td><td><a href="contributing-to-mediamanager/developer-guide.md">developer-guide.md</a></td></tr><tr><td align="center">Troubleshooting</td><td><a href="troubleshooting.md">troubleshooting.md</a></td></tr><tr><td align="center">Advanced Features</td><td><a href="advanced-features/">advanced-features</a></td></tr><tr><td align="center">Import Existing Media</td><td><a href="importing-existing-media.md">importing-existing-media.md</a></td></tr></tbody></table>
## Support MediaManager & Maximilian Dorninger
<table data-card-size="large" data-view="cards" data-full-width="false"><thead><tr><th></th><th data-hidden data-card-target data-type="content-ref"></th><th data-hidden data-card-cover data-type="image">Cover image</th></tr></thead><tbody><tr><td>Sponsor me on GitHub Sponsors :)</td><td><a href="https://github.com/sponsors/maxdorninger">https://github.com/sponsors/maxdorninger</a></td><td></td></tr><tr><td>Buy me a coffee :)</td><td><a href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td></td></tr></tbody></table>
### MediaManager Sponsors
<table data-view="cards" data-full-width="false"><thead><tr><th>Sponsor</th><th data-hidden data-card-target data-type="content-ref"></th><th data-hidden data-card-cover data-type="image">Cover image</th></tr></thead><tbody><tr><td>Aljaž Mur Eržen</td><td><a href="https://fosstodon.org/@aljazmerzen">https://fosstodon.org/@aljazmerzen</a></td><td><a href="https://github.com/aljazerzen.png">https://github.com/aljazerzen.png</a></td></tr><tr><td>Luis Rodriguez</td><td><a href="https://github.com/ldrrp">https://github.com/ldrrp</a></td><td><a href="https://github.com/ldrrp.png">https://github.com/ldrrp.png</a></td></tr><tr><td>Brandon P.</td><td><a href="https://github.com/brandon-dacrib">https://github.com/brandon-dacrib</a></td><td><a href="https://github.com/brandon-dacrib.png">https://github.com/brandon-dacrib.png</a></td></tr><tr><td>SeimusS</td><td><a href="https://github.com/SeimusS">https://github.com/SeimusS</a></td><td><a href="https://github.com/SeimusS.png">https://github.com/SeimusS.png</a></td></tr><tr><td>HadrienKerlero</td><td><a href="https://github.com/HadrienKerlero">https://github.com/HadrienKerlero</a></td><td><a href="https://github.com/HadrienKerlero.png">https://github.com/HadrienKerlero.png</a></td></tr><tr><td>keyxmakerx</td><td><a href="https://github.com/keyxmakerx">https://github.com/keyxmakerx</a></td><td><a href="https://github.com/keyxmakerx.png">https://github.com/keyxmakerx.png</a></td></tr><tr><td>LITUATUI</td><td><a href="https://github.com/LITUATUI">https://github.com/LITUATUI</a></td><td><a href="https://github.com/LITUATUI.png">https://github.com/LITUATUI.png</a></td></tr><tr><td>Nicolas</td><td><a href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td><a href="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/B6CDBD/NI.png">https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/B6CDBD/NI.png</a></td></tr><tr><td>Josh</td><td><a 
href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td><a href="https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/DEBBB9/JO.png">https://cdn.buymeacoffee.com/uploads/profile_pictures/default/v2/DEBBB9/JO.png</a></td></tr><tr><td>PuppiestDoggo</td><td><a href="https://buymeacoffee.com/maxdorninger">https://buymeacoffee.com/maxdorninger</a></td><td><a href="https://cdn.buymeacoffee.com/uploads/profile_pictures/2025/11/2VeQ8sTGPhj4tiLy.jpg">https://cdn.buymeacoffee.com/uploads/profile_pictures/2025/11/2VeQ8sTGPhj4tiLy.jpg</a></td></tr><tr><td>Seferino</td><td><a href="https://github.com/seferino-fernandez">https://github.com/seferino-fernandez</a></td><td><a href="https://avatars.githubusercontent.com/u/5546622">https://avatars.githubusercontent.com/u/5546622</a></td></tr></tbody></table>


@@ -1,32 +0,0 @@
# Table of contents
* [MediaManager](README.md)
* [Installation Guide](installation/README.md)
  * [Docker Compose](installation/docker.md)
  * [Nix Flakes \[Community\]](installation/flakes.md)
* [Importing existing media](importing-existing-media.md)
* [Usage](usage.md)
* [Configuration](configuration/README.md)
  * [Backend](configuration/backend.md)
  * [Authentication](configuration/authentication.md)
  * [Database](configuration/database.md)
  * [Download Clients](configuration/download-clients.md)
  * [Indexers](configuration/indexers.md)
  * [Scoring Rulesets](configuration/scoring-rulesets.md)
  * [Notifications](configuration/notifications.md)
  * [Custom Libraries](configuration/custom-libraries.md)
  * [Logging](configuration/logging.md)
* [Advanced Features](advanced-features/README.md)
  * [qBittorrent Category](advanced-features/qbittorrent-category.md)
  * [URL Prefix](advanced-features/url-prefix.md)
  * [Metadata Provider Configuration](advanced-features/metadata-provider-configuration.md)
  * [Custom port](advanced-features/custom-port.md)
  * [Follow symlinks in frontend files](advanced-features/follow-symlinks-in-frontend-files.md)
* [Troubleshooting](troubleshooting.md)
* [API Reference](api-reference.md)
* [Screenshots](screenshots.md)

## Contributing to MediaManager

* [Developer Guide](contributing-to-mediamanager/developer-guide.md)
* [Documentation](contributing-to-mediamanager/documentation.md)


@@ -1,9 +0,0 @@
---
description: >-
  The features in this section are not required to run MediaManager and serve
  their purpose in very specific environments, but they can enhance your
  experience and provide additional functionality.
---
# Advanced Features


@@ -0,0 +1,4 @@
# Disable Startup Ascii Art
* `MEDIAMANAGER_NO_STARTUP_ART`: Set this environment variable (to any value) \
to disable the colorized startup splash screen. Unset to reenable.

@@ -7,8 +7,6 @@ MediaManager can be configured to follow symlinks when serving frontend files. T
 * `FRONTEND_FOLLOW_SYMLINKS`\
 Set this environment variable to `true` to follow symlinks when serving frontend files. Default is `false`.
-{% code title=".env" %}
-```bash
+```bash title=".env"
 FRONTEND_FOLLOW_SYMLINKS=true
 ```
-{% endcode %}

@@ -8,9 +8,8 @@ Metadata provider settings are configured in the `[metadata]` section of your `c
 TMDB (The Movie Database) is the primary metadata provider for MediaManager. It provides detailed information about movies and TV shows.
-{% hint style="info" %}
+!!! info
 Other software like Jellyfin use TMDB as well, so there won't be any metadata discrepancies.
-{% endhint %}
 * `tmdb_relay_url`\
 URL of the TMDB relay (MetadataRelay). Default is `https://metadata-relay.dorninger.co/tmdb`. Example: `https://your-own-relay.example.com/tmdb`.
@@ -19,24 +18,21 @@ Other software like Jellyfin use TMDB as well, so there won't be any metadata di
 * `default_language`\
 TMDB language parameter used when searching and adding. Default is `en`. Format: ISO 639-1 (2 letters).
-{% hint style="warning" %}
+!!! warning
 `default_language` sets the TMDB `language` parameter when searching and adding TV shows and movies. If TMDB does not find a matching translation, metadata in the original language will be fetched with no option for a fallback language. It is therefore highly advised to only use "broad" languages. For most use cases, the default setting is safest.
-{% endhint %}
 ### TVDB Settings (`[metadata.tvdb]`)
-{% hint style="warning" %}
+!!! warning
 The TVDB might provide false metadata and doesn't support some features of MediaManager like showing overviews. Therefore, TMDB is the preferred metadata provider.
-{% endhint %}
 * `tvdb_relay_url`\
 URL of the TVDB relay (MetadataRelay). Default is `https://metadata-relay.dorninger.co/tvdb`. Example: `https://your-own-relay.example.com/tvdb`.
 ### MetadataRelay
-{% hint style="info" %}
+!!! info
 To use MediaManager you don't need to set up your own MetadataRelay, as the default relay hosted by the developer should be sufficient for most purposes.
-{% endhint %}
 The MetadataRelay is a service that provides metadata for MediaManager. It acts as a proxy for TMDB and TVDB, allowing you to use your own API keys if needed, but the default relay means you don't need to create accounts for API keys yourself.
@@ -47,16 +43,14 @@ You might want to use your own relay if you want to avoid rate limits, protect y
 * Get a TMDB API key from [The Movie Database](https://www.themoviedb.org/settings/api)
 * Get a TVDB API key from [The TVDB](https://thetvdb.com/auth/register)
-{% hint style="info" %}
+!!! info
 If you want to use your own MetadataRelay, you can set the `tmdb_relay_url` and/or `tvdb_relay_url` to your own relay service.
-{% endhint %}
 ### Example Configuration
 Here's a complete example of the metadata section in your `config.toml`:
-{% code title="config.toml" %}
-```toml
+```toml title="config.toml"
 [metadata]
 # TMDB configuration
 [metadata.tmdb]
@@ -66,8 +60,6 @@ Here's a complete example of the metadata section in your `config.toml`:
 [metadata.tvdb]
 tvdb_relay_url = "https://metadata-relay.dorninger.co/tvdb"
 ```
-{% endcode %}
-{% hint style="info" %}
+!!! info
 In most cases, you can simply use the default values and don't need to specify these settings in your config file at all.
-{% endhint %}

@@ -9,10 +9,8 @@ Use the following variables to customize behavior:
 * `torrents.qbittorrent.category_save_path`\
 Save path for the category in qBittorrent. By default, no subdirectory is used. Example: `/data/torrents/MediaManager`.
-{% hint style="info" %}
+!!! info
 qBittorrent saves torrents to the path specified by `torrents.qbittorrent.category_save_path`, so it must be a valid path that qBittorrent can write to.
-{% endhint %}
-{% hint style="warning" %}
+!!! warning
 For MediaManager to successfully import torrents, you must add the subdirectory to the `misc.torrent_directory` variable.
-{% endhint %}

View File

@@ -6,23 +6,20 @@ In order to run it on a prefixed path, like `maxdorninger.github.io/media`, the
In short, clone the repository, then run:

```none title="Build Docker image"
docker build \
  --build-arg BASE_PATH=/media \
  --build-arg VERSION=my-custom-version \
  -t MediaManager:my-custom-version \
  -f Dockerfile .
```

You also need to set the `BASE_PATH` environment variable at runtime in `docker-compose.yaml`:

* `BASE_PATH`\
  Base path prefix MediaManager is served under. Example: `/media`. This must match the `BASE_PATH` build arg.

```yaml title="docker-compose.yaml (excerpt)"
services:
  mediamanager:
    image: MediaManager:my-custom-version
    # ...
      BASE_PATH: /media
# ...
```

!!! info
    Make sure to include the base path in the `frontend_url` field in the config file. See [Backend](../configuration/backend.md).

Finally, ensure that whatever reverse proxy you're using leaves the incoming path unchanged; that is, you should not strip the `/media` from `/media/web/`.
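With nginx, for example, this means proxying the prefixed location without rewriting the URI (a sketch; the upstream hostname and port are assumptions, not taken from this guide):

```nginx
location /media/ {
    # proxy_pass without a URI part forwards the request path unchanged,
    # so /media/web/ reaches MediaManager as /media/web/.
    proxy_pass http://mediamanager:8000;
}
```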


@@ -1,8 +1,7 @@
# API Reference

!!! info
    Media Manager's backend is built with FastAPI, which automatically generates interactive API documentation.

* Swagger UI (typically available at `http://localhost:8000/docs`)
* ReDoc (typically available at `http://localhost:8000/redoc`)


docs/assets/logo.svg (new file, 158 lines)

@@ -0,0 +1,158 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:svg="http://www.w3.org/2000/svg"
version="1.1"
id="svg1"
width="2000"
height="2000"
viewBox="0 0 2000 2000"
sodipodi:docname="logo2.svg"
inkscape:version="1.4.2 (f4327f4, 2025-05-13)"
xmlns="http://www.w3.org/2000/svg">
<defs
id="defs1">
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath1">
<path
d="M 0,1500 H 1500 V 0 H 0 Z"
id="path1"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath3">
<path
d="M 0,0 H 1500 V 1500 H 0 Z"
transform="matrix(1.3333333,0,0,-1.3333333,0,2000)"
id="path3"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath4">
<path
d="M -17.6886,1032.99 H 1106.27 V 238.53 H -17.6886 Z"
transform="translate(-319.61281,-1032.9941)"
id="path4"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath6">
<path
d="M 0,0 H 1500 V 1500 H 0 Z"
transform="matrix(1.3333333,0,0,-1.3333333,0,2000)"
id="path6"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath7">
<path
d="M 223.314,1226.85 H 1182.49 V 548.867 H 223.314 Z"
transform="translate(-894.64255,-548.86681)"
id="path7"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath9">
<path
d="M 0,0 H 1500 V 1500 H 0 Z"
transform="matrix(1.3333333,0,0,-1.3333333,0,2000)"
id="path9"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath10">
<path
d="M 301.561,1098.17 H 1517.73 V 238.53 H 301.561 Z"
transform="translate(-666.53282,-1098.1678)"
id="path10"/>
</clipPath>
<clipPath
clipPathUnits="userSpaceOnUse"
id="clipPath12">
<path
d="M 0,0 H 1500 V 1500 H 0 Z"
transform="matrix(1.3333333,0,0,-1.3333333,0,2000)"
id="path12"/>
</clipPath>
</defs>
<sodipodi:namedview
id="namedview1"
pagecolor="#ffffff"
bordercolor="#000000"
borderopacity="0.25"
inkscape:showpageshadow="2"
inkscape:pageopacity="0.0"
inkscape:pagecheckerboard="0"
inkscape:deskcolor="#d1d1d1"
inkscape:zoom="0.9075"
inkscape:cx="999.44904"
inkscape:cy="1000"
inkscape:window-width="3840"
inkscape:window-height="2054"
inkscape:window-x="3373"
inkscape:window-y="199"
inkscape:window-maximized="1"
inkscape:current-layer="g1">
<inkscape:page
x="0"
y="0"
inkscape:label="1"
id="page1"
width="2000"
height="2000"
margin="0"
bleed="0"/>
</sodipodi:namedview>
<g
id="g1"
inkscape:groupmode="layer"
inkscape:label="1">
<g
id="g2"
clip-path="url(#clipPath3)">
<path
d="M 0,0 H 1500 V 1500 H 0 Z"
style="fill:#9ed8f7;fill-opacity:0;fill-rule:nonzero;stroke:none"
transform="matrix(1.3333333,0,0,-1.3333333,0,2000)"
clip-path="url(#clipPath1)"
id="path2"/>
</g>
<g
opacity="0.720001"
id="g5"
clip-path="url(#clipPath6)">
<path
d="m 0,0 h 669.787 c 68.994,0 116.873,-68.746 92.95,-133.46 L 542.309,-729.728 c -14.382,-38.904 -51.472,-64.736 -92.95,-64.736 h -669.787 c -68.994,0 -116.873,68.746 -92.949,133.46 L -92.949,-64.736 C -78.567,-25.832 -41.478,0 0,0"
style="fill:#2842fc;fill-opacity:1;fill-rule:nonzero;stroke:none"
transform="matrix(1.3333333,0,0,-1.3333333,426.1504,622.67453)"
clip-path="url(#clipPath4)"
id="path5"/>
</g>
<g
opacity="0.720001"
id="g8"
clip-path="url(#clipPath9)">
<path
d="m 0,0 h -571.59 c -58.879,0 -99.738,58.667 -79.322,113.893 l 188.111,508.849 c 12.274,33.201 43.925,55.246 79.322,55.246 h 571.59 c 58.879,0 99.739,-58.667 79.322,-113.894 L 79.322,55.245 C 67.049,22.045 35.397,0 0,0"
style="fill:#ff5e00;fill-opacity:1;fill-rule:nonzero;stroke:none"
transform="matrix(1.3333333,0,0,-1.3333333,1192.8567,1268.1776)"
clip-path="url(#clipPath7)"
id="path8"/>
</g>
<g
opacity="0.75"
id="g11"
clip-path="url(#clipPath12)">
<path
d="m 0,0 h 724.733 c 74.654,0 126.46,-74.386 100.575,-144.408 L 586.797,-789.591 c -15.562,-42.096 -55.694,-70.047 -100.575,-70.047 h -724.733 c -74.654,0 -126.461,74.386 -100.574,144.409 l 238.511,645.182 C -85.013,-27.952 -44.88,0 0,0"
style="fill:#f20a4c;fill-opacity:1;fill-rule:nonzero;stroke:none"
transform="matrix(1.3333333,0,0,-1.3333333,888.7104,535.77627)"
clip-path="url(#clipPath10)"
id="path11"/>
</g>
</g>
</svg>


@@ -6,9 +6,8 @@ Frontend settings are configured through environment variables in your `docker-c
## Configuration File Location

!!! warning
    Note that MediaManager may need to be restarted for changes in the config file to take effect.

Your `config.toml` file should be in the directory that's mounted to `/app/config/config.toml` inside the container:
@@ -66,6 +65,5 @@ MEDIAMANAGER_AUTH__OPENID_CONNECT__CLIENT_SECRET = "your_client_secret_from_prov
So for every config "level", you take the name of the value, prepend the section names in uppercase (joined with two underscores as delimiters), and add `MEDIAMANAGER_` as the prefix.

!!! warning
    Note that not every env variable starts with `MEDIAMANAGER_`; this prefix only applies to env variables that replace or overwrite values in the config file. Variables like the `CONFIG_DIR` env variable must not be prefixed.
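As a worked example, overriding a hypothetical `enabled` value in the `[torrents.qbittorrent]` section (the key is illustrative; the naming scheme is the point):

```shell
# [torrents.qbittorrent] -> TORRENTS__QBITTORRENT, value name appended,
# MEDIAMANAGER_ prefix added in front:
export MEDIAMANAGER_TORRENTS__QBITTORRENT__ENABLED=true
echo "$MEDIAMANAGER_TORRENTS__QBITTORRENT__ENABLED"
```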


@@ -20,13 +20,11 @@ All authentication settings are configured in the `[auth]` section of your `conf
* `email_password_resets`\
  Enables password resets via email. Default is `false`.

!!! info
    To use email password resets, you must also configure SMTP settings in the `[notifications.smtp_config]` section.

!!! info
    When setting up MediaManager for the first time, you should add your email to `admin_emails` in the `[auth]` config section. MediaManager will then use this email instead of the default admin email. Your account will automatically be created as an admin account, allowing you to manage other users, media and settings.

## OpenID Connect Settings (`[auth.openid_connect]`)
@@ -53,22 +51,20 @@ The OpenID server will likely require a redirect URI. This URL will usually look
```
{MEDIAMANAGER_URL}/api/v1/auth/oauth/callback
```

!!! warning
    It is very important that you set the correct callback URI; otherwise authentication won't work.

#### Authentik Example

Here is an example configuration for the OpenID Connect provider for Authentik.

![authentik-redirect-url-example](<../assets/assets/authentik redirect url example.png>)

## Example Configuration

Here's a complete example of the authentication section in your `config.toml`:

```toml title="config.toml"
[auth]
token_secret = "a1b2c3d4e5f6g7h8i9j0k1l2m3n4o5p6q7r8s9t0u1v2w3x4y5z6"
session_lifetime = 604800 # 1 week
# ...
client_secret = "your-secret-key-here"
configuration_endpoint = "https://auth.example.com/.well-known/openid-configuration"
name = "Authentik"
```


@@ -26,8 +26,7 @@ description: >-
Here's a complete example of the general settings section in your `config.toml`:

```toml title="config.toml"
[misc]

# REQUIRED: Change this to match your actual frontend domain.
# ...
cors_urls = ["http://localhost:8000"]

# Optional: Development mode (set to true for debugging)
development = false
```

!!! info
    The `frontend_url` is the most important setting to configure correctly. Make sure it matches your actual deployment URLs.


@@ -6,9 +6,8 @@ MediaManager supports custom libraries, allowing you to add multiple folders for
Custom libraries are configured in the `misc` section in the `config.toml` file. You can add as many libraries as you need.

!!! info
    You are not limited to `/data/tv` or `/data/movies`; you can choose the entire path freely!

### Movie Libraries


@@ -19,8 +19,7 @@ Database settings are configured in the `[database]` section of your `config.tom
Here's a complete example of the database section in your `config.toml`:

```toml title="config.toml"
[database]
host = "db"
port = 5432
user = "MediaManager"
password = "your_secure_password"
dbname = "MediaManager"
```

!!! info
    In docker-compose deployments the container name is also its hostname, so you can use "db" or "postgres" as the host.


@@ -19,9 +19,8 @@ qBittorrent is a popular BitTorrent client that MediaManager can integrate with
## Transmission Settings (`[torrents.transmission]`)

!!! info
    The downloads path in Transmission and MediaManager must be the same, i.e. the path `/data/torrents` must link to the same volume for both containers.

Transmission is a BitTorrent client that MediaManager can integrate with for downloading torrents.
@@ -59,8 +58,7 @@ SABnzbd is a Usenet newsreader that MediaManager can integrate with for download
Here's a complete example of the download clients section in your `config.toml`:

```toml title="config.toml"
[torrents]

# qBittorrent configuration
[torrents.qbittorrent]
# ...
port = 8080
api_key = "your_sabnzbd_api_key"
```

## Docker Compose Integration

When using Docker Compose, make sure your download clients are accessible from the MediaManager backend:

```yaml title="docker-compose.yml"
services:
  # MediaManager backend
  backend:
    # ...
      - ./data/usenet:/downloads
    # ... other configuration ...
```

!!! warning
    You should enable only one BitTorrent and only one Usenet download client at any time.

!!! info
    Make sure the download directories in your download clients are accessible to MediaManager for proper file management and organization.


@@ -13,9 +13,8 @@ Indexer settings are configured in the `[indexers]` section of your `config.toml
* `timeout_seconds`\
  Timeout in seconds for requests to Prowlarr. Default is `60`.

!!! warning
    Symptoms of timeouts are typically no search results ("No torrents found!") in conjunction with logs showing read timeouts.

<details>
@@ -50,8 +49,7 @@ DEBUG - media_manager.indexer.utils -
## Example Configuration

```toml title="config.toml"
[indexers]

[indexers.prowlarr]
enabled = true
# ...
api_key = "your_jackett_api_key"
indexers = ["1337x", "rarbg"]
timeout_seconds = 60
```


@@ -57,8 +57,7 @@ Controls which emails receive notifications.
Here's a complete example of the notifications section in your `config.toml`:

```toml title="config.toml"
[notifications]

# SMTP settings for email notifications and password resets
[notifications.smtp_config]
# ...
api_key = "your_pushover_api_key"
user = "your_pushover_user_key"
```

!!! info
    You can enable multiple notification methods simultaneously. For example, you could have both email and Gotify notifications enabled at the same time.


@@ -17,9 +17,8 @@ Rules define how MediaManager scores releases based on their titles or indexer f
* Reject releases that do not meet certain criteria (e.g., non-freeleech releases).
* and more.

!!! info
    The keywords and flags are compared case-insensitively.

### Title Rules
@@ -38,8 +37,7 @@ Each title rule consists of:
Examples for Title Rules

```toml title="config.toml"
[[indexers.title_scoring_rules]]
name = "prefer_h265"
keywords = ["h265", "hevc", "x265"]
# ...
keywords = ["cam", "ts"]
score_modifier = -10000
negate = false
```

* The first rule increases the score for releases containing "h265", "hevc", or "x265".
* The second rule heavily penalizes releases containing "cam" or "ts".
@@ -76,8 +73,7 @@ Each indexer flag rule consists of:
Examples for Indexer Flag Rules

```toml title="config.toml"
[[indexers.indexer_flag_scoring_rules]]
name = "reject_non_freeleech"
flags = ["freeleech", "freeleech75"]
# ...
flags = ["nuked"]
score_modifier = -10000
negate = false
```

* The first rule penalizes releases that do not have the "freeleech" or "freeleech75" flag.
* The second rule penalizes releases that are marked as "nuked".
@@ -99,8 +94,7 @@ If `negate` is set to `true`, the `score_modifier` is applied only if none of th
## Example

```toml title="config.toml"
[[indexers.scoring_rule_sets]]
name = "default"
libraries = ["ALL_TV", "ALL_MOVIES"]
# ...
name = "strict_quality"
libraries = ["ALL_MOVIES"]
rule_names = ["prefer_h265", "avoid_cam", "reject_non_freeleech"]
```

## Libraries
@@ -127,9 +120,8 @@ You can use special library names in your rulesets:
This allows you to set global rules for all TV or movie content, or provide fallback rules for uncategorized media.

!!! info
    You don't need to create lots of libraries with different directories; multiple libraries can share the same directory. You can set multiple (unlimited) libraries to the default directory `/data/movies` or `/data/tv` and use different rulesets with them.

## Relation to Sonarr/Radarr Profiles


@@ -10,7 +10,7 @@ description: >-
* `media_manager/`: Backend FastAPI application
* `web/`: Frontend SvelteKit application
* `docs/`: Documentation (MkDocs)
* `metadata_relay/`: Metadata relay service, also FastAPI

## Special Dev Configuration
@@ -44,9 +44,8 @@ MediaManager uses various environment variables for configuration. In the Docker
* `DISABLE_FRONTEND_MOUNT`\
  When `TRUE`, disables mounting built frontend files (allows separate frontend container).

!!! info
    This is automatically set in `docker-compose.dev.yaml` to enable the separate frontend development container.

#### Configuration Files
@@ -105,10 +104,9 @@ This means when your browser makes a request to `http://localhost:5173/api/v1/tv
### Setting up the full development environment with Docker (Recommended)

This is the easiest and recommended way to get started. Everything runs in Docker with hot-reloading enabled.

### Prepare config files

Create config directory (only needed on first run) and copy example config files:
```bash
mkdir -p res/config # Only needed on first run
cp config.dev.toml res/config/config.toml
cp web/.env.example web/.env
```

### Start all services

Recommended: Use make commands for easy development
@@ -135,9 +133,9 @@ Alternative: Use docker compose directly (if make is not available)
```bash
docker compose -f docker-compose.dev.yaml up
```

### Access the application

* Frontend (with HMR): http://localhost:5173
@@ -151,12 +149,10 @@ Now you can edit code and see changes instantly:
* Edit Python files → Backend auto-reloads
* Edit Svelte/TypeScript files → Frontend HMR updates in browser
* Edit config.toml → Changes apply immediately

!!! info
    Run `make help` to see all available development commands including `make down`, `make logs`, `make app` (shell into backend), and more.

## Setting up the backend development environment (Local)
@@ -217,18 +213,17 @@ ruff check .
## Setting up the frontend development environment (Local, Optional)

Using the Docker setup above is recommended. This section is for those who prefer to run the frontend locally outside of Docker.

### Clone & change dir

1. Clone the repository
2. cd into repo root
3. cd into `web` directory

### Install Node.js (example using nvm-windows)

I used nvm-windows:
@@ -243,9 +238,9 @@ If using PowerShell you may need:
```powershell
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
```

### Create .env for frontend

```bash
cp .env.example .env
```

Update `PUBLIC_API_URL` if your backend is not at `http://localhost:8000`

### Install dependencies and run dev server

```bash
npm install
npm run dev
```
### Format & lint

* Format:
@@ -278,12 +273,10 @@ npm run format
```bash
npm run lint
```

!!! info
    If running frontend locally, make sure to add `http://localhost:5173` to the `cors_urls` in your backend config file.

## Troubleshooting


@@ -1,11 +1,14 @@
# Documentation

MediaManager uses [MkDocs](https://www.mkdocs.org/) with the [Material for MkDocs](https://squidfunk.github.io/mkdocs-material/) theme for documentation.

The files for the documentation are in the `/docs` directory.

To preview the documentation locally, you need to have MkDocs or Docker installed.

## How to preview the documentation locally with Docker

1. Run the mkdocs container in `docker-compose.dev.yaml`.
2. Open `http://127.0.0.1:9000/` in your browser.


@@ -23,9 +23,8 @@ Here is an example, using these rules:
If your folder structure is in the correct format, you can start importing. To do this, log in as an administrator and go to the TV/movie dashboard.

!!! info
    After importing, MediaManager will automatically prefix the old root TV show/movie folders with a dot to mark them as "imported".

So after importing, the directory would look like this (using the above directory structure):

docs/index.md (new file, 2 lines)

@@ -0,0 +1,2 @@
--8<-- "README.md"


@@ -2,4 +2,5 @@
The recommended way to install and run Media Manager is using Docker and Docker Compose. Other installation methods are not officially supported, but listed here for convenience.

[Docker Compose (recommended)](docker.md){ .md-button .md-button--primary }
[Nix Flakes [Community]](flakes.md){ .md-button }


@@ -9,8 +9,8 @@
Follow these steps to get MediaManager running with Docker Compose:

#### Get the docker-compose file

Download the `docker-compose.yaml` from the MediaManager repo:
@@ -18,9 +18,9 @@ Download the `docker-compose.yaml` from the MediaManager repo:
```bash
wget -O docker-compose.yaml https://github.com/maxdorninger/MediaManager/releases/latest/download/docker-compose.yaml
```

#### Prepare configuration directory and example config

Create a config directory and download the example configuration:
@@ -29,15 +29,15 @@ Create a config directory and download the example configuration:
mkdir config
wget -O ./config/config.toml https://github.com/maxdorninger/MediaManager/releases/latest/download/config.example.toml
```

#### Edit configuration

You probably need to edit the `config.toml` file in the `./config` directory to suit your environment and preferences. [How to configure MediaManager.](../configuration/README.md)

#### Start MediaManager

Bring up the stack:
@@ -45,16 +45,15 @@ Bring up the stack:
```bash
docker compose up -d
```

* Upon first run, MediaManager will create a default `config.toml` file in the `./config` directory (if not already present).
* Upon first run, MediaManager will also create a default admin user. The credentials of the default admin user will be printed in the logs of the container — it's recommended to change the password of this user after the first login.
* [For more information on the available configuration options, see the Configuration section of the documentation.](../configuration/README.md)

!!! info

    When setting up MediaManager for the first time, you should add your email to `admin_emails` in the `[auth]` config section. MediaManager will then use this email instead of the default admin email. Your account will automatically be created as an admin account, allowing you to manage other users, media, and settings.

## Docker Images
@@ -70,9 +69,8 @@ MetadataRelay images are also available on both registries:
From v1.12.1 onwards, both MediaManager and MetadataRelay images are available on both Quay.io and GHCR. The reason for the switch to Quay.io as the primary image registry is [GHCR's continued slow performance.](https://github.com/orgs/community/discussions/173607)

!!! info

    You can use either the Quay.io or GHCR images interchangeably, as they are built from the same source and the tags are the same across both registries.

### Tags


@@ -1,11 +1,9 @@
# Nix Flakes

!!! note

    This is a community contribution and not officially supported by the MediaManager team, but included here for convenience.

*Please report issues with this method at the [corresponding GitHub repository](https://github.com/strangeglyph/mediamanager-nix).*

## Prerequisites
@@ -64,12 +62,11 @@ The host and port that MediaManager listens on can be set using `services.media-
To configure MediaManager, use `services.media-manager.settings`, which follows the same structure as the MediaManager
`config.toml`. To provision secrets, set `services.media-manager.environmentFile` to a protected file, for example one
provided by [agenix](https://github.com/ryantm/agenix) or [sops-nix](https://github.com/Mic92/sops-nix).
See [Configuration](../configuration/README.md#configuring-secrets) for guidance on using environment variables.

!!! warning

    Do not place secrets in the nix store, as it is world-readable.

## Automatic Postgres Setup


@@ -1,7 +1,6 @@
# Screenshots

!!! info

    MediaManager also supports dark mode!

![screenshot-dashboard.png](<assets/assets/screenshot dashboard.png>) ![screenshot-tv-dashboard.png](<assets/assets/screenshot tv dashboard.png>) ![screenshot-download-season.png](<assets/assets/screenshot download season.png>) ![screenshot-request-season.png](<assets/assets/screenshot request season.png>) ![screenshot-tv-torrents.png](<assets/assets/screenshot tv torrents.png>) ![screenshot-settings.png](<assets/assets/screenshot settings.png>) ![screenshot-login.png](<assets/assets/screenshot login.png>)


@@ -1,8 +1,7 @@
# Troubleshooting

!!! info

    Always check the container and browser logs for more specific error messages.

<details>
@@ -60,10 +59,9 @@ Switch to advanced tab
#### Possible Fixes:

* [Unable to pull image from GitHub Container Registry (Stack Overflow)](https://stackoverflow.com/questions/74656167/unable-to-pull-image-from-github-container-registry-ghcr)
* [Try pulling the image from Quay.io](installation/docker.md#docker-images)

</details>

!!! info

    If it still doesn't work, [please open an Issue.](https://github.com/maxdorninger/MediaManager/issues) It is possible that a bug is causing the issue.


@@ -1,133 +0,0 @@
# Usage
If you are coming from Radarr or Sonarr, you will find that MediaManager does things a bit differently. Instead of fully automatically downloading and managing your media, MediaManager focuses on providing an easy-to-use interface that guides you through finding and downloading media. Advanced features, such as keeping multiple qualities of a show or movie, necessitate this paradigm shift. Here is a quick step-by-step guide to get you started:
#### Downloading/Requesting a show
{% stepper %}
{% step %}
### Add the show
Add a show on the "Add Show" page. After adding the show you will be redirected to the show's page.
{% endstep %}
{% step %}
### Request season(s)
Click the "Request Season" button on the show's page. Select one or more seasons that you want to download.
{% endstep %}
{% step %}
### Select qualities
Select the "Min Quality" — the minimum resolution of the content to download.\
Select the "Wanted Quality" — the **maximum** resolution of the content to download.
{% endstep %}
{% step %}
### Submit request
Click "Submit request". This is not the last step: an administrator must first approve your request for download. Only after approval will the requested content be downloaded.
{% endstep %}
{% step %}
### Finished
Congratulations! You've downloaded a show (after admin approval).
{% endstep %}
{% endstepper %}
#### Requesting a show (as an admin)
{% stepper %}
{% step %}
### Add the show
Add a show on the "Add Show" page. After adding the show you will be redirected to the show's page.
{% endstep %}
{% step %}
### Request season(s)
Click the "Request Season" button on the show's page. Select one or more seasons that you want to download.
{% endstep %}
{% step %}
### Select qualities
Select the "Min Quality" — the minimum resolution of the content to download.\
Select the "Wanted Quality" — the **maximum** resolution of the content to download.
{% endstep %}
{% step %}
### Submit request (auto-approved)
Click "Submit request". As an admin, your request will be automatically approved.
{% endstep %}
{% step %}
### Finished
Congratulations! You've downloaded a show.
{% endstep %}
{% endstepper %}
#### Downloading a show (admin-only)
You can only directly download a show if you are an admin!
{% stepper %}
{% step %}
### Go to the show's page
Open the show's page that contains the season you wish to download.
{% endstep %}
{% step %}
### Start download
Click the "Download Season" button.
{% endstep %}
{% step %}
### Enter season number
Enter the season number that you want to download.
{% endstep %}
{% step %}
### Optional file path suffix
Optionally select the "File Path Suffix". Note: **it needs to be unique per season per show!**
{% endstep %}
{% step %}
### Choose torrent and download
Click "Download" on the torrent that you want to download.
{% endstep %}
{% step %}
### Finished
Congratulations! You've downloaded a show.
{% endstep %}
{% endstepper %}
#### Managing requests
Users need their requests to be approved by an admin. To manage requests:
{% stepper %}
{% step %}
### Open Requests page
Go to the "Requests" page.
{% endstep %}
{% step %}
### Approve, delete or modify
From the Requests page you can approve, delete, or modify a user's request.
{% endstep %}
{% endstepper %}


@@ -1,5 +1,4 @@
from collections.abc import AsyncGenerator

from fastapi import Depends
from fastapi_users.db import (
@@ -17,7 +16,7 @@ from media_manager.database import Base, build_db_url
class OAuthAccount(SQLAlchemyBaseOAuthAccountTableUUID, Base):
    access_token: Mapped[str] = mapped_column(String(length=4096), nullable=False)
    refresh_token: Mapped[str | None] = mapped_column(
        String(length=4096), nullable=True
    )
@@ -34,12 +33,12 @@ engine = create_async_engine(
async_session_maker = async_sessionmaker(engine, expire_on_commit=False)


async def get_async_session() -> AsyncGenerator[AsyncSession]:
    async with async_session_maker() as session:
        yield session


async def get_user_db(
    session: AsyncSession = Depends(get_async_session),
) -> AsyncGenerator[SQLAlchemyUserDatabase]:
    yield SQLAlchemyUserDatabase(session, User, OAuthAccount)
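The diff drops the explicit `None` send type from `AsyncGenerator[...]`; `collections.abc.AsyncGenerator` defaults its second type parameter in recent Python versions (3.13+), and with deferred annotation evaluation the one-argument form also parses on older interpreters. A minimal sketch:

```python
from __future__ import annotations  # defer evaluation so Python < 3.13 accepts the one-arg form

import asyncio
from collections.abc import AsyncGenerator


async def countdown(n: int) -> AsyncGenerator[int]:  # send type defaults to None
    while n > 0:
        yield n
        n -= 1


async def main() -> list[int]:
    # consume the async generator the same way a dependency consumer would
    return [i async for i in countdown(3)]


print(asyncio.run(main()))  # [3, 2, 1]
```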


@@ -1,4 +1,7 @@
from collections.abc import AsyncGenerator
from contextlib import asynccontextmanager

from fastapi import APIRouter, Depends, FastAPI, status
from fastapi_users.router import get_oauth_router
from httpx_oauth.oauth2 import OAuth2
from sqlalchemy import select
@@ -7,6 +10,7 @@ from media_manager.auth.db import User
from media_manager.auth.schemas import AuthMetadata, UserRead
from media_manager.auth.users import (
    SECRET,
    create_default_admin_user,
    current_superuser,
    fastapi_users,
    openid_client,
@@ -15,7 +19,14 @@ from media_manager.auth.users import (
from media_manager.config import MediaManagerConfig
from media_manager.database import DbSessionDependency


@asynccontextmanager
async def lifespan(_app: FastAPI) -> AsyncGenerator:
    await create_default_admin_user()
    yield


users_router = APIRouter(lifespan=lifespan)
auth_metadata_router = APIRouter()
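The router-level `lifespan` above follows FastAPI's lifespan protocol: an async context manager whose code before `yield` runs at startup and whose code after `yield` runs at shutdown. A framework-free sketch of that ordering (the `events` list is only for illustration):

```python
import asyncio
from contextlib import asynccontextmanager

events: list[str] = []


@asynccontextmanager
async def lifespan(app: object):
    # startup phase: e.g. await create_default_admin_user()
    events.append("startup")
    yield
    # shutdown phase: runs when the app stops serving
    events.append("shutdown")


async def main() -> None:
    # the framework enters the context manager around its serving loop
    async with lifespan(None):
        events.append("serving")


asyncio.run(main())
print(events)  # ['startup', 'serving', 'shutdown']
```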


@@ -1,7 +1,8 @@
import contextlib
import logging
import uuid
from collections.abc import AsyncGenerator
from typing import Any, override

from fastapi import Depends, Request
from fastapi.responses import RedirectResponse, Response
@@ -49,7 +50,7 @@ class UserManager(UUIDIDMixin, BaseUserManager[User, uuid.UUID]):
        self,
        user: models.UP,
        update_dict: dict[str, Any],
        request: Request | None = None,
    ) -> None:
        log.info(f"User {user.id} has been updated.")
        if update_dict.get("is_superuser"):
@@ -60,7 +61,7 @@ class UserManager(UUIDIDMixin, BaseUserManager[User, uuid.UUID]):
    @override
    async def on_after_register(
        self, user: User, request: Request | None = None
    ) -> None:
        log.info(f"User {user.id} has registered.")
        if user.email in config.admin_emails:
@@ -69,7 +70,7 @@ class UserManager(UUIDIDMixin, BaseUserManager[User, uuid.UUID]):
    @override
    async def on_after_forgot_password(
        self, user: User, token: str, request: Request | None = None
    ) -> None:
        link = f"{MediaManagerConfig().misc.frontend_url}web/login/reset-password?token={token}"
        log.info(f"User {user.id} has forgot their password. Reset Link: {link}")
@@ -100,28 +101,26 @@ class UserManager(UUIDIDMixin, BaseUserManager[User, uuid.UUID]):
    @override
    async def on_after_reset_password(
        self, user: User, request: Request | None = None
    ) -> None:
        log.info(f"User {user.id} has reset their password.")

    @override
    async def on_after_request_verify(
        self, user: User, token: str, request: Request | None = None
    ) -> None:
        log.info(
            f"Verification requested for user {user.id}. Verification token: {token}"
        )

    @override
    async def on_after_verify(self, user: User, request: Request | None = None) -> None:
        log.info(f"User {user.id} has been verified")


async def get_user_manager(
    user_db: SQLAlchemyUserDatabase = Depends(get_user_db),
) -> AsyncGenerator[UserManager]:
    yield UserManager(user_db)
@@ -176,8 +175,8 @@ async def create_default_admin_user() -> None:
        log.info(
            f"Found {user_count} existing users. Skipping default user creation."
        )
    except Exception:
        log.exception("Failed to create default admin user")
        log.info(
            "You can create an admin user manually by registering with an email from the admin_emails list in your config."
        )
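The recurring `except Exception as e: log.error(f"... {e}")` → `except Exception: log.exception(...)` change works because `log.exception` logs at ERROR level and appends the full traceback automatically, so the message no longer needs to interpolate the exception. A small self-contained check (the `StringIO` handler is only there to capture the output):

```python
import io
import logging

log = logging.getLogger("demo")
stream = io.StringIO()
log.addHandler(logging.StreamHandler(stream))

try:
    raise ValueError("boom")
except Exception:
    # logs at ERROR level and appends the traceback; no need to format the exception in
    log.exception("Failed to create default admin user")

print(stream.getvalue().splitlines()[0])  # Failed to create default admin user
```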


@@ -1,7 +1,6 @@
import logging
import os
from pathlib import Path

from pydantic import AnyHttpUrl
from pydantic_settings import (
@@ -71,12 +70,12 @@ class MediaManagerConfig(BaseSettings):
    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        return (
            init_settings,
            env_settings,
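The `Tuple`/`Type` removals rely on PEP 585 (Python 3.9+), where the builtins `tuple` and `type` are subscriptable in annotations, making the `typing` aliases unnecessary. For example (the function name and values are made up for illustration):

```python
def order_sources(settings_cls: type, *sources: str) -> tuple[str, ...]:
    # `type` and `tuple[str, ...]` are builtin generics; no typing.Type / typing.Tuple needed
    return sources


print(order_sources(object, "init", "env", "dotenv"))  # ('init', 'env', 'dotenv')
```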


@@ -1,7 +1,8 @@
import logging
import os
from collections.abc import Generator
from contextvars import ContextVar
from typing import Annotated

from fastapi import Depends
from sqlalchemy import create_engine
@@ -15,8 +16,8 @@ log = logging.getLogger(__name__)
Base = declarative_base()

engine: Engine | None = None
SessionLocal: sessionmaker | None = None


def build_db_url(
@@ -83,7 +84,7 @@ def get_engine() -> Engine:
    return engine


def get_session() -> Generator[Session]:
    if SessionLocal is None:
        msg = "Session factory not initialized. Call init_engine(...) first."
        raise RuntimeError(msg)
@@ -91,9 +92,9 @@ def get_session() -> Generator[Session, Any, None]:
    try:
        yield db
        db.commit()
    except Exception:
        db.rollback()
        log.critical("Database session error", exc_info=True)
        raise
    finally:
        db.close()
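The `get_session` flow above (commit on success, rollback and re-raise on failure, always close) can be traced with a stand-in session object; the `FakeSession` class here is hypothetical, purely for illustration:

```python
from __future__ import annotations  # lets `Generator[FakeSession]` parse on Python < 3.13

from collections.abc import Generator


class FakeSession:
    """Hypothetical stand-in that records which lifecycle methods ran."""

    def __init__(self) -> None:
        self.ops: list[str] = []

    def commit(self) -> None:
        self.ops.append("commit")

    def rollback(self) -> None:
        self.ops.append("rollback")

    def close(self) -> None:
        self.ops.append("close")


def get_session() -> Generator[FakeSession]:
    db = FakeSession()
    try:
        yield db
        db.commit()
    except Exception:
        db.rollback()
        raise
    finally:
        db.close()


gen = get_session()
session = next(gen)
try:
    gen.throw(RuntimeError("boom"))  # simulate a failure inside the request handler
except RuntimeError:
    pass
print(session.ops)  # ['rollback', 'close']
```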


@@ -4,6 +4,13 @@ from psycopg.errors import UniqueViolation
from sqlalchemy.exc import IntegrityError from sqlalchemy.exc import IntegrityError
class RenameError(Exception):
    """Raised when renaming a file or directory fails."""

    def __init__(self, message: str = "Failed to rename source directory") -> None:
        super().__init__(message)


class MediaManagerError(Exception):
    """Base exception for MediaManager errors."""


@@ -36,10 +36,8 @@ def run_filesystem_checks(config: MediaManagerConfig, log: Logger) -> None:
        if not test_hardlink.samefile(test_torrent_file):
            log.critical("Hardlink creation failed!")
        log.info("Successfully created test hardlink in TV directory")
    except OSError:
        log.exception("Hardlink creation failed, falling back to copying files")
        shutil.copy(src=test_torrent_file, dst=test_hardlink)
    finally:
        test_hardlink.unlink()


@@ -1,7 +1,9 @@
import concurrent
import concurrent.futures
import logging
import xml.etree.ElementTree as ET
from concurrent.futures.thread import ThreadPoolExecutor
from dataclasses import dataclass

import requests
@@ -15,6 +17,21 @@ from media_manager.tv.schemas import Show
log = logging.getLogger(__name__)


@dataclass
class IndexerInfo:
    supports_tv_search: bool
    supports_tv_search_tmdb: bool
    supports_tv_search_imdb: bool
    supports_tv_search_tvdb: bool
    supports_tv_search_season: bool
    supports_tv_search_episode: bool
    supports_movie_search: bool
    supports_movie_search_tmdb: bool
    supports_movie_search_imdb: bool
    supports_movie_search_tvdb: bool


class Jackett(GenericIndexer, TorznabMixin):
    def __init__(self) -> None:
        """
@@ -31,11 +48,16 @@ class Jackett(GenericIndexer, TorznabMixin):
    def search(self, query: str, is_tv: bool) -> list[IndexerQueryResult]:
        log.debug("Searching for " + query)
        params = {"q": query, "t": "tvsearch" if is_tv else "movie"}
        return self.__search_jackett(params)

    def __search_jackett(self, params: dict) -> list[IndexerQueryResult]:
        futures = []
        with ThreadPoolExecutor() as executor, requests.Session() as session:
            for indexer in self.indexers:
                future = executor.submit(
                    self.get_torrents_by_indexer, indexer, params, session
                )
                futures.append(future)
@@ -46,19 +68,108 @@ class Jackett(GenericIndexer, TorznabMixin):
                result = future.result()
                if result is not None:
                    responses.extend(result)
            except Exception:
                log.exception("Searching failed")
        return responses
    def __get_search_capabilities(
        self, indexer: str, session: requests.Session
    ) -> IndexerInfo:
        url = (
            self.url
            + f"/api/v2.0/indexers/{indexer}/results/torznab/api?apikey={self.api_key}&t=caps"
        )
        response = session.get(url, timeout=self.timeout_seconds)
        if response.status_code != 200:
            msg = f"Cannot get search capabilities for Indexer {indexer}"
            log.error(msg)
            raise RuntimeError(msg)
        xml = response.text
        xml_tree = ET.fromstring(xml)  # noqa: S314 # response comes from the user's own configured indexer
        tv_search = xml_tree.find("./*/tv-search")
        movie_search = xml_tree.find("./*/movie-search")
        if tv_search is not None:
            log.debug(tv_search.attrib)
        if movie_search is not None:
            log.debug(movie_search.attrib)
        tv_search_capabilities = []
        movie_search_capabilities = []
        tv_search_available = (tv_search is not None) and (
            tv_search.attrib["available"] == "yes"
        )
        movie_search_available = (movie_search is not None) and (
            movie_search.attrib["available"] == "yes"
        )
        if tv_search_available:
            tv_search_capabilities = tv_search.attrib["supportedParams"].split(",")
        if movie_search_available:
            movie_search_capabilities = movie_search.attrib["supportedParams"].split(
                ","
            )
        return IndexerInfo(
            supports_tv_search=tv_search_available,
            supports_tv_search_imdb="imdbid" in tv_search_capabilities,
            supports_tv_search_tmdb="tmdbid" in tv_search_capabilities,
            supports_tv_search_tvdb="tvdbid" in tv_search_capabilities,
            supports_tv_search_season="season" in tv_search_capabilities,
            supports_tv_search_episode="ep" in tv_search_capabilities,
            supports_movie_search=movie_search_available,
            supports_movie_search_imdb="imdbid" in movie_search_capabilities,
            supports_movie_search_tmdb="tmdbid" in movie_search_capabilities,
            supports_movie_search_tvdb="tvdbid" in movie_search_capabilities,
        )
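For illustration, here is how the `t=caps` parsing above behaves on a trimmed, hypothetical Torznab capabilities document (real indexers return many more elements):

```python
import xml.etree.ElementTree as ET

# Minimal, made-up example of a Torznab `t=caps` response
caps_xml = """
<caps>
  <searching>
    <tv-search available="yes" supportedParams="q,season,ep,tvdbid"/>
    <movie-search available="yes" supportedParams="q,imdbid"/>
  </searching>
</caps>
"""

tree = ET.fromstring(caps_xml)
tv = tree.find("./*/tv-search")

# supportedParams is a comma-separated list of accepted query parameters
tv_params = tv.attrib["supportedParams"].split(",") if tv.attrib["available"] == "yes" else []
print("season" in tv_params, "tmdbid" in tv_params)  # True False
```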
    def __get_optimal_query_parameters(
        self, indexer: str, session: requests.Session, params: dict
    ) -> dict[str, str]:
        query_params = {"apikey": self.api_key, "t": params["t"]}
        search_capabilities = self.__get_search_capabilities(
            indexer=indexer, session=session
        )
        if params["t"] == "tvsearch":
            if not search_capabilities.supports_tv_search:
                msg = f"Indexer {indexer} does not support TV search"
                raise RuntimeError(msg)
            if search_capabilities.supports_tv_search_season and "season" in params:
                query_params["season"] = params["season"]
            if search_capabilities.supports_tv_search_episode and "ep" in params:
                query_params["ep"] = params["ep"]
            if search_capabilities.supports_tv_search_imdb and "imdbid" in params:
                query_params["imdbid"] = params["imdbid"]
            elif search_capabilities.supports_tv_search_tvdb and "tvdbid" in params:
                query_params["tvdbid"] = params["tvdbid"]
            elif search_capabilities.supports_tv_search_tmdb and "tmdbid" in params:
                query_params["tmdbid"] = params["tmdbid"]
            else:
                query_params["q"] = params["q"]
        if params["t"] == "movie":
            if not search_capabilities.supports_movie_search:
                msg = f"Indexer {indexer} does not support Movie search"
                raise RuntimeError(msg)
            if search_capabilities.supports_movie_search_imdb and "imdbid" in params:
                query_params["imdbid"] = params["imdbid"]
            elif search_capabilities.supports_movie_search_tvdb and "tvdbid" in params:
                query_params["tvdbid"] = params["tvdbid"]
            elif search_capabilities.supports_movie_search_tmdb and "tmdbid" in params:
                query_params["tmdbid"] = params["tmdbid"]
            else:
                query_params["q"] = params["q"]
        return query_params
    def get_torrents_by_indexer(
        self, indexer: str, params: dict, session: requests.Session
    ) -> list[IndexerQueryResult]:
        url = f"{self.url}/api/v2.0/indexers/{indexer}/results/torznab/api"
        query_params = self.__get_optimal_query_parameters(
            indexer=indexer, session=session, params=params
        )
        response = session.get(url, timeout=self.timeout_seconds, params=query_params)
        log.debug(f"Indexer {indexer} url: {response.url}")
        if response.status_code != 200:
            log.error(
@@ -74,9 +185,24 @@ class Jackett(GenericIndexer, TorznabMixin):
    def search_season(
        self, query: str, show: Show, season_number: int
    ) -> list[IndexerQueryResult]:
        log.debug(f"Searching for season {season_number} of show {show.name}")
        params = {
            "t": "tvsearch",
            "season": season_number,
            "q": query,
        }
        if show.imdb_id:
            params["imdbid"] = show.imdb_id
        params[show.metadata_provider + "id"] = show.external_id
        return self.__search_jackett(params=params)

    def search_movie(self, query: str, movie: Movie) -> list[IndexerQueryResult]:
        log.debug(f"Searching for movie {movie.name}")
        params = {
            "t": "movie",
            "q": query,
        }
        if movie.imdb_id:
            params["imdbid"] = movie.imdb_id
        params[movie.metadata_provider + "id"] = movie.external_id
        return self.__search_jackett(params=params)
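The parameter assembly in `search_season` can be traced with hypothetical show values (field names taken from the code above; the IDs are made up):

```python
# Hypothetical show record mirroring the attributes used by search_season
show = {"metadata_provider": "tmdb", "external_id": "603", "imdb_id": "tt0133093"}

params = {"t": "tvsearch", "season": 1, "q": "The Matrix"}
if show["imdb_id"]:
    params["imdbid"] = show["imdb_id"]
# e.g. "tmdb" + "id" -> "tmdbid", matching the Torznab query parameter name
params[show["metadata_provider"] + "id"] = show["external_id"]

print(sorted(params))  # ['imdbid', 'q', 'season', 't', 'tmdbid']
```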


@@ -1,6 +1,6 @@
import logging
import xml.etree.ElementTree as ET
from datetime import UTC, datetime
from email.utils import parsedate_to_datetime

from media_manager.indexer.schemas import IndexerQueryResult
@@ -39,7 +39,7 @@ class TorznabMixin:
                        posted_date = parsedate_to_datetime(
                            attribute.attrib["value"]
                        )
                        now = datetime.now(UTC)
                        age = int((now - posted_date).total_seconds())
                    else:
                        if attribute.attrib["name"] == "seeders":
@@ -61,15 +61,19 @@ class TorznabMixin:
                if upload_volume_factor == 2:
                    flags.append("doubleupload")

                title = item.find("title").text
                size_str = item.find("size")
                if size_str is None or size_str.text is None:
                    log.warning(f"Torznab item {title} has no size, skipping.")
                    continue
                try:
                    size = int(size_str.text)
                except ValueError:
                    log.warning(f"Torznab item {title} has invalid size, skipping.")
                    continue

                result = IndexerQueryResult(
                    title=title or "unknown",
                    download_url=str(item.find("enclosure").attrib["url"]),
                    seeders=seeders,
                    flags=flags,
@@ -79,6 +83,6 @@ class TorznabMixin:
                    indexer=indexer_name,
                )
                result_list.append(result)
            except Exception:
                log.exception("1 Torznab search result failed")
        return result_list


@@ -13,7 +13,9 @@ IndexerQueryResultId = typing.NewType("IndexerQueryResultId", UUID)
 class IndexerQueryResult(BaseModel):
     model_config = ConfigDict(from_attributes=True)
-    id: IndexerQueryResultId = pydantic.Field(default_factory=lambda: IndexerQueryResultId(uuid4()))
+    id: IndexerQueryResultId = pydantic.Field(
+        default_factory=lambda: IndexerQueryResultId(uuid4())
+    )
     title: str
     download_url: str = pydantic.Field(
         exclude=True,


@@ -45,9 +45,9 @@ class IndexerService:
                 log.debug(
                     f"Indexer {indexer.__class__.__name__} returned {len(indexer_results)} results for query: {query}"
                 )
-            except Exception as e:
-                log.error(
-                    f"Indexer {indexer.__class__.__name__} failed for query '{query}': {e}"
+            except Exception:
+                log.exception(
+                    f"Indexer {indexer.__class__.__name__} failed for query '{query}'"
                 )
         for result in results:
@@ -65,9 +65,9 @@ class IndexerService:
                 indexer_results = indexer.search_movie(query=query, movie=movie)
                 if indexer_results:
                     results.extend(indexer_results)
-            except Exception as e:
-                log.error(
-                    f"Indexer {indexer.__class__.__name__} failed for movie search '{query}': {e}"
+            except Exception:
+                log.exception(
+                    f"Indexer {indexer.__class__.__name__} failed for movie search '{query}'"
                 )
         for result in results:
@@ -87,9 +87,9 @@ class IndexerService:
                 )
                 if indexer_results:
                     results.extend(indexer_results)
-            except Exception as e:
-                log.error(
-                    f"Indexer {indexer.__class__.__name__} failed for season search '{query}': {e}"
+            except Exception:
+                log.exception(
+                    f"Indexer {indexer.__class__.__name__} failed for season search '{query}'"
                 )
         for result in results:
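The pattern applied throughout this file: calling `log.exception(...)` inside an `except` block logs at ERROR level and appends the active traceback automatically, so the handler no longer needs to bind the exception as `e` and interpolate it into the message. A small self-contained sketch (logger name and message are illustrative):

```python
import io
import logging

stream = io.StringIO()
log = logging.getLogger("indexer-demo")
log.addHandler(logging.StreamHandler(stream))

def flaky_search() -> list:
    raise RuntimeError("connection refused")

try:
    flaky_search()
except Exception:
    # No "as e" needed: log.exception records the current exception
    # and its traceback alongside the message, at ERROR level.
    log.exception("Indexer failed for query 'foo'")

assert "RuntimeError: connection refused" in stream.getvalue()
```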


@@ -149,8 +149,11 @@ def follow_redirects_to_final_torrent_url(
             raise RuntimeError(msg)
     except requests.exceptions.RequestException as e:
-        log.debug(f"An error occurred during the request for {initial_url}: {e}")
-        msg = f"An error occurred during the request: {e}"
+        log.debug(
+            f"An error occurred during the request for {initial_url}",
+            exc_info=True,
+        )
+        msg = "An error occurred during the request"
         raise RuntimeError(msg) from e
     return current_url


@@ -1,7 +1,7 @@
 import logging
 import os
 import sys
-from datetime import datetime, timezone
+from datetime import UTC, datetime
 from logging.config import dictConfig
 from pathlib import Path
 from typing import override
@@ -12,7 +12,7 @@ from pythonjsonlogger.json import JsonFormatter
 class ISOJsonFormatter(JsonFormatter):
     @override
     def formatTime(self, record: logging.LogRecord, datefmt: str | None = None) -> str:
-        dt = datetime.fromtimestamp(record.created, tz=timezone.utc)
+        dt = datetime.fromtimestamp(record.created, tz=UTC)
         return dt.isoformat(timespec="milliseconds").replace("+00:00", "Z")
@@ -21,13 +21,20 @@ LOG_FILE = Path(os.getenv("LOG_FILE", "/app/config/media_manager.log"))
 LOGGING_CONFIG = {
     "version": 1,
     "disable_existing_loggers": False,
+    "filters": {
+        "correlation_id": {
+            "()": "asgi_correlation_id.CorrelationIdFilter",
+            "uuid_length": 32,
+            "default_value": "-",
+        },
+    },
     "formatters": {
         "default": {
-            "format": "%(asctime)s - %(levelname)s - %(name)s - %(funcName)s(): %(message)s"
+            "format": "%(asctime)s - [%(correlation_id)s] %(levelname)s - %(name)s - %(funcName)s(): %(message)s"
         },
         "json": {
             "()": ISOJsonFormatter,
-            "format": "%(asctime)s %(levelname)s %(name)s %(message)s",
+            "format": "%(asctime)s %(correlation_id)s %(levelname)s %(name)s %(message)s",
             "rename_fields": {
                 "levelname": "level",
                 "asctime": "timestamp",
@@ -39,11 +46,13 @@ LOGGING_CONFIG = {
         "console": {
             "class": "logging.StreamHandler",
             "formatter": "default",
+            "filters": ["correlation_id"],
             "stream": sys.stdout,
         },
         "file": {
             "class": "logging.handlers.RotatingFileHandler",
             "formatter": "json",
+            "filters": ["correlation_id"],
             "filename": str(LOG_FILE),
             "maxBytes": 10485760,
             "backupCount": 5,
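For context on the `filters` block added above: each handler runs its filters before formatting, which is how the `%(correlation_id)s` placeholder gets populated (or defaulted to `"-"` outside a request). A minimal stand-in that mimics the behavior, not the real `asgi_correlation_id.CorrelationIdFilter`:

```python
import io
import logging

class StandInCorrelationFilter(logging.Filter):
    """Stand-in for asgi_correlation_id.CorrelationIdFilter: injects a
    default correlation_id so the %(correlation_id)s placeholder in the
    format strings never fails outside a request context."""

    def filter(self, record: logging.LogRecord) -> bool:
        if not hasattr(record, "correlation_id"):
            record.correlation_id = "-"
        return True  # never drop the record, only decorate it

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(
    logging.Formatter("%(asctime)s - [%(correlation_id)s] %(levelname)s - %(message)s")
)
handler.addFilter(StandInCorrelationFilter())
log = logging.getLogger("correlation-demo")
log.addHandler(handler)
log.warning("no request context yet")

assert "[-] WARNING" in stream.getvalue()
```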


@@ -2,6 +2,7 @@ import logging
 import os
 import uvicorn
+from asgi_correlation_id import CorrelationIdMiddleware
 from fastapi import APIRouter, FastAPI, Request, Response
 from fastapi.middleware.cors import CORSMiddleware
 from fastapi.staticfiles import StaticFiles
@@ -71,6 +72,7 @@ app.add_middleware(
     allow_credentials=True,
     allow_methods=["GET", "PUT", "POST", "DELETE", "PATCH", "HEAD", "OPTIONS"],
 )
+app.add_middleware(CorrelationIdMiddleware, header_name="X-Correlation-ID")
 api_app = APIRouter(prefix="/api/v1")


@@ -18,15 +18,11 @@ class AbstractMetadataProvider(ABC):
         pass
     @abstractmethod
-    def get_show_metadata(
-        self, show_id: int, language: str | None = None
-    ) -> Show:
+    def get_show_metadata(self, show_id: int, language: str | None = None) -> Show:
         raise NotImplementedError()
     @abstractmethod
-    def get_movie_metadata(
-        self, movie_id: int, language: str | None = None
-    ) -> Movie:
+    def get_movie_metadata(self, movie_id: int, language: str | None = None) -> Movie:
         raise NotImplementedError()
     @abstractmethod


@@ -51,7 +51,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
             response.raise_for_status()
             return response.json()
         except requests.RequestException as e:
-            log.error(f"TMDB API error getting show metadata for ID {show_id}: {e}")
+            log.exception(f"TMDB API error getting show metadata for ID {show_id}")
             if notification_manager.is_configured():
                 notification_manager.send_notification(
                     title="TMDB API Error",
@@ -68,7 +68,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
             response.raise_for_status()
             return response.json()
         except requests.RequestException as e:
-            log.error(f"TMDB API error getting show external IDs for ID {show_id}: {e}")
+            log.exception(f"TMDB API error getting show external IDs for ID {show_id}")
             if notification_manager.is_configured():
                 notification_manager.send_notification(
                     title="TMDB API Error",
@@ -90,8 +90,8 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
             response.raise_for_status()
             return response.json()
         except requests.RequestException as e:
-            log.error(
-                f"TMDB API error getting season {season_number} metadata for show ID {show_id}: {e}"
+            log.exception(
+                f"TMDB API error getting season {season_number} metadata for show ID {show_id}"
             )
             if notification_manager.is_configured():
                 notification_manager.send_notification(
@@ -113,7 +113,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
             response.raise_for_status()
             return response.json()
         except requests.RequestException as e:
-            log.error(f"TMDB API error searching TV shows with query '{query}': {e}")
+            log.exception(f"TMDB API error searching TV shows with query '{query}'")
             if notification_manager.is_configured():
                 notification_manager.send_notification(
                     title="TMDB API Error",
@@ -131,7 +131,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
             response.raise_for_status()
             return response.json()
         except requests.RequestException as e:
-            log.error(f"TMDB API error getting trending TV: {e}")
+            log.exception("TMDB API error getting trending TV")
             if notification_manager.is_configured():
                 notification_manager.send_notification(
                     title="TMDB API Error",
@@ -151,7 +151,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
             response.raise_for_status()
             return response.json()
         except requests.RequestException as e:
-            log.error(f"TMDB API error getting movie metadata for ID {movie_id}: {e}")
+            log.exception(f"TMDB API error getting movie metadata for ID {movie_id}")
             if notification_manager.is_configured():
                 notification_manager.send_notification(
                     title="TMDB API Error",
@@ -167,8 +167,8 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
             response.raise_for_status()
             return response.json()
         except requests.RequestException as e:
-            log.error(
-                f"TMDB API error getting movie external IDs for ID {movie_id}: {e}"
+            log.exception(
+                f"TMDB API error getting movie external IDs for ID {movie_id}"
             )
             if notification_manager.is_configured():
                 notification_manager.send_notification(
@@ -190,7 +190,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
             response.raise_for_status()
             return response.json()
         except requests.RequestException as e:
-            log.error(f"TMDB API error searching movies with query '{query}': {e}")
+            log.exception(f"TMDB API error searching movies with query '{query}'")
             if notification_manager.is_configured():
                 notification_manager.send_notification(
                     title="TMDB API Error",
@@ -208,7 +208,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
             response.raise_for_status()
             return response.json()
         except requests.RequestException as e:
-            log.error(f"TMDB API error getting trending movies: {e}")
+            log.exception("TMDB API error getting trending movies")
             if notification_manager.is_configured():
                 notification_manager.send_notification(
                     title="TMDB API Error",
@@ -243,9 +243,7 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
         return True
     @override
-    def get_show_metadata(
-        self, show_id: int, language: str | None = None
-    ) -> Show:
+    def get_show_metadata(self, show_id: int, language: str | None = None) -> Show:
         """
         :param show_id: the external id of the show
@@ -368,14 +366,12 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
                         original_language=original_language,
                     )
                 )
-            except Exception as e:
-                log.warning(f"Error processing search result: {e}")
+            except Exception:
+                log.warning("Error processing search result", exc_info=True)
         return formatted_results
     @override
-    def get_movie_metadata(
-        self, movie_id: int, language: str | None = None
-    ) -> Movie:
+    def get_movie_metadata(self, movie_id: int, language: str | None = None) -> Movie:
         """
         Get movie metadata with language-aware fetching.
@@ -470,8 +466,8 @@ class TmdbMetadataProvider(AbstractMetadataProvider):
                         original_language=original_language,
                     )
                 )
-            except Exception as e:
-                log.warning(f"Error processing search result: {e}")
+            except Exception:
+                log.warning("Error processing search result", exc_info=True)
         return formatted_results
     @override


@@ -63,9 +63,7 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
         return False
     @override
-    def get_show_metadata(
-        self, show_id: int, language: str | None = None
-    ) -> Show:
+    def get_show_metadata(self, show_id: int, language: str | None = None) -> Show:
         """
         :param show_id: The external id of the show
@@ -150,8 +148,8 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
                         vote_average=None,
                     )
                 )
-            except Exception as e:
-                log.warning(f"Error processing search result: {e}")
+            except Exception:
+                log.warning("Error processing search result", exc_info=True)
         return formatted_results
         results = self.__get_trending_tv()
         formatted_results = []
@@ -178,8 +176,8 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
                         vote_average=None,
                     )
                 )
-            except Exception as e:
-                log.warning(f"Error processing search result: {e}")
+            except Exception:
+                log.warning("Error processing search result", exc_info=True)
         return formatted_results
     @override
@@ -215,8 +213,8 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
                         vote_average=None,
                     )
                 )
-            except Exception as e:
-                log.warning(f"Error processing search result: {e}")
+            except Exception:
+                log.warning("Error processing search result", exc_info=True)
         return formatted_results
         results = self.__get_trending_movies()
         results = results[0:20]
@@ -231,15 +229,15 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
                     year = None
                 if result.get("image"):
-                    poster_path = "https://artworks.thetvdb.com" + str(result.get("image"))
+                    poster_path = "https://artworks.thetvdb.com" + str(
+                        result.get("image")
+                    )
                 else:
                     poster_path = None
                 formatted_results.append(
                     MetaDataProviderSearchResult(
-                        poster_path= poster_path
-                        if result.get("image")
-                        else None,
+                        poster_path=poster_path if result.get("image") else None,
                         overview=result.get("overview"),
                         name=result["name"],
                         external_id=result["id"],
@@ -249,8 +247,8 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
                         vote_average=None,
                     )
                 )
-            except Exception as e:
-                log.warning(f"Error processing search result: {e}")
+            except Exception:
+                log.warning("Error processing search result", exc_info=True)
         return formatted_results
     @override
@@ -269,9 +267,7 @@ class TvdbMetadataProvider(AbstractMetadataProvider):
         return False
     @override
-    def get_movie_metadata(
-        self, movie_id: int, language: str | None = None
-    ) -> Movie:
+    def get_movie_metadata(self, movie_id: int, language: str | None = None) -> Movie:
         """
         :param movie_id: the external id of the movie


@@ -15,7 +15,7 @@ def download_poster_image(storage_path: Path, poster_url: str, uuid: UUID) -> bo
     res = requests.get(poster_url, stream=True, timeout=60)
     if res.status_code == 200:
-        image_file_path = storage_path.joinpath(str(uuid)).with_suffix("jpg")
+        image_file_path = storage_path.joinpath(str(uuid)).with_suffix(".jpg")
         image_file_path.write_bytes(res.content)
         original_image = Image.open(image_file_path)
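The one-character fix above matters because `pathlib.Path.with_suffix` requires the suffix to start with a dot; a bare `"jpg"` raises the `ValueError: Invalid suffix 'jpg'` mentioned in the commit message (the path below is illustrative):

```python
from pathlib import Path

base = Path("/data/images/posters").joinpath("0b1c2d3e")  # illustrative path

try:
    base.with_suffix("jpg")  # missing dot -> ValueError
    raised = False
except ValueError:
    raised = True
assert raised

image_file_path = base.with_suffix(".jpg")
assert image_file_path.name == "0b1c2d3e.jpg"
```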


@@ -59,8 +59,8 @@ class MovieRepository:
                 msg = f"Movie with id {movie_id} not found."
                 raise NotFoundError(msg)
             return MovieSchema.model_validate(result)
-        except SQLAlchemyError as e:
-            log.error(f"Database error while retrieving movie {movie_id}: {e}")
+        except SQLAlchemyError:
+            log.exception(f"Database error while retrieving movie {movie_id}")
             raise
     def get_movie_by_external_id(
@@ -86,9 +86,9 @@ class MovieRepository:
                 msg = f"Movie with external_id {external_id} and provider {metadata_provider} not found."
                 raise NotFoundError(msg)
             return MovieSchema.model_validate(result)
-        except SQLAlchemyError as e:
-            log.error(
-                f"Database error while retrieving movie by external_id {external_id}: {e}"
+        except SQLAlchemyError:
+            log.exception(
+                f"Database error while retrieving movie by external_id {external_id}"
             )
             raise
@@ -103,8 +103,8 @@ class MovieRepository:
             stmt = select(Movie)
             results = self.db.execute(stmt).scalars().unique().all()
             return [MovieSchema.model_validate(movie) for movie in results]
-        except SQLAlchemyError as e:
-            log.error(f"Database error while retrieving all movies: {e}")
+        except SQLAlchemyError:
+            log.exception("Database error while retrieving all movies")
             raise
     def save_movie(self, movie: MovieSchema) -> MovieSchema:
@@ -140,14 +140,14 @@ class MovieRepository:
             return MovieSchema.model_validate(db_movie)
         except IntegrityError as e:
             self.db.rollback()
-            log.error(f"Integrity error while saving movie {movie.name}: {e}")
+            log.exception(f"Integrity error while saving movie {movie.name}")
             msg = (
                 f"Movie with this primary key or unique constraint violation: {e.orig}"
             )
             raise ConflictError(msg) from e
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(f"Database error while saving movie {movie.name}: {e}")
+            log.exception(f"Database error while saving movie {movie.name}")
             raise
     def delete_movie(self, movie_id: MovieId) -> None:
@@ -168,9 +168,9 @@ class MovieRepository:
             self.db.delete(movie)
             self.db.commit()
             log.info(f"Successfully deleted movie with id: {movie_id}")
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(f"Database error while deleting movie {movie_id}: {e}")
+            log.exception(f"Database error while deleting movie {movie_id}")
             raise
     def add_movie_request(
@@ -204,13 +204,13 @@ class MovieRepository:
             self.db.refresh(db_model)
             log.info(f"Successfully added movie request with id: {db_model.id}")
             return MovieRequestSchema.model_validate(db_model)
-        except IntegrityError as e:
+        except IntegrityError:
             self.db.rollback()
-            log.error(f"Integrity error while adding movie request: {e}")
+            log.exception("Integrity error while adding movie request")
             raise
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(f"Database error while adding movie request: {e}")
+            log.exception("Database error while adding movie request")
             raise
     def set_movie_library(self, movie_id: MovieId, library: str) -> None:
@@ -229,9 +229,9 @@ class MovieRepository:
                 raise NotFoundError(msg)
             movie.library = library
             self.db.commit()
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(f"Database error setting library for movie {movie_id}: {e}")
+            log.exception(f"Database error setting library for movie {movie_id}")
             raise
     def delete_movie_request(self, movie_request_id: MovieRequestId) -> None:
@@ -251,10 +251,10 @@ class MovieRepository:
                 raise NotFoundError(msg)
             self.db.commit()
             # Successfully deleted movie request with id: {movie_request_id}
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(
-                f"Database error while deleting movie request {movie_request_id}: {e}"
+            log.exception(
+                f"Database error while deleting movie request {movie_request_id}"
             )
             raise
@@ -273,8 +273,8 @@ class MovieRepository:
             )
             results = self.db.execute(stmt).scalars().unique().all()
             return [RichMovieRequestSchema.model_validate(x) for x in results]
-        except SQLAlchemyError as e:
-            log.error(f"Database error while retrieving movie requests: {e}")
+        except SQLAlchemyError:
+            log.exception("Database error while retrieving movie requests")
             raise
     def add_movie_file(self, movie_file: MovieFileSchema) -> MovieFileSchema:
@@ -292,13 +292,13 @@ class MovieRepository:
             self.db.commit()
             self.db.refresh(db_model)
             return MovieFileSchema.model_validate(db_model)
-        except IntegrityError as e:
+        except IntegrityError:
             self.db.rollback()
-            log.error(f"Integrity error while adding movie file: {e}")
+            log.exception("Integrity error while adding movie file")
             raise
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(f"Database error while adding movie file: {e}")
+            log.exception("Database error while adding movie file")
             raise
     def remove_movie_files_by_torrent_id(self, torrent_id: TorrentId) -> int:
@@ -313,14 +313,15 @@ class MovieRepository:
             stmt = delete(MovieFile).where(MovieFile.torrent_id == torrent_id)
             result = self.db.execute(stmt)
             self.db.commit()
-            return result.rowcount
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(
-                f"Database error removing movie files for torrent_id {torrent_id}: {e}"
+            log.exception(
+                f"Database error removing movie files for torrent_id {torrent_id}"
             )
             raise
+        return result.rowcount
     def get_movie_files_by_movie_id(self, movie_id: MovieId) -> list[MovieFileSchema]:
         """
         Retrieve all movie files for a given movie ID.
@@ -333,9 +334,9 @@ class MovieRepository:
             stmt = select(MovieFile).where(MovieFile.movie_id == movie_id)
             results = self.db.execute(stmt).scalars().all()
             return [MovieFileSchema.model_validate(sf) for sf in results]
-        except SQLAlchemyError as e:
-            log.error(
-                f"Database error retrieving movie files for movie_id {movie_id}: {e}"
+        except SQLAlchemyError:
+            log.exception(
+                f"Database error retrieving movie files for movie_id {movie_id}"
             )
             raise
@@ -367,13 +368,13 @@ class MovieRepository:
                     usenet=torrent.usenet,
                 )
                 formatted_results.append(movie_torrent)
-            return formatted_results
-        except SQLAlchemyError as e:
-            log.error(
-                f"Database error retrieving torrents for movie_id {movie_id}: {e}"
-            )
+        except SQLAlchemyError:
+            log.exception(f"Database error retrieving torrents for movie_id {movie_id}")
             raise
+        return formatted_results
     def get_all_movies_with_torrents(self) -> list[MovieSchema]:
         """
         Retrieve all movies that are associated with a torrent, ordered alphabetically by movie name.
@@ -391,8 +392,8 @@ class MovieRepository:
             )
             results = self.db.execute(stmt).scalars().unique().all()
             return [MovieSchema.model_validate(movie) for movie in results]
-        except SQLAlchemyError as e:
-            log.error(f"Database error retrieving all movies with torrents: {e}")
+        except SQLAlchemyError:
+            log.exception("Database error retrieving all movies with torrents")
             raise
     def get_movie_request(self, movie_request_id: MovieRequestId) -> MovieRequestSchema:
@@ -410,10 +411,8 @@ class MovieRepository:
                 msg = f"Movie request with id {movie_request_id} not found."
                 raise NotFoundError(msg)
             return MovieRequestSchema.model_validate(request)
-        except SQLAlchemyError as e:
-            log.error(
-                f"Database error retrieving movie request {movie_request_id}: {e}"
-            )
+        except SQLAlchemyError:
+            log.exception(f"Database error retrieving movie request {movie_request_id}")
             raise
     def get_movie_by_torrent_id(self, torrent_id: TorrentId) -> MovieSchema:
@@ -436,10 +435,8 @@ class MovieRepository:
                 msg = f"Movie for torrent_id {torrent_id} not found."
                 raise NotFoundError(msg)
             return MovieSchema.model_validate(result)
-        except SQLAlchemyError as e:
-            log.error(
-                f"Database error retrieving movie by torrent_id {torrent_id}: {e}"
-            )
+        except SQLAlchemyError:
+            log.exception(f"Database error retrieving movie by torrent_id {torrent_id}")
             raise
     def update_movie_attributes(
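Several hunks in this file also move the `return` out of the `try` block (ruff's TRY300 pattern): the success value is returned after the `except` handlers, keeping the happy path out of exception-handling scope while the rollback-and-reraise path stays intact. A sketch with a fake session standing in for the SQLAlchemy one:

```python
class FakeSession:
    """Minimal stand-in for a DB session; not SQLAlchemy."""

    def __init__(self) -> None:
        self.rolled_back = False

    def execute(self, ok: bool) -> int:
        if not ok:
            raise RuntimeError("db error")
        return 3  # pretend rowcount

    def commit(self) -> None:
        pass

    def rollback(self) -> None:
        self.rolled_back = True

def remove_files(db: FakeSession, ok: bool) -> int:
    try:
        rowcount = db.execute(ok)
        db.commit()
    except RuntimeError:
        db.rollback()
        raise
    # Returning here, outside the try, mirrors the restructured hunks above.
    return rowcount

assert remove_files(FakeSession(), ok=True) == 3
```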


@@ -256,7 +256,7 @@ def authorize_request(
     movie_request_id: MovieRequestId,
     user: Annotated[UserRead, Depends(current_superuser)],
     authorized_status: bool = False,
-) -> MovieRequest:
+) -> None:
     """
     Authorize or de-authorize a movie request.
     """
@@ -268,7 +268,7 @@ def authorize_request(
         movie_request.authorized_by = user
     else:
         movie_request.authorized_by = None
-    return movie_service.update_movie_request(movie_request=movie_request)
+    movie_service.update_movie_request(movie_request=movie_request)
 @router.delete(


@@ -8,7 +8,7 @@ from sqlalchemy.orm import Session
 from media_manager.config import MediaManagerConfig
 from media_manager.database import SessionLocal, get_session
-from media_manager.exceptions import InvalidConfigError, NotFoundError
+from media_manager.exceptions import InvalidConfigError, NotFoundError, RenameError
 from media_manager.indexer.repository import IndexerRepository
 from media_manager.indexer.schemas import IndexerQueryResult, IndexerQueryResultId
 from media_manager.indexer.service import IndexerService
@@ -98,9 +98,7 @@ class MovieService:
         """
         return self.movie_repository.add_movie_request(movie_request=movie_request)
-    def get_movie_request_by_id(
-        self, movie_request_id: MovieRequestId
-    ) -> MovieRequest:
+    def get_movie_request_by_id(self, movie_request_id: MovieRequestId) -> MovieRequest:
         """
         Get a movie request by its ID.
@@ -151,10 +149,8 @@ class MovieService:
             try:
                 shutil.rmtree(movie_dir)
                 log.info(f"Deleted movie directory: {movie_dir}")
-            except OSError as e:
-                log.error(
-                    f"Deleting movie directory: {movie_dir} : {e.strerror}"
-                )
+            except OSError:
+                log.exception(f"Deleting movie directory: {movie_dir}")
         if delete_torrents:
             # Get all torrents associated with this movie
@@ -171,8 +167,10 @@ class MovieService:
                         torrent=torrent, delete_files=True
                     )
                     log.info(f"Deleted torrent: {torrent.torrent_title}")
-                except Exception as e:
-                    log.warning(f"Failed to delete torrent {torrent.hash}: {e}")
+                except Exception:
+                    log.warning(
+                        f"Failed to delete torrent {torrent.hash}", exc_info=True
+                    )
         # Delete from database
         self.movie_repository.delete_movie(movie_id=movie.id)
@@ -237,19 +235,19 @@ class MovieService:
                 self.movie_repository.get_movie_by_external_id(
                     external_id=external_id, metadata_provider=metadata_provider
                 )
-                return True
             except NotFoundError:
                 return False
         elif movie_id is not None:
             try:
                 self.movie_repository.get_movie_by_id(movie_id=movie_id)
-                return True
             except NotFoundError:
                 return False
         else:
             msg = "Use one of the provided overloads for this function!"
             raise ValueError(msg)
+        return True
     def get_all_available_torrents_for_movie(
         self, movie: Movie, search_query_override: str | None = None
     ) -> list[IndexerQueryResult]:
@@ -570,8 +568,8 @@ class MovieService:
         try:
             movie_root_path.mkdir(parents=True, exist_ok=True)
-        except Exception as e:
-            log.error(f"Failed to create directory {movie_root_path}: {e}")
+        except Exception:
+            log.exception(f"Failed to create directory {movie_root_path}")
             return False
         # import movie video
@@ -682,9 +680,8 @@ class MovieService:
         try:
             source_directory.rename(new_source_path)
         except Exception as e:
-            log.error(f"Failed to rename {source_directory} to {new_source_path}: {e}")
-            msg = "Failed to rename directory"
-            raise Exception(msg) from e
+            log.exception(f"Failed to rename {source_directory} to {new_source_path}")
+            raise RenameError from e
         video_files, subtitle_files, _all_files = get_files_for_import(
             directory=new_source_path
@@ -786,12 +783,14 @@ def auto_download_all_approved_movie_requests() -> None:
         movie_repository = MovieRepository(db=db)
         torrent_service = TorrentService(torrent_repository=TorrentRepository(db=db))
         indexer_service = IndexerService(indexer_repository=IndexerRepository(db=db))
-        notification_service = NotificationService(notification_repository=NotificationRepository(db=db))
+        notification_service = NotificationService(
+            notification_repository=NotificationRepository(db=db)
+        )
         movie_service = MovieService(
             movie_repository=movie_repository,
             torrent_service=torrent_service,
             indexer_service=indexer_service,
-            notification_service=notification_service
+            notification_service=notification_service,
) )
log.info("Auto downloading all approved movie requests") log.info("Auto downloading all approved movie requests")
@@ -821,7 +820,9 @@ def import_all_movie_torrents() -> None:
movie_repository = MovieRepository(db=db) movie_repository = MovieRepository(db=db)
torrent_service = TorrentService(torrent_repository=TorrentRepository(db=db)) torrent_service = TorrentService(torrent_repository=TorrentRepository(db=db))
indexer_service = IndexerService(indexer_repository=IndexerRepository(db=db)) indexer_service = IndexerService(indexer_repository=IndexerRepository(db=db))
notification_service = NotificationService(notification_repository=NotificationRepository(db=db)) notification_service = NotificationService(
notification_repository=NotificationRepository(db=db)
)
movie_service = MovieService( movie_service = MovieService(
movie_repository=movie_repository, movie_repository=movie_repository,
torrent_service=torrent_service, torrent_service=torrent_service,
@@ -841,11 +842,8 @@ def import_all_movie_torrents() -> None:
) )
continue continue
movie_service.import_torrent_files(torrent=t, movie=movie) movie_service.import_torrent_files(torrent=t, movie=movie)
except RuntimeError as e: except RuntimeError:
log.error( log.exception(f"Failed to import torrent {t.title}")
f"Failed to import torrent {t.title}: {e}",
exc_info=True,
)
log.info("Finished importing all torrents") log.info("Finished importing all torrents")
db.commit() db.commit()
@@ -860,7 +858,9 @@ def update_all_movies_metadata() -> None:
movie_repository=movie_repository, movie_repository=movie_repository,
torrent_service=TorrentService(torrent_repository=TorrentRepository(db=db)), torrent_service=TorrentService(torrent_repository=TorrentRepository(db=db)),
indexer_service=IndexerService(indexer_repository=IndexerRepository(db=db)), indexer_service=IndexerService(indexer_repository=IndexerRepository(db=db)),
notification_service=NotificationService(notification_repository=NotificationRepository(db=db)) notification_service=NotificationService(
notification_repository=NotificationRepository(db=db)
),
) )
log.info("Updating metadata for all movies") log.info("Updating metadata for all movies")
@@ -880,9 +880,9 @@ def update_all_movies_metadata() -> None:
f"Unsupported metadata provider {movie.metadata_provider} for movie {movie.name}, skipping update." f"Unsupported metadata provider {movie.metadata_provider} for movie {movie.name}, skipping update."
) )
continue continue
except InvalidConfigError as e: except InvalidConfigError:
log.error( log.exception(
f"Error initializing metadata provider {movie.metadata_provider} for movie {movie.name}: {e}" f"Error initializing metadata provider {movie.metadata_provider} for movie {movie.name}",
) )
continue continue
movie_service.update_movie_metadata( movie_service.update_movie_metadata(

View File
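The recurring edit in the hunks above replaces `log.error(f"...: {e}")` inside `except` blocks with `log.exception(...)`. A minimal sketch of the difference, assuming a stand-in function and an illustrative logger name and path (none of these identifiers are from the repository):

```python
import logging

logging.basicConfig(level=logging.ERROR)
log = logging.getLogger("media_manager.example")  # hypothetical logger name


def delete_directory(movie_dir: str) -> None:
    try:
        # stand-in for shutil.rmtree(movie_dir) failing
        raise OSError(f"cannot remove {movie_dir}")
    except OSError:
        # log.exception logs at ERROR level and appends the full traceback
        # automatically, so the message no longer needs to interpolate the
        # exception object by hand.
        log.exception(f"Deleting movie directory: {movie_dir}")


delete_directory("/data/movies/example")  # hypothetical path
```

Besides shorter messages, this preserves the traceback that a bare `{e}` interpolation would drop, which is why the linted codebase converges on it.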

@@ -3,7 +3,6 @@ Notification Manager - Orchestrates sending notifications through all configured
 """

 import logging
-from typing import List

 from media_manager.config import MediaManagerConfig
 from media_manager.notification.schemas import MessageNotification
@@ -33,7 +32,7 @@ class NotificationManager:
     def __init__(self) -> None:
         self.config = MediaManagerConfig().notifications
-        self.providers: List[AbstractNotificationServiceProvider] = []
+        self.providers: list[AbstractNotificationServiceProvider] = []
         self._initialize_providers()

     def _initialize_providers(self) -> None:
@@ -42,32 +41,32 @@ class NotificationManager:
             try:
                 self.providers.append(EmailNotificationServiceProvider())
                 logger.info("Email notification provider initialized")
-            except Exception as e:
-                logger.error(f"Failed to initialize Email provider: {e}")
+            except Exception:
+                logger.exception("Failed to initialize Email provider")

         # Gotify provider
         if self.config.gotify.enabled:
             try:
                 self.providers.append(GotifyNotificationServiceProvider())
                 logger.info("Gotify notification provider initialized")
-            except Exception as e:
-                logger.error(f"Failed to initialize Gotify provider: {e}")
+            except Exception:
+                logger.exception("Failed to initialize Gotify provider")

         # Ntfy provider
         if self.config.ntfy.enabled:
             try:
                 self.providers.append(NtfyNotificationServiceProvider())
                 logger.info("Ntfy notification provider initialized")
-            except Exception as e:
-                logger.error(f"Failed to initialize Ntfy provider: {e}")
+            except Exception:
+                logger.exception("Failed to initialize Ntfy provider")

         # Pushover provider
         if self.config.pushover.enabled:
             try:
                 self.providers.append(PushoverNotificationServiceProvider())
                 logger.info("Pushover notification provider initialized")
-            except Exception as e:
-                logger.error(f"Failed to initialize Pushover provider: {e}")
+            except Exception:
+                logger.exception("Failed to initialize Pushover provider")

         logger.info(f"Initialized {len(self.providers)} notification providers")
@@ -86,10 +85,10 @@ class NotificationManager:
                 else:
                     logger.warning(f"Failed to send notification via {provider_name}")
-            except Exception as e:
-                logger.error(f"Error sending notification via {provider_name}: {e}")
+            except Exception:
+                logger.exception(f"Error sending notification via {provider_name}")

-    def get_configured_providers(self) -> List[str]:
+    def get_configured_providers(self) -> list[str]:
         return [provider.__class__.__name__ for provider in self.providers]

     def is_configured(self) -> bool:

View File
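The `List` → `list` change above is PEP 585: since Python 3.9 the built-in containers are subscriptable in annotations, so the `typing` import disappears. A trimmed sketch of the class from the hunk, with provider objects stubbed as strings for illustration:

```python
from typing import get_type_hints


class NotificationManager:
    """Trimmed illustration; real providers are stubbed as plain strings."""

    def __init__(self) -> None:
        # PEP 585 (Python 3.9+): `list[str]` replaces `typing.List[str]`,
        # so `from typing import List` is no longer needed.
        self.providers: list[str] = []

    def get_configured_providers(self) -> list[str]:
        return [provider for provider in self.providers]


manager = NotificationManager()
manager.providers.append("EmailNotificationServiceProvider")
print(manager.get_configured_providers())

# The built-in generic round-trips through the annotation machinery unchanged.
hints = get_type_hints(NotificationManager.get_configured_providers)
print(hints["return"])
```

The runtime behaviour is identical; only the annotation spelling and the import change, which is why the diff touches no logic.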

@@ -6,6 +6,7 @@ from sqlalchemy.exc import (
     SQLAlchemyError,
 )
 from sqlalchemy.orm import Session
+from sqlalchemy.sql.expression import false

 from media_manager.exceptions import ConflictError, NotFoundError
 from media_manager.notification.models import Notification
@@ -36,7 +37,7 @@ class NotificationRepository:
         try:
             stmt = (
                 select(Notification)
-                .where(Notification.read == False)  # noqa: E712
+                .where(Notification.read == false())
                 .order_by(Notification.timestamp.desc())
             )
             results = self.db.execute(stmt).scalars().all()
@@ -44,8 +45,8 @@ class NotificationRepository:
                 NotificationSchema.model_validate(notification)
                 for notification in results
             ]
-        except SQLAlchemyError as e:
-            log.error(f"Database error while retrieving unread notifications: {e}")
+        except SQLAlchemyError:
+            log.exception("Database error while retrieving unread notifications")
             raise

     def get_all_notifications(self) -> list[NotificationSchema]:
@@ -56,8 +57,8 @@ class NotificationRepository:
                 NotificationSchema.model_validate(notification)
                 for notification in results
             ]
-        except SQLAlchemyError as e:
-            log.error(f"Database error while retrieving notifications: {e}")
+        except SQLAlchemyError:
+            log.exception("Database error while retrieving notifications")
             raise

     def save_notification(self, notification: NotificationSchema) -> None:
@@ -71,8 +72,8 @@ class NotificationRepository:
                 )
             )
             self.db.commit()
-        except IntegrityError as e:
-            log.error(f"Could not save notification, Error: {e}")
+        except IntegrityError:
+            log.exception("Could not save notification")
             msg = f"Notification with id {notification.id} already exists."
             raise ConflictError(msg) from None
         return

View File

@@ -12,7 +12,8 @@ class Notification(BaseModel):
     model_config = ConfigDict(from_attributes=True)

     id: NotificationId = Field(
-        default_factory=lambda: NotificationId(uuid.uuid4()), description="Unique identifier for the notification"
+        default_factory=lambda: NotificationId(uuid.uuid4()),
+        description="Unique identifier for the notification",
     )
     read: bool = Field(False, description="Whether the notification has been read")
     message: str = Field(description="The content of the notification")

View File

@@ -1,8 +1,7 @@
-from pydantic_settings import BaseSettings, SettingsConfigDict
+from pydantic_settings import BaseSettings


 class QbittorrentConfig(BaseSettings):
-    model_config = SettingsConfigDict(env_prefix="QBITTORRENT_")
     host: str = "localhost"
     port: int = 8080
     username: str = "admin"
@@ -14,7 +13,6 @@ class QbittorrentConfig(BaseSettings):

 class TransmissionConfig(BaseSettings):
-    model_config = SettingsConfigDict(env_prefix="TRANSMISSION_")
     path: str = "/transmission/rpc"
     https_enabled: bool = True
     host: str = "localhost"
@@ -25,7 +23,6 @@ class TransmissionConfig(BaseSettings):

 class SabnzbdConfig(BaseSettings):
-    model_config = SettingsConfigDict(env_prefix="SABNZBD_")
     host: str = "localhost"
     port: int = 8080
     api_key: str = ""

View File

@@ -53,8 +53,8 @@ class QbittorrentDownloadClient(AbstractDownloadClient):
         )
         try:
             self.api_client.auth_log_in()
-        except Exception as e:
-            log.error(f"Failed to log into qbittorrent: {e}")
+        except Exception:
+            log.exception("Failed to log into qbittorrent")
             raise

         try:
@@ -72,11 +72,8 @@ class QbittorrentDownloadClient(AbstractDownloadClient):
                 if self.config.category_save_path != ""
                 else None,
             )
-        except Exception as e:
-            if str(e) != "":
-                log.error(
-                    f"Error on updating MediaManager category in qBittorrent, error: {e}"
-                )
+        except Exception:
+            log.exception("Error on updating MediaManager category in qBittorrent")

     def download_torrent(self, indexer_result: IndexerQueryResult) -> Torrent:
         """

View File

@@ -38,8 +38,8 @@ class SabnzbdDownloadClient(AbstractDownloadClient):
         try:
             # Test connection
             self.client.version()
-        except Exception as e:
-            log.error(f"Failed to connect to SABnzbd: {e}")
+        except Exception:
+            log.exception("Failed to connect to SABnzbd")
             raise

     def download_torrent(self, indexer_result: IndexerQueryResult) -> Torrent:
@@ -55,10 +55,7 @@ class SabnzbdDownloadClient(AbstractDownloadClient):
                 url=str(indexer_result.download_url), nzbname=indexer_result.title
             )
             if not response["status"]:
-                error_msg = response
-                log.error(f"Failed to add NZB to SABnzbd: {error_msg}")
-                msg = f"Failed to add NZB to SABnzbd: {error_msg}"
-                raise RuntimeError(msg)
+                raise RuntimeError(f"Failed to add NZB to SABnzbd: {response}")  # noqa: EM102, TRY003, TRY301

             # Generate a hash for the NZB (using title and download URL)
             nzo_id = response["nzo_ids"][0]
@@ -75,13 +72,12 @@ class SabnzbdDownloadClient(AbstractDownloadClient):
             # Get initial status from SABnzbd
             torrent.status = self.get_torrent_status(torrent)
-
-            return torrent
-        except Exception as e:
-            log.error(f"Failed to download NZB {indexer_result.title}: {e}")
+        except Exception:
+            log.exception(f"Failed to download NZB {indexer_result.title}")
             raise

+        return torrent
+
     def remove_torrent(self, torrent: Torrent, delete_data: bool = False) -> None:
         """
         Remove a torrent from SABnzbd.
@@ -91,8 +87,8 @@ class SabnzbdDownloadClient(AbstractDownloadClient):
         """
         try:
             self.client.delete_job(nzo_id=torrent.hash, delete_files=delete_data)
-        except Exception as e:
-            log.error(f"Failed to remove torrent {torrent.title}: {e}")
+        except Exception:
+            log.exception(f"Failed to remove torrent {torrent.title}")
             raise

     def pause_torrent(self, torrent: Torrent) -> None:
@@ -103,8 +99,8 @@ class SabnzbdDownloadClient(AbstractDownloadClient):
         """
         try:
             self.client.pause_job(nzo_id=torrent.hash)
-        except Exception as e:
-            log.error(f"Failed to pause torrent {torrent.title}: {e}")
+        except Exception:
+            log.exception(f"Failed to pause torrent {torrent.title}")
             raise

     def resume_torrent(self, torrent: Torrent) -> None:
@@ -115,8 +111,8 @@ class SabnzbdDownloadClient(AbstractDownloadClient):
         """
         try:
             self.client.resume_job(nzo_id=torrent.hash)
-        except Exception as e:
-            log.error(f"Failed to resume torrent {torrent.title}: {e}")
+        except Exception:
+            log.exception(f"Failed to resume torrent {torrent.title}")
             raise

     def get_torrent_status(self, torrent: Torrent) -> TorrentStatus:

View File

@@ -43,8 +43,8 @@ class TransmissionDownloadClient(AbstractDownloadClient):
             )
             # Test connection
             self._client.session_stats()
-        except Exception as e:
-            log.error(f"Failed to connect to Transmission: {e}")
+        except Exception:
+            log.exception("Failed to connect to Transmission")
             raise

     def download_torrent(self, indexer_result: IndexerQueryResult) -> Torrent:
@@ -68,8 +68,8 @@ class TransmissionDownloadClient(AbstractDownloadClient):
                 f"Successfully added torrent to Transmission: {indexer_result.title}"
             )
-        except Exception as e:
-            log.error(f"Failed to add torrent to Transmission: {e}")
+        except Exception:
+            log.exception("Failed to add torrent to Transmission")
             raise

         torrent = Torrent(
@@ -95,8 +95,8 @@ class TransmissionDownloadClient(AbstractDownloadClient):
         try:
             self._client.remove_torrent(torrent.hash, delete_data=delete_data)
-        except Exception as e:
-            log.error(f"Failed to remove torrent: {e}")
+        except Exception:
+            log.exception("Failed to remove torrent")
             raise

     def get_torrent_status(self, torrent: Torrent) -> TorrentStatus:
@@ -123,13 +123,12 @@ class TransmissionDownloadClient(AbstractDownloadClient):
                 log.warning(
                     f"Torrent {torrent.title} has error status: {transmission_torrent.error_string}"
                 )
-
-            return status
-        except Exception as e:
-            log.error(f"Failed to get torrent status: {e}")
+        except Exception:
+            log.exception("Failed to get torrent status")
             return TorrentStatus.error

+        return status
+
     def pause_torrent(self, torrent: Torrent) -> None:
         """
         Pause a torrent download.
@@ -140,8 +139,8 @@ class TransmissionDownloadClient(AbstractDownloadClient):
             self._client.stop_torrent(torrent.hash)
             log.debug(f"Successfully paused torrent: {torrent.title}")
-        except Exception as e:
-            log.error(f"Failed to pause torrent: {e}")
+        except Exception:
+            log.exception("Failed to pause torrent")
             raise

     def resume_torrent(self, torrent: Torrent) -> None:
@@ -154,6 +153,6 @@ class TransmissionDownloadClient(AbstractDownloadClient):
             self._client.start_torrent(torrent.hash)
             log.debug(f"Successfully resumed torrent: {torrent.title}")
-        except Exception as e:
-            log.error(f"Failed to resume torrent: {e}")
+        except Exception:
+            log.exception("Failed to resume torrent")
             raise

View File
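The `get_torrent_status` hunk above also restructures control flow: the `except` clause returns the error fallback, so the happy-path `return status` moves after the `try` block and only the client call stays guarded. A minimal sketch of that pattern, assuming a stub in place of the real Transmission RPC call and a trimmed two-member status enum (both illustrative, not from the repository):

```python
from enum import Enum


class TorrentStatus(Enum):
    downloading = "downloading"
    error = "error"


def get_torrent_status(client_fails: bool) -> TorrentStatus:
    try:
        # stand-in for the Transmission RPC lookup
        if client_fails:
            raise RuntimeError("Transmission RPC unreachable")
        status = TorrentStatus.downloading
    except Exception:
        # the except clause owns the fallback return, so the success
        # return can sit after the try block instead of inside it
        return TorrentStatus.error
    return status


print(get_torrent_status(client_fails=False).value)  # downloading
print(get_torrent_status(client_fails=True).value)   # error
```

Keeping `return status` outside the `try` narrows what the broad `except Exception` can swallow, which is the usual motivation for this lint-driven rewrite.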

@@ -43,22 +43,22 @@ class DownloadManager:
         if self.config.qbittorrent.enabled:
             try:
                 self._torrent_client = QbittorrentDownloadClient()
-            except Exception as e:
-                log.error(f"Failed to initialize qBittorrent client: {e}")
+            except Exception:
+                log.exception("Failed to initialize qBittorrent client")

         # If qBittorrent is not available or failed, try Transmission
         if self._torrent_client is None and self.config.transmission.enabled:
             try:
                 self._torrent_client = TransmissionDownloadClient()
-            except Exception as e:
-                log.error(f"Failed to initialize Transmission client: {e}")
+            except Exception:
+                log.exception("Failed to initialize Transmission client")

         # Initialize SABnzbd client for usenet
         if self.config.sabnzbd.enabled:
             try:
                 self._usenet_client = SabnzbdDownloadClient()
-            except Exception as e:
-                log.error(f"Failed to initialize SABnzbd client: {e}")
+            except Exception:
+                log.exception("Failed to initialize SABnzbd client")

         active_clients = []
         if self._torrent_client:

View File

@@ -87,7 +87,9 @@ class TorrentRepository:
             return None
         return MovieSchema.model_validate(result)

-    def get_movie_files_of_torrent(self, torrent_id: TorrentId) -> list[MovieFileSchema]:
+    def get_movie_files_of_torrent(
+        self, torrent_id: TorrentId
+    ) -> list[MovieFileSchema]:
         stmt = select(MovieFile).where(MovieFile.torrent_id == torrent_id)
         result = self.db.execute(stmt).scalars().all()
         return [MovieFileSchema.model_validate(movie_file) for movie_file in result]

View File

@@ -92,8 +92,8 @@ class TorrentService:
         for x in self.torrent_repository.get_all_torrents():
             try:
                 torrents.append(self.get_torrent_status(x))
-            except RuntimeError as e:
-                log.error(f"Error fetching status for torrent {x.title}: {e}")
+            except RuntimeError:
+                log.exception(f"Error fetching status for torrent {x.title}")
         return torrents

     def get_torrent_by_id(self, torrent_id: TorrentId) -> Torrent:

View File

@@ -9,6 +9,7 @@ import bencoder
 import libtorrent
 import patoolib
 import requests
+from pathvalidate import sanitize_filename
 from requests.exceptions import InvalidSchema

 from media_manager.config import MediaManagerConfig
@@ -57,8 +58,8 @@ def extract_archives(files: list) -> None:
         )
         try:
             patoolib.extract_archive(str(file), outdir=str(file.parent))
-        except patoolib.util.PatoolError as e:
-            log.error(f"Failed to extract archive {file}. Error: {e}")
+        except patoolib.util.PatoolError:
+            log.exception(f"Failed to extract archive {file}")


 def get_torrent_filepath(torrent: Torrent) -> Path:
@@ -72,10 +73,10 @@ def import_file(target_file: Path, source_file: Path) -> None:
     try:
         target_file.hardlink_to(source_file)
     except FileExistsError:
-        log.error(f"File already exists at {target_file}.")
-    except (OSError, UnsupportedOperation, NotImplementedError) as e:
-        log.error(
-            f"Failed to create hardlink from {source_file} to {target_file}: {e}. Falling back to copying the file."
+        log.exception(f"File already exists at {target_file}.")
+    except (OSError, UnsupportedOperation, NotImplementedError):
+        log.exception(
+            f"Failed to create hardlink from {source_file} to {target_file}. Falling back to copying the file."
         )
         shutil.copy(src=source_file, dst=target_file)
@@ -132,7 +133,8 @@ def get_torrent_hash(torrent: IndexerQueryResult) -> str:
     :return: The hash of the torrent.
     """
     torrent_filepath = (
-        MediaManagerConfig().misc.torrent_directory / f"{torrent.title}.torrent"
+        MediaManagerConfig().misc.torrent_directory
+        / f"{sanitize_filename(torrent.title)}.torrent"
     )
     if torrent_filepath.exists():
         log.warning(f"Torrent file already exists at: {torrent_filepath}")
@@ -148,16 +150,16 @@ def get_torrent_hash(torrent: IndexerQueryResult) -> str:
             response = requests.get(str(torrent.download_url), timeout=30)
             response.raise_for_status()
             torrent_content = response.content
-        except InvalidSchema as e:
-            log.debug(f"Invalid schema for URL {torrent.download_url}: {e}")
+        except InvalidSchema:
+            log.debug(f"Invalid schema for URL {torrent.download_url}", exc_info=True)
             final_url = follow_redirects_to_final_torrent_url(
                 initial_url=torrent.download_url,
                 session=requests.Session(),
                 timeout=MediaManagerConfig().indexers.prowlarr.timeout_seconds,
             )
             return str(libtorrent.parse_magnet_uri(final_url).info_hash)
-        except Exception as e:
-            log.error(f"Failed to download torrent file: {e}")
+        except Exception:
+            log.exception("Failed to download torrent file")
             raise

     # saving the torrent file
@@ -170,9 +172,10 @@ def get_torrent_hash(torrent: IndexerQueryResult) -> str:
         torrent_hash = hashlib.sha1(  # noqa: S324
             bencoder.encode(decoded_content[b"info"])
         ).hexdigest()
-    except Exception as e:
-        log.error(f"Failed to decode torrent file: {e}")
+    except Exception:
+        log.exception("Failed to decode torrent file")
         raise

     return torrent_hash

View File

@@ -67,8 +67,8 @@ class TvRepository:
                 msg = f"Show with id {show_id} not found."
                 raise NotFoundError(msg)
             return ShowSchema.model_validate(result)
-        except SQLAlchemyError as e:
-            log.error(f"Database error while retrieving show {show_id}: {e}")
+        except SQLAlchemyError:
+            log.exception(f"Database error while retrieving show {show_id}")
             raise

     def get_show_by_external_id(
@@ -95,9 +95,9 @@ class TvRepository:
                 msg = f"Show with external_id {external_id} and provider {metadata_provider} not found."
                 raise NotFoundError(msg)
             return ShowSchema.model_validate(result)
-        except SQLAlchemyError as e:
-            log.error(
-                f"Database error while retrieving show by external_id {external_id}: {e}"
+        except SQLAlchemyError:
+            log.exception(
+                f"Database error while retrieving show by external_id {external_id}",
             )
             raise
@@ -114,8 +114,8 @@ class TvRepository:
             )
             results = self.db.execute(stmt).scalars().unique().all()
             return [ShowSchema.model_validate(show) for show in results]
-        except SQLAlchemyError as e:
-            log.error(f"Database error while retrieving all shows: {e}")
+        except SQLAlchemyError:
+            log.exception("Database error while retrieving all shows")
             raise

     def get_total_downloaded_episodes_count(self) -> int:
@@ -124,11 +124,9 @@ class TvRepository:
                 select(func.count()).select_from(Episode).join(Season).join(SeasonFile)
             )
             return self.db.execute(stmt).scalar_one_or_none()
-        except SQLAlchemyError as e:
-            log.error(
-                f"Database error while calculating downloaded episodes count: {e}"
-            )
-            raise e
+        except SQLAlchemyError:
+            log.exception("Database error while calculating downloaded episodes count")
+            raise

     def save_show(self, show: ShowSchema) -> ShowSchema:
         """
@@ -192,9 +190,9 @@ class TvRepository:
             self.db.rollback()
             msg = f"Show with this primary key or unique constraint violation: {e.orig}"
             raise ConflictError(msg) from e
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(f"Database error while saving show {show.name}: {e}")
+            log.exception(f"Database error while saving show {show.name}")
             raise

     def delete_show(self, show_id: ShowId) -> None:
@@ -212,9 +210,9 @@ class TvRepository:
                 raise NotFoundError(msg)
             self.db.delete(show)
             self.db.commit()
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(f"Database error while deleting show {show_id}: {e}")
+            log.exception(f"Database error while deleting show {show_id}")
             raise

     def get_season(self, season_id: SeasonId) -> SeasonSchema:
@@ -232,8 +230,8 @@ class TvRepository:
                 msg = f"Season with id {season_id} not found."
                 raise NotFoundError(msg)
             return SeasonSchema.model_validate(season)
-        except SQLAlchemyError as e:
-            log.error(f"Database error while retrieving season {season_id}: {e}")
+        except SQLAlchemyError:
+            log.exception(f"Database error while retrieving season {season_id}")
             raise

     def add_season_request(
@@ -265,13 +263,13 @@ class TvRepository:
             self.db.commit()
             self.db.refresh(db_model)
             return SeasonRequestSchema.model_validate(db_model)
-        except IntegrityError as e:
+        except IntegrityError:
             self.db.rollback()
-            log.error(f"Integrity error while adding season request: {e}")
+            log.exception("Integrity error while adding season request")
             raise
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(f"Database error while adding season request: {e}")
+            log.exception("Database error while adding season request")
             raise

     def delete_season_request(self, season_request_id: SeasonRequestId) -> None:
@@ -290,10 +288,10 @@ class TvRepository:
                 msg = f"SeasonRequest with id {season_request_id} not found."
                 raise NotFoundError(msg)
             self.db.commit()
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(
-                f"Database error while deleting season request {season_request_id}: {e}"
+            log.exception(
+                f"Database error while deleting season request {season_request_id}"
             )
             raise
@@ -319,9 +317,9 @@ class TvRepository:
                 msg = f"Season number {season_number} for show_id {show_id} not found."
                 raise NotFoundError(msg)
             return SeasonSchema.model_validate(result)
-        except SQLAlchemyError as e:
-            log.error(
-                f"Database error retrieving season {season_number} for show {show_id}: {e}"
+        except SQLAlchemyError:
+            log.exception(
+                f"Database error retrieving season {season_number} for show {show_id}"
             )
             raise
@@ -353,8 +351,8 @@ class TvRepository:
                 )
                 for x in results
             ]
-        except SQLAlchemyError as e:
-            log.error(f"Database error while retrieving season requests: {e}")
+        except SQLAlchemyError:
+            log.exception("Database error while retrieving season requests")
             raise

     def add_season_file(self, season_file: SeasonFileSchema) -> SeasonFileSchema:
@@ -372,13 +370,13 @@ class TvRepository:
             self.db.commit()
             self.db.refresh(db_model)
             return SeasonFileSchema.model_validate(db_model)
-        except IntegrityError as e:
+        except IntegrityError:
             self.db.rollback()
-            log.error(f"Integrity error while adding season file: {e}")
+            log.exception("Integrity error while adding season file")
             raise
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(f"Database error while adding season file: {e}")
+            log.exception("Database error while adding season file")
             raise

     def remove_season_files_by_torrent_id(self, torrent_id: TorrentId) -> int:
@@ -393,13 +391,13 @@ class TvRepository:
             stmt = delete(SeasonFile).where(SeasonFile.torrent_id == torrent_id)
             result = self.db.execute(stmt)
             self.db.commit()
-            return result.rowcount
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(
-                f"Database error removing season files for torrent_id {torrent_id}: {e}"
+            log.exception(
+                f"Database error removing season files for torrent_id {torrent_id}"
) )
raise raise
return result.rowcount
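The `remove_season_files_by_torrent_id` hunk above applies ruff's TRY300 pattern: the `return` moves out of the `try` block, so only the statements that can actually raise stay guarded, and the success path becomes a plain fall-through. A minimal sketch of the before/after shapes (hypothetical function names, not from the repository):

```python
import logging

log = logging.getLogger(__name__)


def rows_deleted_old(values: list[int]) -> int:
    # Pre-refactor shape: the return sits inside the try block, so the
    # except clause notionally guards the return expression as well.
    try:
        total = sum(values)
        return total
    except TypeError:
        log.exception("could not sum values")
        raise


def rows_deleted_new(values: list[int]) -> int:
    # Post-refactor shape (ruff TRY300): the except clause re-raises on
    # failure, so the success path can fall through to a return placed
    # after the try/except -- same behaviour, clearer control flow.
    try:
        total = sum(values)
    except TypeError:
        log.exception("could not sum values")
        raise
    return total
```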
     def set_show_library(self, show_id: ShowId, library: str) -> None:
         """
@@ -417,9 +415,9 @@ class TvRepository:
                 raise NotFoundError(msg)
             show.library = library
             self.db.commit()
-        except SQLAlchemyError as e:
+        except SQLAlchemyError:
             self.db.rollback()
-            log.error(f"Database error setting library for show {show_id}: {e}")
+            log.exception(f"Database error setting library for show {show_id}")
             raise
 
     def get_season_files_by_season_id(
@@ -436,9 +434,9 @@ class TvRepository:
             stmt = select(SeasonFile).where(SeasonFile.season_id == season_id)
             results = self.db.execute(stmt).scalars().all()
             return [SeasonFileSchema.model_validate(sf) for sf in results]
-        except SQLAlchemyError as e:
-            log.error(
-                f"Database error retrieving season files for season_id {season_id}: {e}"
+        except SQLAlchemyError:
+            log.exception(
+                f"Database error retrieving season files for season_id {season_id}"
             )
             raise
@@ -460,8 +458,8 @@ class TvRepository:
             )
             results = self.db.execute(stmt).scalars().unique().all()
             return [TorrentSchema.model_validate(torrent) for torrent in results]
-        except SQLAlchemyError as e:
-            log.error(f"Database error retrieving torrents for show_id {show_id}: {e}")
+        except SQLAlchemyError:
+            log.exception(f"Database error retrieving torrents for show_id {show_id}")
             raise
 
     def get_all_shows_with_torrents(self) -> list[ShowSchema]:
@@ -483,8 +481,8 @@ class TvRepository:
             )
             results = self.db.execute(stmt).scalars().unique().all()
             return [ShowSchema.model_validate(show) for show in results]
-        except SQLAlchemyError as e:
-            log.error(f"Database error retrieving all shows with torrents: {e}")
+        except SQLAlchemyError:
+            log.exception("Database error retrieving all shows with torrents")
             raise
 
     def get_seasons_by_torrent_id(self, torrent_id: TorrentId) -> list[SeasonNumber]:
@@ -504,9 +502,9 @@ class TvRepository:
             )
             results = self.db.execute(stmt).scalars().unique().all()
             return [SeasonNumber(x) for x in results]
-        except SQLAlchemyError as e:
-            log.error(
-                f"Database error retrieving season numbers for torrent_id {torrent_id}: {e}"
+        except SQLAlchemyError:
+            log.exception(
+                f"Database error retrieving season numbers for torrent_id {torrent_id}"
             )
             raise
@@ -528,9 +526,9 @@ class TvRepository:
                 msg = f"Season request with id {season_request_id} not found."
                 raise NotFoundError(msg)
             return SeasonRequestSchema.model_validate(request)
-        except SQLAlchemyError as e:
-            log.error(
-                f"Database error retrieving season request {season_request_id}: {e}"
+        except SQLAlchemyError:
+            log.exception(
+                f"Database error retrieving season request {season_request_id}"
             )
             raise
@@ -555,8 +553,8 @@ class TvRepository:
                 msg = f"Show for season_id {season_id} not found."
                 raise NotFoundError(msg)
             return ShowSchema.model_validate(result)
-        except SQLAlchemyError as e:
-            log.error(f"Database error retrieving show by season_id {season_id}: {e}")
+        except SQLAlchemyError:
+            log.exception(f"Database error retrieving show by season_id {season_id}")
             raise
 
     def add_season_to_show(


@@ -94,7 +94,9 @@ def get_all_importable_shows(
     dependencies=[Depends(current_superuser)],
     status_code=status.HTTP_204_NO_CONTENT,
 )
-def import_detected_show(tv_service: tv_service_dep, tv_show: show_dep, directory: str) -> None:
+def import_detected_show(
+    tv_service: tv_service_dep, tv_show: show_dep, directory: str
+) -> None:
     """
     Import a detected show from the specified directory into the library.
     """
@@ -352,7 +354,6 @@ def authorize_request(
     if not authorized_status:
         season_request.authorized_by = None
     tv_service.update_season_request(season_request=season_request)
-    return
 
 
 @router.delete(
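The removed trailing `return` is a no-op: a Python function that falls off the end already returns `None`, which is exactly what the 204 No Content endpoint needs. A tiny illustration (hypothetical stand-in bodies):

```python
def with_trailing_return() -> None:
    work = []
    work.append("update season request")  # stand-in for the service call
    return  # redundant: falling off the end returns None anyway


def without_trailing_return() -> None:
    work = []
    work.append("update season request")  # stand-in for the service call
```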


@@ -8,7 +8,7 @@ from sqlalchemy.exc import IntegrityError
 from media_manager.config import MediaManagerConfig
 from media_manager.database import get_session
-from media_manager.exceptions import InvalidConfigError, NotFoundError
+from media_manager.exceptions import InvalidConfigError, NotFoundError, RenameError
 from media_manager.indexer.repository import IndexerRepository
 from media_manager.indexer.schemas import IndexerQueryResult, IndexerQueryResultId
 from media_manager.indexer.service import IndexerService
@@ -174,8 +174,10 @@ class TvService:
             try:
                 self.torrent_service.cancel_download(torrent, delete_files=True)
                 log.info(f"Deleted torrent: {torrent.hash}")
-            except Exception as e:
-                log.warning(f"Failed to delete torrent {torrent.hash}: {e}")
+            except Exception:
+                log.warning(
+                    f"Failed to delete torrent {torrent.hash}", exc_info=True
+                )
         self.tv_repository.delete_show(show_id=show.id)
@@ -226,19 +228,19 @@ class TvService:
                 self.tv_repository.get_show_by_external_id(
                     external_id=external_id, metadata_provider=metadata_provider
                 )
-                return True
             except NotFoundError:
                 return False
         elif show_id is not None:
             try:
                 self.tv_repository.get_show_by_id(show_id=show_id)
-                return True
             except NotFoundError:
                 return False
         else:
             msg = "Use one of the provided overloads for this function!"
             raise ValueError(msg)
+        return True
 
     def get_all_available_torrents_for_a_season(
         self,
         season_number: int,
@@ -379,8 +381,9 @@ class TvService:
             if torrent_file.imported:
                 return True
-        except RuntimeError as e:
-            log.error(f"Error retrieving torrent, error: {e}")
+        except RuntimeError:
+            log.exception("Error retrieving torrent")
             return False
 
     def get_show_by_external_id(
@@ -641,7 +644,7 @@ class TvService:
             return True
         else:
             msg = f"Could not find any video file for episode {episode_number} of show {show.name} S{season.number}"
-            raise Exception(msg)
+            raise Exception(msg)  # noqa: TRY002 # TODO: resolve this
 
     def import_season(
         self,
@@ -659,9 +662,9 @@ class TvService:
         try:
             season_path.mkdir(parents=True, exist_ok=True)
         except Exception as e:
-            log.warning(f"Could not create path {season_path}: {e}")
+            log.exception(f"Could not create path {season_path}")
             msg = f"Could not create path {season_path}"
-            raise Exception(msg) from e
+            raise Exception(msg) from e  # noqa: TRY002 # TODO: resolve this
 
         for episode in season.episodes:
             try:
@@ -901,9 +904,8 @@ class TvService:
         try:
             source_directory.rename(new_source_path)
         except Exception as e:
-            log.error(f"Failed to rename {source_directory} to {new_source_path}: {e}")
-            msg = "Failed to rename source directory"
-            raise Exception(msg) from e
+            log.exception(f"Failed to rename {source_directory} to {new_source_path}")
+            raise RenameError from e
 
         video_files, subtitle_files, _all_files = get_files_for_import(
             directory=new_source_path
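The rename hunk replaces a generic `Exception` with the new domain-specific `RenameError`, still chained with `from e` so the low-level error stays attached as `__cause__` and both appear in the traceback. A minimal sketch (this `RenameError` is a local stand-in, not the actual class from `media_manager.exceptions`):

```python
class RenameError(Exception):
    """Minimal stand-in for media_manager.exceptions.RenameError."""


def rename_with_context() -> None:
    try:
        raise OSError("cross-device link")  # simulated rename failure
    except OSError as e:
        # "raise ... from e" stores the OSError on __cause__, so the
        # traceback shows "The above exception was the direct cause ..."
        raise RenameError("failed to rename source directory") from e


try:
    rename_with_context()
except RenameError as err:
    cause = err.__cause__
```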
@@ -967,12 +969,14 @@ def auto_download_all_approved_season_requests() -> None:
     tv_repository = TvRepository(db=db)
     torrent_service = TorrentService(torrent_repository=TorrentRepository(db=db))
     indexer_service = IndexerService(indexer_repository=IndexerRepository(db=db))
-    notification_service = NotificationService(notification_repository=NotificationRepository(db=db))
+    notification_service = NotificationService(
+        notification_repository=NotificationRepository(db=db)
+    )
     tv_service = TvService(
         tv_repository=tv_repository,
         torrent_service=torrent_service,
         indexer_service=indexer_service,
-        notification_service=notification_service
+        notification_service=notification_service,
     )
 
     log.info("Auto downloading all approved season requests")
@@ -1004,12 +1008,14 @@ def import_all_show_torrents() -> None:
     tv_repository = TvRepository(db=db)
     torrent_service = TorrentService(torrent_repository=TorrentRepository(db=db))
     indexer_service = IndexerService(indexer_repository=IndexerRepository(db=db))
-    notification_service = NotificationService(notification_repository=NotificationRepository(db=db))
+    notification_service = NotificationService(
+        notification_repository=NotificationRepository(db=db)
+    )
     tv_service = TvService(
         tv_repository=tv_repository,
         torrent_service=torrent_service,
         indexer_service=indexer_service,
-        notification_service=notification_service
+        notification_service=notification_service,
     )
     log.info("Importing all torrents")
     torrents = torrent_service.get_all_torrents()
@@ -1024,10 +1030,8 @@ def import_all_show_torrents() -> None:
                 )
                 continue
             tv_service.import_torrent_files(torrent=t, show=show)
-        except RuntimeError as e:
-            log.error(
-                f"Error importing torrent {t.title} for show {show.name}: {e}"
-            )
+        except RuntimeError:
+            log.exception(f"Error importing torrent {t.title} for show {show.name}")
     log.info("Finished importing all torrents")
     db.commit()
@@ -1042,7 +1046,9 @@ def update_all_non_ended_shows_metadata() -> None:
         tv_repository=tv_repository,
         torrent_service=TorrentService(torrent_repository=TorrentRepository(db=db)),
         indexer_service=IndexerService(indexer_repository=IndexerRepository(db=db)),
-        notification_service=NotificationService(notification_repository=NotificationRepository(db=db))
+        notification_service=NotificationService(
+            notification_repository=NotificationRepository(db=db)
+        ),
     )
 
     log.info("Updating metadata for all non-ended shows")
@@ -1062,9 +1068,9 @@ def update_all_non_ended_shows_metadata() -> None:
                     f"Unsupported metadata provider {show.metadata_provider} for show {show.name}, skipping update."
                 )
                 continue
-        except InvalidConfigError as e:
-            log.error(
-                f"Error initializing metadata provider {show.metadata_provider} for show {show.name}: {e}"
+        except InvalidConfigError:
+            log.exception(
+                f"Error initializing metadata provider {show.metadata_provider} for show {show.name}"
             )
             continue
         updated_show = tv_service.update_show_metadata(


@@ -104,7 +104,16 @@ ASCII_ART='
 ░░░░░░
 '
-display_cool_text "$ASCII_ART"
+if [[ -v MEDIAMANAGER_NO_STARTUP_ART ]]; then
+    echo
+    echo " +================+"
+    echo " |  MediaManager  |"
+    echo " +================+"
+    echo
+else
+    display_cool_text "$ASCII_ART"
+fi
 echo "Buy me a coffee at https://buymeacoffee.com/maxdorninger"
 
 # Initialize config if it doesn't exist
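The new guard uses bash's `[[ -v VAR ]]`, which tests whether the variable is set at all, so even `MEDIAMANAGER_NO_STARTUP_ART=""` disables the art. That is different from a truthiness check. A Python sketch of the same distinction (the variable name is taken from the diff; the rest is illustrative):

```python
import os

# [[ -v MEDIAMANAGER_NO_STARTUP_ART ]] is a "set at all?" test: the
# Python analogue is membership in os.environ, not bool() on the value.
os.environ["MEDIAMANAGER_NO_STARTUP_ART"] = ""

set_at_all = "MEDIAMANAGER_NO_STARTUP_ART" in os.environ       # set, even though empty
truthy = bool(os.environ.get("MEDIAMANAGER_NO_STARTUP_ART"))   # empty string is falsy
```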
@@ -136,8 +145,30 @@ else
     echo "Config file found at: $CONFIG_FILE"
 fi
 
+# check if running as root, if yes, fix permissions
+if [ "$(id -u)" = '0' ]; then
+    echo "Running as root. Ensuring file permissions for mediamanager user..."
+    chown -R mediamanager:mediamanager "$CONFIG_DIR"
+
+    if [ -d "/data" ]; then
+        if [ "$(stat -c '%U' /data)" != "mediamanager" ]; then
+            echo "Fixing ownership of /data (this may take a while for large media libraries)..."
+            chown -R mediamanager:mediamanager /data
+        else
+            echo "/data ownership is already correct."
+        fi
+    fi
+else
+    echo "Running as non-root user ($(id -u)). Skipping permission fixes."
+    echo "Note: Ensure your host volumes are manually set to the correct permissions."
+fi
+
 echo "Running DB migrations..."
-uv run alembic upgrade head
+if [ "$(id -u)" = '0' ]; then
+    gosu mediamanager uv run alembic upgrade head
+else
+    uv run alembic upgrade head
+fi
 
 echo "Starting MediaManager backend service..."
 echo ""
@@ -150,9 +181,16 @@ echo ""
 DEVELOPMENT_MODE=${MEDIAMANAGER_MISC__DEVELOPMENT:-FALSE}
 PORT=${PORT:-8000}
 
 if [ "$DEVELOPMENT_MODE" == "TRUE" ]; then
     echo "Development mode is enabled, enabling auto-reload..."
-    uv run fastapi run /app/media_manager/main.py --port "$PORT" --proxy-headers --reload
+    DEV_OPTIONS="--reload"
 else
-    uv run fastapi run /app/media_manager/main.py --port "$PORT" --proxy-headers
+    DEV_OPTIONS=""
+fi
+
+if [ "$(id -u)" = '0' ]; then
+    exec gosu mediamanager uv run fastapi run /app/media_manager/main.py --port "$PORT" --proxy-headers $DEV_OPTIONS
+else
+    exec uv run fastapi run /app/media_manager/main.py --port "$PORT" --proxy-headers $DEV_OPTIONS
 fi
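The refactored entrypoint deduplicates the server invocation: one base command, with `--reload` appended only in development and a `gosu` prefix only when running as root. The same idea in Python, as a hypothetical argument-list builder (not code from the repository):

```python
def build_server_command(development: bool, port: int = 8000, as_root: bool = False) -> list[str]:
    # One base command; flags and the privilege-dropping prefix are
    # toggled instead of duplicating the whole invocation per branch.
    cmd = [
        "uv", "run", "fastapi", "run", "/app/media_manager/main.py",
        "--port", str(port), "--proxy-headers",
    ]
    if development:
        cmd.append("--reload")  # mirrors DEV_OPTIONS="--reload"
    if as_root:
        cmd = ["gosu", "mediamanager", *cmd]  # drop privileges before exec
    return cmd
```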


@@ -8,23 +8,25 @@ RUN apt-get update && apt-get install -y ca-certificates && \
     apt-get clean && \
     rm -rf /var/lib/apt/lists/*
 
+# Create a non-root user and group
 RUN groupadd -g 1000 mediamanager && \
     useradd -m -u 1000 -g mediamanager mediamanager
 
 WORKDIR /app
 
+# Ensure mediamanager owns the app directory
 RUN chown -R mediamanager:mediamanager /app
 
-USER mediamanager
-
+# Set uv cache to a writable home directory and use copy mode for volume compatibility
 ENV UV_CACHE_DIR=/home/mediamanager/.cache/uv \
-    UV_LINK_MODE=copy
+    UV_LINK_MODE=copy \
+    UV_COMPILE_BYTECODE=1
+
+COPY --chown=mediamanager:mediamanager pyproject.toml uv.lock ./
+
+USER mediamanager
+
+RUN --mount=type=cache,target=/home/mediamanager/.cache/uv,uid=1000,gid=1000 \
+    uv sync --frozen --no-install-project --no-dev
 
 COPY --chown=mediamanager:mediamanager . .
 
-RUN --mount=type=cache,target=/home/mediamanager/.cache/uv,uid=1000,gid=1000 \
-    uv sync --locked
+RUN uv sync --frozen --no-dev
 
 EXPOSE 8000
-CMD ["uv", "run", "fastapi", "run", "/app/main.py"]
+CMD ["uv", "run", "fastapi", "run", "/app/main.py", "--port", "8000", "--proxy-headers"]
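The rebuilt Dockerfile follows uv's recommended two-step layering: the lockfile is copied and dependencies synced with `--no-install-project` before the source tree is copied, so editing application code no longer invalidates the dependency layer or re-downloads wheels. A generic sketch of the pattern under assumed paths (base image and uv installation method are illustrative, not from this diff):

```dockerfile
FROM python:3.12-slim

# Bring in uv from its official distribution image (assumed setup).
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

WORKDIR /app

# Step 1: dependencies only. Rebuilt only when the lockfile changes;
# the cache mount persists downloaded wheels across builds.
COPY pyproject.toml uv.lock ./
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-install-project --no-dev

# Step 2: the project itself. Source edits invalidate only this layer.
COPY . .
RUN uv sync --frozen --no-dev
```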


@@ -32,7 +32,9 @@ else:
         return TV(show_id).external_ids()
 
     @router.get("/tv/shows/{show_id}/{season_number}")
-    async def get_tmdb_season(season_number: int, show_id: int, language: str = "en") -> dict:
+    async def get_tmdb_season(
+        season_number: int, show_id: int, language: str = "en"
+    ) -> dict:
         return TV_Seasons(season_number=season_number, tv_id=show_id).info(
             language=language
         )
@@ -42,7 +44,9 @@ else:
         return Trending(media_type="movie").info(language=language)
 
     @router.get("/movies/search")
-    async def search_tmdb_movies(query: str, page: int = 1, language: str = "en") -> dict:
+    async def search_tmdb_movies(
+        query: str, page: int = 1, language: str = "en"
+    ) -> dict:
         return Search().movie(page=page, query=query, language=language)
 
     @router.get("/movies/{movie_id}")

mkdocs.yml (new file, 70 lines)

@@ -0,0 +1,70 @@
site_name: "MediaManager Documentation"
theme:
name: "material"
logo: "assets/logo.svg"
favicon: "assets/logo.svg"
features:
- navigation.sections
- navigation.expand
- navigation.indexes
- content.code.copy
- navigation.footer
palette:
- scheme: default
primary: indigo
accent: indigo
toggle:
icon: material/brightness-7
name: Switch to dark mode
- scheme: slate
primary: black
accent: black
toggle:
icon: material/brightness-4
name: Switch to light mode
markdown_extensions:
- admonition
- pymdownx.details
- pymdownx.superfences
- attr_list
- md_in_html
- pymdownx.snippets:
base_path: ["."]
nav:
- Welcome: index.md
- Installation:
- installation/README.md
- Docker Compose: installation/docker.md
- Nix Flakes [Community]: installation/flakes.md
- Usage:
- Importing existing media: importing-existing-media.md
- Configuration:
- configuration/README.md
- Backend: configuration/backend.md
- Authentication: configuration/authentication.md
- Database: configuration/database.md
- Download Clients: configuration/download-clients.md
- Indexers: configuration/indexers.md
- Scoring Rulesets: configuration/scoring-rulesets.md
- Notifications: configuration/notifications.md
- Custom Libraries: configuration/custom-libraries.md
- Logging: configuration/logging.md
- Advanced Features:
- qBittorrent Category: advanced-features/qbittorrent-category.md
- URL Prefix: advanced-features/url-prefix.md
- Metadata Provider Configuration: advanced-features/metadata-provider-configuration.md
- Custom port: advanced-features/custom-port.md
- Follow symlinks in frontend files: advanced-features/follow-symlinks-in-frontend-files.md
- Disable startup ascii art: advanced-features/disable-startup-ascii-art.md
- Troubleshooting: troubleshooting.md
- API Reference: api-reference.md
- Screenshots: screenshots.md
- Contributing to MediaManager:
- Developer Guide: contributing-to-mediamanager/developer-guide.md
- Documentation: contributing-to-mediamanager/documentation.md
extra:
version:
provider: mike

Some files were not shown because too many files have changed in this diff.