Merge branch 'release/0.21'

commit acf5cb92d1

@@ -20,7 +20,7 @@ stages:
 review_front:
   interruptible: true
   stage: review
-  image: node:11
+  image: node:12-buster
   when: manual
   allow_failure: true
   variables:

@@ -71,7 +71,7 @@ review_docs:
     - cd docs
     - apt-get update
     - apt-get install -y graphviz
-    - pip install sphinx sphinx_rtd_theme
+    - pip install sphinx sphinx_rtd_theme django-environ django
   script:
     - ./build_docs.sh
   cache:

@@ -120,7 +120,7 @@ test_api:
   interruptible: true
   services:
     - postgres:11
-    - redis:3
+    - redis:5
   stage: test
   image: funkwhale/funkwhale:develop
   cache:

@@ -131,12 +131,12 @@ test_api:
     DATABASE_URL: "postgresql://postgres@postgres/postgres"
     FUNKWHALE_URL: "https://funkwhale.ci"
     DJANGO_SETTINGS_MODULE: config.settings.local
+    POSTGRES_HOST_AUTH_METHOD: trust
   only:
     - branches
   before_script:
-    - apk add make
+    - apk add make git gcc python3-dev musl-dev
     - cd api
+    - sed -i '/Pillow/d' requirements/base.txt
     - pip3 install -r requirements/base.txt
     - pip3 install -r requirements/local.txt
     - pip3 install -r requirements/test.txt

@@ -148,7 +148,7 @@ test_api:
 test_front:
   interruptible: true
   stage: test
-  image: node:11
+  image: node:12-buster
   before_script:
     - cd front
   only:

@@ -170,7 +170,7 @@ test_front:
 build_front:
   stage: build
-  image: node:11
+  image: node:12-buster
   before_script:
     - curl -L -o /usr/local/bin/jq https://github.com/stedolan/jq/releases/download/jq-1.5/jq-linux64
     - chmod +x /usr/local/bin/jq
CHANGELOG (+314)
@@ -10,6 +10,320 @@ This changelog is viewable on the web at https://docs.funkwhale.audio/changelog.

.. towncrier

0.21 "Agate" (2020-04-24)
-------------------------

This 0.21 release is dedicated to Agate, to thank her for creating the Funkwhale project, for her work as current lead developer, and for her courage in coming out. Thank you Agate from all the members of the Funkwhale community <3

We are truly grateful as well to the dozens of people who contributed to this release with translations, development, documentation, reviews, design, testing, feedback, financial support, third-party projects and integrations… You made it possible!

Upgrade instructions are available at
https://docs.funkwhale.audio/admin/upgrading.html. There are also additional operations you need to execute, listed in the changelog below (search for "Manual action").
Channels and podcasts
^^^^^^^^^^^^^^^^^^^^^

Funkwhale 0.21 includes a brand new feature: Channels!

Channels can be used as a replacement for public libraries to publish audio content, both musical and non-musical. They federate with other Funkwhale pods, but also with other fediverse software, in particular Mastodon, Pleroma, Friendica and Reel2Bits, meaning people can subscribe to your channel from any of these applications. To get started with publication, simply visit your profile and create a channel from there.

Each Funkwhale channel also comes with an RSS feed that is compatible with existing podcasting applications, like AntennaPod on Android. Within Funkwhale, you can also subscribe to any podcast from its RSS feed!

Many, many thanks to the numerous people who helped with the feature design, development and testing, and in particular to the members of the working group who met every week for months in order to get this done, and the members of other third-party projects who took the time to work with us to ensure compatibility.
Redesigned navigation, player and queue
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

This release includes a full redesign of our navigation, player and queue. Overall, it should provide a better, less confusing experience, especially on mobile devices. This redesign was suggested 14 months ago and took a while, but thanks to the involvement and feedback of many people, we got it done!
Improved search bar for searching remote objects
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The search bar now supports fetching arbitrary objects using a URL. In particular, you can use this to quickly:

- Subscribe to a remote library via its URL
- Listen to a public track from another pod
- Subscribe to a channel
Screening for sign-ups and custom sign-up form
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Instance admins can now configure their pod so that registrations require manual approval from a moderator. This is especially useful on private or semi-private pods, where you don't want to close registrations completely but don't want spam or unwanted users to join your pod either.

When this is enabled and a new user registers, their request is put in a moderation queue and moderators are notified by email. When the request is approved or refused, the user is also notified by email.

In addition, it's also possible to customize the sign-up form by:

- Providing a custom help text, in Markdown format
- Including additional fields in the form, for instance to ask the user why they want to join. Data collected through these fields is included in the sign-up request and viewable by the mods
Federated reports
^^^^^^^^^^^^^^^^^

It's now possible to send a copy of a report to the server hosting the reported object, in order to make moderation easier and more distributed.

This feature is inspired by Mastodon's current design, and should work with at least Funkwhale and Mastodon servers.
Improved search performance
^^^^^^^^^^^^^^^^^^^^^^^^^^^

Our search engine went through a full rewrite to make it faster. The new engine is enabled by default when using the search bar, or when searching for artists, albums and tracks. It leverages PostgreSQL's full-text search capabilities.

During our tests, we observed huge performance improvements after the switch, by an order of magnitude. This should be especially perceptible on pods with large databases, more modest hardware, or hard drives.

We plan to remove the old engine in an upcoming release. In the meantime, if anything goes wrong, you can switch back by setting ``USE_FULL_TEXT_SEARCH=false`` in your ``.env`` file.
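To see where the speedup comes from, consider what a full-text engine precomputes. The sketch below is a toy, pure-Python analogue of a tsvector-style inverted index; it is illustrative only, not Funkwhale's actual code, and the track titles are made up:

```python
# Toy illustration (not Funkwhale's implementation): a precomputed
# inverted index maps each word to the rows containing it, so a query
# only touches matching rows instead of scanning every title.
from collections import defaultdict

tracks = {
    1: "Agate Blues",
    2: "Deep Blue Sea",
    3: "Blues for the Fediverse",
}

# Build the index once (PostgreSQL maintains this via a tsvector column).
index = defaultdict(set)
for pk, title in tracks.items():
    for word in title.lower().split():
        index[word].add(pk)

def search(query):
    """Return pks of tracks whose title contains every query word."""
    results = set(tracks)
    for word in query.lower().split():
        results &= index.get(word, set())
    return sorted(results)

print(search("blues"))  # [1, 3]
```

PostgreSQL builds and maintains this kind of index for you, which is why the rewrite is most noticeable on pods with large databases or slow disks.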
Enforced email verification
^^^^^^^^^^^^^^^^^^^^^^^^^^^

The brand new ``ACCOUNT_EMAIL_VERIFICATION_ENFORCE`` setting can be used to make email verification mandatory for your users. It defaults to ``false``, and doesn't apply to superuser accounts created through the CLI.

If you enable this, ensure you have an SMTP server configured too.
More reliable CLI importer [manual action required]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Our CLI importer is now more reliable and less prone to out-of-memory issues, especially when scanning large libraries (hundreds of GB or bigger).

We've also improved the directory crawling logic, so that you don't have to use glob patterns or specify extensions when importing. As a result, the syntax for providing directories to the command has changed slightly.

If you use the ``import_files`` command, this means you should replace scripts that look like this::

    python api/manage.py import_files $LIBRARY_ID "/srv/funkwhale/data/music/**/*.ogg" "/srv/funkwhale/data/music/**/*.mp3" --recursive --noinput

With this::

    python api/manage.py import_files $LIBRARY_ID "/srv/funkwhale/data/music/" --recursive --noinput

And Funkwhale will happily import any supported audio file from the specified directory.
User management through the server CLI
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

We now support user creation (incl. non-admin accounts), update and removal directly from the server CLI. Typical use cases include:

- Changing a user's password from the command line
- Creating or updating users from deployment scripts or playbooks
- Removing or granting permissions or upload quota for multiple users at once
- Marking multiple users as inactive

All user-related commands are available under the ``python manage.py fw users`` namespace. Please refer to the `Admin documentation <https://docs.funkwhale.audio/admin/commands.html#user-management>`_ for more information and instructions.
Progressive web app [Manual action suggested, non-docker only]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

We've made Funkwhale's web UI a Progressive Web Application (PWA), in order to improve the user experience during offline use and on mobile devices.

In order to fully benefit from this change, if your pod isn't deployed using Docker, ensure the following instruction is present in your nginx configuration::

    location /front/ {
        # Add the following line in the /front/ location
        add_header Service-Worker-Allowed "/";
    }
Postgres docker changed environment variable [manual action required, docker multi-container only]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

If you're running with docker and our multi-container setup, there was a breaking change starting with the 11.7 postgres image (https://github.com/docker-library/postgres/pull/658).

You need to add this to your ``.env`` file: ``POSTGRES_HOST_AUTH_METHOD=trust``

Newer deployments aren't affected.
Upgrade from Postgres 10 to 11 [manual action required, docker all-in-one only]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

With our upgrade to Alpine 3.10, the ``funkwhale/all-in-one`` image now includes PostgreSQL 11.

In order to update to Funkwhale 0.21, you will first need to upgrade Funkwhale's PostgreSQL database, following the steps below::

    # open a shell as the Funkwhale user
    sudo -u funkwhale -H bash

    # move to the funkwhale data directory
    # (replace this with your own if you used a different path)
    cd /srv/funkwhale/data

    # stop the funkwhale container
    docker stop funkwhale

    # backup the database files
    cp -r data/ ../postgres.bak

    # upgrade the database
    docker run --rm \
      -v $(pwd)/data:/var/lib/postgresql/10/data \
      -v $(pwd)/upgraded-postgresql:/var/lib/postgresql/11/data \
      -e PGUSER=funkwhale \
      -e POSTGRES_INITDB_ARGS="-U funkwhale --locale C --encoding UTF8" \
      tianon/postgres-upgrade:10-to-11

    # replace the Postgres 10 files with Postgres 11 files
    mv data/ postgres-10
    mv upgraded-postgresql/ data

Once you have completed the Funkwhale upgrade with our regular instructions and everything works properly, you can remove the backups/old files::

    sudo -u funkwhale -H bash
    cd /srv/funkwhale/data
    rm -rf ../postgres.bak
    rm -rf postgres-10
Full list of changes
^^^^^^^^^^^^^^^^^^^^

Features:

- Support for publishing and subscribing to podcasts (#170)
- Brand new navigation, queue and player redesign (#594)
- Can now browse a library's content through the UI (#926)
- Federated reports (#1038)
- Screening for sign-ups (#1040)
- Make it possible to enforce email verification (#1039)
- Added a new radio based on another user's listenings (#1060)
- User management through the server CLI

Enhancements:

- Added ability to reject library follows from the notifications screen (#859)
- Added periodic background task and CLI command to associate genre tags to artists and albums based on identical tags found on corresponding tracks (#988)
- Added support for the CELERYD_CONCURRENCY env var to control the number of worker processes (#997)
- Added the ability to sort albums by release date (#1013)
- Added two new radios to play your own content or a given library's tracks
- Advertise the list of known nodes on /api/v1/federation/domains and in nodeinfo if stats sharing is enabled
- Changed footer to use the instance name if available, and append an ellipsis if the instance URL/name is too long (#1012)
- Favor local uploads when playing a track with multiple uploads (#1036)
- Include only local content in nodeinfo stats, added downloads count
- Make media and static file serving more reliable when reverse proxy X_FORWARDED_* headers are incorrect (#947)
- Order the playlist columns by modification date in the Browse tab (#775)
- Reduced size of funkwhale/funkwhale docker images thanks to multi-stage builds (!1042)
- Remember display settings in Album, Artist, Radio and Playlist views (#391)
- Removed unnecessary "Federation music needs approval" setting (#959)
- Replaced our slow search logic with PostgreSQL full-text search (#994)
- Support autoplay when loading the embed frame from Mastodon and third-party websites (#1041)
- Support filtering playlists by name, and several additional UX improvements in the playlists modal (#974)
- Support modifying album cover art through the web UI (#588)
- Use a dedicated scope for throttling Subsonic to avoid intrusive rate-limiting
- Use the same Markdown widget for all content fields (rules, description, reports, notes, etc.)
- CLI importer is now more reliable and less resource-hungry on large libraries
- Added support for a custom domain for S3 storage
- Better placeholders for channels when there are no episodes or series
- Updated documentation for the 0.21 release
- Improved performance and error handling when fetching remote attachments
Bugfixes:

- Added missing manuallyApprovesFollowers entry in JSON-LD contexts (#961)
- Fixed issue with browser shortcuts such as search and focus URL not being recognised (#340, #985)
- Fixed admin dropdown not showing after login (#1042)
- Fixed an issue with the celerybeat container failing to restart (#1004)
- Fixed invalid displayed number of tracks in playlists (#986)
- Fixed issue with recent results not being loaded from the API (#948)
- Fixed issue with sorting by album name not working (#960)
- Fixed short audio glitch when switching to another track with the player paused (#970)
- Improved deduplication logic to prevent skipped files during import (#348, #474, #557, #740, #928)
- More resilient tag parsing with empty release dates or album artists (#1037)
- More robust importer against malformed dates (#966)
- Removed "nodeinfo disabled" setting, as nodeinfo is required for the UI to work (#982)
- Replaced PDF icon with List icon in playlist placeholder (#943)
- Resolved an issue where disc numbers were not taken into consideration when playing an album from the album card (#1006)
- Set correct size for album covers in playlist cards (#680)
- Removed double spaces in ChannelForm
- Deduplicated tags in the Audio ActivityPub representation
- Fixed z-index issues with dropdowns (#1079, #1075)
- Excluded external podcasts from the library home
- Fixed broken channel save when the description is too long
- Fixed 500 error when federation is disabled and application+json is requested
- Fixed minor Subsonic API crash
- Fixed broken local profile page when the allow-list is enabled
- Fixed issue with confirmation email not sending when sign-up approval was enabled
- Ensure a quota of 0 on a user is honored
- Fixed attachment URLs not honoring the media URL
- Fixed grammar in msg string in TrackBase.vue
- Fixed typo in SubscribeButton.vue
Translations:

- Arabic
- Catalan
- English (United Kingdom)
- German
- Hungarian
- Japanese
- Occitan
- Portuguese (Brazil)
- Russian

Contributors to this release (translation, development, documentation, reviews, design, testing, third-party projects):

- Agate
- annando
- Anton Strömkvist
- Audrey
- ButterflyOfFire
- Ciarán Ainsworth
- Creak
- Daniele Lira Mereb
- dashie
- Eloisa
- eorn
- Francesc Galí
- gerhardbeck
- GinnyMcQueen
- guillermau
- Haelwenn
- jinxx
- Jonathan Aylard
- Keunes
- M.G
- marzzzello
- Mathé Grievink
- Mélanie Chauvel
- Mjourdan
- Morgan Kesler
- Noe Gaumont
- Noureddine HADDAG
- Ollie
- Peter Wickenberg
- Quentin PAGÈS
- Renon
- Satsuki Yanagi
- Shlee
- SpcCw
- techknowlogick
- ThibG
- Tony Wasserka
- unklebonehead
- wakest
- wxcafé
- Xaloc
- Xosé M

0.20.1 (2019-10-28)
-------------------
@@ -84,6 +84,18 @@ Visit https://dev.funkwhale.audio/funkwhale/funkwhale and clone the repository u

    git clone ssh://git@dev.funkwhale.audio/funkwhale/funkwhale.git
    cd funkwhale

.. note::

    As of January 2020, the SSH fingerprints of our GitLab server are the following::

        $ ssh-keyscan dev.funkwhale.audio | ssh-keygen -lf -
        # dev.funkwhale.audio:22 SSH-2.0-OpenSSH_7.4p1 Debian-10+deb9u6
        # dev.funkwhale.audio:22 SSH-2.0-OpenSSH_7.4p1 Debian-10+deb9u6
        # dev.funkwhale.audio:22 SSH-2.0-OpenSSH_7.4p1 Debian-10+deb9u6
        2048 SHA256:WEZ546nkMhB9yV9lyDZZcEeN/IfriyhU8+mj7Cz/+sU dev.funkwhale.audio (RSA)
        256 SHA256:dEhAo+1ImjC98hSqVdnkwVleheCulV8xIsV1eKUcig0 dev.funkwhale.audio (ECDSA)
        256 SHA256:/AxZwOSP74hlNKCHzmu9Trlp9zVGTrsJOV+zet1hYyQ dev.funkwhale.audio (ED25519)

A note about branches
^^^^^^^^^^^^^^^^^^^^^
@@ -709,3 +721,48 @@ The latter is especially useful when you are debugging failing tests.

.. note::

    The front-end test suite coverage is still pretty low.


Making a release
----------------

To make a new 3.4 release::

    # setup
    export NEXT_RELEASE=3.4  # replace with the next release number
    export PREVIOUS_RELEASE=3.3  # replace with the previous release number

    # ensure you have an up-to-date repo
    git checkout develop  # use master if you're doing a hotfix release
    git pull

    # compile changelog
    towncrier --version $NEXT_RELEASE --yes

    # polish changelog
    # - update the date
    # - look for typos
    # - add list of contributors via `python3 scripts/get-contributions-stats.py develop $PREVIOUS_RELEASE`
    nano CHANGELOG

    # set the `__version__` variable to $NEXT_RELEASE
    nano api/funkwhale_api/__init__.py

    # commit
    git add .
    git commit -m "Version bump and changelog for $NEXT_RELEASE"

    # tag
    git tag $NEXT_RELEASE

    # publish
    git push --tags && git push

    # if you're doing a hotfix release from master
    git checkout develop && git merge master && git push

    # if you're doing a regular (non-hotfix) release from develop
    git checkout master && git merge develop && git push

Then, visit https://dev.funkwhale.audio/funkwhale/funkwhale/-/tags, copy-paste the changelog on the corresponding tag, and announce the good news ;)
@@ -1,38 +1,34 @@
-FROM alpine:3.8
+FROM alpine:3.10 as builder

 RUN \
     echo 'installing dependencies' && \
-    apk add \
-    bash \
+    apk add --no-cache \
     git \
     gettext \
     musl-dev \
     gcc \
     postgresql-dev \
     python3-dev \
     py3-psycopg2 \
     py3-pillow \
     libldap \
     ffmpeg \
     libpq \
     libmagic \
     libffi-dev \
     make \
     zlib-dev \
-    openldap-dev && \
-    \
+    jpeg-dev \
+    openldap-dev \
+    && \
+    \
     ln -s /usr/bin/python3 /usr/bin/python

+# create virtual env for next stage
+RUN python -m venv /venv
+# emulate activation by prefixing PATH
+ENV PATH="/venv/bin:$PATH" VIRTUAL_ENV=/venv
+
 RUN mkdir /requirements
 COPY ./requirements/base.txt /requirements/base.txt
 # hack around https://github.com/pypa/pip/issues/6158#issuecomment-456619072
 ENV PIP_DOWNLOAD_CACHE=/noop/
 RUN \
     echo 'fixing requirements file for alpine' && \
     sed -i '/Pillow/d' /requirements/base.txt && \
     \
     \
     echo 'installing pip requirements' && \
     pip3 install --upgrade pip && \
     pip3 install setuptools wheel && \

@@ -44,6 +40,26 @@ COPY ./requirements/*.txt /requirements/
 RUN \
     if [ "$install_dev_deps" = "1" ] ; then echo "Installing dev dependencies" && pip3 install --no-cache-dir -r /requirements/local.txt -r /requirements/test.txt ; else echo "Skipping dev deps installation" ; fi

+
+FROM alpine:3.10 as build-image
+
+COPY --from=builder /venv /venv
+# emulate activation by prefixing PATH
+ENV PATH="/venv/bin:$PATH"
+
+RUN apk add --no-cache \
+    libmagic \
+    bash \
+    gettext \
+    python3 \
+    jpeg-dev \
+    ffmpeg \
+    libpq \
+    && \
+    \
+    ln -s /usr/bin/python3 /usr/bin/python
+
+
 ENTRYPOINT ["./compose/django/entrypoint.sh"]
 CMD ["./compose/django/server.sh"]
@@ -4,6 +4,7 @@ from rest_framework import routers
 from rest_framework.urlpatterns import format_suffix_patterns

 from funkwhale_api.activity import views as activity_views
+from funkwhale_api.audio import views as audio_views
 from funkwhale_api.common import views as common_views
 from funkwhale_api.common import routers as common_routers
 from funkwhale_api.music import views

@@ -21,6 +22,8 @@ router.register(r"uploads", views.UploadViewSet, "uploads")
 router.register(r"libraries", views.LibraryViewSet, "libraries")
 router.register(r"listen", views.ListenViewSet, "listen")
 router.register(r"artists", views.ArtistViewSet, "artists")
+router.register(r"channels", audio_views.ChannelViewSet, "channels")
+router.register(r"subscriptions", audio_views.SubscriptionsViewSet, "subscriptions")
 router.register(r"albums", views.AlbumViewSet, "albums")
 router.register(r"licenses", views.LicenseViewSet, "licenses")
 router.register(r"playlists", playlists_views.PlaylistViewSet, "playlists")

@@ -28,6 +31,7 @@ router.register(
     r"playlist-tracks", playlists_views.PlaylistTrackViewSet, "playlist-tracks"
 )
 router.register(r"mutations", common_views.MutationViewSet, "mutations")
+router.register(r"attachments", common_views.AttachmentViewSet, "attachments")
 v1_patterns = router.urls

 subsonic_router = routers.SimpleRouter(trailing_slash=False)

@@ -84,6 +88,9 @@ v1_patterns += [
     url(r"^token/?$", jwt_views.obtain_jwt_token, name="token"),
     url(r"^token/refresh/?$", jwt_views.refresh_jwt_token, name="token_refresh"),
     url(r"^rate-limit/?$", common_views.RateLimitView.as_view(), name="rate-limit"),
+    url(
+        r"^text-preview/?$", common_views.TextPreviewView.as_view(), name="text-preview"
+    ),
 ]

 urlpatterns = [
@ -1,17 +1,9 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""
|
||||
Django settings for funkwhale_api project.
|
||||
|
||||
For more information on this file, see
|
||||
https://docs.djangoproject.com/en/dev/topics/settings/
|
||||
|
||||
For the full list of settings and their values, see
|
||||
https://docs.djangoproject.com/en/dev/ref/settings/
|
||||
"""
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import datetime
|
||||
import logging.config
|
||||
import os
|
||||
import sys
|
||||
|
||||
from urllib.parse import urlsplit
|
||||
|
@ -28,6 +20,9 @@ APPS_DIR = ROOT_DIR.path("funkwhale_api")
|
|||
env = environ.Env()
|
||||
|
||||
LOGLEVEL = env("LOGLEVEL", default="info").upper()
|
||||
"""
|
||||
Default logging level for the Funkwhale processes""" # pylint: disable=W0105
|
||||
|
||||
LOGGING_CONFIG = None
|
||||
logging.config.dictConfig(
|
||||
{
|
||||
|
@ -56,7 +51,10 @@ logging.config.dictConfig(
|
|||
}
|
||||
)
|
||||
|
||||
env_file = env("ENV_FILE", default=None)
|
||||
ENV_FILE = env_file = env("ENV_FILE", default=None)
|
||||
"""
|
||||
Path to a .env file to load
|
||||
"""
|
||||
if env_file:
|
||||
logger.info("Loading specified env file at %s", env_file)
|
||||
# we have an explicitely specified env file
|
||||
|
@ -84,6 +82,9 @@ else:
|
|||
FUNKWHALE_PLUGINS_PATH = env(
|
||||
"FUNKWHALE_PLUGINS_PATH", default="/srv/funkwhale/plugins/"
|
||||
)
|
||||
"""
|
||||
Path to a directory containing Funkwhale plugins. These will be imported at runtime.
|
||||
"""
|
||||
sys.path.append(FUNKWHALE_PLUGINS_PATH)
|
||||
|
||||
FUNKWHALE_HOSTNAME = None
|
||||
|
@ -98,7 +99,14 @@ if FUNKWHALE_HOSTNAME_PREFIX and FUNKWHALE_HOSTNAME_SUFFIX:
|
|||
else:
|
||||
try:
|
||||
FUNKWHALE_HOSTNAME = env("FUNKWHALE_HOSTNAME")
|
||||
"""
|
||||
Hostname of your Funkwhale pod, e.g ``mypod.audio``
|
||||
"""
|
||||
|
||||
FUNKWHALE_PROTOCOL = env("FUNKWHALE_PROTOCOL", default="https")
|
||||
"""
|
||||
Protocol end users will use to access your pod, either ``http`` or ``https``.
|
||||
"""
|
||||
except Exception:
|
||||
FUNKWHALE_URL = env("FUNKWHALE_URL")
|
||||
_parsed = urlsplit(FUNKWHALE_URL)
|
||||
|
@ -111,14 +119,36 @@ FUNKWHALE_URL = "{}://{}".format(FUNKWHALE_PROTOCOL, FUNKWHALE_HOSTNAME)
|
|||
FUNKWHALE_SPA_HTML_ROOT = env(
|
||||
"FUNKWHALE_SPA_HTML_ROOT", default=FUNKWHALE_URL + "/front/"
|
||||
)
|
||||
"""
|
||||
URL or path to the Web Application files. Funkwhale needs access to it so that
|
||||
it can inject <meta> tags relevant to the given page (e.g page title, cover, etc.).
|
||||
|
||||
If a URL is specified, the index.html file will be fetched through HTTP. If a path is provided,
|
||||
it will be accessed from disk.
|
||||
|
||||
Use something like ``/srv/funkwhale/front/dist/`` if the web processes shows request errors related to this.
|
||||
"""
|
||||
|
||||
FUNKWHALE_SPA_HTML_CACHE_DURATION = env.int(
|
||||
"FUNKWHALE_SPA_HTML_CACHE_DURATION", default=60 * 15
|
||||
)
|
||||
FUNKWHALE_EMBED_URL = env(
|
||||
"FUNKWHALE_EMBED_URL", default=FUNKWHALE_URL + "/front/embed.html"
|
||||
)
|
||||
FUNKWHALE_SPA_REWRITE_MANIFEST = env.bool(
|
||||
"FUNKWHALE_SPA_REWRITE_MANIFEST", default=True
|
||||
)
|
||||
FUNKWHALE_SPA_REWRITE_MANIFEST_URL = env.bool(
|
||||
"FUNKWHALE_SPA_REWRITE_MANIFEST_URL", default=None
|
||||
)
|
||||
|
||||
APP_NAME = "Funkwhale"
|
||||
|
||||
# XXX: for backward compat with django 2.2, remove this when django 2.2 support is dropped
|
||||
os.environ["DJANGO_ALLOW_ASYNC_UNSAFE"] = env.bool(
|
||||
"DJANGO_ALLOW_ASYNC_UNSAFE", default="true"
|
||||
)
|
||||
|
||||
# XXX: deprecated, see #186
|
||||
FEDERATION_ENABLED = env.bool("FEDERATION_ENABLED", default=True)
|
||||
FEDERATION_HOSTNAME = env("FEDERATION_HOSTNAME", default=FUNKWHALE_HOSTNAME).lower()
|
||||
|
@ -133,7 +163,18 @@ FEDERATION_ACTOR_FETCH_DELAY = env.int("FEDERATION_ACTOR_FETCH_DELAY", default=6
|
|||
FEDERATION_SERVICE_ACTOR_USERNAME = env(
|
||||
"FEDERATION_SERVICE_ACTOR_USERNAME", default="service"
|
||||
)
|
||||
# How many pages to fetch when crawling outboxes and third-party collections
|
||||
FEDERATION_COLLECTION_MAX_PAGES = env.int("FEDERATION_COLLECTION_MAX_PAGES", default=5)
|
||||
"""
|
||||
Number of existing pages of content to fetch when discovering/refreshing an actor or channel.
|
||||
|
||||
More pages means more content will be loaded, but will require more resources.
|
||||
"""
|
||||
|
||||
ALLOWED_HOSTS = env.list("DJANGO_ALLOWED_HOSTS", default=[]) + [FUNKWHALE_HOSTNAME]
|
||||
"""
|
||||
List of allowed hostnames for which the Funkwhale server will answer.
|
||||
"""
|
||||
|
||||
# APP CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
|
@ -191,6 +232,7 @@ LOCAL_APPS = (
|
|||
"funkwhale_api.users.oauth",
|
||||
# Your stuff: custom apps go here
|
||||
"funkwhale_api.instance",
|
||||
"funkwhale_api.audio",
|
||||
"funkwhale_api.music",
|
||||
"funkwhale_api.requests",
|
||||
"funkwhale_api.favorites",
|
||||
|
@ -207,21 +249,30 @@ LOCAL_APPS = (
|
|||
|
||||
|
||||
PLUGINS = [p for p in env.list("FUNKWHALE_PLUGINS", default=[]) if p]
|
||||
"""
|
||||
List of Funkwhale plugins to load.
|
||||
"""
|
||||
if PLUGINS:
|
||||
logger.info("Running with the following plugins enabled: %s", ", ".join(PLUGINS))
|
||||
else:
|
||||
logger.info("Running with no plugins")
|
||||
|
||||
ADDITIONAL_APPS = env.list("ADDITIONAL_APPS", default=[])
|
||||
"""
|
||||
List of Django apps to load in addition to Funkwhale plugins and apps.
|
||||
"""
|
||||
INSTALLED_APPS = (
|
||||
DJANGO_APPS
|
||||
+ THIRD_PARTY_APPS
|
||||
+ LOCAL_APPS
|
||||
+ tuple(["{}.apps.Plugin".format(p) for p in PLUGINS])
|
||||
+ tuple(ADDITIONAL_APPS)
|
||||
)
|
||||
|
||||
# MIDDLEWARE CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
MIDDLEWARE = (
|
||||
ADDITIONAL_MIDDLEWARES_BEFORE = env.list("ADDITIONAL_MIDDLEWARES_BEFORE", default=[])
|
||||
MIDDLEWARE = tuple(ADDITIONAL_MIDDLEWARES_BEFORE) + (
|
||||
"django.middleware.security.SecurityMiddleware",
|
||||
"django.middleware.clickjacking.XFrameOptionsMiddleware",
|
||||
"corsheaders.middleware.CorsMiddleware",
|
||||
|
@ -238,8 +289,11 @@ MIDDLEWARE = (
|
|||
# DEBUG
|
||||
# ------------------------------------------------------------------------------
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#debug
|
||||
DEBUG = env.bool("DJANGO_DEBUG", False)
|
||||
|
||||
DJANGO_DEBUG = DEBUG = env.bool("DJANGO_DEBUG", False)
|
||||
"""
|
||||
Whether to enable debugging info and pages. Never enable this on a production server,
|
||||
as it can leak very sensitive information.
|
||||
"""
|
||||
# FIXTURE CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#std:setting-FIXTURE_DIRS
|
||||
|
@ -253,25 +307,70 @@ FIXTURE_DIRS = (str(APPS_DIR.path("fixtures")),)
|
|||
DEFAULT_FROM_EMAIL = env(
|
||||
"DEFAULT_FROM_EMAIL", default="Funkwhale <noreply@{}>".format(FUNKWHALE_HOSTNAME)
|
||||
)
|
||||
"""
|
||||
Name and email address used to send system emails.
|
||||
|
||||
Default: ``Funkwhale <noreply@yourdomain>``
|
||||
|
||||
.. note::
|
||||
|
||||
Both the forms ``Funkwhale <noreply@yourdomain>`` and
|
||||
``noreply@yourdomain`` work.
|
||||
|
||||
"""
|
||||
EMAIL_SUBJECT_PREFIX = env("EMAIL_SUBJECT_PREFIX", default="[Funkwhale] ")
|
||||
"""
|
||||
Subject prefix for system emails.
|
||||
"""
|
||||
SERVER_EMAIL = env("SERVER_EMAIL", default=DEFAULT_FROM_EMAIL)
|
||||
|
||||
|
||||
EMAIL_CONFIG = env.email_url("EMAIL_CONFIG", default="consolemail://")
|
||||
"""
|
||||
SMTP configuration for sending emails. Possible values:
|
||||
|
||||
- ``EMAIL_CONFIG=consolemail://``: output emails to console (the default)
|
||||
- ``EMAIL_CONFIG=dummymail://``: disable email sending completely
|
||||
|
||||
On a production instance, you'll usually want to use an external SMTP server:
|
||||
|
||||
- ``EMAIL_CONFIG=smtp://user@:password@youremail.host:25``
|
||||
- ``EMAIL_CONFIG=smtp+ssl://user@:password@youremail.host:465``
|
||||
- ``EMAIL_CONFIG=smtp+tls://user@:password@youremail.host:587``
|
||||
|
||||
.. note::
|
||||
|
||||
If ``user`` or ``password`` contain special characters (e.g.
|
||||
``noreply@youremail.host`` as ``user``), be sure to urlencode them, using
|
||||
for example the command:
|
||||
``python3 -c 'import urllib.parse; print(urllib.parse.quote_plus("noreply@youremail.host"))'``
|
||||
(returns ``noreply%40youremail.host``)
|
||||
|
||||
"""
|
||||
vars().update(EMAIL_CONFIG)
|
||||
|
||||
# DATABASE CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#databases
|
||||
DATABASE_URL = env.db("DATABASE_URL")
|
||||
"""
|
||||
URL to connect to the PostgreSQL database. Examples:
|
||||
|
||||
- ``postgresql://funkwhale@:5432/funkwhale``
|
||||
- ``postgresql://<user>:<password>@<host>:<port>/<database>``
|
||||
- ``postgresql://funkwhale:passw0rd@localhost:5432/funkwhale_database``
|
||||
"""
|
||||
DATABASES = {
|
||||
# Raises ImproperlyConfigured exception if DATABASE_URL not in os.environ
|
||||
"default": env.db("DATABASE_URL")
|
||||
"default": DATABASE_URL
|
||||
}
|
||||
DATABASES["default"]["ATOMIC_REQUESTS"] = True
|
||||
DATABASES["default"]["CONN_MAX_AGE"] = env("DB_CONN_MAX_AGE", default=60 * 5)
|
||||
|
||||
DB_CONN_MAX_AGE = DATABASES["default"]["CONN_MAX_AGE"] = env(
|
||||
"DB_CONN_MAX_AGE", default=60 * 5
|
||||
)
|
||||
"""
|
||||
Max time, in seconds, before database connections are closed.
|
||||
"""
|
||||
MIGRATION_MODULES = {
|
||||
# see https://github.com/jazzband/django-oauth-toolkit/issues/634
|
||||
# swappable models are badly designed in oauth2_provider
|
||||
|
@@ -280,13 +379,6 @@ MIGRATION_MODULES = {
|
|||
"sites": "funkwhale_api.contrib.sites.migrations",
|
||||
}
|
||||
|
||||
#
|
||||
# DATABASES = {
|
||||
# 'default': {
|
||||
# 'ENGINE': 'django.db.backends.sqlite3',
|
||||
# 'NAME': 'db.sqlite3',
|
||||
# }
|
||||
# }
|
||||
# GENERAL CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
# Local time zone for this installation. Choices can be found here:
|
||||
|
@@ -351,30 +443,79 @@ CRISPY_TEMPLATE_PACK = "bootstrap3"
|
|||
# ------------------------------------------------------------------------------
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#static-root
|
||||
STATIC_ROOT = env("STATIC_ROOT", default=str(ROOT_DIR("staticfiles")))
|
||||
|
||||
"""
|
||||
Path where static files should be collected.
|
||||
"""
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#static-url
|
||||
STATIC_URL = env("STATIC_URL", default="/staticfiles/")
|
||||
STATIC_URL = env("STATIC_URL", default=FUNKWHALE_URL + "/staticfiles/")
|
||||
DEFAULT_FILE_STORAGE = "funkwhale_api.common.storage.ASCIIFileSystemStorage"
|
||||
|
||||
PROXY_MEDIA = env.bool("PROXY_MEDIA", default=True)
|
||||
"""
|
||||
Whether to proxy audio files through your reverse proxy. It's recommended to keep this on,
|
||||
as a way to enforce access control; however, if you're using S3 storage with :attr:`AWS_QUERYSTRING_AUTH`,
|
||||
it's safe to disable it.
|
||||
"""
|
||||
AWS_DEFAULT_ACL = None
|
||||
AWS_QUERYSTRING_AUTH = env.bool("AWS_QUERYSTRING_AUTH", default=not PROXY_MEDIA)
|
||||
"""
|
||||
Whether to include signatures in S3 urls, as a way to enforce access-control.
|
||||
|
||||
Defaults to the inverse of :attr:`PROXY_MEDIA`.
|
||||
"""
|
||||
|
||||
AWS_S3_MAX_MEMORY_SIZE = env.int(
|
||||
"AWS_S3_MAX_MEMORY_SIZE", default=1000 * 1000 * 1000 * 20
|
||||
)
|
||||
|
||||
AWS_QUERYSTRING_EXPIRE = env.int("AWS_QUERYSTRING_EXPIRE", default=3600)
|
||||
"""
|
||||
Expiration delay, in seconds, of signatures generated when :attr:`AWS_QUERYSTRING_AUTH` is enabled.
|
||||
"""
|
||||
|
||||
AWS_ACCESS_KEY_ID = env("AWS_ACCESS_KEY_ID", default=None)
|
||||
"""
|
||||
Access-key ID for your S3 storage.
|
||||
"""
|
||||
|
||||
if AWS_ACCESS_KEY_ID:
|
||||
AWS_ACCESS_KEY_ID = AWS_ACCESS_KEY_ID
|
||||
AWS_SECRET_ACCESS_KEY = env("AWS_SECRET_ACCESS_KEY")
|
||||
"""
|
||||
Secret access key for your S3 storage.
|
||||
"""
|
||||
AWS_STORAGE_BUCKET_NAME = env("AWS_STORAGE_BUCKET_NAME")
|
||||
"""
|
||||
Bucket name of your S3 storage.
|
||||
"""
|
||||
AWS_S3_CUSTOM_DOMAIN = env("AWS_S3_CUSTOM_DOMAIN", default=None)
|
||||
"""
|
||||
Custom domain to use for your S3 storage.
|
||||
"""
|
||||
AWS_S3_ENDPOINT_URL = env("AWS_S3_ENDPOINT_URL", default=None)
|
||||
"""
|
||||
If you use an S3-compatible storage such as Minio, set the following variable to
|
||||
the full URL to the storage server. Example:
|
||||
|
||||
- ``https://minio.mydomain.com``
|
||||
- ``https://s3.wasabisys.com``
|
||||
"""
|
||||
AWS_S3_REGION_NAME = env("AWS_S3_REGION_NAME", default=None)
|
||||
"""If you are using Amazon S3 to serve media directly, you will need to specify your region
|
||||
name in order to access files. Example:
|
||||
|
||||
- ``eu-west-2``
|
||||
"""
|
||||
|
||||
AWS_S3_SIGNATURE_VERSION = "s3v4"
|
||||
AWS_LOCATION = env("AWS_LOCATION", default="")
|
||||
"""
|
||||
An optional bucket subdirectory where you want to store the files. This is especially useful
|
||||
if you plan to share the bucket with other services.
|
||||
"""
|
||||
DEFAULT_FILE_STORAGE = "funkwhale_api.common.storage.ASCIIS3Boto3Storage"
|
||||
|
||||
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/contrib/staticfiles/#std:setting-STATICFILES_DIRS
|
||||
STATICFILES_DIRS = (str(APPS_DIR.path("static")),)
|
||||
|
||||
|
@@ -388,10 +529,26 @@ STATICFILES_FINDERS = (
|
|||
# ------------------------------------------------------------------------------
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#media-root
|
||||
MEDIA_ROOT = env("MEDIA_ROOT", default=str(APPS_DIR("media")))
|
||||
|
||||
"""
|
||||
Where media files (such as album covers or audio tracks) should be stored
|
||||
on your system. (Ensure this directory actually exists.)
|
||||
"""
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#media-url
|
||||
MEDIA_URL = env("MEDIA_URL", default="/media/")
|
||||
MEDIA_URL = env("MEDIA_URL", default=FUNKWHALE_URL + "/media/")
|
||||
"""
|
||||
URL where media files are served. The default value should work fine on most
|
||||
configurations, but you can tweak this if you are hosting media files on a separate
|
||||
domain, or if you host Funkwhale on a non-standard port.
|
||||
"""
|
||||
FILE_UPLOAD_PERMISSIONS = 0o644
|
||||
|
||||
ATTACHMENTS_UNATTACHED_PRUNE_DELAY = env.int(
|
||||
"ATTACHMENTS_UNATTACHED_PRUNE_DELAY", default=3600 * 24
|
||||
)
|
||||
"""
|
||||
Delay in seconds before uploaded but unattached attachments are pruned from the system.
|
||||
"""
|
||||
|
||||
# URL Configuration
|
||||
# ------------------------------------------------------------------------------
|
||||
ROOT_URLCONF = "config.urls"
|
||||
|
@@ -407,13 +564,26 @@ SECURE_CONTENT_TYPE_NOSNIFF = True
|
|||
# ------------------------------------------------------------------------------
|
||||
AUTHENTICATION_BACKENDS = (
|
||||
"funkwhale_api.users.auth_backends.ModelBackend",
|
||||
"allauth.account.auth_backends.AuthenticationBackend",
|
||||
"funkwhale_api.users.auth_backends.AllAuthBackend",
|
||||
)
|
||||
SESSION_COOKIE_HTTPONLY = False
|
||||
# Some really nice defaults
|
||||
ACCOUNT_AUTHENTICATION_METHOD = "username_email"
|
||||
ACCOUNT_EMAIL_REQUIRED = True
|
||||
ACCOUNT_EMAIL_VERIFICATION = "mandatory"
|
||||
ACCOUNT_EMAIL_VERIFICATION_ENFORCE = env.bool(
|
||||
"ACCOUNT_EMAIL_VERIFICATION_ENFORCE", default=False
|
||||
)
|
||||
"""
|
||||
Determines whether users need to verify their email address before using the service. Enabling this can be useful
to reduce spam or bot accounts; however, you'll need to configure a mail server so that your users can receive the
verification emails, using :attr:`EMAIL_CONFIG`.
|
||||
|
||||
Note that regardless of the setting value, superusers created through the command line will never require verification.
|
||||
|
||||
"""
|
||||
ACCOUNT_EMAIL_VERIFICATION = (
|
||||
"mandatory" if ACCOUNT_EMAIL_VERIFICATION_ENFORCE else "optional"
|
||||
)
|
||||
ACCOUNT_USERNAME_VALIDATORS = "funkwhale_api.users.serializers.username_validators"
|
||||
|
||||
# Custom user app defaults
|
||||
|
@@ -442,6 +612,10 @@ OAUTH2_PROVIDER_REFRESH_TOKEN_MODEL = "users.RefreshToken"
|
|||
# LDAP AUTHENTICATION CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
AUTH_LDAP_ENABLED = env.bool("LDAP_ENABLED", default=False)
|
||||
"""
|
||||
Whether to enable LDAP authentication. See :doc:`/installation/ldap` for more information.
|
||||
"""
|
||||
|
||||
if AUTH_LDAP_ENABLED:
|
||||
|
||||
# Import the LDAP modules here; this way, we don't need the dependency unless someone
|
||||
|
@@ -460,6 +634,9 @@ if AUTH_LDAP_ENABLED:
|
|||
"%(user)s"
|
||||
)
|
||||
AUTH_LDAP_START_TLS = env.bool("LDAP_START_TLS", default=False)
|
||||
AUTH_LDAP_BIND_AS_AUTHENTICATING_USER = env(
|
||||
"AUTH_LDAP_BIND_AS_AUTHENTICATING_USER", default=False
|
||||
)
|
||||
|
||||
DEFAULT_USER_ATTR_MAP = [
|
||||
"first_name:givenName",
|
||||
|
@@ -508,8 +685,22 @@ if AUTH_LDAP_ENABLED:
|
|||
AUTOSLUG_SLUGIFY_FUNCTION = "slugify.slugify"
|
||||
|
||||
CACHE_DEFAULT = "redis://127.0.0.1:6379/0"
|
||||
CACHE_URL = env.cache_url("CACHE_URL", default=CACHE_DEFAULT)
|
||||
"""
|
||||
URL to your redis server. Examples:
|
||||
|
||||
- `redis://<host>:<port>/<database>`
|
||||
- `redis://127.0.0.1:6379/0`
|
||||
- `redis://:password@localhost:6379/0` for password auth (the extra colon is important)
|
||||
- `redis:///run/redis/redis.sock?db=0` over unix sockets
|
||||
|
||||
.. note::
|
||||
|
||||
If you want to use Redis over unix sockets, you'll also need to update :attr:`CELERY_BROKER_URL`
|
||||
|
||||
"""
|
||||
CACHES = {
|
||||
"default": env.cache_url("CACHE_URL", default=CACHE_DEFAULT),
|
||||
"default": CACHE_URL,
|
||||
"local": {
|
||||
"BACKEND": "django.core.cache.backends.locmem.LocMemCache",
|
||||
"LOCATION": "local-cache",
|
||||
|
@@ -525,7 +716,7 @@ CHANNEL_LAYERS = {
|
|||
}
|
||||
|
||||
CACHES["default"]["OPTIONS"] = {
|
||||
"CLIENT_CLASS": "django_redis.client.DefaultClient",
|
||||
"CLIENT_CLASS": "funkwhale_api.common.cache.RedisClient",
|
||||
"IGNORE_EXCEPTIONS": True, # mimics memcache behavior.
|
||||
# http://niwinz.github.io/django-redis/latest/#_memcached_exceptions_behavior
|
||||
}
|
||||
|
@@ -548,6 +739,15 @@ INSTALLED_APPS += ("funkwhale_api.taskapp.celery.CeleryConfig",)
|
|||
CELERY_BROKER_URL = env(
|
||||
"CELERY_BROKER_URL", default=env("CACHE_URL", default=CACHE_DEFAULT)
|
||||
)
|
||||
"""
|
||||
URL to celery's task broker. Defaults to :attr:`CACHE_URL`, so you shouldn't have to tweak this, unless you want
|
||||
to use a different one, or use Redis sockets to connect.
|
||||
|
||||
Example:
|
||||
|
||||
- `redis://127.0.0.1:6379/0`
|
||||
- `redis+socket:///run/redis/redis.sock?virtual_host=0`
|
||||
"""
|
||||
# END CELERY
|
||||
# Location of root django.contrib.admin URL, use {% url 'admin:index' %}
|
||||
|
||||
|
@@ -555,6 +755,16 @@ CELERY_BROKER_URL = env(
|
|||
CELERY_TASK_DEFAULT_RATE_LIMIT = 1
|
||||
CELERY_TASK_TIME_LIMIT = 300
|
||||
CELERY_BEAT_SCHEDULE = {
|
||||
"audio.fetch_rss_feeds": {
|
||||
"task": "audio.fetch_rss_feeds",
|
||||
"schedule": crontab(minute="0", hour="*"),
|
||||
"options": {"expires": 60 * 60},
|
||||
},
|
||||
"common.prune_unattached_attachments": {
|
||||
"task": "common.prune_unattached_attachments",
|
||||
"schedule": crontab(minute="0", hour="*"),
|
||||
"options": {"expires": 60 * 60},
|
||||
},
|
||||
"federation.clean_music_cache": {
|
||||
"task": "federation.clean_music_cache",
|
||||
"schedule": crontab(minute="0", hour="*/2"),
|
||||
|
@@ -572,11 +782,30 @@ CELERY_BEAT_SCHEDULE = {
|
|||
},
|
||||
"federation.refresh_nodeinfo_known_nodes": {
|
||||
"task": "federation.refresh_nodeinfo_known_nodes",
|
||||
"schedule": crontab(minute="0", hour="*"),
|
||||
"schedule": crontab(
|
||||
**env.dict(
|
||||
"SCHEDULE_FEDERATION_REFRESH_NODEINFO_KNOWN_NODES",
|
||||
default={"minute": "0", "hour": "*"},
|
||||
)
|
||||
),
|
||||
"options": {"expires": 60 * 60},
|
||||
},
|
||||
}
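The ``SCHEDULE_FEDERATION_REFRESH_NODEINFO_KNOWN_NODES`` override above relies on django-environ's ``env.dict()``, which turns a ``key=value,key=value`` string into a dict that is then splatted into ``crontab()``. A minimal sketch of that parsing (``parse_env_dict`` is a hypothetical stand-in, not the django-environ implementation):

```python
def parse_env_dict(raw):
    # Mimics env.dict(): "minute=0,hour=*/2" -> {"minute": "0", "hour": "*/2"}
    return dict(pair.split("=", 1) for pair in raw.split(","))

# e.g. SCHEDULE_FEDERATION_REFRESH_NODEINFO_KNOWN_NODES=minute=0,hour=*/2
schedule_kwargs = parse_env_dict("minute=0,hour=*/2")
print(schedule_kwargs)  # {'minute': '0', 'hour': '*/2'}
```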
|
||||
|
||||
if env.bool("ADD_ALBUM_TAGS_FROM_TRACKS", default=True):
|
||||
CELERY_BEAT_SCHEDULE["music.albums_set_tags_from_tracks"] = {
|
||||
"task": "music.albums_set_tags_from_tracks",
|
||||
"schedule": crontab(minute="0", hour="4", day_of_week="4"),
|
||||
"options": {"expires": 60 * 60 * 2},
|
||||
}
|
||||
|
||||
if env.bool("ADD_ARTIST_TAGS_FROM_TRACKS", default=True):
|
||||
CELERY_BEAT_SCHEDULE["music.artists_set_tags_from_tracks"] = {
|
||||
"task": "music.artists_set_tags_from_tracks",
|
||||
"schedule": crontab(minute="0", hour="4", day_of_week="4"),
|
||||
"options": {"expires": 60 * 60 * 2},
|
||||
}
|
||||
|
||||
NODEINFO_REFRESH_DELAY = env.int("NODEINFO_REFRESH_DELAY", default=3600 * 24)
|
||||
|
||||
|
||||
|
@@ -605,6 +834,12 @@ AUTH_PASSWORD_VALIDATORS = [
|
|||
{"NAME": "django.contrib.auth.password_validation.CommonPasswordValidator"},
|
||||
{"NAME": "django.contrib.auth.password_validation.NumericPasswordValidator"},
|
||||
]
|
||||
DISABLE_PASSWORD_VALIDATORS = env.bool("DISABLE_PASSWORD_VALIDATORS", default=False)
|
||||
"""
|
||||
Whether to disable password validators (length, common words, similarity with username…) used during registration.
|
||||
"""
|
||||
if DISABLE_PASSWORD_VALIDATORS:
|
||||
AUTH_PASSWORD_VALIDATORS = []
|
||||
ACCOUNT_ADAPTER = "funkwhale_api.users.adapters.FunkwhaleAccountAdapter"
|
||||
CORS_ORIGIN_ALLOW_ALL = True
|
||||
# CORS_ORIGIN_WHITELIST = (
|
||||
|
@@ -623,7 +858,7 @@ REST_FRAMEWORK = {
|
|||
"funkwhale_api.federation.parsers.ActivityParser",
|
||||
),
|
||||
"DEFAULT_AUTHENTICATION_CLASSES": (
|
||||
"oauth2_provider.contrib.rest_framework.OAuth2Authentication",
|
||||
"funkwhale_api.common.authentication.OAuth2Authentication",
|
||||
"funkwhale_api.common.authentication.JSONWebTokenAuthenticationQS",
|
||||
"funkwhale_api.common.authentication.BearerTokenHeaderAuth",
|
||||
"funkwhale_api.common.authentication.JSONWebTokenAuthentication",
|
||||
|
@@ -641,6 +876,11 @@ REST_FRAMEWORK = {
|
|||
"NUM_PROXIES": env.int("NUM_PROXIES", default=1),
|
||||
}
|
||||
THROTTLING_ENABLED = env.bool("THROTTLING_ENABLED", default=True)
|
||||
"""
|
||||
Whether to enable throttling (also known as rate-limiting). Leaving this enabled is recommended,
|
||||
especially on public pods, to improve the quality of service.
|
||||
"""
|
||||
|
||||
if THROTTLING_ENABLED:
|
||||
REST_FRAMEWORK["DEFAULT_THROTTLE_CLASSES"] = env.list(
|
||||
"THROTTLE_CLASSES",
|
||||
|
@@ -723,6 +963,10 @@ THROTTLING_RATES = {
|
|||
"rate": THROTTLING_USER_RATES.get("anonymous-update", "1000/day"),
|
||||
"description": "Anonymous PATCH and PUT requests on resource detail",
|
||||
},
|
||||
"subsonic": {
|
||||
"rate": THROTTLING_USER_RATES.get("subsonic", "2000/hour"),
|
||||
"description": "All subsonic API requests",
|
||||
},
|
||||
# potentially spammy / dangerous endpoints
|
||||
"authenticated-reports": {
|
||||
"rate": THROTTLING_USER_RATES.get("authenticated-reports", "100/day"),
|
||||
|
@@ -780,8 +1024,21 @@ THROTTLING_RATES = {
|
|||
"rate": THROTTLING_USER_RATES.get("password-reset-confirm", "20/h"),
|
||||
"description": "Password reset confirmation",
|
||||
},
|
||||
"fetch": {
|
||||
"rate": THROTTLING_USER_RATES.get("fetch", "200/d"),
|
||||
"description": "Fetch remote objects",
|
||||
},
|
||||
}
|
||||
THROTTLING_RATES = THROTTLING_RATES
|
||||
"""
|
||||
Throttling rates for specific endpoints and features of the app. You can tweak this if you are
|
||||
running into overly severe rate limiting or, on the contrary, if you want to reduce
|
||||
the consumption on some endpoints.
|
||||
|
||||
Example:
|
||||
|
||||
- ``signup=5/d,password-reset=2/d,anonymous-reports=5/d``
|
||||
"""
|
||||
|
||||
BROWSABLE_API_ENABLED = env.bool("BROWSABLE_API_ENABLED", default=False)
|
||||
if BROWSABLE_API_ENABLED:
|
||||
|
@@ -802,24 +1059,48 @@ USE_X_FORWARDED_PORT = True
|
|||
# Whether we should use Apache, Nginx (or other) headers when serving audio files
|
||||
# Default to Nginx
|
||||
REVERSE_PROXY_TYPE = env("REVERSE_PROXY_TYPE", default="nginx")
|
||||
"""
|
||||
Depending on the reverse proxy used in front of your Funkwhale instance,
the API will use different kinds of headers to serve audio files.
|
||||
|
||||
Allowed values: ``nginx``, ``apache2``
|
||||
"""
|
||||
assert REVERSE_PROXY_TYPE in ["apache2", "nginx"], "Unsupported REVERSE_PROXY_TYPE"
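For context on the ``nginx``/``apache2`` choice: nginx performs internal redirects via the ``X-Accel-Redirect`` header, while Apache's mod_xsendfile uses ``X-Sendfile``. A sketch of how such a setting could select the header (hypothetical helper, not Funkwhale's actual serving code):

```python
def protected_file_header(reverse_proxy_type, internal_path):
    # nginx consumes X-Accel-Redirect; Apache's mod_xsendfile consumes X-Sendfile
    if reverse_proxy_type == "apache2":
        return {"X-Sendfile": internal_path}
    return {"X-Accel-Redirect": internal_path}

print(protected_file_header("nginx", "/_protected/music/track.ogg"))
# {'X-Accel-Redirect': '/_protected/music/track.ogg'}
```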
|
||||
|
||||
# Which path will be used to process the internal redirection
|
||||
# **DO NOT** put a slash at the end
|
||||
PROTECT_FILES_PATH = env("PROTECT_FILES_PATH", default="/_protected")
|
||||
"""
|
||||
Which path will be used to process the internal redirection to the reverse proxy.
|
||||
**DO NOT** put a slash at the end.
|
||||
|
||||
You shouldn't have to tweak this.
|
||||
"""
|
||||
|
||||
# use this setting to tweak for how long you want to cache
|
||||
# musicbrainz results. (value is in seconds)
|
||||
MUSICBRAINZ_CACHE_DURATION = env.int("MUSICBRAINZ_CACHE_DURATION", default=300)
|
||||
|
||||
# Use this setting to change the musicbrainz hostname, for instance to
|
||||
# use a mirror. The hostname can also contain a port number (so, e.g.,
|
||||
# "localhost:5000" is a valid name to set).
|
||||
"""
|
||||
How long to cache MusicBrainz results, in seconds
|
||||
"""
|
||||
MUSICBRAINZ_HOSTNAME = env("MUSICBRAINZ_HOSTNAME", default="musicbrainz.org")
|
||||
"""
|
||||
Use this setting to change the musicbrainz hostname, for instance to
|
||||
use a mirror. The hostname can also contain a port number.
|
||||
|
||||
Example:
|
||||
|
||||
- ``mymusicbrainz.mirror``
|
||||
- ``localhost:5000``
|
||||
|
||||
"""
|
||||
# Custom Admin URL, use {% url 'admin:index' %}
|
||||
ADMIN_URL = env("DJANGO_ADMIN_URL", default="^api/admin/")
|
||||
"""
|
||||
Path to the Django admin area.
|
||||
|
||||
Examples:
|
||||
|
||||
- `^api/admin/`
|
||||
- `^api/mycustompath/`
|
||||
|
||||
"""
|
||||
CSRF_USE_SESSIONS = True
|
||||
SESSION_ENGINE = "django.contrib.sessions.backends.cache"
|
||||
|
||||
|
@@ -827,6 +1108,7 @@ SESSION_ENGINE = "django.contrib.sessions.backends.cache"
|
|||
# XXX: deprecated, see #186
|
||||
PLAYLISTS_MAX_TRACKS = env.int("PLAYLISTS_MAX_TRACKS", default=250)
|
||||
|
||||
|
||||
ACCOUNT_USERNAME_BLACKLIST = [
|
||||
"funkwhale",
|
||||
"library",
|
||||
|
@@ -851,23 +1133,71 @@ ACCOUNT_USERNAME_BLACKLIST = [
|
|||
"shared_inbox",
|
||||
"actor",
|
||||
] + env.list("ACCOUNT_USERNAME_BLACKLIST", default=[])
|
||||
|
||||
"""
|
||||
List of usernames that will be unavailable during registration.
|
||||
"""
|
||||
EXTERNAL_REQUESTS_VERIFY_SSL = env.bool("EXTERNAL_REQUESTS_VERIFY_SSL", default=True)
|
||||
"""
|
||||
Whether to enforce HTTPS certificate verification when doing outgoing HTTP requests (typically with federation).
|
||||
Disabling this is not recommended.
|
||||
"""
|
||||
EXTERNAL_REQUESTS_TIMEOUT = env.int("EXTERNAL_REQUESTS_TIMEOUT", default=10)
|
||||
"""
|
||||
Default timeout for external requests.
|
||||
"""
|
||||
# XXX: deprecated, see #186
|
||||
API_AUTHENTICATION_REQUIRED = env.bool("API_AUTHENTICATION_REQUIRED", True)
|
||||
|
||||
MUSIC_DIRECTORY_PATH = env("MUSIC_DIRECTORY_PATH", default=None)
|
||||
# on Docker setup, the music directory may not match the host path,
|
||||
# and we need to know it for it to serve stuff properly
|
||||
"""
|
||||
The path on your server where Funkwhale can import files using :ref:`in-place import
|
||||
<in-place-import>`. It must be readable by the webserver and Funkwhale
|
||||
API and worker processes.
|
||||
|
||||
On docker installations, we recommend you use the default of ``/music``
|
||||
for this value. For non-docker installation, you can use any absolute path.
|
||||
``/srv/funkwhale/data/music`` is a safe choice if you don't know what to use.
|
||||
|
||||
.. note:: This path should not include any trailing slash
|
||||
|
||||
.. warning::
|
||||
|
||||
You need to adapt your :ref:`reverse-proxy configuration<reverse-proxy-setup>` to
|
||||
serve the directory pointed by ``MUSIC_DIRECTORY_PATH`` on
|
||||
``/_protected/music`` URL.
|
||||
|
||||
"""
|
||||
MUSIC_DIRECTORY_SERVE_PATH = env(
|
||||
"MUSIC_DIRECTORY_SERVE_PATH", default=MUSIC_DIRECTORY_PATH
|
||||
)
|
||||
"""
|
||||
Default: :attr:`MUSIC_DIRECTORY_PATH`
|
||||
|
||||
When using Docker, the value of :attr:`MUSIC_DIRECTORY_PATH` in your containers
|
||||
may differ from the real path on your host. Assuming you have the following directive
|
||||
in your :file:`docker-compose.yml` file::
|
||||
|
||||
volumes:
|
||||
- /srv/funkwhale/data/music:/music:ro
|
||||
|
||||
Then, the value of :attr:`MUSIC_DIRECTORY_SERVE_PATH` should be
|
||||
``/srv/funkwhale/data/music``. This must be readable by the webserver.
|
||||
|
||||
On non-docker setups, you don't need to configure this setting.
|
||||
|
||||
.. note:: This path should not include any trailing slash
|
||||
|
||||
"""
|
||||
# When this is set to default=True, we need to reenable migration music/0042
|
||||
# to ensure data is populated correctly on existing pods
|
||||
MUSIC_USE_DENORMALIZATION = env.bool("MUSIC_USE_DENORMALIZATION", default=False)
|
||||
|
||||
USERS_INVITATION_EXPIRATION_DAYS = env.int(
|
||||
"USERS_INVITATION_EXPIRATION_DAYS", default=14
|
||||
)
|
||||
"""
|
||||
Expiration delay in days, for user invitations.
|
||||
"""
|
||||
|
||||
VERSATILEIMAGEFIELD_RENDITION_KEY_SETS = {
|
||||
"square": [
|
||||
|
@@ -875,7 +1205,11 @@ VERSATILEIMAGEFIELD_RENDITION_KEY_SETS = {
|
|||
("square_crop", "crop__400x400"),
|
||||
("medium_square_crop", "crop__200x200"),
|
||||
("small_square_crop", "crop__50x50"),
|
||||
]
|
||||
],
|
||||
"attachment_square": [
|
||||
("original", "url"),
|
||||
("medium_square_crop", "crop__200x200"),
|
||||
],
|
||||
}
|
||||
VERSATILEIMAGEFIELD_SETTINGS = {"create_images_on_demand": False}
|
||||
RSA_KEY_SIZE = 2048
|
||||
|
@@ -887,17 +1221,84 @@ ACTOR_KEY_ROTATION_DELAY = env.int("ACTOR_KEY_ROTATION_DELAY", default=3600 * 48
|
|||
SUBSONIC_DEFAULT_TRANSCODING_FORMAT = (
|
||||
env("SUBSONIC_DEFAULT_TRANSCODING_FORMAT", default="mp3") or None
|
||||
)
|
||||
|
||||
"""
|
||||
Default format for transcoding when using Subsonic API.
|
||||
"""
|
||||
# extra tags will be ignored
|
||||
TAGS_MAX_BY_OBJ = env.int("TAGS_MAX_BY_OBJ", default=30)
|
||||
"""
|
||||
Maximum number of tags that can be associated with an object. Extra tags will be ignored.
|
||||
"""
|
||||
FEDERATION_OBJECT_FETCH_DELAY = env.int(
|
||||
"FEDERATION_OBJECT_FETCH_DELAY", default=60 * 24 * 3
|
||||
)
|
||||
|
||||
"""
|
||||
Number of minutes before a remote object will be automatically refetched when accessed in the UI.
|
||||
"""
|
||||
MODERATION_EMAIL_NOTIFICATIONS_ENABLED = env.bool(
|
||||
"MODERATION_EMAIL_NOTIFICATIONS_ENABLED", default=True
|
||||
)
|
||||
|
||||
# Delay in days after signup before we show the "support us" messages
|
||||
"""
|
||||
Whether to enable email notifications to moderators and pods admins.
|
||||
"""
|
||||
FEDERATION_AUTHENTIFY_FETCHES = True
|
||||
FEDERATION_SYNCHRONOUS_FETCH = env.bool("FEDERATION_SYNCHRONOUS_FETCH", default=True)
|
||||
FEDERATION_DUPLICATE_FETCH_DELAY = env.int(
|
||||
"FEDERATION_DUPLICATE_FETCH_DELAY", default=60 * 50
|
||||
)
|
||||
"""
|
||||
Delay, in seconds, between two manual fetch of the same remote object.
|
||||
"""
|
||||
INSTANCE_SUPPORT_MESSAGE_DELAY = env.int("INSTANCE_SUPPORT_MESSAGE_DELAY", default=15)
|
||||
"""
|
||||
Delay in days after signup before we show the "support your pod" message
|
||||
"""
|
||||
FUNKWHALE_SUPPORT_MESSAGE_DELAY = env.int("FUNKWHALE_SUPPORT_MESSAGE_DELAY", default=15)
|
||||
"""
|
||||
Delay in days after signup before we show the "support Funkwhale" message
|
||||
"""
|
||||
# XXX Stable release: remove
|
||||
USE_FULL_TEXT_SEARCH = env.bool("USE_FULL_TEXT_SEARCH", default=True)
|
||||
|
||||
MIN_DELAY_BETWEEN_DOWNLOADS_COUNT = env.int(
|
||||
"MIN_DELAY_BETWEEN_DOWNLOADS_COUNT", default=60 * 60 * 6
|
||||
)
|
||||
"""
|
||||
Minimum required period, in seconds, for two downloads of the same track by the same IP
|
||||
or user to be recorded in statistics.
|
||||
"""
|
||||
MARKDOWN_EXTENSIONS = env.list("MARKDOWN_EXTENSIONS", default=["nl2br", "extra"])
|
||||
"""
|
||||
List of markdown extensions to enable.
|
||||
|
||||
Cf `<https://python-markdown.github.io/extensions/>`_
|
||||
"""
|
||||
LINKIFIER_SUPPORTED_TLDS = ["audio"] + env.list("LINKINFIER_SUPPORTED_TLDS", default=[])
|
||||
"""
|
||||
Additional TLDs to support with our markdown linkifier.
|
||||
"""
|
||||
EXTERNAL_MEDIA_PROXY_ENABLED = env.bool("EXTERNAL_MEDIA_PROXY_ENABLED", default=True)
|
||||
"""
|
||||
Whether to proxy attachment files hosted on third-party pods and servers. Keeping
this enabled is recommended: it reduces the leaking of your users' browsing information and
reduces the bandwidth used on remote pods.
|
||||
"""
|
||||
PODCASTS_THIRD_PARTY_VISIBILITY = env("PODCASTS_THIRD_PARTY_VISIBILITY", default="me")
|
||||
"""
|
||||
By default, only people who subscribe to a podcast's RSS feed will have access to its episodes.
Switch to "instance" or "everyone" to change that.

Changing this only affects new podcasts.
||||
"""
|
||||
PODCASTS_RSS_FEED_REFRESH_DELAY = env.int(
|
||||
"PODCASTS_RSS_FEED_REFRESH_DELAY", default=60 * 60 * 24
|
||||
)
|
||||
"""
|
||||
Delay in seconds between two fetches of RSS feeds. Reducing this means you'll receive new episodes faster,
but will require more resources.
|
||||
"""
|
||||
# maximum items loaded through XML feed
|
||||
PODCASTS_RSS_FEED_MAX_ITEMS = env.int("PODCASTS_RSS_FEED_MAX_ITEMS", default=250)
|
||||
"""
|
||||
Maximum number of RSS items to load in each podcast feed.
|
||||
"""
|
||||
|
|
|
@@ -1,5 +1,7 @@
|
|||
from django import urls
|
||||
|
||||
from funkwhale_api.audio import spa_views as audio_spa_views
|
||||
from funkwhale_api.federation import spa_views as federation_spa_views
|
||||
from funkwhale_api.music import spa_views
|
||||
|
||||
|
||||
|
@@ -20,4 +22,24 @@ urlpatterns = [
|
|||
spa_views.library_playlist,
|
||||
name="library_playlist",
|
||||
),
|
||||
urls.re_path(
|
||||
r"^library/(?P<uuid>[0-9a-f-]+)/?$",
|
||||
spa_views.library_library,
|
||||
name="library_library",
|
||||
),
|
||||
urls.re_path(
|
||||
r"^channels/(?P<uuid>[0-9a-f-]+)/?$",
|
||||
audio_spa_views.channel_detail_uuid,
|
||||
name="channel_detail",
|
||||
),
|
||||
urls.re_path(
|
||||
r"^channels/(?P<username>[^/]+)/?$",
|
||||
audio_spa_views.channel_detail_username,
|
||||
name="channel_detail",
|
||||
),
|
||||
urls.re_path(
|
||||
r"^@(?P<username>[^/]+)/?$",
|
||||
federation_spa_views.actor_detail_username,
|
||||
name="actor_detail",
|
||||
),
|
||||
]
|
||||
|
|
|
@@ -40,3 +40,8 @@ if settings.DEBUG:
|
|||
urlpatterns = [
|
||||
path("api/__debug__/", include(debug_toolbar.urls))
|
||||
] + urlpatterns
|
||||
|
||||
if "silk" in settings.INSTALLED_APPS:
|
||||
urlpatterns = [
|
||||
url(r"^api/silk/", include("silk.urls", namespace="silk"))
|
||||
] + urlpatterns
|
||||
|
|
|
@@ -1,5 +1,5 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
__version__ = "0.20.1"
|
||||
__version__ = "0.21"
|
||||
__version_info__ = tuple(
|
||||
[
|
||||
int(num) if num.isdigit() else num
|
||||
|
|
|
@@ -0,0 +1,15 @@
|
|||
from funkwhale_api.common import admin
|
||||
|
||||
from . import models
|
||||
|
||||
|
||||
@admin.register(models.Channel)
|
||||
class ChannelAdmin(admin.ModelAdmin):
|
||||
list_display = [
|
||||
"uuid",
|
||||
"artist",
|
||||
"attributed_to",
|
||||
"actor",
|
||||
"library",
|
||||
"creation_date",
|
||||
]
|
|
@@ -0,0 +1,113 @@
|
|||
# from https://help.apple.com/itc/podcasts_connect/#/itc9267a2f12
|
||||
ITUNES_CATEGORIES = {
|
||||
"Arts": [
|
||||
"Books",
|
||||
"Design",
|
||||
"Fashion & Beauty",
|
||||
"Food",
|
||||
"Performing Arts",
|
||||
"Visual Arts",
|
||||
],
|
||||
"Business": [
|
||||
"Careers",
|
||||
"Entrepreneurship",
|
||||
"Investing",
|
||||
"Management",
|
||||
"Marketing",
|
||||
"Non-Profit",
|
||||
],
|
||||
"Comedy": ["Comedy Interviews", "Improv", "Stand-Up"],
|
||||
"Education": ["Courses", "How To", "Language Learning", "Self-Improvement"],
|
||||
"Fiction": ["Comedy Fiction", "Drama", "Science Fiction"],
|
||||
"Government": [],
|
||||
"History": [],
|
||||
"Health & Fitness": [
|
||||
"Alternative Health",
|
||||
"Fitness",
|
||||
"Medicine",
|
||||
"Mental Health",
|
||||
"Nutrition",
|
||||
"Sexuality",
|
||||
],
|
||||
"Kids & Family": [
|
||||
"Education for Kids",
|
||||
"Parenting",
|
||||
"Pets & Animals",
|
||||
"Stories for Kids",
|
||||
],
|
||||
"Leisure": [
|
||||
"Animation & Manga",
|
||||
"Automotive",
|
||||
"Aviation",
|
||||
"Crafts",
|
||||
"Games",
|
||||
"Hobbies",
|
||||
"Home & Garden",
|
||||
"Video Games",
|
||||
],
|
||||
"Music": ["Music Commentary", "Music History", "Music Interviews"],
|
||||
"News": [
|
||||
"Business News",
|
||||
"Daily News",
|
||||
"Entertainment News",
|
||||
"News Commentary",
|
||||
"Politics",
|
||||
"Sports News",
|
||||
"Tech News",
|
||||
],
|
||||
"Religion & Spirituality": [
|
||||
"Buddhism",
|
||||
"Christianity",
|
||||
"Hinduism",
|
||||
"Islam",
|
||||
"Judaism",
|
||||
"Religion",
|
||||
"Spirituality",
|
||||
],
|
||||
"Science": [
|
||||
"Astronomy",
|
||||
"Chemistry",
|
||||
"Earth Sciences",
|
||||
"Life Sciences",
|
||||
"Mathematics",
|
||||
"Natural Sciences",
|
||||
"Nature",
|
||||
"Physics",
|
||||
"Social Sciences",
|
||||
],
|
||||
"Society & Culture": [
|
||||
"Documentary",
|
||||
"Personal Journals",
|
||||
"Philosophy",
|
||||
"Places & Travel",
|
||||
"Relationships",
|
||||
],
|
||||
"Sports": [
|
||||
"Baseball",
|
||||
"Basketball",
|
||||
"Cricket",
|
||||
"Fantasy Sports",
|
||||
"Football",
|
||||
"Golf",
|
||||
"Hockey",
|
||||
"Rugby",
|
||||
"Running",
|
||||
"Soccer",
|
||||
"Swimming",
|
||||
"Tennis",
|
||||
"Volleyball",
|
||||
"Wilderness",
|
||||
"Wrestling",
|
||||
],
|
||||
"Technology": [],
|
||||
"True Crime": [],
|
||||
"TV & Film": [
|
||||
"After Shows",
|
||||
"Film History",
|
||||
"Film Interviews",
|
||||
"Film Reviews",
|
||||
"TV Reviews",
|
||||
],
|
||||
}
|
||||
|
||||
ITUNES_SUBCATEGORIES = [s for p in ITUNES_CATEGORIES.values() for s in p]
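The final comprehension flattens every subcategory list into one flat list; in miniature:

```python
categories = {
    "Comedy": ["Comedy Interviews", "Improv", "Stand-Up"],
    "History": [],  # parent categories may have no subcategories
}
# Same shape as the ITUNES_SUBCATEGORIES comprehension above
subcategories = [s for parent in categories.values() for s in parent]
print(subcategories)  # ['Comedy Interviews', 'Improv', 'Stand-Up']
```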
|
|
@@ -0,0 +1,25 @@
|
|||
from dynamic_preferences import types
|
||||
from dynamic_preferences.registries import global_preferences_registry
|
||||
|
||||
audio = types.Section("audio")
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class ChannelsEnabled(types.BooleanPreference):
|
||||
section = audio
|
||||
name = "channels_enabled"
|
||||
default = True
|
||||
verbose_name = "Enable channels"
|
||||
help_text = (
|
||||
"If disabled, the channels feature will be completely switched off, "
|
||||
"and users won't be able to create channels or subscribe to them."
|
||||
)
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class MaxChannels(types.IntegerPreference):
|
||||
show_in_api = True
|
||||
section = audio
|
||||
default = 20
|
||||
name = "max_channels"
|
||||
verbose_name = "Max channels allowed per user"
|
|
@@ -0,0 +1,68 @@
import uuid

import factory

from funkwhale_api.factories import registry, NoUpdateOnCreate
from funkwhale_api.federation import actors
from funkwhale_api.federation import factories as federation_factories
from funkwhale_api.music import factories as music_factories

from . import models


def set_actor(o):
    return models.generate_actor(str(o.uuid))


def get_rss_channel_name():
    return "rssfeed-{}".format(uuid.uuid4())


@registry.register
class ChannelFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
    uuid = factory.Faker("uuid4")
    attributed_to = factory.SubFactory(federation_factories.ActorFactory)
    library = factory.SubFactory(
        federation_factories.MusicLibraryFactory,
        actor=factory.SelfAttribute("..attributed_to"),
        privacy_level="everyone",
    )
    actor = factory.LazyAttribute(set_actor)
    artist = factory.SubFactory(
        music_factories.ArtistFactory,
        attributed_to=factory.SelfAttribute("..attributed_to"),
    )
    rss_url = factory.Faker("url")
    metadata = factory.LazyAttribute(lambda o: {})

    class Meta:
        model = "audio.Channel"

    class Params:
        external = factory.Trait(
            attributed_to=factory.LazyFunction(actors.get_service_actor),
            library__privacy_level="me",
            actor=factory.SubFactory(
                federation_factories.ActorFactory,
                local=True,
                preferred_username=factory.LazyFunction(get_rss_channel_name),
            ),
        )
        local = factory.Trait(
            attributed_to=factory.SubFactory(
                federation_factories.ActorFactory, local=True
            ),
            library__privacy_level="everyone",
            artist__local=True,
        )


@registry.register(name="audio.Subscription")
class SubscriptionFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
    uuid = factory.Faker("uuid4")
    approved = True
    target = factory.LazyAttribute(lambda o: ChannelFactory().actor)
    actor = factory.SubFactory(federation_factories.ActorFactory)

    class Meta:
        model = "federation.Follow"
@@ -0,0 +1,106 @@
from django.db.models import Q

import django_filters

from funkwhale_api.common import fields
from funkwhale_api.common import filters as common_filters
from funkwhale_api.federation import actors
from funkwhale_api.moderation import filters as moderation_filters

from . import models


def filter_tags(queryset, name, value):
    non_empty_tags = [v.lower() for v in value if v]
    for tag in non_empty_tags:
        queryset = queryset.filter(artist__tagged_items__tag__name=tag).distinct()
    return queryset


TAG_FILTER = common_filters.MultipleQueryFilter(method=filter_tags)


class ChannelFilter(moderation_filters.HiddenContentFilterSet):
    q = fields.SearchFilter(
        search_fields=["artist__name", "actor__summary", "actor__preferred_username"]
    )
    tag = TAG_FILTER
    scope = common_filters.ActorScopeFilter(actor_field="attributed_to", distinct=True)
    subscribed = django_filters.BooleanFilter(
        field_name="_", method="filter_subscribed"
    )
    external = django_filters.BooleanFilter(field_name="_", method="filter_external")
    ordering = django_filters.OrderingFilter(
        # tuple-mapping retains order
        fields=(
            ("creation_date", "creation_date"),
            ("artist__modification_date", "modification_date"),
        )
    )

    class Meta:
        model = models.Channel
        fields = ["q", "scope", "tag", "subscribed", "ordering", "external"]
        hidden_content_fields_mapping = moderation_filters.USER_FILTER_CONFIG["CHANNEL"]

    def filter_subscribed(self, queryset, name, value):
        if not self.request.user.is_authenticated:
            return queryset.none()

        emitted_follows = self.request.user.actor.emitted_follows.exclude(
            target__channel__isnull=True
        )

        query = Q(actor__in=emitted_follows.values_list("target", flat=True))

        if value is True:
            return queryset.filter(query)
        else:
            return queryset.exclude(query)

    def filter_external(self, queryset, name, value):
        query = Q(
            attributed_to=actors.get_service_actor(),
            actor__preferred_username__startswith="rssfeed-",
        )
        if value is True:
            queryset = queryset.filter(query)
        if value is False:
            queryset = queryset.exclude(query)

        return queryset


class IncludeChannelsFilterSet(django_filters.FilterSet):
    """

    A filterset that includes an "include_channels" param. Meant for compatibility
    with clients that don't support channels yet:

    - include_channels=false : exclude objects associated with a channel
    - include_channels=true : don't exclude objects associated with a channel
    - not specified: include_channels=false

    Usage:

    class MyFilterSet(IncludeChannelsFilterSet):
        class Meta:
            include_channels_field = "album__artist__channel"

    """

    include_channels = django_filters.BooleanFilter(
        field_name="_", method="filter_include_channels"
    )

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.data = self.data.copy()
        self.data.setdefault("include_channels", False)

    def filter_include_channels(self, queryset, name, value):
        if value is True:
            return queryset
        else:
            params = {self.__class__.Meta.include_channels_field: None}
            return queryset.filter(**params)
@@ -0,0 +1,31 @@
# Generated by Django 2.2.6 on 2019-10-29 12:57

from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
import uuid


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('federation', '0021_auto_20191029_1257'),
        ('music', '0041_auto_20191021_1705'),
    ]

    operations = [
        migrations.CreateModel(
            name='Channel',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('uuid', models.UUIDField(default=uuid.uuid4, unique=True)),
                ('creation_date', models.DateTimeField(default=django.utils.timezone.now)),
                ('actor', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='channel', to='federation.Actor')),
                ('artist', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='channel', to='music.Artist')),
                ('attributed_to', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='owned_channels', to='federation.Actor')),
                ('library', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='channel', to='music.Library')),
            ],
        ),
    ]
@@ -0,0 +1,21 @@
# Generated by Django 2.2.9 on 2020-01-31 06:24

import django.contrib.postgres.fields.jsonb
import django.core.serializers.json
from django.db import migrations
import funkwhale_api.audio.models


class Migration(migrations.Migration):

    dependencies = [
        ('audio', '0001_initial'),
    ]

    operations = [
        migrations.AddField(
            model_name='channel',
            name='metadata',
            field=django.contrib.postgres.fields.jsonb.JSONField(blank=True, default=funkwhale_api.audio.models.empty_dict, encoder=django.core.serializers.json.DjangoJSONEncoder, max_length=50000),
        ),
    ]
@@ -0,0 +1,18 @@
# Generated by Django 2.2.10 on 2020-02-06 15:01

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('audio', '0002_channel_metadata'),
    ]

    operations = [
        migrations.AddField(
            model_name='channel',
            name='rss_url',
            field=models.URLField(blank=True, max_length=500, null=True),
        ),
    ]
@@ -0,0 +1,123 @@
import uuid


from django.contrib.contenttypes.fields import GenericRelation
from django.contrib.postgres.fields import JSONField
from django.core.serializers.json import DjangoJSONEncoder
from django.db import models
from django.urls import reverse
from django.utils import timezone
from django.db.models.signals import post_delete
from django.dispatch import receiver

from funkwhale_api.federation import keys
from funkwhale_api.federation import models as federation_models
from funkwhale_api.federation import utils as federation_utils
from funkwhale_api.users import models as user_models


def empty_dict():
    return {}


class ChannelQuerySet(models.QuerySet):
    def external_rss(self, include=True):
        from funkwhale_api.federation import actors

        query = models.Q(
            attributed_to=actors.get_service_actor(),
            actor__preferred_username__startswith="rssfeed-",
        )
        if include:
            return self.filter(query)
        return self.exclude(query)

    def subscribed(self, actor):
        if not actor:
            return self.none()

        subscriptions = actor.emitted_follows.filter(
            approved=True, target__channel__isnull=False
        )
        return self.filter(actor__in=subscriptions.values_list("target", flat=True))


class Channel(models.Model):
    uuid = models.UUIDField(default=uuid.uuid4, unique=True)
    artist = models.OneToOneField(
        "music.Artist", on_delete=models.CASCADE, related_name="channel"
    )
    # the owner of the channel
    attributed_to = models.ForeignKey(
        "federation.Actor", on_delete=models.CASCADE, related_name="owned_channels"
    )
    # the federation actor created for the channel
    # (the one people can follow to receive updates)
    actor = models.OneToOneField(
        "federation.Actor", on_delete=models.CASCADE, related_name="channel"
    )

    library = models.OneToOneField(
        "music.Library", on_delete=models.CASCADE, related_name="channel"
    )
    creation_date = models.DateTimeField(default=timezone.now)
    rss_url = models.URLField(max_length=500, null=True, blank=True)

    # metadata to enhance rss feed
    metadata = JSONField(
        default=empty_dict, max_length=50000, encoder=DjangoJSONEncoder, blank=True
    )

    fetches = GenericRelation(
        "federation.Fetch",
        content_type_field="object_content_type",
        object_id_field="object_id",
    )
    objects = ChannelQuerySet.as_manager()

    @property
    def fid(self):
        if not self.is_external_rss:
            return self.actor.fid

    @property
    def is_local(self):
        return self.actor.is_local

    @property
    def is_external_rss(self):
        return self.actor.preferred_username.startswith("rssfeed-")

    def get_absolute_url(self):
        suffix = self.uuid
        if self.actor.is_local:
            suffix = self.actor.preferred_username
        else:
            suffix = self.actor.full_username
        return federation_utils.full_url("/channels/{}".format(suffix))

    def get_rss_url(self):
        if not self.artist.is_local or self.is_external_rss:
            return self.rss_url

        return federation_utils.full_url(
            reverse(
                "api:v1:channels-rss",
                kwargs={"composite": self.actor.preferred_username},
            )
        )


def generate_actor(username, **kwargs):
    actor_data = user_models.get_actor_data(username, **kwargs)
    private, public = keys.get_key_pair()
    actor_data["private_key"] = private.decode("utf-8")
    actor_data["public_key"] = public.decode("utf-8")

    return federation_models.Actor.objects.create(**actor_data)


@receiver(post_delete, sender=Channel)
def delete_channel_related_objs(instance, **kwargs):
    instance.library.delete()
    instance.artist.delete()
@@ -0,0 +1,36 @@
import xml.etree.ElementTree as ET

from rest_framework import negotiation
from rest_framework import renderers

from funkwhale_api.subsonic.renderers import dict_to_xml_tree


class PodcastRSSRenderer(renderers.JSONRenderer):
    media_type = "application/rss+xml"

    def render(self, data, accepted_media_type=None, renderer_context=None):
        if not data:
            # when stream view is called, we don't have any data
            return super().render(data, accepted_media_type, renderer_context)
        final = {
            "version": "2.0",
            "xmlns:atom": "http://www.w3.org/2005/Atom",
            "xmlns:itunes": "http://www.itunes.com/dtds/podcast-1.0.dtd",
            "xmlns:media": "http://search.yahoo.com/mrss/",
        }
        final.update(data)
        tree = dict_to_xml_tree("rss", final)
        return render_xml(tree)


class PodcastRSSContentNegociation(negotiation.DefaultContentNegotiation):
    def select_renderer(self, request, renderers, format_suffix=None):

        return (PodcastRSSRenderer(), PodcastRSSRenderer.media_type)


def render_xml(tree):
    return b'<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(
        tree, encoding="utf-8"
    )
@@ -0,0 +1,942 @@
import datetime
import logging
import time
import uuid

from django.conf import settings
from django.db import transaction
from django.db.models import Q
from django.utils import timezone

import feedparser
import requests
import pytz

from rest_framework import serializers

from django.templatetags.static import static
from django.urls import reverse

from funkwhale_api.common import serializers as common_serializers
from funkwhale_api.common import utils as common_utils
from funkwhale_api.common import locales
from funkwhale_api.common import preferences
from funkwhale_api.common import session
from funkwhale_api.federation import actors
from funkwhale_api.federation import models as federation_models
from funkwhale_api.federation import serializers as federation_serializers
from funkwhale_api.federation import utils as federation_utils
from funkwhale_api.moderation import mrf
from funkwhale_api.music import models as music_models
from funkwhale_api.music import serializers as music_serializers
from funkwhale_api.tags import models as tags_models
from funkwhale_api.tags import serializers as tags_serializers
from funkwhale_api.users import serializers as users_serializers

from . import categories
from . import models


logger = logging.getLogger(__name__)


class ChannelMetadataSerializer(serializers.Serializer):
    itunes_category = serializers.ChoiceField(
        choices=categories.ITUNES_CATEGORIES, required=True
    )
    itunes_subcategory = serializers.CharField(required=False, allow_null=True)
    language = serializers.ChoiceField(required=True, choices=locales.ISO_639_CHOICES)
    copyright = serializers.CharField(required=False, allow_null=True, max_length=255)
    owner_name = serializers.CharField(required=False, allow_null=True, max_length=255)
    owner_email = serializers.EmailField(required=False, allow_null=True)
    explicit = serializers.BooleanField(required=False)

    def validate(self, validated_data):
        validated_data = super().validate(validated_data)
        subcategory = self._validate_itunes_subcategory(
            validated_data["itunes_category"], validated_data.get("itunes_subcategory")
        )
        if subcategory:
            validated_data["itunes_subcategory"] = subcategory
        return validated_data

    def _validate_itunes_subcategory(self, parent, child):
        if not child:
            return

        if child not in categories.ITUNES_CATEGORIES[parent]:
            raise serializers.ValidationError(
                '"{}" is not a valid subcategory for "{}"'.format(child, parent)
            )

        return child


class ChannelCreateSerializer(serializers.Serializer):
    name = serializers.CharField(max_length=music_models.MAX_LENGTHS["ARTIST_NAME"])
    username = serializers.CharField(
        max_length=music_models.MAX_LENGTHS["ARTIST_NAME"],
        validators=[users_serializers.ASCIIUsernameValidator()],
    )
    description = common_serializers.ContentSerializer(allow_null=True)
    tags = tags_serializers.TagsListField()
    content_category = serializers.ChoiceField(
        choices=music_models.ARTIST_CONTENT_CATEGORY_CHOICES
    )
    metadata = serializers.DictField(required=False)
    cover = music_serializers.COVER_WRITE_FIELD

    def validate(self, validated_data):
        existing_channels = self.context["actor"].owned_channels.count()
        if existing_channels >= preferences.get("audio__max_channels"):
            raise serializers.ValidationError(
                "You have reached the maximum amount of allowed channels"
            )
        validated_data = super().validate(validated_data)
        metadata = validated_data.pop("metadata", {})
        if validated_data["content_category"] == "podcast":
            metadata_serializer = ChannelMetadataSerializer(data=metadata)
            metadata_serializer.is_valid(raise_exception=True)
            metadata = metadata_serializer.validated_data
        validated_data["metadata"] = metadata
        return validated_data

    def validate_username(self, value):
        if value.lower() in [n.lower() for n in settings.ACCOUNT_USERNAME_BLACKLIST]:
            raise serializers.ValidationError("This username is already taken")

        matching = federation_models.Actor.objects.local().filter(
            preferred_username__iexact=value
        )
        if matching.exists():
            raise serializers.ValidationError("This username is already taken")
        return value

    @transaction.atomic
    def create(self, validated_data):
        from . import views

        cover = validated_data.pop("cover", None)
        description = validated_data.get("description")
        artist = music_models.Artist.objects.create(
            attributed_to=validated_data["attributed_to"],
            name=validated_data["name"],
            content_category=validated_data["content_category"],
            attachment_cover=cover,
        )
        common_utils.attach_content(artist, "description", description)

        if validated_data.get("tags", []):
            tags_models.set_tags(artist, *validated_data["tags"])

        channel = models.Channel(
            artist=artist,
            attributed_to=validated_data["attributed_to"],
            metadata=validated_data["metadata"],
        )
        channel.actor = models.generate_actor(
            validated_data["username"], name=validated_data["name"],
        )

        channel.library = music_models.Library.objects.create(
            name=channel.actor.preferred_username,
            privacy_level="everyone",
            actor=validated_data["attributed_to"],
        )
        channel.save()
        channel = views.ChannelViewSet.queryset.get(pk=channel.pk)
        return channel

    def to_representation(self, obj):
        return ChannelSerializer(obj, context=self.context).data


NOOP = object()


class ChannelUpdateSerializer(serializers.Serializer):
    name = serializers.CharField(max_length=music_models.MAX_LENGTHS["ARTIST_NAME"])
    description = common_serializers.ContentSerializer(allow_null=True)
    tags = tags_serializers.TagsListField()
    content_category = serializers.ChoiceField(
        choices=music_models.ARTIST_CONTENT_CATEGORY_CHOICES
    )
    metadata = serializers.DictField(required=False)
    cover = music_serializers.COVER_WRITE_FIELD

    def validate(self, validated_data):
        validated_data = super().validate(validated_data)
        require_metadata_validation = False
        new_content_category = validated_data.get("content_category")
        metadata = validated_data.pop("metadata", NOOP)
        if (
            new_content_category == "podcast"
            and self.instance.artist.content_category != "podcast"
        ):
            # updating channel, setting as podcast
            require_metadata_validation = True
        elif self.instance.artist.content_category == "podcast" and metadata != NOOP:
            # channel is podcast, and metadata was updated
            require_metadata_validation = True
        else:
            metadata = self.instance.metadata

        if require_metadata_validation:
            metadata_serializer = ChannelMetadataSerializer(data=metadata)
            metadata_serializer.is_valid(raise_exception=True)
            metadata = metadata_serializer.validated_data

        validated_data["metadata"] = metadata
        return validated_data

    @transaction.atomic
    def update(self, obj, validated_data):
        if validated_data.get("tags") is not None:
            tags_models.set_tags(obj.artist, *validated_data["tags"])
        actor_update_fields = []
        artist_update_fields = []

        obj.metadata = validated_data["metadata"]
        obj.save(update_fields=["metadata"])

        if "description" in validated_data:
            common_utils.attach_content(
                obj.artist, "description", validated_data["description"]
            )

        if "name" in validated_data:
            actor_update_fields.append(("name", validated_data["name"]))
            artist_update_fields.append(("name", validated_data["name"]))

        if "content_category" in validated_data:
            artist_update_fields.append(
                ("content_category", validated_data["content_category"])
            )

        if "cover" in validated_data:
            artist_update_fields.append(("attachment_cover", validated_data["cover"]))

        if actor_update_fields:
            for field, value in actor_update_fields:
                setattr(obj.actor, field, value)
            obj.actor.save(update_fields=[f for f, _ in actor_update_fields])

        if artist_update_fields:
            for field, value in artist_update_fields:
                setattr(obj.artist, field, value)
            obj.artist.save(update_fields=[f for f, _ in artist_update_fields])

        return obj

    def to_representation(self, obj):
        return ChannelSerializer(obj, context=self.context).data


class ChannelSerializer(serializers.ModelSerializer):
    artist = serializers.SerializerMethodField()
    actor = serializers.SerializerMethodField()
    attributed_to = federation_serializers.APIActorSerializer()
    rss_url = serializers.CharField(source="get_rss_url")
    url = serializers.SerializerMethodField()

    class Meta:
        model = models.Channel
        fields = [
            "uuid",
            "artist",
            "attributed_to",
            "actor",
            "creation_date",
            "metadata",
            "rss_url",
            "url",
        ]

    def get_artist(self, obj):
        return music_serializers.serialize_artist_simple(obj.artist)

    def to_representation(self, obj):
        data = super().to_representation(obj)
        if self.context.get("subscriptions_count"):
            data["subscriptions_count"] = self.get_subscriptions_count(obj)
        return data

    def get_subscriptions_count(self, obj):
        return obj.actor.received_follows.exclude(approved=False).count()

    def get_actor(self, obj):
        if obj.attributed_to == actors.get_service_actor():
            return None
        return federation_serializers.APIActorSerializer(obj.actor).data

    def get_url(self, obj):
        return obj.actor.url


class SubscriptionSerializer(serializers.Serializer):
    approved = serializers.BooleanField(read_only=True)
    fid = serializers.URLField(read_only=True)
    uuid = serializers.UUIDField(read_only=True)
    creation_date = serializers.DateTimeField(read_only=True)

    def to_representation(self, obj):
        data = super().to_representation(obj)
        data["channel"] = ChannelSerializer(obj.target.channel).data
        return data


class RssSubscribeSerializer(serializers.Serializer):
    url = serializers.URLField()


class FeedFetchException(Exception):
    pass


class BlockedFeedException(FeedFetchException):
    pass


def retrieve_feed(url):
    try:
        logger.info("Fetching RSS feed at %s", url)
        response = session.get_session().get(url)
        response.raise_for_status()
    except requests.exceptions.HTTPError as e:
        if e.response:
            raise FeedFetchException(
                "Error while fetching feed: HTTP {}".format(e.response.status_code)
            )
        raise FeedFetchException("Error while fetching feed: unknown error")
    except requests.exceptions.Timeout:
        raise FeedFetchException("Error while fetching feed: timeout")
    except requests.exceptions.ConnectionError:
        raise FeedFetchException("Error while fetching feed: connection error")
    except requests.RequestException as e:
        raise FeedFetchException("Error while fetching feed: {}".format(e))
    except Exception as e:
        raise FeedFetchException("Error while fetching feed: {}".format(e))

    return response


@transaction.atomic
def get_channel_from_rss_url(url, raise_exception=False):
    # first, check if the url is blocked
    is_valid, _ = mrf.inbox.apply({"id": url})
    if not is_valid:
        logger.warn("Feed fetch for url %s dropped by MRF", url)
        raise BlockedFeedException("This feed or domain is blocked")

    # retrieve the XML payload at the given URL
    response = retrieve_feed(url)

    parsed_feed = feedparser.parse(response.text)
    serializer = RssFeedSerializer(data=parsed_feed["feed"])
    if not serializer.is_valid(raise_exception=raise_exception):
        raise FeedFetchException("Invalid xml content: {}".format(serializer.errors))

    # second mrf check with validated data
    urls_to_check = set()
    atom_link = serializer.validated_data.get("atom_link")

    if atom_link and atom_link != url:
        urls_to_check.add(atom_link)

    if serializer.validated_data["link"] != url:
        urls_to_check.add(serializer.validated_data["link"])

    for u in urls_to_check:
        is_valid, _ = mrf.inbox.apply({"id": u})
        if not is_valid:
            logger.warn("Feed fetch for url %s dropped by MRF", u)
            raise BlockedFeedException("This feed or domain is blocked")

    # now, we're clear, we can save the data
    channel = serializer.save(rss_url=url)

    entries = parsed_feed.entries or []
    uploads = []
    track_defaults = {}
    existing_uploads = list(
        channel.library.uploads.all().select_related(
            "track__description", "track__attachment_cover"
        )
    )
    if parsed_feed.feed.get("rights"):
        track_defaults["copyright"] = parsed_feed.feed.rights[
            : music_models.MAX_LENGTHS["COPYRIGHT"]
        ]
    for entry in entries[: settings.PODCASTS_RSS_FEED_MAX_ITEMS]:
        logger.debug("Importing feed item %s", entry.id)
        s = RssFeedItemSerializer(data=entry)
        if not s.is_valid(raise_exception=raise_exception):
            logger.debug("Skipping invalid RSS feed item %s: %s", entry, s.errors)
            continue
        uploads.append(
            s.save(channel, existing_uploads=existing_uploads, **track_defaults)
        )

    common_utils.on_commit(
        music_models.TrackActor.create_entries,
        library=channel.library,
        delete_existing=True,
    )
    if uploads:
        latest_track_date = max([upload.track.creation_date for upload in uploads])
        common_utils.update_modification_date(channel.artist, date=latest_track_date)
    return channel, uploads


# RSS related stuff
# https://github.com/simplepie/simplepie-ng/wiki/Spec:-iTunes-Podcast-RSS
# is extremely useful


class RssFeedSerializer(serializers.Serializer):
    title = serializers.CharField()
    link = serializers.URLField(required=False, allow_blank=True)
    language = serializers.CharField(required=False, allow_blank=True)
    rights = serializers.CharField(required=False, allow_blank=True)
    itunes_explicit = serializers.BooleanField(required=False, allow_null=True)
    tags = serializers.ListField(required=False)
    atom_link = serializers.DictField(required=False)
    links = serializers.ListField(required=False)
    summary_detail = serializers.DictField(required=False)
    author_detail = serializers.DictField(required=False)
    image = serializers.DictField(required=False)

    def validate_atom_link(self, v):
        if (
            v.get("rel", "self") == "self"
            and v.get("type", "application/rss+xml") == "application/rss+xml"
        ):
            return v["href"]

    def validate_links(self, v):
        for link in v:
            if link.get("rel") == "self":
                return link.get("href")

    def validate_summary_detail(self, v):
        content = v.get("value")
        if not content:
            return
        return {
            "content_type": v.get("type", "text/plain"),
            "text": content,
        }

    def validate_image(self, v):
        url = v.get("href")
        if url:
            return {
                "url": url,
                "mimetype": common_utils.get_mimetype_from_ext(url) or "image/jpeg",
            }

    def validate_tags(self, v):
        data = {}
        for row in v:
            if row.get("scheme") != "http://www.itunes.com/":
                continue
            term = row["term"]
            if "parent" not in data and term in categories.ITUNES_CATEGORIES:
                data["parent"] = term
            elif "child" not in data and term in categories.ITUNES_SUBCATEGORIES:
                data["child"] = term
            elif (
                term not in categories.ITUNES_SUBCATEGORIES
                and term not in categories.ITUNES_CATEGORIES
            ):
                raw_tags = term.split(" ")
                data["tags"] = []
                tag_serializer = tags_serializers.TagNameField()
                for tag in raw_tags:
                    try:
                        data["tags"].append(tag_serializer.to_internal_value(tag))
                    except Exception:
                        pass

        return data

    def validate(self, data):
        validated_data = super().validate(data)
        if not validated_data.get("link"):
            validated_data["link"] = validated_data.get("links")
        if not validated_data.get("link"):
            raise serializers.ValidationError("Missing link")
        return validated_data

    @transaction.atomic
    def save(self, rss_url):
        validated_data = self.validated_data
        # because there may be redirections from the original feed URL
        real_rss_url = validated_data.get("atom_link", rss_url) or rss_url
        service_actor = actors.get_service_actor()
        author = validated_data.get("author_detail", {})
        categories = validated_data.get("tags", {})
        metadata = {
            "explicit": validated_data.get("itunes_explicit", False),
            "copyright": validated_data.get("rights"),
            "owner_name": author.get("name"),
            "owner_email": author.get("email"),
            "itunes_category": categories.get("parent"),
            "itunes_subcategory": categories.get("child"),
            "language": validated_data.get("language"),
        }
        public_url = validated_data["link"]
        existing = (
            models.Channel.objects.external_rss()
            .filter(
                Q(rss_url=real_rss_url) | Q(rss_url=rss_url) | Q(actor__url=public_url)
            )
            .first()
        )
        channel_defaults = {
            "rss_url": real_rss_url,
            "metadata": metadata,
        }
        if existing:
            artist_kwargs = {"channel": existing}
            actor_kwargs = {"channel": existing}
            actor_defaults = {"url": public_url}
        else:
            artist_kwargs = {"pk": None}
            actor_kwargs = {"pk": None}
            preferred_username = "rssfeed-{}".format(uuid.uuid4())
            actor_defaults = {
                "preferred_username": preferred_username,
                "type": "Application",
                "domain": service_actor.domain,
                "url": public_url,
                "fid": federation_utils.full_url(
                    reverse(
                        "federation:actors-detail",
                        kwargs={"preferred_username": preferred_username},
                    )
                ),
            }
            channel_defaults["attributed_to"] = service_actor

        actor_defaults["last_fetch_date"] = timezone.now()

        # create/update the artist profile
|
||||
artist, created = music_models.Artist.objects.update_or_create(
|
||||
**artist_kwargs,
|
||||
defaults={
|
||||
"attributed_to": service_actor,
|
||||
"name": validated_data["title"][
|
||||
: music_models.MAX_LENGTHS["ARTIST_NAME"]
|
||||
],
|
||||
"content_category": "podcast",
|
||||
},
|
||||
)
|
||||
|
||||
cover = validated_data.get("image")
|
||||
|
||||
if cover:
|
||||
common_utils.attach_file(artist, "attachment_cover", cover)
|
||||
tags = categories.get("tags", [])
|
||||
|
||||
if tags:
|
||||
tags_models.set_tags(artist, *tags)
|
||||
|
||||
summary = validated_data.get("summary_detail")
|
||||
if summary:
|
||||
common_utils.attach_content(artist, "description", summary)
|
||||
|
||||
if created:
|
||||
channel_defaults["artist"] = artist
|
||||
|
||||
# create/update the actor
|
||||
actor, created = federation_models.Actor.objects.update_or_create(
|
||||
**actor_kwargs, defaults=actor_defaults
|
||||
)
|
||||
if created:
|
||||
channel_defaults["actor"] = actor
|
||||
|
||||
# create the library
|
||||
if not existing:
|
||||
channel_defaults["library"] = music_models.Library.objects.create(
|
||||
actor=service_actor,
|
||||
privacy_level=settings.PODCASTS_THIRD_PARTY_VISIBILITY,
|
||||
name=actor_defaults["preferred_username"],
|
||||
)
|
||||
|
||||
# create/update the channel
|
||||
channel, created = models.Channel.objects.update_or_create(
|
||||
pk=existing.pk if existing else None, defaults=channel_defaults,
|
||||
)
|
||||
return channel
|
||||
|
||||
|
||||
class ItunesDurationField(serializers.CharField):
|
||||
def to_internal_value(self, v):
|
||||
try:
|
||||
return int(v)
|
||||
except (ValueError, TypeError):
|
||||
pass
|
||||
parts = v.split(":")
|
||||
int_parts = []
|
||||
for part in parts:
|
||||
try:
|
||||
int_parts.append(int(part))
|
||||
except (ValueError, TypeError):
|
||||
raise serializers.ValidationError("Invalid duration {}".format(v))
|
||||
|
||||
if len(int_parts) == 2:
|
||||
hours = 0
|
||||
minutes, seconds = int_parts
|
||||
elif len(int_parts) == 3:
|
||||
hours, minutes, seconds = int_parts
|
||||
else:
|
||||
raise serializers.ValidationError("Invalid duration {}".format(v))
|
||||
|
||||
return (hours * 3600) + (minutes * 60) + seconds
|
||||
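The duration parsing above can be sketched standalone: it accepts plain seconds, `MM:SS`, or `HH:MM:SS`. This is a minimal sketch of the same logic outside the DRF field machinery; `parse_itunes_duration` is a hypothetical helper name, not part of the codebase.

```python
def parse_itunes_duration(v):
    # sketch of ItunesDurationField.to_internal_value without DRF:
    # accepts plain seconds, MM:SS or HH:MM:SS
    try:
        return int(v)
    except (ValueError, TypeError):
        pass
    parts = [int(p) for p in v.split(":")]  # raises ValueError on garbage
    if len(parts) == 2:
        parts = [0] + parts  # no hours component
    if len(parts) != 3:
        raise ValueError("Invalid duration {}".format(v))
    hours, minutes, seconds = parts
    return (hours * 3600) + (minutes * 60) + seconds
```

Usage: `parse_itunes_duration("01:02:03")` returns 3723, `parse_itunes_duration("02:30")` returns 150.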
class DummyField(serializers.Field):
    def to_internal_value(self, v):
        return v


def get_cached_upload(uploads, expected_track_uuid):
    for upload in uploads:
        if upload.track.uuid == expected_track_uuid:
            return upload


class PermissiveIntegerField(serializers.IntegerField):
    def to_internal_value(self, v):
        try:
            return super().to_internal_value(v)
        except serializers.ValidationError:
            return self.default


class RssFeedItemSerializer(serializers.Serializer):
    id = serializers.CharField()
    title = serializers.CharField()
    rights = serializers.CharField(required=False, allow_blank=True)
    itunes_season = serializers.IntegerField(
        required=False, allow_null=True, default=None
    )
    itunes_episode = PermissiveIntegerField(
        required=False, allow_null=True, default=None
    )
    itunes_duration = ItunesDurationField(
        required=False, allow_null=True, default=None, allow_blank=True
    )
    links = serializers.ListField()
    tags = serializers.ListField(required=False)
    summary_detail = serializers.DictField(required=False)
    published_parsed = DummyField(required=False)
    image = serializers.DictField(required=False)

    def validate_summary_detail(self, v):
        content = v.get("value")
        if not content:
            return
        return {
            "content_type": v.get("type", "text/plain"),
            "text": content,
        }

    def validate_image(self, v):
        url = v.get("href")
        if url:
            return {
                "url": url,
                "mimetype": common_utils.get_mimetype_from_ext(url) or "image/jpeg",
            }

    def validate_links(self, v):
        data = {}
        for row in v:
            if not row.get("type", "").startswith("audio/"):
                continue
            if row.get("rel") != "enclosure":
                continue
            try:
                size = int(row.get("length", 0) or 0) or None
            except (TypeError, ValueError):
                raise serializers.ValidationError("Invalid size")

            data["audio"] = {
                "mimetype": common_utils.get_audio_mimetype(row["type"]),
                "size": size,
                "source": row["href"],
            }

        if not data:
            raise serializers.ValidationError("No valid audio enclosure found")

        return data

    def validate_tags(self, v):
        data = {}
        for row in v:
            if row.get("scheme") != "http://www.itunes.com/":
                continue
            term = row["term"]
            raw_tags = term.split(" ")
            data["tags"] = []
            tag_serializer = tags_serializers.TagNameField()
            for tag in raw_tags:
                try:
                    data["tags"].append(tag_serializer.to_internal_value(tag))
                except Exception:
                    pass

        return data

    @transaction.atomic
    def save(self, channel, existing_uploads=None, **track_defaults):
        # avoid a mutable default argument
        existing_uploads = existing_uploads or []
        validated_data = self.validated_data
        categories = validated_data.get("tags", {})
        expected_uuid = uuid.uuid3(
            uuid.NAMESPACE_URL, "rss://{}-{}".format(channel.pk, validated_data["id"])
        )
        existing_upload = get_cached_upload(existing_uploads, expected_uuid)
        if existing_upload:
            existing_track = existing_upload.track
        else:
            existing_track = (
                music_models.Track.objects.filter(
                    uuid=expected_uuid, artist__channel=channel
                )
                .select_related("description", "attachment_cover")
                .first()
            )
            if existing_track:
                existing_upload = existing_track.uploads.filter(
                    library=channel.library
                ).first()

        track_defaults.update(
            {
                "disc_number": validated_data.get("itunes_season", 1) or 1,
                "position": validated_data.get("itunes_episode", 1) or 1,
                "title": validated_data["title"][
                    : music_models.MAX_LENGTHS["TRACK_TITLE"]
                ],
                "artist": channel.artist,
            }
        )
        if "rights" in validated_data:
            track_defaults["copyright"] = validated_data["rights"][
                : music_models.MAX_LENGTHS["COPYRIGHT"]
            ]

        if "published_parsed" in validated_data:
            track_defaults["creation_date"] = datetime.datetime.fromtimestamp(
                time.mktime(validated_data["published_parsed"])
            ).replace(tzinfo=pytz.utc)

        upload_defaults = {
            "source": validated_data["links"]["audio"]["source"],
            "size": validated_data["links"]["audio"]["size"],
            "mimetype": validated_data["links"]["audio"]["mimetype"],
            "duration": validated_data.get("itunes_duration") or None,
            "import_status": "finished",
            "library": channel.library,
        }
        if existing_track:
            track_kwargs = {"pk": existing_track.pk}
            upload_kwargs = {"track": existing_track}
        else:
            track_kwargs = {"pk": None}
            track_defaults["uuid"] = expected_uuid
            upload_kwargs = {"pk": None}

        if existing_upload and existing_upload.source != upload_defaults["source"]:
            # delete the existing upload, since the URL to the audio file has changed
            existing_upload.delete()

        # create/update the track
        track, created = music_models.Track.objects.update_or_create(
            **track_kwargs, defaults=track_defaults,
        )
        # optimisation to reduce SQL queries: we cannot use select_related with
        # update_or_create, so we restore the cache by hand
        if existing_track:
            for field in ["attachment_cover", "description"]:
                cached_id_value = getattr(existing_track, "{}_id".format(field))
                new_id_value = getattr(track, "{}_id".format(field))
                if new_id_value and cached_id_value == new_id_value:
                    setattr(track, field, getattr(existing_track, field))

        cover = validated_data.get("image")

        if cover:
            common_utils.attach_file(track, "attachment_cover", cover)
        tags = categories.get("tags", [])

        if tags:
            tags_models.set_tags(track, *tags)

        summary = validated_data.get("summary_detail")
        if summary:
            common_utils.attach_content(track, "description", summary)

        if created:
            upload_defaults["track"] = track

        # create/update the upload
        upload, created = music_models.Upload.objects.update_or_create(
            **upload_kwargs, defaults=upload_defaults
        )

        return upload


def rfc822_date(dt):
    return dt.strftime("%a, %d %b %Y %H:%M:%S %z")


def rss_duration(seconds):
    if not seconds:
        return "00:00:00"
    full_hours = seconds // 3600
    full_minutes = (seconds - (full_hours * 3600)) // 60
    remaining_seconds = seconds - (full_hours * 3600) - (full_minutes * 60)
    return "{}:{}:{}".format(
        str(full_hours).zfill(2),
        str(full_minutes).zfill(2),
        str(remaining_seconds).zfill(2),
    )
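The two formatting helpers above produce the RSS-facing date and duration strings. A minimal standalone sketch of the same behaviour (using `divmod` for the duration split, which is equivalent to the subtraction form used in the code):

```python
import datetime

def rfc822_date(dt):
    # RFC 822 date format expected in RSS <pubDate> elements
    return dt.strftime("%a, %d %b %Y %H:%M:%S %z")

def rss_duration(seconds):
    # zero-padded HH:MM:SS for <itunes:duration>
    if not seconds:
        return "00:00:00"
    hours, rest = divmod(seconds, 3600)
    minutes, secs = divmod(rest, 60)
    return "{:02d}:{:02d}:{:02d}".format(hours, minutes, secs)

dt = datetime.datetime(2020, 4, 1, 12, 0, 0, tzinfo=datetime.timezone.utc)
print(rfc822_date(dt))     # Wed, 01 Apr 2020 12:00:00 +0000
print(rss_duration(3723))  # 01:02:03
```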
def rss_serialize_item(upload):
    data = {
        "title": [{"value": upload.track.title}],
        "itunes:title": [{"value": upload.track.title}],
        "guid": [{"cdata_value": str(upload.uuid), "isPermalink": "false"}],
        "pubDate": [{"value": rfc822_date(upload.creation_date)}],
        "itunes:duration": [{"value": rss_duration(upload.duration)}],
        "itunes:explicit": [{"value": "no"}],
        "itunes:episodeType": [{"value": "full"}],
        "itunes:season": [{"value": upload.track.disc_number or 1}],
        "itunes:episode": [{"value": upload.track.position or 1}],
        "link": [{"value": federation_utils.full_url(upload.track.get_absolute_url())}],
        "enclosure": [
            {
                # we enforce MP3, since it's the only format supported everywhere
                "url": federation_utils.full_url(upload.get_listen_url(to="mp3")),
                "length": upload.size or 0,
                "type": "audio/mpeg",
            }
        ],
    }
    if upload.track.description:
        data["itunes:subtitle"] = [{"value": upload.track.description.truncate(255)}]
        data["itunes:summary"] = [{"cdata_value": upload.track.description.rendered}]
        data["description"] = [{"value": upload.track.description.as_plain_text}]

    if upload.track.attachment_cover:
        data["itunes:image"] = [
            {"href": upload.track.attachment_cover.download_url_original}
        ]

    tagged_items = getattr(upload.track, "_prefetched_tagged_items", [])
    if tagged_items:
        data["itunes:keywords"] = [
            {"value": " ".join([ti.tag.name for ti in tagged_items])}
        ]

    return data


def rss_serialize_channel(channel):
    metadata = channel.metadata or {}
    explicit = metadata.get("explicit", False)
    copyright = metadata.get("copyright", "All rights reserved")
    owner_name = metadata.get("owner_name", channel.attributed_to.display_name)
    owner_email = metadata.get("owner_email")
    itunes_category = metadata.get("itunes_category")
    itunes_subcategory = metadata.get("itunes_subcategory")
    language = metadata.get("language")

    data = {
        "title": [{"value": channel.artist.name}],
        "copyright": [{"value": copyright}],
        "itunes:explicit": [{"value": "no" if not explicit else "yes"}],
        "itunes:author": [{"value": owner_name}],
        "itunes:owner": [{"itunes:name": [{"value": owner_name}]}],
        "itunes:type": [{"value": "episodic"}],
        "link": [{"value": channel.get_absolute_url()}],
        "atom:link": [
            {
                "href": channel.get_rss_url(),
                "rel": "self",
                "type": "application/rss+xml",
            },
            {
                "href": channel.actor.fid,
                "rel": "alternate",
                "type": "application/activity+json",
            },
        ],
    }
    if language:
        data["language"] = [{"value": language}]

    if owner_email:
        data["itunes:owner"][0]["itunes:email"] = [{"value": owner_email}]

    if itunes_category:
        node = {"text": itunes_category}
        if itunes_subcategory:
            node["itunes:category"] = [{"text": itunes_subcategory}]
        data["itunes:category"] = [node]

    if channel.artist.description:
        data["itunes:subtitle"] = [{"value": channel.artist.description.truncate(255)}]
        data["itunes:summary"] = [{"cdata_value": channel.artist.description.rendered}]
        data["description"] = [{"value": channel.artist.description.as_plain_text}]

    if channel.artist.attachment_cover:
        data["itunes:image"] = [
            {"href": channel.artist.attachment_cover.download_url_original}
        ]
    else:
        placeholder_url = federation_utils.full_url(
            static("images/podcasts-cover-placeholder.png")
        )
        data["itunes:image"] = [{"href": placeholder_url}]

    tagged_items = getattr(channel.artist, "_prefetched_tagged_items", [])

    if tagged_items:
        data["itunes:keywords"] = [
            {"value": " ".join([ti.tag.name for ti in tagged_items])}
        ]

    return data


def rss_serialize_channel_full(channel, uploads):
    channel_data = rss_serialize_channel(channel)
    channel_data["item"] = [rss_serialize_item(upload) for upload in uploads]
    return {"channel": channel_data}


# OPML stuff
def get_opml_outline(channel):
    return {
        "title": channel.artist.name,
        "text": channel.artist.name,
        "type": "rss",
        "xmlUrl": channel.get_rss_url(),
        "htmlUrl": channel.actor.url,
    }


def get_opml(channels, date, title):
    return {
        "version": "2.0",
        "head": [{"date": [{"value": rfc822_date(date)}], "title": [{"value": title}]}],
        "body": [{"outline": [get_opml_outline(channel) for channel in channels]}],
    }
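The serializers above build nested dicts (lists of children, `"value"` for text, plain strings for attributes) that a renderer later turns into XML. The converter below is an illustrative stand-in for that convention, not the project's actual `renderers.dict_to_xml_tree`; it sketches how a `get_opml`-shaped dict could map onto an element tree.

```python
import xml.etree.ElementTree as ET

def dict_to_xml(tag, payload):
    # illustrative converter: keys map to child lists; a child containing
    # only "value" becomes a text element, plain strings become attributes
    el = ET.Element(tag)
    for key, children in payload.items():
        if isinstance(children, str):
            el.set(key, children)
            continue
        for child in children:
            if set(child) <= {"value"}:
                sub = ET.SubElement(el, key)
                sub.text = str(child.get("value", ""))
            else:
                el.append(dict_to_xml(key, child))
    return el

opml = {
    "version": "2.0",
    "head": [{"title": [{"value": "Funkwhale channels OPML export"}]}],
    "body": [{"outline": [{"title": "Example", "text": "Example", "type": "rss",
                           "xmlUrl": "https://example.com/rss",
                           "htmlUrl": "https://example.com"}]}],
}
xml_body = ET.tostring(dict_to_xml("opml", opml), encoding="unicode")
```

The resulting `xml_body` carries `version` as an attribute on `<opml>`, a `<title>` text node under `<head>`, and the outline fields as attributes on `<outline>`.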
@ -0,0 +1,107 @@
import urllib.parse

from django.conf import settings
from django.db.models import Q
from django.urls import reverse

from rest_framework import serializers

from funkwhale_api.common import preferences
from funkwhale_api.common import middleware
from funkwhale_api.common import utils
from funkwhale_api.federation import utils as federation_utils
from funkwhale_api.music import spa_views

from . import models


def channel_detail(query, redirect_to_ap):
    queryset = models.Channel.objects.filter(query).select_related(
        "artist__attachment_cover", "actor", "library"
    )
    try:
        obj = queryset.get()
    except models.Channel.DoesNotExist:
        return []

    if redirect_to_ap:
        raise middleware.ApiRedirect(obj.actor.fid)

    obj_url = utils.join_url(
        settings.FUNKWHALE_URL,
        utils.spa_reverse(
            "channel_detail", kwargs={"username": obj.actor.full_username}
        ),
    )
    metas = [
        {"tag": "meta", "property": "og:url", "content": obj_url},
        {"tag": "meta", "property": "og:title", "content": obj.artist.name},
        {"tag": "meta", "property": "og:type", "content": "profile"},
    ]

    if obj.artist.attachment_cover:
        metas.append(
            {
                "tag": "meta",
                "property": "og:image",
                "content": obj.artist.attachment_cover.download_url_medium_square_crop,
            }
        )

    if preferences.get("federation__enabled"):
        metas.append(
            {
                "tag": "link",
                "rel": "alternate",
                "type": "application/activity+json",
                "href": obj.actor.fid,
            }
        )

    metas.append(
        {
            "tag": "link",
            "rel": "alternate",
            "type": "application/rss+xml",
            "href": obj.get_rss_url(),
            "title": "{} - RSS Podcast Feed".format(obj.artist.name),
        },
    )

    if obj.library.uploads.all().playable_by(None).exists():
        metas.append(
            {
                "tag": "link",
                "rel": "alternate",
                "type": "application/json+oembed",
                "href": (
                    utils.join_url(settings.FUNKWHALE_URL, reverse("api:v1:oembed"))
                    + "?format=json&url={}".format(urllib.parse.quote_plus(obj_url))
                ),
            }
        )
    # the twitter player card is also supported in various software
    metas += spa_views.get_twitter_card_metas(type="channel", id=obj.uuid)
    return metas


def channel_detail_uuid(request, uuid, redirect_to_ap):
    validator = serializers.UUIDField().to_internal_value
    try:
        uuid = validator(uuid)
    except serializers.ValidationError:
        return []
    return channel_detail(Q(uuid=uuid), redirect_to_ap)


def channel_detail_username(request, username, redirect_to_ap):
    validator = federation_utils.get_actor_data_from_username
    try:
        username_data = validator(username)
    except serializers.ValidationError:
        return []
    query = Q(
        actor__domain=username_data["domain"],
        actor__preferred_username__iexact=username_data["username"],
    )
    return channel_detail(query, redirect_to_ap)
@ -0,0 +1,51 @@
import datetime
import logging

from django.conf import settings
from django.db import transaction
from django.utils import timezone

from funkwhale_api.taskapp import celery

from . import models
from . import serializers

logger = logging.getLogger(__name__)


@celery.app.task(name="audio.fetch_rss_feeds")
def fetch_rss_feeds():
    limit = timezone.now() - datetime.timedelta(
        seconds=settings.PODCASTS_RSS_FEED_REFRESH_DELAY
    )
    candidates = (
        models.Channel.objects.external_rss()
        .filter(actor__last_fetch_date__lte=limit)
        .values_list("rss_url", flat=True)
    )

    total = len(candidates)
    logger.info("Refreshing %s rss feeds…", total)
    for url in candidates:
        fetch_rss_feed.delay(rss_url=url)


@celery.app.task(name="audio.fetch_rss_feed")
@transaction.atomic
def fetch_rss_feed(rss_url):
    channel = (
        models.Channel.objects.external_rss()
        .filter(rss_url=rss_url)
        .order_by("id")
        .first()
    )
    if not channel:
        logger.warning("Cannot refresh a non-external feed")
        return

    try:
        serializers.get_channel_from_rss_url(rss_url)
    except serializers.BlockedFeedException:
        # the channel was blocked since the last fetch, so delete it
        logger.info("Deleting blocked channel linked to %s", rss_url)
        channel.delete()
@ -0,0 +1,322 @@
from rest_framework import decorators
from rest_framework import exceptions
from rest_framework import mixins
from rest_framework import permissions as rest_permissions
from rest_framework import response
from rest_framework import viewsets

from django import http
from django.db import transaction
from django.db.models import Count, Prefetch, Q
from django.utils import timezone

from funkwhale_api.common import locales
from funkwhale_api.common import permissions
from funkwhale_api.common import preferences
from funkwhale_api.common import utils as common_utils
from funkwhale_api.common.mixins import MultipleLookupDetailMixin
from funkwhale_api.federation import actors
from funkwhale_api.federation import models as federation_models
from funkwhale_api.federation import routes
from funkwhale_api.federation import tasks as federation_tasks
from funkwhale_api.federation import utils as federation_utils
from funkwhale_api.music import models as music_models
from funkwhale_api.music import views as music_views
from funkwhale_api.users.oauth import permissions as oauth_permissions

from . import categories, filters, models, renderers, serializers

ARTIST_PREFETCH_QS = (
    music_models.Artist.objects.select_related("description", "attachment_cover",)
    .prefetch_related(music_views.TAG_PREFETCH)
    .annotate(_tracks_count=Count("tracks"))
)


class ChannelsMixin(object):
    def dispatch(self, request, *args, **kwargs):
        if not preferences.get("audio__channels_enabled"):
            return http.HttpResponse(status=405)
        return super().dispatch(request, *args, **kwargs)


class ChannelViewSet(
    ChannelsMixin,
    MultipleLookupDetailMixin,
    mixins.CreateModelMixin,
    mixins.RetrieveModelMixin,
    mixins.UpdateModelMixin,
    mixins.ListModelMixin,
    mixins.DestroyModelMixin,
    viewsets.GenericViewSet,
):
    url_lookups = [
        {
            "lookup_field": "uuid",
            "validator": serializers.serializers.UUIDField().to_internal_value,
        },
        {
            "lookup_field": "username",
            "validator": federation_utils.get_actor_data_from_username,
            "get_query": lambda v: Q(
                actor__domain=v["domain"],
                actor__preferred_username__iexact=v["username"],
            ),
        },
    ]
    filterset_class = filters.ChannelFilter
    serializer_class = serializers.ChannelSerializer
    queryset = (
        models.Channel.objects.all()
        .prefetch_related(
            "library",
            "attributed_to",
            "actor",
            Prefetch("artist", queryset=ARTIST_PREFETCH_QS),
        )
        .order_by("-creation_date")
    )
    permission_classes = [
        oauth_permissions.ScopePermission,
        permissions.OwnerPermission,
    ]
    required_scope = "libraries"
    anonymous_policy = "setting"
    owner_checks = ["write"]
    owner_field = "attributed_to.user"
    owner_exception = exceptions.PermissionDenied

    def get_serializer_class(self):
        if self.request.method.lower() in ["head", "get", "options"]:
            return serializers.ChannelSerializer
        elif self.action in ["update", "partial_update"]:
            return serializers.ChannelUpdateSerializer
        return serializers.ChannelCreateSerializer

    def perform_create(self, serializer):
        return serializer.save(attributed_to=self.request.user.actor)

    def list(self, request, *args, **kwargs):
        if self.request.GET.get("output") == "opml":
            queryset = self.filter_queryset(self.get_queryset())[:500]
            opml = serializers.get_opml(
                channels=queryset,
                date=timezone.now(),
                title="Funkwhale channels OPML export",
            )
            xml_body = renderers.render_xml(renderers.dict_to_xml_tree("opml", opml))
            return http.HttpResponse(xml_body, content_type="application/xml")

        else:
            return super().list(request, *args, **kwargs)

    def get_object(self):
        obj = super().get_object()
        if (
            self.action == "retrieve"
            and self.request.GET.get("refresh", "").lower() == "true"
        ):
            obj = music_views.refetch_obj(obj, self.get_queryset())
        return obj

    @decorators.action(
        detail=True,
        methods=["post"],
        permission_classes=[rest_permissions.IsAuthenticated],
    )
    def subscribe(self, request, *args, **kwargs):
        object = self.get_object()
        subscription = federation_models.Follow(actor=request.user.actor)
        subscription.fid = subscription.get_federation_id()
        subscription, created = SubscriptionsViewSet.queryset.get_or_create(
            target=object.actor,
            actor=request.user.actor,
            defaults={
                "approved": True,
                "fid": subscription.fid,
                "uuid": subscription.uuid,
            },
        )
        # refetch through the viewset queryset to restore the prefetched relations
        subscription = SubscriptionsViewSet.queryset.get(pk=subscription.pk)
        if not object.actor.is_local:
            routes.outbox.dispatch({"type": "Follow"}, context={"follow": subscription})

        data = serializers.SubscriptionSerializer(subscription).data
        return response.Response(data, status=201)

    @decorators.action(
        detail=True,
        methods=["post", "delete"],
        permission_classes=[rest_permissions.IsAuthenticated],
    )
    def unsubscribe(self, request, *args, **kwargs):
        object = self.get_object()
        follow_qs = request.user.actor.emitted_follows.filter(target=object.actor)
        follow = follow_qs.first()
        if follow:
            if not object.actor.is_local:
                routes.outbox.dispatch(
                    {"type": "Undo", "object": {"type": "Follow"}},
                    context={"follow": follow},
                )
            follow_qs.delete()
        return response.Response(status=204)

    @decorators.action(
        detail=True,
        methods=["get"],
        content_negotiation_class=renderers.PodcastRSSContentNegociation,
    )
    def rss(self, request, *args, **kwargs):
        object = self.get_object()
        if not object.attributed_to.is_local:
            return response.Response({"detail": "Not found"}, status=404)

        if object.attributed_to == actors.get_service_actor():
            # external feed, we redirect to the canonical one
            return http.HttpResponseRedirect(object.rss_url)

        uploads = (
            object.library.uploads.playable_by(None)
            .prefetch_related(
                Prefetch(
                    "track",
                    queryset=music_models.Track.objects.select_related(
                        "attachment_cover", "description"
                    ).prefetch_related(music_views.TAG_PREFETCH,),
                ),
            )
            .select_related("track__attachment_cover", "track__description")
            .order_by("-creation_date")
        )[:50]
        data = serializers.rss_serialize_channel_full(channel=object, uploads=uploads)
        return response.Response(data, status=200)

    @decorators.action(
        methods=["get"],
        detail=False,
        url_path="metadata-choices",
        url_name="metadata_choices",
        permission_classes=[],
    )
    def metadata_choices(self, request, *args, **kwargs):
        data = {
            "language": [
                {"value": code, "label": name} for code, name in locales.ISO_639_CHOICES
            ],
            "itunes_category": [
                {"value": code, "label": code, "children": children}
                for code, children in categories.ITUNES_CATEGORIES.items()
            ],
        }
        return response.Response(data)

    @decorators.action(
        methods=["post"],
        detail=False,
        url_path="rss-subscribe",
        url_name="rss_subscribe",
    )
    @transaction.atomic
    def rss_subscribe(self, request, *args, **kwargs):
        serializer = serializers.RssSubscribeSerializer(data=request.data)
        if not serializer.is_valid():
            return response.Response(serializer.errors, status=400)
        channel = (
            models.Channel.objects.filter(rss_url=serializer.validated_data["url"],)
            .order_by("id")
            .first()
        )
        if not channel:
            # try to retrieve the channel via its URL and create it
            try:
                channel, uploads = serializers.get_channel_from_rss_url(
                    serializer.validated_data["url"]
                )
            except serializers.FeedFetchException as e:
                return response.Response({"detail": str(e)}, status=400,)

        subscription = federation_models.Follow(actor=request.user.actor)
        subscription.fid = subscription.get_federation_id()
        subscription, created = SubscriptionsViewSet.queryset.get_or_create(
            target=channel.actor,
            actor=request.user.actor,
            defaults={
                "approved": True,
                "fid": subscription.fid,
                "uuid": subscription.uuid,
            },
        )
        # refetch through the viewset queryset to restore the prefetched relations
        subscription = SubscriptionsViewSet.queryset.get(pk=subscription.pk)

        return response.Response(
            serializers.SubscriptionSerializer(subscription).data, status=201
        )

    def get_serializer_context(self):
        context = super().get_serializer_context()
        context["subscriptions_count"] = self.action in [
            "retrieve",
            "create",
            "update",
            "partial_update",
        ]
        if self.request.user.is_authenticated:
            context["actor"] = self.request.user.actor
        return context

    @transaction.atomic
    def perform_destroy(self, instance):
        instance.__class__.objects.filter(pk=instance.pk).delete()
        common_utils.on_commit(
            federation_tasks.remove_actor.delay, actor_id=instance.actor.pk
        )


class SubscriptionsViewSet(
    ChannelsMixin,
    mixins.RetrieveModelMixin,
    mixins.ListModelMixin,
    viewsets.GenericViewSet,
):
    lookup_field = "uuid"
    serializer_class = serializers.SubscriptionSerializer
    queryset = (
        federation_models.Follow.objects.exclude(target__channel__isnull=True)
        .prefetch_related(
            "target__channel__library",
            "target__channel__attributed_to",
            "actor",
            Prefetch("target__channel__artist", queryset=ARTIST_PREFETCH_QS),
        )
        .order_by("-creation_date")
    )
    permission_classes = [
        oauth_permissions.ScopePermission,
        rest_permissions.IsAuthenticated,
    ]
    required_scope = "libraries"
    anonymous_policy = False

    def get_queryset(self):
        qs = super().get_queryset()
        return qs.filter(actor=self.request.user.actor)

    @decorators.action(methods=["get"], detail=False)
    def all(self, request, *args, **kwargs):
        """
        Return all the subscriptions of the current user, with only limited data
        to have a performant endpoint and avoid lots of queries just to display
        subscription status in the UI
        """
        subscriptions = list(
            self.get_queryset().values_list("uuid", "target__channel__uuid")
        )

        payload = {
            "results": [{"uuid": str(u[0]), "channel": u[1]} for u in subscriptions],
            "count": len(subscriptions),
        }
        return response.Response(payload, status=200)
@@ -0,0 +1,65 @@
import click
import functools


@click.group()
def cli():
    pass


def confirm_action(f, id_var, message_template="Do you want to proceed?"):
    @functools.wraps(f)
    def action(*args, **kwargs):
        if id_var:
            id_value = kwargs[id_var]
            message = message_template.format(len(id_value))
        else:
            message = message_template
        if not kwargs.pop("no_input", False) and not click.confirm(message, abort=True):
            return

        return f(*args, **kwargs)

    return action


def delete_command(
    group,
    id_var="id",
    name="rm",
    message_template="Do you want to delete {} objects? This action is irreversible.",
):
    """
    Wrap a command to ensure it asks for confirmation before deletion, unless the --no-input
    flag is provided
    """

    def decorator(f):
        decorated = click.option("--no-input", is_flag=True)(f)
        decorated = confirm_action(
            decorated, id_var=id_var, message_template=message_template
        )
        return group.command(name)(decorated)

    return decorator


def update_command(
    group,
    id_var="id",
    name="set",
    message_template="Do you want to update {} objects? This action may have irreversible consequences.",
):
    """
    Wrap a command to ensure it asks for confirmation before update, unless the --no-input
    flag is provided
    """

    def decorator(f):
        decorated = click.option("--no-input", is_flag=True)(f)
        decorated = confirm_action(
            decorated, id_var=id_var, message_template=message_template
        )
        return group.command(name)(decorated)

    return decorator
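The `confirm_action`/`delete_command` pattern above boils down to wrapping a function with a confirmation prompt that a `--no-input` flag can bypass. Below is a minimal, click-free sketch of the same idea; the `ask` parameter is an illustrative addition (the real code uses `click.confirm`):

```python
import functools


def confirm_action(f, id_var, message_template, ask=input):
    """Wrap ``f`` so it asks for confirmation unless ``no_input=True`` is passed."""

    @functools.wraps(f)
    def action(*args, **kwargs):
        ids = kwargs.get(id_var, [])
        message = message_template.format(len(ids))
        if not kwargs.pop("no_input", False):
            # Abort silently unless the operator answers "y"
            if ask(message + " [y/N] ").strip().lower() != "y":
                return None
        return f(*args, **kwargs)

    return action


def delete(id):
    return len(id)


# Simulate an operator answering "n" at the prompt
delete = confirm_action(
    delete, id_var="id", message_template="Delete {} objects?", ask=lambda m: "n"
)

refused = delete(id=[1, 2, 3])                 # prompt answered "n": nothing happens
forced = delete(id=[1, 2, 3], no_input=True)   # skips the prompt entirely
```

This mirrors why `delete_command` applies the `--no-input` option before wrapping: the wrapper pops the flag from `kwargs` so the wrapped command never sees it.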
@@ -0,0 +1,50 @@
import click

from funkwhale_api.music import tasks

from . import base


def handler_add_tags_from_tracks(
    artists=False, albums=False,
):
    result = None
    if artists:
        result = tasks.artists_set_tags_from_tracks()
    elif albums:
        result = tasks.albums_set_tags_from_tracks()
    else:
        raise click.BadOptionUsage("You must specify artists or albums")

    if result is None:
        click.echo("  No relevant tags found")
    else:
        click.echo("  Relevant tags added to {} objects".format(len(result)))


@base.cli.group()
def albums():
    """Manage albums"""
    pass


@base.cli.group()
def artists():
    """Manage artists"""
    pass


@albums.command(name="add-tags-from-tracks")
def albums_add_tags_from_tracks():
    """
    Associate tags to albums with no genre tags, assuming identical tags are found on the album tracks
    """
    handler_add_tags_from_tracks(albums=True)


@artists.command(name="add-tags-from-tracks")
def artists_add_tags_from_tracks():
    """
    Associate tags to artists with no genre tags, assuming identical tags are found on the artist tracks
    """
    handler_add_tags_from_tracks(artists=True)
@@ -0,0 +1,20 @@
import click
import sys

from . import base
from . import library  # noqa
from . import users  # noqa

from rest_framework.exceptions import ValidationError


def invoke():
    try:
        return base.cli()
    except ValidationError as e:
        click.secho("Invalid data:", fg="red")
        for field, errors in e.detail.items():
            click.secho("  {}:".format(field), fg="red")
            for error in errors:
                click.secho("    - {}".format(error), fg="red")
        sys.exit(1)
@@ -0,0 +1,234 @@
import click

from django.db import transaction

from funkwhale_api.federation import models as federation_models
from funkwhale_api.users import models
from funkwhale_api.users import serializers
from funkwhale_api.users import tasks

from . import base
from . import utils


class FakeRequest(object):
    def __init__(self, session={}):
        self.session = session


@transaction.atomic
def handler_create_user(
    username,
    password,
    email,
    is_superuser=False,
    is_staff=False,
    permissions=[],
    upload_quota=None,
):
    serializer = serializers.RS(
        data={
            "username": username,
            "email": email,
            "password1": password,
            "password2": password,
        }
    )
    utils.logger.debug("Validating user data…")
    serializer.is_valid(raise_exception=True)

    # Override email validation, we assume accounts created from CLI have a valid email
    request = FakeRequest(session={"account_verified_email": email})
    utils.logger.debug("Creating user…")
    user = serializer.save(request=request)
    utils.logger.debug("Setting permissions and other attributes…")
    user.is_staff = is_staff
    user.upload_quota = upload_quota
    user.is_superuser = is_superuser
    for permission in permissions:
        if permission in models.PERMISSIONS:
            utils.logger.debug("Setting %s permission to True", permission)
            setattr(user, "permission_{}".format(permission), True)
        else:
            utils.logger.warn("Unknown permission %s", permission)
    utils.logger.debug("Creating actor…")
    user.actor = models.create_actor(user)
    user.save()
    return user


@transaction.atomic
def handler_delete_user(usernames, soft=True):
    for username in usernames:
        click.echo("Deleting {}…".format(username))
        actor = None
        user = None
        try:
            user = models.User.objects.get(username=username)
        except models.User.DoesNotExist:
            try:
                actor = federation_models.Actor.objects.local().get(
                    preferred_username=username
                )
            except federation_models.Actor.DoesNotExist:
                click.echo("  Not found, skipping")
                continue

        actor = actor or user.actor
        if user:
            tasks.delete_account(user_id=user.pk)
        if not soft:
            click.echo("  Hard delete, removing actor")
            actor.delete()
        click.echo("  Done")


@transaction.atomic
def handler_update_user(usernames, kwargs):
    users = models.User.objects.filter(username__in=usernames)
    total = users.count()
    if not total:
        click.echo("No matching users")
        return

    final_kwargs = {}
    supported_fields = [
        "is_active",
        "permission_moderation",
        "permission_library",
        "permission_settings",
        "is_staff",
        "is_superuser",
        "upload_quota",
        "password",
    ]
    for field in supported_fields:
        try:
            value = kwargs[field]
        except KeyError:
            continue
        final_kwargs[field] = value

    click.echo(
        "Updating {} on {} matching users…".format(
            ", ".join(final_kwargs.keys()), total
        )
    )
    if "password" in final_kwargs:
        new_password = final_kwargs.pop("password")
        for user in users:
            user.set_password(new_password)
        models.User.objects.bulk_update(users, ["password"])
    if final_kwargs:
        users.update(**final_kwargs)
    click.echo("Done!")


@base.cli.group()
def users():
    """Manage users"""
    pass


@users.command()
@click.option("--username", "-u", prompt=True, required=True)
@click.option(
    "-p",
    "--password",
    prompt="Password (leave empty to have a random one generated)",
    hide_input=True,
    envvar="FUNKWHALE_CLI_USER_PASSWORD",
    default="",
    help="If empty, a random password will be generated and displayed in console output",
)
@click.option(
    "-e",
    "--email",
    prompt=True,
    help="Email address to associate with the account",
    required=True,
)
@click.option(
    "-q",
    "--upload-quota",
    help="Upload quota (leave empty to use default pod quota)",
    required=False,
    default=None,
    type=click.INT,
)
@click.option(
    "--superuser/--no-superuser", default=False,
)
@click.option(
    "--staff/--no-staff", default=False,
)
@click.option(
    "--permission", multiple=True,
)
def create(username, password, email, superuser, staff, permission, upload_quota):
    """Create a new user"""
    generated_password = None
    if password == "":
        generated_password = models.User.objects.make_random_password()
    user = handler_create_user(
        username=username,
        password=password or generated_password,
        email=email,
        is_superuser=superuser,
        is_staff=staff,
        permissions=permission,
        upload_quota=upload_quota,
    )
    click.echo("User {} created!".format(user.username))
    if generated_password:
        click.echo("  Generated password: {}".format(generated_password))


@base.delete_command(group=users, id_var="username")
@click.argument("username", nargs=-1)
@click.option(
    "--hard/--no-hard",
    default=False,
    help="Purge all user-related info (allow recreating a user with the same username)",
)
def delete(username, hard):
    """Delete given users"""
    handler_delete_user(usernames=username, soft=not hard)


@base.update_command(group=users, id_var="username")
@click.argument("username", nargs=-1)
@click.option(
    "--active/--inactive",
    help="Mark as active or inactive (inactive users cannot login or use the service)",
    default=None,
)
@click.option("--superuser/--no-superuser", default=None)
@click.option("--staff/--no-staff", default=None)
@click.option("--permission-library/--no-permission-library", default=None)
@click.option("--permission-moderation/--no-permission-moderation", default=None)
@click.option("--permission-settings/--no-permission-settings", default=None)
@click.option("--password", default=None, envvar="FUNKWHALE_CLI_USER_UPDATE_PASSWORD")
@click.option(
    "-q", "--upload-quota", type=click.INT,
)
def update(username, **kwargs):
    """Update attributes for given users"""
    field_mapping = {
        "active": "is_active",
        "superuser": "is_superuser",
        "staff": "is_staff",
    }
    final_kwargs = {}
    for cli_field, value in kwargs.items():
        if value is None:
            continue
        model_field = (
            field_mapping[cli_field] if cli_field in field_mapping else cli_field
        )
        final_kwargs[model_field] = value

    if not final_kwargs:
        raise click.BadArgumentUsage("You need to update at least one attribute")

    handler_update_user(usernames=username, kwargs=final_kwargs)
@@ -0,0 +1,3 @@
import logging

logger = logging.getLogger("funkwhale_api.cli")
@@ -45,3 +45,20 @@ class MutationAdmin(ModelAdmin):
    search_fields = ["created_by__preferred_username"]
    list_filter = ["type", "is_approved", "is_applied"]
    actions = [apply]


@register(models.Attachment)
class AttachmentAdmin(ModelAdmin):
    list_display = [
        "uuid",
        "actor",
        "url",
        "file",
        "size",
        "mimetype",
        "creation_date",
        "last_fetch_date",
    ]
    list_select_related = True
    search_fields = ["actor__domain__name"]
    list_filter = ["mimetype"]
@@ -1,11 +1,91 @@
from django.conf import settings
from django.utils.encoding import smart_text
from django.utils.translation import ugettext as _

from django.core.cache import cache

from allauth.account.utils import send_email_confirmation
from oauth2_provider.contrib.rest_framework.authentication import (
    OAuth2Authentication as BaseOAuth2Authentication,
)
from rest_framework import exceptions
from rest_framework_jwt import authentication
from rest_framework_jwt.settings import api_settings


class JSONWebTokenAuthenticationQS(authentication.BaseJSONWebTokenAuthentication):
def should_verify_email(user):
    if user.is_superuser:
        return False
    has_unverified_email = not user.has_verified_primary_email
    mandatory_verification = settings.ACCOUNT_EMAIL_VERIFICATION != "optional"
    return has_unverified_email and mandatory_verification


class UnverifiedEmail(Exception):
    def __init__(self, user):
        self.user = user


def resend_confirmation_email(request, user):
    THROTTLE_DELAY = 500
    cache_key = "auth:resent-email-confirmation:{}".format(user.pk)
    if cache.get(cache_key):
        return False

    done = send_email_confirmation(request, user)
    cache.set(cache_key, True, THROTTLE_DELAY)
    return done


class OAuth2Authentication(BaseOAuth2Authentication):
    def authenticate(self, request):
        try:
            return super().authenticate(request)
        except UnverifiedEmail as e:
            request.oauth2_error = {"error": "unverified_email"}
            resend_confirmation_email(request, e.user)


class BaseJsonWebTokenAuth(object):
    def authenticate(self, request):
        try:
            return super().authenticate(request)
        except UnverifiedEmail as e:
            msg = _("You need to verify your email address.")
            resend_confirmation_email(request, e.user)
            raise exceptions.AuthenticationFailed(msg)

    def authenticate_credentials(self, payload):
        """
        We have to implement this method by hand to ensure we can check that the
        User has a verified email, if required
        """
        User = authentication.get_user_model()
        username = authentication.jwt_get_username_from_payload(payload)

        if not username:
            msg = _("Invalid payload.")
            raise exceptions.AuthenticationFailed(msg)

        try:
            user = User.objects.get_by_natural_key(username)
        except User.DoesNotExist:
            msg = _("Invalid signature.")
            raise exceptions.AuthenticationFailed(msg)

        if not user.is_active:
            msg = _("User account is disabled.")
            raise exceptions.AuthenticationFailed(msg)

        if should_verify_email(user):
            raise UnverifiedEmail(user)

        return user


class JSONWebTokenAuthenticationQS(
    BaseJsonWebTokenAuth, authentication.BaseJSONWebTokenAuthentication
):

    www_authenticate_realm = "api"
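`resend_confirmation_email` throttles re-sends with a short-lived cache key so repeated failed logins don't spam the user's inbox. A self-contained sketch of the same throttle, using a dict-based stand-in for Django's cache (`SimpleCache` and the `send` callback are illustrative):

```python
import time


class SimpleCache:
    """Tiny stand-in for Django's cache API: get/set with a TTL."""

    def __init__(self):
        self._data = {}

    def get(self, key):
        value, expires = self._data.get(key, (None, 0))
        return value if expires > time.monotonic() else None

    def set(self, key, value, timeout):
        self._data[key] = (value, time.monotonic() + timeout)


cache = SimpleCache()
THROTTLE_DELAY = 500  # seconds, as in the code above


def resend_confirmation_email(user_pk, send=lambda: True):
    cache_key = "auth:resent-email-confirmation:{}".format(user_pk)
    if cache.get(cache_key):
        return False  # sent recently, skip
    done = send()
    cache.set(cache_key, True, THROTTLE_DELAY)
    return done


first = resend_confirmation_email(42)   # actually "sends"
second = resend_confirmation_email(42)  # throttled by the cache key
```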
@@ -22,7 +102,9 @@ class JSONWebTokenAuthenticationQS(authentication.BaseJSONWebTokenAuthentication
    )


class BearerTokenHeaderAuth(authentication.BaseJSONWebTokenAuthentication):
class BearerTokenHeaderAuth(
    BaseJsonWebTokenAuth, authentication.BaseJSONWebTokenAuthentication
):
    """
    For backward compatibility purpose, we used Authorization: JWT <token>
    but Authorization: Bearer <token> is probably better.
@@ -65,7 +147,9 @@ class BearerTokenHeaderAuth(authentication.BaseJSONWebTokenAuthentication):
        return auth


class JSONWebTokenAuthentication(authentication.JSONWebTokenAuthentication):
class JSONWebTokenAuthentication(
    BaseJsonWebTokenAuth, authentication.JSONWebTokenAuthentication
):
    def authenticate(self, request):
        auth = super().authenticate(request)
@@ -0,0 +1,29 @@
import logging

from django_redis.client import default

logger = logging.getLogger(__name__)


class RedisClient(default.DefaultClient):
    def get(self, key, default=None, version=None, client=None):
        try:
            return super().get(key, default=default, version=version, client=client)
        except ValueError as e:
            if "unsupported pickle protocol" in str(e):
                # pickle deserialization error
                logger.warn("Error while deserializing pickle value from cache")
                return default
            else:
                raise

    def get_many(self, *args, **kwargs):
        try:
            return super().get_many(*args, **kwargs)
        except ValueError as e:
            if "unsupported pickle protocol" in str(e):
                # pickle deserialization error
                logger.warn("Error while deserializing pickle value from cache")
                return {}
            else:
                raise
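The `RedisClient` subclass degrades gracefully when a cached value was pickled by a newer Python than the one reading it: the error is logged and treated as a cache miss instead of crashing the request. The pattern, isolated from django-redis (the backend callables here are illustrative):

```python
def safe_get(get_func, key, default=None):
    """Call a cache getter, returning ``default`` on stale pickle payloads."""
    try:
        return get_func(key)
    except ValueError as e:
        if "unsupported pickle protocol" in str(e):
            # value was written by a newer Python; treat it as a cache miss
            return default
        raise


def broken_backend(key):
    # simulates a value pickled with a protocol this interpreter lacks
    raise ValueError("unsupported pickle protocol: 5")


def working_backend(key):
    return {"key": key}


miss = safe_get(broken_backend, "x", default="fallback")
hit = safe_get(working_backend, "x")
```

Only the specific "unsupported pickle protocol" failure is swallowed; any other `ValueError` still propagates.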
@@ -8,6 +8,7 @@ from django.core.serializers.json import DjangoJSONEncoder
logger = logging.getLogger(__name__)
channel_layer = get_channel_layer()
group_add = async_to_sync(channel_layer.group_add)
group_discard = async_to_sync(channel_layer.group_discard)


def group_send(group, event):
@@ -14,7 +14,11 @@ class JsonAuthConsumer(JsonWebsocketConsumer):

    def accept(self):
        super().accept()
        for group in self.groups:
            channels.group_add(group, self.channel_name)
        for group in self.scope["user"].get_channels_groups():
        groups = self.scope["user"].get_channels_groups() + self.groups
        for group in groups:
            channels.group_add(group, self.channel_name)

    def disconnect(self, close_code):
        groups = self.scope["user"].get_channels_groups() + self.groups
        for group in groups:
            channels.group_discard(group, self.channel_name)
@@ -16,10 +16,22 @@ class MutationFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
    class Meta:
        model = "common.Mutation"

    @factory.post_generation
    def target(self, create, extracted, **kwargs):
        if not create:
            # Simple build, do nothing.
            return
        self.target = extracted
        self.save()


@registry.register
class AttachmentFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
    url = factory.Faker("federation_url")
    uuid = factory.Faker("uuid4")
    actor = factory.SubFactory(federation_factories.ActorFactory)
    file = factory.django.ImageField()

    class Meta:
        model = "common.Attachment"


@registry.register
class CommonFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
    text = factory.Faker("paragraph")
    content_type = "text/plain"

    class Meta:
        model = "common.Content"
@@ -1,5 +1,6 @@
import django_filters
from django import forms
from django.conf import settings
from django.core.serializers.json import DjangoJSONEncoder
from django.db import models
@@ -33,12 +34,18 @@ def privacy_level_query(user, lookup_field="privacy_level", user_field="user"):
class SearchFilter(django_filters.CharFilter):
    def __init__(self, *args, **kwargs):
        self.search_fields = kwargs.pop("search_fields")
        self.fts_search_fields = kwargs.pop("fts_search_fields", [])
        super().__init__(*args, **kwargs)

    def filter(self, qs, value):
        if not value:
            return qs
        query = search.get_query(value, self.search_fields)
        if settings.USE_FULL_TEXT_SEARCH and self.fts_search_fields:
            query = search.get_fts_query(
                value, self.fts_search_fields, model=self.parent.Meta.model
            )
        else:
            query = search.get_query(value, self.search_fields)
        return qs.filter(query)
@@ -168,3 +168,50 @@ class MutationFilter(filters.FilterSet):
    class Meta:
        model = models.Mutation
        fields = ["is_approved", "is_applied", "type"]


class ActorScopeFilter(filters.CharFilter):
    def __init__(self, *args, **kwargs):
        self.actor_field = kwargs.pop("actor_field")
        super().__init__(*args, **kwargs)

    def filter(self, queryset, value):
        from funkwhale_api.federation import models as federation_models

        if not value:
            return queryset

        request = getattr(self.parent, "request", None)
        if not request:
            return queryset.none()

        user = getattr(request, "user", None)
        qs = queryset
        if value.lower() == "me":
            qs = self.filter_me(user=user, queryset=queryset)
        elif value.lower() == "all":
            return queryset
        elif value.lower().startswith("actor:"):
            full_username = value.split("actor:", 1)[1]
            username, domain = full_username.split("@")
            try:
                actor = federation_models.Actor.objects.get(
                    preferred_username=username, domain_id=domain,
                )
            except federation_models.Actor.DoesNotExist:
                return queryset.none()

            return queryset.filter(**{self.actor_field: actor})
        else:
            return queryset.none()

        if self.distinct:
            qs = qs.distinct()
        return qs

    def filter_me(self, user, queryset):
        actor = getattr(user, "actor", None)
        if not actor:
            return queryset.none()

        return queryset.filter(**{self.actor_field: actor})
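`ActorScopeFilter.filter` recognizes three scope forms, case-insensitively: `me` (restrict to the requesting user's actor), `all` (no restriction), and `actor:username@domain` (a specific federated actor); anything else matches nothing. A sketch of just that value parsing, separated from the ORM (`parse_scope` is an illustrative name):

```python
def parse_scope(value):
    """Classify a scope value the way ActorScopeFilter.filter does."""
    value = value.lower()
    if value == "me":
        return ("me", None)
    if value == "all":
        return ("all", None)
    if value.startswith("actor:"):
        full_username = value.split("actor:", 1)[1]
        username, domain = full_username.split("@")
        return ("actor", (username, domain))
    # unknown scopes match nothing (queryset.none() in the filter)
    return ("none", None)


me = parse_scope("me")
actor = parse_scope("actor:alice@pod.example")
unknown = parse_scope("bogus")
```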
@@ -0,0 +1,191 @@
# from https://gist.github.com/carlopires/1262033/c52ef0f7ce4f58108619508308372edd8d0bd518

ISO_639_CHOICES = [
    ("ab", "Abkhaz"),
    ("aa", "Afar"),
    ("af", "Afrikaans"),
    ("ak", "Akan"),
    ("sq", "Albanian"),
    ("am", "Amharic"),
    ("ar", "Arabic"),
    ("an", "Aragonese"),
    ("hy", "Armenian"),
    ("as", "Assamese"),
    ("av", "Avaric"),
    ("ae", "Avestan"),
    ("ay", "Aymara"),
    ("az", "Azerbaijani"),
    ("bm", "Bambara"),
    ("ba", "Bashkir"),
    ("eu", "Basque"),
    ("be", "Belarusian"),
    ("bn", "Bengali"),
    ("bh", "Bihari"),
    ("bi", "Bislama"),
    ("bs", "Bosnian"),
    ("br", "Breton"),
    ("bg", "Bulgarian"),
    ("my", "Burmese"),
    ("ca", "Catalan; Valencian"),
    ("ch", "Chamorro"),
    ("ce", "Chechen"),
    ("ny", "Chichewa; Chewa; Nyanja"),
    ("zh", "Chinese"),
    ("cv", "Chuvash"),
    ("kw", "Cornish"),
    ("co", "Corsican"),
    ("cr", "Cree"),
    ("hr", "Croatian"),
    ("cs", "Czech"),
    ("da", "Danish"),
    ("dv", "Divehi; Maldivian;"),
    ("nl", "Dutch"),
    ("dz", "Dzongkha"),
    ("en", "English"),
    ("eo", "Esperanto"),
    ("et", "Estonian"),
    ("ee", "Ewe"),
    ("fo", "Faroese"),
    ("fj", "Fijian"),
    ("fi", "Finnish"),
    ("fr", "French"),
    ("ff", "Fula"),
    ("gl", "Galician"),
    ("ka", "Georgian"),
    ("de", "German"),
    ("el", "Greek, Modern"),
    ("gn", "Guaraní"),
    ("gu", "Gujarati"),
    ("ht", "Haitian"),
    ("ha", "Hausa"),
    ("he", "Hebrew (modern)"),
    ("hz", "Herero"),
    ("hi", "Hindi"),
    ("ho", "Hiri Motu"),
    ("hu", "Hungarian"),
    ("ia", "Interlingua"),
    ("id", "Indonesian"),
    ("ie", "Interlingue"),
    ("ga", "Irish"),
    ("ig", "Igbo"),
    ("ik", "Inupiaq"),
    ("io", "Ido"),
    ("is", "Icelandic"),
    ("it", "Italian"),
    ("iu", "Inuktitut"),
    ("ja", "Japanese"),
    ("jv", "Javanese"),
    ("kl", "Kalaallisut"),
    ("kn", "Kannada"),
    ("kr", "Kanuri"),
    ("ks", "Kashmiri"),
    ("kk", "Kazakh"),
    ("km", "Khmer"),
    ("ki", "Kikuyu, Gikuyu"),
    ("rw", "Kinyarwanda"),
    ("ky", "Kirghiz, Kyrgyz"),
    ("kv", "Komi"),
    ("kg", "Kongo"),
    ("ko", "Korean"),
    ("ku", "Kurdish"),
    ("kj", "Kwanyama, Kuanyama"),
    ("la", "Latin"),
    ("lb", "Luxembourgish"),
    ("lg", "Luganda"),
    ("li", "Limburgish"),
    ("ln", "Lingala"),
    ("lo", "Lao"),
    ("lt", "Lithuanian"),
    ("lu", "Luba-Katanga"),
    ("lv", "Latvian"),
    ("gv", "Manx"),
    ("mk", "Macedonian"),
    ("mg", "Malagasy"),
    ("ms", "Malay"),
    ("ml", "Malayalam"),
    ("mt", "Maltese"),
    ("mi", "Māori"),
    ("mr", "Marathi (Marāṭhī)"),
    ("mh", "Marshallese"),
    ("mn", "Mongolian"),
    ("na", "Nauru"),
    ("nv", "Navajo, Navaho"),
    ("nb", "Norwegian Bokmål"),
    ("nd", "North Ndebele"),
    ("ne", "Nepali"),
    ("ng", "Ndonga"),
    ("nn", "Norwegian Nynorsk"),
    ("no", "Norwegian"),
    ("ii", "Nuosu"),
    ("nr", "South Ndebele"),
    ("oc", "Occitan"),
    ("oj", "Ojibwe, Ojibwa"),
    ("cu", "Old Church Slavonic"),
    ("om", "Oromo"),
    ("or", "Oriya"),
    ("os", "Ossetian, Ossetic"),
    ("pa", "Panjabi, Punjabi"),
    ("pi", "Pāli"),
    ("fa", "Persian"),
    ("pl", "Polish"),
    ("ps", "Pashto, Pushto"),
    ("pt", "Portuguese"),
    ("qu", "Quechua"),
    ("rm", "Romansh"),
    ("rn", "Kirundi"),
    ("ro", "Romanian, Moldavan"),
    ("ru", "Russian"),
    ("sa", "Sanskrit (Saṁskṛta)"),
    ("sc", "Sardinian"),
    ("sd", "Sindhi"),
    ("se", "Northern Sami"),
    ("sm", "Samoan"),
    ("sg", "Sango"),
    ("sr", "Serbian"),
    ("gd", "Scottish Gaelic"),
    ("sn", "Shona"),
    ("si", "Sinhala, Sinhalese"),
    ("sk", "Slovak"),
    ("sl", "Slovene"),
    ("so", "Somali"),
    ("st", "Southern Sotho"),
    ("es", "Spanish; Castilian"),
    ("su", "Sundanese"),
    ("sw", "Swahili"),
    ("ss", "Swati"),
    ("sv", "Swedish"),
    ("ta", "Tamil"),
    ("te", "Telugu"),
    ("tg", "Tajik"),
    ("th", "Thai"),
    ("ti", "Tigrinya"),
    ("bo", "Tibetan"),
    ("tk", "Turkmen"),
    ("tl", "Tagalog"),
    ("tn", "Tswana"),
    ("to", "Tonga"),
    ("tr", "Turkish"),
    ("ts", "Tsonga"),
    ("tt", "Tatar"),
    ("tw", "Twi"),
    ("ty", "Tahitian"),
    ("ug", "Uighur, Uyghur"),
    ("uk", "Ukrainian"),
    ("ur", "Urdu"),
    ("uz", "Uzbek"),
    ("ve", "Venda"),
    ("vi", "Vietnamese"),
    ("vo", "Volapük"),
    ("wa", "Walloon"),
    ("cy", "Welsh"),
    ("wo", "Wolof"),
    ("fy", "Western Frisian"),
    ("xh", "Xhosa"),
    ("yi", "Yiddish"),
    ("yo", "Yoruba"),
    ("za", "Zhuang, Chuang"),
    ("zu", "Zulu"),
]


ISO_639_BY_CODE = {code: name for code, name in ISO_639_CHOICES}
@@ -1,7 +1,10 @@
import html
import logging
import io
import requests
import os
import re
import time
import urllib.parse
import xml.sax.saxutils

from django import http
@@ -10,12 +13,17 @@ from django.core.cache import caches
from django import urls
from rest_framework import views

from funkwhale_api.federation import utils as federation_utils

from . import preferences
from . import session
from . import throttling
from . import utils

EXCLUDED_PATHS = ["/api", "/federation", "/.well-known"]

logger = logging.getLogger(__name__)


def should_fallback_to_spa(path):
    if path == "/":
@@ -26,6 +34,13 @@ def should_fallback_to_spa(path):
def serve_spa(request):
    html = get_spa_html(settings.FUNKWHALE_SPA_HTML_ROOT)
    head, tail = html.split("</head>", 1)
    if settings.FUNKWHALE_SPA_REWRITE_MANIFEST:
        new_url = (
            settings.FUNKWHALE_SPA_REWRITE_MANIFEST_URL
            or federation_utils.full_url(urls.reverse("api:v1:instance:spa-manifest"))
        )
        head = replace_manifest_url(head, new_url)

    if not preferences.get("common__api_authentication_required"):
        try:
            request_tags = get_request_head_tags(request) or []
@@ -66,20 +81,34 @@ def serve_spa(request):
    return http.HttpResponse(head + tail)


MANIFEST_LINK_REGEX = re.compile(r"<link [^>]*rel=(?:'|\")?manifest(?:'|\")?[^>]*>")


def replace_manifest_url(head, new_url):
    replacement = '<link rel=manifest href="{}">'.format(new_url)
    head = MANIFEST_LINK_REGEX.sub(replacement, head)
    return head


def get_spa_html(spa_url):
    return get_spa_file(spa_url, "index.html")


def get_spa_file(spa_url, name):
    if spa_url.startswith("/"):
        # XXX: spa_url is an absolute path to index.html, on the local disk.
        # However, we may want to access manifest.json or other files as well, so we
        # strip the filename
        path = os.path.join(os.path.dirname(spa_url), name)
        # we try to open a local file
        with open(spa_url) as f:
            return f.read()
    cache_key = "spa-html:{}".format(spa_url)
        with open(path, "rb") as f:
            return f.read().decode("utf-8")
    cache_key = "spa-file:{}:{}".format(spa_url, name)
    cached = caches["local"].get(cache_key)
    if cached:
        return cached

    response = requests.get(
        utils.join_url(spa_url, "index.html"),
        verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
    )
    response = session.get_session().get(utils.join_url(spa_url, name),)
    response.raise_for_status()
    content = response.text
    caches["local"].set(cache_key, content, settings.FUNKWHALE_SPA_HTML_CACHE_DURATION)
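`replace_manifest_url` swaps whatever `<link rel=manifest>` tag the SPA's `index.html` ships for one pointing at the pod's own manifest endpoint, tolerating single-quoted, double-quoted, and unquoted attribute values. The regex rewrite can be exercised in isolation (the pod URL below is illustrative):

```python
import re

MANIFEST_LINK_REGEX = re.compile(r"<link [^>]*rel=(?:'|\")?manifest(?:'|\")?[^>]*>")


def replace_manifest_url(head, new_url):
    # replace the whole <link ...> tag, whatever attributes it carried
    replacement = '<link rel=manifest href="{}">'.format(new_url)
    return MANIFEST_LINK_REGEX.sub(replacement, head)


head = '<head><link rel="manifest" href="/manifest.json"></head>'
rewritten = replace_manifest_url(head, "https://pod.example/spa-manifest.json")
```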
@@ -135,8 +164,16 @@ def render_tags(tags):


def get_request_head_tags(request):
    accept_header = request.headers.get("Accept") or None
    redirect_to_ap = (
        False
        if not accept_header
        else not federation_utils.should_redirect_ap_to_html(accept_header)
    )
    match = urls.resolve(request.path, urlconf=settings.SPA_URLCONF)
    return match.func(request, *match.args, **match.kwargs)
    return match.func(
        request, *match.args, redirect_to_ap=redirect_to_ap, **match.kwargs
    )


def get_custom_css():
@@ -147,6 +184,31 @@ def get_custom_css():
    return xml.sax.saxutils.escape(css)


class ApiRedirect(Exception):
    def __init__(self, url):
        self.url = url


def get_api_response(request, url):
    """
    Quite ugly but we have no choice. When Accept header is set to application/activity+json
    some clients expect to get a JSON payload (instead of the HTML we return). Since
    redirecting to the URL does not work (because it makes the signature verification fail),
    we grab the internal view corresponding to the URL, call it and return this as the
    response
    """
    path = urllib.parse.urlparse(url).path

    try:
        match = urls.resolve(path)
    except urls.exceptions.Resolver404:
        return http.HttpResponseNotFound()
    response = match.func(request, *match.args, **match.kwargs)
    if hasattr(response, "render"):
        response.render()
    return response


class SPAFallbackMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response
@@ -155,7 +217,10 @@ class SPAFallbackMiddleware:
        response = self.get_response(request)

        if response.status_code == 404 and should_fallback_to_spa(request.path):
            return serve_spa(request)
            try:
                return serve_spa(request)
            except ApiRedirect as e:
                return get_api_response(request, e.url)

        return response
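The try/except around `serve_spa` above uses an exception as a control-flow signal: the SPA renderer raises `ApiRedirect` and the middleware converts it into an internal API response. A minimal framework-free sketch of that pattern (the `handle`, `renderer` and `api_dispatch` names are illustrative, not Funkwhale's actual API):

```python
class ApiRedirect(Exception):
    """Raised by the SPA renderer when the request should be served by the API."""

    def __init__(self, url):
        self.url = url


def handle(path, renderer, api_dispatch):
    # Mirrors SPAFallbackMiddleware: try the SPA first, then fall back
    # to an internal API call when the renderer asks for it.
    try:
        return renderer(path)
    except ApiRedirect as e:
        return api_dispatch(e.url)


def fake_renderer(path):
    # Hypothetical renderer: JSON-looking paths are punted to the API.
    if path.endswith(".json"):
        raise ApiRedirect("/api" + path)
    return "<html>spa</html>"
```

Dispatching internally (instead of issuing an HTTP redirect) keeps the original request object, which is what preserves HTTP signature verification.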
@@ -245,6 +310,17 @@ class ThrottleStatusMiddleware:
        return response


class VerboseBadRequestsMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        if response.status_code == 400:
            logger.warning("Bad request: %s", response.content)
        return response


class ProfilerMiddleware:
    """
    from https://github.com/omarish/django-cprofile-middleware/blob/master/django_cprofile_middleware/middleware.py
@@ -0,0 +1,35 @@
# Generated by Django 2.2.6 on 2019-11-11 13:38

import django.contrib.postgres.fields.jsonb
import django.core.serializers.json
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
import funkwhale_api.common.models
import funkwhale_api.common.validators
import uuid
import versatileimagefield.fields


class Migration(migrations.Migration):

    dependencies = [
        ('common', '0003_cit_extension'),
    ]

    operations = [
        migrations.CreateModel(
            name='Attachment',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('url', models.URLField(max_length=500, unique=True, null=True)),
                ('uuid', models.UUIDField(db_index=True, default=uuid.uuid4, unique=True)),
                ('creation_date', models.DateTimeField(default=django.utils.timezone.now)),
                ('last_fetch_date', models.DateTimeField(blank=True, null=True)),
                ('size', models.IntegerField(blank=True, null=True)),
                ('mimetype', models.CharField(blank=True, max_length=200, null=True)),
                ('file', versatileimagefield.fields.VersatileImageField(max_length=255, upload_to=funkwhale_api.common.models.get_file_path, validators=[funkwhale_api.common.validators.ImageDimensionsValidator(min_height=50, min_width=50), funkwhale_api.common.validators.FileValidator(allowed_extensions=['png', 'jpg', 'jpeg'], max_size=5242880)])),
                ('actor', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='attachments', to='federation.Actor', null=True)),
            ],
        ),
    ]
@@ -0,0 +1,37 @@
# Generated by Django 2.2.7 on 2019-11-25 14:21

import django.contrib.postgres.fields.jsonb
import django.core.serializers.json
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('common', '0004_auto_20191111_1338'),
    ]

    operations = [
        migrations.AlterField(
            model_name='mutation',
            name='payload',
            field=django.contrib.postgres.fields.jsonb.JSONField(encoder=django.core.serializers.json.DjangoJSONEncoder),
        ),
        migrations.AlterField(
            model_name='mutation',
            name='previous_state',
            field=django.contrib.postgres.fields.jsonb.JSONField(default=None, encoder=django.core.serializers.json.DjangoJSONEncoder, null=True),
        ),
        migrations.CreateModel(
            name='MutationAttachment',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('attachment', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='mutation_attachment', to='common.Attachment')),
                ('mutation', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='mutation_attachment', to='common.Mutation')),
            ],
            options={
                'unique_together': {('attachment', 'mutation')},
            },
        ),
    ]
@@ -0,0 +1,21 @@
# Generated by Django 2.2.7 on 2020-01-13 10:14

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('common', '0005_auto_20191125_1421'),
    ]

    operations = [
        migrations.CreateModel(
            name='Content',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('text', models.CharField(blank=True, max_length=5000, null=True)),
                ('content_type', models.CharField(max_length=100)),
            ],
        ),
    ]
@@ -0,0 +1,18 @@
# Generated by Django 2.2.9 on 2020-01-16 16:10

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('common', '0006_content'),
    ]

    operations = [
        migrations.AlterField(
            model_name='attachment',
            name='url',
            field=models.URLField(max_length=500, null=True),
        ),
    ]
@@ -0,0 +1,34 @@
from rest_framework import serializers

from django.db.models import Q
from django.shortcuts import get_object_or_404


class MultipleLookupDetailMixin(object):
    lookup_value_regex = "[^/]+"
    lookup_field = "composite"

    def get_object(self):
        queryset = self.filter_queryset(self.get_queryset())

        relevant_lookup = None
        value = None
        for lookup in self.url_lookups:
            field_validator = lookup["validator"]
            try:
                value = field_validator(self.kwargs["composite"])
            except serializers.ValidationError:
                continue
            else:
                relevant_lookup = lookup
                break
        get_query = relevant_lookup.get(
            "get_query", lambda value: Q(**{relevant_lookup["lookup_field"]: value})
        )
        query = get_query(value)
        obj = get_object_or_404(queryset, query)

        # May raise a permission denied
        self.check_object_permissions(self.request, obj)

        return obj
@@ -1,4 +1,6 @@
import uuid
import magic
import mimetypes

from django.contrib.postgres.fields import JSONField
from django.contrib.contenttypes.fields import GenericForeignKey

@@ -9,11 +11,26 @@ from django.db import connections, models, transaction
from django.db.models import Lookup
from django.db.models.fields import Field
from django.db.models.sql.compiler import SQLCompiler
from django.dispatch import receiver
from django.utils import timezone
from django.urls import reverse

from versatileimagefield.fields import VersatileImageField
from versatileimagefield.image_warmer import VersatileImageFieldWarmer

from funkwhale_api.federation import utils as federation_utils

from . import utils
from . import validators


CONTENT_TEXT_MAX_LENGTH = 5000
CONTENT_TEXT_SUPPORTED_TYPES = [
    "text/html",
    "text/markdown",
    "text/plain",
]


@Field.register_lookup
class NotEqual(Lookup):
@@ -150,3 +167,199 @@ class Mutation(models.Model):
        self.applied_date = timezone.now()
        self.save(update_fields=["is_applied", "applied_date", "previous_state"])
        return previous_state


def get_file_path(instance, filename):
    return utils.ChunkedPath("attachments")(instance, filename)


class AttachmentQuerySet(models.QuerySet):
    def attached(self, include=True):
        related_fields = [
            "covered_album",
            "mutation_attachment",
            "covered_track",
            "covered_artist",
            "iconed_actor",
        ]
        query = None
        for field in related_fields:
            field_query = ~models.Q(**{field: None})
            query = query | field_query if query else field_query

        if include is False:
            query = ~query

        return self.filter(query)

    def local(self, include=True):
        if include:
            return self.filter(actor__domain_id=settings.FEDERATION_HOSTNAME)
        else:
            return self.exclude(actor__domain_id=settings.FEDERATION_HOSTNAME)


class Attachment(models.Model):
    # Remote URL where the attachment can be fetched
    url = models.URLField(max_length=500, null=True, blank=True)
    uuid = models.UUIDField(unique=True, db_index=True, default=uuid.uuid4)
    # Actor associated with the attachment
    actor = models.ForeignKey(
        "federation.Actor",
        related_name="attachments",
        on_delete=models.CASCADE,
        null=True,
    )
    creation_date = models.DateTimeField(default=timezone.now)
    last_fetch_date = models.DateTimeField(null=True, blank=True)
    # File size
    size = models.IntegerField(null=True, blank=True)
    mimetype = models.CharField(null=True, blank=True, max_length=200)

    file = VersatileImageField(
        upload_to=get_file_path,
        max_length=255,
        validators=[
            validators.ImageDimensionsValidator(min_width=50, min_height=50),
            validators.FileValidator(
                allowed_extensions=["png", "jpg", "jpeg"], max_size=1024 * 1024 * 5,
            ),
        ],
    )

    objects = AttachmentQuerySet.as_manager()

    def save(self, **kwargs):
        if self.file and not self.size:
            self.size = self.file.size

        if self.file and not self.mimetype:
            self.mimetype = self.guess_mimetype()

        return super().save()

    @property
    def is_local(self):
        return federation_utils.is_local(self.fid)

    def guess_mimetype(self):
        f = self.file
        b = min(1000000, f.size)
        t = magic.from_buffer(f.read(b), mime=True)
        if not t.startswith("image/"):
            # failure, we try guessing by extension
            mt, _ = mimetypes.guess_type(f.name)
            if mt:
                t = mt
        return t

    @property
    def download_url_original(self):
        if self.file:
            return utils.media_url(self.file.url)
        proxy_url = reverse("api:v1:attachments-proxy", kwargs={"uuid": self.uuid})
        return federation_utils.full_url(proxy_url + "?next=original")

    @property
    def download_url_medium_square_crop(self):
        if self.file:
            return utils.media_url(self.file.crop["200x200"].url)
        proxy_url = reverse("api:v1:attachments-proxy", kwargs={"uuid": self.uuid})
        return federation_utils.full_url(proxy_url + "?next=medium_square_crop")


class MutationAttachment(models.Model):
    """
    When using attachments in mutations, we need to keep a reference to
    the attachment to ensure it is not pruned by common/tasks.py.

    This is what this model does.
    """

    attachment = models.OneToOneField(
        Attachment, related_name="mutation_attachment", on_delete=models.CASCADE
    )
    mutation = models.OneToOneField(
        Mutation, related_name="mutation_attachment", on_delete=models.CASCADE
    )

    class Meta:
        unique_together = ("attachment", "mutation")


class Content(models.Model):
    """
    Text content that can be associated with other models, like a description or a summary.
    """

    text = models.CharField(max_length=CONTENT_TEXT_MAX_LENGTH, blank=True, null=True)
    content_type = models.CharField(max_length=100)

    @property
    def rendered(self):
        from . import utils

        return utils.render_html(self.text, self.content_type)

    @property
    def as_plain_text(self):
        from . import utils

        return utils.render_plain_text(self.rendered)

    def truncate(self, length):
        text = self.as_plain_text
        truncated = text[:length]
        if len(truncated) < len(text):
            truncated += "…"

        return truncated


@receiver(models.signals.post_save, sender=Attachment)
def warm_attachment_thumbnails(sender, instance, **kwargs):
    if not instance.file or not settings.CREATE_IMAGE_THUMBNAILS:
        return
    warmer = VersatileImageFieldWarmer(
        instance_or_queryset=instance,
        rendition_key_set="attachment_square",
        image_attr="file",
    )
    num_created, failed_to_create = warmer.warm()


@receiver(models.signals.post_save, sender=Mutation)
def trigger_mutation_post_init(sender, instance, created, **kwargs):
    if not created:
        return

    from . import mutations

    try:
        conf = mutations.registry.get_conf(instance.type, instance.target)
    except mutations.ConfNotFound:
        return
    serializer = conf["serializer_class"]()
    try:
        handler = serializer.mutation_post_init
    except AttributeError:
        return
    handler(instance)


CONTENT_FKS = {
    "music.Track": ["description"],
    "music.Album": ["description"],
    "music.Artist": ["description"],
}


@receiver(models.signals.post_delete, sender=None)
def remove_attached_content(sender, instance, **kwargs):
    fk_fields = CONTENT_FKS.get(instance._meta.label, [])
    for field in fk_fields:
        if getattr(instance, "{}_id".format(field)):
            try:
                getattr(instance, field).delete()
            except Content.DoesNotExist:
                pass
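`Content.truncate` cuts the plain-text rendering at a given length and appends an ellipsis only when something was actually removed. The logic is self-contained enough to sketch without Django (a hypothetical standalone `truncate` helper):

```python
def truncate(text, length):
    # Mirrors Content.truncate: cut at `length` characters and append an
    # ellipsis only when characters were actually dropped.
    truncated = text[:length]
    if len(truncated) < len(text):
        truncated += "…"
    return truncated
```

Comparing lengths (rather than checking `len(text) > length`) is equivalent here, but reads naturally as "did the slice lose anything?".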
@@ -85,9 +85,6 @@ class MutationSerializer(serializers.Serializer):


class UpdateMutationSerializer(serializers.ModelSerializer, MutationSerializer):
    serialized_relations = {}
    previous_state_handlers = {}

    def __init__(self, *args, **kwargs):
        # we force partial mode, because update mutations are partial
        kwargs.setdefault("partial", True)

@@ -106,13 +103,14 @@ class UpdateMutationSerializer(serializers.ModelSerializer, MutationSerializer):
        return super().validate(validated_data)

    def db_serialize(self, validated_data):
        serialized_relations = self.get_serialized_relations()
        data = {}
        # ensure model fields are serialized properly
        for key, value in list(validated_data.items()):
            if not isinstance(value, models.Model):
                data[key] = value
                continue
            field = self.serialized_relations[key]
            field = serialized_relations[key]
            data[key] = getattr(value, field)
        return data

@@ -121,7 +119,7 @@ class UpdateMutationSerializer(serializers.ModelSerializer, MutationSerializer):
        # we use our serialized_relations configuration
        # to ensure we store ids instead of model instances in our json
        # payload
        for field, attr in self.serialized_relations.items():
        for field, attr in self.get_serialized_relations().items():
            try:
                obj = data[field]
            except KeyError:

@@ -140,10 +138,16 @@ class UpdateMutationSerializer(serializers.ModelSerializer, MutationSerializer):
        return get_update_previous_state(
            obj,
            *list(validated_data.keys()),
            serialized_relations=self.serialized_relations,
            handlers=self.previous_state_handlers,
            serialized_relations=self.get_serialized_relations(),
            handlers=self.get_previous_state_handlers(),
        )

    def get_serialized_relations(self):
        return {}

    def get_previous_state_handlers(self):
        return {}


def get_update_previous_state(obj, *fields, serialized_relations={}, handlers={}):
    if not fields:
@@ -1,6 +1,8 @@
import operator

from django.core.exceptions import ObjectDoesNotExist
from django.http import Http404

from rest_framework.permissions import BasePermission

from funkwhale_api.common import preferences

@@ -46,7 +48,12 @@ class OwnerPermission(BasePermission):
            return True

        owner_field = getattr(view, "owner_field", "user")
        owner = operator.attrgetter(owner_field)(obj)
        owner_exception = getattr(view, "owner_exception", Http404)
        try:
            owner = operator.attrgetter(owner_field)(obj)
        except ObjectDoesNotExist:
            raise owner_exception

        if not owner or not request.user.is_authenticated or owner != request.user:
            raise Http404
            raise owner_exception
        return True
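The `OwnerPermission` change above relies on `operator.attrgetter` to resolve the view's `owner_field` as a dotted path across relations. A small standalone illustration (the `Library`/`Upload` classes are hypothetical stand-ins for Django models):

```python
import operator


class Library:
    def __init__(self, user):
        self.user = user


class Upload:
    def __init__(self, library):
        self.library = library


# owner_field may be a dotted path such as "library.user";
# operator.attrgetter resolves the whole chain in one call.
upload = Upload(Library(user="alice"))
owner = operator.attrgetter("library.user")(upload)
```

Wrapping that call in try/except, as the diff does, lets a view substitute its own `owner_exception` when an intermediate relation is missing.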
@@ -1,4 +1,7 @@
import json

from django import forms
from django.contrib.postgres.forms import JSONField
from django.conf import settings
from dynamic_preferences import serializers, types
from dynamic_preferences.registries import global_preferences_registry

@@ -57,3 +60,48 @@ class StringListPreference(types.BasePreferenceType):
        d = super(StringListPreference, self).get_api_additional_data()
        d["choices"] = self.get("choices")
        return d


class JSONSerializer(serializers.BaseSerializer):
    required = True

    @classmethod
    def to_db(cls, value, **kwargs):
        if not cls.required and value is None:
            return json.dumps(value)
        data_serializer = cls.data_serializer_class(data=value)
        if not data_serializer.is_valid():
            raise cls.exception(
                "{} is not a valid value: {}".format(value, data_serializer.errors)
            )
        value = data_serializer.validated_data
        try:
            return json.dumps(value, sort_keys=True)
        except TypeError:
            raise cls.exception(
                "Cannot serialize, value {} is not JSON serializable".format(value)
            )

    @classmethod
    def to_python(cls, value, **kwargs):
        return json.loads(value)


class SerializedPreference(types.BasePreferenceType):
    """
    A preference that stores arbitrary JSON and validates it using a
    rest_framework serializer
    """

    serializer = JSONSerializer
    data_serializer_class = None
    field_class = JSONField
    widget = forms.Textarea

    @property
    def serializer(self):
        class _internal(JSONSerializer):
            data_serializer_class = self.data_serializer_class
            required = self.get("required")

        return _internal
@@ -4,11 +4,12 @@ Compute different sizes of image used for Album covers and User avatars

from versatileimagefield.image_warmer import VersatileImageFieldWarmer

from funkwhale_api.music.models import Album
from funkwhale_api.users.models import User
from funkwhale_api.common.models import Attachment


MODELS = [(Album, "cover", "square"), (User, "avatar", "square")]
MODELS = [
    (Attachment, "file", "attachment_square"),
]


def main(command, **kwargs):
@@ -1,7 +1,10 @@
import re

from django.contrib.postgres.search import SearchQuery
from django.db.models import Q

from . import utils


QUERY_REGEX = re.compile(r'(((?P<key>\w+):)?(?P<value>"[^"]+"|[\S]+))')

@@ -56,6 +59,51 @@ def get_query(query_string, search_fields):
    return query


def get_fts_query(query_string, fts_fields=["body_text"], model=None):
    if query_string.startswith('"') and query_string.endswith('"'):
        # we pass the query directly to the FTS engine
        query_string = query_string[1:-1]
    else:
        parts = query_string.replace(":", "").split(" ")
        parts = ["{}:*".format(p) for p in parts if p]
        if not parts:
            return Q(pk=None)

        query_string = "&".join(parts)

    if not fts_fields or not query_string.strip():
        return Q(pk=None)
    query = None
    for field in fts_fields:
        if "__" in field and model:
            # When we have a nested lookup, we switch to a subquery for enhanced performance
            fk_field_name, lookup = (
                field.split("__")[0],
                "__".join(field.split("__")[1:]),
            )
            fk_field = model._meta.get_field(fk_field_name)
            related_model = fk_field.related_model
            subquery = related_model.objects.filter(
                **{
                    lookup: SearchQuery(
                        query_string, search_type="raw", config="english_nostop"
                    )
                }
            ).values_list("pk", flat=True)
            new_query = Q(**{"{}__in".format(fk_field_name): list(subquery)})
        else:
            new_query = Q(
                **{
                    field: SearchQuery(
                        query_string, search_type="raw", config="english_nostop"
                    )
                }
            )
        query = utils.join_queries_or(query, new_query)

    return query


def filter_tokens(tokens, valid):
    return [t for t in tokens if t["key"] in valid]
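Because `get_fts_query` uses `search_type="raw"`, the user's input must first be turned into valid `tsquery` syntax: quoted input is passed through verbatim, anything else becomes AND-ed prefix matches with colons stripped (a colon would otherwise be parsed as tsquery syntax). The preprocessing can be isolated as a pure function (a sketch; the real code builds a Django `Q` object around this):

```python
def to_raw_tsquery(query_string):
    # Mirrors the preprocessing in get_fts_query: quoted queries go to the
    # FTS engine verbatim, other input becomes AND-ed prefix matches.
    if query_string.startswith('"') and query_string.endswith('"'):
        return query_string[1:-1]
    parts = query_string.replace(":", "").split(" ")
    parts = ["{}:*".format(p) for p in parts if p]
    return "&".join(parts)
```

So `hello world` becomes the raw tsquery `hello:*&world:*`, matching any row whose indexed text contains words starting with both prefixes.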
@@ -11,6 +11,7 @@ from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _

from . import models
from . import utils


class RelatedField(serializers.RelatedField):

@@ -23,9 +24,11 @@ class RelatedField(serializers.RelatedField):
        self.related_field_name = related_field_name
        self.serializer = serializer
        self.filters = kwargs.pop("filters", None)
        kwargs["queryset"] = kwargs.pop(
            "queryset", self.serializer.Meta.model.objects.all()
        )
        self.queryset_filter = kwargs.pop("queryset_filter", None)
        try:
            kwargs["queryset"] = kwargs.pop("queryset")
        except KeyError:
            kwargs["queryset"] = self.serializer.Meta.model.objects.all()
        super().__init__(**kwargs)

    def get_filters(self, data):

@@ -34,10 +37,16 @@ class RelatedField(serializers.RelatedField):
            filters.update(self.filters(self.context))
        return filters

    def filter_queryset(self, queryset):
        if self.queryset_filter:
            queryset = self.queryset_filter(queryset, self.context)
        return queryset

    def to_internal_value(self, data):
        try:
            queryset = self.get_queryset()
            filters = self.get_filters(data)
            queryset = self.filter_queryset(queryset)
            return queryset.get(**filters)
        except ObjectDoesNotExist:
            self.fail(

@@ -68,6 +77,7 @@ class RelatedField(serializers.RelatedField):
                    self.display_value(item),
                )
                for item in queryset
                if self.serializer
            ]
        )

@@ -210,7 +220,7 @@ class StripExifImageField(serializers.ImageField):
        with io.BytesIO() as output:
            image_without_exif.save(
                output,
                format=PIL.Image.EXTENSION[os.path.splitext(file_obj.name)[-1]],
                format=PIL.Image.EXTENSION[os.path.splitext(file_obj.name)[-1].lower()],
                quality=100,
            )
        content = output.getvalue()

@@ -272,3 +282,60 @@ class APIMutationSerializer(serializers.ModelSerializer):
        if value not in self.context["registry"]:
            raise serializers.ValidationError("Invalid mutation type {}".format(value))
        return value


class AttachmentSerializer(serializers.Serializer):
    uuid = serializers.UUIDField(read_only=True)
    size = serializers.IntegerField(read_only=True)
    mimetype = serializers.CharField(read_only=True)
    creation_date = serializers.DateTimeField(read_only=True)
    file = StripExifImageField(write_only=True)
    urls = serializers.SerializerMethodField()

    def get_urls(self, o):
        urls = {}
        urls["source"] = o.url
        urls["original"] = o.download_url_original
        urls["medium_square_crop"] = o.download_url_medium_square_crop
        return urls

    def to_representation(self, o):
        repr = super().to_representation(o)
        # XXX: BACKWARD COMPATIBILITY
        # having the attachment urls in a nested JSON obj is better,
        # but we can't do this without breaking clients
        # So we extract the urls and include these in the parent payload
        repr.update({k: v for k, v in repr["urls"].items() if k != "source"})
        # also, our legacy images had lots of variations (400x400, 200x200, 50x50)
        # but we removed some of these, so we emulate these by hand (by redirecting)
        # to actual, existing attachment variations
        repr["square_crop"] = repr["medium_square_crop"]
        repr["small_square_crop"] = repr["medium_square_crop"]
        return repr

    def create(self, validated_data):
        return models.Attachment.objects.create(
            file=validated_data["file"], actor=validated_data["actor"]
        )


class ContentSerializer(serializers.Serializer):
    text = serializers.CharField(max_length=models.CONTENT_TEXT_MAX_LENGTH)
    content_type = serializers.ChoiceField(choices=models.CONTENT_TEXT_SUPPORTED_TYPES,)
    html = serializers.SerializerMethodField()

    def get_html(self, o):
        return utils.render_html(o.text, o.content_type)


class NullToEmptDict(object):
    def get_attribute(self, o):
        attr = super().get_attribute(o)
        if attr is None:
            return {}
        return attr

    def to_representation(self, v):
        if not v:
            return v
        return super().to_representation(v)
@@ -4,6 +4,13 @@ from django.conf import settings

import funkwhale_api


class FunkwhaleSession(requests.Session):
    def request(self, *args, **kwargs):
        kwargs.setdefault("verify", settings.EXTERNAL_REQUESTS_VERIFY_SSL)
        kwargs.setdefault("timeout", settings.EXTERNAL_REQUESTS_TIMEOUT)
        return super().request(*args, **kwargs)


def get_user_agent():
    return "python-requests (funkwhale/{}; +{})".format(
        funkwhale_api.__version__, settings.FUNKWHALE_URL

@@ -11,6 +18,6 @@ def get_user_agent():


def get_session():
    s = requests.Session()
    s = FunkwhaleSession()
    s.headers["User-Agent"] = get_user_agent()
    return s
@@ -1,14 +1,23 @@
import datetime
import logging
import tempfile

from django.conf import settings
from django.core.files import File
from django.db import transaction
from django.dispatch import receiver

from django.utils import timezone

from funkwhale_api.common import channels
from funkwhale_api.taskapp import celery

from . import models
from . import serializers
from . import session
from . import signals

logger = logging.getLogger(__name__)


@celery.app.task(name="common.apply_mutation")
@transaction.atomic

@@ -57,3 +66,36 @@ def broadcast_mutation_update(mutation, old_is_approved, new_is_approved, **kwar
            },
        },
    )


def fetch_remote_attachment(attachment, filename=None, save=True):
    if attachment.file:
        # already there, no need to fetch
        return

    s = session.get_session()
    attachment.last_fetch_date = timezone.now()
    with tempfile.TemporaryFile() as tf:
        with s.get(attachment.url, timeout=5, stream=True) as r:
            for chunk in r.iter_content(chunk_size=1024 * 100):
                tf.write(chunk)
            tf.seek(0)
            if not filename:
                filename = attachment.url.split("/")[-1]
                filename = filename[-50:]
            attachment.file.save(filename, File(tf), save=save)


@celery.app.task(name="common.prune_unattached_attachments")
def prune_unattached_attachments():
    limit = timezone.now() - datetime.timedelta(
        seconds=settings.ATTACHMENTS_UNATTACHED_PRUNE_DELAY
    )
    candidates = models.Attachment.objects.attached(False).filter(
        creation_date__lte=limit
    )

    total = candidates.count()
    logger.info("Deleting %s unattached attachments…", total)
    result = candidates.delete()
    logger.info("Deletion done: %s", result)
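`prune_unattached_attachments` deletes only attachments created before a cutoff derived from `ATTACHMENTS_UNATTACHED_PRUNE_DELAY`, so freshly uploaded files are never pruned while their upload flow is still in progress. The cutoff arithmetic is plain `datetime` (a standalone sketch; the helper name is illustrative):

```python
import datetime


def prune_cutoff(now, prune_delay_seconds):
    # Attachments with creation_date <= this instant are deletion candidates,
    # mirroring the creation_date__lte=limit filter in the task.
    return now - datetime.timedelta(seconds=prune_delay_seconds)


now = datetime.datetime(2020, 1, 1, 12, 0, 0)
cutoff = prune_cutoff(now, 3600 * 12)  # 12-hour grace period
```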
@@ -6,9 +6,9 @@ from rest_framework import throttling as rest_throttling
from django.conf import settings


def get_ident(request):
    if hasattr(request, "user") and request.user.is_authenticated:
        return {"type": "authenticated", "id": request.user.pk}
def get_ident(user, request):
    if user and user.is_authenticated:
        return {"type": "authenticated", "id": user.pk}
    ident = rest_throttling.BaseThrottle().get_ident(request)

    return {"type": "anonymous", "id": ident}

@@ -89,7 +89,7 @@ class FunkwhaleThrottle(rest_throttling.SimpleRateThrottle):

    def allow_request(self, request, view):
        self.request = request
        self.ident = get_ident(request)
        self.ident = get_ident(getattr(request, "user", None), request)
        action = getattr(view, "action", "*")
        view_scopes = getattr(view, "throttling_scopes", {})
        if view_scopes is None:
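The refactored `get_ident` takes the user explicitly instead of reaching into the request, which makes the branching trivially testable. A framework-free sketch of the same logic (the user classes here are stand-ins for Django's user objects):

```python
class AnonymousUser:
    is_authenticated = False


class AuthenticatedUser:
    is_authenticated = True

    def __init__(self, pk):
        self.pk = pk


def get_ident(user, remote_addr):
    # Mirrors the refactored helper: authenticated users are throttled
    # per account pk, everyone else per network address.
    if user and user.is_authenticated:
        return {"type": "authenticated", "id": user.pk}
    return {"type": "anonymous", "id": remote_addr}
```

Keying throttle buckets on `{"type": ..., "id": ...}` keeps authenticated and anonymous counters in separate namespaces even when a user and an IP would otherwise collide.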
@@ -1,5 +1,11 @@
import datetime

from django.core.files.base import ContentFile
from django.utils.deconstruct import deconstructible

import bleach.sanitizer
import logging
import markdown
import os
import shutil
import uuid

@@ -10,6 +16,9 @@ from urllib.parse import parse_qs, urlencode, urlsplit, urlunsplit
from django.conf import settings
from django import urls
from django.db import models, transaction
from django.utils import timezone

logger = logging.getLogger(__name__)


def rename_file(instance, field_name, new_name, allow_missing_file=False):

@@ -125,6 +134,17 @@ def join_url(start, end):
    return start + end


def media_url(path):
    if settings.MEDIA_URL.startswith("http://") or settings.MEDIA_URL.startswith(
        "https://"
    ):
        return join_url(settings.MEDIA_URL, path)

    from funkwhale_api.federation import utils as federation_utils

    return federation_utils.full_url(path)


def spa_reverse(name, args=[], kwargs={}):
    return urls.reverse(name, urlconf=settings.SPA_URLCONF, args=args, kwargs=kwargs)
@@ -228,9 +248,188 @@ def get_updated_fields(conf, data, obj):
            data_value = data[data_field]
        except KeyError:
            continue

        obj_value = getattr(obj, obj_field)
        if obj_value != data_value:
        if obj.pk:
            obj_value = getattr(obj, obj_field)
            if obj_value != data_value:
                final_data[obj_field] = data_value
        else:
            final_data[obj_field] = data_value

    return final_data


def join_queries_or(left, right):
    if left:
        return left | right
    else:
        return right


MARKDOWN_RENDERER = markdown.Markdown(extensions=settings.MARKDOWN_EXTENSIONS)


def render_markdown(text):
    return MARKDOWN_RENDERER.convert(text)


SAFE_TAGS = [
    "p",
    "a",
    "abbr",
    "acronym",
    "b",
    "blockquote",
    "code",
    "em",
    "i",
    "li",
    "ol",
    "strong",
    "ul",
]
HTML_CLEANER = bleach.sanitizer.Cleaner(strip=True, tags=SAFE_TAGS)

HTML_PERMISSIVE_CLEANER = bleach.sanitizer.Cleaner(
    strip=True,
    tags=SAFE_TAGS + ["h1", "h2", "h3", "h4", "h5", "h6", "div", "section", "article"],
    attributes=["class", "rel", "alt", "title"],
)

# support for additional tlds
# cf https://github.com/mozilla/bleach/issues/367#issuecomment-384631867
ALL_TLDS = set(settings.LINKIFIER_SUPPORTED_TLDS + bleach.linkifier.TLDS)
URL_RE = bleach.linkifier.build_url_re(tlds=sorted(ALL_TLDS, reverse=True))
HTML_LINKER = bleach.linkifier.Linker(url_re=URL_RE)


def clean_html(html, permissive=False):
    return (
        HTML_PERMISSIVE_CLEANER.clean(html) if permissive else HTML_CLEANER.clean(html)
    )


def render_html(text, content_type, permissive=False):
    if not text:
        return ""
    rendered = render_markdown(text)
    if content_type == "text/html":
        rendered = text
    elif content_type == "text/markdown":
        rendered = render_markdown(text)
    else:
        rendered = render_markdown(text)
    rendered = HTML_LINKER.linkify(rendered)
    return clean_html(rendered, permissive=permissive).strip().replace("\n", "")


def render_plain_text(html):
    if not html:
        return ""
    return bleach.clean(html, tags=[], strip=True)


def same_content(old, text=None, content_type=None):
    return old.text == text and old.content_type == content_type


@transaction.atomic
def attach_content(obj, field, content_data):
|
||||
from . import models
|
||||
|
||||
content_data = content_data or {}
|
||||
existing = getattr(obj, "{}_id".format(field))
|
||||
|
||||
if existing:
|
||||
if same_content(getattr(obj, field), **content_data):
|
||||
# optimization to avoid a delete/save if possible
|
||||
return getattr(obj, field)
|
||||
getattr(obj, field).delete()
|
||||
setattr(obj, field, None)
|
||||
|
||||
if not content_data:
|
||||
return
|
||||
|
||||
content_obj = models.Content.objects.create(
|
||||
text=content_data["text"][: models.CONTENT_TEXT_MAX_LENGTH],
|
||||
content_type=content_data["content_type"],
|
||||
)
|
||||
setattr(obj, field, content_obj)
|
||||
obj.save(update_fields=[field])
|
||||
return content_obj
|
||||
|
||||
|
||||
@transaction.atomic
|
||||
def attach_file(obj, field, file_data, fetch=False):
|
||||
from . import models
|
||||
from . import tasks
|
||||
|
||||
existing = getattr(obj, "{}_id".format(field))
|
||||
if existing:
|
||||
getattr(obj, field).delete()
|
||||
|
||||
if not file_data:
|
||||
return
|
||||
|
||||
if isinstance(file_data, models.Attachment):
|
||||
attachment = file_data
|
||||
else:
|
||||
extensions = {"image/jpeg": "jpg", "image/png": "png", "image/gif": "gif"}
|
||||
extension = extensions.get(file_data["mimetype"], "jpg")
|
||||
attachment = models.Attachment(mimetype=file_data["mimetype"])
|
||||
name_fields = ["uuid", "full_username", "pk"]
|
||||
name = [
|
||||
getattr(obj, field) for field in name_fields if getattr(obj, field, None)
|
||||
][0]
|
||||
filename = "{}-{}.{}".format(field, name, extension)
|
||||
if "url" in file_data:
|
||||
attachment.url = file_data["url"]
|
||||
else:
|
||||
f = ContentFile(file_data["content"])
|
||||
attachment.file.save(filename, f, save=False)
|
||||
|
||||
if not attachment.file and fetch:
|
||||
try:
|
||||
tasks.fetch_remote_attachment(attachment, filename=filename, save=False)
|
||||
except Exception as e:
|
||||
logger.warn(
|
||||
"Cannot download attachment at url %s: %s", attachment.url, e
|
||||
)
|
||||
attachment = None
|
||||
|
||||
if attachment:
|
||||
attachment.save()
|
||||
|
||||
setattr(obj, field, attachment)
|
||||
obj.save(update_fields=[field])
|
||||
return attachment
|
||||
|
||||
|
||||
def get_mimetype_from_ext(path):
|
||||
parts = path.lower().split(".")
|
||||
ext = parts[-1]
|
||||
match = {
|
||||
"jpeg": "image/jpeg",
|
||||
"jpg": "image/jpeg",
|
||||
"png": "image/png",
|
||||
"gif": "image/gif",
|
||||
}
|
||||
return match.get(ext)
|
||||
|
||||
|
||||
def get_audio_mimetype(mt):
|
||||
aliases = {"audio/x-mp3": "audio/mpeg", "audio/mpeg3": "audio/mpeg"}
|
||||
return aliases.get(mt, mt)
|
||||
|
||||
|
||||
def update_modification_date(obj, field="modification_date", date=None):
|
||||
IGNORE_DELAY = 60
|
||||
current_value = getattr(obj, field)
|
||||
date = date or timezone.now()
|
||||
ignore = current_value is not None and current_value < date - datetime.timedelta(
|
||||
seconds=IGNORE_DELAY
|
||||
)
|
||||
if ignore:
|
||||
setattr(obj, field, date)
|
||||
obj.__class__.objects.filter(pk=obj.pk).update(**{field: date})
|
||||
|
||||
return date
|
||||
|
|
|
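The `join_queries_or` helper introduced above (and reused by the federation code later in this diff) only relies on the `|` operator, so its behaviour can be sketched with plain sets standing in for Django `Q` objects:

```python
from functools import reduce

def join_queries_or(left, right):
    # mirror of the helper added in this diff: OR-combine two query
    # fragments, tolerating a None/empty left-hand side on the first fold
    if left:
        return left | right
    else:
        return right

# fold several optional "queries" (sets here) into one combined query
parts = [None, {1, 2}, {2, 3}, {5}]
combined = reduce(join_queries_or, parts)
print(sorted(combined))  # [1, 2, 3, 5]
```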
@@ -1,3 +1,4 @@
import logging
import time

from django.conf import settings
@@ -11,6 +12,8 @@ from rest_framework import response
from rest_framework import views
from rest_framework import viewsets

from funkwhale_api.users.oauth import permissions as oauth_permissions

from . import filters
from . import models
from . import mutations
@@ -21,6 +24,9 @@ from . import throttling
from . import utils


logger = logging.getLogger(__name__)


class SkipFilterForGetObject:
    def get_object(self, *args, **kwargs):
        setattr(self.request, "_skip_filters", True)
@@ -133,10 +139,74 @@ class RateLimitView(views.APIView):
    throttle_classes = []

    def get(self, request, *args, **kwargs):
        ident = throttling.get_ident(request)
        ident = throttling.get_ident(getattr(request, "user", None), request)
        data = {
            "enabled": settings.THROTTLING_ENABLED,
            "ident": ident,
            "scopes": throttling.get_status(ident, time.time()),
        }
        return response.Response(data, status=200)


class AttachmentViewSet(
    mixins.RetrieveModelMixin,
    mixins.CreateModelMixin,
    mixins.DestroyModelMixin,
    viewsets.GenericViewSet,
):
    lookup_field = "uuid"
    queryset = models.Attachment.objects.all()
    serializer_class = serializers.AttachmentSerializer
    permission_classes = [oauth_permissions.ScopePermission]
    required_scope = "libraries"
    anonymous_policy = "setting"

    @action(
        detail=True, methods=["get"], permission_classes=[], authentication_classes=[]
    )
    @transaction.atomic
    def proxy(self, request, *args, **kwargs):
        instance = self.get_object()
        if not settings.EXTERNAL_MEDIA_PROXY_ENABLED:
            r = response.Response(status=302)
            r["Location"] = instance.url
            return r

        size = request.GET.get("next", "original").lower()
        if size not in ["original", "medium_square_crop"]:
            size = "original"

        try:
            tasks.fetch_remote_attachment(instance)
        except Exception:
            logger.exception("Error while fetching attachment %s", instance.url)
            return response.Response(status=500)
        data = self.serializer_class(instance).data
        redirect = response.Response(status=302)
        redirect["Location"] = data["urls"][size]
        return redirect

    def perform_create(self, serializer):
        return serializer.save(actor=self.request.user.actor)

    def perform_destroy(self, instance):
        if instance.actor is None or instance.actor != self.request.user.actor:
            raise exceptions.PermissionDenied()
        instance.delete()


class TextPreviewView(views.APIView):
    permission_classes = []

    def post(self, request, *args, **kwargs):
        payload = request.data
        if "text" not in payload:
            return response.Response({"detail": "Invalid input"}, status=400)

        permissive = payload.get("permissive", False)
        data = {
            "rendered": utils.render_html(
                payload["text"], "text/markdown", permissive=permissive
            )
        }
        return response.Response(data, status=200)
@@ -1,4 +1,5 @@
from funkwhale_api.common import fields
from funkwhale_api.common import filters as common_filters
from funkwhale_api.moderation import filters as moderation_filters

from . import models
@@ -8,10 +9,11 @@ class TrackFavoriteFilter(moderation_filters.HiddenContentFilterSet):
    q = fields.SearchFilter(
        search_fields=["track__title", "track__artist__name", "track__album__title"]
    )
    scope = common_filters.ActorScopeFilter(actor_field="user__actor", distinct=True)

    class Meta:
        model = models.TrackFavorite
        fields = ["user", "q"]
        fields = ["user", "q", "scope"]
        hidden_content_fields_mapping = moderation_filters.USER_FILTER_CONFIG[
            "TRACK_FAVORITE"
        ]
@@ -22,7 +22,9 @@ class TrackFavoriteViewSet(

    filterset_class = filters.TrackFavoriteFilter
    serializer_class = serializers.UserTrackFavoriteSerializer
    queryset = models.TrackFavorite.objects.all().select_related("user__actor")
    queryset = models.TrackFavorite.objects.all().select_related(
        "user__actor__attachment_icon"
    )
    permission_classes = [
        oauth_permissions.ScopePermission,
        permissions.OwnerPermission,
@@ -54,7 +56,9 @@ class TrackFavoriteViewSet(
        )
        tracks = Track.objects.with_playable_uploads(
            music_utils.get_actor_from_request(self.request)
        ).select_related("artist", "album__artist", "attributed_to")
        ).select_related(
            "artist", "album__artist", "attributed_to", "album__attachment_cover"
        )
        queryset = queryset.prefetch_related(Prefetch("track", queryset=tracks))
        return queryset
@@ -118,7 +118,7 @@ def should_reject(fid, actor_id=None, payload={}):


@transaction.atomic
def receive(activity, on_behalf_of):
def receive(activity, on_behalf_of, inbox_actor=None):
    from . import models
    from . import serializers
    from . import tasks
@@ -131,7 +131,12 @@ def receive(activity, on_behalf_of):
    # we ensure the activity has the bare minimum structure before storing
    # it in our database
    serializer = serializers.BaseActivitySerializer(
        data=activity, context={"actor": on_behalf_of, "local_recipients": True}
        data=activity,
        context={
            "actor": on_behalf_of,
            "local_recipients": True,
            "recipients": [inbox_actor] if inbox_actor else [],
        },
    )
    serializer.is_valid(raise_exception=True)

@@ -159,16 +164,25 @@ def receive(activity, on_behalf_of):
        )
        return

    local_to_recipients = get_actors_from_audience(activity.get("to", []))
    local_to_recipients = local_to_recipients.exclude(user=None)
    local_to_recipients = get_actors_from_audience(
        serializer.validated_data.get("to", [])
    )
    local_to_recipients = local_to_recipients.local()
    local_to_recipients = local_to_recipients.values_list("pk", flat=True)
    local_to_recipients = list(local_to_recipients)
    if inbox_actor:
        local_to_recipients.append(inbox_actor.pk)

    local_cc_recipients = get_actors_from_audience(activity.get("cc", []))
    local_cc_recipients = local_cc_recipients.exclude(user=None)
    local_cc_recipients = get_actors_from_audience(
        serializer.validated_data.get("cc", [])
    )
    local_cc_recipients = local_cc_recipients.local()
    local_cc_recipients = local_cc_recipients.values_list("pk", flat=True)

    inbox_items = []
    for recipients, type in [(local_to_recipients, "to"), (local_cc_recipients, "cc")]:

        for r in recipients.values_list("pk", flat=True):
        for r in recipients:
            inbox_items.append(models.InboxItem(actor_id=r, type=type, activity=copy))

    models.InboxItem.objects.bulk_create(inbox_items)

@@ -447,6 +461,13 @@ def prepare_deliveries_and_inbox_items(recipient_list, type, allowed_domains=None):
            else:
                remote_inbox_urls.add(actor.shared_inbox_url or actor.inbox_url)
            urls.append(r["target"].followers_url)
        elif isinstance(r, dict) and r["type"] == "actor_inbox":
            actor = r["actor"]
            urls.append(actor.fid)
            if actor.is_local:
                local_recipients.add(actor)
            else:
                remote_inbox_urls.add(actor.inbox_url)

        elif isinstance(r, dict) and r["type"] == "instances_with_followers":
            # we want to broadcast the activity to other instances service actors
@@ -501,13 +522,6 @@ def prepare_deliveries_and_inbox_items(recipient_list, type, allowed_domains=None):
    return inbox_items, deliveries, urls


def join_queries_or(left, right):
    if left:
        return left | right
    else:
        return right


def get_actors_from_audience(urls):
    """
    Given a list of urls such as [
@@ -529,22 +543,24 @@ def get_actors_from_audience(urls):
        if url == PUBLIC_ADDRESS:
            continue
        queries["actors"].append(url)
        queries["followed"] = join_queries_or(
        queries["followed"] = funkwhale_utils.join_queries_or(
            queries["followed"], Q(target__followers_url=url)
        )
    final_query = None
    if queries["actors"]:
        final_query = join_queries_or(final_query, Q(fid__in=queries["actors"]))
        final_query = funkwhale_utils.join_queries_or(
            final_query, Q(fid__in=queries["actors"])
        )
    if queries["followed"]:
        actor_follows = models.Follow.objects.filter(queries["followed"], approved=True)
        final_query = join_queries_or(
        final_query = funkwhale_utils.join_queries_or(
            final_query, Q(pk__in=actor_follows.values_list("actor", flat=True))
        )

        library_follows = models.LibraryFollow.objects.filter(
            queries["followed"], approved=True
        )
        final_query = join_queries_or(
        final_query = funkwhale_utils.join_queries_or(
            final_query, Q(pk__in=library_follows.values_list("actor", flat=True))
        )
    if not final_query:
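The reworked `receive` above collects recipient primary keys into plain lists (appending the new `inbox_actor` to the "to" recipients) before creating `InboxItem` rows. Detached from the ORM, the fan-out loop reduces to the following sketch (the dicts stand in for `InboxItem` instances):

```python
def build_inbox_items(to_ids, cc_ids, inbox_actor_id=None):
    # mirrors the loop in receive(): every "to" recipient (plus the
    # inbox owner, per the new inbox_actor argument) and every "cc"
    # recipient becomes one inbox item
    to_ids = list(to_ids)
    if inbox_actor_id is not None:
        to_ids.append(inbox_actor_id)
    items = []
    for recipients, type_ in [(to_ids, "to"), (cc_ids, "cc")]:
        for pk in recipients:
            items.append({"actor_id": pk, "type": type_})
    return items

items = build_inbox_items([1, 2], [3], inbox_actor_id=42)
print(items)
# [{'actor_id': 1, 'type': 'to'}, {'actor_id': 2, 'type': 'to'},
#  {'actor_id': 42, 'type': 'to'}, {'actor_id': 3, 'type': 'cc'}]
```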
@@ -14,10 +14,7 @@ logger = logging.getLogger(__name__)

def get_actor_data(actor_url):
    response = session.get_session().get(
        actor_url,
        timeout=5,
        verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
        headers={"Accept": "application/activity+json"},
        actor_url, headers={"Accept": "application/activity+json"},
    )
    response.raise_for_status()
    try:
@@ -45,21 +42,32 @@ def get_actor(fid, skip_cache=False):
    return serializer.save(last_fetch_date=timezone.now())


def get_service_actor():
_CACHE = {}


def get_service_actor(cache=True):
    if cache and "service_actor" in _CACHE:
        return _CACHE["service_actor"]

    name, domain = (
        settings.FEDERATION_SERVICE_ACTOR_USERNAME,
        settings.FEDERATION_HOSTNAME,
    )
    try:
        return models.Actor.objects.select_related().get(
        actor = models.Actor.objects.select_related().get(
            preferred_username=name, domain__name=domain
        )
    except models.Actor.DoesNotExist:
        pass
    else:
        _CACHE["service_actor"] = actor
        return actor

    args = users_models.get_actor_data(name)
    private, public = keys.get_key_pair()
    args["private_key"] = private.decode("utf-8")
    args["public_key"] = public.decode("utf-8")
    args["type"] = "Service"
    return models.Actor.objects.create(**args)
    actor = models.Actor.objects.create(**args)
    _CACHE["service_actor"] = actor
    return actor
@@ -1,7 +1,17 @@
import datetime

from django.conf import settings
from django.core.exceptions import ObjectDoesNotExist
from django.core import validators
from django.utils import timezone

from rest_framework import serializers

from funkwhale_api.audio import models as audio_models
from funkwhale_api.common import fields as common_fields
from funkwhale_api.common import serializers as common_serializers
from funkwhale_api.music import models as music_models
from funkwhale_api.users import serializers as users_serializers

from . import filters
from . import models
@@ -27,6 +37,10 @@ class LibraryScanSerializer(serializers.ModelSerializer):
    ]


class DomainSerializer(serializers.Serializer):
    name = serializers.CharField()


class LibrarySerializer(serializers.ModelSerializer):
    actor = federation_serializers.APIActorSerializer()
    uploads_count = serializers.SerializerMethodField()
@@ -86,7 +100,12 @@ class LibraryFollowSerializer(serializers.ModelSerializer):


def serialize_generic_relation(activity, obj):
    data = {"uuid": obj.uuid, "type": obj._meta.label}
    data = {"type": obj._meta.label}
    if data["type"] == "federation.Actor":
        data["full_username"] = obj.full_username
    else:
        data["uuid"] = obj.uuid

    if data["type"] == "music.Library":
        data["name"] = obj.name
    if data["type"] == "federation.LibraryFollow":
@@ -146,8 +165,22 @@ class InboxItemActionSerializer(common_serializers.ActionSerializer):
        return objects.update(is_read=True)


FETCH_OBJECT_CONFIG = {
    "artist": {"queryset": music_models.Artist.objects.all()},
    "album": {"queryset": music_models.Album.objects.all()},
    "track": {"queryset": music_models.Track.objects.all()},
    "library": {"queryset": music_models.Library.objects.all(), "id_attr": "uuid"},
    "upload": {"queryset": music_models.Upload.objects.all(), "id_attr": "uuid"},
    "account": {"queryset": models.Actor.objects.all(), "id_attr": "full_username"},
    "channel": {"queryset": audio_models.Channel.objects.all(), "id_attr": "uuid"},
}
FETCH_OBJECT_FIELD = common_fields.GenericRelation(FETCH_OBJECT_CONFIG)


class FetchSerializer(serializers.ModelSerializer):
    actor = federation_serializers.APIActorSerializer()
    actor = federation_serializers.APIActorSerializer(read_only=True)
    object = serializers.CharField(write_only=True)
    force = serializers.BooleanField(default=False, required=False, write_only=True)

    class Meta:
        model = models.Fetch
@@ -159,4 +192,84 @@ class FetchSerializer(serializers.ModelSerializer):
            "detail",
            "creation_date",
            "fetch_date",
            "object",
            "force",
        ]
        read_only_fields = [
            "id",
            "url",
            "actor",
            "status",
            "detail",
            "creation_date",
            "fetch_date",
        ]

    def validate_object(self, value):
        # if value is a webfinger lookup, we craft a special url
        if value.startswith("@"):
            value = value.lstrip("@")
            validator = validators.EmailValidator()
            try:
                validator(value)
            except validators.ValidationError:
                return value

            return "webfinger://{}".format(value)

    def create(self, validated_data):
        check_duplicates = not validated_data.get("force", False)
        if check_duplicates:
            # first we check for duplicates
            duplicate = (
                validated_data["actor"]
                .fetches.filter(
                    status="finished",
                    url=validated_data["object"],
                    creation_date__gte=timezone.now()
                    - datetime.timedelta(
                        seconds=settings.FEDERATION_DUPLICATE_FETCH_DELAY
                    ),
                )
                .order_by("-creation_date")
                .first()
            )
            if duplicate:
                return duplicate

        fetch = models.Fetch.objects.create(
            actor=validated_data["actor"], url=validated_data["object"]
        )
        return fetch

    def to_representation(self, obj):
        repr = super().to_representation(obj)
        object_data = None
        if obj.object:
            object_data = FETCH_OBJECT_FIELD.to_representation(obj.object)
        repr["object"] = object_data
        return repr


class FullActorSerializer(serializers.Serializer):
    fid = serializers.URLField()
    url = serializers.URLField()
    domain = serializers.CharField(source="domain_id")
    creation_date = serializers.DateTimeField()
    last_fetch_date = serializers.DateTimeField()
    name = serializers.CharField()
    preferred_username = serializers.CharField()
    full_username = serializers.CharField()
    type = serializers.CharField()
    is_local = serializers.BooleanField()
    is_channel = serializers.SerializerMethodField()
    manually_approves_followers = serializers.BooleanField()
    user = users_serializers.UserBasicSerializer()
    summary = common_serializers.ContentSerializer(source="summary_obj")
    icon = common_serializers.AttachmentSerializer(source="attachment_icon")

    def get_is_channel(self, o):
        try:
            return bool(o.channel)
        except ObjectDoesNotExist:
            return False
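`FetchSerializer.validate_object` turns an `@user@domain` handle into a `webfinger://` URL before fetching. A simplified version of that transformation (using a naive handle regex where the real serializer relies on Django's `EmailValidator`):

```python
import re

# naive handle check; the real code uses django.core.validators.EmailValidator
HANDLE_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def normalize_fetch_object(value):
    # "@user@example.com" -> "webfinger://user@example.com";
    # anything else passes through unchanged (minus leading "@")
    if value.startswith("@"):
        value = value.lstrip("@")
        if HANDLE_RE.match(value):
            return "webfinger://{}".format(value)
    return value

print(normalize_fetch_object("@user@example.com"))
# webfinger://user@example.com
```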
@@ -7,5 +7,7 @@ router.register(r"fetches", api_views.FetchViewSet, "fetches")
router.register(r"follows/library", api_views.LibraryFollowViewSet, "library-follows")
router.register(r"inbox", api_views.InboxItemViewSet, "inbox")
router.register(r"libraries", api_views.LibraryViewSet, "libraries")
router.register(r"domains", api_views.DomainViewSet, "domains")
router.register(r"actors", api_views.ActorViewSet, "actors")

urlpatterns = router.urls
@@ -1,7 +1,8 @@
import requests.exceptions

from django.conf import settings
from django.db import transaction
from django.db.models import Count
from django.db.models import Count, Q

from rest_framework import decorators
from rest_framework import mixins
@@ -9,7 +10,11 @@ from rest_framework import permissions
from rest_framework import response
from rest_framework import viewsets

from funkwhale_api.common import preferences
from funkwhale_api.common import utils as common_utils
from funkwhale_api.common.permissions import ConditionalAuthentication
from funkwhale_api.music import models as music_models
from funkwhale_api.music import views as music_views
from funkwhale_api.users.oauth import permissions as oauth_permissions

from . import activity
@@ -19,6 +24,7 @@ from . import filters
from . import models
from . import routes
from . import serializers
from . import tasks
from . import utils


@@ -92,6 +98,26 @@ class LibraryFollowViewSet(
        update_follow(follow, approved=False)
        return response.Response(status=204)

    @decorators.action(methods=["get"], detail=False)
    def all(self, request, *args, **kwargs):
        """
        Return all the subscriptions of the current user, with only limited data
        to have a performant endpoint and avoid lots of queries just to display
        subscription status in the UI
        """
        follows = list(
            self.get_queryset().values_list("uuid", "target__uuid", "approved")
        )

        payload = {
            "results": [
                {"uuid": str(u[0]), "library": str(u[1]), "approved": u[2]}
                for u in follows
            ],
            "count": len(follows),
        }
        return response.Response(payload, status=200)


class LibraryViewSet(mixins.RetrieveModelMixin, viewsets.GenericViewSet):
    lookup_field = "uuid"
@@ -192,8 +218,76 @@ class InboxItemViewSet(
        return response.Response(result, status=200)


class FetchViewSet(mixins.RetrieveModelMixin, viewsets.GenericViewSet):
class FetchViewSet(
    mixins.CreateModelMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
):

    queryset = models.Fetch.objects.select_related("actor")
    serializer_class = api_serializers.FetchSerializer
    permission_classes = [permissions.IsAuthenticated]
    throttling_scopes = {"create": {"authenticated": "fetch"}}

    def get_queryset(self):
        return super().get_queryset().filter(actor=self.request.user.actor)

    def perform_create(self, serializer):
        fetch = serializer.save(actor=self.request.user.actor)
        if fetch.status == "finished":
            # a duplicate was returned, no need to fetch again
            return
        if settings.FEDERATION_SYNCHRONOUS_FETCH:
            tasks.fetch(fetch_id=fetch.pk)
            fetch.refresh_from_db()
        else:
            common_utils.on_commit(tasks.fetch.delay, fetch_id=fetch.pk)


class DomainViewSet(
    mixins.RetrieveModelMixin, mixins.ListModelMixin, viewsets.GenericViewSet
):
    queryset = models.Domain.objects.order_by("name").external()
    permission_classes = [ConditionalAuthentication]
    serializer_class = api_serializers.DomainSerializer
    ordering_fields = ("creation_date", "name")
    max_page_size = 100

    def get_queryset(self):
        qs = super().get_queryset()
        qs = qs.exclude(
            instance_policy__is_active=True, instance_policy__block_all=True
        )
        if preferences.get("moderation__allow_list_enabled"):
            qs = qs.filter(allowed=True)
        return qs


class ActorViewSet(mixins.RetrieveModelMixin, viewsets.GenericViewSet):
    queryset = models.Actor.objects.select_related(
        "user", "channel", "summary_obj", "attachment_icon"
    )
    permission_classes = [ConditionalAuthentication]
    serializer_class = api_serializers.FullActorSerializer
    lookup_field = "full_username"
    lookup_value_regex = r"([a-zA-Z0-9_.+-]+@[a-zA-Z0-9-]+\.[a-zA-Z0-9-.]+)"

    def get_object(self):
        queryset = self.get_queryset()
        username, domain = self.kwargs["full_username"].split("@", 1)
        return queryset.get(preferred_username=username, domain_id=domain)

    def get_queryset(self):
        qs = super().get_queryset()
        qs = qs.exclude(
            domain__instance_policy__is_active=True,
            domain__instance_policy__block_all=True,
        )
        if preferences.get("moderation__allow_list_enabled"):
            query = Q(domain_id=settings.FUNKWHALE_HOSTNAME) | Q(domain__allowed=True)
            qs = qs.filter(query)
        return qs

    libraries = decorators.action(methods=["get"], detail=True)(
        music_views.get_libraries(
            filter_uploads=lambda o, uploads: uploads.filter(library__actor=o)
        )
    )
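`FetchViewSet.perform_create` relies on the serializer's duplicate check shown earlier: a finished fetch for the same URL within `FEDERATION_DUPLICATE_FETCH_DELAY` seconds is reused unless `force` is set. The time-window test reduces to a plain datetime comparison (constant value below is a hypothetical stand-in for the setting):

```python
import datetime

DUPLICATE_FETCH_DELAY = 300  # stand-in for settings.FEDERATION_DUPLICATE_FETCH_DELAY

def is_recent_duplicate(previous_creation_date, now, force=False):
    # mirrors the queryset filter creation_date__gte=now - delay,
    # skipped entirely when force=True
    if force:
        return False
    cutoff = now - datetime.timedelta(seconds=DUPLICATE_FETCH_DELAY)
    return previous_creation_date >= cutoff

now = datetime.datetime(2020, 1, 1, 12, 0, 0)
print(is_recent_duplicate(now - datetime.timedelta(seconds=60), now))   # True
print(is_recent_duplicate(now - datetime.timedelta(seconds=600), now))  # False
```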
@@ -52,9 +52,13 @@ class SignatureAuthentication(authentication.BaseAuthentication):
            actor = actors.get_actor(actor_url)
        except Exception as e:
            logger.info(
                "Discarding HTTP request from blocked actor/domain %s", actor_url
                "Discarding HTTP request from blocked actor/domain %s, %s",
                actor_url,
                str(e),
            )
            raise rest_exceptions.AuthenticationFailed(
                "Cannot fetch remote actor to authenticate signature"
            )
            raise rest_exceptions.AuthenticationFailed(str(e))

        if not actor.public_key:
            raise rest_exceptions.AuthenticationFailed("No public key found")
@@ -1,3 +1,5 @@
from . import schema_org

CONTEXTS = [
    {
        "shortId": "LDP",
@@ -218,6 +220,12 @@ CONTEXTS = [
            }
        },
    },
    {
        "shortId": "SC",
        "contextUrl": None,
        "documentUrl": "http://schema.org",
        "document": {"@context": schema_org.CONTEXT},
    },
    {
        "shortId": "SEC",
        "contextUrl": None,
@@ -280,6 +288,7 @@ CONTEXTS = [
                "type": "@type",
                "as": "https://www.w3.org/ns/activitystreams#",
                "fw": "https://funkwhale.audio/ns#",
                "schema": "http://schema.org#",
                "xsd": "http://www.w3.org/2001/XMLSchema#",
                "Album": "fw:Album",
                "Track": "fw:Track",
@@ -298,6 +307,40 @@ CONTEXTS = [
                "musicbrainzId": "fw:musicbrainzId",
                "license": {"@id": "fw:license", "@type": "@id"},
                "copyright": "fw:copyright",
                "category": "schema:category",
                "language": "schema:inLanguage",
            }
        },
    },
    {
        "shortId": "LITEPUB",
        "contextUrl": None,
        "documentUrl": "http://litepub.social/ns",
        "document": {
            # from https://ap.thequietplace.social/schemas/litepub-0.1.jsonld
            "@context": {
                "Emoji": "toot:Emoji",
                "Hashtag": "as:Hashtag",
                "PropertyValue": "schema:PropertyValue",
                "atomUri": "ostatus:atomUri",
                "conversation": {"@id": "ostatus:conversation", "@type": "@id"},
                "discoverable": "toot:discoverable",
                "manuallyApprovesFollowers": "as:manuallyApprovesFollowers",
                "ostatus": "http://ostatus.org#",
                "schema": "http://schema.org#",
                "toot": "http://joinmastodon.org/ns#",
                "value": "schema:value",
                "sensitive": "as:sensitive",
                "litepub": "http://litepub.social/ns#",
                "invisible": "litepub:invisible",
                "directMessage": "litepub:directMessage",
                "listMessage": {"@id": "litepub:listMessage", "@type": "@id"},
                "oauthRegistrationEndpoint": {
                    "@id": "litepub:oauthRegistrationEndpoint",
                    "@type": "@id",
                },
                "EmojiReact": "litepub:EmojiReact",
                "alsoKnownAs": {"@id": "as:alsoKnownAs", "@type": "@id"},
            }
        },
    },
@@ -332,3 +375,5 @@ AS = NS(CONTEXTS_BY_ID["AS"])
LDP = NS(CONTEXTS_BY_ID["LDP"])
SEC = NS(CONTEXTS_BY_ID["SEC"])
FW = NS(CONTEXTS_BY_ID["FW"])
SC = NS(CONTEXTS_BY_ID["SC"])
LITEPUB = NS(CONTEXTS_BY_ID["LITEPUB"])
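The contexts module ends by exposing one namespace per `shortId`, now including the new `SC` (schema.org) and `LITEPUB` entries. The underlying lookup is a simple index of the `CONTEXTS` list by `shortId`, which can be sketched with plain dicts (the entries below are abbreviated):

```python
# abbreviated stand-ins for the full context entries added in this diff
CONTEXTS = [
    {"shortId": "SC", "documentUrl": "http://schema.org"},
    {"shortId": "LITEPUB", "documentUrl": "http://litepub.social/ns"},
]

# mirror of CONTEXTS_BY_ID: index the context list by shortId
CONTEXTS_BY_ID = {c["shortId"]: c for c in CONTEXTS}

SC = CONTEXTS_BY_ID["SC"]
LITEPUB = CONTEXTS_BY_ID["LITEPUB"]
print(LITEPUB["documentUrl"])  # http://litepub.social/ns
```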
@@ -53,15 +53,3 @@ class ActorFetchDelay(preferences.DefaultFromSettingMixin, types.IntPreference):
        "request authentication."
    )
    field_kwargs = {"required": False}


@global_preferences_registry.register
class MusicNeedsApproval(preferences.DefaultFromSettingMixin, types.BooleanPreference):
    section = federation
    name = "music_needs_approval"
    setting = "FEDERATION_MUSIC_NEEDS_APPROVAL"
    verbose_name = "Federation music needs approval"
    help_text = (
        "When true, other federation actors will need your approval"
        " before being able to browse your library."
    )
@ -21,7 +21,7 @@ class SignatureAuthFactory(factory.Factory):
|
|||
key = factory.LazyFunction(lambda: keys.get_key_pair()[0])
|
||||
key_id = factory.Faker("url")
|
||||
use_auth_header = False
|
||||
headers = ["(request-target)", "user-agent", "host", "date", "content-type"]
|
||||
headers = ["(request-target)", "user-agent", "host", "date", "accept"]
|
||||
|
||||
class Meta:
|
||||
model = requests_http_signature.HTTPSignatureAuth
|
||||
|
@ -42,7 +42,7 @@ class SignedRequestFactory(factory.Factory):
|
|||
"User-Agent": "Test",
|
||||
"Host": "test.host",
|
||||
"Date": http_date(timezone.now().timestamp()),
|
||||
"Content-Type": "application/activity+json",
|
||||
"Accept": "application/activity+json",
|
||||
}
|
||||
if extracted:
|
||||
default_headers.update(extracted)
|
||||
|
@ -86,6 +86,17 @@ class DomainFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
|
|||
return self.service_actor
|
||||
|
||||
|
||||
_CACHE = {}
|
||||
|
||||
|
||||
def get_cached_key_pair():
|
||||
try:
|
||||
return _CACHE["keys"]
|
||||
except KeyError:
|
||||
_CACHE["keys"] = keys.get_key_pair()
|
||||
return _CACHE["keys"]
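The `get_cached_key_pair` helper above memoizes a single key pair so the test factories don't pay the cost of RSA key generation for every generated actor. A minimal standalone sketch of the same pattern, with a counting stand-in for the expensive call (`expensive_key_pair` is illustrative, not Funkwhale's API):

```python
# Memoize the result of an expensive computation in a module-level dict.
# In the diff above the expensive call is keys.get_key_pair(); the counter
# here shows the stand-in runs only once across repeated lookups.
_CACHE = {}
CALLS = {"count": 0}


def expensive_key_pair():
    CALLS["count"] += 1
    return (b"private-pem", b"public-pem")


def get_cached_key_pair():
    try:
        return _CACHE["keys"]
    except KeyError:
        _CACHE["keys"] = expensive_key_pair()
        return _CACHE["keys"]


first = get_cached_key_pair()
second = get_cached_key_pair()
```

The factory below keeps a `with_real_keys` trait for the tests that genuinely need distinct keys, so the cache is an opt-out optimization rather than a behavior change.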


@registry.register
class ActorFactory(NoUpdateOnCreate, factory.DjangoModelFactory):
    public_key = None

@@ -111,11 +122,14 @@ class ActorFactory(NoUpdateOnCreate, factory.DjangoModelFactory):
            o.domain.name, o.preferred_username
        )
    )
    keys = factory.LazyFunction(keys.get_key_pair)
    keys = factory.LazyFunction(get_cached_key_pair)

    class Meta:
        model = models.Actor

    class Params:
        with_real_keys = factory.Trait(keys=factory.LazyFunction(keys.get_key_pair),)

    @factory.post_generation
    def local(self, create, extracted, **kwargs):
        if not extracted and not kwargs:

@@ -125,7 +139,8 @@ class ActorFactory(NoUpdateOnCreate, factory.DjangoModelFactory):
            self.domain = models.Domain.objects.get_or_create(
                name=settings.FEDERATION_HOSTNAME
            )[0]
            self.save(update_fields=["domain"])
            self.fid = "https://{}/actors/{}".format(self.domain, self.preferred_username)
            self.save(update_fields=["domain", "fid"])
        if not create:
            if extracted and hasattr(extracted, "pk"):
                extracted.actor = self

@@ -166,7 +181,9 @@ class MusicLibraryFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
        model = "music.Library"

    class Params:
        local = factory.Trait(actor=factory.SubFactory(ActorFactory, local=True))
        local = factory.Trait(
            fid=None, actor=factory.SubFactory(ActorFactory, local=True)
        )


@registry.register

@@ -17,6 +17,10 @@ def cached_contexts(loader):
        for cached in contexts.CONTEXTS:
            if url == cached["documentUrl"]:
                return cached
            if cached["shortId"] == "LITEPUB" and "/schemas/litepub-" in url:
                # XXX UGLY fix for pleroma because they host their schema
                # under each instance domain, which makes caching harder
                return cached
        return loader(url, *args, **kwargs)

    return load

@@ -29,18 +33,19 @@ def get_document_loader():
    return cached_contexts(loader)


def expand(doc, options=None, insert_fw_context=True):
def expand(doc, options=None, default_contexts=["AS", "FW", "SEC"]):
    options = options or {}
    options.setdefault("documentLoader", get_document_loader())
    if isinstance(doc, str):
        doc = options["documentLoader"](doc)["document"]
    if insert_fw_context:
        fw = contexts.CONTEXTS_BY_ID["FW"]["documentUrl"]
    for context_name in default_contexts:
        ctx = contexts.CONTEXTS_BY_ID[context_name]["documentUrl"]
        try:
            insert_context(fw, doc)
            insert_context(ctx, doc)
        except KeyError:
            # probably an already expanded document
            pass

    result = pyld.jsonld.expand(doc, options=options)
    try:
        # jsonld.expand returns a list, which is useless for us

@@ -167,7 +172,7 @@ def prepare_for_serializer(payload, config, fallbacks={}):
                attr=field_config.get("attr"),
            )
        except (IndexError, KeyError):
            aliases = field_config.get("aliases", [])
            aliases = field_config.get("aliases", {})
            noop = object()
            value = noop
            if not aliases:

@@ -176,9 +181,7 @@ def prepare_for_serializer(payload, config, fallbacks={}):
            for a in aliases:
                try:
                    value = get_value(
                        payload[a],
                        keep=field_config.get("keep"),
                        attr=field_config.get("attr"),
                        payload[a["property"]], keep=a.get("keep"), attr=a.get("attr"),
                    )
                except (IndexError, KeyError):
                    continue

@@ -214,27 +217,34 @@ def get_ids(v):


def get_default_context():
    return ["https://www.w3.org/ns/activitystreams", "https://w3id.org/security/v1", {}]


def get_default_context_fw():
    return [
        "https://www.w3.org/ns/activitystreams",
        "https://w3id.org/security/v1",
        {},
        "https://funkwhale.audio/ns",
        {
            "manuallyApprovesFollowers": "as:manuallyApprovesFollowers",
            "Hashtag": "as:Hashtag",
        },
    ]


class JsonLdSerializer(serializers.Serializer):
    def __init__(self, *args, **kwargs):
        self.jsonld_expand = kwargs.pop("jsonld_expand", True)
        super().__init__(*args, **kwargs)
        self.jsonld_context = []

    def run_validation(self, data=empty):
        if data and data is not empty and self.context.get("expand", True):
            try:
                data = expand(data)
            except ValueError:
                raise serializers.ValidationError(
                    "{} is not a valid jsonld document".format(data)
                )
        if data and data is not empty:
            self.jsonld_context = data.get("@context", [])
            if self.context.get("expand", self.jsonld_expand):
                try:
                    data = expand(data)
                except ValueError as e:
                    raise serializers.ValidationError(
                        "{} is not a valid jsonld document: {}".format(data, e)
                    )
            try:
                config = self.Meta.jsonld_mapping
            except AttributeError:

@@ -243,6 +253,7 @@ class JsonLdSerializer(serializers.Serializer):
                fallbacks = self.Meta.jsonld_fallbacks
            except AttributeError:
                fallbacks = {}

            data = prepare_for_serializer(data, config, fallbacks=fallbacks)
            dereferenced_fields = [
                k

@@ -285,3 +296,15 @@ def first_obj(property, aliases=[]):

def raw(property, aliases=[]):
    return {"property": property, "aliases": aliases}


def is_present_recursive(data, key):
    if isinstance(data, (dict, list)):
        for v in data:
            if is_present_recursive(v, key):
                return True
    else:
        if data == key:
            return True

    return False
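The `is_present_recursive` helper added above walks nested lists and dicts looking for a value. One subtlety worth noting: iterating a dict in Python yields its keys, so inside dicts only keys are compared against `key`, never the values. A standalone copy for illustration:

```python
# Standalone sketch of the is_present_recursive() helper from the diff above.
# Dicts are traversed via their keys (iterating a dict yields keys), which is
# sufficient for its use on JSON-LD @context structures.
def is_present_recursive(data, key):
    if isinstance(data, (dict, list)):
        for v in data:
            if is_present_recursive(v, key):
                return True
    else:
        if data == key:
            return True

    return False


found_key = is_present_recursive(["a", ["b", {"c": 1}]], "c")
found_value = is_present_recursive({"c": "d"}, "d")
```

Here `found_key` is true because `"c"` is a dict key in the nested structure, while `found_value` is false because `"d"` only occurs as a dict value.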


@@ -21,7 +21,8 @@ def get_key_pair(size=None):
        crypto_serialization.NoEncryption(),
    )
    public_key = key.public_key().public_bytes(
        crypto_serialization.Encoding.PEM, crypto_serialization.PublicFormat.PKCS1
        crypto_serialization.Encoding.PEM,
        crypto_serialization.PublicFormat.SubjectPublicKeyInfo,
    )

    return private_key, public_key
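The serialization change above matters for interoperability: `PublicFormat.PKCS1` produces a `BEGIN RSA PUBLIC KEY` PEM block, while `SubjectPublicKeyInfo` produces the `BEGIN PUBLIC KEY` form that other ActivityPub implementations generally expect (the Friendica issue linked in the migration below discusses this). A small sketch with the `cryptography` library showing the two headers side by side:

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Generate a throwaway RSA key and serialize its public half both ways.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
spki = key.public_key().public_bytes(
    serialization.Encoding.PEM,
    serialization.PublicFormat.SubjectPublicKeyInfo,
)
pkcs1 = key.public_key().public_bytes(
    serialization.Encoding.PEM,
    serialization.PublicFormat.PKCS1,
)
print(spki.splitlines()[0].decode())   # -----BEGIN PUBLIC KEY-----
print(pkcs1.splitlines()[0].decode())  # -----BEGIN RSA PUBLIC KEY-----
```

Both encodings carry the same RSA key material; only the wrapping ASN.1 structure differs, which is why the data migration below can reserialize existing keys in place.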


@@ -1,5 +1,4 @@
import requests
from django.conf import settings

from funkwhale_api.common import session


@@ -10,11 +9,7 @@ def get_library_data(library_url, actor):
    auth = signing.get_auth(actor.private_key, actor.private_key_id)
    try:
        response = session.get_session().get(
            library_url,
            auth=auth,
            timeout=5,
            verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
            headers={"Content-Type": "application/activity+json"},
            library_url, auth=auth, headers={"Accept": "application/activity+json"},
        )
    except requests.ConnectionError:
        return {"errors": ["This library is not reachable"]}

@@ -35,11 +30,7 @@ def get_library_page(library, page_url, actor):
def get_library_page(library, page_url, actor):
    auth = signing.get_auth(actor.private_key, actor.private_key_id)
    response = session.get_session().get(
        page_url,
        auth=auth,
        timeout=5,
        verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
        headers={"Content-Type": "application/activity+json"},
        page_url, auth=auth, headers={"Accept": "application/activity+json"},
    )
    serializer = serializers.CollectionPageSerializer(
        data=response.json(),
@@ -0,0 +1,22 @@
# Generated by Django 2.2.6 on 2019-10-29 12:57

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('federation', '0020_auto_20190730_0846'),
    ]

    operations = [
        migrations.AlterModelOptions(
            name='actor',
            options={'verbose_name': 'Account'},
        ),
        migrations.AlterField(
            model_name='actor',
            name='type',
            field=models.CharField(choices=[('Person', 'Person'), ('Tombstone', 'Tombstone'), ('Application', 'Application'), ('Group', 'Group'), ('Organization', 'Organization'), ('Service', 'Service')], default='Person', max_length=25),
        ),
    ]

@@ -0,0 +1,23 @@
# Generated by Django 2.2.7 on 2019-12-04 15:39

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('federation', '0021_auto_20191029_1257'),
    ]

    operations = [
        migrations.AlterField(
            model_name='actor',
            name='inbox_url',
            field=models.URLField(blank=True, max_length=500, null=True),
        ),
        migrations.AlterField(
            model_name='actor',
            name='outbox_url',
            field=models.URLField(blank=True, max_length=500, null=True),
        ),
    ]

@@ -0,0 +1,20 @@
# Generated by Django 2.2.9 on 2020-01-22 11:01

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('common', '0007_auto_20200116_1610'),
        ('federation', '0022_auto_20191204_1539'),
    ]

    operations = [
        migrations.AddField(
            model_name='actor',
            name='summary_obj',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to='common.Content'),
        ),
    ]

@@ -0,0 +1,20 @@
# Generated by Django 2.2.9 on 2020-01-23 13:59

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('common', '0007_auto_20200116_1610'),
        ('federation', '0023_actor_summary_obj'),
    ]

    operations = [
        migrations.AddField(
            model_name='actor',
            name='attachment_icon',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='iconed_actor', to='common.Attachment'),
        ),
    ]

@@ -0,0 +1,45 @@
# Generated by Django 3.0.4 on 2020-03-17 08:20

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('contenttypes', '0002_remove_content_type_name'),
        ('federation', '0024_actor_attachment_icon'),
    ]

    operations = [
        migrations.AlterField(
            model_name='activity',
            name='object_content_type',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='objecting_activities', to='contenttypes.ContentType'),
        ),
        migrations.AlterField(
            model_name='activity',
            name='object_id',
            field=models.IntegerField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name='activity',
            name='related_object_content_type',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='related_objecting_activities', to='contenttypes.ContentType'),
        ),
        migrations.AlterField(
            model_name='activity',
            name='related_object_id',
            field=models.IntegerField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name='activity',
            name='target_content_type',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='targeting_activities', to='contenttypes.ContentType'),
        ),
        migrations.AlterField(
            model_name='activity',
            name='target_id',
            field=models.IntegerField(blank=True, null=True),
        ),
    ]

@@ -0,0 +1,56 @@
# Generated by Django 2.0.9 on 2018-11-14 08:55

from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone


def update_public_key_format(apps, schema_editor):
    """
    Reserialize keys in proper format (PKCS#8 instead of #1)
    https://github.com/friendica/friendica/issues/7771#issuecomment-603019826
    """
    Actor = apps.get_model("federation", "Actor")

    local_actors = list(
        Actor.objects.exclude(private_key="")
        .exclude(private_key=None)
        .only("pk", "private_key", "public_key")
        .order_by("id")
    )

    total = len(local_actors)
    if total:
        print("{} keys to update...".format(total))
    else:
        print("Skipping")
        return

    from cryptography.hazmat.primitives import serialization as crypto_serialization
    from cryptography.hazmat.backends import default_backend

    for actor in local_actors:
        private_key = crypto_serialization.load_pem_private_key(
            actor.private_key.encode(), password=None, backend=default_backend()
        )
        public_key = private_key.public_key().public_bytes(
            crypto_serialization.Encoding.PEM,
            crypto_serialization.PublicFormat.SubjectPublicKeyInfo,
        )
        actor.public_key = public_key.decode()

    Actor.objects.bulk_update(local_actors, ["public_key"])
    print("Done!")


def skip(apps, schema_editor):
    pass


class Migration(migrations.Migration):

    dependencies = [("federation", "0025_auto_20200317_0820")]

    operations = [
        migrations.RunPython(update_public_key_format, skip),
    ]
@@ -68,7 +68,7 @@ class ActorQuerySet(models.QuerySet):

    def with_current_usage(self):
        qs = self
        for s in ["pending", "skipped", "errored", "finished"]:
        for s in ["draft", "pending", "skipped", "errored", "finished"]:
            uploads_query = models.Q(
                libraries__uploads__import_status=s,
                libraries__uploads__audio_file__isnull=False,

@@ -145,6 +145,7 @@ class Domain(models.Model):
            actors=models.Count("actors", distinct=True),
            outbox_activities=models.Count("actors__outbox_activities", distinct=True),
            libraries=models.Count("actors__libraries", distinct=True),
            channels=models.Count("actors__owned_channels", distinct=True),
            received_library_follows=models.Count(
                "actors__libraries__received_follows", distinct=True
            ),

@@ -180,8 +181,8 @@ class Actor(models.Model):

    fid = models.URLField(unique=True, max_length=500, db_index=True)
    url = models.URLField(max_length=500, null=True, blank=True)
    outbox_url = models.URLField(max_length=500)
    inbox_url = models.URLField(max_length=500)
    outbox_url = models.URLField(max_length=500, null=True, blank=True)
    inbox_url = models.URLField(max_length=500, null=True, blank=True)
    following_url = models.URLField(max_length=500, null=True, blank=True)
    followers_url = models.URLField(max_length=500, null=True, blank=True)
    shared_inbox_url = models.URLField(max_length=500, null=True, blank=True)

@@ -189,6 +190,9 @@ class Actor(models.Model):
    name = models.CharField(max_length=200, null=True, blank=True)
    domain = models.ForeignKey(Domain, on_delete=models.CASCADE, related_name="actors")
    summary = models.CharField(max_length=500, null=True, blank=True)
    summary_obj = models.ForeignKey(
        "common.Content", null=True, blank=True, on_delete=models.SET_NULL
    )
    preferred_username = models.CharField(max_length=200, null=True, blank=True)
    public_key = models.TextField(max_length=5000, null=True, blank=True)
    private_key = models.TextField(max_length=5000, null=True, blank=True)

@@ -202,6 +206,13 @@ class Actor(models.Model):
        through_fields=("target", "actor"),
        related_name="following",
    )
    attachment_icon = models.ForeignKey(
        "common.Attachment",
        null=True,
        blank=True,
        on_delete=models.SET_NULL,
        related_name="iconed_actor",
    )

    objects = ActorQuerySet.as_manager()

@@ -236,6 +247,8 @@ class Actor(models.Model):
        return self.followers.filter(pk__in=follows.values_list("actor", flat=True))

    def should_autoapprove_follow(self, actor):
        if self.get_channel():
            return True
        return False

    def get_user(self):

@@ -244,10 +257,21 @@ class Actor(models.Model):
        except ObjectDoesNotExist:
            return None

    def get_channel(self):
        try:
            return self.channel
        except ObjectDoesNotExist:
            return None

    def get_absolute_url(self):
        if self.is_local:
            return federation_utils.full_url("/@{}".format(self.preferred_username))
        return self.url or self.fid

    def get_current_usage(self):
        actor = self.__class__.objects.filter(pk=self.pk).with_current_usage().get()
        data = {}
        for s in ["pending", "skipped", "errored", "finished"]:
        for s in ["draft", "pending", "skipped", "errored", "finished"]:
            data[s] = getattr(actor, "_usage_{}".format(s)) or 0

        data["total"] = sum(data.values())

@@ -260,6 +284,7 @@ class Actor(models.Model):
        data = Actor.objects.filter(pk=self.pk).aggregate(
            outbox_activities=models.Count("outbox_activities", distinct=True),
            libraries=models.Count("libraries", distinct=True),
            channels=models.Count("owned_channels", distinct=True),
            received_library_follows=models.Count(
                "libraries__received_follows", distinct=True
            ),

@@ -269,6 +294,9 @@ class Actor(models.Model):
            from_activity__actor=self.pk
        ).count()
        data["reports"] = moderation_models.Report.objects.get_for_target(self).count()
        data["requests"] = moderation_models.UserRequest.objects.filter(
            submitter=self
        ).count()
        data["albums"] = music_models.Album.objects.filter(
            from_activity__actor=self.pk
        ).count()

@@ -312,6 +340,10 @@ class Actor(models.Model):
            "https://{}/".format(domain)
        )

    @property
    def display_name(self):
        return self.name or self.preferred_username


FETCH_STATUSES = [
    ("pending", "Pending"),

@@ -345,7 +377,7 @@ class Fetch(models.Model):
    objects = FetchQuerySet.as_manager()

    def save(self, **kwargs):
        if not self.url and self.object:
        if not self.url and self.object and hasattr(self.object, "fid"):
            self.url = self.object.fid

        super().save(**kwargs)

@@ -356,11 +388,19 @@ class Fetch(models.Model):
        from . import serializers

        return {
            contexts.FW.Artist: serializers.ArtistSerializer,
            contexts.FW.Album: serializers.AlbumSerializer,
            contexts.FW.Track: serializers.TrackSerializer,
            contexts.AS.Audio: serializers.UploadSerializer,
            contexts.FW.Library: serializers.LibrarySerializer,
            contexts.FW.Artist: [serializers.ArtistSerializer],
            contexts.FW.Album: [serializers.AlbumSerializer],
            contexts.FW.Track: [serializers.TrackSerializer],
            contexts.AS.Audio: [
                serializers.UploadSerializer,
                serializers.ChannelUploadSerializer,
            ],
            contexts.FW.Library: [serializers.LibrarySerializer],
            contexts.AS.Group: [serializers.ActorSerializer],
            contexts.AS.Person: [serializers.ActorSerializer],
            contexts.AS.Organization: [serializers.ActorSerializer],
            contexts.AS.Service: [serializers.ActorSerializer],
            contexts.AS.Application: [serializers.ActorSerializer],
        }
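The fetch serializer map above now associates each JSON-LD type with a *list* of candidate serializers instead of a single one, so a fetched document can be validated against each candidate in turn (e.g. an `Audio` object may be a plain library upload or a channel upload). A generic sketch of that fallback dispatch; the stub classes here are illustrative, not Funkwhale's API:

```python
# Try each candidate serializer class until one accepts the payload.
def first_valid(candidates, data):
    for serializer_class in candidates:
        serializer = serializer_class(data)
        if serializer.is_valid():
            return serializer
    return None


class RejectAll:
    def __init__(self, data):
        self.data = data

    def is_valid(self):
        return False


class AcceptAll:
    def __init__(self, data):
        self.data = data

    def is_valid(self):
        return True


chosen = first_valid([RejectAll, AcceptAll], {"type": "Audio"})
```

Ordering matters in this pattern: the first serializer that validates wins, so more specific candidates should come before more permissive ones.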


@@ -411,26 +451,29 @@ class Activity(models.Model):
    type = models.CharField(db_index=True, null=True, max_length=100)

    # generic relations
    object_id = models.IntegerField(null=True)
    object_id = models.IntegerField(null=True, blank=True)
    object_content_type = models.ForeignKey(
        ContentType,
        null=True,
        blank=True,
        on_delete=models.SET_NULL,
        related_name="objecting_activities",
    )
    object = GenericForeignKey("object_content_type", "object_id")
    target_id = models.IntegerField(null=True)
    target_id = models.IntegerField(null=True, blank=True)
    target_content_type = models.ForeignKey(
        ContentType,
        null=True,
        blank=True,
        on_delete=models.SET_NULL,
        related_name="targeting_activities",
    )
    target = GenericForeignKey("target_content_type", "target_id")
    related_object_id = models.IntegerField(null=True)
    related_object_id = models.IntegerField(null=True, blank=True)
    related_object_content_type = models.ForeignKey(
        ContentType,
        null=True,
        blank=True,
        on_delete=models.SET_NULL,
        related_name="related_objecting_activities",
    )

@@ -541,8 +584,7 @@ class LibraryTrack(models.Model):
            auth=auth,
            stream=True,
            timeout=20,
            verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
            headers={"Content-Type": "application/activity+json"},
            headers={"Accept": "application/activity+json"},
        )
        with remote_response as r:
            remote_response.raise_for_status()
@@ -1,3 +1,4 @@
from rest_framework.negotiation import BaseContentNegotiation
from rest_framework.renderers import JSONRenderer


@@ -6,6 +7,7 @@ def get_ap_renderers():
        ("APActivity", "application/activity+json"),
        ("APLD", "application/ld+json"),
        ("APJSON", "application/json"),
        ("HTML", "text/html"),
    ]

    return [

@@ -14,5 +16,19 @@ def get_ap_renderers():
    ]


class IgnoreClientContentNegotiation(BaseContentNegotiation):
    def select_parser(self, request, parsers):
        """
        Select the first parser in the `.parser_classes` list.
        """
        return parsers[0]

    def select_renderer(self, request, renderers, format_suffix):
        """
        Select the first renderer in the `.renderer_classes` list.
        """
        return (renderers[0], renderers[0].media_type)


class WebfingerRenderer(JSONRenderer):
    media_type = "application/jrd+json"
@@ -1,4 +1,7 @@
import logging
import uuid

from django.db.models import Q

from funkwhale_api.music import models as music_models

@@ -131,38 +134,53 @@ def outbox_follow(context):
@outbox.register({"type": "Create", "object.type": "Audio"})
def outbox_create_audio(context):
    upload = context["upload"]
    serializer = serializers.ActivitySerializer(
        {
            "type": "Create",
            "actor": upload.library.actor.fid,
            "object": serializers.UploadSerializer(upload).data,
        }
    )
    channel = upload.library.get_channel()
    followers_target = channel.actor if channel else upload.library
    actor = channel.actor if channel else upload.library.actor
    if channel:
        serializer = serializers.ChannelCreateUploadSerializer(upload)
    else:
        upload_serializer = serializers.UploadSerializer
        serializer = serializers.ActivitySerializer(
            {
                "type": "Create",
                "actor": actor.fid,
                "object": upload_serializer(upload).data,
            }
        )
    yield {
        "type": "Create",
        "actor": upload.library.actor,
        "actor": actor,
        "payload": with_recipients(
            serializer.data, to=[{"type": "followers", "target": upload.library}]
            serializer.data, to=[{"type": "followers", "target": followers_target}]
        ),
        "object": upload,
        "target": upload.library,
        "target": None if channel else upload.library,
    }


@inbox.register({"type": "Create", "object.type": "Audio"})
def inbox_create_audio(payload, context):
    serializer = serializers.UploadSerializer(
        data=payload["object"],
        context={"activity": context.get("activity"), "actor": context["actor"]},
    )

    is_channel = "library" not in payload["object"]
    if is_channel:
        channel = context["actor"].get_channel()
        serializer = serializers.ChannelCreateUploadSerializer(
            data=payload, context={"channel": channel},
        )
    else:
        serializer = serializers.UploadSerializer(
            data=payload["object"],
            context={"activity": context.get("activity"), "actor": context["actor"]},
        )
    if not serializer.is_valid(raise_exception=context.get("raise_exception", False)):
        logger.warn("Discarding invalid audio create")
        logger.warn("Discarding invalid audio create: %s", serializer.errors)
        return

    upload = serializer.save()

    return {"object": upload, "target": upload.library}
    if is_channel:
        return {"object": upload, "target": channel}
    else:
        return {"object": upload, "target": upload.library}


@inbox.register({"type": "Delete", "object.type": "Library"})

@@ -245,9 +263,10 @@ def inbox_delete_audio(payload, context):
        # we did not receive a list of Ids, so we can probably use the value directly
        upload_fids = [payload["object"]["id"]]

    candidates = music_models.Upload.objects.filter(
        library__actor=actor, fid__in=upload_fids
    query = Q(fid__in=upload_fids) & (
        Q(library__actor=actor) | Q(track__artist__channel__actor=actor)
    )
    candidates = music_models.Upload.objects.filter(query)

    total = candidates.count()
    logger.info("Deleting %s uploads with ids %s", total, upload_fids)

@@ -258,6 +277,9 @@ def outbox_delete_audio(context):
def outbox_delete_audio(context):
    uploads = context["uploads"]
    library = uploads[0].library
    channel = library.get_channel()
    followers_target = channel.actor if channel else library
    actor = channel.actor if channel else library.actor
    serializer = serializers.ActivitySerializer(
        {
            "type": "Delete",

@@ -266,9 +288,9 @@ def outbox_delete_audio(context):
    )
    yield {
        "type": "Delete",
        "actor": library.actor,
        "actor": actor,
        "payload": with_recipients(
            serializer.data, to=[{"type": "followers", "target": library}]
            serializer.data, to=[{"type": "followers", "target": followers_target}]
        ),
    }

@@ -312,6 +334,37 @@ def inbox_update_track(payload, context):
    )


@inbox.register({"type": "Update", "object.type": "Audio"})
def inbox_update_audio(payload, context):
    serializer = serializers.ChannelCreateUploadSerializer(
        data=payload, context=context
    )

    if not serializer.is_valid(raise_exception=context.get("raise_exception", False)):
        logger.info("Skipped update, invalid payload")
        return
    serializer.save()


@outbox.register({"type": "Update", "object.type": "Audio"})
def outbox_update_audio(context):
    upload = context["upload"]
    channel = upload.library.get_channel()
    actor = channel.actor
    serializer = serializers.ChannelCreateUploadSerializer(
        upload, context={"type": "Update", "activity_id_suffix": str(uuid.uuid4())[:8]}
    )

    yield {
        "type": "Update",
        "actor": actor,
        "payload": with_recipients(
            serializer.data,
            to=[activity.PUBLIC_ADDRESS, {"type": "instances_with_followers"}],
        ),
    }


@inbox.register({"type": "Update", "object.type": "Artist"})
def inbox_update_artist(payload, context):
    return handle_library_entry_update(

@@ -416,7 +469,6 @@ def outbox_delete_actor(context):
    {
        "type": "Delete",
        "object.type": [
            "Tombstone",
            "Actor",
            "Person",
            "Application",

@@ -441,3 +493,89 @@ def inbox_delete_actor(payload, context):
        logger.warn("Cannot delete actor %s, no matching object found", actor.fid)
        return
    actor.delete()


@inbox.register({"type": "Delete", "object.type": "Tombstone"})
def inbox_delete(payload, context):
    serializer = serializers.DeleteSerializer(data=payload, context=context)
    if not serializer.is_valid(raise_exception=context.get("raise_exception", False)):
        logger.info("Skipped deletion, invalid payload")
        return

    to_delete = serializer.validated_data["object"]
    to_delete.delete()


@inbox.register({"type": "Flag"})
def inbox_flag(payload, context):
    serializer = serializers.FlagSerializer(data=payload, context=context)
    if not serializer.is_valid(raise_exception=context.get("raise_exception", False)):
        logger.debug(
            "Discarding invalid report from {}: %s",
            context["actor"].fid,
            serializer.errors,
        )
        return

    report = serializer.save()
    return {"object": report.target, "related_object": report}


@outbox.register({"type": "Flag"})
def outbox_flag(context):
    report = context["report"]
    if not report.target or not report.target.fid:
        return
    actor = actors.get_service_actor()
    serializer = serializers.FlagSerializer(report)
    yield {
        "type": "Flag",
        "actor": actor,
        "payload": with_recipients(
            serializer.data,
            # Mastodon requires the report to be sent to the reported actor inbox
            # (and not the shared inbox)
            to=[{"type": "actor_inbox", "actor": report.target_owner}],
        ),
    }


@inbox.register({"type": "Delete", "object.type": "Album"})
def inbox_delete_album(payload, context):
    actor = context["actor"]
    album_id = payload["object"].get("id")
    if not album_id:
        logger.debug("Discarding deletion of empty library")
        return

    query = Q(fid=album_id) & (Q(attributed_to=actor) | Q(artist__channel__actor=actor))
    try:
        album = music_models.Album.objects.get(query)
    except music_models.Album.DoesNotExist:
        logger.debug("Discarding deletion of unkwnown album %s", album_id)
        return

    album.delete()


@outbox.register({"type": "Delete", "object.type": "Album"})
def outbox_delete_album(context):
    album = context["album"]
    actor = (
        album.artist.channel.actor
        if album.artist.get_channel()
        else album.attributed_to
    )
    actor = actor or actors.get_service_actor()
    serializer = serializers.ActivitySerializer(
        {"type": "Delete", "object": {"type": "Album", "id": album.fid}}
    )

    yield {
        "type": "Delete",
        "actor": actor,
        "payload": with_recipients(
            serializer.data,
            to=[activity.PUBLIC_ADDRESS, {"type": "instances_with_followers"}],
        ),
    }
File diff suppressed because it is too large
File diff suppressed because it is too large
|
@@ -1,3 +1,4 @@
import cryptography.exceptions
import datetime
import logging
import pytz

@@ -31,18 +32,29 @@ def verify_date(raw_date):
    now = timezone.now()
    if dt < now - delta or dt > now + delta:
        raise forms.ValidationError(
            "Request Date is too far in the future or in the past"
            "Request Date {} is too far in the future or in the past".format(raw_date)
        )

    return dt


def verify(request, public_key):
    verify_date(request.headers.get("Date"))

    return requests_http_signature.HTTPSignatureAuth.verify(
        request, key_resolver=lambda **kwargs: public_key, use_auth_header=False
    date = request.headers.get("Date")
    logger.debug(
        "Verifying request with date %s and headers %s", date, str(request.headers)
    )
    verify_date(date)
    try:
        return requests_http_signature.HTTPSignatureAuth.verify(
            request, key_resolver=lambda **kwargs: public_key, use_auth_header=False
        )
    except cryptography.exceptions.InvalidSignature:
        logger.warning(
            "Could not verify request with date %s and headers %s",
            date,
            str(request.headers),
        )
        raise


def verify_django(django_request, public_key):

@@ -67,6 +79,9 @@ def verify_django(django_request, public_key):
    expected = signature_headers.split(" ")
    logger.debug("Signature expected headers: %s", expected)
    for header in expected:
        if header == "(request-target)":
            # this one represents the request body, so not an actual HTTP header
            continue
        try:
            headers[header]
        except KeyError:
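`verify_date` rejects requests whose `Date` header falls outside a tolerance window around the current time, to limit replay of signed requests. A standalone sketch of that clock-skew check (the window value here is an assumption; the real delta lives in the `verify_date` implementation, which this diff does not show):

```python
import datetime

# Assumed tolerance window for illustration only; not Funkwhale's exact value.
ALLOWED_SKEW = datetime.timedelta(seconds=30)


def is_date_acceptable(dt, now, delta=ALLOWED_SKEW):
    # accept only datetimes inside the closed interval [now - delta, now + delta]
    return now - delta <= dt <= now + delta
```

A request dated a few seconds off passes; one minutes off is rejected, which is exactly the condition that raises the `ValidationError` above.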
@@ -0,0 +1,63 @@
from django.conf import settings

from rest_framework import serializers

from funkwhale_api.common import preferences
from funkwhale_api.common import middleware
from funkwhale_api.common import utils
from funkwhale_api.federation import utils as federation_utils

from . import models


def actor_detail_username(request, username, redirect_to_ap):
    validator = federation_utils.get_actor_data_from_username
    try:
        username_data = validator(username)
    except serializers.ValidationError:
        return []

    queryset = (
        models.Actor.objects.filter(
            preferred_username__iexact=username_data["username"]
        )
        .local()
        .select_related("attachment_icon")
    )
    try:
        obj = queryset.get()
    except models.Actor.DoesNotExist:
        return []

    if redirect_to_ap:
        raise middleware.ApiRedirect(obj.fid)
    obj_url = utils.join_url(
        settings.FUNKWHALE_URL,
        utils.spa_reverse("actor_detail", kwargs={"username": obj.preferred_username}),
    )
    metas = [
        {"tag": "meta", "property": "og:url", "content": obj_url},
        {"tag": "meta", "property": "og:title", "content": obj.display_name},
        {"tag": "meta", "property": "og:type", "content": "profile"},
    ]

    if obj.attachment_icon:
        metas.append(
            {
                "tag": "meta",
                "property": "og:image",
                "content": obj.attachment_icon.download_url_medium_square_crop,
            }
        )

    if preferences.get("federation__enabled"):
        metas.append(
            {
                "tag": "link",
                "rel": "alternate",
                "type": "application/activity+json",
                "href": obj.fid,
            }
        )

    return metas
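The view above returns a list of tag-description dicts rather than HTML; presumably they are rendered into the SPA page's head elsewhere. A hypothetical renderer for one such dict (illustration only, not Funkwhale's actual rendering code):

```python
import html


def render_tag(conf):
    # Render one {"tag": ..., "property": ..., ...} dict, as returned by
    # actor_detail_username above, into an HTML void element.
    attrs = " ".join(
        '{}="{}"'.format(key, html.escape(str(value), quote=True))
        for key, value in conf.items()
        if key != "tag"
    )
    return "<{} {} />".format(conf["tag"], attrs)
```

Dict insertion order is preserved in Python 3.7+, so attributes come out in the order the view built them.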
@@ -7,16 +7,21 @@ import requests
from django.conf import settings
from django.db import transaction
from django.db.models import Q, F
from django.db.models.deletion import Collector
from django.utils import timezone
from dynamic_preferences.registries import global_preferences_registry
from requests.exceptions import RequestException

from funkwhale_api.audio import models as audio_models
from funkwhale_api.common import preferences
from funkwhale_api.common import models as common_models
from funkwhale_api.common import session
from funkwhale_api.common import utils as common_utils
from funkwhale_api.moderation import mrf
from funkwhale_api.music import models as music_models
from funkwhale_api.taskapp import celery

from . import activity
from . import actors
from . import jsonld
from . import keys

@@ -24,6 +29,7 @@ from . import models, signing
from . import serializers
from . import routes
from . import utils
from . import webfinger

logger = logging.getLogger(__name__)

@@ -88,7 +94,7 @@ def dispatch_inbox(activity, call_handlers=True):
        context={
            "activity": activity,
            "actor": activity.actor,
            "inbox_items": activity.inbox_items.filter(is_read=False),
            "inbox_items": activity.inbox_items.filter(is_read=False).order_by("id"),
        },
        call_handlers=call_handlers,
    )

@@ -142,8 +148,6 @@ def deliver_to_remote(delivery):
        auth=auth,
        json=delivery.activity.payload,
        url=delivery.inbox_url,
        timeout=5,
        verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
        headers={"Content-Type": "application/activity+json"},
    )
    logger.debug("Remote answered with %s", response.status_code)

@@ -163,9 +167,7 @@
def fetch_nodeinfo(domain_name):
    s = session.get_session()
    wellknown_url = "https://{}/.well-known/nodeinfo".format(domain_name)
    response = s.get(
        url=wellknown_url, timeout=5, verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL
    )
    response = s.get(url=wellknown_url)
    response.raise_for_status()
    serializer = serializers.NodeInfoSerializer(data=response.json())
    serializer.is_valid(raise_exception=True)

@@ -175,9 +177,7 @@ def fetch_nodeinfo(domain_name):
            nodeinfo_url = link["href"]
            break

    response = s.get(
        url=nodeinfo_url, timeout=5, verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL
    )
    response = s.get(url=nodeinfo_url)
    response.raise_for_status()
    return response.json()

@@ -258,8 +258,11 @@ def handle_purge_actors(ids, only=[]):

    # purge audio content
    if not only or "media" in only:
        delete_qs(common_models.Attachment.objects.filter(actor__in=ids))
        delete_qs(models.LibraryFollow.objects.filter(actor_id__in=ids))
        delete_qs(models.Follow.objects.filter(target_id__in=ids))
        delete_qs(audio_models.Channel.objects.filter(attributed_to__in=ids))
        delete_qs(audio_models.Channel.objects.filter(actor__in=ids))
        delete_qs(music_models.Upload.objects.filter(library__actor_id__in=ids))
        delete_qs(music_models.Library.objects.filter(actor_id__in=ids))
@@ -291,26 +294,46 @@ def rotate_actor_key(actor):
@celery.app.task(name="federation.fetch")
@transaction.atomic
@celery.require_instance(
    models.Fetch.objects.filter(status="pending").select_related("actor"), "fetch"
    models.Fetch.objects.filter(status="pending").select_related("actor"),
    "fetch_obj",
    "fetch_id",
)
def fetch(fetch):
    actor = fetch.actor
    auth = signing.get_auth(actor.private_key, actor.private_key_id)

def fetch(fetch_obj):
    def error(code, **kwargs):
        fetch.status = "errored"
        fetch.fetch_date = timezone.now()
        fetch.detail = {"error_code": code}
        fetch.detail.update(kwargs)
        fetch.save(update_fields=["fetch_date", "status", "detail"])
        fetch_obj.status = "errored"
        fetch_obj.fetch_date = timezone.now()
        fetch_obj.detail = {"error_code": code}
        fetch_obj.detail.update(kwargs)
        fetch_obj.save(update_fields=["fetch_date", "status", "detail"])

    url = fetch_obj.url
    mrf_check_url = url
    if not mrf_check_url.startswith("webfinger://"):
        payload, updated = mrf.inbox.apply({"id": mrf_check_url})
        if not payload:
            return error("blocked", message="Blocked by MRF")

    actor = fetch_obj.actor
    if settings.FEDERATION_AUTHENTIFY_FETCHES:
        auth = signing.get_auth(actor.private_key, actor.private_key_id)
    else:
        auth = None
    auth = None
    try:
        if url.startswith("webfinger://"):
            # we first grab the corresponding webfinger representation
            # to get the ActivityPub actor ID
            webfinger_data = webfinger.get_resource(
                "acct:" + url.replace("webfinger://", "")
            )
            url = webfinger.get_ap_url(webfinger_data["links"])
            if not url:
                return error("webfinger", message="Invalid or missing webfinger data")
            payload, updated = mrf.inbox.apply({"id": url})
            if not payload:
                return error("blocked", message="Blocked by MRF")
        response = session.get_session().get(
            auth=auth,
            url=fetch.url,
            timeout=5,
            verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
            headers={"Content-Type": "application/activity+json"},
            auth=auth, url=url, headers={"Accept": "application/activity+json"},
        )
        logger.debug("Remote answered with %s", response.status_code)
        response.raise_for_status()
@@ -328,8 +351,19 @@ def fetch(fetch):
    try:
        payload = response.json()
    except json.decoder.JSONDecodeError:
        # we attempt to extract a <link rel=alternate> that points
        # to an activity pub resource, if possible, and retry with this URL
        alternate_url = utils.find_alternate(response.text)
        if alternate_url:
            fetch_obj.url = alternate_url
            fetch_obj.save(update_fields=["url"])
            return fetch(fetch_id=fetch_obj.pk)
        return error("invalid_json")

    payload, updated = mrf.inbox.apply(payload)
    if not payload:
        return error("blocked", message="Blocked by MRF")

    try:
        doc = jsonld.expand(payload)
    except ValueError:
@@ -340,13 +374,13 @@ def fetch(fetch):
    except IndexError:
        return error("missing_jsonld_type")
    try:
        serializer_class = fetch.serializers[type]
        model = serializer_class.Meta.model
        serializer_classes = fetch_obj.serializers[type]
        model = serializer_classes[0].Meta.model
    except (KeyError, AttributeError):
        fetch.status = "skipped"
        fetch.fetch_date = timezone.now()
        fetch.detail = {"reason": "unhandled_type", "type": type}
        return fetch.save(update_fields=["fetch_date", "status", "detail"])
        fetch_obj.status = "skipped"
        fetch_obj.fetch_date = timezone.now()
        fetch_obj.detail = {"reason": "unhandled_type", "type": type}
        return fetch_obj.save(update_fields=["fetch_date", "status", "detail"])
    try:
        id = doc.get("@id")
    except IndexError:
@@ -354,15 +388,216 @@ def fetch(fetch):
    else:
        existing = model.objects.filter(fid=id).first()

        serializer = serializer_class(existing, data=payload)
        if not serializer.is_valid():
        serializer = None
        for serializer_class in serializer_classes:
            serializer = serializer_class(existing, data=payload)
            if not serializer.is_valid():
                continue
            else:
                break
        if serializer.errors:
            return error("validation", validation_errors=serializer.errors)
        try:
            serializer.save()
            obj = serializer.save()
        except Exception as e:
            error("save", message=str(e))
            raise

    fetch.status = "finished"
    fetch.fetch_date = timezone.now()
    return fetch.save(update_fields=["fetch_date", "status"])
    # special case for channels
    # when obj is an actor, we check if the actor has a channel associated with it
    # if it is the case, we consider the fetch obj to be a channel instead
    # and also trigger a fetch on the channel outbox
    if isinstance(obj, models.Actor) and obj.get_channel():
        obj = obj.get_channel()
        if obj.actor.outbox_url:
            try:
                # first page fetch is synchronous, so that at least some data is available
                # in the UI after subscription
                result = fetch_collection(
                    obj.actor.outbox_url, channel_id=obj.pk, max_pages=1,
                )
            except Exception:
                logger.exception(
                    "Error while fetching actor outbox: %s", obj.actor.outbox_url
                )
            else:
                if result.get("next_page"):
                    # additional pages are fetched in the background
                    result = fetch_collection.delay(
                        result["next_page"],
                        channel_id=obj.pk,
                        max_pages=settings.FEDERATION_COLLECTION_MAX_PAGES - 1,
                        is_page=True,
                    )

    fetch_obj.object = obj
    fetch_obj.status = "finished"
    fetch_obj.fetch_date = timezone.now()
    return fetch_obj.save(
        update_fields=["fetch_date", "status", "object_id", "object_content_type"]
    )


class PreserveSomeDataCollector(Collector):
    """
    We need to delete everything related to an actor. Well… Almost everything.
    But definitely not the Delete Activity we send to announce the actor is deleted.
    """

    def __init__(self, *args, **kwargs):
        self.creation_date = timezone.now()
        super().__init__(*args, **kwargs)

    def related_objects(self, related, *args, **kwargs):
        qs = super().related_objects(related, *args, **kwargs)
        if related.name == "outbox_activities":
            # exclude the Delete activity so it can be broadcast properly
            qs = qs.exclude(type="Delete", creation_date__gte=self.creation_date)

        return qs


@celery.app.task(name="federation.remove_actor")
@transaction.atomic
@celery.require_instance(
    models.Actor.objects.all(), "actor",
)
def remove_actor(actor):
    # Then we broadcast the info over federation. We do this *before* deleting objects
    # associated with the actor, otherwise follows are removed and we don't know where
    # to broadcast
    logger.info("Broadcasting deletion to federation…")
    collector = PreserveSomeDataCollector(using="default")
    routes.outbox.dispatch(
        {"type": "Delete", "object": {"type": actor.type}}, context={"actor": actor}
    )

    # then we delete any object associated with the actor object, but *not* the actor
    # itself. We keep it for auditability and sending the Delete ActivityPub message
    logger.info(
        "Preparing deletion of objects associated with account %s…",
        actor.preferred_username,
    )
    collector.collect([actor])
    for model, instances in collector.data.items():
        if issubclass(model, actor.__class__):
            # we skip deletion of the actor itself
            continue

        to_delete = model.objects.filter(pk__in=[instance.pk for instance in instances])
        logger.info(
            "Deleting %s objects associated with account %s…",
            len(instances),
            actor.preferred_username,
        )
        to_delete.delete()

    # Finally, we update the actor itself and mark it as removed
    logger.info("Marking actor as Tombstone…")
    actor.type = "Tombstone"
    actor.name = None
    actor.summary = None
    actor.save(update_fields=["type", "name", "summary"])


COLLECTION_ACTIVITY_SERIALIZERS = [
    (
        {"type": "Create", "object.type": "Audio"},
        serializers.ChannelCreateUploadSerializer,
    )
]


def match_serializer(payload, conf):
    return [
        serializer_class
        for route, serializer_class in conf
        if activity.match_route(route, payload)
    ]


@celery.app.task(name="federation.fetch_collection")
@celery.require_instance(
    audio_models.Channel.objects.all(), "channel", allow_null=True,
)
def fetch_collection(url, max_pages, channel, is_page=False):
    actor = actors.get_service_actor()
    results = {
        "items": [],
        "skipped": 0,
        "errored": 0,
        "seen": 0,
        "total": 0,
    }
    if is_page:
        # starting immediately from a page, no need to fetch the wrapping collection
        logger.debug("Fetch collection page immediately at %s", url)
        results["next_page"] = url
    else:
        logger.debug("Fetching collection object at %s", url)
        collection = utils.retrieve_ap_object(
            url,
            actor=actor,
            serializer_class=serializers.PaginatedCollectionSerializer,
        )
        results["next_page"] = collection["first"]
        results["total"] = collection.get("totalItems")

    seen_pages = 0
    context = {}
    if channel:
        context["channel"] = channel

    for i in range(max_pages):
        page_url = results["next_page"]
        logger.debug("Handling page %s on max %s, at %s", i + 1, max_pages, page_url)
        page = utils.retrieve_ap_object(page_url, actor=actor, serializer_class=None,)
        try:
            items = page["orderedItems"]
        except KeyError:
            try:
                items = page["items"]
            except KeyError:
                logger.error("Invalid collection page at %s", page_url)
                break

        for item in items:
            results["seen"] += 1

            matching_serializer = match_serializer(
                item, COLLECTION_ACTIVITY_SERIALIZERS
            )
            if not matching_serializer:
                results["skipped"] += 1
                logger.debug("Skipping unhandled activity %s", item.get("type"))
                continue

            s = matching_serializer[0](data=item, context=context)
            if not s.is_valid():
                logger.warn("Skipping invalid activity: %s", s.errors)
                results["errored"] += 1
                continue

            results["items"].append(s.save())

        seen_pages += 1
        results["next_page"] = page.get("next", None) or None
        if not results["next_page"]:
            logger.debug("No more pages to fetch")
            break

    logger.info(
        "Finished fetch of collection pages at %s. Results:\n"
        "  Total in collection: %s\n"
        "  Seen: %s\n"
        "  Handled: %s\n"
        "  Skipped: %s\n"
        "  Errored: %s",
        url,
        results.get("total"),
        results["seen"],
        len(results["items"]),
        results["skipped"],
        results["errored"],
    )
    return results
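`fetch_collection` above walks an ActivityPub collection page by page through `next` links, bounded by `max_pages`. A simplified, self-contained sketch of that traversal over in-memory pages (the `pages` dict stands in for `utils.retrieve_ap_object`; names are illustrative, not Funkwhale's):

```python
def walk_pages(pages, first_url, max_pages):
    # `pages` maps a page URL to {"items": [...], "next": url-or-None},
    # standing in for network page retrieval. Collect items until the
    # page chain ends or the max_pages cap is reached.
    items = []
    url = first_url
    for _ in range(max_pages):
        page = pages[url]
        items.extend(page["items"])
        url = page.get("next")
        if not url:
            break
    return items
```

The `max_pages` cap is what lets the real task fetch the first page synchronously and defer the rest to a background task.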
@@ -1,7 +1,12 @@
import html.parser
import unicodedata
import urllib.parse
import re

from django.apps import apps
from django.conf import settings
from django.db.models import Q
from django.core.exceptions import ObjectDoesNotExist
from django.db.models import CharField, Q, Value

from funkwhale_api.common import session
from funkwhale_api.moderation import mrf

@@ -84,8 +89,6 @@ def retrieve_ap_object(
    response = session.get_session().get(
        fid,
        auth=auth,
        timeout=5,
        verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
        headers={
            "Accept": "application/activity+json",
            "Content-Type": "application/activity+json",

@@ -104,7 +107,10 @@ def retrieve_ap_object(
        return data
    serializer = serializer_class(data=data, context={"fetch_actor": actor})
    serializer.is_valid(raise_exception=True)
    return serializer.save()
    try:
        return serializer.save()
    except NotImplementedError:
        return serializer.validated_data


def get_domain_query_from_url(domain, url_field="fid"):
@@ -120,6 +126,15 @@ def get_domain_query_from_url(domain, url_field="fid"):
    return query


def local_qs(queryset, url_field="fid", include=True):
    query = get_domain_query_from_url(
        domain=settings.FEDERATION_HOSTNAME, url_field=url_field
    )
    if not include:
        query = ~query
    return queryset.filter(query)


def is_local(url):
    if not url:
        return True
@@ -157,3 +172,123 @@ def get_actor_from_username_data_query(field, data):
            "domain__name__iexact": data["domain"],
        }
    )


class StopParsing(Exception):
    pass


class AlternateLinkParser(html.parser.HTMLParser):
    def __init__(self, *args, **kwargs):
        self.result = None
        super().__init__(*args, **kwargs)

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return

        attrs_dict = dict(attrs)
        if attrs_dict.get("rel") == "alternate" and attrs_dict.get(
            "type", "application/activity+json"
        ):
            self.result = attrs_dict.get("href")
            raise StopParsing()

    def handle_endtag(self, tag):
        if tag == "head":
            raise StopParsing()


def find_alternate(response_text):
    if not response_text:
        return

    parser = AlternateLinkParser()
    try:
        parser.feed(response_text)
    except StopParsing:
        return parser.result


def should_redirect_ap_to_html(accept_header, default=True):
    if not accept_header:
        return False

    redirect_headers = [
        "text/html",
    ]
    no_redirect_headers = [
        "*/*",  # XXX backward compat with older Funkwhale instances that don't send the Accept header
        "application/json",
        "application/activity+json",
        "application/ld+json",
    ]

    parsed_header = [ct.lower().strip() for ct in accept_header.split(",")]
    for ct in parsed_header:
        if ct in redirect_headers:
            return True
        if ct in no_redirect_headers:
            return False

    return default


FID_MODEL_LABELS = [
    "music.Artist",
    "music.Album",
    "music.Track",
    "music.Library",
    "music.Upload",
    "federation.Actor",
]


def get_object_by_fid(fid, local=None):

    if local is True:
        parsed = urllib.parse.urlparse(fid)
        if parsed.netloc != settings.FEDERATION_HOSTNAME:
            raise ObjectDoesNotExist()

    models = [apps.get_model(*l.split(".")) for l in FID_MODEL_LABELS]

    def get_qs(model):
        return (
            model.objects.all()
            .filter(fid=fid)
            .annotate(__type=Value(model._meta.label, output_field=CharField()))
            .values("fid", "__type")
        )

    qs = get_qs(models[0])
    for m in models[1:]:
        qs = qs.union(get_qs(m))

    result = qs.order_by("fid").first()

    if not result:
        raise ObjectDoesNotExist()
    model = apps.get_model(*result["__type"].split("."))
    instance = model.objects.get(fid=fid)
    if model._meta.label == "federation.Actor":
        channel = instance.get_channel()
        if channel:
            return channel

    return instance


def can_manage(obj_owner, actor):
    if not obj_owner:
        return False

    if not actor:
        return False

    if obj_owner == actor:
        return True
    if obj_owner.domain.service_actor == actor:
        return True

    return False
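`should_redirect_ap_to_html` decides between serving ActivityPub JSON and redirecting to the web UI by scanning the `Accept` header parts in order: the first content type found in either list wins. A standalone copy of that logic, for illustration outside the Django codebase:

```python
def should_redirect_ap_to_html(accept_header, default=True):
    # Standalone copy of the decision logic above: the first content type
    # in the comma-separated Accept header that matches either list wins.
    if not accept_header:
        return False
    redirect_headers = ["text/html"]
    no_redirect_headers = [
        "*/*",  # older Funkwhale instances that don't send a specific Accept
        "application/json",
        "application/activity+json",
        "application/ld+json",
    ]
    for ct in (part.lower().strip() for part in accept_header.split(",")):
        if ct in redirect_headers:
            return True
        if ct in no_redirect_headers:
            return False
    return default
```

Note the function splits only on commas, so `text/html;q=0.9` would not match exactly; that limitation is in the original as well.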
@@ -1,16 +1,34 @@
from django import forms
from django.conf import settings
from django.core import paginator
from django.db.models import Prefetch
from django.http import HttpResponse
from django.urls import reverse
from rest_framework import exceptions, mixins, permissions, response, viewsets
from rest_framework.decorators import action

from funkwhale_api.common import preferences
from funkwhale_api.common import utils as common_utils
from funkwhale_api.moderation import models as moderation_models
from funkwhale_api.music import models as music_models
from funkwhale_api.music import utils as music_utils

from . import activity, authentication, models, renderers, serializers, utils, webfinger
from . import (
    actors,
    activity,
    authentication,
    models,
    renderers,
    serializers,
    utils,
    webfinger,
)


def redirect_to_html(public_url):
    response = HttpResponse(status=302)
    response["Location"] = common_utils.join_url(settings.FUNKWHALE_URL, public_url)
    return response


class AuthenticatedIfAllowListEnabled(permissions.BasePermission):

@@ -34,7 +52,11 @@ class SharedViewSet(FederationMixin, viewsets.GenericViewSet):
    authentication_classes = [authentication.SignatureAuthentication]
    renderer_classes = renderers.get_ap_renderers()

    @action(methods=["post"], detail=False)
    @action(
        methods=["post"],
        detail=False,
        content_negotiation_class=renderers.IgnoreClientContentNegotiation,
    )
    def inbox(self, request, *args, **kwargs):
        if request.method.lower() == "post" and request.actor is None:
            raise exceptions.AuthenticationFailed(
@@ -49,23 +71,84 @@ class ActorViewSet(FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericV
    lookup_field = "preferred_username"
    authentication_classes = [authentication.SignatureAuthentication]
    renderer_classes = renderers.get_ap_renderers()
    queryset = models.Actor.objects.local().select_related("user")
    queryset = (
        models.Actor.objects.local()
        .select_related("user", "channel__artist", "channel__attributed_to")
        .prefetch_related("channel__artist__tagged_items__tag")
    )
    serializer_class = serializers.ActorSerializer

    @action(methods=["get", "post"], detail=True)
    def get_queryset(self):
        queryset = super().get_queryset()
        return queryset.exclude(channel__attributed_to=actors.get_service_actor())

    def retrieve(self, request, *args, **kwargs):
        instance = self.get_object()
        if utils.should_redirect_ap_to_html(request.headers.get("accept")):
            if instance.get_channel():
                return redirect_to_html(instance.channel.get_absolute_url())
            return redirect_to_html(instance.get_absolute_url())

        serializer = self.get_serializer(instance)
        return response.Response(serializer.data)

    @action(
        methods=["get", "post"],
        detail=True,
        content_negotiation_class=renderers.IgnoreClientContentNegotiation,
    )
    def inbox(self, request, *args, **kwargs):
        inbox_actor = self.get_object()
        if request.method.lower() == "post" and request.actor is None:
            raise exceptions.AuthenticationFailed(
                "You need a valid signature to send an activity"
            )
        if request.method.lower() == "post":
            activity.receive(activity=request.data, on_behalf_of=request.actor)
            activity.receive(
                activity=request.data,
                on_behalf_of=request.actor,
                inbox_actor=inbox_actor,
            )
        return response.Response({}, status=200)

    @action(methods=["get", "post"], detail=True)
    def outbox(self, request, *args, **kwargs):
        actor = self.get_object()
        channel = actor.get_channel()
        if channel:
            return self.get_channel_outbox_response(request, channel)
        return response.Response({}, status=200)

    def get_channel_outbox_response(self, request, channel):
        conf = {
            "id": channel.actor.outbox_url,
            "actor": channel.actor,
            "items": channel.library.uploads.for_federation()
            .order_by("-creation_date")
            .prefetch_related("library__channel__actor", "track__artist"),
            "item_serializer": serializers.ChannelCreateUploadSerializer,
        }
        page = request.GET.get("page")
        if page is None:
            serializer = serializers.ChannelOutboxSerializer(channel)
            data = serializer.data
        else:
            try:
                page_number = int(page)
            except Exception:
                return response.Response({"page": ["Invalid page number"]}, status=400)
            conf["page_size"] = preferences.get("federation__collection_page_size")
            p = paginator.Paginator(conf["items"], conf["page_size"])
            try:
                page = p.page(page_number)
                conf["page"] = page
                serializer = serializers.CollectionPageSerializer(conf)
                data = serializer.data
            except paginator.EmptyPage:
                return response.Response(status=404)

        return response.Response(data)

    @action(methods=["get"], detail=True)
    def followers(self, request, *args, **kwargs):
        self.get_object()
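`get_channel_outbox_response` above serves the outbox either as a collection summary (no `page` parameter) or as one 1-indexed page of items, returning 404 past the end. A minimal stdlib stand-in for that page-slicing behaviour (illustrative only; the real code uses `django.core.paginator.Paginator`):

```python
def get_page(items, page_number, page_size):
    # Mimic the 1-indexed paging used by the outbox view: return the
    # requested slice, or None when the page is empty (the view maps
    # that case to a 404 response).
    start = (page_number - 1) * page_size
    chunk = items[start:start + page_size]
    return chunk or None
```

Page 1 starts at index 0, and any page beyond the data yields `None` instead of an empty list.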
@@ -103,8 +186,6 @@ class WellKnownViewSet(viewsets.GenericViewSet):

    @action(methods=["get"], detail=False)
    def nodeinfo(self, request, *args, **kwargs):
        if not preferences.get("instance__nodeinfo_enabled"):
            return HttpResponse(status=404)
        data = {
            "links": [
                {
@@ -165,18 +246,48 @@ class MusicLibraryViewSet(
    authentication_classes = [authentication.SignatureAuthentication]
    renderer_classes = renderers.get_ap_renderers()
    serializer_class = serializers.LibrarySerializer
    queryset = music_models.Library.objects.all().select_related("actor")
    queryset = (
        music_models.Library.objects.all()
        .local()
        .select_related("actor")
        .filter(channel=None)
    )
    lookup_field = "uuid"

    def retrieve(self, request, *args, **kwargs):
        lb = self.get_object()

        if utils.should_redirect_ap_to_html(request.headers.get("accept")):
            return redirect_to_html(lb.get_absolute_url())
        conf = {
            "id": lb.get_federation_id(),
            "actor": lb.actor,
            "name": lb.name,
            "summary": lb.description,
            "items": lb.uploads.for_federation().order_by("-creation_date"),
            "items": lb.uploads.for_federation()
            .order_by("-creation_date")
            .prefetch_related(
                Prefetch(
                    "track",
                    queryset=music_models.Track.objects.select_related(
                        "album__artist__attributed_to",
                        "artist__attributed_to",
                        "artist__attachment_cover",
                        "attachment_cover",
                        "album__attributed_to",
                        "attributed_to",
                        "album__attachment_cover",
                        "album__artist__attachment_cover",
                        "description",
                    ).prefetch_related(
                        "tagged_items__tag",
                        "album__tagged_items__tag",
                        "album__artist__tagged_items__tag",
                        "artist__tagged_items__tag",
                        "artist__description",
                        "album__description",
                    ),
                )
            ),
            "item_serializer": serializers.UploadSerializer,
        }
        page = request.GET.get("page")

@@ -219,36 +330,86 @@ class MusicUploadViewSet(
    authentication_classes = [authentication.SignatureAuthentication]
    renderer_classes = renderers.get_ap_renderers()
    queryset = music_models.Upload.objects.local().select_related(
        "library__actor", "track__artist", "track__album__artist"
        "library__actor",
        "track__artist",
        "track__album__artist",
        "track__description",
        "track__album__attachment_cover",
        "track__album__artist__attachment_cover",
        "track__artist__attachment_cover",
        "track__attachment_cover",
    )
    serializer_class = serializers.UploadSerializer
    lookup_field = "uuid"

    def retrieve(self, request, *args, **kwargs):
        instance = self.get_object()
        if utils.should_redirect_ap_to_html(request.headers.get("accept")):
            return redirect_to_html(instance.track.get_absolute_url())

        serializer = self.get_serializer(instance)
        return response.Response(serializer.data)

    def get_queryset(self):
        queryset = super().get_queryset()
        actor = music_utils.get_actor_from_request(self.request)
        return queryset.playable_by(actor)

    def get_serializer(self, obj):
        if obj.library.get_channel():
            return serializers.ChannelUploadSerializer(obj)
        return super().get_serializer(obj)

    @action(
        methods=["get"],
        detail=True,
        content_negotiation_class=renderers.IgnoreClientContentNegotiation,
    )
    def activity(self, request, *args, **kwargs):
        object = self.get_object()
        serializer = serializers.ChannelCreateUploadSerializer(object)
        return response.Response(serializer.data)


class MusicArtistViewSet(
    FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
):
    authentication_classes = [authentication.SignatureAuthentication]
    renderer_classes = renderers.get_ap_renderers()
    queryset = music_models.Artist.objects.local()
    queryset = music_models.Artist.objects.local().select_related(
        "description", "attachment_cover"
    )
    serializer_class = serializers.ArtistSerializer
    lookup_field = "uuid"

    def retrieve(self, request, *args, **kwargs):
        instance = self.get_object()
        if utils.should_redirect_ap_to_html(request.headers.get("accept")):
            return redirect_to_html(instance.get_absolute_url())

        serializer = self.get_serializer(instance)
        return response.Response(serializer.data)


class MusicAlbumViewSet(
    FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
):
    authentication_classes = [authentication.SignatureAuthentication]
    renderer_classes = renderers.get_ap_renderers()
|
||||
queryset = music_models.Album.objects.local().select_related("artist")
|
||||
queryset = music_models.Album.objects.local().select_related(
|
||||
"artist__description", "description", "artist__attachment_cover"
|
||||
)
|
||||
serializer_class = serializers.AlbumSerializer
|
||||
lookup_field = "uuid"
|
||||
|
||||
def retrieve(self, request, *args, **kwargs):
|
||||
instance = self.get_object()
|
||||
if utils.should_redirect_ap_to_html(request.headers.get("accept")):
|
||||
return redirect_to_html(instance.get_absolute_url())
|
||||
|
||||
serializer = self.get_serializer(instance)
|
||||
return response.Response(serializer.data)
|
||||
|
||||
|
||||
class MusicTrackViewSet(
|
||||
FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
|
||||
|
@ -256,7 +417,22 @@ class MusicTrackViewSet(
|
|||
authentication_classes = [authentication.SignatureAuthentication]
|
||||
renderer_classes = renderers.get_ap_renderers()
|
||||
queryset = music_models.Track.objects.local().select_related(
|
||||
"album__artist", "artist"
|
||||
"album__artist",
|
||||
"album__description",
|
||||
"artist__description",
|
||||
"description",
|
||||
"attachment_cover",
|
||||
"album__artist__attachment_cover",
|
||||
"album__attachment_cover",
|
||||
"artist__attachment_cover",
|
||||
)
|
||||
serializer_class = serializers.TrackSerializer
|
||||
lookup_field = "uuid"
|
||||
|
||||
def retrieve(self, request, *args, **kwargs):
|
||||
instance = self.get_object()
|
||||
if utils.should_redirect_ap_to_html(request.headers.get("accept")):
|
||||
return redirect_to_html(instance.get_absolute_url())
|
||||
|
||||
serializer = self.get_serializer(instance)
|
||||
return response.Response(serializer.data)
|
||||
|
|
|
@@ -41,10 +41,17 @@ def get_resource(resource_string):
    url = "https://{}/.well-known/webfinger?resource={}".format(
        hostname, resource_string
    )
-    response = session.get_session().get(url)
+    response = session.get_session().get(
+        url, verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL, timeout=5
+    )
    response.raise_for_status()
    serializer = serializers.ActorWebfingerSerializer(data=response.json())
    serializer.is_valid(raise_exception=True)
    return serializer.validated_data


def get_ap_url(links):
    for link in links:
        if (
            link.get("rel") == "self"
            and link.get("type") == "application/activity+json"
        ):
            return link["href"]
@@ -1,5 +1,6 @@
import django_filters

+from funkwhale_api.common import filters as common_filters
from funkwhale_api.moderation import filters as moderation_filters

from . import models

@@ -8,10 +9,11 @@ from . import models
class ListeningFilter(moderation_filters.HiddenContentFilterSet):
    username = django_filters.CharFilter("user__username")
    domain = django_filters.CharFilter("user__actor__domain_id")
+    scope = common_filters.ActorScopeFilter(actor_field="user__actor", distinct=True)

    class Meta:
        model = models.Listening
        hidden_content_fields_mapping = moderation_filters.USER_FILTER_CONFIG[
            "LISTENING"
        ]
-        fields = ["hidden"]
+        fields = ["hidden", "scope"]
@@ -19,7 +19,9 @@ class ListeningViewSet(
):

    serializer_class = serializers.ListeningSerializer
-    queryset = models.Listening.objects.all().select_related("user__actor")
+    queryset = models.Listening.objects.all().select_related(
+        "user__actor__attachment_icon"
+    )

    permission_classes = [
        oauth_permissions.ScopePermission,
@@ -38,9 +38,7 @@ class InstanceLongDescription(types.StringPreference):
    name = "long_description"
    verbose_name = "Long description"
    default = ""
-    help_text = (
-        "Instance long description, displayed in the about page (markdown allowed)."
-    )
+    help_text = "Instance long description, displayed in the about page."
    widget = widgets.Textarea
    field_kwargs = {"required": False}

@@ -52,9 +50,7 @@ class InstanceTerms(types.StringPreference):
    name = "terms"
    verbose_name = "Terms of service"
    default = ""
-    help_text = (
-        "Terms of service and privacy policy for your instance (markdown allowed)."
-    )
+    help_text = "Terms of service and privacy policy for your instance."
    widget = widgets.Textarea
    field_kwargs = {"required": False}

@@ -66,7 +62,7 @@ class InstanceRules(types.StringPreference):
    name = "rules"
    verbose_name = "Rules"
    default = ""
-    help_text = "Rules/Code of Conduct (markdown allowed)."
+    help_text = "Rules/Code of Conduct."
    widget = widgets.Textarea
    field_kwargs = {"required": False}

@@ -127,21 +123,6 @@ class RavenDSN(types.StringPreference):
    field_kwargs = {"required": False}


-@global_preferences_registry.register
-class InstanceNodeinfoEnabled(types.BooleanPreference):
-    show_in_api = False
-    section = instance
-    name = "nodeinfo_enabled"
-    default = True
-    verbose_name = "Enable nodeinfo endpoint"
-    help_text = (
-        "This endpoint is needed for your about page to work. "
-        "It's also helpful for the various monitoring "
-        "tools that map and analyzize the fediverse, "
-        "but you can disable it completely if needed."
-    )


@global_preferences_registry.register
class InstanceNodeinfoPrivate(types.BooleanPreference):
    show_in_api = False
@@ -1,5 +1,7 @@
import memoize.djangocache

+from django.urls import reverse
+
import funkwhale_api
from funkwhale_api.common import preferences
from funkwhale_api.federation import actors, models as federation_models

@@ -18,6 +20,7 @@ def get():
    share_stats = all_preferences.get("instance__nodeinfo_stats_enabled")
    allow_list_enabled = all_preferences.get("moderation__allow_list_enabled")
    allow_list_public = all_preferences.get("moderation__allow_list_public")
+    auth_required = all_preferences.get("common__api_authentication_required")
    banner = all_preferences.get("instance__banner")
    unauthenticated_report_types = all_preferences.get(
        "moderation__unauthenticated_report_types"

@@ -50,9 +53,6 @@ def get():
            "defaultUploadQuota": all_preferences.get("users__upload_quota"),
            "library": {
                "federationEnabled": all_preferences.get("federation__enabled"),
                "federationNeedsApproval": all_preferences.get(
                    "federation__music_needs_approval"
                ),
-                "anonymousCanListen": not all_preferences.get(
-                    "common__api_authentication_required"
-                ),

@@ -67,6 +67,7 @@ def get():
                "instance__funkwhale_support_message_enabled"
            ),
            "instanceSupportMessage": all_preferences.get("instance__support_message"),
+            "knownNodesListUrl": None,
        },
    }

@@ -86,5 +87,10 @@ def get():
        data["metadata"]["usage"] = {
            "favorites": {"tracks": {"total": statistics["track_favorites"]}},
            "listenings": {"total": statistics["listenings"]},
+            "downloads": {"total": statistics["downloads"]},
        }
+    if not auth_required:
+        data["metadata"]["knownNodesListUrl"] = federation_utils.full_url(
+            reverse("api:v1:federation:domains-list")
+        )
    return data
@@ -17,6 +17,7 @@ def get():
        "artists": get_artists(),
        "track_favorites": get_track_favorites(),
        "listenings": get_listenings(),
+        "downloads": get_downloads(),
        "music_duration": get_music_duration(),
    }

@@ -43,15 +44,19 @@ def get_track_favorites():


def get_tracks():
-    return models.Track.objects.count()
+    return models.Track.objects.local().count()


def get_albums():
-    return models.Album.objects.count()
+    return models.Album.objects.local().count()


def get_artists():
-    return models.Artist.objects.count()
+    return models.Artist.objects.local().count()


+def get_downloads():
+    return models.Track.objects.aggregate(d=Sum("downloads_count"))["d"] or 0
+
+
def get_music_duration():
@@ -9,4 +9,5 @@ admin_router.register(r"admin/settings", views.AdminSettings, "admin-settings")
urlpatterns = [
    url(r"^nodeinfo/2.0/?$", views.NodeInfo.as_view(), name="nodeinfo-2.0"),
    url(r"^settings/?$", views.InstanceSettings.as_view(), name="settings"),
+    url(r"^spa-manifest.json", views.SpaManifest.as_view(), name="spa-manifest"),
] + admin_router.urls
@@ -1,10 +1,16 @@
+import json
+
+from django.conf import settings
+
from dynamic_preferences.api import serializers
from dynamic_preferences.api import viewsets as preferences_viewsets
from dynamic_preferences.registries import global_preferences_registry
from rest_framework import views
from rest_framework.response import Response

+from funkwhale_api.common import middleware
from funkwhale_api.common import preferences
+from funkwhale_api.federation import utils as federation_utils
from funkwhale_api.users.oauth import permissions as oauth_permissions

from . import nodeinfo

@@ -38,7 +44,26 @@ class NodeInfo(views.APIView):
    authentication_classes = []

    def get(self, request, *args, **kwargs):
-        if not preferences.get("instance__nodeinfo_enabled"):
-            return Response(status=404)
        data = nodeinfo.get()
        return Response(data, status=200, content_type=NODEINFO_2_CONTENT_TYPE)


+class SpaManifest(views.APIView):
+    permission_classes = []
+    authentication_classes = []
+
+    def get(self, request, *args, **kwargs):
+        existing_manifest = middleware.get_spa_file(
+            settings.FUNKWHALE_SPA_HTML_ROOT, "manifest.json"
+        )
+        parsed_manifest = json.loads(existing_manifest)
+        parsed_manifest["short_name"] = settings.APP_NAME
+        parsed_manifest["start_url"] = federation_utils.full_url("/")
+        instance_name = preferences.get("instance__name")
+        if instance_name:
+            parsed_manifest["short_name"] = instance_name
+            parsed_manifest["name"] = instance_name
+        instance_description = preferences.get("instance__short_description")
+        if instance_description:
+            parsed_manifest["description"] = instance_description
+        return Response(parsed_manifest, status=200)
@@ -8,6 +8,7 @@ from funkwhale_api.common import fields
from funkwhale_api.common import filters as common_filters
from funkwhale_api.common import search

+from funkwhale_api.audio import models as audio_models
from funkwhale_api.federation import models as federation_models
from funkwhale_api.federation import utils as federation_utils
from funkwhale_api.moderation import models as moderation_models

@@ -34,6 +35,34 @@ def get_actor_filter(actor_field):
    return {"field": ActorField(), "handler": handler}


+class ManageChannelFilterSet(filters.FilterSet):
+    q = fields.SmartSearchFilter(
+        config=search.SearchConfig(
+            search_fields={
+                "name": {"to": "artist__name"},
+                "username": {"to": "artist__name"},
+                "fid": {"to": "artist__fid"},
+                "rss": {"to": "rss_url"},
+            },
+            filter_fields={
+                "uuid": {"to": "uuid"},
+                "category": {"to": "artist__content_category"},
+                "domain": {
+                    "handler": lambda v: federation_utils.get_domain_query_from_url(
+                        v, url_field="attributed_to__fid"
+                    )
+                },
+                "tag": {"to": "artist__tagged_items__tag__name", "distinct": True},
+                "account": get_actor_filter("attributed_to"),
+            },
+        )
+    )
+
+    class Meta:
+        model = audio_models.Channel
+        fields = ["q"]
+
+
class ManageArtistFilterSet(filters.FilterSet):
    q = fields.SmartSearchFilter(
        config=search.SearchConfig(

@@ -52,6 +81,7 @@ class ManageArtistFilterSet(filters.FilterSet):
                "field": forms.IntegerField(),
                "distinct": True,
            },
+            "category": {"to": "content_category"},
            "tag": {"to": "tagged_items__tag__name", "distinct": True},
        },
    )

@@ -59,7 +89,7 @@ class ManageArtistFilterSet(filters.FilterSet):

    class Meta:
        model = music_models.Artist
-        fields = ["q", "name", "mbid", "fid"]
+        fields = ["q", "name", "mbid", "fid", "content_category"]


class ManageAlbumFilterSet(filters.FilterSet):

@@ -394,3 +424,26 @@ class ManageNoteFilterSet(filters.FilterSet):
    class Meta:
        model = moderation_models.Note
        fields = ["q"]


+class ManageUserRequestFilterSet(filters.FilterSet):
+    q = fields.SmartSearchFilter(
+        config=search.SearchConfig(
+            search_fields={
+                "username": {"to": "submitter__preferred_username"},
+                "uuid": {"to": "uuid"},
+            },
+            filter_fields={
+                "uuid": {"to": "uuid"},
+                "id": {"to": "id"},
+                "status": {"to": "status"},
+                "category": {"to": "type"},
+                "submitter": get_actor_filter("submitter"),
+                "assigned_to": get_actor_filter("assigned_to"),
+            },
+        )
+    )
+
+    class Meta:
+        model = moderation_models.UserRequest
+        fields = ["q", "status", "type"]
@@ -3,6 +3,7 @@ from django.db import transaction

from rest_framework import serializers

+from funkwhale_api.audio import models as audio_models
from funkwhale_api.common import fields as common_fields
from funkwhale_api.common import serializers as common_serializers
from funkwhale_api.common import utils as common_utils

@@ -173,7 +174,7 @@ class ManageDomainActionSerializer(common_serializers.ActionSerializer):

    @transaction.atomic
    def handle_purge(self, objects):
-        ids = objects.values_list("pk", flat=True)
+        ids = objects.values_list("pk", flat=True).order_by("pk")
        common_utils.on_commit(federation_tasks.purge_actors.delay, domains=list(ids))

    @transaction.atomic

@@ -383,31 +384,50 @@ class ManageNestedAlbumSerializer(ManageBaseAlbumSerializer):
        return getattr(obj, "tracks_count", None)


-class ManageArtistSerializer(ManageBaseArtistSerializer):
-    albums = ManageNestedAlbumSerializer(many=True)
-    tracks = ManageNestedTrackSerializer(many=True)
+class ManageArtistSerializer(
+    music_serializers.OptionalDescriptionMixin, ManageBaseArtistSerializer
+):
    attributed_to = ManageBaseActorSerializer()
    tags = serializers.SerializerMethodField()
+    tracks_count = serializers.SerializerMethodField()
+    albums_count = serializers.SerializerMethodField()
+    channel = serializers.SerializerMethodField()
+    cover = music_serializers.cover_field

    class Meta:
        model = music_models.Artist
        fields = ManageBaseArtistSerializer.Meta.fields + [
-            "albums",
-            "tracks",
+            "tracks_count",
+            "albums_count",
            "attributed_to",
            "tags",
+            "cover",
+            "channel",
+            "content_category",
        ]

+    def get_tracks_count(self, obj):
+        return getattr(obj, "_tracks_count", None)
+
+    def get_albums_count(self, obj):
+        return getattr(obj, "_albums_count", None)
+
    def get_tags(self, obj):
        tagged_items = getattr(obj, "_prefetched_tagged_items", [])
        return [ti.tag.name for ti in tagged_items]

+    def get_channel(self, obj):
+        if "channel" in obj._state.fields_cache and obj.get_channel():
+            return str(obj.channel.uuid)
+

class ManageNestedArtistSerializer(ManageBaseArtistSerializer):
    pass


-class ManageAlbumSerializer(ManageBaseAlbumSerializer):
+class ManageAlbumSerializer(
+    music_serializers.OptionalDescriptionMixin, ManageBaseAlbumSerializer
+):
    tracks = ManageNestedTrackSerializer(many=True)
    attributed_to = ManageBaseActorSerializer()
    artist = ManageNestedArtistSerializer()

@@ -435,12 +455,15 @@ class ManageTrackAlbumSerializer(ManageBaseAlbumSerializer):
        fields = ManageBaseAlbumSerializer.Meta.fields + ["artist"]


-class ManageTrackSerializer(ManageNestedTrackSerializer):
+class ManageTrackSerializer(
+    music_serializers.OptionalDescriptionMixin, ManageNestedTrackSerializer
+):
    artist = ManageNestedArtistSerializer()
    album = ManageTrackAlbumSerializer()
    attributed_to = ManageBaseActorSerializer()
    uploads_count = serializers.SerializerMethodField()
    tags = serializers.SerializerMethodField()
+    cover = music_serializers.cover_field

    class Meta:
        model = music_models.Track

@@ -450,6 +473,7 @@ class ManageTrackSerializer(ManageNestedTrackSerializer):
            "attributed_to",
            "uploads_count",
            "tags",
+            "cover",
        ]

    def get_uploads_count(self, obj):

@@ -700,3 +724,56 @@ class ManageReportSerializer(serializers.ModelSerializer):
    def get_notes(self, o):
        notes = getattr(o, "_prefetched_notes", [])
        return ManageBaseNoteSerializer(notes, many=True).data


+class ManageUserRequestSerializer(serializers.ModelSerializer):
+    assigned_to = ManageBaseActorSerializer()
+    submitter = ManageBaseActorSerializer()
+    notes = serializers.SerializerMethodField()
+
+    class Meta:
+        model = moderation_models.UserRequest
+        fields = [
+            "id",
+            "uuid",
+            "creation_date",
+            "handled_date",
+            "type",
+            "status",
+            "assigned_to",
+            "submitter",
+            "notes",
+            "metadata",
+        ]
+        read_only_fields = [
+            "id",
+            "uuid",
+            "submitter",
+            "creation_date",
+            "handled_date",
+            "metadata",
+        ]
+
+    def get_notes(self, o):
+        notes = getattr(o, "_prefetched_notes", [])
+        return ManageBaseNoteSerializer(notes, many=True).data
+
+
+class ManageChannelSerializer(serializers.ModelSerializer):
+    attributed_to = ManageBaseActorSerializer()
+    actor = ManageBaseActorSerializer()
+    artist = ManageArtistSerializer()
+
+    class Meta:
+        model = audio_models.Channel
+        fields = [
+            "id",
+            "uuid",
+            "creation_date",
+            "artist",
+            "attributed_to",
+            "actor",
+            "rss_url",
+            "metadata",
+        ]
+        read_only_fields = fields
@@ -18,6 +18,7 @@ moderation_router.register(
    r"instance-policies", views.ManageInstancePolicyViewSet, "instance-policies"
)
moderation_router.register(r"reports", views.ManageReportViewSet, "reports")
+moderation_router.register(r"requests", views.ManageUserRequestViewSet, "requests")
moderation_router.register(r"notes", views.ManageNoteViewSet, "notes")

users_router = routers.OptionalSlashRouter()

@@ -26,6 +27,7 @@ users_router.register(r"invitations", views.ManageInvitationViewSet, "invitation

other_router = routers.OptionalSlashRouter()
other_router.register(r"accounts", views.ManageActorViewSet, "accounts")
+other_router.register(r"channels", views.ManageChannelViewSet, "channels")
other_router.register(r"tags", views.ManageTagViewSet, "tags")

urlpatterns = [
@@ -1,19 +1,25 @@
from rest_framework import mixins, response, viewsets
from rest_framework import decorators as rest_decorators

from django.db import transaction
from django.db.models import Count, Prefetch, Q, Sum, OuterRef, Subquery
from django.db.models.functions import Coalesce, Length
from django.shortcuts import get_object_or_404

+from funkwhale_api.audio import models as audio_models
+from funkwhale_api.common.mixins import MultipleLookupDetailMixin
from funkwhale_api.common import models as common_models
from funkwhale_api.common import preferences, decorators
from funkwhale_api.common import utils as common_utils
from funkwhale_api.favorites import models as favorites_models
from funkwhale_api.federation import models as federation_models
from funkwhale_api.federation import tasks as federation_tasks
+from funkwhale_api.federation import utils as federation_utils
from funkwhale_api.history import models as history_models
from funkwhale_api.music import models as music_models
from funkwhale_api.music import views as music_views
from funkwhale_api.moderation import models as moderation_models
+from funkwhale_api.moderation import tasks as moderation_tasks
from funkwhale_api.playlists import models as playlists_models
from funkwhale_api.tags import models as tags_models
from funkwhale_api.users import models as users_models

@@ -22,26 +28,39 @@ from funkwhale_api.users import models as users_models
from . import filters, serializers


-def get_stats(tracks, target):
-    data = {}
+def get_stats(tracks, target, ignore_fields=[]):
    tracks = list(tracks.values_list("pk", flat=True))
    uploads = music_models.Upload.objects.filter(track__in=tracks)
-    data["listenings"] = history_models.Listening.objects.filter(
-        track__in=tracks
-    ).count()
-    data["mutations"] = common_models.Mutation.objects.get_for_target(target).count()
-    data["playlists"] = (
-        playlists_models.PlaylistTrack.objects.filter(track__in=tracks)
-        .values_list("playlist", flat=True)
-        .distinct()
-        .count()
-    )
-    data["track_favorites"] = favorites_models.TrackFavorite.objects.filter(
-        track__in=tracks
-    ).count()
-    data["libraries"] = uploads.values_list("library", flat=True).distinct().count()
-    data["uploads"] = uploads.count()
-    data["reports"] = moderation_models.Report.objects.get_for_target(target).count()
+    fields = {
+        "listenings": history_models.Listening.objects.filter(track__in=tracks),
+        "mutations": common_models.Mutation.objects.get_for_target(target),
+        "playlists": (
+            playlists_models.PlaylistTrack.objects.filter(track__in=tracks)
+            .values_list("playlist", flat=True)
+            .distinct()
+        ),
+        "track_favorites": (
+            favorites_models.TrackFavorite.objects.filter(track__in=tracks)
+        ),
+        "libraries": (
+            uploads.filter(library__channel=None)
+            .values_list("library", flat=True)
+            .distinct()
+        ),
+        "channels": (
+            uploads.exclude(library__channel=None)
+            .values_list("library", flat=True)
+            .distinct()
+        ),
+        "uploads": uploads,
+        "reports": moderation_models.Report.objects.get_for_target(target),
+    }
+    data = {}
+    for key, qs in fields.items():
+        if key in ignore_fields:
+            continue
+        data[key] = qs.count()

    data.update(get_media_stats(uploads))
    return data
|
|||
queryset = (
|
||||
music_models.Artist.objects.all()
|
||||
.order_by("-id")
|
||||
.select_related("attributed_to")
|
||||
.prefetch_related(
|
||||
"tracks",
|
||||
Prefetch(
|
||||
"albums",
|
||||
queryset=music_models.Album.objects.annotate(
|
||||
tracks_count=Count("tracks")
|
||||
),
|
||||
),
|
||||
music_views.TAG_PREFETCH,
|
||||
)
|
||||
.select_related("attributed_to", "attachment_cover", "channel")
|
||||
.annotate(_tracks_count=Count("tracks"))
|
||||
.annotate(_albums_count=Count("albums"))
|
||||
.prefetch_related(music_views.TAG_PREFETCH)
|
||||
)
|
||||
serializer_class = serializers.ManageArtistSerializer
|
||||
filterset_class = filters.ManageArtistFilterSet
|
||||
|
@ -100,6 +112,11 @@ class ManageArtistViewSet(
|
|||
result = serializer.save()
|
||||
return response.Response(result, status=200)
|
||||
|
||||
def get_serializer_context(self):
|
||||
context = super().get_serializer_context()
|
||||
context["description"] = self.action in ["retrieve", "create", "update"]
|
||||
return context
|
||||
|
||||
|
||||
class ManageAlbumViewSet(
|
||||
mixins.ListModelMixin,
|
||||
|
@ -110,7 +127,7 @@ class ManageAlbumViewSet(
|
|||
queryset = (
|
||||
music_models.Album.objects.all()
|
||||
.order_by("-id")
|
||||
.select_related("attributed_to", "artist")
|
||||
.select_related("attributed_to", "artist", "attachment_cover")
|
||||
.prefetch_related("tracks", music_views.TAG_PREFETCH)
|
||||
)
|
||||
serializer_class = serializers.ManageAlbumSerializer
|
||||
|
@ -134,6 +151,11 @@ class ManageAlbumViewSet(
|
|||
result = serializer.save()
|
||||
return response.Response(result, status=200)
|
||||
|
||||
def get_serializer_context(self):
|
||||
context = super().get_serializer_context()
|
||||
context["description"] = self.action in ["retrieve", "create", "update"]
|
||||
return context
|
||||
|
||||
|
||||
uploads_subquery = (
|
||||
music_models.Upload.objects.filter(track_id=OuterRef("pk"))
|
||||
|
@ -153,7 +175,13 @@ class ManageTrackViewSet(
|
|||
queryset = (
|
||||
music_models.Track.objects.all()
|
||||
.order_by("-id")
|
||||
.select_related("attributed_to", "artist", "album__artist")
|
||||
.select_related(
|
||||
"attributed_to",
|
||||
"artist",
|
||||
"album__artist",
|
||||
"album__attachment_cover",
|
||||
"attachment_cover",
|
||||
)
|
||||
.annotate(uploads_count=Coalesce(Subquery(uploads_subquery), 0))
|
||||
.prefetch_related(music_views.TAG_PREFETCH)
|
||||
)
|
||||
|
@ -184,6 +212,11 @@ class ManageTrackViewSet(
|
|||
result = serializer.save()
|
||||
return response.Response(result, status=200)
|
||||
|
||||
def get_serializer_context(self):
|
||||
context = super().get_serializer_context()
|
||||
context["description"] = self.action in ["retrieve", "create", "update"]
|
||||
return context
|
||||
|
||||
|
||||
uploads_subquery = (
|
||||
music_models.Upload.objects.filter(library_id=OuterRef("pk"))
|
||||
|
@ -212,6 +245,7 @@ class ManageLibraryViewSet(
|
|||
lookup_field = "uuid"
|
||||
queryset = (
|
||||
music_models.Library.objects.all()
|
||||
.filter(channel=None)
|
||||
.order_by("-id")
|
||||
.select_related("actor")
|
||||
.annotate(
|
||||
|
@ -351,8 +385,7 @@ class ManageDomainViewSet(
|
|||
):
|
||||
lookup_value_regex = r"[a-zA-Z0-9\-\.]+"
|
||||
queryset = (
|
||||
federation_models.Domain.objects.external()
|
||||
.with_actors_count()
|
||||
federation_models.Domain.objects.with_actors_count()
|
||||
.with_outbox_activities_count()
|
||||
.prefetch_related("instance_policy")
|
||||
.order_by("name")
|
||||
|
@ -369,6 +402,10 @@ class ManageDomainViewSet(
|
|||
"instance_policy",
|
||||
]
|
||||
|
||||
def get_queryset(self, **kwargs):
|
||||
queryset = super().get_queryset(**kwargs)
|
||||
return queryset.external()
|
||||
|
||||
def get_serializer_class(self):
|
||||
if self.action in ["update", "partial_update"]:
|
||||
# A dedicated serializer for update
|
||||
|
@ -433,8 +470,8 @@ class ManageActorViewSet(
|
|||
|
||||
@rest_decorators.action(methods=["get"], detail=True)
|
||||
def stats(self, request, *args, **kwargs):
|
||||
domain = self.get_object()
|
||||
return response.Response(domain.get_stats(), status=200)
|
||||
obj = self.get_object()
|
||||
return response.Response(obj.get_stats(), status=200)
|
||||
|
||||
action = decorators.action_route(serializers.ManageActorActionSerializer)
@@ -571,3 +608,115 @@ class ManageTagViewSet(
        serializer.is_valid(raise_exception=True)
        result = serializer.save()
        return response.Response(result, status=200)


class ManageUserRequestViewSet(
    mixins.ListModelMixin,
    mixins.RetrieveModelMixin,
    mixins.UpdateModelMixin,
    viewsets.GenericViewSet,
):
    lookup_field = "uuid"
    queryset = (
        moderation_models.UserRequest.objects.all()
        .order_by("-creation_date")
        .select_related("submitter", "assigned_to")
        .prefetch_related(
            Prefetch(
                "notes",
                queryset=moderation_models.Note.objects.order_by(
                    "creation_date"
                ).select_related("author"),
                to_attr="_prefetched_notes",
            )
        )
    )
    serializer_class = serializers.ManageUserRequestSerializer
    filterset_class = filters.ManageUserRequestFilterSet
    required_scope = "instance:requests"
    ordering_fields = ["id", "creation_date", "handled_date"]

    def get_queryset(self):
        queryset = super().get_queryset()
        if self.action in ["update", "partial_update"]:
            # approved requests cannot be edited
            queryset = queryset.exclude(status="approved")
        return queryset

    @transaction.atomic
    def perform_update(self, serializer):
        old_status = serializer.instance.status
        new_status = serializer.validated_data.get("status")

        if old_status != new_status and new_status != "pending":
            # the request was handled, we assign it to the mod making the change
            serializer.save(assigned_to=self.request.user.actor)
            common_utils.on_commit(
                moderation_tasks.user_request_handle.delay,
                user_request_id=serializer.instance.pk,
                new_status=new_status,
                old_status=old_status,
            )
        else:
            serializer.save()


class ManageChannelViewSet(
    MultipleLookupDetailMixin,
    mixins.ListModelMixin,
    mixins.RetrieveModelMixin,
    mixins.DestroyModelMixin,
    viewsets.GenericViewSet,
):

    url_lookups = [
        {
            "lookup_field": "uuid",
            "validator": serializers.serializers.UUIDField().to_internal_value,
        },
        {
            "lookup_field": "username",
            "validator": federation_utils.get_actor_data_from_username,
            "get_query": lambda v: Q(
                actor__domain=v["domain"],
                actor__preferred_username__iexact=v["username"],
            ),
        },
    ]
    queryset = (
        audio_models.Channel.objects.all()
        .order_by("-id")
        .select_related("attributed_to", "actor")
        .prefetch_related(
            Prefetch(
                "artist",
                queryset=(
                    music_models.Artist.objects.all()
                    .order_by("-id")
                    .select_related("attributed_to", "attachment_cover", "channel")
                    .annotate(_tracks_count=Count("tracks"))
                    .annotate(_albums_count=Count("albums"))
                    .prefetch_related(music_views.TAG_PREFETCH)
                ),
            )
        )
    )
    serializer_class = serializers.ManageChannelSerializer
    filterset_class = filters.ManageChannelFilterSet
    required_scope = "instance:libraries"
    ordering_fields = ["creation_date", "name"]

    @rest_decorators.action(methods=["get"], detail=True)
    def stats(self, request, *args, **kwargs):
        channel = self.get_object()
        tracks = music_models.Track.objects.filter(
            Q(artist=channel.artist) | Q(album__artist=channel.artist)
        )
        data = get_stats(tracks, channel, ignore_fields=["libraries", "channels"])
        data["follows"] = channel.actor.received_follows.count()
        return response.Response(data, status=200)

    def get_serializer_context(self):
        context = super().get_serializer_context()
        context["description"] = self.action in ["retrieve", "create", "update"]
        return context
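The guard in `perform_update` above reduces to a small predicate: the handling task is only dispatched when the status actually changes and the new status is not `pending`. A minimal standalone sketch of that logic (plain Python, function name hypothetical, not part of the Funkwhale codebase):

```python
def should_dispatch(old_status, new_status):
    # Mirrors the condition in ManageUserRequestViewSet.perform_update:
    # fire the handling task only when the status changed and the
    # request is no longer pending.
    return old_status != new_status and new_status != "pending"


# Dispatches on a real resolution, but not on no-op saves
# or on a transition back to "pending".
print(should_dispatch("pending", "approved"))  # True
print(should_dispatch("pending", "pending"))   # False
print(should_dispatch("approved", "pending"))  # False
```

Because the save and the `on_commit` hook run inside `@transaction.atomic`, the task is only enqueued if the surrounding database transaction commits.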