Merge branch 'release/0.17'

commit 007d1f7f47

.env.dev
@@ -11,3 +11,4 @@ VUE_PORT=8080
MUSIC_DIRECTORY_PATH=/music
BROWSABLE_API_ENABLED=True
FORWARDED_PROTO=http
LDAP_ENABLED=False
@@ -148,6 +148,8 @@ test_api:
    - branches
  before_script:
    - cd api
    - apt-get update
    - grep "^[^#;]" requirements.apt | grep -Fv "python3-dev" | xargs apt-get install -y --no-install-recommends
    - pip install -r requirements/base.txt
    - pip install -r requirements/local.txt
    - pip install -r requirements/test.txt
@@ -0,0 +1,6 @@
Related issue: #XXX <!-- it's okay to have no issue for small changes -->

This Merge Request includes:

- [ ] Tests
- [ ] A changelog fragment (cf https://docs.funkwhale.audio/contributing.html#changelog-management)
CHANGELOG
@@ -10,6 +10,200 @@ This changelog is viewable on the web at https://docs.funkwhale.audio/changelog.

.. towncrier

0.17 (2018-10-07)
-----------------

Per user libraries
^^^^^^^^^^^^^^^^^^

This release contains a big change in music management. This has a lot of impact
on how Funkwhale behaves, and you should have a look at
https://docs.funkwhale.audio/upgrading/0.17.html for information
about what changed and how to migrate.

Features:

- Per user libraries (#463, also fixes #160 and #147)
- Authentication using an LDAP directory (#194)

Enhancements:

- Add configuration option to set MusicBrainz hostname
- Add sign up link in the sidebar (#408)
- Added a library widget to display libraries associated with a track, album
  and artist (#551)
- Ensure from_activity field is not required in Django's admin (#546)
- Move settings link from profile page to the sidebar (#406)
- Simplified and less error-prone nginx setup (#358)

Bugfixes:

- Do not restart current song when reordering queue, deleting tracks from queue
  or adding tracks to queue (#464)
- Fix broken icons in playlist editor (#515)
- Fixed a few untranslated strings (#559)
- Fixed split albums when importing from federation (#346)
- Fixed toggle mute in volume bar not restoring the previous volume level (#514)
- Fixed wrong env file URL and display bugs in deployment documentation (#520)
- Fixed wrong title in PlayButton (#435)
- Remove transparency on artist page button (#517)
- Set sane width default for ui cards and center play button (#530)
- Updated wrong icon and copy in play button dropdown (#436)

Documentation:

- Fixed wrong URLs for docker / nginx files in documentation (#537)

Other:

- Added a merge request template and more documentation about the changelog

Using an LDAP directory to authenticate to your Funkwhale instance
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Funkwhale now supports LDAP as an authentication source: you can configure
your instance to delegate login to an LDAP directory, which is especially
useful when you have an existing directory and don't want to manage users
manually.

You can use this authentication backend side by side with the classic one.

Have a look at https://docs.funkwhale.audio/installation/ldap.html
for detailed instructions on how to set this up.

Simplified nginx setup [Docker: Manual action required]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

We've received a lot of user feedback regarding our installation process,
and the reverse proxy part turned out to be the most confusing and difficult one.
Unfortunately, it is also the one where errors and mistakes can completely break
the application.

To make things easier for everyone, we now offer a simplified deployment
process for the reverse proxy part. This will make upgrades of the proxy
configuration significantly easier on docker deployments.

On non-docker instances, you have nothing to do.

If you have a dockerized instance, here is the upgrade path.

First, tweak your .env file::

    # remove the FUNKWHALE_URL variable
    # and add the next variables
    FUNKWHALE_HOSTNAME=yourdomain.funkwhale
    FUNKWHALE_PROTOCOL=https

    # add the following variable, matching the path where your app is deployed;
    # leaving the default should work fine if you deployed using the same
    # paths as the documentation
    FUNKWHALE_FRONTEND_PATH=/srv/funkwhale/front/dist

Then, add the following block at the end of your docker-compose.yml file::

    # existing services
    api:
      ...
    celeryworker:
      ...

    # new service
    nginx:
      image: nginx
      env_file:
        - .env
      environment:
        # Override those variables in your .env file if needed
        - "NGINX_MAX_BODY_SIZE=${NGINX_MAX_BODY_SIZE-30M}"
      volumes:
        - "./nginx/funkwhale.template:/etc/nginx/conf.d/funkwhale.template:ro"
        - "./nginx/funkwhale_proxy.conf:/etc/nginx/funkwhale_proxy.conf:ro"
        - "${MUSIC_DIRECTORY_SERVE_PATH-/srv/funkwhale/data/music}:${MUSIC_DIRECTORY_SERVE_PATH-/srv/funkwhale/data/music}:ro"
        - "${MEDIA_ROOT}:${MEDIA_ROOT}:ro"
        - "${STATIC_ROOT}:${STATIC_ROOT}:ro"
        - "${FUNKWHALE_FRONTEND_PATH}:/frontend:ro"
      ports:
        # override those variables in your .env file if needed
        - "${FUNKWHALE_API_IP}:${FUNKWHALE_API_PORT}:80"
      command: >
          sh -c "envsubst \"`env | awk -F = '{printf \" $$%s\", $$1}'`\"
          < /etc/nginx/conf.d/funkwhale.template
          > /etc/nginx/conf.d/default.conf
          && cat /etc/nginx/conf.d/default.conf
          && nginx -g 'daemon off;'"
      links:
        - api

By doing that, you'll enable a dockerized nginx that will automatically be
configured to serve your Funkwhale instance.

Download the required configuration files for the nginx container:

.. parsed-literal::

    cd /srv/funkwhale
    mkdir nginx
    curl -L -o nginx/funkwhale.template "https://code.eliotberriot.com/funkwhale/funkwhale/raw/|version|/deploy/docker.nginx.template"
    curl -L -o nginx/funkwhale_proxy.conf "https://code.eliotberriot.com/funkwhale/funkwhale/raw/|version|/deploy/funkwhale_proxy.conf"

Update the funkwhale.conf configuration of your server's reverse proxy::

    # the file should look something like this; update all ${...} variables
    # to match the ones in your .env file,
    # and your SSL configuration if you're not using Let's Encrypt.
    # The important thing is that you only have a single location block
    # that proxies everything to your dockerized nginx.

    sudo nano /etc/nginx/sites-enabled/funkwhale.conf

    upstream fw {
        # depending on your setup, you may want to update this
        server ${FUNKWHALE_API_IP}:${FUNKWHALE_API_PORT};
    }
    map $http_upgrade $connection_upgrade {
        default upgrade;
        '' close;
    }

    server {
        listen 80;
        listen [::]:80;
        server_name ${FUNKWHALE_HOSTNAME};
        location / { return 301 https://$host$request_uri; }
    }
    server {
        listen 443 ssl;
        listen [::]:443 ssl;
        server_name ${FUNKWHALE_HOSTNAME};

        # TLS
        ssl_protocols TLSv1.2;
        ssl_ciphers HIGH:!MEDIUM:!LOW:!aNULL:!NULL:!SHA;
        ssl_prefer_server_ciphers on;
        ssl_session_cache shared:SSL:10m;
        ssl_certificate /etc/letsencrypt/live/${FUNKWHALE_HOSTNAME}/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/${FUNKWHALE_HOSTNAME}/privkey.pem;

        # HSTS
        add_header Strict-Transport-Security "max-age=31536000";

        location / {
            include /etc/nginx/funkwhale_proxy.conf;
            proxy_pass http://fw/;
        }
    }

Check that your configuration is valid, then reload::

    sudo nginx -t
    sudo systemctl reload nginx

0.16.3 (2018-08-21)
-------------------
@@ -249,6 +249,7 @@ Then, in separate terminals, you can setup as many different instances as you
need::

    export COMPOSE_PROJECT_NAME=node2
    export VUE_PORT=1234  # this has to be unique for each instance
    docker-compose -f dev.yml run --rm api python manage.py migrate
    docker-compose -f dev.yml run --rm api python manage.py createsuperuser
    docker-compose -f dev.yml up nginx api front nginx api celeryworker
@@ -285,6 +286,62 @@ Typical workflow for a contribution
8. Create your merge request
9. Take a step back and enjoy, we're really grateful you did all of this and took the time to contribute!

Changelog management
--------------------

To ensure we have an extensive and well-structured changelog, any significant
work such as closing an issue must include a changelog fragment. Small changes
may include a changelog fragment as well, but this is not mandatory. If you're not
sure about what to do, do not panic: open your merge request normally and we'll
figure everything out during the review ;)

Changelog fragments are text files that can contain one or multiple lines
describing the changes occurring in a bunch of commits. Those files reside
in ``changes/changelog.d``.

Content
^^^^^^^

A typical fragment looks like this::

    Fixed broken audio player on Chrome 42 for ogg files (#567)

If the work fixes one or more issues, the issue number should be included at the
end of the fragment (``(#567)`` is the issue number in the previous example).

If your work is not related to a specific issue, use the merge request
identifier instead, like this::

    Fixed a typo in landing page copy (!342)

Naming
^^^^^^

Fragment files should respect the following naming pattern: ``changes/changelog.d/<name>.<category>``.
Name can be anything describing your work, or simply the identifier of the issue number you are fixing.
Category can be one of:

- ``feature``: for new features
- ``enhancement``: for enhancements on existing features
- ``bugfix``: for bugfixes
- ``doc``: for documentation
- ``i18n``: for internationalization-related work
- ``misc``: for anything else

Shortcuts
^^^^^^^^^

Here is a shortcut you can use/adapt to easily create new fragments from the command line:

.. code-block:: bash

    issue="42"
    content="Fixed an overflowing issue on small resolutions (#$issue)"
    category="bugfix"
    echo "$content" > changes/changelog.d/$issue.$category

You can of course create fragments by hand in your text editor, or from GitLab's
interface as well.

Internationalization
--------------------
@@ -14,12 +14,11 @@ router.register(r"settings", GlobalPreferencesViewSet, base_name="settings")
router.register(r"activity", activity_views.ActivityViewSet, "activity")
router.register(r"tags", views.TagViewSet, "tags")
router.register(r"tracks", views.TrackViewSet, "tracks")
router.register(r"trackfiles", views.TrackFileViewSet, "trackfiles")
router.register(r"uploads", views.UploadViewSet, "uploads")
router.register(r"libraries", views.LibraryViewSet, "libraries")
router.register(r"listen", views.ListenViewSet, "listen")
router.register(r"artists", views.ArtistViewSet, "artists")
router.register(r"albums", views.AlbumViewSet, "albums")
router.register(r"import-batches", views.ImportBatchViewSet, "import-batches")
router.register(r"import-jobs", views.ImportJobViewSet, "import-jobs")
router.register(r"submit", views.SubmitViewSet, "submit")
router.register(r"playlists", playlists_views.PlaylistViewSet, "playlists")
router.register(
    r"playlist-tracks", playlists_views.PlaylistTrackViewSet, "playlist-tracks"

@@ -66,10 +65,6 @@ v1_patterns += [
        r"^users/",
        include(("funkwhale_api.users.api_urls", "users"), namespace="users"),
    ),
    url(
        r"^requests/",
        include(("funkwhale_api.requests.api_urls", "requests"), namespace="requests"),
    ),
    url(r"^token/$", jwt_views.obtain_jwt_token, name="token"),
    url(r"^token/refresh/$", jwt_views.refresh_jwt_token, name="token_refresh"),
]
@@ -8,9 +8,7 @@ application = ProtocolTypeRouter(
    {
        # Empty for now (http->django views is added by default)
        "websocket": TokenAuthMiddleware(
            URLRouter(
                [url("^api/v1/instance/activity$", consumers.InstanceActivityConsumer)]
            )
            URLRouter([url("^api/v1/activity$", consumers.InstanceActivityConsumer)])
        )
    }
)
@@ -125,8 +125,6 @@ LOCAL_APPS = (
    "funkwhale_api.radios",
    "funkwhale_api.history",
    "funkwhale_api.playlists",
    "funkwhale_api.providers.audiofile",
    "funkwhale_api.providers.youtube",
    "funkwhale_api.providers.acoustid",
    "funkwhale_api.subsonic",
)

@@ -280,7 +278,7 @@ MEDIA_ROOT = env("MEDIA_ROOT", default=str(APPS_DIR("media")))

# See: https://docs.djangoproject.com/en/dev/ref/settings/#media-url
MEDIA_URL = env("MEDIA_URL", default="/media/")

FILE_UPLOAD_PERMISSIONS = 0o644
# URL Configuration
# ------------------------------------------------------------------------------
ROOT_URLCONF = "config.urls"
@@ -310,6 +308,71 @@ AUTH_USER_MODEL = "users.User"
LOGIN_REDIRECT_URL = "users:redirect"
LOGIN_URL = "account_login"

# LDAP AUTHENTICATION CONFIGURATION
# ------------------------------------------------------------------------------
AUTH_LDAP_ENABLED = env.bool("LDAP_ENABLED", default=False)
if AUTH_LDAP_ENABLED:

    # Import the LDAP modules here; this way, we don't need the dependency unless someone
    # actually enables the LDAP support
    import ldap
    from django_auth_ldap.config import LDAPSearch, LDAPSearchUnion, GroupOfNamesType

    # Add LDAP to the authentication backends
    AUTHENTICATION_BACKENDS += ("django_auth_ldap.backend.LDAPBackend",)

    # Basic configuration
    AUTH_LDAP_SERVER_URI = env("LDAP_SERVER_URI")
    AUTH_LDAP_BIND_DN = env("LDAP_BIND_DN", default="")
    AUTH_LDAP_BIND_PASSWORD = env("LDAP_BIND_PASSWORD", default="")
    AUTH_LDAP_SEARCH_FILTER = env("LDAP_SEARCH_FILTER", default="(uid={0})").format(
        "%(user)s"
    )
    AUTH_LDAP_START_TLS = env.bool("LDAP_START_TLS", default=False)

    DEFAULT_USER_ATTR_MAP = [
        "first_name:givenName",
        "last_name:sn",
        "username:cn",
        "email:mail",
    ]
    LDAP_USER_ATTR_MAP = env.list("LDAP_USER_ATTR_MAP", default=DEFAULT_USER_ATTR_MAP)
    AUTH_LDAP_USER_ATTR_MAP = {}
    for m in LDAP_USER_ATTR_MAP:
        funkwhale_field, ldap_field = m.split(":")
        AUTH_LDAP_USER_ATTR_MAP[funkwhale_field.strip()] = ldap_field.strip()

    # Determine root DN, supporting multiple root DNs
    AUTH_LDAP_ROOT_DN = env("LDAP_ROOT_DN")
    AUTH_LDAP_ROOT_DN_LIST = []
    for ROOT_DN in AUTH_LDAP_ROOT_DN.split():
        AUTH_LDAP_ROOT_DN_LIST.append(
            LDAPSearch(ROOT_DN, ldap.SCOPE_SUBTREE, AUTH_LDAP_SEARCH_FILTER)
        )
    # Search for the user in all the root DNs
    AUTH_LDAP_USER_SEARCH = LDAPSearchUnion(*AUTH_LDAP_ROOT_DN_LIST)

    # Search for group types
    LDAP_GROUP_DN = env("LDAP_GROUP_DN", default="")
    if LDAP_GROUP_DN:
        AUTH_LDAP_GROUP_DN = LDAP_GROUP_DN
        # Get filter
        AUTH_LDAP_GROUP_FILTER = env("LDAP_GROUP_FILER", default="")
        # Search for the group in the specified DN
        AUTH_LDAP_GROUP_SEARCH = LDAPSearch(
            AUTH_LDAP_GROUP_DN, ldap.SCOPE_SUBTREE, AUTH_LDAP_GROUP_FILTER
        )
        AUTH_LDAP_GROUP_TYPE = GroupOfNamesType()

    # Configure basic group support
    LDAP_REQUIRE_GROUP = env("LDAP_REQUIRE_GROUP", default="")
    if LDAP_REQUIRE_GROUP:
        AUTH_LDAP_REQUIRE_GROUP = LDAP_REQUIRE_GROUP
    LDAP_DENY_GROUP = env("LDAP_DENY_GROUP", default="")
    if LDAP_DENY_GROUP:
        AUTH_LDAP_DENY_GROUP = LDAP_DENY_GROUP


# SLUGLIFIER
AUTOSLUG_SLUGIFY_FUNCTION = "slugify.slugify"
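The loop above turns ``"field:ldapAttr"`` strings from the environment into the attribute-map dict django-auth-ldap expects. A minimal standalone sketch of that transformation (the ``parse_attr_map`` helper name is illustrative, not part of the settings module):

```python
def parse_attr_map(entries):
    """Turn ["first_name:givenName", ...] into {"first_name": "givenName", ...},
    stripping whitespace around both halves, as the settings loop does."""
    mapping = {}
    for entry in entries:
        funkwhale_field, ldap_field = entry.split(":")
        mapping[funkwhale_field.strip()] = ldap_field.strip()
    return mapping

# the default map from the settings above
print(parse_attr_map(["first_name:givenName", "last_name:sn", "username:cn", "email:mail"]))
```

Since ``env.list`` yields plain strings, the same helper also tolerates stray spaces such as ``"email: mail"``.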
@@ -381,7 +444,7 @@ REST_FRAMEWORK = {
    "DEFAULT_AUTHENTICATION_CLASSES": (
        "funkwhale_api.common.authentication.JSONWebTokenAuthenticationQS",
        "funkwhale_api.common.authentication.BearerTokenHeaderAuth",
        "rest_framework_jwt.authentication.JSONWebTokenAuthentication",
        "funkwhale_api.common.authentication.JSONWebTokenAuthentication",
        "rest_framework.authentication.SessionAuthentication",
        "rest_framework.authentication.BasicAuthentication",
    ),
@@ -422,6 +485,11 @@ PROTECT_FILES_PATH = env("PROTECT_FILES_PATH", default="/_protected")
# musicbrainz results. (value is in seconds)
MUSICBRAINZ_CACHE_DURATION = env.int("MUSICBRAINZ_CACHE_DURATION", default=300)

# Use this setting to change the musicbrainz hostname, for instance to
# use a mirror. The hostname can also contain a port number (so, e.g.,
# "localhost:5000" is a valid name to set).
MUSICBRAINZ_HOSTNAME = env("MUSICBRAINZ_HOSTNAME", default="musicbrainz.org")

# Custom Admin URL, use {% url 'admin:index' %}
ADMIN_URL = env("DJANGO_ADMIN_URL", default="^api/admin/")
CSRF_USE_SESSIONS = True
@@ -445,8 +513,14 @@ ACCOUNT_USERNAME_BLACKLIST = [
    "me",
    "ghost",
    "_",
    "-",
    "hello",
    "contact",
    "inbox",
    "outbox",
    "shared-inbox",
    "shared_inbox",
    "actor",
] + env.list("ACCOUNT_USERNAME_BLACKLIST", default=[])

EXTERNAL_REQUESTS_VERIFY_SSL = env.bool("EXTERNAL_REQUESTS_VERIFY_SSL", default=True)
@@ -67,6 +67,7 @@ LOGGING = {
            "propagate": True,
            "level": "DEBUG",
        },
        "django_auth_ldap": {"handlers": ["console"], "level": "DEBUG"},
        "": {"level": "DEBUG", "handlers": ["console"]},
    },
}
@@ -2,11 +2,13 @@
from __future__ import unicode_literals

from django.conf import settings
from django.conf.urls import include, url
from django.conf.urls import url
from django.urls import include, path
from django.conf.urls.static import static
from django.contrib import admin
from funkwhale_api.common import admin
from django.views import defaults as default_views


urlpatterns = [
    # Django Admin, use {% url 'admin:index' %}
    url(settings.ADMIN_URL, admin.site.urls),

@@ -36,4 +38,6 @@ if settings.DEBUG:
    if "debug_toolbar" in settings.INSTALLED_APPS:
        import debug_toolbar

        urlpatterns += [url(r"^__debug__/", include(debug_toolbar.urls))]
        urlpatterns = [
            path("api/__debug__/", include(debug_toolbar.urls))
        ] + urlpatterns
@@ -1,5 +1,5 @@
# -*- coding: utf-8 -*-
__version__ = "0.16.3"
__version__ = "0.17"
__version_info__ = tuple(
    [
        int(num) if num.isdigit() else num
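The hunk above is truncated mid-comprehension, but the intent is the common pattern of deriving a ``__version_info__`` tuple from the version string. A standalone sketch of that idea (the ``version_info`` function name is illustrative, not the module's actual code):

```python
def version_info(version):
    # Split the version string on dots, converting purely numeric
    # components to ints and leaving anything else (e.g. "beta") as a string.
    return tuple(int(num) if num.isdigit() else num for num in version.split("."))

print(version_info("0.17"))    # (0, 17)
print(version_info("0.16.3"))  # (0, 16, 3)
```

Keeping numeric components as ints makes version tuples compare correctly (``(0, 16, 3) < (0, 17)``), which plain string comparison would get wrong.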
@@ -0,0 +1,19 @@
from django.contrib.admin import register as initial_register, site, ModelAdmin  # noqa
from django.db.models.fields.related import RelatedField


def register(model):
    """
    To make the admin more performant, we ensure all the relations
    are listed under raw_id_fields
    """

    def decorator(modeladmin):
        raw_id_fields = []
        for field in model._meta.fields:
            if isinstance(field, RelatedField):
                raw_id_fields.append(field.name)
        setattr(modeladmin, "raw_id_fields", raw_id_fields)
        return initial_register(model)(modeladmin)

    return decorator
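The decorator inspects the model's fields, collects relation names into ``raw_id_fields``, then delegates to Django's own ``register``. A dependency-free sketch of the same pattern (the ``FakeModel``/``FakeAdmin`` classes below are hypothetical stand-ins, not Funkwhale code):

```python
class RelatedField:  # stand-in for django.db.models.fields.related.RelatedField
    def __init__(self, name):
        self.name = name

class PlainField:  # stand-in for a non-relational model field
    def __init__(self, name):
        self.name = name

class _Meta:
    fields = [PlainField("title"), RelatedField("artist"), RelatedField("album")]

class FakeModel:
    _meta = _Meta()

def register(model):
    """Collect relation field names into raw_id_fields before registering."""
    def decorator(modeladmin):
        modeladmin.raw_id_fields = [
            f.name for f in model._meta.fields if isinstance(f, RelatedField)
        ]
        return modeladmin  # the real helper also calls Django's register here

    return decorator

@register(FakeModel)
class FakeAdmin:
    pass

print(FakeAdmin.raw_id_fields)  # ['artist', 'album']
```

Raw-id widgets avoid rendering a ``<select>`` with every related row, which is what makes the admin faster on large tables.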
@@ -56,3 +56,20 @@ class BearerTokenHeaderAuth(authentication.BaseJSONWebTokenAuthentication):

    def authenticate_header(self, request):
        return '{0} realm="{1}"'.format("Bearer", self.www_authenticate_realm)

    def authenticate(self, request):
        auth = super().authenticate(request)
        if auth:
            if not auth[0].actor:
                auth[0].create_actor()
        return auth


class JSONWebTokenAuthentication(authentication.JSONWebTokenAuthentication):
    def authenticate(self, request):
        auth = super().authenticate(request)

        if auth:
            if not auth[0].actor:
                auth[0].create_actor()
        return auth
@@ -1,6 +1,25 @@
import json
import logging

from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.core.serializers.json import DjangoJSONEncoder

logger = logging.getLogger(__file__)
channel_layer = get_channel_layer()
group_send = async_to_sync(channel_layer.group_send)
group_add = async_to_sync(channel_layer.group_add)


def group_send(group, event):
    # we serialize the payload ourselves and deserialize it to ensure it
    # works with msgpack. This is dirty, but we'll find a better solution
    # later
    s = json.dumps(event, cls=DjangoJSONEncoder)
    event = json.loads(s)
    logger.debug(
        "[channels] Dispatching %s to group %s: %s",
        event["type"],
        group,
        {"type": event["data"]["type"]},
    )
    async_to_sync(channel_layer.group_send)(group, event)
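The dump/load round-trip in ``group_send`` normalizes rich values (dates, UUIDs, and so on) into plain JSON types so the payload survives msgpack serialization in the channel layer. A standalone sketch with a stdlib encoder standing in for ``DjangoJSONEncoder`` (which performs a similar normalization):

```python
import datetime
import json
import uuid

class NormalizingEncoder(json.JSONEncoder):
    # DjangoJSONEncoder does roughly this for dates, datetimes, and UUIDs
    def default(self, o):
        if isinstance(o, (datetime.date, datetime.datetime)):
            return o.isoformat()
        if isinstance(o, uuid.UUID):
            return str(o)
        return super().default(o)

event = {
    "type": "event.send",
    "data": {"type": "import.done", "date": datetime.date(2018, 10, 7)},
}
# round-trip: after this, every value is a plain str/int/list/dict
event = json.loads(json.dumps(event, cls=NormalizingEncoder))
print(event["data"]["date"])  # 2018-10-07
```

Without the round-trip, a ``datetime.date`` inside the event would reach the channel layer as a Python object that msgpack cannot encode.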
@@ -16,3 +16,5 @@ class JsonAuthConsumer(JsonWebsocketConsumer):
        super().accept()
        for group in self.groups:
            channels.group_add(group, self.channel_name)
        for group in self.scope["user"].get_channels_groups():
            channels.group_add(group, self.channel_name)
@@ -9,7 +9,9 @@ from funkwhale_api.common import preferences

class ConditionalAuthentication(BasePermission):
    def has_permission(self, request, view):
        if preferences.get("common__api_authentication_required"):
            return request.user and request.user.is_authenticated
            return (request.user and request.user.is_authenticated) or (
                hasattr(request, "actor") and request.actor
            )
        return True
@@ -1,6 +1,7 @@
from . import create_actors
from . import create_image_variations
from . import django_permissions_to_user_permissions
from . import migrate_to_user_libraries
from . import test

@@ -8,5 +9,6 @@ __all__ = [
    "create_actors",
    "create_image_variations",
    "django_permissions_to_user_permissions",
    "migrate_to_user_libraries",
    "test",
]
@@ -0,0 +1,163 @@
"""
Migrate instance files to a library (#463). For each user that imported music on an
instance, we will create a "default" library with related files and an instance-level
visibility (unless the instance has common__api_authentication_required set to False,
in which case the libraries will be public).

Files without any import job will be bound to a "default" library on the first
superuser account found. This should not happen though.

This command will also generate federation ids for existing resources.
"""

from django.conf import settings
from django.db.models import functions, CharField, F, Value

from funkwhale_api.music import models
from funkwhale_api.users.models import User
from funkwhale_api.federation import models as federation_models
from funkwhale_api.common import preferences


def create_libraries(open_api, stdout):
    local_actors = federation_models.Actor.objects.exclude(user=None).only("pk", "user")
    privacy_level = "everyone" if open_api else "instance"
    stdout.write(
        "* Creating {} libraries with {} visibility".format(
            len(local_actors), privacy_level
        )
    )
    libraries_by_user = {}

    for a in local_actors:
        library, created = models.Library.objects.get_or_create(
            name="default", actor=a, defaults={"privacy_level": privacy_level}
        )
        libraries_by_user[library.actor.user.pk] = library.pk
        if created:
            stdout.write(
                " * Created library {} for user {}".format(library.pk, a.user.pk)
            )
        else:
            stdout.write(
                " * Found existing library {} for user {}".format(
                    library.pk, a.user.pk
                )
            )

    return libraries_by_user


def update_uploads(libraries_by_user, stdout):
    stdout.write("* Updating uploads with proper libraries...")
    for user_id, library_id in libraries_by_user.items():
        jobs = models.ImportJob.objects.filter(
            upload__library=None, batch__submitted_by=user_id
        )
        candidates = models.Upload.objects.filter(
            pk__in=jobs.values_list("upload", flat=True)
        )
        total = candidates.update(library=library_id, import_status="finished")
        if total:
            stdout.write(
                " * Assigned {} uploads to user {}'s library".format(total, user_id)
            )
        else:
            stdout.write(
                " * No uploads to assign to user {}'s library".format(user_id)
            )


def update_orphan_uploads(open_api, stdout):
    privacy_level = "everyone" if open_api else "instance"
    first_superuser = (
        User.objects.filter(is_superuser=True)
        .exclude(actor=None)
        .order_by("pk")
        .first()
    )
    if not first_superuser:
        stdout.write("* No superuser found, skipping update orphan uploads")
        return
    library, _ = models.Library.objects.get_or_create(
        name="default",
        actor=first_superuser.actor,
        defaults={"privacy_level": privacy_level},
    )
    candidates = (
        models.Upload.objects.filter(library=None, jobs__isnull=True)
        .exclude(audio_file=None)
        .exclude(audio_file="")
    )

    total = candidates.update(library=library, import_status="finished")
    if total:
        stdout.write(
            "* Assigned {} orphaned uploads to superuser {}".format(
                total, first_superuser.pk
            )
        )
    else:
        stdout.write("* No orphaned uploads found")


def set_fid(queryset, path, stdout):
    model = queryset.model._meta.label
    qs = queryset.filter(fid=None)
    base_url = "{}{}".format(settings.FUNKWHALE_URL, path)
    stdout.write(
        "* Assigning federation ids to {} entries (path: {})".format(model, base_url)
    )
    new_fid = functions.Concat(Value(base_url), F("uuid"), output_field=CharField())
    total = qs.update(fid=new_fid)

    stdout.write(" * {} entries updated".format(total))


def update_shared_inbox_url(stdout):
    stdout.write("* Update shared inbox url for local actors...")
    candidates = federation_models.Actor.objects.local()
    url = federation_models.get_shared_inbox_url()
    candidates.update(shared_inbox_url=url)


def generate_actor_urls(part, stdout):
    field = "{}_url".format(part)
    stdout.write("* Update {} for local actors...".format(field))

    queryset = federation_models.Actor.objects.local().filter(**{field: None})
    base_url = "{}/federation/actors/".format(settings.FUNKWHALE_URL)

    new_field = functions.Concat(
        Value(base_url),
        F("preferred_username"),
        Value("/{}".format(part)),
        output_field=CharField(),
    )

    queryset.update(**{field: new_field})


def main(command, **kwargs):
    open_api = not preferences.get("common__api_authentication_required")
    libraries_by_user = create_libraries(open_api, command.stdout)
    update_uploads(libraries_by_user, command.stdout)
    update_orphan_uploads(open_api, command.stdout)

    set_fid_params = [
        (
            models.Upload.objects.exclude(library__actor__user=None),
            "/federation/music/uploads/",
        ),
        (models.Artist.objects.all(), "/federation/music/artists/"),
        (models.Album.objects.all(), "/federation/music/albums/"),
        (models.Track.objects.all(), "/federation/music/tracks/"),
    ]
    for qs, path in set_fid_params:
        set_fid(qs, path, command.stdout)

    update_shared_inbox_url(command.stdout)

    for part in ["followers", "following"]:
        generate_actor_urls(part, command.stdout)
@@ -1,5 +1,70 @@
import collections

from rest_framework import serializers

from django.core.exceptions import ObjectDoesNotExist
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _


class RelatedField(serializers.RelatedField):
    default_error_messages = {
        "does_not_exist": _("Object with {related_field_name}={value} does not exist."),
        "invalid": _("Invalid value."),
    }

    def __init__(self, related_field_name, serializer, **kwargs):
        self.related_field_name = related_field_name
        self.serializer = serializer
        self.filters = kwargs.pop("filters", None)
        kwargs["queryset"] = kwargs.pop(
            "queryset", self.serializer.Meta.model.objects.all()
        )
        super().__init__(**kwargs)

    def get_filters(self, data):
        filters = {self.related_field_name: data}
        if self.filters:
            filters.update(self.filters(self.context))
        return filters

    def to_internal_value(self, data):
        try:
            queryset = self.get_queryset()
            filters = self.get_filters(data)
            return queryset.get(**filters)
        except ObjectDoesNotExist:
            self.fail(
                "does_not_exist",
                related_field_name=self.related_field_name,
                value=smart_text(data),
            )
        except (TypeError, ValueError):
            self.fail("invalid")

    def to_representation(self, obj):
        return self.serializer.to_representation(obj)

    def get_choices(self, cutoff=None):
        queryset = self.get_queryset()
        if queryset is None:
            # Ensure that field.choices returns something sensible
            # even when accessed with a read-only field.
            return {}

        if cutoff is not None:
            queryset = queryset[:cutoff]

        return collections.OrderedDict(
            [
                (
                    self.to_representation(item)[self.related_field_name],
                    self.display_value(item),
                )
                for item in queryset
            ]
        )
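The interesting bit of ``RelatedField`` is ``get_filters``: lookups go through an arbitrary field name (e.g. a ``fid`` or ``uuid``) instead of the primary key, optionally narrowed by context-dependent filters. A dependency-free sketch of that lookup logic (the ``build_filters`` helper and the values below are illustrative, not Funkwhale code):

```python
def build_filters(related_field_name, data, extra_filters=None, context=None):
    """Mirror RelatedField.get_filters: start from {field_name: value},
    then merge in callable-provided, context-dependent filters."""
    filters = {related_field_name: data}
    if extra_filters:
        filters.update(extra_filters(context or {}))
    return filters

# look a library up by fid, restricted to the requesting actor
f = build_filters(
    "fid",
    "https://example.com/libraries/42",
    extra_filters=lambda ctx: {"actor": ctx.get("actor")},
    context={"actor": "alice"},
)
print(f)  # {'fid': 'https://example.com/libraries/42', 'actor': 'alice'}
```

The resulting dict is passed straight to ``queryset.get(**filters)``, so the context filters can enforce per-user scoping on top of the field lookup.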
class Action(object):
|
||||
def __init__(self, name, allow_all=False, qs_filter=None):
|
||||
|
@ -21,6 +86,7 @@ class ActionSerializer(serializers.Serializer):
|
|||
objects = serializers.JSONField(required=True)
|
||||
filters = serializers.DictField(required=False)
|
||||
actions = None
|
||||
pk_field = "pk"
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.actions_by_name = {a.name: a for a in self.actions}
|
||||
|
@ -51,7 +117,9 @@ class ActionSerializer(serializers.Serializer):
|
|||
if value == "all":
|
||||
return self.queryset.all().order_by("id")
|
||||
if type(value) in [list, tuple]:
|
||||
return self.queryset.filter(pk__in=value).order_by("id")
|
||||
return self.queryset.filter(
|
||||
**{"{}__in".format(self.pk_field): value}
|
||||
).order_by("id")
|
||||
|
||||
raise serializers.ValidationError(
|
||||
"{} is not a valid value for objects. You must provide either a "
|
||||
|
|
|
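The `validate_objects` fragment in the hunk above accepts either the literal string `"all"` or a list of primary keys. A plain-Python sketch of that dispatch, with a list of dicts standing in for the queryset (the `resolve_objects` name and `records` data are illustrative only, not part of the diff):

```python
def resolve_objects(records, value, pk_field="pk"):
    # "all" selects every record; a list/tuple selects records whose
    # pk_field value is in the list, mirroring the serializer's logic
    if value == "all":
        return sorted(records, key=lambda r: r["id"])
    if type(value) in (list, tuple):
        return sorted(
            (r for r in records if r[pk_field] in value), key=lambda r: r["id"]
        )
    raise ValueError("{} is not a valid value for objects".format(value))


records = [{"id": 1, "pk": 1}, {"id": 2, "pk": 2}, {"id": 3, "pk": 3}]
assert len(resolve_objects(records, "all")) == 3
assert [r["id"] for r in resolve_objects(records, [2, 3])] == [2, 3]
```

The new `pk_field` class attribute lets subclasses select on a field other than the default primary key, which the parameter above emulates.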
@@ -64,3 +64,46 @@ class ChunkedPath(object):
        new_filename = "".join(chunks[3:]) + ".{}".format(ext)
        parts = chunks[:3] + [new_filename]
        return os.path.join(self.root, *parts)


def chunk_queryset(source_qs, chunk_size):
    """
    From https://github.com/peopledoc/django-chunkator/blob/master/chunkator/__init__.py
    """
    pk = None
    # In django 1.9, _fields is always present and `None` if 'values()' is used
    # In Django 1.8 and below, _fields will only be present if using `values()`
    has_fields = hasattr(source_qs, "_fields") and source_qs._fields
    if has_fields:
        if "pk" not in source_qs._fields:
            raise ValueError("The values() call must include the `pk` field")

    field = source_qs.model._meta.pk
    # set the correct field name:
    # for ForeignKeys, we want to use `model_id` field, and not `model`,
    # to bypass default ordering on related model
    order_by_field = field.attname

    source_qs = source_qs.order_by(order_by_field)
    queryset = source_qs
    while True:
        if pk:
            queryset = source_qs.filter(pk__gt=pk)
        page = queryset[:chunk_size]
        page = list(page)
        nb_items = len(page)

        if nb_items == 0:
            return

        last_item = page[-1]
        # source_qs._fields exists *and* is not none when using "values()"
        if has_fields:
            pk = last_item["pk"]
        else:
            pk = last_item.pk

        yield page

        if nb_items < chunk_size:
            return
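`chunk_queryset` above walks a queryset in fixed-size pages by filtering on the last seen primary key (`pk__gt`) instead of using OFFSET, which stays fast on large tables. A minimal pure-Python sketch of the same keyset-pagination pattern, using a list of dicts in place of a Django queryset (the `chunk_records` name and sample data are illustrative only):

```python
def chunk_records(records, chunk_size):
    """Yield successive pages of `records` (dicts sorted by "pk"),
    mirroring the keyset pagination used by chunk_queryset."""
    pk = None
    while True:
        # emulate source_qs.filter(pk__gt=pk)[:chunk_size]
        page = [r for r in records if pk is None or r["pk"] > pk][:chunk_size]
        if not page:
            return
        pk = page[-1]["pk"]  # remember the last primary key we served
        yield page
        if len(page) < chunk_size:
            return


pages = list(chunk_records([{"pk": i} for i in range(1, 8)], chunk_size=3))
# three full-or-partial pages: pks 1-3, 4-6, then the final short page [7]
```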
@@ -1,4 +1,4 @@
from django.contrib import admin
from funkwhale_api.common import admin

from . import models

@@ -1,4 +1,3 @@

from rest_framework import serializers

from funkwhale_api.activity import serializers as activity_serializers

@@ -3,9 +3,12 @@ from rest_framework.decorators import list_route
from rest_framework.permissions import IsAuthenticatedOrReadOnly
from rest_framework.response import Response

from django.db.models import Prefetch

from funkwhale_api.activity import record
from funkwhale_api.common import fields, permissions
from funkwhale_api.music.models import Track
from funkwhale_api.music import utils as music_utils

from . import filters, models, serializers

@@ -19,11 +22,7 @@ class TrackFavoriteViewSet(

    filter_class = filters.TrackFavoriteFilter
    serializer_class = serializers.UserTrackFavoriteSerializer
    queryset = (
        models.TrackFavorite.objects.all()
        .select_related("track__artist", "track__album__artist", "user")
        .prefetch_related("track__files")
    )
    queryset = models.TrackFavorite.objects.all().select_related("user")
    permission_classes = [
        permissions.ConditionalAuthentication,
        permissions.OwnerPermission,

@@ -49,9 +48,14 @@ class TrackFavoriteViewSet(

    def get_queryset(self):
        queryset = super().get_queryset()
        return queryset.filter(
        queryset = queryset.filter(
            fields.privacy_level_query(self.request.user, "user__privacy_level")
        )
        tracks = Track.objects.annotate_playable_by_actor(
            music_utils.get_actor_from_request(self.request)
        ).select_related("artist", "album__artist")
        queryset = queryset.prefetch_related(Prefetch("track", queryset=tracks))
        return queryset

    def perform_create(self, serializer):
        track = Track.objects.get(pk=serializer.data["track"])
@@ -1,3 +1,16 @@
import uuid
import logging

from django.db import transaction, IntegrityError
from django.db.models import Q

from funkwhale_api.common import channels
from funkwhale_api.common import utils as funkwhale_utils


logger = logging.getLogger(__name__)
PUBLIC_ADDRESS = "https://www.w3.org/ns/activitystreams#Public"

ACTIVITY_TYPES = [
    "Accept",
    "Add",

@@ -48,14 +61,340 @@ OBJECT_TYPES = [
] + ACTIVITY_TYPES


def deliver(activity, on_behalf_of, to=[]):
BROADCAST_TO_USER_ACTIVITIES = ["Follow", "Accept"]


@transaction.atomic
def receive(activity, on_behalf_of):
    from . import models
    from . import serializers
    from . import tasks

    return tasks.send.delay(activity=activity, actor_id=on_behalf_of.pk, to=to)
    # we ensure the activity has the bare minimum structure before storing
    # it in our database
    serializer = serializers.BaseActivitySerializer(
        data=activity, context={"actor": on_behalf_of, "local_recipients": True}
    )
    serializer.is_valid(raise_exception=True)
    try:
        copy = serializer.save()
    except IntegrityError:
        logger.warning(
            "[federation] Discarding already delivered activity %s",
            serializer.validated_data.get("id"),
        )
        return

    local_to_recipients = get_actors_from_audience(activity.get("to", []))
    local_to_recipients = local_to_recipients.exclude(user=None)

    local_cc_recipients = get_actors_from_audience(activity.get("cc", []))
    local_cc_recipients = local_cc_recipients.exclude(user=None)

    inbox_items = []
    for recipients, type in [(local_to_recipients, "to"), (local_cc_recipients, "cc")]:

        for r in recipients.values_list("pk", flat=True):
            inbox_items.append(models.InboxItem(actor_id=r, type=type, activity=copy))

    models.InboxItem.objects.bulk_create(inbox_items)

    # at this point, we have the activity in database. Even if we crash, it's
    # okay, as we can retry later
    funkwhale_utils.on_commit(tasks.dispatch_inbox.delay, activity_id=copy.pk)
    return copy


def accept_follow(follow):
    from . import serializers
class Router:
    def __init__(self):
        self.routes = []

    serializer = serializers.AcceptFollowSerializer(follow)
    return deliver(serializer.data, to=[follow.actor.url], on_behalf_of=follow.target)
    def connect(self, route, handler):
        self.routes.append((route, handler))

    def register(self, route):
        def decorator(handler):
            self.connect(route, handler)
            return handler

        return decorator


class InboxRouter(Router):
    @transaction.atomic
    def dispatch(self, payload, context):
        """
        Receives an Activity payload and some context and triggers our
        business logic
        """
        from . import api_serializers
        from . import models

        for route, handler in self.routes:
            if match_route(route, payload):
                r = handler(payload, context=context)
                activity_obj = context.get("activity")
                if activity_obj and r:
                    # handler returned additional data we can use
                    # to update the activity target
                    for key, value in r.items():
                        setattr(activity_obj, key, value)

                    update_fields = []
                    for k in r.keys():
                        if k in ["object", "target", "related_object"]:
                            update_fields += [
                                "{}_id".format(k),
                                "{}_content_type".format(k),
                            ]
                        else:
                            update_fields.append(k)
                    activity_obj.save(update_fields=update_fields)

                if payload["type"] not in BROADCAST_TO_USER_ACTIVITIES:
                    return

                inbox_items = context.get(
                    "inbox_items", models.InboxItem.objects.none()
                )
                inbox_items = (
                    inbox_items.select_related()
                    .select_related("actor__user")
                    .prefetch_related(
                        "activity__object",
                        "activity__target",
                        "activity__related_object",
                    )
                )

                for ii in inbox_items:
                    user = ii.actor.get_user()
                    if not user:
                        continue
                    group = "user.{}.inbox".format(user.pk)
                    channels.group_send(
                        group,
                        {
                            "type": "event.send",
                            "text": "",
                            "data": {
                                "type": "inbox.item_added",
                                "item": api_serializers.InboxItemSerializer(ii).data,
                            },
                        },
                    )
                return


class OutboxRouter(Router):
    @transaction.atomic
    def dispatch(self, routing, context):
        """
        Receives a routing payload and some business objects in the context
        and may yield data that should be persisted in the Activity model
        for further delivery.
        """
        from . import models
        from . import tasks

        for route, handler in self.routes:
            if not match_route(route, routing):
                continue

            activities_data = []
            for e in handler(context):
                # a route can yield zero, one or more activity payloads
                if e:
                    activities_data.append(e)
            inbox_items_by_activity_uuid = {}
            deliveries_by_activity_uuid = {}
            prepared_activities = []
            for activity_data in activities_data:
                activity_data["payload"]["actor"] = activity_data["actor"].fid
                to = activity_data["payload"].pop("to", [])
                cc = activity_data["payload"].pop("cc", [])
                a = models.Activity(**activity_data)
                a.uuid = uuid.uuid4()
                to_inbox_items, to_deliveries, new_to = prepare_deliveries_and_inbox_items(
                    to, "to"
                )
                cc_inbox_items, cc_deliveries, new_cc = prepare_deliveries_and_inbox_items(
                    cc, "cc"
                )
                if not any(
                    [to_inbox_items, to_deliveries, cc_inbox_items, cc_deliveries]
                ):
                    continue
                deliveries_by_activity_uuid[str(a.uuid)] = to_deliveries + cc_deliveries
                inbox_items_by_activity_uuid[str(a.uuid)] = (
                    to_inbox_items + cc_inbox_items
                )
                if new_to:
                    a.payload["to"] = new_to
                if new_cc:
                    a.payload["cc"] = new_cc
                prepared_activities.append(a)

            activities = models.Activity.objects.bulk_create(prepared_activities)

            for activity in activities:
                if str(activity.uuid) in deliveries_by_activity_uuid:
                    for obj in deliveries_by_activity_uuid[str(activity.uuid)]:
                        obj.activity = activity

                if str(activity.uuid) in inbox_items_by_activity_uuid:
                    for obj in inbox_items_by_activity_uuid[str(activity.uuid)]:
                        obj.activity = activity

            # create all deliveries and items, in bulk
            models.Delivery.objects.bulk_create(
                [
                    obj
                    for collection in deliveries_by_activity_uuid.values()
                    for obj in collection
                ]
            )
            models.InboxItem.objects.bulk_create(
                [
                    obj
                    for collection in inbox_items_by_activity_uuid.values()
                    for obj in collection
                ]
            )

            for a in activities:
                funkwhale_utils.on_commit(tasks.dispatch_outbox.delay, activity_id=a.pk)
            return activities


def recursive_gettattr(obj, key):
    """
    Given a dictionary such as {'user': {'name': 'Bob'}} and
    a dotted string such as user.name, returns 'Bob'.

    If the value is not present, returns None
    """
    v = obj
    for k in key.split("."):
        v = v.get(k)
        if v is None:
            return

    return v


def match_route(route, payload):
    for key, value in route.items():
        payload_value = recursive_gettattr(payload, key)
        if payload_value != value:
            return False

    return True


def prepare_deliveries_and_inbox_items(recipient_list, type):
    """
    Given a list of recipients (
    either actor instances, public addresses, a dictionary with "type" and "target"
    keys for followers collections)
    returns a list of deliveries, a list of inbox_items and a list
    of urls to persist in the activity in place of the initial recipient list.
    """
    from . import models

    local_recipients = set()
    remote_inbox_urls = set()
    urls = []

    for r in recipient_list:
        if isinstance(r, models.Actor):
            if r.is_local:
                local_recipients.add(r)
            else:
                remote_inbox_urls.add(r.shared_inbox_url or r.inbox_url)
            urls.append(r.fid)
        elif r == PUBLIC_ADDRESS:
            urls.append(r)
        elif isinstance(r, dict) and r["type"] == "followers":
            received_follows = (
                r["target"]
                .received_follows.filter(approved=True)
                .select_related("actor__user")
            )
            for follow in received_follows:
                actor = follow.actor
                if actor.is_local:
                    local_recipients.add(actor)
                else:
                    remote_inbox_urls.add(actor.shared_inbox_url or actor.inbox_url)
            urls.append(r["target"].followers_url)

    deliveries = [models.Delivery(inbox_url=url) for url in remote_inbox_urls]
    inbox_items = [
        models.InboxItem(actor=actor, type=type) for actor in local_recipients
    ]

    return inbox_items, deliveries, urls


def join_queries_or(left, right):
    if left:
        return left | right
    else:
        return right


def get_actors_from_audience(urls):
    """
    Given a list of urls such as [
        "https://hello.world/@bob/followers",
        "https://eldritch.cafe/@alice/followers",
        "https://funkwhale.demo/libraries/uuid/followers",
    ]
    Returns a queryset of actors that are member of the collections
    listed in the given urls. The urls may contain urls referring
    to an actor, an actor followers collection or a library followers
    collection.

    Urls that don't match anything are simply discarded
    """
    from . import models

    queries = {"followed": None, "actors": []}
    for url in urls:
        if url == PUBLIC_ADDRESS:
            continue
        queries["actors"].append(url)
        queries["followed"] = join_queries_or(
            queries["followed"], Q(target__followers_url=url)
        )
    final_query = None
    if queries["actors"]:
        final_query = join_queries_or(final_query, Q(fid__in=queries["actors"]))
    if queries["followed"]:
        actor_follows = models.Follow.objects.filter(queries["followed"], approved=True)
        final_query = join_queries_or(
            final_query, Q(pk__in=actor_follows.values_list("actor", flat=True))
        )

        library_follows = models.LibraryFollow.objects.filter(
            queries["followed"], approved=True
        )
        final_query = join_queries_or(
            final_query, Q(pk__in=library_follows.values_list("actor", flat=True))
        )
    if not final_query:
        return models.Actor.objects.none()
    return models.Actor.objects.filter(final_query)


def get_inbox_urls(actor_queryset):
    """
    Given an actor queryset, returns a deduplicated set containing
    all inbox or shared inbox urls where we should deliver our payloads for
    those actors
    """
    values = actor_queryset.values("inbox_url", "shared_inbox_url")

    urls = set([actor["shared_inbox_url"] or actor["inbox_url"] for actor in values])
    return sorted(urls)
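The routing helpers at the bottom of this file are plain Python and easy to exercise in isolation. A quick sketch, copying `recursive_gettattr` and `match_route` from the diff (minus the docstrings) and running them against a sample payload:

```python
def recursive_gettattr(obj, key):
    # walk a nested dict using a dotted key, as in the diff above
    v = obj
    for k in key.split("."):
        v = v.get(k)
        if v is None:
            return
    return v


def match_route(route, payload):
    # a payload matches a route when every route key/value pair matches
    for key, value in route.items():
        if recursive_gettattr(payload, key) != value:
            return False
    return True


payload = {"type": "Follow", "object": {"type": "Library"}}
assert recursive_gettattr(payload, "object.type") == "Library"
assert match_route({"type": "Follow", "object.type": "Library"}, payload)
assert not match_route({"type": "Undo"}, payload)
```

This is the mechanism both `InboxRouter.dispatch` and `OutboxRouter.dispatch` use to pick a handler: routes are registered via `Router.register` with a dict like `{"type": "Follow"}`, and the first matching route wins.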

@@ -1,30 +1,16 @@
import datetime
import logging
import xml

from django.conf import settings
from django.db import transaction
from django.urls import reverse
from django.utils import timezone
from rest_framework.exceptions import PermissionDenied

from funkwhale_api.common import preferences, session
from funkwhale_api.common import utils as funkwhale_utils
from funkwhale_api.music import models as music_models
from funkwhale_api.music import tasks as music_tasks

from . import activity, keys, models, serializers, signing, utils
from . import models, serializers

logger = logging.getLogger(__name__)


def remove_tags(text):
    logger.debug("Removing tags from %s", text)
    return "".join(
        xml.etree.ElementTree.fromstring("<div>{}</div>".format(text)).itertext()
    )


def get_actor_data(actor_url):
    response = session.get_session().get(
        actor_url,

@@ -39,9 +25,9 @@ def get_actor_data(actor_url):
    raise ValueError("Invalid actor payload: {}".format(response.text))


def get_actor(actor_url):
def get_actor(fid):
    try:
        actor = models.Actor.objects.get(url=actor_url)
        actor = models.Actor.objects.get(fid=fid)
    except models.Actor.DoesNotExist:
        actor = None
    fetch_delta = datetime.timedelta(

@@ -50,330 +36,8 @@ def get_actor(actor_url):
    if actor and actor.last_fetch_date > timezone.now() - fetch_delta:
        # cache is hot, we can return as is
        return actor
    data = get_actor_data(actor_url)
    data = get_actor_data(fid)
    serializer = serializers.ActorSerializer(data=data)
    serializer.is_valid(raise_exception=True)

    return serializer.save(last_fetch_date=timezone.now())


class SystemActor(object):
    additional_attributes = {}
    manually_approves_followers = False

    def get_request_auth(self):
        actor = self.get_actor_instance()
        return signing.get_auth(actor.private_key, actor.private_key_id)

    def serialize(self):
        actor = self.get_actor_instance()
        serializer = serializers.ActorSerializer(actor)
        return serializer.data

    def get_actor_instance(self):
        try:
            return models.Actor.objects.get(url=self.get_actor_url())
        except models.Actor.DoesNotExist:
            pass
        private, public = keys.get_key_pair()
        args = self.get_instance_argument(
            self.id, name=self.name, summary=self.summary, **self.additional_attributes
        )
        args["private_key"] = private.decode("utf-8")
        args["public_key"] = public.decode("utf-8")
        return models.Actor.objects.create(**args)

    def get_actor_url(self):
        return utils.full_url(
            reverse("federation:instance-actors-detail", kwargs={"actor": self.id})
        )

    def get_instance_argument(self, id, name, summary, **kwargs):
        p = {
            "preferred_username": id,
            "domain": settings.FEDERATION_HOSTNAME,
            "type": "Person",
            "name": name.format(host=settings.FEDERATION_HOSTNAME),
            "manually_approves_followers": True,
            "url": self.get_actor_url(),
            "shared_inbox_url": utils.full_url(
                reverse("federation:instance-actors-inbox", kwargs={"actor": id})
            ),
            "inbox_url": utils.full_url(
                reverse("federation:instance-actors-inbox", kwargs={"actor": id})
            ),
            "outbox_url": utils.full_url(
                reverse("federation:instance-actors-outbox", kwargs={"actor": id})
            ),
            "summary": summary.format(host=settings.FEDERATION_HOSTNAME),
        }
        p.update(kwargs)
        return p

    def get_inbox(self, data, actor=None):
        raise NotImplementedError

    def post_inbox(self, data, actor=None):
        return self.handle(data, actor=actor)

    def get_outbox(self, data, actor=None):
        raise NotImplementedError

    def post_outbox(self, data, actor=None):
        raise NotImplementedError

    def handle(self, data, actor=None):
        """
        Main entrypoint for handling activities posted to the
        actor's inbox
        """
        logger.info("Received activity on %s inbox", self.id)

        if actor is None:
            raise PermissionDenied("Actor not authenticated")

        serializer = serializers.ActivitySerializer(data=data, context={"actor": actor})
        serializer.is_valid(raise_exception=True)

        ac = serializer.data
        try:
            handler = getattr(self, "handle_{}".format(ac["type"].lower()))
        except (KeyError, AttributeError):
            logger.debug("No handler for activity %s", ac["type"])
            return

        return handler(data, actor)

    def handle_follow(self, ac, sender):
        serializer = serializers.FollowSerializer(
            data=ac, context={"follow_actor": sender}
        )
        if not serializer.is_valid():
            return logger.info("Invalid follow payload")
        approved = True if not self.manually_approves_followers else None
        follow = serializer.save(approved=approved)
        if follow.approved:
            return activity.accept_follow(follow)

    def handle_accept(self, ac, sender):
        system_actor = self.get_actor_instance()
        serializer = serializers.AcceptFollowSerializer(
            data=ac, context={"follow_target": sender, "follow_actor": system_actor}
        )
        if not serializer.is_valid(raise_exception=True):
            return logger.info("Received invalid payload")

        return serializer.save()

    def handle_undo_follow(self, ac, sender):
        system_actor = self.get_actor_instance()
        serializer = serializers.UndoFollowSerializer(
            data=ac, context={"actor": sender, "target": system_actor}
        )
        if not serializer.is_valid():
            return logger.info("Received invalid payload")
        serializer.save()

    def handle_undo(self, ac, sender):
        if ac["object"]["type"] != "Follow":
            return

        if ac["object"]["actor"] != sender.url:
            # not the same actor, permission issue
            return

        self.handle_undo_follow(ac, sender)


class LibraryActor(SystemActor):
    id = "library"
    name = "{host}'s library"
    summary = "Bot account to federate with {host}'s library"
    additional_attributes = {"manually_approves_followers": True}

    def serialize(self):
        data = super().serialize()
        urls = data.setdefault("url", [])
        urls.append(
            {
                "type": "Link",
                "mediaType": "application/activity+json",
                "name": "library",
                "href": utils.full_url(reverse("federation:music:files-list")),
            }
        )
        return data

    @property
    def manually_approves_followers(self):
        return preferences.get("federation__music_needs_approval")

    @transaction.atomic
    def handle_create(self, ac, sender):
        try:
            remote_library = models.Library.objects.get(
                actor=sender, federation_enabled=True
            )
        except models.Library.DoesNotExist:
            logger.info("Skipping import, we're not following %s", sender.url)
            return

        if ac["object"]["type"] != "Collection":
            return

        if ac["object"]["totalItems"] <= 0:
            return

        try:
            items = ac["object"]["items"]
        except KeyError:
            logger.warning("No items in collection!")
            return

        item_serializers = [
            serializers.AudioSerializer(data=i, context={"library": remote_library})
            for i in items
        ]
        now = timezone.now()
        valid_serializers = []
        for s in item_serializers:
            if s.is_valid():
                valid_serializers.append(s)
            else:
                logger.debug("Skipping invalid item %s, %s", s.initial_data, s.errors)

        lts = []
        for s in valid_serializers:
            lts.append(s.save())

        if remote_library.autoimport:
            batch = music_models.ImportBatch.objects.create(source="federation")
            for lt in lts:
                if lt.creation_date < now:
                    # track was already in the library, we do not trigger
                    # an import
                    continue
                job = music_models.ImportJob.objects.create(
                    batch=batch, library_track=lt, mbid=lt.mbid, source=lt.url
                )
                funkwhale_utils.on_commit(
                    music_tasks.import_job_run.delay,
                    import_job_id=job.pk,
                    use_acoustid=False,
                )


class TestActor(SystemActor):
    id = "test"
    name = "{host}'s test account"
    summary = (
        "Bot account to test federation with {host}. "
        "Send me /ping and I'll answer you."
    )
    additional_attributes = {"manually_approves_followers": False}
    manually_approves_followers = False

    def get_outbox(self, data, actor=None):
        return {
            "@context": [
                "https://www.w3.org/ns/activitystreams",
                "https://w3id.org/security/v1",
                {},
            ],
            "id": utils.full_url(
                reverse("federation:instance-actors-outbox", kwargs={"actor": self.id})
            ),
            "type": "OrderedCollection",
            "totalItems": 0,
            "orderedItems": [],
        }

    def parse_command(self, message):
        """
        Remove any links or fancy markup to extract /command from
        a note message.
        """
        raw = remove_tags(message)
        try:
            return raw.split("/")[1]
        except IndexError:
            return

    def handle_create(self, ac, sender):
        if ac["object"]["type"] != "Note":
            return

        # we received a toot \o/
        command = self.parse_command(ac["object"]["content"])
        logger.debug("Parsed command: %s", command)
        if command != "ping":
            return

        now = timezone.now()
        test_actor = self.get_actor_instance()
        reply_url = "https://{}/activities/note/{}".format(
            settings.FEDERATION_HOSTNAME, now.timestamp()
        )
        reply_activity = {
            "@context": [
                "https://www.w3.org/ns/activitystreams",
                "https://w3id.org/security/v1",
                {},
            ],
            "type": "Create",
            "actor": test_actor.url,
            "id": "{}/activity".format(reply_url),
            "published": now.isoformat(),
            "to": ac["actor"],
            "cc": [],
            "object": {
                "type": "Note",
                "content": "Pong!",
                "summary": None,
                "published": now.isoformat(),
                "id": reply_url,
                "inReplyTo": ac["object"]["id"],
                "sensitive": False,
                "url": reply_url,
                "to": [ac["actor"]],
                "attributedTo": test_actor.url,
                "cc": [],
                "attachment": [],
                "tag": [
                    {
                        "type": "Mention",
                        "href": ac["actor"],
                        "name": sender.mention_username,
                    }
                ],
            },
        }
        activity.deliver(reply_activity, to=[ac["actor"]], on_behalf_of=test_actor)

    def handle_follow(self, ac, sender):
        super().handle_follow(ac, sender)
        # also, we follow back
        test_actor = self.get_actor_instance()
        follow_back = models.Follow.objects.get_or_create(
            actor=test_actor, target=sender, approved=None
        )[0]
        activity.deliver(
            serializers.FollowSerializer(follow_back).data,
            to=[follow_back.target.url],
            on_behalf_of=follow_back.actor,
        )

    def handle_undo_follow(self, ac, sender):
        super().handle_undo_follow(ac, sender)
        actor = self.get_actor_instance()
        # we also unfollow the sender, if possible
        try:
            follow = models.Follow.objects.get(target=sender, actor=actor)
        except models.Follow.DoesNotExist:
            return
        undo = serializers.UndoFollowSerializer(follow).data
        follow.delete()
        activity.deliver(undo, to=[sender.url], on_behalf_of=actor)


SYSTEM_ACTORS = {"library": LibraryActor(), "test": TestActor()}
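`TestActor`'s ping handling relies on two small helpers visible in this diff: `remove_tags`, which strips markup by parsing the message as XML and keeping only text nodes, and `parse_command`, which takes what follows the first `/`. A standalone sketch of how they cooperate, copied from the diff minus the logging and the class wrapper (note the explicit `import xml.etree.ElementTree`, which the submodule access requires):

```python
import xml.etree.ElementTree


def remove_tags(text):
    # wrap in a <div> so the fragment is a single well-formed XML element,
    # then keep only the text nodes
    return "".join(
        xml.etree.ElementTree.fromstring("<div>{}</div>".format(text)).itertext()
    )


def parse_command(message):
    # extract e.g. "ping" from a toot like "<p>@test /ping</p>"
    raw = remove_tags(message)
    try:
        return raw.split("/")[1]
    except IndexError:
        return None


print(parse_command('<p>Hello <a href="#">/ping</a></p>'))  # -> ping
```

When the extracted command is `"ping"`, `handle_create` builds a `Create` activity wrapping a "Pong!" `Note` addressed back to the sender.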

@@ -1,19 +1,49 @@
from django.contrib import admin
from funkwhale_api.common import admin

from . import models
from . import tasks


def redeliver_deliveries(modeladmin, request, queryset):
    queryset.update(is_delivered=False)
    for delivery in queryset:
        tasks.deliver_to_remote.delay(delivery_id=delivery.pk)


redeliver_deliveries.short_description = "Redeliver"


def redeliver_activities(modeladmin, request, queryset):
    for activity in queryset.select_related("actor__user"):
        if activity.actor.get_user():
            tasks.dispatch_outbox.delay(activity_id=activity.pk)
        else:
            tasks.dispatch_inbox.delay(activity_id=activity.pk)


redeliver_activities.short_description = "Redeliver"


@admin.register(models.Activity)
class ActivityAdmin(admin.ModelAdmin):
    list_display = ["type", "fid", "url", "actor", "creation_date"]
    search_fields = ["payload", "fid", "url", "actor__domain"]
    list_filter = ["type", "actor__domain"]
    actions = [redeliver_activities]
    list_select_related = True


@admin.register(models.Actor)
class ActorAdmin(admin.ModelAdmin):
    list_display = [
        "url",
        "fid",
        "domain",
        "preferred_username",
        "type",
        "creation_date",
        "last_fetch_date",
    ]
    search_fields = ["url", "domain", "preferred_username"]
    search_fields = ["fid", "domain", "preferred_username"]
    list_filter = ["type"]

@@ -21,28 +51,36 @@ class ActorAdmin(admin.ModelAdmin):
class FollowAdmin(admin.ModelAdmin):
    list_display = ["actor", "target", "approved", "creation_date"]
    list_filter = ["approved"]
    search_fields = ["actor__url", "target__url"]
    search_fields = ["actor__fid", "target__fid"]
    list_select_related = True


@admin.register(models.Library)
class LibraryAdmin(admin.ModelAdmin):
    list_display = ["actor", "url", "creation_date", "fetched_date", "tracks_count"]
    search_fields = ["actor__url", "url"]
    list_filter = ["federation_enabled", "download_files", "autoimport"]
@admin.register(models.LibraryFollow)
class LibraryFollowAdmin(admin.ModelAdmin):
    list_display = ["actor", "target", "approved", "creation_date"]
    list_filter = ["approved"]
    search_fields = ["actor__fid", "target__fid"]
    list_select_related = True


@admin.register(models.LibraryTrack)
class LibraryTrackAdmin(admin.ModelAdmin):
@admin.register(models.InboxItem)
class InboxItemAdmin(admin.ModelAdmin):
    list_display = ["actor", "activity", "type", "is_read"]
    list_filter = ["type", "activity__type", "is_read"]
    search_fields = ["actor__fid", "activity__fid"]
    list_select_related = True


@admin.register(models.Delivery)
class DeliveryAdmin(admin.ModelAdmin):
    list_display = [
        "title",
        "artist_name",
        "album_title",
        "url",
        "library",
        "creation_date",
        "published_date",
        "inbox_url",
        "activity",
        "last_attempt_date",
        "attempts",
        "is_delivered",
    ]
    search_fields = ["library__url", "url", "artist_name", "title", "album_title"]
    list_filter = ["activity__type", "is_delivered"]
    search_fields = ["inbox_url"]
    list_select_related = True
    actions = [redeliver_deliveries]
|
@ -0,0 +1,146 @@
from rest_framework import serializers

from funkwhale_api.common import serializers as common_serializers
from funkwhale_api.music import models as music_models

from . import filters
from . import models
from . import serializers as federation_serializers


class NestedLibraryFollowSerializer(serializers.ModelSerializer):
    class Meta:
        model = models.LibraryFollow
        fields = ["creation_date", "uuid", "fid", "approved", "modification_date"]


class LibraryScanSerializer(serializers.ModelSerializer):
    class Meta:
        model = music_models.LibraryScan
        fields = [
            "total_files",
            "processed_files",
            "errored_files",
            "status",
            "creation_date",
            "modification_date",
        ]


class LibrarySerializer(serializers.ModelSerializer):
    actor = federation_serializers.APIActorSerializer()
    uploads_count = serializers.SerializerMethodField()
    latest_scan = serializers.SerializerMethodField()
    follow = serializers.SerializerMethodField()

    class Meta:
        model = music_models.Library
        fields = [
            "fid",
            "uuid",
            "actor",
            "name",
            "description",
            "creation_date",
            "uploads_count",
            "privacy_level",
            "follow",
            "latest_scan",
        ]

    def get_uploads_count(self, o):
        return max(getattr(o, "_uploads_count", 0), o.uploads_count)

    def get_follow(self, o):
        try:
            return NestedLibraryFollowSerializer(o._follows[0]).data
        except (AttributeError, IndexError):
            return None

    def get_latest_scan(self, o):
        scan = o.scans.order_by("-creation_date").first()
        if scan:
            return LibraryScanSerializer(scan).data


class LibraryFollowSerializer(serializers.ModelSerializer):
    target = common_serializers.RelatedField("uuid", LibrarySerializer(), required=True)
    actor = serializers.SerializerMethodField()

    class Meta:
        model = models.LibraryFollow
        fields = ["creation_date", "actor", "uuid", "target", "approved"]
        read_only_fields = ["uuid", "actor", "approved", "creation_date"]

    def validate_target(self, v):
        actor = self.context["actor"]
        if v.actor == actor:
            raise serializers.ValidationError("You cannot follow your own library")

        if v.received_follows.filter(actor=actor).exists():
            raise serializers.ValidationError("You are already following this library")
        return v

    def get_actor(self, o):
        return federation_serializers.APIActorSerializer(o.actor).data


def serialize_generic_relation(activity, obj):
    data = {"uuid": obj.uuid, "type": obj._meta.label}
    if data["type"] == "music.Library":
        data["name"] = obj.name
    if data["type"] == "federation.LibraryFollow":
        data["approved"] = obj.approved

    return data


class ActivitySerializer(serializers.ModelSerializer):
    actor = federation_serializers.APIActorSerializer()
    object = serializers.SerializerMethodField()
    target = serializers.SerializerMethodField()
    related_object = serializers.SerializerMethodField()

    class Meta:
        model = models.Activity
        fields = [
            "uuid",
            "fid",
            "actor",
            "payload",
            "object",
            "target",
            "related_object",
            "actor",
            "creation_date",
            "type",
        ]

    def get_object(self, o):
        if o.object:
            return serialize_generic_relation(o, o.object)

    def get_related_object(self, o):
        if o.related_object:
            return serialize_generic_relation(o, o.related_object)

    def get_target(self, o):
        if o.target:
            return serialize_generic_relation(o, o.target)


class InboxItemSerializer(serializers.ModelSerializer):
    activity = ActivitySerializer()

    class Meta:
        model = models.InboxItem
        fields = ["id", "type", "activity", "is_read"]
        read_only_fields = ["id", "type", "activity"]


class InboxItemActionSerializer(common_serializers.ActionSerializer):
    actions = [common_serializers.Action("read", allow_all=True)]
    filterset_class = filters.InboxItemFilter

    def handle_read(self, objects):
        return objects.update(is_read=True)

@ -1,9 +1,10 @@
from rest_framework import routers

from . import views
from . import api_views

router = routers.SimpleRouter()
router.register(r"libraries", views.LibraryViewSet, "libraries")
router.register(r"library-tracks", views.LibraryTrackViewSet, "library-tracks")
router.register(r"follows/library", api_views.LibraryFollowViewSet, "library-follows")
router.register(r"inbox", api_views.InboxItemViewSet, "inbox")
router.register(r"libraries", api_views.LibraryViewSet, "libraries")

urlpatterns = router.urls

@ -0,0 +1,180 @@
import requests.exceptions

from django.db import transaction
from django.db.models import Count

from rest_framework import decorators
from rest_framework import mixins
from rest_framework import permissions
from rest_framework import response
from rest_framework import viewsets

from funkwhale_api.music import models as music_models

from . import activity
from . import api_serializers
from . import filters
from . import models
from . import routes
from . import serializers
from . import utils


@transaction.atomic
def update_follow(follow, approved):
    follow.approved = approved
    follow.save(update_fields=["approved"])
    routes.outbox.dispatch({"type": "Accept"}, context={"follow": follow})


class LibraryFollowViewSet(
    mixins.CreateModelMixin,
    mixins.ListModelMixin,
    mixins.RetrieveModelMixin,
    mixins.DestroyModelMixin,
    viewsets.GenericViewSet,
):
    lookup_field = "uuid"
    queryset = (
        models.LibraryFollow.objects.all()
        .order_by("-creation_date")
        .select_related("actor", "target__actor")
    )
    serializer_class = api_serializers.LibraryFollowSerializer
    permission_classes = [permissions.IsAuthenticated]
    filter_class = filters.LibraryFollowFilter
    ordering_fields = ("creation_date",)

    def get_queryset(self):
        qs = super().get_queryset()
        return qs.filter(actor=self.request.user.actor)

    def perform_create(self, serializer):
        follow = serializer.save(actor=self.request.user.actor)
        routes.outbox.dispatch({"type": "Follow"}, context={"follow": follow})

    @transaction.atomic
    def perform_destroy(self, instance):
        routes.outbox.dispatch(
            {"type": "Undo", "object": {"type": "Follow"}}, context={"follow": instance}
        )
        instance.delete()

    def get_serializer_context(self):
        context = super().get_serializer_context()
        context["actor"] = self.request.user.actor
        return context

    @decorators.detail_route(methods=["post"])
    def accept(self, request, *args, **kwargs):
        try:
            follow = self.queryset.get(
                target__actor=self.request.user.actor, uuid=kwargs["uuid"]
            )
        except models.LibraryFollow.DoesNotExist:
            return response.Response({}, status=404)
        update_follow(follow, approved=True)
        return response.Response(status=204)

    @decorators.detail_route(methods=["post"])
    def reject(self, request, *args, **kwargs):
        try:
            follow = self.queryset.get(
                target__actor=self.request.user.actor, uuid=kwargs["uuid"]
            )
        except models.LibraryFollow.DoesNotExist:
            return response.Response({}, status=404)

        update_follow(follow, approved=False)
        return response.Response(status=204)


class LibraryViewSet(mixins.RetrieveModelMixin, viewsets.GenericViewSet):
    lookup_field = "uuid"
    queryset = (
        music_models.Library.objects.all()
        .order_by("-creation_date")
        .select_related("actor__user")
        .annotate(_uploads_count=Count("uploads"))
    )
    serializer_class = api_serializers.LibrarySerializer
    permission_classes = [permissions.IsAuthenticated]

    def get_queryset(self):
        qs = super().get_queryset()
        return qs.viewable_by(actor=self.request.user.actor)

    @decorators.detail_route(methods=["post"])
    def scan(self, request, *args, **kwargs):
        library = self.get_object()
        if library.actor.get_user():
            return response.Response({"status": "skipped"}, 200)

        scan = library.schedule_scan(actor=request.user.actor)
        if scan:
            return response.Response(
                {
                    "status": "scheduled",
                    "scan": api_serializers.LibraryScanSerializer(scan).data,
                },
                200,
            )
        return response.Response({"status": "skipped"}, 200)

    @decorators.list_route(methods=["post"])
    def fetch(self, request, *args, **kwargs):
        try:
            fid = request.data["fid"]
        except KeyError:
            return response.Response({"fid": ["This field is required"]})
        try:
            library = utils.retrieve(
                fid,
                queryset=self.queryset,
                serializer_class=serializers.LibrarySerializer,
            )
        except requests.exceptions.RequestException as e:
            return response.Response(
                {"detail": "Error while fetching the library: {}".format(str(e))},
                status=400,
            )
        except serializers.serializers.ValidationError as e:
            return response.Response(
                {"detail": "Invalid data in remote library: {}".format(str(e))},
                status=400,
            )
        serializer = self.serializer_class(library)
        return response.Response({"count": 1, "results": [serializer.data]})


class InboxItemViewSet(
    mixins.UpdateModelMixin,
    mixins.ListModelMixin,
    mixins.RetrieveModelMixin,
    viewsets.GenericViewSet,
):

    queryset = (
        models.InboxItem.objects.select_related("activity__actor")
        .prefetch_related("activity__object", "activity__target")
        .filter(activity__type__in=activity.BROADCAST_TO_USER_ACTIVITIES, type="to")
        .order_by("-activity__creation_date")
    )
    serializer_class = api_serializers.InboxItemSerializer
    permission_classes = [permissions.IsAuthenticated]
    filter_class = filters.InboxItemFilter
    ordering_fields = ("activity__creation_date",)

    def get_queryset(self):
        qs = super().get_queryset()
        return qs.filter(actor=self.request.user.actor)

    @decorators.list_route(methods=["post"])
    def action(self, request, *args, **kwargs):
        queryset = self.get_queryset()
        serializer = api_serializers.InboxItemActionSerializer(
            request.data, queryset=queryset
        )
        serializer.is_valid(raise_exception=True)
        result = serializer.save()
        return response.Response(result, status=200)

@ -1,4 +1,3 @@

from dynamic_preferences import types
from dynamic_preferences.registries import global_preferences_registry

@ -8,6 +8,7 @@ from django.utils import timezone
from django.utils.http import http_date

from funkwhale_api.factories import registry
from funkwhale_api.users import factories as user_factories

from . import keys, models

@ -61,6 +62,10 @@ class LinkFactory(factory.Factory):
    audio = factory.Trait(mediaType=factory.Iterator(["audio/mp3", "audio/ogg"]))


def create_user(actor):
    return user_factories.UserFactory(actor=actor)


@registry.register
class ActorFactory(factory.DjangoModelFactory):
    public_key = None

@ -68,9 +73,12 @@ class ActorFactory(factory.DjangoModelFactory):
    preferred_username = factory.Faker("user_name")
    summary = factory.Faker("paragraph")
    domain = factory.Faker("domain_name")
    url = factory.LazyAttribute(
    fid = factory.LazyAttribute(
        lambda o: "https://{}/users/{}".format(o.domain, o.preferred_username)
    )
    followers_url = factory.LazyAttribute(
        lambda o: "https://{}/users/{}followers".format(o.domain, o.preferred_username)
    )
    inbox_url = factory.LazyAttribute(
        lambda o: "https://{}/users/{}/inbox".format(o.domain, o.preferred_username)
    )

@ -81,20 +89,34 @@ class ActorFactory(factory.DjangoModelFactory):
    class Meta:
        model = models.Actor

    class Params:
        local = factory.Trait(
            domain=factory.LazyAttribute(lambda o: settings.FEDERATION_HOSTNAME)
        )
    @factory.post_generation
    def local(self, create, extracted, **kwargs):
        if not extracted and not kwargs:
            return
        from funkwhale_api.users.factories import UserFactory

    @classmethod
    def _generate(cls, create, attrs):
        has_public = attrs.get("public_key") is not None
        has_private = attrs.get("private_key") is not None
        if not has_public and not has_private:
        self.domain = settings.FEDERATION_HOSTNAME
        self.save(update_fields=["domain"])
        if not create:
            if extracted and hasattr(extracted, "pk"):
                extracted.actor = self
            else:
                UserFactory.build(actor=self, **kwargs)
        if extracted and hasattr(extracted, "pk"):
            extracted.actor = self
            extracted.save(update_fields=["user"])
        else:
            self.user = UserFactory(actor=self, **kwargs)

    @factory.post_generation
    def keys(self, create, extracted, **kwargs):
        if not create:
            # Simple build, do nothing.
            return
        if not extracted:
            private, public = keys.get_key_pair()
            attrs["private_key"] = private.decode("utf-8")
            attrs["public_key"] = public.decode("utf-8")
        return super()._generate(create, attrs)
            self.private_key = private.decode("utf-8")
            self.public_key = public.decode("utf-8")


@registry.register

@ -110,15 +132,72 @@ class FollowFactory(factory.DjangoModelFactory):


@registry.register
class LibraryFactory(factory.DjangoModelFactory):
class MusicLibraryFactory(factory.django.DjangoModelFactory):
    actor = factory.SubFactory(ActorFactory)
    url = factory.Faker("url")
    federation_enabled = True
    download_files = False
    autoimport = False
    privacy_level = "me"
    name = factory.Faker("sentence")
    description = factory.Faker("sentence")
    uploads_count = 0
    fid = factory.Faker("federation_url")

    class Meta:
        model = models.Library
        model = "music.Library"

    @factory.post_generation
    def followers_url(self, create, extracted, **kwargs):
        if not create:
            # Simple build, do nothing.
            return

        self.followers_url = extracted or self.fid + "/followers"


@registry.register
class LibraryScan(factory.django.DjangoModelFactory):
    library = factory.SubFactory(MusicLibraryFactory)
    actor = factory.SubFactory(ActorFactory)
    total_files = factory.LazyAttribute(lambda o: o.library.uploads_count)

    class Meta:
        model = "music.LibraryScan"


@registry.register
class ActivityFactory(factory.django.DjangoModelFactory):
    actor = factory.SubFactory(ActorFactory)
    url = factory.Faker("federation_url")
    payload = factory.LazyFunction(lambda: {"type": "Create"})

    class Meta:
        model = "federation.Activity"


@registry.register
class InboxItemFactory(factory.django.DjangoModelFactory):
    actor = factory.SubFactory(ActorFactory, local=True)
    activity = factory.SubFactory(ActivityFactory)
    type = "to"

    class Meta:
        model = "federation.InboxItem"


@registry.register
class DeliveryFactory(factory.django.DjangoModelFactory):
    activity = factory.SubFactory(ActivityFactory)
    inbox_url = factory.Faker("url")

    class Meta:
        model = "federation.Delivery"


@registry.register
class LibraryFollowFactory(factory.DjangoModelFactory):
    target = factory.SubFactory(MusicLibraryFactory)
    actor = factory.SubFactory(ActorFactory)

    class Meta:
        model = "federation.LibraryFollow"


class ArtistMetadataFactory(factory.Factory):

@ -161,25 +240,6 @@ class LibraryTrackMetadataFactory(factory.Factory):
        model = dict


@registry.register
class LibraryTrackFactory(factory.DjangoModelFactory):
    library = factory.SubFactory(LibraryFactory)
    url = factory.Faker("url")
    title = factory.Faker("sentence")
    artist_name = factory.Faker("sentence")
    album_title = factory.Faker("sentence")
    audio_url = factory.Faker("url")
    audio_mimetype = "audio/ogg"
    metadata = factory.SubFactory(LibraryTrackMetadataFactory)
    published_date = factory.LazyFunction(timezone.now)

    class Meta:
        model = models.LibraryTrack

    class Params:
        with_audio_file = factory.Trait(audio_file=factory.django.FileField())


@registry.register(name="federation.Note")
class NoteFactory(factory.Factory):
    type = "Note"

@ -192,22 +252,6 @@ class NoteFactory(factory.Factory):
        model = dict


@registry.register(name="federation.Activity")
class ActivityFactory(factory.Factory):
    type = "Create"
    id = factory.Faker("url")
    published = factory.LazyFunction(lambda: timezone.now().isoformat())
    actor = factory.Faker("url")
    object = factory.SubFactory(
        NoteFactory,
        actor=factory.SelfAttribute("..actor"),
        published=factory.SelfAttribute("..published"),
    )

    class Meta:
        model = dict


@registry.register(name="federation.AudioMetadata")
class AudioMetadataFactory(factory.Factory):
    recording = factory.LazyAttribute(

@ -230,9 +274,9 @@ class AudioMetadataFactory(factory.Factory):
@registry.register(name="federation.Audio")
class AudioFactory(factory.Factory):
    type = "Audio"
    id = factory.Faker("url")
    id = factory.Faker("federation_url")
    published = factory.LazyFunction(lambda: timezone.now().isoformat())
    actor = factory.Faker("url")
    actor = factory.Faker("federation_url")
    url = factory.SubFactory(LinkFactory, audio=True)
    metadata = factory.SubFactory(LibraryTrackMetadataFactory)

@ -1,68 +1,10 @@
import django_filters
import django_filters.widgets

from funkwhale_api.common import fields
from funkwhale_api.common import search

from . import models


class LibraryFilter(django_filters.FilterSet):
    approved = django_filters.BooleanFilter("following__approved")
    q = fields.SearchFilter(search_fields=["actor__domain"])

    class Meta:
        model = models.Library
        fields = {
            "approved": ["exact"],
            "federation_enabled": ["exact"],
            "download_files": ["exact"],
            "autoimport": ["exact"],
            "tracks_count": ["exact"],
        }


class LibraryTrackFilter(django_filters.FilterSet):
    library = django_filters.CharFilter("library__uuid")
    status = django_filters.CharFilter(method="filter_status")
    q = fields.SmartSearchFilter(
        config=search.SearchConfig(
            search_fields={
                "domain": {"to": "library__actor__domain"},
                "artist": {"to": "artist_name"},
                "album": {"to": "album_title"},
                "title": {"to": "title"},
            },
            filter_fields={
                "domain": {"to": "library__actor__domain"},
                "artist": {"to": "artist_name__iexact"},
                "album": {"to": "album_title__iexact"},
                "title": {"to": "title__iexact"},
            },
        )
    )

    def filter_status(self, queryset, field_name, value):
        if value == "imported":
            return queryset.filter(local_track_file__isnull=False)
        elif value == "not_imported":
            return queryset.filter(local_track_file__isnull=True).exclude(
                import_jobs__status="pending"
            )
        elif value == "import_pending":
            return queryset.filter(import_jobs__status="pending")
        return queryset

    class Meta:
        model = models.LibraryTrack
        fields = {
            "library": ["exact"],
            "artist_name": ["exact", "icontains"],
            "title": ["exact", "icontains"],
            "album_title": ["exact", "icontains"],
            "audio_mimetype": ["exact", "icontains"],
        }


class FollowFilter(django_filters.FilterSet):
    pending = django_filters.CharFilter(method="filter_pending")
    ordering = django_filters.OrderingFilter(

@ -84,3 +26,23 @@ class FollowFilter(django_filters.FilterSet):
        if value.lower() in ["true", "1", "yes"]:
            queryset = queryset.filter(approved__isnull=True)
        return queryset


class LibraryFollowFilter(django_filters.FilterSet):
    class Meta:
        model = models.LibraryFollow
        fields = ["approved"]


class InboxItemFilter(django_filters.FilterSet):
    is_read = django_filters.BooleanFilter(
        "is_read", widget=django_filters.widgets.BooleanWidget()
    )
    before = django_filters.NumberFilter(method="filter_before")

    class Meta:
        model = models.InboxItem
        fields = ["is_read", "activity__type", "activity__actor"]

    def filter_before(self, queryset, field_name, value):
        return queryset.filter(pk__lte=value)

@ -1,78 +1,12 @@
import json

import requests
from django.conf import settings

from funkwhale_api.common import session

from . import actors, models, serializers, signing, webfinger
from . import serializers, signing


def scan_from_account_name(account_name):
    """
    Given an account name such as library@test.library, will:

    1. Perform the webfinger lookup
    2. Perform the actor lookup
    3. Perform the library's collection lookup

    and return corresponding data in a dictionary.
    """
    data = {}
    try:
        username, domain = webfinger.clean_acct(account_name, ensure_local=False)
    except serializers.ValidationError:
        return {"webfinger": {"errors": ["Invalid account string"]}}
    system_library = actors.SYSTEM_ACTORS["library"].get_actor_instance()
    data["local"] = {"following": False, "awaiting_approval": False}
    try:
        follow = models.Follow.objects.get(
            target__preferred_username=username,
            target__domain=username,
            actor=system_library,
        )
        data["local"]["awaiting_approval"] = not bool(follow.approved)
        data["local"]["following"] = True
    except models.Follow.DoesNotExist:
        pass

    try:
        data["webfinger"] = webfinger.get_resource("acct:{}".format(account_name))
    except requests.ConnectionError:
        return {"webfinger": {"errors": ["This webfinger resource is not reachable"]}}
    except requests.HTTPError as e:
        return {
            "webfinger": {
                "errors": [
                    "Error {} during webfinger request".format(e.response.status_code)
                ]
            }
        }
    except json.JSONDecodeError as e:
        return {"webfinger": {"errors": ["Could not process webfinger response"]}}

    try:
        data["actor"] = actors.get_actor_data(data["webfinger"]["actor_url"])
    except requests.ConnectionError:
        data["actor"] = {"errors": ["This actor is not reachable"]}
        return data
    except requests.HTTPError as e:
        data["actor"] = {
            "errors": ["Error {} during actor request".format(e.response.status_code)]
        }
        return data

    serializer = serializers.LibraryActorSerializer(data=data["actor"])
    if not serializer.is_valid():
        data["actor"] = {"errors": ["Invalid ActivityPub actor"]}
        return data
    data["library"] = get_library_data(serializer.validated_data["library_url"])

    return data


def get_library_data(library_url):
    actor = actors.SYSTEM_ACTORS["library"].get_actor_instance()
def get_library_data(library_url, actor):
    auth = signing.get_auth(actor.private_key, actor.private_key_id)
    try:
        response = session.get_session().get(

@ -91,15 +25,14 @@ def get_library_data(library_url):
        return {"errors": ["Permission denied while scanning library"]}
    elif scode >= 400:
        return {"errors": ["Error {} while fetching the library".format(scode)]}
    serializer = serializers.PaginatedCollectionSerializer(data=response.json())
    serializer = serializers.LibrarySerializer(data=response.json())
    if not serializer.is_valid():
        return {"errors": ["Invalid ActivityPub response from remote library"]}

    return serializer.validated_data


def get_library_page(library, page_url):
    actor = actors.SYSTEM_ACTORS["library"].get_actor_instance()
def get_library_page(library, page_url, actor):
    auth = signing.get_auth(actor.private_key, actor.private_key_id)
    response = session.get_session().get(
        page_url,

@ -110,7 +43,7 @@ def get_library_page(library, page_url):
    )
    serializer = serializers.CollectionPageSerializer(
        data=response.json(),
        context={"library": library, "item_serializer": serializers.AudioSerializer},
        context={"library": library, "item_serializer": serializers.UploadSerializer},
    )
    serializer.is_valid(raise_exception=True)
    return serializer.validated_data

@ -0,0 +1,95 @@
# Generated by Django 2.0.7 on 2018-08-07 17:48

import django.contrib.postgres.fields.jsonb
import django.core.serializers.json
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
import uuid


class Migration(migrations.Migration):

    dependencies = [("federation", "0006_auto_20180521_1702")]

    operations = [
        migrations.CreateModel(
            name="Activity",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                ("uuid", models.UUIDField(default=uuid.uuid4, unique=True)),
                (
                    "fid",
                    models.URLField(blank=True, max_length=500, null=True, unique=True),
                ),
                ("url", models.URLField(blank=True, max_length=500, null=True)),
                (
                    "payload",
                    django.contrib.postgres.fields.jsonb.JSONField(
                        default={},
                        encoder=django.core.serializers.json.DjangoJSONEncoder,
                        max_length=50000,
                    ),
                ),
                (
                    "creation_date",
                    models.DateTimeField(default=django.utils.timezone.now),
                ),
                ("delivered", models.NullBooleanField(default=None)),
                ("delivered_date", models.DateTimeField(blank=True, null=True)),
            ],
        ),
        migrations.CreateModel(
            name="LibraryFollow",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                (
                    "fid",
                    models.URLField(blank=True, max_length=500, null=True, unique=True),
                ),
                ("uuid", models.UUIDField(default=uuid.uuid4, unique=True)),
                (
                    "creation_date",
                    models.DateTimeField(default=django.utils.timezone.now),
                ),
                ("modification_date", models.DateTimeField(auto_now=True)),
                ("approved", models.NullBooleanField(default=None)),
            ],
        ),
        migrations.RenameField("actor", "url", "fid"),
        migrations.AddField(
            model_name="actor",
            name="url",
            field=models.URLField(blank=True, max_length=500, null=True),
        ),
        migrations.AddField(
            model_name="follow",
            name="fid",
            field=models.URLField(blank=True, max_length=500, null=True, unique=True),
        ),
        migrations.AddField(
            model_name="libraryfollow",
            name="actor",
            field=models.ForeignKey(
                on_delete=django.db.models.deletion.CASCADE,
                related_name="library_follows",
                to="federation.Actor",
            ),
        ),
    ]

@ -0,0 +1,36 @@
# Generated by Django 2.0.7 on 2018-08-07 17:48

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ("music", "0029_auto_20180807_1748"),
        ("federation", "0007_auto_20180807_1748"),
    ]

    operations = [
        migrations.AddField(
            model_name="libraryfollow",
            name="target",
            field=models.ForeignKey(
                on_delete=django.db.models.deletion.CASCADE,
                related_name="received_follows",
                to="music.Library",
            ),
        ),
        migrations.AddField(
            model_name="activity",
            name="actor",
            field=models.ForeignKey(
                on_delete=django.db.models.deletion.CASCADE,
                related_name="activities",
                to="federation.Actor",
            ),
        ),
        migrations.AlterUniqueTogether(
            name="libraryfollow", unique_together={("actor", "target")}
        ),
    ]

@ -0,0 +1,44 @@
# Generated by Django 2.0.8 on 2018-08-22 19:56

import django.contrib.postgres.fields.jsonb
import django.core.serializers.json
from django.db import migrations, models
import django.db.models.deletion
import funkwhale_api.federation.models


class Migration(migrations.Migration):

    dependencies = [("federation", "0008_auto_20180807_1748")]

    operations = [
        migrations.AddField(
            model_name="activity",
            name="recipient",
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.SET_NULL,
                related_name="inbox_activities",
                to="federation.Actor",
            ),
        ),
        migrations.AlterField(
            model_name="activity",
            name="payload",
            field=django.contrib.postgres.fields.jsonb.JSONField(
                default=funkwhale_api.federation.models.empty_dict,
                encoder=django.core.serializers.json.DjangoJSONEncoder,
                max_length=50000,
            ),
        ),
        migrations.AlterField(
            model_name="librarytrack",
            name="metadata",
            field=django.contrib.postgres.fields.jsonb.JSONField(
                default=funkwhale_api.federation.models.empty_dict,
                encoder=django.core.serializers.json.DjangoJSONEncoder,
                max_length=10000,
            ),
        ),
    ]

@@ -0,0 +1,74 @@
# Generated by Django 2.0.8 on 2018-09-04 20:11

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [("federation", "0009_auto_20180822_1956")]

    operations = [
        migrations.CreateModel(
            name="InboxItem",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                ("is_delivered", models.BooleanField(default=False)),
                (
                    "type",
                    models.CharField(
                        choices=[("to", "to"), ("cc", "cc")], max_length=10
                    ),
                ),
                ("last_delivery_date", models.DateTimeField(blank=True, null=True)),
                ("delivery_attempts", models.PositiveIntegerField(default=0)),
            ],
        ),
        migrations.RemoveField(model_name="activity", name="delivered"),
        migrations.RemoveField(model_name="activity", name="delivered_date"),
        migrations.RemoveField(model_name="activity", name="recipient"),
        migrations.AlterField(
            model_name="activity",
            name="actor",
            field=models.ForeignKey(
                on_delete=django.db.models.deletion.CASCADE,
                related_name="outbox_activities",
                to="federation.Actor",
            ),
        ),
        migrations.AddField(
            model_name="inboxitem",
            name="activity",
            field=models.ForeignKey(
                on_delete=django.db.models.deletion.CASCADE,
                related_name="inbox_items",
                to="federation.Activity",
            ),
        ),
        migrations.AddField(
            model_name="inboxitem",
            name="actor",
            field=models.ForeignKey(
                on_delete=django.db.models.deletion.CASCADE,
                related_name="inbox_items",
                to="federation.Actor",
            ),
        ),
        migrations.AddField(
            model_name="activity",
            name="recipients",
            field=models.ManyToManyField(
                related_name="inbox_activities",
                through="federation.InboxItem",
                to="federation.Actor",
            ),
        ),
    ]
@@ -0,0 +1,61 @@
# Generated by Django 2.0.8 on 2018-09-10 19:02

from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone


class Migration(migrations.Migration):

    dependencies = [
        ('contenttypes', '0002_remove_content_type_name'),
        ('federation', '0010_auto_20180904_2011'),
    ]

    operations = [
        migrations.AddField(
            model_name='activity',
            name='object_content_type',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='objecting_activities', to='contenttypes.ContentType'),
        ),
        migrations.AddField(
            model_name='activity',
            name='object_id',
            field=models.IntegerField(null=True),
        ),
        migrations.AddField(
            model_name='activity',
            name='related_object_content_type',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='related_objecting_activities', to='contenttypes.ContentType'),
        ),
        migrations.AddField(
            model_name='activity',
            name='related_object_id',
            field=models.IntegerField(null=True),
        ),
        migrations.AddField(
            model_name='activity',
            name='target_content_type',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='targeting_activities', to='contenttypes.ContentType'),
        ),
        migrations.AddField(
            model_name='activity',
            name='target_id',
            field=models.IntegerField(null=True),
        ),
        migrations.AddField(
            model_name='activity',
            name='type',
            field=models.CharField(db_index=True, max_length=100, null=True),
        ),
        migrations.AddField(
            model_name='inboxitem',
            name='is_read',
            field=models.BooleanField(default=False),
        ),
        migrations.AlterField(
            model_name='activity',
            name='creation_date',
            field=models.DateTimeField(db_index=True, default=django.utils.timezone.now),
        ),
    ]
@@ -0,0 +1,37 @@
# Generated by Django 2.0.8 on 2018-09-20 18:03

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('federation', '0011_auto_20180910_1902'),
    ]

    operations = [
        migrations.CreateModel(
            name='Delivery',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('is_delivered', models.BooleanField(default=False)),
                ('last_attempt_date', models.DateTimeField(blank=True, null=True)),
                ('attempts', models.PositiveIntegerField(default=0)),
                ('inbox_url', models.URLField(max_length=500)),
                ('activity', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='deliveries', to='federation.Activity')),
            ],
        ),
        migrations.RemoveField(
            model_name='inboxitem',
            name='delivery_attempts',
        ),
        migrations.RemoveField(
            model_name='inboxitem',
            name='is_delivered',
        ),
        migrations.RemoveField(
            model_name='inboxitem',
            name='last_delivery_date',
        ),
    ]
@@ -3,14 +3,20 @@ import uuid

from django.conf import settings
from django.contrib.postgres.fields import JSONField
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.core.exceptions import ObjectDoesNotExist
from django.core.serializers.json import DjangoJSONEncoder
from django.db import models
from django.utils import timezone
from django.urls import reverse

from funkwhale_api.common import session
from funkwhale_api.common import utils as common_utils
from funkwhale_api.music import utils as music_utils

from . import utils as federation_utils

TYPE_CHOICES = [
    ("Person", "Person"),
    ("Application", "Application"),
@@ -20,15 +26,47 @@
]


def empty_dict():
    return {}


def get_shared_inbox_url():
    return federation_utils.full_url(reverse("federation:shared-inbox"))


class FederationMixin(models.Model):
    # federation id/url
    fid = models.URLField(unique=True, max_length=500, db_index=True)
    url = models.URLField(max_length=500, null=True, blank=True)

    class Meta:
        abstract = True


class ActorQuerySet(models.QuerySet):
    def local(self, include=True):
        return self.exclude(user__isnull=include)

    def with_current_usage(self):
        qs = self
        for s in ["pending", "skipped", "errored", "finished"]:
            qs = qs.annotate(
                **{
                    "_usage_{}".format(s): models.Sum(
                        "libraries__uploads__size",
                        filter=models.Q(libraries__uploads__import_status=s),
                    )
                }
            )

        return qs


class Actor(models.Model):
    ap_type = "Actor"

-    url = models.URLField(unique=True, max_length=500, db_index=True)
+    fid = models.URLField(unique=True, max_length=500, db_index=True)
+    url = models.URLField(max_length=500, null=True, blank=True)
    outbox_url = models.URLField(max_length=500)
    inbox_url = models.URLField(max_length=500)
    following_url = models.URLField(max_length=500, null=True, blank=True)
@@ -39,8 +77,8 @@ class Actor(models.Model):
    domain = models.CharField(max_length=1000)
    summary = models.CharField(max_length=500, null=True, blank=True)
    preferred_username = models.CharField(max_length=200, null=True, blank=True)
-    public_key = models.CharField(max_length=5000, null=True, blank=True)
-    private_key = models.CharField(max_length=5000, null=True, blank=True)
+    public_key = models.TextField(max_length=5000, null=True, blank=True)
+    private_key = models.TextField(max_length=5000, null=True, blank=True)
    creation_date = models.DateTimeField(default=timezone.now)
    last_fetch_date = models.DateTimeField(default=timezone.now)
    manually_approves_followers = models.NullBooleanField(default=None)
@@ -63,11 +101,14 @@ class Actor(models.Model):

    @property
    def private_key_id(self):
-        return "{}#main-key".format(self.url)
+        return "{}#main-key".format(self.fid)

    @property
-    def mention_username(self):
-        return "@{}@{}".format(self.preferred_username, self.domain)
+    def full_username(self):
+        return "{}@{}".format(self.preferred_username, self.domain)

    def __str__(self):
        return "{}@{}".format(self.preferred_username, self.domain)

    def save(self, **kwargs):
        lowercase_fields = ["domain"]
@@ -104,26 +145,137 @@ class Actor(models.Model):
        follows = self.received_follows.filter(approved=True)
        return self.followers.filter(pk__in=follows.values_list("actor", flat=True))

    def should_autoapprove_follow(self, actor):
        return False

class Follow(models.Model):
    ap_type = "Follow"
    def get_user(self):
        try:
            return self.user
        except ObjectDoesNotExist:
            return None

    def get_current_usage(self):
        actor = self.__class__.objects.filter(pk=self.pk).with_current_usage().get()
        data = {}
        for s in ["pending", "skipped", "errored", "finished"]:
            data[s] = getattr(actor, "_usage_{}".format(s)) or 0

        data["total"] = sum(data.values())
        return data


class InboxItem(models.Model):
    """
    Store activities binding to local actors, with read/unread status.
    """

    actor = models.ForeignKey(
        Actor, related_name="inbox_items", on_delete=models.CASCADE
    )
    activity = models.ForeignKey(
        "Activity", related_name="inbox_items", on_delete=models.CASCADE
    )
    type = models.CharField(max_length=10, choices=[("to", "to"), ("cc", "cc")])
    is_read = models.BooleanField(default=False)


class Delivery(models.Model):
    """
    Store delivery attempts to remote inboxes
    """

    is_delivered = models.BooleanField(default=False)
    last_attempt_date = models.DateTimeField(null=True, blank=True)
    attempts = models.PositiveIntegerField(default=0)
    inbox_url = models.URLField(max_length=500)

    activity = models.ForeignKey(
        "Activity", related_name="deliveries", on_delete=models.CASCADE
    )


class Activity(models.Model):
    actor = models.ForeignKey(
        Actor, related_name="outbox_activities", on_delete=models.CASCADE
    )
    recipients = models.ManyToManyField(
        Actor, related_name="inbox_activities", through=InboxItem
    )
    uuid = models.UUIDField(default=uuid.uuid4, unique=True)
    fid = models.URLField(unique=True, max_length=500, null=True, blank=True)
    url = models.URLField(max_length=500, null=True, blank=True)
    payload = JSONField(default=empty_dict, max_length=50000, encoder=DjangoJSONEncoder)
    creation_date = models.DateTimeField(default=timezone.now, db_index=True)
    type = models.CharField(db_index=True, null=True, max_length=100)

    # generic relations
    object_id = models.IntegerField(null=True)
    object_content_type = models.ForeignKey(
        ContentType,
        null=True,
        on_delete=models.SET_NULL,
        related_name="objecting_activities",
    )
    object = GenericForeignKey("object_content_type", "object_id")
    target_id = models.IntegerField(null=True)
    target_content_type = models.ForeignKey(
        ContentType,
        null=True,
        on_delete=models.SET_NULL,
        related_name="targeting_activities",
    )
    target = GenericForeignKey("target_content_type", "target_id")
    related_object_id = models.IntegerField(null=True)
    related_object_content_type = models.ForeignKey(
        ContentType,
        null=True,
        on_delete=models.SET_NULL,
        related_name="related_objecting_activities",
    )
    related_object = GenericForeignKey(
        "related_object_content_type", "related_object_id"
    )


class AbstractFollow(models.Model):
    ap_type = "Follow"
    fid = models.URLField(unique=True, max_length=500, null=True, blank=True)
    uuid = models.UUIDField(default=uuid.uuid4, unique=True)
    creation_date = models.DateTimeField(default=timezone.now)
    modification_date = models.DateTimeField(auto_now=True)
    approved = models.NullBooleanField(default=None)

    class Meta:
        abstract = True

    def get_federation_id(self):
        return federation_utils.full_url(
            "{}#follows/{}".format(self.actor.fid, self.uuid)
        )


class Follow(AbstractFollow):
    actor = models.ForeignKey(
        Actor, related_name="emitted_follows", on_delete=models.CASCADE
    )
    target = models.ForeignKey(
        Actor, related_name="received_follows", on_delete=models.CASCADE
    )
    creation_date = models.DateTimeField(default=timezone.now)
    modification_date = models.DateTimeField(auto_now=True)
    approved = models.NullBooleanField(default=None)

    class Meta:
        unique_together = ["actor", "target"]

    def get_federation_url(self):
        return "{}#follows/{}".format(self.actor.url, self.uuid)

class LibraryFollow(AbstractFollow):
    actor = models.ForeignKey(
        Actor, related_name="library_follows", on_delete=models.CASCADE
    )
    target = models.ForeignKey(
        "music.Library", related_name="received_follows", on_delete=models.CASCADE
    )

    class Meta:
        unique_together = ["actor", "target"]


class Library(models.Model):
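The `get_current_usage` method above builds one size total per import status and then a grand total. A standalone sketch of the same bookkeeping over plain dicts (statuses and field names mirror the model code; no Django or database involved, so the data below is illustrative):

```python
STATUSES = ["pending", "skipped", "errored", "finished"]


def current_usage(uploads):
    # Mirror Actor.get_current_usage(): one byte total per import status,
    # treating a missing size as 0 (like "or 0" over a NULL Sum), plus a total.
    data = {s: 0 for s in STATUSES}
    for upload in uploads:
        status = upload["import_status"]
        if status in data:
            data[status] += upload["size"] or 0
    data["total"] = sum(data[s] for s in STATUSES)
    return data


usage = current_usage(
    [
        {"import_status": "finished", "size": 100},
        {"import_status": "finished", "size": 50},
        {"import_status": "pending", "size": None},  # size may be NULL
    ]
)
# usage["finished"] is 150, usage["total"] is 150
```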
@@ -167,7 +319,9 @@ class LibraryTrack(models.Model):
    artist_name = models.CharField(max_length=500)
    album_title = models.CharField(max_length=500)
    title = models.CharField(max_length=500)
-    metadata = JSONField(default={}, max_length=10000, encoder=DjangoJSONEncoder)
+    metadata = JSONField(
+        default=empty_dict, max_length=10000, encoder=DjangoJSONEncoder
+    )

    @property
    def mbid(self):
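The change above replaces `default={}` with `default=empty_dict` on the JSON field. The motivation is Python's usual mutable-default pitfall: a literal `{}` is a single dict instance shared by everything that relies on the default, whereas a callable yields a fresh dict on each use. A minimal plain-Python illustration (the `value_from_default` helper is a stand-in for how a field default produces a value, not Django API):

```python
def empty_dict():
    return {}


def value_from_default(default):
    # Mimic producing a default value for each new row: callables are
    # invoked, plain objects are returned as-is (and therefore shared).
    return default() if callable(default) else default


shared = {}  # the old default={} style: one instance for everyone
a = value_from_default(shared)
a["polluted"] = True
b = value_from_default(shared)  # same dict, already mutated elsewhere

c = value_from_default(empty_dict)
c["polluted"] = True
d = value_from_default(empty_dict)  # fresh, still-empty dict each call
```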
@@ -1,19 +0,0 @@
from rest_framework.permissions import BasePermission

from funkwhale_api.common import preferences

from . import actors


class LibraryFollower(BasePermission):
    def has_permission(self, request, view):
        if not preferences.get("federation__music_needs_approval"):
            return True

        actor = getattr(request, "actor", None)
        if actor is None:
            return False

        library = actors.SYSTEM_ACTORS["library"].get_actor_instance()
        return library.received_follows.filter(approved=True, actor=actor).exists()
@@ -0,0 +1,232 @@
import logging

from funkwhale_api.music import models as music_models

from . import activity
from . import serializers

logger = logging.getLogger(__name__)
inbox = activity.InboxRouter()
outbox = activity.OutboxRouter()


def with_recipients(payload, to=[], cc=[]):
    if to:
        payload["to"] = to
    if cc:
        payload["cc"] = cc
    return payload


@inbox.register({"type": "Follow"})
def inbox_follow(payload, context):
    context["recipient"] = [
        ii.actor for ii in context["inbox_items"] if ii.type == "to"
    ][0]
    serializer = serializers.FollowSerializer(data=payload, context=context)
    if not serializer.is_valid(raise_exception=context.get("raise_exception", False)):
        logger.debug(
            "Discarding invalid follow from %s: %s",
            context["actor"].fid,
            serializer.errors,
        )
        return

    autoapprove = serializer.validated_data["object"].should_autoapprove_follow(
        context["actor"]
    )
    follow = serializer.save(approved=True if autoapprove else None)
    if follow.approved:
        outbox.dispatch({"type": "Accept"}, context={"follow": follow})
    return {"object": follow.target, "related_object": follow}


@inbox.register({"type": "Accept"})
def inbox_accept(payload, context):
    context["recipient"] = [
        ii.actor for ii in context["inbox_items"] if ii.type == "to"
    ][0]
    serializer = serializers.AcceptFollowSerializer(data=payload, context=context)
    if not serializer.is_valid(raise_exception=context.get("raise_exception", False)):
        logger.debug(
            "Discarding invalid accept from %s: %s",
            context["actor"].fid,
            serializer.errors,
        )
        return

    serializer.save()
    obj = serializer.validated_data["follow"]
    return {"object": obj, "related_object": obj.target}


@outbox.register({"type": "Accept"})
def outbox_accept(context):
    follow = context["follow"]
    if follow._meta.label == "federation.LibraryFollow":
        actor = follow.target.actor
    else:
        actor = follow.target
    payload = serializers.AcceptFollowSerializer(follow, context={"actor": actor}).data
    yield {
        "actor": actor,
        "type": "Accept",
        "payload": with_recipients(payload, to=[follow.actor]),
        "object": follow,
        "related_object": follow.target,
    }


@inbox.register({"type": "Undo", "object.type": "Follow"})
def inbox_undo_follow(payload, context):
    serializer = serializers.UndoFollowSerializer(data=payload, context=context)
    if not serializer.is_valid(raise_exception=context.get("raise_exception", False)):
        logger.debug(
            "Discarding invalid follow undo from %s: %s",
            context["actor"].fid,
            serializer.errors,
        )
        return

    serializer.save()


@outbox.register({"type": "Undo", "object.type": "Follow"})
def outbox_undo_follow(context):
    follow = context["follow"]
    actor = follow.actor
    if follow._meta.label == "federation.LibraryFollow":
        recipient = follow.target.actor
    else:
        recipient = follow.target
    payload = serializers.UndoFollowSerializer(follow, context={"actor": actor}).data
    yield {
        "actor": actor,
        "type": "Undo",
        "payload": with_recipients(payload, to=[recipient]),
        "object": follow,
        "related_object": follow.target,
    }


@outbox.register({"type": "Follow"})
def outbox_follow(context):
    follow = context["follow"]
    if follow._meta.label == "federation.LibraryFollow":
        target = follow.target.actor
    else:
        target = follow.target
    payload = serializers.FollowSerializer(follow, context={"actor": follow.actor}).data
    yield {
        "type": "Follow",
        "actor": follow.actor,
        "payload": with_recipients(payload, to=[target]),
        "object": follow.target,
        "related_object": follow,
    }


@outbox.register({"type": "Create", "object.type": "Audio"})
def outbox_create_audio(context):
    upload = context["upload"]
    serializer = serializers.ActivitySerializer(
        {
            "type": "Create",
            "actor": upload.library.actor.fid,
            "object": serializers.UploadSerializer(upload).data,
        }
    )
    yield {
        "type": "Create",
        "actor": upload.library.actor,
        "payload": with_recipients(
            serializer.data, to=[{"type": "followers", "target": upload.library}]
        ),
        "object": upload,
        "target": upload.library,
    }


@inbox.register({"type": "Create", "object.type": "Audio"})
def inbox_create_audio(payload, context):
    serializer = serializers.UploadSerializer(
        data=payload["object"],
        context={"activity": context.get("activity"), "actor": context["actor"]},
    )

    if not serializer.is_valid(raise_exception=context.get("raise_exception", False)):
        logger.warning("Discarding invalid audio create")
        return

    upload = serializer.save()

    return {"object": upload, "target": upload.library}


@inbox.register({"type": "Delete", "object.type": "Library"})
def inbox_delete_library(payload, context):
    actor = context["actor"]
    library_id = payload["object"].get("id")
    if not library_id:
        logger.debug("Discarding deletion of empty library")
        return

    try:
        library = actor.libraries.get(fid=library_id)
    except music_models.Library.DoesNotExist:
        logger.debug("Discarding deletion of unknown library %s", library_id)
        return

    library.delete()


@outbox.register({"type": "Delete", "object.type": "Library"})
def outbox_delete_library(context):
    library = context["library"]
    serializer = serializers.ActivitySerializer(
        {"type": "Delete", "object": {"type": "Library", "id": library.fid}}
    )
    yield {
        "type": "Delete",
        "actor": library.actor,
        "payload": with_recipients(
            serializer.data, to=[{"type": "followers", "target": library}]
        ),
    }


@inbox.register({"type": "Delete", "object.type": "Audio"})
def inbox_delete_audio(payload, context):
    actor = context["actor"]
    try:
        upload_fids = [i for i in payload["object"]["id"]]
    except TypeError:
        # we did not receive a list of ids, so we can probably use the value directly
        upload_fids = [payload["object"]["id"]]

    candidates = music_models.Upload.objects.filter(
        library__actor=actor, fid__in=upload_fids
    )

    total = candidates.count()
    logger.info("Deleting %s uploads with ids %s", total, upload_fids)
    candidates.delete()


@outbox.register({"type": "Delete", "object.type": "Audio"})
def outbox_delete_audio(context):
    uploads = context["uploads"]
    library = uploads[0].library
    serializer = serializers.ActivitySerializer(
        {
            "type": "Delete",
            "object": {"type": "Audio", "id": [u.get_federation_id() for u in uploads]},
        }
    )
    yield {
        "type": "Delete",
        "actor": library.actor,
        "payload": with_recipients(
            serializer.data, to=[{"type": "followers", "target": library}]
        ),
    }
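The `with_recipients` helper above only attaches `to`/`cc` keys when recipients are actually provided, so payloads without recipients pass through unchanged. A standalone copy of the helper, runnable outside the routers (the actor URL is a made-up example):

```python
def with_recipients(payload, to=[], cc=[]):
    # Copy of the helper above: set a delivery key only when there is
    # someone to deliver to, otherwise leave the payload untouched.
    if to:
        payload["to"] = to
    if cc:
        payload["cc"] = cc
    return payload


payload = with_recipients({"type": "Follow"}, to=["https://example.test/actor"])
# payload gains a "to" key; "cc" is never added when empty
```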
@@ -4,15 +4,12 @@ import urllib.parse

from django.core.exceptions import ObjectDoesNotExist
from django.core.paginator import Paginator
from django.db import transaction
from rest_framework import serializers

from funkwhale_api.common import serializers as common_serializers
from funkwhale_api.common import utils as funkwhale_utils
from funkwhale_api.music import models as music_models
from funkwhale_api.music import tasks as music_tasks

-from . import activity, filters, models, utils
+from . import activity, models, utils

AP_CONTEXT = [
    "https://www.w3.org/ns/activitystreams",
@@ -23,6 +20,31 @@
logger = logging.getLogger(__name__)


class LinkSerializer(serializers.Serializer):
    type = serializers.ChoiceField(choices=["Link"])
    href = serializers.URLField(max_length=500)
    mediaType = serializers.CharField()

    def __init__(self, *args, **kwargs):
        self.allowed_mimetypes = kwargs.pop("allowed_mimetypes", [])
        super().__init__(*args, **kwargs)

    def validate_mediaType(self, v):
        if not self.allowed_mimetypes:
            # no restrictions
            return v
        for mt in self.allowed_mimetypes:
            if mt.endswith("/*"):
                if v.startswith(mt.replace("*", "")):
                    return v
            else:
                if v == mt:
                    return v
        raise serializers.ValidationError(
            "Invalid mimetype {}. Allowed: {}".format(v, self.allowed_mimetypes)
        )


class ActorSerializer(serializers.Serializer):
    id = serializers.URLField(max_length=500)
    outbox = serializers.URLField(max_length=500)
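`validate_mediaType` above implements a small whitelist with trailing-`*` wildcards: an empty whitelist allows everything, entries such as `audio/*` match by prefix, and any other entry must match exactly. The matching rule extracted as a plain boolean function (the function name is illustrative, not part of the serializer API):

```python
def media_type_allowed(value, allowed_mimetypes):
    # Same rules as LinkSerializer.validate_mediaType: an empty whitelist
    # means no restriction; "type/*" entries match on the "type/" prefix;
    # anything else requires an exact match.
    if not allowed_mimetypes:
        return True
    for mt in allowed_mimetypes:
        if mt.endswith("/*"):
            if value.startswith(mt.replace("*", "")):
                return True
        elif value == mt:
            return True
    return False
```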
@@ -32,13 +54,13 @@ class ActorSerializer(serializers.Serializer):
    manuallyApprovesFollowers = serializers.NullBooleanField(required=False)
    name = serializers.CharField(required=False, max_length=200)
    summary = serializers.CharField(max_length=None, required=False)
-    followers = serializers.URLField(max_length=500, required=False, allow_null=True)
+    followers = serializers.URLField(max_length=500)
    following = serializers.URLField(max_length=500, required=False, allow_null=True)
    publicKey = serializers.JSONField(required=False)

    def to_representation(self, instance):
        ret = {
-            "id": instance.url,
+            "id": instance.fid,
            "outbox": instance.outbox_url,
            "inbox": instance.inbox_url,
            "preferredUsername": instance.preferred_username,
@@ -58,9 +80,9 @@ class ActorSerializer(serializers.Serializer):
        ret["@context"] = AP_CONTEXT
        if instance.public_key:
            ret["publicKey"] = {
-                "owner": instance.url,
+                "owner": instance.fid,
                "publicKeyPem": instance.public_key,
-                "id": "{}#main-key".format(instance.url),
+                "id": "{}#main-key".format(instance.fid),
            }
        ret["endpoints"] = {}
        if instance.shared_inbox_url:
@@ -78,7 +100,7 @@ class ActorSerializer(serializers.Serializer):

    def prepare_missing_fields(self):
        kwargs = {
-            "url": self.validated_data["id"],
+            "fid": self.validated_data["id"],
            "outbox_url": self.validated_data["outbox"],
            "inbox_url": self.validated_data["inbox"],
            "following_url": self.validated_data.get("following"),
@@ -91,7 +113,7 @@ class ActorSerializer(serializers.Serializer):
        maf = self.validated_data.get("manuallyApprovesFollowers")
        if maf is not None:
            kwargs["manually_approves_followers"] = maf
-        domain = urllib.parse.urlparse(kwargs["url"]).netloc
+        domain = urllib.parse.urlparse(kwargs["fid"]).netloc
        kwargs["domain"] = domain
        for endpoint, url in self.initial_data.get("endpoints", {}).items():
            if endpoint == "sharedInbox":
@@ -110,7 +132,7 @@ class ActorSerializer(serializers.Serializer):
    def save(self, **kwargs):
        d = self.prepare_missing_fields()
        d.update(kwargs)
-        return models.Actor.objects.update_or_create(url=d["url"], defaults=d)[0]
+        return models.Actor.objects.update_or_create(fid=d["fid"], defaults=d)[0]

    def validate_summary(self, value):
        if value:
@@ -122,6 +144,7 @@ class APIActorSerializer(serializers.ModelSerializer):
        model = models.Actor
        fields = [
            "id",
+            "fid",
            "url",
            "creation_date",
            "summary",
@@ -131,190 +154,50 @@ class APIActorSerializer(serializers.ModelSerializer):
            "domain",
            "type",
            "manually_approves_followers",
            "full_username",
        ]


class LibraryActorSerializer(ActorSerializer):
    url = serializers.ListField(child=serializers.JSONField())

    def validate(self, validated_data):
        try:
            urls = validated_data["url"]
        except KeyError:
            raise serializers.ValidationError("Missing URL field")

        for u in urls:
            try:
                if u["name"] != "library":
                    continue
                validated_data["library_url"] = u["href"]
                break
            except KeyError:
                continue

        return validated_data


class APIFollowSerializer(serializers.ModelSerializer):
    class Meta:
        model = models.Follow
        fields = [
            "uuid",
            "actor",
            "target",
            "approved",
            "creation_date",
            "modification_date",
        ]


class APILibrarySerializer(serializers.ModelSerializer):
    actor = APIActorSerializer()
    follow = APIFollowSerializer()

    class Meta:
        model = models.Library

        read_only_fields = [
            "actor",
            "uuid",
            "url",
            "tracks_count",
            "follow",
            "fetched_date",
            "modification_date",
            "creation_date",
        ]
        fields = [
            "autoimport",
            "federation_enabled",
            "download_files",
        ] + read_only_fields


class APILibraryScanSerializer(serializers.Serializer):
    until = serializers.DateTimeField(required=False)


class APILibraryFollowUpdateSerializer(serializers.Serializer):
    follow = serializers.IntegerField()
    approved = serializers.BooleanField()

    def validate_follow(self, value):
        from . import actors

        library_actor = actors.SYSTEM_ACTORS["library"].get_actor_instance()
        qs = models.Follow.objects.filter(pk=value, target=library_actor)
        try:
            return qs.get()
        except models.Follow.DoesNotExist:
            raise serializers.ValidationError("Invalid follow")

    def save(self):
        new_status = self.validated_data["approved"]
        follow = self.validated_data["follow"]
        if new_status == follow.approved:
            return follow

        follow.approved = new_status
        follow.save(update_fields=["approved", "modification_date"])
        if new_status:
            activity.accept_follow(follow)
        return follow


class APILibraryCreateSerializer(serializers.ModelSerializer):
class BaseActivitySerializer(serializers.Serializer):
    id = serializers.URLField(max_length=500, required=False)
    type = serializers.CharField(max_length=100)
    actor = serializers.URLField(max_length=500)
    federation_enabled = serializers.BooleanField()
    uuid = serializers.UUIDField(read_only=True)

    class Meta:
        model = models.Library
        fields = ["uuid", "actor", "autoimport", "federation_enabled", "download_files"]

    def validate(self, validated_data):
        from . import actors
        from . import library

        actor_url = validated_data["actor"]
        actor_data = actors.get_actor_data(actor_url)
        acs = LibraryActorSerializer(data=actor_data)
        acs.is_valid(raise_exception=True)
    def validate_actor(self, v):
        expected = self.context.get("actor")
        if expected and expected.fid != v:
            raise serializers.ValidationError("Invalid actor")
        if expected:
            # avoid a DB lookup
            return expected
        try:
            actor = models.Actor.objects.get(url=actor_url)
            return models.Actor.objects.get(fid=v)
        except models.Actor.DoesNotExist:
            actor = acs.save()
        library_actor = actors.SYSTEM_ACTORS["library"].get_actor_instance()
        validated_data["follow"] = models.Follow.objects.get_or_create(
            actor=library_actor, target=actor
        )[0]
        if validated_data["follow"].approved is None:
            funkwhale_utils.on_commit(
                activity.deliver,
                FollowSerializer(validated_data["follow"]).data,
                on_behalf_of=validated_data["follow"].actor,
                to=[validated_data["follow"].target.url],
            )

        library_data = library.get_library_data(acs.validated_data["library_url"])
        if "errors" in library_data:
            # we pass silently because it may mean we require permission
            # before scanning
            pass
        validated_data["library"] = library_data
        validated_data["library"].setdefault("id", acs.validated_data["library_url"])
        validated_data["actor"] = actor
        return validated_data
            raise serializers.ValidationError("Actor not found")

    def create(self, validated_data):
        library = models.Library.objects.update_or_create(
            url=validated_data["library"]["id"],
            defaults={
                "actor": validated_data["actor"],
                "follow": validated_data["follow"],
                "tracks_count": validated_data["library"].get("totalItems"),
                "federation_enabled": validated_data["federation_enabled"],
                "autoimport": validated_data["autoimport"],
                "download_files": validated_data["download_files"],
            },
        )[0]
        return library
        return models.Activity.objects.create(
            fid=validated_data.get("id"),
            actor=validated_data["actor"],
            payload=self.initial_data,
            type=validated_data["type"],
        )

    def validate(self, data):
        data["recipients"] = self.validate_recipients(self.initial_data)
        return super().validate(data)

class APILibraryTrackSerializer(serializers.ModelSerializer):
    library = APILibrarySerializer()
    status = serializers.SerializerMethodField()
    def validate_recipients(self, payload):
        """
        Ensure we have at least a to/cc field with valid actors
        """
        to = payload.get("to", [])
        cc = payload.get("cc", [])

    class Meta:
        model = models.LibraryTrack
        fields = [
            "id",
            "url",
            "audio_url",
            "audio_mimetype",
|
||||
"creation_date",
|
||||
"modification_date",
|
||||
"fetched_date",
|
||||
"published_date",
|
||||
"metadata",
|
||||
"artist_name",
|
||||
"album_title",
|
||||
"title",
|
||||
"library",
|
||||
"local_track_file",
|
||||
"status",
|
||||
]
|
||||
|
||||
def get_status(self, o):
|
||||
try:
|
||||
if o.local_track_file is not None:
|
||||
return "imported"
|
||||
except music_models.TrackFile.DoesNotExist:
|
||||
pass
|
||||
for job in o.import_jobs.all():
|
||||
if job.status == "pending":
|
||||
return "import_pending"
|
||||
return "not_imported"
|
||||
if not to and not cc:
|
||||
raise serializers.ValidationError(
|
||||
"We cannot handle an activity with no recipient"
|
||||
)
|
||||
|
||||
|
||||
class FollowSerializer(serializers.Serializer):
|
||||
|
@@ -325,35 +208,61 @@ class FollowSerializer(serializers.Serializer):

def validate_object(self, v):
expected = self.context.get("follow_target")
if expected and expected.url != v:
if self.parent:
# it's probably an accept, so everything is inverted, the actor
# the recipient does not matter
recipient = None
else:
recipient = self.context.get("recipient")
if expected and expected.fid != v:
raise serializers.ValidationError("Invalid target")
try:
return models.Actor.objects.get(url=v)
obj = models.Actor.objects.get(fid=v)
if recipient and recipient.fid != obj.fid:
raise serializers.ValidationError("Invalid target")
return obj
except models.Actor.DoesNotExist:
raise serializers.ValidationError("Target not found")
pass
try:
qs = music_models.Library.objects.filter(fid=v)
if recipient:
qs = qs.filter(actor=recipient)
return qs.get()
except music_models.Library.DoesNotExist:
pass

raise serializers.ValidationError("Target not found")

def validate_actor(self, v):
expected = self.context.get("follow_actor")
if expected and expected.url != v:
if expected and expected.fid != v:
raise serializers.ValidationError("Invalid actor")
try:
return models.Actor.objects.get(url=v)
return models.Actor.objects.get(fid=v)
except models.Actor.DoesNotExist:
raise serializers.ValidationError("Actor not found")

def save(self, **kwargs):
return models.Follow.objects.get_or_create(
target = self.validated_data["object"]

if target._meta.label == "music.Library":
follow_class = models.LibraryFollow
else:
follow_class = models.Follow
defaults = kwargs
defaults["fid"] = self.validated_data["id"]
return follow_class.objects.update_or_create(
actor=self.validated_data["actor"],
target=self.validated_data["object"],
**kwargs,  # noqa
defaults=defaults,
)[0]

def to_representation(self, instance):
return {
"@context": AP_CONTEXT,
"actor": instance.actor.url,
"id": instance.get_federation_url(),
"object": instance.target.url,
"actor": instance.actor.fid,
"id": instance.get_federation_id(),
"object": instance.target.fid,
"type": "Follow",
}

@@ -376,50 +285,66 @@ class APIFollowSerializer(serializers.ModelSerializer):

class AcceptFollowSerializer(serializers.Serializer):
id = serializers.URLField(max_length=500)
id = serializers.URLField(max_length=500, required=False)
actor = serializers.URLField(max_length=500)
object = FollowSerializer()
type = serializers.ChoiceField(choices=["Accept"])

def validate_actor(self, v):
expected = self.context.get("follow_target")
if expected and expected.url != v:
expected = self.context.get("actor")
if expected and expected.fid != v:
raise serializers.ValidationError("Invalid actor")
try:
return models.Actor.objects.get(url=v)
return models.Actor.objects.get(fid=v)
except models.Actor.DoesNotExist:
raise serializers.ValidationError("Actor not found")

def validate(self, validated_data):
# we ensure the accept actor actually match the follow target
if validated_data["actor"] != validated_data["object"]["object"]:
# we ensure the accept actor actually match the follow target / library owner
target = validated_data["object"]["object"]

if target._meta.label == "music.Library":
expected = target.actor
follow_class = models.LibraryFollow
else:
expected = target
follow_class = models.Follow
if validated_data["actor"] != expected:
raise serializers.ValidationError("Actor mismatch")
try:
validated_data["follow"] = (
models.Follow.objects.filter(
target=validated_data["actor"],
actor=validated_data["object"]["actor"],
follow_class.objects.filter(
target=target, actor=validated_data["object"]["actor"]
)
.exclude(approved=True)
.select_related()
.get()
)
except models.Follow.DoesNotExist:
except follow_class.DoesNotExist:
raise serializers.ValidationError("No follow to accept")
return validated_data

def to_representation(self, instance):
if instance.target._meta.label == "music.Library":
actor = instance.target.actor
else:
actor = instance.target

return {
"@context": AP_CONTEXT,
"id": instance.get_federation_url() + "/accept",
"id": instance.get_federation_id() + "/accept",
"type": "Accept",
"actor": instance.target.url,
"actor": actor.fid,
"object": FollowSerializer(instance).data,
}

def save(self):
self.validated_data["follow"].approved = True
self.validated_data["follow"].save()
return self.validated_data["follow"]
follow = self.validated_data["follow"]
follow.approved = True
follow.save()
if follow.target._meta.label == "music.Library":
follow.target.schedule_scan(actor=follow.actor)
return follow


class UndoFollowSerializer(serializers.Serializer):

@@ -429,11 +354,12 @@ class UndoFollowSerializer(serializers.Serializer):
type = serializers.ChoiceField(choices=["Undo"])

def validate_actor(self, v):
expected = self.context.get("follow_target")
if expected and expected.url != v:
expected = self.context.get("actor")

if expected and expected.fid != v:
raise serializers.ValidationError("Invalid actor")
try:
return models.Actor.objects.get(url=v)
return models.Actor.objects.get(fid=v)
except models.Actor.DoesNotExist:
raise serializers.ValidationError("Actor not found")

@@ -441,20 +367,28 @@ class UndoFollowSerializer(serializers.Serializer):
# we ensure the accept actor actually match the follow actor
if validated_data["actor"] != validated_data["object"]["actor"]:
raise serializers.ValidationError("Actor mismatch")

target = validated_data["object"]["object"]

if target._meta.label == "music.Library":
follow_class = models.LibraryFollow
else:
follow_class = models.Follow

try:
validated_data["follow"] = models.Follow.objects.filter(
actor=validated_data["actor"], target=validated_data["object"]["object"]
validated_data["follow"] = follow_class.objects.filter(
actor=validated_data["actor"], target=target
).get()
except models.Follow.DoesNotExist:
except follow_class.DoesNotExist:
raise serializers.ValidationError("No follow to remove")
return validated_data

def to_representation(self, instance):
return {
"@context": AP_CONTEXT,
"id": instance.get_federation_url() + "/undo",
"id": instance.get_federation_id() + "/undo",
"type": "Undo",
"actor": instance.actor.url,
"actor": instance.actor.fid,
"object": FollowSerializer(instance).data,
}

@@ -488,9 +422,9 @@ class ActorWebfingerSerializer(serializers.Serializer):
data = {}
data["subject"] = "acct:{}".format(instance.webfinger_subject)
data["links"] = [
{"rel": "self", "href": instance.url, "type": "application/activity+json"}
{"rel": "self", "href": instance.fid, "type": "application/activity+json"}
]
data["aliases"] = [instance.url]
data["aliases"] = [instance.fid]
return data


@@ -498,7 +432,8 @@ class ActivitySerializer(serializers.Serializer):
actor = serializers.URLField(max_length=500)
id = serializers.URLField(max_length=500, required=False)
type = serializers.ChoiceField(choices=[(c, c) for c in activity.ACTIVITY_TYPES])
object = serializers.JSONField()
object = serializers.JSONField(required=False)
target = serializers.JSONField(required=False)

def validate_object(self, value):
try:

@@ -519,7 +454,7 @@ class ActivitySerializer(serializers.Serializer):

def validate_actor(self, value):
request_actor = self.context.get("actor")
if request_actor and request_actor.url != value:
if request_actor and request_actor.fid != value:
raise serializers.ValidationError(
"The actor making the request do not match" " the activity actor"
)
@@ -560,6 +495,18 @@ class ObjectSerializer(serializers.Serializer):
OBJECT_SERIALIZERS = {t: ObjectSerializer for t in activity.OBJECT_TYPES}


def get_additional_fields(data):
UNSET = object()
additional_fields = {}
for field in ["name", "summary"]:
v = data.get(field, UNSET)
if v == UNSET:
continue
additional_fields[field] = v

return additional_fields


class PaginatedCollectionSerializer(serializers.Serializer):
type = serializers.ChoiceField(choices=["Collection"])
totalItems = serializers.IntegerField(min_value=0)

@@ -575,18 +522,73 @@ class PaginatedCollectionSerializer(serializers.Serializer):
last = funkwhale_utils.set_query_parameter(conf["id"], page=paginator.num_pages)
d = {
"id": conf["id"],
"actor": conf["actor"].url,
"actor": conf["actor"].fid,
"totalItems": paginator.count,
"type": "Collection",
"type": conf.get("type", "Collection"),
"current": current,
"first": first,
"last": last,
}
d.update(get_additional_fields(conf))
if self.context.get("include_ap_context", True):
d["@context"] = AP_CONTEXT
return d


class LibrarySerializer(PaginatedCollectionSerializer):
type = serializers.ChoiceField(choices=["Library"])
name = serializers.CharField()
summary = serializers.CharField(allow_blank=True, allow_null=True, required=False)
followers = serializers.URLField(max_length=500)
audience = serializers.ChoiceField(
choices=["", None, "https://www.w3.org/ns/activitystreams#Public"],
required=False,
allow_null=True,
allow_blank=True,
)

def to_representation(self, library):
conf = {
"id": library.fid,
"name": library.name,
"summary": library.description,
"page_size": 100,
"actor": library.actor,
"items": library.uploads.for_federation(),
"type": "Library",
}
r = super().to_representation(conf)
r["audience"] = (
"https://www.w3.org/ns/activitystreams#Public"
if library.privacy_level == "public"
else ""
)
r["followers"] = library.followers_url
return r

def create(self, validated_data):
actor = utils.retrieve(
validated_data["actor"],
queryset=models.Actor,
serializer_class=ActorSerializer,
)
library, created = music_models.Library.objects.update_or_create(
fid=validated_data["id"],
actor=actor,
defaults={
"uploads_count": validated_data["totalItems"],
"name": validated_data["name"],
"description": validated_data["summary"],
"followers_url": validated_data["followers"],
"privacy_level": "everyone"
if validated_data["audience"]
== "https://www.w3.org/ns/activitystreams#Public"
else "me",
},
)
return library


class CollectionPageSerializer(serializers.Serializer):
type = serializers.ChoiceField(choices=["CollectionPage"])
totalItems = serializers.IntegerField(min_value=0)

@@ -606,9 +608,10 @@ class CollectionPageSerializer(serializers.Serializer):
raw_items = [item_serializer(data=i, context=self.context) for i in v]
valid_items = []
for i in raw_items:
if i.is_valid():
try:
i.is_valid(raise_exception=True)
valid_items.append(i)
else:
except serializers.ValidationError:
logger.debug("Invalid item %s: %s", i.data, i.errors)

return valid_items

@@ -623,7 +626,7 @@ class CollectionPageSerializer(serializers.Serializer):
d = {
"id": id,
"partOf": conf["id"],
"actor": conf["actor"].url,
"actor": conf["actor"].fid,
"totalItems": page.paginator.count,
"type": "CollectionPage",
"first": first,

@@ -645,48 +648,135 @@ class CollectionPageSerializer(serializers.Serializer):
d["next"] = funkwhale_utils.set_query_parameter(
conf["id"], page=page.next_page_number()
)

d.update(get_additional_fields(conf))
if self.context.get("include_ap_context", True):
d["@context"] = AP_CONTEXT
return d


class ArtistMetadataSerializer(serializers.Serializer):
musicbrainz_id = serializers.UUIDField(required=False, allow_null=True)
name = serializers.CharField()


class ReleaseMetadataSerializer(serializers.Serializer):
musicbrainz_id = serializers.UUIDField(required=False, allow_null=True)
title = serializers.CharField()


class RecordingMetadataSerializer(serializers.Serializer):
musicbrainz_id = serializers.UUIDField(required=False, allow_null=True)
title = serializers.CharField()


class AudioMetadataSerializer(serializers.Serializer):
artist = ArtistMetadataSerializer()
release = ReleaseMetadataSerializer()
recording = RecordingMetadataSerializer()
bitrate = serializers.IntegerField(required=False, allow_null=True, min_value=0)
size = serializers.IntegerField(required=False, allow_null=True, min_value=0)
length = serializers.IntegerField(required=False, allow_null=True, min_value=0)


class AudioSerializer(serializers.Serializer):
type = serializers.CharField()
class MusicEntitySerializer(serializers.Serializer):
id = serializers.URLField(max_length=500)
url = serializers.JSONField()
published = serializers.DateTimeField()
updated = serializers.DateTimeField(required=False)
metadata = AudioMetadataSerializer()
musicbrainzId = serializers.UUIDField(allow_null=True, required=False)
name = serializers.CharField(max_length=1000)

def validate_type(self, v):
if v != "Audio":
raise serializers.ValidationError("Invalid type for audio")
return v

class ArtistSerializer(MusicEntitySerializer):
def to_representation(self, instance):
d = {
"type": "Artist",
"id": instance.fid,
"name": instance.name,
"published": instance.creation_date.isoformat(),
"musicbrainzId": str(instance.mbid) if instance.mbid else None,
}

if self.context.get("include_ap_context", self.parent is None):
d["@context"] = AP_CONTEXT
return d


class AlbumSerializer(MusicEntitySerializer):
released = serializers.DateField(allow_null=True, required=False)
artists = serializers.ListField(child=ArtistSerializer(), min_length=1)
cover = LinkSerializer(
allowed_mimetypes=["image/*"], allow_null=True, required=False
)

def to_representation(self, instance):
d = {
"type": "Album",
"id": instance.fid,
"name": instance.title,
"published": instance.creation_date.isoformat(),
"musicbrainzId": str(instance.mbid) if instance.mbid else None,
"released": instance.release_date.isoformat()
if instance.release_date
else None,
"artists": [
ArtistSerializer(
instance.artist, context={"include_ap_context": False}
).data
],
}
if instance.cover:
d["cover"] = {
"type": "Link",
"href": utils.full_url(instance.cover.url),
"mediaType": mimetypes.guess_type(instance.cover.path)[0]
or "image/jpeg",
}
if self.context.get("include_ap_context", self.parent is None):
d["@context"] = AP_CONTEXT
return d

def get_create_data(self, validated_data):
artist_data = validated_data["artists"][0]
artist = ArtistSerializer(
context={"activity": self.context.get("activity")}
).create(artist_data)

return {
"mbid": validated_data.get("musicbrainzId"),
"fid": validated_data["id"],
"title": validated_data["name"],
"creation_date": validated_data["published"],
"artist": artist,
"release_date": validated_data.get("released"),
"from_activity": self.context.get("activity"),
}


class TrackSerializer(MusicEntitySerializer):
position = serializers.IntegerField(min_value=0, allow_null=True, required=False)
artists = serializers.ListField(child=ArtistSerializer(), min_length=1)
album = AlbumSerializer()

def to_representation(self, instance):
d = {
"type": "Track",
"id": instance.fid,
"name": instance.title,
"published": instance.creation_date.isoformat(),
"musicbrainzId": str(instance.mbid) if instance.mbid else None,
"position": instance.position,
"artists": [
ArtistSerializer(
instance.artist, context={"include_ap_context": False}
).data
],
"album": AlbumSerializer(
instance.album, context={"include_ap_context": False}
).data,
}

if self.context.get("include_ap_context", self.parent is None):
d["@context"] = AP_CONTEXT
return d

def create(self, validated_data):
from funkwhale_api.music import tasks as music_tasks

metadata = music_tasks.federation_audio_track_to_metadata(validated_data)
from_activity = self.context.get("activity")
if from_activity:
metadata["from_activity_id"] = from_activity.pk
track = music_tasks.get_track_from_import_metadata(metadata)
return track


class UploadSerializer(serializers.Serializer):
type = serializers.ChoiceField(choices=["Audio"])
id = serializers.URLField(max_length=500)
library = serializers.URLField(max_length=500)
url = LinkSerializer(allowed_mimetypes=["audio/*"])
published = serializers.DateTimeField()
updated = serializers.DateTimeField(required=False, allow_null=True)
bitrate = serializers.IntegerField(min_value=0)
size = serializers.IntegerField(min_value=0)
duration = serializers.IntegerField(min_value=0)

track = TrackSerializer(required=True)

def validate_url(self, v):
try:

@@ -704,57 +794,70 @@ class AudioSerializer(serializers.Serializer):

return v

def validate_library(self, v):
lb = self.context.get("library")
if lb:
if lb.fid != v:
raise serializers.ValidationError("Invalid library")
return lb

actor = self.context.get("actor")
kwargs = {}
if actor:
kwargs["actor"] = actor
try:
return music_models.Library.objects.get(fid=v, **kwargs)
except music_models.Library.DoesNotExist:
raise serializers.ValidationError("Invalid library")

def create(self, validated_data):
defaults = {
"audio_mimetype": validated_data["url"]["mediaType"],
"audio_url": validated_data["url"]["href"],
"metadata": validated_data["metadata"],
"artist_name": validated_data["metadata"]["artist"]["name"],
"album_title": validated_data["metadata"]["release"]["title"],
"title": validated_data["metadata"]["recording"]["title"],
"published_date": validated_data["published"],
try:
return music_models.Upload.objects.get(fid=validated_data["id"])
except music_models.Upload.DoesNotExist:
pass

track = TrackSerializer(
context={"activity": self.context.get("activity")}
).create(validated_data["track"])

data = {
"fid": validated_data["id"],
"mimetype": validated_data["url"]["mediaType"],
"source": validated_data["url"]["href"],
"creation_date": validated_data["published"],
"modification_date": validated_data.get("updated"),
"track": track,
"duration": validated_data["duration"],
"size": validated_data["size"],
"bitrate": validated_data["bitrate"],
"library": validated_data["library"],
"from_activity": self.context.get("activity"),
"import_status": "finished",
}
return models.LibraryTrack.objects.get_or_create(
library=self.context["library"], url=validated_data["id"], defaults=defaults
)[0]
return music_models.Upload.objects.create(**data)

def to_representation(self, instance):
track = instance.track
album = instance.track.album
artist = instance.track.artist

d = {
"type": "Audio",
"id": instance.get_federation_url(),
"name": instance.track.full_name,
"id": instance.get_federation_id(),
"library": instance.library.fid,
"name": track.full_name,
"published": instance.creation_date.isoformat(),
"updated": instance.modification_date.isoformat(),
"metadata": {
"artist": {
"musicbrainz_id": str(artist.mbid) if artist.mbid else None,
"name": artist.name,
},
"release": {
"musicbrainz_id": str(album.mbid) if album.mbid else None,
"title": album.title,
},
"recording": {
"musicbrainz_id": str(track.mbid) if track.mbid else None,
"title": track.title,
},
"bitrate": instance.bitrate,
"size": instance.size,
"length": instance.duration,
},
"bitrate": instance.bitrate,
"size": instance.size,
"duration": instance.duration,
"url": {
"href": utils.full_url(instance.path),
"href": utils.full_url(instance.listen_url),
"type": "Link",
"mediaType": instance.mimetype,
},
"attributedTo": [self.context["actor"].url],
"track": TrackSerializer(track, context={"include_ap_context": False}).data,
}
if self.context.get("include_ap_context", True):
if instance.modification_date:
d["updated"] = instance.modification_date.isoformat()

if self.context.get("include_ap_context", self.parent is None):
d["@context"] = AP_CONTEXT
return d

@@ -763,7 +866,7 @@ class CollectionSerializer(serializers.Serializer):
def to_representation(self, conf):
d = {
"id": conf["id"],
"actor": conf["actor"].url,
"actor": conf["actor"].fid,
"totalItems": len(conf["items"]),
"type": "Collection",
"items": [

@@ -777,27 +880,3 @@ class CollectionSerializer(serializers.Serializer):
if self.context.get("include_ap_context", True):
d["@context"] = AP_CONTEXT
return d


class LibraryTrackActionSerializer(common_serializers.ActionSerializer):
actions = [common_serializers.Action("import", allow_all=True)]
filterset_class = filters.LibraryTrackFilter

@transaction.atomic
def handle_import(self, objects):
batch = music_models.ImportBatch.objects.create(
source="federation", submitted_by=self.context["submitted_by"]
)
jobs = []
for lt in objects:
job = music_models.ImportJob(
batch=batch, library_track=lt, mbid=lt.mbid, source=lt.url
)
jobs.append(job)

music_models.ImportJob.objects.bulk_create(jobs)
funkwhale_utils.on_commit(
music_tasks.import_batch_run.delay, import_batch_id=batch.pk
)

return {"batch": {"id": batch.pk}}

@@ -1,92 +1,24 @@
import datetime
import json
import logging
import os

from django.conf import settings
from django.db.models import Q
from django.db.models import Q, F
from django.utils import timezone
from dynamic_preferences.registries import global_preferences_registry
from requests.exceptions import RequestException

from funkwhale_api.common import preferences
from funkwhale_api.common import session
from funkwhale_api.music import models as music_models
from funkwhale_api.taskapp import celery

from . import actors
from . import library as lb
from . import models, signing
from . import routes

logger = logging.getLogger(__name__)


@celery.app.task(
name="federation.send",
autoretry_for=[RequestException],
retry_backoff=30,
max_retries=5,
)
@celery.require_instance(models.Actor, "actor")
def send(activity, actor, to):
logger.info("Preparing activity delivery to %s", to)
auth = signing.get_auth(actor.private_key, actor.private_key_id)
for url in to:
recipient_actor = actors.get_actor(url)
logger.debug("delivering to %s", recipient_actor.inbox_url)
logger.debug("activity content: %s", json.dumps(activity))
response = session.get_session().post(
auth=auth,
json=activity,
url=recipient_actor.inbox_url,
timeout=5,
verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
headers={"Content-Type": "application/activity+json"},
)
response.raise_for_status()
logger.debug("Remote answered with %s", response.status_code)


@celery.app.task(
name="federation.scan_library",
autoretry_for=[RequestException],
retry_backoff=30,
max_retries=5,
)
@celery.require_instance(models.Library, "library")
def scan_library(library, until=None):
if not library.federation_enabled:
return

data = lb.get_library_data(library.url)
scan_library_page.delay(library_id=library.id, page_url=data["first"], until=until)
library.fetched_date = timezone.now()
library.tracks_count = data["totalItems"]
library.save(update_fields=["fetched_date", "tracks_count"])


@celery.app.task(
name="federation.scan_library_page",
autoretry_for=[RequestException],
retry_backoff=30,
max_retries=5,
)
@celery.require_instance(models.Library, "library")
def scan_library_page(library, page_url, until=None):
if not library.federation_enabled:
return

data = lb.get_library_page(library, page_url)
lts = []
for item_serializer in data["items"]:
item_date = item_serializer.validated_data["published"]
if until and item_date < until:
return
lts.append(item_serializer.save())

next_page = data.get("next")
if next_page and next_page != page_url:
scan_library_page.delay(library_id=library.id, page_url=next_page)


@celery.app.task(name="federation.clean_music_cache")
def clean_music_cache():
preferences = global_preferences_registry.manager()

@@ -96,23 +28,23 @@ def clean_music_cache():
limit = timezone.now() - datetime.timedelta(minutes=delay)

candidates = (
models.LibraryTrack.objects.filter(
music_models.Upload.objects.filter(
Q(audio_file__isnull=False)
& (
Q(local_track_file__accessed_date__lt=limit)
| Q(local_track_file__accessed_date=None)
)
& (Q(accessed_date__lt=limit) | Q(accessed_date=None)),
# library__actor__user=None,
)
.local(False)
.exclude(audio_file="")
.only("audio_file", "id")
.order_by("id")
)
for lt in candidates:
lt.audio_file.delete()
for upload in candidates:
upload.audio_file.delete()

# we also delete orphaned files, if any
storage = models.LibraryTrack._meta.get_field("audio_file").storage
files = get_files(storage, "federation_cache")
existing = models.LibraryTrack.objects.filter(audio_file__in=files)
files = get_files(storage, "federation_cache/tracks")
existing = music_models.Upload.objects.filter(audio_file__in=files)
missing = set(files) - set(existing.values_list("audio_file", flat=True))
for m in missing:
storage.delete(m)

@ -125,8 +57,93 @@ def get_files(storage, *parts):
|
|||
"""
|
||||
if not parts:
|
||||
raise ValueError("Missing path")
|
||||
|
||||
dirs, files = storage.listdir(os.path.join(*parts))
|
||||
try:
|
||||
dirs, files = storage.listdir(os.path.join(*parts))
|
||||
except FileNotFoundError:
|
||||
return []
|
||||
for dir in dirs:
|
||||
files += get_files(storage, *(list(parts) + [dir]))
|
||||
return [os.path.join(parts[-1], path) for path in files]
|
||||
|
||||
|
||||
@celery.app.task(name="federation.dispatch_inbox")
|
||||
@celery.require_instance(models.Activity.objects.select_related(), "activity")
|
||||
def dispatch_inbox(activity):
|
||||
"""
|
||||
Given an activity instance, triggers our internal delivery logic (follow
|
||||
creation, etc.)
|
||||
"""
|
||||
|
||||
routes.inbox.dispatch(
|
||||
activity.payload,
|
||||
context={
|
||||
"activity": activity,
|
||||
"actor": activity.actor,
|
||||
"inbox_items": activity.inbox_items.filter(is_read=False),
|
||||
},
|
||||
)
|
||||
|
||||
|
||||
@celery.app.task(name="federation.dispatch_outbox")
|
||||
@celery.require_instance(models.Activity.objects.select_related(), "activity")
|
||||
def dispatch_outbox(activity):
|
||||
"""
|
||||
Deliver a local activity to its recipients, both locally and remotely
|
||||
"""
|
||||
inbox_items = activity.inbox_items.filter(is_read=False).select_related()
|
||||
|
||||
if inbox_items.exists():
|
||||
dispatch_inbox.delay(activity_id=activity.pk)
|
||||
|
||||
if not preferences.get("federation__enabled"):
|
||||
# federation is disabled, we only deliver to local recipients
|
||||
return
|
||||
|
||||
deliveries = activity.deliveries.filter(is_delivered=False)
|
||||
|
||||
for id in deliveries.values_list("pk", flat=True):
|
||||
deliver_to_remote.delay(delivery_id=id)
|
||||
|
||||
|
||||
@celery.app.task(
|
||||
name="federation.deliver_to_remote_inbox",
|
||||
autoretry_for=[RequestException],
|
||||
retry_backoff=30,
|
||||
max_retries=5,
|
||||
)
|
||||
@celery.require_instance(
|
||||
models.Delivery.objects.filter(is_delivered=False).select_related(
|
||||
"activity__actor"
|
||||
),
|
||||
"delivery",
|
||||
)
|
||||
def deliver_to_remote(delivery):
|
||||
|
||||
if not preferences.get("federation__enabled"):
|
||||
# federation is disabled, we only deliver to local recipients
|
||||
return
|
||||
|
||||
actor = delivery.activity.actor
|
||||
logger.info("Preparing activity delivery to %s", delivery.inbox_url)
|
||||
auth = signing.get_auth(actor.private_key, actor.private_key_id)
|
||||
try:
|
||||
response = session.get_session().post(
|
||||
auth=auth,
|
||||
json=delivery.activity.payload,
|
||||
url=delivery.inbox_url,
|
||||
timeout=5,
|
||||
verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
|
||||
headers={"Content-Type": "application/activity+json"},
|
||||
)
|
||||
logger.debug("Remote answered with %s", response.status_code)
|
||||
response.raise_for_status()
|
||||
except Exception:
|
||||
delivery.last_attempt_date = timezone.now()
|
||||
delivery.attempts = F("attempts") + 1
|
||||
delivery.save(update_fields=["last_attempt_date", "attempts"])
|
||||
raise
|
||||
else:
|
||||
delivery.last_attempt_date = timezone.now()
|
||||
delivery.attempts = F("attempts") + 1
|
||||
delivery.is_delivered = True
|
||||
delivery.save(update_fields=["last_attempt_date", "attempts", "is_delivered"])
|
||||
|
|
|
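The recursive `get_files` helper in the diff above only depends on a storage object exposing `listdir()`, so its path composition can be checked without Django. A minimal sketch, assuming a hypothetical `FakeStorage` stand-in (not part of Funkwhale) whose tree maps a directory path to a `(subdirs, files)` pair:

```python
import os


class FakeStorage:
    """Hypothetical in-memory stand-in for a Django storage backend."""

    def __init__(self, tree):
        # tree maps a directory path to a (subdirs, files) pair
        self.tree = tree

    def listdir(self, path):
        try:
            dirs, files = self.tree[path]
        except KeyError:
            raise FileNotFoundError(path)
        # return fresh lists, as a real storage backend would
        return list(dirs), list(files)


def get_files(storage, *parts):
    """Recursively list files below a path, as in the task module above."""
    if not parts:
        raise ValueError("Missing path")
    try:
        dirs, files = storage.listdir(os.path.join(*parts))
    except FileNotFoundError:
        return []
    for dir in dirs:
        files += get_files(storage, *(list(parts) + [dir]))
    return [os.path.join(parts[-1], path) for path in files]


storage = FakeStorage(
    {
        "federation_cache/tracks": (["ab"], ["x.mp3"]),
        os.path.join("federation_cache/tracks", "ab"): ([], ["y.mp3"]),
    }
)
```

With this tree, `get_files(storage, "federation_cache/tracks")` yields the top-level file plus the nested one, each prefixed with the last path component passed in, and a missing path yields an empty list instead of raising.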
@@ -5,13 +5,16 @@ from . import views

router = routers.SimpleRouter(trailing_slash=False)
+music_router = routers.SimpleRouter(trailing_slash=False)
-router.register(
-    r"federation/instance/actors", views.InstanceActorViewSet, "instance-actors"
-)
-
+router.register(r"federation/shared", views.SharedViewSet, "shared")
router.register(r"federation/actors", views.ActorViewSet, "actors")
router.register(r".well-known", views.WellKnownViewSet, "well-known")

-music_router.register(r"files", views.MusicFilesViewSet, "files")
+music_router.register(r"libraries", views.MusicLibraryViewSet, "libraries")
+music_router.register(r"uploads", views.MusicUploadViewSet, "uploads")
music_router.register(r"artists", views.MusicArtistViewSet, "artists")
music_router.register(r"albums", views.MusicAlbumViewSet, "albums")
music_router.register(r"tracks", views.MusicTrackViewSet, "tracks")
urlpatterns = router.urls + [
    url("federation/music/", include((music_router.urls, "music"), namespace="music"))
]
@@ -1,5 +1,11 @@
+import unicodedata
+import re
+
from django.conf import settings

from funkwhale_api.common import session

+from . import signing
+

def full_url(path):
    """

@@ -32,3 +38,53 @@ def clean_wsgi_headers(raw_headers):
        cleaned[cleaned_header] = value

    return cleaned
+
+
+def slugify_username(username):
+    """
+    Given a username such as "hello M. world", returns a username
+    suitable for federation purpose (hello_M_world).
+
+    Preserves the original case.
+
+    Code is borrowed from django's slugify function.
+    """
+
+    value = str(username)
+    value = (
+        unicodedata.normalize("NFKD", value).encode("ascii", "ignore").decode("ascii")
+    )
+    value = re.sub(r"[^\w\s-]", "", value).strip()
+    return re.sub(r"[-\s]+", "_", value)
+
+
+def retrieve(fid, actor=None, serializer_class=None, queryset=None):
+    if queryset:
+        try:
+            # queryset can also be a Model class
+            existing = queryset.filter(fid=fid).first()
+        except AttributeError:
+            existing = queryset.objects.filter(fid=fid).first()
+        if existing:
+            return existing
+
+    auth = (
+        None if not actor else signing.get_auth(actor.private_key, actor.private_key_id)
+    )
+    response = session.get_session().get(
+        fid,
+        auth=auth,
+        timeout=5,
+        verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
+        headers={
+            "Accept": "application/activity+json",
+            "Content-Type": "application/activity+json",
+        },
+    )
+    response.raise_for_status()
+    data = response.json()
+    if not serializer_class:
+        return data
+    serializer = serializer_class(data=data)
+    serializer.is_valid(raise_exception=True)
+    return serializer.save()
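The `slugify_username` helper added above is self-contained apart from two stdlib imports, so its documented behaviour is easy to verify standalone. Reproducing it verbatim from the diff:

```python
import re
import unicodedata


def slugify_username(username):
    """Return a federation-safe username, preserving case (as in the diff above)."""
    value = str(username)
    # NFKD-normalize, then drop anything outside ASCII (accents become plain letters)
    value = (
        unicodedata.normalize("NFKD", value).encode("ascii", "ignore").decode("ascii")
    )
    # strip punctuation, then collapse whitespace/hyphens into underscores
    value = re.sub(r"[^\w\s-]", "", value).strip()
    return re.sub(r"[-\s]+", "_", value)


print(slugify_username("hello M. world"))  # hello_M_world
```

Accented input also degrades gracefully: "Éloïse dupont" becomes "Eloise_dupont".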
@@ -1,28 +1,14 @@
from django import forms
from django.core import paginator
-from django.db import transaction
-from django.http import HttpResponse, Http404
+from django.http import HttpResponse
from django.urls import reverse
-from rest_framework import mixins, response, viewsets
+from rest_framework import exceptions, mixins, response, viewsets
from rest_framework.decorators import detail_route, list_route

from funkwhale_api.common import preferences
from funkwhale_api.music import models as music_models
-from funkwhale_api.users.permissions import HasUserPermission

-from . import (
-    actors,
-    authentication,
-    filters,
-    library,
-    models,
-    permissions,
-    renderers,
-    serializers,
-    tasks,
-    utils,
-    webfinger,
-)
+from . import activity, authentication, models, renderers, serializers, utils, webfinger


class FederationMixin(object):

@@ -32,9 +18,24 @@ class FederationMixin(object):
        return super().dispatch(request, *args, **kwargs)


+class SharedViewSet(FederationMixin, viewsets.GenericViewSet):
+    permission_classes = []
+    authentication_classes = [authentication.SignatureAuthentication]
+    renderer_classes = [renderers.ActivityPubRenderer]
+
+    @list_route(methods=["post"])
+    def inbox(self, request, *args, **kwargs):
+        if request.method.lower() == "post" and request.actor is None:
+            raise exceptions.AuthenticationFailed(
+                "You need a valid signature to send an activity"
+            )
+        if request.method.lower() == "post":
+            activity.receive(activity=request.data, on_behalf_of=request.actor)
+        return response.Response({}, status=200)
+
+
class ActorViewSet(FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet):
-    lookup_field = "user__username"
-    lookup_value_regex = ".*"
+    lookup_field = "preferred_username"
    authentication_classes = [authentication.SignatureAuthentication]
    permission_classes = []
    renderer_classes = [renderers.ActivityPubRenderer]

@@ -43,52 +44,29 @@ class ActorViewSet(FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericV

    @detail_route(methods=["get", "post"])
    def inbox(self, request, *args, **kwargs):
+        if request.method.lower() == "post" and request.actor is None:
+            raise exceptions.AuthenticationFailed(
+                "You need a valid signature to send an activity"
+            )
        if request.method.lower() == "post":
+            activity.receive(activity=request.data, on_behalf_of=request.actor)
        return response.Response({}, status=200)

    @detail_route(methods=["get", "post"])
    def outbox(self, request, *args, **kwargs):
        return response.Response({}, status=200)

+    @detail_route(methods=["get"])
+    def followers(self, request, *args, **kwargs):
+        self.get_object()
+        # XXX to implement
+        return response.Response({})
+
-class InstanceActorViewSet(FederationMixin, viewsets.GenericViewSet):
-    lookup_field = "actor"
-    lookup_value_regex = "[a-z]*"
-    authentication_classes = [authentication.SignatureAuthentication]
-    permission_classes = []
-    renderer_classes = [renderers.ActivityPubRenderer]
-
-    def get_object(self):
-        try:
-            return actors.SYSTEM_ACTORS[self.kwargs["actor"]]
-        except KeyError:
-            raise Http404
-
-    def retrieve(self, request, *args, **kwargs):
-        system_actor = self.get_object()
-        actor = system_actor.get_actor_instance()
-        data = actor.system_conf.serialize()
-        return response.Response(data, status=200)
-
-    @detail_route(methods=["get", "post"])
-    def inbox(self, request, *args, **kwargs):
-        system_actor = self.get_object()
-        handler = getattr(system_actor, "{}_inbox".format(request.method.lower()))
-
-        try:
-            handler(request.data, actor=request.actor)
-        except NotImplementedError:
-            return response.Response(status=405)
-        return response.Response({}, status=200)
-
-    @detail_route(methods=["get", "post"])
-    def outbox(self, request, *args, **kwargs):
-        system_actor = self.get_object()
-        handler = getattr(system_actor, "{}_outbox".format(request.method.lower()))
-        try:
-            handler(request.data, actor=request.actor)
-        except NotImplementedError:
-            return response.Response(status=405)
-        return response.Response({}, status=200)
+    @detail_route(methods=["get"])
+    def following(self, request, *args, **kwargs):
+        self.get_object()
+        # XXX to implement
+        return response.Response({})


class WellKnownViewSet(viewsets.GenericViewSet):

@@ -132,56 +110,69 @@ class WellKnownViewSet(viewsets.GenericViewSet):
    def handler_acct(self, clean_result):
        username, hostname = clean_result

-        if username in actors.SYSTEM_ACTORS:
-            actor = actors.SYSTEM_ACTORS[username].get_actor_instance()
-        else:
-            try:
-                actor = models.Actor.objects.local().get(user__username=username)
-            except models.Actor.DoesNotExist:
-                raise forms.ValidationError("Invalid username")
+        try:
+            actor = models.Actor.objects.local().get(preferred_username=username)
+        except models.Actor.DoesNotExist:
+            raise forms.ValidationError("Invalid username")

        return serializers.ActorWebfingerSerializer(actor).data


-class MusicFilesViewSet(FederationMixin, viewsets.GenericViewSet):
-    authentication_classes = [authentication.SignatureAuthentication]
-    permission_classes = [permissions.LibraryFollower]
-    renderer_classes = [renderers.ActivityPubRenderer]
+def has_library_access(request, library):
+    if library.privacy_level == "everyone":
+        return True
+    if request.user.is_authenticated and request.user.is_superuser:
+        return True

-    def list(self, request, *args, **kwargs):
+    try:
+        actor = request.actor
+    except AttributeError:
+        return False
+
+    return library.received_follows.filter(actor=actor, approved=True).exists()
+
+
+class MusicLibraryViewSet(
+    FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
+):
+    authentication_classes = [authentication.SignatureAuthentication]
+    permission_classes = []
+    renderer_classes = [renderers.ActivityPubRenderer]
+    serializer_class = serializers.LibrarySerializer
+    queryset = music_models.Library.objects.all().select_related("actor")
+    lookup_field = "uuid"
+
+    def retrieve(self, request, *args, **kwargs):
+        lb = self.get_object()
+
+        conf = {
+            "id": lb.get_federation_id(),
+            "actor": lb.actor,
+            "name": lb.name,
+            "summary": lb.description,
+            "items": lb.uploads.for_federation().order_by("-creation_date"),
+            "item_serializer": serializers.UploadSerializer,
+        }
        page = request.GET.get("page")
-        library = actors.SYSTEM_ACTORS["library"].get_actor_instance()
-        qs = (
-            music_models.TrackFile.objects.order_by("-creation_date")
-            .select_related("track__artist", "track__album__artist")
-            .filter(library_track__isnull=True)
-        )
        if page is None:
-            conf = {
-                "id": utils.full_url(reverse("federation:music:files-list")),
-                "page_size": preferences.get("federation__collection_page_size"),
-                "items": qs,
-                "item_serializer": serializers.AudioSerializer,
-                "actor": library,
-            }
-            serializer = serializers.PaginatedCollectionSerializer(conf)
+            serializer = serializers.LibrarySerializer(lb)
            data = serializer.data
        else:
+            # if actor is requesting a specific page, we ensure library is public
+            # or readable by the actor
+            if not has_library_access(request, lb):
+                raise exceptions.AuthenticationFailed(
+                    "You do not have access to this library"
+                )
            try:
                page_number = int(page)
            except Exception:
                return response.Response({"page": ["Invalid page number"]}, status=400)
-            p = paginator.Paginator(
-                qs, preferences.get("federation__collection_page_size")
-            )
+            conf["page_size"] = preferences.get("federation__collection_page_size")
+            p = paginator.Paginator(conf["items"], conf["page_size"])
            try:
                page = p.page(page_number)
-                conf = {
-                    "id": utils.full_url(reverse("federation:music:files-list")),
-                    "page": page,
-                    "item_serializer": serializers.AudioSerializer,
-                    "actor": library,
-                }
+                conf["page"] = page
                serializer = serializers.CollectionPageSerializer(conf)
                data = serializer.data
            except paginator.EmptyPage:

@@ -189,115 +180,48 @@ class MusicFilesViewSet(FederationMixin, viewsets.GenericViewSet):

        return response.Response(data)


-class LibraryViewSet(
-    mixins.RetrieveModelMixin,
-    mixins.UpdateModelMixin,
-    mixins.ListModelMixin,
-    viewsets.GenericViewSet,
-):
-    permission_classes = (HasUserPermission,)
-    required_permissions = ["federation"]
-    queryset = models.Library.objects.all().select_related("actor", "follow")
-    lookup_field = "uuid"
-    filter_class = filters.LibraryFilter
-    serializer_class = serializers.APILibrarySerializer
-    ordering_fields = (
-        "id",
-        "creation_date",
-        "fetched_date",
-        "actor__domain",
-        "tracks_count",
-    )
-
-    @list_route(methods=["get"])
-    def fetch(self, request, *args, **kwargs):
-        account = request.GET.get("account")
-        if not account:
-            return response.Response({"account": "This field is mandatory"}, status=400)
-
-        data = library.scan_from_account_name(account)
-        return response.Response(data)
-
-    @detail_route(methods=["post"])
-    def scan(self, request, *args, **kwargs):
-        library = self.get_object()
-        serializer = serializers.APILibraryScanSerializer(data=request.data)
-        serializer.is_valid(raise_exception=True)
-        result = tasks.scan_library.delay(
-            library_id=library.pk, until=serializer.validated_data.get("until")
-        )
-        return response.Response({"task": result.id})
-
-    @list_route(methods=["get"])
-    def following(self, request, *args, **kwargs):
-        library_actor = actors.SYSTEM_ACTORS["library"].get_actor_instance()
-        queryset = (
-            models.Follow.objects.filter(actor=library_actor)
-            .select_related("actor", "target")
-            .order_by("-creation_date")
-        )
-        filterset = filters.FollowFilter(request.GET, queryset=queryset)
-        final_qs = filterset.qs
-        serializer = serializers.APIFollowSerializer(final_qs, many=True)
-        data = {"results": serializer.data, "count": len(final_qs)}
-        return response.Response(data)
-
-    @list_route(methods=["get", "patch"])
+    @detail_route(methods=["get"])
    def followers(self, request, *args, **kwargs):
-        if request.method.lower() == "patch":
-            serializer = serializers.APILibraryFollowUpdateSerializer(data=request.data)
-            serializer.is_valid(raise_exception=True)
-            follow = serializer.save()
-            return response.Response(serializers.APIFollowSerializer(follow).data)
-
-        library_actor = actors.SYSTEM_ACTORS["library"].get_actor_instance()
-        queryset = (
-            models.Follow.objects.filter(target=library_actor)
-            .select_related("actor", "target")
-            .order_by("-creation_date")
-        )
-        filterset = filters.FollowFilter(request.GET, queryset=queryset)
-        final_qs = filterset.qs
-        serializer = serializers.APIFollowSerializer(final_qs, many=True)
-        data = {"results": serializer.data, "count": len(final_qs)}
-        return response.Response(data)
-
-    @transaction.atomic
-    def create(self, request, *args, **kwargs):
-        serializer = serializers.APILibraryCreateSerializer(data=request.data)
-        serializer.is_valid(raise_exception=True)
-        serializer.save()
-        return response.Response(serializer.data, status=201)
+        self.get_object()
+        # XXX Implement this
+        return response.Response({})


-class LibraryTrackViewSet(mixins.ListModelMixin, viewsets.GenericViewSet):
-    permission_classes = (HasUserPermission,)
-    required_permissions = ["federation"]
-    queryset = (
-        models.LibraryTrack.objects.all()
-        .select_related("library__actor", "library__follow", "local_track_file")
-        .prefetch_related("import_jobs")
-    )
-    filter_class = filters.LibraryTrackFilter
-    serializer_class = serializers.APILibraryTrackSerializer
-    ordering_fields = (
-        "id",
-        "artist_name",
-        "title",
-        "album_title",
-        "creation_date",
-        "modification_date",
-        "fetched_date",
-        "published_date",
-    )
+class MusicUploadViewSet(
+    FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
+):
+    authentication_classes = [authentication.SignatureAuthentication]
+    permission_classes = []
+    renderer_classes = [renderers.ActivityPubRenderer]
+    queryset = music_models.Upload.objects.none()
+    lookup_field = "uuid"

-    @list_route(methods=["post"])
-    def action(self, request, *args, **kwargs):
-        queryset = models.LibraryTrack.objects.filter(local_track_file__isnull=True)
-        serializer = serializers.LibraryTrackActionSerializer(
-            request.data, queryset=queryset, context={"submitted_by": request.user}
-        )
-        serializer.is_valid(raise_exception=True)
-        result = serializer.save()
-        return response.Response(result, status=200)

+class MusicArtistViewSet(
+    FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
+):
+    authentication_classes = [authentication.SignatureAuthentication]
+    permission_classes = []
+    renderer_classes = [renderers.ActivityPubRenderer]
+    queryset = music_models.Artist.objects.none()
+    lookup_field = "uuid"
+
+
+class MusicAlbumViewSet(
+    FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
+):
+    authentication_classes = [authentication.SignatureAuthentication]
+    permission_classes = []
+    renderer_classes = [renderers.ActivityPubRenderer]
+    queryset = music_models.Album.objects.none()
+    lookup_field = "uuid"
+
+
+class MusicTrackViewSet(
+    FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
+):
+    authentication_classes = [authentication.SignatureAuthentication]
+    permission_classes = []
+    renderer_classes = [renderers.ActivityPubRenderer]
+    queryset = music_models.Track.objects.none()
+    lookup_field = "uuid"
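The `has_library_access` helper introduced above applies three checks in order: public library, superuser, then an approved follow from the requesting actor. A pure-Python sketch of that decision order, using hypothetical `FakeLibrary`/`FakeUser` stand-ins rather than the real Django request and models:

```python
from dataclasses import dataclass, field


@dataclass
class FakeLibrary:
    """Hypothetical stand-in for music_models.Library."""

    privacy_level: str = "me"
    approved_follower_ids: set = field(default_factory=set)


@dataclass
class FakeUser:
    is_authenticated: bool = False
    is_superuser: bool = False


def has_library_access(user, actor_id, library):
    # Mirrors the decision order of the view helper above:
    # public library first, then superuser, then approved follow.
    if library.privacy_level == "everyone":
        return True
    if user.is_authenticated and user.is_superuser:
        return True
    if actor_id is None:
        # unsigned request: no actor available (AttributeError branch above)
        return False
    return actor_id in library.approved_follower_ids
```

The same ordering matters in the real view: a public library page is served even to unsigned requests, while a private one requires either superuser rights or an approved follow.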
@@ -1,4 +1,4 @@
-from django.contrib import admin
+from funkwhale_api.common import admin

from . import models

@@ -1,9 +1,12 @@
from rest_framework import mixins, viewsets
from rest_framework.permissions import IsAuthenticatedOrReadOnly

+from django.db.models import Prefetch
+
from funkwhale_api.activity import record
from funkwhale_api.common import fields, permissions

+from funkwhale_api.music.models import Track
+from funkwhale_api.music import utils as music_utils
from . import models, serializers

@@ -15,11 +18,7 @@ class ListeningViewSet(
):

    serializer_class = serializers.ListeningSerializer
-    queryset = (
-        models.Listening.objects.all()
-        .select_related("track__artist", "track__album__artist", "user")
-        .prefetch_related("track__files")
-    )
+    queryset = models.Listening.objects.all().select_related("user")
    permission_classes = [
        permissions.ConditionalAuthentication,
        permissions.OwnerPermission,

@@ -39,9 +38,13 @@ class ListeningViewSet(

    def get_queryset(self):
        queryset = super().get_queryset()
-        return queryset.filter(
+        queryset = queryset.filter(
            fields.privacy_level_query(self.request.user, "user__privacy_level")
        )
+        tracks = Track.objects.annotate_playable_by_actor(
+            music_utils.get_actor_from_request(self.request)
+        ).select_related("artist", "album__artist")
+        return queryset.prefetch_related(Prefetch("track", queryset=tracks))

    def get_serializer_context(self):
        context = super().get_serializer_context()
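The `get_queryset` change above first restricts listenings by the owner's privacy level, then prefetches only tracks playable by the requesting actor. A rough sketch of the visibility rule the privacy filter expresses; this is an assumption about `fields.privacy_level_query`'s semantics, not its actual implementation, and the dict-based viewer/owner records are illustrative only:

```python
def listening_is_visible(viewer, owner):
    """
    Assumed privacy rule: a listening is visible when it is public, when the
    viewer is its owner, or when it is instance-wide and the viewer is
    logged in (viewer is None for anonymous requests).
    """
    if owner["privacy_level"] == "everyone":
        return True
    if viewer is not None and viewer["username"] == owner["username"]:
        return True
    if owner["privacy_level"] == "instance" and viewer is not None:
        return True
    return False
```

Under this reading, anonymous visitors only ever see "everyone" listenings, while any authenticated user on the instance also sees "instance" ones.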
@@ -43,7 +43,7 @@ def get_artists():


def get_music_duration():
-    seconds = models.TrackFile.objects.aggregate(d=Sum("duration"))["d"]
+    seconds = models.Upload.objects.aggregate(d=Sum("duration"))["d"]
    if seconds:
        return seconds / 3600
    return 0
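The stats helper above only changes which model is aggregated; the arithmetic stays the same: sum per-upload durations in seconds, convert to hours, and fall back to 0 when the aggregate is empty. The same logic without the ORM, where the list argument stands in for the queryset (None entries mimic NULL rows, which `Sum()` ignores):

```python
def get_music_duration(durations):
    """Total duration in hours from per-upload durations in seconds."""
    # an empty or all-None list yields None, like an empty Sum() aggregate
    seconds = sum(d for d in durations if d is not None) or None
    if seconds:
        return seconds / 3600
    return 0
```

For example, two one-hour uploads (3600 seconds each) give 2.0 hours, and an empty library gives 0.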
@@ -2,11 +2,10 @@ from django_filters import rest_framework as filters

from funkwhale_api.common import fields
from funkwhale_api.music import models as music_models
-from funkwhale_api.requests import models as requests_models
from funkwhale_api.users import models as users_models


-class ManageTrackFileFilterSet(filters.FilterSet):
+class ManageUploadFilterSet(filters.FilterSet):
    q = fields.SearchFilter(
        search_fields=[
            "track__title",

@@ -17,8 +16,8 @@ class ManageTrackFileFilterSet(filters.FilterSet):
    )

    class Meta:
-        model = music_models.TrackFile
-        fields = ["q", "track__album", "track__artist", "track", "library_track"]
+        model = music_models.Upload
+        fields = ["q", "track__album", "track__artist", "track"]


class ManageUserFilterSet(filters.FilterSet):

@@ -51,13 +50,3 @@ class ManageInvitationFilterSet(filters.FilterSet):
        if value is None:
            return queryset
        return queryset.open(value)
-
-
-class ManageImportRequestFilterSet(filters.FilterSet):
-    q = fields.SearchFilter(
-        search_fields=["user__username", "albums", "artist_name", "comment"]
-    )
-
-    class Meta:
-        model = requests_models.ImportRequest
-        fields = ["q", "status"]
@@ -1,23 +1,22 @@
from django.db import transaction
from django.utils import timezone

from rest_framework import serializers

from funkwhale_api.common import serializers as common_serializers
from funkwhale_api.music import models as music_models
-from funkwhale_api.requests import models as requests_models
from funkwhale_api.users import models as users_models

from . import filters


-class ManageTrackFileArtistSerializer(serializers.ModelSerializer):
+class ManageUploadArtistSerializer(serializers.ModelSerializer):
    class Meta:
        model = music_models.Artist
        fields = ["id", "mbid", "creation_date", "name"]


-class ManageTrackFileAlbumSerializer(serializers.ModelSerializer):
-    artist = ManageTrackFileArtistSerializer()
+class ManageUploadAlbumSerializer(serializers.ModelSerializer):
+    artist = ManageUploadArtistSerializer()

    class Meta:
        model = music_models.Album

@@ -32,20 +31,20 @@ class ManageTrackFileAlbumSerializer(serializers.ModelSerializer):
    )


-class ManageTrackFileTrackSerializer(serializers.ModelSerializer):
-    artist = ManageTrackFileArtistSerializer()
-    album = ManageTrackFileAlbumSerializer()
+class ManageUploadTrackSerializer(serializers.ModelSerializer):
+    artist = ManageUploadArtistSerializer()
+    album = ManageUploadAlbumSerializer()

    class Meta:
        model = music_models.Track
        fields = ("id", "mbid", "title", "album", "artist", "creation_date", "position")


-class ManageTrackFileSerializer(serializers.ModelSerializer):
-    track = ManageTrackFileTrackSerializer()
+class ManageUploadSerializer(serializers.ModelSerializer):
+    track = ManageUploadTrackSerializer()

    class Meta:
-        model = music_models.TrackFile
+        model = music_models.Upload
        fields = (
            "id",
            "path",

@@ -59,13 +58,12 @@ class ManageTrackFileSerializer(serializers.ModelSerializer):
            "bitrate",
            "size",
            "path",
-            "library_track",
        )


-class ManageTrackFileActionSerializer(common_serializers.ActionSerializer):
+class ManageUploadActionSerializer(common_serializers.ActionSerializer):
    actions = [common_serializers.Action("delete", allow_all=False)]
-    filterset_class = filters.ManageTrackFileFilterSet
+    filterset_class = filters.ManageUploadFilterSet

    @transaction.atomic
    def handle_delete(self, objects):

@@ -94,11 +92,13 @@ class ManageUserSimpleSerializer(serializers.ModelSerializer):
            "date_joined",
            "last_activity",
            "privacy_level",
+            "upload_quota",
        )


class ManageUserSerializer(serializers.ModelSerializer):
    permissions = PermissionsSerializer(source="*")
+    upload_quota = serializers.IntegerField(allow_null=True)

    class Meta:
        model = users_models.User

@@ -114,6 +114,7 @@ class ManageUserSerializer(serializers.ModelSerializer):
            "last_activity",
            "permissions",
            "privacy_level",
+            "upload_quota",
        )
        read_only_fields = [
            "id",

@@ -167,69 +168,3 @@ class ManageInvitationActionSerializer(common_serializers.ActionSerializer):
    @transaction.atomic
    def handle_delete(self, objects):
        return objects.delete()
-
-
-class ManageImportRequestSerializer(serializers.ModelSerializer):
-    user = ManageUserSimpleSerializer(required=False)
-
-    class Meta:
-        model = requests_models.ImportRequest
-        fields = [
-            "id",
-            "status",
-            "creation_date",
-            "imported_date",
-            "user",
-            "albums",
-            "artist_name",
-            "comment",
-        ]
-        read_only_fields = [
-            "id",
-            "status",
-            "creation_date",
-            "imported_date",
-            "user",
-            "albums",
-            "artist_name",
-            "comment",
-        ]
-
-    def validate_code(self, value):
-        if not value:
-            return value
-        if users_models.Invitation.objects.filter(code__iexact=value).exists():
-            raise serializers.ValidationError(
-                "An invitation with this code already exists"
-            )
-        return value
-
-
-class ManageImportRequestActionSerializer(common_serializers.ActionSerializer):
-    actions = [
-        common_serializers.Action(
-            "mark_closed",
-            allow_all=True,
-            qs_filter=lambda qs: qs.filter(status__in=["pending", "accepted"]),
-        ),
-        common_serializers.Action(
-            "mark_imported",
-            allow_all=True,
-            qs_filter=lambda qs: qs.filter(status__in=["pending", "accepted"]),
-        ),
-        common_serializers.Action("delete", allow_all=False),
-    ]
-    filterset_class = filters.ManageImportRequestFilterSet
-
-    @transaction.atomic
-    def handle_delete(self, objects):
-        return objects.delete()
-
-    @transaction.atomic
-    def handle_mark_closed(self, objects):
-        return objects.update(status="closed")
-
-    @transaction.atomic
-    def handle_mark_imported(self, objects):
-        now = timezone.now()
-        return objects.update(status="imported", imported_date=now)
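The `ActionSerializer` subclasses in this file all follow the same pattern: each declared action name maps to a `handle_<name>` method applied to a filtered batch of objects. A pure-Python sketch of that dispatch shape, with illustrative names only (this is not the `funkwhale_api.common` API):

```python
class BulkAction:
    """Sketch of the action-serializer pattern: name -> handle_<name> on a batch."""

    def __init__(self, objects):
        self.objects = list(objects)

    def run(self, action, predicate=lambda obj: True):
        # Dispatch to handle_<action>, restricted to a filtered batch,
        # much like qs_filter restricts the queryset in the Action declarations.
        handler = getattr(self, "handle_{}".format(action))
        batch = [obj for obj in self.objects if predicate(obj)]
        return handler(batch)

    def handle_mark_closed(self, batch):
        for obj in batch:
            obj["status"] = "closed"
        return len(batch)

    def handle_delete(self, batch):
        for obj in batch:
            self.objects.remove(obj)
        return len(batch)


action = BulkAction([{"status": "pending"}, {"status": "accepted"}])
action.run("mark_closed", lambda obj: obj["status"] in ("pending", "accepted"))
```

Each handler returns how many objects it touched, which is roughly what the real serializers report back in the action endpoint's response payload.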
@@ -4,11 +4,7 @@ from rest_framework import routers
from . import views

library_router = routers.SimpleRouter()
-library_router.register(r"track-files", views.ManageTrackFileViewSet, "track-files")
-requests_router = routers.SimpleRouter()
-requests_router.register(
-    r"import-requests", views.ManageImportRequestViewSet, "import-requests"
-)
+library_router.register(r"uploads", views.ManageUploadViewSet, "uploads")
users_router = routers.SimpleRouter()
users_router.register(r"users", views.ManageUserViewSet, "users")
users_router.register(r"invitations", views.ManageInvitationViewSet, "invitations")

@@ -16,7 +12,4 @@ users_router.register(r"invitations", views.ManageInvitationViewSet, "invitation
urlpatterns = [
    url(r"^library/", include((library_router.urls, "instance"), namespace="library")),
    url(r"^users/", include((users_router.urls, "instance"), namespace="users")),
-    url(
-        r"^requests/", include((requests_router.urls, "instance"), namespace="requests")
-    ),
]
@@ -3,23 +3,22 @@ from rest_framework.decorators import list_route

from funkwhale_api.common import preferences
from funkwhale_api.music import models as music_models
from funkwhale_api.requests import models as requests_models
from funkwhale_api.users import models as users_models
from funkwhale_api.users.permissions import HasUserPermission

from . import filters, serializers


class ManageTrackFileViewSet(
class ManageUploadViewSet(
    mixins.ListModelMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
):
    queryset = (
        music_models.TrackFile.objects.all()
        .select_related("track__artist", "track__album__artist", "library_track")
        music_models.Upload.objects.all()
        .select_related("track__artist", "track__album__artist")
        .order_by("-id")
    )
    serializer_class = serializers.ManageTrackFileSerializer
    filter_class = filters.ManageTrackFileFilterSet
    serializer_class = serializers.ManageUploadSerializer
    filter_class = filters.ManageUploadFilterSet
    permission_classes = (HasUserPermission,)
    required_permissions = ["library"]
    ordering_fields = [

@@ -35,7 +34,7 @@ class ManageTrackFileViewSet(
    @list_route(methods=["post"])
    def action(self, request, *args, **kwargs):
        queryset = self.get_queryset()
        serializer = serializers.ManageTrackFileActionSerializer(
        serializer = serializers.ManageUploadActionSerializer(
            request.data, queryset=queryset
        )
        serializer.is_valid(raise_exception=True)

@@ -93,31 +92,3 @@ class ManageInvitationViewSet(
        serializer.is_valid(raise_exception=True)
        result = serializer.save()
        return response.Response(result, status=200)


class ManageImportRequestViewSet(
    mixins.ListModelMixin,
    mixins.RetrieveModelMixin,
    mixins.UpdateModelMixin,
    viewsets.GenericViewSet,
):
    queryset = (
        requests_models.ImportRequest.objects.all()
        .order_by("-id")
        .select_related("user")
    )
    serializer_class = serializers.ManageImportRequestSerializer
    filter_class = filters.ManageImportRequestFilterSet
    permission_classes = (HasUserPermission,)
    required_permissions = ["library"]
    ordering_fields = ["creation_date", "imported_date"]

    @list_route(methods=["post"])
    def action(self, request, *args, **kwargs):
        queryset = self.get_queryset()
        serializer = serializers.ManageImportRequestActionSerializer(
            request.data, queryset=queryset
        )
        serializer.is_valid(raise_exception=True)
        result = serializer.save()
        return response.Response(result, status=200)

@@ -1,4 +1,4 @@
from django.contrib import admin
from funkwhale_api.common import admin

from . import models

@@ -33,8 +33,8 @@ class ImportBatchAdmin(admin.ModelAdmin):

@admin.register(models.ImportJob)
class ImportJobAdmin(admin.ModelAdmin):
    list_display = ["source", "batch", "track_file", "status", "mbid"]
    list_select_related = ["track_file", "batch"]
    list_display = ["source", "batch", "upload", "status", "mbid"]
    list_select_related = ["upload", "batch"]
    search_fields = ["source", "batch__pk", "mbid"]
    list_filter = ["status"]

@@ -55,8 +55,8 @@ class LyricsAdmin(admin.ModelAdmin):
    list_filter = ["work__language"]


@admin.register(models.TrackFile)
class TrackFileAdmin(admin.ModelAdmin):
@admin.register(models.Upload)
class UploadAdmin(admin.ModelAdmin):
    list_display = [
        "track",
        "audio_file",

@@ -65,6 +65,7 @@ class TrackFileAdmin(admin.ModelAdmin):
        "mimetype",
        "size",
        "bitrate",
        "import_status",
    ]
    list_select_related = ["track"]
    search_fields = [

@@ -74,4 +75,40 @@ class TrackFileAdmin(admin.ModelAdmin):
        "track__album__title",
        "track__artist__name",
    ]
    list_filter = ["mimetype"]
    list_filter = ["mimetype", "import_status", "library__privacy_level"]


def launch_scan(modeladmin, request, queryset):
    for library in queryset:
        library.schedule_scan(actor=request.user.actor, force=True)


launch_scan.short_description = "Launch scan"


@admin.register(models.Library)
class LibraryAdmin(admin.ModelAdmin):
    list_display = ["id", "name", "actor", "uuid", "privacy_level", "creation_date"]
    list_select_related = True
    search_fields = ["actor__username", "name", "description"]
    list_filter = ["privacy_level"]
    actions = [launch_scan]


@admin.register(models.LibraryScan)
class LibraryScanAdmin(admin.ModelAdmin):
    list_display = [
        "id",
        "library",
        "actor",
        "status",
        "creation_date",
        "modification_date",
        "status",
        "total_files",
        "processed_files",
        "errored_files",
    ]
    list_select_related = True
    search_fields = ["actor__username", "library__name"]
    list_filter = ["status"]

@@ -3,8 +3,9 @@ import os
import factory

from funkwhale_api.factories import ManyToManyFromList, registry
from funkwhale_api.federation.factories import LibraryTrackFactory
from funkwhale_api.users.factories import UserFactory
from funkwhale_api.federation import factories as federation_factories
from funkwhale_api.users import factories as users_factories


SAMPLES_PATH = os.path.join(
    os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))),

@@ -13,10 +14,28 @@ SAMPLES_PATH = os.path.join(
)


def playable_factory(field):
    @factory.post_generation
    def inner(self, create, extracted, **kwargs):
        if not create:
            return

        if extracted:
            UploadFactory(
                library__privacy_level="everyone",
                import_status="finished",
                **{field: self}
            )

    return inner


@registry.register
class ArtistFactory(factory.django.DjangoModelFactory):
    name = factory.Faker("name")
    mbid = factory.Faker("uuid4")
    fid = factory.Faker("federation_url")
    playable = playable_factory("track__album__artist")

    class Meta:
        model = "music.Artist"

@@ -30,6 +49,8 @@ class AlbumFactory(factory.django.DjangoModelFactory):
    cover = factory.django.ImageField()
    artist = factory.SubFactory(ArtistFactory)
    release_group_id = factory.Faker("uuid4")
    fid = factory.Faker("federation_url")
    playable = playable_factory("track__album")

    class Meta:
        model = "music.Album"

@@ -37,20 +58,24 @@ class AlbumFactory(factory.django.DjangoModelFactory):

@registry.register
class TrackFactory(factory.django.DjangoModelFactory):
    fid = factory.Faker("federation_url")
    title = factory.Faker("sentence", nb_words=3)
    mbid = factory.Faker("uuid4")
    album = factory.SubFactory(AlbumFactory)
    artist = factory.SelfAttribute("album.artist")
    position = 1
    tags = ManyToManyFromList("tags")
    playable = playable_factory("track")

    class Meta:
        model = "music.Track"


@registry.register
class TrackFileFactory(factory.django.DjangoModelFactory):
class UploadFactory(factory.django.DjangoModelFactory):
    fid = factory.Faker("federation_url")
    track = factory.SubFactory(TrackFactory)
    library = factory.SubFactory(federation_factories.MusicLibraryFactory)
    audio_file = factory.django.FileField(
        from_path=os.path.join(SAMPLES_PATH, "test.ogg")
    )

@@ -58,69 +83,18 @@ class TrackFileFactory(factory.django.DjangoModelFactory):
    bitrate = None
    size = None
    duration = None
    mimetype = "audio/ogg"

    class Meta:
        model = "music.TrackFile"
        model = "music.Upload"

    class Params:
        in_place = factory.Trait(audio_file=None)
        federation = factory.Trait(
            audio_file=None,
            library_track=factory.SubFactory(LibraryTrackFactory),
            mimetype=factory.LazyAttribute(lambda o: o.library_track.audio_mimetype),
            source=factory.LazyAttribute(lambda o: o.library_track.audio_url),
        playable = factory.Trait(
            import_status="finished", library__privacy_level="everyone"
        )


@registry.register
class ImportBatchFactory(factory.django.DjangoModelFactory):
    submitted_by = factory.SubFactory(UserFactory)

    class Meta:
        model = "music.ImportBatch"

    class Params:
        federation = factory.Trait(submitted_by=None, source="federation")
        finished = factory.Trait(status="finished")


@registry.register
class ImportJobFactory(factory.django.DjangoModelFactory):
    batch = factory.SubFactory(ImportBatchFactory)
    source = factory.Faker("url")
    mbid = factory.Faker("uuid4")
    replace_if_duplicate = False

    class Meta:
        model = "music.ImportJob"

    class Params:
        federation = factory.Trait(
            mbid=None,
            library_track=factory.SubFactory(LibraryTrackFactory),
            batch=factory.SubFactory(ImportBatchFactory, federation=True),
        )
        finished = factory.Trait(
            status="finished", track_file=factory.SubFactory(TrackFileFactory)
        )
        in_place = factory.Trait(status="finished", audio_file=None)
        with_audio_file = factory.Trait(
            status="finished",
            audio_file=factory.django.FileField(
                from_path=os.path.join(SAMPLES_PATH, "test.ogg")
            ),
        )


@registry.register(name="music.FileImportJob")
class FileImportJobFactory(ImportJobFactory):
    source = "file://"
    mbid = None
    audio_file = factory.django.FileField(
        from_path=os.path.join(SAMPLES_PATH, "test.ogg")
    )


@registry.register
class WorkFactory(factory.django.DjangoModelFactory):
    mbid = factory.Faker("uuid4")

@@ -149,3 +123,24 @@ class TagFactory(factory.django.DjangoModelFactory):

    class Meta:
        model = "taggit.Tag"


# XXX To remove


class ImportBatchFactory(factory.django.DjangoModelFactory):
    submitted_by = factory.SubFactory(users_factories.UserFactory)

    class Meta:
        model = "music.ImportBatch"


@registry.register
class ImportJobFactory(factory.django.DjangoModelFactory):
    batch = factory.SubFactory(ImportBatchFactory)
    source = factory.Faker("url")
    mbid = factory.Faker("uuid4")
    replace_if_duplicate = False

    class Meta:
        model = "music.ImportJob"

@@ -14,7 +14,7 @@ def create_data(count=25):
            artist=artist, size=random.randint(1, 5)
        )
        for album in albums:
            factories.TrackFileFactory.create_batch(
            factories.UploadFactory.create_batch(
                track__album=album, size=random.randint(3, 18)
            )

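The `playable_factory` helper above builds one post-generation hook per model, parameterized by the lookup path that points back at the generated object. The closure trick can be sketched without factory_boy or a database; the names below are illustrative, with a plain list standing in for the uploads table:

```python
def make_playable_hook(field, registry):
    """Return a hook that records an 'upload' pointing back at the
    generated object through the given lookup path (a sketch of the
    playable_factory closure; `registry` stands in for the database)."""

    def hook(obj, extracted):
        # Mirrors the `if extracted:` guard in playable_factory:
        # only create the related upload when explicitly requested.
        if not extracted:
            return
        registry.append(
            {
                "library__privacy_level": "everyone",
                "import_status": "finished",
                field: obj,  # e.g. track__album__artist -> the artist
            }
        )

    return hook


uploads = []
artist_hook = make_playable_hook("track__album__artist", uploads)
artist_hook({"name": "Nina Simone"}, extracted=True)
artist_hook({"name": "Skipped"}, extracted=False)
```

Each factory (artist, album, track) gets its own hook with a different `field`, which is why the helper returns a fresh closure per call.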
@@ -1,85 +1,97 @@
from django.db.models import Count
from django_filters import rest_framework as filters

from funkwhale_api.common import fields
from funkwhale_api.common import search

from . import models
from . import utils


class ArtistFilter(filters.FilterSet):
    q = fields.SearchFilter(search_fields=["name"])
    listenable = filters.BooleanFilter(name="_", method="filter_listenable")
    playable = filters.BooleanFilter(name="_", method="filter_playable")

    class Meta:
        model = models.Artist
        fields = {
            "name": ["exact", "iexact", "startswith", "icontains"],
            "listenable": "exact",
            "playable": "exact",
        }

    def filter_listenable(self, queryset, name, value):
        queryset = queryset.annotate(files_count=Count("albums__tracks__files"))
        if value:
            return queryset.filter(files_count__gt=0)
        else:
            return queryset.filter(files_count=0)
    def filter_playable(self, queryset, name, value):
        actor = utils.get_actor_from_request(self.request)
        return queryset.playable_by(actor, value)


class TrackFilter(filters.FilterSet):
    q = fields.SearchFilter(search_fields=["title", "album__title", "artist__name"])
    listenable = filters.BooleanFilter(name="_", method="filter_listenable")
    playable = filters.BooleanFilter(name="_", method="filter_playable")

    class Meta:
        model = models.Track
        fields = {
            "title": ["exact", "iexact", "startswith", "icontains"],
            "listenable": ["exact"],
            "playable": ["exact"],
            "artist": ["exact"],
            "album": ["exact"],
        }

    def filter_listenable(self, queryset, name, value):
        queryset = queryset.annotate(files_count=Count("files"))
        if value:
            return queryset.filter(files_count__gt=0)
        else:
            return queryset.filter(files_count=0)
    def filter_playable(self, queryset, name, value):
        actor = utils.get_actor_from_request(self.request)
        return queryset.playable_by(actor, value)


class ImportBatchFilter(filters.FilterSet):
    q = fields.SearchFilter(search_fields=["submitted_by__username", "source"])
class UploadFilter(filters.FilterSet):
    library = filters.CharFilter("library__uuid")
    track = filters.UUIDFilter("track__uuid")
    track_artist = filters.UUIDFilter("track__artist__uuid")
    album_artist = filters.UUIDFilter("track__album__artist__uuid")
    library = filters.UUIDFilter("library__uuid")
    playable = filters.BooleanFilter(name="_", method="filter_playable")
    q = fields.SmartSearchFilter(
        config=search.SearchConfig(
            search_fields={
                "track_artist": {"to": "track__artist__name"},
                "album_artist": {"to": "track__album__artist__name"},
                "album": {"to": "track__album__title"},
                "title": {"to": "track__title"},
            },
            filter_fields={
                "artist": {"to": "track__artist__name__iexact"},
                "mimetype": {"to": "mimetype"},
                "album": {"to": "track__album__title__iexact"},
                "title": {"to": "track__title__iexact"},
                "status": {"to": "import_status"},
            },
        )
    )

    class Meta:
        model = models.ImportBatch
        fields = {"status": ["exact"], "source": ["exact"], "submitted_by": ["exact"]}
        model = models.Upload
        fields = [
            "playable",
            "import_status",
            "mimetype",
            "track",
            "track_artist",
            "album_artist",
            "library",
            "import_reference",
        ]


class ImportJobFilter(filters.FilterSet):
    q = fields.SearchFilter(search_fields=["batch__submitted_by__username", "source"])

    class Meta:
        model = models.ImportJob
        fields = {
            "batch": ["exact"],
            "batch__status": ["exact"],
            "batch__source": ["exact"],
            "batch__submitted_by": ["exact"],
            "status": ["exact"],
            "source": ["exact"],
        }
    def filter_playable(self, queryset, name, value):
        actor = utils.get_actor_from_request(self.request)
        return queryset.playable_by(actor, value)


class AlbumFilter(filters.FilterSet):
    listenable = filters.BooleanFilter(name="_", method="filter_listenable")
    playable = filters.BooleanFilter(name="_", method="filter_playable")
    q = fields.SearchFilter(search_fields=["title", "artist__name", "source"])

    class Meta:
        model = models.Album
        fields = ["listenable", "q", "artist"]
        fields = ["playable", "q", "artist"]

    def filter_listenable(self, queryset, name, value):
        queryset = queryset.annotate(files_count=Count("tracks__files"))
        if value:
            return queryset.filter(files_count__gt=0)
        else:
            return queryset.filter(files_count=0)
    def filter_playable(self, queryset, name, value):
        actor = utils.get_actor_from_request(self.request)
        return queryset.playable_by(actor, value)

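The `SmartSearchFilter` above splits a query such as `artist:"Nina Simone" status:finished piano` into field-specific filters plus free-text terms. A rough stand-alone approximation using `shlex` for quote-aware tokenizing (the real `funkwhale_api.common.search` implementation may differ in details):

```python
import shlex


def parse_query(query, filter_fields):
    """Split a smart-search query into (filters, free_text_terms).

    `filter_fields` maps a user-facing key (e.g. "artist") to the
    ORM lookup it should filter on, as in the SearchConfig above.
    """
    filters = {}
    terms = []
    # shlex honors quoting, so artist:"Nina Simone" stays one token
    for token in shlex.split(query):
        key, sep, value = token.partition(":")
        if sep and key in filter_fields:
            filters[filter_fields[key]] = value
        else:
            terms.append(token)
    return filters, terms


fields_map = {"artist": "track__artist__name__iexact", "status": "import_status"}
parsed = parse_query('artist:"Nina Simone" status:finished piano', fields_map)
```

Unknown `key:value` tokens fall through to free text here; a stricter parser could reject them instead.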
@@ -15,7 +15,7 @@ class Importer(object):
        # let's validate data, just in case
        instance = self.model(**cleaned_data)
        exclude = EXCLUDE_VALIDATION.get(self.model.__name__, [])
        instance.full_clean(exclude=["mbid", "uuid"] + exclude)
        instance.full_clean(exclude=["mbid", "uuid", "fid", "from_activity"] + exclude)
        m = self.model.objects.update_or_create(mbid=mbid, defaults=cleaned_data)[0]
        for hook in import_hooks:
            hook(m, cleaned_data, raw_data)

@@ -27,9 +27,9 @@ class Command(BaseCommand):
    @transaction.atomic
    def fix_mimetypes(self, dry_run, **kwargs):
        self.stdout.write("Fixing missing mimetypes...")
        matching = models.TrackFile.objects.filter(
            source__startswith="file://"
        ).exclude(mimetype__startswith="audio/")
        matching = models.Upload.objects.filter(source__startswith="file://").exclude(
            mimetype__startswith="audio/"
        )
        self.stdout.write(
            "[mimetypes] {} entries found with bad or no mimetype".format(
                matching.count()

@@ -48,7 +48,7 @@ class Command(BaseCommand):

    def fix_file_data(self, dry_run, **kwargs):
        self.stdout.write("Fixing missing bitrate or length...")
        matching = models.TrackFile.objects.filter(
        matching = models.Upload.objects.filter(
            Q(bitrate__isnull=True) | Q(duration__isnull=True)
        )
        total = matching.count()

@@ -57,41 +57,41 @@ class Command(BaseCommand):
        )
        if dry_run:
            return
        for i, tf in enumerate(matching.only("audio_file")):
        for i, upload in enumerate(matching.only("audio_file")):
            self.stdout.write(
                "[bitrate/length] {}/{} fixing file #{}".format(i + 1, total, tf.pk)
                "[bitrate/length] {}/{} fixing file #{}".format(i + 1, total, upload.pk)
            )

            try:
                audio_file = tf.get_audio_file()
                audio_file = upload.get_audio_file()
                if audio_file:
                    data = utils.get_audio_file_data(audio_file)
                    tf.bitrate = data["bitrate"]
                    tf.duration = data["length"]
                    tf.save(update_fields=["duration", "bitrate"])
                    upload.bitrate = data["bitrate"]
                    upload.duration = data["length"]
                    upload.save(update_fields=["duration", "bitrate"])
                else:
                    self.stderr.write("[bitrate/length] no file found")
            except Exception as e:
                self.stderr.write(
                    "[bitrate/length] error with file #{}: {}".format(tf.pk, str(e))
                    "[bitrate/length] error with file #{}: {}".format(upload.pk, str(e))
                )

    def fix_file_size(self, dry_run, **kwargs):
        self.stdout.write("Fixing missing size...")
        matching = models.TrackFile.objects.filter(size__isnull=True)
        matching = models.Upload.objects.filter(size__isnull=True)
        total = matching.count()
        self.stdout.write("[size] {} entries found with missing values".format(total))
        if dry_run:
            return
        for i, tf in enumerate(matching.only("size")):
        for i, upload in enumerate(matching.only("size")):
            self.stdout.write(
                "[size] {}/{} fixing file #{}".format(i + 1, total, tf.pk)
                "[size] {}/{} fixing file #{}".format(i + 1, total, upload.pk)
            )

            try:
                tf.size = tf.get_file_size()
                tf.save(update_fields=["size"])
                upload.size = upload.get_file_size()
                upload.save(update_fields=["size"])
            except Exception as e:
                self.stderr.write(
                    "[size] error with file #{}: {}".format(tf.pk, str(e))
                    "[size] error with file #{}: {}".format(upload.pk, str(e))
                )

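`fix_mimetypes` targets uploads whose source is a local `file://` path and whose mimetype is missing or not `audio/*`. For local paths, a repair pass of this kind can lean on the standard library's extension map; a minimal sketch (helper name is illustrative, not Funkwhale's actual code):

```python
import mimetypes


def guess_upload_mimetype(source):
    """Guess a mimetype for a file:// source from its file extension.

    Returns None when the extension is unknown, in which case a real
    repair pass would fall back to inspecting the file itself.
    """
    path = source.replace("file://", "", 1)
    guessed, _encoding = mimetypes.guess_type(path)
    return guessed


mtype = guess_upload_mimetype("file:///music/track.mp3")
```

Funkwhale's actual implementation lives in `funkwhale_api.music.utils`; this only illustrates the extension-based approach.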
@@ -1,18 +1,29 @@
import glob
import os
import urllib.parse

from django.conf import settings
from django.core.files import File
from django.core.management.base import BaseCommand, CommandError
from django.utils import timezone

from funkwhale_api.music import models, tasks
from funkwhale_api.users.models import User


class Command(BaseCommand):
    help = "Import audio files matching a given glob pattern"

    def add_arguments(self, parser):
        parser.add_argument(
            "library_id",
            type=str,
            help=(
                "A local library identifier where the files should be imported. "
                "You can use the full uuid such as e29c5be9-6da3-4d92-b40b-4970edd3ee4b "
                "or only a small portion of it, starting from the beginning, such as "
                "e29c5be9"
            ),
        )
        parser.add_argument("path", nargs="+", type=str)
        parser.add_argument(
            "--recursive",

@@ -29,7 +40,7 @@ class Command(BaseCommand):
        parser.add_argument(
            "--async",
            action="store_true",
            dest="async",
            dest="async_",
            default=False,
            help="Will launch celery tasks for each file to import instead of doing it synchronously and blocking the CLI",
        )

@@ -66,6 +77,40 @@ class Command(BaseCommand):
                "with their newest version."
            ),
        )
        parser.add_argument(
            "--outbox",
            action="store_true",
            dest="outbox",
            default=False,
            help=(
                "Use this flag to notify library followers of newly imported files. "
                "You'll likely want to keep this disabled for CLI imports, especially if "
                "you plan to import hundreds or thousands of files, as it will cause a lot "
                "of overhead on your server and on servers you are federating with."
            ),
        )

        parser.add_argument(
            "--broadcast",
            action="store_true",
            dest="broadcast",
            default=False,
            help=(
                "Use this flag to enable realtime updates about the import in the UI. "
                "This causes some overhead, so it's disabled by default."
            ),
        )

        parser.add_argument(
            "--reference",
            action="store",
            dest="reference",
            default=None,
            help=(
                "A custom reference for the import. Leave this empty to have a random "
                "reference generated for you."
            ),
        )
        parser.add_argument(
            "--noinput",
            "--no-input",

@@ -77,14 +122,22 @@ class Command(BaseCommand):
    def handle(self, *args, **options):
        glob_kwargs = {}
        matching = []

        try:
            library = models.Library.objects.select_related("actor__user").get(
                uuid__startswith=options["library_id"]
            )
        except models.Library.DoesNotExist:
            raise CommandError("Invalid library id")

        if not library.actor.get_user():
            raise CommandError("Library {} is not a local library".format(library.uuid))

        if options["recursive"]:
            glob_kwargs["recursive"] = True
        try:
            for import_path in options["path"]:
                matching += glob.glob(import_path, **glob_kwargs)
            raw_matching = sorted(list(set(matching)))
        except TypeError:
            raise Exception("You need Python 3.5 to use the --recursive flag")
        for import_path in options["path"]:
            matching += glob.glob(import_path, **glob_kwargs)
        raw_matching = sorted(list(set(matching)))

        matching = []
        for m in raw_matching:

@@ -128,28 +181,12 @@ class Command(BaseCommand):
        if not matching:
            raise CommandError("No file matching pattern, aborting")

        user = None
        if options["username"]:
            try:
                user = User.objects.get(username=options["username"])
            except User.DoesNotExist:
                raise CommandError("Invalid username")
        else:
            # we bind the import to the first registered superuser
            try:
                user = User.objects.filter(is_superuser=True).order_by("pk").first()
                assert user is not None
            except AssertionError:
                raise CommandError(
                    "No superuser available, please provide a --username"
                )

        if options["replace"]:
            filtered = {"initial": matching, "skipped": [], "new": matching}
            message = "- {} files to be replaced"
            import_paths = matching
        else:
            filtered = self.filter_matching(matching)
            filtered = self.filter_matching(matching, library)
            message = "- {} files already found in database"
            import_paths = filtered["new"]

@@ -179,10 +216,26 @@ class Command(BaseCommand):
        )
        if input("".join(message)) != "yes":
            raise CommandError("Import cancelled.")
        reference = options["reference"] or "cli-{}".format(timezone.now().isoformat())

        batch, errors = self.do_import(import_paths, user=user, options=options)
        import_url = "{}://{}/content/libraries/{}/upload?{}"
        import_url = import_url.format(
            settings.FUNKWHALE_PROTOCOL,
            settings.FUNKWHALE_HOSTNAME,
            str(library.uuid),
            urllib.parse.urlencode([("import", reference)]),
        )
        self.stdout.write(
            "For details, please refer to import reference '{}' or URL {}".format(
                reference, import_url
            )
        )

        errors = self.do_import(
            import_paths, library=library, reference=reference, options=options
        )
        message = "Successfully imported {} tracks"
        if options["async"]:
        if options["async_"]:
            message = "Successfully launched import for {} tracks"

        self.stdout.write(message.format(len(import_paths)))

@@ -191,15 +244,18 @@ class Command(BaseCommand):

        for path, error in errors:
            self.stderr.write("- {}: {}".format(path, error))

        self.stdout.write(
            "For details, please refer to import batch #{}".format(batch.pk)
            "For details, please refer to import reference '{}' or URL {}".format(
                reference, import_url
            )
        )

    def filter_matching(self, matching):
    def filter_matching(self, matching, library):
        sources = ["file://{}".format(p) for p in matching]
        # we skip reimport for paths that are already found
        # as a TrackFile.source
        existing = models.TrackFile.objects.filter(source__in=sources)
        # as an Upload.source
        existing = library.uploads.filter(source__in=sources, import_status="finished")
        existing = existing.values_list("source", flat=True)
        existing = set([p.replace("file://", "", 1) for p in existing])
        skipped = set(matching) & existing

@@ -210,20 +266,27 @@ class Command(BaseCommand):
        }
        return result

    def do_import(self, paths, user, options):
    def do_import(self, paths, library, reference, options):
        message = "{i}/{total} Importing {path}..."
        if options["async"]:
        if options["async_"]:
            message = "{i}/{total} Launching import for {path}..."

        # we create an import batch bound to the user
        async_ = options["async"]
        import_handler = tasks.import_job_run.delay if async_ else tasks.import_job_run
        batch = user.imports.create(source="shell")
        # we create an upload bound to the library
        async_ = options["async_"]
        errors = []
        for i, path in list(enumerate(paths)):
            try:
                self.stdout.write(message.format(path=path, i=i + 1, total=len(paths)))
                self.import_file(path, batch, import_handler, options)
                self.create_upload(
                    path,
                    reference,
                    library,
                    async_,
                    options["replace"],
                    options["in_place"],
                    options["outbox"],
                    options["broadcast"],
                )
            except Exception as e:
                if options["exit_on_failure"]:
                    raise

@@ -232,16 +295,36 @@ class Command(BaseCommand):
                    )
                    self.stderr.write(m)
                    errors.append((path, "{} {}".format(e.__class__.__name__, e)))
        return batch, errors
        return errors

    def import_file(self, path, batch, import_handler, options):
        job = batch.jobs.create(
            source="file://" + path, replace_if_duplicate=options["replace"]
        )
        if not options["in_place"]:
    def create_upload(
        self,
        path,
        reference,
        library,
        async_,
        replace,
        in_place,
        dispatch_outbox,
        broadcast,
    ):
        import_handler = tasks.process_upload.delay if async_ else tasks.process_upload
        upload = models.Upload(library=library, import_reference=reference)
        upload.source = "file://" + path
        upload.import_metadata = {
            "funkwhale": {
                "config": {
                    "replace": replace,
                    "dispatch_outbox": dispatch_outbox,
                    "broadcast": broadcast,
                }
            }
        }
        if not in_place:
            name = os.path.basename(path)
            with open(path, "rb") as f:
                job.audio_file.save(name, File(f))
                upload.audio_file.save(name, File(f), save=False)

        job.save()
        import_handler(import_job_id=job.pk, use_acoustid=False)
        upload.save()

        import_handler(upload_id=upload.pk)

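The command now prints a web UI URL filtered on the import reference, with `urllib.parse.urlencode` handling query-string escaping (the reference contains `:` characters from the ISO timestamp). A stand-alone sketch of that construction; the hostname and values below are illustrative:

```python
import urllib.parse


def build_import_url(protocol, hostname, library_uuid, reference):
    """Build the library upload URL filtered on an import reference,
    mirroring the import_url construction in the command above."""
    # urlencode escapes the ':' characters in ISO timestamps
    query = urllib.parse.urlencode([("import", reference)])
    return "{}://{}/content/libraries/{}/upload?{}".format(
        protocol, hostname, library_uuid, query
    )


url = build_import_url(
    "https", "demo.funkwhale.audio", "e29c5be9", "cli-2018-10-07T00:00:00"
)
```

Passing a list of tuples to `urlencode` (rather than a dict) preserves parameter order, which keeps the printed URL stable.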
@@ -93,9 +93,9 @@ def convert_track_number(v):
class FirstUUIDField(forms.UUIDField):
    def to_python(self, value):
        try:
            # sometimes, Picard leaves to uuids in the field, separated
            # by a slash
            value = value.split("/")[0]
            # sometimes, Picard leaves two uuids in the field, separated
            # by a slash or a ;
            value = value.split(";")[0].split("/")[0].strip()
        except (AttributeError, IndexError, TypeError):
            pass

@@ -107,13 +107,42 @@ def get_date(value):
    return datetime.date(parsed.year, parsed.month, parsed.day)


def split_and_return_first(separator):
    def inner(v):
        return v.split(separator)[0].strip()

    return inner


VALIDATION = {
    "musicbrainz_artistid": FirstUUIDField(),
    "musicbrainz_albumid": FirstUUIDField(),
    "musicbrainz_recordingid": FirstUUIDField(),
    "musicbrainz_albumartistid": FirstUUIDField(),
}

CONF = {
    "OggOpus": {
        "getter": lambda f, k: f[k][0],
        "fields": {
            "track_number": {
                "field": "TRACKNUMBER",
                "to_application": convert_track_number,
            },
            "title": {},
            "artist": {},
            "album_artist": {
                "field": "albumartist",
                "to_application": split_and_return_first(";"),
            },
            "album": {},
            "date": {"field": "date", "to_application": get_date},
            "musicbrainz_albumid": {},
            "musicbrainz_artistid": {},
            "musicbrainz_albumartistid": {},
            "musicbrainz_recordingid": {"field": "musicbrainz_trackid"},
        },
    },
    "OggVorbis": {
        "getter": lambda f, k: f[k][0],
        "fields": {

@@ -123,10 +152,15 @@ CONF = {
            },
            "title": {},
            "artist": {},
            "album_artist": {
                "field": "albumartist",
                "to_application": split_and_return_first(";"),
            },
            "album": {},
            "date": {"field": "date", "to_application": get_date},
            "musicbrainz_albumid": {},
            "musicbrainz_artistid": {},
            "musicbrainz_albumartistid": {},
            "musicbrainz_recordingid": {"field": "musicbrainz_trackid"},
        },
    },

@@ -139,10 +173,12 @@ CONF = {
            },
            "title": {},
            "artist": {},
            "album_artist": {"field": "albumartist"},
            "album": {},
            "date": {"field": "date", "to_application": get_date},
            "musicbrainz_albumid": {"field": "MusicBrainz Album Id"},
            "musicbrainz_artistid": {"field": "MusicBrainz Artist Id"},
            "musicbrainz_albumartistid": {"field": "MusicBrainz Album Artist Id"},
            "musicbrainz_recordingid": {"field": "MusicBrainz Track Id"},
        },
    },

@@ -153,10 +189,12 @@ CONF = {
            "track_number": {"field": "TRCK", "to_application": convert_track_number},
            "title": {"field": "TIT2"},
            "artist": {"field": "TPE1"},
            "album_artist": {"field": "TPE2"},
            "album": {"field": "TALB"},
            "date": {"field": "TDRC", "to_application": get_date},
            "musicbrainz_albumid": {"field": "MusicBrainz Album Id"},
            "musicbrainz_artistid": {"field": "MusicBrainz Artist Id"},
            "musicbrainz_albumartistid": {"field": "MusicBrainz Album Artist Id"},
            "musicbrainz_recordingid": {
                "field": "UFID",
                "getter": get_mp3_recording_id,

@@ -174,10 +212,12 @@ CONF = {
            },
            "title": {},
            "artist": {},
            "album_artist": {"field": "albumartist"},
            "album": {},
            "date": {"field": "date", "to_application": get_date},
            "musicbrainz_albumid": {},
            "musicbrainz_artistid": {},
            "musicbrainz_albumartistid": {},
            "musicbrainz_recordingid": {"field": "musicbrainz_trackid"},
            "test": {},
            "pictures": {},

@@ -185,6 +225,19 @@ CONF = {
        },
    }

ALL_FIELDS = [
    "track_number",
    "title",
    "artist",
    "album_artist",
    "album",
    "date",
    "musicbrainz_albumid",
    "musicbrainz_artistid",
    "musicbrainz_albumartistid",
    "musicbrainz_recordingid",
]


class Metadata(object):
    def __init__(self, path):

@@ -222,6 +275,20 @@ class Metadata(object):
        v = field.to_python(v)
        return v

    def all(self):
        """
        Return a dict containing all metadata of the file
        """

        data = {}
        for field in ALL_FIELDS:
            try:
                data[field] = self.get(field, None)
            except (TagNotFound, forms.ValidationError):
                data[field] = None

        return data
|
||||
|
||||
def get_picture(self, picture_type="cover_front"):
|
||||
ptype = getattr(mutagen.id3.PictureType, picture_type.upper())
|
||||
try:
|
||||
|
|
|
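The `split_and_return_first` helper above builds a post-processing callable that the `CONF` mapping applies to raw tag values (e.g. an `albumartist` tag containing several names joined by `;`). A minimal standalone sketch, reimplementing only that helper as defined above:

```python
def split_and_return_first(separator):
    """Build a callable that keeps only the first separator-delimited item."""

    def inner(v):
        return v.split(separator)[0].strip()

    return inner


# An "albumartist" tag may contain several artists joined by ";";
# only the first one is kept.
first_artist = split_and_return_first(";")
print(first_artist("Artist A; Artist B"))  # -> Artist A
print(first_artist("Solo Artist"))  # -> Solo Artist
```

Returning a closure rather than taking `(separator, value)` directly lets the mapping store a ready-to-call one-argument converter next to each field.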
@ -5,14 +5,12 @@ from django.db import migrations, models

class Migration(migrations.Migration):

    dependencies = [
        ('music', '0027_auto_20180515_1808'),
    ]
    dependencies = [("music", "0027_auto_20180515_1808")]

    operations = [
        migrations.AddField(
            model_name='importjob',
            name='replace_if_duplicate',
            model_name="importjob",
            name="replace_if_duplicate",
            field=models.BooleanField(default=False),
        ),
        )
    ]

@ -0,0 +1,109 @@
# Generated by Django 2.0.7 on 2018-08-07 17:48

from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
import uuid


class Migration(migrations.Migration):

    dependencies = [
        ("federation", "0007_auto_20180807_1748"),
        ("music", "0028_importjob_replace_if_duplicate"),
    ]

    operations = [
        migrations.CreateModel(
            name="Library",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                ("fid", models.URLField(db_index=True, max_length=500, unique=True)),
                ("url", models.URLField(blank=True, max_length=500, null=True)),
                (
                    "uuid",
                    models.UUIDField(db_index=True, default=uuid.uuid4, unique=True),
                ),
                ("followers_url", models.URLField(max_length=500)),
                (
                    "creation_date",
                    models.DateTimeField(default=django.utils.timezone.now),
                ),
                ("name", models.CharField(max_length=100)),
                (
                    "description",
                    models.TextField(blank=True, max_length=5000, null=True),
                ),
                (
                    "privacy_level",
                    models.CharField(
                        choices=[
                            ("me", "Only me"),
                            ("instance", "Everyone on my instance, and my followers"),
                            (
                                "everyone",
                                "Everyone, including people on other instances",
                            ),
                        ],
                        default="me",
                        max_length=25,
                    ),
                ),
                (
                    "actor",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE,
                        related_name="libraries",
                        to="federation.Actor",
                    ),
                ),
            ],
            options={"abstract": False},
        ),
        migrations.AddField(
            model_name="importjob",
            name="audio_file_size",
            field=models.IntegerField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="importbatch",
            name="import_request",
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.SET_NULL,
                related_name="import_batches",
                to="requests.ImportRequest",
            ),
        ),
        migrations.AddField(
            model_name="importbatch",
            name="library",
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.CASCADE,
                related_name="import_batches",
                to="music.Library",
            ),
        ),
        migrations.AddField(
            model_name="trackfile",
            name="library",
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.CASCADE,
                related_name="files",
                to="music.Library",
            ),
        ),
    ]

@ -0,0 +1,152 @@
# Generated by Django 2.0.8 on 2018-08-25 14:11

import django.contrib.postgres.fields.jsonb
import django.core.serializers.json
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
import funkwhale_api.music.models


class Migration(migrations.Migration):

    dependencies = [
        ("federation", "0009_auto_20180822_1956"),
        ("music", "0029_auto_20180807_1748"),
    ]

    operations = [
        migrations.CreateModel(
            name="LibraryScan",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                ("total_files", models.PositiveIntegerField(default=0)),
                ("processed_files", models.PositiveIntegerField(default=0)),
                ("errored_files", models.PositiveIntegerField(default=0)),
                ("status", models.CharField(default="pending", max_length=25)),
                (
                    "creation_date",
                    models.DateTimeField(default=django.utils.timezone.now),
                ),
                ("modification_date", models.DateTimeField(blank=True, null=True)),
                (
                    "actor",
                    models.ForeignKey(
                        blank=True,
                        null=True,
                        on_delete=django.db.models.deletion.CASCADE,
                        to="federation.Actor",
                    ),
                ),
            ],
        ),
        migrations.RemoveField(model_name="trackfile", name="library_track"),
        migrations.AddField(
            model_name="library",
            name="files_count",
            field=models.PositiveIntegerField(default=0),
        ),
        migrations.AddField(
            model_name="trackfile",
            name="fid",
            field=models.URLField(blank=True, max_length=500, null=True, unique=True),
        ),
        migrations.AddField(
            model_name="trackfile",
            name="import_date",
            field=models.DateTimeField(blank=True, null=True),
        ),
        migrations.AddField(
            model_name="trackfile",
            name="import_details",
            field=django.contrib.postgres.fields.jsonb.JSONField(
                default=funkwhale_api.music.models.empty_dict,
                encoder=django.core.serializers.json.DjangoJSONEncoder,
                max_length=50000,
            ),
        ),
        migrations.AddField(
            model_name="trackfile",
            name="import_metadata",
            field=django.contrib.postgres.fields.jsonb.JSONField(
                default=funkwhale_api.music.models.empty_dict,
                encoder=django.core.serializers.json.DjangoJSONEncoder,
                max_length=50000,
            ),
        ),
        migrations.AddField(
            model_name="trackfile",
            name="import_reference",
            field=models.CharField(
                default=funkwhale_api.music.models.get_import_reference, max_length=50
            ),
        ),
        migrations.AddField(
            model_name="trackfile",
            name="import_status",
            field=models.CharField(
                choices=[
                    ("pending", "Pending"),
                    ("finished", "Finished"),
                    ("errored", "Errored"),
                    ("skipped", "Skipped"),
                ],
                default="pending",
                max_length=25,
            ),
        ),
        migrations.AddField(
            model_name="trackfile",
            name="metadata",
            field=django.contrib.postgres.fields.jsonb.JSONField(
                default=funkwhale_api.music.models.empty_dict,
                encoder=django.core.serializers.json.DjangoJSONEncoder,
                max_length=50000,
            ),
        ),
        migrations.AlterField(
            model_name="album",
            name="release_date",
            field=models.DateField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name="trackfile",
            name="audio_file",
            field=models.FileField(
                max_length=255, upload_to=funkwhale_api.music.models.get_file_path
            ),
        ),
        migrations.AlterField(
            model_name="trackfile",
            name="source",
            field=models.CharField(blank=True, max_length=500, null=True),
        ),
        migrations.AlterField(
            model_name="trackfile",
            name="track",
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.CASCADE,
                related_name="files",
                to="music.Track",
            ),
        ),
        migrations.AddField(
            model_name="libraryscan",
            name="library",
            field=models.ForeignKey(
                on_delete=django.db.models.deletion.CASCADE,
                related_name="scans",
                to="music.Library",
            ),
        ),
    ]

@ -0,0 +1,66 @@
# Generated by Django 2.0.8 on 2018-09-14 20:07

from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone


class Migration(migrations.Migration):

    dependencies = [
        ('federation', '0011_auto_20180910_1902'),
        ('music', '0030_auto_20180825_1411'),
    ]

    operations = [
        migrations.AddField(
            model_name='album',
            name='fid',
            field=models.URLField(db_index=True, max_length=500, null=True, unique=True),
        ),
        migrations.AddField(
            model_name='album',
            name='from_activity',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='federation.Activity'),
        ),
        migrations.AddField(
            model_name='artist',
            name='fid',
            field=models.URLField(db_index=True, max_length=500, null=True, unique=True),
        ),
        migrations.AddField(
            model_name='artist',
            name='from_activity',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='federation.Activity'),
        ),
        migrations.AddField(
            model_name='track',
            name='fid',
            field=models.URLField(db_index=True, max_length=500, null=True, unique=True),
        ),
        migrations.AddField(
            model_name='track',
            name='from_activity',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='federation.Activity'),
        ),
        migrations.AddField(
            model_name='trackfile',
            name='from_activity',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='federation.Activity'),
        ),
        migrations.AddField(
            model_name='work',
            name='fid',
            field=models.URLField(db_index=True, max_length=500, null=True, unique=True),
        ),
        migrations.AddField(
            model_name='work',
            name='from_activity',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='federation.Activity'),
        ),
        migrations.AlterField(
            model_name='trackfile',
            name='modification_date',
            field=models.DateTimeField(default=django.utils.timezone.now, null=True),
        ),
    ]

@ -0,0 +1,40 @@
# Generated by Django 2.0.8 on 2018-09-21 16:47

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):
    dependencies = [("music", "0031_auto_20180914_2007")]

    operations = [
        migrations.RenameModel("TrackFile", "Upload"),
        migrations.RenameField(
            model_name="importjob", old_name="track_file", new_name="upload"
        ),
        migrations.RenameField(
            model_name="library", old_name="files_count", new_name="uploads_count"
        ),
        migrations.AlterField(
            model_name="upload",
            name="library",
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.CASCADE,
                related_name="uploads",
                to="music.Library",
            ),
        ),
        migrations.AlterField(
            model_name="upload",
            name="track",
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.CASCADE,
                related_name="uploads",
                to="music.Track",
            ),
        ),
    ]

@ -1,13 +1,16 @@
import datetime
import logging
import mimetypes
import os
import shutil
import tempfile
import uuid

import markdown
import pendulum
from django.conf import settings
from django.core.files import File
from django.contrib.postgres.fields import JSONField
from django.core.files.base import ContentFile
from django.core.serializers.json import DjangoJSONEncoder
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver
@ -18,15 +21,28 @@ from taggit.managers import TaggableManager
from versatileimagefield.fields import VersatileImageField
from versatileimagefield.image_warmer import VersatileImageFieldWarmer

from funkwhale_api import downloader, musicbrainz
from funkwhale_api import musicbrainz
from funkwhale_api.common import fields
from funkwhale_api.common import session
from funkwhale_api.common import utils as common_utils
from funkwhale_api.federation import models as federation_models
from funkwhale_api.federation import utils as federation_utils

from . import importers, metadata, utils

logger = logging.getLogger(__file__)


def empty_dict():
    return {}


class APIModelMixin(models.Model):
    fid = models.URLField(unique=True, max_length=500, db_index=True, null=True)
    mbid = models.UUIDField(unique=True, db_index=True, null=True, blank=True)
    uuid = models.UUIDField(unique=True, db_index=True, default=uuid.uuid4)
    from_activity = models.ForeignKey(
        "federation.Activity", null=True, blank=True, on_delete=models.SET_NULL
    )
    api_includes = []
    creation_date = models.DateTimeField(default=timezone.now)
    import_hooks = []
@ -79,6 +95,23 @@ class APIModelMixin(models.Model):
            self.musicbrainz_model, self.mbid
        )

    def get_federation_id(self):
        if self.fid:
            return self.fid

        return federation_utils.full_url(
            reverse(
                "federation:music:{}-detail".format(self.federation_namespace),
                kwargs={"uuid": self.uuid},
            )
        )

    def save(self, **kwargs):
        if not self.pk and not self.fid:
            self.fid = self.get_federation_id()

        return super().save(**kwargs)


class ArtistQuerySet(models.QuerySet):
    def with_albums_count(self):
@ -89,10 +122,27 @@ class ArtistQuerySet(models.QuerySet):
            models.Prefetch("albums", queryset=Album.objects.with_tracks_count())
        )

    def annotate_playable_by_actor(self, actor):
        tracks = (
            Track.objects.playable_by(actor)
            .filter(artist=models.OuterRef("id"))
            .order_by("id")
            .values("id")[:1]
        )
        subquery = models.Subquery(tracks)
        return self.annotate(is_playable_by_actor=subquery)

    def playable_by(self, actor, include=True):
        tracks = Track.objects.playable_by(actor, include)
        if include:
            return self.filter(tracks__in=tracks)
        else:
            return self.exclude(tracks__in=tracks)


class Artist(APIModelMixin):
    name = models.CharField(max_length=255)

    federation_namespace = "artists"
    musicbrainz_model = "artist"
    musicbrainz_mapping = {
        "mbid": {"musicbrainz_field_name": "id"},
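The `playable_by(actor, include=True)` methods above all follow one pattern: the same membership predicate is used either as a filter or, with `include=False`, as an exclusion. A pure-Python sketch of that pattern (lists stand in for querysets; the real code uses `filter`/`exclude` with a subquery):

```python
def playable_by(tracks, playable, include=True):
    """One predicate, applied as a filter (include=True) or as an
    exclusion (include=False), mirroring the QuerySet methods above."""
    if include:
        return [t for t in tracks if t in playable]
    return [t for t in tracks if t not in playable]


tracks = ["intro", "chorus", "outro"]
print(playable_by(tracks, {"intro", "outro"}))  # -> ['intro', 'outro']
print(playable_by(tracks, {"intro", "outro"}, include=False))  # -> ['chorus']
```

Pushing the `include` flag down into the lowest-level queryset keeps `Artist.playable_by` and `Album.playable_by` one-liners over `Track.objects.playable_by`.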
@ -140,6 +190,23 @@ class AlbumQuerySet(models.QuerySet):
    def with_tracks_count(self):
        return self.annotate(_tracks_count=models.Count("tracks"))

    def annotate_playable_by_actor(self, actor):
        tracks = (
            Track.objects.playable_by(actor)
            .filter(album=models.OuterRef("id"))
            .order_by("id")
            .values("id")[:1]
        )
        subquery = models.Subquery(tracks)
        return self.annotate(is_playable_by_actor=subquery)

    def playable_by(self, actor, include=True):
        tracks = Track.objects.playable_by(actor, include)
        if include:
            return self.filter(tracks__in=tracks)
        else:
            return self.exclude(tracks__in=tracks)


class Album(APIModelMixin):
    title = models.CharField(max_length=255)
@ -154,6 +221,7 @@ class Album(APIModelMixin):

    api_includes = ["artist-credits", "recordings", "media", "release-groups"]
    api = musicbrainz.api.releases
    federation_namespace = "albums"
    musicbrainz_model = "release"
    musicbrainz_mapping = {
        "mbid": {"musicbrainz_field_name": "id"},
@ -177,14 +245,35 @@ class Album(APIModelMixin):

    def get_image(self, data=None):
        if data:
            f = ContentFile(data["content"])
            extensions = {"image/jpeg": "jpg", "image/png": "png", "image/gif": "gif"}
            extension = extensions.get(data["mimetype"], "jpg")
            self.cover.save("{}.{}".format(self.uuid, extension), f)
        else:
            if data.get("content"):
                # we have the cover itself
                f = ContentFile(data["content"])
            elif data.get("url"):
                # we can fetch from a url
                try:
                    response = session.get_session().get(
                        data.get("url"),
                        timeout=3,
                        verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
                    )
                    response.raise_for_status()
                except Exception as e:
                    logger.warn(
                        "Cannot download cover at url %s: %s", data.get("url"), e
                    )
                    return
                else:
                    f = ContentFile(response.content)
            self.cover.save("{}.{}".format(self.uuid, extension), f, save=False)
            self.save(update_fields=["cover"])
            return self.cover.file
        if self.mbid:
            image_data = musicbrainz.api.images.get_front(str(self.mbid))
            f = ContentFile(image_data)
            self.cover.save("{0}.jpg".format(self.mbid), f)
            self.cover.save("{0}.jpg".format(self.mbid), f, save=False)
            self.save(update_fields=["cover"])
            return self.cover.file

    def __str__(self):
@ -249,6 +338,8 @@ class Work(APIModelMixin):
    api = musicbrainz.api.works
    api_includes = ["url-rels", "recording-rels"]
    musicbrainz_model = "work"
    federation_namespace = "works"

    musicbrainz_mapping = {
        "mbid": {"musicbrainz_field_name": "id"},
        "title": {"musicbrainz_field_name": "title"},
@ -266,6 +357,12 @@ class Work(APIModelMixin):

        return lyric

    def get_federation_id(self):
        if self.fid:
            return self.fid

        return None


class Lyrics(models.Model):
    uuid = models.UUIDField(unique=True, db_index=True, default=uuid.uuid4)
@ -287,10 +384,37 @@ class Lyrics(models.Model):

class TrackQuerySet(models.QuerySet):
    def for_nested_serialization(self):
        return (
            self.select_related()
            .select_related("album__artist", "artist")
            .prefetch_related("files")
        return self.select_related().select_related("album__artist", "artist")

    def annotate_playable_by_actor(self, actor):
        files = (
            Upload.objects.playable_by(actor)
            .filter(track=models.OuterRef("id"))
            .order_by("id")
            .values("id")[:1]
        )
        subquery = models.Subquery(files)
        return self.annotate(is_playable_by_actor=subquery)

    def playable_by(self, actor, include=True):
        files = Upload.objects.playable_by(actor, include)
        if include:
            return self.filter(uploads__in=files)
        else:
            return self.exclude(uploads__in=files)

    def annotate_duration(self):
        first_upload = Upload.objects.filter(track=models.OuterRef("pk")).order_by("pk")
        return self.annotate(
            duration=models.Subquery(first_upload.values("duration")[:1])
        )

    def annotate_file_data(self):
        first_upload = Upload.objects.filter(track=models.OuterRef("pk")).order_by("pk")
        return self.annotate(
            bitrate=models.Subquery(first_upload.values("bitrate")[:1]),
            size=models.Subquery(first_upload.values("size")[:1]),
            mimetype=models.Subquery(first_upload.values("mimetype")[:1]),
        )


@ -310,7 +434,7 @@ class Track(APIModelMixin):
    work = models.ForeignKey(
        Work, related_name="tracks", null=True, blank=True, on_delete=models.CASCADE
    )

    federation_namespace = "tracks"
    musicbrainz_model = "recording"
    api = musicbrainz.api.recordings
    api_includes = ["artist-credits", "releases", "media", "tags", "work-rels"]
@ -423,48 +547,139 @@ class Track(APIModelMixin):
            },
        )

    @property
    def listen_url(self):
        return reverse("api:v1:listen-detail", kwargs={"uuid": self.uuid})


class TrackFile(models.Model):

class UploadQuerySet(models.QuerySet):
    def playable_by(self, actor, include=True):
        libraries = Library.objects.viewable_by(actor)

        if include:
            return self.filter(library__in=libraries, import_status="finished")
        return self.exclude(library__in=libraries, import_status="finished")

    def local(self, include=True):
        return self.exclude(library__actor__user__isnull=include)

    def for_federation(self):
        return self.filter(import_status="finished", mimetype__startswith="audio/")


TRACK_FILE_IMPORT_STATUS_CHOICES = (
    ("pending", "Pending"),
    ("finished", "Finished"),
    ("errored", "Errored"),
    ("skipped", "Skipped"),
)


def get_file_path(instance, filename):
    if instance.library.actor.get_user():
        return common_utils.ChunkedPath("tracks")(instance, filename)
    else:
        # we cache remote tracks in a different directory
        return common_utils.ChunkedPath("federation_cache/tracks")(instance, filename)


def get_import_reference():
    return str(uuid.uuid4())


class Upload(models.Model):
    fid = models.URLField(unique=True, max_length=500, null=True, blank=True)
    uuid = models.UUIDField(unique=True, db_index=True, default=uuid.uuid4)
    track = models.ForeignKey(Track, related_name="files", on_delete=models.CASCADE)
    audio_file = models.FileField(upload_to="tracks/%Y/%m/%d", max_length=255)
    source = models.URLField(null=True, blank=True, max_length=500)
    track = models.ForeignKey(
        Track, related_name="uploads", on_delete=models.CASCADE, null=True, blank=True
    )
    audio_file = models.FileField(upload_to=get_file_path, max_length=255)
    source = models.CharField(
        # URL validators are not flexible enough for our file:// and upload:// schemes
        null=True,
        blank=True,
        max_length=500,
    )
    creation_date = models.DateTimeField(default=timezone.now)
    modification_date = models.DateTimeField(auto_now=True)
    modification_date = models.DateTimeField(default=timezone.now, null=True)
    accessed_date = models.DateTimeField(null=True, blank=True)
    duration = models.IntegerField(null=True, blank=True)
    size = models.IntegerField(null=True, blank=True)
    bitrate = models.IntegerField(null=True, blank=True)
    acoustid_track_id = models.UUIDField(null=True, blank=True)
    mimetype = models.CharField(null=True, blank=True, max_length=200)

    library_track = models.OneToOneField(
        "federation.LibraryTrack",
        related_name="local_track_file",
        on_delete=models.CASCADE,
    library = models.ForeignKey(
        "library",
        null=True,
        blank=True,
        related_name="uploads",
        on_delete=models.CASCADE,
    )

    def download_file(self):
        # import the track file, since there is not any
        # we create a tmp dir for the download
        tmp_dir = tempfile.mkdtemp()
        data = downloader.download(self.source, target_directory=tmp_dir)
        self.duration = data.get("duration", None)
        self.audio_file.save(
            os.path.basename(data["audio_file_path"]),
            File(open(data["audio_file_path"], "rb")),
    # metadata from federation
    metadata = JSONField(
        default=empty_dict, max_length=50000, encoder=DjangoJSONEncoder
    )
    import_date = models.DateTimeField(null=True, blank=True)
    # optional metadata provided during import
    import_metadata = JSONField(
        default=empty_dict, max_length=50000, encoder=DjangoJSONEncoder
    )
    # status / error details for the import
    import_status = models.CharField(
        default="pending", choices=TRACK_FILE_IMPORT_STATUS_CHOICES, max_length=25
    )
    # a short reference provided by the client to group multiple files
    # in the same import
    import_reference = models.CharField(max_length=50, default=get_import_reference)

    # optional metadata about import results (error messages, etc.)
    import_details = JSONField(
        default=empty_dict, max_length=50000, encoder=DjangoJSONEncoder
    )
    from_activity = models.ForeignKey(
        "federation.Activity", null=True, on_delete=models.SET_NULL
    )

    objects = UploadQuerySet.as_manager()

    def download_audio_from_remote(self, user):
        from funkwhale_api.common import session
        from funkwhale_api.federation import signing

        if user.is_authenticated and user.actor:
            auth = signing.get_auth(user.actor.private_key, user.actor.private_key_id)
        else:
            auth = None

        remote_response = session.get_session().get(
            self.source,
            auth=auth,
            stream=True,
            timeout=20,
            headers={"Content-Type": "application/octet-stream"},
            verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
        )
        shutil.rmtree(tmp_dir)
        return self.audio_file
        with remote_response as r:
            remote_response.raise_for_status()
            extension = utils.get_ext_from_type(self.mimetype)
            title = " - ".join(
                [self.track.title, self.track.album.title, self.track.artist.name]
            )
            filename = "{}.{}".format(title, extension)
            tmp_file = tempfile.TemporaryFile()
            for chunk in r.iter_content(chunk_size=512):
                tmp_file.write(chunk)
            self.audio_file.save(filename, tmp_file, save=False)
            self.save(update_fields=["audio_file"])

    def get_federation_url(self):
        return federation_utils.full_url("/federation/music/file/{}".format(self.uuid))
    def get_federation_id(self):
        if self.fid:
            return self.fid

    @property
    def path(self):
        return reverse("api:v1:trackfiles-serve", kwargs={"pk": self.pk})
        return federation_utils.full_url(
            reverse("federation:music:uploads-detail", kwargs={"uuid": self.uuid})
        )

    @property
    def filename(self):
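Two small helpers in the hunk above, `empty_dict` and `get_import_reference`, are passed as callable defaults to the new `Upload` fields. A callable default means each row gets a fresh value at save time instead of one shared (mutable) instance baked into the migration. A standalone sketch of both, as defined above:

```python
import uuid


def empty_dict():
    # Callable default for JSONField: every row gets its own dict,
    # never one shared mutable instance.
    return {}


def get_import_reference():
    # Opaque reference used to group several uploads into one import.
    return str(uuid.uuid4())


# Each call yields a distinct object / value.
assert empty_dict() is not empty_dict()
ref_a, ref_b = get_import_reference(), get_import_reference()
assert ref_a != ref_b
```

The same reasoning applies to `default=uuid.uuid4` on the model fields: the callable is stored, not its result.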
@ -483,37 +698,35 @@ class TrackFile(models.Model):
        if self.source.startswith("file://"):
            return os.path.getsize(self.source.replace("file://", "", 1))

        if self.library_track and self.library_track.audio_file:
            return self.library_track.audio_file.size

    def get_audio_file(self):
        if self.audio_file:
            return self.audio_file.open()
        if self.source.startswith("file://"):
            return open(self.source.replace("file://", "", 1), "rb")
        if self.library_track and self.library_track.audio_file:
            return self.library_track.audio_file.open()

    def set_audio_data(self):
    def get_audio_data(self):
        audio_file = self.get_audio_file()
        if audio_file:
            with audio_file as f:
                audio_data = utils.get_audio_file_data(f)
            if not audio_data:
                return
            self.duration = int(audio_data["length"])
            self.bitrate = audio_data["bitrate"]
            self.size = self.get_file_size()
        else:
            lt = self.library_track
            if lt:
                self.duration = lt.get_metadata("length")
                self.size = lt.get_metadata("size")
                self.bitrate = lt.get_metadata("bitrate")
        if not audio_file:
            return
        audio_data = utils.get_audio_file_data(audio_file)
        if not audio_data:
            return
        return {
            "duration": int(audio_data["length"]),
            "bitrate": audio_data["bitrate"],
            "size": self.get_file_size(),
        }

    def save(self, **kwargs):
        if not self.mimetype and self.audio_file:
            self.mimetype = utils.guess_mimetype(self.audio_file)
        if not self.mimetype:
            if self.audio_file:
                self.mimetype = utils.guess_mimetype(self.audio_file)
            elif self.source and self.source.startswith("file://"):
                self.mimetype = mimetypes.guess_type(self.source)[0]
        if not self.size and self.audio_file:
            self.size = self.audio_file.size
        if not self.pk and not self.fid and self.library.actor.get_user():
            self.fid = self.get_federation_id()
        return super().save(**kwargs)

    def get_metadata(self):
@ -522,6 +735,10 @@ class TrackFile(models.Model):
            return
        return metadata.Metadata(audio_file)

    @property
    def listen_url(self):
        return self.track.listen_url + "?upload={}".format(self.uuid)


IMPORT_STATUS_CHOICES = (
    ("pending", "Pending"),
@ -559,6 +776,13 @@ class ImportBatch(models.Model):
        blank=True,
        on_delete=models.SET_NULL,
    )
    library = models.ForeignKey(
        "Library",
        related_name="import_batches",
        null=True,
        blank=True,
        on_delete=models.CASCADE,
    )

    class Meta:
        ordering = ["-creation_date"]
@ -577,7 +801,7 @@ class ImportBatch(models.Model):

        tasks.import_batch_notify_followers.delay(import_batch_id=self.pk)

    def get_federation_url(self):
    def get_federation_id(self):
        return federation_utils.full_url(
            "/federation/music/import/batch/{}".format(self.uuid)
        )
@ -589,8 +813,8 @@ class ImportJob(models.Model):
|
|||
batch = models.ForeignKey(
|
||||
ImportBatch, related_name="jobs", on_delete=models.CASCADE
|
||||
)
|
||||
track_file = models.ForeignKey(
|
||||
TrackFile, related_name="jobs", null=True, blank=True, on_delete=models.CASCADE
|
||||
upload = models.ForeignKey(
|
||||
Upload, related_name="jobs", null=True, blank=True, on_delete=models.CASCADE
|
||||
)
|
||||
source = models.CharField(max_length=500)
|
||||
mbid = models.UUIDField(editable=False, null=True, blank=True)
|
||||
|
@@ -609,10 +833,125 @@ class ImportJob(models.Model):
        null=True,
        blank=True,
    )
    audio_file_size = models.IntegerField(null=True, blank=True)

    class Meta:
        ordering = ("id",)

    def save(self, **kwargs):
        if self.audio_file and not self.audio_file_size:
            self.audio_file_size = self.audio_file.size
        return super().save(**kwargs)


LIBRARY_PRIVACY_LEVEL_CHOICES = [
    (k, l) for k, l in fields.PRIVACY_LEVEL_CHOICES if k != "followers"
]


class LibraryQuerySet(models.QuerySet):
    def with_follows(self, actor):
        return self.prefetch_related(
            models.Prefetch(
                "received_follows",
                queryset=federation_models.LibraryFollow.objects.filter(actor=actor),
                to_attr="_follows",
            )
        )

    def viewable_by(self, actor):
        from funkwhale_api.federation.models import LibraryFollow

        if actor is None:
            return Library.objects.filter(privacy_level="everyone")

        me_query = models.Q(privacy_level="me", actor=actor)
        instance_query = models.Q(privacy_level="instance", actor__domain=actor.domain)
        followed_libraries = LibraryFollow.objects.filter(
            actor=actor, approved=True
        ).values_list("target", flat=True)
        return Library.objects.filter(
            me_query
            | instance_query
            | models.Q(privacy_level="everyone")
            | models.Q(pk__in=followed_libraries)
        )
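The visibility rules encoded in `viewable_by` above can be sketched in plain Python, away from the Django ORM. Everything below (`Actor`, `Library`, `is_viewable_by`) is illustrative, not part of the Funkwhale codebase; it only mirrors the `me` / `instance` / `everyone` / approved-follow logic:

```python
from dataclasses import dataclass


@dataclass
class Actor:
    name: str
    domain: str


@dataclass
class Library:
    owner: Actor
    privacy_level: str  # "me", "instance" or "everyone"


def is_viewable_by(library, actor, approved_follows=()):
    # anonymous requests only see fully public libraries
    if actor is None:
        return library.privacy_level == "everyone"
    if library.privacy_level == "everyone":
        return True
    # "me": only the owner sees the library
    if library.privacy_level == "me" and library.owner == actor:
        return True
    # "instance": any actor on the same domain as the owner
    if library.privacy_level == "instance" and library.owner.domain == actor.domain:
        return True
    # otherwise, an approved follow grants access
    return library in approved_follows
```

For instance, an `instance`-level library is visible to another local actor but hidden from a remote one unless that remote actor has an approved follow.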
class Library(federation_models.FederationMixin):
    uuid = models.UUIDField(unique=True, db_index=True, default=uuid.uuid4)
    actor = models.ForeignKey(
        "federation.Actor", related_name="libraries", on_delete=models.CASCADE
    )
    followers_url = models.URLField(max_length=500)
    creation_date = models.DateTimeField(default=timezone.now)
    name = models.CharField(max_length=100)
    description = models.TextField(max_length=5000, null=True, blank=True)
    privacy_level = models.CharField(
        choices=LIBRARY_PRIVACY_LEVEL_CHOICES, default="me", max_length=25
    )
    uploads_count = models.PositiveIntegerField(default=0)
    objects = LibraryQuerySet.as_manager()

    def get_federation_id(self):
        return federation_utils.full_url(
            reverse("federation:music:libraries-detail", kwargs={"uuid": self.uuid})
        )

    def save(self, **kwargs):
        if not self.pk and not self.fid and self.actor.get_user():
            self.fid = self.get_federation_id()
            self.followers_url = self.fid + "/followers"

        return super().save(**kwargs)

    def should_autoapprove_follow(self, actor):
        if self.privacy_level == "everyone":
            return True
        if self.privacy_level == "instance" and actor.get_user():
            return True
        return False

    def schedule_scan(self, actor, force=False):
        latest_scan = (
            self.scans.exclude(status="errored").order_by("-creation_date").first()
        )
        delay_between_scans = datetime.timedelta(seconds=3600 * 24)
        now = timezone.now()
        if (
            not force
            and latest_scan
            and latest_scan.creation_date + delay_between_scans > now
        ):
            return

        scan = self.scans.create(total_files=self.uploads_count, actor=actor)
        from . import tasks

        common_utils.on_commit(tasks.start_library_scan.delay, library_scan_id=scan.pk)
        return scan
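The throttling rule in `schedule_scan` above (skip a new scan when a non-errored scan ran less than 24 hours ago, unless forced) can be isolated as a small predicate. `should_scan` is a hypothetical helper written for illustration:

```python
import datetime

# Same 24-hour window as Library.schedule_scan above
DELAY_BETWEEN_SCANS = datetime.timedelta(seconds=3600 * 24)


def should_scan(latest_scan_date, now, force=False):
    # force bypasses throttling; no previous scan means we always scan
    if force or latest_scan_date is None:
        return True
    # otherwise only scan once the delay has fully elapsed
    return latest_scan_date + DELAY_BETWEEN_SCANS <= now
```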
SCAN_STATUS = [
    ("pending", "pending"),
    ("scanning", "scanning"),
    ("errored", "errored"),
    ("finished", "finished"),
]


class LibraryScan(models.Model):
    actor = models.ForeignKey(
        "federation.Actor", null=True, blank=True, on_delete=models.CASCADE
    )
    library = models.ForeignKey(Library, related_name="scans", on_delete=models.CASCADE)
    total_files = models.PositiveIntegerField(default=0)
    processed_files = models.PositiveIntegerField(default=0)
    errored_files = models.PositiveIntegerField(default=0)
    status = models.CharField(default="pending", max_length=25)
    creation_date = models.DateTimeField(default=timezone.now)
    modification_date = models.DateTimeField(null=True, blank=True)


@receiver(post_save, sender=ImportJob)
def update_batch_status(sender, instance, **kwargs):
@@ -1,24 +0,0 @@
from rest_framework.permissions import BasePermission

from funkwhale_api.common import preferences
from funkwhale_api.federation import actors, models


class Listen(BasePermission):
    def has_permission(self, request, view):
        if not preferences.get("common__api_authentication_required"):
            return True

        user = getattr(request, "user", None)
        if user and user.is_authenticated:
            return True

        actor = getattr(request, "actor", None)
        if actor is None:
            return False

        library = actors.SYSTEM_ACTORS["library"].get_actor_instance()
        return models.Follow.objects.filter(
            target=library, actor=actor, approved=True
        ).exists()
@@ -1,12 +1,14 @@
from django.db.models import Q
from django.db import transaction
from rest_framework import serializers
from taggit.models import Tag
from versatileimagefield.serializers import VersatileImageFieldSerializer

from funkwhale_api.activity import serializers as activity_serializers
from funkwhale_api.users.serializers import UserBasicSerializer
from funkwhale_api.common import serializers as common_serializers
from funkwhale_api.common import utils as common_utils
from funkwhale_api.federation import routes

from . import models, tasks
from . import filters, models, tasks


cover_field = VersatileImageFieldSerializer(allow_null=True, sizes="square")

@@ -15,6 +17,7 @@ cover_field = VersatileImageFieldSerializer(allow_null=True, sizes="square")
class ArtistAlbumSerializer(serializers.ModelSerializer):
    tracks_count = serializers.SerializerMethodField()
    cover = cover_field
    is_playable = serializers.SerializerMethodField()

    class Meta:
        model = models.Album

@@ -27,11 +30,18 @@ class ArtistAlbumSerializer(serializers.ModelSerializer):
            "cover",
            "creation_date",
            "tracks_count",
            "is_playable",
        )

    def get_tracks_count(self, o):
        return o._tracks_count

    def get_is_playable(self, obj):
        try:
            return bool(obj.is_playable_by_actor)
        except AttributeError:
            return None


class ArtistWithAlbumsSerializer(serializers.ModelSerializer):
    albums = ArtistAlbumSerializer(many=True, read_only=True)
@@ -41,30 +51,6 @@ class ArtistWithAlbumsSerializer(serializers.ModelSerializer):
        fields = ("id", "mbid", "name", "creation_date", "albums")


class TrackFileSerializer(serializers.ModelSerializer):
    path = serializers.SerializerMethodField()

    class Meta:
        model = models.TrackFile
        fields = (
            "id",
            "path",
            "source",
            "filename",
            "mimetype",
            "track",
            "duration",
            "mimetype",
            "bitrate",
            "size",
        )
        read_only_fields = ["duration", "mimetype", "bitrate", "size"]

    def get_path(self, o):
        url = o.path
        return url


class ArtistSimpleSerializer(serializers.ModelSerializer):
    class Meta:
        model = models.Artist
@@ -72,8 +58,10 @@ class ArtistSimpleSerializer(serializers.ModelSerializer):


class AlbumTrackSerializer(serializers.ModelSerializer):
    files = TrackFileSerializer(many=True, read_only=True)
    artist = ArtistSimpleSerializer(read_only=True)
    is_playable = serializers.SerializerMethodField()
    listen_url = serializers.SerializerMethodField()
    duration = serializers.SerializerMethodField()

    class Meta:
        model = models.Track

@@ -84,15 +72,33 @@ class AlbumTrackSerializer(serializers.ModelSerializer):
            "album",
            "artist",
            "creation_date",
            "files",
            "position",
            "is_playable",
            "listen_url",
            "duration",
        )

    def get_is_playable(self, obj):
        try:
            return bool(obj.is_playable_by_actor)
        except AttributeError:
            return None

    def get_listen_url(self, obj):
        return obj.listen_url

    def get_duration(self, obj):
        try:
            return obj.duration
        except AttributeError:
            return None


class AlbumSerializer(serializers.ModelSerializer):
    tracks = serializers.SerializerMethodField()
    artist = ArtistSimpleSerializer(read_only=True)
    cover = cover_field
    is_playable = serializers.SerializerMethodField()

    class Meta:
        model = models.Album

@@ -105,6 +111,7 @@ class AlbumSerializer(serializers.ModelSerializer):
            "release_date",
            "cover",
            "creation_date",
            "is_playable",
        )

    def get_tracks(self, o):

@@ -114,6 +121,12 @@ class AlbumSerializer(serializers.ModelSerializer):
        )
        return AlbumTrackSerializer(ordered_tracks, many=True).data

    def get_is_playable(self, obj):
        try:
            return any([bool(t.is_playable_by_actor) for t in obj.tracks.all()])
        except AttributeError:
            return None


class TrackAlbumSerializer(serializers.ModelSerializer):
    artist = ArtistSimpleSerializer(read_only=True)
@@ -133,10 +146,15 @@ class TrackAlbumSerializer(serializers.ModelSerializer):


class TrackSerializer(serializers.ModelSerializer):
    files = TrackFileSerializer(many=True, read_only=True)
    artist = ArtistSimpleSerializer(read_only=True)
    album = TrackAlbumSerializer(read_only=True)
    lyrics = serializers.SerializerMethodField()
    is_playable = serializers.SerializerMethodField()
    listen_url = serializers.SerializerMethodField()
    duration = serializers.SerializerMethodField()
    bitrate = serializers.SerializerMethodField()
    size = serializers.SerializerMethodField()
    mimetype = serializers.SerializerMethodField()

    class Meta:
        model = models.Track
@@ -147,14 +165,184 @@ class TrackSerializer(serializers.ModelSerializer):
            "album",
            "artist",
            "creation_date",
            "files",
            "position",
            "lyrics",
            "is_playable",
            "listen_url",
            "duration",
            "bitrate",
            "size",
            "mimetype",
        )

    def get_lyrics(self, obj):
        return obj.get_lyrics_url()

    def get_listen_url(self, obj):
        return obj.listen_url

    def get_is_playable(self, obj):
        try:
            return bool(obj.is_playable_by_actor)
        except AttributeError:
            return None

    def get_duration(self, obj):
        try:
            return obj.duration
        except AttributeError:
            return None

    def get_bitrate(self, obj):
        try:
            return obj.bitrate
        except AttributeError:
            return None

    def get_size(self, obj):
        try:
            return obj.size
        except AttributeError:
            return None

    def get_mimetype(self, obj):
        try:
            return obj.mimetype
        except AttributeError:
            return None


class LibraryForOwnerSerializer(serializers.ModelSerializer):
    uploads_count = serializers.SerializerMethodField()
    size = serializers.SerializerMethodField()

    class Meta:
        model = models.Library
        fields = [
            "uuid",
            "fid",
            "name",
            "description",
            "privacy_level",
            "uploads_count",
            "size",
            "creation_date",
        ]
        read_only_fields = ["fid", "uuid", "creation_date", "actor"]

    def get_uploads_count(self, o):
        return getattr(o, "_uploads_count", o.uploads_count)

    def get_size(self, o):
        return getattr(o, "_size", 0)
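The `getattr(o, "_uploads_count", o.uploads_count)` line in `get_uploads_count` above is an annotate-or-fallback pattern: when the queryset was annotated, the serializer prefers the fresh `_uploads_count` attribute; otherwise it falls back to the denormalized counter stored on the model. A minimal stand-in (`FakeLibrary` and `uploads_count_for` are illustrative only):

```python
class FakeLibrary:
    def __init__(self, uploads_count, annotated=None):
        # stored, denormalized counter
        self.uploads_count = uploads_count
        # the queryset annotation only exists when explicitly requested
        if annotated is not None:
            self._uploads_count = annotated


def uploads_count_for(o):
    # prefer the annotation, fall back to the stored counter
    return getattr(o, "_uploads_count", o.uploads_count)
```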
class UploadSerializer(serializers.ModelSerializer):
    track = TrackSerializer(required=False, allow_null=True)
    library = common_serializers.RelatedField(
        "uuid",
        LibraryForOwnerSerializer(),
        required=True,
        filters=lambda context: {"actor": context["user"].actor},
    )

    class Meta:
        model = models.Upload
        fields = [
            "uuid",
            "filename",
            "creation_date",
            "mimetype",
            "track",
            "library",
            "duration",
            "mimetype",
            "bitrate",
            "size",
            "import_date",
            "import_status",
        ]

        read_only_fields = [
            "uuid",
            "creation_date",
            "duration",
            "mimetype",
            "bitrate",
            "size",
            "track",
            "import_date",
            "import_status",
        ]
class UploadForOwnerSerializer(UploadSerializer):
    class Meta(UploadSerializer.Meta):
        fields = UploadSerializer.Meta.fields + [
            "import_details",
            "import_metadata",
            "import_reference",
            "metadata",
            "source",
            "audio_file",
        ]
        write_only_fields = ["audio_file"]
        read_only_fields = UploadSerializer.Meta.read_only_fields + [
            "import_details",
            "import_metadata",
            "metadata",
        ]

    def to_representation(self, obj):
        r = super().to_representation(obj)
        if "audio_file" in r:
            del r["audio_file"]
        return r

    def validate(self, validated_data):
        if "audio_file" in validated_data:
            self.validate_upload_quota(validated_data["audio_file"])

        return super().validate(validated_data)

    def validate_upload_quota(self, f):
        quota_status = self.context["user"].get_quota_status()
        if (f.size / 1000 / 1000) > quota_status["remaining"]:
            raise serializers.ValidationError("upload_quota_reached")

        return f
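The quota check in `validate_upload_quota` above compares the file size in bytes against a remaining quota expressed in megabytes (decimal megabytes, hence the two divisions by 1000). Isolated as a predicate (`exceeds_quota` is a hypothetical name for illustration):

```python
def exceeds_quota(size_bytes, remaining_mb):
    # size is in bytes, remaining quota in decimal megabytes,
    # matching the (f.size / 1000 / 1000) conversion above
    return (size_bytes / 1000 / 1000) > remaining_mb
```

A file exactly at the limit is accepted; only files strictly larger than the remaining quota are rejected.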
class UploadActionSerializer(common_serializers.ActionSerializer):
    actions = [
        common_serializers.Action("delete", allow_all=True),
        common_serializers.Action("relaunch_import", allow_all=True),
    ]
    filterset_class = filters.UploadFilter
    pk_field = "uuid"

    @transaction.atomic
    def handle_delete(self, objects):
        libraries = sorted(set(objects.values_list("library", flat=True)))
        for id in libraries:
            # we group deletes by library for easier federation
            uploads = objects.filter(library__pk=id).select_related("library__actor")
            for chunk in common_utils.chunk_queryset(uploads, 100):
                routes.outbox.dispatch(
                    {"type": "Delete", "object": {"type": "Audio"}},
                    context={"uploads": chunk},
                )

        return objects.delete()

    @transaction.atomic
    def handle_relaunch_import(self, objects):
        qs = objects.exclude(import_status="finished")
        pks = list(qs.values_list("id", flat=True))
        qs.update(import_status="pending")
        for pk in pks:
            common_utils.on_commit(tasks.process_upload.delay, upload_id=pk)


class TagSerializer(serializers.ModelSerializer):
    class Meta:

@@ -176,40 +364,6 @@ class LyricsSerializer(serializers.ModelSerializer):
        fields = ("id", "work", "content", "content_rendered")


class ImportJobSerializer(serializers.ModelSerializer):
    track_file = TrackFileSerializer(read_only=True)

    class Meta:
        model = models.ImportJob
        fields = ("id", "mbid", "batch", "source", "status", "track_file", "audio_file")
        read_only_fields = ("status", "track_file")


class ImportBatchSerializer(serializers.ModelSerializer):
    submitted_by = UserBasicSerializer(read_only=True)

    class Meta:
        model = models.ImportBatch
        fields = (
            "id",
            "submitted_by",
            "source",
            "status",
            "creation_date",
            "import_request",
        )
        read_only_fields = ("creation_date", "submitted_by", "source")

    def to_representation(self, instance):
        repr = super().to_representation(instance)
        try:
            repr["job_count"] = instance.job_count
        except AttributeError:
            # Queryset was not annotated
            pass
        return repr


class TrackActivitySerializer(activity_serializers.ModelSerializer):
    type = serializers.SerializerMethodField()
    name = serializers.CharField(source="title")
@@ -222,33 +376,3 @@ class TrackActivitySerializer(activity_serializers.ModelSerializer):

    def get_type(self, obj):
        return "Audio"


class ImportJobRunSerializer(serializers.Serializer):
    jobs = serializers.PrimaryKeyRelatedField(
        many=True,
        queryset=models.ImportJob.objects.filter(status__in=["pending", "errored"]),
    )
    batches = serializers.PrimaryKeyRelatedField(
        many=True, queryset=models.ImportBatch.objects.all()
    )

    def validate(self, validated_data):
        jobs = validated_data["jobs"]
        batches_ids = [b.pk for b in validated_data["batches"]]
        query = Q(batch__pk__in=batches_ids)
        query |= Q(pk__in=[j.id for j in jobs])
        queryset = (
            models.ImportJob.objects.filter(query)
            .filter(status__in=["pending", "errored"])
            .distinct()
        )
        validated_data["_jobs"] = queryset
        return validated_data

    def create(self, validated_data):
        ids = validated_data["_jobs"].values_list("id", flat=True)
        validated_data["_jobs"].update(status="pending")
        for id in ids:
            tasks.import_job_run.delay(import_job_id=id)
        return {"jobs": list(ids)}
@@ -0,0 +1,5 @@
import django.dispatch

upload_import_status_updated = django.dispatch.Signal(
    providing_args=["old_status", "new_status", "upload"]
)
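The new `signals.py` above declares a Django signal fired when an upload's import status changes. A minimal, dependency-free stand-in for `django.dispatch.Signal` shows how such a signal is connected and sent; the `Signal` class and the receiver below are illustrative, not the Django implementation:

```python
class Signal:
    """Tiny observer registry mimicking django.dispatch.Signal."""

    def __init__(self):
        self._receivers = []

    def connect(self, receiver):
        self._receivers.append(receiver)

    def send(self, sender, **kwargs):
        # call every receiver with the sender and the signal's kwargs
        return [(r, r(sender=sender, **kwargs)) for r in self._receivers]


upload_import_status_updated = Signal()

seen = []
upload_import_status_updated.connect(
    lambda sender, old_status, new_status, upload: seen.append((old_status, new_status))
)
upload_import_status_updated.send(
    sender=None, old_status="pending", new_status="finished", upload=object()
)
```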
@@ -1,228 +1,54 @@
import collections
import logging
import os

from django.conf import settings
from django.core.files.base import ContentFile
from musicbrainzngs import ResponseError
from django.utils import timezone
from django.db import transaction
from django.db.models import F, Q
from django.dispatch import receiver

from funkwhale_api.common import preferences
from funkwhale_api.federation import activity, actors
from funkwhale_api.federation import serializers as federation_serializers
from funkwhale_api.providers.acoustid import get_acoustid_client
from funkwhale_api.providers.audiofile import tasks as audiofile_tasks
from musicbrainzngs import ResponseError
from requests.exceptions import RequestException

from funkwhale_api.common import channels
from funkwhale_api.federation import routes
from funkwhale_api.federation import library as lb
from funkwhale_api.taskapp import celery

from . import lyrics as lyrics_utils
from . import models
from . import utils as music_utils
from . import metadata
from . import signals
from . import serializers

logger = logging.getLogger(__name__)


@celery.app.task(name="acoustid.set_on_track_file")
@celery.require_instance(models.TrackFile, "track_file")
def set_acoustid_on_track_file(track_file):
    client = get_acoustid_client()
    result = client.get_best_match(track_file.audio_file.path)

    def update(id):
        track_file.acoustid_track_id = id
        track_file.save(update_fields=["acoustid_track_id"])
        return id

    if result:
        return update(result["id"])


def import_track_from_remote(library_track):
    metadata = library_track.metadata
    try:
        track_mbid = metadata["recording"]["musicbrainz_id"]
        assert track_mbid  # for null/empty values
    except (KeyError, AssertionError):
        pass
    else:
        return models.Track.get_or_create_from_api(mbid=track_mbid)[0]

    try:
        album_mbid = metadata["release"]["musicbrainz_id"]
        assert album_mbid  # for null/empty values
    except (KeyError, AssertionError):
        pass
    else:
        album, _ = models.Album.get_or_create_from_api(mbid=album_mbid)
        return models.Track.get_or_create_from_title(
            library_track.title, artist=album.artist, album=album
        )[0]

    try:
        artist_mbid = metadata["artist"]["musicbrainz_id"]
        assert artist_mbid  # for null/empty values
    except (KeyError, AssertionError):
        pass
    else:
        artist, _ = models.Artist.get_or_create_from_api(mbid=artist_mbid)
        album, _ = models.Album.get_or_create_from_title(
            library_track.album_title, artist=artist
        )
        return models.Track.get_or_create_from_title(
            library_track.title, artist=artist, album=album
        )[0]

    # worst case scenario, we have absolutely no way to link to a
    # musicbrainz resource, we rely on the name/titles
    artist, _ = models.Artist.get_or_create_from_name(library_track.artist_name)
    album, _ = models.Album.get_or_create_from_title(
        library_track.album_title, artist=artist
    )
    return models.Track.get_or_create_from_title(
        library_track.title, artist=artist, album=album
    )[0]
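`import_track_from_remote` above repeats a `try`/`assert`/`except`/`else` cascade: each lookup pulls a nested MusicBrainz id from the metadata, the `assert` rejects null or empty values, and control falls through to the next candidate on failure. The pattern can be distilled into one hypothetical helper (`first_present` is not part of the codebase):

```python
def first_present(metadata, *paths):
    """Return the first non-empty value found at any of the key paths."""
    for path in paths:
        try:
            value = metadata
            for key in path:
                value = value[key]
            assert value  # reject None / empty values, as above
        except (KeyError, AssertionError):
            continue  # fall through to the next candidate
        else:
            return value
    return None
```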
def _do_import(import_job, use_acoustid=False):
    logger.info("[Import Job %s] starting job", import_job.pk)
    from_file = bool(import_job.audio_file)
    mbid = import_job.mbid
    replace = import_job.replace_if_duplicate
    acoustid_track_id = None
    duration = None
    track = None
    # use_acoustid = use_acoustid and preferences.get('providers_acoustid__api_key')
    # Acoustid is not reliable, we disable it for now.
    use_acoustid = False
    if not mbid and use_acoustid and from_file:
        # we try to deduce mbid from acoustid
        client = get_acoustid_client()
        match = client.get_best_match(import_job.audio_file.path)
        if match:
            duration = match["recordings"][0]["duration"]
            mbid = match["recordings"][0]["id"]
            acoustid_track_id = match["id"]
    if mbid:
        logger.info(
            "[Import Job %s] importing track from musicbrainz recording %s",
            import_job.pk,
            str(mbid),
        )
        track, _ = models.Track.get_or_create_from_api(mbid=mbid)
    elif import_job.audio_file:
        logger.info(
            "[Import Job %s] importing track from uploaded track data at %s",
            import_job.pk,
            import_job.audio_file.path,
        )
        track = audiofile_tasks.import_track_data_from_path(import_job.audio_file.path)
    elif import_job.library_track:
        logger.info(
            "[Import Job %s] importing track from federated library track %s",
            import_job.pk,
            import_job.library_track.pk,
        )
        track = import_track_from_remote(import_job.library_track)
    elif import_job.source.startswith("file://"):
        tf_path = import_job.source.replace("file://", "", 1)
        logger.info(
            "[Import Job %s] importing track from local track data at %s",
            import_job.pk,
            tf_path,
        )
        track = audiofile_tasks.import_track_data_from_path(tf_path)
    else:
        raise ValueError(
            "Not enough data to process import, "
            "add a mbid, an audio file or a library track"
        )

    track_file = None
    if replace:
        logger.info("[Import Job %s] deleting existing audio file", import_job.pk)
        track.files.all().delete()
    elif track.files.count() > 0:
        logger.info(
            "[Import Job %s] skipping, we already have a file for this track",
            import_job.pk,
        )
        if import_job.audio_file:
            import_job.audio_file.delete()
        import_job.status = "skipped"
        import_job.save()
        return

    track_file = track_file or models.TrackFile(track=track, source=import_job.source)
    track_file.acoustid_track_id = acoustid_track_id
    if from_file:
        track_file.audio_file = ContentFile(import_job.audio_file.read())
        track_file.audio_file.name = import_job.audio_file.name
        track_file.duration = duration
    elif import_job.library_track:
        track_file.library_track = import_job.library_track
        track_file.mimetype = import_job.library_track.audio_mimetype
        if import_job.library_track.library.download_files:
            raise NotImplementedError()
        else:
            # no downloading, we hotlink
            pass
    elif not import_job.audio_file and not import_job.source.startswith("file://"):
        # not an inplace import, and we have a source, so let's download it
        logger.info("[Import Job %s] downloading audio file from remote", import_job.pk)
        track_file.download_file()
    elif not import_job.audio_file and import_job.source.startswith("file://"):
        # in place import, we set mimetype from extension
        path, ext = os.path.splitext(import_job.source)
        track_file.mimetype = music_utils.get_type_from_ext(ext)
    track_file.set_audio_data()
    track_file.save()
    # if no cover is set on track album, we try to update it as well:
    if not track.album.cover:
        logger.info("[Import Job %s] retrieving album cover", import_job.pk)
        update_album_cover(track.album, track_file)
    import_job.status = "finished"
    import_job.track_file = track_file
    if import_job.audio_file:
        # it's imported on the track, we don't need it anymore
        import_job.audio_file.delete()
    import_job.save()
    logger.info("[Import Job %s] job finished", import_job.pk)
    return track_file
def update_album_cover(album, track_file, replace=False):
def update_album_cover(album, source=None, cover_data=None, replace=False):
    if album.cover and not replace:
        return
    if cover_data:
        return album.get_image(data=cover_data)

    if track_file:
        # maybe the file has a cover embedded?
    if source and source.startswith("file://"):
        # let's look for a cover in the same directory
        path = os.path.dirname(source.replace("file://", "", 1))
        logger.info("[Album %s] scanning covers from %s", album.pk, path)
        cover = get_cover_from_fs(path)
        if cover:
            return album.get_image(data=cover)
    if album.mbid:
        try:
            metadata = track_file.get_metadata()
        except FileNotFoundError:
            metadata = None
        if metadata:
            cover = metadata.get_picture("cover_front")
            if cover:
                # best case scenario, cover is embedded in the track
                logger.info("[Album %s] Using cover embedded in file", album.pk)
                return album.get_image(data=cover)
        if track_file.source and track_file.source.startswith("file://"):
            # let's look for a cover in the same directory
            path = os.path.dirname(track_file.source.replace("file://", "", 1))
            logger.info("[Album %s] scanning covers from %s", album.pk, path)
            cover = get_cover_from_fs(path)
            if cover:
                return album.get_image(data=cover)
    if not album.mbid:
        return
    try:
        logger.info(
            "[Album %s] Fetching cover from musicbrainz release %s",
            album.pk,
            str(album.mbid),
        )
        return album.get_image()
    except ResponseError as exc:
        logger.warning(
            "[Album %s] cannot fetch cover from musicbrainz: %s", album.pk, str(exc)
        )
        logger.info(
            "[Album %s] Fetching cover from musicbrainz release %s",
            album.pk,
            str(album.mbid),
        )
        return album.get_image()
    except ResponseError as exc:
        logger.warning(
            "[Album %s] cannot fetch cover from musicbrainz: %s", album.pk, str(exc)
        )


IMAGE_TYPES = [("jpg", "image/jpeg"), ("png", "image/png")]
@@ -240,37 +66,6 @@ def get_cover_from_fs(dir_path):
    return {"mimetype": m, "content": c.read()}


@celery.app.task(name="ImportJob.run", bind=True)
@celery.require_instance(
    models.ImportJob.objects.filter(status__in=["pending", "errored"]), "import_job"
)
def import_job_run(self, import_job, use_acoustid=False):
    def mark_errored(exc):
        logger.error("[Import Job %s] Error during import: %s", import_job.pk, str(exc))
        import_job.status = "errored"
        import_job.save(update_fields=["status"])

    try:
        tf = _do_import(import_job, use_acoustid=use_acoustid)
        return tf.pk if tf else None
    except Exception as exc:
        if not settings.DEBUG:
            try:
                self.retry(exc=exc, countdown=30, max_retries=3)
            except Exception:
                mark_errored(exc)
                raise
        mark_errored(exc)
        raise


@celery.app.task(name="ImportBatch.run")
@celery.require_instance(models.ImportBatch, "import_batch")
def import_batch_run(import_batch):
    for job_id in import_batch.jobs.order_by("id").values_list("id", flat=True):
        import_job_run.delay(import_job_id=job_id)


@celery.app.task(name="Lyrics.fetch_content")
@celery.require_instance(models.Lyrics, "lyrics")
def fetch_content(lyrics):
@ -281,40 +76,453 @@ def fetch_content(lyrics):
|
|||
lyrics.save(update_fields=["content"])
|
||||
|
||||
|
||||
@celery.app.task(name="music.import_batch_notify_followers")
|
||||
@celery.app.task(name="music.start_library_scan")
|
||||
@celery.require_instance(
|
||||
models.ImportBatch.objects.filter(status="finished"), "import_batch"
|
||||
models.LibraryScan.objects.select_related().filter(status="pending"), "library_scan"
|
||||
)
|
||||
def import_batch_notify_followers(import_batch):
|
||||
if not preferences.get("federation__enabled"):
|
||||
return
|
||||
def start_library_scan(library_scan):
|
||||
try:
|
||||
data = lb.get_library_data(library_scan.library.fid, actor=library_scan.actor)
|
||||
except Exception:
|
||||
library_scan.status = "errored"
|
||||
library_scan.save(update_fields=["status", "modification_date"])
|
||||
raise
|
||||
library_scan.modification_date = timezone.now()
|
||||
library_scan.status = "scanning"
|
||||
library_scan.total_files = data["totalItems"]
|
||||
library_scan.save(update_fields=["status", "modification_date", "total_files"])
|
||||
scan_library_page.delay(library_scan_id=library_scan.pk, page_url=data["first"])
|
||||
|
||||
if import_batch.source == "federation":
|
||||
return
|
||||
|
||||
library_actor = actors.SYSTEM_ACTORS["library"].get_actor_instance()
|
||||
followers = library_actor.get_approved_followers()
|
||||
jobs = import_batch.jobs.filter(
|
||||
status="finished", library_track__isnull=True, track_file__isnull=False
|
||||
).select_related("track_file__track__artist", "track_file__track__album__artist")
|
||||
track_files = [job.track_file for job in jobs]
|
||||
collection = federation_serializers.CollectionSerializer(
|
||||
{
|
||||
"actor": library_actor,
|
||||
"id": import_batch.get_federation_url(),
|
||||
"items": track_files,
|
||||
"item_serializer": federation_serializers.AudioSerializer,
|
||||
@celery.app.task(
|
||||
name="music.scan_library_page",
|
||||
retry_backoff=60,
|
||||
max_retries=5,
|
||||
autoretry_for=[RequestException],
|
||||
)
|
||||
@celery.require_instance(
|
||||
models.LibraryScan.objects.select_related().filter(status="scanning"),
|
||||
"library_scan",
|
||||
)
|
||||
def scan_library_page(library_scan, page_url):
|
||||
data = lb.get_library_page(library_scan.library, page_url, library_scan.actor)
|
||||
uploads = []
|
||||
|
||||
for item_serializer in data["items"]:
|
||||
upload = item_serializer.save(library=library_scan.library)
|
||||
uploads.append(upload)
|
||||
|
||||
library_scan.processed_files = F("processed_files") + len(uploads)
|
||||
library_scan.modification_date = timezone.now()
|
||||
update_fields = ["modification_date", "processed_files"]
|
||||
|
||||
next_page = data.get("next")
|
||||
fetch_next = next_page and next_page != page_url
|
||||
|
||||
if not fetch_next:
|
||||
update_fields.append("status")
|
||||
library_scan.status = "finished"
|
||||
library_scan.save(update_fields=update_fields)
|
||||
|
||||
if fetch_next:
|
||||
scan_library_page.delay(library_scan_id=library_scan.pk, page_url=next_page)
|
||||
|
||||
|
+def getter(data, *keys, default=None):
+    if not data:
+        return default
+    v = data
+    for k in keys:
+        try:
+            v = v[k]
+        except KeyError:
+            return default
+
+    return v
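The `getter` helper added above is pure Python and easy to exercise in isolation; the snippet below is a standalone copy (the `conf` dict is invented for illustration):

```python
def getter(data, *keys, default=None):
    # walk nested mappings key by key, falling back to `default`
    # as soon as a key is missing or the input is empty
    if not data:
        return default
    v = data
    for k in keys:
        try:
            v = v[k]
        except KeyError:
            return default
    return v


conf = {"funkwhale": {"config": {"broadcast": False}}}
print(getter(conf, "funkwhale", "config", "broadcast"))  # False
print(getter(conf, "funkwhale", "missing", default=True))  # True
```

Note that only `KeyError` is caught, so indexing into a non-mapping value still raises.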
+
+
+class UploadImportError(ValueError):
+    def __init__(self, code):
+        self.code = code
+        super().__init__(code)
+
+
+def fail_import(upload, error_code):
+    old_status = upload.import_status
+    upload.import_status = "errored"
+    upload.import_details = {"error_code": error_code}
+    upload.import_date = timezone.now()
+    upload.save(update_fields=["import_details", "import_status", "import_date"])
+
+    broadcast = getter(
+        upload.import_metadata, "funkwhale", "config", "broadcast", default=True
+    )
+    if broadcast:
+        signals.upload_import_status_updated.send(
+            old_status=old_status,
+            new_status=upload.import_status,
+            upload=upload,
+            sender=None,
+        )
+
+
+@celery.app.task(name="music.process_upload")
+@celery.require_instance(
+    models.Upload.objects.filter(import_status="pending").select_related(
+        "library__actor__user"
+    ),
+    "upload",
+)
+def process_upload(upload):
+    import_metadata = upload.import_metadata or {}
+    old_status = upload.import_status
+    audio_file = upload.get_audio_file()
+    try:
+        additional_data = {}
+        if not audio_file:
+            # we can only rely on user provided data
+            final_metadata = import_metadata
+        else:
+            # we use user provided data and data from the file itself
+            m = metadata.Metadata(audio_file)
+            file_metadata = m.all()
+            final_metadata = collections.ChainMap(
+                additional_data, import_metadata, file_metadata
+            )
+            additional_data["cover_data"] = m.get_picture("cover_front")
+        additional_data["upload_source"] = upload.source
+        track = get_track_from_import_metadata(final_metadata)
+    except UploadImportError as e:
+        return fail_import(upload, e.code)
+    except Exception:
+        fail_import(upload, "unknown_error")
+        raise
+
+    # under some situations, we want to skip the import (
+    # for instance if the user already owns the files)
+    owned_duplicates = get_owned_duplicates(upload, track)
+    upload.track = track
+
+    if owned_duplicates:
+        upload.import_status = "skipped"
+        upload.import_details = {
+            "code": "already_imported_in_owned_libraries",
+            "duplicates": list(owned_duplicates),
+        }
-    ).data
-    for f in followers:
-        create = federation_serializers.ActivitySerializer(
-            {
-                "type": "Create",
-                "id": collection["id"],
-                "object": collection,
-                "actor": library_actor.url,
-                "to": [f.url],
-            }
-        ).data
+        upload.import_date = timezone.now()
+        upload.save(
+            update_fields=["import_details", "import_status", "import_date", "track"]
+        )
+        signals.upload_import_status_updated.send(
+            old_status=old_status,
+            new_status=upload.import_status,
+            upload=upload,
+            sender=None,
+        )
+        return
-        activity.deliver(create, on_behalf_of=library_actor, to=[f.url])
+    # all is good, let's finalize the import
+    audio_data = upload.get_audio_data()
+    if audio_data:
+        upload.duration = audio_data["duration"]
+        upload.size = audio_data["size"]
+        upload.bitrate = audio_data["bitrate"]
+    upload.import_status = "finished"
+    upload.import_date = timezone.now()
+    upload.save(
+        update_fields=[
+            "track",
+            "import_status",
+            "import_date",
+            "size",
+            "duration",
+            "bitrate",
+        ]
+    )
+    broadcast = getter(
+        import_metadata, "funkwhale", "config", "broadcast", default=True
+    )
+    if broadcast:
+        signals.upload_import_status_updated.send(
+            old_status=old_status,
+            new_status=upload.import_status,
+            upload=upload,
+            sender=None,
+        )
+    dispatch_outbox = getter(
+        import_metadata, "funkwhale", "config", "dispatch_outbox", default=True
+    )
+    if dispatch_outbox:
+        routes.outbox.dispatch(
+            {"type": "Create", "object": {"type": "Audio"}}, context={"upload": upload}
+        )
+
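`process_upload` merges its metadata sources with `collections.ChainMap`, where a lookup returns the value from the first mapping that contains the key, so computed data wins over user-supplied metadata, which wins over file tags. The precedence can be checked in isolation; the sample values here are invented:

```python
import collections

additional_data = {"upload_source": "upload://example"}       # computed during import
import_metadata = {"title": "User-supplied title"}            # sent by the uploader
file_metadata = {"title": "Tag title", "album": "Tag album"}  # read from the file

final_metadata = collections.ChainMap(additional_data, import_metadata, file_metadata)
print(final_metadata["title"])  # User-supplied title
print(final_metadata["album"])  # Tag album
```

Because `ChainMap` is a view, keys added to `additional_data` after construction (as the task does with `cover_data`) are immediately visible through `final_metadata`.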
+
+def federation_audio_track_to_metadata(payload):
+    """
+    Given a valid payload as returned by federation.serializers.TrackSerializer.validated_data,
+    returns a correct metadata payload for use with get_track_from_import_metadata.
+    """
+    musicbrainz_recordingid = payload.get("musicbrainzId")
+    musicbrainz_artistid = payload["artists"][0].get("musicbrainzId")
+    musicbrainz_albumartistid = payload["album"]["artists"][0].get("musicbrainzId")
+    musicbrainz_albumid = payload["album"].get("musicbrainzId")
+
+    new_data = {
+        "title": payload["name"],
+        "album": payload["album"]["name"],
+        "track_number": payload["position"],
+        "artist": payload["artists"][0]["name"],
+        "album_artist": payload["album"]["artists"][0]["name"],
+        "date": payload["album"].get("released"),
+        # musicbrainz
+        "musicbrainz_recordingid": str(musicbrainz_recordingid)
+        if musicbrainz_recordingid
+        else None,
+        "musicbrainz_artistid": str(musicbrainz_artistid)
+        if musicbrainz_artistid
+        else None,
+        "musicbrainz_albumartistid": str(musicbrainz_albumartistid)
+        if musicbrainz_albumartistid
+        else None,
+        "musicbrainz_albumid": str(musicbrainz_albumid)
+        if musicbrainz_albumid
+        else None,
+        # federation
+        "fid": payload["id"],
+        "artist_fid": payload["artists"][0]["id"],
+        "album_artist_fid": payload["album"]["artists"][0]["id"],
+        "album_fid": payload["album"]["id"],
+        "fdate": payload["published"],
+        "album_fdate": payload["album"]["published"],
+        "album_artist_fdate": payload["album"]["artists"][0]["published"],
+        "artist_fdate": payload["artists"][0]["published"],
+    }
+    cover = payload["album"].get("cover")
+    if cover:
+        new_data["cover_data"] = {"mimetype": cover["mediaType"], "url": cover["href"]}
+    return new_data
+
+
+def get_owned_duplicates(upload, track):
+    """
+    Ensure we skip duplicate tracks to avoid wasting user/instance storage
+    """
+    owned_libraries = upload.library.actor.libraries.all()
+    return (
+        models.Upload.objects.filter(
+            track__isnull=False, library__in=owned_libraries, track=track
+        )
+        .exclude(pk=upload.pk)
+        .values_list("uuid", flat=True)
+    )
+
+
+def get_best_candidate_or_create(model, query, defaults, sort_fields):
+    """
+    Like queryset.get_or_create() but does not crash if multiple objects
+    are returned on the get() call
+    """
+    candidates = model.objects.filter(query)
+    if candidates:
+        return sort_candidates(candidates, sort_fields)[0], False
+
+    return model.objects.create(**defaults), True
+
+
+def sort_candidates(candidates, important_fields):
+    """
+    Given a list of objects and a list of fields,
+    will return a sorted list of those objects by score.
+
+    Score is higher for objects that have a non-empty attribute
+    that is also present in important fields::
+
+        artist1 = Artist(mbid=None, fid=None)
+        artist2 = Artist(mbid="something", fid=None)
+
+        # artist2 has a mbid, so is sorted first
+        assert sort_candidates([artist1, artist2], ['mbid'])[0] == artist2
+
+    Only supports string fields.
+    """
+    # map each field to its score, giving a higher score to first fields
+    fields_scores = {f: i + 1 for i, f in enumerate(sorted(important_fields))}
+    candidates_with_scores = []
+    for candidate in candidates:
+        current_score = 0
+        for field, score in fields_scores.items():
+            v = getattr(candidate, field, "")
+            if v:
+                current_score += score
+
+        candidates_with_scores.append((candidate, current_score))
+
+    return [c for c, s in reversed(sorted(candidates_with_scores, key=lambda v: v[1]))]
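The scoring in `sort_candidates` can be checked without Django models; this sketch reimplements the same logic over `SimpleNamespace` stand-ins (the `Artist`-like objects here are hypothetical, for illustration only):

```python
from types import SimpleNamespace


def sort_candidates(candidates, important_fields):
    # weight each field, then rank candidates by the summed weight
    # of their non-empty important fields, highest score first
    fields_scores = {f: i + 1 for i, f in enumerate(sorted(important_fields))}
    scored = []
    for candidate in candidates:
        score = sum(s for f, s in fields_scores.items() if getattr(candidate, f, ""))
        scored.append((candidate, score))
    return [c for c, s in reversed(sorted(scored, key=lambda v: v[1]))]


artist1 = SimpleNamespace(mbid=None, fid=None)
artist2 = SimpleNamespace(mbid="some-mbid", fid=None)
assert sort_candidates([artist1, artist2], ["mbid"])[0] is artist2
```

A candidate that only has a `fid` still outranks one with neither field, since every non-empty important field contributes a positive weight.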
+
+
+@transaction.atomic
+def get_track_from_import_metadata(data):
+    track_uuid = getter(data, "funkwhale", "track", "uuid")
+
+    if track_uuid:
+        # easy case, we have a reference to a uuid of a track that
+        # already exists in our database
+        try:
+            track = models.Track.objects.get(uuid=track_uuid)
+        except models.Track.DoesNotExist:
+            raise UploadImportError(code="track_uuid_not_found")
+
+        if not track.album.cover:
+            update_album_cover(
+                track.album,
+                source=data.get("upload_source"),
+                cover_data=data.get("cover_data"),
+            )
+        return track
+
+    from_activity_id = data.get("from_activity_id", None)
+    track_mbid = data.get("musicbrainz_recordingid", None)
+    album_mbid = data.get("musicbrainz_albumid", None)
+    track_fid = getter(data, "fid")
+
+    query = None
+
+    if album_mbid and track_mbid:
+        query = Q(mbid=track_mbid, album__mbid=album_mbid)
+
+    if track_fid:
+        query = query | Q(fid=track_fid) if query else Q(fid=track_fid)
+
+    if query:
+        # second easy case: we have a (track_mbid, album_mbid) pair or
+        # a federation uuid we can check on
+        try:
+            return sort_candidates(models.Track.objects.filter(query), ["mbid", "fid"])[
+                0
+            ]
+        except IndexError:
+            pass
+
+    # get / create artist and album artist
+    artist_mbid = data.get("musicbrainz_artistid", None)
+    artist_fid = data.get("artist_fid", None)
+    artist_name = data["artist"]
+    query = Q(name__iexact=artist_name)
+    if artist_mbid:
+        query |= Q(mbid=artist_mbid)
+    if artist_fid:
+        query |= Q(fid=artist_fid)
+    defaults = {
+        "name": artist_name,
+        "mbid": artist_mbid,
+        "fid": artist_fid,
+        "from_activity_id": from_activity_id,
+    }
+    if data.get("artist_fdate"):
+        defaults["creation_date"] = data.get("artist_fdate")
+
+    artist = get_best_candidate_or_create(
+        models.Artist, query, defaults=defaults, sort_fields=["mbid", "fid"]
+    )[0]
+
+    album_artist_name = data.get("album_artist") or artist_name
+    if album_artist_name == artist_name:
+        album_artist = artist
+    else:
+        query = Q(name__iexact=album_artist_name)
+        album_artist_mbid = data.get("musicbrainz_albumartistid", None)
+        album_artist_fid = data.get("album_artist_fid", None)
+        if album_artist_mbid:
+            query |= Q(mbid=album_artist_mbid)
+        if album_artist_fid:
+            query |= Q(fid=album_artist_fid)
+        defaults = {
+            "name": album_artist_name,
+            "mbid": album_artist_mbid,
+            "fid": album_artist_fid,
+            "from_activity_id": from_activity_id,
+        }
+        if data.get("album_artist_fdate"):
+            defaults["creation_date"] = data.get("album_artist_fdate")
+
+        album_artist = get_best_candidate_or_create(
+            models.Artist, query, defaults=defaults, sort_fields=["mbid", "fid"]
+        )[0]
+
+    # get / create album
+    album_title = data["album"]
+    album_fid = data.get("album_fid", None)
+    query = Q(title__iexact=album_title, artist=album_artist)
+    if album_mbid:
+        query |= Q(mbid=album_mbid)
+    if album_fid:
+        query |= Q(fid=album_fid)
+    defaults = {
+        "title": album_title,
+        "artist": album_artist,
+        "mbid": album_mbid,
+        "release_date": data.get("date"),
+        "fid": album_fid,
+        "from_activity_id": from_activity_id,
+    }
+    if data.get("album_fdate"):
+        defaults["creation_date"] = data.get("album_fdate")
+
+    album = get_best_candidate_or_create(
+        models.Album, query, defaults=defaults, sort_fields=["mbid", "fid"]
+    )[0]
+    if not album.cover:
+        update_album_cover(
+            album, source=data.get("upload_source"), cover_data=data.get("cover_data")
+        )
+
+    # get / create track
+    track_title = data["title"]
+    track_number = data.get("track_number", 1)
+    query = Q(title__iexact=track_title, artist=artist, album=album)
+    if track_mbid:
+        query |= Q(mbid=track_mbid)
+    if track_fid:
+        query |= Q(fid=track_fid)
+    defaults = {
+        "title": track_title,
+        "album": album,
+        "mbid": track_mbid,
+        "artist": artist,
+        "position": track_number,
+        "fid": track_fid,
+        "from_activity_id": from_activity_id,
+    }
+    if data.get("fdate"):
+        defaults["creation_date"] = data.get("fdate")
+
+    track = get_best_candidate_or_create(
+        models.Track, query, defaults=defaults, sort_fields=["mbid", "fid"]
+    )[0]
+
+    return track
+
+
+@receiver(signals.upload_import_status_updated)
+def broadcast_import_status_update_to_owner(old_status, new_status, upload, **kwargs):
+    user = upload.library.actor.get_user()
+    if not user:
+        return
+
+    group = "user.{}.imports".format(user.pk)
+    channels.group_send(
+        group,
+        {
+            "type": "event.send",
+            "text": "",
+            "data": {
+                "type": "import.status_updated",
+                "upload": serializers.UploadForOwnerSerializer(upload).data,
+                "old_status": old_status,
+                "new_status": new_status,
+            },
+        },
+    )

@@ -54,7 +54,17 @@ def get_audio_file_data(f):
     if not data:
         return
     d = {}
-    d["bitrate"] = data.info.bitrate
+    d["bitrate"] = getattr(data.info, "bitrate", 0)
     d["length"] = data.info.length

     return d
+
+
+def get_actor_from_request(request):
+    actor = None
+    if hasattr(request, "actor"):
+        actor = request.actor
+    elif request.user.is_authenticated:
+        actor = request.user.actor
+
+    return actor

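The `getattr(data.info, "bitrate", 0)` change above guards against audio stream info objects that expose no `bitrate` attribute (which formats omit it depends on the underlying tag library's stream class); the fallback pattern itself is plain Python, shown here with an invented stand-in class:

```python
class FakeStreamInfo:
    # deliberately has no `bitrate` attribute
    length = 12.5


info = FakeStreamInfo()
print(getattr(info, "bitrate", 0))  # 0
print(getattr(info, "length", 0))   # 12.5
```

Unlike `info.bitrate`, which would raise `AttributeError`, the three-argument `getattr` degrades to a sentinel value.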
@@ -1,36 +1,53 @@
 import json
 import logging
 import urllib

 from django.conf import settings
 from django.core.exceptions import ObjectDoesNotExist
 from django.db import transaction
-from django.db.models import Count
+from django.db.models import Count, Prefetch, Sum, F, Q
 from django.db.models.functions import Length
 from django.utils import timezone
 from musicbrainzngs import ResponseError

 from rest_framework import mixins
 from rest_framework import permissions
 from rest_framework import settings as rest_settings
 from rest_framework import views, viewsets
 from rest_framework.decorators import detail_route, list_route
 from rest_framework.response import Response
 from taggit.models import Tag

-from funkwhale_api.common import utils as funkwhale_utils
-from funkwhale_api.common.permissions import ConditionalAuthentication
+from funkwhale_api.common import utils as common_utils
+from funkwhale_api.common import permissions as common_permissions
 from funkwhale_api.federation.authentication import SignatureAuthentication
-from funkwhale_api.federation.models import LibraryTrack
 from funkwhale_api.musicbrainz import api
-from funkwhale_api.requests.models import ImportRequest
-from funkwhale_api.users.permissions import HasUserPermission
+from funkwhale_api.federation import api_serializers as federation_api_serializers
+from funkwhale_api.federation import routes

-from . import filters, importers, models
-from . import permissions as music_permissions
-from . import serializers, tasks, utils
+from . import filters, models, serializers, tasks, utils

 logger = logging.getLogger(__name__)


+def get_libraries(filter_uploads):
+    def view(self, request, *args, **kwargs):
+        obj = self.get_object()
+        actor = utils.get_actor_from_request(request)
+        uploads = models.Upload.objects.all()
+        uploads = filter_uploads(obj, uploads)
+        uploads = uploads.playable_by(actor)
+        libraries = models.Library.objects.filter(
+            pk__in=uploads.values_list("library", flat=True)
+        )
+        libraries = libraries.select_related("actor")
+        page = self.paginate_queryset(libraries)
+        if page is not None:
+            serializer = federation_api_serializers.LibrarySerializer(page, many=True)
+            return self.get_paginated_response(serializer.data)
+
+        serializer = federation_api_serializers.LibrarySerializer(libraries, many=True)
+        return Response(serializer.data)
+
+    return view

 class TagViewSetMixin(object):
     def get_queryset(self):
         queryset = super().get_queryset()
@@ -41,107 +58,115 @@ class TagViewSetMixin(object):


 class ArtistViewSet(viewsets.ReadOnlyModelViewSet):
-    queryset = models.Artist.objects.with_albums()
+    queryset = models.Artist.objects.all()
     serializer_class = serializers.ArtistWithAlbumsSerializer
-    permission_classes = [ConditionalAuthentication]
+    permission_classes = [common_permissions.ConditionalAuthentication]
     filter_class = filters.ArtistFilter
     ordering_fields = ("id", "name", "creation_date")

+    def get_queryset(self):
+        queryset = super().get_queryset()
+        albums = models.Album.objects.with_tracks_count()
+        albums = albums.annotate_playable_by_actor(
+            utils.get_actor_from_request(self.request)
+        )
+        return queryset.prefetch_related(Prefetch("albums", queryset=albums)).distinct()
+
+    libraries = detail_route(methods=["get"])(
+        get_libraries(
+            filter_uploads=lambda o, uploads: uploads.filter(
+                Q(track__artist=o) | Q(track__album__artist=o)
+            )
+        )
+    )


 class AlbumViewSet(viewsets.ReadOnlyModelViewSet):
     queryset = (
-        models.Album.objects.all()
-        .order_by("artist", "release_date")
-        .select_related()
-        .prefetch_related("tracks__artist", "tracks__files")
+        models.Album.objects.all().order_by("artist", "release_date").select_related()
     )
     serializer_class = serializers.AlbumSerializer
-    permission_classes = [ConditionalAuthentication]
+    permission_classes = [common_permissions.ConditionalAuthentication]
     ordering_fields = ("creation_date", "release_date", "title")
     filter_class = filters.AlbumFilter

     def get_queryset(self):
         queryset = super().get_queryset()
+        tracks = models.Track.objects.annotate_playable_by_actor(
+            utils.get_actor_from_request(self.request)
+        ).select_related("artist")
+        if (
+            hasattr(self, "kwargs")
+            and self.kwargs
+            and self.request.method.lower() == "get"
+        ):
+            # we are detailing a single album, so we can add the overhead
+            # to fetch additional data
+            tracks = tracks.annotate_duration()
+        qs = queryset.prefetch_related(Prefetch("tracks", queryset=tracks))
+        return qs.distinct()

-class ImportBatchViewSet(
+    libraries = detail_route(methods=["get"])(
+        get_libraries(filter_uploads=lambda o, uploads: uploads.filter(track__album=o))
+    )

 class LibraryViewSet(
     mixins.CreateModelMixin,
     mixins.ListModelMixin,
     mixins.RetrieveModelMixin,
     mixins.UpdateModelMixin,
     mixins.DestroyModelMixin,
     viewsets.GenericViewSet,
 ):
     lookup_field = "uuid"
     queryset = (
-        models.ImportBatch.objects.select_related()
+        models.Library.objects.all()
         .order_by("-creation_date")
-        .annotate(job_count=Count("jobs"))
+        .annotate(_uploads_count=Count("uploads"))
+        .annotate(_size=Sum("uploads__size"))
     )
-    serializer_class = serializers.ImportBatchSerializer
-    permission_classes = (HasUserPermission,)
-    required_permissions = ["library", "upload"]
-    permission_operator = "or"
-    filter_class = filters.ImportBatchFilter
-
-    def perform_create(self, serializer):
-        serializer.save(submitted_by=self.request.user)
+    serializer_class = serializers.LibraryForOwnerSerializer
+    permission_classes = [
+        permissions.IsAuthenticated,
+        common_permissions.OwnerPermission,
+    ]
+    owner_field = "actor.user"
+    owner_checks = ["read", "write"]

     def get_queryset(self):
         qs = super().get_queryset()
-        # if the user does not have the library permission, we limit to their
-        # own jobs
-        if not self.request.user.has_permissions("library"):
-            qs = qs.filter(submitted_by=self.request.user)
-        return qs
-
-
-class ImportJobViewSet(
-    mixins.CreateModelMixin, mixins.ListModelMixin, viewsets.GenericViewSet
-):
-    queryset = models.ImportJob.objects.all().select_related()
-    serializer_class = serializers.ImportJobSerializer
-    permission_classes = (HasUserPermission,)
-    required_permissions = ["library", "upload"]
-    permission_operator = "or"
-    filter_class = filters.ImportJobFilter
-
-    def get_queryset(self):
-        qs = super().get_queryset()
-        # if the user does not have the library permission, we limit to their
-        # own jobs
-        if not self.request.user.has_permissions("library"):
-            qs = qs.filter(batch__submitted_by=self.request.user)
-        return qs
-
-    @list_route(methods=["get"])
-    def stats(self, request, *args, **kwargs):
-        if not request.user.has_permissions("library"):
-            return Response(status=403)
-        qs = models.ImportJob.objects.all()
-        filterset = filters.ImportJobFilter(request.GET, queryset=qs)
-        qs = filterset.qs
-        qs = qs.values("status").order_by("status")
-        qs = qs.annotate(status_count=Count("status"))
-
-        data = {}
-        for row in qs:
-            data[row["status"]] = row["status_count"]
-
-        for s, _ in models.IMPORT_STATUS_CHOICES:
-            data.setdefault(s, 0)
-
-        data["count"] = sum([v for v in data.values()])
-        return Response(data)
-
-    @list_route(methods=["post"])
-    def run(self, request, *args, **kwargs):
-        serializer = serializers.ImportJobRunSerializer(data=request.data)
-        serializer.is_valid(raise_exception=True)
-        payload = serializer.save()
-
-        return Response(payload)
+        return qs.filter(actor=self.request.user.actor)

     def perform_create(self, serializer):
-        source = "file://" + serializer.validated_data["audio_file"].name
-        serializer.save(source=source)
-        funkwhale_utils.on_commit(
-            tasks.import_job_run.delay, import_job_id=serializer.instance.pk
-        )
+        serializer.save(actor=self.request.user.actor)

+    @transaction.atomic
+    def perform_destroy(self, instance):
+        routes.outbox.dispatch(
+            {"type": "Delete", "object": {"type": "Library"}},
+            context={"library": instance},
+        )
+        instance.delete()
+
+    @detail_route(methods=["get"])
+    @transaction.non_atomic_requests
+    def follows(self, request, *args, **kwargs):
+        library = self.get_object()
+        queryset = (
+            library.received_follows.filter(target__actor=self.request.user.actor)
+            .select_related("actor", "target__actor")
+            .order_by("-creation_date")
+        )
+        page = self.paginate_queryset(queryset)
+        if page is not None:
+            serializer = federation_api_serializers.LibraryFollowSerializer(
+                page, many=True
+            )
+            return self.get_paginated_response(serializer.data)
+
+        serializer = self.get_serializer(queryset, many=True)
+        return Response(serializer.data)

 class TrackViewSet(TagViewSetMixin, viewsets.ReadOnlyModelViewSet):
@@ -151,14 +176,13 @@ class TrackViewSet(TagViewSetMixin, viewsets.ReadOnlyModelViewSet):

     queryset = models.Track.objects.all().for_nested_serialization()
     serializer_class = serializers.TrackSerializer
-    permission_classes = [ConditionalAuthentication]
+    permission_classes = [common_permissions.ConditionalAuthentication]
     filter_class = filters.TrackFilter
     ordering_fields = (
         "creation_date",
         "title",
         "album__title",
         "album__release_date",
         "position",
         "size",
         "artist__name",
     )

@@ -169,7 +193,18 @@ class TrackViewSet(TagViewSetMixin, viewsets.ReadOnlyModelViewSet):
         if user.is_authenticated and filter_favorites == "true":
             queryset = queryset.filter(track_favorites__user=user)

-        return queryset
+        queryset = queryset.annotate_playable_by_actor(
+            utils.get_actor_from_request(self.request)
+        ).annotate_duration()
+        if (
+            hasattr(self, "kwargs")
+            and self.kwargs
+            and self.request.method.lower() == "get"
+        ):
+            # we are detailing a single track, so we can add the overhead
+            # to fetch additional data
+            queryset = queryset.annotate_file_data()
+        return queryset.distinct()

     @detail_route(methods=["get"])
     @transaction.non_atomic_requests
@@ -196,6 +231,10 @@ class TrackViewSet(TagViewSetMixin, viewsets.ReadOnlyModelViewSet):
         serializer = serializers.LyricsSerializer(lyrics)
         return Response(serializer.data)

+    libraries = detail_route(methods=["get"])(
+        get_libraries(filter_uploads=lambda o, uploads: uploads.filter(track=o))
+    )


 def get_file_path(audio_file):
     serve_path = settings.MUSIC_DIRECTORY_SERVE_PATH
@@ -228,40 +267,37 @@ def get_file_path(audio_file):
     return path.encode("utf-8")


-def handle_serve(track_file):
-    f = track_file
+def handle_serve(upload, user):
+    f = upload
     # we update the accessed_date
     f.accessed_date = timezone.now()
     f.save(update_fields=["accessed_date"])

-    mt = f.mimetype
-    audio_file = f.audio_file
-    try:
-        library_track = f.library_track
-    except ObjectDoesNotExist:
-        library_track = None
-    if library_track and not audio_file:
-        if not library_track.audio_file:
-            # we need to populate from cache
-            with transaction.atomic():
-                # why the transaction/select_for_update?
-                # this is because browsers may send multiple requests
-                # in a short time range, for partial content,
-                # thus resulting in multiple downloads from the remote
-                qs = LibraryTrack.objects.select_for_update()
-                library_track = qs.get(pk=library_track.pk)
-                library_track.download_audio()
-                track_file.library_track = library_track
-                track_file.set_audio_data()
-                track_file.save(update_fields=["bitrate", "duration", "size"])
+    if f.audio_file:
+        file_path = get_file_path(f.audio_file)

-        audio_file = library_track.audio_file
-        file_path = get_file_path(audio_file)
-        mt = library_track.audio_mimetype
-    elif audio_file:
-        file_path = get_file_path(audio_file)
+    elif f.source and (
+        f.source.startswith("http://") or f.source.startswith("https://")
+    ):
+        # we need to populate from cache
+        with transaction.atomic():
+            # why the transaction/select_for_update?
+            # this is because browsers may send multiple requests
+            # in a short time range, for partial content,
+            # thus resulting in multiple downloads from the remote
+            qs = f.__class__.objects.select_for_update()
+            f = qs.get(pk=f.pk)
+            f.download_audio_from_remote(user=user)
+        data = f.get_audio_data()
+        if data:
+            f.duration = data["duration"]
+            f.size = data["size"]
+            f.bitrate = data["bitrate"]
+            f.save(update_fields=["bitrate", "duration", "size"])
+        file_path = get_file_path(f.audio_file)
+    elif f.source and f.source.startswith("file://"):
+        file_path = get_file_path(f.source.replace("file://", "", 1))
+    mt = f.mimetype
     if mt:
         response = Response(content_type=mt)
     else:
@@ -278,39 +314,100 @@ def handle_serve
     return response


-class TrackFileViewSet(viewsets.ReadOnlyModelViewSet):
-    queryset = (
-        models.TrackFile.objects.all()
-        .select_related("track__artist", "track__album")
-        .order_by("-id")
-    )
-    serializer_class = serializers.TrackFileSerializer
+class ListenViewSet(mixins.RetrieveModelMixin, viewsets.GenericViewSet):
+    queryset = models.Track.objects.all()
+    serializer_class = serializers.TrackSerializer
     authentication_classes = (
         rest_settings.api_settings.DEFAULT_AUTHENTICATION_CLASSES
         + [SignatureAuthentication]
     )
-    permission_classes = [music_permissions.Listen]
+    permission_classes = [common_permissions.ConditionalAuthentication]
+    lookup_field = "uuid"

-    @detail_route(methods=["get"])
-    def serve(self, request, *args, **kwargs):
-        queryset = models.TrackFile.objects.select_related(
-            "library_track", "track__album__artist", "track__artist"
-        )
-        try:
-            return handle_serve(queryset.get(pk=kwargs["pk"]))
-        except models.TrackFile.DoesNotExist:
+    def retrieve(self, request, *args, **kwargs):
+        track = self.get_object()
+        actor = utils.get_actor_from_request(request)
+        queryset = track.uploads.select_related("track__album__artist", "track__artist")
+        explicit_file = request.GET.get("upload")
+        if explicit_file:
+            queryset = queryset.filter(uuid=explicit_file)
+        queryset = queryset.playable_by(actor)
+        queryset = queryset.order_by(F("audio_file").desc(nulls_last=True))
+        upload = queryset.first()
+        if not upload:
+            return Response(status=404)
+
+        return handle_serve(upload, user=request.user)
+
+
+class UploadViewSet(
+    mixins.ListModelMixin,
+    mixins.CreateModelMixin,
+    mixins.RetrieveModelMixin,
+    mixins.DestroyModelMixin,
+    viewsets.GenericViewSet,
+):
+    lookup_field = "uuid"
+    queryset = (
+        models.Upload.objects.all()
+        .order_by("-creation_date")
+        .select_related("library", "track__artist", "track__album__artist")
+    )
+    serializer_class = serializers.UploadForOwnerSerializer
+    permission_classes = [
+        permissions.IsAuthenticated,
+        common_permissions.OwnerPermission,
+    ]
+    owner_field = "library.actor.user"
+    owner_checks = ["read", "write"]
+    filter_class = filters.UploadFilter
+    ordering_fields = (
+        "creation_date",
+        "import_date",
+        "bitrate",
+        "size",
+        "artist__name",
+    )
+
+    def get_queryset(self):
+        qs = super().get_queryset()
+        return qs.filter(library__actor=self.request.user.actor)
+
+    @list_route(methods=["post"])
+    def action(self, request, *args, **kwargs):
+        queryset = self.get_queryset()
+        serializer = serializers.UploadActionSerializer(request.data, queryset=queryset)
+        serializer.is_valid(raise_exception=True)
+        result = serializer.save()
+        return Response(result, status=200)
+
+    def get_serializer_context(self):
+        context = super().get_serializer_context()
+        context["user"] = self.request.user
+        return context
+
+    def perform_create(self, serializer):
+        upload = serializer.save()
+        common_utils.on_commit(tasks.process_upload.delay, upload_id=upload.pk)
+
+    @transaction.atomic
+    def perform_destroy(self, instance):
+        routes.outbox.dispatch(
+            {"type": "Delete", "object": {"type": "Audio"}},
+            context={"uploads": [instance]},
)
|
||||
instance.delete()
|
||||
|
||||
|
||||
class TagViewSet(viewsets.ReadOnlyModelViewSet):
|
||||
queryset = Tag.objects.all().order_by("name")
|
||||
serializer_class = serializers.TagSerializer
|
||||
permission_classes = [ConditionalAuthentication]
|
||||
permission_classes = [common_permissions.ConditionalAuthentication]
|
||||
|
||||
|
||||
class Search(views.APIView):
|
||||
max_results = 3
|
||||
permission_classes = [ConditionalAuthentication]
|
||||
permission_classes = [common_permissions.ConditionalAuthentication]
|
||||
|
||||
def get(self, request, *args, **kwargs):
|
||||
query = request.GET["query"]
|
||||
|
@ -340,7 +437,6 @@ class Search(views.APIView):
|
|||
models.Track.objects.all()
|
||||
.filter(query_obj)
|
||||
.select_related("artist", "album__artist")
|
||||
.prefetch_related("files")
|
||||
)[: self.max_results]
|
||||
|
||||
def get_albums(self, query):
|
||||
|
@ -350,7 +446,7 @@ class Search(views.APIView):
|
|||
models.Album.objects.all()
|
||||
.filter(query_obj)
|
||||
.select_related()
|
||||
.prefetch_related("tracks__files")
|
||||
.prefetch_related("tracks")
|
||||
)[: self.max_results]
|
||||
|
||||
def get_artists(self, query):
|
||||
|
@ -372,99 +468,3 @@ class Search(views.APIView):
|
|||
)
|
||||
|
||||
return qs.filter(query_obj)[: self.max_results]
|
||||
|
||||
|
||||
class SubmitViewSet(viewsets.ViewSet):
|
||||
queryset = models.ImportBatch.objects.none()
|
||||
permission_classes = (HasUserPermission,)
|
||||
required_permissions = ["library"]
|
||||
|
||||
@list_route(methods=["post"])
|
||||
@transaction.non_atomic_requests
|
||||
def single(self, request, *args, **kwargs):
|
||||
try:
|
||||
models.Track.objects.get(mbid=request.POST["mbid"])
|
||||
return Response({})
|
||||
except models.Track.DoesNotExist:
|
||||
pass
|
||||
batch = models.ImportBatch.objects.create(submitted_by=request.user)
|
||||
job = models.ImportJob.objects.create(
|
||||
mbid=request.POST["mbid"], batch=batch, source=request.POST["import_url"]
|
||||
)
|
||||
tasks.import_job_run.delay(import_job_id=job.pk)
|
||||
serializer = serializers.ImportBatchSerializer(batch)
|
||||
return Response(serializer.data, status=201)
|
||||
|
||||
def get_import_request(self, data):
|
||||
try:
|
||||
raw = data["importRequest"]
|
||||
except KeyError:
|
||||
return
|
||||
|
||||
pk = int(raw)
|
||||
try:
|
||||
return ImportRequest.objects.get(pk=pk)
|
||||
except ImportRequest.DoesNotExist:
|
||||
pass
|
||||
|
||||
@list_route(methods=["post"])
|
||||
@transaction.non_atomic_requests
|
||||
def album(self, request, *args, **kwargs):
|
||||
data = json.loads(request.body.decode("utf-8"))
|
||||
import_request = self.get_import_request(data)
|
||||
import_data, batch = self._import_album(
|
||||
data, request, batch=None, import_request=import_request
|
||||
)
|
||||
return Response(import_data)
|
||||
|
||||
@transaction.atomic
|
||||
def _import_album(self, data, request, batch=None, import_request=None):
|
||||
# we import the whole album here to prevent race conditions that occurs
|
||||
# when using get_or_create_from_api in tasks
|
||||
album_data = api.releases.get(
|
||||
id=data["releaseId"], includes=models.Album.api_includes
|
||||
)["release"]
|
||||
cleaned_data = models.Album.clean_musicbrainz_data(album_data)
|
||||
album = importers.load(
|
||||
models.Album, cleaned_data, album_data, import_hooks=[models.import_tracks]
|
||||
)
|
||||
try:
|
||||
album.get_image()
|
||||
except ResponseError:
|
||||
pass
|
||||
if not batch:
|
||||
batch = models.ImportBatch.objects.create(
|
||||
submitted_by=request.user, import_request=import_request
|
||||
)
|
||||
for row in data["tracks"]:
|
||||
try:
|
||||
models.TrackFile.objects.get(track__mbid=row["mbid"])
|
||||
except models.TrackFile.DoesNotExist:
|
||||
job = models.ImportJob.objects.create(
|
||||
mbid=row["mbid"], batch=batch, source=row["source"]
|
||||
)
|
||||
funkwhale_utils.on_commit(
|
||||
tasks.import_job_run.delay, import_job_id=job.pk
|
||||
)
|
||||
|
||||
serializer = serializers.ImportBatchSerializer(batch)
|
||||
return serializer.data, batch
|
||||
|
||||
@list_route(methods=["post"])
|
||||
@transaction.non_atomic_requests
|
||||
def artist(self, request, *args, **kwargs):
|
||||
data = json.loads(request.body.decode("utf-8"))
|
||||
import_request = self.get_import_request(data)
|
||||
artist_data = api.artists.get(id=data["artistId"])["artist"]
|
||||
cleaned_data = models.Artist.clean_musicbrainz_data(artist_data)
|
||||
importers.load(models.Artist, cleaned_data, artist_data, import_hooks=[])
|
||||
|
||||
import_data = []
|
||||
batch = None
|
||||
for row in data["albums"]:
|
||||
row_data, batch = self._import_album(
|
||||
row, request, batch=batch, import_request=import_request
|
||||
)
|
||||
import_data.append(row_data)
|
||||
|
||||
return Response(import_data[0])
|
||||
|
|
|
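The new `retrieve` logic above orders a track's uploads with `F("audio_file").desc(nulls_last=True)` so that uploads with a locally cached file come first, then serves the first playable one. A minimal pure-Python sketch of that selection rule, using a hypothetical `Upload` stand-in rather than Funkwhale's actual model:

```python
from collections import namedtuple

# Hypothetical stand-in for the Upload model: audio_file is None when the
# upload has no locally cached file (e.g. a remote federation entry).
Upload = namedtuple("Upload", ["uuid", "audio_file"])

def pick_best_upload(uploads):
    """Mimic .order_by(F("audio_file").desc(nulls_last=True)).first():
    prefer uploads that have a local audio file, push the NULLs last."""
    ordered = sorted(uploads, key=lambda u: u.audio_file is None)
    return ordered[0] if ordered else None

uploads = [
    Upload("u1", None),            # remote-only upload
    Upload("u2", "/music/a.ogg"),  # locally cached file
]
best = pick_best_upload(uploads)
print(best.uuid)  # → u2
```

In the real view the same preference is expressed in SQL, so the database picks the winner without loading every row.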
@@ -6,6 +6,7 @@ from funkwhale_api import __version__

_api = musicbrainzngs
_api.set_useragent("funkwhale", str(__version__), settings.FUNKWHALE_URL)
_api.set_hostname(settings.MUSICBRAINZ_HOSTNAME)


store = memoize.djangocache.Cache("default")
@@ -1,4 +1,4 @@
from django.contrib import admin
from funkwhale_api.common import admin

from . import models
@@ -8,7 +8,7 @@ from . import models

class PlaylistFilter(filters.FilterSet):
    q = filters.CharFilter(name="_", method="filter_q")
    listenable = filters.BooleanFilter(name="_", method="filter_listenable")
    playable = filters.BooleanFilter(name="_", method="filter_playable")

    class Meta:
        model = models.Playlist

@@ -16,10 +16,10 @@ class PlaylistFilter(filters.FilterSet):
            "user": ["exact"],
            "name": ["exact", "icontains"],
            "q": "exact",
            "listenable": "exact",
            "playable": "exact",
        }

    def filter_listenable(self, queryset, name, value):
    def filter_playable(self, queryset, name, value):
        queryset = queryset.annotate(plts_count=Count("playlist_tracks"))
        if value:
            return queryset.filter(plts_count__gt=0)
@@ -12,7 +12,7 @@ class PlaylistQuerySet(models.QuerySet):

    def with_duration(self):
        return self.annotate(
            duration=models.Sum("playlist_tracks__track__files__duration")
            duration=models.Sum("playlist_tracks__track__uploads__duration")
        )

    def with_covers(self):

@@ -38,6 +38,23 @@ class PlaylistQuerySet(models.QuerySet):
        )
        return self.prefetch_related(plt_prefetch)

    def annotate_playable_by_actor(self, actor):
        plts = (
            PlaylistTrack.objects.playable_by(actor)
            .filter(playlist=models.OuterRef("id"))
            .order_by("id")
            .values("id")[:1]
        )
        subquery = models.Subquery(plts)
        return self.annotate(is_playable_by_actor=subquery)

    def playable_by(self, actor, include=True):
        plts = PlaylistTrack.objects.playable_by(actor, include)
        if include:
            return self.filter(playlist_tracks__in=plts)
        else:
            return self.exclude(playlist_tracks__in=plts)


class Playlist(models.Model):
    name = models.CharField(max_length=50)

@@ -130,15 +147,30 @@ class Playlist(models.Model):


class PlaylistTrackQuerySet(models.QuerySet):
    def for_nested_serialization(self):
        return (
            self.select_related()
            .select_related("track__album__artist")
            .prefetch_related(
                "track__tags", "track__files", "track__artist__albums__tracks__tags"
            )
    def for_nested_serialization(self, actor=None):
        tracks = music_models.Track.objects.annotate_playable_by_actor(actor)
        tracks = tracks.select_related("artist", "album__artist")
        return self.prefetch_related(
            models.Prefetch("track", queryset=tracks, to_attr="_prefetched_track")
        )

    def annotate_playable_by_actor(self, actor):
        tracks = (
            music_models.Track.objects.playable_by(actor)
            .filter(pk=models.OuterRef("track"))
            .order_by("id")
            .values("id")[:1]
        )
        subquery = models.Subquery(tracks)
        return self.annotate(is_playable_by_actor=subquery)

    def playable_by(self, actor, include=True):
        tracks = music_models.Track.objects.playable_by(actor, include)
        if include:
            return self.filter(track__pk__in=tracks)
        else:
            return self.exclude(track__pk__in=tracks)


class PlaylistTrack(models.Model):
    track = models.ForeignKey(
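The `annotate_playable_by_actor` helpers in this commit use a correlated `Subquery`/`OuterRef` to attach, per playlist or per playlist track, the id of the first playable entry; serializers then only check its truthiness. A rough in-memory equivalent of that annotation, over hypothetical dict-based data instead of querysets:

```python
def annotate_playable(playlists, playable_track_ids):
    """For each playlist, mimic Subquery(... .values("id")[:1]): store the
    first playable track id, or None when nothing in it is playable."""
    for playlist in playlists:
        playable = [t for t in playlist["tracks"] if t in playable_track_ids]
        playlist["is_playable_by_actor"] = playable[0] if playable else None
    return playlists

playlists = [
    {"name": "road trip", "tracks": [1, 2, 3]},
    {"name": "private", "tracks": [9]},
]
annotate_playable(playlists, playable_track_ids={2, 3})
print([bool(p["is_playable_by_actor"]) for p in playlists])  # → [True, False]
```

Doing this as an SQL subquery keeps the check inside the playlist query itself, so no extra query per playlist is needed.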
@@ -10,12 +10,17 @@ from . import models


class PlaylistTrackSerializer(serializers.ModelSerializer):
    track = TrackSerializer()
    # track = TrackSerializer()
    track = serializers.SerializerMethodField()

    class Meta:
        model = models.PlaylistTrack
        fields = ("id", "track", "playlist", "index", "creation_date")

    def get_track(self, o):
        track = o._prefetched_track if hasattr(o, "_prefetched_track") else o.track
        return TrackSerializer(track).data


class PlaylistTrackWriteSerializer(serializers.ModelSerializer):
    index = serializers.IntegerField(required=False, min_value=0, allow_null=True)

@@ -68,6 +73,7 @@ class PlaylistSerializer(serializers.ModelSerializer):
    duration = serializers.SerializerMethodField(read_only=True)
    album_covers = serializers.SerializerMethodField(read_only=True)
    user = UserBasicSerializer(read_only=True)
    is_playable = serializers.SerializerMethodField()

    class Meta:
        model = models.Playlist

@@ -81,9 +87,16 @@ class PlaylistSerializer(serializers.ModelSerializer):
            "tracks_count",
            "album_covers",
            "duration",
            "is_playable",
        )
        read_only_fields = ["id", "modification_date", "creation_date"]

    def get_is_playable(self, obj):
        try:
            return bool(obj.is_playable_by_actor)
        except AttributeError:
            return None

    def get_tracks_count(self, obj):
        try:
            return obj.tracks_count
@@ -6,7 +6,7 @@ from rest_framework.permissions import IsAuthenticatedOrReadOnly
from rest_framework.response import Response

from funkwhale_api.common import fields, permissions

from funkwhale_api.music import utils as music_utils
from . import filters, models, serializers


@@ -39,7 +39,9 @@ class PlaylistViewSet(
    @detail_route(methods=["get"])
    def tracks(self, request, *args, **kwargs):
        playlist = self.get_object()
        plts = playlist.playlist_tracks.all().for_nested_serialization()
        plts = playlist.playlist_tracks.all().for_nested_serialization(
            music_utils.get_actor_from_request(request)
        )
        serializer = serializers.PlaylistTrackSerializer(plts, many=True)
        data = {"count": len(plts), "results": serializer.data}
        return Response(data, status=200)

@@ -59,7 +61,7 @@ class PlaylistViewSet(
        plts = (
            models.PlaylistTrack.objects.filter(pk__in=ids)
            .order_by("index")
            .for_nested_serialization()
            .for_nested_serialization(music_utils.get_actor_from_request(request))
        )
        serializer = serializers.PlaylistTrackSerializer(plts, many=True)
        data = {"count": len(plts), "results": serializer.data}

@@ -74,7 +76,9 @@ class PlaylistViewSet(
        return Response(status=204)

    def get_queryset(self):
        return self.queryset.filter(fields.privacy_level_query(self.request.user))
        return self.queryset.filter(
            fields.privacy_level_query(self.request.user)
        ).annotate_playable_by_actor(music_utils.get_actor_from_request(self.request))

    def perform_create(self, serializer):
        return serializer.save(

@@ -95,7 +99,7 @@ class PlaylistTrackViewSet(
):

    serializer_class = serializers.PlaylistTrackSerializer
    queryset = models.PlaylistTrack.objects.all().for_nested_serialization()
    queryset = models.PlaylistTrack.objects.all()
    permission_classes = [
        permissions.ConditionalAuthentication,
        permissions.OwnerPermission,

@@ -116,7 +120,7 @@ class PlaylistTrackViewSet(
                lookup_field="playlist__privacy_level",
                user_field="playlist__user",
            )
        )
        ).for_nested_serialization(music_utils.get_actor_from_request(self.request))

    def perform_destroy(self, instance):
        instance.delete(update_indexes=True)
@@ -1,4 +0,0 @@
"""
This module is responsible from importing existing audiofiles from the
filesystem into funkwhale.
"""
@@ -1,45 +0,0 @@
from django.db import transaction

from funkwhale_api.music import metadata, models


@transaction.atomic
def import_track_data_from_path(path):
    data = metadata.Metadata(path)
    album = None
    track_mbid = data.get("musicbrainz_recordingid", None)
    album_mbid = data.get("musicbrainz_albumid", None)

    if album_mbid and track_mbid:
        # to gain performance and avoid additional mb lookups,
        # we import from the release data, which is already cached
        return models.Track.get_or_create_from_release(album_mbid, track_mbid)[0]
    elif track_mbid:
        return models.Track.get_or_create_from_api(track_mbid)[0]
    elif album_mbid:
        album = models.Album.get_or_create_from_api(album_mbid)[0]

    artist = album.artist if album else None
    artist_mbid = data.get("musicbrainz_artistid", None)
    if not artist:
        if artist_mbid:
            artist = models.Artist.get_or_create_from_api(artist_mbid)[0]
        else:
            artist = models.Artist.objects.get_or_create(
                name__iexact=data.get("artist"), defaults={"name": data.get("artist")}
            )[0]

    release_date = data.get("date", default=None)
    if not album:
        album = models.Album.objects.get_or_create(
            title__iexact=data.get("album"),
            artist=artist,
            defaults={"title": data.get("album"), "release_date": release_date},
        )[0]
    position = data.get("track_number", default=None)
    track = models.Track.objects.get_or_create(
        title__iexact=data.get("title"),
        album=album,
        defaults={"title": data.get("title"), "position": position},
    )[0]
    return track
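The removed importer above repeatedly uses the `get_or_create(field__iexact=..., defaults={...})` pattern: match an existing row case-insensitively, otherwise create one with the original casing. A tiny pure-Python analogue of that semantics, over a hypothetical in-memory list instead of a Django manager:

```python
def get_or_create_iexact(objects, name):
    """Rough analogue of Model.objects.get_or_create(name__iexact=name,
    defaults={"name": name}): case-insensitive match, create otherwise."""
    for obj in objects:
        if obj["name"].lower() == name.lower():
            return obj, False  # existing row, created=False
    obj = {"name": name}
    objects.append(obj)
    return obj, True  # new row, created=True

artists = [{"name": "Nina Simone"}]
obj, created = get_or_create_iexact(artists, "nina simone")
print(obj["name"], created)  # → Nina Simone False
```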
@@ -1,16 +1,10 @@
from django.conf.urls import include, url

urlpatterns = [
    url(
        r"^youtube/",
        include(
            ("funkwhale_api.providers.youtube.urls", "youtube"), namespace="youtube"
        ),
    ),
    url(
        r"^musicbrainz/",
        include(
            ("funkwhale_api.musicbrainz.urls", "musicbrainz"), namespace="musicbrainz"
        ),
    ),
    )
]
@@ -1,80 +0,0 @@
import threading

from apiclient.discovery import build
from dynamic_preferences.registries import global_preferences_registry as registry

YOUTUBE_API_SERVICE_NAME = "youtube"
YOUTUBE_API_VERSION = "v3"
VIDEO_BASE_URL = "https://www.youtube.com/watch?v={0}"


def _do_search(query):
    manager = registry.manager()
    youtube = build(
        YOUTUBE_API_SERVICE_NAME,
        YOUTUBE_API_VERSION,
        developerKey=manager["providers_youtube__api_key"],
    )

    return youtube.search().list(q=query, part="id,snippet", maxResults=25).execute()


class Client(object):
    def search(self, query):
        search_response = _do_search(query)
        videos = []
        for search_result in search_response.get("items", []):
            if search_result["id"]["kind"] == "youtube#video":
                search_result["full_url"] = VIDEO_BASE_URL.format(
                    search_result["id"]["videoId"]
                )
                videos.append(search_result)
        return videos

    def search_multiple(self, queries):
        results = {}

        def search(key, query):
            results[key] = self.search(query)

        threads = [
            threading.Thread(target=search, args=(key, query))
            for key, query in queries.items()
        ]
        for thread in threads:
            thread.start()
        for thread in threads:
            thread.join()

        return results

    def to_funkwhale(self, result):
        """
        We convert youtube results to something more generic.

        {
            "id": "video id",
            "type": "youtube#video",
            "url": "https://www.youtube.com/watch?v=id",
            "description": "description",
            "channelId": "Channel id",
            "title": "Title",
            "channelTitle": "channel Title",
            "publishedAt": "2012-08-22T18:41:03.000Z",
            "cover": "http://coverurl"
        }
        """
        return {
            "id": result["id"]["videoId"],
            "url": "https://www.youtube.com/watch?v={}".format(result["id"]["videoId"]),
            "type": result["id"]["kind"],
            "title": result["snippet"]["title"],
            "description": result["snippet"]["description"],
            "channelId": result["snippet"]["channelId"],
            "channelTitle": result["snippet"]["channelTitle"],
            "publishedAt": result["snippet"]["publishedAt"],
            "cover": result["snippet"]["thumbnails"]["high"]["url"],
        }


client = Client()
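The deleted client's `search_multiple` fans one thread out per query key and collects results in a shared dict. The same pattern, extracted as a standalone sketch with a hypothetical `search` callable standing in for the YouTube API call:

```python
import threading

def search_multiple(queries, search):
    """Run one search per query key in parallel threads and collect the
    results in a shared dict, as the removed YouTube client did."""
    results = {}

    def worker(key, query):
        # each thread writes to its own key, so no lock is needed here
        results[key] = search(query)

    threads = [
        threading.Thread(target=worker, args=(key, query))
        for key, query in queries.items()
    ]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()
    return results

# Hypothetical search backend standing in for the remote API call.
fake_search = lambda query: [query.upper()]
print(search_multiple({"a": "one", "b": "two"}, fake_search))
```

This works because each thread writes a distinct key; for heavier use a `concurrent.futures.ThreadPoolExecutor` would bound the thread count.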
@@ -1,16 +0,0 @@
from django import forms
from dynamic_preferences.registries import global_preferences_registry
from dynamic_preferences.types import Section, StringPreference

youtube = Section("providers_youtube")


@global_preferences_registry.register
class APIKey(StringPreference):
    section = youtube
    name = "api_key"
    default = "CHANGEME"
    verbose_name = "YouTube API key"
    help_text = "The API key used to query YouTube. Get one at https://console.developers.google.com/."
    widget = forms.PasswordInput
    field_kwargs = {"required": False}
@@ -1,8 +0,0 @@
from django.conf.urls import url

from .views import APISearch, APISearchs

urlpatterns = [
    url(r"^search/$", APISearch.as_view(), name="search"),
    url(r"^searchs/$", APISearchs.as_view(), name="searchs"),
]
@@ -1,27 +0,0 @@
from rest_framework.response import Response
from rest_framework.views import APIView

from funkwhale_api.common.permissions import ConditionalAuthentication

from .client import client


class APISearch(APIView):
    permission_classes = [ConditionalAuthentication]

    def get(self, request, *args, **kwargs):
        results = client.search(request.GET["query"])
        return Response([client.to_funkwhale(result) for result in results])


class APISearchs(APIView):
    permission_classes = [ConditionalAuthentication]

    def post(self, request, *args, **kwargs):
        results = client.search_multiple(request.data)
        return Response(
            {
                key: [client.to_funkwhale(result) for result in group]
                for key, group in results.items()
            }
        )
@@ -1,4 +1,4 @@
from django.contrib import admin
from funkwhale_api.common import admin

from . import models
@@ -43,8 +43,8 @@ class SessionRadio(SimpleRadio):
        return self.session

    def get_queryset(self, **kwargs):
        qs = Track.objects.annotate(files_count=Count("files"))
        return qs.filter(files_count__gt=0)
        qs = Track.objects.annotate(uploads_count=Count("uploads"))
        return qs.filter(uploads_count__gt=0)

    def get_queryset_kwargs(self):
        return {}

@@ -54,6 +54,8 @@ class SessionRadio(SimpleRadio):
        queryset = self.get_queryset(**kwargs)
        if self.session:
            queryset = self.filter_from_session(queryset)
            if kwargs.pop("filter_playable", True):
                queryset = queryset.playable_by(self.session.user.actor)
        return queryset

    def filter_from_session(self, queryset):
@@ -1,11 +0,0 @@
from django.contrib import admin

from . import models


@admin.register(models.ImportRequest)
class ImportRequestAdmin(admin.ModelAdmin):
    list_display = ["artist_name", "user", "status", "creation_date"]
    list_select_related = ["user"]
    list_filter = ["status"]
    search_fields = ["artist_name", "comment", "albums"]
@@ -1,8 +0,0 @@
from rest_framework import routers

from . import views

router = routers.SimpleRouter()
router.register(r"import-requests", views.ImportRequestViewSet, "import-requests")

urlpatterns = router.urls
@@ -1,15 +0,0 @@
import factory

from funkwhale_api.factories import registry
from funkwhale_api.users.factories import UserFactory


@registry.register
class ImportRequestFactory(factory.django.DjangoModelFactory):
    artist_name = factory.Faker("name")
    albums = factory.Faker("sentence")
    user = factory.SubFactory(UserFactory)
    comment = factory.Faker("paragraph")

    class Meta:
        model = "requests.ImportRequest"
@@ -1,20 +0,0 @@
import django_filters

from funkwhale_api.common import fields

from . import models


class ImportRequestFilter(django_filters.FilterSet):

    q = fields.SearchFilter(
        search_fields=["artist_name", "user__username", "albums", "comment"]
    )

    class Meta:
        model = models.ImportRequest
        fields = {
            "artist_name": ["exact", "iexact", "startswith", "icontains"],
            "status": ["exact"],
            "user__username": ["exact"],
        }
@@ -1,27 +0,0 @@
from rest_framework import serializers

from funkwhale_api.users.serializers import UserBasicSerializer

from . import models


class ImportRequestSerializer(serializers.ModelSerializer):
    user = UserBasicSerializer(read_only=True)

    class Meta:
        model = models.ImportRequest
        fields = (
            "id",
            "status",
            "albums",
            "artist_name",
            "user",
            "creation_date",
            "imported_date",
            "comment",
        )
        read_only_fields = ("creation_date", "imported_date", "user", "status")

    def create(self, validated_data):
        validated_data["user"] = self.context["user"]
        return super().create(validated_data)
@@ -1,27 +0,0 @@
from rest_framework import mixins, viewsets

from . import filters, models, serializers


class ImportRequestViewSet(
    mixins.CreateModelMixin,
    mixins.RetrieveModelMixin,
    mixins.ListModelMixin,
    viewsets.GenericViewSet,
):

    serializer_class = serializers.ImportRequestSerializer
    queryset = (
        models.ImportRequest.objects.all().select_related().order_by("-creation_date")
    )
    filter_class = filters.ImportRequestFilter
    ordering_fields = ("id", "artist_name", "creation_date", "status")

    def perform_create(self, serializer):
        return serializer.save(user=self.request.user)

    def get_serializer_context(self):
        context = super().get_serializer_context()
        if self.request.user.is_authenticated:
            context["user"] = self.request.user
        return context
@ -24,7 +24,8 @@ class GetArtistsSerializer(serializers.Serializer):
|
|||
|
||||
first_letter_mapping = collections.defaultdict(list)
|
||||
for artist in values:
|
||||
first_letter_mapping[artist["name"][0].upper()].append(artist)
|
||||
if artist["name"]:
|
||||
first_letter_mapping[artist["name"][0].upper()].append(artist)
|
||||
|
||||
for letter, artists in sorted(first_letter_mapping.items()):
|
||||
letter_data = {
|
||||
|
@ -37,7 +38,7 @@ class GetArtistsSerializer(serializers.Serializer):
|
|||
|
||||
class GetArtistSerializer(serializers.Serializer):
|
||||
def to_representation(self, artist):
|
||||
albums = artist.albums.prefetch_related("tracks__files")
|
||||
albums = artist.albums.prefetch_related("tracks__uploads")
|
||||
payload = {
|
||||
"id": artist.pk,
|
||||
"name": artist.name,
|
||||
|
@ -61,7 +62,7 @@ class GetArtistSerializer(serializers.Serializer):
|
|||
return payload
|
||||
|
||||
|
||||
def get_track_data(album, track, tf):
|
||||
def get_track_data(album, track, upload):
|
||||
data = {
|
||||
"id": track.pk,
|
||||
"isDir": "false",
|
||||
|
@ -69,9 +70,9 @@ def get_track_data(album, track, tf):
|
|||
"album": album.title,
|
||||
"artist": album.artist.name,
|
||||
"track": track.position or 1,
|
||||
"contentType": tf.mimetype,
|
||||
"suffix": tf.extension or "",
|
||||
"duration": tf.duration or 0,
|
||||
"contentType": upload.mimetype,
|
||||
"suffix": upload.extension or "",
|
||||
"duration": upload.duration or 0,
|
||||
"created": track.creation_date,
|
||||
"albumId": album.pk,
|
||||
"artistId": album.artist.pk,
|
||||
|
@ -79,10 +80,10 @@ def get_track_data(album, track, tf):
|
|||
}
|
||||
if track.album.cover:
|
||||
data["coverArt"] = "al-{}".format(track.album.id)
|
||||
if tf.bitrate:
|
||||
data["bitrate"] = int(tf.bitrate / 1000)
|
||||
if tf.size:
|
||||
data["size"] = tf.size
|
||||
if upload.bitrate:
|
||||
data["bitrate"] = int(upload.bitrate / 1000)
|
||||
if upload.size:
|
||||
data["size"] = upload.size
|
||||
if album.release_date:
|
||||
data["year"] = album.release_date.year
|
||||
return data
|
||||
|
@ -102,7 +103,7 @@ def get_album2_data(album):
|
|||
try:
|
||||
payload["songCount"] = album._tracks_count
|
||||
except AttributeError:
|
||||
payload["songCount"] = len(album.tracks.prefetch_related("files"))
|
||||
payload["songCount"] = len(album.tracks.prefetch_related("uploads"))
|
||||
return payload
|
||||
|
||||
|
||||
|
@ -110,17 +111,17 @@ def get_song_list_data(tracks):
|
|||
songs = []
|
||||
for track in tracks:
|
||||
try:
|
||||
tf = [tf for tf in track.files.all()][0]
|
||||
uploads = [upload for upload in track.uploads.all()][0]
|
||||
except IndexError:
|
||||
continue
|
||||
track_data = get_track_data(track.album, track, tf)
|
||||
track_data = get_track_data(track.album, track, uploads)
|
||||
songs.append(track_data)
|
||||
return songs
|
||||
|
||||
|
||||
class GetAlbumSerializer(serializers.Serializer):
|
||||
def to_representation(self, album):
|
||||
tracks = album.tracks.prefetch_related("files").select_related("album")
|
||||
tracks = album.tracks.prefetch_related("uploads").select_related("album")
|
||||
payload = get_album2_data(album)
|
||||
if album.release_date:
|
||||
payload["year"] = album.release_date.year
|
||||
|
@ -129,21 +130,29 @@ class GetAlbumSerializer(serializers.Serializer):
|
|||
return payload
|
||||
|
||||
|
||||
class GetSongSerializer(serializers.Serializer):
|
||||
def to_representation(self, track):
|
||||
uploads = track.uploads.all()
|
||||
if not len(uploads):
|
||||
return {}
|
||||
return get_track_data(track.album, track, uploads[0])
|
||||
|
||||
|
||||
def get_starred_tracks_data(favorites):
|
||||
by_track_id = {f.track_id: f for f in favorites}
|
||||
tracks = (
|
||||
music_models.Track.objects.filter(pk__in=by_track_id.keys())
|
||||
.select_related("album__artist")
|
||||
.prefetch_related("files")
|
||||
.prefetch_related("uploads")
|
||||
)
|
||||
tracks = tracks.order_by("-creation_date")
|
||||
data = []
|
||||
for t in tracks:
|
||||
try:
|
||||
tf = [tf for tf in t.files.all()][0]
|
||||
uploads = [upload for upload in t.uploads.all()][0]
|
||||
except IndexError:
|
||||
continue
|
||||
td = get_track_data(t.album, t, tf)
|
||||
td = get_track_data(t.album, t, uploads)
|
||||
td["starred"] = by_track_id[t.pk].creation_date
|
||||
data.append(td)
|
||||
return data
|
||||
|
@ -169,26 +178,26 @@ def get_playlist_detail_data(playlist):
|
|||
data = get_playlist_data(playlist)
|
||||
qs = (
|
||||
playlist.playlist_tracks.select_related("track__album__artist")
|
||||
.prefetch_related("track__files")
|
||||
.prefetch_related("track__uploads")
|
||||
.order_by("index")
|
||||
)
|
||||
data["entry"] = []
|
||||
for plt in qs:
|
||||
try:
|
||||
tf = [tf for tf in plt.track.files.all()][0]
|
||||
uploads = [upload for upload in plt.track.uploads.all()][0]
|
||||
except IndexError:
|
||||
continue
|
||||
td = get_track_data(plt.track.album, plt.track, tf)
|
||||
td = get_track_data(plt.track.album, plt.track, uploads)
|
||||
data["entry"].append(td)
|
||||
return data
|
||||
|
||||
|
||||
def get_music_directory_data(artist):
|
||||
tracks = artist.tracks.select_related("album").prefetch_related("files")
|
||||
tracks = artist.tracks.select_related("album").prefetch_related("uploads")
|
||||
data = {"id": artist.pk, "parent": 1, "name": artist.name, "child": []}
|
||||
for track in tracks:
|
||||
try:
|
||||
tf = [tf for tf in track.files.all()][0]
|
||||
upload = [upload for upload in track.uploads.all()][0]
|
||||
except IndexError:
|
||||
continue
|
||||
album = track.album
|
||||
|
@@ -200,19 +209,19 @@ def get_music_directory_data(artist):
             "artist": artist.name,
             "track": track.position or 1,
             "year": track.album.release_date.year if track.album.release_date else 0,
-            "contentType": tf.mimetype,
-            "suffix": tf.extension or "",
-            "duration": tf.duration or 0,
+            "contentType": upload.mimetype,
+            "suffix": upload.extension or "",
+            "duration": upload.duration or 0,
             "created": track.creation_date,
             "albumId": album.pk,
             "artistId": artist.pk,
             "parent": artist.id,
             "type": "music",
         }
-        if tf.bitrate:
-            td["bitrate"] = int(tf.bitrate / 1000)
-        if tf.size:
-            td["size"] = tf.size
+        if upload.bitrate:
+            td["bitrate"] = int(upload.bitrate / 1000)
+        if upload.size:
+            td["size"] = upload.size
         data["child"].append(td)
     return data

@@ -220,9 +229,9 @@ def get_music_directory_data(artist):
 class ScrobbleSerializer(serializers.Serializer):
     submission = serializers.BooleanField(default=True, required=False)
     id = serializers.PrimaryKeyRelatedField(
-        queryset=music_models.Track.objects.annotate(files_count=Count("files")).filter(
-            files_count__gt=0
-        )
+        queryset=music_models.Track.objects.annotate(
+            uploads_count=Count("uploads")
+        ).filter(uploads_count__gt=0)
     )

     def create(self, data):
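Across the serializer changes above, one small idiom recurs: take the first upload attached to a track, or skip the track entirely when it has none. A framework-free sketch of that pattern (the `Track` class and `serialize_tracks` helper below are hypothetical stand-ins, not Funkwhale's Django code):

```python
class Track:
    """Hypothetical stand-in for the Django Track model."""

    def __init__(self, title, uploads):
        self.title = title
        self.uploads = uploads  # plain list instead of a related manager


def serialize_tracks(tracks):
    data = []
    for t in tracks:
        try:
            # Mirrors `[upload for upload in t.uploads.all()][0]`:
            # grab the first upload, let IndexError mean "no audio yet".
            upload = list(t.uploads)[0]
        except IndexError:
            continue  # tracks without uploads are silently skipped
        data.append({"title": t.title, "upload": upload})
    return data
```

Tracks that exist in the database but have no importable audio are simply omitted from Subsonic responses rather than producing broken entries.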
@@ -19,7 +19,9 @@ from funkwhale_api.playlists import models as playlists_models
 from . import authentication, filters, negotiation, serializers


-def find_object(queryset, model_field="pk", field="id", cast=int):
+def find_object(
+    queryset, model_field="pk", field="id", cast=int, filter_playable=False
+):
     def decorator(func):
         def inner(self, request, *args, **kwargs):
             data = request.GET or request.POST
@@ -38,7 +40,7 @@ def find_object(queryset, model_field="pk", field="id", cast=int):
             )
             try:
                 value = cast(raw_value)
-            except (TypeError, ValidationError):
+            except (ValueError, TypeError, ValidationError):
                 return response.Response(
                     {
                         "error": {
@@ -50,6 +52,11 @@ def find_object(queryset, model_field="pk", field="id", cast=int):
             qs = queryset
             if hasattr(qs, "__call__"):
                 qs = qs(request)
+
+            if filter_playable:
+                actor = utils.get_actor_from_request(request)
+                qs = qs.playable_by(actor).distinct()
+
             try:
                 obj = qs.get(**{model_field: value})
             except qs.model.DoesNotExist:
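The `find_object` changes are the core of the permission work in this diff: every Subsonic endpoint that looks up a single object can now opt into `filter_playable=True`, which restricts the lookup to objects the requesting actor may play, and a plain `ValueError` from the cast (e.g. `int("abc")`) is now handled instead of crashing the view. A minimal framework-free sketch of the decorator, with dicts standing in for querysets and responses (not the actual Funkwhale implementation):

```python
def find_object(objects, field="id", cast=int, filter_playable=False):
    """Resolve an id from the request into an object, or return an error."""

    def decorator(func):
        def inner(request, *args, **kwargs):
            raw_value = request.get(field)
            try:
                value = cast(raw_value)
            except (ValueError, TypeError):
                # ValueError is newly caught: int("abc") used to raise unhandled.
                return {"error": "invalid id"}
            candidates = objects
            if filter_playable:
                # Stand-in for qs.playable_by(actor).distinct()
                candidates = [o for o in candidates if o["playable"]]
            matches = [o for o in candidates if o[field] == value]
            if not matches:
                return {"error": "not found"}
            kwargs["obj"] = matches[0]
            return func(request, *args, **kwargs)

        return inner

    return decorator


tracks = [{"id": 1, "playable": True}, {"id": 2, "playable": False}]


@find_object(tracks, filter_playable=True)
def get_song(request, obj=None):
    return {"song": obj["id"]}
```

Note that with `filter_playable=True`, an existing but non-playable object resolves to "not found" rather than leaking its existence to the requester.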
@@ -124,7 +131,9 @@ class SubsonicViewSet(viewsets.GenericViewSet):

     @list_route(methods=["get", "post"], url_name="get_artists", url_path="getArtists")
     def get_artists(self, request, *args, **kwargs):
-        artists = music_models.Artist.objects.all()
+        artists = music_models.Artist.objects.all().playable_by(
+            utils.get_actor_from_request(request)
+        )
         data = serializers.GetArtistsSerializer(artists).data
         payload = {"artists": data}

@@ -132,14 +141,16 @@ class SubsonicViewSet(viewsets.GenericViewSet):

     @list_route(methods=["get", "post"], url_name="get_indexes", url_path="getIndexes")
     def get_indexes(self, request, *args, **kwargs):
-        artists = music_models.Artist.objects.all()
+        artists = music_models.Artist.objects.all().playable_by(
+            utils.get_actor_from_request(request)
+        )
         data = serializers.GetArtistsSerializer(artists).data
         payload = {"indexes": data}

         return response.Response(payload, status=200)

     @list_route(methods=["get", "post"], url_name="get_artist", url_path="getArtist")
-    @find_object(music_models.Artist.objects.all())
+    @find_object(music_models.Artist.objects.all(), filter_playable=True)
     def get_artist(self, request, *args, **kwargs):
         artist = kwargs.pop("obj")
         data = serializers.GetArtistSerializer(artist).data
@@ -147,17 +158,28 @@ class SubsonicViewSet(viewsets.GenericViewSet):

         return response.Response(payload, status=200)

+    @list_route(methods=["get", "post"], url_name="get_song", url_path="getSong")
+    @find_object(music_models.Track.objects.all(), filter_playable=True)
+    def get_song(self, request, *args, **kwargs):
+        track = kwargs.pop("obj")
+        data = serializers.GetSongSerializer(track).data
+        payload = {"song": data}
+
+        return response.Response(payload, status=200)
+
     @list_route(
         methods=["get", "post"], url_name="get_artist_info2", url_path="getArtistInfo2"
     )
-    @find_object(music_models.Artist.objects.all())
+    @find_object(music_models.Artist.objects.all(), filter_playable=True)
     def get_artist_info2(self, request, *args, **kwargs):
         payload = {"artist-info2": {}}

         return response.Response(payload, status=200)

     @list_route(methods=["get", "post"], url_name="get_album", url_path="getAlbum")
-    @find_object(music_models.Album.objects.select_related("artist"))
+    @find_object(
+        music_models.Album.objects.select_related("artist"), filter_playable=True
+    )
     def get_album(self, request, *args, **kwargs):
         album = kwargs.pop("obj")
         data = serializers.GetAlbumSerializer(album).data
@@ -165,16 +187,14 @@ class SubsonicViewSet(viewsets.GenericViewSet):
         return response.Response(payload, status=200)

     @list_route(methods=["get", "post"], url_name="stream", url_path="stream")
-    @find_object(music_models.Track.objects.all())
+    @find_object(music_models.Track.objects.all(), filter_playable=True)
     def stream(self, request, *args, **kwargs):
         track = kwargs.pop("obj")
-        queryset = track.files.select_related(
-            "library_track", "track__album__artist", "track__artist"
-        )
-        track_file = queryset.first()
-        if not track_file:
+        queryset = track.uploads.select_related("track__album__artist", "track__artist")
+        upload = queryset.first()
+        if not upload:
             return response.Response(status=404)
-        return music_views.handle_serve(track_file)
+        return music_views.handle_serve(upload=upload, user=request.user)

     @list_route(methods=["get", "post"], url_name="star", url_path="star")
     @find_object(music_models.Track.objects.all())
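The `stream` rewrite follows the same shape: pick the first upload for the track, answer 404 when there is none, and pass the requesting user to `handle_serve` so per-library access rules can apply downstream. A hypothetical sketch of that control flow (both functions are stand-ins, not Funkwhale's):

```python
def handle_serve(upload, user):
    # Stand-in for music_views.handle_serve; just echoes its inputs.
    return {"status": 200, "upload": upload, "user": user}


def stream(track_uploads, user):
    # `track_uploads` stands in for track.uploads.select_related(...);
    # queryset.first() returns None on an empty queryset, hence the guard.
    upload = track_uploads[0] if track_uploads else None
    if not upload:
        return {"status": 404}
    return handle_serve(upload=upload, user=user)
```

Passing the user explicitly (instead of serving the file unconditionally) is what lets the serving layer honor the new per-user library permissions.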
@@ -214,6 +234,9 @@ class SubsonicViewSet(viewsets.GenericViewSet):
         data = request.GET or request.POST
         filterset = filters.AlbumList2FilterSet(data, queryset=queryset)
         queryset = filterset.qs
+        actor = utils.get_actor_from_request(request)
+        queryset = queryset.playable_by(actor)
+
         try:
             offset = int(data["offset"])
         except (TypeError, KeyError, ValueError):
@@ -233,6 +256,7 @@ class SubsonicViewSet(viewsets.GenericViewSet):
     def search3(self, request, *args, **kwargs):
         data = request.GET or request.POST
         query = str(data.get("query", "")).replace("*", "")
+        actor = utils.get_actor_from_request(request)
         conf = [
             {
                 "subsonic": "artist",
@@ -258,9 +282,9 @@ class SubsonicViewSet(viewsets.GenericViewSet):
                 "subsonic": "song",
                 "search_fields": ["title"],
                 "queryset": (
-                    music_models.Track.objects.prefetch_related("files").select_related(
-                        "album__artist"
-                    )
+                    music_models.Track.objects.prefetch_related(
+                        "uploads"
+                    ).select_related("album__artist")
                 ),
                 "serializer": serializers.get_song_list_data,
             },
@@ -285,6 +309,7 @@ class SubsonicViewSet(viewsets.GenericViewSet):
             queryset = c["queryset"].filter(
                 utils.get_query(query, c["search_fields"])
             )
+            queryset = queryset.playable_by(actor)
             queryset = queryset[offset : offset + size]
             payload["searchResult3"][c["subsonic"]] = c["serializer"](queryset)
         return response.Response(payload)
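Taken together, the `search3` changes mean each result type is filtered by the query, narrowed to items the requesting actor may play, then paginated with `offset`/`size` slicing. A rough sketch with plain lists standing in for querysets (the `playable_by` membership set is a hypothetical simplification of the real manager method):

```python
def search3(items_by_type, query, actor, offset=0, size=20):
    payload = {"searchResult3": {}}
    for kind, items in items_by_type.items():
        # Query match (Funkwhale builds this with utils.get_query
        # over each type's search_fields).
        matches = [i for i in items if query.lower() in i["name"].lower()]
        # New in 0.17: keep only items the actor is allowed to play.
        matches = [i for i in matches if actor in i["playable_by"]]
        # Same pagination as `queryset[offset : offset + size]`.
        payload["searchResult3"][kind] = matches[offset : offset + size]
    return payload
```

Filtering before slicing matters here: paginating first and filtering afterwards would return short or empty pages even when playable matches exist.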
|
Some files were not shown because too many files have changed in this diff Show More
Loading…
Reference in New Issue