Merge branch 'release/1.0'

commit 22c366ded8

@ -97,7 +97,7 @@ black:
  variables:
    GIT_STRATEGY: fetch
  before_script:
-   - pip install black
+   - pip install black==19.10b0
  script:
    - black --check --diff api/
CHANGELOG

@ -10,6 +10,149 @@ This changelog is viewable on the web at https://docs.funkwhale.audio/changelog.

.. towncrier

1.0 (2020-09-09)
----------------

Upgrade instructions are available at
https://docs.funkwhale.audio/index.html

Dropped python 3.5 support [manual action required, non-docker only]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

With Funkwhale 1.0, we're dropping support for Python 3.5. Before upgrading,
ensure ``python3 --version`` returns ``3.6`` or higher.

If it returns ``3.6`` or higher, you have nothing to do.

If it returns ``3.5``, you will need to upgrade your Python version/host, then recreate your virtual environment::

    rm -rf /srv/funkwhale/virtualenv
    python3 -m venv /srv/funkwhale/virtualenv
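As an optional sanity check, you can also ask the interpreter itself whether it meets the new requirement; the snippet below is purely illustrative and is not part of the official upgrade steps::

    # check_python.py -- illustrative only
    import sys

    if sys.version_info < (3, 6):
        raise SystemExit(
            "Python %s is too old for Funkwhale 1.0, please upgrade to 3.6+"
            % sys.version.split()[0]
        )
    print("OK, running Python %d.%d" % sys.version_info[:2])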
Increased quality of JPEG thumbnails [manual action required]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Default quality for JPEG thumbnails was increased from 70 to 95, as 70 was producing visible artifacts in resized images.

Because of this change, existing thumbnails will not load, and you will need to:

1. delete the ``__sized__`` directory in your ``MEDIA_ROOT`` directory
2. run ``python manage.py fw media generate-thumbnails`` to regenerate thumbnails with the enhanced quality

If you don't want to regenerate thumbnails, you can keep the old ones by adding ``THUMBNAIL_JPEG_RESIZE_QUALITY=70`` to your .env file.
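For reference, the new default is wired through the ``VERSATILEIMAGEFIELD_SETTINGS`` dictionary; the excerpt below is taken (slightly condensed) from the settings change shipped in this release, where ``env`` is the django-environ reader Funkwhale already uses::

    VERSATILEIMAGEFIELD_SETTINGS = {
        "create_images_on_demand": False,
        # 95 by default; set THUMBNAIL_JPEG_RESIZE_QUALITY=70 in your .env
        # file to keep the old, lower-quality thumbnails
        "jpeg_resize_quality": env.int("THUMBNAIL_JPEG_RESIZE_QUALITY", default=95),
    }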
Small API breaking change in ``/api/v1/libraries``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

To allow easier crawling of public libraries on a pod, we had to make a slight breaking change
to the behaviour of ``GET /api/v1/libraries``.

Before, it returned only libraries owned by the current user.

Now, it returns all the accessible libraries (including ones from other users and pods).

If you are consuming the API via a third-party client and need to retrieve your libraries,
use the ``scope`` parameter, like this: ``GET /api/v1/libraries?scope=me``

API breaking change in ``/api/v1/albums``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

To increase performance, querying ``/api/v1/albums`` no longer returns album tracks. Embedding
them caused performance issues, especially as some albums and series have dozens or even hundreds of tracks.

If you want to retrieve tracks for an album, you can query ``/api/v1/tracks/?album=<albumid>``.
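For third-party clients, both changes boil down to one extra query parameter. A minimal sketch using the ``requests`` library; the instance URL, token and album id below are placeholders::

    import requests

    INSTANCE = "https://your.funkwhale.example"  # placeholder
    HEADERS = {"Authorization": "Bearer <application-token>"}  # placeholder

    # libraries owned by the current user (the previous default behaviour)
    my_libraries = requests.get(
        INSTANCE + "/api/v1/libraries", params={"scope": "me"}, headers=HEADERS
    ).json()

    # tracks of a given album, now that album payloads no longer embed them
    album_tracks = requests.get(
        INSTANCE + "/api/v1/tracks/", params={"album": 42}, headers=HEADERS
    ).json()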
JWT deprecation
^^^^^^^^^^^^^^^

API Authentication using JWT is deprecated and will be removed in Funkwhale 1.0. Please use OAuth or application tokens
and refer to our API documentation at https://docs.funkwhale.audio/swagger/ for guidance.
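For scripts, the migration usually amounts to swapping the ``Authorization`` header; a short sketch (token values are placeholders, and the exact prefix used by your current JWT client may differ)::

    # before (deprecated): JWT-based header
    # headers = {"Authorization": "JWT <jwt-token>"}

    # after: OAuth access token or application token, sent as a Bearer header
    headers = {"Authorization": "Bearer <application-token>"}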
Full list of changes
^^^^^^^^^^^^^^^^^^^^

Features:

- Allow users to hide compilation artists on the artist search page (#1053)
- Can now launch server import from the UI (#1105)
- Dedicated, advanced search page (#370)
- Persist theme and language settings across sessions (#996)

Enhancements:

- Add support for unauthenticated users hitting the logout page
- Added support for Licence Art Libre (#1088)
- Broadcast/handle rejected follows (#858)
- Confirm email without requiring the user to validate the form manually (#407)
- Display channel and track downloads count (#1178)
- Do not include tracks in album API representation (#1102)
- Dropped python 3.5 support. Python 3.6 is the minimum required version (#1099)
- Improved keyboard accessibility (#1125)
- Improved naming of pages for accessibility (#1127)
- Improved shuffle behaviour (#1190)
- Increased quality of JPEG thumbnails
- Lock focus in modals to improve accessibility (#1128)
- More consistent search UX on /albums, /artists, /radios and /playlists (#1131)
- Play button now replaces the current queue instead of appending to it (#1083)
- Set proper lang attribute on HTML document (#1130)
- Use semantic headers for accessibility (#1121)
- Users can now update their email address (#292)
- [plugin, scrobbler] Use last.fm API v2 for scrobbling if API key and secret are provided
- Added a new, large thumbnail size for cover images (#1205)
- Enforce authentication when viewing remote channels, profiles and libraries (#1210)

Bugfixes:

- Fix broken media support detection (#1180)
- Fix layout issue with playbar on landscape tablets (#1144)
- Fix random radio so that podcast content is not picked up (#1140)
- Fixed an issue with search pages where results would not appear after navigating to another page
- Fixed crash with negative track position in file tags (#1193)
- Handle access errors when scanning directories during file imports
- Make channel card updated times more human-readable, add internationalization (#1089)
- Ensure search page reloads if another search is submitted in the sidebar (#1197)
- Fixed "scope=subscribed" on albums, artists, uploads and libraries API (#1217)
- Fixed broken federation with pods using allow-listing (#1999)
- Fixed broken search when using (, " or & chars (#1196)
- Fixed domains table hidden controls when no domains are found (#1198)

Documentation:

- Simplify Docker mono-container installation and upgrade documentation

Contributors to this release (translation, development, documentation, reviews, design, testing, third-party projects):

- Agate
- Andy Craze
- anonymous
- appzer0
- Arne
- Bheesham Persaud
- Ciarán Ainsworth
- Creak
- Daniele Lira Mereb
- dulz
- Francesc Galí
- ghose
- mekind
- Puri
- Quentin PAGÈS
- Raphaël Ventura
- Simon Arlott
- Slimane Selyan Amiri
- Stefano Pigozzi
- Sébastien de Melo
- vicdorke
- Xosé M

0.21.2 (2020-07-27)
-------------------

@ -223,7 +366,8 @@ All user-related commands are available under the ``python manage.py fw users``

Please refer to the `Admin documentation <https://docs.funkwhale.audio/admin/commands.html#user-management>`_ for
more information and instructions.

Progressive web app [Manual action suggested, non-docker only]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

We've made Funkwhale's Web UI a Progressive Web Application (PWA), in order to improve the user experience
@ -704,6 +704,21 @@ Views: you can find some readable views tests in file: ``api/tests/users/test_vi

Contributing to the front-end
-----------------------------

Styles and themes
^^^^^^^^^^^^^^^^^

Our UI framework is Fomantic UI (https://fomantic-ui.com/), and Funkwhale's custom styles are written in SCSS. All the styles are configured in ``front/src/styles/_main.scss``,
including the import of Fomantic UI styles and components.

We apply several changes on top of the Fomantic CSS files before they are imported:

1. Many hardcoded color values are replaced by CSS vars: e.g. ``color: orange`` is replaced by ``color: var(--vibrant-color)``. This makes theming much easier.
2. Unused component variations and icons are stripped from the source files, in order to reduce the final size of our CSS files.

These changes are applied automatically when running ``yarn install``, through a ``postinstall`` hook. Internally, ``front/scripts/fix-fomantic-css.py`` is called
and handles both kinds of modification. Please refer to this script if you need to add new icons to the project, or to restore component variations that
were stripped, in order to use them.
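Conceptually, the color part of that script is a plain text substitution over the Fomantic sources; a simplified sketch of the idea (the mapping below is illustrative, not the project's actual table)::

    import pathlib

    # illustrative subset of the hardcoded-value -> CSS variable mapping
    REPLACEMENTS = {
        "color: orange": "color: var(--vibrant-color)",
    }

    def fix_fomantic_file(path):
        css = pathlib.Path(path).read_text()
        for hardcoded, variable in REPLACEMENTS.items():
            css = css.replace(hardcoded, variable)
        pathlib.Path(path).write_text(css)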
Running tests
^^^^^^^^^^^^^
README.rst

@ -28,6 +28,16 @@ Contribute

Contribution guidelines as well as development installation instructions
are outlined in `CONTRIBUTING <CONTRIBUTING.rst>`_.

Security issues and vulnerabilities
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

If you find a vulnerability in Funkwhale, please report it on our GitLab instance at
https://dev.funkwhale.audio/funkwhale/funkwhale/-/issues, ensuring you have checked the
``This issue is confidential and should only be visible to team members with at least Reporter access.`` box.

This will ensure only maintainers and developers have access to the vulnerability. Thank you for your help!

Translate
^^^^^^^^^
@ -1,5 +1,4 @@
|
|||
from django.conf.urls import include, url
|
||||
from dynamic_preferences.api.viewsets import GlobalPreferencesViewSet
|
||||
from rest_framework import routers
|
||||
from rest_framework.urlpatterns import format_suffix_patterns
|
||||
|
||||
|
@ -14,22 +13,20 @@ from funkwhale_api.tags import views as tags_views
|
|||
from funkwhale_api.users import jwt_views
|
||||
|
||||
router = common_routers.OptionalSlashRouter()
|
||||
router.register(r"settings", GlobalPreferencesViewSet, basename="settings")
|
||||
router.register(r"activity", activity_views.ActivityViewSet, "activity")
|
||||
router.register(r"tags", tags_views.TagViewSet, "tags")
|
||||
router.register(r"plugins", common_views.PluginViewSet, "plugins")
|
||||
router.register(r"tracks", views.TrackViewSet, "tracks")
|
||||
router.register(r"uploads", views.UploadViewSet, "uploads")
|
||||
router.register(r"libraries", views.LibraryViewSet, "libraries")
|
||||
router.register(r"listen", views.ListenViewSet, "listen")
|
||||
router.register(r"stream", views.StreamViewSet, "stream")
|
||||
router.register(r"artists", views.ArtistViewSet, "artists")
|
||||
router.register(r"channels", audio_views.ChannelViewSet, "channels")
|
||||
router.register(r"subscriptions", audio_views.SubscriptionsViewSet, "subscriptions")
|
||||
router.register(r"albums", views.AlbumViewSet, "albums")
|
||||
router.register(r"licenses", views.LicenseViewSet, "licenses")
|
||||
router.register(r"playlists", playlists_views.PlaylistViewSet, "playlists")
|
||||
router.register(
|
||||
r"playlist-tracks", playlists_views.PlaylistTrackViewSet, "playlist-tracks"
|
||||
)
|
||||
router.register(r"mutations", common_views.MutationViewSet, "mutations")
|
||||
router.register(r"attachments", common_views.AttachmentViewSet, "attachments")
|
||||
v1_patterns = router.urls
|
||||
|
@ -77,9 +74,11 @@ v1_patterns += [
|
|||
r"^history/",
|
||||
include(("funkwhale_api.history.urls", "history"), namespace="history"),
|
||||
),
|
||||
url(r"^", include(("funkwhale_api.users.api_urls", "users"), namespace="users"),),
|
||||
# XXX: remove in Funkwhale 1.1
|
||||
url(
|
||||
r"^users/",
|
||||
include(("funkwhale_api.users.api_urls", "users"), namespace="users"),
|
||||
include(("funkwhale_api.users.api_urls", "users"), namespace="users-nested"),
|
||||
),
|
||||
url(
|
||||
r"^oauth/",
|
||||
|
|
|
@ -0,0 +1,332 @@
|
|||
import copy
|
||||
import logging
|
||||
import os
|
||||
import subprocess
|
||||
import sys
|
||||
|
||||
import persisting_theory
|
||||
from django.core.cache import cache
|
||||
from django.db.models import Q
|
||||
|
||||
from rest_framework import serializers
|
||||
|
||||
logger = logging.getLogger("plugins")
|
||||
|
||||
|
||||
class Startup(persisting_theory.Registry):
|
||||
look_into = "persisting_theory"
|
||||
|
||||
|
||||
class Ready(persisting_theory.Registry):
|
||||
look_into = "persisting_theory"
|
||||
|
||||
|
||||
startup = Startup()
|
||||
ready = Ready()
|
||||
|
||||
_plugins = {}
|
||||
_filters = {}
|
||||
_hooks = {}
|
||||
|
||||
|
||||
class PluginCache(object):
|
||||
def __init__(self, prefix):
|
||||
self.prefix = prefix
|
||||
|
||||
def get(self, key, default=None):
|
||||
key = ":".join([self.prefix, key])
|
||||
return cache.get(key, default)
|
||||
|
||||
def set(self, key, value, duration=None):
|
||||
key = ":".join([self.prefix, key])
|
||||
return cache.set(key, value, duration)
|
||||
|
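Each plugin thus gets a namespaced view of the Django cache: keys are prefixed with the plugin name, so two plugins can reuse the same key without colliding. For example (values are illustrative)::

    cache = PluginCache("myplugin")
    cache.set("last_scan", "2020-09-09", duration=3600)  # stored under "myplugin:last_scan"
    cache.get("last_scan", default=None)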
||||
|
||||
def get_plugin_config(
|
||||
name,
|
||||
user=False,
|
||||
source=False,
|
||||
registry=_plugins,
|
||||
conf={},
|
||||
settings={},
|
||||
description=None,
|
||||
version=None,
|
||||
label=None,
|
||||
homepage=None,
|
||||
):
|
||||
conf = {
|
||||
"name": name,
|
||||
"label": label or name,
|
||||
"logger": logger,
|
||||
# conf is for dynamic settings
|
||||
"conf": conf,
|
||||
# settings is for settings hardcoded in .env
|
||||
"settings": settings,
|
||||
"user": True if source else user,
|
||||
# source plugins are plugins that provide audio content
|
||||
"source": source,
|
||||
"description": description,
|
||||
"version": version,
|
||||
"cache": PluginCache(name),
|
||||
"homepage": homepage,
|
||||
}
|
||||
registry[name] = conf
|
||||
return conf
|
||||
|
||||
|
||||
def load_settings(name, settings):
|
||||
from django.conf import settings as django_settings
|
||||
|
||||
mapping = {
|
||||
"boolean": django_settings.ENV.bool,
|
||||
"text": django_settings.ENV,
|
||||
}
|
||||
values = {}
|
||||
prefix = "FUNKWHALE_PLUGIN_{}".format(name.upper())
|
||||
for s in settings:
|
||||
key = "_".join([prefix, s["name"].upper()])
|
||||
value = mapping[s["type"]](key, default=s.get("default", None))
|
||||
values[s["name"]] = value
|
||||
|
||||
logger.debug("Plugin %s running with settings %s", name, values)
|
||||
return values
|
||||
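The naming scheme above means a plugin's hardcoded settings are read from ``FUNKWHALE_PLUGIN_<PLUGINNAME>_<SETTINGNAME>`` environment variables. An illustrative example (the setting name is inferred from the scrobbler documentation added at the end of this commit)::

    # a plugin declaring this setting...
    settings = [{"name": "lastfm_api_key", "type": "text"}]
    # ...will have load_settings("scrobbler", settings) read its value from
    # FUNKWHALE_PLUGIN_SCROBBLER_LASTFM_API_KEY in the environment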
|
||||
|
||||
def get_session():
|
||||
from funkwhale_api.common import session
|
||||
|
||||
return session.get_session()
|
||||
|
||||
|
||||
def register_filter(name, plugin_config, registry=_filters):
|
||||
def decorator(func):
|
||||
handlers = registry.setdefault(name, [])
|
||||
|
||||
def inner(*args, **kwargs):
|
||||
plugin_config["logger"].debug("Calling filter for %s", name)
|
||||
rval = func(*args, **kwargs)
|
||||
return rval
|
||||
|
||||
handlers.append((plugin_config["name"], inner))
|
||||
return inner
|
||||
|
||||
return decorator
|
||||
|
||||
|
||||
def register_hook(name, plugin_config, registry=_hooks):
|
||||
def decorator(func):
|
||||
handlers = registry.setdefault(name, [])
|
||||
|
||||
def inner(*args, **kwargs):
|
||||
plugin_config["logger"].debug("Calling hook for %s", name)
|
||||
func(*args, **kwargs)
|
||||
|
||||
handlers.append((plugin_config["name"], inner))
|
||||
return inner
|
||||
|
||||
return decorator
|
||||
|
||||
|
||||
class Skip(Exception):
|
||||
pass
|
||||
|
||||
|
||||
def trigger_filter(name, value, enabled=False, **kwargs):
|
||||
"""
|
||||
Call filters registered for "name" with the given
|
||||
args and kwargs.
|
||||
|
||||
Return the value (that could be modified by handlers)
|
||||
"""
|
||||
logger.debug("Calling handlers for filter %s", name)
|
||||
registry = kwargs.pop("registry", _filters)
|
||||
confs = kwargs.pop("confs", {})
|
||||
for plugin_name, handler in registry.get(name, []):
|
||||
if not enabled and confs.get(plugin_name, {}).get("enabled") is False:
|
||||
continue
|
||||
try:
|
||||
value = handler(value, conf=confs.get(plugin_name, {}), **kwargs)
|
||||
except Skip:
|
||||
pass
|
||||
except Exception as e:
|
||||
logger.warn("Plugin %s errored during filter %s: %s", plugin_name, name, e)
|
||||
return value
|
||||
|
||||
|
||||
def trigger_hook(name, enabled=False, **kwargs):
|
||||
"""
|
||||
Call hooks registered for "name" with the given
|
||||
args and kwargs.
|
||||
|
||||
Returns nothing
|
||||
"""
|
||||
logger.debug("Calling handlers for hook %s", name)
|
||||
registry = kwargs.pop("registry", _hooks)
|
||||
confs = kwargs.pop("confs", {})
|
||||
for plugin_name, handler in registry.get(name, []):
|
||||
if not enabled and confs.get(plugin_name, {}).get("enabled") is False:
|
||||
continue
|
||||
try:
|
||||
handler(conf=confs.get(plugin_name, {}).get("conf"), **kwargs)
|
||||
except Skip:
|
||||
pass
|
||||
except Exception as e:
|
||||
logger.warn("Plugin %s errored during hook %s: %s", plugin_name, name, e)
|
||||
|
||||
|
||||
def set_conf(name, conf, user=None, registry=_plugins):
|
||||
from funkwhale_api.common import models
|
||||
|
||||
if not registry[name]["conf"] and not registry[name]["source"]:
|
||||
return
|
||||
conf_serializer = get_serializer_from_conf_template(
|
||||
registry[name]["conf"], user=user, source=registry[name]["source"],
|
||||
)(data=conf)
|
||||
conf_serializer.is_valid(raise_exception=True)
|
||||
if "library" in conf_serializer.validated_data:
|
||||
conf_serializer.validated_data["library"] = str(
|
||||
conf_serializer.validated_data["library"]
|
||||
)
|
||||
conf, _ = models.PluginConfiguration.objects.update_or_create(
|
||||
user=user, code=name, defaults={"conf": conf_serializer.validated_data}
|
||||
)
|
||||
|
||||
|
||||
def get_confs(user=None):
|
||||
from funkwhale_api.common import models
|
||||
|
||||
qs = models.PluginConfiguration.objects.filter(code__in=list(_plugins.keys()))
|
||||
if user:
|
||||
qs = qs.filter(Q(user=None) | Q(user=user))
|
||||
else:
|
||||
qs = qs.filter(user=None)
|
||||
confs = {
|
||||
v["code"]: {"conf": v["conf"], "enabled": v["enabled"]}
|
||||
for v in qs.values("code", "conf", "enabled")
|
||||
}
|
||||
for p, v in _plugins.items():
|
||||
if p not in confs:
|
||||
confs[p] = {"conf": None, "enabled": False}
|
||||
return confs
|
||||
|
||||
|
||||
def get_conf(plugin, user=None):
|
||||
return get_confs(user=user)[plugin]
|
||||
|
||||
|
||||
def enable_conf(code, value, user):
|
||||
from funkwhale_api.common import models
|
||||
|
||||
models.PluginConfiguration.objects.update_or_create(
|
||||
code=code, user=user, defaults={"enabled": value}
|
||||
)
|
||||
|
||||
|
||||
class LibraryField(serializers.UUIDField):
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.actor = kwargs.pop("actor")
|
||||
super().__init__(*args, **kwargs)
|
||||
|
||||
def to_internal_value(self, v):
|
||||
v = super().to_internal_value(v)
|
||||
if not self.actor.libraries.filter(uuid=v).first():
|
||||
raise serializers.ValidationError("Invalid library id")
|
||||
return v
|
||||
|
||||
|
||||
def get_serializer_from_conf_template(conf, source=False, user=None):
|
||||
conf = copy.deepcopy(conf)
|
||||
validators = {f["name"]: f.pop("validator") for f in conf if "validator" in f}
|
||||
mapping = {
|
||||
"url": serializers.URLField,
|
||||
"boolean": serializers.BooleanField,
|
||||
"text": serializers.CharField,
|
||||
"long_text": serializers.CharField,
|
||||
"password": serializers.CharField,
|
||||
"number": serializers.IntegerField,
|
||||
}
|
||||
|
||||
for attr in ["label", "help"]:
|
||||
for c in conf:
|
||||
c.pop(attr, None)
|
||||
|
||||
class Serializer(serializers.Serializer):
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
for field_conf in conf:
|
||||
field_kwargs = copy.copy(field_conf)
|
||||
name = field_kwargs.pop("name")
|
||||
self.fields[name] = mapping[field_kwargs.pop("type")](**field_kwargs)
|
||||
if source:
|
||||
self.fields["library"] = LibraryField(actor=user.actor)
|
||||
|
||||
for vname, v in validators.items():
|
||||
setattr(Serializer, "validate_{}".format(vname), v)
|
||||
return Serializer
|
||||
|
||||
|
||||
def serialize_plugin(plugin_conf, confs):
|
||||
return {
|
||||
"name": plugin_conf["name"],
|
||||
"label": plugin_conf["label"],
|
||||
"description": plugin_conf.get("description") or None,
|
||||
"user": plugin_conf.get("user", False),
|
||||
"source": plugin_conf.get("source", False),
|
||||
"conf": plugin_conf.get("conf", None),
|
||||
"values": confs.get(plugin_conf["name"], {"conf"}).get("conf"),
|
||||
"enabled": plugin_conf["name"] in confs
|
||||
and confs[plugin_conf["name"]]["enabled"],
|
||||
"homepage": plugin_conf["homepage"],
|
||||
}
|
||||
|
||||
|
||||
def install_dependencies(deps):
|
||||
if not deps:
|
||||
return
|
||||
logger.info("Installing plugins dependencies %s", deps)
|
||||
pip_path = os.path.join(os.path.dirname(sys.executable), "pip")
|
||||
subprocess.check_call([pip_path, "install"] + deps)
|
||||
|
||||
|
||||
def background_task(name):
|
||||
from funkwhale_api.taskapp import celery
|
||||
|
||||
def decorator(func):
|
||||
return celery.app.task(func, name=name)
|
||||
|
||||
return decorator
|
||||
|
||||
|
||||
# HOOKS
|
||||
LISTENING_CREATED = "listening_created"
|
||||
"""
|
||||
Called when a track is being listened to
|
||||
"""
|
||||
SCAN = "scan"
|
||||
"""
|
||||
|
||||
"""
|
||||
# FILTERS
|
||||
PLUGINS_DEPENDENCIES = "plugins_dependencies"
|
||||
"""
|
||||
Called with an empty list, use this filter to append pip dependencies
|
||||
to the list for installation.
|
||||
"""
|
||||
PLUGINS_APPS = "plugins_apps"
|
||||
"""
|
||||
Called with an empty list, use this filter to append apps to INSTALLED_APPS
|
||||
"""
|
||||
MIDDLEWARES_BEFORE = "middlewares_before"
|
||||
"""
|
||||
Called with an empty list, use this filter to prepend middlewares
|
||||
to MIDDLEWARE
|
||||
"""
|
||||
MIDDLEWARES_AFTER = "middlewares_after"
|
||||
"""
|
||||
Called with an empty list, use this filter to append middlewares
|
||||
to MIDDLEWARE
|
||||
"""
|
||||
URLS = "urls"
|
||||
"""
|
||||
Called with an empty list, use this filter to register new urls and views
|
||||
"""
|
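Taken together, this module defines the surface a plugin interacts with. A minimal, hypothetical plugin built on top of it could look like the following; the ``myplugin`` name and the ``listening`` keyword argument are illustrative, while ``get_plugin_config``, ``register_hook``, ``LISTENING_CREATED`` and the two autodiscovered module names come from the code in this commit::

    # myplugin/funkwhale_startup.py -- autodiscovered at startup (see settings)
    from config import plugins

    PLUGIN = plugins.get_plugin_config(
        name="myplugin",
        label="My plugin",
        description="Logs every listening",
        version="0.1",
        user=True,
    )

    # myplugin/funkwhale_ready.py -- autodiscovered once Django apps are ready
    from config import plugins
    from .funkwhale_startup import PLUGIN

    @plugins.register_hook(plugins.LISTENING_CREATED, PLUGIN)
    def handle_listening(listening, conf, **kwargs):
        # the exact keyword arguments passed to a hook depend on its caller
        PLUGIN["logger"].info("A track was listened to: %s", listening)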
|
@ -1,13 +1,13 @@
|
|||
from channels.auth import AuthMiddlewareStack
|
||||
from channels.routing import ProtocolTypeRouter, URLRouter
|
||||
from django.conf.urls import url
|
||||
|
||||
from funkwhale_api.common.auth import TokenAuthMiddleware
|
||||
from django.conf.urls import url
|
||||
from funkwhale_api.instance import consumers
|
||||
|
||||
application = ProtocolTypeRouter(
|
||||
{
|
||||
# Empty for now (http->django views is added by default)
|
||||
"websocket": TokenAuthMiddleware(
|
||||
"websocket": AuthMiddlewareStack(
|
||||
URLRouter([url("^api/v1/activity$", consumers.InstanceActivityConsumer)])
|
||||
)
|
||||
}
|
||||
|
|
|
@ -1,9 +1,9 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
from collections import OrderedDict
|
||||
import datetime
|
||||
import logging.config
|
||||
import os
|
||||
import sys
|
||||
|
||||
from urllib.parse import urlsplit
|
||||
|
@ -18,7 +18,7 @@ ROOT_DIR = environ.Path(__file__) - 3 # (/a/b/myfile.py - 3 = /)
|
|||
APPS_DIR = ROOT_DIR.path("funkwhale_api")
|
||||
|
||||
env = environ.Env()
|
||||
|
||||
ENV = env
|
||||
LOGLEVEL = env("LOGLEVEL", default="info").upper()
|
||||
"""
|
||||
Default logging level for the Funkwhale processes""" # pylint: disable=W0105
|
||||
|
@ -46,6 +46,12 @@ logging.config.dictConfig(
|
|||
# required to avoid double logging with root logger
|
||||
"propagate": False,
|
||||
},
|
||||
"plugins": {
|
||||
"level": LOGLEVEL,
|
||||
"handlers": ["console"],
|
||||
# required to avoid double logging with root logger
|
||||
"propagate": False,
|
||||
},
|
||||
"": {"level": "WARNING", "handlers": ["console"]},
|
||||
},
|
||||
}
|
||||
|
@ -86,7 +92,31 @@ FUNKWHALE_PLUGINS_PATH = env(
|
|||
Path to a directory containing Funkwhale plugins. These will be imported at runtime.
|
||||
"""
|
||||
sys.path.append(FUNKWHALE_PLUGINS_PATH)
|
||||
CORE_PLUGINS = [
|
||||
"funkwhale_api.contrib.scrobbler",
|
||||
]
|
||||
|
||||
LOAD_CORE_PLUGINS = env.bool("FUNKWHALE_LOAD_CORE_PLUGINS", default=True)
|
||||
PLUGINS = [p for p in env.list("FUNKWHALE_PLUGINS", default=[]) if p]
|
||||
"""
|
||||
List of Funkwhale plugins to load.
|
||||
"""
|
||||
if LOAD_CORE_PLUGINS:
|
||||
PLUGINS = CORE_PLUGINS + PLUGINS
|
||||
|
||||
# Remove duplicates
|
||||
PLUGINS = list(OrderedDict.fromkeys(PLUGINS))
|
||||
|
||||
if PLUGINS:
|
||||
logger.info("Running with the following plugins enabled: %s", ", ".join(PLUGINS))
|
||||
else:
|
||||
logger.info("Running with no plugins")
|
||||
|
||||
from .. import plugins # noqa
|
||||
|
||||
plugins.startup.autodiscover([p + ".funkwhale_startup" for p in PLUGINS])
|
||||
DEPENDENCIES = plugins.trigger_filter(plugins.PLUGINS_DEPENDENCIES, [], enabled=True)
|
||||
plugins.install_dependencies(DEPENDENCIES)
|
||||
FUNKWHALE_HOSTNAME = None
|
||||
FUNKWHALE_HOSTNAME_SUFFIX = env("FUNKWHALE_HOSTNAME_SUFFIX", default=None)
|
||||
FUNKWHALE_HOSTNAME_PREFIX = env("FUNKWHALE_HOSTNAME_PREFIX", default=None)
|
||||
|
@ -144,22 +174,7 @@ FUNKWHALE_SPA_REWRITE_MANIFEST_URL = env.bool(
|
|||
|
||||
APP_NAME = "Funkwhale"
|
||||
|
||||
# XXX: for backward compat with django 2.2, remove this when django 2.2 support is dropped
|
||||
os.environ["DJANGO_ALLOW_ASYNC_UNSAFE"] = env.bool(
|
||||
"DJANGO_ALLOW_ASYNC_UNSAFE", default="true"
|
||||
)
|
||||
|
||||
# XXX: deprecated, see #186
|
||||
FEDERATION_ENABLED = env.bool("FEDERATION_ENABLED", default=True)
|
||||
FEDERATION_HOSTNAME = env("FEDERATION_HOSTNAME", default=FUNKWHALE_HOSTNAME).lower()
|
||||
# XXX: deprecated, see #186
|
||||
FEDERATION_COLLECTION_PAGE_SIZE = env.int("FEDERATION_COLLECTION_PAGE_SIZE", default=50)
|
||||
# XXX: deprecated, see #186
|
||||
FEDERATION_MUSIC_NEEDS_APPROVAL = env.bool(
|
||||
"FEDERATION_MUSIC_NEEDS_APPROVAL", default=True
|
||||
)
|
||||
# XXX: deprecated, see #186
|
||||
FEDERATION_ACTOR_FETCH_DELAY = env.int("FEDERATION_ACTOR_FETCH_DELAY", default=60 * 12)
|
||||
FEDERATION_SERVICE_ACTOR_USERNAME = env(
|
||||
"FEDERATION_SERVICE_ACTOR_USERNAME", default="service"
|
||||
)
|
||||
|
@ -247,16 +262,6 @@ LOCAL_APPS = (
|
|||
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#installed-apps
|
||||
|
||||
|
||||
PLUGINS = [p for p in env.list("FUNKWHALE_PLUGINS", default=[]) if p]
|
||||
"""
|
||||
List of Funkwhale plugins to load.
|
||||
"""
|
||||
if PLUGINS:
|
||||
logger.info("Running with the following plugins enabled: %s", ", ".join(PLUGINS))
|
||||
else:
|
||||
logger.info("Running with no plugins")
|
||||
|
||||
ADDITIONAL_APPS = env.list("ADDITIONAL_APPS", default=[])
|
||||
"""
|
||||
List of Django apps to load in addition to Funkwhale plugins and apps.
|
||||
|
@ -265,25 +270,32 @@ INSTALLED_APPS = (
|
|||
DJANGO_APPS
|
||||
+ THIRD_PARTY_APPS
|
||||
+ LOCAL_APPS
|
||||
+ tuple(["{}.apps.Plugin".format(p) for p in PLUGINS])
|
||||
+ tuple(ADDITIONAL_APPS)
|
||||
+ tuple(plugins.trigger_filter(plugins.PLUGINS_APPS, [], enabled=True))
|
||||
)
|
||||
|
||||
# MIDDLEWARE CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
ADDITIONAL_MIDDLEWARES_BEFORE = env.list("ADDITIONAL_MIDDLEWARES_BEFORE", default=[])
|
||||
MIDDLEWARE = tuple(ADDITIONAL_MIDDLEWARES_BEFORE) + (
|
||||
"django.middleware.security.SecurityMiddleware",
|
||||
"django.middleware.clickjacking.XFrameOptionsMiddleware",
|
||||
"corsheaders.middleware.CorsMiddleware",
|
||||
"funkwhale_api.common.middleware.SPAFallbackMiddleware",
|
||||
"django.contrib.sessions.middleware.SessionMiddleware",
|
||||
"django.middleware.common.CommonMiddleware",
|
||||
"django.middleware.csrf.CsrfViewMiddleware",
|
||||
"django.contrib.auth.middleware.AuthenticationMiddleware",
|
||||
"django.contrib.messages.middleware.MessageMiddleware",
|
||||
"funkwhale_api.users.middleware.RecordActivityMiddleware",
|
||||
"funkwhale_api.common.middleware.ThrottleStatusMiddleware",
|
||||
MIDDLEWARE = (
|
||||
tuple(plugins.trigger_filter(plugins.MIDDLEWARES_BEFORE, [], enabled=True))
|
||||
+ tuple(ADDITIONAL_MIDDLEWARES_BEFORE)
|
||||
+ (
|
||||
"django.middleware.security.SecurityMiddleware",
|
||||
"django.middleware.clickjacking.XFrameOptionsMiddleware",
|
||||
"corsheaders.middleware.CorsMiddleware",
|
||||
# needs to be before SPA middleware
|
||||
"django.contrib.sessions.middleware.SessionMiddleware",
|
||||
"django.middleware.common.CommonMiddleware",
|
||||
"django.middleware.csrf.CsrfViewMiddleware",
|
||||
# /end
|
||||
"funkwhale_api.common.middleware.SPAFallbackMiddleware",
|
||||
"django.contrib.auth.middleware.AuthenticationMiddleware",
|
||||
"django.contrib.messages.middleware.MessageMiddleware",
|
||||
"funkwhale_api.users.middleware.RecordActivityMiddleware",
|
||||
"funkwhale_api.common.middleware.ThrottleStatusMiddleware",
|
||||
)
|
||||
+ tuple(plugins.trigger_filter(plugins.MIDDLEWARES_AFTER, [], enabled=True))
|
||||
)
|
||||
|
||||
# DEBUG
|
||||
|
@ -567,6 +579,8 @@ AUTHENTICATION_BACKENDS = (
|
|||
"funkwhale_api.users.auth_backends.AllAuthBackend",
|
||||
)
|
||||
SESSION_COOKIE_HTTPONLY = False
|
||||
SESSION_COOKIE_AGE = env.int("SESSION_COOKIE_AGE", default=3600 * 25 * 60)
|
||||
|
||||
# Some really nice defaults
|
||||
ACCOUNT_AUTHENTICATION_METHOD = "username_email"
|
||||
ACCOUNT_EMAIL_REQUIRED = True
|
||||
|
@ -861,6 +875,7 @@ REST_FRAMEWORK = {
|
|||
),
|
||||
"DEFAULT_AUTHENTICATION_CLASSES": (
|
||||
"funkwhale_api.common.authentication.OAuth2Authentication",
|
||||
"funkwhale_api.common.authentication.ApplicationTokenAuthentication",
|
||||
"funkwhale_api.common.authentication.JSONWebTokenAuthenticationQS",
|
||||
"funkwhale_api.common.authentication.BearerTokenHeaderAuth",
|
||||
"funkwhale_api.common.authentication.JSONWebTokenAuthentication",
|
||||
|
@ -998,6 +1013,10 @@ THROTTLING_RATES = {
|
|||
"rate": THROTTLING_USER_RATES.get("oauth-revoke-token", "100/hour"),
|
||||
"description": "OAuth token deletion",
|
||||
},
|
||||
"login": {
|
||||
"rate": THROTTLING_USER_RATES.get("login", "30/hour"),
|
||||
"description": "Login",
|
||||
},
|
||||
"jwt-login": {
|
||||
"rate": THROTTLING_USER_RATES.get("jwt-login", "30/hour"),
|
||||
"description": "JWT token creation",
|
||||
|
@ -1106,11 +1125,6 @@ Exemples:
|
|||
CSRF_USE_SESSIONS = True
|
||||
SESSION_ENGINE = "django.contrib.sessions.backends.cache"
|
||||
|
||||
# Playlist settings
|
||||
# XXX: deprecated, see #186
|
||||
PLAYLISTS_MAX_TRACKS = env.int("PLAYLISTS_MAX_TRACKS", default=250)
|
||||
|
||||
|
||||
ACCOUNT_USERNAME_BLACKLIST = [
|
||||
"funkwhale",
|
||||
"library",
|
||||
|
@ -1147,8 +1161,6 @@ EXTERNAL_REQUESTS_TIMEOUT = env.int("EXTERNAL_REQUESTS_TIMEOUT", default=10)
|
|||
"""
|
||||
Default timeout for external requests.
|
||||
"""
|
||||
# XXX: deprecated, see #186
|
||||
API_AUTHENTICATION_REQUIRED = env.bool("API_AUTHENTICATION_REQUIRED", True)
|
||||
|
||||
MUSIC_DIRECTORY_PATH = env("MUSIC_DIRECTORY_PATH", default=None)
|
||||
"""
|
||||
|
@ -1192,7 +1204,7 @@ On non-docker setup, you don't need to configure this setting.
|
|||
"""
|
||||
# When this is set to default=True, we need to reenable migration music/0042
|
||||
# to ensure data is populated correctly on existing pods
|
||||
MUSIC_USE_DENORMALIZATION = env.bool("MUSIC_USE_DENORMALIZATION", default=False)
|
||||
MUSIC_USE_DENORMALIZATION = env.bool("MUSIC_USE_DENORMALIZATION", default=True)
|
||||
|
||||
USERS_INVITATION_EXPIRATION_DAYS = env.int(
|
||||
"USERS_INVITATION_EXPIRATION_DAYS", default=14
|
||||
|
@ -1211,9 +1223,13 @@ VERSATILEIMAGEFIELD_RENDITION_KEY_SETS = {
|
|||
"attachment_square": [
|
||||
("original", "url"),
|
||||
("medium_square_crop", "crop__200x200"),
|
||||
("large_square_crop", "crop__600x600"),
|
||||
],
|
||||
}
|
||||
VERSATILEIMAGEFIELD_SETTINGS = {"create_images_on_demand": False}
|
||||
VERSATILEIMAGEFIELD_SETTINGS = {
|
||||
"create_images_on_demand": False,
|
||||
"jpeg_resize_quality": env.int("THUMBNAIL_JPEG_RESIZE_QUALITY", default=95),
|
||||
}
|
||||
RSA_KEY_SIZE = 2048
|
||||
# for performance gain in tests, since we don't need to actually create the
|
||||
# thumbnails
|
||||
|
@ -1259,8 +1275,6 @@ FUNKWHALE_SUPPORT_MESSAGE_DELAY = env.int("FUNKWHALE_SUPPORT_MESSAGE_DELAY", def
|
|||
"""
|
||||
Delay in days after signup before we show the "support Funkwhale" message
|
||||
"""
|
||||
# XXX Stable release: remove
|
||||
USE_FULL_TEXT_SEARCH = env.bool("USE_FULL_TEXT_SEARCH", default=True)
|
||||
|
||||
MIN_DELAY_BETWEEN_DOWNLOADS_COUNT = env.int(
|
||||
"MIN_DELAY_BETWEEN_DOWNLOADS_COUNT", default=60 * 60 * 6
|
||||
|
|
|
@ -8,7 +8,9 @@ from django.conf.urls.static import static
|
|||
from funkwhale_api.common import admin
|
||||
from django.views import defaults as default_views
|
||||
|
||||
from config import plugins
|
||||
|
||||
plugins_patterns = plugins.trigger_filter(plugins.URLS, [], enabled=True)
|
||||
urlpatterns = [
|
||||
# Django Admin, use {% url 'admin:index' %}
|
||||
url(settings.ADMIN_URL, admin.site.urls),
|
||||
|
@ -21,8 +23,7 @@ urlpatterns = [
|
|||
),
|
||||
url(r"^api/v1/auth/", include("funkwhale_api.users.rest_auth_urls")),
|
||||
url(r"^accounts/", include("allauth.urls")),
|
||||
# Your stuff: custom urls includes go here
|
||||
]
|
||||
] + plugins_patterns
|
||||
|
||||
if settings.DEBUG:
|
||||
# This allows the error pages to be debugged during development, just visit
|
||||
|
|
|
@ -1,5 +1,5 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
__version__ = "0.21.2"
|
||||
__version__ = "1.0"
|
||||
__version_info__ = tuple(
|
||||
[
|
||||
int(num) if num.isdigit() else num
|
||||
|
|
|
@ -41,7 +41,7 @@ class ChannelFilter(moderation_filters.HiddenContentFilterSet):
|
|||
|
||||
class Meta:
|
||||
model = models.Channel
|
||||
fields = ["q", "scope", "tag", "subscribed", "ordering", "external"]
|
||||
fields = []
|
||||
hidden_content_fields_mapping = moderation_filters.USER_FILTER_CONFIG["CHANNEL"]
|
||||
|
||||
def filter_subscribed(self, queryset, name, value):
|
||||
|
|
|
@ -235,6 +235,7 @@ class ChannelUpdateSerializer(serializers.Serializer):
|
|||
class ChannelSerializer(serializers.ModelSerializer):
|
||||
artist = serializers.SerializerMethodField()
|
||||
actor = serializers.SerializerMethodField()
|
||||
downloads_count = serializers.SerializerMethodField()
|
||||
attributed_to = federation_serializers.APIActorSerializer()
|
||||
rss_url = serializers.CharField(source="get_rss_url")
|
||||
url = serializers.SerializerMethodField()
|
||||
|
@ -250,6 +251,7 @@ class ChannelSerializer(serializers.ModelSerializer):
|
|||
"metadata",
|
||||
"rss_url",
|
||||
"url",
|
||||
"downloads_count",
|
||||
]
|
||||
|
||||
def get_artist(self, obj):
|
||||
|
@ -264,6 +266,9 @@ class ChannelSerializer(serializers.ModelSerializer):
|
|||
def get_subscriptions_count(self, obj):
|
||||
return obj.actor.received_follows.exclude(approved=False).count()
|
||||
|
||||
def get_downloads_count(self, obj):
|
||||
return getattr(obj, "_downloads_count", None)
|
||||
|
||||
def get_actor(self, obj):
|
||||
if obj.attributed_to == actors.get_service_actor():
|
||||
return None
|
||||
|
@ -824,7 +829,12 @@ def rss_serialize_item(upload):
|
|||
"enclosure": [
|
||||
{
|
||||
# we enforce MP3, since it's the only format supported everywhere
|
||||
"url": federation_utils.full_url(upload.get_listen_url(to="mp3")),
|
||||
"url": federation_utils.full_url(
|
||||
reverse(
|
||||
"api:v1:stream-detail", kwargs={"uuid": str(upload.track.uuid)}
|
||||
)
|
||||
+ ".mp3"
|
||||
),
|
||||
"length": upload.size or 0,
|
||||
"type": "audio/mpeg",
|
||||
}
|
||||
|
|
|
@ -7,7 +7,7 @@ from rest_framework import viewsets
|
|||
|
||||
from django import http
|
||||
from django.db import transaction
|
||||
from django.db.models import Count, Prefetch, Q
|
||||
from django.db.models import Count, Prefetch, Q, Sum
|
||||
from django.utils import timezone
|
||||
|
||||
from funkwhale_api.common import locales
|
||||
|
@ -93,6 +93,14 @@ class ChannelViewSet(
|
|||
return serializers.ChannelUpdateSerializer
|
||||
return serializers.ChannelCreateSerializer
|
||||
|
||||
def get_queryset(self):
|
||||
queryset = super().get_queryset()
|
||||
if self.action == "retrieve":
|
||||
queryset = queryset.annotate(
|
||||
_downloads_count=Sum("artist__tracks__downloads_count")
|
||||
)
|
||||
return queryset
|
||||
|
||||
def perform_create(self, serializer):
|
||||
return serializer.save(attributed_to=self.request.user.actor)
|
||||
|
||||
|
|
|
@ -3,6 +3,8 @@ import sys
|
|||
|
||||
from . import base
|
||||
from . import library # noqa
|
||||
from . import media # noqa
|
||||
from . import plugins # noqa
|
||||
from . import users # noqa
|
||||
|
||||
from rest_framework.exceptions import ValidationError
|
||||
|
|
|
@ -0,0 +1,69 @@
|
|||
import click
|
||||
|
||||
from django.core.cache import cache
|
||||
from django.conf import settings
|
||||
from django.core.files.storage import default_storage
|
||||
|
||||
from versatileimagefield.image_warmer import VersatileImageFieldWarmer
|
||||
from versatileimagefield import settings as vif_settings
|
||||
|
||||
from funkwhale_api.common import utils as common_utils
|
||||
from funkwhale_api.common.models import Attachment
|
||||
|
||||
from . import base
|
||||
|
||||
|
||||
@base.cli.group()
|
||||
def media():
|
||||
"""Manage media files (avatars, covers, attachments…)"""
|
||||
pass
|
||||
|
||||
|
||||
@media.command("generate-thumbnails")
|
||||
@click.option("-d", "--delete", is_flag=True)
|
||||
def generate_thumbnails(delete):
|
||||
"""
|
||||
Generate thumbnails for all images (avatars, covers, etc.).
|
||||
|
||||
This can take a long time and generate a lot of I/O depending on the size
|
||||
of your library.
|
||||
"""
|
||||
click.echo("Deleting existing thumbnails…")
|
||||
if delete:
|
||||
try:
|
||||
# FileSystemStorage doesn't support deleting a non-empty directory
|
||||
# so we reimplemented a method to do so
|
||||
default_storage.force_delete("__sized__")
|
||||
except AttributeError:
|
||||
# this storage backend doesn't support directory deletion
|
||||
pass
|
||||
MODELS = [
|
||||
(Attachment, "file", "attachment_square"),
|
||||
]
|
||||
for model, attribute, key_set in MODELS:
|
||||
click.echo(
|
||||
"Generating thumbnails for {}.{}…".format(model._meta.label, attribute)
|
||||
)
|
||||
qs = model.objects.exclude(**{"{}__isnull".format(attribute): True})
|
||||
qs = qs.exclude(**{attribute: ""})
|
||||
cache_key = "*{}{}*".format(
|
||||
settings.MEDIA_URL, vif_settings.VERSATILEIMAGEFIELD_SIZED_DIRNAME
|
||||
)
|
||||
entries = cache.keys(cache_key)
|
||||
if entries:
|
||||
click.echo(
|
||||
" Clearing {} cache entries: {}…".format(len(entries), cache_key)
|
||||
)
|
||||
for keys in common_utils.batch(iter(entries)):
|
||||
cache.delete_many(keys)
|
||||
warmer = VersatileImageFieldWarmer(
|
||||
instance_or_queryset=qs,
|
||||
rendition_key_set=key_set,
|
||||
image_attr=attribute,
|
||||
verbose=True,
|
||||
)
|
||||
click.echo(" Creating images")
|
||||
num_created, failed_to_create = warmer.warm()
|
||||
click.echo(
|
||||
" {} created, {} in error".format(num_created, len(failed_to_create))
|
||||
)
|
|
@ -0,0 +1,35 @@
|
|||
import os
|
||||
import subprocess
|
||||
import sys
|
||||
|
||||
import click
|
||||
from django.conf import settings
|
||||
|
||||
|
||||
from . import base
|
||||
|
||||
|
||||
@base.cli.group()
|
||||
def plugins():
|
||||
"""Manage plugins"""
|
||||
pass
|
||||
|
||||
|
||||
@plugins.command("install")
|
||||
@click.argument("plugin", nargs=-1)
|
||||
def install(plugin):
|
||||
"""
|
||||
Install a plugin from a given URL (zip, pip or git are supported)
|
||||
"""
|
||||
if not plugin:
|
||||
return click.echo("No plugin provided")
|
||||
|
||||
click.echo("Installing plugins…")
|
||||
pip_install(list(plugin), settings.FUNKWHALE_PLUGINS_PATH)
|
||||
|
||||
|
||||
def pip_install(deps, target):
|
||||
if not deps:
|
||||
return
|
||||
pip_path = os.path.join(os.path.dirname(sys.executable), "pip")
|
||||
subprocess.check_call([pip_path, "install", "-t", target] + deps)
|
|
@ -1,4 +1,7 @@
|
|||
from django.apps import AppConfig, apps
|
||||
from django.conf import settings
|
||||
|
||||
from config import plugins
|
||||
|
||||
from . import mutations
|
||||
from . import utils
|
||||
|
@ -13,3 +16,6 @@ class CommonConfig(AppConfig):
|
|||
app_names = [app.name for app in apps.app_configs.values()]
|
||||
mutations.registry.autodiscover(app_names)
|
||||
utils.monkey_patch_request_build_absolute_uri()
|
||||
plugins.startup.autodiscover([p + ".funkwhale_ready" for p in settings.PLUGINS])
|
||||
for p in plugins._plugins.values():
|
||||
p["settings"] = plugins.load_settings(p["name"], p["settings"])
|
||||
|
|
|
@ -1,40 +0,0 @@
|
|||
from urllib.parse import parse_qs
|
||||
|
||||
from django.contrib.auth.models import AnonymousUser
|
||||
from rest_framework import exceptions
|
||||
from rest_framework_jwt.authentication import BaseJSONWebTokenAuthentication
|
||||
|
||||
from funkwhale_api.users.models import User
|
||||
|
||||
|
||||
class TokenHeaderAuth(BaseJSONWebTokenAuthentication):
|
||||
def get_jwt_value(self, request):
|
||||
|
||||
try:
|
||||
qs = request.get("query_string", b"").decode("utf-8")
|
||||
parsed = parse_qs(qs)
|
||||
token = parsed["token"][0]
|
||||
except KeyError:
|
||||
raise exceptions.AuthenticationFailed("No token")
|
||||
|
||||
if not token:
|
||||
raise exceptions.AuthenticationFailed("Empty token")
|
||||
|
||||
return token
|
||||
|
||||
|
||||
class TokenAuthMiddleware:
|
||||
def __init__(self, inner):
|
||||
# Store the ASGI application we were passed
|
||||
self.inner = inner
|
||||
|
||||
def __call__(self, scope):
|
||||
# XXX: 1.0 remove this, replace with websocket/scopedtoken
|
||||
auth = TokenHeaderAuth()
|
||||
try:
|
||||
user, token = auth.authenticate(scope)
|
||||
except (User.DoesNotExist, exceptions.AuthenticationFailed):
|
||||
user = AnonymousUser()
|
||||
|
||||
scope["user"] = user
|
||||
return self.inner(scope)
|
|
@ -12,6 +12,8 @@ from rest_framework import exceptions
|
|||
from rest_framework_jwt import authentication
|
||||
from rest_framework_jwt.settings import api_settings
|
||||
|
||||
from funkwhale_api.users import models as users_models
|
||||
|
||||
|
||||
def should_verify_email(user):
|
||||
if user.is_superuser:
|
||||
|
@ -46,6 +48,36 @@ class OAuth2Authentication(BaseOAuth2Authentication):
|
|||
resend_confirmation_email(request, e.user)
|
||||
|
||||
|
||||
class ApplicationTokenAuthentication(object):
|
||||
def authenticate(self, request):
|
||||
try:
|
||||
header = request.headers["Authorization"]
|
||||
except KeyError:
|
||||
return
|
||||
|
||||
if "Bearer" not in header:
|
||||
return
|
||||
|
||||
token = header.split()[-1].strip()
|
||||
|
||||
try:
|
||||
application = users_models.Application.objects.exclude(user=None).get(
|
||||
token=token
|
||||
)
|
||||
except users_models.Application.DoesNotExist:
|
||||
return
|
||||
user = users_models.User.objects.all().for_auth().get(id=application.user_id)
|
||||
if not user.is_active:
|
||||
msg = _("User account is disabled.")
|
||||
raise exceptions.AuthenticationFailed(msg)
|
||||
|
||||
if should_verify_email(user):
|
||||
raise UnverifiedEmail(user)
|
||||
|
||||
request.scopes = application.scope.split()
|
||||
return user, None
|
||||
|
||||
|
||||
class BaseJsonWebTokenAuth(object):
|
||||
def authenticate(self, request):
|
||||
try:
|
||||
|
|
|
@ -1,19 +1,15 @@
|
|||
from dynamic_preferences import types
|
||||
from dynamic_preferences.registries import global_preferences_registry
|
||||
|
||||
from funkwhale_api.common import preferences
|
||||
|
||||
common = types.Section("common")
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class APIAutenticationRequired(
|
||||
preferences.DefaultFromSettingMixin, types.BooleanPreference
|
||||
):
|
||||
class APIAutenticationRequired(types.BooleanPreference):
|
||||
section = common
|
||||
name = "api_authentication_required"
|
||||
verbose_name = "API Requires authentication"
|
||||
setting = "API_AUTHENTICATION_REQUIRED"
|
||||
default = True
|
||||
help_text = (
|
||||
"If disabled, anonymous users will be able to query the API"
|
||||
"and access music data (as well as other data exposed in the API "
|
||||
|
|
|
@ -35,3 +35,12 @@ class CommonFactory(NoUpdateOnCreate, factory.django.DjangoModelFactory):
|
|||
|
||||
class Meta:
|
||||
model = "common.Content"
|
||||
|
||||
|
||||
@registry.register
|
||||
class PluginConfiguration(NoUpdateOnCreate, factory.django.DjangoModelFactory):
|
||||
code = "test"
|
||||
conf = {"foo": "bar"}
|
||||
|
||||
class Meta:
|
||||
model = "common.PluginConfiguration"
|
||||
|
|
|
@ -1,6 +1,5 @@
|
|||
import django_filters
|
||||
from django import forms
|
||||
from django.conf import settings
|
||||
from django.core.serializers.json import DjangoJSONEncoder
|
||||
from django.db import models
|
||||
|
||||
|
@ -40,7 +39,7 @@ class SearchFilter(django_filters.CharFilter):
|
|||
def filter(self, qs, value):
|
||||
if not value:
|
||||
return qs
|
||||
if settings.USE_FULL_TEXT_SEARCH and self.fts_search_fields:
|
||||
if self.fts_search_fields:
|
||||
query = search.get_fts_query(
|
||||
value, self.fts_search_fields, model=self.parent.Meta.model
|
||||
)
|
||||
|
|
|
@ -120,7 +120,6 @@ class MultipleQueryFilter(filters.TypedMultipleChoiceFilter):
|
|||
def __init__(self, *args, **kwargs):
|
||||
kwargs["widget"] = QueryArrayWidget()
|
||||
super().__init__(*args, **kwargs)
|
||||
self.lookup_expr = "in"
|
||||
|
||||
|
||||
def filter_target(value):
|
||||
|
@ -228,7 +227,7 @@ class ActorScopeFilter(filters.CharFilter):
|
|||
username, domain = full_username.split("@")
|
||||
try:
|
||||
actor = federation_models.Actor.objects.get(
|
||||
preferred_username=username, domain_id=domain,
|
||||
preferred_username__iexact=username, domain_id=domain,
|
||||
)
|
||||
except federation_models.Actor.DoesNotExist:
|
||||
raise EmptyQuerySet()
|
||||
|
|
|
@ -10,6 +10,8 @@ import xml.sax.saxutils
|
|||
from django import http
|
||||
from django.conf import settings
|
||||
from django.core.cache import caches
|
||||
from django.middleware import csrf
|
||||
from django.contrib import auth
|
||||
from django import urls
|
||||
from rest_framework import views
|
||||
|
||||
|
@ -81,7 +83,12 @@ def serve_spa(request):
|
|||
body, tail = tail.split("</body>", 1)
|
||||
css = "<style>{}</style>".format(css)
|
||||
tail = body + "\n" + css + "\n</body>" + tail
|
||||
return http.HttpResponse(head + tail)
|
||||
|
||||
# set a CSRF token so that visitors can log in / query the API if needed
|
||||
token = csrf.get_token(request)
|
||||
response = http.HttpResponse(head + tail)
|
||||
response.set_cookie("csrftoken", token, max_age=None)
|
||||
return response
|
||||
|
||||
|
||||
MANIFEST_LINK_REGEX = re.compile(r"<link [^>]*rel=(?:'|\")?manifest(?:'|\")?[^>]*>")
|
||||
|
@ -106,7 +113,7 @@ def get_spa_html(spa_url):
|
|||
|
||||
def get_spa_file(spa_url, name):
|
||||
if spa_url.startswith("/"):
|
||||
# XXX: spa_url is an absolute path to index.html, on the local disk.
|
||||
# spa_url is an absolute path to index.html, on the local disk.
|
||||
# However, we may want to access manifest.json or other files as well, so we
|
||||
# strip the filename
|
||||
path = os.path.join(os.path.dirname(spa_url), name)
|
||||
|
@ -276,6 +283,25 @@ def monkey_patch_rest_initialize_request():
|
|||
monkey_patch_rest_initialize_request()
|
||||
|
||||
|
||||
def monkey_patch_auth_get_user():
|
||||
"""
|
||||
We need an actor on our users for many endpoints, so we monkey patch
|
||||
auth.get_user to create it if it's missing
|
||||
"""
|
||||
original = auth.get_user
|
||||
|
||||
def replacement(request):
|
||||
r = original(request)
|
||||
if not r.is_anonymous and not r.actor:
|
||||
r.create_actor()
|
||||
return r
|
||||
|
||||
setattr(auth, "get_user", replacement)
|
||||
|
||||
|
||||
monkey_patch_auth_get_user()
|
||||
|
||||
|
||||
class ThrottleStatusMiddleware:
|
||||
"""
|
||||
Include useful information regarding throttling in API responses to
|
||||
|
|
|
@ -0,0 +1,37 @@
|
|||
# Generated by Django 3.0.8 on 2020-07-01 13:17
|
||||
|
||||
from django.conf import settings
|
||||
import django.contrib.postgres.fields.jsonb
|
||||
from django.db import migrations, models
|
||||
import django.db.models.deletion
|
||||
import django.utils.timezone
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
|
||||
('common', '0007_auto_20200116_1610'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='attachment',
|
||||
name='url',
|
||||
field=models.URLField(blank=True, max_length=500, null=True),
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='PluginConfiguration',
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('code', models.CharField(max_length=100)),
|
||||
('conf', django.contrib.postgres.fields.jsonb.JSONField(blank=True, null=True)),
|
||||
('enabled', models.BooleanField(default=False)),
|
||||
('creation_date', models.DateTimeField(default=django.utils.timezone.now)),
|
||||
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='plugins', to=settings.AUTH_USER_MODEL)),
|
||||
],
|
||||
options={
|
||||
'unique_together': {('user', 'code')},
|
||||
},
|
||||
),
|
||||
]
|
|
@ -267,6 +267,13 @@ class Attachment(models.Model):
|
|||
proxy_url = reverse("api:v1:attachments-proxy", kwargs={"uuid": self.uuid})
|
||||
return federation_utils.full_url(proxy_url + "?next=medium_square_crop")
|
||||
|
||||
@property
|
||||
def download_url_large_square_crop(self):
|
||||
if self.file:
|
||||
return utils.media_url(self.file.crop["600x600"].url)
|
||||
proxy_url = reverse("api:v1:attachments-proxy", kwargs={"uuid": self.uuid})
|
||||
return federation_utils.full_url(proxy_url + "?next=large_square_crop")
|
||||
|
||||
|
||||
class MutationAttachment(models.Model):
|
||||
"""
|
||||
|
@ -363,3 +370,24 @@ def remove_attached_content(sender, instance, **kwargs):
|
|||
getattr(instance, field).delete()
|
||||
except Content.DoesNotExist:
|
||||
pass
|
||||
|
||||
|
||||
class PluginConfiguration(models.Model):
|
||||
"""
|
||||
Store plugin configuration in DB
|
||||
"""
|
||||
|
||||
code = models.CharField(max_length=100)
|
||||
user = models.ForeignKey(
|
||||
"users.User",
|
||||
related_name="plugins",
|
||||
on_delete=models.CASCADE,
|
||||
null=True,
|
||||
blank=True,
|
||||
)
|
||||
conf = JSONField(null=True, blank=True)
|
||||
enabled = models.BooleanField(default=False)
|
||||
creation_date = models.DateTimeField(default=timezone.now)
|
||||
|
||||
class Meta:
|
||||
unique_together = ("user", "code")
|
||||
|
|
|
@ -1,7 +1,7 @@
|
|||
from rest_framework.routers import SimpleRouter
|
||||
from rest_framework.routers import DefaultRouter
|
||||
|
||||
|
||||
class OptionalSlashRouter(SimpleRouter):
|
||||
class OptionalSlashRouter(DefaultRouter):
|
||||
def __init__(self):
|
||||
super().__init__()
|
||||
self.trailing_slash = "/?"
|
||||
|
|
|
@ -1,5 +1,4 @@
|
|||
from . import create_actors
|
||||
from . import create_image_variations
|
||||
from . import django_permissions_to_user_permissions
|
||||
from . import migrate_to_user_libraries
|
||||
from . import delete_pre_017_federated_uploads
|
||||
|
@ -8,7 +7,6 @@ from . import test
|
|||
|
||||
__all__ = [
|
||||
"create_actors",
|
||||
"create_image_variations",
|
||||
"django_permissions_to_user_permissions",
|
||||
"migrate_to_user_libraries",
|
||||
"delete_pre_017_federated_uploads",
|
||||
|
|
|
@ -59,11 +59,19 @@ def get_query(query_string, search_fields):
|
|||
return query
|
||||
|
||||
|
||||
def remove_chars(string, chars):
|
||||
for char in chars:
|
||||
string = string.replace(char, "")
|
||||
return string
|
||||
|
||||
|
||||
def get_fts_query(query_string, fts_fields=["body_text"], model=None):
|
||||
search_type = "raw"
|
||||
if query_string.startswith('"') and query_string.endswith('"'):
|
||||
# we pass the query directly to the FTS engine
|
||||
query_string = query_string[1:-1]
|
||||
else:
|
||||
query_string = remove_chars(query_string, ['"', "&", "(", ")", "!", "'"])
|
||||
parts = query_string.replace(":", "").split(" ")
|
||||
parts = ["{}:*".format(p) for p in parts if p]
|
||||
if not parts:
|
||||
|
@ -86,7 +94,7 @@ def get_fts_query(query_string, fts_fields=["body_text"], model=None):
|
|||
subquery = related_model.objects.filter(
|
||||
**{
|
||||
lookup: SearchQuery(
|
||||
query_string, search_type="raw", config="english_nostop"
|
||||
query_string, search_type=search_type, config="english_nostop"
|
||||
)
|
||||
}
|
||||
).values_list("pk", flat=True)
|
||||
|
@ -95,7 +103,7 @@ def get_fts_query(query_string, fts_fields=["body_text"], model=None):
|
|||
new_query = Q(
|
||||
**{
|
||||
field: SearchQuery(
|
||||
query_string, search_type="raw", config="english_nostop"
|
||||
query_string, search_type=search_type, config="english_nostop"
|
||||
)
|
||||
}
|
||||
)
|
||||
|
|
|
@ -297,22 +297,9 @@ class AttachmentSerializer(serializers.Serializer):
|
|||
urls["source"] = o.url
|
||||
urls["original"] = o.download_url_original
|
||||
urls["medium_square_crop"] = o.download_url_medium_square_crop
|
||||
urls["large_square_crop"] = o.download_url_large_square_crop
|
||||
return urls
|
||||
|
||||
def to_representation(self, o):
|
||||
repr = super().to_representation(o)
|
||||
# XXX: BACKWARD COMPATIBILITY
|
||||
# having the attachment urls in a nested JSON obj is better,
|
||||
# but we can't do this without breaking clients
|
||||
# So we extract the urls and include these in the parent payload
|
||||
repr.update({k: v for k, v in repr["urls"].items() if k != "source"})
|
||||
# also, our legacy images had lots of variations (400x400, 200x200, 50x50)
|
||||
# but we removed some of these, so we emulate these by hand (by redirecting)
|
||||
# to actual, existing attachment variations
|
||||
repr["square_crop"] = repr["medium_square_crop"]
|
||||
repr["small_square_crop"] = repr["medium_square_crop"]
|
||||
return repr
|
||||
|
||||
def create(self, validated_data):
|
||||
return models.Attachment.objects.create(
|
||||
file=validated_data["file"], actor=validated_data["actor"]
|
||||
|
|
|
@ -1,3 +1,5 @@
|
|||
import os
|
||||
import shutil
|
||||
import slugify
|
||||
|
||||
from django.core.files.storage import FileSystemStorage
|
||||
|
@ -15,6 +17,16 @@ class ASCIIFileSystemStorage(FileSystemStorage):
|
|||
def get_valid_name(self, name):
|
||||
return super().get_valid_name(asciionly(name))
|
||||
|
||||
def force_delete(self, name):
|
||||
path = self.path(name)
|
||||
try:
|
||||
if os.path.isdir(path):
|
||||
shutil.rmtree(path)
|
||||
else:
|
||||
return super().delete(name)
|
||||
except FileNotFoundError:
|
||||
pass
|
||||
|
||||
|
||||
class ASCIIS3Boto3Storage(S3Boto3Storage):
|
||||
def get_valid_name(self, name):
|
||||
|
|
|
@ -23,6 +23,18 @@ from django.utils import timezone
|
|||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def batch(iterable, n=1):
|
||||
has_entries = True
|
||||
while has_entries:
|
||||
current = []
|
||||
for i in range(0, n):
|
||||
try:
|
||||
current.append(next(iterable))
|
||||
except StopIteration:
|
||||
has_entries = False
|
||||
yield current
|
||||
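A quick usage sketch for the ``batch`` helper above (values are illustrative). Note that when the iterable's length is an exact multiple of ``n`` the generator yields one final empty chunk, which is harmless for callers such as ``cache.delete_many``::

    from funkwhale_api.common.utils import batch  # the helper defined above

    for chunk in batch(iter(range(5)), n=2):
        print(chunk)
    # prints [0, 1], then [2, 3], then [4]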
|
||||
|
||||
def rename_file(instance, field_name, new_name, allow_missing_file=False):
|
||||
field = getattr(instance, field_name)
|
||||
current_name, extension = os.path.splitext(field.name)
|
||||
|
|
|
@ -12,6 +12,8 @@ from rest_framework import response
|
|||
from rest_framework import views
|
||||
from rest_framework import viewsets
|
||||
|
||||
from config import plugins
|
||||
|
||||
from funkwhale_api.users.oauth import permissions as oauth_permissions
|
||||
|
||||
from . import filters
|
||||
|
@ -173,7 +175,7 @@ class AttachmentViewSet(
|
|||
return r
|
||||
|
||||
size = request.GET.get("next", "original").lower()
|
||||
if size not in ["original", "medium_square_crop"]:
|
||||
if size not in ["original", "medium_square_crop", "large_square_crop"]:
|
||||
size = "original"
|
||||
|
||||
try:
|
||||
|
@ -210,3 +212,102 @@ class TextPreviewView(views.APIView):
|
|||
)
|
||||
}
|
||||
return response.Response(data, status=200)
|
||||
|
||||
|
||||
class PluginViewSet(mixins.ListModelMixin, viewsets.GenericViewSet):
|
||||
required_scope = "plugins"
|
||||
serializer_class = serializers.serializers.Serializer
|
||||
queryset = models.PluginConfiguration.objects.none()
|
||||
|
||||
def list(self, request, *args, **kwargs):
|
||||
user = request.user
|
||||
user_plugins = [p for p in plugins._plugins.values() if p["user"] is True]
|
||||
|
||||
return response.Response(
|
||||
[
|
||||
plugins.serialize_plugin(p, confs=plugins.get_confs(user=user))
|
||||
for p in user_plugins
|
||||
]
|
||||
)
|
||||
|
||||
def retrieve(self, request, *args, **kwargs):
|
||||
user = request.user
|
||||
user_plugin = [
|
||||
p
|
||||
for p in plugins._plugins.values()
|
||||
if p["user"] is True and p["name"] == kwargs["pk"]
|
||||
]
|
||||
if not user_plugin:
|
||||
return response.Response(status=404)
|
||||
|
||||
return response.Response(
|
||||
plugins.serialize_plugin(user_plugin[0], confs=plugins.get_confs(user=user))
|
||||
)
|
||||
|
||||
def post(self, request, *args, **kwargs):
|
||||
return self.create(request, *args, **kwargs)
|
||||
|
||||
def create(self, request, *args, **kwargs):
|
||||
user = request.user
|
||||
confs = plugins.get_confs(user=user)
|
||||
|
||||
user_plugin = [
|
||||
p
|
||||
for p in plugins._plugins.values()
|
||||
if p["user"] is True and p["name"] == kwargs["pk"]
|
||||
]
|
||||
if kwargs["pk"] not in confs:
|
||||
return response.Response(status=404)
|
||||
plugins.set_conf(kwargs["pk"], request.data, user)
|
||||
return response.Response(
|
||||
plugins.serialize_plugin(user_plugin[0], confs=plugins.get_confs(user=user))
|
||||
)
|
||||
|
||||
def delete(self, request, *args, **kwargs):
|
||||
user = request.user
|
||||
confs = plugins.get_confs(user=user)
|
||||
if kwargs["pk"] not in confs:
|
||||
return response.Response(status=404)
|
||||
|
||||
user.plugins.filter(code=kwargs["pk"]).delete()
|
||||
return response.Response(status=204)
|
||||
|
||||
@action(detail=True, methods=["post"])
|
||||
def enable(self, request, *args, **kwargs):
|
||||
user = request.user
|
||||
if kwargs["pk"] not in plugins._plugins:
|
||||
return response.Response(status=404)
|
||||
plugins.enable_conf(kwargs["pk"], True, user)
|
||||
return response.Response({}, status=200)
|
||||
|
||||
@action(detail=True, methods=["post"])
|
||||
def disable(self, request, *args, **kwargs):
|
||||
user = request.user
|
||||
if kwargs["pk"] not in plugins._plugins:
|
||||
return response.Response(status=404)
|
||||
plugins.enable_conf(kwargs["pk"], False, user)
|
||||
return response.Response({}, status=200)
|
||||
|
||||
@action(detail=True, methods=["post"])
|
||||
def scan(self, request, *args, **kwargs):
|
||||
user = request.user
|
||||
if kwargs["pk"] not in plugins._plugins:
|
||||
return response.Response(status=404)
|
||||
conf = plugins.get_conf(kwargs["pk"], user=user)
|
||||
|
||||
if not conf["enabled"]:
|
||||
return response.Response(status=405)
|
||||
|
||||
library = request.user.actor.libraries.get(uuid=conf["conf"]["library"])
|
||||
hook = [
|
||||
hook
|
||||
for p, hook in plugins._hooks.get(plugins.SCAN, [])
|
||||
if p == kwargs["pk"]
|
||||
]
|
||||
|
||||
if not hook:
|
||||
return response.Response(status=405)
|
||||
|
||||
hook[0](library=library, conf=conf["conf"])
|
||||
|
||||
return response.Response({}, status=200)
@@ -0,0 +1,10 @@
Scrobbler plugin
================

A plugin that enables scrobbling to ListenBrainz and Last.fm.

If you're scrobbling to Last.fm, you will need to create an `API account <https://www.last.fm/api/account/create>`_
and add two variables to your .env file:

- ``FUNKWHALE_PLUGIN_SCROBBLER_LASTFM_API_KEY=apikey``
- ``FUNKWHALE_PLUGIN_SCROBBLER_LASTFM_API_SECRET=apisecret``
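For illustration only, with placeholder values rather than real credentials, the resulting ``.env`` entries would look like::

    FUNKWHALE_PLUGIN_SCROBBLER_LASTFM_API_KEY=3f2a...
    FUNKWHALE_PLUGIN_SCROBBLER_LASTFM_API_SECRET=91bc...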
@ -0,0 +1,71 @@
|
|||
import hashlib
|
||||
|
||||
from config import plugins
|
||||
from .funkwhale_startup import PLUGIN
|
||||
|
||||
from . import scrobbler
|
||||
|
||||
# https://listenbrainz.org/lastfm-proxy
|
||||
DEFAULT_SCROBBLER_URL = "http://post.audioscrobbler.com"
|
||||
LASTFM_SCROBBLER_URL = "https://ws.audioscrobbler.com/2.0/"
|
||||
|
||||
|
||||
@plugins.register_hook(plugins.LISTENING_CREATED, PLUGIN)
|
||||
def forward_to_scrobblers(listening, conf, **kwargs):
|
||||
if not conf:
|
||||
raise plugins.Skip()
|
||||
|
||||
username = conf.get("username")
|
||||
password = conf.get("password")
|
||||
url = conf.get("url", DEFAULT_SCROBBLER_URL) or DEFAULT_SCROBBLER_URL
|
||||
if username and password:
|
||||
session = plugins.get_session()
|
||||
if (
|
||||
PLUGIN["settings"]["lastfm_api_key"]
|
||||
and PLUGIN["settings"]["lastfm_api_secret"]
|
||||
and url == DEFAULT_SCROBBLER_URL
|
||||
):
|
||||
hashed_auth = hashlib.md5(
|
||||
(username + " " + password).encode("utf-8")
|
||||
).hexdigest()
|
||||
cache_key = "lastfm:sessionkey:{}".format(
|
||||
":".join([str(listening.user.pk), hashed_auth])
|
||||
)
|
||||
PLUGIN["logger"].info("Forwarding scrobble to %s", LASTFM_SCROBBLER_URL)
|
||||
session_key = PLUGIN["cache"].get(cache_key)
|
||||
if not session_key:
|
||||
PLUGIN["logger"].debug("Authenticating…")
|
||||
session_key = scrobbler.handshake_v2(
|
||||
username=username,
|
||||
password=password,
|
||||
scrobble_url=LASTFM_SCROBBLER_URL,
|
||||
session=session,
|
||||
api_key=PLUGIN["settings"]["lastfm_api_key"],
|
||||
api_secret=PLUGIN["settings"]["lastfm_api_secret"],
|
||||
)
|
||||
PLUGIN["cache"].set(cache_key, session_key)
|
||||
scrobbler.submit_scrobble_v2(
|
||||
session=session,
|
||||
track=listening.track,
|
||||
scrobble_time=listening.creation_date,
|
||||
session_key=session_key,
|
||||
scrobble_url=LASTFM_SCROBBLER_URL,
|
||||
api_key=PLUGIN["settings"]["lastfm_api_key"],
|
||||
api_secret=PLUGIN["settings"]["lastfm_api_secret"],
|
||||
)
|
||||
|
||||
else:
|
||||
PLUGIN["logger"].info("Forwarding scrobble to %s", url)
|
||||
session_key, now_playing_url, scrobble_url = scrobbler.handshake_v1(
|
||||
session=session, url=url, username=username, password=password
|
||||
)
|
||||
scrobbler.submit_scrobble_v1(
|
||||
session=session,
|
||||
track=listening.track,
|
||||
scrobble_time=listening.creation_date,
|
||||
session_key=session_key,
|
||||
scrobble_url=scrobble_url,
|
||||
)
|
||||
PLUGIN["logger"].info("Scrobble sent!")
|
||||
else:
|
||||
PLUGIN["logger"].debug("No scrobbler configuration for user, skipping")
@@ -0,0 +1,35 @@
from config import plugins

PLUGIN = plugins.get_plugin_config(
    name="scrobbler",
    label="Scrobbler",
    description=(
        "A plugin that enables scrobbling to ListenBrainz and Last.fm. "
        "It must be configured on the server if you use Last.fm."
    ),
    homepage="https://dev.funkwhale.audio/funkwhale/funkwhale/-/blob/develop/api/funkwhale_api/contrib/scrobbler/README.rst", # noqa
    version="0.1",
    user=True,
    conf=[
        {
            "name": "url",
            "type": "url",
            "allow_null": True,
            "allow_blank": True,
            "required": False,
            "label": "URL of the scrobbler service",
            "help": (
                "Suggested choices:\n\n"
                "- LastFM (default if left empty): http://post.audioscrobbler.com\n"
                "- ListenBrainz: http://proxy.listenbrainz.org/\n"
                "- Libre.fm: http://turtle.libre.fm/"
            ),
        },
        {"name": "username", "type": "text", "label": "Your scrobbler username"},
        {"name": "password", "type": "password", "label": "Your scrobbler password"},
    ],
    settings=[
        {"name": "lastfm_api_key", "type": "text"},
        {"name": "lastfm_api_secret", "type": "text"},
    ],
)
@ -0,0 +1,167 @@
|
|||
import hashlib
|
||||
import time
|
||||
|
||||
|
||||
# https://github.com/jlieth/legacy-scrobbler
|
||||
from .funkwhale_startup import PLUGIN
|
||||
|
||||
|
||||
class ScrobblerException(Exception):
|
||||
pass
|
||||
|
||||
|
||||
def handshake_v1(session, url, username, password):
|
||||
timestamp = str(int(time.time())).encode("utf-8")
|
||||
password_hash = hashlib.md5(password.encode("utf-8")).hexdigest()
|
||||
auth = hashlib.md5(password_hash.encode("utf-8") + timestamp).hexdigest()
|
||||
params = {
|
||||
"hs": "true",
|
||||
"p": "1.2",
|
||||
"c": PLUGIN["name"],
|
||||
"v": PLUGIN["version"],
|
||||
"u": username,
|
||||
"t": timestamp,
|
||||
"a": auth,
|
||||
}
|
||||
|
||||
PLUGIN["logger"].debug(
|
||||
"Performing scrobbler handshake for username %s at %s", username, url
|
||||
)
|
||||
handshake_response = session.get(url, params=params)
|
||||
# process response
|
||||
result = handshake_response.text.split("\n")
|
||||
if len(result) >= 4 and result[0] == "OK":
|
||||
session_key = result[1]
|
||||
nowplaying_url = result[2]
|
||||
scrobble_url = result[3]
|
||||
elif result[0] == "BANNED":
|
||||
raise ScrobblerException("BANNED")
|
||||
elif result[0] == "BADAUTH":
|
||||
raise ScrobblerException("BADAUTH")
|
||||
elif result[0] == "BADTIME":
|
||||
raise ScrobblerException("BADTIME")
|
||||
else:
|
||||
raise ScrobblerException(handshake_response.text)
|
||||
|
||||
PLUGIN["logger"].debug("Handshake successful, scrobble url: %s", scrobble_url)
|
||||
return session_key, nowplaying_url, scrobble_url
|
||||
|
||||
|
||||
def submit_scrobble_v1(session, scrobble_time, track, session_key, scrobble_url):
|
||||
payload = get_scrobble_payload(track, scrobble_time)
|
||||
PLUGIN["logger"].debug("Sending scrobble with payload %s", payload)
|
||||
payload["s"] = session_key
|
||||
response = session.post(scrobble_url, payload)
|
||||
response.raise_for_status()
|
||||
if response.text.startswith("OK"):
|
||||
return
|
||||
elif response.text.startswith("BADSESSION"):
|
||||
raise ScrobblerException("Remote server says the session is invalid")
|
||||
else:
|
||||
raise ScrobblerException(response.text)
|
||||
|
||||
PLUGIN["logger"].debug("Scrobble successfull!")
|
||||
|
||||
|
||||
def submit_now_playing_v1(session, track, session_key, now_playing_url):
|
||||
payload = get_scrobble_payload(track, date=None, suffix="")
|
||||
PLUGIN["logger"].debug("Sending now playing with payload %s", payload)
|
||||
payload["s"] = session_key
|
||||
response = session.post(now_playing_url, payload)
|
||||
response.raise_for_status()
|
||||
if response.text.startswith("OK"):
|
||||
return
|
||||
elif response.text.startswith("BADSESSION"):
|
||||
raise ScrobblerException("Remote server says the session is invalid")
|
||||
else:
|
||||
raise ScrobblerException(response.text)
|
||||
|
||||
PLUGIN["logger"].debug("Now playing successfull!")
|
||||
|
||||
|
||||
def get_scrobble_payload(track, date, suffix="[0]"):
|
||||
"""
|
||||
Documentation available at https://web.archive.org/web/20190531021725/https://www.last.fm/api/submissions
|
||||
"""
|
||||
upload = track.uploads.filter(duration__gte=0).first()
|
||||
data = {
|
||||
"a{}".format(suffix): track.artist.name,
|
||||
"t{}".format(suffix): track.title,
|
||||
"l{}".format(suffix): upload.duration if upload else 0,
|
||||
"b{}".format(suffix): (track.album.title if track.album else "") or "",
|
||||
"n{}".format(suffix): track.position or "",
|
||||
"m{}".format(suffix): str(track.mbid) or "",
|
||||
"o{}".format(suffix): "P", # Source: P = chosen by user
|
||||
}
|
||||
if date:
|
||||
data["i{}".format(suffix)] = int(date.timestamp())
|
||||
return data
|
||||
|
||||
|
||||
def get_scrobble2_payload(track, date, suffix="[0]"):
|
||||
"""
|
||||
Documentation available at https://web.archive.org/web/20190531021725/https://www.last.fm/api/submissions
|
||||
"""
|
||||
upload = track.uploads.filter(duration__gte=0).first()
|
||||
data = {
|
||||
"artist": track.artist.name,
|
||||
"track": track.title,
|
||||
"chosenByUser": 1,
|
||||
}
|
||||
if upload:
|
||||
data["duration"] = upload.duration
|
||||
if track.album:
|
||||
data["album"] = track.album.title
|
||||
if track.position:
|
||||
data["trackNumber"] = track.position
|
||||
if track.mbid:
|
||||
data["mbid"] = str(track.mbid)
|
||||
if date:
|
||||
offset = upload.duration / 2 if upload.duration else 0
|
||||
data["timestamp"] = int(int(date.timestamp()) - offset)
|
||||
return data
|
||||
|
||||
|
||||
def handshake_v2(username, password, session, api_key, api_secret, scrobble_url):
|
||||
params = {
|
||||
"method": "auth.getMobileSession",
|
||||
"username": username,
|
||||
"password": password,
|
||||
"api_key": api_key,
|
||||
}
|
||||
params["api_sig"] = hash_request(params, api_secret)
|
||||
response = session.post(scrobble_url, params)
|
||||
if 'status="ok"' not in response.text:
|
||||
raise ScrobblerException(response.text)
|
||||
|
||||
session_key = response.text.split("<key>")[1].split("</key>")[0]
|
||||
return session_key
|
||||
|
||||
|
||||
def submit_scrobble_v2(
|
||||
session, track, scrobble_time, session_key, scrobble_url, api_key, api_secret,
|
||||
):
|
||||
params = {
|
||||
"method": "track.scrobble",
|
||||
"api_key": api_key,
|
||||
"sk": session_key,
|
||||
}
|
||||
scrobble = get_scrobble2_payload(track, scrobble_time)
|
||||
PLUGIN["logger"].debug("Scrobble payload: %s", scrobble)
|
||||
params.update(scrobble)
|
||||
params["api_sig"] = hash_request(params, api_secret)
|
||||
response = session.post(scrobble_url, params)
|
||||
if 'status="ok"' not in response.text:
|
||||
raise ScrobblerException(response.text)
|
||||
|
||||
|
||||
def hash_request(data, secret_key):
|
||||
string = ""
|
||||
items = data.keys()
|
||||
items = sorted(items)
|
||||
for i in items:
|
||||
string += str(i)
|
||||
string += str(data[i])
|
||||
string += secret_key
|
||||
string_to_hash = string.encode("utf8")
|
||||
return hashlib.md5(string_to_hash).hexdigest()
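As an aside (not part of the change itself), the Last.fm ``api_sig`` produced by ``hash_request`` is simply the MD5 hex digest of the sorted key/value pairs concatenated with the shared secret. A minimal standalone sketch, using made-up credential values:

import hashlib

# Made-up values, for illustration only.
params = {
    "method": "auth.getMobileSession",
    "username": "alice",
    "password": "hunter2",
    "api_key": "myapikey",
}
api_secret = "myapisecret"
signed = "".join(k + str(params[k]) for k in sorted(params)) + api_secret
api_sig = hashlib.md5(signed.encode("utf8")).hexdigest()
print(api_sig)  # same value hash_request(params, api_secret) would return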
@ -13,8 +13,7 @@ class TrackFavoriteFilter(moderation_filters.HiddenContentFilterSet):
|
|||
|
||||
class Meta:
|
||||
model = models.TrackFavorite
|
||||
# XXX: 1.0 remove the user filter, we have scope=me now
|
||||
fields = ["user", "q", "scope"]
|
||||
fields = []
|
||||
hidden_content_fields_mapping = moderation_filters.USER_FILTER_CONFIG[
|
||||
"TRACK_FAVORITE"
|
||||
]
|
||||
|
|
|
@ -13,6 +13,7 @@ logger = logging.getLogger(__name__)
|
|||
|
||||
|
||||
def get_actor_data(actor_url):
|
||||
logger.debug("Fetching actor %s", actor_url)
|
||||
response = session.get_session().get(
|
||||
actor_url, headers={"Accept": "application/activity+json"},
|
||||
)
|
||||
|
|
|
@ -34,6 +34,8 @@ def update_follow(follow, approved):
|
|||
follow.save(update_fields=["approved"])
|
||||
if approved:
|
||||
routes.outbox.dispatch({"type": "Accept"}, context={"follow": follow})
|
||||
else:
|
||||
routes.outbox.dispatch({"type": "Reject"}, context={"follow": follow})
|
||||
|
||||
|
||||
class LibraryFollowViewSet(
|
||||
|
@ -57,7 +59,7 @@ class LibraryFollowViewSet(
|
|||
|
||||
def get_queryset(self):
|
||||
qs = super().get_queryset()
|
||||
return qs.filter(actor=self.request.user.actor)
|
||||
return qs.filter(actor=self.request.user.actor).exclude(approved=False)
|
||||
|
||||
def perform_create(self, serializer):
|
||||
follow = serializer.save(actor=self.request.user.actor)
|
||||
|
|
|
@ -46,15 +46,14 @@ class SignatureAuthentication(authentication.BaseAuthentication):
|
|||
domain = urllib.parse.urlparse(actor_url).hostname
|
||||
allowed = models.Domain.objects.filter(name=domain, allowed=True).exists()
|
||||
if not allowed:
|
||||
logger.debug("Actor domain %s is not on allow-list", domain)
|
||||
raise exceptions.BlockedActorOrDomain()
|
||||
|
||||
try:
|
||||
actor = actors.get_actor(actor_url)
|
||||
except Exception as e:
|
||||
logger.info(
|
||||
"Discarding HTTP request from blocked actor/domain %s, %s",
|
||||
actor_url,
|
||||
str(e),
|
||||
"Discarding HTTP request from actor/domain %s, %s", actor_url, str(e),
|
||||
)
|
||||
raise rest_exceptions.AuthenticationFailed(
|
||||
"Cannot fetch remote actor to authenticate signature"
|
||||
|
|
|
@ -1,8 +1,6 @@
|
|||
from dynamic_preferences import types
|
||||
from dynamic_preferences.registries import global_preferences_registry
|
||||
|
||||
from funkwhale_api.common import preferences
|
||||
|
||||
federation = types.Section("federation")
|
||||
|
||||
|
||||
|
@ -22,10 +20,10 @@ class MusicCacheDuration(types.IntPreference):
|
|||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class Enabled(preferences.DefaultFromSettingMixin, types.BooleanPreference):
|
||||
class Enabled(types.BooleanPreference):
|
||||
section = federation
|
||||
name = "enabled"
|
||||
setting = "FEDERATION_ENABLED"
|
||||
default = True
|
||||
verbose_name = "Federation enabled"
|
||||
help_text = (
|
||||
"Use this setting to enable or disable federation logic and API" " globally."
|
||||
|
@ -33,23 +31,33 @@ class Enabled(preferences.DefaultFromSettingMixin, types.BooleanPreference):
|
|||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class CollectionPageSize(preferences.DefaultFromSettingMixin, types.IntPreference):
|
||||
class CollectionPageSize(types.IntPreference):
|
||||
section = federation
|
||||
name = "collection_page_size"
|
||||
setting = "FEDERATION_COLLECTION_PAGE_SIZE"
|
||||
default = 50
|
||||
verbose_name = "Federation collection page size"
|
||||
help_text = "How many items to display in ActivityPub collections."
|
||||
field_kwargs = {"required": False}
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class ActorFetchDelay(preferences.DefaultFromSettingMixin, types.IntPreference):
|
||||
class ActorFetchDelay(types.IntPreference):
|
||||
section = federation
|
||||
name = "actor_fetch_delay"
|
||||
setting = "FEDERATION_ACTOR_FETCH_DELAY"
|
||||
default = 60 * 12
|
||||
verbose_name = "Federation actor fetch delay"
|
||||
help_text = (
|
||||
"How many minutes to wait before refetching actors on "
|
||||
"request authentication."
|
||||
)
|
||||
field_kwargs = {"required": False}
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class PublicIndex(types.BooleanPreference):
|
||||
show_in_api = True
|
||||
section = federation
|
||||
name = "public_index"
|
||||
default = True
|
||||
verbose_name = "Enable public index"
|
||||
help_text = "If this is enabled, public channels and libraries will be crawlable by other pods and bots"
|
||||
|
|
|
@ -20,7 +20,7 @@ class FollowFilter(django_filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = models.Follow
|
||||
fields = ["approved", "pending", "q"]
|
||||
fields = ["approved"]
|
||||
|
||||
def filter_pending(self, queryset, field_name, value):
|
||||
if value.lower() in ["true", "1", "yes"]:
|
||||
|
|
|
@ -82,6 +82,37 @@ def outbox_accept(context):
|
|||
}
|
||||
|
||||
|
||||
@outbox.register({"type": "Reject"})
|
||||
def outbox_reject_follow(context):
|
||||
follow = context["follow"]
|
||||
if follow._meta.label == "federation.LibraryFollow":
|
||||
actor = follow.target.actor
|
||||
else:
|
||||
actor = follow.target
|
||||
payload = serializers.RejectFollowSerializer(follow, context={"actor": actor}).data
|
||||
yield {
|
||||
"actor": actor,
|
||||
"type": "Reject",
|
||||
"payload": with_recipients(payload, to=[follow.actor]),
|
||||
"object": follow,
|
||||
"related_object": follow.target,
|
||||
}
|
||||
|
||||
|
||||
@inbox.register({"type": "Reject"})
|
||||
def inbox_reject_follow(payload, context):
|
||||
serializer = serializers.RejectFollowSerializer(data=payload, context=context)
|
||||
if not serializer.is_valid(raise_exception=context.get("raise_exception", False)):
|
||||
logger.debug(
|
||||
"Discarding invalid follow reject from %s: %s",
|
||||
context["actor"].fid,
|
||||
serializer.errors,
|
||||
)
|
||||
return
|
||||
|
||||
serializer.save()
|
||||
|
||||
|
||||
@inbox.register({"type": "Undo", "object.type": "Follow"})
|
||||
def inbox_undo_follow(payload, context):
|
||||
serializer = serializers.UndoFollowSerializer(data=payload, context=context)
|
||||
|
|
|
@ -688,11 +688,10 @@ class APIFollowSerializer(serializers.ModelSerializer):
|
|||
]
|
||||
|
||||
|
||||
class AcceptFollowSerializer(serializers.Serializer):
|
||||
class FollowActionSerializer(serializers.Serializer):
|
||||
id = serializers.URLField(max_length=500, required=False)
|
||||
actor = serializers.URLField(max_length=500)
|
||||
object = FollowSerializer()
|
||||
type = serializers.ChoiceField(choices=["Accept"])
|
||||
|
||||
def validate_actor(self, v):
|
||||
expected = self.context.get("actor")
|
||||
|
@ -720,12 +719,13 @@ class AcceptFollowSerializer(serializers.Serializer):
|
|||
follow_class.objects.filter(
|
||||
target=target, actor=validated_data["object"]["actor"]
|
||||
)
|
||||
.exclude(approved=True)
|
||||
.select_related()
|
||||
.get()
|
||||
)
|
||||
except follow_class.DoesNotExist:
|
||||
raise serializers.ValidationError("No follow to accept")
|
||||
raise serializers.ValidationError(
|
||||
"No follow to {}".format(self.action_type)
|
||||
)
|
||||
return validated_data
|
||||
|
||||
def to_representation(self, instance):
|
||||
|
@ -736,12 +736,18 @@ class AcceptFollowSerializer(serializers.Serializer):
|
|||
|
||||
return {
|
||||
"@context": jsonld.get_default_context(),
|
||||
"id": instance.get_federation_id() + "/accept",
|
||||
"type": "Accept",
|
||||
"id": instance.get_federation_id() + "/{}".format(self.action_type),
|
||||
"type": self.action_type.title(),
|
||||
"actor": actor.fid,
|
||||
"object": FollowSerializer(instance).data,
|
||||
}
|
||||
|
||||
|
||||
class AcceptFollowSerializer(FollowActionSerializer):
|
||||
|
||||
type = serializers.ChoiceField(choices=["Accept"])
|
||||
action_type = "accept"
|
||||
|
||||
def save(self):
|
||||
follow = self.validated_data["follow"]
|
||||
follow.approved = True
|
||||
|
@ -751,6 +757,18 @@ class AcceptFollowSerializer(serializers.Serializer):
|
|||
return follow
|
||||
|
||||
|
||||
class RejectFollowSerializer(FollowActionSerializer):
|
||||
|
||||
type = serializers.ChoiceField(choices=["Reject"])
|
||||
action_type = "reject"
|
||||
|
||||
def save(self):
|
||||
follow = self.validated_data["follow"]
|
||||
follow.approved = False
|
||||
follow.save()
|
||||
return follow
|
||||
|
||||
|
||||
class UndoFollowSerializer(serializers.Serializer):
|
||||
id = serializers.URLField(max_length=500)
|
||||
actor = serializers.URLField(max_length=500)
|
||||
|
@ -938,8 +956,6 @@ class PaginatedCollectionSerializer(jsonld.JsonLdSerializer):
|
|||
last = common_utils.set_query_parameter(conf["id"], page=paginator.num_pages)
|
||||
d = {
|
||||
"id": conf["id"],
|
||||
# XXX Stable release: remove the obsolete actor field
|
||||
"actor": conf["actor"].fid,
|
||||
"attributedTo": conf["actor"].fid,
|
||||
"totalItems": paginator.count,
|
||||
"type": conf.get("type", "Collection"),
|
||||
|
@ -1004,9 +1020,8 @@ class LibrarySerializer(PaginatedCollectionSerializer):
|
|||
"name": library.name,
|
||||
"summary": library.description,
|
||||
"page_size": 100,
|
||||
# XXX Stable release: remove the obsolete actor field
|
||||
"actor": library.actor,
|
||||
"attributedTo": library.actor,
|
||||
"actor": library.actor,
|
||||
"items": library.uploads.for_federation(),
|
||||
"type": "Library",
|
||||
}
|
||||
|
@ -1096,9 +1111,6 @@ class CollectionPageSerializer(jsonld.JsonLdSerializer):
|
|||
d = {
|
||||
"id": id,
|
||||
"partOf": conf["id"],
|
||||
# XXX Stable release: remove the obsolete actor field
|
||||
"actor": conf["actor"].fid,
|
||||
"attributedTo": conf["actor"].fid,
|
||||
"totalItems": page.paginator.count,
|
||||
"type": "CollectionPage",
|
||||
"first": first,
|
||||
|
@ -1110,6 +1122,8 @@ class CollectionPageSerializer(jsonld.JsonLdSerializer):
|
|||
for i in page.object_list
|
||||
],
|
||||
}
|
||||
if conf["actor"]:
|
||||
d["attributedTo"] = conf["actor"].fid
|
||||
|
||||
if page.has_previous():
|
||||
d["prev"] = common_utils.set_query_parameter(
|
||||
|
@ -1296,8 +1310,7 @@ class AlbumSerializer(MusicEntitySerializer):
|
|||
child=MultipleSerializer(allowed=[BasicActorSerializer, ArtistSerializer]),
|
||||
min_length=1,
|
||||
)
|
||||
# XXX: 1.0 rename to image
|
||||
cover = ImageSerializer(
|
||||
image = ImageSerializer(
|
||||
allowed_mimetypes=["image/*"],
|
||||
allow_null=True,
|
||||
required=False,
|
||||
|
@ -1305,7 +1318,7 @@ class AlbumSerializer(MusicEntitySerializer):
|
|||
)
|
||||
updateable_fields = [
|
||||
("name", "title"),
|
||||
("cover", "attachment_cover"),
|
||||
("image", "attachment_cover"),
|
||||
("musicbrainzId", "mbid"),
|
||||
("attributedTo", "attributed_to"),
|
||||
("released", "release_date"),
|
||||
|
@ -1319,7 +1332,7 @@ class AlbumSerializer(MusicEntitySerializer):
|
|||
{
|
||||
"released": jsonld.first_val(contexts.FW.released),
|
||||
"artists": jsonld.first_attr(contexts.FW.artists, "@list"),
|
||||
"cover": jsonld.first_obj(contexts.FW.cover),
|
||||
"image": jsonld.first_obj(contexts.AS.image),
|
||||
},
|
||||
)
|
||||
|
||||
|
@ -1353,11 +1366,6 @@ class AlbumSerializer(MusicEntitySerializer):
|
|||
]
|
||||
include_content(d, instance.description)
|
||||
if instance.attachment_cover:
|
||||
d["cover"] = {
|
||||
"type": "Link",
|
||||
"href": instance.attachment_cover.download_url_original,
|
||||
"mediaType": instance.attachment_cover.mimetype or "image/jpeg",
|
||||
}
|
||||
include_image(d, instance.attachment_cover)
|
||||
|
||||
if self.context.get("include_ap_context", self.parent is None):
|
||||
|
@ -2030,3 +2038,33 @@ class DeleteSerializer(jsonld.JsonLdSerializer):
|
|||
):
|
||||
raise serializers.ValidationError("You cannot delete this object")
|
||||
return validated_data
|
||||
|
||||
|
||||
class IndexSerializer(jsonld.JsonLdSerializer):
|
||||
type = serializers.ChoiceField(
|
||||
choices=[contexts.AS.Collection, contexts.AS.OrderedCollection]
|
||||
)
|
||||
totalItems = serializers.IntegerField(min_value=0)
|
||||
id = serializers.URLField(max_length=500)
|
||||
first = serializers.URLField(max_length=500)
|
||||
last = serializers.URLField(max_length=500)
|
||||
|
||||
class Meta:
|
||||
jsonld_mapping = PAGINATED_COLLECTION_JSONLD_MAPPING
|
||||
|
||||
def to_representation(self, conf):
|
||||
paginator = Paginator(conf["items"], conf["page_size"])
|
||||
first = common_utils.set_query_parameter(conf["id"], page=1)
|
||||
current = first
|
||||
last = common_utils.set_query_parameter(conf["id"], page=paginator.num_pages)
|
||||
d = {
|
||||
"id": conf["id"],
|
||||
"totalItems": paginator.count,
|
||||
"type": "OrderedCollection",
|
||||
"current": current,
|
||||
"first": first,
|
||||
"last": last,
|
||||
}
|
||||
if self.context.get("include_ap_context", True):
|
||||
d["@context"] = jsonld.get_default_context()
|
||||
return d
|
||||
|
|
|
@ -23,6 +23,7 @@ from funkwhale_api.taskapp import celery
|
|||
|
||||
from . import activity
|
||||
from . import actors
|
||||
from . import exceptions
|
||||
from . import jsonld
|
||||
from . import keys
|
||||
from . import models, signing
|
||||
|
@ -212,7 +213,11 @@ def update_domain_nodeinfo(domain):
|
|||
if service_actor_id
|
||||
else None
|
||||
)
|
||||
except (serializers.serializers.ValidationError, RequestException) as e:
|
||||
except (
|
||||
serializers.serializers.ValidationError,
|
||||
RequestException,
|
||||
exceptions.BlockedActorOrDomain,
|
||||
) as e:
|
||||
logger.warning(
|
||||
"Cannot fetch system actor for domain %s: %s", domain.name, str(e)
|
||||
)
|
||||
|
@ -319,7 +324,6 @@ def fetch(fetch_obj):
|
|||
auth = signing.get_auth(actor.private_key, actor.private_key_id)
|
||||
else:
|
||||
auth = None
|
||||
auth = None
|
||||
try:
|
||||
if url.startswith("webfinger://"):
|
||||
# we first grab the corresponding webfinger representation
|
||||
|
@ -336,10 +340,14 @@ def fetch(fetch_obj):
|
|||
response = session.get_session().get(
|
||||
auth=auth, url=url, headers={"Accept": "application/activity+json"},
|
||||
)
|
||||
logger.debug("Remote answered with %s", response.status_code)
|
||||
logger.debug("Remote answered with %s: %s", response.status_code, response.text)
|
||||
response.raise_for_status()
|
||||
except requests.exceptions.HTTPError as e:
|
||||
return error("http", status_code=e.response.status_code if e.response else None)
|
||||
return error(
|
||||
"http",
|
||||
status_code=e.response.status_code if e.response else None,
|
||||
message=response.text,
|
||||
)
|
||||
except requests.exceptions.Timeout:
|
||||
return error("timeout")
|
||||
except requests.exceptions.ConnectionError as e:
|
||||
|
@ -391,7 +399,9 @@ def fetch(fetch_obj):
|
|||
|
||||
serializer = None
|
||||
for serializer_class in serializer_classes:
|
||||
serializer = serializer_class(existing, data=payload)
|
||||
serializer = serializer_class(
|
||||
existing, data=payload, context={"fetch_actor": actor}
|
||||
)
|
||||
if not serializer.is_valid():
|
||||
continue
|
||||
else:
|
||||
|
@ -419,7 +429,7 @@ def fetch(fetch_obj):
|
|||
)
|
||||
except Exception:
|
||||
logger.exception(
|
||||
"Error while fetching actor outbox: %s", obj.actor.outbox.url
|
||||
"Error while fetching actor outbox: %s", obj.actor.outbox_url
|
||||
)
|
||||
else:
|
||||
if result.get("next_page"):
|
||||
|
|
|
@ -5,6 +5,7 @@ from . import views
|
|||
|
||||
router = routers.SimpleRouter(trailing_slash=False)
|
||||
music_router = routers.SimpleRouter(trailing_slash=False)
|
||||
index_router = routers.SimpleRouter(trailing_slash=False)
|
||||
|
||||
router.register(r"federation/shared", views.SharedViewSet, "shared")
|
||||
router.register(r"federation/actors", views.ActorViewSet, "actors")
|
||||
|
@ -17,6 +18,11 @@ music_router.register(r"uploads", views.MusicUploadViewSet, "uploads")
|
|||
music_router.register(r"artists", views.MusicArtistViewSet, "artists")
|
||||
music_router.register(r"albums", views.MusicAlbumViewSet, "albums")
|
||||
music_router.register(r"tracks", views.MusicTrackViewSet, "tracks")
|
||||
|
||||
|
||||
index_router.register(r"index", views.IndexViewSet, "index")
|
||||
|
||||
urlpatterns = router.urls + [
|
||||
url("federation/music/", include((music_router.urls, "music"), namespace="music"))
|
||||
url("federation/music/", include((music_router.urls, "music"), namespace="music")),
|
||||
url("federation/", include((index_router.urls, "index"), namespace="index")),
|
||||
]
|
||||
|
|
|
@ -218,7 +218,6 @@ def should_redirect_ap_to_html(accept_header, default=True):
|
|||
"text/html",
|
||||
]
|
||||
no_redirect_headers = [
|
||||
"*/*", # XXX backward compat with older Funkwhale instances that don't send the Accept header
|
||||
"application/json",
|
||||
"application/activity+json",
|
||||
"application/ld+json",
|
||||
|
|
|
@ -9,6 +9,7 @@ from rest_framework.decorators import action
|
|||
|
||||
from funkwhale_api.common import preferences
|
||||
from funkwhale_api.common import utils as common_utils
|
||||
from funkwhale_api.federation import utils as federation_utils
|
||||
from funkwhale_api.moderation import models as moderation_models
|
||||
from funkwhale_api.music import models as music_models
|
||||
from funkwhale_api.music import utils as music_utils
|
||||
|
@ -31,6 +32,34 @@ def redirect_to_html(public_url):
|
|||
return response
|
||||
|
||||
|
||||
def get_collection_response(
|
||||
conf, querystring, collection_serializer, page_access_check=None
|
||||
):
|
||||
page = querystring.get("page")
|
||||
if page is None:
|
||||
data = collection_serializer.data
|
||||
else:
|
||||
if page_access_check and not page_access_check():
|
||||
raise exceptions.AuthenticationFailed(
|
||||
"You do not have access to this resource"
|
||||
)
|
||||
try:
|
||||
page_number = int(page)
|
||||
except Exception:
|
||||
return response.Response({"page": ["Invalid page number"]}, status=400)
|
||||
conf["page_size"] = preferences.get("federation__collection_page_size")
|
||||
p = paginator.Paginator(conf["items"], conf["page_size"])
|
||||
try:
|
||||
page = p.page(page_number)
|
||||
conf["page"] = page
|
||||
serializer = serializers.CollectionPageSerializer(conf)
|
||||
data = serializer.data
|
||||
except paginator.EmptyPage:
|
||||
return response.Response(status=404)
|
||||
|
||||
return response.Response(data)
|
||||
|
||||
|
||||
class AuthenticatedIfAllowListEnabled(permissions.BasePermission):
|
||||
def has_permission(self, request, view):
|
||||
allow_list_enabled = preferences.get("moderation__allow_list_enabled")
|
||||
|
@ -82,6 +111,13 @@ class ActorViewSet(FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericV
|
|||
queryset = super().get_queryset()
|
||||
return queryset.exclude(channel__attributed_to=actors.get_service_actor())
|
||||
|
||||
def get_permissions(self):
|
||||
# cf #1999 it must be possible to fetch actors without being authenticated
|
||||
# otherwise we end up in a loop
|
||||
if self.action == "retrieve":
|
||||
return []
|
||||
return super().get_permissions()
|
||||
|
||||
def retrieve(self, request, *args, **kwargs):
|
||||
instance = self.get_object()
|
||||
if utils.should_redirect_ap_to_html(request.headers.get("accept")):
|
||||
|
@ -128,26 +164,11 @@ class ActorViewSet(FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericV
|
|||
.prefetch_related("library__channel__actor", "track__artist"),
|
||||
"item_serializer": serializers.ChannelCreateUploadSerializer,
|
||||
}
|
||||
page = request.GET.get("page")
|
||||
if page is None:
|
||||
serializer = serializers.ChannelOutboxSerializer(channel)
|
||||
data = serializer.data
|
||||
else:
|
||||
try:
|
||||
page_number = int(page)
|
||||
except Exception:
|
||||
return response.Response({"page": ["Invalid page number"]}, status=400)
|
||||
conf["page_size"] = preferences.get("federation__collection_page_size")
|
||||
p = paginator.Paginator(conf["items"], conf["page_size"])
|
||||
try:
|
||||
page = p.page(page_number)
|
||||
conf["page"] = page
|
||||
serializer = serializers.CollectionPageSerializer(conf)
|
||||
data = serializer.data
|
||||
except paginator.EmptyPage:
|
||||
return response.Response(status=404)
|
||||
|
||||
return response.Response(data)
|
||||
return get_collection_response(
|
||||
conf=conf,
|
||||
querystring=request.GET,
|
||||
collection_serializer=serializers.ChannelOutboxSerializer(channel),
|
||||
)
|
||||
|
||||
@action(methods=["get"], detail=True)
|
||||
def followers(self, request, *args, **kwargs):
|
||||
|
@ -290,32 +311,13 @@ class MusicLibraryViewSet(
|
|||
),
|
||||
"item_serializer": serializers.UploadSerializer,
|
||||
}
|
||||
page = request.GET.get("page")
|
||||
if page is None:
|
||||
serializer = serializers.LibrarySerializer(lb)
|
||||
data = serializer.data
|
||||
else:
|
||||
# if actor is requesting a specific page, we ensure library is public
|
||||
# or readable by the actor
|
||||
if not has_library_access(request, lb):
|
||||
raise exceptions.AuthenticationFailed(
|
||||
"You do not have access to this library"
|
||||
)
|
||||
try:
|
||||
page_number = int(page)
|
||||
except Exception:
|
||||
return response.Response({"page": ["Invalid page number"]}, status=400)
|
||||
conf["page_size"] = preferences.get("federation__collection_page_size")
|
||||
p = paginator.Paginator(conf["items"], conf["page_size"])
|
||||
try:
|
||||
page = p.page(page_number)
|
||||
conf["page"] = page
|
||||
serializer = serializers.CollectionPageSerializer(conf)
|
||||
data = serializer.data
|
||||
except paginator.EmptyPage:
|
||||
return response.Response(status=404)
|
||||
|
||||
return response.Response(data)
|
||||
return get_collection_response(
|
||||
conf=conf,
|
||||
querystring=request.GET,
|
||||
collection_serializer=serializers.LibrarySerializer(lb),
|
||||
page_access_check=lambda: has_library_access(request, lb),
|
||||
)
|
||||
|
||||
@action(methods=["get"], detail=True)
|
||||
def followers(self, request, *args, **kwargs):
|
||||
|
@ -436,3 +438,90 @@ class MusicTrackViewSet(
|
|||
|
||||
serializer = self.get_serializer(instance)
|
||||
return response.Response(serializer.data)
|
||||
|
||||
|
||||
class ChannelViewSet(
|
||||
FederationMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
|
||||
):
|
||||
authentication_classes = [authentication.SignatureAuthentication]
|
||||
renderer_classes = renderers.get_ap_renderers()
|
||||
queryset = music_models.Artist.objects.local().select_related(
|
||||
"description", "attachment_cover"
|
||||
)
|
||||
serializer_class = serializers.ArtistSerializer
|
||||
lookup_field = "uuid"
|
||||
|
||||
def retrieve(self, request, *args, **kwargs):
|
||||
instance = self.get_object()
|
||||
if utils.should_redirect_ap_to_html(request.headers.get("accept")):
|
||||
return redirect_to_html(instance.get_absolute_url())
|
||||
|
||||
serializer = self.get_serializer(instance)
|
||||
return response.Response(serializer.data)
|
||||
|
||||
|
||||
class IndexViewSet(FederationMixin, viewsets.GenericViewSet):
|
||||
authentication_classes = [authentication.SignatureAuthentication]
|
||||
renderer_classes = renderers.get_ap_renderers()
|
||||
|
||||
def dispatch(self, request, *args, **kwargs):
|
||||
if not preferences.get("federation__public_index"):
|
||||
return HttpResponse(status=405)
|
||||
return super().dispatch(request, *args, **kwargs)
|
||||
|
||||
@action(
|
||||
methods=["get"], detail=False,
|
||||
)
|
||||
def libraries(self, request, *args, **kwargs):
|
||||
libraries = (
|
||||
music_models.Library.objects.local()
|
||||
.filter(channel=None, privacy_level="everyone")
|
||||
.prefetch_related("actor")
|
||||
.order_by("creation_date")
|
||||
)
|
||||
conf = {
|
||||
"id": federation_utils.full_url(
|
||||
reverse("federation:index:index-libraries")
|
||||
),
|
||||
"items": libraries,
|
||||
"item_serializer": serializers.LibrarySerializer,
|
||||
"page_size": 100,
|
||||
"actor": None,
|
||||
}
|
||||
return get_collection_response(
|
||||
conf=conf,
|
||||
querystring=request.GET,
|
||||
collection_serializer=serializers.IndexSerializer(conf),
|
||||
)
|
||||
|
||||
return response.Response({}, status=200)
|
||||
|
||||
@action(
|
||||
methods=["get"], detail=False,
|
||||
)
|
||||
def channels(self, request, *args, **kwargs):
|
||||
actors = (
|
||||
models.Actor.objects.local()
|
||||
.exclude(channel=None)
|
||||
.order_by("channel__creation_date")
|
||||
.prefetch_related(
|
||||
"channel__attributed_to",
|
||||
"channel__artist",
|
||||
"channel__artist__description",
|
||||
"channel__artist__attachment_cover",
|
||||
)
|
||||
)
|
||||
conf = {
|
||||
"id": federation_utils.full_url(reverse("federation:index:index-channels")),
|
||||
"items": actors,
|
||||
"item_serializer": serializers.ActorSerializer,
|
||||
"page_size": 100,
|
||||
"actor": None,
|
||||
}
|
||||
return get_collection_response(
|
||||
conf=conf,
|
||||
querystring=request.GET,
|
||||
collection_serializer=serializers.IndexSerializer(conf),
|
||||
)
|
||||
|
||||
return response.Response({}, status=200)
|
||||
|
|
|
@ -16,4 +16,4 @@ class ListeningFilter(moderation_filters.HiddenContentFilterSet):
|
|||
hidden_content_fields_mapping = moderation_filters.USER_FILTER_CONFIG[
|
||||
"LISTENING"
|
||||
]
|
||||
fields = ["hidden", "scope"]
|
||||
fields = []
|
||||
|
|
|
@ -2,6 +2,8 @@ from rest_framework import mixins, viewsets
|
|||
|
||||
from django.db.models import Prefetch
|
||||
|
||||
from config import plugins
|
||||
|
||||
from funkwhale_api.activity import record
|
||||
from funkwhale_api.common import fields, permissions
|
||||
from funkwhale_api.music.models import Track
|
||||
|
@ -39,6 +41,11 @@ class ListeningViewSet(
|
|||
|
||||
def perform_create(self, serializer):
|
||||
r = super().perform_create(serializer)
|
||||
plugins.trigger_hook(
|
||||
plugins.LISTENING_CREATED,
|
||||
listening=serializer.instance,
|
||||
confs=plugins.get_confs(self.request.user),
|
||||
)
|
||||
record.send(serializer.instance)
|
||||
return r
|
||||
|
||||
|
|
|
@ -67,7 +67,7 @@ def get():
|
|||
"instance__funkwhale_support_message_enabled"
|
||||
),
|
||||
"instanceSupportMessage": all_preferences.get("instance__support_message"),
|
||||
"knownNodesListUrl": None,
|
||||
"endpoints": {"knownNodes": None, "channels": None, "libraries": None},
|
||||
},
|
||||
}
|
||||
|
||||
|
@ -90,7 +90,14 @@ def get():
|
|||
"downloads": {"total": statistics["downloads"]},
|
||||
}
|
||||
if not auth_required:
|
||||
data["metadata"]["knownNodesListUrl"] = federation_utils.full_url(
|
||||
data["metadata"]["endpoints"]["knownNodes"] = federation_utils.full_url(
|
||||
reverse("api:v1:federation:domains-list")
|
||||
)
|
||||
if not auth_required and preferences.get("federation__public_index"):
|
||||
data["metadata"]["endpoints"]["libraries"] = federation_utils.full_url(
|
||||
reverse("federation:index:index-libraries")
|
||||
)
|
||||
data["metadata"]["endpoints"]["channels"] = federation_utils.full_url(
|
||||
reverse("federation:index:index-channels")
|
||||
)
|
||||
return data
|
||||
|
|
|
@ -1,4 +1,5 @@
|
|||
import json
|
||||
import logging
|
||||
|
||||
from django.conf import settings
|
||||
|
||||
|
@ -18,6 +19,9 @@ from . import nodeinfo
|
|||
NODEINFO_2_CONTENT_TYPE = "application/json; profile=http://nodeinfo.diaspora.software/ns/schema/2.0#; charset=utf-8" # noqa
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class AdminSettings(preferences_viewsets.GlobalPreferencesViewSet):
|
||||
pagination_class = None
|
||||
permission_classes = [oauth_permissions.ScopePermission]
|
||||
|
@ -44,7 +48,11 @@ class NodeInfo(views.APIView):
|
|||
authentication_classes = []
|
||||
|
||||
def get(self, request, *args, **kwargs):
|
||||
data = nodeinfo.get()
|
||||
try:
|
||||
data = nodeinfo.get()
|
||||
except ValueError:
|
||||
logger.warn("nodeinfo returned invalid json")
|
||||
data = {}
|
||||
return Response(data, status=200, content_type=NODEINFO_2_CONTENT_TYPE)
|
||||
|
||||
|
||||
|
|
|
@ -60,7 +60,7 @@ class ManageChannelFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = audio_models.Channel
|
||||
fields = ["q"]
|
||||
fields = []
|
||||
|
||||
|
||||
class ManageArtistFilterSet(filters.FilterSet):
|
||||
|
@ -89,7 +89,7 @@ class ManageArtistFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = music_models.Artist
|
||||
fields = ["q", "name", "mbid", "fid", "content_category"]
|
||||
fields = ["name", "mbid", "fid", "content_category"]
|
||||
|
||||
|
||||
class ManageAlbumFilterSet(filters.FilterSet):
|
||||
|
@ -119,7 +119,7 @@ class ManageAlbumFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = music_models.Album
|
||||
fields = ["q", "title", "mbid", "fid", "artist"]
|
||||
fields = ["title", "mbid", "fid", "artist"]
|
||||
|
||||
|
||||
class ManageTrackFilterSet(filters.FilterSet):
|
||||
|
@ -158,7 +158,7 @@ class ManageTrackFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = music_models.Track
|
||||
fields = ["q", "title", "mbid", "fid", "artist", "album", "license"]
|
||||
fields = ["title", "mbid", "fid", "artist", "album", "license"]
|
||||
|
||||
|
||||
class ManageLibraryFilterSet(filters.FilterSet):
|
||||
|
@ -204,7 +204,7 @@ class ManageLibraryFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = music_models.Library
|
||||
fields = ["q", "name", "fid", "privacy_level", "domain"]
|
||||
fields = ["name", "fid", "privacy_level"]
|
||||
|
||||
|
||||
class ManageUploadFilterSet(filters.FilterSet):
|
||||
|
@ -249,10 +249,7 @@ class ManageUploadFilterSet(filters.FilterSet):
|
|||
class Meta:
|
||||
model = music_models.Upload
|
||||
fields = [
|
||||
"q",
|
||||
"fid",
|
||||
"privacy_level",
|
||||
"domain",
|
||||
"mimetype",
|
||||
"import_reference",
|
||||
"import_status",
|
||||
|
@ -275,7 +272,7 @@ class ManageDomainFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = federation_models.Domain
|
||||
fields = ["name", "allowed"]
|
||||
fields = ["name"]
|
||||
|
||||
|
||||
class ManageActorFilterSet(filters.FilterSet):
|
||||
|
@ -300,7 +297,7 @@ class ManageActorFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = federation_models.Actor
|
||||
fields = ["q", "domain", "type", "manually_approves_followers", "local"]
|
||||
fields = ["domain", "type", "manually_approves_followers"]
|
||||
|
||||
def filter_local(self, queryset, name, value):
|
||||
return queryset.local(value)
|
||||
|
@ -320,7 +317,6 @@ class ManageUserFilterSet(filters.FilterSet):
|
|||
class Meta:
|
||||
model = users_models.User
|
||||
fields = [
|
||||
"q",
|
||||
"is_active",
|
||||
"privacy_level",
|
||||
"is_staff",
|
||||
|
@ -337,7 +333,7 @@ class ManageInvitationFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = users_models.Invitation
|
||||
fields = ["q", "is_open"]
|
||||
fields = []
|
||||
|
||||
def filter_is_open(self, queryset, field_name, value):
|
||||
if value is None:
|
||||
|
@ -362,14 +358,10 @@ class ManageInstancePolicyFilterSet(filters.FilterSet):
|
|||
class Meta:
|
||||
model = moderation_models.InstancePolicy
|
||||
fields = [
|
||||
"q",
|
||||
"block_all",
|
||||
"silence_activity",
|
||||
"silence_notifications",
|
||||
"reject_media",
|
||||
"target_domain",
|
||||
"target_account_domain",
|
||||
"target_account_username",
|
||||
]
|
||||
|
||||
|
||||
|
@ -378,7 +370,7 @@ class ManageTagFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = tags_models.Tag
|
||||
fields = ["q"]
|
||||
fields = []
|
||||
|
||||
|
||||
class ManageReportFilterSet(filters.FilterSet):
|
||||
|
@ -404,7 +396,7 @@ class ManageReportFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = moderation_models.Report
|
||||
fields = ["q", "is_handled", "type", "submitter_email"]
|
||||
fields = ["is_handled", "type", "submitter_email"]
|
||||
|
||||
|
||||
class ManageNoteFilterSet(filters.FilterSet):
|
||||
|
@ -423,7 +415,7 @@ class ManageNoteFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = moderation_models.Note
|
||||
fields = ["q"]
|
||||
fields = []
|
||||
|
||||
|
||||
class ManageUserRequestFilterSet(filters.FilterSet):
|
||||
|
@ -446,4 +438,4 @@ class ManageUserRequestFilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = moderation_models.UserRequest
|
||||
fields = ["q", "status", "type"]
|
||||
fields = ["status", "type"]
|
||||
|
|
|
@ -336,6 +336,7 @@ class ManageBaseArtistSerializer(serializers.ModelSerializer):
|
|||
class ManageBaseAlbumSerializer(serializers.ModelSerializer):
|
||||
cover = music_serializers.cover_field
|
||||
domain = serializers.CharField(source="domain_name")
|
||||
tracks_count = serializers.SerializerMethodField()
|
||||
|
||||
class Meta:
|
||||
model = music_models.Album
|
||||
|
@ -349,8 +350,12 @@ class ManageBaseAlbumSerializer(serializers.ModelSerializer):
|
|||
"cover",
|
||||
"domain",
|
||||
"is_local",
|
||||
"tracks_count",
|
||||
]
|
||||
|
||||
def get_tracks_count(self, o):
|
||||
return getattr(o, "_tracks_count", None)
|
||||
|
||||
|
||||
class ManageNestedTrackSerializer(serializers.ModelSerializer):
|
||||
domain = serializers.CharField(source="domain_name")
|
||||
|
@ -428,7 +433,6 @@ class ManageNestedArtistSerializer(ManageBaseArtistSerializer):
|
|||
class ManageAlbumSerializer(
|
||||
music_serializers.OptionalDescriptionMixin, ManageBaseAlbumSerializer
|
||||
):
|
||||
tracks = ManageNestedTrackSerializer(many=True)
|
||||
attributed_to = ManageBaseActorSerializer()
|
||||
artist = ManageNestedArtistSerializer()
|
||||
tags = serializers.SerializerMethodField()
|
||||
|
@ -437,11 +441,14 @@ class ManageAlbumSerializer(
|
|||
model = music_models.Album
|
||||
fields = ManageBaseAlbumSerializer.Meta.fields + [
|
||||
"artist",
|
||||
"tracks",
|
||||
"attributed_to",
|
||||
"tags",
|
||||
"tracks_count",
|
||||
]
|
||||
|
||||
def get_tracks_count(self, o):
|
||||
return len(o.tracks.all())
|
||||
|
||||
def get_tags(self, obj):
|
||||
tagged_items = getattr(obj, "_prefetched_tagged_items", [])
|
||||
return [ti.tag.name for ti in tagged_items]
|
||||
|
|
|
@ -128,7 +128,7 @@ class ManageAlbumViewSet(
|
|||
music_models.Album.objects.all()
|
||||
.order_by("-id")
|
||||
.select_related("attributed_to", "artist", "attachment_cover")
|
||||
.prefetch_related("tracks", music_views.TAG_PREFETCH)
|
||||
.prefetch_related("tracks")
|
||||
)
|
||||
serializer_class = serializers.ManageAlbumSerializer
|
||||
filterset_class = filters.ManageAlbumFilterSet
|
||||
|
|
|
@ -0,0 +1,22 @@
|
|||
# Generated by Django 3.0.8 on 2020-08-03 12:22
|
||||
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('moderation', '0005_auto_20200317_0820'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.RemoveField(
|
||||
model_name='userrequest',
|
||||
name='url',
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='userrequest',
|
||||
name='status',
|
||||
field=models.CharField(choices=[('pending', 'Pending'), ('refused', 'Refused'), ('approved', 'Approved')], default='pending', max_length=40),
|
||||
),
|
||||
]
|
|
@ -101,9 +101,12 @@ class ArtistFilter(
|
|||
|
||||
q = fields.SearchFilter(search_fields=["name"], fts_search_fields=["body_text"])
|
||||
playable = filters.BooleanFilter(field_name="_", method="filter_playable")
|
||||
has_albums = filters.BooleanFilter(field_name="_", method="filter_has_albums")
|
||||
tag = TAG_FILTER
|
||||
scope = common_filters.ActorScopeFilter(
|
||||
actor_field="tracks__uploads__library__actor", distinct=True
|
||||
actor_field="tracks__uploads__library__actor",
|
||||
distinct=True,
|
||||
library_field="tracks__uploads__library",
|
||||
)
|
||||
ordering = django_filters.OrderingFilter(
|
||||
fields=(
|
||||
|
@ -120,8 +123,6 @@ class ArtistFilter(
|
|||
model = models.Artist
|
||||
fields = {
|
||||
"name": ["exact", "iexact", "startswith", "icontains"],
|
||||
"playable": ["exact"],
|
||||
"scope": ["exact"],
|
||||
"mbid": ["exact"],
|
||||
}
|
||||
hidden_content_fields_mapping = moderation_filters.USER_FILTER_CONFIG["ARTIST"]
|
||||
|
@ -132,6 +133,12 @@ class ArtistFilter(
|
|||
actor = utils.get_actor_from_request(self.request)
|
||||
return queryset.playable_by(actor, value).distinct()
|
||||
|
||||
def filter_has_albums(self, queryset, name, value):
|
||||
if value is True:
|
||||
return queryset.filter(albums__isnull=False)
|
||||
else:
|
||||
return queryset.filter(albums__isnull=True)
|
||||
|
||||
|
||||
class TrackFilter(
|
||||
RelatedFilterSet,
|
||||
|
@ -176,11 +183,9 @@ class TrackFilter(
|
|||
model = models.Track
|
||||
fields = {
|
||||
"title": ["exact", "iexact", "startswith", "icontains"],
|
||||
"playable": ["exact"],
|
||||
"id": ["exact"],
|
||||
"album": ["exact"],
|
||||
"license": ["exact"],
|
||||
"scope": ["exact"],
|
||||
"mbid": ["exact"],
|
||||
}
|
||||
hidden_content_fields_mapping = moderation_filters.USER_FILTER_CONFIG["TRACK"]
|
||||
|
@ -204,7 +209,9 @@ class UploadFilter(audio_filters.IncludeChannelsFilterSet):
|
|||
album_artist = filters.UUIDFilter("track__album__artist__uuid")
|
||||
library = filters.UUIDFilter("library__uuid")
|
||||
playable = filters.BooleanFilter(field_name="_", method="filter_playable")
|
||||
scope = common_filters.ActorScopeFilter(actor_field="library__actor", distinct=True)
|
||||
scope = common_filters.ActorScopeFilter(
|
||||
actor_field="library__actor", distinct=True, library_field="library",
|
||||
)
|
||||
import_status = common_filters.MultipleQueryFilter(coerce=str)
|
||||
q = fields.SmartSearchFilter(
|
||||
config=search.SearchConfig(
|
||||
|
@ -227,16 +234,9 @@ class UploadFilter(audio_filters.IncludeChannelsFilterSet):
|
|||
class Meta:
|
||||
model = models.Upload
|
||||
fields = [
|
||||
"playable",
|
||||
"import_status",
|
||||
"mimetype",
|
||||
"track",
|
||||
"track_artist",
|
||||
"album_artist",
|
||||
"library",
|
||||
"import_reference",
|
||||
"scope",
|
||||
"channel",
|
||||
]
|
||||
include_channels_field = "track__artist__channel"
|
||||
|
||||
|
@ -259,7 +259,9 @@ class AlbumFilter(
|
|||
)
|
||||
tag = TAG_FILTER
|
||||
scope = common_filters.ActorScopeFilter(
|
||||
actor_field="tracks__uploads__library__actor", distinct=True
|
||||
actor_field="tracks__uploads__library__actor",
|
||||
distinct=True,
|
||||
library_field="tracks__uploads__library",
|
||||
)
|
||||
|
||||
ordering = django_filters.OrderingFilter(
|
||||
|
@ -275,7 +277,7 @@ class AlbumFilter(
|
|||
|
||||
class Meta:
|
||||
model = models.Album
|
||||
fields = ["playable", "q", "artist", "scope", "mbid"]
|
||||
fields = ["artist", "mbid"]
|
||||
hidden_content_fields_mapping = moderation_filters.USER_FILTER_CONFIG["ALBUM"]
|
||||
include_channels_field = "artist__channel"
|
||||
channel_filter_field = "track__album"
|
||||
|
@ -288,8 +290,10 @@ class AlbumFilter(
|
|||
|
||||
class LibraryFilter(filters.FilterSet):
|
||||
q = fields.SearchFilter(search_fields=["name"],)
|
||||
scope = common_filters.ActorScopeFilter(actor_field="actor", distinct=True)
|
||||
scope = common_filters.ActorScopeFilter(
|
||||
actor_field="actor", distinct=True, library_field="pk",
|
||||
)
|
||||
|
||||
class Meta:
|
||||
model = models.Library
|
||||
fields = ["privacy_level", "q", "scope"]
|
||||
fields = ["privacy_level"]
|
||||
|
|
|
@ -277,6 +277,17 @@ LICENSES = [
|
|||
"http://creativecommons.org/publicdomain/zero/1.0/"
|
||||
],
|
||||
},
|
||||
{
|
||||
"code": "LAL-1.3",
|
||||
"name": "Licence Art Libre 1.3",
|
||||
"redistribute": True,
|
||||
"derivative": True,
|
||||
"commercial": True,
|
||||
"attribution": True,
|
||||
"copyleft": True,
|
||||
"url": "https://artlibre.org/licence/lal",
|
||||
"identifiers": ["http://artlibre.org/licence/lal"],
|
||||
},
|
||||
# Creative commons version 4.0
|
||||
get_cc_license(version="4.0", perks=["by"]),
|
||||
get_cc_license(version="4.0", perks=["by", "sa"]),
|
||||
|
|
|
@ -3,6 +3,7 @@ import datetime
|
|||
import itertools
|
||||
import os
|
||||
import queue
|
||||
import sys
|
||||
import threading
|
||||
import time
|
||||
import urllib.parse
|
||||
|
@ -11,6 +12,7 @@ import watchdog.events
|
|||
import watchdog.observers
|
||||
|
||||
from django.conf import settings
|
||||
from django.core.cache import cache
|
||||
from django.core.files import File
|
||||
from django.core.management import call_command
|
||||
from django.core.management.base import BaseCommand, CommandError
|
||||
|
@ -29,16 +31,27 @@ def crawl_dir(dir, extensions, recursive=True, ignored=[]):
|
|||
return
|
||||
try:
|
||||
scanner = os.scandir(dir)
|
||||
except Exception as e:
|
||||
m = "Error while reading {}: {} {}\n".format(dir, e.__class__.__name__, e)
|
||||
sys.stderr.write(m)
|
||||
return
|
||||
try:
|
||||
for entry in scanner:
|
||||
if entry.is_file():
|
||||
for e in extensions:
|
||||
if entry.name.lower().endswith(".{}".format(e.lower())):
|
||||
if entry.path not in ignored:
|
||||
yield entry.path
|
||||
elif recursive and entry.is_dir():
|
||||
yield from crawl_dir(
|
||||
entry.path, extensions, recursive=recursive, ignored=ignored
|
||||
try:
|
||||
if entry.is_file():
|
||||
for e in extensions:
|
||||
if entry.name.lower().endswith(".{}".format(e.lower())):
|
||||
if entry.path not in ignored:
|
||||
yield entry.path
|
||||
elif recursive and entry.is_dir():
|
||||
yield from crawl_dir(
|
||||
entry.path, extensions, recursive=recursive, ignored=ignored
|
||||
)
|
||||
except Exception as e:
|
||||
m = "Error while reading {}: {} {}\n".format(
|
||||
entry.name, e.__class__.__name__, e
|
||||
)
|
||||
sys.stderr.write(m)
|
||||
finally:
|
||||
if hasattr(scanner, "close"):
|
||||
scanner.close()
|
||||
|
@ -56,8 +69,34 @@ def batch(iterable, n=1):
|
|||
yield current
|
||||
|
||||
|
||||
class CacheWriter:
|
||||
"""
|
||||
Output to cache instead of console
|
||||
"""
|
||||
|
||||
def __init__(self, key, stdout, buffer_size=10):
|
||||
self.key = key
|
||||
cache.set(self.key, [])
|
||||
self.stdout = stdout
|
||||
self.buffer_size = buffer_size
|
||||
self.buffer = []
|
||||
|
||||
def write(self, message):
|
||||
# we redispatch the message to the console, for debugging
|
||||
self.stdout.write(message)
|
||||
|
||||
self.buffer.append(message)
|
||||
if len(self.buffer) > self.buffer_size:
|
||||
self.flush()
|
||||
|
||||
def flush(self):
|
||||
current = cache.get(self.key)
|
||||
cache.set(self.key, current + self.buffer)
|
||||
self.buffer = []
|
||||
|
||||
|
||||
class Command(BaseCommand):
|
||||
help = "Import audio files mathinc given glob pattern"
|
||||
help = "Import audio files matching given glob pattern"
|
||||
|
||||
def add_arguments(self, parser):
|
||||
parser.add_argument(
|
||||
|
@ -195,7 +234,22 @@ class Command(BaseCommand):
|
|||
help="Size of each batch, only used when crawling large collections",
|
||||
)
|
||||
|
||||
def handle(self, *args, **options):
|
||||
def handle(self, *args, **kwargs):
|
||||
cache.set("fs-import:status", "started")
|
||||
if kwargs.get("update_cache", False):
|
||||
self.stdout = CacheWriter("fs-import:logs", self.stdout)
|
||||
self.stderr = self.stdout
|
||||
try:
|
||||
return self._handle(*args, **kwargs)
|
||||
except CommandError as e:
|
||||
self.stdout.write(str(e))
|
||||
raise
|
||||
finally:
|
||||
if kwargs.get("update_cache", False):
|
||||
cache.set("fs-import:status", "finished")
|
||||
self.stdout.flush()
|
||||
|
||||
def _handle(self, *args, **options):
|
||||
# handle relative directories
|
||||
options["path"] = [os.path.abspath(path) for path in options["path"]]
|
||||
self.is_confirmed = False
|
||||
|
@ -300,6 +354,10 @@ class Command(BaseCommand):
|
|||
batch_duration = None
|
||||
self.stdout.write("Starting import of new files…")
|
||||
for i, entries in enumerate(batch(crawler, options["batch_size"])):
|
||||
if options.get("update_cache", False) is True:
|
||||
# check to see if the scan was cancelled
|
||||
if cache.get("fs-import:status") == "canceled":
|
||||
raise CommandError("Import cancelled")
|
||||
total += len(entries)
|
||||
batch_start = time.time()
|
||||
time_stats = ""
|
||||
|
@ -643,9 +701,7 @@ def handle_modified(event, stdout, library, in_place, **kwargs):
|
|||
and to_update.track.attributed_to != library.actor
|
||||
):
|
||||
stdout.write(
|
||||
" Cannot update track metadata, track belongs to someone else".format(
|
||||
to_update.pk
|
||||
)
|
||||
" Cannot update track metadata, track belongs to someone else"
|
||||
)
|
||||
return
|
||||
else:
|
||||
|
@ -748,7 +804,7 @@ def check_updates(stdout, library, extensions, paths, batch_size):
|
|||
def check_upload(stdout, upload):
|
||||
try:
|
||||
audio_file = upload.get_audio_file()
|
||||
except FileNotFoundError:
|
||||
except (FileNotFoundError, PermissionError):
|
||||
stdout.write(
|
||||
" Removing file #{} missing from disk at {}".format(
|
||||
upload.pk, upload.source
|
||||
|
@ -765,9 +821,7 @@ def check_upload(stdout, upload):
|
|||
)
|
||||
if upload.library.actor_id != upload.track.attributed_to_id:
|
||||
stdout.write(
|
||||
" Cannot update track metadata, track belongs to someone else".format(
|
||||
upload.pk
|
||||
)
|
||||
" Cannot update track metadata, track belongs to someone else"
|
||||
)
|
||||
else:
|
||||
track = models.Track.objects.select_related("artist", "album__artist").get(
|
||||
|
|
|
@ -494,7 +494,7 @@ class ArtistField(serializers.Field):
|
|||
|
||||
def to_internal_value(self, data):
|
||||
# we have multiple values that can be separated by various separators
|
||||
separators = [";"]
|
||||
separators = [";", ","]
|
||||
# we get a list like that if tagged via musicbrainz
|
||||
# ae29aae4-abfb-4609-8f54-417b1f4d64cc; 3237b5a8-ae44-400c-aa6d-cea51f0b9074;
|
||||
raw_mbids = data["mbids"]
|
||||
|
@ -697,6 +697,12 @@ class AlbumSerializer(serializers.Serializer):
|
|||
return v
|
||||
|
||||
|
||||
def get_valid_position(v):
|
||||
if v <= 0:
|
||||
v = 1
|
||||
return v
|
||||
|
||||
|
||||
class PositionField(serializers.CharField):
|
||||
def to_internal_value(self, v):
|
||||
v = super().to_internal_value(v)
|
||||
|
@ -704,15 +710,15 @@ class PositionField(serializers.CharField):
|
|||
return v
|
||||
|
||||
try:
|
||||
return int(v)
|
||||
return get_valid_position(int(v))
|
||||
except ValueError:
|
||||
# maybe the position is of the form "1/4"
|
||||
pass
|
||||
|
||||
try:
|
||||
return int(v.split("/")[0])
|
||||
return get_valid_position(int(v.split("/")[0]))
|
||||
except (ValueError, AttributeError, IndexError):
|
||||
pass
|
||||
return
|
||||
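The combined effect of the clamping above and the existing ``n/total`` handling can be summarised with a standalone sketch — the helper name is illustrative, not part of the serializer API::

    def parse_position(raw):
        # accept "3" or "3/12"; non-positive or unparsable values never reach the DB
        try:
            return max(int(raw), 1)
        except ValueError:
            pass
        try:
            return max(int(raw.split("/")[0]), 1)
        except (ValueError, AttributeError, IndexError):
            return None

    parse_position("3")      # 3
    parse_position("3/12")   # 3
    parse_position("0")      # 1, clamped by get_valid_position
    parse_position("vinyl")  # None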
|
||||
|
||||
class DescriptionField(serializers.CharField):
|
||||
|
|
|
@ -35,6 +35,6 @@ def rewind(apps, schema_editor):
|
|||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [("music", "0041_auto_20191021_1705")]
|
||||
dependencies = [("music", "0052_auto_20200505_0810")]
|
||||
|
||||
operations = [migrations.RunPython(denormalize, rewind)]
|
|
@ -20,7 +20,6 @@ from django.db.models.signals import post_save, pre_save
|
|||
from django.dispatch import receiver
|
||||
from django.urls import reverse
|
||||
from django.utils import timezone
|
||||
from versatileimagefield.fields import VersatileImageField
|
||||
|
||||
from funkwhale_api import musicbrainz
|
||||
from funkwhale_api.common import fields
|
||||
|
@ -319,20 +318,12 @@ class AlbumQuerySet(common_models.LocalFromFidQuerySet, models.QuerySet):
|
|||
else:
|
||||
return self.exclude(pk__in=matches)
|
||||
|
||||
def with_prefetched_tracks_and_playable_uploads(self, actor):
|
||||
tracks = Track.objects.with_playable_uploads(actor)
|
||||
return self.prefetch_related(models.Prefetch("tracks", queryset=tracks))
|
||||
|
||||
|
||||
class Album(APIModelMixin):
|
||||
title = models.CharField(max_length=MAX_LENGTHS["ALBUM_TITLE"])
|
||||
artist = models.ForeignKey(Artist, related_name="albums", on_delete=models.CASCADE)
|
||||
release_date = models.DateField(null=True, blank=True, db_index=True)
|
||||
release_group_id = models.UUIDField(null=True, blank=True)
|
||||
# XXX: 1.0 clean this unneeded field in favor of attachment_cover
|
||||
cover = VersatileImageField(
|
||||
upload_to="albums/covers/%Y/%m/%d", null=True, blank=True
|
||||
)
|
||||
attachment_cover = models.ForeignKey(
|
||||
"common.Attachment",
|
||||
null=True,
|
||||
|
@ -899,10 +890,13 @@ class Upload(models.Model):
|
|||
def listen_url(self):
|
||||
return self.track.listen_url + "?upload={}".format(self.uuid)
|
||||
|
||||
def get_listen_url(self, to=None):
|
||||
def get_listen_url(self, to=None, download=True):
|
||||
url = self.listen_url
|
||||
if to:
|
||||
url += "&to={}".format(to)
|
||||
if not download:
|
||||
url += "&download=false"
|
||||
|
||||
return url
|
||||
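The URLs produced by the extended method look roughly like this (uuids elided; the exact prefix depends on the API routing)::

    upload.get_listen_url()
    # …/listen/<track uuid>/?upload=<upload uuid>
    upload.get_listen_url(to="mp3")
    # …/listen/<track uuid>/?upload=<upload uuid>&to=mp3
    upload.get_listen_url(to="mp3", download=False)
    # …/listen/<track uuid>/?upload=<upload uuid>&to=mp3&download=false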
|
||||
@property
|
||||
|
|
|
@ -32,10 +32,7 @@ COVER_WRITE_FIELD = common_serializers.RelatedField(
|
|||
from funkwhale_api.audio import serializers as audio_serializers # NOQA
|
||||
|
||||
|
||||
class CoverField(
|
||||
common_serializers.NullToEmptDict, common_serializers.AttachmentSerializer
|
||||
):
|
||||
# XXX: BACKWARD COMPATIBILITY
|
||||
class CoverField(common_serializers.AttachmentSerializer):
|
||||
pass
|
||||
|
||||
|
||||
|
@ -101,7 +98,7 @@ class ArtistAlbumSerializer(serializers.Serializer):
|
|||
return o.artist_id
|
||||
|
||||
def get_tracks_count(self, o):
|
||||
return o._tracks_count
|
||||
return len(o.tracks.all())
|
||||
|
||||
def get_is_playable(self, obj):
|
||||
try:
|
||||
|
@ -189,35 +186,12 @@ def serialize_artist_simple(artist):
|
|||
return data
|
||||
|
||||
|
||||
def serialize_album_track(track):
|
||||
return {
|
||||
"id": track.id,
|
||||
"fid": track.fid,
|
||||
"mbid": str(track.mbid),
|
||||
"title": track.title,
|
||||
"artist": serialize_artist_simple(track.artist),
|
||||
"album": track.album_id,
|
||||
"creation_date": DATETIME_FIELD.to_representation(track.creation_date),
|
||||
"position": track.position,
|
||||
"disc_number": track.disc_number,
|
||||
"uploads": [
|
||||
serialize_upload(u) for u in getattr(track, "playable_uploads", [])
|
||||
],
|
||||
"listen_url": track.listen_url,
|
||||
"duration": getattr(track, "duration", None),
|
||||
"copyright": track.copyright,
|
||||
"license": track.license_id,
|
||||
"is_local": track.is_local,
|
||||
}
|
||||
|
||||
|
||||
class AlbumSerializer(OptionalDescriptionMixin, serializers.Serializer):
|
||||
# XXX: remove in 1.0, it's expensive and can work with a filter/api call
|
||||
tracks = serializers.SerializerMethodField()
|
||||
artist = serializers.SerializerMethodField()
|
||||
cover = cover_field
|
||||
is_playable = serializers.SerializerMethodField()
|
||||
tags = serializers.SerializerMethodField()
|
||||
tracks_count = serializers.SerializerMethodField()
|
||||
attributed_to = serializers.SerializerMethodField()
|
||||
id = serializers.IntegerField()
|
||||
fid = serializers.URLField()
|
||||
|
@ -234,9 +208,8 @@ class AlbumSerializer(OptionalDescriptionMixin, serializers.Serializer):
|
|||
def get_artist(self, o):
|
||||
return serialize_artist_simple(o.artist)
|
||||
|
||||
def get_tracks(self, o):
|
||||
ordered_tracks = o.tracks.all()
|
||||
return [serialize_album_track(track) for track in ordered_tracks]
|
||||
def get_tracks_count(self, o):
|
||||
return len(o.tracks.all())
|
||||
|
||||
def get_is_playable(self, obj):
|
||||
try:
|
||||
|
@ -282,6 +255,7 @@ def serialize_upload(upload):
|
|||
"bitrate": upload.bitrate,
|
||||
"mimetype": upload.mimetype,
|
||||
"extension": upload.extension,
|
||||
"is_local": federation_utils.is_local(upload.fid),
|
||||
}
|
||||
|
||||
|
||||
|
@ -318,6 +292,7 @@ class TrackSerializer(OptionalDescriptionMixin, serializers.Serializer):
|
|||
is_local = serializers.BooleanField()
|
||||
position = serializers.IntegerField()
|
||||
disc_number = serializers.IntegerField()
|
||||
downloads_count = serializers.IntegerField()
|
||||
copyright = serializers.CharField()
|
||||
license = serializers.SerializerMethodField()
|
||||
cover = cover_field
|
||||
|
@ -331,7 +306,10 @@ class TrackSerializer(OptionalDescriptionMixin, serializers.Serializer):
|
|||
|
||||
def get_uploads(self, obj):
|
||||
uploads = getattr(obj, "playable_uploads", [])
|
||||
return [serialize_upload(u) for u in sort_uploads_for_listen(uploads)]
|
||||
# we put local uploads first
|
||||
uploads = [serialize_upload(u) for u in sort_uploads_for_listen(uploads)]
|
||||
uploads = sorted(uploads, key=lambda u: u["is_local"], reverse=True)
|
||||
return list(uploads)
|
||||
|
||||
def get_tags(self, obj):
|
||||
tagged_items = getattr(obj, "_prefetched_tagged_items", [])
|
||||
|
@ -861,3 +839,23 @@ class AlbumCreateSerializer(serializers.Serializer):
|
|||
tag_models.set_tags(instance, *(validated_data.get("tags", []) or []))
|
||||
instance.artist.get_channel()
|
||||
return instance
|
||||
|
||||
|
||||
class FSImportSerializer(serializers.Serializer):
|
||||
path = serializers.CharField(allow_blank=True)
|
||||
library = serializers.UUIDField()
|
||||
import_reference = serializers.CharField()
|
||||
|
||||
def validate_path(self, value):
|
||||
try:
|
||||
utils.browse_dir(settings.MUSIC_DIRECTORY_PATH, value)
|
||||
except (NotADirectoryError, FileNotFoundError, ValueError):
|
||||
raise serializers.ValidationError("Invalid path")
|
||||
|
||||
return value
|
||||
|
||||
def validate_library(self, value):
|
||||
try:
|
||||
return self.context["user"].actor.libraries.get(uuid=value)
|
||||
except models.Library.DoesNotExist:
|
||||
raise serializers.ValidationError("Invalid library")
|
||||
|
|
|
@ -3,10 +3,12 @@ import datetime
|
|||
import logging
|
||||
import os
|
||||
|
||||
from django.utils import timezone
|
||||
from django.conf import settings
|
||||
from django.core.cache import cache
|
||||
from django.db import transaction
|
||||
from django.db.models import F, Q
|
||||
from django.dispatch import receiver
|
||||
from django.utils import timezone
|
||||
|
||||
from musicbrainzngs import ResponseError
|
||||
from requests.exceptions import RequestException
|
||||
|
@ -17,6 +19,7 @@ from funkwhale_api.common import utils as common_utils
|
|||
from funkwhale_api.federation import routes
|
||||
from funkwhale_api.federation import library as lb
|
||||
from funkwhale_api.federation import utils as federation_utils
|
||||
from funkwhale_api.music.management.commands import import_files
|
||||
from funkwhale_api.tags import models as tags_models
|
||||
from funkwhale_api.tags import tasks as tags_tasks
|
||||
from funkwhale_api.taskapp import celery
|
||||
|
@ -248,6 +251,10 @@ def process_upload(upload, update_denormalization=True):
|
|||
fail_import(upload, "unknown_error")
|
||||
raise
|
||||
|
||||
broadcast = getter(
|
||||
internal_config, "funkwhale", "config", "broadcast", default=True
|
||||
)
|
||||
|
||||
# in some situations, we want to skip the import (
|
||||
# for instance if the user already owns the files)
|
||||
owned_duplicates = get_owned_duplicates(upload, track)
|
||||
|
@ -263,12 +270,13 @@ def process_upload(upload, update_denormalization=True):
|
|||
upload.save(
|
||||
update_fields=["import_details", "import_status", "import_date", "track"]
|
||||
)
|
||||
signals.upload_import_status_updated.send(
|
||||
old_status=old_status,
|
||||
new_status=upload.import_status,
|
||||
upload=upload,
|
||||
sender=None,
|
||||
)
|
||||
if broadcast:
|
||||
signals.upload_import_status_updated.send(
|
||||
old_status=old_status,
|
||||
new_status=upload.import_status,
|
||||
upload=upload,
|
||||
sender=None,
|
||||
)
|
||||
return
|
||||
|
||||
# all is good, let's finalize the import
|
||||
|
@ -305,9 +313,6 @@ def process_upload(upload, update_denormalization=True):
|
|||
track.album, source=final_metadata.get("upload_source"),
|
||||
)
|
||||
|
||||
broadcast = getter(
|
||||
internal_config, "funkwhale", "config", "broadcast", default=True
|
||||
)
|
||||
if broadcast:
|
||||
signals.upload_import_status_updated.send(
|
||||
old_status=old_status,
|
||||
|
@ -361,7 +366,7 @@ def federation_audio_track_to_metadata(payload, references):
|
|||
"mbid": str(payload["album"]["musicbrainzId"])
|
||||
if payload["album"].get("musicbrainzId")
|
||||
else None,
|
||||
"cover_data": get_cover(payload["album"], "cover"),
|
||||
"cover_data": get_cover(payload["album"], "image"),
|
||||
"release_date": payload["album"].get("released"),
|
||||
"tags": [t["name"] for t in payload["album"].get("tags", []) or []],
|
||||
"artists": [
|
||||
|
@ -893,8 +898,6 @@ UPDATE_CONFIG = {
|
|||
|
||||
@transaction.atomic
|
||||
def update_track_metadata(audio_metadata, track):
|
||||
# XXX: implement this to support updating metadata when an imported file
|
||||
# is updated by an outside tool (e.g. beets).
|
||||
serializer = metadata.TrackMetadataSerializer(data=audio_metadata)
|
||||
serializer.is_valid(raise_exception=True)
|
||||
new_data = serializer.validated_data
|
||||
|
@ -938,3 +941,32 @@ def update_track_metadata(audio_metadata, track):
|
|||
common_utils.attach_file(
|
||||
track.album, "attachment_cover", new_data["album"].get("cover_data")
|
||||
)
|
||||
|
||||
|
||||
@celery.app.task(name="music.fs_import")
|
||||
@celery.require_instance(models.Library.objects.all(), "library")
|
||||
def fs_import(library, path, import_reference):
|
||||
if cache.get("fs-import:status") != "pending":
|
||||
raise ValueError("Invalid import status")
|
||||
|
||||
command = import_files.Command()
|
||||
|
||||
options = {
|
||||
"recursive": True,
|
||||
"library_id": str(library.uuid),
|
||||
"path": [os.path.join(settings.MUSIC_DIRECTORY_PATH, path)],
|
||||
"update_cache": True,
|
||||
"in_place": True,
|
||||
"reference": import_reference,
|
||||
"watch": False,
|
||||
"interactive": False,
|
||||
"batch_size": 1000,
|
||||
"async_": False,
|
||||
"prune": True,
|
||||
"replace": False,
|
||||
"verbosity": 1,
|
||||
"exit_on_failure": False,
|
||||
"outbox": False,
|
||||
"broadcast": False,
|
||||
}
|
||||
command.handle(**options)
|
||||
|
|
|
@ -1,3 +1,5 @@
|
|||
import os
|
||||
import pathlib
|
||||
import mimetypes
|
||||
|
||||
import magic
|
||||
|
@ -130,3 +132,21 @@ def increment_downloads_count(upload, user, wsgi_request):
|
|||
duration = max(upload.duration or 0, settings.MIN_DELAY_BETWEEN_DOWNLOADS_COUNT)
|
||||
|
||||
cache.set(cache_key, 1, duration)
|
||||
|
||||
|
||||
def browse_dir(root, path):
|
||||
if ".." in path:
|
||||
raise ValueError("Relative browsing is not allowed")
|
||||
|
||||
root = pathlib.Path(root)
|
||||
real_path = root / path
|
||||
|
||||
dirs = []
|
||||
files = []
|
||||
for el in sorted(os.listdir(real_path)):
|
||||
if os.path.isdir(real_path / el):
|
||||
dirs.append({"name": el, "dir": True})
|
||||
else:
|
||||
files.append({"name": el, "dir": False})
|
||||
|
||||
return dirs + files
|
||||
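An illustrative call, assuming an in-place music directory; the paths are made up::

    browse_dir("/srv/funkwhale/data/music", "Artist/Album")
    # [{'name': 'CD1', 'dir': True}, {'name': '01 - Intro.ogg', 'dir': False}, …]

    browse_dir("/srv/funkwhale/data/music", "../../etc")
    # ValueError: Relative browsing is not allowed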
|
|
|
@ -2,19 +2,22 @@ import base64
|
|||
import datetime
|
||||
import logging
|
||||
import urllib.parse
|
||||
|
||||
from django.conf import settings
|
||||
from django.core.cache import cache
|
||||
from django.db import transaction
|
||||
from django.db.models import Count, Prefetch, Sum, F, Q
|
||||
import django.db.utils
|
||||
from django.utils import timezone
|
||||
|
||||
from rest_framework import mixins
|
||||
from rest_framework import renderers
|
||||
from rest_framework import settings as rest_settings
|
||||
from rest_framework import views, viewsets
|
||||
from rest_framework.decorators import action
|
||||
from rest_framework.response import Response
|
||||
|
||||
import requests.exceptions
|
||||
|
||||
from funkwhale_api.common import decorators as common_decorators
|
||||
from funkwhale_api.common import permissions as common_permissions
|
||||
from funkwhale_api.common import preferences
|
||||
|
@ -151,8 +154,10 @@ class ArtistViewSet(
|
|||
|
||||
def get_queryset(self):
|
||||
queryset = super().get_queryset()
|
||||
albums = models.Album.objects.with_tracks_count().select_related(
|
||||
"attachment_cover"
|
||||
albums = (
|
||||
models.Album.objects.with_tracks_count()
|
||||
.select_related("attachment_cover")
|
||||
.prefetch_related("tracks")
|
||||
)
|
||||
albums = albums.annotate_playable_by_actor(
|
||||
utils.get_actor_from_request(self.request)
|
||||
|
@ -180,7 +185,9 @@ class AlbumViewSet(
|
|||
queryset = (
|
||||
models.Album.objects.all()
|
||||
.order_by("-creation_date")
|
||||
.prefetch_related("artist__channel", "attributed_to", "attachment_cover")
|
||||
.prefetch_related(
|
||||
"artist__channel", "attributed_to", "attachment_cover", "tracks"
|
||||
)
|
||||
)
|
||||
serializer_class = serializers.AlbumSerializer
|
||||
permission_classes = [oauth_permissions.ScopePermission]
|
||||
|
@ -216,14 +223,7 @@ class AlbumViewSet(
|
|||
queryset = queryset.exclude(artist__channel=None).filter(
|
||||
artist__attributed_to=self.request.user.actor
|
||||
)
|
||||
tracks = (
|
||||
models.Track.objects.prefetch_related("artist")
|
||||
.with_playable_uploads(utils.get_actor_from_request(self.request))
|
||||
.order_for_album()
|
||||
)
|
||||
qs = queryset.prefetch_related(
|
||||
Prefetch("tracks", queryset=tracks), TAG_PREFETCH
|
||||
)
|
||||
qs = queryset.prefetch_related(TAG_PREFETCH)
|
||||
return qs
|
||||
|
||||
libraries = action(methods=["get"], detail=True)(
|
||||
|
@ -316,6 +316,64 @@ class LibraryViewSet(
|
|||
serializer = self.get_serializer(queryset, many=True)
|
||||
return Response(serializer.data)
|
||||
|
||||
@action(
|
||||
methods=["get", "post", "delete"],
|
||||
detail=False,
|
||||
url_name="fs-import",
|
||||
url_path="fs-import",
|
||||
)
|
||||
@transaction.non_atomic_requests
|
||||
def fs_import(self, request, *args, **kwargs):
|
||||
if not request.user.is_authenticated:
|
||||
return Response({}, status=403)
|
||||
if not request.user.all_permissions["library"]:
|
||||
return Response({}, status=403)
|
||||
if request.method == "GET":
|
||||
path = request.GET.get("path", "")
|
||||
data = {
|
||||
"root": settings.MUSIC_DIRECTORY_PATH,
|
||||
"path": path,
|
||||
"import": None,
|
||||
}
|
||||
status = cache.get("fs-import:status", default=None)
|
||||
if status:
|
||||
data["import"] = {
|
||||
"status": status,
|
||||
"reference": cache.get("fs-import:reference"),
|
||||
"logs": cache.get("fs-import:logs", default=[]),
|
||||
}
|
||||
try:
|
||||
data["content"] = utils.browse_dir(data["root"], data["path"])
|
||||
except (NotADirectoryError, ValueError, FileNotFoundError) as e:
|
||||
return Response({"detail": str(e)}, status=400)
|
||||
|
||||
return Response(data)
|
||||
if request.method == "POST":
|
||||
if cache.get("fs-import:status", default=None) in [
|
||||
"pending",
|
||||
"started",
|
||||
]:
|
||||
return Response({"detail": "An import is already running"}, status=400)
|
||||
|
||||
data = request.data
|
||||
serializer = serializers.FSImportSerializer(
|
||||
data=data, context={"user": request.user}
|
||||
)
|
||||
serializer.is_valid(raise_exception=True)
|
||||
cache.set("fs-import:status", "pending")
|
||||
cache.set(
|
||||
"fs-import:reference", serializer.validated_data["import_reference"]
|
||||
)
|
||||
tasks.fs_import.delay(
|
||||
library_id=serializer.validated_data["library"].pk,
|
||||
path=serializer.validated_data["path"],
|
||||
import_reference=serializer.validated_data["import_reference"],
|
||||
)
|
||||
return Response(status=201)
|
||||
if request.method == "DELETE":
|
||||
cache.set("fs-import:status", "canceled")
|
||||
return Response(status=204)
|
||||
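From a client's perspective, the new action can be exercised roughly as follows — a hypothetical sketch using ``requests``; host, token and paths are placeholders::

    import requests

    API = "https://pod.example/api/v1/libraries/fs-import"
    headers = {"Authorization": "Bearer <token with the library permission>"}

    # browse the server-side music directory
    requests.get(API, params={"path": "Artist/Album"}, headers=headers)

    # queue an import of that directory into one of the user's libraries
    requests.post(API, json={
        "path": "Artist/Album",
        "library": "<library uuid>",
        "import_reference": "fs-import-2020-09",
    }, headers=headers)  # -> 201

    # cancel the running import
    requests.delete(API, headers=headers)  # -> 204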
|
||||
|
||||
class TrackViewSet(
|
||||
HandleInvalidSearch,
|
||||
|
@ -514,7 +572,10 @@ def handle_serve(
|
|||
actor = user.actor
|
||||
else:
|
||||
actor = actors.get_service_actor()
|
||||
f.download_audio_from_remote(actor=actor)
|
||||
try:
|
||||
f.download_audio_from_remote(actor=actor)
|
||||
except requests.exceptions.RequestException:
|
||||
return Response({"detail": "Remove track is unavailable"}, status=503)
|
||||
data = f.get_audio_data()
|
||||
if data:
|
||||
f.duration = data["duration"]
|
||||
|
@ -554,7 +615,7 @@ def handle_serve(
|
|||
return response
|
||||
|
||||
|
||||
class ListenViewSet(mixins.RetrieveModelMixin, viewsets.GenericViewSet):
|
||||
class ListenMixin(mixins.RetrieveModelMixin, viewsets.GenericViewSet):
|
||||
queryset = models.Track.objects.all()
|
||||
serializer_class = serializers.TrackSerializer
|
||||
authentication_classes = (
|
||||
|
@ -567,39 +628,66 @@ class ListenViewSet(mixins.RetrieveModelMixin, viewsets.GenericViewSet):
|
|||
lookup_field = "uuid"
|
||||
|
||||
def retrieve(self, request, *args, **kwargs):
|
||||
config = {
|
||||
"explicit_file": request.GET.get("upload"),
|
||||
"download": request.GET.get("download", "true").lower() == "true",
|
||||
"format": request.GET.get("to"),
|
||||
"max_bitrate": request.GET.get("max_bitrate"),
|
||||
}
|
||||
track = self.get_object()
|
||||
actor = utils.get_actor_from_request(request)
|
||||
queryset = track.uploads.prefetch_related(
|
||||
"track__album__artist", "track__artist"
|
||||
)
|
||||
explicit_file = request.GET.get("upload")
|
||||
download = request.GET.get("download", "true").lower() == "true"
|
||||
if explicit_file:
|
||||
queryset = queryset.filter(uuid=explicit_file)
|
||||
queryset = queryset.playable_by(actor)
|
||||
queryset = queryset.order_by(F("audio_file").desc(nulls_last=True))
|
||||
upload = queryset.first()
|
||||
if not upload:
|
||||
return Response(status=404)
|
||||
return handle_stream(track, request, **config)
|
||||
|
||||
format = request.GET.get("to")
|
||||
max_bitrate = request.GET.get("max_bitrate")
|
||||
try:
|
||||
max_bitrate = min(max(int(max_bitrate), 0), 320) or None
|
||||
except (TypeError, ValueError):
|
||||
max_bitrate = None
|
||||
|
||||
if max_bitrate:
|
||||
max_bitrate = max_bitrate * 1000
|
||||
return handle_serve(
|
||||
upload=upload,
|
||||
user=request.user,
|
||||
format=format,
|
||||
max_bitrate=max_bitrate,
|
||||
proxy_media=settings.PROXY_MEDIA,
|
||||
download=download,
|
||||
wsgi_request=request._request,
|
||||
)
|
||||
def handle_stream(track, request, download, explicit_file, format, max_bitrate):
|
||||
actor = utils.get_actor_from_request(request)
|
||||
queryset = track.uploads.prefetch_related("track__album__artist", "track__artist")
|
||||
if explicit_file:
|
||||
queryset = queryset.filter(uuid=explicit_file)
|
||||
queryset = queryset.playable_by(actor)
|
||||
queryset = queryset.order_by(F("audio_file").desc(nulls_last=True))
|
||||
upload = queryset.first()
|
||||
if not upload:
|
||||
return Response(status=404)
|
||||
|
||||
try:
|
||||
max_bitrate = min(max(int(max_bitrate), 0), 320) or None
|
||||
except (TypeError, ValueError):
|
||||
max_bitrate = None
|
||||
|
||||
if max_bitrate:
|
||||
max_bitrate = max_bitrate * 1000
|
||||
return handle_serve(
|
||||
upload=upload,
|
||||
user=request.user,
|
||||
format=format,
|
||||
max_bitrate=max_bitrate,
|
||||
proxy_media=settings.PROXY_MEDIA,
|
||||
download=download,
|
||||
wsgi_request=request._request,
|
||||
)
|
||||
|
||||
|
||||
class ListenViewSet(ListenMixin):
|
||||
pass
|
||||
|
||||
|
||||
class MP3Renderer(renderers.JSONRenderer):
|
||||
format = "mp3"
|
||||
media_type = "audio/mpeg"
|
||||
|
||||
|
||||
class StreamViewSet(ListenMixin):
|
||||
renderer_classes = [MP3Renderer]
|
||||
|
||||
def retrieve(self, request, *args, **kwargs):
|
||||
config = {
|
||||
"explicit_file": None,
|
||||
"download": False,
|
||||
"format": "mp3",
|
||||
"max_bitrate": None,
|
||||
}
|
||||
track = self.get_object()
|
||||
return handle_stream(track, request, **config)
|
||||
|
||||
|
||||
class UploadViewSet(
|
||||
|
@ -737,20 +825,11 @@ class Search(views.APIView):
|
|||
return Response(results, status=200)
|
||||
|
||||
def get_tracks(self, query):
|
||||
search_fields = [
|
||||
"mbid",
|
||||
"title__unaccent",
|
||||
"album__title__unaccent",
|
||||
"artist__name__unaccent",
|
||||
]
|
||||
if settings.USE_FULL_TEXT_SEARCH:
|
||||
query_obj = utils.get_fts_query(
|
||||
query,
|
||||
fts_fields=["body_text", "album__body_text", "artist__body_text"],
|
||||
model=models.Track,
|
||||
)
|
||||
else:
|
||||
query_obj = utils.get_query(query, search_fields)
|
||||
query_obj = utils.get_fts_query(
|
||||
query,
|
||||
fts_fields=["body_text", "album__body_text", "artist__body_text"],
|
||||
model=models.Track,
|
||||
)
|
||||
qs = (
|
||||
models.Track.objects.all()
|
||||
.filter(query_obj)
|
||||
|
@ -761,20 +840,16 @@ class Search(views.APIView):
|
|||
"album",
|
||||
queryset=models.Album.objects.select_related(
|
||||
"artist", "attachment_cover", "attributed_to"
|
||||
),
|
||||
).prefetch_related("tracks"),
|
||||
),
|
||||
)
|
||||
)
|
||||
return common_utils.order_for_search(qs, "title")[: self.max_results]
|
||||
|
||||
def get_albums(self, query):
|
||||
search_fields = ["mbid", "title__unaccent", "artist__name__unaccent"]
|
||||
if settings.USE_FULL_TEXT_SEARCH:
|
||||
query_obj = utils.get_fts_query(
|
||||
query, fts_fields=["body_text", "artist__body_text"], model=models.Album
|
||||
)
|
||||
else:
|
||||
query_obj = utils.get_query(query, search_fields)
|
||||
query_obj = utils.get_fts_query(
|
||||
query, fts_fields=["body_text", "artist__body_text"], model=models.Album
|
||||
)
|
||||
qs = (
|
||||
models.Album.objects.all()
|
||||
.filter(query_obj)
|
||||
|
@ -784,11 +859,7 @@ class Search(views.APIView):
|
|||
return common_utils.order_for_search(qs, "title")[: self.max_results]
|
||||
|
||||
def get_artists(self, query):
|
||||
search_fields = ["mbid", "name__unaccent"]
|
||||
if settings.USE_FULL_TEXT_SEARCH:
|
||||
query_obj = utils.get_fts_query(query, model=models.Artist)
|
||||
else:
|
||||
query_obj = utils.get_query(query, search_fields)
|
||||
query_obj = utils.get_fts_query(query, model=models.Artist)
|
||||
qs = (
|
||||
models.Artist.objects.all()
|
||||
.filter(query_obj)
|
||||
|
|
|
@ -1,16 +1,14 @@
|
|||
from dynamic_preferences import types
|
||||
from dynamic_preferences.registries import global_preferences_registry
|
||||
|
||||
from funkwhale_api.common import preferences
|
||||
|
||||
playlists = types.Section("playlists")
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class MaxTracks(preferences.DefaultFromSettingMixin, types.IntegerPreference):
|
||||
class MaxTracks(types.IntegerPreference):
|
||||
show_in_api = True
|
||||
section = playlists
|
||||
name = "max_tracks"
|
||||
default = 250
|
||||
verbose_name = "Max tracks per playlist"
|
||||
setting = "PLAYLISTS_MAX_TRACKS"
|
||||
field_kwargs = {"required": False}
|
||||
|
|
|
@ -31,11 +31,7 @@ class PlaylistFilter(filters.FilterSet):
|
|||
class Meta:
|
||||
model = models.Playlist
|
||||
fields = {
|
||||
"user": ["exact"],
|
||||
"name": ["exact", "icontains"],
|
||||
"q": "exact",
|
||||
"playable": "exact",
|
||||
"scope": "exact",
|
||||
}
|
||||
|
||||
def filter_playable(self, queryset, name, value):
|
||||
|
|
|
@ -203,6 +203,15 @@ class PlaylistTrackQuerySet(models.QuerySet):
|
|||
else:
|
||||
return self.exclude(track__pk__in=tracks).distinct()
|
||||
|
||||
def by_index(self, index):
|
||||
plts = self.order_by("index").values_list("id", flat=True)
|
||||
try:
|
||||
plt_id = plts[index]
|
||||
except IndexError:
|
||||
raise PlaylistTrack.DoesNotExist
|
||||
|
||||
return PlaylistTrack.objects.get(pk=plt_id)
|
||||
|
||||
|
||||
class PlaylistTrack(models.Model):
|
||||
track = models.ForeignKey(
|
||||
|
@ -218,7 +227,6 @@ class PlaylistTrack(models.Model):
|
|||
|
||||
class Meta:
|
||||
ordering = ("-playlist", "index")
|
||||
unique_together = ("playlist", "index")
|
||||
|
||||
def delete(self, *args, **kwargs):
|
||||
playlist = self.playlist
|
||||
|
|
|
@ -1,7 +1,5 @@
|
|||
from django.db import transaction
|
||||
from rest_framework import serializers
|
||||
|
||||
from funkwhale_api.common import preferences
|
||||
from funkwhale_api.federation import serializers as federation_serializers
|
||||
from funkwhale_api.music.models import Track
|
||||
from funkwhale_api.music.serializers import TrackSerializer
|
||||
|
@ -16,64 +14,13 @@ class PlaylistTrackSerializer(serializers.ModelSerializer):
|
|||
|
||||
class Meta:
|
||||
model = models.PlaylistTrack
|
||||
fields = ("id", "track", "playlist", "index", "creation_date")
|
||||
fields = ("track", "index", "creation_date")
|
||||
|
||||
def get_track(self, o):
|
||||
track = o._prefetched_track if hasattr(o, "_prefetched_track") else o.track
|
||||
return TrackSerializer(track).data
|
||||
|
||||
|
||||
class PlaylistTrackWriteSerializer(serializers.ModelSerializer):
|
||||
index = serializers.IntegerField(required=False, min_value=0, allow_null=True)
|
||||
allow_duplicates = serializers.BooleanField(required=False)
|
||||
|
||||
class Meta:
|
||||
model = models.PlaylistTrack
|
||||
fields = ("id", "track", "playlist", "index", "allow_duplicates")
|
||||
|
||||
def validate_playlist(self, value):
|
||||
if self.context.get("request"):
|
||||
# validate proper ownership on the playlist
|
||||
if self.context["request"].user != value.user:
|
||||
raise serializers.ValidationError(
|
||||
"You do not have the permission to edit this playlist"
|
||||
)
|
||||
existing = value.playlist_tracks.count()
|
||||
max_tracks = preferences.get("playlists__max_tracks")
|
||||
if existing >= max_tracks:
|
||||
raise serializers.ValidationError(
|
||||
"Playlist has reached the maximum of {} tracks".format(max_tracks)
|
||||
)
|
||||
return value
|
||||
|
||||
@transaction.atomic
|
||||
def create(self, validated_data):
|
||||
index = validated_data.pop("index", None)
|
||||
allow_duplicates = validated_data.pop("allow_duplicates", True)
|
||||
instance = super().create(validated_data)
|
||||
|
||||
instance.playlist.insert(instance, index, allow_duplicates)
|
||||
return instance
|
||||
|
||||
@transaction.atomic
|
||||
def update(self, instance, validated_data):
|
||||
update_index = "index" in validated_data
|
||||
index = validated_data.pop("index", None)
|
||||
allow_duplicates = validated_data.pop("allow_duplicates", True)
|
||||
super().update(instance, validated_data)
|
||||
if update_index:
|
||||
instance.playlist.insert(instance, index, allow_duplicates)
|
||||
|
||||
return instance
|
||||
|
||||
def get_unique_together_validators(self):
|
||||
"""
|
||||
We explicitly disable unique together validation here
|
||||
because it collides with our internal logic
|
||||
"""
|
||||
return []
|
||||
|
||||
|
||||
class PlaylistSerializer(serializers.ModelSerializer):
|
||||
tracks_count = serializers.SerializerMethodField(read_only=True)
|
||||
duration = serializers.SerializerMethodField(read_only=True)
|
||||
|
|
|
@ -93,40 +93,43 @@ class PlaylistViewSet(
|
|||
),
|
||||
)
|
||||
|
||||
@action(methods=["post", "delete"], detail=True)
|
||||
@transaction.atomic
|
||||
def remove(self, request, *args, **kwargs):
|
||||
playlist = self.get_object()
|
||||
try:
|
||||
index = int(request.data["index"])
|
||||
assert index >= 0
|
||||
except (KeyError, ValueError, AssertionError, TypeError):
|
||||
return Response(status=400)
|
||||
|
||||
class PlaylistTrackViewSet(
|
||||
mixins.RetrieveModelMixin,
|
||||
mixins.CreateModelMixin,
|
||||
mixins.UpdateModelMixin,
|
||||
mixins.DestroyModelMixin,
|
||||
mixins.ListModelMixin,
|
||||
viewsets.GenericViewSet,
|
||||
):
|
||||
try:
|
||||
plt = playlist.playlist_tracks.by_index(index)
|
||||
except models.PlaylistTrack.DoesNotExist:
|
||||
return Response(status=404)
|
||||
plt.delete(update_indexes=True)
|
||||
|
||||
serializer_class = serializers.PlaylistTrackSerializer
|
||||
queryset = models.PlaylistTrack.objects.all()
|
||||
permission_classes = [
|
||||
oauth_permissions.ScopePermission,
|
||||
permissions.OwnerPermission,
|
||||
]
|
||||
required_scope = "playlists"
|
||||
anonymous_policy = "setting"
|
||||
owner_field = "playlist.user"
|
||||
owner_checks = ["write"]
|
||||
return Response(status=204)
|
||||
|
||||
def get_serializer_class(self):
|
||||
if self.request.method in ["PUT", "PATCH", "DELETE", "POST"]:
|
||||
return serializers.PlaylistTrackWriteSerializer
|
||||
return self.serializer_class
|
||||
@action(methods=["post"], detail=True)
|
||||
@transaction.atomic
|
||||
def move(self, request, *args, **kwargs):
|
||||
playlist = self.get_object()
|
||||
try:
|
||||
from_index = int(request.data["from"])
|
||||
assert from_index >= 0
|
||||
except (KeyError, ValueError, AssertionError, TypeError):
|
||||
return Response({"detail": "invalid from index"}, status=400)
|
||||
|
||||
def get_queryset(self):
|
||||
return self.queryset.filter(
|
||||
fields.privacy_level_query(
|
||||
self.request.user,
|
||||
lookup_field="playlist__privacy_level",
|
||||
user_field="playlist__user",
|
||||
)
|
||||
).for_nested_serialization(music_utils.get_actor_from_request(self.request))
|
||||
try:
|
||||
to_index = int(request.data["to"])
|
||||
assert to_index >= 0
|
||||
except (KeyError, ValueError, AssertionError, TypeError):
|
||||
return Response({"detail": "invalid to index"}, status=400)
|
||||
|
||||
def perform_destroy(self, instance):
|
||||
instance.delete(update_indexes=True)
|
||||
try:
|
||||
plt = playlist.playlist_tracks.by_index(from_index)
|
||||
except models.PlaylistTrack.DoesNotExist:
|
||||
return Response(status=404)
|
||||
playlist.insert(plt, to_index)
|
||||
return Response(status=204)
|
||||
|
|
|
@ -1,15 +1,23 @@
|
|||
import django_filters
|
||||
|
||||
from django_filters import rest_framework as filters
|
||||
|
||||
from funkwhale_api.common import filters as common_filters
|
||||
from funkwhale_api.music import utils
|
||||
|
||||
from . import models
|
||||
|
||||
|
||||
class RadioFilter(django_filters.FilterSet):
|
||||
scope = common_filters.ActorScopeFilter(actor_field="user__actor", distinct=True)
|
||||
q = filters.CharFilter(field_name="_", method="filter_q")
|
||||
|
||||
class Meta:
|
||||
model = models.Radio
|
||||
fields = {
|
||||
"name": ["exact", "iexact", "startswith", "icontains"],
|
||||
"scope": "exact",
|
||||
}
|
||||
|
||||
def filter_q(self, queryset, name, value):
|
||||
query = utils.get_query(value, ["name", "user__username"])
|
||||
return queryset.filter(query)
|
||||
|
|
|
@ -0,0 +1,20 @@
|
|||
# Generated by Django 3.0.8 on 2020-08-03 12:22
|
||||
|
||||
import django.contrib.postgres.fields.jsonb
|
||||
import django.core.serializers.json
|
||||
from django.db import migrations
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('radios', '0004_auto_20180107_1813'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='radio',
|
||||
name='config',
|
||||
field=django.contrib.postgres.fields.jsonb.JSONField(encoder=django.core.serializers.json.DjangoJSONEncoder),
|
||||
),
|
||||
]
|
|
@ -102,7 +102,7 @@ class SessionRadio(SimpleRadio):
|
|||
class RandomRadio(SessionRadio):
|
||||
def get_queryset(self, **kwargs):
|
||||
qs = super().get_queryset(**kwargs)
|
||||
return qs.order_by("?")
|
||||
return qs.filter(artist__content_category="music").order_by("?")
|
||||
|
||||
|
||||
@registry.register(name="favorites")
|
||||
|
@ -116,7 +116,7 @@ class FavoritesRadio(SessionRadio):
|
|||
def get_queryset(self, **kwargs):
|
||||
qs = super().get_queryset(**kwargs)
|
||||
track_ids = kwargs["user"].track_favorites.all().values_list("track", flat=True)
|
||||
return qs.filter(pk__in=track_ids)
|
||||
return qs.filter(pk__in=track_ids, artist__content_category="music")
|
||||
|
||||
|
||||
@registry.register(name="custom")
|
||||
|
@ -271,10 +271,14 @@ class LessListenedRadio(SessionRadio):
|
|||
def get_queryset(self, **kwargs):
|
||||
qs = super().get_queryset(**kwargs)
|
||||
listened = self.session.user.listenings.all().values_list("track", flat=True)
|
||||
return qs.exclude(pk__in=listened).order_by("?")
|
||||
return (
|
||||
qs.filter(artist__content_category="music")
|
||||
.exclude(pk__in=listened)
|
||||
.order_by("?")
|
||||
)
|
||||
|
||||
|
||||
@registry.register(name="actor_content")
|
||||
@registry.register(name="actor-content")
|
||||
class ActorContentRadio(RelatedObjectRadio):
|
||||
"""
|
||||
Play content from given actor libraries
|
||||
|
|
|
@ -1,38 +0,0 @@
|
|||
/* These styles are generated from project.scss. */
|
||||
|
||||
.alert-debug {
|
||||
color: black;
|
||||
background-color: white;
|
||||
border-color: #d6e9c6;
|
||||
}
|
||||
|
||||
.alert-error {
|
||||
color: #b94a48;
|
||||
background-color: #f2dede;
|
||||
border-color: #eed3d7;
|
||||
}
|
||||
|
||||
/* This is a fix for the bootstrap4 alpha release */
|
||||
@media (max-width: 47.9em) {
|
||||
.navbar-nav .nav-item {
|
||||
float: none;
|
||||
width: 100%;
|
||||
display: inline-block;
|
||||
}
|
||||
|
||||
.navbar-nav .nav-item + .nav-item {
|
||||
margin-left: 0;
|
||||
}
|
||||
|
||||
.nav.navbar-nav.pull-right {
|
||||
float: none !important;
|
||||
}
|
||||
}
|
||||
|
||||
/* Display django-debug-toolbar.
|
||||
See https://github.com/django-debug-toolbar/django-debug-toolbar/issues/742
|
||||
and https://github.com/pydanny/cookiecutter-django/issues/317
|
||||
*/
|
||||
[hidden][style="display: block;"] {
|
||||
display: block !important;
|
||||
}
|
Binary file not shown.
|
@ -1 +0,0 @@
|
|||
/* Project specific Javascript goes here. */
|
|
@ -1,51 +0,0 @@
|
|||
// project specific CSS goes here
|
||||
|
||||
// Alert colors
|
||||
|
||||
$white: #fff;
|
||||
$mint-green: #d6e9c6;
|
||||
$black: #000;
|
||||
$pink: #f2dede;
|
||||
$dark-pink: #eed3d7;
|
||||
$red: #b94a48;
|
||||
|
||||
// bootstrap alert CSS, translated to the django-standard levels of
|
||||
// debug, info, success, warning, error
|
||||
|
||||
.alert-debug {
|
||||
background-color: $white;
|
||||
border-color: $mint-green;
|
||||
color: $black;
|
||||
}
|
||||
|
||||
.alert-error {
|
||||
background-color: $pink;
|
||||
border-color: $dark-pink;
|
||||
color: $red;
|
||||
}
|
||||
|
||||
// This is a fix for the bootstrap4 alpha release
|
||||
|
||||
@media (max-width: 47.9em) {
|
||||
.navbar-nav .nav-item {
|
||||
display: inline-block;
|
||||
float: none;
|
||||
width: 100%;
|
||||
}
|
||||
|
||||
.navbar-nav .nav-item + .nav-item {
|
||||
margin-left: 0;
|
||||
}
|
||||
|
||||
.nav.navbar-nav.pull-right {
|
||||
float: none !important;
|
||||
}
|
||||
}
|
||||
|
||||
// Display django-debug-toolbar.
|
||||
// See https://github.com/django-debug-toolbar/django-debug-toolbar/issues/742
|
||||
// and https://github.com/pydanny/cookiecutter-django/issues/317
|
||||
|
||||
[hidden][style="display: block;"] {
|
||||
display: block !important;
|
||||
}
|
|
@ -8,7 +8,7 @@ class AlbumList2FilterSet(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = music_models.Album
|
||||
fields = ["type"]
|
||||
fields = []
|
||||
|
||||
def filter_type(self, queryset, name, value):
|
||||
ORDERING = {
|
||||
|
|
|
@ -20,7 +20,7 @@ class TagFilter(filters.FilterSet):
|
|||
|
||||
class Meta:
|
||||
model = models.Tag
|
||||
fields = {"q": ["exact"], "name": ["exact", "startswith"]}
|
||||
fields = {"name": ["exact", "startswith"]}
|
||||
|
||||
|
||||
def get_by_similar_tags(qs, tags):
|
||||
|
|
|
@ -0,0 +1,25 @@
|
|||
# Generated by Django 3.0.8 on 2020-08-03 12:22
|
||||
|
||||
from django.db import migrations, models
|
||||
import django.db.models.deletion
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('contenttypes', '0002_remove_content_type_name'),
|
||||
('tags', '0001_initial'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='taggeditem',
|
||||
name='content_type',
|
||||
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='tagged_items', to='contenttypes.ContentType', verbose_name='Content type'),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='taggeditem',
|
||||
name='tag',
|
||||
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='tagged_items', to='tags.Tag'),
|
||||
),
|
||||
]
|
|
@ -1,8 +1,11 @@
|
|||
from django.conf.urls import url
|
||||
from funkwhale_api.common import routers
|
||||
|
||||
from . import views
|
||||
|
||||
router = routers.OptionalSlashRouter()
|
||||
router.register(r"users", views.UserViewSet, "users")
|
||||
|
||||
urlpatterns = router.urls
|
||||
urlpatterns = [
|
||||
url(r"^users/login/?$", views.login, name="login"),
|
||||
url(r"^users/logout/?$", views.logout, name="logout"),
|
||||
] + router.urls
|
||||
|
|
|
@ -129,6 +129,7 @@ class SuperUserFactory(UserFactory):
|
|||
class ApplicationFactory(factory.django.DjangoModelFactory):
|
||||
name = factory.Faker("name")
|
||||
redirect_uris = factory.Faker("url")
|
||||
token = factory.Faker("uuid4")
|
||||
client_type = models.Application.CLIENT_CONFIDENTIAL
|
||||
authorization_grant_type = models.Application.GRANT_AUTHORIZATION_CODE
|
||||
scope = "read"
|
||||
|
|
|
@ -0,0 +1,26 @@
|
|||
# Generated by Django 3.0.8 on 2020-07-05 08:29
|
||||
|
||||
import django.contrib.postgres.fields.jsonb
|
||||
from django.db import migrations
|
||||
import funkwhale_api.users.models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('users', '0017_actor_avatar'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterModelManagers(
|
||||
name='user',
|
||||
managers=[
|
||||
('objects', funkwhale_api.users.models.UserManager()),
|
||||
],
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='user',
|
||||
name='settings',
|
||||
field=django.contrib.postgres.fields.jsonb.JSONField(default=None, null=True, blank=True, max_length=50000),
|
||||
),
|
||||
]
|
|
@ -0,0 +1,23 @@
|
|||
# Generated by Django 3.0.8 on 2020-07-18 07:41
|
||||
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('users', '0018_auto_20200705_0829'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AddField(
|
||||
model_name='grant',
|
||||
name='code_challenge',
|
||||
field=models.CharField(blank=True, default='', max_length=128),
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='grant',
|
||||
name='code_challenge_method',
|
||||
field=models.CharField(blank=True, choices=[('plain', 'plain'), ('S256', 'S256')], default='', max_length=10),
|
||||
),
|
||||
]
|
|
@ -0,0 +1,18 @@
|
|||
# Generated by Django 3.0.8 on 2020-08-19 08:58
|
||||
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('users', '0019_auto_20200718_0741'),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AddField(
|
||||
model_name='application',
|
||||
name='token',
|
||||
field=models.CharField(blank=True, max_length=50, null=True, unique=True),
|
||||
),
|
||||
]
|
|
@ -1,16 +1,15 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import binascii
|
||||
import datetime
|
||||
import os
|
||||
import random
|
||||
import string
|
||||
import uuid
|
||||
|
||||
from django.conf import settings
|
||||
from django.contrib.auth.models import AbstractUser, UserManager as BaseUserManager
|
||||
from django.db import models
|
||||
from django.contrib.postgres.fields import JSONField
|
||||
from django.db import models, transaction
|
||||
from django.dispatch import receiver
|
||||
from django.urls import reverse
|
||||
from django.utils import timezone
|
||||
|
@ -30,8 +29,9 @@ from funkwhale_api.federation import models as federation_models
|
|||
from funkwhale_api.federation import utils as federation_utils
|
||||
|
||||
|
||||
def get_token():
|
||||
return binascii.b2a_hex(os.urandom(15)).decode("utf-8")
|
||||
def get_token(length=30):
|
||||
choices = string.ascii_lowercase + string.ascii_uppercase + "0123456789"
|
||||
return "".join(random.choice(choices) for i in range(length))
|
||||
|
||||
|
||||
PERMISSIONS_CONFIGURATION = {
|
||||
|
@ -103,7 +103,9 @@ class UserQuerySet(models.QuerySet):
|
|||
user=models.OuterRef("id"), primary=True
|
||||
).values("verified")[:1]
|
||||
subquery = models.Subquery(verified_emails)
|
||||
return qs.annotate(has_verified_primary_email=subquery)
|
||||
return qs.annotate(has_verified_primary_email=subquery).prefetch_related(
|
||||
"plugins"
|
||||
)
|
||||
|
||||
|
||||
class UserManager(BaseUserManager):
|
||||
|
@ -189,6 +191,7 @@ class User(AbstractUser):
|
|||
null=True,
|
||||
blank=True,
|
||||
)
|
||||
settings = JSONField(default=None, null=True, blank=True, max_length=50000)
|
||||
|
||||
objects = UserManager()
|
||||
|
||||
|
@ -211,6 +214,16 @@ class User(AbstractUser):
|
|||
def all_permissions(self):
|
||||
return self.get_permissions()
|
||||
|
||||
@transaction.atomic
|
||||
def set_settings(self, **settings):
|
||||
u = self.__class__.objects.select_for_update().get(pk=self.pk)
|
||||
if not u.settings:
|
||||
u.settings = {}
|
||||
for key, value in settings.items():
|
||||
u.settings[key] = value
|
||||
u.save(update_fields=["settings"])
|
||||
self.settings = u.settings
|
||||
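The method merges keys into the existing blob under a row lock instead of replacing it, e.g. (keys are illustrative)::

    user.set_settings(theme="dark", language="en")
    user.settings                    # {'theme': 'dark', 'language': 'en'}
    user.set_settings(theme="light")
    user.settings                    # {'theme': 'light', 'language': 'en'}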
|
||||
def has_permissions(self, *perms, **kwargs):
|
||||
operator = kwargs.pop("operator", "and")
|
||||
if operator not in ["and", "or"]:
|
||||
|
@ -336,6 +349,7 @@ class Invitation(models.Model):
|
|||
|
||||
class Application(oauth2_models.AbstractApplication):
|
||||
scope = models.TextField(blank=True)
|
||||
token = models.CharField(max_length=50, blank=True, null=True, unique=True)
|
||||
|
||||
@property
|
||||
def normalized_scopes(self):
|
||||
|
|
|
@ -51,12 +51,7 @@ class ScopePermission(permissions.BasePermission):
|
|||
if request.method.lower() in ["options", "head"]:
|
||||
return True
|
||||
|
||||
try:
|
||||
scope_config = getattr(view, "required_scope")
|
||||
except AttributeError:
|
||||
raise ImproperlyConfigured(
|
||||
"ScopePermission requires the view to define the required_scope attribute"
|
||||
)
|
||||
scope_config = getattr(view, "required_scope", "noopscope")
|
||||
anonymous_policy = getattr(view, "anonymous_policy", False)
|
||||
if anonymous_policy not in [True, False, "setting"]:
|
||||
raise ImproperlyConfigured(
|
||||
|
|
|
@ -23,6 +23,7 @@ BASE_SCOPES = [
|
|||
Scope("notifications", "Access personal notifications"),
|
||||
Scope("security", "Access security settings"),
|
||||
Scope("reports", "Access reports"),
|
||||
Scope("plugins", "Access plugins"),
|
||||
# Privileged scopes that require specific user permissions
|
||||
Scope("instance:settings", "Access instance settings"),
|
||||
Scope("instance:users", "Access local user accounts"),
|
||||
|
@ -81,7 +82,12 @@ COMMON_SCOPES = ANONYMOUS_SCOPES | {
|
|||
"write:listenings",
|
||||
}
|
||||
|
||||
LOGGED_IN_SCOPES = COMMON_SCOPES | {"read:security", "write:security"}
|
||||
LOGGED_IN_SCOPES = COMMON_SCOPES | {
|
||||
"read:security",
|
||||
"write:security",
|
||||
"read:plugins",
|
||||
"write:plugins",
|
||||
}
|
||||
|
||||
# We don't allow admin access for oauth apps yet
|
||||
OAUTH_APP_SCOPES = COMMON_SCOPES
|
||||
|
|
|
@ -10,6 +10,12 @@ class ApplicationSerializer(serializers.ModelSerializer):
|
|||
model = models.Application
|
||||
fields = ["client_id", "name", "scopes", "created", "updated"]
|
||||
|
||||
def to_representation(self, obj):
|
||||
repr = super().to_representation(obj)
|
||||
if obj.user_id:
|
||||
repr["token"] = obj.token
|
||||
return repr
|
||||
|
||||
|
||||
class CreateApplicationSerializer(serializers.ModelSerializer):
|
||||
name = serializers.CharField(required=True, max_length=255)
|
||||
|
@ -27,3 +33,9 @@ class CreateApplicationSerializer(serializers.ModelSerializer):
|
|||
"redirect_uris",
|
||||
]
|
||||
read_only_fields = ["client_id", "client_secret", "created", "updated"]
|
||||
|
||||
def to_representation(self, obj):
|
||||
repr = super().to_representation(obj)
|
||||
if obj.user_id:
|
||||
repr["token"] = obj.token
|
||||
return repr
|
||||
|
|
|
@ -4,7 +4,8 @@ import urllib.parse
|
|||
from django import http
|
||||
from django.utils import timezone
|
||||
from django.db.models import Q
|
||||
from rest_framework import mixins, permissions, views, viewsets
|
||||
from rest_framework import mixins, permissions, response, views, viewsets
|
||||
from rest_framework.decorators import action
|
||||
|
||||
from oauth2_provider import exceptions as oauth2_exceptions
|
||||
from oauth2_provider import views as oauth_views
|
||||
|
@ -32,6 +33,7 @@ class ApplicationViewSet(
|
|||
"destroy": "write:security",
|
||||
"update": "write:security",
|
||||
"partial_update": "write:security",
|
||||
"refresh_token": "write:security",
|
||||
"list": "read:security",
|
||||
}
|
||||
lookup_field = "client_id"
|
||||
|
@ -54,6 +56,7 @@ class ApplicationViewSet(
|
|||
client_type=models.Application.CLIENT_CONFIDENTIAL,
|
||||
authorization_grant_type=models.Application.GRANT_AUTHORIZATION_CODE,
|
||||
user=self.request.user if self.request.user.is_authenticated else None,
|
||||
token=models.get_token() if self.request.user.is_authenticated else None,
|
||||
)
|
||||
|
||||
def get_serializer(self, *args, **kwargs):
|
||||
|
@ -70,10 +73,31 @@ class ApplicationViewSet(
|
|||
|
||||
def get_queryset(self):
|
||||
qs = super().get_queryset()
|
||||
if self.action in ["list", "destroy", "update", "partial_update"]:
|
||||
if self.action in [
|
||||
"list",
|
||||
"destroy",
|
||||
"update",
|
||||
"partial_update",
|
||||
"refresh_token",
|
||||
]:
|
||||
qs = qs.filter(user=self.request.user)
|
||||
return qs
|
||||
|
||||
@action(
|
||||
detail=True,
|
||||
methods=["post"],
|
||||
url_name="refresh_token",
|
||||
url_path="refresh-token",
|
||||
)
|
||||
def refresh_token(self, request, *args, **kwargs):
|
||||
app = self.get_object()
|
||||
if not app.user_id or request.user != app.user:
|
||||
return response.Response(status=404)
|
||||
app.token = models.get_token()
|
||||
app.save(update_fields=["token"])
|
||||
serializer = serializers.CreateApplicationSerializer(app)
|
||||
return response.Response(serializer.data, status=200)
|
||||
|
||||
|
||||
class GrantViewSet(
|
||||
mixins.RetrieveModelMixin,
|
||||
|
@ -155,20 +179,21 @@ class AuthorizeView(views.APIView, oauth_views.AuthorizationView):
|
|||
|
||||
def form_valid(self, form):
|
||||
try:
|
||||
response = super().form_valid(form)
|
||||
return super().form_valid(form)
|
||||
|
||||
except models.Application.DoesNotExist:
|
||||
return self.json_payload({"non_field_errors": ["Invalid application"]}, 400)
|
||||
|
||||
if self.request.is_ajax() and response.status_code == 302:
|
||||
def redirect(self, redirect_to, application, token=None):
|
||||
if self.request.is_ajax():
|
||||
# Web client need this to be able to redirect the user
|
||||
query = urllib.parse.urlparse(response["Location"]).query
|
||||
query = urllib.parse.urlparse(redirect_to).query
|
||||
code = urllib.parse.parse_qs(query)["code"][0]
|
||||
return self.json_payload(
|
||||
{"redirect_uri": response["Location"], "code": code}, status_code=200
|
||||
{"redirect_uri": redirect_to, "code": code}, status_code=200
|
||||
)
|
||||
|
||||
return response
|
||||
return super().redirect(redirect_to, application, token)
|
||||
|
||||
def error_response(self, error, application):
|
||||
if isinstance(error, oauth2_exceptions.FatalClientError):
|
||||
|
|
|
@ -4,6 +4,9 @@ from django.core import validators
|
|||
from django.utils.deconstruct import deconstructible
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from django.contrib import auth
|
||||
|
||||
from allauth.account import models as allauth_models
|
||||
from rest_auth.serializers import PasswordResetSerializer as PRS
|
||||
from rest_auth.registration.serializers import RegisterSerializer as RS, get_adapter
|
||||
from rest_framework import serializers
|
||||
|
@ -230,6 +233,7 @@ class MeSerializer(UserReadSerializer):
|
|||
"funkwhale_support_message_display_date",
|
||||
"summary",
|
||||
"tokens",
|
||||
"settings",
|
||||
]
|
||||
|
||||
def get_quota_status(self, o):
|
||||
|
@ -265,3 +269,49 @@ class UserDeleteSerializer(serializers.Serializer):
|
|||
if not value:
|
||||
raise serializers.ValidationError("Please confirm deletion")
|
||||
return value
|
||||
|
||||
|
||||
class LoginSerializer(serializers.Serializer):
|
||||
username = serializers.CharField()
|
||||
password = serializers.CharField()
|
||||
|
||||
def validate(self, data):
|
||||
user = auth.authenticate(request=self.context.get("request"), **data)
|
||||
if not user:
|
||||
raise serializers.ValidationError(
|
||||
"Unable to log in with provided credentials"
|
||||
)
|
||||
|
||||
if not user.is_active:
|
||||
raise serializers.ValidationError("This account was disabled")
|
||||
|
||||
return user
|
||||
|
||||
def save(self, request):
|
||||
return auth.login(request, self.validated_data)
|
||||
|
||||
|
||||
class UserChangeEmailSerializer(serializers.Serializer):
|
||||
password = serializers.CharField()
|
||||
email = serializers.EmailField()
|
||||
|
||||
def validate_password(self, value):
|
||||
if not self.instance.check_password(value):
|
||||
raise serializers.ValidationError("Invalid password")
|
||||
|
||||
def validate_email(self, value):
|
||||
if (
|
||||
allauth_models.EmailAddress.objects.filter(email__iexact=value)
|
||||
.exclude(user=self.context["user"])
|
||||
.exists()
|
||||
):
|
||||
raise serializers.ValidationError("This email address is already in use")
|
||||
return value
|
||||
|
||||
def save(self, request):
|
||||
current, _ = allauth_models.EmailAddress.objects.get_or_create(
|
||||
user=request.user,
|
||||
email=request.user.email,
|
||||
defaults={"verified": False, "primary": True},
|
||||
)
|
||||
current.change(request, self.validated_data["email"], confirm=True)
|
||||
|
|
|
@ -1,12 +1,20 @@
|
|||
import json
|
||||
|
||||
from django import http
|
||||
from django.contrib import auth
|
||||
from django.middleware import csrf
|
||||
|
||||
from allauth.account.adapter import get_adapter
|
||||
from rest_auth import views as rest_auth_views
|
||||
from rest_auth.registration import views as registration_views
|
||||
from rest_framework import mixins, viewsets
|
||||
from rest_framework import mixins
|
||||
from rest_framework import viewsets
|
||||
from rest_framework.decorators import action
|
||||
from rest_framework.response import Response
|
||||
|
||||
from funkwhale_api.common import authentication
|
||||
from funkwhale_api.common import preferences
|
||||
from funkwhale_api.common import throttling
|
||||
|
||||
from . import models, serializers, tasks
|
||||
|
||||
|
@ -72,6 +80,13 @@ class UserViewSet(mixins.UpdateModelMixin, viewsets.GenericViewSet):
|
|||
serializer = serializers.MeSerializer(request.user)
|
||||
return Response(serializer.data)
|
||||
|
||||
@action(methods=["post"], detail=False, url_name="settings", url_path="settings")
|
||||
def set_settings(self, request, *args, **kwargs):
|
||||
"""Return information about the current user or delete it"""
|
||||
new_settings = request.data
|
||||
request.user.set_settings(**new_settings)
|
||||
return Response(request.user.settings)
|
||||
|
||||
@action(
|
||||
methods=["get", "post", "delete"],
|
||||
required_scope="security",
|
||||
|
@ -96,6 +111,22 @@ class UserViewSet(mixins.UpdateModelMixin, viewsets.GenericViewSet):
        data = {"subsonic_api_token": self.request.user.subsonic_api_token}
        return Response(data)

    @action(
        methods=["post"],
        required_scope="security",
        url_path="change-email",
        detail=False,
    )
    def change_email(self, request, *args, **kwargs):
        if not self.request.user.is_authenticated:
            return Response(status=403)
        serializer = serializers.UserChangeEmailSerializer(
            request.user, data=request.data, context={"user": request.user}
        )
        serializer.is_valid(raise_exception=True)
        serializer.save(request)
        return Response(status=204)

    def update(self, request, *args, **kwargs):
        if not self.request.user.username == kwargs.get("username"):
            return Response(status=403)
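From a client's point of view, the ``change-email`` action is a plain POST guarded by the ``security`` scope; on success it returns 204 and a confirmation mail goes to the new address. A rough sketch (the URL prefix is an assumption, adjust it to your pod's routing)::

    import requests

    resp = requests.post(
        "https://pod.example/api/v1/users/users/change-email/",  # assumed route for url_path="change-email"
        data={"password": "current-password", "email": "new@example.org"},
        headers={"Authorization": "Bearer <token-with-security-scope>"},
    )
    assert resp.status_code == 204  # 400 with field errors if the password or email is rejected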
@ -105,3 +136,32 @@ class UserViewSet(mixins.UpdateModelMixin, viewsets.GenericViewSet):
        if not self.request.user.username == kwargs.get("username"):
            return Response(status=403)
        return super().partial_update(request, *args, **kwargs)


def login(request):
    throttling.check_request(request, "login")
    if request.method != "POST":
        return http.HttpResponse(status=405)
    serializer = serializers.LoginSerializer(
        data=request.POST, context={"request": request}
    )
    if not serializer.is_valid():
        return http.HttpResponse(
            json.dumps(serializer.errors), status=400, content_type="application/json"
        )
    serializer.save(request)
    csrf.rotate_token(request)
    token = csrf.get_token(request)
    response = http.HttpResponse(status=200)
    response.set_cookie("csrftoken", token, max_age=None)
    return response


def logout(request):
    if request.method != "POST":
        return http.HttpResponse(status=405)
    auth.logout(request)
    token = csrf.get_token(request)
    response = http.HttpResponse(status=200)
    response.set_cookie("csrftoken", token, max_age=None)
    return response
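These plain Django views implement the session login used by the web UI: credentials are read from the form-encoded POST body, the CSRF token is rotated on success, and the fresh ``csrftoken`` cookie has to be echoed back as ``X-CSRFToken`` on later unsafe requests. A sketch of that handshake (URLs and field names are assumptions based on the route and serializer names; depending on middleware configuration the first POST may also need a CSRF header)::

    import requests

    session = requests.Session()
    login = session.post(
        "https://pod.example/api/v1/users/login",  # assumed mount point of login()
        data={"username": "alice", "password": "secret"},
    )
    login.raise_for_status()  # invalid credentials come back as a 400 JSON body

    # Reuse the rotated CSRF cookie for subsequent POSTs on the same session:
    session.headers["X-CSRFToken"] = session.cookies["csrftoken"]
    session.post("https://pod.example/api/v1/users/logout")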
@ -1,86 +1,74 @@
# Bleeding edge Django
django>=3.0.5,<3.1; python_version > '3.5'
django>=2.2.12,<3; python_version < '3.6'
setuptools>=36
django~=3.0.8
setuptools>=49
# Configuration
django-environ>=0.4,<0.5
django-environ~=0.4

# Images
Pillow>=6.2,<7
Pillow~=7.0

# For user registration, either via email or social
# Well-built with regular release cycles!
django-allauth>=0.41,<0.42
django-allauth~=0.42


# Python-PostgreSQL Database Adapter
psycopg2-binary>=2.8,<=2.9
psycopg2-binary~=2.8

# Time zones support
pytz==2019.3
pytz==2020.1

# Redis support
django-redis>=4.11,<4.12
redis>=3.4,<3.5
kombu>=4.5,<4.6
django-redis~=4.12
redis~=3.5
kombu~=4.6

celery>=4.3,<4.4
celery~=4.4


# Your custom requirements go here
django-cors-headers>=3.2,<3.3
musicbrainzngs==0.6
djangorestframework>=3.11,<3.12
djangorestframework-jwt>=1.11,<1.12
arrow>=0.15.5,<0.16
persisting-theory>=0.2,<0.3
django-versatileimagefield>=2.0,<2.1
django-filter>=2.1,<2.2
django-rest-auth>=0.9,<0.10
# XXX: remove when we drop support for python 3.5
ipython>=7.10,<8; python_version > '3.5'
ipython>=7,<7.10; python_version < '3.6'
mutagen>=1.44,<1.45
django-cors-headers~=3.4
musicbrainzngs~=0.7.1
djangorestframework~=3.11
djangorestframework-jwt~=1.11
arrow~=0.15.5
persisting-theory~=0.2
django-versatileimagefield~=2.0
django-filter~=2.3
django-rest-auth~=0.9
ipython~=7.10
mutagen~=1.45

pymemoize==1.0.3
pymemoize~=1.0

django-dynamic-preferences>=1.8.1,<1.9
raven>=6.10,<7
python-magic==0.4.15
channels>=2.4,<2.5
# XXX: remove when we drop support for python 3.5
channels_redis==2.2.1; python_version < '3.6'
channels_redis>=2.3.2,<2.4; python_version > '3.5'
uvicorn==0.8.6; python_version < '3.6'
uvicorn>=0.11.3,<0.12; python_version > '3.5'
gunicorn>=20.0.4,<20.1
django-dynamic-preferences~=1.10
raven~=6.10
python-magic~=0.4
channels~=2.4
channels_redis~=3.0
uvicorn~=0.11
gunicorn~=20.0

cryptography>=2.8,<3
cryptography~=2.9
# requests-http-signature==0.0.3
# clone until the branch is merged and released upstream
git+https://github.com/EliotBerriot/requests-http-signature.git@signature-header-support
django-cleanup>=4,<4.1
requests>=2.22,<2.23
pyOpenSSL>=19,<20
django-cleanup~=5.0
requests~=2.24
pyOpenSSL~=19.1

# for LDAP authentication
python-ldap>=3.2.0,<3.3
django-auth-ldap>=2.1.0,<2.2
python-ldap~=3.3
django-auth-ldap~=2.2

pydub>=0.23.1,<0.24
pyld==1.0.4
aiohttp>=3.6,<3.7
autobahn>=19.3.3
pydub~=0.24
pyld~=1.0
aiohttp~=3.6

django-oauth-toolkit==1.2
django-storages>=1.9.1,<1.10
boto3<3
unicode-slugify==0.1.3
django-cacheops==4.2
django-oauth-toolkit~=1.3
django-storages~=1.9
boto3~=1.14
unicode-slugify~=0.1
django-cacheops~=5.0

click>=7,<8
service_identity==18.1.0
markdown>=3.2,<4
bleach>=3,<4
click~=7.1
service_identity~=18.1
markdown~=3.2
bleach~=3.1
feedparser==6.0.0b3
watchdog==0.10.2
watchdog~=0.10
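The recurring pattern in this requirements update is the move from hand-written ``>=x,<y`` ranges and exact pins to PEP 440 "compatible release" specifiers: ``~=X.Y`` is equivalent to ``>=X.Y, ==X.*``, so newer patch and minor releases within the pinned series are accepted while the next major release is not. A small illustration using the ``packaging`` library (version numbers are arbitrary)::

    from packaging.specifiers import SpecifierSet

    spec = SpecifierSet("~=4.4")  # same as ">=4.4, ==4.*", cf. celery~=4.4 above
    assert "4.4.7" in spec
    assert "4.6.0" in spec        # later minor releases still satisfy the pin
    assert "5.0.0" not in spec    # the next major release does not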
@ -1,20 +1,20 @@
# Local development dependencies go here

coverage>=4.5,<4.6
django_coverage_plugin>=1.6,<1.7
factory_boy>=2.11.1
coverage~=4.5
django_coverage_plugin~=1.6
factory_boy~=2.11

# django-debug-toolbar that works with Django 1.5+
django-debug-toolbar>=2.2,<2.3
django-debug-toolbar~=2.2

# improved REPL
ipdb==0.11
prompt_toolkit<3
black
ipdb~=0.11
prompt_toolkit~=2.0
black==19.10b0
#profiling

asynctest==0.12.2
aioresponses==0.6.0
asynctest~=0.12
aioresponses~=0.6
#line_profiler<3
#https://github.com/dmclain/django-debug-toolbar-line-profiler/archive/master.zip
#django-silk
@ -1,13 +1,12 @@
# Test dependencies go here.

flake8
pytest>=5,<5.3.3
pytest-django>=3.5.1
pytest-mock
pytest-sugar
pytest-xdist
pytest-cov
pytest-env
requests-mock
pytest-randomly
flake8~=3.8
pytest~=6.0
pytest-cov~=2.10
pytest-django~=3.9
pytest-env~=0.6
pytest-mock~=3.2
pytest-randomly~=3.4
pytest-sugar~=0.9
requests-mock~=1.8
#pytest-profiling<1.4
@ -1,7 +1,7 @@
[flake8]
max-line-length = 120
exclude = .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules,tests/data,tests/music/conftest.py
ignore = F405,W503,E203
ignore = F405,W503,E203,E741

[isort]
skip_glob = .tox,.git,*/migrations/*,*/static/CACHE/*,docs,node_modules
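The only functional change here is adding ``E741`` to the flake8 ignore list. E741 is the pycodestyle check for ambiguous single-character names such as ``l``, ``O`` or ``I``; with the ignore in place, code like the following no longer trips the linter (a contrived example, not taken from the codebase)::

    # flake8 would normally report "E741 ambiguous variable name 'l'" here:
    for l in ["flac", "ogg", "mp3"]:
        print(l)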
@ -35,3 +35,5 @@ env =
    EXTERNAL_MEDIA_PROXY_ENABLED=true
    DISABLE_PASSWORD_VALIDATORS=false
    DISABLE_PASSWORD_VALIDATORS=false
    FUNKWHALE_PLUGINS=
    MUSIC_DIRECTORY_PATH=/music
@ -6,6 +6,7 @@ import pytest
import pytz

from django.templatetags.static import static
from django.urls import reverse

from funkwhale_api.audio import serializers
from funkwhale_api.common import serializers as common_serializers
@ -213,7 +214,7 @@ def test_channel_serializer_update_podcast(factories):
def test_channel_serializer_representation(factories, to_api_date):
    content = factories["common.Content"]()
    channel = factories["audio.Channel"](artist__description=content)

    setattr(channel, "_downloads_count", 12)
    expected = {
        "artist": music_serializers.serialize_artist_simple(channel.artist),
        "uuid": str(channel.uuid),
@ -225,6 +226,7 @@ def test_channel_serializer_representation(factories, to_api_date):
        "metadata": {},
        "rss_url": channel.get_rss_url(),
        "url": channel.actor.url,
        "downloads_count": 12,
    }
    expected["artist"]["description"] = common_serializers.ContentSerializer(
        content
@ -248,6 +250,7 @@ def test_channel_serializer_external_representation(factories, to_api_date):
        "metadata": {},
        "rss_url": channel.get_rss_url(),
        "url": channel.actor.url,
        "downloads_count": None,
    }
    expected["artist"]["description"] = common_serializers.ContentSerializer(
        content
@ -312,7 +315,12 @@ def test_rss_item_serializer(factories):
        "link": [{"value": federation_utils.full_url(upload.track.get_absolute_url())}],
        "enclosure": [
            {
                "url": federation_utils.full_url(upload.get_listen_url("mp3")),
                "url": federation_utils.full_url(
                    reverse(
                        "api:v1:stream-detail", kwargs={"uuid": str(upload.track.uuid)}
                    )
                    + ".mp3"
                ),
                "length": upload.size,
                "type": "audio/mpeg",
            }
Some files were not shown because too many files have changed in this diff.