Merge branch 'release/0.14.2'
commit 544a60b800

@@ -3,13 +3,39 @@ variables:
  IMAGE: $IMAGE_NAME:$CI_COMMIT_REF_NAME
  IMAGE_LATEST: $IMAGE_NAME:latest
  PIP_CACHE_DIR: "$CI_PROJECT_DIR/pip-cache"
  PYTHONDONTWRITEBYTECODE: "true"

stages:
  - lint
  - test
  - build
  - deploy

black:
  image: python:3.6
  stage: lint
  variables:
    GIT_STRATEGY: fetch
  before_script:
    - pip install black
  script:
    - black --check --diff api/

flake8:
  image: python:3.6
  stage: lint
  variables:
    GIT_STRATEGY: fetch
  before_script:
    - pip install flake8
  script:
    - flake8 -v api
  cache:
    key: "$CI_PROJECT_ID__flake8_pip_cache"
    paths:
      - "$PIP_CACHE_DIR"

test_api:
  services:
    - postgres:9.4

@@ -108,7 +134,7 @@ pages:
  tags:
    - docker

docker_develop:
docker_release:
  stage: deploy
  before_script:
    - docker login -u $DOCKER_LOGIN -p $DOCKER_PASSWORD

@@ -119,8 +145,9 @@ docker_develop:
    - docker push $IMAGE
  only:
    - develop@funkwhale/funkwhale
    - tags@funkwhale/funkwhale
  tags:
    - dind
    - docker-build

build_api:
  # Simply publish a zip containing api/ directory

@@ -135,19 +162,3 @@ build_api:
    - tags@funkwhale/funkwhale
    - master@funkwhale/funkwhale
    - develop@funkwhale/funkwhale

docker_release:
  stage: deploy
  before_script:
    - docker login -u $DOCKER_LOGIN -p $DOCKER_PASSWORD
    - cp -r front/dist api/frontend
    - cd api
  script:
    - docker build -t $IMAGE -t $IMAGE_LATEST .
    - docker push $IMAGE
    - docker push $IMAGE_LATEST
  only:
    - tags@funkwhale/funkwhale
  tags:
    - dind
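
The two new lint jobs only run standard tools, so the same checks can be reproduced locally before pushing. A minimal sketch using the exact commands from the jobs above (it assumes a Python environment comparable to the ``python:3.6`` image used in CI)::

    # install the linters used by the "black" and "flake8" jobs
    pip install black flake8

    # run the same checks the CI jobs run
    black --check --diff api/
    flake8 -v api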

@@ -0,0 +1,43 @@
<!--
Hi there! You are reporting a bug on this project, and we want to thank you!

To ensure your bug report is as useful as possible, please try to stick
to the following structure. You can leave the text between `<!-- -->`
markers untouched; it won't be displayed in your final message.

Please do not edit the following line; it's used for automatic classification
-->

/label ~"Type: Bug" ~"Status: Need triage"

## Steps to reproduce

<!--
Describe the steps to reproduce the issue, like:

1. Visit the page at /artists/
2. Type that
3. Submit
-->

## What happens?

<!--
Describe what happens once the previous steps are completed.
-->

## What is expected?

<!--
Describe the expected behaviour.
-->

## Context

<!--
If relevant, share additional context here like:

- Browser type and version (for front-end bugs)
- Instance configuration (Docker/non-docker, nginx/apache as proxy, etc.)
- Error messages, screenshots and logs
-->

@@ -0,0 +1,39 @@
<!--
Hi there! You are about to share a feature request or an idea, and we want to thank you!

To ensure we can deal with your idea or request, please try to stick
to the following structure. You can leave the text between `<!-- -->`
markers untouched; it won't be displayed in your final message.

Please do not edit the following line; it's used for automatic classification
-->

/label ~"Type: New feature" ~"Status: Need triage"

## What is the problem you are facing?

<!--
Describe the problem you'd like to solve, and why we need to add or
improve something in the current system to solve that problem.

Be as specific as possible.
-->

## What are the possible drawbacks or issues with the requested changes?

<!--
Altering the system behaviour is not always a free action: it can impact
user experience and performance, or introduce bugs and complexity.

If you can think of anything we should keep in mind while
examining your request, please describe it in this section.
-->

## Context

<!--
If relevant, share additional context here like:

- Links to existing implementations or examples of the requested feature
- Screenshots
-->

CHANGELOG (127 changes)

@@ -10,6 +10,133 @@ This changelog is viewable on the web at https://docs.funkwhale.audio/changelog.

.. towncrier

0.14.2 (2018-06-16)
-------------------

.. warning::

    This release contains a fix for a permission issue. You should upgrade
    as soon as possible. Read the changelog below for more details.

Upgrade instructions are available at
https://docs.funkwhale.audio/upgrading.html

Enhancements:

- Added feedback on shuffle button (#262)
- Added multiple warnings in the documentation that you should never run
  makemigrations yourself (#291)
- Album covers are now served over https behind a reverse proxy (#264)
- Apache2 reverse proxy now supports websockets (tested with Apache 2.4.25)
  (!252)
- Display file size in human format during file upload (#289)
- Switch from BSD-3 licence to AGPL-3 licence (#280)

Bugfixes:

- Ensure radios can only be edited and deleted by their owners (#311)
- Fixed admin menu not showing after login (#245)
- Fixed broken pagination in Subsonic API (#295)
- Fixed duplicated websocket connection on timeline (#287)


Documentation:

- Improved documentation about in-place imports setup (#298)


Other:

- Added Black and flake8 checks in CI to ensure consistent code styling and
  formatting (#297)
- Added bug and feature issue templates (#299)


Permission issues on radios
^^^^^^^^^^^^^^^^^^^^^^^^^^^

Because of an error in the way we checked user permissions on radios,
public radios could be deleted by any logged-in user, even if they were not
the owner of the radio.

We recommend instance owners to upgrade as fast as possible to avoid any abuse
and data loss.


Funkwhale is now licensed under AGPL-3
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Following the recent switch made by PixelFed
(https://github.com/dansup/pixelfed/issues/143), we decided along with
the community to relicense Funkwhale under the AGPL-3 licence. We made this
switch for various reasons:

- It is better aligned with other fediverse software
- It prohibits anyone from distributing closed-source and proprietary forks of Funkwhale

For end users and instance owners, this does not change anything. You can
continue to use Funkwhale exactly as you did before :)


Apache support for websockets
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Until now, our Apache2 configuration did not work with websockets. This is now
solved by adding this at the beginning of your Apache2 configuration file::

    Define funkwhale-api-ws ws://localhost:5000

And this, before the "/api" block::

    # Activating WebSockets
    ProxyPass "/api/v1/instance/activity" ${funkwhale-api-ws}/api/v1/instance/activity

Websockets may not be supported in older versions of Apache2. Be sure to upgrade to the latest version available.
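
To check that the upgrade is actually forwarded by the proxy, you can send a websocket handshake by hand. This is only a sketch: replace ``yourdomain`` with your own domain, and note that depending on authentication settings the API may reject the anonymous connection::

    curl -i -N --max-time 5 https://yourdomain/api/v1/instance/activity \
        -H "Connection: Upgrade" \
        -H "Upgrade: websocket" \
        -H "Sec-WebSocket-Version: 13" \
        -H "Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ=="

    # A "101 Switching Protocols" status (or a 403 returned by the API) means
    # the request reached the backend through Apache; a 404 or 502 from Apache
    # suggests the ProxyPass line above is missing or misplaced.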


Serving album covers in https (Apache2 proxy)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Two issues are addressed here. The first one was about Django replying with
mixed content (http) when queried for covers. Setting the `X-Forwarded-Proto`
header lets Django know that the client is using https, and that the reply must
be https as well.

The second issue was a permission problem causing Apache to be denied access to
the album cover folder. It is solved by adding another block for this path in
the Apache configuration file for Funkwhale.

Here is how to modify your `funkwhale.conf` Apache2 configuration::

    <VirtualHost *:443>

        ...
        # Add this new line
        RequestHeader set X-Forwarded-Proto "https"
        ...
        # Add this new block below the other <Directory/> blocks;
        # replace /srv/funkwhale/data/media with the path to your media directory
        # if you're not using the standard layout.
        <Directory /srv/funkwhale/data/media/albums>
            Options FollowSymLinks
            AllowOverride None
            Require all granted
        </Directory>
        ...
    </VirtualHost>


About the makemigrations warning
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

You may sometimes get the following warning while applying migrations::

    "Your models have changes that are not yet reflected in a migration, and so won't be applied."

This is a warning, not an error, and it can be safely ignored.
Never run the ``makemigrations`` command yourself.
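
When the warning shows up, keep applying migrations the usual way. The commands below are only a sketch of the typical workflow, not the authoritative procedure; paths and service names may differ on your instance, so refer to the upgrade documentation linked above::

    # Docker-based installation: run migrate inside the api container
    docker-compose run --rm api python manage.py migrate

    # Non-docker installation: run migrate from the api/ directory,
    # with your virtualenv activated
    python manage.py migrate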


0.14.1 (2018-06-06)
-------------------

CONTRIBUTING (37 changes)

@@ -61,16 +61,6 @@ If you do not want to add the ``-f dev.yml`` snippet every time, you can run this

    export COMPOSE_FILE=dev.yml


Building the containers
^^^^^^^^^^^^^^^^^^^^^^^

On your initial clone, or if there have been some changes in the
app dependencies, you will have to rebuild your containers. This is done
via the following command::

    docker-compose -f dev.yml build


Creating your env file
^^^^^^^^^^^^^^^^^^^^^^

@@ -84,6 +74,24 @@ Create it like this::

    touch .env


Create docker network
^^^^^^^^^^^^^^^^^^^^^

Create the federation network::

    docker network create federation


Building the containers
^^^^^^^^^^^^^^^^^^^^^^^

On your initial clone, or if there have been some changes in the
app dependencies, you will have to rebuild your containers. This is done
via the following command::

    docker-compose -f dev.yml build


Database management
^^^^^^^^^^^^^^^^^^^

@@ -124,7 +132,7 @@ Launch all services

Then you can run everything with::

    docker-compose -f dev.yml up
    docker-compose -f dev.yml up front api nginx celeryworker

This will launch all services, and output the logs in your current terminal window.
If you prefer to launch them in the background instead, use the ``-d`` flag, and access the logs when you need them via ``docker-compose -f dev.yml logs --tail=50 --follow``.

@@ -194,13 +202,6 @@ Run a reverse proxy for your instances
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


Create docker network
^^^^^^^^^^^^^^^^^^^^^

Create the federation network::

    docker network create federation


Launch everything
^^^^^^^^^^^^^^^^^
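
Put together, the development setup steps touched by this diff boil down to the following sequence. This is only a sketch of the happy path, run from the repository root; it skips the parts of the file that are unchanged here::

    export COMPOSE_FILE=dev.yml       # so -f dev.yml can be omitted
    touch .env                        # create your (possibly empty) env file
    docker network create federation  # shared network for federation testing
    docker-compose -f dev.yml build   # rebuild after dependency changes
    docker-compose -f dev.yml up front api nginx celeryworker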

LICENSE (678 changes)

@@ -1,27 +1,661 @@
Copyright (c) 2015, Eliot Berriot
All rights reserved.

Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice, this
  list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice, this
  list of conditions and the following disclaimer in the documentation and/or
  other materials provided with the distribution.

* Neither the name of funkwhale_api nor the names of its
  contributors may be used to endorse or promote products derived from this
  software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
OF THE POSSIBILITY OF SUCH DAMAGE.

The BSD-3 notice above is removed; the new LICENSE file contains the GNU AGPL-3:

                    GNU AFFERO GENERAL PUBLIC LICENSE
                       Version 3, 19 November 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.
[The remainder of the file is the standard, unmodified text of the GNU Affero
General Public License, version 3: the Preamble, the Terms and Conditions
(sections 0-17), and the "How to Apply These Terms to Your New Programs"
appendix.]

@@ -7,7 +7,7 @@ Funkwhale

A self-hosted tribute to Grooveshark.com.

LICENSE: BSD
LICENSE: AGPL3

Getting help
------------

@ -1,81 +1,79 @@
|
|||
from django.conf.urls import include, url
|
||||
from dynamic_preferences.api.viewsets import GlobalPreferencesViewSet
|
||||
from rest_framework import routers
|
||||
from rest_framework.urlpatterns import format_suffix_patterns
|
||||
from django.conf.urls import include, url
|
||||
from rest_framework_jwt import views as jwt_views
|
||||
|
||||
from funkwhale_api.activity import views as activity_views
|
||||
from funkwhale_api.instance import views as instance_views
|
||||
from funkwhale_api.music import views
|
||||
from funkwhale_api.playlists import views as playlists_views
|
||||
from funkwhale_api.subsonic.views import SubsonicViewSet
|
||||
from rest_framework_jwt import views as jwt_views
|
||||
|
||||
from dynamic_preferences.api.viewsets import GlobalPreferencesViewSet
|
||||
from dynamic_preferences.users.viewsets import UserPreferencesViewSet
|
||||
|
||||
router = routers.SimpleRouter()
|
||||
router.register(r'settings', GlobalPreferencesViewSet, base_name='settings')
|
||||
router.register(r'activity', activity_views.ActivityViewSet, 'activity')
|
||||
router.register(r'tags', views.TagViewSet, 'tags')
|
||||
router.register(r'tracks', views.TrackViewSet, 'tracks')
|
||||
router.register(r'trackfiles', views.TrackFileViewSet, 'trackfiles')
|
||||
router.register(r'artists', views.ArtistViewSet, 'artists')
|
||||
router.register(r'albums', views.AlbumViewSet, 'albums')
|
||||
router.register(r'import-batches', views.ImportBatchViewSet, 'import-batches')
router.register(r'import-jobs', views.ImportJobViewSet, 'import-jobs')
router.register(r'submit', views.SubmitViewSet, 'submit')
router.register(r'playlists', playlists_views.PlaylistViewSet, 'playlists')
router.register(r"settings", GlobalPreferencesViewSet, base_name="settings")
router.register(r"activity", activity_views.ActivityViewSet, "activity")
router.register(r"tags", views.TagViewSet, "tags")
router.register(r"tracks", views.TrackViewSet, "tracks")
router.register(r"trackfiles", views.TrackFileViewSet, "trackfiles")
router.register(r"artists", views.ArtistViewSet, "artists")
router.register(r"albums", views.AlbumViewSet, "albums")
router.register(r"import-batches", views.ImportBatchViewSet, "import-batches")
router.register(r"import-jobs", views.ImportJobViewSet, "import-jobs")
router.register(r"submit", views.SubmitViewSet, "submit")
router.register(r"playlists", playlists_views.PlaylistViewSet, "playlists")
router.register(
    r'playlist-tracks',
    playlists_views.PlaylistTrackViewSet,
    'playlist-tracks')
    r"playlist-tracks", playlists_views.PlaylistTrackViewSet, "playlist-tracks"
)
v1_patterns = router.urls

subsonic_router = routers.SimpleRouter(trailing_slash=False)
subsonic_router.register(r'subsonic/rest', SubsonicViewSet, base_name='subsonic')
subsonic_router.register(r"subsonic/rest", SubsonicViewSet, base_name="subsonic")


v1_patterns += [
    url(r'^instance/',
    url(
        r"^instance/",
        include(("funkwhale_api.instance.urls", "instance"), namespace="instance"),
    ),
    url(
        r"^manage/",
        include(("funkwhale_api.manage.urls", "manage"), namespace="manage"),
    ),
    url(
        r"^federation/",
        include(
            ('funkwhale_api.instance.urls', 'instance'),
            namespace='instance')),
    url(r'^manage/',
        include(
            ('funkwhale_api.manage.urls', 'manage'),
            namespace='manage')),
    url(r'^federation/',
        include(
            ('funkwhale_api.federation.api_urls', 'federation'),
            namespace='federation')),
    url(r'^providers/',
        include(
            ('funkwhale_api.providers.urls', 'providers'),
            namespace='providers')),
    url(r'^favorites/',
        include(
            ('funkwhale_api.favorites.urls', 'favorites'),
            namespace='favorites')),
    url(r'^search$',
        views.Search.as_view(), name='search'),
    url(r'^radios/',
        include(
            ('funkwhale_api.radios.urls', 'radios'),
            namespace='radios')),
    url(r'^history/',
        include(
            ('funkwhale_api.history.urls', 'history'),
            namespace='history')),
    url(r'^users/',
        include(
            ('funkwhale_api.users.api_urls', 'users'),
            namespace='users')),
    url(r'^requests/',
        include(
            ('funkwhale_api.requests.api_urls', 'requests'),
            namespace='requests')),
    url(r'^token/$', jwt_views.obtain_jwt_token, name='token'),
    url(r'^token/refresh/$', jwt_views.refresh_jwt_token, name='token_refresh'),
            ("funkwhale_api.federation.api_urls", "federation"), namespace="federation"
        ),
    ),
    url(
        r"^providers/",
        include(("funkwhale_api.providers.urls", "providers"), namespace="providers"),
    ),
    url(
        r"^favorites/",
        include(("funkwhale_api.favorites.urls", "favorites"), namespace="favorites"),
    ),
    url(r"^search$", views.Search.as_view(), name="search"),
    url(
        r"^radios/",
        include(("funkwhale_api.radios.urls", "radios"), namespace="radios"),
    ),
    url(
        r"^history/",
        include(("funkwhale_api.history.urls", "history"), namespace="history"),
    ),
    url(
        r"^users/",
        include(("funkwhale_api.users.api_urls", "users"), namespace="users"),
    ),
    url(
        r"^requests/",
        include(("funkwhale_api.requests.api_urls", "requests"), namespace="requests"),
    ),
    url(r"^token/$", jwt_views.obtain_jwt_token, name="token"),
    url(r"^token/refresh/$", jwt_views.refresh_jwt_token, name="token_refresh"),
]

urlpatterns = [
    url(r'^v1/', include((v1_patterns, 'v1'), namespace='v1'))
] + format_suffix_patterns(subsonic_router.urls, allowed=['view'])
    url(r"^v1/", include((v1_patterns, "v1"), namespace="v1"))
] + format_suffix_patterns(subsonic_router.urls, allowed=["view"])

@@ -1,8 +1,9 @@
import django
import os

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.production")
import django

django.setup()

from .routing import application
from .routing import application  # noqa

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.production")

@@ -1,18 +1,16 @@
from django.conf.urls import url

from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter, URLRouter
from django.conf.urls import url

from funkwhale_api.common.auth import TokenAuthMiddleware
from funkwhale_api.instance import consumers


application = ProtocolTypeRouter({
    # Empty for now (http->django views is added by default)
    "websocket": TokenAuthMiddleware(
        URLRouter([
            url("^api/v1/instance/activity$",
                consumers.InstanceActivityConsumer),
        ])
    ),
})
application = ProtocolTypeRouter(
    {
        # Empty for now (http->django views is added by default)
        "websocket": TokenAuthMiddleware(
            URLRouter(
                [url("^api/v1/instance/activity$", consumers.InstanceActivityConsumer)]
            )
        )
    }
)
@ -10,131 +10,125 @@ https://docs.djangoproject.com/en/dev/ref/settings/
|
|||
"""
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
from urllib.parse import urlsplit
|
||||
import os
|
||||
import datetime
|
||||
from urllib.parse import urlparse, urlsplit
|
||||
|
||||
import environ
|
||||
from celery.schedules import crontab
|
||||
|
||||
from funkwhale_api import __version__
|
||||
|
||||
ROOT_DIR = environ.Path(__file__) - 3 # (/a/b/myfile.py - 3 = /)
|
||||
APPS_DIR = ROOT_DIR.path('funkwhale_api')
|
||||
APPS_DIR = ROOT_DIR.path("funkwhale_api")
|
||||
|
||||
env = environ.Env()
|
||||
|
||||
try:
|
||||
env.read_env(ROOT_DIR.file('.env'))
|
||||
env.read_env(ROOT_DIR.file(".env"))
|
||||
except FileNotFoundError:
|
||||
pass
|
||||
|
||||
FUNKWHALE_HOSTNAME = None
|
||||
FUNKWHALE_HOSTNAME_SUFFIX = env('FUNKWHALE_HOSTNAME_SUFFIX', default=None)
|
||||
FUNKWHALE_HOSTNAME_PREFIX = env('FUNKWHALE_HOSTNAME_PREFIX', default=None)
|
||||
FUNKWHALE_HOSTNAME_SUFFIX = env("FUNKWHALE_HOSTNAME_SUFFIX", default=None)
|
||||
FUNKWHALE_HOSTNAME_PREFIX = env("FUNKWHALE_HOSTNAME_PREFIX", default=None)
|
||||
if FUNKWHALE_HOSTNAME_PREFIX and FUNKWHALE_HOSTNAME_SUFFIX:
|
||||
# We're in traefik case, in development
|
||||
FUNKWHALE_HOSTNAME = '{}.{}'.format(
|
||||
FUNKWHALE_HOSTNAME_PREFIX, FUNKWHALE_HOSTNAME_SUFFIX)
|
||||
FUNKWHALE_PROTOCOL = env('FUNKWHALE_PROTOCOL', default='https')
|
||||
FUNKWHALE_HOSTNAME = "{}.{}".format(
|
||||
FUNKWHALE_HOSTNAME_PREFIX, FUNKWHALE_HOSTNAME_SUFFIX
|
||||
)
|
||||
FUNKWHALE_PROTOCOL = env("FUNKWHALE_PROTOCOL", default="https")
|
||||
else:
|
||||
try:
|
||||
FUNKWHALE_HOSTNAME = env('FUNKWHALE_HOSTNAME')
|
||||
FUNKWHALE_PROTOCOL = env('FUNKWHALE_PROTOCOL', default='https')
|
||||
FUNKWHALE_HOSTNAME = env("FUNKWHALE_HOSTNAME")
|
||||
FUNKWHALE_PROTOCOL = env("FUNKWHALE_PROTOCOL", default="https")
|
||||
except Exception:
|
||||
FUNKWHALE_URL = env('FUNKWHALE_URL')
|
||||
FUNKWHALE_URL = env("FUNKWHALE_URL")
|
||||
_parsed = urlsplit(FUNKWHALE_URL)
|
||||
FUNKWHALE_HOSTNAME = _parsed.netloc
|
||||
FUNKWHALE_PROTOCOL = _parsed.scheme
|
||||
|
||||
FUNKWHALE_URL = '{}://{}'.format(FUNKWHALE_PROTOCOL, FUNKWHALE_HOSTNAME)
|
||||
FUNKWHALE_URL = "{}://{}".format(FUNKWHALE_PROTOCOL, FUNKWHALE_HOSTNAME)
|
||||
|
||||
|
||||
# XXX: deprecated, see #186
|
||||
FEDERATION_ENABLED = env.bool('FEDERATION_ENABLED', default=True)
|
||||
FEDERATION_HOSTNAME = env('FEDERATION_HOSTNAME', default=FUNKWHALE_HOSTNAME)
|
||||
FEDERATION_ENABLED = env.bool("FEDERATION_ENABLED", default=True)
|
||||
FEDERATION_HOSTNAME = env("FEDERATION_HOSTNAME", default=FUNKWHALE_HOSTNAME)
|
||||
# XXX: deprecated, see #186
|
||||
FEDERATION_COLLECTION_PAGE_SIZE = env.int(
|
||||
'FEDERATION_COLLECTION_PAGE_SIZE', default=50
|
||||
)
|
||||
FEDERATION_COLLECTION_PAGE_SIZE = env.int("FEDERATION_COLLECTION_PAGE_SIZE", default=50)
|
||||
# XXX: deprecated, see #186
|
||||
FEDERATION_MUSIC_NEEDS_APPROVAL = env.bool(
|
||||
'FEDERATION_MUSIC_NEEDS_APPROVAL', default=True
|
||||
"FEDERATION_MUSIC_NEEDS_APPROVAL", default=True
|
||||
)
|
||||
# XXX: deprecated, see #186
|
||||
FEDERATION_ACTOR_FETCH_DELAY = env.int(
|
||||
'FEDERATION_ACTOR_FETCH_DELAY', default=60 * 12)
|
||||
ALLOWED_HOSTS = env.list('DJANGO_ALLOWED_HOSTS')
|
||||
FEDERATION_ACTOR_FETCH_DELAY = env.int("FEDERATION_ACTOR_FETCH_DELAY", default=60 * 12)
|
||||
ALLOWED_HOSTS = env.list("DJANGO_ALLOWED_HOSTS")
|
||||
|
||||
# APP CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
DJANGO_APPS = (
|
||||
'channels',
|
||||
"channels",
|
||||
# Default Django apps:
|
||||
'django.contrib.auth',
|
||||
'django.contrib.contenttypes',
|
||||
'django.contrib.sessions',
|
||||
'django.contrib.sites',
|
||||
'django.contrib.messages',
|
||||
'django.contrib.staticfiles',
|
||||
'django.contrib.postgres',
|
||||
|
||||
"django.contrib.auth",
|
||||
"django.contrib.contenttypes",
|
||||
"django.contrib.sessions",
|
||||
"django.contrib.sites",
|
||||
"django.contrib.messages",
|
||||
"django.contrib.staticfiles",
|
||||
"django.contrib.postgres",
|
||||
# Useful template tags:
|
||||
# 'django.contrib.humanize',
|
||||
|
||||
# Admin
|
||||
'django.contrib.admin',
|
||||
"django.contrib.admin",
|
||||
)
|
||||
THIRD_PARTY_APPS = (
|
||||
# 'crispy_forms', # Form layouts
|
||||
'allauth', # registration
|
||||
'allauth.account', # registration
|
||||
'allauth.socialaccount', # registration
|
||||
'corsheaders',
|
||||
'rest_framework',
|
||||
'rest_framework.authtoken',
|
||||
'taggit',
|
||||
'rest_auth',
|
||||
'rest_auth.registration',
|
||||
'dynamic_preferences',
|
||||
'django_filters',
|
||||
'cacheops',
|
||||
'django_cleanup',
|
||||
"allauth", # registration
|
||||
"allauth.account", # registration
|
||||
"allauth.socialaccount", # registration
|
||||
"corsheaders",
|
||||
"rest_framework",
|
||||
"rest_framework.authtoken",
|
||||
"taggit",
|
||||
"rest_auth",
|
||||
"rest_auth.registration",
|
||||
"dynamic_preferences",
|
||||
"django_filters",
|
||||
"cacheops",
|
||||
"django_cleanup",
|
||||
)
|
||||
|
||||
|
||||
# Sentry
|
||||
RAVEN_ENABLED = env.bool("RAVEN_ENABLED", default=False)
|
||||
RAVEN_DSN = env("RAVEN_DSN", default='')
|
||||
RAVEN_DSN = env("RAVEN_DSN", default="")
|
||||
|
||||
if RAVEN_ENABLED:
|
||||
RAVEN_CONFIG = {
|
||||
'dsn': RAVEN_DSN,
|
||||
"dsn": RAVEN_DSN,
|
||||
# If you are using git, you can also automatically configure the
|
||||
# release based on the git info.
|
||||
'release': __version__,
|
||||
"release": __version__,
|
||||
}
|
||||
THIRD_PARTY_APPS += (
|
||||
'raven.contrib.django.raven_compat',
|
||||
)
|
||||
THIRD_PARTY_APPS += ("raven.contrib.django.raven_compat",)
|
||||
|
||||
|
||||
# Apps specific for this project go here.
|
||||
LOCAL_APPS = (
|
||||
'funkwhale_api.common',
|
||||
'funkwhale_api.activity.apps.ActivityConfig',
|
||||
'funkwhale_api.users', # custom users app
|
||||
"funkwhale_api.common",
|
||||
"funkwhale_api.activity.apps.ActivityConfig",
|
||||
"funkwhale_api.users", # custom users app
|
||||
# Your stuff: custom apps go here
|
||||
'funkwhale_api.instance',
|
||||
'funkwhale_api.music',
|
||||
'funkwhale_api.requests',
|
||||
'funkwhale_api.favorites',
|
||||
'funkwhale_api.federation',
|
||||
'funkwhale_api.radios',
|
||||
'funkwhale_api.history',
|
||||
'funkwhale_api.playlists',
|
||||
'funkwhale_api.providers.audiofile',
|
||||
'funkwhale_api.providers.youtube',
|
||||
'funkwhale_api.providers.acoustid',
|
||||
'funkwhale_api.subsonic',
|
||||
"funkwhale_api.instance",
|
||||
"funkwhale_api.music",
|
||||
"funkwhale_api.requests",
|
||||
"funkwhale_api.favorites",
|
||||
"funkwhale_api.federation",
|
||||
"funkwhale_api.radios",
|
||||
"funkwhale_api.history",
|
||||
"funkwhale_api.playlists",
|
||||
"funkwhale_api.providers.audiofile",
|
||||
"funkwhale_api.providers.youtube",
|
||||
"funkwhale_api.providers.acoustid",
|
||||
"funkwhale_api.subsonic",
|
||||
)
|
||||
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#installed-apps
|
||||
|
@ -145,20 +139,18 @@ INSTALLED_APPS = DJANGO_APPS + THIRD_PARTY_APPS + LOCAL_APPS
|
|||
# ------------------------------------------------------------------------------
|
||||
MIDDLEWARE = (
|
||||
# Make sure djangosecure.middleware.SecurityMiddleware is listed first
|
||||
'django.contrib.sessions.middleware.SessionMiddleware',
|
||||
'corsheaders.middleware.CorsMiddleware',
|
||||
'django.middleware.common.CommonMiddleware',
|
||||
'django.middleware.csrf.CsrfViewMiddleware',
|
||||
'django.contrib.auth.middleware.AuthenticationMiddleware',
|
||||
'django.contrib.messages.middleware.MessageMiddleware',
|
||||
'django.middleware.clickjacking.XFrameOptionsMiddleware',
|
||||
"django.contrib.sessions.middleware.SessionMiddleware",
|
||||
"corsheaders.middleware.CorsMiddleware",
|
||||
"django.middleware.common.CommonMiddleware",
|
||||
"django.middleware.csrf.CsrfViewMiddleware",
|
||||
"django.contrib.auth.middleware.AuthenticationMiddleware",
|
||||
"django.contrib.messages.middleware.MessageMiddleware",
|
||||
"django.middleware.clickjacking.XFrameOptionsMiddleware",
|
||||
)
|
||||
|
||||
# MIGRATIONS CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
MIGRATION_MODULES = {
|
||||
'sites': 'funkwhale_api.contrib.sites.migrations'
|
||||
}
|
||||
MIGRATION_MODULES = {"sites": "funkwhale_api.contrib.sites.migrations"}
|
||||
|
||||
# DEBUG
|
||||
# ------------------------------------------------------------------------------
|
||||
|
@ -168,9 +160,7 @@ DEBUG = env.bool("DJANGO_DEBUG", False)
|
|||
# FIXTURE CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#std:setting-FIXTURE_DIRS
|
||||
FIXTURE_DIRS = (
|
||||
str(APPS_DIR.path('fixtures')),
|
||||
)
|
||||
FIXTURE_DIRS = (str(APPS_DIR.path("fixtures")),)
|
||||
|
||||
# EMAIL CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
|
@ -178,16 +168,14 @@ FIXTURE_DIRS = (
|
|||
# EMAIL
|
||||
# ------------------------------------------------------------------------------
|
||||
DEFAULT_FROM_EMAIL = env(
|
||||
'DEFAULT_FROM_EMAIL',
|
||||
default='Funkwhale <noreply@{}>'.format(FUNKWHALE_HOSTNAME))
|
||||
"DEFAULT_FROM_EMAIL", default="Funkwhale <noreply@{}>".format(FUNKWHALE_HOSTNAME)
|
||||
)
|
||||
|
||||
EMAIL_SUBJECT_PREFIX = env(
|
||||
"EMAIL_SUBJECT_PREFIX", default='[Funkwhale] ')
|
||||
SERVER_EMAIL = env('SERVER_EMAIL', default=DEFAULT_FROM_EMAIL)
|
||||
EMAIL_SUBJECT_PREFIX = env("EMAIL_SUBJECT_PREFIX", default="[Funkwhale] ")
|
||||
SERVER_EMAIL = env("SERVER_EMAIL", default=DEFAULT_FROM_EMAIL)
|
||||
|
||||
|
||||
EMAIL_CONFIG = env.email_url(
|
||||
'EMAIL_CONFIG', default='consolemail://')
|
||||
EMAIL_CONFIG = env.email_url("EMAIL_CONFIG", default="consolemail://")
|
||||
|
||||
vars().update(EMAIL_CONFIG)
|
||||
|
||||
|
@ -196,9 +184,9 @@ vars().update(EMAIL_CONFIG)
|
|||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#databases
|
||||
DATABASES = {
|
||||
# Raises ImproperlyConfigured exception if DATABASE_URL not in os.environ
|
||||
'default': env.db("DATABASE_URL"),
|
||||
"default": env.db("DATABASE_URL")
|
||||
}
|
||||
DATABASES['default']['ATOMIC_REQUESTS'] = True
|
||||
DATABASES["default"]["ATOMIC_REQUESTS"] = True
|
||||
#
|
||||
# DATABASES = {
|
||||
# 'default': {
|
||||
|
@ -212,10 +200,10 @@ DATABASES['default']['ATOMIC_REQUESTS'] = True
|
|||
# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
|
||||
# although not all choices may be available on all operating systems.
|
||||
# In a Windows environment this must be set to your system time zone.
|
||||
TIME_ZONE = 'UTC'
|
||||
TIME_ZONE = "UTC"
|
||||
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#language-code
|
||||
LANGUAGE_CODE = 'en-us'
|
||||
LANGUAGE_CODE = "en-us"
|
||||
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#site-id
|
||||
SITE_ID = 1
|
||||
|
@ -235,152 +223,142 @@ USE_TZ = True
|
|||
TEMPLATES = [
|
||||
{
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#std:setting-TEMPLATES-BACKEND
|
||||
'BACKEND': 'django.template.backends.django.DjangoTemplates',
|
||||
"BACKEND": "django.template.backends.django.DjangoTemplates",
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#template-dirs
|
||||
'DIRS': [
|
||||
str(APPS_DIR.path('templates')),
|
||||
],
|
||||
'OPTIONS': {
|
||||
"DIRS": [str(APPS_DIR.path("templates"))],
|
||||
"OPTIONS": {
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#template-debug
|
||||
'debug': DEBUG,
|
||||
"debug": DEBUG,
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#template-loaders
|
||||
# https://docs.djangoproject.com/en/dev/ref/templates/api/#loader-types
|
||||
'loaders': [
|
||||
'django.template.loaders.filesystem.Loader',
|
||||
'django.template.loaders.app_directories.Loader',
|
||||
"loaders": [
|
||||
"django.template.loaders.filesystem.Loader",
|
||||
"django.template.loaders.app_directories.Loader",
|
||||
],
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#template-context-processors
|
||||
'context_processors': [
|
||||
'django.template.context_processors.debug',
|
||||
'django.template.context_processors.request',
|
||||
'django.contrib.auth.context_processors.auth',
|
||||
'django.template.context_processors.i18n',
|
||||
'django.template.context_processors.media',
|
||||
'django.template.context_processors.static',
|
||||
'django.template.context_processors.tz',
|
||||
'django.contrib.messages.context_processors.messages',
|
||||
"context_processors": [
|
||||
"django.template.context_processors.debug",
|
||||
"django.template.context_processors.request",
|
||||
"django.contrib.auth.context_processors.auth",
|
||||
"django.template.context_processors.i18n",
|
||||
"django.template.context_processors.media",
|
||||
"django.template.context_processors.static",
|
||||
"django.template.context_processors.tz",
|
||||
"django.contrib.messages.context_processors.messages",
|
||||
# Your stuff: custom template context processors go here
|
||||
],
|
||||
},
|
||||
},
|
||||
}
|
||||
]
|
||||
|
||||
# See: http://django-crispy-forms.readthedocs.org/en/latest/install.html#template-packs
|
||||
CRISPY_TEMPLATE_PACK = 'bootstrap3'
|
||||
CRISPY_TEMPLATE_PACK = "bootstrap3"
|
||||
|
||||
# STATIC FILE CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#static-root
|
||||
STATIC_ROOT = env("STATIC_ROOT", default=str(ROOT_DIR('staticfiles')))
|
||||
STATIC_ROOT = env("STATIC_ROOT", default=str(ROOT_DIR("staticfiles")))
|
||||
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#static-url
|
||||
STATIC_URL = env("STATIC_URL", default='/staticfiles/')
|
||||
DEFAULT_FILE_STORAGE = 'funkwhale_api.common.storage.ASCIIFileSystemStorage'
|
||||
STATIC_URL = env("STATIC_URL", default="/staticfiles/")
|
||||
DEFAULT_FILE_STORAGE = "funkwhale_api.common.storage.ASCIIFileSystemStorage"
|
||||
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/contrib/staticfiles/#std:setting-STATICFILES_DIRS
|
||||
STATICFILES_DIRS = (
|
||||
str(APPS_DIR.path('static')),
|
||||
)
|
||||
STATICFILES_DIRS = (str(APPS_DIR.path("static")),)
|
||||
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/contrib/staticfiles/#staticfiles-finders
|
||||
STATICFILES_FINDERS = (
|
||||
'django.contrib.staticfiles.finders.FileSystemFinder',
|
||||
'django.contrib.staticfiles.finders.AppDirectoriesFinder',
|
||||
"django.contrib.staticfiles.finders.FileSystemFinder",
|
||||
"django.contrib.staticfiles.finders.AppDirectoriesFinder",
|
||||
)
|
||||
|
||||
# MEDIA CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#media-root
|
||||
MEDIA_ROOT = env("MEDIA_ROOT", default=str(APPS_DIR('media')))
|
||||
MEDIA_ROOT = env("MEDIA_ROOT", default=str(APPS_DIR("media")))
|
||||
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#media-url
|
||||
MEDIA_URL = env("MEDIA_URL", default='/media/')
|
||||
MEDIA_URL = env("MEDIA_URL", default="/media/")
|
||||
|
||||
# URL Configuration
|
||||
# ------------------------------------------------------------------------------
|
||||
ROOT_URLCONF = 'config.urls'
|
||||
ROOT_URLCONF = "config.urls"
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#wsgi-application
|
||||
WSGI_APPLICATION = 'config.wsgi.application'
|
||||
WSGI_APPLICATION = "config.wsgi.application"
|
||||
ASGI_APPLICATION = "config.routing.application"
|
||||
|
||||
# This ensures that Django will be able to detect a secure connection
|
||||
SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
|
||||
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
|
||||
|
||||
# AUTHENTICATION CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
AUTHENTICATION_BACKENDS = (
|
||||
'django.contrib.auth.backends.ModelBackend',
|
||||
'allauth.account.auth_backends.AuthenticationBackend',
|
||||
"django.contrib.auth.backends.ModelBackend",
|
||||
"allauth.account.auth_backends.AuthenticationBackend",
|
||||
)
|
||||
SESSION_COOKIE_HTTPONLY = False
|
||||
# Some really nice defaults
|
||||
ACCOUNT_AUTHENTICATION_METHOD = 'username_email'
|
||||
ACCOUNT_AUTHENTICATION_METHOD = "username_email"
|
||||
ACCOUNT_EMAIL_REQUIRED = True
|
||||
ACCOUNT_EMAIL_VERIFICATION = 'mandatory'
|
||||
ACCOUNT_EMAIL_VERIFICATION = "mandatory"
|
||||
|
||||
# Custom user app defaults
|
||||
# Select the correct user model
|
||||
AUTH_USER_MODEL = 'users.User'
|
||||
LOGIN_REDIRECT_URL = 'users:redirect'
|
||||
LOGIN_URL = 'account_login'
|
||||
AUTH_USER_MODEL = "users.User"
|
||||
LOGIN_REDIRECT_URL = "users:redirect"
|
||||
LOGIN_URL = "account_login"
|
||||
|
||||
# SLUGLIFIER
|
||||
AUTOSLUG_SLUGIFY_FUNCTION = 'slugify.slugify'
|
||||
AUTOSLUG_SLUGIFY_FUNCTION = "slugify.slugify"
|
||||
|
||||
CACHE_DEFAULT = "redis://127.0.0.1:6379/0"
|
||||
CACHES = {
|
||||
"default": env.cache_url('CACHE_URL', default=CACHE_DEFAULT)
|
||||
}
|
||||
CACHES = {"default": env.cache_url("CACHE_URL", default=CACHE_DEFAULT)}
|
||||
|
||||
CACHES["default"]["BACKEND"] = "django_redis.cache.RedisCache"
|
||||
from urllib.parse import urlparse
|
||||
cache_url = urlparse(CACHES['default']['LOCATION'])
|
||||
|
||||
cache_url = urlparse(CACHES["default"]["LOCATION"])
|
||||
CHANNEL_LAYERS = {
|
||||
"default": {
|
||||
"BACKEND": "channels_redis.core.RedisChannelLayer",
|
||||
"CONFIG": {
|
||||
"hosts": [(cache_url.hostname, cache_url.port)],
|
||||
},
|
||||
},
|
||||
"CONFIG": {"hosts": [(cache_url.hostname, cache_url.port)]},
|
||||
}
|
||||
}
|
||||
|
||||
CACHES["default"]["OPTIONS"] = {
|
||||
"CLIENT_CLASS": "django_redis.client.DefaultClient",
|
||||
"IGNORE_EXCEPTIONS": True, # mimics memcache behavior.
|
||||
# http://niwinz.github.io/django-redis/latest/#_memcached_exceptions_behavior
|
||||
# http://niwinz.github.io/django-redis/latest/#_memcached_exceptions_behavior
|
||||
}
|
||||
|
||||
|
||||
########## CELERY
|
||||
INSTALLED_APPS += ('funkwhale_api.taskapp.celery.CeleryConfig',)
|
||||
# CELERY
|
||||
INSTALLED_APPS += ("funkwhale_api.taskapp.celery.CeleryConfig",)
|
||||
CELERY_BROKER_URL = env(
|
||||
"CELERY_BROKER_URL", default=env('CACHE_URL', default=CACHE_DEFAULT))
|
||||
########## END CELERY
|
||||
"CELERY_BROKER_URL", default=env("CACHE_URL", default=CACHE_DEFAULT)
|
||||
)
|
||||
# END CELERY
|
||||
# Location of root django.contrib.admin URL, use {% url 'admin:index' %}
|
||||
|
||||
# Your common stuff: Below this line define 3rd party library settings
|
||||
CELERY_TASK_DEFAULT_RATE_LIMIT = 1
|
||||
CELERY_TASK_TIME_LIMIT = 300
|
||||
CELERYBEAT_SCHEDULE = {
|
||||
'federation.clean_music_cache': {
|
||||
'task': 'funkwhale_api.federation.tasks.clean_music_cache',
|
||||
'schedule': crontab(hour='*/2'),
|
||||
'options': {
|
||||
'expires': 60 * 2,
|
||||
},
|
||||
"federation.clean_music_cache": {
|
||||
"task": "funkwhale_api.federation.tasks.clean_music_cache",
|
||||
"schedule": crontab(hour="*/2"),
|
||||
"options": {"expires": 60 * 2},
|
||||
}
|
||||
}
|
||||
|
||||
import datetime
|
||||
JWT_AUTH = {
|
||||
'JWT_ALLOW_REFRESH': True,
|
||||
'JWT_EXPIRATION_DELTA': datetime.timedelta(days=7),
|
||||
'JWT_REFRESH_EXPIRATION_DELTA': datetime.timedelta(days=30),
|
||||
'JWT_AUTH_HEADER_PREFIX': 'JWT',
|
||||
'JWT_GET_USER_SECRET_KEY': lambda user: user.secret_key
|
||||
"JWT_ALLOW_REFRESH": True,
|
||||
"JWT_EXPIRATION_DELTA": datetime.timedelta(days=7),
|
||||
"JWT_REFRESH_EXPIRATION_DELTA": datetime.timedelta(days=30),
|
||||
"JWT_AUTH_HEADER_PREFIX": "JWT",
|
||||
"JWT_GET_USER_SECRET_KEY": lambda user: user.secret_key,
|
||||
}
|
||||
OLD_PASSWORD_FIELD_ENABLED = True
|
||||
ACCOUNT_ADAPTER = 'funkwhale_api.users.adapters.FunkwhaleAccountAdapter'
|
||||
ACCOUNT_ADAPTER = "funkwhale_api.users.adapters.FunkwhaleAccountAdapter"
|
||||
CORS_ORIGIN_ALLOW_ALL = True
|
||||
# CORS_ORIGIN_WHITELIST = (
|
||||
# 'localhost',
|
||||
|
@ -389,41 +367,37 @@ CORS_ORIGIN_ALLOW_ALL = True
|
|||
CORS_ALLOW_CREDENTIALS = True
|
||||
|
||||
REST_FRAMEWORK = {
|
||||
'DEFAULT_PERMISSION_CLASSES': (
|
||||
'rest_framework.permissions.IsAuthenticated',
|
||||
"DEFAULT_PERMISSION_CLASSES": ("rest_framework.permissions.IsAuthenticated",),
|
||||
"DEFAULT_PAGINATION_CLASS": "funkwhale_api.common.pagination.FunkwhalePagination",
|
||||
"PAGE_SIZE": 25,
|
||||
"DEFAULT_PARSER_CLASSES": (
|
||||
"rest_framework.parsers.JSONParser",
|
||||
"rest_framework.parsers.FormParser",
|
||||
"rest_framework.parsers.MultiPartParser",
|
||||
"funkwhale_api.federation.parsers.ActivityParser",
|
||||
),
|
||||
'DEFAULT_PAGINATION_CLASS': 'funkwhale_api.common.pagination.FunkwhalePagination',
|
||||
'PAGE_SIZE': 25,
|
||||
'DEFAULT_PARSER_CLASSES': (
|
||||
'rest_framework.parsers.JSONParser',
|
||||
'rest_framework.parsers.FormParser',
|
||||
'rest_framework.parsers.MultiPartParser',
|
||||
'funkwhale_api.federation.parsers.ActivityParser',
|
||||
"DEFAULT_AUTHENTICATION_CLASSES": (
|
||||
"funkwhale_api.common.authentication.JSONWebTokenAuthenticationQS",
|
||||
"funkwhale_api.common.authentication.BearerTokenHeaderAuth",
|
||||
"rest_framework_jwt.authentication.JSONWebTokenAuthentication",
|
||||
"rest_framework.authentication.SessionAuthentication",
|
||||
"rest_framework.authentication.BasicAuthentication",
|
||||
),
|
||||
'DEFAULT_AUTHENTICATION_CLASSES': (
|
||||
'funkwhale_api.common.authentication.JSONWebTokenAuthenticationQS',
|
||||
'funkwhale_api.common.authentication.BearerTokenHeaderAuth',
|
||||
'rest_framework_jwt.authentication.JSONWebTokenAuthentication',
|
||||
'rest_framework.authentication.SessionAuthentication',
|
||||
'rest_framework.authentication.BasicAuthentication',
|
||||
"DEFAULT_FILTER_BACKENDS": (
|
||||
"rest_framework.filters.OrderingFilter",
|
||||
"django_filters.rest_framework.DjangoFilterBackend",
|
||||
),
|
||||
'DEFAULT_FILTER_BACKENDS': (
|
||||
'rest_framework.filters.OrderingFilter',
|
||||
'django_filters.rest_framework.DjangoFilterBackend',
|
||||
),
|
||||
'DEFAULT_RENDERER_CLASSES': (
|
||||
'rest_framework.renderers.JSONRenderer',
|
||||
)
|
||||
"DEFAULT_RENDERER_CLASSES": ("rest_framework.renderers.JSONRenderer",),
|
||||
}
|
||||
|
||||
BROWSABLE_API_ENABLED = env.bool('BROWSABLE_API_ENABLED', default=False)
|
||||
BROWSABLE_API_ENABLED = env.bool("BROWSABLE_API_ENABLED", default=False)
|
||||
if BROWSABLE_API_ENABLED:
|
||||
REST_FRAMEWORK['DEFAULT_RENDERER_CLASSES'] += (
|
||||
'rest_framework.renderers.BrowsableAPIRenderer',
|
||||
REST_FRAMEWORK["DEFAULT_RENDERER_CLASSES"] += (
|
||||
"rest_framework.renderers.BrowsableAPIRenderer",
|
||||
)
|
||||
|
||||
REST_AUTH_SERIALIZERS = {
|
||||
'PASSWORD_RESET_SERIALIZER': 'funkwhale_api.users.serializers.PasswordResetSerializer' # noqa
|
||||
"PASSWORD_RESET_SERIALIZER": "funkwhale_api.users.serializers.PasswordResetSerializer" # noqa
|
||||
}
|
||||
REST_SESSION_LOGIN = False
|
||||
REST_USE_JWT = True
|
||||
|
@ -434,60 +408,55 @@ USE_X_FORWARDED_PORT = True
|
|||
|
||||
# Wether we should use Apache, Nginx (or other) headers when serving audio files
|
||||
# Default to Nginx
|
||||
REVERSE_PROXY_TYPE = env('REVERSE_PROXY_TYPE', default='nginx')
|
||||
assert REVERSE_PROXY_TYPE in ['apache2', 'nginx'], 'Unsupported REVERSE_PROXY_TYPE'
|
||||
REVERSE_PROXY_TYPE = env("REVERSE_PROXY_TYPE", default="nginx")
|
||||
assert REVERSE_PROXY_TYPE in ["apache2", "nginx"], "Unsupported REVERSE_PROXY_TYPE"
|
||||
|
||||
# Which path will be used to process the internal redirection
|
||||
# **DO NOT** put a slash at the end
|
||||
PROTECT_FILES_PATH = env('PROTECT_FILES_PATH', default='/_protected')
|
||||
PROTECT_FILES_PATH = env("PROTECT_FILES_PATH", default="/_protected")
|
||||
|
||||
|
||||
# use this setting to tweak for how long you want to cache
|
||||
# musicbrainz results. (value is in seconds)
|
||||
MUSICBRAINZ_CACHE_DURATION = env.int(
|
||||
'MUSICBRAINZ_CACHE_DURATION',
|
||||
default=300
|
||||
)
|
||||
CACHEOPS_REDIS = env('CACHE_URL', default=CACHE_DEFAULT)
|
||||
CACHEOPS_ENABLED = env.bool('CACHEOPS_ENABLED', default=True)
|
||||
MUSICBRAINZ_CACHE_DURATION = env.int("MUSICBRAINZ_CACHE_DURATION", default=300)
|
||||
CACHEOPS_REDIS = env("CACHE_URL", default=CACHE_DEFAULT)
|
||||
CACHEOPS_ENABLED = env.bool("CACHEOPS_ENABLED", default=True)
|
||||
CACHEOPS = {
|
||||
'music.artist': {'ops': 'all', 'timeout': 60 * 60},
|
||||
'music.album': {'ops': 'all', 'timeout': 60 * 60},
|
||||
'music.track': {'ops': 'all', 'timeout': 60 * 60},
|
||||
'music.trackfile': {'ops': 'all', 'timeout': 60 * 60},
|
||||
'taggit.tag': {'ops': 'all', 'timeout': 60 * 60},
|
||||
"music.artist": {"ops": "all", "timeout": 60 * 60},
|
||||
"music.album": {"ops": "all", "timeout": 60 * 60},
|
||||
"music.track": {"ops": "all", "timeout": 60 * 60},
|
||||
"music.trackfile": {"ops": "all", "timeout": 60 * 60},
|
||||
"taggit.tag": {"ops": "all", "timeout": 60 * 60},
|
||||
}
|
||||
|
||||
# Custom Admin URL, use {% url 'admin:index' %}
|
||||
ADMIN_URL = env('DJANGO_ADMIN_URL', default='^api/admin/')
|
||||
ADMIN_URL = env("DJANGO_ADMIN_URL", default="^api/admin/")
|
||||
CSRF_USE_SESSIONS = True
|
||||
|
||||
# Playlist settings
|
||||
# XXX: deprecated, see #186
|
||||
PLAYLISTS_MAX_TRACKS = env.int('PLAYLISTS_MAX_TRACKS', default=250)
|
||||
PLAYLISTS_MAX_TRACKS = env.int("PLAYLISTS_MAX_TRACKS", default=250)
|
||||
|
||||
ACCOUNT_USERNAME_BLACKLIST = [
|
||||
'funkwhale',
|
||||
'library',
|
||||
'test',
|
||||
'status',
|
||||
'root',
|
||||
'admin',
|
||||
'owner',
|
||||
'superuser',
|
||||
'staff',
|
||||
'service',
|
||||
] + env.list('ACCOUNT_USERNAME_BLACKLIST', default=[])
|
||||
"funkwhale",
|
||||
"library",
|
||||
"test",
|
||||
"status",
|
||||
"root",
|
||||
"admin",
|
||||
"owner",
|
||||
"superuser",
|
||||
"staff",
|
||||
"service",
|
||||
] + env.list("ACCOUNT_USERNAME_BLACKLIST", default=[])
|
||||
|
||||
EXTERNAL_REQUESTS_VERIFY_SSL = env.bool(
|
||||
'EXTERNAL_REQUESTS_VERIFY_SSL',
|
||||
default=True
|
||||
)
|
||||
EXTERNAL_REQUESTS_VERIFY_SSL = env.bool("EXTERNAL_REQUESTS_VERIFY_SSL", default=True)
|
||||
# XXX: deprecated, see #186
|
||||
API_AUTHENTICATION_REQUIRED = env.bool("API_AUTHENTICATION_REQUIRED", True)
|
||||
|
||||
MUSIC_DIRECTORY_PATH = env('MUSIC_DIRECTORY_PATH', default=None)
|
||||
MUSIC_DIRECTORY_PATH = env("MUSIC_DIRECTORY_PATH", default=None)
|
||||
# on Docker setup, the music directory may not match the host path,
|
||||
# and we need to know it for it to serve stuff properly
|
||||
MUSIC_DIRECTORY_SERVE_PATH = env(
|
||||
'MUSIC_DIRECTORY_SERVE_PATH', default=MUSIC_DIRECTORY_PATH)
|
||||
"MUSIC_DIRECTORY_SERVE_PATH", default=MUSIC_DIRECTORY_PATH
|
||||
)
|
||||
|
|
|
@ -1,79 +1,72 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
'''
|
||||
"""
|
||||
Local settings
|
||||
|
||||
- Run in Debug mode
|
||||
- Use console backend for emails
|
||||
- Add Django Debug Toolbar
|
||||
- Add django-extensions as app
|
||||
'''
|
||||
"""
|
||||
|
||||
from .common import * # noqa
|
||||
|
||||
|
||||
# DEBUG
|
||||
# ------------------------------------------------------------------------------
|
||||
DEBUG = env.bool('DJANGO_DEBUG', default=True)
|
||||
TEMPLATES[0]['OPTIONS']['debug'] = DEBUG
|
||||
DEBUG = env.bool("DJANGO_DEBUG", default=True)
|
||||
TEMPLATES[0]["OPTIONS"]["debug"] = DEBUG
|
||||
|
||||
# SECRET CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#secret-key
|
||||
# Note: This key only used for development and testing.
|
||||
SECRET_KEY = env("DJANGO_SECRET_KEY", default='mc$&b=5j#6^bv7tld1gyjp2&+^-qrdy=0sw@r5sua*1zp4fmxc')
|
||||
SECRET_KEY = env(
|
||||
"DJANGO_SECRET_KEY", default="mc$&b=5j#6^bv7tld1gyjp2&+^-qrdy=0sw@r5sua*1zp4fmxc"
|
||||
)
|
||||
|
||||
# Mail settings
|
||||
# ------------------------------------------------------------------------------
|
||||
EMAIL_HOST = 'localhost'
|
||||
EMAIL_HOST = "localhost"
|
||||
EMAIL_PORT = 1025
|
||||
|
||||
# django-debug-toolbar
|
||||
# ------------------------------------------------------------------------------
|
||||
MIDDLEWARE += ('debug_toolbar.middleware.DebugToolbarMiddleware',)
|
||||
MIDDLEWARE += ("debug_toolbar.middleware.DebugToolbarMiddleware",)
|
||||
|
||||
# INTERNAL_IPS = ('127.0.0.1', '10.0.2.2',)
|
||||
|
||||
DEBUG_TOOLBAR_CONFIG = {
|
||||
'DISABLE_PANELS': [
|
||||
'debug_toolbar.panels.redirects.RedirectsPanel',
|
||||
],
|
||||
'SHOW_TEMPLATE_CONTEXT': True,
|
||||
'SHOW_TOOLBAR_CALLBACK': lambda request: True,
|
||||
"DISABLE_PANELS": ["debug_toolbar.panels.redirects.RedirectsPanel"],
|
||||
"SHOW_TEMPLATE_CONTEXT": True,
|
||||
"SHOW_TOOLBAR_CALLBACK": lambda request: True,
|
||||
}
|
||||
|
||||
# django-extensions
|
||||
# ------------------------------------------------------------------------------
|
||||
# INSTALLED_APPS += ('django_extensions', )
|
||||
INSTALLED_APPS += ('debug_toolbar', )
|
||||
INSTALLED_APPS += ("debug_toolbar",)
|
||||
|
||||
# TESTING
|
||||
# ------------------------------------------------------------------------------
|
||||
TEST_RUNNER = 'django.test.runner.DiscoverRunner'
|
||||
TEST_RUNNER = "django.test.runner.DiscoverRunner"
|
||||
|
||||
########## CELERY
|
||||
# CELERY
|
||||
# In development, all tasks will be executed locally by blocking until the task returns
|
||||
CELERY_TASK_ALWAYS_EAGER = False
|
||||
########## END CELERY
|
||||
# END CELERY
|
||||
|
||||
# Your local stuff: Below this line define 3rd party library settings
|
||||
|
||||
LOGGING = {
|
||||
'version': 1,
|
||||
'handlers': {
|
||||
'console':{
|
||||
'level':'DEBUG',
|
||||
'class':'logging.StreamHandler',
|
||||
},
|
||||
},
|
||||
'loggers': {
|
||||
'django.request': {
|
||||
'handlers':['console'],
|
||||
'propagate': True,
|
||||
'level':'DEBUG',
|
||||
},
|
||||
'': {
|
||||
'level': 'DEBUG',
|
||||
'handlers': ['console'],
|
||||
"version": 1,
|
||||
"handlers": {"console": {"level": "DEBUG", "class": "logging.StreamHandler"}},
|
||||
"loggers": {
|
||||
"django.request": {
|
||||
"handlers": ["console"],
|
||||
"propagate": True,
|
||||
"level": "DEBUG",
|
||||
},
|
||||
"": {"level": "DEBUG", "handlers": ["console"]},
|
||||
},
|
||||
}
|
||||
CSRF_TRUSTED_ORIGINS = [o for o in ALLOWED_HOSTS]
|
||||
|
|
|
@ -1,5 +1,5 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
'''
|
||||
"""
|
||||
Production Configurations
|
||||
|
||||
- Use djangosecure
|
||||
|
@ -8,12 +8,9 @@ Production Configurations
|
|||
- Use Redis on Heroku
|
||||
|
||||
|
||||
'''
|
||||
"""
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
from django.utils import six
|
||||
|
||||
|
||||
from .common import * # noqa
|
||||
|
||||
# SECRET CONFIGURATION
|
||||
|
@ -58,19 +55,24 @@ CSRF_TRUSTED_ORIGINS = ALLOWED_HOSTS
|
|||
# ------------------------------------------------------------------------------
|
||||
# Uploaded Media Files
|
||||
# ------------------------
|
||||
DEFAULT_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
|
||||
DEFAULT_FILE_STORAGE = "django.core.files.storage.FileSystemStorage"
|
||||
|
||||
# Static Assets
|
||||
# ------------------------
|
||||
STATICFILES_STORAGE = 'django.contrib.staticfiles.storage.StaticFilesStorage'
|
||||
STATICFILES_STORAGE = "django.contrib.staticfiles.storage.StaticFilesStorage"
|
||||
|
||||
# TEMPLATE CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
# See:
|
||||
# https://docs.djangoproject.com/en/dev/ref/templates/api/#django.template.loaders.cached.Loader
|
||||
TEMPLATES[0]['OPTIONS']['loaders'] = [
|
||||
('django.template.loaders.cached.Loader', [
|
||||
'django.template.loaders.filesystem.Loader', 'django.template.loaders.app_directories.Loader', ]),
|
||||
TEMPLATES[0]["OPTIONS"]["loaders"] = [
|
||||
(
|
||||
"django.template.loaders.cached.Loader",
|
||||
[
|
||||
"django.template.loaders.filesystem.Loader",
|
||||
"django.template.loaders.app_directories.Loader",
|
||||
],
|
||||
)
|
||||
]
|
||||
|
||||
# CACHING
|
||||
|
@ -78,7 +80,6 @@ TEMPLATES[0]['OPTIONS']['loaders'] = [
|
|||
# Heroku URL does not pass the DB number, so we parse it in
|
||||
|
||||
|
||||
|
||||
# LOGGING CONFIGURATION
|
||||
# ------------------------------------------------------------------------------
|
||||
# See: https://docs.djangoproject.com/en/dev/ref/settings/#logging
|
||||
|
@ -88,43 +89,39 @@ TEMPLATES[0]['OPTIONS']['loaders'] = [
|
|||
# See http://docs.djangoproject.com/en/dev/topics/logging for
|
||||
# more details on how to customize your logging configuration.
|
||||
LOGGING = {
|
||||
'version': 1,
|
||||
'disable_existing_loggers': False,
|
||||
'filters': {
|
||||
'require_debug_false': {
|
||||
'()': 'django.utils.log.RequireDebugFalse'
|
||||
"version": 1,
|
||||
"disable_existing_loggers": False,
|
||||
"filters": {"require_debug_false": {"()": "django.utils.log.RequireDebugFalse"}},
|
||||
"formatters": {
|
||||
"verbose": {
|
||||
"format": "%(levelname)s %(asctime)s %(module)s "
|
||||
"%(process)d %(thread)d %(message)s"
|
||||
}
|
||||
},
|
||||
'formatters': {
|
||||
'verbose': {
|
||||
'format': '%(levelname)s %(asctime)s %(module)s '
|
||||
'%(process)d %(thread)d %(message)s'
|
||||
"handlers": {
|
||||
"mail_admins": {
|
||||
"level": "ERROR",
|
||||
"filters": ["require_debug_false"],
|
||||
"class": "django.utils.log.AdminEmailHandler",
|
||||
},
|
||||
"console": {
|
||||
"level": "DEBUG",
|
||||
"class": "logging.StreamHandler",
|
||||
"formatter": "verbose",
|
||||
},
|
||||
},
|
||||
'handlers': {
|
||||
'mail_admins': {
|
||||
'level': 'ERROR',
|
||||
'filters': ['require_debug_false'],
|
||||
'class': 'django.utils.log.AdminEmailHandler'
|
||||
"loggers": {
|
||||
"django.request": {
|
||||
"handlers": ["mail_admins"],
|
||||
"level": "ERROR",
|
||||
"propagate": True,
|
||||
},
|
||||
'console': {
|
||||
'level': 'DEBUG',
|
||||
'class': 'logging.StreamHandler',
|
||||
'formatter': 'verbose',
|
||||
"django.security.DisallowedHost": {
|
||||
"level": "ERROR",
|
||||
"handlers": ["console", "mail_admins"],
|
||||
"propagate": True,
|
||||
},
|
||||
},
|
||||
'loggers': {
|
||||
'django.request': {
|
||||
'handlers': ['mail_admins'],
|
||||
'level': 'ERROR',
|
||||
'propagate': True
|
||||
},
|
||||
'django.security.DisallowedHost': {
|
||||
'level': 'ERROR',
|
||||
'handlers': ['console', 'mail_admins'],
|
||||
'propagate': True
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
|

@@ -5,38 +5,35 @@ from django.conf import settings
from django.conf.urls import include, url
from django.conf.urls.static import static
from django.contrib import admin
from django.views.generic import TemplateView
from django.views import defaults as default_views

urlpatterns = [
    # Django Admin, use {% url 'admin:index' %}
    url(settings.ADMIN_URL, admin.site.urls),

    url(r'^api/', include(("config.api_urls", 'api'), namespace="api")),
    url(r'^', include(
        ('funkwhale_api.federation.urls', 'federation'),
        namespace="federation")),
    url(r'^api/v1/auth/', include('rest_auth.urls')),
    url(r'^api/v1/auth/registration/', include('funkwhale_api.users.rest_auth_urls')),
    url(r'^accounts/', include('allauth.urls')),

    url(r"^api/", include(("config.api_urls", "api"), namespace="api")),
    url(
        r"^",
        include(
            ("funkwhale_api.federation.urls", "federation"), namespace="federation"
        ),
    ),
    url(r"^api/v1/auth/", include("rest_auth.urls")),
    url(r"^api/v1/auth/registration/", include("funkwhale_api.users.rest_auth_urls")),
    url(r"^accounts/", include("allauth.urls")),
    # Your stuff: custom urls includes go here


]

if settings.DEBUG:
    # This allows the error pages to be debugged during development, just visit
    # these url in browser to see how these error pages look like.
    urlpatterns += [
        url(r'^400/$', default_views.bad_request),
        url(r'^403/$', default_views.permission_denied),
        url(r'^404/$', default_views.page_not_found),
        url(r'^500/$', default_views.server_error),
        url(r"^400/$", default_views.bad_request),
        url(r"^403/$", default_views.permission_denied),
        url(r"^404/$", default_views.page_not_found),
        url(r"^500/$", default_views.server_error),
    ] + static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)

    if 'debug_toolbar' in settings.INSTALLED_APPS:
    if "debug_toolbar" in settings.INSTALLED_APPS:
        import debug_toolbar
        urlpatterns += [
            url(r'^__debug__/', include(debug_toolbar.urls)),
        ]

        urlpatterns += [url(r"^__debug__/", include(debug_toolbar.urls))]

@@ -15,11 +15,9 @@ framework.
"""
import os


from django.core.wsgi import get_wsgi_application
from whitenoise.django import DjangoWhiteNoise


# We defer to a DJANGO_SETTINGS_MODULE already in the environment. This breaks
# if running multiple sites in the same mod_wsgi process. To fix this, use
# mod_wsgi daemon mode with each site in its own daemon process, or use

@@ -1,7 +1,7 @@
from funkwhale_api.users.models import User


u = User.objects.create(email='demo@demo.com', username='demo', is_staff=True)
u.set_password('demo')
u.subsonic_api_token = 'demo'
u = User.objects.create(email="demo@demo.com", username="demo", is_staff=True)
u.set_password("demo")
u.subsonic_api_token = "demo"
u.save()

@@ -1,3 +1,8 @@
# -*- coding: utf-8 -*-
__version__ = '0.14.1'
__version_info__ = tuple([int(num) if num.isdigit() else num for num in __version__.replace('-', '.', 1).split('.')])
__version__ = "0.14.2"
__version_info__ = tuple(
    [
        int(num) if num.isdigit() else num
        for num in __version__.replace("-", ".", 1).split(".")
    ]
)

@@ -2,8 +2,9 @@ from django.apps import AppConfig, apps

from . import record


class ActivityConfig(AppConfig):
    name = 'funkwhale_api.activity'
    name = "funkwhale_api.activity"

    def ready(self):
        super(ActivityConfig, self).ready()

@@ -2,37 +2,36 @@ import persisting_theory


class ActivityRegistry(persisting_theory.Registry):
    look_into = 'activities'
    look_into = "activities"

    def _register_for_model(self, model, attr, value):
        key = model._meta.label
        d = self.setdefault(key, {'consumers': []})
        d = self.setdefault(key, {"consumers": []})
        d[attr] = value

    def register_serializer(self, serializer_class):
        model = serializer_class.Meta.model
        self._register_for_model(model, 'serializer', serializer_class)
        self._register_for_model(model, "serializer", serializer_class)
        return serializer_class

    def register_consumer(self, label):
        def decorator(func):
            consumers = self[label]['consumers']
            consumers = self[label]["consumers"]
            if func not in consumers:
                consumers.append(func)
            return func

        return decorator


registry = ActivityRegistry()




def send(obj):
    conf = registry[obj.__class__._meta.label]
    consumers = conf['consumers']
    consumers = conf["consumers"]
    if not consumers:
        return
    serializer = conf['serializer'](obj)
    serializer = conf["serializer"](obj)
    for consumer in consumers:
        consumer(data=serializer.data, obj=obj)

@@ -4,8 +4,8 @@ from funkwhale_api.activity import record


class ModelSerializer(serializers.ModelSerializer):
    id = serializers.CharField(source='get_activity_url')
    local_id = serializers.IntegerField(source='id')
    id = serializers.CharField(source="get_activity_url")
    local_id = serializers.IntegerField(source="id")
    # url = serializers.SerializerMethodField()

    def get_url(self, obj):

@@ -17,8 +17,7 @@ class AutoSerializer(serializers.Serializer):
    A serializer that will automatically use registered activity serializers
    to serialize an henerogeneous list of objects (favorites, listenings, etc.)
    """

    def to_representation(self, instance):
        serializer = record.registry[instance._meta.label]['serializer'](
            instance
        )
        serializer = record.registry[instance._meta.label]["serializer"](instance)
        return serializer.data
@ -6,31 +6,25 @@ from funkwhale_api.history.models import Listening
|
|||
|
||||
|
||||
def combined_recent(limit, **kwargs):
|
||||
datetime_field = kwargs.pop('datetime_field', 'creation_date')
|
||||
source_querysets = {
|
||||
qs.model._meta.label: qs for qs in kwargs.pop('querysets')
|
||||
}
|
||||
datetime_field = kwargs.pop("datetime_field", "creation_date")
|
||||
source_querysets = {qs.model._meta.label: qs for qs in kwargs.pop("querysets")}
|
||||
querysets = {
|
||||
k: qs.annotate(
|
||||
__type=models.Value(
|
||||
qs.model._meta.label, output_field=models.CharField()
|
||||
)
|
||||
).values('pk', datetime_field, '__type')
|
||||
__type=models.Value(qs.model._meta.label, output_field=models.CharField())
|
||||
).values("pk", datetime_field, "__type")
|
||||
for k, qs in source_querysets.items()
|
||||
}
|
||||
_qs_list = list(querysets.values())
|
||||
union_qs = _qs_list[0].union(*_qs_list[1:])
|
||||
records = []
|
||||
for row in union_qs.order_by('-{}'.format(datetime_field))[:limit]:
|
||||
records.append({
|
||||
'type': row['__type'],
|
||||
'when': row[datetime_field],
|
||||
'pk': row['pk']
|
||||
})
|
||||
for row in union_qs.order_by("-{}".format(datetime_field))[:limit]:
|
||||
records.append(
|
||||
{"type": row["__type"], "when": row[datetime_field], "pk": row["pk"]}
|
||||
)
|
||||
# Now we bulk-load each object type in turn
|
||||
to_load = {}
|
||||
for record in records:
|
||||
to_load.setdefault(record['type'], []).append(record['pk'])
|
||||
to_load.setdefault(record["type"], []).append(record["pk"])
|
||||
fetched = {}
|
||||
|
||||
for key, pks in to_load.items():
|
||||
|
@ -39,26 +33,19 @@ def combined_recent(limit, **kwargs):
|
|||
|
||||
# Annotate 'records' with loaded objects
|
||||
for record in records:
|
||||
record['object'] = fetched[(record['type'], record['pk'])]
|
||||
record["object"] = fetched[(record["type"], record["pk"])]
|
||||
return records
|
||||
|
||||
|
||||
def get_activity(user, limit=20):
|
||||
query = fields.privacy_level_query(
|
||||
user, lookup_field='user__privacy_level')
|
||||
query = fields.privacy_level_query(user, lookup_field="user__privacy_level")
|
||||
querysets = [
|
||||
Listening.objects.filter(query).select_related(
|
||||
'track',
|
||||
'user',
|
||||
'track__artist',
|
||||
'track__album__artist',
|
||||
"track", "user", "track__artist", "track__album__artist"
|
||||
),
|
||||
TrackFavorite.objects.filter(query).select_related(
|
||||
'track',
|
||||
'user',
|
||||
'track__artist',
|
||||
'track__album__artist',
|
||||
"track", "user", "track__artist", "track__album__artist"
|
||||
),
|
||||
]
|
||||
records = combined_recent(limit=limit, querysets=querysets)
|
||||
return [r['object'] for r in records]
|
||||
return [r["object"] for r in records]
|
||||
|
|
|

@@ -4,8 +4,7 @@ from rest_framework.response import Response
from funkwhale_api.common.permissions import ConditionalAuthentication
from funkwhale_api.favorites.models import TrackFavorite

from . import serializers
from . import utils
from . import serializers, utils


class ActivityViewSet(viewsets.GenericViewSet):

@@ -17,4 +16,4 @@ class ActivityViewSet(viewsets.GenericViewSet):
    def list(self, request, *args, **kwargs):
        activity = utils.get_activity(user=request.user)
        serializer = self.serializer_class(activity, many=True)
        return Response({'results': serializer.data}, status=200)
        return Response({"results": serializer.data}, status=200)
@ -1,12 +1,7 @@
|
|||
from urllib.parse import parse_qs
|
||||
|
||||
import jwt
|
||||
|
||||
from django.contrib.auth.models import AnonymousUser
|
||||
from django.utils.encoding import smart_text
|
||||
|
||||
from rest_framework import exceptions
|
||||
from rest_framework_jwt.settings import api_settings
|
||||
from rest_framework_jwt.authentication import BaseJSONWebTokenAuthentication
|
||||
|
||||
from funkwhale_api.users.models import User
|
||||
|
@ -16,20 +11,19 @@ class TokenHeaderAuth(BaseJSONWebTokenAuthentication):
|
|||
def get_jwt_value(self, request):
|
||||
|
||||
try:
|
||||
qs = request.get('query_string', b'').decode('utf-8')
|
||||
qs = request.get("query_string", b"").decode("utf-8")
|
||||
parsed = parse_qs(qs)
|
||||
token = parsed['token'][0]
|
||||
token = parsed["token"][0]
|
||||
except KeyError:
|
||||
raise exceptions.AuthenticationFailed('No token')
|
||||
raise exceptions.AuthenticationFailed("No token")
|
||||
|
||||
if not token:
|
||||
raise exceptions.AuthenticationFailed('Empty token')
|
||||
raise exceptions.AuthenticationFailed("Empty token")
|
||||
|
||||
return token
|
||||
|
||||
|
||||
class TokenAuthMiddleware:
|
||||
|
||||
def __init__(self, inner):
|
||||
# Store the ASGI application we were passed
|
||||
self.inner = inner
|
||||
|
@ -41,5 +35,5 @@ class TokenAuthMiddleware:
|
|||
except (User.DoesNotExist, exceptions.AuthenticationFailed):
|
||||
user = AnonymousUser()
|
||||
|
||||
scope['user'] = user
|
||||
scope["user"] = user
|
||||
return self.inner(scope)
|
||||
|
|
|
@ -1,39 +1,38 @@
|
|||
from django.utils.encoding import smart_text
|
||||
from django.utils.translation import ugettext as _
|
||||
|
||||
from rest_framework import exceptions
|
||||
from rest_framework_jwt import authentication
|
||||
from rest_framework_jwt.settings import api_settings
|
||||
|
||||
|
||||
class JSONWebTokenAuthenticationQS(
|
||||
authentication.BaseJSONWebTokenAuthentication):
|
||||
class JSONWebTokenAuthenticationQS(authentication.BaseJSONWebTokenAuthentication):
|
||||
|
||||
www_authenticate_realm = 'api'
|
||||
www_authenticate_realm = "api"
|
||||
|
||||
def get_jwt_value(self, request):
|
||||
token = request.query_params.get('jwt')
|
||||
if 'jwt' in request.query_params and not token:
|
||||
msg = _('Invalid Authorization header. No credentials provided.')
|
||||
token = request.query_params.get("jwt")
|
||||
if "jwt" in request.query_params and not token:
|
||||
msg = _("Invalid Authorization header. No credentials provided.")
|
||||
raise exceptions.AuthenticationFailed(msg)
|
||||
return token
|
||||
|
||||
def authenticate_header(self, request):
|
||||
return '{0} realm="{1}"'.format(
|
||||
api_settings.JWT_AUTH_HEADER_PREFIX, self.www_authenticate_realm)
|
||||
api_settings.JWT_AUTH_HEADER_PREFIX, self.www_authenticate_realm
|
||||
)
|
||||
|
||||
|
||||
class BearerTokenHeaderAuth(
|
||||
authentication.BaseJSONWebTokenAuthentication):
|
||||
class BearerTokenHeaderAuth(authentication.BaseJSONWebTokenAuthentication):
|
||||
"""
|
||||
For backward compatibility purpose, we used Authorization: JWT <token>
|
||||
but Authorization: Bearer <token> is probably better.
|
||||
"""
|
||||
www_authenticate_realm = 'api'
|
||||
|
||||
www_authenticate_realm = "api"
|
||||
|
||||
def get_jwt_value(self, request):
|
||||
auth = authentication.get_authorization_header(request).split()
|
||||
auth_header_prefix = 'bearer'
|
||||
auth_header_prefix = "bearer"
|
||||
|
||||
if not auth:
|
||||
if api_settings.JWT_AUTH_COOKIE:
|
||||
|
@ -44,14 +43,16 @@ class BearerTokenHeaderAuth(
|
|||
return None
|
||||
|
||||
if len(auth) == 1:
|
||||
msg = _('Invalid Authorization header. No credentials provided.')
|
||||
msg = _("Invalid Authorization header. No credentials provided.")
|
||||
raise exceptions.AuthenticationFailed(msg)
|
||||
elif len(auth) > 2:
|
||||
msg = _('Invalid Authorization header. Credentials string '
|
||||
'should not contain spaces.')
|
||||
msg = _(
|
||||
"Invalid Authorization header. Credentials string "
|
||||
"should not contain spaces."
|
||||
)
|
||||
raise exceptions.AuthenticationFailed(msg)
|
||||
|
||||
return auth[1]
|
||||
|
||||
def authenticate_header(self, request):
|
||||
return '{0} realm="{1}"'.format('Bearer', self.www_authenticate_realm)
|
||||
return '{0} realm="{1}"'.format("Bearer", self.www_authenticate_realm)
|
||||
|
|
|

@@ -1,11 +1,12 @@
from channels.generic.websocket import JsonWebsocketConsumer

from funkwhale_api.common import channels


class JsonAuthConsumer(JsonWebsocketConsumer):
    def connect(self):
        try:
            assert self.scope['user'].pk is not None
            assert self.scope["user"].pk is not None
        except (AssertionError, AttributeError, KeyError):
            return self.close()

@ -3,18 +3,19 @@ from dynamic_preferences.registries import global_preferences_registry
|
|||
|
||||
from funkwhale_api.common import preferences
|
||||
|
||||
common = types.Section('common')
|
||||
common = types.Section("common")
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class APIAutenticationRequired(
|
||||
preferences.DefaultFromSettingMixin, types.BooleanPreference):
|
||||
preferences.DefaultFromSettingMixin, types.BooleanPreference
|
||||
):
|
||||
section = common
|
||||
name = 'api_authentication_required'
|
||||
verbose_name = 'API Requires authentication'
|
||||
setting = 'API_AUTHENTICATION_REQUIRED'
|
||||
name = "api_authentication_required"
|
||||
verbose_name = "API Requires authentication"
|
||||
setting = "API_AUTHENTICATION_REQUIRED"
|
||||
help_text = (
|
||||
'If disabled, anonymous users will be able to query the API'
|
||||
'and access music data (as well as other data exposed in the API '
|
||||
'without specific permissions).'
|
||||
"If disabled, anonymous users will be able to query the API"
|
||||
"and access music data (as well as other data exposed in the API "
|
||||
"without specific permissions)."
|
||||
)
|
||||
|
|
|
@ -1,39 +1,34 @@
|
|||
import django_filters
|
||||
|
||||
from django.db import models
|
||||
|
||||
from funkwhale_api.music import utils
|
||||
|
||||
|
||||
PRIVACY_LEVEL_CHOICES = [
|
||||
('me', 'Only me'),
|
||||
('followers', 'Me and my followers'),
|
||||
('instance', 'Everyone on my instance, and my followers'),
|
||||
('everyone', 'Everyone, including people on other instances'),
|
||||
("me", "Only me"),
|
||||
("followers", "Me and my followers"),
|
||||
("instance", "Everyone on my instance, and my followers"),
|
||||
("everyone", "Everyone, including people on other instances"),
|
||||
]
|
||||
|
||||
|
||||
def get_privacy_field():
|
||||
return models.CharField(
|
||||
max_length=30, choices=PRIVACY_LEVEL_CHOICES, default='instance')
|
||||
max_length=30, choices=PRIVACY_LEVEL_CHOICES, default="instance"
|
||||
)
|
||||
|
||||
|
||||
def privacy_level_query(user, lookup_field='privacy_level'):
|
||||
def privacy_level_query(user, lookup_field="privacy_level"):
|
||||
if user.is_anonymous:
|
||||
return models.Q(**{
|
||||
lookup_field: 'everyone',
|
||||
})
|
||||
return models.Q(**{lookup_field: "everyone"})
|
||||
|
||||
return models.Q(**{
|
||||
'{}__in'.format(lookup_field): [
|
||||
'followers', 'instance', 'everyone'
|
||||
]
|
||||
})
|
||||
return models.Q(
|
||||
**{"{}__in".format(lookup_field): ["followers", "instance", "everyone"]}
|
||||
)
|
||||
|
||||
|
||||
class SearchFilter(django_filters.CharFilter):
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.search_fields = kwargs.pop('search_fields')
|
||||
self.search_fields = kwargs.pop("search_fields")
|
||||
super().__init__(*args, **kwargs)
|
||||
|
||||
def filter(self, qs, value):
|
||||
|
|
|
@ -4,17 +4,20 @@ from funkwhale_api.common import scripts
|
|||
|
||||
|
||||
class Command(BaseCommand):
|
||||
help = 'Run a specific script from funkwhale_api/common/scripts/'
|
||||
help = "Run a specific script from funkwhale_api/common/scripts/"
|
||||
|
||||
def add_arguments(self, parser):
|
||||
parser.add_argument('script_name', nargs='?', type=str)
|
||||
parser.add_argument("script_name", nargs="?", type=str)
|
||||
parser.add_argument(
|
||||
'--noinput', '--no-input', action='store_false', dest='interactive',
|
||||
"--noinput",
|
||||
"--no-input",
|
||||
action="store_false",
|
||||
dest="interactive",
|
||||
help="Do NOT prompt the user for input of any kind.",
|
||||
)
|
||||
|
||||
def handle(self, *args, **options):
|
||||
name = options['script_name']
|
||||
name = options["script_name"]
|
||||
if not name:
|
||||
self.show_help()
|
||||
|
||||
|
@ -23,44 +26,43 @@ class Command(BaseCommand):
|
|||
script = available_scripts[name]
|
||||
except KeyError:
|
||||
raise CommandError(
|
||||
'{} is not a valid script. Run python manage.py script for a '
|
||||
'list of available scripts'.format(name))
|
||||
"{} is not a valid script. Run python manage.py script for a "
|
||||
"list of available scripts".format(name)
|
||||
)
|
||||
|
||||
self.stdout.write('')
|
||||
if options['interactive']:
|
||||
self.stdout.write("")
|
||||
if options["interactive"]:
|
||||
message = (
|
||||
'Are you sure you want to execute the script {}?\n\n'
|
||||
"Are you sure you want to execute the script {}?\n\n"
|
||||
"Type 'yes' to continue, or 'no' to cancel: "
|
||||
).format(name)
|
||||
if input(''.join(message)) != 'yes':
|
||||
if input("".join(message)) != "yes":
|
||||
raise CommandError("Script cancelled.")
|
||||
script['entrypoint'](self, **options)
|
||||
script["entrypoint"](self, **options)
|
||||
|
||||
def show_help(self):
|
||||
indentation = 4
|
||||
self.stdout.write('')
|
||||
self.stdout.write('Available scripts:')
|
||||
self.stdout.write('Launch with: python manage.py <script_name>')
|
||||
self.stdout.write("")
|
||||
self.stdout.write("Available scripts:")
|
||||
self.stdout.write("Launch with: python manage.py <script_name>")
|
||||
available_scripts = self.get_scripts()
|
||||
for name, script in sorted(available_scripts.items()):
|
||||
self.stdout.write('')
|
||||
self.stdout.write("")
|
||||
self.stdout.write(self.style.SUCCESS(name))
|
||||
self.stdout.write('')
|
||||
for line in script['help'].splitlines():
|
||||
self.stdout.write(' {}'.format(line))
|
||||
self.stdout.write('')
|
||||
self.stdout.write("")
|
||||
for line in script["help"].splitlines():
|
||||
self.stdout.write(" {}".format(line))
|
||||
self.stdout.write("")
|
||||
|
||||
def get_scripts(self):
|
||||
available_scripts = [
|
||||
k for k in sorted(scripts.__dict__.keys())
|
||||
if not k.startswith('__')
|
||||
k for k in sorted(scripts.__dict__.keys()) if not k.startswith("__")
|
||||
]
|
||||
data = {}
|
||||
for name in available_scripts:
|
||||
module = getattr(scripts, name)
|
||||
data[name] = {
|
||||
'name': name,
|
||||
'help': module.__doc__.strip(),
|
||||
'entrypoint': module.main
|
||||
"name": name,
|
||||
"help": module.__doc__.strip(),
|
||||
"entrypoint": module.main,
|
||||
}
|
||||
return data
|
||||
|
|
|
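For reference, the command above can also be driven from Python through Django's standard call_command; this is a sketch, using the bundled "test" script that appears later in this diff.

# Sketch: run a registered script without prompting, mirroring
# `python manage.py script test --no-input`.
from django.core.management import call_command

call_command("script", "test", interactive=False)

# With no script name, the command only prints the list of available scripts.
call_command("script")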
@ -7,6 +7,4 @@ class Migration(migrations.Migration):

    dependencies = []

    operations = [
        UnaccentExtension()
    ]
    operations = [UnaccentExtension()]
@ -2,5 +2,5 @@ from rest_framework.pagination import PageNumberPagination


class FunkwhalePagination(PageNumberPagination):
    page_size_query_param = 'page_size'
    page_size_query_param = "page_size"
    max_page_size = 50
@ -1,17 +1,14 @@
import operator

from django.conf import settings
from django.http import Http404

from rest_framework.permissions import BasePermission

from funkwhale_api.common import preferences


class ConditionalAuthentication(BasePermission):

    def has_permission(self, request, view):
        if preferences.get('common__api_authentication_required'):
        if preferences.get("common__api_authentication_required"):
            return request.user and request.user.is_authenticated
        return True
@ -28,24 +25,25 @@ class OwnerPermission(BasePermission):
    owner_field = 'owner'
    owner_checks = ['read', 'write']
    """

    perms_map = {
        'GET': 'read',
        'OPTIONS': 'read',
        'HEAD': 'read',
        'POST': 'write',
        'PUT': 'write',
        'PATCH': 'write',
        'DELETE': 'write',
        "GET": "read",
        "OPTIONS": "read",
        "HEAD": "read",
        "POST": "write",
        "PUT": "write",
        "PATCH": "write",
        "DELETE": "write",
    }

    def has_object_permission(self, request, view, obj):
        method_check = self.perms_map[request.method]
        owner_checks = getattr(view, 'owner_checks', ['read', 'write'])
        owner_checks = getattr(view, "owner_checks", ["read", "write"])
        if method_check not in owner_checks:
            # check not enabled
            return True

        owner_field = getattr(view, 'owner_field', 'user')
        owner_field = getattr(view, "owner_field", "user")
        owner = operator.attrgetter(owner_field)(obj)
        if owner != request.user:
            raise Http404
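A hedged sketch of how a view opts into this permission; the viewset name and its model are placeholders, only owner_field and owner_checks come from the code above.

# Hypothetical viewset guarded by OwnerPermission: reads stay open, writes are
# limited to the object's owner resolved through `owner_field`.
from rest_framework import viewsets

from funkwhale_api.common import permissions


class RadioViewSet(viewsets.ModelViewSet):  # placeholder viewset
    permission_classes = [permissions.OwnerPermission]
    owner_checks = ["write"]  # enforce ownership only for POST/PUT/PATCH/DELETE
    owner_field = "user"      # resolved with operator.attrgetter on the object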
@ -1,8 +1,6 @@
|
|||
from django.conf import settings
|
||||
from django import forms
|
||||
|
||||
from dynamic_preferences import serializers
|
||||
from dynamic_preferences import types
|
||||
from django.conf import settings
|
||||
from dynamic_preferences import serializers, types
|
||||
from dynamic_preferences.registries import global_preferences_registry
|
||||
|
||||
|
||||
|
@ -17,7 +15,7 @@ def get(pref):
|
|||
|
||||
|
||||
class StringListSerializer(serializers.BaseSerializer):
|
||||
separator = ','
|
||||
separator = ","
|
||||
sort = True
|
||||
|
||||
@classmethod
|
||||
|
@ -27,8 +25,8 @@ class StringListSerializer(serializers.BaseSerializer):
|
|||
|
||||
if type(value) not in [list, tuple]:
|
||||
raise cls.exception(
|
||||
"Cannot serialize, value {} is not a list or a tuple".format(
|
||||
value))
|
||||
"Cannot serialize, value {} is not a list or a tuple".format(value)
|
||||
)
|
||||
|
||||
if cls.sort:
|
||||
value = sorted(value)
|
||||
|
@ -38,7 +36,7 @@ class StringListSerializer(serializers.BaseSerializer):
|
|||
def to_python(cls, value, **kwargs):
|
||||
if not value:
|
||||
return []
|
||||
return value.split(',')
|
||||
return value.split(",")
|
||||
|
||||
|
||||
class StringListPreference(types.BasePreferenceType):
|
||||
|
@ -47,5 +45,5 @@ class StringListPreference(types.BasePreferenceType):
|
|||
|
||||
def get_api_additional_data(self):
|
||||
d = super(StringListPreference, self).get_api_additional_data()
|
||||
d['choices'] = self.get('choices')
|
||||
d["choices"] = self.get("choices")
|
||||
return d
|
||||
|
|
|
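A small sketch of the round trip these hooks implement, relying only on the behaviour shown above (comma-separated storage, sorted output, empty string mapping to an empty list); the import path is assumed to be funkwhale_api.common.preferences.

# Sketch: to_python splits the stored string back into a list.
from funkwhale_api.common.preferences import StringListSerializer

assert StringListSerializer.to_python("bar,foo") == ["bar", "foo"]
assert StringListSerializer.to_python("") == []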
@ -1,2 +0,0 @@
from . import django_permissions_to_user_permissions
from . import test
@ -2,28 +2,28 @@
|
|||
Convert django permissions to user permissions in the database,
|
||||
following the work done in #152.
|
||||
"""
|
||||
from django.contrib.auth.models import Permission
|
||||
from django.db.models import Q
|
||||
|
||||
from funkwhale_api.users import models
|
||||
|
||||
from django.contrib.auth.models import Permission
|
||||
|
||||
mapping = {
|
||||
'dynamic_preferences.change_globalpreferencemodel': 'settings',
|
||||
'music.add_importbatch': 'library',
|
||||
'federation.change_library': 'federation',
|
||||
"dynamic_preferences.change_globalpreferencemodel": "settings",
|
||||
"music.add_importbatch": "library",
|
||||
"federation.change_library": "federation",
|
||||
}
|
||||
|
||||
|
||||
def main(command, **kwargs):
|
||||
for codename, user_permission in sorted(mapping.items()):
|
||||
app_label, c = codename.split('.')
|
||||
p = Permission.objects.get(
|
||||
content_type__app_label=app_label, codename=c)
|
||||
app_label, c = codename.split(".")
|
||||
p = Permission.objects.get(content_type__app_label=app_label, codename=c)
|
||||
users = models.User.objects.filter(
|
||||
Q(groups__permissions=p) | Q(user_permissions=p)).distinct()
|
||||
Q(groups__permissions=p) | Q(user_permissions=p)
|
||||
).distinct()
|
||||
total = users.count()
|
||||
|
||||
command.stdout.write('Updating {} users with {} permission...'.format(
|
||||
total, user_permission
|
||||
))
|
||||
users.update(**{'permission_{}'.format(user_permission): True})
|
||||
command.stdout.write(
|
||||
"Updating {} users with {} permission...".format(total, user_permission)
|
||||
)
|
||||
users.update(**{"permission_{}".format(user_permission): True})
|
||||
|
|
|
@ -5,4 +5,4 @@ You can launch it just to check how it works.


def main(command, **kwargs):
    command.stdout.write('Test script run successfully')
    command.stdout.write("Test script run successfully")
@ -17,67 +17,67 @@ class ActionSerializer(serializers.Serializer):
|
|||
dangerous_actions = []
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.queryset = kwargs.pop('queryset')
|
||||
self.queryset = kwargs.pop("queryset")
|
||||
if self.actions is None:
|
||||
raise ValueError(
|
||||
'You must declare a list of actions on '
|
||||
'the serializer class')
|
||||
"You must declare a list of actions on " "the serializer class"
|
||||
)
|
||||
|
||||
for action in self.actions:
|
||||
handler_name = 'handle_{}'.format(action)
|
||||
assert hasattr(self, handler_name), (
|
||||
'{} miss a {} method'.format(
|
||||
self.__class__.__name__, handler_name)
|
||||
handler_name = "handle_{}".format(action)
|
||||
assert hasattr(self, handler_name), "{} miss a {} method".format(
|
||||
self.__class__.__name__, handler_name
|
||||
)
|
||||
super().__init__(self, *args, **kwargs)
|
||||
|
||||
def validate_action(self, value):
|
||||
if value not in self.actions:
|
||||
raise serializers.ValidationError(
|
||||
'{} is not a valid action. Pick one of {}.'.format(
|
||||
value, ', '.join(self.actions)
|
||||
"{} is not a valid action. Pick one of {}.".format(
|
||||
value, ", ".join(self.actions)
|
||||
)
|
||||
)
|
||||
return value
|
||||
|
||||
def validate_objects(self, value):
|
||||
qs = None
|
||||
if value == 'all':
|
||||
return self.queryset.all().order_by('id')
|
||||
if value == "all":
|
||||
return self.queryset.all().order_by("id")
|
||||
if type(value) in [list, tuple]:
|
||||
return self.queryset.filter(pk__in=value).order_by('id')
|
||||
return self.queryset.filter(pk__in=value).order_by("id")
|
||||
|
||||
raise serializers.ValidationError(
|
||||
'{} is not a valid value for objects. You must provide either a '
|
||||
'list of identifiers or the string "all".'.format(value))
|
||||
"{} is not a valid value for objects. You must provide either a "
|
||||
'list of identifiers or the string "all".'.format(value)
|
||||
)
|
||||
|
||||
def validate(self, data):
|
||||
dangerous = data['action'] in self.dangerous_actions
|
||||
if dangerous and self.initial_data['objects'] == 'all':
|
||||
dangerous = data["action"] in self.dangerous_actions
|
||||
if dangerous and self.initial_data["objects"] == "all":
|
||||
raise serializers.ValidationError(
|
||||
'This action is to dangerous to be applied to all objects')
|
||||
if self.filterset_class and 'filters' in data:
|
||||
"This action is to dangerous to be applied to all objects"
|
||||
)
|
||||
if self.filterset_class and "filters" in data:
|
||||
qs_filterset = self.filterset_class(
|
||||
data['filters'], queryset=data['objects'])
|
||||
data["filters"], queryset=data["objects"]
|
||||
)
|
||||
try:
|
||||
assert qs_filterset.form.is_valid()
|
||||
except (AssertionError, TypeError):
|
||||
raise serializers.ValidationError('Invalid filters')
|
||||
data['objects'] = qs_filterset.qs
|
||||
raise serializers.ValidationError("Invalid filters")
|
||||
data["objects"] = qs_filterset.qs
|
||||
|
||||
data['count'] = data['objects'].count()
|
||||
if data['count'] < 1:
|
||||
raise serializers.ValidationError(
|
||||
'No object matching your request')
|
||||
data["count"] = data["objects"].count()
|
||||
if data["count"] < 1:
|
||||
raise serializers.ValidationError("No object matching your request")
|
||||
return data
|
||||
|
||||
def save(self):
|
||||
handler_name = 'handle_{}'.format(self.validated_data['action'])
|
||||
handler_name = "handle_{}".format(self.validated_data["action"])
|
||||
handler = getattr(self, handler_name)
|
||||
result = handler(self.validated_data['objects'])
|
||||
result = handler(self.validated_data["objects"])
|
||||
payload = {
|
||||
'updated': self.validated_data['count'],
|
||||
'action': self.validated_data['action'],
|
||||
'result': result,
|
||||
"updated": self.validated_data["count"],
|
||||
"action": self.validated_data["action"],
|
||||
"result": result,
|
||||
}
|
||||
return payload
|
||||
|
|
|
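A hedged sketch of the subclass contract this base serializer expects. The module path, the Note queryset and the delete action are illustrative; only the actions/dangerous_actions attributes and the handle_<action> naming come from the code above.

# Hypothetical bulk-action serializer: one handle_<action> method per action,
# instantiated with the queryset the action may touch.
from funkwhale_api.common import serializers as common_serializers


class NoteActionSerializer(common_serializers.ActionSerializer):  # placeholder
    actions = ["delete"]
    dangerous_actions = ["delete"]  # "all" is refused for this action
    filterset_class = None          # no filter support in this sketch (assumption)

    def handle_delete(self, objects):
        return objects.delete()

# Typical flow (placeholders):
# serializer = NoteActionSerializer(
#     data={"action": "delete", "objects": [1, 2]}, queryset=Note.objects.all()
# )
# serializer.is_valid(raise_exception=True)
# payload = serializer.save()  # {"updated": ..., "action": "delete", "result": ...}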
@ -1,18 +1,16 @@
import requests

from django.conf import settings

import funkwhale_api


def get_user_agent():
    return 'python-requests (funkwhale/{}; +{})'.format(
        funkwhale_api.__version__,
        settings.FUNKWHALE_URL
    return "python-requests (funkwhale/{}; +{})".format(
        funkwhale_api.__version__, settings.FUNKWHALE_URL
    )


def get_session():
    s = requests.Session()
    s.headers['User-Agent'] = get_user_agent()
    s.headers["User-Agent"] = get_user_agent()
    return s
@ -7,6 +7,7 @@ class ASCIIFileSystemStorage(FileSystemStorage):
    """
    Convert unicode characters in name to ASCII characters.
    """

    def get_valid_name(self, name):
        name = unicodedata.normalize('NFKD', name).encode('ascii', 'ignore')
        name = unicodedata.normalize("NFKD", name).encode("ascii", "ignore")
        return super().get_valid_name(name)
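For illustration, what the normalization step does before Django's own cleaning:

# Sketch: NFKD + ASCII strips accents, so "café.mp3" becomes b"cafe.mp3" before
# FileSystemStorage.get_valid_name applies its usual sanitization.
import unicodedata

name = unicodedata.normalize("NFKD", "café.mp3").encode("ascii", "ignore")
assert name == b"cafe.mp3"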
@ -1,6 +1,6 @@
from urllib.parse import urlencode, parse_qs, urlsplit, urlunsplit
import os
import shutil
from urllib.parse import parse_qs, urlencode, urlsplit, urlunsplit

from django.db import transaction

@ -9,13 +9,13 @@ def rename_file(instance, field_name, new_name, allow_missing_file=False):
    field = getattr(instance, field_name)
    current_name, extension = os.path.splitext(field.name)

    new_name_with_extension = '{}{}'.format(new_name, extension)
    new_name_with_extension = "{}{}".format(new_name, extension)
    try:
        shutil.move(field.path, new_name_with_extension)
    except FileNotFoundError:
        if not allow_missing_file:
            raise
        print('Skipped missing file', field.path)
        print("Skipped missing file", field.path)
    initial_path = os.path.dirname(field.name)
    field.name = os.path.join(initial_path, new_name_with_extension)
    instance.save()

@ -23,9 +23,7 @@ def rename_file(instance, field_name, new_name, allow_missing_file=False):


def on_commit(f, *args, **kwargs):
    return transaction.on_commit(
        lambda: f(*args, **kwargs)
    )
    return transaction.on_commit(lambda: f(*args, **kwargs))


def set_query_parameter(url, **kwargs):
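A brief sketch of the intended use of on_commit: defer a side effect, typically a Celery dispatch, until the surrounding transaction has committed. The import_job_run task appears later in this diff; the keyword argument name is an assumption.

# Sketch: the worker only sees the job once the atomic block has committed.
from django.db import transaction

from funkwhale_api.common import utils as funkwhale_utils
from funkwhale_api.music import tasks as music_tasks


def schedule_import(job):
    with transaction.atomic():
        # ... create or update the import job here ...
        funkwhale_utils.on_commit(
            music_tasks.import_job_run.delay, import_job_id=job.pk  # kwarg name assumed
        )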
@ -7,25 +7,39 @@ import django.contrib.sites.models
|
|||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
]
|
||||
dependencies = []
|
||||
|
||||
operations = [
|
||||
migrations.CreateModel(
|
||||
name='Site',
|
||||
name="Site",
|
||||
fields=[
|
||||
('id', models.AutoField(verbose_name='ID', primary_key=True, serialize=False, auto_created=True)),
|
||||
('domain', models.CharField(verbose_name='domain name', max_length=100, validators=[django.contrib.sites.models._simple_domain_name_validator])),
|
||||
('name', models.CharField(verbose_name='display name', max_length=50)),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
verbose_name="ID",
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
auto_created=True,
|
||||
),
|
||||
),
|
||||
(
|
||||
"domain",
|
||||
models.CharField(
|
||||
verbose_name="domain name",
|
||||
max_length=100,
|
||||
validators=[
|
||||
django.contrib.sites.models._simple_domain_name_validator
|
||||
],
|
||||
),
|
||||
),
|
||||
("name", models.CharField(verbose_name="display name", max_length=50)),
|
||||
],
|
||||
options={
|
||||
'verbose_name_plural': 'sites',
|
||||
'verbose_name': 'site',
|
||||
'db_table': 'django_site',
|
||||
'ordering': ('domain',),
|
||||
"verbose_name_plural": "sites",
|
||||
"verbose_name": "site",
|
||||
"db_table": "django_site",
|
||||
"ordering": ("domain",),
|
||||
},
|
||||
managers=[
|
||||
('objects', django.contrib.sites.models.SiteManager()),
|
||||
],
|
||||
),
|
||||
managers=[("objects", django.contrib.sites.models.SiteManager())],
|
||||
)
|
||||
]
|
||||
|
|
|
@ -10,10 +10,7 @@ def update_site_forward(apps, schema_editor):
|
|||
Site = apps.get_model("sites", "Site")
|
||||
Site.objects.update_or_create(
|
||||
id=settings.SITE_ID,
|
||||
defaults={
|
||||
"domain": "funkwhale.io",
|
||||
"name": "funkwhale_api"
|
||||
}
|
||||
defaults={"domain": "funkwhale.io", "name": "funkwhale_api"},
|
||||
)
|
||||
|
||||
|
||||
|
@ -21,20 +18,12 @@ def update_site_backward(apps, schema_editor):
|
|||
"""Revert site domain and name to default."""
|
||||
Site = apps.get_model("sites", "Site")
|
||||
Site.objects.update_or_create(
|
||||
id=settings.SITE_ID,
|
||||
defaults={
|
||||
"domain": "example.com",
|
||||
"name": "example.com"
|
||||
}
|
||||
id=settings.SITE_ID, defaults={"domain": "example.com", "name": "example.com"}
|
||||
)
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('sites', '0001_initial'),
|
||||
]
|
||||
dependencies = [("sites", "0001_initial")]
|
||||
|
||||
operations = [
|
||||
migrations.RunPython(update_site_forward, update_site_backward),
|
||||
]
|
||||
operations = [migrations.RunPython(update_site_forward, update_site_backward)]
|
||||
|
|
|
@ -8,20 +8,21 @@ from django.db import migrations, models
|
|||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('sites', '0002_set_site_domain_and_name'),
|
||||
]
|
||||
dependencies = [("sites", "0002_set_site_domain_and_name")]
|
||||
|
||||
operations = [
|
||||
migrations.AlterModelManagers(
|
||||
name='site',
|
||||
managers=[
|
||||
('objects', django.contrib.sites.models.SiteManager()),
|
||||
],
|
||||
name="site",
|
||||
managers=[("objects", django.contrib.sites.models.SiteManager())],
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='site',
|
||||
name='domain',
|
||||
field=models.CharField(max_length=100, unique=True, validators=[django.contrib.sites.models._simple_domain_name_validator], verbose_name='domain name'),
|
||||
model_name="site",
|
||||
name="domain",
|
||||
field=models.CharField(
|
||||
max_length=100,
|
||||
unique=True,
|
||||
validators=[django.contrib.sites.models._simple_domain_name_validator],
|
||||
verbose_name="domain name",
|
||||
),
|
||||
),
|
||||
]
|
||||
|
|
|
@ -1,2 +1,3 @@
from .downloader import download

__all__ = ["download"]
@ -1,26 +1,19 @@
import os
import json
from urllib.parse import quote_plus

import youtube_dl
from django.conf import settings
import glob


def download(
    url,
    target_directory=settings.MEDIA_ROOT,
    name="%(id)s.%(ext)s",
    bitrate=192):
    url, target_directory=settings.MEDIA_ROOT, name="%(id)s.%(ext)s", bitrate=192
):
    target_path = os.path.join(target_directory, name)
    ydl_opts = {
        'quiet': True,
        'outtmpl': target_path,
        'postprocessors': [{
            'key': 'FFmpegExtractAudio',
            'preferredcodec': 'vorbis',
        }],
        "quiet": True,
        "outtmpl": target_path,
        "postprocessors": [{"key": "FFmpegExtractAudio", "preferredcodec": "vorbis"}],
    }
    _downloader = youtube_dl.YoutubeDL(ydl_opts)
    info = _downloader.extract_info(url)
    info['audio_file_path'] = target_path % {'id': info['id'], 'ext': 'ogg'}
    info["audio_file_path"] = target_path % {"id": info["id"], "ext": "ogg"}
    return info
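A hedged usage sketch; the URL is a placeholder, the package path is assumed to be funkwhale_api.downloader, and note that the bitrate argument is accepted but not forwarded to youtube-dl in the code above.

# Sketch: fetch a remote track as Ogg Vorbis into MEDIA_ROOT and read back the
# path derived from the extractor id.
from funkwhale_api.downloader import download

info = download("https://www.youtube.com/watch?v=placeholder")  # placeholder URL
print(info["id"], info["audio_file_path"])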
@ -3,7 +3,7 @@ import persisting_theory


class FactoriesRegistry(persisting_theory.Registry):
    look_into = 'factories'
    look_into = "factories"

    def prepare_name(self, data, name=None):
        return name or data._meta.model._meta.label
@ -1,19 +1,16 @@
|
|||
from funkwhale_api.common import channels
|
||||
from funkwhale_api.activity import record
|
||||
from funkwhale_api.common import channels
|
||||
|
||||
from . import serializers
|
||||
|
||||
record.registry.register_serializer(
|
||||
serializers.TrackFavoriteActivitySerializer)
|
||||
record.registry.register_serializer(serializers.TrackFavoriteActivitySerializer)
|
||||
|
||||
|
||||
@record.registry.register_consumer('favorites.TrackFavorite')
|
||||
@record.registry.register_consumer("favorites.TrackFavorite")
|
||||
def broadcast_track_favorite_to_instance_activity(data, obj):
|
||||
if obj.user.privacy_level not in ['instance', 'everyone']:
|
||||
if obj.user.privacy_level not in ["instance", "everyone"]:
|
||||
return
|
||||
|
||||
channels.group_send('instance_activity', {
|
||||
'type': 'event.send',
|
||||
'text': '',
|
||||
'data': data
|
||||
})
|
||||
channels.group_send(
|
||||
"instance_activity", {"type": "event.send", "text": "", "data": data}
|
||||
)
|
||||
|
|
|
@ -5,8 +5,5 @@ from . import models

@admin.register(models.TrackFavorite)
class TrackFavoriteAdmin(admin.ModelAdmin):
    list_display = ['user', 'track', 'creation_date']
    list_select_related = [
        'user',
        'track'
    ]
    list_display = ["user", "track", "creation_date"]
    list_select_related = ["user", "track"]
@ -1,7 +1,6 @@
import factory

from funkwhale_api.factories import registry

from funkwhale_api.music.factories import TrackFactory
from funkwhale_api.users.factories import UserFactory

@ -12,4 +11,4 @@ class TrackFavorite(factory.django.DjangoModelFactory):
    user = factory.SubFactory(UserFactory)

    class Meta:
        model = 'favorites.TrackFavorite'
        model = "favorites.TrackFavorite"
@ -9,25 +9,47 @@ from django.conf import settings
|
|||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('music', '0003_auto_20151222_2233'),
|
||||
("music", "0003_auto_20151222_2233"),
|
||||
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.CreateModel(
|
||||
name='TrackFavorite',
|
||||
name="TrackFavorite",
|
||||
fields=[
|
||||
('id', models.AutoField(serialize=False, auto_created=True, verbose_name='ID', primary_key=True)),
|
||||
('creation_date', models.DateTimeField(default=django.utils.timezone.now)),
|
||||
('track', models.ForeignKey(related_name='track_favorites', to='music.Track', on_delete=models.CASCADE)),
|
||||
('user', models.ForeignKey(related_name='track_favorites', to=settings.AUTH_USER_MODEL, on_delete=models.CASCADE)),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
serialize=False,
|
||||
auto_created=True,
|
||||
verbose_name="ID",
|
||||
primary_key=True,
|
||||
),
|
||||
),
|
||||
(
|
||||
"creation_date",
|
||||
models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
(
|
||||
"track",
|
||||
models.ForeignKey(
|
||||
related_name="track_favorites",
|
||||
to="music.Track",
|
||||
on_delete=models.CASCADE,
|
||||
),
|
||||
),
|
||||
(
|
||||
"user",
|
||||
models.ForeignKey(
|
||||
related_name="track_favorites",
|
||||
to=settings.AUTH_USER_MODEL,
|
||||
on_delete=models.CASCADE,
|
||||
),
|
||||
),
|
||||
],
|
||||
options={
|
||||
'ordering': ('-creation_date',),
|
||||
},
|
||||
options={"ordering": ("-creation_date",)},
|
||||
),
|
||||
migrations.AlterUniqueTogether(
|
||||
name='trackfavorite',
|
||||
unique_together=set([('track', 'user')]),
|
||||
name="trackfavorite", unique_together=set([("track", "user")])
|
||||
),
|
||||
]
|
||||
|
|
|
@ -1,4 +1,3 @@
from django.conf import settings
from django.db import models
from django.utils import timezone

@ -8,13 +7,15 @@ from funkwhale_api.music.models import Track
class TrackFavorite(models.Model):
    creation_date = models.DateTimeField(default=timezone.now)
    user = models.ForeignKey(
        'users.User', related_name='track_favorites', on_delete=models.CASCADE)
        "users.User", related_name="track_favorites", on_delete=models.CASCADE
    )
    track = models.ForeignKey(
        Track, related_name='track_favorites', on_delete=models.CASCADE)
        Track, related_name="track_favorites", on_delete=models.CASCADE
    )

    class Meta:
        unique_together = ('track', 'user')
        ordering = ('-creation_date',)
        unique_together = ("track", "user")
        ordering = ("-creation_date",)

    @classmethod
    def add(cls, track, user):

@ -22,5 +23,4 @@ class TrackFavorite(models.Model):
        return favorite

    def get_activity_url(self):
        return '{}/favorites/tracks/{}'.format(
            self.user.get_activity_url(), self.pk)
        return "{}/favorites/tracks/{}".format(self.user.get_activity_url(), self.pk)
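The classmethod is used as-is by the favorites viewset later in this diff; a short sketch of that call:

# Sketch: create the favorite for a user/track pair and build its activity URL.
from funkwhale_api.favorites.models import TrackFavorite


def favorite_track(user, track):
    favorite = TrackFavorite.add(track=track, user=user)
    return favorite.get_activity_url()  # "<user activity url>/favorites/tracks/<pk>"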
@ -1,4 +1,3 @@
|
|||
from django.conf import settings
|
||||
|
||||
from rest_framework import serializers
|
||||
|
||||
|
@ -11,29 +10,22 @@ from . import models
|
|||
|
||||
class TrackFavoriteActivitySerializer(activity_serializers.ModelSerializer):
|
||||
type = serializers.SerializerMethodField()
|
||||
object = TrackActivitySerializer(source='track')
|
||||
actor = UserActivitySerializer(source='user')
|
||||
published = serializers.DateTimeField(source='creation_date')
|
||||
object = TrackActivitySerializer(source="track")
|
||||
actor = UserActivitySerializer(source="user")
|
||||
published = serializers.DateTimeField(source="creation_date")
|
||||
|
||||
class Meta:
|
||||
model = models.TrackFavorite
|
||||
fields = [
|
||||
'id',
|
||||
'local_id',
|
||||
'object',
|
||||
'type',
|
||||
'actor',
|
||||
'published'
|
||||
]
|
||||
fields = ["id", "local_id", "object", "type", "actor", "published"]
|
||||
|
||||
def get_actor(self, obj):
|
||||
return UserActivitySerializer(obj.user).data
|
||||
|
||||
def get_type(self, obj):
|
||||
return 'Like'
|
||||
return "Like"
|
||||
|
||||
|
||||
class UserTrackFavoriteSerializer(serializers.ModelSerializer):
|
||||
class Meta:
|
||||
model = models.TrackFavorite
|
||||
fields = ('id', 'track', 'creation_date')
|
||||
fields = ("id", "track", "creation_date")
|
||||
|
|
|
@ -1,8 +1,8 @@
from django.conf.urls import include, url
from rest_framework import routers

from . import views

from rest_framework import routers
router = routers.SimpleRouter()
router.register(r'tracks', views.TrackFavoriteViewSet, 'tracks')
router.register(r"tracks", views.TrackFavoriteViewSet, "tracks")

urlpatterns = router.urls
@ -1,24 +1,23 @@
|
|||
from rest_framework import generics, mixins, viewsets
|
||||
from rest_framework import status
|
||||
from rest_framework.response import Response
|
||||
from rest_framework import pagination
|
||||
from rest_framework import mixins, status, viewsets
|
||||
from rest_framework.decorators import list_route
|
||||
from rest_framework.response import Response
|
||||
|
||||
from funkwhale_api.activity import record
|
||||
from funkwhale_api.music.models import Track
|
||||
from funkwhale_api.common.permissions import ConditionalAuthentication
|
||||
from funkwhale_api.music.models import Track
|
||||
|
||||
from . import models
|
||||
from . import serializers
|
||||
from . import models, serializers
|
||||
|
||||
|
||||
class TrackFavoriteViewSet(mixins.CreateModelMixin,
|
||||
mixins.DestroyModelMixin,
|
||||
mixins.ListModelMixin,
|
||||
viewsets.GenericViewSet):
|
||||
class TrackFavoriteViewSet(
|
||||
mixins.CreateModelMixin,
|
||||
mixins.DestroyModelMixin,
|
||||
mixins.ListModelMixin,
|
||||
viewsets.GenericViewSet,
|
||||
):
|
||||
|
||||
serializer_class = serializers.UserTrackFavoriteSerializer
|
||||
queryset = (models.TrackFavorite.objects.all())
|
||||
queryset = models.TrackFavorite.objects.all()
|
||||
permission_classes = [ConditionalAuthentication]
|
||||
|
||||
def create(self, request, *args, **kwargs):
|
||||
|
@ -28,20 +27,22 @@ class TrackFavoriteViewSet(mixins.CreateModelMixin,
|
|||
serializer = self.get_serializer(instance=instance)
|
||||
headers = self.get_success_headers(serializer.data)
|
||||
record.send(instance)
|
||||
return Response(serializer.data, status=status.HTTP_201_CREATED, headers=headers)
|
||||
return Response(
|
||||
serializer.data, status=status.HTTP_201_CREATED, headers=headers
|
||||
)
|
||||
|
||||
def get_queryset(self):
|
||||
return self.queryset.filter(user=self.request.user)
|
||||
|
||||
def perform_create(self, serializer):
|
||||
track = Track.objects.get(pk=serializer.data['track'])
|
||||
track = Track.objects.get(pk=serializer.data["track"])
|
||||
favorite = models.TrackFavorite.add(track=track, user=self.request.user)
|
||||
return favorite
|
||||
|
||||
@list_route(methods=['delete', 'post'])
|
||||
@list_route(methods=["delete", "post"])
|
||||
def remove(self, request, *args, **kwargs):
|
||||
try:
|
||||
pk = int(request.data['track'])
|
||||
pk = int(request.data["track"])
|
||||
favorite = request.user.track_favorites.get(track__pk=pk)
|
||||
except (AttributeError, ValueError, models.TrackFavorite.DoesNotExist):
|
||||
return Response({}, status=400)
|
||||
|
|
|
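A hedged sketch of how the list/create and remove routes are exercised with DRF's test client; the URL namespace and the user/track objects are assumptions, only the "tracks" basename and the "track" payload key come from the code above.

# Sketch: favorite a track, then unfavorite it through the extra /remove/ route.
from django.urls import reverse
from rest_framework.test import APIClient

client = APIClient()
client.force_authenticate(user=user)  # placeholder user

client.post(reverse("api:v1:favorites:tracks-list"), {"track": track.pk})      # assumed namespace
client.delete(reverse("api:v1:favorites:tracks-remove"), {"track": track.pk})  # assumed namespace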
@ -1,67 +1,61 @@
|
|||
from . import serializers
|
||||
from . import tasks
|
||||
|
||||
ACTIVITY_TYPES = [
|
||||
'Accept',
|
||||
'Add',
|
||||
'Announce',
|
||||
'Arrive',
|
||||
'Block',
|
||||
'Create',
|
||||
'Delete',
|
||||
'Dislike',
|
||||
'Flag',
|
||||
'Follow',
|
||||
'Ignore',
|
||||
'Invite',
|
||||
'Join',
|
||||
'Leave',
|
||||
'Like',
|
||||
'Listen',
|
||||
'Move',
|
||||
'Offer',
|
||||
'Question',
|
||||
'Reject',
|
||||
'Read',
|
||||
'Remove',
|
||||
'TentativeReject',
|
||||
'TentativeAccept',
|
||||
'Travel',
|
||||
'Undo',
|
||||
'Update',
|
||||
'View',
|
||||
"Accept",
|
||||
"Add",
|
||||
"Announce",
|
||||
"Arrive",
|
||||
"Block",
|
||||
"Create",
|
||||
"Delete",
|
||||
"Dislike",
|
||||
"Flag",
|
||||
"Follow",
|
||||
"Ignore",
|
||||
"Invite",
|
||||
"Join",
|
||||
"Leave",
|
||||
"Like",
|
||||
"Listen",
|
||||
"Move",
|
||||
"Offer",
|
||||
"Question",
|
||||
"Reject",
|
||||
"Read",
|
||||
"Remove",
|
||||
"TentativeReject",
|
||||
"TentativeAccept",
|
||||
"Travel",
|
||||
"Undo",
|
||||
"Update",
|
||||
"View",
|
||||
]
|
||||
|
||||
|
||||
OBJECT_TYPES = [
|
||||
'Article',
|
||||
'Audio',
|
||||
'Collection',
|
||||
'Document',
|
||||
'Event',
|
||||
'Image',
|
||||
'Note',
|
||||
'OrderedCollection',
|
||||
'Page',
|
||||
'Place',
|
||||
'Profile',
|
||||
'Relationship',
|
||||
'Tombstone',
|
||||
'Video',
|
||||
"Article",
|
||||
"Audio",
|
||||
"Collection",
|
||||
"Document",
|
||||
"Event",
|
||||
"Image",
|
||||
"Note",
|
||||
"OrderedCollection",
|
||||
"Page",
|
||||
"Place",
|
||||
"Profile",
|
||||
"Relationship",
|
||||
"Tombstone",
|
||||
"Video",
|
||||
] + ACTIVITY_TYPES
|
||||
|
||||
|
||||
def deliver(activity, on_behalf_of, to=[]):
|
||||
return tasks.send.delay(
|
||||
activity=activity,
|
||||
actor_id=on_behalf_of.pk,
|
||||
to=to
|
||||
)
|
||||
from . import tasks
|
||||
|
||||
return tasks.send.delay(activity=activity, actor_id=on_behalf_of.pk, to=to)
|
||||
|
||||
|
||||
def accept_follow(follow):
|
||||
from . import serializers
|
||||
|
||||
serializer = serializers.AcceptFollowSerializer(follow)
|
||||
return deliver(
|
||||
serializer.data,
|
||||
to=[follow.actor.url],
|
||||
on_behalf_of=follow.target)
|
||||
return deliver(serializer.data, to=[follow.actor.url], on_behalf_of=follow.target)
|
||||
|
|
|
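A hedged sketch of a caller of deliver; the Follow payload is a minimal illustration and the actor objects are placeholders. Delivery itself is asynchronous, through the tasks.send Celery task imported above.

# Sketch: queue an ActivityPub delivery on behalf of a local actor.
from funkwhale_api.federation import activity

follow = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Follow",              # one of ACTIVITY_TYPES
    "actor": local_actor.url,      # placeholder local actor
    "object": remote_actor_url,    # placeholder recipient
}
activity.deliver(follow, on_behalf_of=local_actor, to=[remote_actor_url])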
@ -1,36 +1,28 @@
|
|||
import datetime
|
||||
import logging
|
||||
import uuid
|
||||
import xml
|
||||
|
||||
from django.conf import settings
|
||||
from django.db import transaction
|
||||
from django.urls import reverse
|
||||
from django.utils import timezone
|
||||
|
||||
from rest_framework.exceptions import PermissionDenied
|
||||
|
||||
from dynamic_preferences.registries import global_preferences_registry
|
||||
|
||||
from funkwhale_api.common import preferences
|
||||
from funkwhale_api.common import session
|
||||
from funkwhale_api.common import preferences, session
|
||||
from funkwhale_api.common import utils as funkwhale_utils
|
||||
from funkwhale_api.music import models as music_models
|
||||
from funkwhale_api.music import tasks as music_tasks
|
||||
|
||||
from . import activity
|
||||
from . import keys
|
||||
from . import models
|
||||
from . import serializers
|
||||
from . import signing
|
||||
from . import utils
|
||||
from . import activity, keys, models, serializers, signing, utils
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def remove_tags(text):
|
||||
logger.debug('Removing tags from %s', text)
|
||||
return ''.join(xml.etree.ElementTree.fromstring('<div>{}</div>'.format(text)).itertext())
|
||||
logger.debug("Removing tags from %s", text)
|
||||
return "".join(
|
||||
xml.etree.ElementTree.fromstring("<div>{}</div>".format(text)).itertext()
|
||||
)
|
||||
|
||||
|
||||
def get_actor_data(actor_url):
|
||||
|
@ -38,16 +30,13 @@ def get_actor_data(actor_url):
|
|||
actor_url,
|
||||
timeout=5,
|
||||
verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
|
||||
headers={
|
||||
'Accept': 'application/activity+json',
|
||||
}
|
||||
headers={"Accept": "application/activity+json"},
|
||||
)
|
||||
response.raise_for_status()
|
||||
try:
|
||||
return response.json()
|
||||
except:
|
||||
raise ValueError(
|
||||
'Invalid actor payload: {}'.format(response.text))
|
||||
except Exception:
|
||||
raise ValueError("Invalid actor payload: {}".format(response.text))
|
||||
|
||||
|
||||
def get_actor(actor_url):
|
||||
|
@ -56,7 +45,8 @@ def get_actor(actor_url):
|
|||
except models.Actor.DoesNotExist:
|
||||
actor = None
|
||||
fetch_delta = datetime.timedelta(
|
||||
minutes=preferences.get('federation__actor_fetch_delay'))
|
||||
minutes=preferences.get("federation__actor_fetch_delay")
|
||||
)
|
||||
if actor and actor.last_fetch_date > timezone.now() - fetch_delta:
|
||||
# cache is hot, we can return as is
|
||||
return actor
|
||||
|
@ -73,8 +63,7 @@ class SystemActor(object):
|
|||
|
||||
def get_request_auth(self):
|
||||
actor = self.get_actor_instance()
|
||||
return signing.get_auth(
|
||||
actor.private_key, actor.private_key_id)
|
||||
return signing.get_auth(actor.private_key, actor.private_key_id)
|
||||
|
||||
def serialize(self):
|
||||
actor = self.get_actor_instance()
|
||||
|
@ -88,42 +77,35 @@ class SystemActor(object):
|
|||
pass
|
||||
private, public = keys.get_key_pair()
|
||||
args = self.get_instance_argument(
|
||||
self.id,
|
||||
name=self.name,
|
||||
summary=self.summary,
|
||||
**self.additional_attributes
|
||||
self.id, name=self.name, summary=self.summary, **self.additional_attributes
|
||||
)
|
||||
args['private_key'] = private.decode('utf-8')
|
||||
args['public_key'] = public.decode('utf-8')
|
||||
args["private_key"] = private.decode("utf-8")
|
||||
args["public_key"] = public.decode("utf-8")
|
||||
return models.Actor.objects.create(**args)
|
||||
|
||||
def get_actor_url(self):
|
||||
return utils.full_url(
|
||||
reverse(
|
||||
'federation:instance-actors-detail',
|
||||
kwargs={'actor': self.id}))
|
||||
reverse("federation:instance-actors-detail", kwargs={"actor": self.id})
|
||||
)
|
||||
|
||||
def get_instance_argument(self, id, name, summary, **kwargs):
|
||||
p = {
|
||||
'preferred_username': id,
|
||||
'domain': settings.FEDERATION_HOSTNAME,
|
||||
'type': 'Person',
|
||||
'name': name.format(host=settings.FEDERATION_HOSTNAME),
|
||||
'manually_approves_followers': True,
|
||||
'url': self.get_actor_url(),
|
||||
'shared_inbox_url': utils.full_url(
|
||||
reverse(
|
||||
'federation:instance-actors-inbox',
|
||||
kwargs={'actor': id})),
|
||||
'inbox_url': utils.full_url(
|
||||
reverse(
|
||||
'federation:instance-actors-inbox',
|
||||
kwargs={'actor': id})),
|
||||
'outbox_url': utils.full_url(
|
||||
reverse(
|
||||
'federation:instance-actors-outbox',
|
||||
kwargs={'actor': id})),
|
||||
'summary': summary.format(host=settings.FEDERATION_HOSTNAME)
|
||||
"preferred_username": id,
|
||||
"domain": settings.FEDERATION_HOSTNAME,
|
||||
"type": "Person",
|
||||
"name": name.format(host=settings.FEDERATION_HOSTNAME),
|
||||
"manually_approves_followers": True,
|
||||
"url": self.get_actor_url(),
|
||||
"shared_inbox_url": utils.full_url(
|
||||
reverse("federation:instance-actors-inbox", kwargs={"actor": id})
|
||||
),
|
||||
"inbox_url": utils.full_url(
|
||||
reverse("federation:instance-actors-inbox", kwargs={"actor": id})
|
||||
),
|
||||
"outbox_url": utils.full_url(
|
||||
reverse("federation:instance-actors-outbox", kwargs={"actor": id})
|
||||
),
|
||||
"summary": summary.format(host=settings.FEDERATION_HOSTNAME),
|
||||
}
|
||||
p.update(kwargs)
|
||||
return p
|
||||
|
@ -145,32 +127,29 @@ class SystemActor(object):
|
|||
Main entrypoint for handling activities posted to the
|
||||
actor's inbox
|
||||
"""
|
||||
logger.info('Received activity on %s inbox', self.id)
|
||||
logger.info("Received activity on %s inbox", self.id)
|
||||
|
||||
if actor is None:
|
||||
raise PermissionDenied('Actor not authenticated')
|
||||
raise PermissionDenied("Actor not authenticated")
|
||||
|
||||
serializer = serializers.ActivitySerializer(
|
||||
data=data, context={'actor': actor})
|
||||
serializer = serializers.ActivitySerializer(data=data, context={"actor": actor})
|
||||
serializer.is_valid(raise_exception=True)
|
||||
|
||||
ac = serializer.data
|
||||
try:
|
||||
handler = getattr(
|
||||
self, 'handle_{}'.format(ac['type'].lower()))
|
||||
handler = getattr(self, "handle_{}".format(ac["type"].lower()))
|
||||
except (KeyError, AttributeError):
|
||||
logger.debug(
|
||||
'No handler for activity %s', ac['type'])
|
||||
logger.debug("No handler for activity %s", ac["type"])
|
||||
return
|
||||
|
||||
return handler(data, actor)
|
||||
|
||||
def handle_follow(self, ac, sender):
|
||||
system_actor = self.get_actor_instance()
|
||||
serializer = serializers.FollowSerializer(
|
||||
data=ac, context={'follow_actor': sender})
|
||||
data=ac, context={"follow_actor": sender}
|
||||
)
|
||||
if not serializer.is_valid():
|
||||
return logger.info('Invalid follow payload')
|
||||
return logger.info("Invalid follow payload")
|
||||
approved = True if not self.manually_approves_followers else None
|
||||
follow = serializer.save(approved=approved)
|
||||
if follow.approved:
|
||||
|
@ -179,26 +158,27 @@ class SystemActor(object):
|
|||
def handle_accept(self, ac, sender):
|
||||
system_actor = self.get_actor_instance()
|
||||
serializer = serializers.AcceptFollowSerializer(
|
||||
data=ac,
|
||||
context={'follow_target': sender, 'follow_actor': system_actor})
|
||||
data=ac, context={"follow_target": sender, "follow_actor": system_actor}
|
||||
)
|
||||
if not serializer.is_valid(raise_exception=True):
|
||||
return logger.info('Received invalid payload')
|
||||
return logger.info("Received invalid payload")
|
||||
|
||||
return serializer.save()
|
||||
|
||||
def handle_undo_follow(self, ac, sender):
|
||||
system_actor = self.get_actor_instance()
|
||||
serializer = serializers.UndoFollowSerializer(
|
||||
data=ac, context={'actor': sender, 'target': system_actor})
|
||||
data=ac, context={"actor": sender, "target": system_actor}
|
||||
)
|
||||
if not serializer.is_valid():
|
||||
return logger.info('Received invalid payload')
|
||||
return logger.info("Received invalid payload")
|
||||
serializer.save()
|
||||
|
||||
def handle_undo(self, ac, sender):
|
||||
if ac['object']['type'] != 'Follow':
|
||||
if ac["object"]["type"] != "Follow":
|
||||
return
|
||||
|
||||
if ac['object']['actor'] != sender.url:
|
||||
if ac["object"]["actor"] != sender.url:
|
||||
# not the same actor, permission issue
|
||||
return
|
||||
|
||||
|
@ -206,55 +186,52 @@ class SystemActor(object):
|
|||
|
||||
|
||||
class LibraryActor(SystemActor):
|
||||
id = 'library'
|
||||
name = '{host}\'s library'
|
||||
summary = 'Bot account to federate with {host}\'s library'
|
||||
additional_attributes = {
|
||||
'manually_approves_followers': True
|
||||
}
|
||||
id = "library"
|
||||
name = "{host}'s library"
|
||||
summary = "Bot account to federate with {host}'s library"
|
||||
additional_attributes = {"manually_approves_followers": True}
|
||||
|
||||
def serialize(self):
|
||||
data = super().serialize()
|
||||
urls = data.setdefault('url', [])
|
||||
urls.append({
|
||||
'type': 'Link',
|
||||
'mediaType': 'application/activity+json',
|
||||
'name': 'library',
|
||||
'href': utils.full_url(reverse('federation:music:files-list'))
|
||||
})
|
||||
urls = data.setdefault("url", [])
|
||||
urls.append(
|
||||
{
|
||||
"type": "Link",
|
||||
"mediaType": "application/activity+json",
|
||||
"name": "library",
|
||||
"href": utils.full_url(reverse("federation:music:files-list")),
|
||||
}
|
||||
)
|
||||
return data
|
||||
|
||||
@property
|
||||
def manually_approves_followers(self):
|
||||
return preferences.get('federation__music_needs_approval')
|
||||
return preferences.get("federation__music_needs_approval")
|
||||
|
||||
@transaction.atomic
|
||||
def handle_create(self, ac, sender):
|
||||
try:
|
||||
remote_library = models.Library.objects.get(
|
||||
actor=sender,
|
||||
federation_enabled=True,
|
||||
actor=sender, federation_enabled=True
|
||||
)
|
||||
except models.Library.DoesNotExist:
|
||||
logger.info(
|
||||
'Skipping import, we\'re not following %s', sender.url)
|
||||
logger.info("Skipping import, we're not following %s", sender.url)
|
||||
return
|
||||
|
||||
if ac['object']['type'] != 'Collection':
|
||||
if ac["object"]["type"] != "Collection":
|
||||
return
|
||||
|
||||
if ac['object']['totalItems'] <= 0:
|
||||
if ac["object"]["totalItems"] <= 0:
|
||||
return
|
||||
|
||||
try:
|
||||
items = ac['object']['items']
|
||||
items = ac["object"]["items"]
|
||||
except KeyError:
|
||||
logger.warning('No items in collection!')
|
||||
logger.warning("No items in collection!")
|
||||
return
|
||||
|
||||
item_serializers = [
|
||||
serializers.AudioSerializer(
|
||||
data=i, context={'library': remote_library})
|
||||
serializers.AudioSerializer(data=i, context={"library": remote_library})
|
||||
for i in items
|
||||
]
|
||||
now = timezone.now()
|
||||
|
@ -263,27 +240,21 @@ class LibraryActor(SystemActor):
|
|||
if s.is_valid():
|
||||
valid_serializers.append(s)
|
||||
else:
|
||||
logger.debug(
|
||||
'Skipping invalid item %s, %s', s.initial_data, s.errors)
|
||||
logger.debug("Skipping invalid item %s, %s", s.initial_data, s.errors)
|
||||
|
||||
lts = []
|
||||
for s in valid_serializers:
|
||||
lts.append(s.save())
|
||||
|
||||
if remote_library.autoimport:
|
||||
batch = music_models.ImportBatch.objects.create(
|
||||
source='federation',
|
||||
)
|
||||
batch = music_models.ImportBatch.objects.create(source="federation")
|
||||
for lt in lts:
|
||||
if lt.creation_date < now:
|
||||
# track was already in the library, we do not trigger
|
||||
# an import
|
||||
continue
|
||||
job = music_models.ImportJob.objects.create(
|
||||
batch=batch,
|
||||
library_track=lt,
|
||||
mbid=lt.mbid,
|
||||
source=lt.url,
|
||||
batch=batch, library_track=lt, mbid=lt.mbid, source=lt.url
|
||||
)
|
||||
funkwhale_utils.on_commit(
|
||||
music_tasks.import_job_run.delay,
|
||||
|
@ -293,15 +264,13 @@ class LibraryActor(SystemActor):
|
|||
|
||||
|
||||
class TestActor(SystemActor):
|
||||
id = 'test'
|
||||
name = '{host}\'s test account'
|
||||
id = "test"
|
||||
name = "{host}'s test account"
|
||||
summary = (
|
||||
'Bot account to test federation with {host}. '
|
||||
'Send me /ping and I\'ll answer you.'
|
||||
"Bot account to test federation with {host}. "
|
||||
"Send me /ping and I'll answer you."
|
||||
)
|
||||
additional_attributes = {
|
||||
'manually_approves_followers': False
|
||||
}
|
||||
additional_attributes = {"manually_approves_followers": False}
|
||||
manually_approves_followers = False
|
||||
|
||||
def get_outbox(self, data, actor=None):
|
||||
|
@ -309,15 +278,14 @@ class TestActor(SystemActor):
|
|||
"@context": [
|
||||
"https://www.w3.org/ns/activitystreams",
|
||||
"https://w3id.org/security/v1",
|
||||
{}
|
||||
{},
|
||||
],
|
||||
"id": utils.full_url(
|
||||
reverse(
|
||||
'federation:instance-actors-outbox',
|
||||
kwargs={'actor': self.id})),
|
||||
reverse("federation:instance-actors-outbox", kwargs={"actor": self.id})
|
||||
),
|
||||
"type": "OrderedCollection",
|
||||
"totalItems": 0,
|
||||
"orderedItems": []
|
||||
"orderedItems": [],
|
||||
}
|
||||
|
||||
def parse_command(self, message):
|
||||
|
@ -327,99 +295,85 @@ class TestActor(SystemActor):
|
|||
"""
|
||||
raw = remove_tags(message)
|
||||
try:
|
||||
return raw.split('/')[1]
|
||||
return raw.split("/")[1]
|
||||
except IndexError:
|
||||
return
|
||||
|
||||
def handle_create(self, ac, sender):
|
||||
if ac['object']['type'] != 'Note':
|
||||
if ac["object"]["type"] != "Note":
|
||||
return
|
||||
|
||||
# we received a toot \o/
|
||||
command = self.parse_command(ac['object']['content'])
|
||||
logger.debug('Parsed command: %s', command)
|
||||
if command != 'ping':
|
||||
command = self.parse_command(ac["object"]["content"])
|
||||
logger.debug("Parsed command: %s", command)
|
||||
if command != "ping":
|
||||
return
|
||||
|
||||
now = timezone.now()
|
||||
test_actor = self.get_actor_instance()
|
||||
reply_url = 'https://{}/activities/note/{}'.format(
|
||||
reply_url = "https://{}/activities/note/{}".format(
|
||||
settings.FEDERATION_HOSTNAME, now.timestamp()
|
||||
)
|
||||
reply_content = '{} Pong!'.format(
|
||||
sender.mention_username
|
||||
)
|
||||
reply_activity = {
|
||||
"@context": [
|
||||
"https://www.w3.org/ns/activitystreams",
|
||||
"https://w3id.org/security/v1",
|
||||
{}
|
||||
{},
|
||||
],
|
||||
'type': 'Create',
|
||||
'actor': test_actor.url,
|
||||
'id': '{}/activity'.format(reply_url),
|
||||
'published': now.isoformat(),
|
||||
'to': ac['actor'],
|
||||
'cc': [],
|
||||
'object': {
|
||||
'type': 'Note',
|
||||
'content': 'Pong!',
|
||||
'summary': None,
|
||||
'published': now.isoformat(),
|
||||
'id': reply_url,
|
||||
'inReplyTo': ac['object']['id'],
|
||||
'sensitive': False,
|
||||
'url': reply_url,
|
||||
'to': [ac['actor']],
|
||||
'attributedTo': test_actor.url,
|
||||
'cc': [],
|
||||
'attachment': [],
|
||||
'tag': [{
|
||||
"type": "Mention",
|
||||
"href": ac['actor'],
|
||||
"name": sender.mention_username
|
||||
}]
|
||||
}
|
||||
"type": "Create",
|
||||
"actor": test_actor.url,
|
||||
"id": "{}/activity".format(reply_url),
|
||||
"published": now.isoformat(),
|
||||
"to": ac["actor"],
|
||||
"cc": [],
|
||||
"object": {
|
||||
"type": "Note",
|
||||
"content": "Pong!",
|
||||
"summary": None,
|
||||
"published": now.isoformat(),
|
||||
"id": reply_url,
|
||||
"inReplyTo": ac["object"]["id"],
|
||||
"sensitive": False,
|
||||
"url": reply_url,
|
||||
"to": [ac["actor"]],
|
||||
"attributedTo": test_actor.url,
|
||||
"cc": [],
|
||||
"attachment": [],
|
||||
"tag": [
|
||||
{
|
||||
"type": "Mention",
|
||||
"href": ac["actor"],
|
||||
"name": sender.mention_username,
|
||||
}
|
||||
],
|
||||
},
|
||||
}
|
||||
activity.deliver(
|
||||
reply_activity,
|
||||
to=[ac['actor']],
|
||||
on_behalf_of=test_actor)
|
||||
activity.deliver(reply_activity, to=[ac["actor"]], on_behalf_of=test_actor)
|
||||
|
||||
def handle_follow(self, ac, sender):
|
||||
super().handle_follow(ac, sender)
|
||||
# also, we follow back
|
||||
test_actor = self.get_actor_instance()
|
||||
follow_back = models.Follow.objects.get_or_create(
|
||||
actor=test_actor,
|
||||
target=sender,
|
||||
approved=None,
|
||||
actor=test_actor, target=sender, approved=None
|
||||
)[0]
|
||||
activity.deliver(
|
||||
serializers.FollowSerializer(follow_back).data,
|
||||
to=[follow_back.target.url],
|
||||
on_behalf_of=follow_back.actor)
|
||||
on_behalf_of=follow_back.actor,
|
||||
)
|
||||
|
||||
def handle_undo_follow(self, ac, sender):
|
||||
super().handle_undo_follow(ac, sender)
|
||||
actor = self.get_actor_instance()
|
||||
# we also unfollow the sender, if possible
|
||||
try:
|
||||
follow = models.Follow.objects.get(
|
||||
target=sender,
|
||||
actor=actor,
|
||||
)
|
||||
follow = models.Follow.objects.get(target=sender, actor=actor)
|
||||
except models.Follow.DoesNotExist:
|
||||
return
|
||||
undo = serializers.UndoFollowSerializer(follow).data
|
||||
follow.delete()
|
||||
activity.deliver(
|
||||
undo,
|
||||
to=[sender.url],
|
||||
on_behalf_of=actor)
|
||||
activity.deliver(undo, to=[sender.url], on_behalf_of=actor)
|
||||
|
||||
|
||||
SYSTEM_ACTORS = {
|
||||
'library': LibraryActor(),
|
||||
'test': TestActor(),
|
||||
}
|
||||
SYSTEM_ACTORS = {"library": LibraryActor(), "test": TestActor()}
|
||||
|
|
|
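To make the ping flow concrete, a small sketch of what remove_tags and parse_command do with a typical HTML note body; the snippet and the mention are illustrative.

# Sketch: tags are stripped, then the token after the first "/" is the command;
# "ping" is the only command TestActor answers (with a "Pong!" note).
from funkwhale_api.federation import actors

content = "<p><span>@test@music.example</span> /ping</p>"  # illustrative payload
assert actors.remove_tags(content) == "@test@music.example /ping"
assert actors.TestActor().parse_command(content) == "ping"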
@ -6,61 +6,43 @@ from . import models
|
|||
@admin.register(models.Actor)
|
||||
class ActorAdmin(admin.ModelAdmin):
|
||||
list_display = [
|
||||
'url',
|
||||
'domain',
|
||||
'preferred_username',
|
||||
'type',
|
||||
'creation_date',
|
||||
'last_fetch_date']
|
||||
search_fields = ['url', 'domain', 'preferred_username']
|
||||
list_filter = [
|
||||
'type'
|
||||
"url",
|
||||
"domain",
|
||||
"preferred_username",
|
||||
"type",
|
||||
"creation_date",
|
||||
"last_fetch_date",
|
||||
]
|
||||
search_fields = ["url", "domain", "preferred_username"]
|
||||
list_filter = ["type"]
|
||||
|
||||
|
||||
@admin.register(models.Follow)
|
||||
class FollowAdmin(admin.ModelAdmin):
|
||||
list_display = [
|
||||
'actor',
|
||||
'target',
|
||||
'approved',
|
||||
'creation_date'
|
||||
]
|
||||
list_filter = [
|
||||
'approved'
|
||||
]
|
||||
search_fields = ['actor__url', 'target__url']
|
||||
list_display = ["actor", "target", "approved", "creation_date"]
|
||||
list_filter = ["approved"]
|
||||
search_fields = ["actor__url", "target__url"]
|
||||
list_select_related = True
|
||||
|
||||
|
||||
@admin.register(models.Library)
|
||||
class LibraryAdmin(admin.ModelAdmin):
|
||||
list_display = [
|
||||
'actor',
|
||||
'url',
|
||||
'creation_date',
|
||||
'fetched_date',
|
||||
'tracks_count']
|
||||
search_fields = ['actor__url', 'url']
|
||||
list_filter = [
|
||||
'federation_enabled',
|
||||
'download_files',
|
||||
'autoimport',
|
||||
]
|
||||
list_display = ["actor", "url", "creation_date", "fetched_date", "tracks_count"]
|
||||
search_fields = ["actor__url", "url"]
|
||||
list_filter = ["federation_enabled", "download_files", "autoimport"]
|
||||
list_select_related = True
|
||||
|
||||
|
||||
@admin.register(models.LibraryTrack)
|
||||
class LibraryTrackAdmin(admin.ModelAdmin):
|
||||
list_display = [
|
||||
'title',
|
||||
'artist_name',
|
||||
'album_title',
|
||||
'url',
|
||||
'library',
|
||||
'creation_date',
|
||||
'published_date',
|
||||
"title",
|
||||
"artist_name",
|
||||
"album_title",
|
||||
"url",
|
||||
"library",
|
||||
"creation_date",
|
||||
"published_date",
|
||||
]
|
||||
search_fields = [
|
||||
'library__url', 'url', 'artist_name', 'title', 'album_title']
|
||||
search_fields = ["library__url", "url", "artist_name", "title", "album_title"]
|
||||
list_select_related = True
|
||||
|
|
|
@ -3,13 +3,7 @@ from rest_framework import routers
from . import views

router = routers.SimpleRouter()
router.register(
    r'libraries',
    views.LibraryViewSet,
    'libraries')
router.register(
    r'library-tracks',
    views.LibraryTrackViewSet,
    'library-tracks')
router.register(r"libraries", views.LibraryViewSet, "libraries")
router.register(r"library-tracks", views.LibraryTrackViewSet, "library-tracks")

urlpatterns = router.urls
@ -1,23 +1,15 @@
import cryptography

from django.contrib.auth.models import AnonymousUser
from rest_framework import authentication, exceptions

from rest_framework import authentication
from rest_framework import exceptions

from . import actors
from . import keys
from . import models
from . import serializers
from . import signing
from . import utils
from . import actors, keys, signing, utils


class SignatureAuthentication(authentication.BaseAuthentication):
    def authenticate_actor(self, request):
        headers = utils.clean_wsgi_headers(request.META)
        try:
            signature = headers['Signature']
            signature = headers["Signature"]
            key_id = keys.get_key_id_from_signature_header(signature)
        except KeyError:
            return

@ -25,25 +17,25 @@ class SignatureAuthentication(authentication.BaseAuthentication):
            raise exceptions.AuthenticationFailed(str(e))

        try:
            actor = actors.get_actor(key_id.split('#')[0])
            actor = actors.get_actor(key_id.split("#")[0])
        except Exception as e:
            raise exceptions.AuthenticationFailed(str(e))

        if not actor.public_key:
            raise exceptions.AuthenticationFailed('No public key found')
            raise exceptions.AuthenticationFailed("No public key found")

        try:
            signing.verify_django(request, actor.public_key.encode('utf-8'))
            signing.verify_django(request, actor.public_key.encode("utf-8"))
        except cryptography.exceptions.InvalidSignature:
            raise exceptions.AuthenticationFailed('Invalid signature')
            raise exceptions.AuthenticationFailed("Invalid signature")

        return actor

    def authenticate(self, request):
        setattr(request, 'actor', None)
        setattr(request, "actor", None)
        actor = self.authenticate_actor(request)
        if not actor:
            return
        user = AnonymousUser()
        setattr(request, 'actor', actor)
        setattr(request, "actor", actor)
        return (user, None)
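For the client side of this handshake, a hedged sketch that reuses the header list from the factories below; the key pair, key id and URLs are placeholders, and HTTPSignatureAuth is the same requests_http_signature class those factories build.

# Sketch: sign an outgoing federation request so the receiving side can resolve
# the actor from keyId and verify the HTTP signature.
import requests
import requests_http_signature

from funkwhale_api.federation import keys

private, public = keys.get_key_pair()  # placeholder key pair
auth = requests_http_signature.HTTPSignatureAuth(
    algorithm="rsa-sha256",
    key=private,
    key_id="https://music.example/federation/actors/test#main-key",  # placeholder
    headers=["(request-target)", "user-agent", "host", "date"],
)
requests.get(
    "https://other.example/federation/actors/library",  # placeholder URL
    auth=auth,
    headers={"Accept": "application/activity+json"},
)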
@ -1,80 +1,68 @@
|
|||
from django.forms import widgets
|
||||
|
||||
from dynamic_preferences import types
|
||||
from dynamic_preferences.registries import global_preferences_registry
|
||||
|
||||
from funkwhale_api.common import preferences
|
||||
federation = types.Section('federation')
|
||||
|
||||
federation = types.Section("federation")
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class MusicCacheDuration(types.IntPreference):
    show_in_api = True
    section = federation
    name = "music_cache_duration"
    default = 60 * 24 * 2
    verbose_name = "Music cache duration"
    help_text = (
        "How many minutes do you want to keep a copy of federated tracks "
        "locally? Federated files that were not listened to in this interval "
        "will be erased and refetched from the remote on the next listening."
    )
    field_kwargs = {"required": False}


@global_preferences_registry.register
class Enabled(preferences.DefaultFromSettingMixin, types.BooleanPreference):
    section = federation
    name = "enabled"
    setting = "FEDERATION_ENABLED"
    verbose_name = "Federation enabled"
    help_text = (
        "Use this setting to enable or disable federation logic and API globally."
    )


@global_preferences_registry.register
class CollectionPageSize(preferences.DefaultFromSettingMixin, types.IntPreference):
    section = federation
    name = "collection_page_size"
    setting = "FEDERATION_COLLECTION_PAGE_SIZE"
    verbose_name = "Federation collection page size"
    help_text = "How many items to display in ActivityPub collections."
    field_kwargs = {"required": False}


@global_preferences_registry.register
class ActorFetchDelay(preferences.DefaultFromSettingMixin, types.IntPreference):
    section = federation
    name = "actor_fetch_delay"
    setting = "FEDERATION_ACTOR_FETCH_DELAY"
    verbose_name = "Federation actor fetch delay"
    help_text = (
        "How many minutes to wait before refetching actors on "
        "request authentication."
    )
    field_kwargs = {"required": False}


@global_preferences_registry.register
class MusicNeedsApproval(preferences.DefaultFromSettingMixin, types.BooleanPreference):
    section = federation
    name = "music_needs_approval"
    setting = "FEDERATION_MUSIC_NEEDS_APPROVAL"
    verbose_name = "Federation music needs approval"
    help_text = (
        "When true, other federation actors will need your approval"
        " before being able to browse your library."
    )
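These preferences are served by django-dynamic-preferences; as a hedged illustration (not part of this diff), a value such as the cache duration can be read at runtime with the same `section__name` key that the periodic task further down uses:

from dynamic_preferences.registries import global_preferences_registry

def get_music_cache_duration():
    # "federation" is the section, "music_cache_duration" the preference
    # registered above; the value is expressed in minutes.
    manager = global_preferences_registry.manager()
    return manager["federation__music_cache_duration"]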
@ -1,5 +1,3 @@
class MalformedPayload(ValueError):
    pass
@ -1,40 +1,34 @@
import uuid

import factory
import requests
import requests_http_signature
from django.conf import settings
from django.utils import timezone

from funkwhale_api.factories import registry

from . import keys, models

registry.register(keys.get_key_pair, name="federation.KeyPair")


@registry.register(name="federation.SignatureAuth")
class SignatureAuthFactory(factory.Factory):
    algorithm = "rsa-sha256"
    key = factory.LazyFunction(lambda: keys.get_key_pair()[0])
    key_id = factory.Faker("url")
    use_auth_header = False
    headers = ["(request-target)", "user-agent", "host", "date", "content-type"]

    class Meta:
        model = requests_http_signature.HTTPSignatureAuth


@registry.register(name="federation.SignedRequest")
class SignedRequestFactory(factory.Factory):
    url = factory.Faker("url")
    method = "get"
    auth = factory.SubFactory(SignatureAuthFactory)

    class Meta:
@ -43,59 +37,62 @@ class SignedRequestFactory(factory.Factory):
|
|||
@factory.post_generation
|
||||
def headers(self, create, extracted, **kwargs):
|
||||
default_headers = {
|
||||
'User-Agent': 'Test',
|
||||
'Host': 'test.host',
|
||||
'Date': 'Right now',
|
||||
'Content-Type': 'application/activity+json'
|
||||
"User-Agent": "Test",
|
||||
"Host": "test.host",
|
||||
"Date": "Right now",
|
||||
"Content-Type": "application/activity+json",
|
||||
}
|
||||
if extracted:
|
||||
default_headers.update(extracted)
|
||||
self.headers.update(default_headers)
|
||||
|
||||
|
||||
@registry.register(name='federation.Link')
|
||||
@registry.register(name="federation.Link")
|
||||
class LinkFactory(factory.Factory):
|
||||
type = 'Link'
|
||||
href = factory.Faker('url')
|
||||
mediaType = 'text/html'
|
||||
type = "Link"
|
||||
href = factory.Faker("url")
|
||||
mediaType = "text/html"
|
||||
|
||||
class Meta:
|
||||
model = dict
|
||||
|
||||
class Params:
|
||||
audio = factory.Trait(
|
||||
mediaType=factory.Iterator(['audio/mp3', 'audio/ogg'])
|
||||
)
|
||||
audio = factory.Trait(mediaType=factory.Iterator(["audio/mp3", "audio/ogg"]))
|
||||
|
||||
|
||||
@registry.register
|
||||
class ActorFactory(factory.DjangoModelFactory):
|
||||
public_key = None
|
||||
private_key = None
|
||||
preferred_username = factory.Faker('user_name')
|
||||
summary = factory.Faker('paragraph')
|
||||
domain = factory.Faker('domain_name')
|
||||
url = factory.LazyAttribute(lambda o: 'https://{}/users/{}'.format(o.domain, o.preferred_username))
|
||||
inbox_url = factory.LazyAttribute(lambda o: 'https://{}/users/{}/inbox'.format(o.domain, o.preferred_username))
|
||||
outbox_url = factory.LazyAttribute(lambda o: 'https://{}/users/{}/outbox'.format(o.domain, o.preferred_username))
|
||||
preferred_username = factory.Faker("user_name")
|
||||
summary = factory.Faker("paragraph")
|
||||
domain = factory.Faker("domain_name")
|
||||
url = factory.LazyAttribute(
|
||||
lambda o: "https://{}/users/{}".format(o.domain, o.preferred_username)
|
||||
)
|
||||
inbox_url = factory.LazyAttribute(
|
||||
lambda o: "https://{}/users/{}/inbox".format(o.domain, o.preferred_username)
|
||||
)
|
||||
outbox_url = factory.LazyAttribute(
|
||||
lambda o: "https://{}/users/{}/outbox".format(o.domain, o.preferred_username)
|
||||
)
|
||||
|
||||
class Meta:
|
||||
model = models.Actor
|
||||
|
||||
class Params:
|
||||
local = factory.Trait(
|
||||
domain=factory.LazyAttribute(
|
||||
lambda o: settings.FEDERATION_HOSTNAME)
|
||||
domain=factory.LazyAttribute(lambda o: settings.FEDERATION_HOSTNAME)
|
||||
)
|
||||
|
||||
@classmethod
|
||||
def _generate(cls, create, attrs):
|
||||
has_public = attrs.get('public_key') is not None
|
||||
has_private = attrs.get('private_key') is not None
|
||||
has_public = attrs.get("public_key") is not None
|
||||
has_private = attrs.get("private_key") is not None
|
||||
if not has_public and not has_private:
|
||||
private, public = keys.get_key_pair()
|
||||
attrs['private_key'] = private.decode('utf-8')
|
||||
attrs['public_key'] = public.decode('utf-8')
|
||||
attrs["private_key"] = private.decode("utf-8")
|
||||
attrs["public_key"] = public.decode("utf-8")
|
||||
return super()._generate(create, attrs)
|
||||
|
||||
|
||||
|
@ -108,15 +105,13 @@ class FollowFactory(factory.DjangoModelFactory):
|
|||
model = models.Follow
|
||||
|
||||
class Params:
|
||||
local = factory.Trait(
|
||||
actor=factory.SubFactory(ActorFactory, local=True)
|
||||
)
|
||||
local = factory.Trait(actor=factory.SubFactory(ActorFactory, local=True))
|
||||
|
||||
|
||||
@registry.register
|
||||
class LibraryFactory(factory.DjangoModelFactory):
|
||||
actor = factory.SubFactory(ActorFactory)
|
||||
url = factory.Faker('url')
|
||||
url = factory.Faker("url")
|
||||
federation_enabled = True
|
||||
download_files = False
|
||||
autoimport = False
|
||||
|
@ -126,42 +121,36 @@ class LibraryFactory(factory.DjangoModelFactory):
|
|||
|
||||
|
||||
class ArtistMetadataFactory(factory.Factory):
|
||||
name = factory.Faker('name')
|
||||
name = factory.Faker("name")
|
||||
|
||||
class Meta:
|
||||
model = dict
|
||||
|
||||
class Params:
|
||||
musicbrainz = factory.Trait(
|
||||
musicbrainz_id=factory.Faker('uuid4')
|
||||
)
|
||||
musicbrainz = factory.Trait(musicbrainz_id=factory.Faker("uuid4"))
|
||||
|
||||
|
||||
class ReleaseMetadataFactory(factory.Factory):
|
||||
title = factory.Faker('sentence')
|
||||
title = factory.Faker("sentence")
|
||||
|
||||
class Meta:
|
||||
model = dict
|
||||
|
||||
class Params:
|
||||
musicbrainz = factory.Trait(
|
||||
musicbrainz_id=factory.Faker('uuid4')
|
||||
)
|
||||
musicbrainz = factory.Trait(musicbrainz_id=factory.Faker("uuid4"))
|
||||
|
||||
|
||||
class RecordingMetadataFactory(factory.Factory):
|
||||
title = factory.Faker('sentence')
|
||||
title = factory.Faker("sentence")
|
||||
|
||||
class Meta:
|
||||
model = dict
|
||||
|
||||
class Params:
|
||||
musicbrainz = factory.Trait(
|
||||
musicbrainz_id=factory.Faker('uuid4')
|
||||
)
|
||||
musicbrainz = factory.Trait(musicbrainz_id=factory.Faker("uuid4"))
|
||||
|
||||
|
||||
@registry.register(name='federation.LibraryTrackMetadata')
|
||||
@registry.register(name="federation.LibraryTrackMetadata")
|
||||
class LibraryTrackMetadataFactory(factory.Factory):
|
||||
artist = factory.SubFactory(ArtistMetadataFactory)
|
||||
recording = factory.SubFactory(RecordingMetadataFactory)
|
||||
|
@ -174,64 +163,59 @@ class LibraryTrackMetadataFactory(factory.Factory):
|
|||
@registry.register
|
||||
class LibraryTrackFactory(factory.DjangoModelFactory):
|
||||
library = factory.SubFactory(LibraryFactory)
|
||||
url = factory.Faker('url')
|
||||
title = factory.Faker('sentence')
|
||||
artist_name = factory.Faker('sentence')
|
||||
album_title = factory.Faker('sentence')
|
||||
audio_url = factory.Faker('url')
|
||||
audio_mimetype = 'audio/ogg'
|
||||
url = factory.Faker("url")
|
||||
title = factory.Faker("sentence")
|
||||
artist_name = factory.Faker("sentence")
|
||||
album_title = factory.Faker("sentence")
|
||||
audio_url = factory.Faker("url")
|
||||
audio_mimetype = "audio/ogg"
|
||||
metadata = factory.SubFactory(LibraryTrackMetadataFactory)
|
||||
|
||||
class Meta:
|
||||
model = models.LibraryTrack
|
||||
|
||||
class Params:
|
||||
with_audio_file = factory.Trait(
|
||||
audio_file=factory.django.FileField()
|
||||
)
|
||||
with_audio_file = factory.Trait(audio_file=factory.django.FileField())
|
||||
|
||||
|
||||
@registry.register(name='federation.Note')
|
||||
@registry.register(name="federation.Note")
|
||||
class NoteFactory(factory.Factory):
|
||||
type = 'Note'
|
||||
id = factory.Faker('url')
|
||||
published = factory.LazyFunction(
|
||||
lambda: timezone.now().isoformat()
|
||||
)
|
||||
type = "Note"
|
||||
id = factory.Faker("url")
|
||||
published = factory.LazyFunction(lambda: timezone.now().isoformat())
|
||||
inReplyTo = None
|
||||
content = factory.Faker('sentence')
|
||||
content = factory.Faker("sentence")
|
||||
|
||||
class Meta:
|
||||
model = dict
|
||||
|
||||
|
||||
@registry.register(name='federation.Activity')
|
||||
@registry.register(name="federation.Activity")
|
||||
class ActivityFactory(factory.Factory):
|
||||
type = 'Create'
|
||||
id = factory.Faker('url')
|
||||
published = factory.LazyFunction(
|
||||
lambda: timezone.now().isoformat()
|
||||
)
|
||||
actor = factory.Faker('url')
|
||||
type = "Create"
|
||||
id = factory.Faker("url")
|
||||
published = factory.LazyFunction(lambda: timezone.now().isoformat())
|
||||
actor = factory.Faker("url")
|
||||
object = factory.SubFactory(
|
||||
NoteFactory,
|
||||
actor=factory.SelfAttribute('..actor'),
|
||||
published=factory.SelfAttribute('..published'))
|
||||
actor=factory.SelfAttribute("..actor"),
|
||||
published=factory.SelfAttribute("..published"),
|
||||
)
|
||||
|
||||
class Meta:
|
||||
model = dict
|
||||
|
||||
|
||||
@registry.register(name='federation.AudioMetadata')
|
||||
@registry.register(name="federation.AudioMetadata")
|
||||
class AudioMetadataFactory(factory.Factory):
|
||||
recording = factory.LazyAttribute(
|
||||
lambda o: 'https://musicbrainz.org/recording/{}'.format(uuid.uuid4())
|
||||
lambda o: "https://musicbrainz.org/recording/{}".format(uuid.uuid4())
|
||||
)
|
||||
artist = factory.LazyAttribute(
|
||||
lambda o: 'https://musicbrainz.org/artist/{}'.format(uuid.uuid4())
|
||||
lambda o: "https://musicbrainz.org/artist/{}".format(uuid.uuid4())
|
||||
)
|
||||
release = factory.LazyAttribute(
|
||||
lambda o: 'https://musicbrainz.org/release/{}'.format(uuid.uuid4())
|
||||
lambda o: "https://musicbrainz.org/release/{}".format(uuid.uuid4())
|
||||
)
|
||||
bitrate = 42
|
||||
length = 43
|
||||
|
@ -241,14 +225,12 @@ class AudioMetadataFactory(factory.Factory):
|
|||
model = dict
|
||||
|
||||
|
||||
@registry.register(name='federation.Audio')
|
||||
@registry.register(name="federation.Audio")
|
||||
class AudioFactory(factory.Factory):
|
||||
type = 'Audio'
|
||||
id = factory.Faker('url')
|
||||
published = factory.LazyFunction(
|
||||
lambda: timezone.now().isoformat()
|
||||
)
|
||||
actor = factory.Faker('url')
|
||||
type = "Audio"
|
||||
id = factory.Faker("url")
|
||||
published = factory.LazyFunction(lambda: timezone.now().isoformat())
|
||||
actor = factory.Faker("url")
|
||||
url = factory.SubFactory(LinkFactory, audio=True)
|
||||
metadata = factory.SubFactory(LibraryTrackMetadataFactory)
|
||||
|
||||
|
|
|
@ -6,73 +6,67 @@ from . import models
class LibraryFilter(django_filters.FilterSet):
    approved = django_filters.BooleanFilter("following__approved")
    q = fields.SearchFilter(search_fields=["actor__domain"])

    class Meta:
        model = models.Library
        fields = {
            "approved": ["exact"],
            "federation_enabled": ["exact"],
            "download_files": ["exact"],
            "autoimport": ["exact"],
            "tracks_count": ["exact"],
        }


class LibraryTrackFilter(django_filters.FilterSet):
    library = django_filters.CharFilter("library__uuid")
    status = django_filters.CharFilter(method="filter_status")
    q = fields.SearchFilter(
        search_fields=["artist_name", "title", "album_title", "library__actor__domain"]
    )

    def filter_status(self, queryset, field_name, value):
        if value == "imported":
            return queryset.filter(local_track_file__isnull=False)
        elif value == "not_imported":
            return queryset.filter(local_track_file__isnull=True).exclude(
                import_jobs__status="pending"
            )
        elif value == "import_pending":
            return queryset.filter(import_jobs__status="pending")
        return queryset

    class Meta:
        model = models.LibraryTrack
        fields = {
            "library": ["exact"],
            "artist_name": ["exact", "icontains"],
            "title": ["exact", "icontains"],
            "album_title": ["exact", "icontains"],
            "audio_mimetype": ["exact", "icontains"],
        }


class FollowFilter(django_filters.FilterSet):
    pending = django_filters.CharFilter(method="filter_pending")
    ordering = django_filters.OrderingFilter(
        # tuple-mapping retains order
        fields=(
            ("creation_date", "creation_date"),
            ("modification_date", "modification_date"),
        )
    )
    q = fields.SearchFilter(
        search_fields=["actor__domain", "actor__preferred_username"]
    )

    class Meta:
        model = models.Follow
        fields = ["approved", "pending", "q"]

    def filter_pending(self, queryset, field_name, value):
        if value.lower() in ["true", "1", "yes"]:
            queryset = queryset.filter(approved__isnull=True)
        return queryset
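For orientation, a hedged sketch (not part of the commit) of how the `status` filter above is typically driven from a query string; the request path is illustrative only:

# e.g. GET /federation/library-tracks/?status=import_pending
filterset = LibraryTrackFilter(
    {"status": "import_pending"},
    queryset=models.LibraryTrack.objects.all(),
)
pending_tracks = filterset.qs  # tracks whose import job is still pending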
@ -1,48 +1,44 @@
import re
import urllib.parse

from cryptography.hazmat.backends import default_backend as crypto_default_backend
from cryptography.hazmat.primitives import serialization as crypto_serialization
from cryptography.hazmat.primitives.asymmetric import rsa

KEY_ID_REGEX = re.compile(r"keyId=\"(?P<id>.*)\"")


def get_key_pair(size=2048):
    key = rsa.generate_private_key(
        backend=crypto_default_backend(), public_exponent=65537, key_size=size
    )
    private_key = key.private_bytes(
        crypto_serialization.Encoding.PEM,
        crypto_serialization.PrivateFormat.PKCS8,
        crypto_serialization.NoEncryption(),
    )
    public_key = key.public_key().public_bytes(
        crypto_serialization.Encoding.PEM, crypto_serialization.PublicFormat.PKCS1
    )

    return private_key, public_key


def get_key_id_from_signature_header(header_string):
    parts = header_string.split(",")
    try:
        raw_key_id = [p for p in parts if p.startswith('keyId="')][0]
    except IndexError:
        raise ValueError("Missing key id")

    match = KEY_ID_REGEX.match(raw_key_id)
    if not match:
        raise ValueError("Invalid key id")

    key_id = match.groups()[0]
    url = urllib.parse.urlparse(key_id)
    if not url.scheme or not url.netloc:
        raise ValueError("Invalid url")
    if url.scheme not in ["http", "https"]:
        raise ValueError("Invalid scheme")
    return key_id
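A short usage sketch (not part of the diff) tying the two helpers together; the key id URL is a made-up example:

private_pem, public_pem = get_key_pair()  # PEM-encoded bytes
header = 'keyId="https://example.com/actor#main-key",algorithm="rsa-sha256"'
key_id = get_key_id_from_signature_header(header)
# key_id == "https://example.com/actor#main-key"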
@ -1,15 +1,11 @@
|
|||
import json
|
||||
import requests
|
||||
|
||||
import requests
|
||||
from django.conf import settings
|
||||
|
||||
from funkwhale_api.common import session
|
||||
|
||||
from . import actors
|
||||
from . import models
|
||||
from . import serializers
|
||||
from . import signing
|
||||
from . import webfinger
|
||||
from . import actors, models, serializers, signing, webfinger
|
||||
|
||||
|
||||
def scan_from_account_name(account_name):
|
||||
|
@ -24,87 +20,59 @@ def scan_from_account_name(account_name):
|
|||
"""
|
||||
data = {}
|
||||
try:
|
||||
username, domain = webfinger.clean_acct(
|
||||
account_name, ensure_local=False)
|
||||
username, domain = webfinger.clean_acct(account_name, ensure_local=False)
|
||||
except serializers.ValidationError:
|
||||
return {
|
||||
'webfinger': {
|
||||
'errors': ['Invalid account string']
|
||||
}
|
||||
}
|
||||
system_library = actors.SYSTEM_ACTORS['library'].get_actor_instance()
|
||||
library = models.Library.objects.filter(
|
||||
actor__domain=domain,
|
||||
actor__preferred_username=username
|
||||
).select_related('actor').first()
|
||||
data['local'] = {
|
||||
'following': False,
|
||||
'awaiting_approval': False,
|
||||
}
|
||||
return {"webfinger": {"errors": ["Invalid account string"]}}
|
||||
system_library = actors.SYSTEM_ACTORS["library"].get_actor_instance()
|
||||
data["local"] = {"following": False, "awaiting_approval": False}
|
||||
try:
|
||||
follow = models.Follow.objects.get(
|
||||
target__preferred_username=username,
|
||||
target__domain=username,
|
||||
actor=system_library,
|
||||
)
|
||||
data['local']['awaiting_approval'] = not bool(follow.approved)
|
||||
data['local']['following'] = True
|
||||
data["local"]["awaiting_approval"] = not bool(follow.approved)
|
||||
data["local"]["following"] = True
|
||||
except models.Follow.DoesNotExist:
|
||||
pass
|
||||
|
||||
try:
|
||||
data['webfinger'] = webfinger.get_resource(
|
||||
'acct:{}'.format(account_name))
|
||||
data["webfinger"] = webfinger.get_resource("acct:{}".format(account_name))
|
||||
except requests.ConnectionError:
|
||||
return {
|
||||
'webfinger': {
|
||||
'errors': ['This webfinger resource is not reachable']
|
||||
}
|
||||
}
|
||||
return {"webfinger": {"errors": ["This webfinger resource is not reachable"]}}
|
||||
except requests.HTTPError as e:
|
||||
return {
|
||||
'webfinger': {
|
||||
'errors': [
|
||||
'Error {} during webfinger request'.format(
|
||||
e.response.status_code)]
|
||||
"webfinger": {
|
||||
"errors": [
|
||||
"Error {} during webfinger request".format(e.response.status_code)
|
||||
]
|
||||
}
|
||||
}
|
||||
except json.JSONDecodeError as e:
|
||||
return {
|
||||
'webfinger': {
|
||||
'errors': ['Could not process webfinger response']
|
||||
}
|
||||
}
|
||||
return {"webfinger": {"errors": ["Could not process webfinger response"]}}
|
||||
|
||||
try:
|
||||
data['actor'] = actors.get_actor_data(data['webfinger']['actor_url'])
|
||||
data["actor"] = actors.get_actor_data(data["webfinger"]["actor_url"])
|
||||
except requests.ConnectionError:
|
||||
data['actor'] = {
|
||||
'errors': ['This actor is not reachable']
|
||||
}
|
||||
data["actor"] = {"errors": ["This actor is not reachable"]}
|
||||
return data
|
||||
except requests.HTTPError as e:
|
||||
data['actor'] = {
|
||||
'errors': [
|
||||
'Error {} during actor request'.format(
|
||||
e.response.status_code)]
|
||||
data["actor"] = {
|
||||
"errors": ["Error {} during actor request".format(e.response.status_code)]
|
||||
}
|
||||
return data
|
||||
|
||||
serializer = serializers.LibraryActorSerializer(data=data['actor'])
|
||||
serializer = serializers.LibraryActorSerializer(data=data["actor"])
|
||||
if not serializer.is_valid():
|
||||
data['actor'] = {
|
||||
'errors': ['Invalid ActivityPub actor']
|
||||
}
|
||||
data["actor"] = {"errors": ["Invalid ActivityPub actor"]}
|
||||
return data
|
||||
data['library'] = get_library_data(
|
||||
serializer.validated_data['library_url'])
|
||||
data["library"] = get_library_data(serializer.validated_data["library_url"])
|
||||
|
||||
return data
|
||||
|
||||
|
||||
def get_library_data(library_url):
|
||||
actor = actors.SYSTEM_ACTORS['library'].get_actor_instance()
|
||||
actor = actors.SYSTEM_ACTORS["library"].get_actor_instance()
|
||||
auth = signing.get_auth(actor.private_key, actor.private_key_id)
|
||||
try:
|
||||
response = session.get_session().get(
|
||||
|
@ -112,55 +80,37 @@ def get_library_data(library_url):
|
|||
auth=auth,
|
||||
timeout=5,
|
||||
verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
|
||||
headers={
|
||||
'Content-Type': 'application/activity+json'
|
||||
}
|
||||
headers={"Content-Type": "application/activity+json"},
|
||||
)
|
||||
except requests.ConnectionError:
|
||||
return {
|
||||
'errors': ['This library is not reachable']
|
||||
}
|
||||
return {"errors": ["This library is not reachable"]}
|
||||
scode = response.status_code
|
||||
if scode == 401:
|
||||
return {
|
||||
'errors': ['This library requires authentication']
|
||||
}
|
||||
return {"errors": ["This library requires authentication"]}
|
||||
elif scode == 403:
|
||||
return {
|
||||
'errors': ['Permission denied while scanning library']
|
||||
}
|
||||
return {"errors": ["Permission denied while scanning library"]}
|
||||
elif scode >= 400:
|
||||
return {
|
||||
'errors': ['Error {} while fetching the library'.format(scode)]
|
||||
}
|
||||
serializer = serializers.PaginatedCollectionSerializer(
|
||||
data=response.json(),
|
||||
)
|
||||
return {"errors": ["Error {} while fetching the library".format(scode)]}
|
||||
serializer = serializers.PaginatedCollectionSerializer(data=response.json())
|
||||
if not serializer.is_valid():
|
||||
return {
|
||||
'errors': [
|
||||
'Invalid ActivityPub response from remote library']
|
||||
}
|
||||
return {"errors": ["Invalid ActivityPub response from remote library"]}
|
||||
|
||||
return serializer.validated_data
|
||||
|
||||
|
||||
def get_library_page(library, page_url):
|
||||
actor = actors.SYSTEM_ACTORS['library'].get_actor_instance()
|
||||
actor = actors.SYSTEM_ACTORS["library"].get_actor_instance()
|
||||
auth = signing.get_auth(actor.private_key, actor.private_key_id)
|
||||
response = session.get_session().get(
|
||||
page_url,
|
||||
auth=auth,
|
||||
timeout=5,
|
||||
verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
|
||||
headers={
|
||||
'Content-Type': 'application/activity+json'
|
||||
}
|
||||
headers={"Content-Type": "application/activity+json"},
|
||||
)
|
||||
serializer = serializers.CollectionPageSerializer(
|
||||
data=response.json(),
|
||||
context={
|
||||
'library': library,
|
||||
'item_serializer': serializers.AudioSerializer})
|
||||
context={"library": library, "item_serializer": serializers.AudioSerializer},
|
||||
)
|
||||
serializer.is_valid(raise_exception=True)
|
||||
return serializer.validated_data
|
||||
|
|
|
@ -8,30 +8,74 @@ class Migration(migrations.Migration):
|
|||
|
||||
initial = True
|
||||
|
||||
dependencies = [
|
||||
]
|
||||
dependencies = []
|
||||
|
||||
operations = [
|
||||
migrations.CreateModel(
|
||||
name='Actor',
|
||||
name="Actor",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('url', models.URLField(db_index=True, max_length=500, unique=True)),
|
||||
('outbox_url', models.URLField(max_length=500)),
|
||||
('inbox_url', models.URLField(max_length=500)),
|
||||
('following_url', models.URLField(blank=True, max_length=500, null=True)),
|
||||
('followers_url', models.URLField(blank=True, max_length=500, null=True)),
|
||||
('shared_inbox_url', models.URLField(blank=True, max_length=500, null=True)),
|
||||
('type', models.CharField(choices=[('Person', 'Person'), ('Application', 'Application'), ('Group', 'Group'), ('Organization', 'Organization'), ('Service', 'Service')], default='Person', max_length=25)),
|
||||
('name', models.CharField(blank=True, max_length=200, null=True)),
|
||||
('domain', models.CharField(max_length=1000)),
|
||||
('summary', models.CharField(blank=True, max_length=500, null=True)),
|
||||
('preferred_username', models.CharField(blank=True, max_length=200, null=True)),
|
||||
('public_key', models.CharField(blank=True, max_length=5000, null=True)),
|
||||
('private_key', models.CharField(blank=True, max_length=5000, null=True)),
|
||||
('creation_date', models.DateTimeField(default=django.utils.timezone.now)),
|
||||
('last_fetch_date', models.DateTimeField(default=django.utils.timezone.now)),
|
||||
('manually_approves_followers', models.NullBooleanField(default=None)),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("url", models.URLField(db_index=True, max_length=500, unique=True)),
|
||||
("outbox_url", models.URLField(max_length=500)),
|
||||
("inbox_url", models.URLField(max_length=500)),
|
||||
(
|
||||
"following_url",
|
||||
models.URLField(blank=True, max_length=500, null=True),
|
||||
),
|
||||
(
|
||||
"followers_url",
|
||||
models.URLField(blank=True, max_length=500, null=True),
|
||||
),
|
||||
(
|
||||
"shared_inbox_url",
|
||||
models.URLField(blank=True, max_length=500, null=True),
|
||||
),
|
||||
(
|
||||
"type",
|
||||
models.CharField(
|
||||
choices=[
|
||||
("Person", "Person"),
|
||||
("Application", "Application"),
|
||||
("Group", "Group"),
|
||||
("Organization", "Organization"),
|
||||
("Service", "Service"),
|
||||
],
|
||||
default="Person",
|
||||
max_length=25,
|
||||
),
|
||||
),
|
||||
("name", models.CharField(blank=True, max_length=200, null=True)),
|
||||
("domain", models.CharField(max_length=1000)),
|
||||
("summary", models.CharField(blank=True, max_length=500, null=True)),
|
||||
(
|
||||
"preferred_username",
|
||||
models.CharField(blank=True, max_length=200, null=True),
|
||||
),
|
||||
(
|
||||
"public_key",
|
||||
models.CharField(blank=True, max_length=5000, null=True),
|
||||
),
|
||||
(
|
||||
"private_key",
|
||||
models.CharField(blank=True, max_length=5000, null=True),
|
||||
),
|
||||
(
|
||||
"creation_date",
|
||||
models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
(
|
||||
"last_fetch_date",
|
||||
models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
("manually_approves_followers", models.NullBooleanField(default=None)),
|
||||
],
|
||||
),
|
||||
)
|
||||
]
|
||||
|
|
|
@ -5,13 +5,10 @@ from django.db import migrations
|
|||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('federation', '0001_initial'),
|
||||
]
|
||||
dependencies = [("federation", "0001_initial")]
|
||||
|
||||
operations = [
|
||||
migrations.AlterUniqueTogether(
|
||||
name='actor',
|
||||
unique_together={('domain', 'preferred_username')},
|
||||
),
|
||||
name="actor", unique_together={("domain", "preferred_username")}
|
||||
)
|
||||
]
|
||||
|
|
|
@ -10,7 +10,7 @@ import uuid
|
|||
def delete_system_actors(apps, schema_editor):
|
||||
"""Revert site domain and name to default."""
|
||||
Actor = apps.get_model("federation", "Actor")
|
||||
Actor.objects.filter(preferred_username__in=['test', 'library']).delete()
|
||||
Actor.objects.filter(preferred_username__in=["test", "library"]).delete()
|
||||
|
||||
|
||||
def backward(apps, schema_editor):
|
||||
|
@ -19,76 +19,168 @@ def backward(apps, schema_editor):
|
|||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('federation', '0002_auto_20180403_1620'),
|
||||
]
|
||||
dependencies = [("federation", "0002_auto_20180403_1620")]
|
||||
|
||||
operations = [
|
||||
migrations.RunPython(delete_system_actors, backward),
|
||||
migrations.CreateModel(
|
||||
name='Follow',
|
||||
name="Follow",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('uuid', models.UUIDField(default=uuid.uuid4, unique=True)),
|
||||
('creation_date', models.DateTimeField(default=django.utils.timezone.now)),
|
||||
('modification_date', models.DateTimeField(auto_now=True)),
|
||||
('actor', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='emitted_follows', to='federation.Actor')),
|
||||
('target', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='received_follows', to='federation.Actor')),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("uuid", models.UUIDField(default=uuid.uuid4, unique=True)),
|
||||
(
|
||||
"creation_date",
|
||||
models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
("modification_date", models.DateTimeField(auto_now=True)),
|
||||
(
|
||||
"actor",
|
||||
models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
related_name="emitted_follows",
|
||||
to="federation.Actor",
|
||||
),
|
||||
),
|
||||
(
|
||||
"target",
|
||||
models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
related_name="received_follows",
|
||||
to="federation.Actor",
|
||||
),
|
||||
),
|
||||
],
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='FollowRequest',
|
||||
name="FollowRequest",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('uuid', models.UUIDField(default=uuid.uuid4, unique=True)),
|
||||
('creation_date', models.DateTimeField(default=django.utils.timezone.now)),
|
||||
('modification_date', models.DateTimeField(auto_now=True)),
|
||||
('approved', models.NullBooleanField(default=None)),
|
||||
('actor', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='emmited_follow_requests', to='federation.Actor')),
|
||||
('target', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='received_follow_requests', to='federation.Actor')),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("uuid", models.UUIDField(default=uuid.uuid4, unique=True)),
|
||||
(
|
||||
"creation_date",
|
||||
models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
("modification_date", models.DateTimeField(auto_now=True)),
|
||||
("approved", models.NullBooleanField(default=None)),
|
||||
(
|
||||
"actor",
|
||||
models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
related_name="emmited_follow_requests",
|
||||
to="federation.Actor",
|
||||
),
|
||||
),
|
||||
(
|
||||
"target",
|
||||
models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
related_name="received_follow_requests",
|
||||
to="federation.Actor",
|
||||
),
|
||||
),
|
||||
],
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='Library',
|
||||
name="Library",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('creation_date', models.DateTimeField(default=django.utils.timezone.now)),
|
||||
('modification_date', models.DateTimeField(auto_now=True)),
|
||||
('fetched_date', models.DateTimeField(blank=True, null=True)),
|
||||
('uuid', models.UUIDField(default=uuid.uuid4)),
|
||||
('url', models.URLField()),
|
||||
('federation_enabled', models.BooleanField()),
|
||||
('download_files', models.BooleanField()),
|
||||
('autoimport', models.BooleanField()),
|
||||
('tracks_count', models.PositiveIntegerField(blank=True, null=True)),
|
||||
('actor', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='library', to='federation.Actor')),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
(
|
||||
"creation_date",
|
||||
models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
("modification_date", models.DateTimeField(auto_now=True)),
|
||||
("fetched_date", models.DateTimeField(blank=True, null=True)),
|
||||
("uuid", models.UUIDField(default=uuid.uuid4)),
|
||||
("url", models.URLField()),
|
||||
("federation_enabled", models.BooleanField()),
|
||||
("download_files", models.BooleanField()),
|
||||
("autoimport", models.BooleanField()),
|
||||
("tracks_count", models.PositiveIntegerField(blank=True, null=True)),
|
||||
(
|
||||
"actor",
|
||||
models.OneToOneField(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
related_name="library",
|
||||
to="federation.Actor",
|
||||
),
|
||||
),
|
||||
],
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='LibraryTrack',
|
||||
name="LibraryTrack",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('url', models.URLField(unique=True)),
|
||||
('audio_url', models.URLField()),
|
||||
('audio_mimetype', models.CharField(max_length=200)),
|
||||
('creation_date', models.DateTimeField(default=django.utils.timezone.now)),
|
||||
('modification_date', models.DateTimeField(auto_now=True)),
|
||||
('fetched_date', models.DateTimeField(blank=True, null=True)),
|
||||
('published_date', models.DateTimeField(blank=True, null=True)),
|
||||
('artist_name', models.CharField(max_length=500)),
|
||||
('album_title', models.CharField(max_length=500)),
|
||||
('title', models.CharField(max_length=500)),
|
||||
('metadata', django.contrib.postgres.fields.jsonb.JSONField(default={}, max_length=10000)),
|
||||
('library', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='tracks', to='federation.Library')),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("url", models.URLField(unique=True)),
|
||||
("audio_url", models.URLField()),
|
||||
("audio_mimetype", models.CharField(max_length=200)),
|
||||
(
|
||||
"creation_date",
|
||||
models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
("modification_date", models.DateTimeField(auto_now=True)),
|
||||
("fetched_date", models.DateTimeField(blank=True, null=True)),
|
||||
("published_date", models.DateTimeField(blank=True, null=True)),
|
||||
("artist_name", models.CharField(max_length=500)),
|
||||
("album_title", models.CharField(max_length=500)),
|
||||
("title", models.CharField(max_length=500)),
|
||||
(
|
||||
"metadata",
|
||||
django.contrib.postgres.fields.jsonb.JSONField(
|
||||
default={}, max_length=10000
|
||||
),
|
||||
),
|
||||
(
|
||||
"library",
|
||||
models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
related_name="tracks",
|
||||
to="federation.Library",
|
||||
),
|
||||
),
|
||||
],
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='actor',
|
||||
name='followers',
|
||||
field=models.ManyToManyField(related_name='following', through='federation.Follow', to='federation.Actor'),
|
||||
model_name="actor",
|
||||
name="followers",
|
||||
field=models.ManyToManyField(
|
||||
related_name="following",
|
||||
through="federation.Follow",
|
||||
to="federation.Actor",
|
||||
),
|
||||
),
|
||||
migrations.AlterUniqueTogether(
|
||||
name='follow',
|
||||
unique_together={('actor', 'target')},
|
||||
name="follow", unique_together={("actor", "target")}
|
||||
),
|
||||
]
|
||||
|
|
|
@ -6,30 +6,26 @@ import django.db.models.deletion
|
|||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('federation', '0003_auto_20180407_1010'),
|
||||
]
|
||||
dependencies = [("federation", "0003_auto_20180407_1010")]
|
||||
|
||||
operations = [
|
||||
migrations.RemoveField(
|
||||
model_name='followrequest',
|
||||
name='actor',
|
||||
),
|
||||
migrations.RemoveField(
|
||||
model_name='followrequest',
|
||||
name='target',
|
||||
),
|
||||
migrations.RemoveField(model_name="followrequest", name="actor"),
|
||||
migrations.RemoveField(model_name="followrequest", name="target"),
|
||||
migrations.AddField(
|
||||
model_name='follow',
|
||||
name='approved',
|
||||
model_name="follow",
|
||||
name="approved",
|
||||
field=models.NullBooleanField(default=None),
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='library',
|
||||
name='follow',
|
||||
field=models.OneToOneField(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='library', to='federation.Follow'),
|
||||
),
|
||||
migrations.DeleteModel(
|
||||
name='FollowRequest',
|
||||
model_name="library",
|
||||
name="follow",
|
||||
field=models.OneToOneField(
|
||||
blank=True,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.SET_NULL,
|
||||
related_name="library",
|
||||
to="federation.Follow",
|
||||
),
|
||||
),
|
||||
migrations.DeleteModel(name="FollowRequest"),
|
||||
]
|
||||
|
|
|
@ -8,19 +8,25 @@ import funkwhale_api.federation.models
|
|||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('federation', '0004_auto_20180410_2025'),
|
||||
]
|
||||
dependencies = [("federation", "0004_auto_20180410_2025")]
|
||||
|
||||
operations = [
|
||||
migrations.AddField(
|
||||
model_name='librarytrack',
|
||||
name='audio_file',
|
||||
field=models.FileField(blank=True, null=True, upload_to=funkwhale_api.federation.models.get_file_path),
|
||||
model_name="librarytrack",
|
||||
name="audio_file",
|
||||
field=models.FileField(
|
||||
blank=True,
|
||||
null=True,
|
||||
upload_to=funkwhale_api.federation.models.get_file_path,
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='librarytrack',
|
||||
name='metadata',
|
||||
field=django.contrib.postgres.fields.jsonb.JSONField(default={}, encoder=django.core.serializers.json.DjangoJSONEncoder, max_length=10000),
|
||||
model_name="librarytrack",
|
||||
name="metadata",
|
||||
field=django.contrib.postgres.fields.jsonb.JSONField(
|
||||
default={},
|
||||
encoder=django.core.serializers.json.DjangoJSONEncoder,
|
||||
max_length=10000,
|
||||
),
|
||||
),
|
||||
]
|
||||
|
|
|
@ -5,24 +5,20 @@ from django.db import migrations, models
|
|||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('federation', '0005_auto_20180413_1723'),
|
||||
]
|
||||
dependencies = [("federation", "0005_auto_20180413_1723")]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='library',
|
||||
name='url',
|
||||
model_name="library", name="url", field=models.URLField(max_length=500)
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="librarytrack",
|
||||
name="audio_url",
|
||||
field=models.URLField(max_length=500),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='librarytrack',
|
||||
name='audio_url',
|
||||
field=models.URLField(max_length=500),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='librarytrack',
|
||||
name='url',
|
||||
model_name="librarytrack",
|
||||
name="url",
|
||||
field=models.URLField(max_length=500, unique=True),
|
||||
),
|
||||
]
|
||||
|
|
|
@ -1,6 +1,6 @@
|
|||
import os
|
||||
import uuid
|
||||
import tempfile
|
||||
import uuid
|
||||
|
||||
from django.conf import settings
|
||||
from django.contrib.postgres.fields import JSONField
|
||||
|
@ -12,16 +12,16 @@ from funkwhale_api.common import session
|
|||
from funkwhale_api.music import utils as music_utils
|
||||
|
||||
TYPE_CHOICES = [
|
||||
('Person', 'Person'),
|
||||
('Application', 'Application'),
|
||||
('Group', 'Group'),
|
||||
('Organization', 'Organization'),
|
||||
('Service', 'Service'),
|
||||
("Person", "Person"),
|
||||
("Application", "Application"),
|
||||
("Group", "Group"),
|
||||
("Organization", "Organization"),
|
||||
("Service", "Service"),
|
||||
]
|
||||
|
||||
|
||||
class Actor(models.Model):
|
||||
ap_type = 'Actor'
|
||||
ap_type = "Actor"
|
||||
|
||||
url = models.URLField(unique=True, max_length=500, db_index=True)
|
||||
outbox_url = models.URLField(max_length=500)
|
||||
|
@ -29,49 +29,41 @@ class Actor(models.Model):
|
|||
following_url = models.URLField(max_length=500, null=True, blank=True)
|
||||
followers_url = models.URLField(max_length=500, null=True, blank=True)
|
||||
shared_inbox_url = models.URLField(max_length=500, null=True, blank=True)
|
||||
type = models.CharField(
|
||||
choices=TYPE_CHOICES, default='Person', max_length=25)
|
||||
type = models.CharField(choices=TYPE_CHOICES, default="Person", max_length=25)
|
||||
name = models.CharField(max_length=200, null=True, blank=True)
|
||||
domain = models.CharField(max_length=1000)
|
||||
summary = models.CharField(max_length=500, null=True, blank=True)
|
||||
preferred_username = models.CharField(
|
||||
max_length=200, null=True, blank=True)
|
||||
preferred_username = models.CharField(max_length=200, null=True, blank=True)
|
||||
public_key = models.CharField(max_length=5000, null=True, blank=True)
|
||||
private_key = models.CharField(max_length=5000, null=True, blank=True)
|
||||
creation_date = models.DateTimeField(default=timezone.now)
|
||||
last_fetch_date = models.DateTimeField(
|
||||
default=timezone.now)
|
||||
last_fetch_date = models.DateTimeField(default=timezone.now)
|
||||
manually_approves_followers = models.NullBooleanField(default=None)
|
||||
followers = models.ManyToManyField(
|
||||
to='self',
|
||||
to="self",
|
||||
symmetrical=False,
|
||||
through='Follow',
|
||||
through_fields=('target', 'actor'),
|
||||
related_name='following',
|
||||
through="Follow",
|
||||
through_fields=("target", "actor"),
|
||||
related_name="following",
|
||||
)
|
||||
|
||||
class Meta:
|
||||
unique_together = ['domain', 'preferred_username']
|
||||
unique_together = ["domain", "preferred_username"]
|
||||
|
||||
@property
|
||||
def webfinger_subject(self):
|
||||
return '{}@{}'.format(
|
||||
self.preferred_username,
|
||||
settings.FEDERATION_HOSTNAME,
|
||||
)
|
||||
return "{}@{}".format(self.preferred_username, settings.FEDERATION_HOSTNAME)
|
||||
|
||||
@property
|
||||
def private_key_id(self):
|
||||
return '{}#main-key'.format(self.url)
|
||||
return "{}#main-key".format(self.url)
|
||||
|
||||
@property
|
||||
def mention_username(self):
|
||||
return '@{}@{}'.format(self.preferred_username, self.domain)
|
||||
return "@{}@{}".format(self.preferred_username, self.domain)
|
||||
|
||||
def save(self, **kwargs):
|
||||
lowercase_fields = [
|
||||
'domain',
|
||||
]
|
||||
lowercase_fields = ["domain"]
|
||||
for field in lowercase_fields:
|
||||
v = getattr(self, field, None)
|
||||
if v:
|
||||
|
@ -86,58 +78,54 @@ class Actor(models.Model):
|
|||
@property
|
||||
def is_system(self):
|
||||
from . import actors
|
||||
return all([
|
||||
settings.FEDERATION_HOSTNAME == self.domain,
|
||||
self.preferred_username in actors.SYSTEM_ACTORS
|
||||
])
|
||||
|
||||
return all(
|
||||
[
|
||||
settings.FEDERATION_HOSTNAME == self.domain,
|
||||
self.preferred_username in actors.SYSTEM_ACTORS,
|
||||
]
|
||||
)
|
||||
|
||||
@property
|
||||
def system_conf(self):
|
||||
from . import actors
|
||||
|
||||
if self.is_system:
|
||||
return actors.SYSTEM_ACTORS[self.preferred_username]
|
||||
|
||||
def get_approved_followers(self):
|
||||
follows = self.received_follows.filter(approved=True)
|
||||
return self.followers.filter(
|
||||
pk__in=follows.values_list('actor', flat=True))
|
||||
return self.followers.filter(pk__in=follows.values_list("actor", flat=True))
|
||||
|
||||
|
||||
class Follow(models.Model):
|
||||
ap_type = 'Follow'
|
||||
ap_type = "Follow"
|
||||
|
||||
uuid = models.UUIDField(default=uuid.uuid4, unique=True)
|
||||
actor = models.ForeignKey(
|
||||
Actor,
|
||||
related_name='emitted_follows',
|
||||
on_delete=models.CASCADE,
|
||||
Actor, related_name="emitted_follows", on_delete=models.CASCADE
|
||||
)
|
||||
target = models.ForeignKey(
|
||||
Actor,
|
||||
related_name='received_follows',
|
||||
on_delete=models.CASCADE,
|
||||
Actor, related_name="received_follows", on_delete=models.CASCADE
|
||||
)
|
||||
creation_date = models.DateTimeField(default=timezone.now)
|
||||
modification_date = models.DateTimeField(
|
||||
auto_now=True)
|
||||
modification_date = models.DateTimeField(auto_now=True)
|
||||
approved = models.NullBooleanField(default=None)
|
||||
|
||||
class Meta:
|
||||
unique_together = ['actor', 'target']
|
||||
unique_together = ["actor", "target"]
|
||||
|
||||
def get_federation_url(self):
|
||||
return '{}#follows/{}'.format(self.actor.url, self.uuid)
|
||||
return "{}#follows/{}".format(self.actor.url, self.uuid)
|
||||
|
||||
|
||||
class Library(models.Model):
|
||||
creation_date = models.DateTimeField(default=timezone.now)
|
||||
modification_date = models.DateTimeField(
|
||||
auto_now=True)
|
||||
modification_date = models.DateTimeField(auto_now=True)
|
||||
fetched_date = models.DateTimeField(null=True, blank=True)
|
||||
actor = models.OneToOneField(
|
||||
Actor,
|
||||
on_delete=models.CASCADE,
|
||||
related_name='library')
|
||||
Actor, on_delete=models.CASCADE, related_name="library"
|
||||
)
|
||||
uuid = models.UUIDField(default=uuid.uuid4)
|
||||
url = models.URLField(max_length=500)
|
||||
|
||||
|
@ -149,69 +137,60 @@ class Library(models.Model):
    autoimport = models.BooleanField()
    tracks_count = models.PositiveIntegerField(null=True, blank=True)
    follow = models.OneToOneField(
        Follow, related_name="library", null=True, blank=True, on_delete=models.SET_NULL
    )


def get_file_path(instance, filename):
    uid = str(uuid.uuid4())
    chunk_size = 2
    chunks = [uid[i : i + chunk_size] for i in range(0, len(uid), chunk_size)]
    parts = chunks[:3] + [filename]
    return os.path.join("federation_cache", *parts)


class LibraryTrack(models.Model):
    url = models.URLField(unique=True, max_length=500)
    audio_url = models.URLField(max_length=500)
    audio_mimetype = models.CharField(max_length=200)
    audio_file = models.FileField(upload_to=get_file_path, null=True, blank=True)

    creation_date = models.DateTimeField(default=timezone.now)
    modification_date = models.DateTimeField(auto_now=True)
    fetched_date = models.DateTimeField(null=True, blank=True)
    published_date = models.DateTimeField(null=True, blank=True)
    library = models.ForeignKey(
        Library, related_name="tracks", on_delete=models.CASCADE
    )
    artist_name = models.CharField(max_length=500)
    album_title = models.CharField(max_length=500)
    title = models.CharField(max_length=500)
    metadata = JSONField(default={}, max_length=10000, encoder=DjangoJSONEncoder)

    @property
    def mbid(self):
        try:
            return self.metadata["recording"]["musicbrainz_id"]
        except KeyError:
            pass

    def download_audio(self):
        from . import actors

        auth = actors.SYSTEM_ACTORS["library"].get_request_auth()
        remote_response = session.get_session().get(
            self.audio_url,
            auth=auth,
            stream=True,
            timeout=20,
            verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
            headers={"Content-Type": "application/activity+json"},
        )
        with remote_response as r:
            remote_response.raise_for_status()
            extension = music_utils.get_ext_from_type(self.audio_mimetype)
            title = " - ".join([self.title, self.album_title, self.artist_name])
            filename = "{}.{}".format(title, extension)
            tmp_file = tempfile.TemporaryFile()
            for chunk in r.iter_content(chunk_size=512):
                tmp_file.write(chunk)
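As a rough illustration (not from the commit), `get_file_path` shards cached uploads by the first chunks of a fresh random UUID before appending the original filename, so the exact path varies on every call:

path = get_file_path(None, "track.ogg")
# e.g. "federation_cache/3f/a8/5f/track.ogg" -- three 2-character shards
# taken from a uuid4 hex string, then the supplied filename.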
@ -2,4 +2,4 @@ from rest_framework import parsers
class ActivityParser(parsers.JSONParser):
    media_type = "application/activity+json"
@ -1,21 +1,19 @@
from django.conf import settings
from rest_framework.permissions import BasePermission

from funkwhale_api.common import preferences

from . import actors


class LibraryFollower(BasePermission):
    def has_permission(self, request, view):
        if not preferences.get("federation__music_needs_approval"):
            return True

        actor = getattr(request, "actor", None)
        if actor is None:
            return False

        library = actors.SYSTEM_ACTORS["library"].get_actor_instance()
        return library.received_follows.filter(approved=True, actor=actor).exists()
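A hedged sketch (not part of this commit) of how such a permission class is usually attached to a DRF view; the viewset name is hypothetical:

from rest_framework import viewsets

class LibraryViewSet(viewsets.GenericViewSet):  # hypothetical viewset
    # only approved followers of the library actor may browse these endpoints
    permission_classes = [LibraryFollower]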
@ -2,8 +2,8 @@ from rest_framework.renderers import JSONRenderer
class ActivityPubRenderer(JSONRenderer):
    media_type = "application/activity+json"


class WebfingerRenderer(JSONRenderer):
    media_type = "application/jrd+json"
File diff suppressed because it is too large
@ -1,18 +1,16 @@
import logging

import requests
import requests_http_signature

from . import exceptions, utils

logger = logging.getLogger(__name__)


def verify(request, public_key):
    return requests_http_signature.HTTPSignatureAuth.verify(
        request, key_resolver=lambda **kwargs: public_key, use_auth_header=False
    )
@ -27,44 +25,37 @@ def verify_django(django_request, public_key):
        # with requests_http_signature
        headers[h.lower()] = v
    try:
        signature = headers["Signature"]
    except KeyError:
        raise exceptions.MissingSignature
    url = "http://noop{}".format(django_request.path)
    query = django_request.META["QUERY_STRING"]
    if query:
        url += "?{}".format(query)
    signature_headers = signature.split('headers="')[1].split('",')[0]
    expected = signature_headers.split(" ")
    logger.debug("Signature expected headers: %s", expected)
    for header in expected:
        try:
            headers[header]
        except KeyError:
            logger.debug("Missing header: %s", header)
    request = requests.Request(
        method=django_request.method, url=url, data=django_request.body, headers=headers
    )
    for h in request.headers.keys():
        v = request.headers[h]
        if v:
            request.headers[h] = str(v)
    request.prepare()
    return verify(request, public_key)


def get_auth(private_key, private_key_id):
    return requests_http_signature.HTTPSignatureAuth(
        use_auth_header=False,
        headers=["(request-target)", "user-agent", "host", "date", "content-type"],
        algorithm="rsa-sha256",
        key=private_key.encode("utf-8"),
        key_id=private_key_id,
    )
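To make the outgoing side concrete, a hedged sketch (not from the diff) of attaching the signature auth produced by `get_auth` to a plain requests call, mirroring how the library and tasks modules pass `auth=auth`; the URLs are illustrative:

import requests

from funkwhale_api.federation import keys  # for a throwaway key pair

private_key, public_key = keys.get_key_pair()
auth = get_auth(private_key.decode("utf-8"), "https://example.com/actor#main-key")
# requests applies the HTTP Signature header to each outgoing call:
response = requests.get("https://example.com/inbox", auth=auth, timeout=5)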
@ -6,114 +6,114 @@ import os
from django.conf import settings
from django.db.models import Q
from django.utils import timezone
from dynamic_preferences.registries import global_preferences_registry
from requests.exceptions import RequestException

from funkwhale_api.common import session
from funkwhale_api.history.models import Listening
from funkwhale_api.taskapp import celery

from . import actors
from . import library as lb
from . import models, signing

logger = logging.getLogger(__name__)


@celery.app.task(
    name="federation.send",
    autoretry_for=[RequestException],
    retry_backoff=30,
    max_retries=5,
)
@celery.require_instance(models.Actor, "actor")
def send(activity, actor, to):
    logger.info("Preparing activity delivery to %s", to)
    auth = signing.get_auth(actor.private_key, actor.private_key_id)
    for url in to:
        recipient_actor = actors.get_actor(url)
        logger.debug("delivering to %s", recipient_actor.inbox_url)
        logger.debug("activity content: %s", json.dumps(activity))
        response = session.get_session().post(
            auth=auth,
            json=activity,
            url=recipient_actor.inbox_url,
            timeout=5,
            verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
            headers={"Content-Type": "application/activity+json"},
        )
        response.raise_for_status()
        logger.debug("Remote answered with %s", response.status_code)


@celery.app.task(
    name="federation.scan_library",
    autoretry_for=[RequestException],
    retry_backoff=30,
    max_retries=5,
)
@celery.require_instance(models.Library, "library")
def scan_library(library, until=None):
    if not library.federation_enabled:
        return

    data = lb.get_library_data(library.url)
    scan_library_page.delay(library_id=library.id, page_url=data["first"], until=until)
    library.fetched_date = timezone.now()
    library.tracks_count = data["totalItems"]
    library.save(update_fields=["fetched_date", "tracks_count"])


@celery.app.task(
    name="federation.scan_library_page",
    autoretry_for=[RequestException],
    retry_backoff=30,
    max_retries=5,
)
@celery.require_instance(models.Library, "library")
def scan_library_page(library, page_url, until=None):
    if not library.federation_enabled:
        return

    data = lb.get_library_page(library, page_url)
    lts = []
    for item_serializer in data["items"]:
        item_date = item_serializer.validated_data["published"]
        if until and item_date < until:
            return
        lts.append(item_serializer.save())

    next_page = data.get("next")
    if next_page and next_page != page_url:
        scan_library_page.delay(library_id=library.id, page_url=next_page)


@celery.app.task(name="federation.clean_music_cache")
def clean_music_cache():
    preferences = global_preferences_registry.manager()
    delay = preferences["federation__music_cache_duration"]
    if delay < 1:
        return  # cache clearing disabled
    limit = timezone.now() - datetime.timedelta(minutes=delay)

    candidates = (
        models.LibraryTrack.objects.filter(
            Q(audio_file__isnull=False)
            & (
                Q(local_track_file__accessed_date__lt=limit)
                | Q(local_track_file__accessed_date=None)
            )
        )
        .exclude(audio_file="")
        .only("audio_file", "id")
    )
    for lt in candidates:
        lt.audio_file.delete()

    # we also delete orphaned files, if any
    storage = models.LibraryTrack._meta.get_field("audio_file").storage
|
||||
files = get_files(storage, "federation_cache")
|
||||
existing = models.LibraryTrack.objects.filter(audio_file__in=files)
|
||||
missing = set(files) - set(existing.values_list('audio_file', flat=True))
|
||||
missing = set(files) - set(existing.values_list("audio_file", flat=True))
|
||||
for m in missing:
|
||||
storage.delete(m)
|
||||
|
||||
|
@ -124,12 +124,9 @@ def get_files(storage, *parts):
|
|||
in a given directory using django's storage.
|
||||
"""
|
||||
if not parts:
|
||||
raise ValueError('Missing path')
|
||||
raise ValueError("Missing path")
|
||||
|
||||
dirs, files = storage.listdir(os.path.join(*parts))
|
||||
for dir in dirs:
|
||||
files += get_files(storage, *(list(parts) + [dir]))
|
||||
return [
|
||||
os.path.join(parts[-1], path)
|
||||
for path in files
|
||||
]
|
||||
return [os.path.join(parts[-1], path) for path in files]
|
||||
|
|
|
@ -1,24 +1,16 @@
|
|||
from django.conf.urls import include, url
|
||||
|
||||
from rest_framework import routers
|
||||
|
||||
from . import views
|
||||
|
||||
router = routers.SimpleRouter(trailing_slash=False)
|
||||
music_router = routers.SimpleRouter(trailing_slash=False)
|
||||
router.register(
|
||||
r'federation/instance/actors',
|
||||
views.InstanceActorViewSet,
|
||||
'instance-actors')
|
||||
router.register(
|
||||
r'.well-known',
|
||||
views.WellKnownViewSet,
|
||||
'well-known')
|
||||
|
||||
music_router.register(
|
||||
r'files',
|
||||
views.MusicFilesViewSet,
|
||||
'files',
|
||||
r"federation/instance/actors", views.InstanceActorViewSet, "instance-actors"
|
||||
)
|
||||
router.register(r".well-known", views.WellKnownViewSet, "well-known")
|
||||
|
||||
music_router.register(r"files", views.MusicFilesViewSet, "files")
|
||||
urlpatterns = router.urls + [
|
||||
url('federation/music/', include((music_router.urls, 'music'), namespace='music'))
|
||||
url("federation/music/", include((music_router.urls, "music"), namespace="music"))
|
||||
]
|
||||
|
|
|
@ -6,10 +6,10 @@ def full_url(path):
|
|||
Given a relative path, return a full url usable for federation purpose
|
||||
"""
|
||||
root = settings.FUNKWHALE_URL
|
||||
if path.startswith('/') and root.endswith('/'):
|
||||
if path.startswith("/") and root.endswith("/"):
|
||||
return root + path[1:]
|
||||
elif not path.startswith('/') and not root.endswith('/'):
|
||||
return root + '/' + path
|
||||
elif not path.startswith("/") and not root.endswith("/"):
|
||||
return root + "/" + path
|
||||
else:
|
||||
return root + path
|
||||
|
||||
|
@ -19,17 +19,14 @@ def clean_wsgi_headers(raw_headers):
|
|||
Convert WSGI headers from CONTENT_TYPE to Content-Type notation
|
||||
"""
|
||||
cleaned = {}
|
||||
non_prefixed = [
|
||||
'content_type',
|
||||
'content_length',
|
||||
]
|
||||
non_prefixed = ["content_type", "content_length"]
|
||||
for raw_header, value in raw_headers.items():
|
||||
h = raw_header.lower()
|
||||
if not h.startswith('http_') and h not in non_prefixed:
|
||||
if not h.startswith("http_") and h not in non_prefixed:
|
||||
continue
|
||||
|
||||
words = h.replace('http_', '', 1).split('_')
|
||||
cleaned_header = '-'.join([w.capitalize() for w in words])
|
||||
words = h.replace("http_", "", 1).split("_")
|
||||
cleaned_header = "-".join([w.capitalize() for w in words])
|
||||
cleaned[cleaned_header] = value
|
||||
|
||||
return cleaned
|
||||
|
|
|
@ -1,55 +1,47 @@
|
|||
from django import forms
|
||||
from django.conf import settings
|
||||
from django.core import paginator
|
||||
from django.db import transaction
|
||||
from django.http import HttpResponse
|
||||
from django.http import HttpResponse, Http404
|
||||
from django.urls import reverse
|
||||
|
||||
from rest_framework import mixins
|
||||
from rest_framework import permissions as rest_permissions
|
||||
from rest_framework import response
|
||||
from rest_framework import views
|
||||
from rest_framework import viewsets
|
||||
from rest_framework.decorators import list_route, detail_route
|
||||
from rest_framework.serializers import ValidationError
|
||||
from rest_framework import mixins, response, viewsets
|
||||
from rest_framework.decorators import detail_route, list_route
|
||||
|
||||
from funkwhale_api.common import preferences
|
||||
from funkwhale_api.common import utils as funkwhale_utils
|
||||
from funkwhale_api.music import models as music_models
|
||||
from funkwhale_api.users.permissions import HasUserPermission
|
||||
|
||||
from . import activity
|
||||
from . import actors
|
||||
from . import authentication
|
||||
from . import filters
|
||||
from . import library
|
||||
from . import models
|
||||
from . import permissions
|
||||
from . import renderers
|
||||
from . import serializers
|
||||
from . import tasks
|
||||
from . import utils
|
||||
from . import webfinger
|
||||
from . import (
|
||||
actors,
|
||||
authentication,
|
||||
filters,
|
||||
library,
|
||||
models,
|
||||
permissions,
|
||||
renderers,
|
||||
serializers,
|
||||
tasks,
|
||||
utils,
|
||||
webfinger,
|
||||
)
|
||||
|
||||
|
||||
class FederationMixin(object):
|
||||
def dispatch(self, request, *args, **kwargs):
|
||||
if not preferences.get('federation__enabled'):
|
||||
if not preferences.get("federation__enabled"):
|
||||
return HttpResponse(status=405)
|
||||
return super().dispatch(request, *args, **kwargs)
|
||||
|
||||
|
||||
class InstanceActorViewSet(FederationMixin, viewsets.GenericViewSet):
|
||||
lookup_field = 'actor'
|
||||
lookup_value_regex = '[a-z]*'
|
||||
authentication_classes = [
|
||||
authentication.SignatureAuthentication]
|
||||
lookup_field = "actor"
|
||||
lookup_value_regex = "[a-z]*"
|
||||
authentication_classes = [authentication.SignatureAuthentication]
|
||||
permission_classes = []
|
||||
renderer_classes = [renderers.ActivityPubRenderer]
|
||||
|
||||
def get_object(self):
|
||||
try:
|
||||
return actors.SYSTEM_ACTORS[self.kwargs['actor']]
|
||||
return actors.SYSTEM_ACTORS[self.kwargs["actor"]]
|
||||
except KeyError:
|
||||
raise Http404
|
||||
|
||||
|
@ -59,27 +51,23 @@ class InstanceActorViewSet(FederationMixin, viewsets.GenericViewSet):
|
|||
data = actor.system_conf.serialize()
|
||||
return response.Response(data, status=200)
|
||||
|
||||
@detail_route(methods=['get', 'post'])
|
||||
@detail_route(methods=["get", "post"])
|
||||
def inbox(self, request, *args, **kwargs):
|
||||
system_actor = self.get_object()
|
||||
handler = getattr(system_actor, '{}_inbox'.format(
|
||||
request.method.lower()
|
||||
))
|
||||
handler = getattr(system_actor, "{}_inbox".format(request.method.lower()))
|
||||
|
||||
try:
|
||||
data = handler(request.data, actor=request.actor)
|
||||
handler(request.data, actor=request.actor)
|
||||
except NotImplementedError:
|
||||
return response.Response(status=405)
|
||||
return response.Response({}, status=200)
|
||||
|
||||
@detail_route(methods=['get', 'post'])
|
||||
@detail_route(methods=["get", "post"])
|
||||
def outbox(self, request, *args, **kwargs):
|
||||
system_actor = self.get_object()
|
||||
handler = getattr(system_actor, '{}_outbox'.format(
|
||||
request.method.lower()
|
||||
))
|
||||
handler = getattr(system_actor, "{}_outbox".format(request.method.lower()))
|
||||
try:
|
||||
data = handler(request.data, actor=request.actor)
|
||||
handler(request.data, actor=request.actor)
|
||||
except NotImplementedError:
|
||||
return response.Response(status=405)
|
||||
return response.Response({}, status=200)
|
||||
|
@ -90,45 +78,36 @@ class WellKnownViewSet(viewsets.GenericViewSet):
|
|||
permission_classes = []
|
||||
renderer_classes = [renderers.JSONRenderer, renderers.WebfingerRenderer]
|
||||
|
||||
@list_route(methods=['get'])
|
||||
@list_route(methods=["get"])
|
||||
def nodeinfo(self, request, *args, **kwargs):
|
||||
if not preferences.get('instance__nodeinfo_enabled'):
|
||||
if not preferences.get("instance__nodeinfo_enabled"):
|
||||
return HttpResponse(status=404)
|
||||
data = {
|
||||
'links': [
|
||||
"links": [
|
||||
{
|
||||
'rel': 'http://nodeinfo.diaspora.software/ns/schema/2.0',
|
||||
'href': utils.full_url(
|
||||
reverse('api:v1:instance:nodeinfo-2.0')
|
||||
)
|
||||
"rel": "http://nodeinfo.diaspora.software/ns/schema/2.0",
|
||||
"href": utils.full_url(reverse("api:v1:instance:nodeinfo-2.0")),
|
||||
}
|
||||
]
|
||||
}
|
||||
return response.Response(data)
|
||||
|
||||
@list_route(methods=['get'])
|
||||
@list_route(methods=["get"])
|
||||
def webfinger(self, request, *args, **kwargs):
|
||||
if not preferences.get('federation__enabled'):
|
||||
if not preferences.get("federation__enabled"):
|
||||
return HttpResponse(status=405)
|
||||
try:
|
||||
resource_type, resource = webfinger.clean_resource(
|
||||
request.GET['resource'])
|
||||
cleaner = getattr(webfinger, 'clean_{}'.format(resource_type))
|
||||
resource_type, resource = webfinger.clean_resource(request.GET["resource"])
|
||||
cleaner = getattr(webfinger, "clean_{}".format(resource_type))
|
||||
result = cleaner(resource)
|
||||
except forms.ValidationError as e:
|
||||
return response.Response({
|
||||
'errors': {
|
||||
'resource': e.message
|
||||
}
|
||||
}, status=400)
|
||||
return response.Response({"errors": {"resource": e.message}}, status=400)
|
||||
except KeyError:
|
||||
return response.Response({
|
||||
'errors': {
|
||||
'resource': 'This field is required',
|
||||
}
|
||||
}, status=400)
|
||||
return response.Response(
|
||||
{"errors": {"resource": "This field is required"}}, status=400
|
||||
)
|
||||
|
||||
handler = getattr(self, 'handler_{}'.format(resource_type))
|
||||
handler = getattr(self, "handler_{}".format(resource_type))
|
||||
data = handler(result)
|
||||
|
||||
return response.Response(data)
|
||||
|
@ -140,46 +119,43 @@ class WellKnownViewSet(viewsets.GenericViewSet):
|
|||
|
||||
|
||||
class MusicFilesViewSet(FederationMixin, viewsets.GenericViewSet):
|
||||
authentication_classes = [
|
||||
authentication.SignatureAuthentication]
|
||||
authentication_classes = [authentication.SignatureAuthentication]
|
||||
permission_classes = [permissions.LibraryFollower]
|
||||
renderer_classes = [renderers.ActivityPubRenderer]
|
||||
|
||||
def list(self, request, *args, **kwargs):
|
||||
page = request.GET.get('page')
|
||||
library = actors.SYSTEM_ACTORS['library'].get_actor_instance()
|
||||
qs = music_models.TrackFile.objects.order_by(
|
||||
'-creation_date'
|
||||
).select_related(
|
||||
'track__artist',
|
||||
'track__album__artist'
|
||||
).filter(library_track__isnull=True)
|
||||
page = request.GET.get("page")
|
||||
library = actors.SYSTEM_ACTORS["library"].get_actor_instance()
|
||||
qs = (
|
||||
music_models.TrackFile.objects.order_by("-creation_date")
|
||||
.select_related("track__artist", "track__album__artist")
|
||||
.filter(library_track__isnull=True)
|
||||
)
|
||||
if page is None:
|
||||
conf = {
|
||||
'id': utils.full_url(reverse('federation:music:files-list')),
|
||||
'page_size': preferences.get(
|
||||
'federation__collection_page_size'),
|
||||
'items': qs,
|
||||
'item_serializer': serializers.AudioSerializer,
|
||||
'actor': library,
|
||||
"id": utils.full_url(reverse("federation:music:files-list")),
|
||||
"page_size": preferences.get("federation__collection_page_size"),
|
||||
"items": qs,
|
||||
"item_serializer": serializers.AudioSerializer,
|
||||
"actor": library,
|
||||
}
|
||||
serializer = serializers.PaginatedCollectionSerializer(conf)
|
||||
data = serializer.data
|
||||
else:
|
||||
try:
|
||||
page_number = int(page)
|
||||
except:
|
||||
return response.Response(
|
||||
{'page': ['Invalid page number']}, status=400)
|
||||
except Exception:
|
||||
return response.Response({"page": ["Invalid page number"]}, status=400)
|
||||
p = paginator.Paginator(
|
||||
qs, preferences.get('federation__collection_page_size'))
|
||||
qs, preferences.get("federation__collection_page_size")
|
||||
)
|
||||
try:
|
||||
page = p.page(page_number)
|
||||
conf = {
|
||||
'id': utils.full_url(reverse('federation:music:files-list')),
|
||||
'page': page,
|
||||
'item_serializer': serializers.AudioSerializer,
|
||||
'actor': library,
|
||||
"id": utils.full_url(reverse("federation:music:files-list")),
|
||||
"page": page,
|
||||
"item_serializer": serializers.AudioSerializer,
|
||||
"actor": library,
|
||||
}
|
||||
serializer = serializers.CollectionPageSerializer(conf)
|
||||
data = serializer.data
|
||||
|
@ -190,134 +166,112 @@ class MusicFilesViewSet(FederationMixin, viewsets.GenericViewSet):
|
|||
|
||||
|
||||
class LibraryViewSet(
|
||||
mixins.RetrieveModelMixin,
|
||||
mixins.UpdateModelMixin,
|
||||
mixins.ListModelMixin,
|
||||
viewsets.GenericViewSet):
|
||||
mixins.RetrieveModelMixin,
|
||||
mixins.UpdateModelMixin,
|
||||
mixins.ListModelMixin,
|
||||
viewsets.GenericViewSet,
|
||||
):
|
||||
permission_classes = (HasUserPermission,)
|
||||
required_permissions = ['federation']
|
||||
queryset = models.Library.objects.all().select_related(
|
||||
'actor',
|
||||
'follow',
|
||||
)
|
||||
lookup_field = 'uuid'
|
||||
required_permissions = ["federation"]
|
||||
queryset = models.Library.objects.all().select_related("actor", "follow")
|
||||
lookup_field = "uuid"
|
||||
filter_class = filters.LibraryFilter
|
||||
serializer_class = serializers.APILibrarySerializer
|
||||
ordering_fields = (
|
||||
'id',
|
||||
'creation_date',
|
||||
'fetched_date',
|
||||
'actor__domain',
|
||||
'tracks_count',
|
||||
"id",
|
||||
"creation_date",
|
||||
"fetched_date",
|
||||
"actor__domain",
|
||||
"tracks_count",
|
||||
)
|
||||
|
||||
@list_route(methods=['get'])
|
||||
@list_route(methods=["get"])
|
||||
def fetch(self, request, *args, **kwargs):
|
||||
account = request.GET.get('account')
|
||||
account = request.GET.get("account")
|
||||
if not account:
|
||||
return response.Response(
|
||||
{'account': 'This field is mandatory'}, status=400)
|
||||
return response.Response({"account": "This field is mandatory"}, status=400)
|
||||
|
||||
data = library.scan_from_account_name(account)
|
||||
return response.Response(data)
|
||||
|
||||
@detail_route(methods=['post'])
|
||||
@detail_route(methods=["post"])
|
||||
def scan(self, request, *args, **kwargs):
|
||||
library = self.get_object()
|
||||
serializer = serializers.APILibraryScanSerializer(
|
||||
data=request.data
|
||||
)
|
||||
serializer = serializers.APILibraryScanSerializer(data=request.data)
|
||||
serializer.is_valid(raise_exception=True)
|
||||
result = tasks.scan_library.delay(
|
||||
library_id=library.pk,
|
||||
until=serializer.validated_data.get('until')
|
||||
library_id=library.pk, until=serializer.validated_data.get("until")
|
||||
)
|
||||
return response.Response({'task': result.id})
|
||||
return response.Response({"task": result.id})
|
||||
|
||||
@list_route(methods=['get'])
|
||||
@list_route(methods=["get"])
|
||||
def following(self, request, *args, **kwargs):
|
||||
library_actor = actors.SYSTEM_ACTORS['library'].get_actor_instance()
|
||||
queryset = models.Follow.objects.filter(
|
||||
actor=library_actor
|
||||
).select_related(
|
||||
'actor',
|
||||
'target',
|
||||
).order_by('-creation_date')
|
||||
library_actor = actors.SYSTEM_ACTORS["library"].get_actor_instance()
|
||||
queryset = (
|
||||
models.Follow.objects.filter(actor=library_actor)
|
||||
.select_related("actor", "target")
|
||||
.order_by("-creation_date")
|
||||
)
|
||||
filterset = filters.FollowFilter(request.GET, queryset=queryset)
|
||||
final_qs = filterset.qs
|
||||
serializer = serializers.APIFollowSerializer(final_qs, many=True)
|
||||
data = {
|
||||
'results': serializer.data,
|
||||
'count': len(final_qs),
|
||||
}
|
||||
data = {"results": serializer.data, "count": len(final_qs)}
|
||||
return response.Response(data)
|
||||
|
||||
@list_route(methods=['get', 'patch'])
|
||||
@list_route(methods=["get", "patch"])
|
||||
def followers(self, request, *args, **kwargs):
|
||||
if request.method.lower() == 'patch':
|
||||
serializer = serializers.APILibraryFollowUpdateSerializer(
|
||||
data=request.data)
|
||||
if request.method.lower() == "patch":
|
||||
serializer = serializers.APILibraryFollowUpdateSerializer(data=request.data)
|
||||
serializer.is_valid(raise_exception=True)
|
||||
follow = serializer.save()
|
||||
return response.Response(
|
||||
serializers.APIFollowSerializer(follow).data
|
||||
)
|
||||
return response.Response(serializers.APIFollowSerializer(follow).data)
|
||||
|
||||
library_actor = actors.SYSTEM_ACTORS['library'].get_actor_instance()
|
||||
queryset = models.Follow.objects.filter(
|
||||
target=library_actor
|
||||
).select_related(
|
||||
'actor',
|
||||
'target',
|
||||
).order_by('-creation_date')
|
||||
library_actor = actors.SYSTEM_ACTORS["library"].get_actor_instance()
|
||||
queryset = (
|
||||
models.Follow.objects.filter(target=library_actor)
|
||||
.select_related("actor", "target")
|
||||
.order_by("-creation_date")
|
||||
)
|
||||
filterset = filters.FollowFilter(request.GET, queryset=queryset)
|
||||
final_qs = filterset.qs
|
||||
serializer = serializers.APIFollowSerializer(final_qs, many=True)
|
||||
data = {
|
||||
'results': serializer.data,
|
||||
'count': len(final_qs),
|
||||
}
|
||||
data = {"results": serializer.data, "count": len(final_qs)}
|
||||
return response.Response(data)
|
||||
|
||||
@transaction.atomic
|
||||
def create(self, request, *args, **kwargs):
|
||||
serializer = serializers.APILibraryCreateSerializer(data=request.data)
|
||||
serializer.is_valid(raise_exception=True)
|
||||
library = serializer.save()
|
||||
serializer.save()
|
||||
return response.Response(serializer.data, status=201)
|
||||
|
||||
|
||||
class LibraryTrackViewSet(
|
||||
mixins.ListModelMixin,
|
||||
viewsets.GenericViewSet):
|
||||
class LibraryTrackViewSet(mixins.ListModelMixin, viewsets.GenericViewSet):
|
||||
permission_classes = (HasUserPermission,)
|
||||
required_permissions = ['federation']
|
||||
queryset = models.LibraryTrack.objects.all().select_related(
|
||||
'library__actor',
|
||||
'library__follow',
|
||||
'local_track_file',
|
||||
).prefetch_related('import_jobs')
|
||||
required_permissions = ["federation"]
|
||||
queryset = (
|
||||
models.LibraryTrack.objects.all()
|
||||
.select_related("library__actor", "library__follow", "local_track_file")
|
||||
.prefetch_related("import_jobs")
|
||||
)
|
||||
filter_class = filters.LibraryTrackFilter
|
||||
serializer_class = serializers.APILibraryTrackSerializer
|
||||
ordering_fields = (
|
||||
'id',
|
||||
'artist_name',
|
||||
'title',
|
||||
'album_title',
|
||||
'creation_date',
|
||||
'modification_date',
|
||||
'fetched_date',
|
||||
'published_date',
|
||||
"id",
|
||||
"artist_name",
|
||||
"title",
|
||||
"album_title",
|
||||
"creation_date",
|
||||
"modification_date",
|
||||
"fetched_date",
|
||||
"published_date",
|
||||
)
|
||||
|
||||
@list_route(methods=['post'])
|
||||
@list_route(methods=["post"])
|
||||
def action(self, request, *args, **kwargs):
|
||||
queryset = models.LibraryTrack.objects.filter(
|
||||
local_track_file__isnull=True)
|
||||
queryset = models.LibraryTrack.objects.filter(local_track_file__isnull=True)
|
||||
serializer = serializers.LibraryTrackActionSerializer(
|
||||
request.data,
|
||||
queryset=queryset,
|
||||
context={'submitted_by': request.user}
|
||||
request.data, queryset=queryset, context={"submitted_by": request.user}
|
||||
)
|
||||
serializer.is_valid(raise_exception=True)
|
||||
result = serializer.save()
|
||||
|
|
|
@ -1,43 +1,39 @@
|
|||
from django import forms
|
||||
from django.conf import settings
|
||||
from django.urls import reverse
|
||||
|
||||
from funkwhale_api.common import session
|
||||
|
||||
from . import actors
|
||||
from . import utils
|
||||
from . import serializers
|
||||
from . import actors, serializers
|
||||
|
||||
VALID_RESOURCE_TYPES = ['acct']
|
||||
VALID_RESOURCE_TYPES = ["acct"]
|
||||
|
||||
|
||||
def clean_resource(resource_string):
|
||||
if not resource_string:
|
||||
raise forms.ValidationError('Invalid resource string')
|
||||
raise forms.ValidationError("Invalid resource string")
|
||||
|
||||
try:
|
||||
resource_type, resource = resource_string.split(':', 1)
|
||||
resource_type, resource = resource_string.split(":", 1)
|
||||
except ValueError:
|
||||
raise forms.ValidationError('Missing webfinger resource type')
|
||||
raise forms.ValidationError("Missing webfinger resource type")
|
||||
|
||||
if resource_type not in VALID_RESOURCE_TYPES:
|
||||
raise forms.ValidationError('Invalid webfinger resource type')
|
||||
raise forms.ValidationError("Invalid webfinger resource type")
|
||||
|
||||
return resource_type, resource
|
||||
|
||||
|
||||
def clean_acct(acct_string, ensure_local=True):
|
||||
try:
|
||||
username, hostname = acct_string.split('@')
|
||||
username, hostname = acct_string.split("@")
|
||||
except ValueError:
|
||||
raise forms.ValidationError('Invalid format')
|
||||
raise forms.ValidationError("Invalid format")
|
||||
|
||||
if ensure_local and hostname.lower() != settings.FEDERATION_HOSTNAME:
|
||||
raise forms.ValidationError(
|
||||
'Invalid hostname {}'.format(hostname))
|
||||
raise forms.ValidationError("Invalid hostname {}".format(hostname))
|
||||
|
||||
if ensure_local and username not in actors.SYSTEM_ACTORS:
|
||||
raise forms.ValidationError('Invalid username')
|
||||
raise forms.ValidationError("Invalid username")
|
||||
|
||||
return username, hostname
|
||||
|
||||
|
@ -45,12 +41,12 @@ def clean_acct(acct_string, ensure_local=True):
|
|||
def get_resource(resource_string):
|
||||
resource_type, resource = clean_resource(resource_string)
|
||||
username, hostname = clean_acct(resource, ensure_local=False)
|
||||
url = 'https://{}/.well-known/webfinger?resource={}'.format(
|
||||
hostname, resource_string)
|
||||
url = "https://{}/.well-known/webfinger?resource={}".format(
|
||||
hostname, resource_string
|
||||
)
|
||||
response = session.get_session().get(
|
||||
url,
|
||||
verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL,
|
||||
timeout=5)
|
||||
url, verify=settings.EXTERNAL_REQUESTS_VERIFY_SSL, timeout=5
|
||||
)
|
||||
response.raise_for_status()
|
||||
serializer = serializers.ActorWebfingerSerializer(data=response.json())
|
||||
serializer.is_valid(raise_exception=True)
|
||||
|
|
|
@ -1,19 +1,16 @@
|
|||
from funkwhale_api.common import channels
|
||||
from funkwhale_api.activity import record
|
||||
from funkwhale_api.common import channels
|
||||
|
||||
from . import serializers
|
||||
|
||||
record.registry.register_serializer(
|
||||
serializers.ListeningActivitySerializer)
|
||||
record.registry.register_serializer(serializers.ListeningActivitySerializer)
|
||||
|
||||
|
||||
@record.registry.register_consumer('history.Listening')
|
||||
@record.registry.register_consumer("history.Listening")
|
||||
def broadcast_listening_to_instance_activity(data, obj):
|
||||
if obj.user.privacy_level not in ['instance', 'everyone']:
|
||||
if obj.user.privacy_level not in ["instance", "everyone"]:
|
||||
return
|
||||
|
||||
channels.group_send('instance_activity', {
|
||||
'type': 'event.send',
|
||||
'text': '',
|
||||
'data': data
|
||||
})
|
||||
channels.group_send(
|
||||
"instance_activity", {"type": "event.send", "text": "", "data": data}
|
||||
)
|
||||
|
|
|
@ -2,11 +2,9 @@ from django.contrib import admin
|
|||
|
||||
from . import models
|
||||
|
||||
|
||||
@admin.register(models.Listening)
|
||||
class ListeningAdmin(admin.ModelAdmin):
|
||||
list_display = ['track', 'creation_date', 'user', 'session_key']
|
||||
search_fields = ['track__name', 'user__username']
|
||||
list_select_related = [
|
||||
'user',
|
||||
'track'
|
||||
]
|
||||
list_display = ["track", "creation_date", "user", "session_key"]
|
||||
search_fields = ["track__name", "user__username"]
|
||||
list_select_related = ["user", "track"]
|
||||
|
|
|
@ -11,4 +11,4 @@ class ListeningFactory(factory.django.DjangoModelFactory):
|
|||
track = factory.SubFactory(factories.TrackFactory)
|
||||
|
||||
class Meta:
|
||||
model = 'history.Listening'
|
||||
model = "history.Listening"
|
||||
|
|
|
@ -9,22 +9,52 @@ import django.utils.timezone
|
|||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('music', '0008_auto_20160529_1456'),
|
||||
("music", "0008_auto_20160529_1456"),
|
||||
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.CreateModel(
|
||||
name='Listening',
|
||||
name="Listening",
|
||||
fields=[
|
||||
('id', models.AutoField(verbose_name='ID', primary_key=True, serialize=False, auto_created=True)),
|
||||
('end_date', models.DateTimeField(null=True, blank=True, default=django.utils.timezone.now)),
|
||||
('session_key', models.CharField(null=True, blank=True, max_length=100)),
|
||||
('track', models.ForeignKey(related_name='listenings', to='music.Track', on_delete=models.CASCADE)),
|
||||
('user', models.ForeignKey(blank=True, null=True, related_name='listenings', to=settings.AUTH_USER_MODEL, on_delete=models.CASCADE)),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
verbose_name="ID",
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
auto_created=True,
|
||||
),
|
||||
),
|
||||
(
|
||||
"end_date",
|
||||
models.DateTimeField(
|
||||
null=True, blank=True, default=django.utils.timezone.now
|
||||
),
|
||||
),
|
||||
(
|
||||
"session_key",
|
||||
models.CharField(null=True, blank=True, max_length=100),
|
||||
),
|
||||
(
|
||||
"track",
|
||||
models.ForeignKey(
|
||||
related_name="listenings",
|
||||
to="music.Track",
|
||||
on_delete=models.CASCADE,
|
||||
),
|
||||
),
|
||||
(
|
||||
"user",
|
||||
models.ForeignKey(
|
||||
blank=True,
|
||||
null=True,
|
||||
related_name="listenings",
|
||||
to=settings.AUTH_USER_MODEL,
|
||||
on_delete=models.CASCADE,
|
||||
),
|
||||
),
|
||||
],
|
||||
options={
|
||||
'ordering': ('-end_date',),
|
||||
},
|
||||
),
|
||||
options={"ordering": ("-end_date",)},
|
||||
)
|
||||
]
|
||||
|
|
|
@ -5,18 +5,13 @@ from django.db import migrations
|
|||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('history', '0001_initial'),
|
||||
]
|
||||
dependencies = [("history", "0001_initial")]
|
||||
|
||||
operations = [
|
||||
migrations.AlterModelOptions(
|
||||
name='listening',
|
||||
options={'ordering': ('-creation_date',)},
|
||||
name="listening", options={"ordering": ("-creation_date",)}
|
||||
),
|
||||
migrations.RenameField(
|
||||
model_name='listening',
|
||||
old_name='end_date',
|
||||
new_name='creation_date',
|
||||
model_name="listening", old_name="end_date", new_name="creation_date"
|
||||
),
|
||||
]
|
||||
|
|
|
@ -1,26 +1,25 @@
|
|||
from django.utils import timezone
|
||||
from django.db import models
|
||||
from django.core.exceptions import ValidationError
|
||||
from django.utils import timezone
|
||||
|
||||
from funkwhale_api.music.models import Track
|
||||
|
||||
|
||||
class Listening(models.Model):
|
||||
creation_date = models.DateTimeField(
|
||||
default=timezone.now, null=True, blank=True)
|
||||
creation_date = models.DateTimeField(default=timezone.now, null=True, blank=True)
|
||||
track = models.ForeignKey(
|
||||
Track, related_name="listenings", on_delete=models.CASCADE)
|
||||
Track, related_name="listenings", on_delete=models.CASCADE
|
||||
)
|
||||
user = models.ForeignKey(
|
||||
'users.User',
|
||||
"users.User",
|
||||
related_name="listenings",
|
||||
null=True,
|
||||
blank=True,
|
||||
on_delete=models.CASCADE)
|
||||
on_delete=models.CASCADE,
|
||||
)
|
||||
session_key = models.CharField(max_length=100, null=True, blank=True)
|
||||
|
||||
class Meta:
|
||||
ordering = ('-creation_date',)
|
||||
ordering = ("-creation_date",)
|
||||
|
||||
def get_activity_url(self):
|
||||
return '{}/listenings/tracks/{}'.format(
|
||||
self.user.get_activity_url(), self.pk)
|
||||
return "{}/listenings/tracks/{}".format(self.user.get_activity_url(), self.pk)
|
||||
|
|
|
@ -9,35 +9,27 @@ from . import models
|
|||
|
||||
class ListeningActivitySerializer(activity_serializers.ModelSerializer):
|
||||
type = serializers.SerializerMethodField()
|
||||
object = TrackActivitySerializer(source='track')
|
||||
actor = UserActivitySerializer(source='user')
|
||||
published = serializers.DateTimeField(source='creation_date')
|
||||
object = TrackActivitySerializer(source="track")
|
||||
actor = UserActivitySerializer(source="user")
|
||||
published = serializers.DateTimeField(source="creation_date")
|
||||
|
||||
class Meta:
|
||||
model = models.Listening
|
||||
fields = [
|
||||
'id',
|
||||
'local_id',
|
||||
'object',
|
||||
'type',
|
||||
'actor',
|
||||
'published'
|
||||
]
|
||||
fields = ["id", "local_id", "object", "type", "actor", "published"]
|
||||
|
||||
def get_actor(self, obj):
|
||||
return UserActivitySerializer(obj.user).data
|
||||
|
||||
def get_type(self, obj):
|
||||
return 'Listen'
|
||||
return "Listen"
|
||||
|
||||
|
||||
class ListeningSerializer(serializers.ModelSerializer):
|
||||
|
||||
class Meta:
|
||||
model = models.Listening
|
||||
fields = ('id', 'user', 'track', 'creation_date')
|
||||
fields = ("id", "user", "track", "creation_date")
|
||||
|
||||
def create(self, validated_data):
|
||||
validated_data['user'] = self.context['user']
|
||||
validated_data["user"] = self.context["user"]
|
||||
|
||||
return super().create(validated_data)
|
||||
|
|
|
@ -1,8 +1,8 @@
|
|||
from django.conf.urls import include, url
|
||||
from rest_framework import routers
|
||||
|
||||
from . import views
|
||||
|
||||
from rest_framework import routers
|
||||
router = routers.SimpleRouter()
|
||||
router.register(r'listenings', views.ListeningViewSet, 'listenings')
|
||||
router.register(r"listenings", views.ListeningViewSet, "listenings")
|
||||
|
||||
urlpatterns = router.urls
|
||||
|
|
|
@ -1,20 +1,13 @@
|
|||
from rest_framework import generics, mixins, viewsets
|
||||
from rest_framework import permissions
|
||||
from rest_framework import status
|
||||
from rest_framework.response import Response
|
||||
from rest_framework.decorators import detail_route
|
||||
from rest_framework import mixins, permissions, viewsets
|
||||
|
||||
from funkwhale_api.activity import record
|
||||
from funkwhale_api.common.permissions import ConditionalAuthentication
|
||||
|
||||
from . import models
|
||||
from . import serializers
|
||||
from . import models, serializers
|
||||
|
||||
|
||||
class ListeningViewSet(
|
||||
mixins.CreateModelMixin,
|
||||
mixins.RetrieveModelMixin,
|
||||
viewsets.GenericViewSet):
|
||||
mixins.CreateModelMixin, mixins.RetrieveModelMixin, viewsets.GenericViewSet
|
||||
):
|
||||
|
||||
serializer_class = serializers.ListeningSerializer
|
||||
queryset = models.Listening.objects.all()
|
||||
|
@ -31,5 +24,5 @@ class ListeningViewSet(
|
|||
|
||||
def get_serializer_context(self):
|
||||
context = super().get_serializer_context()
|
||||
context['user'] = self.request.user
|
||||
context["user"] = self.request.user
|
||||
return context
|
||||
|
|
|
@ -5,4 +5,4 @@ class InstanceActivityConsumer(JsonAuthConsumer):
|
|||
groups = ["instance_activity"]
|
||||
|
||||
def event_send(self, message):
|
||||
self.send_json(message['data'])
|
||||
self.send_json(message["data"])
|
||||
|
|
|
@ -1,93 +1,84 @@
|
|||
from django.forms import widgets
|
||||
|
||||
from dynamic_preferences import types
|
||||
from dynamic_preferences.registries import global_preferences_registry
|
||||
|
||||
raven = types.Section('raven')
|
||||
instance = types.Section('instance')
|
||||
raven = types.Section("raven")
|
||||
instance = types.Section("instance")
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class InstanceName(types.StringPreference):
|
||||
show_in_api = True
|
||||
section = instance
|
||||
name = 'name'
|
||||
default = ''
|
||||
verbose_name = 'Public name'
|
||||
help_text = 'The public name of your instance, displayed in the about page.'
|
||||
field_kwargs = {
|
||||
'required': False,
|
||||
}
|
||||
name = "name"
|
||||
default = ""
|
||||
verbose_name = "Public name"
|
||||
help_text = "The public name of your instance, displayed in the about page."
|
||||
field_kwargs = {"required": False}
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class InstanceShortDescription(types.StringPreference):
|
||||
show_in_api = True
|
||||
section = instance
|
||||
name = 'short_description'
|
||||
default = ''
|
||||
verbose_name = 'Short description'
|
||||
help_text = 'Instance succinct description, displayed in the about page.'
|
||||
field_kwargs = {
|
||||
'required': False,
|
||||
}
|
||||
name = "short_description"
|
||||
default = ""
|
||||
verbose_name = "Short description"
|
||||
help_text = "Instance succinct description, displayed in the about page."
|
||||
field_kwargs = {"required": False}
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class InstanceLongDescription(types.StringPreference):
|
||||
show_in_api = True
|
||||
section = instance
|
||||
name = 'long_description'
|
||||
verbose_name = 'Long description'
|
||||
default = ''
|
||||
help_text = 'Instance long description, displayed in the about page (markdown allowed).'
|
||||
name = "long_description"
|
||||
verbose_name = "Long description"
|
||||
default = ""
|
||||
help_text = (
|
||||
"Instance long description, displayed in the about page (markdown allowed)."
|
||||
)
|
||||
widget = widgets.Textarea
|
||||
field_kwargs = {
|
||||
'required': False,
|
||||
}
|
||||
field_kwargs = {"required": False}
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class RavenDSN(types.StringPreference):
|
||||
show_in_api = True
|
||||
section = raven
|
||||
name = 'front_dsn'
|
||||
default = 'https://9e0562d46b09442bb8f6844e50cbca2b@sentry.eliotberriot.com/4'
|
||||
verbose_name = 'Raven DSN key (front-end)'
|
||||
name = "front_dsn"
|
||||
default = "https://9e0562d46b09442bb8f6844e50cbca2b@sentry.eliotberriot.com/4"
|
||||
verbose_name = "Raven DSN key (front-end)"
|
||||
|
||||
help_text = (
|
||||
'A Raven DSN key used to report front-ent errors to '
|
||||
'a sentry instance. Keeping the default one will report errors to '
|
||||
'Funkwhale developers.'
|
||||
"A Raven DSN key used to report front-ent errors to "
|
||||
"a sentry instance. Keeping the default one will report errors to "
|
||||
"Funkwhale developers."
|
||||
)
|
||||
field_kwargs = {
|
||||
'required': False,
|
||||
}
|
||||
field_kwargs = {"required": False}
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class RavenEnabled(types.BooleanPreference):
|
||||
show_in_api = True
|
||||
section = raven
|
||||
name = 'front_enabled'
|
||||
name = "front_enabled"
|
||||
default = False
|
||||
verbose_name = (
|
||||
'Report front-end errors with Raven'
|
||||
)
|
||||
verbose_name = "Report front-end errors with Raven"
|
||||
|
||||
|
||||
@global_preferences_registry.register
|
||||
class InstanceNodeinfoEnabled(types.BooleanPreference):
|
||||
show_in_api = False
|
||||
section = instance
|
||||
name = 'nodeinfo_enabled'
|
||||
name = "nodeinfo_enabled"
|
||||
default = True
|
||||
verbose_name = 'Enable nodeinfo endpoint'
|
||||
verbose_name = "Enable nodeinfo endpoint"
|
||||
help_text = (
|
||||
'This endpoint is needed for your about page to work. '
|
||||
'It\'s also helpful for the various monitoring '
|
||||
'tools that map and analyzize the fediverse, '
|
||||
'but you can disable it completely if needed.'
|
||||
"This endpoint is needed for your about page to work. "
|
||||
"It's also helpful for the various monitoring "
|
||||
"tools that map and analyzize the fediverse, "
|
||||
"but you can disable it completely if needed."
|
||||
)
|
||||
|
||||
|
||||
|
@ -95,13 +86,13 @@ class InstanceNodeinfoEnabled(types.BooleanPreference):
|
|||
class InstanceNodeinfoPrivate(types.BooleanPreference):
|
||||
show_in_api = False
|
||||
section = instance
|
||||
name = 'nodeinfo_private'
|
||||
name = "nodeinfo_private"
|
||||
default = False
|
||||
verbose_name = 'Private mode in nodeinfo'
|
||||
verbose_name = "Private mode in nodeinfo"
|
||||
help_text = (
|
||||
'Indicate in the nodeinfo endpoint that you do not want your instance '
|
||||
'to be tracked by third-party services. '
|
||||
'There is no guarantee these tools will honor this setting though.'
|
||||
"Indicate in the nodeinfo endpoint that you do not want your instance "
|
||||
"to be tracked by third-party services. "
|
||||
"There is no guarantee these tools will honor this setting though."
|
||||
)
|
||||
|
||||
|
||||
|
@ -109,10 +100,10 @@ class InstanceNodeinfoPrivate(types.BooleanPreference):
|
|||
class InstanceNodeinfoStatsEnabled(types.BooleanPreference):
|
||||
show_in_api = False
|
||||
section = instance
|
||||
name = 'nodeinfo_stats_enabled'
|
||||
name = "nodeinfo_stats_enabled"
|
||||
default = True
|
||||
verbose_name = 'Enable usage and library stats in nodeinfo endpoint'
|
||||
verbose_name = "Enable usage and library stats in nodeinfo endpoint"
|
||||
help_text = (
|
||||
'Disable this if you don\'t want to share usage and library statistics '
|
||||
'in the nodeinfo endpoint but don\'t want to disable it completely.'
|
||||
"Disable this if you don't want to share usage and library statistics "
|
||||
"in the nodeinfo endpoint but don't want to disable it completely."
|
||||
)
|
||||
|
|
|
@ -5,71 +5,46 @@ from funkwhale_api.common import preferences
|
|||
|
||||
from . import stats
|
||||
|
||||
|
||||
store = memoize.djangocache.Cache('default')
|
||||
memo = memoize.Memoizer(store, namespace='instance:stats')
|
||||
store = memoize.djangocache.Cache("default")
|
||||
memo = memoize.Memoizer(store, namespace="instance:stats")
|
||||
|
||||
|
||||
def get():
|
||||
share_stats = preferences.get('instance__nodeinfo_stats_enabled')
|
||||
private = preferences.get('instance__nodeinfo_private')
|
||||
share_stats = preferences.get("instance__nodeinfo_stats_enabled")
|
||||
data = {
|
||||
'version': '2.0',
|
||||
'software': {
|
||||
'name': 'funkwhale',
|
||||
'version': funkwhale_api.__version__
|
||||
},
|
||||
'protocols': ['activitypub'],
|
||||
'services': {
|
||||
'inbound': [],
|
||||
'outbound': []
|
||||
},
|
||||
'openRegistrations': preferences.get('users__registration_enabled'),
|
||||
'usage': {
|
||||
'users': {
|
||||
'total': 0,
|
||||
}
|
||||
},
|
||||
'metadata': {
|
||||
'private': preferences.get('instance__nodeinfo_private'),
|
||||
'shortDescription': preferences.get('instance__short_description'),
|
||||
'longDescription': preferences.get('instance__long_description'),
|
||||
'nodeName': preferences.get('instance__name'),
|
||||
'library': {
|
||||
'federationEnabled': preferences.get('federation__enabled'),
|
||||
'federationNeedsApproval': preferences.get('federation__music_needs_approval'),
|
||||
'anonymousCanListen': preferences.get('common__api_authentication_required'),
|
||||
"version": "2.0",
|
||||
"software": {"name": "funkwhale", "version": funkwhale_api.__version__},
|
||||
"protocols": ["activitypub"],
|
||||
"services": {"inbound": [], "outbound": []},
|
||||
"openRegistrations": preferences.get("users__registration_enabled"),
|
||||
"usage": {"users": {"total": 0}},
|
||||
"metadata": {
|
||||
"private": preferences.get("instance__nodeinfo_private"),
|
||||
"shortDescription": preferences.get("instance__short_description"),
|
||||
"longDescription": preferences.get("instance__long_description"),
|
||||
"nodeName": preferences.get("instance__name"),
|
||||
"library": {
|
||||
"federationEnabled": preferences.get("federation__enabled"),
|
||||
"federationNeedsApproval": preferences.get(
|
||||
"federation__music_needs_approval"
|
||||
),
|
||||
"anonymousCanListen": preferences.get(
|
||||
"common__api_authentication_required"
|
||||
),
|
||||
},
|
||||
}
|
||||
},
|
||||
}
|
||||
if share_stats:
|
||||
getter = memo(
|
||||
lambda: stats.get(),
|
||||
max_age=600
|
||||
)
|
||||
getter = memo(lambda: stats.get(), max_age=600)
|
||||
statistics = getter()
|
||||
data['usage']['users']['total'] = statistics['users']
|
||||
data['metadata']['library']['tracks'] = {
|
||||
'total': statistics['tracks'],
|
||||
}
|
||||
data['metadata']['library']['artists'] = {
|
||||
'total': statistics['artists'],
|
||||
}
|
||||
data['metadata']['library']['albums'] = {
|
||||
'total': statistics['albums'],
|
||||
}
|
||||
data['metadata']['library']['music'] = {
|
||||
'hours': statistics['music_duration']
|
||||
}
|
||||
data["usage"]["users"]["total"] = statistics["users"]
|
||||
data["metadata"]["library"]["tracks"] = {"total": statistics["tracks"]}
|
||||
data["metadata"]["library"]["artists"] = {"total": statistics["artists"]}
|
||||
data["metadata"]["library"]["albums"] = {"total": statistics["albums"]}
|
||||
data["metadata"]["library"]["music"] = {"hours": statistics["music_duration"]}
|
||||
|
||||
data['metadata']['usage'] = {
|
||||
'favorites': {
|
||||
'tracks': {
|
||||
'total': statistics['track_favorites'],
|
||||
}
|
||||
},
|
||||
'listenings': {
|
||||
'total': statistics['listenings']
|
||||
}
|
||||
data["metadata"]["usage"] = {
|
||||
"favorites": {"tracks": {"total": statistics["track_favorites"]}},
|
||||
"listenings": {"total": statistics["listenings"]},
|
||||
}
|
||||
return data
|
||||
|
|
|
@ -8,13 +8,13 @@ from funkwhale_api.users.models import User
|
|||
|
||||
def get():
|
||||
return {
|
||||
'users': get_users(),
|
||||
'tracks': get_tracks(),
|
||||
'albums': get_albums(),
|
||||
'artists': get_artists(),
|
||||
'track_favorites': get_track_favorites(),
|
||||
'listenings': get_listenings(),
|
||||
'music_duration': get_music_duration(),
|
||||
"users": get_users(),
|
||||
"tracks": get_tracks(),
|
||||
"albums": get_albums(),
|
||||
"artists": get_artists(),
|
||||
"track_favorites": get_track_favorites(),
|
||||
"listenings": get_listenings(),
|
||||
"music_duration": get_music_duration(),
|
||||
}
|
||||
|
||||
|
||||
|
@ -43,9 +43,7 @@ def get_artists():
|
|||
|
||||
|
||||
def get_music_duration():
|
||||
seconds = models.TrackFile.objects.aggregate(
|
||||
d=Sum('duration'),
|
||||
)['d']
|
||||
seconds = models.TrackFile.objects.aggregate(d=Sum("duration"))["d"]
|
||||
if seconds:
|
||||
return seconds / 3600
|
||||
return 0
|
||||
|
|
|
@ -2,10 +2,11 @@ from django.conf.urls import url
|
|||
from rest_framework import routers
|
||||
|
||||
from . import views
|
||||
|
||||
admin_router = routers.SimpleRouter()
|
||||
admin_router.register(r'admin/settings', views.AdminSettings, 'admin-settings')
|
||||
admin_router.register(r"admin/settings", views.AdminSettings, "admin-settings")
|
||||
|
||||
urlpatterns = [
|
||||
url(r'^nodeinfo/2.0/$', views.NodeInfo.as_view(), name='nodeinfo-2.0'),
|
||||
url(r'^settings/$', views.InstanceSettings.as_view(), name='settings'),
|
||||
url(r"^nodeinfo/2.0/$", views.NodeInfo.as_view(), name="nodeinfo-2.0"),
|
||||
url(r"^settings/$", views.InstanceSettings.as_view(), name="settings"),
|
||||
] + admin_router.urls
|
||||
|
|
|
@ -1,26 +1,22 @@
|
|||
from rest_framework import views
|
||||
from rest_framework.response import Response
|
||||
|
||||
from dynamic_preferences.api import serializers
|
||||
from dynamic_preferences.api import viewsets as preferences_viewsets
|
||||
from dynamic_preferences.registries import global_preferences_registry
|
||||
from rest_framework import views
|
||||
from rest_framework.response import Response
|
||||
|
||||
from funkwhale_api.common import preferences
|
||||
from funkwhale_api.users.permissions import HasUserPermission
|
||||
|
||||
from . import nodeinfo
|
||||
from . import stats
|
||||
|
||||
|
||||
NODEINFO_2_CONTENT_TYPE = (
|
||||
'application/json; profile=http://nodeinfo.diaspora.software/ns/schema/2.0#; charset=utf-8' # noqa
|
||||
)
|
||||
NODEINFO_2_CONTENT_TYPE = "application/json; profile=http://nodeinfo.diaspora.software/ns/schema/2.0#; charset=utf-8" # noqa
|
||||
|
||||
|
||||
class AdminSettings(preferences_viewsets.GlobalPreferencesViewSet):
|
||||
pagination_class = None
|
||||
permission_classes = (HasUserPermission,)
|
||||
required_permissions = ['settings']
|
||||
required_permissions = ["settings"]
|
||||
|
||||
|
||||
class InstanceSettings(views.APIView):
|
||||
permission_classes = []
|
||||
|
@ -29,16 +25,11 @@ class InstanceSettings(views.APIView):
|
|||
def get(self, request, *args, **kwargs):
|
||||
manager = global_preferences_registry.manager()
|
||||
manager.all()
|
||||
all_preferences = manager.model.objects.all().order_by(
|
||||
'section', 'name'
|
||||
)
|
||||
all_preferences = manager.model.objects.all().order_by("section", "name")
|
||||
api_preferences = [
|
||||
p
|
||||
for p in all_preferences
|
||||
if getattr(p.preference, 'show_in_api', False)
|
||||
p for p in all_preferences if getattr(p.preference, "show_in_api", False)
|
||||
]
|
||||
data = serializers.GlobalPreferenceSerializer(
|
||||
api_preferences, many=True).data
|
||||
data = serializers.GlobalPreferenceSerializer(api_preferences, many=True).data
|
||||
return Response(data, status=200)
|
||||
|
||||
|
||||
|
@ -47,8 +38,7 @@ class NodeInfo(views.APIView):
|
|||
authentication_classes = []
|
||||
|
||||
def get(self, request, *args, **kwargs):
|
||||
if not preferences.get('instance__nodeinfo_enabled'):
|
||||
if not preferences.get("instance__nodeinfo_enabled"):
|
||||
return Response(status=404)
|
||||
data = nodeinfo.get()
|
||||
return Response(
|
||||
data, status=200, content_type=NODEINFO_2_CONTENT_TYPE)
|
||||
return Response(data, status=200, content_type=NODEINFO_2_CONTENT_TYPE)
|
||||
|
|
|
@ -1,4 +1,3 @@
|
|||
from django.db.models import Count
|
||||
|
||||
from django_filters import rest_framework as filters
|
||||
|
||||
|
@ -7,19 +6,15 @@ from funkwhale_api.music import models as music_models
|
|||
|
||||
|
||||
class ManageTrackFileFilterSet(filters.FilterSet):
|
||||
q = fields.SearchFilter(search_fields=[
|
||||
'track__title',
|
||||
'track__album__title',
|
||||
'track__artist__name',
|
||||
'source',
|
||||
])
|
||||
q = fields.SearchFilter(
|
||||
search_fields=[
|
||||
"track__title",
|
||||
"track__album__title",
|
||||
"track__artist__name",
|
||||
"source",
|
||||
]
|
||||
)
|
||||
|
||||
class Meta:
|
||||
model = music_models.TrackFile
|
||||
fields = [
|
||||
'q',
|
||||
'track__album',
|
||||
'track__artist',
|
||||
'track',
|
||||
'library_track'
|
||||
]
|
||||
fields = ["q", "track__album", "track__artist", "track", "library_track"]
|
||||
|
|
|
@ -10,12 +10,7 @@ from . import filters
|
|||
class ManageTrackFileArtistSerializer(serializers.ModelSerializer):
|
||||
class Meta:
|
||||
model = music_models.Artist
|
||||
fields = [
|
||||
'id',
|
||||
'mbid',
|
||||
'creation_date',
|
||||
'name',
|
||||
]
|
||||
fields = ["id", "mbid", "creation_date", "name"]
|
||||
|
||||
|
||||
class ManageTrackFileAlbumSerializer(serializers.ModelSerializer):
|
||||
|
@ -24,13 +19,13 @@ class ManageTrackFileAlbumSerializer(serializers.ModelSerializer):
|
|||
class Meta:
|
||||
model = music_models.Album
|
||||
fields = (
|
||||
'id',
|
||||
'mbid',
|
||||
'title',
|
||||
'artist',
|
||||
'release_date',
|
||||
'cover',
|
||||
'creation_date',
|
||||
"id",
|
||||
"mbid",
|
||||
"title",
|
||||
"artist",
|
||||
"release_date",
|
||||
"cover",
|
||||
"creation_date",
|
||||
)
|
||||
|
||||
|
||||
|
@ -40,15 +35,7 @@ class ManageTrackFileTrackSerializer(serializers.ModelSerializer):
|
|||
|
||||
class Meta:
|
||||
model = music_models.Track
|
||||
fields = (
|
||||
'id',
|
||||
'mbid',
|
||||
'title',
|
||||
'album',
|
||||
'artist',
|
||||
'creation_date',
|
||||
'position',
|
||||
)
|
||||
fields = ("id", "mbid", "title", "album", "artist", "creation_date", "position")
|
||||
|
||||
|
||||
class ManageTrackFileSerializer(serializers.ModelSerializer):
|
||||
|
@ -57,24 +44,24 @@ class ManageTrackFileSerializer(serializers.ModelSerializer):
|
|||
class Meta:
|
||||
model = music_models.TrackFile
|
||||
fields = (
|
||||
'id',
|
||||
'path',
|
||||
'source',
|
||||
'filename',
|
||||
'mimetype',
|
||||
'track',
|
||||
'duration',
|
||||
'mimetype',
|
||||
'bitrate',
|
||||
'size',
|
||||
'path',
|
||||
'library_track',
|
||||
"id",
|
||||
"path",
|
||||
"source",
|
||||
"filename",
|
||||
"mimetype",
|
||||
"track",
|
||||
"duration",
|
||||
"mimetype",
|
||||
"bitrate",
|
||||
"size",
|
||||
"path",
|
||||
"library_track",
|
||||
)
|
||||
|
||||
|
||||
class ManageTrackFileActionSerializer(common_serializers.ActionSerializer):
|
||||
actions = ['delete']
|
||||
dangerous_actions = ['delete']
|
||||
actions = ["delete"]
|
||||
dangerous_actions = ["delete"]
|
||||
filterset_class = filters.ManageTrackFileFilterSet
|
||||
|
||||
@transaction.atomic
|
||||
|
|
|
@ -1,11 +1,11 @@
|
|||
from django.conf.urls import include, url
|
||||
from rest_framework import routers
|
||||
|
||||
from . import views
|
||||
|
||||
from rest_framework import routers
|
||||
library_router = routers.SimpleRouter()
|
||||
library_router.register(r'track-files', views.ManageTrackFileViewSet, 'track-files')
|
||||
library_router.register(r"track-files", views.ManageTrackFileViewSet, "track-files")
|
||||
|
||||
urlpatterns = [
|
||||
url(r'^library/',
|
||||
include((library_router.urls, 'instance'), namespace='library')),
|
||||
url(r"^library/", include((library_router.urls, "instance"), namespace="library"))
|
||||
]
|
||||
|
|
|
@ -1,48 +1,42 @@
|
|||
from rest_framework import mixins
|
||||
from rest_framework import response
|
||||
from rest_framework import viewsets
|
||||
from rest_framework import mixins, response, viewsets
|
||||
from rest_framework.decorators import list_route
|
||||
|
||||
from funkwhale_api.music import models as music_models
|
||||
from funkwhale_api.users.permissions import HasUserPermission
|
||||
|
||||
from . import filters
|
||||
from . import serializers
|
||||
from . import filters, serializers
|
||||
|
||||
|
||||
class ManageTrackFileViewSet(
|
||||
mixins.ListModelMixin,
|
||||
mixins.RetrieveModelMixin,
|
||||
mixins.DestroyModelMixin,
|
||||
viewsets.GenericViewSet):
|
||||
mixins.ListModelMixin,
|
||||
mixins.RetrieveModelMixin,
|
||||
mixins.DestroyModelMixin,
|
||||
viewsets.GenericViewSet,
|
||||
):
|
||||
queryset = (
|
||||
music_models.TrackFile.objects.all()
|
||||
.select_related(
|
||||
'track__artist',
|
||||
'track__album__artist',
|
||||
'library_track')
|
||||
.order_by('-id')
|
||||
.select_related("track__artist", "track__album__artist", "library_track")
|
||||
.order_by("-id")
|
||||
)
|
||||
serializer_class = serializers.ManageTrackFileSerializer
|
||||
filter_class = filters.ManageTrackFileFilterSet
|
||||
permission_classes = (HasUserPermission,)
|
||||
required_permissions = ['library']
|
||||
required_permissions = ["library"]
|
||||
ordering_fields = [
|
||||
'accessed_date',
|
||||
'modification_date',
|
||||
'creation_date',
|
||||
'track__artist__name',
|
||||
'bitrate',
|
||||
'size',
|
||||
'duration',
|
||||
"accessed_date",
|
||||
"modification_date",
|
||||
"creation_date",
|
||||
"track__artist__name",
|
||||
"bitrate",
|
||||
"size",
|
||||
"duration",
|
||||
]
|
||||
|
||||
@list_route(methods=['post'])
|
||||
@list_route(methods=["post"])
|
||||
def action(self, request, *args, **kwargs):
|
||||
queryset = self.get_queryset()
|
||||
serializer = serializers.ManageTrackFileActionSerializer(
|
||||
request.data,
|
||||
queryset=queryset,
|
||||
request.data, queryset=queryset
|
||||
)
|
||||
serializer.is_valid(raise_exception=True)
|
||||
result = serializer.save()
|
||||
|
|
Some files were not shown because too many files have changed in this diff.