mirror of
https://github.com/krateng/maloja.git
synced 2023-08-10 21:12:55 +03:00
Compare commits
No commits in common. "c77b7c952fb28a8ef7f0c3b4cd0db978803bd4c0" and "3b156a73ffa735670d5b302119999fb8c54d4240" have entirely different histories.
c77b7c952f
...
3b156a73ff
API.md (30 changed lines)
@@ -1,7 +1,6 @@
 # Scrobbling
 
-Scrobbling can be done with the native API, see [below](#submitting-a-scrobble).
-In order to scrobble from a wide selection of clients, you can also use Maloja's standard-compliant APIs with the following settings:
+In order to scrobble from a wide selection of clients, you can use Maloja's standard-compliant APIs with the following settings:
 
 GNU FM |
 ------ | ---------
@@ -42,7 +41,7 @@ The user starts playing '(Fine Layers of) Slaysenflite', which is exactly 3:00 m
 * If the user ends the play after 1:22, no scrobble is submitted
 * If the user ends the play after 2:06, a scrobble with `"duration":126` is submitted
 * If the user jumps back several times and ends the play after 3:57, a scrobble with `"duration":237` is submitted
-* If the user jumps back several times and ends the play after 4:49, two scrobbles with `"duration":180` and `"duration":109` are submitted
+* If the user jumps back several times and ends the play after 4:49, two scrobbles with `"duration":180` and `"duration":109` should be submitted
 
 </td></tr>
 <table>
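The four example cases in this hunk imply a simple splitting rule. The sketch below reproduces them; it is reverse-engineered from the examples only, not Maloja's actual implementation, and `expected_scrobbles` is a hypothetical name:

```python
def expected_scrobbles(listened, length):
    # One scrobble per full playthrough; a trailing partial play counts
    # as its own scrobble once it reaches half the track length,
    # otherwise its seconds fold into the previous scrobble's duration.
    full, rem = divmod(listened, length)
    durations = [length] * full
    if rem >= length / 2:
        durations.append(rem)
    elif durations:
        durations[-1] += rem
    return durations
```

For a 180-second track this yields no scrobble at 82s, `[126]` at 126s, `[237]` at 237s, and `[180, 109]` at 289s, matching the documented cases.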
@@ -56,25 +55,10 @@ All endpoints return JSON data. POST request can be made with query string or fo
 
 No application should ever rely on the non-existence of fields in the JSON data - i.e., additional fields can be added at any time without this being considered a breaking change. Existing fields should usually not be removed or changed, but it is always a good idea to add basic handling for missing fields.
 
-## Submitting a Scrobble
-
-The POST endpoint `/newscrobble` is used to submit new scrobbles. These use a flat JSON structure with the following fields:
-
-| Key | Type | Description |
-| --- | --- | --- |
-| `artists` | List(String) | Track artists |
-| `title` | String | Track title |
-| `album` | String | Name of the album (Optional) |
-| `albumartists` | List(String) | Album artists (Optional) |
-| `duration` | Integer | How long the song was listened to in seconds (Optional) |
-| `length` | Integer | Actual length of the full song in seconds (Optional) |
-| `time` | Integer | Timestamp of the listen if it was not at the time of submitting (Optional) |
-| `nofix` | Boolean | Skip server-side metadata fixing (Optional) |
 
 ## General Structure
 
-The API is not fully consistent in order to ensure backwards-compatibility. Refer to the individual endpoints.
-Generally, most endpoints follow this structure:
+Most endpoints follow this structure:
 
 | Key | Type | Description |
 | --- | --- | --- |
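The `/newscrobble` field table in this hunk maps directly onto a flat JSON payload. A minimal sketch with made-up values (only the key names come from the table):

```python
import json

# Illustrative values only; the keys follow the documented field table.
scrobble = {
    "artists": ["Artist A", "Artist B"],   # List(String), required
    "title": "Example Song",               # String, required
    "album": "Example Album",              # optional
    "duration": 166,                       # seconds actually listened, optional
    "time": 1573241830,                    # backdated timestamp, optional
}
payload = json.dumps(scrobble)
```

The serialized `payload` is what a client would POST to the endpoint.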
@@ -82,7 +66,7 @@ Generally, most endpoints follow this structure:
 | `error` | Mapping | Details about the error if one occurred. |
 | `warnings` | List | Any warnings that did not result in failure, but should be noted. Field is omitted if there are no warnings! |
 | `desc` | String | Human-readable feedback. This can be shown directly to the user if desired. |
-| `list` | List | List of returned [entities](#entity-structure) |
+| `list` | List | List of returned [entities](#Entity-Structure) |
 
 
 Both errors and warnings have the following structure:
@@ -103,7 +87,7 @@ Whenever a list of entities is returned, they have the following fields:
 | Key | Type | Description |
 | --- | --- | --- |
 | `time` | Integer | Timestamp of the Scrobble in UTC |
-| `track` | Mapping | The [track](#track) being scrobbled |
+| `track` | Mapping | The [track](#Track) being scrobbled |
 | `duration` | Integer | How long the track was played for in seconds |
 | `origin` | String | Client that submitted the scrobble, or import source |
 
@@ -134,7 +118,7 @@ Whenever a list of entities is returned, they have the following fields:
 
 | Key | Type | Description |
 | --- | --- | --- |
-| `artists` | List | The [artists](#artist) credited with the track |
+| `artists` | List | The [artists](#Artist) credited with the track |
 | `title` | String | The title of the track |
 | `length` | Integer | The full length of the track in seconds |
 
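The API.md hunks above stress that clients must tolerate missing optional fields (`warnings` is omitted when empty, and new fields may appear without a version bump). A defensive-parsing sketch under those assumptions:

```python
def describe_response(resp):
    # Never assume optional fields exist: `warnings` is omitted when empty,
    # and new fields may be added at any time without a breaking change.
    status = resp.get("status", "unknown")
    error = resp.get("error")
    if error is not None:
        return f"{status}: {error.get('desc', 'no description')}"
    warnings = resp.get("warnings", [])
    return f"{status} ({len(warnings)} warning(s))"
```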
@@ -42,10 +42,3 @@ minor_release_name: "Yeonhee"
 - "[Bugfix] Fixed importing a Spotify file without path"
 - "[Bugfix] No longer releasing database lock during scrobble creation"
 - "[Distribution] Experimental arm64 image"
-3.0.7:
-  commit: "62abc319303a6cb6463f7c27b6ef09b76fc67f86"
-  notes:
-  - "[Bugfix] Improved signal handling"
-  - "[Bugfix] Fixed constant re-caching of all-time stats, significantly increasing page load speed"
-  - "[Logging] Disabled cache information when cache is not used"
-  - "[Distribution] Experimental arm/v7 image"
@@ -6,5 +6,3 @@ minor_release_name: "Soyeon"
 - "[Feature] Implemented track title and artist name editing from web interface"
 - "[Feature] Implemented track and artist merging from web interface"
 - "[Feature] Implemented scrobble reparsing from web interface"
-- "[Performance] Adjusted cache sizes"
-- "[Logging] Added cache memory use information"
@@ -6,7 +6,6 @@ FOLDER = "dev/releases"
 
 releases = {}
 for f in os.listdir(FOLDER):
-	if f == "branch.yml": continue
 	#maj,min = (int(i) for i in f.split('.')[:2])
 
 	with open(os.path.join(FOLDER,f)) as fd:
@@ -1,7 +1,6 @@
 import os
 import signal
 import subprocess
-import time
 
 from setproctitle import setproctitle
 from ipaddress import ip_address
@@ -41,10 +40,9 @@ def get_instance_supervisor():
 	return None
 
 def restart():
-	if stop():
-		start()
-	else:
-		print(col["red"]("Could not stop Maloja!"))
+	stop()
+	start()
 
 def start():
 	if get_instance_supervisor() is not None:
@@ -71,28 +69,16 @@ def start():
 
 def stop():
 
-	for attempt in [(signal.SIGTERM,2),(signal.SIGTERM,5),(signal.SIGKILL,3),(signal.SIGKILL,5)]:
-
-		pid_sv = get_instance_supervisor()
-		pid = get_instance()
-
-		if pid is None and pid_sv is None:
-			print("Maloja stopped!")
-			return True
-
-		if pid_sv is not None:
-			os.kill(pid_sv,attempt[0])
-		if pid is not None:
-			os.kill(pid,attempt[0])
-
-		time.sleep(attempt[1])
-
-	return False
+	pid_sv = get_instance_supervisor()
+	if pid_sv is not None:
+		os.kill(pid_sv,signal.SIGTERM)
+
+	pid = get_instance()
+	if pid is not None:
+		os.kill(pid,signal.SIGTERM)
+
+	if pid is None and pid_sv is None:
+		return False
 
 	print("Maloja stopped!")
 	return True
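The 3.0.7 side of this hunk retries shutdown on an escalating signal schedule (SIGTERM twice, then SIGKILL) instead of firing a single SIGTERM. The same pattern can be sketched generically with a `subprocess.Popen` handle (names and schedule here are illustrative, not Maloja's):

```python
import signal
import subprocess
import sys

def stop_process(proc, schedule=((signal.SIGTERM, 2), (signal.SIGKILL, 3))):
    # Escalate through the schedule: send a signal, then give the
    # process a grace period before trying something more forceful.
    for sig, grace in schedule:
        if proc.poll() is not None:
            return True
        proc.send_signal(sig)
        try:
            proc.wait(timeout=grace)
            return True
        except subprocess.TimeoutExpired:
            continue
    return proc.poll() is not None

# Spawn a long-running child and shut it down.
child = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(60)"])
stopped = stop_process(child)
```

A well-behaved process exits on the first SIGTERM; the later SIGKILL entries only matter for processes that ignore it.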
@@ -4,7 +4,7 @@
 # you know what f*ck it
 # this is hardcoded for now because of that damn project / package name discrepancy
 # i'll fix it one day
-VERSION = "3.0.7"
+VERSION = "3.0.6"
 HOMEPAGE = "https://github.com/krateng/maloja"
 
 
@@ -40,7 +40,7 @@ api.__apipath__ = "mlj_1"
 
 
 errors = {
-	database.exceptions.MissingScrobbleParameters: lambda e: (400,{
+	database.MissingScrobbleParameters: lambda e: (400,{
 		"status":"failure",
 		"error":{
 			'type':'missing_scrobble_data',
@@ -48,14 +48,6 @@ errors = {
 			'desc':"The scrobble is missing needed parameters."
 		}
 	}),
-	database.exceptions.MissingEntityParameter: lambda e: (400,{
-		"status":"error",
-		"error":{
-			'type':'missing_entity_parameter',
-			'value':None,
-			'desc':"This API call is not valid without an entity (track or artist)."
-		}
-	}),
 	database.exceptions.EntityExists: lambda e: (409,{
 		"status":"failure",
 		"error":{
@@ -64,16 +56,7 @@ errors = {
 			'desc':"This entity already exists in the database. Consider merging instead."
 		}
 	}),
-	database.exceptions.DatabaseNotBuilt: lambda e: (503,{
-		"status":"error",
-		"error":{
-			'type':'server_not_ready',
-			'value':'db_upgrade',
-			'desc':"The database is being upgraded. Please try again later."
-		}
-	}),
-	# for http errors, use their status code
-	Exception: lambda e: ((e.status_code if hasattr(e,'statuscode') else 500),{
+	Exception: lambda e: (500,{
 		"status":"failure",
 		"error":{
 			'type':'unknown_error',
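The `errors` dict in these hunks maps exception classes to `(status, body)` builders, with `Exception` as the catch-all. A minimal sketch of how such a mapping is typically consulted (the exception classes and bodies here are illustrative stand-ins):

```python
errors = {
    KeyError: lambda e: (404, {"status": "failure",
                               "error": {"type": "not_found", "desc": str(e)}}),
    Exception: lambda e: (500, {"status": "failure",
                                "error": {"type": "unknown_error", "desc": str(e)}}),
}

def handle(exc):
    # First matching class wins, so specific exceptions must be listed
    # before the Exception catch-all (dicts preserve insertion order).
    for cls, handler in errors.items():
        if isinstance(exc, cls):
            return handler(exc)
```

Using `isinstance` rather than exact type lookup lets one entry cover a whole exception hierarchy.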
@@ -202,7 +185,6 @@ def get_scrobbles_external(**keys):
 	if k_amount.get('perpage') is not math.inf: result = result[:k_amount.get('perpage')]
 
 	return {
-		"status":"ok",
 		"list":result
 	}
 
@@ -222,7 +204,6 @@ def get_scrobbles_num_external(**keys):
 	result = database.get_scrobbles_num(**ckeys)
 
 	return {
-		"status":"ok",
 		"amount":result
 	}
 
@@ -243,7 +224,6 @@ def get_tracks_external(**keys):
 	result = database.get_tracks(**ckeys)
 
 	return {
-		"status":"ok",
 		"list":result
 	}
 
@@ -260,7 +240,6 @@ def get_artists_external():
 	result = database.get_artists()
 
 	return {
-		"status":"ok",
 		"list":result
 	}
 
@@ -282,7 +261,6 @@ def get_charts_artists_external(**keys):
 	result = database.get_charts_artists(**ckeys)
 
 	return {
-		"status":"ok",
 		"list":result
 	}
 
@@ -302,7 +280,6 @@ def get_charts_tracks_external(**keys):
 	result = database.get_charts_tracks(**ckeys)
 
 	return {
-		"status":"ok",
 		"list":result
 	}
 
@@ -323,7 +300,6 @@ def get_pulse_external(**keys):
 	results = database.get_pulse(**ckeys)
 
 	return {
-		"status":"ok",
 		"list":results
 	}
 
@@ -344,7 +320,6 @@ def get_performance_external(**keys):
 	results = database.get_performance(**ckeys)
 
 	return {
-		"status":"ok",
 		"list":results
 	}
 
@@ -365,7 +340,6 @@ def get_top_artists_external(**keys):
 	results = database.get_top_artists(**ckeys)
 
 	return {
-		"status":"ok",
 		"list":results
 	}
 
@@ -388,7 +362,6 @@ def get_top_tracks_external(**keys):
 	results = database.get_top_tracks(**ckeys)
 
 	return {
-		"status":"ok",
 		"list":results
 	}
 
@@ -413,7 +386,7 @@ def artist_info_external(**keys):
 @api.get("trackinfo")
 @catch_exceptions
 @add_common_args_to_docstring(filterkeys=True)
-def track_info_external(artist:Multi[str]=[],**keys):
+def track_info_external(artist:Multi[str],**keys):
 	"""Returns information about a track
 
 	:return: track (Mapping), scrobbles (Integer), position (Integer), medals (Mapping), certification (String), topweeks (Integer)
@@ -718,8 +691,7 @@ def reparse_scrobble(timestamp):
 	if result:
 		return {
 			"status":"success",
-			"desc":f"Scrobble was reparsed!",
-			"scrobble":result
+			"desc":f"Scrobble was reparsed!"
 		}
 	else:
 		return {
@@ -1,5 +1,5 @@
 # server
-from bottle import request, response, FormsDict
+from bottle import request, response, FormsDict, HTTPError
 
 # rest of the project
 from ..cleanup import CleanerAgent
@@ -13,7 +13,6 @@ from ..apis import apikeystore
 from . import sqldb
 from . import cached
 from . import dbcache
-from . import exceptions
 
 # doreah toolkit
 from doreah.logging import log
@@ -43,12 +42,23 @@ dbstatus = {
 	"rebuildinprogress":False,
 	"complete":False # information is complete
 }
+
+class DatabaseNotBuilt(HTTPError):
+	def __init__(self):
+		super().__init__(
+			status=503,
+			body="The Maloja Database is being upgraded to Version 3. This could take quite a long time! (~ 2-5 minutes per 10 000 scrobbles)",
+			headers={"Retry-After":120}
+		)
+
+class MissingScrobbleParameters(Exception):
+	def __init__(self,params=[]):
+		self.params = params
+
+
 def waitfordb(func):
 	def newfunc(*args,**kwargs):
-		if not dbstatus['healthy']: raise exceptions.DatabaseNotBuilt()
+		if not dbstatus['healthy']: raise DatabaseNotBuilt()
 		return func(*args,**kwargs)
 	return newfunc
 
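The `waitfordb` decorator in this hunk gates every database call on a readiness flag and raises instead of touching a half-built database. A self-contained sketch of the pattern (with a plain `Exception` standing in for the bottle `HTTPError` subclass, and dummy names throughout):

```python
dbstatus = {"healthy": False}

class DatabaseNotBuilt(Exception):
    """Raised while the database is still being built or upgraded."""

def waitfordb(func):
    # Gate decorated calls on the readiness flag; callers get a clean
    # error instead of reading from an incomplete database.
    def newfunc(*args, **kwargs):
        if not dbstatus["healthy"]:
            raise DatabaseNotBuilt()
        return func(*args, **kwargs)
    return newfunc

@waitfordb
def get_scrobbles():
    return []
```

Once the build finishes, flipping `dbstatus["healthy"]` to `True` lets the same decorated functions pass through unchanged.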
@@ -87,7 +97,7 @@ def incoming_scrobble(rawscrobble,fix=True,client=None,api=None,dbconn=None):
 			missing.append(necessary_arg)
 	if len(missing) > 0:
 		log(f"Invalid Scrobble [Client: {client} | API: {api}]: {rawscrobble} ",color='red')
-		raise exceptions.MissingScrobbleParameters(missing)
+		raise MissingScrobbleParameters(missing)
 
 
 	log(f"Incoming scrobble [Client: {client} | API: {api}]: {rawscrobble}")
@@ -118,9 +128,7 @@ def reparse_scrobble(timestamp):
 	# check if id changed
 	if sqldb.get_track_id(scrobble['track']) != track_id:
 		sqldb.edit_scrobble(timestamp, {'track':newscrobble['track']})
-		dbcache.invalidate_entity_cache()
-		dbcache.invalidate_caches()
-		return sqldb.get_scrobble(timestamp=timestamp)
+		return True
 
 	return False
 
@@ -191,7 +199,6 @@ def merge_artists(target_id,source_ids):
 	log(f"Merging {sources} into {target}")
 	result = sqldb.merge_artists(target_id,source_ids)
 	dbcache.invalidate_entity_cache()
-	dbcache.invalidate_caches()
 
 	return result
 
@@ -202,7 +209,6 @@ def merge_tracks(target_id,source_ids):
 	log(f"Merging {sources} into {target}")
 	result = sqldb.merge_tracks(target_id,source_ids)
 	dbcache.invalidate_entity_cache()
-	dbcache.invalidate_caches()
 
 	return result
 
@@ -299,8 +305,6 @@ def get_performance(dbconn=None,**keys):
 				if c["artist"] == artist:
 					rank = c["rank"]
 					break
-		else:
-			raise exceptions.MissingEntityParameter()
 		results.append({"range":rng,"rank":rank})
 
 	return results
@@ -340,7 +344,6 @@ def get_top_tracks(dbconn=None,**keys):
 def artist_info(dbconn=None,**keys):
 
 	artist = keys.get('artist')
-	if artist is None: raise exceptions.MissingEntityParameter()
 
 	artist_id = sqldb.get_artist_id(artist,dbconn=dbconn)
 	artist = sqldb.get_artist(artist_id,dbconn=dbconn)
@@ -385,7 +388,6 @@ def artist_info(dbconn=None,**keys):
 def track_info(dbconn=None,**keys):
 
 	track = keys.get('track')
-	if track is None: raise exceptions.MissingEntityParameter()
 
 	track_id = sqldb.get_track_id(track,dbconn=dbconn)
 	track = sqldb.get_track(track_id,dbconn=dbconn)
@@ -5,7 +5,6 @@
 import lru
 import psutil
 import json
-import sys
 
 from doreah.regular import runhourly
 from doreah.logging import log
@@ -13,10 +12,16 @@ from ..pkg_global.conf import malojaconfig
 
 
 
-if malojaconfig['USE_GLOBAL_CACHE']:
-
-	cache = lru.LRU(10000)
-	entitycache = lru.LRU(100000)
+if malojaconfig['USE_GLOBAL_CACHE']:
+	CACHE_SIZE = 1000
+	ENTITY_CACHE_SIZE = 100000
+
+	cache = lru.LRU(CACHE_SIZE)
+	entitycache = lru.LRU(ENTITY_CACHE_SIZE)
+
+	hits, misses = 0, 0
 
 
@@ -26,10 +31,11 @@ if malojaconfig['USE_GLOBAL_CACHE']:
 		trim_cache()
 
 	def print_stats():
-		for name,c in (('Cache',cache),('Entity Cache',entitycache)):
-			hits, misses = c.get_stats()
-			log(f"{name}: Size: {len(c)} | Hits: {hits}/{hits+misses} | Estimated Memory: {human_readable_size(c)}")
-		log(f"System RAM Utilization: {psutil.virtual_memory().percent}%")
+		log(f"Cache Size: {len(cache)} [{len(entitycache)} E], System RAM Utilization: {psutil.virtual_memory().percent}%, Cache Hits: {hits}/{hits+misses}")
+		#print("Full rundown:")
+		#import sys
+		#for k in cache.keys():
+		#	print(f"\t{k}\t{sys.getsizeof(cache[k])}")
 
 	def cached_wrapper(inner_func):
@@ -43,9 +49,12 @@ if malojaconfig['USE_GLOBAL_CACHE']:
 			global hits, misses
 			key = (serialize(args),serialize(kwargs), inner_func, kwargs.get("since"), kwargs.get("to"))
 
-			try:
-				return cache[key]
-			except KeyError:
+			if key in cache:
+				hits += 1
+				return cache.get(key)
+			else:
+				misses += 1
 				result = inner_func(*args,**kwargs,dbconn=conn)
 				cache[key] = result
 				return result
@@ -58,18 +67,25 @@ if malojaconfig['USE_GLOBAL_CACHE']:
 	# cache that's aware of what we're calling
 	def cached_wrapper_individual(inner_func):
 
 		def outer_func(set_arg,**kwargs):
 
 			if 'dbconn' in kwargs:
 				conn = kwargs.pop('dbconn')
 			else:
 				conn = None
 
+			#global hits, misses
 			result = {}
 			for id in set_arg:
-				try:
-					result[id] = entitycache[(inner_func,id)]
-				except KeyError:
+				if (inner_func,id) in entitycache:
+					result[id] = entitycache[(inner_func,id)]
+					#hits += 1
+				else:
 					pass
+					#misses += 1
 
 			remaining = inner_func(set(e for e in set_arg if e not in result),dbconn=conn)
 			for id in remaining:
@@ -99,14 +115,13 @@ if malojaconfig['USE_GLOBAL_CACHE']:
 	def trim_cache():
 		ramprct = psutil.virtual_memory().percent
 		if ramprct > malojaconfig["DB_MAX_MEMORY"]:
-			log(f"{ramprct}% RAM usage, clearing cache!")
-			for c in (cache,entitycache):
-				c.clear()
+			log(f"{ramprct}% RAM usage, clearing cache and adjusting size!")
 			#ratio = 0.6
 			#targetsize = max(int(len(cache) * ratio),50)
 			#log(f"Reducing to {targetsize} entries")
 			#cache.set_size(targetsize)
 			#cache.set_size(HIGH_NUMBER)
+			cache.clear()
 			#if cache.get_size() > CACHE_ADJUST_STEP:
 			#	cache.set_size(cache.get_size() - CACHE_ADJUST_STEP)
 
@@ -141,32 +156,3 @@ def serialize(obj):
 	elif isinstance(obj,dict):
 		return "{" + ",".join(serialize(o) + ":" + serialize(obj[o]) for o in obj) + "}"
 	return json.dumps(obj.hashable())
-
-
-def get_size_of(obj,counted=None):
-	if counted is None:
-		counted = set()
-	if id(obj) in counted: return 0
-	size = sys.getsizeof(obj)
-	counted.add(id(obj))
-	try:
-		for k,v in obj.items():
-			size += get_size_of(v,counted=counted)
-	except:
-		try:
-			for i in obj:
-				size += get_size_of(i,counted=counted)
-		except:
-			pass
-	return size
-
-def human_readable_size(obj):
-	units = ['','K','M','G','T','P']
-	idx = 0
-	bytes = get_size_of(obj)
-	while bytes > 1024 and len(units) > idx+1:
-		bytes = bytes / 1024
-		idx += 1
-
-	return f"{bytes:.2f} {units[idx]}B"
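The `get_size_of`/`human_readable_size` helpers removed in this hunk estimate a container's recursive memory footprint and pretty-print it. A runnable sketch of the same idea with explicit type checks instead of bare `except` (names here are my own, not the repo's):

```python
import sys

def deep_size(obj, seen=None):
    # Rough recursive footprint; each object is counted at most once,
    # so shared references do not inflate the total.
    seen = set() if seen is None else seen
    if id(obj) in seen:
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deep_size(v, seen) for v in obj.values())
    elif isinstance(obj, (list, tuple, set)):
        size += sum(deep_size(i, seen) for i in obj)
    return size

def human_readable(nbytes):
    # Scale down by 1024 until the value fits the largest sensible unit.
    units = ['B', 'KB', 'MB', 'GB', 'TB']
    idx = 0
    while nbytes > 1024 and idx < len(units) - 1:
        nbytes /= 1024
        idx += 1
    return f"{nbytes:.2f} {units[idx]}"
```

Note that `sys.getsizeof` alone only counts the container object itself, which is why the recursion is needed at all.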
@@ -1,5 +1,3 @@
-from bottle import HTTPError
-
 class EntityExists(Exception):
 	def __init__(self,entitydict):
 		self.entitydict = entitydict
@@ -10,20 +8,3 @@ class TrackExists(EntityExists):
 
 class ArtistExists(EntityExists):
 	pass
-
-
-class DatabaseNotBuilt(HTTPError):
-	def __init__(self):
-		super().__init__(
-			status=503,
-			body="The Maloja Database is being upgraded to Version 3. This could take quite a long time! (~ 2-5 minutes per 10 000 scrobbles)",
-			headers={"Retry-After":120}
-		)
-
-
-class MissingScrobbleParameters(Exception):
-	def __init__(self,params=[]):
-		self.params = params
-
-class MissingEntityParameter(Exception):
-	pass
@@ -23,8 +23,7 @@ class JinjaDBConnection:
 		return self
 	def __exit__(self, exc_type, exc_value, exc_traceback):
 		self.conn.close()
-		if malojaconfig['USE_REQUEST_CACHE']:
-			log(f"Generated page with {self.hits}/{self.hits+self.misses} local Cache hits",module="debug_performance")
+		log(f"Generated page with {self.hits}/{self.hits+self.misses} local Cache hits",module="debug_performance")
 		del self.cache
 	def __getattr__(self,name):
 		originalmethod = getattr(database,name)
@@ -115,11 +115,8 @@ def connection_provider(func):
 			return func(*args,**kwargs)
 		else:
 			with engine.connect() as connection:
-				with connection.begin():
-					kwargs['dbconn'] = connection
-					return func(*args,**kwargs)
+				kwargs['dbconn'] = connection
+				return func(*args,**kwargs)
 
-	wrapper.__innerfunc__ = func
 	return wrapper
 
 ##### DB <-> Dict translations
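`connection_provider` wraps database functions so they receive a `dbconn` keyword: if the caller already supplied one, it is reused; otherwise a fresh connection is opened for the call. A self-contained sketch of the pattern with a fake connection factory standing in for `engine.connect()`:

```python
from contextlib import contextmanager

@contextmanager
def fake_connect():
    # Stand-in for engine.connect(); yields a dummy connection handle.
    yield "conn-1"

def connection_provider(func):
    # Inject a fresh connection as `dbconn` unless the caller already
    # passed one, so nested calls can share a single connection.
    def wrapper(*args, **kwargs):
        if kwargs.get('dbconn') is not None:
            return func(*args, **kwargs)
        with fake_connect() as connection:
            kwargs['dbconn'] = connection
            return func(*args, **kwargs)
    wrapper.__innerfunc__ = func
    return wrapper

@connection_provider
def count_rows(table, dbconn=None):
    return (table, dbconn)
```

Keeping the undecorated function reachable via `__innerfunc__` lets already-connected code call it directly without opening a second connection.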
@@ -442,7 +439,7 @@ def merge_tracks(target_id,source_ids,dbconn=None):
 		track_id=target_id
 	)
 	result = dbconn.execute(op)
-	clean_db(dbconn=dbconn)
+	clean_db()
 
 	return True
 
@@ -491,8 +488,8 @@ def merge_artists(target_id,source_ids,dbconn=None):
 	# result = dbconn.execute(op)
 
 	# this could have created duplicate tracks
-	merge_duplicate_tracks(artist_id=target_id,dbconn=dbconn)
-	clean_db(dbconn=dbconn)
+	merge_duplicate_tracks(artist_id=target_id)
+	clean_db()
 
 	return True
 
```diff
@@ -871,37 +868,38 @@ def search_track(searchterm,dbconn=None):
 ##### MAINTENANCE
 
 @runhourly
-@connection_provider
-def clean_db(dbconn=None):
+def clean_db():
 
-	log(f"Database Cleanup...")
+	with SCROBBLE_LOCK:
+		with engine.begin() as conn:
+			log(f"Database Cleanup...")
 
-	to_delete = [
-		# tracks with no scrobbles (trackartist entries first)
-		"from trackartists where track_id in (select id from tracks where id not in (select track_id from scrobbles))",
-		"from tracks where id not in (select track_id from scrobbles)",
-		# artists with no tracks
-		"from artists where id not in (select artist_id from trackartists) and id not in (select target_artist from associated_artists)",
-		# tracks with no artists (scrobbles first)
-		"from scrobbles where track_id in (select id from tracks where id not in (select track_id from trackartists))",
-		"from tracks where id not in (select track_id from trackartists)"
-	]
+			to_delete = [
+				# tracks with no scrobbles (trackartist entries first)
+				"from trackartists where track_id in (select id from tracks where id not in (select track_id from scrobbles))",
+				"from tracks where id not in (select track_id from scrobbles)",
+				# artists with no tracks
+				"from artists where id not in (select artist_id from trackartists) and id not in (select target_artist from associated_artists)",
+				# tracks with no artists (scrobbles first)
+				"from scrobbles where track_id in (select id from tracks where id not in (select track_id from trackartists))",
+				"from tracks where id not in (select track_id from trackartists)"
+			]
 
-	for d in to_delete:
-		selection = dbconn.execute(sql.text(f"select * {d}"))
-		for row in selection.all():
-			log(f"Deleting {row}")
-		deletion = dbconn.execute(sql.text(f"delete {d}"))
+			for d in to_delete:
+				selection = conn.execute(sql.text(f"select * {d}"))
+				for row in selection.all():
+					log(f"Deleting {row}")
+				deletion = conn.execute(sql.text(f"delete {d}"))
 
-	log("Database Cleanup complete!")
+			log("Database Cleanup complete!")
 
 
 
 	#if a2+a1>0: log(f"Deleted {a2} tracks without scrobbles ({a1} track artist entries)")
 
 	#if a3>0: log(f"Deleted {a3} artists without tracks")
 
 	#if a5+a4>0: log(f"Deleted {a5} tracks without artists ({a4} scrobbles)")
 
 
 
```
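The `clean_db` variants above both drive the logging `select` and the actual `delete` from the same shared SQL fragments, removing child rows (trackartist entries) before their orphaned parents. A minimal self-contained illustration of that pattern, using a toy sqlite schema rather than Maloja's real one:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    create table tracks (id integer primary key);
    create table trackartists (track_id integer, artist_id integer);
    create table scrobbles (track_id integer);
    insert into tracks values (1), (2);
    insert into trackartists values (1, 10), (2, 20);
    insert into scrobbles values (1);  -- track 2 has no scrobbles
""")

# Shared "from ..." clauses: each fragment feeds both the logging select
# and the delete, and trackartist rows go before the tracks they reference.
to_delete = [
    "from trackartists where track_id in (select id from tracks where id not in (select track_id from scrobbles))",
    "from tracks where id not in (select track_id from scrobbles)",
]
for d in to_delete:
    for row in conn.execute(f"select * {d}"):
        print("Deleting", row)
    conn.execute(f"delete {d}")

remaining = [r[0] for r in conn.execute("select id from tracks")]
```

After the cleanup only track 1 (the one with a scrobble) survives, along with its single trackartist row.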
```diff
@@ -922,39 +920,40 @@ def renormalize_names():
 		rows = conn.execute(DB['artists'].update().where(DB['artists'].c.id == id).values(name_normalized=norm_target))
 
 
-@connection_provider
-def merge_duplicate_tracks(artist_id,dbconn=None):
-	rows = dbconn.execute(
-		DB['trackartists'].select().where(
-			DB['trackartists'].c.artist_id == artist_id
-		)
-	)
-	affected_tracks = [r.track_id for r in rows]
+def merge_duplicate_tracks(artist_id):
+	with engine.begin() as conn:
+		rows = conn.execute(
+			DB['trackartists'].select().where(
+				DB['trackartists'].c.artist_id == artist_id
+			)
+		)
+		affected_tracks = [r.track_id for r in rows]
 
-	track_artists = {}
-	rows = dbconn.execute(
-		DB['trackartists'].select().where(
-			DB['trackartists'].c.track_id.in_(affected_tracks)
-		)
-	)
+		track_artists = {}
+		rows = conn.execute(
+			DB['trackartists'].select().where(
+				DB['trackartists'].c.track_id.in_(affected_tracks)
+			)
+		)
 
 
-	for row in rows:
-		track_artists.setdefault(row.track_id,[]).append(row.artist_id)
+		for row in rows:
+			track_artists.setdefault(row.track_id,[]).append(row.artist_id)
 
-	artist_combos = {}
-	for track_id in track_artists:
-		artist_combos.setdefault(tuple(sorted(track_artists[track_id])),[]).append(track_id)
+		artist_combos = {}
+		for track_id in track_artists:
+			artist_combos.setdefault(tuple(sorted(track_artists[track_id])),[]).append(track_id)
 
-	for c in artist_combos:
-		if len(artist_combos[c]) > 1:
-			track_identifiers = {}
-			for track_id in artist_combos[c]:
-				track_identifiers.setdefault(normalize_name(get_track(track_id)['title']),[]).append(track_id)
-			for track in track_identifiers:
-				if len(track_identifiers[track]) > 1:
-					target,*src = track_identifiers[track]
-					merge_tracks(target,src,dbconn=dbconn)
+		for c in artist_combos:
+			if len(artist_combos[c]) > 1:
+				track_identifiers = {}
+				for track_id in artist_combos[c]:
+					track_identifiers.setdefault(normalize_name(get_track(track_id)['title']),[]).append(track_id)
+				for track in track_identifiers:
+					if len(track_identifiers[track]) > 1:
+						target,*src = track_identifiers[track]
+						merge_tracks(target,src)
 
 
 
```
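`merge_duplicate_tracks` above groups an artist's tracks first by their sorted artist combination and then by normalized title; any group with more than one member gets merged into its first entry. The grouping step can be sketched with toy data (this `normalize_name` is a simplified stand-in for Maloja's, and the dict inputs replace the DB queries):

```python
def normalize_name(name):
    # Toy normalization; the real normalize_name is more involved.
    return name.strip().lower()

def find_duplicate_groups(track_artists, titles):
    """track_artists: {track_id: [artist_id, ...]}, titles: {track_id: title}.
    Returns lists of track ids sharing an artist combo and a normalized title."""
    artist_combos = {}
    for track_id, artists in track_artists.items():
        # Sorting makes the artist combination order-independent.
        artist_combos.setdefault(tuple(sorted(artists)), []).append(track_id)

    groups = []
    for combo, ids in artist_combos.items():
        if len(ids) > 1:
            by_title = {}
            for track_id in ids:
                by_title.setdefault(normalize_name(titles[track_id]), []).append(track_id)
            # Only groups with an actual duplicate survive.
            groups.extend(g for g in by_title.values() if len(g) > 1)
    return groups

track_artists = {1: [7], 2: [7], 3: [7, 8]}
titles = {1: "Slaysenflite", 2: "SLAYSENFLITE ", 3: "Slaysenflite"}
```

With this data, tracks 1 and 2 form a duplicate group (same artist combo, same normalized title), while track 3 stands alone because its artist combination differs.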
```diff
@@ -148,9 +148,9 @@ malojaconfig = Configuration(
 	"Technical":{
 		"cache_expire_positive":(tp.Integer(), "Image Cache Expiration", 60, "Days until images are refetched"),
 		"cache_expire_negative":(tp.Integer(), "Image Cache Negative Expiration", 5, "Days until failed image fetches are reattempted"),
-		"db_max_memory":(tp.Integer(min=0,max=100), "RAM Percentage soft limit", 50, "RAM Usage in percent at which Maloja should no longer increase its database cache."),
+		"db_max_memory":(tp.Integer(min=0,max=100), "RAM Percentage soft limit", 80, "RAM Usage in percent at which Maloja should no longer increase its database cache."),
 		"use_request_cache":(tp.Boolean(), "Use request-local DB Cache", False),
-		"use_global_cache":(tp.Boolean(), "Use global DB Cache", True)
+		"use_global_cache":(tp.Boolean(), "Use global DB Cache", False)
 	},
 	"Fluff":{
 		"scrobbles_gold":(tp.Integer(), "Scrobbles for Gold", 250, "How many scrobbles a track needs to be considered 'Gold' status"),
```
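`db_max_memory` is a soft ceiling: once reported RAM usage reaches the configured percentage, the database cache should stop growing rather than evict. A rough sketch of that behaviour, with the RAM probe injected as a callable so the example stays self-contained (Maloja's actual cache logic may differ):

```python
class BoundedCache:
    """Toy cache that stops accepting new entries once a RAM probe reports
    usage at or above a soft limit in percent (the intent of db_max_memory)."""
    def __init__(self, max_memory_percent, ram_probe):
        self.max_memory_percent = max_memory_percent
        self.ram_probe = ram_probe  # callable returning current RAM usage in %
        self.data = {}

    def put(self, key, value):
        if self.ram_probe() >= self.max_memory_percent:
            return False  # soft limit reached: stop growing, keep existing data
        self.data[key] = value
        return True

cache = BoundedCache(max_memory_percent=80, ram_probe=lambda: 75)  # under the limit
full = BoundedCache(max_memory_percent=80, ram_probe=lambda: 85)   # over the limit
```

Raising the default from 50 to 80 simply lets the cache keep growing until system RAM usage is much higher; nothing is freed when the limit is hit.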
```diff
@@ -1,7 +0,0 @@
-<td style="opacity:0.5;text-align:center;">
-	<svg height="96px" viewBox="0 0 24 24" width="96px">
-		<path d="M0 0h24v24H0z" fill="none"/>
-		<path d="M4.27 3L3 4.27l9 9v.28c-.59-.34-1.27-.55-2-.55-2.21 0-4 1.79-4 4s1.79 4 4 4 4-1.79 4-4v-1.73L19.73 21 21 19.73 4.27 3zM14 7h4V3h-6v5.18l2 2z"/>
-	</svg>
-	<br/>No scrobbles yet!
-</td>
```
```diff
@@ -9,12 +9,8 @@
 {% set charts_cycler = cycler(*charts_14) %}
 
 
-
 <table class="tiles_top"><tr>
 	{% for segment in range(3) %}
-		{% if charts_14[0] is none and loop.first %}
-			{% include 'icons/nodata.jinja' %}
-		{% else %}
 		<td>
 			{% set segmentsize = segment+1 %}
 			<table class="tiles_{{ segmentsize }}x{{ segmentsize }} tiles_sub">
```
```diff
@@ -39,7 +35,6 @@
 			</tr>
 			{%- endfor -%}
 			</table>
 		</td>
-		{% endif %}
 	{% endfor %}
 </tr></table>
```
```diff
@@ -11,9 +11,6 @@
 
 <table class="tiles_top"><tr>
 	{% for segment in range(3) %}
-		{% if charts_14[0] is none and loop.first %}
-			{% include 'icons/nodata.jinja' %}
-		{% else %}
 		<td>
 			{% set segmentsize = segment+1 %}
 			<table class="tiles_{{ segmentsize }}x{{ segmentsize }} tiles_sub">
```
```diff
@@ -38,7 +35,6 @@
 			</tr>
 			{%- endfor %}
 			</table>
 		</td>
-		{% endif %}
 	{% endfor %}
 </tr></table>
```
```diff
@@ -58,10 +58,6 @@ div.header h1 {
 settings icon
 **/
 
-svg {
-	fill: var(--text-color);
-}
-
 div#icon_bar {
 	position:fixed;
 	right:30px;
```
```diff
@@ -73,13 +69,14 @@ div#icon_bar div.clickable_icon {
 	height:26px;
 	width:26px;
 }
-div.clickable_icon svg {
+div.clickable_icon {
+	fill: var(--text-color);
 	cursor: pointer;
 }
-div.clickable_icon:hover svg {
+div.clickable_icon:hover {
 	fill: var(--text-color-focus);
 }
-div.clickable_icon.danger:hover svg {
+div.clickable_icon.danger:hover {
 	fill: red;
 }
 
```
```diff
@@ -524,7 +521,6 @@ table.list tr {
 	background-color: var(--current-bg-color);
 	border-color: var(--current-bg-color);
 	height: 1.45em;
-	transition: opacity 2s;
 
 }
 
```
```diff
@@ -674,13 +670,6 @@ table.list tr.removed {
 }
 
 
-table.list tr.changed {
-	/*background-color: rgba(222,209,180,0.7) !important;*/
-	opacity:0;
-	transition: opacity 0.2s;
-}
-
-
 /*
 table td.artists div {
 	overflow:hidden;
```
```diff
@@ -43,11 +43,7 @@ function reparseScrobble(id, element) {
 	callback_func = function(req){
 		if (req.status == 200) {
 			if (req.response.status != 'no_operation') {
-				//window.location.reload();
-				notifyCallback(req);
-				var newtrack = req.response.scrobble.track;
-				var row = element.parentElement.parentElement.parentElement.parentElement;
-				changeScrobbleRow(row,newtrack);
+				window.location.reload();
 			}
 			else {
 				notifyCallback(req);
```
```diff
@@ -62,43 +58,6 @@ function reparseScrobble(id, element) {
 
 }
 
-function changeScrobbleRow(element,newtrack) {
-	element.classList.add('changed');
-
-	setTimeout(function(){
-		element.getElementsByClassName('track')[0].innerHTML = createTrackCell(newtrack);
-	},200);
-	setTimeout(function(){element.classList.remove('changed')},300);
-}
-
-function createTrackCell(trackinfo) {
-
-	var trackquery = new URLSearchParams();
-	trackinfo.artists.forEach((a)=>trackquery.append('artist',a));
-	trackquery.append('title',trackinfo.title);
-
-	tracklink = document.createElement('a');
-	tracklink.href = "/track?" + trackquery.toString();
-	tracklink.textContent = trackinfo.title;
-
-	artistelements = []
-	var artistholder = document.createElement('span');
-	artistholder.classList.add('artist_in_trackcolumn');
-	for (var a of trackinfo.artists) {
-		var artistquery = new URLSearchParams();
-		artistquery.append('artist',a);
-
-		artistlink = document.createElement('a');
-		artistlink.href = "/artist?" + artistquery.toString();
-		artistlink.textContent = a;
-
-		artistelements.push(artistlink.outerHTML)
-	}
-
-	artistholder.innerHTML = artistelements.join(", ");
-	return artistholder.outerHTML + " – " + tracklink.outerHTML;
-}
-
-
 // EDIT NAME
 function editEntity() {
```
```diff
@@ -1,6 +1,6 @@
 [project]
 name = "malojaserver"
-version = "3.0.7"
+version = "3.0.6"
 description = "Self-hosted music scrobble database"
 readme = "./README.md"
 requires-python = ">=3.7"
```