Compare commits

...

196 Commits

Author SHA1 Message Date
krateng 39a42e915c Initial support for new Spotify export format, GH-215 2023-06-25 18:00:23 +02:00
krateng b8944b4954 Reorganized containerfile to allow caching 2023-03-31 15:47:00 +02:00
krateng 9d9f3b500e More convenient album saving for 3.2 upgrade 2023-03-30 16:27:40 +02:00
krateng 72c58509a1 Added cool tag list script 2023-03-28 22:47:46 +02:00
krateng 11a5cb7401 Fixed scrobbler 2023-03-28 00:06:59 +02:00
krateng b4c8a0d68b Updated settings.md 2023-03-27 19:09:44 +02:00
krateng 88403d2583 Fetch smaller image from musicbrainz, fix GH-206 2023-03-27 18:10:53 +02:00
krateng 866d4ccd9b Merge pull request #205 from FoxxMD/lsio
Refactor container image to use linuxserverio alpine base
2023-03-21 19:14:11 +01:00
FoxxMD 3db51a94d6 Add permission check and docs for PUID/PGID usage 2023-03-17 11:51:11 -04:00
FoxxMD a9c29f158e Refactor containerfile to align with lsio python install
* Simplify project file copy
* Reduce system and project dependency installs into single layer
* Add default permission ENVs for backwards-compatibility
2023-03-17 11:44:16 -04:00
krateng ab8af32812 Merge pull request #204 from FoxxMD/imagePerf
Improve image rendering performance
2023-03-17 16:43:49 +01:00
FoxxMD 7bc2ba0237 Move image base to linuxserverio alpine base
krateng/maloja#96
2023-03-17 10:28:07 -04:00
FoxxMD b8371347b7 Add configuration boolean for rendering album/artist icons
If a user has a slow internet connection or is using a low-power device they may wish to not render icons at all to prevent additional cpu/network load. Defaults to `true` to preserve existing behavior.
2023-03-16 15:21:02 -04:00
FoxxMD 1e3c6597d4 Lazy load tile and entity background images
CSS 'background-image:url' causes the browser to synchronously load images which prevents DOM from fully loading.
Replace this with lazyload.js to make
   * js load style=background-image... after DOM is loaded and
   * only load images in viewport

The end result is much faster *apparent* page loads as all DOM is loaded before images and a reduction in load for both client/server as images are only loaded if they become visible.
2023-03-16 15:01:54 -04:00
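The technique described in this commit message can be sketched roughly as follows. This is not the actual lazyload.js code; the `data-bg` attribute name and the helper are placeholders, and the DOM part is guarded so the snippet also runs outside a browser.

```javascript
// Build the value that would be assigned to element.style.backgroundImage.
function bgStyleValue(url) {
  return "url(" + url + ")";
}

if (typeof document !== "undefined") {
  document.addEventListener("DOMContentLoaded", function () {
    // Only load a background image once its element enters the viewport.
    var observer = new IntersectionObserver(function (entries, obs) {
      entries.forEach(function (entry) {
        if (entry.isIntersecting) {
          entry.target.style.backgroundImage = bgStyleValue(entry.target.dataset.bg);
          obs.unobserve(entry.target); // each image only needs to load once
        }
      });
    });
    document.querySelectorAll("[data-bg]").forEach(function (el) {
      observer.observe(el);
    });
  });
}
```

Because the URL lives in a data attribute rather than in CSS, the browser never fetches it synchronously during the initial render, which is the "apparent page load" win the commit describes.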
krateng 37210995fa Merge pull request #202 from christophernewton/master
Fixed search response failure for manual scrobbling
2023-03-07 16:54:51 +01:00
Chris Newton 94ae453133 Fixed search response failure for manual scrobbling 2023-03-07 11:04:53 +11:00
krateng 93bbaac0e3 Bumped doreah version, fix GH-200 2023-02-26 22:10:13 +01:00
krateng 00a564c54d Hardcoded screenshot url in readme 2023-02-26 16:46:36 +01:00
krateng 4330b0294b Version bump 2023-02-26 16:32:03 +01:00
krateng b53141f065 Renamed workflows, fix GH-181 2023-02-26 16:00:06 +01:00
krateng 3ae395f697 Removed explicit column selection, GH-196 2023-02-26 15:50:37 +01:00
krateng 5466b6c37e Added dependency versions to information output 2023-02-26 15:48:50 +01:00
krateng e85861fb79 Bandaid for entity editing in Firefox, fix GH-188, GH-175 2023-02-25 22:18:57 +01:00
krateng a611b78dbc Removed dead link, fix GH-189 2023-02-25 22:07:12 +01:00
krateng c3ed5f318d Narrowed chart bars a bit, fix GH-195 2023-02-25 21:58:50 +01:00
krateng 073448257a Fixed robots.txt 2023-01-13 06:23:22 +01:00
krateng d12229d8a5 Fixed small visual bug 2023-01-01 23:35:55 +01:00
krateng d8f53a56d2 Fixed info output for Dual Stack 2022-12-05 00:05:19 +01:00
krateng c8f9e9c391 Fix main page display on Safari, fix GH-172 2022-11-30 05:15:20 +01:00
krateng 185a5b3e87 Added scrobbler functionality to selectively enable sites 2022-11-24 00:10:57 +01:00
krateng 95eaf0a3d6 Added interface for picking services to scrobbler 2022-11-23 23:40:44 +01:00
krateng a7d286c90c Added error handling for image upload 2022-10-19 19:53:13 +02:00
krateng ddc78c5756 Made addpicture endpoint part of the external API, GH-169 2022-10-19 19:28:15 +02:00
krateng a12253dc29 Sanitize artists and tracks in lists, GH-167 2022-10-13 18:08:08 +02:00
krateng 9eaeffca7e Sanitize artists and tracks in search results, GH-167 2022-10-13 18:06:02 +02:00
krateng db8389e6c1 Rules 2022-10-13 15:35:21 +02:00
krateng ef06f22622 Version bump 2022-10-07 17:38:12 +02:00
krateng b333009684 Merge branch 'master' of github.com:krateng/maloja 2022-10-07 17:33:07 +02:00
krateng ebd78914f9 Sanitize artists and tracks, fix GH-167 2022-10-07 17:32:34 +02:00
krateng 36d0e7bb8a Merge pull request #153 from badlandspray/master
Track more additional information
2022-09-11 21:02:55 +02:00
krateng 91750db8ac Reduce stored extra info from Listenbrainz API 2022-09-11 21:01:21 +02:00
krateng d5f2c254f3 Fix field name for track length 2022-09-11 20:58:37 +02:00
krateng e3933e7dca Merge pull request #163 from krkk/import_listenbrainz
Implement importing scrobbles from ListenBrainz
2022-08-20 16:55:32 +02:00
Karol Kosek 9b10ca4a5d Implement importing scrobbles from ListenBrainz
Closes: #162
2022-08-16 22:17:02 +02:00
Karol Kosek 2ce2e2f682 Import track lengths from own maloja format
It will also be used when importing ListenBrainz files.
2022-08-16 19:37:15 +02:00
krateng 9917210b66 More intuitive manual scrobbling, GH-160 2022-08-15 17:50:42 +02:00
krateng 5656f8b4c0 Some rules 2022-08-15 17:50:07 +02:00
badlandspray 9ae14da397 scrobble_duration key 2022-07-16 00:54:54 -07:00
badlandspray 3fd02c1675 Add release_artist_name and correct duration 2022-07-15 15:18:22 +00:00
badlandspray f7251c613c Add more fields 2022-07-05 08:34:07 +00:00
badlandspray d57bf33969 Track more additional information 2022-06-09 17:54:20 +00:00
krateng a1b2261fa7 Merge pull request #151 from badlandspray/master
Store album name
2022-06-06 18:02:52 +02:00
krateng 260c587248 Allow minimal listenbrainz payload 2022-06-06 18:02:34 +02:00
badlandspray c1493255b7 Store album name 2022-06-04 07:31:18 +00:00
krateng 97fc38f919 Graceful handling of missing templates 2022-05-26 14:56:04 +02:00
krateng 397d5e7c13 API root path now returns JSON error, fix GH-150 2022-05-26 14:43:29 +02:00
krateng 1eaba888c7 Saving additional info for Listenbrainz API, fix GH-149 2022-05-17 16:34:09 +02:00
krateng 084c7d5a1e Merge branch 'master' of github.com:krateng/maloja 2022-05-17 15:32:51 +02:00
krateng 515fa69fce Fixed Readme links 2022-05-15 21:04:50 +02:00
krateng ca30309450 Merge pull request #146 from badlandspray/master
Track album name and track length
2022-05-08 16:45:51 +02:00
badlandspray 705f4b4252 Track album name and track length 2022-05-08 13:26:42 +00:00
krateng ac498bde73 Refactored some scrobble parsing 2022-05-07 22:24:37 +02:00
krateng f3a04c79b1 Version bump 2022-05-07 15:24:19 +02:00
krateng f74d5679eb Properly passing flags argument for regex sub calls, fix GH-145 2022-05-07 15:17:29 +02:00
krateng 5eb838d5df Added favicon html tag, fix GH-143 2022-05-06 16:30:24 +02:00
krateng 96778709bd Design change for chart tiles 2022-05-05 17:38:11 +02:00
krateng a073930601 Version bump 2022-05-05 16:38:08 +02:00
krateng 81f4e35258 Added API debug feature 2022-05-01 22:39:16 +02:00
krateng c16919eb1e Added rules 2022-05-01 17:53:25 +02:00
krateng e116690640 Fixed leftover whitespaces when parsing titles 2022-04-30 20:19:45 +02:00
krateng 8cb332b9fc Removed underline from linked buttons 2022-04-29 16:35:56 +02:00
krateng 3ede71fc79 Made some parsing rules case insensitive 2022-04-28 06:08:51 +02:00
krateng 77a0a0a41b Merge pull request #139 from alim4r/feature/parse-remix-artists
Add feature to parse remix artists
2022-04-28 04:43:29 +02:00
alim4r ec02672a2e Remove debug print... 2022-04-27 22:30:23 +02:00
alim4r 5941123c52 Set parse_remix_artists default to False 2022-04-27 22:24:45 +02:00
alim4r 91a7aeb50d Add feature to parse remix artists 2022-04-27 20:54:33 +02:00
krateng 20aae955b2 Version bump 2022-04-27 20:30:15 +02:00
krateng d83b44de6e Added tooltip for image upload, GH-138 2022-04-27 17:51:39 +02:00
krateng 8197548285 Improved cache memory output 2022-04-26 19:58:36 +02:00
krateng 6171d1d2e1 Restored custom CSS file functionality, fix GH-135 2022-04-26 19:43:35 +02:00
krateng 0c948561a8 Added more generalized support for static user files, GH-135 2022-04-26 19:41:23 +02:00
krateng 02c77a5e31 Updated commit information 2022-04-26 14:52:19 +02:00
krateng bfa553bed0 Version bump 2022-04-26 14:43:51 +02:00
krateng 3592571afd Merge pull request #130 from krateng/feature-webedit
Version 3.1
2022-04-26 14:38:41 +02:00
krateng c77b7c952f Added status to more API endpoints 2022-04-25 22:54:53 +02:00
krateng 8a44d3def2 Fixed some CSS 2022-04-25 22:01:56 +02:00
krateng cf04583122 Added more connection passing 2022-04-25 22:00:32 +02:00
krateng 8845f931df Wrapped DB write operations in transactions to ensure integrity 2022-04-25 21:50:40 +02:00
krateng 9c6c91f594 Can now reparse without reloading 2022-04-25 20:48:22 +02:00
krateng 2c31df3c58 Fixed cache invalidation after merging 2022-04-25 18:37:19 +02:00
krateng 9c656ee90b Some improvements to process control, should fix GH-112 2022-04-25 18:36:11 +02:00
krateng 938947d06c Added indicator for empty tile stats, fix GH-134 2022-04-25 17:55:46 +02:00
krateng ac3ca0b5e9 Moved exceptions and added handling for more of them 2022-04-25 17:03:44 +02:00
krateng 64d4036f55 Merge branch 'master' into feature-webedit 2022-04-25 16:07:26 +02:00
krateng 6df363a763 Merge branch 'master' of github.com:krateng/maloja 2022-04-25 15:51:55 +02:00
krateng 7062c0b440 Update API.md 2022-04-25 15:51:01 +02:00
krateng ad50ee866c More cache organization 2022-04-25 15:36:15 +02:00
krateng 62abc31930 Version bump 2022-04-25 03:37:12 +02:00
krateng c55e12dd43 Re-enabled cache per default 2022-04-25 03:24:16 +02:00
krateng 3b156a73ff Merge branch 'master' into feature-webedit 2022-04-25 02:53:43 +02:00
krateng 5b48c33a79 Added stresstest and new screenshot 2022-04-25 02:48:01 +02:00
krateng 95f98370cf Updated branch notes 2022-04-25 02:47:03 +02:00
krateng e470e2e43f Potentially fixed nonsensical caching, GH-132 2022-04-25 02:42:07 +02:00
krateng 35f428ef69 Merge branch 'master' of github.com:krateng/maloja 2022-04-24 20:55:57 +02:00
krateng 342b8867d9 Ported cache cleanup from 3.1 2022-04-24 20:55:07 +02:00
krateng bfc83fdbb0 Ported signal handling fix from 3.1 2022-04-24 20:47:17 +02:00
krateng f359662cf3 No longer catching BaseExceptions 2022-04-24 19:41:55 +02:00
krateng de286b58b9 Merge pull request #133 from northys/build_rpi
Build image for raspberry pi 2 (arm/v7)
2022-04-24 17:10:52 +02:00
krateng d5f5b48d85 Removed previous ability, but this time clean and consistent 2022-04-24 16:14:24 +02:00
Jiri Travnicek 00b3e6fc57 actions: build image for linux/arm/v7 (raspberry pi) 2022-04-24 16:12:11 +02:00
Jiri Travnicek e1074ba259 actions: drop ghcr support 2022-04-24 16:11:58 +02:00
krateng 7c77474feb Implemented cache enabling and disabling at runtime 2022-04-24 15:53:30 +02:00
krateng 279499ad9f Fixed old css passed to auth 2022-04-24 15:12:49 +02:00
krateng dc1becd683 Removed release notes that have been moved to 3.0.6 2022-04-24 03:17:35 +02:00
krateng c86ae31ea9 Fixed sins of my youth 2022-04-23 19:22:32 +02:00
krateng c3bb8ad322 Version-bumped Python and dependencies 2022-04-23 17:59:39 +02:00
krateng 6c5f08aa5a Removed special handling of css 2022-04-23 17:32:05 +02:00
krateng 29a6a74c37 Altered the previous fix. Pray I don't alter it further. 2022-04-23 17:24:18 +02:00
krateng 1bbb600481 Fixed small content jumping issue 2022-04-23 17:05:47 +02:00
krateng df07307730 Prepare for release 2022-04-23 16:48:37 +02:00
krateng 74977b18cc Merge branch 'master' into feature-webedit 2022-04-23 16:38:50 +02:00
krateng 1dfda0086e Fixed merging of artists that already share tracks 2022-04-22 22:59:02 +02:00
krateng 7c9f6e9e2d Fixed bug in web interface for non-independent artists 2022-04-22 21:38:35 +02:00
krateng 529d0c8a5d Reload on reparse 2022-04-22 21:37:48 +02:00
krateng cf4b3cd68f Commit 1291 🇨🇭 2022-04-22 20:59:55 +02:00
krateng 9272c191d8 More UI changes 2022-04-22 20:34:46 +02:00
krateng d0ccf3d1ae Small fixes 2022-04-22 20:04:24 +02:00
krateng 10fef00592 Unified remaining icons 2022-04-22 20:04:13 +02:00
krateng 1ed4af10ac Small design changes 2022-04-22 19:06:54 +02:00
krateng 11bc92ee8f Unified icon style somewhat 2022-04-22 19:04:00 +02:00
krateng 98c791064d More interface fixing and notifications 2022-04-22 18:43:40 +02:00
krateng d208290956 Added descriptions to API return dicts 2022-04-22 18:36:06 +02:00
krateng 009d77a75e Reorganized scrobble action area in web interface 2022-04-22 18:29:09 +02:00
krateng e6992f1e90 Moved more icons to jinja 2022-04-22 18:28:40 +02:00
krateng c52ad81fc2 Fixed destructive updating with missing fields 2022-04-22 17:51:42 +02:00
krateng f5d1fbc576 Generalized scrobble updating 2022-04-22 17:43:14 +02:00
krateng a8f8d86ec1 Adjusted reparse additions to new branch changes 2022-04-22 17:25:58 +02:00
krateng e9189b8903 Merge pull request #122 from alim4r/feature/reparse-scrobble
Add reparse scrobble feature
2022-04-22 17:17:11 +02:00
krateng 01d52d7e36 Merge branch 'master' into feature-webedit 2022-04-22 17:16:26 +02:00
krateng 528c954de9 Added output for API-caught errors 2022-04-22 17:14:57 +02:00
krateng 7c0ecda8a2 Fixed duplicate tracks on artist merge 2022-04-22 17:00:07 +02:00
alim4r 495627f3f7 Merge branch 'feature-webedit' into feature/reparse-scrobble 2022-04-21 19:04:32 +02:00
alim4r 6893fd745a Update get_scrobble parameters 2022-04-21 18:28:59 +02:00
krateng 91dae00851 Fixed renaming entities when new and old name are normalized the same 2022-04-21 18:19:33 +02:00
krateng c0ff50b064 Updated admin mode information 2022-04-21 18:08:15 +02:00
krateng 884e95dc58 Manual scrobbling now also uses new notification system 2022-04-21 18:04:01 +02:00
krateng 8023c2d51c Removed merge icon handling on pages that don't use them 2022-04-21 17:59:42 +02:00
krateng 428d92a267 Updated release notes 2022-04-21 17:35:19 +02:00
krateng 20092df02c Only showing valid icons for merging 2022-04-21 17:02:10 +02:00
krateng 713dbc34bb Fixed renaming artist to existing artist 2022-04-21 16:00:29 +02:00
krateng 181406d339 Added exception handling for all native API endpoints 2022-04-21 15:46:29 +02:00
krateng 9b5eb6f723 Fixed notifications of errors 2022-04-21 15:43:11 +02:00
krateng 662923dd5e Fixed caching bug with updating track 2022-04-21 15:41:38 +02:00
krateng ff71a9c526 Fixed renaming track to existing track 2022-04-21 15:13:14 +02:00
krateng fbbd959295 Added exceptions to database 2022-04-21 15:12:48 +02:00
krateng ce495176c1 Fixed passing of dbconn to subfunctions 2022-04-21 15:11:55 +02:00
krateng afc78e75b0 Generalized exception handling for native API 2022-04-21 15:05:54 +02:00
alim4r 85bb1f36cc Ignore scrobbles without a rawscrobble 2022-04-20 21:48:41 +02:00
alim4r c457b58ab8 Quick fix for reparse confirmation & button placement 2022-04-20 20:18:28 +02:00
krateng 62208bf668 Merge branch 'feature-restructure' into feature-webedit 2022-04-20 19:10:41 +02:00
krateng 53bc856222 Merge branch 'master' into feature-restructure 2022-04-20 19:08:16 +02:00
alim4r b525252af1 Add reparse scrobble feature 2022-04-20 15:59:33 +02:00
krateng 397eaf668f Moved static areas together in jinja base template 2022-04-18 23:34:53 +02:00
krateng b31e778d95 Made incomplete merging process a bit less permanent 2022-04-17 20:23:49 +02:00
krateng 6e8cbe6a57 Added callback notifications to edit functions 2022-04-17 20:18:44 +02:00
krateng 45ea7499b2 Added some return values to database 2022-04-17 20:18:26 +02:00
krateng 77c4dac7be Merge branch 'master' into feature-webedit 2022-04-17 19:30:39 +02:00
krateng ea6d70a650 Implemented experimental merging server-side 2022-04-17 17:38:38 +02:00
krateng 57e66fdafd Added client logic for merging 2022-04-17 17:24:23 +02:00
krateng 0d985ff706 Reorganized admin mode icons 2022-04-17 16:46:02 +02:00
krateng 27a9543da9 Added merge icons 2022-04-17 16:16:05 +02:00
krateng c9d2527a98 Added changelog 2022-04-17 15:37:48 +02:00
krateng 977385a700 Fixed editing with special characters 2022-04-17 15:37:08 +02:00
krateng 83e3157ad1 Can now cancel editing 2022-04-17 15:15:29 +02:00
krateng 0525ff400b Merge branch 'feature-restructure' into feature-webedit 2022-04-17 04:45:51 +02:00
krateng 13856a2347 Merge branch 'master' into feature-restructure 2022-04-17 04:44:28 +02:00
krateng fa2ce0c05f Reduced DB connections for cached stats 2022-04-16 04:37:50 +02:00
krateng b806be6e02 Cached stats now use IDs to survive renames 2022-04-16 03:10:51 +02:00
krateng 6601920f69 Fixed entrypoint 2022-04-16 02:17:43 +02:00
krateng f3f7dbd8ef Fixed double request when editing 2022-04-16 02:17:14 +02:00
krateng 263e7cd704 Merge branch 'feature-restructure' into feature-webedit 2022-04-16 02:04:43 +02:00
krateng 5b8e2debbc Merge branch 'master' into feature-restructure 2022-04-16 02:04:04 +02:00
krateng bccd88acd4 Implemented track title editing and refactored edit system 2022-04-15 19:41:44 +02:00
krateng 371e73ac99 Implemented artist name editing 2022-04-15 18:48:03 +02:00
krateng c33fcf1dc1 Added edit function to web interface 2022-04-15 18:16:54 +02:00
krateng 98e1926613 Moved svg icon to jinja snippet 2022-04-15 18:16:49 +02:00
krateng 28d43d00cb Merge branch 'master' into feature-restructure 2022-04-14 20:55:29 +02:00
krateng 4cffc9971d Merge branch 'master' into feature-restructure 2022-04-14 15:19:38 +02:00
krateng d018a758c0 Merge branch 'master' into feature-restructure 2022-04-12 16:20:53 +02:00
krateng 6635a9ac50 Merge branch 'v3' into feature-restructure 2022-04-10 17:44:36 +02:00
krateng 871b3d289d Moved monkey patching and globalconf to subpackage 2022-04-09 21:39:04 +02:00
krateng abde7e72c4 Moved scrobble generation to dev package 2022-04-09 21:24:48 +02:00
krateng 24dfa41ad9 Moved profiler to new dev subpackage 2022-04-09 21:20:48 +02:00
krateng bceb0db09a Moved supervisor to __main__ 2022-04-09 21:11:06 +02:00
krateng 87f1250629 Moved setup to top level 2022-04-09 21:02:17 +02:00
krateng bb68afee12 Moved main process control to __main__ 2022-04-09 20:55:50 +02:00
103 changed files with 2371 additions and 770 deletions


@@ -1,7 +1,7 @@
*
!maloja
!container
!Containerfile
!requirements_pre.txt
!requirements.txt
!pyproject.toml
!README.md


@@ -20,21 +20,12 @@ jobs:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_ACCESS_TOKEN }}
- name: Login to GHCR
if: github.event_name != 'pull_request'
uses: docker/login-action@dd4fa0671be5250ee6f50aedf4cb05514abda2c7
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@f2a13332ac1ce8c0a71aeac48a150dbb1838ab67
with:
images: |
${{ github.repository_owner }}/maloja
ghcr.io/${{ github.repository_owner }}/maloja
# generate Docker tags based on the following events/attributes
tags: |
type=semver,pattern={{version}}
@@ -63,7 +54,7 @@ jobs:
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
platforms: linux/amd64,linux/arm64
platforms: linux/amd64,linux/arm64,linux/arm/v7
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache-new,mode=max

API.md

@@ -1,6 +1,7 @@
# Scrobbling
In order to scrobble from a wide selection of clients, you can use Maloja's standard-compliant APIs with the following settings:
Scrobbling can be done with the native API, see [below](#submitting-a-scrobble).
In order to scrobble from a wide selection of clients, you can also use Maloja's standard-compliant APIs with the following settings:
GNU FM |  
------ | ---------
@@ -41,7 +42,7 @@ The user starts playing '(Fine Layers of) Slaysenflite', which is exactly 3:00 m
* If the user ends the play after 1:22, no scrobble is submitted
* If the user ends the play after 2:06, a scrobble with `"duration":126` is submitted
* If the user jumps back several times and ends the play after 3:57, a scrobble with `"duration":237` is submitted
* If the user jumps back several times and ends the play after 4:49, two scrobbles with `"duration":180` and `"duration":109` should be submitted
* If the user jumps back several times and ends the play after 4:49, two scrobbles with `"duration":180` and `"duration":109` are submitted
</td></tr>
<table>
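One way to read the worked example above is the following sketch. This is an interpretation of the documented behavior, not Maloja's reference implementation; the function name is hypothetical.

```javascript
// Given a track's full length and the total listening time (both in
// seconds), return the "duration" values of the scrobbles to submit.
function scrobbleDurations(length, played) {
  if (played < length / 2) return [];       // too short: no scrobble at all
  var full = Math.floor(played / length);   // complete play-throughs
  var rest = played - full * length;
  if (full === 0) return [played];          // single partial play
  if (rest >= length / 2) {
    // the remainder qualifies as its own scrobble
    var result = [];
    for (var i = 0; i < full; i++) result.push(length);
    result.push(rest);
    return result;
  }
  // remainder too short to count: fold it into the last full play
  var folded = [];
  for (var j = 0; j < full - 1; j++) folded.push(length);
  folded.push(length + rest);
  return folded;
}
```

This reproduces all four cases above for a 3:00 track: 1:22 yields nothing, 2:06 yields one scrobble of 126, 3:57 yields one of 237, and 4:49 yields two of 180 and 109.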
@@ -54,11 +55,26 @@ The native Maloja API is reachable at `/apis/mlj_1`. Endpoints are listed on `/a
All endpoints return JSON data. POST request can be made with query string or form data arguments, but this is discouraged - JSON should be used whenever possible.
No application should ever rely on the non-existence of fields in the JSON data - i.e., additional fields can be added at any time without this being considered a breaking change. Existing fields should usually not be removed or changed, but it is always a good idea to add basic handling for missing fields.
## Submitting a Scrobble
The POST endpoint `/newscrobble` is used to submit new scrobbles. These use a flat JSON structure with the following fields:
| Key | Type | Description |
| --- | --- | --- |
| `artists` | List(String) | Track artists |
| `title` | String | Track title |
| `album` | String | Name of the album (Optional) |
| `albumartists` | List(String) | Album artists (Optional) |
| `duration` | Integer | How long the song was listened to in seconds (Optional) |
| `length` | Integer | Actual length of the full song in seconds (Optional) |
| `time` | Integer | Timestamp of the listen if it was not at the time of submitting (Optional) |
| `nofix` | Boolean | Skip server-side metadata fixing (Optional) |
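A submission using the fields from the table might look like the sketch below. The field names come from the table; the server URL, API key, and how the key is passed (query string here) are placeholder assumptions to check against your instance.

```javascript
// Hypothetical newscrobble payload — all values are invented examples.
var scrobble = {
  artists: ["Artist A", "Artist B"],
  title: "Example Track",
  album: "Example Album",   // optional
  duration: 126,            // seconds actually listened (optional)
  length: 180,              // full track length in seconds (optional)
  time: 1650000000          // Unix timestamp, only for past listens (optional)
};

function submitScrobble(serverUrl, apiKey, payload) {
  // Not called here — requires a running Maloja instance.
  return fetch(serverUrl + "/apis/mlj_1/newscrobble?key=" + apiKey, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload)
  }).then(function (r) { return r.json(); });
}
```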
## General Structure
Most endpoints follow this structure:
The API is not fully consistent in order to ensure backwards-compatibility. Refer to the individual endpoints.
Generally, most endpoints follow this structure:
| Key | Type | Description |
| --- | --- | --- |
@@ -66,7 +82,7 @@ Most endpoints follow this structure:
| `error` | Mapping | Details about the error if one occurred. |
| `warnings` | List | Any warnings that did not result in failure, but should be noted. Field is omitted if there are no warnings! |
| `desc` | String | Human-readable feedback. This can be shown directly to the user if desired. |
| `list` | List | List of returned [entities](#Entity-Structure) |
| `list` | List | List of returned [entities](#entity-structure) |
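Two invented examples of what responses with this structure could look like — the `status` values and error `type` shown here are illustrative assumptions, not documented constants:

```javascript
// Hypothetical success response: no error, no warnings field at all.
var successResponse = {
  status: "success",
  desc: "The scrobble has been submitted."
};

// Hypothetical failure response: the error mapping carries the details.
var failureResponse = {
  status: "failure",
  error: { type: "missing_entity_parameter", desc: "No track title was supplied." },
  desc: "The scrobble could not be submitted."
};
```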
Both errors and warnings have the following structure:
@@ -87,7 +103,7 @@ Whenever a list of entities is returned, they have the following fields:
| Key | Type | Description |
| --- | --- | --- |
| `time` | Integer | Timestamp of the Scrobble in UTC |
| `track` | Mapping | The [track](#Track) being scrobbled |
| `track` | Mapping | The [track](#track) being scrobbled |
| `duration` | Integer | How long the track was played for in seconds |
| `origin` | String | Client that submitted the scrobble, or import source |
@@ -118,7 +134,7 @@ Whenever a list of entities is returned, they have the following fields:
| Key | Type | Description |
| --- | --- | --- |
| `artists` | List | The [artists](#Artist) credited with the track |
| `artists` | List | The [artists](#artist) credited with the track |
| `title` | String | The title of the track |
| `length` | Integer | The full length of the track in seconds |


@@ -1,40 +1,74 @@
FROM alpine:3.15
# Python image includes two Python versions, so use base Alpine
# Based on the work of Jonathan Boeckel <jonathanboeckel1996@gmail.com>
FROM lsiobase/alpine:3.17 as base
WORKDIR /usr/src/app
# Install run dependencies first
RUN apk add --no-cache python3 py3-lxml tzdata
# system pip could be removed after build, but apk then decides to also remove all its
# python dependencies, even if they are explicitly installed as python packages
# whut
COPY --chown=abc:abc ./requirements.txt ./requirements.txt
# based on https://github.com/linuxserver/docker-pyload-ng/blob/main/Dockerfile
# everything but the app installation is run in one command so we can purge
# all build dependencies and cache in the same layer
# it may be possible to decrease image size slightly by using build stage and
# copying all site-packages to runtime stage but the image is already pretty small
RUN \
apk add py3-pip && \
pip install wheel
echo "**** install build packages ****" && \
apk add --no-cache --virtual=build-deps \
gcc \
g++ \
python3-dev \
libxml2-dev \
libxslt-dev \
libffi-dev \
libc-dev \
py3-pip \
linux-headers && \
echo "**** install runtime packages ****" && \
apk add --no-cache \
python3 \
py3-lxml \
tzdata && \
echo "**** install pip dependencies ****" && \
python3 -m ensurepip && \
pip3 install -U --no-cache-dir \
pip \
wheel && \
echo "**** install maloja requirements ****" && \
pip3 install --no-cache-dir -r requirements.txt && \
echo "**** cleanup ****" && \
apk del --purge \
build-deps && \
rm -rf \
/tmp/* \
${HOME}/.cache
# actual installation in extra layer so we can cache the stuff above
COPY ./requirements.txt ./requirements.txt
COPY --chown=abc:abc . .
RUN \
apk add --no-cache --virtual .build-deps gcc g++ python3-dev libxml2-dev libxslt-dev libffi-dev libc-dev py3-pip linux-headers && \
pip install --no-cache-dir -r requirements.txt && \
apk del .build-deps
echo "**** install maloja ****" && \
apk add --no-cache --virtual=install-deps \
py3-pip && \
pip3 install /usr/src/app && \
apk del --purge \
install-deps && \
rm -rf \
/tmp/* \
${HOME}/.cache
# no chance for caching below here
COPY . .
COPY container/root/ /
RUN pip install /usr/src/app
# Docker-specific configuration
# defaulting to IPv4 is no longer necessary (default host is dual stack)
ENV MALOJA_SKIP_SETUP=yes
ENV PYTHONUNBUFFERED=1
ENV \
# Docker-specific configuration
MALOJA_SKIP_SETUP=yes \
PYTHONUNBUFFERED=1 \
# Prevents breaking change for previous container that ran maloja as root
# On linux hosts (non-podman rootless) these variables should be set to the
# host user that should own the host folder bound to MALOJA_DATA_DIRECTORY
PUID=0 \
PGID=0
EXPOSE 42010
# use exec form for better signal handling https://docs.docker.com/engine/reference/builder/#entrypoint
ENTRYPOINT ["maloja", "run"]


@@ -9,7 +9,7 @@
Simple self-hosted music scrobble database to create personal listening statistics. No recommendations, no social network, no nonsense.
![screenshot](screenshot.png?raw=true)
![screenshot](https://raw.githubusercontent.com/krateng/maloja/master/screenshot.png)
You can check [my own Maloja page](https://maloja.krateng.ch) as an example instance.
@@ -20,17 +20,13 @@ You can check [my own Maloja page](https://maloja.krateng.ch) as an example inst
* [Requirements](#requirements)
* [PyPI](#pypi)
* [From Source](#from-source)
* [Docker / Podman](#docker-podman)
* [Docker / Podman](#docker--podman)
* [Extras](#extras)
* [How to use](#how-to-use)
* [Basic control](#basic-control)
* [Data](#data)
* [Customization](#customization)
* [How to scrobble](#how-to-scrobble)
* [Native support](#native-support)
* [Native API](#native-api)
* [Standard-compliant API](#standard-compliant-api)
* [Manual](#manual)
* [How to extend](#how-to-extend)
## Features
@@ -100,6 +96,23 @@ An example of a minimum run configuration to access maloja via `localhost:42010`
docker run -p 42010:42010 -v $PWD/malojadata:/mljdata -e MALOJA_DATA_DIRECTORY=/mljdata krateng/maloja
```
#### Linux Host
**NOTE:** If you are using [rootless containers with Podman](https://developers.redhat.com/blog/2020/09/25/rootless-containers-with-podman-the-basics#why_podman_) this DOES NOT apply to you.
If you are running Docker on a **Linux Host**, you should specify the `user:group` ids of the user who owns the host folder bound to `MALOJA_DATA_DIRECTORY` in order to avoid [docker file permission problems.](https://ikriv.com/blog/?p=4698) These can be specified using the [environment variables **PUID** and **PGID**.](https://docs.linuxserver.io/general/understanding-puid-and-pgid)
To get the UID and GID for the current user run these commands from a terminal:
* `id -u` -- prints UID (EX `1000`)
* `id -g` -- prints GID (EX `1001`)
The modified run command with these variables would look like:
```console
docker run -e PUID=1000 -e PGID=1001 -p 42010:42010 -v $PWD/malojadata:/mljdata -e MALOJA_DATA_DIRECTORY=/mljdata krateng/maloja
```
### Extras
* If you'd like to display images, you will need API keys for [Last.fm](https://www.last.fm/api/account/create) and [Spotify](https://developer.spotify.com/dashboard/applications). These are free of charge!
@@ -139,6 +152,7 @@ If you would like to import your previous scrobbles, use the command `maloja imp
* a Last.fm export generated by [benfoxall's website](https://benjaminbenben.com/lastfm-to-csv/) ([GitHub page](https://github.com/benfoxall/lastfm-to-csv))
* an official [Spotify data export file](https://www.spotify.com/us/account/privacy/)
* an official [ListenBrainz export file](https://listenbrainz.org/profile/export/)
* the export of another Maloja instance
⚠️ Never import your data while Maloja is running. If you need to import inside a Docker container, start the container in shell mode instead and perform the import before starting Maloja, as mentioned above.


@@ -11,7 +11,8 @@ const ALWAYS_SCROBBLE_SECONDS = 60*3;
// Longer songs are always scrobbled when playing at least 2 minutes
pages = {
"Plex Web":{
"plex":{
"name":"Plex",
"patterns":[
"https://app.plex.tv",
"http://app.plex.tv",
@@ -20,31 +21,36 @@
],
"script":"plex.js"
},
"YouTube Music":{
"ytmusic":{
"name":"YouTube Music",
"patterns":[
"https://music.youtube.com"
],
"script":"ytmusic.js"
},
"Spotify Web":{
"spotify":{
"name":"Spotify",
"patterns":[
"https://open.spotify.com"
],
"script":"spotify.js"
},
"Bandcamp":{
"bandcamp":{
"name":"Bandcamp",
"patterns":[
"bandcamp.com"
],
"script":"bandcamp.js"
},
"Soundcloud":{
"soundcloud":{
"name":"Soundcloud",
"patterns":[
"https://soundcloud.com"
],
"script":"soundcloud.js"
},
"Navidrome":{
"navidrome":{
"name":"Navidrome",
"patterns":[
"https://navidrome.",
"http://navidrome."
@@ -77,6 +83,13 @@ function onTabUpdated(tabId, changeInfo, tab) {
//console.log("Still on same page!")
tabManagers[tabId].update();
// check if the setting for this page is still active
chrome.storage.local.get(["service_active_" + page],function(result){
if (!result["service_active_" + page]) {
delete tabManagers[tabId];
}
});
return
}
}
@@ -90,13 +103,21 @@
patterns = pages[key]["patterns"];
for (var i=0;i<patterns.length;i++) {
if (tab.url.includes(patterns[i])) {
console.log("New page on tab " + tabId + " will be handled by new " + key + " manager!");
tabManagers[tabId] = new Controller(tabId,key);
updateTabNum();
return
//chrome.tabs.executeScript(tab.id,{"file":"sitescripts/" + pages[key]["script"]})
// check if we even like that page
chrome.storage.local.get(["service_active_" + key],function(result){
if (result["service_active_" + key]) {
console.log("New page on tab " + tabId + " will be handled by new " + key + " manager!");
tabManagers[tabId] = new Controller(tabId,key);
updateTabNum();
//chrome.tabs.executeScript(tab.id,{"file":"sitescripts/" + pages[key]["script"]})
}
else {
console.log("New page on tab " + tabId + " is " + key + ", not enabled!");
}
});
return;
}
}
}
@@ -127,10 +148,10 @@ function onInternalMessage(request,sender) {
for (tabId in tabManagers) {
manager = tabManagers[tabId]
if (manager.currentlyPlaying) {
answer.push([manager.page,manager.currentArtist,manager.currentTitle]);
answer.push([pages[manager.page]['name'],manager.currentArtist,manager.currentTitle]);
}
else {
answer.push([manager.page,null]);
answer.push([pages[manager.page]['name'],null]);
}
}


@@ -1,6 +1,6 @@
{
"name": "Maloja Scrobbler",
"version": "1.11",
"version": "1.13",
"description": "Scrobbles tracks from various sites to your Maloja server",
"manifest_version": 2,
"permissions": [


@@ -14,7 +14,7 @@
color:beige;
font-family:'Ubuntu';
}
input {
input[type=text] {
width:270px;
font-family:'Ubuntu';
outline:none;
@@ -33,10 +33,14 @@
<br /><br />
<span id="checkmark_key"></span> <span>API key:</span><br />
<input type="text" id="apikey" />
<br/><br/>
<hr/>
<span>Tabs:</span>
<list id="playinglist">
</list>
<hr/>
<span>Services:</span>
<list id="sitelist">
</list>
</div>
</body>
</html>


@@ -1,26 +1,71 @@
// duplicate this info for now, don't know if there is a better way than sending messages
var pages = {
"plex":"Plex",
"ytmusic":"YouTube Music",
"spotify":"Spotify",
"bandcamp":"Bandcamp",
"soundcloud":"Soundcloud",
"navidrome":"Navidrome"
}
var config_defaults = {
serverurl:"http://localhost:42010",
apikey:"BlackPinkInYourArea"
}
for (var key in pages) {
config_defaults["service_active_" + key] = true;
}
document.addEventListener("DOMContentLoaded",function() {
var sitelist = document.getElementById("sitelist");
for (var identifier in pages) {
sitelist.append(document.createElement('br'));
var checkbox = document.createElement('input');
checkbox.type = "checkbox";
checkbox.id = "service_active_" + identifier;
var label = document.createElement('label');
label.for = checkbox.id;
label.textContent = pages[identifier];
sitelist.appendChild(checkbox);
sitelist.appendChild(label);
checkbox.addEventListener("change",toggleSite);
}
document.getElementById("serverurl").addEventListener("change",checkServer);
document.getElementById("apikey").addEventListener("change",checkServer);
document.getElementById("serverurl").addEventListener("focusout",checkServer);
document.getElementById("apikey").addEventListener("focusout",checkServer);
document.getElementById("serverurl").addEventListener("input",saveConfig);
document.getElementById("apikey").addEventListener("input",saveConfig);
document.getElementById("serverurl").addEventListener("input",saveServer);
document.getElementById("apikey").addEventListener("input",saveServer);
chrome.runtime.onMessage.addListener(onInternalMessage);
chrome.storage.local.get(config_defaults,function(result){
console.log(result);
for (var key in result) {
document.getElementById(key).value = result[key];
// booleans
if (result[key] == true || result[key] == false) {
document.getElementById(key).checked = result[key];
}
// text
else{
document.getElementById(key).value = result[key];
}
}
checkServer();
})
@@ -31,6 +76,11 @@ document.addEventListener("DOMContentLoaded",function() {
});
function toggleSite(evt) {
var element = evt.target;
chrome.storage.local.set({ [element.id]: element.checked });
}
function onInternalMessage(request,sender) {
if (request.type == "response") {
@@ -50,8 +100,8 @@ function onInternalMessage(request,sender) {
function saveConfig() {
for (var key in config_defaults) {
function saveServer() {
for (var key of ["serverurl","apikey"]) {
var value = document.getElementById(key).value;
chrome.storage.local.set({ [key]: value });
}
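The popup script above derives one `service_active_<id>` toggle per supported service from the shared `pages` mapping. The same defaults-merging idea, sketched in Python purely for illustration (the names mirror the extension's JS, but this snippet is not part of the codebase):

```python
# Hypothetical mirror of the extension's defaults logic: one boolean
# toggle per service is derived from the shared `pages` mapping.
pages = {
    "plex": "Plex",
    "ytmusic": "YouTube Music",
    "spotify": "Spotify",
}
config_defaults = {
    "serverurl": "http://localhost:42010",
    "apikey": "BlackPinkInYourArea",
}
for key in pages:
    config_defaults["service_active_" + key] = True  # enabled by default
```

Keeping the service list in one place means adding a scrobble source only requires a new `pages` entry; its storage key and default follow automatically.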


@@ -0,0 +1,10 @@
#!/usr/bin/with-contenv bash
if [ "$(s6-setuidgid abc id -u)" = "0" ]; then
echo "-------------------------------------"
echo "WARN: Running as root! If you meant to do this then this message can be ignored."
echo "If you are running this container on a *linux* host and are not using podman rootless you SHOULD"
echo "change the ENVs PUID and PGID for this container to ensure correct permissions on your config folder."
echo -e "See: https://github.com/krateng/maloja#linux-host\n"
echo -e "-------------------------------------\n"
fi


@@ -0,0 +1 @@
oneshot


@@ -0,0 +1 @@
/etc/s6-overlay/s6-rc.d/init-permission-check/run


@@ -0,0 +1,7 @@
#!/usr/bin/with-contenv bash
# used https://github.com/linuxserver/docker-wikijs/blob/master/root/etc/s6-overlay/s6-rc.d/svc-wikijs/run as a template
echo -e "\nMaloja is starting!"
exec \
s6-setuidgid abc python -m maloja run


@@ -0,0 +1 @@
longrun

dev/list_tags.sh Normal file

@@ -0,0 +1 @@
git tag -l '*.0' -n1 --sort=v:refname


@@ -34,6 +34,7 @@ minor_release_name: "Yeonhee"
- "[Feature] Added notification system for web interface"
- "[Bugfix] Fixed crash when encountering error in Lastfm import"
3.0.6:
commit: "b3d4cb7a153845d1f5a5eef67a6508754e338f2f"
notes:
- "[Performance] Implemented search in database"
- "[Bugfix] Better parsing of featuring artists"
@@ -41,3 +42,10 @@ minor_release_name: "Yeonhee"
- "[Bugfix] Fixed importing a Spotify file without path"
- "[Bugfix] No longer releasing database lock during scrobble creation"
- "[Distribution] Experimental arm64 image"
3.0.7:
commit: "62abc319303a6cb6463f7c27b6ef09b76fc67f86"
notes:
- "[Bugfix] Improved signal handling"
- "[Bugfix] Fixed constant re-caching of all-time stats, significantly increasing page load speed"
- "[Logging] Disabled cache information when cache is not used"
- "[Distribution] Experimental arm/v7 image"

dev/releases/3.1.yml Normal file

@@ -0,0 +1,46 @@
minor_release_name: "Soyeon"
3.1.0:
commit: "bfa553bed05d7dba33f611a44485d6cf460ba308"
notes:
- "[Architecture] Cleaned up legacy process control"
- "[Architecture] Added proper exception framework to native API"
- "[Feature] Implemented track title and artist name editing from web interface"
- "[Feature] Implemented track and artist merging from web interface"
- "[Feature] Implemented scrobble reparsing from web interface"
- "[Performance] Adjusted cache sizes"
- "[Logging] Added cache memory use information"
- "[Technical] Bumped Python Version and various dependencies"
3.1.1:
commit: "20aae955b2263be07c56bafe4794f622117116ef"
notes:
- "[Bugfix] Fixed inclusion of custom css files"
- "[Bugfix] Fixed list values in configuration"
3.1.2:
commit: "a0739306013cd9661f028fb5b2620cfa2d298aa4"
notes:
- "[Feature] Added remix artist parsing"
- "[Feature] Added API debug mode"
- "[Bugfix] Fixed leftover whitespaces when parsing titles"
- "[Bugfix] Fixed handling of fallthrough values in config file"
3.1.3:
commit: "f3a04c79b1c37597cdf3cafcd95e3c923cd6a53f"
notes:
- "[Bugfix] Fixed infinite recursion with capitalized featuring delimiters"
- "[Bugfix] Fixed favicon display"
3.1.4:
commit: "ef06f2262205c903e7c3060e2d2d52397f8ffc9d"
notes:
- "[Feature] Expanded information saved from Listenbrainz API"
- "[Feature] Added import for Listenbrainz exports"
- "[Bugfix] Sanitized artists and tracks with html-like structure"
3.1.5:
commit: "4330b0294bc0a01cdb841e2e3db370108da901db"
notes:
- "[Feature] Made image upload part of regular API"
- "[Bugfix] Additional entity name sanitization"
- "[Bugfix] Fixed image display on Safari"
- "[Bugfix] Fixed entity editing on Firefox"
- "[Bugfix] Made compatible with SQLAlchemy 2.0"
upcoming:
notes:
- "[Bugfix] Fixed configuration of time format"

dev/testing/stresstest.py Normal file

@@ -0,0 +1,43 @@
import threading
import subprocess
import time
import requests
import os
ACTIVE = True
build_cmd = ["docker","build","-t","maloja",".","-f","Containerfile"]
subprocess.run(build_cmd)
common_prc = (
["docker","run","--rm","-v",f"{os.path.abspath('./testdata')}:/mlj","-e","MALOJA_DATA_DIRECTORY=/mlj"],
["maloja"]
)
servers = [
{'port': 42010},
{'port': 42011, 'extraargs':["--memory=1g"]},
{'port': 42012, 'extraargs':["--memory=500m"]}
]
for s in servers:
cmd = common_prc[0] + ["-p",f"{s['port']}:42010"] + s.get('extraargs',[]) + common_prc[1]
print(cmd)
t = threading.Thread(target=subprocess.run,args=(cmd,))
s['thread'] = t
t.daemon = True
t.start()
time.sleep(5)
time.sleep(5)
while ACTIVE:
time.sleep(1)
try:
for s in servers:
requests.get(f"http://localhost:{s['port']}")
except KeyboardInterrupt:
ACTIVE = False
except Exception:
pass
for s in servers:
s['thread'].join()
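The stress test above composes each `docker run` invocation from a shared prefix/suffix pair plus per-server options. The splicing can be sketched in isolation (the server entry below is made up for illustration):

```python
# Shared command parts: everything before the per-server flags, and the image name.
common_prc = (
    ["docker", "run", "--rm"],
    ["maloja"],
)
server = {"port": 42011, "extraargs": ["--memory=1g"]}  # hypothetical entry

# Per-server flags are spliced in between the shared prefix and the image name.
cmd = common_prc[0] + ["-p", f"{server['port']}:42010"] + server.get("extraargs", []) + common_prc[1]
```

Because `docker run` flags must precede the image name, the image is kept in the suffix list so per-server options always land in the right position.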


@@ -6,6 +6,7 @@ FOLDER = "dev/releases"
releases = {}
for f in os.listdir(FOLDER):
if f == "branch.yml": continue
#maj,min = (int(i) for i in f.split('.')[:2])
with open(os.path.join(FOLDER,f)) as fd:
@@ -43,7 +44,7 @@ for version in releases:
try:
prev_tag = sp.check_output(["git","show",f'v{maj}.{min}.{hot}']).decode()
prev_tag_commit = prev_tag.split('\n')[6].split(" ")[1]
except:
except Exception:
pass
else:
assert prev_tag_commit == info['commit']


@@ -1,4 +1,4 @@
# monkey patching
from . import monkey
from .pkg_global import monkey
# configuration before all else
from . import globalconf
from .pkg_global import conf


@@ -1,4 +1,184 @@
# make the package itself runnable with python -m maloja
import os
import signal
import subprocess
import time
from .proccontrol.control import main
main()
from setproctitle import setproctitle
from ipaddress import ip_address
from doreah.control import mainfunction
from doreah.io import col
from doreah.logging import log
from . import __pkginfo__ as pkginfo
from .pkg_global import conf
from .proccontrol import tasks
from .setup import setup
from .dev import generate, apidebug
def print_header_info():
print()
#print("#####")
print(col['yellow']("Maloja"),f"v{pkginfo.VERSION}")
print(pkginfo.HOMEPAGE)
#print("#####")
print()
def get_instance():
try:
return int(subprocess.check_output(["pidof","maloja"]))
except Exception:
return None
def get_instance_supervisor():
try:
return int(subprocess.check_output(["pidof","maloja_supervisor"]))
except Exception:
return None
def restart():
if stop():
start()
else:
print(col["red"]("Could not stop Maloja!"))
def start():
if get_instance_supervisor() is not None:
print("Maloja is already running.")
else:
print_header_info()
setup()
try:
#p = subprocess.Popen(["python3","-m","maloja.server"],stdout=subprocess.DEVNULL,stderr=subprocess.DEVNULL)
sp = subprocess.Popen(["python3","-m","maloja","supervisor"],stdout=subprocess.DEVNULL,stderr=subprocess.DEVNULL)
print(col["green"]("Maloja started!"))
port = conf.malojaconfig["PORT"]
print("Visit your server address (Port " + str(port) + ") to see your web interface. Visit /admin_setup to get started.")
print("If you're installing this on your local machine, these links should get you there:")
print("\t" + col["blue"]("http://localhost:" + str(port)))
print("\t" + col["blue"]("http://localhost:" + str(port) + "/admin_setup"))
return True
except Exception:
print("Error while starting Maloja.")
return False
def stop():
for attempt in [(signal.SIGTERM,2),(signal.SIGTERM,5),(signal.SIGKILL,3),(signal.SIGKILL,5)]:
pid_sv = get_instance_supervisor()
pid = get_instance()
if pid is None and pid_sv is None:
print("Maloja stopped!")
return True
if pid_sv is not None:
os.kill(pid_sv,attempt[0])
if pid is not None:
os.kill(pid,attempt[0])
time.sleep(attempt[1])
return False
print("Maloja stopped!")
return True
def onlysetup():
print_header_info()
setup()
print("Setup complete!")
def run_server():
print_header_info()
setup()
setproctitle("maloja")
from . import server
server.run_server()
def run_supervisor():
setproctitle("maloja_supervisor")
while True:
log("Maloja is not running, starting...",module="supervisor")
try:
process = subprocess.Popen(
["python3", "-m", "maloja","run"],
stdout=subprocess.DEVNULL,
stderr=subprocess.DEVNULL,
)
except Exception as e:
log("Error starting Maloja: " + str(e),module="supervisor")
else:
try:
process.wait()
except Exception as e:
log("Maloja crashed: " + str(e),module="supervisor")
def debug():
os.environ["MALOJA_DEV_MODE"] = 'true'
conf.malojaconfig.load_environment()
direct()
def print_info():
print_header_info()
print(col['lightblue']("Configuration Directory:"),conf.dir_settings['config'])
print(col['lightblue']("Data Directory: "),conf.dir_settings['state'])
print(col['lightblue']("Log Directory: "),conf.dir_settings['logs'])
print(col['lightblue']("Network: "),f"Dual Stack, Port {conf.malojaconfig['port']}" if conf.malojaconfig['host'] == "*" else f"IPv{ip_address(conf.malojaconfig['host']).version}, Port {conf.malojaconfig['port']}")
print(col['lightblue']("Timezone: "),f"UTC{conf.malojaconfig['timezone']:+d}")
print()
try:
import pkg_resources
for pkg in ("sqlalchemy","waitress","bottle","doreah","jinja2"):
print(col['cyan'] (f"{pkg}:".ljust(13)),pkg_resources.get_distribution(pkg).version)
except ImportError:
print("Could not determine dependency versions.")
print()
@mainfunction({"l":"level","v":"version","V":"version"},flags=['version','include_images'],shield=True)
def main(*args,**kwargs):
actions = {
# server
"start":start,
"restart":restart,
"stop":stop,
"run":run_server,
"supervisor":run_supervisor,
"debug":debug,
"setup":onlysetup,
# admin scripts
"import":tasks.import_scrobbles, # maloja import /x/y.csv
"backup":tasks.backup, # maloja backup --targetfolder /x/y --include_images
"generate":generate.generate_scrobbles, # maloja generate 400
"export":tasks.export, # maloja export
"apidebug":apidebug.run, # maloja apidebug
# aux
"info":print_info
}
if "version" in kwargs:
print(pkginfo.VERSION)
return True
else:
try:
action, *args = args
action = actions[action]
except (ValueError, KeyError):
print("Valid commands: " + " ".join(a for a in actions))
return False
return action(*args,**kwargs)
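`stop()` above escalates from `SIGTERM` to `SIGKILL`, re-checking for the process between attempts. A standalone sketch of that escalation pattern (the `get_pid` callable is a stand-in for the `get_instance`/`get_instance_supervisor` helpers):

```python
import os
import signal
import time

# (signal to send, seconds to wait afterwards) - gentle first, forceful later
ATTEMPTS = [(signal.SIGTERM, 2), (signal.SIGTERM, 5), (signal.SIGKILL, 3), (signal.SIGKILL, 5)]

def stop_process(get_pid):
    """get_pid returns the target PID, or None once the process is gone."""
    for sig, wait in ATTEMPTS:
        pid = get_pid()
        if pid is None:
            return True  # already stopped
        os.kill(pid, sig)
        time.sleep(wait)
    return get_pid() is None
```

Re-resolving the PID on every attempt matters: once the first `SIGTERM` succeeds, the loop exits early instead of signalling a recycled PID.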


@@ -4,7 +4,7 @@
# you know what f*ck it
# this is hardcoded for now because of that damn project / package name discrepancy
# i'll fix it one day
VERSION = "3.0.6"
VERSION = "3.1.5"
HOMEPAGE = "https://github.com/krateng/maloja"


@@ -47,9 +47,12 @@ def init_apis(server):
server.get(altpath_empty_cl)(alias_api)
server.post(altpath_empty_cl)(alias_api)
def invalid_api(pth):
def invalid_api(pth=''):
response.status = 404
return {"error":"Invalid API"}
server.get("/apis/<pth:path>")(invalid_api)
server.post("/apis/<pth:path>")(invalid_api)
server.get("/apis")(invalid_api)
server.post("/apis")(invalid_api)


@@ -4,7 +4,7 @@
from doreah.keystore import KeyStore
from doreah.logging import log
from ..globalconf import data_dir
from ..pkg_global.conf import data_dir
apikeystore = KeyStore(file=data_dir['clients']("apikeys.yml"),save_endpoint="/apis/mlj_1/apikeys")


@@ -62,7 +62,7 @@ class APIHandler:
try:
response.status,result = self.handle(path,keys)
except:
except Exception:
exceptiontype = sys.exc_info()[0]
if exceptiontype in self.errors:
response.status,result = self.errors[exceptiontype]
@@ -82,7 +82,7 @@ class APIHandler:
try:
methodname = self.get_method(path,keys)
method = self.methods[methodname]
except:
except Exception:
log("Could not find a handler for method " + str(methodname) + " in API " + self.__apiname__,module="debug")
log("Keys: " + str(keys),module="debug")
raise InvalidMethodException()
@@ -94,5 +94,5 @@ class APIHandler:
# fixing etc is handled by the main scrobble function
try:
return database.incoming_scrobble(rawscrobble,api=self.__apiname__,client=client)
except:
except Exception:
raise ScrobblingException()


@@ -76,7 +76,7 @@ class Audioscrobbler(APIHandler):
#(artists,title) = cla.fullclean(artiststr,titlestr)
try:
timestamp = int(keys["timestamp"])
except:
except Exception:
timestamp = None
#database.createScrobble(artists,title,timestamp)
self.scrobble({'track_artists':[artiststr],'track_title':titlestr,'scrobble_time':timestamp},client=client)


@@ -73,6 +73,8 @@ class AudioscrobblerLegacy(APIHandler):
client = self.mobile_sessions.get(key)
for count in range(50):
artist_key = f"a[{count}]"
album_key = f"b[{count}]"
length_key = f"l[{count}]"
track_key = f"t[{count}]"
time_key = f"i[{count}]"
if artist_key not in keys or track_key not in keys:
@@ -80,14 +82,21 @@ class AudioscrobblerLegacy(APIHandler):
artiststr,titlestr = keys[artist_key], keys[track_key]
try:
timestamp = int(keys[time_key])
except:
except Exception:
timestamp = None
#database.createScrobble(artists,title,timestamp)
self.scrobble({
scrobble = {
'track_artists':[artiststr],
'track_title':titlestr,
'scrobble_time':timestamp
},client=client)
'scrobble_time':timestamp,
}
if album_key in keys:
scrobble['album_name'] = keys[album_key]
if length_key in keys:
scrobble['track_length'] = keys[length_key]
#database.createScrobble(artists,title,timestamp)
self.scrobble(scrobble, client=client)
return 200,"OK\n"
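The legacy Audioscrobbler protocol indexes batch fields as `a[0]`, `t[0]`, `b[0]`, `l[0]` and so on. The collection loop above, which now also picks up the optional album and length fields, can be sketched in isolation (the submission dict below is made up):

```python
# Hypothetical submission: artist, track and album for slot 0
keys = {"a[0]": "Some Artist", "t[0]": "Some Title", "b[0]": "Some Album"}

scrobbles = []
for count in range(50):
    artist_key, track_key = f"a[{count}]", f"t[{count}]"
    if artist_key not in keys or track_key not in keys:
        break  # end of the batch
    scrobble = {"track_artists": [keys[artist_key]], "track_title": keys[track_key]}
    # optional fields are only added when the client actually sent them
    if f"b[{count}]" in keys:
        scrobble["album_name"] = keys[f"b[{count}]"]
    if f"l[{count}]" in keys:
        scrobble["track_length"] = keys[f"l[{count}]"]
    scrobbles.append(scrobble)
```

Building the dict first and submitting it afterwards (instead of an inline call) is what lets the optional keys be attached conditionally.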


@@ -4,7 +4,7 @@ from .. import database
import datetime
from ._apikeys import apikeystore
from ..globalconf import malojaconfig
from ..pkg_global.conf import malojaconfig
class Listenbrainz(APIHandler):
@@ -34,7 +34,7 @@ class Listenbrainz(APIHandler):
def submit(self,pathnodes,keys):
try:
token = self.get_token_from_request_keys(keys)
except:
except Exception:
raise BadAuthException()
client = apikeystore.check_and_identify_key(token)
@@ -45,7 +45,7 @@ class Listenbrainz(APIHandler):
try:
listentype = keys["listen_type"]
payload = keys["payload"]
except:
except Exception:
raise MalformedJSONException()
if listentype == "playing_now":
@@ -55,17 +55,30 @@ class Listenbrainz(APIHandler):
try:
metadata = listen["track_metadata"]
artiststr, titlestr = metadata["artist_name"], metadata["track_name"]
albumstr = metadata.get("release_name")
additional = metadata.get("additional_info",{})
try:
timestamp = int(listen["listened_at"])
except:
except Exception:
timestamp = None
except:
except Exception:
raise MalformedJSONException()
extrafields = {
# fields that will not be consumed by regular scrobbling
# will go into 'extra'
k:additional[k]
for k in ['track_mbid', 'release_mbid', 'artist_mbids','recording_mbid','tags']
if k in additional
}
self.scrobble({
'track_artists':[artiststr],
'track_title':titlestr,
'scrobble_time':timestamp
'album_name':albumstr,
'scrobble_time':timestamp,
'track_length': additional.get("duration"),
**extrafields
},client=client)
return 200,{"status":"ok"}
@@ -74,7 +87,7 @@ class Listenbrainz(APIHandler):
def validate_token(self,pathnodes,keys):
try:
token = self.get_token_from_request_keys(keys)
except:
except Exception:
raise BadAuthException()
if not apikeystore.check_key(token):
raise InvalidAuthException()
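The `extrafields` comprehension above whitelists which `additional_info` keys survive into the scrobble's extra data. With a hypothetical payload it behaves like this:

```python
# Hypothetical additional_info from a Listenbrainz listen submission
additional = {"duration": 215, "tags": ["rock"], "media_player": "someplayer"}

# Only whitelisted keys are kept; anything else is silently dropped.
extrafields = {
    k: additional[k]
    for k in ["track_mbid", "release_mbid", "artist_mbids", "recording_mbid", "tags"]
    if k in additional
}
```

Note that `duration` is not part of the whitelist; it is consumed separately as `track_length` in the scrobble dict.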


@@ -1,5 +1,6 @@
import os
import math
import traceback
from bottle import response, static_file, request, FormsDict
@@ -12,7 +13,7 @@ from nimrodel import Multi
from .. import database
from ..globalconf import malojaconfig, data_dir
from ..pkg_global.conf import malojaconfig, data_dir
@@ -39,15 +40,48 @@ api.__apipath__ = "mlj_1"
errors = {
database.MissingScrobbleParameters: lambda e: (400,{
database.exceptions.MissingScrobbleParameters: lambda e: (400,{
"status":"failure",
"error":{
'type':'missing_scrobble_data',
'value':e.params,
'desc':"A scrobble requires these parameters."
'desc':"The scrobble is missing needed parameters."
}
}),
Exception: lambda e: (500,{
database.exceptions.MissingEntityParameter: lambda e: (400,{
"status":"error",
"error":{
'type':'missing_entity_parameter',
'value':None,
'desc':"This API call is not valid without an entity (track or artist)."
}
}),
database.exceptions.EntityExists: lambda e: (409,{
"status":"failure",
"error":{
'type':'entity_exists',
'value':e.entitydict,
'desc':"This entity already exists in the database. Consider merging instead."
}
}),
database.exceptions.DatabaseNotBuilt: lambda e: (503,{
"status":"error",
"error":{
'type':'server_not_ready',
'value':'db_upgrade',
'desc':"The database is being upgraded. Please try again later."
}
}),
images.MalformedB64: lambda e: (400,{
"status":"failure",
"error":{
'type':'malformed_b64',
'value':None,
'desc':"The provided base 64 string is not valid."
}
}),
# for http errors, use their status code
Exception: lambda e: ((e.status_code if hasattr(e,'status_code') else 500),{
"status":"failure",
"error":{
'type':'unknown_error',
@@ -57,6 +91,21 @@ errors = {
})
}
def catch_exceptions(func):
def protector(*args,**kwargs):
try:
return func(*args,**kwargs)
except Exception as e:
print(traceback.format_exc())
for etype in errors:
if isinstance(e,etype):
errorhandling = errors[etype](e)
response.status = errorhandling[0]
return errorhandling[1]
protector.__doc__ = func.__doc__
protector.__annotations__ = func.__annotations__
return protector
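`catch_exceptions` walks the `errors` mapping in insertion order and returns the first matching handler's status/payload pair. A self-contained sketch of the same pattern (with toy handlers, not the real error table):

```python
# Toy error table: most specific type first, catch-all last (dicts keep insertion order)
errors = {
    KeyError: lambda e: (404, {"error": "not found"}),
    Exception: lambda e: (500, {"error": "unknown"}),
}

def catch_exceptions(func):
    def protector(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as e:
            # first isinstance match wins, so order the table specific-to-generic
            for etype, handler in errors.items():
                if isinstance(e, etype):
                    return handler(e)
    return protector

@catch_exceptions
def lookup(d, k):
    return 200, d[k]
```

Since every entry is checked with `isinstance`, the trailing `Exception` entry guarantees a handler is always found.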
def add_common_args_to_docstring(filterkeys=False,limitkeys=False,delimitkeys=False,amountkeys=False):
@@ -94,6 +143,7 @@ def add_common_args_to_docstring(filterkeys=False,limitkeys=False,delimitkeys=Fa
@api.get("test")
@catch_exceptions
def test_server(key=None):
"""Pings the server. If an API key is supplied, the server will respond with 200
if the key is correct and 403 if it isn't. If no key is supplied, the server will
@@ -119,6 +169,7 @@ def test_server(key=None):
@api.get("serverinfo")
@catch_exceptions
def server_info():
"""Returns basic information about the server.
@@ -141,6 +192,7 @@ def server_info():
@api.get("scrobbles")
@catch_exceptions
@add_common_args_to_docstring(filterkeys=True,limitkeys=True,amountkeys=True)
def get_scrobbles_external(**keys):
"""Returns a list of scrobbles.
@@ -158,11 +210,13 @@ def get_scrobbles_external(**keys):
if k_amount.get('perpage') is not math.inf: result = result[:k_amount.get('perpage')]
return {
"status":"ok",
"list":result
}
@api.get("numscrobbles")
@catch_exceptions
@add_common_args_to_docstring(filterkeys=True,limitkeys=True,amountkeys=True)
def get_scrobbles_num_external(**keys):
"""Returns amount of scrobbles.
@@ -176,12 +230,14 @@ def get_scrobbles_num_external(**keys):
result = database.get_scrobbles_num(**ckeys)
return {
"status":"ok",
"amount":result
}
@api.get("tracks")
@catch_exceptions
@add_common_args_to_docstring(filterkeys=True)
def get_tracks_external(**keys):
"""Returns all tracks (optionally of an artist).
@@ -195,12 +251,14 @@ def get_tracks_external(**keys):
result = database.get_tracks(**ckeys)
return {
"status":"ok",
"list":result
}
@api.get("artists")
@catch_exceptions
@add_common_args_to_docstring()
def get_artists_external():
"""Returns all artists.
@@ -210,6 +268,7 @@ def get_artists_external():
result = database.get_artists()
return {
"status":"ok",
"list":result
}
@@ -218,6 +277,7 @@ def get_artists_external():
@api.get("charts/artists")
@catch_exceptions
@add_common_args_to_docstring(limitkeys=True)
def get_charts_artists_external(**keys):
"""Returns artist charts
@@ -230,12 +290,14 @@ def get_charts_artists_external(**keys):
result = database.get_charts_artists(**ckeys)
return {
"status":"ok",
"list":result
}
@api.get("charts/tracks")
@catch_exceptions
@add_common_args_to_docstring(filterkeys=True,limitkeys=True)
def get_charts_tracks_external(**keys):
"""Returns track charts
@@ -248,6 +310,7 @@ def get_charts_tracks_external(**keys):
result = database.get_charts_tracks(**ckeys)
return {
"status":"ok",
"list":result
}
@@ -255,6 +318,7 @@ def get_charts_tracks_external(**keys):
@api.get("pulse")
@catch_exceptions
@add_common_args_to_docstring(filterkeys=True,limitkeys=True,delimitkeys=True,amountkeys=True)
def get_pulse_external(**keys):
"""Returns amounts of scrobbles in specified time frames
@@ -267,6 +331,7 @@ def get_pulse_external(**keys):
results = database.get_pulse(**ckeys)
return {
"status":"ok",
"list":results
}
@@ -274,6 +339,7 @@ def get_pulse_external(**keys):
@api.get("performance")
@catch_exceptions
@add_common_args_to_docstring(filterkeys=True,limitkeys=True,delimitkeys=True,amountkeys=True)
def get_performance_external(**keys):
"""Returns artist's or track's rank in specified time frames
@@ -286,6 +352,7 @@ def get_performance_external(**keys):
results = database.get_performance(**ckeys)
return {
"status":"ok",
"list":results
}
@@ -293,6 +360,7 @@ def get_performance_external(**keys):
@api.get("top/artists")
@catch_exceptions
@add_common_args_to_docstring(limitkeys=True,delimitkeys=True)
def get_top_artists_external(**keys):
"""Returns respective number 1 artists in specified time frames
@@ -305,6 +373,7 @@ def get_top_artists_external(**keys):
results = database.get_top_artists(**ckeys)
return {
"status":"ok",
"list":results
}
@@ -312,6 +381,7 @@ def get_top_artists_external(**keys):
@api.get("top/tracks")
@catch_exceptions
@add_common_args_to_docstring(limitkeys=True,delimitkeys=True)
def get_top_tracks_external(**keys):
"""Returns respective number 1 tracks in specified time frames
@@ -326,6 +396,7 @@ def get_top_tracks_external(**keys):
results = database.get_top_tracks(**ckeys)
return {
"status":"ok",
"list":results
}
@@ -333,6 +404,7 @@ def get_top_tracks_external(**keys):
@api.get("artistinfo")
@catch_exceptions
@add_common_args_to_docstring(filterkeys=True)
def artist_info_external(**keys):
"""Returns information about an artist
@@ -347,8 +419,9 @@ def artist_info_external(**keys):
@api.get("trackinfo")
@catch_exceptions
@add_common_args_to_docstring(filterkeys=True)
def track_info_external(artist:Multi[str],**keys):
def track_info_external(artist:Multi[str]=[],**keys):
"""Returns information about a track
:return: track (Mapping), scrobbles (Integer), position (Integer), medals (Mapping), certification (String), topweeks (Integer)
@@ -365,6 +438,7 @@ def track_info_external(artist:Multi[str],**keys):
@api.post("newscrobble")
@authenticated_function(alternate=api_key_correct,api=True,pass_auth_result_as='auth_result')
@catch_exceptions
def post_scrobble(
artist:Multi=None,
artists:list=[],
@@ -406,46 +480,67 @@ def post_scrobble(
# for logging purposes, don't pass values that we didn't actually supply
rawscrobble = {k:rawscrobble[k] for k in rawscrobble if rawscrobble[k]}
try:
result = database.incoming_scrobble(
rawscrobble,
client='browser' if auth_result.get('doreah_native_auth_check') else auth_result.get('client'),
api='native/v1',
fix=(nofix is None)
)
responsedict = {
'status': 'success',
'track': {
'artists':result['track']['artists'],
'title':result['track']['title']
}
}
if extra_kwargs:
responsedict['warnings'] = [
{'type':'invalid_keyword_ignored','value':k,
'desc':"This key was not recognized by the server and has been discarded."}
for k in extra_kwargs
]
if artist and artists:
responsedict['warnings'] = [
{'type':'mixed_schema','value':['artist','artists'],
'desc':"These two fields are meant as alternative methods to submit information. Use of both is discouraged, but works at the moment."}
]
return responsedict
except Exception as e:
for etype in errors:
if isinstance(e,etype):
errorhandling = errors[etype](e)
response.status = errorhandling[0]
return errorhandling[1]
result = database.incoming_scrobble(
rawscrobble,
client='browser' if auth_result.get('doreah_native_auth_check') else auth_result.get('client'),
api='native/v1',
fix=(nofix is None)
)
responsedict = {
'status': 'success',
'track': {
'artists':result['track']['artists'],
'title':result['track']['title']
},
'desc':f"Scrobbled {result['track']['title']} by {', '.join(result['track']['artists'])}"
}
if extra_kwargs:
responsedict['warnings'] = [
{'type':'invalid_keyword_ignored','value':k,
'desc':"This key was not recognized by the server and has been discarded."}
for k in extra_kwargs
]
if artist and artists:
responsedict['warnings'] = [
{'type':'mixed_schema','value':['artist','artists'],
'desc':"These two fields are meant as alternative methods to submit information. Use of both is discouraged, but works at the moment."}
]
return responsedict
@api.post("addpicture")
@authenticated_function(alternate=api_key_correct,api=True)
@catch_exceptions
def add_picture(b64,artist:Multi=[],title=None):
"""Uploads a new image for an artist or track.
:param string b64: Base 64 representation of the image
:param string artist: Artist name. Can be supplied multiple times for tracks with multiple artists.
:param string title: Title of the track. Optional.
"""
keys = FormsDict()
for a in artist:
keys.append("artist",a)
if title is not None: keys.append("title",title)
k_filter, _, _, _, _ = uri_to_internal(keys)
if "track" in k_filter: k_filter = k_filter["track"]
url = images.set_image(b64,**k_filter)
return {
'status': 'success',
'url': url
}
@api.post("importrules")
@authenticated_function(api=True)
@catch_exceptions
def import_rulemodule(**keys):
"""Internal Use Only"""
filename = keys.get("filename")
@@ -464,6 +559,7 @@ def import_rulemodule(**keys):
@api.post("rebuild")
@authenticated_function(api=True)
@catch_exceptions
def rebuild(**keys):
"""Internal Use Only"""
log("Database rebuild initiated!")
@@ -480,6 +576,7 @@ def rebuild(**keys):
@api.get("search")
@catch_exceptions
def search(**keys):
"""Internal Use Only"""
query = keys.get("query")
@@ -501,37 +598,27 @@ def search(**keys):
artists_result = []
for a in artists:
result = {
'name': a,
'artist': a,
'link': "/artist?" + compose_querystring(internal_to_uri({"artist": a})),
'image': images.get_artist_image(a)
}
result["image"] = images.get_artist_image(a)
artists_result.append(result)
tracks_result = []
for t in tracks:
result = t
result["link"] = "/track?" + compose_querystring(internal_to_uri({"track":t}))
result["image"] = images.get_track_image(t)
result = {
'track': t,
'link': "/track?" + compose_querystring(internal_to_uri({"track":t})),
'image': images.get_track_image(t)
}
tracks_result.append(result)
return {"artists":artists_result[:max_],"tracks":tracks_result[:max_]}
@api.post("addpicture")
@authenticated_function(api=True)
def add_picture(b64,artist:Multi=[],title=None):
"""Internal Use Only"""
keys = FormsDict()
for a in artist:
keys.append("artist",a)
if title is not None: keys.append("title",title)
k_filter, _, _, _, _ = uri_to_internal(keys)
if "track" in k_filter: k_filter = k_filter["track"]
images.set_image(b64,**k_filter)
@api.post("newrule")
@authenticated_function(api=True)
@catch_exceptions
def newrule(**keys):
"""Internal Use Only"""
pass
@@ -542,18 +629,21 @@ def newrule(**keys):
@api.post("settings")
@authenticated_function(api=True)
@catch_exceptions
def set_settings(**keys):
"""Internal Use Only"""
malojaconfig.update(keys)
@api.post("apikeys")
@authenticated_function(api=True)
@catch_exceptions
def set_apikeys(**keys):
"""Internal Use Only"""
apikeystore.update(keys)
@api.post("import")
@authenticated_function(api=True)
@catch_exceptions
def import_scrobbles(identifier):
"""Internal Use Only"""
from ..thirdparty import import_scrobbles
@@ -561,6 +651,7 @@ def import_scrobbles(identifier):
@api.get("backup")
@authenticated_function(api=True)
@catch_exceptions
def get_backup(**keys):
"""Internal Use Only"""
from ..proccontrol.tasks.backup import backup
@@ -573,6 +664,7 @@ def get_backup(**keys):
@api.get("export")
@authenticated_function(api=True)
@catch_exceptions
def get_export(**keys):
"""Internal Use Only"""
from ..proccontrol.tasks.export import export
@@ -586,6 +678,71 @@ def get_export(**keys):
@api.post("delete_scrobble")
@authenticated_function(api=True)
@catch_exceptions
def delete_scrobble(timestamp):
"""Internal Use Only"""
database.remove_scrobble(timestamp)
result = database.remove_scrobble(timestamp)
return {
"status":"success",
"desc":"Scrobble was deleted!"
}
@api.post("edit_artist")
@authenticated_function(api=True)
@catch_exceptions
def edit_artist(id,name):
"""Internal Use Only"""
result = database.edit_artist(id,name)
return {
"status":"success"
}
@api.post("edit_track")
@authenticated_function(api=True)
@catch_exceptions
def edit_track(id,title):
"""Internal Use Only"""
result = database.edit_track(id,{'title':title})
return {
"status":"success"
}
@api.post("merge_tracks")
@authenticated_function(api=True)
@catch_exceptions
def merge_tracks(target_id,source_ids):
"""Internal Use Only"""
result = database.merge_tracks(target_id,source_ids)
return {
"status":"success"
}
@api.post("merge_artists")
@authenticated_function(api=True)
@catch_exceptions
def merge_artists(target_id,source_ids):
"""Internal Use Only"""
result = database.merge_artists(target_id,source_ids)
return {
"status":"success"
}
@api.post("reparse_scrobble")
@authenticated_function(api=True)
@catch_exceptions
def reparse_scrobble(timestamp):
"""Internal Use Only"""
result = database.reparse_scrobble(timestamp)
if result:
return {
"status":"success",
"desc":"Scrobble was reparsed!",
"scrobble":result
}
else:
return {
"status":"no_operation",
"desc":"The scrobble was not changed."
}


@@ -2,7 +2,7 @@ import re
import os
import csv
from .globalconf import data_dir, malojaconfig
from .pkg_global.conf import data_dir, malojaconfig
# need to do this as a class so it can retain loaded settings from file
# apparently this is not true
@@ -55,7 +55,7 @@ class CleanerAgent:
artists = list(set(artists))
artists.sort()
return (artists,title)
return (artists,title.strip())
def removespecial(self,s):
if isinstance(s,list):
@@ -82,7 +82,7 @@ class CleanerAgent:
def parseArtists(self,a):
if isinstance(a,list):
if isinstance(a,list) or isinstance(a,tuple):
res = [self.parseArtists(art) for art in a]
return [a for group in res for a in group]
@@ -109,9 +109,9 @@ class CleanerAgent:
for d in self.delimiters_feat:
if re.match(r"(.*) [\(\[]" + d + " (.*)[\)\]]",a) is not None:
return self.parseArtists(re.sub(r"(.*) [\(\[]" + d + " (.*)[\)\]]",r"\1",a)) + \
self.parseArtists(re.sub(r"(.*) [\(\[]" + d + " (.*)[\)\]]",r"\2",a))
if re.match(r"(.*) [\(\[]" + d + " (.*)[\)\]]",a,flags=re.IGNORECASE) is not None:
return self.parseArtists(re.sub(r"(.*) [\(\[]" + d + " (.*)[\)\]]",r"\1",a,flags=re.IGNORECASE)) + \
self.parseArtists(re.sub(r"(.*) [\(\[]" + d + " (.*)[\)\]]",r"\2",a,flags=re.IGNORECASE))
@@ -156,25 +156,37 @@ class CleanerAgent:
# t = p(t).strip()
return t
def parseTitleForArtists(self,t):
for d in self.delimiters_feat:
if re.match(r"(.*) [\(\[]" + d + " (.*?)[\)\]]",t) is not None:
(title,artists) = self.parseTitleForArtists(re.sub(r"(.*) [\(\[]" + d + " (.*?)[\)\]]",r"\1",t))
artists += self.parseArtists(re.sub(r"(.*) [\(\[]" + d + " (.*?)[\)\]].*",r"\2",t))
return (title,artists)
if re.match(r"(.*) - " + d + " (.*)",t) is not None:
(title,artists) = self.parseTitleForArtists(re.sub(r"(.*) - " + d + " (.*)",r"\1",t))
artists += self.parseArtists(re.sub(r"(.*) - " + d + " (.*).*",r"\2",t))
return (title,artists)
if re.match(r"(.*) " + d + " (.*)",t) is not None:
(title,artists) = self.parseTitleForArtists(re.sub(r"(.*) " + d + " (.*)",r"\1",t))
artists += self.parseArtists(re.sub(r"(.*) " + d + " (.*).*",r"\2",t))
return (title,artists)
def parseTitleForArtists(self,title):
artists = []
for delimiter in malojaconfig["DELIMITERS_FEAT"]:
for pattern in [
r" [\(\[]" + re.escape(delimiter) + " (.*?)[\)\]]",
r" - " + re.escape(delimiter) + " (.*)",
r" " + re.escape(delimiter) + " (.*)"
]:
matches = re.finditer(pattern,title,flags=re.IGNORECASE)
for match in matches:
title = match.re.sub('',match.string) # Remove matched part
artists += self.parseArtists(match.group(1)) # Parse matched artist string
if malojaconfig["PARSE_REMIX_ARTISTS"]:
for filter in malojaconfig["FILTERS_REMIX"]:
for pattern in [
r" [\(\[](.*)" + re.escape(filter) + "[\)\]]", # match remix in brackets
r" - (.*)" + re.escape(filter) # match remix split with "-"
]:
match = re.search(pattern,title,flags=re.IGNORECASE)
if match:
# title stays the same
artists += self.parseArtists(match.group(1))
for st in self.rules_artistintitle:
if st in t.lower(): artists += self.rules_artistintitle[st].split("␟")
return (t,artists)
if st in title.lower(): artists += self.rules_artistintitle[st].split("␟")
return (title,artists)
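The rewritten `parseTitleForArtists` iterates over configured delimiters, removes each matched "feat."-style segment from the title, and collects the artist strings. A self-contained sketch of the bracketed-delimiter case (the delimiter list is an illustrative stand-in for `malojaconfig["DELIMITERS_FEAT"]`):

```python
import re

DELIMITERS_FEAT = ["feat.", "ft.", "featuring"]  # stand-in for the config value

def parse_title_for_artists(title):
    artists = []
    for delimiter in DELIMITERS_FEAT:
        pattern = r" [\(\[]" + re.escape(delimiter) + r" (.*?)[\)\]]"
        for match in re.finditer(pattern, title, flags=re.IGNORECASE):
            title = match.re.sub('', match.string)  # drop the matched segment
            artists.append(match.group(1))          # keep the artist string
    return title, artists
```

Note that `re.escape` matters here: delimiters like `feat.` contain regex metacharacters, which is exactly what the diff adds over the old hand-built patterns.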


@@ -8,6 +8,7 @@ countas Trouble Maker HyunA
countas S Club 7 Tina Barrett
countas 4Minute HyunA
countas I.O.I Chungha
countas TrySail Sora Amamiya
# Group more famous than single artist
countas RenoakRhythm Approaching Nirvana
countas Shirley Manson Garbage
@@ -18,3 +19,7 @@ countas Airi Suzuki ℃-ute
countas CeeLo Green Gnarls Barkley
countas Amelia Watson Hololive EN
countas Gawr Gura Hololive EN
countas Mori Calliope Hololive EN
countas Ninomae Ina'nis Hololive EN
countas Takanashi Kiara Hololive EN
countas Ceres Fauna Hololive EN


@@ -0,0 +1,20 @@
# NAME: JPop
# DESC: Fixes and romanizes various Japanese tracks and artists
belongtogether Myth & Roid
# Sora-chan
replaceartist Amamiya Sora Sora Amamiya
replacetitle エデンの旅人 Eden no Tabibito
replacetitle 月灯り Tsukiakari
replacetitle 火花 Hibana
replacetitle ロンリーナイト・ディスコティック Lonely Night Discotheque
replacetitle 羽根輪舞 Hane Rinbu
replacetitle メリーゴーランド Merry-go-round
replacetitle フリイジア Fressia
replacetitle 誓い Chikai
# ReoNa
replacetitle ないない nainai

@@ -21,7 +21,7 @@ addartists HyunA Change Jun Hyung
# BLACKPINK
countas Jennie BLACKPINK
countas Rosé BLACKPINK
countas Lisa BLACKPINK
countas Lalisa BLACKPINK
countas Jisoo BLACKPINK
replacetitle AS IF IT'S YOUR LAST As If It's Your Last
replacetitle BOOMBAYAH Boombayah
@@ -200,10 +200,13 @@ countas ACE IZ*ONE
countas Chaewon IZ*ONE
countas Minju IZ*ONE
# ITZY
countas Yeji ITZY
# IVE
countas Wonyoung IVE
countas Yujin IVE
countas Gaeul IVE
# Popular Remixes
artistintitle Areia Remix Areia


@@ -1,5 +1,5 @@
# server
from bottle import request, response, FormsDict, HTTPError
from bottle import request, response, FormsDict
# rest of the project
from ..cleanup import CleanerAgent
@@ -7,12 +7,13 @@ from .. import images
from ..malojatime import register_scrobbletime, time_stamps, ranges, alltime
from ..malojauri import uri_to_internal, internal_to_uri, compose_querystring
from ..thirdparty import proxy_scrobble_all
from ..globalconf import data_dir, malojaconfig
from ..pkg_global.conf import data_dir, malojaconfig
from ..apis import apikeystore
#db
from . import sqldb
from . import cached
from . import dbcache
from . import exceptions
# doreah toolkit
from doreah.logging import log
@@ -42,23 +43,12 @@ dbstatus = {
"rebuildinprogress":False,
"complete":False # information is complete
}
class DatabaseNotBuilt(HTTPError):
def __init__(self):
super().__init__(
status=503,
body="The Maloja Database is being upgraded to Version 3. This could take quite a long time! (~ 2-5 minutes per 10 000 scrobbles)",
headers={"Retry-After":120}
)
class MissingScrobbleParameters(Exception):
def __init__(self,params=[]):
self.params = params
def waitfordb(func):
def newfunc(*args,**kwargs):
if not dbstatus['healthy']: raise DatabaseNotBuilt()
if not dbstatus['healthy']: raise exceptions.DatabaseNotBuilt()
return func(*args,**kwargs)
return newfunc
@@ -97,11 +87,45 @@ def incoming_scrobble(rawscrobble,fix=True,client=None,api=None,dbconn=None):
missing.append(necessary_arg)
if len(missing) > 0:
log(f"Invalid Scrobble [Client: {client} | API: {api}]: {rawscrobble} ",color='red')
raise MissingScrobbleParameters(missing)
raise exceptions.MissingScrobbleParameters(missing)
log(f"Incoming scrobble [Client: {client} | API: {api}]: {rawscrobble}")
scrobbledict = rawscrobble_to_scrobbledict(rawscrobble, fix, client)
sqldb.add_scrobble(scrobbledict,dbconn=dbconn)
proxy_scrobble_all(scrobbledict['track']['artists'],scrobbledict['track']['title'],scrobbledict['time'])
dbcache.invalidate_caches(scrobbledict['time'])
#return {"status":"success","scrobble":scrobbledict}
return scrobbledict
@waitfordb
def reparse_scrobble(timestamp):
log(f"Reparsing Scrobble {timestamp}")
scrobble = sqldb.get_scrobble(timestamp=timestamp, include_internal=True)
if not scrobble or not scrobble['rawscrobble']:
return False
newscrobble = rawscrobble_to_scrobbledict(scrobble['rawscrobble'])
track_id = sqldb.get_track_id(newscrobble['track'])
# check if id changed
if sqldb.get_track_id(scrobble['track']) != track_id:
sqldb.edit_scrobble(timestamp, {'track':newscrobble['track']})
dbcache.invalidate_entity_cache()
dbcache.invalidate_caches()
return sqldb.get_scrobble(timestamp=timestamp)
return False
def rawscrobble_to_scrobbledict(rawscrobble, fix=True, client=None):
# raw scrobble to processed info
scrobbleinfo = {**rawscrobble}
if fix:
@@ -124,31 +148,63 @@ def incoming_scrobble(rawscrobble,fix=True,client=None,api=None,dbconn=None):
"origin":f"client:{client}" if client else "generic",
"extra":{
k:scrobbleinfo[k] for k in scrobbleinfo if k not in
['scrobble_time','track_artists','track_title','track_length','scrobble_duration','album_name','album_artists']
['scrobble_time','track_artists','track_title','track_length','scrobble_duration']#,'album_name','album_artists']
},
"rawscrobble":rawscrobble
}
sqldb.add_scrobble(scrobbledict,dbconn=dbconn)
proxy_scrobble_all(scrobbledict['track']['artists'],scrobbledict['track']['title'],scrobbledict['time'])
dbcache.invalidate_caches(scrobbledict['time'])
#return {"status":"success","scrobble":scrobbledict}
return scrobbledict
@waitfordb
def remove_scrobble(timestamp):
log(f"Deleting Scrobble {timestamp}")
result = sqldb.delete_scrobble(timestamp)
dbcache.invalidate_caches(timestamp)
return result
@waitfordb
def edit_artist(id,artistinfo):
artist = sqldb.get_artist(id)
log(f"Renaming {artist} to {artistinfo}")
result = sqldb.edit_artist(id,artistinfo)
dbcache.invalidate_entity_cache()
dbcache.invalidate_caches()
return result
@waitfordb
def edit_track(id,trackinfo):
track = sqldb.get_track(id)
log(f"Renaming {track['title']} to {trackinfo['title']}")
result = sqldb.edit_track(id,trackinfo)
dbcache.invalidate_entity_cache()
dbcache.invalidate_caches()
return result
@waitfordb
def merge_artists(target_id,source_ids):
sources = [sqldb.get_artist(id) for id in source_ids]
target = sqldb.get_artist(target_id)
log(f"Merging {sources} into {target}")
result = sqldb.merge_artists(target_id,source_ids)
dbcache.invalidate_entity_cache()
dbcache.invalidate_caches()
return result
@waitfordb
def merge_tracks(target_id,source_ids):
sources = [sqldb.get_track(id) for id in source_ids]
target = sqldb.get_track(target_id)
log(f"Merging {sources} into {target}")
result = sqldb.merge_tracks(target_id,source_ids)
dbcache.invalidate_entity_cache()
dbcache.invalidate_caches()
return result
@@ -165,6 +221,7 @@ def get_scrobbles(dbconn=None,**keys):
#return result[keys['page']*keys['perpage']:(keys['page']+1)*keys['perpage']]
return list(reversed(result))
@waitfordb
def get_scrobbles_num(dbconn=None,**keys):
(since,to) = keys.get('timerange').timestamps()
@@ -242,6 +299,8 @@ def get_performance(dbconn=None,**keys):
if c["artist"] == artist:
rank = c["rank"]
break
else:
raise exceptions.MissingEntityParameter()
results.append({"range":rng,"rank":rank})
return results
@@ -256,7 +315,7 @@ def get_top_artists(dbconn=None,**keys):
try:
res = get_charts_artists(timerange=rng,dbconn=dbconn)[0]
results.append({"range":rng,"artist":res["artist"],"scrobbles":res["scrobbles"]})
except:
except Exception:
results.append({"range":rng,"artist":None,"scrobbles":0})
return results
@@ -272,7 +331,7 @@ def get_top_tracks(dbconn=None,**keys):
try:
res = get_charts_tracks(timerange=rng,dbconn=dbconn)[0]
results.append({"range":rng,"track":res["track"],"scrobbles":res["scrobbles"]})
except:
except Exception:
results.append({"range":rng,"track":None,"scrobbles":0})
return results
@@ -281,8 +340,10 @@ def get_top_tracks(dbconn=None,**keys):
def artist_info(dbconn=None,**keys):
artist = keys.get('artist')
if artist is None: raise exceptions.MissingEntityParameter()
artist = sqldb.get_artist(sqldb.get_artist_id(artist,dbconn=dbconn),dbconn=dbconn)
artist_id = sqldb.get_artist_id(artist,dbconn=dbconn)
artist = sqldb.get_artist(artist_id,dbconn=dbconn)
alltimecharts = get_charts_artists(timerange=alltime(),dbconn=dbconn)
scrobbles = get_scrobbles_num(artist=artist,timerange=alltime(),dbconn=dbconn)
#we cant take the scrobble number from the charts because that includes all countas scrobbles
@@ -296,19 +357,26 @@ def artist_info(dbconn=None,**keys):
"position":position,
"associated":others,
"medals":{
"gold": [year for year in cached.medals_artists if artist in cached.medals_artists[year]['gold']],
"silver": [year for year in cached.medals_artists if artist in cached.medals_artists[year]['silver']],
"bronze": [year for year in cached.medals_artists if artist in cached.medals_artists[year]['bronze']],
"gold": [year for year in cached.medals_artists if artist_id in cached.medals_artists[year]['gold']],
"silver": [year for year in cached.medals_artists if artist_id in cached.medals_artists[year]['silver']],
"bronze": [year for year in cached.medals_artists if artist_id in cached.medals_artists[year]['bronze']],
},
"topweeks":len([e for e in cached.weekly_topartists if e == artist])
"topweeks":len([e for e in cached.weekly_topartists if e == artist_id]),
"id":artist_id
}
except:
except Exception:
# if the artist isnt in the charts, they are not being credited and we
# need to show information about the credited one
replaceartist = sqldb.get_credited_artists(artist)[0]
c = [e for e in alltimecharts if e["artist"] == replaceartist][0]
position = c["rank"]
return {"artist":artist,"replace":replaceartist,"scrobbles":scrobbles,"position":position}
return {
"artist":artist,
"replace":replaceartist,
"scrobbles":scrobbles,
"position":position,
"id":artist_id
}
@@ -317,8 +385,10 @@ def artist_info(dbconn=None,**keys):
def track_info(dbconn=None,**keys):
track = keys.get('track')
if track is None: raise exceptions.MissingEntityParameter()
track = sqldb.get_track(sqldb.get_track_id(track,dbconn=dbconn),dbconn=dbconn)
track_id = sqldb.get_track_id(track,dbconn=dbconn)
track = sqldb.get_track(track_id,dbconn=dbconn)
alltimecharts = get_charts_tracks(timerange=alltime(),dbconn=dbconn)
#scrobbles = get_scrobbles_num(track=track,timerange=alltime())
@@ -337,12 +407,13 @@
"scrobbles":scrobbles,
"position":position,
"medals":{
"gold": [year for year in cached.medals_tracks if track in cached.medals_tracks[year]['gold']],
"silver": [year for year in cached.medals_tracks if track in cached.medals_tracks[year]['silver']],
"bronze": [year for year in cached.medals_tracks if track in cached.medals_tracks[year]['bronze']],
"gold": [year for year in cached.medals_tracks if track_id in cached.medals_tracks[year]['gold']],
"silver": [year for year in cached.medals_tracks if track_id in cached.medals_tracks[year]['silver']],
"bronze": [year for year in cached.medals_tracks if track_id in cached.medals_tracks[year]['bronze']],
},
"certification":cert,
"topweeks":len([e for e in cached.weekly_toptracks if e == track])
"topweeks":len([e for e in cached.weekly_toptracks if e == track_id]),
"id":track_id
}
@@ -370,7 +441,7 @@ def get_predefined_rulesets(dbconn=None):
else: name = rawf.split("_")[1]
desc = line2.replace("# DESC: ","") if "# DESC: " in line2 else ""
author = rawf.split("_")[0]
except:
except Exception:
continue
ruleset = {"file":rawf}


@@ -8,7 +8,7 @@ import csv
import os
from . import sqldb
from ..globalconf import data_dir
from ..pkg_global.conf import data_dir
def load_associated_rules():


@@ -3,6 +3,7 @@
from doreah.regular import runyearly, rundaily
from .. import database
from . import sqldb
from .. import malojatime as mjt
@@ -24,27 +25,29 @@ def update_medals():
medals_artists.clear()
medals_tracks.clear()
for year in mjt.ranges(step="year"):
if year == mjt.thisyear(): break
with sqldb.engine.begin() as conn:
for year in mjt.ranges(step="year"):
if year == mjt.thisyear(): break
charts_artists = database.get_charts_artists(timerange=year)
charts_tracks = database.get_charts_tracks(timerange=year)
charts_artists = sqldb.count_scrobbles_by_artist(since=year.first_stamp(),to=year.last_stamp(),resolve_ids=False,dbconn=conn)
charts_tracks = sqldb.count_scrobbles_by_track(since=year.first_stamp(),to=year.last_stamp(),resolve_ids=False,dbconn=conn)
entry_artists = {'gold':[],'silver':[],'bronze':[]}
entry_tracks = {'gold':[],'silver':[],'bronze':[]}
medals_artists[year.desc()] = entry_artists
medals_tracks[year.desc()] = entry_tracks
entry_artists = {'gold':[],'silver':[],'bronze':[]}
entry_tracks = {'gold':[],'silver':[],'bronze':[]}
medals_artists[year.desc()] = entry_artists
medals_tracks[year.desc()] = entry_tracks
for entry in charts_artists:
if entry['rank'] == 1: entry_artists['gold'].append(entry['artist_id'])
elif entry['rank'] == 2: entry_artists['silver'].append(entry['artist_id'])
elif entry['rank'] == 3: entry_artists['bronze'].append(entry['artist_id'])
else: break
for entry in charts_tracks:
if entry['rank'] == 1: entry_tracks['gold'].append(entry['track_id'])
elif entry['rank'] == 2: entry_tracks['silver'].append(entry['track_id'])
elif entry['rank'] == 3: entry_tracks['bronze'].append(entry['track_id'])
else: break
for entry in charts_artists:
if entry['rank'] == 1: entry_artists['gold'].append(entry['artist'])
elif entry['rank'] == 2: entry_artists['silver'].append(entry['artist'])
elif entry['rank'] == 3: entry_artists['bronze'].append(entry['artist'])
else: break
for entry in charts_tracks:
if entry['rank'] == 1: entry_tracks['gold'].append(entry['track'])
elif entry['rank'] == 2: entry_tracks['silver'].append(entry['track'])
elif entry['rank'] == 3: entry_tracks['bronze'].append(entry['track'])
else: break
@@ -55,15 +58,17 @@ def update_weekly():
weekly_topartists.clear()
weekly_toptracks.clear()
for week in mjt.ranges(step="week"):
if week == mjt.thisweek(): break
with sqldb.engine.begin() as conn:
for week in mjt.ranges(step="week"):
if week == mjt.thisweek(): break
charts_artists = database.get_charts_artists(timerange=week)
charts_tracks = database.get_charts_tracks(timerange=week)
for entry in charts_artists:
if entry['rank'] == 1: weekly_topartists.append(entry['artist'])
else: break
for entry in charts_tracks:
if entry['rank'] == 1: weekly_toptracks.append(entry['track'])
else: break
charts_artists = sqldb.count_scrobbles_by_artist(since=week.first_stamp(),to=week.last_stamp(),resolve_ids=False,dbconn=conn)
charts_tracks = sqldb.count_scrobbles_by_track(since=week.first_stamp(),to=week.last_stamp(),resolve_ids=False,dbconn=conn)
for entry in charts_artists:
if entry['rank'] == 1: weekly_topartists.append(entry['artist_id'])
else: break
for entry in charts_tracks:
if entry['rank'] == 1: weekly_toptracks.append(entry['track_id'])
else: break
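The chart rows arrive already ordered by rank, so the loops above can stop at the first entry ranked below bronze, and ties share a medal because `rank` repeats. The medal-assignment step in isolation (field names follow the diff; the sample data in the usage below is made up):

```python
def assign_medals(chart):
    # chart: rank-ordered rows, e.g. from count_scrobbles_by_artist(resolve_ids=False)
    medals = {'gold': [], 'silver': [], 'bronze': []}
    for entry in chart:
        if entry['rank'] == 1: medals['gold'].append(entry['artist_id'])
        elif entry['rank'] == 2: medals['silver'].append(entry['artist_id'])
        elif entry['rank'] == 3: medals['bronze'].append(entry['artist_id'])
        else: break  # rows are ordered, nothing below bronze matters
    return medals
```

With a tie at rank 1, both artists get gold and rank 2 is skipped entirely, which is the same behavior the yearly medal cache ends up with.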


@@ -5,101 +5,86 @@
import lru
import psutil
import json
import sys
from doreah.regular import runhourly
from doreah.logging import log
from ..globalconf import malojaconfig
HIGH_NUMBER = 1000000
CACHE_SIZE = 10000
ENTITY_CACHE_SIZE = 1000000
CACHE_ADJUST_STEP = 100
cache = lru.LRU(CACHE_SIZE)
entitycache = lru.LRU(ENTITY_CACHE_SIZE)
hits, misses = 0, 0
from ..pkg_global.conf import malojaconfig
@runhourly
def maintenance():
if malojaconfig['USE_GLOBAL_CACHE']:
if malojaconfig['USE_GLOBAL_CACHE']:
cache = lru.LRU(10000)
entitycache = lru.LRU(100000)
@runhourly
def maintenance():
print_stats()
trim_cache()
def print_stats():
log(f"Cache Size: {len(cache)} [{len(entitycache)} E], System RAM Utilization: {psutil.virtual_memory().percent}%, Cache Hits: {hits}/{hits+misses}")
#print("Full rundown:")
#import sys
#for k in cache.keys():
# print(f"\t{k}\t{sys.getsizeof(cache[k])}")
def print_stats():
for name,c in (('Cache',cache),('Entity Cache',entitycache)):
hits, misses = c.get_stats()
log(f"{name}: Size: {len(c)} | Hits: {hits}/{hits+misses} | Estimated Memory: {human_readable_size(c)}")
log(f"System RAM Utilization: {psutil.virtual_memory().percent}%")
def cached_wrapper(inner_func):
def cached_wrapper(inner_func):
if not malojaconfig['USE_GLOBAL_CACHE']: return inner_func
def outer_func(*args,**kwargs):
if 'dbconn' in kwargs:
conn = kwargs.pop('dbconn')
else:
conn = None
global hits, misses
key = (serialize(args),serialize(kwargs), inner_func, kwargs.get("since"), kwargs.get("to"))
def outer_func(*args,**kwargs):
if key in cache:
hits += 1
return cache.get(key)
if 'dbconn' in kwargs:
conn = kwargs.pop('dbconn')
else:
conn = None
global hits, misses
key = (serialize(args),serialize(kwargs), inner_func, kwargs.get("since"), kwargs.get("to"))
try:
return cache[key]
except KeyError:
result = inner_func(*args,**kwargs,dbconn=conn)
cache[key] = result
return result
return outer_func
# cache for functions that call with a whole list of entity ids
# we don't want a new cache entry for every single combination, but keep a common
# cache that's aware of what we're calling
def cached_wrapper_individual(inner_func):
def outer_func(set_arg,**kwargs):
if 'dbconn' in kwargs:
conn = kwargs.pop('dbconn')
else:
conn = None
result = {}
for id in set_arg:
try:
result[id] = entitycache[(inner_func,id)]
except KeyError:
pass
remaining = inner_func(set(e for e in set_arg if e not in result),dbconn=conn)
for id in remaining:
entitycache[(inner_func,id)] = remaining[id]
result[id] = remaining[id]
else:
misses += 1
result = inner_func(*args,**kwargs,dbconn=conn)
cache[key] = result
return result
return outer_func
return outer_func
# cache for functions that call with a whole list of entity ids
# we don't want a new cache entry for every single combination, but keep a common
# cache that's aware of what we're calling
def cached_wrapper_individual(inner_func):
if not malojaconfig['USE_GLOBAL_CACHE']: return inner_func
def outer_func(set_arg,**kwargs):
if 'dbconn' in kwargs:
conn = kwargs.pop('dbconn')
else:
conn = None
#global hits, misses
result = {}
for id in set_arg:
if (inner_func,id) in entitycache:
result[id] = entitycache[(inner_func,id)]
#hits += 1
else:
pass
#misses += 1
remaining = inner_func(set(e for e in set_arg if e not in result),dbconn=conn)
for id in remaining:
entitycache[(inner_func,id)] = remaining[id]
result[id] = remaining[id]
return result
return outer_func
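The comment above states the motivation: a function queried with arbitrary sets of entity ids should share one per-id cache rather than getting a cache entry per set combination. A stripped-down sketch with a plain dict standing in for the bounded `lru.LRU` cache (the decorated function is illustrative):

```python
entitycache = {}  # stand-in for the bounded lru.LRU cache

def cached_wrapper_individual(inner_func):
    def outer_func(set_arg):
        result = {}
        for eid in set_arg:
            if (inner_func, eid) in entitycache:   # per-id cache hit
                result[eid] = entitycache[(inner_func, eid)]
        # only query the ids we have never seen before
        remaining = inner_func({e for e in set_arg if e not in result})
        for eid in remaining:
            entitycache[(inner_func, eid)] = remaining[eid]
            result[eid] = remaining[eid]
        return result
    return outer_func

queried = []

@cached_wrapper_individual
def fetch_artists(ids):
    queried.append(set(ids))            # record what actually hits the "DB"
    return {i: f"artist {i}" for i in ids}
```

After `fetch_artists({1, 2})`, a later `fetch_artists({2, 3})` only queries id 3; id 2 comes from the shared per-entity cache.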
def invalidate_caches(scrobbletime):
if malojaconfig['USE_GLOBAL_CACHE']:
def invalidate_caches(scrobbletime=None):
cleared, kept = 0, 0
for k in cache.keys():
# VERY BIG TODO: differentiate between None as in 'unlimited timerange' and None as in 'time doesnt matter here'!
if (k[3] is None or scrobbletime >= k[3]) and (k[4] is None or scrobbletime <= k[4]):
if scrobbletime is None or (k[3] is None or scrobbletime >= k[3]) and (k[4] is None or scrobbletime <= k[4]):
cleared += 1
del cache[k]
else:
@@ -107,37 +92,85 @@ def invalidate_caches(scrobbletime):
log(f"Invalidated {cleared} of {cleared+kept} DB cache entries")
def invalidate_entity_cache():
entitycache.clear()
def invalidate_entity_cache():
entitycache.clear()
def trim_cache():
ramprct = psutil.virtual_memory().percent
if ramprct > malojaconfig["DB_MAX_MEMORY"]:
log(f"{ramprct}% RAM usage, clearing cache and adjusting size!")
#ratio = 0.6
#targetsize = max(int(len(cache) * ratio),50)
#log(f"Reducing to {targetsize} entries")
#cache.set_size(targetsize)
#cache.set_size(HIGH_NUMBER)
cache.clear()
if cache.get_size() > CACHE_ADJUST_STEP:
cache.set_size(cache.get_size() - CACHE_ADJUST_STEP)
def trim_cache():
ramprct = psutil.virtual_memory().percent
if ramprct > malojaconfig["DB_MAX_MEMORY"]:
log(f"{ramprct}% RAM usage, clearing cache!")
for c in (cache,entitycache):
c.clear()
#ratio = 0.6
#targetsize = max(int(len(cache) * ratio),50)
#log(f"Reducing to {targetsize} entries")
#cache.set_size(targetsize)
#cache.set_size(HIGH_NUMBER)
#if cache.get_size() > CACHE_ADJUST_STEP:
# cache.set_size(cache.get_size() - CACHE_ADJUST_STEP)
#log(f"New RAM usage: {psutil.virtual_memory().percent}%")
print_stats()
#log(f"New RAM usage: {psutil.virtual_memory().percent}%")
print_stats()
else:
def cached_wrapper(func):
return func
def cached_wrapper_individual(func):
return func
def invalidate_caches(scrobbletime=None):
return None
def invalidate_entity_cache():
return None
def serialize(obj):
try:
return serialize(obj.hashable())
except:
except Exception:
try:
return json.dumps(obj)
except:
except Exception:
if isinstance(obj, (list, tuple, set)):
return "[" + ",".join(serialize(o) for o in obj) + "]"
elif isinstance(obj,dict):
return "{" + ",".join(serialize(o) + ":" + serialize(obj[o]) for o in obj) + "}"
return json.dumps(obj.hashable())
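`serialize` exists because cache keys must be strings even for objects `json.dumps` cannot handle; entity objects are expected to expose a plain `hashable()` view. A sketch with a hypothetical `Track` class (only the `serialize` logic mirrors the diff; the class is illustrative):

```python
import json

def serialize(obj):
    try:
        return serialize(obj.hashable())   # entity objects expose a plain view
    except Exception:
        try:
            return json.dumps(obj)
        except Exception:
            if isinstance(obj, (list, tuple, set)):
                return "[" + ",".join(serialize(o) for o in obj) + "]"
            elif isinstance(obj, dict):
                return "{" + ",".join(serialize(k) + ":" + serialize(obj[k]) for k in obj) + "}"
            return json.dumps(obj.hashable())

class Track:  # hypothetical entity carrying a hashable() view
    def __init__(self, title, artists):
        self.title, self.artists = title, artists
    def hashable(self):
        return {"title": self.title, "artists": self.artists}
```

A `Track` instance serializes via its `hashable()` dict, while plain JSON-able values pass straight through `json.dumps`.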
def get_size_of(obj,counted=None):
if counted is None:
counted = set()
if id(obj) in counted: return 0
size = sys.getsizeof(obj)
counted.add(id(obj))
try:
for k,v in obj.items():
size += get_size_of(v,counted=counted)
except:
try:
for i in obj:
size += get_size_of(i,counted=counted)
except:
pass
return size
def human_readable_size(obj):
units = ['','Ki','Mi','Gi','Ti','Pi']
magnitude = 0
bytes = get_size_of(obj)
while bytes > 1024 and len(units) > magnitude+1:
bytes = bytes / 1024
magnitude += 1
if magnitude > 2:
return f"{bytes:.2f} {units[magnitude]}B"
else:
return f"{bytes:.0f} {units[magnitude]}B"
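The unit conversion in `human_readable_size` can be isolated from the (interpreter-dependent) `sys.getsizeof` walk; given a raw byte count, the logic above reduces to:

```python
def format_bytes(n):
    units = ['', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi']
    magnitude = 0
    while n > 1024 and magnitude + 1 < len(units):
        n = n / 1024
        magnitude += 1
    # show decimals only once the units get large enough for them to matter
    if magnitude > 2:
        return f"{n:.2f} {units[magnitude]}B"
    return f"{n:.0f} {units[magnitude]}B"
```

The `len(units)` guard keeps the loop from running past the largest defined unit, mirroring the `len(units) > magnitude+1` condition in the diff.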


@@ -0,0 +1,29 @@
from bottle import HTTPError
class EntityExists(Exception):
def __init__(self,entitydict):
self.entitydict = entitydict
class TrackExists(EntityExists):
pass
class ArtistExists(EntityExists):
pass
class DatabaseNotBuilt(HTTPError):
def __init__(self):
super().__init__(
status=503,
body="The Maloja Database is being upgraded to Version 3. This could take quite a long time! (~ 2-5 minutes per 10 000 scrobbles)",
headers={"Retry-After":120}
)
class MissingScrobbleParameters(Exception):
def __init__(self,params=[]):
self.params = params
class MissingEntityParameter(Exception):
pass


@@ -3,7 +3,7 @@ from . sqldb import engine
from .dbcache import serialize
from ..globalconf import malojaconfig
from ..pkg_global.conf import malojaconfig
from doreah.logging import log
@@ -23,7 +23,8 @@ class JinjaDBConnection:
return self
def __exit__(self, exc_type, exc_value, exc_traceback):
self.conn.close()
log(f"Generated page with {self.hits}/{self.hits+self.misses} local Cache hits",module="debug_performance")
if malojaconfig['USE_REQUEST_CACHE']:
log(f"Generated page with {self.hits}/{self.hits+self.misses} local Cache hits",module="debug_performance")
del self.cache
def __getattr__(self,name):
originalmethod = getattr(database,name)


@@ -5,8 +5,9 @@ import math
from datetime import datetime
from threading import Lock
from ..globalconf import data_dir
from ..pkg_global.conf import data_dir
from .dbcache import cached_wrapper, cached_wrapper_individual
from . import exceptions as exc
from doreah.logging import log
from doreah.regular import runhourly, runmonthly
@@ -114,8 +115,11 @@ def connection_provider(func):
return func(*args,**kwargs)
else:
with engine.connect() as connection:
kwargs['dbconn'] = connection
return func(*args,**kwargs)
with connection.begin():
kwargs['dbconn'] = connection
return func(*args,**kwargs)
wrapper.__innerfunc__ = func
return wrapper
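The change to `connection_provider` wraps the injected connection in `connection.begin()`, so every decorated function runs inside a transaction unless the caller supplied its own connection. The same pattern sketched with stdlib `sqlite3` standing in for the SQLAlchemy engine (table and function names are illustrative):

```python
import sqlite3

def connection_provider(func):
    def wrapper(*args, **kwargs):
        if kwargs.get('dbconn') is not None:
            return func(*args, **kwargs)       # caller manages the connection
        conn = sqlite3.connect(":memory:")
        try:
            with conn:                          # transaction: commit or roll back
                kwargs['dbconn'] = conn
                return func(*args, **kwargs)
        finally:
            conn.close()
    wrapper.__innerfunc__ = func                # keep the raw function reachable
    return wrapper

@connection_provider
def add_and_count(dbconn=None):
    dbconn.execute("CREATE TABLE IF NOT EXISTS scrobbles (timestamp INTEGER)")
    dbconn.execute("INSERT INTO scrobbles VALUES (1650000000)")
    return dbconn.execute("SELECT count(*) FROM scrobbles").fetchone()[0]
```

Exposing `__innerfunc__` matches the diff's trick of keeping the undecorated function reachable for callers that already hold a connection.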
##### DB <-> Dict translations
@@ -207,21 +211,22 @@ def artist_db_to_dict(row,dbconn=None):
### DICT -> DB
# These should return None when no data is in the dict so they can be used for update statements
def scrobble_dict_to_db(info,dbconn=None):
return {
"timestamp":info['time'],
"origin":info['origin'],
"duration":info['duration'],
"track_id":get_track_id(info['track'],dbconn=dbconn),
"extra":json.dumps(info.get('extra',{})),
"rawscrobble":json.dumps(info.get('rawscrobble',{}))
"timestamp":info.get('time'),
"origin":info.get('origin'),
"duration":info.get('duration'),
"track_id":get_track_id(info.get('track'),dbconn=dbconn),
"extra":json.dumps(info.get('extra')) if info.get('extra') else None,
"rawscrobble":json.dumps(info.get('rawscrobble')) if info.get('rawscrobble') else None
}
def track_dict_to_db(info,dbconn=None):
return {
"title":info['title'],
"title_normalized":normalize_name(info['title']),
"title":info.get('title'),
"title_normalized":normalize_name(info.get('title','')) or None,
"length":info.get('length')
}
@@ -275,13 +280,16 @@ def delete_scrobble(scrobble_id,dbconn=None):
DB['scrobbles'].c.timestamp == scrobble_id
)
dbconn.execute(op)
result = dbconn.execute(op)
return True
### these will 'get' the ID of an entity, creating it if necessary
@cached_wrapper
@connection_provider
def get_track_id(trackdict,dbconn=None):
def get_track_id(trackdict,create_new=True,dbconn=None):
ntitle = normalize_name(trackdict['title'])
artist_ids = [get_artist_id(a,dbconn=dbconn) for a in trackdict['artists']]
artist_ids = list(set(artist_ids))
@@ -290,7 +298,7 @@
op = DB['tracks'].select(
DB['tracks'].c.id
# DB['tracks'].c.id
).where(
DB['tracks'].c.title_normalized==ntitle
)
@@ -300,7 +308,7 @@
foundtrackartists = []
op = DB['trackartists'].select(
DB['trackartists'].c.artist_id
# DB['trackartists'].c.artist_id
).where(
DB['trackartists'].c.track_id==row[0]
)
@@ -311,6 +319,8 @@
#print("ID for",trackdict['title'],"was",row[0])
return row.id
if not create_new: return None
op = DB['tracks'].insert().values(
**track_dict_to_db(trackdict,dbconn=dbconn)
@@ -334,7 +344,7 @@ def get_artist_id(artistname,create_new=True,dbconn=None):
#print("looking for",nname)
op = DB['artists'].select(
DB['artists'].c.id
# DB['artists'].c.id
).where(
DB['artists'].c.name_normalized==nname
)
@@ -354,6 +364,137 @@
return result.inserted_primary_key[0]
### Edit existing
@connection_provider
def edit_scrobble(scrobble_id,scrobbleupdatedict,dbconn=None):
dbentry = scrobble_dict_to_db(scrobbleupdatedict,dbconn=dbconn)
dbentry = {k:v for k,v in dbentry.items() if v}
print("Updating scrobble",dbentry)
with SCROBBLE_LOCK:
op = DB['scrobbles'].update().where(
DB['scrobbles'].c.timestamp == scrobble_id
).values(
**dbentry
)
dbconn.execute(op)
@connection_provider
def edit_artist(id,artistupdatedict,dbconn=None):
artist = get_artist(id)
changedartist = artistupdatedict # well
dbentry = artist_dict_to_db(artistupdatedict,dbconn=dbconn)
dbentry = {k:v for k,v in dbentry.items() if v}
existing_artist_id = get_artist_id(changedartist,create_new=False,dbconn=dbconn)
if existing_artist_id not in (None,id):
raise exc.ArtistExists(changedartist)
op = DB['artists'].update().where(
DB['artists'].c.id==id
).values(
**dbentry
)
result = dbconn.execute(op)
return True
@connection_provider
def edit_track(id,trackupdatedict,dbconn=None):
track = get_track(id,dbconn=dbconn)
changedtrack = {**track,**trackupdatedict}
dbentry = track_dict_to_db(trackupdatedict,dbconn=dbconn)
dbentry = {k:v for k,v in dbentry.items() if v}
existing_track_id = get_track_id(changedtrack,create_new=False,dbconn=dbconn)
if existing_track_id not in (None,id):
raise exc.TrackExists(changedtrack)
op = DB['tracks'].update().where(
DB['tracks'].c.id==id
).values(
**dbentry
)
result = dbconn.execute(op)
return True
### Merge
@connection_provider
def merge_tracks(target_id,source_ids,dbconn=None):
op = DB['scrobbles'].update().where(
DB['scrobbles'].c.track_id.in_(source_ids)
).values(
track_id=target_id
)
result = dbconn.execute(op)
clean_db(dbconn=dbconn)
return True
@connection_provider
def merge_artists(target_id,source_ids,dbconn=None):
# some tracks could already have multiple of the to be merged artists
# find literally all tracksartist entries that have any of the artists involved
op = DB['trackartists'].select().where(
DB['trackartists'].c.artist_id.in_(source_ids + [target_id])
)
result = dbconn.execute(op)
track_ids = set(row.track_id for row in result)
# now just delete them all lmao
op = DB['trackartists'].delete().where(
#DB['trackartists'].c.track_id.in_(track_ids),
DB['trackartists'].c.artist_id.in_(source_ids + [target_id]),
)
result = dbconn.execute(op)
# now add back the real new artist
op = DB['trackartists'].insert().values([
{'track_id':track_id,'artist_id':target_id}
for track_id in track_ids
])
result = dbconn.execute(op)
# tracks_artists = {}
# for row in result:
# tracks_artists.setdefault(row.track_id,[]).append(row.artist_id)
#
# multiple = {k:v for k,v in tracks_artists.items() if len(v) > 1}
#
# print([(get_track(k),[get_artist(a) for a in v]) for k,v in multiple.items()])
#
# op = DB['trackartists'].update().where(
# DB['trackartists'].c.artist_id.in_(source_ids)
# ).values(
# artist_id=target_id
# )
# result = dbconn.execute(op)
# this could have created duplicate tracks
merge_duplicate_tracks(artist_id=target_id,dbconn=dbconn)
clean_db(dbconn=dbconn)
return True
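The delete-then-reinsert dance above exists because a track can already credit both a source artist and the target; a plain `UPDATE` on `trackartists` would then try to create duplicate `(track_id, artist_id)` rows. The underlying set arithmetic, sketched without the database (the pair data in the usage below is made up):

```python
def merge_artist_links(links, target_id, source_ids):
    # links: set of (track_id, artist_id) pairs from the trackartists table
    involved = set(source_ids) | {target_id}
    track_ids = {t for (t, a) in links if a in involved}
    kept = {(t, a) for (t, a) in links if a not in involved}
    # every affected track ends up with exactly one link to the target
    return kept | {(t, target_id) for t in track_ids}
```

If track 1 credits both artists 10 and 11, merging 11 into 10 collapses the two links into one instead of producing a duplicate row.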
@@ -486,7 +627,7 @@ def get_tracks(dbconn=None):
@cached_wrapper
@connection_provider
def count_scrobbles_by_artist(since,to,dbconn=None):
def count_scrobbles_by_artist(since,to,resolve_ids=True,dbconn=None):
jointable = sql.join(
DB['scrobbles'],
DB['trackartists'],
@@ -514,16 +655,18 @@
).order_by(sql.desc('count'))
result = dbconn.execute(op).all()
counts = [row.count for row in result]
artists = get_artists_map([row.artist_id for row in result],dbconn=dbconn)
result = [{'scrobbles':row.count,'artist':artists[row.artist_id]} for row in result]
if resolve_ids:
counts = [row.count for row in result]
artists = get_artists_map([row.artist_id for row in result],dbconn=dbconn)
result = [{'scrobbles':row.count,'artist':artists[row.artist_id]} for row in result]
else:
result = [{'scrobbles':row.count,'artist_id':row.artist_id} for row in result]
result = rank(result,key='scrobbles')
return result
@cached_wrapper
@connection_provider
def count_scrobbles_by_track(since,to,dbconn=None):
def count_scrobbles_by_track(since,to,resolve_ids=True,dbconn=None):
op = sql.select(
@ -535,10 +678,12 @@ def count_scrobbles_by_track(since,to,dbconn=None):
).group_by(DB['scrobbles'].c.track_id).order_by(sql.desc('count'))
result = dbconn.execute(op).all()
counts = [row.count for row in result]
tracks = get_tracks_map([row.track_id for row in result],dbconn=dbconn)
result = [{'scrobbles':row.count,'track':tracks[row.track_id]} for row in result]
if resolve_ids:
counts = [row.count for row in result]
tracks = get_tracks_map([row.track_id for row in result],dbconn=dbconn)
result = [{'scrobbles':row.count,'track':tracks[row.track_id]} for row in result]
else:
result = [{'scrobbles':row.count,'track_id':row.track_id} for row in result]
result = rank(result,key='scrobbles')
return result
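
The new `resolve_ids` switch only changes the output shape: full entity dicts for the web interface, raw ids when the caller will resolve them itself. A minimal sketch of the two branches (hypothetical helper, not the real function):

```python
def format_track_counts(result_rows, id_to_track=None, resolve_ids=True):
    """result_rows: list of (track_id, count) pairs, already sorted by count.
    With resolve_ids the caller gets full track dicts, otherwise raw ids."""
    if resolve_ids:
        return [{'scrobbles': c, 'track': id_to_track[tid]} for tid, c in result_rows]
    return [{'scrobbles': c, 'track_id': tid} for tid, c in result_rows]
```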
@ -691,6 +836,17 @@ def get_artist(id,dbconn=None):
return artist_db_to_dict(artistinfo,dbconn=dbconn)
@cached_wrapper
@connection_provider
def get_scrobble(timestamp, include_internal=False, dbconn=None):
op = DB['scrobbles'].select().where(
DB['scrobbles'].c.timestamp==timestamp
)
result = dbconn.execute(op).all()
scrobble = result[0]
return scrobbles_db_to_dict(rows=[scrobble], include_internal=include_internal)[0]
@cached_wrapper
@connection_provider
def search_artist(searchterm,dbconn=None):
@ -715,38 +871,37 @@ def search_track(searchterm,dbconn=None):
##### MAINTENANCE
@runhourly
def clean_db():
@connection_provider
def clean_db(dbconn=None):
with SCROBBLE_LOCK:
with engine.begin() as conn:
log(f"Database Cleanup...")
log(f"Database Cleanup...")
to_delete = [
# tracks with no scrobbles (trackartist entries first)
"from trackartists where track_id in (select id from tracks where id not in (select track_id from scrobbles))",
"from tracks where id not in (select track_id from scrobbles)",
# artists with no tracks
"from artists where id not in (select artist_id from trackartists) and id not in (select target_artist from associated_artists)",
# tracks with no artists (scrobbles first)
"from scrobbles where track_id in (select id from tracks where id not in (select track_id from trackartists))",
"from tracks where id not in (select track_id from trackartists)"
]
to_delete = [
# tracks with no scrobbles (trackartist entries first)
"from trackartists where track_id in (select id from tracks where id not in (select track_id from scrobbles))",
"from tracks where id not in (select track_id from scrobbles)",
# artists with no tracks
"from artists where id not in (select artist_id from trackartists) and id not in (select target_artist from associated_artists)",
# tracks with no artists (scrobbles first)
"from scrobbles where track_id in (select id from tracks where id not in (select track_id from trackartists))",
"from tracks where id not in (select track_id from trackartists)"
]
for d in to_delete:
selection = conn.execute(sql.text(f"select * {d}"))
for row in selection.all():
log(f"Deleting {row}")
deletion = conn.execute(sql.text(f"delete {d}"))
for d in to_delete:
selection = dbconn.execute(sql.text(f"select * {d}"))
for row in selection.all():
log(f"Deleting {row}")
deletion = dbconn.execute(sql.text(f"delete {d}"))
log("Database Cleanup complete!")
log("Database Cleanup complete!")
#if a2+a1>0: log(f"Deleted {a2} tracks without scrobbles ({a1} track artist entries)")
#if a2+a1>0: log(f"Deleted {a2} tracks without scrobbles ({a1} track artist entries)")
#if a3>0: log(f"Deleted {a3} artists without tracks")
#if a3>0: log(f"Deleted {a3} artists without tracks")
#if a5+a4>0: log(f"Deleted {a5} tracks without artists ({a4} scrobbles)")
#if a5+a4>0: log(f"Deleted {a5} tracks without artists ({a4} scrobbles)")
@ -767,6 +922,46 @@ def renormalize_names():
rows = conn.execute(DB['artists'].update().where(DB['artists'].c.id == id).values(name_normalized=norm_target))
@connection_provider
def merge_duplicate_tracks(artist_id,dbconn=None):
rows = dbconn.execute(
DB['trackartists'].select().where(
DB['trackartists'].c.artist_id == artist_id
)
)
affected_tracks = [r.track_id for r in rows]
track_artists = {}
rows = dbconn.execute(
DB['trackartists'].select().where(
DB['trackartists'].c.track_id.in_(affected_tracks)
)
)
for row in rows:
track_artists.setdefault(row.track_id,[]).append(row.artist_id)
artist_combos = {}
for track_id in track_artists:
artist_combos.setdefault(tuple(sorted(track_artists[track_id])),[]).append(track_id)
for c in artist_combos:
if len(artist_combos[c]) > 1:
track_identifiers = {}
for track_id in artist_combos[c]:
track_identifiers.setdefault(normalize_name(get_track(track_id)['title']),[]).append(track_id)
for track in track_identifiers:
if len(track_identifiers[track]) > 1:
target,*src = track_identifiers[track]
merge_tracks(target,src,dbconn=dbconn)

maloja/dev/__init__.py

@ -0,0 +1,2 @@
### Subpackage that takes care of all things that concern the server process itself,
### e.g. analytics

maloja/dev/apidebug.py

@ -0,0 +1,28 @@
import bottle, waitress
from ..pkg_global.conf import malojaconfig
from doreah.logging import log
from nimrodel import EAPI as API
PORT = malojaconfig["PORT"]
HOST = malojaconfig["HOST"]
the_listener = API(delay=True)
@the_listener.get("{path}")
@the_listener.post("{path}")
def all_requests(path,**kwargs):
result = {
'path':path,
'payload': kwargs
}
log(result)
return result
def run():
server = bottle.Bottle()
the_listener.mount(server,path="apis")
waitress.serve(server, listen=f"*:{PORT}")


@ -1,5 +1,6 @@
import random
import datetime
from doreah.io import ask
@ -66,10 +67,10 @@ def generate_track():
def generate(n=200):
def generate_scrobbles(n=200):
from ..database.sqldb import add_scrobbles
from ...database.sqldb import add_scrobbles
n = int(n)
if ask("Generate random scrobbles?",default=False):


@ -2,11 +2,10 @@ import os
import cProfile, pstats
from doreah.logging import log
from doreah.timing import Clock
from ..globalconf import data_dir
from ..pkg_global.conf import data_dir
profiler = cProfile.Profile()
@ -33,7 +32,7 @@ def profile(func):
if FULL_PROFILE:
try:
pstats.Stats(profiler).dump_stats(os.path.join(benchmarkfolder,f"{func.__name__}.stats"))
except:
except Exception:
pass
return result


@ -1,4 +1,4 @@
from .globalconf import data_dir, malojaconfig
from .pkg_global.conf import data_dir, malojaconfig
from . import thirdparty
from . import database
@ -94,7 +94,7 @@ def dl_image(url):
uri = datauri.DataURI.make(mime,charset='ascii',base64=True,data=data)
log(f"Downloaded {url} for local caching")
return uri
except:
except Exception:
log(f"Image {url} could not be downloaded for local caching")
return None
@ -260,13 +260,16 @@ def local_files(artist=None,artists=None,title=None):
for f in os.listdir(data_dir['images'](purename)):
if f.split(".")[-1] in ["png","jpg","jpeg","gif"]:
images.append("/images/" + purename + "/" + f)
except:
except Exception:
pass
return images
class MalformedB64(Exception):
pass
def set_image(b64,**keys):
track = "title" in keys
if track:
@ -279,7 +282,10 @@ def set_image(b64,**keys):
log("Trying to set image, b64 string: " + str(b64[:30] + "..."),module="debug")
regex = r"data:image/(\w+);base64,(.+)"
type,b64 = re.fullmatch(regex,b64).groups()
match = re.fullmatch(regex,b64)
if not match: raise MalformedB64()
type,b64 = match.groups()
b64 = base64.b64decode(b64)
filename = "webupload" + str(int(datetime.datetime.now().timestamp())) + "." + type
for folder in get_all_possible_filenames(**keys):
@ -293,8 +299,11 @@ def set_image(b64,**keys):
with open(data_dir['images'](folder,filename),"wb") as f:
f.write(b64)
log("Saved image as " + data_dir['images'](folder,filename),module="debug")
# set as current picture in rotation
if track: set_image_in_cache(id,'tracks',os.path.join("/images",folder,filename))
else: set_image_in_cache(id,'artists',os.path.join("/images",folder,filename))
return os.path.join("/images",folder,filename)


@ -1,5 +1,6 @@
from . import filters
from ..globalconf import malojaconfig
from ..pkg_global.conf import malojaconfig
from ..pkg_global import conf
from .. import database, malojatime, images, malojauri, thirdparty, __pkginfo__
from ..database import jinjaview
@ -32,6 +33,7 @@ def update_jinja_environment():
"mlj_uri": malojauri,
"settings": malojaconfig,
"thirdparty": thirdparty,
"conf":conf,
"pkginfo": __pkginfo__,
# external
"urllib": urllib,


@ -13,7 +13,7 @@ def find_representative(sequence,attribute_id,attribute_count):
newsequence = [e for e in newsequence if e[attribute_count] == max(el[attribute_count] for el in newsequence)]
return newsequence[0]
except:
except Exception:
return None
finally:
for e in newsequence:


@ -3,7 +3,7 @@ from calendar import monthrange
from os.path import commonprefix
import math
from .globalconf import malojaconfig
from .pkg_global.conf import malojaconfig
OFFSET = malojaconfig["TIMEZONE"]
@ -320,7 +320,8 @@ class MTRangeComposite(MTRangeGeneric):
if self.since is None: return FIRST_SCROBBLE
else: return self.since.first_stamp()
def last_stamp(self):
if self.to is None: return int(datetime.utcnow().replace(tzinfo=timezone.utc).timestamp())
#if self.to is None: return int(datetime.utcnow().replace(tzinfo=timezone.utc).timestamp())
if self.to is None: return today().last_stamp()
else: return self.to.last_stamp()
def next(self,step=1):
@ -430,7 +431,7 @@ def time_fix(t):
try:
t = [int(p) for p in t]
return MTRangeGregorian(t[:3])
except:
except Exception:
pass
if isinstance(t[1],str) and t[1].startswith("w"):
@ -438,7 +439,7 @@ def time_fix(t):
year = int(t[0])
weeknum = int(t[1][1:])
return MTRangeWeek(year=year,week=weeknum)
except:
except Exception:
raise


@ -146,7 +146,7 @@ def remove_identical(*dicts):
try: #multidicts
for v in d.getall(k):
keys.append(k,v)
except: #normaldicts
except Exception: #normaldicts
v = d.get(k)
keys.append(k,v)


@ -3,7 +3,7 @@ from doreah.configuration import Configuration
from doreah.configuration import types as tp
from .__pkginfo__ import VERSION
from ..__pkginfo__ import VERSION
@ -28,7 +28,7 @@ def is_dir_usable(pth):
os.mknod(pthj(pth,".test"))
os.remove(pthj(pth,".test"))
return True
except:
except Exception:
return False
def get_env_vars(key,pathsuffix=[]):
@ -148,8 +148,8 @@ malojaconfig = Configuration(
"Technical":{
"cache_expire_positive":(tp.Integer(), "Image Cache Expiration", 60, "Days until images are refetched"),
"cache_expire_negative":(tp.Integer(), "Image Cache Negative Expiration", 5, "Days until failed image fetches are reattempted"),
"db_max_memory":(tp.Integer(min=0,max=100), "RAM Percentage soft limit", 80, "RAM Usage in percent at which Maloja should no longer increase its database cache."),
"use_request_cache":(tp.Boolean(), "Use request-local DB Cache", True),
"db_max_memory":(tp.Integer(min=0,max=100), "RAM Percentage soft limit", 50, "RAM Usage in percent at which Maloja should no longer increase its database cache."),
"use_request_cache":(tp.Boolean(), "Use request-local DB Cache", False),
"use_global_cache":(tp.Boolean(), "Use global DB Cache", True)
},
"Fluff":{
@ -179,15 +179,18 @@ malojaconfig = Configuration(
"Database":{
"invalid_artists":(tp.Set(tp.String()), "Invalid Artists", ["[Unknown Artist]","Unknown Artist","Spotify"], "Artists that should be discarded immediately"),
"remove_from_title":(tp.Set(tp.String()), "Remove from Title", ["(Original Mix)","(Radio Edit)","(Album Version)","(Explicit Version)","(Bonus Track)"], "Phrases that should be removed from song titles"),
"delimiters_feat":(tp.Set(tp.String()), "Featuring Delimiters", ["ft.","ft","feat.","feat","featuring","Ft.","Ft","Feat.","Feat","Featuring"], "Delimiters used for extra artists, even when in the title field"),
"delimiters_feat":(tp.Set(tp.String()), "Featuring Delimiters", ["ft.","ft","feat.","feat","featuring"], "Delimiters used for extra artists, even when in the title field"),
"delimiters_informal":(tp.Set(tp.String()), "Informal Delimiters", ["vs.","vs","&"], "Delimiters in informal artist strings with spaces expected around them"),
"delimiters_formal":(tp.Set(tp.String()), "Formal Delimiters", [";","/","|","","",""], "Delimiters used to tag multiple artists when only one tag field is available")
"delimiters_formal":(tp.Set(tp.String()), "Formal Delimiters", [";","/","|","","",""], "Delimiters used to tag multiple artists when only one tag field is available"),
"filters_remix":(tp.Set(tp.String()), "Remix Filters", ["Remix", "Remix Edit", "Short Mix", "Extended Mix", "Soundtrack Version"], "Filters used to recognize the remix artists in the title"),
"parse_remix_artists":(tp.Boolean(), "Parse Remix Artists", False)
},
"Web Interface":{
"default_range_charts_artists":(tp.Choice({'alltime':'All Time','year':'Year','month':"Month",'week':'Week'}), "Default Range Artist Charts", "year"),
"default_range_charts_tracks":(tp.Choice({'alltime':'All Time','year':'Year','month':"Month",'week':'Week'}), "Default Range Track Charts", "year"),
"default_step_pulse":(tp.Choice({'year':'Year','month':"Month",'week':'Week','day':'Day'}), "Default Pulse Step", "month"),
"charts_display_tiles":(tp.Boolean(), "Display Chart Tiles", False),
"display_art_icons":(tp.Boolean(), "Display Album/Artist Icons", True),
"discourage_cpu_heavy_stats":(tp.Boolean(), "Discourage CPU-heavy stats", False, "Prevent visitors from mindlessly clicking on CPU-heavy options. Does not actually disable them for malicious actors!"),
"use_local_images":(tp.Boolean(), "Use Local Images", True),
#"local_image_rotate":(tp.Integer(), "Local Image Rotate", 3600),
@ -311,7 +314,7 @@ config(
auth={
"multiuser":False,
"cookieprefix":"maloja",
"stylesheets":["/style.css"],
"stylesheets":["/maloja.css"],
"dbfile":data_dir['auth']("auth.ddb")
},
logging={
@ -326,6 +329,9 @@ config(
custom_css_files = [f for f in os.listdir(data_dir['css']()) if f.lower().endswith('.css')]
# what the fuck did i just write
# this spaghetti file is proudly sponsored by the rice crackers i'm eating at the


@ -11,21 +11,21 @@ try:
from simplejson import JSONEncoder
JSONEncoder._olddefault = JSONEncoder.default
JSONEncoder.default = newdefault
except:
except Exception:
pass
try:
from json import JSONEncoder
JSONEncoder._olddefault = JSONEncoder.default
JSONEncoder.default = newdefault
except:
except Exception:
pass
try:
from ujson import JSONEncoder
JSONEncoder._olddefault = JSONEncoder.default
JSONEncoder.default = newdefault
except:
except Exception:
pass
@ -51,7 +51,7 @@ class expandeddate(date):
def fromchrcalendar(cls,y,w,d):
try:
return datetime.date.fromisocalendar(y,w,d) - timedelta(days=1) #sunday instead of monday
except:
except Exception:
# pre python3.8 compatibility
firstdayofyear = datetime.date(y,1,1)


@ -1,140 +0,0 @@
import subprocess
from doreah import settings
from doreah.control import mainfunction
from doreah.io import col
import os
import signal
from ipaddress import ip_address
from .setup import setup
from . import tasks
from .. import __pkginfo__ as info
from .. import globalconf
def print_header_info():
print()
#print("#####")
print(col['yellow']("Maloja"),"v" + info.VERSION)
print(info.HOMEPAGE)
#print("#####")
print()
def getInstance():
try:
output = subprocess.check_output(["pidof","Maloja"])
return int(output)
except:
return None
def getInstanceSupervisor():
try:
output = subprocess.check_output(["pidof","maloja_supervisor"])
return int(output)
except:
return None
def restart():
stop()
start()
def start():
if getInstanceSupervisor() is not None:
print("Maloja is already running.")
else:
print_header_info()
setup()
try:
#p = subprocess.Popen(["python3","-m","maloja.server"],stdout=subprocess.DEVNULL,stderr=subprocess.DEVNULL)
sp = subprocess.Popen(["python3","-m","maloja.proccontrol.supervisor"],stdout=subprocess.DEVNULL,stderr=subprocess.DEVNULL)
print(col["green"]("Maloja started!"))
port = globalconf.malojaconfig["PORT"]
print("Visit your server address (Port " + str(port) + ") to see your web interface. Visit /admin_setup to get started.")
print("If you're installing this on your local machine, these links should get you there:")
print("\t" + col["blue"]("http://localhost:" + str(port)))
print("\t" + col["blue"]("http://localhost:" + str(port) + "/admin_setup"))
return True
except:
print("Error while starting Maloja.")
return False
def stop():
pid_sv = getInstanceSupervisor()
if pid_sv is not None:
os.kill(pid_sv,signal.SIGTERM)
pid = getInstance()
if pid is not None:
os.kill(pid,signal.SIGTERM)
if pid is None and pid_sv is None:
return False
print("Maloja stopped!")
return True
def onlysetup():
print_header_info()
setup()
print("Setup complete!")
def direct():
print_header_info()
setup()
from .. import server
server.run_server()
def debug():
os.environ["MALOJA_DEV_MODE"] = 'true'
globalconf.malojaconfig.load_environment()
direct()
def print_info():
print_header_info()
print(col['lightblue']("Configuration Directory:"),globalconf.dir_settings['config'])
print(col['lightblue']("Data Directory: "),globalconf.dir_settings['state'])
print(col['lightblue']("Log Directory: "),globalconf.dir_settings['logs'])
print(col['lightblue']("Network: "),f"IPv{ip_address(globalconf.malojaconfig['host']).version}, Port {globalconf.malojaconfig['port']}")
print(col['lightblue']("Timezone: "),f"UTC{globalconf.malojaconfig['timezone']:+d}")
print()
print()
@mainfunction({"l":"level","v":"version","V":"version"},flags=['version','include_images'],shield=True)
def main(*args,**kwargs):
actions = {
# server
"start":start,
"restart":restart,
"stop":stop,
"run":direct,
"debug":debug,
"setup":onlysetup,
# admin scripts
"import":tasks.import_scrobbles, # maloja import /x/y.csv
"backup":tasks.backup, # maloja backup --targetfolder /x/y --include_images
"generate":tasks.generate, # maloja generate 400
"export":tasks.export, # maloja export
# aux
"info":print_info
}
if "version" in kwargs:
print(info.VERSION)
return True
else:
try:
action, *args = args
action = actions[action]
except (ValueError, KeyError):
print("Valid commands: " + " ".join(a for a in actions))
return False
return action(*args,**kwargs)


@ -1,33 +0,0 @@
#!/usr/bin/env python3
import os
from ..globalconf import malojaconfig
import subprocess
import setproctitle
import signal
from doreah.logging import log
from .control import getInstance
setproctitle.setproctitle("maloja_supervisor")
def start():
try:
return subprocess.Popen(
["python3", "-m", "maloja","run"],
stdout=subprocess.DEVNULL,
stderr=subprocess.DEVNULL,
)
except e:
log("Error starting Maloja: " + str(e),module="supervisor")
while True:
log("Maloja is not running, starting...",module="supervisor")
process = start()
process.wait()


@ -1,4 +1,3 @@
from .import_scrobbles import import_scrobbles
from .backup import backup
from .generate import generate
from .export import export # read that line out loud


@ -2,7 +2,7 @@ import tarfile
import time
import glob
import os
from ...globalconf import dir_settings
from ...pkg_global.conf import dir_settings
from pathlib import PurePath
from doreah.logging import log


@ -4,7 +4,7 @@ import json, csv
from doreah.io import col, ask, prompt
from ...cleanup import *
from ...globalconf import data_dir
from ...pkg_global.conf import data_dir
c = CleanerAgent()
@ -37,18 +37,27 @@ def import_scrobbles(inputf):
typeid,typedesc = "lastfm","Last.fm"
importfunc = parse_lastfm
elif re.match("Streaming_History_Audio.+\.json",filename):
typeid,typedesc = "spotify","Spotify"
importfunc = parse_spotify_lite
elif re.match("endsong_[0-9]+\.json",filename):
typeid,typedesc = "spotify","Spotify"
importfunc = parse_spotify_full
importfunc = parse_spotify
elif re.match("StreamingHistory[0-9]+\.json",filename):
typeid,typedesc = "spotify","Spotify"
importfunc = parse_spotify_lite
importfunc = parse_spotify_lite_legacy
elif re.match("maloja_export_[0-9]+\.json",filename):
typeid,typedesc = "maloja","Maloja"
importfunc = parse_maloja
# username_lb-YYYY-MM-DD.json
elif re.match(".*_lb-[0-9-]+\.json",filename):
typeid,typedesc = "listenbrainz","ListenBrainz"
importfunc = parse_listenbrainz
else:
print("File",inputf,"could not be identified as a valid import source.")
return result
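
The importer picks a parser purely from the filename, first match wins. That dispatch reduces to a small regex table; in this sketch strings stand in for the actual parser functions, and the patterns are the ones visible in the hunk above:

```python
import re

# pattern -> source type, checked in the same order as the branches above
PATTERNS = [
    (r"Streaming_History_Audio.+\.json", "spotify"),
    (r"endsong_[0-9]+\.json", "spotify"),
    (r"StreamingHistory[0-9]+\.json", "spotify"),
    (r"maloja_export_[0-9]+\.json", "maloja"),
    (r".*_lb-[0-9-]+\.json", "listenbrainz"),   # username_lb-YYYY-MM-DD.json
]

def identify_import_source(filename):
    for pattern, typeid in PATTERNS:
        if re.match(pattern, filename):
            return typeid
    return None  # "could not be identified as a valid import source"
```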
@ -76,6 +85,7 @@ def import_scrobbles(inputf):
# extra info
extrainfo = {}
if scrobble.get('album_name'): extrainfo['album_name'] = scrobble['album_name']
if scrobble.get('album_artist'): extrainfo['album_artist'] = scrobble['album_artist']
# saving this in the scrobble instead of the track because for now it's not meant
# to be authorative information, just payload of the scrobble
@ -84,7 +94,7 @@ def import_scrobbles(inputf):
"track":{
"artists":scrobble['track_artists'],
"title":scrobble['track_title'],
"length":None
"length":scrobble['track_length'],
},
"duration":scrobble['scrobble_duration'],
"origin":"import:" + typeid,
@ -116,7 +126,7 @@ def import_scrobbles(inputf):
return result
def parse_spotify_lite(inputf):
def parse_spotify_lite_legacy(inputf):
pth = os.path
inputfolder = pth.relpath(pth.dirname(pth.abspath(inputf)))
filenames = re.compile(r'StreamingHistory[0-9]+\.json')
@ -154,6 +164,7 @@ def parse_spotify_lite(inputf):
yield ("CONFIDENT_IMPORT",{
'track_title':title,
'track_artists': artist,
'track_length': None,
'scrobble_time': timestamp,
'scrobble_duration':played,
'album_name': None
@ -165,7 +176,59 @@ def parse_spotify_lite(inputf):
print()
def parse_spotify_full(inputf):
def parse_spotify_lite(inputf):
pth = os.path
inputfolder = pth.relpath(pth.dirname(pth.abspath(inputf)))
filenames = re.compile(r'Streaming_History_Audio.+\.json')
inputfiles = [os.path.join(inputfolder,f) for f in os.listdir(inputfolder) if filenames.match(f)]
if len(inputfiles) == 0:
print("No files found!")
return
if inputfiles != [inputf]:
print("Spotify files should all be imported together to identify duplicates across the whole dataset.")
if not ask("Import " + ", ".join(col['yellow'](i) for i in inputfiles) + "?",default=True):
inputfiles = [inputf]
for inputf in inputfiles:
print("Importing",col['yellow'](inputf),"...")
with open(inputf,'r') as inputfd:
data = json.load(inputfd)
for entry in data:
try:
played = int(entry['ms_played'] / 1000)
timestamp = int(
datetime.datetime.strptime(entry['ts'],"%Y-%m-%dT%H:%M:%SZ").timestamp()
)
artist = entry['master_metadata_album_artist_name'] # hmmm
title = entry['master_metadata_track_name']
album = entry['master_metadata_album_album_name']
albumartist = entry['master_metadata_album_artist_name']
if played < 30:
yield ('CONFIDENT_SKIP',None,f"{entry} is shorter than 30 seconds, skipping...")
continue
yield ("CONFIDENT_IMPORT",{
'track_title':title,
'track_artists': artist,
'track_length': None,
'scrobble_time': timestamp,
'scrobble_duration':played,
'album_name': album,
'album_artist': albumartist
},'')
except Exception as e:
yield ('FAIL',None,f"{entry} could not be parsed. Scrobble not imported. ({repr(e)})")
continue
print()
def parse_spotify(inputf):
pth = os.path
inputfolder = pth.relpath(pth.dirname(pth.abspath(inputf)))
filenames = re.compile(r'endsong_[0-9]+\.json')
@ -174,7 +237,7 @@ def parse_spotify_full(inputf):
if len(inputfiles) == 0:
print("No files found!")
return
if inputfiles != [inputf]:
print("Spotify files should all be imported together to identify duplicates across the whole dataset.")
if not ask("Import " + ", ".join(col['yellow'](i) for i in inputfiles) + "?",default=True):
@ -262,6 +325,7 @@ def parse_spotify_full(inputf):
yield (status,{
'track_title':title,
'track_artists': artist,
'track_length': None,
'album_name': album,
'scrobble_time': timestamp,
'scrobble_duration':played
@ -294,6 +358,7 @@ def parse_lastfm(inputf):
yield ('CONFIDENT_IMPORT',{
'track_title': title,
'track_artists': artist,
'track_length': None,
'album_name': album,
'scrobble_time': int(datetime.datetime.strptime(
time + '+0000',
@ -305,6 +370,28 @@ def parse_lastfm(inputf):
yield ('FAIL',None,f"{row} (Line {line}) could not be parsed. Scrobble not imported. ({repr(e)})")
continue
def parse_listenbrainz(inputf):
with open(inputf,'r') as inputfd:
data = json.load(inputfd)
for entry in data:
try:
track_metadata = entry['track_metadata']
additional_info = track_metadata.get('additional_info', {})
yield ("CONFIDENT_IMPORT",{
'track_title': track_metadata['track_name'],
'track_artists': additional_info.get('artist_names') or track_metadata['artist_name'],
'track_length': int(additional_info.get('duration_ms', 0) / 1000) or additional_info.get('duration'),
'album_name': track_metadata.get('release_name'),
'scrobble_time': entry['listened_at'],
'scrobble_duration': None,
},'')
except Exception as e:
yield ('FAIL',None,f"{entry} could not be parsed. Scrobble not imported. ({repr(e)})")
continue
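
Each parser above yields (status, scrobble_payload, message) triples that import_scrobbles consumes. A tiny consumer sketch of that protocol (hypothetical helper, assuming the triple shape shown in the hunks above):

```python
def summarize_import(events):
    """events: iterable of (status, scrobble, message) triples as the
    parsers yield them; tally outcomes so FAILs are easy to spot."""
    counts = {}
    for status, _scrobble, _message in events:
        counts[status] = counts.get(status, 0) + 1
    return counts
```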
def parse_maloja(inputf):
@ -318,6 +405,7 @@ def parse_maloja(inputf):
yield ('CONFIDENT_IMPORT',{
'track_title': s['track']['title'],
'track_artists': s['track']['artists'],
'track_length': s['track']['length'],
'album_name': s['track'].get('album',{}).get('name',''),
'scrobble_time': s['time'],
'scrobble_duration': s['duration']


@ -2,9 +2,7 @@
import sys
import os
from threading import Thread
import setproctitle
from importlib import resources
from css_html_js_minify import html_minify, css_minify
import datauri
import time
@ -12,6 +10,7 @@ import time
# server stuff
from bottle import Bottle, static_file, request, response, FormsDict, redirect, BaseRequest, abort
import waitress
from jinja2.exceptions import TemplateNotFound
# doreah toolkit
from doreah.logging import log
@ -22,12 +21,12 @@ from . import database
from .database.jinjaview import JinjaDBConnection
from .images import resolve_track_image, resolve_artist_image
from .malojauri import uri_to_internal, remove_identical
from .globalconf import malojaconfig, data_dir
from .pkg_global.conf import malojaconfig, data_dir
from .jinjaenv.context import jinja_environment
from .apis import init_apis, apikeystore
from .proccontrol.profiler import profile
from .dev.profiler import profile
######
@ -43,48 +42,6 @@ BaseRequest.MEMFILE_MAX = 15 * 1024 * 1024
webserver = Bottle()
#rename process, this is now required for the daemon manager to work
setproctitle.setproctitle("Maloja")
######
### CSS
#####
def generate_css():
cssstr = ""
with resources.files('maloja') / 'web' / 'static' as staticfolder:
for file in os.listdir(os.path.join(staticfolder,"css")):
if file.endswith(".css"):
with open(os.path.join(staticfolder,"css",file),"r") as filed:
cssstr += filed.read()
for file in os.listdir(data_dir['css']()):
if file.endswith(".css"):
with open(os.path.join(data_dir['css'](file)),"r") as filed:
cssstr += filed.read()
cssstr = css_minify(cssstr)
return cssstr
css = generate_css()
######
### MINIFY
#####
def clean_html(inp):
return inp
#if malojaconfig["DEV_MODE"]: return inp
#else: return html_minify(inp)
@ -204,13 +161,6 @@ def static_image(pth):
return resp
@webserver.route("/style.css")
def get_css():
response.content_type = 'text/css'
if malojaconfig["DEV_MODE"]: return generate_css()
else: return css
@webserver.route("/login")
def login():
return auth.get_login_page()
@ -219,7 +169,7 @@ def login():
@webserver.route("/<name>.<ext>")
@webserver.route("/media/<name>.<ext>")
def static(name,ext):
assert ext in ["txt","ico","jpeg","jpg","png","less","js","ttf"]
assert ext in ["txt","ico","jpeg","jpg","png","less","js","ttf","css"]
with resources.files('maloja') / 'web' / 'static' as staticfolder:
response = static_file(ext + "/" + name + "." + ext,root=staticfolder)
response.set_header("Cache-Control", "public, max-age=3600")
@ -233,6 +183,15 @@ def static(path):
response.set_header("Cache-Control", "public, max-age=3600")
return response
# static files not supplied by the package
@webserver.get("/static_custom/<category>/<path:path>")
def static_custom(category,path):
rootpath = {
'css':data_dir['css']()
}
response = static_file(path,root=rootpath[category])
response.set_header("Cache-Control", "public, max-age=3600")
return response
### DYNAMIC
@ -254,16 +213,17 @@ def jinja_page(name):
"_urikeys":keys, #temporary!
}
loc_context["filterkeys"], loc_context["limitkeys"], loc_context["delimitkeys"], loc_context["amountkeys"], loc_context["specialkeys"] = uri_to_internal(keys)
template = jinja_environment.get_template(name + '.jinja')
try:
template = jinja_environment.get_template(name + '.jinja')
res = template.render(**loc_context)
except TemplateNotFound:
abort(404,f"Not found: '{name}'")
except (ValueError, IndexError):
abort(404,"This Artist or Track does not exist")
if malojaconfig["DEV_MODE"]: jinja_environment.cache.clear()
return clean_html(res)
return res
@webserver.route("/<name:re:admin.*>")
@auth.authenticated


@ -1,10 +1,12 @@
from importlib import resources
from distutils import dir_util
from doreah.io import col, ask, prompt
from doreah import auth
import os
from ..globalconf import data_dir, dir_settings, malojaconfig
from importlib import resources
from distutils import dir_util
from doreah.io import col, ask, prompt
from doreah import auth
from .pkg_global.conf import data_dir, dir_settings, malojaconfig
@ -48,7 +50,7 @@ def setup():
# OWN API KEY
from ..apis import apikeystore
from .apis import apikeystore
if len(apikeystore) == 0:
answer = ask("Do you want to set up a key to enable scrobbling? Your scrobble extension needs that key so that only you can scrobble tracks to your database.",default=True,skip=SKIP)
if answer:


@ -13,7 +13,7 @@ import base64
from doreah.logging import log
from threading import BoundedSemaphore
from ..globalconf import malojaconfig
from ..pkg_global.conf import malojaconfig
from .. import database
@ -230,7 +230,7 @@ class MetadataInterface(GenericInterface,abstract=True):
for node in self.metadata[resp]:
try:
res = res[node]
except:
except Exception:
return None
return res


@ -18,7 +18,7 @@ class MusicBrainz(MetadataInterface):
metadata = {
"response_type":"json",
"response_parse_tree_track": ["images",0,"image"],
"response_parse_tree_track": ["images",0,"thumbnails","500"],
"required_settings": [],
}
@ -57,7 +57,7 @@ class MusicBrainz(MetadataInterface):
if imgurl is not None: imgurl = self.postprocess_url(imgurl)
return imgurl
except:
except Exception:
return None
finally:
time.sleep(2)


@ -7,7 +7,7 @@ import csv
from doreah.logging import log
from doreah.io import col
from .globalconf import data_dir, dir_settings
from .pkg_global.conf import data_dir, dir_settings
from .apis import _apikeys
@ -37,7 +37,7 @@ def upgrade_apikeys():
for key,identifier in entries:
_apikeys.apikeystore[identifier] = key
os.remove(oldfile)
except:
except Exception:
pass


@ -37,7 +37,6 @@
</span>
<br/><br/>
<span id="notification"></span>
</td>
</tr>


@ -8,12 +8,16 @@
<title>{% block title %}{% endblock %}</title>
<meta name="description" content='Maloja is a self-hosted music scrobble server.' />
<link rel="icon" type="image/x-icon" href="/favicon.ico" />
<meta name="color-scheme" content="dark" />
<meta name="darkreader" content="wat" />
<link rel="stylesheet" href="/style.css" />
<link rel="stylesheet" href="/maloja.css" />
<link rel="stylesheet" href="/static/css/themes/{{ settings.theme }}.css" />
{% for cssf in conf.custom_css_files %}
<link rel="stylesheet" href="/static_custom/css/{{ cssf }}" />
{% endfor %}
<script src="/search.js"></script>
<script src="/neopolitan.js"></script>
@ -50,9 +54,7 @@
{% endblock %}
{% endblock %}
<div id="notification_area">
</div>
<div class="footer">
@ -84,9 +86,21 @@
</div>
</div>
<a href="/admin_overview"><div title="Server Administration" id="settingsicon" class="clickable_icon">
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24"><path d="M17 12.645v-2.289c-1.17-.417-1.907-.533-2.28-1.431-.373-.9.07-1.512.6-2.625l-1.618-1.619c-1.105.525-1.723.974-2.626.6-.9-.374-1.017-1.117-1.431-2.281h-2.29c-.412 1.158-.53 1.907-1.431 2.28h-.001c-.9.374-1.51-.07-2.625-.6l-1.617 1.619c.527 1.11.973 1.724.6 2.625-.375.901-1.123 1.019-2.281 1.431v2.289c1.155.412 1.907.531 2.28 1.431.376.908-.081 1.534-.6 2.625l1.618 1.619c1.107-.525 1.724-.974 2.625-.6h.001c.9.373 1.018 1.118 1.431 2.28h2.289c.412-1.158.53-1.905 1.437-2.282h.001c.894-.372 1.501.071 2.619.602l1.618-1.619c-.525-1.107-.974-1.723-.601-2.625.374-.899 1.126-1.019 2.282-1.43zm-8.5 1.689c-1.564 0-2.833-1.269-2.833-2.834s1.269-2.834 2.833-2.834 2.833 1.269 2.833 2.834-1.269 2.834-2.833 2.834zm15.5 4.205v-1.077c-.55-.196-.897-.251-1.073-.673-.176-.424.033-.711.282-1.236l-.762-.762c-.52.248-.811.458-1.235.283-.424-.175-.479-.525-.674-1.073h-1.076c-.194.545-.25.897-.674 1.073-.424.176-.711-.033-1.235-.283l-.762.762c.248.523.458.812.282 1.236-.176.424-.528.479-1.073.673v1.077c.544.193.897.25 1.073.673.177.427-.038.722-.282 1.236l.762.762c.521-.248.812-.458 1.235-.283.424.175.479.526.674 1.073h1.076c.194-.545.25-.897.676-1.074h.001c.421-.175.706.034 1.232.284l.762-.762c-.247-.521-.458-.812-.282-1.235s.529-.481 1.073-.674zm-4 .794c-.736 0-1.333-.597-1.333-1.333s.597-1.333 1.333-1.333 1.333.597 1.333 1.333-.597 1.333-1.333 1.333z"/></svg>
</div></a>
<div id="icon_bar">
{% block icon_bar %}{% endblock %}
{% include 'icons/settings.jinja' %}
</div>
<div id="notification_area">
</div>
<!-- Load script as late as possible so DOM renders first -->
<script src="/lazyload17-8-2.min.js"></script>
<script>
var lazyLoadInstance = new LazyLoad({});
</script>
</body>
</html>
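The LazyLoad instance above picks up the elements that the templates below tag with class `lazy` and a `data-bg` attribute, and writes their `background-image` style once they scroll into view. A minimal sketch of that substitution, with plain objects standing in for DOM nodes (the helper function is ours, not part of lazyload.js):

```javascript
// Sketch: apply the background image that LazyLoad would set for
// elements carrying class "lazy" and a data-bg attribute.
// Plain objects stand in for DOM nodes here.
function applyLazyBackgrounds(elements) {
  for (const el of elements) {
    if (el.classList.includes('lazy') && el.dataset.bg) {
      el.style.backgroundImage = 'url("' + el.dataset.bg + '")';
    }
  }
  return elements;
}

const tiles = [
  { classList: ['lazy'], dataset: { bg: '/image?artist=Example' }, style: {} },
  { classList: [], dataset: {}, style: {} }, // not lazy: left untouched
];
applyLazyBackgrounds(tiles);
console.log(tiles[0].style.backgroundImage); // url("/image?artist=Example")
```

Because only in-viewport elements are resolved, the DOM finishes loading before any image request is made, which is what produces the faster apparent page loads.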

View File

@ -16,7 +16,7 @@
<td style="padding-right:7px;">
Artists:
</td><td id="artists_td">
<input placeholder='Separate with Enter' class='simpleinput' id='artists' onKeydown='keyDetect(event)' />
<input placeholder='Separate with Enter' class='simpleinput' id='artists' onKeydown='keyDetect(event)' onblur='addEnteredArtist()' />
</td>
</tr>
<tr>

View File

@ -66,6 +66,9 @@
<ul>
<li>manually scrobble from track pages</li>
<li>delete scrobbles</li>
<li>reparse scrobbles</li>
<li>edit tracks and artists</li>
<li>merge tracks and artists</li>
<li>upload artist and track art by dropping a file on the existing image on an artist or track page</li>
<li>see more detailed error pages</li>
</ul>
@ -80,10 +83,10 @@
Backup your data.<br/><br/>
<a href="/apis/mlj_1/backup" download="maloja_backup.tar.gz">
<a class="hidelink" href="/apis/mlj_1/backup" download="maloja_backup.tar.gz">
<button type="button">Backup</button>
</a>
<a href="/apis/mlj_1/export" download="maloja_export.json">
<a class="hidelink" href="/apis/mlj_1/export" download="maloja_export.json">
<button type="button">Export</button>
</a>

View File

@ -71,7 +71,7 @@
<tr> <td>album</td> <td><i>Album title - optional</i></td> </tr>
<tr> <td>albumartists</td> <td><i>List of album artists - optional</i></td> </tr>
<tr> <td>duration</td> <td><i>Duration of play in seconds - optional</i></td> </tr>
<tr> <td>length</td> <td><i>Full length of the trackin seconds - optional</i></td> </tr>
<tr> <td>length</td> <td><i>Full length of the track in seconds - optional</i></td> </tr>
<tr> <td>time</td> <td><i>UNIX timestamp - optional, defaults to time of request</i></td> </tr>
<tr> <td>fix</td> <td><i>Set this to false to skip server-side metadata fixing - optional</i></td> </tr>
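The fields in the table above form the body of a `POST` to `/apis/mlj_1/newscrobble` (the endpoint used elsewhere in these templates). A hedged sketch of assembling such a payload client-side; `artists` and `title` are assumed names for the required fields not visible in this excerpt, and the builder function itself is illustrative only:

```javascript
// Assemble a newscrobble body from the documented fields.
// `artists`/`title` are assumed required-field names; the optional
// keys are the ones listed in the table above.
function buildScrobblePayload(fields) {
  const payload = { artists: fields.artists, title: fields.title };
  for (const key of ['album', 'albumartists', 'duration', 'length', 'time', 'fix']) {
    if (fields[key] !== undefined) payload[key] = fields[key];
  }
  return payload;
}

const body = buildScrobblePayload({
  artists: ['Example Artist'],
  title: 'Example Track',
  length: 215, // full length of the track in seconds
  fix: false,  // skip server-side metadata fixing
});
console.log(Object.keys(body).join(',')); // artists,title,length,fix
```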
@ -85,12 +85,12 @@
<h2>Import your Last.FM data</h2>
Switching from Last.fm? <a class="textlink" href="https://benjaminbenben.com/lastfm-to-csv/">Download all your data</a> and run the command <span class="stats">maloja import <i>(the file you just downloaded)</i></span>.<br/>
You can also try out <a href="https://github.com/FoxxMD/multi-scrobbler">Multi-Scrobbler</a> to import scrobbles from a wider range of sources.
You can also try out <a class="textlink" href="https://github.com/FoxxMD/multi-scrobbler">Multi-Scrobbler</a> to import scrobbles from a wider range of sources.
<br/><br/>
<h2>Set up some rules</h2>
After you've scrobbled for a bit, you might want to check the <a class="textlink" href="/admin_issues">Issues page</a> to see if you need to set up some rules. You can also manually add rules in your server's "rules" directory - just add your own .tsv file and read the instructions on how to declare a rule.
You can add some rules in your server's "rules" directory - just add your own .tsv file and read the instructions on how to declare a rule.
<br/><br/>
You can also set up some predefined rulesets right away!

View File

@ -6,6 +6,7 @@
{% block scripts %}
<script src="/rangeselect.js"></script>
<script src="/edit.js"></script>
{% endblock %}
{% set artist = filterkeys.artist %}
@ -26,10 +27,23 @@
{% set encodedartist = mlj_uri.uriencode({'artist':artist}) %}
{% block icon_bar %}
{% if adminmode %}
{% include 'icons/edit.jinja' %}
{% include 'icons/merge.jinja' %}
{% include 'icons/merge_mark.jinja' %}
{% include 'icons/merge_cancel.jinja' %}
<script>showValidMergeIcons();</script>
{% endif %}
{% endblock %}
{% block content %}
<script>
const entity_id = {{ info.id }};
const entity_type = 'artist';
const entity_name = {{ artist | tojson }};
</script>
@ -40,6 +54,7 @@
<div
class="changeable-image" data-uploader="b64=>upload('{{ encodedartist }}',b64)"
style="background-image:url('{{ images.get_artist_image(artist) }}');"
title="Drag & Drop to upload new image"
></div>
{% else %}
<div style="background-image:url('{{ images.get_artist_image(artist) }}');">
@ -47,7 +62,7 @@
{% endif %}
</td>
<td class="text">
<h1 class="headerwithextra">{{ info.artist }}</h1>
<h1 id="main_entity_name" class="headerwithextra">{{ info.artist | e }}</h1>
{% if competes %}<span class="rank"><a href="/charts_artists?max=100">#{{ info.position }}</a></span>{% endif %}
<br/>
{% if competes and included %}
@ -56,7 +71,9 @@
<span>Competing under {{ links.link(credited) }} (#{{ info.position }})</span>
{% endif %}
<p class="stats"><a href="{{ mlj_uri.create_uri("/scrobbles",filterkeys) }}">{{ info['scrobbles'] }} Scrobbles</a></p>
<p class="stats">
<a href="{{ mlj_uri.create_uri("/scrobbles",filterkeys) }}">{{ info['scrobbles'] }} Scrobbles</a>
</p>
@ -72,6 +89,7 @@
</tr>
</table>
<h2><a href='{{ mlj_uri.create_uri("/charts_tracks",filterkeys) }}'>Top Tracks</a></h2>

View File

@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View File

@ -0,0 +1,21 @@
MIT License
Copyright (c) 2022 GitHub Inc.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@ -0,0 +1,6 @@
<div class='deleteicon clickable_icon danger' onclick="toggleDeleteConfirm(this)" title="Delete scrobble">
<svg height="16px" viewBox="0 0 24 24" width="16px">
<path d="M0 0h24v24H0z" fill="none"/>
<path d="M6 19c0 1.1.9 2 2 2h8c1.1 0 2-.9 2-2V7H6v12zM19 4h-3.5l-1-1h-5l-1 1H5v2h14V4z"/>
</svg>
</div>

View File

@ -0,0 +1,5 @@
<div title="Edit" id="editicon" class="clickable_icon" onclick="editEntity()">
<svg width="24" height="24" viewBox="0 0 24 24">
<path fill-rule="evenodd" d="M17.263 2.177a1.75 1.75 0 012.474 0l2.586 2.586a1.75 1.75 0 010 2.474L19.53 10.03l-.012.013L8.69 20.378a1.75 1.75 0 01-.699.409l-5.523 1.68a.75.75 0 01-.935-.935l1.673-5.5a1.75 1.75 0 01.466-.756L14.476 4.963l2.787-2.786zm-2.275 4.371l-10.28 9.813a.25.25 0 00-.067.108l-1.264 4.154 4.177-1.271a.25.25 0 00.1-.059l10.273-9.806-2.94-2.939zM19 8.44l2.263-2.262a.25.25 0 000-.354l-2.586-2.586a.25.25 0 00-.354 0L16.061 5.5 19 8.44z"/>
</svg>
</div>

View File

@ -0,0 +1,5 @@
<div title="Merge" id="mergeicon" class="clickable_icon hide" onclick="merge()">
<svg viewBox="0 0 16 16" width="24" height="24">
<path fill-rule="evenodd" d="M5 3.254V3.25v.005a.75.75 0 110-.005v.004zm.45 1.9a2.25 2.25 0 10-1.95.218v5.256a2.25 2.25 0 101.5 0V7.123A5.735 5.735 0 009.25 9h1.378a2.251 2.251 0 100-1.5H9.25a4.25 4.25 0 01-3.8-2.346zM12.75 9a.75.75 0 100-1.5.75.75 0 000 1.5zm-8.5 4.5a.75.75 0 100-1.5.75.75 0 000 1.5z"></path>
</svg>
</div>

View File

@ -0,0 +1,5 @@
<div title="Cancel merge" id="mergecancelicon" class="clickable_icon hide" onclick="cancelMerge()">
<svg viewBox="0 0 16 16" width="24" height="24">
<path fill-rule="evenodd" d="M10.72 1.227a.75.75 0 011.06 0l.97.97.97-.97a.75.75 0 111.06 1.061l-.97.97.97.97a.75.75 0 01-1.06 1.06l-.97-.97-.97.97a.75.75 0 11-1.06-1.06l.97-.97-.97-.97a.75.75 0 010-1.06zM12.75 6.5a.75.75 0 00-.75.75v3.378a2.251 2.251 0 101.5 0V7.25a.75.75 0 00-.75-.75zm0 5.5a.75.75 0 100 1.5.75.75 0 000-1.5zM2.5 3.25a.75.75 0 111.5 0 .75.75 0 01-1.5 0zM3.25 1a2.25 2.25 0 00-.75 4.372v5.256a2.251 2.251 0 101.5 0V5.372A2.25 2.25 0 003.25 1zm0 11a.75.75 0 100 1.5.75.75 0 000-1.5z"></path>
</svg>
</div>

View File

@ -0,0 +1,5 @@
<div title="Mark for merging" id="mergemarkicon" class="clickable_icon hide" onclick="markForMerge()">
<svg viewBox="0 0 16 16" width="24" height="24">
<path fill-rule="evenodd" d="M7.177 3.073L9.573.677A.25.25 0 0110 .854v4.792a.25.25 0 01-.427.177L7.177 3.427a.25.25 0 010-.354zM3.75 2.5a.75.75 0 100 1.5.75.75 0 000-1.5zm-2.25.75a2.25 2.25 0 113 2.122v5.256a2.251 2.251 0 11-1.5 0V5.372A2.25 2.25 0 011.5 3.25zM11 2.5h-1V4h1a1 1 0 011 1v5.628a2.251 2.251 0 101.5 0V5A2.5 2.5 0 0011 2.5zm1 10.25a.75.75 0 111.5 0 .75.75 0 01-1.5 0zM3.75 12a.75.75 0 100 1.5.75.75 0 000-1.5z"></path>
</svg>
</div>

View File

@ -0,0 +1,7 @@
<td style="opacity:0.5;text-align:center;">
<svg height="96px" viewBox="0 0 24 24" width="96px">
<path d="M0 0h24v24H0z" fill="none"/>
<path d="M4.27 3L3 4.27l9 9v.28c-.59-.34-1.27-.55-2-.55-2.21 0-4 1.79-4 4s1.79 4 4 4 4-1.79 4-4v-1.73L19.73 21 21 19.73 4.27 3zM14 7h4V3h-6v5.18l2 2z"/>
</svg>
<br/>No scrobbles yet!
</td>

View File

@ -0,0 +1,5 @@
<div class='refreshicon clickable_icon danger' onclick="toggleReparseConfirm(this)" title="Reparse original scrobble">
<svg height="16px" viewBox="0 0 24 24" width="16px">
<path d="M0 0h24v24H0z" fill="none"/><path d="M17.65 6.35C16.2 4.9 14.21 4 12 4c-4.42 0-7.99 3.58-7.99 8s3.57 8 7.99 8c3.73 0 6.84-2.55 7.73-6h-2.08c-.82 2.33-3.04 4-5.65 4-3.31 0-6-2.69-6-6s2.69-6 6-6c1.66 0 3.14.69 4.22 1.78L13 11h7V4l-2.35 2.35z"/>
</svg>
</div>

View File

@ -0,0 +1,10 @@
<a class='hidelink' href="/admin_overview">
<div title="Server Administration" id="settingsicon" class="clickable_icon" style="margin-left:25px;">
<svg enable-background="new 0 0 24 24" height="24px" viewBox="0 0 24 24" width="24px">
<g>
<path d="M0,0h24v24H0V0z" fill="none"/>
<path d="M19.14,12.94c0.04-0.3,0.06-0.61,0.06-0.94c0-0.32-0.02-0.64-0.07-0.94l2.03-1.58c0.18-0.14,0.23-0.41,0.12-0.61 l-1.92-3.32c-0.12-0.22-0.37-0.29-0.59-0.22l-2.39,0.96c-0.5-0.38-1.03-0.7-1.62-0.94L14.4,2.81c-0.04-0.24-0.24-0.41-0.48-0.41 h-3.84c-0.24,0-0.43,0.17-0.47,0.41L9.25,5.35C8.66,5.59,8.12,5.92,7.63,6.29L5.24,5.33c-0.22-0.08-0.47,0-0.59,0.22L2.74,8.87 C2.62,9.08,2.66,9.34,2.86,9.48l2.03,1.58C4.84,11.36,4.8,11.69,4.8,12s0.02,0.64,0.07,0.94l-2.03,1.58 c-0.18,0.14-0.23,0.41-0.12,0.61l1.92,3.32c0.12,0.22,0.37,0.29,0.59,0.22l2.39-0.96c0.5,0.38,1.03,0.7,1.62,0.94l0.36,2.54 c0.05,0.24,0.24,0.41,0.48,0.41h3.84c0.24,0,0.44-0.17,0.47-0.41l0.36-2.54c0.59-0.24,1.13-0.56,1.62-0.94l2.39,0.96 c0.22,0.08,0.47,0,0.59-0.22l1.92-3.32c0.12-0.22,0.07-0.47-0.12-0.61L19.14,12.94z M12,15.6c-1.98,0-3.6-1.62-3.6-3.6 s1.62-3.6,3.6-3.6s3.6,1.62,3.6,3.6S13.98,15.6,12,15.6z"/>
</g>
</svg>
</div>
</a>

View File

@ -9,8 +9,12 @@
{% set charts_cycler = cycler(*charts_14) %}
<table class="tiles_top"><tr>
{% for segment in range(3) %}
{% if charts_14[0] is none and loop.first %}
{% include 'icons/nodata.jinja' %}
{% else %}
<td>
{% set segmentsize = segment+1 %}
<table class="tiles_{{ segmentsize }}x{{ segmentsize }} tiles_sub">
@ -23,7 +27,7 @@
{% set rank = entry.rank %}
<td>
<a href="{{ links.url(artist) }}">
<div style='background-image:url("{{ images.get_artist_image(artist) }}")'>
<div class="lazy" data-bg="{{ images.get_artist_image(artist) }}">
<span class='stats'>#{{ rank }}</span> <span>{{ artist }}</span>
</div>
</a>
@ -35,6 +39,7 @@
</tr>
{%- endfor -%}
</table>
</td>
</td>
{% endif %}
{% endfor %}
</tr></table>

View File

@ -11,6 +11,9 @@
<table class="tiles_top"><tr>
{% for segment in range(3) %}
{% if charts_14[0] is none and loop.first %}
{% include 'icons/nodata.jinja' %}
{% else %}
<td>
{% set segmentsize = segment+1 %}
<table class="tiles_{{ segmentsize }}x{{ segmentsize }} tiles_sub">
@ -23,7 +26,7 @@
{% set rank = entry.rank %}
<td>
<a href="{{ links.url(track) }}">
<div style='background-image:url("{{ images.get_track_image(track) }}")'>
<div class="lazy" data-bg="{{ images.get_track_image(track) }}">
<span class='stats'>#{{ rank }}</span> <span>{{ track.title }}</span>
</div>
</a>
@ -35,6 +38,7 @@
</tr>
{%- endfor %}
</table>
</td>
</td>
{% endif %}
{% endfor %}
</tr></table>

View File

@ -17,18 +17,28 @@
{{ entityrow.row(s.track) }}
{% if adminmode %}
<td class='delete_area'>
<span class="confirmactions">
<button class="smallbutton warning" onclick="deleteScrobble({{ s.time }},this)">Confirm</button>
<button class="smallbutton" onclick="toggleDeleteConfirm(this)">Cancel</button>
<td class='scrobble_action_area'>
<span class='scrobble_action_type'>
<span class="confirmactions">
<button class="smallbutton warning" onclick="reparseScrobble({{ s.time }},this)">Reparse</button>
<button class="smallbutton" onclick="toggleReparseConfirm(this)">Cancel</button>
</span>
<span class="initializeactions">
{% include 'icons/reparse.jinja' %}
</span>
</span>
<span class="initializeactions">
<div class='deleteicon clickable_icon danger' onclick="toggleDeleteConfirm(this)">
<svg style="width:14px;height:14px" viewBox="0 0 24 24">
<path d="M19,4H15.5L14.5,3H9.5L8.5,4H5V6H19M6,19A2,2 0 0,0 8,21H16A2,2 0 0,0 18,19V7H6V19Z" />
</svg>
</div>
<span class='scrobble_action_type'>
<span class="confirmactions">
<button class="smallbutton warning" onclick="deleteScrobble({{ s.time }},this)">Delete</button>
<button class="smallbutton" onclick="toggleDeleteConfirm(this)">Cancel</button>
</span>
<span class="initializeactions">
{% include 'icons/delete.jinja' %}
</span>
</span>
</td>

View File

@ -8,7 +8,11 @@
{% set img = images.get_artist_image(entity) %}
{% endif %}
<td class='icon'><div style="background-image:url('{{ img }}')"></div></td>
<td class='icon'>
{% if settings['DISPLAY_ART_ICONS'] %}
<div class="lazy" data-bg="{{ img }}"></div>
{% endif %}
</td>
{% if entity is mapping and 'artists' in entity %}
{% if settings['TRACK_SEARCH_PROVIDER'] %}
<td class='searchProvider'>{{ links.link_search(entity) }}</td>

View File

@ -5,7 +5,7 @@
{% set name = entity %}
{% endif %}
<a href="{{ url(entity) }}">{{ name }}</a>
<a href="{{ url(entity) }}">{{ name | e }}</a>
{%- endmacro %}
{% macro links(entities) -%}

View File

@ -10,8 +10,7 @@
{% if pages > 1 %}
{% if page > 1 %}
<a href='{{ mlj_uri.create_uri("",filterkeys,limitkeys,delimitkeys,amountkeys,{'page':0}) }}'>
<span class='stat_selector'>1</span>
</a> |
<span class='stat_selector'>1</span></a> |
{% endif %}
{% if page > 2 %}
@ -20,8 +19,7 @@
{% if page > 0 %}
<a href='{{ mlj_uri.create_uri("",filterkeys,limitkeys,delimitkeys,amountkeys,{'page':page-1}) }}'>
<span class='stat_selector'>{{ page }}</span>
</a> «
<span class='stat_selector'>{{ page }}</span></a> «
{% endif %}
<span style='opacity:0.5;' class='stat_selector'>
@ -30,8 +28,7 @@
{% if page < pages-1 %}
» <a href='{{ mlj_uri.create_uri("",filterkeys,limitkeys,delimitkeys,amountkeys,{'page':page+1}) }}'>
<span class='stat_selector'>{{ page+2 }}</span>
</a>
<span class='stat_selector'>{{ page+2 }}</span></a>
{% endif %}
{% if page < pages-3 %}
@ -40,8 +37,7 @@
{% if page < pages-2 %}
| <a href='{{ mlj_uri.create_uri("",filterkeys,limitkeys,delimitkeys,amountkeys,{'page':pages-1}) }}'>
<span class='stat_selector'>{{ pages }}</span>
</a>
<span class='stat_selector'>{{ pages }}</span></a>
{% endif %}
{% endif %}
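The branches in this macro decide which page selectors render around the current (0-indexed) page. A standalone sketch of the same decision logic, omitting the ellipsis separators whose markup is elided above (function name and labels are ours):

```javascript
// Mirror the template's conditions: which selectors appear for a given
// 0-indexed current page out of `pages` total pages.
function pageSelectors(page, pages) {
  const parts = [];
  if (pages > 1) {
    if (page > 1) parts.push('first');        // link to page 1
    if (page > 0) parts.push('prev');         // « link to the previous page
    parts.push('current');                    // shown dimmed, not a link
    if (page < pages - 1) parts.push('next'); // » link to the next page
    if (page < pages - 2) parts.push('last'); // link to the final page
  }
  return parts;
}

console.log(pageSelectors(0, 5).join(',')); // current,next,last
console.log(pageSelectors(2, 5).join(',')); // first,prev,current,next,last
```

Note that single-page results render no selectors at all, since the whole block is guarded by `pages > 1`.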

View File

@ -75,7 +75,7 @@
<span class="stat_module">
{%- with amountkeys = {"perpage":15,"page":0}, shortTimeDesc=True -%}
{%- with amountkeys = {"perpage":12,"page":0}, shortTimeDesc=True -%}
{% include 'partials/scrobbles.jinja' %}
{%- endwith -%}
</span>

View File

@ -5,6 +5,7 @@
{% block scripts %}
<script src="/rangeselect.js"></script>
<script src="/edit.js"></script>
<script>
function scrobble(encodedtrack) {
neo.xhttprequest('/apis/mlj_1/newscrobble?nofix&' + encodedtrack,data={},method="POST").then(response=>{window.location.reload()});
@ -21,8 +22,24 @@
{% set encodedtrack = mlj_uri.uriencode({'track':track}) %}
{% block icon_bar %}
{% if adminmode %}
{% include 'icons/edit.jinja' %}
{% include 'icons/merge.jinja' %}
{% include 'icons/merge_mark.jinja' %}
{% include 'icons/merge_cancel.jinja' %}
<script>showValidMergeIcons();</script>
{% endif %}
{% endblock %}
{% block content %}
<script>
const entity_id = {{ info.id }};
const entity_type = 'track';
const entity_name = {{ track.title | tojson }};
</script>
{% import 'partials/awards_track.jinja' as awards %}
@ -34,6 +51,7 @@
<div
class="changeable-image" data-uploader="b64=>upload('{{ encodedtrack }}',b64)"
style="background-image:url('{{ images.get_track_image(track) }}');"
title="Drag & Drop to upload new image"
></div>
{% else %}
<div style="background-image:url('{{ images.get_track_image(track) }}');">
@ -42,7 +60,7 @@
</td>
<td class="text">
<span>{{ links.links(track.artists) }}</span><br/>
<h1 class="headerwithextra">{{ info.track.title }}</h1>
<h1 id="main_entity_name" class="headerwithextra">{{ info.track.title | e }}</h1>
{{ awards.certs(track) }}
<span class="rank"><a href="/charts_tracks?max=100">#{{ info.position }}</a></span>
<br/>

View File

@ -2,6 +2,8 @@
COMMON STYLES FOR MALOJA, ALBULA AND POSSIBLY OTHERS
**/
@import url("/grisonsfont.css");
:root {
--base-color: #232327;
--base-color-dark: #090909;
@ -156,5 +158,5 @@ input:focus {
.hide {
display:none;
display:none !important;
}

View File

@ -1,3 +1,6 @@
@import url("/grisons.css");
body {
padding:15px;
padding-bottom:35px;
@ -55,24 +58,32 @@ div.header h1 {
settings icon
**/
div.clickable_icon {
display: inline-block;
}
div.clickable_icon svg {
fill: var(--text-color);
cursor: pointer;
}
div.clickable_icon:hover {
fill: var(--text-color-focus);
}
div.clickable_icon.danger:hover {
fill: red;
}
div#settingsicon {
div#icon_bar {
position:fixed;
right:30px;
top:30px;
}
div#icon_bar div.clickable_icon {
display: inline-block;
height:26px;
width:26px;
}
div.clickable_icon svg {
cursor: pointer;
}
div.clickable_icon:hover svg {
fill: var(--text-color-focus);
}
div.clickable_icon.danger:hover svg {
fill: red;
}
/**
Footer
@ -198,7 +209,7 @@ div#notification_area {
div#notification_area div.notification {
background-color:white;
width:400px;
height:100px;
height:50px;
margin-bottom:7px;
padding:9px;
opacity:0.4;
@ -512,7 +523,8 @@ table.list {
table.list tr {
background-color: var(--current-bg-color);
border-color: var(--current-bg-color);
height: 1.4em;
height: 1.45em;
transition: opacity 2s;
}
@ -610,31 +622,50 @@ table.list td.searchProvider:hover {
color: gold;
}
table.list td.delete_area {
table.list td.scrobble_action_area {
text-align: right;
width:7em;
width:2em;
overflow:visible;
}
table.list tr td.scrobble_action_area span.scrobble_action_type {
display:inline-block;
float:right;
}
table.list td.scrobble_action_area span.scrobble_action_type.active {
}
/* rows that can be deleted in some form
'active' class on the delete area cell to toggle confirm prompt
'removed' class on the whole row to delete
*/
table.list tr td.delete_area span.confirmactions {
table.list tr td.scrobble_action_area span.scrobble_action_type span.confirmactions {
display: none;
}
table.list tr td.delete_area span.initializeactions {
table.list tr td.scrobble_action_area span.scrobble_action_type span.initializeactions {
display: initial;
}
table.list tr td.delete_area.active span.confirmactions {
/* when other action is active, hide all */
table.list tr td.scrobble_action_area.active span.scrobble_action_type span.confirmactions {
display: none;
}
table.list tr td.scrobble_action_area.active span.scrobble_action_type span.initializeactions {
display: none;
}
/* except this one itself is active */
table.list tr td.scrobble_action_area.active span.scrobble_action_type.active span.confirmactions {
display: initial;
}
table.list tr td.delete_area.active span.initializeactions {
table.list tr td.scrobble_action_area.active span.scrobble_action_type.active span.initializeactions {
display: none;
}
table.list tr.removed td.delete_area span.confirmactions {
table.list tr.removed td.scrobble_action_area span.scrobble_action_type {
display: none;
}
table.list tr.removed td.delete_area span.initializeactions {
table.list tr.removed td.scrobble_action_area span.scrobble_action_type {
display: none;
}
table.list tr.removed {
@ -643,6 +674,13 @@ table.list tr.removed {
}
table.list tr.changed {
/*background-color: rgba(222,209,180,0.7) !important;*/
opacity:0;
transition: opacity 0.2s;
}
/*
table td.artists div {
overflow:hidden;
@ -678,7 +716,7 @@ table.list td.amount {
text-align:right;
}
table.list td.bar {
width:500px;
width:400px;
/* background-color: var(--base-color); */
/* Remove 5er separators for bars */
/*border-color:rgba(0,0,0,0)!important;*/
@ -696,7 +734,7 @@ table.list tr:hover td.bar div {
}
table.list td.chart {
width:500px;
width:400px;
/* background-color: var(--base-color); */
/* Remove 5er separators for bars */
/*border-color:rgba(0,0,0,0)!important;*/
@ -810,8 +848,11 @@ table.tiles_top td div {
table.tiles_top td span {
background-color:rgba(0,0,0,0.7);
display: table-cell;
display: inline-block;
margin-top:2%;
padding: 3px;
max-width: 67%;
vertical-align: text-top;
}
table.tiles_top td a:hover {
text-decoration: none;
@ -825,12 +866,12 @@ table.tiles_1x1 td {
table.tiles_2x2 td {
height:50%;
width:50%;
font-size:90%
font-size:80%
}
table.tiles_3x3 td {
height:33.333%;
width:33.333%;
font-size:70%
font-size:60%
}
table.tiles_4x4 td {
font-size:50%
@ -839,6 +880,24 @@ table.tiles_5x5 td {
font-size:40%
}
/* Safari fix */
table.tiles_sub.tiles_3x3 td div {
min-height: 100px;
min-width: 100px;
}
table.tiles_sub.tiles_2x2 td div {
min-height: 150px;
min-width: 150px;
}
table.tiles_sub.tiles_1x1 td div {
min-height: 300px;
min-width: 300px;
}
table.tiles_sub a span {
overflow-wrap: anywhere;
}
.summary_rank {
background-size:cover;


@@ -1,12 +1,266 @@
// JS for all web interface editing / deletion of scrobble data
// HELPERS
function selectAll(e) {
// https://stackoverflow.com/a/6150060/6651341
var range = document.createRange();
range.selectNodeContents(e);
var sel = window.getSelection();
sel.removeAllRanges();
sel.addRange(range);
}
// DELETION
function toggleDeleteConfirm(element) {
element.parentElement.parentElement.classList.toggle('active');
element.parentElement.parentElement.parentElement.classList.toggle('active');
}
function deleteScrobble(id,element) {
element.parentElement.parentElement.parentElement.classList.add('removed');
var callback_func = function(req){
if (req.status == 200) {
element.parentElement.parentElement.parentElement.parentElement.classList.add('removed');
notifyCallback(req);
}
else {
notifyCallback(req);
}
};
neo.xhttpreq("/apis/mlj_1/delete_scrobble",data={'timestamp':id},method="POST",callback=(()=>null),json=true);
neo.xhttpreq("/apis/mlj_1/delete_scrobble",data={'timestamp':id},method="POST",callback=callback_func,json=true);
}
// REPARSING
function toggleReparseConfirm(element) {
element.parentElement.parentElement.classList.toggle('active');
element.parentElement.parentElement.parentElement.classList.toggle('active');
}
function reparseScrobble(id, element) {
toggleReparseConfirm(element);
callback_func = function(req){
if (req.status == 200) {
if (req.response.status != 'no_operation') {
//window.location.reload();
notifyCallback(req);
var newtrack = req.response.scrobble.track;
var row = element.parentElement.parentElement.parentElement.parentElement;
changeScrobbleRow(row,newtrack);
}
else {
notifyCallback(req);
}
}
else {
notifyCallback(req);
}
};
neo.xhttpreq("/apis/mlj_1/reparse_scrobble",data={'timestamp':id},method="POST",callback=callback_func,json=true);
}
function changeScrobbleRow(element,newtrack) {
element.classList.add('changed');
setTimeout(function(){
element.getElementsByClassName('track')[0].innerHTML = createTrackCell(newtrack);
},200);
setTimeout(function(){element.classList.remove('changed')},300);
}
function createTrackCell(trackinfo) {
var trackquery = new URLSearchParams();
trackinfo.artists.forEach((a)=>trackquery.append('artist',a));
trackquery.append('title',trackinfo.title);
tracklink = document.createElement('a');
tracklink.href = "/track?" + trackquery.toString();
tracklink.textContent = trackinfo.title;
artistelements = []
var artistholder = document.createElement('span');
artistholder.classList.add('artist_in_trackcolumn');
for (var a of trackinfo.artists) {
var artistquery = new URLSearchParams();
artistquery.append('artist',a);
artistlink = document.createElement('a');
artistlink.href = "/artist?" + artistquery.toString();
artistlink.textContent = a;
artistelements.push(artistlink.outerHTML)
}
artistholder.innerHTML = artistelements.join(", ");
return artistholder.outerHTML + " " + tracklink.outerHTML;
}
// EDIT NAME
function editEntity() {
var namefield = document.getElementById('main_entity_name');
try {
namefield.contentEditable = "plaintext-only"; // not supported by Firefox
}
catch (e) {
namefield.contentEditable = true;
}
namefield.addEventListener('keydown',function(e){
// dont allow new lines, done on enter
if (e.key === "Enter") {
e.preventDefault();
namefield.blur(); // this leads to below
}
// cancel on esc
else if (e.key === "Escape" || e.key === "Esc") {
e.preventDefault();
namefield.textContent = entity_name;
namefield.blur();
}
})
// emergency, not pretty because it will move cursor
namefield.addEventListener('input',function(e){
if (namefield.textContent.includes("\n")) {
namefield.textContent = namefield.textContent.replace("\n","");
}
})
// manually clicking away OR enter
namefield.addEventListener('blur',function(e){
doneEditing();
})
namefield.focus();
selectAll(namefield);
}
function doneEditing() {
window.getSelection().removeAllRanges();
var namefield = document.getElementById('main_entity_name');
namefield.contentEditable = "false";
newname = namefield.textContent;
if (newname != entity_name) {
var searchParams = new URLSearchParams(window.location.search);
if (entity_type == 'artist') {
var endpoint = "/apis/mlj_1/edit_artist";
searchParams.set("artist", newname);
var payload = {'id':entity_id,'name':newname};
}
else if (entity_type == 'track') {
var endpoint = "/apis/mlj_1/edit_track";
searchParams.set("title", newname);
var payload = {'id':entity_id,'title':newname}
}
callback_func = function(req){
if (req.status == 200) {
window.location = "?" + searchParams.toString();
}
else {
notifyCallback(req);
namefield.textContent = entity_name;
}
};
neo.xhttpreq(
endpoint,
data=payload,
method="POST",
callback=callback_func,
json=true
);
}
}
// MERGING
function showValidMergeIcons() {
const lcst = window.sessionStorage;
var key = "marked_for_merge_" + entity_type;
var current_stored = (lcst.getItem(key) || '').split(",");
current_stored = current_stored.filter((x)=>x).map((x)=>parseInt(x));
var mergeicon = document.getElementById('mergeicon');
var mergemarkicon = document.getElementById('mergemarkicon');
var mergecancelicon = document.getElementById('mergecancelicon');
mergeicon.classList.add('hide');
mergemarkicon.classList.add('hide');
mergecancelicon.classList.add('hide');
if (current_stored.length == 0) {
mergemarkicon.classList.remove('hide');
}
else {
mergecancelicon.classList.remove('hide');
if (current_stored.includes(entity_id)) {
}
else {
mergemarkicon.classList.remove('hide');
mergeicon.classList.remove('hide');
}
}
}
function markForMerge() {
const lcst = window.sessionStorage;
var key = "marked_for_merge_" + entity_type;
var current_stored = (lcst.getItem(key) || '').split(",");
current_stored = current_stored.filter((x)=>x).map((x)=>parseInt(x));
current_stored.push(entity_id);
current_stored = [...new Set(current_stored)];
lcst.setItem(key,current_stored); //this already formats it correctly
notify("Marked " + entity_name + " for merge","Currently " + current_stored.length + " marked!")
showValidMergeIcons();
}
function merge() {
const lcst = window.sessionStorage;
var key = "marked_for_merge_" + entity_type;
var current_stored = lcst.getItem(key).split(",");
current_stored = current_stored.filter((x)=>x).map((x)=>parseInt(x));
callback_func = function(req){
if (req.status == 200) {
window.location.reload();
}
else {
notifyCallback(req);
}
};
neo.xhttpreq(
"/apis/mlj_1/merge_" + entity_type + "s",
data={
'source_ids':current_stored,
'target_id':entity_id
},
method="POST",
callback=callback_func,
json=true
);
lcst.removeItem(key);
}
function cancelMerge() {
const lcst = window.sessionStorage;
var key = "marked_for_merge_" + entity_type;
lcst.setItem(key,[]);
showValidMergeIcons();
notify("Cancelled merge!","")
}
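The merge functions above round-trip entity IDs through a comma-separated sessionStorage value: parse with `split`/`filter`/`map`, de-duplicate via a `Set`, and rely on array-to-string coercion to serialize back. A standalone sketch of that cycle (a plain `Map` stands in for `window.sessionStorage`, which is unavailable outside the browser):

```javascript
// Minimal sketch of the merge-marking round-trip, with a Map standing in
// for window.sessionStorage (getItem/setItem only).
const storage = new Map();
const store = {
  getItem: (k) => (storage.has(k) ? storage.get(k) : null),
  setItem: (k, v) => storage.set(k, String(v)),
};

// Parse the stored comma-separated list into integer IDs,
// dropping the empty fragment produced by an unset key.
function getMarked(key) {
  return (store.getItem(key) || '').split(',').filter((x) => x).map(Number);
}

// Add an ID, de-duplicate via Set, and serialize back.
// Array-to-string coercion joins with commas, matching the parse format.
function mark(key, id) {
  const ids = [...new Set([...getMarked(key), id])];
  store.setItem(key, ids);
  return ids;
}

mark('marked_for_merge_artist', 42);
mark('marked_for_merge_artist', 7);
mark('marked_for_merge_artist', 42); // duplicate, absorbed by the Set
console.log(getMarked('marked_for_merge_artist')); // [ 42, 7 ]
```

The `filter((x)=>x)` step is what makes an empty or unset key yield `[]` rather than `[NaN]`, which is why the stored value can be initialized lazily.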

File diff suppressed because one or more lines are too long


@@ -69,8 +69,9 @@ function scrobble(artists,title) {
"title":title
}
if (title != "" && artists.length > 0) {
neo.xhttpreq("/apis/mlj_1/newscrobble",data=payload,method="POST",callback=scrobbledone,json=true)
neo.xhttpreq("/apis/mlj_1/newscrobble",data=payload,method="POST",callback=notifyCallback,json=true)
}
document.getElementById("title").value = "";
@@ -125,14 +126,14 @@ function searchresult_manualscrobbling() {
console.log(tracks);
for (let t of tracks) {
track = document.createElement("span");
trackstr = t["artists"].join(", ") + " - " + t["title"];
trackstr = t.track["artists"].join(", ") + " - " + t.track["title"];
tracklink = t["link"];
track.innerHTML = "<a href='" + tracklink + "'>" + trackstr + "</a>";
row = document.createElement("tr")
col1 = document.createElement("td")
button = document.createElement("button")
button.innerHTML = "Scrobble!"
button.onclick = function(){ scrobble(t["artists"],t["title"])};
button.onclick = function(){ scrobble(t.track["artists"],t.track["title"])};
col2 = document.createElement("td")
row.appendChild(col1)
col1.appendChild(button)


@@ -6,7 +6,7 @@ const colors = {
}
const notification_template = info => `
<div class="notification" style="background-color:${colors[type]};">
<div class="notification" style="background-color:${colors[info.notification_type]};">
<b>${info.title}</b><br/>
<span>${info.body}</span>
@@ -20,11 +20,11 @@ function htmlToElement(html) {
return template.content.firstChild;
}
function notify(title,msg,type='info',reload=false) {
function notify(title,msg,notification_type='info',reload=false) {
info = {
'title':title,
'body':msg,
'type':type
'notification_type':notification_type
}
var element = htmlToElement(notification_template(info));
@@ -33,3 +33,22 @@ function notify(title,msg,type='info',reload=false) {
setTimeout(function(e){e.remove();},7000,element);
}
function notifyCallback(request) {
var body = request.response;
var status = request.status;
if (status == 200) {
var notification_type = 'info';
var title = "Success!";
var msg = body.desc || body;
}
else {
var notification_type = 'warning';
var title = "Error: " + body.error.type;
var msg = body.error.desc || "";
}
notify(title,msg,notification_type);
}
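The new `notifyCallback` above funnels every XHR result, success or error, into one notification. The mapping it applies can be sketched as a pure function; the `{status, response}` shape mirrors the request object in the diff, but the helper name and its use outside the XHR wrapper are assumptions for illustration:

```javascript
// Sketch of the response-to-notification mapping notifyCallback performs,
// factored into a pure function that is easy to test in isolation.
function describeResponse(request) {
  const body = request.response;
  if (request.status == 200) {
    return {
      type: 'info',
      title: 'Success!',
      msg: body.desc || body, // the API may return a plain string body
    };
  }
  return {
    type: 'warning',
    title: 'Error: ' + body.error.type,
    msg: body.error.desc || '',
  };
}

console.log(describeResponse({ status: 200, response: { desc: 'Scrobble deleted' } }));
// → { type: 'info', title: 'Success!', msg: 'Scrobble deleted' }
```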


@@ -61,29 +61,29 @@ function searchresult() {
}
for (var i=0;i<artists.length;i++) {
name = artists[i]["name"];
name = artists[i]["artist"];
link = artists[i]["link"];
image = artists[i]["image"];
var node = oneresult.cloneNode(true);
node.setAttribute("onclick","goto('" + link + "')");
node.children[0].style.backgroundImage = "url('" + image + "')";
node.children[1].children[0].innerHTML = name;
node.children[1].children[0].textContent = name;
results_artists.appendChild(node);
}
for (var i=0;i<tracks.length;i++) {
artists = tracks[i]["artists"].join(", ");
title = tracks[i]["title"];
artists = tracks[i]["track"]["artists"].join(", ");
title = tracks[i]["track"]["title"];
link = tracks[i]["link"];
image = tracks[i]["image"];
var node = oneresult.cloneNode(true);
node.setAttribute("onclick","goto('" + link + "')");
node.children[0].style.backgroundImage = "url('" + image + "')";
node.children[1].children[0].innerHTML = artists;
node.children[1].children[2].innerHTML = title;
node.children[1].children[0].textContent = artists;
node.children[1].children[2].textContent = title;
results_tracks.appendChild(node);
}
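The `innerHTML` → `textContent` switch above keeps artist and track names from being parsed as markup. Assigning via `textContent` has roughly the effect of escaping HTML metacharacters first; the helper below is hypothetical, written only to make visible what the browser does implicitly:

```javascript
// Hypothetical helper illustrating the escaping that element.textContent
// performs implicitly: markup in user-supplied names becomes inert text.
// Order matters: '&' must be replaced before the entities are introduced.
function escapeHTML(s) {
  return s
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}

// With innerHTML, a name like this would inject a live <img> tag;
// escaped (or assigned via textContent), it renders as literal text.
const name = '<img src=x onerror=alert(1)>';
console.log(escapeHTML(name));
// → &lt;img src=x onerror=alert(1)&gt;
```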


@@ -1,2 +1,2 @@
User-agent: *
Disallow: *
Disallow: /


@@ -1,9 +1,9 @@
[project]
name = "malojaserver"
version = "3.0.6"
version = "3.1.5"
description = "Self-hosted music scrobble database"
readme = "./README.md"
requires-python = ">=3.6"
requires-python = ">=3.7"
license = { file="./LICENSE" }
authors = [ { name="Johannes Krattenmacher", email="maloja@dev.krateng.ch" } ]
@@ -20,14 +20,13 @@ classifiers = [
dependencies = [
"bottle>=0.12.16",
"waitress>=1.3",
"doreah>=1.9.1, <2",
"waitress>=2.1.0",
"doreah>=1.9.4, <2",
"nimrodel>=0.8.0",
"setproctitle>=1.1.10",
#"pyvips>=2.1.16",
"jinja2>=2.11",
"jinja2>=3.0.0",
"lru-dict>=1.1.6",
"css_html_js_minify>=2.5.5",
"psutil>=5.8.0",
"sqlalchemy>=1.4",
"python-datauri>=1.1.0",
@@ -40,7 +39,7 @@ full = [
]
[project.scripts]
maloja = "maloja.proccontrol.control:main"
maloja = "maloja.__main__:main"
[build-system]
requires = ["flit_core >=3.2,<4"]

Some files were not shown because too many files have changed in this diff.