Mirror of https://gitlab.com/SIGBUS/nyaa.git, synced 2024-12-22 15:50:00 +00:00

Merge branch 'master' into reports

Commit bea63315cd
README.md
# NyaaV2

## Setting up for development

This project uses Python 3.6. There are features used that do not exist in 3.5, so make sure to use Python 3.6.

This guide also assumes you 1) are using Linux and 2) are somewhat capable with the command line.

It's not impossible to run Nyaa on Windows, but this guide doesn't focus on that.
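Since the 3.6 requirement trips people up, a quick interpreter check (a minimal sketch, not part of the project) can save some confusion:

```python
import sys

# Nyaa relies on 3.6-only features (e.g. f-strings),
# so fail fast on anything older.
assert sys.version_info >= (3, 6), 'Python 3.6 or newer is required'
print('Python {}.{} is fine'.format(*sys.version_info[:2]))
```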
### Code Quality

- Before we get any deeper, remember to follow the PEP8 style guidelines and run `./lint.sh` before committing.
- You may also use `pycodestyle nyaa/ --show-source --max-line-length=100` to see a list of warnings/problems instead of having `lint.sh` make modifications for you.
- Other than PEP8, try to keep your code clean and easy to understand as well. It's only polite!
### Setting up Pyenv

pyenv eases the use of different Python versions, and as not all Linux distros offer 3.6 packages, it's right up our alley.

- Install dependencies: https://github.com/pyenv/pyenv/wiki/Common-build-problems
- Install `pyenv`: https://github.com/pyenv/pyenv/blob/master/README.md#installation
- Install `pyenv-virtualenv`: https://github.com/pyenv/pyenv-virtualenv/blob/master/README.md
- Install Python 3.6.1 with `pyenv` and create a virtualenv for the project:
  - `pyenv install 3.6.1`
  - `pyenv virtualenv 3.6.1 nyaa`
  - `pyenv activate nyaa`
- Install dependencies with `pip install -r requirements.txt`
- Copy `config.example.py` into `config.py`
  - Change `SITE_FLAVOR` in your `config.py` depending on which instance you want to host
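For orientation, the flavor-related settings in `config.py` look roughly like this (an illustrative excerpt, not the full file; see `config.example.py` for the real defaults):

```python
# Which site instance this process serves; the Elasticsearch index
# is named after the flavor.
SITE_FLAVOR = 'nyaa'  # 'nyaa' or 'sukebei'

ES_INDEX_NAME = SITE_FLAVOR

assert SITE_FLAVOR in ('nyaa', 'sukebei')
print(ES_INDEX_NAME)
```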
### Setting up MySQL/MariaDB database

You *may* use SQLite, but the current support for it in this project is outdated and rather unsupported.

- Enable the `USE_MYSQL` flag in `config.py`
- Install the latest MariaDB by following the instructions at https://downloads.mariadb.org/mariadb/repositories/
  - Tested versions: `mysql Ver 15.1 Distrib 10.0.30-MariaDB, for debian-linux-gnu (x86_64) using readline 5.2`
- Run the following commands logged in as your root db user (substitute your own `config.py` values if desired):
  - `CREATE USER 'test'@'localhost' IDENTIFIED BY 'test123';`
  - `GRANT ALL PRIVILEGES ON *.* TO 'test'@'localhost';`
  - `FLUSH PRIVILEGES;`
  - `CREATE DATABASE nyaav2 DEFAULT CHARACTER SET utf8 COLLATE utf8_bin;`
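If you'd rather script this bootstrap than type the statements into the client, a tiny generator keeps the credentials in one place (a sketch using the example values above; substitute your own):

```python
# Example credentials from the steps above -- substitute your own.
DB_USER = 'test'
DB_PASS = 'test123'
DB_NAME = 'nyaav2'

bootstrap_sql = [
    "CREATE USER '{}'@'localhost' IDENTIFIED BY '{}';".format(DB_USER, DB_PASS),
    "GRANT ALL PRIVILEGES ON *.* TO '{}'@'localhost';".format(DB_USER),
    "FLUSH PRIVILEGES;",
    "CREATE DATABASE {} DEFAULT CHARACTER SET utf8 COLLATE utf8_bin;".format(DB_NAME),
]

# Feed these to `mysql -u root -p`, one per line.
print('\n'.join(bootstrap_sql))
```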
### Finishing up

- Run `python db_create.py` to create the database and import categories
- Follow the advice of `db_create.py` and run `./db_migrate.py stamp head` to mark the database version for Alembic
- Start the dev server with `python run.py`
- When you are finished developing, deactivate your virtualenv with `pyenv deactivate` or `source deactivate` (or just close your shell session)
You're now ready for simple testing and development!

Continue below to learn about database migrations and enabling the advanced search engine, Elasticsearch.
## Database migrations

- Database migrations are done with [flask-Migrate](https://flask-migrate.readthedocs.io/), a wrapper around [Alembic](http://alembic.zzzcomputing.com/en/latest/).
- If someone has made changes in the database schema and included a new migration script:
  - If your database has never been marked by Alembic (you're on a database from before the migrations), run `./db_migrate.py stamp head` before pulling the new migration script(s).
    - If you already have the new scripts, check the output of `./db_migrate.py history` instead and choose a hash that matches your current database state, then run `./db_migrate.py stamp <hash>`.
  - Update your branch (eg. `git fetch && git rebase origin/master`)
  - Run `./db_migrate.py upgrade head` to run the migration. Done!
- If *you* have made a change in the database schema:
  - Save your changes in `models.py` and ensure the database schema matches the previous version (ie. your new tables/columns are not added to the live database)
  - Run `./db_migrate.py migrate -m "Short description of changes"` to automatically generate a migration script for the changes
  - Check the script (`migrations/versions/...`) and make sure it works! Alembic may not be able to notice all changes.
  - Run `./db_migrate.py upgrade` to run the migration and verify the upgrade works.
  - (Run `./db_migrate.py downgrade` to verify the downgrade works as well, then upgrade again)

## Setting up and enabling Elasticsearch
### Installing Elasticsearch

- Install the JDK with `sudo apt-get install openjdk-8-jdk`
- Install [Elasticsearch](https://www.elastic.co/downloads/elasticsearch)
  - [From packages...](https://www.elastic.co/guide/en/elasticsearch/reference/current/deb.html)
    - Enable the service:
      - `sudo systemctl enable elasticsearch.service`
      - `sudo systemctl start elasticsearch.service`
  - or by [simply extracting the archives and running the files](https://www.elastic.co/guide/en/elasticsearch/reference/current/_installation.html), if you don't feel like permanently installing ES
- Run `curl -XGET 'localhost:9200'` and make sure ES is running
  - Optional: install [Kibana](https://www.elastic.co/products/kibana) as a search debug frontend for ES
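If curl isn't handy, the same liveness check can be done with Python's standard library alone (a sketch; assumes ES on the default `localhost:9200`):

```python
import json
import urllib.request

def es_is_up(base_url='http://localhost:9200'):
    # The root endpoint returns cluster info as JSON when ES is running.
    try:
        with urllib.request.urlopen(base_url, timeout=5) as resp:
            info = json.load(resp)
        return 'version' in info
    except OSError:
        # Connection refused, timeout, DNS failure, etc.
        return False

print('Elasticsearch reachable:', es_is_up())
```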
### Setting up ES

- Run `./create_es.sh` to create the indices for the torrents: `nyaa` and `sukebei`
  - The output should show `acknowledged: true` twice
- Stop the Nyaa app if you haven't already
- Run `python import_to_es.py` to import all the torrents (on nyaa and sukebei) into the ES indices.
  - This may take some time to run if you have plenty of torrents in your database.
Enable the `USE_ELASTIC_SEARCH` flag in `config.py` and (re)start the application.

Elasticsearch should now be functional! The ES indices won't be updated "live" with the current setup; continue below for instructions on how to hook Elasticsearch up to the MySQL binlog.

However, take note that the binlog is not necessary for simple ES testing and development; you can simply run `import_to_es.py` from time to time to reindex all the torrents.
### Enabling MySQL Binlogging

- Edit your MariaDB/MySQL server configuration and add the following under `[mariadb]`:
    ```
    log-bin
    server_id=1
    log-basename=master1
    binlog-format=row
    ```
- Restart MariaDB/MySQL (`sudo service mysql restart`)
- Copy the example configuration (`es_sync_config.example.json`) as `es_sync_config.json` and adjust the options to your liking (verify the connection options!)
- Connect to mysql as root
  - Verify that the result of `SHOW VARIABLES LIKE 'binlog_format';` is `ROW`
  - Execute `GRANT REPLICATION SLAVE ON *.* TO 'username'@'localhost';` to allow your configured user access to the binlog
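Before restarting the server, you can sanity-check the `[mariadb]` section with a few lines of Python (a sketch; the inline example mirrors the snippet above, and the real config path varies by distro, e.g. `/etc/mysql/my.cnf`):

```python
import configparser

# Inline example mirroring the [mariadb] section above; point
# read_string() at your real my.cnf contents instead.
EXAMPLE_CNF = """
[mariadb]
log-bin
server_id=1
log-basename=master1
binlog-format=row
"""

# allow_no_value is needed for bare keys like `log-bin`.
parser = configparser.ConfigParser(allow_no_value=True)
parser.read_string(EXAMPLE_CNF)

assert parser['mariadb']['binlog-format'] == 'row'
print('binlog-format is row')
```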
### Setting up sync_es.py

`sync_es.py` keeps the Elasticsearch indices updated by reading the binlog and pushing the changes to the ES indices.

- Make sure `es_sync_config.json` is configured with the user you granted the `REPLICATION` permissions
- Run `import_to_es.py` and copy the outputted JSON into the file specified by `save_loc` in your `es_sync_config.json`
- Run `sync_es.py` as-is *or*, for actual deployment, set it up as a service and run it, preferably as the system/root
  - Make sure `sync_es.py` runs within the venv with the right dependencies!

You're done! The script should now be feeding updates from the database to Elasticsearch.

Take note, however, that the specified ES index refresh interval is 30 seconds, which may feel like a long time in local development. Feel free to adjust it or [poke Elasticsearch yourself!](https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-refresh.html)
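If you ever need to write the position file by hand instead of copying the `import_to_es.py` output, the expected shape is a tiny JSON document (a sketch; the file/position values here are examples of typical `SHOW MASTER STATUS;` output, substitute your own):

```python
import json

# Example values as shown by `SHOW MASTER STATUS;` -- substitute
# the File and Position columns from your own server.
master_file = 'master1-bin.000002'
master_position = 892528513

# This is the shape sync_es.py expects in the file configured
# as "save_loc" in es_sync_config.json.
position = {'log_file': master_file, 'log_pos': master_position}

print(json.dumps(position))
```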
@@ -7,7 +7,7 @@ USE_EMAIL_VERIFICATION = False
USE_MYSQL = True

# Enable this once stat integration is done
ENABLE_SHOW_STATS = True

BASE_DIR = os.path.abspath(os.path.dirname(__file__))
if USE_MYSQL:
@@ -21,7 +21,6 @@ SECRET_KEY = '***'

# Prefix for running multiple sites, user table will not be prefixed.
SITE_FLAVOR = 'nyaa'  # 'nyaa' or 'sukebei'

# for recaptcha and email verification:
# keys for localhost. Change as appropriate when actual domain is registered.
@@ -33,16 +32,16 @@ MAIL_FROM_ADDRESS = '***'
SMTP_USERNAME = '***'
SMTP_PASSWORD = '***'

# What the site identifies itself as. This affects templates, not database stuff.
SITE_NAME = 'Nyaa'

# The maximum number of files a torrent can contain
# until the site says "Too many files to display."
MAX_FILES_VIEW = 1000

#
# Setting to make sure main announce url is present in torrent
#
ENFORCE_MAIN_ANNOUNCE_URL = False
MAIN_ANNOUNCE_URL = ''

@@ -51,10 +50,11 @@ BACKUP_TORRENT_FOLDER = 'torrents'
#
# Search Options
#
RESULTS_PER_PAGE = 75

# See README.MD on Elasticsearch setup
USE_ELASTIC_SEARCH = False
ENABLE_ELASTIC_SEARCH_HIGHLIGHT = False
# Max ES search results, do not set over 10000
ES_MAX_SEARCH_RESULT = 1000
ES_INDEX_NAME = SITE_FLAVOR  # we create indices named nyaa or sukebei
@@ -1,11 +0,0 @@ (file deleted)
{
    "save_loc": "/tmp/pos.json",
    "mysql_host": "127.0.0.1",
    "mysql_port": 13306,
    "mysql_user": "root",
    "mysql_password": "dunnolol",
    "database": "nyaav2",
    "internal_queue_depth": 10000,
    "es_chunk_size": 10000,
    "flush_interval": 5
}
db_create.py

@@ -1,35 +1,60 @@
#!/usr/bin/env python3
import sys
import sqlalchemy
from nyaa import app, db, models

NYAA_CATEGORIES = [
    ('Anime', ['Anime Music Video', 'English-translated', 'Non-English-translated', 'Raw']),
    ('Audio', ['Lossless', 'Lossy']),
    ('Literature', ['English-translated', 'Non-English-translated', 'Raw']),
    ('Live Action', ['English-translated', 'Idol/Promotional Video', 'Non-English-translated', 'Raw']),
    ('Pictures', ['Graphics', 'Photos']),
    ('Software', ['Applications', 'Games']),
]

SUKEBEI_CATEGORIES = [
    ('Art', ['Anime', 'Doujinshi', 'Games', 'Manga', 'Pictures']),
    ('Real Life', ['Photobooks / Pictures', 'Videos']),
]


def add_categories(categories, main_class, sub_class):
    for main_cat_name, sub_cat_names in categories:
        main_cat = main_class(name=main_cat_name)
        for i, sub_cat_name in enumerate(sub_cat_names):
            # Composite keys can't autoincrement, set sub_cat id manually (1-index)
            sub_cat = sub_class(id=i+1, name=sub_cat_name, main_category=main_cat)
        db.session.add(main_cat)


if __name__ == '__main__':
    # Test for the user table, assume db is empty if it's not created
    database_empty = False
    try:
        models.User.query.first()
    except (sqlalchemy.exc.ProgrammingError, sqlalchemy.exc.OperationalError):
        database_empty = True

    print('Creating all tables...')
    db.create_all()

    nyaa_category_test = models.NyaaMainCategory.query.first()
    if not nyaa_category_test:
        print('Adding Nyaa categories...')
        add_categories(NYAA_CATEGORIES, models.NyaaMainCategory, models.NyaaSubCategory)

    sukebei_category_test = models.SukebeiMainCategory.query.first()
    if not sukebei_category_test:
        print('Adding Sukebei categories...')
        add_categories(SUKEBEI_CATEGORIES, models.SukebeiMainCategory, models.SukebeiSubCategory)

    db.session.commit()

    if database_empty:
        print('Remember to run the following to mark the database up-to-date for Alembic:')
        print('./db_migrate.py stamp head')
        # Technically we should be able to do this here, but when you have
        # Flask-Migrate and Flask-SQA and everything... I didn't get it working.
@@ -1,5 +1,6 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import sys
from nyaa import app, db
from flask_script import Manager
from flask_migrate import Migrate, MigrateCommand

@@ -10,4 +11,7 @@ manager = Manager(app)
manager.add_command("db", MigrateCommand)

if __name__ == "__main__":
    # Patch sys.argv to default to 'db'
    sys.argv.insert(1, 'db')

    manager.run()
@@ -91,6 +91,8 @@ mappings:
        type: long
      seed_count:
        type: long
      comment_count:
        type: long
      # these ids are really only for filtering, thus keyword
      uploader_id:
        type: keyword
es_sync_config.example.json (new file)

@@ -0,0 +1,11 @@
{
    "save_loc": "/tmp/pos.json",
    "mysql_host": "127.0.0.1",
    "mysql_port": 3306,
    "mysql_user": "nyaa",
    "mysql_password": "some_password",
    "database": "nyaav2",
    "internal_queue_depth": 10000,
    "es_chunk_size": 10000,
    "flush_interval": 5
}
@@ -5,22 +5,16 @@ which is assumed to already exist.
This is a one-shot deal, so you'd either need to complement it
with a cron job or some binlog-reading thing (TODO)
"""
import sys
import json
from nyaa import app, db, models

from elasticsearch import Elasticsearch
from elasticsearch.client import IndicesClient
from elasticsearch import helpers
# This should be progressbar33
import progressbar

es = Elasticsearch(timeout=30)
ic = IndicesClient(es)

@@ -32,11 +26,11 @@ ic = IndicesClient(es)
# we don't want to reindex all the user's torrents just because they
# changed their name, and we don't really want to FTS search on the user anyway.
# Maybe it's more convenient to dereference though.
def mk_es(t, index_name):
    return {
        "_id": t.id,
        "_type": "torrent",
        "_index": index_name,
        "_source": {
            # we're also indexing the id as a number so you can
            # order by it. seems like this is just equivalent to

@@ -51,6 +45,7 @@ def mk_es(t):
            "uploader_id": t.uploader_id,
            "main_category_id": t.main_category_id,
            "sub_category_id": t.sub_category_id,
            "comment_count": t.comment_count,
            # XXX all the bitflags are numbers
            "anonymous": bool(t.anonymous),
            "trusted": bool(t.trusted),

@@ -72,7 +67,7 @@ def mk_es(t):
# page through an sqlalchemy query, like the per_fetch but
# doesn't break the eager joins its doing against the stats table.
# annoying that this isn't built in somehow.
def page_query(query, limit=sys.maxsize, batch_size=10000, progress_bar=None):
    start = 0
    while True:
        # XXX very inelegant way to do this, i'm confus

@@ -88,13 +83,46 @@ def page_query(query, limit=sys.maxsize, batch_size=10000):
            yield(thing)
        if not had_things or stop == limit:
            break
        if progress_bar:
            progress_bar.update(start)
        start = min(limit, start + batch_size)


FLAVORS = [
    ('nyaa', models.NyaaTorrent),
    ('sukebei', models.SukebeiTorrent)
]

# Get binlog status from mysql
master_status = db.engine.execute('SHOW MASTER STATUS;').fetchone()

position_json = {
    'log_file': master_status[0],
    'log_pos': master_status[1]
}

print('Save the following in the file configured in your ES sync config JSON:')
print(json.dumps(position_json))

for flavor, torrent_class in FLAVORS:
    print('Importing torrents for index', flavor, 'from', torrent_class)
    bar = progressbar.ProgressBar(
        maxval=torrent_class.query.count(),
        widgets=[progressbar.SimpleProgress(),
                 ' [', progressbar.Timer(), '] ',
                 progressbar.Bar(),
                 ' (', progressbar.ETA(), ') ',
                 ])

    # turn off refreshes while bulk loading
    ic.put_settings(body={'index': {'refresh_interval': '-1'}}, index=flavor)

    bar.start()
    helpers.bulk(es, (mk_es(t, flavor) for t in page_query(torrent_class.query, progress_bar=bar)), chunk_size=10000)
    bar.finish()

    # Refresh the index immediately
    ic.refresh(index=flavor)
    print('Index refresh done.')

    # restore to near-enough real time
    ic.put_settings(body={'index': {'refresh_interval': '30s'}}, index=flavor)
@@ -0,0 +1,52 @@ (new file)
"""Add comment_count to Torrent

Revision ID: 2bceb2cb4d7c
Revises: d0eeb8049623
Create Date: 2017-05-26 15:07:21.114331

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '2bceb2cb4d7c'
down_revision = 'd0eeb8049623'
branch_labels = None
depends_on = None

COMMENT_UPDATE_SQL = '''UPDATE {0}_torrents
    SET comment_count = (
        SELECT COUNT(*) FROM {0}_comments
        WHERE {0}_torrents.id = {0}_comments.torrent_id
    );'''


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('nyaa_torrents', sa.Column('comment_count', sa.Integer(), nullable=False))
    op.create_index(op.f('ix_nyaa_torrents_comment_count'), 'nyaa_torrents', ['comment_count'], unique=False)

    op.add_column('sukebei_torrents', sa.Column('comment_count', sa.Integer(), nullable=False))
    op.create_index(op.f('ix_sukebei_torrents_comment_count'), 'sukebei_torrents', ['comment_count'], unique=False)
    # ### end Alembic commands ###

    connection = op.get_bind()

    print('Updating comment counts on nyaa_torrents...')
    connection.execute(sa.sql.text(COMMENT_UPDATE_SQL.format('nyaa')))
    print('Done.')

    print('Updating comment counts on sukebei_torrents...')
    connection.execute(sa.sql.text(COMMENT_UPDATE_SQL.format('sukebei')))
    print('Done.')


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_nyaa_torrents_comment_count'), table_name='nyaa_torrents')
    op.drop_column('nyaa_torrents', 'comment_count')

    op.drop_index(op.f('ix_sukebei_torrents_comment_count'), table_name='sukebei_torrents')
    op.drop_column('sukebei_torrents', 'comment_count')
    # ### end Alembic commands ###
@@ -11,20 +11,19 @@ import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = '3001f79b7722'
down_revision = '97ddefed1834'
branch_labels = None
depends_on = None

TABLE_PREFIXES = ('nyaa', 'sukebei')


def upgrade():
    for prefix in TABLE_PREFIXES:
        op.add_column(prefix + '_torrents', sa.Column('uploader_ip', sa.Binary(), nullable=True))
    # ### end Alembic commands ###


def downgrade():
    for prefix in TABLE_PREFIXES:
        op.drop_column(prefix + '_torrents', 'uploader_ip')
166
migrations/versions/97ddefed1834_initial_database_state.py
Normal file
166
migrations/versions/97ddefed1834_initial_database_state.py
Normal file
|
@ -0,0 +1,166 @@
"""Initial database state

Revision ID: 97ddefed1834
Revises:
Create Date: 2017-05-26 18:46:14.440040

"""
from alembic import op
import sqlalchemy as sa
import sqlalchemy_utils
from sqlalchemy.dialects import mysql

# revision identifiers, used by Alembic.
revision = '97ddefed1834'
down_revision = None
branch_labels = None
depends_on = None

TABLE_PREFIXES = ('nyaa', 'sukebei')


def upgrade():
    # Shared tables
    op.create_table('users',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('username', sa.String(length=32, collation='ascii_general_ci'), nullable=False),
        sa.Column('email', sqlalchemy_utils.types.email.EmailType(length=255), nullable=True),

        # These are actually PasswordType, UserStatusType and UserLevelType,
        # but database-wise binary and integers are what's being used
        sa.Column('password_hash', sa.Binary(length=255), nullable=False),
        sa.Column('status', sa.Integer(), nullable=False),
        sa.Column('level', sa.Integer(), nullable=False),

        sa.Column('created_time', sa.DateTime(), nullable=True),
        sa.Column('last_login_date', sa.DateTime(), nullable=True),
        sa.Column('last_login_ip', sa.Binary(), nullable=True),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('email'),
        sa.UniqueConstraint('username')
    )

    op.create_table('trackers',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('uri', sa.String(length=255, collation='utf8_general_ci'), nullable=False),
        sa.Column('disabled', sa.Boolean(), nullable=False),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('uri')
    )

    # Nyaa and Sukebei
    for prefix in TABLE_PREFIXES:
        # Main categories
        op.create_table(prefix + '_main_categories',
            sa.Column('id', sa.Integer(), nullable=False),
            sa.Column('name', sa.String(length=64), nullable=False),
            sa.PrimaryKeyConstraint('id')
        )
        # Sub categories
        op.create_table(prefix + '_sub_categories',
            sa.Column('id', sa.Integer(), nullable=False),
            sa.Column('main_category_id', sa.Integer(), nullable=False),
            sa.Column('name', sa.String(length=64), nullable=False),
            sa.ForeignKeyConstraint(['main_category_id'], [prefix + '_main_categories.id'], ),
            sa.PrimaryKeyConstraint('id', 'main_category_id')
        )
        # Main torrent table
        op.create_table(prefix + '_torrents',
            sa.Column('id', sa.Integer(), nullable=False),
            sa.Column('info_hash', sa.BINARY(length=20), nullable=False),
            sa.Column('display_name', sa.String(length=255, collation='utf8_general_ci'), nullable=False),
            sa.Column('torrent_name', sa.String(length=255), nullable=False),
            sa.Column('information', sa.String(length=255), nullable=False),
            sa.Column('description', mysql.TEXT(collation='utf8mb4_bin'), nullable=False),
            sa.Column('filesize', sa.BIGINT(), nullable=False),
            sa.Column('encoding', sa.String(length=32), nullable=False),
            sa.Column('flags', sa.Integer(), nullable=False),
            sa.Column('uploader_id', sa.Integer(), nullable=True),
            sa.Column('has_torrent', sa.Boolean(), nullable=False),
            sa.Column('created_time', sa.DateTime(), nullable=False),
            sa.Column('updated_time', sa.DateTime(), nullable=False),
            sa.Column('main_category_id', sa.Integer(), nullable=False),
            sa.Column('sub_category_id', sa.Integer(), nullable=False),
            sa.Column('redirect', sa.Integer(), nullable=True),
            sa.ForeignKeyConstraint(['main_category_id', 'sub_category_id'], [prefix + '_sub_categories.main_category_id', prefix + '_sub_categories.id'], ),
            sa.ForeignKeyConstraint(['main_category_id'], [prefix + '_main_categories.id'], ),
            sa.ForeignKeyConstraint(['redirect'], [prefix + '_torrents.id'], ),
            sa.ForeignKeyConstraint(['uploader_id'], ['users.id'], ),
            sa.PrimaryKeyConstraint('id')
        )
        op.create_index(op.f('ix_' + prefix + '_torrents_display_name'), prefix + '_torrents', ['display_name'], unique=False)
        op.create_index(op.f('ix_' + prefix + '_torrents_filesize'), prefix + '_torrents', ['filesize'], unique=False)
        op.create_index(op.f('ix_' + prefix + '_torrents_flags'), prefix + '_torrents', ['flags'], unique=False)
        op.create_index(op.f('ix_' + prefix + '_torrents_info_hash'), prefix + '_torrents', ['info_hash'], unique=True)
        op.create_index(prefix + '_uploader_flag_idx', prefix + '_torrents', ['uploader_id', 'flags'], unique=False)

        # Statistics for torrents
        op.create_table(prefix + '_statistics',
            sa.Column('torrent_id', sa.Integer(), nullable=False),
            sa.Column('seed_count', sa.Integer(), nullable=False),
            sa.Column('leech_count', sa.Integer(), nullable=False),
            sa.Column('download_count', sa.Integer(), nullable=False),
            sa.Column('last_updated', sa.DateTime(), nullable=True),
            sa.ForeignKeyConstraint(['torrent_id'], [prefix + '_torrents.id'], ondelete='CASCADE'),
            sa.PrimaryKeyConstraint('torrent_id')
        )
        op.create_index(op.f('ix_' + prefix + '_statistics_download_count'), prefix + '_statistics', ['download_count'], unique=False)
        op.create_index(op.f('ix_' + prefix + '_statistics_leech_count'), prefix + '_statistics', ['leech_count'], unique=False)
        op.create_index(op.f('ix_' + prefix + '_statistics_seed_count'), prefix + '_statistics', ['seed_count'], unique=False)

        # Trackers relationships for torrents
        op.create_table(prefix + '_torrent_trackers',
            sa.Column('torrent_id', sa.Integer(), nullable=False),
            sa.Column('tracker_id', sa.Integer(), nullable=False),
            sa.Column('order', sa.Integer(), nullable=False),
            sa.ForeignKeyConstraint(['torrent_id'], [prefix + '_torrents.id'], ondelete='CASCADE'),
            sa.ForeignKeyConstraint(['tracker_id'], ['trackers.id'], ondelete='CASCADE'),
            sa.PrimaryKeyConstraint('torrent_id', 'tracker_id')
        )
        op.create_index(op.f('ix_' + prefix + '_torrent_trackers_order'), prefix + '_torrent_trackers', ['order'], unique=False)

        # Torrent filelists
        op.create_table(prefix + '_torrents_filelist',
            sa.Column('torrent_id', sa.Integer(), nullable=False),
            sa.Column('filelist_blob', mysql.MEDIUMBLOB(), nullable=True),
            sa.ForeignKeyConstraint(['torrent_id'], [prefix + '_torrents.id'], ondelete='CASCADE'),
            sa.PrimaryKeyConstraint('torrent_id'),
            mysql_row_format='COMPRESSED'
        )

        # Torrent info_dicts
        op.create_table(prefix + '_torrents_info',
            sa.Column('torrent_id', sa.Integer(), nullable=False),
            sa.Column('info_dict', mysql.MEDIUMBLOB(), nullable=True),
            sa.ForeignKeyConstraint(['torrent_id'], [prefix + '_torrents.id'], ondelete='CASCADE'),
            sa.PrimaryKeyConstraint('torrent_id'),
            mysql_row_format='COMPRESSED'
        )
    # ### end Alembic commands ###


def downgrade():
    # Note: this may fail. It's better to just drop all tables instead (or reset the database)

    # Nyaa and Sukebei
    for prefix in TABLE_PREFIXES:
        op.drop_table(prefix + '_torrents_info')
        op.drop_table(prefix + '_torrents_filelist')
        op.drop_index(op.f('ix_' + prefix + '_torrent_trackers_order'), table_name=prefix + '_torrent_trackers')
        op.drop_table(prefix + '_torrent_trackers')
        op.drop_index(op.f('ix_' + prefix + '_statistics_seed_count'), table_name=prefix + '_statistics')
        op.drop_index(op.f('ix_' + prefix + '_statistics_leech_count'), table_name=prefix + '_statistics')
        op.drop_index(op.f('ix_' + prefix + '_statistics_download_count'), table_name=prefix + '_statistics')
        op.drop_table(prefix + '_statistics')
        # Indexes must go before the table they live on
        op.drop_index(prefix + '_uploader_flag_idx', table_name=prefix + '_torrents')
        op.drop_index(op.f('ix_' + prefix + '_torrents_info_hash'), table_name=prefix + '_torrents')
        op.drop_index(op.f('ix_' + prefix + '_torrents_flags'), table_name=prefix + '_torrents')
        op.drop_index(op.f('ix_' + prefix + '_torrents_filesize'), table_name=prefix + '_torrents')
        op.drop_index(op.f('ix_' + prefix + '_torrents_display_name'), table_name=prefix + '_torrents')
        op.drop_table(prefix + '_torrents')
        op.drop_table(prefix + '_sub_categories')
        op.drop_table(prefix + '_main_categories')

    # Shared tables
    op.drop_table('users')
    op.drop_table('trackers')
    # ### end Alembic commands ###
@@ -15,34 +15,23 @@ down_revision = '3001f79b7722'
 branch_labels = None
 depends_on = None
 
+TABLE_PREFIXES = ('nyaa', 'sukebei')
+
 
 def upgrade():
-    # ### commands auto generated by Alembic - please adjust! ###
-    op.create_table('nyaa_comments',
-        sa.Column('id', sa.Integer(), nullable=False),
-        sa.Column('torrent_id', sa.Integer(), nullable=False),
-        sa.Column('user_id', sa.Integer(), nullable=True),
-        sa.Column('created_time', sa.DateTime(), nullable=True),
-        sa.Column('text', sa.String(length=255, collation='utf8mb4_bin'), nullable=False),
-        sa.ForeignKeyConstraint(['torrent_id'], ['nyaa_torrents.id'], ondelete='CASCADE'),
-        sa.ForeignKeyConstraint(['user_id'], ['users.id'], ondelete='CASCADE'),
-        sa.PrimaryKeyConstraint('id')
-    )
-    op.create_table('sukebei_comments',
-        sa.Column('id', sa.Integer(), nullable=False),
-        sa.Column('torrent_id', sa.Integer(), nullable=False),
-        sa.Column('user_id', sa.Integer(), nullable=True),
-        sa.Column('created_time', sa.DateTime(), nullable=True),
-        sa.Column('text', sa.String(length=255, collation='utf8mb4_bin'), nullable=False),
-        sa.ForeignKeyConstraint(['torrent_id'], ['sukebei_torrents.id'], ondelete='CASCADE'),
-        sa.ForeignKeyConstraint(['user_id'], ['users.id'], ondelete='CASCADE'),
-        sa.PrimaryKeyConstraint('id')
-    )
-    # ### end Alembic commands ###
+    for prefix in TABLE_PREFIXES:
+        op.create_table(prefix + '_comments',
+            sa.Column('id', sa.Integer(), nullable=False),
+            sa.Column('torrent_id', sa.Integer(), nullable=False),
+            sa.Column('user_id', sa.Integer(), nullable=True),
+            sa.Column('created_time', sa.DateTime(), nullable=True),
+            sa.Column('text', sa.String(length=255, collation='utf8mb4_bin'), nullable=False),
+            sa.ForeignKeyConstraint(['torrent_id'], [prefix + '_torrents.id'], ondelete='CASCADE'),
+            sa.ForeignKeyConstraint(['user_id'], ['users.id'], ondelete='CASCADE'),
+            sa.PrimaryKeyConstraint('id')
+        )
 
 
 def downgrade():
-    # ### commands auto generated by Alembic - please adjust! ###
-    op.drop_table('nyaa_comments')
-    op.drop_table('sukebei_comments')
-    # ### end Alembic commands ###
+    for prefix in TABLE_PREFIXES:
+        op.drop_table(prefix + '_comments')
@@ -22,6 +22,15 @@ if app.config['DEBUG']:
     app.config['DEBUG_TB_INTERCEPT_REDIRECTS'] = False
     toolbar = DebugToolbarExtension(app)
     app.logger.setLevel(logging.DEBUG)
+
+    # Forbid caching
+    @app.after_request
+    def forbid_cache(request):
+        request.headers['Cache-Control'] = 'no-cache, no-store, must-revalidate, max-age=0'
+        request.headers['Pragma'] = 'no-cache'
+        request.headers['Expires'] = '0'
+        return request
+
 else:
     app.logger.setLevel(logging.WARNING)
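The hook above can be sketched in isolation; here a plain dict stands in for the Flask response object (an assumption for illustration), showing exactly the headers the `after_request` hook attaches:

```python
# Minimal sketch of the cache-forbidding hook; a dict stands in for the
# Flask response and only the header assignments are mirrored.
def forbid_cache(headers):
    headers['Cache-Control'] = 'no-cache, no-store, must-revalidate, max-age=0'
    headers['Pragma'] = 'no-cache'
    headers['Expires'] = '0'
    return headers


headers = forbid_cache({})
```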
@@ -43,7 +43,7 @@ _username_validator = Regexp(
 
 
 class LoginForm(FlaskForm):
-    username = StringField('Username or email address', [DataRequired(), _username_validator])
+    username = StringField('Username or email address', [DataRequired()])
     password = PasswordField('Password', [DataRequired()])
448 nyaa/models.py
@@ -3,10 +3,13 @@ from enum import Enum, IntEnum
 from datetime import datetime, timezone
 from nyaa import app, db
 from nyaa.torrents import create_magnet
+
 from sqlalchemy import func, ForeignKeyConstraint, Index
+from sqlalchemy.ext import declarative
 from sqlalchemy_utils import ChoiceType, EmailType, PasswordType
-from werkzeug.security import generate_password_hash, check_password_hash
 from sqlalchemy_fulltext import FullText
 
+from werkzeug.security import generate_password_hash, check_password_hash
 from ipaddress import ip_address
 
 import re
@@ -17,7 +20,6 @@ from hashlib import md5
 
 if app.config['USE_MYSQL']:
     from sqlalchemy.dialects import mysql
-
     BinaryType = mysql.BINARY
     DescriptionTextType = mysql.TEXT
     MediumBlobType = mysql.MEDIUMBLOB
@@ -32,10 +34,36 @@ else:
     COL_UTF8MB4_BIN = None
     COL_ASCII_GENERAL_CI = 'NOCASE'
 
 
 # For property timestamps
 UTC_EPOCH = datetime.utcfromtimestamp(0)
 
 
+class DeclarativeHelperBase(object):
+    ''' This class eases our nyaa-sukebei shenanigans by automatically adjusting
+        __tablename__ and providing class methods for renaming references. '''
+    # See http://docs.sqlalchemy.org/en/latest/orm/extensions/declarative/api.html
+
+    __tablename_base__ = None
+    __flavor__ = None
+
+    @classmethod
+    def _table_prefix_string(cls):
+        return cls.__flavor__.lower() + '_'
+
+    @classmethod
+    def _table_prefix(cls, table_name):
+        return cls._table_prefix_string() + table_name
+
+    @classmethod
+    def _flavor_prefix(cls, table_name):
+        return cls.__flavor__ + table_name
+
+    @declarative.declared_attr
+    def __tablename__(cls):
+        return cls._table_prefix(cls.__tablename_base__)
+
+
 class TorrentFlags(IntEnum):
     NONE = 0
     ANONYMOUS = 1
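Stripped of SQLAlchemy, the helper's renaming scheme can be exercised on its own. The class methods below are copied from the diff; the two concrete flavor subclasses are hypothetical stand-ins (the real ones also mix in `db.Model` and the declarative `__tablename__` attribute):

```python
# Sketch of the table/class renaming helpers, without SQLAlchemy.
class DeclarativeHelperBase:
    __tablename_base__ = None
    __flavor__ = None

    @classmethod
    def _table_prefix_string(cls):
        return cls.__flavor__.lower() + '_'

    @classmethod
    def _table_prefix(cls, table_name):
        return cls._table_prefix_string() + table_name

    @classmethod
    def _flavor_prefix(cls, table_name):
        return cls.__flavor__ + table_name


class TorrentBase(DeclarativeHelperBase):
    __tablename_base__ = 'torrents'


# Hypothetical concrete flavors, one per site
class NyaaTorrent(TorrentBase):
    __flavor__ = 'Nyaa'


class SukebeiTorrent(TorrentBase):
    __flavor__ = 'Sukebei'


# The derived table name, as __tablename__ would compute it
tablename = NyaaTorrent._table_prefix(NyaaTorrent.__tablename_base__)
print(tablename)  # → nyaa_torrents
```

The same two helpers cover both naming axes: `_table_prefix` produces per-flavor table names, while `_flavor_prefix` produces per-flavor class names for relationship strings.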
@@ -46,16 +74,13 @@ class TorrentFlags(IntEnum):
     DELETED = 32
 
 
-DB_TABLE_PREFIX = app.config['TABLE_PREFIX']
-
-
-class Torrent(db.Model):
-    __tablename__ = DB_TABLE_PREFIX + 'torrents'
+class TorrentBase(DeclarativeHelperBase):
+    __tablename_base__ = 'torrents'
 
     id = db.Column(db.Integer, primary_key=True)
     info_hash = db.Column(BinaryType(length=20), unique=True, nullable=False, index=True)
-    display_name = db.Column(
-        db.String(length=255, collation=COL_UTF8_GENERAL_CI), nullable=False, index=True)
+    display_name = db.Column(db.String(length=255, collation=COL_UTF8_GENERAL_CI),
+                             nullable=False, index=True)
     torrent_name = db.Column(db.String(length=255), nullable=False)
     information = db.Column(db.String(length=255), nullable=False)
     description = db.Column(DescriptionTextType(collation=COL_UTF8MB4_BIN), nullable=False)
@@ -63,50 +88,95 @@ class Torrent(db.Model):
     filesize = db.Column(db.BIGINT, default=0, nullable=False, index=True)
     encoding = db.Column(db.String(length=32), nullable=False)
     flags = db.Column(db.Integer, default=0, nullable=False, index=True)
-    uploader_id = db.Column(db.Integer, db.ForeignKey('users.id'), nullable=True)
+
+    @declarative.declared_attr
+    def uploader_id(cls):
+        # Even though this is same for both tables, declarative requires this
+        return db.Column(db.Integer, db.ForeignKey('users.id'), nullable=True)
+
     uploader_ip = db.Column(db.Binary(length=16), default=None, nullable=True)
     has_torrent = db.Column(db.Boolean, nullable=False, default=False)
 
+    comment_count = db.Column(db.Integer, default=0, nullable=False, index=True)
+
     created_time = db.Column(db.DateTime(timezone=False), default=datetime.utcnow, nullable=False)
-    updated_time = db.Column(db.DateTime(timezone=False),
-                             default=datetime.utcnow, onupdate=datetime.utcnow, nullable=False)
+    updated_time = db.Column(db.DateTime(timezone=False), default=datetime.utcnow,
+                             onupdate=datetime.utcnow, nullable=False)
 
-    main_category_id = db.Column(db.Integer, db.ForeignKey(
-        DB_TABLE_PREFIX + 'main_categories.id'), nullable=False)
+    @declarative.declared_attr
+    def main_category_id(cls):
+        fk = db.ForeignKey(cls._table_prefix('main_categories.id'))
+        return db.Column(db.Integer, fk, nullable=False)
+
     sub_category_id = db.Column(db.Integer, nullable=False)
-    redirect = db.Column(db.Integer, db.ForeignKey(
-        DB_TABLE_PREFIX + 'torrents.id'), nullable=True)
 
-    __table_args__ = (
-        Index('uploader_flag_idx', 'uploader_id', 'flags'),
-        ForeignKeyConstraint(
-            ['main_category_id', 'sub_category_id'],
-            [DB_TABLE_PREFIX + 'sub_categories.main_category_id',
-             DB_TABLE_PREFIX + 'sub_categories.id']
-        ), {}
-    )
+    @declarative.declared_attr
+    def redirect(cls):
+        fk = db.ForeignKey(cls._table_prefix('torrents.id'))
+        return db.Column(db.Integer, fk, nullable=True)
 
-    user = db.relationship('User', uselist=False, back_populates='torrents')
-    main_category = db.relationship('MainCategory', uselist=False,
-                                    back_populates='torrents', lazy="joined")
-    sub_category = db.relationship('SubCategory', uselist=False, backref='torrents', lazy="joined",
-                                   primaryjoin=(
-                                       "and_(SubCategory.id == foreign(Torrent.sub_category_id), "
-                                       "SubCategory.main_category_id == Torrent.main_category_id)"))
-    info = db.relationship('TorrentInfo', uselist=False,
-                           cascade="all, delete-orphan", back_populates='torrent')
-    filelist = db.relationship('TorrentFilelist', uselist=False,
-                               cascade="all, delete-orphan", back_populates='torrent')
-    stats = db.relationship('Statistic', uselist=False,
-                            cascade="all, delete-orphan", back_populates='torrent', lazy='joined')
-    trackers = db.relationship('TorrentTrackers', uselist=True, cascade="all, delete-orphan",
-                               lazy='joined', order_by='TorrentTrackers.order')
-    comments = db.relationship('Comment', uselist=True,
-                               cascade="all, delete-orphan")
+    @declarative.declared_attr
+    def __table_args__(cls):
+        return (
+            Index(cls._table_prefix('uploader_flag_idx'), 'uploader_id', 'flags'),
+            ForeignKeyConstraint(
+                ['main_category_id', 'sub_category_id'],
+                [cls._table_prefix('sub_categories.main_category_id'),
+                 cls._table_prefix('sub_categories.id')]
+            ), {}
+        )
+
+    @declarative.declared_attr
+    def user(cls):
+        return db.relationship('User', uselist=False, back_populates=cls._table_prefix('torrents'))
+
+    @declarative.declared_attr
+    def main_category(cls):
+        return db.relationship(cls._flavor_prefix('MainCategory'), uselist=False,
+                               back_populates='torrents', lazy="joined")
+
+    @declarative.declared_attr
+    def sub_category(cls):
+        join_sql = ("and_({0}SubCategory.id == foreign({0}Torrent.sub_category_id), "
+                    "{0}SubCategory.main_category_id == {0}Torrent.main_category_id)")
+        return db.relationship(cls._flavor_prefix('SubCategory'), uselist=False,
+                               backref='torrents', lazy="joined",
+                               primaryjoin=join_sql.format(cls.__flavor__))
+
+    @declarative.declared_attr
+    def info(cls):
+        return db.relationship(cls._flavor_prefix('TorrentInfo'), uselist=False,
+                               cascade="all, delete-orphan", back_populates='torrent')
+
+    @declarative.declared_attr
+    def filelist(cls):
+        return db.relationship(cls._flavor_prefix('TorrentFilelist'), uselist=False,
+                               cascade="all, delete-orphan", back_populates='torrent')
+
+    @declarative.declared_attr
+    def stats(cls):
+        return db.relationship(cls._flavor_prefix('Statistic'), uselist=False,
+                               cascade="all, delete-orphan", back_populates='torrent',
+                               lazy='joined')
+
+    @declarative.declared_attr
+    def trackers(cls):
+        return db.relationship(cls._flavor_prefix('TorrentTrackers'), uselist=True,
+                               cascade="all, delete-orphan", lazy='joined',
+                               order_by=cls._flavor_prefix('TorrentTrackers.order'))
+
+    @declarative.declared_attr
+    def comments(cls):
+        return db.relationship(cls._flavor_prefix('Comment'), uselist=True,
+                               cascade="all, delete-orphan")
 
     def __repr__(self):
         return '<{0} #{1.id} \'{1.display_name}\' {1.filesize}b>'.format(type(self).__name__, self)
 
+    def update_comment_count(self):
+        self.comment_count = Comment.query.filter_by(torrent_id=self.id).count()
+        return self.comment_count
+
     @property
     def created_utc_timestamp(self):
         ''' Returns a UTC POSIX timestamp, as seconds '''
@@ -149,6 +219,8 @@ class Torrent(db.Model):
         if self.uploader_ip:
             return str(ip_address(self.uploader_ip))
 
+    # Flag getters and setters below
+
     @property
     def anonymous(self):
        return self.flags & TorrentFlags.ANONYMOUS
@@ -197,6 +269,8 @@ class Torrent(db.Model):
     def complete(self, value):
         self.flags = (self.flags & ~TorrentFlags.COMPLETE) | (value and TorrentFlags.COMPLETE)
 
+    # Class methods
+
     @classmethod
     def by_id(cls, id):
         return cls.query.get(id)
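The flag properties touched by the two hunks above follow one bitmask pattern: read with `&`, write by clearing the bit and OR-ing it back in when the value is truthy. A minimal sketch (the `COMPLETE` value and the `FlagHolder` class are assumptions for illustration; only `NONE`, `ANONYMOUS`, and `DELETED` appear in this diff):

```python
from enum import IntEnum


class TorrentFlags(IntEnum):
    NONE = 0
    ANONYMOUS = 1
    COMPLETE = 16  # assumed value; not shown in this diff
    DELETED = 32


class FlagHolder:
    ''' Stand-in for Torrent: just the flags bitmask and one property pair. '''

    def __init__(self):
        self.flags = 0

    @property
    def complete(self):
        # Truthy when the COMPLETE bit is set
        return self.flags & TorrentFlags.COMPLETE

    @complete.setter
    def complete(self, value):
        # Clear the bit, then OR it back in when value is truthy
        self.flags = (self.flags & ~TorrentFlags.COMPLETE) | (value and TorrentFlags.COMPLETE)
```

Packing the booleans into one indexed integer column lets queries like "uploads by user X that are not deleted" hit the composite `uploader_flag_idx` instead of scanning several columns.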
@@ -211,44 +285,57 @@ class Torrent(db.Model):
         return cls.by_info_hash(info_hash_bytes)
 
 
-class TorrentNameSearch(FullText, Torrent):
-    __fulltext_columns__ = ('display_name',)
-
-
-class TorrentFilelist(db.Model):
-    __tablename__ = DB_TABLE_PREFIX + 'torrents_filelist'
+class TorrentFilelistBase(DeclarativeHelperBase):
+    __tablename_base__ = 'torrents_filelist'
+
     __table_args__ = {'mysql_row_format': 'COMPRESSED'}
 
-    torrent_id = db.Column(db.Integer, db.ForeignKey(
-        DB_TABLE_PREFIX + 'torrents.id', ondelete="CASCADE"), primary_key=True)
+    @declarative.declared_attr
+    def torrent_id(cls):
+        fk = db.ForeignKey(cls._table_prefix('torrents.id'), ondelete="CASCADE")
+        return db.Column(db.Integer, fk, primary_key=True)
+
     filelist_blob = db.Column(MediumBlobType, nullable=True)
 
-    torrent = db.relationship('Torrent', uselist=False, back_populates='filelist')
+    @declarative.declared_attr
+    def torrent(cls):
+        return db.relationship(cls._flavor_prefix('Torrent'), uselist=False,
+                               back_populates='filelist')
 
 
-class TorrentInfo(db.Model):
-    __tablename__ = DB_TABLE_PREFIX + 'torrents_info'
+class TorrentInfoBase(DeclarativeHelperBase):
+    __tablename_base__ = 'torrents_info'
 
     __table_args__ = {'mysql_row_format': 'COMPRESSED'}
 
-    torrent_id = db.Column(db.Integer, db.ForeignKey(
-        DB_TABLE_PREFIX + 'torrents.id', ondelete="CASCADE"), primary_key=True)
+    @declarative.declared_attr
+    def torrent_id(cls):
+        return db.Column(db.Integer, db.ForeignKey(
+            cls._table_prefix('torrents.id'), ondelete="CASCADE"), primary_key=True)
+
     info_dict = db.Column(MediumBlobType, nullable=True)
 
-    torrent = db.relationship('Torrent', uselist=False, back_populates='info')
+    @declarative.declared_attr
+    def torrent(cls):
+        return db.relationship(cls._flavor_prefix('Torrent'), uselist=False, back_populates='info')
 
 
-class Statistic(db.Model):
-    __tablename__ = DB_TABLE_PREFIX + 'statistics'
+class StatisticBase(DeclarativeHelperBase):
+    __tablename_base__ = 'statistics'
 
-    torrent_id = db.Column(db.Integer, db.ForeignKey(
-        DB_TABLE_PREFIX + 'torrents.id', ondelete="CASCADE"), primary_key=True)
+    @declarative.declared_attr
+    def torrent_id(cls):
+        fk = db.ForeignKey(cls._table_prefix('torrents.id'), ondelete="CASCADE")
+        return db.Column(db.Integer, fk, primary_key=True)
+
     seed_count = db.Column(db.Integer, default=0, nullable=False, index=True)
     leech_count = db.Column(db.Integer, default=0, nullable=False, index=True)
     download_count = db.Column(db.Integer, default=0, nullable=False, index=True)
     last_updated = db.Column(db.DateTime(timezone=False))
 
-    torrent = db.relationship('Torrent', uselist=False, back_populates='stats')
+    @declarative.declared_attr
+    def torrent(cls):
+        return db.relationship(cls._flavor_prefix('Torrent'), uselist=False,
+                               back_populates='stats')
 
 
 class Trackers(db.Model):
@@ -264,30 +351,43 @@ class Trackers(db.Model):
         return cls.query.filter_by(uri=uri).first()
 
 
-class TorrentTrackers(db.Model):
-    __tablename__ = DB_TABLE_PREFIX + 'torrent_trackers'
+class TorrentTrackersBase(DeclarativeHelperBase):
+    __tablename_base__ = 'torrent_trackers'
 
-    torrent_id = db.Column(db.Integer, db.ForeignKey(
-        DB_TABLE_PREFIX + 'torrents.id', ondelete="CASCADE"), primary_key=True)
-    tracker_id = db.Column(db.Integer, db.ForeignKey(
-        'trackers.id', ondelete="CASCADE"), primary_key=True)
+    @declarative.declared_attr
+    def torrent_id(cls):
+        fk = db.ForeignKey(cls._table_prefix('torrents.id'), ondelete="CASCADE")
+        return db.Column(db.Integer, fk, primary_key=True)
+
+    @declarative.declared_attr
+    def tracker_id(cls):
+        fk = db.ForeignKey('trackers.id', ondelete="CASCADE")
+        return db.Column(db.Integer, fk, primary_key=True)
+
     order = db.Column(db.Integer, nullable=False, index=True)
 
-    tracker = db.relationship('Trackers', uselist=False, lazy='joined')
+    @declarative.declared_attr
+    def tracker(cls):
+        return db.relationship('Trackers', uselist=False, lazy='joined')
 
     @classmethod
     def by_torrent_id(cls, torrent_id):
         return cls.query.filter_by(torrent_id=torrent_id).order_by(cls.order.desc())
 
 
-class MainCategory(db.Model):
-    __tablename__ = DB_TABLE_PREFIX + 'main_categories'
+class MainCategoryBase(DeclarativeHelperBase):
+    __tablename_base__ = 'main_categories'
 
     id = db.Column(db.Integer, primary_key=True)
     name = db.Column(db.String(length=64), nullable=False)
 
-    sub_categories = db.relationship('SubCategory', back_populates='main_category')
-    torrents = db.relationship('Torrent', back_populates='main_category')
+    @declarative.declared_attr
+    def sub_categories(cls):
+        return db.relationship(cls._flavor_prefix('SubCategory'), back_populates='main_category')
+
+    @declarative.declared_attr
+    def torrents(cls):
+        return db.relationship(cls._flavor_prefix('Torrent'), back_populates='main_category')
 
     def get_category_ids(self):
         return (self.id, 0)
@ -301,18 +401,22 @@ class MainCategory(db.Model):
|
||||||
return cls.query.get(id)
|
return cls.query.get(id)
|
||||||
|
|
||||||
|
|
||||||
-class SubCategory(db.Model):
-    __tablename__ = DB_TABLE_PREFIX + 'sub_categories'
+class SubCategoryBase(DeclarativeHelperBase):
+    __tablename_base__ = 'sub_categories'
 
     id = db.Column(db.Integer, primary_key=True)
-    main_category_id = db.Column(db.Integer, db.ForeignKey(
-        DB_TABLE_PREFIX + 'main_categories.id'), primary_key=True)
+
+    @declarative.declared_attr
+    def main_category_id(cls):
+        fk = db.ForeignKey(cls._table_prefix('main_categories.id'))
+        return db.Column(db.Integer, fk, primary_key=True)
 
     name = db.Column(db.String(length=64), nullable=False)
 
-    main_category = db.relationship('MainCategory', uselist=False, back_populates='sub_categories')
-    # torrents = db.relationship('Torrent', back_populates='sub_category'),
-    #                            primaryjoin="and_(Torrent.sub_category_id == foreign(SubCategory.id), "
-    #                            "Torrent.main_category_id == SubCategory.main_category_id)")
+    @declarative.declared_attr
+    def main_category(cls):
+        return db.relationship(cls._flavor_prefix('MainCategory'), uselist=False,
+                               back_populates='sub_categories')
 
     def get_category_ids(self):
         return (self.main_category_id, self.id)
 
@@ -326,17 +430,27 @@ class SubCategory(db.Model):
         return cls.query.get((sub_cat_id, main_cat_id))
 
 
-class Comment(db.Model):
-    __tablename__ = DB_TABLE_PREFIX + 'comments'
+class CommentBase(DeclarativeHelperBase):
+    __tablename_base__ = 'comments'
 
     id = db.Column(db.Integer, primary_key=True)
-    torrent_id = db.Column(db.Integer, db.ForeignKey(
-        DB_TABLE_PREFIX + 'torrents.id', ondelete='CASCADE'), nullable=False)
-    user_id = db.Column(db.Integer, db.ForeignKey('users.id', ondelete='CASCADE'))
+
+    @declarative.declared_attr
+    def torrent_id(cls):
+        return db.Column(db.Integer, db.ForeignKey(
+            cls._table_prefix('torrents.id'), ondelete='CASCADE'), nullable=False)
+
+    @declarative.declared_attr
+    def user_id(cls):
+        return db.Column(db.Integer, db.ForeignKey('users.id', ondelete='CASCADE'))
 
     created_time = db.Column(db.DateTime(timezone=False), default=datetime.utcnow)
     text = db.Column(db.String(length=255, collation=COL_UTF8MB4_BIN), nullable=False)
 
-    user = db.relationship('User', uselist=False, back_populates='comments', lazy="joined")
+    @declarative.declared_attr
+    def user(cls):
+        return db.relationship('User', uselist=False,
+                               back_populates=cls._table_prefix('comments'), lazy="joined")
 
     def __repr__(self):
         return '<Comment %r>' % self.id
 
@@ -376,9 +490,11 @@ class User(db.Model):
     last_login_date = db.Column(db.DateTime(timezone=False), default=None, nullable=True)
     last_login_ip = db.Column(db.Binary(length=16), default=None, nullable=True)
 
-    torrents = db.relationship('Torrent', back_populates='user', lazy='dynamic')
-    comments = db.relationship('Comment', back_populates='user', lazy='dynamic')
-    # session = db.relationship('Session', uselist=False, back_populates='user')
+    nyaa_torrents = db.relationship('NyaaTorrent', back_populates='user', lazy='dynamic')
+    nyaa_comments = db.relationship('NyaaComment', back_populates='user', lazy='dynamic')
+    sukebei_torrents = db.relationship('SukebeiTorrent', back_populates='user', lazy='dynamic')
+    sukebei_comments = db.relationship('SukebeiComment', back_populates='user', lazy='dynamic')
 
     def __init__(self, username, email, password):
         self.username = username
 
@@ -470,20 +586,30 @@ class ReportStatus(IntEnum):
     INVALID = 2
 
 
-class Report(db.Model):
-    __tablename__ = DB_TABLE_PREFIX + 'reports'
+class ReportBase(DeclarativeHelperBase):
+    __tablename_base__ = 'reports'
 
     id = db.Column(db.Integer, primary_key=True)
-    torrent_id = db.Column(db.Integer, db.ForeignKey(
-        DB_TABLE_PREFIX + 'torrents.id', ondelete='CASCADE'))
-    user_id = db.Column(db.Integer, db.ForeignKey(
-        'users.id'))
     created_time = db.Column(db.DateTime(timezone=False), default=datetime.utcnow)
     reason = db.Column(db.String(length=255), nullable=False)
     status = db.Column(ChoiceType(ReportStatus, impl=db.Integer()), nullable=False)
 
-    user = db.relationship('User', uselist=False, lazy="joined")
-    torrent = db.relationship('Torrent', uselist=False, lazy="joined")
+    @declarative.declared_attr
+    def torrent_id(cls):
+        return db.Column(db.Integer, db.ForeignKey(
+            cls._table_prefix('torrents.id'), ondelete='CASCADE'), nullable=False)
+
+    @declarative.declared_attr
+    def user_id(cls):
+        return db.Column(db.Integer, db.ForeignKey('users.id'))
+
+    @declarative.declared_attr
+    def user(cls):
+        return db.relationship('User', uselist=False, lazy="joined")
+
+    @declarative.declared_attr
+    def torrent(cls):
+        return db.relationship(cls._flavor_prefix('Torrent'), uselist=False, lazy="joined")
 
     def __init__(self, torrent_id, user_id, reason):
         self.torrent_id = torrent_id
 
@@ -512,12 +638,130 @@ class Report(db.Model):
     def remove_reviewed(cls, id):
         return cls.query.filter(cls.torrent_id == id, cls.status == 0).delete()
 
 
-# class Session(db.Model):
-#     __tablename__ = 'sessions'
-#
-#     session_id = db.Column(db.Integer, primary_key=True)
-#     user_id = db.Column(db.Integer, db.ForeignKey('users.id'), nullable=False)
-#     login_ip = db.Column(db.Binary(length=16), nullable=True)
-#     login_date = db.Column(db.DateTime(timezone=False), nullable=True)
-#
-#     user = db.relationship('User', back_populates='session')
+# Actually declare our site-specific classes
+
+# Torrent
+class NyaaTorrent(TorrentBase, db.Model):
+    __flavor__ = 'Nyaa'
+
+
+class SukebeiTorrent(TorrentBase, db.Model):
+    __flavor__ = 'Sukebei'
+
+
+# Fulltext models for MySQL
+if app.config['USE_MYSQL']:
+    class NyaaTorrentNameSearch(FullText, NyaaTorrent):
+        __fulltext_columns__ = ('display_name',)
+        __table_args__ = {'extend_existing': True}
+
+    class SukebeiTorrentNameSearch(FullText, SukebeiTorrent):
+        __fulltext_columns__ = ('display_name',)
+        __table_args__ = {'extend_existing': True}
+else:
+    # Bogus classes for Sqlite
+    class NyaaTorrentNameSearch(object):
+        pass
+
+    class SukebeiTorrentNameSearch(object):
+        pass
+
+
+# TorrentFilelist
+class NyaaTorrentFilelist(TorrentFilelistBase, db.Model):
+    __flavor__ = 'Nyaa'
+
+
+class SukebeiTorrentFilelist(TorrentFilelistBase, db.Model):
+    __flavor__ = 'Sukebei'
+
+
+# TorrentInfo
+class NyaaTorrentInfo(TorrentInfoBase, db.Model):
+    __flavor__ = 'Nyaa'
+
+
+class SukebeiTorrentInfo(TorrentInfoBase, db.Model):
+    __flavor__ = 'Sukebei'
+
+
+# Statistic
+class NyaaStatistic(StatisticBase, db.Model):
+    __flavor__ = 'Nyaa'
+
+
+class SukebeiStatistic(StatisticBase, db.Model):
+    __flavor__ = 'Sukebei'
+
+
+# TorrentTrackers
+class NyaaTorrentTrackers(TorrentTrackersBase, db.Model):
+    __flavor__ = 'Nyaa'
+
+
+class SukebeiTorrentTrackers(TorrentTrackersBase, db.Model):
+    __flavor__ = 'Sukebei'
+
+
+# MainCategory
+class NyaaMainCategory(MainCategoryBase, db.Model):
+    __flavor__ = 'Nyaa'
+
+
+class SukebeiMainCategory(MainCategoryBase, db.Model):
+    __flavor__ = 'Sukebei'
+
+
+# SubCategory
+class NyaaSubCategory(SubCategoryBase, db.Model):
+    __flavor__ = 'Nyaa'
+
+
+class SukebeiSubCategory(SubCategoryBase, db.Model):
+    __flavor__ = 'Sukebei'
+
+
+# Comment
+class NyaaComment(CommentBase, db.Model):
+    __flavor__ = 'Nyaa'
+
+
+class SukebeiComment(CommentBase, db.Model):
+    __flavor__ = 'Sukebei'
+
+
+# Report
+class NyaaReport(ReportBase, db.Model):
+    __flavor__ = 'Nyaa'
+
+
+class SukebeiReport(ReportBase, db.Model):
+    __flavor__ = 'Sukebei'
+
+
+# Choose our defaults for models.Torrent etc
+if app.config['SITE_FLAVOR'] == 'nyaa':
+    Torrent = NyaaTorrent
+    TorrentFilelist = NyaaTorrentFilelist
+    TorrentInfo = NyaaTorrentInfo
+    Statistic = NyaaStatistic
+    TorrentTrackers = NyaaTorrentTrackers
+    MainCategory = NyaaMainCategory
+    SubCategory = NyaaSubCategory
+    Comment = NyaaComment
+    Report = NyaaReport
+
+    TorrentNameSearch = NyaaTorrentNameSearch
+elif app.config['SITE_FLAVOR'] == 'sukebei':
+    Torrent = SukebeiTorrent
+    TorrentFilelist = SukebeiTorrentFilelist
+    TorrentInfo = SukebeiTorrentInfo
+    Statistic = SukebeiStatistic
+    TorrentTrackers = SukebeiTorrentTrackers
+    MainCategory = SukebeiMainCategory
+    SubCategory = SukebeiSubCategory
+    Comment = SukebeiComment
+    Report = SukebeiReport
+
+    TorrentNameSearch = SukebeiTorrentNameSearch
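The pattern this commit introduces — one shared `*Base` class plus two thin `Nyaa*`/`Sukebei*` subclasses that only set `__flavor__` — can be sketched in isolation. This is a simplified, hypothetical stand-in (no SQLAlchemy; the real `DeclarativeHelperBase` feeds these prefixes into `declared_attr` columns and relationships):

```python
# Minimal sketch of the flavor/prefix helpers used by the model bases above.
# Hypothetical and simplified: real code builds SQLAlchemy declarative models.
class DeclarativeHelperBase:
    __flavor__ = None          # 'Nyaa' or 'Sukebei', set by concrete subclasses
    __tablename_base__ = None  # e.g. 'comments'

    @classmethod
    def _table_prefix(cls, name):
        # 'Nyaa' flavor + 'comments' -> 'nyaa_comments'
        return cls.__flavor__.lower() + '_' + name

    @classmethod
    def _flavor_prefix(cls, name):
        # 'Nyaa' flavor + 'Torrent' -> 'NyaaTorrent' (for relationship targets)
        return cls.__flavor__ + name


class CommentBase(DeclarativeHelperBase):
    __tablename_base__ = 'comments'

    @classmethod
    def tablename(cls):
        return cls._table_prefix(cls.__tablename_base__)


class NyaaComment(CommentBase):
    __flavor__ = 'Nyaa'


class SukebeiComment(CommentBase):
    __flavor__ = 'Sukebei'


print(NyaaComment.tablename())      # nyaa_comments
print(SukebeiComment.tablename())   # sukebei_comments
```

Because each concrete class derives its table name and relationship targets from `__flavor__`, both sites share one schema definition while keeping fully separate tables.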
@@ -51,7 +51,7 @@ def redirect_url():
 
 
 @app.template_global()
-def static_cachebuster(static_filename):
+def static_cachebuster(filename):
     ''' Adds a ?t=<mtime> cachebuster to the given path, if the file exists.
         Results are cached in memory and persist until app restart! '''
     # Instead of timestamps, we could use commit hashes (we already load it in __init__)
@@ -60,19 +60,18 @@ def static_cachebuster(static_filename):
 
     if app.debug:
         # Do not bust cache on debug (helps debugging)
-        return static_filename
+        return flask.url_for('static', filename=filename)
 
     # Get file mtime if not already cached.
-    if static_filename not in _static_cache:
-        file_path = os.path.join(app.config['BASE_DIR'], 'nyaa', static_filename[1:])
+    if filename not in _static_cache:
+        file_path = os.path.join(app.static_folder, filename)
+        file_mtime = None
         if os.path.exists(file_path):
             file_mtime = int(os.path.getmtime(file_path))
-            _static_cache[static_filename] = static_filename + '?t=' + str(file_mtime)
-        else:
-            # Throw a warning?
-            _static_cache[static_filename] = static_filename
 
-    return _static_cache[static_filename]
+        _static_cache[filename] = file_mtime
+
+    return flask.url_for('static', filename=filename, t=_static_cache[filename])
 
 
 @app.template_global()
@@ -657,9 +656,10 @@ def view_torrent(torrent_id):
                                  text=comment_text)
 
         db.session.add(comment)
-        db.session.commit()
+        db.session.flush()
 
-        torrent_count = models.Comment.query.filter_by(torrent_id=torrent.id).count()
+        torrent_count = torrent.update_comment_count()
+        db.session.commit()
 
         flask.flash('Comment successfully posted.', 'success')
 
@@ -687,6 +687,9 @@ def view_torrent(torrent_id):
 def delete_comment(torrent_id, comment_id):
     if not flask.g.user:
         flask.abort(403)
+    torrent = models.Torrent.by_id(torrent_id)
+    if not torrent:
+        flask.abort(404)
 
     comment = models.Comment.query.filter_by(id=comment_id).first()
     if not comment:
@@ -696,6 +699,8 @@ def delete_comment(torrent_id, comment_id):
         flask.abort(403)
 
     db.session.delete(comment)
+    db.session.flush()
+    torrent.update_comment_count()
     db.session.commit()
 
     flask.flash('Comment successfully deleted.', 'success')
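Both handlers flush the pending insert or delete before recomputing the denormalized `comment_count`, then commit once; without the flush the recount would not see the change inside the same transaction. A toy illustration of that ordering, with an in-memory stand-in for the SQLAlchemy session (hypothetical names):

```python
# Sketch of why flush-before-recount matters. FakeSession is a hypothetical
# stand-in: real code calls db.session.flush() / db.session.commit().
class FakeSession:
    def __init__(self):
        self.rows = []      # rows visible to queries
        self.pending = []   # added but not yet flushed

    def add(self, row):
        self.pending.append(row)

    def flush(self):
        # Make pending rows visible to queries without ending the transaction.
        self.rows.extend(self.pending)
        self.pending.clear()


session = FakeSession()
session.add({'torrent_id': 1, 'text': 'nice release'})

# A recount issued before the flush misses the new comment:
count_before_flush = len([r for r in session.rows if r['torrent_id'] == 1])
session.flush()
count_after_flush = len([r for r in session.rows if r['torrent_id'] == 1])
print(count_before_flush, count_after_flush)  # 0 1
```

Committing only after the recount also keeps the comment row and the updated counter in one transaction, so the stored count cannot drift from the real number of rows.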
@@ -25,6 +25,7 @@ def search_elastic(term='', user=None, sort='id', order='desc',
         'id': 'id',
         'size': 'filesize',
         # 'name': 'display_name',  # This is slow and buggy
+        'comments': 'comment_count',
         'seeders': 'seed_count',
         'leechers': 'leech_count',
         'downloads': 'download_count'
@@ -190,6 +191,7 @@ def search_db(term='', user=None, sort='id', order='desc', category='0_0',
         'size': models.Torrent.filesize,
         # Disable this because we disabled this in search_elastic, for the sake of consistency:
         # 'name': models.Torrent.display_name,
+        'comments': models.Torrent.comment_count,
         'seeders': models.Statistic.seed_count,
         'leechers': models.Statistic.leech_count,
         'downloads': models.Statistic.download_count
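The new `'comments'` entry joins an explicit whitelist that maps user-supplied sort names to internal columns, so unknown keys never reach the query builder. A sketch of that lookup (illustrative string values standing in for real columns):

```python
# Whitelist of user-facing sort names -> internal column names, mirroring the
# dicts above (values here are illustrative strings, not real ORM columns).
SORT_KEYS = {
    'id': 'id',
    'size': 'filesize',
    'comments': 'comment_count',
    'seeders': 'seed_count',
    'leechers': 'leech_count',
    'downloads': 'download_count',
}


def resolve_sort(sort, order):
    # Reject anything not explicitly whitelisted before building a query.
    if sort not in SORT_KEYS or order not in ('asc', 'desc'):
        raise ValueError('Invalid sort')
    return SORT_KEYS[sort], order


print(resolve_sort('comments', 'desc'))  # ('comment_count', 'desc')
```

Keeping the two dicts (Elasticsearch and database paths) in sync, as the commented-out `'name'` entry notes, is what makes both backends accept the same set of sort parameters.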
@@ -62,6 +62,30 @@ table.torrent-list tbody tr td a:visited {
 	color: #1d4568;
 }
 
+/* comments count */
+table.torrent-list .hdr-comments {
+	border-left: hidden;
+	font-size: medium;
+}
+
+table.torrent-list .hdr-comments i {
+	margin-right: 6px;
+}
+
+table.torrent-list tbody .comments {
+	position: relative;
+	float: right;
+	border: 1px solid #d7d7d7;
+	border-radius: 3px;
+	color: #383838;
+	padding: 0 5px;
+	font-size: small;
+	background-color: #ffffff;
+}
+
+table.torrent-list tbody .comments i {
+	padding-right: 2px;
+}
+
 #torrent-description img {
 	max-width: 100%;
 }
@@ -6,9 +6,9 @@
 
     <meta name="viewport" content="width=device-width">
     <meta http-equiv="X-UA-Compatible" content="IE=edge">
-    <link rel="shortcut icon" type="image/png" href="/static/favicon.png">
-    <link rel="icon" type="image/png" href="/static/favicon.png">
-    <link rel="mask-icon" href="/static/pinned-tab.svg" color="#3582F7">
+    <link rel="shortcut icon" type="image/png" href="{{ url_for('static', filename='favicon.png') }}">
+    <link rel="icon" type="image/png" href="{{ url_for('static', filename='favicon.png') }}">
+    <link rel="mask-icon" href="{{ url_for('static', filename='pinned-tab.svg') }}" color="#3582F7">
     <link rel="alternate" type="application/rss+xml" href="{% if rss_filter %}{{ url_for('home', page='rss', _external=True, **rss_filter) }}{% else %}{{ url_for('home', page='rss', _external=True) }}{% endif %}" />
 
     <meta property="og:site_name" content="{{ config.SITE_NAME }}">
@@ -25,10 +25,10 @@
          make the navbar not look awful on tablets.
     -->
     {# These are extracted here for the dark mode toggle #}
-    {% set bootstrap_light = static_cachebuster('/static/css/bootstrap.min.css') %}
-    {% set bootstrap_dark = static_cachebuster('/static/css/bootstrap-dark.min.css') %}
+    {% set bootstrap_light = static_cachebuster('css/bootstrap.min.css') %}
+    {% set bootstrap_dark = static_cachebuster('css/bootstrap-dark.min.css') %}
     <link href="{{ bootstrap_light }}" rel="stylesheet" id="bsThemeLink">
-    <link href="{{ static_cachebuster('/static/css/bootstrap-xl-mod.css') }}" rel="stylesheet">
+    <link href="{{ static_cachebuster('css/bootstrap-xl-mod.css') }}" rel="stylesheet">
     <!--
       This theme changer script needs to be inline and right under the above stylesheet link to prevent FOUC (Flash Of Unstyled Content)
       Development version is commented out in static/js/main.js at the bottom of the file
@@ -38,15 +38,15 @@
     <link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.7.0/css/font-awesome.min.css" integrity="sha256-eZrrJcwDc/3uDhsdt61sL2oOBY362qM3lon1gyExkL0=" crossorigin="anonymous" />
 
     <!-- Custom styles for this template -->
-    <link href="{{ static_cachebuster('/static/css/main.css') }}" rel="stylesheet">
+    <link href="{{ static_cachebuster('css/main.css') }}" rel="stylesheet">
 
     <!-- Core JavaScript -->
     <script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.2.1/jquery.min.js" integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4=" crossorigin="anonymous"></script>
     <script src="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js" integrity="sha256-U5ZEeKfGNOja007MMD3YBI0A3OSZOQbeG6z2f2Y0hu8=" crossorigin="anonymous"></script>
     <script src="https://cdnjs.cloudflare.com/ajax/libs/commonmark/0.27.0/commonmark.min.js" integrity="sha256-10JreQhQG80GtKuzsioj0K46DlaB/CK/EG+NuG0q97E=" crossorigin="anonymous"></script>
     <!-- Modified to not apply border-radius to selectpickers and stuff so our navbar looks cool -->
-    <script src="{{ static_cachebuster('/static/js/bootstrap-select.js') }}"></script>
-    <script src="{{ static_cachebuster('/static/js/main.js') }}"></script>
+    <script src="{{ static_cachebuster('js/bootstrap-select.js') }}"></script>
+    <script src="{{ static_cachebuster('js/main.js') }}"></script>
 
     <!-- HTML5 shim and Respond.js for IE8 support of HTML5 elements and media queries -->
     <!--[if lt IE 9]>
@@ -65,21 +65,21 @@
           <span class="icon-bar"></span>
           <span class="icon-bar"></span>
         </button>
-        <a class="navbar-brand" href="/">{{ config.SITE_NAME }}</a>
+        <a class="navbar-brand" href="{{ url_for('home') }}">{{ config.SITE_NAME }}</a>
       </div>
       {% set search_username = (user.username + ("'" if user.username[-1] == 's' else "'s")) if user_page else None %}
       {% set search_placeholder = 'Search {} torrents...'.format(search_username) if user_page else 'Search...' %}
       <div id="navbar" class="navbar-collapse collapse">
         <ul class="nav navbar-nav">
-          <li {% if request.path == "/upload" %} class="active"{% endif %}><a href="/upload">Upload</a></li>
+          <li {% if request.path == url_for('upload') %}class="active"{% endif %}><a href="{{ url_for('upload') }}">Upload</a></li>
           <li class="dropdown">
             <a href="#" class="dropdown-toggle" data-toggle="dropdown" role="button" aria-haspopup="true" aria-expanded="false">
               About
               <span class="caret"></span>
             </a>
             <ul class="dropdown-menu">
-              <li {% if request.path == "/rules" %} class="active"{% endif %}><a href="/rules">Rules</a></li>
-              <li {% if request.path == "/help" %} class="active"{% endif %}><a href="/help">Help</a></li>
+              <li {% if request.path == url_for('site_rules') %}class="active"{% endif %}><a href="{{ url_for('site_rules') }}">Rules</a></li>
+              <li {% if request.path == url_for('site_help') %}class="active"{% endif %}><a href="{{ url_for('site_help') }}">Help</a></li>
             </ul>
           </li>
           <li><a href="{% if rss_filter %}{{ url_for('home', page='rss', **rss_filter) }}{% else %}{{ url_for('home', page='rss') }}{% endif %}">RSS</a></li>
@@ -119,13 +119,13 @@
                 </a>
               </li>
               <li>
-                <a href="/profile">
+                <a href="{{ url_for('profile') }}">
                   <i class="fa fa-gear fa-fw"></i>
                   Profile
                 </a>
               </li>
               <li>
-                <a href="/logout">
+                <a href="{{ url_for('logout') }}">
                   <i class="fa fa-times fa-fw"></i>
                   Logout
                 </a>
@@ -145,13 +145,13 @@
             </a>
             <ul class="dropdown-menu">
               <li>
-                <a href="/login">
+                <a href="{{ url_for('login') }}">
                   <i class="fa fa-sign-in fa-fw"></i>
                   Login
                 </a>
               </li>
               <li>
-                <a href="/register">
+                <a href="{{ url_for('register') }}">
                   <i class="fa fa-pencil fa-fw"></i>
                   Register
                 </a>
@@ -207,7 +207,7 @@
         {% if user_page %}
         <form class="navbar-form navbar-right form" action="{{ url_for('view_user', user_name=user.username) }}" method="get">
         {% else %}
-        <form class="navbar-form navbar-right form" action="/" method="get">
+        <form class="navbar-form navbar-right form" action="{{ url_for('home') }}" method="get">
         {% endif %}
 
           <input type="text" class="form-control" name="q" placeholder="{{ search_placeholder }}" value="{{ search["term"] if search is defined else '' }}">
@@ -243,7 +243,7 @@
         {% if user_page %}
         <form class="navbar-form navbar-right form" action="{{ url_for('view_user', user_name=user.username) }}" method="get">
         {% else %}
-        <form class="navbar-form navbar-right form" action="/" method="get">
+        <form class="navbar-form navbar-right form" action="{{ url_for('home') }}" method="get">
         {% endif %}
         <div class="input-group search-container hidden-xs hidden-sm">
           <input type="text" class="form-control search-bar" name="q" placeholder="{{ search_placeholder }}" value="{{ search["term"] if search is defined else '' }}">
@ -12,7 +12,7 @@
|
||||||
{% if special_results is defined and not search.user %}
|
{% if special_results is defined and not search.user %}
|
||||||
{% if special_results.first_word_user %}
|
{% if special_results.first_word_user %}
|
||||||
<div class="alert alert-info">
|
<div class="alert alert-info">
|
||||||
<a href="/user/{{ special_results.first_word_user.username }}{{ modify_query(q=special_results.query_sans_user)[1:] }}">Click here to see only results uploaded by {{ special_results.first_word_user.username }}</a>
|
<a href="{{ url_for('view_user', user_name=special_results.first_word_user.username) }}{{ modify_query(q=special_results.query_sans_user)[1:] }}">Click here to see only results uploaded by {{ special_results.first_word_user.username }}</a>
|
||||||
</div>
|
</div>
|
||||||
{% endif %}
|
{% endif %}
|
||||||
{% endif %}
|
{% endif %}
|
||||||
|
@ -28,6 +28,9 @@
|
||||||
{% call render_column_header("hdr-name", "width:auto;") %}
|
{% call render_column_header("hdr-name", "width:auto;") %}
|
||||||
<div>Name</div>
|
<div>Name</div>
|
||||||
{% endcall %}
|
{% endcall %}
|
||||||
|
{% call render_column_header("hdr-comments", "width:50px;", center_text=True, sort_key="comments", header_title="Comments") %}
|
||||||
|
<i class="fa fa-comments-o"></i>
|
||||||
|
{% endcall %}
|
||||||
{% call render_column_header("hdr-link", "width:70px;", center_text=True) %}
|
{% call render_column_header("hdr-link", "width:70px;", center_text=True) %}
|
||||||
<div>Link</div>
|
<div>Link</div>
|
||||||
{% endcall %}
|
{% endcall %}
|
||||||
|
@ -44,12 +47,10 @@
|
||||||
{% endcall %}
|
{% endcall %}
|
||||||
{% call render_column_header("hdr-leechers", "width:50px;", center_text=True, sort_key="leechers", header_title="Leeches") %}
|
{% call render_column_header("hdr-leechers", "width:50px;", center_text=True, sort_key="leechers", header_title="Leeches") %}
|
||||||
<i class="fa fa-arrow-down" aria-hidden="true"></i>
|
<i class="fa fa-arrow-down" aria-hidden="true"></i>
|
||||||
|
|
||||||
{% endcall %}
|
{% endcall %}
|
||||||
{% call render_column_header("hdr-downloads", "width:50px;", center_text=True, sort_key="downloads", header_title="Completed downloads") %}
|
{% call render_column_header("hdr-downloads", "width:50px;", center_text=True, sort_key="downloads", header_title="Completed downloads") %}
|
||||||
<i class="fa fa-check" aria-hidden="true"></i>
|
<i class="fa fa-check" aria-hidden="true"></i>
|
||||||
{% endcall %}
|
{% endcall %}
|
||||||
|
|
||||||
{% endif %}
|
{% endif %}
|
||||||
</tr>
|
</tr>
|
||||||
</thead>
|
</thead>
|
||||||
|
@@ -61,20 +62,29 @@
 {% set icon_dir = config.SITE_FLAVOR %}
 <td style="padding:0 4px;">
 {% if use_elastic %}
-<a href="/?c={{ cat_id }}" title="{{ category_name(cat_id) }}">
+<a href="{{ url_for('home', c=cat_id) }}" title="{{ category_name(cat_id) }}">
 {% else %}
-<a href="/?c={{ cat_id }}" title="{{ torrent.main_category.name }} - {{ torrent.sub_category.name }}">
+<a href="{{ url_for('home', c=cat_id) }}" title="{{ torrent.main_category.name }} - {{ torrent.sub_category.name }}">
 {% endif %}
-<img src="/static/img/icons/{{ icon_dir }}/{{ cat_id }}.png" alt="{{ category_name(cat_id) }}">
+<img src="{{ url_for('static', filename='img/icons/%s/%s.png'|format(icon_dir, cat_id)) }}" alt="{{ category_name(cat_id) }}">
 </a>
 </td>
-{% if use_elastic %}
-<td><a href="{{ url_for('view_torrent', torrent_id=torrent.meta.id) }}" title="{{ torrent.display_name | escape }}">{%if "highlight" in torrent.meta %}{{ torrent.meta.highlight.display_name[0] | safe }}{% else %}{{torrent.display_name}}{%endif%}</a></td>
-{% else %}
-<td><a href="{{ url_for('view_torrent', torrent_id=torrent.id) }}" title="{{ torrent.display_name | escape }}">{{ torrent.display_name | escape }}</a></td>
-{% endif %}
-<td style="white-space: nowrap;text-align: center;">
-{% if torrent.has_torrent %}<a href="{{ url_for('download_torrent', torrent_id=torrent.id) }}"><i class="fa fa-fw fa-download"></i></a>{% endif %}
+<td colspan="2">
+{% set torrent_id = torrent.meta.id if use_elastic else torrent.id %}
+{% set com_count = torrent.comment_count %}
+{% if com_count %}
+<a href="{{ url_for('view_torrent', torrent_id=torrent_id, _anchor='comments') }}" class="comments" title="{{ '{c} comment{s}'.format(c=com_count, s='s' if com_count > 1 else '') }}">
+<i class="fa fa-comments-o"></i>{{ com_count -}}
+</a>
+{% endif %}
+{% if use_elastic %}
+<a href="{{ url_for('view_torrent', torrent_id=torrent_id) }}" title="{{ torrent.display_name | escape }}">{%if "highlight" in torrent.meta %}{{ torrent.meta.highlight.display_name[0] | safe }}{% else %}{{torrent.display_name}}{%endif%}</a>
+{% else %}
+<a href="{{ url_for('view_torrent', torrent_id=torrent_id) }}" title="{{ torrent.display_name | escape }}">{{ torrent.display_name | escape }}</a>
+{% endif %}
+</td>
+<td class="text-center" style="white-space: nowrap;">
+{% if torrent.has_torrent %}<a href="{{ url_for('download_torrent', torrent_id=torrent_id) }}"><i class="fa fa-fw fa-download"></i></a>{% endif %}
 {% if use_elastic %}
 <a href="{{ create_magnet_from_es_info(torrent.display_name, torrent.info_hash) }}"><i class="fa fa-fw fa-magnet"></i></a>
 {% else %}
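The template changes above replace hardcoded paths with Flask's `url_for`, so URLs follow the route definitions instead of being spelled out by hand. A minimal sketch of the behavior being relied on, using hypothetical routes that only mirror the endpoint names in the diff:

```python
# What the url_for swap buys: URLs derive from route definitions, extra
# keyword arguments become query parameters, and _anchor adds a fragment.
# These routes are illustrative stand-ins, not the real nyaa routes.
from flask import Flask, url_for

app = Flask(__name__)

@app.route('/')
def home():
    return 'home'

@app.route('/view/<int:torrent_id>')
def view_torrent(torrent_id):
    return 'view'

with app.test_request_context():
    # unknown keyword arguments become query parameters
    print(url_for('home', c='1_2'))  # /?c=1_2
    # path parameters are filled in; _anchor appends a fragment
    print(url_for('view_torrent', torrent_id=3, _anchor='comments'))  # /view/3#comments
```

This also means a future change to a route's path updates every templated link automatically.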
@@ -10,7 +10,7 @@
 <div class="panel-heading"{% if torrent.hidden %} style="background-color: darkgray;"{% endif %}>
 <h3 class="panel-title">
 {% if can_edit %}
-<a href="{{ request.url }}/edit" title="Edit torrent"><i class="fa fa-fw fa-pencil"></i></a>
+<a href="{{ url_for('edit_torrent', torrent_id=torrent.id) }}" title="Edit torrent"><i class="fa fa-fw fa-pencil"></i></a>
 {% endif %}
 {{ torrent.display_name }}
 </h3>
@@ -132,8 +132,7 @@
 </div>
 {% endif %}
-
-<div class="panel panel-default">
+<div id="comments" class="panel panel-default">
 <div class="panel-heading">
 <h3 class="panel-title">
 Comments - {{ comments | length }}
@@ -28,7 +28,7 @@ mysqlclient==1.3.10
 orderedset==2.0
 packaging==16.8
 passlib==1.7.1
-progressbar2==3.20.0
+progressbar33==2.4
 pycodestyle==2.3.1
 pycparser==2.17
 PyMySQL==0.7.11
95
sync_es.py
@@ -24,6 +24,10 @@ database into es, at the expense of redoing a (small) amount of indexing.
 This uses multithreading so we don't have to block on socket io (both binlog
 reading and es POSTing). asyncio soon™
 
+This script will exit on any sort of exception, so you'll want to use your
+supervisor's restart functionality, e.g. Restart=failure in systemd, or
+the poor man's `while true; do sync_es.py; sleep 1; done` in tmux.
+
 """
 from elasticsearch import Elasticsearch
 from elasticsearch.helpers import bulk, BulkIndexError
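The docstring addition above leans on an external supervisor to restart the script after any crash. A Python equivalent of the suggested shell one-liner, as a sketch (the function, its parameters, and the commented command are illustrative, not part of the repo):

```python
# Python take on the "poor man's supervisor": rerun a command whenever it
# exits nonzero, pausing between attempts. max_restarts is a hypothetical
# escape hatch so the loop can also terminate, mostly useful for testing.
import subprocess
import time

def supervise(cmd, delay=1, max_restarts=None):
    """Run cmd, restarting after nonzero exits; return the number of runs."""
    runs = 0
    while True:
        runs += 1
        if subprocess.run(cmd).returncode == 0:
            return runs  # clean exit: stop supervising
        if max_restarts is not None and runs > max_restarts:
            return runs  # give up after too many crashes
        time.sleep(delay)

# supervise(['python', 'sync_es.py'])
```

In production the systemd `Restart=on-failure` route is sturdier, since it survives the supervisor process itself dying.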
@@ -86,6 +90,7 @@ def reindex_torrent(t, index_name):
         "uploader_id": t['uploader_id'],
         "main_category_id": t['main_category_id'],
         "sub_category_id": t['sub_category_id'],
+        "comment_count": t['comment_count'],
         # XXX all the bitflags are numbers
         "anonymous": bool(f & TorrentFlags.ANONYMOUS),
         "trusted": bool(f & TorrentFlags.TRUSTED),
@@ -132,14 +137,31 @@ def delet_this(row, index_name):
             '_type': 'torrent',
             '_id': str(row['values']['id'])}
 
+# we could try to make this script robust to errors from es or mysql, but since
+# the only thing we can do is "clear state and retry", it's easier to leave
+# this to the supervisor. If we were carrying around heavier state in-process,
+# it'd be more worth it to handle errors ourselves.
+#
+# Apparently there's no setDefaultUncaughtExceptionHandler in threading, and
+# sys.excepthook is also broken, so this gives us the same
+# exit-if-anything-happens semantics.
+class ExitingThread(Thread):
+    def run(self):
+        try:
+            self.run_happy()
+        except:
+            log.exception("something happened")
+            # sys.exit only exits the thread, lame
+            import os
+            os._exit(1)
+
-class BinlogReader(Thread):
+class BinlogReader(ExitingThread):
     # write_buf is the Queue we communicate with
     def __init__(self, write_buf):
         Thread.__init__(self)
         self.write_buf = write_buf
 
-    def run(self):
+    def run_happy(self):
         with open(SAVE_LOC) as f:
             pos = json.load(f)
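The `ExitingThread` pattern added above can be exercised standalone: an uncaught exception in any worker thread tears down the whole process with exit code 1, so the supervisor restarts from a clean state. A self-contained sketch (the `Worker` class and logging setup are illustrative):

```python
# Exit-on-exception worker threads: os._exit() ends the whole process,
# whereas sys.exit() inside a thread would only end that thread.
import logging
import os
from threading import Thread

log = logging.getLogger(__name__)

class ExitingThread(Thread):
    def run(self):
        try:
            self.run_happy()
        except Exception:
            log.exception("something happened")
            # sys.exit() would only end this thread; os._exit ends the process
            os._exit(1)

class Worker(ExitingThread):
    def run_happy(self):
        raise RuntimeError("boom")

# Worker().start() here would log the traceback and terminate the
# interpreter with status 1 instead of leaving a half-dead process running.
```

Note that `os._exit` skips `atexit` handlers and buffered-output flushing, which is acceptable here precisely because the script keeps its durable state (the binlog position) on disk.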
@@ -228,7 +250,7 @@ class BinlogReader(Thread):
         else:
             raise Exception(f"unknown table {s.table}")
 
-class EsPoster(Thread):
+class EsPoster(ExitingThread):
     # read_buf is the queue of stuff to bulk post
     def __init__(self, read_buf, chunk_size=1000, flush_interval=5):
         Thread.__init__(self)
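The bulk error handling in `EsPoster` (further down in this hunk) swallows a failure only when every failed item is a `document_missing_exception`. That check can be extracted as a standalone predicate for illustration (the function name is hypothetical; the nested key layout matches what elasticsearch-py reports for failed `update` actions):

```python
# Decide whether a BulkIndexError is safe to ignore: only when every error
# item is a document_missing_exception (stats racing ahead of the doc itself).
def only_document_missing(errors):
    """Return True if every bulk error item is a document_missing_exception."""
    for e in errors:
        try:
            if e['update']['error']['type'] != 'document_missing_exception':
                return False
        except KeyError:
            # unexpected shape: treat it as a real error
            return False
    return True

# inside `except BulkIndexError as bie:` one would then do:
# if not only_document_missing(bie.errors):
#     raise
```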
@@ -236,59 +258,72 @@
         self.chunk_size = chunk_size
         self.flush_interval = flush_interval
 
-    def run(self):
+    def run_happy(self):
         es = Elasticsearch(timeout=30)
 
         last_save = time.time()
         since_last = 0
+        # XXX keep track of last posted position for save points, awkward
+        posted_log_file = None
+        posted_log_pos = None
 
         while True:
             actions = []
-            while len(actions) < self.chunk_size:
+            now = time.time()
+            # wait up to flush_interval seconds after starting the batch
+            deadline = now + self.flush_interval
+            while len(actions) < self.chunk_size and now < deadline:
+                timeout = deadline - now
                 try:
                     # grab next event from queue with metadata that creepily
                     # updates, surviving outside the scope of the loop
                     ((log_file, log_pos, timestamp), action) = \
-                        self.read_buf.get(block=True, timeout=self.flush_interval)
+                        self.read_buf.get(block=True, timeout=timeout)
                     actions.append(action)
+                    now = time.time()
                 except Empty:
                     # nothing new for the whole interval
                     break
 
-            if not actions:
-                # nothing to post
-                log.debug("no changes...")
-                continue
-
-            # XXX "time" to get histogram of no events per bulk
-            stats.timing('actions_per_bulk', len(actions))
-
-            try:
-                with stats.timer('post_bulk'):
-                    bulk(es, actions, chunk_size=self.chunk_size)
-            except BulkIndexError as bie:
-                # in certain cases where we're really out of sync, we update a
-                # stat when the torrent doc is, causing a "document missing"
-                # error from es, with no way to suppress that server-side.
-                # Thus ignore that type of error if it's the only problem
-                for e in bie.errors:
-                    try:
-                        if e['update']['error']['type'] != 'document_missing_exception':
-                            raise bie
-                    except KeyError:
-                        raise bie
-
-            # how far we're behind, wall clock
-            stats.gauge('process_latency', int((time.time() - timestamp) * 1000))
+            if actions:
+                # XXX "time" to get histogram of no events per bulk
+                stats.timing('actions_per_bulk', len(actions))
+
+                try:
+                    with stats.timer('post_bulk'):
+                        bulk(es, actions, chunk_size=self.chunk_size)
+                except BulkIndexError as bie:
+                    # in certain cases where we're really out of sync, we update a
+                    # stat when the torrent doc is, causing a "document missing"
+                    # error from es, with no way to suppress that server-side.
+                    # Thus ignore that type of error if it's the only problem
+                    for e in bie.errors:
+                        try:
+                            if e['update']['error']['type'] != 'document_missing_exception':
+                                raise bie
+                        except KeyError:
+                            raise bie
+
+                # how far we've gotten in the actual log
+                posted_log_file = log_file
+                posted_log_pos = log_pos
+
+                # how far we're behind, wall clock
+                stats.gauge('process_latency', int((time.time() - timestamp) * 1000))
+            else:
+                log.debug("no changes...")
 
             since_last += len(actions)
-            if since_last >= 10000 or (time.time() - last_save) > 10:
+            # TODO instead of this manual timeout loop, could move this to another queue/thread
+            if posted_log_file is not None and (since_last >= 10000 or (time.time() - last_save) > 10):
                 log.info(f"saving position {log_file}/{log_pos}, {time.time() - timestamp:,.3f} seconds behind")
                 with stats.timer('save_pos'):
                     with open(SAVE_LOC, 'w') as f:
-                        json.dump({"log_file": log_file, "log_pos": log_pos}, f)
+                        json.dump({"log_file": posted_log_file, "log_pos": posted_log_pos}, f)
                 last_save = time.time()
                 since_last = 0
+                posted_log_file = None
+                posted_log_pos = None
 
 # in-memory queue between binlog and es. The bigger it is, the more events we
 # can parse in memory while waiting for es to catch up, at the expense of heap.
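The reworked posting loop bounds each batch by both size and a wall-clock deadline, where the old code waited up to `flush_interval` per item. The core of that change, extracted into a standalone sketch (the function name is hypothetical):

```python
# Batch collector: stop at chunk_size items or when the deadline passes,
# whichever comes first. Each get() waits only for the time remaining.
import time
from queue import Queue, Empty

def collect_batch(q, chunk_size, flush_interval):
    """Drain up to chunk_size items from q, waiting at most flush_interval
    seconds in total (not per item, as the old loop effectively did)."""
    actions = []
    now = time.time()
    deadline = now + flush_interval
    while len(actions) < chunk_size and now < deadline:
        try:
            actions.append(q.get(block=True, timeout=deadline - now))
            now = time.time()
        except Empty:
            break  # nothing new for the rest of the interval
    return actions
```

Because `now < deadline` guards the loop, the `timeout` passed to `get()` is always positive, and a steady trickle of events can no longer postpone the flush indefinitely.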