Compare commits

...

46 Commits

SHA1 Message Date (CI status)
e3fce36c89 Bump version: 1.3.7 → 1.3.8 2025-08-10 17:27:14 +01:00 (Build and Publish Docker Image: successful in 1m29s)
6741fb6eb1 expiary check 2025-08-10 17:26:15 +01:00
0105b9539a Bump version: 1.3.6 → 1.3.7 2025-07-23 09:28:28 +01:00 (Build and Publish Docker Image: successful in 1m46s)
0f1080b196 rework config login and add update NPM function 2025-07-23 09:26:11 +01:00
5e8f1ee46d Bump version: 1.3.5 → 1.3.6 2025-07-19 23:19:34 +01:00 (Build and Publish Docker Image: successful in 1m31s)
f619644d59 cleanup 2025-07-19 23:19:31 +01:00
0e56ca16bb Bump version: 1.3.4 → 1.3.5 2025-07-19 23:14:09 +01:00 (Build and Publish Docker Image: successful in 1m31s)
b673a232b1 looking for extra urls 2025-07-19 23:13:48 +01:00
64b54e959e lets deploy and see 2025-07-19 11:56:30 +01:00
aeaac5fd4e extra url modifications 2025-07-19 11:31:16 +01:00
a78ca2d0b2 Bump version: 1.3.3 → 1.3.4 2025-07-19 10:39:16 +01:00 (Build and Publish Docker Image: successful in 1m34s)
317066703b cors 2025-07-19 10:39:08 +01:00
c82ac9350f Bump version: 1.3.2 → 1.3.3 2025-07-19 09:40:33 +01:00 (Build and Publish Docker Image: successful in 1m36s)
c8a9cae55e fix the imports 2025-07-19 09:40:31 +01:00
07d32958cb Bump version: 1.3.1 → 1.3.2 2025-07-19 09:32:53 +01:00 (Build and Publish Docker Image: successful in 1m35s)
984b3a62f9 dns on config page 2025-07-19 09:32:45 +01:00
050904dde5 Bump version: 1.3.0 → 1.3.1 2025-07-19 09:16:28 +01:00 (Build and Publish Docker Image: successful in 1m31s)
4303f568b3 modify dns in config 2025-07-19 09:16:21 +01:00
125e42e501 Bump version: 1.2.20 → 1.3.0 2025-07-19 09:05:34 +01:00 (Build and Publish Docker Image: successful in 1m42s)
2ef573d63c manual check on config page 2025-07-19 09:05:25 +01:00
b8eb97e8e2 Bump version: 1.2.19 → 1.2.20 2025-07-18 18:03:48 +01:00 (Build and Publish Docker Image: successful in 1m38s)
14490bd7d2 run at 10am 2025-07-18 18:03:43 +01:00
211afa3966 Bump version: 1.2.18 → 1.2.19 2025-07-18 17:51:52 +01:00 (Build and Publish Docker Image: successful in 1m41s)
c79a908281 extend notifications 2025-07-18 17:51:48 +01:00
93e9a1990a Bump version: 1.2.17 → 1.2.18 2025-07-18 17:36:16 +01:00 (Build and Publish Docker Image: successful in 1m32s)
2ef3aee412 fix the key 2025-07-18 17:36:12 +01:00
7f68c0b2ab Bump version: 1.2.16 → 1.2.17 2025-07-18 17:25:46 +01:00 (Build and Publish Docker Image: successful in 1m32s)
cc3648c897 import base64 2025-07-18 17:25:43 +01:00
809f1ebda5 Bump version: 1.2.15 → 1.2.16 2025-07-18 17:07:50 +01:00 (Build and Publish Docker Image: successful in 1m33s)
5ab44bd78d fix key logic 2025-07-18 17:07:46 +01:00
42831d294c Bump version: 1.2.14 → 1.2.15 2025-07-18 16:27:11 +01:00 (Build and Publish Docker Image: successful in 1m35s)
927bae0f3d notifications 2025-07-18 16:27:08 +01:00
87753785d4 Bump version: 1.2.13 → 1.2.14 2025-07-18 15:57:03 +01:00 (Build and Publish Docker Image: successful in 1m53s)
2ad7c34157 notifications....again 2025-07-18 15:56:59 +01:00
51138c9d50 Bump version: 1.2.12 → 1.2.13 2025-07-18 09:23:27 +01:00 (Build and Publish Docker Image: successful in 1m32s)
cbc1da11fc notification test 2025-07-18 09:23:24 +01:00
001cf43cd7 Bump version: 1.2.11 → 1.2.12 2025-07-17 19:06:05 +01:00 (Build and Publish Docker Image: successful in 1m27s)
c3ab41dd6a test notification 2025-07-17 19:06:03 +01:00
950da26958 Bump version: 1.2.10 → 1.2.11 2025-07-17 18:13:16 +01:00 (Build and Publish Docker Image: successful in 1m39s)
546c4fe6f6 notifications 2025-07-17 18:13:07 +01:00
6388ca7176 Bump version: 1.2.9 → 1.2.10 2025-07-17 17:49:53 +01:00 (Build and Publish Docker Image: successful in 1m27s)
19bfa3e4d6 another try 2025-07-17 17:49:50 +01:00
ac79f26373 Bump version: 1.2.8 → 1.2.9 2025-07-17 17:30:04 +01:00 (Build and Publish Docker Image: successful in 1m34s)
d66576b451 last fix 2025-07-17 17:30:02 +01:00
60190750ec Bump version: 1.2.7 → 1.2.8 2025-07-17 17:19:36 +01:00 (Build and Publish Docker Image: successful in 1m40s)
19c84fb900 loggin 2025-07-17 17:19:31 +01:00
18 changed files with 578 additions and 64 deletions

View File

@@ -1,5 +1,5 @@
[tool.bumpversion]
current_version = "1.2.7"
current_version = "1.3.8"
commit = true
tag = true
tag_name = "{new_version}"

.vscode/launch.json (vendored, 2 lines changed)
View File

@@ -17,7 +17,7 @@
"FLASK_APP": "ktvmanager.main:create_app",
"FLASK_ENV": "development",
"PYTHONPATH": "${workspaceFolder}",
"FLASK_RUN_PORT": "5001"
"FLASK_RUN_PORT": "5002"
},
"args": [
"run",

View File

@@ -1 +1 @@
1.2.7
1.3.8

View File

@@ -33,6 +33,9 @@ ENV FLASK_APP=ktvmanager.main:create_app
# Copy application code
COPY . .
# Debugging step to verify file presence
RUN ls -la /app/ktvmanager/lib
# Handle versioning
ARG VERSION
RUN if [ -n "$VERSION" ]; then echo $VERSION > VERSION; fi
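For reference, the VERSION build argument is presumably supplied at image build time by the CI workflow, e.g. docker build --build-arg VERSION=1.3.8 -t ktvmanager . (the image tag here is a placeholder); when the argument is absent, the guard leaves the VERSION file copied into the image untouched.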

generate_vapid_keys.py (new file, 21 lines)
View File

@@ -0,0 +1,21 @@
from py_vapid import Vapid
import os
vapid = Vapid()
vapid.generate_keys()
from cryptography.hazmat.primitives import serialization
private_key = vapid.private_key.private_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PrivateFormat.PKCS8,
encryption_algorithm=serialization.NoEncryption()
).decode('utf-8')
public_key = vapid.public_key.public_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PublicFormat.SubjectPublicKeyInfo
).decode('utf-8')
print(f"VAPID_PRIVATE_KEY='{private_key}'")
print(f"VAPID_PUBLIC_KEY='{public_key}'")

host_9_config.txt (new file, 72 lines)
View File

@@ -0,0 +1,72 @@
location ~ ^/Mongoose(.*)$ {
return 302 http://m3u.sstv.one:81/$1$is_args$args;
}
location ~ ^/Blazin(.*)$ {
return 302 http://blazin.dns-cloud.net:8080/$1$is_args$args;
}
location ~ ^/Insanity(.*)$ {
return 302 https://biglicks.win:443/$1$is_args$args;
}
location ~ ^/Badger(.*)$ {
return 302 http://hurricanetv.kiev.ua:80/$1$is_args$args;
}
location ~ ^/Gunslinger(.*)$ {
return 302 http://jabawalkies.club:8080/$1$is_args$args;
}
location ~ ^/KDB(.*)$ {
return 302 http://finger-ya-bum-hole.site/$1$is_args$args;
}
location ~ ^/Graphite(.*)$ {
return 302 http://sarahgraphite.liveme.vip:80/$1$is_args$args;
}
location ~ ^/old-Premium(.*)$ {
return 302 https://kwikfitfitter.life:443/$1$is_args$args;
}
location ~ ^/Gold(.*)$ {
return 302 http://server1.elitehosting.gq:8090/$1$is_args$args;
}
location ~ ^/Bravado(.*)$ {
return 302 http://le.thund.re/$1$is_args$args;
}
location ~ ^/Titan(.*)$ {
return 302 http://maximumorg.xyz:80/$1$is_args$args;
}
location ~ ^/Wolfie(.*)$ {
return 302 http://deviltv.fun:8080/$1$is_args$args;
}
location ~ ^/DiamondBack(.*)$ {
return 302 http://pro-media.live:2052/$1$is_args$args;
}
location ~ ^/Halo(.*)$ {
return 302 http://i-like-turtles.org:8080/$1$is_args$args;
}
location ~ ^/Nitro(.*)$ {
return 302 http://mr-beans-streams.xyz$1$is_args$args;
}
location ~ ^/Insanity(.*)$ {
return 302 https://biglicks.win:443/$1$is_args$args;
}
location ~ ^/Bonsai(.*)$ {
return 302 http://crazyservertimes.pro/$1$is_args$args;
}
location ~ ^/New-Prem(.*)$ {
return 302 http://hello.exodus-2.xyz:8080/$1$is_args$args;
}
location ~ ^/Crystal(.*)$ {
return 302 https://line.ottcst.com/$1$is_args$args;
}
location ~ ^/VIP(.*)$ {
return 302 https://1visions.co.uk:443/$1$is_args$args;
}
location ~ ^/WILD(.*)$ {
return 302 http://wildversion.com:8080/$1$is_args$args;
}
location ~ ^/STEST(.*)$ {
return 302 http://notwhatyourlookingfor.ru/$1$is_args$args;
}
location ~ ^/SPARE(.*)$ {
return 302 http://moontv.co.uk/$1$is_args$args;
}
location ~ ^/QUARTZ(.*)$ {
return 302 http://anyholeisagoal.ru/$1$is_args$args;
}

View File

@@ -3,7 +3,7 @@ import sys
from dotenv import load_dotenv
import mysql.connector
from datetime import datetime, timedelta
from routes.api import send_notification
from ktvmanager.lib.notifications import send_notification
from ktvmanager.lib.database import get_push_subscriptions, _execute_query
# Add the project root to the Python path
@@ -33,41 +33,50 @@ def get_all_accounts(db_connection: MySQLConnection) -> List[Dict[str, Any]]:
return accounts
def send_expiry_notifications() -> None:
def send_expiry_notifications(app) -> None:
"""
Sends notifications to users with accounts expiring in the next 30 days.
"""
now = datetime.now()
thirty_days_later = now + timedelta(days=30)
now_timestamp = int(now.timestamp())
thirty_days_later_timestamp = int(thirty_days_later.timestamp())
with app.app_context():
now = datetime.now()
thirty_days_later = now + timedelta(days=30)
now_timestamp = int(now.timestamp())
thirty_days_later_timestamp = int(thirty_days_later.timestamp())
query = """
SELECT u.id as user_id, ua.username, ua.expiaryDate
FROM users u
JOIN userAccounts ua ON u.id = ua.userID
WHERE ua.expiaryDate BETWEEN %s AND %s
"""
expiring_accounts = _execute_query(query, (now_timestamp, thirty_days_later_timestamp))
query = """
SELECT u.id as user_id, ua.username, ua.expiaryDate
FROM users u
JOIN userAccounts ua ON u.id = ua.userID
WHERE ua.expiaryDate BETWEEN %s AND %s
"""
expiring_accounts = _execute_query(query, (now_timestamp, thirty_days_later_timestamp))
for account in expiring_accounts:
user_id = account['user_id']
subscriptions = get_push_subscriptions(user_id)
for sub in subscriptions:
# Check if a notification has been sent recently
last_notified_query = "SELECT last_notified FROM push_subscriptions WHERE id = %s"
last_notified_result = _execute_query(last_notified_query, (sub['id'],))
last_notified = last_notified_result[0]['last_notified'] if last_notified_result and last_notified_result[0]['last_notified'] else None
for account in expiring_accounts:
expiry_date = datetime.fromtimestamp(account['expiaryDate'])
days_to_expiry = (expiry_date.date() - now.date()).days
if last_notified and last_notified > now - timedelta(days=1):
continue
if days_to_expiry == 30 or days_to_expiry == 7:
print(f"Found expiring account: {account['username']}")
user_id = account['user_id']
subscriptions = get_push_subscriptions(user_id)
for sub in subscriptions:
# Check if a notification has been sent recently
last_notified_query = "SELECT last_notified FROM push_subscriptions WHERE id = %s"
last_notified_result = _execute_query(last_notified_query, (sub['id'],))
last_notified = last_notified_result[0]['last_notified'] if last_notified_result and last_notified_result[0]['last_notified'] else None
message = f"Your account {account['username']} is due to expire on {datetime.fromtimestamp(account['expiaryDate']).strftime('%d-%m-%Y')}."
send_notification(sub['subscription_json'], message)
if last_notified and last_notified.date() == now.date():
continue
# Update the last notified timestamp
update_last_notified_query = "UPDATE push_subscriptions SET last_notified = %s WHERE id = %s"
_execute_query(update_last_notified_query, (now, sub['id']))
message = {
"title": "Account Expiry Warning",
"body": f"Your account {account['username']} is due to expire in {days_to_expiry} days."
}
send_notification(sub['subscription_json'], json.dumps(message))
# Update the last notified timestamp
update_last_notified_query = "UPDATE push_subscriptions SET last_notified = %s WHERE id = %s"
_execute_query(update_last_notified_query, (now, sub['id']))
def main() -> None:

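For clarity, the reworked check only notifies on exactly 30 or exactly 7 days before expiry, and at most once per day per subscription. A standalone illustration of the day arithmetic (the timestamp is made up):

from datetime import datetime, timedelta

now = datetime.now()
expiary_ts = int((now + timedelta(days=30)).timestamp())          # hypothetical expiaryDate value
days_to_expiry = (datetime.fromtimestamp(expiary_ts).date() - now.date()).days
print(days_to_expiry, days_to_expiry in (30, 7))                  # -> 30 True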
View File

@@ -14,15 +14,12 @@ class Config:
DATABASE = os.getenv("DATABASE")
DBPORT = os.getenv("DBPORT")
STREAM_URLS = ["http://example.com", "http://example.org"]
VAPID_PRIVATE_KEY = """-----BEGIN PRIVATE KEY-----
MIGHAgEAMBMGByqGSM49AgEGCCqGSM49AwEHBG0wawIBAQQg6vkDnUOnpMUZ+DAv
gEge20aPDmffv1rYTADnaNP5NvGhRANCAATZvXvlV0QyvzvgOdsEMSt07n5qgbBn
ICQ0s1x364rGswAcVVJuu8q5XgZQrBLk/lkhQBcyyuuAjc4OvJLADqEk
-----END PRIVATE KEY-----"""
VAPID_PRIVATE_KEY = "MIGHAgEAMBMGByqGSM49AgEGCCqGSM49AwEHBG0wawIBAQQg6vkDnUOnpMUZ+DAvgEge20aPDmffv1rYTADnaNP5NvGhRANCAATZvXvlV0QyvzvgOdsEMSt07n5qgbBnICQ0s1x364rGswAcVVJuu8q5XgZQrBLk/lkhQBcyyuuAjc4OvJLADqEk"
VAPID_PUBLIC_KEY = """-----BEGIN PUBLIC KEY-----
MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAE2b175VdEMr874DnbBDErdO5+aoGw
ZyAkNLNcd+uKxrMAHFVSbrvKuV4GUKwS5P5ZIUAXMsrrgI3ODrySwA6hJA==
-----END PUBLIC KEY-----"""
VAPID_CLAIM_EMAIL = "mailto:karl@k-world.me.uk" # Replace with your email
SECRET_KEY = "a_very_secret_key"
class DevelopmentConfig(Config):

View File

@@ -1,6 +1,7 @@
from functools import wraps
from flask import request, jsonify, Blueprint, Response
from typing import Callable, Any, Tuple, Dict
from .database import get_user_id_from_username
auth_blueprint = Blueprint("auth", __name__)
@@ -57,4 +58,5 @@ def check_login(username: str, password: str) -> Response:
Returns:
A Flask JSON response indicating success.
"""
return jsonify({"auth": "Success"})
user_id = get_user_id_from_username(username)
return jsonify({"auth": "Success", "user_id": user_id, "username": username})

View File

@@ -138,6 +138,16 @@ def validate_account() -> Tuple[Response, int]:
200,
)
# Check if account is expired
exp_date_str = result["data"]["user_info"]["exp_date"]
if exp_date_str:
from datetime import datetime, timezone
exp_date = datetime.fromtimestamp(int(exp_date_str), tz=timezone.utc)
current_date = datetime.now(timezone.utc)
if current_date > exp_date:
return jsonify({"message": "Account is expired", "data": result}), 401
return jsonify({"message": "Account is valid", "data": result}), 200
else:
return jsonify({"message": "Account is invalid"}), 401

View File

@@ -114,6 +114,17 @@ def get_stream_names() -> Response:
return jsonify(stream_names)
def get_all_stream_urls() -> Response:
"""Retrieves all stream names and URLs from the database.
Returns:
A Flask JSON response containing a list of stream names and URLs.
"""
query = "SELECT DISTINCT SUBSTRING_INDEX(stream, ' ', 1) AS streamName, streamURL FROM userAccounts"
results = _execute_query(query)
return jsonify(results)
def single_check() -> Response | Tuple[Response, int]:
"""
Performs a check on a single account provided in the request JSON.
@@ -222,14 +233,29 @@ def save_push_subscription(user_id: int, subscription_json: str) -> None:
_execute_query(query, params)
def get_push_subscriptions(user_id: int) -> List[Dict[str, Any]]:
"""Retrieves all push subscriptions for a given user ID.
def get_push_subscriptions(user_id: Optional[int] = None) -> List[Dict[str, Any]]:
"""Retrieves all push subscriptions for a given user ID, or all if no user_id is provided.
Args:
user_id: The ID of the user.
user_id: The ID of the user (optional).
Returns:
A list of push subscriptions.
"""
query = "SELECT * FROM push_subscriptions WHERE user_id = %s"
return _execute_query(query, (user_id,))
if user_id:
query = "SELECT * FROM push_subscriptions WHERE user_id = %s"
return _execute_query(query, (user_id,))
else:
query = "SELECT * FROM push_subscriptions"
return _execute_query(query)
def delete_push_subscription(subscription_json: str) -> None:
"""Deletes a push subscription from the database.
Args:
subscription_json: The push subscription information as a JSON string.
"""
query = "DELETE FROM push_subscriptions WHERE subscription_json = %s"
params = (subscription_json,)
_execute_query(query, params)

View File

@@ -0,0 +1,19 @@
import json
from flask import current_app
from pywebpush import webpush, WebPushException
from ktvmanager.lib.database import delete_push_subscription
def send_notification(subscription_json, message_body):
try:
subscription_info = json.loads(subscription_json)
webpush(
subscription_info=subscription_info,
data=message_body,
vapid_private_key=current_app.config["VAPID_PRIVATE_KEY"],
vapid_claims={"sub": current_app.config["VAPID_CLAIM_EMAIL"]},
)
except WebPushException as ex:
print(f"Web push error: {ex}")
if ex.response and ex.response.status_code == 410:
print("Subscription is no longer valid, removing from DB.")
delete_push_subscription(subscription_json)

View File

@@ -1,12 +1,18 @@
import os
from flask import Flask, jsonify
from flask_cors import CORS
from dotenv import load_dotenv
from ktvmanager.config import DevelopmentConfig, ProductionConfig
from routes.api import api_blueprint
from routes.dns import dns_bp
from routes.extra_urls import extra_urls_bp
from ktvmanager.lib.database import initialize_db_pool
from ktvmanager.account_checker import send_expiry_notifications
from apscheduler.schedulers.background import BackgroundScheduler
def create_app():
app = Flask(__name__)
CORS(app)
load_dotenv()
if os.environ.get("FLASK_ENV") == "production":
@@ -17,8 +23,22 @@ def create_app():
with app.app_context():
initialize_db_pool()
# Schedule the daily account check
scheduler = BackgroundScheduler()
# Pass the app instance to the job
scheduler.add_job(func=lambda: send_expiry_notifications(app), trigger="cron", hour=10, minute=0)
scheduler.start()
# Register blueprints
app.register_blueprint(api_blueprint)
app.register_blueprint(dns_bp)
app.register_blueprint(extra_urls_bp)
@app.route('/check-expiry', methods=['POST'])
def check_expiry():
"""Manually triggers the expiry notification check."""
send_expiry_notifications(app)
return jsonify({"message": "Expiry notification check triggered."})
# Error handlers
@app.errorhandler(404)

npm_config_modifier.py (new file, 157 lines)
View File

@@ -0,0 +1,157 @@
import requests
import json
import argparse
import mysql.connector
import re
import os
from dotenv import load_dotenv
load_dotenv()
class NginxProxyManager:
def __init__(self, host, email, password):
self.host = host
self.email = email
self.password = password
self.token = None
def login(self):
url = f"{self.host}/api/tokens"
payload = {
"identity": self.email,
"secret": self.password
}
headers = {
"Content-Type": "application/json"
}
response = requests.post(url, headers=headers, data=json.dumps(payload))
if response.status_code == 200:
self.token = response.json()["token"]
print("Login successful.")
else:
print(f"Failed to login: {response.text}")
exit(1)
def get_proxy_hosts(self):
if not self.token:
self.login()
url = f"{self.host}/api/nginx/proxy-hosts"
headers = {
"Authorization": f"Bearer {self.token}"
}
response = requests.get(url, headers=headers)
if response.status_code == 200:
return response.json()
else:
print(f"Failed to get proxy hosts: {response.text}")
return []
def get_proxy_host(self, host_id):
if not self.token:
self.login()
url = f"{self.host}/api/nginx/proxy-hosts/{host_id}"
headers = {
"Authorization": f"Bearer {self.token}"
}
response = requests.get(url, headers=headers)
if response.status_code == 200:
return response.json()
else:
print(f"Failed to get proxy host {host_id}: {response.text}")
return None
def update_proxy_host_config(self, host_id, config):
if not self.token:
self.login()
url = f"{self.host}/api/nginx/proxy-hosts/{host_id}"
payload = {
"advanced_config": config
}
headers = {
"Authorization": f"Bearer {self.token}",
"Content-Type": "application/json"
}
response = requests.put(url, headers=headers, data=json.dumps(payload))
if response.status_code == 200:
print(f"Successfully updated proxy host {host_id}")
else:
print(f"Failed to update proxy host {host_id}: {response.text}")
def get_streams_from_db(db_host, db_user, db_pass, db_name, db_port):
try:
conn = mysql.connector.connect(
host=db_host,
user=db_user,
password=db_pass,
database=db_name,
port=db_port
)
cursor = conn.cursor(dictionary=True)
cursor.execute("SELECT streamName, streamURL FROM streams")
streams = cursor.fetchall()
cursor.close()
conn.close()
return streams
except mysql.connector.Error as err:
print(f"Error connecting to database: {err}")
return []
def update_config_with_streams(config, streams):
for stream in streams:
stream_name = stream['streamName']
stream_url = stream['streamURL']
# Use a more specific regex to avoid replacing parts of other URLs
pattern = re.compile(f'(location ~ \^/{re.escape(stream_name)}\(\.\*\)\$ {{\s*return 302 )([^;]+)(;\\s*}})')
config = pattern.sub(f'\\1{stream_url}/$1$is_args$args\\3', config)
return config
if __name__ == "__main__":
parser = argparse.ArgumentParser(description="Modify Nginx Proxy Manager custom configuration.")
parser.add_argument("--list-hosts", action="store_true", help="List all proxy hosts")
parser.add_argument("--host-id", type=int, help="The ID of the proxy host to modify")
parser.add_argument("--config-file", type=str, help="Path to the file containing the new advanced configuration")
parser.add_argument("--download-config", type=str, help="Path to save the current advanced configuration")
parser.add_argument("--update-from-db", action="store_true", help="Update the configuration from the database")
args = parser.parse_args()
npm_host = os.getenv("NPM_HOST")
npm_email = os.getenv("NPM_EMAIL")
npm_password = os.getenv("NPM_PASSWORD")
db_host = os.getenv("DBHOST")
db_user = os.getenv("DBUSER")
db_pass = os.getenv("DBPASS")
db_name = os.getenv("DATABASE")
db_port = os.getenv("DBPORT")
npm = NginxProxyManager(npm_host, npm_email, npm_password)
npm.login()
if args.list_hosts:
hosts = npm.get_proxy_hosts()
for host in hosts:
print(f"ID: {host['id']}, Domains: {', '.join(host['domain_names'])}")
if args.host_id and args.download_config:
host = npm.get_proxy_host(args.host_id)
if host:
with open(args.download_config, 'w') as f:
f.write(host.get('advanced_config', ''))
print(f"Configuration for host {args.host_id} downloaded to {args.download_config}")
if args.host_id and args.config_file:
with open(args.config_file, 'r') as f:
config = f.read()
npm.update_proxy_host_config(args.host_id, config)
if args.host_id and args.update_from_db:
host = npm.get_proxy_host(args.host_id)
if host:
current_config = host.get('advanced_config', '')
streams = get_streams_from_db(db_host, db_user, db_pass, db_name, db_port)
if streams:
new_config = update_config_with_streams(current_config, streams)
npm.update_proxy_host_config(args.host_id, new_config)
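A minimal, self-contained sketch of what the regex in update_config_with_streams does to a single location block; the stream name and URLs here are placeholders, not values from the database:

import re

sample = (
    "location ~ ^/Example(.*)$ {\n"
    "    return 302 http://old-server.example:8080/$1$is_args$args;\n"
    "}"
)
stream_name, stream_url = "Example", "http://new-server.example:9000"
pattern = re.compile(
    r'(location ~ \^/' + re.escape(stream_name) + r'\(\.\*\)\$ {\s*return 302 )([^;]+)(;\s*})'
)
# Group 2 (the old redirect target) is swapped for the URL held in the database.
print(pattern.sub(r'\1' + stream_url + '/$1$is_args$args' + r'\3', sample))

Typical invocations would then be: python npm_config_modifier.py --list-hosts, python npm_config_modifier.py --host-id 9 --download-config host_9_config.txt, and python npm_config_modifier.py --host-id 9 --update-from-db (the host ID is assumed here to match the downloaded host_9_config.txt above).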

View File

@@ -38,4 +38,6 @@ mysql-connector-python
python-dotenv
python-dotenv
pywebpush==1.13.0
stem==1.8.2
stem==1.8.2
APScheduler==3.10.4
Flask-Cors==4.0.1

View File

@@ -8,13 +8,18 @@ from ktvmanager.lib.database import (
get_user_id_from_username,
save_push_subscription,
get_push_subscriptions,
get_all_stream_urls,
)
from ktvmanager.lib.get_urls import get_latest_urls_from_dns
from ktvmanager.lib.auth import requires_basic_auth, check_login
from ktvmanager.lib.checker import validate_account
from typing import Tuple
import json
from pywebpush import webpush, WebPushException
import re
import base64
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec
from ktvmanager.lib.notifications import send_notification
api_blueprint = Blueprint("api", __name__)
@@ -67,6 +72,21 @@ def get_user_accounts_streams_route(username: str, password: str) -> Response:
return jsonify(get_latest_urls_from_dns())
@api_blueprint.route("/get_all_stream_urls")
@requires_basic_auth
def get_all_stream_urls_route(username: str, password: str) -> Response:
"""Retrieves all stream names and URLs.
Args:
username: The username of the user.
password: The password of the user (used for authentication).
Returns:
A Flask JSON response containing the list of stream names and URLs.
"""
return get_all_stream_urls()
@api_blueprint.route("/singleCheck", methods=["POST"])
@requires_basic_auth
def single_check_route(username: str, password: str) -> Response:
@@ -146,11 +166,26 @@ def login_route(username: str, password: str) -> Response:
@api_blueprint.route("/vapid-public-key", methods=["GET"])
def vapid_public_key():
"""Provides the VAPID public key."""
public_key = current_app.config["VAPID_PUBLIC_KEY"]
# Clean up the key by removing headers, footers, and all whitespace
public_key = "".join(public_key.replace("-----BEGIN PUBLIC KEY-----", "").replace("-----END PUBLIC KEY-----", "").split())
return jsonify({"public_key": public_key})
"""Provides the VAPID public key in the correct format."""
pem_key = current_app.config["VAPID_PUBLIC_KEY"]
try:
public_key = serialization.load_pem_public_key(pem_key.encode("utf-8"))
if not isinstance(public_key, ec.EllipticCurvePublicKey):
raise TypeError("VAPID public key is not an Elliptic Curve key")
# Get the raw, uncompressed public key bytes (65 bytes for P-256)
raw_key = public_key.public_bytes(
encoding=serialization.Encoding.X962,
format=serialization.PublicFormat.UncompressedPoint
)
# URL-safe base64 encode the raw key
url_safe_key = base64.urlsafe_b64encode(raw_key).rstrip(b'=').decode('utf-8')
return jsonify({"public_key": url_safe_key})
except (ValueError, TypeError, AttributeError) as e:
current_app.logger.error(f"Error processing VAPID public key: {e}")
return jsonify({"error": "Could not process VAPID public key"}), 500
@api_blueprint.route("/save-subscription", methods=["POST"])
@@ -177,20 +212,6 @@ def save_subscription(username: str, password: str) -> Response:
return jsonify({"message": "Subscription saved."})
def send_notification(subscription_info, message_body):
try:
webpush(
subscription_info=subscription_info,
data=message_body,
vapid_private_key=current_app.config["VAPID_PRIVATE_KEY"],
vapid_claims={"sub": "mailto:your-email@example.com"},
)
except WebPushException as ex:
print(f"Web push error: {ex}")
# You might want to remove the subscription if it's invalid
if ex.response and ex.response.status_code == 410:
print("Subscription is no longer valid, removing from DB.")
# Add logic to remove the subscription from your database
@api_blueprint.route("/send-expiry-notifications", methods=["POST"])
@@ -207,4 +228,39 @@ def send_expiry_notifications_route(username: str, password: str) -> Response:
"""
from ktvmanager.account_checker import send_expiry_notifications
send_expiry_notifications()
return jsonify({"message": "Expiry notifications sent."})
return jsonify({"message": "Expiry notifications sent."})
@api_blueprint.route("/send-test-notification", methods=["POST"])
@requires_basic_auth
def send_test_notification_route(username: str, password: str) -> Response:
"""Sends a test push notification to all users."""
data = request.get_json(silent=True)
message = data.get("message", "Ktv Test") if data else "Ktv Test"
try:
subscriptions = get_push_subscriptions() # Get all subscriptions
except Exception as e:
print(f"Error getting push subscriptions: {e}")
return jsonify({"error": "Could not retrieve push subscriptions from the database."}), 500
if not subscriptions:
return jsonify({"message": "No push subscriptions found."}), 404
message_body = json.dumps({"title": "KTVManager", "body": message})
success_count = 0
failure_count = 0
for sub in subscriptions:
try:
send_notification(sub['subscription_json'], message_body)
success_count += 1
except Exception as e:
print(f"Error sending notification to subscription ID {sub.get('id', 'N/A')}: {e}")
failure_count += 1
return jsonify({
"message": f"Test notification sending process completed.",
"sent": success_count,
"failed": failure_count
})
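A hedged sketch (base URL and credentials are placeholders) of driving the new test route, which is protected by requires_basic_auth:

import requests

resp = requests.post(
    "http://localhost:5002/send-test-notification",   # assumed local dev address (FLASK_RUN_PORT 5002)
    auth=("someuser", "somepassword"),                 # basic-auth credentials checked by requires_basic_auth
    json={"message": "Expiry checker deployed"},
)
print(resp.status_code, resp.json())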

routes/dns.py (new file, 60 lines)
View File

@@ -0,0 +1,60 @@
from flask import Blueprint, request, jsonify
import os
dns_bp = Blueprint('dns', __name__)
DNS_FILE = os.path.join(os.path.dirname(__file__), '..', 'ktvmanager', 'lib', 'DNS_list.txt')
def read_dns_list():
if not os.path.exists(DNS_FILE):
return []
with open(DNS_FILE, 'r') as f:
return [line.strip() for line in f.readlines() if line.strip()]
def write_dns_list(dns_list):
with open(DNS_FILE, 'w') as f:
for item in dns_list:
f.write(f"{item}\n")
@dns_bp.route('/dns', methods=['GET'])
def get_dns_list():
"""Gets the list of DNS entries."""
return jsonify(read_dns_list())
@dns_bp.route('/dns', methods=['POST'])
def add_dns():
"""Adds a new DNS entry."""
data = request.get_json()
if not data or 'dns_entry' not in data:
return jsonify({'error': 'Missing dns_entry in request body'}), 400
dns_entry = data.get('dns_entry')
if not dns_entry:
return jsonify({'error': 'DNS entry cannot be empty.'}), 400
dns_list = read_dns_list()
if dns_entry in dns_list:
return jsonify({'message': 'DNS entry already exists.'}), 200
dns_list.append(dns_entry)
write_dns_list(dns_list)
return jsonify({'message': 'DNS entry added successfully.'}), 201
@dns_bp.route('/dns', methods=['DELETE'])
def remove_dns():
"""Removes a DNS entry."""
data = request.get_json()
if not data or 'dns_entry' not in data:
return jsonify({'error': 'Missing dns_entry in request body'}), 400
dns_entry = data.get('dns_entry')
if not dns_entry:
return jsonify({'error': 'DNS entry cannot be empty.'}), 400
dns_list = read_dns_list()
if dns_entry not in dns_list:
return jsonify({'error': 'DNS entry not found.'}), 404
dns_list.remove(dns_entry)
write_dns_list(dns_list)
return jsonify({'message': 'DNS entry removed successfully.'}), 200
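A quick way to exercise these endpoints once the app is running; the base URL and the DNS entry below are placeholders (the routes themselves carry no auth decorator):

import requests

base = "http://localhost:5002"   # assumed local dev address

print(requests.get(f"{base}/dns").json())                                           # list entries
print(requests.post(f"{base}/dns", json={"dns_entry": "example.dns.net"}).json())   # add one
print(requests.delete(f"{base}/dns", json={"dns_entry": "example.dns.net"}).json()) # remove it

The extra_urls blueprint below follows the same read/append/remove pattern against its own text file.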

routes/extra_urls.py (new file, 60 lines)
View File

@@ -0,0 +1,60 @@
from flask import Blueprint, request, jsonify
import os
extra_urls_bp = Blueprint('extra_urls', __name__)
EXTRA_URLS_FILE = os.path.join(os.path.dirname(__file__), '..', 'ktvmanager', 'lib', 'extra_urls.txt')
def read_extra_urls_list():
if not os.path.exists(EXTRA_URLS_FILE):
return []
with open(EXTRA_URLS_FILE, 'r') as f:
return [line.strip() for line in f.readlines() if line.strip()]
def write_extra_urls_list(extra_urls_list):
with open(EXTRA_URLS_FILE, 'w') as f:
for item in extra_urls_list:
f.write(f"{item}\n")
@extra_urls_bp.route('/extra_urls', methods=['GET'])
def get_extra_urls_list():
"""Gets the list of extra URLs."""
return jsonify(read_extra_urls_list())
@extra_urls_bp.route('/extra_urls', methods=['POST'])
def add_extra_url():
"""Adds a new extra URL."""
data = request.get_json()
if not data or 'extra_url' not in data:
return jsonify({'error': 'Missing extra_url in request body'}), 400
extra_url = data.get('extra_url')
if not extra_url:
return jsonify({'error': 'Extra URL cannot be empty.'}), 400
extra_urls_list = read_extra_urls_list()
if extra_url in extra_urls_list:
return jsonify({'message': 'Extra URL already exists.'}), 200
extra_urls_list.append(extra_url)
write_extra_urls_list(extra_urls_list)
return jsonify({'message': 'Extra URL added successfully.'}), 201
@extra_urls_bp.route('/extra_urls', methods=['DELETE'])
def remove_extra_url():
"""Removes an extra URL."""
data = request.get_json()
if not data or 'extra_url' not in data:
return jsonify({'error': 'Missing extra_url in request body'}), 400
extra_url = data.get('extra_url')
if not extra_url:
return jsonify({'error': 'Extra URL cannot be empty.'}), 400
extra_urls_list = read_extra_urls_list()
if extra_url not in extra_urls_list:
return jsonify({'error': 'Extra URL not found.'}), 404
extra_urls_list.remove(extra_url)
write_extra_urls_list(extra_urls_list)
return jsonify({'message': 'Extra URL removed successfully.'}), 200