Compare commits

..

55 Commits

Author SHA1 Message Date
3ce8fa0080 Integrate dead torrent cleanup into Manager loop
Move dead torrent detection logic from 50 MrtveTorrenty.py into 70 Manager.py as step 1b, so the manager handles completed, dead, and new torrents in a single run cycle.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 13:54:35 +01:00
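The dead-torrent test being moved into the manager loop can be sketched from the criteria used later in this compare (`81 TorrentManipulation.py`): a torrent is culled only when it is old enough, no peer advertises the complete file, and the download is explicitly stalled. A minimal sketch, with the function name illustrative:

```python
from datetime import datetime, timedelta

DEAD_TORRENT_DAYS = 3  # threshold used in 81 TorrentManipulation.py

def is_dead(added_dt: datetime, availability: float, state: str) -> bool:
    """Dead only when ALL three hold:
    A) older than DEAD_TORRENT_DAYS,
    B) availability < 1.0 (no peer has the complete file),
    C) state is explicitly "stalledDL" (queuedDL/downloading are ignored)."""
    old_enough = datetime.now() - added_dt > timedelta(days=DEAD_TORRENT_DAYS)
    return old_enough and availability < 1.0 and state == "stalledDL"
```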
c4f2d8b13d git 2026-03-06 17:25:47 +01:00
a74ad8ff00 git 2026-03-06 17:22:02 +01:00
afbca5b348 z230 2026-03-06 14:24:32 +01:00
20c4a7d8b4 Merge remote-tracking branch 'origin/master' 2026-03-06 07:11:30 +01:00
489b236b9b git 2026-03-06 07:11:03 +01:00
197cb3f8db z230 2026-03-03 14:27:37 +01:00
b37db5397e Fix handle_completed — guard against invalid completion_on timestamp
qBittorrent returns completion_on = -1 for torrents that were never
completed. datetime.fromtimestamp(-1) throws OSError on Windows.
Added explicit check for negative values and try/except for safety.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-02 07:02:21 +01:00
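The guard described in this commit can be sketched as a small helper (the name is illustrative, not the commit's actual code):

```python
from datetime import datetime
from typing import Optional

def safe_completion_dt(completion_on: Optional[int]) -> Optional[datetime]:
    """qBittorrent reports completion_on = -1 (or 0) for torrents that were
    never completed; datetime.fromtimestamp(-1) raises OSError on Windows,
    so non-positive values are rejected before conversion."""
    if completion_on is None or completion_on <= 0:
        return None
    try:
        return datetime.fromtimestamp(completion_on)
    except (OSError, OverflowError, ValueError):
        return None
```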
15b498ca55 Refactor 70 Manager.py — multi-client support + scheduled task mode
- Add CLIENTS list: UltraCC Seedbox (max 20) + Local qBittorrent (max 20)
- Add 'added' to SELECT_NEXT exclusion list to prevent two clients
  claiming the same torrent from the shared DB queue
- Add qb_client column tracking — each torrent records which client
  downloaded it; per-client stats shown at startup
- Extract process_client() to encapsulate steps 1-3 per client
- Remove continuous loop and sleep — script runs once and exits,
  designed to be triggered by a scheduled task

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-01 20:32:57 +01:00
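The 'added'-state exclusion above can be illustrated with a hypothetical filter over queue rows (the `CLAIMED_STATES` set and the function are illustrative; the real exclusion lives in the SELECT_NEXT query):

```python
# States that mean some client has already claimed or finished a torrent.
# 'added' is the state this commit adds to the exclusion list, so a second
# client polling the shared DB cannot enqueue the same torrent again.
CLAIMED_STATES = {"added", "completed", "dead", "invalid"}

def eligible_for_enqueue(rows: list) -> list:
    """Keep only queue rows whose qb_state does not mark them as claimed."""
    return [r for r in rows if (r.get("qb_state") or "") not in CLAIMED_STATES]
```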
7646f6f68f Add Seedbox/70 Manager.py — continuous download manager for UltraCC 2026-03-01 12:15:49 +01:00
e0cb02c490 Move 95 IncrementalImport.py to Seedbox/ 2026-03-01 12:01:10 +01:00
6b8728360c Add 95 IncrementalImport.py — incremental torrent scraper without Selenium 2026-03-01 11:58:22 +01:00
d57f7d75ce Add Seedbox/60 AktualizaceSeeders.py — scrape seeders/leechers from sktorrent.eu 2026-03-01 11:45:43 +01:00
0710af0f82 Replace all DB host 192.168.1.50 -> 192.168.1.76 2026-03-01 08:11:45 +01:00
b5b3da1105 50 MrtveTorrenty: add criterion B — stuck near 100% for 7+ days 2026-03-01 07:55:12 +01:00
bf81c037a9 git 2026-03-01 07:49:08 +01:00
02da0247f1 Merge: Add Seedbox/50 MrtveTorrenty.py — dead torrent cleanup for UltraCC 2026-03-01 07:48:40 +01:00
1edbe8c1e7 Add Seedbox/50 MrtveTorrenty.py — dead torrent cleanup for UltraCC seedbox 2026-03-01 07:48:34 +01:00
f1a5967430 git 2026-02-09 18:08:19 +01:00
ec1735e629 git 2026-02-08 13:09:20 +01:00
1219f840c6 git 2026-02-08 13:08:00 +01:00
67e320287a git 2026-02-07 07:26:26 +01:00
0ed1411bbd git 2026-02-06 07:09:54 +01:00
aff7993093 git 2026-02-04 05:59:17 +01:00
3d11661997 git 2026-02-01 07:18:20 +01:00
7b0404bfe3 z230 2026-01-30 10:28:42 +01:00
0b7475c5c4 reporter 2026-01-20 06:18:42 +01:00
edee7cb8dd reporter 2026-01-20 06:18:29 +01:00
a990f7b394 git 2026-01-19 21:05:28 +01:00
52ae7cf60d git 2026-01-19 07:10:41 +01:00
5aac1b29c6 z230 2026-01-16 13:31:02 +01:00
c3c723e2e8 z230 2026-01-16 13:19:26 +01:00
f451317b6f z230 2026-01-16 13:10:07 +01:00
a4ede43153 z230 2026-01-13 14:43:44 +01:00
74083614e5 Remove PyCharm IDE files from repository 2026-01-11 08:18:35 +01:00
6d8ea05edb git 2026-01-11 08:15:46 +01:00
387d09b59c vbnotebook 2026-01-10 15:23:49 +01:00
a64f4b663f vbnotebook 2026-01-10 08:56:58 +01:00
84e38b01f1 vbnotebook 2026-01-09 06:38:19 +01:00
44162413e1 vbnotebook 2026-01-08 07:23:30 +01:00
1fc3323afd vbnotebook 2026-01-08 07:23:09 +01:00
8ea687b724 vbnotebook 2026-01-08 07:22:05 +01:00
b08c11b815 z230 2025-12-30 08:51:10 +01:00
b4e5c2f46e Save local IDE settings 2025-12-30 08:42:44 +01:00
a113f68e97 Save local settings before merge 2025-12-30 08:40:56 +01:00
5f01729bce WIP: torrent manipulation logic 2025-12-18 11:25:10 +01:00
bf75bffb02 z230 2025-12-18 11:22:00 +01:00
0769bd2670 z230 2025-12-18 07:53:48 +01:00
5f1c55243e reporter 2025-12-15 07:03:22 +01:00
314eb20e6b reporter 2025-12-15 06:28:20 +01:00
9c95999a26 Remove PyCharm IDE files and add .gitignore 2025-12-15 06:15:28 +01:00
1ed48cd640 reporter 2025-12-15 06:13:07 +01:00
6e8395890d z230 2025-12-08 19:26:13 +01:00
bc35cbdfac vbnotebook 2025-12-08 06:53:29 +01:00
af295ff63c vbnotebook 2025-12-08 06:08:04 +01:00
5372 changed files with 73917 additions and 36 deletions

3
.gitignore vendored

@@ -6,8 +6,9 @@ __pycache__/
 *.pyc
 *.log
-# IDE
+# IDE (PyCharm)
 .idea/
+*.iml
 # OS
 .DS_Store

3
.idea/.gitignore generated vendored

@@ -1,3 +0,0 @@
# Default ignored files
/shelf/
/workspace.xml

10
.idea/Torrents.iml generated

@@ -1,10 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<module type="PYTHON_MODULE" version="4">
<component name="NewModuleRootManager">
<content url="file://$MODULE_DIR$">
<excludeFolder url="file://$MODULE_DIR$/.venv" />
</content>
<orderEntry type="jdk" jdkName="Python 3.12 (Torrents)" jdkType="Python SDK" />
<orderEntry type="sourceFolder" forTests="false" />
</component>
</module>

View File

@@ -1,6 +0,0 @@
<component name="InspectionProjectProfileManager">
<settings>
<option name="USE_PROJECT_PROFILE" value="false" />
<version value="1.0" />
</settings>
</component>

8
.idea/modules.xml generated

@@ -1,8 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="ProjectModuleManager">
<modules>
<module fileurl="file://$PROJECT_DIR$/.idea/Torrents.iml" filepath="$PROJECT_DIR$/.idea/Torrents.iml" />
</modules>
</component>
</project>

6
.idea/vcs.xml generated

@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="VcsDirectoryMappings">
<mapping directory="$PROJECT_DIR$" vcs="Git" />
</component>
</project>

123
10 Library.py Normal file

@@ -0,0 +1,123 @@
import os
import psycopg2
from psycopg2 import extras
from tqdm import tqdm
import time
import sys
# --- CONFIGURATION ---
DB_CONFIG = {
"host": "192.168.1.76", # fill in the IP address of your Unraid/Postgres host
"database": "files",
"user": "vladimir.buzalka",
"password": "Vlado7309208104++",
"port": "5432"
}
DIRECTORY_TO_SCAN = "//tower/Library"
BATCH_SIZE = 2000 # raised to 2000 for even better batching efficiency with ~5M files
# -------------------
def scan_to_postgres():
conn = None
total_count = 0
files_batch = []
try:
conn = psycopg2.connect(**DB_CONFIG)
cur = conn.cursor()
# Initialize the table
cur.execute("""
    CREATE TABLE IF NOT EXISTS library_files (
        id SERIAL PRIMARY KEY,
        file_path TEXT NOT NULL,
        file_name TEXT NOT NULL,
        file_size_bytes BIGINT,
        indexed_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    );
""")
conn.commit()
print(f"🚀 Starting indexing: {DIRECTORY_TO_SCAN}")
# Progress bar with automatic unit scaling (k, M)
pbar = tqdm(
unit=" files",
unit_scale=True,
unit_divisor=1000,
desc="Scanning",
dynamic_ncols=True
)
def save_batch(batch_data):
"""Helper that writes one batch to the DB"""
insert_query = "INSERT INTO library_files (file_path, file_name, file_size_bytes) VALUES %s"
psycopg2.extras.execute_values(cur, insert_query, batch_data)
conn.commit()
# Fast traversal using os.walk
for root, dirs, files in os.walk(DIRECTORY_TO_SCAN):
for name in files:
full_path = os.path.join(root, name)
try:
# File metadata (size)
file_size = os.path.getsize(full_path)
files_batch.append((full_path, name, file_size))
total_count += 1
if len(files_batch) >= BATCH_SIZE:
save_batch(files_batch)
pbar.update(len(files_batch))
files_batch = []
except (OSError, PermissionError):
continue
# Save the last partial batch
if files_batch:
save_batch(files_batch)
pbar.update(len(files_batch))
pbar.close()
print(f"\n✅ Done! Indexed {total_count} files in total.")
except KeyboardInterrupt:
print("\n\n⚠️ Scan interrupted by user. Saving in-progress data...")
if files_batch:
try:
save_batch(files_batch)
print(f"Saved the last {len(files_batch)} records.")
except Exception:
print("Failed to save the final batch.")
sys.exit(0)
except Exception as e:
print(f"\n❌ Error: {e}")
finally:
if conn:
cur.close()
conn.close()
if __name__ == "__main__":
start_time = time.time()
scan_to_postgres()
duration = time.time() - start_time
print(f"⏱️ Total time: {duration / 60:.2f} minutes")

View File

@@ -39,7 +39,7 @@ COOKIE_FILE = Path("sktorrent_cookies.json")
 # Start URL for category 24, sorted by date DESC
 START_URL = (
     "https://sktorrent.eu/torrent/torrents.php"
-    "?active=0&category=24&order=data&by=DESC&zaner=&jazyk=&page=90"
+    "?search=&category=24&zaner=&jazyk=&active=0"
 )
 chrome_options = Options()

View File

@@ -23,7 +23,7 @@ HEADERS = {"User-Agent": USER_AGENT}
 DB_CFG = {
     "host": "192.168.1.76",
     "port": 3307,
     "user": "root",
     "password": "Vlado9674+",
     "database": "torrents",

295
50 TorrentManipulation.py Normal file

@@ -0,0 +1,295 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import time
from datetime import datetime, timedelta
import pymysql
import qbittorrentapi
import bencodepy
# ==============================
# ⚙ CONFIGURATION
# ==============================
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3307,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
QBT_CONFIG = {
"host": "192.168.1.76",
"port": 8080,
"username": "admin",
"password": "adminadmin",
}
MAX_ACTIVE_DOWNLOADS = 10
LOOP_SLEEP_SECONDS = 60
# Mark a torrent as "dead" if it has never been "seen_complete"
# for more than X minutes after being added
DEAD_TORRENT_MINUTES = 5
DEFAULT_SAVE_PATH = None
# ==============================
# 🔧 CONNECT
# ==============================
db = pymysql.connect(**DB_CONFIG)
cursor = db.cursor(pymysql.cursors.DictCursor)
qb = qbittorrentapi.Client(
host=QBT_CONFIG["host"],
port=QBT_CONFIG["port"],
username=QBT_CONFIG["username"],
password=QBT_CONFIG["password"],
)
try:
qb.auth_log_in()
print("✅ Connected to qBittorrent.")
except Exception as e:
print("❌ Could not connect:", e)
raise SystemExit(1)
# ==============================
# 🧪 TORRENT VALIDATION
# ==============================
def is_valid_torrent(blob: bytes) -> bool:
"""
Returns True only if BLOB is a valid .torrent file.
"""
try:
data = bencodepy.decode(blob)
return isinstance(data, dict) and b"info" in data
except Exception:
return False
# ==============================
# 🔄 SYNC FROM QB → DB
# ==============================
def sync_qb_to_db():
torrents = qb.torrents_info()
for t in torrents:
completion_dt = None
if getattr(t, "completion_on", 0):
try:
completion_dt = datetime.fromtimestamp(t.completion_on)
except Exception:
pass
sql = """
UPDATE torrents
SET qb_added = 1,
qb_hash = COALESCE(qb_hash, %s),
qb_state = %s,
qb_progress = %s,
qb_savepath = %s,
qb_completed_datetime =
IF(%s IS NOT NULL AND qb_completed_datetime IS NULL, %s, qb_completed_datetime),
qb_last_update = NOW()
WHERE qb_hash = %s OR torrent_hash = %s
"""
cursor.execute(sql, (
t.hash,
t.state,
float(t.progress) * 100.0,
getattr(t, "save_path", None),
completion_dt,
completion_dt,
t.hash,
t.hash
))
# ==============================
# 🧹 HANDLE COMPLETED + DEAD TORRENTS
# ==============================
def handle_completed_and_dead():
torrents = qb.torrents_info()
for t in torrents:
t_hash = t.hash
state = t.state
progress = float(t.progress)
# ==========================
# ✔ COMPLETED
# ==========================
if progress >= 1.0 or state in {"completed", "uploading", "stalledUP", "queuedUP"}:
print(f"✅ Completed torrent → remove (keep data): {t.name}")
try:
qb.torrents_delete(torrent_hashes=t_hash, delete_files=False)
except Exception as e:
print("⚠️ delete failed:", e)
cursor.execute("""
UPDATE torrents
SET qb_state='completed',
qb_progress=100,
qb_completed_datetime=NOW(),
qb_last_update=NOW()
WHERE qb_hash=%s OR torrent_hash=%s
""", (t_hash, t_hash))
continue
# ==========================
# ❌ DEAD TORRENT (never seen_complete)
# ==========================
props = qb.torrents_properties(t_hash)
seen = getattr(props, "last_seen", 0)
if seen == -1: # never seen complete
added_dt = getattr(t, "added_on", 0)
if added_dt:
added_time = datetime.fromtimestamp(added_dt)
if datetime.now() - added_time > timedelta(minutes=DEAD_TORRENT_MINUTES):
print(f"💀 Dead torrent (> {DEAD_TORRENT_MINUTES} min unseen): {t.name}")
try:
qb.torrents_delete(torrent_hashes=t_hash, delete_files=True)
except Exception:
pass
cursor.execute("""
UPDATE torrents
SET qb_state='dead',
qb_last_update=NOW()
WHERE qb_hash=%s OR torrent_hash=%s
""", (t_hash, t_hash))
# ==============================
# 📊 COUNT ACTIVE DOWNLOADS
# ==============================
def count_active_downloads():
torrents = qb.torrents_info(filter="all")
return sum(1 for t in torrents if float(t.progress) < 1.0)
# ==============================
# ENQUEUE NEW TORRENTS
# ==============================
def enqueue_new_torrents():
active = count_active_downloads()
print("DEBUG active =", active)
if active >= MAX_ACTIVE_DOWNLOADS:
print(f"📦 {active}/{MAX_ACTIVE_DOWNLOADS} active → no enqueue")
return
slots = MAX_ACTIVE_DOWNLOADS - active
sql = """
SELECT id, torrent_hash, torrent_content, torrent_filename, added_datetime
FROM torrents
WHERE (qb_added IS NULL OR qb_added = 0)
AND torrent_content IS NOT NULL
ORDER BY added_datetime DESC -- <── take NEWEST FIRST
LIMIT %s
"""
cursor.execute(sql, (slots,))
rows = cursor.fetchall()
if not rows:
print(" No new torrents")
return
for row in rows:
t_id = row["id"]
t_hash = row["torrent_hash"]
blob = row["torrent_content"]
filename = row.get("torrent_filename", "unknown.torrent")
if not blob:
print("⚠️ empty blob, skip")
continue
# ==========================
# 🧪 VALIDATION OF .TORRENT
# ==========================
if not is_valid_torrent(blob):
print(f"❌ INVALID TORRENT id={t_id}, size={len(blob)} → deleting content")
cursor.execute("""
UPDATE torrents
SET qb_state='invalid',
torrent_content=NULL,
qb_last_update=NOW()
WHERE id=%s
""", (t_id,))
continue
# ==========================
# ADD TORRENT
# ==========================
print(f" Adding torrent: {filename} ({t_hash})")
try:
qb.torrents_add(torrent_files=blob, savepath=DEFAULT_SAVE_PATH)
except Exception as e:
print(f"❌ Failed to add {t_hash}: {e}")
continue
cursor.execute("""
UPDATE torrents
SET qb_added=1,
qb_hash=COALESCE(qb_hash, %s),
qb_state='added',
qb_last_update=NOW()
WHERE id=%s
""", (t_hash, t_id))
# ==============================
# 🏁 MAIN LOOP
# ==============================
print("🚀 Worker started")
try:
while True:
print(f"\n⏱ Loop {datetime.now():%Y-%m-%d %H:%M:%S}")
sync_qb_to_db()
handle_completed_and_dead()
enqueue_new_torrents()
print(f"🛌 Sleep {LOOP_SLEEP_SECONDS}s\n")
time.sleep(LOOP_SLEEP_SECONDS)
except KeyboardInterrupt:
print("🛑 Stopping worker...")
finally:
db.close()
print("👋 Bye.")

72
60 Testcountoftorrents.py Normal file

@@ -0,0 +1,72 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from datetime import datetime
import qbittorrentapi
# ==============================
# CONFIG (adjust to your setup)
# ==============================
QBT_CONFIG = {
"host": "192.168.1.76",
"port": 8080,
"username": "admin",
"password": "adminadmin",
}
def fmt_ts(ts: int) -> str:
"""
Convert a unix timestamp to a readable string.
qBittorrent returns -1 when the value is unknown.
"""
if ts is None or ts <= 0:
return ""
try:
return datetime.fromtimestamp(ts).strftime("%Y-%m-%d %H:%M:%S")
except Exception:
return f"invalid({ts})"
def main():
# Connect
qb = qbittorrentapi.Client(
host=QBT_CONFIG["host"],
port=QBT_CONFIG["port"],
username=QBT_CONFIG["username"],
password=QBT_CONFIG["password"],
)
try:
qb.auth_log_in()
print("✅ Connected to qBittorrent\n")
except Exception as e:
print("❌ Could not connect to qBittorrent:", e)
return
# Everything; no 'downloading' filter
torrents = qb.torrents_info(filter='all')
print(f"Found {len(torrents)} torrents (filter='all')\n")
for t in torrents:
# properties include last_seen
try:
props = qb.torrents_properties(t.hash)
except Exception as e:
print(f"⚠️ Cannot get properties for {t.hash[:8]} {t.name}: {e}")
continue
seen_complete = getattr(t, "seen_complete", None) # from /torrents/info
last_seen = getattr(props, "last_seen", None) # from /torrents/properties
print("=" * 80)
print(f"Name : {t.name}")
print(f"Hash : {t.hash}")
print(f"State : {t.state}")
print(f"Progress : {float(t.progress) * 100:.2f}%")
print(f"Seen complete: {fmt_ts(seen_complete)} (t.seen_complete)")
print(f"Last seen : {fmt_ts(last_seen)} (props.last_seen)")
print("\n✅ Done.")
if __name__ == "__main__":
main()

143
70 MD5.py Normal file

@@ -0,0 +1,143 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
FAST MD5 indexer with in-memory cache
- prints every processed file
- skips unchanged files instantly
- restart-safe (no reprocessing same files)
"""
import os
import hashlib
from datetime import datetime
import pymysql
# ==============================
# CONFIG
# ==============================
ROOT_DIR = r"\\tower1\#ColdData\porno"
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3307,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
CHUNK_SIZE = 1024 * 1024 # 1 MB
PRINT_SKIPPED = True # set False if too noisy
# ==============================
# HELPERS
# ==============================
def compute_md5(path: str) -> str:
h = hashlib.md5()
with open(path, "rb") as f:
for chunk in iter(lambda: f.read(CHUNK_SIZE), b""):
h.update(chunk)
return h.hexdigest()
def format_size(size):
for unit in ["B", "KB", "MB", "GB", "TB"]:
if size < 1024:
return f"{size:.1f} {unit}"
size /= 1024
return f"{size:.1f} PB"
# ==============================
# MAIN
# ==============================
def main():
db = pymysql.connect(**DB_CONFIG)
cur = db.cursor()
print("📥 Loading already indexed files into memory...")
cur.execute("""
SELECT full_path, file_size, UNIX_TIMESTAMP(mtime)
FROM file_md5_index
""")
indexed = {
(row[0], row[1], row[2])
for row in cur.fetchall()
}
print(f"✅ Loaded {len(indexed):,} indexed entries")
print("======================================")
new_files = 0
skipped = 0
for root, _, files in os.walk(ROOT_DIR):
for fname in files:
full_path = os.path.join(root, fname)
try:
stat = os.stat(full_path)
except (OSError, FileNotFoundError):
continue
mtime = int(stat.st_mtime)
size = stat.st_size
key = (full_path, size, mtime)
# FAST PATH
if key in indexed:
skipped += 1
if PRINT_SKIPPED:
print("⏭ SKIP")
print(f" File: {full_path}")
continue
print(" NEW / UPDATED")
print(f" Size: {format_size(size)}")
print(f" File: {full_path}")
try:
md5 = compute_md5(full_path)
except Exception as e:
print(f"❌ MD5 failed: {e}")
continue
cur.execute("""
INSERT INTO file_md5_index
(full_path, file_name, directory, file_size, mtime, md5)
VALUES (%s, %s, %s, %s, FROM_UNIXTIME(%s), %s)
ON DUPLICATE KEY UPDATE
file_size=VALUES(file_size),
mtime=VALUES(mtime),
md5=VALUES(md5),
updated_at=CURRENT_TIMESTAMP
""", (
full_path,
fname,
root,
size,
mtime,
md5,
))
new_files += 1
print(f" MD5 : {md5}")
print("--------------------------------------")
print("======================================")
print(f"✅ New / updated : {new_files}")
print(f"⏭ Skipped : {skipped}")
print("======================================")
cur.close()
db.close()
if __name__ == "__main__":
main()

356
80 TorrentManipulation.py Normal file

@@ -0,0 +1,356 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from datetime import datetime, timedelta
import pymysql
import qbittorrentapi
import bencodepy
from EmailMessagingGraph import send_mail
# ==============================
# ⚙ CONFIGURATION
# ==============================
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3307,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
QBT_CONFIG = {
"host": "192.168.1.76",
"port": 8080,
"username": "admin",
"password": "adminadmin",
}
# Raised to 250 as requested
MAX_ACTIVE_DOWNLOADS = 250
# HOW LONG TO WAIT?
# At least 3 days (4320 minutes) is recommended.
# If nobody who has 100% of the file shows up within 3 days, it is probably dead.
DEAD_TORRENT_DAYS = 3
DEAD_TORRENT_MINUTES = DEAD_TORRENT_DAYS * 24 * 60
DEFAULT_SAVE_PATH = None
MAIL_TO = "vladimir.buzalka@buzalka.cz"
MAX_LIST_ITEMS = 50 # cap lists in email
# ==============================
# 🧮 RUNTIME STATS + LISTS
# ==============================
RUN_START = datetime.now()
stat_synced = 0
stat_completed = 0
stat_dead = 0
stat_enqueued = 0
deleted_completed = [] # list[str]
deleted_dead = [] # list[str]
added_new = [] # list[str]
active_downloading = [] # list[str]
# ==============================
# 🔧 CONNECT
# ==============================
db = pymysql.connect(**DB_CONFIG)
cursor = db.cursor(pymysql.cursors.DictCursor)
qb = qbittorrentapi.Client(**QBT_CONFIG)
try:
qb.auth_log_in()
print("✅ Connected to qBittorrent.")
except Exception as e:
raise SystemExit(f"❌ Could not connect to qBittorrent: {e}")
# ==============================
# 🧪 TORRENT VALIDATION
# ==============================
def is_valid_torrent(blob: bytes) -> bool:
try:
data = bencodepy.decode(blob)
return isinstance(data, dict) and b"info" in data
except Exception:
return False
# ==============================
# 🔄 SYNC FROM QB → DB
# ==============================
def sync_qb_to_db():
global stat_synced
torrents = qb.torrents_info()
stat_synced = len(torrents)
for t in torrents:
completion_dt = None
if getattr(t, "completion_on", 0):
try:
completion_dt = datetime.fromtimestamp(t.completion_on)
except Exception:
pass
cursor.execute("""
UPDATE torrents
SET qb_added = 1,
qb_hash = COALESCE(qb_hash, %s),
qb_state = %s,
qb_progress = %s,
qb_savepath = %s,
qb_completed_datetime =
IF(%s IS NOT NULL AND qb_completed_datetime IS NULL, %s, qb_completed_datetime),
qb_last_update = NOW()
WHERE qb_hash = %s OR torrent_hash = %s
""", (
t.hash,
t.state,
float(t.progress) * 100.0,
getattr(t, "save_path", None),
completion_dt,
completion_dt,
t.hash,
t.hash,
))
# ==============================
# 🧹 HANDLE COMPLETED + DEAD
# ==============================
def handle_completed_and_dead():
global stat_completed, stat_dead
# Fetch torrent info
torrents = qb.torrents_info()
for t in torrents:
t_hash = t.hash
state = t.state
progress = float(t.progress)
# Availability (defaults to -1 when not reported)
availability = float(getattr(t, "availability", -1))
# Time the torrent was added
added_ts = getattr(t, "added_on", 0)
added_dt = datetime.fromtimestamp(added_ts) if added_ts > 0 else datetime.now()
age_in_minutes = (datetime.now() - added_dt).total_seconds() / 60
# ---------------------------
# 1. ✔ COMPLETED
# ---------------------------
if progress >= 1.0 or state in {"completed", "uploading", "stalledUP", "queuedUP"}:
stat_completed += 1
deleted_completed.append(t.name)
try:
# Remove from qBittorrent but keep the data on disk
qb.torrents_delete(torrent_hashes=t_hash, delete_files=False)
except Exception as e:
print(f"⚠️ delete (keep data) failed for {t.name}: {e}")
cursor.execute("""
UPDATE torrents
SET qb_state='completed',
qb_progress=100,
qb_completed_datetime=NOW(),
qb_last_update=NOW()
WHERE qb_hash=%s OR torrent_hash=%s
""", (t_hash, t_hash))
continue
# ---------------------------
# 2. ❌ DEAD
# ---------------------------
# Logic: is it older than the limit AND is availability < 1 (no peer has the complete file)?
is_old_enough = age_in_minutes > DEAD_TORRENT_MINUTES
is_unavailable = availability < 1.0
if is_old_enough and is_unavailable:
stat_dead += 1
deleted_dead.append(f"{t.name} (Avail: {availability:.2f})")
try:
# Remove from qBittorrent, including the unfinished files
qb.torrents_delete(torrent_hashes=t_hash, delete_files=True)
except Exception as e:
print(f"⚠️ delete (files) failed for {t.name}: {e}")
cursor.execute("""
UPDATE torrents
SET qb_state='dead',
qb_last_update=NOW()
WHERE qb_hash=%s OR torrent_hash=%s
""", (t_hash, t_hash))
# ==============================
# 📊 ACTIVE DOWNLOADS
# ==============================
def count_active_downloads():
# Count only unfinished torrents (progress < 100%)
return sum(1 for t in qb.torrents_info() if float(t.progress) < 1.0)
def snapshot_active_downloading():
"""
Capture current actively downloading torrents (progress < 100%).
"""
active = []
for t in qb.torrents_info():
prog = float(t.progress)
avail = float(getattr(t, "availability", 0))
if prog < 1.0:
active.append(f"{t.name} — {prog * 100:.1f}% — Avail:{avail:.2f}")
return sorted(active)
# ==============================
# ENQUEUE NEW TORRENTS
# ==============================
def enqueue_new_torrents():
global stat_enqueued
active = count_active_downloads()
# If we are already at capacity, add nothing
if active >= MAX_ACTIVE_DOWNLOADS:
return
# How many slots remain
slots = MAX_ACTIVE_DOWNLOADS - active
cursor.execute("""
SELECT id, torrent_hash, torrent_content, torrent_filename
FROM torrents
WHERE (qb_added IS NULL OR qb_added = 0)
AND torrent_content IS NOT NULL
AND (qb_state IS NULL OR qb_state != 'dead')
ORDER BY added_datetime DESC
LIMIT %s
""", (slots,))
rows = cursor.fetchall()
for row in rows:
blob = row["torrent_content"]
if not blob:
continue
if not is_valid_torrent(blob):
cursor.execute("""
UPDATE torrents
SET qb_state='invalid',
torrent_content=NULL,
qb_last_update=NOW()
WHERE id=%s
""", (row["id"],))
continue
# Add torrent
try:
qb.torrents_add(torrent_files=blob, savepath=DEFAULT_SAVE_PATH)
except Exception as e:
print(f"❌ Failed to add {row['torrent_hash']}: {e}")
continue
stat_enqueued += 1
added_new.append(row.get("torrent_filename") or row["torrent_hash"])
cursor.execute("""
UPDATE torrents
SET qb_added=1,
qb_hash=COALESCE(qb_hash, %s),
qb_state='added',
qb_last_update=NOW()
WHERE id=%s
""", (row["torrent_hash"], row["id"]))
# ==============================
# ✉️ EMAIL HELPERS
# ==============================
def format_list(title: str, items: list[str]) -> list[str]:
lines = []
if not items:
return [f"{title}: (none)"]
lines.append(f"{title}: {len(items)}")
shown = items[:MAX_LIST_ITEMS]
for it in shown:
lines.append(f" - {it}")
if len(items) > MAX_LIST_ITEMS:
lines.append(f" ... (+{len(items) - MAX_LIST_ITEMS} more)")
return lines
# ==============================
# 🏁 MAIN (ONE RUN)
# ==============================
print("🚀 QB worker run started")
try:
sync_qb_to_db()
handle_completed_and_dead()
enqueue_new_torrents()
# Snapshot after enqueue/deletions, so email reflects end-state
active_downloading = snapshot_active_downloading()
finally:
db.close()
# ==============================
# 📧 EMAIL REPORT
# ==============================
RUN_END = datetime.now()
body_lines = [
f"Run started : {RUN_START:%Y-%m-%d %H:%M:%S}",
f"Run finished: {RUN_END:%Y-%m-%d %H:%M:%S}",
"",
f"QB torrents synced : {stat_synced}",
f"Completed removed : {stat_completed}",
f"Dead removed : {stat_dead}",
f"New torrents added : {stat_enqueued}",
f"Active downloads : {len(active_downloading)} (Max: {MAX_ACTIVE_DOWNLOADS})",
"",
]
body_lines += format_list("Deleted (completed, kept data)", deleted_completed)
body_lines.append("")
body_lines += format_list("Deleted (DEAD > 3 days & Avail < 1.0)", deleted_dead)
body_lines.append("")
body_lines += format_list("Newly added to qBittorrent", added_new)
body_lines.append("")
body_lines += format_list("Actively downloading now", active_downloading)
send_mail(
to=MAIL_TO,
subject=f"qBittorrent worker {RUN_START:%Y-%m-%d %H:%M}",
body="\n".join(body_lines),
html=False,
)
print("📧 Email report sent")
print("🎉 DONE")

362
81 TorrentManipulation.py Normal file

@@ -0,0 +1,362 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from datetime import datetime, timedelta
import pymysql
import qbittorrentapi
import bencodepy
from EmailMessagingGraph import send_mail
# ==============================
# ⚙ CONFIGURATION
# ==============================
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3307,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
QBT_CONFIG = {
"host": "192.168.1.76",
"port": 8080,
"username": "admin",
"password": "adminadmin",
}
# Raised to 250 as requested
MAX_ACTIVE_DOWNLOADS = 250
# HOW LONG TO WAIT?
# At least 3 days (4320 minutes) is recommended.
# If nobody who has 100% of the file shows up within 3 days, it is probably dead.
DEAD_TORRENT_DAYS = 3
DEAD_TORRENT_MINUTES = DEAD_TORRENT_DAYS * 24 * 60
DEFAULT_SAVE_PATH = None
MAIL_TO = "vladimir.buzalka@buzalka.cz"
MAX_LIST_ITEMS = 50 # cap lists in email
# ==============================
# 🧮 RUNTIME STATS + LISTS
# ==============================
RUN_START = datetime.now()
stat_synced = 0
stat_completed = 0
stat_dead = 0
stat_enqueued = 0
deleted_completed = [] # list[str]
deleted_dead = [] # list[str]
added_new = [] # list[str]
active_downloading = [] # list[str]
# ==============================
# 🔧 CONNECT
# ==============================
db = pymysql.connect(**DB_CONFIG)
cursor = db.cursor(pymysql.cursors.DictCursor)
qb = qbittorrentapi.Client(**QBT_CONFIG)
try:
qb.auth_log_in()
print("✅ Connected to qBittorrent.")
except Exception as e:
raise SystemExit(f"❌ Could not connect to qBittorrent: {e}")
# ==============================
# 🧪 TORRENT VALIDATION
# ==============================
def is_valid_torrent(blob: bytes) -> bool:
try:
data = bencodepy.decode(blob)
return isinstance(data, dict) and b"info" in data
except Exception:
return False
# ==============================
# 🔄 SYNC FROM QB → DB
# ==============================
def sync_qb_to_db():
global stat_synced
torrents = qb.torrents_info(limit=1000)
stat_synced = len(torrents)
for t in torrents:
completion_dt = None
if getattr(t, "completion_on", 0):
try:
completion_dt = datetime.fromtimestamp(t.completion_on)
except Exception:
pass
cursor.execute("""
UPDATE torrents
SET qb_added = 1,
qb_hash = COALESCE(qb_hash, %s),
qb_state = %s,
qb_progress = %s,
qb_savepath = %s,
qb_completed_datetime =
IF(%s IS NOT NULL AND qb_completed_datetime IS NULL, %s, qb_completed_datetime),
qb_last_update = NOW()
WHERE qb_hash = %s OR torrent_hash = %s
""", (
t.hash,
t.state,
float(t.progress) * 100.0,
getattr(t, "save_path", None),
completion_dt,
completion_dt,
t.hash,
t.hash,
))
# ==============================
# 🧹 HANDLE COMPLETED + DEAD
# ==============================
def handle_completed_and_dead():
global stat_completed, stat_dead
# Fetch torrent info
torrents = qb.torrents_info(limit=1000)
for t in torrents:
t_hash = t.hash
state = t.state
progress = float(t.progress)
# Availability (defaults to -1 when not reported)
availability = float(getattr(t, "availability", -1))
# Time the torrent was added
added_ts = getattr(t, "added_on", 0)
added_dt = datetime.fromtimestamp(added_ts) if added_ts > 0 else datetime.now()
age_in_minutes = (datetime.now() - added_dt).total_seconds() / 60
# ---------------------------
# 1. ✔ COMPLETED
# ---------------------------
if progress >= 1.0 or state in {"completed", "uploading", "stalledUP", "queuedUP"}:
stat_completed += 1
deleted_completed.append(t.name)
try:
# Remove from qBittorrent but keep the data on disk
qb.torrents_delete(torrent_hashes=t_hash, delete_files=False)
except Exception as e:
print(f"⚠️ delete (keep data) failed for {t.name}: {e}")
cursor.execute("""
UPDATE torrents
SET qb_state='completed',
qb_progress=100,
qb_completed_datetime=NOW(),
qb_last_update=NOW()
WHERE qb_hash=%s OR torrent_hash=%s
""", (t_hash, t_hash))
continue
# ---------------------------
# 2. ❌ DEAD (Mrtvý)
# ---------------------------
# LOGIC:
# A) Older than the limit (3 days)
# B) Availability < 1.0 (no peer has the complete file)
# C) State is explicitly "stalledDL" (stuck download)
# This ignores "queuedDL" (waiting in queue) and "downloading" (actively downloading)
is_old_enough = age_in_minutes > DEAD_TORRENT_MINUTES
is_unavailable = availability < 1.0
is_stalled = (state == "stalledDL")
if is_old_enough and is_unavailable and is_stalled:
stat_dead += 1
deleted_dead.append(f"{t.name} (Avail: {availability:.2f}, State: {state})")
try:
# Remove from qBittorrent, including the partially downloaded files
qb.torrents_delete(torrent_hashes=t_hash, delete_files=True)
except Exception as e:
print(f"⚠️ delete (files) failed for {t.name}: {e}")
cursor.execute("""
UPDATE torrents
SET qb_state='dead',
qb_last_update=NOW()
WHERE qb_hash=%s OR torrent_hash=%s
""", (t_hash, t_hash))
# ==============================
# 📊 ACTIVE DOWNLOADS
# ==============================
def count_active_downloads():
# Count only torrents that are not finished (progress < 100%)
return sum(1 for t in qb.torrents_info(limit=1000) if float(t.progress) < 1.0)
def snapshot_active_downloading():
"""
Capture current actively downloading torrents (progress < 100%).
"""
active = []
for t in qb.torrents_info(limit=1000):
prog = float(t.progress)
avail = float(getattr(t, "availability", 0))
# Include the state so the email shows whether a torrent is queued or stalled
state = t.state
if prog < 1.0:
active.append(f"{t.name} — {prog * 100:.1f}% — Avail:{avail:.2f} — [{state}]")
return sorted(active)
# ==============================
# ENQUEUE NEW TORRENTS
# ==============================
def enqueue_new_torrents():
global stat_enqueued
active = count_active_downloads()
# If we are already at capacity, add nothing
if active >= MAX_ACTIVE_DOWNLOADS:
return
# How many slots remain
slots = MAX_ACTIVE_DOWNLOADS - active
cursor.execute("""
SELECT id, torrent_hash, torrent_content, torrent_filename
FROM torrents
WHERE (qb_added IS NULL OR qb_added = 0)
AND torrent_content IS NOT NULL
AND (qb_state IS NULL OR qb_state != 'dead')
ORDER BY added_datetime DESC
LIMIT %s
""", (slots,))
rows = cursor.fetchall()
for row in rows:
blob = row["torrent_content"]
if not blob:
continue
if not is_valid_torrent(blob):
cursor.execute("""
UPDATE torrents
SET qb_state='invalid',
torrent_content=NULL,
qb_last_update=NOW()
WHERE id=%s
""", (row["id"],))
continue
# Add torrent
try:
qb.torrents_add(torrent_files=blob, savepath=DEFAULT_SAVE_PATH)
except Exception as e:
print(f"❌ Failed to add {row['torrent_hash']}: {e}")
continue
stat_enqueued += 1
added_new.append(row.get("torrent_filename") or row["torrent_hash"])
cursor.execute("""
UPDATE torrents
SET qb_added=1,
qb_hash=COALESCE(qb_hash, %s),
qb_state='added',
qb_last_update=NOW()
WHERE id=%s
""", (row["torrent_hash"], row["id"]))
# ==============================
# ✉️ EMAIL HELPERS
# ==============================
def format_list(title: str, items: list[str]) -> list[str]:
lines = []
if not items:
return [f"{title}: (none)"]
lines.append(f"{title}: {len(items)}")
shown = items[:MAX_LIST_ITEMS]
for it in shown:
lines.append(f" - {it}")
if len(items) > MAX_LIST_ITEMS:
lines.append(f" ... (+{len(items) - MAX_LIST_ITEMS} more)")
return lines
# ==============================
# 🏁 MAIN (ONE RUN)
# ==============================
print("🚀 QB worker run started")
try:
sync_qb_to_db()
handle_completed_and_dead()
enqueue_new_torrents()
# Snapshot after enqueue/deletions, so email reflects end-state
active_downloading = snapshot_active_downloading()
finally:
db.close()
# ==============================
# 📧 EMAIL REPORT
# ==============================
RUN_END = datetime.now()
body_lines = [
f"Run started : {RUN_START:%Y-%m-%d %H:%M:%S}",
f"Run finished: {RUN_END:%Y-%m-%d %H:%M:%S}",
"",
f"QB torrents synced : {stat_synced}",
f"Completed removed : {stat_completed}",
f"Dead removed : {stat_dead}",
f"New torrents added : {stat_enqueued}",
f"Active downloads : {len(active_downloading)} (Max: {MAX_ACTIVE_DOWNLOADS})",
"",
]
body_lines += format_list("Deleted (completed, kept data)", deleted_completed)
body_lines.append("")
body_lines += format_list("Deleted (DEAD > 3 days & StalledDL & Avail < 1.0)", deleted_dead)
body_lines.append("")
body_lines += format_list("Newly added to qBittorrent", added_new)
body_lines.append("")
body_lines += format_list("Actively downloading now", active_downloading)
send_mail(
to=MAIL_TO,
subject=f"qBittorrent worker {RUN_START:%Y-%m-%d %H:%M}",
body="\n".join(body_lines),
html=False,
)
print("📧 Email report sent")
print("🎉 DONE")
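The three-condition dead-torrent rule in step 2 above reduces to a pure predicate, which makes the 3-day cutoff easy to unit-test. A minimal sketch; the function name `is_dead` and the explicit `now` parameter are illustrative, not part of the script:

```python
from datetime import datetime, timedelta

# 3 days, matching the manager's DEAD_TORRENT_MINUTES limit
DEAD_TORRENT_MINUTES = 3 * 24 * 60

def is_dead(added_dt, availability, state, now=None):
    """Dead-torrent rule: old enough AND unavailable AND explicitly stalled."""
    now = now or datetime.now()
    age_in_minutes = (now - added_dt).total_seconds() / 60
    return (age_in_minutes > DEAD_TORRENT_MINUTES
            and availability < 1.0
            and state == "stalledDL")
```

Passing `now` explicitly keeps the predicate deterministic in tests instead of depending on the wall clock.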

82 Reporting.py Normal file

@@ -0,0 +1,153 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import pymysql
import pandas as pd
import os
from datetime import datetime
# ==============================
# ⚙ CONFIGURATION
# ==============================
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3307,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4"
}
# Target folder (raw string so the backslashes are taken literally)
OUTPUT_DIR = r"u:\Dropbox\!!!Days\Downloads Z230"
FILE_NAME = f"Torrents_Report_{datetime.now():%Y-%m-%d}.xlsx"
# Join the path and the file name
FULL_OUTPUT_PATH = os.path.join(OUTPUT_DIR, FILE_NAME)
# ==============================
# 📥 DATA LOAD
# ==============================
def get_data():
print("⏳ Connecting to the database and fetching data...")
conn = pymysql.connect(**DB_CONFIG)
query = """
SELECT
id,
category,
title_visible AS 'Název',
size_pretty AS 'Velikost',
added_datetime AS 'Přidáno do DB',
qb_state AS 'Stav v QB',
qb_progress AS 'Postup (%)',
qb_savepath AS 'Cesta na disku',
qb_completed_datetime AS 'Dokončeno',
qb_last_update AS 'Poslední info'
FROM torrents
ORDER BY added_datetime DESC
"""
df = pd.read_sql(query, conn)
conn.close()
return df
# ==============================
# 🎨 EXCEL FORMATTING
# ==============================
def auto_adjust_columns(writer, df, sheet_name):
"""Safely auto-size the column widths"""
worksheet = writer.sheets[sheet_name]
for idx, col in enumerate(df.columns):
series = df[col]
max_len = len(str(col)) # at least the header length
for val in series:
if val is None or (isinstance(val, float) and pd.isna(val)):
length = 0
else:
length = len(str(val))
if length > max_len:
max_len = length
max_len = min(max_len + 2, 60)
worksheet.set_column(idx, idx, max_len)
# ==============================
# 🚀 MAIN LOGIC
# ==============================
def generate_report():
# 1. Path check
if not os.path.exists(OUTPUT_DIR):
print(f"❌ ERROR: Target folder does not exist or is unreachable: {OUTPUT_DIR}")
print(" Make sure drive U: is mapped.")
return
df = get_data()
print(f"✅ Loaded {len(df)} records.")
# 2. DATA CLEANUP
df['Postup (%)'] = df['Postup (%)'].fillna(0).astype(float).round(1)
# 3. FILTERING
# A) DEAD
mask_dead = df['Stav v QB'].isin(['dead', 'invalid'])
df_dead = df[mask_dead].copy()
# B) COMPLETED
mask_completed = (
(df['Stav v QB'] == 'completed') |
(df['Postup (%)'] >= 100)
) & (~mask_dead)
df_completed = df[mask_completed].copy()
# C) ACTIVE / QUEUED
mask_active = (~mask_dead) & (~mask_completed)
df_active = df[mask_active].copy()
# Sorting
df_active = df_active.sort_values(by=['Postup (%)', 'Přidáno do DB'], ascending=[False, False])
df_completed = df_completed.sort_values(by='Dokončeno', ascending=False)
df_dead = df_dead.sort_values(by='Poslední info', ascending=False)
# 4. EXPORT
print(f"💾 Saving to: {FULL_OUTPUT_PATH}")
try:
with pd.ExcelWriter(FULL_OUTPUT_PATH, engine='xlsxwriter') as writer:
# Sheet 1: to download
df_active.to_excel(writer, sheet_name='Ke stažení', index=False)
auto_adjust_columns(writer, df_active, 'Ke stažení')
# Sheet 2: done
df_completed.to_excel(writer, sheet_name='Hotovo', index=False)
auto_adjust_columns(writer, df_completed, 'Hotovo')
# Sheet 3: dead
df_dead.to_excel(writer, sheet_name='Smazáno (Dead)', index=False)
auto_adjust_columns(writer, df_dead, 'Smazáno (Dead)')
# Sheet 4: everything
df.to_excel(writer, sheet_name='Kompletní DB', index=False)
auto_adjust_columns(writer, df, 'Kompletní DB')
print("🎉 Done! The report was saved to drive U:")
except Exception as e:
print(f"❌ Error writing the file: {e}")
if __name__ == "__main__":
generate_report()
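The width heuristic inside `auto_adjust_columns` boils down to one rule per column: longest cell (header included), padded, capped. A dependency-free restatement under an assumed helper name `column_width`:

```python
def column_width(header, values, pad=2, cap=60):
    """Column width rule: longest cell (including the header), padded, capped."""
    max_len = len(str(header))
    for val in values:
        if val is None or val != val:  # skip None and NaN (NaN != NaN)
            continue
        max_len = max(max_len, len(str(val)))
    return min(max_len + pad, cap)
```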


@@ -0,0 +1,310 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import pymysql
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.chrome.options import Options
import time
import re
import urllib.parse as urlparse
from pathlib import Path
import json
import requests
import datetime
import sys
# Ensure this file exists in your directory
from EmailMessagingGraph import send_mail
# ============================================================
# RUNTIME INFO
# ============================================================
RUN_START = datetime.datetime.now()
processed_count = 0
new_torrent_count = 0
existing_torrent_count = 0
new_titles = []
print(f"🕒 Run started at {RUN_START:%Y-%m-%d %H:%M:%S}")
sys.stdout.flush()
# ============================================================
# 1) MySQL CONNECTION
# ============================================================
db = pymysql.connect(
host="192.168.1.76",
port=3306,
user="root",
password="Vlado9674+",
database="torrents",
charset="utf8mb4",
autocommit=True,
)
cursor = db.cursor()
# ============================================================
# 2) Selenium setup
# ============================================================
COOKIE_FILE = Path("sktorrent_cookies.json")
# Standard torrents.php listing URL
BASE_URL = (
"https://sktorrent.eu/torrent/torrents.php"
"?active=0&category=24&order=data&by=DESC&zaner=&jazyk="
)
chrome_options = Options()
chrome_options.add_argument("--start-maximized")
chrome_options.add_argument("--disable-notifications")
chrome_options.add_argument("--disable-popup-blocking")
chrome_options.add_argument("--disable-extensions")
driver = webdriver.Chrome(options=chrome_options)
driver.set_window_position(380, 50)
driver.set_window_size(1350, 1000)
driver.get("https://sktorrent.eu")
if COOKIE_FILE.exists():
with open(COOKIE_FILE, "r", encoding="utf-8") as f:
cookies = json.load(f)
for c in cookies:
driver.add_cookie(c)
print("🍪 Cookies loaded.")
else:
print("⚠️ Cookie file not found; login may be required.")
# ============================================================
# 3) requests.Session from Selenium cookies
# ============================================================
requests_session = requests.Session()
for ck in driver.get_cookies():
requests_session.cookies.set(ck["name"], ck["value"])
print("🔗 Requests session initialized.")
# ============================================================
# 4) Popup handler
# ============================================================
def close_popup_if_any():
try:
driver.execute_script("try { interstitialBox.closeit(); } catch(e) {}")
time.sleep(0.5)
except Exception:
pass
# ============================================================
# 5) Parse one torrent row (MODIFIED)
# ============================================================
def parse_row(cells):
# --- 1. INITIALIZE ---
torrent_hash = None
download_url = None
category = cells[0].text.strip()
try:
# --- 2. EXTRACT DOWNLOAD URL (Column 1) ---
download_a = cells[1].find_element(By.TAG_NAME, "a")
download_url = download_a.get_attribute("href")
parsed_dl = urlparse.urlparse(download_url)
dl_query = urlparse.parse_qs(parsed_dl.query)
torrent_filename = dl_query.get("f", ["unknown.torrent"])[0]
# --- 3. EXTRACT DETAILS & HASH (Column 2) ---
title_links = cells[2].find_elements(By.TAG_NAME, "a")
if not title_links:
return None
a_tag = title_links[0]
visible_name = a_tag.text.strip()
full_title = a_tag.get_attribute("title")
details_link = a_tag.get_attribute("href")
parsed = urlparse.urlparse(details_link)
query = urlparse.parse_qs(parsed.query)
if "id" not in query:
return None
torrent_hash = query["id"][0]
# --- 4. EXTRACT SIZE & DATE ---
text_block = cells[2].get_attribute("innerText")
text_block_clean = " ".join(text_block.split())
size_match = re.search(r"Velkost ([0-9\.]+ ?[KMG]B)", text_block_clean, re.IGNORECASE)
added_match = re.search(r"Pridany (.+?)(?:\sObrázok|$)", text_block_clean, re.IGNORECASE)
size_pretty = size_match.group(1) if size_match else None
added_pretty = added_match.group(1) if added_match else None
added_mysql = None
if added_pretty:
clean = added_pretty.replace(" o ", " ").strip()
parts = clean.split(" ")
if len(parts) >= 2:
date_part, time_part = parts[0], parts[1]
if len(time_part.split(":")) == 2:  # site omits seconds
time_part += ":00"
try:
d, m, y = date_part.split("/")
added_mysql = f"{y}-{m}-{d} {time_part}"
except ValueError:
pass
# --- 5. IMAGE & STATS ---
img_link = None
try:
image_a = cells[2].find_element(By.XPATH, ".//a[contains(text(),'Obrázok')]")
mouseover = image_a.get_attribute("onmouseover")
img_match = re.search(r"src=([^ ]+)", mouseover)
if img_match:
img_link = img_match.group(1).replace("'", "").strip()
if img_link.startswith("//"):
img_link = "https:" + img_link
except Exception:
pass
seeders_number = int(cells[4].find_element(By.TAG_NAME, "a").text.strip())
seeders_link = cells[4].find_element(By.TAG_NAME, "a").get_attribute("href")
leechers_number = int(cells[5].find_element(By.TAG_NAME, "a").text.strip())
leechers_link = cells[5].find_element(By.TAG_NAME, "a").get_attribute("href")
# --- 6. DATABASE CHECK & DOWNLOAD ---
cursor.execute("SELECT torrent_content FROM torrents WHERE torrent_hash=%s", (torrent_hash,))
db_row = cursor.fetchone()
already_have_torrent = db_row is not None and db_row[0] is not None
torrent_content = None
if not already_have_torrent:
time.sleep(2)
try:
resp = requests_session.get(download_url, timeout=10)
resp.raise_for_status()
torrent_content = resp.content
except Exception as e:
print(f" ⚠️ Download failed for {visible_name}: {e}")
return {
"torrent_hash": torrent_hash,
"details_link": details_link,
"download_url": download_url,
"category": category,
"title_visible": visible_name,
"title_full": full_title,
"size_pretty": size_pretty,
"added_datetime": added_mysql,
"preview_image": img_link,
"seeders": seeders_number,
"seeders_link": seeders_link,
"leechers": leechers_number,
"leechers_link": leechers_link,
"torrent_filename": torrent_filename,
"torrent_content": torrent_content if not already_have_torrent else None,
"is_new_torrent": not already_have_torrent,
}
except Exception as e:
print(f"⚠️ parse_row logic failed: {e}")
return None
# ============================================================
# 6) INSERT SQL (MODIFIED)
# ============================================================
insert_sql = """
INSERT INTO torrents (
torrent_hash, details_link, download_url, category, title_visible, title_full,
size_pretty, added_datetime, preview_image,
seeders, seeders_link, leechers, leechers_link,
torrent_filename, torrent_content
) VALUES (
%(torrent_hash)s, %(details_link)s, %(download_url)s, %(category)s, %(title_visible)s, %(title_full)s,
%(size_pretty)s, %(added_datetime)s, %(preview_image)s,
%(seeders)s, %(seeders_link)s, %(leechers)s, %(leechers_link)s,
%(torrent_filename)s, %(torrent_content)s
)
ON DUPLICATE KEY UPDATE
seeders = VALUES(seeders),
leechers = VALUES(leechers),
download_url = VALUES(download_url),
torrent_content = COALESCE(VALUES(torrent_content), torrent_content);
"""
# Note: COALESCE(VALUES(torrent_content), torrent_content)
# takes the newly supplied value when it is not NULL,
# and otherwise keeps the value already stored in the row.
# ============================================================
# 7) PROCESS ALL PAGES
# ============================================================
TOTAL_PAGES = 226
for page_num in range(0, TOTAL_PAGES):
current_url = f"{BASE_URL}&page={page_num}"
print(f"\n🌐 Loading Page Index {page_num} (Page {page_num + 1}/{TOTAL_PAGES})")
driver.get(current_url)
time.sleep(2)
close_popup_if_any()
# Find table rows
rows = driver.find_elements(By.CSS_SELECTOR, "table tr")
# FILTER: Only keep rows that have 7 columns AND a link in the 2nd column (index 1)
# This automatically discards headers and empty space rows.
real_rows = []
for r in rows:
cells = r.find_elements(By.TAG_NAME, "td")
if len(cells) == 7 and cells[1].find_elements(By.TAG_NAME, "a"):
real_rows.append(cells)
if not real_rows:
print("⚠️ No data rows found on this page. Ending loop.")
break
# Counter of new items found on this page (used by the optional early-stop below)
page_new_items = 0
for cells in real_rows:
try:
data = parse_row(cells)
except Exception as e:
print(f"⚠️ parse_row failed: {e}")
continue
if not data:
continue
processed_count += 1
if data["is_new_torrent"]:
new_torrent_count += 1
page_new_items += 1
new_titles.append(data["title_visible"])
print(f"💾 NEW: {data['title_visible']}")
else:
existing_torrent_count += 1
print(f"♻️ UPDATING: {data['title_visible']}")
cursor.execute(insert_sql, data)
# # If an entire page is old news, we can stop the deep crawl
# if page_new_items == 0 and page_num > 0:
# print("🛑 Page contained only known items. Sync complete.")
# break
time.sleep(1)
# ============================================================
# 8) SEND EMAIL REPORT
# ============================================================
RUN_END = datetime.datetime.now()
subject = f"SKTorrent run {RUN_START:%Y-%m-%d %H:%M}"
body = (
f"Run started: {RUN_START:%Y-%m-%d %H:%M:%S}\n"
f"Run finished: {RUN_END:%Y-%m-%d %H:%M:%S}\n\n"
f"Processed torrents: {processed_count}\n"
f"New torrents saved: {new_torrent_count}\n"
f"Existing torrents updated: {existing_torrent_count}\n"
)
if new_titles:
body += "\nNew torrents list:\n- " + "\n- ".join(new_titles)
send_mail(to="vladimir.buzalka@buzalka.cz", subject=subject, body=body, html=False)
print("📧 Email report sent.")
driver.quit()
print("🎉 DONE")
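The "Pridany DD/MM/YYYY o HH:MM" normalization in section 4 of the parser can be factored into a standalone helper for testing. A sketch; `added_pretty_to_mysql` is an illustrative name, not part of the script:

```python
def added_pretty_to_mysql(added_pretty):
    """Convert the site's 'DD/MM/YYYY o HH:MM' string into 'YYYY-MM-DD HH:MM:SS'.

    Returns None when the input does not match the expected shape.
    """
    clean = added_pretty.replace(" o ", " ").strip()
    parts = clean.split(" ")
    if len(parts) < 2:
        return None
    date_part, time_part = parts[0], parts[1]
    if len(time_part.split(":")) == 2:  # seconds are omitted on the site
        time_part += ":00"
    try:
        d, m, y = date_part.split("/")
    except ValueError:
        return None
    return f"{y}-{m}-{d} {time_part}"
```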

91 5threaddownloader.py Normal file

@@ -0,0 +1,292 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import pymysql
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.chrome.options import Options
import time
import re
import urllib.parse as urlparse
from pathlib import Path
import json
import requests
import datetime
import sys
import threading
from concurrent.futures import ThreadPoolExecutor
# Ensure this file exists in your directory
from EmailMessagingGraph import send_mail
# ============================================================
# CONFIGURATION
# ============================================================
TOTAL_PAGES = 226
THREADS = 5
COOKIE_FILE = Path("sktorrent_cookies.json")
# Database settings
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3306,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
BASE_URL = (
"https://sktorrent.eu/torrent/torrents.php"
"?active=0&category=24&order=data&by=DESC&zaner=&jazyk="
)
# Global counters for reporting (Thread-safe lock needed)
stats_lock = threading.Lock()
stats = {
"processed": 0,
"new": 0,
"existing": 0,
"new_titles": []
}
# ============================================================
# 1) WORKER FUNCTION (Runs inside each thread)
# ============================================================
def process_page_chunk(page_indices, thread_id):
"""
This function creates its OWN browser and OWN database connection.
It processes the specific list of page numbers assigned to it.
"""
print(f"🧵 [Thread-{thread_id}] Starting. Assigned {len(page_indices)} pages.")
# --- A. Setup Independent DB Connection ---
try:
db = pymysql.connect(**DB_CONFIG)
cursor = db.cursor()
except Exception as e:
print(f"❌ [Thread-{thread_id}] DB Connection failed: {e}")
return
# --- B. Setup Independent Selenium Driver ---
chrome_options = Options()
# HEADLESS MODE is safer for 5 threads to avoid popping up 5 windows
chrome_options.add_argument("--headless=new")
chrome_options.add_argument("--disable-notifications")
chrome_options.add_argument("--disable-popup-blocking")
chrome_options.add_argument("--disable-extensions")
chrome_options.add_argument("--log-level=3") # Reduce noise
driver = webdriver.Chrome(options=chrome_options)
driver.set_window_size(1350, 1000)
# --- C. Login / Cookies ---
driver.get("https://sktorrent.eu")
if COOKIE_FILE.exists():
with open(COOKIE_FILE, "r", encoding="utf-8") as f:
cookies = json.load(f)
for c in cookies:
driver.add_cookie(c)
# --- D. Requests Session ---
requests_session = requests.Session()
for ck in driver.get_cookies():
requests_session.cookies.set(ck["name"], ck["value"])
# --- E. Helper: Parse Row (Local scope) ---
def parse_row(cells):
try:
category = cells[0].text.strip()
# Download URL
download_a = cells[1].find_element(By.TAG_NAME, "a")
download_url = download_a.get_attribute("href")
parsed_dl = urlparse.urlparse(download_url)
dl_query = urlparse.parse_qs(parsed_dl.query)
torrent_filename = dl_query.get("f", ["unknown.torrent"])[0]
# Details & Hash
title_links = cells[2].find_elements(By.TAG_NAME, "a")
if not title_links: return None
a_tag = title_links[0]
visible_name = a_tag.text.strip()
full_title = a_tag.get_attribute("title")
details_link = a_tag.get_attribute("href")
parsed = urlparse.urlparse(details_link)
query = urlparse.parse_qs(parsed.query)
if "id" not in query: return None
torrent_hash = query["id"][0]
# Size & Date
text_block = cells[2].get_attribute("innerText")
clean_text = " ".join(text_block.split())
size_match = re.search(r"Velkost ([0-9\.]+ ?[KMG]B)", clean_text, re.IGNORECASE)
added_match = re.search(r"Pridany (.+?)(?:\sObrázok|$)", clean_text, re.IGNORECASE)
size_pretty = size_match.group(1) if size_match else None
added_mysql = None
if added_match:
clean = added_match.group(1).replace(" o ", " ").strip()
parts = clean.split(" ")
if len(parts) >= 2:
try:
# The split can raise ValueError on unexpected formats, so it belongs inside the try
d, m, y = parts[0].split("/")
t = parts[1] + ":00" if len(parts[1].split(":")) == 2 else parts[1]
added_mysql = f"{y}-{m}-{d} {t}"
except ValueError:
pass
# Image
img_link = None
try:
img_a = cells[2].find_element(By.XPATH, ".//a[contains(text(),'Obrázok')]")
img_src = re.search(r"src=([^ ]+)", img_a.get_attribute("onmouseover"))
if img_src:
img_link = img_src.group(1).replace("'", "").strip()
if img_link.startswith("//"):
img_link = "https:" + img_link
except Exception:
pass
# Stats
seeders = int(cells[4].find_element(By.TAG_NAME, "a").text.strip())
seeders_link = cells[4].find_element(By.TAG_NAME, "a").get_attribute("href")
leechers = int(cells[5].find_element(By.TAG_NAME, "a").text.strip())
leechers_link = cells[5].find_element(By.TAG_NAME, "a").get_attribute("href")
# Check DB
cursor.execute("SELECT torrent_content FROM torrents WHERE torrent_hash=%s", (torrent_hash,))
row = cursor.fetchone()
already_have_file = row is not None and row[0] is not None
content = None
if not already_have_file:
# Politeness sleep only if downloading
time.sleep(1)
try:
r = requests_session.get(download_url, timeout=10)
r.raise_for_status()
content = r.content
except Exception:
pass
return {
"torrent_hash": torrent_hash, "details_link": details_link, "download_url": download_url,
"category": category, "title_visible": visible_name, "title_full": full_title,
"size_pretty": size_pretty, "added_datetime": added_mysql, "preview_image": img_link,
"seeders": seeders, "seeders_link": seeders_link, "leechers": leechers, "leechers_link": leechers_link,
"torrent_filename": torrent_filename, "torrent_content": content,
"is_new_torrent": not already_have_file
}
except Exception:
return None
# --- F. Loop through Assigned Pages ---
for page_num in page_indices:
url = f"{BASE_URL}&page={page_num}"
print(f" 🔄 [Thread-{thread_id}] Scraping Page {page_num}")
try:
driver.get(url)
# Close popup (simplified JS)
driver.execute_script("try { interstitialBox.closeit(); } catch(e) {}")
# Row Filtering
rows = driver.find_elements(By.CSS_SELECTOR, "table tr")
real_rows = []
for r in rows:
cs = r.find_elements(By.TAG_NAME, "td")
if len(cs) == 7 and cs[1].find_elements(By.TAG_NAME, "a"):
real_rows.append(cs)
if not real_rows:
print(f" ⚠️ [Thread-{thread_id}] Page {page_num} empty.")
continue
# Process Rows
for cells in real_rows:
data = parse_row(cells)
if not data: continue
# Update Global Stats safely
with stats_lock:
stats["processed"] += 1
if data["is_new_torrent"]:
stats["new"] += 1
stats["new_titles"].append(data["title_visible"])
else:
stats["existing"] += 1
# Insert SQL
sql = """
INSERT INTO torrents (
torrent_hash, details_link, download_url, category, title_visible, title_full,
size_pretty, added_datetime, preview_image,
seeders, seeders_link, leechers, leechers_link,
torrent_filename, torrent_content
) VALUES (
%(torrent_hash)s, %(details_link)s, %(download_url)s, %(category)s, %(title_visible)s, %(title_full)s,
%(size_pretty)s, %(added_datetime)s, %(preview_image)s,
%(seeders)s, %(seeders_link)s, %(leechers)s, %(leechers_link)s,
%(torrent_filename)s, %(torrent_content)s
)
ON DUPLICATE KEY UPDATE
seeders = VALUES(seeders),
leechers = VALUES(leechers),
download_url = VALUES(download_url),
torrent_content = COALESCE(VALUES(torrent_content), torrent_content);
"""
cursor.execute(sql, data)
except Exception as e:
print(f" 💥 [Thread-{thread_id}] Error on page {page_num}: {e}")
# Cleanup
driver.quit()
db.close()
print(f"🏁 [Thread-{thread_id}] Finished assigned pages.")
# ============================================================
# 2) MAIN EXECUTION
# ============================================================
if __name__ == "__main__":
RUN_START = datetime.datetime.now()
print(f"🚀 Starting Multithreaded Scraper with {THREADS} threads...")
# 1. Distribute pages among threads
# Example: If 226 pages and 5 threads, each gets ~45 pages
all_pages = list(range(TOTAL_PAGES))
chunk_size = len(all_pages) // THREADS + 1
chunks = [all_pages[i:i + chunk_size] for i in range(0, len(all_pages), chunk_size)]
# 2. Start Threads
with ThreadPoolExecutor(max_workers=THREADS) as executor:
futures = []
for i, page_chunk in enumerate(chunks):
if page_chunk: # Only start if chunk is not empty
futures.append(executor.submit(process_page_chunk, page_chunk, i + 1))
# Wait for all to finish
for f in futures:
f.result()
# 3. Final Report
RUN_END = datetime.datetime.now()
print("\n✅ All threads completed.")
body = (
f"Run started: {RUN_START:%Y-%m-%d %H:%M:%S}\n"
f"Run finished: {RUN_END:%Y-%m-%d %H:%M:%S}\n\n"
f"Processed torrents: {stats['processed']}\n"
f"New torrents saved: {stats['new']}\n"
f"Existing torrents updated: {stats['existing']}\n"
)
if stats["new_titles"]:
body += "\nNew torrents list:\n- " + "\n- ".join(stats["new_titles"])
send_mail(to="vladimir.buzalka@buzalka.cz", subject=f"SKTorrent Multi-Thread Run {RUN_START:%Y-%m-%d %H:%M}", body=body, html=False)
print("📧 Email report sent.")
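The page distribution in step 1 of the main block is a simple contiguous chunking. Restated as a pure function (the name `chunk_pages` is illustrative):

```python
def chunk_pages(total_pages, threads):
    """Split page indices 0..total_pages-1 into contiguous chunks, one per thread
    (the last chunk may be shorter)."""
    pages = list(range(total_pages))
    chunk_size = len(pages) // threads + 1
    return [pages[i:i + chunk_size] for i in range(0, len(pages), chunk_size)]
```

With 226 pages and 5 threads this yields a chunk size of 46, so four threads take 46 pages and the last one takes 42.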


@@ -0,0 +1,212 @@
import pymysql
import requests
import json
import time
import random
import os
import re
from pathlib import Path
from concurrent.futures import ThreadPoolExecutor
from threading import Lock
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
# ============================================================
# CONFIGURATION
# ============================================================
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3306,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
COOKIE_FILE = Path("sktorrent_cookies.json")
BACKUP_DIR = "saved_torrents" # Directory for the local backup
THREADS = 5 # Number of worker threads
# Global lock so console output from the threads does not interleave
print_lock = Lock()
stats = {"fixed": 0, "failed": 0, "saved_to_disk": 0}
# ============================================================
# HELPER FUNCTIONS
# ============================================================
def sanitize_filename(name):
"""Strip characters that are not allowed in file names"""
# Keep only letters, digits, dots, hyphens and spaces
clean = re.sub(r'[^\w\s\.-]', '', name)
return clean.strip()[:100] # Cap at 100 characters to be safe
def ensure_backup_dir():
"""Create the torrent backup directory if it does not exist"""
if not os.path.exists(BACKUP_DIR):
os.makedirs(BACKUP_DIR)
print(f"📁 Created backup directory: {os.path.abspath(BACKUP_DIR)}")
def get_browser_identity():
"""
Launch Selenium (Chrome) ONCE to obtain a valid
User-Agent and fresh cookies for the worker threads.
"""
print("🤖 Starting Selenium to capture the browser identity...")
opts = Options()
opts.add_argument("--headless=new")
opts.add_argument("--disable-gpu")
driver = webdriver.Chrome(options=opts)
# Visit the site so the cookie domain is set
driver.get("https://sktorrent.eu")
# Load cookies from the file
if COOKIE_FILE.exists():
with open(COOKIE_FILE, "r", encoding="utf-8") as f:
cookies_list = json.load(f)
for c in cookies_list:
driver.add_cookie(c)
driver.refresh()
time.sleep(2)
# Export the identity
user_agent = driver.execute_script("return navigator.userAgent;")
browser_cookies = driver.get_cookies()
driver.quit()
print("✅ Identity captured.")
return user_agent, browser_cookies
# ============================================================
# WORKER (per-thread task)
# ============================================================
def worker_task(rows_chunk, thread_id, user_agent, cookies_list):
"""
Runs separately inside each thread.
"""
# 1. Create this thread's own requests Session
session = requests.Session()
session.headers.update({"User-Agent": user_agent})
for c in cookies_list:
session.cookies.set(c['name'], c['value'])
# 2. Own DB connection (required for thread safety)
try:
db = pymysql.connect(**DB_CONFIG)
cursor = db.cursor()
except Exception as e:
with print_lock:
print(f"❌ [Thread-{thread_id}] DB connection failed: {e}")
return
for row in rows_chunk:
t_hash, url, title = row
# Politeness: short random pause so 5 threads do not hammer the server
time.sleep(random.uniform(0.5, 2.0))
try:
# Download
resp = session.get(url, timeout=15)
if resp.status_code == 403:
with print_lock:
print(f"⛔ [Thread-{thread_id}] 403 Forbidden! {title[:20]}...")
stats["failed"] += 1
continue
resp.raise_for_status()
content = resp.content
if len(content) > 100:
# A) Store in the DB (BLOB)
sql = "UPDATE torrents SET torrent_content = %s WHERE torrent_hash = %s"
cursor.execute(sql, (content, t_hash))
# B) Save to DISK (file)
clean_name = sanitize_filename(title)
# Append part of the hash so files with identical names do not overwrite each other
filename = f"{clean_name}_{t_hash[:6]}.torrent"
file_path = os.path.join(BACKUP_DIR, filename)
with open(file_path, "wb") as f:
f.write(content)
with print_lock:
print(f"✅ [Thread-{thread_id}] OK: {clean_name}")
stats["fixed"] += 1
stats["saved_to_disk"] += 1
else:
with print_lock:
print(f"⚠️ [Thread-{thread_id}] Empty file: {title}")
stats["failed"] += 1
except Exception as e:
with print_lock:
print(f"❌ [Thread-{thread_id}] Error: {title[:20]}... -> {e}")
stats["failed"] += 1
db.close()
with print_lock:
print(f"🏁 [Thread-{thread_id}] Finished its work.")
# ============================================================
# MAIN
# ============================================================
if __name__ == "__main__":
ensure_backup_dir()
# 1. Fetch the work list from the DB
print("🔍 Loading the list of missing files from the DB...")
main_db = pymysql.connect(**DB_CONFIG)
with main_db.cursor() as c:
# Rows that have a URL but no stored content
c.execute(
"SELECT torrent_hash, download_url, title_visible FROM torrents WHERE torrent_content IS NULL AND download_url IS NOT NULL")
all_rows = c.fetchall()
main_db.close()
total = len(all_rows)
print(f"📋 Items to fix: {total}.")
if total == 0:
print("🎉 Nothing to fix.")
exit()
# 2. Capture the browser identity via Selenium (only once)
u_agent, browser_cookies = get_browser_identity()
# 3. Split the work across the threads
chunk_size = total // THREADS + 1
chunks = [all_rows[i:i + chunk_size] for i in range(0, total, chunk_size)]
print(f"🚀 Starting {THREADS} threads (saving to the DB and to the '{BACKUP_DIR}' folder)...")
# 4. Run the thread pool
with ThreadPoolExecutor(max_workers=THREADS) as executor:
futures = []
for i, chunk in enumerate(chunks):
if chunk:
# Hand each thread its slice of work plus the browser identity
futures.append(executor.submit(worker_task, chunk, i + 1, u_agent, browser_cookies))
# Wait for completion
for f in futures:
f.result()
print("\n" + "=" * 40)
print("🏁 FINISHED")
print(f"✅ Fixed in DB: {stats['fixed']}")
print(f"💾 Saved to disk: {stats['saved_to_disk']}")
print(f"❌ Errors: {stats['failed']}")
print(f"📁 Files are in: {os.path.abspath(BACKUP_DIR)}")
print("=" * 40)

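The main block above carves `all_rows` into chunks of `total // THREADS + 1` items, one per worker thread. A standalone sketch of that partitioning (`split_work` is an illustrative name, not from the script):

```python
# Split a work list into up-to-n contiguous chunks, mirroring the
# chunk_size = total // THREADS + 1 arithmetic used above.
def split_work(rows, threads):
    chunk_size = len(rows) // threads + 1
    return [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]

chunks = split_work(list(range(10)), 3)
print(chunks)  # → [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

Note the last chunk may be shorter, and with 3 threads the formula can produce fewer chunks than threads; the `if chunk:` guard above skips any empty remainder.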

@@ -0,0 +1,133 @@
import pymysql
import requests
import json
import time
import random
import os
import re
from pathlib import Path
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
# ============================================================
# CONFIGURATION
# ============================================================
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3306,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
COOKIE_FILE = Path("sktorrent_cookies.json")
BACKUP_DIR = "saved_torrents"
# ============================================================
# HELPER FUNCTIONS
# ============================================================
def sanitize_filename(name):
clean = re.sub(r'[^\w\s\.-]', '', name)
return clean.strip()[:100]
def get_browser_identity():
print("🤖 Starting Selenium (single-thread mode)...")
opts = Options()
opts.add_argument("--headless=new")
opts.add_argument("--disable-gpu")
driver = webdriver.Chrome(options=opts)
driver.get("https://sktorrent.eu")
if COOKIE_FILE.exists():
with open(COOKIE_FILE, "r", encoding="utf-8") as f:
cookies_list = json.load(f)
for c in cookies_list:
driver.add_cookie(c)
driver.refresh()
time.sleep(2)
user_agent = driver.execute_script("return navigator.userAgent;")
browser_cookies = driver.get_cookies()
driver.quit()
return user_agent, browser_cookies
# ============================================================
# MAIN
# ============================================================
if __name__ == "__main__":
if not os.path.exists(BACKUP_DIR):
os.makedirs(BACKUP_DIR)
# 1. Load the remaining failures
db = pymysql.connect(**DB_CONFIG)
cursor = db.cursor()
cursor.execute(
"SELECT torrent_hash, download_url, title_visible FROM torrents WHERE torrent_content IS NULL AND download_url IS NOT NULL")
rows = cursor.fetchall()
print(f"📋 Remaining to fix: {len(rows)} items.")
if not rows:
print("🎉 Done! Everything is downloaded.")
exit()
# 2. Get the browser identity
ua, cookies = get_browser_identity()
session = requests.Session()
session.headers.update({"User-Agent": ua})
for c in cookies:
session.cookies.set(c['name'], c['value'])
# 3. Slow loop (single thread)
success = 0
dead_links = 0
print("🚀 Starting the gentle cleanup pass...")
for i, row in enumerate(rows):
t_hash, url, title = row
print(f"[{i + 1}/{len(rows)}] {title[:50]}...", end=" ")
try:
# Longer pause for stability
time.sleep(random.uniform(1.5, 3.0))
resp = session.get(url, timeout=20)  # longer timeout
if resp.status_code == 404:
print("❌ 404 Not Found (the file no longer exists on the server)")
dead_links += 1
continue
if resp.status_code != 200:
print(f"❌ Error {resp.status_code}")
continue
content = resp.content
if len(content) > 100:
# DB
cursor.execute("UPDATE torrents SET torrent_content = %s WHERE torrent_hash = %s", (content, t_hash))
# Disk
fname = f"{sanitize_filename(title)}_{t_hash[:6]}.torrent"
with open(os.path.join(BACKUP_DIR, fname), "wb") as f:
f.write(content)
print("✅ OK")
success += 1
else:
print("⚠️ Empty file")
except Exception as e:
print(f"❌ Failed: {e}")
db.close()
print("\n" + "=" * 30)
print(f"🏁 FINAL: Fixed {success} of {len(rows)}")
if dead_links > 0:
print(f"💀 Dead links (404): {dead_links} (these can no longer be fixed)")

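Both scripts reuse Selenium's logged-in cookies inside a plain `requests.Session`. The transfer boils down to mapping the dicts returned by `driver.get_cookies()` to name/value pairs; a minimal sketch of that mapping with made-up cookie data:

```python
# Selenium's get_cookies() returns a list of dicts with name/value/domain
# keys; requests only needs the name/value pairs, as in the
# session.cookies.set(...) loop above.
selenium_cookies = [
    {"name": "uid", "value": "123", "domain": "sktorrent.eu"},
    {"name": "pass", "value": "abc", "domain": "sktorrent.eu"},
]
jar = {c["name"]: c["value"] for c in selenium_cookies}
print(jar)  # → {'uid': '123', 'pass': 'abc'}
```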

@@ -0,0 +1,158 @@
import pymysql
import bencodepy
import os
from pathlib import Path
# ============================================================
# CONFIGURATION
# ============================================================
# Your network path (Use raw string r"..." for backslashes)
# PHYSICAL_DIR = Path(r"\\tower\torrents\downloads")
PHYSICAL_DIR = Path(r"\\tower1\#Colddata\Porno")
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3306,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
# ============================================================
# HELPER FUNCTIONS
# ============================================================
def decode_bytes(b):
"""
Decodes bytes from Bencode into a string.
Tries UTF-8 first, then common fallbacks.
"""
if isinstance(b, str): return b
encodings = ['utf-8', 'windows-1250', 'latin-1', 'cp1252']
for enc in encodings:
try:
return b.decode(enc)
except (UnicodeDecodeError, LookupError):
continue
return b.decode('utf-8', errors='ignore')
def check_torrent_in_filesystem(torrent_blob, root_path):
"""
Parses the binary BLOB, calculates expected paths,
and checks if they exist in the root_path.
"""
try:
# Decode the binary BLOB
data = bencodepy.decode(torrent_blob)
info = data.get(b'info')
if not info: return False
# Get the name of the root file/folder defined in the torrent
name = decode_bytes(info.get(b'name'))
# Calculate expected location
target_path = root_path / name
# 1. Check if the main path exists
if not target_path.exists():
return False
# 2. Size Verification (Basic)
# If it's a single file
if b'files' not in info:
expected_size = info[b'length']
real_size = target_path.stat().st_size
# Allow up to 4 KB variance (filesystems sometimes report sizes slightly differently)
if abs(real_size - expected_size) < 4096:
return True
return False
# If it's a multi-file torrent (folder)
else:
# If the folder exists, we assume it's mostly good,
# but let's check at least one file inside to be sure it's not empty.
files = info[b'files']
if not files: return True # Empty folder torrent? rare but possible.
# Check the first file in the list
first_file_path = target_path.joinpath(*[decode_bytes(p) for p in files[0][b'path']])
return first_file_path.exists()
except Exception:
# If Bencode fails or path is weird
return False
# ============================================================
# MAIN EXECUTION
# ============================================================
if __name__ == "__main__":
if not PHYSICAL_DIR.exists():
print(f"❌ ERROR: Cannot access path: {PHYSICAL_DIR}")
print("Make sure the drive is mapped or the network path is accessible.")
exit()
print(f"📂 Scanning storage: {PHYSICAL_DIR}")
print("🚀 Connecting to Database...")
db = pymysql.connect(**DB_CONFIG)
cursor = db.cursor()
# 1. Get all torrents that have content (BLOB)
# Select only the hash, title, and content to keep memory usage reasonable
cursor.execute(
"SELECT torrent_hash, title_visible, torrent_content FROM torrents WHERE torrent_content IS NOT NULL")
rows = cursor.fetchall()
total = len(rows)
print(f"📋 Analysing {total} torrents from database against disk files...")
found_count = 0
missing_count = 0
# 2. Iterate and Check
updates = [] # Store successful hashes to batch update later
for index, row in enumerate(rows):
t_hash, title, blob = row
is_downloaded = check_torrent_in_filesystem(blob, PHYSICAL_DIR)
if is_downloaded:
found_count += 1
updates.append(t_hash)
# Optionally print each found title (disabled to reduce clutter)
# print(f"✅ Found: {title[:50]}")
else:
missing_count += 1
if index % 100 == 0:
print(f" Processed {index}/{total} ... (Found: {found_count})")
# 3. Batch Update Database
print(f"\n💾 Updating Database: Marking {len(updates)} torrents as 'physical_exists = 1'...")
# Reset everything to 0 first (in case you deleted files since last run)
cursor.execute("UPDATE torrents SET physical_exists = 0")
if updates:
# Update in chunks of 1000 to be safe
chunk_size = 1000
for i in range(0, len(updates), chunk_size):
chunk = updates[i:i + chunk_size]
format_strings = ','.join(['%s'] * len(chunk))
cursor.execute(f"UPDATE torrents SET physical_exists = 1 WHERE torrent_hash IN ({format_strings})",
tuple(chunk))
db.commit()
db.close()
print("\n" + "=" * 40)
print(f"🏁 SCAN COMPLETE")
print(f"✅ Physically Available: {found_count}")
print(f"❌ Missing / Not Downloaded: {missing_count}")
print(f"📊 Completion Rate: {int((found_count / total) * 100) if total else 0}%")
print("=" * 40)

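For multi-file torrents, `check_torrent_in_filesystem` rebuilds the first file's path from the byte segments stored under `info[b'files'][0][b'path']`. A small sketch of that expansion (the torrent data here is made up):

```python
from pathlib import Path

# A decoded multi-file torrent stores each file as a list of byte path
# segments; joinpath expands them under the torrent's root name, exactly
# as the first-file existence check above does.
info_files = [{b'path': [b'Season 1', b'ep01.mkv'], b'length': 700}]
root = Path('/data') / 'Show.Name'
first = root.joinpath(*[p.decode() for p in info_files[0][b'path']])
print(first.as_posix())  # → /data/Show.Name/Season 1/ep01.mkv
```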
91
EmailMessagingGraph.py Normal file

@@ -0,0 +1,91 @@
"""
EmailMessagingGraph.py
----------------------
Private Microsoft Graph mail sender
Application permissions, shared mailbox
"""
import msal
import requests
from functools import lru_cache
from typing import Union, List
# =========================
# PRIVATE CONFIG (ONLY YOU)
# =========================
TENANT_ID = "7d269944-37a4-43a1-8140-c7517dc426e9"
CLIENT_ID = "4b222bfd-78c9-4239-a53f-43006b3ed07f"
CLIENT_SECRET = "Txg8Q~MjhocuopxsJyJBhPmDfMxZ2r5WpTFj1dfk"
SENDER = "reports@buzalka.cz"
AUTHORITY = f"https://login.microsoftonline.com/{TENANT_ID}"
SCOPE = ["https://graph.microsoft.com/.default"]
@lru_cache(maxsize=1)
def _get_token() -> str:
app = msal.ConfidentialClientApplication(
CLIENT_ID,
authority=AUTHORITY,
client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=SCOPE)
if "access_token" not in token:
raise RuntimeError(f"Graph auth failed: {token}")
return token["access_token"]
def send_mail(
to: Union[str, List[str]],
subject: str,
body: str,
*,
html: bool = False,
):
"""
Send email via Microsoft Graph.
:param to: email or list of emails
:param subject: subject
:param body: email body
:param html: True = HTML, False = plain text
"""
if isinstance(to, str):
to = [to]
payload = {
"message": {
"subject": subject,
"body": {
"contentType": "HTML" if html else "Text",
"content": body,
},
"toRecipients": [
{"emailAddress": {"address": addr}} for addr in to
],
},
"saveToSentItems": "true",
}
headers = {
"Authorization": f"Bearer {_get_token()}",
"Content-Type": "application/json",
}
r = requests.post(
f"https://graph.microsoft.com/v1.0/users/{SENDER}/sendMail",
headers=headers,
json=payload,
timeout=30,
)
if r.status_code != 202:
raise RuntimeError(
f"sendMail failed [{r.status_code}]: {r.text}"
)

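`send_mail()` posts a JSON body in Graph's `sendMail` shape. A sketch of just the payload construction, mirroring the function above so the structure can be inspected without credentials or network access (`build_payload` is an illustrative helper, not part of the module):

```python
from typing import List, Union

# Builds the same message dict send_mail() posts to
# /v1.0/users/{SENDER}/sendMail.
def build_payload(to: Union[str, List[str]], subject: str, body: str,
                  html: bool = False) -> dict:
    if isinstance(to, str):
        to = [to]
    return {
        "message": {
            "subject": subject,
            "body": {"contentType": "HTML" if html else "Text", "content": body},
            "toRecipients": [{"emailAddress": {"address": a}} for a in to],
        },
        "saveToSentItems": "true",
    }

p = build_payload("a@example.com", "Hi", "Test")
print(p["message"]["toRecipients"])
```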
342
Reporter_ReadNewTorrents.py Normal file

@@ -0,0 +1,342 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import pymysql
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.chrome.options import Options
import time
import re
import urllib.parse as urlparse
from pathlib import Path
import json
import requests
import datetime
import sys
from EmailMessagingGraph import send_mail
# ============================================================
# RUNTIME INFO
# ============================================================
RUN_START = datetime.datetime.now()
processed_count = 0
new_torrent_count = 0
existing_torrent_count = 0
new_titles = []
print(f"🕒 Run started at {RUN_START:%Y-%m-%d %H:%M:%S}")
sys.stdout.flush()
# ============================================================
# 1) MySQL CONNECTION
# ============================================================
db = pymysql.connect(
host="192.168.1.76",
port=3306,
user="root",
password="Vlado9674+",
database="torrents",
charset="utf8mb4",
autocommit=True,
)
cursor = db.cursor()
# ============================================================
# 2) Selenium setup
# ============================================================
COOKIE_FILE = Path("sktorrent_cookies.json")
START_URL = (
"https://sktorrent.eu/torrent/torrents.php"
"?search=&category=24&zaner=&jazyk=&active=0"
)
chrome_options = Options()
chrome_options.add_argument("--start-maximized")
chrome_options.add_argument("--disable-notifications")
chrome_options.add_argument("--disable-popup-blocking")
chrome_options.add_argument("--disable-extensions")
driver = webdriver.Chrome(options=chrome_options)
driver.set_window_position(380, 50)
driver.set_window_size(1350, 1000)
driver.get("https://sktorrent.eu")
if COOKIE_FILE.exists():
with open(COOKIE_FILE, "r", encoding="utf-8") as f:
cookies = json.load(f)
for c in cookies:
driver.add_cookie(c)
print("🍪 Cookies loaded.")
else:
print("⚠️ Cookie file not found; login may be required.")
# ============================================================
# 3) requests.Session from Selenium cookies
# ============================================================
requests_session = requests.Session()
for ck in driver.get_cookies():
requests_session.cookies.set(ck["name"], ck["value"])
print("🔗 Requests session initialized.")
# ============================================================
# 4) Popup handler
# ============================================================
def close_popup_if_any():
try:
driver.execute_script("try { interstitialBox.closeit(); } catch(e) {}")
time.sleep(0.5)
except Exception:
pass
# ============================================================
# 5) Parse one torrent row
# ============================================================
def parse_row(cells):
category = cells[0].text.strip()
try:
download_a = cells[1].find_element(By.TAG_NAME, "a")
download_link = download_a.get_attribute("href")
except Exception:
return None
parsed_dl = urlparse.urlparse(download_link)
dl_query = urlparse.parse_qs(parsed_dl.query)
torrent_filename = dl_query.get("f", ["unknown.torrent"])[0]
title_links = cells[2].find_elements(By.TAG_NAME, "a")
if not title_links:
return None
a_tag = title_links[0]
visible_name = a_tag.text.strip()
full_title = a_tag.get_attribute("title")
details_link = a_tag.get_attribute("href")
parsed = urlparse.urlparse(details_link)
query = urlparse.parse_qs(parsed.query)
if "id" not in query:
return None
torrent_hash = query["id"][0]
text_block = cells[2].get_attribute("innerText")
text_block_clean = " ".join(text_block.split())
size_match = re.search(r"Velkost ([0-9\.]+ ?[KMG]B)", text_block_clean, re.IGNORECASE)
added_match = re.search(r"Pridany (.+?)(?:\sObrázok|$)", text_block_clean, re.IGNORECASE)
size_pretty = size_match.group(1) if size_match else None
added_pretty = added_match.group(1) if added_match else None
# ======================================================
# DATE PROCESSING: convert the site's format to a MySQL DATETIME
# ======================================================
added_mysql = None
if added_pretty:
# "29/11/2025 o 02:29" → "29/11/2025 02:29"
clean = added_pretty.replace(" o ", " ").strip()
parts = clean.split(" ")
date_part = parts[0]
time_part = parts[1] if len(parts) > 1 else "00:00:00"
# if seconds are missing, append them
if len(time_part.split(":")) == 2:
time_part += ":00"
day, month, year = date_part.split("/")
added_mysql = f"{year}-{month}-{day} {time_part}"
# ======================================================
# Image preview
# ======================================================
img_link = None
try:
image_a = cells[2].find_element(
By.XPATH,
".//a[contains(text(),'Obrázok')]"
)
mouseover = image_a.get_attribute("onmouseover")
img_match = re.search(r"src=([^ ]+)", mouseover)
if img_match:
img_link = img_match.group(1).replace("'", "").strip()
if img_link.startswith("//"):
img_link = "https:" + img_link
except Exception:
pass
seeders_a = cells[4].find_element(By.TAG_NAME, "a")
seeders_number = int(seeders_a.text.strip())
seeders_link = seeders_a.get_attribute("href")
leechers_a = cells[5].find_element(By.TAG_NAME, "a")
leechers_number = int(leechers_a.text.strip())
leechers_link = leechers_a.get_attribute("href")
cursor.execute(
"SELECT torrent_content FROM torrents WHERE torrent_hash=%s",
(torrent_hash,),
)
row = cursor.fetchone()
already_have_torrent = row is not None and row[0] is not None
torrent_content = None
if not already_have_torrent:
time.sleep(3)
try:
resp = requests_session.get(download_link)
resp.raise_for_status()
torrent_content = resp.content
except Exception:
torrent_content = None
return {
"torrent_hash": torrent_hash,
"details_link": details_link,
"category": category,
"title_visible": visible_name,
"title_full": full_title,
"size_pretty": size_pretty,
"added_datetime": added_mysql,
"preview_image": img_link,
"seeders": seeders_number,
"seeders_link": seeders_link,
"leechers": leechers_number,
"leechers_link": leechers_link,
"torrent_filename": torrent_filename,
"torrent_content": torrent_content if not already_have_torrent else None,
"is_new_torrent": not already_have_torrent,
}
# ============================================================
# 6) INSERT SQL
# ============================================================
insert_sql = """
INSERT INTO torrents (
torrent_hash, details_link, category, title_visible, title_full,
size_pretty, added_datetime, preview_image,
seeders, seeders_link, leechers, leechers_link,
torrent_filename, torrent_content
) VALUES (
%(torrent_hash)s, %(details_link)s, %(category)s, %(title_visible)s, %(title_full)s,
%(size_pretty)s, %(added_datetime)s, %(preview_image)s,
%(seeders)s, %(seeders_link)s, %(leechers)s, %(leechers_link)s,
%(torrent_filename)s, %(torrent_content)s
)
ON DUPLICATE KEY UPDATE
details_link = VALUES(details_link),
category = VALUES(category),
title_visible = VALUES(title_visible),
title_full = VALUES(title_full),
size_pretty = VALUES(size_pretty),
added_datetime = VALUES(added_datetime),
preview_image = VALUES(preview_image),
seeders = VALUES(seeders),
seeders_link = VALUES(seeders_link),
leechers = VALUES(leechers),
leechers_link = VALUES(leechers_link),
torrent_filename = VALUES(torrent_filename),
torrent_content = COALESCE(VALUES(torrent_content), torrent_content);
"""
# ============================================================
# 7) PROCESS FIRST PAGE ONLY
# ============================================================
print("\n🌐 Loading FIRST page")
driver.get(START_URL)
time.sleep(2)
close_popup_if_any()
rows = driver.find_elements(By.CSS_SELECTOR, "table tr")
real_rows = [
r.find_elements(By.TAG_NAME, "td")
for r in rows
if len(r.find_elements(By.TAG_NAME, "td")) == 7
]
print(f"📄 Found {len(real_rows)} torrent rows")
for cells in real_rows:
try:
data = parse_row(cells)
except Exception as e:
print(f"⚠️ parse_row failed: {e}")
continue
if not data:
continue
processed_count += 1
if data["is_new_torrent"]:
new_torrent_count += 1
new_titles.append(data["title_visible"])
else:
existing_torrent_count += 1
print("💾 Saving:", data["title_visible"])
cursor.execute(insert_sql, data)
# ============================================================
# 8) SEND EMAIL REPORT
# ============================================================
RUN_END = datetime.datetime.now()
subject = f"SKTorrent hourly run {RUN_START:%Y-%m-%d %H:%M}"
lines = [
f"Run started: {RUN_START:%Y-%m-%d %H:%M:%S}",
f"Run finished: {RUN_END:%Y-%m-%d %H:%M:%S}",
"",
f"Processed torrents: {processed_count}",
f"New torrent files downloaded: {new_torrent_count}",
f"Already known torrents: {existing_torrent_count}",
]
if new_titles:
lines.append("")
lines.append("New torrents:")
for t in new_titles:
lines.append(f"- {t}")
body = "\n".join(lines)
send_mail(
to="vladimir.buzalka@buzalka.cz",
subject=subject,
body=body,
html=False,
)
print("📧 Email report sent.")
driver.quit()
print("🎉 DONE")

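The date block above turns the site's "DD/MM/YYYY o HH:MM" strings into MySQL DATETIME values. The same steps as a standalone function (`to_mysql_datetime` is an illustrative name):

```python
# "29/11/2025 o 02:29" → "2025-11-29 02:29:00"
def to_mysql_datetime(added_pretty: str) -> str:
    clean = added_pretty.replace(" o ", " ").strip()
    parts = clean.split(" ")
    date_part = parts[0]
    time_part = parts[1] if len(parts) > 1 else "00:00:00"
    # append seconds if missing
    if len(time_part.split(":")) == 2:
        time_part += ":00"
    day, month, year = date_part.split("/")
    return f"{year}-{month}-{day} {time_part}"

print(to_mysql_datetime("29/11/2025 o 02:29"))  # → 2025-11-29 02:29:00
```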

@@ -0,0 +1,337 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from datetime import datetime, timedelta
import pymysql
import qbittorrentapi
import bencodepy
from EmailMessagingGraph import send_mail
# ==============================
# ⚙ CONFIGURATION
# ==============================
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3307,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
QBT_CONFIG = {
"host": "192.168.1.76",
"port": 8080,
"username": "admin",
"password": "adminadmin",
}
MAX_ACTIVE_DOWNLOADS = 10
DEAD_TORRENT_MINUTES = 60
DEFAULT_SAVE_PATH = None
MAIL_TO = "vladimir.buzalka@buzalka.cz"
MAX_LIST_ITEMS = 50 # cap lists in email
# ==============================
# 🧮 RUNTIME STATS + LISTS
# ==============================
RUN_START = datetime.now()
stat_synced = 0
stat_completed = 0
stat_dead = 0
stat_enqueued = 0
deleted_completed = [] # list[str]
deleted_dead = [] # list[str]
added_new = [] # list[str]
active_downloading = [] # list[str]
# ==============================
# 🔧 CONNECT
# ==============================
db = pymysql.connect(**DB_CONFIG)
cursor = db.cursor(pymysql.cursors.DictCursor)
qb = qbittorrentapi.Client(**QBT_CONFIG)
try:
qb.auth_log_in()
print("✅ Connected to qBittorrent.")
except Exception as e:
raise SystemExit(f"❌ Could not connect to qBittorrent: {e}")
# ==============================
# 🧪 TORRENT VALIDATION
# ==============================
def is_valid_torrent(blob: bytes) -> bool:
try:
data = bencodepy.decode(blob)
return isinstance(data, dict) and b"info" in data
except Exception:
return False
# ==============================
# 🔄 SYNC FROM QB → DB
# ==============================
def sync_qb_to_db():
global stat_synced
torrents = qb.torrents_info()
stat_synced = len(torrents)
for t in torrents:
completion_dt = None
# qBittorrent reports completion_on = -1 for never-completed torrents;
# datetime.fromtimestamp(-1) raises OSError on Windows, so guard for > 0.
if getattr(t, "completion_on", 0) > 0:
try:
completion_dt = datetime.fromtimestamp(t.completion_on)
except Exception:
pass
cursor.execute("""
UPDATE torrents
SET qb_added = 1,
qb_hash = COALESCE(qb_hash, %s),
qb_state = %s,
qb_progress = %s,
qb_savepath = %s,
qb_completed_datetime =
IF(%s IS NOT NULL AND qb_completed_datetime IS NULL, %s, qb_completed_datetime),
qb_last_update = NOW()
WHERE qb_hash = %s OR torrent_hash = %s
""", (
t.hash,
t.state,
float(t.progress) * 100.0,
getattr(t, "save_path", None),
completion_dt,
completion_dt,
t.hash,
t.hash,
))
# ==============================
# 🧹 HANDLE COMPLETED + DEAD
# ==============================
def handle_completed_and_dead():
global stat_completed, stat_dead
torrents = qb.torrents_info()
for t in torrents:
t_hash = t.hash
state = t.state
progress = float(t.progress)
# ✔ COMPLETED
if progress >= 1.0 or state in {"completed", "uploading", "stalledUP", "queuedUP"}:
stat_completed += 1
deleted_completed.append(t.name)
try:
qb.torrents_delete(torrent_hashes=t_hash, delete_files=False)
except Exception as e:
# keep name in report; just note error in DB state if you want later
print(f"⚠️ delete (keep data) failed for {t.name}: {e}")
cursor.execute("""
UPDATE torrents
SET qb_state='completed',
qb_progress=100,
qb_completed_datetime=NOW(),
qb_last_update=NOW()
WHERE qb_hash=%s OR torrent_hash=%s
""", (t_hash, t_hash))
continue
# ❌ DEAD (never seen complete: last_seen == -1)
try:
props = qb.torrents_properties(t_hash)
except Exception:
continue
if getattr(props, "last_seen", 0) == -1:
added_dt = getattr(t, "added_on", 0)
if added_dt:
if datetime.now() - datetime.fromtimestamp(added_dt) > timedelta(minutes=DEAD_TORRENT_MINUTES):
stat_dead += 1
deleted_dead.append(t.name)
try:
qb.torrents_delete(torrent_hashes=t_hash, delete_files=True)
except Exception as e:
print(f"⚠️ delete (files) failed for {t.name}: {e}")
cursor.execute("""
UPDATE torrents
SET qb_state='dead',
qb_last_update=NOW()
WHERE qb_hash=%s OR torrent_hash=%s
""", (t_hash, t_hash))
# ==============================
# 📊 ACTIVE DOWNLOADS
# ==============================
def count_active_downloads():
return sum(1 for t in qb.torrents_info() if float(t.progress) < 1.0)
def snapshot_active_downloading():
"""
Capture current actively downloading torrents (progress < 100%).
"""
active = []
for t in qb.torrents_info():
prog = float(t.progress)
if prog < 1.0:
active.append(f"{t.name} — {prog*100:.1f}% — {t.state}")
return sorted(active)
# ==============================
# ENQUEUE NEW TORRENTS
# ==============================
def enqueue_new_torrents():
global stat_enqueued
active = count_active_downloads()
if active >= MAX_ACTIVE_DOWNLOADS:
return
slots = MAX_ACTIVE_DOWNLOADS - active
cursor.execute("""
SELECT id, torrent_hash, torrent_content, torrent_filename
FROM torrents
WHERE (qb_added IS NULL OR qb_added = 0)
AND torrent_content IS NOT NULL
ORDER BY added_datetime DESC
LIMIT %s
""", (slots,))
for row in cursor.fetchall():
blob = row["torrent_content"]
if not blob:
continue
if not is_valid_torrent(blob):
cursor.execute("""
UPDATE torrents
SET qb_state='invalid',
torrent_content=NULL,
qb_last_update=NOW()
WHERE id=%s
""", (row["id"],))
continue
# Add torrent
try:
qb.torrents_add(torrent_files=blob, savepath=DEFAULT_SAVE_PATH)
except Exception as e:
print(f"❌ Failed to add {row['torrent_hash']}: {e}")
continue
stat_enqueued += 1
added_new.append(row.get("torrent_filename") or row["torrent_hash"])
cursor.execute("""
UPDATE torrents
SET qb_added=1,
qb_hash=COALESCE(qb_hash, %s),
qb_state='added',
qb_last_update=NOW()
WHERE id=%s
""", (row["torrent_hash"], row["id"]))
# ==============================
# ✉️ EMAIL HELPERS
# ==============================
def format_list(title: str, items: list[str]) -> list[str]:
lines = []
if not items:
return [f"{title}: (none)"]
lines.append(f"{title}: {len(items)}")
shown = items[:MAX_LIST_ITEMS]
for it in shown:
lines.append(f" - {it}")
if len(items) > MAX_LIST_ITEMS:
lines.append(f" ... (+{len(items) - MAX_LIST_ITEMS} more)")
return lines
# ==============================
# 🏁 MAIN (ONE RUN)
# ==============================
print("🚀 QB worker run started")
try:
sync_qb_to_db()
handle_completed_and_dead()
enqueue_new_torrents()
# Snapshot after enqueue/deletions, so email reflects end-state
active_downloading = snapshot_active_downloading()
finally:
db.close()
# ==============================
# 📧 EMAIL REPORT
# ==============================
RUN_END = datetime.now()
body_lines = [
f"Run started : {RUN_START:%Y-%m-%d %H:%M:%S}",
f"Run finished: {RUN_END:%Y-%m-%d %H:%M:%S}",
"",
f"QB torrents synced : {stat_synced}",
f"Completed removed : {stat_completed}",
f"Dead removed : {stat_dead}",
f"New torrents added : {stat_enqueued}",
f"Active downloads : {len(active_downloading)}",
"",
]
body_lines += format_list("Deleted (completed, kept data)", deleted_completed)
body_lines.append("")
body_lines += format_list("Deleted (dead, deleted files)", deleted_dead)
body_lines.append("")
body_lines += format_list("Newly added to qBittorrent", added_new)
body_lines.append("")
body_lines += format_list("Actively downloading now", active_downloading)
send_mail(
to=MAIL_TO,
subject=f"qBittorrent worker {RUN_START:%Y-%m-%d %H:%M}",
body="\n".join(body_lines),
html=False,
)
print("📧 Email report sent")
print("🎉 DONE")

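`format_list()` caps long lists at `MAX_LIST_ITEMS` and appends an elision line for the rest. A self-contained copy with a smaller cap to show the truncation:

```python
# Same logic as the email helper above, with a small cap for demonstration.
MAX_LIST_ITEMS = 3

def format_list(title: str, items: list) -> list:
    if not items:
        return [f"{title}: (none)"]
    lines = [f"{title}: {len(items)}"]
    for it in items[:MAX_LIST_ITEMS]:
        lines.append(f" - {it}")
    if len(items) > MAX_LIST_ITEMS:
        lines.append(f" ... (+{len(items) - MAX_LIST_ITEMS} more)")
    return lines

print(format_list("Dead", ["a", "b", "c", "d", "e"]))
```

The extra names beyond the cap are summarized rather than dropped silently, so the email always reports the true total.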
0
Results/.gitkeep Normal file


@@ -0,0 +1,118 @@
import pymysql
import qbittorrentapi
# ============================================================
# CONFIG
# ============================================================
DRY_RUN = False
QBT_URL = "https://vladob.zen.usbx.me/qbittorrent"
QBT_USER = "vladob"
QBT_PASS = "jCni3U6d#y4bfcm"
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3306,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
# ============================================================
# MAIN
# ============================================================
def main():
print("=" * 60)
print("QB COMPLETED CLEANUP")
print(f"DRY_RUN = {DRY_RUN}")
print("=" * 60)
# --------------------------------------------------------
# DB CONNECT
# --------------------------------------------------------
db = pymysql.connect(**DB_CONFIG)
cursor = db.cursor()
# --------------------------------------------------------
# CONNECT QB
# --------------------------------------------------------
print("🔌 Connecting qBittorrent...")
qbt = qbittorrentapi.Client(
host=QBT_URL,
username=QBT_USER,
password=QBT_PASS,
VERIFY_WEBUI_CERTIFICATE=False
)
qbt.auth_log_in()
torrents = qbt.torrents_info()
print(f"Loaded torrents from qB: {len(torrents)}")
processed = 0
removed = 0
# --------------------------------------------------------
# LOOP
# --------------------------------------------------------
for t in torrents:
if getattr(t, "amount_left", 1) != 0:
continue
print(f"✅ COMPLETED → {t.name}")
# ----------------------------------------------------
# UPDATE DB
# ----------------------------------------------------
if not DRY_RUN:
cursor.execute("""
UPDATE torrents
SET
qb_completed_datetime = COALESCE(qb_completed_datetime, NOW()),
qb_state = %s,
qb_last_update = NOW()
WHERE torrent_hash = %s
""", (t.state, t.hash.lower()))
else:
print(f"[DRY] Would update DB → {t.name}")
processed += 1
# ----------------------------------------------------
# REMOVE FROM QB (KEEP FILES)
# ----------------------------------------------------
try:
if DRY_RUN:
print(f"[DRY] Would remove torrent from qB → {t.name}")
else:
qbt.torrents_delete(
torrent_hashes=t.hash,
delete_files=False
)
removed += 1
except Exception as e:
print(f"❌ Remove failed: {t.name} -> {e}")
# --------------------------------------------------------
# SUMMARY
# --------------------------------------------------------
print("\n" + "=" * 60)
print(f"Completed processed: {processed}")
print(f"Removed from qB: {removed}")
print("=" * 60)
db.close()
if __name__ == "__main__":
main()


@@ -0,0 +1,89 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import pymysql
import bencodepy
# ===============================
# DB CONFIG
# ===============================
DB_CONFIG = dict(
host="192.168.1.76",
user="root",
password="Vlado9674+",
database="torrents",
charset="utf8mb4"
)
LIMIT = 5  # how many torrents to display
# ===============================
# TORRENT PARSER
# ===============================
def parse_torrent(blob):
data = bencodepy.decode(blob)
info = data[b'info']
files = []
# multi-file torrent
if b'files' in info:
for f in info[b'files']:
path = "/".join(p.decode(errors="ignore") for p in f[b'path'])
size = f[b'length']
files.append((path, size))
# single-file torrent
else:
name = info[b'name'].decode(errors="ignore")
size = info[b'length']
files.append((name, size))
return files
# ===============================
# MAIN
# ===============================
def main():
conn = pymysql.connect(**DB_CONFIG)
cur = conn.cursor()
cur.execute("""
SELECT id, title_visible, qb_savepath, torrent_content
FROM torrents
WHERE torrent_content IS NOT NULL
LIMIT %s
""", (LIMIT,))
rows = cur.fetchall()
for tid, title, savepath, blob in rows:
print("\n" + "="*80)
print(f"Torrent ID : {tid}")
print(f"Title : {title}")
print(f"Savepath : {savepath}")
try:
files = parse_torrent(blob)
print(f"Files inside torrent: {len(files)}")
for path, size in files:
print(f" {size:>12} B {path}")
except Exception as e:
print("ERROR parsing torrent:", e)
cur.close()
conn.close()
if __name__ == "__main__":
main()


@@ -0,0 +1,214 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import os
import pymysql
import bencodepy
from tqdm import tqdm
# =====================================
# CONFIG
# =====================================
ULTRACC_ROOT = r"\\tower\torrents\ultracc"
DRY_MODE = False
DB_CONFIG = dict(
host="192.168.1.76",
user="root",
password="Vlado9674+",
database="torrents",
charset="utf8mb4"
)
# =====================================
# TORRENT PARSER
# =====================================
def parse_torrent(blob):
data = bencodepy.decode(blob)
info = data[b'info']
files = []
# multi file
if b'files' in info:
for f in info[b'files']:
rel_path = "/".join(p.decode(errors="ignore") for p in f[b'path'])
size = f[b'length']
files.append((rel_path, size))
multi = True
else:
name = info[b'name'].decode(errors="ignore")
size = info[b'length']
files.append((name, size))
multi = False
return files, multi
# =====================================
# BUILD FILESYSTEM INDEX
# =====================================
import pickle
INDEX_FILE = r"U:\PycharmProjects\Torrents\fs_index_ultracc.pkl"
FORCE_REBUILD = False
def build_fs_index():
# ============================
# LOAD EXISTING INDEX
# ============================
if os.path.exists(INDEX_FILE) and not FORCE_REBUILD:
print("Loading saved filesystem index...")
with open(INDEX_FILE, "rb") as f:
index = pickle.load(f)
print(f"Index loaded ({len(index)} keys)")
return index
# ============================
# BUILD NEW INDEX
# ============================
print("Indexing the filesystem...")
index = {}
pocet = 0
for root, _, files in os.walk(ULTRACC_ROOT):
for f in files:
full = os.path.join(root, f)
try:
size = os.path.getsize(full)
except OSError:
continue
key = (f.lower(), size)
pocet += 1
if pocet % 100 == 0:
print(pocet)
index.setdefault(key, []).append(full)
print(f"The index contains {len(index)} unique files")
# ============================
# SAVE INDEX
# ============================
print("Saving the index to disk...")
with open(INDEX_FILE, "wb") as f:
pickle.dump(index, f, protocol=pickle.HIGHEST_PROTOCOL)
print("Index saved")
return index
# =====================================
# ROOT VALIDATION
# =====================================
def validate_root(root, torrent_files):
for rel_path, size in torrent_files:
check = os.path.join(root, rel_path.replace("/", os.sep))
if not os.path.exists(check):
return False
return True
# =====================================
# MAIN
# =====================================
def main():
fs_index = build_fs_index()
conn = pymysql.connect(**DB_CONFIG)
cur = conn.cursor()
cur.execute("""
SELECT id, torrent_content
FROM torrents
        WHERE torrent_content IS NOT NULL AND physical_exists = FALSE
""")
rows = cur.fetchall()
print(f"Torrentů ke kontrole: {len(rows)}")
success = 0
for tid, blob in tqdm(rows):
try:
torrent_files, multi = parse_torrent(blob)
            # take the largest file for the lookup
rel_path, size = max(torrent_files, key=lambda x: x[1])
fname = os.path.basename(rel_path).lower()
key = (fname, size)
if key not in fs_index:
continue
found = False
for full_path in fs_index[key]:
if multi:
                # root = full path minus the torrent-relative path
root = full_path[:-len(rel_path)]
root = root.rstrip("\\/")
else:
# single file
root = os.path.dirname(full_path)
if validate_root(root, torrent_files):
found = True
success += 1
                    print(f"[FOUND] Torrent {tid} → {root}")
if not DRY_MODE:
cur.execute("""
UPDATE torrents
SET physical_exists = 1
WHERE id = %s
""", (tid,))
break
            if not found:
                pass  # not found on disk — physical_exists stays unchanged
except Exception as e:
print(f"ERROR torrent {tid}: {e}")
if not DRY_MODE:
conn.commit()
print(f"Celkem nalezeno: {success}")
cur.close()
conn.close()
if __name__ == "__main__":
main()
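The lookup-and-validate flow above can be sketched in isolation — a minimal, hypothetical version of the index build and root derivation (function names are illustrative, not from the script):

```python
import os

def build_index(paths_with_sizes):
    """Map (lowercased basename, size) -> list of full paths, as the script does."""
    index = {}
    for full, size in paths_with_sizes:
        key = (os.path.basename(full).lower(), size)
        index.setdefault(key, []).append(full)
    return index

def derive_root(full_path, rel_path):
    """Strip the torrent-relative path off a matched full path to get the root."""
    root = full_path[:-len(rel_path)]
    return root.rstrip("\\/")
```

The slicing in `derive_root` relies on `rel_path` and the tail of `full_path` having equal length, which holds because `/` and `\` are both single characters.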

@@ -0,0 +1,149 @@
import pymysql
import qbittorrentapi
from datetime import datetime
# ============================================================
# CONFIG
# ============================================================
DRY_RUN = False # ← change to True for testing
QBT_URL = "https://vladob.zen.usbx.me/qbittorrent"
QBT_USER = "vladob"
QBT_PASS = "jCni3U6d#y4bfcm"
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3306,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
# ============================================================
# CONNECT
# ============================================================
def connect_qbt():
qbt = qbittorrentapi.Client(
host=QBT_URL,
username=QBT_USER,
password=QBT_PASS,
VERIFY_WEBUI_CERTIFICATE=False
)
qbt.auth_log_in()
return qbt
def connect_db():
return pymysql.connect(**DB_CONFIG)
# ============================================================
# MAIN
# ============================================================
def main():
print("🚀 UltraCC completed torrent cleanup")
print(f"DRY_RUN = {DRY_RUN}")
print("------------------------------------------------")
qbt = connect_qbt()
db = connect_db()
cursor = db.cursor()
torrents = qbt.torrents_info()
removed_count = 0
skipped_count = 0
updated_db_count = 0
for t in torrents:
# ----------------------------------------------------
# Only completed torrents
# ----------------------------------------------------
        # completion_on is -1 for torrents that never completed, and
        # datetime.fromtimestamp(-1) raises OSError on Windows — skip those
        if not t.completion_on or t.completion_on < 0:
            continue
thash = t.hash.lower()
name = t.name
state = t.state
completion_dt = datetime.fromtimestamp(t.completion_on)
# ----------------------------------------------------
# Check DB
# ----------------------------------------------------
cursor.execute("""
SELECT qb_completed_datetime
FROM torrents
WHERE torrent_hash = %s
""", (thash,))
row = cursor.fetchone()
# ====================================================
# CASE 1 — Torrent completed in qBittorrent but NOT in DB
# → Update DB + Remove torrent
# ====================================================
if not row or not row[0]:
print(f"✔ COMPLETED (DB update required): {name}")
print(f" Completion time: {completion_dt}")
if not DRY_RUN:
cursor.execute("""
UPDATE torrents
SET
qb_state = %s,
qb_progress = 1,
qb_completed_datetime = %s,
qb_last_update = NOW()
WHERE torrent_hash = %s
""", (f"completed_removed ({state})", completion_dt, thash))
updated_db_count += 1
# Remove torrent
if DRY_RUN:
print(" [DRY] Would remove torrent")
else:
qbt.torrents_delete(delete_files=False, torrent_hashes=thash)
removed_count += 1
continue
# ====================================================
# CASE 2 — Torrent completed in qBittorrent AND already completed in DB
# → Just remove torrent
# ====================================================
if row and row[0]:
print(f"✔ COMPLETED (Already in DB → removing): {name}")
if DRY_RUN:
print(" [DRY] Would remove torrent")
else:
qbt.torrents_delete(delete_files=False, torrent_hashes=thash)
removed_count += 1
continue
skipped_count += 1
# =========================================================
# SUMMARY
# =========================================================
print("------------------------------------------------")
print(f"Removed torrents : {removed_count}")
print(f"DB updated : {updated_db_count}")
print(f"Skipped torrents : {skipped_count}")
print(f"DRY_RUN : {DRY_RUN}")
print("------------------------------------------------")
if __name__ == "__main__":
main()
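As the commit history notes, qBittorrent reports `completion_on = -1` for torrents that never completed, and `datetime.fromtimestamp(-1)` raises `OSError` on Windows. A defensive conversion helper (hypothetical name, not in the script) would look like:

```python
from datetime import datetime

def safe_completion(ts):
    """Return a datetime for a valid completion timestamp, else None."""
    if not ts or ts < 0:  # 0 and -1 both mean "never completed"
        return None
    try:
        return datetime.fromtimestamp(ts)
    except (OSError, ValueError, OverflowError):
        return None
```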

@@ -0,0 +1,124 @@
import pymysql
import qbittorrentapi
from datetime import datetime
# ============================================================
# CONFIG
# ============================================================
QBT_URL = "https://vladob.zen.usbx.me/qbittorrent"
QBT_USER = "vladob"
QBT_PASS = "jCni3U6d#y4bfcm"
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3306,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
# ============================================================
# CONNECT
# ============================================================
def connect_qbt():
qbt = qbittorrentapi.Client(
host=QBT_URL,
username=QBT_USER,
password=QBT_PASS,
VERIFY_WEBUI_CERTIFICATE=False
)
qbt.auth_log_in()
return qbt
def connect_db():
return pymysql.connect(**DB_CONFIG)
# ============================================================
# MAIN AUDIT
# ============================================================
def main():
print("🔎 UltraCC → DB COMPLETION AUDIT")
print("READ ONLY MODE")
print("------------------------------------------------")
qbt = connect_qbt()
db = connect_db()
cursor = db.cursor()
torrents = qbt.torrents_info()
total_completed = 0
not_in_db = 0
db_no_completion = 0
db_completed = 0
for t in torrents:
# Only completed torrents
        # completion_on = -1 means never completed — skip those too
        if not t.completion_on or t.completion_on < 0:
            continue
total_completed += 1
thash = t.hash.lower()
name = t.name
completion_dt = datetime.fromtimestamp(t.completion_on)
# ---------------------------------------------
# Query DB
# ---------------------------------------------
cursor.execute("""
SELECT qb_completed_datetime
FROM torrents
WHERE torrent_hash = %s
""", (thash,))
row = cursor.fetchone()
# ---------------------------------------------
# Status classification
# ---------------------------------------------
if not row:
status = "❌ NOT_IN_DB"
not_in_db += 1
elif row[0] is None:
status = "⚠ DB_NO_COMPLETION"
db_no_completion += 1
else:
status = "✅ DB_COMPLETED"
db_completed += 1
# ---------------------------------------------
# Output
# ---------------------------------------------
print(f"{status} | {name}")
print(f" Hash: {thash}")
print(f" UltraCC completion: {completion_dt}")
if row and row[0]:
print(f" DB completion : {row[0]}")
print()
# =================================================
# SUMMARY
# =================================================
print("------------------------------------------------")
print("📊 SUMMARY")
print(f"Completed torrents in UltraCC : {total_completed}")
print(f"NOT IN DB : {not_in_db}")
print(f"DB WITHOUT COMPLETION : {db_no_completion}")
print(f"DB COMPLETED : {db_completed}")
print("------------------------------------------------")
if __name__ == "__main__":
main()

Seedbox/50 MrtveTorrenty.py
@@ -0,0 +1,157 @@
import pymysql
import qbittorrentapi
from datetime import datetime, timedelta
# ============================================================
# CONFIG
# ============================================================
DRY_RUN = False                 # set True first to verify the output
DEAD_AFTER_HOURS = 72           # torrent must have been in qB at least this many hours
DEAD_PROGRESS_THRESHOLD = 95.0  # progress below this % after the window → dead
STUCK_AFTER_HOURS = 168         # 7 days — for torrents nearly done (>= 95%) but stuck
QBT_URL = "https://vladob.zen.usbx.me/qbittorrent"
QBT_USER = "vladob"
QBT_PASS = "jCni3U6d#y4bfcm"
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3306,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
# ============================================================
# CONNECT
# ============================================================
def connect_qbt():
qbt = qbittorrentapi.Client(
host=QBT_URL,
username=QBT_USER,
password=QBT_PASS,
VERIFY_WEBUI_CERTIFICATE=False
)
qbt.auth_log_in()
return qbt
def connect_db():
return pymysql.connect(**DB_CONFIG)
# ============================================================
# MAIN
# ============================================================
def main():
print("=" * 60)
print("MRTVÉ TORRENTY — UltraCC Seedbox cleanup")
print(f"DRY_RUN = {DRY_RUN}")
print(f"Kritérium A: progress < {DEAD_PROGRESS_THRESHOLD}% A v qB déle než {DEAD_AFTER_HOURS}h")
print(f"Kritérium B: progress >= {DEAD_PROGRESS_THRESHOLD}% ale < 100% A v qB déle než {STUCK_AFTER_HOURS}h (zaseknutý)")
print("=" * 60)
qbt = connect_qbt()
db = connect_db()
cursor = db.cursor()
torrents = qbt.torrents_info()
now = datetime.now()
deadline_a = now - timedelta(hours=DEAD_AFTER_HOURS)
deadline_b = now - timedelta(hours=STUCK_AFTER_HOURS)
dead_count = 0
skip_count = 0
error_count = 0
for t in torrents:
        # Skip completed torrents (script 40 handles those); completion_on
        # is -1 for never-completed torrents, so require a positive value
        if t.completion_on and t.completion_on > 0:
skip_count += 1
continue
added_on = t.added_on # unix timestamp
if not added_on:
skip_count += 1
continue
added_dt = datetime.fromtimestamp(added_on)
progress_pct = float(t.progress) * 100.0
age_hours = (now - added_dt).total_seconds() / 3600
        # ── Criterion A: low progress after 72h ──────────────
is_dead_a = (added_dt <= deadline_a) and (progress_pct < DEAD_PROGRESS_THRESHOLD)
        # ── Criterion B: stuck near 100% after 7 days ────────
is_dead_b = (added_dt <= deadline_b) and (progress_pct >= DEAD_PROGRESS_THRESHOLD) and (progress_pct < 100.0)
if not is_dead_a and not is_dead_b:
skip_count += 1
continue
        # ── Torrent meets the "dead" criteria ────────────────
thash = t.hash.lower()
reason = "nízký progress po 72h" if is_dead_a else "zaseknutý blízko 100% po 7 dnech"
print(f"\n💀 MRTVÝ ({reason}): {t.name}")
print(f" Přidán : {added_dt} ({age_hours:.1f}h zpět)")
print(f" Progress : {progress_pct:.1f}%")
print(f" Stav : {t.state}")
print(f" Seeds : {t.num_seeds} Peers: {t.num_leechs}")
if DRY_RUN:
print(f" [DRY] Smazal bych z qBittorrentu a označil v DB jako incomplete")
dead_count += 1
continue
        # Delete from qBittorrent (files too — incomplete data is useless)
try:
qbt.torrents_delete(torrent_hashes=thash, delete_files=True)
print(f" ✔ Smazán z qBittorrentu")
except Exception as e:
print(f" ❌ Smazání selhalo: {e}")
error_count += 1
continue
        # Mark as incomplete in the DB
cursor.execute("""
UPDATE torrents
SET
qb_state = 'incomplete',
qb_progress = %s,
qb_last_update = NOW()
WHERE torrent_hash = %s
""", (progress_pct, thash))
if cursor.rowcount > 0:
print(f" ✔ DB → incomplete (progress={progress_pct:.1f}%)")
else:
print(f" ⚠ DB: žádný řádek pro hash {thash}")
dead_count += 1
# ============================================================
# SUMMARY
# ============================================================
print("\n" + "=" * 60)
print(f"Mrtvé torrenty zpracováno : {dead_count}")
print(f"Přeskočeno : {skip_count}")
print(f"Chyby : {error_count}")
print(f"DRY_RUN : {DRY_RUN}")
print("=" * 60)
db.close()
if __name__ == "__main__":
main()
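The two dead-torrent criteria reduce to a pure function, which makes them easy to unit-test. This sketch mirrors the thresholds above (`classify_dead` is an illustrative name, not in the script):

```python
from datetime import datetime, timedelta

DEAD_AFTER_HOURS = 72
DEAD_PROGRESS_THRESHOLD = 95.0
STUCK_AFTER_HOURS = 168

def classify_dead(added_dt, progress_pct, now):
    """Return 'dead', 'stuck', or None per criteria A and B."""
    # Criterion A: low progress after 72h
    if added_dt <= now - timedelta(hours=DEAD_AFTER_HOURS) and progress_pct < DEAD_PROGRESS_THRESHOLD:
        return "dead"
    # Criterion B: nearly done but stuck for 7 days
    if added_dt <= now - timedelta(hours=STUCK_AFTER_HOURS) and DEAD_PROGRESS_THRESHOLD <= progress_pct < 100.0:
        return "stuck"
    return None
```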

@@ -0,0 +1,94 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import pymysql
import bencodepy
# ============================================================
# DB CONFIG — ADJUST TO YOUR SETUP
# ============================================================
DB_CONFIG = {
"host": "192.168.1.50",
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"cursorclass": pymysql.cursors.SSCursor # streaming
}
# ============================================================
# HELPERS
# ============================================================
def decode_if_bytes(value):
if isinstance(value, bytes):
return value.decode("utf-8", errors="replace")
return value
def parse_torrent(blob):
data = bencodepy.decode(blob)
info = data[b"info"]
torrent_name = decode_if_bytes(info[b"name"])
files = []
# multi-file torrent
if b"files" in info:
for f in info[b"files"]:
path = "/".join(decode_if_bytes(p) for p in f[b"path"])
length = f[b"length"]
files.append((path, length))
# single file torrent
else:
length = info[b"length"]
files.append((torrent_name, length))
return torrent_name, files
# ============================================================
# MAIN
# ============================================================
def main():
conn = pymysql.connect(**DB_CONFIG)
with conn.cursor() as cursor:
cursor.execute("""
SELECT id, title_visible, torrent_content
FROM torrents
WHERE torrent_content IS NOT NULL
""")
for row in cursor:
torrent_id = row[0]
title_visible = row[1]
blob = row[2]
try:
name, files = parse_torrent(blob)
print("=" * 70)
print(f"DB ID : {torrent_id}")
print(f"Title visible : {title_visible}")
print(f"Torrent name : {name}")
print("Files:")
for f, size in files:
print(f" - {f} ({size} bytes)")
except Exception as e:
print(f"ERROR parsing torrent ID {torrent_id}: {e}")
conn.close()
if __name__ == "__main__":
main()
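The single-file vs multi-file branch in `parse_torrent` can be shown on an already-decoded info dict, with no bencode library needed (`extract_files` is a hypothetical helper following the same logic):

```python
def extract_files(info):
    """List (path, size) pairs from a decoded bencode info dict (bytes keys)."""
    name = info[b"name"].decode("utf-8", errors="replace")
    if b"files" in info:  # multi-file torrent: join path components with "/"
        return [
            ("/".join(p.decode("utf-8", errors="replace") for p in f[b"path"]), f[b"length"])
            for f in info[b"files"]
        ]
    return [(name, info[b"length"])]  # single-file torrent
```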

@@ -0,0 +1,137 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import pymysql
import bencodepy
from openpyxl import Workbook
from openpyxl.utils import get_column_letter
# ============================================================
# DB CONFIG
# ============================================================
DB_CONFIG = {
"host": "192.168.1.50",
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"cursorclass": pymysql.cursors.SSCursor
}
OUTPUT_FILE = "torrent_report.xlsx"
# ============================================================
# HELPERS
# ============================================================
def decode_if_bytes(value):
if isinstance(value, bytes):
return value.decode("utf-8", errors="replace")
return value
def get_root_name(info):
# prefer UTF-8 variant
if b"name.utf-8" in info:
return decode_if_bytes(info[b"name.utf-8"])
return decode_if_bytes(info[b"name"])
def get_file_parts(file_entry):
# prefer UTF-8 variant
if b"path.utf-8" in file_entry:
return [decode_if_bytes(p) for p in file_entry[b"path.utf-8"]]
return [decode_if_bytes(p) for p in file_entry[b"path"]]
def parse_torrent(blob):
data = bencodepy.decode(blob)
info = data[b"info"]
root_name = get_root_name(info)
files = []
# =====================
# MULTI FILE TORRENT
# =====================
if b"files" in info:
for f in info[b"files"]:
parts = get_file_parts(f)
            # guard against duplicated root/root prefixes
if parts and parts[0] == root_name:
full_path = "/".join(parts)
else:
full_path = root_name + "/" + "/".join(parts)
files.append(full_path)
# =====================
# SINGLE FILE TORRENT
# =====================
else:
files.append(root_name)
return files
# ============================================================
# MAIN
# ============================================================
def main():
conn = pymysql.connect(**DB_CONFIG)
wb = Workbook()
ws = wb.active
ws.title = "Torrent report"
ws.append(["QB Status", "Title", "Torrent Path"])
with conn.cursor() as cursor:
cursor.execute("""
SELECT qb_state, title_visible, torrent_content
FROM torrents
WHERE torrent_content IS NOT NULL
""")
for qb_state, title_visible, blob in cursor:
qb_state = qb_state or "UNKNOWN"
title_visible = title_visible or ""
try:
files = parse_torrent(blob)
for f in files:
ws.append([qb_state, title_visible, f])
except Exception as e:
ws.append([qb_state, title_visible, f"ERROR: {e}"])
# autosize
for col in ws.columns:
max_len = 0
col_letter = get_column_letter(col[0].column)
for cell in col:
if cell.value:
max_len = max(max_len, len(str(cell.value)))
ws.column_dimensions[col_letter].width = min(max_len + 2, 90)
wb.save(OUTPUT_FILE)
conn.close()
print("DONE ->", OUTPUT_FILE)
if __name__ == "__main__":
main()
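The root/root de-duplication guard above handles torrents whose `path` entries already start with the root name. A stripped-down version (`join_torrent_path` is an illustrative name):

```python
def join_torrent_path(root_name, parts):
    """Prefix root_name unless the first path component already is it."""
    if parts and parts[0] == root_name:
        return "/".join(parts)
    return root_name + "/" + "/".join(parts)
```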

@@ -0,0 +1,48 @@
import pymysql
# =========================
# CONFIG
# =========================
DB_CONFIG = {
"host": "192.168.1.50",
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4"
}
SEARCH_NAME = "Balík audioknih"
# =========================
# MAIN
# =========================
conn = pymysql.connect(**DB_CONFIG)
with conn.cursor() as cursor:
cursor.execute("""
SELECT id, title_visible, torrent_content
FROM torrents
WHERE title_visible LIKE %s
LIMIT 1
""", ("%" + SEARCH_NAME + "%",))
row = cursor.fetchone()
if not row:
print("Torrent not found")
exit()
torrent_id, title, blob = row
filename = f"{title}.torrent".replace("/", "_")
with open(filename, "wb") as f:
f.write(blob)
print("Saved:", filename)
conn.close()
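The export replaces only `/` in the output filename; on Windows, other characters (`\ : * ? " < > |`) are also invalid. A broader sanitizer (hypothetical, not in the script) might look like:

```python
import re

def safe_filename(title, ext=".torrent"):
    """Replace characters that are invalid in Windows/Unix filenames with '_'."""
    return re.sub(r'[\\/:*?"<>|]', "_", title) + ext
```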

@@ -0,0 +1,220 @@
import pymysql
import requests
import json
import time
import re
import sys
from bs4 import BeautifulSoup
from datetime import datetime
# ============================================================
# CONFIG
# ============================================================
COOKIE_FILE = "sktorrent_cookies.json"
BASE_URL = "https://sktorrent.eu/torrent/torrents.php?active=0&category=24&order=data&by=DESC"
SLEEP_BETWEEN_PAGES = 2.0  # seconds between pages (so the site does not block us)
MAX_PAGES = 300            # safety cap — the script stops here at the latest
# This many consecutive pages without a single DB match = done (we reached new torrents)
STOP_AFTER_EMPTY_PAGES = 5
# This many consecutive 403 errors = abort (the site is blocking us)
STOP_AFTER_403 = 3
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3306,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
# ============================================================
# CONNECT
# ============================================================
def connect_db():
return pymysql.connect(**DB_CONFIG)
def build_session():
with open(COOKIE_FILE, "r", encoding="utf-8") as f:
cookies = json.load(f)
session = requests.Session()
session.headers["User-Agent"] = (
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
)
for c in cookies:
session.cookies.set(c["name"], c["value"], domain=c.get("domain", ""))
return session
# ============================================================
# PARSE ONE PAGE
# ============================================================
def parse_page(html):
    """
    Returns a list of dicts: {hash, seeders, leechers}
    """
soup = BeautifulSoup(html, "html.parser")
results = []
for row in soup.select("table tr"):
cells = row.find_all("td")
if len(cells) != 7:
continue
        # td[1] must contain a download.php?id=<hash> link
dl_link = cells[1].find("a", href=re.compile(r"download\.php\?id="))
if not dl_link:
continue
match = re.search(r"id=([a-f0-9]+)", dl_link["href"])
if not match:
continue
torrent_hash = match.group(1).lower()
# seeders = td[4], leechers = td[5]
seeders_text = cells[4].get_text(strip=True)
leechers_text = cells[5].get_text(strip=True)
try:
seeders = int(seeders_text)
except ValueError:
seeders = 0
try:
leechers = int(leechers_text)
except ValueError:
leechers = 0
results.append({
"hash": torrent_hash,
"seeders": seeders,
"leechers": leechers,
})
return results
# ============================================================
# MAIN
# ============================================================
def main():
sys.stdout.reconfigure(encoding="utf-8")
print("=" * 60)
print("AKTUALIZACE SEEDERS / LEECHERS — sktorrent.eu")
print(f"Spuštěno: {datetime.now():%Y-%m-%d %H:%M:%S}")
print("=" * 60)
session = build_session()
db = connect_db()
cursor = db.cursor()
    # Find the highest page number
r0 = session.get(f"{BASE_URL}&page=0", timeout=15)
all_page_nums = [int(m.group(1)) for m in re.finditer(r"page=(\d+)", r0.text)]
max_page = max(all_page_nums) if all_page_nums else MAX_PAGES
print(f"Max stránka na webu: {max_page}")
print(f"Prochází od stránky {max_page} směrem dolů...\n")
total_pages = 0
total_parsed = 0
total_updated = 0
total_skipped = 0
    consecutive_empty = 0  # consecutive pages without a single DB match
    consecutive_403 = 0    # consecutive 403 errors
    # Walk from the oldest page (the end) toward the newest (the start)
for page in range(max_page, -1, -1):
url = f"{BASE_URL}&page={page}"
try:
r = session.get(url, timeout=15)
r.raise_for_status()
            consecutive_403 = 0  # reset after a success
except requests.exceptions.HTTPError as e:
if e.response is not None and e.response.status_code == 403:
consecutive_403 += 1
print(f"⚠️ Stránka {page} — 403 Forbidden ({consecutive_403}/{STOP_AFTER_403})")
if consecutive_403 >= STOP_AFTER_403:
print(f"\n🛑 {STOP_AFTER_403}× 403 za sebou — web nás blokuje, přerušuji.")
break
                time.sleep(5)  # pause after a 403
else:
print(f"⚠️ Stránka {page} — chyba: {e}")
continue
except Exception as e:
print(f"⚠️ Stránka {page} — chyba: {e}")
continue
if "login.php" in r.url or "Prihlas sa" in r.text:
print("❌ Cookies expiraly — je potřeba se znovu přihlásit (spusť Selenium skript)")
break
rows = parse_page(r.text)
if not rows:
print(f" Stránka {page:3d} → prázdná, konec paginace.")
break
total_pages += 1
total_parsed += len(rows)
page_updated = 0
for item in rows:
cursor.execute("""
UPDATE torrents
SET
seeders = %s,
leechers = %s,
qb_last_update = NOW()
WHERE torrent_hash = %s
""", (item["seeders"], item["leechers"], item["hash"]))
if cursor.rowcount > 0:
total_updated += 1
page_updated += 1
else:
total_skipped += 1
print(f" Stránka {page:3d}{len(rows):2d} torrentů, "
f"updatováno: {page_updated:2d} (celkem: {total_updated})")
        # Stop once we reach the region of newer torrents (not yet in the DB)
if page_updated == 0:
consecutive_empty += 1
if consecutive_empty >= STOP_AFTER_EMPTY_PAGES:
print(f"\n{STOP_AFTER_EMPTY_PAGES} stránek po sobě bez shody → "
f"dorazili jsme k novějším torrentům, které nejsou v DB. Konec.")
break
else:
consecutive_empty = 0
time.sleep(SLEEP_BETWEEN_PAGES)
# ============================================================
# SUMMARY
# ============================================================
print()
print("=" * 60)
print(f"Hotovo: {datetime.now():%Y-%m-%d %H:%M:%S}")
print(f"Stránek zpracováno : {total_pages}")
print(f"Záznamů parsováno : {total_parsed}")
print(f"DB řádků updatováno: {total_updated}")
print(f"Nebylo v DB : {total_skipped}")
print("=" * 60)
db.close()
if __name__ == "__main__":
main()
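The per-row extraction in `parse_page` boils down to two small steps: pulling the hash out of the download link and coercing the seeder/leecher cells to integers. A minimal sketch (helper names are illustrative):

```python
import re

def extract_hash(href):
    """Pull the torrent hash out of a download.php?id=<hash> link, or None."""
    m = re.search(r"id=([a-f0-9]+)", href)
    return m.group(1).lower() if m else None

def to_int(text):
    """Seeder/leecher cells may hold non-numeric text; default to 0."""
    try:
        return int(text)
    except ValueError:
        return 0
```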

Seedbox/70 Manager.py
@@ -0,0 +1,383 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Download Manager — multi-client (UltraCC + local qBittorrent)
For each client, one run performs:
1. Completed torrents → remove from qBittorrent (keep the data), record in DB
2. Count free slots
3. Top up with new torrents by priority: seeders + size
Both clients share the same DB queue. A torrent claimed by one client
(qb_state='added') will not be offered to the other.
"""
import pymysql
import qbittorrentapi
import sys
from datetime import datetime, timedelta
# ============================================================
# CONFIG
# ============================================================
DEAD_AFTER_HOURS = 72          # progress < 95% after 72h → dead
DEAD_PROGRESS_THRESHOLD = 95.0
STUCK_AFTER_HOURS = 168        # progress >= 95% but < 100% after 7 days → dead
CLIENTS = [
{
"name": "UltraCC Seedbox",
"max_concurrent": 30,
"qbt": {
"host": "https://vladob.zen.usbx.me/qbittorrent",
"username": "vladob",
"password": "jCni3U6d#y4bfcm",
"VERIFY_WEBUI_CERTIFICATE": False,
},
},
{
"name": "Local qBittorrent",
"max_concurrent": 30,
"qbt": {
"host": "192.168.1.76",
"port": 8080,
"username": "admin",
"password": "adminadmin",
},
},
]
DB_CONFIG = {
"host": "192.168.1.76",
"port": 3306,
"user": "root",
"password": "Vlado9674+",
"database": "torrents",
"charset": "utf8mb4",
"autocommit": True,
}
# ============================================================
# PRIORITY SQL
# Order: best seeders first, smallest files first
#
# IMPORTANT: 'added' is on the exclusion list — a torrent claimed
# by one client will not be offered to the other.
# ============================================================
SELECT_NEXT = """
SELECT id, torrent_hash, torrent_content, title_visible, size_pretty, seeders
FROM torrents
WHERE
torrent_content IS NOT NULL
AND qb_completed_datetime IS NULL
AND (qb_state IS NULL OR qb_state NOT IN (
'added', 'incomplete', 'invalid', 'dead',
'completed', 'completed_removed'
))
ORDER BY
CASE
WHEN seeders >= 3 THEN 1
WHEN seeders >= 1 THEN 2
ELSE 3
END ASC,
CASE
WHEN size_pretty LIKE '%%MB%%' THEN 0
ELSE 1
END ASC,
seeders DESC,
added_datetime DESC
LIMIT %s
"""
# ============================================================
# CONNECT
# ============================================================
def connect_db():
return pymysql.connect(**DB_CONFIG)
def connect_qbt(cfg: dict):
qbt = qbittorrentapi.Client(**cfg)
qbt.auth_log_in()
return qbt
# ============================================================
# STEP 1: Handle completed torrents
# ============================================================
def handle_completed(qbt, cursor):
    """
    Removes completed torrents from qBittorrent (the data stays on disk)
    and writes qb_completed_datetime to the DB.
    """
removed = 0
for t in qbt.torrents_info():
if not t.completion_on or t.completion_on < 0:
continue
try:
completed_dt = datetime.fromtimestamp(t.completion_on)
except (OSError, ValueError, OverflowError):
continue
thash = t.hash.lower()
try:
qbt.torrents_delete(torrent_hashes=thash, delete_files=False)
except Exception as e:
print(f" ⚠️ Nelze odebrat {t.name[:40]}: {e}")
continue
cursor.execute("""
UPDATE torrents
SET
qb_state = 'completed_removed',
qb_progress = 100,
qb_completed_datetime = COALESCE(qb_completed_datetime, %s),
qb_last_update = NOW()
WHERE torrent_hash = %s OR qb_hash = %s
""", (completed_dt, thash, thash))
print(f" ✅ Dokončen a odebrán: {t.name[:50]}")
removed += 1
return removed
# ============================================================
# STEP 1b: Handle dead torrents (from 50 MrtveTorrenty.py)
# ============================================================
def handle_dead_torrents(qbt, cursor):
    """
    Dead torrents: low progress after 72h OR stuck at >= 95% after 7 days.
    Deletes them from qBittorrent including files and marks them
    'incomplete' in the DB.
    """
now = datetime.now()
deadline_a = now - timedelta(hours=DEAD_AFTER_HOURS)
deadline_b = now - timedelta(hours=STUCK_AFTER_HOURS)
dead_count = 0
for t in qbt.torrents_info():
        # Skip completed torrents
if t.completion_on and t.completion_on > 0:
continue
added_on = t.added_on
if not added_on:
continue
added_dt = datetime.fromtimestamp(added_on)
progress_pct = float(t.progress) * 100.0
        # Criterion A: low progress after 72h
is_dead_a = (added_dt <= deadline_a) and (progress_pct < DEAD_PROGRESS_THRESHOLD)
        # Criterion B: stuck near 100% after 7 days
is_dead_b = (added_dt <= deadline_b) and (progress_pct >= DEAD_PROGRESS_THRESHOLD) and (progress_pct < 100.0)
if not is_dead_a and not is_dead_b:
continue
thash = t.hash.lower()
reason = "nízký progress po 72h" if is_dead_a else "zaseknutý blízko 100% po 7 dnech"
print(f" 💀 MRTVÝ ({reason}): {t.name[:50]}")
print(f" Progress: {progress_pct:.1f}% | Stav: {t.state} | Seeds: {t.num_seeds}")
try:
qbt.torrents_delete(torrent_hashes=thash, delete_files=True)
except Exception as e:
print(f" ❌ Smazání selhalo: {e}")
continue
cursor.execute("""
UPDATE torrents
SET
qb_state = 'incomplete',
qb_progress = %s,
qb_last_update = NOW()
WHERE torrent_hash = %s OR qb_hash = %s
""", (progress_pct, thash, thash))
dead_count += 1
return dead_count
# ============================================================
# STEP 2: Count active (non-completed) torrents in qBittorrent
# ============================================================
def get_active_hashes(qbt):
    """
    Returns the set of hashes currently loaded in qBittorrent.
    """
return {t.hash.lower() for t in qbt.torrents_info()}
# ============================================================
# STEP 3: Add new torrents
# ============================================================
def add_torrents(qbt, cursor, active_hashes, slots, client_name: str):
    """
    Picks torrents from the DB by priority and adds them to qBittorrent.
    Skips any that are already loaded (per active_hashes).
    Writes qb_client = client_name so it is visible which client
    is downloading each torrent.
    """
cursor.execute(SELECT_NEXT, (slots * 3,))
rows = cursor.fetchall()
added = 0
for row in rows:
if added >= slots:
break
t_id, t_hash, content, title, size, seeders = row
t_hash = t_hash.lower() if t_hash else ""
if t_hash in active_hashes:
            # torrent is in qB but not yet flagged in the DB
cursor.execute("""
UPDATE torrents SET qb_added=1, qb_last_update=NOW()
WHERE id=%s AND (qb_added IS NULL OR qb_added=0)
""", (t_id,))
continue
if not content:
continue
try:
qbt.torrents_add(
torrent_files={f"{t_hash}.torrent": content},
is_paused=False,
)
except Exception as e:
print(f" ❌ Nelze přidat {(title or '')[:40]}: {e}")
continue
cursor.execute("""
UPDATE torrents
SET qb_added=1, qb_state='added', qb_client=%s, qb_last_update=NOW()
WHERE id=%s
""", (client_name, t_id))
print(f" Přidán: {(title or '')[:45]} "
f"| {size or '?':>10} | seeds={seeders}")
added += 1
return added
# ============================================================
# PROCESS ONE CLIENT (steps 1-3)
# ============================================================
def process_client(client_cfg: dict, cursor):
name = client_cfg["name"]
max_concurrent = client_cfg["max_concurrent"]
print(f"\n ┌── {name} (max {max_concurrent}) ──")
try:
qbt = connect_qbt(client_cfg["qbt"])
except Exception as e:
print(f" │ ❌ Nelze se připojit: {e}")
print(f" └──")
return
    # Step 1: completed torrents
print(f" │ [1] Kontrola dokončených...")
removed = handle_completed(qbt, cursor)
if removed == 0:
print(f" │ Žádné dokončené.")
    # Step 1b: dead torrents
print(f" │ [1b] Kontrola mrtvých torrentů...")
dead = handle_dead_torrents(qbt, cursor)
if dead == 0:
print(f" │ Žádné mrtvé.")
else:
print(f" │ Odstraněno mrtvých: {dead}")
    # Step 2: slot status
active_hashes = get_active_hashes(qbt)
active = len(active_hashes)
slots = max(0, max_concurrent - active)
print(f" │ [2] Sloty: {active}/{max_concurrent} aktivních | volných: {slots}")
    # Step 3: top up free slots
if slots > 0:
print(f" │ [3] Doplňuji {slots} torrentů...")
added = add_torrents(qbt, cursor, active_hashes, slots, name)
if added == 0:
print(f" │ Žádné vhodné torrenty k přidání.")
else:
print(f" │ Přidáno: {added}")
else:
print(f" │ [3] Sloty plné ({active}/{max_concurrent}), přeskakuji.")
print(f" └──")
# ============================================================
# MAIN LOOP
# ============================================================
def main():
sys.stdout.reconfigure(encoding="utf-8")
now = datetime.now()
print("=" * 60)
print(f"DOWNLOAD MANAGER — Multi-client {now:%Y-%m-%d %H:%M:%S}")
for c in CLIENTS:
print(f"{c['name']} (max {c['max_concurrent']})")
print(f"Celkem slotů: {sum(c['max_concurrent'] for c in CLIENTS)}")
print("=" * 60)
db = connect_db()
cursor = db.cursor()
try:
        # DB statistics — totals
cursor.execute("""
SELECT
SUM(CASE WHEN qb_completed_datetime IS NOT NULL THEN 1 ELSE 0 END),
SUM(CASE WHEN qb_state IS NULL AND torrent_content IS NOT NULL THEN 1 ELSE 0 END),
SUM(CASE WHEN qb_state IN ('incomplete','dead') THEN 1 ELSE 0 END)
FROM torrents
""")
db_completed, db_waiting, db_dead = cursor.fetchone()
print(f" DB — staženo: {db_completed or 0} "
f"| čeká: {db_waiting or 0} "
f"| dead/incomplete: {db_dead or 0}")
        # DB statistics — per client
cursor.execute("""
SELECT qb_client, COUNT(*) AS cnt
FROM torrents
WHERE qb_client IS NOT NULL
GROUP BY qb_client
""")
per_client = cursor.fetchall()
if per_client:
parts = " | ".join(f"{name}: {cnt}" for name, cnt in per_client)
print(f" DB — per-client: {parts}")
# Zpracuj každý klient
# (druhý klient vidí stav DB aktualizovaný prvním)
for client_cfg in CLIENTS:
process_client(client_cfg, cursor)
finally:
db.close()
print("\n👋 Hotovo.")
if __name__ == "__main__":
main()
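The commit log mentions that `handle_completed` has to guard against qBittorrent's `completion_on = -1` sentinel, because `datetime.fromtimestamp(-1)` raises `OSError` on Windows. A minimal sketch of that guard, assuming a helper name (`safe_completion_time`) not taken from the repo:

```python
from datetime import datetime

def safe_completion_time(completion_on):
    """Convert qBittorrent's completion_on field to a datetime, or None.

    qBittorrent reports completion_on = -1 (or 0) for torrents that
    never finished; datetime.fromtimestamp(-1) raises OSError on
    Windows, so non-positive values are rejected up front.
    """
    if completion_on is None or completion_on <= 0:
        return None
    try:
        return datetime.fromtimestamp(completion_on)
    except (OSError, OverflowError, ValueError):
        return None

print(safe_completion_time(-1))          # never completed -> None
print(safe_completion_time(1767225600))  # a valid epoch timestamp
```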

View File

@@ -0,0 +1,182 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Walks SCAN_DIR; for every file computes its blake3 hash
and looks it up in the file_md5_index table. If the hash is found,
the file is deleted. After deleting files, empty directories are removed.
"""
import os
import sys
import blake3
import pymysql
import paramiko
from pathlib import Path

# ============================================================
# CONFIG
# ============================================================
SCAN_DIR = "//tower/torrents/ultracc2"
SSH_CONFIG = {
    "hostname": "192.168.1.76",
    "port": 22,
    "username": "root",
    "password": "7309208104",
}
ULTRACC_DIRS = [
    "/mnt/user/Torrents/UltraCC",
    "/mnt/user/Torrents/UltraCC1",
    "/mnt/user/Torrents/UltraCC2",
]
DB_CONFIG = {
    "host": "192.168.1.76",
    "port": 3306,
    "user": "root",
    "password": "Vlado9674+",
    "database": "torrents",
    "charset": "utf8mb4",
}
CHUNK_SIZE = 8 * 1024 * 1024  # 8 MB
DRY_RUN = False  # True = report only, delete nothing

# ============================================================
# HELPERS
# ============================================================
def compute_blake3(path: Path) -> bytes:
    """Return the blake3 digest as 32 raw bytes."""
    h = blake3.blake3()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            h.update(chunk)
    return h.digest()

def hash_in_db(cursor, digest: bytes):
    """Return (host_name, full_path) of the first record with the given hash, or None."""
    cursor.execute(
        "SELECT host_name, full_path FROM file_md5_index WHERE blake3 = %s AND host_name = 'tower1' AND full_path LIKE '/mnt/user/#ColdData/Porno/%%' LIMIT 1",
        (digest,)
    )
    return cursor.fetchone()  # None or (host_name, full_path)

def remove_empty_dirs(root: str) -> int:
    """Recursively delete empty directories under root. Return the number removed."""
    removed = 0
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        if dirpath == root:
            continue
        try:
            os.rmdir(dirpath)
            print(f" [rmdir] {dirpath}")
            removed += 1
        except OSError:
            pass
    return removed

# ============================================================
# MAIN
# ============================================================
def set_ultracc_permissions():
    """Via SSH, set chown nobody:users + chmod 777 on Tower for all UltraCC directories."""
    print("Setting permissions on Tower (UltraCC*)...")
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(**SSH_CONFIG)
    for d in ULTRACC_DIRS:
        _, out, err = ssh.exec_command(
            'chown -R nobody:users "%s" && chmod -R 777 "%s" && echo OK' % (d, d)
        )
        result = out.read().decode().strip()
        error = err.read().decode().strip()
        if result == "OK":
            print(f" [OK] {d}")
        else:
            print(f" [ERROR] {d}: {error}")
    ssh.close()
    print()

def main():
    dry_run = DRY_RUN
    set_ultracc_permissions()
    if dry_run:
        print("=== DRY RUN — nothing will be deleted ===\n")
    conn = pymysql.connect(**DB_CONFIG)
    cursor = conn.cursor()
    scan_root = Path(SCAN_DIR)
    if not scan_root.exists():
        print(f"ERROR: Directory does not exist: {SCAN_DIR}")
        sys.exit(1)
    files_checked = 0
    files_deleted = 0
    files_kept = 0
    bytes_deleted = 0
    for file_path in scan_root.rglob("*"):
        if not file_path.is_file():
            continue
        files_checked += 1
        size = file_path.stat().st_size
        try:
            digest = compute_blake3(file_path)
        except OSError as e:
            print(f" [read ERROR] {file_path}: {e}")
            continue
        db_match = hash_in_db(cursor, digest)
        if db_match:
            db_host, db_path = db_match
            print(f" [DELETE] {file_path} ({size:,} B)")
            print(f" ↳ original in DB: [{db_host}] {db_path}")
            if not dry_run:
                try:
                    file_path.unlink()
                    files_deleted += 1
                    bytes_deleted += size
                except OSError as e:
                    print(f" [delete ERROR] {file_path}: {e}")
            else:
                files_deleted += 1
                bytes_deleted += size
        else:
            print(f" [keep] {file_path} ({size:,} B)")
            files_kept += 1
    cursor.close()
    conn.close()
    print()
    print(f"Checked:   {files_checked} files")
    print(f"To delete: {files_deleted} files ({bytes_deleted / 1024**3:.2f} GB)")
    print(f"Kept:      {files_kept} files")
    if not dry_run and files_deleted > 0:
        print("\nRemoving empty directories...")
        removed = remove_empty_dirs(SCAN_DIR)
        print(f"Empty directories removed: {removed}")
    if dry_run:
        print("\n(Dry run — no changes were made)")

if __name__ == "__main__":
    main()
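`compute_blake3` reads in 8 MB chunks so even multi-GB files hash in constant memory, and the chunked digest is identical to hashing the whole file at once. A minimal check of that property, using stdlib `hashlib.sha256` in place of the third-party `blake3` so it runs on a stock interpreter:

```python
import hashlib
import io

CHUNK_SIZE = 8 * 1024 * 1024  # same 8 MB window as the script

def chunked_digest(fileobj, chunk_size=CHUNK_SIZE):
    # Stream the data through the hash in fixed-size chunks.
    h = hashlib.sha256()
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        h.update(chunk)
    return h.digest()

data = b"\x00\x01\x02" * 100_000  # ~300 kB of sample bytes
assert chunked_digest(io.BytesIO(data), 4096) == hashlib.sha256(data).digest()
print("chunked digest matches whole-file digest")
```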

View File

@@ -0,0 +1,182 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Walks SCAN_DIR; for every file computes its blake3 hash
and looks it up in the file_md5_index table. If the hash is found,
the file is deleted. After deleting files, empty directories are removed.
"""
import os
import sys
import blake3
import pymysql
import paramiko
from pathlib import Path

# ============================================================
# CONFIG
# ============================================================
SCAN_DIR = "//tower/torrents/ultracc"
SSH_CONFIG = {
    "hostname": "192.168.1.76",
    "port": 22,
    "username": "root",
    "password": "7309208104",
}
ULTRACC_DIRS = [
    "/mnt/user/Torrents/UltraCC",
    "/mnt/user/Torrents/UltraCC1",
    "/mnt/user/Torrents/UltraCC2",
]
DB_CONFIG = {
    "host": "192.168.1.76",
    "port": 3306,
    "user": "root",
    "password": "Vlado9674+",
    "database": "torrents",
    "charset": "utf8mb4",
}
CHUNK_SIZE = 8 * 1024 * 1024  # 8 MB
DRY_RUN = True  # True = report only, delete nothing

# ============================================================
# HELPERS
# ============================================================
def compute_blake3(path: Path) -> bytes:
    """Return the blake3 digest as 32 raw bytes."""
    h = blake3.blake3()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            h.update(chunk)
    return h.digest()

def hash_in_db(cursor, digest: bytes):
    """Return (host_name, full_path) of the first record with the given hash, or None."""
    cursor.execute(
        "SELECT host_name, full_path FROM file_md5_index WHERE blake3 = %s AND host_name = 'tower1' AND full_path LIKE '/mnt/user/#ColdData/Porno/%%' LIMIT 1",
        (digest,)
    )
    return cursor.fetchone()  # None or (host_name, full_path)

def remove_empty_dirs(root: str) -> int:
    """Recursively delete empty directories under root. Return the number removed."""
    removed = 0
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        if dirpath == root:
            continue
        try:
            os.rmdir(dirpath)
            print(f" [rmdir] {dirpath}")
            removed += 1
        except OSError:
            pass
    return removed

# ============================================================
# MAIN
# ============================================================
def set_ultracc_permissions():
    """Via SSH, set chown nobody:users + chmod 777 on Tower for all UltraCC directories."""
    print("Setting permissions on Tower (UltraCC*)...")
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(**SSH_CONFIG)
    for d in ULTRACC_DIRS:
        _, out, err = ssh.exec_command(
            'chown -R nobody:users "%s" && chmod -R 777 "%s" && echo OK' % (d, d)
        )
        result = out.read().decode().strip()
        error = err.read().decode().strip()
        if result == "OK":
            print(f" [OK] {d}")
        else:
            print(f" [ERROR] {d}: {error}")
    ssh.close()
    print()

def main():
    dry_run = DRY_RUN
    set_ultracc_permissions()
    if dry_run:
        print("=== DRY RUN — nothing will be deleted ===\n")
    conn = pymysql.connect(**DB_CONFIG)
    cursor = conn.cursor()
    scan_root = Path(SCAN_DIR)
    if not scan_root.exists():
        print(f"ERROR: Directory does not exist: {SCAN_DIR}")
        sys.exit(1)
    files_checked = 0
    files_deleted = 0
    files_kept = 0
    bytes_deleted = 0
    for file_path in scan_root.rglob("*"):
        if not file_path.is_file():
            continue
        files_checked += 1
        size = file_path.stat().st_size
        try:
            digest = compute_blake3(file_path)
        except OSError as e:
            print(f" [read ERROR] {file_path}: {e}")
            continue
        db_match = hash_in_db(cursor, digest)
        if db_match:
            db_host, db_path = db_match
            print(f" [DELETE] {file_path} ({size:,} B)")
            print(f" ↳ original in DB: [{db_host}] {db_path}")
            if not dry_run:
                try:
                    file_path.unlink()
                    files_deleted += 1
                    bytes_deleted += size
                except OSError as e:
                    print(f" [delete ERROR] {file_path}: {e}")
            else:
                files_deleted += 1
                bytes_deleted += size
        else:
            print(f" [keep] {file_path} ({size:,} B)")
            files_kept += 1
    cursor.close()
    conn.close()
    print()
    print(f"Checked:   {files_checked} files")
    print(f"To delete: {files_deleted} files ({bytes_deleted / 1024**3:.2f} GB)")
    print(f"Kept:      {files_kept} files")
    if not dry_run and files_deleted > 0:
        print("\nRemoving empty directories...")
        removed = remove_empty_dirs(SCAN_DIR)
        print(f"Empty directories removed: {removed}")
    if dry_run:
        print("\n(Dry run — no changes were made)")

if __name__ == "__main__":
    main()
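`remove_empty_dirs` relies on `os.walk(topdown=False)` visiting children before parents, so a chain of nested empty directories collapses in a single pass. A quick behavior check on a throwaway tree (directory names here are invented for the demo):

```python
import os
import tempfile

def remove_empty_dirs(root: str) -> int:
    # Bottom-up walk: children come before parents, so parents
    # emptied during the pass can be removed in the same pass.
    # os.rmdir raises OSError for non-empty dirs, which we skip.
    removed = 0
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        if dirpath == root:
            continue
        try:
            os.rmdir(dirpath)
            removed += 1
        except OSError:
            pass
    return removed

with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "a", "b"))  # empty chain a/b
    os.makedirs(os.path.join(root, "c"))
    open(os.path.join(root, "c", "keep.txt"), "w").close()
    print(remove_empty_dirs(root))  # → 2  (a/b and a removed; c kept)
```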

View File

@@ -0,0 +1,182 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Walks SCAN_DIR; for every file computes its blake3 hash
and looks it up in the file_md5_index table. If the hash is found,
the file is deleted. After deleting files, empty directories are removed.
"""
import os
import sys
import blake3
import pymysql
import paramiko
from pathlib import Path

# ============================================================
# CONFIG
# ============================================================
SCAN_DIR = "//tower/torrents/ultracc1"
SSH_CONFIG = {
    "hostname": "192.168.1.76",
    "port": 22,
    "username": "root",
    "password": "7309208104",
}
ULTRACC_DIRS = [
    "/mnt/user/Torrents/UltraCC",
    "/mnt/user/Torrents/UltraCC1",
    "/mnt/user/Torrents/UltraCC2",
]
DB_CONFIG = {
    "host": "192.168.1.76",
    "port": 3306,
    "user": "root",
    "password": "Vlado9674+",
    "database": "torrents",
    "charset": "utf8mb4",
}
CHUNK_SIZE = 8 * 1024 * 1024  # 8 MB
DRY_RUN = False  # True = report only, delete nothing

# ============================================================
# HELPERS
# ============================================================
def compute_blake3(path: Path) -> bytes:
    """Return the blake3 digest as 32 raw bytes."""
    h = blake3.blake3()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            h.update(chunk)
    return h.digest()

def hash_in_db(cursor, digest: bytes):
    """Return (host_name, full_path) of the first record with the given hash, or None."""
    cursor.execute(
        "SELECT host_name, full_path FROM file_md5_index WHERE blake3 = %s AND host_name = 'tower1' AND full_path LIKE '/mnt/user/#ColdData/Porno/%%' LIMIT 1",
        (digest,)
    )
    return cursor.fetchone()  # None or (host_name, full_path)

def remove_empty_dirs(root: str) -> int:
    """Recursively delete empty directories under root. Return the number removed."""
    removed = 0
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        if dirpath == root:
            continue
        try:
            os.rmdir(dirpath)
            print(f" [rmdir] {dirpath}")
            removed += 1
        except OSError:
            pass
    return removed

# ============================================================
# MAIN
# ============================================================
def set_ultracc_permissions():
    """Via SSH, set chown nobody:users + chmod 777 on Tower for all UltraCC directories."""
    print("Setting permissions on Tower (UltraCC*)...")
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(**SSH_CONFIG)
    for d in ULTRACC_DIRS:
        _, out, err = ssh.exec_command(
            'chown -R nobody:users "%s" && chmod -R 777 "%s" && echo OK' % (d, d)
        )
        result = out.read().decode().strip()
        error = err.read().decode().strip()
        if result == "OK":
            print(f" [OK] {d}")
        else:
            print(f" [ERROR] {d}: {error}")
    ssh.close()
    print()

def main():
    dry_run = DRY_RUN
    set_ultracc_permissions()
    if dry_run:
        print("=== DRY RUN — nothing will be deleted ===\n")
    conn = pymysql.connect(**DB_CONFIG)
    cursor = conn.cursor()
    scan_root = Path(SCAN_DIR)
    if not scan_root.exists():
        print(f"ERROR: Directory does not exist: {SCAN_DIR}")
        sys.exit(1)
    files_checked = 0
    files_deleted = 0
    files_kept = 0
    bytes_deleted = 0
    for file_path in scan_root.rglob("*"):
        if not file_path.is_file():
            continue
        files_checked += 1
        size = file_path.stat().st_size
        try:
            digest = compute_blake3(file_path)
        except OSError as e:
            print(f" [read ERROR] {file_path}: {e}")
            continue
        db_match = hash_in_db(cursor, digest)
        if db_match:
            db_host, db_path = db_match
            print(f" [DELETE] {file_path} ({size:,} B)")
            print(f" ↳ original in DB: [{db_host}] {db_path}")
            if not dry_run:
                try:
                    file_path.unlink()
                    files_deleted += 1
                    bytes_deleted += size
                except OSError as e:
                    print(f" [delete ERROR] {file_path}: {e}")
            else:
                files_deleted += 1
                bytes_deleted += size
        else:
            print(f" [keep] {file_path} ({size:,} B)")
            files_kept += 1
    cursor.close()
    conn.close()
    print()
    print(f"Checked:   {files_checked} files")
    print(f"To delete: {files_deleted} files ({bytes_deleted / 1024**3:.2f} GB)")
    print(f"Kept:      {files_kept} files")
    if not dry_run and files_deleted > 0:
        print("\nRemoving empty directories...")
        removed = remove_empty_dirs(SCAN_DIR)
        print(f"Empty directories removed: {removed}")
    if dry_run:
        print("\n(Dry run — no changes were made)")

if __name__ == "__main__":
    main()

View File

@@ -0,0 +1,286 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Incremental import — sktorrent.eu
- Walks the listing from the newest torrents down
- Downloads and stores .torrent files for new records
- Stops as soon as it reaches a torrent already present in the DB
- No Selenium required — just requests + BeautifulSoup + cookies
"""
import pymysql
import requests
import json
import time
import re
import sys
from bs4 import BeautifulSoup
from pathlib import Path
from datetime import datetime
import urllib.parse as urlparse

# ============================================================
# CONFIG
# ============================================================
COOKIE_FILE = Path("sktorrent_cookies.json")
BASE_URL = (
    "https://sktorrent.eu/torrent/torrents.php"
    "?active=0&category=24&order=data&by=DESC"
)
SLEEP_BETWEEN_PAGES = 2.0    # pause between listing pages
SLEEP_BEFORE_DOWNLOAD = 1.5  # pause before downloading each .torrent file
DB_CONFIG = {
    "host": "192.168.1.76",
    "port": 3306,
    "user": "root",
    "password": "Vlado9674+",
    "database": "torrents",
    "charset": "utf8mb4",
    "autocommit": True,
}

# ============================================================
# CONNECT
# ============================================================
def connect_db():
    return pymysql.connect(**DB_CONFIG)

def build_session():
    if not COOKIE_FILE.exists():
        raise FileNotFoundError(f"Cookie file not found: {COOKIE_FILE}")
    with open(COOKIE_FILE, "r", encoding="utf-8") as f:
        cookies = json.load(f)
    session = requests.Session()
    session.headers["User-Agent"] = (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
    )
    for c in cookies:
        session.cookies.set(c["name"], c["value"], domain=c.get("domain", ""))
    return session

# ============================================================
# PARSE ONE LISTING PAGE
# ============================================================
def parse_page(html):
    """
    Return a list of dicts, one per torrent row on the page.
    An empty list means the end of pagination (or a failed parse).
    """
    soup = BeautifulSoup(html, "html.parser")
    results = []
    for row in soup.select("table tr"):
        cells = row.find_all("td")
        if len(cells) != 7:
            continue
        # td[1] — download link: download.php?id=<hash>&f=<filename>
        dl_a = cells[1].find("a", href=re.compile(r"download\.php\?id="))
        if not dl_a:
            continue
        download_url = dl_a["href"]
        if not download_url.startswith("http"):
            download_url = "https://sktorrent.eu/torrent/" + download_url
        m_hash = re.search(r"id=([a-f0-9A-F]+)", download_url)
        if not m_hash:
            continue
        torrent_hash = m_hash.group(1).lower()
        parsed_dl = urlparse.urlparse(download_url)
        dl_query = urlparse.parse_qs(parsed_dl.query)
        torrent_filename = dl_query.get("f", ["unknown.torrent"])[0]
        # td[2] — title, details link, size, date added
        title_a = cells[2].find("a", href=re.compile(r"details\.php\?id="))
        if not title_a:
            continue
        title_visible = title_a.get_text(strip=True)
        title_full = title_a.get("title", title_visible)
        details_link = title_a["href"]
        if not details_link.startswith("http"):
            details_link = "https://sktorrent.eu/torrent/" + details_link
        cell2_text = cells[2].get_text(" ", strip=True)
        size_match = re.search(r"Velkost\s+([\d\.,]+\s*[KMG]B)", cell2_text, re.IGNORECASE)
        added_match = re.search(r"Pridany\s+(\d+/\d+/\d+)\s+(?:o\s+)?(\d+:\d+)", cell2_text, re.IGNORECASE)
        size_pretty = size_match.group(1).strip() if size_match else None
        added_mysql = None
        if added_match:
            try:
                d, mo, y = added_match.group(1).split("/")
                t = added_match.group(2) + ":00"
                added_mysql = f"{y}-{mo}-{d} {t}"
            except Exception:
                pass
        # td[0] — category
        category = cells[0].get_text(strip=True)
        # td[4] — seeders, td[5] — leechers
        try:
            seeders = int(cells[4].get_text(strip=True))
        except ValueError:
            seeders = 0
        try:
            leechers = int(cells[5].get_text(strip=True))
        except ValueError:
            leechers = 0
        results.append({
            "torrent_hash": torrent_hash,
            "download_url": download_url,
            "details_link": details_link,
            "torrent_filename": torrent_filename,
            "category": category,
            "title_visible": title_visible,
            "title_full": title_full,
            "size_pretty": size_pretty,
            "added_datetime": added_mysql,
            "seeders": seeders,
            "leechers": leechers,
        })
    return results

# ============================================================
# DOWNLOAD .TORRENT FILE
# ============================================================
def download_torrent(session, url):
    try:
        r = session.get(url, timeout=15)
        r.raise_for_status()
        if len(r.content) < 20:
            return None
        return r.content
    except Exception as e:
        print(f" ⚠️ Download failed: {e}")
        return None

# ============================================================
# DB INSERT
# ============================================================
INSERT_SQL = """
INSERT INTO torrents (
    torrent_hash, details_link, download_url, category,
    title_visible, title_full, size_pretty, added_datetime,
    seeders, leechers, torrent_filename, torrent_content
) VALUES (
    %(torrent_hash)s, %(details_link)s, %(download_url)s, %(category)s,
    %(title_visible)s, %(title_full)s, %(size_pretty)s, %(added_datetime)s,
    %(seeders)s, %(leechers)s, %(torrent_filename)s, %(torrent_content)s
)
ON DUPLICATE KEY UPDATE
    seeders = VALUES(seeders),
    leechers = VALUES(leechers),
    download_url = VALUES(download_url),
    torrent_content = COALESCE(VALUES(torrent_content), torrent_content)
"""

# ============================================================
# MAIN
# ============================================================
def main():
    sys.stdout.reconfigure(encoding="utf-8")
    print("=" * 60)
    print("INCREMENTAL IMPORT — sktorrent.eu")
    print(f"Started: {datetime.now():%Y-%m-%d %H:%M:%S}")
    print("Order: newest → oldest | stop at the first match")
    print("=" * 60)
    session = build_session()
    db = connect_db()
    cursor = db.cursor()
    new_count = 0
    page = 0
    stop = False
    while not stop:
        url = f"{BASE_URL}&page={page}"
        try:
            r = session.get(url, timeout=15)
            r.raise_for_status()
        except Exception as e:
            print(f"⚠️ Page {page} — error: {e}")
            break
        if "login.php" in r.url or "Prihlas sa" in r.text:
            print("❌ Cookies have expired — run the Selenium login script and refresh the cookies.")
            break
        rows = parse_page(r.text)
        if not rows:
            print(f" Page {page} — no records, stopping.")
            break
        print(f"\n📄 Page {page} ({len(rows)} torrents)")
        for item in rows:
            # Check the DB first
            cursor.execute(
                "SELECT 1 FROM torrents WHERE torrent_hash = %s",
                (item["torrent_hash"],)
            )
            exists = cursor.fetchone()
            if exists:
                print(f" ⏹ Already in DB: {item['title_visible']} → stopping import.")
                stop = True
                break
            # New torrent — download the .torrent file
            print(f" ⬇️ New: {item['title_visible']}")
            time.sleep(SLEEP_BEFORE_DOWNLOAD)
            content = download_torrent(session, item["download_url"])
            if content:
                print(f"   ✔ Downloaded ({len(content):,} B)")
            else:
                print("   ✖ Download failed, saving without content")
            item["torrent_content"] = content
            cursor.execute(INSERT_SQL, item)
            new_count += 1
        if not stop:
            page += 1
            time.sleep(SLEEP_BETWEEN_PAGES)
    # ============================================================
    # SUMMARY
    # ============================================================
    print()
    print("=" * 60)
    print(f"Finished: {datetime.now():%Y-%m-%d %H:%M:%S}")
    print(f"New torrents saved : {new_count}")
    print(f"Pages visited      : {page}")
    print("=" * 60)
    db.close()

if __name__ == "__main__":
    main()
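The listing shows dates as "Pridany d/m/yyyy o H:MM", which `parse_page` converts into a MySQL DATETIME string. A standalone sketch of that conversion (the helper name `added_to_mysql` is invented here, and zero-padding is added, which the in-page version skips):

```python
import re

def added_to_mysql(cell_text):
    # "Pridany 5/3/2026 o 14:07" -> "2026-03-05 14:07:00"
    m = re.search(r"Pridany\s+(\d+)/(\d+)/(\d+)\s+(?:o\s+)?(\d+):(\d+)",
                  cell_text, re.IGNORECASE)
    if not m:
        return None
    d, mo, y, hh, mm = (int(g) for g in m.groups())
    return f"{y:04d}-{mo:02d}-{d:02d} {hh:02d}:{mm:02d}:00"

print(added_to_mysql("Velkost 1.37 GB Pridany 5/3/2026 o 14:07"))
# → 2026-03-05 14:07:00
```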

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,22 @@
[
{
"name": "uid",
"value": "646071",
"domain": "sktorrent.eu",
"path": "/",
"expires": 1798003565.462807,
"httpOnly": false,
"secure": false,
"sameSite": "Lax"
},
{
"name": "pass",
"value": "91df6b497860582e09a7b333569d0187",
"domain": "sktorrent.eu",
"path": "/",
"expires": 1798003565.463191,
"httpOnly": false,
"secure": false,
"sameSite": "Lax"
}
]

BIN
Seedbox/torrent_report.xlsx Normal file

Binary file not shown.

View File

@@ -0,0 +1,64 @@
import pymysql
import pymysql.cursors

def get_unfinished_torrents():
    # Connection configuration
    connection_config = {
        'host': '192.168.1.76',
        'user': 'root',
        'password': 'Vlado9674+',
        'database': 'torrents',
        'port': 3307,
        'cursorclass': pymysql.cursors.DictCursor  # return rows as dicts
    }
    connection = None
    try:
        # Open the connection
        connection = pymysql.connect(**connection_config)
        with connection.cursor() as cursor:
            # SQL query
            sql = """
                SELECT
                    title_visible,
                    qb_progress,
                    qb_state,
                    size_pretty,
                    added_datetime
                FROM torrents
                WHERE qb_added = 1
                  AND qb_progress < 1
                  AND qb_state NOT IN ('seeding', 'uploading', 'stalledUP', 'pausedUP', 'completed')
                ORDER BY qb_progress DESC;
            """
            cursor.execute(sql)
            results = cursor.fetchall()
            print(f"\n--- UNFINISHED TORRENTS (port {connection_config['port']}) ---")
            if not results:
                print("Everything is done, or nothing is running.")
            else:
                for row in results:
                    # qb_progress is assumed to be a float in 0.0–1.0
                    progress_pct = row['qb_progress'] * 100
                    print(f"Torrent:  {row['title_visible']}")
                    print(f"State:    {row['qb_state']}")
                    print(f"Progress: {progress_pct:.2f}%")
                    print(f"Size:     {row['size_pretty']}")
                    print("-" * 40)
    except pymysql.MySQLError as e:
        print(f"DB communication error: {e}")
    finally:
        if connection is not None:
            connection.close()
            print("Database connection closed.")

if __name__ == "__main__":
    get_unfinished_torrents()

BIN
fs_index_ultracc.pkl Normal file

Binary file not shown.

BIN
library_paths.db Normal file

Binary file not shown.

BIN
paths.pkl Normal file

Binary file not shown.

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

Binary file not shown.

File diff suppressed because one or more lines are too long

Binary file not shown.

Some files were not shown because too many files have changed in this diff Show More