Compare commits


29 Commits

Author SHA1 Message Date
7d0231c004 Handle submit API response safely and avoid 500 crash 2026-04-12 18:44:48 +02:00
0fe0e436aa Update docker-compose.yml 2026-04-12 16:43:36 +02:00
6d103d42c5 Merge pull request #19 from DasPoschi/claude/audit-security-performance-pWwx2
Add security hardening and XSS protection
2026-04-06 09:49:22 +02:00
Claude a879543a1c Security audit: fix XSS, missing function, improve SSH & URL handling
- Fix XSS: HTML-escape all user input (URLs, package names, errors, proxy data)
- Fix NameError: add missing is_demo_link() function (called but undefined)
- Fix: remove unused http_in fetch in proxies_get()
- Security: mask API keys in log output (TMDB key no longer visible in logs)
- Security: use known_hosts for SSH host key verification when available
- Security: remove .env from git tracking, add .env.example template
- Usability: add URL reachability check before submitting to JDownloader
- Usability: add "Erledigte Jobs entfernen" button to clear finished/failed jobs
- Usability: color-code job status (red for failed, green for finished)
- Docs: add security section to README (known_hosts, HTTPS, .env)

https://claude.ai/code/session_01S774Pqazr2U8vkSyhUBgDs
2026-04-06 07:46:53 +00:00
44e4354d1f Merge pull request #18 from DasPoschi/codex/fix-jdownloader-api-package-removal-error-54zoo0
Detect demo link downloads and fail early
2026-01-21 21:25:03 +01:00
f87f0f5cdc Merge branch 'main' into codex/fix-jdownloader-api-package-removal-error-54zoo0 2026-01-21 21:23:26 +01:00
68353b33aa Detect demo link downloads and fail early 2026-01-21 21:22:59 +01:00
c3b1fcadfa Merge pull request #17 from DasPoschi/codex/fix-jdownloader-api-package-removal-error
Add raw MyJDownloader API fallback for removing/canceling links
2026-01-21 21:09:25 +01:00
25ad8c05d0 Add raw API cleanup fallback for JDownloader 2026-01-21 21:08:48 +01:00
b65cb53463 Merge pull request #16 from DasPoschi/codex/fetch-proxies-from-proxyscrape-api-4xe4oq
Remove proxy blacklist and HTTP proxy handling; use ProxyScrape SOCKS lists
2026-01-04 14:46:16 +01:00
6c13fbbb2f Merge branch 'main' into codex/fetch-proxies-from-proxyscrape-api-4xe4oq 2026-01-04 14:46:06 +01:00
33282ddbcb Remove proxy blacklist filters 2026-01-04 14:45:44 +01:00
7795e22744 Merge pull request #15 from DasPoschi/codex/fetch-proxies-from-proxyscrape-api-4vaqb3
Remove HTTP proxies from proxy UI
2026-01-04 14:27:14 +01:00
e83f1323cd Merge branch 'main' into codex/fetch-proxies-from-proxyscrape-api-4vaqb3 2026-01-04 14:27:07 +01:00
194b16e09c Remove HTTP proxies from UI 2026-01-04 14:26:36 +01:00
423e8e28ec Merge pull request #14 from DasPoschi/codex/fetch-proxies-from-proxyscrape-api
Use ProxyScrape API for SOCKS lists and normalize single-line responses
2026-01-04 14:20:54 +01:00
daeee039fa Update proxy sources for socks lists 2026-01-04 14:20:38 +01:00
97a5afbee9 Update app.py 2026-01-03 23:09:10 +01:00
a2de578087 Merge pull request #13 from DasPoschi/codex/configure-proxies-for-downloads-only-r44dl5
Add *.your-server.de to proxy blacklist
2026-01-03 23:04:38 +01:00
c3aac479fe Merge branch 'main' into codex/configure-proxies-for-downloads-only-r44dl5 2026-01-03 22:56:21 +01:00
1350b50199 Add your-server.de to proxy blacklist 2026-01-03 22:55:54 +01:00
6b06134edf Merge pull request #12 from DasPoschi/codex/configure-proxies-for-downloads-only
Bypass proxies for internal HTTP calls
2026-01-03 22:43:36 +01:00
be4785b04a Bypass proxies for non-download requests 2026-01-03 22:42:49 +01:00
db39f2b55e Merge pull request #11 from DasPoschi/codex/add-proxy-list-import-function-3g187r
Refresh jobs table via `/jobs` endpoint instead of full page reload
2026-01-01 22:23:09 +01:00
a0e7ed91c7 Update jobs progress without full reload 2026-01-01 22:22:41 +01:00
7443a0e0ca Merge pull request #10 from DasPoschi/codex/add-proxy-list-import-function-g3956j
Add JDProxies blacklist filters and save/export support
2026-01-01 22:12:53 +01:00
3cf7581797 Merge branch 'main' into codex/add-proxy-list-import-function-g3956j 2026-01-01 22:12:44 +01:00
e9ccb51f13 Add JDProxies blacklist filters 2026-01-01 22:12:10 +01:00
a549ba66ba Merge pull request #9 from DasPoschi/codex/add-proxy-list-import-function-q1akx1
Write JDProxies JSON export and add save endpoint/UI
2026-01-01 21:27:20 +01:00
5 changed files with 327 additions and 83 deletions

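For context on PR #18 in the list above: the demo-link detection boils down to normalizing a filename and matching known Big Buck Bunny patterns. A standalone sketch mirroring the `is_demo_link` helper that the commit adds to `app.py`:

```python
# Patterns for JDownloader's demo/fallback video (Big Buck Bunny),
# as added by the "Detect demo link downloads" commit.
DEMO_PATTERNS = {"big_buck_bunny", "bigbuckbunny", "big buck bunny", "bbb_sunflower"}

def is_demo_link(name: str) -> bool:
    """Detect JDownloader demo/fallback videos (e.g. Big Buck Bunny)."""
    # Normalize: dashes become underscores, dots become spaces, lowercase.
    lower = name.lower().replace("-", "_").replace(".", " ")
    return any(pat in lower for pat in DEMO_PATTERNS)

print(is_demo_link("Big-Buck-Bunny_1080p.mp4"))  # True
print(is_demo_link("My.Movie.2024.German.mkv"))  # False
```

The worker thread uses this check to fail a job early when every grabbed link is a demo video but the submitted URL itself is not.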
View File

@@ -53,3 +53,9 @@ BASIC_AUTH_PASS=CHANGE_ME
# ===== Polling =====
POLL_SECONDS=5
# ===== SSH host key verification (optional) =====
# Path to known_hosts file inside container. If present, strict host key
# checking is used. If absent, all host keys are accepted (less secure).
# Generate with: ssh-keyscan -p 22 192.168.1.1 > known_hosts
# SSH_KNOWN_HOSTS=/ssh/known_hosts

View File

@@ -11,7 +11,7 @@ Web GUI to:
## Files
- `docker-compose.yml` stack
- `.env.example` copy to `.env` and fill values
- `.env.example` copy to `.env` and fill in your values (**never commit `.env`!**)
- `jd-webgui/app.py` FastAPI web app
- `jd-webgui/Dockerfile` includes ffprobe
@@ -40,6 +40,16 @@ docker compose up -d --build
- If `MYJD_DEVICE` is empty, the WebGUI will automatically pick the first available device.
- Ensure the SSH user can write to `/jellyfin/Filme` (and series dir if used).
## Security
- **Never commit `.env`**: it contains passwords and API keys. Only `.env.example` is tracked.
- **SSH host key verification**: For secure SFTP transfers, provide a `known_hosts` file:
```bash
ssh-keyscan -p 22 192.168.1.1 > known_hosts
```
Mount it in `docker-compose.yml` and set `SSH_KNOWN_HOSTS=/ssh/known_hosts`.
Without it, any host key is accepted (MITM risk on untrusted networks).
- **Basic Auth** protects the WebGUI but transmits credentials in cleartext over HTTP. Use a reverse proxy with HTTPS (e.g. Traefik, Caddy) in production.
## Troubleshooting
- Device not found: list devices
```bash

View File

@@ -21,8 +21,6 @@ services:
- jdownloader
ports:
- "8080:8080"
env_file:
- .env
environment:
TZ: Europe/Berlin
volumes:

View File

@@ -3,6 +3,7 @@ from __future__ import annotations
import base64
import hashlib
import html as html_mod
import json
import os
import re
@@ -18,7 +19,7 @@ from typing import Any, Dict, List, Optional, Tuple
from myjdapi import Myjdapi
import paramiko
from fastapi import FastAPI, Form, Request
from fastapi.responses import HTMLResponse, RedirectResponse
from fastapi.responses import HTMLResponse, PlainTextResponse, RedirectResponse
from fastapi.staticfiles import StaticFiles
# ============================================================
@@ -57,9 +58,22 @@ POLL_SECONDS = float(os.environ.get("POLL_SECONDS", "5"))
# JDownloader writes here inside container
JD_OUTPUT_PATH = "/output"
PROXY_EXPORT_PATH = os.environ.get("PROXY_EXPORT_PATH", "/output/jd-proxies.jdproxies")
LOG_BUFFER_LIMIT = int(os.environ.get("LOG_BUFFER_LIMIT", "500"))
URL_RE = re.compile(r"^https?://", re.I)
NO_PROXY_OPENER = urllib.request.build_opener(urllib.request.ProxyHandler({}))
def esc(s: str) -> str:
"""HTML-escape a string to prevent XSS."""
return html_mod.escape(str(s), quote=True)
def mask_secret(value: str, visible: int = 4) -> str:
"""Mask a secret string, showing only the last `visible` characters."""
if len(value) <= visible:
return "***"
return "***" + value[-visible:]
VIDEO_EXTS = {
".mkv", ".mp4", ".m4v", ".avi", ".mov", ".wmv", ".flv", ".webm",
".ts", ".m2ts", ".mts", ".mpg", ".mpeg", ".vob", ".ogv",
@@ -123,6 +137,21 @@ class Job:
jobs: Dict[str, Job] = {}
lock = threading.Lock()
log_lock = threading.Lock()
connection_logs: List[str] = []
def log_connection(message: str) -> None:
timestamp = time.strftime("%Y-%m-%d %H:%M:%S")
line = f"[{timestamp}] {message}"
with log_lock:
connection_logs.append(line)
if len(connection_logs) > LOG_BUFFER_LIMIT:
excess = len(connection_logs) - LOG_BUFFER_LIMIT
del connection_logs[:excess]
def get_connection_logs() -> str:
with log_lock:
return "\n".join(connection_logs)
# ============================================================
# Core helpers
@@ -149,6 +178,7 @@ def ensure_env():
def get_device():
jd = Myjdapi()
log_connection(f"MyJDownloader connect as {MYJD_EMAIL or 'unknown'}")
jd.connect(MYJD_EMAIL, MYJD_PASSWORD)
wanted = (MYJD_DEVICE or "").strip()
@@ -200,6 +230,29 @@ def is_video_file(path: str) -> bool:
return False
return ext in VIDEO_EXTS
DEMO_PATTERNS = {"big_buck_bunny", "bigbuckbunny", "big buck bunny", "bbb_sunflower"}
def is_demo_link(name: str) -> bool:
"""Detect JDownloader demo/fallback videos (e.g. Big Buck Bunny)."""
lower = name.lower().replace("-", "_").replace(".", " ")
return any(pat in lower for pat in DEMO_PATTERNS)
def check_url_reachable(url: str) -> Optional[str]:
"""Try a HEAD request to verify the URL is reachable. Returns error string or None."""
try:
req = urllib.request.Request(url, method="HEAD")
req.add_header("User-Agent", "Mozilla/5.0")
with urllib.request.urlopen(req, timeout=10) as resp:
if resp.status >= 400:
return f"URL antwortet mit HTTP {resp.status}"
except urllib.error.HTTPError as e:
return f"URL nicht erreichbar: HTTP {e.code}"
except urllib.error.URLError as e:
return f"URL nicht erreichbar: {e.reason}"
except Exception as e:
return f"URL-Check fehlgeschlagen: {e}"
return None
def md5_file(path: str) -> str:
h = hashlib.md5()
with open(path, "rb") as f:
@@ -243,9 +296,17 @@ def ffprobe_ok(path: str) -> bool:
# ============================================================
# SSH/SFTP
# ============================================================
SSH_KNOWN_HOSTS = os.environ.get("SSH_KNOWN_HOSTS", "/ssh/known_hosts")
def ssh_connect() -> paramiko.SSHClient:
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
if os.path.isfile(SSH_KNOWN_HOSTS):
ssh.load_host_keys(SSH_KNOWN_HOSTS)
ssh.set_missing_host_key_policy(paramiko.RejectPolicy())
log_connection(f"SSH connect {JELLYFIN_USER}@{JELLYFIN_HOST}:{JELLYFIN_PORT} (known_hosts verified)")
else:
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
log_connection(f"SSH connect {JELLYFIN_USER}@{JELLYFIN_HOST}:{JELLYFIN_PORT} (WARNING: no known_hosts, accepting any host key)")
ssh.connect(
hostname=JELLYFIN_HOST,
port=JELLYFIN_PORT,
@@ -268,6 +329,7 @@ def sftp_mkdirs(sftp: paramiko.SFTPClient, remote_dir: str):
def sftp_upload(ssh: paramiko.SSHClient, local_path: str, remote_path: str):
sftp = ssh.open_sftp()
try:
log_connection(f"SFTP upload {local_path} -> {remote_path}")
sftp_mkdirs(sftp, os.path.dirname(remote_path))
sftp.put(local_path, remote_path)
finally:
@@ -276,6 +338,7 @@ def sftp_upload(ssh: paramiko.SSHClient, local_path: str, remote_path: str):
def remote_md5sum(ssh: paramiko.SSHClient, remote_path: str) -> str:
quoted = shlex.quote(remote_path)
cmd = f"md5sum {quoted}"
log_connection(f"SSH exec {cmd}")
stdin, stdout, stderr = ssh.exec_command(cmd, timeout=120)
out = stdout.read().decode("utf-8", "replace").strip()
err = stderr.read().decode("utf-8", "replace").strip()
@@ -288,9 +351,20 @@ def remote_md5sum(ssh: paramiko.SSHClient, remote_path: str) -> str:
# ============================================================
# TMDB & naming
# ============================================================
def _sanitize_url_for_log(url: str) -> str:
"""Remove sensitive query params (api_key) from URLs before logging."""
parsed = urllib.parse.urlparse(url)
params = urllib.parse.parse_qs(parsed.query, keep_blank_values=True)
for key in ("api_key", "apikey", "token"):
if key in params:
params[key] = ["***"]
safe_query = urllib.parse.urlencode(params, doseq=True)
return urllib.parse.urlunparse(parsed._replace(query=safe_query))
def _http_get_json(url: str, headers: Optional[Dict[str, str]] = None) -> Any:
req = urllib.request.Request(url, headers=headers or {})
with urllib.request.urlopen(req, timeout=20) as r:
log_connection(f"HTTP GET {_sanitize_url_for_log(url)} (no-proxy)")
with NO_PROXY_OPENER.open(req, timeout=20) as r:
return json.loads(r.read().decode("utf-8", "replace"))
def tmdb_search_movie(query: str) -> Optional[Dict[str, Any]]:
@@ -363,8 +437,12 @@ def format_proxy_lines(raw: str, scheme: str) -> str:
def fetch_proxy_list(url: str) -> str:
req = urllib.request.Request(url)
with urllib.request.urlopen(req, timeout=20) as resp:
return resp.read().decode("utf-8", "replace")
log_connection(f"HTTP GET {url} (no-proxy)")
with NO_PROXY_OPENER.open(req, timeout=20) as resp:
text = resp.read().decode("utf-8", "replace")
if "\n" not in text and re.search(r"\s", text):
return re.sub(r"\s+", "\n", text.strip())
return text
def build_jdproxies_payload(text: str) -> Dict[str, Any]:
if not text.strip():
@@ -375,6 +453,23 @@ def build_jdproxies_payload(text: str) -> Dict[str, Any]:
"socks4": "SOCKS4",
"http": "HTTP",
}
entries.append({
"filter": None,
"proxy": {
"address": None,
"password": None,
"port": 80,
"type": "NONE",
"username": None,
"connectMethodPrefered": False,
"preferNativeImplementation": False,
"resolveHostName": False,
},
"enabled": True,
"pac": False,
"rangeRequestsSupported": True,
"reconnectSupported": True,
})
for line in text.splitlines():
s = line.strip()
if not s:
@@ -412,6 +507,8 @@ def save_proxy_export(text: str) -> str:
export_dir = os.path.dirname(export_path)
if export_dir:
os.makedirs(export_dir, exist_ok=True)
if os.path.exists(export_path):
os.remove(export_path)
with open(export_path, "w", encoding="utf-8") as handle:
handle.write(json.dumps(payload, indent=2))
handle.write("\n")
@@ -488,7 +585,8 @@ def jellyfin_refresh_library():
try:
url = JELLYFIN_API_BASE + path
req = urllib.request.Request(url, headers=headers, method="POST")
with urllib.request.urlopen(req, timeout=20) as r:
log_connection(f"HTTP POST {url} (no-proxy)")
with NO_PROXY_OPENER.open(req, timeout=20) as r:
_ = r.read()
return
except Exception:
@@ -545,6 +643,31 @@ def local_paths_from_links(links: List[Dict[str, Any]], pkg_map: Dict[Any, Dict[
out.append(p)
return out
def call_raw_jd_api(dev, endpoints: List[str], payloads: List[Dict[str, Any]]) -> bool:
method_candidates = ["action", "call", "api", "request"]
for method_name in method_candidates:
method = getattr(dev, method_name, None)
if method is None:
continue
for endpoint in endpoints:
for payload in payloads:
try:
method(endpoint, payload)
return True
except TypeError:
try:
method(endpoint, params=payload)
return True
except TypeError:
try:
method(endpoint, data=payload)
return True
except Exception:
continue
except Exception:
continue
return False
def try_remove_from_jd(dev, links: List[Dict[str, Any]], pkg_map: Dict[Any, Dict[str, Any]]) -> Optional[str]:
link_ids = [l.get("uuid") for l in links if l.get("uuid") is not None]
pkg_ids = list(pkg_map.keys())
@@ -579,6 +702,14 @@ def try_remove_from_jd(dev, links: List[Dict[str, Any]], pkg_map: Dict[Any, Dict
except Exception:
continue
endpoint_candidates = [
"downloads/removeLinks",
"downloadsV2/removeLinks",
"downloadcontroller/removeLinks",
]
if call_raw_jd_api(dev, endpoint_candidates, payloads):
return None
return "JDownloader-API: Paket/Links konnten nicht entfernt werden (Wrapper-Methoden nicht vorhanden)."
def try_cancel_from_jd(dev, links: List[Dict[str, Any]], pkg_map: Dict[Any, Dict[str, Any]]) -> Optional[str]:
@@ -617,6 +748,14 @@ def try_cancel_from_jd(dev, links: List[Dict[str, Any]], pkg_map: Dict[Any, Dict
except Exception:
continue
endpoint_candidates = [
"downloads/removeLinks",
"downloadsV2/removeLinks",
"downloadcontroller/removeLinks",
]
if call_raw_jd_api(dev, endpoint_candidates, payloads):
return None
return "JDownloader-API: Abbrechen fehlgeschlagen (Wrapper-Methoden nicht vorhanden)."
def cancel_job(dev, jobid: str) -> Optional[str]:
@@ -692,6 +831,16 @@ def worker(jobid: str):
time.sleep(POLL_SECONDS)
continue
all_demo = all(is_demo_link(l.get("name", "")) for l in links)
if all_demo and not is_demo_link(job.url):
cancel_msg = cancel_job(dev, jobid)
with lock:
job.status = "failed"
base_msg = "JDownloader lieferte das Demo-Video Big Buck Bunny statt des gewünschten Links."
job.message = f"{base_msg} {cancel_msg}" if cancel_msg else base_msg
job.progress = 0.0
return
all_finished = all(bool(l.get("finished")) for l in links)
if not all_finished:
progress = calculate_progress(links)
@@ -782,7 +931,19 @@ def worker(jobid: str):
def favicon():
return HTMLResponse(status_code=204)
def render_page(error: str = "") -> str:
@app.get("/jobs", response_class=HTMLResponse)
def jobs_get():
return HTMLResponse(render_job_rows())
@app.get("/logs", response_class=HTMLResponse)
def logs_get():
return HTMLResponse(render_logs_page())
@app.get("/logs/data", response_class=PlainTextResponse)
def logs_data():
return PlainTextResponse(get_connection_logs())
def render_job_rows() -> str:
rows = ""
with lock:
job_list = list(jobs.values())[::-1]
@@ -798,21 +959,29 @@ def render_page(error: str = "") -> str:
cancel_html = ""
if j.status not in {"finished", "failed", "canceled"}:
cancel_html = (
f"<form method='post' action='/cancel/{j.id}' class='inline-form'>"
f"<form method='post' action='/cancel/{esc(j.id)}' class='inline-form'>"
f"<button type='submit' class='danger'>Abbrechen</button>"
f"</form>"
)
status_class = "error" if j.status == "failed" else ("success" if j.status == "finished" else "")
rows += (
f"<tr>"
f"<td><code>{j.id}</code></td>"
f"<td style='max-width:560px; word-break:break-all;'>{j.url}</td>"
f"<td>{j.package_name}</td>"
f"<td>{j.library}</td>"
f"<td><b>{j.status}</b><br/><small>{j.message}</small>{progress_html}{cancel_html}</td>"
f"<td><code>{esc(j.id)}</code></td>"
f"<td style='max-width:560px; word-break:break-all;'>{esc(j.url)}</td>"
f"<td>{esc(j.package_name)}</td>"
f"<td>{esc(j.library)}</td>"
f"<td><b class='{status_class}'>{esc(j.status)}</b><br/><small>{esc(j.message)}</small>{progress_html}{cancel_html}</td>"
f"</tr>"
)
err_html = f"<p class='error'>{error}</p>" if error else ""
if not rows:
rows = "<tr><td colspan='5'><em>No jobs yet.</em></td></tr>"
return rows
def render_page(error: str = "") -> str:
rows = render_job_rows()
err_html = f"<p class='error'>{esc(error)}</p>" if error else ""
auth_note = "aktiv" if _auth_enabled() else "aus"
return f"""
<html>
@@ -821,10 +990,18 @@ def render_page(error: str = "") -> str:
<meta charset="utf-8">
<title>JD → Jellyfin</title>
<script>
setInterval(() => {{
async function refreshJobs() {{
if (document.hidden) return;
window.location.reload();
}}, 5000);
try {{
const resp = await fetch('/jobs');
if (!resp.ok) return;
const html = await resp.text();
const tbody = document.getElementById('jobs-body');
if (tbody) tbody.innerHTML = html;
}} catch (e) {{
}}
}}
setInterval(refreshJobs, 5000);
</script>
</head>
<body>
@@ -862,10 +1039,14 @@ def render_page(error: str = "") -> str:
<thead>
<tr><th>JobID</th><th>URL</th><th>Paket</th><th>Ziel</th><th>Status</th></tr>
</thead>
<tbody>
{rows if rows else "<tr><td colspan='5'><em>No jobs yet.</em></td></tr>"}
<tbody id="jobs-body">
{rows}
</tbody>
</table>
<form method="post" action="/clear-finished" style="margin-top:10px;">
<button type="submit" style="background:#666; color:#fff;">Erledigte Jobs entfernen</button>
</form>
</body>
</html>
"""
@@ -878,20 +1059,55 @@ def render_nav(active: str) -> str:
"<div style='margin: 8px 0 14px 0;'>"
+ link("Downloads", "/", "downloads")
+ link("Proxies", "/proxies", "proxies")
+ link("Logs", "/logs", "logs")
+ "</div>"
)
def render_logs_page() -> str:
return f"""
<html>
<head>
<link rel="stylesheet" href="/static/style.css">
<meta charset="utf-8">
<title>JD → Jellyfin (Logs)</title>
<script>
async function refreshLogs() {{
if (document.hidden) return;
try {{
const resp = await fetch('/logs/data');
if (!resp.ok) return;
const text = await resp.text();
const area = document.getElementById('log-body');
if (area) {{
area.value = text;
area.scrollTop = area.scrollHeight;
}}
}} catch (e) {{
}}
}}
setInterval(refreshLogs, 2000);
window.addEventListener('load', refreshLogs);
</script>
</head>
<body>
<h1>JD → Jellyfin</h1>
{render_nav("logs")}
<p class="hint">Verbindungs-Debugger (Echtzeit). Letzte {LOG_BUFFER_LIMIT} Einträge.</p>
<textarea id="log-body" class="log-area" rows="20" readonly></textarea>
</body>
</html>
"""
def render_proxies_page(
error: str = "",
message: str = "",
socks5_in: str = "",
socks4_in: str = "",
http_in: str = "",
out_text: str = "",
export_path: str = "",
) -> str:
err_html = f"<p class='error'>{error}</p>" if error else ""
msg_html = f"<p class='success'>{message}</p>" if message else ""
err_html = f"<p class='error'>{esc(error)}</p>" if error else ""
msg_html = f"<p class='success'>{esc(message)}</p>" if message else ""
return f"""
<html>
<head>
@@ -908,27 +1124,22 @@ def render_proxies_page(
<form method="post" action="/proxies">
<div class="row">
<label>SOCKS5 (ein Proxy pro Zeile, z. B. IP:PORT)</label><br/>
<textarea name="socks5_in" rows="6" style="width:100%; max-width:860px; padding:10px; border:1px solid #ccc; border-radius:8px;">{socks5_in}</textarea>
<textarea name="socks5_in" rows="6" style="width:100%; max-width:860px; padding:10px; border:1px solid #ccc; border-radius:8px;">{esc(socks5_in)}</textarea>
</div>
<div class="row">
<label>SOCKS4 (ein Proxy pro Zeile, z. B. IP:PORT)</label><br/>
<textarea name="socks4_in" rows="6" style="width:100%; max-width:860px; padding:10px; border:1px solid #ccc; border-radius:8px;">{socks4_in}</textarea>
</div>
<div class="row">
<label>HTTP (ein Proxy pro Zeile, z. B. IP:PORT)</label><br/>
<textarea name="http_in" rows="6" style="width:100%; max-width:860px; padding:10px; border:1px solid #ccc; border-radius:8px;">{http_in}</textarea>
<textarea name="socks4_in" rows="6" style="width:100%; max-width:860px; padding:10px; border:1px solid #ccc; border-radius:8px;">{esc(socks4_in)}</textarea>
</div>
<button type="submit">In JDownloader-Format umwandeln</button>
</form>
<h2 style="margin-top:18px;">JDownloader Import-Liste</h2>
<p class="hint">Format: <code>socks5://IP:PORT</code>, <code>socks4://IP:PORT</code>, <code>http://IP:PORT</code>. Keine Prüfung/Validierung.</p>
<p class="hint">Format: <code>socks5://IP:PORT</code>, <code>socks4://IP:PORT</code>. Keine Prüfung/Validierung.</p>
<div class="row">
<textarea id="out" rows="12" readonly style="width:100%; max-width:860px; padding:10px; border:1px solid #ccc; border-radius:8px;">{out_text}</textarea>
<textarea id="out" rows="12" readonly style="width:100%; max-width:860px; padding:10px; border:1px solid #ccc; border-radius:8px;">{esc(out_text)}</textarea>
</div>
<button type="button" onclick="navigator.clipboard.writeText(document.getElementById('out').value)">Kopieren</button>
@@ -937,13 +1148,12 @@ def render_proxies_page(
<p class="hint">Speichert die Liste als <code>.jdproxies</code> im Container, z. B. zum Import in JDownloader → Verbindungsmanager → Importieren.</p>
<form method="post" action="/proxies/save">
<textarea name="socks5_in" style="display:none;">{socks5_in}</textarea>
<textarea name="socks4_in" style="display:none;">{socks4_in}</textarea>
<textarea name="http_in" style="display:none;">{http_in}</textarea>
<textarea name="socks5_in" style="display:none;">{esc(socks5_in)}</textarea>
<textarea name="socks4_in" style="display:none;">{esc(socks4_in)}</textarea>
<button type="submit">Liste als JDProxies speichern</button>
</form>
<p class="hint">Aktueller Pfad: <code>{export_path or PROXY_EXPORT_PATH}</code></p>
<p class="hint">Aktueller Pfad: <code>{esc(export_path or PROXY_EXPORT_PATH)}</code></p>
</body>
</html>
"""
@@ -958,41 +1168,59 @@ def index():
@app.post("/submit")
def submit(url: str = Form(...), package_name: str = Form(""), library: str = Form("auto")):
ensure_env()
url = url.strip()
package_name = (package_name or "").strip() or "WebGUI"
library = (library or "auto").strip().lower()
try:
ensure_env()
url = url.strip()
package_name = (package_name or "").strip() or "WebGUI"
library = (library or "auto").strip().lower()
if not URL_RE.match(url):
return HTMLResponse(render_page("Nur http(s) URLs erlaubt."), status_code=400)
if not URL_RE.match(url):
return HTMLResponse(render_page("Nur http(s) URLs erlaubt."), status_code=400)
dev = get_device()
resp = dev.linkgrabber.add_links([{
"links": url,
"autostart": True,
"assignJobID": True,
"packageName": package_name,
}])
url_err = check_url_reachable(url)
if url_err:
log_connection(f"URL-Check fehlgeschlagen: {url} -> {url_err}")
return HTMLResponse(render_page(f"Link nicht erreichbar: {url_err}"), status_code=400)
jobid = str(resp.get("id", ""))
if not jobid:
return HTMLResponse(render_page(f"Unerwartete Antwort von add_links: {resp}"), status_code=500)
dev = get_device()
resp = dev.linkgrabber.add_links([{
"links": url,
"autostart": True,
"assignJobID": True,
"packageName": package_name,
}])
with lock:
jobs[jobid] = Job(
id=jobid,
url=url,
package_name=package_name,
library=library,
status="queued",
message="Download gestartet",
progress=0.0,
)
jobid = ""
if isinstance(resp, dict):
jobid = str(resp.get("id", "")).strip()
elif isinstance(resp, (str, int)):
jobid = str(resp).strip()
elif isinstance(resp, list) and resp and isinstance(resp[0], dict):
jobid = str(resp[0].get("id", "")).strip()
t = threading.Thread(target=worker, args=(jobid,), daemon=True)
t.start()
if not jobid:
msg = f"Unerwartete Antwort von add_links (kein Job-ID): {resp!r}"
log_connection(msg)
return HTMLResponse(render_page(msg), status_code=502)
return RedirectResponse(url="/", status_code=303)
with lock:
jobs[jobid] = Job(
id=jobid,
url=url,
package_name=package_name,
library=library,
status="queued",
message="Download gestartet",
progress=0.0,
)
t = threading.Thread(target=worker, args=(jobid,), daemon=True)
t.start()
return RedirectResponse(url="/", status_code=303)
except Exception as e:
log_connection(f"Submit-Fehler: {e}")
return HTMLResponse(render_page(f"Interner Fehler beim Absenden: {e}"), status_code=500)
@app.post("/cancel/{jobid}")
def cancel(jobid: str):
@@ -1006,21 +1234,30 @@ def cancel(jobid: str):
job.message = "Abbruch angefordert…"
return RedirectResponse(url="/", status_code=303)
@app.post("/clear-finished")
def clear_finished():
with lock:
to_remove = [jid for jid, j in jobs.items() if j.status in {"finished", "failed", "canceled"}]
for jid in to_remove:
del jobs[jid]
return RedirectResponse(url="/", status_code=303)
@app.get("/proxies", response_class=HTMLResponse)
def proxies_get():
try:
socks5_in = fetch_proxy_list("https://raw.githubusercontent.com/TheSpeedX/SOCKS-List/master/socks5.txt")
socks4_in = fetch_proxy_list("https://raw.githubusercontent.com/TheSpeedX/SOCKS-List/master/socks4.txt")
http_in = fetch_proxy_list("https://raw.githubusercontent.com/TheSpeedX/SOCKS-List/master/http.txt")
socks5_in = fetch_proxy_list(
"https://api.proxyscrape.com/v4/free-proxy-list/get?request=displayproxies&protocol=socks5&timeout=10000&country=all&ssl=yes&anonymity=elite&skip=0&limit=2000"
)
socks4_in = fetch_proxy_list(
"https://api.proxyscrape.com/v4/free-proxy-list/get?request=displayproxies&protocol=socks4&timeout=10000&country=all&ssl=yes&anonymity=elite&skip=0&limit=2000"
)
s5 = format_proxy_lines(socks5_in, "socks5")
s4 = format_proxy_lines(socks4_in, "socks4")
hp = format_proxy_lines(http_in, "http")
combined = "\n".join([x for x in [s5, s4, hp] if x.strip()])
combined = "\n".join([x for x in [s5, s4] if x.strip()])
return HTMLResponse(render_proxies_page(
socks5_in=socks5_in,
socks4_in=socks4_in,
http_in=http_in,
out_text=combined,
export_path=PROXY_EXPORT_PATH,
))
@@ -1031,18 +1268,15 @@ def proxies_get():
def proxies_post(
socks5_in: str = Form(""),
socks4_in: str = Form(""),
http_in: str = Form(""),
):
try:
s5 = format_proxy_lines(socks5_in, "socks5")
s4 = format_proxy_lines(socks4_in, "socks4")
hp = format_proxy_lines(http_in, "http")
combined = "\n".join([x for x in [s5, s4, hp] if x.strip()])
combined = "\n".join([x for x in [s5, s4] if x.strip()])
return HTMLResponse(render_proxies_page(
socks5_in=socks5_in,
socks4_in=socks4_in,
http_in=http_in,
out_text=combined,
export_path=PROXY_EXPORT_PATH,
))
@@ -1051,7 +1285,6 @@ def proxies_post(
error=str(e),
socks5_in=socks5_in,
socks4_in=socks4_in,
http_in=http_in,
out_text="",
export_path=PROXY_EXPORT_PATH,
), status_code=400)
@@ -1060,19 +1293,16 @@ def proxies_post(
def proxies_save(
socks5_in: str = Form(""),
socks4_in: str = Form(""),
http_in: str = Form(""),
):
try:
s5 = format_proxy_lines(socks5_in, "socks5")
s4 = format_proxy_lines(socks4_in, "socks4")
hp = format_proxy_lines(http_in, "http")
combined = "\n".join([x for x in [s5, s4, hp] if x.strip()])
combined = "\n".join([x for x in [s5, s4] if x.strip()])
export_path = save_proxy_export(combined)
return HTMLResponse(render_proxies_page(
message=f"Proxy-Liste gespeichert: {export_path}",
socks5_in=socks5_in,
socks4_in=socks4_in,
http_in=http_in,
out_text=combined,
export_path=export_path,
))
@@ -1081,7 +1311,6 @@ def proxies_save(
error=str(e),
socks5_in=socks5_in,
socks4_in=socks4_in,
http_in=http_in,
out_text="",
export_path=PROXY_EXPORT_PATH,
), status_code=400)

View File

@@ -19,3 +19,4 @@ code { font-family: ui-monospace, SFMono-Regular, Menlo, Consolas, monospace; fo
.progress-row { display:flex; align-items:center; gap:8px; margin-top:6px; }
.progress-text { font-size:12px; color:#333; min-width:48px; }
.inline-form { margin-top:6px; }
.log-area { width:100%; max-width: 920px; padding:10px; border:1px solid #ccc; border-radius:8px; background:#fff; }