
Docker container project: ble-ai-localizer
Environment: MN reslevis 192.168.1.3
m1.MajorNet-x64.6.6.0-60.bin: 03 October 2025
Gentoo Linux server, kernel 6.6.74-gentoo-x86_64, default Python version 3.10.16
cd /data/service/
###########################################################################################
Step 1 - Create the project structure (on the server)
mkdir -p ble-ai-localizer/{app,config,data,models}
cd /data/service/ble-ai-localizer
touch Dockerfile requirements.txt entrypoint.sh docker-compose.yml
touch app/{__init__.py,main.py,settings.py,auth.py,mqtt_client.py,train.py,infer.py}
touch config/config.yaml config/secrets.yaml
What you get:
app/     Python code (training + inference)
config/  configuration (config.yaml, versioned; secrets.yaml, not versioned)
data/    fingerprint dataset (volume)
models/  saved model (volume)
###########################################################################################
Step 2 - .gitignore (recommended right away)
If you use git:
cat > .gitignore << 'EOF'
# secrets and artifacts
config/secrets.yaml
data/
models/
# python
__pycache__/
*.pyc
.venv/
# editor
.vscode/
.idea/
EOF
###########################################################################################
Step 3 - Check Docker prerequisites (Gentoo)
Run:
docker --version
docker compose version
If `docker compose` does not exist, on Gentoo you usually either:
- have the docker-cli-compose plugin, or
- use the legacy docker-compose binary (in that case the commands below need adapting).
Once the commands from Steps 1-3 have been run, the next step creates:
requirements.txt, minimal and pinned
a compatible Dockerfile (python:3.10-slim)
docker-compose.yml with volumes (config/, data/, models/) and a MODE=train|infer variable
Recorded output on this server:
docker --version
docker compose version
Docker version 27.4.1, build b9d17eaebb55b7652ce37ae5c7c52fcb34194956
Docker Compose version v2.28.1
All set (Docker 27.4.1 + Compose v2.28.1).
###########################################################################################
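The recorded version lines above can be turned into comparable tuples with a small stdlib-only sketch (the helper name and regex are my own, not part of the project):

```python
import re

# Parses a "Docker version X.Y.Z, build ..." or "Docker Compose version vX.Y.Z"
# line into a version tuple; illustrative helper, sample lines are the output
# recorded above.
def docker_version(line: str) -> tuple[int, ...]:
    m = re.search(r"version v?(\d+)\.(\d+)\.(\d+)", line, re.IGNORECASE)
    if not m:
        raise ValueError(f"unrecognized version line: {line!r}")
    return tuple(int(x) for x in m.groups())

line = "Docker version 27.4.1, build b9d17eaebb55b7652ce37ae5c7c52fcb34194956"
print(docker_version(line))  # → (27, 4, 1)
```

This makes "is the installed version new enough?" a plain tuple comparison, e.g. `docker_version(line) >= (20, 10, 0)`.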
Step 4 - Populate the base files (requirements, Dockerfile, entrypoint, compose)
4.1 requirements.txt
Contents (pinned and minimal for this use case):
cat > requirements.txt << 'EOF'
pandas==2.2.2
numpy==1.26.4
scikit-learn==1.5.1
joblib==1.4.2
paho-mqtt==2.1.0
requests==2.32.3
PyYAML==6.0.2
fastapi==0.115.0
uvicorn[standard]==0.30.6
matplotlib==3.9.2
seaborn==0.13.2
EOF
(If plots turn out to be unnecessary in production, matplotlib/seaborn can be dropped.)
4.2 Dockerfile (pinned Python, non-root, robust)
cat > Dockerfile << 'EOF'
FROM python:3.10-slim
# native build dependencies useful for numpy/scikit-learn (safe choice)
RUN apt-get update && apt-get install -y --no-install-recommends \
    build-essential gcc g++ \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY requirements.txt /app/requirements.txt
RUN pip install --no-cache-dir -r /app/requirements.txt
COPY app/ /app/app/
COPY entrypoint.sh /app/entrypoint.sh
RUN chmod +x /app/entrypoint.sh
# non-root user
RUN useradd -m appuser
USER appuser
# data/model directories (volumes)
RUN mkdir -p /home/appuser/data /home/appuser/models
ENV DATA_DIR=/home/appuser/data
ENV MODELS_DIR=/home/appuser/models
ENTRYPOINT ["/app/entrypoint.sh"]
EOF
4.3 entrypoint.sh (MODE=train|infer|api)
cat > entrypoint.sh << 'EOF'
#!/usr/bin/env bash
set -euo pipefail
MODE="${MODE:-infer}"
case "$MODE" in
  train)
    exec python -m app.main train
    ;;
  infer)
    exec python -m app.main infer
    ;;
  api)
    exec uvicorn app.main:api --host 0.0.0.0 --port "${PORT:-8080}"
    ;;
  *)
    echo "Invalid MODE: $MODE (use train|infer|api)"
    exit 1
    ;;
esac
EOF
Then make it executable:
chmod +x entrypoint.sh
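The dispatch logic of entrypoint.sh can be sketched in Python to make the routing explicit (the function is illustrative, not project code; it just maps a MODE value to the argv the container would exec):

```python
import os

# Mirror of the entrypoint.sh case statement: MODE selects the process to exec,
# with "infer" as the default (MODE="${MODE:-infer}").
def resolve_command(mode: str, port: str = "8080") -> list[str]:
    """Return the command entrypoint.sh would exec for a given MODE."""
    if mode == "train":
        return ["python", "-m", "app.main", "train"]
    if mode == "infer":
        return ["python", "-m", "app.main", "infer"]
    if mode == "api":
        return ["uvicorn", "app.main:api", "--host", "0.0.0.0", "--port", port]
    raise ValueError(f"Invalid MODE: {mode} (use train|infer|api)")

mode = os.environ.get("MODE", "infer")  # same default as the shell script
print(resolve_command(mode))
```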
4.4 docker-compose.yml (volumes + external config)
cat > docker-compose.yml << 'EOF'
services:
  ble-ai-localizer:
    build: .
    image: ble-ai-localizer:0.1.0
    environment:
      MODE: "infer"
      CONFIG_FILE: "/app/config/config.yaml"
      SECRETS_FILE: "/app/config/secrets.yaml"
    volumes:
      - ./config:/app/config:ro
      - ./data:/data
      - ./models:/models
    restart: unless-stopped
EOF
4.5 config/config.yaml (skeleton)
cat > config/config.yaml << 'EOF'
mqtt:
  host: "mosquitto"
  port: 1883
  topic: "ble/raw"
api:
  get_gateways_url: "https://APIHOST:5050/reslevis/getGateways"
  verify_tls: false
  refresh_seconds: 300
oidc:
  token_url: "https://KEYCLOAK/realms/REALM/protocol/openid-connect/token"
  client_id: "Fastapi"
  audience: "Fastapi"
paths:
  dataset: "/data/fingerprint.parquet"
  model: "/models/model.joblib"
ml:
  # general parameters; the algorithm is an internal detail
  method: "knn"
  k: 7
  weights: "distance"
  metric: "euclidean"
EOF
4.6 config/secrets.yaml (placeholder, do not version)
cat > config/secrets.yaml << 'EOF'
oidc:
  client_secret: "CHANGE_ME"
  username: "CHANGE_ME"
  password: "CHANGE_ME"
EOF
###########################################################################################
Step 5 - Add a minimal main to verify that the container starts
app/main.py
cat > app/main.py << 'EOF'
from fastapi import FastAPI
from .settings import load_settings

api = FastAPI()

@api.get("/health")
def health():
    return {"status": "ok"}

def main():
    import sys
    settings = load_settings()
    print("Settings loaded. Keys:", list(settings.keys()))
    if len(sys.argv) < 2:
        raise SystemExit("Usage: python -m app.main [train|infer]")
    cmd = sys.argv[1].lower()
    if cmd == "train":
        print("TRAIN mode (placeholder)")
    elif cmd == "infer":
        print("INFER mode (placeholder)")
    else:
        raise SystemExit("Unknown command")

if __name__ == "__main__":
    main()
EOF
app/settings.py
cat > app/settings.py << 'EOF'
import os
from pathlib import Path
import yaml

def _read_yaml(path: str) -> dict:
    with open(path, "r", encoding="utf-8") as f:
        return yaml.safe_load(f) or {}

def deep_merge(a: dict, b: dict) -> dict:
    out = dict(a or {})
    for k, v in (b or {}).items():
        if isinstance(v, dict) and isinstance(out.get(k), dict):
            out[k] = deep_merge(out[k], v)
        else:
            out[k] = v
    return out

def load_settings() -> dict:
    cfg_path = os.getenv("CONFIG_FILE", "/app/config/config.yaml")
    settings = _read_yaml(cfg_path)
    secrets_path = os.getenv("SECRETS_FILE", "")
    if secrets_path and Path(secrets_path).exists():
        secrets = _read_yaml(secrets_path)
        settings = deep_merge(settings, secrets)
    # fallback paths (consistent with compose)
    settings.setdefault("paths", {})
    settings["paths"].setdefault("dataset", os.getenv("DATASET_PATH", "/data/fingerprint.parquet"))
    settings["paths"].setdefault("model", os.getenv("MODEL_PATH", "/models/model.joblib"))
    return settings
EOF
(The other files can stay empty for now.)
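The merge semantics of the settings loader (secrets.yaml values override config.yaml key by key, nested sections merged recursively) can be checked with a standalone copy of the deep_merge helper:

```python
# Standalone copy of deep_merge from app/settings.py, to illustrate how
# secrets override config without wiping sibling keys.
def deep_merge(a: dict, b: dict) -> dict:
    out = dict(a or {})
    for k, v in (b or {}).items():
        if isinstance(v, dict) and isinstance(out.get(k), dict):
            out[k] = deep_merge(out[k], v)  # recurse into nested sections
        else:
            out[k] = v  # scalar or new key: b wins
    return out

config = {"oidc": {"client_id": "Fastapi", "client_secret": None}, "mqtt": {"port": 1883}}
secrets = {"oidc": {"client_secret": "CHANGE_ME"}}
merged = deep_merge(config, secrets)
print(merged)
# oidc.client_secret now comes from secrets; client_id and mqtt are untouched
```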
###########################################################################################
Step 6 - Build & smoke test
docker compose build
###########################################################################################
### END initial container creation #################
###########################################################################################
###########################################################################################
1) How to start ONLY ble-ai-localizer (without impacting the other projects)
Go to the project directory:
cd /data/service/ble-ai-localizer
Start in the background:
docker compose -p ble-ai-localizer up -d
Why -p ble-ai-localizer?
It sets the project name explicitly, so you can be 100% sure you never attach to another compose project by mistake.
Check status (this project only):
docker compose -p ble-ai-localizer ps
Logs (this project only):
docker compose -p ble-ai-localizer logs -f
2) How to stop/restart ONLY ble-ai-localizer
Stop (does not remove containers):
docker compose -p ble-ai-localizer stop
Restart:
docker compose -p ble-ai-localizer restart
Stop + remove the project's containers/network (does NOT touch the bind volumes ./data, ./models):
docker compose -p ble-ai-localizer down
3) Update only this container (code or Dockerfile changed)
Rebuild the image:
docker compose -p ble-ai-localizer build
Restart with the new image:
docker compose -p ble-ai-localizer up -d
To force rebuild+restart in one shot:
docker compose -p ble-ai-localizer up -d --build
4) Export the image (backup or deploy to another server)
Example export to tar (better gzipped):
docker save ble-ai-localizer:0.1.0 | gzip > ble-ai-localizer_0.1.0.tar.gz
On the other server:
gzip -dc ble-ai-localizer_0.1.0.tar.gz | docker load
# Debug
mosquitto_sub -v -h 192.168.1.101 -p 1883 -t '#' -V mqttv311 | grep publish_out
docker compose -p ble-ai-localizer exec -T ble-ai-localizer ls -l /data/config/
# Case of a gateway that detects no beacons:
publish_out/ac233fc1dcd3 [{"timestamp":"2026-01-30T13:40:08.885Z","type":"Gateway","mac":"AC233FC1DCD3","nums":0}]
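A minimal sketch (assuming the payload format shown above; the helper itself is illustrative, not the project's code) of how such a message can be parsed to spot gateways reporting zero detected beacons:

```python
import json

# Sample payload as published on publish_out/<gateway-mac>, taken verbatim
# from the debug example above.
payload = '[{"timestamp":"2026-01-30T13:40:08.885Z","type":"Gateway","mac":"AC233FC1DCD3","nums":0}]'

def gateways_with_no_beacons(raw: str) -> list[str]:
    """Return MACs of Gateway entries whose 'nums' (detected beacons) is 0."""
    entries = json.loads(raw)
    return [e["mac"] for e in entries
            if e.get("type") == "Gateway" and e.get("nums") == 0]

print(gateways_with_no_beacons(payload))  # → ['AC233FC1DCD3']
```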
# Check which model is in use:
# Version 1
docker compose -p ble-ai-localizer exec -T ble-ai-localizer python - <<'PY'
import hashlib, joblib, os
p = "/data/model/model.joblib"
b = open(p, "rb").read()
print(f"FILE: {p}")
print(f"sha256={hashlib.sha256(b).hexdigest()[:12]} size={len(b)} bytes")
m = joblib.load(p)
print("TYPE:", type(m))
for k in ["version", "nan_fill", "k_floor", "k_xy", "weights", "metric", "floors"]:
    print(f"{k}:", getattr(m, k, None))
gws = getattr(m, "feature_gateways", [])
print("gateways:", len(gws), "first:", gws[:5])
regs = getattr(m, "xy_regs", {})
print("xy_regs floors:", sorted(list(regs.keys())))
PY
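Note that the two checks differ in how they read model metadata: Version 1 uses getattr (object attributes), Version 2 uses dict .get. A small helper (hypothetical, not project code) that works for either representation:

```python
# Reads a metadata field from a loaded model whether it is stored as a dict
# (Version 2 style) or as an object with attributes (Version 1 style).
def meta(model, key, default=None):
    if isinstance(model, dict):
        return model.get(key, default)
    return getattr(model, key, default)

# Stand-ins for the two model representations (illustrative values only):
class ObjModel:
    k_floor = 3

dict_model = {"k_floor": 3, "metric": "euclidean"}

print(meta(ObjModel(), "k_floor"))  # → 3
print(meta(dict_model, "metric"))   # → euclidean
print(meta(dict_model, "missing"))  # → None
```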
(If you get: service "ble-ai-localizer" is not running, start the project first.)
# Version 2
docker compose -p ble-ai-localizer exec -T ble-ai-localizer python - <<'PY'
import joblib, pprint
m = joblib.load("/data/model/model.joblib")
keys = [
    "created_at_utc", "sklearn_version", "numpy_version",
    "gateways_order", "nan_fill", "k_floor", "k_xy", "weights", "metric", "floors",
]
pprint.pprint({k: m.get(k) for k in keys})
PY
# Example API server usage
Obtain a token:
TOKEN=$(
curl -k -s -X POST "https://10.251.0.30:10002/realms/API.Server.local/protocol/openid-connect/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=password" \
  -d "client_id=Fastapi" \
  -d "client_secret=wojuoB7Z5xhlPFrF2lIxJSSdVHCApEgC" \
  -d "username=core" \
  -d "password=C0r3_us3r_Cr3d3nt14ls" \
  -d "audience=Fastapi" \
  | jq -r '.access_token'
)
Use the token in an API call (note: double quotes around the Authorization header so $TOKEN expands):
curl -k -X 'GET' \
  'https://10.251.0.30:5050/reslevis/getTrackers' \
  -H 'accept: application/json' \
  -H "Authorization: Bearer $TOKEN"
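The same password-grant request can be sketched in Python with the stdlib (endpoint and field names come from the curl example above; the helper itself is illustrative). Only the form-encoding step runs here, no network call:

```python
from urllib.parse import urlencode

# Builds the x-www-form-urlencoded body for the Keycloak password-grant
# request shown above; credential values here are placeholders.
def token_request_body(client_id, client_secret, username, password, audience):
    return urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "client_secret": client_secret,
        "username": username,
        "password": password,
        "audience": audience,
    })

body = token_request_body("Fastapi", "SECRET", "core", "PASS", "Fastapi")
print(body)
```

The resulting string is what `-d` accumulates on the curl command line; POSTing it with `requests` (already in requirements.txt) and reading `access_token` from the JSON response mirrors the `jq -r '.access_token'` step.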
# Software update
cd /data/service/ble-ai-localizer
docker compose up -d --build
docker compose -p ble-ai-localizer build
docker compose -p ble-ai-localizer up -d --build
docker system prune
Container start/stop management
cd /data/service/ble-ai-localizer
docker compose -p ble-ai-localizer up -d
docker compose -p ble-ai-localizer logs -f --tail=200 --timestamps
docker compose -p ble-ai-localizer stop
docker compose -p ble-ai-localizer restart
docker compose -p ble-ai-localizer down
MajorNET ResLevis web access:
https://10.251.0.30/frontend/app_reslevis/app.html#home
Web access to the ble-ai-localizer container:
URL: http://0.0.0.0:8501
http://192.168.1.3:8501/
username and password are in the compose file, docker-compose.yml:
UI_USER: "Admin"
UI_PASSWORD: "pwdadmin1"  <-- kept simple for mobile access
NOTES:
- The k value used represents the line (k=2) or plane (k>2) among the
  training beacons; in training it must match at least the number of beacons
  per room, but it is a global parameter, so it has to hold for every room:
  for example, if you decide to capture 5 measurements per room, it is
  advisable to set k to 3.
- Place beacons at least 1 m from the room walls (avoid mounting them flush
  against adjacent walls).
- For symmetric floors, it is best to take the measurements of each floor at
  matching positions.
- Add a timestamp to the fingerprints and record the raw MQTT traffic (with
  the same recording window, the read values can then be re-evaluated).
- Beacon TX interval: 200 ms
- TX powers: 0, -4, -8, -12
- Collection slot: 30 s
- time 1400
- Inference every 7 s
- During collection, keep at least one test beacon per power level; it is
  excluded from training but used for inference against the traffic recorded
  during the collection phase.
- First test whether the gateway and MQTT can sustain 200 ms.
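The k rule from the first note can be sketched as a hypothetical helper: k is global, so it is bounded by the room with the fewest measurements; the margin of 2 (5 measurements per room -> k=3) is inferred from the note's example, not a stated formula.

```python
# Hypothetical sketch of the global-k selection rule from the notes above.
# Assumption: k = (smallest per-room measurement count) - 2, never below the
# k=2 minimum (a line); the margin of 2 matches the 5 -> 3 example.
def choose_k(measurements_per_room: dict[str, int], margin: int = 2) -> int:
    smallest = min(measurements_per_room.values())
    return max(2, smallest - margin)

rooms = {"room_a": 5, "room_b": 7, "room_c": 5}
print(choose_k(rooms))  # → 3
```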