PriceHunter/Dockerfile
mariosemes a3ae3b248f Use remote Chromium container instead of local Puppeteer launch
- Add browserless/chromium container to docker-compose
- Add docker-compose.dev.yml for local dev (Chromium on port 3001)
- Browser scraper connects via WebSocket (CHROMIUM_WS env var)
- Falls back to local launch if CHROMIUM_WS not set
- Remove Chromium install from Dockerfile (smaller image)
- Auto-reconnect on browser disconnect

Tested: remote Chromium connects in ~500ms, HG Spot scrapes in
~2.2s total. No longer blocks the Node.js event loop.
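
The remote-connect-with-fallback behavior described above can be sketched as pure decision logic. This is an illustrative sketch, not PriceHunter's actual code: the type and function names are hypothetical, and the puppeteer calls that would consume the result are shown only in comments.

```typescript
// Decide how the scraper should obtain a browser. The selection logic is
// kept pure (no puppeteer import) so it is easy to test in isolation;
// the caller would pass the result to puppeteer.connect() or launch().
// All names here are hypothetical, not PriceHunter's real API.
type BrowserTarget =
  | { mode: "remote"; wsEndpoint: string } // connect to browserless/chromium
  | { mode: "local" };                     // fall back to a local launch

function chooseBrowserTarget(
  env: Record<string, string | undefined>
): BrowserTarget {
  const ws = env.CHROMIUM_WS?.trim();
  // Remote when CHROMIUM_WS is set and non-empty, local otherwise.
  return ws ? { mode: "remote", wsEndpoint: ws } : { mode: "local" };
}

// With CHROMIUM_WS=ws://chromium:3000 the scraper would call
// puppeteer.connect({ browserWSEndpoint: target.wsEndpoint });
// without it, puppeteer.launch(). On a "disconnected" event the
// scraper re-runs chooseBrowserTarget and re-establishes the session,
// which gives the auto-reconnect behavior noted above.
```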

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 22:47:40 +01:00


# Stage 1: Build client
FROM node:20-alpine AS client-build
WORKDIR /app/src/client
COPY src/client/package*.json ./
RUN npm ci
COPY src/client/ ./
RUN npm run build

# Stage 2: Build server
FROM node:20-alpine AS server-build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY tsconfig.json ./
COPY src/server/ ./src/server/
RUN npx tsc

# Stage 3: Production runtime
FROM node:20-alpine
WORKDIR /app
# Skip Chromium download — we use a remote Chromium container
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=server-build /app/dist ./dist
COPY --from=client-build /app/src/client/build ./dist/client
COPY src/server/db/migrations ./dist/server/db/migrations
ENV NODE_ENV=production
ENV PORT=3000
ENV DATABASE_PATH=/app/data/pricehunter.db
ENV STORES_DIR=/app/stores
ENV CHROMIUM_WS=ws://chromium:3000
EXPOSE 3000
VOLUME /app/data
CMD ["node", "dist/server/index.js"]
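
The Chromium side of this setup lives in docker-compose, per the commit message. A minimal sketch of how the two services might be wired together; service names, the image tag, and the volume path are assumptions based on the `CHROMIUM_WS=ws://chromium:3000` default above, not the repository's actual compose file:

```yaml
# docker-compose.yml (sketch) — the app reaches Chromium over the compose
# network as ws://chromium:3000, matching the CHROMIUM_WS default.
services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      - CHROMIUM_WS=ws://chromium:3000
    volumes:
      - ./data:/app/data
    depends_on:
      - chromium

  chromium:
    image: browserless/chromium
    # Not published to the host in production; a dev override file
    # (docker-compose.dev.yml) could map it to host port 3001 instead:
    #   ports:
    #     - "3001:3000"
```

Running a dev override alongside the base file would use `docker compose -f docker-compose.yml -f docker-compose.dev.yml up`.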