I have many services running on my server and about half of them use postgres. When I installed them manually, I would always just create a new database and reuse the same postgres instance for each service, which seems quite logical to me: the least amount of overhead, fast startup, etc.

But since I started using docker, most of the docker-compose files come with their own instance of postgres. Until now I just let them do it and was running a couple of postgres instances, but it’s getting kind of ridiculous how many postgres instances I run on one server.
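For context, this is roughly the pattern I mean (a minimal sketch with a hypothetical image name, credentials and `DATABASE_URL` variable, not any specific project’s file): each app ships a compose file that bundles its own postgres alongside it.

```yaml
# Typical self-contained compose file: the app brings its own postgres.
# "example/app", the credentials and DATABASE_URL are placeholders.
services:
  app:
    image: example/app:latest
    environment:
      DATABASE_URL: "postgres://app:secret@db:5432/app"
    depends_on:
      - db

  db:
    image: postgres:16
    environment:
      POSTGRES_USER: app
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: app
    volumes:
      - db-data:/var/lib/postgresql/data

volumes:
  db-data:
```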

Do you guys run several dockerized instances of postgres, or do you rewrite the docker compose files to point them at your one central postgres instance? And are there usually any problems with that, like version incompatibilities, etc.?
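To make the question concrete, here’s a sketch of what I mean by rewriting, using the same hypothetical app as above: drop the bundled `db` service and point the connection string at the one postgres running on the host. The `host.docker.internal` / `host-gateway` mapping assumes Docker 20.10+ on Linux; on Docker Desktop the name is built in, and names and credentials are again placeholders.

```yaml
# Same hypothetical app, rewritten to use one central postgres on the docker host.
# The bundled db service and its volume are gone; only the connection string changes.
services:
  app:
    image: example/app:latest
    environment:
      DATABASE_URL: "postgres://app:secret@host.docker.internal:5432/app"
    extra_hosts:
      - "host.docker.internal:host-gateway"  # maps the name to the host on Linux
```

On the postgres side you’d presumably still create a separate database and role per service, and make sure postgres listens on the docker bridge and pg_hba.conf allows connections from the container subnet.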

  • LifeBandit666@feddit.uk · 3 months ago

    I agree to a certain extent and I’m actively using Docker.

    What I’ve done is make an Ubuntu VM, put Docker on it and boot a Portainer client container on it, then turn that into a container template, so I can just give it an IP address, boot it up and add it to Portainer in 3 clicks.

    It’s great for just having a go on something and seeing if I wanna pursue it.

    But so far I’ve tried to boot and run Arr and Plex, and more recently Logitech Media Server, and it’s just been hard work.

    I’ve found I’m making more VMs than I thought I would and just putting things together in them, rather than trying to run stacks of Docker together.

    That said, it looks like it is awesome when you know what you’re doing.