We are interested in deploying and running our own Galaxy server at the University Computing Centre in Zagreb. Our users have expressed a desire for it, so we were hoping you could assist us in this matter.
We provide resources for all Croatian scientists, so the deployment should be done in a scalable manner. We tested CloudMan/CloudLaunch, but that has since been discontinued: https://galaxyproject.org/blog/2021-10-sunsetting-cloudlaunch. We already have a cloud infrastructure based on OpenStack. Could you please recommend the optimal way of deploying Galaxy on top of such an infrastructure?
We would prefer to have a single Galaxy instance where users are authenticated with their national AAI credentials. Ideally, users would spawn their own instances through that interface, similar to a JupyterLab infrastructure. Is this an option with Galaxy?
On the performance side, how do Galaxy's tools compare with the same jobs run directly in a terminal? E.g. bwa-mem2 alignment or processing large GWAS datasets?
The European Galaxy server runs on OpenStack, and we provide Terraform and Ansible roles on GitHub to deploy and configure everything. This stack is used in Belgium, Norway, France, Estonia and Italy to deploy their national Galaxy instances.
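To give a feel for the Terraform side, a minimal sketch of provisioning a single compute instance on OpenStack might look like the following. All names here (image, flavor, key pair, network) are placeholders, not the actual usegalaxy.eu configuration:

```terraform
terraform {
  required_providers {
    openstack = {
      source = "terraform-provider-openstack/openstack"
    }
  }
}

# Hypothetical VM that Ansible would then configure into a Galaxy server.
resource "openstack_compute_instance_v2" "galaxy_server" {
  name        = "galaxy-server"
  image_name  = "ubuntu-22.04" # placeholder image
  flavor_name = "m1.xlarge"    # placeholder flavor
  key_pair    = "admin-key"    # placeholder keypair

  network {
    name = "internal"          # placeholder network
  }
}
```

In practice the published roles cover much more (storage, networking, the cluster, and the Galaxy configuration itself); this only illustrates the general shape of the stack.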
We also offer admin training, and you are invited to visit us in Freiburg to learn more about it.
Spawning a single-user instance for each user on demand is the more challenging part. Such an architecture is used, e.g., in the AnVIL project (https://anvilproject.org/), and much of its code and configuration is public, but as far as I know there is no cookie-cutter solution ready for it.
In the end, Galaxy runs jobs on the compute cluster in almost the same way as if you ran them directly, just with more convenience, tracked provenance, etc. Performance-wise, running a tool inside or outside of Galaxy should be close to equal.
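As a concrete illustration: for your bwa-mem2 example, the command Galaxy's tool wrapper ultimately dispatches to the cluster is essentially the same one you would type yourself. File names and the thread count below are placeholders:

```shell
# What you would run in a terminal, and roughly what Galaxy submits
# to the cluster on your behalf (inputs/outputs are placeholders):
bwa-mem2 index ref.fa
bwa-mem2 mem -t 16 ref.fa reads_1.fq reads_2.fq > aln.sam
```

The overhead Galaxy adds (job scheduling, metadata collection) is small and amortized over the run, so for long-running alignments the wall-clock difference is negligible.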
This really depends. Usually it is one week of hands-on training, but we have also done a few days online, or a week-long online training. We have also hosted a one-month stay in Freiburg where you set up your server together with us as part of an exchange project.