We recently installed a Galaxy instance on our private cloud and configured buckets to hold the data we intend to analyze. However, when we accessed the GUI after installation to upload data, the interface did not display the storage containing our data. Our team is unsure how to resolve this issue. Although we found installation tutorials for Galaxy on commercial platforms like AWS and Azure, these do not cover configurations for a local cloud setup. We would appreciate any assistance or guidance on configuring our system correctly. We’re eager to use this setup extensively for our research projects, but this issue has stalled our progress. Thank you in advance for your help.
Welcome, @Dhinub
It looks like there isn’t any discussion yet for your question. Would you be able to clarify what you want to do a bit more?
I’ll start with two examples of what Galaxy offers; you can then compare/contrast with what you need and explain more.
- Data Read function
Upload tool → Choose Remote Files
Please review that at a public server and let us know how what you want might differ, or even be the same. The “FTP” directory you see there is an area hosted by the server, and it seems similar to how you want to use your buckets.
Also: this would be a read-only file system, yes? Or do you want users to be able to write back to the file system as well?
- Data Write function
History → Export to file
This would be the part that can write to data storage, instead of just hosting some temporary archive link.
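For reference, on a self-hosted Galaxy the sources that appear under “Choose Remote Files” (and the writable targets for export) are typically declared in `config/file_sources_conf.yml`. Here is a minimal sketch, assuming one S3-compatible bucket and one locally mounted directory; every `id`, `label`, bucket name, and path below is a placeholder, and the `s3fs` plugin needs its backing Python package installed in Galaxy’s environment:

```yaml
# config/file_sources_conf.yml -- placeholder values throughout
- type: s3fs                 # S3-compatible object-store plugin
  id: research_bucket
  label: Research bucket
  doc: Data staged for analysis
  bucket: my-research-bucket
  writable: false            # read-only: users can import but not write back

- type: posix                # plain directory mounted on the Galaxy server
  id: shared_drive
  label: Shared drive
  doc: Locally mounted data drive
  root: /mnt/data
  writable: true             # allows History → Export to file to write here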
Let’s start there.
Thanks Jennaj. Will try your suggestions and let you know if they work.