Unable to view downloaded files

Hello, I have these lines in Python:

resultsdir = "."
job.downloadResults(directory=resultsdir)

Once the code is run, the job takes some time to finish, but it executes successfully. When I log the directory the results are being downloaded to, it shows this path:
/galaxy_share/database/job_working_directory2/095/95512/working

But when I cd into that directory, the folder 095 is empty, even though the job is shown as successfully executed. Why is this happening, and where can I find the downloaded files so that I can access them from my code?

Not sure what your end goal is, but…
This folder:
/galaxy_share/database/job_working_directory2/095/95512/ is a temporary folder. It is used to run a tool in a sort of sandbox of its own. I think that when a tool is done, it writes its output to another folder, galaxy/database/files. You will not recognize your files in that folder because they are all stored under a unique number.

You can download your files from the Galaxy web interface, or use BioBlend: https://bioblend.readthedocs.io/en/latest/
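
If you go the BioBlend route, a minimal sketch looks roughly like this. The server URL, API key, and target folder are placeholders you would replace with your own, and here I simply take the most recently updated history:

# Minimal BioBlend sketch: download every dataset from a history.
# URL, API key, and download folder below are placeholders (assumptions).
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="https://your-galaxy-server.example", key="YOUR_API_KEY")

# Pick a history (here just the most recently updated one).
history = gi.histories.get_histories()[0]

# List its datasets and download each one to a local folder.
for dataset in gi.histories.show_history(history["id"], contents=True):
    gi.datasets.download_dataset(
        dataset["id"],
        file_path="/home/username/downloadedfiles",
        use_default_filename=True,
    )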

EDIT:

If you mean that you have two scripts and one needs to use the output of the other, you can make a workflow.

Actually, there is a tool named CIPRES which runs on a remote server. It returns the results after some time and also downloads some files. I just need to get access to those files once they are downloaded. Is there a way to do that?

EDIT: I need to access the files from the code and not from the web interface.

There probably is.

I guess that resultsdir = "." in your code means the “current folder”. In the case of Galaxy, that is a temp folder, as you have now discovered. Can you not just do something like resultsdir = "/home/username/downloadedfiles"?
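
For example, a sketch based on your own snippet (the path is just an illustration, and I am assuming job is your existing CIPRES job object):

import os

# Use a fixed, absolute path instead of "." so the results do not land
# in Galaxy's temporary job working directory (path is just an example).
resultsdir = "/home/username/downloadedfiles"
os.makedirs(resultsdir, exist_ok=True)

job.downloadResults(directory=resultsdir)
print("Results written to:", os.path.abspath(resultsdir))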

That’s what I thought initially, but won’t it be too heavy (in the long run) for the server to store a lot of files every time the CIPRES tool is run? Are the files from other tools also stored permanently on the server, or are they deleted after a period of time?

Whether it is too much data in the long run is up to you; you know your hardware and the goal of your project. You can always delete them, right?

What you want now is not really how Galaxy is designed (I think), but there are always possibilities to get things working. But yes, normally you can delete files after a period of time. Here is a page with info about that: https://galaxyproject.org/admin/config/performance/purge-histories-and-datasets/
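
For your own download folder (separate from Galaxy’s internal storage and the purge scripts linked above), a simple cleanup sketch could look like this; the folder path and 30-day retention are just assumptions you would adjust:

import os
import time

# Hypothetical cleanup sketch: remove result files older than 30 days
# from your own download folder (not Galaxy's internal storage).
resultsdir = "/home/username/downloadedfiles"   # example path
max_age_seconds = 30 * 24 * 60 * 60             # 30-day retention, adjust as needed

now = time.time()
for name in os.listdir(resultsdir):
    path = os.path.join(resultsdir, name)
    if os.path.isfile(path) and now - os.path.getmtime(path) > max_age_seconds:
        os.remove(path)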