Use local input data without duplicating it

I have a simple tool that takes a directory as input.
Galaxy asks me to upload a dataset before I can run my tool, but when I select the directory to process, Galaxy uploads the data. Since everything is already local, this effectively duplicates it, and the data is huge, so I don't have enough storage space.
What do I have to do to process a directory without uploading its contents?
Thanks in advance.

The first thing that comes to mind is to use data tables (though this may not be the best answer): store the paths that should serve as inputs in a data table, and combine this with the `watch_tools` option set to `True` so changes are picked up without a restart. Even if you had enough space, I don't think you can use the upload function to upload a folder path anyway.
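To make the data-table idea concrete, here is a minimal sketch of what that could look like. The table name `local_directories`, the column layout, and the `.loc` filename are all hypothetical choices for illustration; adapt them to your tool. The entry goes in `tool_data_table_conf.xml` (or a tool-shed equivalent), and the `.loc` file is a tab-separated list of paths that Galaxy will reference in place rather than copy:

```xml
<!-- tool_data_table_conf.xml: register a table of local directory paths -->
<tables>
    <!-- "local_directories" is a hypothetical table name -->
    <table name="local_directories" comment_char="#">
        <columns>value, name, path</columns>
        <file path="tool-data/local_directories.loc" />
    </table>
</tables>
```

The `.loc` file then lists one directory per line, tab-separated to match the declared columns, e.g. `run_01<TAB>Run 01<TAB>/data/runs/run_01`. In the tool's XML, a select parameter can draw its options from that table, so the user picks a registered directory instead of uploading anything:

```xml
<!-- tool XML: let the user choose a registered directory -->
<param name="input_dir" type="select" label="Directory to process">
    <options from_data_table="local_directories" />
</param>
```

This is only a sketch: the tool itself then receives the `path` column as its input value, and someone with filesystem access still has to add rows to the `.loc` file for each new directory.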