Every month I need to upload a large batch of files to object storage: about 5,000 files of roughly 3 GB each, so around 15 TB in total.
The files originate on an Azure instance that is publicly accessible, but I do not have direct access to that Azure account, as it belongs to an external company. I am given the URLs of the files and can download them directly.
I have a Python script on a Windows Remote Desktop machine that reads a list of the 5,000 URLs and, for each one, downloads the file, uploads it to Object Storage, deletes the local copy, and moves on to the next, looping through all 5,000 files.
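Here is a simplified sketch of what the script does today. The bucket name, the URL list file, and the chunk size are placeholders for this post, not the real values:

```python
import os
import requests
import oci

# Simplified version of the current script; bucket name and URL list
# file are placeholders.
config = oci.config.from_file()                       # ~/.oci/config
object_storage = oci.object_storage.ObjectStorageClient(config)
namespace = object_storage.get_namespace().data
bucket_name = "monthly-drop"                          # placeholder

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    object_name = os.path.basename(url.split("?", 1)[0])

    # 1) Download the ~3 GB file to local disk in chunks.
    with requests.get(url, stream=True, timeout=300) as resp:
        resp.raise_for_status()
        with open(object_name, "wb") as out:
            for chunk in resp.iter_content(chunk_size=8 * 1024 * 1024):
                out.write(chunk)

    # 2) Upload the local file to Object Storage.
    with open(object_name, "rb") as body:
        object_storage.put_object(namespace, bucket_name, object_name, body)

    # 3) Delete the local copy and move on to the next URL.
    os.remove(object_name)
```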
It currently takes a few days to run, and I would love to get this process under a day, ideally under 10 hours, so it could run overnight on the first of every month.
Is there a faster way to do this?
I would especially love an approach that does not download each file to my remote desktop first, since that step eats up half the time for something I am about to delete anyway.
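What I am imagining is something along these lines, streaming the HTTP response body straight into an Object Storage upload without touching the local disk. Again the bucket name and the example URL are made up, and I have not verified that this is actually faster:

```python
import requests
import oci
from oci.object_storage import UploadManager

config = oci.config.from_file()
client = oci.object_storage.ObjectStorageClient(config)
namespace = client.get_namespace().data
bucket_name = "monthly-drop"   # placeholder

# UploadManager splits large streams into multipart uploads.
upload_manager = UploadManager(client, allow_multipart_uploads=True)

def relay(url: str, object_name: str) -> None:
    """Pipe a file from its public URL into Object Storage without
    writing it to local disk first."""
    with requests.get(url, stream=True, timeout=300) as resp:
        resp.raise_for_status()
        # resp.raw is a file-like stream; UploadManager reads it in parts.
        upload_manager.upload_stream(namespace, bucket_name, object_name, resp.raw)

# Example call with a made-up URL and object name.
relay("https://example.blob.core.windows.net/data/file0001.bin", "file0001.bin")
```

Even then, the data would still pass through my machine, so I am not sure how much it really saves.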
My remote desktop connection has roughly 500 Mbps upload and download speeds. Would Oracle Cloud Functions handle this faster?
Any thoughts? Thanks.