Add onlyMatching argument to syncFromSynapse to filter downloads #900

Merged
51 changes: 37 additions & 14 deletions synapseutils/sync.py
@@ -46,7 +46,7 @@ def _sync_executor(syn):


def syncFromSynapse(syn, entity, path=None, ifcollision='overwrite.local', allFiles=None, followLink=False,
manifest="all", downloadFile=True):
manifest="all", downloadFile=True, onlyMatching=None):
"""Synchronizes all the files in a folder (including subfolders) from Synapse and adds a readme manifest with file
metadata.

@@ -69,6 +69,11 @@ def syncFromSynapse(syn, entity, path=None, ifcollision='overwrite.local', allFi
:param downloadFile Determines whether to download the files.
Defaults to True

:param onlyMatching A list of regexes to match against file names.
A file is downloaded only if its name matches
at least one of the regexes.
Defaults to None
Comment on lines +72 to +75

What do you think of simplifying this argument to a single regex since multiple patterns can be combined using |? My rationale is that we can use re.compile() downstream on a single expression to be efficient. I'm guessing evaluating a single pre-compiled regex will be faster than compiling and matching a regex (which is what happens when you use re.match()) in a for-loop.


We could either expect a pre-compiled regex or perform the compilation in syncFromSynapse().
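The compile-upstream idea can be sketched as follows. This is only an illustration of the suggestion, not code from the PR; `compile_patterns` is a hypothetical helper name, and the patterns shown are made up:

```python
import re

def compile_patterns(only_matching):
    # Hypothetical helper: combine a list of patterns into one
    # pre-compiled regex via alternation, so the per-file check
    # is a single match call instead of a re.match() loop.
    if only_matching is None:
        return None
    combined = "|".join(f"(?:{p})" for p in only_matching)
    return re.compile(combined)

pattern = compile_patterns([r".*\.csv", r"report_.*"])
print(pattern.match("report_2021.txt") is not None)  # True
print(pattern.match("notes.txt") is not None)        # False
```

Wrapping each pattern in a non-capturing group keeps alternation boundaries correct even when a pattern itself contains `|`.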


Can we think of additional use cases other than filtering on the entity name? I'm just wondering if we can generalize this parameter to being a dictionary mapping entity properties/annotations (keys) to pre-compiled regular expressions (values)?

@thomasyu888: What do you think?
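One possible shape for that generalization, purely as a sketch (the dict keys, the all-patterns-must-match rule, and the `entity_passes` helper are assumptions, not anything decided in this thread):

```python
import re

# Hypothetical filter: entity properties/annotations (keys) mapped to
# pre-compiled regexes (values). Here every pattern must match for the
# entity to be downloaded.
filters = {
    "name": re.compile(r".*\.bam"),
    "assay": re.compile(r"rnaSeq"),
}

def entity_passes(entity_props, filters):
    # Missing keys are treated as empty strings, which fail to match.
    return all(
        pattern.match(str(entity_props.get(key, "")))
        for key, pattern in filters.items()
    )

print(entity_passes({"name": "sample1.bam", "assay": "rnaSeq"}, filters))  # True
print(entity_passes({"name": "sample1.bam", "assay": "wgs"}, filters))     # False
```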


Shifting conversation to here: https://sagebionetworks.jira.com/browse/SYNPY-1236. @vpchung will be tackling this when she has the time.


:returns: list of entities (files, tables, links)

This function will crawl all subfolders of the project/folder specified by `entity` and download all files that have
@@ -104,7 +109,7 @@ def syncFromSynapse(syn, entity, path=None, ifcollision='overwrite.local', allFi
# 2 threads always, if those aren't available then we'll run single threaded to avoid a deadlock
with _sync_executor(syn) as executor:
sync_from_synapse = _SyncDownloader(syn, executor)
files = sync_from_synapse.sync(entity, path, ifcollision, followLink, downloadFile, manifest)
files = sync_from_synapse.sync(entity, path, ifcollision, followLink, downloadFile, manifest, onlyMatching)

# the allFiles parameter used to be passed in as part of the recursive implementation of this function
# with the public signature invoking itself. now that this isn't a recursive any longer we don't need
@@ -225,7 +230,7 @@ def __init__(self, syn, executor: concurrent.futures.Executor, max_concurrent_fi
max_concurrent_file_downloads = max(int(max_concurrent_file_downloads or self._syn.max_threads / 2), 1)
self._file_semaphore = threading.BoundedSemaphore(max_concurrent_file_downloads)

def sync(self, entity, path, ifcollision, followLink, downloadFile=True, manifest="all"):
def sync(self, entity, path, ifcollision, followLink, downloadFile=True, manifest="all", onlyMatching=None):
progress = CumulativeTransferProgress('Downloaded')

if is_synapse_id(entity):
@@ -238,7 +243,7 @@ def sync(self, entity, path, ifcollision, followLink, downloadFile=True, manifes
)

if is_container(entity):
root_folder_sync = self._sync_root(entity, path, ifcollision, followLink, progress, downloadFile, manifest)
root_folder_sync = self._sync_root(entity, path, ifcollision, followLink, progress, downloadFile, manifest, onlyMatching)

# once the whole folder hierarchy has been traversed this entrant thread waits for
# all file downloads to complete before returning
@@ -256,22 +261,39 @@ def sync(self, entity, path, ifcollision, followLink, downloadFile=True, manifes
files.sort(key=lambda f: f.get('path') or '')
return files

def _sync_file(self, entity_id, parent_folder_sync, path, ifcollision, followLink, progress, downloadFile):
def _sync_file(self, entity_id, parent_folder_sync, path, ifcollision, followLink, progress, downloadFile, onlyMatching):
try:
# we use syn.get to download the File.
# these context managers ensure that we are using some shared state
# when conducting that download (shared progress bar, ExecutorService shared
# by all multi threaded downloads in this sync)
with progress.accumulate_progress(), \
download_shared_executor(self._executor):

entity = self._syn.get(
entity_id,
downloadLocation=path,
ifcollision=ifcollision,
followLink=followLink,
downloadFile=downloadFile,
)

file_matches = True
if onlyMatching is not None:


Nit (i.e. optional): This can be simplified to if onlyMatching: or perhaps more defensively as if isinstance(onlyMatching, re.Pattern) if we expect pre-compiled regex objects.

file_matches = False
entity_meta = self._syn.get(
entity_id,
downloadFile=False,
)
for regex in onlyMatching:
if re.match(regex, entity_meta.name) is not None:
file_matches = True
Comment on lines +280 to +282


As per my above comment, I think this part can be made more efficient if re.compile() was used upstream on a single regex (optionally, with multiple patterns separated by |) and then matched once here, like:

onlyMatching.match(entity_meta.name)


if file_matches:


I don't think we need the file_matches flag. Couldn't we simply short-circuit this function with an early return statement if onlyMatching is set and there are no matches? This way, we avoid indenting the self._syn.get(..., downloadFile=downloadFile) bit. We would have to run the parent_folder_sync.update() statement you have below immediately before the return statement.

I'll happily elaborate if my suggestion isn't clear.

entity = self._syn.get(
entity_id,
downloadLocation=path,
ifcollision=ifcollision,
followLink=followLink,
downloadFile=downloadFile,
)
else:
parent_folder_sync.update(
finished_id=entity_id,
)
return

files = []
provenance = None
@@ -302,7 +324,7 @@ def _sync_file(self, entity_id, parent_folder_sync, path, ifcollision, followLin
finally:
self._file_semaphore.release()

def _sync_root(self, root, root_path, ifcollision, followLink, progress, downloadFile, manifest="all"):
def _sync_root(self, root, root_path, ifcollision, followLink, progress, downloadFile, manifest="all", onlyMatching=None):
# stack elements are a 3-tuple of:
# 1. the folder entity/dict
# 2. the local path to the folder to download to
@@ -372,6 +394,7 @@ def _sync_root(self, root, root_path, ifcollision, followLink, progress, downloa
followLink,
progress,
downloadFile,
onlyMatching,
)

for child_folder in child_folders:
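The filtering semantics introduced by this diff can be summarized in isolation (`should_download` is a stand-in name, not a function in the PR). One detail worth noting: `re.match` is anchored at the start of the name, so suffix-style patterns need a leading `.*`:

```python
import re

def should_download(name, only_matching):
    # Mirrors the diff's logic: with no filter, everything is
    # downloaded; otherwise any single matching regex suffices.
    if only_matching is None:
        return True
    return any(re.match(regex, name) is not None for regex in only_matching)

print(should_download("results.csv", [r".*\.csv"]))  # True
print(should_download("results.csv", [r"\.csv"]))    # False: re.match anchors at the start
print(should_download("anything", None))             # True: no filter set
```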