diff --git a/index.html b/index.html index d157c59b3..376fd88c9 100644 --- a/index.html +++ b/index.html @@ -1501,7 +1501,7 @@
The synapseclient
package lets you communicate with the cloud-hosted Synapse service to access data and create shared data analysis projects from within Python scripts or at the interactive Python console. Other Synapse clients exist for R, Java, and the web. The Python client can also be used from the command line.
Installing this package will install synapseclient
, synapseutils
and the command line client. synapseutils
contains beta features and the behavior of these features is subject to change.
If you're just getting started with Synapse, have a look at the Getting Started guides for Synapse.
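A minimal session looks like this (a sketch: syn123 is a placeholder entity ID, and login() assumes credentials are already configured):

import synapseclient
syn = synapseclient.login()
entity = syn.get("syn123")
print(entity.name)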
diff --git a/search/search_index.json b/search/search_index.json index 4da1c438f..a558da866 100644 --- a/search/search_index.json +++ b/search/search_index.json @@ -1 +1 @@ -{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Synapse Python/Command Line Client Documentation","text":"The synapseclient
package provides an interface to Synapse, a collaborative, open-source research platform that allows teams to share data, track analyses, and collaborate, providing support for:
The synapseclient
package lets you communicate with the cloud-hosted Synapse service to access data and create shared data analysis projects from within Python scripts or at the interactive Python console. Other Synapse clients exist for R, Java, and the web. The Python client can also be used from the command line.
Installing this package will install synapseclient
, synapseutils
and the command line client. synapseutils
contains beta features and the behavior of these features is subject to change.
If you're just getting started with Synapse, have a look at the Getting Started guides for Synapse.
"},{"location":"news/","title":"Release Notes","text":""},{"location":"news/#320-2023-11-27","title":"3.2.0 (2023-11-27)","text":""},{"location":"news/#highlights","title":"Highlights","text":"get_user_profile_by_username
and get_user_profile_by_id
to handle use cases when a username is a number.
syn.get with ifcollision='overwrite.local' does not always overwrite previous file
ifcollision='overwrite.local'
syn.get with ifcollision='overwrite.local' does not always overwrite previous file
synapse login and synapse config correctly work as a result.
@memoize
decorator with @functools.lru_cache
decorator.
-parent will become --parent. Commands that support camel case like --parentId will be changed to --parent-id.
date_parser and parse_date
in pd.read_csv in the table module
black, the Python auto formatter, was run on the files
>= 1.5
-parent will become --parent. Commands that support camel case like --parentId will be changed to --parent-id.
Locked down pandas version to only support pandas < 1.5
Next major release (3.0.0)...
\\>=
1.5-parent
will become \\--parent
. Commands that support camel case like \\--parentId
will be changed to \\--parent-id
.Added support for Datasets
# from python\nimport synapseclient\nimport synapseutils\nsyn = synapseclient.login()\ndataset_items = [\n {'entityId': \"syn000\", 'versionNumber': 1},\n {...},\n]\ndataset = synapseclient.Dataset(\n name=\"My Dataset\",\n parent=project,\n dataset_items=dataset_items\n)\ndataset = syn.store(dataset)\n# Add/remove specific Synapse IDs to/from the Dataset\ndataset.add_item({'entityId': \"syn111\", 'versionNumber': 1})\ndataset.remove_item(\"syn000\")\ndataset = syn.store(dataset)\n# Add a single Folder to the Dataset\n# this will recursively add all the files in the folder\ndataset.add_folder(\"syn123\")\n# Add a list of Folders, overwriting any existing files in the dataset\ndataset.add_folders([\"syn456\", \"syn789\"], force=True)\ndataset = syn.store(dataset)\n# Create snapshot version of dataset\nsyn.create_snapshot_version(\n dataset.id,\n label=\"v1.0\",\n comment=\"This is version 1\"\n)\n
Added support for downloading from download cart. You can use this feature by first adding items to your download cart on Synapse.
# from python\nimport synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nmanifest_path = syn.get_download_list()\n
# from command line\nsynapse get-download-list\n
In the next major release (3.0.0) there will be major cosmetic changes to the CLI, such as removing all camel case and non-standard single-dash long command line interface (CLI) parameters. For example, command line arguments like -parent will become --parent, and commands that support camel case like --parentId will be changed to --parent-id.
ViewBase
for Datasets instead of SchemaBase
In the next major release (3.0.0) there will be major cosmetic changes to the CLI, such as removing all camel case and non-standard single-dash long command line interface (CLI) parameters. For example, command line arguments like -parent will become --parent, and commands that support camel case like --parentId will be changed to --parent-id.
Added support for materialized views
# from python\nimport synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nview = synapseclient.MaterializedViewSchema(\n name=\"test-material-view\",\n parent=\"syn34234\",\n definingSQL=\"SELECT * FROM syn111 F JOIN syn2222 P on (F.PATIENT_ID = P.PATIENT_ID)\"\n)\nview_ent = syn.store(view)\n
Removed support for Python 3.6 and added support for Python 3.10
Add function to create Synapse config file
# from the command line\nsynapse config\n
includeTypes
to synapseutils.walk()
forceVersion
on changeFileMetadata
dataset
as an entity type to return in getChildren()
-parent will become --parent. Commands that support camel case like --parentId will be changed to --parent-id.
Added ability to generate a manifest file from your local directory structure.
# from the command line\n# write the manifest to manifest.tsv\nsynapse manifest --parent-id syn123 --manifest-file ./manifest.tsv /path/to/local/directory\n# stdout\nsynapse manifest --parent-id syn123 /path/to/local/directory\n
Added ability to pipe manifest stdout into sync function.
# from the command line\nsynapse manifest --parent-id syn123 ./docs/ | synapse sync -\n
Added ability to return summary statistics of csv and tsv files stored in Synapse.
# from python\nimport synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nstatistics = synapseutils.describe(syn=syn, entity=\"syn12345\")\nprint(statistics)\n{\n \"column1\": {\n \"dtype\": \"object\",\n \"mode\": \"FOOBAR\"\n },\n \"column2\": {\n \"dtype\": \"int64\",\n \"mode\": 1,\n \"min\": 1,\n \"max\": 2,\n \"mean\": 1.4\n },\n \"column3\": {\n \"dtype\": \"bool\",\n \"mode\": false,\n \"min\": false,\n \"max\": true,\n \"mean\": 0.5\n }\n}\n
In the next major release (3.0.0) there will be major cosmetic changes to the CLI, such as removing all camel case and non-standard single-dash long command line interface (CLI) parameters. For example, command line arguments like -parent will become --parent, and commands that support camel case like --parentId will be changed to --parent-id.
synapse manifest
stdout in synapse sync
function.
Added ability to authenticate from a SYNAPSE_AUTH_TOKEN environment variable set with a valid personal access token.
# e.g. set environment variable prior to invoking a Synapse command or running a program that uses synapseclient\nSYNAPSE_AUTH_TOKEN='<my_personal_access_token>' synapse <subcommand options>\n
The environment variable will take priority over credentials in the user's .synapseConfig
file or any credentials saved in a prior login using the remember me option.
See here for more details on usage.
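From Python, the token is picked up automatically when logging in without explicit credentials (a minimal sketch, assuming the variable is set in the environment):
import synapseclient\n# SYNAPSE_AUTH_TOKEN is read from the environment by login()\nsyn = synapseclient.login()\n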
Added ability to silence all console output.
# from the command line, use the --silent option with any synapse subcommand, here it will suppress the download progress indicator\nsynapse --silent get <synid>\n
# from code using synapseclient, pass the silent option to the Synapse constructor\nimport synapseclient\n\nsyn = synapseclient.Synapse(silent=True)\nsyn.login()\nsyn.get(<synid>)\n
Improved robustness during downloads with unstable connections. Specifically, the client will automatically recover when encountering some types of network errors that previously would have caused a download to start over, as indicated by a reset progress bar.
Entities can be annotated with boolean datatypes, for example:
file = synapseclient.File('/path/to/file', parentId='syn123', synapse_is_great=True)\nsyn.store(file)\n
synapseclient is additionally packaged as a Python wheel.
The index_files_for_migration and migrate_indexed_files functions are added to synapseutils to help migrate files in Synapse projects and folders between AWS S3 buckets in the same region. More details on using these utilities can be found here.
This version supports logging in programmatically and from the command line using personal access tokens that can be obtained from your synapse.org Settings. Additional documentation on login can be found here.
# programmatic\nsyn = synapseclient.login(authToken=<token>)\n
# command line\nsynapse login -p <token>\n
The location where downloaded entities are cached can be customized to a location other than the user's home directory. This is useful in environments where writing to a home directory is not appropriate (e.g. an AWS lambda).
syn = synapseclient.Synapse(cache_root_dir=<directory path>)\n
A helper method on the Synapse object has been added to enable obtaining the Synapse certification quiz status of a user.
passed = syn.is_certified(<username or user_id>)\n
This version has been tested with Python 3.9.
synapse get -r
and synapse sync
in the command line client, respectively) are transferred in parallel threads rather than serially, substantially improving the performance of these operations.
This version includes a performance improvement for syncFromSynapse downloads of deep folder hierarchies to local filesystem locations outside of the Synapse cache.
Support is added for SubmissionViews that can be used to query and edit a set of submissions through table services.
from synapseclient import SubmissionViewSchema\n\nproject = syn.get(\"syn123\")\nevaluation_id = '9876543'\nview = syn.store(SubmissionViewSchema(name='My Submission View', parent=project, scopes=[evaluation_id]))\nview_table = syn.tableQuery(f\"select * from {view.id}\")\n
A max_threads
property of the Synapse object has been added to customize the number of concurrent threads that will be used during file transfers.
import synapseclient\nsyn = synapseclient.login()\nsyn.max_threads = 20\n
If not customized, the default value is (CPU count + 4). Adjusting this value higher may speed up file transfers if the local system resources can take advantage of the higher setting. Currently this value applies only to files whose underlying storage is AWS S3.
Alternatively, a value can be stored in the synapseConfig configuration file that will automatically apply as the default if a value is not explicitly set.
[transfer]\nmax_threads=16\n
This release includes support for directly accessing S3 storage locations using AWS Security Token Service credentials. This allows use of external AWS clients and libraries with Synapse storage, and can be used to accelerate file transfers under certain conditions. To create an STS enabled folder and set up direct access to S3 storage, see the STS Storage Locations section.
The getAnnotations
and setAnnotations
methods of the Synapse object have been deprecated in favor of newer get_annotations
and set_annotations
methods, respectively. The newer versions are parameterized with a typed Annotations
dictionary rather than a plain Python dictionary to prevent existing annotations from being accidentally overwritten. The expected usage for setting annotations is to first retrieve the existing Annotations
for an entity before saving changes by passing back a modified value.
annos = syn.get_annotations('syn123')\n\n# set key 'foo' to have value of 'bar' and 'baz'\nannos['foo'] = ['bar', 'baz']\n# single values will automatically be wrapped in a list once stored\nannos['qwerty'] = 'asdf'\n\nannos = syn.set_annotations(annos)\n
The deprecated annotations methods may be removed in a future release.
A full list of issues addressed in this release is below.
"},{"location":"news/#bug-fixes_14","title":"Bug Fixes","text":"Python 2 is no longer supported as of this release. This release requires Python 3.6+.
"},{"location":"news/#highlights_17","title":"Highlights:","text":"Multi-threaded download of files from Synapse can be enabled by setting syn.multi_threaded
to True
on a synapseclient.Synapse
object. This will become the default implementation in the future, but to ensure stability for the first release of this feature, it must be intentionally enabled.
import synapseclient\nsyn = synapseclient.login()\nsyn.multi_threaded = True\n# syn123 now will be downloaded via the multi-threaded implementation\nsyn.get(\"syn123\")\n
Currently, multi-threaded download only works with files stored in AWS S3, where most files on Synapse reside. This also includes custom storage locations that point to an AWS S3 bucket. Files not stored in S3 will fall back to single-threaded download even if syn.multi_threaded==True
.
`synapseutils.copy()` now has limitations on what can be copied:
- A user must have download permissions on the entity they want to copy.
- Users cannot copy any entities that have [access requirements](https://help.synapse.org/docs/Sharing-Settings,-Permissions,-and-Conditions-for-Use.2024276030.html).
contentTypes
and fileNames
are optional parameters in synapseutils.copyFileHandles()
Synapse Docker Repository(synapseclient.DockerRepository
) objects can now be submitted to Synapse evaluation queues using the entity
argument in synapseclient.Synapse.submit()
. An optional argument docker_tag=\"latest\"
has also been added to synapseclient.Synapse.submit()
\\\" to designate which tagged Docker image to submit.
A full list of issues addressed in this release is below.
"},{"location":"news/#bugs-fixes","title":"Bugs Fixes","text":"In version 1.9.2, we improved Views' usability by exposing [set_entity_types()` function to change the entity types that will show up in a View:
import synapseclient\nfrom synapseclient.table import EntityViewType\n\nsyn = synapseclient.login()\nview = syn.get(\"syn12345\")\nview.set_entity_types([EntityViewType.FILE, EntityViewType.FOLDER])\nview = syn.store(view)\n
"},{"location":"news/#features","title":"Features","text":"In version 1.9.1, we fix various bugs and added two new features:
In version 1.9.0, we deprecated and removed query()
and chunkedQuery()
. These functions used the old query services, which did not perform well. To query for entities filtered by annotations, please use EntityViewSchema
.
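For example, once annotations are exposed through a view they can be queried with SQL (a sketch; the IDs and the contributor annotation are placeholders):
from synapseclient import EntityViewSchema\nview = syn.store(EntityViewSchema(name='my annotation view', parent='syn123', scopes=['syn123']))\nresults = syn.tableQuery(\"select * from %s where contributor = 'Sage'\" % view.id)\n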
We also deprecated the following functions and will remove them in Synapse Python client version 2.0. In the Activity
object:
usedEntity()
usedURL()
In the Synapse
object:
getEntity()
loadEntity()
createEntity()
updateEntity()
deleteEntity()
downloadEntity()
uploadFile()
uploadFileHandle()
uploadSynapseManagedFileHandle()
downloadTableFile()
Please see our documentation for more details on how to migrate your code away from these functions.
"},{"location":"news/#features_2","title":"Features","text":"copyWiki
function.
In this release, we performed some housekeeping on the code base. The two major changes are:
making syn.move()
available to move an entity to a new parent in Synapse. For example:
import synapseclient\nfrom synapseclient import Folder\n\nsyn = synapseclient.login()\n\nfile = syn.get(\"syn123\")\nfolder = Folder(\"new folder\", parent=\"syn456\")\nfolder = syn.store(folder)\n\n# moving file to the newly created folder\nsyn.move(file, folder)\n
exposing the ability to use the Synapse Python client in single-threaded mode. This feature is useful when running a Python script in an environment that does not support multi-threading. However, it will negatively impact upload speed. To enable single-threaded mode:
import synapseclient\nsynapseclient.config.single_threaded = True\n
This release is a hotfix for a bug. Please refer to 1.8.0 release notes for information about additional changes.
"},{"location":"news/#bug-fixes_21","title":"Bug Fixes","text":"This release has 2 major changes:
~/synapseCache/.session).
The Python client now relies on keyring to handle credential storage of your Synapse credentials. The remaining changes are bug fixes and cleanup of test code.
Below are the full list of issues addressed by this release:
"},{"location":"news/#bug-fixes_22","title":"Bug Fixes","text":"v1.7.4 release was broken for new users that installed from pip. v1.7.5 has the same changes as v1.7.4 but fixes the pip installation.
"},{"location":"news/#174-2018-01-29","title":"1.7.4 (2018-01-29)","text":"This release mostly includes bugfixes and improvements for various Table classes:
- Fixed a bug where you couldn't store a table converted to a pandas.DataFrame
if it had an INTEGER column with some missing values. - EntityViewSchema
can now automatically add all annotations within your defined scopes
as columns. Just set the view's addAnnotationColumns=True
before calling syn.store()
. This attribute defaults to True
for all newly created EntityViewSchemas
. Setting addAnnotationColumns=True
on existing tables will only add annotation columns that are not already a part of your schema. - You can now use synapseutils.notifyMe
as a decorator to notify you by email when your function has completed. You will also be notified of any errors thrown while your function runs.
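A sketch of the notifyMe decorator described above (assuming a logged-in Synapse object named syn; the messageSubject argument is illustrative):
import synapseutils\n\n@synapseutils.notifyMe(syn, messageSubject='my job finished')\ndef long_running_job():\n    do_the_work()  # placeholder for your own processing\n\nlong_running_job()\n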
We also added some new features:
- syn.findEntityId()
function that allows you to find an Entity by its name and parentId, set parentId to None
to search for Projects by name. - The bulk upload functionality of synapseutils.syncToSynapse
is available from the command line using: synapse sync
.
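For instance, a sketch of findEntityId (the names and parent ID are placeholders):
file_id = syn.findEntityId(name='file.txt', parent='syn123')\nproject_id = syn.findEntityId(name='My Project')  # parent=None searches Projects by name\n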
Below are the full list of issues addressed by this release:
"},{"location":"news/#features_4","title":"Features","text":"Release 1.7.3 introduces fixes and quality of life changes to Tables and synapseutils:
Changes to Tables:
You no longer have to include the etag column in your SQL query when using a tableQuery() to update File/Project Views. Just SELECT the relevant columns and the etags will be resolved automatically.
The PartialRowSet class lets you upload changes to individual cells of a table instead of every row that had a value changed. It is recommended to use the PartialRowSet.from_mapping()
classmethod instead of the PartialRowSet
constructor.Changes to synapseutils:
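A sketch of a partial update via from_mapping (the table ID, row id, and column name are placeholders; the class is importable as PartialRowset):
from synapseclient import PartialRowset\n\nquery_results = syn.tableQuery('select * from syn123')\n# map of row id -> {column name: new value}\npartial_changes = {5: {'age': 30}}\nsyn.store(PartialRowset.from_mapping(partial_changes, query_results))\n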
\\~
to refer to your home directory in your manifest.tsvWe also added improved debug logging and use Python's builtin logging
module instead of printing directly to sys.stderr
Below are the full list of issues addressed by this release:
"},{"location":"news/#bug-fixes_24","title":"Bug Fixes","text":"Release 1.7 is a large bugfix release with several new features. The main ones include:
We have expanded the synapseutils packages to add the ability to:
File View tables can now be created from the python client using EntityViewSchema. See fileviews documentation.
The python client is now able to upload to user owned S3 Buckets. Click here for instructions on linking your S3 bucket to synapse.
We've also made various improvements to existing features:
\\--description
argument when creating/updating entities from the command line client will now create a Wiki
for that entity. You can also use \\--descriptionFile
to write the contents of a markdown file as the entity's Wiki
file_entity.cacheDir
and file_entity.files
is being DEPRECATED in favor of file_entity.path
for finding the location of a downloaded File
pandas
dataframe\\
s containing `datetime` values can now be properly converted into csv and uploaded to Synapse.We also added a optional convert_to_datetime
parameter to CsvFileTable.asDataFrame()
that will automatically convert Synapse DATE columns into datetime
objects instead of leaving them as long
unix timestamps
Below are the full list of bugs and issues addressed by this release:
"},{"location":"news/#features_6","title":"Features","text":"In version 1.6 we introduce a new sub-module _synapseutils that provide convenience functions for more complicated operations in Synapse such as copying of files wikis and folders. In addition we have introduced several improvements in downloading content from Synapse. As with uploads we are now able to recover from an interrupted download and will retry on network failures.
We have improved download robustness and error checking, along with extensive recovery on failed operations. This includes the ability for the client to pause operation when Synapse is updated.
By default, data sets in Synapse are private to your user account, but they can easily be shared with specific users, groups, or the public.
See:
Periodically we will be publishing results of benchmarking the Synapse Python Client compared to directly working with AWS S3. The purpose of these benchmarks is to make data driven decisions on where to spend time optimizing the client. Additionally, it will give us a way to measure the impact of changes to the client.
"},{"location":"explanations/benchmarking/#results","title":"Results","text":""},{"location":"explanations/benchmarking/#12122023-downloading-files-from-synapse","title":"12/12/2023: Downloading files from Synapse","text":"The results were created on a t3a.micro
EC2 instance with a 200GB disk size running in us-east-1. The script that was run can be found in docs/scripts/downloadBenchmark.py
and docs/scripts/uploadTestFiles.py
.
During this download test I tried various thread counts to see what performance looked like at different levels. What I found was that going over the default count of threads during download of large files (10GB and over) led to signficantly unstable performance. The client would often crash or hang during execution. As a result the general reccomendation is as follows:
multiprocessing.cpu_count() + 4
The results were created on a t3a.micro
EC2 instance with a 200GB disk size running in us-east-1. The script that was run can be found in docs/scripts
. The time to create the files on disk is not included.
This test includes adding 5 annotations to each file, a Text, Integer, Floating Point, Boolean, and Date.
S3 was not benchmarked again.
As a result of these tests the sweet spot for thread count is around 50 threads. It is not reccomended to go over 50 threads as it resulted in signficant instability in the client.
Test Thread Count Synapseutils Sync os.walk + syn.store Per file size 25 Files 1MB total size 6 10.75s 10.96s 40KB 25 Files 1MB total size 25 6.79s 11.31s 40KB 25 Files 1MB total size 50 6.05s 10.90s 40KB 25 Files 1MB total size 100 6.14s 10.89s 40KB 775 Files 10MB total size 6 268.33s 298.12s 12.9KB 775 Files 10MB total size 25 162.63s 305.93s 12.9KB 775 Files 10MB total size 50 86.46s 304.40s 12.9KB 775 Files 10MB total size 100 85.55s 304.71s 12.9KB 10 Files 1GB total size 6 27.17s 36.25s 100MB 10 Files 1GB total size 25 22.26s 12.77s 100MB 10 Files 1GB total size 50 22.24s 12.26s 100MB 10 Files 1GB total size 100 Wouldn't complete Wouldn't complete 100MB"},{"location":"explanations/benchmarking/#11142023-uploading-files-to-synapse-default-thread-count","title":"11/14/2023: Uploading files to Synapse, Default thread count","text":"The results were created on a t3a.micro
EC2 instance with a 200GB disk size running in us-east-1. The script that was run can be found in docs/scripts
. The time to create the files on disk is not included.
This test uses the default number of threads in the client: multiprocessing.cpu_count() + 4
The manifest is a tsv file with file locations and metadata to be pushed to Synapse. The purpose is to allow bulk actions through a TSV without the need to manually execute commands for every requested action.
"},{"location":"explanations/manifest_tsv/#manifest-file-format","title":"Manifest file format","text":"The format of the manifest file is a tab delimited file with one row per file to upload and columns describing the file. The minimum required columns are path and parent where path is the local file path and parent is the Synapse Id of the project or folder where the file is uploaded to.
In addition to these columns you can specify any of the parameters to the File constructor (name, synapseStore, contentType) as well as parameters to the syn.store command (used, executed, activityName, activityDescription, forceVersion).
For only updating annotations without uploading new versions of unchanged files, the syn.store parameter forceVersion should be included in the manifest with the value set to False.
Used and executed can be semi-colon (\";\") separated lists of Synapse ids, urls and/or local filepaths of files already stored in Synapse (or being stored in Synapse by the manifest). If you leave a space, like \"syn1234; syn2345\" the white space from \" syn2345\" will be stripped.
Any additional columns will be added as annotations.
"},{"location":"explanations/manifest_tsv/#required-fields","title":"Required fields:","text":"Field Meaning Example path local file path or URL /path/to/local/file.txt parent synapse id syn1235"},{"location":"explanations/manifest_tsv/#common-fields","title":"Common fields:","text":"Field Meaning Example name name of file in Synapse Example_file forceVersion whether to update version False"},{"location":"explanations/manifest_tsv/#activityprovenance-fields","title":"Activity/Provenance fields:","text":"Each of these are individual examples and is what you would find in a row in each of these columns. To clarify, \"syn1235;/path/to_local/file.txt\" below states that you would like both \"syn1234\" and \"/path/to_local/file.txt\" added as items used to generate a file. You can also specify one item by specifying \"syn1234\"
Field Meaning Example used List of items used to generate file \"syn1235;/path/to_local/file.txt\" executed List of items executed \"https://github.org/;/path/to_local/code.py\" activityName Name of activity in provenance \"Ran normalization\" activityDescription Text description on what was done \"Ran algorithm xyx with parameters...\"See:
Any columns that are not in the reserved names described above will be interpreted as annotations of the file
For example this is adding 2 annotations to each row:
path parent annot1 annot2 /path/file1.txt syn1243 \"bar\" 3.1415 /path/file2.txt syn12433 \"baz\" 2.71 /path/file3.txt syn12455 \"zzz\" 3.52See:
In Synapse, entities have both properties and annotations. Properties are used by the system, whereas annotations are completely user defined. In the Python client, we try to present this situation as a normal object, with one set of properties.
Printing an entity will show the division between properties and annotations.
print(entity)\n
Under the covers, an Entity object has two dictionaries, one for properties and one for annotations. These two namespaces are distinct, so there is a possibility of collisions. It is recommended to avoid defining annotations with names that collide with properties, but this is not enforced.
## don't do this!\nentity.properties['description'] = 'One thing'\nentity.annotations['description'] = 'A different thing'\n
In case of conflict, properties will take precedence.
print(entity.description)\n#> One thing\n
Some additional ambiguity is entailed in the use of dot notation. Entity objects have their own internal properties which are not persisted to Synapse. As in all Python objects, these properties are held in object.dict. For example, this dictionary holds the keys 'properties' and 'annotations' whose values are both dictionaries themselves.
The rule, for either getting or setting is: first look in the object then look in properties, then look in annotations. If the key is not found in any of these three, a get results in a KeyError
and a set results in a new annotation being created. Thus, the following results in a new annotation that will be persisted in Synapse:
entity.foo = 'bar'\n
To create an object member variable, which will not be persisted in Synapse, this unfortunate notation is required:
entity.__dict__['foo'] = 'bar'\n
As mentioned previously, name collisions are entirely possible. Keys in the three namespaces can be referred to unambiguously like so:
entity.__dict__['key']\n\nentity.properties.key\nentity.properties['key']\n\nentity.annotations.key\nentity.annotations['key']\n
Most of the time, users should be able to ignore these distinctions and treat Entities like normal Python objects. End users should never need to manipulate items in dict.
See also:
These methods enable access to the Synapse REST(ish) API taking care of details like endpoints and authentication. See the REST API documentation.
See:
Synapse can use a variety of storage mechanisms to store content, however the most common storage solution is AWS S3. This article illustrates some special features that can be used with S3 storage and how they interact with the Python client. In particular it covers:
Synapse projects or folders can be configured to use custom implementations for their underlying data storage. More information on this feature can be found here. The most common implementation of this is to configure a folder to store data in a user controlled AWS S3 bucket rather than Synapse's default internal S3 storage.
"},{"location":"guides/data_storage/#creating-a-new-folder-backed-by-a-user-specified-s3-bucket","title":"Creating a new folder backed by a user specified S3 bucket","text":"The following illustrates creating a new folder backed by a user specified S3 bucket. Note: An existing folder also works.
If you are changing the storage location of an existing folder to a user specified S3 bucket none of the files will be migrated. In order to migrate the files to the new storage location see the section Migrating programmatically. When you change the storage location for a folder only NEW files uploaded to the folder are uploaded to the user specific S3 bucket.
Ensure that the bucket is properly configured.
Create a folder and configure it to use external S3 storage:
# create a new folder to use with external S3 storage\nfolder = syn.store(Folder(name=folder_name, parent=parent))\n# You may also use an existing folder like:\n# folder = syn.get(\"syn123\")\nfolder, storage_location, project_setting = syn.create_s3_storage_location(\n folder=folder,\n bucket_name='my-external-synapse-bucket',\n base_key='path/within/bucket',\n )\n\n# if needed the unique storage location identifier can be obtained e.g.\nstorage_location_id = storage_location['storageLocationId']\n
"},{"location":"guides/data_storage/#creating-a-new-project-backed-by-a-user-specified-s3-bucket","title":"Creating a new project backed by a user specified S3 bucket","text":"The following illustrates creating a new project backed by a user specified S3 bucket. Note: An existing project also works.
If you are changing the storage location of an existing project to a user specified S3 bucket none of the files will be migrated. In order to migrate the files to the new storage location see the documentation further down in this article labeled 'Migrating programmatically'. When you change the storage location for a project only NEW files uploaded to the project are uploaded to the user specific S3 bucket.
Ensure that the bucket is properly configured.
Create a project and configure it to use external S3 storage:
# create a new, or retrieve an existing project to use with external S3 storage\nproject = syn.store(Project(name=\"my_project_name\"))\nproject_storage, storage_location, project_setting = syn.create_s3_storage_location(\n # Despite the KW argument name, this can be a project or folder\n folder=project,\n bucket_name='my-external-synapse-bucket',\n base_key='path/within/bucket',\n)\n\n# if needed the unique storage location identifier can be obtained e.g.\nstorage_location_id = storage_location['storageLocationId']\n
Once an external S3 storage folder exists, you can interact with it as you would any other folder using Synapse tools. If you wish to add an object that is stored within the bucket to Synapse you can do that by adding a file handle for that object using the Python client and then storing the file to that handle.
parent_synapse_folder_id = 'syn123'\nlocal_file_path = '/path/to/local/file'\nbucket = 'my-external-synapse-bucket'\ns3_key = 'path/within/bucket/file'\n\n# in this example we use boto to create a file independently of Synapse\ns3_client = boto3.client('s3')\ns3_client.upload_file(\n Filename=local_file_path,\n Bucket=bucket,\n Key=s3_key\n)\n\n# now we add a file handle for that file and store the file to that handle\nfile_handle = syn.create_external_s3_file_handle(\n bucket,\n s3_key,\n local_file_path,\n parent=parent_synapse_folder_id,\n)\nfile = File(parentId=folder['id'], dataFileHandleId=file_handle['id'])\nfile_entity = syn.store(file)\n
"},{"location":"guides/data_storage/#storage-location-migration","title":"Storage location migration","text":"There are circumstances where it can be useful to move the files underlying Synapse entities from one storage location to another without impacting the structure or identifiers of the Synapse entities themselves. An example scenario is needing to use STS features with an existing Synapse Project that was not initially configured with an STS enabled custom storage location.
The Synapse client has utilities for migrating entities to a new storage location without having to download the content locally and re-uploading it which can be slow, and may alter the meta data associated with the entities in undesirable ways.
During the migration it is recommended that uploads and downloads are blocked to prevent possible conflicts or race conditions. This can be done by setting permissions to Can view
for the project or folder being migrated. After the migration is complete set the permissions back to their original values.
Expected time to migrate data is around 13 minutes per 100Gb as of 11/21/2023.
"},{"location":"guides/data_storage/#migrating-programmatically","title":"Migrating programmatically","text":"Migrating a Synapse project or folder programmatically is a two step process.
First ensure that you know the id of the storage location you want to migrate to. More info on storage locations can be found above and here.
Once the storage location is known, the first step to migrate the project or folder is to create a migratable index of its contents using the index_files_for_migration function, e.g.
When specifying the .db
file for the migratable indexes you need to specify a .db
file that does not already exist for another synapse project or folder on disk. It is the best practice to specify a unique name for the file by including the synapse id in the name of the file, or other unique identifier.
import synapseutils\n\nentity_id = 'syn123' # a Synapse entity whose contents need to be migrated, e.g. a Project or Folder\ndest_storage_location_id = '12345' # the id of the destination storage location being migrated to\n\n# a path on disk where this utility can create a sqlite database to store its index.\n# nothing needs to exist at this path, but it must be a valid path on a volume with sufficient\n# disk space to store a meta data listing of all the contents in the indexed entity.\n# a rough rule of thumb is 100kB per 1000 entities indexed.\ndb_path = '/tmp/foo/syn123_bar.db'\n\nresult = synapseutils.index_files_for_migration(\n syn,\n entity_id,\n dest_storage_location_id,\n db_path,\n\n # optional args, see function documentation linked above for a description of these parameters\n source_storage_location_ids=['54321', '98765'],\n file_version_strategy='new',\n include_table_files=False,\n continue_on_error=True\n)\n
If called on a container (e.g. a Project or Folder) the index_files_for_migration function will recursively index all of the children of that container (including its subfolders). Once the entity has been indexed you can optionally programmatically inspect the the contents of the index or output its contents to a csv file in order to manually inspect it using the available methods on the returned result object.
The next step to trigger the migration from the indexed files is using the migrate_indexed_files function, e.g.
result = synapseutils.migrate_indexed_files(\n syn,\n db_path,\n\n # optional args, see function documentation linked above for a description of these parameters\n create_table_snapshots=True,\n continue_on_error=False,\n force=True\n)\n
The result can be again be inspected as above to see the results of the migration.
Note that above the force parameter is necessary if running from a non-interactive shell. Proceeding with a migration requires confirmation in the form of user prompt. If running programmatically this parameter instead confirms your intention to proceed with the migration.
"},{"location":"guides/data_storage/#putting-all-the-migration-pieces-together","title":"Putting all the migration pieces together","text":"import os\nimport synapseutils\nimport synapseclient\n\nmy_synapse_project_or_folder_to_migrate = \"syn123\"\n\nexternal_bucket_name = \"my-external-synapse-bucket\"\nexternal_bucket_base_key = \"path/within/bucket/\"\n\nmy_user_id = \"1234\"\n\n# a path on disk where this utility can create a sqlite database to store its index.\n# nothing needs to exist at this path, but it must be a valid path on a volume with sufficient\n# disk space to store a meta data listing of all the contents in the indexed entity.\n# a rough rule of thumb is 100kB per 1000 entities indexed.\ndb_path = os.path.expanduser(\n f\"~/synapseMigration/{my_synapse_project_or_folder_to_migrate}_my.db\"\n)\n\nsyn = synapseclient.Synapse()\n\n# Log-in with ~.synapseConfig `authToken`\nsyn.login()\n\n# The project or folder I want to migrate everything to this S3 storage location\nproject_or_folder = syn.get(my_synapse_project_or_folder_to_migrate)\n\nproject_or_folder, storage_location, project_setting = syn.create_s3_storage_location(\n # Despite the KW argument name, this can be a project or folder\n folder=project_or_folder,\n bucket_name=external_bucket_name,\n base_key=external_bucket_base_key,\n)\n\n# The id of the destination storage location being migrated to\nstorage_location_id = storage_location[\"storageLocationId\"]\nprint(\n f\"Indexing: {project_or_folder.id} for migration to storage_id: {storage_location_id} at: {db_path}\"\n)\n\ntry:\n result = synapseutils.index_files_for_migration(\n syn,\n project_or_folder.id,\n storage_location_id,\n db_path,\n file_version_strategy=\"all\",\n )\n\n print(f\"Indexing result: {result.get_counts_by_status()}\")\n\n print(\"Migrating files...\")\n\n result = synapseutils.migrate_indexed_files(\n syn,\n db_path,\n force=True,\n )\n\n print(f\"Migration result: {result.get_counts_by_status()}\")\n syn.sendMessage(\n userIds=[my_user_id],\n messageSubject=f\"Migration success for {project_or_folder.id}\",\n messageBody=f\"Migration result: {result.get_counts_by_status()}\",\n )\nexcept Exception as e:\n syn.sendMessage(\n userIds=[my_user_id],\n messageSubject=f\"Migration failed for {project_or_folder.id}\",\n messageBody=f\"Migration failed with error: {e}\",\n )\n
The result of running this should look like
Indexing: syn123 for migration to storage_id: 11111 at: /home/user/synapseMigration/syn123_my.db\nIndexing result: {'INDEXED': 100, 'MIGRATED': 0, 'ALREADY_MIGRATED': 0, 'ERRORED': 0}\nMigrating files...\nMigration result: {'INDEXED': 0, 'MIGRATED': 100, 'ALREADY_MIGRATED': 0, 'ERRORED': 0}\n
"},{"location":"guides/data_storage/#migrating-from-the-command-line","title":"Migrating from the command line","text":"Synapse entities can also be migrated from the command line. The options are similar to above. Whereas migrating programatically involves two separate function calls, from the command line there is a single migrate
command with the dryRun argument providing the option to generate the index only without proceeding onto the migration.
Note that as above, confirmation is required before a migration starts. As above, this must either be in the form of confirming via a prompt if running the command from an interactive shell, or using the force command.
The optional csv_log_path argument will output the results to a csv file for record keeping, and is recommended.
synapse migrate syn123 54321 /tmp/migrate.db --csv_log_path /tmp/migrate.csv\n
Sample output:
Indexing Project syn123\nIndexing file entity syn888\nIndexing file entity syn999\nIndexed 2 items, 2 needing migration, 0 already stored in destination storage location (54321). Encountered 0 errors.\n21 items for migration to 54321. Proceed? (y/n)? y\nCreating new version for file entity syn888\nCreating new version for file entity syn999\nCompleted migration of syn123. 2 files migrated. 0 errors encountered\nWriting csv log to /tmp/migrate.csv\n
.. _sts_storage_locations:"},{"location":"guides/data_storage/#sts-storage-locations","title":"STS Storage Locations","text":"Create an STS enabled folder to use AWS Security Token Service credentials with S3 storage locations. These credentials can be scoped to access individual Synapse files or folders and can be used with external S3 tools such as the awscli and the boto3 library separately from Synapse to read and write files to and from Synapse storage. At this time read and write capabilities are supported for external storage locations, while default Synapse storage is limited to read only. Please read the linked documentation for a complete understanding of the capabilities and restrictions of STS enabled folders.
"},{"location":"guides/data_storage/#creating-an-sts-enabled-folder","title":"Creating an STS enabled folder","text":"Creating an STS enabled folder is similar to creating an external storage folder as described above, but this time passing an additional sts_enabled=True keyword parameter. The bucket_name and base_key parameters apply to external storage locations and can be omitted to use Synapse internal storage. Note also that STS can only be enabled on an empty folder.
# create a new folder to use with STS and external S3 storage\nfolder = syn.store(Folder(name=folder_name, parent=parent))\nfolder, storage_location, project_setting = syn.create_s3_storage_location(\n folder=folder,\n bucket_name='my-external-synapse-bucket',\n base_key='path/within/bucket',\n sts_enabled=True,\n)\n
"},{"location":"guides/data_storage/#using-credentials-with-the-awscli","title":"Using credentials with the awscli","text":"This example illustrates obtaining STS credentials and using them with the awscli command line tool. The first command outputs the credentials as shell commands to execute which will then be picked up by subsequent aws cli commands. Note that the bucket-owner-full-control ACL is required when putting an object via STS credentials. This ensures that the object ownership will be transferred to the owner of the AWS bucket.
$ synapse get-sts-token -o shell syn123 read_write\n\nexport SYNAPSE_STS_S3_LOCATION=\"s3://my-external-synapse-bucket/path/within/bucket\"\nexport AWS_ACCESS_KEY_ID=\"<access_key_id>\"\nexport AWS_SECRET_ACCESS_KEY=\"<secret_access_key>\"\nexport AWS_SESSION_TOKEN=\"<session_token>\n\n# if the above are executed in the shell, the awscli will automatically apply them\n\n# e.g. copy a file directly to the bucket using the exported credentials\n$ aws s3 cp /path/to/local/file $SYNAPSE_STS_S3_LOCATION --acl bucket-owner-full-control\n
"},{"location":"guides/data_storage/#using-credentials-with-boto3-in-python","title":"Using credentials with boto3 in python","text":"This example illustrates retrieving STS credentials and using them with boto3 within python code, in this case to upload a file. Note that the bucket-owner-full-control ACL is required when putting an object via STS credentials. This ensures that the object ownership will be transferred to the owner of the AWS bucket.
# the boto output_format is compatible with the boto3 session api.\ncredentials = syn.get_sts_storage_token('syn123', 'read_write', output_format='boto')\n\ns3_client = boto3.client('s3', **credentials)\ns3_client.upload_file(\n Filename='/path/to/local/file,\n Bucket='my-external-synapse-bucket',\n Key='path/within/bucket/file',\n ExtraArgs={'ACL': 'bucket-owner-full-control'},\n)\n
"},{"location":"guides/data_storage/#automatic-transfers-tofrom-sts-storage-locations-using-boto3-with-synapseclient","title":"Automatic transfers to/from STS storage locations using boto3 with synapseclient","text":"The Python Synapse client can be configured to automatically use STS tokens to perform uploads and downloads to enabled storage locations using an installed boto3 library rather than through the traditional Synapse client APIs. This can improve performance in certain situations, particularly uploads of large files, as the data transfer itself can be conducted purely against the AWS S3 APIs, only invoking the Synapse APIs to retrieve the necessary token and to update Synapse metadata in the case of an upload. Once configured to do so, retrieval of STS tokens for supported operations occurs automatically without any change in synapseclient usage.
To enable STS/boto3 transfers on all get
and store
operations, do the following:
pip install boto3\n
# add to .synapseConfig to automatically apply as default for all synapse client instances\n[transfer]\nuse_boto_sts=true\n\n# alternatively set on a per instance basis within python code\nsyn.use_boto_sts_transfers = True\n
Note that if boto3 is not installed, then these settings will have no effect.
"},{"location":"guides/data_storage/#sftp","title":"SFTP","text":""},{"location":"guides/data_storage/#installation","title":"Installation","text":"Installing the extra libraries that the Python client uses to communicate with SFTP servers may add a few steps to the installation process.
The required libraries are:
Building these libraries on Unix OS's is straightforward, but you need the Python development headers and libraries. For example, in Debian or Ubuntu distributions:
sudo apt-get install python-dev\n
Once this requirement is met, sudo pip install synapseclient
should be able to build pycrypto.
Binary distributions of pycrypto built for Windows is available from Michael Foord at Voidspace. Install this before installing the Python client.
After running the pycrypto installer, sudo pip install synapseclient
should work.
Another option is to build your own binary with either the free developer tools from Microsoft or the MinGW compiler.
"},{"location":"guides/data_storage/#configure-your-client","title":"Configure your client","text":"Make sure you configure your ~/.synapseConfig file to connect to your SFTP server.
"},{"location":"guides/validate_annotations/","title":"Validate Annotations","text":"Warning: This is a beta implementation and is subject to change. Use at your own risk.
Validate annotations on your Synapse entities by leveraging the JSON schema services. Here are the steps you must take to set up the JSON Schema service.
"},{"location":"guides/validate_annotations/#create-a-json-schema-organization","title":"Create a JSON Schema organization","text":"Set up Synapse client and JSON Schema service:
import synapseclient\nsyn = synapseclient.login()\nsyn.get_available_services() # Output: ['json_schema']\njs = syn.service(\"json_schema\")\n
Create, manage, and delete a JSON Schema organization:
my_org_name = <your org name here>\nmy_org = js.JsonSchemaOrganization(my_org_name)\nmy_org # Output: JsonSchemaOrganization(name=my_org_name)\nmy_org.create()\nmy_org.get_acl()\nmy_org.set_acl([syn.getUserProfile().ownerId])\n# my_org.update_acl([syn.getUserProfile().ownerId])\nmy_org.delete()\n
Retrieve existing organization and associated JSON schemas:
orgs = js.list_organizations()\nsage_org = js.JsonSchemaOrganization(\"sage.annotations\")\nschemas = sage_org.list_json_schemas()\nschema1 = next(schemas)\nschema2 = sage_org.get_json_schema(schema1.name)\nassert schema1 is schema2 # True\nschema1 # Output: JsonSchema(org='sage.annotations', name='analysis.alignmentMethod')\n
Manage a specific version of a JSON schema:
versions = schema1.list_versions()\nversion1 = next(versions)\nraw_body = version1.body\nfull_body = version1.expand()\nversion1\n# Output: JsonSchemaVersion(org='sage.annotations', name='analysis.alignmentMethod', version='0.0.1')\n
Create a new JSON schema version for an existing organization:
from random import randint\nrint = randint(0, 100000)\nschema_name = \"my.schema\"\n\n# Method 1\nmy_org = js.JsonSchemaOrganization(my_org_name)\nnew_version1 = my_org.create_json_schema(raw_body, schema_name, f\"0.{rint}.1\")\n\n# Method 2\nmy_schema = js.JsonSchema(my_org, schema_name)\nnew_version2 = my_schema.create(raw_body, f\"0.{rint}.2\")\n\n# Method 3\nmy_version = js.JsonSchemaVersion(my_org, schema_name, f\"0.{rint}.3\")\nnew_version3 = my_version.create(raw_body)\n
Test validation on a Synapse entity:
from time import sleep\nsynapse_id = \"syn25922647\"\njs.bind_json_schema(new_version1.uri, synapse_id)\njs.get_json_schema(synapse_id)\nsleep(3)\njs.validate(synapse_id)\njs.validate_children(synapse_id)\njs.validation_stats(synapse_id)\njs.unbind_json_schema(synapse_id)\n
Access to low-level API functions:
js.create_organization(organization_name)\njs.get_organization(organization_name)\njs.list_organizations()\njs.delete_organization(organization_id)\njs.get_organization_acl(organization_id)\njs.update_organization_acl(organization_id, resource_access, etag)\njs.list_json_schemas(organization_name)\njs.list_json_schema_versions(organization_name, json_schema_name)\njs.create_json_schema(json_schema_body, dry_run)\njs.get_json_schema_body(json_schema_uri)\njs.delete_json_schema(json_schema_uri)\njs.json_schema_validation(json_schema_uri)\njs.bind_json_schema_to_entity(synapse_id, json_schema_uri)\njs.get_json_schema_from_entity(synapse_id)\njs.delete_json_schema_from_entity(synapse_id)\njs.validate_entity_with_json_schema(synapse_id)\njs.get_json_schema_validation_statistics(synapse_id)\njs.get_invalid_json_schema_validation(synapse_id)\n
"},{"location":"guides/views/","title":"Views","text":"A view is a view of all entities (File, Folder, Project, Table, Docker Repository, View) within one or more Projects or Folders. Views can:
Let's go over some examples to demonstrate how view works. First, create a new project and add some files:
import synapseclient\nfrom synapseclient import Project, File, Column, Table, EntityViewSchema, EntityViewType\nsyn = synapseclient.Synapse()\nsyn.login()\n\n# Create a new project\nproject = syn.store(Project(\"test view\"))\n\n# Create some files\nfile1 = syn.store(File(path=\"path/to/file1.txt\", parent=project))\nfile2 = syn.store(File(path=\"path/to/file2.txt\", parent=project))\n\n# add some annotations\nsyn.setAnnotations(file1, {\"contributor\":\"Sage\", \"class\":\"V\"})\nsyn.setAnnotations(file2, {\"contributor\":\"UW\", \"rank\":\"X\"})\n
"},{"location":"guides/views/#creating-a-view","title":"Creating a View","text":"To create a view, defines its name, columns, parent, scope, and the type of the view:
view = EntityViewSchema(name=\"my first file view\",\n columns=[\n Column(name=\"contributor\", columnType=\"STRING\"),\n Column(name=\"class\", columnType=\"STRING\"),\n Column(name=\"rank\", columnType=\"STRING\")),\n parent=project['id'],\n scopes=project['id'],\n includeEntityTypes=[EntityViewType.FILE, EntityViewType.FOLDER],\n addDefaultViewColumns=True)\nview = syn.store(view)\n
We support the following entity type in a View:
* EntityViewType.FILE\n* EntityViewType.PROJECT\n* EntityViewType.TABLE\n* EntityViewType.FOLDER\n* EntityViewType.VIEW\n* EntityViewType.DOCKER\n
To see the content of your newly created View, use syn.tableQuery():
query_results = syn.tableQuery(\"select * from %s\" % view['id'])\ndata = query_results.asDataFrame()\n
"},{"location":"guides/views/#updating-annotations-using-view","title":"Updating Annotations using View","text":"To update class
annotation for file2
, simply update the view:
# Retrieve the view data using table query\nquery_results = syn.tableQuery(\"select * from %s\" % view['id'])\ndata = query_results.asDataFrame()\n\n# Modify the annotations by modifying the view data and store it\ndata[\"class\"] = [\"V\", \"VI\"]\nsyn.store(Table(view['id'], data))\n
The change in annotations reflect in synGetAnnotations():
syn.getAnnotations(file2['id'])\n
A View is a Table. Please visit Tables to see how to change schema, update content, and other operations that can be done on View.
"},{"location":"reference/activity/","title":"Activity/Provenance","text":""},{"location":"reference/activity/#synapseclient.activity","title":"synapseclient.activity
","text":"Provenance
The Activity object represents the source of a data set or the data processing steps used to produce it. Using W3C provenance ontology terms, a result is generated by a combination of data and code which are either used or executed.
"},{"location":"reference/activity/#synapseclient.activity--imports","title":"Imports","text":"from synapseclient import Activity\n
"},{"location":"reference/activity/#synapseclient.activity--creating-an-activity-object","title":"Creating an activity object","text":"act = Activity(name='clustering',\n description='whizzy clustering',\n used=['syn1234','syn1235'],\n executed='syn4567')\n
Here, syn1234 and syn1235 might be two types of measurements on a common set of samples. Some whizzy clustering code might be referred to by syn4567. The used and executed can reference entities in Synapse or URLs.
Alternatively, you can build an activity up piecemeal::
act = Activity(name='clustering', description='whizzy clustering')\nact.used(['syn12345', 'syn12346'])\nact.executed(\n 'https://raw.githubusercontent.com/Sage-Bionetworks/synapsePythonClient/develop/tests/unit/unit_test_client.py')\n
"},{"location":"reference/activity/#synapseclient.activity--storing-entities-with-provenance","title":"Storing entities with provenance","text":"The activity can be passed in when storing an Entity to set the Entity's provenance::
clustered_samples = syn.store(clustered_samples, activity=act)\n
We've now recorded that clustered_samples
is the output of our whizzy clustering algorithm applied to the data stored in syn1234 and syn1235.
The synapseclient.Synapse.store has shortcuts for specifying the used and executed lists directly. For example, when storing a data entity, it's a good idea to record its source::
excellent_data = syn.store(excellent_data,\n activityName='data-r-us'\n activityDescription='downloaded from data-r-us',\n used='http://data-r-us.com/excellent/data.xyz')\n
"},{"location":"reference/activity/#synapseclient.activity-classes","title":"Classes","text":""},{"location":"reference/activity/#synapseclient.activity.Activity","title":"Activity
","text":" Bases: dict
Represents the provenance of a Synapse Entity.
PARAMETER DESCRIPTIONname
Name of the Activity
DEFAULT: None
description
A short text description of the Activity
DEFAULT: None
used
Either a list of:
DEFAULT: None
executed
A code resource that was executed to generate the Entity.
DEFAULT: None
data
A dictionary representation of an Activity, with fields 'name', 'description' and 'used' (a list of reference objects)
DEFAULT: {}
See also: The W3C's provenance ontology
Source code insynapseclient/activity.py
class Activity(dict):\n \"\"\"\n Represents the provenance of a Synapse Entity.\n\n Parameters:\n name: Name of the Activity\n description: A short text description of the Activity\n used: Either a list of:\n\n - [reference objects](https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/Reference.html) (e.g. [{'targetId':'syn123456', 'targetVersionNumber':1}])\n - a list of Synapse Entities or Entity IDs\n - a list of URL's\n executed: A code resource that was executed to generate the Entity.\n data: A dictionary representation of an Activity, with fields 'name', 'description' and 'used' (a list of reference objects)\n\n See also: The [W3C's provenance ontology](http://www.w3.org/TR/prov-o/)\n \"\"\"\n\n # TODO: make constructors from JSON consistent across objects\n def __init__(self, name=None, description=None, used=None, executed=None, data={}):\n super(Activity, self).__init__(data)\n if \"used\" not in self:\n self[\"used\"] = []\n\n if name is not None:\n self[\"name\"] = name\n if description is not None:\n self[\"description\"] = description\n if used is not None:\n self.used(used)\n if executed is not None:\n self.executed(executed)\n\n def used(\n self, target=None, targetVersion=None, wasExecuted=None, url=None, name=None\n ):\n \"\"\"\n Add a resource used by the activity.\n\n This method tries to be as permissive as possible. It accepts a string which might be a synapse ID or a URL,\n a synapse entity, a UsedEntity or UsedURL dictionary or a list containing any combination of these.\n\n In addition, named parameters can be used to specify the fields of either a UsedEntity or a UsedURL.\n If target and optionally targetVersion are specified, create a UsedEntity.\n If url and optionally name are specified, create a UsedURL.\n\n It is an error to specify both target/targetVersion parameters and url/name parameters in the same call.\n To add multiple UsedEntities and UsedURLs, make a separate call for each or pass in a list.\n\n In case of conflicting settings for wasExecuted both inside an object and with a parameter, the parameter wins.\n For example, this UsedURL will have wasExecuted set to False::\n\n activity.used({'url':'http://google.com', 'name':'Goog', 'wasExecuted':True}, wasExecuted=False)\n\n Entity examples::\n\n activity.used('syn12345')\n activity.used(entity)\n activity.used(target=entity, targetVersion=2)\n activity.used(codeEntity, wasExecuted=True)\n activity.used({'reference':{'target':'syn12345', 'targetVersion':1}, 'wasExecuted':False})\n\n URL examples::\n\n activity.used('http://mydomain.com/my/awesome/data.RData')\n activity.used(url='http://mydomain.com/my/awesome/data.RData', name='Awesome Data')\n activity.used(url='https://github.com/joe_hacker/code_repo', name='Gnarly hacks', wasExecuted=True)\n activity.used({'url':'https://github.com/joe_hacker/code_repo', 'name':'Gnarly hacks'}, wasExecuted=True)\n\n List example::\n\n activity.used(['syn12345', 'syn23456', entity, \\\n {'reference':{'target':'syn100009', 'targetVersion':2}, 'wasExecuted':True}, \\\n 'http://mydomain.com/my/awesome/data.RData'])\n \"\"\"\n # -- A list of targets\n if isinstance(target, list):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"list of used resources\")\n\n for item in target:\n self.used(item, wasExecuted=wasExecuted)\n return\n\n # -- UsedEntity\n elif is_used_entity(target):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n 
_raise_incorrect_used_usage(\n badargs, \"dictionary representing a used resource\"\n )\n\n resource = target\n if \"concreteType\" not in resource:\n resource[\n \"concreteType\"\n ] = \"org.sagebionetworks.repo.model.provenance.UsedEntity\"\n\n # -- Used URL\n elif is_used_url(target):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"URL\")\n\n resource = target\n if \"concreteType\" not in resource:\n resource[\n \"concreteType\"\n ] = \"org.sagebionetworks.repo.model.provenance.UsedURL\"\n\n # -- Synapse Entity\n elif is_synapse_entity(target):\n badargs = _get_any_bad_args([\"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"Synapse entity\")\n\n reference = {\"targetId\": target[\"id\"]}\n if \"versionNumber\" in target:\n reference[\"targetVersionNumber\"] = target[\"versionNumber\"]\n if targetVersion:\n reference[\"targetVersionNumber\"] = int(targetVersion)\n resource = {\n \"reference\": reference,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedEntity\",\n }\n # -- URL parameter\n elif url:\n badargs = _get_any_bad_args([\"target\", \"targetVersion\"], locals())\n _raise_incorrect_used_usage(badargs, \"URL\")\n\n resource = {\n \"url\": url,\n \"name\": name if name else target,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedURL\",\n }\n\n # -- URL as a string\n elif is_url(target):\n badargs = _get_any_bad_args([\"targetVersion\"], locals())\n _raise_incorrect_used_usage(badargs, \"URL\")\n resource = {\n \"url\": target,\n \"name\": name if name else target,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedURL\",\n }\n\n # -- Synapse Entity ID (assuming the string is an ID)\n elif isinstance(target, str):\n badargs = _get_any_bad_args([\"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"Synapse entity\")\n vals = target.split(\".\") # Handle synapseIds of from syn234.4\n if not is_synapse_id_str(vals[0]):\n raise ValueError(\"%s is not a valid Synapse id\" % target)\n if len(vals) == 2:\n if targetVersion and int(targetVersion) != int(vals[1]):\n raise ValueError(\n \"Two conflicting versions for %s were specified\" % target\n )\n targetVersion = int(vals[1])\n reference = {\"targetId\": vals[0]}\n if targetVersion:\n reference[\"targetVersionNumber\"] = int(targetVersion)\n resource = {\n \"reference\": reference,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedEntity\",\n }\n else:\n raise SynapseError(\"Unexpected parameters in call to Activity.used().\")\n\n # Set wasExecuted\n if wasExecuted is None:\n # Default to False\n if \"wasExecuted\" not in resource:\n resource[\"wasExecuted\"] = False\n else:\n # wasExecuted parameter overrides setting in an object\n resource[\"wasExecuted\"] = wasExecuted\n\n # Add the used resource to the activity\n self[\"used\"].append(resource)\n\n def executed(self, target=None, targetVersion=None, url=None, name=None):\n \"\"\"\n Add a code resource that was executed during the activity.\n See :py:func:`synapseclient.activity.Activity.used`\n \"\"\"\n self.used(\n target=target,\n targetVersion=targetVersion,\n url=url,\n name=name,\n wasExecuted=True,\n )\n\n def _getStringList(self, wasExecuted=True):\n usedList = []\n for source in [\n source\n for source in self[\"used\"]\n if source.get(\"wasExecuted\", False) == wasExecuted\n ]:\n if source[\"concreteType\"].endswith(\"UsedURL\"):\n if source.get(\"name\"):\n 
usedList.append(source.get(\"name\"))\n else:\n usedList.append(source.get(\"url\"))\n else: # It is an entity for now\n tmpstr = source[\"reference\"][\"targetId\"]\n if \"targetVersionNumber\" in source[\"reference\"]:\n tmpstr += \".%i\" % source[\"reference\"][\"targetVersionNumber\"]\n usedList.append(tmpstr)\n return usedList\n\n def _getExecutedStringList(self):\n return self._getStringList(wasExecuted=True)\n\n def _getUsedStringList(self):\n return self._getStringList(wasExecuted=False)\n\n def __str__(self):\n str = \"%s\\n Executed:\\n\" % self.get(\"name\", \"\")\n str += \"\\n\".join(self._getExecutedStringList())\n str += \" Used:\\n\"\n str += \"\\n\".join(self._getUsedStringList())\n return str\n
"},{"location":"reference/activity/#synapseclient.activity.Activity-functions","title":"Functions","text":""},{"location":"reference/activity/#synapseclient.activity.Activity.used","title":"used(target=None, targetVersion=None, wasExecuted=None, url=None, name=None)
","text":"Add a resource used by the activity.
This method tries to be as permissive as possible. It accepts a string that might be a Synapse ID or a URL, a Synapse entity, a UsedEntity or UsedURL dictionary, or a list containing any combination of these.
In addition, named parameters can be used to specify the fields of either a UsedEntity or a UsedURL. If target (and optionally targetVersion) is specified, a UsedEntity is created. If url (and optionally name) is specified, a UsedURL is created.
It is an error to specify both target/targetVersion parameters and url/name parameters in the same call. To add multiple UsedEntities and UsedURLs, make a separate call for each or pass in a list.
In case of conflicting settings for wasExecuted, both inside an object and via the parameter, the parameter wins. For example, this UsedURL will have wasExecuted set to False:
activity.used({'url':'http://google.com', 'name':'Goog', 'wasExecuted':True}, wasExecuted=False)\n
Entity examples:
activity.used('syn12345')\nactivity.used(entity)\nactivity.used(target=entity, targetVersion=2)\nactivity.used(codeEntity, wasExecuted=True)\nactivity.used({'reference':{'target':'syn12345', 'targetVersion':1}, 'wasExecuted':False})\n
URL examples:
activity.used('http://mydomain.com/my/awesome/data.RData')\nactivity.used(url='http://mydomain.com/my/awesome/data.RData', name='Awesome Data')\nactivity.used(url='https://github.com/joe_hacker/code_repo', name='Gnarly hacks', wasExecuted=True)\nactivity.used({'url':'https://github.com/joe_hacker/code_repo', 'name':'Gnarly hacks'}, wasExecuted=True)\n
List example:
activity.used(['syn12345', 'syn23456', entity, {'reference':{'target':'syn100009', 'targetVersion':2}, 'wasExecuted':True}, 'http://mydomain.com/my/awesome/data.RData'])\n
Source code in synapseclient/activity.py
def used(\n self, target=None, targetVersion=None, wasExecuted=None, url=None, name=None\n):\n \"\"\"\n Add a resource used by the activity.\n\n This method tries to be as permissive as possible. It accepts a string which might be a synapse ID or a URL,\n a synapse entity, a UsedEntity or UsedURL dictionary or a list containing any combination of these.\n\n In addition, named parameters can be used to specify the fields of either a UsedEntity or a UsedURL.\n If target and optionally targetVersion are specified, create a UsedEntity.\n If url and optionally name are specified, create a UsedURL.\n\n It is an error to specify both target/targetVersion parameters and url/name parameters in the same call.\n To add multiple UsedEntities and UsedURLs, make a separate call for each or pass in a list.\n\n In case of conflicting settings for wasExecuted both inside an object and with a parameter, the parameter wins.\n For example, this UsedURL will have wasExecuted set to False::\n\n activity.used({'url':'http://google.com', 'name':'Goog', 'wasExecuted':True}, wasExecuted=False)\n\n Entity examples::\n\n activity.used('syn12345')\n activity.used(entity)\n activity.used(target=entity, targetVersion=2)\n activity.used(codeEntity, wasExecuted=True)\n activity.used({'reference':{'target':'syn12345', 'targetVersion':1}, 'wasExecuted':False})\n\n URL examples::\n\n activity.used('http://mydomain.com/my/awesome/data.RData')\n activity.used(url='http://mydomain.com/my/awesome/data.RData', name='Awesome Data')\n activity.used(url='https://github.com/joe_hacker/code_repo', name='Gnarly hacks', wasExecuted=True)\n activity.used({'url':'https://github.com/joe_hacker/code_repo', 'name':'Gnarly hacks'}, wasExecuted=True)\n\n List example::\n\n activity.used(['syn12345', 'syn23456', entity, \\\n {'reference':{'target':'syn100009', 'targetVersion':2}, 'wasExecuted':True}, \\\n 'http://mydomain.com/my/awesome/data.RData'])\n \"\"\"\n # -- A list of targets\n if isinstance(target, list):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"list of used resources\")\n\n for item in target:\n self.used(item, wasExecuted=wasExecuted)\n return\n\n # -- UsedEntity\n elif is_used_entity(target):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n _raise_incorrect_used_usage(\n badargs, \"dictionary representing a used resource\"\n )\n\n resource = target\n if \"concreteType\" not in resource:\n resource[\n \"concreteType\"\n ] = \"org.sagebionetworks.repo.model.provenance.UsedEntity\"\n\n # -- Used URL\n elif is_used_url(target):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"URL\")\n\n resource = target\n if \"concreteType\" not in resource:\n resource[\n \"concreteType\"\n ] = \"org.sagebionetworks.repo.model.provenance.UsedURL\"\n\n # -- Synapse Entity\n elif is_synapse_entity(target):\n badargs = _get_any_bad_args([\"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"Synapse entity\")\n\n reference = {\"targetId\": target[\"id\"]}\n if \"versionNumber\" in target:\n reference[\"targetVersionNumber\"] = target[\"versionNumber\"]\n if targetVersion:\n reference[\"targetVersionNumber\"] = int(targetVersion)\n resource = {\n \"reference\": reference,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedEntity\",\n }\n # -- URL parameter\n elif url:\n badargs = _get_any_bad_args([\"target\", \"targetVersion\"], locals())\n 
_raise_incorrect_used_usage(badargs, \"URL\")\n\n resource = {\n \"url\": url,\n \"name\": name if name else target,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedURL\",\n }\n\n # -- URL as a string\n elif is_url(target):\n badargs = _get_any_bad_args([\"targetVersion\"], locals())\n _raise_incorrect_used_usage(badargs, \"URL\")\n resource = {\n \"url\": target,\n \"name\": name if name else target,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedURL\",\n }\n\n # -- Synapse Entity ID (assuming the string is an ID)\n elif isinstance(target, str):\n badargs = _get_any_bad_args([\"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"Synapse entity\")\n vals = target.split(\".\") # Handle synapseIds of from syn234.4\n if not is_synapse_id_str(vals[0]):\n raise ValueError(\"%s is not a valid Synapse id\" % target)\n if len(vals) == 2:\n if targetVersion and int(targetVersion) != int(vals[1]):\n raise ValueError(\n \"Two conflicting versions for %s were specified\" % target\n )\n targetVersion = int(vals[1])\n reference = {\"targetId\": vals[0]}\n if targetVersion:\n reference[\"targetVersionNumber\"] = int(targetVersion)\n resource = {\n \"reference\": reference,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedEntity\",\n }\n else:\n raise SynapseError(\"Unexpected parameters in call to Activity.used().\")\n\n # Set wasExecuted\n if wasExecuted is None:\n # Default to False\n if \"wasExecuted\" not in resource:\n resource[\"wasExecuted\"] = False\n else:\n # wasExecuted parameter overrides setting in an object\n resource[\"wasExecuted\"] = wasExecuted\n\n # Add the used resource to the activity\n self[\"used\"].append(resource)\n
"},{"location":"reference/activity/#synapseclient.activity.Activity.executed","title":"executed(target=None, targetVersion=None, url=None, name=None)
","text":"Add a code resource that was executed during the activity. See :py:func:synapseclient.activity.Activity.used
Source code in synapseclient/activity.py
def executed(self, target=None, targetVersion=None, url=None, name=None):\n \"\"\"\n Add a code resource that was executed during the activity.\n See :py:func:`synapseclient.activity.Activity.used`\n \"\"\"\n self.used(\n target=target,\n targetVersion=targetVersion,\n url=url,\n name=name,\n wasExecuted=True,\n )\n
"},{"location":"reference/activity/#synapseclient.activity-functions","title":"Functions","text":""},{"location":"reference/activity/#synapseclient.activity.is_used_entity","title":"is_used_entity(x)
","text":"Returns True if the given object represents a UsedEntity.
Source code in synapseclient/activity.py
def is_used_entity(x):\n \"\"\"Returns True if the given object represents a UsedEntity.\"\"\"\n\n # A UsedEntity must be a dictionary with a 'reference' field, with a 'targetId' field\n if (\n not isinstance(x, collections.abc.Mapping)\n or \"reference\" not in x\n or \"targetId\" not in x[\"reference\"]\n ):\n return False\n\n # Must only have three keys\n if not all(key in (\"reference\", \"wasExecuted\", \"concreteType\") for key in x.keys()):\n return False\n\n # 'reference' field can only have two keys\n if not all(\n key in (\"targetId\", \"targetVersionNumber\") for key in x[\"reference\"].keys()\n ):\n return False\n\n return True\n
"},{"location":"reference/activity/#synapseclient.activity.is_used_url","title":"is_used_url(x)
","text":"Returns True if the given object represents a UsedURL.
Source code in synapseclient/activity.py
def is_used_url(x):\n \"\"\"Returns True if the given object represents a UsedURL.\"\"\"\n\n # A UsedURL must be a dictionary with a 'url' field\n if not isinstance(x, collections.abc.Mapping) or \"url\" not in x:\n return False\n\n # Must only have four keys\n if not all(\n key in (\"url\", \"name\", \"wasExecuted\", \"concreteType\") for key in x.keys()\n ):\n return False\n\n return True\n
"},{"location":"reference/annotations/","title":"Annotations","text":""},{"location":"reference/annotations/#synapseclient.annotations","title":"synapseclient.annotations
","text":""},{"location":"reference/annotations/#synapseclient.annotations--annotations","title":"Annotations","text":"Annotations are arbitrary metadata attached to Synapse entities. They can be accessed like ordinary object properties or like dictionary keys:
entity.my_annotation = 'This is one way to do it'\nentity['other_annotation'] = 'This is another'\n
Annotations can be given in the constructor for Synapse Entities:
entity = File('data.xyz', parent=my_project, rating=9.1234)\n
Annotate the entity with location data:
entity.lat_long = [47.627477, -122.332154]\n
Record when we collected the data. This will use the current timezone of the machine running the code.
from datetime import datetime as Datetime\nentity.collection_date = Datetime.now()\n
Record when we collected the data in UTC:
from datetime import datetime as Datetime\nentity.collection_date = Datetime.utcnow()\n
Data sources are best recorded using Synapse's Activity/Provenance tools.
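For example, a minimal provenance sketch (assuming syn is a logged-in Synapse instance and the IDs are placeholders):

    from synapseclient import Activity
    activity = Activity(name='clustering', used=['syn1234'], executed=['syn4567'])
    entity = syn.store(entity, activity=activity)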
"},{"location":"reference/annotations/#synapseclient.annotations--implementation-details","title":"Implementation details","text":"In Synapse, entities have both properties and annotations. Properties are used by the system, whereas annotations are completely user defined. In the Python client, we try to present this situation as a normal object, with one set of properties.
See also:
Annotations
","text":" Bases: dict
Represent Synapse Entity annotations as a flat dictionary, with the system-assigned properties id and etag exposed as object attributes.
ATTRIBUTE DESCRIPTIONid
Synapse ID of the Entity
etag
Synapse etag of the Entity
values
(Optional) dictionary of values to be copied into annotations
**kwargs
additional key-value pairs to be added as annotations
Creating a few instances
Creating and setting annotations
example1 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', {'foo':'bar'})\nexample2 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', foo='bar')\nexample3 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984')\nexample3['foo'] = 'bar'\n
Source code in synapseclient/annotations.py
class Annotations(dict):\n \"\"\"\n Represent Synapse Entity annotations as a flat dictionary with the system assigned properties id, etag\n as object attributes.\n\n Attributes:\n id: Synapse ID of the Entity\n etag: Synapse etag of the Entity\n values: (Optional) dictionary of values to be copied into annotations\n **kwargs: additional key-value pairs to be added as annotations\n\n Example: Creating a few instances\n Creating and setting annotations\n\n example1 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', {'foo':'bar'})\n example2 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', foo='bar')\n example3 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984')\n example3['foo'] = 'bar'\n \"\"\"\n\n id: str\n etag: str\n\n def __init__(\n self,\n id: typing.Union[str, int, Entity],\n etag: str,\n values: typing.Dict = None,\n **kwargs,\n ):\n \"\"\"\n Create an Annotations object taking key value pairs from a dictionary or from keyword arguments.\n System properties id, etag, creationDate and uri become attributes of the object.\n\n :param id: A Synapse ID, a Synapse Entity object, a plain dictionary in which 'id' maps to a Synapse ID\n :param etag: etag of the Synapse Entity\n :param values: (Optional) dictionary of values to be copied into annotations\n :param \\**kwargs: additional key-value pairs to be added as annotations\n\n Example::\n\n example1 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', {'foo':'bar'})\n example2 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', foo='bar')\n example3 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984')\n example3['foo'] = 'bar'\n\n \"\"\"\n super().__init__()\n\n self.id = id\n self.etag = etag\n\n if values:\n self.update(values)\n if kwargs:\n self.update(kwargs)\n\n @property\n def id(self):\n return self._id\n\n @id.setter\n def id(self, value):\n if value is None:\n raise ValueError(\"id must not be None\")\n self._id = id_of(value)\n\n @property\n def etag(self):\n return self._etag\n\n @etag.setter\n def etag(self, value):\n if value is None:\n raise ValueError(\"etag must not be None\")\n self._etag = str(value)\n
"},{"location":"reference/annotations/#synapseclient.annotations-functions","title":"Functions","text":""},{"location":"reference/annotations/#synapseclient.annotations.is_synapse_annotations","title":"is_synapse_annotations(annotations)
","text":"Tests if the given object is a Synapse-style Annotations object.
PARAMETER DESCRIPTIONannotations
A key-value mapping that may or may not be a Synapse-style Annotations object.
TYPE: Mapping
bool
True if the given object is a Synapse-style Annotations object, False otherwise.
Source code insynapseclient/annotations.py
def is_synapse_annotations(annotations: typing.Mapping) -> bool:\n \"\"\"Tests if the given object is a Synapse-style Annotations object.\n\n Arguments:\n annotations: A key-value mapping that may or may not be a Synapse-style Annotations object.\n\n Returns:\n True if the given object is a Synapse-style Annotations object, False otherwise.\n \"\"\"\n if not isinstance(annotations, collections.abc.Mapping):\n return False\n return annotations.keys() >= {\"id\", \"etag\", \"annotations\"}\n
"},{"location":"reference/annotations/#synapseclient.annotations.is_submission_status_annotations","title":"is_submission_status_annotations(annotations)
","text":"Tests if the given dictionary is in the form of annotations to submission status
PARAMETER DESCRIPTIONannotations
A key-value mapping that may or may not be a submission status annotations object.
TYPE: Mapping
bool
True if the given object is a submission status annotations object, False otherwise.
Source code insynapseclient/annotations.py
def is_submission_status_annotations(annotations: collections.abc.Mapping) -> bool:\n \"\"\"Tests if the given dictionary is in the form of annotations to submission status\n\n Arguments:\n annotations: A key-value mapping that may or may not be a submission status annotations object.\n\n Returns:\n True if the given object is a submission status annotations object, False otherwise.\n \"\"\"\n keys = [\"objectId\", \"scopeId\", \"stringAnnos\", \"longAnnos\", \"doubleAnnos\"]\n if not isinstance(annotations, collections.abc.Mapping):\n return False\n return all([key in keys for key in annotations.keys()])\n
"},{"location":"reference/annotations/#synapseclient.annotations.to_submission_status_annotations","title":"to_submission_status_annotations(annotations, is_private=True)
","text":"Converts a normal dictionary to the format used to annotate submission statuses, which is different from the format used to annotate entities.
PARAMETER DESCRIPTIONannotations
A normal Python dictionary whose values are strings, floats, ints or doubles.
is_private
Set privacy on all annotations at once. These can be set individually using set_privacy.
DEFAULT: True
Adding and converting annotations
from synapseclient.annotations import to_submission_status_annotations, from_submission_status_annotations\nfrom datetime import datetime as Datetime\n\n## create a submission and get its status\nsubmission = syn.submit(evaluation, 'syn11111111')\nsubmission_status = syn.getSubmissionStatus(submission)\n\n## add annotations\nsubmission_status.annotations = {'foo':'bar', 'shoe_size':12, 'IQ':12, 'timestamp':Datetime.now()}\n\n## convert annotations\nsubmission_status.annotations = to_submission_status_annotations(submission_status.annotations)\nsubmission_status = syn.store(submission_status)\n
Synapse categorizes these annotations by: stringAnnos, doubleAnnos, longAnnos.
Source code insynapseclient/annotations.py
def to_submission_status_annotations(annotations, is_private=True):\n \"\"\"\n Converts a normal dictionary to the format used to annotate submission statuses, which is different from the format\n used to annotate entities.\n\n Arguments:\n annotations: A normal Python dictionary whose values are strings, floats, ints or doubles.\n is_private: Set privacy on all annotations at once. These can be set individually using\n [set_privacy][synapseclient.annotations.set_privacy].\n\n\n Example: Using this function\n Adding and converting annotations\n\n from synapseclient.annotations import to_submission_status_annotations, from_submission_status_annotations\n from datetime import datetime as Datetime\n\n ## create a submission and get its status\n submission = syn.submit(evaluation, 'syn11111111')\n submission_status = syn.getSubmissionStatus(submission)\n\n ## add annotations\n submission_status.annotations = {'foo':'bar', 'shoe_size':12, 'IQ':12, 'timestamp':Datetime.now()}\n\n ## convert annotations\n submission_status.annotations = to_submission_status_annotations(submission_status.annotations)\n submission_status = syn.store(submission_status)\n\n\n Synapse categorizes these annotations by: stringAnnos, doubleAnnos, longAnnos.\n \"\"\"\n if is_submission_status_annotations(annotations):\n return annotations\n synapseAnnos = {}\n for key, value in annotations.items():\n if key in [\"objectId\", \"scopeId\", \"stringAnnos\", \"longAnnos\", \"doubleAnnos\"]:\n synapseAnnos[key] = value\n elif isinstance(value, bool):\n synapseAnnos.setdefault(\"stringAnnos\", []).append(\n {\"key\": key, \"value\": str(value).lower(), \"isPrivate\": is_private}\n )\n elif isinstance(value, int):\n synapseAnnos.setdefault(\"longAnnos\", []).append(\n {\"key\": key, \"value\": value, \"isPrivate\": is_private}\n )\n elif isinstance(value, float):\n synapseAnnos.setdefault(\"doubleAnnos\", []).append(\n {\"key\": key, \"value\": value, \"isPrivate\": is_private}\n )\n elif isinstance(value, str):\n synapseAnnos.setdefault(\"stringAnnos\", []).append(\n {\"key\": key, \"value\": value, \"isPrivate\": is_private}\n )\n elif is_date(value):\n synapseAnnos.setdefault(\"longAnnos\", []).append(\n {\n \"key\": key,\n \"value\": to_unix_epoch_time(value),\n \"isPrivate\": is_private,\n }\n )\n else:\n synapseAnnos.setdefault(\"stringAnnos\", []).append(\n {\"key\": key, \"value\": str(value), \"isPrivate\": is_private}\n )\n return synapseAnnos\n
"},{"location":"reference/annotations/#synapseclient.annotations.from_submission_status_annotations","title":"from_submission_status_annotations(annotations)
","text":"Convert back from submission status annotation format to a normal dictionary.
PARAMETER DESCRIPTIONannotations
A dictionary in the format used to annotate submission statuses.
RETURNS DESCRIPTION
dict
A normal Python dictionary.
Using this functionConverting from submission status annotations
submission_status.annotations = from_submission_status_annotations(submission_status.annotations)\n
Source code in synapseclient/annotations.py
def from_submission_status_annotations(annotations) -> dict:\n \"\"\"\n Convert back from submission status annotation format to a normal dictionary.\n\n Arguments:\n annotations: A dictionary in the format used to annotate submission statuses.\n\n Returns:\n A normal Python dictionary.\n\n Example: Using this function\n Converting from submission status annotations\n\n submission_status.annotations = from_submission_status_annotations(submission_status.annotations)\n \"\"\"\n dictionary = {}\n for key, value in annotations.items():\n if key in [\"stringAnnos\", \"longAnnos\"]:\n dictionary.update({kvp[\"key\"]: kvp[\"value\"] for kvp in value})\n elif key == \"doubleAnnos\":\n dictionary.update({kvp[\"key\"]: float(kvp[\"value\"]) for kvp in value})\n else:\n dictionary[key] = value\n return dictionary\n
"},{"location":"reference/annotations/#synapseclient.annotations.set_privacy","title":"set_privacy(annotations, key, is_private=True, value_types=['longAnnos', 'doubleAnnos', 'stringAnnos'])
","text":"Set privacy of individual annotations, where annotations are in the format used by Synapse SubmissionStatus objects. See the Annotations documentation.
PARAMETER DESCRIPTIONannotations
Annotations that have already been converted to Synapse format using to_submission_status_annotations.
key
The key of the annotation whose privacy we're setting.
is_private
If False, the annotation will be visible to users with READ permission on the evaluation. If True, the it will be visible only to users with READ_PRIVATE_SUBMISSION on the evaluation.
DEFAULT: True
value_types
A list of the value types in which to search for the key.
DEFAULT: ['longAnnos', 'doubleAnnos', 'stringAnnos']
synapseclient/annotations.py
def set_privacy(\n annotations,\n key,\n is_private=True,\n value_types=[\"longAnnos\", \"doubleAnnos\", \"stringAnnos\"],\n):\n \"\"\"\n Set privacy of individual annotations, where annotations are in the format used by Synapse SubmissionStatus objects.\n See the [Annotations documentation](https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/annotation/Annotations.html).\n\n Arguments:\n annotations: Annotations that have already been converted to Synapse format using\n [to_submission_status_annotations][synapseclient.annotations.to_submission_status_annotations].\n key: The key of the annotation whose privacy we're setting.\n is_private: If False, the annotation will be visible to users with READ permission on the evaluation.\n If True, the it will be visible only to users with READ_PRIVATE_SUBMISSION on the evaluation.\n value_types: A list of the value types in which to search for the key.\n\n \"\"\"\n for value_type in value_types:\n kvps = annotations.get(value_type, None)\n if kvps:\n for kvp in kvps:\n if kvp[\"key\"] == key:\n kvp[\"isPrivate\"] = is_private\n return kvp\n raise KeyError('The key \"%s\" couldn\\'t be found in the annotations.' % key)\n
"},{"location":"reference/annotations/#synapseclient.annotations.to_synapse_annotations","title":"to_synapse_annotations(annotations)
","text":"Transforms a simple flat dictionary to a Synapse-style Annotation object. https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/annotation/v2/Annotations.html
PARAMETER DESCRIPTIONannotations
A simple flat dictionary of annotations.
TYPE: Annotations
Dict[str, Any]
A Synapse-style Annotation dict.
Source code insynapseclient/annotations.py
def to_synapse_annotations(annotations: Annotations) -> typing.Dict[str, typing.Any]:\n \"\"\"Transforms a simple flat dictionary to a Synapse-style Annotation object.\n https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/annotation/v2/Annotations.html\n\n Arguments:\n annotations: A simple flat dictionary of annotations.\n\n Returns:\n A Synapse-style Annotation dict.\n \"\"\"\n\n if is_synapse_annotations(annotations):\n return annotations\n synapse_annos = {}\n\n if not isinstance(annotations, Annotations):\n raise TypeError(\n \"annotations must be a synapseclient.Annotations object with 'id' and 'etag' attributes\"\n )\n\n synapse_annos[\"id\"] = annotations.id\n synapse_annos[\"etag\"] = annotations.etag\n\n synapse_annos[\"annotations\"] = _convert_to_annotations_list(annotations)\n return synapse_annos\n
"},{"location":"reference/annotations/#synapseclient.annotations.from_synapse_annotations","title":"from_synapse_annotations(raw_annotations)
","text":"Transforms a Synapse-style Annotation object to a simple flat dictionary.
PARAMETER DESCRIPTIONraw_annotations
A Synapse-style Annotation dict.
TYPE: Dict[str, Any]
Annotations
A simple flat dictionary of annotations.
Source code insynapseclient/annotations.py
def from_synapse_annotations(\n raw_annotations: typing.Dict[str, typing.Any]\n) -> Annotations:\n \"\"\"Transforms a Synapse-style Annotation object to a simple flat dictionary.\n\n Arguments:\n raw_annotations: A Synapse-style Annotation dict.\n\n Returns:\n A simple flat dictionary of annotations.\n \"\"\"\n if not is_synapse_annotations(raw_annotations):\n raise ValueError(\n 'Unexpected format of annotations from Synapse. Must include keys: \"id\", \"etag\", and \"annotations\"'\n )\n\n annos = Annotations(raw_annotations[\"id\"], raw_annotations[\"etag\"])\n for key, value_and_type in raw_annotations[\"annotations\"].items():\n key: str\n conversion_func = ANNO_TYPE_TO_FUNC[value_and_type[\"type\"]]\n annos[key] = [conversion_func(v) for v in value_and_type[\"value\"]]\n\n return annos\n
"},{"location":"reference/annotations/#synapseclient.annotations.convert_old_annotation_json","title":"convert_old_annotation_json(annotations)
","text":"Transforms a parsed JSON dictionary of old style annotations into a new style consistent with the entity bundle v2 format.
This is intended to support some models that were saved as serialized entity bundle JSON (Submissions). we don't need to support newer types here e.g. BOOLEAN because they did not exist at the time that annotation JSON was saved in this form.
Source code insynapseclient/annotations.py
def convert_old_annotation_json(annotations):\n \"\"\"Transforms a parsed JSON dictionary of old style annotations\n into a new style consistent with the entity bundle v2 format.\n\n This is intended to support some models that were saved as serialized\n entity bundle JSON (Submissions). we don't need to support newer\n types here e.g. BOOLEAN because they did not exist at the time\n that annotation JSON was saved in this form.\n \"\"\"\n\n meta_keys = (\"id\", \"etag\", \"creationDate\", \"uri\")\n\n type_mapping = {\n \"doubleAnnotations\": \"DOUBLE\",\n \"stringAnnotations\": \"STRING\",\n \"longAnnotations\": \"LONG\",\n \"dateAnnotations\": \"TIMESTAMP_MS\",\n }\n\n annos_v1_keys = set(meta_keys) | set(type_mapping.keys())\n\n # blobAnnotations appear to be little/unused and there is no mapping defined here but if they\n # are present on the annos we should treat it as an old style annos dict\n annos_v1_keys.add(\"blobAnnotations\")\n\n # if any keys in the annos dict are not consistent with an old style annotations then we treat\n # it as an annotations2 style dictionary that is not in need of any conversion\n if any(k not in annos_v1_keys for k in annotations.keys()):\n return annotations\n\n converted = {k: v for k, v in annotations.items() if k in meta_keys}\n converted_annos = converted[\"annotations\"] = {}\n\n for old_type_key, converted_type in type_mapping.items():\n values = annotations.get(old_type_key)\n if values:\n for k, vs in values.items():\n converted_annos[k] = {\n \"type\": converted_type,\n \"value\": vs,\n }\n\n return converted\n
"},{"location":"reference/cli/","title":"Command Line Client","text":"The Synapse Python Client can be used from the command line via the synapse command.
Note: The command line client is installed along with installation of the Synapse Python client.
"},{"location":"reference/cli/#usage","title":"Usage","text":"For help, type:
synapse -h\n
"},{"location":"reference/client/","title":"Client","text":""},{"location":"reference/client/#synapseclient.Synapse","title":"synapseclient.Synapse
","text":" Bases: object
Constructs a Python client object for the Synapse repository service
ATTRIBUTE DESCRIPTIONrepoEndpoint
Location of Synapse repository
authEndpoint
Location of authentication service
fileHandleEndpoint
Location of file service
portalEndpoint
Location of the website
serviceTimeoutSeconds
Wait time before timeout (currently unused)
debug
Print debugging messages if True
skip_checks
Skip version and endpoint checks
configPath
Path to config File with setting for Synapse defaults to ~/.synapseConfig
requests_session
a custom requests.Session object that this Synapse instance will use when making http requests.
cache_root_dir
root directory for storing cache data
silent
Defaults to False.
Getting started
Logging in to Synapse using an authToken
import synapseclient\nsyn = synapseclient.login(authToken=\"authtoken\")\n
Using environment variable or .synapseConfig
import synapseclient\nsyn = synapseclient.login()\n
Source code in synapseclient/client.py
class Synapse(object):\n \"\"\"\n Constructs a Python client object for the Synapse repository service\n\n Attributes:\n repoEndpoint: Location of Synapse repository\n authEndpoint: Location of authentication service\n fileHandleEndpoint: Location of file service\n portalEndpoint: Location of the website\n serviceTimeoutSeconds: Wait time before timeout (currently unused)\n debug: Print debugging messages if True\n skip_checks: Skip version and endpoint checks\n configPath: Path to config File with setting for Synapse\n defaults to ~/.synapseConfig\n requests_session: a custom requests.Session object that this Synapse instance will use\n when making http requests.\n cache_root_dir: root directory for storing cache data\n silent: Defaults to False.\n\n Example: Getting started\n Logging in to Synapse using an authToken\n\n import synapseclient\n syn = synapseclient.login(authToken=\"authtoken\")\n\n Using environment variable or `.synapseConfig`\n\n import synapseclient\n syn = synapseclient.login()\n\n \"\"\"\n\n # TODO: add additional boolean for write to disk?\n @tracer.start_as_current_span(\"Synapse::__init__\")\n def __init__(\n self,\n repoEndpoint: str = None,\n authEndpoint: str = None,\n fileHandleEndpoint: str = None,\n portalEndpoint: str = None,\n debug: bool = None,\n skip_checks: bool = False,\n configPath: str = CONFIG_FILE,\n requests_session: requests.Session = None,\n cache_root_dir: str = None,\n silent: bool = None,\n ):\n self._requests_session = requests_session or requests.Session()\n\n cache_root_dir = (\n cache.CACHE_ROOT_DIR if cache_root_dir is None else cache_root_dir\n )\n\n config_debug = None\n # Check for a config file\n self.configPath = configPath\n if os.path.isfile(configPath):\n config = self.getConfigFile(configPath)\n if config.has_option(\"cache\", \"location\"):\n cache_root_dir = config.get(\"cache\", \"location\")\n if config.has_section(\"debug\"):\n config_debug = True\n\n if debug is None:\n debug = config_debug if config_debug is not None else DEBUG_DEFAULT\n\n self.cache = cache.Cache(cache_root_dir)\n self._sts_token_store = sts_transfer.StsTokenStore()\n\n self.setEndpoints(\n repoEndpoint, authEndpoint, fileHandleEndpoint, portalEndpoint, skip_checks\n )\n\n self.default_headers = {\n \"content-type\": \"application/json; charset=UTF-8\",\n \"Accept\": \"application/json; charset=UTF-8\",\n }\n self.credentials = None\n\n if not isinstance(debug, bool):\n raise ValueError(\"debug must be set to a bool (either True or False)\")\n self.debug = debug\n\n self.silent = silent\n self._init_logger() # initializes self.logger\n\n self.skip_checks = skip_checks\n\n self.table_query_sleep = 2\n self.table_query_backoff = 1.1\n self.table_query_max_sleep = 20\n self.table_query_timeout = 600 # in seconds\n self.multi_threaded = True # if set to True, multi threaded download will be used for http and https URLs\n\n transfer_config = self._get_transfer_config()\n self.max_threads = transfer_config[\"max_threads\"]\n self.use_boto_sts_transfers = transfer_config[\"use_boto_sts\"]\n\n # initialize logging\n def _init_logger(self):\n logger_name = (\n SILENT_LOGGER_NAME\n if self.silent\n else DEBUG_LOGGER_NAME\n if self.debug\n else DEFAULT_LOGGER_NAME\n )\n self.logger = logging.getLogger(logger_name)\n logging.getLogger(\"py.warnings\").handlers = self.logger.handlers\n\n @property\n def max_threads(self) -> int:\n return self._max_threads\n\n @max_threads.setter\n def max_threads(self, value: int):\n self._max_threads = min(max(value, 1), 
MAX_THREADS_CAP)\n\n @property\n def username(self) -> Union[str, None]:\n # for backwards compatability when username was a part of the Synapse object and not in credentials\n return self.credentials.username if self.credentials is not None else None\n\n @tracer.start_as_current_span(\"Synapse::getConfigFile\")\n @functools.lru_cache()\n def getConfigFile(self, configPath: str) -> configparser.RawConfigParser:\n \"\"\"\n Retrieves the client configuration information.\n\n Arguments:\n configPath: Path to configuration file on local file system\n\n Returns:\n A RawConfigParser populated with properties from the user's configuration file.\n \"\"\"\n\n try:\n config = configparser.RawConfigParser()\n config.read(configPath) # Does not fail if the file does not exist\n return config\n except configparser.Error as ex:\n raise ValueError(\n \"Error parsing Synapse config file: {}\".format(configPath)\n ) from ex\n\n @tracer.start_as_current_span(\"Synapse::setEndpoints\")\n def setEndpoints(\n self,\n repoEndpoint: str = None,\n authEndpoint: str = None,\n fileHandleEndpoint: str = None,\n portalEndpoint: str = None,\n skip_checks: bool = False,\n ):\n \"\"\"\n Sets the locations for each of the Synapse services (mostly useful for testing).\n\n Arguments:\n repoEndpoint: Location of synapse repository\n authEndpoint: Location of authentication service\n fileHandleEndpoint: Location of file service\n portalEndpoint: Location of the website\n skip_checks: Skip version and endpoint checks\n\n Example: Switching endpoints\n To switch between staging and production endpoints\n\n syn.setEndpoints(**synapseclient.client.STAGING_ENDPOINTS)\n syn.setEndpoints(**synapseclient.client.PRODUCTION_ENDPOINTS)\n\n \"\"\"\n\n endpoints = {\n \"repoEndpoint\": repoEndpoint,\n \"authEndpoint\": authEndpoint,\n \"fileHandleEndpoint\": fileHandleEndpoint,\n \"portalEndpoint\": portalEndpoint,\n }\n\n # For unspecified endpoints, first look in the config file\n config = self.getConfigFile(self.configPath)\n for point in endpoints.keys():\n if endpoints[point] is None and config.has_option(\"endpoints\", point):\n endpoints[point] = config.get(\"endpoints\", point)\n\n # Endpoints default to production\n for point in endpoints.keys():\n if endpoints[point] is None:\n endpoints[point] = PRODUCTION_ENDPOINTS[point]\n\n # Update endpoints if we get redirected\n if not skip_checks:\n response = self._requests_session.get(\n endpoints[point],\n allow_redirects=False,\n headers=synapseclient.USER_AGENT,\n )\n if response.status_code == 301:\n endpoints[point] = response.headers[\"location\"]\n\n self.repoEndpoint = endpoints[\"repoEndpoint\"]\n self.authEndpoint = endpoints[\"authEndpoint\"]\n self.fileHandleEndpoint = endpoints[\"fileHandleEndpoint\"]\n self.portalEndpoint = endpoints[\"portalEndpoint\"]\n\n @tracer.start_as_current_span(\"Synapse::login\")\n def login(\n self,\n email: str = None,\n password: str = None,\n apiKey: str = None,\n sessionToken: str = None,\n rememberMe: bool = False,\n silent: bool = False,\n forced: bool = False,\n authToken: str = None,\n ):\n \"\"\"\n Valid combinations of login() arguments:\n\n - email/username and password\n - email/username and apiKey (Base64 encoded string)\n - authToken\n - sessionToken (**DEPRECATED**)\n\n If no login arguments are provided or only username is provided, login() will attempt to log in using\n information from these sources (in order of preference):\n\n 1. User's personal access token from environment the variable: SYNAPSE_AUTH_TOKEN\n 2. 
.synapseConfig file (in user home folder unless configured otherwise)\n 3. cached credentials from previous `login()` where `rememberMe=True` was passed as a parameter\n\n Arguments:\n email: Synapse user name (or an email address associated with a Synapse account)\n password: **!!WILL BE DEPRECATED!!** password. Please use authToken (Synapse personal access token)\n apiKey: **!!WILL BE DEPRECATED!!** Base64 encoded Synapse API key\n sessionToken: **!!DEPRECATED FIELD!!** User's current session token. Using this field will ignore the\n following fields: email, password, apiKey\n rememberMe: Whether the authentication information should be cached in your operating system's\n credential storage.\n authToken: A bearer authorization token, e.g. a personal access token, can be used in lieu of a\n password or apiKey.\n silent: Defaults to False. Suppresses the \"Welcome ...!\" message.\n forced: Defaults to False. Bypass the credential cache if set.\n\n **GNOME Keyring** (recommended) or **KWallet** is recommended to be installed for credential storage on\n **Linux** systems.\n If it is not installed/setup, credentials will be stored as PLAIN-TEXT file with read and write permissions for\n the current user only (chmod 600).\n On Windows and Mac OS, a default credentials storage exists so it will be preferred over the plain-text file.\n To install GNOME Keyring on Ubuntu:\n\n sudo apt-get install gnome-keyring\n\n sudo apt-get install python-dbus #(for Python 2 installed via apt-get)\n OR\n sudo apt-get install python3-dbus #(for Python 3 installed via apt-get)\n OR\n sudo apt-get install libdbus-glib-1-dev #(for custom installation of Python or vitualenv)\n sudo pip install dbus-python #(may take a while to compile C code)\n\n If you are on a headless Linux session (e.g. connecting via SSH), please run the following commands before\n running your Python session:\n\n dbus-run-session -- bash #(replace 'bash' with 'sh' if bash is unavailable)\n echo -n \"REPLACE_WITH_YOUR_KEYRING_PASSWORD\"|gnome-keyring-daemon -- unlock\n\n Example: Logging in\n Using an auth token:\n\n syn.login(authToken=\"authtoken\")\n #> Welcome, Me!\n\n Using a username/password:\n\n syn.login('my-username', 'secret-password', rememberMe=True)\n syn.login('my-username', 'secret-password', rememberMe=True)\n #> Welcome, Me!\n syn.login('my-username', 'secret-password', rememberMe=True)\n #> Welcome, Me!\n\n After logging in with the *rememberMe* flag set, an API key will be cached and\n used to authenticate for future logins:\n\n syn.login()\n #> Welcome, Me!\n\n \"\"\"\n # Note: the order of the logic below reflects the ordering in the docstring above.\n\n # Check version before logging in\n if not self.skip_checks:\n version_check()\n\n # Make sure to invalidate the existing session\n self.logout()\n\n credential_provider_chain = get_default_credential_chain()\n # TODO: remove deprecated sessionToken when we move to a different solution\n self.credentials = credential_provider_chain.get_credentials(\n self,\n UserLoginArgs(\n email,\n password,\n apiKey,\n forced,\n sessionToken,\n authToken,\n ),\n )\n\n # Final check on login success\n if not self.credentials:\n raise SynapseNoCredentialsError(\"No credentials provided.\")\n\n # Save the API key in the cache\n if rememberMe:\n message = (\n \"The rememberMe parameter will be deprecated by early 2024. 
Please use the ~/.synapseConfig \"\n \"or SYNAPSE_AUTH_TOKEN environmental variable to set up your Synapse connection.\"\n )\n self.logger.warning(message)\n delete_stored_credentials(self.credentials.username)\n self.credentials.store_to_keyring()\n cached_sessions.set_most_recent_user(self.credentials.username)\n\n if not silent:\n profile = self.getUserProfile()\n # TODO-PY3: in Python2, do we need to ensure that this is encoded in utf-8\n self.logger.info(\n \"Welcome, %s!\\n\"\n % (\n profile[\"displayName\"]\n if \"displayName\" in profile\n else self.credentials.username\n )\n )\n\n def _get_config_section_dict(self, section_name: str) -> dict:\n config = self.getConfigFile(self.configPath)\n try:\n return dict(config.items(section_name))\n except configparser.NoSectionError:\n # section not present\n return {}\n\n def _get_config_authentication(self) -> str:\n return self._get_config_section_dict(\n config_file_constants.AUTHENTICATION_SECTION_NAME\n )\n\n def _get_client_authenticated_s3_profile(self, endpoint: str, bucket: str) -> str:\n config_section = endpoint + \"/\" + bucket\n return self._get_config_section_dict(config_section).get(\n \"profile_name\", \"default\"\n )\n\n def _get_transfer_config(self) -> dict:\n # defaults\n transfer_config = {\"max_threads\": DEFAULT_NUM_THREADS, \"use_boto_sts\": False}\n\n for k, v in self._get_config_section_dict(\"transfer\").items():\n if v:\n if k == \"max_threads\" and v:\n try:\n transfer_config[\"max_threads\"] = int(v)\n except ValueError as cause:\n raise ValueError(\n f\"Invalid transfer.max_threads config setting {v}\"\n ) from cause\n\n elif k == \"use_boto_sts\":\n lower_v = v.lower()\n if lower_v not in (\"true\", \"false\"):\n raise ValueError(\n f\"Invalid transfer.use_boto_sts config setting {v}\"\n )\n\n transfer_config[\"use_boto_sts\"] = \"true\" == lower_v\n\n return transfer_config\n\n @tracer.start_as_current_span(\"Synapse::_getSessionToken\")\n def _getSessionToken(self, email: str, password: str) -> str:\n \"\"\"Returns a validated session token.\"\"\"\n try:\n req = {\"email\": email, \"password\": password}\n session = self.restPOST(\n \"/session\",\n body=json.dumps(req),\n endpoint=self.authEndpoint,\n headers=self.default_headers,\n )\n return session[\"sessionToken\"]\n except SynapseHTTPError as err:\n if (\n err.response.status_code == 403\n or err.response.status_code == 404\n or err.response.status_code == 401\n ):\n raise SynapseAuthenticationError(\"Invalid username or password.\")\n raise\n\n @tracer.start_as_current_span(\"Synapse::_getAPIKey\")\n def _getAPIKey(self, sessionToken: str) -> str:\n \"\"\"Uses a session token to fetch an API key.\"\"\"\n\n headers = {\"sessionToken\": sessionToken, \"Accept\": \"application/json\"}\n secret = self.restGET(\"/secretKey\", endpoint=self.authEndpoint, headers=headers)\n return secret[\"secretKey\"]\n\n @tracer.start_as_current_span(\"Synapse::_is_logged_in\")\n def _is_logged_in(self) -> bool:\n \"\"\"Test whether the user is logged in to Synapse.\"\"\"\n # This is a quick sanity check to see if credentials have been\n # configured on the client\n if self.credentials is None:\n return False\n # The public can query this command so there is no need to try catch.\n user = self.restGET(\"/userProfile\")\n if user.get(\"userName\") == \"anonymous\":\n return False\n return True\n\n def logout(self, forgetMe: bool = False):\n \"\"\"\n Removes authentication information from the Synapse client.\n\n Arguments:\n forgetMe: Set as True to clear any local storage 
of authentication information.\n See the flag \"rememberMe\" in [synapseclient.Synapse.login][]\n\n Returns:\n None\n \"\"\"\n # Delete the user's API key from the cache\n if forgetMe and self.credentials:\n self.credentials.delete_from_keyring()\n\n self.credentials = None\n\n def invalidateAPIKey(self):\n \"\"\"Invalidates authentication across all clients.\n\n Returns:\n None\n \"\"\"\n\n # Logout globally\n if self._is_logged_in():\n self.restDELETE(\"/secretKey\", endpoint=self.authEndpoint)\n\n @functools.lru_cache()\n def get_user_profile_by_username(\n self,\n username: str = None,\n sessionToken: str = None,\n ) -> UserProfile:\n \"\"\"\n Get the details about a Synapse user.\n Retrieves information on the current user if 'id' is omitted or is empty string.\n\n Arguments:\n username: The userName of a user\n sessionToken: The session token to use to find the user profile\n\n Returns:\n The user profile for the user of interest.\n\n Example: Using this function\n Getting your own profile\n\n my_profile = syn.get_user_profile_by_username()\n\n Getting another user's profile\n\n freds_profile = syn.get_user_profile_by_username('fredcommo')\n \"\"\"\n is_none = username is None\n is_str = isinstance(username, str)\n if not is_str and not is_none:\n raise TypeError(\"username must be string or None\")\n if is_str:\n principals = self._findPrincipals(username)\n for principal in principals:\n if principal.get(\"userName\", None).lower() == username.lower():\n id = principal[\"ownerId\"]\n break\n else:\n raise ValueError(f\"Can't find user '{username}'\")\n else:\n id = \"\"\n uri = f\"/userProfile/{id}\"\n return UserProfile(\n **self.restGET(\n uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n )\n )\n\n @functools.lru_cache()\n def get_user_profile_by_id(\n self,\n id: int = None,\n sessionToken: str = None,\n ) -> UserProfile:\n \"\"\"\n Get the details about a Synapse user.\n Retrieves information on the current user if 'id' is omitted.\n\n Arguments:\n id: The ownerId of a user\n sessionToken: The session token to use to find the user profile\n\n Returns:\n The user profile for the user of interest.\n\n\n Example: Using this function\n Getting your own profile\n\n my_profile = syn.get_user_profile_by_id()\n\n Getting another user's profile\n\n freds_profile = syn.get_user_profile_by_id(1234567)\n \"\"\"\n if id:\n if not isinstance(id, int):\n raise TypeError(\"id must be an 'ownerId' integer\")\n else:\n id = \"\"\n uri = f\"/userProfile/{id}\"\n return UserProfile(\n **self.restGET(\n uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n )\n )\n\n @functools.lru_cache()\n def getUserProfile(\n self,\n id: Union[str, int, UserProfile, TeamMember] = None,\n sessionToken: str = None,\n ) -> UserProfile:\n \"\"\"\n Get the details about a Synapse user.\n Retrieves information on the current user if 'id' is omitted.\n\n Arguments:\n id: The 'userId' (aka 'ownerId') of a user or the userName\n sessionToken: The session token to use to find the user profile\n\n Returns:\n The user profile for the user of interest.\n\n Example: Using this function\n Getting your own profile\n\n my_profile = syn.getUserProfile()\n\n Getting another user's profile\n\n freds_profile = syn.getUserProfile('fredcommo')\n \"\"\"\n try:\n # if id is unset or a userID, this will succeed\n id = \"\" if id is None else int(id)\n except (TypeError, ValueError):\n if isinstance(id, collections.abc.Mapping) and \"ownerId\" in id:\n id = id.ownerId\n elif isinstance(id, 
TeamMember):\n id = id.member.ownerId\n else:\n principals = self._findPrincipals(id)\n if len(principals) == 1:\n id = principals[0][\"ownerId\"]\n else:\n for principal in principals:\n if principal.get(\"userName\", None).lower() == id.lower():\n id = principal[\"ownerId\"]\n break\n else: # no break\n raise ValueError('Can\\'t find user \"%s\": ' % id)\n uri = \"/userProfile/%s\" % id\n return UserProfile(\n **self.restGET(\n uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n )\n )\n\n def _findPrincipals(self, query_string: str) -> typing.List[UserGroupHeader]:\n \"\"\"\n Find users or groups by name or email.\n\n :returns: A list of userGroupHeader objects with fields displayName, email, firstName, lastName, isIndividual,\n ownerId\n\n Example::\n\n syn._findPrincipals('test')\n\n [{u'displayName': u'Synapse Test',\n u'email': u'syn...t@sagebase.org',\n u'firstName': u'Synapse',\n u'isIndividual': True,\n u'lastName': u'Test',\n u'ownerId': u'1560002'},\n {u'displayName': ... }]\n\n \"\"\"\n uri = \"/userGroupHeaders?prefix=%s\" % urllib_urlparse.quote(query_string)\n return [UserGroupHeader(**result) for result in self._GET_paginated(uri)]\n\n def _get_certified_passing_record(self, userid: int) -> dict:\n \"\"\"Retrieve the Passing Record on the User Certification test for the given user.\n\n :params userid: Synapse user Id\n\n :returns: Synapse Passing Record\n https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/quiz/PassingRecord.html\n \"\"\"\n response = self.restGET(f\"/user/{userid}/certifiedUserPassingRecord\")\n return response\n\n def _get_user_bundle(self, userid: int, mask: int) -> dict:\n \"\"\"Retrieve the user bundle for the given user.\n\n :params userid: Synapse user Id\n :params mask: Bit field indicating which components to include in the bundle.\n\n :returns: Synapse User Bundle\n https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/UserBundle.html\n \"\"\"\n try:\n response = self.restGET(f\"/user/{userid}/bundle?mask={mask}\")\n except SynapseHTTPError as ex:\n if ex.response.status_code == 404:\n return None\n return response\n\n @tracer.start_as_current_span(\"Synapse::is_certified\")\n def is_certified(self, user: typing.Union[str, int]) -> bool:\n \"\"\"Determines whether a Synapse user is a certified user.\n\n Arguments:\n user: Synapse username or Id\n\n Returns:\n True if the Synapse user is certified\n \"\"\"\n # Check if userid or username exists\n syn_user = self.getUserProfile(user)\n # Get passing record\n\n try:\n certification_status = self._get_certified_passing_record(\n syn_user[\"ownerId\"]\n )\n return certification_status[\"passed\"]\n except SynapseHTTPError as ex:\n if ex.response.status_code == 404:\n # user hasn't taken the quiz\n return False\n raise\n\n @tracer.start_as_current_span(\"Synapse::is_synapse_id\")\n def is_synapse_id(self, syn_id: str) -> bool:\n \"\"\"Checks if given synID is valid (attached to actual entity?)\n\n Arguments:\n syn_id: A Synapse ID\n\n Returns:\n True if the Synapse ID is valid\n \"\"\"\n if isinstance(syn_id, str):\n try:\n self.get(syn_id, downloadFile=False)\n except SynapseFileNotFoundError:\n return False\n except (\n SynapseHTTPError,\n SynapseAuthenticationError,\n ) as err:\n status = (\n err.__context__.response.status_code or err.response.status_code\n )\n if status in (400, 404):\n return False\n # Valid ID but user lacks permission or is not logged in\n elif status == 403:\n return True\n return True\n self.logger.warning(\"synID must be a 
string\")\n return False\n\n def onweb(self, entity, subpageId=None):\n \"\"\"Opens up a browser window to the entity page or wiki-subpage.\n\n Arguments:\n entity: Either an Entity or a Synapse ID\n subpageId: (Optional) ID of one of the wiki's sub-pages\n\n Returns:\n None\n \"\"\"\n if isinstance(entity, str) and os.path.isfile(entity):\n entity = self.get(entity, downloadFile=False)\n synId = id_of(entity)\n if subpageId is None:\n webbrowser.open(\"%s#!Synapse:%s\" % (self.portalEndpoint, synId))\n else:\n webbrowser.open(\n \"%s#!Wiki:%s/ENTITY/%s\" % (self.portalEndpoint, synId, subpageId)\n )\n\n @tracer.start_as_current_span(\"Synapse::printEntity\")\n def printEntity(self, entity, ensure_ascii=True) -> None:\n \"\"\"\n Pretty prints an Entity.\n\n Arguments:\n entity: The entity to be printed.\n ensure_ascii: If True, escapes all non-ASCII characters\n\n Returns:\n None\n \"\"\"\n\n if utils.is_synapse_id_str(entity):\n entity = self._getEntity(entity)\n try:\n self.logger.info(\n json.dumps(entity, sort_keys=True, indent=2, ensure_ascii=ensure_ascii)\n )\n except TypeError:\n self.logger.info(str(entity))\n\n def _print_transfer_progress(self, *args, **kwargs):\n # Checking synapse if the mode is silent mode.\n # If self.silent is True, no need to print out transfer progress.\n if self.silent is not True:\n cumulative_transfer_progress.printTransferProgress(*args, **kwargs)\n\n ############################################################\n # Service methods #\n ############################################################\n\n _services = {\n \"json_schema\": \"JsonSchemaService\",\n }\n\n def get_available_services(self) -> typing.List[str]:\n \"\"\"Get available Synapse services\n This is a beta feature and is subject to change\n\n Returns:\n List of available services\n \"\"\"\n services = self._services.keys()\n return list(services)\n\n def service(self, service_name: str):\n \"\"\"Get available Synapse services\n This is a beta feature and is subject to change\n\n Arguments:\n service_name: name of the service\n \"\"\"\n # This is to avoid circular imports\n # TODO: revisit the import order and method https://stackoverflow.com/a/37126790\n # To move this to the top\n import synapseclient.services\n\n assert isinstance(service_name, str)\n service_name = service_name.lower().replace(\" \", \"_\")\n assert service_name in self._services, (\n f\"Unrecognized service ({service_name}). 
############################################################\n # Get / Store methods #\n ############################################################\n\n @tracer.start_as_current_span(\"Synapse::get\")\n def get(self, entity, **kwargs):\n \"\"\"\n Gets a Synapse entity from the repository service.\n\n Arguments:\n entity: A Synapse ID, a Synapse Entity object, a plain dictionary in which 'id' maps to a\n Synapse ID or a local file that is stored in Synapse (found by the file MD5)\n version: The specific version to get.\n Defaults to the most recent version.\n downloadFile: Whether associated file(s) should be downloaded.\n Defaults to True\n downloadLocation: Directory where to download the Synapse File Entity.\n Defaults to the local cache.\n followLink: Whether the link returns the target Entity.\n Defaults to False\n ifcollision: Determines how to handle file collisions.\n May be \"overwrite.local\", \"keep.local\", or \"keep.both\".\n Defaults to \"keep.both\".\n limitSearch: A Synapse ID used to limit the search in Synapse if entity is specified as a local\n file. That is, if the file is stored in multiple locations in Synapse only the ones\n in the specified folder/project will be returned.\n\n Returns:\n A new Synapse Entity object of the appropriate type.\n\n Example: Using this function\n Download file into cache\n\n entity = syn.get('syn1906479')\n print(entity.name)\n print(entity.path)\n\n Download file into current working directory\n\n entity = syn.get('syn1906479', downloadLocation='.')\n print(entity.name)\n print(entity.path)\n\n Determine the provenance of a locally stored file as indicated in Synapse\n\n entity = syn.get('/path/to/file.txt', limitSearch='syn12312')\n print(syn.getProvenance(entity))\n \"\"\"\n # If entity is a local file determine the corresponding synapse entity\n if isinstance(entity, str) and os.path.isfile(entity):\n bundle = self._getFromFile(entity, kwargs.pop(\"limitSearch\", None))\n kwargs[\"downloadFile\"] = False\n kwargs[\"path\"] = entity\n\n elif isinstance(entity, str) and not utils.is_synapse_id_str(entity):\n raise SynapseFileNotFoundError(\n (\n \"The parameter %s is neither a local file path \"\n \"nor a valid entity id\" % entity\n )\n )\n # entities that have not been saved yet\n elif isinstance(entity, Entity) and not entity.get(\"id\"):\n raise ValueError(\n \"Cannot retrieve entity that has not been saved.\"\n \" Please use syn.store() to save your entity and try again.\"\n )\n else:\n version = kwargs.get(\"version\", None)\n bundle = self._getEntityBundle(entity, version)\n # Check and warn for unmet access requirements\n self._check_entity_restrictions(\n bundle, entity, kwargs.get(\"downloadFile\", True)\n )\n\n return_data = self._getWithEntityBundle(\n entityBundle=bundle, entity=entity, **kwargs\n )\n trace.get_current_span().set_attributes(\n {\n \"synapse.id\": return_data.get(\"id\", \"\"),\n \"synapse.concrete_type\": return_data.get(\"concreteType\", \"\"),\n }\n )\n return return_data\n\n def _check_entity_restrictions(self, bundle, entity, downloadFile):\n restrictionInformation = bundle[\"restrictionInformation\"]\n if restrictionInformation[\"hasUnmetAccessRequirement\"]:\n warning_message = (\n \"\\nThis entity has access restrictions. 
Please visit the web page for this entity \"\n f'(syn.onweb(\"{id_of(entity)}\")). Look for the \"Access\" label and the lock icon underneath '\n 'the file name. Click \"Request Access\", and then review and fulfill the file '\n \"download requirement(s).\\n\"\n )\n if downloadFile and bundle.get(\"entityType\") not in (\"project\", \"folder\"):\n raise SynapseUnmetAccessRestrictions(warning_message)\n warnings.warn(warning_message)\n\n @tracer.start_as_current_span(\"Synapse::_getFromFile\")\n def _getFromFile(self, filepath, limitSearch=None):\n \"\"\"\n Gets a Synapse entityBundle based on the md5 of a local file\n See :py:func:`synapseclient.Synapse.get`.\n\n :param filepath: path to local file\n :param limitSearch: Limits the places in Synapse where the file is searched for.\n \"\"\"\n results = self.restGET(\n \"/entity/md5/%s\" % utils.md5_for_file(filepath).hexdigest()\n )[\"results\"]\n if limitSearch is not None:\n # Go through and find the path of every entity found\n paths = [self.restGET(\"/entity/%s/path\" % ent[\"id\"]) for ent in results]\n # Filter out all entities whose path does not contain limitSearch\n results = [\n ent\n for ent, path in zip(results, paths)\n if utils.is_in_path(limitSearch, path)\n ]\n if len(results) == 0: # None found\n raise SynapseFileNotFoundError(\"File %s not found in Synapse\" % (filepath,))\n elif len(results) > 1:\n id_txts = \"\\n\".join(\n [\"%s.%i\" % (r[\"id\"], r[\"versionNumber\"]) for r in results]\n )\n self.logger.warning(\n \"\\nThe file %s is associated with many files in Synapse:\\n%s\\n\"\n \"You can limit to files in specific project or folder by setting the limitSearch to the\"\n \" synapse Id of the project or folder.\\n\"\n \"Will use the first one returned: \\n\"\n \"%s version %i\\n\"\n % (filepath, id_txts, results[0][\"id\"], results[0][\"versionNumber\"])\n )\n entity = results[0]\n\n bundle = self._getEntityBundle(entity, version=entity[\"versionNumber\"])\n self.cache.add(bundle[\"entity\"][\"dataFileHandleId\"], filepath)\n\n return bundle\n\n @tracer.start_as_current_span(\"Synapse::move\")\n def move(self, entity, new_parent):\n \"\"\"\n Move a Synapse entity to a new container.\n\n Arguments:\n entity: A Synapse ID, a Synapse Entity object, or a local file that is stored in Synapse\n new_parent: The new parent container (Folder or Project) to which the entity should be moved.\n\n Returns:\n The Synapse Entity object that has been moved.\n\n Example: Using this function\n Move a Synapse Entity object to a new parent container\n\n entity = syn.move('syn456', 'syn123')\n \"\"\"\n\n entity = self.get(entity, downloadFile=False)\n entity.parentId = id_of(new_parent)\n entity = self.store(entity, forceVersion=False)\n trace.get_current_span().set_attributes(\n {\n \"synapse.id\": entity.get(\"id\", \"\"),\n \"synapse.parent_id\": entity.get(\"parentId\", \"\"),\n }\n )\n\n return entity\n\n def _getWithEntityBundle(self, entityBundle, entity=None, **kwargs):\n \"\"\"\n Creates a :py:mod:`synapseclient.Entity` from an entity bundle returned by Synapse.\n An existing Entity can be supplied in case we want to refresh a stale Entity.\n\n :param entityBundle: Uses the given dictionary as the meta information of the Entity to get\n :param entity: Optional, entity whose local state will be copied into the returned entity\n :param submission: Optional, access associated files through a submission rather than\n through an entity.\n\n See :py:func:`synapseclient.Synapse.get`.\n See 
:py:func:`synapseclient.Synapse._getEntityBundle`.\n See :py:mod:`synapseclient.Entity`.\n \"\"\"\n\n # Note: This version overrides the version of 'entity' (if the object is Mappable)\n kwargs.pop(\"version\", None)\n downloadFile = kwargs.pop(\"downloadFile\", True)\n downloadLocation = kwargs.pop(\"downloadLocation\", None)\n ifcollision = kwargs.pop(\"ifcollision\", None)\n submission = kwargs.pop(\"submission\", None)\n followLink = kwargs.pop(\"followLink\", False)\n path = kwargs.pop(\"path\", None)\n\n # make sure user didn't accidentally pass a kwarg that we don't handle\n if kwargs: # if there are remaining items in the kwargs\n raise TypeError(\"Unexpected **kwargs: %r\" % kwargs)\n\n # If Link, get target ID entity bundle\n if (\n entityBundle[\"entity\"][\"concreteType\"]\n == \"org.sagebionetworks.repo.model.Link\"\n and followLink\n ):\n targetId = entityBundle[\"entity\"][\"linksTo\"][\"targetId\"]\n targetVersion = entityBundle[\"entity\"][\"linksTo\"].get(\"targetVersionNumber\")\n entityBundle = self._getEntityBundle(targetId, targetVersion)\n\n # TODO is it an error to specify both downloadFile=False and downloadLocation?\n # TODO this matters if we want to return already cached files when downloadFile=False\n\n # Make a fresh copy of the Entity\n local_state = (\n entity.local_state() if entity and isinstance(entity, Entity) else {}\n )\n if path is not None:\n local_state[\"path\"] = path\n properties = entityBundle[\"entity\"]\n annotations = from_synapse_annotations(entityBundle[\"annotations\"])\n entity = Entity.create(properties, annotations, local_state)\n\n # Handle download of fileEntities\n if isinstance(entity, File):\n # update the entity with FileHandle metadata\n file_handle = next(\n (\n handle\n for handle in entityBundle[\"fileHandles\"]\n if handle[\"id\"] == entity.dataFileHandleId\n ),\n None,\n )\n entity._update_file_handle(file_handle)\n\n if downloadFile:\n if file_handle:\n self._download_file_entity(\n downloadLocation,\n entity,\n ifcollision,\n submission,\n )\n else: # no filehandle means that we do not have DOWNLOAD permission\n warning_message = (\n \"WARNING: You have READ permission on this file entity but not DOWNLOAD \"\n \"permission. The file has NOT been downloaded.\"\n )\n self.logger.warning(\n \"\\n\"\n + \"!\" * len(warning_message)\n + \"\\n\"\n + warning_message\n + \"\\n\"\n + \"!\" * len(warning_message)\n + \"\\n\"\n )\n return entity\n\n 
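# Editor's note (hedged example): a sketch of the Link-following behavior handled\n # above; the Synapse ID is illustrative only.\n #\n # link = syn.get(\"syn11111\", followLink=False) # returns the Link entity itself\n # target = syn.get(\"syn11111\", followLink=True) # returns the entity the Link points to\n\n 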
def _ensure_download_location_is_directory(self, downloadLocation):\n download_dir = os.path.expandvars(os.path.expanduser(downloadLocation))\n if os.path.isfile(download_dir):\n raise ValueError(\n \"Parameter 'downloadLocation' should be a directory, not a file.\"\n )\n return download_dir\n\n @tracer.start_as_current_span(\"Synapse::_download_file_entity\")\n def _download_file_entity(\n self,\n downloadLocation: str,\n entity: Entity,\n ifcollision: str,\n submission: str,\n ):\n # set the initial local state\n entity.path = None\n entity.files = []\n entity.cacheDir = None\n\n # check to see if an UNMODIFIED version of the file (since it was last downloaded) already exists\n # this location could be either in .synapseCache or a user specified location to which the user previously\n # downloaded the file\n cached_file_path = self.cache.get(entity.dataFileHandleId, downloadLocation)\n\n # location in .synapseCache where the file would be, corresponding to its FileHandleId\n synapseCache_location = self.cache.get_cache_dir(entity.dataFileHandleId)\n\n file_name = (\n entity._file_handle.fileName\n if cached_file_path is None\n else os.path.basename(cached_file_path)\n )\n\n # Decide the best download location for the file\n if downloadLocation is not None:\n # Make sure the specified download location is a fully resolved directory\n downloadLocation = self._ensure_download_location_is_directory(\n downloadLocation\n )\n elif cached_file_path is not None:\n # file already cached so use that as the download location\n downloadLocation = os.path.dirname(cached_file_path)\n else:\n # file not cached and no user-specified location so default to .synapseCache\n downloadLocation = synapseCache_location\n\n # resolve file path collisions by either overwriting, renaming, or not downloading, depending on the\n # ifcollision value\n downloadPath = self._resolve_download_path_collisions(\n downloadLocation,\n file_name,\n ifcollision,\n synapseCache_location,\n cached_file_path,\n )\n if downloadPath is None:\n return\n\n if cached_file_path is not None: # copy from cache\n if downloadPath != cached_file_path:\n # create the folder if it does not exist already\n if not os.path.exists(downloadLocation):\n os.makedirs(downloadLocation)\n shutil.copy(cached_file_path, downloadPath)\n\n else: # download the file from URL (could be a local file)\n objectType = \"FileEntity\" if submission is None else \"SubmissionAttachment\"\n objectId = entity[\"id\"] if submission is None else submission\n\n # reassign downloadPath because if url points to local file (e.g. file://~/someLocalFile.txt)\n # it won't be \"downloaded\" and, instead, downloadPath will just point to '~/someLocalFile.txt'\n # _downloadFileHandle may also return None to indicate that the download failed\n downloadPath = self._downloadFileHandle(\n entity.dataFileHandleId, objectId, objectType, downloadPath\n )\n\n if downloadPath is None or not os.path.exists(downloadPath):\n return\n\n # converts the path format from forward slashes back to backward slashes on Windows\n entity.path = os.path.normpath(downloadPath)\n entity.files = [os.path.basename(downloadPath)]\n entity.cacheDir = os.path.dirname(downloadPath)\n\n 
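# Editor's note (hedged example): how the ifcollision modes resolved below behave\n # when a same-named file already exists at the download location; the Synapse ID\n # is illustrative only.\n #\n # syn.get(\"syn1906479\", downloadLocation=\".\", ifcollision=\"keep.local\") # skip the download\n # syn.get(\"syn1906479\", downloadLocation=\".\", ifcollision=\"overwrite.local\") # replace the local copy\n # syn.get(\"syn1906479\", downloadLocation=\".\", ifcollision=\"keep.both\") # download under a unique name\n\n 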
def _resolve_download_path_collisions(\n self,\n downloadLocation,\n file_name,\n ifcollision,\n synapseCache_location,\n cached_file_path,\n ):\n # always overwrite if we are downloading to .synapseCache\n if utils.normalize_path(downloadLocation) == synapseCache_location:\n if ifcollision is not None and ifcollision != \"overwrite.local\":\n self.logger.warning(\n \"\\n\"\n + \"!\" * 50\n + f\"\\nifcollision={ifcollision} \"\n + \"is being IGNORED because the download destination is Synapse's cache.\"\n ' Instead, the behavior is \"overwrite.local\". \\n' + \"!\" * 50 + \"\\n\"\n )\n ifcollision = \"overwrite.local\"\n # if ifcollision is not specified, default to keep.both\n ifcollision = ifcollision or \"keep.both\"\n\n downloadPath = utils.normalize_path(os.path.join(downloadLocation, file_name))\n # resolve collision\n if os.path.exists(downloadPath):\n if ifcollision == \"overwrite.local\":\n pass\n elif ifcollision == \"keep.local\":\n # Don't want to overwrite the local file.\n return None\n elif ifcollision == \"keep.both\":\n if downloadPath != cached_file_path:\n return utils.unique_filename(downloadPath)\n else:\n raise ValueError(\n 'Invalid parameter: \"%s\" is not a valid value '\n 'for \"ifcollision\"' % ifcollision\n )\n return downloadPath\n\n def store(\n self,\n obj,\n *,\n createOrUpdate=True,\n forceVersion=True,\n versionLabel=None,\n isRestricted=False,\n activity=None,\n used=None,\n executed=None,\n activityName=None,\n activityDescription=None,\n opentelemetry_context=None,\n ):\n \"\"\"\n Creates a new Entity or updates an existing Entity, uploading any files in the process.\n\n Arguments:\n obj: A Synapse Entity, Evaluation, or Wiki\n used: The Entity, Synapse ID, or URL used to create the object (can also be a list of these)\n executed: The Entity, Synapse ID, or URL representing code executed to create the object\n (can also be a list of these)\n activity: Activity object specifying the user's provenance.\n activityName: Activity name to be used in conjunction with *used* and *executed*.\n activityDescription: Activity description to be used in conjunction with *used* and *executed*.\n createOrUpdate: Indicates whether the method should automatically perform an update if the 'obj'\n conflicts with an existing Synapse object. Defaults to True.\n forceVersion: Indicates whether the method should increment the version of the object even if nothing\n has changed. Defaults to True.\n versionLabel: Arbitrary string used to label the version.\n isRestricted: If set to true, an email will be sent to the Synapse access control team to start the\n process of adding terms-of-use or review board approval for this entity.\n You will be contacted with regards to the specific data being restricted and the\n requirements of access.\n opentelemetry_context: OpenTelemetry context to propagate to this function to use for tracing. 
Used\n cases where multi-threaded operations need to be linked to parent spans.\n\n Returns:\n A Synapse Entity, Evaluation, or Wiki\n\n Example: Using this function\n Creating a new Project:\n\n from synapseclient import Project\n\n project = Project('My uniquely named project')\n project = syn.store(project)\n\n Adding files with [provenance (aka: Activity)][synapseclient.Activity]:\n\n from synapseclient import File, Activity\n\n # A synapse entity *syn1906480* contains data\n # entity *syn1917825* contains code\n activity = Activity(\n 'Fancy Processing',\n description='No seriously, really fancy processing',\n used=['syn1906480', 'http://data_r_us.com/fancy/data.txt'],\n executed='syn1917825')\n\n test_entity = File('/path/to/data/file.xyz', description='Fancy new data', parent=project)\n test_entity = syn.store(test_entity, activity=activity)\n\n \"\"\"\n with tracer.start_as_current_span(\n \"Synapse::store\", context=opentelemetry_context\n ):\n trace.get_current_span().set_attributes(\n {\"thread.id\": threading.get_ident()}\n )\n # SYNPY-1031: activity must be Activity object or code will fail later\n if activity:\n if not isinstance(activity, synapseclient.Activity):\n raise ValueError(\"activity should be synapseclient.Activity object\")\n # _before_store hook\n # give objects a chance to do something before being stored\n if hasattr(obj, \"_before_synapse_store\"):\n obj._before_synapse_store(self)\n\n # _synapse_store hook\n # for objects that know how to store themselves\n if hasattr(obj, \"_synapse_store\"):\n return obj._synapse_store(self)\n\n # Handle all non-Entity objects\n if not (isinstance(obj, Entity) or type(obj) == dict):\n if isinstance(obj, Wiki):\n return self._storeWiki(obj, createOrUpdate)\n\n if \"id\" in obj: # If ID is present, update\n trace.get_current_span().set_attributes({\"synapse.id\": obj[\"id\"]})\n return type(obj)(**self.restPUT(obj.putURI(), obj.json()))\n\n try: # If no ID is present, attempt to POST the object\n trace.get_current_span().set_attributes({\"synapse.id\": \"\"})\n return type(obj)(**self.restPOST(obj.postURI(), obj.json()))\n\n except SynapseHTTPError as err:\n # If already present and we want to update attempt to get the object content\n if createOrUpdate and err.response.status_code == 409:\n newObj = self.restGET(obj.getByNameURI(obj.name))\n newObj.update(obj)\n obj = type(obj)(**newObj)\n trace.get_current_span().set_attributes(\n {\"synapse.id\": obj[\"id\"]}\n )\n obj.update(self.restPUT(obj.putURI(), obj.json()))\n return obj\n raise\n\n # If the input object is an Entity or a dictionary\n entity = obj\n properties, annotations, local_state = split_entity_namespaces(entity)\n bundle = None\n # Explicitly set an empty versionComment property if none is supplied,\n # otherwise an existing entity bundle's versionComment will be copied to the update.\n properties[\"versionComment\"] = (\n properties[\"versionComment\"] if \"versionComment\" in properties else None\n )\n\n # Anything with a path is treated as a cache-able item\n # Only Files are expected in the following logic\n if entity.get(\"path\", False) and not isinstance(obj, Folder):\n if \"concreteType\" not in properties:\n properties[\"concreteType\"] = File._synapse_entity_type\n # Make sure the path is fully resolved\n entity[\"path\"] = os.path.expanduser(entity[\"path\"])\n\n # Check if the File already exists in Synapse by fetching metadata on it\n bundle = self._getEntityBundle(entity)\n\n if bundle:\n if createOrUpdate:\n # update our properties from the 
existing bundle so that we have\n # enough to process this as an entity update.\n properties = {**bundle[\"entity\"], **properties}\n\n # Check if the file should be uploaded\n fileHandle = find_data_file_handle(bundle)\n if (\n fileHandle\n and fileHandle[\"concreteType\"]\n == \"org.sagebionetworks.repo.model.file.ExternalFileHandle\"\n ):\n # switching away from ExternalFileHandle or the url was updated\n needs_upload = entity[\"synapseStore\"] or (\n fileHandle[\"externalURL\"] != entity[\"externalURL\"]\n )\n else:\n # Check if we need to upload a new version of an existing\n # file. If the file referred to by entity['path'] has been\n # modified, we want to upload the new version.\n # If synapseStore is false then we must upload an ExternalFileHandle\n needs_upload = not entity[\n \"synapseStore\"\n ] or not self.cache.contains(\n bundle[\"entity\"][\"dataFileHandleId\"], entity[\"path\"]\n )\n elif entity.get(\"dataFileHandleId\", None) is not None:\n needs_upload = False\n else:\n needs_upload = True\n\n if needs_upload:\n local_state_fh = local_state.get(\"_file_handle\", {})\n synapseStore = local_state.get(\"synapseStore\", True)\n fileHandle = upload_file_handle(\n self,\n entity[\"parentId\"],\n local_state[\"path\"]\n if (synapseStore or local_state_fh.get(\"externalURL\") is None)\n else local_state_fh.get(\"externalURL\"),\n synapseStore=synapseStore,\n md5=local_state_fh.get(\"contentMd5\"),\n file_size=local_state_fh.get(\"contentSize\"),\n mimetype=local_state_fh.get(\"contentType\"),\n max_threads=self.max_threads,\n )\n properties[\"dataFileHandleId\"] = fileHandle[\"id\"]\n local_state[\"_file_handle\"] = fileHandle\n\n elif \"dataFileHandleId\" not in properties:\n # Handle the case where the Entity lacks an ID\n # but becomes an update() due to a conflict\n properties[\"dataFileHandleId\"] = bundle[\"entity\"][\n \"dataFileHandleId\"\n ]\n\n # update the file_handle metadata if the FileEntity's FileHandle id has changed\n local_state_fh_id = local_state.get(\"_file_handle\", {}).get(\"id\")\n if (\n local_state_fh_id\n and properties[\"dataFileHandleId\"] != local_state_fh_id\n ):\n local_state[\"_file_handle\"] = find_data_file_handle(\n self._getEntityBundle(\n properties[\"id\"],\n requestedObjects={\n \"includeEntity\": True,\n \"includeFileHandles\": True,\n },\n )\n )\n\n # check if we already have the filehandleid cached somewhere\n cached_path = self.cache.get(properties[\"dataFileHandleId\"])\n if cached_path is None:\n local_state[\"path\"] = None\n local_state[\"cacheDir\"] = None\n local_state[\"files\"] = []\n else:\n local_state[\"path\"] = cached_path\n local_state[\"cacheDir\"] = os.path.dirname(cached_path)\n local_state[\"files\"] = [os.path.basename(cached_path)]\n\n # Create or update Entity in Synapse\n if \"id\" in properties:\n trace.get_current_span().set_attributes(\n {\"synapse.id\": properties[\"id\"]}\n )\n properties = self._updateEntity(properties, forceVersion, versionLabel)\n else:\n # If Link, get the target name, version number and concrete type and store in link properties\n if properties[\"concreteType\"] == \"org.sagebionetworks.repo.model.Link\":\n target_properties = self._getEntity(\n properties[\"linksTo\"][\"targetId\"],\n version=properties[\"linksTo\"].get(\"targetVersionNumber\"),\n )\n if target_properties[\"parentId\"] == properties[\"parentId\"]:\n raise ValueError(\n \"Cannot create a Link to an entity under the same parent.\"\n )\n properties[\"linksToClassName\"] = target_properties[\"concreteType\"]\n if (\n 
target_properties.get(\"versionNumber\") is not None\n and properties[\"linksTo\"].get(\"targetVersionNumber\") is not None\n ):\n properties[\"linksTo\"][\n \"targetVersionNumber\"\n ] = target_properties[\"versionNumber\"]\n properties[\"name\"] = target_properties[\"name\"]\n try:\n properties = self._createEntity(properties)\n except SynapseHTTPError as ex:\n if createOrUpdate and ex.response.status_code == 409:\n # Get the existing Entity's ID via the name and parent\n existing_entity_id = self.findEntityId(\n properties[\"name\"], properties.get(\"parentId\", None)\n )\n if existing_entity_id is None:\n raise\n\n # get existing properties and annotations\n if not bundle:\n bundle = self._getEntityBundle(\n existing_entity_id,\n requestedObjects={\n \"includeEntity\": True,\n \"includeAnnotations\": True,\n },\n )\n\n properties = {**bundle[\"entity\"], **properties}\n\n # we additionally merge the annotations under the assumption that a missing annotation\n # from a resolved conflict represents an newer annotation that should be preserved\n # rather than an intentionally deleted annotation.\n annotations = {\n **from_synapse_annotations(bundle[\"annotations\"]),\n **annotations,\n }\n\n properties = self._updateEntity(\n properties, forceVersion, versionLabel\n )\n\n else:\n raise\n\n # Deal with access restrictions\n if isRestricted:\n self._createAccessRequirementIfNone(properties)\n\n # Update annotations\n if (not bundle and annotations) or (\n bundle and check_annotations_changed(bundle[\"annotations\"], annotations)\n ):\n annotations = self.set_annotations(\n Annotations(properties[\"id\"], properties[\"etag\"], annotations)\n )\n properties[\"etag\"] = annotations.etag\n\n # If the parameters 'used' or 'executed' are given, create an Activity object\n if used or executed:\n if activity is not None:\n raise SynapseProvenanceError(\n \"Provenance can be specified as an Activity object or as used/executed\"\n \" item(s), but not both.\"\n )\n activity = Activity(\n name=activityName,\n description=activityDescription,\n used=used,\n executed=executed,\n )\n\n # If we have an Activity, set it as the Entity's provenance record\n if activity:\n self.setProvenance(properties, activity)\n\n # 'etag' has changed, so get the new Entity\n properties = self._getEntity(properties)\n\n # Return the updated Entity object\n entity = Entity.create(properties, annotations, local_state)\n return_data = self.get(entity, downloadFile=False)\n\n trace.get_current_span().set_attributes(\n {\n \"synapse.id\": return_data.get(\"id\", \"\"),\n \"synapse.concrete_type\": entity.get(\"concreteType\", \"\"),\n }\n )\n return return_data\n\n @tracer.start_as_current_span(\"Synapse::_createAccessRequirementIfNone\")\n def _createAccessRequirementIfNone(self, entity):\n \"\"\"\n Checks to see if the given entity has access requirements.\n If not, then one is added\n \"\"\"\n existingRestrictions = self.restGET(\n \"/entity/%s/accessRequirement?offset=0&limit=1\" % id_of(entity)\n )\n if len(existingRestrictions[\"results\"]) <= 0:\n self.restPOST(\"/entity/%s/lockAccessRequirement\" % id_of(entity), body=\"\")\n\n def _getEntityBundle(self, entity, version=None, requestedObjects=None):\n \"\"\"\n Gets some information about the Entity.\n\n :parameter entity: a Synapse Entity or Synapse ID\n :parameter version: the entity's version (defaults to None meaning most recent version)\n :parameter requestedObjects: A dict indicating settings for what to include\n\n default value for requestedObjects is::\n\n 
requestedObjects = {'includeEntity': True,\n 'includeAnnotations': True,\n 'includeFileHandles': True,\n 'includeRestrictionInformation': True}\n\n Keys available for requestedObjects::\n\n includeEntity\n includeAnnotations\n includePermissions\n includeEntityPath\n includeHasChildren\n includeAccessControlList\n includeFileHandles\n includeTableBundle\n includeRootWikiId\n includeBenefactorACL\n includeDOIAssociation\n includeFileName\n includeThreadCount\n includeRestrictionInformation\n\n\n Keys with values set to False may simply be omitted.\n For example, we might ask for an entity bundle containing file handles, annotations, and properties::\n\n requested_objects = {'includeEntity': True,\n 'includeAnnotations': True,\n 'includeFileHandles': True}\n bundle = syn._getEntityBundle('syn111111', requestedObjects=requested_objects)\n\n :returns: An EntityBundle with the requested fields or by default Entity header, annotations, unmet access\n requirements, and file handles\n \"\"\"\n\n # If 'entity' is given without an ID, try to find it by 'parentId' and 'name'.\n # Use case:\n # If the user forgets to catch the return value of a syn.store(e)\n # this allows them to recover by doing: e = syn.get(e)\n if requestedObjects is None:\n requestedObjects = {\n \"includeEntity\": True,\n \"includeAnnotations\": True,\n \"includeFileHandles\": True,\n \"includeRestrictionInformation\": True,\n }\n if (\n isinstance(entity, collections.abc.Mapping)\n and \"id\" not in entity\n and \"name\" in entity\n ):\n entity = self.findEntityId(entity[\"name\"], entity.get(\"parentId\", None))\n\n # Avoid an exception from finding an ID from a NoneType\n try:\n id_of(entity)\n except ValueError:\n return None\n\n if version is not None:\n uri = f\"/entity/{id_of(entity)}/version/{int(version):d}/bundle2\"\n else:\n uri = f\"/entity/{id_of(entity)}/bundle2\"\n bundle = self.restPOST(uri, body=json.dumps(requestedObjects))\n\n return bundle\n\n @tracer.start_as_current_span(\"Synapse::delete\")\n def delete(self, obj, version=None):\n \"\"\"\n Removes an object from Synapse.\n\n Arguments:\n obj: An existing object stored on Synapse such as Evaluation, File, Project, or Wiki\n version: For entities, specify a particular version to delete.\n \"\"\"\n # Handle all strings as the Entity ID for backward compatibility\n if isinstance(obj, str):\n entity_id = id_of(obj)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n if version:\n self.restDELETE(uri=f\"/entity/{entity_id}/version/{version}\")\n else:\n self.restDELETE(uri=f\"/entity/{entity_id}\")\n elif hasattr(obj, \"_synapse_delete\"):\n return obj._synapse_delete(self)\n else:\n try:\n if isinstance(obj, Versionable):\n self.restDELETE(obj.deleteURI(versionNumber=version))\n else:\n self.restDELETE(obj.deleteURI())\n except AttributeError:\n raise SynapseError(\n f\"Can't delete a {type(obj)}. Please specify a Synapse object or id\"\n )\n\n 
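# Editor's note (hedged example): deleting an entity or a single version of it;\n # the Synapse ID is illustrative only.\n #\n # syn.delete(\"syn1906479\", version=2) # delete only version 2\n # syn.delete(\"syn1906479\") # delete the entire entity\n\n 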
_user_name_cache = {}\n\n def _get_user_name(self, user_id):\n if user_id not in self._user_name_cache:\n self._user_name_cache[user_id] = utils.extract_user_name(\n self.getUserProfile(user_id)\n )\n return self._user_name_cache[user_id]\n\n @tracer.start_as_current_span(\"Synapse::_list\")\n def _list(\n self,\n parent,\n recursive=False,\n long_format=False,\n show_modified=False,\n indent=0,\n out=sys.stdout,\n ):\n \"\"\"\n List child objects of the given parent, recursively if requested.\n \"\"\"\n fields = [\"id\", \"name\", \"nodeType\"]\n if long_format:\n fields.extend([\"createdByPrincipalId\", \"createdOn\", \"versionNumber\"])\n if show_modified:\n fields.extend([\"modifiedByPrincipalId\", \"modifiedOn\"])\n results = self.getChildren(parent)\n\n results_found = False\n for result in results:\n results_found = True\n\n fmt_fields = {\n \"name\": result[\"name\"],\n \"id\": result[\"id\"],\n \"padding\": \" \" * indent,\n \"slash_or_not\": \"/\" if is_container(result) else \"\",\n }\n fmt_string = \"{id}\"\n\n if long_format:\n fmt_fields[\"createdOn\"] = utils.iso_to_datetime(\n result[\"createdOn\"]\n ).strftime(\"%Y-%m-%d %H:%M\")\n fmt_fields[\"createdBy\"] = self._get_user_name(result[\"createdBy\"])[:18]\n fmt_fields[\"version\"] = result[\"versionNumber\"]\n fmt_string += \" {version:3} {createdBy:>18} {createdOn}\"\n if show_modified:\n fmt_fields[\"modifiedOn\"] = utils.iso_to_datetime(\n result[\"modifiedOn\"]\n ).strftime(\"%Y-%m-%d %H:%M\")\n fmt_fields[\"modifiedBy\"] = self._get_user_name(result[\"modifiedBy\"])[\n :18\n ]\n fmt_string += \" {modifiedBy:>18} {modifiedOn}\"\n\n fmt_string += \" {padding}{name}{slash_or_not}\\n\"\n out.write(fmt_string.format(**fmt_fields))\n\n if (indent == 0 or recursive) and is_container(result):\n self._list(\n result[\"id\"],\n recursive=recursive,\n long_format=long_format,\n show_modified=show_modified,\n indent=indent + 2,\n out=out,\n )\n\n if indent == 0 and not results_found:\n out.write(\n \"No results visible to {username} found for id {id}\\n\".format(\n username=self.credentials.username, id=id_of(parent)\n )\n )\n\n def uploadFileHandle(\n self, path, parent, synapseStore=True, mimetype=None, md5=None, file_size=None\n ):\n \"\"\"Uploads the file in the provided path (if necessary) to a storage location based on project settings.\n Returns a new FileHandle as a dict to represent the stored file.\n\n Arguments:\n parent: Parent of the entity to which we upload.\n path: File path to the file being uploaded\n synapseStore: If False, will not upload the file, but instead create an ExternalFileHandle that references\n the file on the local machine.\n If True, will upload the file based on StorageLocation determined by the entity_parent_id\n mimetype: The MIME type metadata for the uploaded file\n md5: The MD5 checksum for the file, if known. Otherwise if the file is a local file, it will be calculated\n automatically.\n file_size: The size of the file, if known. Otherwise if the file is a local file, it will be calculated\n automatically.\n file_type: The MIME type of the file, if known. 
Otherwise if the file is a local file, it will be calculated\n automatically.\n\n Returns:\n A dict of a new FileHandle as a dict that represents the uploaded file\n \"\"\"\n return upload_file_handle(\n self, parent, path, synapseStore, md5, file_size, mimetype\n )\n\n ############################################################\n # Download List #\n ############################################################\n def clear_download_list(self):\n \"\"\"Clear all files from download list\"\"\"\n self.restDELETE(\"/download/list\")\n\n def remove_from_download_list(self, list_of_files: typing.List[typing.Dict]) -> int:\n \"\"\"Remove a batch of files from download list\n\n Arguments:\n list_of_files: Array of files in the format of a mapping {fileEntityId: synid, versionNumber: version}\n\n Returns:\n Number of files removed from download list\n \"\"\"\n request_body = {\"batchToRemove\": list_of_files}\n num_files_removed = self.restPOST(\n \"/download/list/remove\", body=json.dumps(request_body)\n )\n return num_files_removed\n\n def _generate_manifest_from_download_list(\n self,\n quoteCharacter: str = '\"',\n escapeCharacter: str = \"\\\\\",\n lineEnd: str = os.linesep,\n separator: str = \",\",\n header: bool = True,\n ):\n \"\"\"Creates a download list manifest generation request\n\n :param quoteCharacter: The character to be used for quoted elements in the resulting file.\n Defaults to '\"'.\n :param escapeCharacter: The escape character to be used for escaping a separator or quote in the resulting\n file. Defaults to \"\\\".\n :param lineEnd: The line feed terminator to be used for the resulting file. Defaults to os.linesep.\n :param separator: The delimiter to be used for separating entries in the resulting file. Defaults to \",\".\n :param header: Is the first line a header? 
Defaults to True.\n\n :returns: Filehandle of download list manifest\n \"\"\"\n request_body = {\n \"concreteType\": \"org.sagebionetworks.repo.model.download.DownloadListManifestRequest\",\n \"csvTableDescriptor\": {\n \"separator\": separator,\n \"quoteCharacter\": quoteCharacter,\n \"escapeCharacter\": escapeCharacter,\n \"lineEnd\": lineEnd,\n \"isFirstLineHeader\": header,\n },\n }\n return self._waitForAsync(\n uri=\"/download/list/manifest/async\", request=request_body\n )\n\n @tracer.start_as_current_span(\"Synapse::get_download_list_manifest\")\n def get_download_list_manifest(self):\n \"\"\"Get the path of the download list manifest file\n\n Returns:\n Path of download list manifest file\n \"\"\"\n manifest = self._generate_manifest_from_download_list()\n # Get file handle download link\n file_result = self._getFileHandleDownload(\n fileHandleId=manifest[\"resultFileHandleId\"],\n objectId=manifest[\"resultFileHandleId\"],\n objectType=\"FileEntity\",\n )\n # Download the manifest\n downloaded_path = self._download_from_URL(\n url=file_result[\"preSignedURL\"],\n destination=\"./\",\n fileHandleId=file_result[\"fileHandleId\"],\n expected_md5=file_result[\"fileHandle\"].get(\"contentMd5\"),\n )\n trace.get_current_span().set_attributes(\n {\"synapse.file_handle_id\": file_result[\"fileHandleId\"]}\n )\n return downloaded_path\n\n @tracer.start_as_current_span(\"Synapse::get_download_list\")\n def get_download_list(self, downloadLocation: str = None) -> str:\n \"\"\"Download all files from your Synapse download list\n\n Arguments:\n downloadLocation: Directory to download files to.\n\n Returns:\n Manifest file with file paths\n \"\"\"\n dl_list_path = self.get_download_list_manifest()\n downloaded_files = []\n new_manifest_path = f\"manifest_{time.time_ns()}.csv\"\n with open(dl_list_path) as manifest_f, open(\n new_manifest_path, \"w\"\n ) as write_obj:\n reader = csv.DictReader(manifest_f)\n columns = reader.fieldnames\n columns.extend([\"path\", \"error\"])\n # Write the downloaded paths to a new manifest file\n writer = csv.DictWriter(write_obj, fieldnames=columns)\n writer.writeheader()\n\n for row in reader:\n # You can add things to the download list that you don't have access to\n # So there must be a try catch here\n try:\n entity = self.get(row[\"ID\"], downloadLocation=downloadLocation)\n # Must include version number because you can have multiple versions of a\n # file in the download list\n downloaded_files.append(\n {\n \"fileEntityId\": row[\"ID\"],\n \"versionNumber\": row[\"versionNumber\"],\n }\n )\n row[\"path\"] = entity.path\n row[\"error\"] = \"\"\n except Exception:\n row[\"path\"] = \"\"\n row[\"error\"] = \"DOWNLOAD FAILED\"\n self.logger.error(\"Unable to download file\")\n writer.writerow(row)\n\n # Don't want to clear all the download list because you can add things\n # to the download list after initiating this command.\n # Files that failed to download should not be removed from download list\n # Remove all files from download list after the entire download is complete.\n # This is because if download fails midway, we want to return the full manifest\n if downloaded_files:\n # Only want to invoke this if there is a list of files to remove\n # or the API call will error\n self.remove_from_download_list(list_of_files=downloaded_files)\n else:\n self.logger.warning(\"A manifest was created, but no files were downloaded\")\n\n # Always remove original manifest file\n os.remove(dl_list_path)\n\n return new_manifest_path\n\n 
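# Editor's note (hedged example): a minimal sketch of the download-list workflow\n # implemented above; the download location is illustrative, and files are assumed\n # to have been added to the download list beforehand (e.g. via the web UI).\n #\n # manifest_path = syn.get_download_list(downloadLocation=\"./downloads\")\n # print(manifest_path) # CSV manifest recording each file's path or download error\n # syn.clear_download_list() # optionally empty the list afterwards\n\n 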
############################################################\n # Get / Set Annotations #\n ############################################################\n\n def _getRawAnnotations(self, entity, version=None):\n \"\"\"\n Retrieve annotations for an Entity returning them in the native Synapse format.\n \"\"\"\n # Note: Specifying the version results in a zero-ed out etag,\n # even if the version is the most recent.\n # See `PLFM-1874 <https://sagebionetworks.jira.com/browse/PLFM-1874>`_ for more details.\n if version:\n uri = f\"/entity/{id_of(entity)}/version/{str(version)}/annotations2\"\n else:\n uri = f\"/entity/{id_of(entity)}/annotations2\"\n return self.restGET(uri)\n\n @deprecated.sphinx.deprecated(\n version=\"2.1.0\",\n reason=\"deprecated and replaced with :py:meth:`get_annotations`\",\n )\n def getAnnotations(self, entity, version=None):\n \"\"\"deprecated and replaced with :py:meth:`get_annotations`\"\"\"\n return self.get_annotations(entity, version=version)\n\n @tracer.start_as_current_span(\"Synapse::get_annotations\")\n def get_annotations(\n self, entity: typing.Union[str, Entity], version: typing.Union[str, int] = None\n ) -> Annotations:\n \"\"\"\n Retrieve annotations for an Entity from the Synapse Repository as a Python dict.\n\n Note that collapsing annotations from the native Synapse format to a Python dict may involve some loss of\n information. See `_getRawAnnotations` to get annotations in the native format.\n\n Arguments:\n entity: An Entity or Synapse ID to lookup\n version: The version of the Entity to retrieve.\n\n Returns:\n A [synapseclient.annotations.Annotations][] object, a dict that also has id and etag attributes\n \"\"\"\n return from_synapse_annotations(self._getRawAnnotations(entity, version))\n\n @deprecated.sphinx.deprecated(\n version=\"2.1.0\",\n reason=\"deprecated and replaced with :py:meth:`set_annotations` \"\n \"This method is UNSAFE and may overwrite existing annotations\"\n \" without confirming that you have retrieved and\"\n \" updated the latest annotations\",\n )\n def setAnnotations(self, entity, annotations=None, **kwargs):\n \"\"\"\n Store annotations for an Entity in the Synapse Repository.\n\n :param entity: The Entity or Synapse Entity ID whose annotations are to be updated\n :param annotations: A dictionary of annotation names and values\n :param kwargs: annotation names and values\n :returns: the updated annotations for the entity\n\n \"\"\"\n if not annotations:\n annotations = {}\n\n annotations.update(kwargs)\n\n id = id_of(entity)\n trace.get_current_span().set_attributes({\"synapse.id\": id})\n etag = (\n annotations.etag\n if hasattr(annotations, \"etag\")\n else annotations.get(\"etag\")\n )\n\n if not etag:\n if \"etag\" in entity:\n etag = entity[\"etag\"]\n else:\n uri = \"/entity/%s/annotations2\" % id_of(entity)\n old_annos = self.restGET(uri)\n etag = old_annos[\"etag\"]\n\n return self.set_annotations(Annotations(id, etag, annotations))\n\n @tracer.start_as_current_span(\"Synapse::set_annotations\")\n def set_annotations(self, annotations: Annotations):\n \"\"\"\n Store annotations for an Entity in the Synapse Repository.\n\n Arguments:\n annotations: A [synapseclient.annotations.Annotations][] of annotation names and values,\n with the id and etag attribute set\n\n Returns:\n The updated [synapseclient.annotations.Annotations][] for the entity\n\n Example: Using this function\n Getting annotations, adding a new annotation, and updating the annotations:\n\n annos = syn.get_annotations('syn123')\n\n # annos will 
contain the id and etag associated with the entity upon retrieval\n print(annos.id)\n # syn123\n print(annos.etag)\n # 7bdb83e9-a50a-46e4-987a-4962559f090f (Usually some UUID in the form of a string)\n\n # returned annos object from get_annotations() can be used as if it were a dict\n\n # set key 'foo' to have value of 'bar' and 'baz'\n annos['foo'] = ['bar', 'baz']\n\n # single values will automatically be wrapped in a list once stored\n annos['qwerty'] = 'asdf'\n\n # store the annotations\n annos = syn.set_annotations(annos)\n\n print(annos)\n # {'foo':['bar','baz'], 'qwerty':['asdf']}\n \"\"\"\n\n if not isinstance(annotations, Annotations):\n raise TypeError(\"Expected a synapseclient.Annotations object\")\n\n synapseAnnos = to_synapse_annotations(annotations)\n\n entity_id = id_of(annotations)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n return from_synapse_annotations(\n self.restPUT(\n f\"/entity/{entity_id}/annotations2\",\n body=json.dumps(synapseAnnos),\n )\n )\n\n ############################################################\n # Querying #\n ############################################################\n\n 
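# Editor's note (hedged example): iterating over a container's children with\n # getChildren(); the Synapse ID is illustrative only.\n #\n # for child in syn.getChildren(\"syn123\", includeTypes=[\"folder\", \"file\"]):\n # print(child[\"id\"], child[\"name\"])\n\n 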
@tracer.start_as_current_span(\"Synapse::getChildren\")\n def getChildren(\n self,\n parent,\n includeTypes=[\n \"folder\",\n \"file\",\n \"table\",\n \"link\",\n \"entityview\",\n \"dockerrepo\",\n \"submissionview\",\n \"dataset\",\n \"materializedview\",\n ],\n sortBy=\"NAME\",\n sortDirection=\"ASC\",\n ):\n \"\"\"\n Retrieves all of the entities stored within a parent, such as a folder or project.\n\n Arguments:\n parent: An id or an object of a Synapse container or None to retrieve all projects\n includeTypes: Must be a list of entity types (i.e. [\"folder\",\"file\"]) which can be found here:\n http://docs.synapse.org/rest/org/sagebionetworks/repo/model/EntityType.html\n sortBy: How results should be sorted. Can be NAME or CREATED_ON\n sortDirection: The direction of the result sort. Can be ASC or DESC\n\n Yields:\n An iterator that shows all the children of the container.\n\n Also see:\n\n - [synapseutils.walk][]\n \"\"\"\n parentId = id_of(parent) if parent is not None else None\n\n trace.get_current_span().set_attributes({\"synapse.parent_id\": parentId})\n entityChildrenRequest = {\n \"parentId\": parentId,\n \"includeTypes\": includeTypes,\n \"sortBy\": sortBy,\n \"sortDirection\": sortDirection,\n \"nextPageToken\": None,\n }\n entityChildrenResponse = {\"nextPageToken\": \"first\"}\n while entityChildrenResponse.get(\"nextPageToken\") is not None:\n entityChildrenResponse = self.restPOST(\n \"/entity/children\", body=json.dumps(entityChildrenRequest)\n )\n for child in entityChildrenResponse[\"page\"]:\n yield child\n if entityChildrenResponse.get(\"nextPageToken\") is not None:\n entityChildrenRequest[\"nextPageToken\"] = entityChildrenResponse[\n \"nextPageToken\"\n ]\n\n @tracer.start_as_current_span(\"Synapse::md5Query\")\n def md5Query(self, md5):\n \"\"\"\n Find the Entities which have attached file(s) which have the given MD5 hash.\n\n Arguments:\n md5: The MD5 to query for (hexadecimal string)\n\n Returns:\n A list of Entity headers\n \"\"\"\n\n return self.restGET(\"/entity/md5/%s\" % md5)[\"results\"]\n\n ############################################################\n # ACL manipulation #\n ############################################################\n\n def _getBenefactor(self, entity):\n \"\"\"An Entity gets its ACL from its benefactor.\"\"\"\n\n if utils.is_synapse_id_str(entity) or is_synapse_entity(entity):\n return self.restGET(\"/entity/%s/benefactor\" % id_of(entity))\n return entity\n\n def _getACL(self, entity):\n \"\"\"Get the effective ACL for a Synapse Entity.\"\"\"\n\n if hasattr(entity, \"getACLURI\"):\n uri = entity.getACLURI()\n else:\n # Get the ACL from the benefactor (which may be the entity itself)\n benefactor = self._getBenefactor(entity)\n trace.get_current_span().set_attributes({\"synapse.id\": benefactor[\"id\"]})\n uri = \"/entity/%s/acl\" % (benefactor[\"id\"])\n return self.restGET(uri)\n\n def _storeACL(self, entity, acl):\n \"\"\"\n Create or update the ACL for a Synapse Entity.\n\n :param entity: An entity or Synapse ID\n :param acl: An ACL as a dict\n\n :returns: the new or updated ACL\n\n .. code-block:: python\n\n {'resourceAccess': [\n {'accessType': ['READ'],\n 'principalId': 222222}\n ]}\n \"\"\"\n if hasattr(entity, \"putACLURI\"):\n return self.restPUT(entity.putACLURI(), json.dumps(acl))\n else:\n # Get benefactor. 
(An entity gets its ACL from its benefactor.)\n entity_id = id_of(entity)\n uri = \"/entity/%s/benefactor\" % entity_id\n benefactor = self.restGET(uri)\n\n # Update or create new ACL\n uri = \"/entity/%s/acl\" % entity_id\n if benefactor[\"id\"] == entity_id:\n return self.restPUT(uri, json.dumps(acl))\n else:\n return self.restPOST(uri, json.dumps(acl))\n\n def _getUserbyPrincipalIdOrName(self, principalId: str = None):\n \"\"\"\n Given either a string, int or None finds the corresponding user where None implies PUBLIC\n\n :param principalId: Identifier of a user or group\n\n :returns: The integer ID of the user\n \"\"\"\n if principalId is None or principalId == \"PUBLIC\":\n return PUBLIC\n try:\n return int(principalId)\n\n # If principalId is not a number assume it is a name or email\n except ValueError:\n userProfiles = self.restGET(\"/userGroupHeaders?prefix=%s\" % principalId)\n totalResults = len(userProfiles[\"children\"])\n if totalResults == 1:\n return int(userProfiles[\"children\"][0][\"ownerId\"])\n elif totalResults > 1:\n for profile in userProfiles[\"children\"]:\n if profile[\"userName\"] == principalId:\n return int(profile[\"ownerId\"])\n\n supplementalMessage = (\n \"Please be more specific\" if totalResults > 1 else \"No matches\"\n )\n raise SynapseError(\n \"Unknown Synapse user (%s). %s.\" % (principalId, supplementalMessage)\n )\n\n @tracer.start_as_current_span(\"Synapse::getPermissions\")\n def getPermissions(\n self,\n entity: Union[Entity, Evaluation, str, collections.abc.Mapping],\n principalId: str = None,\n ):\n \"\"\"Get the permissions that a user or group has on an Entity.\n Arguments:\n entity: An Entity or Synapse ID to lookup\n principalId: Identifier of a user or group (defaults to PUBLIC users)\n\n Returns:\n An array containing some combination of\n ['READ', 'CREATE', 'UPDATE', 'DELETE', 'CHANGE_PERMISSIONS', 'DOWNLOAD']\n or an empty array\n \"\"\"\n principal_id = self._getUserbyPrincipalIdOrName(principalId)\n\n trace.get_current_span().set_attributes(\n {\"synapse.id\": id_of(entity), \"synapse.principal_id\": principal_id}\n )\n\n acl = self._getACL(entity)\n\n team_list = self._find_teams_for_principal(principal_id)\n team_ids = [int(team.id) for team in team_list]\n effective_permission_set = set()\n\n # This user_profile_bundle is being used to verify that the principal_id is a registered user of the system\n user_profile_bundle = self._get_user_bundle(principal_id, 1)\n\n # Loop over all permissions in the returned ACL and add it to the effective_permission_set\n # if the principalId in the ACL matches\n # 1) the one we are looking for,\n # 2) a team the entity is a member of,\n # 3) PUBLIC\n # 4) A user_profile_bundle exists for the principal_id\n for permissions in acl[\"resourceAccess\"]:\n if \"principalId\" in permissions and (\n permissions[\"principalId\"] == principal_id\n or permissions[\"principalId\"] in team_ids\n or permissions[\"principalId\"] == PUBLIC\n or (\n permissions[\"principalId\"] == AUTHENTICATED_USERS\n and user_profile_bundle is not None\n )\n ):\n effective_permission_set = effective_permission_set.union(\n permissions[\"accessType\"]\n )\n return list(effective_permission_set)\n\n @tracer.start_as_current_span(\"Synapse::setPermissions\")\n def setPermissions(\n self,\n entity,\n principalId=None,\n accessType=[\"READ\", \"DOWNLOAD\"],\n modify_benefactor=False,\n warn_if_inherits=True,\n overwrite=True,\n ):\n \"\"\"\n Sets permission that a user or group has on an Entity.\n An Entity may have its own ACL or 
inherit its ACL from a benefactor.\n\n Arguments:\n entity: An Entity or Synapse ID to modify\n principalId: Identifier of a user or group. '273948' is for all registered Synapse users\n and '273949' is for public access.\n accessType: Type of permission to be granted. One or more of CREATE, READ, DOWNLOAD, UPDATE,\n DELETE, CHANGE_PERMISSIONS\n modify_benefactor: Set as True when modifying a benefactor's ACL\n warn_if_inherits: Set as False, when creating a new ACL.\n Trying to modify the ACL of an Entity that inherits its ACL will result in a warning\n overwrite: By default this function overwrites existing permissions for the specified user.\n Set this flag to False to add new permissions non-destructively.\n\n Returns:\n An Access Control List object\n\n Example: Using this function\n Setting permissions\n\n # Grant all registered users download access\n syn.setPermissions('syn1234','273948',['READ','DOWNLOAD'])\n # Grant the public view access\n syn.setPermissions('syn1234','273949',['READ'])\n \"\"\"\n entity_id = id_of(entity)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n benefactor = self._getBenefactor(entity)\n if benefactor[\"id\"] != entity_id:\n if modify_benefactor:\n entity = benefactor\n elif warn_if_inherits:\n self.logger.warning(\n \"Creating an ACL for entity %s, which formerly inherited access control from a\"\n ' benefactor entity, \"%s\" (%s).\\n'\n % (entity_id, benefactor[\"name\"], benefactor[\"id\"])\n )\n\n acl = self._getACL(entity)\n\n principalId = self._getUserbyPrincipalIdOrName(principalId)\n\n # Find existing permissions\n permissions_to_update = None\n for permissions in acl[\"resourceAccess\"]:\n if (\n \"principalId\" in permissions\n and permissions[\"principalId\"] == principalId\n ):\n permissions_to_update = permissions\n break\n\n if accessType is None or accessType == []:\n # remove permissions\n if permissions_to_update and overwrite:\n acl[\"resourceAccess\"].remove(permissions_to_update)\n else:\n # add a 'resourceAccess' entry, if necessary\n if not permissions_to_update:\n permissions_to_update = {\"accessType\": [], \"principalId\": principalId}\n acl[\"resourceAccess\"].append(permissions_to_update)\n if overwrite:\n permissions_to_update[\"accessType\"] = accessType\n else:\n permissions_to_update[\"accessType\"] = list(\n set(permissions_to_update[\"accessType\"]) | set(accessType)\n )\n return self._storeACL(entity, acl)\n\n ############################################################\n # Provenance #\n ############################################################\n\n # TODO: rename these to Activity\n @tracer.start_as_current_span(\"Synapse::getProvenance\")\n def getProvenance(self, entity, version=None) -> Activity:\n \"\"\"\n Retrieve provenance information for a Synapse Entity.\n\n Arguments:\n entity: An Entity or Synapse ID to lookup\n version: The version of the Entity to retrieve. 
Gets the most recent version if omitted\n\n Returns:\n An Activity object or raises exception if no provenance record exists\n\n Raises:\n SynapseHTTPError: if no provenance record exists\n \"\"\"\n\n # Get versionNumber from Entity\n if version is None and \"versionNumber\" in entity:\n version = entity[\"versionNumber\"]\n entity_id = id_of(entity)\n if version:\n uri = \"/entity/%s/version/%d/generatedBy\" % (entity_id, version)\n else:\n uri = \"/entity/%s/generatedBy\" % entity_id\n\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n return Activity(data=self.restGET(uri))\n\n @tracer.start_as_current_span(\"Synapse::setProvenance\")\n def setProvenance(self, entity, activity) -> Activity:\n \"\"\"\n Stores a record of the code and data used to derive a Synapse entity.\n\n Arguments:\n entity: An Entity or Synapse ID to modify\n activity: A [synapseclient.activity.Activity][]\n\n Returns:\n An updated [synapseclient.activity.Activity][] object\n \"\"\"\n\n # Assert that the entity was generated by a given Activity.\n activity = self._saveActivity(activity)\n\n entity_id = id_of(entity)\n # assert that an entity is generated by an activity\n uri = \"/entity/%s/generatedBy?generatedBy=%s\" % (entity_id, activity[\"id\"])\n activity = Activity(data=self.restPUT(uri))\n\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n return activity\n\n @tracer.start_as_current_span(\"Synapse::deleteProvenance\")\n def deleteProvenance(self, entity) -> None:\n \"\"\"\n Removes provenance information from an Entity and deletes the associated Activity.\n\n Arguments:\n entity: An Entity or Synapse ID to modify\n \"\"\"\n\n activity = self.getProvenance(entity)\n if not activity:\n return\n entity_id = id_of(entity)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n uri = \"/entity/%s/generatedBy\" % entity_id\n self.restDELETE(uri)\n\n # TODO: what happens if the activity is shared by more than one entity?\n uri = \"/activity/%s\" % activity[\"id\"]\n self.restDELETE(uri)\n\n def _saveActivity(self, activity):\n if \"id\" in activity:\n # We're updating provenance\n uri = \"/activity/%s\" % activity[\"id\"]\n activity = Activity(data=self.restPUT(uri, json.dumps(activity)))\n else:\n activity = self.restPOST(\"/activity\", body=json.dumps(activity))\n return activity\n\n @tracer.start_as_current_span(\"Synapse::updateActivity\")\n def updateActivity(self, activity):\n \"\"\"\n Modifies an existing Activity.\n\n Arguments:\n activity: The Activity to be updated.\n\n Returns:\n An updated Activity object\n \"\"\"\n if \"id\" not in activity:\n raise ValueError(\"The activity you want to update must exist on Synapse\")\n trace.get_current_span().set_attributes({\"synapse.id\": activity[\"id\"]})\n return self._saveActivity(activity)\n\n def _convertProvenanceList(self, usedList, limitSearch=None):\n \"\"\"Convert a list of synapse Ids, URLs and local files by replacing local files with Synapse Ids\"\"\"\n if usedList is None:\n return None\n usedList = [\n self.get(target, limitSearch=limitSearch)\n if (os.path.isfile(target) if isinstance(target, str) else False)\n else target\n for target in usedList\n ]\n return usedList\n\n ############################################################\n # File handle service calls #\n ############################################################\n\n def _getFileHandleDownload(self, fileHandleId, objectId, objectType=None):\n \"\"\"\n Gets the URL and the metadata as filehandle object for a filehandle or 
fileHandleId\n\n :param fileHandleId: ID of fileHandle to download\n :param objectId: The ID of the object associated with the file e.g. syn234\n :param objectType: Type of object associated with a file e.g. FileEntity, TableEntity\n\n :returns: dictionary with keys: fileHandle, fileHandleId and preSignedURL\n \"\"\"\n body = {\n \"includeFileHandles\": True,\n \"includePreSignedURLs\": True,\n \"requestedFiles\": [\n {\n \"fileHandleId\": fileHandleId,\n \"associateObjectId\": objectId,\n \"associateObjectType\": objectType or \"FileEntity\",\n }\n ],\n }\n response = self.restPOST(\n \"/fileHandle/batch\", body=json.dumps(body), endpoint=self.fileHandleEndpoint\n )\n result = response[\"requestedFiles\"][0]\n failure = result.get(\"failureCode\")\n if failure == \"NOT_FOUND\":\n raise SynapseFileNotFoundError(\n \"The fileHandleId %s could not be found\" % fileHandleId\n )\n elif failure == \"UNAUTHORIZED\":\n raise SynapseError(\n \"You are not authorized to access fileHandleId %s associated with the Synapse\"\n \" %s: %s\" % (fileHandleId, objectType, objectId)\n )\n return result\n\n @staticmethod\n def _is_retryable_download_error(ex):\n # some exceptions caught during download indicate non-recoverable situations that\n # will not be remedied by a repeated download attempt.\n return not (\n (isinstance(ex, OSError) and ex.errno == errno.ENOSPC)\n or isinstance(ex, SynapseMd5MismatchError) # out of disk space\n )\n\n @tracer.start_as_current_span(\"Synapse::_downloadFileHandle\")\n def _downloadFileHandle(\n self, fileHandleId, objectId, objectType, destination, retries=5\n ):\n \"\"\"\n Download a file from the given URL to the local file system.\n\n :param fileHandleId: id of the FileHandle to download\n :param objectId: id of the Synapse object that uses the FileHandle e.g. \"syn123\"\n :param objectType: type of the Synapse object that uses the FileHandle e.g. 
\"FileEntity\"\n :param destination: destination on local file system\n :param retries: (default=5) Number of download retries attempted before throwing an exception.\n\n :returns: path to downloaded file\n \"\"\"\n os.makedirs(os.path.dirname(destination), exist_ok=True)\n\n while retries > 0:\n try:\n fileResult = self._getFileHandleDownload(\n fileHandleId, objectId, objectType\n )\n fileHandle = fileResult[\"fileHandle\"]\n concreteType = fileHandle[\"concreteType\"]\n storageLocationId = fileHandle.get(\"storageLocationId\")\n\n if concreteType == concrete_types.EXTERNAL_OBJECT_STORE_FILE_HANDLE:\n profile = self._get_client_authenticated_s3_profile(\n fileHandle[\"endpointUrl\"], fileHandle[\"bucket\"]\n )\n downloaded_path = S3ClientWrapper.download_file(\n fileHandle[\"bucket\"],\n fileHandle[\"endpointUrl\"],\n fileHandle[\"fileKey\"],\n destination,\n profile_name=profile,\n show_progress=not self.silent,\n )\n\n elif (\n sts_transfer.is_boto_sts_transfer_enabled(self)\n and sts_transfer.is_storage_location_sts_enabled(\n self, objectId, storageLocationId\n )\n and concreteType == concrete_types.S3_FILE_HANDLE\n ):\n\n def download_fn(credentials):\n return S3ClientWrapper.download_file(\n fileHandle[\"bucketName\"],\n None,\n fileHandle[\"key\"],\n destination,\n credentials=credentials,\n show_progress=not self.silent,\n # pass through our synapse threading config to boto s3\n transfer_config_kwargs={\n \"max_concurrency\": self.max_threads\n },\n )\n\n downloaded_path = sts_transfer.with_boto_sts_credentials(\n download_fn,\n self,\n objectId,\n \"read_only\",\n )\n\n elif (\n self.multi_threaded\n and concreteType == concrete_types.S3_FILE_HANDLE\n and fileHandle.get(\"contentSize\", 0)\n > multithread_download.SYNAPSE_DEFAULT_DOWNLOAD_PART_SIZE\n ):\n # run the download multi threaded if the file supports it, we're configured to do so,\n # and the file is large enough that it would be broken into parts to take advantage of\n # multiple downloading threads. 
otherwise it's more efficient to run the download as a simple\n
                    # single threaded URL download.\n
                    downloaded_path = self._download_from_url_multi_threaded(\n
                        fileHandleId,\n
                        objectId,\n
                        objectType,\n
                        destination,\n
                        expected_md5=fileHandle.get(\"contentMd5\"),\n
                    )\n\n
                else:\n
                    downloaded_path = self._download_from_URL(\n
                        fileResult[\"preSignedURL\"],\n
                        destination,\n
                        fileHandle[\"id\"],\n
                        expected_md5=fileHandle.get(\"contentMd5\"),\n
                    )\n
                self.cache.add(fileHandle[\"id\"], downloaded_path)\n
                return downloaded_path\n\n
            except Exception as ex:\n
                if not self._is_retryable_download_error(ex):\n
                    raise\n\n
                exc_info = sys.exc_info()\n
                ex.progress = 0 if not hasattr(ex, \"progress\") else ex.progress\n
                self.logger.debug(\n
                    \"\\nRetrying download on error: [%s] after progressing %i bytes\"\n
                    % (exc_info[0], ex.progress),\n
                    exc_info=True,\n
                )  # this will include stack trace\n
                if ex.progress == 0:  # No progress was made; reduce remaining retries.\n
                    retries -= 1\n
                if retries <= 0:\n
                    # Re-raise exception\n
                    raise\n\n
        raise Exception(\"should not reach this line\")\n\n
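    # Example (illustrative sketch): the retry and multi-threaded logic above is\n
    # internal; from client code a download is normally triggered through the\n
    # public `syn.get` call. The ID \"syn1899498\" and the local path are\n
    # hypothetical placeholders.\n
    #\n
    #     import synapseclient\n
    #\n
    #     syn = synapseclient.login()\n
    #     entity = syn.get(\"syn1899498\", downloadLocation=\"/tmp/downloads\")\n
    #     print(entity.path)  # local path of the downloaded file\n\n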
    @tracer.start_as_current_span(\"Synapse::_download_from_url_multi_threaded\")\n
    def _download_from_url_multi_threaded(\n
        self, file_handle_id, object_id, object_type, destination, *, expected_md5=None\n
    ):\n
        destination = os.path.abspath(destination)\n
        temp_destination = utils.temp_download_filename(destination, file_handle_id)\n\n
        request = multithread_download.DownloadRequest(\n
            file_handle_id=int(file_handle_id),\n
            object_id=object_id,\n
            object_type=object_type,\n
            path=temp_destination,\n
        )\n\n
        multithread_download.download_file(self, request)\n\n
        if expected_md5:  # check the md5 if one was expected\n
            actual_md5 = utils.md5_for_file(temp_destination).hexdigest()\n
            if actual_md5 != expected_md5:\n
                try:\n
                    os.remove(temp_destination)\n
                except FileNotFoundError:\n
                    # file already does not exist, nothing to do\n
                    pass\n
                raise SynapseMd5MismatchError(\n
                    \"Downloaded file {filename}'s md5 {md5} does not match expected MD5 of\"\n
                    \" {expected_md5}\".format(\n
                        filename=temp_destination,\n
                        md5=actual_md5,\n
                        expected_md5=expected_md5,\n
                    )\n
                )\n
        # once download completed, rename to desired destination\n
        shutil.move(temp_destination, destination)\n\n
        return destination\n\n
    def _is_synapse_uri(self, uri):\n
        # check whether the given uri is hosted at the configured synapse repo endpoint\n
        uri_domain = urllib_urlparse.urlparse(uri).netloc\n
        synapse_repo_domain = urllib_urlparse.urlparse(self.repoEndpoint).netloc\n
        return uri_domain.lower() == synapse_repo_domain.lower()\n\n
    @tracer.start_as_current_span(\"Synapse::_download_from_URL\")\n
    def _download_from_URL(\n
        self, url, destination, fileHandleId=None, expected_md5=None\n
    ):\n
        \"\"\"\n
        Download a file from the given URL to the local file system.\n\n
        :param url: source of download\n
        :param destination: destination on local file system\n
        :param fileHandleId: (optional) if given, the file will be given a temporary name that includes the file\n
                             handle id which allows resuming partial downloads of the same file from previous\n
                             sessions\n
        :param expected_md5: (optional) if given, check that the MD5 of the downloaded file matches the expected MD5\n\n
        :returns: path to downloaded file\n
        \"\"\"\n
        destination = os.path.abspath(destination)\n
        actual_md5 = None\n
        redirect_count = 0\n
        delete_on_md5_mismatch = True\n
        self.logger.debug(f\"Downloading from {url} to {destination}\")\n
        while redirect_count < REDIRECT_LIMIT:\n
            redirect_count += 1\n
            scheme = urllib_urlparse.urlparse(url).scheme\n
            if scheme == \"file\":\n
                delete_on_md5_mismatch = False\n
                destination = utils.file_url_to_path(url, verify_exists=True)\n
                if destination is None:\n
                    raise IOError(\"Local file (%s) does not exist.\" % url)\n
                break\n
            elif scheme == \"sftp\":\n
                username, password = self._getUserCredentials(url)\n
                destination = SFTPWrapper.download_file(\n
                    url, destination, username, password, show_progress=not self.silent\n
                )\n
                break\n
            elif scheme == \"ftp\":\n
                transfer_start_time = time.time()\n\n
                def _ftp_report_hook(\n
                    block_number: int, read_size: int, total_size: int\n
                ) -> None:\n
                    show_progress = not self.silent\n
                    if show_progress:\n
                        self._print_transfer_progress(\n
                            transferred=block_number * read_size,\n
                            toBeTransferred=total_size,\n
                            prefix=\"Downloading \",\n
                            postfix=os.path.basename(destination),\n
                            dt=time.time() - transfer_start_time,\n
                        )\n\n
                urllib_request.urlretrieve(\n
                    url=url, filename=destination, reporthook=_ftp_report_hook\n
                )\n
                break\n
            elif scheme == \"http\" or scheme == \"https\":\n
                # if a partial download exists with the temporary name, resume it\n
                # by requesting only the remaining bytes via a Range header\n
                temp_destination = utils.temp_download_filename(\n
                    destination, fileHandleId\n
                )\n
                range_header = (\n
                    {\n
                        \"Range\": \"bytes={start}-\".format(\n
                            start=os.path.getsize(temp_destination)\n
                        )\n
                    }\n
                    if os.path.exists(temp_destination)\n
                    else {}\n
                )\n\n
                # pass along synapse auth credentials only if downloading directly from synapse\n
                auth = self.credentials if self._is_synapse_uri(url) else None\n
                response = with_retry(\n
                    lambda: self._requests_session.get(\n
                        url,\n
                        headers=self._generate_headers(range_header),\n
                        stream=True,\n
                        allow_redirects=False,\n
                        auth=auth,\n
                    ),\n
                    verbose=self.debug,\n
                    **STANDARD_RETRY_PARAMS,\n
                )\n
                try:\n
                    exceptions._raise_for_status(response, verbose=self.debug)\n
                except SynapseHTTPError as err:\n
                    if err.response.status_code == 404:\n
                        raise SynapseError(\"Could not download the file at %s\" % url)\n
                    elif (\n
                        err.response.status_code == 416\n
                    ):  # Requested Range Not Satisfiable\n
                        # this is a weird error when the client already finished downloading but the loop continues\n
                        # When this exception occurs, the range we request is guaranteed to be >= file size so we\n
                        # assume that the file has been fully downloaded, rename it to destination file\n
                        # and break out of the loop to perform the MD5 check.\n
                        # If it fails the user can retry with another download.\n
                        shutil.move(temp_destination, destination)\n
                        break\n
                    raise\n\n
                # handle redirects\n
                if response.status_code in [301, 302, 303, 307, 308]:\n
                    url = response.headers[\"location\"]\n
                    # don't break, loop again\n
                else:\n
                    # get filename from content-disposition, if we don't have it already\n
                    if os.path.isdir(destination):\n
                        filename = utils.extract_filename(\n
                            content_disposition_header=response.headers.get(\n
                                \"content-disposition\", None\n
                            ),\n
                            default_filename=utils.guess_file_name(url),\n
                        )\n
                        destination = os.path.join(destination, filename)\n
                    # Stream the file to disk\n
                    if \"content-length\" in response.headers:\n
                        toBeTransferred = float(response.headers[\"content-length\"])\n
                    else:\n
                        toBeTransferred = -1\n
                    transferred = 0\n\n
                    # Servers that respect the Range header return 206 Partial Content\n
                    if response.status_code == 206:\n
                        mode = \"ab\"\n
                        previouslyTransferred = os.path.getsize(temp_destination)\n
                        toBeTransferred += previouslyTransferred\n
                        transferred += previouslyTransferred\n
                        sig = utils.md5_for_file(temp_destination)\n
                    else:\n
                        mode = \"wb\"\n
                        previouslyTransferred = 0\n
                        sig = hashlib.new(\"md5\", usedforsecurity=False)\n\n
                    try:\n
                        with open(temp_destination, mode) as fd:\n
                            t0 = time.time()\n
                            for nChunks, chunk in enumerate(\n
                                response.iter_content(FILE_BUFFER_SIZE)\n
                            ):\n
                                fd.write(chunk)\n
                                sig.update(chunk)\n\n
                                # the 'content-length' header gives the total number of bytes that will be transferred\n
                                # to us. len(chunk) cannot be used to track progress because iter_content automatically\n
                                # decodes the chunks if the response body is encoded, so len(chunk) could be\n
                                # different from the total number of bytes we've read from the response body.\n
                                # response.raw.tell() is the total number of response body bytes transferred over the\n
                                # wire so far\n
                                transferred = (\n
                                    response.raw.tell() + previouslyTransferred\n
                                )\n
                                self._print_transfer_progress(\n
                                    transferred,\n
                                    toBeTransferred,\n
                                    \"Downloading \",\n
                                    os.path.basename(destination),\n
                                    dt=time.time() - t0,\n
                                )\n
                    except (\n
                        Exception\n
                    ) as ex:  # We will add a progress parameter then push it back to retry.\n
                        ex.progress = transferred - previouslyTransferred\n
                        raise\n\n
                    # verify that the file was completely downloaded and retry if it is not complete\n
                    if toBeTransferred > 0 and transferred < toBeTransferred:\n
                        self.logger.warning(\n
                            \"\\nRetrying download because the connection ended early.\\n\"\n
                        )\n
                        continue\n\n
                    actual_md5 = sig.hexdigest()\n
                    # rename to final destination\n
                    shutil.move(temp_destination, destination)\n
                    break\n
            else:\n
                self.logger.error(\"Unable to download URLs of type %s\" % scheme)\n
                return None\n\n
        else:  # didn't break out of loop\n
            raise SynapseHTTPError(\"Too many redirects\")\n\n
        if (\n
            actual_md5 is None\n
        ):  # if md5 not set (should be the case for all except http download)\n
            actual_md5 = utils.md5_for_file(destination).hexdigest()\n\n
        # check md5 if given\n
        if expected_md5 and actual_md5 != expected_md5:\n
            if delete_on_md5_mismatch and os.path.exists(destination):\n
                os.remove(destination)\n
            raise SynapseMd5MismatchError(\n
                \"Downloaded file {filename}'s md5 {md5} 
does not match expected MD5 of\"\n \" {expected_md5}\".format(\n filename=destination, md5=actual_md5, expected_md5=expected_md5\n )\n )\n\n return destination\n\n @tracer.start_as_current_span(\"Synapse::_createExternalFileHandle\")\n def _createExternalFileHandle(\n self, externalURL, mimetype=None, md5=None, fileSize=None\n ):\n \"\"\"Create a new FileHandle representing an external URL.\"\"\"\n fileName = externalURL.split(\"/\")[-1]\n externalURL = utils.as_url(externalURL)\n fileHandle = {\n \"concreteType\": concrete_types.EXTERNAL_FILE_HANDLE,\n \"fileName\": fileName,\n \"externalURL\": externalURL,\n \"contentMd5\": md5,\n \"contentSize\": fileSize,\n }\n if mimetype is None:\n (mimetype, enc) = mimetypes.guess_type(externalURL, strict=False)\n if mimetype is not None:\n fileHandle[\"contentType\"] = mimetype\n return self.restPOST(\n \"/externalFileHandle\", json.dumps(fileHandle), self.fileHandleEndpoint\n )\n\n @tracer.start_as_current_span(\"Synapse::_createExternalObjectStoreFileHandle\")\n def _createExternalObjectStoreFileHandle(\n self, s3_file_key, file_path, storage_location_id, mimetype=None\n ):\n if mimetype is None:\n mimetype, enc = mimetypes.guess_type(file_path, strict=False)\n file_handle = {\n \"concreteType\": concrete_types.EXTERNAL_OBJECT_STORE_FILE_HANDLE,\n \"fileKey\": s3_file_key,\n \"fileName\": os.path.basename(file_path),\n \"contentMd5\": utils.md5_for_file(file_path).hexdigest(),\n \"contentSize\": os.stat(file_path).st_size,\n \"storageLocationId\": storage_location_id,\n \"contentType\": mimetype,\n }\n\n return self.restPOST(\n \"/externalFileHandle\", json.dumps(file_handle), self.fileHandleEndpoint\n )\n\n @tracer.start_as_current_span(\"Synapse::create_external_s3_file_handle\")\n def create_external_s3_file_handle(\n self,\n bucket_name,\n s3_file_key,\n file_path,\n *,\n parent=None,\n storage_location_id=None,\n mimetype=None,\n ):\n \"\"\"\n Create an external S3 file handle for e.g. a file that has been uploaded directly to\n an external S3 storage location.\n\n Arguments:\n bucket_name: Name of the S3 bucket\n s3_file_key: S3 key of the uploaded object\n file_path: Local path of the uploaded file\n parent: Parent entity to create the file handle in, the file handle will be created\n in the default storage location of the parent. 
Mutually exclusive with\n storage_location_id\n storage_location_id: Explicit storage location id to create the file handle in, mutually exclusive\n with parent\n mimetype: Mimetype of the file, if known\n\n Raises:\n ValueError: If neither parent nor storage_location_id is specified, or if both are specified.\n \"\"\"\n\n if storage_location_id:\n if parent:\n raise ValueError(\"Pass parent or storage_location_id, not both\")\n elif not parent:\n raise ValueError(\"One of parent or storage_location_id is required\")\n else:\n upload_destination = self._getDefaultUploadDestination(parent)\n storage_location_id = upload_destination[\"storageLocationId\"]\n\n if mimetype is None:\n mimetype, enc = mimetypes.guess_type(file_path, strict=False)\n\n file_handle = {\n \"concreteType\": concrete_types.S3_FILE_HANDLE,\n \"key\": s3_file_key,\n \"bucketName\": bucket_name,\n \"fileName\": os.path.basename(file_path),\n \"contentMd5\": utils.md5_for_file(file_path).hexdigest(),\n \"contentSize\": os.stat(file_path).st_size,\n \"storageLocationId\": storage_location_id,\n \"contentType\": mimetype,\n }\n\n return self.restPOST(\n \"/externalFileHandle/s3\",\n json.dumps(file_handle),\n endpoint=self.fileHandleEndpoint,\n )\n\n @tracer.start_as_current_span(\"Synapse::_get_file_handle_as_creator\")\n def _get_file_handle_as_creator(self, fileHandle):\n \"\"\"Retrieve a fileHandle from the fileHandle service.\n You must be the creator of the filehandle to use this method. Otherwise, an 403-Forbidden error will be raised\n \"\"\"\n\n uri = \"/fileHandle/%s\" % (id_of(fileHandle),)\n return self.restGET(uri, endpoint=self.fileHandleEndpoint)\n\n @tracer.start_as_current_span(\"Synapse::_deleteFileHandle\")\n def _deleteFileHandle(self, fileHandle):\n \"\"\"\n Delete the given file handle.\n\n Note: Only the user that created the FileHandle can delete it. 
Also, a FileHandle cannot be deleted if it is\n associated with a FileEntity or WikiPage\n \"\"\"\n\n uri = \"/fileHandle/%s\" % (id_of(fileHandle),)\n self.restDELETE(uri, endpoint=self.fileHandleEndpoint)\n return fileHandle\n\n ############################################################\n # SFTP #\n ############################################################\n\n @tracer.start_as_current_span(\"Synapse::_getDefaultUploadDestination\")\n def _getDefaultUploadDestination(self, parent_entity):\n return self.restGET(\n \"/entity/%s/uploadDestination\" % id_of(parent_entity),\n endpoint=self.fileHandleEndpoint,\n )\n\n @tracer.start_as_current_span(\"Synapse::_getUserCredentials\")\n def _getUserCredentials(self, url, username=None, password=None):\n \"\"\"Get user credentials for a specified URL by either looking in the configFile or querying the user.\n\n :param username: username on server (optionally specified)\n :param password: password for authentication on the server (optionally specified)\n\n :returns: tuple of username, password\n \"\"\"\n # Get authentication information from configFile\n\n parsedURL = urllib_urlparse.urlparse(url)\n baseURL = parsedURL.scheme + \"://\" + parsedURL.hostname\n\n config = self.getConfigFile(self.configPath)\n if username is None and config.has_option(baseURL, \"username\"):\n username = config.get(baseURL, \"username\")\n if password is None and config.has_option(baseURL, \"password\"):\n password = config.get(baseURL, \"password\")\n # If I still don't have a username and password prompt for it\n if username is None:\n username = getpass.getuser() # Default to login name\n # Note that if we hit the following line from within nosetests in\n # Python 3, we get \"TypeError: bad argument type for built-in operation\".\n # Luckily, this case isn't covered in our test suite!\n user = input(\"Username for %s (%s):\" % (baseURL, username))\n username = username if user == \"\" else user\n if password is None:\n password = getpass.getpass(\"Password for %s:\" % baseURL)\n return username, password\n\n ############################################\n # Project/Folder storage location settings #\n ############################################\n\n @tracer.start_as_current_span(\"Synapse::createStorageLocationSetting\")\n def createStorageLocationSetting(self, storage_type, **kwargs):\n \"\"\"\n Creates an IMMUTABLE storage location based on the specified type.\n\n For each storage_type, the following kwargs should be specified:\n\n **ExternalObjectStorage**: (S3-like (e.g. 
AWS S3 or Openstack) bucket not accessed by Synapse)\n\n
        - endpointUrl: endpoint URL of the S3 service (for example: 'https://s3.amazonaws.com')\n
        - bucket: the name of the bucket to use\n\n
        **ExternalS3Storage**: (Amazon S3 bucket accessed by Synapse)\n\n
        - bucket: the name of the bucket to use\n\n
        **ExternalStorage**: (SFTP or FTP storage location not accessed by Synapse)\n\n
        - url: the base URL for uploading to the external destination\n
        - supportsSubfolders (optional): does the destination support creating subfolders under the base url\n
            (default: false)\n\n
        **ProxyStorage**: (a proxy server that controls access to a storage)\n\n
        - secretKey: The encryption key used to sign all pre-signed URLs used to communicate with the proxy.\n
        - proxyUrl: The HTTPS URL of the proxy used for upload and download.\n\n
        Arguments:\n
            storage_type: The type of the StorageLocationSetting to create\n
            banner: (Optional) Banner to show every time a file is uploaded\n
            description: (Optional) The description to show the user when the user has to choose which upload destination to use\n
            kwargs: fields necessary for creation of the specified storage_type\n\n
        Returns:\n
            A dict of the created StorageLocationSetting\n
        \"\"\"\n
        upload_type_dict = {\n
            \"ExternalObjectStorage\": \"S3\",\n
            \"ExternalS3Storage\": \"S3\",\n
            \"ExternalStorage\": \"SFTP\",\n
            \"ProxyStorage\": \"PROXYLOCAL\",\n
        }\n\n
        if storage_type not in upload_type_dict:\n
            raise ValueError(\"Unknown storage_type: %s\" % storage_type)\n\n
        # ProxyStorageLocationSettings has an extra 's' at the end >:(\n
        kwargs[\"concreteType\"] = (\n
            \"org.sagebionetworks.repo.model.project.\"\n
            + storage_type\n
            + \"LocationSetting\"\n
            + (\"s\" if storage_type == \"ProxyStorage\" else \"\")\n
        )\n
        kwargs[\"uploadType\"] = upload_type_dict[storage_type]\n\n
        return self.restPOST(\"/storageLocation\", body=json.dumps(kwargs))\n\n
    @tracer.start_as_current_span(\"Synapse::getMyStorageLocationSetting\")\n
    def getMyStorageLocationSetting(self, storage_location_id):\n
        \"\"\"\n
        Get a StorageLocationSetting by its id.\n\n
        Arguments:\n
            storage_location_id: id of the StorageLocationSetting to retrieve.\n
                The corresponding StorageLocationSetting must have been created by this user.\n\n
        Returns:\n
            A dict describing the StorageLocationSetting retrieved by its id\n
        \"\"\"\n
        return self.restGET(\"/storageLocation/%s\" % storage_location_id)\n\n
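    # Example (illustrative sketch): creating an external S3 storage location via\n
    # createStorageLocationSetting. The bucket name below is a hypothetical\n
    # placeholder.\n
    #\n
    #     import synapseclient\n
    #\n
    #     syn = synapseclient.login()\n
    #     setting = syn.createStorageLocationSetting(\n
    #         \"ExternalS3Storage\",\n
    #         bucket=\"my-external-synapse-bucket\",  # hypothetical bucket name\n
    #     )\n
    #     print(setting[\"storageLocationId\"])\n\n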
    @tracer.start_as_current_span(\"Synapse::setStorageLocation\")\n
    def setStorageLocation(self, entity, storage_location_id):\n
        \"\"\"\n
        Sets the storage location for a Project or Folder\n\n
        Arguments:\n
            entity: A Project or Folder to which the StorageLocationSetting is set\n
            storage_location_id: A StorageLocation id or a list of StorageLocation ids. Pass in None for the default\n
                Synapse storage.\n\n
        Returns:\n
            The created or updated settings as a dict.\n
        \"\"\"\n
        if storage_location_id is None:\n
            storage_location_id = DEFAULT_STORAGE_LOCATION_ID\n
        locations = (\n
            storage_location_id\n
            if isinstance(storage_location_id, list)\n
            else [storage_location_id]\n
        )\n\n
        existing_setting = self.getProjectSetting(entity, \"upload\")\n
        if existing_setting is not None:\n
            existing_setting[\"locations\"] = locations\n
            self.restPUT(\"/projectSettings\", body=json.dumps(existing_setting))\n
            return self.getProjectSetting(entity, \"upload\")\n
        else:\n
            project_destination = {\n
                \"concreteType\": \"org.sagebionetworks.repo.model.project.UploadDestinationListSetting\",\n
                \"settingsType\": \"upload\",\n
                \"locations\": locations,\n
                \"projectId\": id_of(entity),\n
            }\n\n
            return self.restPOST(\n
                \"/projectSettings\", body=json.dumps(project_destination)\n
            )\n\n
    @tracer.start_as_current_span(\"Synapse::getProjectSetting\")\n
    def getProjectSetting(self, project, setting_type):\n
        \"\"\"\n
        Gets the ProjectSetting for a project.\n\n
        Arguments:\n
            project: Project entity or its id as a string\n
            setting_type: Type of setting. Choose from:\n\n
                - `upload`\n
                - `external_sync`\n
                - `requester_pays`\n\n
        Returns:\n
            The ProjectSetting as a dict or None if no settings of the specified type exist.\n
        \"\"\"\n
        if setting_type not in {\"upload\", \"external_sync\", \"requester_pays\"}:\n
            raise ValueError(\"Invalid setting_type: %s\" % setting_type)\n\n
        response = self.restGET(\n
            \"/projectSettings/{projectId}/type/{type}\".format(\n
                projectId=id_of(project), type=setting_type\n
            )\n
        )\n
        return (\n
            response if response else None\n
        )  # if no project setting exists, an empty string is returned as the response\n\n
    @tracer.start_as_current_span(\"Synapse::get_sts_storage_token\")\n
    def get_sts_storage_token(\n
        self, entity, permission, *, output_format=\"json\", min_remaining_life=None\n
    ):\n
        \"\"\"Get STS credentials for the given entity_id and permission, outputting it in the given format\n\n
        Arguments:\n
            entity: The entity or entity id whose credentials are being returned\n
            permission: One of:\n\n
                - `read_only`\n
                - `read_write`\n
            output_format: One of:\n\n
                - `json`: the dictionary returned from the Synapse STS API including expiration\n
                - `boto`: a dictionary compatible with a boto session (aws_access_key_id, etc)\n
                - `shell`: output commands for exporting credentials appropriate for the detected shell\n
                - `bash`: output commands for exporting credentials into a bash shell\n
                - `cmd`: output commands for exporting credentials into a windows cmd shell\n
                - `powershell`: output commands for exporting credentials into a windows powershell\n
            min_remaining_life: The minimum allowable remaining life on a cached token to return. If a cached token\n
                has less than this amount of time remaining, a fresh token will be fetched\n
        \"\"\"\n
        return sts_transfer.get_sts_credentials(\n
            self,\n
            id_of(entity),\n
            permission,\n
            output_format=output_format,\n
            min_remaining_life=min_remaining_life,\n
        )\n\n
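    # Example (illustrative sketch): fetching STS credentials for an STS-enabled\n
    # storage location and handing them to boto3. \"syn123\" is a hypothetical\n
    # placeholder, and the boto3 usage is an assumption about the caller's setup.\n
    #\n
    #     import boto3\n
    #     import synapseclient\n
    #\n
    #     syn = synapseclient.login()\n
    #     credentials = syn.get_sts_storage_token(\n
    #         \"syn123\", \"read_only\", output_format=\"boto\"\n
    #     )\n
    #     s3_client = boto3.client(\"s3\", **credentials)\n\n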
    @tracer.start_as_current_span(\"Synapse::create_s3_storage_location\")\n
    def create_s3_storage_location(\n
        self,\n
        *,\n
        parent=None,\n
        folder_name=None,\n
        folder=None,\n
        bucket_name=None,\n
        base_key=None,\n
        sts_enabled=False,\n
    ):\n
        \"\"\"\n
        Create a storage location in the given parent, either in the given folder or by creating a new\n
        folder in that parent with the given name. This will create both a StorageLocationSetting\n
        and a ProjectSetting together, optionally creating a new folder in which to locate it,\n
        and optionally enabling this storage location for access via STS. If enabling an existing folder for STS,\n
        it must be empty.\n\n
        Arguments:\n
            parent: The parent in which to locate the storage location (mutually exclusive with folder)\n
            folder_name: The name of a new folder to create (mutually exclusive with folder)\n
            folder: The existing folder in which to create the storage location (mutually exclusive with folder_name)\n
            bucket_name: The name of an S3 bucket, if this is an external storage location;\n
                if None will use Synapse S3 storage\n
            base_key: The base key within the bucket; None to use the bucket root.\n
                Only applicable if bucket_name is passed\n
            sts_enabled: Whether this storage location should be STS enabled\n\n
        Returns:\n
            A 3-tuple of the Synapse Folder, the storage location setting, and the project setting dictionaries.\n
        \"\"\"\n
        if folder_name and parent:\n
            if folder:\n
                raise ValueError(\n
                    \"folder and folder_name are mutually exclusive, only one should be passed\"\n
                )\n\n
            folder = self.store(Folder(name=folder_name, parent=parent))\n\n
        elif not folder:\n
            raise ValueError(\"one of folder or folder_name is required\")\n\n
        storage_location_kwargs = {\n
            \"uploadType\": \"S3\",\n
            \"stsEnabled\": sts_enabled,\n
        }\n\n
        if bucket_name:\n
            storage_location_kwargs[\n
                \"concreteType\"\n
            ] = concrete_types.EXTERNAL_S3_STORAGE_LOCATION_SETTING\n
            storage_location_kwargs[\"bucket\"] = bucket_name\n
            if base_key:\n
                storage_location_kwargs[\"baseKey\"] = base_key\n
        else:\n
            storage_location_kwargs[\n
                \"concreteType\"\n
            ] = concrete_types.SYNAPSE_S3_STORAGE_LOCATION_SETTING\n\n
        storage_location_setting = self.restPOST(\n
            \"/storageLocation\", json.dumps(storage_location_kwargs)\n
        )\n\n
        storage_location_id = storage_location_setting[\"storageLocationId\"]\n
        project_setting = self.setStorageLocation(\n
            folder,\n
            storage_location_id,\n
        )\n\n
        return folder, storage_location_setting, project_setting\n\n
    ############################################################\n
    #                  CRUD for Evaluations                    #\n
    ############################################################\n\n
    @tracer.start_as_current_span(\"Synapse::getEvaluation\")\n
    def getEvaluation(self, id):\n
        \"\"\"\n
        Gets an Evaluation object from Synapse.\n\n
        Arguments:\n
            id: The ID of the [synapseclient.evaluation.Evaluation][] to return.\n\n
        Returns:\n
            An [synapseclient.evaluation.Evaluation][] object\n\n
        Example: Using this function\n
            Getting an Evaluation instance\n\n
                evaluation = syn.getEvaluation(2005090)\n
        \"\"\"\n\n
        evaluation_id = id_of(id)\n
        uri = Evaluation.getURI(evaluation_id)\n
        return Evaluation(**self.restGET(uri))\n\n
    # TODO: Should this be combined with getEvaluation?\n
    @tracer.start_as_current_span(\"Synapse::getEvaluationByName\")\n
    def getEvaluationByName(self, name):\n
        \"\"\"\n
        Gets an Evaluation object from Synapse by name.\n\n
        Arguments:\n
            name: The name of the [synapseclient.evaluation.Evaluation][] to return.\n\n
        Returns:\n
            An [synapseclient.evaluation.Evaluation][] object\n
        \"\"\"\n
        uri = Evaluation.getByNameURI(name)\n
        return Evaluation(**self.restGET(uri))\n\n
    @tracer.start_as_current_span(\"Synapse::getEvaluationByContentSource\")\n
    def getEvaluationByContentSource(self, entity):\n
        \"\"\"\n
        Returns a generator over evaluations that derive their content from the given entity\n\n
        Arguments:\n
            entity: The [synapseclient.entity.Project][] whose Evaluations are to be fetched.\n\n
        Yields:\n
            A 
generator over [synapseclient.evaluation.Evaluation][] objects for the given [synapseclient.entity.Project][].\n \"\"\"\n\n entityId = id_of(entity)\n url = \"/entity/%s/evaluation\" % entityId\n\n for result in self._GET_paginated(url):\n yield Evaluation(**result)\n\n @tracer.start_as_current_span(\"Synapse::_findTeam\")\n def _findTeam(self, name):\n \"\"\"\n Retrieve a Teams matching the supplied name fragment\n \"\"\"\n for result in self._GET_paginated(\"/teams?fragment=%s\" % name):\n yield Team(**result)\n\n @tracer.start_as_current_span(\"Synapse::_find_teams_for_principal\")\n def _find_teams_for_principal(self, principal_id: str) -> typing.Iterator[Team]:\n \"\"\"\n Retrieve a list of teams for the matching principal ID. If the principalId that is passed in is a team itself,\n or not found, this will return a generator that yields no results.\n\n :param principal_id: Identifier of a user or group.\n\n :return: A generator that yields objects of type :py:class:`synapseclient.team.Team`\n \"\"\"\n for result in self._GET_paginated(f\"/user/{principal_id}/team\"):\n yield Team(**result)\n\n @tracer.start_as_current_span(\"Synapse::getTeam\")\n def getTeam(self, id):\n \"\"\"\n Finds a team with a given ID or name.\n\n Arguments:\n id: The ID or name of the team or a Team object to retrieve.\n\n Returns:\n An object of type [synapseclient.team.Team][]\n \"\"\"\n # Retrieves team id\n teamid = id_of(id)\n try:\n int(teamid)\n except (TypeError, ValueError):\n if isinstance(id, str):\n for team in self._findTeam(id):\n if team.name == id:\n teamid = team.id\n break\n else:\n raise ValueError('Can\\'t find team \"{}\"'.format(teamid))\n else:\n raise ValueError('Can\\'t find team \"{}\"'.format(teamid))\n return Team(**self.restGET(\"/team/%s\" % teamid))\n\n @tracer.start_as_current_span(\"Synapse::getTeamMembers\")\n def getTeamMembers(self, team):\n \"\"\"\n Lists the members of the given team.\n\n Arguments:\n team: A [synapseclient.team.Team][] object or a team's ID.\n\n Yields:\n A generator over [synapseclient.team.TeamMember][] objects.\n\n \"\"\"\n for result in self._GET_paginated(\"/teamMembers/{id}\".format(id=id_of(team))):\n yield TeamMember(**result)\n\n @tracer.start_as_current_span(\"Synapse::_get_docker_digest\")\n def _get_docker_digest(self, entity, docker_tag=\"latest\"):\n \"\"\"\n Get matching Docker sha-digest of a DockerRepository given a Docker tag\n\n :param entity: Synapse id or entity of Docker repository\n :param docker_tag: Docker tag\n :returns: Docker digest matching Docker tag\n \"\"\"\n entityid = id_of(entity)\n uri = \"/entity/{entityId}/dockerTag\".format(entityId=entityid)\n\n docker_commits = self._GET_paginated(uri)\n docker_digest = None\n for commit in docker_commits:\n if docker_tag == commit[\"tag\"]:\n docker_digest = commit[\"digest\"]\n if docker_digest is None:\n raise ValueError(\n \"Docker tag {docker_tag} not found. Please specify a \"\n \"docker tag that exists. 
'latest' is used as \"\n
                \"default.\".format(docker_tag=docker_tag)\n
            )\n
        return docker_digest\n\n
    @tracer.start_as_current_span(\"Synapse::get_team_open_invitations\")\n
    def get_team_open_invitations(self, team):\n
        \"\"\"Retrieve the open invitations submitted to a Team\n
        https://rest-docs.synapse.org/rest/GET/team/id/openInvitation.html\n\n
        Arguments:\n
            team: A [synapseclient.team.Team][] object or a team's ID.\n\n
        Yields:\n
            Generator of MembershipInvitation\n
        \"\"\"\n
        teamid = id_of(team)\n
        request = \"/team/{team}/openInvitation\".format(team=teamid)\n
        open_requests = self._GET_paginated(request)\n
        return open_requests\n\n
    @tracer.start_as_current_span(\"Synapse::get_membership_status\")\n
    def get_membership_status(self, userid, team):\n
        \"\"\"Retrieve a user's Team Membership Status bundle.\n
        https://rest-docs.synapse.org/rest/GET/team/id/member/principalId/membershipStatus.html\n\n
        Arguments:\n
            userid: Synapse user ID\n
            team: A [synapseclient.team.Team][] object or a team's ID.\n\n
        Returns:\n
            dict of TeamMembershipStatus\n
        \"\"\"\n
        teamid = id_of(team)\n
        request = \"/team/{team}/member/{user}/membershipStatus\".format(\n
            team=teamid, user=userid\n
        )\n
        membership_status = self.restGET(request)\n
        return membership_status\n\n
    @tracer.start_as_current_span(\"Synapse::_delete_membership_invitation\")\n
    def _delete_membership_invitation(self, invitationid):\n
        \"\"\"Delete open membership invitation\n\n
        :param invitationid: Open invitation id\n
        \"\"\"\n
        self.restDELETE(\"/membershipInvitation/{id}\".format(id=invitationid))\n\n
    @tracer.start_as_current_span(\"Synapse::send_membership_invitation\")\n
    def send_membership_invitation(\n
        self, teamId, inviteeId=None, inviteeEmail=None, message=None\n
    ):\n
        \"\"\"Create a membership invitation and send an email notification\n
        to the invitee.\n\n
        Arguments:\n
            teamId: Synapse teamId\n
            inviteeId: Synapse username or profile id of user\n
            inviteeEmail: Email of user\n
            message: Additional message for the user getting invited to the\n
                team.\n\n
        Returns:\n
            MembershipInvitation\n
        \"\"\"\n\n
        invite_request = {\"teamId\": str(teamId), \"message\": message}\n
        if inviteeEmail is not None:\n
            invite_request[\"inviteeEmail\"] = str(inviteeEmail)\n
        if inviteeId is not None:\n
            invite_request[\"inviteeId\"] = str(inviteeId)\n\n
        response = self.restPOST(\n
            \"/membershipInvitation\", body=json.dumps(invite_request)\n
        )\n
        return response\n\n
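    # Example (illustrative sketch): checking a user's membership status before\n
    # inviting them to a team. The user ID \"3336891\" and team ID \"1234567\" are\n
    # hypothetical placeholders.\n
    #\n
    #     import synapseclient\n
    #\n
    #     syn = synapseclient.login()\n
    #     status = syn.get_membership_status(\"3336891\", \"1234567\")\n
    #     if not status[\"isMember\"]:\n
    #         invite = syn.invite_to_team(\n
    #             \"1234567\", user=\"3336891\", message=\"Please join our team\"\n
    #         )\n\n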
    @tracer.start_as_current_span(\"Synapse::invite_to_team\")\n
    def invite_to_team(\n
        self, team, user=None, inviteeEmail=None, message=None, force=False\n
    ):\n
        \"\"\"Invite user to a Synapse team via Synapse username or email\n
        (choose one or the other)\n\n
        Arguments:\n
            team: A [synapseclient.team.Team][] object or a team's ID.\n
            user: Synapse username or profile id of user\n
            inviteeEmail: Email of user\n
            message: Additional message for the user getting invited to the team.\n
            force: If an open invitation exists for the invitee, the old invite will be cancelled.\n\n
        Returns:\n
            MembershipInvitation or None if user is already a member\n
        \"\"\"\n
        # Raise an error if both user and email are specified, or if neither is\n
        id_email_specified = inviteeEmail is not None and user is not None\n
        id_email_notspecified = inviteeEmail is None and user is None\n
        if id_email_specified or id_email_notspecified:\n
            raise ValueError(\"Must specify either 'user' or 'inviteeEmail'\")\n\n
        teamid = id_of(team)\n
        is_member = False\n
        open_invitations = self.get_team_open_invitations(teamid)\n\n
        if user is not None:\n
            inviteeId = self.getUserProfile(user)[\"ownerId\"]\n
            membership_status = self.get_membership_status(inviteeId, teamid)\n
            is_member = membership_status[\"isMember\"]\n
            open_invites_to_user = [\n
                invitation\n
                for invitation in open_invitations\n
                if invitation.get(\"inviteeId\") == inviteeId\n
            ]\n
        else:\n
            inviteeId = None\n
            open_invites_to_user = [\n
                invitation\n
                for invitation in open_invitations\n
                if invitation.get(\"inviteeEmail\") == inviteeEmail\n
            ]\n
        # Only invite if the invitee is not a member and\n
        # if invitee doesn't have an open invitation unless force=True\n
        if not is_member and (not open_invites_to_user or force):\n
            # Delete all old invitations\n
            for invite in open_invites_to_user:\n
                self._delete_membership_invitation(invite[\"id\"])\n
            return self.send_membership_invitation(\n
                teamid, inviteeId=inviteeId, inviteeEmail=inviteeEmail, message=message\n
            )\n
        if is_member:\n
            not_sent_reason = \"invitee is already a member\"\n
        else:\n
            not_sent_reason = (\n
                \"invitee already has an open invitation. \"\n
                \"Set force=True to send a new invite.\"\n
            )\n\n
        self.logger.warning(\"No invitation sent: {}\".format(not_sent_reason))\n
        # Return None if no invite is sent.\n
        return None\n\n
    @tracer.start_as_current_span(\"Synapse::submit\")\n
    def submit(\n
        self,\n
        evaluation,\n
        entity,\n
        name=None,\n
        team=None,\n
        silent=False,\n
        submitterAlias=None,\n
        teamName=None,\n
        dockerTag=\"latest\",\n
    ):\n
        \"\"\"\n
        Submit an Entity for [evaluation][synapseclient.evaluation.Evaluation].\n\n
        Arguments:\n
            evaluation: Evaluation queue to submit to\n
            entity: The Entity containing the Submission\n
            name: A name for this submission. If omitted, the entity name will be used.\n
            team: (optional) A [synapseclient.team.Team][] object, ID or name of a Team that is registered for the challenge\n
            silent: Set to True to suppress output.\n
            submitterAlias: (optional) A nickname, possibly for display in leaderboards in place of the submitter's name\n
            teamName: (deprecated) A synonym for submitterAlias\n
            dockerTag: (optional) The Docker tag must be specified if the entity is a DockerRepository.\n\n
        Returns:\n
            A [synapseclient.evaluation.Submission][] object\n\n\n
        In the case of challenges, a team can optionally be provided to give credit to members of the team that\n
        contributed to the submission. The team must be registered for the challenge with which the given evaluation is\n
        associated. The caller must be a member of the submitting team.\n\n
        Example: Using this function\n
            Getting and submitting an evaluation\n\n
                evaluation = syn.getEvaluation(123)\n
                entity = syn.get('syn456')\n
                submission = syn.submit(evaluation, entity, name='Our Final Answer', team='Blue Team')\n
        \"\"\"\n\n
        require_param(evaluation, \"evaluation\")\n
        require_param(entity, \"entity\")\n\n
        evaluation_id = id_of(evaluation)\n\n
        entity_id = id_of(entity)\n
        if isinstance(entity, synapseclient.DockerRepository):\n
            # Edge case if dockerTag is specified as None\n
            if dockerTag is None:\n
                raise ValueError(\n
                    \"A dockerTag is required to submit a DockerEntity. 
Cannot be None\"\n )\n docker_repository = entity[\"repositoryName\"]\n else:\n docker_repository = None\n\n if \"versionNumber\" not in entity:\n entity = self.get(entity, downloadFile=False)\n # version defaults to 1 to hack around required version field and allow submission of files/folders\n entity_version = entity.get(\"versionNumber\", 1)\n\n # default name of submission to name of entity\n if name is None and \"name\" in entity:\n name = entity[\"name\"]\n\n team_id = None\n if team:\n team = self.getTeam(team)\n team_id = id_of(team)\n\n contributors, eligibility_hash = self._get_contributors(evaluation_id, team)\n\n # for backward compatible until we remove supports for teamName\n if not submitterAlias:\n if teamName:\n submitterAlias = teamName\n elif team and \"name\" in team:\n submitterAlias = team[\"name\"]\n\n if isinstance(entity, synapseclient.DockerRepository):\n docker_digest = self._get_docker_digest(entity, dockerTag)\n else:\n docker_digest = None\n\n submission = {\n \"evaluationId\": evaluation_id,\n \"name\": name,\n \"entityId\": entity_id,\n \"versionNumber\": entity_version,\n \"dockerDigest\": docker_digest,\n \"dockerRepositoryName\": docker_repository,\n \"teamId\": team_id,\n \"contributors\": contributors,\n \"submitterAlias\": submitterAlias,\n }\n\n submitted = self._submit(submission, entity[\"etag\"], eligibility_hash)\n\n # if we want to display the receipt message, we need the full object\n if not silent:\n if not (isinstance(evaluation, Evaluation)):\n evaluation = self.getEvaluation(evaluation_id)\n if \"submissionReceiptMessage\" in evaluation:\n self.logger.info(evaluation[\"submissionReceiptMessage\"])\n\n return Submission(**submitted)\n\n @tracer.start_as_current_span(\"Synapse::_submit\")\n def _submit(self, submission, entity_etag, eligibility_hash):\n require_param(submission, \"submission\")\n require_param(entity_etag, \"entity_etag\")\n # URI requires the etag of the entity and, in the case of a team submission, requires an eligibilityStateHash\n uri = \"/evaluation/submission?etag=%s\" % entity_etag\n if eligibility_hash:\n uri += \"&submissionEligibilityHash={0}\".format(eligibility_hash)\n submitted = self.restPOST(uri, json.dumps(submission))\n return submitted\n\n @tracer.start_as_current_span(\"Synapse::_get_contributors\")\n def _get_contributors(self, evaluation_id, team):\n if not evaluation_id or not team:\n return None, None\n\n team_id = id_of(team)\n # see https://rest-docs.synapse.org/rest/GET/evaluation/evalId/team/id/submissionEligibility.html\n eligibility = self.restGET(\n \"/evaluation/{evalId}/team/{id}/submissionEligibility\".format(\n evalId=evaluation_id, id=team_id\n )\n )\n\n if not eligibility[\"teamEligibility\"][\"isEligible\"]:\n # Check team eligibility and raise an exception if not eligible\n if not eligibility[\"teamEligibility\"][\"isRegistered\"]:\n raise SynapseError(\n 'Team \"{team}\" is not registered.'.format(team=team.name)\n )\n if eligibility[\"teamEligibility\"][\"isQuotaFilled\"]:\n raise SynapseError(\n 'Team \"{team}\" has already submitted the full quota of submissions.'.format(\n team=team.name\n )\n )\n raise SynapseError('Team \"{team}\" is not eligible.'.format(team=team.name))\n\n # Include all team members who are eligible.\n contributors = [\n {\"principalId\": member[\"principalId\"]}\n for member in eligibility[\"membersEligibility\"]\n if member[\"isEligible\"] and not member[\"hasConflictingSubmission\"]\n ]\n return contributors, eligibility[\"eligibilityStateHash\"]\n\n 
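    # Example (illustrative sketch): submitting to a challenge as a team and\n
    # handling the eligibility errors raised by the helper above. The queue ID\n
    # 9614112, \"syn123\", and the team name are hypothetical placeholders.\n
    #\n
    #     import synapseclient\n
    #     from synapseclient.core.exceptions import SynapseError\n
    #\n
    #     syn = synapseclient.login()\n
    #     evaluation = syn.getEvaluation(9614112)\n
    #     entity = syn.get(\"syn123\")\n
    #     try:\n
    #         submission = syn.submit(evaluation, entity, team=\"Blue Team\")\n
    #     except SynapseError as ex:\n
    #         print(\"Not eligible to submit:\", ex)\n\n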
@tracer.start_as_current_span(\"Synapse::_allowParticipation\")\n def _allowParticipation(\n self,\n evaluation,\n user,\n rights=[\"READ\", \"PARTICIPATE\", \"SUBMIT\", \"UPDATE_SUBMISSION\"],\n ):\n \"\"\"\n Grants the given user the minimal access rights to join and submit to an Evaluation.\n Note: The specification of this method has not been decided yet, so the method is likely to change in future.\n\n :param evaluation: An Evaluation object or Evaluation ID\n :param user: Either a user group or the principal ID of a user to grant rights to.\n To allow all users, use \"PUBLIC\".\n To allow authenticated users, use \"AUTHENTICATED_USERS\".\n :param rights: The access rights to give to the users.\n Defaults to \"READ\", \"PARTICIPATE\", \"SUBMIT\", and \"UPDATE_SUBMISSION\".\n \"\"\"\n\n # Check to see if the user is an ID or group\n userId = -1\n try:\n # TODO: is there a better way to differentiate between a userID and a group name?\n # What if a group is named with just numbers?\n userId = int(user)\n\n # Verify that the user exists\n try:\n self.getUserProfile(userId)\n except SynapseHTTPError as err:\n if err.response.status_code == 404:\n raise SynapseError(\"The user (%s) does not exist\" % str(userId))\n raise\n\n except ValueError:\n # Fetch the ID of the user group\n userId = self._getUserbyPrincipalIdOrName(user)\n\n if not isinstance(evaluation, Evaluation):\n evaluation = self.getEvaluation(id_of(evaluation))\n\n self.setPermissions(evaluation, userId, accessType=rights, overwrite=False)\n\n @tracer.start_as_current_span(\"Synapse::getSubmissions\")\n def getSubmissions(self, evaluation, status=None, myOwn=False, limit=20, offset=0):\n \"\"\"\n Arguments:\n evaluation: Evaluation to get submissions from.\n status: Optionally filter submissions for a specific status.\n One of:\n\n - `OPEN`\n - `CLOSED`\n - `SCORED`\n - `INVALID`\n - `VALIDATED`\n - `EVALUATION_IN_PROGRESS`\n - `RECEIVED`\n - `REJECTED`\n - `ACCEPTED`\n myOwn: Determines if only your Submissions should be fetched.\n Defaults to False (all Submissions)\n limit: Limits the number of submissions in a single response.\n Because this method returns a generator and repeatedly\n fetches submissions, this argument is limiting the\n size of a single request and NOT the number of sub-\n missions returned in total.\n offset: Start iterating at a submission offset from the first submission.\n\n Yields:\n A generator over [synapseclient.evaluation.Submission][] objects for an Evaluation\n\n Example: Using this function\n Print submissions\n\n for submission in syn.getSubmissions(1234567):\n print(submission['entityId'])\n\n See:\n\n - [synapseclient.evaluation][]\n \"\"\"\n\n evaluation_id = id_of(evaluation)\n uri = \"/evaluation/%s/submission%s\" % (evaluation_id, \"\" if myOwn else \"/all\")\n\n if status is not None:\n uri += \"?status=%s\" % status\n\n for result in self._GET_paginated(uri, limit=limit, offset=offset):\n yield Submission(**result)\n\n @tracer.start_as_current_span(\"Synapse::_getSubmissionBundles\")\n def _getSubmissionBundles(\n self, evaluation, status=None, myOwn=False, limit=20, offset=0\n ):\n \"\"\"\n :param evaluation: Evaluation to get submissions from.\n :param status: Optionally filter submissions for a specific status.\n One of {OPEN, CLOSED, SCORED, INVALID}\n :param myOwn: Determines if only your Submissions should be fetched.\n Defaults to False (all Submissions)\n :param limit: Limits the number of submissions coming back from the\n service in a single response.\n :param offset: Start 
iterating at a submission offset from the first\n submission.\n\n :returns: A generator over dictionaries with keys 'submission' and 'submissionStatus'.\n\n Example::\n\n for sb in syn._getSubmissionBundles(1234567):\n print(sb['submission']['name'], \\\\\n sb['submission']['submitterAlias'], \\\\\n sb['submissionStatus']['status'], \\\\\n sb['submissionStatus']['score'])\n\n This may later be changed to return objects, pending some thought on how submissions along with related status\n and annotations should be represented in the clients.\n\n See: :py:mod:`synapseclient.evaluation`\n \"\"\"\n\n evaluation_id = id_of(evaluation)\n url = \"/evaluation/%s/submission/bundle%s\" % (\n evaluation_id,\n \"\" if myOwn else \"/all\",\n )\n if status is not None:\n url += \"?status=%s\" % status\n\n return self._GET_paginated(url, limit=limit, offset=offset)\n\n @tracer.start_as_current_span(\"Synapse::getSubmissionBundles\")\n def getSubmissionBundles(\n self, evaluation, status=None, myOwn=False, limit=20, offset=0\n ):\n \"\"\"\n Retrieve submission bundles (submission and submissions status) for an evaluation queue, optionally filtered by\n submission status and/or owner.\n\n Arguments:\n evaluation: Evaluation to get submissions from.\n status: Optionally filter submissions for a specific status.\n One of:\n\n - `OPEN`\n - `CLOSED`\n - `SCORED`\n - `INVALID`\n myOwn: Determines if only your Submissions should be fetched.\n Defaults to False (all Submissions)\n limit: Limits the number of submissions coming back from the\n service in a single response.\n offset: Start iterating at a submission offset from the first submission.\n\n Yields:\n A generator over tuples containing a [synapseclient.evaluation.Submission][] and a [synapseclient.evaluation.SubmissionStatus][].\n\n Example: Using this function\n Loop over submissions\n\n for submission, status in syn.getSubmissionBundles(evaluation):\n print(submission.name, \\\\\n submission.submitterAlias, \\\\\n status.status, \\\\\n status.score)\n\n This may later be changed to return objects, pending some thought on how submissions along with related status\n and annotations should be represented in the clients.\n\n See:\n\n - [synapseclient.evaluation][]\n \"\"\"\n for bundle in self._getSubmissionBundles(\n evaluation, status=status, myOwn=myOwn, limit=limit, offset=offset\n ):\n yield (\n Submission(**bundle[\"submission\"]),\n SubmissionStatus(**bundle[\"submissionStatus\"]),\n )\n\n @tracer.start_as_current_span(\"Synapse::_GET_paginated\")\n def _GET_paginated(self, uri, limit=20, offset=0):\n \"\"\"\n :param uri: A URI that returns paginated results\n :param limit: How many records should be returned per request\n :param offset: At what record offset from the first should iteration start\n\n :returns: A generator over some paginated results\n\n The limit parameter is set at 20 by default. 
Using a larger limit results in fewer calls to the service, but if\n
        responses are large enough to be a burden on the service they may be truncated.\n
        \"\"\"\n\n
        prev_num_results = sys.maxsize\n
        while prev_num_results > 0:\n
            uri = utils._limit_and_offset(uri, limit=limit, offset=offset)\n
            page = self.restGET(uri)\n
            results = page[\"results\"] if \"results\" in page else page[\"children\"]\n
            prev_num_results = len(results)\n\n
            for result in results:\n
                offset += 1\n
                yield result\n\n
    @tracer.start_as_current_span(\"Synapse::_POST_paginated\")\n
    def _POST_paginated(self, uri, body, **kwargs):\n
        \"\"\"\n
        :param uri: A URI that returns paginated results\n
        :param body: POST request payload\n\n
        :returns: A generator over some paginated results\n
        \"\"\"\n\n
        next_page_token = None\n
        while True:\n
            body[\"nextPageToken\"] = next_page_token\n
            response = self.restPOST(uri, body=json.dumps(body), **kwargs)\n
            next_page_token = response.get(\"nextPageToken\")\n
            for item in response[\"page\"]:\n
                yield item\n
            if next_page_token is None:\n
                break\n\n
    @tracer.start_as_current_span(\"Synapse::getSubmission\")\n
    def getSubmission(self, id, **kwargs):\n
        \"\"\"\n
        Gets a [synapseclient.evaluation.Submission][] object by its id.\n\n
        Arguments:\n
            id: The id of the submission to retrieve\n\n
        Returns:\n
            A [synapseclient.evaluation.Submission][] object\n\n
        See:\n\n
        - [synapseclient.Synapse.get][] for information\n
          on the *downloadFile*, *downloadLocation*, and *ifcollision* parameters\n
        \"\"\"\n\n
        submission_id = id_of(id)\n
        uri = Submission.getURI(submission_id)\n
        submission = Submission(**self.restGET(uri))\n\n
        # Pre-fetch the Entity tied to the Submission, if there is one\n
        if \"entityId\" in submission and submission[\"entityId\"] is not None:\n
            entityBundleJSON = json.loads(submission[\"entityBundleJSON\"])\n\n
            # getWithEntityBundle expects a bundle services v2 style\n
            # annotations dict, but the evaluations API may return\n
            # an older format annotations object in the encoded JSON\n
            # depending on when the original submission was made.\n
            annotations = entityBundleJSON.get(\"annotations\")\n
            if annotations:\n
                entityBundleJSON[\"annotations\"] = convert_old_annotation_json(\n
                    annotations\n
                )\n\n
            related = self._getWithEntityBundle(\n
                entityBundle=entityBundleJSON,\n
                entity=submission[\"entityId\"],\n
                submission=submission_id,\n
                **kwargs,\n
            )\n
            submission.entity = related\n
            submission.filePath = related.get(\"path\", None)\n\n
        return submission\n\n
    @tracer.start_as_current_span(\"Synapse::getSubmissionStatus\")\n
    def getSubmissionStatus(self, submission):\n
        \"\"\"\n
        Downloads the status of a Submission.\n\n
        Arguments:\n
            submission: The submission to lookup\n\n
        Returns:\n
            A [synapseclient.evaluation.SubmissionStatus][] object\n
        \"\"\"\n\n
        submission_id = id_of(submission)\n
        uri = SubmissionStatus.getURI(submission_id)\n
        val = self.restGET(uri)\n
        return SubmissionStatus(**val)\n\n
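    # Example (illustrative sketch): fetching a submission and its status. The\n
    # submission ID 9832123 is a hypothetical placeholder.\n
    #\n
    #     import synapseclient\n
    #\n
    #     syn = synapseclient.login()\n
    #     submission = syn.getSubmission(9832123, downloadFile=False)\n
    #     status = syn.getSubmissionStatus(submission)\n
    #     print(submission.name, status.status)\n\n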
    ############################################################\n
    #                     CRUD for Wikis                       #\n
    ############################################################\n\n
    @tracer.start_as_current_span(\"Synapse::getWiki\")\n
    def getWiki(self, owner, subpageId=None, version=None):\n
        \"\"\"\n
        Get a [synapseclient.wiki.Wiki][] object from Synapse. Uses the wiki2 API, which supports versioning.\n\n
        Arguments:\n
            owner: The entity to which the Wiki is attached\n
            subpageId: The id of the specific sub-page or None to get the root Wiki page\n
            version: The version of the page to retrieve or None to retrieve the latest\n\n
        Returns:\n
            A [synapseclient.wiki.Wiki][] object\n
        \"\"\"\n
        uri = \"/entity/{ownerId}/wiki2\".format(ownerId=id_of(owner))\n
        if subpageId is not None:\n
            uri += \"/{wikiId}\".format(wikiId=subpageId)\n
        if version is not None:\n
            uri += \"?wikiVersion={version}\".format(version=version)\n\n
        wiki = self.restGET(uri)\n
        wiki[\"owner\"] = owner\n
        wiki = Wiki(**wiki)\n\n
        path = self.cache.get(wiki.markdownFileHandleId)\n
        if not path:\n
            cache_dir = self.cache.get_cache_dir(wiki.markdownFileHandleId)\n
            if not os.path.exists(cache_dir):\n
                os.makedirs(cache_dir)\n
            path = self._downloadFileHandle(\n
                wiki[\"markdownFileHandleId\"],\n
                wiki[\"id\"],\n
                \"WikiMarkdown\",\n
                os.path.join(cache_dir, str(wiki.markdownFileHandleId) + \".md\"),\n
            )\n
        try:\n
            import gzip\n\n
            with gzip.open(path) as f:\n
                markdown = f.read().decode(\"utf-8\")\n
        except IOError:\n
            # not gzipped; a text-mode read already yields a decoded str\n
            with open(path) as f:\n
                markdown = f.read()\n\n
        wiki.markdown = markdown\n
        wiki.markdown_path = path\n\n
        return wiki\n\n
    @tracer.start_as_current_span(\"Synapse::getWikiHeaders\")\n
    def getWikiHeaders(self, owner):\n
        \"\"\"\n
        Retrieves the headers of all Wikis belonging to the owner (the entity to which the Wiki is attached).\n\n
        Arguments:\n
            owner: An Entity\n\n
        Returns:\n
            A list of Objects with three fields: id, title and parentId.\n
        \"\"\"\n\n
        uri = \"/entity/%s/wikiheadertree\" % id_of(owner)\n
        return [DictObject(**header) for header in self._GET_paginated(uri)]\n\n
    @tracer.start_as_current_span(\"Synapse::_storeWiki\")\n
    def _storeWiki(self, wiki, createOrUpdate):  # type: (Wiki, bool) -> Wiki\n
        \"\"\"\n
        Stores or updates the given Wiki.\n\n
        :param wiki: A Wiki object\n\n
        :returns: An updated Wiki object\n
        \"\"\"\n
        # Make sure the file handle field is a list\n
        if \"attachmentFileHandleIds\" not in wiki:\n
            wiki[\"attachmentFileHandleIds\"] = []\n\n
        # Convert all attachments into file handles\n
        if wiki.get(\"attachments\") is not None:\n
            for attachment in wiki[\"attachments\"]:\n
                fileHandle = upload_synapse_s3(self, attachment)\n
                wiki[\"attachmentFileHandleIds\"].append(fileHandle[\"id\"])\n
            del wiki[\"attachments\"]\n\n
        # Perform an update if the Wiki has an ID\n
        if \"id\" in wiki:\n
            updated_wiki = Wiki(\n
                owner=wiki.ownerId, **self.restPUT(wiki.putURI(), wiki.json())\n
            )\n\n
        # Perform a create if the Wiki has no ID\n
        else:\n
            try:\n
                updated_wiki = Wiki(\n
                    owner=wiki.ownerId, **self.restPOST(wiki.postURI(), wiki.json())\n
                )\n
            except SynapseHTTPError as err:\n
                # If already present we get an unhelpful SQL error\n
                if createOrUpdate and (\n
                    (\n
                        err.response.status_code == 400\n
                        and \"DuplicateKeyException\" in err.message\n
                    )\n
                    or err.response.status_code == 409\n
                ):\n
                    existing_wiki = self.getWiki(wiki.ownerId)\n\n
                    # overwrite everything except for the etag (this will keep unmodified fields in the existing wiki)\n
                    etag = existing_wiki[\"etag\"]\n
                    existing_wiki.update(wiki)\n
                    existing_wiki.etag = etag\n\n
                    updated_wiki = Wiki(\n
                        owner=wiki.ownerId,\n
                        **self.restPUT(existing_wiki.putURI(), existing_wiki.json()),\n
                    )\n
                else:\n
                    raise\n
        return updated_wiki\n\n
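    # Example (illustrative sketch): reading the root wiki page of a project.\n
    # \"syn123\" is a hypothetical placeholder.\n
    #\n
    #     import synapseclient\n
    #\n
    #     syn = synapseclient.login()\n
    #     wiki = syn.getWiki(\"syn123\")\n
    #     print(wiki.title)\n
    #     print(wiki.markdown[:200])  # markdown is populated by getWiki\n\n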
    @tracer.start_as_current_span(\"Synapse::getWikiAttachments\")\n
    def getWikiAttachments(self, wiki):\n
        \"\"\"\n
        Retrieve the attachments to a wiki page.\n\n
        Arguments:\n
            wiki: The Wiki object for which the attachments are to be returned.\n\n
        Returns:\n
            A list of file handles for the files attached to the Wiki.\n
        \"\"\"\n
        uri = \"/entity/%s/wiki/%s/attachmenthandles\" % (wiki.ownerId, wiki.id)\n
        results = self.restGET(uri)\n
        file_handles = list(WikiAttachment(**fh) for fh in results[\"list\"])\n
        return file_handles\n\n
    ############################################################\n
    #                         Tables                           #\n
    ############################################################\n\n
    @tracer.start_as_current_span(\"Synapse::_waitForAsync\")\n
    def _waitForAsync(self, uri, request, endpoint=None):\n
        if endpoint is None:\n
            endpoint = self.repoEndpoint\n
        async_job_id = self.restPOST(\n
            uri + \"/start\", body=json.dumps(request), endpoint=endpoint\n
        )\n\n
        # https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/asynch/AsynchronousJobStatus.html\n
        sleep = self.table_query_sleep\n
        start_time = time.time()\n
        lastMessage, lastProgress, lastTotal, progressed = \"\", 0, 1, False\n
        while time.time() - start_time < self.table_query_timeout:\n
            result = self.restGET(\n
                uri + \"/get/%s\" % async_job_id[\"token\"], endpoint=endpoint\n
            )\n
            if result.get(\"jobState\", None) == \"PROCESSING\":\n
                progressed = True\n
                message = result.get(\"progressMessage\", lastMessage)\n
                progress = result.get(\"progressCurrent\", lastProgress)\n
                total = result.get(\"progressTotal\", lastTotal)\n
                if message != \"\":\n
                    self._print_transfer_progress(\n
                        progress, total, message, isBytes=False\n
                    )\n
                # Reset the time if we made progress (fix SYNPY-214)\n
                if message != lastMessage or lastProgress != progress:\n
                    start_time = time.time()\n
                lastMessage, lastProgress, lastTotal = message, progress, total\n
                sleep = min(\n
                    self.table_query_max_sleep, sleep * self.table_query_backoff\n
                )\n
                doze(sleep)\n
            else:\n
                break\n
        else:\n
            raise SynapseTimeoutError(\n
                \"Timeout waiting for query results: %0.1f seconds \"\n
                % (time.time() - start_time)\n
            )\n
        if result.get(\"jobState\", None) == \"FAILED\":\n
            # errorMessage and/or errorDetails may be absent; guard with empty\n
            # strings so a missing field doesn't mask the failure with a TypeError\n
            raise SynapseError(\n
                (result.get(\"errorMessage\") or \"\")\n
                + \"\\n\"\n
                + (result.get(\"errorDetails\") or \"\"),\n
                asynchronousJobStatus=result,\n
            )\n
        if progressed:\n
            self._print_transfer_progress(total, total, message, isBytes=False)\n
        return result\n\n
    @tracer.start_as_current_span(\"Synapse::getColumn\")\n
    def getColumn(self, id):\n
        \"\"\"\n
        Gets a Column object from Synapse by ID.\n\n
        See: [synapseclient.table.Column][]\n\n
        Arguments:\n
            id: The ID of the column to retrieve\n\n
        Returns:\n
            An object of type [synapseclient.table.Column][]\n\n
        Example: Using this function\n
            Getting a column\n\n
                column = syn.getColumn(123)\n
        \"\"\"\n
        return Column(**self.restGET(Column.getURI(id)))\n\n
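    # Example (illustrative sketch): fetching the column definitions of an\n
    # existing table. \"syn123\" is a hypothetical placeholder.\n
    #\n
    #     import synapseclient\n
    #\n
    #     syn = synapseclient.login()\n
    #     schema = syn.get(\"syn123\")\n
    #     for column in syn.getColumns(schema):\n
    #         print(column.id, column.name, column.columnType)\n\n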
like \"AVG(Foo)\"\n int(header)\n yield self.getColumn(header)\n except ValueError:\n # ignore aggregate column\n pass\n elif isinstance(x, SchemaBase) or utils.is_synapse_id_str(x):\n for col in self.getTableColumns(x):\n yield col\n elif isinstance(x, str):\n uri = \"/column?prefix=\" + x\n for result in self._GET_paginated(uri, limit=limit, offset=offset):\n yield Column(**result)\n else:\n ValueError(\"Can't get columns for a %s\" % type(x))\n\n @tracer.start_as_current_span(\"Synapse::create_snapshot_version\")\n def create_snapshot_version(\n self,\n table: typing.Union[\n EntityViewSchema, Schema, str, SubmissionViewSchema, Dataset\n ],\n comment: str = None,\n label: str = None,\n activity: typing.Union[Activity, str] = None,\n wait: bool = True,\n ) -> int:\n \"\"\"Create a new Table Version, new View version, or new Dataset version.\n\n Arguments:\n table: The schema of the Table/View, or its ID.\n comment: Optional snapshot comment.\n label: Optional snapshot label.\n activity: Optional activity ID applied to snapshot version.\n wait: True if this method should return the snapshot version after waiting for any necessary\n asynchronous table updates to complete. If False this method will return\n as soon as any updates are initiated.\n\n Returns:\n The snapshot version number if wait=True, None if wait=False\n \"\"\"\n ent = self.get(id_of(table), downloadFile=False)\n if isinstance(ent, (EntityViewSchema, SubmissionViewSchema, Dataset)):\n result = self._async_table_update(\n table,\n create_snapshot=True,\n comment=comment,\n label=label,\n activity=activity,\n wait=wait,\n )\n elif isinstance(ent, Schema):\n result = self._create_table_snapshot(\n table,\n comment=comment,\n label=label,\n activity=activity,\n )\n else:\n raise ValueError(\n \"This function only accepts Synapse ids of Tables or Views\"\n )\n\n # for consistency we return nothing if wait=False since we can't\n # supply the snapshot version on an async table update without waiting\n return result[\"snapshotVersionNumber\"] if wait else None\n\n @tracer.start_as_current_span(\"Synapse::_create_table_snapshot\")\n def _create_table_snapshot(\n self,\n table: typing.Union[Schema, str],\n comment: str = None,\n label: str = None,\n activity: typing.Union[Activity, str] = None,\n ) -> dict:\n \"\"\"Creates Table snapshot\n\n :param table: The schema of the Table\n :param comment: Optional snapshot comment.\n :param label: Optional snapshot label.\n :param activity: Optional activity ID or activity instance applied to snapshot version.\n\n :return: Snapshot Response\n \"\"\"\n\n # check the activity id or object is provided\n activity_id = None\n if isinstance(activity, collections.abc.Mapping):\n if \"id\" not in activity:\n activity = self._saveActivity(activity)\n activity_id = activity[\"id\"]\n elif activity is not None:\n activity_id = str(activity)\n\n snapshot_body = {\n \"snapshotComment\": comment,\n \"snapshotLabel\": label,\n \"snapshotActivityId\": activity_id,\n }\n new_body = {\n key: value for key, value in snapshot_body.items() if value is not None\n }\n snapshot = self.restPOST(\n \"/entity/{}/table/snapshot\".format(id_of(table)), body=json.dumps(new_body)\n )\n return snapshot\n\n @tracer.start_as_current_span(\"Synapse::_async_table_update\")\n def _async_table_update(\n self,\n table: typing.Union[EntityViewSchema, Schema, str, SubmissionViewSchema],\n changes: typing.List[dict] = [],\n create_snapshot: bool = False,\n comment: str = None,\n label: str = None,\n activity: str = None,\n wait: bool 
= True,\n ) -> dict:\n \"\"\"Creates view updates and snapshots\n\n :param table: The schema of the EntityView or its ID.\n :param changes: Array of Table changes\n :param create_snapshot: Create snapshot\n :param comment: Optional snapshot comment.\n :param label: Optional snapshot label.\n :param activity: Optional activity ID applied to snapshot version.\n :param wait: True to wait for async table update to complete\n\n :return: Snapshot Response\n \"\"\"\n snapshot_options = {\n \"snapshotComment\": comment,\n \"snapshotLabel\": label,\n \"snapshotActivityId\": activity,\n }\n new_snapshot = {\n key: value for key, value in snapshot_options.items() if value is not None\n }\n table_update_body = {\n \"changes\": changes,\n \"createSnapshot\": create_snapshot,\n \"snapshotOptions\": new_snapshot,\n }\n\n uri = \"/entity/{}/table/transaction/async\".format(id_of(table))\n\n if wait:\n result = self._waitForAsync(uri, table_update_body)\n\n else:\n result = self.restPOST(\n \"{}/start\".format(uri), body=json.dumps(table_update_body)\n )\n\n return result\n\n @tracer.start_as_current_span(\"Synapse::getTableColumns\")\n def getTableColumns(self, table):\n \"\"\"\n Retrieve the column models used in the given table schema.\n\n Arguments:\n table: The schema of the Table whose columns are to be retrieved\n\n Yields:\n A Generator over the Table's [columns][synapseclient.table.Column]\n \"\"\"\n uri = \"/entity/{id}/column\".format(id=id_of(table))\n # The returned object type for this service, PaginatedColumnModels, is a misnomer.\n # This service always returns the full list of results so the pagination does not actually matter.\n for result in self.restGET(uri)[\"results\"]:\n yield Column(**result)\n\n @tracer.start_as_current_span(\"Synapse::tableQuery\")\n def tableQuery(self, query, resultsAs=\"csv\", **kwargs):\n \"\"\"\n Query a Synapse Table.\n\n You can receive query results either as a generator over rows or as a CSV file. For smallish tables, either\n method will work equally well. 
Use of a \"rowset\" generator allows rows to be processed one at a time and\n processing may be stopped before downloading the entire table.\n\n Optional keyword arguments differ for the two return types of `rowset` or `csv`\n\n Arguments:\n query: Query string in a [SQL-like syntax](https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html), for example: `\"SELECT * from syn12345\"`\n resultsAs: select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as sets of rows (\"rowset\")\n limit: (rowset only) Specify the maximum number of rows to be returned, defaults to None\n offset: (rowset only) Don't return the first n rows, defaults to None\n quoteCharacter: (csv only) default double quote\n escapeCharacter: (csv only) default backslash\n lineEnd: (csv only) defaults to os.linesep\n separator: (csv only) defaults to comma\n header: (csv only) True by default\n includeRowIdAndRowVersion: (csv only) True by default\n downloadLocation: (csv only) directory path to download the CSV file to\n\n\n Returns:\n A [TableQueryResult][synapseclient.table.TableQueryResult] or [CsvFileTable][synapseclient.table.CsvFileTable] object\n\n\n NOTE:\n When performing queries on frequently updated tables, the table can be inaccessible for a period leading\n to a timeout of the query. Since the results are guaranteed to eventually be returned you can change the\n max timeout by setting the table_query_timeout variable of the Synapse object:\n\n # Sets the max timeout to 5 minutes.\n syn.table_query_timeout = 300\n\n \"\"\"\n if resultsAs.lower() == \"rowset\":\n return TableQueryResult(self, query, **kwargs)\n elif resultsAs.lower() == \"csv\":\n # TODO: remove isConsistent because it has now been deprecated\n # from the backend\n if kwargs.get(\"isConsistent\") is not None:\n kwargs.pop(\"isConsistent\")\n return CsvFileTable.from_table_query(self, query, **kwargs)\n else:\n raise ValueError(\n \"Unknown return type requested from tableQuery: \" + str(resultsAs)\n )\n\n @tracer.start_as_current_span(\"Synapse::_queryTable\")\n def _queryTable(\n self, query, limit=None, offset=None, isConsistent=True, partMask=None\n ):\n \"\"\"\n Query a table and return the first page of results as a `QueryResultBundle \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/QueryResultBundle.html>`_.\n If the result contains a *nextPageToken*, following pages a retrieved by calling :py:meth:`~._queryTableNext`.\n\n :param partMask: Optional, default all. 
The 'partMask' is a bit field for requesting\n different elements in the resulting JSON bundle.\n Query Results (queryResults) = 0x1\n Query Count (queryCount) = 0x2\n Select Columns (selectColumns) = 0x4\n Max Rows Per Page (maxRowsPerPage) = 0x8\n \"\"\"\n\n # See: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/QueryBundleRequest.html\n query_bundle_request = {\n \"concreteType\": \"org.sagebionetworks.repo.model.table.QueryBundleRequest\",\n \"query\": {\n \"sql\": query,\n \"isConsistent\": isConsistent,\n \"includeEntityEtag\": True,\n },\n }\n\n if partMask:\n query_bundle_request[\"partMask\"] = partMask\n if limit is not None:\n query_bundle_request[\"query\"][\"limit\"] = limit\n if offset is not None:\n query_bundle_request[\"query\"][\"offset\"] = offset\n query_bundle_request[\"query\"][\"isConsistent\"] = isConsistent\n\n uri = \"/entity/{id}/table/query/async\".format(\n id=extract_synapse_id_from_query(query)\n )\n\n return self._waitForAsync(uri=uri, request=query_bundle_request)\n\n @tracer.start_as_current_span(\"Synapse::_queryTableNext\")\n def _queryTableNext(self, nextPageToken, tableId):\n uri = \"/entity/{id}/table/query/nextPage/async\".format(id=tableId)\n return self._waitForAsync(uri=uri, request=nextPageToken)\n\n @tracer.start_as_current_span(\"Synapse::_uploadCsv\")\n def _uploadCsv(\n self,\n filepath,\n schema,\n updateEtag=None,\n quoteCharacter='\"',\n escapeCharacter=\"\\\\\",\n lineEnd=os.linesep,\n separator=\",\",\n header=True,\n linesToSkip=0,\n ):\n \"\"\"\n Send an `UploadToTableRequest \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/UploadToTableRequest.html>`_ to Synapse.\n\n :param filepath: Path of a `CSV <https://en.wikipedia.org/wiki/Comma-separated_values>`_ file.\n :param schema: A table entity or its Synapse ID.\n :param updateEtag: Any RowSet returned from Synapse will contain the current etag of the change set.\n To update any rows from a RowSet the etag must be provided with the POST.\n\n :returns: `UploadToTableResult \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/UploadToTableResult.html>`_\n \"\"\"\n\n fileHandleId = multipart_upload_file(self, filepath, content_type=\"text/csv\")\n\n uploadRequest = {\n \"concreteType\": \"org.sagebionetworks.repo.model.table.UploadToTableRequest\",\n \"csvTableDescriptor\": {\n \"isFirstLineHeader\": header,\n \"quoteCharacter\": quoteCharacter,\n \"escapeCharacter\": escapeCharacter,\n \"lineEnd\": lineEnd,\n \"separator\": separator,\n },\n \"linesToSkip\": linesToSkip,\n \"tableId\": id_of(schema),\n \"uploadFileHandleId\": fileHandleId,\n }\n\n if updateEtag:\n uploadRequest[\"updateEtag\"] = updateEtag\n\n response = self._async_table_update(schema, changes=[uploadRequest], wait=True)\n self._check_table_transaction_response(response)\n\n return response\n\n @tracer.start_as_current_span(\"Synapse::_check_table_transaction_response\")\n def _check_table_transaction_response(self, response):\n for result in response[\"results\"]:\n result_type = result[\"concreteType\"]\n\n if result_type in {\n concrete_types.ROW_REFERENCE_SET_RESULTS,\n concrete_types.TABLE_SCHEMA_CHANGE_RESPONSE,\n concrete_types.UPLOAD_TO_TABLE_RESULT,\n }:\n # if these fail, we would have gotten an HttpError before the results came back\n pass\n elif result_type == concrete_types.ENTITY_UPDATE_RESULTS:\n # TODO: output full response to error file when the logging JIRA issue gets pulled in\n successful_updates = []\n failed_updates = 
[]\n for update_result in result[\"updateResults\"]:\n failure_code = update_result.get(\"failureCode\")\n failure_message = update_result.get(\"failureMessage\")\n entity_id = update_result.get(\"entityId\")\n if failure_code or failure_message:\n failed_updates.append(update_result)\n else:\n successful_updates.append(entity_id)\n\n if failed_updates:\n raise SynapseError(\n \"Not all of the entities were updated.\"\n \" Successful updates: %s. Failed updates: %s\"\n % (successful_updates, failed_updates)\n )\n\n else:\n warnings.warn(\n \"Unexpected result from a table transaction of type [%s].\"\n \" Please check the result to make sure it is correct. %s\"\n % (result_type, result)\n )\n\n @tracer.start_as_current_span(\"Synapse::_queryTableCsv\")\n def _queryTableCsv(\n self,\n query,\n quoteCharacter='\"',\n escapeCharacter=\"\\\\\",\n lineEnd=os.linesep,\n separator=\",\",\n header=True,\n includeRowIdAndRowVersion=True,\n downloadLocation=None,\n ):\n \"\"\"\n Query a Synapse Table and download a CSV file containing the results.\n\n Sends a `DownloadFromTableRequest \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/DownloadFromTableRequest.html>`_ to Synapse.\n\n :return: a tuple containing a `DownloadFromTableResult \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/DownloadFromTableResult.html>`_\n and the path of the downloaded CSV file\n\n The DownloadFromTableResult object contains these fields:\n * headers: ARRAY<STRING>, The list of ColumnModel IDs that describes the rows of this set.\n * resultsFileHandleId: STRING, The resulting file handle ID can be used to download the CSV file created by\n this query.\n * concreteType: STRING\n * etag: STRING, Any RowSet returned from Synapse will contain the current etag of the change\n set.\n To update any rows from a RowSet the etag must be provided with the POST.\n * tableId: STRING, The ID of the table identified in the from clause of the table query.\n \"\"\"\n\n download_from_table_request = {\n \"concreteType\": \"org.sagebionetworks.repo.model.table.DownloadFromTableRequest\",\n \"csvTableDescriptor\": {\n \"isFirstLineHeader\": header,\n \"quoteCharacter\": quoteCharacter,\n \"escapeCharacter\": escapeCharacter,\n \"lineEnd\": lineEnd,\n \"separator\": separator,\n },\n \"sql\": query,\n \"writeHeader\": header,\n \"includeRowIdAndRowVersion\": includeRowIdAndRowVersion,\n \"includeEntityEtag\": True,\n }\n\n uri = \"/entity/{id}/table/download/csv/async\".format(\n id=extract_synapse_id_from_query(query)\n )\n download_from_table_result = self._waitForAsync(\n uri=uri, request=download_from_table_request\n )\n file_handle_id = download_from_table_result[\"resultsFileHandleId\"]\n cached_file_path = self.cache.get(\n file_handle_id=file_handle_id, path=downloadLocation\n )\n if cached_file_path is not None:\n return download_from_table_result, cached_file_path\n\n if downloadLocation:\n download_dir = self._ensure_download_location_is_directory(downloadLocation)\n else:\n download_dir = self.cache.get_cache_dir(file_handle_id)\n\n os.makedirs(download_dir, exist_ok=True)\n filename = f\"SYNAPSE_TABLE_QUERY_{file_handle_id}.csv\"\n path = self._downloadFileHandle(\n file_handle_id,\n extract_synapse_id_from_query(query),\n \"TableEntity\",\n os.path.join(download_dir, filename),\n )\n\n return download_from_table_result, path\n\n # This is redundant with syn.store(Column(...)) and will be removed unless people prefer this method.\n @tracer.start_as_current_span(\"Synapse::createColumn\")\n def createColumn(\n self, 
name, columnType, maximumSize=None, defaultValue=None, enumValues=None\n ):\n columnModel = Column(\n name=name,\n columnType=columnType,\n maximumSize=maximumSize,\n defaultValue=defaultValue,\n enumValues=enumValues,\n )\n return Column(**self.restPOST(\"/column\", json.dumps(columnModel)))\n\n @tracer.start_as_current_span(\"Synapse::createColumns\")\n def createColumns(self, columns: typing.List[Column]) -> typing.List[Column]:\n \"\"\"\n Creates a batch of [synapseclient.table.Column][]'s within a single request.\n\n Arguments:\n columns: A list of [synapseclient.table.Column][]'s\n\n Returns:\n A list of [synapseclient.table.Column][]'s that have been created in Synapse\n \"\"\"\n request_body = {\n \"concreteType\": \"org.sagebionetworks.repo.model.ListWrapper\",\n \"list\": list(columns),\n }\n response = self.restPOST(\"/column/batch\", json.dumps(request_body))\n return [Column(**col) for col in response[\"list\"]]\n\n @tracer.start_as_current_span(\"Synapse::_getColumnByName\")\n def _getColumnByName(self, schema, column_name):\n \"\"\"\n Given a schema and a column name, get the corresponding py:class:`Column` object.\n \"\"\"\n for column in self.getColumns(schema):\n if column.name == column_name:\n return column\n return None\n\n @tracer.start_as_current_span(\"Synapse::downloadTableColumns\")\n def downloadTableColumns(self, table, columns, downloadLocation=None, **kwargs):\n \"\"\"\n Bulk download of table-associated files.\n\n Arguments:\n table: Table query result\n columns: A list of column names as strings\n downloadLocation: Directory into which to download the files\n\n Returns:\n A dictionary from file handle ID to path in the local file system.\n\n For example, consider a Synapse table whose ID is \"syn12345\" with two columns of type FILEHANDLEID named 'foo'\n and 'bar'. The associated files are JSON encoded, so we might retrieve the files from Synapse and load the\n second hundred of those rows as shown here:\n\n import json\n\n data = {}\n results = syn.tableQuery('SELECT * FROM syn12345 LIMIT 100 OFFSET 100')\n file_map = syn.downloadTableColumns(results, ['foo', 'bar'])\n\n for file_handle_id, path in file_map.items():\n with open(path) as f:\n data[file_handle_id] = json.loads(f.read())\n\n \"\"\"\n\n RETRIABLE_FAILURE_CODES = [\"EXCEEDS_SIZE_LIMIT\"]\n MAX_DOWNLOAD_TRIES = 100\n max_files_per_request = kwargs.get(\"max_files_per_request\", 2500)\n # Rowset tableQuery result not allowed\n if isinstance(table, TableQueryResult):\n raise ValueError(\n \"downloadTableColumns doesn't work with rowsets. 
Please use default tableQuery settings.\"\n )\n if isinstance(columns, str):\n columns = [columns]\n if not isinstance(columns, collections.abc.Iterable):\n raise TypeError(\"Columns parameter requires a list of column names\")\n\n (\n file_handle_associations,\n file_handle_to_path_map,\n ) = self._build_table_download_file_handle_list(\n table,\n columns,\n downloadLocation,\n )\n\n self.logger.info(\n \"Downloading %d files, %d cached locally\"\n % (len(file_handle_associations), len(file_handle_to_path_map))\n )\n\n permanent_failures = collections.OrderedDict()\n\n attempts = 0\n while len(file_handle_associations) > 0 and attempts < MAX_DOWNLOAD_TRIES:\n attempts += 1\n\n file_handle_associations_batch = file_handle_associations[\n :max_files_per_request\n ]\n\n # ------------------------------------------------------------\n # call async service to build zip file\n # ------------------------------------------------------------\n\n # returns a BulkFileDownloadResponse:\n # https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/file/BulkFileDownloadResponse.html\n request = dict(\n concreteType=\"org.sagebionetworks.repo.model.file.BulkFileDownloadRequest\",\n requestedFiles=file_handle_associations_batch,\n )\n response = self._waitForAsync(\n uri=\"/file/bulk/async\",\n request=request,\n endpoint=self.fileHandleEndpoint,\n )\n\n # ------------------------------------------------------------\n # download zip file\n # ------------------------------------------------------------\n\n temp_dir = tempfile.mkdtemp()\n zipfilepath = os.path.join(temp_dir, \"table_file_download.zip\")\n try:\n zipfilepath = self._downloadFileHandle(\n response[\"resultZipFileHandleId\"],\n table.tableId,\n \"TableEntity\",\n zipfilepath,\n )\n # TODO handle case when no zip file is returned\n # TODO test case when we give it partial or all bad file handles\n # TODO test case with deleted fileHandleID\n # TODO return null for permanent failures\n\n # ------------------------------------------------------------\n # unzip into cache\n # ------------------------------------------------------------\n\n if downloadLocation:\n download_dir = self._ensure_download_location_is_directory(\n downloadLocation\n )\n\n with zipfile.ZipFile(zipfilepath) as zf:\n # the directory structure within the zip follows that of the cache:\n # {fileHandleId modulo 1000}/{fileHandleId}/{fileName}\n for summary in response[\"fileSummary\"]:\n if summary[\"status\"] == \"SUCCESS\":\n if not downloadLocation:\n download_dir = self.cache.get_cache_dir(\n summary[\"fileHandleId\"]\n )\n\n filepath = extract_zip_file_to_directory(\n zf, summary[\"zipEntryName\"], download_dir\n )\n self.cache.add(summary[\"fileHandleId\"], filepath)\n file_handle_to_path_map[summary[\"fileHandleId\"]] = filepath\n elif summary[\"failureCode\"] not in RETRIABLE_FAILURE_CODES:\n permanent_failures[summary[\"fileHandleId\"]] = summary\n finally:\n if os.path.exists(zipfilepath):\n os.remove(zipfilepath)\n\n # Do we have remaining files to download?\n file_handle_associations = [\n fha\n for fha in file_handle_associations\n if fha[\"fileHandleId\"] not in file_handle_to_path_map\n and fha[\"fileHandleId\"] not in permanent_failures.keys()\n ]\n\n # TODO if there are files we still haven't downloaded\n\n return file_handle_to_path_map\n\n @tracer.start_as_current_span(\"Synapse::_build_table_download_file_handle_list\")\n def _build_table_download_file_handle_list(self, table, columns, downloadLocation):\n # 
------------------------------------------------------------\n # build list of file handles to download\n # ------------------------------------------------------------\n cols_not_found = [\n c for c in columns if c not in [h.name for h in table.headers]\n ]\n if len(cols_not_found) > 0:\n raise ValueError(\n \"Columns not found: \"\n + \", \".join('\"' + col + '\"' for col in cols_not_found)\n )\n col_indices = [i for i, h in enumerate(table.headers) if h.name in columns]\n # see: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/file/BulkFileDownloadRequest.html\n file_handle_associations = []\n file_handle_to_path_map = collections.OrderedDict()\n seen_file_handle_ids = (\n set()\n ) # ensure not sending duplicate requests for the same FileHandle IDs\n for row in table:\n for col_index in col_indices:\n file_handle_id = row[col_index]\n if is_integer(file_handle_id):\n path_to_cached_file = self.cache.get(\n file_handle_id, path=downloadLocation\n )\n if path_to_cached_file:\n file_handle_to_path_map[file_handle_id] = path_to_cached_file\n elif file_handle_id not in seen_file_handle_ids:\n file_handle_associations.append(\n dict(\n associateObjectType=\"TableEntity\",\n fileHandleId=file_handle_id,\n associateObjectId=table.tableId,\n )\n )\n seen_file_handle_ids.add(file_handle_id)\n else:\n warnings.warn(\"Weird file handle: %s\" % file_handle_id)\n return file_handle_associations, file_handle_to_path_map\n\n @tracer.start_as_current_span(\"Synapse::_get_default_view_columns\")\n def _get_default_view_columns(self, view_type, view_type_mask=None):\n \"\"\"Get default view columns\"\"\"\n uri = f\"/column/tableview/defaults?viewEntityType={view_type}\"\n if view_type_mask:\n uri += f\"&viewTypeMask={view_type_mask}\"\n return [Column(**col) for col in self.restGET(uri)[\"list\"]]\n\n @tracer.start_as_current_span(\"Synapse::_get_annotation_view_columns\")\n def _get_annotation_view_columns(\n self, scope_ids: list, view_type: str, view_type_mask: str = None\n ) -> list:\n \"\"\"Get all the columns of a submission view or entity view based on existing annotations\n\n :param scope_ids: List of Evaluation Queue or Project/Folder Ids\n :param view_type: submissionview or entityview\n :param view_type_mask: Bit mask representing the types to include in the view.\n\n :returns: list of columns\n \"\"\"\n columns = []\n next_page_token = None\n while True:\n view_scope = {\n \"concreteType\": \"org.sagebionetworks.repo.model.table.ViewColumnModelRequest\",\n \"viewScope\": {\n \"scope\": scope_ids,\n \"viewEntityType\": view_type,\n \"viewTypeMask\": view_type_mask,\n },\n }\n if next_page_token:\n view_scope[\"nextPageToken\"] = next_page_token\n response = self._waitForAsync(\n uri=\"/column/view/scope/async\", request=view_scope\n )\n columns.extend(Column(**column) for column in response[\"results\"])\n next_page_token = response.get(\"nextPageToken\")\n if next_page_token is None:\n break\n return columns\n\n ############################################################\n # CRUD for Entities (properties) #\n ############################################################\n\n @tracer.start_as_current_span(\"Synapse::_getEntity\")\n def _getEntity(self, entity, version=None):\n \"\"\"\n Get an entity from Synapse.\n\n :param entity: A Synapse ID, a dictionary representing an Entity, or a Synapse Entity object\n :param version: The version number to fetch\n\n :returns: A dictionary containing an Entity's properties\n \"\"\"\n\n uri = \"/entity/\" + id_of(entity)\n if version:\n uri += 
\"/version/%d\" % version\n return self.restGET(uri)\n\n @tracer.start_as_current_span(\"Synapse::_createEntity\")\n def _createEntity(self, entity):\n \"\"\"\n Create a new entity in Synapse.\n\n :param entity: A dictionary representing an Entity or a Synapse Entity object\n\n :returns: A dictionary containing an Entity's properties\n \"\"\"\n\n return self.restPOST(uri=\"/entity\", body=json.dumps(get_properties(entity)))\n\n @tracer.start_as_current_span(\"Synapse::_updateEntity\")\n def _updateEntity(self, entity, incrementVersion=True, versionLabel=None):\n \"\"\"\n Update an existing entity in Synapse.\n\n :param entity: A dictionary representing an Entity or a Synapse Entity object\n :param incrementVersion: whether to increment the entity version (if Versionable)\n :param versionLabel: a label for the entity version (if Versionable)\n\n\n :returns: A dictionary containing an Entity's properties\n \"\"\"\n\n uri = \"/entity/%s\" % id_of(entity)\n\n params = {}\n if is_versionable(entity):\n if versionLabel:\n # a versionLabel implicitly implies incrementing\n incrementVersion = True\n elif incrementVersion and \"versionNumber\" in entity:\n versionLabel = str(entity[\"versionNumber\"] + 1)\n\n if incrementVersion:\n entity[\"versionLabel\"] = versionLabel\n params[\"newVersion\"] = \"true\"\n\n return self.restPUT(uri, body=json.dumps(get_properties(entity)), params=params)\n\n @tracer.start_as_current_span(\"Synapse::findEntityId\")\n def findEntityId(self, name, parent=None):\n \"\"\"\n Find an Entity given its name and parent.\n\n Arguments:\n name: Name of the entity to find\n parent: An Entity object or the Id of an entity as a string. Omit if searching for a Project by name\n\n Returns:\n The Entity ID or None if not found\n \"\"\"\n # when we want to search for a project by name. 
set parentId to None instead of ROOT_ENTITY\n entity_lookup_request = {\n \"parentId\": id_of(parent) if parent else None,\n \"entityName\": name,\n }\n try:\n return self.restPOST(\n \"/entity/child\", body=json.dumps(entity_lookup_request)\n ).get(\"id\")\n except SynapseHTTPError as e:\n if (\n e.response.status_code == 404\n ): # a 404 error is raised if the entity does not exist\n return None\n raise\n\n ############################################################\n # Send Message #\n ############################################################\n @tracer.start_as_current_span(\"Synapse::sendMessage\")\n def sendMessage(\n self, userIds, messageSubject, messageBody, contentType=\"text/plain\"\n ):\n \"\"\"\n Send a message via Synapse.\n\n Arguments:\n userIds: A list of user IDs to which the message is to be sent\n messageSubject: The subject for the message\n messageBody: The body of the message\n contentType: optional contentType of message body (default=\"text/plain\").\n Should be one of \"text/plain\" or \"text/html\"\n\n Returns:\n The metadata of the created message\n \"\"\"\n\n fileHandleId = multipart_upload_string(\n self, messageBody, content_type=contentType\n )\n message = dict(\n recipients=userIds, subject=messageSubject, fileHandleId=fileHandleId\n )\n return self.restPOST(uri=\"/message\", body=json.dumps(message))\n\n ############################################################\n # Low level Rest calls #\n ############################################################\n\n def _generate_headers(self, headers=None):\n \"\"\"Generate headers (auth headers produced separately by credentials object)\"\"\"\n\n if headers is None:\n headers = dict(self.default_headers)\n headers.update(synapseclient.USER_AGENT)\n\n return headers\n\n def _handle_synapse_http_error(self, response):\n \"\"\"Raise errors as appropriate for returned Synapse http status codes\"\"\"\n\n try:\n exceptions._raise_for_status(response, verbose=self.debug)\n except exceptions.SynapseHTTPError as ex:\n # if we get an unauthenticated or forbidden error and the user is not logged in\n # then we raise it as an authentication error.\n # we can't know for certain that logging in to their particular account will grant them\n # access to this resource but more than likely it's the cause of this error.\n if response.status_code in (401, 403) and not self.credentials:\n raise SynapseAuthenticationError(\n \"You are not logged in and do not have access to a requested resource.\"\n ) from ex\n\n raise\n\n def _rest_call(\n self,\n method,\n uri,\n data,\n endpoint,\n headers,\n retryPolicy,\n requests_session,\n **kwargs,\n ):\n uri, headers = self._build_uri_and_headers(\n uri, endpoint=endpoint, headers=headers\n )\n\n retryPolicy = self._build_retry_policy(retryPolicy)\n requests_session = requests_session or self._requests_session\n\n auth = kwargs.pop(\"auth\", self.credentials)\n requests_method_fn = getattr(requests_session, method)\n response = with_retry(\n lambda: requests_method_fn(\n uri,\n data=data,\n headers=headers,\n auth=auth,\n **kwargs,\n ),\n verbose=self.debug,\n **retryPolicy,\n )\n\n self._handle_synapse_http_error(response)\n return response\n\n @tracer.start_as_current_span(\"Synapse::restGET\")\n def restGET(\n self,\n uri,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n ):\n \"\"\"\n Sends an HTTP GET request to the Synapse server.\n\n Arguments:\n uri: URI on which the GET is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n 
headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: An external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns:\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"get\", uri, None, endpoint, headers, retryPolicy, requests_session, **kwargs\n )\n return self._return_rest_body(response)\n\n @tracer.start_as_current_span(\"Synapse::restPOST\")\n def restPOST(\n self,\n uri,\n body,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n ):\n \"\"\"\n Sends an HTTP POST request to the Synapse server.\n\n Arguments:\n uri: URI on which the POST is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n body: The payload to be delivered\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: An external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns:\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"post\",\n uri,\n body,\n endpoint,\n headers,\n retryPolicy,\n requests_session,\n **kwargs,\n )\n return self._return_rest_body(response)\n\n @tracer.start_as_current_span(\"Synapse::restPUT\")\n def restPUT(\n self,\n uri,\n body=None,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n ):\n \"\"\"\n Sends an HTTP PUT request to the Synapse server.\n\n Arguments:\n uri: URI on which the PUT is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n body: The payload to be delivered\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: An external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns:\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"put\", uri, body, endpoint, headers, retryPolicy, requests_session, **kwargs\n )\n return self._return_rest_body(response)\n\n @tracer.start_as_current_span(\"Synapse::restDELETE\")\n def restDELETE(\n self,\n uri,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n ):\n \"\"\"\n Sends an HTTP DELETE request to the Synapse server.\n\n Arguments:\n uri: URI of resource to be deleted\n endpoint: Server endpoint, defaults to self.repoEndpoint\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: An external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n self._rest_call(\n \"delete\",\n uri,\n None,\n endpoint,\n headers,\n retryPolicy,\n requests_session,\n **kwargs,\n )\n\n def _build_uri_and_headers(self, uri, endpoint=None, headers=None):\n \"\"\"Returns a tuple of the URI and headers to request with.\"\"\"\n\n if endpoint is None:\n endpoint = self.repoEndpoint\n\n 
trace.get_current_span().set_attributes({\"server.address\": endpoint})\n\n # Check to see if the URI is incomplete (i.e. a Synapse URL)\n # In that case, append a Synapse endpoint to the URI\n parsedURL = urllib_urlparse.urlparse(uri)\n if parsedURL.netloc == \"\":\n uri = endpoint + uri\n\n if headers is None:\n headers = self._generate_headers()\n return uri, headers\n\n def _build_retry_policy(self, retryPolicy={}):\n \"\"\"Returns a retry policy to be passed onto _with_retry.\"\"\"\n\n defaults = dict(STANDARD_RETRY_PARAMS)\n defaults.update(retryPolicy)\n return defaults\n\n def _return_rest_body(self, response):\n \"\"\"Returns either a dictionary or a string depending on the 'content-type' of the response.\"\"\"\n trace.get_current_span().set_attributes(\n {\"http.response.status_code\": response.status_code}\n )\n if is_json(response.headers.get(\"content-type\", None)):\n return response.json()\n return response.text\n
"},{"location":"reference/client/#synapseclient.Synapse-functions","title":"Functions","text":""},{"location":"reference/client/#synapseclient.Synapse.login","title":"login(email=None, password=None, apiKey=None, sessionToken=None, rememberMe=False, silent=False, forced=False, authToken=None)
","text":"Valid combinations of login() arguments:
- email/username and password\n- email/username and apiKey (Base64 encoded string)\n- authToken\n- sessionToken (**DEPRECATED**)\n
If no login arguments are provided or only username is provided, login() will attempt to log in using information from these sources (in order of preference):
1. User's personal access token from the environment variable: SYNAPSE_AUTH_TOKEN
2. .synapseConfig file (in user home folder unless configured otherwise)
3. Cached credentials from a previous login() where rememberMe=True was passed as a parameter
email
Synapse user name (or an email address associated with a Synapse account)
TYPE: str
DEFAULT: None
password
!!WILL BE DEPRECATED!! password. Please use authToken (Synapse personal access token)
TYPE: str
DEFAULT: None
apiKey
!!WILL BE DEPRECATED!! Base64 encoded Synapse API key
TYPE: str
DEFAULT: None
sessionToken
!!DEPRECATED FIELD!! User's current session token. Using this field will ignore the following fields: email, password, apiKey
TYPE: str
DEFAULT: None
rememberMe
Whether the authentication information should be cached in your operating system's credential storage.
TYPE: bool
DEFAULT: False
authToken
A bearer authorization token, e.g. a personal access token, can be used in lieu of a password or apiKey.
TYPE: str
DEFAULT: None
silent
Defaults to False. Suppresses the \"Welcome ...!\" message.
TYPE: bool
DEFAULT: False
forced
Defaults to False. Bypass the credential cache if set.
TYPE: bool
DEFAULT: False
GNOME Keyring (recommended) or KWallet should be installed for credential storage on Linux systems. If neither is installed and set up, credentials will be stored in a plain-text file with read and write permissions for the current user only (chmod 600). On Windows and Mac OS, a default credential storage exists, so it will be preferred over the plain-text file. To install GNOME Keyring on Ubuntu:
sudo apt-get install gnome-keyring\n\nsudo apt-get install python-dbus #(for Python 2 installed via apt-get)\nOR\nsudo apt-get install python3-dbus #(for Python 3 installed via apt-get)\nOR\nsudo apt-get install libdbus-glib-1-dev #(for custom installation of Python or virtualenv)\nsudo pip install dbus-python #(may take a while to compile C code)\n
If you are on a headless Linux session (e.g. connecting via SSH), please run the following commands before running your Python session:
dbus-run-session -- bash #(replace 'bash' with 'sh' if bash is unavailable)\necho -n \"REPLACE_WITH_YOUR_KEYRING_PASSWORD\"|gnome-keyring-daemon -- unlock\n
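For fully non-interactive sessions, a minimal sketch (assuming a valid personal access token has been exported as SYNAPSE_AUTH_TOKEN, the first source in the preference order above) is simply to call login() with no arguments:
import synapseclient\n# assumes SYNAPSE_AUTH_TOKEN is set in the environment\nsyn = synapseclient.Synapse()\nsyn.login()\n#> Welcome, Me!\n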
Logging in using an auth token:
syn.login(authToken=\"authtoken\")\n#> Welcome, Me!\n
Using a username/password:
syn.login('my-username', 'secret-password', rememberMe=True)\n#> Welcome, Me!\n
After logging in with the rememberMe flag set, an API key will be cached and used to authenticate for future logins:
syn.login()\n#> Welcome, Me!\n
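If cached credentials should be ignored, the forced parameter documented above bypasses the credential cache; a small sketch (the token value is a placeholder):
syn.login(authToken=\"authtoken\", forced=True)\n#> Welcome, Me!\n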
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::login\")\ndef login(\n self,\n email: str = None,\n password: str = None,\n apiKey: str = None,\n sessionToken: str = None,\n rememberMe: bool = False,\n silent: bool = False,\n forced: bool = False,\n authToken: str = None,\n):\n \"\"\"\n Valid combinations of login() arguments:\n\n - email/username and password\n - email/username and apiKey (Base64 encoded string)\n - authToken\n - sessionToken (**DEPRECATED**)\n\n If no login arguments are provided or only username is provided, login() will attempt to log in using\n information from these sources (in order of preference):\n\n 1. User's personal access token from the environment variable: SYNAPSE_AUTH_TOKEN\n 2. .synapseConfig file (in user home folder unless configured otherwise)\n 3. Cached credentials from a previous `login()` where `rememberMe=True` was passed as a parameter\n\n Arguments:\n email: Synapse user name (or an email address associated with a Synapse account)\n password: **!!WILL BE DEPRECATED!!** password. Please use authToken (Synapse personal access token)\n apiKey: **!!WILL BE DEPRECATED!!** Base64 encoded Synapse API key\n sessionToken: **!!DEPRECATED FIELD!!** User's current session token. Using this field will ignore the\n following fields: email, password, apiKey\n rememberMe: Whether the authentication information should be cached in your operating system's\n credential storage.\n authToken: A bearer authorization token, e.g. a personal access token, can be used in lieu of a\n password or apiKey.\n silent: Defaults to False. Suppresses the \"Welcome ...!\" message.\n forced: Defaults to False. Bypass the credential cache if set.\n\n **GNOME Keyring** (recommended) or **KWallet** should be installed for credential storage on\n **Linux** systems.\n If neither is installed and set up, credentials will be stored in a plain-text file with read and write permissions for\n the current user only (chmod 600).\n On Windows and Mac OS, a default credential storage exists, so it will be preferred over the plain-text file.\n To install GNOME Keyring on Ubuntu:\n\n sudo apt-get install gnome-keyring\n\n sudo apt-get install python-dbus #(for Python 2 installed via apt-get)\n OR\n sudo apt-get install python3-dbus #(for Python 3 installed via apt-get)\n OR\n sudo apt-get install libdbus-glib-1-dev #(for custom installation of Python or virtualenv)\n sudo pip install dbus-python #(may take a while to compile C code)\n\n If you are on a headless Linux session (e.g. 
connecting via SSH), please run the following commands before\n running your Python session:\n\n dbus-run-session -- bash #(replace 'bash' with 'sh' if bash is unavailable)\n echo -n \"REPLACE_WITH_YOUR_KEYRING_PASSWORD\"|gnome-keyring-daemon -- unlock\n\n Example: Logging in\n Using an auth token:\n\n syn.login(authToken=\"authtoken\")\n #> Welcome, Me!\n\n Using a username/password:\n\n syn.login('my-username', 'secret-password', rememberMe=True)\n #> Welcome, Me!\n\n After logging in with the *rememberMe* flag set, an API key will be cached and\n used to authenticate for future logins:\n\n syn.login()\n #> Welcome, Me!\n\n \"\"\"\n # Note: the order of the logic below reflects the ordering in the docstring above.\n\n # Check version before logging in\n if not self.skip_checks:\n version_check()\n\n # Make sure to invalidate the existing session\n self.logout()\n\n credential_provider_chain = get_default_credential_chain()\n # TODO: remove deprecated sessionToken when we move to a different solution\n self.credentials = credential_provider_chain.get_credentials(\n self,\n UserLoginArgs(\n email,\n password,\n apiKey,\n forced,\n sessionToken,\n authToken,\n ),\n )\n\n # Final check on login success\n if not self.credentials:\n raise SynapseNoCredentialsError(\"No credentials provided.\")\n\n # Save the API key in the cache\n if rememberMe:\n message = (\n \"The rememberMe parameter will be deprecated by early 2024. Please use the ~/.synapseConfig \"\n \"or SYNAPSE_AUTH_TOKEN environmental variable to set up your Synapse connection.\"\n )\n self.logger.warning(message)\n delete_stored_credentials(self.credentials.username)\n self.credentials.store_to_keyring()\n cached_sessions.set_most_recent_user(self.credentials.username)\n\n if not silent:\n profile = self.getUserProfile()\n # TODO-PY3: in Python2, do we need to ensure that this is encoded in utf-8\n self.logger.info(\n \"Welcome, %s!\\n\"\n % (\n profile[\"displayName\"]\n if \"displayName\" in profile\n else self.credentials.username\n )\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.logout","title":"logout(forgetMe=False)
","text":"Removes authentication information from the Synapse client.
PARAMETER DESCRIPTIONforgetMe
Set as True to clear any local storage of authentication information. See the flag \"rememberMe\" in synapseclient.Synapse.login
TYPE: bool
DEFAULT: False
None
Source code insynapseclient/client.py
def logout(self, forgetMe: bool = False):\n \"\"\"\n Removes authentication information from the Synapse client.\n\n Arguments:\n forgetMe: Set as True to clear any local storage of authentication information.\n See the flag \"rememberMe\" in [synapseclient.Synapse.login][]\n\n Returns:\n None\n \"\"\"\n # Delete the user's API key from the cache\n if forgetMe and self.credentials:\n self.credentials.delete_from_keyring()\n\n self.credentials = None\n
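A minimal usage sketch; passing forgetMe=True also clears credentials that login(rememberMe=True) stored locally:
syn.logout(forgetMe=True)\n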
"},{"location":"reference/client/#synapseclient.Synapse.get","title":"get(entity, **kwargs)
","text":"Gets a Synapse entity from the repository service.
PARAMETER DESCRIPTIONentity
A Synapse ID, a Synapse Entity object, a plain dictionary in which 'id' maps to a Synapse ID or a local file that is stored in Synapse (found by the file MD5)
version
The specific version to get. Defaults to the most recent version.
downloadFile
Whether associated files(s) should be downloaded. Defaults to True
downloadLocation
Directory where to download the Synapse File Entity. Defaults to the local cache.
followLink
Whether the link returns the target Entity. Defaults to False
ifcollision
Determines how to handle file collisions. May be \"overwrite.local\", \"keep.local\", or \"keep.both\". Defaults to \"keep.both\".
limitSearch
A Synapse ID used to limit the search in Synapse if entity is specified as a local file. That is, if the file is stored in multiple locations in Synapse, only the ones in the specified folder/project will be returned.
RETURNS DESCRIPTION
A new Synapse Entity object of the appropriate type.
Using this functionDownload file into cache
entity = syn.get('syn1906479')\nprint(entity.name)\nprint(entity.path)\n
Download file into current working directory
entity = syn.get('syn1906479', downloadLocation='.')\nprint(entity.name)\nprint(entity.path)\n
Determine the provenance of a locally stored file as indicated in Synapse
entity = syn.get('/path/to/file.txt', limitSearch='syn12312')\nprint(syn.getProvenance(entity))\n
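Because get() also accepts a version argument (documented above), a hedged sketch of fetching a specific earlier version of the same example entity without downloading the file:
entity = syn.get('syn1906479', version=1, downloadFile=False)\nprint(entity.versionNumber)\n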
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get\")\ndef get(self, entity, **kwargs):\n \"\"\"\n Gets a Synapse entity from the repository service.\n\n Arguments:\n entity: A Synapse ID, a Synapse Entity object, a plain dictionary in which 'id' maps to a\n Synapse ID or a local file that is stored in Synapse (found by the file MD5)\n version: The specific version to get.\n Defaults to the most recent version.\n downloadFile: Whether associated files(s) should be downloaded.\n Defaults to True\n downloadLocation: Directory where to download the Synapse File Entity.\n Defaults to the local cache.\n followLink: Whether the link returns the target Entity.\n Defaults to False\n ifcollision: Determines how to handle file collisions.\n May be \"overwrite.local\", \"keep.local\", or \"keep.both\".\n Defaults to \"keep.both\".\n limitSearch: A Synanpse ID used to limit the search in Synapse if entity is specified as a local\n file. That is, if the file is stored in multiple locations in Synapse only the ones\n in the specified folder/project will be returned.\n\n Returns:\n A new Synapse Entity object of the appropriate type.\n\n Example: Using this function\n Download file into cache\n\n entity = syn.get('syn1906479')\n print(entity.name)\n print(entity.path)\n\n Download file into current working directory\n\n entity = syn.get('syn1906479', downloadLocation='.')\n print(entity.name)\n print(entity.path)\n\n Determine the provenance of a locally stored file as indicated in Synapse\n\n entity = syn.get('/path/to/file.txt', limitSearch='syn12312')\n print(syn.getProvenance(entity))\n \"\"\"\n # If entity is a local file determine the corresponding synapse entity\n if isinstance(entity, str) and os.path.isfile(entity):\n bundle = self._getFromFile(entity, kwargs.pop(\"limitSearch\", None))\n kwargs[\"downloadFile\"] = False\n kwargs[\"path\"] = entity\n\n elif isinstance(entity, str) and not utils.is_synapse_id_str(entity):\n raise SynapseFileNotFoundError(\n (\n \"The parameter %s is neither a local file path \"\n \" or a valid entity id\" % entity\n )\n )\n # have not been saved entities\n elif isinstance(entity, Entity) and not entity.get(\"id\"):\n raise ValueError(\n \"Cannot retrieve entity that has not been saved.\"\n \" Please use syn.store() to save your entity and try again.\"\n )\n else:\n version = kwargs.get(\"version\", None)\n bundle = self._getEntityBundle(entity, version)\n # Check and warn for unmet access requirements\n self._check_entity_restrictions(\n bundle, entity, kwargs.get(\"downloadFile\", True)\n )\n\n return_data = self._getWithEntityBundle(\n entityBundle=bundle, entity=entity, **kwargs\n )\n trace.get_current_span().set_attributes(\n {\n \"synapse.id\": return_data.get(\"id\", \"\"),\n \"synapse.concrete_type\": return_data.get(\"concreteType\", \"\"),\n }\n )\n return return_data\n
"},{"location":"reference/client/#synapseclient.Synapse.store","title":"store(obj, *, createOrUpdate=True, forceVersion=True, versionLabel=None, isRestricted=False, activity=None, used=None, executed=None, activityName=None, activityDescription=None, opentelemetry_context=None)
","text":"Creates a new Entity or updates an existing Entity, uploading any files in the process.
PARAMETER DESCRIPTIONobj
A Synapse Entity, Evaluation, or Wiki
used
The Entity, Synapse ID, or URL used to create the object (can also be a list of these)
DEFAULT: None
executed
The Entity, Synapse ID, or URL representing code executed to create the object (can also be a list of these)
DEFAULT: None
activity
Activity object specifying the user's provenance.
DEFAULT: None
activityName
Activity name to be used in conjunction with used and executed.
DEFAULT: None
activityDescription
Activity description to be used in conjunction with used and executed.
DEFAULT: None
createOrUpdate
Indicates whether the method should automatically perform an update if the 'obj' conflicts with an existing Synapse object. Defaults to True.
DEFAULT: True
forceVersion
Indicates whether the method should increment the version of the object even if nothing has changed. Defaults to True.
DEFAULT: True
versionLabel
Arbitrary string used to label the version.
DEFAULT: None
isRestricted
If set to true, an email will be sent to the Synapse access control team to start the process of adding terms-of-use or review board approval for this entity. You will be contacted with regards to the specific data being restricted and the requirements of access.
DEFAULT: False
opentelemetry_context
OpenTelemetry context to propagate to this function to use for tracing. Used in cases where multi-threaded operations need to be linked to parent spans.
DEFAULT: None
A Synapse Entity, Evaluation, or Wiki
Using this functionCreating a new Project:
from synapseclient import Project\n\nproject = Project('My uniquely named project')\nproject = syn.store(project)\n
Adding files with provenance (aka: Activity):
from synapseclient import File, Activity\n\n# A synapse entity *syn1906480* contains data\n# entity *syn1917825* contains code\nactivity = Activity(\n 'Fancy Processing',\n description='No seriously, really fancy processing',\n used=['syn1906480', 'http://data_r_us.com/fancy/data.txt'],\n executed='syn1917825')\n\ntest_entity = File('/path/to/data/file.xyz', description='Fancy new data', parent=project)\ntest_entity = syn.store(test_entity, activity=activity)\n
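A sketch of the update path described above: retrieve an existing entity, change a property, and store it again (the ID reuses the example data entity above; forceVersion=False avoids minting a new version for a metadata-only change):
entity = syn.get('syn1906480', downloadFile=False)\nentity.description = 'Fancy new description'\nentity = syn.store(entity, forceVersion=False)\n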
Source code in synapseclient/client.py
def store(\n self,\n obj,\n *,\n createOrUpdate=True,\n forceVersion=True,\n versionLabel=None,\n isRestricted=False,\n activity=None,\n used=None,\n executed=None,\n activityName=None,\n activityDescription=None,\n opentelemetry_context=None,\n):\n \"\"\"\n Creates a new Entity or updates an existing Entity, uploading any files in the process.\n\n Arguments:\n obj: A Synapse Entity, Evaluation, or Wiki\n used: The Entity, Synapse ID, or URL used to create the object (can also be a list of these)\n executed: The Entity, Synapse ID, or URL representing code executed to create the object\n (can also be a list of these)\n activity: Activity object specifying the user's provenance.\n activityName: Activity name to be used in conjunction with *used* and *executed*.\n activityDescription: Activity description to be used in conjunction with *used* and *executed*.\n createOrUpdate: Indicates whether the method should automatically perform an update if the 'obj'\n conflicts with an existing Synapse object. Defaults to True.\n forceVersion: Indicates whether the method should increment the version of the object even if nothing\n has changed. Defaults to True.\n versionLabel: Arbitrary string used to label the version.\n isRestricted: If set to true, an email will be sent to the Synapse access control team to start the\n process of adding terms-of-use or review board approval for this entity.\n You will be contacted with regards to the specific data being restricted and the\n requirements of access.\n opentelemetry_context: OpenTelemetry context to propagate to this function to use for tracing. Used in\n cases where multi-threaded operations need to be linked to parent spans.\n\n Returns:\n A Synapse Entity, Evaluation, or Wiki\n\n Example: Using this function\n Creating a new Project:\n\n from synapseclient import Project\n\n project = Project('My uniquely named project')\n project = syn.store(project)\n\n Adding files with [provenance (aka: Activity)][synapseclient.Activity]:\n\n from synapseclient import File, Activity\n\n # A synapse entity *syn1906480* contains data\n # entity *syn1917825* contains code\n activity = Activity(\n 'Fancy Processing',\n description='No seriously, really fancy processing',\n used=['syn1906480', 'http://data_r_us.com/fancy/data.txt'],\n executed='syn1917825')\n\n test_entity = File('/path/to/data/file.xyz', description='Fancy new data', parent=project)\n test_entity = syn.store(test_entity, activity=activity)\n\n \"\"\"\n with tracer.start_as_current_span(\n \"Synapse::store\", context=opentelemetry_context\n ):\n trace.get_current_span().set_attributes(\n {\"thread.id\": threading.get_ident()}\n )\n # SYNPY-1031: activity must be Activity object or code will fail later\n if activity:\n if not isinstance(activity, synapseclient.Activity):\n raise ValueError(\"activity should be synapseclient.Activity object\")\n # _before_store hook\n # give objects a chance to do something before being stored\n if hasattr(obj, \"_before_synapse_store\"):\n obj._before_synapse_store(self)\n\n # _synapse_store hook\n # for objects that know how to store themselves\n if hasattr(obj, \"_synapse_store\"):\n return obj._synapse_store(self)\n\n # Handle all non-Entity objects\n if not (isinstance(obj, Entity) or type(obj) == dict):\n if isinstance(obj, Wiki):\n return self._storeWiki(obj, createOrUpdate)\n\n if \"id\" in obj: # If ID is present, update\n trace.get_current_span().set_attributes({\"synapse.id\": obj[\"id\"]})\n return type(obj)(**self.restPUT(obj.putURI(), obj.json()))\n\n try: 
# If no ID is present, attempt to POST the object\n trace.get_current_span().set_attributes({\"synapse.id\": \"\"})\n return type(obj)(**self.restPOST(obj.postURI(), obj.json()))\n\n except SynapseHTTPError as err:\n # If already present and we want to update attempt to get the object content\n if createOrUpdate and err.response.status_code == 409:\n newObj = self.restGET(obj.getByNameURI(obj.name))\n newObj.update(obj)\n obj = type(obj)(**newObj)\n trace.get_current_span().set_attributes(\n {\"synapse.id\": obj[\"id\"]}\n )\n obj.update(self.restPUT(obj.putURI(), obj.json()))\n return obj\n raise\n\n # If the input object is an Entity or a dictionary\n entity = obj\n properties, annotations, local_state = split_entity_namespaces(entity)\n bundle = None\n # Explicitly set an empty versionComment property if none is supplied,\n # otherwise an existing entity bundle's versionComment will be copied to the update.\n properties[\"versionComment\"] = (\n properties[\"versionComment\"] if \"versionComment\" in properties else None\n )\n\n # Anything with a path is treated as a cache-able item\n # Only Files are expected in the following logic\n if entity.get(\"path\", False) and not isinstance(obj, Folder):\n if \"concreteType\" not in properties:\n properties[\"concreteType\"] = File._synapse_entity_type\n # Make sure the path is fully resolved\n entity[\"path\"] = os.path.expanduser(entity[\"path\"])\n\n # Check if the File already exists in Synapse by fetching metadata on it\n bundle = self._getEntityBundle(entity)\n\n if bundle:\n if createOrUpdate:\n # update our properties from the existing bundle so that we have\n # enough to process this as an entity update.\n properties = {**bundle[\"entity\"], **properties}\n\n # Check if the file should be uploaded\n fileHandle = find_data_file_handle(bundle)\n if (\n fileHandle\n and fileHandle[\"concreteType\"]\n == \"org.sagebionetworks.repo.model.file.ExternalFileHandle\"\n ):\n # switching away from ExternalFileHandle or the url was updated\n needs_upload = entity[\"synapseStore\"] or (\n fileHandle[\"externalURL\"] != entity[\"externalURL\"]\n )\n else:\n # Check if we need to upload a new version of an existing\n # file. 
If the file referred to by entity['path'] has been\n # modified, we want to upload the new version.\n # If synapseStore is False, then we must upload an ExternalFileHandle\n needs_upload = not entity[\n \"synapseStore\"\n ] or not self.cache.contains(\n bundle[\"entity\"][\"dataFileHandleId\"], entity[\"path\"]\n )\n elif entity.get(\"dataFileHandleId\", None) is not None:\n needs_upload = False\n else:\n needs_upload = True\n\n if needs_upload:\n local_state_fh = local_state.get(\"_file_handle\", {})\n synapseStore = local_state.get(\"synapseStore\", True)\n fileHandle = upload_file_handle(\n self,\n entity[\"parentId\"],\n local_state[\"path\"]\n if (synapseStore or local_state_fh.get(\"externalURL\") is None)\n else local_state_fh.get(\"externalURL\"),\n synapseStore=synapseStore,\n md5=local_state_fh.get(\"contentMd5\"),\n file_size=local_state_fh.get(\"contentSize\"),\n mimetype=local_state_fh.get(\"contentType\"),\n max_threads=self.max_threads,\n )\n properties[\"dataFileHandleId\"] = fileHandle[\"id\"]\n local_state[\"_file_handle\"] = fileHandle\n\n elif \"dataFileHandleId\" not in properties:\n # Handle the case where the Entity lacks an ID\n # but becomes an update() due to a conflict\n properties[\"dataFileHandleId\"] = bundle[\"entity\"][\n \"dataFileHandleId\"\n ]\n\n # update the file_handle metadata if the FileEntity's FileHandle id has changed\n local_state_fh_id = local_state.get(\"_file_handle\", {}).get(\"id\")\n if (\n local_state_fh_id\n and properties[\"dataFileHandleId\"] != local_state_fh_id\n ):\n local_state[\"_file_handle\"] = find_data_file_handle(\n self._getEntityBundle(\n properties[\"id\"],\n requestedObjects={\n \"includeEntity\": True,\n \"includeFileHandles\": True,\n },\n )\n )\n\n # check if we already have the filehandleid cached somewhere\n cached_path = self.cache.get(properties[\"dataFileHandleId\"])\n if cached_path is None:\n local_state[\"path\"] = None\n local_state[\"cacheDir\"] = None\n local_state[\"files\"] = []\n else:\n local_state[\"path\"] = cached_path\n local_state[\"cacheDir\"] = os.path.dirname(cached_path)\n local_state[\"files\"] = [os.path.basename(cached_path)]\n\n # Create or update Entity in Synapse\n if \"id\" in properties:\n trace.get_current_span().set_attributes(\n {\"synapse.id\": properties[\"id\"]}\n )\n properties = self._updateEntity(properties, forceVersion, versionLabel)\n else:\n # If Link, get the target name, version number and concrete type and store in link properties\n if properties[\"concreteType\"] == \"org.sagebionetworks.repo.model.Link\":\n target_properties = self._getEntity(\n properties[\"linksTo\"][\"targetId\"],\n version=properties[\"linksTo\"].get(\"targetVersionNumber\"),\n )\n if target_properties[\"parentId\"] == properties[\"parentId\"]:\n raise ValueError(\n \"Cannot create a Link to an entity under the same parent.\"\n )\n properties[\"linksToClassName\"] = target_properties[\"concreteType\"]\n if (\n target_properties.get(\"versionNumber\") is not None\n and properties[\"linksTo\"].get(\"targetVersionNumber\") is not None\n ):\n properties[\"linksTo\"][\n \"targetVersionNumber\"\n ] = target_properties[\"versionNumber\"]\n properties[\"name\"] = target_properties[\"name\"]\n try:\n properties = self._createEntity(properties)\n except SynapseHTTPError as ex:\n if createOrUpdate and ex.response.status_code == 409:\n # Get the existing Entity's ID via the name and parent\n existing_entity_id = self.findEntityId(\n properties[\"name\"], properties.get(\"parentId\", None)\n )\n if existing_entity_id
is None:\n raise\n\n # get existing properties and annotations\n if not bundle:\n bundle = self._getEntityBundle(\n existing_entity_id,\n requestedObjects={\n \"includeEntity\": True,\n \"includeAnnotations\": True,\n },\n )\n\n properties = {**bundle[\"entity\"], **properties}\n\n # we additionally merge the annotations under the assumption that a missing annotation\n # from a resolved conflict represents a newer annotation that should be preserved\n # rather than an intentionally deleted annotation.\n annotations = {\n **from_synapse_annotations(bundle[\"annotations\"]),\n **annotations,\n }\n\n properties = self._updateEntity(\n properties, forceVersion, versionLabel\n )\n\n else:\n raise\n\n # Deal with access restrictions\n if isRestricted:\n self._createAccessRequirementIfNone(properties)\n\n # Update annotations\n if (not bundle and annotations) or (\n bundle and check_annotations_changed(bundle[\"annotations\"], annotations)\n ):\n annotations = self.set_annotations(\n Annotations(properties[\"id\"], properties[\"etag\"], annotations)\n )\n properties[\"etag\"] = annotations.etag\n\n # If the parameters 'used' or 'executed' are given, create an Activity object\n if used or executed:\n if activity is not None:\n raise SynapseProvenanceError(\n \"Provenance can be specified as an Activity object or as used/executed\"\n \" item(s), but not both.\"\n )\n activity = Activity(\n name=activityName,\n description=activityDescription,\n used=used,\n executed=executed,\n )\n\n # If we have an Activity, set it as the Entity's provenance record\n if activity:\n self.setProvenance(properties, activity)\n\n # 'etag' has changed, so get the new Entity\n properties = self._getEntity(properties)\n\n # Return the updated Entity object\n entity = Entity.create(properties, annotations, local_state)\n return_data = self.get(entity, downloadFile=False)\n\n trace.get_current_span().set_attributes(\n {\n \"synapse.id\": return_data.get(\"id\", \"\"),\n \"synapse.concrete_type\": entity.get(\"concreteType\", \"\"),\n }\n )\n return return_data\n
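As an illustrative sketch of the update path described above (the entity ID is a placeholder), re-storing an existing entity updates it in place; forceVersion=False avoids minting a new version for a metadata-only change:
file = syn.get('syn123', downloadFile=False)\nfile.description = 'Updated description'\n# createOrUpdate=True (the default) turns a name/parent conflict into an update\nfile = syn.store(file, forceVersion=False)\n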
"},{"location":"reference/client/#synapseclient.Synapse.move","title":"move(entity, new_parent)
","text":"Move a Synapse entity to a new container.
PARAMETER DESCRIPTIONentity
A Synapse ID, a Synapse Entity object, or a local file that is stored in Synapse
new_parent
The new parent container (Folder or Project) to which the entity should be moved.
RETURNS DESCRIPTION
The Synapse Entity object that has been moved.
Using this functionMove a Synapse Entity object to a new parent container
entity = syn.move('syn456', 'syn123')\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::move\")\ndef move(self, entity, new_parent):\n \"\"\"\n Move a Synapse entity to a new container.\n\n Arguments:\n entity: A Synapse ID, a Synapse Entity object, or a local file that is stored in Synapse\n new_parent: The new parent container (Folder or Project) to which the entity should be moved.\n\n Returns:\n The Synapse Entity object that has been moved.\n\n Example: Using this function\n Move a Synapse Entity object to a new parent container\n\n entity = syn.move('syn456', 'syn123')\n \"\"\"\n\n entity = self.get(entity, downloadFile=False)\n entity.parentId = id_of(new_parent)\n entity = self.store(entity, forceVersion=False)\n trace.get_current_span().set_attributes(\n {\n \"synapse.id\": entity.get(\"id\", \"\"),\n \"synapse.parent_id\": entity.get(\"parentId\", \"\"),\n }\n )\n\n return entity\n
"},{"location":"reference/client/#synapseclient.Synapse.delete","title":"delete(obj, version=None)
","text":"Removes an object from Synapse.
PARAMETER DESCRIPTIONobj
An existing object stored on Synapse such as Evaluation, File, Project, or Wiki
version
For entities, specify a particular version to delete.
DEFAULT: None
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::delete\")\ndef delete(self, obj, version=None):\n \"\"\"\n Removes an object from Synapse.\n\n Arguments:\n obj: An existing object stored on Synapse such as Evaluation, File, Project, or Wiki\n version: For entities, specify a particular version to delete.\n \"\"\"\n # Handle all strings as the Entity ID for backward compatibility\n if isinstance(obj, str):\n entity_id = id_of(obj)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n if version:\n self.restDELETE(uri=f\"/entity/{entity_id}/version/{version}\")\n else:\n self.restDELETE(uri=f\"/entity/{entity_id}\")\n elif hasattr(obj, \"_synapse_delete\"):\n return obj._synapse_delete(self)\n else:\n try:\n if isinstance(obj, Versionable):\n self.restDELETE(obj.deleteURI(versionNumber=version))\n else:\n self.restDELETE(obj.deleteURI())\n except AttributeError:\n raise SynapseError(\n f\"Can't delete a {type(obj)}. Please specify a Synapse object or id\"\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.get_annotations","title":"get_annotations(entity, version=None)
","text":"Retrieve annotations for an Entity from the Synapse Repository as a Python dict.
Note that collapsing annotations from the native Synapse format to a Python dict may involve some loss of information. See _getRawAnnotations
to get annotations in the native format.
entity
An Entity or Synapse ID to lookup
TYPE: Union[str, Entity]
version
The version of the Entity to retrieve.
TYPE: Union[str, int]
DEFAULT: None
Annotations
A synapseclient.annotations.Annotations object, a dict that also has id and etag attributes
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_annotations\")\ndef get_annotations(\n self, entity: typing.Union[str, Entity], version: typing.Union[str, int] = None\n) -> Annotations:\n \"\"\"\n Retrieve annotations for an Entity from the Synapse Repository as a Python dict.\n\n Note that collapsing annotations from the native Synapse format to a Python dict may involve some loss of\n information. See `_getRawAnnotations` to get annotations in the native format.\n\n Arguments:\n entity: An Entity or Synapse ID to lookup\n version: The version of the Entity to retrieve.\n\n Returns:\n A [synapseclient.annotations.Annotations][] object, a dict that also has id and etag attributes\n \"\"\"\n return from_synapse_annotations(self._getRawAnnotations(entity, version))\n
"},{"location":"reference/client/#synapseclient.Synapse.set_annotations","title":"set_annotations(annotations)
","text":"Store annotations for an Entity in the Synapse Repository.
PARAMETER DESCRIPTIONannotations
A synapseclient.annotations.Annotations of annotation names and values, with the id and etag attribute set
TYPE: Annotations
The updated synapseclient.annotations.Annotations for the entity
Using this functionGetting annotations, adding a new annotation, and updating the annotations:
annos = syn.get_annotations('syn123')\n\n# annos will contain the id and etag associated with the entity upon retrieval\nprint(annos.id)\n# syn123\nprint(annos.etag)\n# 7bdb83e9-a50a-46e4-987a-4962559f090f (Usually some UUID in the form of a string)\n\n# returned annos object from get_annotations() can be used as if it were a dict\n\n# set key 'foo' to have value of 'bar' and 'baz'\nannos['foo'] = ['bar', 'baz']\n\n# single values will automatically be wrapped in a list once stored\nannos['qwerty'] = 'asdf'\n\n# store the annotations\nannos = syn.set_annotations(annos)\n\nprint(annos)\n# {'foo': ['bar', 'baz'], 'qwerty': ['asdf']}\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::set_annotations\")\ndef set_annotations(self, annotations: Annotations):\n \"\"\"\n Store annotations for an Entity in the Synapse Repository.\n\n Arguments:\n annotations: A [synapseclient.annotations.Annotations][] of annotation names and values,\n with the id and etag attribute set\n\n Returns:\n The updated [synapseclient.annotations.Annotations][] for the entity\n\n Example: Using this function\n Getting annotations, adding a new annotation, and updating the annotations:\n\n annos = syn.get_annotations('syn123')\n\n # annos will contain the id and etag associated with the entity upon retrieval\n print(annos.id)\n # syn123\n print(annos.etag)\n # 7bdb83e9-a50a-46e4-987a-4962559f090f (Usually some UUID in the form of a string)\n\n # returned annos object from get_annotations() can be used as if it were a dict\n\n # set key 'foo' to have value of 'bar' and 'baz'\n annos['foo'] = ['bar', 'baz']\n\n # single values will automatically be wrapped in a list once stored\n annos['qwerty'] = 'asdf'\n\n # store the annotations\n annos = syn.set_annotations(annos)\n\n print(annos)\n # {'foo':['bar','baz], 'qwerty':['asdf']}\n \"\"\"\n\n if not isinstance(annotations, Annotations):\n raise TypeError(\"Expected a synapseclient.Annotations object\")\n\n synapseAnnos = to_synapse_annotations(annotations)\n\n entity_id = id_of(annotations)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n return from_synapse_annotations(\n self.restPUT(\n f\"/entity/{entity_id}/annotations2\",\n body=json.dumps(synapseAnnos),\n )\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.tableQuery","title":"tableQuery(query, resultsAs='csv', **kwargs)
","text":"Query a Synapse Table.
You can receive query results either as a generator over rows or as a CSV file. For smallish tables, either method will work equally well. Use of a \"rowset\" generator allows rows to be processed one at a time and processing may be stopped before downloading the entire table.
Optional keyword arguments differ for the two return types of rowset
or csv
query
Query string in a SQL-like syntax, for example: \"SELECT * from syn12345\"
resultsAs
select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as sets of rows (\"rowset\")
DEFAULT: 'csv'
limit
(rowset only) Specify the maximum number of rows to be returned, defaults to None
offset
(rowset only) Don't return the first n rows, defaults to None
quoteCharacter
(csv only) default double quote
escapeCharacter
(csv only) default backslash
lineEnd
(csv only) defaults to os.linesep
separator
(csv only) defaults to comma
header
(csv only) True by default
includeRowIdAndRowVersion
(csv only) True by default
downloadLocation
(csv only) directory path to download the CSV file to
RETURNS DESCRIPTION
A TableQueryResult or CsvFileTable object
NOTEWhen performing queries on frequently updated tables, the table can be inaccessible for a period leading to a timeout of the query. Since the results are guaranteed to eventually be returned you can change the max timeout by setting the table_query_timeout variable of the Synapse object:
# Sets the max timeout to 5 minutes.\n syn.table_query_timeout = 300\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::tableQuery\")\ndef tableQuery(self, query, resultsAs=\"csv\", **kwargs):\n \"\"\"\n Query a Synapse Table.\n\n\n :param query: query string in a `SQL-like syntax \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html>`_, for example\n \"SELECT * from syn12345\"\n\n :param resultsAs: select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as\n sets of rows (\"rowset\").\n\n\n :param query: query string in a `SQL-like syntax \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html>`_, for example\n \"SELECT * from syn12345\"\n\n :param resultsAs: select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as\n sets of rows (\"rowset\").\n\n You can receive query results either as a generator over rows or as a CSV file. For smallish tables, either\n method will work equally well. Use of a \"rowset\" generator allows rows to be processed one at a time and\n processing may be stopped before downloading the entire table.\n\n Optional keyword arguments differ for the two return types of `rowset` or `csv`\n\n Arguments:\n query: Query string in a [SQL-like syntax](https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html), for example: `\"SELECT * from syn12345\"`\n resultsAs: select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as sets of rows (\"rowset\")\n limit: (rowset only) Specify the maximum number of rows to be returned, defaults to None\n offset: (rowset only) Don't return the first n rows, defaults to None\n quoteCharacter: (csv only) default double quote\n escapeCharacter: (csv only) default backslash\n lineEnd: (csv only) defaults to os.linesep\n separator: (csv only) defaults to comma\n header: (csv only) True by default\n includeRowIdAndRowVersion: (csv only) True by default\n downloadLocation: (csv only) directory path to download the CSV file to\n\n\n Returns:\n A [TableQueryResult][synapseclient.table.TableQueryResult] or [CsvFileTable][synapseclient.table.CsvFileTable] object\n\n\n NOTE:\n When performing queries on frequently updated tables, the table can be inaccessible for a period leading\n to a timeout of the query. Since the results are guaranteed to eventually be returned you can change the\n max timeout by setting the table_query_timeout variable of the Synapse object:\n\n # Sets the max timeout to 5 minutes.\n syn.table_query_timeout = 300\n\n \"\"\"\n if resultsAs.lower() == \"rowset\":\n return TableQueryResult(self, query, **kwargs)\n elif resultsAs.lower() == \"csv\":\n # TODO: remove isConsistent because it has now been deprecated\n # from the backend\n if kwargs.get(\"isConsistent\") is not None:\n kwargs.pop(\"isConsistent\")\n return CsvFileTable.from_table_query(self, query, **kwargs)\n else:\n raise ValueError(\n \"Unknown return type requested from tableQuery: \" + str(resultsAs)\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.createColumns","title":"createColumns(columns)
","text":"Creates a batch of synapseclient.table.Column's within a single request.
PARAMETER DESCRIPTIONcolumns
A list of synapseclient.table.Column's
TYPE: List[Column]
List[Column]
A list of synapseclient.table.Column's that have been created in Synapse
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::createColumns\")\ndef createColumns(self, columns: typing.List[Column]) -> typing.List[Column]:\n \"\"\"\n Creates a batch of [synapseclient.table.Column][]'s within a single request.\n\n Arguments:\n columns: A list of [synapseclient.table.Column][]'s\n\n Returns:\n A list of [synapseclient.table.Column][]'s that have been created in Synapse\n \"\"\"\n request_body = {\n \"concreteType\": \"org.sagebionetworks.repo.model.ListWrapper\",\n \"list\": list(columns),\n }\n response = self.restPOST(\"/column/batch\", json.dumps(request_body))\n return [Column(**col) for col in response[\"list\"]]\n
"},{"location":"reference/client/#synapseclient.Synapse.getColumn","title":"getColumn(id)
","text":"Gets a Column object from Synapse by ID.
See: synapseclient.table.Column
PARAMETER DESCRIPTIONid
The ID of the column to retrieve
RETURNS DESCRIPTION
An object of type synapseclient.table.Column
Using this functionGetting a column
column = syn.getColumn(123)\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getColumn\")\ndef getColumn(self, id):\n \"\"\"\n Gets a Column object from Synapse by ID.\n\n See: [synapseclient.table.Column][]\n\n Arguments:\n id: The ID of the column to retrieve\n\n Returns:\n An object of type [synapseclient.table.Column][]\n\n\n Example: Using this function\n Getting a column\n\n column = syn.getColumn(123)\n \"\"\"\n return Column(**self.restGET(Column.getURI(id)))\n
"},{"location":"reference/client/#synapseclient.Synapse.getColumns","title":"getColumns(x, limit=100, offset=0)
","text":"Get the columns defined in Synapse either (1) corresponding to a set of column headers, (2) those for a given schema, or (3) those whose names start with a given prefix.
PARAMETER DESCRIPTIONx
A list of column headers, a Table Entity object (Schema/EntityViewSchema), a Table's Synapse ID, or a string prefix
limit
maximum number of columns to return (pagination parameter)
DEFAULT: 100
offset
the index of the first column to return (pagination parameter)
DEFAULT: 0
A generator over synapseclient.table.Column objects
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getColumns\")\ndef getColumns(self, x, limit=100, offset=0):\n \"\"\"\n Get the columns defined in Synapse either (1) corresponding to a set of column headers, (2) those for a given\n schema, or (3) those whose names start with a given prefix.\n\n Arguments:\n x: A list of column headers, a Table Entity object (Schema/EntityViewSchema), a Table's Synapse ID, or a\n string prefix\n limit: maximum number of columns to return (pagination parameter)\n offset: the index of the first column to return (pagination parameter)\n\n Yields:\n A generator over [synapseclient.table.Column][] objects\n \"\"\"\n if x is None:\n uri = \"/column\"\n for result in self._GET_paginated(uri, limit=limit, offset=offset):\n yield Column(**result)\n elif isinstance(x, (list, tuple)):\n for header in x:\n try:\n # if header is an integer, it's a columnID, otherwise it's an aggregate column, like \"AVG(Foo)\"\n int(header)\n yield self.getColumn(header)\n except ValueError:\n # ignore aggregate column\n pass\n elif isinstance(x, SchemaBase) or utils.is_synapse_id_str(x):\n for col in self.getTableColumns(x):\n yield col\n elif isinstance(x, str):\n uri = \"/column?prefix=\" + x\n for result in self._GET_paginated(uri, limit=limit, offset=offset):\n yield Column(**result)\n else:\n ValueError(\"Can't get columns for a %s\" % type(x))\n
"},{"location":"reference/client/#synapseclient.Synapse.getTableColumns","title":"getTableColumns(table)
","text":"Retrieve the column models used in the given table schema.
PARAMETER DESCRIPTIONtable
The schema of the Table whose columns are to be retrieved
YIELDS DESCRIPTION
A Generator over the Table's columns
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getTableColumns\")\ndef getTableColumns(self, table):\n \"\"\"\n Retrieve the column models used in the given table schema.\n\n Arguments:\n table: The schema of the Table whose columns are to be retrieved\n\n Yields:\n A Generator over the Table's [columns][synapseclient.table.Column]\n \"\"\"\n uri = \"/entity/{id}/column\".format(id=id_of(table))\n # The returned object type for this service, PaginatedColumnModels, is a misnomer.\n # This service always returns the full list of results so the pagination does not not actually matter.\n for result in self.restGET(uri)[\"results\"]:\n yield Column(**result)\n
"},{"location":"reference/client/#synapseclient.Synapse.downloadTableColumns","title":"downloadTableColumns(table, columns, downloadLocation=None, **kwargs)
","text":"Bulk download of table-associated files.
PARAMETER DESCRIPTIONtable
Table query result
columns
A list of column names as strings
downloadLocation
Directory into which to download the files
DEFAULT: None
A dictionary from file handle ID to path in the local file system.
For example, consider a Synapse table whose ID is \"syn12345\" with two columns of type FILEHANDLEID named 'foo' and 'bar'. The associated files are JSON encoded, so we might retrieve the files from Synapse and load them for the second hundred rows as shown here:
import json\n\nresults = syn.tableQuery('SELECT * FROM syn12345 LIMIT 100 OFFSET 100')\nfile_map = syn.downloadTableColumns(results, ['foo', 'bar'])\n\ndata = {}\nfor file_handle_id, path in file_map.items():\n    with open(path) as f:\n        data[file_handle_id] = json.load(f)\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::downloadTableColumns\")\ndef downloadTableColumns(self, table, columns, downloadLocation=None, **kwargs):\n \"\"\"\n Bulk download of table-associated files.\n\n Arguments:\n table: Table query result\n columns: A list of column names as strings\n downloadLocation: Directory into which to download the files\n\n Returns:\n A dictionary from file handle ID to path in the local file system.\n\n For example, consider a Synapse table whose ID is \"syn12345\" with two columns of type FILEHANDLEID named 'foo'\n and 'bar'. The associated files are JSON encoded, so we might retrieve the files from Synapse and load for the\n second 100 of those rows as shown here:\n\n import json\n\n results = syn.tableQuery('SELECT * FROM syn12345 LIMIT 100 OFFSET 100')\n file_map = syn.downloadTableColumns(results, ['foo', 'bar'])\n\n for file_handle_id, path in file_map.items():\n with open(path) as f:\n data[file_handle_id] = f.read()\n\n \"\"\"\n\n RETRIABLE_FAILURE_CODES = [\"EXCEEDS_SIZE_LIMIT\"]\n MAX_DOWNLOAD_TRIES = 100\n max_files_per_request = kwargs.get(\"max_files_per_request\", 2500)\n # Rowset tableQuery result not allowed\n if isinstance(table, TableQueryResult):\n raise ValueError(\n \"downloadTableColumn doesn't work with rowsets. Please use default tableQuery settings.\"\n )\n if isinstance(columns, str):\n columns = [columns]\n if not isinstance(columns, collections.abc.Iterable):\n raise TypeError(\"Columns parameter requires a list of column names\")\n\n (\n file_handle_associations,\n file_handle_to_path_map,\n ) = self._build_table_download_file_handle_list(\n table,\n columns,\n downloadLocation,\n )\n\n self.logger.info(\n \"Downloading %d files, %d cached locally\"\n % (len(file_handle_associations), len(file_handle_to_path_map))\n )\n\n permanent_failures = collections.OrderedDict()\n\n attempts = 0\n while len(file_handle_associations) > 0 and attempts < MAX_DOWNLOAD_TRIES:\n attempts += 1\n\n file_handle_associations_batch = file_handle_associations[\n :max_files_per_request\n ]\n\n # ------------------------------------------------------------\n # call async service to build zip file\n # ------------------------------------------------------------\n\n # returns a BulkFileDownloadResponse:\n # https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/file/BulkFileDownloadResponse.html\n request = dict(\n concreteType=\"org.sagebionetworks.repo.model.file.BulkFileDownloadRequest\",\n requestedFiles=file_handle_associations_batch,\n )\n response = self._waitForAsync(\n uri=\"/file/bulk/async\",\n request=request,\n endpoint=self.fileHandleEndpoint,\n )\n\n # ------------------------------------------------------------\n # download zip file\n # ------------------------------------------------------------\n\n temp_dir = tempfile.mkdtemp()\n zipfilepath = os.path.join(temp_dir, \"table_file_download.zip\")\n try:\n zipfilepath = self._downloadFileHandle(\n response[\"resultZipFileHandleId\"],\n table.tableId,\n \"TableEntity\",\n zipfilepath,\n )\n # TODO handle case when no zip file is returned\n # TODO test case when we give it partial or all bad file handles\n # TODO test case with deleted fileHandleID\n # TODO return null for permanent failures\n\n # ------------------------------------------------------------\n # unzip into cache\n # ------------------------------------------------------------\n\n if downloadLocation:\n download_dir = self._ensure_download_location_is_directory(\n downloadLocation\n )\n\n with 
zipfile.ZipFile(zipfilepath) as zf:\n # the directory structure within the zip follows that of the cache:\n # {fileHandleId modulo 1000}/{fileHandleId}/{fileName}\n for summary in response[\"fileSummary\"]:\n if summary[\"status\"] == \"SUCCESS\":\n if not downloadLocation:\n download_dir = self.cache.get_cache_dir(\n summary[\"fileHandleId\"]\n )\n\n filepath = extract_zip_file_to_directory(\n zf, summary[\"zipEntryName\"], download_dir\n )\n self.cache.add(summary[\"fileHandleId\"], filepath)\n file_handle_to_path_map[summary[\"fileHandleId\"]] = filepath\n elif summary[\"failureCode\"] not in RETRIABLE_FAILURE_CODES:\n permanent_failures[summary[\"fileHandleId\"]] = summary\n finally:\n if os.path.exists(zipfilepath):\n os.remove(zipfilepath)\n\n # Do we have remaining files to download?\n file_handle_associations = [\n fha\n for fha in file_handle_associations\n if fha[\"fileHandleId\"] not in file_handle_to_path_map\n and fha[\"fileHandleId\"] not in permanent_failures.keys()\n ]\n\n # TODO if there are files we still haven't downloaded\n\n return file_handle_to_path_map\n
"},{"location":"reference/client/#synapseclient.Synapse.getPermissions","title":"getPermissions(entity, principalId=None)
","text":"Get the permissions that a user or group has on an Entity. Arguments: entity: An Entity or Synapse ID to lookup principalId: Identifier of a user or group (defaults to PUBLIC users)
RETURNS DESCRIPTIONAn array containing some combination of
['READ', 'CREATE', 'UPDATE', 'DELETE', 'CHANGE_PERMISSIONS', 'DOWNLOAD']
or an empty array
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getPermissions\")\ndef getPermissions(\n self,\n entity: Union[Entity, Evaluation, str, collections.abc.Mapping],\n principalId: str = None,\n):\n \"\"\"Get the permissions that a user or group has on an Entity.\n Arguments:\n entity: An Entity or Synapse ID to lookup\n principalId: Identifier of a user or group (defaults to PUBLIC users)\n\n Returns:\n An array containing some combination of\n ['READ', 'CREATE', 'UPDATE', 'DELETE', 'CHANGE_PERMISSIONS', 'DOWNLOAD']\n or an empty array\n \"\"\"\n principal_id = self._getUserbyPrincipalIdOrName(principalId)\n\n trace.get_current_span().set_attributes(\n {\"synapse.id\": id_of(entity), \"synapse.principal_id\": principal_id}\n )\n\n acl = self._getACL(entity)\n\n team_list = self._find_teams_for_principal(principal_id)\n team_ids = [int(team.id) for team in team_list]\n effective_permission_set = set()\n\n # This user_profile_bundle is being used to verify that the principal_id is a registered user of the system\n user_profile_bundle = self._get_user_bundle(principal_id, 1)\n\n # Loop over all permissions in the returned ACL and add it to the effective_permission_set\n # if the principalId in the ACL matches\n # 1) the one we are looking for,\n # 2) a team the entity is a member of,\n # 3) PUBLIC\n # 4) A user_profile_bundle exists for the principal_id\n for permissions in acl[\"resourceAccess\"]:\n if \"principalId\" in permissions and (\n permissions[\"principalId\"] == principal_id\n or permissions[\"principalId\"] in team_ids\n or permissions[\"principalId\"] == PUBLIC\n or (\n permissions[\"principalId\"] == AUTHENTICATED_USERS\n and user_profile_bundle is not None\n )\n ):\n effective_permission_set = effective_permission_set.union(\n permissions[\"accessType\"]\n )\n return list(effective_permission_set)\n
"},{"location":"reference/client/#synapseclient.Synapse.setPermissions","title":"setPermissions(entity, principalId=None, accessType=['READ', 'DOWNLOAD'], modify_benefactor=False, warn_if_inherits=True, overwrite=True)
","text":"Sets permission that a user or group has on an Entity. An Entity may have its own ACL or inherit its ACL from a benefactor.
PARAMETER DESCRIPTIONentity
An Entity or Synapse ID to modify
principalId
Identifier of a user or group. '273948' is for all registered Synapse users and '273949' is for public access.
DEFAULT: None
accessType
Type of permission to be granted. One or more of CREATE, READ, DOWNLOAD, UPDATE, DELETE, CHANGE_PERMISSIONS
DEFAULT: ['READ', 'DOWNLOAD']
modify_benefactor
Set as True when modifying a benefactor's ACL
DEFAULT: False
warn_if_inherits
Set as False, when creating a new ACL. Trying to modify the ACL of an Entity that inherits its ACL will result in a warning
DEFAULT: True
overwrite
By default this function overwrites existing permissions for the specified user. Set this flag to False to add new permissions non-destructively.
DEFAULT: True
An Access Control List object
Using this functionSetting permissions
# Grant all registered users download access\nsyn.setPermissions('syn1234','273948',['READ','DOWNLOAD'])\n# Grant the public view access\nsyn.setPermissions('syn1234','273949',['READ'])\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::setPermissions\")\ndef setPermissions(\n self,\n entity,\n principalId=None,\n accessType=[\"READ\", \"DOWNLOAD\"],\n modify_benefactor=False,\n warn_if_inherits=True,\n overwrite=True,\n):\n \"\"\"\n Sets permission that a user or group has on an Entity.\n An Entity may have its own ACL or inherit its ACL from a benefactor.\n\n Arguments:\n entity: An Entity or Synapse ID to modify\n principalId: Identifier of a user or group. '273948' is for all registered Synapse users\n and '273949' is for public access.\n accessType: Type of permission to be granted. One or more of CREATE, READ, DOWNLOAD, UPDATE,\n DELETE, CHANGE_PERMISSIONS\n modify_benefactor: Set as True when modifying a benefactor's ACL\n warn_if_inherits: Set as False, when creating a new ACL.\n Trying to modify the ACL of an Entity that inherits its ACL will result in a warning\n overwrite: By default this function overwrites existing permissions for the specified user.\n Set this flag to False to add new permissions non-destructively.\n\n Returns:\n An Access Control List object\n\n Example: Using this function\n Setting permissions\n\n # Grant all registered users download access\n syn.setPermissions('syn1234','273948',['READ','DOWNLOAD'])\n # Grant the public view access\n syn.setPermissions('syn1234','273949',['READ'])\n \"\"\"\n entity_id = id_of(entity)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n benefactor = self._getBenefactor(entity)\n if benefactor[\"id\"] != entity_id:\n if modify_benefactor:\n entity = benefactor\n elif warn_if_inherits:\n self.logger.warning(\n \"Creating an ACL for entity %s, which formerly inherited access control from a\"\n ' benefactor entity, \"%s\" (%s).\\n'\n % (entity_id, benefactor[\"name\"], benefactor[\"id\"])\n )\n\n acl = self._getACL(entity)\n\n principalId = self._getUserbyPrincipalIdOrName(principalId)\n\n # Find existing permissions\n permissions_to_update = None\n for permissions in acl[\"resourceAccess\"]:\n if (\n \"principalId\" in permissions\n and permissions[\"principalId\"] == principalId\n ):\n permissions_to_update = permissions\n break\n\n if accessType is None or accessType == []:\n # remove permissions\n if permissions_to_update and overwrite:\n acl[\"resourceAccess\"].remove(permissions_to_update)\n else:\n # add a 'resourceAccess' entry, if necessary\n if not permissions_to_update:\n permissions_to_update = {\"accessType\": [], \"principalId\": principalId}\n acl[\"resourceAccess\"].append(permissions_to_update)\n if overwrite:\n permissions_to_update[\"accessType\"] = accessType\n else:\n permissions_to_update[\"accessType\"] = list(\n set(permissions_to_update[\"accessType\"]) | set(accessType)\n )\n return self._storeACL(entity, acl)\n
"},{"location":"reference/client/#synapseclient.Synapse.getProvenance","title":"getProvenance(entity, version=None)
","text":"Retrieve provenance information for a Synapse Entity.
PARAMETER DESCRIPTIONentity
An Entity or Synapse ID to lookup
version
The version of the Entity to retrieve. Gets the most recent version if omitted
DEFAULT: None
Activity
An Activity object or raises exception if no provenance record exists
RAISES DESCRIPTIONSynapseHTTPError
if no provenance record exists
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getProvenance\")\ndef getProvenance(self, entity, version=None) -> Activity:\n \"\"\"\n Retrieve provenance information for a Synapse Entity.\n\n Arguments:\n entity: An Entity or Synapse ID to lookup\n version: The version of the Entity to retrieve. Gets the most recent version if omitted\n\n Returns:\n An Activity object or raises exception if no provenance record exists\n\n Raises:\n SynapseHTTPError: if no provenance record exists\n \"\"\"\n\n # Get versionNumber from Entity\n if version is None and \"versionNumber\" in entity:\n version = entity[\"versionNumber\"]\n entity_id = id_of(entity)\n if version:\n uri = \"/entity/%s/version/%d/generatedBy\" % (entity_id, version)\n else:\n uri = \"/entity/%s/generatedBy\" % entity_id\n\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n return Activity(data=self.restGET(uri))\n
"},{"location":"reference/client/#synapseclient.Synapse.setProvenance","title":"setProvenance(entity, activity)
","text":"Stores a record of the code and data used to derive a Synapse entity.
PARAMETER DESCRIPTIONentity
An Entity or Synapse ID to modify
activity
A synapseclient.activity.Activity
RETURNS DESCRIPTION
Activity
An updated synapseclient.activity.Activity object
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::setProvenance\")\ndef setProvenance(self, entity, activity) -> Activity:\n \"\"\"\n Stores a record of the code and data used to derive a Synapse entity.\n\n Arguments:\n entity: An Entity or Synapse ID to modify\n activity: A [synapseclient.activity.Activity][]\n\n Returns:\n An updated [synapseclient.activity.Activity][] object\n \"\"\"\n\n # Assert that the entity was generated by a given Activity.\n activity = self._saveActivity(activity)\n\n entity_id = id_of(entity)\n # assert that an entity is generated by an activity\n uri = \"/entity/%s/generatedBy?generatedBy=%s\" % (entity_id, activity[\"id\"])\n activity = Activity(data=self.restPUT(uri))\n\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n return activity\n
"},{"location":"reference/client/#synapseclient.Synapse.deleteProvenance","title":"deleteProvenance(entity)
","text":"Removes provenance information from an Entity and deletes the associated Activity.
PARAMETER DESCRIPTIONentity
An Entity or Synapse ID to modify
Source code in
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::deleteProvenance\")\ndef deleteProvenance(self, entity) -> None:\n \"\"\"\n Removes provenance information from an Entity and deletes the associated Activity.\n\n Arguments:\n entity: An Entity or Synapse ID to modify\n \"\"\"\n\n activity = self.getProvenance(entity)\n if not activity:\n return\n entity_id = id_of(entity)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n uri = \"/entity/%s/generatedBy\" % entity_id\n self.restDELETE(uri)\n\n # TODO: what happens if the activity is shared by more than one entity?\n uri = \"/activity/%s\" % activity[\"id\"]\n self.restDELETE(uri)\n
"},{"location":"reference/client/#synapseclient.Synapse.updateActivity","title":"updateActivity(activity)
","text":"Modifies an existing Activity.
PARAMETER DESCRIPTIONactivity
The Activity to be updated.
RETURNS DESCRIPTION
An updated Activity object
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::updateActivity\")\ndef updateActivity(self, activity):\n \"\"\"\n Modifies an existing Activity.\n\n Arguments:\n activity: The Activity to be updated.\n\n Returns:\n An updated Activity object\n \"\"\"\n if \"id\" not in activity:\n raise ValueError(\"The activity you want to update must exist on Synapse\")\n trace.get_current_span().set_attributes({\"synapse.id\": activity[\"id\"]})\n return self._saveActivity(activity)\n
"},{"location":"reference/client/#synapseclient.Synapse.findEntityId","title":"findEntityId(name, parent=None)
","text":"Find an Entity given its name and parent.
PARAMETER DESCRIPTIONname
Name of the entity to find
parent
An Entity object or the Id of an entity as a string. Omit if searching for a Project by name
DEFAULT: None
The Entity ID or None if not found
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::findEntityId\")\ndef findEntityId(self, name, parent=None):\n \"\"\"\n Find an Entity given its name and parent.\n\n Arguments:\n name: Name of the entity to find\n parent: An Entity object or the Id of an entity as a string. Omit if searching for a Project by name\n\n Returns:\n The Entity ID or None if not found\n \"\"\"\n # when we want to search for a project by name. set parentId as None instead of ROOT_ENTITY\n entity_lookup_request = {\n \"parentId\": id_of(parent) if parent else None,\n \"entityName\": name,\n }\n try:\n return self.restPOST(\n \"/entity/child\", body=json.dumps(entity_lookup_request)\n ).get(\"id\")\n except SynapseHTTPError as e:\n if (\n e.response.status_code == 404\n ): # a 404 error is raised if the entity does not exist\n return None\n raise\n
"},{"location":"reference/client/#synapseclient.Synapse.getChildren","title":"getChildren(parent, includeTypes=['folder', 'file', 'table', 'link', 'entityview', 'dockerrepo', 'submissionview', 'dataset', 'materializedview'], sortBy='NAME', sortDirection='ASC')
","text":"Retrieves all of the entities stored within a parent such as folder or project.
PARAMETER DESCRIPTIONparent
An id or an object of a Synapse container or None to retrieve all projects
includeTypes
Must be a list of entity types (e.g. [\"folder\",\"file\"]) which can be found here: http://docs.synapse.org/rest/org/sagebionetworks/repo/model/EntityType.html
DEFAULT: ['folder', 'file', 'table', 'link', 'entityview', 'dockerrepo', 'submissionview', 'dataset', 'materializedview']
sortBy
How results should be sorted. Can be NAME, or CREATED_ON
DEFAULT: 'NAME'
sortDirection
The direction of the result sort. Can be ASC, or DESC
DEFAULT: 'ASC'
An iterator that shows all the children of the container.
Also see:
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getChildren\")\ndef getChildren(\n self,\n parent,\n includeTypes=[\n \"folder\",\n \"file\",\n \"table\",\n \"link\",\n \"entityview\",\n \"dockerrepo\",\n \"submissionview\",\n \"dataset\",\n \"materializedview\",\n ],\n sortBy=\"NAME\",\n sortDirection=\"ASC\",\n):\n \"\"\"\n Retrieves all of the entities stored within a parent such as folder or project.\n\n Arguments:\n parent: An id or an object of a Synapse container or None to retrieve all projects\n includeTypes: Must be a list of entity types (ie. [\"folder\",\"file\"]) which can be found here:\n http://docs.synapse.org/rest/org/sagebionetworks/repo/model/EntityType.html\n sortBy: How results should be sorted. Can be NAME, or CREATED_ON\n sortDirection: The direction of the result sort. Can be ASC, or DESC\n\n Yields:\n An iterator that shows all the children of the container.\n\n Also see:\n\n - [synapseutils.walk][]\n \"\"\"\n parentId = id_of(parent) if parent is not None else None\n\n trace.get_current_span().set_attributes({\"synapse.parent_id\": parentId})\n entityChildrenRequest = {\n \"parentId\": parentId,\n \"includeTypes\": includeTypes,\n \"sortBy\": sortBy,\n \"sortDirection\": sortDirection,\n \"nextPageToken\": None,\n }\n entityChildrenResponse = {\"nextPageToken\": \"first\"}\n while entityChildrenResponse.get(\"nextPageToken\") is not None:\n entityChildrenResponse = self.restPOST(\n \"/entity/children\", body=json.dumps(entityChildrenRequest)\n )\n for child in entityChildrenResponse[\"page\"]:\n yield child\n if entityChildrenResponse.get(\"nextPageToken\") is not None:\n entityChildrenRequest[\"nextPageToken\"] = entityChildrenResponse[\n \"nextPageToken\"\n ]\n
"},{"location":"reference/client/#synapseclient.Synapse.getTeam","title":"getTeam(id)
","text":"Finds a team with a given ID or name.
PARAMETER DESCRIPTIONid
The ID or name of the team or a Team object to retrieve.
RETURNS DESCRIPTION
An object of type synapseclient.team.Team
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getTeam\")\ndef getTeam(self, id):\n \"\"\"\n Finds a team with a given ID or name.\n\n Arguments:\n id: The ID or name of the team or a Team object to retrieve.\n\n Returns:\n An object of type [synapseclient.team.Team][]\n \"\"\"\n # Retrieves team id\n teamid = id_of(id)\n try:\n int(teamid)\n except (TypeError, ValueError):\n if isinstance(id, str):\n for team in self._findTeam(id):\n if team.name == id:\n teamid = team.id\n break\n else:\n raise ValueError('Can\\'t find team \"{}\"'.format(teamid))\n else:\n raise ValueError('Can\\'t find team \"{}\"'.format(teamid))\n return Team(**self.restGET(\"/team/%s\" % teamid))\n
"},{"location":"reference/client/#synapseclient.Synapse.getTeamMembers","title":"getTeamMembers(team)
","text":"Lists the members of the given team.
PARAMETER DESCRIPTIONteam
A synapseclient.team.Team object or a team's ID.
YIELDS DESCRIPTION
A generator over synapseclient.team.TeamMember objects.
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getTeamMembers\")\ndef getTeamMembers(self, team):\n \"\"\"\n Lists the members of the given team.\n\n Arguments:\n team: A [synapseclient.team.Team][] object or a team's ID.\n\n Yields:\n A generator over [synapseclient.team.TeamMember][] objects.\n\n \"\"\"\n for result in self._GET_paginated(\"/teamMembers/{id}\".format(id=id_of(team))):\n yield TeamMember(**result)\n
"},{"location":"reference/client/#synapseclient.Synapse.invite_to_team","title":"invite_to_team(team, user=None, inviteeEmail=None, message=None, force=False)
","text":"Invite user to a Synapse team via Synapse username or email (choose one or the other)
PARAMETER DESCRIPTIONsyn
Synapse object
team
A synapseclient.team.Team object or a team's ID.
user
Synapse username or profile id of user
DEFAULT: None
inviteeEmail
Email of user
DEFAULT: None
message
Additional message for the user getting invited to the team.
DEFAULT: None
force
If an open invitation exists for the invitee, the old invite will be cancelled.
DEFAULT: False
MembershipInvitation or None if user is already a member
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::invite_to_team\")\ndef invite_to_team(\n self, team, user=None, inviteeEmail=None, message=None, force=False\n):\n \"\"\"Invite user to a Synapse team via Synapse username or email\n (choose one or the other)\n\n Arguments:\n syn: Synapse object\n team: A [synapseclient.team.Team][] object or a team's ID.\n user: Synapse username or profile id of user\n inviteeEmail: Email of user\n message: Additional message for the user getting invited to the team.\n force: If an open invitation exists for the invitee, the old invite will be cancelled.\n\n Returns:\n MembershipInvitation or None if user is already a member\n \"\"\"\n # Throw error if both user and email is specified and if both not\n # specified\n id_email_specified = inviteeEmail is not None and user is not None\n id_email_notspecified = inviteeEmail is None and user is None\n if id_email_specified or id_email_notspecified:\n raise ValueError(\"Must specify either 'user' or 'inviteeEmail'\")\n\n teamid = id_of(team)\n is_member = False\n open_invitations = self.get_team_open_invitations(teamid)\n\n if user is not None:\n inviteeId = self.getUserProfile(user)[\"ownerId\"]\n membership_status = self.get_membership_status(inviteeId, teamid)\n is_member = membership_status[\"isMember\"]\n open_invites_to_user = [\n invitation\n for invitation in open_invitations\n if invitation.get(\"inviteeId\") == inviteeId\n ]\n else:\n inviteeId = None\n open_invites_to_user = [\n invitation\n for invitation in open_invitations\n if invitation.get(\"inviteeEmail\") == inviteeEmail\n ]\n # Only invite if the invitee is not a member and\n # if invitee doesn't have an open invitation unless force=True\n if not is_member and (not open_invites_to_user or force):\n # Delete all old invitations\n for invite in open_invites_to_user:\n self._delete_membership_invitation(invite[\"id\"])\n return self.send_membership_invitation(\n teamid, inviteeId=inviteeId, inviteeEmail=inviteeEmail, message=message\n )\n if is_member:\n not_sent_reason = \"invitee is already a member\"\n else:\n not_sent_reason = (\n \"invitee already has an open invitation \"\n \"Set force=True to send new invite.\"\n )\n\n self.logger.warning(\"No invitation sent: {}\".format(not_sent_reason))\n # Return None if no invite is sent.\n return None\n
"},{"location":"reference/client/#synapseclient.Synapse.get_membership_status","title":"get_membership_status(userid, team)
","text":"Retrieve a user's Team Membership Status bundle. https://rest-docs.synapse.org/rest/GET/team/id/member/principalId/membershipStatus.html
PARAMETER DESCRIPTIONuserid
Synapse user ID
team
A synapseclient.team.Team object or a team's ID.
RETURNS DESCRIPTION
dict of TeamMembershipStatus
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_membership_status\")\ndef get_membership_status(self, userid, team):\n \"\"\"Retrieve a user's Team Membership Status bundle.\n https://rest-docs.synapse.org/rest/GET/team/id/member/principalId/membershipStatus.html\n\n Arguments:\n user: Synapse user ID\n team: A [synapseclient.team.Team][] object or a team's ID.\n\n Returns:\n dict of TeamMembershipStatus\n \"\"\"\n teamid = id_of(team)\n request = \"/team/{team}/member/{user}/membershipStatus\".format(\n team=teamid, user=userid\n )\n membership_status = self.restGET(request)\n return membership_status\n
"},{"location":"reference/client/#synapseclient.Synapse.get_team_open_invitations","title":"get_team_open_invitations(team)
","text":"Retrieve the open requests submitted to a Team https://rest-docs.synapse.org/rest/GET/team/id/openInvitation.html
PARAMETER DESCRIPTIONteam
A synapseclient.team.Team object or a team's ID.
YIELDS DESCRIPTION
Generator of MembershipInvitation
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_team_open_invitations\")\ndef get_team_open_invitations(self, team):\n \"\"\"Retrieve the open requests submitted to a Team\n https://rest-docs.synapse.org/rest/GET/team/id/openInvitation.html\n\n Arguments:\n team: A [synapseclient.team.Team][] object or a team's ID.\n\n Yields:\n Generator of MembershipRequest\n \"\"\"\n teamid = id_of(team)\n request = \"/team/{team}/openInvitation\".format(team=teamid)\n open_requests = self._GET_paginated(request)\n return open_requests\n
"},{"location":"reference/client/#synapseclient.Synapse.send_membership_invitation","title":"send_membership_invitation(teamId, inviteeId=None, inviteeEmail=None, message=None)
","text":"Create a membership invitation and send an email notification to the invitee.
PARAMETER DESCRIPTIONteamId
Synapse teamId
inviteeId
Synapse username or profile id of user
DEFAULT: None
inviteeEmail
Email of user
DEFAULT: None
message
Additional message for the user getting invited to the team.
DEFAULT: None
MembershipInvitation
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::send_membership_invitation\")\ndef send_membership_invitation(\n self, teamId, inviteeId=None, inviteeEmail=None, message=None\n):\n \"\"\"Create a membership invitation and send an email notification\n to the invitee.\n\n Arguments:\n teamId: Synapse teamId\n inviteeId: Synapse username or profile id of user\n inviteeEmail: Email of user\n message: Additional message for the user getting invited to the\n team.\n\n Returns:\n MembershipInvitation\n \"\"\"\n\n invite_request = {\"teamId\": str(teamId), \"message\": message}\n if inviteeEmail is not None:\n invite_request[\"inviteeEmail\"] = str(inviteeEmail)\n if inviteeId is not None:\n invite_request[\"inviteeId\"] = str(inviteeId)\n\n response = self.restPOST(\n \"/membershipInvitation\", body=json.dumps(invite_request)\n )\n return response\n
"},{"location":"reference/client/#synapseclient.Synapse.submit","title":"submit(evaluation, entity, name=None, team=None, silent=False, submitterAlias=None, teamName=None, dockerTag='latest')
","text":"Submit an Entity for evaluation.
PARAMETER DESCRIPTIONevaluation
Evaluation queue to submit to
entity
The Entity containing the Submissions
name
A name for this submission. In the absence of this parameter, the entity name will be used.
DEFAULT: None
team
(optional) A synapseclient.team.Team object, ID or name of a Team that is registered for the challenge
DEFAULT: None
silent
Set to True to suppress output.
DEFAULT: False
submitterAlias
(optional) A nickname, possibly for display in leaderboards in place of the submitter's name
DEFAULT: None
teamName
(deprecated) A synonym for submitterAlias
DEFAULT: None
dockerTag
(optional) The Docker tag must be specified if the entity is a DockerRepository.
DEFAULT: 'latest'
A synapseclient.evaluation.Submission object
In the case of challenges, a team can optionally be provided to give credit to members of the team that contributed to the submission. The team must be registered for the challenge with which the given evaluation is associated. The caller must be a member of the submitting team.
Using this functionGetting and submitting an evaluation
evaluation = syn.getEvaluation(123)\nentity = syn.get('syn456')\nsubmission = syn.submit(evaluation, entity, name='Our Final Answer', team='Blue Team')\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::submit\")\ndef submit(\n self,\n evaluation,\n entity,\n name=None,\n team=None,\n silent=False,\n submitterAlias=None,\n teamName=None,\n dockerTag=\"latest\",\n):\n \"\"\"\n Submit an Entity for [evaluation][synapseclient.evaluation.Evaluation].\n\n Arguments:\n evalation: Evaluation queue to submit to\n entity: The Entity containing the Submissions\n name: A name for this submission. In the absent of this parameter, the entity name will be used.\n (Optional) A [synapseclient.team.Team][] object, ID or name of a Team that is registered for the challenge\n team: (optional) A [synapseclient.team.Team][] object, ID or name of a Team that is registered for the challenge\n silent: Set to True to suppress output.\n submitterAlias: (optional) A nickname, possibly for display in leaderboards in place of the submitter's name\n teamName: (deprecated) A synonym for submitterAlias\n dockerTag: (optional) The Docker tag must be specified if the entity is a DockerRepository.\n\n Returns:\n A [synapseclient.evaluation.Submission][] object\n\n\n In the case of challenges, a team can optionally be provided to give credit to members of the team that\n contributed to the submission. The team must be registered for the challenge with which the given evaluation is\n associated. The caller must be a member of the submitting team.\n\n Example: Using this function\n Getting and submitting an evaluation\n\n evaluation = syn.getEvaluation(123)\n entity = syn.get('syn456')\n submission = syn.submit(evaluation, entity, name='Our Final Answer', team='Blue Team')\n \"\"\"\n\n require_param(evaluation, \"evaluation\")\n require_param(entity, \"entity\")\n\n evaluation_id = id_of(evaluation)\n\n entity_id = id_of(entity)\n if isinstance(entity, synapseclient.DockerRepository):\n # Edge case if dockerTag is specified as None\n if dockerTag is None:\n raise ValueError(\n \"A dockerTag is required to submit a DockerEntity. 
Cannot be None\"\n )\n docker_repository = entity[\"repositoryName\"]\n else:\n docker_repository = None\n\n if \"versionNumber\" not in entity:\n entity = self.get(entity, downloadFile=False)\n # version defaults to 1 to hack around required version field and allow submission of files/folders\n entity_version = entity.get(\"versionNumber\", 1)\n\n # default name of submission to name of entity\n if name is None and \"name\" in entity:\n name = entity[\"name\"]\n\n team_id = None\n if team:\n team = self.getTeam(team)\n team_id = id_of(team)\n\n contributors, eligibility_hash = self._get_contributors(evaluation_id, team)\n\n # for backward compatible until we remove supports for teamName\n if not submitterAlias:\n if teamName:\n submitterAlias = teamName\n elif team and \"name\" in team:\n submitterAlias = team[\"name\"]\n\n if isinstance(entity, synapseclient.DockerRepository):\n docker_digest = self._get_docker_digest(entity, dockerTag)\n else:\n docker_digest = None\n\n submission = {\n \"evaluationId\": evaluation_id,\n \"name\": name,\n \"entityId\": entity_id,\n \"versionNumber\": entity_version,\n \"dockerDigest\": docker_digest,\n \"dockerRepositoryName\": docker_repository,\n \"teamId\": team_id,\n \"contributors\": contributors,\n \"submitterAlias\": submitterAlias,\n }\n\n submitted = self._submit(submission, entity[\"etag\"], eligibility_hash)\n\n # if we want to display the receipt message, we need the full object\n if not silent:\n if not (isinstance(evaluation, Evaluation)):\n evaluation = self.getEvaluation(evaluation_id)\n if \"submissionReceiptMessage\" in evaluation:\n self.logger.info(evaluation[\"submissionReceiptMessage\"])\n\n return Submission(**submitted)\n
"},{"location":"reference/client/#synapseclient.Synapse.getConfigFile","title":"getConfigFile(configPath)
cached
","text":"Retrieves the client configuration information.
PARAMETER DESCRIPTIONconfigPath
Path to configuration file on local file system
TYPE: str
RawConfigParser
A RawConfigParser populated with properties from the user's configuration file.
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getConfigFile\")\n@functools.lru_cache()\ndef getConfigFile(self, configPath: str) -> configparser.RawConfigParser:\n \"\"\"\n Retrieves the client configuration information.\n\n Arguments:\n configPath: Path to configuration file on local file system\n\n Returns:\n A RawConfigParser populated with properties from the user's configuration file.\n \"\"\"\n\n try:\n config = configparser.RawConfigParser()\n config.read(configPath) # Does not fail if the file does not exist\n return config\n except configparser.Error as ex:\n raise ValueError(\n \"Error parsing Synapse config file: {}\".format(configPath)\n ) from ex\n
"},{"location":"reference/client/#synapseclient.Synapse.setEndpoints","title":"setEndpoints(repoEndpoint=None, authEndpoint=None, fileHandleEndpoint=None, portalEndpoint=None, skip_checks=False)
","text":"Sets the locations for each of the Synapse services (mostly useful for testing).
PARAMETER DESCRIPTIONrepoEndpoint
Location of synapse repository
TYPE: str
DEFAULT: None
authEndpoint
Location of authentication service
TYPE: str
DEFAULT: None
fileHandleEndpoint
Location of file service
TYPE: str
DEFAULT: None
portalEndpoint
Location of the website
TYPE: str
DEFAULT: None
skip_checks
Skip version and endpoint checks
TYPE: bool
DEFAULT: False
To switch between staging and production endpoints
syn.setEndpoints(**synapseclient.client.STAGING_ENDPOINTS)\nsyn.setEndpoints(**synapseclient.client.PRODUCTION_ENDPOINTS)\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::setEndpoints\")\ndef setEndpoints(\n self,\n repoEndpoint: str = None,\n authEndpoint: str = None,\n fileHandleEndpoint: str = None,\n portalEndpoint: str = None,\n skip_checks: bool = False,\n):\n \"\"\"\n Sets the locations for each of the Synapse services (mostly useful for testing).\n\n Arguments:\n repoEndpoint: Location of synapse repository\n authEndpoint: Location of authentication service\n fileHandleEndpoint: Location of file service\n portalEndpoint: Location of the website\n skip_checks: Skip version and endpoint checks\n\n Example: Switching endpoints\n To switch between staging and production endpoints\n\n syn.setEndpoints(**synapseclient.client.STAGING_ENDPOINTS)\n syn.setEndpoints(**synapseclient.client.PRODUCTION_ENDPOINTS)\n\n \"\"\"\n\n endpoints = {\n \"repoEndpoint\": repoEndpoint,\n \"authEndpoint\": authEndpoint,\n \"fileHandleEndpoint\": fileHandleEndpoint,\n \"portalEndpoint\": portalEndpoint,\n }\n\n # For unspecified endpoints, first look in the config file\n config = self.getConfigFile(self.configPath)\n for point in endpoints.keys():\n if endpoints[point] is None and config.has_option(\"endpoints\", point):\n endpoints[point] = config.get(\"endpoints\", point)\n\n # Endpoints default to production\n for point in endpoints.keys():\n if endpoints[point] is None:\n endpoints[point] = PRODUCTION_ENDPOINTS[point]\n\n # Update endpoints if we get redirected\n if not skip_checks:\n response = self._requests_session.get(\n endpoints[point],\n allow_redirects=False,\n headers=synapseclient.USER_AGENT,\n )\n if response.status_code == 301:\n endpoints[point] = response.headers[\"location\"]\n\n self.repoEndpoint = endpoints[\"repoEndpoint\"]\n self.authEndpoint = endpoints[\"authEndpoint\"]\n self.fileHandleEndpoint = endpoints[\"fileHandleEndpoint\"]\n self.portalEndpoint = endpoints[\"portalEndpoint\"]\n
"},{"location":"reference/client/#synapseclient.Synapse.invalidateAPIKey","title":"invalidateAPIKey()
","text":"Invalidates authentication across all clients.
RETURNS DESCRIPTIONNone
Source code insynapseclient/client.py
def invalidateAPIKey(self):\n \"\"\"Invalidates authentication across all clients.\n\n Returns:\n None\n \"\"\"\n\n # Logout globally\n if self._is_logged_in():\n self.restDELETE(\"/secretKey\", endpoint=self.authEndpoint)\n
"},{"location":"reference/client/#synapseclient.Synapse.get_user_profile_by_username","title":"get_user_profile_by_username(username=None, sessionToken=None)
cached
","text":"Get the details about a Synapse user. Retrieves information on the current user if 'id' is omitted or is empty string.
PARAMETER DESCRIPTIONusername
The userName of a user
TYPE: str
DEFAULT: None
sessionToken
The session token to use to find the user profile
TYPE: str
DEFAULT: None
UserProfile
The user profile for the user of interest.
Using this functionGetting your own profile
my_profile = syn.get_user_profile_by_username()\n
Getting another user's profile
freds_profile = syn.get_user_profile_by_username('fredcommo')\n
Source code in synapseclient/client.py
@functools.lru_cache()\ndef get_user_profile_by_username(\n    self,\n    username: str = None,\n    sessionToken: str = None,\n) -> UserProfile:\n    \"\"\"\n    Get the details about a Synapse user.\n    Retrieves information on the current user if 'username' is omitted or is an empty string.\n\n    Arguments:\n        username: The userName of a user\n        sessionToken: The session token to use to find the user profile\n\n    Returns:\n        The user profile for the user of interest.\n\n    Example: Using this function\n        Getting your own profile\n\n            my_profile = syn.get_user_profile_by_username()\n\n        Getting another user's profile\n\n            freds_profile = syn.get_user_profile_by_username('fredcommo')\n    \"\"\"\n    is_none = username is None\n    is_str = isinstance(username, str)\n    if not is_str and not is_none:\n        raise TypeError(\"username must be string or None\")\n    if is_str:\n        principals = self._findPrincipals(username)\n        for principal in principals:\n            if principal.get(\"userName\", None).lower() == username.lower():\n                id = principal[\"ownerId\"]\n                break\n        else:\n            raise ValueError(f\"Can't find user '{username}'\")\n    else:\n        id = \"\"\n    uri = f\"/userProfile/{id}\"\n    return UserProfile(\n        **self.restGET(\n            uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n        )\n    )\n
"},{"location":"reference/client/#synapseclient.Synapse.get_user_profile_by_id","title":"get_user_profile_by_id(id=None, sessionToken=None)
cached
","text":"Get the details about a Synapse user. Retrieves information on the current user if 'id' is omitted.
PARAMETER DESCRIPTIONid
The ownerId of a user
TYPE: int
DEFAULT: None
sessionToken
The session token to use to find the user profile
TYPE: str
DEFAULT: None
UserProfile
The user profile for the user of interest.
Using this functionGetting your own profile
my_profile = syn.get_user_profile_by_id()\n
Getting another user's profile
freds_profile = syn.get_user_profile_by_id(1234567)\n
Source code in synapseclient/client.py
@functools.lru_cache()\ndef get_user_profile_by_id(\n self,\n id: int = None,\n sessionToken: str = None,\n) -> UserProfile:\n \"\"\"\n Get the details about a Synapse user.\n Retrieves information on the current user if 'id' is omitted.\n\n Arguments:\n id: The ownerId of a user\n sessionToken: The session token to use to find the user profile\n\n Returns:\n The user profile for the user of interest.\n\n\n Example: Using this function\n Getting your own profile\n\n my_profile = syn.get_user_profile_by_id()\n\n Getting another user's profile\n\n freds_profile = syn.get_user_profile_by_id(1234567)\n \"\"\"\n if id:\n if not isinstance(id, int):\n raise TypeError(\"id must be an 'ownerId' integer\")\n else:\n id = \"\"\n uri = f\"/userProfile/{id}\"\n return UserProfile(\n **self.restGET(\n uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n )\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.getUserProfile","title":"getUserProfile(id=None, sessionToken=None)
cached
","text":"Get the details about a Synapse user. Retrieves information on the current user if 'id' is omitted.
PARAMETER DESCRIPTIONid
The 'userId' (aka 'ownerId') of a user or the userName
TYPE: Union[str, int, UserProfile, TeamMember]
DEFAULT: None
sessionToken
The session token to use to find the user profile
TYPE: str
DEFAULT: None
UserProfile
The user profile for the user of interest.
Using this functionGetting your own profile
my_profile = syn.getUserProfile()\n
Getting another user's profile
freds_profile = syn.getUserProfile('fredcommo')\n
Source code in synapseclient/client.py
@functools.lru_cache()\ndef getUserProfile(\n self,\n id: Union[str, int, UserProfile, TeamMember] = None,\n sessionToken: str = None,\n) -> UserProfile:\n \"\"\"\n Get the details about a Synapse user.\n Retrieves information on the current user if 'id' is omitted.\n\n Arguments:\n id: The 'userId' (aka 'ownerId') of a user or the userName\n sessionToken: The session token to use to find the user profile\n\n Returns:\n The user profile for the user of interest.\n\n Example: Using this function\n Getting your own profile\n\n my_profile = syn.getUserProfile()\n\n Getting another user's profile\n\n freds_profile = syn.getUserProfile('fredcommo')\n \"\"\"\n try:\n # if id is unset or a userID, this will succeed\n id = \"\" if id is None else int(id)\n except (TypeError, ValueError):\n if isinstance(id, collections.abc.Mapping) and \"ownerId\" in id:\n id = id.ownerId\n elif isinstance(id, TeamMember):\n id = id.member.ownerId\n else:\n principals = self._findPrincipals(id)\n if len(principals) == 1:\n id = principals[0][\"ownerId\"]\n else:\n for principal in principals:\n if principal.get(\"userName\", None).lower() == id.lower():\n id = principal[\"ownerId\"]\n break\n else: # no break\n raise ValueError('Can\\'t find user \"%s\": ' % id)\n uri = \"/userProfile/%s\" % id\n return UserProfile(\n **self.restGET(\n uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n )\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.is_certified","title":"is_certified(user)
","text":"Determines whether a Synapse user is a certified user.
PARAMETER DESCRIPTIONuser
Synapse username or Id
TYPE: Union[str, int]
bool
True if the Synapse user is certified
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::is_certified\")\ndef is_certified(self, user: typing.Union[str, int]) -> bool:\n \"\"\"Determines whether a Synapse user is a certified user.\n\n Arguments:\n user: Synapse username or Id\n\n Returns:\n True if the Synapse user is certified\n \"\"\"\n # Check if userid or username exists\n syn_user = self.getUserProfile(user)\n # Get passing record\n\n try:\n certification_status = self._get_certified_passing_record(\n syn_user[\"ownerId\"]\n )\n return certification_status[\"passed\"]\n except SynapseHTTPError as ex:\n if ex.response.status_code == 404:\n # user hasn't taken the quiz\n return False\n raise\n
"},{"location":"reference/client/#synapseclient.Synapse.is_synapse_id","title":"is_synapse_id(syn_id)
","text":"Checks if given synID is valid (attached to actual entity?)
PARAMETER DESCRIPTIONsyn_id
A Synapse ID
TYPE: str
bool
True if the Synapse ID is valid
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::is_synapse_id\")\ndef is_synapse_id(self, syn_id: str) -> bool:\n \"\"\"Checks if given synID is valid (attached to actual entity?)\n\n Arguments:\n syn_id: A Synapse ID\n\n Returns:\n True if the Synapse ID is valid\n \"\"\"\n if isinstance(syn_id, str):\n try:\n self.get(syn_id, downloadFile=False)\n except SynapseFileNotFoundError:\n return False\n except (\n SynapseHTTPError,\n SynapseAuthenticationError,\n ) as err:\n status = (\n err.__context__.response.status_code or err.response.status_code\n )\n if status in (400, 404):\n return False\n # Valid ID but user lacks permission or is not logged in\n elif status == 403:\n return True\n return True\n self.logger.warning(\"synID must be a string\")\n return False\n
"},{"location":"reference/client/#synapseclient.Synapse.onweb","title":"onweb(entity, subpageId=None)
","text":"Opens up a browser window to the entity page or wiki-subpage.
PARAMETER DESCRIPTIONentity
Either an Entity or a Synapse ID
subpageId
(Optional) ID of one of the wiki's sub-pages
DEFAULT: None
None
Source code insynapseclient/client.py
def onweb(self, entity, subpageId=None):\n \"\"\"Opens up a browser window to the entity page or wiki-subpage.\n\n Arguments:\n entity: Either an Entity or a Synapse ID\n subpageId: (Optional) ID of one of the wiki's sub-pages\n\n Returns:\n None\n \"\"\"\n if isinstance(entity, str) and os.path.isfile(entity):\n entity = self.get(entity, downloadFile=False)\n synId = id_of(entity)\n if subpageId is None:\n webbrowser.open(\"%s#!Synapse:%s\" % (self.portalEndpoint, synId))\n else:\n webbrowser.open(\n \"%s#!Wiki:%s/ENTITY/%s\" % (self.portalEndpoint, synId, subpageId)\n )\n
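A minimal usage sketch (the entity ID and wiki sub-page ID are placeholders):
syn.onweb('syn123')  # open the entity page in the default browser\nsyn.onweb('syn123', subpageId='99999')  # open a specific wiki sub-page\n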
"},{"location":"reference/client/#synapseclient.Synapse.printEntity","title":"printEntity(entity, ensure_ascii=True)
","text":"Pretty prints an Entity.
PARAMETER DESCRIPTIONentity
The entity to be printed.
ensure_ascii
If True, escapes all non-ASCII characters
DEFAULT: True
None
None
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::printEntity\")\ndef printEntity(self, entity, ensure_ascii=True) -> None:\n \"\"\"\n Pretty prints an Entity.\n\n Arguments:\n entity: The entity to be printed.\n ensure_ascii: If True, escapes all non-ASCII characters\n\n Returns:\n None\n \"\"\"\n\n if utils.is_synapse_id_str(entity):\n entity = self._getEntity(entity)\n try:\n self.logger.info(\n json.dumps(entity, sort_keys=True, indent=2, ensure_ascii=ensure_ascii)\n )\n except TypeError:\n self.logger.info(str(entity))\n
"},{"location":"reference/client/#synapseclient.Synapse.get_available_services","title":"get_available_services()
","text":"Get available Synapse services This is a beta feature and is subject to change
RETURNS DESCRIPTIONList[str]
List of available services
Source code insynapseclient/client.py
def get_available_services(self) -> typing.List[str]:\n    \"\"\"Get available Synapse services.\n    This is a beta feature and is subject to change.\n\n    Returns:\n        List of available services\n    \"\"\"\n    services = self._services.keys()\n    return list(services)\n
"},{"location":"reference/client/#synapseclient.Synapse.service","title":"service(service_name)
","text":"Get available Synapse services This is a beta feature and is subject to change
PARAMETER DESCRIPTIONservice_name
name of the service
TYPE: str
synapseclient/client.py
def service(self, service_name: str):\n    \"\"\"Get a Synapse service client by name.\n    This is a beta feature and is subject to change.\n\n    Arguments:\n        service_name: name of the service\n    \"\"\"\n    # This is to avoid circular imports\n    # TODO: revisit the import order and method https://stackoverflow.com/a/37126790\n    # To move this to the top\n    import synapseclient.services\n\n    assert isinstance(service_name, str)\n    service_name = service_name.lower().replace(\" \", \"_\")\n    assert service_name in self._services, (\n        f\"Unrecognized service ({service_name}). Run the 'get_available_\"\n        \"services()' method to get a list of available services.\"\n    )\n    service_attr = self._services[service_name]\n    service_cls = getattr(synapseclient.services, service_attr)\n    service = service_cls(self)\n    return service\n
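A minimal usage sketch; the available service names vary by release, so list them first rather than assuming a particular name:
available = syn.get_available_services()\nprint(available)\nservice = syn.service(available[0])\n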
"},{"location":"reference/client/#synapseclient.Synapse.clear_download_list","title":"clear_download_list()
","text":"Clear all files from download list
Source code insynapseclient/client.py
def clear_download_list(self):\n \"\"\"Clear all files from download list\"\"\"\n self.restDELETE(\"/download/list\")\n
"},{"location":"reference/client/#synapseclient.Synapse.create_external_s3_file_handle","title":"create_external_s3_file_handle(bucket_name, s3_file_key, file_path, *, parent=None, storage_location_id=None, mimetype=None)
","text":"Create an external S3 file handle for e.g. a file that has been uploaded directly to an external S3 storage location.
PARAMETER DESCRIPTIONbucket_name
Name of the S3 bucket
s3_file_key
S3 key of the uploaded object
file_path
Local path of the uploaded file
parent
Parent entity to create the file handle in; the file handle will be created in the default storage location of the parent. Mutually exclusive with storage_location_id
DEFAULT: None
storage_location_id
Explicit storage location id to create the file handle in, mutually exclusive with parent
DEFAULT: None
mimetype
Mimetype of the file, if known
DEFAULT: None
ValueError
If neither parent nor storage_location_id is specified, or if both are specified.
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::create_external_s3_file_handle\")\ndef create_external_s3_file_handle(\n self,\n bucket_name,\n s3_file_key,\n file_path,\n *,\n parent=None,\n storage_location_id=None,\n mimetype=None,\n):\n \"\"\"\n Create an external S3 file handle for e.g. a file that has been uploaded directly to\n an external S3 storage location.\n\n Arguments:\n bucket_name: Name of the S3 bucket\n s3_file_key: S3 key of the uploaded object\n file_path: Local path of the uploaded file\n parent: Parent entity to create the file handle in, the file handle will be created\n in the default storage location of the parent. Mutually exclusive with\n storage_location_id\n storage_location_id: Explicit storage location id to create the file handle in, mutually exclusive\n with parent\n mimetype: Mimetype of the file, if known\n\n Raises:\n ValueError: If neither parent nor storage_location_id is specified, or if both are specified.\n \"\"\"\n\n if storage_location_id:\n if parent:\n raise ValueError(\"Pass parent or storage_location_id, not both\")\n elif not parent:\n raise ValueError(\"One of parent or storage_location_id is required\")\n else:\n upload_destination = self._getDefaultUploadDestination(parent)\n storage_location_id = upload_destination[\"storageLocationId\"]\n\n if mimetype is None:\n mimetype, enc = mimetypes.guess_type(file_path, strict=False)\n\n file_handle = {\n \"concreteType\": concrete_types.S3_FILE_HANDLE,\n \"key\": s3_file_key,\n \"bucketName\": bucket_name,\n \"fileName\": os.path.basename(file_path),\n \"contentMd5\": utils.md5_for_file(file_path).hexdigest(),\n \"contentSize\": os.stat(file_path).st_size,\n \"storageLocationId\": storage_location_id,\n \"contentType\": mimetype,\n }\n\n return self.restPOST(\n \"/externalFileHandle/s3\",\n json.dumps(file_handle),\n endpoint=self.fileHandleEndpoint,\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.getMyStorageLocationSetting","title":"getMyStorageLocationSetting(storage_location_id)
","text":"Get a StorageLocationSetting by its id.
PARAMETER DESCRIPTIONstorage_location_id
id of the StorageLocationSetting to retrieve. The corresponding StorageLocationSetting must have been created by this user.
RETURNS DESCRIPTION
A dict describing the StorageLocationSetting retrieved by its id
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getMyStorageLocationSetting\")\ndef getMyStorageLocationSetting(self, storage_location_id):\n \"\"\"\n Get a StorageLocationSetting by its id.\n\n Arguments:\n storage_location_id: id of the StorageLocationSetting to retrieve.\n The corresponding StorageLocationSetting must have been created by this user.\n\n Returns:\n A dict describing the StorageLocationSetting retrieved by its id\n \"\"\"\n return self.restGET(\"/storageLocation/%s\" % storage_location_id)\n
"},{"location":"reference/client/#synapseclient.Synapse.createStorageLocationSetting","title":"createStorageLocationSetting(storage_type, **kwargs)
","text":"Creates an IMMUTABLE storage location based on the specified type.
For each storage_type, the following kwargs should be specified:
ExternalObjectStorage: (S3-like (e.g. AWS S3 or Openstack) bucket not accessed by Synapse)
ExternalS3Storage: (Amazon S3 bucket accessed by Synapse)
ExternalStorage: (SFTP or FTP storage location not accessed by Synapse)
ProxyStorage: (a proxy server that controls access to a storage)
storage_type
The type of the StorageLocationSetting to create
banner
(Optional) The optional banner to show every time a file is uploaded
description
(Optional) The description to show the user when the user has to choose which upload destination to use
kwargs
fields necessary for creation of the specified storage_type
DEFAULT: {}
A dict of the created StorageLocationSetting
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::createStorageLocationSetting\")\ndef createStorageLocationSetting(self, storage_type, **kwargs):\n \"\"\"\n Creates an IMMUTABLE storage location based on the specified type.\n\n For each storage_type, the following kwargs should be specified:\n\n **ExternalObjectStorage**: (S3-like (e.g. AWS S3 or Openstack) bucket not accessed by Synapse)\n\n - endpointUrl: endpoint URL of the S3 service (for example: 'https://s3.amazonaws.com')\n - bucket: the name of the bucket to use\n\n **ExternalS3Storage**: (Amazon S3 bucket accessed by Synapse)\n\n - bucket: the name of the bucket to use\n\n **ExternalStorage**: (SFTP or FTP storage location not accessed by Synapse)\n\n - url: the base URL for uploading to the external destination\n - supportsSubfolders(optional): does the destination support creating subfolders under the base url\n (default: false)\n\n **ProxyStorage**: (a proxy server that controls access to a storage)\n\n - secretKey: The encryption key used to sign all pre-signed URLs used to communicate with the proxy.\n - proxyUrl: The HTTPS URL of the proxy used for upload and download.\n\n Arguments:\n storage_type: The type of the StorageLocationSetting to create\n banner: (Optional) The optional banner to show every time a file is uploaded\n description: (Optional) The description to show the user when the user has to choose which upload destination to use\n kwargs: fields necessary for creation of the specified storage_type\n\n Returns:\n A dict of the created StorageLocationSetting\n \"\"\"\n upload_type_dict = {\n \"ExternalObjectStorage\": \"S3\",\n \"ExternalS3Storage\": \"S3\",\n \"ExternalStorage\": \"SFTP\",\n \"ProxyStorage\": \"PROXYLOCAL\",\n }\n\n if storage_type not in upload_type_dict:\n raise ValueError(\"Unknown storage_type: %s\", storage_type)\n\n # ProxyStorageLocationSettings has an extra 's' at the end >:(\n kwargs[\"concreteType\"] = (\n \"org.sagebionetworks.repo.model.project.\"\n + storage_type\n + \"LocationSetting\"\n + (\"s\" if storage_type == \"ProxyStorage\" else \"\")\n )\n kwargs[\"uploadType\"] = upload_type_dict[storage_type]\n\n return self.restPOST(\"/storageLocation\", body=json.dumps(kwargs))\n
"},{"location":"reference/client/#synapseclient.Synapse.create_s3_storage_location","title":"create_s3_storage_location(*, parent=None, folder_name=None, folder=None, bucket_name=None, base_key=None, sts_enabled=False)
","text":"Create a storage location in the given parent, either in the given folder or by creating a new folder in that parent with the given name. This will both create a StorageLocationSetting, and a ProjectSetting together, optionally creating a new folder in which to locate it, and optionally enabling this storage location for access via STS. If enabling an existing folder for STS, it must be empty.
PARAMETER DESCRIPTIONparent
The parent in which to locate the storage location (mutually exclusive with folder)
DEFAULT: None
folder_name
The name of a new folder to create (mutually exclusive with folder)
DEFAULT: None
folder
The existing folder in which to create the storage location (mutually exclusive with folder_name)
DEFAULT: None
bucket_name
The name of an S3 bucket, if this is an external storage location, if None will use Synapse S3 storage
DEFAULT: None
base_key
The base key within the bucket, None to use the bucket root; only applicable if bucket_name is passed
DEFAULT: None
sts_enabled
Whether this storage location should be STS enabled
DEFAULT: False
A 3-tuple of the Synapse Folder, the storage location setting dict, and the project setting dict.
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::create_s3_storage_location\")\ndef create_s3_storage_location(\n self,\n *,\n parent=None,\n folder_name=None,\n folder=None,\n bucket_name=None,\n base_key=None,\n sts_enabled=False,\n):\n \"\"\"\n Create a storage location in the given parent, either in the given folder or by creating a new\n folder in that parent with the given name. This will both create a StorageLocationSetting,\n and a ProjectSetting together, optionally creating a new folder in which to locate it,\n and optionally enabling this storage location for access via STS. If enabling an existing folder for STS,\n it must be empty.\n\n Arguments:\n parent: The parent in which to locate the storage location (mutually exclusive with folder)\n folder_name: The name of a new folder to create (mutually exclusive with folder)\n folder: The existing folder in which to create the storage location (mutually exclusive with folder_name)\n bucket_name: The name of an S3 bucket, if this is an external storage location,\n if None will use Synapse S3 storage\n base_key: The base key of within the bucket, None to use the bucket root,\n only applicable if bucket_name is passed\n sts_enabled: Whether this storage location should be STS enabled\n\n Returns:\n A 3-tuple of the synapse Folder, a the storage location setting, and the project setting dictionaries.\n \"\"\"\n if folder_name and parent:\n if folder:\n raise ValueError(\n \"folder and folder_name are mutually exclusive, only one should be passed\"\n )\n\n folder = self.store(Folder(name=folder_name, parent=parent))\n\n elif not folder:\n raise ValueError(\"either folder or folder_name should be required\")\n\n storage_location_kwargs = {\n \"uploadType\": \"S3\",\n \"stsEnabled\": sts_enabled,\n }\n\n if bucket_name:\n storage_location_kwargs[\n \"concreteType\"\n ] = concrete_types.EXTERNAL_S3_STORAGE_LOCATION_SETTING\n storage_location_kwargs[\"bucket\"] = bucket_name\n if base_key:\n storage_location_kwargs[\"baseKey\"] = base_key\n else:\n storage_location_kwargs[\n \"concreteType\"\n ] = concrete_types.SYNAPSE_S3_STORAGE_LOCATION_SETTING\n\n storage_location_setting = self.restPOST(\n \"/storageLocation\", json.dumps(storage_location_kwargs)\n )\n\n storage_location_id = storage_location_setting[\"storageLocationId\"]\n project_setting = self.setStorageLocation(\n folder,\n storage_location_id,\n )\n\n return folder, storage_location_setting, project_setting\n
"},{"location":"reference/client/#synapseclient.Synapse.setStorageLocation","title":"setStorageLocation(entity, storage_location_id)
","text":"Sets the storage location for a Project or Folder
PARAMETER DESCRIPTIONentity
A Project or Folder to which the StorageLocationSetting is set
storage_location_id
A StorageLocation id or a list of StorageLocation ids. Pass in None for the default Synapse storage.
RETURNS DESCRIPTION
The created or updated settings as a dict.
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::setStorageLocation\")\ndef setStorageLocation(self, entity, storage_location_id):\n \"\"\"\n Sets the storage location for a Project or Folder\n\n Arguments:\n entity: A Project or Folder to which the StorageLocationSetting is set\n storage_location_id: A StorageLocation id or a list of StorageLocation ids. Pass in None for the default\n Synapse storage.\n\n Returns:\n The created or updated settings as a dict.\n \"\"\"\n if storage_location_id is None:\n storage_location_id = DEFAULT_STORAGE_LOCATION_ID\n locations = (\n storage_location_id\n if isinstance(storage_location_id, list)\n else [storage_location_id]\n )\n\n existing_setting = self.getProjectSetting(entity, \"upload\")\n if existing_setting is not None:\n existing_setting[\"locations\"] = locations\n self.restPUT(\"/projectSettings\", body=json.dumps(existing_setting))\n return self.getProjectSetting(entity, \"upload\")\n else:\n project_destination = {\n \"concreteType\": \"org.sagebionetworks.repo.model.project.UploadDestinationListSetting\",\n \"settingsType\": \"upload\",\n \"locations\": locations,\n \"projectId\": id_of(entity),\n }\n\n return self.restPOST(\n \"/projectSettings\", body=json.dumps(project_destination)\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.get_sts_storage_token","title":"get_sts_storage_token(entity, permission, *, output_format='json', min_remaining_life=None)
","text":"Get STS credentials for the given entity_id and permission, outputting it in the given format
PARAMETER DESCRIPTIONentity
The entity or entity id whose credentials are being returned
permission
One of:
read_only
read_write
output_format
One of:
json
: the dictionary returned from the Synapse STS API including expirationboto
: a dictionary compatible with a boto session (aws_access_key_id, etc)shell
: output commands for exporting credentials appropriate for the detected shellbash
: output commands for exporting credentials into a bash shellcmd
: output commands for exporting credentials into a windows cmd shellpowershell
: output commands for exporting credentials into a windows powershell DEFAULT: 'json'
min_remaining_life
The minimum allowable remaining life on a cached token to return. If a cached token has less than this amount of time remaining, a fresh token will be fetched
DEFAULT: None
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_sts_storage_token\")\ndef get_sts_storage_token(\n self, entity, permission, *, output_format=\"json\", min_remaining_life=None\n):\n \"\"\"Get STS credentials for the given entity_id and permission, outputting it in the given format\n\n Arguments:\n entity: The entity or entity id whose credentials are being returned\n permission: One of:\n\n - `read_only`\n - `read_write`\n output_format: One of:\n\n - `json`: the dictionary returned from the Synapse STS API including expiration\n - `boto`: a dictionary compatible with a boto session (aws_access_key_id, etc)\n - `shell`: output commands for exporting credentials appropriate for the detected shell\n - `bash`: output commands for exporting credentials into a bash shell\n - `cmd`: output commands for exporting credentials into a windows cmd shell\n - `powershell`: output commands for exporting credentials into a windows powershell\n min_remaining_life: The minimum allowable remaining life on a cached token to return. If a cached token\n has left than this amount of time left a fresh token will be fetched\n \"\"\"\n return sts_transfer.get_sts_credentials(\n self,\n id_of(entity),\n permission,\n output_format=output_format,\n min_remaining_life=min_remaining_life,\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.create_snapshot_version","title":"create_snapshot_version(table, comment=None, label=None, activity=None, wait=True)
","text":"Create a new Table Version, new View version, or new Dataset version.
PARAMETER DESCRIPTIONtable
The schema of the Table/View, or its ID.
TYPE: Union[EntityViewSchema, Schema, str, SubmissionViewSchema, Dataset]
comment
Optional snapshot comment.
TYPE: str
DEFAULT: None
label
Optional snapshot label.
TYPE: str
DEFAULT: None
activity
Optional activity ID applied to snapshot version.
TYPE: Union[Activity, str]
DEFAULT: None
wait
True if this method should return the snapshot version after waiting for any necessary asynchronous table updates to complete. If False this method will return as soon as any updates are initiated.
TYPE: bool
DEFAULT: True
int
The snapshot version number if wait=True, None if wait=False
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::create_snapshot_version\")\ndef create_snapshot_version(\n self,\n table: typing.Union[\n EntityViewSchema, Schema, str, SubmissionViewSchema, Dataset\n ],\n comment: str = None,\n label: str = None,\n activity: typing.Union[Activity, str] = None,\n wait: bool = True,\n) -> int:\n \"\"\"Create a new Table Version, new View version, or new Dataset version.\n\n Arguments:\n table: The schema of the Table/View, or its ID.\n comment: Optional snapshot comment.\n label: Optional snapshot label.\n activity: Optional activity ID applied to snapshot version.\n wait: True if this method should return the snapshot version after waiting for any necessary\n asynchronous table updates to complete. If False this method will return\n as soon as any updates are initiated.\n\n Returns:\n The snapshot version number if wait=True, None if wait=False\n \"\"\"\n ent = self.get(id_of(table), downloadFile=False)\n if isinstance(ent, (EntityViewSchema, SubmissionViewSchema, Dataset)):\n result = self._async_table_update(\n table,\n create_snapshot=True,\n comment=comment,\n label=label,\n activity=activity,\n wait=wait,\n )\n elif isinstance(ent, Schema):\n result = self._create_table_snapshot(\n table,\n comment=comment,\n label=label,\n activity=activity,\n )\n else:\n raise ValueError(\n \"This function only accepts Synapse ids of Tables or Views\"\n )\n\n # for consistency we return nothing if wait=False since we can't\n # supply the snapshot version on an async table update without waiting\n return result[\"snapshotVersionNumber\"] if wait else None\n
"},{"location":"reference/client/#synapseclient.Synapse.getConfigFile","title":"getConfigFile(configPath)
cached
","text":"Retrieves the client configuration information.
PARAMETER DESCRIPTIONconfigPath
Path to configuration file on local file system
TYPE: str
RawConfigParser
A RawConfigParser populated with properties from the user's configuration file.
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getConfigFile\")\n@functools.lru_cache()\ndef getConfigFile(self, configPath: str) -> configparser.RawConfigParser:\n \"\"\"\n Retrieves the client configuration information.\n\n Arguments:\n configPath: Path to configuration file on local file system\n\n Returns:\n A RawConfigParser populated with properties from the user's configuration file.\n \"\"\"\n\n try:\n config = configparser.RawConfigParser()\n config.read(configPath) # Does not fail if the file does not exist\n return config\n except configparser.Error as ex:\n raise ValueError(\n \"Error parsing Synapse config file: {}\".format(configPath)\n ) from ex\n
"},{"location":"reference/client/#synapseclient.Synapse.getEvaluation","title":"getEvaluation(id)
","text":"Gets an Evaluation object from Synapse.
PARAMETER DESCRIPTIONid
The ID of the synapseclient.evaluation.Evaluation to return.
RETURNS DESCRIPTION
An synapseclient.evaluation.Evaluation object
Using this functionCreating an Evaluation instance
evaluation = syn.getEvaluation(2005090)\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getEvaluation\")\ndef getEvaluation(self, id):\n \"\"\"\n Gets an Evaluation object from Synapse.\n\n Arguments:\n id: The ID of the [synapseclient.evaluation.Evaluation][] to return.\n\n Returns:\n An [synapseclient.evaluation.Evaluation][] object\n\n Example: Using this function\n Creating an Evaluation instance\n\n evaluation = syn.getEvaluation(2005090)\n \"\"\"\n\n evaluation_id = id_of(id)\n uri = Evaluation.getURI(evaluation_id)\n return Evaluation(**self.restGET(uri))\n
"},{"location":"reference/client/#synapseclient.Synapse.getEvaluationByContentSource","title":"getEvaluationByContentSource(entity)
","text":"Returns a generator over evaluations that derive their content from the given entity
PARAMETER DESCRIPTIONentity
The synapseclient.entity.Project whose Evaluations are to be fetched.
YIELDS DESCRIPTION
A generator over synapseclient.evaluation.Evaluation objects for the given synapseclient.entity.Project.
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getEvaluationByContentSource\")\ndef getEvaluationByContentSource(self, entity):\n \"\"\"\n Returns a generator over evaluations that derive their content from the given entity\n\n Arguments:\n entity: The [synapseclient.entity.Project][] whose Evaluations are to be fetched.\n\n Yields:\n A generator over [synapseclient.evaluation.Evaluation][] objects for the given [synapseclient.entity.Project][].\n \"\"\"\n\n entityId = id_of(entity)\n url = \"/entity/%s/evaluation\" % entityId\n\n for result in self._GET_paginated(url):\n yield Evaluation(**result)\n
"},{"location":"reference/client/#synapseclient.Synapse.getEvaluationByName","title":"getEvaluationByName(name)
","text":"Gets an Evaluation object from Synapse.
PARAMETER DESCRIPTIONname
The name of the synapseclient.evaluation.Evaluation to return.
RETURNS DESCRIPTION
An synapseclient.evaluation.Evaluation object
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getEvaluationByName\")\ndef getEvaluationByName(self, name):\n \"\"\"\n Gets an Evaluation object from Synapse.\n\n Arguments:\n Name: The name of the [synapseclient.evaluation.Evaluation][] to return.\n\n Returns:\n An [synapseclient.evaluation.Evaluation][] object\n \"\"\"\n uri = Evaluation.getByNameURI(name)\n return Evaluation(**self.restGET(uri))\n
"},{"location":"reference/client/#synapseclient.Synapse.getProjectSetting","title":"getProjectSetting(project, setting_type)
","text":"Gets the ProjectSetting for a project.
PARAMETER DESCRIPTIONproject
Project entity or its id as a string
setting_type
Type of setting. Choose from:
upload
external_sync
requester_pays
RETURNS DESCRIPTION
The ProjectSetting as a dict or None if no settings of the specified type exist.
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getProjectSetting\")\ndef getProjectSetting(self, project, setting_type):\n \"\"\"\n Gets the ProjectSetting for a project.\n\n Arguments:\n project: Project entity or its id as a string\n setting_type: Type of setting. Choose from:\n\n - `upload`\n - `external_sync`\n - `requester_pays`\n\n Returns:\n The ProjectSetting as a dict or None if no settings of the specified type exist.\n \"\"\"\n if setting_type not in {\"upload\", \"external_sync\", \"requester_pays\"}:\n raise ValueError(\"Invalid project_type: %s\" % setting_type)\n\n response = self.restGET(\n \"/projectSettings/{projectId}/type/{type}\".format(\n projectId=id_of(project), type=setting_type\n )\n )\n return (\n response if response else None\n ) # if no project setting, a empty string is returned as the response\n
"},{"location":"reference/client/#synapseclient.Synapse.getSubmission","title":"getSubmission(id, **kwargs)
","text":"Gets a synapseclient.evaluation.Submission object by its id.
PARAMETER DESCRIPTIONid
The id of the submission to retrieve
RETURNS DESCRIPTION
A synapseclient.evaluation.Submission object
object
See:
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getSubmission\")\ndef getSubmission(self, id, **kwargs):\n \"\"\"\n Gets a [synapseclient.evaluation.Submission][] object by its id.\n\n Arguments:\n id: The id of the submission to retrieve\n\n Returns:\n A [synapseclient.evaluation.Submission][] object\n\n\n :param id: The id of the submission to retrieve\n\n :return: a :py:class:`synapseclient.evaluation.Submission` object\n\n See:\n\n - [synapseclient.Synapse.get][] for information\n on the *downloadFile*, *downloadLocation*, and *ifcollision* parameters\n \"\"\"\n\n submission_id = id_of(id)\n uri = Submission.getURI(submission_id)\n submission = Submission(**self.restGET(uri))\n\n # Pre-fetch the Entity tied to the Submission, if there is one\n if \"entityId\" in submission and submission[\"entityId\"] is not None:\n entityBundleJSON = json.loads(submission[\"entityBundleJSON\"])\n\n # getWithEntityBundle expects a bundle services v2 style\n # annotations dict, but the evaluations API may return\n # an older format annotations object in the encoded JSON\n # depending on when the original submission was made.\n annotations = entityBundleJSON.get(\"annotations\")\n if annotations:\n entityBundleJSON[\"annotations\"] = convert_old_annotation_json(\n annotations\n )\n\n related = self._getWithEntityBundle(\n entityBundle=entityBundleJSON,\n entity=submission[\"entityId\"],\n submission=submission_id,\n **kwargs,\n )\n submission.entity = related\n submission.filePath = related.get(\"path\", None)\n\n return submission\n
"},{"location":"reference/client/#synapseclient.Synapse.getSubmissions","title":"getSubmissions(evaluation, status=None, myOwn=False, limit=20, offset=0)
","text":"PARAMETER DESCRIPTION evaluation
Evaluation to get submissions from.
status
Optionally filter submissions for a specific status. One of:
OPEN
CLOSED
SCORED
INVALID
VALIDATED
EVALUATION_IN_PROGRESS
RECEIVED
REJECTED
ACCEPTED
DEFAULT: None
myOwn
Determines if only your Submissions should be fetched. Defaults to False (all Submissions)
DEFAULT: False
limit
Limits the number of submissions in a single response. Because this method returns a generator and repeatedly fetches submissions, this argument limits the size of a single request and NOT the total number of submissions returned.
DEFAULT: 20
offset
Start iterating at a submission offset from the first submission.
DEFAULT: 0
A generator over synapseclient.evaluation.Submission objects for an Evaluation
Using this functionPrint submissions
for submission in syn.getSubmissions(1234567):\n print(submission['entityId'])\n
See:
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getSubmissions\")\ndef getSubmissions(self, evaluation, status=None, myOwn=False, limit=20, offset=0):\n \"\"\"\n Arguments:\n evaluation: Evaluation to get submissions from.\n status: Optionally filter submissions for a specific status.\n One of:\n\n - `OPEN`\n - `CLOSED`\n - `SCORED`\n - `INVALID`\n - `VALIDATED`\n - `EVALUATION_IN_PROGRESS`\n - `RECEIVED`\n - `REJECTED`\n - `ACCEPTED`\n myOwn: Determines if only your Submissions should be fetched.\n Defaults to False (all Submissions)\n limit: Limits the number of submissions in a single response.\n Because this method returns a generator and repeatedly\n fetches submissions, this argument is limiting the\n size of a single request and NOT the number of sub-\n missions returned in total.\n offset: Start iterating at a submission offset from the first submission.\n\n Yields:\n A generator over [synapseclient.evaluation.Submission][] objects for an Evaluation\n\n Example: Using this function\n Print submissions\n\n for submission in syn.getSubmissions(1234567):\n print(submission['entityId'])\n\n See:\n\n - [synapseclient.evaluation][]\n \"\"\"\n\n evaluation_id = id_of(evaluation)\n uri = \"/evaluation/%s/submission%s\" % (evaluation_id, \"\" if myOwn else \"/all\")\n\n if status is not None:\n uri += \"?status=%s\" % status\n\n for result in self._GET_paginated(uri, limit=limit, offset=offset):\n yield Submission(**result)\n
"},{"location":"reference/client/#synapseclient.Synapse.getSubmissionBundles","title":"getSubmissionBundles(evaluation, status=None, myOwn=False, limit=20, offset=0)
","text":"Retrieve submission bundles (submission and submissions status) for an evaluation queue, optionally filtered by submission status and/or owner.
PARAMETER DESCRIPTIONevaluation
Evaluation to get submissions from.
status
Optionally filter submissions for a specific status. One of:
OPEN
CLOSED
SCORED
INVALID
DEFAULT: None
myOwn
Determines if only your Submissions should be fetched. Defaults to False (all Submissions)
DEFAULT: False
limit
Limits the number of submissions coming back from the service in a single response.
DEFAULT: 20
offset
Start iterating at a submission offset from the first submission.
DEFAULT: 0
A generator over tuples containing a synapseclient.evaluation.Submission and a synapseclient.evaluation.SubmissionStatus.
Using this functionLoop over submissions
for submission, status in syn.getSubmissionBundles(evaluation):\n print(submission.name, \\\n submission.submitterAlias, \\\n status.status, \\\n status.score)\n
This may later be changed to return objects, pending some thought on how submissions along with related status and annotations should be represented in the clients.
See:
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getSubmissionBundles\")\ndef getSubmissionBundles(\n self, evaluation, status=None, myOwn=False, limit=20, offset=0\n):\n \"\"\"\n Retrieve submission bundles (submission and submissions status) for an evaluation queue, optionally filtered by\n submission status and/or owner.\n\n Arguments:\n evaluation: Evaluation to get submissions from.\n status: Optionally filter submissions for a specific status.\n One of:\n\n - `OPEN`\n - `CLOSED`\n - `SCORED`\n - `INVALID`\n myOwn: Determines if only your Submissions should be fetched.\n Defaults to False (all Submissions)\n limit: Limits the number of submissions coming back from the\n service in a single response.\n offset: Start iterating at a submission offset from the first submission.\n\n Yields:\n A generator over tuples containing a [synapseclient.evaluation.Submission][] and a [synapseclient.evaluation.SubmissionStatus][].\n\n Example: Using this function\n Loop over submissions\n\n for submission, status in syn.getSubmissionBundles(evaluation):\n print(submission.name, \\\\\n submission.submitterAlias, \\\\\n status.status, \\\\\n status.score)\n\n This may later be changed to return objects, pending some thought on how submissions along with related status\n and annotations should be represented in the clients.\n\n See:\n\n - [synapseclient.evaluation][]\n \"\"\"\n for bundle in self._getSubmissionBundles(\n evaluation, status=status, myOwn=myOwn, limit=limit, offset=offset\n ):\n yield (\n Submission(**bundle[\"submission\"]),\n SubmissionStatus(**bundle[\"submissionStatus\"]),\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.getSubmissionStatus","title":"getSubmissionStatus(submission)
","text":"Downloads the status of a Submission.
PARAMETER DESCRIPTIONsubmission
The submission to lookup
RETURNS DESCRIPTION
A synapseclient.evaluation.SubmissionStatus object
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getSubmissionStatus\")\ndef getSubmissionStatus(self, submission):\n \"\"\"\n Downloads the status of a Submission.\n\n Arguments:\n submission: The submission to lookup\n\n Returns:\n A [synapseclient.evaluation.SubmissionStatus][] object\n \"\"\"\n\n submission_id = id_of(submission)\n uri = SubmissionStatus.getURI(submission_id)\n val = self.restGET(uri)\n return SubmissionStatus(**val)\n
"},{"location":"reference/client/#synapseclient.Synapse.getWiki","title":"getWiki(owner, subpageId=None, version=None)
","text":"Get a synapseclient.wiki.Wiki object from Synapse. Uses wiki2 API which supports versioning.
PARAMETER DESCRIPTIONowner
The entity to which the Wiki is attached
subpageId
The id of the specific sub-page or None to get the root Wiki page
DEFAULT: None
version
The version of the page to retrieve or None to retrieve the latest
DEFAULT: None
A synapseclient.wiki.Wiki object
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getWiki\")\ndef getWiki(self, owner, subpageId=None, version=None):\n \"\"\"\n Get a [synapseclient.wiki.Wiki][] object from Synapse. Uses wiki2 API which supports versioning.\n\n Arguments:\n owner: The entity to which the Wiki is attached\n subpageId: The id of the specific sub-page or None to get the root Wiki page\n version: The version of the page to retrieve or None to retrieve the latest\n\n Returns:\n A [synapseclient.wiki.Wiki][] object\n \"\"\"\n uri = \"/entity/{ownerId}/wiki2\".format(ownerId=id_of(owner))\n if subpageId is not None:\n uri += \"/{wikiId}\".format(wikiId=subpageId)\n if version is not None:\n uri += \"?wikiVersion={version}\".format(version=version)\n\n wiki = self.restGET(uri)\n wiki[\"owner\"] = owner\n wiki = Wiki(**wiki)\n\n path = self.cache.get(wiki.markdownFileHandleId)\n if not path:\n cache_dir = self.cache.get_cache_dir(wiki.markdownFileHandleId)\n if not os.path.exists(cache_dir):\n os.makedirs(cache_dir)\n path = self._downloadFileHandle(\n wiki[\"markdownFileHandleId\"],\n wiki[\"id\"],\n \"WikiMarkdown\",\n os.path.join(cache_dir, str(wiki.markdownFileHandleId) + \".md\"),\n )\n try:\n import gzip\n\n with gzip.open(path) as f:\n markdown = f.read().decode(\"utf-8\")\n except IOError:\n with open(path) as f:\n markdown = f.read().decode(\"utf-8\")\n\n wiki.markdown = markdown\n wiki.markdown_path = path\n\n return wiki\n
"},{"location":"reference/client/#synapseclient.Synapse.getWikiAttachments","title":"getWikiAttachments(wiki)
","text":"Retrieve the attachments to a wiki page.
PARAMETER DESCRIPTIONwiki
The Wiki object for which the attachments are to be returned.
RETURNS DESCRIPTION
A list of file handles for the files attached to the Wiki.
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getWikiAttachments\")\ndef getWikiAttachments(self, wiki):\n \"\"\"\n Retrieve the attachments to a wiki page.\n\n Arguments:\n wiki: The Wiki object for which the attachments are to be returned.\n\n Returns:\n A list of file handles for the files attached to the Wiki.\n \"\"\"\n uri = \"/entity/%s/wiki/%s/attachmenthandles\" % (wiki.ownerId, wiki.id)\n results = self.restGET(uri)\n file_handles = list(WikiAttachment(**fh) for fh in results[\"list\"])\n return file_handles\n
"},{"location":"reference/client/#synapseclient.Synapse.getWikiHeaders","title":"getWikiHeaders(owner)
","text":"Retrieves the headers of all Wikis belonging to the owner (the entity to which the Wiki is attached).
PARAMETER DESCRIPTIONowner
An Entity
RETURNS DESCRIPTION
A list of Objects with three fields: id, title and parentId.
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getWikiHeaders\")\ndef getWikiHeaders(self, owner):\n \"\"\"\n Retrieves the headers of all Wikis belonging to the owner (the entity to which the Wiki is attached).\n\n Arguments:\n owner: An Entity\n\n Returns:\n A list of Objects with three fields: id, title and parentId.\n \"\"\"\n\n uri = \"/entity/%s/wikiheadertree\" % id_of(owner)\n return [DictObject(**header) for header in self._GET_paginated(uri)]\n
"},{"location":"reference/client/#synapseclient.Synapse.get_download_list","title":"get_download_list(downloadLocation=None)
","text":"Download all files from your Synapse download list
PARAMETER DESCRIPTIONdownloadLocation
Directory to download files to.
TYPE: str
DEFAULT: None
str
Manifest file with file paths
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_download_list\")\ndef get_download_list(self, downloadLocation: str = None) -> str:\n \"\"\"Download all files from your Synapse download list\n\n Arguments:\n downloadLocation: Directory to download files to.\n\n Returns:\n Manifest file with file paths\n \"\"\"\n dl_list_path = self.get_download_list_manifest()\n downloaded_files = []\n new_manifest_path = f\"manifest_{time.time_ns()}.csv\"\n with open(dl_list_path) as manifest_f, open(\n new_manifest_path, \"w\"\n ) as write_obj:\n reader = csv.DictReader(manifest_f)\n columns = reader.fieldnames\n columns.extend([\"path\", \"error\"])\n # Write the downloaded paths to a new manifest file\n writer = csv.DictWriter(write_obj, fieldnames=columns)\n writer.writeheader()\n\n for row in reader:\n # You can add things to the download list that you don't have access to\n # So there must be a try catch here\n try:\n entity = self.get(row[\"ID\"], downloadLocation=downloadLocation)\n # Must include version number because you can have multiple versions of a\n # file in the download list\n downloaded_files.append(\n {\n \"fileEntityId\": row[\"ID\"],\n \"versionNumber\": row[\"versionNumber\"],\n }\n )\n row[\"path\"] = entity.path\n row[\"error\"] = \"\"\n except Exception:\n row[\"path\"] = \"\"\n row[\"error\"] = \"DOWNLOAD FAILED\"\n self.logger.error(\"Unable to download file\")\n writer.writerow(row)\n\n # Don't want to clear all the download list because you can add things\n # to the download list after initiating this command.\n # Files that failed to download should not be removed from download list\n # Remove all files from download list after the entire download is complete.\n # This is because if download fails midway, we want to return the full manifest\n if downloaded_files:\n # Only want to invoke this if there is a list of files to remove\n # or the API call will error\n self.remove_from_download_list(list_of_files=downloaded_files)\n else:\n self.logger.warning(\"A manifest was created, but no files were downloaded\")\n\n # Always remove original manifest file\n os.remove(dl_list_path)\n\n return new_manifest_path\n
"},{"location":"reference/client/#synapseclient.Synapse.get_download_list_manifest","title":"get_download_list_manifest()
","text":"Get the path of the download list manifest file
RETURNS DESCRIPTIONPath of download list manifest file
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_download_list_manifest\")\ndef get_download_list_manifest(self):\n \"\"\"Get the path of the download list manifest file\n\n Returns:\n Path of download list manifest file\n \"\"\"\n manifest = self._generate_manifest_from_download_list()\n # Get file handle download link\n file_result = self._getFileHandleDownload(\n fileHandleId=manifest[\"resultFileHandleId\"],\n objectId=manifest[\"resultFileHandleId\"],\n objectType=\"FileEntity\",\n )\n # Download the manifest\n downloaded_path = self._download_from_URL(\n url=file_result[\"preSignedURL\"],\n destination=\"./\",\n fileHandleId=file_result[\"fileHandleId\"],\n expected_md5=file_result[\"fileHandle\"].get(\"contentMd5\"),\n )\n trace.get_current_span().set_attributes(\n {\"synapse.file_handle_id\": file_result[\"fileHandleId\"]}\n )\n return downloaded_path\n
"},{"location":"reference/client/#synapseclient.Synapse.remove_from_download_list","title":"remove_from_download_list(list_of_files)
","text":"Remove a batch of files from download list
PARAMETER DESCRIPTIONlist_of_files
Array of files in the format of a mapping {fileEntityId: synid, versionNumber: version}
TYPE: List[Dict]
int
Number of files removed from download list
Source code insynapseclient/client.py
def remove_from_download_list(self, list_of_files: typing.List[typing.Dict]) -> int:\n \"\"\"Remove a batch of files from download list\n\n Arguments:\n list_of_files: Array of files in the format of a mapping {fileEntityId: synid, versionNumber: version}\n\n Returns:\n Number of files removed from download list\n \"\"\"\n request_body = {\"batchToRemove\": list_of_files}\n num_files_removed = self.restPOST(\n \"/download/list/remove\", body=json.dumps(request_body)\n )\n return num_files_removed\n
"},{"location":"reference/client/#synapseclient.Synapse.md5Query","title":"md5Query(md5)
","text":"Find the Entities which have attached file(s) which have the given MD5 hash.
PARAMETER DESCRIPTIONmd5
The MD5 to query for (hexadecimal string)
RETURNS DESCRIPTION
A list of Entity headers
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::md5Query\")\ndef md5Query(self, md5):\n \"\"\"\n Find the Entities which have attached file(s) which have the given MD5 hash.\n\n Arguments:\n md5: The MD5 to query for (hexadecimal string)\n\n Returns:\n A list of Entity headers\n \"\"\"\n\n return self.restGET(\"/entity/md5/%s\" % md5)[\"results\"]\n
"},{"location":"reference/client/#synapseclient.Synapse.sendMessage","title":"sendMessage(userIds, messageSubject, messageBody, contentType='text/plain')
","text":"send a message via Synapse.
PARAMETER DESCRIPTIONuserIds
A list of user IDs to which the message is to be sent
messageSubject
The subject for the message
messageBody
The body of the message
contentType
optional contentType of message body (default=\"text/plain\"). Should be one of \"text/plain\" or \"text/html\"
DEFAULT: 'text/plain'
The metadata of the created message
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::sendMessage\")\ndef sendMessage(\n self, userIds, messageSubject, messageBody, contentType=\"text/plain\"\n):\n \"\"\"\n send a message via Synapse.\n\n Arguments:\n userIds: A list of user IDs to which the message is to be sent\n messageSubject: The subject for the message\n messageBody: The body of the message\n contentType: optional contentType of message body (default=\"text/plain\")\n Should be one of \"text/plain\" or \"text/html\"\n\n Returns:\n The metadata of the created message\n \"\"\"\n\n fileHandleId = multipart_upload_string(\n self, messageBody, content_type=contentType\n )\n message = dict(\n recipients=userIds, subject=messageSubject, fileHandleId=fileHandleId\n )\n return self.restPOST(uri=\"/message\", body=json.dumps(message))\n
"},{"location":"reference/client/#synapseclient.Synapse.uploadFileHandle","title":"uploadFileHandle(path, parent, synapseStore=True, mimetype=None, md5=None, file_size=None)
","text":"Uploads the file in the provided path (if necessary) to a storage location based on project settings. Returns a new FileHandle as a dict to represent the stored file.
PARAMETER DESCRIPTIONparent
Parent of the entity to which we upload.
path
File path to the file being uploaded
synapseStore
If False, will not upload the file, but instead create an ExternalFileHandle that references the file on the local machine. If True, will upload the file based on StorageLocation determined by the entity_parent_id
DEFAULT: True
mimetype
The MIME type metadata for the uploaded file
DEFAULT: None
md5
The MD5 checksum for the file, if known. Otherwise if the file is a local file, it will be calculated automatically.
DEFAULT: None
file_size
The size of the file, if known. Otherwise, if the file is a local file, it will be calculated automatically.
DEFAULT: None
file_type
The MIME type of the file, if known. Otherwise, if the file is a local file, it will be calculated automatically.
RETURNS DESCRIPTION
A dict representing the new FileHandle for the uploaded file
Source code insynapseclient/client.py
def uploadFileHandle(\n self, path, parent, synapseStore=True, mimetype=None, md5=None, file_size=None\n):\n \"\"\"Uploads the file in the provided path (if necessary) to a storage location based on project settings.\n Returns a new FileHandle as a dict to represent the stored file.\n\n Arguments:\n parent: Parent of the entity to which we upload.\n path: File path to the file being uploaded\n synapseStore: If False, will not upload the file, but instead create an ExternalFileHandle that references\n the file on the local machine.\n If True, will upload the file based on StorageLocation determined by the entity_parent_id\n mimetype: The MIME type metadata for the uploaded file\n md5: The MD5 checksum for the file, if known. Otherwise if the file is a local file, it will be calculated\n automatically.\n file_size: The size the file, if known. Otherwise if the file is a local file, it will be calculated\n automatically.\n file_type: The MIME type the file, if known. Otherwise if the file is a local file, it will be calculated\n automatically.\n\n Returns:\n A dict of a new FileHandle as a dict that represents the uploaded file\n \"\"\"\n return upload_file_handle(\n self, parent, path, synapseStore, md5, file_size, mimetype\n )\n
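A minimal sketch (the local path and parent ID are placeholders):
file_handle = syn.uploadFileHandle(\n    path=\"/path/to/data.csv\",  # placeholder local file\n    parent=\"syn123\",  # placeholder parent project or folder\n)\nprint(file_handle[\"id\"])\n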
"},{"location":"reference/client/#synapseclient.Synapse.restGET","title":"restGET(uri, endpoint=None, headers=None, retryPolicy={}, requests_session=None, **kwargs)
","text":"Sends an HTTP GET request to the Synapse server.
PARAMETER DESCRIPTIONuri
URI on which the GET is performed
endpoint
Server endpoint, defaults to self.repoEndpoint
DEFAULT: None
headers
Dictionary of headers to use rather than the API-key-signed default set of headers
DEFAULT: None
requests_session
An external requests.Session object to use when making this specific call
DEFAULT: None
kwargs
Any other arguments taken by a request method
DEFAULT: {}
JSON encoding of response
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::restGET\")\ndef restGET(\n self,\n uri,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n):\n \"\"\"\n Sends an HTTP GET request to the Synapse server.\n\n Arguments:\n uri: URI on which get is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: An external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns:\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"get\", uri, None, endpoint, headers, retryPolicy, requests_session, **kwargs\n )\n return self._return_rest_body(response)\n
"},{"location":"reference/client/#synapseclient.Synapse.restPOST","title":"restPOST(uri, body, endpoint=None, headers=None, retryPolicy={}, requests_session=None, **kwargs)
","text":"Sends an HTTP POST request to the Synapse server.
PARAMETER DESCRIPTIONuri
URI on which the POST is performed
endpoint
Server endpoint, defaults to self.repoEndpoint
DEFAULT: None
body
The payload to be delivered
headers
Dictionary of headers to use rather than the API-key-signed default set of headers
DEFAULT: None
requests_session
An external requests.Session object to use when making this specific call
DEFAULT: None
kwargs
Any other arguments taken by a request method
DEFAULT: {}
JSON encoding of response
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::restPOST\")\ndef restPOST(\n self,\n uri,\n body,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n):\n \"\"\"\n Sends an HTTP POST request to the Synapse server.\n\n Arguments:\n uri: URI on which get is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n body: The payload to be delivered\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: an external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns:\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"post\",\n uri,\n body,\n endpoint,\n headers,\n retryPolicy,\n requests_session,\n **kwargs,\n )\n return self._return_rest_body(response)\n
"},{"location":"reference/client/#synapseclient.Synapse.restPUT","title":"restPUT(uri, body=None, endpoint=None, headers=None, retryPolicy={}, requests_session=None, **kwargs)
","text":"Sends an HTTP PUT request to the Synapse server.
PARAMETER DESCRIPTIONuri
URI on which the PUT is performed
endpoint
Server endpoint, defaults to self.repoEndpoint
DEFAULT: None
body
The payload to be delivered
DEFAULT: None
headers
Dictionary of headers to use rather than the API-key-signed default set of headers
DEFAULT: None
requests_session
An external requests.Session object to use when making this specific call
DEFAULT: None
kwargs
Any other arguments taken by a request method
DEFAULT: {}
JSON encoding of response
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::restPUT\")\ndef restPUT(\n self,\n uri,\n body=None,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n):\n \"\"\"\n Sends an HTTP PUT request to the Synapse server.\n\n Arguments:\n uri: URI on which get is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n body: The payload to be delivered\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: Sn external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"put\", uri, body, endpoint, headers, retryPolicy, requests_session, **kwargs\n )\n return self._return_rest_body(response)\n
"},{"location":"reference/client/#synapseclient.Synapse.restDELETE","title":"restDELETE(uri, endpoint=None, headers=None, retryPolicy={}, requests_session=None, **kwargs)
","text":"Sends an HTTP DELETE request to the Synapse server.
PARAMETER DESCRIPTIONuri
URI of resource to be deleted
endpoint
Server endpoint, defaults to self.repoEndpoint
DEFAULT: None
headers
Dictionary of headers to use rather than the API-key-signed default set of headers
DEFAULT: None
requests_session
An external requests.Session object to use when making this specific call
DEFAULT: None
kwargs
Any other arguments taken by a request method
DEFAULT: {}
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::restDELETE\")\ndef restDELETE(\n self,\n uri,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n):\n \"\"\"\n Sends an HTTP DELETE request to the Synapse server.\n\n Arguments:\n uri: URI of resource to be deleted\n endpoint: Server endpoint, defaults to self.repoEndpoint\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: An external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n self._rest_call(\n \"delete\",\n uri,\n None,\n endpoint,\n headers,\n retryPolicy,\n requests_session,\n **kwargs,\n )\n
"},{"location":"reference/client/#more-information","title":"More information","text":"See also the Synapse API documentation
"},{"location":"reference/core/","title":"Core","text":"This section is for super users / developers only. These functions are subject to change as they are internal development functions. Use at your own risk.
"},{"location":"reference/core/#multipart-upload","title":"Multipart Upload","text":""},{"location":"reference/core/#synapseclient.core.upload.multipart_upload","title":"synapseclient.core.upload.multipart_upload
","text":"Implements the client side of Synapse multipart upload
_, which provides a robust means of uploading large files (into the 10s of GB). End users should not need to call any of these functions directly.
.. _Synapse multipart upload: https://rest-docs.synapse.org/rest/index.html#org.sagebionetworks.file.controller.UploadController
"},{"location":"reference/core/#synapseclient.core.upload.multipart_upload-classes","title":"Classes","text":""},{"location":"reference/core/#synapseclient.core.upload.multipart_upload.UploadAttempt","title":"UploadAttempt
","text":"Source code in synapseclient/core/upload/multipart_upload.py
class UploadAttempt:\n def __init__(\n self,\n syn,\n dest_file_name,\n upload_request_payload,\n part_request_body_provider_fn,\n md5_fn,\n max_threads: int,\n force_restart: bool,\n ):\n self._syn = syn\n self._dest_file_name = dest_file_name\n self._part_size = upload_request_payload[\"partSizeBytes\"]\n\n self._upload_request_payload = upload_request_payload\n\n self._part_request_body_provider_fn = part_request_body_provider_fn\n self._md5_fn = md5_fn\n\n self._max_threads = max_threads\n self._force_restart = force_restart\n\n self._lock = threading.Lock()\n self._aborted = False\n\n # populated later\n self._upload_id: str = None\n self._pre_signed_part_urls: Mapping[int, str] = None\n\n @classmethod\n def _get_remaining_part_numbers(cls, upload_status):\n part_numbers = []\n parts_state = upload_status[\"partsState\"]\n\n # parts are 1-based\n for i, part_status in enumerate(parts_state, 1):\n if part_status == \"0\":\n part_numbers.append(i)\n\n return len(parts_state), part_numbers\n\n @classmethod\n def _get_thread_session(cls):\n # get a lazily initialized requests.Session from the thread.\n # we want to share a requests.Session over the course of a thread\n # to take advantage of persistent http connection. we put it on a\n # thread local rather that in the task closure since a connection can\n # be reused across separate part uploads so no reason to restrict it\n # per worker task.\n session = getattr(_thread_local, \"session\", None)\n if not session:\n session = _thread_local.session = requests.Session()\n return session\n\n def _is_copy(self):\n # is this a copy or upload request\n return (\n self._upload_request_payload.get(\"concreteType\")\n == concrete_types.MULTIPART_UPLOAD_COPY_REQUEST\n )\n\n def _create_synapse_upload(self):\n return self._syn.restPOST(\n \"/file/multipart?forceRestart={}\".format(str(self._force_restart).lower()),\n json.dumps(self._upload_request_payload),\n endpoint=self._syn.fileHandleEndpoint,\n )\n\n def _fetch_pre_signed_part_urls(\n self,\n upload_id: str,\n part_numbers: List[int],\n requests_session: requests.Session = None,\n ) -> Mapping[int, str]:\n uri = \"/file/multipart/{upload_id}/presigned/url/batch\".format(\n upload_id=upload_id\n )\n body = {\n \"uploadId\": upload_id,\n \"partNumbers\": part_numbers,\n }\n\n response = self._syn.restPOST(\n uri,\n json.dumps(body),\n requests_session=requests_session,\n endpoint=self._syn.fileHandleEndpoint,\n )\n\n part_urls = {}\n for part in response[\"partPresignedUrls\"]:\n part_urls[part[\"partNumber\"]] = (\n part[\"uploadPresignedUrl\"],\n part.get(\"signedHeaders\", {}),\n )\n\n return part_urls\n\n def _refresh_pre_signed_part_urls(\n self,\n part_number: int,\n expired_url: str,\n ):\n \"\"\"Refresh all unfetched presigned urls, and return the refreshed\n url for the given part number. If an existing expired_url is passed\n and the url for the given part has already changed that new url\n will be returned without a refresh (i.e. 
it is assumed that another\n thread has already refreshed the url since the passed url expired).\n\n :param part_number: the part number whose refreshed url should\n be returned\n :param expired_url: the url that was detected as expired triggering\n this refresh\n\n \"\"\"\n with self._lock:\n current_url = self._pre_signed_part_urls[part_number]\n if current_url != expired_url:\n # if the url has already changed since the given url\n # was detected as expired we can assume that another\n # thread already refreshed the url and can avoid the extra\n # fetch.\n refreshed_url = current_url\n else:\n self._pre_signed_part_urls = self._fetch_pre_signed_part_urls(\n self._upload_id,\n list(self._pre_signed_part_urls.keys()),\n )\n\n refreshed_url = self._pre_signed_part_urls[part_number]\n\n return refreshed_url\n\n def _handle_part(self, part_number, otel_context: typing.Union[Context, None]):\n if otel_context:\n context.attach(otel_context)\n with tracer.start_as_current_span(\"UploadAttempt::_handle_part\"):\n trace.get_current_span().set_attributes(\n {\"thread.id\": threading.get_ident()}\n )\n with self._lock:\n if self._aborted:\n # this upload attempt has already been aborted\n # so we short circuit the attempt to upload this part\n raise SynapseUploadAbortedException(\n \"Upload aborted, skipping part {}\".format(part_number)\n )\n\n part_url, signed_headers = self._pre_signed_part_urls.get(part_number)\n\n session = self._get_thread_session()\n\n # obtain the body (i.e. the upload bytes) for the given part number.\n body = (\n self._part_request_body_provider_fn(part_number)\n if self._part_request_body_provider_fn\n else None\n )\n part_size = len(body) if body else 0\n for retry in range(2):\n\n def put_fn():\n with tracer.start_as_current_span(\"UploadAttempt::put_part\"):\n return session.put(part_url, body, headers=signed_headers)\n\n try:\n # use our backoff mechanism here, we have encountered 500s on puts to AWS signed urls\n response = with_retry(\n put_fn, retry_exceptions=[requests.exceptions.ConnectionError]\n )\n _raise_for_status(response)\n\n # completed upload part to s3 successfully\n break\n\n except SynapseHTTPError as ex:\n if ex.response.status_code == 403 and retry < 1:\n # we interpret this to mean our pre_signed url expired.\n self._syn.logger.debug(\n \"The pre-signed upload URL for part {} has expired.\"\n \"Refreshing urls and retrying.\\n\".format(part_number)\n )\n\n # we refresh all the urls and obtain this part's\n # specific url for the retry\n with tracer.start_as_current_span(\n \"UploadAttempt::refresh_pre_signed_part_urls\"\n ):\n (\n part_url,\n signed_headers,\n ) = self._refresh_pre_signed_part_urls(\n part_number,\n part_url,\n )\n\n else:\n raise\n\n md5_hex = self._md5_fn(body, response)\n\n # now tell synapse that we uploaded that part successfully\n self._syn.restPUT(\n \"/file/multipart/{upload_id}/add/{part_number}?partMD5Hex={md5}\".format(\n upload_id=self._upload_id,\n part_number=part_number,\n md5=md5_hex,\n ),\n requests_session=session,\n endpoint=self._syn.fileHandleEndpoint,\n )\n\n # remove so future batch pre_signed url fetches will exclude this part\n with self._lock:\n del self._pre_signed_part_urls[part_number]\n\n return part_number, part_size\n\n @tracer.start_as_current_span(\"UploadAttempt::_upload_parts\")\n def _upload_parts(self, part_count, remaining_part_numbers):\n trace.get_current_span().set_attributes({\"thread.id\": threading.get_ident()})\n time_upload_started = time.time()\n completed_part_count = part_count - 
len(remaining_part_numbers)\n file_size = self._upload_request_payload.get(\"fileSizeBytes\")\n\n if not self._is_copy():\n # we won't have bytes to measure during a copy so the byte oriented progress bar is not useful\n progress = previously_transferred = min(\n completed_part_count * self._part_size,\n file_size,\n )\n\n self._syn._print_transfer_progress(\n progress,\n file_size,\n prefix=\"Uploading\",\n postfix=self._dest_file_name,\n previouslyTransferred=previously_transferred,\n )\n\n self._pre_signed_part_urls = self._fetch_pre_signed_part_urls(\n self._upload_id,\n remaining_part_numbers,\n )\n\n futures = []\n with _executor(self._max_threads, False) as executor:\n # we don't wait on the shutdown since we do so ourselves below\n\n for part_number in remaining_part_numbers:\n futures.append(\n executor.submit(\n self._handle_part,\n part_number,\n context.get_current(),\n )\n )\n\n for result in concurrent.futures.as_completed(futures):\n try:\n _, part_size = result.result()\n\n if part_size and not self._is_copy():\n progress += part_size\n self._syn._print_transfer_progress(\n min(progress, file_size),\n file_size,\n prefix=\"Uploading\",\n postfix=self._dest_file_name,\n dt=time.time() - time_upload_started,\n previouslyTransferred=previously_transferred,\n )\n except (Exception, KeyboardInterrupt) as cause:\n with self._lock:\n self._aborted = True\n\n # wait for all threads to complete before\n # raising the exception, we don't want to return\n # control while there are still threads from this\n # upload attempt running\n concurrent.futures.wait(futures)\n\n if isinstance(cause, KeyboardInterrupt):\n raise SynapseUploadAbortedException(\"User interrupted upload\")\n raise SynapseUploadFailedException(\"Part upload failed\") from cause\n\n @tracer.start_as_current_span(\"UploadAttempt::_complete_upload\")\n def _complete_upload(self):\n upload_status_response = self._syn.restPUT(\n \"/file/multipart/{upload_id}/complete\".format(\n upload_id=self._upload_id,\n ),\n requests_session=self._get_thread_session(),\n endpoint=self._syn.fileHandleEndpoint,\n )\n\n upload_state = upload_status_response.get(\"state\")\n if upload_state != \"COMPLETED\":\n # at this point we think successfully uploaded all the parts\n # but the upload status isn't complete, we'll throw an error\n # and let a subsequent attempt try to reconcile\n raise SynapseUploadFailedException(\n \"Upload status has an unexpected state {}\".format(upload_state)\n )\n\n return upload_status_response\n\n def __call__(self):\n upload_status_response = self._create_synapse_upload()\n upload_state = upload_status_response.get(\"state\")\n\n if upload_state != \"COMPLETED\":\n self._upload_id = upload_status_response[\"uploadId\"]\n part_count, remaining_part_numbers = self._get_remaining_part_numbers(\n upload_status_response\n )\n\n # if no remaining part numbers then all the parts have been\n # uploaded but the upload has not been marked complete.\n if remaining_part_numbers:\n self._upload_parts(part_count, remaining_part_numbers)\n upload_status_response = self._complete_upload()\n\n return upload_status_response\n
"},{"location":"reference/core/#synapseclient.core.upload.multipart_upload-functions","title":"Functions","text":""},{"location":"reference/core/#synapseclient.core.upload.multipart_upload.shared_executor","title":"shared_executor(executor)
","text":"An outside process that will eventually trigger an upload through the this module can configure a shared Executor by running its code within this context manager.
Source code insynapseclient/core/upload/multipart_upload.py
@contextmanager\ndef shared_executor(executor):\n    \"\"\"An outside process that will eventually trigger an upload through this module\n    can configure a shared Executor by running its code within this context manager.\"\"\"\n    _thread_local.executor = executor\n    try:\n        yield\n    finally:\n        del _thread_local.executor\n
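A usage sketch: run upload-triggering code inside the context manager so the module can draw worker threads from your executor instead of creating its own:
from concurrent.futures import ThreadPoolExecutor\nfrom synapseclient.core.upload.multipart_upload import shared_executor\n\nwith ThreadPoolExecutor(max_workers=8) as executor:\n    with shared_executor(executor):\n        pass  # uploads triggered here (e.g. via syn.store) can reuse this executor\n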
"},{"location":"reference/core/#synapseclient.core.upload.multipart_upload.multipart_upload_file","title":"multipart_upload_file(syn, file_path, dest_file_name=None, content_type=None, part_size=None, storage_location_id=None, preview=True, force_restart=False, max_threads=None)
","text":"Upload a file to a Synapse upload destination in chunks.
:param syn: a Synapse object
:param file_path: the file to upload
:param dest_file_name: upload as a different filename
:param content_type: contentType (https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.17)
:param part_size: Number of bytes per part. Minimum 5MB.
:param storage_location_id: an id indicating where the file should be stored. Retrieved from Synapse's UploadDestination
:param preview: True to generate a preview
:param force_restart: True to restart a previously initiated upload from scratch, False to try to resume
:param max_threads: number of concurrent threads to devote to upload
:returns: a File Handle ID
Keyword arguments are passed down to _multipart_upload and _start_multipart_upload.
synapseclient/core/upload/multipart_upload.py
@tracer.start_as_current_span(\"multipart_upload::multipart_upload_file\")\ndef multipart_upload_file(\n syn,\n file_path: str,\n dest_file_name: str = None,\n content_type: str = None,\n part_size: int = None,\n storage_location_id: str = None,\n preview: bool = True,\n force_restart: bool = False,\n max_threads: int = None,\n) -> str:\n \"\"\"\n Upload a file to a Synapse upload destination in chunks.\n\n :param syn: a Synapse object\n :param file_path: the file to upload\n :param dest_file_name: upload as a different filename\n :param content_type: `contentType`_\n :param part_size: Number of bytes per part. Minimum 5MB.\n :param storage_location_id: an id indicating where the file should be\n stored. Retrieved from Synapse's UploadDestination\n :param preview: True to generate a preview\n :param force_restart: True to restart a previously initiated upload\n from scratch, False to try to resume\n :param max_threads: number of concurrent threads to devote\n to upload\n\n :returns: a File Handle ID\n\n Keyword arguments are passed down to :py:func:`_multipart_upload` and :py:func:`_start_multipart_upload`.\n .. _contentType: https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.17\n\n \"\"\"\n\n trace.get_current_span().set_attributes(\n {\"synapse.storage_location_id\": storage_location_id}\n )\n\n if not os.path.exists(file_path):\n raise IOError('File \"{}\" not found.'.format(file_path))\n if os.path.isdir(file_path):\n raise IOError('File \"{}\" is a directory.'.format(file_path))\n\n file_size = os.path.getsize(file_path)\n if not dest_file_name:\n dest_file_name = os.path.basename(file_path)\n\n if content_type is None:\n mime_type, _ = mimetypes.guess_type(file_path, strict=False)\n content_type = mime_type or \"application/octet-stream\"\n\n callback_func = Spinner().print_tick if not syn.silent else None\n md5_hex = md5_for_file(file_path, callback=callback_func).hexdigest()\n\n part_size = _get_part_size(part_size, file_size)\n\n upload_request = {\n \"concreteType\": concrete_types.MULTIPART_UPLOAD_REQUEST,\n \"contentType\": content_type,\n \"contentMD5Hex\": md5_hex,\n \"fileName\": dest_file_name,\n \"fileSizeBytes\": file_size,\n \"generatePreview\": preview,\n \"partSizeBytes\": part_size,\n \"storageLocationId\": storage_location_id,\n }\n\n def part_fn(part_number):\n return _get_file_chunk(file_path, part_number, part_size)\n\n return _multipart_upload(\n syn,\n dest_file_name,\n upload_request,\n part_fn,\n md5_fn,\n force_restart=force_restart,\n max_threads=max_threads,\n )\n
"},{"location":"reference/core/#synapseclient.core.upload.multipart_upload.multipart_upload_string","title":"multipart_upload_string(syn, text, dest_file_name=None, part_size=None, content_type=None, storage_location_id=None, preview=True, force_restart=False, max_threads=None)
","text":"Upload a file to a Synapse upload destination in chunks.
:param syn: a Synapse object
:param text: a string to upload as a file.
:param dest_file_name: upload as a different filename
:param content_type: contentType (https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.17)
:param part_size: number of bytes per part. Minimum 5MB.
:param storage_location_id: an id indicating where the file should be stored. Retrieved from Synapse's UploadDestination
:param preview: True to generate a preview
:param force_restart: True to restart a previously initiated upload from scratch, False to try to resume
:param max_threads: number of concurrent threads to devote to upload
:returns: a File Handle ID
Keyword arguments are passed down to _multipart_upload and _start_multipart_upload.
Source code insynapseclient/core/upload/multipart_upload.py
@tracer.start_as_current_span(\"multipart_upload::multipart_upload_string\")\ndef multipart_upload_string(\n syn,\n text: str,\n dest_file_name: str = None,\n part_size: int = None,\n content_type: str = None,\n storage_location_id: str = None,\n preview: bool = True,\n force_restart: bool = False,\n max_threads: int = None,\n):\n \"\"\"\n Upload a file to a Synapse upload destination in chunks.\n\n :param syn: a Synapse object\n :param text: a string to upload as a file.\n :param dest_file_name: upload as a different filename\n :param content_type: `contentType`_\n :param part_size: number of bytes per part. Minimum 5MB.\n :param storage_location_id: an id indicating where the file should be\n stored. Retrieved from Synapse's UploadDestination\n :param preview: True to generate a preview\n :param force_restart: True to restart a previously initiated upload\n from scratch, False to try to resume\n :param max_threads: number of concurrent threads to devote\n to upload\n\n :returns: a File Handle ID\n\n Keyword arguments are passed down to\n :py:func:`_multipart_upload` and :py:func:`_start_multipart_upload`.\n\n .. _contentType:\n https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.17\n \"\"\"\n\n data = text.encode(\"utf-8\")\n file_size = len(data)\n md5_hex = md5_fn(data, None)\n\n if not dest_file_name:\n dest_file_name = \"message.txt\"\n\n if not content_type:\n content_type = \"text/plain; charset=utf-8\"\n\n part_size = _get_part_size(part_size, file_size)\n\n upload_request = {\n \"concreteType\": concrete_types.MULTIPART_UPLOAD_REQUEST,\n \"contentType\": content_type,\n \"contentMD5Hex\": md5_hex,\n \"fileName\": dest_file_name,\n \"fileSizeBytes\": file_size,\n \"generatePreview\": preview,\n \"partSizeBytes\": part_size,\n \"storageLocationId\": storage_location_id,\n }\n\n def part_fn(part_number):\n return _get_data_chunk(data, part_number, part_size)\n\n part_size = _get_part_size(part_size, file_size)\n return _multipart_upload(\n syn,\n dest_file_name,\n upload_request,\n part_fn,\n md5_fn,\n force_restart=force_restart,\n max_threads=max_threads,\n )\n
"},{"location":"reference/core/#synapseclient.core.upload.multipart_upload._multipart_upload","title":"synapseclient.core.upload.multipart_upload._multipart_upload(syn, dest_file_name, upload_request, part_fn, md5_fn, force_restart=False, max_threads=None)
","text":"Source code in synapseclient/core/upload/multipart_upload.py
def _multipart_upload(\n syn,\n dest_file_name,\n upload_request,\n part_fn,\n md5_fn,\n force_restart: bool = False,\n max_threads: int = None,\n):\n if max_threads is None:\n max_threads = pool_provider.DEFAULT_NUM_THREADS\n\n max_threads = max(max_threads, 1)\n\n retry = 0\n while True:\n try:\n upload_status_response = UploadAttempt(\n syn,\n dest_file_name,\n upload_request,\n part_fn,\n md5_fn,\n max_threads,\n # only force_restart the first time through (if requested).\n # a retry after a caught exception will not restart the upload\n # from scratch.\n force_restart and retry == 0,\n )()\n\n # success\n return upload_status_response[\"resultFileHandleId\"]\n\n except SynapseUploadFailedException:\n if retry < MAX_RETRIES:\n retry += 1\n else:\n raise\n
"},{"location":"reference/core/#utils","title":"Utils","text":""},{"location":"reference/core/#synapseclient.core.utils","title":"synapseclient.core.utils
","text":"Utility functions useful in the implementation and testing of the Synapse client.
"},{"location":"reference/core/#synapseclient.core.utils-classes","title":"Classes","text":""},{"location":"reference/core/#synapseclient.core.utils.threadsafe_iter","title":"threadsafe_iter
","text":"Takes an iterator/generator and makes it thread-safe by serializing call to the next
method of given iterator/generator. See: http://anandology.com/blog/using-iterators-and-generators/
synapseclient/core/utils.py
class threadsafe_iter:\n \"\"\"Takes an iterator/generator and makes it thread-safe by serializing call to the `next` method of given\n iterator/generator.\n See: http://anandology.com/blog/using-iterators-and-generators/\n \"\"\"\n\n def __init__(self, it):\n self.it = it\n self.lock = threading.Lock()\n\n def __iter__(self):\n return self\n\n def __next__(self):\n with self.lock:\n return next(self.it)\n
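A quick sketch showing a generator wrapped before being shared across worker threads:
import threading\nfrom synapseclient.core.utils import threadsafe_iter\n\nchunks = threadsafe_iter(iter(range(100)))\n\ndef worker():\n    for chunk in chunks:  # next() calls are serialized by the wrapper's lock\n        pass\n\nthreads = [threading.Thread(target=worker) for _ in range(4)]\nfor t in threads:\n    t.start()\nfor t in threads:\n    t.join()\n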
"},{"location":"reference/core/#synapseclient.core.utils.deprecated_keyword_param","title":"deprecated_keyword_param
","text":"A decorator to use to warn when a keyword parameter from a function has been deprecated and is intended for future removal. Will emit a warning such a keyword is passed.
Source code insynapseclient/core/utils.py
class deprecated_keyword_param:\n \"\"\"A decorator to use to warn when a keyword parameter from a function has been deprecated\n and is intended for future removal. Will emit a warning such a keyword is passed.\"\"\"\n\n def __init__(self, keywords, version, reason):\n self.keywords = set(keywords)\n self.version = version\n self.reason = reason\n\n def __call__(self, fn):\n def wrapper(*args, **kwargs):\n found = self.keywords.intersection(kwargs)\n if found:\n warnings.warn(\n \"Parameter(s) {} deprecated since version {}; {}\".format(\n sorted(list(found)), self.version, self.reason\n ),\n category=DeprecationWarning,\n stacklevel=2,\n )\n\n return fn(*args, **kwargs)\n\n return wrapper\n
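For example, applied to a function whose old keyword is being phased out (all names here are illustrative):
from synapseclient.core.utils import deprecated_keyword_param\n\n@deprecated_keyword_param([\"old_param\"], version=\"2.0\", reason=\"use new_param instead\")\ndef fetch(new_param=None, old_param=None):\n    return new_param or old_param\n\nfetch(old_param=\"x\")  # emits a DeprecationWarning\n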
"},{"location":"reference/core/#synapseclient.core.utils-functions","title":"Functions","text":""},{"location":"reference/core/#synapseclient.core.utils.md5_for_file","title":"md5_for_file(filename, block_size=2 * MB, callback=None)
","text":"Calculates the MD5 of the given file. See source <http://stackoverflow.com/questions/1131220/get-md5-hash-of-a-files-without-open-it-in-python>
_.
:param filename: The file to read in :param block_size: How much of the file to read in at once (bytes). Defaults to 2 MB :param callback: The callback function that help us show loading spinner on terminal. Defaults to None :returns: The MD5
Source code insynapseclient/core/utils.py
def md5_for_file(filename, block_size=2 * MB, callback=None):\n \"\"\"\n Calculates the MD5 of the given file.\n See `source <http://stackoverflow.com/questions/1131220/get-md5-hash-of-a-files-without-open-it-in-python>`_.\n\n :param filename: The file to read in\n :param block_size: How much of the file to read in at once (bytes).\n Defaults to 2 MB\n :param callback: The callback function that help us show loading spinner on terminal.\n Defaults to None\n :returns: The MD5\n \"\"\"\n\n md5 = hashlib.new(\"md5\", usedforsecurity=False)\n with open(filename, \"rb\") as f:\n while True:\n if callback:\n callback()\n data = f.read(block_size)\n if not data:\n break\n md5.update(data)\n return md5\n
"},{"location":"reference/core/#synapseclient.core.utils.md5_fn","title":"md5_fn(part, _)
","text":"Calculate the MD5 of a file-like object.
:part -- A file-like object to read from.
:returns: The MD5
Source code insynapseclient/core/utils.py
def md5_fn(part, _):\n \"\"\"Calculate the MD5 of a file-like object.\n\n :part -- A file-like object to read from.\n\n :returns: The MD5\n \"\"\"\n md5 = hashlib.new(\"md5\", usedforsecurity=False)\n md5.update(part)\n return md5.hexdigest()\n
"},{"location":"reference/core/#synapseclient.core.utils.download_file","title":"download_file(url, localFilepath=None)
","text":"Downloads a remote file.
:param localFilepath: May be None, in which case a temporary file is created
:returns: localFilepath
Source code insynapseclient/core/utils.py
def download_file(url, localFilepath=None):\n \"\"\"\n Downloads a remote file.\n\n :param localFilePath: May be None, in which case a temporary file is created\n\n :returns: localFilePath\n \"\"\"\n\n f = None\n try:\n if localFilepath:\n dir = os.path.dirname(localFilepath)\n if not os.path.exists(dir):\n os.makedirs(dir)\n f = open(localFilepath, \"wb\")\n else:\n f = tempfile.NamedTemporaryFile(delete=False)\n localFilepath = f.name\n\n r = requests.get(url, stream=True)\n toBeTransferred = float(r.headers[\"content-length\"])\n for nChunks, chunk in enumerate(r.iter_content(chunk_size=1024 * 10)):\n if chunk:\n f.write(chunk)\n printTransferProgress(nChunks * 1024 * 10, toBeTransferred)\n finally:\n if f:\n f.close()\n printTransferProgress(toBeTransferred, toBeTransferred)\n\n return localFilepath\n
"},{"location":"reference/core/#synapseclient.core.utils.extract_filename","title":"extract_filename(content_disposition_header, default_filename=None)
","text":"Extract a filename from an HTTP content-disposition header field.
See this memo (http://tools.ietf.org/html/rfc6266) and this package (http://pypi.python.org/pypi/rfc6266) for cryptic details.
synapseclient/core/utils.py
def extract_filename(content_disposition_header, default_filename=None):\n \"\"\"\n Extract a filename from an HTTP content-disposition header field.\n\n See `this memo <http://tools.ietf.org/html/rfc6266>`_ and `this package <http://pypi.python.org/pypi/rfc6266>`_\n for cryptic details.\n \"\"\"\n\n if not content_disposition_header:\n return default_filename\n value, params = cgi.parse_header(content_disposition_header)\n return params.get(\"filename\", default_filename)\n
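For example:
from synapseclient.core.utils import extract_filename\n\nheader = 'attachment; filename=\"data.csv\"'\nprint(extract_filename(header))  # data.csv\nprint(extract_filename(None, default_filename=\"fallback.txt\"))  # fallback.txt\n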
"},{"location":"reference/core/#synapseclient.core.utils.extract_user_name","title":"extract_user_name(profile)
","text":"Extract a displayable user name from a user's profile
Source code insynapseclient/core/utils.py
def extract_user_name(profile):\n \"\"\"\n Extract a displayable user name from a user's profile\n \"\"\"\n if \"userName\" in profile and profile[\"userName\"]:\n return profile[\"userName\"]\n elif \"displayName\" in profile and profile[\"displayName\"]:\n return profile[\"displayName\"]\n else:\n if (\n \"firstName\" in profile\n and profile[\"firstName\"]\n and \"lastName\" in profile\n and profile[\"lastName\"]\n ):\n return profile[\"firstName\"] + \" \" + profile[\"lastName\"]\n elif \"lastName\" in profile and profile[\"lastName\"]:\n return profile[\"lastName\"]\n elif \"firstName\" in profile and profile[\"firstName\"]:\n return profile[\"firstName\"]\n else:\n return str(profile.get(\"id\", \"Unknown-user\"))\n
"},{"location":"reference/core/#synapseclient.core.utils.id_of","title":"id_of(obj)
","text":"Try to figure out the Synapse ID of the given object.
:param obj: May be a string, Entity object, or dictionary
:returns: The ID or throws an exception
Source code insynapseclient/core/utils.py
def id_of(obj):\n \"\"\"\n Try to figure out the Synapse ID of the given object.\n\n :param obj: May be a string, Entity object, or dictionary\n\n :returns: The ID or throws an exception\n \"\"\"\n if isinstance(obj, str):\n return str(obj)\n if isinstance(obj, numbers.Number):\n return str(obj)\n\n id_attr_names = [\n \"id\",\n \"ownerId\",\n \"tableId\",\n ] # possible attribute names for a synapse Id\n for attribute_name in id_attr_names:\n syn_id = _get_from_members_items_or_properties(obj, attribute_name)\n if syn_id is not None:\n return str(syn_id)\n\n raise ValueError(\"Invalid parameters: couldn't find id of \" + str(obj))\n
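For example (syn123 is a placeholder ID):
from synapseclient.core.utils import id_of\n\nprint(id_of(\"syn123\"))  # syn123\nprint(id_of({\"id\": \"syn123\"}))  # syn123\n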
"},{"location":"reference/core/#synapseclient.core.utils.concrete_type_of","title":"concrete_type_of(obj)
","text":"Return the concrete type of an object representing a Synapse entity. This is meant to operate either against an actual Entity object, or the lighter weight dictionary returned by Synapse#getChildren, both of which are Mappings.
Source code insynapseclient/core/utils.py
def concrete_type_of(obj: collections.abc.Mapping):\n \"\"\"\n Return the concrete type of an object representing a Synapse entity.\n This is meant to operate either against an actual Entity object, or the lighter\n weight dictionary returned by Synapse#getChildren, both of which are Mappings.\n \"\"\"\n concrete_type = None\n if isinstance(obj, collections.abc.Mapping):\n for key in (\"concreteType\", \"type\"):\n concrete_type = obj.get(key)\n if concrete_type:\n break\n\n if not isinstance(concrete_type, str) or not concrete_type.startswith(\n \"org.sagebionetworks.repo.model\"\n ):\n raise ValueError(\"Unable to determine concreteType\")\n\n return concrete_type\n
"},{"location":"reference/core/#synapseclient.core.utils.is_in_path","title":"is_in_path(id, path)
","text":"Determines whether id is in the path as returned from /entity/{id}/path
:param id: synapse id string
:param path: object as returned from '/entity/{id}/path'
:returns: True or False
Source code insynapseclient/core/utils.py
def is_in_path(id, path):\n \"\"\"Determines whether id is in the path as returned from /entity/{id}/path\n\n :param id: synapse id string\n :param path: object as returned from '/entity/{id}/path'\n\n :returns: True or False\n \"\"\"\n return id in [item[\"id\"] for item in path[\"path\"]]\n
"},{"location":"reference/core/#synapseclient.core.utils.get_properties","title":"get_properties(entity)
","text":"Returns the dictionary of properties of the given Entity.
Source code insynapseclient/core/utils.py
def get_properties(entity):\n \"\"\"Returns the dictionary of properties of the given Entity.\"\"\"\n\n return entity.properties if hasattr(entity, \"properties\") else entity\n
"},{"location":"reference/core/#synapseclient.core.utils.is_url","title":"is_url(s)
","text":"Return True if the string appears to be a valid URL.
Source code insynapseclient/core/utils.py
def is_url(s):\n \"\"\"Return True if the string appears to be a valid URL.\"\"\"\n if isinstance(s, str):\n try:\n url_parts = urllib_parse.urlsplit(s)\n # looks like a Windows drive letter?\n if len(url_parts.scheme) == 1 and url_parts.scheme.isalpha():\n return False\n if url_parts.scheme == \"file\" and bool(url_parts.path):\n return True\n return bool(url_parts.scheme) and bool(url_parts.netloc)\n except Exception:\n return False\n return False\n
"},{"location":"reference/core/#synapseclient.core.utils.as_url","title":"as_url(s)
","text":"Tries to convert the input into a proper URL.
Source code insynapseclient/core/utils.py
def as_url(s):\n \"\"\"Tries to convert the input into a proper URL.\"\"\"\n url_parts = urllib_parse.urlsplit(s)\n # Windows drive letter?\n if len(url_parts.scheme) == 1 and url_parts.scheme.isalpha():\n return \"file:///%s\" % str(s).replace(\"\\\\\", \"/\")\n if url_parts.scheme:\n return url_parts.geturl()\n else:\n return \"file://%s\" % str(s)\n
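A few illustrative calls covering both is_url and as_url:
from synapseclient.core.utils import as_url, is_url\n\nprint(is_url(\"http://example.com/data.csv\"))  # True\nprint(is_url(\"/tmp/data.csv\"))  # False\nprint(as_url(\"/tmp/data.csv\"))  # file:///tmp/data.csv\n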
"},{"location":"reference/core/#synapseclient.core.utils.guess_file_name","title":"guess_file_name(string)
","text":"Tries to derive a filename from an arbitrary string.
Source code insynapseclient/core/utils.py
def guess_file_name(string):\n \"\"\"Tries to derive a filename from an arbitrary string.\"\"\"\n path = normalize_path(urllib_parse.urlparse(string).path)\n tokens = [x for x in path.split(\"/\") if x != \"\"]\n if len(tokens) > 0:\n return tokens[-1]\n\n # Try scrubbing the path of illegal characters\n if len(path) > 0:\n path = re.sub(r\"[^a-zA-Z0-9_.+() -]\", \"\", path)\n if len(path) > 0:\n return path\n raise ValueError(\"Could not derive a name from %s\" % string)\n
"},{"location":"reference/core/#synapseclient.core.utils.normalize_path","title":"normalize_path(path)
","text":"Transforms a path into an absolute path with forward slashes only.
Source code insynapseclient/core/utils.py
def normalize_path(path):\n \"\"\"Transforms a path into an absolute path with forward slashes only.\"\"\"\n if path is None:\n return None\n return re.sub(r\"\\\\\", \"/\", os.path.normcase(os.path.abspath(path)))\n
"},{"location":"reference/core/#synapseclient.core.utils.equal_paths","title":"equal_paths(path1, path2)
","text":"Compare file paths in a platform neutral way
Source code insynapseclient/core/utils.py
def equal_paths(path1, path2):\n \"\"\"\n Compare file paths in a platform neutral way\n \"\"\"\n return normalize_path(path1) == normalize_path(path2)\n
"},{"location":"reference/core/#synapseclient.core.utils.file_url_to_path","title":"file_url_to_path(url, verify_exists=False)
","text":"Convert a file URL to a path, handling some odd cases around Windows paths.
:param url: a file URL
:param verify_exists: If True, return the path only if the resulting file path exists on the local file system.
:returns: a path or None if the URL is not a file URL.
Source code insynapseclient/core/utils.py
def file_url_to_path(url, verify_exists=False):\n    \"\"\"\n    Convert a file URL to a path, handling some odd cases around Windows paths.\n\n    :param url: a file URL\n    :param verify_exists: If True, return the path only if the resulting file path exists on the local file\n        system.\n\n    :returns: a path or None if the URL is not a file URL.\n    \"\"\"\n    parts = urllib_parse.urlsplit(url)\n    if parts.scheme == \"file\" or parts.scheme == \"\":\n        path = parts.path\n        # A windows file URL, for example file:///c:/WINDOWS/asdf.txt\n        # will get back a path of: /c:/WINDOWS/asdf.txt, which we need to fix by\n        # lopping off the leading slash character. Apparently, the Python developers\n        # think this is not a bug: http://bugs.python.org/issue7965\n        if re.match(r\"\\/[A-Za-z]:\", path):\n            path = path[1:]\n        if os.path.exists(path) or not verify_exists:\n            return path\n    return None\n
"},{"location":"reference/core/#synapseclient.core.utils.is_same_base_url","title":"is_same_base_url(url1, url2)
","text":"Compares two urls to see if they are the same excluding up to the base path
:param url1: a URL
:param url2: a second URL
:returns: Boolean
Source code insynapseclient/core/utils.py
def is_same_base_url(url1, url2):\n \"\"\"Compares two urls to see if they are the same excluding up to the base path\n\n :param url1: a URL\n :param url2: a second URL\n\n :returns: Boolean\n \"\"\"\n url1 = urllib_parse.urlsplit(url1)\n url2 = urllib_parse.urlsplit(url2)\n return url1.scheme == url2.scheme and url1.hostname == url2.hostname\n
"},{"location":"reference/core/#synapseclient.core.utils.is_synapse_id_str","title":"is_synapse_id_str(obj)
","text":"If the input is a Synapse ID return it, otherwise return None
Source code insynapseclient/core/utils.py
def is_synapse_id_str(obj):\n \"\"\"If the input is a Synapse ID return it, otherwise return None\"\"\"\n if isinstance(obj, str):\n m = re.match(r\"(syn\\d+$)\", obj)\n if m:\n return m.group(1)\n return None\n
"},{"location":"reference/core/#synapseclient.core.utils.datetime_or_none","title":"datetime_or_none(datetime_str)
","text":"Attempts to convert a string to a datetime object. Returns None if it fails.
Some of the expected formats of datetime_str are:
- 2023-12-04T07:00:00Z
- 2001-01-01 15:00:00+07:00
- 2001-01-01 15:00:00-07:00
- 2023-12-04 07:00:00+00:00
- 2019-01-01
:param datetime_str: The string to convert to a datetime object
:return: The datetime object or None if the conversion fails
Source code insynapseclient/core/utils.py
def datetime_or_none(datetime_str: str) -> typing.Union[datetime.datetime, None]:\n \"\"\"Attempts to convert a string to a datetime object. Returns None if it fails.\n\n Some of the expected formats of datetime_str are:\n - 2023-12-04T07:00:00Z\n - 2001-01-01 15:00:00+07:00\n - 2001-01-01 15:00:00-07:00\n - 2023-12-04 07:00:00+00:00\n - 2019-01-01\n\n :param datetime_str: The string to convert to a datetime object\n :return: The datetime object or None if the conversion fails\n \"\"\"\n try:\n return datetime.datetime.fromisoformat(datetime_str.replace(\"Z\", \"+00:00\"))\n except Exception:\n return None\n
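For example:
from synapseclient.core.utils import datetime_or_none\n\nprint(datetime_or_none(\"2023-12-04T07:00:00Z\"))  # 2023-12-04 07:00:00+00:00\nprint(datetime_or_none(\"not a date\"))  # None\n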
"},{"location":"reference/core/#synapseclient.core.utils.is_date","title":"is_date(dt)
","text":"Objects of class datetime.date and datetime.datetime will be recognized as dates
Source code insynapseclient/core/utils.py
def is_date(dt):\n \"\"\"Objects of class datetime.date and datetime.datetime will be recognized as dates\"\"\"\n return isinstance(dt, datetime.date) or isinstance(dt, datetime.datetime)\n
"},{"location":"reference/core/#synapseclient.core.utils.to_list","title":"to_list(value)
","text":"Convert the value (an iterable or a scalar value) to a list.
Source code insynapseclient/core/utils.py
def to_list(value):\n    \"\"\"Convert the value (an iterable or a scalar value) to a list.\"\"\"\n    if isinstance(value, collections.abc.Iterable) and not isinstance(value, str):\n        values = []\n        for val in value:\n            possible_datetime = None\n            if isinstance(val, str):\n                # parse the element itself, not the enclosing iterable\n                possible_datetime = datetime_or_none(val)\n            values.append(val if possible_datetime is None else possible_datetime)\n        return values\n    else:\n        possible_datetime = None\n        if isinstance(value, str):\n            possible_datetime = datetime_or_none(value)\n        return [value if possible_datetime is None else possible_datetime]\n
"},{"location":"reference/core/#synapseclient.core.utils.make_bogus_data_file","title":"make_bogus_data_file(n=100, seed=None)
","text":"Makes a bogus data file for testing. It is the caller's responsibility to clean up the file when finished.
:param n: How many random floating point numbers to be written into the file, separated by commas
:param seed: Random seed for the random numbers
:returns: The name of the file
Source code insynapseclient/core/utils.py
def make_bogus_data_file(n=100, seed=None):\n \"\"\"\n Makes a bogus data file for testing. It is the caller's responsibility to clean up the file when finished.\n\n :param n: How many random floating point numbers to be written into the file, separated by commas\n :param seed: Random seed for the random numbers\n\n :returns: The name of the file\n \"\"\"\n\n if seed is not None:\n random.seed(seed)\n data = [random.gauss(mu=0.0, sigma=1.0) for i in range(n)]\n\n f = tempfile.NamedTemporaryFile(mode=\"w\", suffix=\".txt\", delete=False)\n try:\n f.write(\", \".join(str(n) for n in data))\n f.write(\"\\n\")\n finally:\n f.close()\n\n return normalize_path(f.name)\n
"},{"location":"reference/core/#synapseclient.core.utils.make_bogus_binary_file","title":"make_bogus_binary_file(n=1 * KB, filepath=None, printprogress=False)
","text":"Makes a bogus binary data file for testing. It is the caller's responsibility to clean up the file when finished.
:param n: How many bytes to write
:returns: The name of the file
Source code insynapseclient/core/utils.py
def make_bogus_binary_file(n=1 * KB, filepath=None, printprogress=False):\n \"\"\"\n Makes a bogus binary data file for testing. It is the caller's responsibility to clean up the file when finished.\n\n :param n: How many bytes to write\n\n :returns: The name of the file\n \"\"\"\n\n with open(filepath, \"wb\") if filepath else tempfile.NamedTemporaryFile(\n mode=\"wb\", suffix=\".dat\", delete=False\n ) as f:\n if not filepath:\n filepath = f.name\n progress = 0\n remaining = n\n while remaining > 0:\n buff_size = int(min(remaining, 1 * KB))\n f.write(os.urandom(buff_size))\n remaining -= buff_size\n if printprogress:\n progress += buff_size\n printTransferProgress(progress, n, \"Generated \", filepath)\n return normalize_path(filepath)\n
"},{"location":"reference/core/#synapseclient.core.utils.to_unix_epoch_time","title":"to_unix_epoch_time(dt)
","text":"Convert either datetime.date or datetime.datetime objects <http://docs.python.org/2/library/datetime.html>
_ to UNIX time.
synapseclient/core/utils.py
def to_unix_epoch_time(dt: typing.Union[datetime.date, datetime.datetime, str]) -> int:\n \"\"\"\n Convert either `datetime.date or datetime.datetime objects <http://docs.python.org/2/library/datetime.html>`_\n to UNIX time.\n \"\"\"\n if type(dt) == str:\n dt = datetime.datetime.fromisoformat(dt.replace(\"Z\", \"+00:00\"))\n if type(dt) == datetime.date:\n current_timezone = datetime.datetime.now().astimezone().tzinfo\n datetime_utc = datetime.datetime.combine(dt, datetime.time(0, 0, 0, 0)).replace(\n tzinfo=current_timezone\n )\n else:\n # If the datetime is not timezone aware, assume it is in the local timezone.\n # This is required in order for windows to work with the `astimezone` method.\n if dt.tzinfo is None:\n current_timezone = datetime.datetime.now().astimezone().tzinfo\n dt = dt.replace(tzinfo=current_timezone)\n datetime_utc = dt.astimezone(datetime.timezone.utc)\n return int((datetime_utc - UNIX_EPOCH).total_seconds() * 1000)\n
"},{"location":"reference/core/#synapseclient.core.utils.to_unix_epoch_time_secs","title":"to_unix_epoch_time_secs(dt)
","text":"Convert either datetime.date or datetime.datetime objects <http://docs.python.org/2/library/datetime.html>
_ to UNIX time.
synapseclient/core/utils.py
def to_unix_epoch_time_secs(\n dt: typing.Union[datetime.date, datetime.datetime]\n) -> float:\n \"\"\"\n Convert either `datetime.date or datetime.datetime objects <http://docs.python.org/2/library/datetime.html>`_\n to UNIX time.\n \"\"\"\n if type(dt) == datetime.date:\n current_timezone = datetime.datetime.now().astimezone().tzinfo\n datetime_utc = datetime.datetime.combine(dt, datetime.time(0, 0, 0, 0)).replace(\n tzinfo=current_timezone\n )\n else:\n # If the datetime is not timezone aware, assume it is in the local timezone.\n # This is required in order for windows to work with the `astimezone` method.\n if dt.tzinfo is None:\n current_timezone = datetime.datetime.now().astimezone().tzinfo\n dt = dt.replace(tzinfo=current_timezone)\n datetime_utc = dt.astimezone(datetime.timezone.utc)\n return (datetime_utc - UNIX_EPOCH).total_seconds()\n
"},{"location":"reference/core/#synapseclient.core.utils.from_unix_epoch_time_secs","title":"from_unix_epoch_time_secs(secs)
","text":"Returns a Datetime object given milliseconds since midnight Jan 1, 1970.
Source code insynapseclient/core/utils.py
def from_unix_epoch_time_secs(secs):\n    \"\"\"Returns a Datetime object given seconds since midnight Jan 1, 1970.\"\"\"\n    if isinstance(secs, str):\n        secs = float(secs)\n\n    # utcfromtimestamp() fails for negative values (dates before 1970-1-1) on Windows\n    # so, here's a hack that enables ancient events, such as Chris's birthday to be\n    # converted from seconds since the UNIX epoch to higher level Datetime objects. Ha!\n    if platform.system() == \"Windows\" and secs < 0:\n        mirror_date = datetime.datetime.utcfromtimestamp(abs(secs)).replace(\n            tzinfo=datetime.timezone.utc\n        )\n\n        result = (UNIX_EPOCH - (mirror_date - UNIX_EPOCH)).replace(\n            tzinfo=datetime.timezone.utc\n        )\n\n        return result\n    datetime_instance = datetime.datetime.utcfromtimestamp(secs).replace(\n        tzinfo=datetime.timezone.utc\n    )\n\n    return datetime_instance\n
"},{"location":"reference/core/#synapseclient.core.utils.from_unix_epoch_time","title":"from_unix_epoch_time(ms)
","text":"Returns a Datetime object given milliseconds since midnight Jan 1, 1970.
Source code insynapseclient/core/utils.py
def from_unix_epoch_time(ms) -> datetime.datetime:\n \"\"\"Returns a Datetime object given milliseconds since midnight Jan 1, 1970.\"\"\"\n\n if isinstance(ms, str):\n ms = float(ms)\n return from_unix_epoch_time_secs(ms / 1000.0)\n
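The two conversions round-trip, as this small sketch shows:
import datetime\nfrom synapseclient.core.utils import from_unix_epoch_time, to_unix_epoch_time\n\ndt = datetime.datetime(2023, 11, 27, tzinfo=datetime.timezone.utc)\nms = to_unix_epoch_time(dt)  # 1701043200000\nprint(from_unix_epoch_time(ms) == dt)  # True\n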
"},{"location":"reference/core/#synapseclient.core.utils.format_time_interval","title":"format_time_interval(seconds)
","text":"Format a time interval given in seconds to a readable value, e.g. \"5 minutes, 37 seconds\".
Source code insynapseclient/core/utils.py
def format_time_interval(seconds):\n \"\"\"Format a time interval given in seconds to a readable value, e.g. \\\"5 minutes, 37 seconds\\\".\"\"\"\n\n periods = (\n (\"year\", 60 * 60 * 24 * 365),\n (\"month\", 60 * 60 * 24 * 30),\n (\"day\", 60 * 60 * 24),\n (\"hour\", 60 * 60),\n (\"minute\", 60),\n (\"second\", 1),\n )\n\n result = []\n for period_name, period_seconds in periods:\n if seconds > period_seconds or period_name == \"second\":\n period_value, seconds = divmod(seconds, period_seconds)\n if period_value > 0 or period_name == \"second\":\n if period_value == 1:\n result.append(\"%d %s\" % (period_value, period_name))\n else:\n result.append(\"%d %ss\" % (period_value, period_name))\n return \", \".join(result)\n
"},{"location":"reference/core/#synapseclient.core.utils.itersubclasses","title":"itersubclasses(cls, _seen=None)
","text":"http://code.activestate.com/recipes/576949/ (r3)
itersubclasses(cls)
Generator over all subclasses of a given class, in depth first order.
>>> list(itersubclasses(int)) == [bool]
True
>>> class A(object): pass
>>> class B(A): pass
>>> class C(A): pass
>>> class D(B,C): pass
>>> class E(D): pass
>>> for cls in itersubclasses(A):
...     print(cls.__name__)
B
D
E
C
Source code insynapseclient/core/utils.py
def itersubclasses(cls, _seen=None):\n \"\"\"\n http://code.activestate.com/recipes/576949/ (r3)\n\n itersubclasses(cls)\n\n Generator over all subclasses of a given class, in depth first order.\n\n >>> list(itersubclasses(int)) == [bool]\n True\n >>> class A(object): pass\n >>> class B(A): pass\n >>> class C(A): pass\n >>> class D(B,C): pass\n >>> class E(D): pass\n >>>\n >>> for cls in itersubclasses(A):\n ... print(cls.__name__)\n B\n D\n E\n C\n >>> # get ALL (new-style) classes currently defined\n >>> [cls.__name__ for cls in itersubclasses(object)] #doctest: +ELLIPSIS\n ['type', ...'tuple', ...]\n \"\"\"\n\n if not isinstance(cls, type):\n raise TypeError(\n \"itersubclasses must be called with \" \"new-style classes, not %.100r\" % cls\n )\n if _seen is None:\n _seen = set()\n try:\n subs = cls.__subclasses__()\n except TypeError: # fails only when cls is type\n subs = cls.__subclasses__(cls)\n for sub in subs:\n if sub not in _seen:\n _seen.add(sub)\n yield sub\n for inner_sub in itersubclasses(sub, _seen):\n yield inner_sub\n
"},{"location":"reference/core/#synapseclient.core.utils.itersubclasses--get-all-new-style-classes-currently-defined","title":"get ALL (new-style) classes currently defined","text":"[cls.name for cls in itersubclasses(object)] #doctest: +ELLIPSIS ['type', ...'tuple', ...]
"},{"location":"reference/core/#synapseclient.core.utils.normalize_whitespace","title":"normalize_whitespace(s)
","text":"Strips the string and replace all whitespace sequences and other non-printable characters with a single space.
Source code insynapseclient/core/utils.py
def normalize_whitespace(s):\n \"\"\"\n Strips the string and replace all whitespace sequences and other non-printable characters with a single space.\n \"\"\"\n assert isinstance(s, str)\n return re.sub(r\"[\\x00-\\x20\\s]+\", \" \", s.strip())\n
"},{"location":"reference/core/#synapseclient.core.utils.query_limit_and_offset","title":"query_limit_and_offset(query, hard_limit=1000)
","text":"Extract limit and offset from the end of a query string.
:returns: A triple containing the query with limit and offset removed, the limit at most equal to the hard_limit, and the offset which defaults to 1
Source code insynapseclient/core/utils.py
def query_limit_and_offset(query, hard_limit=1000):\n \"\"\"\n Extract limit and offset from the end of a query string.\n\n :returns: A triple containing the query with limit and offset removed, the limit at most equal to the hard_limit,\n and the offset which\n defaults to 1\n \"\"\"\n # Regex a lower-case string to simplify matching\n tempQueryStr = query.lower()\n regex = r\"\\A(.*\\s)(offset|limit)\\s*(\\d*\\s*)\\Z\"\n\n # Continue to strip off and save the last limit/offset\n match = re.search(regex, tempQueryStr)\n options = {}\n while match is not None:\n options[match.group(2)] = int(match.group(3))\n tempQueryStr = match.group(1)\n match = re.search(regex, tempQueryStr)\n\n # Get a truncated version of the original query string (not in lower-case)\n query = query[: len(tempQueryStr)].strip()\n\n # Continue querying until the entire query has been fetched (or crash out)\n limit = min(options.get(\"limit\", hard_limit), hard_limit)\n offset = options.get(\"offset\", 1)\n\n return query, limit, offset\n
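For example:
from synapseclient.core.utils import query_limit_and_offset\n\nquery, limit, offset = query_limit_and_offset(\n    \"select * from syn123 limit 200 offset 5\"\n)\nprint(query)  # select * from syn123\nprint(limit, offset)  # 200 5\n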
"},{"location":"reference/core/#synapseclient.core.utils.extract_synapse_id_from_query","title":"extract_synapse_id_from_query(query)
","text":"An unfortunate hack to pull the synapse ID out of a table query of the form \"select column1, column2 from syn12345 where....\" needed to build URLs for table services.
Source code insynapseclient/core/utils.py
def extract_synapse_id_from_query(query):\n \"\"\"\n An unfortunate hack to pull the synapse ID out of a table query of the form \"select column1, column2 from syn12345\n where....\" needed to build URLs for table services.\n \"\"\"\n m = re.search(r\"from\\s+(syn\\d+)\", query, re.IGNORECASE)\n if m:\n return m.group(1)\n else:\n raise ValueError('Couldn\\'t extract synapse ID from query: \"%s\"' % query)\n
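A usage sketch (the query is illustrative; matching is case-insensitive):

from synapseclient.core.utils import extract_synapse_id_from_query

print(extract_synapse_id_from_query("SELECT foo, bar FROM syn12345 WHERE foo = 2"))
# -> 'syn12345'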
"},{"location":"reference/core/#synapseclient.core.utils.printTransferProgress","title":"printTransferProgress(transferred, toBeTransferred, prefix='', postfix='', isBytes=True, dt=None, previouslyTransferred=0)
","text":"Prints a progress bar
:param transferred: a number of items/bytes completed
:param toBeTransferred: total number of items/bytes when completed
:param prefix: string printed before the progress bar
:param postfix: string printed after the progress bar
:param isBytes: a boolean indicating whether to convert bytes to kB, MB, GB, etc.
:param dt: the time in seconds that has passed since the transfer started; used to calculate the rate
:param previouslyTransferred: the number of bytes that were already transferred before this transfer began (e.g. someone ctrl+c'd out of an upload and restarted it later)
Source code insynapseclient/core/utils.py
def printTransferProgress(\n transferred,\n toBeTransferred,\n prefix=\"\",\n postfix=\"\",\n isBytes=True,\n dt=None,\n previouslyTransferred=0,\n):\n \"\"\"Prints a progress bar\n\n :param transferred: a number of items/bytes completed\n :param toBeTransferred: total number of items/bytes when completed\n :param prefix: String printed before progress bar\n :param postfix: String printed after progress bar\n :param isBytes: A boolean indicating whether to convert bytes to kB, MB, GB etc.\n :param dt: The time in seconds that has passed since transfer started is used to calculate rate\n :param previouslyTransferred: the number of bytes that were already transferred before this transfer began\n (e.g. someone ctrl+c'd out of an upload and restarted it later)\n\n \"\"\"\n if not sys.stdout.isatty():\n return\n barLength = 20 # Modify this to change the length of the progress bar\n status = \"\"\n rate = \"\"\n if dt is not None and dt != 0:\n rate = (transferred - previouslyTransferred) / float(dt)\n rate = \"(%s/s)\" % humanizeBytes(rate) if isBytes else rate\n if toBeTransferred < 0:\n defaultToBeTransferred = barLength * 1 * MB\n if transferred > defaultToBeTransferred:\n progress = (\n float(transferred % defaultToBeTransferred) / defaultToBeTransferred\n )\n else:\n progress = float(transferred) / defaultToBeTransferred\n elif toBeTransferred == 0: # There is nothing to be transferred\n progress = 1\n status = \"Done...\\n\"\n else:\n progress = float(transferred) / toBeTransferred\n if progress >= 1:\n progress = 1\n status = \"Done...\\n\"\n block = int(round(barLength * progress))\n nbytes = humanizeBytes(transferred) if isBytes else transferred\n if toBeTransferred > 0:\n outOf = \"/%s\" % (humanizeBytes(toBeTransferred) if isBytes else toBeTransferred)\n percentage = \"%4.2f%%\" % (progress * 100)\n else:\n outOf = \"\"\n percentage = \"\"\n text = \"\\r%s [%s]%s %s%s %s %s %s \" % (\n prefix,\n \"#\" * block + \"-\" * (barLength - block),\n percentage,\n nbytes,\n outOf,\n rate,\n postfix,\n status,\n )\n sys.stdout.write(text)\n sys.stdout.flush()\n
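A sketch simulating a transfer loop (sizes and timings are illustrative; note that per the source above, nothing is printed unless stdout is a terminal):

import time
from synapseclient.core.utils import printTransferProgress

total = 1024 * 1024  # pretend we are moving 1 MB
start = time.time()
for transferred in range(0, total + 1, total // 4):
    # dt lets the function compute and display a transfer rate
    printTransferProgress(transferred, total, prefix="example.dat", dt=time.time() - start)
    time.sleep(0.25)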
"},{"location":"reference/core/#synapseclient.core.utils.touch","title":"touch(path, times=None)
","text":"Make sure a file exists. Update its access and modified times.
Source code insynapseclient/core/utils.py
def touch(path, times=None):\n \"\"\"\n Make sure a file exists. Update its access and modified times.\n \"\"\"\n basedir = os.path.dirname(path)\n if not os.path.exists(basedir):\n try:\n os.makedirs(basedir)\n except OSError as err:\n # alternate processes might be creating these at the same time\n if err.errno != errno.EEXIST:\n raise\n\n with open(path, \"a\"):\n os.utime(path, times)\n return path\n
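A usage sketch (the path is illustrative; intermediate directories are created as needed):

from synapseclient.core.utils import touch

path = touch("/tmp/synapse_example/placeholder.txt")
print(path)  # '/tmp/synapse_example/placeholder.txt', now guaranteed to exist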
"},{"location":"reference/core/#synapseclient.core.utils.is_json","title":"is_json(content_type)
","text":"detect if a content-type is JSON
Source code insynapseclient/core/utils.py
def is_json(content_type):\n \"\"\"detect if a content-type is JSON\"\"\"\n # The value of Content-Type defined here:\n # http://www.w3.org/Protocols/rfc2616/rfc2616-sec3.html#sec3.7\n return (\n content_type.lower().strip().startswith(\"application/json\")\n if content_type\n else False\n )\n
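A usage sketch:

from synapseclient.core.utils import is_json

print(is_json("application/json; charset=UTF-8"))  # True
print(is_json("text/plain"))                        # False
print(is_json(None))                                # False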
"},{"location":"reference/core/#synapseclient.core.utils.find_data_file_handle","title":"find_data_file_handle(bundle)
","text":"Return the fileHandle whose ID matches the dataFileHandleId in an entity bundle
Source code insynapseclient/core/utils.py
def find_data_file_handle(bundle):\n \"\"\"Return the fileHandle whose ID matches the dataFileHandleId in an entity bundle\"\"\"\n for fileHandle in bundle[\"fileHandles\"]:\n if fileHandle[\"id\"] == bundle[\"entity\"][\"dataFileHandleId\"]:\n return fileHandle\n return None\n
"},{"location":"reference/core/#synapseclient.core.utils.unique_filename","title":"unique_filename(path)
","text":"Returns a unique path by appending (n) for some number n to the end of the filename.
Source code insynapseclient/core/utils.py
def unique_filename(path):\n \"\"\"Returns a unique path by appending (n) for some number n to the end of the filename.\"\"\"\n\n base, ext = os.path.splitext(path)\n counter = 0\n while os.path.exists(path):\n counter += 1\n path = base + (\"(%d)\" % counter) + ext\n\n return path\n
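A usage sketch (paths are illustrative):

from synapseclient.core.utils import unique_filename

# If /tmp/downloads/data.csv and /tmp/downloads/data(1).csv already exist,
# this returns '/tmp/downloads/data(2).csv'.
print(unique_filename("/tmp/downloads/data.csv"))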
"},{"location":"reference/core/#synapseclient.core.utils.threadsafe_generator","title":"threadsafe_generator(f)
","text":"A decorator that takes a generator function and makes it thread-safe. See: http://anandology.com/blog/using-iterators-and-generators/
Source code insynapseclient/core/utils.py
def threadsafe_generator(f):\n \"\"\"A decorator that takes a generator function and makes it thread-safe.\n See: http://anandology.com/blog/using-iterators-and-generators/\n \"\"\"\n\n def g(*a, **kw):\n return threadsafe_iter(f(*a, **kw))\n\n return g\n
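A sketch of the decorator in use (the generator and the consumer threads are illustrative; the companion threadsafe_iter wrapper referenced in the source serializes calls to next(), so concurrent consumers do not raise 'generator already executing'):

import threading
from synapseclient.core.utils import threadsafe_generator

@threadsafe_generator
def chunks():
    for i in range(1000):
        yield i

gen = chunks()

def consume():
    # Several threads can safely pull from the same wrapped generator.
    for _ in gen:
        pass

threads = [threading.Thread(target=consume) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()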
"},{"location":"reference/core/#synapseclient.core.utils.extract_prefix","title":"extract_prefix(keys)
","text":"Takes a list of strings and extracts a common prefix delimited by a dot, for example::
extract_prefix([\"entity.bang\", \"entity.bar\", \"entity.bat\"])\n# returns \"entity\"\n
Source code in synapseclient/core/utils.py
def extract_prefix(keys):\n \"\"\"\n Takes a list of strings and extracts a common prefix delimited by a dot,\n for example::\n\n extract_prefix([\"entity.bang\", \"entity.bar\", \"entity.bat\"])\n # returns \"entity\"\n\n \"\"\"\n prefixes = set()\n for key in keys:\n parts = key.split(\".\")\n if len(parts) > 1:\n prefixes.add(parts[0])\n else:\n return \"\"\n if len(prefixes) == 1:\n return prefixes.pop() + \".\"\n return \"\"\n
"},{"location":"reference/core/#synapseclient.core.utils.extract_zip_file_to_directory","title":"extract_zip_file_to_directory(zip_file, zip_entry_name, target_dir)
","text":"Extracts a specified file in a zip to the specified directory :param zip_file: an opened zip file. e.g. \"with zipfile.ZipFile(zipfilepath) as zip_file:\" :param zip_entry_name: the name of the file to be extracted from the zip e.g. folderInsideZipIfAny/fileName.txt :param target_dir: the directory to which the file will be extracted
:return: full path to the extracted file
Source code insynapseclient/core/utils.py
def extract_zip_file_to_directory(zip_file, zip_entry_name, target_dir):\n \"\"\"\n Extracts a specified file in a zip to the specified directory\n :param zip_file: an opened zip file. e.g. \"with zipfile.ZipFile(zipfilepath) as zip_file:\"\n :param zip_entry_name: the name of the file to be extracted from the zip e.g. folderInsideZipIfAny/fileName.txt\n :param target_dir: the directory to which the file will be extracted\n\n :return: full path to the extracted file\n \"\"\"\n file_base_name = os.path.basename(zip_entry_name) # base name of the file\n filepath = os.path.join(\n target_dir, file_base_name\n ) # file path to the cached file to write\n\n # Create the cache directory if it does not exist\n if not os.path.exists(target_dir):\n os.makedirs(target_dir)\n\n # write the file from the zip into the cache\n with open(filepath, \"wb\") as cache_file:\n cache_file.write(zip_file.read(zip_entry_name))\n\n return filepath\n
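A usage sketch (archive name, entry name, and target directory are illustrative):

import zipfile
from synapseclient.core.utils import extract_zip_file_to_directory

with zipfile.ZipFile("bundle.zip") as zf:
    extracted = extract_zip_file_to_directory(zf, "folder/data.txt", "/tmp/cache")
print(extracted)  # '/tmp/cache/data.txt'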
"},{"location":"reference/core/#synapseclient.core.utils.topolgical_sort","title":"topolgical_sort(graph)
","text":"Given a graph in the form of a dictionary returns a sorted list
Adapted from: http://blog.jupo.org/2012/04/06/topological-sorting-acyclic-directed-graphs/
:param graph: a dictionary with values containing lists of keys referencing back into the dictionary
:returns: a topologically sorted list of (node, edges) pairs
Source code insynapseclient/core/utils.py
def topolgical_sort(graph):\n \"\"\"Given a graph in the form of a dictionary returns a sorted list\n\n Adapted from: http://blog.jupo.org/2012/04/06/topological-sorting-acyclic-directed-graphs/\n\n :param graph: a dictionary with values containing lists of keys referencing back into the dictionary\n\n :returns: sorted list of items\n \"\"\"\n graph_unsorted = graph.copy()\n graph_sorted = []\n # Convert the unsorted graph into a hash table. This gives us\n # constant-time lookup for checking if edges are unresolved\n\n # Run until the unsorted graph is empty.\n while graph_unsorted:\n # Go through each of the node/edges pairs in the unsorted\n # graph. If a set of edges doesn't contain any nodes that\n # haven't been resolved, that is, that are still in the\n # unsorted graph, remove the pair from the unsorted graph,\n # and append it to the sorted graph. Note here that by using\n # using the items() method for iterating, a copy of the\n # unsorted graph is used, allowing us to modify the unsorted\n # graph as we move through it. We also keep a flag for\n # checking that that graph is acyclic, which is true if any\n # nodes are resolved during each pass through the graph. If\n # not, we need to bail out as the graph therefore can't be\n # sorted.\n acyclic = False\n for node, edges in list(graph_unsorted.items()):\n for edge in edges:\n if edge in graph_unsorted:\n break\n else:\n acyclic = True\n del graph_unsorted[node]\n graph_sorted.append((node, edges))\n\n if not acyclic:\n # We've passed through all the unsorted nodes and\n # weren't able to resolve any of them, which means there\n # are nodes with cyclic edges that will never be resolved,\n # so we bail out with an error.\n raise RuntimeError(\n \"A cyclic dependency occurred.\"\n \" Some files in provenance reference each other circularly.\"\n )\n return graph_sorted\n
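A usage sketch (the file names are illustrative; each value lists the keys that node depends on, and dependencies always come out first):

from synapseclient.core.utils import topolgical_sort

graph = {
    "report.html": ["clean.csv", "model.pkl"],
    "model.pkl": ["clean.csv"],
    "clean.csv": ["raw.csv"],
    "raw.csv": [],
}
for node, edges in topolgical_sort(graph):
    print(node)
# raw.csv, clean.csv, model.pkl, report.html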
"},{"location":"reference/core/#synapseclient.core.utils.caller_module_name","title":"caller_module_name(current_frame)
","text":":param current_frame: use inspect.currentframe(). :return: the name of the module calling the function, foo(), in which this calling_module() is invoked. Ignores callers that belong in the same module as foo()
Source code insynapseclient/core/utils.py
def caller_module_name(current_frame):\n \"\"\"\n :param current_frame: use inspect.currentframe().\n :return: the name of the module calling the function, foo(), in which this calling_module() is invoked.\n Ignores callers that belong in the same module as foo()\n \"\"\"\n\n current_frame_filename = (\n current_frame.f_code.co_filename\n ) # filename in which foo() resides\n\n # go back a frame takes us to the frame calling foo()\n caller_frame = current_frame.f_back\n caller_filename = caller_frame.f_code.co_filename\n\n # find the first frame that does not have the same filename. this ensures that we don't consider functions within\n # the same module as foo() that use foo() as a helper function\n while caller_filename == current_frame_filename:\n caller_frame = caller_frame.f_back\n caller_filename = caller_frame.f_code.co_filename\n\n return inspect.getmodulename(caller_filename)\n
"},{"location":"reference/core/#synapseclient.core.utils.snake_case","title":"snake_case(string)
","text":"Convert the given string from CamelCase to snake_case
Source code insynapseclient/core/utils.py
def snake_case(string):\n \"\"\"Convert the given string from CamelCase to snake_case\"\"\"\n # https://stackoverflow.com/a/1176023\n return re.sub(r\"(?<!^)(?=[A-Z])\", \"_\", string).lower()\n
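A usage sketch:

from synapseclient.core.utils import snake_case

print(snake_case("dataFileHandleId"))  # 'data_file_handle_id'
print(snake_case("CamelCase"))         # 'camel_case'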
"},{"location":"reference/core/#synapseclient.core.utils.is_base64_encoded","title":"is_base64_encoded(input_string)
","text":"Return whether the given input string appears to be base64 encoded
Source code insynapseclient/core/utils.py
def is_base64_encoded(input_string):\n \"\"\"Return whether the given input string appears to be base64 encoded\"\"\"\n if not input_string:\n # None, empty string are not considered encoded\n return False\n try:\n # see if we can decode it and then reencode it back to the input\n byte_string = (\n input_string\n if isinstance(input_string, bytes)\n else str.encode(input_string)\n )\n return base64.b64encode(base64.b64decode(byte_string)) == byte_string\n except Exception:\n return False\n
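A usage sketch (inputs are illustrative):

from synapseclient.core.utils import is_base64_encoded

print(is_base64_encoded("aGVsbG8gd29ybGQ="))  # True ('hello world' encoded)
print(is_base64_encoded("not base64!"))       # False
print(is_base64_encoded(""))                  # False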
"},{"location":"reference/core/#versions","title":"Versions","text":""},{"location":"reference/core/#synapseclient.core.version_check","title":"synapseclient.core.version_check
","text":"Version Functions
Check for latest version and recommend upgrade::
synapseclient.check_for_updates()\n
Print release notes for installed version of client::
synapseclient.release_notes()\n
"},{"location":"reference/core/#synapseclient.core.version_check-functions","title":"Functions","text":""},{"location":"reference/core/#synapseclient.core.version_check.version_check","title":"version_check(current_version=None, version_url=_VERSION_URL, check_for_point_releases=False)
","text":"Gets the latest version information from version_url and check against the current version. Recommends upgrade, if a newer version exists.
:returns: True if current version is the latest release (or higher) version, False otherwise.
Source code insynapseclient/core/version_check.py
def version_check(\n current_version=None, version_url=_VERSION_URL, check_for_point_releases=False\n):\n \"\"\"\n Gets the latest version information from version_url and check against the current version.\n Recommends upgrade, if a newer version exists.\n\n :returns: True if current version is the latest release (or higher) version,\n False otherwise.\n \"\"\"\n\n try:\n if not current_version:\n current_version = synapseclient.__version__\n\n version_info = _get_version_info(version_url)\n\n current_base_version = _strip_dev_suffix(current_version)\n\n # Check blacklist\n if (\n current_base_version in version_info[\"blacklist\"]\n or current_version in version_info[\"blacklist\"]\n ):\n msg = (\n \"\\nPLEASE UPGRADE YOUR CLIENT\\n\\nUpgrading your SynapseClient is required. \"\n \"Please upgrade your client by typing:\\n\"\n \" pip install --upgrade synapseclient\\n\\n\"\n )\n raise SystemExit(msg)\n\n if \"message\" in version_info:\n sys.stderr.write(version_info[\"message\"] + \"\\n\")\n\n levels = 3 if check_for_point_releases else 2\n\n # Compare with latest version\n if _version_tuple(current_version, levels=levels) < _version_tuple(\n version_info[\"latestVersion\"], levels=levels\n ):\n sys.stderr.write(\n \"\\nUPGRADE AVAILABLE\\n\\nA more recent version of the Synapse Client (%s) \"\n \"is available. Your version (%s) can be upgraded by typing:\\n\"\n \" pip install --upgrade synapseclient\\n\\n\"\n % (\n version_info[\"latestVersion\"],\n current_version,\n )\n )\n if \"releaseNotes\" in version_info:\n sys.stderr.write(\n \"Python Synapse Client version %s release notes\\n\\n\"\n % version_info[\"latestVersion\"]\n )\n sys.stderr.write(version_info[\"releaseNotes\"] + \"\\n\\n\")\n return False\n\n except Exception as e:\n # Don't prevent the client from running if something goes wrong\n sys.stderr.write(\"Exception in version check: %s\\n\" % (str(e),))\n return False\n\n return True\n
"},{"location":"reference/core/#synapseclient.core.version_check.check_for_updates","title":"check_for_updates()
","text":"Check for the existence of newer versions of the client, reporting both current release version and development version.
For help installing development versions of the client, see the docs for synapseclient or the README.md at https://github.com/Sage-Bionetworks/synapsePythonClient.
Source code in synapseclient/core/version_check.py
def check_for_updates():\n \"\"\"\n Check for the existence of newer versions of the client, reporting both current release version and development\n version.\n\n For help installing development versions of the client, see the docs for\n :py:mod:`synapseclient` or the `README.md <https://github.com/Sage-Bionetworks/synapsePythonClient>`_.\n \"\"\"\n sys.stderr.write(\"Python Synapse Client\\n\")\n sys.stderr.write(\"currently running version: %s\\n\" % synapseclient.__version__)\n\n release_version_info = _get_version_info(_VERSION_URL)\n sys.stderr.write(\n \"latest release version: %s\\n\" % release_version_info[\"latestVersion\"]\n )\n\n if _version_tuple(synapseclient.__version__, levels=3) < _version_tuple(\n release_version_info[\"latestVersion\"], levels=3\n ):\n print(\n (\n \"\\nUPGRADE AVAILABLE\\n\\nA more recent version of the Synapse Client (%s) is available. \"\n \"Your version (%s) can be upgraded by typing:\\n\"\n \" pip install --upgrade synapseclient\\n\\n\"\n )\n % (\n release_version_info[\"latestVersion\"],\n synapseclient.__version__,\n )\n )\n else:\n sys.stderr.write(\"\\nYour Synapse client is up to date!\\n\")\n
"},{"location":"reference/core/#synapseclient.core.version_check.release_notes","title":"release_notes(version_url=None)
","text":"Print release notes for the installed version of the client or latest release or development version if version_url is supplied.
:param version_url: Defaults to None, meaning release notes for the installed version. Alternatives are:
- synapseclient.version_check._VERSION_URL\n - synapseclient.version_check._DEV_VERSION_URL\n
Source code in synapseclient/core/version_check.py
def release_notes(version_url=None):\n \"\"\"\n Print release notes for the installed version of the client or latest release or development version if version_url\n is supplied.\n\n :param version_url: Defaults to None, meaning release notes for the installed version. Alternatives are:\n\n - synapseclient.version_check._VERSION_URL\n - synapseclient.version_check._DEV_VERSION_URL\n\n \"\"\"\n version_info = _get_version_info(version_url)\n sys.stderr.write(\n \"Python Synapse Client version %s release notes\\n\\n\"\n % version_info[\"latestVersion\"]\n )\n if \"releaseNotes\" in version_info:\n sys.stderr.write(version_info[\"releaseNotes\"] + \"\\n\")\n
"},{"location":"reference/docker_repository/","title":"DockerRepository","text":""},{"location":"reference/docker_repository/#synapseclient.entity.DockerRepository","title":"synapseclient.entity.DockerRepository
","text":" Bases: Entity
A Docker repository is a collection of Docker images, lightweight virtual machine-like images for packaging software.
NOTE: store()-ing a DockerRepository created in the Python client will always result in it being treated as a reference to an external Docker repository that is not managed by Synapse. To upload a Docker image that is managed by Synapse, please use the official Docker client and read https://help.synapse.org/docs/Synapse-Docker-Registry.2011037752.html for instructions on uploading a Docker image to Synapse.
ATTRIBUTE DESCRIPTION
repositoryName
The name of the Docker Repository. Usually in the format: [host[:port]/]path. If host is not set, it will default to that of DockerHub. port can only be specified if the host is also specified.
parent
The parent project or folder
properties
A map of Synapse properties
annotations
A map of user defined annotations
local_state
Internal use only
Source code in
synapseclient/entity.py
class DockerRepository(Entity):\n \"\"\"\n A Docker repository is a lightweight virtual machine image.\n\n NOTE: store()-ing a DockerRepository created in the Python client will always result in it being treated as a\n reference to an external Docker repository that is not managed by synapse.\n To upload a docker image that is managed by Synapse please use the official Docker client and read\n https://help.synapse.org/docs/Synapse-Docker-Registry.2011037752.html for instructions on uploading a Docker Image to Synapse\n\n Attributes:\n repositoryName: The name of the Docker Repository. Usually in the format: [host[:port]/]path.\n If host is not set, it will default to that of DockerHub. port can only be specified\n if the host is also specified.\n parent: The parent project or folder\n properties: A map of Synapse properties\n annotations: A map of user defined annotations\n local_state: Internal use only\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.docker.DockerRepository\"\n\n _property_keys = Entity._property_keys + [\"repositoryName\"]\n\n def __init__(\n self,\n repositoryName=None,\n parent=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if repositoryName:\n kwargs[\"repositoryName\"] = repositoryName\n super(DockerRepository, self).__init__(\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n if \"repositoryName\" not in self:\n raise SynapseMalformedEntityError(\n \"DockerRepository must have a repositoryName.\"\n )\n
"},{"location":"reference/entity/","title":"Entity","text":"The Entity class is the base class for all entities, including Project, Folder, File, and Link.
Entities are dictionary-like objects in which both object and dictionary notation (entity.foo
or entity['foo']
) can be used interchangeably.
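For example (a hypothetical File; the path, parent ID, and annotation key are illustrative):

from synapseclient import File

entity = File("/path/to/data.csv", parent="syn123")
entity.species = "Homo sapiens"      # attribute notation writes an annotation...
print(entity["species"])             # ...and dictionary notation reads it back
entity["species"] = "Mus musculus"   # the reverse also holds
print(entity.species)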
synapseclient.entity.Entity
","text":" Bases: MutableMapping
A Synapse entity is an object that has metadata, access control, and potentially a file. It can represent data, source code, or a folder that contains other entities.
Entities should typically be created using the constructors for specific subclasses such as synapseclient.Project, synapseclient.Folder or synapseclient.File.
ATTRIBUTE DESCRIPTION
id
The unique immutable ID for this entity. A new ID will be generated for new Entities. Once issued, this ID is guaranteed to never change or be re-issued
name
The name of this entity. Must be 256 characters or less. Names may only contain: letters, numbers, spaces, underscores, hyphens, periods, plus signs, apostrophes, and parentheses
description
The description of this entity. Must be 1000 characters or less.
parentId
The ID of the Entity that is the parent of this Entity.
entityType
concreteType
Indicates which implementation of Entity this object represents. The value is the fully qualified class name, e.g. org.sagebionetworks.repo.model.FileEntity.
etag
Synapse employs an Optimistic Concurrency Control (OCC) scheme to handle concurrent updates. Since the E-Tag changes every time an entity is updated it is used to detect when a client's current representation of an entity is out-of-date.
annotations
The dict of annotations for this entity.
accessControlList
createdOn
The date this entity was created.
createdBy
The ID of the user that created this entity.
modifiedOn
The date this entity was last modified.
modifiedBy
The ID of the user that last modified this entity.
Source code in
synapseclient/entity.py
class Entity(collections.abc.MutableMapping):\n \"\"\"\n A Synapse entity is an object that has metadata, access control, and potentially a file. It can represent data,\n source code, or a folder that contains other entities.\n\n Entities should typically be created using the constructors for specific subclasses\n such as [synapseclient.Project][], [synapseclient.Folder][] or [synapseclient.File][].\n\n Attributes:\n id: The unique immutable ID for this entity. A new ID will be generated for new\n Entities. Once issued, this ID is guaranteed to never change or be re-issued\n name: The name of this entity. Must be 256 characters or less. Names may only\n contain: letters, numbers, spaces, underscores, hyphens, periods, plus\n signs, apostrophes, and parentheses\n description: The description of this entity. Must be 1000 characters or less.\n parentId: The ID of the Entity that is the parent of this Entity.\n entityType:\n concreteType: Indicates which implementation of Entity this object represents.\n The value is the fully qualified class name, e.g.\n org.sagebionetworks.repo.model.FileEntity.\n etag: Synapse employs an Optimistic Concurrency Control (OCC) scheme to handle\n concurrent updates. Since the E-Tag changes every time an entity is\n updated it is used to detect when a client's current representation of\n an entity is out-of-date.\n annotations: The dict of annotations for this entity.\n accessControlList:\n createdOn: The date this entity was created.\n createdBy: The ID of the user that created this entity.\n modifiedOn: The date this entity was last modified.\n modifiedBy: The ID of the user that last modified this entity.\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.Entity\"\n _property_keys = [\n \"id\",\n \"name\",\n \"description\",\n \"parentId\",\n \"entityType\",\n \"concreteType\",\n \"uri\",\n \"etag\",\n \"annotations\",\n \"accessControlList\",\n \"createdOn\",\n \"createdBy\",\n \"modifiedOn\",\n \"modifiedBy\",\n ]\n _local_keys = []\n\n @classmethod\n def create(cls, properties=None, annotations=None, local_state=None):\n \"\"\"\n Create an Entity or a subclass given dictionaries of properties and annotations, as might be received from the\n Synapse Repository.\n\n Arguments:\n properties: A map of Synapse properties\n\n - If 'concreteType' is defined in properties, we create the proper subclass of Entity. If not, give back the\n type whose constructor was called.\n - If passed an Entity as input, create a new Entity using the input entity as a prototype.\n annotations: A map of user defined annotations\n local_state: Internal use only\n \"\"\"\n\n # Create a new Entity using an existing Entity as a prototype\n if isinstance(properties, Entity):\n if annotations is None:\n annotations = {}\n if local_state is None:\n local_state = {}\n annotations.update(properties.annotations)\n local_state.update(properties.local_state())\n properties = properties.properties\n if \"id\" in properties:\n del properties[\"id\"]\n\n if (\n cls == Entity\n and \"concreteType\" in properties\n and properties[\"concreteType\"] in entity_type_to_class\n ):\n cls = entity_type_to_class[properties[\"concreteType\"]]\n return cls(\n properties=properties, annotations=annotations, local_state=local_state\n )\n\n @classmethod\n def getURI(cls, id):\n return \"/entity/%s\" % id\n\n def __new__(cls, *args, **kwargs):\n obj = object.__new__(cls)\n\n # Make really sure that properties and annotations exist before\n # any object methods get invoked. 
This is important because the\n # dot operator magic methods have been overridden and depend on\n # properties and annotations existing.\n obj.__dict__[\"properties\"] = DictObject()\n obj.__dict__[\"annotations\"] = DictObject()\n return obj\n\n def __init__(\n self, properties=None, annotations=None, local_state=None, parent=None, **kwargs\n ):\n if properties:\n if isinstance(properties, collections.abc.Mapping):\n if \"annotations\" in properties and isinstance(\n properties[\"annotations\"], collections.abc.Mapping\n ):\n annotations.update(properties[\"annotations\"])\n del properties[\"annotations\"]\n\n # Re-map `items` to `datasetItems` to avoid namespace conflicts\n # between Dataset schema and the items() builtin method.\n if \"items\" in properties:\n properties[\"datasetItems\"] = properties[\"items\"]\n del properties[\"items\"]\n self.__dict__[\"properties\"].update(properties)\n else:\n raise SynapseMalformedEntityError(\n \"Unknown argument type: properties is a %s\" % str(type(properties))\n )\n\n if annotations:\n if isinstance(annotations, collections.abc.Mapping):\n self.__dict__[\"annotations\"].update(annotations)\n elif isinstance(annotations, str):\n self.properties[\"annotations\"] = annotations\n else:\n raise SynapseMalformedEntityError(\n \"Unknown argument type: annotations is a %s\"\n % str(type(annotations))\n )\n\n if local_state:\n if isinstance(local_state, collections.abc.Mapping):\n self.local_state(local_state)\n else:\n raise SynapseMalformedEntityError(\n \"Unknown argument type: local_state is a %s\"\n % str(type(local_state))\n )\n\n for key in self.__class__._local_keys:\n if key not in self.__dict__:\n self.__dict__[key] = None\n\n # Extract parentId from parent\n if \"parentId\" not in kwargs:\n if parent:\n try:\n kwargs[\"parentId\"] = id_of(parent)\n except Exception:\n if isinstance(parent, Entity) and \"id\" not in parent:\n raise SynapseMalformedEntityError(\n \"Couldn't find 'id' of parent.\"\n \" Has it been stored in Synapse?\"\n )\n else:\n raise SynapseMalformedEntityError(\n \"Couldn't find 'id' of parent.\"\n )\n\n # Note: that this will work properly if derived classes declare their internal state variable *before* invoking\n # super(...).__init__(...)\n for key, value in kwargs.items():\n self.__setitem__(key, value)\n\n if \"concreteType\" not in self:\n self[\"concreteType\"] = self.__class__._synapse_entity_type\n\n # Only project can be top-level. 
All other entity types require parentId don't enforce this for generic Entity\n if (\n \"parentId\" not in self\n and not isinstance(self, Project)\n and not type(self) == Entity\n ):\n raise SynapseMalformedEntityError(\n \"Entities of type %s must have a parentId.\" % type(self)\n )\n\n def postURI(self):\n return \"/entity\"\n\n def putURI(self):\n return \"/entity/%s\" % self.id\n\n def deleteURI(self, versionNumber=None):\n if versionNumber:\n return \"/entity/%s/version/%s\" % (self.id, versionNumber)\n else:\n return \"/entity/%s\" % self.id\n\n def local_state(self, state=None) -> dict:\n \"\"\"\n Set or get the object's internal state, excluding properties, or annotations.\n\n Arguments:\n state: A dictionary\n\n Returns:\n The object's internal state, excluding properties, or annotations.\n \"\"\"\n if state:\n for key, value in state.items():\n if key not in [\"annotations\", \"properties\"]:\n self.__dict__[key] = value\n result = {}\n for key, value in self.__dict__.items():\n if key not in [\"annotations\", \"properties\"] and not key.startswith(\"__\"):\n result[key] = value\n return result\n\n def __setattr__(self, key, value):\n return self.__setitem__(key, value)\n\n def __setitem__(self, key, value):\n if key in self.__dict__ or key in self.__class__._local_keys:\n # If we assign like so:\n # entity.annotations = {'foo';123, 'bar':'bat'}\n # Wrap the dictionary in a DictObject so we can\n # later do:\n # entity.annotations.foo = 'bar'\n if (key == \"annotations\" or key == \"properties\") and not isinstance(\n value, DictObject\n ):\n value = DictObject(value)\n self.__dict__[key] = value\n elif key in self.__class__._property_keys:\n self.properties[key] = value\n else:\n self.annotations[key] = value\n\n # TODO: def __delattr__\n\n def __getattr__(self, key):\n # Note: that __getattr__ is only called after an attempt to\n # look the key up in the object's dictionary has failed.\n try:\n return self.__getitem__(key)\n except KeyError:\n # Note that hasattr in Python2 is more permissive than Python3\n # about what exceptions it catches. In Python3, hasattr catches\n # only AttributeError\n raise AttributeError(key)\n\n def __getitem__(self, key):\n if key in self.__dict__:\n return self.__dict__[key]\n elif key in self.properties:\n return self.properties[key]\n elif key in self.annotations:\n return self.annotations[key]\n else:\n raise KeyError(key)\n\n def __delitem__(self, key):\n if key in self.properties:\n del self.properties[key]\n elif key in self.annotations:\n del self.annotations[key]\n\n def __iter__(self):\n return iter(self.keys())\n\n def __len__(self):\n return len(self.keys())\n\n # TODO shouldn't these include local_state as well? 
-jcb\n def keys(self):\n \"\"\"Returns a set of property and annotation keys\"\"\"\n return set(self.properties.keys()) | set(self.annotations.keys())\n\n def has_key(self, key):\n \"\"\"Is the given key a property or annotation?\"\"\"\n\n return key in self.properties or key in self.annotations\n\n def _write_kvps(self, f, dictionary, key_filter=None, key_aliases=None):\n for key in sorted(dictionary.keys()):\n if (not key_filter) or key_filter(key):\n f.write(\" \")\n f.write(str(key) if not key_aliases else key_aliases[key])\n f.write(\"=\")\n f.write(str(dictionary[key]))\n f.write(\"\\n\")\n\n def __str__(self):\n f = io.StringIO()\n\n f.write(\n \"%s: %s (%s)\\n\"\n % (\n self.__class__.__name__,\n self.properties.get(\"name\", \"None\"),\n self[\"id\"] if \"id\" in self else \"-\",\n )\n )\n\n self._str_localstate(f)\n\n f.write(\"properties:\\n\")\n self._write_kvps(f, self.properties)\n\n f.write(\"annotations:\\n\")\n self._write_kvps(f, self.annotations)\n\n return f.getvalue()\n\n def _str_localstate(self, f): # type: (io.StringIO) -> None\n \"\"\"\n Helper method for writing the string representation of the local state to a StringIO object\n :param f: a StringIO object to which the local state string will be written\n \"\"\"\n self._write_kvps(\n f,\n self.__dict__,\n lambda key: not (\n key in [\"properties\", \"annotations\"] or key.startswith(\"__\")\n ),\n )\n\n def __repr__(self):\n \"\"\"Returns an eval-able representation of the Entity.\"\"\"\n\n f = io.StringIO()\n f.write(self.__class__.__name__)\n f.write(\"(\")\n f.write(\n \", \".join(\n {\n \"%s=%s\"\n % (\n str(key),\n value.__repr__(),\n )\n for key, value in itertools.chain(\n list(\n [\n k_v\n for k_v in self.__dict__.items()\n if not (\n k_v[0] in [\"properties\", \"annotations\"]\n or k_v[0].startswith(\"__\")\n )\n ]\n ),\n self.properties.items(),\n self.annotations.items(),\n )\n }\n )\n )\n f.write(\")\")\n return f.getvalue()\n
"},{"location":"reference/entity/#synapseclient.entity.Entity-functions","title":"Functions","text":""},{"location":"reference/entity/#synapseclient.entity.Entity.create","title":"create(properties=None, annotations=None, local_state=None)
classmethod
","text":"Create an Entity or a subclass given dictionaries of properties and annotations, as might be received from the Synapse Repository.
PARAMETER DESCRIPTION
properties
A map of Synapse properties
DEFAULT: None
annotations
A map of user defined annotations
DEFAULT: None
local_state
Internal use only
DEFAULT: None
synapseclient/entity.py
@classmethod\ndef create(cls, properties=None, annotations=None, local_state=None):\n \"\"\"\n Create an Entity or a subclass given dictionaries of properties and annotations, as might be received from the\n Synapse Repository.\n\n Arguments:\n properties: A map of Synapse properties\n\n - If 'concreteType' is defined in properties, we create the proper subclass of Entity. If not, give back the\n type whose constructor was called.\n - If passed an Entity as input, create a new Entity using the input entity as a prototype.\n annotations: A map of user defined annotations\n local_state: Internal use only\n \"\"\"\n\n # Create a new Entity using an existing Entity as a prototype\n if isinstance(properties, Entity):\n if annotations is None:\n annotations = {}\n if local_state is None:\n local_state = {}\n annotations.update(properties.annotations)\n local_state.update(properties.local_state())\n properties = properties.properties\n if \"id\" in properties:\n del properties[\"id\"]\n\n if (\n cls == Entity\n and \"concreteType\" in properties\n and properties[\"concreteType\"] in entity_type_to_class\n ):\n cls = entity_type_to_class[properties[\"concreteType\"]]\n return cls(\n properties=properties, annotations=annotations, local_state=local_state\n )\n
"},{"location":"reference/entity/#synapseclient.entity.Entity.local_state","title":"local_state(state=None)
","text":"Set or get the object's internal state, excluding properties, or annotations.
PARAMETER DESCRIPTION
state
A dictionary
DEFAULT: None
dict
The object's internal state, excluding properties, or annotations.
Source code insynapseclient/entity.py
def local_state(self, state=None) -> dict:\n \"\"\"\n Set or get the object's internal state, excluding properties, or annotations.\n\n Arguments:\n state: A dictionary\n\n Returns:\n The object's internal state, excluding properties, or annotations.\n \"\"\"\n if state:\n for key, value in state.items():\n if key not in [\"annotations\", \"properties\"]:\n self.__dict__[key] = value\n result = {}\n for key, value in self.__dict__.items():\n if key not in [\"annotations\", \"properties\"] and not key.startswith(\"__\"):\n result[key] = value\n return result\n
"},{"location":"reference/entity/#synapseclient.entity.Entity.keys","title":"keys()
","text":"Returns a set of property and annotation keys
Source code insynapseclient/entity.py
def keys(self):\n \"\"\"Returns a set of property and annotation keys\"\"\"\n return set(self.properties.keys()) | set(self.annotations.keys())\n
"},{"location":"reference/entity/#synapseclient.entity.Entity.has_key","title":"has_key(key)
","text":"Is the given key a property or annotation?
Source code insynapseclient/entity.py
def has_key(self, key):\n \"\"\"Is the given key a property or annotation?\"\"\"\n\n return key in self.properties or key in self.annotations\n
"},{"location":"reference/entity/#synapseclient.entity.Versionable","title":"synapseclient.entity.Versionable
","text":" Bases: object
An entity for which Synapse will store a version history.
ATTRIBUTE DESCRIPTIONversionNumber
The version number issued to this version on the object.
versionLabel
The version label for this entity
versionComment
The version comment for this entity
versionUrl
versions
Source code in
synapseclient/entity.py
class Versionable(object):\n \"\"\"An entity for which Synapse will store a version history.\n\n Attributes:\n versionNumber: The version number issued to this version on the object.\n versionLabel: \tThe version label for this entity\n versionComment: The version comment for this entity\n versionUrl:\n versions:\n\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.Versionable\"\n _property_keys = [\n \"versionNumber\",\n \"versionLabel\",\n \"versionComment\",\n \"versionUrl\",\n \"versions\",\n ]\n
"},{"location":"reference/evaluation/","title":"Evaluation","text":""},{"location":"reference/evaluation/#synapseclient.evaluation","title":"synapseclient.evaluation
","text":"Evaluations
An Evaluation object represents a collection of Synapse Entities that will be processed in a particular way. This could mean scoring entries in a challenge or executing a processing pipeline.
Imports::
from synapseclient import Evaluation, Submission, SubmissionStatus\n
Evaluations can be retrieved by ID::
evaluation = syn.getEvaluation(1901877)\n
Like entities, evaluations are access controlled via ACLs. The synapseclient.Synapse.getPermissions and synapseclient.Synapse.setPermissions methods work for evaluations::
access = syn.getPermissions(evaluation, user_id)\n
The synapseclient.Synapse.submit method returns a Submission object::
entity = syn.get(synapse_id)\nsubmission = syn.submit(evaluation, entity, name='My Data', team='My Team')\n
The Submission object can then be used to check the status of the submission::
status = syn.getSubmissionStatus(submission)\n
Submission status objects can be updated, usually by changing the status and score fields, and stored back to Synapse using synapseclient.Synapse.store::
status.score = 0.99\nstatus.status = 'SCORED'\nstatus = syn.store(status)\n
See:
synapseclient.Synapse.getEvaluation
synapseclient.Synapse.getEvaluationByContentSource
synapseclient.Synapse.getEvaluationByName
synapseclient.Synapse.submit
synapseclient.Synapse.getSubmissions
synapseclient.Synapse.getSubmission
synapseclient.Synapse.getSubmissionStatus
synapseclient.Synapse.getPermissions
synapseclient.Synapse.setPermissions
Evaluation\n
Submission\n
Submission Status\n
"},{"location":"reference/evaluation/#synapseclient.evaluation-classes","title":"Classes","text":""},{"location":"reference/evaluation/#synapseclient.evaluation.Evaluation","title":"Evaluation
","text":" Bases: DictObject
An Evaluation Submission queue, allowing submissions, retrieval and scoring.
:param name: Name of the evaluation
:param description: A short description of the evaluation
:param contentSource: Synapse Project associated with the evaluation
:param submissionReceiptMessage: Message to display to users upon submission
:param submissionInstructionsMessage: Message to display to users detailing acceptable formatting for submissions.
To create an Evaluation (https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/Evaluation.html) and store it in Synapse::
evaluation = syn.store(Evaluation(\n name=\"Q1 Final\",\n description=\"Predict progression of MMSE scores for final scoring\",\n contentSource=\"syn2290704\"))\n
The contentSource field links the evaluation to its synapseclient.entity.Project. (Or, really, any Synapse ID, but sticking to projects is a good idea.)
Evaluations (https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/Evaluation.html) can be retrieved from Synapse by ID::
evaluation = syn.getEvaluation(1901877)\n
...by the Synapse ID of the content source (associated entity)::
evaluation = syn.getEvaluationByContentSource('syn12345')\n
...or by the name of the evaluation::
evaluation = syn.getEvaluationByName('Foo Challenge Question 1')\n
Source code in synapseclient/evaluation.py
class Evaluation(DictObject):\n \"\"\"\n An Evaluation Submission queue, allowing submissions, retrieval and scoring.\n\n :param name: Name of the evaluation\n :param description: A short description of the evaluation\n :param contentSource: Synapse Project associated with the evaluation\n :param submissionReceiptMessage: Message to display to users upon submission\n :param submissionInstructionsMessage: Message to display to users detailing acceptable formatting for submissions.\n\n `To create an Evaluation <https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/Evaluation.html>`_\n and store it in Synapse::\n\n evaluation = syn.store(Evaluation(\n name=\"Q1 Final\",\n description=\"Predict progression of MMSE scores for final scoring\",\n contentSource=\"syn2290704\"))\n\n The contentSource field links the evaluation to its :py:class:`synapseclient.entity.Project`.\n (Or, really, any synapse ID, but sticking to projects is a good idea.)\n\n `Evaluations <https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/Evaluation.html>`_ can be retrieved\n from Synapse by ID::\n\n evaluation = syn.getEvaluation(1901877)\n\n ...by the Synapse ID of the content source (associated entity)::\n\n evaluation = syn.getEvaluationByContentSource('syn12345')\n\n ...or by the name of the evaluation::\n\n evaluation = syn.getEvaluationByName('Foo Challenge Question 1')\n\n \"\"\"\n\n @classmethod\n def getByNameURI(cls, name: str):\n quoted_name = urllib_urlparse.quote(name)\n return f\"/evaluation/name/{quoted_name}\"\n\n @classmethod\n def getURI(cls, id: Union[str, int]):\n return f\"/evaluation/{id}\"\n\n def __init__(self, **kwargs):\n kwargs[\"contentSource\"] = kwargs.get(\"contentSource\", \"\")\n if not kwargs[\"contentSource\"].startswith(\n \"syn\"\n ): # Verify that synapse Id given\n raise ValueError(\n 'The \"contentSource\" parameter must be specified as a Synapse Entity when creating an'\n \" Evaluation\"\n )\n super(Evaluation, self).__init__(kwargs)\n\n def postURI(self):\n return \"/evaluation\"\n\n def putURI(self):\n return f\"/evaluation/{self.id}\"\n\n def deleteURI(self):\n return f\"/evaluation/{self.id}\"\n\n def getACLURI(self):\n return f\"/evaluation/{self.id}/acl\"\n\n def putACLURI(self):\n return \"/evaluation/acl\"\n
"},{"location":"reference/evaluation/#synapseclient.evaluation.Submission","title":"Submission
","text":" Bases: DictObject
Builds a Synapse submission object.
:param name: Name of submission
:param entityId: Synapse ID of the Entity to submit
:param evaluationId: ID of the Evaluation to which the Entity is to be submitted
:param versionNumber: Version number of the submitted Entity
:param submitterAlias: A pseudonym or team name for a challenge entry
Source code insynapseclient/evaluation.py
class Submission(DictObject):\n \"\"\"\n Builds an Synapse submission object.\n\n :param name: Name of submission\n :param entityId: Synapse ID of the Entity to submit\n :param evaluationId: ID of the Evaluation to which the Entity is to be submitted\n :param versionNumber: Version number of the submitted Entity\n :param submitterAlias: A pseudonym or team name for a challenge entry\n \"\"\"\n\n @classmethod\n def getURI(cls, id: Union[str, int]):\n return f\"/evaluation/submission/{id}\"\n\n def __init__(self, **kwargs):\n if not (\n \"evaluationId\" in kwargs\n and \"entityId\" in kwargs\n and \"versionNumber\" in kwargs\n ):\n raise KeyError\n\n super().__init__(kwargs)\n\n def postURI(self):\n return f\"/evaluation/submission?etag={self.etag}\"\n\n def putURI(self):\n return f\"/evaluation/submission/{self.id}\"\n\n def deleteURI(self):\n return f\"/evaluation/submission/{self.id}\"\n
"},{"location":"reference/evaluation/#synapseclient.evaluation.SubmissionStatus","title":"SubmissionStatus
","text":" Bases: DictObject
Builds a Synapse submission status object. See https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/SubmissionStatus.html
:param id: Unique immutable Synapse Id of the Submission
:param status: Status can be one of https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/SubmissionStatusEnum.html.
:param submissionAnnotations: synapseclient.Annotations to store annotations of submission
:param canCancel: Can this submission be cancelled?
:param cancelRequested: Has user requested to cancel this submission?
Source code insynapseclient/evaluation.py
class SubmissionStatus(DictObject):\n \"\"\"\n Builds an Synapse submission status object.\n https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/SubmissionStatus.html\n\n :param id: Unique immutable Synapse Id of the Submission\n :param status: Status can be one of\n https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/SubmissionStatusEnum.html.\n :param submissionAnnotations: synapseclient.Annotations to store annotations of submission\n :param canCancel: Can this submission be cancelled?\n :param cancelRequested: Has user requested to cancel this submission?\n \"\"\"\n\n @classmethod\n def getURI(cls, id: Union[str, int]):\n return f\"/evaluation/submission/{id}/status\"\n\n def __init__(self, id: Union[str, int], etag: str, **kwargs):\n annotations = kwargs.pop(\"submissionAnnotations\", {})\n # If it is synapse annotations, turn into a format\n # that can be worked with otherwise, create\n # synapseclient.Annotations\n submission_annotations = _convert_to_annotation_cls(\n id=id, etag=etag, values=annotations\n )\n # In Python 3, the super(SubmissionStatus, self) call is equivalent to the parameterless super()\n super().__init__(\n id=id, etag=etag, submissionAnnotations=submission_annotations, **kwargs\n )\n\n # def postURI(self):\n # return '/evaluation/submission/%s/status' % self.id\n\n def putURI(self):\n return f\"/evaluation/submission/{self.id}/status\"\n\n # def deleteURI(self):\n # return '/evaluation/submission/%s/status' % self.id\n\n def json(self, ensure_ascii: bool = True):\n \"\"\"Overloaded json function, turning submissionAnnotations into\n synapse style annotations\"\"\"\n\n json_dict = self\n # If not synapse annotations, turn them into synapseclient.Annotations\n # must have id and etag to turn into synapse annotations\n if not is_synapse_annotations(self.submissionAnnotations):\n json_dict = self.copy()\n\n annotations = _convert_to_annotation_cls(\n id=self.id, etag=self.etag, values=self.submissionAnnotations\n )\n # Turn into synapse annotation\n json_dict[\"submissionAnnotations\"] = to_synapse_annotations(annotations)\n return json.dumps(\n json_dict, sort_keys=True, indent=2, ensure_ascii=ensure_ascii\n )\n
"},{"location":"reference/evaluation/#synapseclient.evaluation.SubmissionStatus-functions","title":"Functions","text":""},{"location":"reference/evaluation/#synapseclient.evaluation.SubmissionStatus.json","title":"json(ensure_ascii=True)
","text":"Overloaded json function, turning submissionAnnotations into synapse style annotations
Source code insynapseclient/evaluation.py
def json(self, ensure_ascii: bool = True):\n \"\"\"Overloaded json function, turning submissionAnnotations into\n synapse style annotations\"\"\"\n\n json_dict = self\n # If not synapse annotations, turn them into synapseclient.Annotations\n # must have id and etag to turn into synapse annotations\n if not is_synapse_annotations(self.submissionAnnotations):\n json_dict = self.copy()\n\n annotations = _convert_to_annotation_cls(\n id=self.id, etag=self.etag, values=self.submissionAnnotations\n )\n # Turn into synapse annotation\n json_dict[\"submissionAnnotations\"] = to_synapse_annotations(annotations)\n return json.dumps(\n json_dict, sort_keys=True, indent=2, ensure_ascii=ensure_ascii\n )\n
"},{"location":"reference/evaluation/#synapseclient.evaluation-functions","title":"Functions","text":""},{"location":"reference/exceptions/","title":"Exceptions","text":""},{"location":"reference/exceptions/#synapseclient.core.exceptions","title":"synapseclient.core.exceptions
","text":""},{"location":"reference/exceptions/#synapseclient.core.exceptions-classes","title":"Classes","text":""},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseError","title":"SynapseError
","text":" Bases: Exception
Generic exception thrown by the client.
Source code insynapseclient/core/exceptions.py
class SynapseError(Exception):\n \"\"\"Generic exception thrown by the client.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseMd5MismatchError","title":"SynapseMd5MismatchError
","text":" Bases: SynapseError
, IOError
Error raised when MD5 computed for a download file fails to match the MD5 of its file handle.
Source code insynapseclient/core/exceptions.py
class SynapseMd5MismatchError(SynapseError, IOError):\n \"\"\"Error raised when MD5 computed for a download file fails to match the MD5 of its file handle.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseFileNotFoundError","title":"SynapseFileNotFoundError
","text":" Bases: SynapseError
Error thrown when a local file is not found in Synapse.
Source code insynapseclient/core/exceptions.py
class SynapseFileNotFoundError(SynapseError):\n \"\"\"Error thrown when a local file is not found in Synapse.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseTimeoutError","title":"SynapseTimeoutError
","text":" Bases: SynapseError
Timed out waiting for response from Synapse.
Source code insynapseclient/core/exceptions.py
class SynapseTimeoutError(SynapseError):\n \"\"\"Timed out waiting for response from Synapse.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseAuthenticationError","title":"SynapseAuthenticationError
","text":" Bases: SynapseError
Unauthorized access.
Source code insynapseclient/core/exceptions.py
class SynapseAuthenticationError(SynapseError):\n \"\"\"Unauthorized access.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseNoCredentialsError","title":"SynapseNoCredentialsError
","text":" Bases: SynapseAuthenticationError
No credentials for authentication
Source code insynapseclient/core/exceptions.py
class SynapseNoCredentialsError(SynapseAuthenticationError):\n \"\"\"No credentials for authentication\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseFileCacheError","title":"SynapseFileCacheError
","text":" Bases: SynapseError
Error related to local file storage.
Source code insynapseclient/core/exceptions.py
class SynapseFileCacheError(SynapseError):\n \"\"\"Error related to local file storage.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseMalformedEntityError","title":"SynapseMalformedEntityError
","text":" Bases: SynapseError
Unexpected structure of Entities.
Source code insynapseclient/core/exceptions.py
class SynapseMalformedEntityError(SynapseError):\n \"\"\"Unexpected structure of Entities.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseUnmetAccessRestrictions","title":"SynapseUnmetAccessRestrictions
","text":" Bases: SynapseError
Request cannot be completed due to unmet access restrictions.
Source code insynapseclient/core/exceptions.py
class SynapseUnmetAccessRestrictions(SynapseError):\n \"\"\"Request cannot be completed due to unmet access restrictions.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseProvenanceError","title":"SynapseProvenanceError
","text":" Bases: SynapseError
Incorrect usage of provenance objects.
Source code insynapseclient/core/exceptions.py
class SynapseProvenanceError(SynapseError):\n \"\"\"Incorrect usage of provenance objects.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseHTTPError","title":"SynapseHTTPError
","text":" Bases: SynapseError
, HTTPError
Wraps recognized HTTP errors. See HTTPError (http://docs.python-requests.org/en/latest/api/?highlight=exceptions#requests.exceptions.HTTPError).
synapseclient/core/exceptions.py
class SynapseHTTPError(SynapseError, requests.exceptions.HTTPError):\n \"\"\"Wraps recognized HTTP errors. See\n `HTTPError <http://docs.python-requests.org/en/latest/api/?highlight=exceptions#requests.exceptions.HTTPError>`_\n \"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseUploadAbortedException","title":"SynapseUploadAbortedException
","text":" Bases: SynapseError
Raised when a worker thread detects the upload was aborted and stops further processing.
Source code insynapseclient/core/exceptions.py
class SynapseUploadAbortedException(SynapseError):\n \"\"\"Raised when a worker thread detects the upload was\n aborted and stops further processing.\"\"\"\n\n pass\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseUploadFailedException","title":"SynapseUploadFailedException
","text":" Bases: SynapseError
Raised when an upload failed. Should be chained to a cause Exception
Source code insynapseclient/core/exceptions.py
class SynapseUploadFailedException(SynapseError):\n \"\"\"Raised when an upload failed. Should be chained to a cause Exception\"\"\"\n\n pass\n
"},{"location":"reference/file/","title":"File","text":""},{"location":"reference/file/#synapseclient.entity.File","title":"synapseclient.entity.File
","text":" Bases: Entity
, Versionable
Represents a file in Synapse.
When a File object is stored, the associated local file or its URL will be stored in Synapse. A File must have a path (or URL) and a parent. By default, the name of the file in Synapse matches the filename, but by specifying the name
attribute, the File Entity name can be different.
A Synapse File Entity has a name separate from the name of the actual file it represents. When a file is uploaded to Synapse, its filename is fixed, even though the name of the entity can be changed at any time. Synapse provides a way to change this filename and the content-type of the file for future downloads by creating a new version of the file with a modified copy of itself. This can be done with the synapseutils.copy_functions.changeFileMetaData function.
import synapseutils\ne = syn.get(synid)\nprint(os.path.basename(e.path)) ## prints, e.g., \"my_file.txt\"\ne = synapseutils.changeFileMetaData(syn, e, \"my_newname_file.txt\")\n
Setting fileNameOverride will not change the name of a copy of the file that's already downloaded into your local cache. Either rename the local copy manually or remove it from the cache and re-download.:
syn.cache.remove(e.dataFileHandleId)\ne = syn.get(e)\nprint(os.path.basename(e.path)) ## prints \"my_newname_file.txt\"\n
PARAMETER DESCRIPTION
path: Location to be represented by this File. DEFAULT: None
name: Name of the file in Synapse, not to be confused with the name within the path
parent: Project or Folder where this File is stored. DEFAULT: None
synapseStore: Whether the File should be uploaded or if only the path should be stored when synapseclient.Synapse.store is called on the File object. DEFAULT: True
contentType: Manually specify Content-type header, for example \"application/png\" or \"application/json; charset=UTF-8\"
dataFileHandleId: Defining an existing dataFileHandleId will use the existing dataFileHandleId. The creator of the file must also be the owner of the dataFileHandleId to have permission to store the file.
properties: A map of Synapse properties. DEFAULT: None
annotations: A map of user defined annotations. DEFAULT: None
local_state: Internal use only. DEFAULT: None
Creating and storing a File
# The Entity name is derived from the path and is 'data.xyz'\ndata = File('/path/to/file/data.xyz', parent=folder)\ndata = syn.store(data)\n
Setting the name of the file in Synapse to 'my entity'
# The Entity name is specified as 'my entity'\ndata = File('/path/to/file/data.xyz', name=\"my entity\", parent=folder)\ndata = syn.store(data)\n
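With synapseStore=False, only the path or URL is registered and no content is uploaded (a minimal sketch; the URL is hypothetical):
# Reference an external URL; the bytes stay where they are\ndata = File('http://example.com/data.xyz', synapseStore=False, parent=folder)\ndata = syn.store(data)\n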
Source code in synapseclient/entity.py
class File(Entity, Versionable):\n \"\"\"\n Represents a file in Synapse.\n\n When a File object is stored, the associated local file or its URL will be stored in Synapse. A File must have a\n path (or URL) and a parent. By default, the name of the file in Synapse matches the filename, but by specifying\n the `name` attribute, the File Entity name can be different.\n\n ## Changing File Names\n\n A Synapse File Entity has a name separate from the name of the actual file it represents. When a file is uploaded to\n Synapse, its filename is fixed, even though the name of the entity can be changed at any time. Synapse provides a way\n to change this filename and the content-type of the file for future downloads by creating a new version of the file\n with a modified copy of itself. This can be done with the synapseutils.copy_functions.changeFileMetaData function.\n\n import synapseutils\n e = syn.get(synid)\n print(os.path.basename(e.path)) ## prints, e.g., \"my_file.txt\"\n e = synapseutils.changeFileMetaData(syn, e, \"my_newname_file.txt\")\n\n Setting *fileNameOverride* will **not** change the name of a copy of the\n file that's already downloaded into your local cache. Either rename the\n local copy manually or remove it from the cache and re-download.:\n\n syn.cache.remove(e.dataFileHandleId)\n e = syn.get(e)\n print(os.path.basename(e.path)) ## prints \"my_newname_file.txt\"\n\n Parameters:\n path: Location to be represented by this File\n name: Name of the file in Synapse, not to be confused with the name within the path\n parent: Project or Folder where this File is stored\n synapseStore: Whether the File should be uploaded or if only the path should be stored when\n [synapseclient.Synapse.store][] is called on the File object.\n contentType: Manually specify Content-type header, for example \"application/png\" or\n \"application/json; charset=UTF-8\"\n dataFileHandleId: Defining an existing dataFileHandleId will use the existing dataFileHandleId\n The creator of the file must also be the owner of the dataFileHandleId to have\n permission to store the file.\n properties: A map of Synapse properties\n annotations: A map of user defined annotations\n local_state: Internal use only\n\n Example: Creating instances\n Creating and storing a File\n\n # The Entity name is derived from the path and is 'data.xyz'\n data = File('/path/to/file/data.xyz', parent=folder)\n data = syn.store(data)\n\n Setting the name of the file in Synapse to 'my entity'\n\n # The Entity name is specified as 'my entity'\n data = File('/path/to/file/data.xyz', name=\"my entity\", parent=folder)\n data = syn.store(data)\n \"\"\"\n\n # Note: externalURL technically should not be in the keys since it's only a field/member variable of\n # ExternalFileHandle, but for backwards compatibility it's included\n _file_handle_keys = [\n \"createdOn\",\n \"id\",\n \"concreteType\",\n \"contentSize\",\n \"createdBy\",\n \"etag\",\n \"fileName\",\n \"contentType\",\n \"contentMd5\",\n \"storageLocationId\",\n \"externalURL\",\n ]\n # Used for backwards compatability. The keys found below used to located in the entity's local_state\n # (i.e. 
__dict__).\n _file_handle_aliases = {\n \"md5\": \"contentMd5\",\n \"externalURL\": \"externalURL\",\n \"fileSize\": \"contentSize\",\n \"contentType\": \"contentType\",\n }\n _file_handle_aliases_inverse = {v: k for k, v in _file_handle_aliases.items()}\n\n _property_keys = (\n Entity._property_keys + Versionable._property_keys + [\"dataFileHandleId\"]\n )\n _local_keys = Entity._local_keys + [\n \"path\",\n \"cacheDir\",\n \"files\",\n \"synapseStore\",\n \"_file_handle\",\n ]\n _synapse_entity_type = \"org.sagebionetworks.repo.model.FileEntity\"\n\n # TODO: File(path=\"/path/to/file\", synapseStore=True, parentId=\"syn101\")\n def __init__(\n self,\n path=None,\n parent=None,\n synapseStore=True,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if path and \"name\" not in kwargs:\n kwargs[\"name\"] = utils.guess_file_name(path)\n self.__dict__[\"path\"] = path\n if path:\n cacheDir, basename = os.path.split(path)\n self.__dict__[\"cacheDir\"] = cacheDir\n self.__dict__[\"files\"] = [basename]\n else:\n self.__dict__[\"cacheDir\"] = None\n self.__dict__[\"files\"] = []\n self.__dict__[\"synapseStore\"] = synapseStore\n\n # pop the _file_handle from local properties because it is handled differently from other local_state\n self._update_file_handle(\n local_state.pop(\"_file_handle\", None) if (local_state is not None) else None\n )\n\n super(File, self).__init__(\n concreteType=File._synapse_entity_type,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n\n def _update_file_handle(self, file_handle_update_dict=None):\n \"\"\"\n Sets the file handle\n\n Should not need to be called by users\n \"\"\"\n\n # replace the file handle dict\n fh_dict = (\n DictObject(file_handle_update_dict)\n if file_handle_update_dict is not None\n else DictObject()\n )\n self.__dict__[\"_file_handle\"] = fh_dict\n\n if (\n file_handle_update_dict is not None\n and file_handle_update_dict.get(\"concreteType\")\n == \"org.sagebionetworks.repo.model.file.ExternalFileHandle\"\n and urllib_parse.urlparse(file_handle_update_dict.get(\"externalURL\")).scheme\n != \"sftp\"\n ):\n self.__dict__[\"synapseStore\"] = False\n\n # initialize all nonexistent keys to have value of None\n for key in self.__class__._file_handle_keys:\n if key not in fh_dict:\n fh_dict[key] = None\n\n def __setitem__(self, key, value):\n if key == \"_file_handle\":\n self._update_file_handle(value)\n elif key in self.__class__._file_handle_aliases:\n self._file_handle[self.__class__._file_handle_aliases[key]] = value\n else:\n\n def expand_and_convert_to_URL(path):\n return utils.as_url(os.path.expandvars(os.path.expanduser(path)))\n\n # hacky solution to allowing immediate switching into a ExternalFileHandle pointing to the current path\n # yes, there is boolean zen but I feel like it is easier to read/understand this way\n if (\n key == \"synapseStore\"\n and value is False\n and self[\"synapseStore\"] is True\n and utils.caller_module_name(inspect.currentframe()) != \"client\"\n ):\n self[\"externalURL\"] = expand_and_convert_to_URL(self[\"path\"])\n\n # hacky solution because we historically allowed modifying 'path' to indicate wanting to change to a new\n # ExternalFileHandle\n # don't change exernalURL if it's just the synapseclient setting metadata after a function call such as\n # syn.get()\n if (\n key == \"path\"\n and not self[\"synapseStore\"]\n and utils.caller_module_name(inspect.currentframe()) != \"client\"\n ):\n self[\"externalURL\"] = 
expand_and_convert_to_URL(value)\n self[\"contentMd5\"] = None\n self[\"contentSize\"] = None\n super(File, self).__setitem__(key, value)\n\n def __getitem__(self, item):\n if item in self.__class__._file_handle_aliases:\n return self._file_handle[self.__class__._file_handle_aliases[item]]\n else:\n return super(File, self).__getitem__(item)\n\n def _str_localstate(self, f):\n self._write_kvps(\n f,\n self._file_handle,\n lambda key: key\n in [\"externalURL\", \"contentMd5\", \"contentSize\", \"contentType\"],\n self._file_handle_aliases_inverse,\n )\n self._write_kvps(\n f,\n self.__dict__,\n lambda key: not (\n key in [\"properties\", \"annotations\", \"_file_handle\"]\n or key.startswith(\"__\")\n ),\n )\n
"},{"location":"reference/folder/","title":"Folder","text":""},{"location":"reference/folder/#synapseclient.entity.Folder","title":"synapseclient.entity.Folder
","text":" Bases: Entity
Represents a folder in Synapse.
Folders must have a name and a parent and can optionally have annotations.
ATTRIBUTE DESCRIPTION
name: The name of the folder
parent: The parent project or folder
properties: A map of Synapse properties
annotations: A map of user defined annotations
local_state: Internal use only
Using this class
Creating an instance and storing the folder
folder = Folder(name='my data', parent=project)\nfolder = syn.store(folder)\n
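Annotations can be attached when the folder is created (a minimal sketch; the annotation keys and values are hypothetical):
folder = Folder(name='my data', parent=project, annotations={'study': 'glioma', 'grant': 'abc123'})\nfolder = syn.store(folder)\n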
Source code in synapseclient/entity.py
class Folder(Entity):\n \"\"\"\n Represents a folder in Synapse.\n\n Folders must have a name and a parent and can optionally have annotations.\n\n Attributes:\n name: The name of the folder\n parent: The parent project or folder\n properties: A map of Synapse properties\n annotations: A map of user defined annotations\n local_state: Internal use only\n\n Example: Using this class\n Creating an instance and storing the folder\n\n folder = Folder(name='my data', parent=project)\n folder = syn.store(folder)\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.Folder\"\n\n def __init__(\n self,\n name=None,\n parent=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if name:\n kwargs[\"name\"] = name\n super(Folder, self).__init__(\n concreteType=Folder._synapse_entity_type,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n
"},{"location":"reference/json_schema/","title":"JSON Schema","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema","title":"synapseclient.services.json_schema
","text":"JSON Schema
Warning: This is a beta implementation and is subject to change. Use at your own risk.
"},{"location":"reference/json_schema/#synapseclient.services.json_schema-classes","title":"Classes","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion","title":"JsonSchemaVersion
","text":"Json schema version response object
:param organization: JSON schema organization.
:type organization: JsonSchemaOrganization
:param name: Name of the JSON schema.
:type name: str
:param semantic_version: Version of JSON schema. Defaults to None.
:type semantic_version: str, optional
Source code in synapseclient/services/json_schema.py
class JsonSchemaVersion:\n \"\"\"Json schema version response object\n\n :param organization: JSON schema organization.\n :type organization: JsonSchemaOrganization\n :param name: Name of the JSON schema.\n :type name: str\n :param semantic_version: Version of JSON schema. Defaults to None.\n :type semantic_version: str, optional\n \"\"\"\n\n def __init__(\n self,\n organization: JsonSchemaOrganization,\n name: str,\n semantic_version: str = None,\n ) -> None:\n self.organization = organization\n self.name = name\n self.semantic_version = semantic_version\n self.uri = None\n self.version_id = None\n self.created_on = None\n self.created_by = None\n self.json_sha256_hex = None\n self.set_service(self.organization.service)\n\n def __repr__(self):\n string = (\n f\"JsonSchemaVersion(org={self.organization.name!r}, name={self.name!r}, \"\n f\"version={self.semantic_version!r})\"\n )\n return string\n\n def set_service(self, service):\n self.service = service\n\n @property\n def raw(self):\n self.must_get()\n return self._raw\n\n def parse_response(self, response):\n self._raw = response\n self.uri = response[\"$id\"]\n self.version_id = response[\"versionId\"]\n self.created_on = response[\"createdOn\"]\n self.created_by = response[\"createdBy\"]\n self.json_sha256_hex = response[\"jsonSHA256Hex\"]\n\n @classmethod\n def from_response(cls, organization, response):\n semver = response.get(\"semanticVersion\")\n version = cls(organization, response[\"schemaName\"], semver)\n version.parse_response(response)\n return version\n\n def get(self):\n \"\"\"Get the JSON Schema Version\"\"\"\n if self.uri is not None:\n return True\n json_schema = self.organization.get_json_schema(self.name)\n if json_schema is None:\n return False\n raw_version = json_schema.get_version(self.semantic_version, raw=True)\n if raw_version is None:\n return False\n self.parse_response(raw_version)\n return True\n\n def must_get(self):\n already_exists = self.get()\n assert already_exists, (\n \"This operation requires that the JSON Schema name is created first.\"\n \"Call the 'create_version()' method to trigger the creation.\"\n )\n\n def create(\n self,\n json_schema_body: dict,\n dry_run: bool = False,\n ):\n \"\"\"Create JSON schema version\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param dry_run: Do not store to Synapse. 
Defaults to False.\n :type dry_run: bool, optional\n :returns: JSON Schema\n \"\"\"\n uri = f\"{self.organization.name}-{self.name}\"\n if self.semantic_version:\n uri = f\"{uri}-{self.semantic_version}\"\n json_schema_body[\"$id\"] = uri\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n self.parse_response(raw_version)\n return self\n\n def delete(self):\n \"\"\"Delete the JSON schema version\"\"\"\n self.must_get()\n response = self.service.delete_json_schema(self.uri)\n return response\n\n @property\n def body(self):\n self.must_get()\n json_schema_body = self.service.get_json_schema_body(self.uri)\n return json_schema_body\n\n def expand(self):\n \"\"\"Validate entities with schema\"\"\"\n self.must_get()\n response = self.service.json_schema_validation(self.uri)\n json_schema_body = response[\"validationSchema\"]\n return json_schema_body\n\n def bind_to_object(self, synapse_id: str):\n \"\"\"Bind schema to an entity\n\n :param synapse_id: Synapse Id to bind json schema to.\n :type synapse_id: str\n \"\"\"\n self.must_get()\n response = self.service.bind_json_schema_to_entity(synapse_id, self.uri)\n return response\n
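Taken together, a typical flow is to create a schema version under an existing organization and bind it to an entity (a minimal sketch; the organization name, schema name, schema body, and Synapse ID are all hypothetical, and the organization must already exist in Synapse):
from synapseclient.services.json_schema import JsonSchemaService\njs = JsonSchemaService()\njs.login()  # authenticates a new Synapse session for the service\norg = js.JsonSchemaOrganization('my.hypothetical.org')\nversion = js.JsonSchemaVersion(org, 'my.schema', semantic_version='0.0.1')\nversion.create({'type': 'object', 'properties': {'study': {'type': 'string'}}})\nversion.bind_to_object('syn000')  # hypothetical Synapse ID\n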
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion-functions","title":"Functions","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion.get","title":"get()
","text":"Get the JSON Schema Version
Source code in synapseclient/services/json_schema.py
def get(self):\n \"\"\"Get the JSON Schema Version\"\"\"\n if self.uri is not None:\n return True\n json_schema = self.organization.get_json_schema(self.name)\n if json_schema is None:\n return False\n raw_version = json_schema.get_version(self.semantic_version, raw=True)\n if raw_version is None:\n return False\n self.parse_response(raw_version)\n return True\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion.create","title":"create(json_schema_body, dry_run=False)
","text":"Create JSON schema version
:param json_schema_body: JSON schema body
:type json_schema_body: dict
:param dry_run: Do not store to Synapse. Defaults to False.
:type dry_run: bool, optional
:returns: JSON Schema
Source code in synapseclient/services/json_schema.py
def create(\n self,\n json_schema_body: dict,\n dry_run: bool = False,\n):\n \"\"\"Create JSON schema version\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param dry_run: Do not store to Synapse. Defaults to False.\n :type dry_run: bool, optional\n :returns: JSON Schema\n \"\"\"\n uri = f\"{self.organization.name}-{self.name}\"\n if self.semantic_version:\n uri = f\"{uri}-{self.semantic_version}\"\n json_schema_body[\"$id\"] = uri\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n self.parse_response(raw_version)\n return self\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion.delete","title":"delete()
","text":"Delete the JSON schema version
Source code in synapseclient/services/json_schema.py
def delete(self):\n \"\"\"Delete the JSON schema version\"\"\"\n self.must_get()\n response = self.service.delete_json_schema(self.uri)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion.expand","title":"expand()
","text":"Validate entities with schema
Source code in synapseclient/services/json_schema.py
def expand(self):\n \"\"\"Validate entities with schema\"\"\"\n self.must_get()\n response = self.service.json_schema_validation(self.uri)\n json_schema_body = response[\"validationSchema\"]\n return json_schema_body\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion.bind_to_object","title":"bind_to_object(synapse_id)
","text":"Bind schema to an entity
:param synapse_id: Synapse Id to bind json schema to. :type synapse_id: str
Source code in synapseclient/services/json_schema.py
def bind_to_object(self, synapse_id: str):\n \"\"\"Bind schema to an entity\n\n :param synapse_id: Synapse Id to bind json schema to.\n :type synapse_id: str\n \"\"\"\n self.must_get()\n response = self.service.bind_json_schema_to_entity(synapse_id, self.uri)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchema","title":"JsonSchema
","text":"Json schema response object
:param organization: JSON schema organization.
:type organization: JsonSchemaOrganization
:param name: Name of the JSON schema.
:type name: str
Source code in synapseclient/services/json_schema.py
class JsonSchema:\n \"\"\"Json schema response object\n\n :param organization: JSON schema organization.\n :type organization: JsonSchemaOrganization\n :param name: Name of the JSON schema.\n :type name: str\n \"\"\"\n\n def __init__(self, organization: JsonSchemaOrganization, name: str) -> None:\n self.organization = organization\n self.name = name\n self.id = None\n self.created_on = None\n self.created_by = None\n self._versions = dict()\n self.set_service(self.organization.service)\n\n def __repr__(self):\n string = f\"JsonSchema(org={self.organization.name!r}, name={self.name!r})\"\n return string\n\n def set_service(self, service):\n self.service = service\n\n @property\n def raw(self):\n self.must_get()\n return self._raw\n\n def parse_response(self, response):\n self._raw = response\n self.id = response[\"schemaId\"]\n self.created_on = response[\"createdOn\"]\n self.created_by = response[\"createdBy\"]\n\n @classmethod\n def from_response(cls, organization, response):\n json_schema = cls(organization, response[\"schemaName\"])\n json_schema.parse_response(response)\n return json_schema\n\n def get(self):\n \"\"\"Get Json schema\"\"\"\n if self.id is not None:\n return True\n response = self.organization.get_json_schema(self.name, raw=True)\n if response is None:\n return False\n self.parse_response(response)\n return True\n\n def must_get(self):\n already_exists = self.get()\n assert already_exists, (\n \"This operation requires that the JSON Schema name is created first.\"\n \"Call the 'create_version()' method to trigger the creation.\"\n )\n\n def list_versions(self):\n \"\"\"List versions of the json schema\"\"\"\n self.must_get()\n self._versions = dict()\n response = self.service.list_json_schema_versions(\n self.organization.name, self.name\n )\n for raw_version in response:\n semver = raw_version.get(\"semanticVersion\")\n version = JsonSchemaVersion.from_response(self.organization, raw_version)\n # Handle that multiple versions can have None/null as their semver\n if semver is None:\n update_none_version = (\n # Is this the first null version?\n semver not in self._versions\n # Or is the version ID higher (i.e., more recent)?\n or version.version_id > self._versions[semver].version_id\n )\n if update_none_version:\n self._versions[semver] = (raw_version, version)\n else:\n self._versions[semver] = (raw_version, version)\n # Skip versions w/o semver until the end\n if semver is not None:\n yield version\n # Return version w/o semver now (if applicable) to ensure latest is returned\n if None in self._versions:\n yield self._versions[None]\n\n def get_version(self, semantic_version: str = None, raw: bool = False):\n self.must_get()\n if semantic_version not in self._versions:\n list(self.list_versions())\n raw_version, version = self._versions.get(semantic_version, [None, None])\n return raw_version if raw else version\n\n def create(\n self,\n json_schema_body: dict,\n semantic_version: str = None,\n dry_run: bool = False,\n ):\n \"\"\"Create JSON schema\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param semantic_version: Version of JSON schema. Defaults to None.\n :type semantic_version: str, optional\n :param dry_run: Do not store to Synapse. 
Defaults to False.\n :type dry_run: bool, optional\n \"\"\"\n uri = f\"{self.organization.name}-{self.name}\"\n if semantic_version:\n uri = f\"{uri}-{semantic_version}\"\n json_schema_body[\"$id\"] = uri\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n version = JsonSchemaVersion.from_response(self.organization, raw_version)\n self._versions[semantic_version] = (raw_version, version)\n return version\n
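Versions can then be enumerated or fetched by semantic version (a minimal sketch, reusing the hypothetical js service and org handle from the sketch above):
schema = js.JsonSchema(org, 'my.schema')\nfor version in schema.list_versions():\n    print(version)\nv1 = schema.get_version('0.0.1')  # fetch one specific semantic version\n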
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchema-functions","title":"Functions","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchema.get","title":"get()
","text":"Get Json schema
Source code in synapseclient/services/json_schema.py
def get(self):\n \"\"\"Get Json schema\"\"\"\n if self.id is not None:\n return True\n response = self.organization.get_json_schema(self.name, raw=True)\n if response is None:\n return False\n self.parse_response(response)\n return True\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchema.list_versions","title":"list_versions()
","text":"List versions of the json schema
Source code in synapseclient/services/json_schema.py
def list_versions(self):\n \"\"\"List versions of the json schema\"\"\"\n self.must_get()\n self._versions = dict()\n response = self.service.list_json_schema_versions(\n self.organization.name, self.name\n )\n for raw_version in response:\n semver = raw_version.get(\"semanticVersion\")\n version = JsonSchemaVersion.from_response(self.organization, raw_version)\n # Handle that multiple versions can have None/null as their semver\n if semver is None:\n update_none_version = (\n # Is this the first null version?\n semver not in self._versions\n # Or is the version ID higher (i.e., more recent)?\n or version.version_id > self._versions[semver].version_id\n )\n if update_none_version:\n self._versions[semver] = (raw_version, version)\n else:\n self._versions[semver] = (raw_version, version)\n # Skip versions w/o semver until the end\n if semver is not None:\n yield version\n # Return version w/o semver now (if applicable) to ensure latest is returned\n if None in self._versions:\n yield self._versions[None]\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchema.create","title":"create(json_schema_body, semantic_version=None, dry_run=False)
","text":"Create JSON schema
:param json_schema_body: JSON schema body
:type json_schema_body: dict
:param semantic_version: Version of JSON schema. Defaults to None.
:type semantic_version: str, optional
:param dry_run: Do not store to Synapse. Defaults to False.
:type dry_run: bool, optional
Source code in synapseclient/services/json_schema.py
def create(\n self,\n json_schema_body: dict,\n semantic_version: str = None,\n dry_run: bool = False,\n):\n \"\"\"Create JSON schema\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param semantic_version: Version of JSON schema. Defaults to None.\n :type semantic_version: str, optional\n :param dry_run: Do not store to Synapse. Defaults to False.\n :type dry_run: bool, optional\n \"\"\"\n uri = f\"{self.organization.name}-{self.name}\"\n if semantic_version:\n uri = f\"{uri}-{semantic_version}\"\n json_schema_body[\"$id\"] = uri\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n version = JsonSchemaVersion.from_response(self.organization, raw_version)\n self._versions[semantic_version] = (raw_version, version)\n return version\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization","title":"JsonSchemaOrganization
","text":"Json Schema Organization
:param name: Name of JSON schema organization :type name: str
Source code in synapseclient/services/json_schema.py
class JsonSchemaOrganization:\n \"\"\"Json Schema Organization\n\n :param name: Name of JSON schema organization\n :type name: str\n \"\"\"\n\n def __init__(self, name: str) -> None:\n self.name = name\n self.id = None\n self.created_on = None\n self.created_by = None\n self._json_schemas = dict()\n self._raw_json_schemas = dict()\n\n def __repr__(self):\n string = f\"JsonSchemaOrganization(name={self.name!r})\"\n return string\n\n def set_service(self, service):\n self.service = service\n\n def get(self):\n \"\"\"Gets Json Schema organization\"\"\"\n if self.id is not None:\n return True\n try:\n response = self.service.get_organization(self.name)\n except SynapseHTTPError as e:\n error_msg = str(e)\n if \"not found\" in error_msg:\n return False\n else:\n raise e\n self.id = response[\"id\"]\n self.created_on = response[\"createdOn\"]\n self.created_by = response[\"createdBy\"]\n return True\n\n def must_get(self):\n already_exists = self.get()\n assert already_exists, (\n \"This operation requires that the organization is created first. \"\n \"Call the 'create()' method to trigger the creation.\"\n )\n\n @property\n def name(self):\n return self._name\n\n @name.setter\n def name(self, value):\n if len(value) < 6:\n raise ValueError(\"Name must be at least 6 characters.\")\n if len(value) > 250:\n raise ValueError(\"Name cannot exceed 250 characters. \")\n if value[0].isdigit():\n raise ValueError(\"Name must not start with a number.\")\n self._name = value\n\n @property\n def raw(self):\n self.must_get()\n return self._raw\n\n def parse_response(self, response):\n self._raw = response\n self.id = response[\"id\"]\n self.created_on = response[\"createdOn\"]\n self.created_by = response[\"createdBy\"]\n\n @classmethod\n def from_response(cls, response):\n organization = cls(response[\"name\"])\n organization.parse_response(response)\n return organization\n\n def create(self):\n \"\"\"Create the JSON schema organization\"\"\"\n already_exists = self.get()\n if already_exists:\n return\n response = self.service.create_organization(self.name)\n self.parse_response(response)\n return self\n\n def delete(self):\n \"\"\"Delete the JSON schema organization\"\"\"\n self.must_get()\n response = self.service.delete_organization(self.id)\n return response\n\n def get_acl(self):\n \"\"\"Get ACL of JSON schema organization\"\"\"\n self.must_get()\n response = self.service.get_organization_acl(self.id)\n return response\n\n def set_acl(\n self,\n principal_ids: Sequence[int],\n access_type: Sequence[str] = DEFAULT_ACCESS,\n etag: str = None,\n ):\n \"\"\"Set ACL of JSON schema organization\n\n :param principal_ids: List of Synapse user or team ids.\n :type principal_ids: list\n :param access_type: Access control list. Defaults to [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"].\n :type access_type: list, optional\n :param etag: Etag. 
Defaults to None.\n :type etag: str, optional\n \"\"\"\n self.must_get()\n if etag is None:\n acl = self.get_acl()\n etag = acl[\"etag\"]\n resource_access = [\n {\"principalId\": principal_id, \"accessType\": access_type}\n for principal_id in principal_ids\n ]\n response = self.service.update_organization_acl(self.id, resource_access, etag)\n return response\n\n def update_acl(\n self,\n principal_ids: Sequence[int],\n access_type: Sequence[str] = DEFAULT_ACCESS,\n etag: str = None,\n ):\n \"\"\"Update ACL of JSON schema organization\n\n :param principal_ids: List of Synapse user or team ids.\n :type principal_ids: list\n :param access_type: Access control list. Defaults to [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"].\n :type access_type: list, optional\n :param etag: Etag. Defaults to None.\n :type etag: str, optional\n \"\"\"\n self.must_get()\n principal_ids = set(principal_ids)\n acl = self.get_acl()\n resource_access = acl[\"resourceAccess\"]\n if etag is None:\n etag = acl[\"etag\"]\n for entry in resource_access:\n if entry[\"principalId\"] in principal_ids:\n entry[\"accessType\"] = access_type\n principal_ids.remove(entry[\"principalId\"])\n for principal_id in principal_ids:\n entry = {\n \"principalId\": principal_id,\n \"accessType\": access_type,\n }\n resource_access.append(entry)\n response = self.service.update_organization_acl(self.id, resource_access, etag)\n return response\n\n def list_json_schemas(self):\n \"\"\"List JSON schemas available from the organization\"\"\"\n self.must_get()\n response = self.service.list_json_schemas(self.name)\n for raw_json_schema in response:\n json_schema = JsonSchema.from_response(self, raw_json_schema)\n self._raw_json_schemas[json_schema.name] = raw_json_schema\n self._json_schemas[json_schema.name] = json_schema\n yield json_schema\n\n def get_json_schema(self, json_schema_name: str, raw: bool = False):\n \"\"\"Get JSON schema\n\n :param json_schema_name: Name of JSON schema.\n :type json_schema_name: str\n :param raw: Return raw JSON schema. Default is False.\n :type raw: bool, optional\n \"\"\"\n self.must_get()\n if json_schema_name not in self._json_schemas:\n list(self.list_json_schemas())\n if raw:\n json_schema = self._raw_json_schemas.get(json_schema_name)\n else:\n json_schema = self._json_schemas.get(json_schema_name)\n return json_schema\n\n def create_json_schema(\n self,\n json_schema_body: dict,\n name: str = None,\n semantic_version: str = None,\n dry_run: bool = False,\n ):\n \"\"\"Create JSON schema\n\n :param json_schema_body: JSON schema dict\n :type json_schema_body: dict\n :param name: Name of JSON schema. Defaults to None.\n :type name: str, optional\n :param semantic_version: Version of JSON schema. Defaults to None.\n :type semantic_version: str, optional\n :param dry_run: Don't store to Synapse. Defaults to False.\n :type dry_run: bool, optional\n \"\"\"\n if name:\n uri = f\"{self.name}-{name}\"\n if semantic_version:\n uri = f\"{uri}-{semantic_version}\"\n json_schema_body[\"$id\"] = uri\n else:\n assert (\n semantic_version is not None\n ), \"Specify both the name and the semantic version (not just the latter)\"\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n json_schema = JsonSchemaVersion.from_response(self, raw_version)\n return json_schema\n
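Organization names must be 6 to 250 characters and must not start with a number; create() returns early if the organization already exists. A minimal creation sketch (the name is hypothetical):
from synapseclient.services.json_schema import JsonSchemaService\njs = JsonSchemaService()\njs.login()\norg = js.JsonSchemaOrganization('my.hypothetical.org')\norg.create()  # registers the organization if it does not already exist\nprint(org.id, org.created_by)\n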
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization-functions","title":"Functions","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.get","title":"get()
","text":"Gets Json Schema organization
Source code in synapseclient/services/json_schema.py
def get(self):\n \"\"\"Gets Json Schema organization\"\"\"\n if self.id is not None:\n return True\n try:\n response = self.service.get_organization(self.name)\n except SynapseHTTPError as e:\n error_msg = str(e)\n if \"not found\" in error_msg:\n return False\n else:\n raise e\n self.id = response[\"id\"]\n self.created_on = response[\"createdOn\"]\n self.created_by = response[\"createdBy\"]\n return True\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.create","title":"create()
","text":"Create the JSON schema organization
Source code in synapseclient/services/json_schema.py
def create(self):\n \"\"\"Create the JSON schema organization\"\"\"\n already_exists = self.get()\n if already_exists:\n return\n response = self.service.create_organization(self.name)\n self.parse_response(response)\n return self\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.delete","title":"delete()
","text":"Delete the JSON schema organization
Source code in synapseclient/services/json_schema.py
def delete(self):\n \"\"\"Delete the JSON schema organization\"\"\"\n self.must_get()\n response = self.service.delete_organization(self.id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.get_acl","title":"get_acl()
","text":"Get ACL of JSON schema organization
Source code in synapseclient/services/json_schema.py
def get_acl(self):\n \"\"\"Get ACL of JSON schema organization\"\"\"\n self.must_get()\n response = self.service.get_organization_acl(self.id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.set_acl","title":"set_acl(principal_ids, access_type=DEFAULT_ACCESS, etag=None)
","text":"Set ACL of JSON schema organization
:param principal_ids: List of Synapse user or team ids.
:type principal_ids: list
:param access_type: Access control list. Defaults to [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"].
:type access_type: list, optional
:param etag: Etag. Defaults to None.
:type etag: str, optional
Source code in synapseclient/services/json_schema.py
def set_acl(\n self,\n principal_ids: Sequence[int],\n access_type: Sequence[str] = DEFAULT_ACCESS,\n etag: str = None,\n):\n \"\"\"Set ACL of JSON schema organization\n\n :param principal_ids: List of Synapse user or team ids.\n :type principal_ids: list\n :param access_type: Access control list. Defaults to [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"].\n :type access_type: list, optional\n :param etag: Etag. Defaults to None.\n :type etag: str, optional\n \"\"\"\n self.must_get()\n if etag is None:\n acl = self.get_acl()\n etag = acl[\"etag\"]\n resource_access = [\n {\"principalId\": principal_id, \"accessType\": access_type}\n for principal_id in principal_ids\n ]\n response = self.service.update_organization_acl(self.id, resource_access, etag)\n return response\n
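For example, replacing the ACL so that two hypothetical principals hold the default access types, then granting read-only access to a third with update_acl, which preserves existing entries (documented next):
org.set_acl(principal_ids=[1111111, 2222222])  # hypothetical principal IDs\norg.update_acl(principal_ids=[3333333], access_type=['READ'])\n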
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.update_acl","title":"update_acl(principal_ids, access_type=DEFAULT_ACCESS, etag=None)
","text":"Update ACL of JSON schema organization
:param principal_ids: List of Synapse user or team ids.
:type principal_ids: list
:param access_type: Access control list. Defaults to [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"].
:type access_type: list, optional
:param etag: Etag. Defaults to None.
:type etag: str, optional
Source code in synapseclient/services/json_schema.py
def update_acl(\n self,\n principal_ids: Sequence[int],\n access_type: Sequence[str] = DEFAULT_ACCESS,\n etag: str = None,\n):\n \"\"\"Update ACL of JSON schema organization\n\n :param principal_ids: List of Synapse user or team ids.\n :type principal_ids: list\n :param access_type: Access control list. Defaults to [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"].\n :type access_type: list, optional\n :param etag: Etag. Defaults to None.\n :type etag: str, optional\n \"\"\"\n self.must_get()\n principal_ids = set(principal_ids)\n acl = self.get_acl()\n resource_access = acl[\"resourceAccess\"]\n if etag is None:\n etag = acl[\"etag\"]\n for entry in resource_access:\n if entry[\"principalId\"] in principal_ids:\n entry[\"accessType\"] = access_type\n principal_ids.remove(entry[\"principalId\"])\n for principal_id in principal_ids:\n entry = {\n \"principalId\": principal_id,\n \"accessType\": access_type,\n }\n resource_access.append(entry)\n response = self.service.update_organization_acl(self.id, resource_access, etag)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.list_json_schemas","title":"list_json_schemas()
","text":"List JSON schemas available from the organization
Source code in synapseclient/services/json_schema.py
def list_json_schemas(self):\n \"\"\"List JSON schemas available from the organization\"\"\"\n self.must_get()\n response = self.service.list_json_schemas(self.name)\n for raw_json_schema in response:\n json_schema = JsonSchema.from_response(self, raw_json_schema)\n self._raw_json_schemas[json_schema.name] = raw_json_schema\n self._json_schemas[json_schema.name] = json_schema\n yield json_schema\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.get_json_schema","title":"get_json_schema(json_schema_name, raw=False)
","text":"Get JSON schema
:param json_schema_name: Name of JSON schema.
:type json_schema_name: str
:param raw: Return raw JSON schema. Default is False.
:type raw: bool, optional
Source code in synapseclient/services/json_schema.py
def get_json_schema(self, json_schema_name: str, raw: bool = False):\n \"\"\"Get JSON schema\n\n :param json_schema_name: Name of JSON schema.\n :type json_schema_name: str\n :param raw: Return raw JSON schema. Default is False.\n :type raw: bool, optional\n \"\"\"\n self.must_get()\n if json_schema_name not in self._json_schemas:\n list(self.list_json_schemas())\n if raw:\n json_schema = self._raw_json_schemas.get(json_schema_name)\n else:\n json_schema = self._json_schemas.get(json_schema_name)\n return json_schema\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.create_json_schema","title":"create_json_schema(json_schema_body, name=None, semantic_version=None, dry_run=False)
","text":"Create JSON schema
:param json_schema_body: JSON schema dict
:type json_schema_body: dict
:param name: Name of JSON schema. Defaults to None.
:type name: str, optional
:param semantic_version: Version of JSON schema. Defaults to None.
:type semantic_version: str, optional
:param dry_run: Don't store to Synapse. Defaults to False.
:type dry_run: bool, optional
Source code in synapseclient/services/json_schema.py
def create_json_schema(\n self,\n json_schema_body: dict,\n name: str = None,\n semantic_version: str = None,\n dry_run: bool = False,\n):\n \"\"\"Create JSON schema\n\n :param json_schema_body: JSON schema dict\n :type json_schema_body: dict\n :param name: Name of JSON schema. Defaults to None.\n :type name: str, optional\n :param semantic_version: Version of JSON schema. Defaults to None.\n :type semantic_version: str, optional\n :param dry_run: Don't store to Synapse. Defaults to False.\n :type dry_run: bool, optional\n \"\"\"\n if name:\n uri = f\"{self.name}-{name}\"\n if semantic_version:\n uri = f\"{uri}-{semantic_version}\"\n json_schema_body[\"$id\"] = uri\n else:\n assert (\n semantic_version is not None\n ), \"Specify both the name and the semantic version (not just the latter)\"\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n json_schema = JsonSchemaVersion.from_response(self, raw_version)\n return json_schema\n
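The organization handle can also register a schema directly; the $id is assembled as organization-name-version (a minimal sketch with hypothetical names, reusing the org handle from the earlier sketch):
body = {'type': 'object', 'properties': {'phase': {'type': 'integer'}}}\nversion = org.create_json_schema(body, name='my.schema', semantic_version='0.0.2')\nprint(version.uri)  # e.g. 'my.hypothetical.org-my.schema-0.0.2'\n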
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService","title":"JsonSchemaService
","text":"Json Schema Service
:param synapse: Synapse connection :type synapse: Synapse
Source code in synapseclient/services/json_schema.py
class JsonSchemaService:\n \"\"\"Json Schema Service\n\n :param synapse: Synapse connection\n :type synapse: Synapse\n \"\"\"\n\n def __init__(self, synapse: Synapse = None) -> None:\n self.synapse = synapse\n\n @wraps(Synapse.login)\n def login(self, *args, **kwargs):\n synapse = Synapse()\n synapse.login(*args, **kwargs)\n self.synapse = synapse\n\n @wraps(JsonSchemaOrganization)\n def JsonSchemaOrganization(self, *args, **kwargs):\n instance = JsonSchemaOrganization(*args, **kwargs)\n instance.set_service(self)\n return instance\n\n @wraps(JsonSchemaVersion)\n def JsonSchemaVersion(self, *args, **kwargs):\n instance = JsonSchemaVersion(*args, **kwargs)\n instance.set_service(self)\n return instance\n\n @wraps(JsonSchema)\n def JsonSchema(self, *args, **kwargs):\n instance = JsonSchema(*args, **kwargs)\n instance.set_service(self)\n return instance\n\n def authentication_required(func):\n @wraps(func)\n def wrapper(self, *args, **kwargs):\n msg = (\n f\"`JsonSchemaService.{func.__name__}()` requests must be authenticated.\"\n \" Login using the `login()` method on the existing `JsonSchemaService`\"\n \" instance (e.g., `js.login()` or `js.login(authToken=...)`).\"\n )\n assert self.synapse is not None, msg\n try:\n result = func(self, *args, **kwargs)\n except SynapseAuthenticationError as e:\n raise SynapseAuthenticationError(msg).with_traceback(e.__traceback__)\n return result\n\n return wrapper\n\n @authentication_required\n def create_organization(self, organization_name: str):\n \"\"\"Create a new organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n request_body = {\"organizationName\": organization_name}\n response = self.synapse.restPOST(\n \"/schema/organization\", body=json.dumps(request_body)\n )\n return response\n\n @authentication_required\n def get_organization(self, organization_name: str):\n \"\"\"Get a organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n response = self.synapse.restGET(\n f\"/schema/organization?name={organization_name}\"\n )\n return response\n\n def list_organizations(self):\n \"\"\"List organizations\"\"\"\n request_body = {}\n response = self.synapse._POST_paginated(\n \"/schema/organization/list\", request_body\n )\n return response\n\n @authentication_required\n def delete_organization(self, organization_id: str):\n \"\"\"Delete organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n \"\"\"\n response = self.synapse.restDELETE(f\"/schema/organization/{organization_id}\")\n return response\n\n @authentication_required\n def get_organization_acl(self, organization_id: str):\n \"\"\"Get ACL associated with Organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/schema/organization/{organization_id}/acl\")\n return response\n\n @authentication_required\n def update_organization_acl(\n self,\n organization_id: str,\n resource_access: Sequence[Mapping[str, Sequence[str]]],\n etag: str,\n ):\n \"\"\"Get ACL associated with Organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n :param resource_access: Resource access array\n :type resource_access: list\n :param etag: Etag\n :type etag: str\n \"\"\"\n request_body = {\"resourceAccess\": resource_access, \"etag\": etag}\n response = self.synapse.restPUT(\n f\"/schema/organization/{organization_id}/acl\", 
body=json.dumps(request_body)\n )\n return response\n\n def list_json_schemas(self, organization_name: str):\n \"\"\"List JSON schemas for an organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n request_body = {\"organizationName\": organization_name}\n response = self.synapse._POST_paginated(\"/schema/list\", request_body)\n return response\n\n def list_json_schema_versions(self, organization_name: str, json_schema_name: str):\n \"\"\"List version information for each JSON schema\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n :param json_schema_name: JSON schema name\n :type json_schema_name: str\n \"\"\"\n request_body = {\n \"organizationName\": organization_name,\n \"schemaName\": json_schema_name,\n }\n response = self.synapse._POST_paginated(\"/schema/version/list\", request_body)\n return response\n\n @authentication_required\n def create_json_schema(self, json_schema_body: dict, dry_run: bool = False):\n \"\"\"Create a JSON schema\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param dry_run: Don't store to Synapse. Default to False.\n :type dry_run: bool, optional\n \"\"\"\n request_body = {\n \"concreteType\": \"org.sagebionetworks.repo.model.schema.CreateSchemaRequest\",\n \"schema\": json_schema_body,\n \"dryRun\": dry_run,\n }\n response = self.synapse._waitForAsync(\"/schema/type/create/async\", request_body)\n return response\n\n def get_json_schema_body(self, json_schema_uri: str):\n \"\"\"Get registered JSON schema with its $id\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n response = self.synapse.restGET(f\"/schema/type/registered/{json_schema_uri}\")\n return response\n\n @authentication_required\n def delete_json_schema(self, json_schema_uri: str):\n \"\"\"Delete the given schema using its $id\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n response = self.synapse.restDELETE(f\"/schema/type/registered/{json_schema_uri}\")\n return response\n\n @authentication_required\n def json_schema_validation(self, json_schema_uri: str):\n \"\"\"Use a JSON schema for validation\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n request_body = {\n \"concreteType\": (\n \"org.sagebionetworks.repo.model.schema.GetValidationSchemaRequest\"\n ),\n \"$id\": json_schema_uri,\n }\n response = self.synapse._waitForAsync(\n \"/schema/type/validation/async\", request_body\n )\n return response\n\n @authentication_required\n def bind_json_schema_to_entity(self, synapse_id: str, json_schema_uri: str):\n \"\"\"Bind a JSON schema to an entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n request_body = {\"entityId\": synapse_id, \"schema$id\": json_schema_uri}\n response = self.synapse.restPUT(\n f\"/entity/{synapse_id}/schema/binding\", body=json.dumps(request_body)\n )\n return response\n\n @authentication_required\n def get_json_schema_from_entity(self, synapse_id: str):\n \"\"\"Get bound schema from entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/entity/{synapse_id}/schema/binding\")\n return response\n\n @authentication_required\n def delete_json_schema_from_entity(self, synapse_id: str):\n \"\"\"Delete bound schema from entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n 
response = self.synapse.restDELETE(f\"/entity/{synapse_id}/schema/binding\")\n return response\n\n @authentication_required\n def validate_entity_with_json_schema(self, synapse_id: str):\n \"\"\"Get validation results of an entity against bound JSON schema\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/entity/{synapse_id}/schema/validation\")\n return response\n\n @authentication_required\n def get_json_schema_validation_statistics(self, synapse_id: str):\n \"\"\"Get the summary statistic of json schema validation results for\n a container entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(\n f\"/entity/{synapse_id}/schema/validation/statistics\"\n )\n return response\n\n @authentication_required\n def get_invalid_json_schema_validation(self, synapse_id: str):\n \"\"\"Get a single page of invalid JSON schema validation results for a container Entity\n (Project or Folder).\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n request_body = {\"containerId\": synapse_id}\n response = self.synapse._POST_paginated(\n f\"/entity/{synapse_id}/schema/validation/invalid\", request_body\n )\n return response\n\n # The methods below are here until they are integrated with Synapse/Entity\n\n def bind_json_schema(self, json_schema_uri: str, entity: Union[str, Entity]):\n \"\"\"Bind a JSON schema to an entity\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.bind_json_schema_to_entity(synapse_id, json_schema_uri)\n return response\n\n def get_json_schema(self, entity: Union[str, Entity]):\n \"\"\"Get a JSON schema associated to an Entity\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.get_json_schema_from_entity(synapse_id)\n return response\n\n def unbind_json_schema(self, entity: Union[str, Entity]):\n \"\"\"Unbind a JSON schema from an entity\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.delete_json_schema_from_entity(synapse_id)\n return response\n\n def validate(self, entity: Union[str, Entity]):\n \"\"\"Validate an entity based on the bound JSON schema\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.validate_entity_with_json_schema(synapse_id)\n return response\n\n def validation_stats(self, entity: Union[str, Entity]):\n \"\"\"Get validation statistics of an entity based on the bound JSON schema\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.get_json_schema_validation_statistics(synapse_id)\n return response\n\n def validate_children(self, entity: Union[str, Entity]):\n \"\"\"Validate an entity and it's children based on the bound JSON schema\n\n :param entity: Synapse Entity or Synapse Id of a project or folder.\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.get_invalid_json_schema_validation(synapse_id)\n return response\n
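The last group of methods accepts either a Synapse ID or an Entity, so a bound schema can be checked without dropping down to the REST endpoints (a minimal sketch; the schema URI and Synapse ID are hypothetical):
from synapseclient.services.json_schema import JsonSchemaService\njs = JsonSchemaService()\njs.login()\njs.bind_json_schema('my.hypothetical.org-my.schema-0.0.1', 'syn000')\nprint(js.validate('syn000'))\nprint(js.validation_stats('syn000'))\n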
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService-functions","title":"Functions","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.create_organization","title":"create_organization(organization_name)
","text":"Create a new organization
:param organization_name: JSON schema organization name :type organization_name: str
Source code in synapseclient/services/json_schema.py
@authentication_required\ndef create_organization(self, organization_name: str):\n \"\"\"Create a new organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n request_body = {\"organizationName\": organization_name}\n response = self.synapse.restPOST(\n \"/schema/organization\", body=json.dumps(request_body)\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_organization","title":"get_organization(organization_name)
","text":"Get a organization
:param organization_name: JSON schema organization name :type organization_name: str
Source code in synapseclient/services/json_schema.py
@authentication_required\ndef get_organization(self, organization_name: str):\n \"\"\"Get a organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n response = self.synapse.restGET(\n f\"/schema/organization?name={organization_name}\"\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.list_organizations","title":"list_organizations()
","text":"List organizations
Source code in synapseclient/services/json_schema.py
def list_organizations(self):\n \"\"\"List organizations\"\"\"\n request_body = {}\n response = self.synapse._POST_paginated(\n \"/schema/organization/list\", request_body\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.delete_organization","title":"delete_organization(organization_id)
","text":"Delete organization
:param organization_id: JSON schema organization Id :type organization_id: str
Source code in synapseclient/services/json_schema.py
@authentication_required\ndef delete_organization(self, organization_id: str):\n \"\"\"Delete organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n \"\"\"\n response = self.synapse.restDELETE(f\"/schema/organization/{organization_id}\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_organization_acl","title":"get_organization_acl(organization_id)
","text":"Get ACL associated with Organization
:param organization_id: JSON schema organization Id :type organization_id: str
Source code in synapseclient/services/json_schema.py
@authentication_required\ndef get_organization_acl(self, organization_id: str):\n \"\"\"Get ACL associated with Organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/schema/organization/{organization_id}/acl\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.update_organization_acl","title":"update_organization_acl(organization_id, resource_access, etag)
","text":"Get ACL associated with Organization
:param organization_id: JSON schema organization Id
:type organization_id: str
:param resource_access: Resource access array
:type resource_access: list
:param etag: Etag
:type etag: str
Source code in synapseclient/services/json_schema.py
@authentication_required\ndef update_organization_acl(\n self,\n organization_id: str,\n resource_access: Sequence[Mapping[str, Sequence[str]]],\n etag: str,\n):\n \"\"\"Get ACL associated with Organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n :param resource_access: Resource access array\n :type resource_access: list\n :param etag: Etag\n :type etag: str\n \"\"\"\n request_body = {\"resourceAccess\": resource_access, \"etag\": etag}\n response = self.synapse.restPUT(\n f\"/schema/organization/{organization_id}/acl\", body=json.dumps(request_body)\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.list_json_schemas","title":"list_json_schemas(organization_name)
","text":"List JSON schemas for an organization
:param organization_name: JSON schema organization name :type organization_name: str
Source code in synapseclient/services/json_schema.py
def list_json_schemas(self, organization_name: str):\n \"\"\"List JSON schemas for an organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n request_body = {\"organizationName\": organization_name}\n response = self.synapse._POST_paginated(\"/schema/list\", request_body)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.list_json_schema_versions","title":"list_json_schema_versions(organization_name, json_schema_name)
","text":"List version information for each JSON schema
:param organization_name: JSON schema organization name
:type organization_name: str
:param json_schema_name: JSON schema name
:type json_schema_name: str
Source code in synapseclient/services/json_schema.py
def list_json_schema_versions(self, organization_name: str, json_schema_name: str):\n \"\"\"List version information for each JSON schema\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n :param json_schema_name: JSON schema name\n :type json_schema_name: str\n \"\"\"\n request_body = {\n \"organizationName\": organization_name,\n \"schemaName\": json_schema_name,\n }\n response = self.synapse._POST_paginated(\"/schema/version/list\", request_body)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.create_json_schema","title":"create_json_schema(json_schema_body, dry_run=False)
","text":"Create a JSON schema
:param json_schema_body: JSON schema body :type json_schema_body: dict :param dry_run: Don't store to Synapse. Defaults to False. :type dry_run: bool, optional
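A minimal sketch, assuming js is a JsonSchemaService instance wired to a logged-in Synapse client; the $id shown (organization name, a dash, then the schema name) and the schema body are illustrative:
schema_body = {\n    \"$id\": \"my.organization-my.schema\",\n    \"type\": \"object\",\n    \"properties\": {\"species\": {\"type\": \"string\"}},\n    \"required\": [\"species\"],\n}\n# dry_run=True exercises the request without registering the schema\nresponse = js.create_json_schema(schema_body, dry_run=True)\n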
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef create_json_schema(self, json_schema_body: dict, dry_run: bool = False):\n \"\"\"Create a JSON schema\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param dry_run: Don't store to Synapse. Default to False.\n :type dry_run: bool, optional\n \"\"\"\n request_body = {\n \"concreteType\": \"org.sagebionetworks.repo.model.schema.CreateSchemaRequest\",\n \"schema\": json_schema_body,\n \"dryRun\": dry_run,\n }\n response = self.synapse._waitForAsync(\"/schema/type/create/async\", request_body)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_json_schema_body","title":"get_json_schema_body(json_schema_uri)
","text":"Get registered JSON schema with its $id
:param json_schema_uri: JSON schema URI :type json_schema_uri: str
Source code insynapseclient/services/json_schema.py
def get_json_schema_body(self, json_schema_uri: str):\n \"\"\"Get registered JSON schema with its $id\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n response = self.synapse.restGET(f\"/schema/type/registered/{json_schema_uri}\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.delete_json_schema","title":"delete_json_schema(json_schema_uri)
","text":"Delete the given schema using its $id
:param json_schema_uri: JSON schema URI :type json_schema_uri: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef delete_json_schema(self, json_schema_uri: str):\n \"\"\"Delete the given schema using its $id\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n response = self.synapse.restDELETE(f\"/schema/type/registered/{json_schema_uri}\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.json_schema_validation","title":"json_schema_validation(json_schema_uri)
","text":"Use a JSON schema for validation
:param json_schema_uri: JSON schema URI :type json_schema_uri: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef json_schema_validation(self, json_schema_uri: str):\n \"\"\"Use a JSON schema for validation\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n request_body = {\n \"concreteType\": (\n \"org.sagebionetworks.repo.model.schema.GetValidationSchemaRequest\"\n ),\n \"$id\": json_schema_uri,\n }\n response = self.synapse._waitForAsync(\n \"/schema/type/validation/async\", request_body\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.bind_json_schema_to_entity","title":"bind_json_schema_to_entity(synapse_id, json_schema_uri)
","text":"Bind a JSON schema to an entity
:param synapse_id: Synapse Id :type synapse_id: str :param json_schema_uri: JSON schema URI :type json_schema_uri: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef bind_json_schema_to_entity(self, synapse_id: str, json_schema_uri: str):\n \"\"\"Bind a JSON schema to an entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n request_body = {\"entityId\": synapse_id, \"schema$id\": json_schema_uri}\n response = self.synapse.restPUT(\n f\"/entity/{synapse_id}/schema/binding\", body=json.dumps(request_body)\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_json_schema_from_entity","title":"get_json_schema_from_entity(synapse_id)
","text":"Get bound schema from entity
:param synapse_id: Synapse Id :type synapse_id: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef get_json_schema_from_entity(self, synapse_id: str):\n \"\"\"Get bound schema from entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/entity/{synapse_id}/schema/binding\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.delete_json_schema_from_entity","title":"delete_json_schema_from_entity(synapse_id)
","text":"Delete bound schema from entity
:param synapse_id: Synapse Id :type synapse_id: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef delete_json_schema_from_entity(self, synapse_id: str):\n \"\"\"Delete bound schema from entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restDELETE(f\"/entity/{synapse_id}/schema/binding\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.validate_entity_with_json_schema","title":"validate_entity_with_json_schema(synapse_id)
","text":"Get validation results of an entity against bound JSON schema
:param synapse_id: Synapse Id :type synapse_id: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef validate_entity_with_json_schema(self, synapse_id: str):\n \"\"\"Get validation results of an entity against bound JSON schema\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/entity/{synapse_id}/schema/validation\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_json_schema_validation_statistics","title":"get_json_schema_validation_statistics(synapse_id)
","text":"Get the summary statistic of json schema validation results for a container entity
:param synapse_id: Synapse Id :type synapse_id: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef get_json_schema_validation_statistics(self, synapse_id: str):\n \"\"\"Get the summary statistic of json schema validation results for\n a container entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(\n f\"/entity/{synapse_id}/schema/validation/statistics\"\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_invalid_json_schema_validation","title":"get_invalid_json_schema_validation(synapse_id)
","text":"Get a single page of invalid JSON schema validation results for a container Entity (Project or Folder).
:param synapse_id: Synapse Id :type synapse_id: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef get_invalid_json_schema_validation(self, synapse_id: str):\n \"\"\"Get a single page of invalid JSON schema validation results for a container Entity\n (Project or Folder).\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n request_body = {\"containerId\": synapse_id}\n response = self.synapse._POST_paginated(\n f\"/entity/{synapse_id}/schema/validation/invalid\", request_body\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.bind_json_schema","title":"bind_json_schema(json_schema_uri, entity)
","text":"Bind a JSON schema to an entity
:param json_schema_uri: JSON schema URI :type json_schema_uri: str :param entity: Synapse Entity or Synapse Id :type entity: str, Entity
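A minimal sketch, assuming js is a JsonSchemaService instance wired to a logged-in Synapse client; the schema URI and Synapse Id are placeholders, and either an Entity object or an Id string is accepted:
response = js.bind_json_schema(\"my.organization-my.schema\", \"syn123\")\n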
Source code insynapseclient/services/json_schema.py
def bind_json_schema(self, json_schema_uri: str, entity: Union[str, Entity]):\n \"\"\"Bind a JSON schema to an entity\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.bind_json_schema_to_entity(synapse_id, json_schema_uri)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_json_schema","title":"get_json_schema(entity)
","text":"Get a JSON schema associated to an Entity
:param entity: Synapse Entity or Synapse Id :type entity: str, Entity
Source code insynapseclient/services/json_schema.py
def get_json_schema(self, entity: Union[str, Entity]):\n \"\"\"Get a JSON schema associated to an Entity\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.get_json_schema_from_entity(synapse_id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.unbind_json_schema","title":"unbind_json_schema(entity)
","text":"Unbind a JSON schema from an entity
:param entity: Synapse Entity or Synapse Id :type entity: str, Entity
Source code insynapseclient/services/json_schema.py
def unbind_json_schema(self, entity: Union[str, Entity]):\n \"\"\"Unbind a JSON schema from an entity\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.delete_json_schema_from_entity(synapse_id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.validate","title":"validate(entity)
","text":"Validate an entity based on the bound JSON schema
:param entity: Synapse Entity or Synapse Id :type entity: str, Entity
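A minimal sketch of the bind-then-validate flow, assuming js is a JsonSchemaService instance wired to a logged-in Synapse client; validation results are computed server-side, so a freshly bound entity may not have results immediately:
js.bind_json_schema(\"my.organization-my.schema\", \"syn123\")\nresults = js.validate(\"syn123\")\nprint(results)\n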
Source code insynapseclient/services/json_schema.py
def validate(self, entity: Union[str, Entity]):\n \"\"\"Validate an entity based on the bound JSON schema\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.validate_entity_with_json_schema(synapse_id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.validation_stats","title":"validation_stats(entity)
","text":"Get validation statistics of an entity based on the bound JSON schema
:param entity: Synapse Entity or Synapse Id :type entity: str, Entity
Source code insynapseclient/services/json_schema.py
def validation_stats(self, entity: Union[str, Entity]):\n \"\"\"Get validation statistics of an entity based on the bound JSON schema\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.get_json_schema_validation_statistics(synapse_id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.validate_children","title":"validate_children(entity)
","text":"Validate an entity and it's children based on the bound JSON schema
:param entity: Synapse Entity or Synapse Id of a project or folder. :type entity: str, Entity
Source code insynapseclient/services/json_schema.py
def validate_children(self, entity: Union[str, Entity]):\n    \"\"\"Validate an entity and its children based on the bound JSON schema\n\n    :param entity: Synapse Entity or Synapse Id of a project or folder.\n    :type entity: str, Entity\n    \"\"\"\n    synapse_id = id_of(entity)\n    response = self.get_invalid_json_schema_validation(synapse_id)\n    return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema-functions","title":"Functions","text":""},{"location":"reference/link/","title":"Link","text":""},{"location":"reference/link/#synapseclient.entity.Link","title":"synapseclient.entity.Link
","text":" Bases: Entity
Represents a link in Synapse.
Links must have a target ID and a parent. When you do synapseclient.Synapse.get
on a Link object, the Link object is returned. If the target is desired, specify followLink=True in synapseclient.Synapse.get.
targetId
The ID of the entity to be linked
targetVersion
The version of the entity to be linked
parent
The parent project or folder
properties
A map of Synapse properties
annotations
A map of user defined annotations
local_state
Internal use only
Using this class
Creating an instance and storing the link
link = Link('targetID', parent=folder)\nlink = syn.store(link)\n
Source code in synapseclient/entity.py
class Link(Entity):\n \"\"\"\n Represents a link in Synapse.\n\n Links must have a target ID and a parent. When you do :py:func:`synapseclient.Synapse.get` on a Link object,\n the Link object is returned. If the target is desired, specify followLink=True in synapseclient.Synapse.get.\n\n Attributes:\n targetId: The ID of the entity to be linked\n targetVersion: The version of the entity to be linked\n parent: The parent project or folder\n properties: A map of Synapse properties\n annotations: A map of user defined annotations\n local_state: Internal use only\n\n\n Example: Using this class\n Creating an instance and storing the link\n\n link = Link('targetID', parent=folder)\n link = syn.store(link)\n \"\"\"\n\n _property_keys = Entity._property_keys + [\"linksTo\", \"linksToClassName\"]\n _local_keys = Entity._local_keys\n _synapse_entity_type = \"org.sagebionetworks.repo.model.Link\"\n\n def __init__(\n self,\n targetId=None,\n targetVersion=None,\n parent=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if targetId is not None and targetVersion is not None:\n kwargs[\"linksTo\"] = dict(\n targetId=utils.id_of(targetId), targetVersionNumber=targetVersion\n )\n elif targetId is not None and targetVersion is None:\n kwargs[\"linksTo\"] = dict(targetId=utils.id_of(targetId))\n elif properties is not None and \"linksTo\" in properties:\n pass\n else:\n raise SynapseMalformedEntityError(\"Must provide a target id\")\n super(Link, self).__init__(\n concreteType=Link._synapse_entity_type,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n
"},{"location":"reference/project/","title":"Project","text":""},{"location":"reference/project/#synapseclient.entity.Project","title":"synapseclient.entity.Project
","text":" Bases: Entity
Represents a project in Synapse.
Projects in Synapse must be uniquely named. Trying to create a project with a name that's already taken, say 'My project', will result in an error.
ATTRIBUTE DESCRIPTIONname
The name of the project
properties
A map of Synapse properties
annotations
A map of user defined annotations
local_state
Internal use only
Using this class
Creating an instance and storing the project
project = Project('Foobarbat project')\nproject = syn.store(project)\n
Source code in synapseclient/entity.py
class Project(Entity):\n \"\"\"\n Represents a project in Synapse.\n\n Projects in Synapse must be uniquely named. Trying to create a project with a name that's already taken, say\n 'My project', will result in an error\n\n Attributes:\n name: The name of the project\n properties: A map of Synapse properties\n annotations: A map of user defined annotations\n local_state: Internal use only\n\n\n Example: Using this class\n Creating an instance and storing the project\n\n project = Project('Foobarbat project')\n project = syn.store(project)\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.Project\"\n\n def __init__(\n self, name=None, properties=None, annotations=None, local_state=None, **kwargs\n ):\n if name:\n kwargs[\"name\"] = name\n super(Project, self).__init__(\n concreteType=Project._synapse_entity_type,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n **kwargs,\n )\n
"},{"location":"reference/synapse_utils/","title":"Synapse Utils","text":""},{"location":"reference/synapse_utils/#synapseutils","title":"synapseutils
","text":""},{"location":"reference/synapse_utils/#synapseutils--overview","title":"Overview","text":"The synapseutils
package provides both higher-level beta functions and utilities for interacting with Synapse. The behavior of these functions is subject to change.
synapseutils.sync
","text":""},{"location":"reference/synapse_utils/#synapseutils.sync-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.sync.syncFromSynapse","title":"syncFromSynapse(syn, entity, path=None, ifcollision='overwrite.local', allFiles=None, followLink=False, manifest='all', downloadFile=True)
","text":"Synchronizes all the files in a folder (including subfolders) from Synapse and adds a readme manifest with file metadata.
PARAMETER DESCRIPTIONsyn
A Synapse object with user's login, e.g. syn = synapseclient.login()
entity
A Synapse ID, a Synapse Entity object of type file, folder or project.
path
An optional path where the file hierarchy will be reproduced. If not specified the files will by default be placed in the synapseCache.
DEFAULT: None
ifcollision
Determines how to handle file collisions. May be \"overwrite.local\", \"keep.local\", or \"keep.both\".
DEFAULT: 'overwrite.local'
followLink
Determines whether the link returns the target Entity.
DEFAULT: False
manifest
Determines whether to create the manifest file automatically. Accepted values are all, root, and suppress.
DEFAULT: 'all'
downloadFile
Determines whether to download the files.
DEFAULT: True
List of entities (files, tables, links)
This function will crawl all subfolders of the project/folder specified by entity
and download all files that have not already been downloaded. If there are newer files in Synapse (or a local file has been edited outside of the cache) since the last download, the local file will be replaced by the new file unless \"ifcollision\" is changed.
If the files are being downloaded to a specific location outside of the Synapse cache a file (SYNAPSE_METADATA_MANIFEST.tsv) will also be added in the path that contains the metadata (annotations, storage location and provenance of all downloaded files).
See also:
Download and print the paths of all downloaded files:
entities = syncFromSynapse(syn, \"syn1234\")\nfor f in entities:\n print(f.path)\n
Source code in synapseutils/sync.py
@tracer.start_as_current_span(\"sync::syncFromSynapse\")\ndef syncFromSynapse(\n syn,\n entity,\n path=None,\n ifcollision=\"overwrite.local\",\n allFiles=None,\n followLink=False,\n manifest=\"all\",\n downloadFile=True,\n):\n \"\"\"Synchronizes all the files in a folder (including subfolders) from Synapse and adds a readme manifest with file\n metadata.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: A Synapse ID, a Synapse Entity object of type file, folder or project.\n path: An optional path where the file hierarchy will be reproduced. If not specified the files will by default be placed in the synapseCache.\n ifcollision: Determines how to handle file collisions. Maybe \"overwrite.local\", \"keep.local\", or \"keep.both\".\n followLink: Determines whether the link returns the target Entity.\n manifest: Determines whether creating manifest file automatically. The optional values here (`all`, `root`, `suppress`).\n downloadFile: Determines whether downloading the files.\n\n Returns:\n List of entities ([files][synapseclient.File], [tables][synapseclient.Table], [links][synapseclient.Link])\n\n\n This function will crawl all subfolders of the project/folder specified by `entity` and download all files that have\n not already been downloaded. If there are newer files in Synapse (or a local file has been edited outside of the\n cache) since the last download then local the file will be replaced by the new file unless \"ifcollision\" is changed.\n\n If the files are being downloaded to a specific location outside of the Synapse cache a file\n (SYNAPSE_METADATA_MANIFEST.tsv) will also be added in the path that contains the metadata (annotations, storage\n location and provenance of all downloaded files).\n\n See also:\n\n - [synapseutils.sync.syncToSynapse][]\n\n Example: Using this function\n Download and print the paths of all downloaded files:\n\n entities = syncFromSynapse(syn, \"syn1234\")\n for f in entities:\n print(f.path)\n \"\"\"\n\n if manifest not in (\"all\", \"root\", \"suppress\"):\n raise ValueError(\n 'Value of manifest option should be one of the (\"all\", \"root\", \"suppress\")'\n )\n\n # we'll have the following threads:\n # 1. the entrant thread to this function walks the folder hierarchy and schedules files for download,\n # and then waits for all the file downloads to complete\n # 2. each file download will run in a separate thread in an Executor\n # 3. downloads that support S3 multipart concurrent downloads will be scheduled by the thread in #2 and have\n # their parts downloaded in additional threads in the same Executor\n # To support multipart downloads in #3 using the same Executor as the download thread #2, we need at least\n # 2 threads always, if those aren't available then we'll run single threaded to avoid a deadlock\n with _sync_executor(syn) as executor:\n sync_from_synapse = _SyncDownloader(syn, executor)\n files = sync_from_synapse.sync(\n entity, path, ifcollision, followLink, downloadFile, manifest\n )\n\n # the allFiles parameter used to be passed in as part of the recursive implementation of this function\n # with the public signature invoking itself. now that this isn't a recursive any longer we don't need\n # allFiles as a parameter (especially on the public signature) but it is retained for now for backwards\n # compatibility with external invokers.\n if allFiles is not None:\n allFiles.extend(files)\n files = allFiles\n\n return files\n
"},{"location":"reference/synapse_utils/#synapseutils.sync.syncToSynapse","title":"syncToSynapse(syn, manifestFile, dryRun=False, sendMessages=True, retries=MAX_RETRIES)
","text":"Synchronizes files specified in the manifest file to Synapse.
Given a manifest file describing all of the uploads, this uploads the content to Synapse and optionally notifies you via Synapse messaging (email) at specific intervals, on errors, and on completion.
Read more about the manifest file format
PARAMETER DESCRIPTIONsyn
A Synapse object with user's login, e.g. syn = synapseclient.login()
manifestFile
A tsv file with file locations and metadata to be pushed to Synapse.
dryRun
Performs validation without uploading if set to True.
DEFAULT: False
sendMessages
Sends out messages on completion if set to True.
DEFAULT: True
None
None
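A minimal sketch, assuming manifest.tsv follows the manifest format linked above (at minimum path and parent columns):
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\n# validate the manifest without uploading anything\nsynapseutils.syncToSynapse(syn, \"manifest.tsv\", dryRun=True)\n# then perform the actual upload\nsynapseutils.syncToSynapse(syn, \"manifest.tsv\")\n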
Source code insynapseutils/sync.py
@tracer.start_as_current_span(\"sync::syncToSynapse\")\ndef syncToSynapse(\n syn, manifestFile, dryRun=False, sendMessages=True, retries=MAX_RETRIES\n) -> None:\n \"\"\"Synchronizes files specified in the manifest file to Synapse.\n\n Given a file describing all of the uploads uploads the content to Synapse and optionally notifies you via Synapse\n messagging (email) at specific intervals, on errors and on completion.\n\n [Read more about the manifest file format](../../explanations/manifest_tsv/)\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n manifestFile: A tsv file with file locations and metadata to be pushed to Synapse.\n dryRun: Performs validation without uploading if set to True.\n sendMessages: Sends out messages on completion if set to True.\n\n Returns:\n None\n \"\"\"\n df = readManifestFile(syn, manifestFile)\n # have to check all size of single file\n sizes = [\n os.stat(os.path.expandvars(os.path.expanduser(f))).st_size\n for f in df.path\n if not is_url(f)\n ]\n # Write output on what is getting pushed and estimated times - send out message.\n sys.stdout.write(\"=\" * 50 + \"\\n\")\n sys.stdout.write(\n \"We are about to upload %i files with a total size of %s.\\n \"\n % (len(df), utils.humanizeBytes(sum(sizes)))\n )\n sys.stdout.write(\"=\" * 50 + \"\\n\")\n\n if dryRun:\n return\n\n sys.stdout.write(\"Starting upload...\\n\")\n if sendMessages:\n notify_decorator = notifyMe(syn, \"Upload of %s\" % manifestFile, retries=retries)\n upload = notify_decorator(_manifest_upload)\n upload(syn, df)\n else:\n _manifest_upload(syn, df)\n
"},{"location":"reference/synapse_utils/#synapseutils.sync.generateManifest","title":"generateManifest(syn, allFiles, filename, provenance_cache=None)
","text":"Generates a manifest file based on a list of entities objects.
Read more about the manifest file format
PARAMETER DESCRIPTIONsyn
A Synapse object with user's login, e.g. syn = synapseclient.login()
allFiles
A list of File Entity objects on Synapse (can't be Synapse IDs)
filename
file where manifest will be written
provenance_cache
an optional dict of known provenance dicts keyed by entity ids
DEFAULT: None
None
None
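A minimal sketch, pairing this with syncFromSynapse so that File entity objects (not bare Synapse IDs) are in hand; the Synapse Id is a placeholder:
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nentities = synapseutils.syncFromSynapse(syn, \"syn1234\")\nsynapseutils.generateManifest(syn, entities, \"manifest.tsv\")\n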
Source code insynapseutils/sync.py
def generateManifest(syn, allFiles, filename, provenance_cache=None) -> None:\n \"\"\"Generates a manifest file based on a list of entities objects.\n\n [Read more about the manifest file format](../../explanations/manifest_tsv/)\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n allFiles: A list of File Entity objects on Synapse (can't be Synapse IDs)\n filename: file where manifest will be written\n provenance_cache: an optional dict of known provenance dicts keyed by entity ids\n\n Returns:\n None\n \"\"\"\n keys, data = _extract_file_entity_metadata(\n syn, allFiles, provenance_cache=provenance_cache\n )\n _write_manifest_data(filename, keys, data)\n
"},{"location":"reference/synapse_utils/#synapseutils.sync.generate_sync_manifest","title":"generate_sync_manifest(syn, directory_path, parent_id, manifest_path)
","text":"Generate manifest for syncToSynapse() from a local directory.
Read more about the manifest file format
PARAMETER DESCRIPTIONsyn
A Synapse object with user's login, e.g. syn = synapseclient.login()
directory_path
Path to local directory to be pushed to Synapse.
parent_id
Synapse ID of the parent folder/project on Synapse.
manifest_path
Path to the manifest file to be generated.
RETURNS DESCRIPTION
None
None
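A minimal sketch, assuming ./data is a local directory and syn999 is the target container; the generated file can be reviewed and edited before handing it to syncToSynapse:
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nsynapseutils.generate_sync_manifest(\n    syn,\n    directory_path=\"./data\",\n    parent_id=\"syn999\",\n    manifest_path=\"manifest.tsv\",\n)\n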
Source code insynapseutils/sync.py
@tracer.start_as_current_span(\"sync::generate_sync_manifest\")\ndef generate_sync_manifest(syn, directory_path, parent_id, manifest_path) -> None:\n \"\"\"Generate manifest for syncToSynapse() from a local directory.\n\n [Read more about the manifest file format](../../explanations/manifest_tsv/)\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n directory_path: Path to local directory to be pushed to Synapse.\n parent_id: Synapse ID of the parent folder/project on Synapse.\n manifest_path: Path to the manifest file to be generated.\n\n Returns:\n None\n \"\"\"\n manifest_cols = [\"path\", \"parent\"]\n manifest_rows = _walk_directory_tree(syn, directory_path, parent_id)\n _write_manifest_data(manifest_path, manifest_cols, manifest_rows)\n
"},{"location":"reference/synapse_utils/#synapseutils.sync.readManifestFile","title":"readManifestFile(syn, manifestFile)
","text":"Verifies a file manifest and returns a reordered dataframe ready for upload.
Read more about the manifest file format
PARAMETER DESCRIPTIONsyn
A Synapse object with user's login, e.g. syn = synapseclient.login()
manifestFile
A tsv file with file locations and metadata to be pushed to Synapse.
RETURNS DESCRIPTION
A pandas dataframe if the manifest is validated.
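A minimal sketch, useful for checking what a manifest will do before uploading; manifest.tsv is a placeholder path:
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\ndf = synapseutils.readManifestFile(syn, \"manifest.tsv\")\nprint(df[[\"path\", \"parent\"]].head())\n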
Source code insynapseutils/sync.py
@tracer.start_as_current_span(\"sync::readManifestFile\")\ndef readManifestFile(syn, manifestFile):\n \"\"\"Verifies a file manifest and returns a reordered dataframe ready for upload.\n\n [Read more about the manifest file format](../../explanations/manifest_tsv/)\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n manifestFile: A tsv file with file locations and metadata to be pushed to Synapse.\n\n Returns:\n A pandas dataframe if the manifest is validated.\n \"\"\"\n table.test_import_pandas()\n import pandas as pd\n\n if manifestFile is sys.stdin:\n sys.stdout.write(\"Validation and upload of: <stdin>\\n\")\n else:\n sys.stdout.write(\"Validation and upload of: %s\\n\" % manifestFile)\n # Read manifest file into pandas dataframe\n df = pd.read_csv(manifestFile, sep=\"\\t\")\n if \"synapseStore\" not in df:\n df = df.assign(synapseStore=None)\n df.loc[\n df[\"path\"].apply(is_url), \"synapseStore\"\n ] = False # override synapseStore values to False when path is a url\n df.loc[\n df[\"synapseStore\"].isnull(), \"synapseStore\"\n ] = True # remaining unset values default to True\n df.synapseStore = df.synapseStore.astype(bool)\n df = df.fillna(\"\")\n\n sys.stdout.write(\"Validating columns of manifest...\")\n for field in REQUIRED_FIELDS:\n sys.stdout.write(\".\")\n if field not in df.columns:\n sys.stdout.write(\"\\n\")\n raise ValueError(\"Manifest must contain a column of %s\" % field)\n sys.stdout.write(\"OK\\n\")\n\n sys.stdout.write(\"Validating that all paths exist...\")\n df.path = df.path.apply(_check_path_and_normalize)\n\n sys.stdout.write(\"OK\\n\")\n\n sys.stdout.write(\"Validating that all files are unique...\")\n # Both the path and the combination of entity name and parent must be unique\n if len(df.path) != len(set(df.path)):\n raise ValueError(\"All rows in manifest must contain a unique file to upload\")\n sys.stdout.write(\"OK\\n\")\n\n # Check each size of uploaded file\n sys.stdout.write(\"Validating that all the files are not empty...\")\n _check_size_each_file(df)\n sys.stdout.write(\"OK\\n\")\n\n # check the name of each file should be store on Synapse\n name_column = \"name\"\n # Create entity name column from basename\n if name_column not in df.columns:\n filenames = [os.path.basename(path) for path in df[\"path\"]]\n df[\"name\"] = filenames\n\n sys.stdout.write(\"Validating file names... \\n\")\n _check_file_name(df)\n sys.stdout.write(\"OK\\n\")\n\n sys.stdout.write(\"Validating provenance...\")\n df = _sortAndFixProvenance(syn, df)\n sys.stdout.write(\"OK\\n\")\n\n sys.stdout.write(\"Validating that parents exist and are containers...\")\n parents = set(df.parent)\n for synId in parents:\n try:\n container = syn.get(synId, downloadFile=False)\n except SynapseHTTPError:\n sys.stdout.write(\n \"\\n%s in the parent column is not a valid Synapse Id\\n\" % synId\n )\n raise\n if not is_container(container):\n sys.stdout.write(\n \"\\n%s in the parent column is is not a Folder or Project\\n\" % synId\n )\n raise SynapseHTTPError\n sys.stdout.write(\"OK\\n\")\n return df\n
"},{"location":"reference/synapse_utils/#synapseutils.copy_functions","title":"synapseutils.copy_functions
","text":""},{"location":"reference/synapse_utils/#synapseutils.copy_functions-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.copy_functions.copy","title":"copy(syn, entity, destinationId, skipCopyWikiPage=False, skipCopyAnnotations=False, **kwargs)
","text":"syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
entity
A synapse entity ID
destinationId
Synapse ID of a folder/project that the copied entity is being copied to
skipCopyWikiPage
Skip copying the wiki pages.
DEFAULT: False
skipCopyAnnotations
Skips copying the annotations.
DEFAULT: False
version
(File copy only) Can specify version of a file. Default to None
updateExisting
(File copy only) When the destination has an entity that has the same name, users can choose to update that entity. It must be the same entity type Default to False
setProvenance
(File copy only) Has three values to set the provenance of the copied entity: traceback: Sets to the source entity existing: Sets to source entity's original provenance (if it exists) None: No provenance is set
excludeTypes
(Folder/Project copy only) Accepts a list of entity types (file, table, link) which determines which entity types to not copy. Defaults to an empty list.
RETURNS DESCRIPTION
A mapping between the original and copied entity: {'syn1234':'syn33455'}
Using this functionSample copy:
import synapseutils\nimport synapseclient\nsyn = synapseclient.login()\nsynapseutils.copy(syn, ...)\n
Copying Files:
synapseutils.copy(syn, \"syn12345\", \"syn45678\", updateExisting=False, setProvenance = \"traceback\",version=None)\n
Copying Folders/Projects:
# This will copy everything in the project into the destinationId except files and tables.\nsynapseutils.copy(syn, \"syn123450\",\"syn345678\",excludeTypes=[\"file\",\"table\"])\n
Source code in synapseutils/copy_functions.py
def copy(\n syn,\n entity,\n destinationId,\n skipCopyWikiPage=False,\n skipCopyAnnotations=False,\n **kwargs,\n):\n \"\"\"\n - This function will assist users in copying entities (Tables, Links, Files, Folders, Projects),\n and will recursively copy everything in directories.\n - A Mapping of the old entities to the new entities will be created and all the wikis of each entity\n will also be copied over and links to synapse Ids will be updated.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: A synapse entity ID\n destinationId: Synapse ID of a folder/project that the copied entity is being copied to\n skipCopyWikiPage: Skip copying the wiki pages.\n skipCopyAnnotations: Skips copying the annotations.\n version: (File copy only) Can specify version of a file. Default to None\n updateExisting: (File copy only) When the destination has an entity that has the same name,\n users can choose to update that entity. It must be the same entity type\n Default to False\n setProvenance: (File copy only) Has three values to set the provenance of the copied entity:\n traceback: Sets to the source entity\n existing: Sets to source entity's original provenance (if it exists)\n None: No provenance is set\n excludeTypes: (Folder/Project copy only) Accepts a list of entity types (file, table, link) which determines\n which entity types to not copy. Defaults to an empty list.\n\n Returns:\n A mapping between the original and copied entity: {'syn1234':'syn33455'}\n\n Example: Using this function\n Sample copy:\n\n import synapseutils\n import synapseclient\n syn = synapseclient.login()\n synapseutils.copy(syn, ...)\n\n Copying Files:\n\n synapseutils.copy(syn, \"syn12345\", \"syn45678\", updateExisting=False, setProvenance = \"traceback\",version=None)\n\n Copying Folders/Projects:\n\n # This will copy everything in the project into the destinationId except files and tables.\n synapseutils.copy(syn, \"syn123450\",\"syn345678\",excludeTypes=[\"file\",\"table\"])\n \"\"\"\n updateLinks = kwargs.get(\"updateLinks\", True)\n updateSynIds = kwargs.get(\"updateSynIds\", True)\n entitySubPageId = kwargs.get(\"entitySubPageId\", None)\n destinationSubPageId = kwargs.get(\"destinationSubPageId\", None)\n\n mapping = _copyRecursive(\n syn, entity, destinationId, skipCopyAnnotations=skipCopyAnnotations, **kwargs\n )\n if not skipCopyWikiPage:\n for oldEnt in mapping:\n copyWiki(\n syn,\n oldEnt,\n mapping[oldEnt],\n entitySubPageId=entitySubPageId,\n destinationSubPageId=destinationSubPageId,\n updateLinks=updateLinks,\n updateSynIds=updateSynIds,\n entityMap=mapping,\n )\n return mapping\n
"},{"location":"reference/synapse_utils/#synapseutils.copy_functions.changeFileMetaData","title":"changeFileMetaData(syn, entity, downloadAs=None, contentType=None, forceVersion=True)
","text":"Change File Entity metadata like the download as name.
PARAMETER DESCRIPTIONsyn
A Synapse object with user's login, e.g. syn = synapseclient.login()
entity
Synapse entity Id or object.
contentType
Specify content type to change the content type of a filehandle.
DEFAULT: None
downloadAs
Specify filename to change the filename of a filehandle.
DEFAULT: None
forceVersion
Indicates whether the method should increment the version of the object even if nothing has changed. Defaults to True.
DEFAULT: True
Synapse Entity
Using this functionCan be used to change the filename or the file content-type without downloading:
file_entity = syn.get(synid)\nprint(os.path.basename(file_entity.path)) ## prints, e.g., \"my_file.txt\"\nfile_entity = synapseutils.changeFileMetaData(syn, file_entity, \"my_new_name_file.txt\")\n
Source code in synapseutils/copy_functions.py
def changeFileMetaData(\n syn, entity, downloadAs=None, contentType=None, forceVersion=True\n):\n \"\"\"\n Change File Entity metadata like the download as name.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: Synapse entity Id or object.\n contentType: Specify content type to change the content type of a filehandle.\n downloadAs: Specify filename to change the filename of a filehandle.\n forceVersion: Indicates whether the method should increment the version of the object even if nothing has changed. Defaults to True.\n\n Returns:\n Synapse Entity\n\n Example: Using this function\n Can be used to change the filename or the file content-type without downloading:\n\n file_entity = syn.get(synid)\n print(os.path.basename(file_entity.path)) ## prints, e.g., \"my_file.txt\"\n file_entity = synapseutils.changeFileMetaData(syn, file_entity, \"my_new_name_file.txt\")\n \"\"\"\n ent = syn.get(entity, downloadFile=False)\n fileResult = syn._getFileHandleDownload(ent.dataFileHandleId, ent.id)\n ent.contentType = ent.contentType if contentType is None else contentType\n downloadAs = (\n fileResult[\"fileHandle\"][\"fileName\"] if downloadAs is None else downloadAs\n )\n copiedFileHandle = copyFileHandles(\n syn,\n [ent.dataFileHandleId],\n [ent.concreteType.split(\".\")[-1]],\n [ent.id],\n [contentType],\n [downloadAs],\n )\n copyResult = copiedFileHandle[0]\n if copyResult.get(\"failureCode\") is not None:\n raise ValueError(\n \"%s dataFileHandleId: %s\"\n % (copyResult[\"failureCode\"], copyResult[\"originalFileHandleId\"])\n )\n ent.dataFileHandleId = copyResult[\"newFileHandle\"][\"id\"]\n ent = syn.store(ent, forceVersion=forceVersion)\n return ent\n
"},{"location":"reference/synapse_utils/#synapseutils.copy_functions.copyFileHandles","title":"copyFileHandles(syn, fileHandles, associateObjectTypes, associateObjectIds, newContentTypes=None, newFileNames=None)
","text":"Given a list of fileHandle Ids or Objects, copy the fileHandles
PARAMETER DESCRIPTIONsyn
A Synapse object with user's login, e.g. syn = synapseclient.login()
fileHandles
List of fileHandle Ids or Objects
associateObjectTypes
List of associated object types: FileEntity, TableEntity, WikiAttachment, UserProfileAttachment, MessageAttachment, TeamAttachment, SubmissionAttachment, VerificationSubmission (Must be the same length as fileHandles)
associateObjectIds
List of associated object Ids: If copying a file, the objectId is the synapse id, and if copying a wiki attachment, the object id is the wiki subpage id. (Must be the same length as fileHandles)
newContentTypes
(Optional) List of content types. Set each item to a new content type for each file handle, or leave the item as None to keep the original content type. Default None, which keeps all original content types.
DEFAULT: None
newFileNames
(Optional) List of filenames. Set each item to a new filename for each file handle, or leave the item as None to keep the original name. Default None, which keeps all original file names.
DEFAULT: None
List of batch filehandle copy results, can include failureCodes: UNAUTHORIZED and NOT_FOUND
RAISES DESCRIPTIONValueError
If the lengths of all input arguments are not the same
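A minimal sketch copying a single file's file handle, assuming file_entity was fetched with syn.get; per the argument descriptions, all five lists must line up index by index:
results = synapseutils.copyFileHandles(\n    syn,\n    fileHandles=[file_entity.dataFileHandleId],\n    associateObjectTypes=[\"FileEntity\"],\n    associateObjectIds=[file_entity.id],\n    newContentTypes=[None],  # keep the original content type\n    newFileNames=[\"renamed.txt\"],  # illustrative new name\n)\ncopy_result = results[0]\nif copy_result.get(\"failureCode\") is None:\n    new_handle_id = copy_result[\"newFileHandle\"][\"id\"]\n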
Source code insynapseutils/copy_functions.py
def copyFileHandles(\n syn,\n fileHandles,\n associateObjectTypes,\n associateObjectIds,\n newContentTypes=None,\n newFileNames=None,\n):\n \"\"\"\n Given a list of fileHandle Ids or Objects, copy the fileHandles\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n fileHandles: List of fileHandle Ids or Objects\n associateObjectTypes: List of associated object types: FileEntity, TableEntity, WikiAttachment,\n UserProfileAttachment, MessageAttachment, TeamAttachment, SubmissionAttachment,\n VerificationSubmission (Must be the same length as fileHandles)\n associateObjectIds: List of associated object Ids: If copying a file, the objectId is the synapse id,\n and if copying a wiki attachment, the object id is the wiki subpage id.\n (Must be the same length as fileHandles)\n newContentTypes: (Optional) List of content types. Set each item to a new content type for each file\n handle, or leave the item as None to keep the original content type. Default None,\n which keeps all original content types.\n newFileNames: (Optional) List of filenames. Set each item to a new filename for each file handle,\n or leave the item as None to keep the original name. Default None, which keeps all\n original file names.\n\n Returns:\n List of batch filehandle copy results, can include failureCodes: UNAUTHORIZED and NOT_FOUND\n\n Raises:\n ValueError: If length of all input arguments are not the same\n \"\"\"\n\n # Check if length of all inputs are equal\n if not (\n len(fileHandles) == len(associateObjectTypes) == len(associateObjectIds)\n and (newContentTypes is None or len(newContentTypes) == len(associateObjectIds))\n and (newFileNames is None or len(newFileNames) == len(associateObjectIds))\n ):\n raise ValueError(\"Length of all input arguments must be the same\")\n\n # If no optional params passed, assign to empty list\n if newContentTypes is None:\n newContentTypes = []\n if newFileNames is None:\n newFileNames = []\n\n # Remove this line if we change API to only take fileHandleIds and not Objects\n file_handle_ids = [synapseclient.core.utils.id_of(handle) for handle in fileHandles]\n\n # division logic for POST call here\n master_copy_results_list = [] # list which holds all results from POST call\n for (\n batch_file_handles_ids,\n batch_assoc_obj_types,\n batch_assoc_obj_ids,\n batch_con_type,\n batch_file_name,\n ) in _batch_iterator_generator(\n [\n file_handle_ids,\n associateObjectTypes,\n associateObjectIds,\n newContentTypes,\n newFileNames,\n ],\n MAX_FILE_HANDLE_PER_COPY_REQUEST,\n ):\n batch_copy_results = _copy_file_handles_batch(\n syn,\n batch_file_handles_ids,\n batch_assoc_obj_types,\n batch_assoc_obj_ids,\n batch_con_type,\n batch_file_name,\n )\n master_copy_results_list.extend(batch_copy_results)\n\n return master_copy_results_list\n
"},{"location":"reference/synapse_utils/#synapseutils.copy_functions.copyWiki","title":"copyWiki(syn, entity, destinationId, entitySubPageId=None, destinationSubPageId=None, updateLinks=True, updateSynIds=True, entityMap=None)
","text":"Copies wikis and updates internal links
PARAMETER DESCRIPTIONsyn
A Synapse object with user's login, e.g. syn = synapseclient.login()
entity
A synapse ID of an entity whose wiki you want to copy
destinationId
Synapse ID of a folder/project that the wiki wants to be copied to
updateLinks
Update all the internal links. (e.g. syn1234/wiki/34345 becomes syn3345/wiki/49508)
DEFAULT: True
updateSynIds
Update all the Synapse IDs referenced in the wikis (e.g. syn1234 becomes syn2345). Defaults to True, but requires an entityMap.
DEFAULT: True
entityMap
An entity map {'oldSynId': 'newSynId'} used to update the Synapse IDs referenced in the wiki.
DEFAULT: None
entitySubPageId
Can specify a subPageId to copy all of its subwikis. Defaults to None, which copies the entire wiki. The subPageId can be found in the URL, e.g. https://www.synapse.org/#!Synapse:syn123/wiki/1234, where 1234 is the subPageId.
DEFAULT: None
destinationSubPageId
Can specify destination subPageId to copy wikis to.
DEFAULT: None
A list of Objects with three fields: id, title and parentId.
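A minimal sketch, assuming syn123 has a wiki and syn456 is the destination project; both Ids are placeholders:
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nnew_headers = synapseutils.copyWiki(syn, \"syn123\", \"syn456\")\nfor header in new_headers:\n    print(header[\"id\"], header.get(\"title\"))\n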
Source code insynapseutils/copy_functions.py
def copyWiki(\n syn,\n entity,\n destinationId,\n entitySubPageId=None,\n destinationSubPageId=None,\n updateLinks=True,\n updateSynIds=True,\n entityMap=None,\n):\n \"\"\"\n Copies wikis and updates internal links\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: A synapse ID of an entity whose wiki you want to copy\n destinationId: Synapse ID of a folder/project that the wiki wants to be copied to\n updateLinks: Update all the internal links. (e.g. syn1234/wiki/34345 becomes syn3345/wiki/49508)\n updateSynIds: Update all the synapse ID's referenced in the wikis. (e.g. syn1234 becomes syn2345)\n Defaults to True but needs an entityMap\n entityMap: An entity map {'oldSynId','newSynId'} to update the synapse IDs referenced in the wiki.\n entitySubPageId: Can specify subPageId and copy all of its subwikis\n Defaults to None, which copies the entire wiki subPageId can be found:\n https://www.synapse.org/#!Synapse:syn123/wiki/1234\n In this case, 1234 is the subPageId.\n destinationSubPageId: Can specify destination subPageId to copy wikis to.\n\n Returns:\n A list of Objects with three fields: id, title and parentId.\n \"\"\"\n\n # Validate input parameters\n if entitySubPageId:\n entitySubPageId = str(int(entitySubPageId))\n if destinationSubPageId:\n destinationSubPageId = str(int(destinationSubPageId))\n\n oldOwn = syn.get(entity, downloadFile=False)\n # getWikiHeaders fails when there is no wiki\n\n try:\n oldWikiHeaders = syn.getWikiHeaders(oldOwn)\n except SynapseHTTPError as e:\n if e.response.status_code == 404:\n return []\n else:\n raise e\n\n newOwn = syn.get(destinationId, downloadFile=False)\n wikiIdMap = dict()\n newWikis = dict()\n # If entitySubPageId is given but not destinationSubPageId, set the pageId to \"\" (will get the root page)\n # A entitySubPage could be copied to a project without any wiki pages, this has to be checked\n newWikiPage = None\n if destinationSubPageId:\n try:\n newWikiPage = syn.getWiki(newOwn, destinationSubPageId)\n except SynapseHTTPError as e:\n if e.response.status_code == 404:\n pass\n else:\n raise e\n if entitySubPageId:\n oldWikiHeaders = _getSubWikiHeaders(oldWikiHeaders, entitySubPageId)\n\n if not oldWikiHeaders:\n return []\n\n for wikiHeader in oldWikiHeaders:\n wiki = syn.getWiki(oldOwn, wikiHeader[\"id\"])\n syn.logger.info(\"Got wiki %s\" % wikiHeader[\"id\"])\n if not wiki.get(\"attachmentFileHandleIds\"):\n new_file_handles = []\n else:\n results = [\n syn._getFileHandleDownload(\n filehandleId, wiki.id, objectType=\"WikiAttachment\"\n )\n for filehandleId in wiki[\"attachmentFileHandleIds\"]\n ]\n # Get rid of the previews\n nopreviews = [\n attach[\"fileHandle\"]\n for attach in results\n if not attach[\"fileHandle\"][\"isPreview\"]\n ]\n contentTypes = [attach[\"contentType\"] for attach in nopreviews]\n fileNames = [attach[\"fileName\"] for attach in nopreviews]\n copiedFileHandles = copyFileHandles(\n syn,\n nopreviews,\n [\"WikiAttachment\"] * len(nopreviews),\n [wiki.id] * len(nopreviews),\n contentTypes,\n fileNames,\n )\n # Check if failurecodes exist\n for filehandle in copiedFileHandles:\n if filehandle.get(\"failureCode\") is not None:\n raise ValueError(\n \"%s dataFileHandleId: %s\"\n % (\n filehandle[\"failureCode\"],\n filehandle[\"originalFileHandleId\"],\n )\n )\n new_file_handles = [\n filehandle[\"newFileHandle\"][\"id\"] for filehandle in copiedFileHandles\n ]\n # for some reason some wikis don't have titles?\n if hasattr(wikiHeader, \"parentId\"):\n newWikiPage 
= Wiki(\n owner=newOwn,\n title=wiki.get(\"title\", \"\"),\n markdown=wiki.markdown,\n fileHandles=new_file_handles,\n parentWikiId=wikiIdMap[wiki.parentWikiId],\n )\n newWikiPage = syn.store(newWikiPage)\n else:\n if destinationSubPageId is not None and newWikiPage is not None:\n newWikiPage[\"attachmentFileHandleIds\"] = new_file_handles\n newWikiPage[\"markdown\"] = wiki[\"markdown\"]\n newWikiPage[\"title\"] = wiki.get(\"title\", \"\")\n # Need to add logic to update titles here\n newWikiPage = syn.store(newWikiPage)\n else:\n newWikiPage = Wiki(\n owner=newOwn,\n title=wiki.get(\"title\", \"\"),\n markdown=wiki.markdown,\n fileHandles=new_file_handles,\n parentWikiId=destinationSubPageId,\n )\n newWikiPage = syn.store(newWikiPage)\n newWikis[newWikiPage[\"id\"]] = newWikiPage\n wikiIdMap[wiki[\"id\"]] = newWikiPage[\"id\"]\n\n if updateLinks:\n syn.logger.info(\"Updating internal links:\\n\")\n newWikis = _updateInternalLinks(newWikis, wikiIdMap, entity, destinationId)\n syn.logger.info(\"Done updating internal links.\\n\")\n\n if updateSynIds and entityMap is not None:\n syn.logger.info(\"Updating Synapse references:\\n\")\n newWikis = _updateSynIds(newWikis, wikiIdMap, entityMap)\n syn.logger.info(\"Done updating Synapse IDs.\\n\")\n\n syn.logger.info(\"Storing new Wikis\\n\")\n for oldWikiId in wikiIdMap.keys():\n newWikiId = wikiIdMap[oldWikiId]\n newWikis[newWikiId] = syn.store(newWikis[newWikiId])\n syn.logger.info(\"\\tStored: %s\\n\" % newWikiId)\n return syn.getWikiHeaders(newOwn)\n
"},{"location":"reference/synapse_utils/#synapseutils.walk_functions","title":"synapseutils.walk_functions
","text":""},{"location":"reference/synapse_utils/#synapseutils.walk_functions-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.walk_functions.walk","title":"walk(syn, synId, includeTypes=['folder', 'file', 'table', 'link', 'entityview', 'dockerrepo', 'submissionview', 'dataset', 'materializedview'])
","text":"Traverse through the hierarchy of files and folders stored under the synId. Has the same behavior as os.walk()
PARAMETER DESCRIPTIONsyn
A Synapse object with user's login, e.g. syn = synapseclient.login()
synId
A synapse ID of a folder or project
includeTypes
Must be a list of entity types (i.e. [\"file\", \"table\"]). The \"folder\" type is always included so the hierarchy can be traversed.
DEFAULT: ['folder', 'file', 'table', 'link', 'entityview', 'dockerrepo', 'submissionview', 'dataset', 'materializedview']
Traversing through a project and printing out the directory path, folders, and files
walkedPath = walk(syn, \"syn1234\", [\"file\"]) #Exclude tables and views\n\nfor dirpath, dirname, filename in walkedPath:\n print(dirpath)\n print(dirname) #All the folders in the directory path\n print(filename) #All the files in the directory path\n
Source code in synapseutils/walk_functions.py
def walk(\n syn,\n synId,\n includeTypes=[\n \"folder\",\n \"file\",\n \"table\",\n \"link\",\n \"entityview\",\n \"dockerrepo\",\n \"submissionview\",\n \"dataset\",\n \"materializedview\",\n ],\n):\n \"\"\"\n Traverse through the hierarchy of files and folders stored under the synId. Has the same behavior as os.walk()\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n synId: A synapse ID of a folder or project\n includeTypes: Must be a list of entity types (ie.[\"file\", \"table\"])\n The \"folder\" type is always included so the hierarchy can be traversed\n\n Example: Using this function\n Traversing through a project and printing out the directory path, folders, and files\n\n walkedPath = walk(syn, \"syn1234\", [\"file\"]) #Exclude tables and views\n\n for dirpath, dirname, filename in walkedPath:\n print(dirpath)\n print(dirname) #All the folders in the directory path\n print(filename) #All the files in the directory path\n\n \"\"\"\n # Ensure that \"folder\" is included so the hierarchy can be traversed\n if \"folder\" not in includeTypes:\n includeTypes.append(\"folder\")\n return _helpWalk(syn, synId, includeTypes)\n
"},{"location":"reference/synapse_utils/#synapseutils.monitor","title":"synapseutils.monitor
","text":""},{"location":"reference/synapse_utils/#synapseutils.monitor-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.monitor.notifyMe","title":"notifyMe(syn, messageSubject='', retries=0)
","text":"Function decorator that notifies you via email whenever an function completes running or there is a failure.
PARAMETER DESCRIPTIONsyn
A Synapse object with user's login, e.g. syn = synapseclient.login()
messageSubject
A string with subject line for sent out messages.
DEFAULT: ''
retries
Number of retries to attempt on failure
DEFAULT: 0
As a decorator:
# to decorate a function that you define\nfrom synapseutils import notifyMe\nimport synapseclient\nsyn = synapseclient.login()\n\n@notifyMe(syn, 'Long running function', retries=2)\ndef my_function(x):\n doing_something()\n return long_runtime_func(x)\n\nmy_function(123)\n
Wrapping a function:
# to wrap a function that already exists\nfrom synapseutils import notifyMe\nimport synapseclient\nsyn = synapseclient.login()\n\nnotify_decorator = notifyMe(syn, 'Long running query', retries=2)\nmy_query = notify_decorator(syn.tableQuery)\nresults = my_query(\"select id from syn1223\")\n
Source code in synapseutils/monitor.py
def notifyMe(syn, messageSubject=\"\", retries=0):\n \"\"\"Function decorator that notifies you via email whenever an function completes running or there is a failure.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n messageSubject: A string with subject line for sent out messages.\n retries: Number of retries to attempt on failure\n\n Example: Using this function\n As a decorator:\n\n # to decorate a function that you define\n from synapseutils import notifyMe\n import synapseclient\n syn = synapseclient.login()\n\n @notifyMe(syn, 'Long running function', retries=2)\n def my_function(x):\n doing_something()\n return long_runtime_func(x)\n\n my_function(123)\n\n Wrapping a function:\n\n # to wrap a function that already exists\n from synapseutils import notifyMe\n import synapseclient\n syn = synapseclient.login()\n\n notify_decorator = notifyMe(syn, 'Long running query', retries=2)\n my_query = notify_decorator(syn.tableQuery)\n results = my_query(\"select id from syn1223\")\n \"\"\"\n\n def notify_decorator(func):\n @functools.wraps(func)\n def with_retry_and_messaging(*args, **kwargs):\n attempt = 0\n destination = syn.getUserProfile()[\"ownerId\"]\n while attempt <= retries:\n try:\n output = func(*args, **kwargs)\n syn.sendMessage(\n [destination],\n messageSubject,\n messageBody=\"Call to %s completed successfully!\"\n % func.__name__,\n )\n return output\n except Exception as e:\n sys.stderr.write(traceback.format_exc())\n syn.sendMessage(\n [destination],\n messageSubject,\n messageBody=(\n \"Encountered a temporary Failure during upload. \"\n \"Will retry %i more times. \\n\\n Error message was:\\n%s\\n\\n%s\"\n % (retries - attempt, e, traceback.format_exc())\n ),\n )\n attempt += 1\n\n return with_retry_and_messaging\n\n return notify_decorator\n
"},{"location":"reference/synapse_utils/#synapseutils.monitor.with_progress_bar","title":"with_progress_bar(func, totalCalls, prefix='', postfix='', isBytes=False)
","text":"Wraps a function to add a progress bar based on the number of calls to that function.
PARAMETER DESCRIPTIONfunc
Function being wrapped with progress Bar
totalCalls
total number of items/bytes when completed
prefix
String printed before progress bar
DEFAULT: ''
postfix
String printed after progress bar
DEFAULT: ''
isBytes
A boolean indicating whether to convert bytes to kB, MB, GB, etc.
DEFAULT: False
A wrapped function that contains a progress bar
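A minimal sketch wrapping a per-item operation so that each call advances the bar; process_item is a hypothetical function standing in for real work:
from synapseutils.monitor import with_progress_bar\n\ndef process_item(item):\n    pass  # hypothetical per-item work\n\nitems = [\"a\", \"b\", \"c\"]\nwrapped = with_progress_bar(process_item, len(items), prefix=\"Processing \")\nfor item in items:\n    wrapped(item)\n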
Source code insynapseutils/monitor.py
def with_progress_bar(func, totalCalls, prefix=\"\", postfix=\"\", isBytes=False):\n    \"\"\"Wraps a function to add a progress bar based on the number of calls to that function.\n\n    Arguments:\n        func: Function being wrapped with progress Bar\n        totalCalls: total number of items/bytes when completed\n        prefix: String printed before progress bar\n        postfix: String printed after progress bar\n        isBytes: A boolean indicating whether to convert bytes to kB, MB, GB etc.\n\n    Returns:\n        A wrapped function that contains a progress bar\n    \"\"\"\n    completed = Value(\"d\", 0)\n    lock = Lock()\n\n    def progress(*args, **kwargs):\n        with lock:\n            completed.value += 1\n            printTransferProgress(completed.value, totalCalls, prefix, postfix, isBytes)\n        return func(*args, **kwargs)\n\n    return progress\n
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions","title":"synapseutils.migrate_functions
","text":""},{"location":"reference/synapse_utils/#synapseutils.migrate_functions-classes","title":"Classes","text":""},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.MigrationResult","title":"MigrationResult
","text":"A MigrationResult is a proxy object to the underlying sqlite db. It provides a programmatic interface that allows the caller to iterate over the file handles that were migrated without having to connect to or know the schema of the sqlite db, and also avoids the potential memory liability of putting everything into an in memory data structure that could be a liability when migrating a huge project of hundreds of thousands/millions of entities.
This proxy object is not thread safe, since it accesses an underlying sqlite db.
Source code insynapseutils/migrate_functions.py
class MigrationResult:\n \"\"\"A MigrationResult is a proxy object to the underlying sqlite db.\n It provides a programmatic interface that allows the caller to iterate over the\n file handles that were migrated without having to connect to or know the schema\n of the sqlite db, and also avoids the potential memory liability of putting\n everything into an in memory data structure that could be a liability when\n migrating a huge project of hundreds of thousands/millions of entities.\n\n As this proxy object is not thread safe since it accesses an underlying sqlite db.\n \"\"\"\n\n def __init__(self, syn, db_path):\n self._syn = syn\n self.db_path = db_path\n\n def get_counts_by_status(self):\n \"\"\"\n Returns a dictionary of counts by the migration status of each indexed file/version.\n Keys are as follows:\n\n - `INDEXED` - the file/version has been indexed and will be migrated on a call to migrate_indexed_files\n - `MIGRATED` - the file/version has been migrated\n - `ALREADY_MIGRATED` - the file/version was already stored at the target storage location and no migration is needed\n - `ERRORED` - an error occurred while indexing or migrating the file/version\n \"\"\" # noqa\n import sqlite3\n\n with sqlite3.connect(self.db_path) as conn:\n cursor = conn.cursor()\n\n # for the purposes of these counts, containers (Projects and Folders) do not count.\n # we are counting actual files only\n result = cursor.execute(\n \"select status, count(*) from migrations where type in (?, ?) group by status\",\n (_MigrationType.FILE.value, _MigrationType.TABLE_ATTACHED_FILE.value),\n )\n\n counts_by_status = {status.name: 0 for status in _MigrationStatus}\n for row in result:\n status = row[0]\n count = row[1]\n counts_by_status[_MigrationStatus(status).name] = count\n\n return counts_by_status\n\n def get_migrations(self):\n \"\"\"\n A generator yielding each file/version in the migration index.\n A dictionary of the properties of the migration row is yielded as follows\n\n Yields:\n id: the Synapse id\n type: the concrete type of the entity\n version: the verson of the file entity (if applicable)\n row_id: the row of the table attached file (if applicable)\n col_id: the column id of the table attached file (if applicable)\n from_storage_location_id: - the previous storage location id where the file/version was stored\n from_file_handle_id: the id file handle of the existing file/version\n to_file_handle_id: if migrated, the new file handle id\n status: one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version\n exception: if an error was encountered indexing/migrating the file/version its stack is here\n \"\"\"\n import sqlite3\n\n with sqlite3.connect(self.db_path) as conn:\n cursor = conn.cursor()\n\n last_id = None\n column_names = None\n\n rowid = -1\n while True:\n results = cursor.execute(\n \"\"\"\n select\n rowid,\n\n id,\n type,\n version,\n row_id,\n col_id,\n from_storage_location_id,\n from_file_handle_id,\n to_file_handle_id,\n file_size,\n status,\n exception\n from migrations\n where\n rowid > ?\n and type in (?, ?)\n order by\n rowid\n limit ?\n \"\"\",\n (\n rowid,\n _MigrationType.FILE.value,\n _MigrationType.TABLE_ATTACHED_FILE.value,\n _get_batch_size(),\n ),\n )\n\n row_count = 0\n for row in results:\n row_count += 1\n\n # using the internal sqlite rowid for ordering only\n rowid = row[0]\n\n # exclude the sqlite internal rowid\n row_dict = _get_row_dict(cursor, row, False)\n entity_id = row_dict[\"id\"]\n if entity_id != last_id:\n # if the 
next row is dealing with a different entity than the last table\n # id then we discard any cached column names we looked up\n column_names = {}\n\n row_dict[\"type\"] = (\n \"file\"\n if row_dict[\"type\"] == _MigrationType.FILE.value\n else \"table\"\n )\n\n for int_arg in (\n \"version\",\n \"row_id\",\n \"from_storage_location_id\",\n \"from_file_handle_id\",\n \"to_file_handle_id\",\n ):\n int_val = row_dict.get(int_arg)\n if int_val is not None:\n row_dict[int_arg] = int(int_val)\n\n col_id = row_dict.pop(\"col_id\", None)\n if col_id is not None:\n column_name = column_names.get(col_id)\n\n # for usability we look up the actual column name from the id,\n # but that involves a lookup so we cache them for re-use across\n # rows that deal with the same table entity\n if column_name is None:\n column = self._syn.restGET(\"/column/{}\".format(col_id))\n column_name = column_names[col_id] = column[\"name\"]\n\n row_dict[\"col_name\"] = column_name\n\n row_dict[\"status\"] = _MigrationStatus(row_dict[\"status\"]).name\n\n yield row_dict\n\n last_id = entity_id\n\n if row_count == 0:\n # out of rows\n break\n\n def as_csv(self, path):\n \"\"\"\n Output a flat csv file of the contents of the Migration index.\n\n Arguments:\n path: The path to the csv file to be created\n\n Returns:\n None: But a csv file is created at the given path with the following columns:\n id: the Synapse id\n type: the concrete type of the entity\n version: the verson of the file entity (if applicable)\n row_id: the row of the table attached file (if applicable)\n col_name: the column name of the column the table attached file resides in (if applicable)\n from_storage_location_id: the previous storage location id where the file/version was stored\n from_file_handle_id: the id file handle of the existing file/version\n to_file_handle_id: if migrated, the new file handle id\n status: one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version\n exception: if an error was encountered indexing/migrating the file/version its stack is here\n\n \"\"\"\n\n with open(path, \"w\", newline=\"\") as csv_file:\n csv_writer = csv.writer(csv_file)\n\n # headers\n csv_writer.writerow(\n [\n \"id\",\n \"type\",\n \"version\",\n \"row_id\",\n \"col_name\",\n \"from_storage_location_id\",\n \"from_file_handle_id\",\n \"to_file_handle_id\",\n \"status\",\n \"exception\",\n ]\n )\n\n for row_dict in self.get_migrations():\n row_data = [\n row_dict[\"id\"],\n row_dict[\"type\"],\n row_dict.get(\"version\"),\n row_dict.get(\"row_id\"),\n row_dict.get(\"col_name\"),\n row_dict.get(\"from_storage_location_id\"),\n row_dict.get(\"from_file_handle_id\"),\n row_dict.get(\"to_file_handle_id\"),\n row_dict[\"status\"],\n row_dict.get(\"exception\"),\n ]\n\n csv_writer.writerow(row_data)\n
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.MigrationResult-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.MigrationResult.get_counts_by_status","title":"get_counts_by_status()
","text":"Returns a dictionary of counts by the migration status of each indexed file/version. Keys are as follows:
INDEXED - the file/version has been indexed and will be migrated on a call to migrate_indexed_files
MIGRATED - the file/version has been migrated
ALREADY_MIGRATED - the file/version was already stored at the target storage location and no migration is needed
ERRORED - an error occurred while indexing or migrating the file/version
Source code in synapseutils/migrate_functions.py
def get_counts_by_status(self):\n \"\"\"\n Returns a dictionary of counts by the migration status of each indexed file/version.\n Keys are as follows:\n\n - `INDEXED` - the file/version has been indexed and will be migrated on a call to migrate_indexed_files\n - `MIGRATED` - the file/version has been migrated\n - `ALREADY_MIGRATED` - the file/version was already stored at the target storage location and no migration is needed\n - `ERRORED` - an error occurred while indexing or migrating the file/version\n \"\"\" # noqa\n import sqlite3\n\n with sqlite3.connect(self.db_path) as conn:\n cursor = conn.cursor()\n\n # for the purposes of these counts, containers (Projects and Folders) do not count.\n # we are counting actual files only\n result = cursor.execute(\n \"select status, count(*) from migrations where type in (?, ?) group by status\",\n (_MigrationType.FILE.value, _MigrationType.TABLE_ATTACHED_FILE.value),\n )\n\n counts_by_status = {status.name: 0 for status in _MigrationStatus}\n for row in result:\n status = row[0]\n count = row[1]\n counts_by_status[_MigrationStatus(status).name] = count\n\n return counts_by_status\n
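For example, given a MigrationResult (a minimal sketch; result is assumed to be the value returned by index_files_for_migration or migrate_indexed_files, and the printed counts are illustrative only):
counts = result.get_counts_by_status()\nprint(counts)\n# illustrative output: {'INDEXED': 100, 'MIGRATED': 0, 'ALREADY_MIGRATED': 0, 'ERRORED': 0}\n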
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.MigrationResult.get_migrations","title":"get_migrations()
","text":"A generator yielding each file/version in the migration index. A dictionary of the properties of the migration row is yielded as follows
YIELDS DESCRIPTION
id
the Synapse id
type
the concrete type of the entity
version
the version of the file entity (if applicable)
row_id
the row of the table attached file (if applicable)
col_id
the column id of the table attached file (if applicable)
from_storage_location_id
the previous storage location id where the file/version was stored
from_file_handle_id
the file handle id of the existing file/version
to_file_handle_id
if migrated, the new file handle id
status
one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version
exception
if an error was encountered indexing/migrating the file/version, its stack trace is here
Source code in synapseutils/migrate_functions.py
def get_migrations(self):\n \"\"\"\n A generator yielding each file/version in the migration index.\n A dictionary of the properties of the migration row is yielded as follows\n\n Yields:\n id: the Synapse id\n type: the concrete type of the entity\n version: the verson of the file entity (if applicable)\n row_id: the row of the table attached file (if applicable)\n col_id: the column id of the table attached file (if applicable)\n from_storage_location_id: - the previous storage location id where the file/version was stored\n from_file_handle_id: the id file handle of the existing file/version\n to_file_handle_id: if migrated, the new file handle id\n status: one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version\n exception: if an error was encountered indexing/migrating the file/version its stack is here\n \"\"\"\n import sqlite3\n\n with sqlite3.connect(self.db_path) as conn:\n cursor = conn.cursor()\n\n last_id = None\n column_names = None\n\n rowid = -1\n while True:\n results = cursor.execute(\n \"\"\"\n select\n rowid,\n\n id,\n type,\n version,\n row_id,\n col_id,\n from_storage_location_id,\n from_file_handle_id,\n to_file_handle_id,\n file_size,\n status,\n exception\n from migrations\n where\n rowid > ?\n and type in (?, ?)\n order by\n rowid\n limit ?\n \"\"\",\n (\n rowid,\n _MigrationType.FILE.value,\n _MigrationType.TABLE_ATTACHED_FILE.value,\n _get_batch_size(),\n ),\n )\n\n row_count = 0\n for row in results:\n row_count += 1\n\n # using the internal sqlite rowid for ordering only\n rowid = row[0]\n\n # exclude the sqlite internal rowid\n row_dict = _get_row_dict(cursor, row, False)\n entity_id = row_dict[\"id\"]\n if entity_id != last_id:\n # if the next row is dealing with a different entity than the last table\n # id then we discard any cached column names we looked up\n column_names = {}\n\n row_dict[\"type\"] = (\n \"file\"\n if row_dict[\"type\"] == _MigrationType.FILE.value\n else \"table\"\n )\n\n for int_arg in (\n \"version\",\n \"row_id\",\n \"from_storage_location_id\",\n \"from_file_handle_id\",\n \"to_file_handle_id\",\n ):\n int_val = row_dict.get(int_arg)\n if int_val is not None:\n row_dict[int_arg] = int(int_val)\n\n col_id = row_dict.pop(\"col_id\", None)\n if col_id is not None:\n column_name = column_names.get(col_id)\n\n # for usability we look up the actual column name from the id,\n # but that involves a lookup so we cache them for re-use across\n # rows that deal with the same table entity\n if column_name is None:\n column = self._syn.restGET(\"/column/{}\".format(col_id))\n column_name = column_names[col_id] = column[\"name\"]\n\n row_dict[\"col_name\"] = column_name\n\n row_dict[\"status\"] = _MigrationStatus(row_dict[\"status\"]).name\n\n yield row_dict\n\n last_id = entity_id\n\n if row_count == 0:\n # out of rows\n break\n
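For example, to review any failed rows one by one (a sketch; result is again assumed to be a MigrationResult):
for row in result.get_migrations():\n    if row['status'] == 'ERRORED':\n        print(row['id'], row.get('version'), row['exception'])\n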
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.MigrationResult.as_csv","title":"as_csv(path)
","text":"Output a flat csv file of the contents of the Migration index.
PARAMETER DESCRIPTION
path
The path to the csv file to be created
RETURNS DESCRIPTION
None
But a csv file is created at the given path with the following columns:
id
the Synapse id
type
the concrete type of the entity
version
the version of the file entity (if applicable)
row_id
the row of the table attached file (if applicable)
col_name
the column name of the column the table attached file resides in (if applicable)
from_storage_location_id
the previous storage location id where the file/version was stored
from_file_handle_id
the id file handle of the existing file/version
to_file_handle_id
if migrated, the new file handle id
status
one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version
exception
if an error was encountered indexing/migrating the file/version, its stack trace is here
Source code in synapseutils/migrate_functions.py
def as_csv(self, path):\n \"\"\"\n Output a flat csv file of the contents of the Migration index.\n\n Arguments:\n path: The path to the csv file to be created\n\n Returns:\n None: But a csv file is created at the given path with the following columns:\n id: the Synapse id\n type: the concrete type of the entity\n version: the verson of the file entity (if applicable)\n row_id: the row of the table attached file (if applicable)\n col_name: the column name of the column the table attached file resides in (if applicable)\n from_storage_location_id: the previous storage location id where the file/version was stored\n from_file_handle_id: the id file handle of the existing file/version\n to_file_handle_id: if migrated, the new file handle id\n status: one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version\n exception: if an error was encountered indexing/migrating the file/version its stack is here\n\n \"\"\"\n\n with open(path, \"w\", newline=\"\") as csv_file:\n csv_writer = csv.writer(csv_file)\n\n # headers\n csv_writer.writerow(\n [\n \"id\",\n \"type\",\n \"version\",\n \"row_id\",\n \"col_name\",\n \"from_storage_location_id\",\n \"from_file_handle_id\",\n \"to_file_handle_id\",\n \"status\",\n \"exception\",\n ]\n )\n\n for row_dict in self.get_migrations():\n row_data = [\n row_dict[\"id\"],\n row_dict[\"type\"],\n row_dict.get(\"version\"),\n row_dict.get(\"row_id\"),\n row_dict.get(\"col_name\"),\n row_dict.get(\"from_storage_location_id\"),\n row_dict.get(\"from_file_handle_id\"),\n row_dict.get(\"to_file_handle_id\"),\n row_dict[\"status\"],\n row_dict.get(\"exception\"),\n ]\n\n csv_writer.writerow(row_data)\n
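For example (a sketch; the output path is a placeholder):
result.as_csv('/tmp/migration_index.csv')\n# review the csv manually before running migrate_indexed_files\n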
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.index_files_for_migration","title":"index_files_for_migration(syn, entity, dest_storage_location_id, db_path, source_storage_location_ids=None, file_version_strategy='new', include_table_files=False, continue_on_error=False)
","text":"Index the given entity for migration to a new storage location. This is the first step in migrating an entity to a new storage location using synapseutils.
This function will create a sqlite database at the given db_path that can be subsequently passed to the migrate_indexed_files function for actual migration. This function itself does not modify the given entity in any way.
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
TYPE: Synapse
entity
A Synapse entity whose files should be migrated. Can be a Project, Folder, File entity, or Table entity. If it is a container (a Project or Folder) its contents will be recursively indexed.
dest_storage_location_id
The id of the new storage location to be migrated to.
TYPE: str
db_path
A path on disk where a sqlite db can be created to store the contents of the created index.
TYPE: str
source_storage_location_ids
An optional iterable of storage location ids from which files will be migrated. If provided, files outside of one of the listed storage locations will not be indexed for migration. If not provided, then all files not already in the destination storage location will be indexed for migration.
TYPE: Iterable[str]
DEFAULT: None
file_version_strategy
One of \"new\" (default), \"all\", \"latest\", \"skip\" as follows:
new: will create a new version of file entities in the new storage location, leaving existing versions unchanged
all: all existing versions will be migrated in place to the new storage location
latest: the latest version will be migrated in place to the new storage location
skip: skip migrating file entities, e.g. if you want to migrate table attached files in a container while leaving the file entities unchanged
DEFAULT: 'new'
include_table_files
Whether to migrate files attached to tables. If False (default) then e.g. only file entities in the container will be migrated and tables will be untouched.
DEFAULT: False
continue_on_error
Whether any errors encountered while indexing an entity (access errors, etc.) will be raised or instead just recorded in the index while allowing the index creation to continue. Default is False (any errors are raised).
DEFAULT: False
RETURNS DESCRIPTION
A MigrationResult object that can be used to inspect the contents of the index or output the index to a CSV for manual inspection.
Source code in synapseutils/migrate_functions.py
def index_files_for_migration(\n syn: synapseclient.Synapse,\n entity,\n dest_storage_location_id: str,\n db_path: str,\n source_storage_location_ids: typing.Iterable[str] = None,\n file_version_strategy=\"new\",\n include_table_files=False,\n continue_on_error=False,\n):\n \"\"\"\n Index the given entity for migration to a new storage location. This is the first step in migrating an entity\n to a new storage location using synapseutils.\n\n This function will create a sqlite database at the given db_path that can be subsequently passed\n to the migrate_indexed_files function for actual migration. This function itself does not modify the given entity\n in any way.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: A Synapse entity whose files should be migrated. Can be a Project, Folder,\n File entity, or Table entity. If it is a container (a Project or Folder)\n its contents will be recursively indexed.\n dest_storage_location_id: The id of the new storage location to be migrated to.\n db_path: A path on disk where a sqlite db can be created to store the contents of the\n created index.\n source_storage_location_ids: An optional iterable of storage location ids that\n will be migrated. If provided, files outside of\n one of the listed storage locations will not be\n indexed for migration. If not provided, then all\n files not already in the destination storage\n location will be indexed for migrated.\n file_version_strategy: One of \"new\" (default), \"all\", \"latest\", \"skip\" as follows:\n\n - `new`: will create a new version of file entities in the new storage location, leaving existing versions unchanged\n - `all`: all existing versions will be migrated in place to the new storage location\n - `latest`: the latest version will be migrated in place to the new storage location\n - `skip`: skip migrating file entities. use this e.g. if wanting to e.g. migrate table attached files in a container while leaving the files unchanged\n include_table_files: Whether to migrate files attached to tables. If False (default) then e.g. only\n file entities in the container will be migrated and tables will be untouched.\n continue_on_error: Whether any errors encountered while indexing an entity (access etc) will be raised\n or instead just recorded in the index while allowing the index creation\n to continue. 
Default is False (any errors are raised).\n\n Returns:\n A MigrationResult object that can be used to inspect the contents of the index or output the index to a CSV for manual inspection.\n \"\"\" # noqa\n root_id = utils.id_of(entity)\n\n # accept an Iterable, but easier to work internally if we can assume a list of strings\n source_storage_location_ids = [str(s) for s in source_storage_location_ids or []]\n\n file_version_strategies = {\"new\", \"all\", \"latest\", \"skip\"}\n if file_version_strategy not in file_version_strategies:\n raise ValueError(\n \"Invalid file_version_strategy: {}, must be one of {}\".format(\n file_version_strategy, file_version_strategies\n )\n )\n\n if file_version_strategy == \"skip\" and not include_table_files:\n raise ValueError(\n \"Skipping both files entities and table attached files, nothing to migrate\"\n )\n\n _verify_storage_location_ownership(syn, dest_storage_location_id)\n\n test_import_sqlite3()\n import sqlite3\n\n with sqlite3.connect(db_path) as conn:\n cursor = conn.cursor()\n _ensure_schema(cursor)\n\n _verify_index_settings(\n cursor,\n db_path,\n root_id,\n dest_storage_location_id,\n source_storage_location_ids,\n file_version_strategy,\n include_table_files,\n )\n conn.commit()\n\n entity = syn.get(root_id, downloadFile=False)\n try:\n _index_entity(\n conn,\n cursor,\n syn,\n entity,\n None,\n dest_storage_location_id,\n source_storage_location_ids,\n file_version_strategy,\n include_table_files,\n continue_on_error,\n )\n\n except _IndexingError as indexing_ex:\n logging.exception(\n \"Aborted due to failure to index entity %s of type %s. Use the continue_on_error option to skip \"\n \"over entities due to individual failures.\",\n indexing_ex.entity_id,\n indexing_ex.concrete_type,\n )\n\n raise indexing_ex.__cause__\n\n return MigrationResult(syn, db_path)\n
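Example (a minimal sketch; the entity id 'syn123', the destination storage location id '12345', and the db path are placeholders):
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nresult = synapseutils.index_files_for_migration(\n    syn,\n    'syn123',            # Project/Folder/File/Table to index\n    '12345',             # destination storage location id\n    '/tmp/migration.db',\n    file_version_strategy='new',\n    include_table_files=False,\n    continue_on_error=True,\n)\nprint(result.get_counts_by_status())\n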
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.migrate_indexed_files","title":"migrate_indexed_files(syn, db_path, create_table_snapshots=True, continue_on_error=False, force=False)
","text":"Migrate files previously indexed in a sqlite database at the given db_path using the separate index_files_for_migration function. The files listed in the index will be migrated according to the configuration of that index.
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
TYPE: Synapse
db_path
A path on disk where a sqlite db was created using the index_files_for_migration function.
TYPE: str
create_table_snapshots
When updating the files in any table, whether a snapshot of the table is first created.
DEFAULT: True
continue_on_error
Whether any errors encountered while migrating will be raised or instead just recorded in the sqlite database while allowing the migration to continue. Default is False (any errors are raised).
DEFAULT: False
force
If running in an interactive shell, migration requires an interactive confirmation. This can be bypassed by using the force=True option.
DEFAULT: False
RETURNS DESCRIPTION
Union[MigrationResult, None]
A MigrationResult object that can be used to inspect the results of the migration.
Source code in synapseutils/migrate_functions.py
def migrate_indexed_files(\n syn: synapseclient.Synapse,\n db_path: str,\n create_table_snapshots=True,\n continue_on_error=False,\n force=False,\n) -> typing.Union[MigrationResult, None]:\n \"\"\"\n Migrate files previously indexed in a sqlite database at the given db_path using the separate\n index_files_for_migration function. The files listed in the index will be migrated according to the\n configuration of that index.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n db_path: A path on disk where a sqlite db was created using the index_files_for_migration function.\n create_table_snapshots: When updating the files in any table, whether the a snapshot of the table is\n first created.\n continue_on_error: Whether any errors encountered while migrating will be raised\n or instead just recorded in the sqlite database while allowing the migration\n to continue. Default is False (any errors are raised).\n force: If running in an interactive shell, migration requires an interactice confirmation.\n This can be bypassed by using the force=True option.\n\n Returns:\n A MigrationResult object that can be used to inspect the results of the migration.\n \"\"\"\n executor, max_concurrent_file_copies = _get_executor(syn)\n\n test_import_sqlite3()\n import sqlite3\n\n with sqlite3.connect(db_path) as conn:\n cursor = conn.cursor()\n\n _ensure_schema(cursor)\n settings = _retrieve_index_settings(cursor)\n if settings is None:\n # no settings were available at the index given\n raise ValueError(\n \"Unable to retrieve existing index settings from '{}'. \"\n \"Either this path does represent a previously created migration index file or the file is corrupt.\"\n )\n\n dest_storage_location_id = settings[\"dest_storage_location_id\"]\n if not _confirm_migration(cursor, force, dest_storage_location_id):\n logging.info(\"Migration aborted.\")\n return\n\n key = _MigrationKey(id=\"\", type=None, row_id=-1, col_id=-1, version=-1)\n\n futures = set()\n\n # we keep track of the file handles that are currently being migrated\n # so that if we encounter multiple entities associated with the same\n # file handle we can copy the file handle once and update all the entities\n # with the single copied file handle\n pending_file_handle_ids = set()\n completed_file_handle_ids = set()\n\n # we keep track of the entity keys (syn id + version) so that we know\n # if we encounter the same one twice. normally we wouldn't but when we backtrack\n # to update any entities skipped because of a shared file handle we might\n # query for the same key as is already being operated on.\n pending_keys = set()\n\n batch_size = _get_batch_size()\n while True:\n # we query for additional file or table associated file handles to migrate in batches\n # ordering by synapse id. there can be multiple file handles associated with a particular\n # synapse id (i.e. 
multiple file entity versions or multiple table attached files per table),\n # so the ordering and where clause need to account for that.\n # we also include in the query any unmigrated files that were skipped previously through\n # the query loop that share a file handle with a file handle id that is now finished.\n version = key.version if key.version is not None else -1\n row_id = key.row_id if key.row_id is not None else -1\n col_id = key.col_id if key.col_id is not None else -1\n\n query_kwargs = {\n \"indexed_status\": _MigrationStatus.INDEXED.value,\n \"id\": key.id,\n \"file_type\": _MigrationType.FILE.value,\n \"table_type\": _MigrationType.TABLE_ATTACHED_FILE.value,\n \"version\": version,\n \"row_id\": row_id,\n \"col_id\": col_id,\n # ensure that we aren't ever adding more items to the shared executor than allowed\n \"limit\": min(batch_size, max_concurrent_file_copies - len(futures)),\n }\n\n # we can't use both named and positional literals in a query, so we use named\n # literals and then inline a string for the values for our file handle ids\n # since these are a dynamic list of values\n pending_file_handle_in = \"('\" + \"','\".join(pending_file_handle_ids) + \"')\"\n completed_file_handle_in = (\n \"('\" + \"','\".join(completed_file_handle_ids) + \"')\"\n )\n\n results = cursor.execute(\n f\"\"\"\n select\n id,\n type,\n version,\n row_id,\n col_id,\n from_file_handle_id,\n file_size\n from migrations\n where\n status = :indexed_status\n and (\n (\n ((id > :id and type in (:file_type, :table_type))\n or (id = :id and type = :file_type and version is not null and version > :version)\n or (id = :id and type = :table_type and (row_id > :row_id or (row_id = :row_id and col_id > :col_id))))\n and from_file_handle_id not in {pending_file_handle_in}\n ) or\n (\n id <= :id\n and from_file_handle_id in {completed_file_handle_in}\n )\n )\n order by\n id,\n type,\n row_id,\n col_id,\n version\n limit :limit\n \"\"\", # noqa\n query_kwargs,\n )\n\n row_count = 0\n for row in results:\n row_count += 1\n\n row_dict = _get_row_dict(cursor, row, True)\n key_dict = {\n k: v\n for k, v in row_dict.items()\n if k in (\"id\", \"type\", \"version\", \"row_id\", \"col_id\")\n }\n\n last_key = key\n key = _MigrationKey(**key_dict)\n from_file_handle_id = row_dict[\"from_file_handle_id\"]\n\n if (\n key in pending_keys\n or from_file_handle_id in pending_file_handle_ids\n ):\n # if this record is already being migrated or it shares a file handle\n # with a record that is being migrated then skip this.\n # if it the record shares a file handle it will be picked up later\n # when its file handle is completed.\n continue\n\n file_size = row_dict[\"file_size\"]\n\n pending_keys.add(key)\n to_file_handle_id = _check_file_handle_exists(\n conn.cursor(), from_file_handle_id\n )\n if not to_file_handle_id:\n pending_file_handle_ids.add(from_file_handle_id)\n\n if key.type == _MigrationType.FILE.value:\n if key.version is None:\n migration_fn = _create_new_file_version\n\n else:\n migration_fn = _migrate_file_version\n\n elif key.type == _MigrationType.TABLE_ATTACHED_FILE.value:\n if last_key.id != key.id and create_table_snapshots:\n syn.create_snapshot_version(key.id)\n\n migration_fn = _migrate_table_attached_file\n\n else:\n raise ValueError(\n \"Unexpected type {} with id {}\".format(key.type, key.id)\n )\n\n def migration_task(\n syn,\n key,\n from_file_handle_id,\n to_file_handle_id,\n file_size,\n storage_location_id,\n ):\n # a closure to wrap the actual function call so that we an add some 
local variables\n # to the return tuple which will be consumed when the future is processed\n with shared_executor(executor):\n try:\n # instrument the shared executor in this thread so that we won't\n # create a new executor to perform the multipart copy\n to_file_handle_id = migration_fn(\n syn,\n key,\n from_file_handle_id,\n to_file_handle_id,\n file_size,\n storage_location_id,\n )\n return key, from_file_handle_id, to_file_handle_id\n except Exception as ex:\n raise _MigrationError(\n key, from_file_handle_id, to_file_handle_id\n ) from ex\n\n future = executor.submit(\n migration_task,\n syn,\n key,\n from_file_handle_id,\n to_file_handle_id,\n file_size,\n dest_storage_location_id,\n )\n futures.add(future)\n\n if row_count == 0 and not pending_file_handle_ids:\n # we've run out of migratable sqlite rows, we have nothing else\n # to submit, so we break out and wait for all remaining\n # tasks to conclude.\n break\n\n if len(futures) >= max_concurrent_file_copies or row_count < batch_size:\n # if we have no concurrency left to process any additional entities\n # or if we're near the end of he migration and have a small\n # remainder batch then we wait for one of the processing migrations\n # to finish. a small batch doesn't mean this is the last batch since\n # a completed file handle here could be associated with another\n # entity that we deferred before because it shared the same file handle id\n futures, completed_file_handle_ids = _wait_futures(\n conn,\n cursor,\n futures,\n pending_keys,\n concurrent.futures.FIRST_COMPLETED,\n continue_on_error,\n )\n\n pending_file_handle_ids -= completed_file_handle_ids\n\n if futures:\n # wait for all remaining migrations to conclude before returning\n _wait_futures(\n conn,\n cursor,\n futures,\n pending_keys,\n concurrent.futures.ALL_COMPLETED,\n continue_on_error,\n )\n\n return MigrationResult(syn, db_path)\n
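Example (a sketch continuing from the indexing example above; force=True bypasses the interactive confirmation, e.g. for scripted runs):
result = synapseutils.migrate_indexed_files(\n    syn,\n    '/tmp/migration.db',\n    create_table_snapshots=True,\n    force=True,\n)\n# None is returned if the migration was aborted at the confirmation prompt\nif result is not None:\n    print(result.get_counts_by_status())\n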
"},{"location":"reference/synapse_utils/#synapseutils.describe_functions","title":"synapseutils.describe_functions
","text":""},{"location":"reference/synapse_utils/#synapseutils.describe_functions-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.describe_functions.describe","title":"describe(syn, entity)
","text":"Gets a synapse entity and returns summary statistics about it.
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
entity
the Synapse id of the entity to be described
TYPE: str
Describing columns of a table
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nstatistics = synapseutils.describe(syn, entity=\"syn123\")\nprint(statistics)\n{\n \"column1\": {\n \"dtype\": \"object\",\n \"mode\": \"FOOBAR\"\n },\n \"column2\": {\n \"dtype\": \"int64\",\n \"mode\": 1,\n \"min\": 1,\n \"max\": 2,\n \"mean\": 1.4\n },\n \"column3\": {\n \"dtype\": \"bool\",\n \"mode\": false,\n \"min\": false,\n \"max\": true,\n \"mean\": 0.5\n }\n}\n
RETURNS DESCRIPTION
Union[dict, None]
A dict if the dataset is valid; None if not.
Source code in synapseutils/describe_functions.py
def describe(syn, entity: str) -> typing.Union[dict, None]:\n \"\"\"\n Gets a synapse entity and returns summary statistics about it.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: synapse id of the entity to be described\n\n Example: Using this function\n Describing columns of a table\n\n import synapseclient\n import synapseutils\n syn = synapseclient.login()\n statistics = synapseutils.describe(syn, entity=\"syn123\")\n print(statistics)\n {\n \"column1\": {\n \"dtype\": \"object\",\n \"mode\": \"FOOBAR\"\n },\n \"column2\": {\n \"dtype\": \"int64\",\n \"mode\": 1,\n \"min\": 1,\n \"max\": 2,\n \"mean\": 1.4\n },\n \"column3\": {\n \"dtype\": \"bool\",\n \"mode\": false,\n \"min\": false,\n \"max\": true,\n \"mean\": 0.5\n }\n }\n\n Returns:\n A dict if the dataset is valid; None if not.\n \"\"\"\n df = _open_entity_as_df(syn=syn, entity=entity)\n\n if df is None:\n return None\n\n stats = _describe_wrapper(df)\n syn.logger.info(json.dumps(stats, indent=2, default=str))\n return stats\n
"},{"location":"reference/table_schema/","title":"Table Schema","text":""},{"location":"reference/table_schema/#synapseclient.table.Schema","title":"synapseclient.table.Schema
","text":" Bases: SchemaBase
A Schema is a synapseclient.entity.Entity that defines a set of columns in a table.
:param name: the name for the Table Schema object
:param description: User readable description of the schema
:param columns: a list of Column objects or their IDs
:param parent: the project in Synapse to which this table belongs
:param properties: A map of Synapse properties
:param annotations: A map of user defined annotations
:param local_state: Internal use only
Example::
cols = [Column(name='Isotope', columnType='STRING'),\n Column(name='Atomic Mass', columnType='INTEGER'),\n Column(name='Halflife', columnType='DOUBLE'),\n Column(name='Discovered', columnType='DATE')]\n\nschema = syn.store(Schema(name='MyTable', columns=cols, parent=project))\n
Source code in synapseclient/table.py
class Schema(SchemaBase):\n \"\"\"\n A Schema is an :py:class:`synapseclient.entity.Entity` that defines a set of columns in a table.\n\n :param name: the name for the Table Schema object\n :param description: User readable description of the schema\n :param columns: a list of :py:class:`Column` objects or their IDs\n :param parent: the project in Synapse to which this table belongs\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n\n cols = [Column(name='Isotope', columnType='STRING'),\n Column(name='Atomic Mass', columnType='INTEGER'),\n Column(name='Halflife', columnType='DOUBLE'),\n Column(name='Discovered', columnType='DATE')]\n\n schema = syn.store(Schema(name='MyTable', columns=cols, parent=project))\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.TableEntity\"\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n super(Schema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n
"},{"location":"reference/tables/","title":"Tables","text":""},{"location":"reference/tables/#synapseclient.table","title":"synapseclient.table
","text":""},{"location":"reference/tables/#synapseclient.table--tables","title":"Tables","text":"Synapse Tables enable storage of tabular data in Synapse in a form that can be queried using a SQL-like query language.
A table has a Schema and holds a set of rows conforming to that schema.
A Schema defines a series of Column objects of the following types: STRING, DOUBLE, INTEGER, BOOLEAN, DATE, ENTITYID, FILEHANDLEID, LINK, LARGETEXT, USERID.
Read more information about using Tables in Synapse in the tutorials section.
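For example, a stored table can be queried and loaded into a pandas DataFrame (a minimal sketch; 'syn123' is a placeholder table id):
import synapseclient\nsyn = synapseclient.login()\nresults = syn.tableQuery('select * from syn123')\ndf = results.asDataFrame()\n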
"},{"location":"reference/tables/#synapseclient.table-classes","title":"Classes","text":""},{"location":"reference/tables/#synapseclient.table.SchemaBase","title":"SchemaBase
","text":" Bases: Entity
This is the abstract class for EntityViewSchema and Schema, containing the common methods for both. You cannot create an object of this type.
Source code in synapseclient/table.py
class SchemaBase(Entity, metaclass=abc.ABCMeta):\n \"\"\"\n This is the an Abstract Class for EntityViewSchema and Schema containing the common methods for both.\n You can not create an object of this type.\n \"\"\"\n\n _property_keys = Entity._property_keys + [\"columnIds\"]\n _local_keys = Entity._local_keys + [\"columns_to_store\"]\n\n @property\n @abc.abstractmethod # forces subclasses to define _synapse_entity_type\n def _synapse_entity_type(self):\n pass\n\n @abc.abstractmethod\n def __init__(\n self, name, columns, properties, annotations, local_state, parent, **kwargs\n ):\n self.properties.setdefault(\"columnIds\", [])\n self.__dict__.setdefault(\"columns_to_store\", [])\n\n if name:\n kwargs[\"name\"] = name\n super(SchemaBase, self).__init__(\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n if columns:\n self.addColumns(columns)\n\n def addColumn(self, column):\n \"\"\"\n :param column: a column object or its ID\n \"\"\"\n if isinstance(column, str) or isinstance(column, int) or hasattr(column, \"id\"):\n self.properties.columnIds.append(id_of(column))\n elif isinstance(column, Column):\n if not self.__dict__.get(\"columns_to_store\", None):\n self.__dict__[\"columns_to_store\"] = []\n self.__dict__[\"columns_to_store\"].append(column)\n else:\n raise ValueError(\"Not a column? %s\" % str(column))\n\n def addColumns(self, columns):\n \"\"\"\n :param columns: a list of column objects or their ID\n \"\"\"\n for column in columns:\n self.addColumn(column)\n\n def removeColumn(self, column):\n \"\"\"\n :param column: a column object or its ID\n \"\"\"\n if isinstance(column, str) or isinstance(column, int) or hasattr(column, \"id\"):\n self.properties.columnIds.remove(id_of(column))\n elif isinstance(column, Column) and self.columns_to_store:\n self.columns_to_store.remove(column)\n else:\n ValueError(\"Can't remove column %s\" + str(column))\n\n def has_columns(self):\n \"\"\"Does this schema have columns specified?\"\"\"\n return bool(\n self.properties.get(\"columnIds\", None)\n or self.__dict__.get(\"columns_to_store\", None)\n )\n\n def _before_synapse_store(self, syn):\n if len(self.columns_to_store) + len(self.columnIds) > MAX_NUM_TABLE_COLUMNS:\n raise ValueError(\n \"Too many columns. The limit is %s columns per table\"\n % MAX_NUM_TABLE_COLUMNS\n )\n\n # store any columns before storing table\n if self.columns_to_store:\n self.properties.columnIds.extend(\n column.id for column in syn.createColumns(self.columns_to_store)\n )\n self.columns_to_store = []\n
"},{"location":"reference/tables/#synapseclient.table.SchemaBase-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.SchemaBase.addColumn","title":"addColumn(column)
","text":":param column: a column object or its ID
Source code in synapseclient/table.py
def addColumn(self, column):\n \"\"\"\n :param column: a column object or its ID\n \"\"\"\n if isinstance(column, str) or isinstance(column, int) or hasattr(column, \"id\"):\n self.properties.columnIds.append(id_of(column))\n elif isinstance(column, Column):\n if not self.__dict__.get(\"columns_to_store\", None):\n self.__dict__[\"columns_to_store\"] = []\n self.__dict__[\"columns_to_store\"].append(column)\n else:\n raise ValueError(\"Not a column? %s\" % str(column))\n
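For example (a sketch; schema is assumed to be an existing Schema and syn a logged-in Synapse client):
from synapseclient import Column\nschema.addColumn(Column(name='Age', columnType='INTEGER'))\n# addColumns accepts a list of Column objects or ids in the same way\nschema = syn.store(schema)\n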
"},{"location":"reference/tables/#synapseclient.table.SchemaBase.addColumns","title":"addColumns(columns)
","text":":param columns: a list of column objects or their ID
Source code in synapseclient/table.py
def addColumns(self, columns):\n \"\"\"\n :param columns: a list of column objects or their ID\n \"\"\"\n for column in columns:\n self.addColumn(column)\n
"},{"location":"reference/tables/#synapseclient.table.SchemaBase.removeColumn","title":"removeColumn(column)
","text":":param column: a column object or its ID
Source code in synapseclient/table.py
def removeColumn(self, column):\n \"\"\"\n :param column: a column object or its ID\n \"\"\"\n if isinstance(column, str) or isinstance(column, int) or hasattr(column, \"id\"):\n self.properties.columnIds.remove(id_of(column))\n elif isinstance(column, Column) and self.columns_to_store:\n self.columns_to_store.remove(column)\n else:\n ValueError(\"Can't remove column %s\" + str(column))\n
"},{"location":"reference/tables/#synapseclient.table.SchemaBase.has_columns","title":"has_columns()
","text":"Does this schema have columns specified?
Source code in synapseclient/table.py
def has_columns(self):\n \"\"\"Does this schema have columns specified?\"\"\"\n return bool(\n self.properties.get(\"columnIds\", None)\n or self.__dict__.get(\"columns_to_store\", None)\n )\n
"},{"location":"reference/tables/#synapseclient.table.Schema","title":"Schema
","text":" Bases: SchemaBase
A Schema is a synapseclient.entity.Entity that defines a set of columns in a table.
:param name: the name for the Table Schema object
:param description: User readable description of the schema
:param columns: a list of Column objects or their IDs
:param parent: the project in Synapse to which this table belongs
:param properties: A map of Synapse properties
:param annotations: A map of user defined annotations
:param local_state: Internal use only
Example::
cols = [Column(name='Isotope', columnType='STRING'),\n Column(name='Atomic Mass', columnType='INTEGER'),\n Column(name='Halflife', columnType='DOUBLE'),\n Column(name='Discovered', columnType='DATE')]\n\nschema = syn.store(Schema(name='MyTable', columns=cols, parent=project))\n
Source code in synapseclient/table.py
class Schema(SchemaBase):\n \"\"\"\n A Schema is an :py:class:`synapseclient.entity.Entity` that defines a set of columns in a table.\n\n :param name: the name for the Table Schema object\n :param description: User readable description of the schema\n :param columns: a list of :py:class:`Column` objects or their IDs\n :param parent: the project in Synapse to which this table belongs\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n\n cols = [Column(name='Isotope', columnType='STRING'),\n Column(name='Atomic Mass', columnType='INTEGER'),\n Column(name='Halflife', columnType='DOUBLE'),\n Column(name='Discovered', columnType='DATE')]\n\n schema = syn.store(Schema(name='MyTable', columns=cols, parent=project))\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.TableEntity\"\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n super(Schema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n
"},{"location":"reference/tables/#synapseclient.table.MaterializedViewSchema","title":"MaterializedViewSchema
","text":" Bases: SchemaBase
A MaterializedViewSchema is a synapseclient.entity.Entity that defines a set of columns in a materialized view along with the SQL statement.
:param name: the name for the Materialized View Schema object
:param description: User readable description of the schema
:param definingSQL: The synapse SQL statement that defines the data in the materialized view. The SQL may contain JOIN clauses on multiple tables.
:param columns: a list of Column objects or their IDs
:param parent: the project in Synapse to which this Materialized View belongs
:param properties: A map of Synapse properties
:param annotations: A map of user defined annotations
:param local_state: Internal use only
Example::
defining_sql = \"SELECT * FROM syn111 F JOIN syn2222 P on (F.patient_id = P.patient_id)\"\n\nschema = syn.store(MaterializedViewSchema(name='MyTable', parent=project, definingSQL=defining_sql))\n
Source code in synapseclient/table.py
class MaterializedViewSchema(SchemaBase):\n \"\"\"\n A MaterializedViewSchema is an :py:class:`synapseclient.entity.Entity` that defines a set of columns in a\n materialized view along with the SQL statement.\n\n :param name: the name for the Materialized View Schema object\n :param description: User readable description of the schema\n :param definingSQL: The synapse SQL statement that defines the data in the materialized view. The SQL may\n contain JOIN clauses on multiple tables.\n :param columns: a list of :py:class:`Column` objects or their IDs\n :param parent: the project in Synapse to which this Materialized View belongs\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n\n defining_sql = \"SELECT * FROM syn111 F JOIN syn2222 P on (F.patient_id = P.patient_id)\"\n\n schema = syn.store(MaterializedViewSchema(name='MyTable', parent=project, definingSQL=defining_sql))\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.MaterializedView\"\n _property_keys = SchemaBase._property_keys + [\"definingSQL\"]\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n definingSQL=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if definingSQL is not None:\n kwargs[\"definingSQL\"] = definingSQL\n super(MaterializedViewSchema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n
"},{"location":"reference/tables/#synapseclient.table.ViewBase","title":"ViewBase
","text":" Bases: SchemaBase
This is a helper class for EntityViewSchema and SubmissionViewSchema containing the common methods for both.
Source code in synapseclient/table.py
class ViewBase(SchemaBase):\n \"\"\"\n This is a helper class for EntityViewSchema and SubmissionViewSchema\n containing the common methods for both.\n \"\"\"\n\n _synapse_entity_type = \"\"\n _property_keys = SchemaBase._property_keys + [\"viewTypeMask\", \"scopeIds\"]\n _local_keys = SchemaBase._local_keys + [\n \"addDefaultViewColumns\",\n \"addAnnotationColumns\",\n \"ignoredAnnotationColumnNames\",\n ]\n\n def add_scope(self, entities):\n \"\"\"\n :param entities: a Project, Folder, Evaluation object or its ID, can also be a list of them\n \"\"\"\n if isinstance(entities, list):\n # add ids to a temp list so that we don't partially modify scopeIds on an exception in id_of()\n temp_list = [id_of(entity) for entity in entities]\n self.scopeIds.extend(temp_list)\n else:\n self.scopeIds.append(id_of(entities))\n\n def _filter_duplicate_columns(self, syn, columns_to_add):\n \"\"\"\n If a column to be added has the same name and same type as an existing column, it will be considered a duplicate\n and not added.\n :param syn: a :py:class:`synapseclient.client.Synapse` object that is logged in\n :param columns_to_add: iterable collection of type :py:class:`synapseclient.table.Column` objects\n :return: a filtered list of columns to add\n \"\"\"\n\n # no point in making HTTP calls to retrieve existing Columns if we not adding any new columns\n if not columns_to_add:\n return columns_to_add\n\n # set up Column name/type tracking\n # map of str -> set(str), where str is the column type as a string and set is a set of column name strings\n column_type_to_annotation_names = {}\n\n # add to existing columns the columns that user has added but not yet created in synapse\n column_generator = (\n itertools.chain(syn.getColumns(self.columnIds), self.columns_to_store)\n if self.columns_to_store\n else syn.getColumns(self.columnIds)\n )\n\n for column in column_generator:\n column_name = column[\"name\"]\n column_type = column[\"columnType\"]\n\n column_type_to_annotation_names.setdefault(column_type, set()).add(\n column_name\n )\n\n valid_columns = []\n for column in columns_to_add:\n new_col_name = column[\"name\"]\n new_col_type = column[\"columnType\"]\n\n typed_col_name_set = column_type_to_annotation_names.setdefault(\n new_col_type, set()\n )\n if new_col_name not in typed_col_name_set:\n typed_col_name_set.add(new_col_name)\n valid_columns.append(column)\n return valid_columns\n\n def _before_synapse_store(self, syn):\n # get the default EntityView columns from Synapse and add them to the columns list\n additional_columns = []\n view_type = self._synapse_entity_type.split(\".\")[-1].lower()\n mask = self.get(\"viewTypeMask\")\n\n if self.addDefaultViewColumns:\n additional_columns.extend(\n syn._get_default_view_columns(view_type, view_type_mask=mask)\n )\n\n # get default annotations\n if self.addAnnotationColumns:\n anno_columns = [\n x\n for x in syn._get_annotation_view_columns(\n self.scopeIds, view_type, view_type_mask=mask\n )\n if x[\"name\"] not in self.ignoredAnnotationColumnNames\n ]\n additional_columns.extend(anno_columns)\n\n self.addColumns(self._filter_duplicate_columns(syn, additional_columns))\n\n # set these boolean flags to false so they are not repeated.\n self.addDefaultViewColumns = False\n self.addAnnotationColumns = False\n\n super(ViewBase, self)._before_synapse_store(syn)\n
"},{"location":"reference/tables/#synapseclient.table.ViewBase-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.ViewBase.add_scope","title":"add_scope(entities)
","text":":param entities: a Project, Folder, Evaluation object or its ID, can also be a list of them
Source code in synapseclient/table.py
def add_scope(self, entities):\n \"\"\"\n :param entities: a Project, Folder, Evaluation object or its ID, can also be a list of them\n \"\"\"\n if isinstance(entities, list):\n # add ids to a temp list so that we don't partially modify scopeIds on an exception in id_of()\n temp_list = [id_of(entity) for entity in entities]\n self.scopeIds.extend(temp_list)\n else:\n self.scopeIds.append(id_of(entities))\n
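For example (a sketch; view is assumed to be an EntityViewSchema or other ViewBase subclass, and the ids are placeholders):
view.add_scope('syn123')               # a single Project/Folder/Evaluation id\nview.add_scope(['syn456', 'syn789'])   # or a list of them\nview = syn.store(view)\n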
"},{"location":"reference/tables/#synapseclient.table.Dataset","title":"Dataset
","text":" Bases: ViewBase
A Dataset is a synapseclient.entity.Entity that defines a flat list of entities as a tableview (a.k.a. a "dataset").
:param name: The name for the Dataset object
:param description: User readable description of the schema
:param columns: A list of Column objects or their IDs
:param parent: The Synapse Project to which this Dataset belongs
:param properties: A map of Synapse properties
:param annotations: A map of user defined annotations
:param dataset_items: A list of items characterized by entityId and versionNumber
:param folders: A list of Folder IDs
:param local_state: Internal use only
Example::
from synapseclient import Dataset\n\n# Create a Dataset with pre-defined DatasetItems. Default Dataset columns\n# are used if no schema is provided.\ndataset_items = [\n {'entityId': \"syn000\", 'versionNumber': 1},\n {...},\n]\ndataset = syn.store(Dataset(\n name=\"My Dataset\",\n parent=project,\n dataset_items=dataset_items))\n\n# Add/remove specific Synapse IDs to/from the Dataset\ndataset.add_item({'entityId': \"syn111\", 'versionNumber': 1})\ndataset.remove_item(\"syn000\")\ndataset = syn.store(dataset)\n\n# Add a list of Synapse IDs to the Dataset\nnew_items = [\n {'entityId': \"syn222\", 'versionNumber': 2},\n {'entityId': \"syn333\", 'versionNumber': 1}\n]\ndataset.add_items(new_items)\ndataset = syn.store(dataset)\n
Folders can easily be added recursively to a dataset, that is, all files within the folder (including sub-folders) will be added. Note that using the following methods will add files with the latest version number ONLY. If another version number is desired, use add_item or add_items.
Example::
# Add a single Folder to the Dataset\ndataset.add_folder(\"syn123\")\n\n# Add a list of Folders, overwriting any existing files in the dataset\ndataset.add_folders([\"syn456\", \"syn789\"], force=True)\n\ndataset = syn.store(dataset)\n
empty() can be used to truncate a dataset, that is, remove all current items from the set.
Example::
dataset.empty()\ndataset = syn.store(dataset)\n
To get the number of entities in the dataset, use len().
Example::
print(f\"{dataset.name} has {len(dataset)} items.\")\n
To create a snapshot version of the Dataset, use synapseclient.client.create_snapshot_version.
Example::
syn = synapseclient.login()\nsyn.create_snapshot_version(\n dataset.id,\n label=\"v1.0\",\n comment=\"This is version 1\")\n
Source code in synapseclient/table.py
class Dataset(ViewBase):\n \"\"\"\n A Dataset is an :py:class:`synapseclient.entity.Entity` that defines a\n flat list of entities as a tableview (a.k.a. a \"dataset\").\n\n :param name: The name for the Dataset object\n :param description: User readable description of the schema\n :param columns: A list of :py:class:`Column` objects or their IDs\n :param parent: The Synapse Project to which this Dataset belongs\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param dataset_items: A list of items characterized by entityId and versionNumber\n :param folder: A list of Folder IDs\n :param local_state: Internal use only\n\n Example::\n\n from synapseclient import Dataset\n\n # Create a Dataset with pre-defined DatasetItems. Default Dataset columns\n # are used if no schema is provided.\n dataset_items = [\n {'entityId': \"syn000\", 'versionNumber': 1},\n {...},\n ]\n dataset = syn.store(Dataset(\n name=\"My Dataset\",\n parent=project,\n dataset_items=dataset_items))\n\n # Add/remove specific Synapse IDs to/from the Dataset\n dataset.add_item({'entityId': \"syn111\", 'versionNumber': 1})\n dataset.remove_item(\"syn000\")\n dataset = syn.store(dataset)\n\n # Add a list of Synapse IDs to the Dataset\n new_items = [\n {'entityId': \"syn222\", 'versionNumber': 2},\n {'entityId': \"syn333\", 'versionNumber': 1}\n ]\n dataset.add_items(new_items)\n dataset = syn.store(dataset)\n\n Folders can easily be added recursively to a dataset, that is, all files\n within the folder (including sub-folders) will be added. Note that using\n the following methods will add files with the latest version number ONLY.\n If another version number is desired, use :py:meth:`synapseclient.table.add_item`\n or :py:meth:`synapseclient.table.add_items`.\n\n Example::\n\n # Add a single Folder to the Dataset\n dataset.add_folder(\"syn123\")\n\n # Add a list of Folders, overwriting any existing files in the dataset\n dataset.add_folders([\"syn456\", \"syn789\"], force=True)\n\n dataset = syn.store(dataset)\n\n empty() can be used to truncate a dataset, that is, remove all current\n items from the set.\n\n Example::\n\n dataset.empty()\n dataset = syn.store(dataset)\n\n To get the number of entities in the dataset, use len().\n\n Example::\n\n print(f\"{dataset.name} has {len(dataset)} items.\")\n\n To create a snapshot version of the Dataset, use\n :py:meth:`synapseclient.client.create_snapshot_version`.\n\n Example::\n\n syn = synapseclient.login()\n syn.create_snapshot_version(\n dataset.id,\n label=\"v1.0\",\n comment=\"This is version 1\")\n \"\"\"\n\n _synapse_entity_type: str = \"org.sagebionetworks.repo.model.table.Dataset\"\n _property_keys: List[str] = ViewBase._property_keys + [\"datasetItems\"]\n _local_keys: List[str] = ViewBase._local_keys + [\"folders_to_add\", \"force\"]\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n properties=None,\n addDefaultViewColumns=True,\n addAnnotationColumns=True,\n ignoredAnnotationColumnNames=[],\n annotations=None,\n local_state=None,\n dataset_items=None,\n folders=None,\n force=False,\n **kwargs,\n ):\n self.properties.setdefault(\"datasetItems\", [])\n self.__dict__.setdefault(\"folders_to_add\", set())\n self.ignoredAnnotationColumnNames = set(ignoredAnnotationColumnNames)\n self.viewTypeMask = EntityViewType.DATASET.value\n super(Dataset, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n 
)\n\n self.force = force\n if dataset_items:\n self.add_items(dataset_items, force)\n if folders:\n self.add_folders(folders, force)\n\n # HACK: make sure we don't try to add columns to schemas that we retrieve from synapse\n is_from_normal_constructor = not (properties or local_state)\n # allowing annotations because user might want to update annotations all at once\n self.addDefaultViewColumns = (\n addDefaultViewColumns and is_from_normal_constructor\n )\n self.addAnnotationColumns = addAnnotationColumns and is_from_normal_constructor\n\n def __len__(self):\n return len(self.properties.datasetItems)\n\n @staticmethod\n def _check_needed_keys(keys: List[str]):\n required_keys = {\"entityId\", \"versionNumber\"}\n if required_keys - keys:\n raise LookupError(\n \"DatasetItem missing a required property: %s\"\n % str(required_keys - keys)\n )\n return True\n\n def add_item(self, dataset_item: Dict[str, str], force: bool = True):\n \"\"\"\n :param dataset_item: a single dataset item\n :param force: force add item\n \"\"\"\n if isinstance(dataset_item, dict) and self._check_needed_keys(\n dataset_item.keys()\n ):\n if not self.has_item(dataset_item.get(\"entityId\")):\n self.properties.datasetItems.append(dataset_item)\n else:\n if force:\n self.remove_item(dataset_item.get(\"entityId\"))\n self.properties.datasetItems.append(dataset_item)\n else:\n raise ValueError(\n f\"Duplicate item found: {dataset_item.get('entityId')}. \"\n \"Set force=True to overwrite the existing item.\"\n )\n else:\n raise ValueError(\"Not a DatasetItem? %s\" % str(dataset_item))\n\n def add_items(self, dataset_items: List[Dict[str, str]], force: bool = True):\n \"\"\"\n :param dataset_items: a list of dataset items\n :param force: force add items\n \"\"\"\n for dataset_item in dataset_items:\n self.add_item(dataset_item, force)\n\n def remove_item(self, item_id: str):\n \"\"\"\n :param item_id: a single dataset item Synapse ID\n \"\"\"\n item_id = id_of(item_id)\n if item_id.startswith(\"syn\"):\n for i, curr_item in enumerate(self.properties.datasetItems):\n if curr_item.get(\"entityId\") == item_id:\n del self.properties.datasetItems[i]\n break\n else:\n raise ValueError(\"Not a Synapse ID: %s\" % str(item_id))\n\n def empty(self):\n self.properties.datasetItems = []\n\n def has_item(self, item_id):\n \"\"\"\n :param item_id: a single dataset item Synapse ID\n \"\"\"\n return any(item[\"entityId\"] == item_id for item in self.properties.datasetItems)\n\n def add_folder(self, folder: str, force: bool = True):\n \"\"\"\n :param folder: a single Synapse Folder ID\n :param force: force add items from folder\n \"\"\"\n if not self.__dict__.get(\"folders_to_add\", None):\n self.__dict__[\"folders_to_add\"] = set()\n self.__dict__[\"folders_to_add\"].add(folder)\n # if self.force != force:\n self.force = force\n\n def add_folders(self, folders: List[str], force: bool = True):\n \"\"\"\n :param folders: a list of Synapse Folder IDs\n :param force: force add items from folders\n \"\"\"\n if (\n isinstance(folders, list)\n or isinstance(folders, set)\n or isinstance(folders, tuple)\n ):\n self.force = force\n for folder in folders:\n self.add_folder(folder, force)\n else:\n raise ValueError(f\"Not a list of Folder IDs: {folders}\")\n\n def _add_folder_files(self, syn, folder):\n files = []\n children = syn.getChildren(folder)\n for child in children:\n if child.get(\"type\") == \"org.sagebionetworks.repo.model.Folder\":\n files.extend(self._add_folder_files(syn, child.get(\"id\")))\n elif child.get(\"type\") == 
\"org.sagebionetworks.repo.model.FileEntity\":\n files.append(\n {\n \"entityId\": child.get(\"id\"),\n \"versionNumber\": child.get(\"versionNumber\"),\n }\n )\n else:\n raise ValueError(f\"Not a Folder?: {folder}\")\n return files\n\n def _before_synapse_store(self, syn):\n # Add files from folders (if any) before storing dataset.\n if self.folders_to_add:\n for folder in self.folders_to_add:\n items_to_add = self._add_folder_files(syn, folder)\n self.add_items(items_to_add, self.force)\n self.folders_to_add = set()\n # Must set this scopeIds is used to get all annotations from the\n # entities\n self.scopeIds = [item[\"entityId\"] for item in self.properties.datasetItems]\n super()._before_synapse_store(syn)\n # Reset attribute to force-add items from folders.\n self.force = True\n # Remap `datasetItems` back to `items` before storing (since `items`\n # is the accepted field name in the API, not `datasetItems`).\n self.properties.items = self.properties.datasetItems\n
"},{"location":"reference/tables/#synapseclient.table.Dataset-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.Dataset.add_item","title":"add_item(dataset_item, force=True)
","text":":param dataset_item: a single dataset item :param force: force add item
Source code in synapseclient/table.py
def add_item(self, dataset_item: Dict[str, str], force: bool = True):\n \"\"\"\n :param dataset_item: a single dataset item\n :param force: force add item\n \"\"\"\n if isinstance(dataset_item, dict) and self._check_needed_keys(\n dataset_item.keys()\n ):\n if not self.has_item(dataset_item.get(\"entityId\")):\n self.properties.datasetItems.append(dataset_item)\n else:\n if force:\n self.remove_item(dataset_item.get(\"entityId\"))\n self.properties.datasetItems.append(dataset_item)\n else:\n raise ValueError(\n f\"Duplicate item found: {dataset_item.get('entityId')}. \"\n \"Set force=True to overwrite the existing item.\"\n )\n else:\n raise ValueError(\"Not a DatasetItem? %s\" % str(dataset_item))\n
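A usage sketch of the force semantics (illustrative, not from the library docs; assumes a logged-in syn, an existing Dataset dataset, and hypothetical Synapse IDs)::
dataset.add_item({'entityId': \"syn111\", 'versionNumber': 1})\n# force=True (the default) replaces an existing entry with the same entityId\ndataset.add_item({'entityId': \"syn111\", 'versionNumber': 2})\ntry:\n    dataset.add_item({'entityId': \"syn111\", 'versionNumber': 3}, force=False)\nexcept ValueError:\n    pass  # with force=False a duplicate entityId raises instead of overwriting\ndataset = syn.store(dataset)\n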
"},{"location":"reference/tables/#synapseclient.table.Dataset.add_items","title":"add_items(dataset_items, force=True)
","text":":param dataset_items: a list of dataset items :param force: force add items
Source code in synapseclient/table.py
def add_items(self, dataset_items: List[Dict[str, str]], force: bool = True):\n \"\"\"\n :param dataset_items: a list of dataset items\n :param force: force add items\n \"\"\"\n for dataset_item in dataset_items:\n self.add_item(dataset_item, force)\n
"},{"location":"reference/tables/#synapseclient.table.Dataset.remove_item","title":"remove_item(item_id)
","text":":param item_id: a single dataset item Synapse ID
Source code in synapseclient/table.py
def remove_item(self, item_id: str):\n \"\"\"\n :param item_id: a single dataset item Synapse ID\n \"\"\"\n item_id = id_of(item_id)\n if item_id.startswith(\"syn\"):\n for i, curr_item in enumerate(self.properties.datasetItems):\n if curr_item.get(\"entityId\") == item_id:\n del self.properties.datasetItems[i]\n break\n else:\n raise ValueError(\"Not a Synapse ID: %s\" % str(item_id))\n
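A usage sketch (illustrative only; syn111 is a hypothetical ID)::
dataset.remove_item(\"syn111\")  # removes the matching item; an ID not in the dataset is a no-op\ndataset = syn.store(dataset)   # persist the change\n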
"},{"location":"reference/tables/#synapseclient.table.Dataset.has_item","title":"has_item(item_id)
","text":":param item_id: a single dataset item Synapse ID
Source code in synapseclient/table.py
def has_item(self, item_id):\n \"\"\"\n :param item_id: a single dataset item Synapse ID\n \"\"\"\n return any(item[\"entityId\"] == item_id for item in self.properties.datasetItems)\n
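An illustrative guard using has_item (hypothetical ID)::
if not dataset.has_item(\"syn111\"):\n    dataset.add_item({'entityId': \"syn111\", 'versionNumber': 1})\n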
"},{"location":"reference/tables/#synapseclient.table.Dataset.add_folder","title":"add_folder(folder, force=True)
","text":":param folder: a single Synapse Folder ID :param force: force add items from folder
Source code in synapseclient/table.py
def add_folder(self, folder: str, force: bool = True):\n \"\"\"\n :param folder: a single Synapse Folder ID\n :param force: force add items from folder\n \"\"\"\n if not self.__dict__.get(\"folders_to_add\", None):\n self.__dict__[\"folders_to_add\"] = set()\n self.__dict__[\"folders_to_add\"].add(folder)\n # if self.force != force:\n self.force = force\n
"},{"location":"reference/tables/#synapseclient.table.Dataset.add_folders","title":"add_folders(folders, force=True)
","text":":param folders: a list of Synapse Folder IDs :param force: force add items from folders
Source code in synapseclient/table.py
def add_folders(self, folders: List[str], force: bool = True):\n \"\"\"\n :param folders: a list of Synapse Folder IDs\n :param force: force add items from folders\n \"\"\"\n if (\n isinstance(folders, list)\n or isinstance(folders, set)\n or isinstance(folders, tuple)\n ):\n self.force = force\n for folder in folders:\n self.add_folder(folder, force)\n else:\n raise ValueError(f\"Not a list of Folder IDs: {folders}\")\n
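A sketch of the deferred behavior (hypothetical IDs): the folders are only queued locally, and their files are resolved recursively when the dataset is stored::
dataset.add_folders([\"syn456\", \"syn789\"], force=True)\ndataset = syn.store(dataset)  # folder contents are expanded into dataset items at store time\n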
"},{"location":"reference/tables/#synapseclient.table.EntityViewSchema","title":"EntityViewSchema
","text":" Bases: ViewBase
An EntityViewSchema is a :py:class:synapseclient.entity.Entity
that displays all files/projects (depending on user choice) within a given set of scopes.
:param name: the name of the Entity View Table object :param columns: a list of :py:class:Column
objects or their IDs. These are optional. :param parent: the project in Synapse to which this table belongs :param scopes: a list of Projects/Folders or their ids :param type: This field is deprecated. Please use includeEntityTypes
:param includeEntityTypes: a list of entity types to include in the view. Supported entity types are:
- EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n If none is provided, the view will default to include EntityViewType.FILE.\n
:param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.) Defaults to True. The default columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in the EntityViewSchema's scope. Defaults to True. The annotation columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param ignoredAnnotationColumnNames: A list of strings representing annotation names. When addAnnotationColumns is True, the names in this list will not be automatically added as columns to the EntityViewSchema if they exist in any of the defined scopes. :param properties: A map of Synapse properties :param annotations: A map of user defined annotations :param local_state: Internal use only
Example::
from synapseclient import EntityViewType\n\nproject_or_folder = syn.get(\"syn123\")\nschema = syn.store(EntityViewSchema(name='MyTable', parent=project_or_folder, scopes=[project_or_folder.id],\n includeEntityTypes=[EntityViewType.FILE]))\n
Source code in synapseclient/table.py
class EntityViewSchema(ViewBase):\n \"\"\"\n A EntityViewSchema is a :py:class:`synapseclient.entity.Entity` that displays all files/projects\n (depending on user choice) within a given set of scopes\n\n :param name: the name of the Entity View Table object\n :param columns: a list of :py:class:`Column` objects or their IDs. These are optional.\n :param parent: the project in Synapse to which this table belongs\n :param scopes: a list of Projects/Folders or their ids\n :param type: This field is deprecated. Please use `includeEntityTypes`\n :param includeEntityTypes: a list of entity types to include in the view. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n If none is provided, the view will default to include EntityViewType.FILE.\n :param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.)\n Defaults to True.\n The default columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in\n the EntityViewSchema's scope. Defaults to True.\n The annotation columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param ignoredAnnotationColumnNames: A list of strings representing annotation names.\n When addAnnotationColumns is True, the names in this list will not be\n automatically added as columns to the EntityViewSchema if they exist in any\n of the defined scopes.\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n\n from synapseclient import EntityViewType\n\n project_or_folder = syn.get(\"syn123\")\n schema = syn.store(EntityViewSchema(name='MyTable', parent=project, scopes=[project_or_folder_id, 'syn123'],\n includeEntityTypes=[EntityViewType.FILE]))\n\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.EntityView\"\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n scopes=None,\n type=None,\n includeEntityTypes=None,\n addDefaultViewColumns=True,\n addAnnotationColumns=True,\n ignoredAnnotationColumnNames=[],\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if includeEntityTypes:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask(includeEntityTypes)\n elif type:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask_for_deprecated_type(type)\n elif properties and \"type\" in properties:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask_for_deprecated_type(\n properties[\"type\"]\n )\n properties[\"type\"] = None\n\n self.ignoredAnnotationColumnNames = set(ignoredAnnotationColumnNames)\n super(EntityViewSchema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n\n # This is a hacky solution to make sure we don't try to add columns to schemas that we retrieve from synapse\n is_from_normal_constructor = not (properties or local_state)\n # allowing annotations because user might want to update annotations all at once\n self.addDefaultViewColumns = (\n addDefaultViewColumns and is_from_normal_constructor\n )\n self.addAnnotationColumns = addAnnotationColumns and is_from_normal_constructor\n\n # set default values after constructor so we don't overwrite the values defined in 
properties using .get()\n # because properties, unlike local_state, do not have nonexistent keys assigned with a value of None\n if self.get(\"viewTypeMask\") is None:\n self.viewTypeMask = EntityViewType.FILE.value\n if self.get(\"scopeIds\") is None:\n self.scopeIds = []\n\n # add the scopes last so that we can append the passed in scopes to those defined in properties\n if scopes is not None:\n self.add_scope(scopes)\n\n def set_entity_types(self, includeEntityTypes):\n \"\"\"\n :param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous\n settings. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n \"\"\"\n self.viewTypeMask = _get_view_type_mask(includeEntityTypes)\n
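A further sketch (not from the library docs; the IDs are hypothetical) of a view spanning more than one entity type::
from synapseclient import EntityViewSchema, EntityViewType\n\nschema = syn.store(EntityViewSchema(\n    name='Files and folders',\n    parent='syn123',\n    scopes=['syn456'],\n    includeEntityTypes=[EntityViewType.FILE, EntityViewType.FOLDER]))\n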
"},{"location":"reference/tables/#synapseclient.table.EntityViewSchema-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.EntityViewSchema.set_entity_types","title":"set_entity_types(includeEntityTypes)
","text":":param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous settings. Supported entity types are:
- EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n
Source code in synapseclient/table.py
def set_entity_types(self, includeEntityTypes):\n \"\"\"\n :param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous\n settings. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n \"\"\"\n self.viewTypeMask = _get_view_type_mask(includeEntityTypes)\n
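A usage sketch (hypothetical ID) for retargeting an existing view::
from synapseclient import EntityViewType\n\nview = syn.get(\"syn123\")  # an existing EntityViewSchema\nview.set_entity_types([EntityViewType.FILE, EntityViewType.FOLDER])\nview = syn.store(view)\n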
"},{"location":"reference/tables/#synapseclient.table.SubmissionViewSchema","title":"SubmissionViewSchema
","text":" Bases: ViewBase
A SubmissionViewSchema is a :py:class:synapseclient.entity.Entity
that displays all Submissions within a given set of Evaluation Queue scopes.
:param name: the name of the Entity View Table object :param columns: a list of :py:class:Column
objects or their IDs. These are optional. :param parent: the project in Synapse to which this table belongs :param scopes: a list of Evaluation Queues or their ids :param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.) Defaults to True. The default columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in the SubmissionViewSchema's scope. Defaults to True. The annotation columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param ignoredAnnotationColumnNames: A list of strings representing annotation names. When addAnnotationColumns is True, the names in this list will not be automatically added as columns to the SubmissionViewSchema if they exist in any of the defined scopes. :param properties: A map of Synapse properties :param annotations: A map of user defined annotations :param local_state: Internal use only
Example:: from synapseclient import SubmissionViewSchema
project = syn.get(\"syn123\")\nschema = syn.store(SubmissionViewSchema(name='My Submission View', parent=project, scopes=['9614543']))\n
Source code in synapseclient/table.py
class SubmissionViewSchema(ViewBase):\n \"\"\"\n A SubmissionViewSchema is a :py:class:`synapseclient.entity.Entity` that displays all files/projects\n (depending on user choice) within a given set of scopes\n\n :param name: the name of the Entity View Table object\n :param columns: a list of :py:class:`Column` objects or their IDs. These are optional.\n :param parent: the project in Synapse to which this table belongs\n :param scopes: a list of Evaluation Queues or their ids\n :param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.)\n Defaults to True.\n The default columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in\n the SubmissionViewSchema's scope. Defaults to True.\n The annotation columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param ignoredAnnotationColumnNames: A list of strings representing annotation names.\n When addAnnotationColumns is True, the names in this list will not be\n automatically added as columns to the SubmissionViewSchema if they exist in\n any of the defined scopes.\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n from synapseclient import SubmissionViewSchema\n\n project = syn.get(\"syn123\")\n schema = syn.store(SubmissionViewSchema(name='My Submission View', parent=project, scopes=['9614543']))\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.SubmissionView\"\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n scopes=None,\n addDefaultViewColumns=True,\n addAnnotationColumns=True,\n ignoredAnnotationColumnNames=[],\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n self.ignoredAnnotationColumnNames = set(ignoredAnnotationColumnNames)\n super(SubmissionViewSchema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n # This is a hacky solution to make sure we don't try to add columns to schemas that we retrieve from synapse\n is_from_normal_constructor = not (properties or local_state)\n # allowing annotations because user might want to update annotations all at once\n self.addDefaultViewColumns = (\n addDefaultViewColumns and is_from_normal_constructor\n )\n self.addAnnotationColumns = addAnnotationColumns and is_from_normal_constructor\n\n if self.get(\"scopeIds\") is None:\n self.scopeIds = []\n\n # add the scopes last so that we can append the passed in scopes to those defined in properties\n if scopes is not None:\n self.add_scope(scopes)\n
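A further sketch (illustrative; the parent ID, queue ID, and annotation name are hypothetical) showing ignoredAnnotationColumnNames::
from synapseclient import SubmissionViewSchema\n\nschema = syn.store(SubmissionViewSchema(\n    name='My Submission View',\n    parent='syn123',\n    scopes=['9614543'],\n    ignoredAnnotationColumnNames=['internal_id']))\n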
"},{"location":"reference/tables/#synapseclient.table.SelectColumn","title":"SelectColumn
","text":" Bases: DictObject
Defines a column to be used in a table :py:class:synapseclient.table.Schema.
:var id: An immutable ID issued by the platform :param columnType: Can be any of: \"STRING\", \"DOUBLE\", \"INTEGER\", \"BOOLEAN\", \"DATE\", \"FILEHANDLEID\", \"ENTITYID\" :param name: The display name of the column
:type id: string :type columnType: string :type name: string
Source code in synapseclient/table.py
class SelectColumn(DictObject):\n \"\"\"\n Defines a column to be used in a table :py:class:`synapseclient.table.Schema`.\n\n :var id: An immutable ID issued by the platform\n :param columnType: Can be any of: \"STRING\", \"DOUBLE\", \"INTEGER\", \"BOOLEAN\", \"DATE\", \"FILEHANDLEID\", \"ENTITYID\"\n :param name: The display name of the column\n\n :type id: string\n :type columnType: string\n :type name: string\n \"\"\"\n\n def __init__(self, id=None, columnType=None, name=None, **kwargs):\n super(SelectColumn, self).__init__()\n if id:\n self.id = id\n\n if name:\n self.name = name\n\n if columnType:\n self.columnType = columnType\n\n # Notes that this param is only used to support forward compatibility.\n self.update(kwargs)\n\n @classmethod\n def from_column(cls, column):\n return cls(\n column.get(\"id\", None),\n column.get(\"columnType\", None),\n column.get(\"name\", None),\n )\n
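A usage sketch of from_column (assumes syn.getTableColumns, which yields the Column models of an existing table; syn123 is hypothetical)::
headers = [SelectColumn.from_column(col) for col in syn.getTableColumns(\"syn123\")]\n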
"},{"location":"reference/tables/#synapseclient.table.Column","title":"Column
","text":" Bases: DictObject
Defines a column to be used in a table :py:class:synapseclient.table.Schema or :py:class:synapseclient.table.EntityViewSchema.
:var id: An immutable ID issued by the platform :param columnType: The column type determines the type of data that can be stored in a column. It can be any of: "STRING", "DOUBLE", "INTEGER", "BOOLEAN", "DATE", "FILEHANDLEID", "ENTITYID", "LINK", "LARGETEXT", "USERID". For more information, please see: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/ColumnType.html :param maximumSize: A parameter for columnTypes with a maximum size. For example, ColumnType.STRINGs have a default maximum size of 50 characters, but can be set to a maximumSize of 1 to 1000 characters. :param maximumListLength: Required if using a columnType with a "_LIST" suffix. Describes the maximum number of values that will appear in that list. Value range 1-100 inclusive. Default 100 :param name: The display name of the column :param enumValues: Columns of type STRING can be constrained to an enumeration of values set in this list. :param defaultValue: The default value for this column. Columns of type FILEHANDLEID and ENTITYID are not allowed to have default values.
:type id: string :type maximumSize: integer :type maximumListLength: integer :type columnType: string :type name: string :type enumValues: array of strings :type defaultValue: string
Source code in synapseclient/table.py
class Column(DictObject):\n \"\"\"\n Defines a column to be used in a table :py:class:`synapseclient.table.Schema`\n :py:class:`synapseclient.table.EntityViewSchema`.\n\n :var id: An immutable ID issued by the platform\n :param columnType: The column type determines the type of data that can be stored in a column. It can be any\n of: \"STRING\", \"DOUBLE\", \"INTEGER\", \"BOOLEAN\", \"DATE\", \"FILEHANDLEID\", \"ENTITYID\", \"LINK\",\n \"LARGETEXT\", \"USERID\". For more information, please see:\n https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/ColumnType.html\n :param maximumSize: A parameter for columnTypes with a maximum size. For example, ColumnType.STRINGs have a\n default maximum size of 50 characters, but can be set to a maximumSize of 1 to 1000\n characters.\n :param maximumListLength: Required if using a columnType with a \"_LIST\" suffix. Describes the maximum number of\n values that will appear in that list. Value range 1-100 inclusive. Default 100\n :param name: The display name of the column\n :param enumValues: Columns type of STRING can be constrained to an enumeration values set on this list.\n :param defaultValue: The default value for this column. Columns of type FILEHANDLEID and ENTITYID are not\n allowed to have default values.\n\n :type id: string\n :type maximumSize: integer\n :type maximumListLength: integer\n :type columnType: string\n :type name: string\n :type enumValues: array of strings\n :type defaultValue: string\n \"\"\"\n\n @classmethod\n def getURI(cls, id):\n return \"/column/%s\" % id\n\n def __init__(self, **kwargs):\n super(Column, self).__init__(kwargs)\n self[\"concreteType\"] = concrete_types.COLUMN_MODEL\n\n def postURI(self):\n return \"/column\"\n
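A minimal sketch (hypothetical column names and parent ID) of defining columns and storing them with a Schema::
from synapseclient import Column, Schema\n\ncols = [\n    Column(name='genes', columnType='STRING', maximumSize=64),\n    Column(name='score', columnType='DOUBLE'),\n    Column(name='valid', columnType='BOOLEAN'),\n]\nschema = syn.store(Schema(name='My Table', columns=cols, parent='syn123'))\n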
"},{"location":"reference/tables/#synapseclient.table.AppendableRowset","title":"AppendableRowset
","text":" Bases: DictObject
Abstract Base Class for :py:class:RowSet
and :py:class:PartialRowset
Source code in synapseclient/table.py
class AppendableRowset(DictObject, metaclass=abc.ABCMeta):\n \"\"\"Abstract Base Class for :py:class:`Rowset` and :py:class:`PartialRowset`\"\"\"\n\n @abc.abstractmethod\n def __init__(self, schema, **kwargs):\n if (\"tableId\" not in kwargs) and schema:\n kwargs[\"tableId\"] = id_of(schema)\n\n if not kwargs.get(\"tableId\", None):\n raise ValueError(\n \"Table schema ID must be defined to create a %s\" % type(self).__name__\n )\n super(AppendableRowset, self).__init__(kwargs)\n\n def _synapse_store(self, syn):\n \"\"\"\n Creates and POSTs an AppendableRowSetRequest_\n\n .. AppendableRowSetRequest:\n https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/AppendableRowSetRequest.html\n \"\"\"\n append_rowset_request = {\n \"concreteType\": concrete_types.APPENDABLE_ROWSET_REQUEST,\n \"toAppend\": self,\n \"entityId\": self.tableId,\n }\n\n response = syn._async_table_update(\n self.tableId, [append_rowset_request], wait=True\n )\n syn._check_table_transaction_response(response)\n return response[\"results\"][0]\n
"},{"location":"reference/tables/#synapseclient.table.PartialRowset","title":"PartialRowset
","text":" Bases: AppendableRowset
A set of Partial Rows used for updating cells of a table. PartialRowsets allow you to push only the individual cells you wish to change instead of pushing entire rows with many unchanged cells.
Example:: the following code will change cells in a hypothetical table, syn123. These same steps will also work for using EntityView tables to change Entity annotations.
# fooCol | barCol              fooCol   | barCol\n# ----------------- =======>  ----------------------\n# foo1   | bar1                foo foo1 | bar1\n# foo2   | bar2                foo2     | bar bar 2\n
query_results = syn.tableQuery(\"SELECT * FROM syn123\")\n\n# The easiest way to know the rowId of the row you wish to change\n# is by converting the table to a pandas DataFrame with rowIdAndVersionInIndex=False\ndf = query_results.asDataFrame(rowIdAndVersionInIndex=False)\n\npartial_changes = {df['ROW_ID'][0]: {'fooCol': 'foo foo 1'},\n df['ROW_ID'][1]: {'barCol': 'bar bar 2'}}\n\n# you will need to pass in your original query result as an argument\n# so that we can perform column id translation and etag retrieval on your behalf:\npartial_rowset = PartialRowset.from_mapping(partial_changes, query_results)\nsyn.store(partial_rowset)\n
:param schema: The :py:class:Schema
of the table to update or its tableId as a string :param rows: A list of PartialRows
Source code in synapseclient/table.py
class PartialRowset(AppendableRowset):\n \"\"\"A set of Partial Rows used for updating cells of a table.\n PartialRowsets allow you to push only the individual cells you wish to change instead of pushing entire rows with\n many unchanged cells.\n\n Example::\n #### the following code will change cells in a hypothetical table, syn123:\n #### these same steps will also work for using EntityView tables to change Entity annotations\n #\n # fooCol | barCol fooCol | barCol\n # ----------------- =======> ----------------------\n # foo1 | bar1 foo foo1 | bar1\n # foo2 | bar2 foo2 | bar bar 2\n\n query_results = syn.tableQuery(\"SELECT * FROM syn123\")\n\n # The easiest way to know the rowId of the row you wish to change\n # is by converting the table to a pandas DataFrame with rowIdAndVersionInIndex=False\n df = query_results.asDataFrame(rowIdAndVersionInIndex=False)\n\n partial_changes = {df['ROW_ID'][0]: {'fooCol': 'foo foo 1'},\n df['ROW_ID'][1]: {'barCol': 'bar bar 2'}}\n\n # you will need to pass in your original query result as an argument\n # so that we can perform column id translation and etag retrieval on your behalf:\n partial_rowset = PartialRowset.from_mapping(partial_changes, query_results)\n syn.store(partial_rowset)\n\n :param schema: The :py:class:`Schema` of the table to update or its tableId as a string\n :param rows: A list of PartialRows\n \"\"\"\n\n @classmethod\n def from_mapping(cls, mapping, originalQueryResult):\n \"\"\"Creates a PartialRowset\n :param mapping: A mapping of mappings in the structure: {ROW_ID : {COLUMN_NAME: NEW_COL_VALUE}}\n :param originalQueryResult:\n :return: a PartialRowSet that can be syn.store()-ed to apply the changes\n \"\"\"\n if not isinstance(mapping, collections.abc.Mapping):\n raise ValueError(\"mapping must be a supported Mapping type such as 'dict'\")\n\n try:\n name_to_column_id = {\n col.name: col.id for col in originalQueryResult.headers if \"id\" in col\n }\n except AttributeError:\n raise ValueError(\n \"originalQueryResult must be the result of a syn.tableQuery()\"\n )\n\n row_ids = set(int(id) for id in mapping.keys())\n\n # row_ids in the originalQueryResult are not guaranteed to be in ascending order\n # iterate over all etags but only map the row_ids used for this partial update to their etags\n row_etags = {\n row_id: etag\n for row_id, row_version, etag in originalQueryResult.iter_row_metadata()\n if row_id in row_ids and etag is not None\n }\n\n partial_rows = [\n PartialRow(\n row_changes,\n row_id,\n etag=row_etags.get(int(row_id)),\n nameToColumnId=name_to_column_id,\n )\n for row_id, row_changes in mapping.items()\n ]\n\n return cls(originalQueryResult.tableId, partial_rows)\n\n def __init__(self, schema, rows):\n super(PartialRowset, self).__init__(schema)\n self.concreteType = concrete_types.PARTIAL_ROW_SET\n\n if isinstance(rows, PartialRow):\n self.rows = [rows]\n else:\n try:\n if all(isinstance(row, PartialRow) for row in rows):\n self.rows = list(rows)\n else:\n raise ValueError(\"rows must contain only values of type PartialRow\")\n except TypeError:\n raise ValueError(\"rows must be iterable\")\n
"},{"location":"reference/tables/#synapseclient.table.PartialRowset-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.PartialRowset.from_mapping","title":"from_mapping(mapping, originalQueryResult)
classmethod
","text":"Creates a PartialRowset :param mapping: A mapping of mappings in the structure: {ROW_ID : {COLUMN_NAME: NEW_COL_VALUE}} :param originalQueryResult: :return: a PartialRowSet that can be syn.store()-ed to apply the changes
Source code in synapseclient/table.py
@classmethod\ndef from_mapping(cls, mapping, originalQueryResult):\n \"\"\"Creates a PartialRowset\n :param mapping: A mapping of mappings in the structure: {ROW_ID : {COLUMN_NAME: NEW_COL_VALUE}}\n :param originalQueryResult:\n :return: a PartialRowSet that can be syn.store()-ed to apply the changes\n \"\"\"\n if not isinstance(mapping, collections.abc.Mapping):\n raise ValueError(\"mapping must be a supported Mapping type such as 'dict'\")\n\n try:\n name_to_column_id = {\n col.name: col.id for col in originalQueryResult.headers if \"id\" in col\n }\n except AttributeError:\n raise ValueError(\n \"originalQueryResult must be the result of a syn.tableQuery()\"\n )\n\n row_ids = set(int(id) for id in mapping.keys())\n\n # row_ids in the originalQueryResult are not guaranteed to be in ascending order\n # iterate over all etags but only map the row_ids used for this partial update to their etags\n row_etags = {\n row_id: etag\n for row_id, row_version, etag in originalQueryResult.iter_row_metadata()\n if row_id in row_ids and etag is not None\n }\n\n partial_rows = [\n PartialRow(\n row_changes,\n row_id,\n etag=row_etags.get(int(row_id)),\n nameToColumnId=name_to_column_id,\n )\n for row_id, row_changes in mapping.items()\n ]\n\n return cls(originalQueryResult.tableId, partial_rows)\n
"},{"location":"reference/tables/#synapseclient.table.RowSet","title":"RowSet
","text":" Bases: AppendableRowset
A Synapse object of type org.sagebionetworks.repo.model.table.RowSet (https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/RowSet.html).
:param schema: A :py:class:synapseclient.table.Schema
object that will be used to set the tableId :param headers: The list of SelectColumn objects that describe the fields in each row. :param columns: An alternative to 'headers', a list of column objects that describe the fields in each row. :param tableId: The ID of the TableEntity that owns these rows :param rows: The :py:class:synapseclient.table.Row
s of this set. The index of each row value aligns with the index of each header. :var etag: Any RowSet returned from Synapse will contain the current etag of the change set. To update any rows from a RowSet the etag must be provided with the POST.
:type headers: array of SelectColumns :type etag: string :type tableId: string :type rows: array of rows
Source code in synapseclient/table.py
class RowSet(AppendableRowset):\n \"\"\"\n A Synapse object of type `org.sagebionetworks.repo.model.table.RowSet \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/RowSet.html>`_.\n\n :param schema: A :py:class:`synapseclient.table.Schema` object that will be used to set the tableId\n :param headers: The list of SelectColumn objects that describe the fields in each row.\n :param columns: An alternative to 'headers', a list of column objects that describe the fields in each row.\n :param tableId: The ID of the TableEntity that owns these rows\n :param rows: The :py:class:`synapseclient.table.Row` s of this set. The index of each row value aligns with the\n index of each header.\n :var etag: Any RowSet returned from Synapse will contain the current etag of the change set. To update any\n rows from a RowSet the etag must be provided with the POST.\n\n :type headers: array of SelectColumns\n :type etag: string\n :type tableId: string\n :type rows: array of rows\n \"\"\"\n\n @classmethod\n def from_json(cls, json):\n headers = [SelectColumn(**header) for header in json.get(\"headers\", [])]\n rows = [cast_row(Row(**row), headers) for row in json.get(\"rows\", [])]\n return cls(\n headers=headers,\n rows=rows,\n **{key: json[key] for key in json.keys() if key not in [\"headers\", \"rows\"]},\n )\n\n def __init__(self, columns=None, schema=None, **kwargs):\n if \"headers\" not in kwargs:\n if columns and schema:\n raise ValueError(\n \"Please only user either 'columns' or 'schema' as an argument but not both.\"\n )\n if columns:\n kwargs.setdefault(\"headers\", []).extend(\n [SelectColumn.from_column(column) for column in columns]\n )\n elif schema and isinstance(schema, Schema):\n kwargs.setdefault(\"headers\", []).extend(\n [SelectColumn(id=id) for id in schema[\"columnIds\"]]\n )\n\n if not kwargs.get(\"headers\", None):\n raise ValueError(\"Column headers must be defined to create a RowSet\")\n kwargs[\"concreteType\"] = \"org.sagebionetworks.repo.model.table.RowSet\"\n\n super(RowSet, self).__init__(schema, **kwargs)\n\n def _synapse_store(self, syn):\n response = super(RowSet, self)._synapse_store(syn)\n return response.get(\"rowReferenceSet\", response)\n\n def _synapse_delete(self, syn):\n \"\"\"\n Delete the rows in the RowSet.\n Example::\n syn.delete(syn.tableQuery('select name from %s where no_good = true' % schema1.id))\n \"\"\"\n row_id_vers_generator = ((row.rowId, row.versionNumber) for row in self.rows)\n _delete_rows(syn, self.tableId, row_id_vers_generator)\n
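A minimal append sketch (hypothetical table and values; the order of each row's values must match the schema's column order)::
from synapseclient import Row, RowSet\n\nschema = syn.get(\"syn123\")  # an existing table Schema\nrows = [Row(['Alice', 81]), Row(['Bob', 67])]\nsyn.store(RowSet(schema=schema, rows=rows))\n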
"},{"location":"reference/tables/#synapseclient.table.Row","title":"Row
","text":" Bases: DictObject
A row (https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/Row.html) in a Table.
:param values: A list of values :param rowId: The immutable ID issued to a new row :param versionNumber: The version number of this row. Each row version is immutable, so when a row is updated a new version is created.
Source code in synapseclient/table.py
class Row(DictObject):\n \"\"\"\n A `row <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/Row.html>`_ in a Table.\n\n :param values: A list of values\n :param rowId: The immutable ID issued to a new row\n :param versionNumber: The version number of this row. Each row version is immutable, so when a row is updated a\n new version is created.\n \"\"\"\n\n def __init__(self, values, rowId=None, versionNumber=None, etag=None, **kwargs):\n super(Row, self).__init__()\n self.values = values\n if rowId is not None:\n self.rowId = rowId\n if versionNumber is not None:\n self.versionNumber = versionNumber\n if etag is not None:\n self.etag = etag\n\n # Notes that this param is only used to support forward compatibility.\n self.update(kwargs)\n
"},{"location":"reference/tables/#synapseclient.table.PartialRow","title":"PartialRow
","text":" Bases: DictObject
This is a lower-level class for use in :py:class:PartialRowset
to update individual cells within a table.
It is recommended you use :py:meth:PartialRowset.from_mapping
to construct partial change sets to a table.
If you want to do the tedious parts yourself:
To change cells in the "foo"(colId:123) and "bar"(colId:456) columns of a row with rowId=5 :: rowId = 5
#pass in with columnIds as key:\nPartialRow({123: 'fooVal', 456:'barVal'}, rowId)\n\n#pass in with a nameToColumnId argument\n\n#manually define:\nnameToColumnId = {'foo':123, 'bar':456}\n#OR if you have the result of a tableQuery() you can generate nameToColumnId using:\nquery_result = syn.tableQuery(\"SELECT * FROM syn123\")\nnameToColumnId = {col.name:col.id for col in query_result.headers}\n\nPartialRow({'foo': 'fooVal', 'bar':'barVal'}, rowId, nameToColumnId=nameToColumnId)\n
:param values: A Mapping where: - key is name of the column (or its columnId) to change in the desired row - value is the new desired value for that column :param rowId: The id of the row to be updated :param etag: used for updating File/Project Views (:py:class:EntityViewSchema
). Not necessary for a (:py:class:Schema
) Table :param nameToColumnId: Optional map of column names to column Ids. If this is provided, the keys of your values
Mapping will be replaced with the column ids in the nameToColumnId
dict. Include this as an argument when you are providing the column names instead of columnIds as the keys to the values
Mapping.
Source code in synapseclient/table.py
class PartialRow(DictObject):\n \"\"\"This is a lower-level class for use in :py:class::`PartialRowSet` to update individual cells within a table.\n\n It is recommended you use :py:classmethod::`PartialRowSet.from_mapping`to construct partial change sets to a table.\n\n If you want to do the tedious parts yourself:\n\n To change cells in the \"foo\"(colId:1234) and \"bar\"(colId:456) columns of a row with rowId=5 ::\n rowId = 5\n\n #pass in with columnIds as key:\n PartialRow({123: 'fooVal', 456:'barVal'}, rowId)\n\n #pass in with a nameToColumnId argument\n\n #manually define:\n nameToColumnId = {'foo':123, 'bar':456}\n #OR if you have the result of a tableQuery() you can generate nameToColumnId using:\n query_result = syn.tableQuery(\"SELECT * FROM syn123\")\n nameToColumnId = {col.name:col.id for col in query_result.headers}\n\n PartialRow({'foo': 'fooVal', 'bar':'barVal'}, rowId, nameToColumnId=nameToColumnId)\n\n :param values: A Mapping where:\n - key is name of the column (or its columnId) to change in the desired row\n - value is the new desired value for that column\n :param rowId: The id of the row to be updated\n :param etag: used for updating File/Project Views(::py:class:`EntityViewSchema`). Not necessary for a\n (::py:class:`Schema`) Table\n :param nameToColumnId: Optional map column names to column Ids. If this is provided, the keys of your `values`\n Mapping will be replaced with the column ids in the `nameToColumnId` dict. Include this\n as an argument when you are providing the column names instead of columnIds as the keys\n to the `values` Mapping.\n\n \"\"\"\n\n def __init__(self, values, rowId, etag=None, nameToColumnId=None):\n super(PartialRow, self).__init__()\n if not isinstance(values, collections.abc.Mapping):\n raise ValueError(\"values must be a Mapping\")\n\n rowId = int(rowId)\n\n self.values = [\n {\n \"key\": nameToColumnId[x_key] if nameToColumnId is not None else x_key,\n \"value\": x_value,\n }\n for x_key, x_value in values.items()\n ]\n self.rowId = rowId\n if etag is not None:\n self.etag = etag\n
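A sketch of storing hand-built PartialRows for a plain table (column id 123 and the row ids are hypothetical; views would also need each row's etag). They must be wrapped in a PartialRowset::
rows = [PartialRow({123: 'foo foo 1'}, rowId=5),\n        PartialRow({123: 'foo foo 2'}, rowId=6)]\nsyn.store(PartialRowset(\"syn123\", rows))\n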
"},{"location":"reference/tables/#synapseclient.table.TableAbstractBaseClass","title":"TableAbstractBaseClass
","text":" Bases: Iterable
, Sized
Abstract base class for Tables based on different data containers.
Source code in synapseclient/table.py
class TableAbstractBaseClass(collections.abc.Iterable, collections.abc.Sized):\n \"\"\"\n Abstract base class for Tables based on different data containers.\n \"\"\"\n\n RowMetadataTuple = collections.namedtuple(\n \"RowMetadataTuple\", [\"row_id\", \"row_version\", \"row_etag\"]\n )\n\n def __init__(self, schema, headers=None, etag=None):\n if isinstance(schema, Schema):\n self.schema = schema\n self.tableId = schema.id if schema and \"id\" in schema else None\n self.headers = (\n headers if headers else [SelectColumn(id=id) for id in schema.columnIds]\n )\n self.etag = etag\n elif isinstance(schema, str):\n self.schema = None\n self.tableId = schema\n self.headers = headers\n self.etag = etag\n else:\n ValueError(\"Must provide a schema or a synapse ID of a Table Entity\")\n\n def asDataFrame(self):\n raise NotImplementedError()\n\n def asRowSet(self):\n return RowSet(\n headers=self.headers,\n tableId=self.tableId,\n etag=self.etag,\n rows=[row if isinstance(row, Row) else Row(row) for row in self],\n )\n\n def _synapse_store(self, syn):\n raise NotImplementedError()\n\n def _synapse_delete(self, syn):\n \"\"\"\n Delete the rows that result from a table query.\n\n Example::\n syn.delete(syn.tableQuery('select name from %s where no_good = true' % schema1.id))\n \"\"\"\n row_id_vers_generator = (\n (metadata.row_id, metadata.row_version)\n for metadata in self.iter_row_metadata()\n )\n _delete_rows(syn, self.tableId, row_id_vers_generator)\n\n @abc.abstractmethod\n def iter_row_metadata(self):\n \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will\n generated as (row_id, None)\n\n :return: a generator that gives :py:class::`collections.namedtuple` with format (row_id, row_etag)\n \"\"\"\n pass\n
"},{"location":"reference/tables/#synapseclient.table.TableAbstractBaseClass-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.TableAbstractBaseClass.iter_row_metadata","title":"iter_row_metadata()
abstractmethod
","text":"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will generated as (row_id, None)
:return: a generator that gives :py:class::collections.namedtuple
with format (row_id, row_version, row_etag)
Source code in synapseclient/table.py
@abc.abstractmethod\ndef iter_row_metadata(self):\n \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will\n generated as (row_id, None)\n\n :return: a generator that gives :py:class::`collections.namedtuple` with format (row_id, row_etag)\n \"\"\"\n pass\n
"},{"location":"reference/tables/#synapseclient.table.RowSetTable","title":"RowSetTable
","text":" Bases: TableAbstractBaseClass
A Table object that wraps a RowSet.
Source code in synapseclient/table.py
class RowSetTable(TableAbstractBaseClass):\n \"\"\"\n A Table object that wraps a RowSet.\n \"\"\"\n\n def __init__(self, schema, rowset):\n super(RowSetTable, self).__init__(schema, etag=rowset.get(\"etag\", None))\n self.rowset = rowset\n\n def _synapse_store(self, syn):\n row_reference_set = syn.store(self.rowset)\n return RowSetTable(self.schema, row_reference_set)\n\n def asDataFrame(self):\n test_import_pandas()\n import pandas as pd\n\n if any([row[\"rowId\"] for row in self.rowset[\"rows\"]]):\n rownames = row_labels_from_rows(self.rowset[\"rows\"])\n else:\n rownames = None\n\n series = collections.OrderedDict()\n for i, header in enumerate(self.rowset[\"headers\"]):\n series[header.name] = pd.Series(\n name=header.name,\n data=[row[\"values\"][i] for row in self.rowset[\"rows\"]],\n index=rownames,\n )\n\n return pd.DataFrame(data=series, index=rownames)\n\n def asRowSet(self):\n return self.rowset\n\n def __iter__(self):\n def iterate_rows(rows, headers):\n for row in rows:\n yield cast_values(row, headers)\n\n return iterate_rows(self.rowset[\"rows\"], self.rowset[\"headers\"])\n\n def __len__(self):\n return len(self.rowset[\"rows\"])\n\n def iter_row_metadata(self):\n raise NotImplementedError(\"iter_metadata is not supported for RowSetTable\")\n
"},{"location":"reference/tables/#synapseclient.table.TableQueryResult","title":"TableQueryResult
","text":" Bases: TableAbstractBaseClass
An object to wrap rows returned as a result of a table query. The TableQueryResult object can be used to iterate over results of a query.
Example ::
results = syn.tableQuery(\"select * from syn1234\")\nfor row in results:\n print(row)\n
Source code in synapseclient/table.py
class TableQueryResult(TableAbstractBaseClass):\n \"\"\"\n An object to wrap rows returned as a result of a table query.\n The TableQueryResult object can be used to iterate over results of a query.\n\n Example ::\n\n results = syn.tableQuery(\"select * from syn1234\")\n for row in results:\n print(row)\n \"\"\"\n\n def __init__(self, synapse, query, limit=None, offset=None, isConsistent=True):\n self.syn = synapse\n\n self.query = query\n self.limit = limit\n self.offset = offset\n self.isConsistent = isConsistent\n\n result = self.syn._queryTable(\n query=query, limit=limit, offset=offset, isConsistent=isConsistent\n )\n\n self.rowset = RowSet.from_json(result[\"queryResult\"][\"queryResults\"])\n\n self.columnModels = [Column(**col) for col in result.get(\"columnModels\", [])]\n self.nextPageToken = result[\"queryResult\"].get(\"nextPageToken\", None)\n self.count = result.get(\"queryCount\", None)\n self.maxRowsPerPage = result.get(\"maxRowsPerPage\", None)\n self.i = -1\n\n super(TableQueryResult, self).__init__(\n schema=self.rowset.get(\"tableId\", None),\n headers=self.rowset.headers,\n etag=self.rowset.get(\"etag\", None),\n )\n\n def _synapse_store(self, syn):\n raise SynapseError(\n \"A TableQueryResult is a read only object and can't be stored in Synapse. Convert to a\"\n \" DataFrame or RowSet instead.\"\n )\n\n def asDataFrame(self, rowIdAndVersionInIndex=True):\n \"\"\"\n Convert query result to a Pandas DataFrame.\n\n :param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version (and row_etag\n if it exists)\n\n \"\"\"\n test_import_pandas()\n import pandas as pd\n\n # To turn a TableQueryResult into a data frame, we add a page of rows\n # at a time on the untested theory that it's more efficient than\n # adding a single row at a time to the data frame.\n\n def construct_rownames(rowset, offset=0):\n try:\n return (\n row_labels_from_rows(rowset[\"rows\"])\n if rowIdAndVersionInIndex\n else None\n )\n except KeyError:\n # if we don't have row id and version, just number the rows\n # python3 cast range to list for safety\n return list(range(offset, offset + len(rowset[\"rows\"])))\n\n # first page of rows\n offset = 0\n rownames = construct_rownames(self.rowset, offset)\n offset += len(self.rowset[\"rows\"])\n series = collections.OrderedDict()\n\n if not rowIdAndVersionInIndex:\n # Since we use an OrderedDict this must happen before we construct the other columns\n # add row id, verison, and etag as rows\n append_etag = False # only useful when (not rowIdAndVersionInIndex), hooray for lazy variables!\n series[\"ROW_ID\"] = pd.Series(\n name=\"ROW_ID\", data=[row[\"rowId\"] for row in self.rowset[\"rows\"]]\n )\n series[\"ROW_VERSION\"] = pd.Series(\n name=\"ROW_VERSION\",\n data=[row[\"versionNumber\"] for row in self.rowset[\"rows\"]],\n )\n\n row_etag = [row.get(\"etag\") for row in self.rowset[\"rows\"]]\n if any(row_etag):\n append_etag = True\n series[\"ROW_ETAG\"] = pd.Series(name=\"ROW_ETAG\", data=row_etag)\n\n for i, header in enumerate(self.rowset[\"headers\"]):\n column_name = header.name\n series[column_name] = pd.Series(\n name=column_name,\n data=[row[\"values\"][i] for row in self.rowset[\"rows\"]],\n index=rownames,\n )\n\n # subsequent pages of rows\n while self.nextPageToken:\n result = self.syn._queryTableNext(self.nextPageToken, self.tableId)\n self.rowset = RowSet.from_json(result[\"queryResults\"])\n self.nextPageToken = result.get(\"nextPageToken\", None)\n self.i = 0\n\n rownames = construct_rownames(self.rowset, 
offset)\n offset += len(self.rowset[\"rows\"])\n\n if not rowIdAndVersionInIndex:\n # TODO: Look into why this isn't being assigned\n series[\"ROW_ID\"].append(\n pd.Series(\n name=\"ROW_ID\", data=[row[\"id\"] for row in self.rowset[\"rows\"]]\n )\n )\n series[\"ROW_VERSION\"].append(\n pd.Series(\n name=\"ROW_VERSION\",\n data=[row[\"version\"] for row in self.rowset[\"rows\"]],\n )\n )\n if append_etag:\n series[\"ROW_ETAG\"] = pd.Series(\n name=\"ROW_ETAG\",\n data=[row.get(\"etag\") for row in self.rowset[\"rows\"]],\n )\n\n for i, header in enumerate(self.rowset[\"headers\"]):\n column_name = header.name\n series[column_name] = pd.concat(\n [\n series[column_name],\n pd.Series(\n name=column_name,\n data=[row[\"values\"][i] for row in self.rowset[\"rows\"]],\n index=rownames,\n ),\n ],\n # can't verify integrity when indices are just numbers instead of 'rowid_rowversion'\n verify_integrity=rowIdAndVersionInIndex,\n )\n\n return pd.DataFrame(data=series)\n\n def asRowSet(self):\n # Note that as of stack 60, an empty query will omit the headers field\n # see PLFM-3014\n return RowSet(\n headers=self.headers,\n tableId=self.tableId,\n etag=self.etag,\n rows=[row for row in self],\n )\n\n def __iter__(self):\n return self\n\n def next(self):\n \"\"\"\n Python 2 iterator\n \"\"\"\n self.i += 1\n if self.i >= len(self.rowset[\"rows\"]):\n if self.nextPageToken:\n result = self.syn._queryTableNext(self.nextPageToken, self.tableId)\n self.rowset = RowSet.from_json(result[\"queryResults\"])\n self.nextPageToken = result.get(\"nextPageToken\", None)\n self.i = 0\n else:\n raise StopIteration()\n return self.rowset[\"rows\"][self.i]\n\n def __next__(self):\n \"\"\"\n Python 3 iterator\n \"\"\"\n return self.next()\n\n def __len__(self):\n return len(self.rowset[\"rows\"])\n\n def iter_row_metadata(self):\n \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will\n generated as (row_id, row_version,None)\n\n :return: a generator that gives :py:class::`collections.namedtuple` with format (row_id, row_version, row_etag)\n \"\"\"\n for row in self:\n yield type(self).RowMetadataTuple(\n int(row[\"rowId\"]), int(row[\"versionNumber\"]), row.get(\"etag\")\n )\n
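A further sketch (hypothetical table ID): results page transparently while iterating, and count metadata rides along when the service returns it::
results = syn.tableQuery(\"select * from syn1234\")\nprint(results.tableId, results.count)  # count may be None\nfor row in results:\n    print(row['values'])\n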
"},{"location":"reference/tables/#synapseclient.table.TableQueryResult-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.TableQueryResult.asDataFrame","title":"asDataFrame(rowIdAndVersionInIndex=True)
","text":"Convert query result to a Pandas DataFrame.
:param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version (and row_etag if it exists)
Source code in synapseclient/table.py
def asDataFrame(self, rowIdAndVersionInIndex=True):\n \"\"\"\n Convert query result to a Pandas DataFrame.\n\n :param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version (and row_etag\n if it exists)\n\n \"\"\"\n test_import_pandas()\n import pandas as pd\n\n # To turn a TableQueryResult into a data frame, we add a page of rows\n # at a time on the untested theory that it's more efficient than\n # adding a single row at a time to the data frame.\n\n def construct_rownames(rowset, offset=0):\n try:\n return (\n row_labels_from_rows(rowset[\"rows\"])\n if rowIdAndVersionInIndex\n else None\n )\n except KeyError:\n # if we don't have row id and version, just number the rows\n # python3 cast range to list for safety\n return list(range(offset, offset + len(rowset[\"rows\"])))\n\n # first page of rows\n offset = 0\n rownames = construct_rownames(self.rowset, offset)\n offset += len(self.rowset[\"rows\"])\n series = collections.OrderedDict()\n\n if not rowIdAndVersionInIndex:\n # Since we use an OrderedDict this must happen before we construct the other columns\n # add row id, verison, and etag as rows\n append_etag = False # only useful when (not rowIdAndVersionInIndex), hooray for lazy variables!\n series[\"ROW_ID\"] = pd.Series(\n name=\"ROW_ID\", data=[row[\"rowId\"] for row in self.rowset[\"rows\"]]\n )\n series[\"ROW_VERSION\"] = pd.Series(\n name=\"ROW_VERSION\",\n data=[row[\"versionNumber\"] for row in self.rowset[\"rows\"]],\n )\n\n row_etag = [row.get(\"etag\") for row in self.rowset[\"rows\"]]\n if any(row_etag):\n append_etag = True\n series[\"ROW_ETAG\"] = pd.Series(name=\"ROW_ETAG\", data=row_etag)\n\n for i, header in enumerate(self.rowset[\"headers\"]):\n column_name = header.name\n series[column_name] = pd.Series(\n name=column_name,\n data=[row[\"values\"][i] for row in self.rowset[\"rows\"]],\n index=rownames,\n )\n\n # subsequent pages of rows\n while self.nextPageToken:\n result = self.syn._queryTableNext(self.nextPageToken, self.tableId)\n self.rowset = RowSet.from_json(result[\"queryResults\"])\n self.nextPageToken = result.get(\"nextPageToken\", None)\n self.i = 0\n\n rownames = construct_rownames(self.rowset, offset)\n offset += len(self.rowset[\"rows\"])\n\n if not rowIdAndVersionInIndex:\n # TODO: Look into why this isn't being assigned\n series[\"ROW_ID\"].append(\n pd.Series(\n name=\"ROW_ID\", data=[row[\"id\"] for row in self.rowset[\"rows\"]]\n )\n )\n series[\"ROW_VERSION\"].append(\n pd.Series(\n name=\"ROW_VERSION\",\n data=[row[\"version\"] for row in self.rowset[\"rows\"]],\n )\n )\n if append_etag:\n series[\"ROW_ETAG\"] = pd.Series(\n name=\"ROW_ETAG\",\n data=[row.get(\"etag\") for row in self.rowset[\"rows\"]],\n )\n\n for i, header in enumerate(self.rowset[\"headers\"]):\n column_name = header.name\n series[column_name] = pd.concat(\n [\n series[column_name],\n pd.Series(\n name=column_name,\n data=[row[\"values\"][i] for row in self.rowset[\"rows\"]],\n index=rownames,\n ),\n ],\n # can't verify integrity when indices are just numbers instead of 'rowid_rowversion'\n verify_integrity=rowIdAndVersionInIndex,\n )\n\n return pd.DataFrame(data=series)\n
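A usage sketch (hypothetical table ID) of the two index modes; a fresh query is used for each conversion because iteration consumes the result pages::
results = syn.tableQuery(\"select * from syn1234\")\ndf = results.asDataFrame()  # index is 'rowid_rowversion' (plus etag when present)\ndf2 = syn.tableQuery(\"select * from syn1234\").asDataFrame(rowIdAndVersionInIndex=False)\nprint(df2[['ROW_ID', 'ROW_VERSION']].head())  # metadata exposed as ordinary columns\n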
"},{"location":"reference/tables/#synapseclient.table.TableQueryResult.next","title":"next()
","text":"Python 2 iterator
Source code in synapseclient/table.py
def next(self):\n \"\"\"\n Python 2 iterator\n \"\"\"\n self.i += 1\n if self.i >= len(self.rowset[\"rows\"]):\n if self.nextPageToken:\n result = self.syn._queryTableNext(self.nextPageToken, self.tableId)\n self.rowset = RowSet.from_json(result[\"queryResults\"])\n self.nextPageToken = result.get(\"nextPageToken\", None)\n self.i = 0\n else:\n raise StopIteration()\n return self.rowset[\"rows\"][self.i]\n
"},{"location":"reference/tables/#synapseclient.table.TableQueryResult.iter_row_metadata","title":"iter_row_metadata()
","text":"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will generated as (row_id, row_version,None)
:return: a generator that gives :py:class::collections.namedtuple
with format (row_id, row_version, row_etag)
Source code in synapseclient/table.py
def iter_row_metadata(self):\n \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will\n generated as (row_id, row_version,None)\n\n :return: a generator that gives :py:class::`collections.namedtuple` with format (row_id, row_version, row_etag)\n \"\"\"\n for row in self:\n yield type(self).RowMetadataTuple(\n int(row[\"rowId\"]), int(row[\"versionNumber\"]), row.get(\"etag\")\n )\n
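A usage sketch (hypothetical table ID)::
results = syn.tableQuery(\"select * from syn1234\")\nfor row_id, row_version, row_etag in results.iter_row_metadata():\n    print(row_id, row_version, row_etag)  # row_etag is None for plain tables\n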
"},{"location":"reference/tables/#synapseclient.table.CsvFileTable","title":"CsvFileTable
","text":" Bases: TableAbstractBaseClass
An object to wrap a CSV file that may be stored into a Synapse table or returned as a result of a table query.
Source code in synapseclient/table.py
class CsvFileTable(TableAbstractBaseClass):\n \"\"\"\n An object to wrap a CSV file that may be stored into a Synapse table or\n returned as a result of a table query.\n \"\"\"\n\n @classmethod\n def from_table_query(\n cls,\n synapse,\n query,\n quoteCharacter='\"',\n escapeCharacter=\"\\\\\",\n lineEnd=str(os.linesep),\n separator=\",\",\n header=True,\n includeRowIdAndRowVersion=True,\n downloadLocation=None,\n ):\n \"\"\"\n Create a Table object wrapping a CSV file resulting from querying a Synapse table.\n Mostly for internal use.\n \"\"\"\n\n download_from_table_result, path = synapse._queryTableCsv(\n query=query,\n quoteCharacter=quoteCharacter,\n escapeCharacter=escapeCharacter,\n lineEnd=lineEnd,\n separator=separator,\n header=header,\n includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n downloadLocation=downloadLocation,\n )\n\n # A dirty hack to find out if we got back row ID and Version\n # in particular, we don't get these back from aggregate queries\n with io.open(path, \"r\", encoding=\"utf-8\") as f:\n reader = csv.reader(\n f,\n delimiter=separator,\n escapechar=escapeCharacter,\n lineterminator=lineEnd,\n quotechar=quoteCharacter,\n )\n first_line = next(reader)\n if len(download_from_table_result[\"headers\"]) + 2 == len(first_line):\n includeRowIdAndRowVersion = True\n else:\n includeRowIdAndRowVersion = False\n\n self = cls(\n filepath=path,\n schema=download_from_table_result.get(\"tableId\", None),\n etag=download_from_table_result.get(\"etag\", None),\n quoteCharacter=quoteCharacter,\n escapeCharacter=escapeCharacter,\n lineEnd=lineEnd,\n separator=separator,\n header=header,\n includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n headers=[\n SelectColumn(**header)\n for header in download_from_table_result[\"headers\"]\n ],\n )\n\n return self\n\n @classmethod\n def from_data_frame(\n cls,\n schema,\n df,\n filepath=None,\n etag=None,\n quoteCharacter='\"',\n escapeCharacter=\"\\\\\",\n lineEnd=str(os.linesep),\n separator=\",\",\n header=True,\n includeRowIdAndRowVersion=None,\n headers=None,\n **kwargs,\n ):\n # infer columns from data frame if not specified\n if not headers:\n cols = as_table_columns(df)\n headers = [SelectColumn.from_column(col) for col in cols]\n\n # if the schema has no columns, use the inferred columns\n if isinstance(schema, Schema) and not schema.has_columns():\n schema.addColumns(cols)\n\n # convert row names in the format [row_id]_[version] or [row_id]_[version]_[etag] back to columns\n # etag is essentially a UUID\n etag_pattern = r\"[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[1-5][0-9a-fA-F]{3}-[89abAB][0-9a-fA-F]{3}-[0-9a-fA-F]{12}\"\n row_id_version_pattern = re.compile(r\"(\\d+)_(\\d+)(_(\" + etag_pattern + r\"))?\")\n\n row_id = []\n row_version = []\n row_etag = []\n for row_name in df.index.values:\n m = row_id_version_pattern.match(str(row_name))\n row_id.append(m.group(1) if m else None)\n row_version.append(m.group(2) if m else None)\n row_etag.append(m.group(4) if m else None)\n\n # include row ID and version, if we're asked to OR if it's encoded in row names\n if includeRowIdAndRowVersion or (\n includeRowIdAndRowVersion is None and any(row_id)\n ):\n df2 = df.copy()\n\n cls._insert_dataframe_column_if_not_exist(df2, 0, \"ROW_ID\", row_id)\n cls._insert_dataframe_column_if_not_exist(\n df2, 1, \"ROW_VERSION\", row_version\n )\n if any(row_etag):\n cls._insert_dataframe_column_if_not_exist(df2, 2, \"ROW_ETAG\", row_etag)\n\n df = df2\n includeRowIdAndRowVersion = True\n\n f = None\n try:\n if not filepath:\n temp_dir = 
tempfile.mkdtemp()\n            filepath = os.path.join(temp_dir, \"table.csv\")\n\n            f = io.open(filepath, mode=\"w\", encoding=\"utf-8\", newline=\"\")\n\n            test_import_pandas()\n            import pandas as pd\n\n            if isinstance(schema, Schema):\n                for col in schema.columns_to_store:\n                    if col[\"columnType\"] == \"DATE\":\n\n                        def _trailing_date_time_millisecond(t):\n                            if isinstance(t, str):\n                                return t[:-3]\n\n                        df[col.name] = pd.to_datetime(\n                            df[col.name], errors=\"coerce\"\n                        ).dt.strftime(\"%s%f\")\n                        df[col.name] = df[col.name].apply(\n                            lambda x: _trailing_date_time_millisecond(x)\n                        )\n\n            df.to_csv(\n                f,\n                index=False,\n                sep=separator,\n                header=header,\n                quotechar=quoteCharacter,\n                escapechar=escapeCharacter,\n                lineterminator=lineEnd,\n                na_rep=kwargs.get(\"na_rep\", \"\"),\n                float_format=\"%.12g\",\n            )\n            # NOTE: reason for float_format='%.12g':\n            # pandas automatically converts int columns into float64 columns when some cells in the column have no\n            # value. If we write the whole number back as a decimal (e.g. '3.0'), Synapse complains that we are writing\n            # a float into an INTEGER (Synapse table type) column. Using the 'g' format will strip off '.0' from whole\n            # number values. pandas by default (with no float_format parameter) seems to keep 12 digits after the\n            # decimal point, so we use '%.12g'.\n            # see SYNPY-267.\n        finally:\n            if f:\n                f.close()\n\n        return cls(\n            schema=schema,\n            filepath=filepath,\n            etag=etag,\n            quoteCharacter=quoteCharacter,\n            escapeCharacter=escapeCharacter,\n            lineEnd=lineEnd,\n            separator=separator,\n            header=header,\n            includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n            headers=headers,\n        )\n\n    @staticmethod\n    def _insert_dataframe_column_if_not_exist(\n        dataframe, insert_index, col_name, insert_column_data\n    ):\n        # if the column already exists, verify the column data is the same as what we parsed\n        if col_name in dataframe.columns:\n            if dataframe[col_name].tolist() != insert_column_data:\n                raise SynapseError(\n                    (\n                        \"A column named '{0}' already exists and does not match the '{0}' values present in\"\n                        \" the DataFrame's row names. 
Please refrain from using or modifying '{0}' as a\"\n                        \" column for your data because it is necessary for version tracking in Synapse's\"\n                        \" tables\"\n                    ).format(col_name)\n                )\n        else:\n            dataframe.insert(insert_index, col_name, insert_column_data)\n\n    @classmethod\n    def from_list_of_rows(\n        cls,\n        schema,\n        values,\n        filepath=None,\n        etag=None,\n        quoteCharacter='\"',\n        escapeCharacter=\"\\\\\",\n        lineEnd=str(os.linesep),\n        separator=\",\",\n        linesToSkip=0,\n        includeRowIdAndRowVersion=None,\n        headers=None,\n    ):\n        # create CSV file\n        f = None\n        try:\n            if not filepath:\n                temp_dir = tempfile.mkdtemp()\n                filepath = os.path.join(temp_dir, \"table.csv\")\n\n            f = io.open(filepath, \"w\", encoding=\"utf-8\", newline=\"\")\n\n            writer = csv.writer(\n                f,\n                quoting=csv.QUOTE_NONNUMERIC,\n                delimiter=separator,\n                escapechar=escapeCharacter,\n                lineterminator=lineEnd,\n                quotechar=quoteCharacter,\n                skipinitialspace=linesToSkip,\n            )\n\n            # if we haven't explicitly set columns, try to grab them from\n            # the schema object\n            if (\n                not headers\n                and \"columns_to_store\" in schema\n                and schema.columns_to_store is not None\n            ):\n                headers = [\n                    SelectColumn.from_column(col) for col in schema.columns_to_store\n                ]\n\n            # write headers?\n            if headers:\n                writer.writerow([header.name for header in headers])\n                header = True\n            else:\n                header = False\n\n            # write row data\n            for row in values:\n                writer.writerow(row)\n\n        finally:\n            if f:\n                f.close()\n\n        return cls(\n            schema=schema,\n            filepath=filepath,\n            etag=etag,\n            quoteCharacter=quoteCharacter,\n            escapeCharacter=escapeCharacter,\n            lineEnd=lineEnd,\n            separator=separator,\n            header=header,\n            headers=headers,\n            includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n        )\n\n    def __init__(\n        self,\n        schema,\n        filepath,\n        etag=None,\n        quoteCharacter=DEFAULT_QUOTE_CHARACTER,\n        escapeCharacter=DEFAULT_ESCAPSE_CHAR,\n        lineEnd=str(os.linesep),\n        separator=DEFAULT_SEPARATOR,\n        header=True,\n        linesToSkip=0,\n        includeRowIdAndRowVersion=None,\n        headers=None,\n    ):\n        self.filepath = filepath\n\n        self.includeRowIdAndRowVersion = includeRowIdAndRowVersion\n\n        # CsvTableDescriptor fields\n        self.linesToSkip = linesToSkip\n        self.quoteCharacter = quoteCharacter\n        self.escapeCharacter = escapeCharacter\n        self.lineEnd = lineEnd\n        self.separator = separator\n        self.header = header\n\n        super(CsvFileTable, self).__init__(schema, headers=headers, etag=etag)\n\n        self.setColumnHeaders(headers)\n\n    def _synapse_store(self, syn):\n        copied_self = copy.copy(self)\n        return copied_self._update_self(syn)\n\n    def _update_self(self, syn):\n        if isinstance(self.schema, Schema) and self.schema.get(\"id\", None) is None:\n            # store schema\n            self.schema = syn.store(self.schema)\n            self.tableId = self.schema.id\n\n        result = syn._uploadCsv(\n            self.filepath,\n            self.schema if self.schema else self.tableId,\n            updateEtag=self.etag,\n            quoteCharacter=self.quoteCharacter,\n            escapeCharacter=self.escapeCharacter,\n            lineEnd=self.lineEnd,\n            separator=self.separator,\n            header=self.header,\n            linesToSkip=self.linesToSkip,\n        )\n\n        upload_to_table_result = result[\"results\"][0]\n\n        assert upload_to_table_result[\"concreteType\"] in (\n            \"org.sagebionetworks.repo.model.table.EntityUpdateResults\",\n            \"org.sagebionetworks.repo.model.table.UploadToTableResult\",\n        ), \"Not an UploadToTableResult or EntityUpdateResults.\"\n        if \"etag\" in upload_to_table_result:\n            self.etag = upload_to_table_result[\"etag\"]\n        return self\n\n    def asDataFrame(self, rowIdAndVersionInIndex=True, convert_to_datetime=False):\n        \"\"\"Convert query result to a Pandas 
DataFrame.\n\n        :param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version\n                                        (and row_etag if it exists)\n        :param convert_to_datetime:     If set to True, will convert all Synapse DATE columns from UNIX timestamp\n                                        integers into UTC datetime objects\n\n        :return: Pandas dataframe with results\n        \"\"\"\n        test_import_pandas()\n        import pandas as pd\n\n        try:\n            # Handle bug in pandas 0.19 requiring quotechar to be str not unicode or newstr\n            quoteChar = self.quoteCharacter\n\n            # determine which columns are DATE columns so we can convert millisecond timestamps into datetime objects\n            date_columns = []\n            list_columns = []\n            dtype = {}\n\n            if self.headers is not None:\n                for select_column in self.headers:\n                    if select_column.columnType == \"STRING\":\n                        # we want to identify string columns so that pandas doesn't try to\n                        # automatically parse strings in a string column to other data types\n                        dtype[select_column.name] = str\n                    elif select_column.columnType in LIST_COLUMN_TYPES:\n                        list_columns.append(select_column.name)\n                    elif select_column.columnType == \"DATE\" and convert_to_datetime:\n                        date_columns.append(select_column.name)\n\n            return _csv_to_pandas_df(\n                self.filepath,\n                separator=self.separator,\n                quote_char=quoteChar,\n                escape_char=self.escapeCharacter,\n                contain_headers=self.header,\n                lines_to_skip=self.linesToSkip,\n                date_columns=date_columns,\n                list_columns=list_columns,\n                rowIdAndVersionInIndex=rowIdAndVersionInIndex,\n                dtype=dtype,\n            )\n        except pd.parser.CParserError:\n            return pd.DataFrame()\n\n    def asRowSet(self):\n        # Extract row id and version, if present in rows\n        row_id_col = None\n        row_ver_col = None\n        for i, header in enumerate(self.headers):\n            if header.name == \"ROW_ID\":\n                row_id_col = i\n            elif header.name == \"ROW_VERSION\":\n                row_ver_col = i\n\n        def to_row_object(row, row_id_col=None, row_ver_col=None):\n            if isinstance(row, Row):\n                return row\n            rowId = row[row_id_col] if row_id_col is not None else None\n            versionNumber = row[row_ver_col] if row_ver_col is not None else None\n            values = [\n                elem for i, elem in enumerate(row) if i not in [row_id_col, row_ver_col]\n            ]\n            return Row(values, rowId=rowId, versionNumber=versionNumber)\n\n        return RowSet(\n            headers=[\n                elem\n                for i, elem in enumerate(self.headers)\n                if i not in [row_id_col, row_ver_col]\n            ],\n            tableId=self.tableId,\n            etag=self.etag,\n            rows=[to_row_object(row, row_id_col, row_ver_col) for row in self],\n        )\n\n    def setColumnHeaders(self, headers):\n        \"\"\"\n        Set the list of :py:class:`synapseclient.table.SelectColumn` objects that will be used to convert fields to the\n        appropriate data types.\n\n        Column headers are automatically set when querying.\n        \"\"\"\n        if self.includeRowIdAndRowVersion:\n            names = [header.name for header in headers]\n            if \"ROW_ID\" not in names and \"ROW_VERSION\" not in names:\n                headers = [\n                    SelectColumn(name=\"ROW_ID\", columnType=\"STRING\"),\n                    SelectColumn(name=\"ROW_VERSION\", columnType=\"STRING\"),\n                ] + headers\n        self.headers = headers\n\n    def __iter__(self):\n        def iterate_rows(filepath, headers):\n            if not self.header or not self.headers:\n                raise ValueError(\"Iteration not supported for table without headers.\")\n\n            header_name = {header.name for header in headers}\n            row_metadata_headers = {\"ROW_ID\", \"ROW_VERSION\", \"ROW_ETAG\"}\n            num_row_metadata_in_headers = len(header_name & row_metadata_headers)\n            with io.open(filepath, encoding=\"utf-8\", newline=self.lineEnd) as f:\n                reader = csv.reader(\n                    f,\n                    delimiter=self.separator,\n                    escapechar=self.escapeCharacter,\n                    
lineterminator=self.lineEnd,\n                    quotechar=self.quoteCharacter,\n                )\n                csv_header = set(next(reader))\n                # the number of row metadata differences between the csv headers and self.headers\n                num_metadata_cols_diff = (\n                    len(csv_header & row_metadata_headers) - num_row_metadata_in_headers\n                )\n                # we only process 2 cases:\n                # 1. matching row metadata\n                # 2. if metadata does not match, self.headers must not contain row metadata\n                if num_metadata_cols_diff == 0 or num_row_metadata_in_headers == 0:\n                    for row in reader:\n                        yield cast_values(row[num_metadata_cols_diff:], headers)\n                else:\n                    raise ValueError(\n                        \"There is mismatching row metadata in the csv file and in headers.\"\n                    )\n\n        return iterate_rows(self.filepath, self.headers)\n\n    def __len__(self):\n        with io.open(self.filepath, encoding=\"utf-8\", newline=self.lineEnd) as f:\n            if self.header:  # ignore the header line\n                f.readline()\n\n            return sum(1 for line in f)\n\n    def iter_row_metadata(self):\n        \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row,\n        it will be returned as (row_id, None)\n\n        :return: a generator that gives :py:class::`collections.namedtuple` with format (row_id, row_version, row_etag)\n        \"\"\"\n        with io.open(self.filepath, encoding=\"utf-8\", newline=self.lineEnd) as f:\n            reader = csv.reader(\n                f,\n                delimiter=self.separator,\n                escapechar=self.escapeCharacter,\n                lineterminator=self.lineEnd,\n                quotechar=self.quoteCharacter,\n            )\n            header = next(reader)\n\n            # The ROW_... headers are always in a predefined order\n            row_id_index = header.index(\"ROW_ID\")\n            row_version_index = header.index(\"ROW_VERSION\")\n            try:\n                row_etag_index = header.index(\"ROW_ETAG\")\n            except ValueError:\n                row_etag_index = None\n\n            for row in reader:\n                yield type(self).RowMetadataTuple(\n                    int(row[row_id_index]),\n                    int(row[row_version_index]),\n                    row[row_etag_index] if (row_etag_index is not None) else None,\n                )\n
"},{"location":"reference/tables/#synapseclient.table.CsvFileTable-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.CsvFileTable.from_table_query","title":"from_table_query(synapse, query, quoteCharacter='\"', escapeCharacter='\\\\', lineEnd=str(os.linesep), separator=',', header=True, includeRowIdAndRowVersion=True, downloadLocation=None)
classmethod
","text":"Create a Table object wrapping a CSV file resulting from querying a Synapse table. Mostly for internal use.
Source code in synapseclient/table.py
@classmethod\ndef from_table_query(\n cls,\n synapse,\n query,\n quoteCharacter='\"',\n escapeCharacter=\"\\\\\",\n lineEnd=str(os.linesep),\n separator=\",\",\n header=True,\n includeRowIdAndRowVersion=True,\n downloadLocation=None,\n):\n \"\"\"\n Create a Table object wrapping a CSV file resulting from querying a Synapse table.\n Mostly for internal use.\n \"\"\"\n\n download_from_table_result, path = synapse._queryTableCsv(\n query=query,\n quoteCharacter=quoteCharacter,\n escapeCharacter=escapeCharacter,\n lineEnd=lineEnd,\n separator=separator,\n header=header,\n includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n downloadLocation=downloadLocation,\n )\n\n # A dirty hack to find out if we got back row ID and Version\n # in particular, we don't get these back from aggregate queries\n with io.open(path, \"r\", encoding=\"utf-8\") as f:\n reader = csv.reader(\n f,\n delimiter=separator,\n escapechar=escapeCharacter,\n lineterminator=lineEnd,\n quotechar=quoteCharacter,\n )\n first_line = next(reader)\n if len(download_from_table_result[\"headers\"]) + 2 == len(first_line):\n includeRowIdAndRowVersion = True\n else:\n includeRowIdAndRowVersion = False\n\n self = cls(\n filepath=path,\n schema=download_from_table_result.get(\"tableId\", None),\n etag=download_from_table_result.get(\"etag\", None),\n quoteCharacter=quoteCharacter,\n escapeCharacter=escapeCharacter,\n lineEnd=lineEnd,\n separator=separator,\n header=header,\n includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n headers=[\n SelectColumn(**header)\n for header in download_from_table_result[\"headers\"]\n ],\n )\n\n return self\n
"},{"location":"reference/tables/#synapseclient.table.CsvFileTable.asDataFrame","title":"asDataFrame(rowIdAndVersionInIndex=True, convert_to_datetime=False)
","text":"Convert query result to a Pandas DataFrame.
:param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version (and row_etag if it exists) :param convert_to_datetime: If set to True, will convert all Synapse DATE columns from UNIX timestamp integers into UTC datetime objects
:return: Pandas dataframe with results
Source code in synapseclient/table.py
def asDataFrame(self, rowIdAndVersionInIndex=True, convert_to_datetime=False):\n    \"\"\"Convert query result to a Pandas DataFrame.\n\n    :param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version\n                                    (and row_etag if it exists)\n    :param convert_to_datetime:     If set to True, will convert all Synapse DATE columns from UNIX timestamp\n                                    integers into UTC datetime objects\n\n    :return: Pandas dataframe with results\n    \"\"\"\n    test_import_pandas()\n    import pandas as pd\n\n    try:\n        # Handle bug in pandas 0.19 requiring quotechar to be str not unicode or newstr\n        quoteChar = self.quoteCharacter\n\n        # determine which columns are DATE columns so we can convert millisecond timestamps into datetime objects\n        date_columns = []\n        list_columns = []\n        dtype = {}\n\n        if self.headers is not None:\n            for select_column in self.headers:\n                if select_column.columnType == \"STRING\":\n                    # we want to identify string columns so that pandas doesn't try to\n                    # automatically parse strings in a string column to other data types\n                    dtype[select_column.name] = str\n                elif select_column.columnType in LIST_COLUMN_TYPES:\n                    list_columns.append(select_column.name)\n                elif select_column.columnType == \"DATE\" and convert_to_datetime:\n                    date_columns.append(select_column.name)\n\n        return _csv_to_pandas_df(\n            self.filepath,\n            separator=self.separator,\n            quote_char=quoteChar,\n            escape_char=self.escapeCharacter,\n            contain_headers=self.header,\n            lines_to_skip=self.linesToSkip,\n            date_columns=date_columns,\n            list_columns=list_columns,\n            rowIdAndVersionInIndex=rowIdAndVersionInIndex,\n            dtype=dtype,\n        )\n    except pd.parser.CParserError:\n        return pd.DataFrame()\n
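A minimal usage sketch (an illustration, not part of the source), assuming syn is a logged-in client and syn123 is a placeholder table ID:
results = syn.tableQuery(\"SELECT * FROM syn123\")  # returns a CsvFileTable\ndf = results.asDataFrame(convert_to_datetime=True)  # DATE columns become UTC datetimes\nprint(df.head())\n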
"},{"location":"reference/tables/#synapseclient.table.CsvFileTable.setColumnHeaders","title":"setColumnHeaders(headers)
","text":"Set the list of :py:class:synapseclient.table.SelectColumn
objects that will be used to convert fields to the appropriate data types.
Column headers are automatically set when querying.
Source code in synapseclient/table.py
def setColumnHeaders(self, headers):\n \"\"\"\n Set the list of :py:class:`synapseclient.table.SelectColumn` objects that will be used to convert fields to the\n appropriate data types.\n\n Column headers are automatically set when querying.\n \"\"\"\n if self.includeRowIdAndRowVersion:\n names = [header.name for header in headers]\n if \"ROW_ID\" not in names and \"ROW_VERSION\" not in names:\n headers = [\n SelectColumn(name=\"ROW_ID\", columnType=\"STRING\"),\n SelectColumn(name=\"ROW_VERSION\", columnType=\"STRING\"),\n ] + headers\n self.headers = headers\n
"},{"location":"reference/tables/#synapseclient.table.CsvFileTable.iter_row_metadata","title":"iter_row_metadata()
","text":"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will generated as (row_id, None)
:return: a generator that gives :py:class::collections.namedtuple
with format (row_id, row_etag)
Source code in synapseclient/table.py
def iter_row_metadata(self):\n    \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row,\n    it will be returned as (row_id, None)\n\n    :return: a generator that gives :py:class::`collections.namedtuple` with format (row_id, row_version, row_etag)\n    \"\"\"\n    with io.open(self.filepath, encoding=\"utf-8\", newline=self.lineEnd) as f:\n        reader = csv.reader(\n            f,\n            delimiter=self.separator,\n            escapechar=self.escapeCharacter,\n            lineterminator=self.lineEnd,\n            quotechar=self.quoteCharacter,\n        )\n        header = next(reader)\n\n        # The ROW_... headers are always in a predefined order\n        row_id_index = header.index(\"ROW_ID\")\n        row_version_index = header.index(\"ROW_VERSION\")\n        try:\n            row_etag_index = header.index(\"ROW_ETAG\")\n        except ValueError:\n            row_etag_index = None\n\n        for row in reader:\n            yield type(self).RowMetadataTuple(\n                int(row[row_id_index]),\n                int(row[row_version_index]),\n                row[row_etag_index] if (row_etag_index is not None) else None,\n            )\n
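A hedged usage sketch, assuming syn123 is a placeholder table ID and that the named tuple exposes row_id, row_version, and row_etag fields as in the source above:
results = syn.tableQuery(\"SELECT * FROM syn123\")\nfor metadata in results.iter_row_metadata():\n    # row_etag is None for tables that do not carry etags\n    print(metadata.row_id, metadata.row_version, metadata.row_etag)\n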
"},{"location":"reference/tables/#synapseclient.table-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.as_table_columns","title":"as_table_columns(values)
","text":"Return a list of Synapse table :py:class:Column
objects that correspond to the columns in the given values.
:param values: an object that holds the content of the tables - a string holding the path to a CSV file, a filehandle, or StringIO containing valid CSV content - a Pandas DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>_
:returns: A list of Synapse table :py:class:Column
objects
Example::
import pandas as pd\n\ndf = pd.DataFrame(dict(a=[1, 2, 3], b=[\"c\", \"d\", \"e\"]))\ncols = as_table_columns(df)\n
Source code in synapseclient/table.py
def as_table_columns(values):\n    \"\"\"\n    Return a list of Synapse table :py:class:`Column` objects that correspond to the columns in the given values.\n\n    :param values: an object that holds the content of the tables\n      - a string holding the path to a CSV file, a filehandle, or StringIO containing valid csv content\n      - a Pandas `DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>`_\n\n    :returns: A list of Synapse table :py:class:`Column` objects\n\n    Example::\n\n        import pandas as pd\n\n        df = pd.DataFrame(dict(a=[1, 2, 3], b=[\"c\", \"d\", \"e\"]))\n        cols = as_table_columns(df)\n    \"\"\"\n    test_import_pandas()\n    import pandas as pd\n    from pandas.api.types import infer_dtype\n\n    df = None\n\n    # pandas DataFrame\n    if isinstance(values, pd.DataFrame):\n        df = values\n    # filename of a csv file\n    # in Python 3, we could check whether values is an instance of io.IOBase\n    # for now, check if values has attr `read`\n    elif isinstance(values, str) or hasattr(values, \"read\"):\n        df = _csv_to_pandas_df(values)\n\n    if df is None:\n        raise ValueError(\"Values of type %s is not yet supported.\" % type(values))\n\n    cols = list()\n    for col in df:\n        inferred_type = infer_dtype(df[col], skipna=True)\n        columnType = PANDAS_TABLE_TYPE.get(inferred_type, \"STRING\")\n        if columnType == \"STRING\":\n            maxStrLen = df[col].str.len().max()\n            if maxStrLen > 1000:\n                cols.append(Column(name=col, columnType=\"LARGETEXT\", defaultValue=\"\"))\n            else:\n                size = int(\n                    round(min(1000, max(30, maxStrLen * 1.5)))\n                )  # Determine the length of the longest string\n                cols.append(\n                    Column(\n                        name=col,\n                        columnType=columnType,\n                        maximumSize=size,\n                        defaultValue=\"\",\n                    )\n                )\n        else:\n            cols.append(Column(name=col, columnType=columnType))\n    return cols\n
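Because values may also be a path to a CSV file (or a filehandle), the same inference works on files; a small sketch with a hypothetical path:
cols = as_table_columns(\"/path/to/file.csv\")  # placeholder path to any CSV with a header row\nfor col in cols:\n    print(col[\"name\"], col[\"columnType\"])\n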
"},{"location":"reference/tables/#synapseclient.table.df2Table","title":"df2Table(df, syn, tableName, parentProject)
","text":"Creates a new table from data in pandas data frame. parameters: df, tableName, parentProject
Source code in synapseclient/table.py
def df2Table(df, syn, tableName, parentProject):\n \"\"\"Creates a new table from data in pandas data frame.\n parameters: df, tableName, parentProject\n \"\"\"\n\n # Create columns:\n cols = as_table_columns(df)\n cols = [syn.store(col) for col in cols]\n\n # Create Table Schema\n schema1 = Schema(name=tableName, columns=cols, parent=parentProject)\n schema1 = syn.store(schema1)\n\n # Add data to Table\n for i in range(0, df.shape[0] / 1200 + 1):\n start = i * 1200\n end = min((i + 1) * 1200, df.shape[0])\n rowset1 = RowSet(\n columns=cols,\n schema=schema1,\n rows=[Row(list(df.ix[j, :])) for j in range(start, end)],\n )\n syn.store(rowset1)\n\n return schema1\n
"},{"location":"reference/tables/#synapseclient.table.to_boolean","title":"to_boolean(value)
","text":"Convert a string to boolean, case insensitively, where true values are: true, t, and 1 and false values are: false, f, 0. Raise a ValueError for all other values.
Source code in synapseclient/table.py
def to_boolean(value):\n \"\"\"\n Convert a string to boolean, case insensitively,\n where true values are: true, t, and 1 and false values are: false, f, 0.\n Raise a ValueError for all other values.\n \"\"\"\n if isinstance(value, bool):\n return value\n\n if isinstance(value, str):\n lower_value = value.lower()\n if lower_value in [\"true\", \"t\", \"1\"]:\n return True\n if lower_value in [\"false\", \"f\", \"0\"]:\n return False\n\n raise ValueError(\"Can't convert %s to boolean.\" % value)\n
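A few illustrative calls (not part of the source), showing the documented behavior:
to_boolean(\"TRUE\")  # True (case insensitive)\nto_boolean(\"f\")     # False\nto_boolean(1)       # raises ValueError: only bool and str values are converted\n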
"},{"location":"reference/tables/#synapseclient.table.cast_values","title":"cast_values(values, headers)
","text":"Convert a row of table query results from strings to the correct column type.
See: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/ColumnType.html
Source code in synapseclient/table.py
def cast_values(values, headers):\n \"\"\"\n Convert a row of table query results from strings to the correct column type.\n\n See: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/ColumnType.html\n \"\"\"\n if len(values) != len(headers):\n raise ValueError(\n \"The number of columns in the csv file does not match the given headers. %d fields, %d headers\"\n % (len(values), len(headers))\n )\n\n result = []\n for header, field in zip(headers, values):\n columnType = header.get(\"columnType\", \"STRING\")\n\n # convert field to column type\n if field is None or field == \"\":\n result.append(None)\n elif columnType in {\n \"STRING\",\n \"ENTITYID\",\n \"FILEHANDLEID\",\n \"LARGETEXT\",\n \"USERID\",\n \"LINK\",\n }:\n result.append(field)\n elif columnType == \"DOUBLE\":\n result.append(float(field))\n elif columnType == \"INTEGER\":\n result.append(int(field))\n elif columnType == \"BOOLEAN\":\n result.append(to_boolean(field))\n elif columnType == \"DATE\":\n result.append(from_unix_epoch_time(field))\n elif columnType in {\n \"STRING_LIST\",\n \"INTEGER_LIST\",\n \"BOOLEAN_LIST\",\n \"ENTITYID_LIST\",\n \"USERID_LIST\",\n }:\n result.append(json.loads(field))\n elif columnType == \"DATE_LIST\":\n result.append(json.loads(field, parse_int=from_unix_epoch_time))\n else:\n # default to string for unknown column type\n result.append(field)\n\n return result\n
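An illustrative call; as an assumption for illustration, plain dicts stand in for SelectColumn headers, since both support .get:
headers = [\n    {\"name\": \"x\", \"columnType\": \"INTEGER\"},\n    {\"name\": \"flag\", \"columnType\": \"BOOLEAN\"},\n]\ncast_values([\"42\", \"true\"], headers)  # returns [42, True]\n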
"},{"location":"reference/tables/#synapseclient.table.escape_column_name","title":"escape_column_name(column)
","text":"Escape the name of the given column for use in a Synapse table query statement :param column: a string or column dictionary object with a 'name' key
Source code in synapseclient/table.py
def escape_column_name(column):\n \"\"\"Escape the name of the given column for use in a Synapse table query statement\n :param column: a string or column dictionary object with a 'name' key\"\"\"\n col_name = (\n column[\"name\"] if isinstance(column, collections.abc.Mapping) else str(column)\n )\n escaped_name = col_name.replace('\"', '\"\"')\n return f'\"{escaped_name}\"'\n
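For example, embedded double quotes are doubled and the whole name is wrapped in quotes:
escape_column_name('my \"quoted\" column')  # returns '\"my \"\"quoted\"\" column\"'\nescape_column_name({\"name\": \"age\"})        # returns '\"age\"'\n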
"},{"location":"reference/tables/#synapseclient.table.join_column_names","title":"join_column_names(columns)
","text":"Join the names of the given columns into a comma delimited list suitable for use in a Synapse table query :param columns: a sequence of column string names or dictionary objets with column 'name' keys
Source code in synapseclient/table.py
def join_column_names(columns):\n    \"\"\"Join the names of the given columns into a comma delimited list suitable for use in a Synapse table query\n    :param columns: a sequence of column string names or dictionary objects with column 'name' keys\n    \"\"\"\n    return \",\".join(escape_column_name(c) for c in columns)\n
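For example, mixing plain string names and column dictionaries:
join_column_names([\"foo\", {\"name\": \"bar\"}])  # returns '\"foo\",\"bar\"'\n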
"},{"location":"reference/tables/#synapseclient.table.build_table","title":"build_table(name, parent, values)
","text":"Build a Table object
:param name: the name for the Table Schema object :param parent: the project in Synapse to which this table belongs :param values: an object that holds the content of the tables - a string holding the path to a CSV file - a Pandas DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>_
:return: a Table object suitable for storing
Example::
path = \"/path/to/file.csv\"\ntable = build_table(\"simple_table\", \"syn123\", path)\ntable = syn.store(table)\n\nimport pandas as pd\n\ndf = pd.DataFrame(dict(a=[1, 2, 3], b=[\"c\", \"d\", \"e\"]))\ntable = build_table(\"simple_table\", \"syn123\", df)\ntable = syn.store(table)\n
Source code in synapseclient/table.py
def build_table(name, parent, values):\n \"\"\"\n Build a Table object\n\n :param name: the name for the Table Schema object\n :param parent: the project in Synapse to which this table belongs\n :param values: an object that holds the content of the tables\n - a string holding the path to a CSV file\n - a Pandas `DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>`_\n\n :return: a Table object suitable for storing\n\n Example::\n\n path = \"/path/to/file.csv\"\n table = build_table(\"simple_table\", \"syn123\", path)\n table = syn.store(table)\n\n import pandas as pd\n\n df = pd.DataFrame(dict(a=[1, 2, 3], b=[\"c\", \"d\", \"e\"]))\n table = build_table(\"simple_table\", \"syn123\", df)\n table = syn.store(table)\n \"\"\"\n test_import_pandas()\n import pandas as pd\n\n if not isinstance(values, pd.DataFrame) and not isinstance(values, str):\n raise ValueError(\"Values of type %s is not yet supported.\" % type(values))\n cols = as_table_columns(values)\n schema = Schema(name=name, columns=cols, parent=parent)\n headers = [SelectColumn.from_column(col) for col in cols]\n return Table(schema, values, headers=headers)\n
"},{"location":"reference/tables/#synapseclient.table.Table","title":"Table(schema, values, **kwargs)
","text":"Combine a table schema and a set of values into some type of Table object depending on what type of values are given.
:param schema: a table :py:class:Schema
object or Synapse Id of Table. :param values: an object that holds the content of the tables - a :py:class:RowSet
- a list of lists (or tuples) where each element is a row - a string holding the path to a CSV file - a Pandas DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>
- a dict which will be wrapped by a Pandas DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>
:return: a Table object suitable for storing
Usually, the immediate next step after creating a Table object is to store it::
table = syn.store(Table(schema, values))\n
End users should not need to know the details of these Table subclasses:
TableAbstractBaseClass
RowSetTable
TableQueryResult
CsvFileTable
Source code in synapseclient/table.py
def Table(schema, values, **kwargs):\n \"\"\"\n Combine a table schema and a set of values into some type of Table object\n depending on what type of values are given.\n\n :param schema: a table :py:class:`Schema` object or Synapse Id of Table.\n :param values: an object that holds the content of the tables\n - a :py:class:`RowSet`\n - a list of lists (or tuples) where each element is a row\n - a string holding the path to a CSV file\n - a Pandas `DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>`_\n - a dict which will be wrapped by a Pandas \\\n `DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>`_\n\n :return: a Table object suitable for storing\n\n Usually, the immediate next step after creating a Table object is to store it::\n\n table = syn.store(Table(schema, values))\n\n End users should not need to know the details of these Table subclasses:\n\n - :py:class:`TableAbstractBaseClass`\n - :py:class:`RowSetTable`\n - :py:class:`TableQueryResult`\n - :py:class:`CsvFileTable`\n \"\"\"\n\n try:\n import pandas as pd\n\n pandas_available = True\n except: # noqa\n pandas_available = False\n\n # a RowSet\n if isinstance(values, RowSet):\n return RowSetTable(schema, values, **kwargs)\n\n # a list of rows\n elif isinstance(values, (list, tuple)):\n return CsvFileTable.from_list_of_rows(schema, values, **kwargs)\n\n # filename of a csv file\n elif isinstance(values, str):\n return CsvFileTable(schema, filepath=values, **kwargs)\n\n # pandas DataFrame\n elif pandas_available and isinstance(values, pd.DataFrame):\n return CsvFileTable.from_data_frame(schema, values, **kwargs)\n\n # dict\n elif pandas_available and isinstance(values, dict):\n return CsvFileTable.from_data_frame(schema, pd.DataFrame(values), **kwargs)\n\n else:\n raise ValueError(\n \"Don't know how to make tables from values of type %s.\" % type(values)\n )\n
"},{"location":"reference/teams/","title":"Teams","text":""},{"location":"reference/teams/#synapseclient.team","title":"synapseclient.team
","text":"Functions that interact with Synapse Teams
"},{"location":"reference/teams/#synapseclient.team-classes","title":"Classes","text":""},{"location":"reference/teams/#synapseclient.team.UserProfile","title":"UserProfile
","text":" Bases: DictObject
Information about a Synapse user. In practice the constructor is not called directly by the client.
:param ownerId: A foreign key to the ID of the 'principal' object for the user. :param uri: The Uniform Resource Identifier (URI) for this entity. :param etag: Synapse employs an Optimistic Concurrency Control (OCC) scheme to handle concurrent updates. Since the E-Tag changes every time an entity is updated it is used to detect when a client's current representation of an entity is out-of-date. :param firstName: This person's given name (forename) :param lastName: This person's family name (surname) :param emails: The list of user email addresses registered to this user. :param userName: A name chosen by the user that uniquely identifies them. :param summary: A summary description about this person :param position: This person's current position title :param location: This person's location :param industry: The industry/discipline that this person is associated with :param company: This person's current affiliation :param profilePicureFileHandleId: The File Handle ID of the user's profile picture. :param url: A link to more information about this person :param notificationSettings: An object of type :py:class:org.sagebionetworks.repo.model.message.Settings
containing the user's preferences regarding when email notifications should be sent
Source code in synapseclient/team.py
class UserProfile(DictObject):\n    \"\"\"\n    Information about a Synapse user.  In practice the constructor is not called directly by the client.\n\n    :param ownerId:     A foreign key to the ID of the 'principal' object for the user.\n    :param uri:         The Uniform Resource Identifier (URI) for this entity.\n    :param etag:        Synapse employs an Optimistic Concurrency Control (OCC) scheme to handle concurrent updates.\n                        Since the E-Tag changes every time an entity is updated it is used to detect when a client's current representation\n                        of an entity is out-of-date.\n    :param firstName:   This person's given name (forename)\n    :param lastName:    This person's family name (surname)\n    :param emails:      The list of user email addresses registered to this user.\n    :param userName:    A name chosen by the user that uniquely identifies them.\n    :param summary:     A summary description about this person\n    :param position:    This person's current position title\n    :param location:    This person's location\n    :param industry:    The industry/discipline that this person is associated with\n    :param company:     This person's current affiliation\n    :param profilePicureFileHandleId: The File Handle ID of the user's profile picture.\n    :param url:         A link to more information about this person\n    :param notificationSettings: An object of type :py:class:`org.sagebionetworks.repo.model.message.Settings`\n                        containing the user's preferences regarding when email notifications should be sent\n    \"\"\"\n\n    def __init__(self, **kwargs):\n        super(UserProfile, self).__init__(kwargs)\n
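In practice a UserProfile is obtained from the client rather than constructed directly; a short sketch, assuming syn is a logged-in client and the username is a placeholder:
profile = syn.getUserProfile()  # your own profile\nother = syn.getUserProfile(\"someusername\")  # look up another user by username or ID\nprint(other.ownerId, other.userName)\n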
"},{"location":"reference/teams/#synapseclient.team.UserGroupHeader","title":"UserGroupHeader
","text":" Bases: DictObject
Select metadata about a Synapse principal. In practice the constructor is not called directly by the client.
:param ownerId: A foreign key to the ID of the 'principal' object for the user. :param firstName: First Name :param lastName: Last Name :param userName: A name chosen by the user that uniquely identifies them. :param email: User's current email address :param isIndividual: True if this is a user, false if it is a group
Source code insynapseclient/team.py
class UserGroupHeader(DictObject):\n \"\"\"\n Select metadata about a Synapse principal. In practice the constructor is not called directly by the client.\n\n :param ownerId: A foreign key to the ID of the 'principal' object for the user.\n :param firstName: First Name\n :param lastName: Last Name\n :param userName: A name chosen by the user that uniquely identifies them.\n :param email: User's current email address\n :param isIndividual: True if this is a user, false if it is a group\n \"\"\"\n\n def __init__(self, **kwargs):\n super(UserGroupHeader, self).__init__(kwargs)\n
"},{"location":"reference/teams/#synapseclient.team.Team","title":"Team
","text":" Bases: DictObject
Represents a Synapse Team <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/Team.html>
_. User definable fields are:
:param icon: fileHandleId for icon image of the Team :param description: A short description of this Team. :param name: The name of the Team. :param canPublicJoin: true for teams that members can join without an invitation or approval
Source code in synapseclient/team.py
class Team(DictObject):\n    \"\"\"\n    Represents a `Synapse Team <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/Team.html>`_.\n    User definable fields are:\n\n    :param icon:          fileHandleId for icon image of the Team\n    :param description:   A short description of this Team.\n    :param name:          The name of the Team.\n    :param canPublicJoin: true for teams that members can join without an invitation or approval\n    \"\"\"\n\n    def __init__(self, **kwargs):\n        super(Team, self).__init__(kwargs)\n\n    @classmethod\n    def getURI(cls, id):\n        return \"/team/%s\" % id\n\n    def postURI(self):\n        return \"/team\"\n\n    def putURI(self):\n        return \"/team\"\n\n    def deleteURI(self):\n        return \"/team/%s\" % self.id\n\n    def getACLURI(self):\n        return \"/team/%s/acl\" % self.id\n\n    def putACLURI(self):\n        return \"/team/acl\"\n
"},{"location":"reference/teams/#synapseclient.team.TeamMember","title":"TeamMember
","text":" Bases: DictObject
Contains information about a user's membership in a Team. In practice the constructor is not called directly by the client.
:param teamId: the ID of the team :param member: An object of type :py:class:org.sagebionetworks.repo.model.UserGroupHeader
describing the member :param isAdmin: Whether the given member is an administrator of the team
Source code in synapseclient/team.py
class TeamMember(DictObject):\n \"\"\"\n Contains information about a user's membership in a Team. In practice the constructor is not called directly by\n the client.\n\n :param teamId: the ID of the team\n :param member: An object of type :py:class:`org.sagebionetworks.repo.model.UserGroupHeader` describing the member\n :param isAdmin: Whether the given member is an administrator of the team\n\n \"\"\"\n\n def __init__(self, **kwargs):\n if \"member\" in kwargs:\n kwargs[\"member\"] = UserGroupHeader(**kwargs[\"member\"])\n super(TeamMember, self).__init__(kwargs)\n
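A sketch of how TeamMember objects typically surface, assuming syn is a logged-in client and the team name is a placeholder:
team = syn.getTeam(\"My Team\")  # accepts a team name or ID\nfor member in syn.getTeamMembers(team):  # yields TeamMember objects\n    print(member.member.userName, member.isAdmin)\n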
"},{"location":"reference/view_schema/","title":"Entity View Schema","text":""},{"location":"reference/view_schema/#synapseclient.table.EntityViewSchema","title":"synapseclient.table.EntityViewSchema
","text":" Bases: ViewBase
An EntityViewSchema is a :py:class:synapseclient.entity.Entity
that displays all files/projects (depending on user choice) within a given set of scopes
:param name: the name of the Entity View Table object :param columns: a list of :py:class:Column
objects or their IDs. These are optional. :param parent: the project in Synapse to which this table belongs :param scopes: a list of Projects/Folders or their ids :param type: This field is deprecated. Please use includeEntityTypes
:param includeEntityTypes: a list of entity types to include in the view. Supported entity types are:
- EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n If none is provided, the view will default to include EntityViewType.FILE.\n
:param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.) Defaults to True. The default columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in the EntityViewSchema's scope. Defaults to True. The annotation columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param ignoredAnnotationColumnNames: A list of strings representing annotation names. When addAnnotationColumns is True, the names in this list will not be automatically added as columns to the EntityViewSchema if they exist in any of the defined scopes. :param properties: A map of Synapse properties :param annotations: A map of user defined annotations :param local_state: Internal use only
Example::
from synapseclient import EntityViewType\n\nproject_or_folder = syn.get(\"syn123\")\nschema = syn.store(EntityViewSchema(name='MyTable', parent=project, scopes=[project_or_folder_id, 'syn123'],\n includeEntityTypes=[EntityViewType.FILE]))\n
Source code in synapseclient/table.py
class EntityViewSchema(ViewBase):\n \"\"\"\n A EntityViewSchema is a :py:class:`synapseclient.entity.Entity` that displays all files/projects\n (depending on user choice) within a given set of scopes\n\n :param name: the name of the Entity View Table object\n :param columns: a list of :py:class:`Column` objects or their IDs. These are optional.\n :param parent: the project in Synapse to which this table belongs\n :param scopes: a list of Projects/Folders or their ids\n :param type: This field is deprecated. Please use `includeEntityTypes`\n :param includeEntityTypes: a list of entity types to include in the view. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n If none is provided, the view will default to include EntityViewType.FILE.\n :param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.)\n Defaults to True.\n The default columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in\n the EntityViewSchema's scope. Defaults to True.\n The annotation columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param ignoredAnnotationColumnNames: A list of strings representing annotation names.\n When addAnnotationColumns is True, the names in this list will not be\n automatically added as columns to the EntityViewSchema if they exist in any\n of the defined scopes.\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n\n from synapseclient import EntityViewType\n\n project_or_folder = syn.get(\"syn123\")\n schema = syn.store(EntityViewSchema(name='MyTable', parent=project, scopes=[project_or_folder_id, 'syn123'],\n includeEntityTypes=[EntityViewType.FILE]))\n\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.EntityView\"\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n scopes=None,\n type=None,\n includeEntityTypes=None,\n addDefaultViewColumns=True,\n addAnnotationColumns=True,\n ignoredAnnotationColumnNames=[],\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if includeEntityTypes:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask(includeEntityTypes)\n elif type:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask_for_deprecated_type(type)\n elif properties and \"type\" in properties:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask_for_deprecated_type(\n properties[\"type\"]\n )\n properties[\"type\"] = None\n\n self.ignoredAnnotationColumnNames = set(ignoredAnnotationColumnNames)\n super(EntityViewSchema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n\n # This is a hacky solution to make sure we don't try to add columns to schemas that we retrieve from synapse\n is_from_normal_constructor = not (properties or local_state)\n # allowing annotations because user might want to update annotations all at once\n self.addDefaultViewColumns = (\n addDefaultViewColumns and is_from_normal_constructor\n )\n self.addAnnotationColumns = addAnnotationColumns and is_from_normal_constructor\n\n # set default values after constructor so we don't overwrite the values defined in 
properties using .get()\n # because properties, unlike local_state, do not have nonexistent keys assigned with a value of None\n if self.get(\"viewTypeMask\") is None:\n self.viewTypeMask = EntityViewType.FILE.value\n if self.get(\"scopeIds\") is None:\n self.scopeIds = []\n\n # add the scopes last so that we can append the passed in scopes to those defined in properties\n if scopes is not None:\n self.add_scope(scopes)\n\n def set_entity_types(self, includeEntityTypes):\n \"\"\"\n :param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous\n settings. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n \"\"\"\n self.viewTypeMask = _get_view_type_mask(includeEntityTypes)\n
"},{"location":"reference/view_schema/#synapseclient.table.EntityViewSchema-functions","title":"Functions","text":""},{"location":"reference/view_schema/#synapseclient.table.EntityViewSchema.set_entity_types","title":"set_entity_types(includeEntityTypes)
","text":":param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous settings. Supported entity types are:
- EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n
Source code in synapseclient/table.py
def set_entity_types(self, includeEntityTypes):\n \"\"\"\n :param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous\n settings. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n \"\"\"\n self.viewTypeMask = _get_view_type_mask(includeEntityTypes)\n
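A minimal sketch, assuming syn456 is a placeholder ID of an existing entity view:
from synapseclient import EntityViewType\n\nview = syn.get(\"syn456\")\nview.set_entity_types([EntityViewType.FILE, EntityViewType.FOLDER])\nview = syn.store(view)\n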
"},{"location":"reference/wiki/","title":"Wiki","text":""},{"location":"reference/wiki/#synapseclient.wiki","title":"synapseclient.wiki
","text":"Wiki
A Wiki page requires a title, markdown and an owner object and can also include images.
Creating a Wiki\n
::
from synapseclient import Wiki\n\nentity = syn.get('syn123456')\n\ncontent = \"\"\"\n# My Wiki Page\n\nHere is a description of my **fantastic** project!\n\nAn attached image:\n${image?fileName=logo.png&align=none}\n\"\"\"\n\nwiki = Wiki(title='My Wiki Page',\n owner=entity,\n markdown=content,\n attachments=['/path/to/logo.png'])\n\nwiki = syn.store(wiki)\n
Embedding images\n
Note that in the above example, we've attached a logo graphic and embedded it in the web page.
Figures that are more than just decoration can be stored as Synapse entities allowing versioning and provenance information to be recorded. This is a better choice for figures with data behind them.
Updating a Wiki\n
::
entity = syn.get('syn123456')\nwiki = syn.getWiki(entity)\n\nwiki.markdown = \"\"\"\n# My Wiki Page\n\nHere is a description of my **fantastic** project! Let's\n*emphasize* the important stuff.\n\nAn embedded image that is also a Synapse entity:\n${image?synapseId=syn1824434&align=None&scale=66}\n\nNow we can track its provenance and keep multiple versions.\n\"\"\"\n\nwiki = syn.store(wiki)\n
Wiki Class\n
.. autoclass:: synapseclient.wiki.Wiki :members: __init__
Wiki methods\n
synapseclient.Synapse.getWiki
synapseclient.Synapse.getWikiHeaders
synapseclient.Synapse.store
synapseclient.Synapse.delete
Wiki
","text":" Bases: DictObject
Represents a wiki page in Synapse with content specified in markdown.
:param title: Title of the Wiki :param owner: Parent Entity that the Wiki will belong to :param markdown: Content of the Wiki (cannot be defined if markdownFile is defined) :param markdownFile: Path to file which contains the Content of Wiki (cannot be defined if markdown is defined) :param attachments: List of paths to files to attach :param fileHandles: List of file handle IDs representing files to be attached :param parentWikiId: (optional) For sub-pages, specify parent wiki page
Source code in synapseclient/wiki.py
class Wiki(DictObject):\n    \"\"\"\n    Represents a wiki page in Synapse with content specified in markdown.\n\n    :param title:       Title of the Wiki\n    :param owner:       Parent Entity that the Wiki will belong to\n    :param markdown:    Content of the Wiki (cannot be defined if markdownFile is defined)\n    :param markdownFile: Path to file which contains the Content of Wiki (cannot be defined if markdown is defined)\n    :param attachments: List of paths to files to attach\n    :param fileHandles: List of file handle IDs representing files to be attached\n    :param parentWikiId: (optional) For sub-pages, specify parent wiki page\n    \"\"\"\n\n    __PROPERTIES = (\n        \"title\",\n        \"markdown\",\n        \"attachmentFileHandleIds\",\n        \"id\",\n        \"etag\",\n        \"createdBy\",\n        \"createdOn\",\n        \"modifiedBy\",\n        \"modifiedOn\",\n        \"parentWikiId\",\n    )\n\n    def __init__(self, **kwargs):\n        # Verify that the parameters are correct\n        if \"owner\" not in kwargs:\n            raise ValueError(\"Wiki constructor must have an owner specified\")\n\n        # Initialize the file handle list to be an empty list\n        if \"attachmentFileHandleIds\" not in kwargs:\n            kwargs[\"attachmentFileHandleIds\"] = []\n\n        # update the markdown\n        self.update_markdown(\n            kwargs.pop(\"markdown\", None), kwargs.pop(\"markdownFile\", None)\n        )\n\n        # Move the 'fileHandles' into the proper (wordier) bucket\n        if \"fileHandles\" in kwargs:\n            for handle in kwargs[\"fileHandles\"]:\n                kwargs[\"attachmentFileHandleIds\"].append(handle)\n            del kwargs[\"fileHandles\"]\n\n        super(Wiki, self).__init__(kwargs)\n        self.ownerId = id_of(self.owner)\n        del self[\"owner\"]\n\n    def json(self):\n        \"\"\"Returns the JSON representation of the Wiki object.\"\"\"\n        return json.dumps({k: v for k, v in self.items() if k in self.__PROPERTIES})\n\n    def getURI(self):\n        \"\"\"For internal use.\"\"\"\n\n        return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n\n    def postURI(self):\n        \"\"\"For internal use.\"\"\"\n\n        return \"/entity/%s/wiki\" % self.ownerId\n\n    def putURI(self):\n        \"\"\"For internal use.\"\"\"\n\n        return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n\n    def deleteURI(self):\n        \"\"\"For internal use.\"\"\"\n\n        return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n\n    def update_markdown(self, markdown=None, markdown_file=None):\n        \"\"\"\n        Updates the wiki's markdown. Specify only one of markdown or markdown_file\n        :param markdown:      text that will become the markdown\n        :param markdown_file: path to a file. Its contents will be the markdown\n        \"\"\"\n        if markdown and markdown_file:\n            raise ValueError(\"Please use only one argument: markdown or markdownFile\")\n\n        if markdown_file:\n            # pop the 'markdownFile' kwargs because we don't actually need it in the dictionary to upload to synapse\n            markdown_path = os.path.expandvars(os.path.expanduser(markdown_file))\n            if not os.path.isfile(markdown_path):\n                raise ValueError(markdown_file + \" is not a valid file\")\n            with open(markdown_path, \"r\") as opened_markdown_file:\n                markdown = opened_markdown_file.read()\n\n        self[\"markdown\"] = markdown\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki-functions","title":"Functions","text":""},{"location":"reference/wiki/#synapseclient.wiki.Wiki.json","title":"json()
","text":"Returns the JSON representation of the Wiki object.
Source code in synapseclient/wiki.py
def json(self):\n \"\"\"Returns the JSON representation of the Wiki object.\"\"\"\n return json.dumps({k: v for k, v in self.items() if k in self.__PROPERTIES})\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki.getURI","title":"getURI()
","text":"For internal use.
Source code in synapseclient/wiki.py
def getURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki.postURI","title":"postURI()
","text":"For internal use.
Source code in synapseclient/wiki.py
def postURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki\" % self.ownerId\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki.putURI","title":"putURI()
","text":"For internal use.
Source code in synapseclient/wiki.py
def putURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki.deleteURI","title":"deleteURI()
","text":"For internal use.
Source code in synapseclient/wiki.py
def deleteURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki.update_markdown","title":"update_markdown(markdown=None, markdown_file=None)
","text":"Updates the wiki's markdown. Specify only one of markdown or markdown_file :param markdown: text that will become the markdown :param markdown_file: path to a file. Its contents will be the markdown
Source code in synapseclient/wiki.py
def update_markdown(self, markdown=None, markdown_file=None):\n    \"\"\"\n    Updates the wiki's markdown. Specify only one of markdown or markdown_file\n    :param markdown:      text that will become the markdown\n    :param markdown_file: path to a file. Its contents will be the markdown\n    \"\"\"\n    if markdown and markdown_file:\n        raise ValueError(\"Please use only one argument: markdown or markdownFile\")\n\n    if markdown_file:\n        # pop the 'markdownFile' kwargs because we don't actually need it in the dictionary to upload to synapse\n        markdown_path = os.path.expandvars(os.path.expanduser(markdown_file))\n        if not os.path.isfile(markdown_path):\n            raise ValueError(markdown_file + \" is not a valid file\")\n        with open(markdown_path, \"r\") as opened_markdown_file:\n            markdown = opened_markdown_file.read()\n\n    self[\"markdown\"] = markdown\n
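A short sketch, assuming syn123456 is a placeholder entity with an existing wiki and ~/notes.md is a hypothetical markdown file:
wiki = syn.getWiki(\"syn123456\")\nwiki.update_markdown(markdown_file=\"~/notes.md\")  # or markdown=\"...\", but not both\nwiki = syn.store(wiki)\n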
"},{"location":"reference/wiki/#synapseclient.wiki.WikiAttachment","title":"WikiAttachment
","text":" Bases: DictObject
Represents a wiki page attachment
Source code in synapseclient/wiki.py
class WikiAttachment(DictObject):\n \"\"\"\n Represents a wiki page attachment\n\n \"\"\"\n\n __PROPERTIES = (\"contentType\", \"fileName\", \"contentMd5\", \"contentSize\")\n\n def __init__(self, **kwargs):\n super(WikiAttachment, self).__init__(**kwargs)\n
"},{"location":"reference/wiki/#synapseclient.wiki-functions","title":"Functions","text":""},{"location":"tutorials/authentication/","title":"Authentication","text":"There are multiple ways one can login to Synapse. We recommend users to choose the method that fits their workflow.
"},{"location":"tutorials/authentication/#one-time-login","title":"One Time Login","text":"Use a personal access token token obtained from synapse.org under your Settings. Note that a token must minimally have the view scope to be used with the Synapse Python Client.
syn = synapseclient.login(authToken=\"authtoken\")\n
"},{"location":"tutorials/authentication/#use-environment-variable","title":"Use Environment Variable","text":"Setting the SYNAPSE_AUTH_TOKEN
environment variable will allow you to log in to Synapse with a personal access token.
The environment variable will take priority over credentials in the user's .synapseConfig
file.
In your shell, you can pass an environment variable to Python inline by defining it before the command:
SYNAPSE_AUTH_TOKEN='<my_personal_access_token>' python3\n
Alternatively, you may export it first, then start Python:
export SYNAPSE_AUTH_TOKEN='<my_personal_access_token>'\npython3\n
Once you are inside Python, you may simply log in without passing any arguments:
import synapseclient\nsyn = synapseclient.login()\n
To use the environment variable with the command line client, simply use the synapse
command in place of python:
SYNAPSE_AUTH_TOKEN='<my_personal_access_token>' synapse get syn123\nSYNAPSE_AUTH_TOKEN='<my_personal_access_token>' synapse store --parentid syn123 ~/foobar.txt\n
Or alternatively, for multiple commands:
export SYNAPSE_AUTH_TOKEN='<my_personal_access_token>'\nsynapse get syn123\nsynapse store --parentid syn123 ~/foobar.txt\n
"},{"location":"tutorials/authentication/#use-synapseconfig","title":"Use .synapseConfig
","text":"For writing code using the Synapse Python client that is easy to share with others, please do not include your credentials in the code. Instead, please use .synapseConfig
file to manage your credentials.
When installing the Synapse Python client, a .synapseConfig
file is added to your home directory.
You may also create the ~/.synapseConfig
file by using the command line client and following the interactive prompts:
synapse config\n
The following describes how to add your credentials to the .synapseConfig
file without the use of the synapse config
command.
Open the .synapseConfig
file and find the following section:
#[authentication]\n#username = <username>\n#authtoken = <authtoken>\n
To enable this section, uncomment it. You don't need to specify your username alongside the authtoken, but if you do, it will be used to verify your identity. A personal access token generated from your synapse.org Settings can be used as the authtoken in your .synapseConfig.
[authentication]\nauthtoken = <authtoken>\n
Now you can log in without specifying any arguments:
import synapseclient\nsyn = synapseclient.login()\n
For legacy compatibility, the .synapseConfig
[authentication]
section will continue to support apikey
or username
+ password
pair until early 2024, when both will be deprecated in favor of personal access tokens (authtoken
) which can be scoped to certain functions and are revocable.
For more information, see:
The Synapse Python Client can be used from the command line via the synapse
command.
Note: The command line client is installed along with the Synapse Python client.
"},{"location":"tutorials/command_line_client/#usage","title":"Usage","text":"For help, type:
synapse -h\n
For help on specific commands, type:
synapse [command] -h\n
To log in with an auth token environment variable, type:
synapse login -p $MY_SYNAPSE_TOKEN\nWelcome, First Last!\nLogged in as: username (1234567)\n
The usage is as follows:
synapse [-h] [--version] [-u SYNAPSEUSER] [-p SYNAPSEPASSWORD] [-c CONFIGPATH] [--debug] [--silent] [-s]\n [--otel {console,otlp}]\n {get,manifest,sync,store,add,mv,cp,get-download-list,associate,delete,query,submit,show,cat,list,config,set-provenance,get-provenance,set-annotations,get-annotations,create,store-table,onweb,login,test-encoding,get-sts-token,migrate}\n ...\n
"},{"location":"tutorials/command_line_client/#options","title":"Options","text":"Name Type Description Default --version
Flag Show program\u2019s version number and exit -u, --username
Option Username used to connect to Synapse -p, --password
Option Password, api key, or token used to connect to Synapse -c, --configPath
Option Path to configuration file used to connect to Synapse \u201c~/.synapseConfig\u201d --debug
Flag Set to debug mode, additional output and error messages are printed to the console False --silent
Flag Set to silent mode, console output is suppressed False -s, --skip-checks
Flag Suppress checking for version upgrade messages and endpoint redirection False --otel
Option Enable the usage of OpenTelemetry for tracing. Possible choices: console, otlp"},{"location":"tutorials/command_line_client/#subcommands","title":"Subcommands","text":"get
","text":"synapse get [-h] [-q queryString] [-v VERSION] [-r] [--followLink] [--limitSearch projId] [--downloadLocation path]\n [--multiThreaded] [--manifest {all,root,suppress}]\n [local path]\n
Name Type Description Default local path
Positional Synapse ID of form syn123 of desired data object. -q, --query
Named Optional query parameter, will fetch all of the entities returned by a query. -v, --version
Named Synapse version number of entity to retrieve. Most recent version -r, --recursive
Named Fetches content in Synapse recursively contained in the parentId specified by id. False --followLink
Named Determines whether the link returns the target Entity. False --limitSearch
Named Synapse ID of a container such as project or folder to limit search for files if using a path. --downloadLocation
Named Directory to download file to. \u201c./\u201d --multiThreaded
Named Download file using a multiple threaded implementation. True --manifest
Named Determines whether a manifest file is created automatically. \u201call\u201d"},{"location":"tutorials/command_line_client/#manifest","title":"manifest
","text":"Generate manifest for uploading directory tree to Synapse.
synapse manifest [-h] --parent-id syn123 [--manifest-file OUTPUT] PATH\n
Name Type Description Default PATH
Positional A path to a file or folder whose manifest will be generated. --parent-id
Named Synapse ID of project or folder where to upload data. --manifest-file
Named A TSV output file path where the generated manifest is stored. stdout"},{"location":"tutorials/command_line_client/#sync","title":"sync
","text":"Synchronize files described in a manifest to Synapse.
synapse sync [-h] [--dryRun] [--sendMessages] [--retries INT] FILE\n
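A typical workflow, sketched here with placeholder paths and IDs, is to generate a manifest with the manifest subcommand above, validate it with a dry run, and then upload:
synapse manifest --parent-id syn123 --manifest-file ./manifest.tsv ./data\nsynapse sync --dryRun ./manifest.tsv\nsynapse sync ./manifest.tsv\n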
Name Type Description Default FILE
Positional A tsv file with file locations and metadata to be pushed to Synapse. See synapseutils.sync.syncToSynapse for details on the format of a manifest. --dryRun
Named Perform validation without uploading. False --sendMessages
Named Send notifications via Synapse messaging (email) at specific intervals, on errors and on completion. False --retries
Named Number of retries for failed uploads. 4"},{"location":"tutorials/command_line_client/#store","title":"store
","text":"Uploads and adds a file to Synapse.
synapse store [-h] (--parentid syn123 | --id syn123 | --type TYPE) [--name NAME]\n [--description DESCRIPTION | --descriptionFile DESCRIPTION_FILE_PATH] [--used [target [target ...]]]\n [--executed [target [target ...]]] [--limitSearch projId] [--noForceVersion] [--annotations ANNOTATIONS]\n [--replace]\n [FILE]\n
Name Type Description Default FILE
Positional File to be added to synapse. --parentid, --parentId
Named Synapse ID of project or folder where to upload data (must be specified if \u2013id is not used). --id
Named Optional Id of entity in Synapse to be updated. --type
Named Type of object, such as \u201cFile\u201d, \u201cFolder\u201d, or \u201cProject\u201d, to create in Synapse. \u201cFile\u201d --name
Named Name of data object in Synapse. --description
Named Description of data object in Synapse. --descriptionFile
Named Path to a markdown file containing description of project/folder. --used
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) from which the specified entity is derived. --executed
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) that was executed to generate the specified entity. --limitSearch
Named Synapse ID of a container such as project or folder to limit search for provenance files. --noForceVersion
Named Do not force a new version to be created if the contents of the file have not changed. False --annotations
Named Annotations to add as a JSON formatted string, should evaluate to a dictionary (key/value pairs). Example: \u2018{\u201cfoo\u201d: 1, \u201cbar\u201d:\u201dquux\u201d}\u2019 --replace
Named Replace all existing annotations with the given annotations. False"},{"location":"tutorials/command_line_client/#add","title":"add
","text":"Uploads and adds a file to Synapse.
synapse add [-h] (--parentid syn123 | --id syn123 | --type TYPE) [--name NAME]\n [--description DESCRIPTION | --descriptionFile DESCRIPTION_FILE_PATH] [--used [target [target ...]]]\n [--executed [target [target ...]]] [--limitSearch projId] [--noForceVersion] [--annotations ANNOTATIONS] [--replace]\n [FILE]\n
Name Type Description Default FILE
Positional File to be added to synapse. --parentid, --parentId
Named Synapse ID of project or folder where to upload data (must be specified if \u2013id is not used). --id
Named Optional Id of entity in Synapse to be updated. --type
Named Type of object, such as \u201cFile\u201d, \u201cFolder\u201d, or \u201cProject\u201d, to create in Synapse. \u201cFile\u201d --name
Named Name of data object in Synapse. --description
Named Description of data object in Synapse. --descriptionFile
Named Path to a markdown file containing description of project/folder. --used
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) from which the specified entity is derived. --executed
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) that was executed to generate the specified entity. --limitSearch
Named Synapse ID of a container such as project or folder to limit search for provenance files. --noForceVersion
Named Do not force a new version to be created if the contents of the file have not changed. False --annotations
Named Annotations to add as a JSON formatted string, should evaluate to a dictionary (key/value pairs). Example: \u2018{\u201cfoo\u201d: 1, \u201cbar\u201d:\u201dquux\u201d}\u2019 --replace
Named Replace all existing annotations with the given annotations. False"},{"location":"tutorials/command_line_client/#mv","title":"mv
","text":"Moves a file/folder in Synapse.
synapse mv [-h] --id syn123 --parentid syn123\n
Name Type Description --id
Named Id of entity in Synapse to be moved. --parentid, --parentId
Named Synapse ID of project or folder where file/folder will be moved."},{"location":"tutorials/command_line_client/#cp","title":"cp
","text":"Copies specific versions of synapse content such as files, folders and projects by recursively copying all sub-content.
synapse cp [-h] --destinationId syn123 [--version 1] [--setProvenance traceback] [--updateExisting] [--skipCopyAnnotations]\n [--excludeTypes [file table [file table ...]]] [--skipCopyWiki]\n syn123\n
Name Type Description Default syn123
Positional Id of entity in Synapse to be copied. --destinationId
Named Synapse ID of project or folder where file will be copied to. --version, -v
Named Synapse version number of File or Link to retrieve. This parameter cannot be used when copying Projects or Folders. Defaults to most recent version. Most recent version --setProvenance
Named Has three values to set the provenance of the copied entity: traceback sets it to the source entity; existing sets it to the source entity\u2019s original provenance (if it exists); None/none sets no provenance. "traceback" --updateExisting
Named Will update the file if there is already a file that is named the same in the destination False --skipCopyAnnotations
Named Do not copy the annotations False --excludeTypes
Named Accepts a list of entity types (file, table, link) which determines which entity types to not copy. [] --skipCopyWiki
Named Do not copy the wiki pages False"},{"location":"tutorials/command_line_client/#get-download-list","title":"get-download-list
","text":"Download files from the Synapse download cart.
synapse get-download-list [-h] [--downloadLocation path]\n
Name Type Description Default --downloadLocation
Named Directory to download file to. \"./\""},{"location":"tutorials/command_line_client/#associate","title":"associate
","text":"Associate local files with the files stored in Synapse so that calls to \u201csynapse get\u201d and \u201csynapse show\u201d don\u2019t re-download the files but use the already existing file.
synapse associate [-h] [--limitSearch projId] [-r] path\n
Name Type Description Default path
Positional Local file path. --limitSearch
Named Synapse ID of a container such as project or folder to limit search to. -r
Named Perform recursive association with all local files in a folder. False"},{"location":"tutorials/command_line_client/#delete","title":"delete
","text":"Removes a dataset from Synapse.
synapse delete [-h] [--version VERSION] syn123\n
Name Type Description syn123
Positional Synapse ID of form syn123 of desired data object. --version
Named Version number to delete of given entity."},{"location":"tutorials/command_line_client/#query","title":"query
","text":"Performs SQL like queries on Synapse.
synapse query [-h] [string [string ...]]\n
Name Type Description string
Positional A query string. Note that when using the command line query strings must be passed intact as a single string. In most shells this can mean wrapping the query in quotes as appropriate and escaping any quotes that may appear within the query string itself. Example: synapse query \"select \\\"column has spaces\\\" from syn123\"
. See Table Examples for more information."},{"location":"tutorials/command_line_client/#submit","title":"submit
","text":"Submit an entity or a file for evaluation.
synapse submit [-h] [--evaluationID EVALUATIONID] [--evaluationName EVALUATIONNAME] [--entity ENTITY] [--file FILE]\n [--parentId PARENTID] [--name NAME] [--teamName TEAMNAME] [--submitterAlias ALIAS] [--used [target [target ...]]]\n [--executed [target [target ...]]] [--limitSearch projId]\n
Name Type Description --evaluationID, --evaluationId, --evalID
Named Evaluation ID where the entity/file will be submitted. --evaluationName, --evalN
Named Evaluation Name where the entity/file will be submitted. --entity, --eid, --entityId, --id
Named Synapse ID of the entity to be submitted. --file, -f
Named File to be submitted to the challenge. --parentId, --parentid, --parent
Named Synapse ID of project or folder where to upload data. --name
Named Name of the submission. --teamName, --team
Named Submit on behalf of a registered team. --submitterAlias, --alias
Named A nickname, possibly for display in leaderboards. --used
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) from which the specified entity is derived. --executed
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) that was executed to generate the specified entity. --limitSearch
Named Synapse ID of a container such as project or folder to limit search for provenance files."},{"location":"tutorials/command_line_client/#show","title":"show
","text":"Show metadata for an entity.
synapse show [-h] [--limitSearch projId] syn123\n
Name Type Description syn123
Positional Synapse ID of form syn123 of desired synapse object. --limitSearch
Named Synapse ID of a container such as project or folder to limit search for provenance files."},{"location":"tutorials/command_line_client/#cat","title":"cat
","text":"Prints a dataset from Synapse.
synapse cat [-h] [-v VERSION] syn123\n
Name Type Description Default syn123
Positional Synapse ID of form syn123 of desired data object. -v, --version
Named Synapse version number of entity to display. Most recent version"},{"location":"tutorials/command_line_client/#list","title":"list
","text":"List Synapse entities contained by the given Project or Folder. Note: May not be supported in future versions of the client.
synapse list [-h] [-r] [-l] [-m] syn123\n
Name Type Description Default syn123
Positional Synapse ID of a project or folder. -r, --recursive
Named Recursively list contents of the subtree descending from the given Synapse ID. False -l, --long
Named List synapse entities in long format. False -m, --modified
Named List modified by and modified date. False"},{"location":"tutorials/command_line_client/#config","title":"config
","text":"Create or modify a Synapse configuration file.
synapse config [-h]\n
Name Type Description -h
Named Show the help message and exit."},{"location":"tutorials/command_line_client/#set-provenance","title":"set-provenance
","text":"Create provenance records.
synapse set-provenance [-h] --id syn123 [--name NAME] [--description DESCRIPTION] [-o [OUTPUT_FILE]]\n [--used [target [target ...]]] [--executed [target [target ...]]] [--limitSearch projId]\n
Name Type Description --id
Named Synapse ID of entity whose provenance we are accessing. --name
Named Name of the activity that generated the entity. --description
Named Description of the activity that generated the entity. -o, --output
Named Output the provenance record in JSON format. --used
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) from which the specified entity is derived. --executed
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) that was executed to generate the specified entity. --limitSearch
Named Synapse ID of a container such as project or folder to limit search for provenance files."},{"location":"tutorials/command_line_client/#get-provenance","title":"get-provenance
","text":"Show provenance records.
synapse get-provenance [-h] --id syn123 [--version version] [-o [OUTPUT_FILE]]\n
Name Type Description --id
Named Synapse ID of entity whose provenance we are accessing. --version
Named Version of Synapse entity whose provenance we are accessing. -o, --output
Named Output the provenance record in JSON format."},{"location":"tutorials/command_line_client/#set-annotations","title":"set-annotations
","text":"Create annotations records.
synapse set-annotations [-h] --id syn123 --annotations ANNOTATIONS [-r]\n
Name Type Description Default --id
Named Synapse ID of entity whose annotations we are accessing. --annotations
Named Annotations to add as a JSON formatted string, should evaluate to a dictionary (key/value pairs). Example: \u2018{\u201cfoo\u201d: 1, \u201cbar\u201d:\u201dquux\u201d}\u2019. -r, --replace
Named Replace all existing annotations with the given annotations. False"},{"location":"tutorials/command_line_client/#get-annotations","title":"get-annotations
","text":"Show annotations records.
synapse get-annotations [-h] --id syn123 [-o [OUTPUT_FILE]]\n
Name Type Description --id
Named Synapse ID of entity whose annotations we are accessing. -o, --output
Named Output the annotations record in JSON format."},{"location":"tutorials/command_line_client/#create","title":"create
","text":"Creates folders or projects on Synapse.
synapse create [-h] [--parentid syn123] --name NAME [--description DESCRIPTION | --descriptionFile DESCRIPTION_FILE_PATH] type\n
Name Type Description type
Positional Type of object to create in synapse one of {Project, Folder}. --parentid, --parentId
Named Synapse ID of project or folder where to place folder [not used with project]. --name
Named Name of folder/project. --description
Named Description of project/folder. --descriptionFile
Named Path to a markdown file containing description of project/folder."},{"location":"tutorials/command_line_client/#store-table","title":"store-table
","text":"Creates a Synapse Table given a csv.
synapse store-table [-h] --name NAME [--parentid syn123] [--csv foo.csv]\n
Name Type Description --name
Named Name of Table. --parentid, --parentId
Named Synapse ID of project. --csv
Named Path to csv."},{"location":"tutorials/command_line_client/#onweb","title":"onweb
","text":"Opens Synapse website for Entity.
synapse onweb [-h] id\n
Name Type Description id
Positional Synapse id."},{"location":"tutorials/command_line_client/#login","title":"login
","text":"Login to Synapse and (optionally) cache credentials.
synapse login [-h] [-u SYNAPSEUSER] [-p SYNAPSEPASSWORD] [--rememberMe]\n
Name Type Description Default -u, --username
Named Username used to connect to Synapse. -p, --password
Named This will be deprecated. Password or api key used to connect to Synapse. --rememberMe, --remember-me
Named Cache credentials for automatic authentication on future interactions with Synapse. False"},{"location":"tutorials/command_line_client/#test-encoding","title":"test-encoding
","text":"Test character encoding to help diagnose problems.
synapse test-encoding [-h]\n
Name Type Description -h
Named Show the help message and exit."},{"location":"tutorials/command_line_client/#get-sts-token","title":"get-sts-token
","text":"Get an STS token for access to AWS S3 storage underlying Synapse.
synapse get-sts-token [-h] [-o {json,boto,shell,bash,cmd,powershell}] id {read_write,read_only}\n
Name Type Description Default id
Positional Synapse id. permission
Positional Possible choices: read_write, read_only. -o, --output
Named Possible choices: json, boto, shell, bash, cmd, powershell. \"shell\""},{"location":"tutorials/command_line_client/#migrate","title":"migrate
","text":"Migrate Synapse entities to a different storage location.
synapse migrate [-h] [--source_storage_location_ids [SOURCE_STORAGE_LOCATION_IDS [SOURCE_STORAGE_LOCATION_IDS ...]]]\n [--file_version_strategy FILE_VERSION_STRATEGY] [--include_table_files] [--continue_on_error]\n [--csv_log_path CSV_LOG_PATH] [--dryRun] [--force]\n id dest_storage_location_id db_path\n
Name Type Description Default id
Positional Synapse id. dest_storage_location_id
Positional Destination Synapse storage location id. db_path
Positional Local system path where a record keeping file can be stored. --source_storage_location_ids
Named Source Synapse storage location ids. If specified only files in these storage locations will be migrated. --file_version_strategy
Named One of \u2018new\u2019, \u2018latest\u2019, \u2018all\u2019, \u2018skip\u2019. New creates a new version of each entity, latest migrates the most recent version, all migrates all versions, skip avoids migrating file entities (use when exclusively targeting table attached files). "new" --include_table_files
Named Include table attached files when migrating. False --continue_on_error
Named Whether to continue processing other entities if migration of one fails. False --csv_log_path
Named Path where to log a csv documenting the changes from the migration. --dryRun
Named Dry run, files will be indexed but not migrated. False --force
Named Bypass interactive prompt confirming migration. False"},{"location":"tutorials/configuration/","title":"Configuration","text":"The synapse python client can be configured either programmatically or by using a configuration file. When installing the Synapse Python client, the .synapseConfig
is added to your home directory. This configuration file is used to store a number of configuration options, including your Synapse authtoken, cache, and multi-threading settings.
A full example .synapseConfig
can be found in the github repository.
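As a minimal sketch assembled from the sections described below (the token and cache path are placeholders), an edited file might look like:
[authentication]\nauthtoken = <authtoken>\n\n[cache]\nlocation = /path/to/cache/folder\n\n[transfer]\nmax_threads = 10\n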
.synapseConfig
sections","text":""},{"location":"tutorials/configuration/#authentication","title":"[authentication]
","text":"See details on this section in the authentication document.
"},{"location":"tutorials/configuration/#cache","title":"[cache]
","text":"Your downloaded files are cached to avoid repeat downloads of the same file. change 'location' to use a different folder on your computer as the cache location
"},{"location":"tutorials/configuration/#endpoints","title":"[endpoints]
","text":"Configuring these will cause the Python client to use these as Synapse service endpoints instead of the default prod endpoints.
"},{"location":"tutorials/configuration/#transfer","title":"[transfer]
","text":"Settings to configure how Synapse uploads/downloads data.
You may also set the max_threads
programmatically via:
import synapseclient\nsyn = synapseclient.login()\nsyn.max_threads = 10\n
"},{"location":"tutorials/file_versioning/","title":"File Upload","text":"Files in Synapse are versionable. Please see Versioning for more information about how versions in Files works.
"},{"location":"tutorials/file_versioning/#uploading-a-new-version","title":"Uploading a New Version","text":"Uploading a new version follows the same steps as uploading a file for the first time - use the same file name and store it in the same location (e.g., the same parentId). It is recommended to add a comment to the new version in order to easily track differences at a glance. The example file raw_data.txt
will now have a version of 2 and a comment describing the change.
Explicit example:
import synapseclient\n\nsyn = synapseclient.login()\n\n# fetch the file in Synapse\nfile_to_update = syn.get('syn2222', downloadFile=False)\n\n# save the local path to the new version of the file\nfile_to_update.path = '/path/to/new/version/of/raw_data.txt'\n\n# add a version comment\nfile_to_update.versionComment = 'Added 5 random normally distributed numbers.'\n\n# store the new file\nupdated_file = syn.store(file_to_update)\n
Implicit example:
# Assuming that there is a file created with:\nsyn.store(File('path/to/old/raw_data.txt', parentId='syn123456'))\n\n# To create a new version of that file, make sure you store it with the exact same name\nnew_file = syn.store(File('path/to/new_version/raw_data.txt', parentId='syn123456'))\n
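To confirm the version bump, the stored entity's version is available on the returned object:
# the second store under the same name and parent should report version 2\nprint(new_file.versionNumber)\n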
"},{"location":"tutorials/file_versioning/#updating-annotations-or-provenance-without-changing-versions","title":"Updating Annotations or Provenance without Changing Versions","text":"Any change to a File will automatically update its version. If this isn\u2019t the desired behavior, such as minor changes to the metadata, you can set forceVersion=False
with the Python client. For command line, the commands set-annotations
and set-provenance
will update the metadata without creating a new version. Adding/updating annotations and provenance in the web client will also not cause a version change.
Important: Because Provenance is tracked by version, set forceVersion=False
for minor changes to avoid breaking Provenance.
Setting annotations without changing version:
# Get file from Synapse, set downloadFile=False since we are only updating annotations\nfile = syn.get('syn56789', downloadFile=False)\n\n# Add annotations\nfile.annotations = {"fileType":"bam", "assay":"RNA-seq"}\n\n# Store the file without creating a new version\nfile = syn.store(file, forceVersion=False)\n
"},{"location":"tutorials/file_versioning/#setting-provenance-without-changing-version","title":"Setting Provenance without Changing Version","text":"To set Provenance without changing the file version:
from synapseclient import Activity\n\n# Get file from Synapse, set downloadFile=False since we are only updating provenance\nfile = syn.get('syn56789', downloadFile=False)\n\n# setProvenance stores the Activity against the current version without creating a new one\nsyn.setProvenance(file, activity=Activity(used='/path/to/example_code'))\n
"},{"location":"tutorials/file_versioning/#downloading-a-specific-version","title":"Downloading a Specific Version","text":"By default, the File downloaded will always be the most recent version. However, a specific version can be downloaded by passing the version parameter:
entity = syn.get(\"syn3260973\", version=1)\n
"},{"location":"tutorials/installation/","title":"Installation","text":"By following the instructions below, you are installing the synapseclient
, synapseutils
and the command line client.
The synapseclient package is available from PyPI. It can be installed or upgraded with pip. Due to the nature of Python, we highly recommend you set up your python environment with conda or pyenv and create virtual environments to control your Python dependencies for your work.
conda create -n synapseclient python=3.9\nconda activate synapseclient\n(sudo) pip install (--upgrade) synapseclient[pandas, pysftp]\n
pyenv install -v 3.9.13\npyenv global 3.9.13\npython -m venv env\nsource env/bin/activate\n(sudo) python3 -m pip3 install (--upgrade) synapseclient[pandas, pysftp]\n
The dependencies on pandas and pysftp are optional. The Synapse synapseclient.table
feature integrates with Pandas. Support for sftp is required for users of SFTP file storage. Both require native libraries to be compiled or installed separately from prebuilt binaries.
Source code and development versions are available on Github. Installing from source:
git clone git://github.com/Sage-Bionetworks/synapsePythonClient.git\ncd synapsePythonClient\n
You can stay on the master branch to get the latest stable release or check out the develop branch or a tagged revision:
git checkout <branch or tag>\n
Next, either install the package in the site-packages directory pip install .
or pip install -e .
to make the installation follow the head without having to reinstall:
pip install .\n
"},{"location":"tutorials/python_client/","title":"Working with the Python client","text":""},{"location":"tutorials/python_client/#authentication","title":"Authentication","text":"Most operations in Synapse require you to be logged in. Please follow instructions in authentication to configure your client:
import synapseclient\nsyn = synapseclient.Synapse()\nsyn.login()\n# If you aren't logged in, this following command will\n# show that you are an \"anonymous\" user.\nsyn.getUserProfile()\n
"},{"location":"tutorials/python_client/#accessing-data","title":"Accessing Data","text":"Synapse identifiers are used to refer to projects and data which are represented by synapseclient.entity
objects. For example, the entity syn1899498 represents a tab-delimited file containing a 100 by 4 matrix. Getting the entity retrieves an object that holds metadata describing the matrix, and also downloads the file to a local cache:
import synapseclient\n# This is a shortcut to login\nsyn = synapseclient.login()\nentity = syn.get('syn1899498')\n
View the entity's metadata in the Python console:
print(entity)\n
This is one simple way to read in a small matrix:
rows = []\nwith open(entity.path) as f:\n header = f.readline().split('\\t')\n for line in f:\n row = [float(x) for x in line.split('\\t')]\n rows.append(row)\n
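If you have pandas installed, the same tab-delimited file can be read in a single call (this assumes a header row, as above):
import pandas as pd\n\n# entity.path points at the locally cached copy of the file\ndf = pd.read_csv(entity.path, sep='\\t')\n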
View the entity in the browser:
syn.onweb('syn1899498')\n
You can create your own projects and upload your own data sets. Synapse stores entities in a hierarchical or tree structure. Projects are at the top level and must be uniquely named:
import synapseclient\nfrom synapseclient import Project, Folder, File\n\nsyn = synapseclient.login()\n# Project names must be globally unique\nproject = Project('My uniquely named project')\nproject = syn.store(project)\n
Creating a folder:
data_folder = Folder('Data', parent=project)\ndata_folder = syn.store(data_folder)\n
Adding files to the project. You will get an error if you try to store an empty file in Synapse. Here we create temporary files, but you can specify your own file path:
import tempfile\n\ntemp = tempfile.NamedTemporaryFile(prefix='your_file', suffix='.txt')\nwith open(temp.name, \"w\") as temp_f:\n temp_f.write(\"Example text\")\nfilepath = temp.name\ntest_entity = File(filepath, description='Fancy new data', parent=data_folder)\ntest_entity = syn.store(test_entity)\nprint(test_entity)\n
You may notice that there is \"downloadAs\" name and \"entity name\". By default, the client will use the file's name as the entity name, but you can configure the file to display a different name on Synapse:
test_second_entity = File(filepath, name=\"second file\", parent=data_folder)\ntest_second_entity = syn.store(test_second_entity)\nprint(test_second_entity)\n
In addition to simple data storage, Synapse entities can be annotated with key/value metadata, described in markdown documents (Wiki), and linked together in provenance graphs to create a reproducible record of a data analysis pipeline.
See also:
Annotations are arbitrary metadata attached to Synapse entities. There are different ways to creating annotations. Using the entity created from the previous step in the tutorial, for example:
# First method\ntest_ent = syn.get(test_entity.id)\ntest_ent.foo = \"foo\"\ntest_ent.bar = \"bar\"\nsyn.store(test_ent)\n\n# Second method\ntest_ent = syn.get(test_entity.id)\nannotations = {\"foo\": \"foo\", \"bar\": \"bar\"}\ntest_ent.annotations = annotations\nsyn.store(test_ent)\n
See:
Synapse supports versioning of many entity types. This tutorial will focus on File versions. Using the project/folder created earlier in this tutorial
Uploading a new version. Synapse leverages the entity name to version entities:
import tempfile\n\ntemp = tempfile.NamedTemporaryFile(prefix='second', suffix='.txt')\nwith open(temp.name, \"w\") as temp_f:\n temp_f.write(\"First text\")\n\nversion_entity = File(temp.name, parent=data_folder)\nversion_entity = syn.store(version_entity)\nprint(version_entity.versionNumber)\n\nwith open(temp.name, \"w\") as temp_f:\n temp_f.write(\"Second text\")\nversion_entity = File(temp.name, parent=data_folder)\nversion_entity = syn.store(version_entity)\nprint(version_entity.versionNumber)\n
Downloading a specific version. By default, Synapse downloads the latest version unless a version is specified:
version_1 = syn.get(version_entity, version=1)\n
"},{"location":"tutorials/python_client/#provenance","title":"Provenance","text":"Synapse provides tools for tracking 'provenance', or the transformation of raw data into processed results, by linking derived data objects to source data and the code used to perform the transformation:
# pass the provenance to the store function\nprovenance_ent = syn.store(\n version_entity,\n used=[version_1.id],\n executed=[\"https://github.com/Sage-Bionetworks/synapsePythonClient/tree/v2.7.2\"]\n)\n
See:
Views display rows and columns of information, and they can be shared and queried with SQL. Views are queries of other data already in Synapse. They allow you to see groups of files, tables, projects, or submissions and any associated annotations about those items.
Annotations are an essential component to building a view. Annotations are labels that you apply to your data, stored as key-value pairs in Synapse.
We will create a file view from the project above:
import synapseclient\nfrom synapseclient import EntityViewSchema\n\nsyn = synapseclient.login()\n# Here we are using project.id from the earlier sections of this tutorial\nproject_id = project.id\nfileview = EntityViewSchema(\n    name='MyTable',\n    parent=project_id,\n    scopes=[project_id]\n)\nfileview_ent = syn.store(fileview)\n
You can now query it to see all the files within the project. Note: it is highly recommended to install pandas
:
query = syn.tableQuery(f\"select * from {fileview_ent.id}\")\nquery_results = query.asDataFrame()\nprint(query_results)\n
See:
For more information see the Synapse Getting Started.
"},{"location":"tutorials/reticulate/","title":"Using synapseclient with R through reticulate","text":"This article describes using the Python synapseclient with R through the reticulate package, which provides an interface between R and Python libraries.
While the separate synapser R package exists and can be installed directly in an R environment without the need for reticulate, it is not currently compatible with an R environment that already includes reticulate. In such cases using the Python synapseclient is an alternative.
"},{"location":"tutorials/reticulate/#installation","title":"Installation","text":""},{"location":"tutorials/reticulate/#installing-reticulate","title":"Installing reticulate","text":"This article assumes that reticulate is installed and available in your R environment. If not it can be installed as follows:
install.packages(\"reticulate\")\n
"},{"location":"tutorials/reticulate/#installing-synapseclient","title":"Installing synapseclient","text":"The Python synapseclient can be installed either directly into the Python installation you intend to use with reticulate or from within R using the reticulate library.
synapseclient has the same requirements and dependencies when installed for use with reticulate as it does in other usage. In particular note that synapseclient requires a Python version of 3.6 or greater.
"},{"location":"tutorials/reticulate/#installing-into-python","title":"Installing into Python","text":"The Python synapseclient is available on the PyPi package repository and can be installed through Python tools that interface with the repository, such as pip. To install synapseclient for use with reticulate directly into a Python environment, first ensure that the current Python interpreter is the one you intend to use with reticulate. This may be a particular installation of Python, or a loaded virtual environment. See reticulate's Python version configuration documentation for more information on how reticulate can be configured to use particular Python environments.
For help installing a reticulate compatible Python, see the reticulate version of the SynapseShinyApp.
Once you have ensured you are interacting with your intended Python interpreter, follow the standard synapseclient installation instructions to install synapseclient.
"},{"location":"tutorials/reticulate/#installing-from-rreticulate","title":"Installing from R/Reticulate","text":"To install synapseclient from within R, first ensure that the reticulate library is loaded.
library(reticulate)\n
Once loaded, ensure that reticulate will use the Python installation you intend. You may need to provide reticulate a hint or otherwise point it at the proper Python installation.
Next install the synapseclient using reticulate's py_install command, e.g.
py_install(\"synapseclient\")\n
You may also want to install some of synapseclient's optional dependencies, such as Pandas for table support.
py_install(\"pandas\")\n
See synapseclient's installation instructions for more information on optional dependencies.
"},{"location":"tutorials/reticulate/#usage","title":"Usage","text":"Once synapseclient is installed it can be used once it is imported through R's import command:
synapseclient <- import(\"synapseclient\")\n
If you are using synapseclient with reticulate when writing an R package, you will want to wrap the import in an onLoad and use the delay_load option, .e.g.
synapseclient <- NULL\n\n.onLoad <- function(libname, pkgname) {\n synapseclient <<- reticulate::import(\"synapseclient\", delay_load = TRUE)\n}\n
This will allow users of your package to configure their reticulate usage properly regardless of when they load your package. More information on this technique can be found here.
If you are familiar with the synapser R package, many of the commands will be similar, but unlike in synapser where package functions and classes are made available in the global namespace through the search path, when using synapseclient through reticulate, classes are accessed through the imported synapseclient module and functionality is provided through an instantiated Synapse instance.
For example classes that were globally available are now available through the imported synapseclient module.
# File from synapser\nsynapseclient$File\n\n# Table from synapser\nsynapseclient$Table\n
And various syn functions are now methods on the Synapse object:
# using synapseclient with reticulate we must instantiate a Synapse instance\nsyn <- synapseclient$Synapse()\n\n# synLogin from synapser\nsyn$login()\n\n# synGet from synapser\nsyn$get(identifier)\n\n# synStore from syanpser\nsyn$store(entity)\n
Each synapse object has its own state, such as configuration and login credentials.
"},{"location":"tutorials/reticulate/#credentials","title":"Credentials","text":"synapseclient accessed through reticulate supports the same authentication options as it does when accessed directly from Python, for example:
syn <- synapseclient$synapse()\n\n# one time login\nsyn$login('<username', '<password>')\n\n# login and store credentials for future use\nsyn$login('<username', '<password>', rememberMe=TRUE)\n
See Managing Synapse Credentials for complete documentation on how synapseclient handles credentials and authentication.
"},{"location":"tutorials/reticulate/#accessing-data","title":"Accessing Data","text":"The following illustrates some examples of storing and retrieving data in Synapse using synapseclient through reticulate.
See here for more details on available data access APIs.
Create a project with a unique name
# use hex_digits to generate random string and use it to name a project\nhex_digits <- c(as.character(0:9), letters[1:6])\nprojectName <- sprintf(\"My unique project %s\", paste0(sample(hex_digits, 32, replace = TRUE), collapse = \"\"))\n\nproject <- synapseclient$Project(projectName)\nproject <- syn$store(project)\n
Create, store, and retrieve a file
filePath <- tempfile()\nconnection <- file(filePath)\nwriteChar(\"a \\t b \\t c \\n d \\t e \\t f \\n\", connection, eos = NULL)\nclose(connection)\n\nfile <- synapseclient$File(path = filePath, parent = project)\nfile <- syn$store(file)\nsynId <- file$properties$id\n\n# download the file using its identifier to specific path\nfileEntity <- syn$get(synId, downloadLocation=\"/path/to/folder\")\n\n# view the file meta data in the console\nprint(fileEntity)\n\n# view the file on the web\nsyn$onweb(synId)\n
Create folder and add files to the folder:
dataFolder <- synapseclient$Folder(\"Data\", parent = project)\ndataFolder <- syn$store(dataFolder)\n\nfilePath <- tempfile()\nconnection <- file(filePath)\nwriteChar(\"this is the content of the file\", connection, eos = NULL)\nclose(connection)\nfile <- synapseclient$File(path = filePath, parent = dataFolder)\nfile <- syn$store(file)\n
"},{"location":"tutorials/reticulate/#annotating-synapse-entities","title":"Annotating Synapse Entities","text":"This illustrates adding annotations to a Synapse entity.
# first retrieve the existing annotations object\nannotations <- syn$get_annotations(project)\n\nannotations$foo <- \"bar\"\nannotations$fooList <- list(\"bar\", \"baz\")\n\nsyn$set_annotations(annotations)\n
See here for more information on annotations.
"},{"location":"tutorials/reticulate/#activityprovenance","title":"Activity/Provenance","text":"This example illustrates creating an entity with associated provenance.
See here for more information on Activity/Provenance related APIs.
act <- synapseclient$Activity(\n name = \"clustering\",\n description = \"whizzy clustering\",\n used = c(\"syn1234\", \"syn1235\"),\n executed = \"syn4567\")\n
filePath <- tempfile()\nconnection <- file(filePath)\nwriteChar(\"some test\", connection, eos = NULL)\nclose(connection)\n\nfile = synapseclient$File(filePath, name=\"provenance_file.txt\", parent=project)\nfile <- syn$store(file, activity = act)\n
"},{"location":"tutorials/reticulate/#tables","title":"Tables","text":"These examples illustrate manipulating Synapse Tables. Note that you must have installed the Pandas dependency into the Python environment as described above in order to use this feature.
See here for more information on tables.
The following illustrates building a table from an R data frame. The schema will be generated from the data types of the values within the data frame.
# start with an R data frame\ngenes <- data.frame(\n Name = c(\"foo\", \"arg\", \"zap\", \"bah\", \"bnk\", \"xyz\"),\n Chromosome = c(1, 2, 2, 1, 1, 1),\n Start = c(12345, 20001, 30033, 40444, 51234, 61234),\n End = c(126000, 20200, 30999, 41444, 54567, 68686),\n Strand = c(\"+\", \"+\", \"-\", \"-\", \"+\", \"+\"),\n TranscriptionFactor = c(F, F, F, F, T, F))\n\n# build a Synapse table from the data frame.\n# a schema is automatically generated\n# note that reticulate will automatically convert from an R data frame to Pandas\ntable <- synapseclient$build_table(\"My Favorite Genes\", project, genes)\n\ntable <- syn$store(table)\n
Alternately the schema can be specified. At this time when using date values it is necessary to use a date string formatted in \"YYYY-MM-dd HH:mm:ss.mmm\" format or integer unix epoch millisecond value and explicitly specify the type in the schema due to how dates are translated to the Python client.
prez_birthdays <- data.frame(\n Name = c(\"George Washington\", \"Thomas Jefferson\", \"Abraham Lincoln\"),\n Time = c(\"1732-02-22 11:23:11.024\", \"1743-04-13 00:00:00.000\", \"1809-02-12 01:02:03.456\"))\n\ncols <- list(\n synapseclient$Column(name = \"Name\", columnType = \"STRING\", maximumSize = 20),\n synapseclient$Column(name = \"Time\", columnType = \"DATE\"))\n\nschema <- synapseclient$Schema(name = \"President Birthdays\", columns = cols, parent = project)\ntable <- synapseclient$Table(schema, prez_birthdays)\n\n# store the table in Synapse\ntable <- syn$store(table)\n
We can query a table as in the following:
tableId <- table$tableId\n\nresults <- syn$tableQuery(sprintf(\"select * from %s where Name='George Washington'\", tableId))\nresults$asDataFrame()\n
"},{"location":"tutorials/reticulate/#wikis","title":"Wikis","text":"This example illustrates creating a wiki.
See here for more information on wiki APIs.
content <- \"\n# My Wiki Page\nHere is a description of my **fantastic** project!\n\"\n\n# attachment\nfilePath <- tempfile()\nconnection <- file(filePath)\nwriteChar(\"this is the content of the file\", connection, eos = NULL)\nclose(connection)\nwiki <- synapseclient$Wiki(\n owner = project,\n title = \"My Wiki Page\",\n markdown = content,\n attachments = list(filePath)\n)\nwiki <- syn$store(wiki)\n
An existing wiki can be updated as follows.
wiki <- syn$getWiki(project)\nwiki$markdown <- \"\n# My Wiki Page\nHere is a description of my **fantastic** project! Let's\n*emphasize* the important stuff.\n\"\nwiki <- syn$store(wiki)\n
"},{"location":"tutorials/reticulate/#evaluations","title":"Evaluations","text":"An Evaluation is a Synapse construct useful for building processing pipelines and for scoring predictive modeling and data analysis challenges.
See here for more information on Evaluations.
Creating an Evaluation:
eval <- synapseclient$Evaluation(\n name = sprintf(\"My unique evaluation created on %s\", format(Sys.time(), \"%a %b %d %H%M%OS4 %Y\")),\n description = \"testing\",\n contentSource = project,\n submissionReceiptMessage = \"Thank you for your submission!\",\n submissionInstructionsMessage = \"This evaluation only accepts files.\")\n\neval <- syn$store(eval)\n\neval <- syn$getEvaluation(eval$id)\n
Submitting a file to an existing Evaluation:
# first create a file to submit\nfilePath <- tempfile()\nconnection <- file(filePath)\nwriteChar(\"this is my first submission\", connection, eos = NULL)\nclose(connection)\nfile <- synapseclient$File(path = filePath, parent = project)\nfile <- syn$store(file)\n# submit the created file\nsubmission <- syn$submit(eval, file)\n
List submissions:
submissions <- syn$getSubmissionBundles(eval)\n\n# submissions are returned as a generator\nlist(iterate(submissions))\n
Retrieving submission by id:
submission <- syn$getSubmission(submission$id)\n
Retrieving the submission status:
submissionStatus <- syn$getSubmissionStatus(submission)\nsubmissionStatus\n
Query an evaluation:
queryString <- sprintf(\"query=select * from evaluation_%s LIMIT %s OFFSET %s'\", eval$id, 10, 0)\nsyn$restGET(paste(\"/evaluation/submission/query?\", URLencode(queryString), sep = \"\"))\n
"},{"location":"tutorials/reticulate/#sharing-access-to-content","title":"Sharing Access to Content","text":"The following illustrates sharing access to a Synapse Entity.
See here for more information on Access Control including all available permissions.
# get permissions on an entity\n# to get permissions for a user/group pass a principalId identifier,\n# otherwise the assumed permission will apply to the public\n\n# make the project publicly accessible\nacl <- syn$setPermissions(project, accessType = list(\"READ\"))\n\nperms = syn$getPermissions(project)\n
"},{"location":"tutorials/reticulate/#views","title":"Views","text":"A view is a view of all entities (File, Folder, Project, Table, Docker Repository, View) within one or more Projects or Folders. Views can: The following examples illustrate some view operations.
See here for more information on Views. A view is implemented as a Table, see here for more information on Tables.
First create some files we can use in a view:
filePath1 <- tempfile()\nconnection <- file(filePath1)\nwriteChar(\"this is the content of the first file\", connection, eos = NULL)\nclose(connection)\nfile1 <- synapseclient$File(path = filePath1, parent = project)\nfile1 <- syn$store(file1)\nfilePath2 <- tempfile()\nconnection2 <- file(filePath2)\nwriteChar(\"this is the content of the second file\", connection, eos = NULL)\nclose(connection2)\nfile2 <- synapseclient$File(path = filePath2, parent = project)\nfile2 <- syn$store(file2)\n\n# add some annotations\nfileAnnotations1 <- syn$get_annotations(file1)\nfileAnnotations2 <- syn$get_annotations(file2)\n\nfileAnnotations1$contributor <- \"Sage\"\nfileAnnotations1$class <- \"V\"\nsyn$set_annotations(fileAnnotations1)\n\nfileAnnotations2$contributor = \"UW\"\nfileAnnotations2$rank = \"X\"\nsyn$set_annotations(fileAnnotations2)\n
Now create a view:
columns = c(\n synapseclient$Column(name = \"contributor\", columnType = \"STRING\"),\n synapseclient$Column(name = \"class\", columnType = \"STRING\"),\n synapseclient$Column(name = \"rank\", columnType = \"STRING\")\n)\n\nview <- synapseclient$EntityViewSchema(\n name = \"my first file view\",\n columns = columns,\n parent = project,\n scopes = project,\n includeEntityTypes = c(synapseclient$EntityViewType$FILE, synapseclient$EntityViewType$FOLDER),\n addDefaultViewColumns = TRUE\n)\n\nview <- syn$store(view)\n
We can now see content of our view (note that views are not created synchronously it may take a few seconds for the view table to be queryable).
queryResults <- syn$tableQuery(sprintf(\"select * from %s\", view$properties$id))\ndata <- queryResults$asDataFrame()\ndata\n
We can update annotations using a view as follows:
data[\"class\"] <- c(\"V\", \"VI\")\nsyn$store(synapseclient$Table(view$properties$id, data))\n\n# the change in annotations is reflected in get_annotations():\nsyn$get_annotations(file2$properties$id)\n
"},{"location":"tutorials/reticulate/#update-views-content","title":"Update View's Content","text":"# A view can contain different types of entity. To change the types of entity that will show up in a view:\nview <- syn$get(view$properties$id)\nview$set_entity_types(list(synapseclient$EntityViewType$FILE))\n
"},{"location":"tutorials/reticulate/#using-with-a-shiny-app","title":"Using with a Shiny App","text":"Reticulate and the Python synapseclient can be used to workaround an issue that exists when using synapser with a Shiny App. Since synapser shares a Synapse client instance within the R process, multiple users of a synapser integrated Shiny App may end up sharing a login if precautions aren't taken. When using reticulate with synapseclient, session scoped Synapse client objects can be created that avoid this issue.
See SynapseShinyApp for a sample application and a discussion of the issue, and the reticulate branch for an alternative implementation using reticulate with synapseclient.
"},{"location":"tutorials/tables/","title":"Tables","text":"Tables can be built up by adding sets of rows that follow a user-defined schema and queried using a SQL-like syntax.
"},{"location":"tutorials/tables/#creating-a-table-and-loading-it-with-data","title":"Creating a table and loading it with data","text":""},{"location":"tutorials/tables/#initial-setup","title":"Initial setup:","text":"import synapseclient\nfrom synapseclient import Project, File, Folder\nfrom synapseclient import Schema, Column, Table, Row, RowSet, as_table_columns, build_table\n\nsyn = synapseclient.Synapse()\nsyn.login()\n\nproject = syn.get('syn123')\n
"},{"location":"tutorials/tables/#example-data","title":"Example data","text":"First, let's load some data. Let's say we had a file, genes.csv:
Name,Chromosome,Start,End,Strand,TranscriptionFactor\nfoo,1,12345,12600,+,False\narg,2,20001,20200,+,False\nzap,2,30033,30999,-,False\nbah,1,40444,41444,-,False\nbnk,1,51234,54567,+,True\nxyz,1,61234,68686,+,False\n
"},{"location":"tutorials/tables/#creating-a-table-with-columns","title":"Creating a table with columns","text":"table = build_table('My Favorite Genes', project, \"/path/to/genes.csv\")\nsyn.store(table)\n
build_table will set the Table Schema which defines the columns of the table. To create a table with a custom Schema, first create the Schema:
cols = [\n Column(name='Name', columnType='STRING', maximumSize=20),\n Column(name='Chromosome', columnType='STRING', maximumSize=20),\n Column(name='Start', columnType='INTEGER'),\n Column(name='End', columnType='INTEGER'),\n Column(name='Strand', columnType='STRING', enumValues=['+', '-'], maximumSize=1),\n Column(name='TranscriptionFactor', columnType='BOOLEAN')]\n\nschema = Schema(name='My Favorite Genes', columns=cols, parent=project)\n
"},{"location":"tutorials/tables/#storing-the-table-in-synapse","title":"Storing the table in Synapse","text":"table = Table(schema, \"/path/to/genes.csv\")\ntable = syn.store(table)\n
The Table
function takes two arguments, a schema object and data in some form, which can be:
RowSet
objectWith a bit of luck, we now have a table populated with data. Let's try to query:
results = syn.tableQuery(\"select * from %s where Chromosome='1' and Start < 41000 and End > 20000\"\n % table.schema.id)\nfor row in results:\n print(row)\n
"},{"location":"tutorials/tables/#using-pandas-to-accomplish-setup-and-querying","title":"Using Pandas to accomplish setup and querying","text":"Pandas is a popular library for working with tabular data. If you have Pandas installed, the goal is that Synapse Tables will play nice with it.
Create a Synapse Table from a DataFrame:
import pandas as pd\n\ndf = pd.read_csv(\"/path/to/genes.csv\", index_col=False)\ntable = build_table('My Favorite Genes', project, df)\ntable = syn.store(table)\n
build_table
uses pandas DataFrame dtype to set the Table Schema
. To create a table with a custom Schema
, first create the Schema
:
schema = Schema(name='My Favorite Genes', columns=as_table_columns(df), parent=project)\ntable = syn.store(Table(schema, df))\n
Get query results as a DataFrame:
results = syn.tableQuery(\"select * from %s where Chromosome='2'\" % table.schema.id)\ndf = results.asDataFrame()\n
"},{"location":"tutorials/tables/#changing-data","title":"Changing Data","text":"Once the schema is settled, changes come in two flavors: appending new rows and updating existing ones.
Appending new rows is fairly straightforward. To continue the previous example, we might add some new genes from another file:
table = syn.store(Table(table.schema.id, \"/path/to/more_genes.csv\"))\n
To quickly add a few rows, use a list of row data:
new_rows = [[\"Qux1\", \"4\", 201001, 202001, \"+\", False],\n [\"Qux2\", \"4\", 203001, 204001, \"+\", False]]\ntable = syn.store(Table(schema, new_rows))\n
Updating rows requires an etag, which identifies the most recent change set plus row IDs and version numbers for each row to be modified. We get those by querying before updating. Minimizing changesets to contain only rows that actually change will make processing faster.
For example, let's update the names of some of our favorite genes:
results = syn.tableQuery(\"select * from %s where Chromosome='1'\" % table.schema.id)\ndf = results.asDataFrame()\ndf['Name'] = ['rzing', 'zing1', 'zing2', 'zing3']\n
Note that we're propagating the etag from the query results. Without it, we'd get an error saying something about an \"Invalid etag\":
table = syn.store(Table(schema, df, etag=results.etag))\n
The etag is used by the server to prevent concurrent users from making conflicting changes, a technique called optimistic concurrency. In case of a conflict, your update may be rejected. You then have to do another query and try your update again.
"},{"location":"tutorials/tables/#changing-table-structure","title":"Changing Table Structure","text":"Adding columns can be done using the methods Schema.addColumn
or addColumns
on the Schema
object:
schema = syn.get(\"syn000000\")\nbday_column = syn.store(Column(name='birthday', columnType='DATE'))\nschema.addColumn(bday_column)\nschema = syn.store(schema)\n
Renaming or otherwise modifying a column involves removing the column and adding a new column:
cols = syn.getTableColumns(schema)\nfor col in cols:\n if col.name == \"birthday\":\n schema.removeColumn(col)\nbday_column2 = syn.store(Column(name='birthday2', columnType='DATE'))\nschema.addColumn(bday_column2)\nschema = syn.store(schema)\n
"},{"location":"tutorials/tables/#table-attached-files","title":"Table attached files","text":"Synapse tables support a special column type called 'File' which contain a file handle, an identifier of a file stored in Synapse. Here's an example of how to upload files into Synapse, associate them with a table and read them back later:
# your synapse project\nimport tempfile\nproject = syn.get(...)\n\n# Create temporary files to store\ntemp = tempfile.NamedTemporaryFile()\nwith open(temp.name, \"w+\") as temp_d:\n temp_d.write(\"this is a test\")\n\ntemp2 = tempfile.NamedTemporaryFile()\nwith open(temp2.name, \"w+\") as temp_d:\n temp_d.write(\"this is a test 2\")\n\n# store the table's schema\ncols = [\n Column(name='artist', columnType='STRING', maximumSize=50),\n Column(name='album', columnType='STRING', maximumSize=50),\n Column(name='year', columnType='INTEGER'),\n Column(name='catalog', columnType='STRING', maximumSize=50),\n Column(name='cover', columnType='FILEHANDLEID')]\nschema = syn.store(Schema(name='Jazz Albums', columns=cols, parent=project))\n\n# the actual data\ndata = [[\"John Coltrane\", \"Blue Train\", 1957, \"BLP 1577\", temp.name],\n [\"Sonny Rollins\", \"Vol. 2\", 1957, \"BLP 1558\", temp.name],\n [\"Sonny Rollins\", \"Newk's Time\", 1958, \"BLP 4001\", temp2.name],\n [\"Kenny Burrel\", \"Kenny Burrel\", 1956, \"BLP 1543\", temp2.name]]\n\n# upload album covers\nfor row in data:\n file_handle = syn.uploadFileHandle(row[4], parent=project)\n row[4] = file_handle['id']\n\n# store the table data\nrow_reference_set = syn.store(RowSet(schema=schema, rows=[Row(r) for r in data]))\n\n# Later, we'll want to query the table and download our album covers\nresults = syn.tableQuery(f\"select artist, album, cover from {schema.id} where artist = 'Sonny Rollins'\")\ntest_files = syn.downloadTableColumns(results, ['cover'])\n
"},{"location":"tutorials/tables/#deleting-rows","title":"Deleting rows","text":"Query for the rows you want to delete and call syn.delete on the results:
results = syn.tableQuery(\"select * from %s where Chromosome='2'\" % table.schema.id)\na = syn.delete(results)\n
"},{"location":"tutorials/tables/#deleting-the-whole-table","title":"Deleting the whole table","text":"Deleting the schema deletes the whole table and all rows:
syn.delete(schema)\n
"},{"location":"tutorials/tables/#queries","title":"Queries","text":"The query language is quite similar to SQL select statements, except that joins are not supported. The documentation for the Synapse API has lots of query examples.
See:
The synapseclient
package provides an interface to Synapse, a collaborative, open-source research platform that allows teams to share data, track analyses, and collaborate, providing support for:
The synapseclient
package lets you communicate with the cloud-hosted Synapse service to access data and create shared data analysis projects from within Python scripts or at the interactive Python console. Other Synapse clients exist for R, Java, and the web. The Python client can also be used from the command line.
Installing this package will install synapseclient
, synapseutils
and the command line client. synapseutils
contains beta features and the behavior of these features are subject to change.
If you're just getting started with Synapse, have a look at the Getting Started guides for Synapse.
"},{"location":"news/","title":"Release Notes","text":""},{"location":"news/#320-2023-11-27","title":"3.2.0 (2023-11-27)","text":""},{"location":"news/#highlights","title":"Highlights","text":"get_user_profile_by_username
and get_user_profile_by_id
to handle for use cases when a username is a number.syn.get
with ifcollision='overwrite.local
does not always overwrite previous fileifcollision=overwrite.local
syn.get
with ifcollision='overwrite.local
does not always overwrite previous filesynapse login
and synapse config
correctly work as a result.@memoize
decorator with @functools.lru_cache
decorator.-parent
will become --parent
. Commands that support camel case like --parentId
will be changed to --parent-id
.date_parser
and parse_date
in pd.read_csv in the table module. Ran black,
the Python auto formatter on the files\\>=
1.5-parent
will become --parent. Commands that support camel case like --parentId will be changed to --parent-id. Locked down pandas version to only support pandas <
1.5
Next major release (3.0.0)...
\\>=
1.5-parent
will become \\--parent
. Commands that support camel case like \\--parentId
will be changed to \\--parent-id
.Added support for Datasets
# from python\nimport synapseclient\nimport synapseutils\nsyn = synapseclient.login()\ndataset_items = [\n {'entityId': \"syn000\", 'versionNumber': 1},\n {...},\n]\ndataset = synapseclient.Dataset(\n name=\"My Dataset\",\n parent=project,\n dataset_items=dataset_items\n)\ndataset = syn.store(dataset)\n# Add/remove specific Synapse IDs to/from the Dataset\ndataset.add_item({'entityId': \"syn111\", 'versionNumber': 1})\ndataset.remove_item(\"syn000\")\ndataset = syn.store(dataset)\n# Add a single Folder to the Dataset\n# this will recursively add all the files in the folder\ndataset.add_folder(\"syn123\")\n# Add a list of Folders, overwriting any existing files in the dataset\ndataset.add_folders([\"syn456\", \"syn789\"], force=True)\ndataset = syn.store(dataset)\n# Create snapshot version of dataset\nsyn.create_snapshot_version(\n dataset.id,\n label=\"v1.0\",\n comment=\"This is version 1\"\n)\n
Added support for downloading from the download cart. You can use this feature by first adding items to your download cart on Synapse.
# from python\nimport synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nmanifest_path = syn.get_download_list()\n
# from command line\nsynapse get-download-list\n
In the next major release (3.0.0) there will be major cosmetic changes to the CLI, such as removing all camel case and non-standard single-dash long command line interface (CLI) parameters. For example, command line arguments like -parent will become --parent, and commands that support camel case like --parentId will be changed to --parent-id.
ViewBase
for Datasets instead of SchemaBase
In the next major release (3.0.0) there will be major cosmetic changes to the CLI, such as removing all camel case and non-standard single-dash long command line interface (CLI) parameters. For example, command line arguments like -parent will become --parent, and commands that support camel case like --parentId will be changed to --parent-id.
Added support for materialized views
# from python\nimport synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nview = synapseclient.MaterializedViewSchema(\n name=\"test-material-view\",\n parent=\"syn34234\",\n definingSQL=\"SELECT * FROM syn111 F JOIN syn2222 P on (F.PATIENT_ID = P.PATIENT_ID)\"\n)\nview_ent = syn.store(view)\n
Removed support for Python 3.6 and added support for Python 3.10
Add function to create Synapse config file
# from the command line\nsynapse config\n
includeTypes
to synapseutils.walk()
forceVersion
on changeFileMetadata
dataset
as an entity type to return in getChildren(). In the next major release (3.0.0), command line arguments like -parent will become --parent, and commands that support camel case like --parentId will be changed to --parent-id. Added ability to generate a manifest file from your local directory structure.
# from the command line\n# write the manifest to manifest.tsv\nsynapse manifest --parent-id syn123 --manifest-file ./manifest.tsv /path/to/local/directory\n# stdout\nsynapse manifest --parent-id syn123 /path/to/local/directory\n
Added ability to pipe manifest stdout into sync function.
# from the command line\nsynapse manifest --parent-id syn123 ./docs/ | synapse sync -\n
Added ability to return summary statistics of csv and tsv files stored in Synapse.
# from python\nimport synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nstatistics = synapseutils.describe(syn=syn, entity=\"syn12345\")\nprint(statistics)\n{\n \"column1\": {\n \"dtype\": \"object\",\n \"mode\": \"FOOBAR\"\n },\n \"column2\": {\n \"dtype\": \"int64\",\n \"mode\": 1,\n \"min\": 1,\n \"max\": 2,\n \"mean\": 1.4\n },\n \"column3\": {\n \"dtype\": \"bool\",\n \"mode\": false,\n \"min\": false,\n \"max\": true,\n \"mean\": 0.5\n }\n}\n
In the next major release (3.0.0) there will be major cosmetic changes to the CLI, such as removing all camel case and non-standard single-dash long command line interface (CLI) parameters. For example, command line arguments like -parent will become --parent, and commands that support camel case like --parentId will be changed to --parent-id.
synapse manifest
stdout in synapse sync
functionAdded ability to authenticate from a SYNAPSE_AUTH_TOKEN
environment variable set with a valid personal access token.
# e.g. set environment variable prior to invoking a Synapse command or running a program that uses synapseclient\nSYNAPSE_AUTH_TOKEN='<my_personal_access_token>' synapse <subcommand options>\n
The environment variable will take priority over credentials in the user's .synapseConfig
file or any credentials saved in a prior login using the remember me option.
See here for more details on usage.
Added ability to silence all console output.
# from the command line, use the --silent option with any synapse subcommand, here it will suppress the download progress indicator\nsynapse --silent get <synid>\n
# from code using synapseclient, pass the silent option to the Synapse constructor\nimport synapseclient\n\nsyn = synapseclient.Synapse(silent=True)\nsyn.login()\nsyn.get(<synid>)\n
Improved robustness during downloads with unstable connections. Specifically, the client will automatically recover when encountering some types of network errors that previously would have caused a download to start over, as indicated by a reset progress bar.
Entities can be annotated with boolean datatypes, for example:
file = synapseclient.File('/path/to/file', parentId='syn123', synapse_is_great=True)\nsyn.store(file)\n
synapseclient is additionally packaged as a Python wheel.
The index_files_for_migration and migrate_indexed_files functions are added to synapseutils to help migrate files in Synapse projects and folders between AWS S3 buckets in the same region. More details on using these utilities can be found here.
This version supports logging in programmatically and from the command line using personal access tokens that can be obtained from your synapse.org Settings. Additional documentation on login can be found here.
# programmatic\nsyn = synapseclient.login(authToken=<token>)\n
# command line\nsynapse login -p <token>\n
The location where downloaded entities are cached can be customized to a location other than the user's home directory. This is useful in environments where writing to a home directory is not appropriate (e.g. an AWS Lambda).
syn = synapseclient.Synapse(cache_root_dir=<directory path>)\n
A helper method on the Synapse object has been added to enable obtaining the Synapse certification quiz status of a user.
passed = syn.is_certified(<username or user_id>)\n
This version has been tested with Python 3.9.
synapse get -r
and synapse sync
in the command line client, respectively) are transferred in parallel threads rather than serially, substantially improving the performance of these operations.This version includes a performance improvement for syncFromSynapse downloads of deep folder hierarchies to local filesystem locations outside of the Synapse cache.
Support is added for SubmissionViews that can be used to query and edit a set of submissions through table services.
from synapseclient import SubmissionViewSchema\n\nproject = syn.get(\"syn123\")\nevaluation_id = '9876543'\nview = syn.store(SubmissionViewSchema(name='My Submission View', parent=project, scopes=[evaluation_id]))\nview_table = syn.tableQuery(f\"select * from {view.id}\")\n
A max_threads
property of the Synapse object has been added to customize the number of concurrent threads that will be used during file transfers.
import synapseclient\nsyn = synapseclient.login()\nsyn.max_threads = 20\n
If not customized the default value is (CPU count + 4). Adjusting this value higher may speed up file transfers if the local system resources can take advantage of the higher setting. Currently this value applies only to files whose underlying storage is AWS S3.
Alternatively, a value can be stored in the synapseConfig configuration file that will automatically apply as the default if a value is not explicitly set.
[transfer]\nmax_threads=16\n
This release includes support for directly accessing S3 storage locations using AWS Security Token Service credentials. This allows use of external AWS clients and libraries with Synapse storage, and can be used to accelerate file transfers under certain conditions. To create an STS enabled folder and set up direct access to S3 storage, see the STS Storage Locations section of the data storage guide.
The getAnnotations
and setAnnotations
methods of the Synapse object have been deprecated in favor of newer get_annotations
and set_annotations
methods, respectively. The newer versions are parameterized with a typed Annotations
dictionary rather than a plain Python dictionary to prevent existing annotations from being accidentally overwritten. The expected usage for setting annotations is to first retrieve the existing Annotations
for an entity before saving changes by passing back a modified value.
annos = syn.get_annotations('syn123')\n\n# set key 'foo' to have value of 'bar' and 'baz'\nannos['foo'] = ['bar', 'baz']\n# single values will automatically be wrapped in a list once stored\nannos['qwerty'] = 'asdf'\n\nannos = syn.set_annotations(annos)\n
The deprecated annotations methods may be removed in a future release.
A full list of issues addressed in this release is below.
"},{"location":"news/#bug-fixes_14","title":"Bug Fixes","text":"Python 2 is no longer supported as of this release. This release requires Python 3.6+.
"},{"location":"news/#highlights_17","title":"Highlights:","text":"Multi-threaded download of files from Synapse can be enabled by setting syn.multi_threaded
to True
on a synapseclient.Synapse
object. This will become the default implementation in the future, but to ensure stability for the first release of this feature, it must be intentionally enabled.
import synapseclient\nsyn = synapseclient.login()\nsyn.multi_threaded = True\n# syn123 now will be downloaded via the multi-threaded implementation\nsyn.get(\"syn123\")\n
Currently, multi-threaded download only works with files stored in AWS S3, where most files on Synapse reside. This also includes custom storage locations that point to an AWS S3 bucket. Files not stored in S3 will fall back to single-threaded download even if syn.multi_threaded==True
.
synapseutils.copy() now has limitations on what can be copied:\n- A user must have download permissions on the entity they want to copy.\n- Users cannot copy any entities that have access requirements.\n
contentTypes
and fileNames
are optional parameters in synapseutils.copyFileHandles()
Synapse Docker Repository (synapseclient.DockerRepository
) objects can now be submitted to Synapse evaluation queues using the entity
argument in synapseclient.Synapse.submit()
. An optional argument docker_tag=\"latest\"
has also been added to synapseclient.Synapse.submit()
\\\" to designate which tagged Docker image to submit.
A full list of issues addressed in this release is below.
"},{"location":"news/#bugs-fixes","title":"Bugs Fixes","text":"In version 1.9.2, we improved Views' usability by exposing [set_entity_types()` function to change the entity types that will show up in a View:
import synapseclient\nfrom synapseclient.table import EntityViewType\n\nsyn = synapseclient.login()\nview = syn.get(\"syn12345\")\nview.set_entity_types([EntityViewType.FILE, EntityViewType.FOLDER])\nview = syn.store(view)\n
"},{"location":"news/#features","title":"Features","text":"In version 1.9.1, we fix various bugs and added two new features:
In version 1.9.0, we deprecated and removed query()
and chunkedQuery()
. These functions used the old query services, which do not perform well. To query for entities filtered by annotations, please use EntityViewSchema
.
We also deprecated the following functions and will remove them in Synapse Python client version 2.0. In the Activity
object:
usedEntity()
usedURL()
In the Synapse
object:
getEntity()
loadEntity()
createEntity()
updateEntity()
deleteEntity()
downloadEntity()
uploadFile()
uploadFileHandle()
uploadSynapseManagedFileHandle()
downloadTableFile()
Please see our documentation for more details on how to migrate your code away from these functions.
"},{"location":"news/#features_2","title":"Features","text":"copyWiki
functionIn this release, we have been performed some house-keeping on the code base. The two major changes are:
making syn.move()
available to move an entity to a new parent in Synapse. For example:
import synapseclient\nfrom synapseclient import Folder\n\nsyn = synapseclient.login()\n\nfile = syn.get(\"syn123\")\nfolder = Folder(\"new folder\", parent=\"syn456\")\nfolder = syn.store(folder)\n\n# moving file to the newly created folder\nsyn.move(file, folder)\n
exposing the ability to use the Synapse Python client with single threaded. This feature is useful when running Python script in an environment that does not support multi-threading. However, this will negatively impact upload speed. To use single threaded:
import synapseclient\nsynapseclient.config.single_threaded = True\n
This release is a hotfix for a bug. Please refer to 1.8.0 release notes for information about additional changes.
"},{"location":"news/#bug-fixes_21","title":"Bug Fixes","text":"This release has 2 major changes:
\\~/synapseCache/.session
). The python client now relies on keyring to handle credential storage of your Synapse credentials.The remaining changes are bug fixes and cleanup of test code.
Below are the full list of issues addressed by this release:
"},{"location":"news/#bug-fixes_22","title":"Bug Fixes","text":"v1.7.4 release was broken for new users that installed from pip. v1.7.5 has the same changes as v1.7.4 but fixes the pip installation.
"},{"location":"news/#174-2018-01-29","title":"1.7.4 (2018-01-29)","text":"This release mostly includes bugfixes and improvements for various Table classes:
: - Fixed bug where you couldn't store a table converted to a pandas.Dataframe
if it had a INTEGER column with some missing values. - EntityViewSchema
can now automatically add all annotations within your defined scopes
as columns. Just set the view's addAnnotationColumns=True
before calling syn.store()
. This attribute defaults to True
for all newly created EntityViewSchemas
. Setting addAnnotationColumns=True
on existing tables will only add annotation columns that are not already a part of your schema. - You can now use synapseutils.notifyMe
as a decorator to notify you by email when your function has completed. You will also be notified of any Errors if they are thrown while your function runs.
We also added some new features:
: - syn.findEntityId()
function that allows you to find an Entity by its name and parentId, set parentId to None
to search for Projects by name. - The bulk upload functionality of synapseutils.syncToSynapse
is available from the command line using: synapse sync
.
Below are the full list of issues addressed by this release:
"},{"location":"news/#features_4","title":"Features","text":"Release 1.7.3 introduces fixes and quality of life changes to Tables and synapseutils:
Changes to Tables:
etag
column in your SQL query when using a tableQuery()
to update File/Project Views. just SELECT
the relevant columns and etags will be resolved automatically.PartialRowSet
class allows you to only have to upload changes to individual cells of a table instead of every row that had a value changed. It is recommended to use the PartialRowSet.from_mapping()
classmethod instead of the PartialRowSet
constructor.Changes to synapseutils:
\\~
to refer to your home directory in your manifest.tsvWe also added improved debug logging and use Python's builtin logging
module instead of printing directly to sys.stderr
Below are the full list of issues addressed by this release:
"},{"location":"news/#bug-fixes_24","title":"Bug Fixes","text":"Release 1.7 is a large bugfix release with several new features. The main ones include:
We have expanded the synapseutils packages to add the ability to:
File View tables can now be created from the python client using EntityViewSchema. See fileviews documentation.
The python client is now able to upload to user owned S3 Buckets. Click here for instructions on linking your S3 bucket to synapse.
We've also made various improvements to existing features:
\\--description
argument when creating/updating entities from the command line client will now create a Wiki
for that entity. You can also use \\--descriptionFile
to write the contents of a markdown file as the entity's Wiki
file_entity.cacheDir
and file_entity.files
is being DEPRECATED in favor of file_entity.path
for finding the location of a downloaded File
pandas
dataframe\\
s containing `datetime` values can now be properly converted into csv and uploaded to Synapse.We also added a optional convert_to_datetime
parameter to CsvFileTable.asDataFrame()
that will automatically convert Synapse DATE columns into datetime
objects instead of leaving them as long
unix timestamps
Below are the full list of bugs and issues addressed by this release:
"},{"location":"news/#features_6","title":"Features","text":"In version 1.6 we introduce a new sub-module _synapseutils that provide convenience functions for more complicated operations in Synapse such as copying of files wikis and folders. In addition we have introduced several improvements in downloading content from Synapse. As with uploads we are now able to recover from an interrupted download and will retry on network failures.
We have improved download robustness and error checking, along with extensive recovery on failed operations. This includes the ability for the client to pause operation when Synapse is updated.
By default, data sets in Synapse are private to your user account, but they can easily be shared with specific users, groups, or the public.
See:
Periodically we will be publishing results of benchmarking the Synapse Python Client compared to directly working with AWS S3. The purpose of these benchmarks is to make data driven decisions on where to spend time optimizing the client. Additionally, it will give us a way to measure the impact of changes to the client.
"},{"location":"explanations/benchmarking/#results","title":"Results","text":""},{"location":"explanations/benchmarking/#12122023-downloading-files-from-synapse","title":"12/12/2023: Downloading files from Synapse","text":"The results were created on a t3a.micro
EC2 instance with a 200GB disk size running in us-east-1. The script that was run can be found in docs/scripts/downloadBenchmark.py
and docs/scripts/uploadTestFiles.py
.
During this download test I tried various thread counts to see what performance looked like at different levels. What I found was that going over the default count of threads during download of large files (10GB and over) led to signficantly unstable performance. The client would often crash or hang during execution. As a result the general reccomendation is as follows:
multiprocessing.cpu_count() + 4
The results were created on a t3a.micro
EC2 instance with a 200GB disk size running in us-east-1. The script that was run can be found in docs/scripts
. The time to create the files on disk is not included.
This test includes adding 5 annotations to each file, a Text, Integer, Floating Point, Boolean, and Date.
S3 was not benchmarked again.
As a result of these tests, the sweet spot for thread count is around 50 threads. It is not recommended to go over 50 threads, as doing so resulted in significant instability in the client.
Test | Thread Count | Synapseutils Sync | os.walk + syn.store | Per file size\n25 Files 1MB total size | 6 | 10.75s | 10.96s | 40KB\n25 Files 1MB total size | 25 | 6.79s | 11.31s | 40KB\n25 Files 1MB total size | 50 | 6.05s | 10.90s | 40KB\n25 Files 1MB total size | 100 | 6.14s | 10.89s | 40KB\n775 Files 10MB total size | 6 | 268.33s | 298.12s | 12.9KB\n775 Files 10MB total size | 25 | 162.63s | 305.93s | 12.9KB\n775 Files 10MB total size | 50 | 86.46s | 304.40s | 12.9KB\n775 Files 10MB total size | 100 | 85.55s | 304.71s | 12.9KB\n10 Files 1GB total size | 6 | 27.17s | 36.25s | 100MB\n10 Files 1GB total size | 25 | 22.26s | 12.77s | 100MB\n10 Files 1GB total size | 50 | 22.24s | 12.26s | 100MB\n10 Files 1GB total size | 100 | Wouldn't complete | Wouldn't complete | 100MB"},{"location":"explanations/benchmarking/#11142023-uploading-files-to-synapse-default-thread-count","title":"11/14/2023: Uploading files to Synapse, Default thread count","text":"The results were created on a t3a.micro
EC2 instance with a 200GB disk size running in us-east-1. The script that was run can be found in docs/scripts
. The time to create the files on disk is not included.
This test uses the default number of threads in the client: multiprocessing.cpu_count() + 4
The manifest is a tsv file with file locations and metadata to be pushed to Synapse. The purpose is to allow bulk actions through a TSV without the need to manually execute commands for every requested action.
"},{"location":"explanations/manifest_tsv/#manifest-file-format","title":"Manifest file format","text":"The format of the manifest file is a tab delimited file with one row per file to upload and columns describing the file. The minimum required columns are path and parent where path is the local file path and parent is the Synapse Id of the project or folder where the file is uploaded to.
In addition to these columns you can specify any of the parameters to the File constructor (name, synapseStore, contentType) as well as parameters to the syn.store command (used, executed, activityName, activityDescription, forceVersion).
For only updating annotations without uploading new versions of unchanged files, the syn.store parameter forceVersion should be included in the manifest with the value set to False.
Used and executed can be semi-colon (\";\") separated lists of Synapse ids, urls and/or local filepaths of files already stored in Synapse (or being stored in Synapse by the manifest). If you leave a space, like \"syn1234; syn2345\" the white space from \" syn2345\" will be stripped.
Any additional columns will be added as annotations.
"},{"location":"explanations/manifest_tsv/#required-fields","title":"Required fields:","text":"Field Meaning Example path local file path or URL /path/to/local/file.txt parent synapse id syn1235"},{"location":"explanations/manifest_tsv/#common-fields","title":"Common fields:","text":"Field Meaning Example name name of file in Synapse Example_file forceVersion whether to update version False"},{"location":"explanations/manifest_tsv/#activityprovenance-fields","title":"Activity/Provenance fields:","text":"Each of these are individual examples and is what you would find in a row in each of these columns. To clarify, \"syn1235;/path/to_local/file.txt\" below states that you would like both \"syn1234\" and \"/path/to_local/file.txt\" added as items used to generate a file. You can also specify one item by specifying \"syn1234\"
Field | Meaning | Example\nused | List of items used to generate file | "syn1235;/path/to_local/file.txt"\nexecuted | List of items executed | "https://github.org/;/path/to_local/code.py"\nactivityName | Name of activity in provenance | "Ran normalization"\nactivityDescription | Text description on what was done | "Ran algorithm xyx with parameters..."
Any columns that are not in the reserved names described above will be interpreted as annotations of the file
For example this is adding 2 annotations to each row:
path parent annot1 annot2 /path/file1.txt syn1243 \"bar\" 3.1415 /path/file2.txt syn12433 \"baz\" 2.71 /path/file3.txt syn12455 \"zzz\" 3.52See:
In Synapse, entities have both properties and annotations. Properties are used by the system, whereas annotations are completely user defined. In the Python client, we try to present this situation as a normal object, with one set of properties.
Printing an entity will show the division between properties and annotations.
print(entity)\n
Under the covers, an Entity object has two dictionaries, one for properties and one for annotations. These two namespaces are distinct, so there is a possibility of collisions. It is recommended to avoid defining annotations with names that collide with properties, but this is not enforced.
## don't do this!\nentity.properties['description'] = 'One thing'\nentity.annotations['description'] = 'A different thing'\n
In case of conflict, properties will take precedence.
print(entity.description)\n#> One thing\n
Some additional ambiguity is entailed in the use of dot notation. Entity objects have their own internal properties which are not persisted to Synapse. As in all Python objects, these properties are held in object.dict. For example, this dictionary holds the keys 'properties' and 'annotations' whose values are both dictionaries themselves.
The rule, for either getting or setting is: first look in the object then look in properties, then look in annotations. If the key is not found in any of these three, a get results in a KeyError
and a set results in a new annotation being created. Thus, the following results in a new annotation that will be persisted in Synapse:
entity.foo = 'bar'\n
To create an object member variable, which will not be persisted in Synapse, this unfortunate notation is required:
entity.__dict__['foo'] = 'bar'\n
As mentioned previously, name collisions are entirely possible. Keys in the three namespaces can be referred to unambiguously like so:
entity.__dict__['key']\n\nentity.properties.key\nentity.properties['key']\n\nentity.annotations.key\nentity.annotations['key']\n
Most of the time, users should be able to ignore these distinctions and treat Entities like normal Python objects. End users should never need to manipulate items in dict.
See also:
These methods enable access to the Synapse REST(ish) API taking care of details like endpoints and authentication. See the REST API documentation.
See:
Synapse can use a variety of storage mechanisms to store content, however the most common storage solution is AWS S3. This article illustrates some special features that can be used with S3 storage and how they interact with the Python client. In particular it covers:
Synapse projects or folders can be configured to use custom implementations for their underlying data storage. More information on this feature can be found here. The most common implementation of this is to configure a folder to store data in a user controlled AWS S3 bucket rather than Synapse's default internal S3 storage.
"},{"location":"guides/data_storage/#creating-a-new-folder-backed-by-a-user-specified-s3-bucket","title":"Creating a new folder backed by a user specified S3 bucket","text":"The following illustrates creating a new folder backed by a user specified S3 bucket. Note: An existing folder also works.
If you are changing the storage location of an existing folder to a user specified S3 bucket none of the files will be migrated. In order to migrate the files to the new storage location see the section Migrating programmatically. When you change the storage location for a folder only NEW files uploaded to the folder are uploaded to the user specific S3 bucket.
Ensure that the bucket is properly configured.
Create a folder and configure it to use external S3 storage:
# create a new folder to use with external S3 storage\nfolder = syn.store(Folder(name=folder_name, parent=parent))\n# You may also use an existing folder like:\n# folder = syn.get(\"syn123\")\nfolder, storage_location, project_setting = syn.create_s3_storage_location(\n folder=folder,\n bucket_name='my-external-synapse-bucket',\n base_key='path/within/bucket',\n )\n\n# if needed the unique storage location identifier can be obtained e.g.\nstorage_location_id = storage_location['storageLocationId']\n
"},{"location":"guides/data_storage/#creating-a-new-project-backed-by-a-user-specified-s3-bucket","title":"Creating a new project backed by a user specified S3 bucket","text":"The following illustrates creating a new project backed by a user specified S3 bucket. Note: An existing project also works.
If you are changing the storage location of an existing project to a user specified S3 bucket none of the files will be migrated. In order to migrate the files to the new storage location see the documentation further down in this article labeled 'Migrating programmatically'. When you change the storage location for a project only NEW files uploaded to the project are uploaded to the user specific S3 bucket.
Ensure that the bucket is properly configured.
Create a project and configure it to use external S3 storage:
# create a new, or retrieve an existing project to use with external S3 storage\nproject = syn.store(Project(name=\"my_project_name\"))\nproject_storage, storage_location, project_setting = syn.create_s3_storage_location(\n # Despite the KW argument name, this can be a project or folder\n folder=project,\n bucket_name='my-external-synapse-bucket',\n base_key='path/within/bucket',\n)\n\n# if needed the unique storage location identifier can be obtained e.g.\nstorage_location_id = storage_location['storageLocationId']\n
Once an external S3 storage folder exists, you can interact with it as you would any other folder using Synapse tools. If you wish to add an object that is stored within the bucket to Synapse you can do that by adding a file handle for that object using the Python client and then storing the file to that handle.
parent_synapse_folder_id = 'syn123'\nlocal_file_path = '/path/to/local/file'\nbucket = 'my-external-synapse-bucket'\ns3_key = 'path/within/bucket/file'\n\n# in this example we use boto to create a file independently of Synapse\ns3_client = boto3.client('s3')\ns3_client.upload_file(\n Filename=local_file_path,\n Bucket=bucket,\n Key=s3_key\n)\n\n# now we add a file handle for that file and store the file to that handle\nfile_handle = syn.create_external_s3_file_handle(\n bucket,\n s3_key,\n local_file_path,\n parent=parent_synapse_folder_id,\n)\nfile = File(parentId=folder['id'], dataFileHandleId=file_handle['id'])\nfile_entity = syn.store(file)\n
"},{"location":"guides/data_storage/#storage-location-migration","title":"Storage location migration","text":"There are circumstances where it can be useful to move the files underlying Synapse entities from one storage location to another without impacting the structure or identifiers of the Synapse entities themselves. An example scenario is needing to use STS features with an existing Synapse Project that was not initially configured with an STS enabled custom storage location.
The Synapse client has utilities for migrating entities to a new storage location without having to download the content locally and re-uploading it which can be slow, and may alter the meta data associated with the entities in undesirable ways.
During the migration it is recommended that uploads and downloads are blocked to prevent possible conflicts or race conditions. This can be done by setting permissions to Can view
for the project or folder being migrated. After the migration is complete set the permissions back to their original values.
Expected time to migrate data is around 13 minutes per 100Gb as of 11/21/2023.
"},{"location":"guides/data_storage/#migrating-programmatically","title":"Migrating programmatically","text":"Migrating a Synapse project or folder programmatically is a two step process.
First ensure that you know the id of the storage location you want to migrate to. More info on storage locations can be found above and here.
Once the storage location is known, the first step to migrate the project or folder is to create a migratable index of its contents using the index_files_for_migration function, e.g.
When specifying the .db
file for the migratable indexes you need to specify a .db
file that does not already exist for another synapse project or folder on disk. It is the best practice to specify a unique name for the file by including the synapse id in the name of the file, or other unique identifier.
import synapseutils\n\nentity_id = 'syn123' # a Synapse entity whose contents need to be migrated, e.g. a Project or Folder\ndest_storage_location_id = '12345' # the id of the destination storage location being migrated to\n\n# a path on disk where this utility can create a sqlite database to store its index.\n# nothing needs to exist at this path, but it must be a valid path on a volume with sufficient\n# disk space to store a meta data listing of all the contents in the indexed entity.\n# a rough rule of thumb is 100kB per 1000 entities indexed.\ndb_path = '/tmp/foo/syn123_bar.db'\n\nresult = synapseutils.index_files_for_migration(\n syn,\n entity_id,\n dest_storage_location_id,\n db_path,\n\n # optional args, see function documentation linked above for a description of these parameters\n source_storage_location_ids=['54321', '98765'],\n file_version_strategy='new',\n include_table_files=False,\n continue_on_error=True\n)\n
If called on a container (e.g. a Project or Folder) the index_files_for_migration function will recursively index all of the children of that container (including its subfolders). Once the entity has been indexed you can optionally programmatically inspect the the contents of the index or output its contents to a csv file in order to manually inspect it using the available methods on the returned result object.
The next step to trigger the migration from the indexed files is using the migrate_indexed_files function, e.g.
result = synapseutils.migrate_indexed_files(\n syn,\n db_path,\n\n # optional args, see function documentation linked above for a description of these parameters\n create_table_snapshots=True,\n continue_on_error=False,\n force=True\n)\n
The result can be again be inspected as above to see the results of the migration.
Note that above the force parameter is necessary if running from a non-interactive shell. Proceeding with a migration requires confirmation in the form of user prompt. If running programmatically this parameter instead confirms your intention to proceed with the migration.
"},{"location":"guides/data_storage/#putting-all-the-migration-pieces-together","title":"Putting all the migration pieces together","text":"import os\nimport synapseutils\nimport synapseclient\n\nmy_synapse_project_or_folder_to_migrate = \"syn123\"\n\nexternal_bucket_name = \"my-external-synapse-bucket\"\nexternal_bucket_base_key = \"path/within/bucket/\"\n\nmy_user_id = \"1234\"\n\n# a path on disk where this utility can create a sqlite database to store its index.\n# nothing needs to exist at this path, but it must be a valid path on a volume with sufficient\n# disk space to store a meta data listing of all the contents in the indexed entity.\n# a rough rule of thumb is 100kB per 1000 entities indexed.\ndb_path = os.path.expanduser(\n f\"~/synapseMigration/{my_synapse_project_or_folder_to_migrate}_my.db\"\n)\n\nsyn = synapseclient.Synapse()\n\n# Log-in with ~.synapseConfig `authToken`\nsyn.login()\n\n# The project or folder I want to migrate everything to this S3 storage location\nproject_or_folder = syn.get(my_synapse_project_or_folder_to_migrate)\n\nproject_or_folder, storage_location, project_setting = syn.create_s3_storage_location(\n # Despite the KW argument name, this can be a project or folder\n folder=project_or_folder,\n bucket_name=external_bucket_name,\n base_key=external_bucket_base_key,\n)\n\n# The id of the destination storage location being migrated to\nstorage_location_id = storage_location[\"storageLocationId\"]\nprint(\n f\"Indexing: {project_or_folder.id} for migration to storage_id: {storage_location_id} at: {db_path}\"\n)\n\ntry:\n result = synapseutils.index_files_for_migration(\n syn,\n project_or_folder.id,\n storage_location_id,\n db_path,\n file_version_strategy=\"all\",\n )\n\n print(f\"Indexing result: {result.get_counts_by_status()}\")\n\n print(\"Migrating files...\")\n\n result = synapseutils.migrate_indexed_files(\n syn,\n db_path,\n force=True,\n )\n\n print(f\"Migration result: {result.get_counts_by_status()}\")\n syn.sendMessage(\n userIds=[my_user_id],\n messageSubject=f\"Migration success for {project_or_folder.id}\",\n messageBody=f\"Migration result: {result.get_counts_by_status()}\",\n )\nexcept Exception as e:\n syn.sendMessage(\n userIds=[my_user_id],\n messageSubject=f\"Migration failed for {project_or_folder.id}\",\n messageBody=f\"Migration failed with error: {e}\",\n )\n
The result of running this should look like
Indexing: syn123 for migration to storage_id: 11111 at: /home/user/synapseMigration/syn123_my.db\nIndexing result: {'INDEXED': 100, 'MIGRATED': 0, 'ALREADY_MIGRATED': 0, 'ERRORED': 0}\nMigrating files...\nMigration result: {'INDEXED': 0, 'MIGRATED': 100, 'ALREADY_MIGRATED': 0, 'ERRORED': 0}\n
"},{"location":"guides/data_storage/#migrating-from-the-command-line","title":"Migrating from the command line","text":"Synapse entities can also be migrated from the command line. The options are similar to above. Whereas migrating programatically involves two separate function calls, from the command line there is a single migrate
command with the dryRun argument providing the option to generate the index only without proceeding onto the migration.
Note that as above, confirmation is required before a migration starts. As above, this must either be in the form of confirming via a prompt if running the command from an interactive shell, or using the force command.
The optional csv_log_path argument will output the results to a csv file for record keeping, and is recommended.
synapse migrate syn123 54321 /tmp/migrate.db --csv_log_path /tmp/migrate.csv\n
Sample output:
Indexing Project syn123\nIndexing file entity syn888\nIndexing file entity syn999\nIndexed 2 items, 2 needing migration, 0 already stored in destination storage location (54321). Encountered 0 errors.\n21 items for migration to 54321. Proceed? (y/n)? y\nCreating new version for file entity syn888\nCreating new version for file entity syn999\nCompleted migration of syn123. 2 files migrated. 0 errors encountered\nWriting csv log to /tmp/migrate.csv\n
.. _sts_storage_locations:"},{"location":"guides/data_storage/#sts-storage-locations","title":"STS Storage Locations","text":"Create an STS enabled folder to use AWS Security Token Service credentials with S3 storage locations. These credentials can be scoped to access individual Synapse files or folders and can be used with external S3 tools such as the awscli and the boto3 library separately from Synapse to read and write files to and from Synapse storage. At this time read and write capabilities are supported for external storage locations, while default Synapse storage is limited to read only. Please read the linked documentation for a complete understanding of the capabilities and restrictions of STS enabled folders.
"},{"location":"guides/data_storage/#creating-an-sts-enabled-folder","title":"Creating an STS enabled folder","text":"Creating an STS enabled folder is similar to creating an external storage folder as described above, but this time passing an additional sts_enabled=True keyword parameter. The bucket_name and base_key parameters apply to external storage locations and can be omitted to use Synapse internal storage. Note also that STS can only be enabled on an empty folder.
# create a new folder to use with STS and external S3 storage\nfolder = syn.store(Folder(name=folder_name, parent=parent))\nfolder, storage_location, project_setting = syn.create_s3_storage_location(\n folder=folder,\n bucket_name='my-external-synapse-bucket',\n base_key='path/within/bucket',\n sts_enabled=True,\n)\n
"},{"location":"guides/data_storage/#using-credentials-with-the-awscli","title":"Using credentials with the awscli","text":"This example illustrates obtaining STS credentials and using them with the awscli command line tool. The first command outputs the credentials as shell commands to execute which will then be picked up by subsequent aws cli commands. Note that the bucket-owner-full-control ACL is required when putting an object via STS credentials. This ensures that the object ownership will be transferred to the owner of the AWS bucket.
$ synapse get-sts-token -o shell syn123 read_write\n\nexport SYNAPSE_STS_S3_LOCATION=\"s3://my-external-synapse-bucket/path/within/bucket\"\nexport AWS_ACCESS_KEY_ID=\"<access_key_id>\"\nexport AWS_SECRET_ACCESS_KEY=\"<secret_access_key>\"\nexport AWS_SESSION_TOKEN=\"<session_token>\n\n# if the above are executed in the shell, the awscli will automatically apply them\n\n# e.g. copy a file directly to the bucket using the exported credentials\n$ aws s3 cp /path/to/local/file $SYNAPSE_STS_S3_LOCATION --acl bucket-owner-full-control\n
"},{"location":"guides/data_storage/#using-credentials-with-boto3-in-python","title":"Using credentials with boto3 in python","text":"This example illustrates retrieving STS credentials and using them with boto3 within python code, in this case to upload a file. Note that the bucket-owner-full-control ACL is required when putting an object via STS credentials. This ensures that the object ownership will be transferred to the owner of the AWS bucket.
# the boto output_format is compatible with the boto3 session api.\ncredentials = syn.get_sts_storage_token('syn123', 'read_write', output_format='boto')\n\ns3_client = boto3.client('s3', **credentials)\ns3_client.upload_file(\n Filename='/path/to/local/file,\n Bucket='my-external-synapse-bucket',\n Key='path/within/bucket/file',\n ExtraArgs={'ACL': 'bucket-owner-full-control'},\n)\n
"},{"location":"guides/data_storage/#automatic-transfers-tofrom-sts-storage-locations-using-boto3-with-synapseclient","title":"Automatic transfers to/from STS storage locations using boto3 with synapseclient","text":"The Python Synapse client can be configured to automatically use STS tokens to perform uploads and downloads to enabled storage locations using an installed boto3 library rather than through the traditional Synapse client APIs. This can improve performance in certain situations, particularly uploads of large files, as the data transfer itself can be conducted purely against the AWS S3 APIs, only invoking the Synapse APIs to retrieve the necessary token and to update Synapse metadata in the case of an upload. Once configured to do so, retrieval of STS tokens for supported operations occurs automatically without any change in synapseclient usage.
To enable STS/boto3 transfers on all get
and store
operations, do the following:
pip install boto3\n
# add to .synapseConfig to automatically apply as default for all synapse client instances\n[transfer]\nuse_boto_sts=true\n\n# alternatively set on a per instance basis within python code\nsyn.use_boto_sts_transfers = True\n
Note that if boto3 is not installed, then these settings will have no effect.
"},{"location":"guides/data_storage/#sftp","title":"SFTP","text":""},{"location":"guides/data_storage/#installation","title":"Installation","text":"Installing the extra libraries that the Python client uses to communicate with SFTP servers may add a few steps to the installation process.
The required libraries are:
Building these libraries on Unix OS's is straightforward, but you need the Python development headers and libraries. For example, in Debian or Ubuntu distributions:
sudo apt-get install python-dev\n
Once this requirement is met, sudo pip install synapseclient
should be able to build pycrypto.
Binary distributions of pycrypto built for Windows is available from Michael Foord at Voidspace. Install this before installing the Python client.
After running the pycrypto installer, sudo pip install synapseclient
should work.
Another option is to build your own binary with either the free developer tools from Microsoft or the MinGW compiler.
"},{"location":"guides/data_storage/#configure-your-client","title":"Configure your client","text":"Make sure you configure your ~/.synapseConfig file to connect to your SFTP server.
"},{"location":"guides/validate_annotations/","title":"Validate Annotations","text":"Warning: This is a beta implementation and is subject to change. Use at your own risk.
Validate annotations on your Synapse entities by leveraging the JSON schema services. Here are the steps you must take to set up the JSON Schema service.
"},{"location":"guides/validate_annotations/#create-a-json-schema-organization","title":"Create a JSON Schema organization","text":"Set up Synapse client and JSON Schema service:
import synapseclient\nsyn = synapseclient.login()\nsyn.get_available_services() # Output: ['json_schema']\njs = syn.service(\"json_schema\")\n
Create, manage, and delete a JSON Schema organization:
my_org_name = <your org name here>\nmy_org = js.JsonSchemaOrganization(my_org_name)\nmy_org # Output: JsonSchemaOrganization(name=my_org_name)\nmy_org.create()\nmy_org.get_acl()\nmy_org.set_acl([syn.getUserProfile().ownerId])\n# my_org.update_acl([syn.getUserProfile().ownerId])\nmy_org.delete()\n
Retrieve existing organization and associated JSON schemas:
orgs = js.list_organizations()\nsage_org = js.JsonSchemaOrganization(\"sage.annotations\")\nschemas = sage_org.list_json_schemas()\nschema1 = next(schemas)\nschema2 = sage_org.get_json_schema(schema1.name)\nassert schema1 is schema2 # True\nschema1 # Output: JsonSchema(org='sage.annotations', name='analysis.alignmentMethod')\n
Manage a specific version of a JSON schema:
versions = schema1.list_versions()\nversion1 = next(versions)\nraw_body = version1.body\nfull_body = version1.expand()\nversion1\n# Output: JsonSchemaVersion(org='sage.annotations', name='analysis.alignmentMethod', version='0.0.1')\n
Create a new JSON schema version for an existing organization:
from random import randint\nrint = randint(0, 100000)\nschema_name = \"my.schema\"\n\n# Method 1\nmy_org = js.JsonSchemaOrganization(my_org_name)\nnew_version1 = my_org.create_json_schema(raw_body, schema_name, f\"0.{rint}.1\")\n\n# Method 2\nmy_schema = js.JsonSchema(my_org, schema_name)\nnew_version2 = my_schema.create(raw_body, f\"0.{rint}.2\")\n\n# Method 3\nmy_version = js.JsonSchemaVersion(my_org, schema_name, f\"0.{rint}.3\")\nnew_version3 = my_version.create(raw_body)\n
Test validation on a Synapse entity:
from time import sleep\nsynapse_id = \"syn25922647\"\njs.bind_json_schema(new_version1.uri, synapse_id)\njs.get_json_schema(synapse_id)\nsleep(3)\njs.validate(synapse_id)\njs.validate_children(synapse_id)\njs.validation_stats(synapse_id)\njs.unbind_json_schema(synapse_id)\n
Access to low-level API functions:
js.create_organization(organization_name)\njs.get_organization(organization_name)\njs.list_organizations()\njs.delete_organization(organization_id)\njs.get_organization_acl(organization_id)\njs.update_organization_acl(organization_id, resource_access, etag)\njs.list_json_schemas(organization_name)\njs.list_json_schema_versions(organization_name, json_schema_name)\njs.create_json_schema(json_schema_body, dry_run)\njs.get_json_schema_body(json_schema_uri)\njs.delete_json_schema(json_schema_uri)\njs.json_schema_validation(json_schema_uri)\njs.bind_json_schema_to_entity(synapse_id, json_schema_uri)\njs.get_json_schema_from_entity(synapse_id)\njs.delete_json_schema_from_entity(synapse_id)\njs.validate_entity_with_json_schema(synapse_id)\njs.get_json_schema_validation_statistics(synapse_id)\njs.get_invalid_json_schema_validation(synapse_id)\n
"},{"location":"guides/views/","title":"Views","text":"A view is a view of all entities (File, Folder, Project, Table, Docker Repository, View) within one or more Projects or Folders. Views can:
Let's go over some examples to demonstrate how view works. First, create a new project and add some files:
import synapseclient\nfrom synapseclient import Project, File, Column, Table, EntityViewSchema, EntityViewType\nsyn = synapseclient.Synapse()\nsyn.login()\n\n# Create a new project\nproject = syn.store(Project(\"test view\"))\n\n# Create some files\nfile1 = syn.store(File(path=\"path/to/file1.txt\", parent=project))\nfile2 = syn.store(File(path=\"path/to/file2.txt\", parent=project))\n\n# add some annotations\nsyn.setAnnotations(file1, {\"contributor\":\"Sage\", \"class\":\"V\"})\nsyn.setAnnotations(file2, {\"contributor\":\"UW\", \"rank\":\"X\"})\n
"},{"location":"guides/views/#creating-a-view","title":"Creating a View","text":"To create a view, defines its name, columns, parent, scope, and the type of the view:
view = EntityViewSchema(name=\"my first file view\",\n columns=[\n Column(name=\"contributor\", columnType=\"STRING\"),\n Column(name=\"class\", columnType=\"STRING\"),\n Column(name=\"rank\", columnType=\"STRING\")),\n parent=project['id'],\n scopes=project['id'],\n includeEntityTypes=[EntityViewType.FILE, EntityViewType.FOLDER],\n addDefaultViewColumns=True)\nview = syn.store(view)\n
We support the following entity type in a View:
* EntityViewType.FILE\n* EntityViewType.PROJECT\n* EntityViewType.TABLE\n* EntityViewType.FOLDER\n* EntityViewType.VIEW\n* EntityViewType.DOCKER\n
To see the content of your newly created View, use syn.tableQuery():
query_results = syn.tableQuery(\"select * from %s\" % view['id'])\ndata = query_results.asDataFrame()\n
"},{"location":"guides/views/#updating-annotations-using-view","title":"Updating Annotations using View","text":"To update class
annotation for file2
, simply update the view:
# Retrieve the view data using table query\nquery_results = syn.tableQuery(\"select * from %s\" % view['id'])\ndata = query_results.asDataFrame()\n\n# Modify the annotations by modifying the view data and store it\ndata[\"class\"] = [\"V\", \"VI\"]\nsyn.store(Table(view['id'], data))\n
The change in annotations reflect in synGetAnnotations():
syn.getAnnotations(file2['id'])\n
A View is a Table. Please visit Tables to see how to change schema, update content, and other operations that can be done on View.
"},{"location":"reference/activity/","title":"Activity/Provenance","text":""},{"location":"reference/activity/#synapseclient.activity","title":"synapseclient.activity
","text":"Provenance
The Activity object represents the source of a data set or the data processing steps used to produce it. Using W3C provenance ontology terms, a result is generated by a combination of data and code which are either used or executed.
"},{"location":"reference/activity/#synapseclient.activity--imports","title":"Imports","text":"from synapseclient import Activity\n
"},{"location":"reference/activity/#synapseclient.activity--creating-an-activity-object","title":"Creating an activity object","text":"act = Activity(name='clustering',\n description='whizzy clustering',\n used=['syn1234','syn1235'],\n executed='syn4567')\n
Here, syn1234 and syn1235 might be two types of measurements on a common set of samples. Some whizzy clustering code might be referred to by syn4567. The used and executed can reference entities in Synapse or URLs.
Alternatively, you can build an activity up piecemeal::
act = Activity(name='clustering', description='whizzy clustering')\nact.used(['syn12345', 'syn12346'])\nact.executed(\n 'https://raw.githubusercontent.com/Sage-Bionetworks/synapsePythonClient/develop/tests/unit/unit_test_client.py')\n
"},{"location":"reference/activity/#synapseclient.activity--storing-entities-with-provenance","title":"Storing entities with provenance","text":"The activity can be passed in when storing an Entity to set the Entity's provenance::
clustered_samples = syn.store(clustered_samples, activity=act)\n
We've now recorded that clustered_samples
is the output of our whizzy clustering algorithm applied to the data stored in syn1234 and syn1235.
The synapseclient.Synapse.store has shortcuts for specifying the used and executed lists directly. For example, when storing a data entity, it's a good idea to record its source::
excellent_data = syn.store(excellent_data,\n activityName='data-r-us'\n activityDescription='downloaded from data-r-us',\n used='http://data-r-us.com/excellent/data.xyz')\n
"},{"location":"reference/activity/#synapseclient.activity-classes","title":"Classes","text":""},{"location":"reference/activity/#synapseclient.activity.Activity","title":"Activity
","text":" Bases: dict
Represents the provenance of a Synapse Entity.
PARAMETER DESCRIPTIONname
Name of the Activity
DEFAULT: None
description
A short text description of the Activity
DEFAULT: None
used
Either a list of:
DEFAULT: None
executed
A code resource that was executed to generate the Entity.
DEFAULT: None
data
A dictionary representation of an Activity, with fields 'name', 'description' and 'used' (a list of reference objects)
DEFAULT: {}
See also: The W3C's provenance ontology
Source code insynapseclient/activity.py
class Activity(dict):\n \"\"\"\n Represents the provenance of a Synapse Entity.\n\n Parameters:\n name: Name of the Activity\n description: A short text description of the Activity\n used: Either a list of:\n\n - [reference objects](https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/Reference.html) (e.g. [{'targetId':'syn123456', 'targetVersionNumber':1}])\n - a list of Synapse Entities or Entity IDs\n - a list of URL's\n executed: A code resource that was executed to generate the Entity.\n data: A dictionary representation of an Activity, with fields 'name', 'description' and 'used' (a list of reference objects)\n\n See also: The [W3C's provenance ontology](http://www.w3.org/TR/prov-o/)\n \"\"\"\n\n # TODO: make constructors from JSON consistent across objects\n def __init__(self, name=None, description=None, used=None, executed=None, data={}):\n super(Activity, self).__init__(data)\n if \"used\" not in self:\n self[\"used\"] = []\n\n if name is not None:\n self[\"name\"] = name\n if description is not None:\n self[\"description\"] = description\n if used is not None:\n self.used(used)\n if executed is not None:\n self.executed(executed)\n\n def used(\n self, target=None, targetVersion=None, wasExecuted=None, url=None, name=None\n ):\n \"\"\"\n Add a resource used by the activity.\n\n This method tries to be as permissive as possible. It accepts a string which might be a synapse ID or a URL,\n a synapse entity, a UsedEntity or UsedURL dictionary or a list containing any combination of these.\n\n In addition, named parameters can be used to specify the fields of either a UsedEntity or a UsedURL.\n If target and optionally targetVersion are specified, create a UsedEntity.\n If url and optionally name are specified, create a UsedURL.\n\n It is an error to specify both target/targetVersion parameters and url/name parameters in the same call.\n To add multiple UsedEntities and UsedURLs, make a separate call for each or pass in a list.\n\n In case of conflicting settings for wasExecuted both inside an object and with a parameter, the parameter wins.\n For example, this UsedURL will have wasExecuted set to False::\n\n activity.used({'url':'http://google.com', 'name':'Goog', 'wasExecuted':True}, wasExecuted=False)\n\n Entity examples::\n\n activity.used('syn12345')\n activity.used(entity)\n activity.used(target=entity, targetVersion=2)\n activity.used(codeEntity, wasExecuted=True)\n activity.used({'reference':{'target':'syn12345', 'targetVersion':1}, 'wasExecuted':False})\n\n URL examples::\n\n activity.used('http://mydomain.com/my/awesome/data.RData')\n activity.used(url='http://mydomain.com/my/awesome/data.RData', name='Awesome Data')\n activity.used(url='https://github.com/joe_hacker/code_repo', name='Gnarly hacks', wasExecuted=True)\n activity.used({'url':'https://github.com/joe_hacker/code_repo', 'name':'Gnarly hacks'}, wasExecuted=True)\n\n List example::\n\n activity.used(['syn12345', 'syn23456', entity, \\\n {'reference':{'target':'syn100009', 'targetVersion':2}, 'wasExecuted':True}, \\\n 'http://mydomain.com/my/awesome/data.RData'])\n \"\"\"\n # -- A list of targets\n if isinstance(target, list):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"list of used resources\")\n\n for item in target:\n self.used(item, wasExecuted=wasExecuted)\n return\n\n # -- UsedEntity\n elif is_used_entity(target):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n 
_raise_incorrect_used_usage(\n badargs, \"dictionary representing a used resource\"\n )\n\n resource = target\n if \"concreteType\" not in resource:\n resource[\n \"concreteType\"\n ] = \"org.sagebionetworks.repo.model.provenance.UsedEntity\"\n\n # -- Used URL\n elif is_used_url(target):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"URL\")\n\n resource = target\n if \"concreteType\" not in resource:\n resource[\n \"concreteType\"\n ] = \"org.sagebionetworks.repo.model.provenance.UsedURL\"\n\n # -- Synapse Entity\n elif is_synapse_entity(target):\n badargs = _get_any_bad_args([\"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"Synapse entity\")\n\n reference = {\"targetId\": target[\"id\"]}\n if \"versionNumber\" in target:\n reference[\"targetVersionNumber\"] = target[\"versionNumber\"]\n if targetVersion:\n reference[\"targetVersionNumber\"] = int(targetVersion)\n resource = {\n \"reference\": reference,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedEntity\",\n }\n # -- URL parameter\n elif url:\n badargs = _get_any_bad_args([\"target\", \"targetVersion\"], locals())\n _raise_incorrect_used_usage(badargs, \"URL\")\n\n resource = {\n \"url\": url,\n \"name\": name if name else target,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedURL\",\n }\n\n # -- URL as a string\n elif is_url(target):\n badargs = _get_any_bad_args([\"targetVersion\"], locals())\n _raise_incorrect_used_usage(badargs, \"URL\")\n resource = {\n \"url\": target,\n \"name\": name if name else target,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedURL\",\n }\n\n # -- Synapse Entity ID (assuming the string is an ID)\n elif isinstance(target, str):\n badargs = _get_any_bad_args([\"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"Synapse entity\")\n vals = target.split(\".\") # Handle synapseIds of from syn234.4\n if not is_synapse_id_str(vals[0]):\n raise ValueError(\"%s is not a valid Synapse id\" % target)\n if len(vals) == 2:\n if targetVersion and int(targetVersion) != int(vals[1]):\n raise ValueError(\n \"Two conflicting versions for %s were specified\" % target\n )\n targetVersion = int(vals[1])\n reference = {\"targetId\": vals[0]}\n if targetVersion:\n reference[\"targetVersionNumber\"] = int(targetVersion)\n resource = {\n \"reference\": reference,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedEntity\",\n }\n else:\n raise SynapseError(\"Unexpected parameters in call to Activity.used().\")\n\n # Set wasExecuted\n if wasExecuted is None:\n # Default to False\n if \"wasExecuted\" not in resource:\n resource[\"wasExecuted\"] = False\n else:\n # wasExecuted parameter overrides setting in an object\n resource[\"wasExecuted\"] = wasExecuted\n\n # Add the used resource to the activity\n self[\"used\"].append(resource)\n\n def executed(self, target=None, targetVersion=None, url=None, name=None):\n \"\"\"\n Add a code resource that was executed during the activity.\n See :py:func:`synapseclient.activity.Activity.used`\n \"\"\"\n self.used(\n target=target,\n targetVersion=targetVersion,\n url=url,\n name=name,\n wasExecuted=True,\n )\n\n def _getStringList(self, wasExecuted=True):\n usedList = []\n for source in [\n source\n for source in self[\"used\"]\n if source.get(\"wasExecuted\", False) == wasExecuted\n ]:\n if source[\"concreteType\"].endswith(\"UsedURL\"):\n if source.get(\"name\"):\n 
usedList.append(source.get(\"name\"))\n else:\n usedList.append(source.get(\"url\"))\n else: # It is an entity for now\n tmpstr = source[\"reference\"][\"targetId\"]\n if \"targetVersionNumber\" in source[\"reference\"]:\n tmpstr += \".%i\" % source[\"reference\"][\"targetVersionNumber\"]\n usedList.append(tmpstr)\n return usedList\n\n def _getExecutedStringList(self):\n return self._getStringList(wasExecuted=True)\n\n def _getUsedStringList(self):\n return self._getStringList(wasExecuted=False)\n\n def __str__(self):\n str = \"%s\\n Executed:\\n\" % self.get(\"name\", \"\")\n str += \"\\n\".join(self._getExecutedStringList())\n str += \" Used:\\n\"\n str += \"\\n\".join(self._getUsedStringList())\n return str\n
"},{"location":"reference/activity/#synapseclient.activity.Activity-functions","title":"Functions","text":""},{"location":"reference/activity/#synapseclient.activity.Activity.used","title":"used(target=None, targetVersion=None, wasExecuted=None, url=None, name=None)
","text":"Add a resource used by the activity.
This method tries to be as permissive as possible. It accepts a string that might be a Synapse ID or a URL, a Synapse entity, a UsedEntity or UsedURL dictionary, or a list containing any combination of these.
In addition, named parameters can be used to specify the fields of either a UsedEntity or a UsedURL. If target and optionally targetVersion are specified, create a UsedEntity. If url and optionally name are specified, create a UsedURL.
It is an error to specify both target/targetVersion parameters and url/name parameters in the same call. To add multiple UsedEntities and UsedURLs, make a separate call for each or pass in a list.
In case of conflicting settings for wasExecuted both inside an object and with a parameter, the parameter wins. For example, this UsedURL will have wasExecuted set to False:
activity.used({'url':'http://google.com', 'name':'Goog', 'wasExecuted':True}, wasExecuted=False)\n
Entity examples:
activity.used('syn12345')\nactivity.used(entity)\nactivity.used(target=entity, targetVersion=2)\nactivity.used(codeEntity, wasExecuted=True)\nactivity.used({'reference':{'target':'syn12345', 'targetVersion':1}, 'wasExecuted':False})\n
URL examples:
activity.used('http://mydomain.com/my/awesome/data.RData')\nactivity.used(url='http://mydomain.com/my/awesome/data.RData', name='Awesome Data')\nactivity.used(url='https://github.com/joe_hacker/code_repo', name='Gnarly hacks', wasExecuted=True)\nactivity.used({'url':'https://github.com/joe_hacker/code_repo', 'name':'Gnarly hacks'}, wasExecuted=True)\n
List example:
activity.used(['syn12345', 'syn23456', entity, {'reference':{'target':'syn100009', 'targetVersion':2}, 'wasExecuted':True}, 'http://mydomain.com/my/awesome/data.RData'])\n
Source code in synapseclient/activity.py
def used(\n self, target=None, targetVersion=None, wasExecuted=None, url=None, name=None\n):\n \"\"\"\n Add a resource used by the activity.\n\n This method tries to be as permissive as possible. It accepts a string which might be a synapse ID or a URL,\n a synapse entity, a UsedEntity or UsedURL dictionary or a list containing any combination of these.\n\n In addition, named parameters can be used to specify the fields of either a UsedEntity or a UsedURL.\n If target and optionally targetVersion are specified, create a UsedEntity.\n If url and optionally name are specified, create a UsedURL.\n\n It is an error to specify both target/targetVersion parameters and url/name parameters in the same call.\n To add multiple UsedEntities and UsedURLs, make a separate call for each or pass in a list.\n\n In case of conflicting settings for wasExecuted both inside an object and with a parameter, the parameter wins.\n For example, this UsedURL will have wasExecuted set to False::\n\n activity.used({'url':'http://google.com', 'name':'Goog', 'wasExecuted':True}, wasExecuted=False)\n\n Entity examples::\n\n activity.used('syn12345')\n activity.used(entity)\n activity.used(target=entity, targetVersion=2)\n activity.used(codeEntity, wasExecuted=True)\n activity.used({'reference':{'target':'syn12345', 'targetVersion':1}, 'wasExecuted':False})\n\n URL examples::\n\n activity.used('http://mydomain.com/my/awesome/data.RData')\n activity.used(url='http://mydomain.com/my/awesome/data.RData', name='Awesome Data')\n activity.used(url='https://github.com/joe_hacker/code_repo', name='Gnarly hacks', wasExecuted=True)\n activity.used({'url':'https://github.com/joe_hacker/code_repo', 'name':'Gnarly hacks'}, wasExecuted=True)\n\n List example::\n\n activity.used(['syn12345', 'syn23456', entity, \\\n {'reference':{'target':'syn100009', 'targetVersion':2}, 'wasExecuted':True}, \\\n 'http://mydomain.com/my/awesome/data.RData'])\n \"\"\"\n # -- A list of targets\n if isinstance(target, list):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"list of used resources\")\n\n for item in target:\n self.used(item, wasExecuted=wasExecuted)\n return\n\n # -- UsedEntity\n elif is_used_entity(target):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n _raise_incorrect_used_usage(\n badargs, \"dictionary representing a used resource\"\n )\n\n resource = target\n if \"concreteType\" not in resource:\n resource[\n \"concreteType\"\n ] = \"org.sagebionetworks.repo.model.provenance.UsedEntity\"\n\n # -- Used URL\n elif is_used_url(target):\n badargs = _get_any_bad_args([\"targetVersion\", \"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"URL\")\n\n resource = target\n if \"concreteType\" not in resource:\n resource[\n \"concreteType\"\n ] = \"org.sagebionetworks.repo.model.provenance.UsedURL\"\n\n # -- Synapse Entity\n elif is_synapse_entity(target):\n badargs = _get_any_bad_args([\"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"Synapse entity\")\n\n reference = {\"targetId\": target[\"id\"]}\n if \"versionNumber\" in target:\n reference[\"targetVersionNumber\"] = target[\"versionNumber\"]\n if targetVersion:\n reference[\"targetVersionNumber\"] = int(targetVersion)\n resource = {\n \"reference\": reference,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedEntity\",\n }\n # -- URL parameter\n elif url:\n badargs = _get_any_bad_args([\"target\", \"targetVersion\"], locals())\n 
_raise_incorrect_used_usage(badargs, \"URL\")\n\n resource = {\n \"url\": url,\n \"name\": name if name else target,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedURL\",\n }\n\n # -- URL as a string\n elif is_url(target):\n badargs = _get_any_bad_args([\"targetVersion\"], locals())\n _raise_incorrect_used_usage(badargs, \"URL\")\n resource = {\n \"url\": target,\n \"name\": name if name else target,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedURL\",\n }\n\n # -- Synapse Entity ID (assuming the string is an ID)\n elif isinstance(target, str):\n badargs = _get_any_bad_args([\"url\", \"name\"], locals())\n _raise_incorrect_used_usage(badargs, \"Synapse entity\")\n vals = target.split(\".\") # Handle synapseIds of from syn234.4\n if not is_synapse_id_str(vals[0]):\n raise ValueError(\"%s is not a valid Synapse id\" % target)\n if len(vals) == 2:\n if targetVersion and int(targetVersion) != int(vals[1]):\n raise ValueError(\n \"Two conflicting versions for %s were specified\" % target\n )\n targetVersion = int(vals[1])\n reference = {\"targetId\": vals[0]}\n if targetVersion:\n reference[\"targetVersionNumber\"] = int(targetVersion)\n resource = {\n \"reference\": reference,\n \"concreteType\": \"org.sagebionetworks.repo.model.provenance.UsedEntity\",\n }\n else:\n raise SynapseError(\"Unexpected parameters in call to Activity.used().\")\n\n # Set wasExecuted\n if wasExecuted is None:\n # Default to False\n if \"wasExecuted\" not in resource:\n resource[\"wasExecuted\"] = False\n else:\n # wasExecuted parameter overrides setting in an object\n resource[\"wasExecuted\"] = wasExecuted\n\n # Add the used resource to the activity\n self[\"used\"].append(resource)\n
"},{"location":"reference/activity/#synapseclient.activity.Activity.executed","title":"executed(target=None, targetVersion=None, url=None, name=None)
","text":"Add a code resource that was executed during the activity. See :py:func:synapseclient.activity.Activity.used
Source code in synapseclient/activity.py
def executed(self, target=None, targetVersion=None, url=None, name=None):\n \"\"\"\n Add a code resource that was executed during the activity.\n See :py:func:`synapseclient.activity.Activity.used`\n \"\"\"\n self.used(\n target=target,\n targetVersion=targetVersion,\n url=url,\n name=name,\n wasExecuted=True,\n )\n
"},{"location":"reference/activity/#synapseclient.activity-functions","title":"Functions","text":""},{"location":"reference/activity/#synapseclient.activity.is_used_entity","title":"is_used_entity(x)
","text":"Returns True if the given object represents a UsedEntity.
Source code in synapseclient/activity.py
def is_used_entity(x):\n \"\"\"Returns True if the given object represents a UsedEntity.\"\"\"\n\n # A UsedEntity must be a dictionary with a 'reference' field, with a 'targetId' field\n if (\n not isinstance(x, collections.abc.Mapping)\n or \"reference\" not in x\n or \"targetId\" not in x[\"reference\"]\n ):\n return False\n\n # Must only have three keys\n if not all(key in (\"reference\", \"wasExecuted\", \"concreteType\") for key in x.keys()):\n return False\n\n # 'reference' field can only have two keys\n if not all(\n key in (\"targetId\", \"targetVersionNumber\") for key in x[\"reference\"].keys()\n ):\n return False\n\n return True\n
"},{"location":"reference/activity/#synapseclient.activity.is_used_url","title":"is_used_url(x)
","text":"Returns True if the given object represents a UsedURL.
Source code in synapseclient/activity.py
def is_used_url(x):\n \"\"\"Returns True if the given object represents a UsedURL.\"\"\"\n\n # A UsedURL must be a dictionary with a 'url' field\n if not isinstance(x, collections.abc.Mapping) or \"url\" not in x:\n return False\n\n # Must only have four keys\n if not all(\n key in (\"url\", \"name\", \"wasExecuted\", \"concreteType\") for key in x.keys()\n ):\n return False\n\n return True\n
"},{"location":"reference/annotations/","title":"Annotations","text":""},{"location":"reference/annotations/#synapseclient.annotations","title":"synapseclient.annotations
","text":""},{"location":"reference/annotations/#synapseclient.annotations--annotations","title":"Annotations","text":"Annotations are arbitrary metadata attached to Synapse entities. They can be accessed like ordinary object properties or like dictionary keys:
entity.my_annotation = 'This is one way to do it'\nentity['other_annotation'] = 'This is another'\n
Annotations can be given in the constructor for Synapse Entities:
entity = File('data.xyz', parent=my_project, rating=9.1234)\n
Annotate the entity with location data:
entity.lat_long = [47.627477, -122.332154]\n
Record when we collected the data. This will use the current timezone of the machine running the code.
from datetime import datetime as Datetime\nentity.collection_date = Datetime.now()\n
Record when we collected the data in UTC:
from datetime import datetime as Datetime\nentity.collection_date = Datetime.utcnow()\n
Data sources are best recorded using Synapse's Activity/Provenance tools.
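For example, provenance can be attached when an entity is stored (a sketch; the ID and URL are hypothetical):
entity = syn.store(entity, used=['syn12345'], executed=['https://github.com/example/analysis/run.py'])\n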
"},{"location":"reference/annotations/#synapseclient.annotations--implementation-details","title":"Implementation details","text":"In Synapse, entities have both properties and annotations. Properties are used by the system, whereas annotations are completely user defined. In the Python client, we try to present this situation as a normal object, with one set of properties.
See also:
Annotations
","text":" Bases: dict
Represent Synapse Entity annotations as a flat dictionary with the system assigned properties id, etag as object attributes.
ATTRIBUTE DESCRIPTIONid
Synapse ID of the Entity
etag
Synapse etag of the Entity
values
(Optional) dictionary of values to be copied into annotations
**kwargs
additional key-value pairs to be added as annotations
Creating a few instances
Creating and setting annotations
example1 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', {'foo':'bar'})\nexample2 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', foo='bar')\nexample3 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984')\nexample3['foo'] = 'bar'\n
Source code in synapseclient/annotations.py
class Annotations(dict):\n \"\"\"\n Represent Synapse Entity annotations as a flat dictionary with the system assigned properties id, etag\n as object attributes.\n\n Attributes:\n id: Synapse ID of the Entity\n etag: Synapse etag of the Entity\n values: (Optional) dictionary of values to be copied into annotations\n **kwargs: additional key-value pairs to be added as annotations\n\n Example: Creating a few instances\n Creating and setting annotations\n\n example1 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', {'foo':'bar'})\n example2 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', foo='bar')\n example3 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984')\n example3['foo'] = 'bar'\n \"\"\"\n\n id: str\n etag: str\n\n def __init__(\n self,\n id: typing.Union[str, int, Entity],\n etag: str,\n values: typing.Dict = None,\n **kwargs,\n ):\n \"\"\"\n Create an Annotations object taking key value pairs from a dictionary or from keyword arguments.\n System properties id, etag, creationDate and uri become attributes of the object.\n\n :param id: A Synapse ID, a Synapse Entity object, a plain dictionary in which 'id' maps to a Synapse ID\n :param etag: etag of the Synapse Entity\n :param values: (Optional) dictionary of values to be copied into annotations\n :param \\**kwargs: additional key-value pairs to be added as annotations\n\n Example::\n\n example1 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', {'foo':'bar'})\n example2 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984', foo='bar')\n example3 = Annotations('syn123','40256475-6fb3-11ea-bb0a-9cb6d0d8d984')\n example3['foo'] = 'bar'\n\n \"\"\"\n super().__init__()\n\n self.id = id\n self.etag = etag\n\n if values:\n self.update(values)\n if kwargs:\n self.update(kwargs)\n\n @property\n def id(self):\n return self._id\n\n @id.setter\n def id(self, value):\n if value is None:\n raise ValueError(\"id must not be None\")\n self._id = id_of(value)\n\n @property\n def etag(self):\n return self._etag\n\n @etag.setter\n def etag(self, value):\n if value is None:\n raise ValueError(\"etag must not be None\")\n self._etag = str(value)\n
"},{"location":"reference/annotations/#synapseclient.annotations-functions","title":"Functions","text":""},{"location":"reference/annotations/#synapseclient.annotations.is_synapse_annotations","title":"is_synapse_annotations(annotations)
","text":"Tests if the given object is a Synapse-style Annotations object.
PARAMETER DESCRIPTIONannotations
A key-value mapping that may or may not be a Synapse-style Annotations object.
TYPE: Mapping
bool
True if the given object is a Synapse-style Annotations object, False otherwise.
Source code insynapseclient/annotations.py
def is_synapse_annotations(annotations: typing.Mapping) -> bool:\n \"\"\"Tests if the given object is a Synapse-style Annotations object.\n\n Arguments:\n annotations: A key-value mapping that may or may not be a Synapse-style Annotations object.\n\n Returns:\n True if the given object is a Synapse-style Annotations object, False otherwise.\n \"\"\"\n if not isinstance(annotations, collections.abc.Mapping):\n return False\n return annotations.keys() >= {\"id\", \"etag\", \"annotations\"}\n
"},{"location":"reference/annotations/#synapseclient.annotations.is_submission_status_annotations","title":"is_submission_status_annotations(annotations)
","text":"Tests if the given dictionary is in the form of annotations to submission status
PARAMETER DESCRIPTIONannotations
A key-value mapping that may or may not be a submission status annotations object.
TYPE: Mapping
bool
True if the given object is a submission status annotations object, False otherwise.
Source code insynapseclient/annotations.py
def is_submission_status_annotations(annotations: collections.abc.Mapping) -> bool:\n \"\"\"Tests if the given dictionary is in the form of annotations to submission status\n\n Arguments:\n annotations: A key-value mapping that may or may not be a submission status annotations object.\n\n Returns:\n True if the given object is a submission status annotations object, False otherwise.\n \"\"\"\n keys = [\"objectId\", \"scopeId\", \"stringAnnos\", \"longAnnos\", \"doubleAnnos\"]\n if not isinstance(annotations, collections.abc.Mapping):\n return False\n return all([key in keys for key in annotations.keys()])\n
"},{"location":"reference/annotations/#synapseclient.annotations.to_submission_status_annotations","title":"to_submission_status_annotations(annotations, is_private=True)
","text":"Converts a normal dictionary to the format used to annotate submission statuses, which is different from the format used to annotate entities.
PARAMETER DESCRIPTIONannotations
A normal Python dictionary whose values are strings, floats, ints or doubles.
is_private
Set privacy on all annotations at once. These can be set individually using set_privacy.
DEFAULT: True
Adding and converting annotations
from synapseclient.annotations import to_submission_status_annotations, from_submission_status_annotations\nfrom datetime import datetime as Datetime\n\n## create a submission and get its status\nsubmission = syn.submit(evaluation, 'syn11111111')\nsubmission_status = syn.getSubmissionStatus(submission)\n\n## add annotations\nsubmission_status.annotations = {'foo':'bar', 'shoe_size':12, 'IQ':12, 'timestamp':Datetime.now()}\n\n## convert annotations\nsubmission_status.annotations = to_submission_status_annotations(submission_status.annotations)\nsubmission_status = syn.store(submission_status)\n
Synapse categorizes these annotations by: stringAnnos, doubleAnnos, longAnnos.
Source code insynapseclient/annotations.py
def to_submission_status_annotations(annotations, is_private=True):\n \"\"\"\n Converts a normal dictionary to the format used to annotate submission statuses, which is different from the format\n used to annotate entities.\n\n Arguments:\n annotations: A normal Python dictionary whose values are strings, floats, ints or doubles.\n is_private: Set privacy on all annotations at once. These can be set individually using\n [set_privacy][synapseclient.annotations.set_privacy].\n\n\n Example: Using this function\n Adding and converting annotations\n\n from synapseclient.annotations import to_submission_status_annotations, from_submission_status_annotations\n from datetime import datetime as Datetime\n\n ## create a submission and get its status\n submission = syn.submit(evaluation, 'syn11111111')\n submission_status = syn.getSubmissionStatus(submission)\n\n ## add annotations\n submission_status.annotations = {'foo':'bar', 'shoe_size':12, 'IQ':12, 'timestamp':Datetime.now()}\n\n ## convert annotations\n submission_status.annotations = to_submission_status_annotations(submission_status.annotations)\n submission_status = syn.store(submission_status)\n\n\n Synapse categorizes these annotations by: stringAnnos, doubleAnnos, longAnnos.\n \"\"\"\n if is_submission_status_annotations(annotations):\n return annotations\n synapseAnnos = {}\n for key, value in annotations.items():\n if key in [\"objectId\", \"scopeId\", \"stringAnnos\", \"longAnnos\", \"doubleAnnos\"]:\n synapseAnnos[key] = value\n elif isinstance(value, bool):\n synapseAnnos.setdefault(\"stringAnnos\", []).append(\n {\"key\": key, \"value\": str(value).lower(), \"isPrivate\": is_private}\n )\n elif isinstance(value, int):\n synapseAnnos.setdefault(\"longAnnos\", []).append(\n {\"key\": key, \"value\": value, \"isPrivate\": is_private}\n )\n elif isinstance(value, float):\n synapseAnnos.setdefault(\"doubleAnnos\", []).append(\n {\"key\": key, \"value\": value, \"isPrivate\": is_private}\n )\n elif isinstance(value, str):\n synapseAnnos.setdefault(\"stringAnnos\", []).append(\n {\"key\": key, \"value\": value, \"isPrivate\": is_private}\n )\n elif is_date(value):\n synapseAnnos.setdefault(\"longAnnos\", []).append(\n {\n \"key\": key,\n \"value\": to_unix_epoch_time(value),\n \"isPrivate\": is_private,\n }\n )\n else:\n synapseAnnos.setdefault(\"stringAnnos\", []).append(\n {\"key\": key, \"value\": str(value), \"isPrivate\": is_private}\n )\n return synapseAnnos\n
"},{"location":"reference/annotations/#synapseclient.annotations.from_submission_status_annotations","title":"from_submission_status_annotations(annotations)
","text":"Convert back from submission status annotation format to a normal dictionary.
PARAMETER DESCRIPTIONannotations
A dictionary in the format used to annotate submission statuses.
RETURNS DESCRIPTION
dict
A normal Python dictionary.
Using this functionConverting from submission status annotations
submission_status.annotations = from_submission_status_annotations(submission_status.annotations)\n
Source code in synapseclient/annotations.py
def from_submission_status_annotations(annotations) -> dict:\n \"\"\"\n Convert back from submission status annotation format to a normal dictionary.\n\n Arguments:\n annotations: A dictionary in the format used to annotate submission statuses.\n\n Returns:\n A normal Python dictionary.\n\n Example: Using this function\n Converting from submission status annotations\n\n submission_status.annotations = from_submission_status_annotations(submission_status.annotations)\n \"\"\"\n dictionary = {}\n for key, value in annotations.items():\n if key in [\"stringAnnos\", \"longAnnos\"]:\n dictionary.update({kvp[\"key\"]: kvp[\"value\"] for kvp in value})\n elif key == \"doubleAnnos\":\n dictionary.update({kvp[\"key\"]: float(kvp[\"value\"]) for kvp in value})\n else:\n dictionary[key] = value\n return dictionary\n
"},{"location":"reference/annotations/#synapseclient.annotations.set_privacy","title":"set_privacy(annotations, key, is_private=True, value_types=['longAnnos', 'doubleAnnos', 'stringAnnos'])
","text":"Set privacy of individual annotations, where annotations are in the format used by Synapse SubmissionStatus objects. See the Annotations documentation.
PARAMETER DESCRIPTIONannotations
Annotations that have already been converted to Synapse format using to_submission_status_annotations.
key
The key of the annotation whose privacy we're setting.
is_private
If False, the annotation will be visible to users with READ permission on the evaluation. If True, the it will be visible only to users with READ_PRIVATE_SUBMISSION on the evaluation.
DEFAULT: True
value_types
A list of the value types in which to search for the key.
DEFAULT: ['longAnnos', 'doubleAnnos', 'stringAnnos']
synapseclient/annotations.py
def set_privacy(\n annotations,\n key,\n is_private=True,\n value_types=[\"longAnnos\", \"doubleAnnos\", \"stringAnnos\"],\n):\n \"\"\"\n Set privacy of individual annotations, where annotations are in the format used by Synapse SubmissionStatus objects.\n See the [Annotations documentation](https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/annotation/Annotations.html).\n\n Arguments:\n annotations: Annotations that have already been converted to Synapse format using\n [to_submission_status_annotations][synapseclient.annotations.to_submission_status_annotations].\n key: The key of the annotation whose privacy we're setting.\n is_private: If False, the annotation will be visible to users with READ permission on the evaluation.\n If True, the it will be visible only to users with READ_PRIVATE_SUBMISSION on the evaluation.\n value_types: A list of the value types in which to search for the key.\n\n \"\"\"\n for value_type in value_types:\n kvps = annotations.get(value_type, None)\n if kvps:\n for kvp in kvps:\n if kvp[\"key\"] == key:\n kvp[\"isPrivate\"] = is_private\n return kvp\n raise KeyError('The key \"%s\" couldn\\'t be found in the annotations.' % key)\n
"},{"location":"reference/annotations/#synapseclient.annotations.to_synapse_annotations","title":"to_synapse_annotations(annotations)
","text":"Transforms a simple flat dictionary to a Synapse-style Annotation object. https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/annotation/v2/Annotations.html
PARAMETER DESCRIPTIONannotations
A simple flat dictionary of annotations.
TYPE: Annotations
Dict[str, Any]
A Synapse-style Annotation dict.
Source code insynapseclient/annotations.py
def to_synapse_annotations(annotations: Annotations) -> typing.Dict[str, typing.Any]:\n \"\"\"Transforms a simple flat dictionary to a Synapse-style Annotation object.\n https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/annotation/v2/Annotations.html\n\n Arguments:\n annotations: A simple flat dictionary of annotations.\n\n Returns:\n A Synapse-style Annotation dict.\n \"\"\"\n\n if is_synapse_annotations(annotations):\n return annotations\n synapse_annos = {}\n\n if not isinstance(annotations, Annotations):\n raise TypeError(\n \"annotations must be a synapseclient.Annotations object with 'id' and 'etag' attributes\"\n )\n\n synapse_annos[\"id\"] = annotations.id\n synapse_annos[\"etag\"] = annotations.etag\n\n synapse_annos[\"annotations\"] = _convert_to_annotations_list(annotations)\n return synapse_annos\n
"},{"location":"reference/annotations/#synapseclient.annotations.from_synapse_annotations","title":"from_synapse_annotations(raw_annotations)
","text":"Transforms a Synapse-style Annotation object to a simple flat dictionary.
PARAMETER DESCRIPTIONraw_annotations
A Synapse-style Annotation dict.
TYPE: Dict[str, Any]
Annotations
A simple flat dictionary of annotations.
Source code insynapseclient/annotations.py
def from_synapse_annotations(\n raw_annotations: typing.Dict[str, typing.Any]\n) -> Annotations:\n \"\"\"Transforms a Synapse-style Annotation object to a simple flat dictionary.\n\n Arguments:\n raw_annotations: A Synapse-style Annotation dict.\n\n Returns:\n A simple flat dictionary of annotations.\n \"\"\"\n if not is_synapse_annotations(raw_annotations):\n raise ValueError(\n 'Unexpected format of annotations from Synapse. Must include keys: \"id\", \"etag\", and \"annotations\"'\n )\n\n annos = Annotations(raw_annotations[\"id\"], raw_annotations[\"etag\"])\n for key, value_and_type in raw_annotations[\"annotations\"].items():\n key: str\n conversion_func = ANNO_TYPE_TO_FUNC[value_and_type[\"type\"]]\n annos[key] = [conversion_func(v) for v in value_and_type[\"value\"]]\n\n return annos\n
"},{"location":"reference/annotations/#synapseclient.annotations.convert_old_annotation_json","title":"convert_old_annotation_json(annotations)
","text":"Transforms a parsed JSON dictionary of old style annotations into a new style consistent with the entity bundle v2 format.
This is intended to support some models that were saved as serialized entity bundle JSON (Submissions). we don't need to support newer types here e.g. BOOLEAN because they did not exist at the time that annotation JSON was saved in this form.
Source code insynapseclient/annotations.py
def convert_old_annotation_json(annotations):\n \"\"\"Transforms a parsed JSON dictionary of old style annotations\n into a new style consistent with the entity bundle v2 format.\n\n This is intended to support some models that were saved as serialized\n entity bundle JSON (Submissions). we don't need to support newer\n types here e.g. BOOLEAN because they did not exist at the time\n that annotation JSON was saved in this form.\n \"\"\"\n\n meta_keys = (\"id\", \"etag\", \"creationDate\", \"uri\")\n\n type_mapping = {\n \"doubleAnnotations\": \"DOUBLE\",\n \"stringAnnotations\": \"STRING\",\n \"longAnnotations\": \"LONG\",\n \"dateAnnotations\": \"TIMESTAMP_MS\",\n }\n\n annos_v1_keys = set(meta_keys) | set(type_mapping.keys())\n\n # blobAnnotations appear to be little/unused and there is no mapping defined here but if they\n # are present on the annos we should treat it as an old style annos dict\n annos_v1_keys.add(\"blobAnnotations\")\n\n # if any keys in the annos dict are not consistent with an old style annotations then we treat\n # it as an annotations2 style dictionary that is not in need of any conversion\n if any(k not in annos_v1_keys for k in annotations.keys()):\n return annotations\n\n converted = {k: v for k, v in annotations.items() if k in meta_keys}\n converted_annos = converted[\"annotations\"] = {}\n\n for old_type_key, converted_type in type_mapping.items():\n values = annotations.get(old_type_key)\n if values:\n for k, vs in values.items():\n converted_annos[k] = {\n \"type\": converted_type,\n \"value\": vs,\n }\n\n return converted\n
"},{"location":"reference/cli/","title":"Command Line Client","text":"The Synapse Python Client can be used from the command line via the synapse command.
Note: The command line client is installed along with installation of the Synapse Python client.
"},{"location":"reference/cli/#usage","title":"Usage","text":"For help, type:
synapse -h\n
"},{"location":"reference/client/","title":"Client","text":""},{"location":"reference/client/#synapseclient.Synapse","title":"synapseclient.Synapse
","text":" Bases: object
Constructs a Python client object for the Synapse repository service
ATTRIBUTE DESCRIPTIONrepoEndpoint
Location of Synapse repository
authEndpoint
Location of authentication service
fileHandleEndpoint
Location of file service
portalEndpoint
Location of the website
serviceTimeoutSeconds
Wait time before timeout (currently unused)
debug
Print debugging messages if True
skip_checks
Skip version and endpoint checks
configPath
Path to config File with setting for Synapse defaults to ~/.synapseConfig
requests_session
a custom requests.Session object that this Synapse instance will use when making http requests.
cache_root_dir
root directory for storing cache data
silent
Defaults to False.
Getting started
Logging in to Synapse using an authToken
import synapseclient\nsyn = synapseclient.login(authToken=\"authtoken\")\n
Using environment variable or .synapseConfig
import synapseclient\nsyn = synapseclient.login()\n
Source code in synapseclient/client.py
class Synapse(object):\n \"\"\"\n Constructs a Python client object for the Synapse repository service\n\n Attributes:\n repoEndpoint: Location of Synapse repository\n authEndpoint: Location of authentication service\n fileHandleEndpoint: Location of file service\n portalEndpoint: Location of the website\n serviceTimeoutSeconds: Wait time before timeout (currently unused)\n debug: Print debugging messages if True\n skip_checks: Skip version and endpoint checks\n configPath: Path to config File with setting for Synapse\n defaults to ~/.synapseConfig\n requests_session: a custom requests.Session object that this Synapse instance will use\n when making http requests.\n cache_root_dir: root directory for storing cache data\n silent: Defaults to False.\n\n Example: Getting started\n Logging in to Synapse using an authToken\n\n import synapseclient\n syn = synapseclient.login(authToken=\"authtoken\")\n\n Using environment variable or `.synapseConfig`\n\n import synapseclient\n syn = synapseclient.login()\n\n \"\"\"\n\n # TODO: add additional boolean for write to disk?\n @tracer.start_as_current_span(\"Synapse::__init__\")\n def __init__(\n self,\n repoEndpoint: str = None,\n authEndpoint: str = None,\n fileHandleEndpoint: str = None,\n portalEndpoint: str = None,\n debug: bool = None,\n skip_checks: bool = False,\n configPath: str = CONFIG_FILE,\n requests_session: requests.Session = None,\n cache_root_dir: str = None,\n silent: bool = None,\n ):\n self._requests_session = requests_session or requests.Session()\n\n cache_root_dir = (\n cache.CACHE_ROOT_DIR if cache_root_dir is None else cache_root_dir\n )\n\n config_debug = None\n # Check for a config file\n self.configPath = configPath\n if os.path.isfile(configPath):\n config = self.getConfigFile(configPath)\n if config.has_option(\"cache\", \"location\"):\n cache_root_dir = config.get(\"cache\", \"location\")\n if config.has_section(\"debug\"):\n config_debug = True\n\n if debug is None:\n debug = config_debug if config_debug is not None else DEBUG_DEFAULT\n\n self.cache = cache.Cache(cache_root_dir)\n self._sts_token_store = sts_transfer.StsTokenStore()\n\n self.setEndpoints(\n repoEndpoint, authEndpoint, fileHandleEndpoint, portalEndpoint, skip_checks\n )\n\n self.default_headers = {\n \"content-type\": \"application/json; charset=UTF-8\",\n \"Accept\": \"application/json; charset=UTF-8\",\n }\n self.credentials = None\n\n if not isinstance(debug, bool):\n raise ValueError(\"debug must be set to a bool (either True or False)\")\n self.debug = debug\n\n self.silent = silent\n self._init_logger() # initializes self.logger\n\n self.skip_checks = skip_checks\n\n self.table_query_sleep = 2\n self.table_query_backoff = 1.1\n self.table_query_max_sleep = 20\n self.table_query_timeout = 600 # in seconds\n self.multi_threaded = True # if set to True, multi threaded download will be used for http and https URLs\n\n transfer_config = self._get_transfer_config()\n self.max_threads = transfer_config[\"max_threads\"]\n self.use_boto_sts_transfers = transfer_config[\"use_boto_sts\"]\n\n # initialize logging\n def _init_logger(self):\n logger_name = (\n SILENT_LOGGER_NAME\n if self.silent\n else DEBUG_LOGGER_NAME\n if self.debug\n else DEFAULT_LOGGER_NAME\n )\n self.logger = logging.getLogger(logger_name)\n logging.getLogger(\"py.warnings\").handlers = self.logger.handlers\n\n @property\n def max_threads(self) -> int:\n return self._max_threads\n\n @max_threads.setter\n def max_threads(self, value: int):\n self._max_threads = min(max(value, 1), 
MAX_THREADS_CAP)\n\n @property\n def username(self) -> Union[str, None]:\n # for backwards compatability when username was a part of the Synapse object and not in credentials\n return self.credentials.username if self.credentials is not None else None\n\n @tracer.start_as_current_span(\"Synapse::getConfigFile\")\n @functools.lru_cache()\n def getConfigFile(self, configPath: str) -> configparser.RawConfigParser:\n \"\"\"\n Retrieves the client configuration information.\n\n Arguments:\n configPath: Path to configuration file on local file system\n\n Returns:\n A RawConfigParser populated with properties from the user's configuration file.\n \"\"\"\n\n try:\n config = configparser.RawConfigParser()\n config.read(configPath) # Does not fail if the file does not exist\n return config\n except configparser.Error as ex:\n raise ValueError(\n \"Error parsing Synapse config file: {}\".format(configPath)\n ) from ex\n\n @tracer.start_as_current_span(\"Synapse::setEndpoints\")\n def setEndpoints(\n self,\n repoEndpoint: str = None,\n authEndpoint: str = None,\n fileHandleEndpoint: str = None,\n portalEndpoint: str = None,\n skip_checks: bool = False,\n ):\n \"\"\"\n Sets the locations for each of the Synapse services (mostly useful for testing).\n\n Arguments:\n repoEndpoint: Location of synapse repository\n authEndpoint: Location of authentication service\n fileHandleEndpoint: Location of file service\n portalEndpoint: Location of the website\n skip_checks: Skip version and endpoint checks\n\n Example: Switching endpoints\n To switch between staging and production endpoints\n\n syn.setEndpoints(**synapseclient.client.STAGING_ENDPOINTS)\n syn.setEndpoints(**synapseclient.client.PRODUCTION_ENDPOINTS)\n\n \"\"\"\n\n endpoints = {\n \"repoEndpoint\": repoEndpoint,\n \"authEndpoint\": authEndpoint,\n \"fileHandleEndpoint\": fileHandleEndpoint,\n \"portalEndpoint\": portalEndpoint,\n }\n\n # For unspecified endpoints, first look in the config file\n config = self.getConfigFile(self.configPath)\n for point in endpoints.keys():\n if endpoints[point] is None and config.has_option(\"endpoints\", point):\n endpoints[point] = config.get(\"endpoints\", point)\n\n # Endpoints default to production\n for point in endpoints.keys():\n if endpoints[point] is None:\n endpoints[point] = PRODUCTION_ENDPOINTS[point]\n\n # Update endpoints if we get redirected\n if not skip_checks:\n response = self._requests_session.get(\n endpoints[point],\n allow_redirects=False,\n headers=synapseclient.USER_AGENT,\n )\n if response.status_code == 301:\n endpoints[point] = response.headers[\"location\"]\n\n self.repoEndpoint = endpoints[\"repoEndpoint\"]\n self.authEndpoint = endpoints[\"authEndpoint\"]\n self.fileHandleEndpoint = endpoints[\"fileHandleEndpoint\"]\n self.portalEndpoint = endpoints[\"portalEndpoint\"]\n\n @tracer.start_as_current_span(\"Synapse::login\")\n def login(\n self,\n email: str = None,\n password: str = None,\n apiKey: str = None,\n sessionToken: str = None,\n rememberMe: bool = False,\n silent: bool = False,\n forced: bool = False,\n authToken: str = None,\n ):\n \"\"\"\n Valid combinations of login() arguments:\n\n - email/username and password\n - email/username and apiKey (Base64 encoded string)\n - authToken\n - sessionToken (**DEPRECATED**)\n\n If no login arguments are provided or only username is provided, login() will attempt to log in using\n information from these sources (in order of preference):\n\n 1. User's personal access token from environment the variable: SYNAPSE_AUTH_TOKEN\n 2. 
.synapseConfig file (in user home folder unless configured otherwise)\n 3. cached credentials from previous `login()` where `rememberMe=True` was passed as a parameter\n\n Arguments:\n email: Synapse user name (or an email address associated with a Synapse account)\n password: **!!WILL BE DEPRECATED!!** password. Please use authToken (Synapse personal access token)\n apiKey: **!!WILL BE DEPRECATED!!** Base64 encoded Synapse API key\n sessionToken: **!!DEPRECATED FIELD!!** User's current session token. Using this field will ignore the\n following fields: email, password, apiKey\n rememberMe: Whether the authentication information should be cached in your operating system's\n credential storage.\n authToken: A bearer authorization token, e.g. a personal access token, can be used in lieu of a\n password or apiKey.\n silent: Defaults to False. Suppresses the \"Welcome ...!\" message.\n forced: Defaults to False. Bypass the credential cache if set.\n\n **GNOME Keyring** (recommended) or **KWallet** is recommended to be installed for credential storage on\n **Linux** systems.\n If it is not installed/setup, credentials will be stored as PLAIN-TEXT file with read and write permissions for\n the current user only (chmod 600).\n On Windows and Mac OS, a default credentials storage exists so it will be preferred over the plain-text file.\n To install GNOME Keyring on Ubuntu:\n\n sudo apt-get install gnome-keyring\n\n sudo apt-get install python-dbus #(for Python 2 installed via apt-get)\n OR\n sudo apt-get install python3-dbus #(for Python 3 installed via apt-get)\n OR\n sudo apt-get install libdbus-glib-1-dev #(for custom installation of Python or vitualenv)\n sudo pip install dbus-python #(may take a while to compile C code)\n\n If you are on a headless Linux session (e.g. connecting via SSH), please run the following commands before\n running your Python session:\n\n dbus-run-session -- bash #(replace 'bash' with 'sh' if bash is unavailable)\n echo -n \"REPLACE_WITH_YOUR_KEYRING_PASSWORD\"|gnome-keyring-daemon -- unlock\n\n Example: Logging in\n Using an auth token:\n\n syn.login(authToken=\"authtoken\")\n #> Welcome, Me!\n\n Using a username/password:\n\n syn.login('my-username', 'secret-password', rememberMe=True)\n syn.login('my-username', 'secret-password', rememberMe=True)\n #> Welcome, Me!\n syn.login('my-username', 'secret-password', rememberMe=True)\n #> Welcome, Me!\n\n After logging in with the *rememberMe* flag set, an API key will be cached and\n used to authenticate for future logins:\n\n syn.login()\n #> Welcome, Me!\n\n \"\"\"\n # Note: the order of the logic below reflects the ordering in the docstring above.\n\n # Check version before logging in\n if not self.skip_checks:\n version_check()\n\n # Make sure to invalidate the existing session\n self.logout()\n\n credential_provider_chain = get_default_credential_chain()\n # TODO: remove deprecated sessionToken when we move to a different solution\n self.credentials = credential_provider_chain.get_credentials(\n self,\n UserLoginArgs(\n email,\n password,\n apiKey,\n forced,\n sessionToken,\n authToken,\n ),\n )\n\n # Final check on login success\n if not self.credentials:\n raise SynapseNoCredentialsError(\"No credentials provided.\")\n\n # Save the API key in the cache\n if rememberMe:\n message = (\n \"The rememberMe parameter will be deprecated by early 2024. 
Please use the ~/.synapseConfig \"\n \"or SYNAPSE_AUTH_TOKEN environmental variable to set up your Synapse connection.\"\n )\n self.logger.warning(message)\n delete_stored_credentials(self.credentials.username)\n self.credentials.store_to_keyring()\n cached_sessions.set_most_recent_user(self.credentials.username)\n\n if not silent:\n profile = self.getUserProfile()\n # TODO-PY3: in Python2, do we need to ensure that this is encoded in utf-8\n self.logger.info(\n \"Welcome, %s!\\n\"\n % (\n profile[\"displayName\"]\n if \"displayName\" in profile\n else self.credentials.username\n )\n )\n\n def _get_config_section_dict(self, section_name: str) -> dict:\n config = self.getConfigFile(self.configPath)\n try:\n return dict(config.items(section_name))\n except configparser.NoSectionError:\n # section not present\n return {}\n\n def _get_config_authentication(self) -> str:\n return self._get_config_section_dict(\n config_file_constants.AUTHENTICATION_SECTION_NAME\n )\n\n def _get_client_authenticated_s3_profile(self, endpoint: str, bucket: str) -> str:\n config_section = endpoint + \"/\" + bucket\n return self._get_config_section_dict(config_section).get(\n \"profile_name\", \"default\"\n )\n\n def _get_transfer_config(self) -> dict:\n # defaults\n transfer_config = {\"max_threads\": DEFAULT_NUM_THREADS, \"use_boto_sts\": False}\n\n for k, v in self._get_config_section_dict(\"transfer\").items():\n if v:\n if k == \"max_threads\" and v:\n try:\n transfer_config[\"max_threads\"] = int(v)\n except ValueError as cause:\n raise ValueError(\n f\"Invalid transfer.max_threads config setting {v}\"\n ) from cause\n\n elif k == \"use_boto_sts\":\n lower_v = v.lower()\n if lower_v not in (\"true\", \"false\"):\n raise ValueError(\n f\"Invalid transfer.use_boto_sts config setting {v}\"\n )\n\n transfer_config[\"use_boto_sts\"] = \"true\" == lower_v\n\n return transfer_config\n\n @tracer.start_as_current_span(\"Synapse::_getSessionToken\")\n def _getSessionToken(self, email: str, password: str) -> str:\n \"\"\"Returns a validated session token.\"\"\"\n try:\n req = {\"email\": email, \"password\": password}\n session = self.restPOST(\n \"/session\",\n body=json.dumps(req),\n endpoint=self.authEndpoint,\n headers=self.default_headers,\n )\n return session[\"sessionToken\"]\n except SynapseHTTPError as err:\n if (\n err.response.status_code == 403\n or err.response.status_code == 404\n or err.response.status_code == 401\n ):\n raise SynapseAuthenticationError(\"Invalid username or password.\")\n raise\n\n @tracer.start_as_current_span(\"Synapse::_getAPIKey\")\n def _getAPIKey(self, sessionToken: str) -> str:\n \"\"\"Uses a session token to fetch an API key.\"\"\"\n\n headers = {\"sessionToken\": sessionToken, \"Accept\": \"application/json\"}\n secret = self.restGET(\"/secretKey\", endpoint=self.authEndpoint, headers=headers)\n return secret[\"secretKey\"]\n\n @tracer.start_as_current_span(\"Synapse::_is_logged_in\")\n def _is_logged_in(self) -> bool:\n \"\"\"Test whether the user is logged in to Synapse.\"\"\"\n # This is a quick sanity check to see if credentials have been\n # configured on the client\n if self.credentials is None:\n return False\n # The public can query this command so there is no need to try catch.\n user = self.restGET(\"/userProfile\")\n if user.get(\"userName\") == \"anonymous\":\n return False\n return True\n\n def logout(self, forgetMe: bool = False):\n \"\"\"\n Removes authentication information from the Synapse client.\n\n Arguments:\n forgetMe: Set as True to clear any local storage 
of authentication information.\n See the flag \"rememberMe\" in [synapseclient.Synapse.login][]\n\n Returns:\n None\n \"\"\"\n # Delete the user's API key from the cache\n if forgetMe and self.credentials:\n self.credentials.delete_from_keyring()\n\n self.credentials = None\n\n def invalidateAPIKey(self):\n \"\"\"Invalidates authentication across all clients.\n\n Returns:\n None\n \"\"\"\n\n # Logout globally\n if self._is_logged_in():\n self.restDELETE(\"/secretKey\", endpoint=self.authEndpoint)\n\n @functools.lru_cache()\n def get_user_profile_by_username(\n self,\n username: str = None,\n sessionToken: str = None,\n ) -> UserProfile:\n \"\"\"\n Get the details about a Synapse user.\n Retrieves information on the current user if 'id' is omitted or is empty string.\n\n Arguments:\n username: The userName of a user\n sessionToken: The session token to use to find the user profile\n\n Returns:\n The user profile for the user of interest.\n\n Example: Using this function\n Getting your own profile\n\n my_profile = syn.get_user_profile_by_username()\n\n Getting another user's profile\n\n freds_profile = syn.get_user_profile_by_username('fredcommo')\n \"\"\"\n is_none = username is None\n is_str = isinstance(username, str)\n if not is_str and not is_none:\n raise TypeError(\"username must be string or None\")\n if is_str:\n principals = self._findPrincipals(username)\n for principal in principals:\n if principal.get(\"userName\", None).lower() == username.lower():\n id = principal[\"ownerId\"]\n break\n else:\n raise ValueError(f\"Can't find user '{username}'\")\n else:\n id = \"\"\n uri = f\"/userProfile/{id}\"\n return UserProfile(\n **self.restGET(\n uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n )\n )\n\n @functools.lru_cache()\n def get_user_profile_by_id(\n self,\n id: int = None,\n sessionToken: str = None,\n ) -> UserProfile:\n \"\"\"\n Get the details about a Synapse user.\n Retrieves information on the current user if 'id' is omitted.\n\n Arguments:\n id: The ownerId of a user\n sessionToken: The session token to use to find the user profile\n\n Returns:\n The user profile for the user of interest.\n\n\n Example: Using this function\n Getting your own profile\n\n my_profile = syn.get_user_profile_by_id()\n\n Getting another user's profile\n\n freds_profile = syn.get_user_profile_by_id(1234567)\n \"\"\"\n if id:\n if not isinstance(id, int):\n raise TypeError(\"id must be an 'ownerId' integer\")\n else:\n id = \"\"\n uri = f\"/userProfile/{id}\"\n return UserProfile(\n **self.restGET(\n uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n )\n )\n\n @functools.lru_cache()\n def getUserProfile(\n self,\n id: Union[str, int, UserProfile, TeamMember] = None,\n sessionToken: str = None,\n ) -> UserProfile:\n \"\"\"\n Get the details about a Synapse user.\n Retrieves information on the current user if 'id' is omitted.\n\n Arguments:\n id: The 'userId' (aka 'ownerId') of a user or the userName\n sessionToken: The session token to use to find the user profile\n\n Returns:\n The user profile for the user of interest.\n\n Example: Using this function\n Getting your own profile\n\n my_profile = syn.getUserProfile()\n\n Getting another user's profile\n\n freds_profile = syn.getUserProfile('fredcommo')\n \"\"\"\n try:\n # if id is unset or a userID, this will succeed\n id = \"\" if id is None else int(id)\n except (TypeError, ValueError):\n if isinstance(id, collections.abc.Mapping) and \"ownerId\" in id:\n id = id.ownerId\n elif isinstance(id, 
TeamMember):\n id = id.member.ownerId\n else:\n principals = self._findPrincipals(id)\n if len(principals) == 1:\n id = principals[0][\"ownerId\"]\n else:\n for principal in principals:\n if principal.get(\"userName\", None).lower() == id.lower():\n id = principal[\"ownerId\"]\n break\n else: # no break\n raise ValueError('Can\\'t find user \"%s\": ' % id)\n uri = \"/userProfile/%s\" % id\n return UserProfile(\n **self.restGET(\n uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n )\n )\n\n def _findPrincipals(self, query_string: str) -> typing.List[UserGroupHeader]:\n \"\"\"\n Find users or groups by name or email.\n\n :returns: A list of userGroupHeader objects with fields displayName, email, firstName, lastName, isIndividual,\n ownerId\n\n Example::\n\n syn._findPrincipals('test')\n\n [{u'displayName': u'Synapse Test',\n u'email': u'syn...t@sagebase.org',\n u'firstName': u'Synapse',\n u'isIndividual': True,\n u'lastName': u'Test',\n u'ownerId': u'1560002'},\n {u'displayName': ... }]\n\n \"\"\"\n uri = \"/userGroupHeaders?prefix=%s\" % urllib_urlparse.quote(query_string)\n return [UserGroupHeader(**result) for result in self._GET_paginated(uri)]\n\n def _get_certified_passing_record(self, userid: int) -> dict:\n \"\"\"Retrieve the Passing Record on the User Certification test for the given user.\n\n :params userid: Synapse user Id\n\n :returns: Synapse Passing Record\n https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/quiz/PassingRecord.html\n \"\"\"\n response = self.restGET(f\"/user/{userid}/certifiedUserPassingRecord\")\n return response\n\n def _get_user_bundle(self, userid: int, mask: int) -> dict:\n \"\"\"Retrieve the user bundle for the given user.\n\n :params userid: Synapse user Id\n :params mask: Bit field indicating which components to include in the bundle.\n\n :returns: Synapse User Bundle\n https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/UserBundle.html\n \"\"\"\n try:\n response = self.restGET(f\"/user/{userid}/bundle?mask={mask}\")\n except SynapseHTTPError as ex:\n if ex.response.status_code == 404:\n return None\n return response\n\n @tracer.start_as_current_span(\"Synapse::is_certified\")\n def is_certified(self, user: typing.Union[str, int]) -> bool:\n \"\"\"Determines whether a Synapse user is a certified user.\n\n Arguments:\n user: Synapse username or Id\n\n Returns:\n True if the Synapse user is certified\n \"\"\"\n # Check if userid or username exists\n syn_user = self.getUserProfile(user)\n # Get passing record\n\n try:\n certification_status = self._get_certified_passing_record(\n syn_user[\"ownerId\"]\n )\n return certification_status[\"passed\"]\n except SynapseHTTPError as ex:\n if ex.response.status_code == 404:\n # user hasn't taken the quiz\n return False\n raise\n\n @tracer.start_as_current_span(\"Synapse::is_synapse_id\")\n def is_synapse_id(self, syn_id: str) -> bool:\n \"\"\"Checks if given synID is valid (attached to actual entity?)\n\n Arguments:\n syn_id: A Synapse ID\n\n Returns:\n True if the Synapse ID is valid\n \"\"\"\n if isinstance(syn_id, str):\n try:\n self.get(syn_id, downloadFile=False)\n except SynapseFileNotFoundError:\n return False\n except (\n SynapseHTTPError,\n SynapseAuthenticationError,\n ) as err:\n status = (\n err.__context__.response.status_code or err.response.status_code\n )\n if status in (400, 404):\n return False\n # Valid ID but user lacks permission or is not logged in\n elif status == 403:\n return True\n return True\n self.logger.warning(\"synID must be a 
string\")\n return False\n\n def onweb(self, entity, subpageId=None):\n \"\"\"Opens up a browser window to the entity page or wiki-subpage.\n\n Arguments:\n entity: Either an Entity or a Synapse ID\n subpageId: (Optional) ID of one of the wiki's sub-pages\n\n Returns:\n None\n \"\"\"\n if isinstance(entity, str) and os.path.isfile(entity):\n entity = self.get(entity, downloadFile=False)\n synId = id_of(entity)\n if subpageId is None:\n webbrowser.open(\"%s#!Synapse:%s\" % (self.portalEndpoint, synId))\n else:\n webbrowser.open(\n \"%s#!Wiki:%s/ENTITY/%s\" % (self.portalEndpoint, synId, subpageId)\n )\n\n @tracer.start_as_current_span(\"Synapse::printEntity\")\n def printEntity(self, entity, ensure_ascii=True) -> None:\n \"\"\"\n Pretty prints an Entity.\n\n Arguments:\n entity: The entity to be printed.\n ensure_ascii: If True, escapes all non-ASCII characters\n\n Returns:\n None\n \"\"\"\n\n if utils.is_synapse_id_str(entity):\n entity = self._getEntity(entity)\n try:\n self.logger.info(\n json.dumps(entity, sort_keys=True, indent=2, ensure_ascii=ensure_ascii)\n )\n except TypeError:\n self.logger.info(str(entity))\n\n def _print_transfer_progress(self, *args, **kwargs):\n # Checking synapse if the mode is silent mode.\n # If self.silent is True, no need to print out transfer progress.\n if self.silent is not True:\n cumulative_transfer_progress.printTransferProgress(*args, **kwargs)\n\n ############################################################\n # Service methods #\n ############################################################\n\n _services = {\n \"json_schema\": \"JsonSchemaService\",\n }\n\n def get_available_services(self) -> typing.List[str]:\n \"\"\"Get available Synapse services\n This is a beta feature and is subject to change\n\n Returns:\n List of available services\n \"\"\"\n services = self._services.keys()\n return list(services)\n\n def service(self, service_name: str):\n \"\"\"Get available Synapse services\n This is a beta feature and is subject to change\n\n Arguments:\n service_name: name of the service\n \"\"\"\n # This is to avoid circular imports\n # TODO: revisit the import order and method https://stackoverflow.com/a/37126790\n # To move this to the top\n import synapseclient.services\n\n assert isinstance(service_name, str)\n service_name = service_name.lower().replace(\" \", \"_\")\n assert service_name in self._services, (\n f\"Unrecognized service ({service_name}). 
 Run the 'get_available_\"\n            \"services()' method to get a list of available services.\"\n        )\n        service_attr = self._services[service_name]\n        service_cls = getattr(synapseclient.services, service_attr)\n        service = service_cls(self)\n        return service\n\n    ############################################################\n    # Get / Store methods #\n    ############################################################\n\n    @tracer.start_as_current_span(\"Synapse::get\")\n    def get(self, entity, **kwargs):\n        \"\"\"\n        Gets a Synapse entity from the repository service.\n\n        Arguments:\n            entity: A Synapse ID, a Synapse Entity object, a plain dictionary in which 'id' maps to a\n                Synapse ID or a local file that is stored in Synapse (found by the file MD5)\n            version: The specific version to get.\n                Defaults to the most recent version.\n            downloadFile: Whether associated file(s) should be downloaded.\n                Defaults to True\n            downloadLocation: Directory where to download the Synapse File Entity.\n                Defaults to the local cache.\n            followLink: Whether the link returns the target Entity.\n                Defaults to False\n            ifcollision: Determines how to handle file collisions.\n                May be \"overwrite.local\", \"keep.local\", or \"keep.both\".\n                Defaults to \"keep.both\".\n            limitSearch: A Synapse ID used to limit the search in Synapse if entity is specified as a local\n                file. That is, if the file is stored in multiple locations in Synapse only the ones\n                in the specified folder/project will be returned.\n\n        Returns:\n            A new Synapse Entity object of the appropriate type.\n\n        Example: Using this function\n            Download file into cache\n\n                entity = syn.get('syn1906479')\n                print(entity.name)\n                print(entity.path)\n\n            Download file into current working directory\n\n                entity = syn.get('syn1906479', downloadLocation='.')\n                print(entity.name)\n                print(entity.path)\n\n            Determine the provenance of a locally stored file as indicated in Synapse\n\n                entity = syn.get('/path/to/file.txt', limitSearch='syn12312')\n                print(syn.getProvenance(entity))\n        \"\"\"\n        # If entity is a local file determine the corresponding synapse entity\n        if isinstance(entity, str) and os.path.isfile(entity):\n            bundle = self._getFromFile(entity, kwargs.pop(\"limitSearch\", None))\n            kwargs[\"downloadFile\"] = False\n            kwargs[\"path\"] = entity\n\n        elif isinstance(entity, str) and not utils.is_synapse_id_str(entity):\n            raise SynapseFileNotFoundError(\n                (\n                    \"The parameter %s is neither a local file path \"\n                    \"nor a valid entity id\" % entity\n                )\n            )\n        # entities that have not been saved\n        elif isinstance(entity, Entity) and not entity.get(\"id\"):\n            raise ValueError(\n                \"Cannot retrieve entity that has not been saved.\"\n                \" Please use syn.store() to save your entity and try again.\"\n            )\n        else:\n            version = kwargs.get(\"version\", None)\n            bundle = self._getEntityBundle(entity, version)\n        # Check and warn for unmet access requirements\n        self._check_entity_restrictions(\n            bundle, entity, kwargs.get(\"downloadFile\", True)\n        )\n\n        return_data = self._getWithEntityBundle(\n            entityBundle=bundle, entity=entity, **kwargs\n        )\n        trace.get_current_span().set_attributes(\n            {\n                \"synapse.id\": return_data.get(\"id\", \"\"),\n                \"synapse.concrete_type\": return_data.get(\"concreteType\", \"\"),\n            }\n        )\n        return return_data\n\n    def _check_entity_restrictions(self, bundle, entity, downloadFile):\n        restrictionInformation = bundle[\"restrictionInformation\"]\n        if restrictionInformation[\"hasUnmetAccessRequirement\"]:\n            warning_message = (\n                \"\\nThis entity has access restrictions. 
Please visit the web page for this entity \"\n                f'(syn.onweb(\"{id_of(entity)}\")). Look for the \"Access\" label and the lock icon underneath '\n                'the file name. Click \"Request Access\", and then review and fulfill the file '\n                \"download requirement(s).\\n\"\n            )\n            if downloadFile and bundle.get(\"entityType\") not in (\"project\", \"folder\"):\n                raise SynapseUnmetAccessRestrictions(warning_message)\n            warnings.warn(warning_message)\n\n    @tracer.start_as_current_span(\"Synapse::_getFromFile\")\n    def _getFromFile(self, filepath, limitSearch=None):\n        \"\"\"\n        Gets a Synapse entityBundle based on the md5 of a local file.\n        See :py:func:`synapseclient.Synapse.get`.\n\n        :param filepath: path to local file\n        :param limitSearch: Limits the places in Synapse where the file is searched for.\n        \"\"\"\n        results = self.restGET(\n            \"/entity/md5/%s\" % utils.md5_for_file(filepath).hexdigest()\n        )[\"results\"]\n        if limitSearch is not None:\n            # Go through and find the path of every entity found\n            paths = [self.restGET(\"/entity/%s/path\" % ent[\"id\"]) for ent in results]\n            # Filter out all entities whose path does not contain limitSearch\n            results = [\n                ent\n                for ent, path in zip(results, paths)\n                if utils.is_in_path(limitSearch, path)\n            ]\n        if len(results) == 0:  # None found\n            raise SynapseFileNotFoundError(\"File %s not found in Synapse\" % (filepath,))\n        elif len(results) > 1:\n            id_txts = \"\\n\".join(\n                [\"%s.%i\" % (r[\"id\"], r[\"versionNumber\"]) for r in results]\n            )\n            self.logger.warning(\n                \"\\nThe file %s is associated with many files in Synapse:\\n%s\\n\"\n                \"You can limit the search to files in a specific project or folder by setting limitSearch to the\"\n                \" Synapse ID of that project or folder.\\n\"\n                \"Will use the first one returned: \\n\"\n                \"%s version %i\\n\"\n                % (filepath, id_txts, results[0][\"id\"], results[0][\"versionNumber\"])\n            )\n        entity = results[0]\n\n        bundle = self._getEntityBundle(entity, version=entity[\"versionNumber\"])\n        self.cache.add(bundle[\"entity\"][\"dataFileHandleId\"], filepath)\n\n        return bundle\n\n    @tracer.start_as_current_span(\"Synapse::move\")\n    def move(self, entity, new_parent):\n        \"\"\"\n        Move a Synapse entity to a new container.\n\n        Arguments:\n            entity: A Synapse ID, a Synapse Entity object, or a local file that is stored in Synapse\n            new_parent: The new parent container (Folder or Project) to which the entity should be moved.\n\n        Returns:\n            The Synapse Entity object that has been moved.\n\n        Example: Using this function\n            Move a Synapse Entity object to a new parent container\n\n                entity = syn.move('syn456', 'syn123')\n        \"\"\"\n\n        entity = self.get(entity, downloadFile=False)\n        entity.parentId = id_of(new_parent)\n        entity = self.store(entity, forceVersion=False)\n        trace.get_current_span().set_attributes(\n            {\n                \"synapse.id\": entity.get(\"id\", \"\"),\n                \"synapse.parent_id\": entity.get(\"parentId\", \"\"),\n            }\n        )\n\n        return entity\n\n    def _getWithEntityBundle(self, entityBundle, entity=None, **kwargs):\n        \"\"\"\n        Creates a :py:mod:`synapseclient.Entity` from an entity bundle returned by Synapse.\n        An existing Entity can be supplied in case we want to refresh a stale Entity.\n\n        :param entityBundle: Uses the given dictionary as the meta information of the Entity to get\n        :param entity: Optional, entity whose local state will be copied into the returned entity\n        :param submission: Optional, access associated files through a submission rather than\n            through an entity.\n\n        See :py:func:`synapseclient.Synapse.get`.\n        See
 :py:func:`synapseclient.Synapse._getEntityBundle`.\n        See :py:mod:`synapseclient.Entity`.\n        \"\"\"\n\n        # Note: This version overrides the version of 'entity' (if the object is Mappable)\n        kwargs.pop(\"version\", None)\n        downloadFile = kwargs.pop(\"downloadFile\", True)\n        downloadLocation = kwargs.pop(\"downloadLocation\", None)\n        ifcollision = kwargs.pop(\"ifcollision\", None)\n        submission = kwargs.pop(\"submission\", None)\n        followLink = kwargs.pop(\"followLink\", False)\n        path = kwargs.pop(\"path\", None)\n\n        # make sure the user didn't accidentally pass a kwarg that we don't handle\n        if kwargs:  # if there are remaining items in the kwargs\n            raise TypeError(\"Unexpected **kwargs: %r\" % kwargs)\n\n        # If Link, get target ID entity bundle\n        if (\n            entityBundle[\"entity\"][\"concreteType\"]\n            == \"org.sagebionetworks.repo.model.Link\"\n            and followLink\n        ):\n            targetId = entityBundle[\"entity\"][\"linksTo\"][\"targetId\"]\n            targetVersion = entityBundle[\"entity\"][\"linksTo\"].get(\"targetVersionNumber\")\n            entityBundle = self._getEntityBundle(targetId, targetVersion)\n\n        # TODO is it an error to specify both downloadFile=False and downloadLocation?\n        # TODO this matters if we want to return already cached files when downloadFile=False\n\n        # Make a fresh copy of the Entity\n        local_state = (\n            entity.local_state() if entity and isinstance(entity, Entity) else {}\n        )\n        if path is not None:\n            local_state[\"path\"] = path\n        properties = entityBundle[\"entity\"]\n        annotations = from_synapse_annotations(entityBundle[\"annotations\"])\n        entity = Entity.create(properties, annotations, local_state)\n\n        # Handle download of fileEntities\n        if isinstance(entity, File):\n            # update the entity with FileHandle metadata\n            file_handle = next(\n                (\n                    handle\n                    for handle in entityBundle[\"fileHandles\"]\n                    if handle[\"id\"] == entity.dataFileHandleId\n                ),\n                None,\n            )\n            entity._update_file_handle(file_handle)\n\n            if downloadFile:\n                if file_handle:\n                    self._download_file_entity(\n                        downloadLocation,\n                        entity,\n                        ifcollision,\n                        submission,\n                    )\n                else:  # no filehandle means that we do not have DOWNLOAD permission\n                    warning_message = (\n                        \"WARNING: You have READ permission on this file entity but not DOWNLOAD \"\n                        \"permission.
 The file has NOT been downloaded.\"\n                    )\n                    self.logger.warning(\n                        \"\\n\"\n                        + \"!\" * len(warning_message)\n                        + \"\\n\"\n                        + warning_message\n                        + \"\\n\"\n                        + \"!\" * len(warning_message)\n                        + \"\\n\"\n                    )\n        return entity\n\n    def _ensure_download_location_is_directory(self, downloadLocation):\n        download_dir = os.path.expandvars(os.path.expanduser(downloadLocation))\n        if os.path.isfile(download_dir):\n            raise ValueError(\n                \"Parameter 'downloadLocation' should be a directory, not a file.\"\n            )\n        return download_dir\n\n    @tracer.start_as_current_span(\"Synapse::_download_file_entity\")\n    def _download_file_entity(\n        self,\n        downloadLocation: str,\n        entity: Entity,\n        ifcollision: str,\n        submission: str,\n    ):\n        # set the initial local state\n        entity.path = None\n        entity.files = []\n        entity.cacheDir = None\n\n        # check to see if an UNMODIFIED version of the file (since it was last downloaded) already exists\n        # this location could be either in .synapseCache or a user specified location to which the user previously\n        # downloaded the file\n        cached_file_path = self.cache.get(entity.dataFileHandleId, downloadLocation)\n\n        # location in .synapseCache where the file corresponding to its FileHandleId would be\n        synapseCache_location = self.cache.get_cache_dir(entity.dataFileHandleId)\n\n        file_name = (\n            entity._file_handle.fileName\n            if cached_file_path is None\n            else os.path.basename(cached_file_path)\n        )\n\n        # Decide the best download location for the file\n        if downloadLocation is not None:\n            # Make sure the specified download location is a fully resolved directory\n            downloadLocation = self._ensure_download_location_is_directory(\n                downloadLocation\n            )\n        elif cached_file_path is not None:\n            # file already cached so use that as the download location\n            downloadLocation = os.path.dirname(cached_file_path)\n        else:\n            # file not cached and no user-specified location so default to .synapseCache\n            downloadLocation = synapseCache_location\n\n        # resolve file path collisions by either overwriting, renaming, or not downloading, depending on the\n        # ifcollision value\n        downloadPath = self._resolve_download_path_collisions(\n            downloadLocation,\n            file_name,\n            ifcollision,\n            synapseCache_location,\n            cached_file_path,\n        )\n        if downloadPath is None:\n            return\n\n        if cached_file_path is not None:  # copy from cache\n            if downloadPath != cached_file_path:\n                # create the folder if it does not exist already\n                if not os.path.exists(downloadLocation):\n                    os.makedirs(downloadLocation)\n                shutil.copy(cached_file_path, downloadPath)\n\n        else:  # download the file from URL (could be a local file)\n            objectType = \"FileEntity\" if submission is None else \"SubmissionAttachment\"\n            objectId = entity[\"id\"] if submission is None else submission\n\n            # reassign downloadPath because if url points to local file (e.g.
 file://~/someLocalFile.txt)\n            # it won't be \"downloaded\" and, instead, downloadPath will just point to '~/someLocalFile.txt'\n            # _downloadFileHandle may also return None to indicate that the download failed\n            downloadPath = self._downloadFileHandle(\n                entity.dataFileHandleId, objectId, objectType, downloadPath\n            )\n\n            if downloadPath is None or not os.path.exists(downloadPath):\n                return\n\n        # converts the path format from forward slashes back to backward slashes on Windows\n        entity.path = os.path.normpath(downloadPath)\n        entity.files = [os.path.basename(downloadPath)]\n        entity.cacheDir = os.path.dirname(downloadPath)\n\n    def _resolve_download_path_collisions(\n        self,\n        downloadLocation,\n        file_name,\n        ifcollision,\n        synapseCache_location,\n        cached_file_path,\n    ):\n        # always overwrite if we are downloading to .synapseCache\n        if utils.normalize_path(downloadLocation) == synapseCache_location:\n            if ifcollision is not None and ifcollision != \"overwrite.local\":\n                self.logger.warning(\n                    \"\\n\"\n                    + \"!\" * 50\n                    + f\"\\nifcollision={ifcollision} \"\n                    + \"is being IGNORED because the download destination is synapse's cache.\"\n                    ' Instead, the behavior is \"overwrite.local\". \\n' + \"!\" * 50 + \"\\n\"\n                )\n            ifcollision = \"overwrite.local\"\n        # if ifcollision is not specified, default to keep.both\n        ifcollision = ifcollision or \"keep.both\"\n\n        downloadPath = utils.normalize_path(os.path.join(downloadLocation, file_name))\n        # resolve collision\n        if os.path.exists(downloadPath):\n            if ifcollision == \"overwrite.local\":\n                pass\n            elif ifcollision == \"keep.local\":\n                # Don't want to overwrite the local file.\n                return None\n            elif ifcollision == \"keep.both\":\n                if downloadPath != cached_file_path:\n                    return utils.unique_filename(downloadPath)\n            else:\n                raise ValueError(\n                    'Invalid parameter: \"%s\" is not a valid value '\n                    'for \"ifcollision\"' % ifcollision\n                )\n        return downloadPath\n\n    def store(\n        self,\n        obj,\n        *,\n        createOrUpdate=True,\n        forceVersion=True,\n        versionLabel=None,\n        isRestricted=False,\n        activity=None,\n        used=None,\n        executed=None,\n        activityName=None,\n        activityDescription=None,\n        opentelemetry_context=None,\n    ):\n        \"\"\"\n        Creates a new Entity or updates an existing Entity, uploading any files in the process.\n\n        Arguments:\n            obj: A Synapse Entity, Evaluation, or Wiki\n            used: The Entity, Synapse ID, or URL used to create the object (can also be a list of these)\n            executed: The Entity, Synapse ID, or URL representing code executed to create the object\n                (can also be a list of these)\n            activity: Activity object specifying the user's provenance.\n            activityName: Activity name to be used in conjunction with *used* and *executed*.\n            activityDescription: Activity description to be used in conjunction with *used* and *executed*.\n            createOrUpdate: Indicates whether the method should automatically perform an update if the 'obj'\n                conflicts with an existing Synapse object. Defaults to True.\n            forceVersion: Indicates whether the method should increment the version of the object even if nothing\n                has changed. Defaults to True.\n            versionLabel: Arbitrary string used to label the version.\n            isRestricted: If set to true, an email will be sent to the Synapse access control team to start the\n                process of adding terms-of-use or review board approval for this entity.\n                You will be contacted with regard to the specific data being restricted and the\n                requirements of access.\n            opentelemetry_context: OpenTelemetry context to propagate to this function to use for tracing.
 Used in\n                cases where multi-threaded operations need to be linked to parent spans.\n\n        Returns:\n            A Synapse Entity, Evaluation, or Wiki\n\n        Example: Using this function\n            Creating a new Project:\n\n                from synapseclient import Project\n\n                project = Project('My uniquely named project')\n                project = syn.store(project)\n\n            Adding files with [provenance (aka: Activity)][synapseclient.Activity]:\n\n                from synapseclient import File, Activity\n\n                # A synapse entity *syn1906480* contains data\n                # entity *syn1917825* contains code\n                activity = Activity(\n                    'Fancy Processing',\n                    description='No seriously, really fancy processing',\n                    used=['syn1906480', 'http://data_r_us.com/fancy/data.txt'],\n                    executed='syn1917825')\n\n                test_entity = File('/path/to/data/file.xyz', description='Fancy new data', parent=project)\n                test_entity = syn.store(test_entity, activity=activity)\n\n        \"\"\"\n        with tracer.start_as_current_span(\n            \"Synapse::store\", context=opentelemetry_context\n        ):\n            trace.get_current_span().set_attributes(\n                {\"thread.id\": threading.get_ident()}\n            )\n            # SYNPY-1031: activity must be an Activity object or the code will fail later\n            if activity:\n                if not isinstance(activity, synapseclient.Activity):\n                    raise ValueError(\"activity must be a synapseclient.Activity object\")\n            # _before_store hook\n            # give objects a chance to do something before being stored\n            if hasattr(obj, \"_before_synapse_store\"):\n                obj._before_synapse_store(self)\n\n            # _synapse_store hook\n            # for objects that know how to store themselves\n            if hasattr(obj, \"_synapse_store\"):\n                return obj._synapse_store(self)\n\n            # Handle all non-Entity objects\n            if not (isinstance(obj, Entity) or type(obj) == dict):\n                if isinstance(obj, Wiki):\n                    return self._storeWiki(obj, createOrUpdate)\n\n                if \"id\" in obj:  # If ID is present, update\n                    trace.get_current_span().set_attributes({\"synapse.id\": obj[\"id\"]})\n                    return type(obj)(**self.restPUT(obj.putURI(), obj.json()))\n\n                try:  # If no ID is present, attempt to POST the object\n                    trace.get_current_span().set_attributes({\"synapse.id\": \"\"})\n                    return type(obj)(**self.restPOST(obj.postURI(), obj.json()))\n\n                except SynapseHTTPError as err:\n                    # If already present, and we want to update, attempt to get the object content\n                    if createOrUpdate and err.response.status_code == 409:\n                        newObj = self.restGET(obj.getByNameURI(obj.name))\n                        newObj.update(obj)\n                        obj = type(obj)(**newObj)\n                        trace.get_current_span().set_attributes(\n                            {\"synapse.id\": obj[\"id\"]}\n                        )\n                        obj.update(self.restPUT(obj.putURI(), obj.json()))\n                        return obj\n                    raise\n\n            # If the input object is an Entity or a dictionary\n            entity = obj\n            properties, annotations, local_state = split_entity_namespaces(entity)\n            bundle = None\n            # Explicitly set an empty versionComment property if none is supplied,\n            # otherwise an existing entity bundle's versionComment will be copied to the update.\n            properties[\"versionComment\"] = (\n                properties[\"versionComment\"] if \"versionComment\" in properties else None\n            )\n\n            # Anything with a path is treated as a cache-able item\n            # Only Files are expected in the following logic\n            if entity.get(\"path\", False) and not isinstance(obj, Folder):\n                if \"concreteType\" not in properties:\n                    properties[\"concreteType\"] = File._synapse_entity_type\n                # Make sure the path is fully resolved\n                entity[\"path\"] = os.path.expanduser(entity[\"path\"])\n\n                # Check if the File already exists in Synapse by fetching metadata on it\n                bundle = self._getEntityBundle(entity)\n\n                if bundle:\n                    if createOrUpdate:\n                        # update our properties from the
 existing bundle so that we have\n                        # enough to process this as an entity update.\n                        properties = {**bundle[\"entity\"], **properties}\n\n                    # Check if the file should be uploaded\n                    fileHandle = find_data_file_handle(bundle)\n                    if (\n                        fileHandle\n                        and fileHandle[\"concreteType\"]\n                        == \"org.sagebionetworks.repo.model.file.ExternalFileHandle\"\n                    ):\n                        # switching away from ExternalFileHandle or the url was updated\n                        needs_upload = entity[\"synapseStore\"] or (\n                            fileHandle[\"externalURL\"] != entity[\"externalURL\"]\n                        )\n                    else:\n                        # Check if we need to upload a new version of an existing\n                        # file. If the file referred to by entity['path'] has been\n                        # modified, we want to upload the new version.\n                        # If synapseStore is false then we must upload an ExternalFileHandle\n                        needs_upload = not entity[\n                            \"synapseStore\"\n                        ] or not self.cache.contains(\n                            bundle[\"entity\"][\"dataFileHandleId\"], entity[\"path\"]\n                        )\n                elif entity.get(\"dataFileHandleId\", None) is not None:\n                    needs_upload = False\n                else:\n                    needs_upload = True\n\n                if needs_upload:\n                    local_state_fh = local_state.get(\"_file_handle\", {})\n                    synapseStore = local_state.get(\"synapseStore\", True)\n                    fileHandle = upload_file_handle(\n                        self,\n                        entity[\"parentId\"],\n                        local_state[\"path\"]\n                        if (synapseStore or local_state_fh.get(\"externalURL\") is None)\n                        else local_state_fh.get(\"externalURL\"),\n                        synapseStore=synapseStore,\n                        md5=local_state_fh.get(\"contentMd5\"),\n                        file_size=local_state_fh.get(\"contentSize\"),\n                        mimetype=local_state_fh.get(\"contentType\"),\n                        max_threads=self.max_threads,\n                    )\n                    properties[\"dataFileHandleId\"] = fileHandle[\"id\"]\n                    local_state[\"_file_handle\"] = fileHandle\n\n                elif \"dataFileHandleId\" not in properties:\n                    # Handle the case where the Entity lacks an ID\n                    # But becomes an update() due to conflict\n                    properties[\"dataFileHandleId\"] = bundle[\"entity\"][\n                        \"dataFileHandleId\"\n                    ]\n\n                # update the file_handle metadata if the FileEntity's FileHandle id has changed\n                local_state_fh_id = local_state.get(\"_file_handle\", {}).get(\"id\")\n                if (\n                    local_state_fh_id\n                    and properties[\"dataFileHandleId\"] != local_state_fh_id\n                ):\n                    local_state[\"_file_handle\"] = find_data_file_handle(\n                        self._getEntityBundle(\n                            properties[\"id\"],\n                            requestedObjects={\n                                \"includeEntity\": True,\n                                \"includeFileHandles\": True,\n                            },\n                        )\n                    )\n\n                    # check if we already have the filehandleid cached somewhere\n                    cached_path = self.cache.get(properties[\"dataFileHandleId\"])\n                    if cached_path is None:\n                        local_state[\"path\"] = None\n                        local_state[\"cacheDir\"] = None\n                        local_state[\"files\"] = []\n                    else:\n                        local_state[\"path\"] = cached_path\n                        local_state[\"cacheDir\"] = os.path.dirname(cached_path)\n                        local_state[\"files\"] = [os.path.basename(cached_path)]\n\n            # Create or update Entity in Synapse\n            if \"id\" in properties:\n                trace.get_current_span().set_attributes(\n                    {\"synapse.id\": properties[\"id\"]}\n                )\n                properties = self._updateEntity(properties, forceVersion, versionLabel)\n            else:\n                # If Link, get the target name, version number and concrete type and store in link properties\n                if properties[\"concreteType\"] == \"org.sagebionetworks.repo.model.Link\":\n                    target_properties = self._getEntity(\n                        properties[\"linksTo\"][\"targetId\"],\n                        version=properties[\"linksTo\"].get(\"targetVersionNumber\"),\n                    )\n                    if target_properties[\"parentId\"] == properties[\"parentId\"]:\n                        raise ValueError(\n                            \"Cannot create a Link to an entity under the same parent.\"\n                        )\n                    properties[\"linksToClassName\"] = target_properties[\"concreteType\"]\n                    if (\n
                        target_properties.get(\"versionNumber\") is not None\n                        and properties[\"linksTo\"].get(\"targetVersionNumber\") is not None\n                    ):\n                        properties[\"linksTo\"][\n                            \"targetVersionNumber\"\n                        ] = target_properties[\"versionNumber\"]\n                    properties[\"name\"] = target_properties[\"name\"]\n                try:\n                    properties = self._createEntity(properties)\n                except SynapseHTTPError as ex:\n                    if createOrUpdate and ex.response.status_code == 409:\n                        # Get the existing Entity's ID via the name and parent\n                        existing_entity_id = self.findEntityId(\n                            properties[\"name\"], properties.get(\"parentId\", None)\n                        )\n                        if existing_entity_id is None:\n                            raise\n\n                        # get existing properties and annotations\n                        if not bundle:\n                            bundle = self._getEntityBundle(\n                                existing_entity_id,\n                                requestedObjects={\n                                    \"includeEntity\": True,\n                                    \"includeAnnotations\": True,\n                                },\n                            )\n\n                        properties = {**bundle[\"entity\"], **properties}\n\n                        # we additionally merge the annotations under the assumption that a missing annotation\n                        # from a resolved conflict represents a newer annotation that should be preserved\n                        # rather than an intentionally deleted annotation.\n                        annotations = {\n                            **from_synapse_annotations(bundle[\"annotations\"]),\n                            **annotations,\n                        }\n\n                        properties = self._updateEntity(\n                            properties, forceVersion, versionLabel\n                        )\n\n                    else:\n                        raise\n\n            # Deal with access restrictions\n            if isRestricted:\n                self._createAccessRequirementIfNone(properties)\n\n            # Update annotations\n            if (not bundle and annotations) or (\n                bundle and check_annotations_changed(bundle[\"annotations\"], annotations)\n            ):\n                annotations = self.set_annotations(\n                    Annotations(properties[\"id\"], properties[\"etag\"], annotations)\n                )\n                properties[\"etag\"] = annotations.etag\n\n            # If the parameters 'used' or 'executed' are given, create an Activity object\n            if used or executed:\n                if activity is not None:\n                    raise SynapseProvenanceError(\n                        \"Provenance can be specified as an Activity object or as used/executed\"\n                        \" item(s), but not both.\"\n                    )\n                activity = Activity(\n                    name=activityName,\n                    description=activityDescription,\n                    used=used,\n                    executed=executed,\n                )\n\n            # If we have an Activity, set it as the Entity's provenance record\n            if activity:\n                self.setProvenance(properties, activity)\n\n                # 'etag' has changed, so get the new Entity\n                properties = self._getEntity(properties)\n\n            # Return the updated Entity object\n            entity = Entity.create(properties, annotations, local_state)\n            return_data = self.get(entity, downloadFile=False)\n\n            trace.get_current_span().set_attributes(\n                {\n                    \"synapse.id\": return_data.get(\"id\", \"\"),\n                    \"synapse.concrete_type\": entity.get(\"concreteType\", \"\"),\n                }\n            )\n            return return_data\n\n    @tracer.start_as_current_span(\"Synapse::_createAccessRequirementIfNone\")\n    def _createAccessRequirementIfNone(self, entity):\n        \"\"\"\n        Checks to see if the given entity has access requirements.\n        If not, one is added.\n        \"\"\"\n        existingRestrictions = self.restGET(\n            \"/entity/%s/accessRequirement?offset=0&limit=1\" % id_of(entity)\n        )\n        if len(existingRestrictions[\"results\"]) <= 0:\n            self.restPOST(\"/entity/%s/lockAccessRequirement\" % id_of(entity), body=\"\")\n\n    def _getEntityBundle(self, entity, version=None, requestedObjects=None):\n        \"\"\"\n        Gets some information about the Entity.\n\n        :param entity: a Synapse Entity or Synapse ID\n        :param version: the entity's version (defaults to None meaning most recent version)\n        :param requestedObjects: A dict indicating settings for what to include\n\n        default value for requestedObjects is::\n\n
            requestedObjects = {'includeEntity': True,\n                                'includeAnnotations': True,\n                                'includeFileHandles': True,\n                                'includeRestrictionInformation': True}\n\n        Keys available for requestedObjects::\n\n            includeEntity\n            includeAnnotations\n            includePermissions\n            includeEntityPath\n            includeHasChildren\n            includeAccessControlList\n            includeFileHandles\n            includeTableBundle\n            includeRootWikiId\n            includeBenefactorACL\n            includeDOIAssociation\n            includeFileName\n            includeThreadCount\n            includeRestrictionInformation\n\n\n        Keys with values set to False may simply be omitted.\n        For example, we might ask for an entity bundle containing file handles, annotations, and properties::\n\n            requested_objects = {'includeEntity': True,\n                                 'includeAnnotations': True,\n                                 'includeFileHandles': True}\n            bundle = syn._getEntityBundle('syn111111', requestedObjects=requested_objects)\n\n        :returns: An EntityBundle with the requested fields or by default Entity header, annotations, unmet access\n                  requirements, and file handles\n        \"\"\"\n\n        # If 'entity' is given without an ID, try to find it by 'parentId' and 'name'.\n        # Use case:\n        #     If the user forgets to catch the return value of a syn.store(e)\n        #     this allows them to recover by doing: e = syn.get(e)\n        if requestedObjects is None:\n            requestedObjects = {\n                \"includeEntity\": True,\n                \"includeAnnotations\": True,\n                \"includeFileHandles\": True,\n                \"includeRestrictionInformation\": True,\n            }\n        if (\n            isinstance(entity, collections.abc.Mapping)\n            and \"id\" not in entity\n            and \"name\" in entity\n        ):\n            entity = self.findEntityId(entity[\"name\"], entity.get(\"parentId\", None))\n\n        # Avoid an exception from finding an ID from a NoneType\n        try:\n            id_of(entity)\n        except ValueError:\n            return None\n\n        if version is not None:\n            uri = f\"/entity/{id_of(entity)}/version/{int(version):d}/bundle2\"\n        else:\n            uri = f\"/entity/{id_of(entity)}/bundle2\"\n        bundle = self.restPOST(uri, body=json.dumps(requestedObjects))\n\n        return bundle\n\n    @tracer.start_as_current_span(\"Synapse::delete\")\n    def delete(self, obj, version=None):\n        \"\"\"\n        Removes an object from Synapse.\n\n        Arguments:\n            obj: An existing object stored on Synapse such as Evaluation, File, Project, or Wiki\n            version: For entities, specify a particular version to delete.\n        \"\"\"\n        # Handle all strings as the Entity ID for backward compatibility\n        if isinstance(obj, str):\n            entity_id = id_of(obj)\n            trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n            if version:\n                self.restDELETE(uri=f\"/entity/{entity_id}/version/{version}\")\n            else:\n                self.restDELETE(uri=f\"/entity/{entity_id}\")\n        elif hasattr(obj, \"_synapse_delete\"):\n            return obj._synapse_delete(self)\n        else:\n            try:\n                if isinstance(obj, Versionable):\n                    self.restDELETE(obj.deleteURI(versionNumber=version))\n                else:\n                    self.restDELETE(obj.deleteURI())\n            except AttributeError:\n                raise SynapseError(\n                    f\"Can't delete a {type(obj)}.
 Please specify a Synapse object or id\"\n                )\n\n    _user_name_cache = {}\n\n    def _get_user_name(self, user_id):\n        if user_id not in self._user_name_cache:\n            self._user_name_cache[user_id] = utils.extract_user_name(\n                self.getUserProfile(user_id)\n            )\n        return self._user_name_cache[user_id]\n\n    @tracer.start_as_current_span(\"Synapse::_list\")\n    def _list(\n        self,\n        parent,\n        recursive=False,\n        long_format=False,\n        show_modified=False,\n        indent=0,\n        out=sys.stdout,\n    ):\n        \"\"\"\n        List child objects of the given parent, recursively if requested.\n        \"\"\"\n        fields = [\"id\", \"name\", \"nodeType\"]\n        if long_format:\n            fields.extend([\"createdByPrincipalId\", \"createdOn\", \"versionNumber\"])\n        if show_modified:\n            fields.extend([\"modifiedByPrincipalId\", \"modifiedOn\"])\n        results = self.getChildren(parent)\n\n        results_found = False\n        for result in results:\n            results_found = True\n\n            fmt_fields = {\n                \"name\": result[\"name\"],\n                \"id\": result[\"id\"],\n                \"padding\": \" \" * indent,\n                \"slash_or_not\": \"/\" if is_container(result) else \"\",\n            }\n            fmt_string = \"{id}\"\n\n            if long_format:\n                fmt_fields[\"createdOn\"] = utils.iso_to_datetime(\n                    result[\"createdOn\"]\n                ).strftime(\"%Y-%m-%d %H:%M\")\n                fmt_fields[\"createdBy\"] = self._get_user_name(result[\"createdBy\"])[:18]\n                fmt_fields[\"version\"] = result[\"versionNumber\"]\n                fmt_string += \" {version:3} {createdBy:>18} {createdOn}\"\n            if show_modified:\n                fmt_fields[\"modifiedOn\"] = utils.iso_to_datetime(\n                    result[\"modifiedOn\"]\n                ).strftime(\"%Y-%m-%d %H:%M\")\n                fmt_fields[\"modifiedBy\"] = self._get_user_name(result[\"modifiedBy\"])[\n                    :18\n                ]\n                fmt_string += \" {modifiedBy:>18} {modifiedOn}\"\n\n            fmt_string += \" {padding}{name}{slash_or_not}\\n\"\n            out.write(fmt_string.format(**fmt_fields))\n\n            if (indent == 0 or recursive) and is_container(result):\n                self._list(\n                    result[\"id\"],\n                    recursive=recursive,\n                    long_format=long_format,\n                    show_modified=show_modified,\n                    indent=indent + 2,\n                    out=out,\n                )\n\n        if indent == 0 and not results_found:\n            out.write(\n                \"No results visible to {username} found for id {id}\\n\".format(\n                    username=self.credentials.username, id=id_of(parent)\n                )\n            )\n\n    def uploadFileHandle(\n        self, path, parent, synapseStore=True, mimetype=None, md5=None, file_size=None\n    ):\n        \"\"\"Uploads the file in the provided path (if necessary) to a storage location based on project settings.\n        Returns a new FileHandle as a dict to represent the stored file.\n\n        Arguments:\n            parent: Parent of the entity to which we upload.\n            path: File path to the file being uploaded\n            synapseStore: If False, will not upload the file, but instead create an ExternalFileHandle that references\n                the file on the local machine.\n                If True, will upload the file to the storage location determined by the parent entity.\n            mimetype: The MIME type metadata for the uploaded file\n            md5: The MD5 checksum for the file, if known. Otherwise if the file is a local file, it will be calculated\n                automatically.\n            file_size: The size of the file, if known.
 Otherwise if the file is a local file, it will be calculated\n                automatically.\n\n        Returns:\n            A new FileHandle, as a dict, representing the uploaded file\n        \"\"\"\n        return upload_file_handle(\n            self, parent, path, synapseStore, md5, file_size, mimetype\n        )\n\n    ############################################################\n    # Download List #\n    ############################################################\n    def clear_download_list(self):\n        \"\"\"Clear all files from download list\"\"\"\n        self.restDELETE(\"/download/list\")\n\n    def remove_from_download_list(self, list_of_files: typing.List[typing.Dict]) -> int:\n        \"\"\"Remove a batch of files from download list\n\n        Arguments:\n            list_of_files: Array of files in the format of a mapping {fileEntityId: synid, versionNumber: version}\n\n        Returns:\n            Number of files removed from download list\n        \"\"\"\n        request_body = {\"batchToRemove\": list_of_files}\n        num_files_removed = self.restPOST(\n            \"/download/list/remove\", body=json.dumps(request_body)\n        )\n        return num_files_removed\n\n    def _generate_manifest_from_download_list(\n        self,\n        quoteCharacter: str = '\"',\n        escapeCharacter: str = \"\\\\\",\n        lineEnd: str = os.linesep,\n        separator: str = \",\",\n        header: bool = True,\n    ):\n        \"\"\"Creates a download list manifest generation request\n\n        :param quoteCharacter: The character to be used for quoted elements in the resulting file.\n            Defaults to '\"'.\n        :param escapeCharacter: The escape character to be used for escaping a separator or quote in the resulting\n            file. Defaults to \"\\\".\n        :param lineEnd: The line feed terminator to be used for the resulting file. Defaults to os.linesep.\n        :param separator: The delimiter to be used for separating entries in the resulting file. Defaults to \",\".\n        :param header: Is the first line a header? 
Defaults to True.\n\n :returns: Filehandle of download list manifest\n \"\"\"\n request_body = {\n \"concreteType\": \"org.sagebionetworks.repo.model.download.DownloadListManifestRequest\",\n \"csvTableDescriptor\": {\n \"separator\": separator,\n \"quoteCharacter\": quoteCharacter,\n \"escapeCharacter\": escapeCharacter,\n \"lineEnd\": lineEnd,\n \"isFirstLineHeader\": header,\n },\n }\n return self._waitForAsync(\n uri=\"/download/list/manifest/async\", request=request_body\n )\n\n @tracer.start_as_current_span(\"Synapse::get_download_list_manifest\")\n def get_download_list_manifest(self):\n \"\"\"Get the path of the download list manifest file\n\n Returns:\n Path of download list manifest file\n \"\"\"\n manifest = self._generate_manifest_from_download_list()\n # Get file handle download link\n file_result = self._getFileHandleDownload(\n fileHandleId=manifest[\"resultFileHandleId\"],\n objectId=manifest[\"resultFileHandleId\"],\n objectType=\"FileEntity\",\n )\n # Download the manifest\n downloaded_path = self._download_from_URL(\n url=file_result[\"preSignedURL\"],\n destination=\"./\",\n fileHandleId=file_result[\"fileHandleId\"],\n expected_md5=file_result[\"fileHandle\"].get(\"contentMd5\"),\n )\n trace.get_current_span().set_attributes(\n {\"synapse.file_handle_id\": file_result[\"fileHandleId\"]}\n )\n return downloaded_path\n\n @tracer.start_as_current_span(\"Synapse::get_download_list\")\n def get_download_list(self, downloadLocation: str = None) -> str:\n \"\"\"Download all files from your Synapse download list\n\n Arguments:\n downloadLocation: Directory to download files to.\n\n Returns:\n Manifest file with file paths\n \"\"\"\n dl_list_path = self.get_download_list_manifest()\n downloaded_files = []\n new_manifest_path = f\"manifest_{time.time_ns()}.csv\"\n with open(dl_list_path) as manifest_f, open(\n new_manifest_path, \"w\"\n ) as write_obj:\n reader = csv.DictReader(manifest_f)\n columns = reader.fieldnames\n columns.extend([\"path\", \"error\"])\n # Write the downloaded paths to a new manifest file\n writer = csv.DictWriter(write_obj, fieldnames=columns)\n writer.writeheader()\n\n for row in reader:\n # You can add things to the download list that you don't have access to\n # So there must be a try catch here\n try:\n entity = self.get(row[\"ID\"], downloadLocation=downloadLocation)\n # Must include version number because you can have multiple versions of a\n # file in the download list\n downloaded_files.append(\n {\n \"fileEntityId\": row[\"ID\"],\n \"versionNumber\": row[\"versionNumber\"],\n }\n )\n row[\"path\"] = entity.path\n row[\"error\"] = \"\"\n except Exception:\n row[\"path\"] = \"\"\n row[\"error\"] = \"DOWNLOAD FAILED\"\n self.logger.error(\"Unable to download file\")\n writer.writerow(row)\n\n # Don't want to clear all the download list because you can add things\n # to the download list after initiating this command.\n # Files that failed to download should not be removed from download list\n # Remove all files from download list after the entire download is complete.\n # This is because if download fails midway, we want to return the full manifest\n if downloaded_files:\n # Only want to invoke this if there is a list of files to remove\n # or the API call will error\n self.remove_from_download_list(list_of_files=downloaded_files)\n else:\n self.logger.warning(\"A manifest was created, but no files were downloaded\")\n\n # Always remove original manifest file\n os.remove(dl_list_path)\n\n return new_manifest_path\n\n 
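# Example (illustrative sketch, not part of the API): download everything on\n    # your download list and inspect the resulting manifest. Assumes you are\n    # logged in and have previously added files to your download list.\n    #\n    #     import csv\n    #     manifest_path = syn.get_download_list(downloadLocation=\"./downloads\")\n    #     with open(manifest_path) as f:\n    #         for row in csv.DictReader(f):\n    #             print(row[\"ID\"], row[\"path\"], row[\"error\"])\n    #\n\n    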
############################################################\n # Get / Set Annotations #\n ############################################################\n\n def _getRawAnnotations(self, entity, version=None):\n \"\"\"\n Retrieve annotations for an Entity returning them in the native Synapse format.\n \"\"\"\n # Note: Specifying the version results in a zero-ed out etag,\n # even if the version is the most recent.\n # See `PLFM-1874 <https://sagebionetworks.jira.com/browse/PLFM-1874>`_ for more details.\n if version:\n uri = f\"/entity/{id_of(entity)}/version/{str(version)}/annotations2\"\n else:\n uri = f\"/entity/{id_of(entity)}/annotations2\"\n return self.restGET(uri)\n\n @deprecated.sphinx.deprecated(\n version=\"2.1.0\",\n reason=\"deprecated and replaced with :py:meth:`get_annotations`\",\n )\n def getAnnotations(self, entity, version=None):\n \"\"\"deprecated and replaced with :py:meth:`get_annotations`\"\"\"\n return self.get_annotations(entity, version=version)\n\n @tracer.start_as_current_span(\"Synapse::get_annotations\")\n def get_annotations(\n self, entity: typing.Union[str, Entity], version: typing.Union[str, int] = None\n ) -> Annotations:\n \"\"\"\n Retrieve annotations for an Entity from the Synapse Repository as a Python dict.\n\n Note that collapsing annotations from the native Synapse format to a Python dict may involve some loss of\n information. See `_getRawAnnotations` to get annotations in the native format.\n\n Arguments:\n entity: An Entity or Synapse ID to lookup\n version: The version of the Entity to retrieve.\n\n Returns:\n A [synapseclient.annotations.Annotations][] object, a dict that also has id and etag attributes\n \"\"\"\n return from_synapse_annotations(self._getRawAnnotations(entity, version))\n\n @deprecated.sphinx.deprecated(\n version=\"2.1.0\",\n reason=\"deprecated and replaced with :py:meth:`set_annotations` \"\n \"This method is UNSAFE and may overwrite existing annotations\"\n \" without confirming that you have retrieved and\"\n \" updated the latest annotations\",\n )\n def setAnnotations(self, entity, annotations=None, **kwargs):\n \"\"\"\n Store annotations for an Entity in the Synapse Repository.\n\n :param entity: The Entity or Synapse Entity ID whose annotations are to be updated\n :param annotations: A dictionary of annotation names and values\n :param kwargs: annotation names and values\n :returns: the updated annotations for the entity\n\n \"\"\"\n if not annotations:\n annotations = {}\n\n annotations.update(kwargs)\n\n id = id_of(entity)\n trace.get_current_span().set_attributes({\"synapse.id\": id})\n etag = (\n annotations.etag\n if hasattr(annotations, \"etag\")\n else annotations.get(\"etag\")\n )\n\n if not etag:\n if \"etag\" in entity:\n etag = entity[\"etag\"]\n else:\n uri = \"/entity/%s/annotations2\" % id_of(entity)\n old_annos = self.restGET(uri)\n etag = old_annos[\"etag\"]\n\n return self.set_annotations(Annotations(id, etag, annotations))\n\n @tracer.start_as_current_span(\"Synapse::set_annotations\")\n def set_annotations(self, annotations: Annotations):\n \"\"\"\n Store annotations for an Entity in the Synapse Repository.\n\n Arguments:\n annotations: A [synapseclient.annotations.Annotations][] of annotation names and values,\n with the id and etag attribute set\n\n Returns:\n The updated [synapseclient.annotations.Annotations][] for the entity\n\n Example: Using this function\n Getting annotations, adding a new annotation, and updating the annotations:\n\n annos = syn.get_annotations('syn123')\n\n # annos will 
contain the id and etag associated with the entity upon retrieval\n                print(annos.id)\n                # syn123\n                print(annos.etag)\n                # 7bdb83e9-a50a-46e4-987a-4962559f090f (Usually some UUID in the form of a string)\n\n                # returned annos object from get_annotations() can be used as if it were a dict\n\n                # set key 'foo' to have value of 'bar' and 'baz'\n                annos['foo'] = ['bar', 'baz']\n\n                # single values will automatically be wrapped in a list once stored\n                annos['qwerty'] = 'asdf'\n\n                # store the annotations\n                annos = syn.set_annotations(annos)\n\n                print(annos)\n                # {'foo':['bar','baz'], 'qwerty':['asdf']}\n        \"\"\"\n\n        if not isinstance(annotations, Annotations):\n            raise TypeError(\"Expected a synapseclient.Annotations object\")\n\n        synapseAnnos = to_synapse_annotations(annotations)\n\n        entity_id = id_of(annotations)\n        trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n        return from_synapse_annotations(\n            self.restPUT(\n                f\"/entity/{entity_id}/annotations2\",\n                body=json.dumps(synapseAnnos),\n            )\n        )\n\n
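    ############################################################\n    # Querying #\n    ############################################################\n\n    # Example (illustrative sketch): iterating over the children of a container.\n    # getChildren is a generator and fetches results one page at a time;\n    # 'syn123' is a placeholder ID for a project or folder.\n    #\n    #     for child in syn.getChildren('syn123', includeTypes=['folder', 'file']):\n    #         print(child['id'], child['name'])\n    #\n    @tracer.start_as_current_span(\"Synapse::getChildren\")\n    def getChildren(\n        self,\n        parent,\n        includeTypes=[\n            \"folder\",\n            \"file\",\n            \"table\",\n            \"link\",\n            \"entityview\",\n            \"dockerrepo\",\n            \"submissionview\",\n            \"dataset\",\n            \"materializedview\",\n        ],\n        sortBy=\"NAME\",\n        sortDirection=\"ASC\",\n    ):\n        \"\"\"\n        Retrieves all of the entities stored within a parent, such as a folder or project.\n\n        Arguments:\n            parent: An id or an object of a Synapse container or None to retrieve all projects\n            includeTypes: Must be a list of entity types (i.e. [\"folder\", \"file\"]) which can be found here:\n                http://docs.synapse.org/rest/org/sagebionetworks/repo/model/EntityType.html\n            sortBy: How results should be sorted. Can be NAME, or CREATED_ON\n            sortDirection: The direction of the result sort.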
 Can be ASC, or DESC\n\n        Yields:\n            An iterator that shows all the children of the container.\n\n        Also see:\n\n        - [synapseutils.walk][]\n        \"\"\"\n        parentId = id_of(parent) if parent is not None else None\n\n        trace.get_current_span().set_attributes({\"synapse.parent_id\": parentId})\n        entityChildrenRequest = {\n            \"parentId\": parentId,\n            \"includeTypes\": includeTypes,\n            \"sortBy\": sortBy,\n            \"sortDirection\": sortDirection,\n            \"nextPageToken\": None,\n        }\n        entityChildrenResponse = {\"nextPageToken\": \"first\"}\n        while entityChildrenResponse.get(\"nextPageToken\") is not None:\n            entityChildrenResponse = self.restPOST(\n                \"/entity/children\", body=json.dumps(entityChildrenRequest)\n            )\n            for child in entityChildrenResponse[\"page\"]:\n                yield child\n            if entityChildrenResponse.get(\"nextPageToken\") is not None:\n                entityChildrenRequest[\"nextPageToken\"] = entityChildrenResponse[\n                    \"nextPageToken\"\n                ]\n\n    @tracer.start_as_current_span(\"Synapse::md5Query\")\n    def md5Query(self, md5):\n        \"\"\"\n        Find the Entities with attached file(s) that have the given MD5 hash.\n\n        Arguments:\n            md5: The MD5 to query for (hexadecimal string)\n\n        Returns:\n            A list of Entity headers\n        \"\"\"\n\n        return self.restGET(\"/entity/md5/%s\" % md5)[\"results\"]\n\n    ############################################################\n    # ACL manipulation #\n    ############################################################\n\n    def _getBenefactor(self, entity):\n        \"\"\"An Entity gets its ACL from its benefactor.\"\"\"\n\n        if utils.is_synapse_id_str(entity) or is_synapse_entity(entity):\n            return self.restGET(\"/entity/%s/benefactor\" % id_of(entity))\n        return entity\n\n    def _getACL(self, entity):\n        \"\"\"Get the effective ACL for a Synapse Entity.\"\"\"\n\n        if hasattr(entity, \"getACLURI\"):\n            uri = entity.getACLURI()\n        else:\n            # Get the ACL from the benefactor (which may be the entity itself)\n            benefactor = self._getBenefactor(entity)\n            trace.get_current_span().set_attributes({\"synapse.id\": benefactor[\"id\"]})\n            uri = \"/entity/%s/acl\" % (benefactor[\"id\"])\n        return self.restGET(uri)\n\n    def _storeACL(self, entity, acl):\n        \"\"\"\n        Create or update the ACL for a Synapse Entity.\n\n        :param entity: An entity or Synapse ID\n        :param acl: An ACL as a dict\n\n        :returns: the new or updated ACL\n\n        .. code-block:: python\n\n            {'resourceAccess': [\n                {'accessType': ['READ'],\n                 'principalId': 222222}\n            ]}\n        \"\"\"\n        if hasattr(entity, \"putACLURI\"):\n            return self.restPUT(entity.putACLURI(), json.dumps(acl))\n        else:\n            # Get benefactor.
 (An entity gets its ACL from its benefactor.)\n            entity_id = id_of(entity)\n            uri = \"/entity/%s/benefactor\" % entity_id\n            benefactor = self.restGET(uri)\n\n            # Update or create new ACL\n            uri = \"/entity/%s/acl\" % entity_id\n            if benefactor[\"id\"] == entity_id:\n                return self.restPUT(uri, json.dumps(acl))\n            else:\n                return self.restPOST(uri, json.dumps(acl))\n\n    def _getUserbyPrincipalIdOrName(self, principalId: str = None):\n        \"\"\"\n        Given a string, an int, or None, find the corresponding user, where None implies PUBLIC.\n\n        :param principalId: Identifier of a user or group\n\n        :returns: The integer ID of the user\n        \"\"\"\n        if principalId is None or principalId == \"PUBLIC\":\n            return PUBLIC\n        try:\n            return int(principalId)\n\n        # If principalId is not a number assume it is a name or email\n        except ValueError:\n            userProfiles = self.restGET(\"/userGroupHeaders?prefix=%s\" % principalId)\n            totalResults = len(userProfiles[\"children\"])\n            if totalResults == 1:\n                return int(userProfiles[\"children\"][0][\"ownerId\"])\n            elif totalResults > 1:\n                for profile in userProfiles[\"children\"]:\n                    if profile[\"userName\"] == principalId:\n                        return int(profile[\"ownerId\"])\n\n            supplementalMessage = (\n                \"Please be more specific\" if totalResults > 1 else \"No matches\"\n            )\n            raise SynapseError(\n                \"Unknown Synapse user (%s). %s.\" % (principalId, supplementalMessage)\n            )\n\n    @tracer.start_as_current_span(\"Synapse::getPermissions\")\n    def getPermissions(\n        self,\n        entity: Union[Entity, Evaluation, str, collections.abc.Mapping],\n        principalId: str = None,\n    ):\n        \"\"\"Get the permissions that a user or group has on an Entity.\n\n        Arguments:\n            entity: An Entity or Synapse ID to lookup\n            principalId: Identifier of a user or group (defaults to PUBLIC users)\n\n        Returns:\n            An array containing some combination of\n            ['READ', 'CREATE', 'UPDATE', 'DELETE', 'CHANGE_PERMISSIONS', 'DOWNLOAD']\n            or an empty array\n        \"\"\"\n        principal_id = self._getUserbyPrincipalIdOrName(principalId)\n\n        trace.get_current_span().set_attributes(\n            {\"synapse.id\": id_of(entity), \"synapse.principal_id\": principal_id}\n        )\n\n        acl = self._getACL(entity)\n\n        team_list = self._find_teams_for_principal(principal_id)\n        team_ids = [int(team.id) for team in team_list]\n        effective_permission_set = set()\n\n        # This user_profile_bundle is used to verify that the principal_id is a registered user of the system\n        user_profile_bundle = self._get_user_bundle(principal_id, 1)\n\n        # Loop over all permissions in the returned ACL and add them to the effective_permission_set\n        # if the principalId in the ACL matches\n        # 1) the one we are looking for,\n        # 2) a team the principal is a member of,\n        # 3) PUBLIC\n        # 4) AUTHENTICATED_USERS, when a user_profile_bundle exists for the principal_id\n        for permissions in acl[\"resourceAccess\"]:\n            if \"principalId\" in permissions and (\n                permissions[\"principalId\"] == principal_id\n                or permissions[\"principalId\"] in team_ids\n                or permissions[\"principalId\"] == PUBLIC\n                or (\n                    permissions[\"principalId\"] == AUTHENTICATED_USERS\n                    and user_profile_bundle is not None\n                )\n            ):\n                effective_permission_set = effective_permission_set.union(\n                    permissions[\"accessType\"]\n                )\n        return list(effective_permission_set)\n\n    @tracer.start_as_current_span(\"Synapse::setPermissions\")\n    def setPermissions(\n        self,\n        entity,\n        principalId=None,\n        accessType=[\"READ\", \"DOWNLOAD\"],\n        modify_benefactor=False,\n        warn_if_inherits=True,\n        overwrite=True,\n    ):\n        \"\"\"\n        Sets the permissions that a user or group has on an Entity.\n        An Entity may have its own ACL or
 inherit its ACL from a benefactor.\n\n        Arguments:\n            entity: An Entity or Synapse ID to modify\n            principalId: Identifier of a user or group. '273948' is for all registered Synapse users\n                and '273949' is for public access.\n            accessType: Type of permission to be granted. One or more of CREATE, READ, DOWNLOAD, UPDATE,\n                DELETE, CHANGE_PERMISSIONS\n            modify_benefactor: Set as True when modifying a benefactor's ACL\n            warn_if_inherits: When True, a warning is emitted if the Entity inherits its ACL from a\n                benefactor. Set to False to suppress the warning when intentionally creating a new ACL.\n            overwrite: By default this function overwrites existing permissions for the specified user.\n                Set this flag to False to add new permissions non-destructively.\n\n        Returns:\n            An Access Control List object\n\n        Example: Using this function\n            Setting permissions\n\n                # Grant all registered users download access\n                syn.setPermissions('syn1234', '273948', ['READ', 'DOWNLOAD'])\n                # Grant the public view access\n                syn.setPermissions('syn1234', '273949', ['READ'])\n        \"\"\"\n        entity_id = id_of(entity)\n        trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n        benefactor = self._getBenefactor(entity)\n        if benefactor[\"id\"] != entity_id:\n            if modify_benefactor:\n                entity = benefactor\n            elif warn_if_inherits:\n                self.logger.warning(\n                    \"Creating an ACL for entity %s, which formerly inherited access control from a\"\n                    ' benefactor entity, \"%s\" (%s).\\n'\n                    % (entity_id, benefactor[\"name\"], benefactor[\"id\"])\n                )\n\n        acl = self._getACL(entity)\n\n        principalId = self._getUserbyPrincipalIdOrName(principalId)\n\n        # Find existing permissions\n        permissions_to_update = None\n        for permissions in acl[\"resourceAccess\"]:\n            if (\n                \"principalId\" in permissions\n                and permissions[\"principalId\"] == principalId\n            ):\n                permissions_to_update = permissions\n                break\n\n        if accessType is None or accessType == []:\n            # remove permissions\n            if permissions_to_update and overwrite:\n                acl[\"resourceAccess\"].remove(permissions_to_update)\n        else:\n            # add a 'resourceAccess' entry, if necessary\n            if not permissions_to_update:\n                permissions_to_update = {\"accessType\": [], \"principalId\": principalId}\n                acl[\"resourceAccess\"].append(permissions_to_update)\n            if overwrite:\n                permissions_to_update[\"accessType\"] = accessType\n            else:\n                permissions_to_update[\"accessType\"] = list(\n                    set(permissions_to_update[\"accessType\"]) | set(accessType)\n                )\n        return self._storeACL(entity, acl)\n\n
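    ############################################################\n    # Provenance #\n    ############################################################\n\n    # TODO: rename these to Activity\n\n    # Example (illustrative sketch): recording provenance for an entity and\n    # reading it back. 'syn123' and 'syn456' are placeholder Synapse IDs.\n    #\n    #     activity = Activity(name='Analysis', used=['syn123'])\n    #     syn.setProvenance('syn456', activity)\n    #     print(syn.getProvenance('syn456'))\n    #\n    @tracer.start_as_current_span(\"Synapse::getProvenance\")\n    def getProvenance(self, entity, version=None) -> Activity:\n        \"\"\"\n        Retrieve provenance information for a Synapse Entity.\n\n        Arguments:\n            entity: An Entity or Synapse ID to lookup\n            version: The version of the Entity to retrieve. 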
Gets the most recent version if omitted\n\n        Returns:\n            An Activity object\n\n        Raises:\n            SynapseHTTPError: if no provenance record exists\n        \"\"\"\n\n        # Get versionNumber from Entity\n        if version is None and \"versionNumber\" in entity:\n            version = entity[\"versionNumber\"]\n        entity_id = id_of(entity)\n        if version:\n            uri = \"/entity/%s/version/%d/generatedBy\" % (entity_id, version)\n        else:\n            uri = \"/entity/%s/generatedBy\" % entity_id\n\n        trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n        return Activity(data=self.restGET(uri))\n\n    @tracer.start_as_current_span(\"Synapse::setProvenance\")\n    def setProvenance(self, entity, activity) -> Activity:\n        \"\"\"\n        Stores a record of the code and data used to derive a Synapse entity.\n\n        Arguments:\n            entity: An Entity or Synapse ID to modify\n            activity: A [synapseclient.activity.Activity][]\n\n        Returns:\n            An updated [synapseclient.activity.Activity][] object\n        \"\"\"\n\n        # Save the Activity (creating it if necessary)\n        activity = self._saveActivity(activity)\n\n        entity_id = id_of(entity)\n        # assert that the entity was generated by the given activity\n        uri = \"/entity/%s/generatedBy?generatedBy=%s\" % (entity_id, activity[\"id\"])\n        activity = Activity(data=self.restPUT(uri))\n\n        trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n        return activity\n\n    @tracer.start_as_current_span(\"Synapse::deleteProvenance\")\n    def deleteProvenance(self, entity) -> None:\n        \"\"\"\n        Removes provenance information from an Entity and deletes the associated Activity.\n\n        Arguments:\n            entity: An Entity or Synapse ID to modify\n        \"\"\"\n\n        activity = self.getProvenance(entity)\n        if not activity:\n            return\n        entity_id = id_of(entity)\n        trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n        uri = \"/entity/%s/generatedBy\" % entity_id\n        self.restDELETE(uri)\n\n        # TODO: what happens if the activity is shared by more than one entity?\n        uri = \"/activity/%s\" % activity[\"id\"]\n        self.restDELETE(uri)\n\n    def _saveActivity(self, activity):\n        if \"id\" in activity:\n            # We're updating provenance\n            uri = \"/activity/%s\" % activity[\"id\"]\n            activity = Activity(data=self.restPUT(uri, json.dumps(activity)))\n        else:\n            activity = self.restPOST(\"/activity\", body=json.dumps(activity))\n        return activity\n\n    @tracer.start_as_current_span(\"Synapse::updateActivity\")\n    def updateActivity(self, activity):\n        \"\"\"\n        Modifies an existing Activity.\n\n        Arguments:\n            activity: The Activity to be updated.\n\n        Returns:\n            An updated Activity object\n        \"\"\"\n        if \"id\" not in activity:\n            raise ValueError(\"The activity you want to update must exist on Synapse\")\n        trace.get_current_span().set_attributes({\"synapse.id\": activity[\"id\"]})\n        return self._saveActivity(activity)\n\n    def _convertProvenanceList(self, usedList, limitSearch=None):\n        \"\"\"Convert a list of synapse Ids, URLs and local files by replacing local files with Synapse Ids\"\"\"\n        if usedList is None:\n            return None\n        usedList = [\n            self.get(target, limitSearch=limitSearch)\n            if (os.path.isfile(target) if isinstance(target, str) else False)\n            else target\n            for target in usedList\n        ]\n        return usedList\n\n    ############################################################\n    # File handle service calls #\n    ############################################################\n\n    def _getFileHandleDownload(self, fileHandleId, objectId, objectType=None):\n        \"\"\"\n        Gets the URL and the metadata as filehandle object for a filehandle or
 fileHandleId\n\n        :param fileHandleId: ID of fileHandle to download\n        :param objectId: The ID of the object associated with the file e.g. syn234\n        :param objectType: Type of object associated with a file e.g. FileEntity, TableEntity\n\n        :returns: dictionary with keys: fileHandle, fileHandleId and preSignedURL\n        \"\"\"\n        body = {\n            \"includeFileHandles\": True,\n            \"includePreSignedURLs\": True,\n            \"requestedFiles\": [\n                {\n                    \"fileHandleId\": fileHandleId,\n                    \"associateObjectId\": objectId,\n                    \"associateObjectType\": objectType or \"FileEntity\",\n                }\n            ],\n        }\n        response = self.restPOST(\n            \"/fileHandle/batch\", body=json.dumps(body), endpoint=self.fileHandleEndpoint\n        )\n        result = response[\"requestedFiles\"][0]\n        failure = result.get(\"failureCode\")\n        if failure == \"NOT_FOUND\":\n            raise SynapseFileNotFoundError(\n                \"The fileHandleId %s could not be found\" % fileHandleId\n            )\n        elif failure == \"UNAUTHORIZED\":\n            raise SynapseError(\n                \"You are not authorized to access fileHandleId %s associated with the Synapse\"\n                \" %s: %s\" % (fileHandleId, objectType, objectId)\n            )\n        return result\n\n    @staticmethod\n    def _is_retryable_download_error(ex):\n        # some exceptions caught during download indicate non-recoverable situations that\n        # will not be remedied by a repeated download attempt.\n        return not (\n            (isinstance(ex, OSError) and ex.errno == errno.ENOSPC)  # out of disk space\n            or isinstance(ex, SynapseMd5MismatchError)\n        )\n\n    @tracer.start_as_current_span(\"Synapse::_downloadFileHandle\")\n    def _downloadFileHandle(\n        self, fileHandleId, objectId, objectType, destination, retries=5\n    ):\n        \"\"\"\n        Download a file from the given URL to the local file system.\n\n        :param fileHandleId: id of the FileHandle to download\n        :param objectId: id of the Synapse object that uses the FileHandle e.g. \"syn123\"\n        :param objectType: type of the Synapse object that uses the FileHandle e.g. 
\"FileEntity\"\n :param destination: destination on local file system\n :param retries: (default=5) Number of download retries attempted before throwing an exception.\n\n :returns: path to downloaded file\n \"\"\"\n os.makedirs(os.path.dirname(destination), exist_ok=True)\n\n while retries > 0:\n try:\n fileResult = self._getFileHandleDownload(\n fileHandleId, objectId, objectType\n )\n fileHandle = fileResult[\"fileHandle\"]\n concreteType = fileHandle[\"concreteType\"]\n storageLocationId = fileHandle.get(\"storageLocationId\")\n\n if concreteType == concrete_types.EXTERNAL_OBJECT_STORE_FILE_HANDLE:\n profile = self._get_client_authenticated_s3_profile(\n fileHandle[\"endpointUrl\"], fileHandle[\"bucket\"]\n )\n downloaded_path = S3ClientWrapper.download_file(\n fileHandle[\"bucket\"],\n fileHandle[\"endpointUrl\"],\n fileHandle[\"fileKey\"],\n destination,\n profile_name=profile,\n show_progress=not self.silent,\n )\n\n elif (\n sts_transfer.is_boto_sts_transfer_enabled(self)\n and sts_transfer.is_storage_location_sts_enabled(\n self, objectId, storageLocationId\n )\n and concreteType == concrete_types.S3_FILE_HANDLE\n ):\n\n def download_fn(credentials):\n return S3ClientWrapper.download_file(\n fileHandle[\"bucketName\"],\n None,\n fileHandle[\"key\"],\n destination,\n credentials=credentials,\n show_progress=not self.silent,\n # pass through our synapse threading config to boto s3\n transfer_config_kwargs={\n \"max_concurrency\": self.max_threads\n },\n )\n\n downloaded_path = sts_transfer.with_boto_sts_credentials(\n download_fn,\n self,\n objectId,\n \"read_only\",\n )\n\n elif (\n self.multi_threaded\n and concreteType == concrete_types.S3_FILE_HANDLE\n and fileHandle.get(\"contentSize\", 0)\n > multithread_download.SYNAPSE_DEFAULT_DOWNLOAD_PART_SIZE\n ):\n # run the download multi threaded if the file supports it, we're configured to do so,\n # and the file is large enough that it would be broken into parts to take advantage of\n # multiple downloading threads. 
otherwise it's more efficient to run the download as a simple\n # single threaded URL download.\n downloaded_path = self._download_from_url_multi_threaded(\n fileHandleId,\n objectId,\n objectType,\n destination,\n expected_md5=fileHandle.get(\"contentMd5\"),\n )\n\n else:\n downloaded_path = self._download_from_URL(\n fileResult[\"preSignedURL\"],\n destination,\n fileHandle[\"id\"],\n expected_md5=fileHandle.get(\"contentMd5\"),\n )\n self.cache.add(fileHandle[\"id\"], downloaded_path)\n return downloaded_path\n\n except Exception as ex:\n if not self._is_retryable_download_error(ex):\n raise\n\n exc_info = sys.exc_info()\n ex.progress = 0 if not hasattr(ex, \"progress\") else ex.progress\n self.logger.debug(\n \"\\nRetrying download on error: [%s] after progressing %i bytes\"\n % (exc_info[0], ex.progress),\n exc_info=True,\n ) # this will include stack trace\n if ex.progress == 0: # No progress was made reduce remaining retries.\n retries -= 1\n if retries <= 0:\n # Re-raise exception\n raise\n\n raise Exception(\"should not reach this line\")\n\n @tracer.start_as_current_span(\"Synapse::_download_from_url_multi_threaded\")\n def _download_from_url_multi_threaded(\n self, file_handle_id, object_id, object_type, destination, *, expected_md5=None\n ):\n destination = os.path.abspath(destination)\n temp_destination = utils.temp_download_filename(destination, file_handle_id)\n\n request = multithread_download.DownloadRequest(\n file_handle_id=int(file_handle_id),\n object_id=object_id,\n object_type=object_type,\n path=temp_destination,\n )\n\n multithread_download.download_file(self, request)\n\n if (\n expected_md5\n ): # if md5 not set (should be the case for all except http download)\n actual_md5 = utils.md5_for_file(temp_destination).hexdigest()\n # check md5 if given\n if actual_md5 != expected_md5:\n try:\n os.remove(temp_destination)\n except FileNotFoundError:\n # file already does not exist. 
nothing to do\n pass\n raise SynapseMd5MismatchError(\n \"Downloaded file {filename}'s md5 {md5} does not match expected MD5 of\"\n \" {expected_md5}\".format(\n filename=temp_destination,\n md5=actual_md5,\n expected_md5=expected_md5,\n )\n )\n # once download completed, rename to desired destination\n shutil.move(temp_destination, destination)\n\n return destination\n\n def _is_synapse_uri(self, uri):\n # check whether the given uri is hosted at the configured synapse repo endpoint\n uri_domain = urllib_urlparse.urlparse(uri).netloc\n synapse_repo_domain = urllib_urlparse.urlparse(self.repoEndpoint).netloc\n return uri_domain.lower() == synapse_repo_domain.lower()\n\n @tracer.start_as_current_span(\"Synapse::_download_from_URL\")\n def _download_from_URL(\n self, url, destination, fileHandleId=None, expected_md5=None\n ):\n \"\"\"\n Download a file from the given URL to the local file system.\n\n :param url: source of download\n :param destination: destination on local file system\n :param fileHandleId: (optional) if given, the file will be given a temporary name that includes the file\n handle id which allows resuming partial downloads of the same file from previous\n sessions\n :param expected_md5: (optional) if given, check that the MD5 of the downloaded file matched the expected MD5\n\n :returns: path to downloaded file\n \"\"\"\n destination = os.path.abspath(destination)\n actual_md5 = None\n redirect_count = 0\n delete_on_md5_mismatch = True\n self.logger.debug(f\"Downloading from {url} to {destination}\")\n while redirect_count < REDIRECT_LIMIT:\n redirect_count += 1\n scheme = urllib_urlparse.urlparse(url).scheme\n if scheme == \"file\":\n delete_on_md5_mismatch = False\n destination = utils.file_url_to_path(url, verify_exists=True)\n if destination is None:\n raise IOError(\"Local file (%s) does not exist.\" % url)\n break\n elif scheme == \"sftp\":\n username, password = self._getUserCredentials(url)\n destination = SFTPWrapper.download_file(\n url, destination, username, password, show_progress=not self.silent\n )\n break\n elif scheme == \"ftp\":\n transfer_start_time = time.time()\n\n def _ftp_report_hook(\n block_number: int, read_size: int, total_size: int\n ) -> None:\n show_progress = not self.silent\n if show_progress:\n self._print_transfer_progress(\n transferred=block_number * read_size,\n toBeTransferred=total_size,\n prefix=\"Downloading \",\n postfix=os.path.basename(destination),\n dt=time.time() - transfer_start_time,\n )\n\n urllib_request.urlretrieve(\n url=url, filename=destination, reporthook=_ftp_report_hook\n )\n break\n elif scheme == \"http\" or scheme == \"https\":\n # if a partial download exists with the temporary name,\n temp_destination = utils.temp_download_filename(\n destination, fileHandleId\n )\n range_header = (\n {\n \"Range\": \"bytes={start}-\".format(\n start=os.path.getsize(temp_destination)\n )\n }\n if os.path.exists(temp_destination)\n else {}\n )\n\n # pass along synapse auth credentials only if downloading directly from synapse\n auth = self.credentials if self._is_synapse_uri(url) else None\n response = with_retry(\n lambda: self._requests_session.get(\n url,\n headers=self._generate_headers(range_header),\n stream=True,\n allow_redirects=False,\n auth=auth,\n ),\n verbose=self.debug,\n **STANDARD_RETRY_PARAMS,\n )\n try:\n exceptions._raise_for_status(response, verbose=self.debug)\n except SynapseHTTPError as err:\n if err.response.status_code == 404:\n raise SynapseError(\"Could not download the file at %s\" % url)\n elif (\n 
err.response.status_code == 416\n ): # Requested Range Not Satisfiable\n # this is a weird error when the client already finished downloading but the loop continues\n # When this exception occurs, the range we request is guaranteed to be >= file size so we\n # assume that the file has been fully downloaded, rename it to destination file\n # and break out of the loop to perform the MD5 check.\n # If it fails the user can retry with another download.\n shutil.move(temp_destination, destination)\n break\n raise\n\n # handle redirects\n if response.status_code in [301, 302, 303, 307, 308]:\n url = response.headers[\"location\"]\n # don't break, loop again\n else:\n # get filename from content-disposition, if we don't have it already\n if os.path.isdir(destination):\n filename = utils.extract_filename(\n content_disposition_header=response.headers.get(\n \"content-disposition\", None\n ),\n default_filename=utils.guess_file_name(url),\n )\n destination = os.path.join(destination, filename)\n # Stream the file to disk\n if \"content-length\" in response.headers:\n toBeTransferred = float(response.headers[\"content-length\"])\n else:\n toBeTransferred = -1\n transferred = 0\n\n # Servers that respect the Range header return 206 Partial Content\n if response.status_code == 206:\n mode = \"ab\"\n previouslyTransferred = os.path.getsize(temp_destination)\n toBeTransferred += previouslyTransferred\n transferred += previouslyTransferred\n sig = utils.md5_for_file(temp_destination)\n else:\n mode = \"wb\"\n previouslyTransferred = 0\n sig = hashlib.new(\"md5\", usedforsecurity=False)\n\n try:\n with open(temp_destination, mode) as fd:\n t0 = time.time()\n for nChunks, chunk in enumerate(\n response.iter_content(FILE_BUFFER_SIZE)\n ):\n fd.write(chunk)\n sig.update(chunk)\n\n # the 'content-length' header gives the total number of bytes that will be transferred\n # to us; len(chunk) cannot be used to track progress because iter_content automatically\n # decodes the chunks if the response body is encoded, so len(chunk) could be\n # different from the total number of bytes we've read from the response body\n # response.raw.tell() is the total number of response body bytes transferred over the\n # wire so far\n transferred = (\n response.raw.tell() + previouslyTransferred\n )\n self._print_transfer_progress(\n transferred,\n toBeTransferred,\n \"Downloading \",\n os.path.basename(destination),\n dt=time.time() - t0,\n )\n except (\n Exception\n ) as ex: # We will add a progress parameter then push it back to retry.\n ex.progress = transferred - previouslyTransferred\n raise\n\n # verify that the file was completely downloaded and retry if it is not complete\n if toBeTransferred > 0 and transferred < toBeTransferred:\n self.logger.warning(\n \"\\nRetrying download because the connection ended early.\\n\"\n )\n continue\n\n actual_md5 = sig.hexdigest()\n # rename to final destination\n shutil.move(temp_destination, destination)\n break\n else:\n self.logger.error(\"Unable to download URLs of type %s\" % scheme)\n return None\n\n else: # didn't break out of loop\n raise SynapseHTTPError(\"Too many redirects\")\n\n if (\n actual_md5 is None\n ): # if md5 not set (should be the case for all except http download)\n actual_md5 = utils.md5_for_file(destination).hexdigest()\n\n # check md5 if given\n if expected_md5 and actual_md5 != expected_md5:\n if delete_on_md5_mismatch and os.path.exists(destination):\n os.remove(destination)\n raise SynapseMd5MismatchError(\n \"Downloaded file {filename}'s md5 {md5} 
does not match expected MD5 of\"\n \" {expected_md5}\".format(\n filename=destination, md5=actual_md5, expected_md5=expected_md5\n )\n )\n\n return destination\n\n @tracer.start_as_current_span(\"Synapse::_createExternalFileHandle\")\n def _createExternalFileHandle(\n self, externalURL, mimetype=None, md5=None, fileSize=None\n ):\n \"\"\"Create a new FileHandle representing an external URL.\"\"\"\n fileName = externalURL.split(\"/\")[-1]\n externalURL = utils.as_url(externalURL)\n fileHandle = {\n \"concreteType\": concrete_types.EXTERNAL_FILE_HANDLE,\n \"fileName\": fileName,\n \"externalURL\": externalURL,\n \"contentMd5\": md5,\n \"contentSize\": fileSize,\n }\n if mimetype is None:\n (mimetype, enc) = mimetypes.guess_type(externalURL, strict=False)\n if mimetype is not None:\n fileHandle[\"contentType\"] = mimetype\n return self.restPOST(\n \"/externalFileHandle\", json.dumps(fileHandle), self.fileHandleEndpoint\n )\n\n @tracer.start_as_current_span(\"Synapse::_createExternalObjectStoreFileHandle\")\n def _createExternalObjectStoreFileHandle(\n self, s3_file_key, file_path, storage_location_id, mimetype=None\n ):\n if mimetype is None:\n mimetype, enc = mimetypes.guess_type(file_path, strict=False)\n file_handle = {\n \"concreteType\": concrete_types.EXTERNAL_OBJECT_STORE_FILE_HANDLE,\n \"fileKey\": s3_file_key,\n \"fileName\": os.path.basename(file_path),\n \"contentMd5\": utils.md5_for_file(file_path).hexdigest(),\n \"contentSize\": os.stat(file_path).st_size,\n \"storageLocationId\": storage_location_id,\n \"contentType\": mimetype,\n }\n\n return self.restPOST(\n \"/externalFileHandle\", json.dumps(file_handle), self.fileHandleEndpoint\n )\n\n @tracer.start_as_current_span(\"Synapse::create_external_s3_file_handle\")\n def create_external_s3_file_handle(\n self,\n bucket_name,\n s3_file_key,\n file_path,\n *,\n parent=None,\n storage_location_id=None,\n mimetype=None,\n ):\n \"\"\"\n Create an external S3 file handle for e.g. a file that has been uploaded directly to\n an external S3 storage location.\n\n Arguments:\n bucket_name: Name of the S3 bucket\n s3_file_key: S3 key of the uploaded object\n file_path: Local path of the uploaded file\n parent: Parent entity to create the file handle in, the file handle will be created\n in the default storage location of the parent. 
Mutually exclusive with\n storage_location_id\n storage_location_id: Explicit storage location id to create the file handle in, mutually exclusive\n with parent\n mimetype: Mimetype of the file, if known\n\n Raises:\n ValueError: If neither parent nor storage_location_id is specified, or if both are specified.\n \"\"\"\n\n if storage_location_id:\n if parent:\n raise ValueError(\"Pass parent or storage_location_id, not both\")\n elif not parent:\n raise ValueError(\"One of parent or storage_location_id is required\")\n else:\n upload_destination = self._getDefaultUploadDestination(parent)\n storage_location_id = upload_destination[\"storageLocationId\"]\n\n if mimetype is None:\n mimetype, enc = mimetypes.guess_type(file_path, strict=False)\n\n file_handle = {\n \"concreteType\": concrete_types.S3_FILE_HANDLE,\n \"key\": s3_file_key,\n \"bucketName\": bucket_name,\n \"fileName\": os.path.basename(file_path),\n \"contentMd5\": utils.md5_for_file(file_path).hexdigest(),\n \"contentSize\": os.stat(file_path).st_size,\n \"storageLocationId\": storage_location_id,\n \"contentType\": mimetype,\n }\n\n return self.restPOST(\n \"/externalFileHandle/s3\",\n json.dumps(file_handle),\n endpoint=self.fileHandleEndpoint,\n )\n\n @tracer.start_as_current_span(\"Synapse::_get_file_handle_as_creator\")\n def _get_file_handle_as_creator(self, fileHandle):\n \"\"\"Retrieve a fileHandle from the fileHandle service.\n You must be the creator of the filehandle to use this method. Otherwise, an 403-Forbidden error will be raised\n \"\"\"\n\n uri = \"/fileHandle/%s\" % (id_of(fileHandle),)\n return self.restGET(uri, endpoint=self.fileHandleEndpoint)\n\n @tracer.start_as_current_span(\"Synapse::_deleteFileHandle\")\n def _deleteFileHandle(self, fileHandle):\n \"\"\"\n Delete the given file handle.\n\n Note: Only the user that created the FileHandle can delete it. 
Also, a FileHandle cannot be deleted if it is\n associated with a FileEntity or WikiPage\n \"\"\"\n\n uri = \"/fileHandle/%s\" % (id_of(fileHandle),)\n self.restDELETE(uri, endpoint=self.fileHandleEndpoint)\n return fileHandle\n\n ############################################################\n # SFTP #\n ############################################################\n\n @tracer.start_as_current_span(\"Synapse::_getDefaultUploadDestination\")\n def _getDefaultUploadDestination(self, parent_entity):\n return self.restGET(\n \"/entity/%s/uploadDestination\" % id_of(parent_entity),\n endpoint=self.fileHandleEndpoint,\n )\n\n @tracer.start_as_current_span(\"Synapse::_getUserCredentials\")\n def _getUserCredentials(self, url, username=None, password=None):\n \"\"\"Get user credentials for a specified URL by either looking in the configFile or querying the user.\n\n :param username: username on server (optionally specified)\n :param password: password for authentication on the server (optionally specified)\n\n :returns: tuple of username, password\n \"\"\"\n # Get authentication information from configFile\n\n parsedURL = urllib_urlparse.urlparse(url)\n baseURL = parsedURL.scheme + \"://\" + parsedURL.hostname\n\n config = self.getConfigFile(self.configPath)\n if username is None and config.has_option(baseURL, \"username\"):\n username = config.get(baseURL, \"username\")\n if password is None and config.has_option(baseURL, \"password\"):\n password = config.get(baseURL, \"password\")\n # If I still don't have a username and password prompt for it\n if username is None:\n username = getpass.getuser() # Default to login name\n # Note that if we hit the following line from within nosetests in\n # Python 3, we get \"TypeError: bad argument type for built-in operation\".\n # Luckily, this case isn't covered in our test suite!\n user = input(\"Username for %s (%s):\" % (baseURL, username))\n username = username if user == \"\" else user\n if password is None:\n password = getpass.getpass(\"Password for %s:\" % baseURL)\n return username, password\n\n ############################################\n # Project/Folder storage location settings #\n ############################################\n\n @tracer.start_as_current_span(\"Synapse::createStorageLocationSetting\")\n def createStorageLocationSetting(self, storage_type, **kwargs):\n \"\"\"\n Creates an IMMUTABLE storage location based on the specified type.\n\n For each storage_type, the following kwargs should be specified:\n\n **ExternalObjectStorage**: (S3-like (e.g. 
AWS S3 or Openstack) bucket not accessed by Synapse)\n\n - endpointUrl: endpoint URL of the S3 service (for example: 'https://s3.amazonaws.com')\n - bucket: the name of the bucket to use\n\n **ExternalS3Storage**: (Amazon S3 bucket accessed by Synapse)\n\n - bucket: the name of the bucket to use\n\n **ExternalStorage**: (SFTP or FTP storage location not accessed by Synapse)\n\n - url: the base URL for uploading to the external destination\n - supportsSubfolders(optional): does the destination support creating subfolders under the base url\n (default: false)\n\n **ProxyStorage**: (a proxy server that controls access to a storage)\n\n - secretKey: The encryption key used to sign all pre-signed URLs used to communicate with the proxy.\n - proxyUrl: The HTTPS URL of the proxy used for upload and download.\n\n Arguments:\n storage_type: The type of the StorageLocationSetting to create\n banner: (Optional) Banner to show every time a file is uploaded\n description: (Optional) The description to show the user when the user has to choose which upload destination to use\n kwargs: fields necessary for creation of the specified storage_type\n\n Returns:\n A dict of the created StorageLocationSetting\n \"\"\"\n upload_type_dict = {\n \"ExternalObjectStorage\": \"S3\",\n \"ExternalS3Storage\": \"S3\",\n \"ExternalStorage\": \"SFTP\",\n \"ProxyStorage\": \"PROXYLOCAL\",\n }\n\n if storage_type not in upload_type_dict:\n raise ValueError(\"Unknown storage_type: %s\" % storage_type)\n\n # ProxyStorageLocationSettings has an extra 's' at the end >:(\n kwargs[\"concreteType\"] = (\n \"org.sagebionetworks.repo.model.project.\"\n + storage_type\n + \"LocationSetting\"\n + (\"s\" if storage_type == \"ProxyStorage\" else \"\")\n )\n kwargs[\"uploadType\"] = upload_type_dict[storage_type]\n\n return self.restPOST(\"/storageLocation\", body=json.dumps(kwargs))\n\n @tracer.start_as_current_span(\"Synapse::getMyStorageLocationSetting\")\n def getMyStorageLocationSetting(self, storage_location_id):\n \"\"\"\n Get a StorageLocationSetting by its id.\n\n Arguments:\n storage_location_id: id of the StorageLocationSetting to retrieve.\n The corresponding StorageLocationSetting must have been created by this user.\n\n Returns:\n A dict describing the StorageLocationSetting retrieved by its id\n \"\"\"\n return self.restGET(\"/storageLocation/%s\" % storage_location_id)\n\n 
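# --- Editor's example (a sketch, not part of the client source). Creating a\n # storage location setting and fetching it back by id, assuming an\n # authenticated `syn = synapseclient.login()` session; the bucket name is a\n # hypothetical placeholder:\n #\n #     setting = syn.createStorageLocationSetting(\n #         \"ExternalS3Storage\", bucket=\"my-example-bucket\"\n #     )\n #     fetched = syn.getMyStorageLocationSetting(setting[\"storageLocationId\"])\n # ---------------------------------------------------------------------------\n\n 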
@tracer.start_as_current_span(\"Synapse::setStorageLocation\")\n def setStorageLocation(self, entity, storage_location_id):\n \"\"\"\n Sets the storage location for a Project or Folder\n\n Arguments:\n entity: A Project or Folder to which the StorageLocationSetting is set\n storage_location_id: A StorageLocation id or a list of StorageLocation ids. Pass in None for the default\n Synapse storage.\n\n Returns:\n The created or updated settings as a dict.\n \"\"\"\n if storage_location_id is None:\n storage_location_id = DEFAULT_STORAGE_LOCATION_ID\n locations = (\n storage_location_id\n if isinstance(storage_location_id, list)\n else [storage_location_id]\n )\n\n existing_setting = self.getProjectSetting(entity, \"upload\")\n if existing_setting is not None:\n existing_setting[\"locations\"] = locations\n self.restPUT(\"/projectSettings\", body=json.dumps(existing_setting))\n return self.getProjectSetting(entity, \"upload\")\n else:\n project_destination = {\n \"concreteType\": \"org.sagebionetworks.repo.model.project.UploadDestinationListSetting\",\n \"settingsType\": \"upload\",\n \"locations\": locations,\n \"projectId\": id_of(entity),\n }\n\n return self.restPOST(\n \"/projectSettings\", body=json.dumps(project_destination)\n )\n\n @tracer.start_as_current_span(\"Synapse::getProjectSetting\")\n def getProjectSetting(self, project, setting_type):\n \"\"\"\n Gets the ProjectSetting for a project.\n\n Arguments:\n project: Project entity or its id as a string\n setting_type: Type of setting. Choose from:\n\n - `upload`\n - `external_sync`\n - `requester_pays`\n\n Returns:\n The ProjectSetting as a dict or None if no settings of the specified type exist.\n \"\"\"\n if setting_type not in {\"upload\", \"external_sync\", \"requester_pays\"}:\n raise ValueError(\"Invalid setting_type: %s\" % setting_type)\n\n response = self.restGET(\n \"/projectSettings/{projectId}/type/{type}\".format(\n projectId=id_of(project), type=setting_type\n )\n )\n return (\n response if response else None\n ) # if no project setting, an empty string is returned as the response\n\n @tracer.start_as_current_span(\"Synapse::get_sts_storage_token\")\n def get_sts_storage_token(\n self, entity, permission, *, output_format=\"json\", min_remaining_life=None\n ):\n \"\"\"Get STS credentials for the given entity_id and permission, outputting it in the given format\n\n Arguments:\n entity: The entity or entity id whose credentials are being returned\n permission: One of:\n\n - `read_only`\n - `read_write`\n output_format: One of:\n\n - `json`: the dictionary returned from the Synapse STS API including expiration\n - `boto`: a dictionary compatible with a boto session (aws_access_key_id, etc)\n - `shell`: output commands for exporting credentials appropriate for the detected shell\n - `bash`: output commands for exporting credentials into a bash shell\n - `cmd`: output commands for exporting credentials into a windows cmd shell\n - `powershell`: output commands for exporting credentials into a windows powershell\n min_remaining_life: The minimum allowable remaining life on a cached token to return. If a cached token\n has less than this amount of time left, a fresh token will be fetched\n \"\"\"\n return sts_transfer.get_sts_credentials(\n self,\n id_of(entity),\n permission,\n output_format=output_format,\n min_remaining_life=min_remaining_life,\n )\n\n 
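# --- Editor's example (a sketch, not part of the client source). Assuming an\n # STS-enabled storage location and an authenticated `syn` session, the\n # \"boto\" output format is a dict of aws_* keys that can be passed to boto3;\n # the Synapse id below is a hypothetical placeholder:\n #\n #     import boto3\n #     creds = syn.get_sts_storage_token(\n #         \"syn12345\", \"read_only\", output_format=\"boto\"\n #     )\n #     s3 = boto3.client(\"s3\", **creds)\n # ---------------------------------------------------------------------------\n\n 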
@tracer.start_as_current_span(\"Synapse::create_s3_storage_location\")\n def create_s3_storage_location(\n self,\n *,\n parent=None,\n folder_name=None,\n folder=None,\n bucket_name=None,\n base_key=None,\n sts_enabled=False,\n ):\n \"\"\"\n Create a storage location in the given parent, either in the given folder or by creating a new\n folder in that parent with the given name. This will create both a StorageLocationSetting\n and a ProjectSetting together, optionally creating a new folder in which to locate it,\n and optionally enabling this storage location for access via STS. If enabling an existing folder for STS,\n it must be empty.\n\n Arguments:\n parent: The parent in which to locate the storage location (mutually exclusive with folder)\n folder_name: The name of a new folder to create (mutually exclusive with folder)\n folder: The existing folder in which to create the storage location (mutually exclusive with folder_name)\n bucket_name: The name of an S3 bucket, if this is an external storage location,\n if None will use Synapse S3 storage\n base_key: The base key within the bucket, None to use the bucket root,\n only applicable if bucket_name is passed\n sts_enabled: Whether this storage location should be STS enabled\n\n Returns:\n A 3-tuple of the Synapse Folder, the storage location setting, and the project setting dictionaries.\n \"\"\"\n if folder_name and parent:\n if folder:\n raise ValueError(\n \"folder and folder_name are mutually exclusive, only one should be passed\"\n )\n\n folder = self.store(Folder(name=folder_name, parent=parent))\n\n elif not folder:\n raise ValueError(\"one of folder or folder_name is required\")\n\n storage_location_kwargs = {\n \"uploadType\": \"S3\",\n \"stsEnabled\": sts_enabled,\n }\n\n if bucket_name:\n storage_location_kwargs[\n \"concreteType\"\n ] = concrete_types.EXTERNAL_S3_STORAGE_LOCATION_SETTING\n storage_location_kwargs[\"bucket\"] = bucket_name\n if base_key:\n storage_location_kwargs[\"baseKey\"] = base_key\n else:\n storage_location_kwargs[\n \"concreteType\"\n ] = concrete_types.SYNAPSE_S3_STORAGE_LOCATION_SETTING\n\n storage_location_setting = self.restPOST(\n \"/storageLocation\", json.dumps(storage_location_kwargs)\n )\n\n storage_location_id = storage_location_setting[\"storageLocationId\"]\n project_setting = self.setStorageLocation(\n folder,\n storage_location_id,\n )\n\n return folder, storage_location_setting, project_setting\n\n ############################################################\n # CRUD for Evaluations #\n ############################################################\n\n @tracer.start_as_current_span(\"Synapse::getEvaluation\")\n def getEvaluation(self, id):\n \"\"\"\n Gets an Evaluation object from Synapse.\n\n Arguments:\n id: The ID of the [synapseclient.evaluation.Evaluation][] to return.\n\n Returns:\n An [synapseclient.evaluation.Evaluation][] object\n\n Example: Using this function\n Creating an Evaluation instance\n\n evaluation = syn.getEvaluation(2005090)\n \"\"\"\n\n evaluation_id = id_of(id)\n uri = Evaluation.getURI(evaluation_id)\n return Evaluation(**self.restGET(uri))\n\n # TODO: Should this be combined with getEvaluation?\n @tracer.start_as_current_span(\"Synapse::getEvaluationByName\")\n def getEvaluationByName(self, name):\n \"\"\"\n Gets an Evaluation object from Synapse.\n\n Arguments:\n name: The name of the [synapseclient.evaluation.Evaluation][] to return.\n\n Returns:\n An [synapseclient.evaluation.Evaluation][] object\n \"\"\"\n uri = Evaluation.getByNameURI(name)\n return Evaluation(**self.restGET(uri))\n\n @tracer.start_as_current_span(\"Synapse::getEvaluationByContentSource\")\n def getEvaluationByContentSource(self, entity):\n \"\"\"\n Returns a generator over evaluations that derive their content from the given entity\n\n Arguments:\n entity: The [synapseclient.entity.Project][] whose Evaluations are to be fetched.\n\n Yields:\n A 
generator over [synapseclient.evaluation.Evaluation][] objects for the given [synapseclient.entity.Project][].\n \"\"\"\n\n entityId = id_of(entity)\n url = \"/entity/%s/evaluation\" % entityId\n\n for result in self._GET_paginated(url):\n yield Evaluation(**result)\n\n @tracer.start_as_current_span(\"Synapse::_findTeam\")\n def _findTeam(self, name):\n \"\"\"\n Retrieve a Teams matching the supplied name fragment\n \"\"\"\n for result in self._GET_paginated(\"/teams?fragment=%s\" % name):\n yield Team(**result)\n\n @tracer.start_as_current_span(\"Synapse::_find_teams_for_principal\")\n def _find_teams_for_principal(self, principal_id: str) -> typing.Iterator[Team]:\n \"\"\"\n Retrieve a list of teams for the matching principal ID. If the principalId that is passed in is a team itself,\n or not found, this will return a generator that yields no results.\n\n :param principal_id: Identifier of a user or group.\n\n :return: A generator that yields objects of type :py:class:`synapseclient.team.Team`\n \"\"\"\n for result in self._GET_paginated(f\"/user/{principal_id}/team\"):\n yield Team(**result)\n\n @tracer.start_as_current_span(\"Synapse::getTeam\")\n def getTeam(self, id):\n \"\"\"\n Finds a team with a given ID or name.\n\n Arguments:\n id: The ID or name of the team or a Team object to retrieve.\n\n Returns:\n An object of type [synapseclient.team.Team][]\n \"\"\"\n # Retrieves team id\n teamid = id_of(id)\n try:\n int(teamid)\n except (TypeError, ValueError):\n if isinstance(id, str):\n for team in self._findTeam(id):\n if team.name == id:\n teamid = team.id\n break\n else:\n raise ValueError('Can\\'t find team \"{}\"'.format(teamid))\n else:\n raise ValueError('Can\\'t find team \"{}\"'.format(teamid))\n return Team(**self.restGET(\"/team/%s\" % teamid))\n\n @tracer.start_as_current_span(\"Synapse::getTeamMembers\")\n def getTeamMembers(self, team):\n \"\"\"\n Lists the members of the given team.\n\n Arguments:\n team: A [synapseclient.team.Team][] object or a team's ID.\n\n Yields:\n A generator over [synapseclient.team.TeamMember][] objects.\n\n \"\"\"\n for result in self._GET_paginated(\"/teamMembers/{id}\".format(id=id_of(team))):\n yield TeamMember(**result)\n\n @tracer.start_as_current_span(\"Synapse::_get_docker_digest\")\n def _get_docker_digest(self, entity, docker_tag=\"latest\"):\n \"\"\"\n Get matching Docker sha-digest of a DockerRepository given a Docker tag\n\n :param entity: Synapse id or entity of Docker repository\n :param docker_tag: Docker tag\n :returns: Docker digest matching Docker tag\n \"\"\"\n entityid = id_of(entity)\n uri = \"/entity/{entityId}/dockerTag\".format(entityId=entityid)\n\n docker_commits = self._GET_paginated(uri)\n docker_digest = None\n for commit in docker_commits:\n if docker_tag == commit[\"tag\"]:\n docker_digest = commit[\"digest\"]\n if docker_digest is None:\n raise ValueError(\n \"Docker tag {docker_tag} not found. Please specify a \"\n \"docker tag that exists. 
'latest' is used as \"\n \"default.\".format(docker_tag=docker_tag)\n )\n return docker_digest\n\n @tracer.start_as_current_span(\"Synapse::get_team_open_invitations\")\n def get_team_open_invitations(self, team):\n \"\"\"Retrieve the open requests submitted to a Team\n https://rest-docs.synapse.org/rest/GET/team/id/openInvitation.html\n\n Arguments:\n team: A [synapseclient.team.Team][] object or a team's ID.\n\n Yields:\n Generator of MembershipRequest\n \"\"\"\n teamid = id_of(team)\n request = \"/team/{team}/openInvitation\".format(team=teamid)\n open_requests = self._GET_paginated(request)\n return open_requests\n\n @tracer.start_as_current_span(\"Synapse::get_membership_status\")\n def get_membership_status(self, userid, team):\n \"\"\"Retrieve a user's Team Membership Status bundle.\n https://rest-docs.synapse.org/rest/GET/team/id/member/principalId/membershipStatus.html\n\n Arguments:\n user: Synapse user ID\n team: A [synapseclient.team.Team][] object or a team's ID.\n\n Returns:\n dict of TeamMembershipStatus\n \"\"\"\n teamid = id_of(team)\n request = \"/team/{team}/member/{user}/membershipStatus\".format(\n team=teamid, user=userid\n )\n membership_status = self.restGET(request)\n return membership_status\n\n @tracer.start_as_current_span(\"Synapse::_delete_membership_invitation\")\n def _delete_membership_invitation(self, invitationid):\n \"\"\"Delete open membership invitation\n\n :param invitationid: Open invitation id\n \"\"\"\n self.restDELETE(\"/membershipInvitation/{id}\".format(id=invitationid))\n\n @tracer.start_as_current_span(\"Synapse::send_membership_invitation\")\n def send_membership_invitation(\n self, teamId, inviteeId=None, inviteeEmail=None, message=None\n ):\n \"\"\"Create a membership invitation and send an email notification\n to the invitee.\n\n Arguments:\n teamId: Synapse teamId\n inviteeId: Synapse username or profile id of user\n inviteeEmail: Email of user\n message: Additional message for the user getting invited to the\n team.\n\n Returns:\n MembershipInvitation\n \"\"\"\n\n invite_request = {\"teamId\": str(teamId), \"message\": message}\n if inviteeEmail is not None:\n invite_request[\"inviteeEmail\"] = str(inviteeEmail)\n if inviteeId is not None:\n invite_request[\"inviteeId\"] = str(inviteeId)\n\n response = self.restPOST(\n \"/membershipInvitation\", body=json.dumps(invite_request)\n )\n return response\n\n @tracer.start_as_current_span(\"Synapse::invite_to_team\")\n def invite_to_team(\n self, team, user=None, inviteeEmail=None, message=None, force=False\n ):\n \"\"\"Invite user to a Synapse team via Synapse username or email\n (choose one or the other)\n\n Arguments:\n syn: Synapse object\n team: A [synapseclient.team.Team][] object or a team's ID.\n user: Synapse username or profile id of user\n inviteeEmail: Email of user\n message: Additional message for the user getting invited to the team.\n force: If an open invitation exists for the invitee, the old invite will be cancelled.\n\n Returns:\n MembershipInvitation or None if user is already a member\n \"\"\"\n # Throw error if both user and email is specified and if both not\n # specified\n id_email_specified = inviteeEmail is not None and user is not None\n id_email_notspecified = inviteeEmail is None and user is None\n if id_email_specified or id_email_notspecified:\n raise ValueError(\"Must specify either 'user' or 'inviteeEmail'\")\n\n teamid = id_of(team)\n is_member = False\n open_invitations = self.get_team_open_invitations(teamid)\n\n if user is not None:\n inviteeId = 
self.getUserProfile(user)[\"ownerId\"]\n membership_status = self.get_membership_status(inviteeId, teamid)\n is_member = membership_status[\"isMember\"]\n open_invites_to_user = [\n invitation\n for invitation in open_invitations\n if invitation.get(\"inviteeId\") == inviteeId\n ]\n else:\n inviteeId = None\n open_invites_to_user = [\n invitation\n for invitation in open_invitations\n if invitation.get(\"inviteeEmail\") == inviteeEmail\n ]\n # Only invite if the invitee is not a member and\n # if invitee doesn't have an open invitation unless force=True\n if not is_member and (not open_invites_to_user or force):\n # Delete all old invitations\n for invite in open_invites_to_user:\n self._delete_membership_invitation(invite[\"id\"])\n return self.send_membership_invitation(\n teamid, inviteeId=inviteeId, inviteeEmail=inviteeEmail, message=message\n )\n if is_member:\n not_sent_reason = \"invitee is already a member\"\n else:\n not_sent_reason = (\n \"invitee already has an open invitation. \"\n \"Set force=True to send a new invite.\"\n )\n\n self.logger.warning(\"No invitation sent: {}\".format(not_sent_reason))\n # Return None if no invite is sent.\n return None\n\n 
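# --- Editor's example (a sketch, not part of the client source). Inviting a\n # user by username and forcing a re-invite when an open invitation already\n # exists; the team id and username are hypothetical placeholders:\n #\n #     invite = syn.invite_to_team(3450948, user=\"jane.doe\", force=True)\n #     if invite is None:\n #         print(\"No invitation sent; see the warning logged above\")\n # ---------------------------------------------------------------------------\n\n 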
@tracer.start_as_current_span(\"Synapse::submit\")\n def submit(\n self,\n evaluation,\n entity,\n name=None,\n team=None,\n silent=False,\n submitterAlias=None,\n teamName=None,\n dockerTag=\"latest\",\n ):\n \"\"\"\n Submit an Entity for [evaluation][synapseclient.evaluation.Evaluation].\n\n Arguments:\n evaluation: Evaluation queue to submit to\n entity: The Entity containing the Submission\n name: A name for this submission. In the absence of this parameter, the entity name will be used.\n team: (optional) A [synapseclient.team.Team][] object, ID or name of a Team that is registered for the challenge\n silent: Set to True to suppress output.\n submitterAlias: (optional) A nickname, possibly for display in leaderboards in place of the submitter's name\n teamName: (deprecated) A synonym for submitterAlias\n dockerTag: (optional) The Docker tag must be specified if the entity is a DockerRepository.\n\n Returns:\n A [synapseclient.evaluation.Submission][] object\n\n\n In the case of challenges, a team can optionally be provided to give credit to members of the team that\n contributed to the submission. The team must be registered for the challenge with which the given evaluation is\n associated. The caller must be a member of the submitting team.\n\n Example: Using this function\n Getting an evaluation and submitting to it\n\n evaluation = syn.getEvaluation(123)\n entity = syn.get('syn456')\n submission = syn.submit(evaluation, entity, name='Our Final Answer', team='Blue Team')\n \"\"\"\n\n require_param(evaluation, \"evaluation\")\n require_param(entity, \"entity\")\n\n evaluation_id = id_of(evaluation)\n\n entity_id = id_of(entity)\n if isinstance(entity, synapseclient.DockerRepository):\n # Edge case if dockerTag is specified as None\n if dockerTag is None:\n raise ValueError(\n \"A dockerTag is required to submit a DockerEntity. Cannot be None\"\n )\n docker_repository = entity[\"repositoryName\"]\n else:\n docker_repository = None\n\n if \"versionNumber\" not in entity:\n entity = self.get(entity, downloadFile=False)\n # version defaults to 1 to hack around required version field and allow submission of files/folders\n entity_version = entity.get(\"versionNumber\", 1)\n\n # default name of submission to name of entity\n if name is None and \"name\" in entity:\n name = entity[\"name\"]\n\n team_id = None\n if team:\n team = self.getTeam(team)\n team_id = id_of(team)\n\n contributors, eligibility_hash = self._get_contributors(evaluation_id, team)\n\n # for backwards compatibility, until we remove support for teamName\n if not submitterAlias:\n if teamName:\n submitterAlias = teamName\n elif team and \"name\" in team:\n submitterAlias = team[\"name\"]\n\n if isinstance(entity, synapseclient.DockerRepository):\n docker_digest = self._get_docker_digest(entity, dockerTag)\n else:\n docker_digest = None\n\n submission = {\n \"evaluationId\": evaluation_id,\n \"name\": name,\n \"entityId\": entity_id,\n \"versionNumber\": entity_version,\n \"dockerDigest\": docker_digest,\n \"dockerRepositoryName\": docker_repository,\n \"teamId\": team_id,\n \"contributors\": contributors,\n \"submitterAlias\": submitterAlias,\n }\n\n submitted = self._submit(submission, entity[\"etag\"], eligibility_hash)\n\n # if we want to display the receipt message, we need the full object\n if not silent:\n if not (isinstance(evaluation, Evaluation)):\n evaluation = self.getEvaluation(evaluation_id)\n if \"submissionReceiptMessage\" in evaluation:\n self.logger.info(evaluation[\"submissionReceiptMessage\"])\n\n return Submission(**submitted)\n\n @tracer.start_as_current_span(\"Synapse::_submit\")\n def _submit(self, submission, entity_etag, eligibility_hash):\n require_param(submission, \"submission\")\n require_param(entity_etag, \"entity_etag\")\n # URI requires the etag of the entity and, in the case of a team submission, requires an eligibilityStateHash\n uri = \"/evaluation/submission?etag=%s\" % entity_etag\n if eligibility_hash:\n uri += \"&submissionEligibilityHash={0}\".format(eligibility_hash)\n submitted = self.restPOST(uri, json.dumps(submission))\n return submitted\n\n @tracer.start_as_current_span(\"Synapse::_get_contributors\")\n def _get_contributors(self, evaluation_id, team):\n if not evaluation_id or not team:\n return None, None\n\n team_id = id_of(team)\n # see https://rest-docs.synapse.org/rest/GET/evaluation/evalId/team/id/submissionEligibility.html\n eligibility = self.restGET(\n \"/evaluation/{evalId}/team/{id}/submissionEligibility\".format(\n evalId=evaluation_id, id=team_id\n )\n )\n\n if not eligibility[\"teamEligibility\"][\"isEligible\"]:\n # Check team eligibility and raise an exception if not eligible\n if not eligibility[\"teamEligibility\"][\"isRegistered\"]:\n raise SynapseError(\n 'Team \"{team}\" is not registered.'.format(team=team.name)\n )\n if eligibility[\"teamEligibility\"][\"isQuotaFilled\"]:\n raise SynapseError(\n 'Team \"{team}\" has already submitted the full quota of submissions.'.format(\n team=team.name\n )\n )\n raise SynapseError('Team \"{team}\" is not eligible.'.format(team=team.name))\n\n # Include all team members who are eligible.\n contributors = [\n {\"principalId\": member[\"principalId\"]}\n for member in eligibility[\"membersEligibility\"]\n if member[\"isEligible\"] and not member[\"hasConflictingSubmission\"]\n ]\n return contributors, eligibility[\"eligibilityStateHash\"]\n\n 
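# --- Editor's example (a sketch, not part of the client source). Submitting a\n # DockerRepository entity requires a tag, which submit() resolves to a digest\n # via _get_docker_digest; the ids below are hypothetical placeholders:\n #\n #     evaluation = syn.getEvaluation(9614543)\n #     repo = syn.get(\"syn999\")  # a DockerRepository entity\n #     submission = syn.submit(evaluation, repo, dockerTag=\"v1.0\")\n # ---------------------------------------------------------------------------\n\n 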
@tracer.start_as_current_span(\"Synapse::_allowParticipation\")\n def _allowParticipation(\n self,\n evaluation,\n user,\n rights=[\"READ\", \"PARTICIPATE\", \"SUBMIT\", \"UPDATE_SUBMISSION\"],\n ):\n \"\"\"\n Grants the given user the minimal access rights to join and submit to an Evaluation.\n Note: The specification of this method has not been decided yet, so the method is likely to change in future.\n\n :param evaluation: An Evaluation object or Evaluation ID\n :param user: Either a user group or the principal ID of a user to grant rights to.\n To allow all users, use \"PUBLIC\".\n To allow authenticated users, use \"AUTHENTICATED_USERS\".\n :param rights: The access rights to give to the users.\n Defaults to \"READ\", \"PARTICIPATE\", \"SUBMIT\", and \"UPDATE_SUBMISSION\".\n \"\"\"\n\n # Check to see if the user is an ID or group\n userId = -1\n try:\n # TODO: is there a better way to differentiate between a userID and a group name?\n # What if a group is named with just numbers?\n userId = int(user)\n\n # Verify that the user exists\n try:\n self.getUserProfile(userId)\n except SynapseHTTPError as err:\n if err.response.status_code == 404:\n raise SynapseError(\"The user (%s) does not exist\" % str(userId))\n raise\n\n except ValueError:\n # Fetch the ID of the user group\n userId = self._getUserbyPrincipalIdOrName(user)\n\n if not isinstance(evaluation, Evaluation):\n evaluation = self.getEvaluation(id_of(evaluation))\n\n self.setPermissions(evaluation, userId, accessType=rights, overwrite=False)\n\n @tracer.start_as_current_span(\"Synapse::getSubmissions\")\n def getSubmissions(self, evaluation, status=None, myOwn=False, limit=20, offset=0):\n \"\"\"\n Arguments:\n evaluation: Evaluation to get submissions from.\n status: Optionally filter submissions for a specific status.\n One of:\n\n - `OPEN`\n - `CLOSED`\n - `SCORED`\n - `INVALID`\n - `VALIDATED`\n - `EVALUATION_IN_PROGRESS`\n - `RECEIVED`\n - `REJECTED`\n - `ACCEPTED`\n myOwn: Determines if only your Submissions should be fetched.\n Defaults to False (all Submissions)\n limit: Limits the number of submissions in a single response.\n Because this method returns a generator and repeatedly\n fetches submissions, this argument is limiting the\n size of a single request and NOT the number of sub-\n missions returned in total.\n offset: Start iterating at a submission offset from the first submission.\n\n Yields:\n A generator over [synapseclient.evaluation.Submission][] objects for an Evaluation\n\n Example: Using this function\n Print submissions\n\n for submission in syn.getSubmissions(1234567):\n print(submission['entityId'])\n\n See:\n\n - [synapseclient.evaluation][]\n \"\"\"\n\n evaluation_id = id_of(evaluation)\n uri = \"/evaluation/%s/submission%s\" % (evaluation_id, \"\" if myOwn else \"/all\")\n\n if status is not None:\n uri += \"?status=%s\" % status\n\n for result in self._GET_paginated(uri, limit=limit, offset=offset):\n yield Submission(**result)\n\n @tracer.start_as_current_span(\"Synapse::_getSubmissionBundles\")\n def _getSubmissionBundles(\n self, evaluation, status=None, myOwn=False, limit=20, offset=0\n ):\n \"\"\"\n :param evaluation: Evaluation to get submissions from.\n :param status: Optionally filter submissions for a specific status.\n One of {OPEN, CLOSED, SCORED, INVALID}\n :param myOwn: Determines if only your Submissions should be fetched.\n Defaults to False (all Submissions)\n :param limit: Limits the number of submissions coming back from the\n service in a single response.\n :param offset: Start 
iterating at a submission offset from the first\n submission.\n\n :returns: A generator over dictionaries with keys 'submission' and 'submissionStatus'.\n\n Example::\n\n for sb in syn._getSubmissionBundles(1234567):\n print(sb['submission']['name'], \\\\\n sb['submission']['submitterAlias'], \\\\\n sb['submissionStatus']['status'], \\\\\n sb['submissionStatus']['score'])\n\n This may later be changed to return objects, pending some thought on how submissions along with related status\n and annotations should be represented in the clients.\n\n See: :py:mod:`synapseclient.evaluation`\n \"\"\"\n\n evaluation_id = id_of(evaluation)\n url = \"/evaluation/%s/submission/bundle%s\" % (\n evaluation_id,\n \"\" if myOwn else \"/all\",\n )\n if status is not None:\n url += \"?status=%s\" % status\n\n return self._GET_paginated(url, limit=limit, offset=offset)\n\n @tracer.start_as_current_span(\"Synapse::getSubmissionBundles\")\n def getSubmissionBundles(\n self, evaluation, status=None, myOwn=False, limit=20, offset=0\n ):\n \"\"\"\n Retrieve submission bundles (submission and submissions status) for an evaluation queue, optionally filtered by\n submission status and/or owner.\n\n Arguments:\n evaluation: Evaluation to get submissions from.\n status: Optionally filter submissions for a specific status.\n One of:\n\n - `OPEN`\n - `CLOSED`\n - `SCORED`\n - `INVALID`\n myOwn: Determines if only your Submissions should be fetched.\n Defaults to False (all Submissions)\n limit: Limits the number of submissions coming back from the\n service in a single response.\n offset: Start iterating at a submission offset from the first submission.\n\n Yields:\n A generator over tuples containing a [synapseclient.evaluation.Submission][] and a [synapseclient.evaluation.SubmissionStatus][].\n\n Example: Using this function\n Loop over submissions\n\n for submission, status in syn.getSubmissionBundles(evaluation):\n print(submission.name, \\\\\n submission.submitterAlias, \\\\\n status.status, \\\\\n status.score)\n\n This may later be changed to return objects, pending some thought on how submissions along with related status\n and annotations should be represented in the clients.\n\n See:\n\n - [synapseclient.evaluation][]\n \"\"\"\n for bundle in self._getSubmissionBundles(\n evaluation, status=status, myOwn=myOwn, limit=limit, offset=offset\n ):\n yield (\n Submission(**bundle[\"submission\"]),\n SubmissionStatus(**bundle[\"submissionStatus\"]),\n )\n\n @tracer.start_as_current_span(\"Synapse::_GET_paginated\")\n def _GET_paginated(self, uri, limit=20, offset=0):\n \"\"\"\n :param uri: A URI that returns paginated results\n :param limit: How many records should be returned per request\n :param offset: At what record offset from the first should iteration start\n\n :returns: A generator over some paginated results\n\n The limit parameter is set at 20 by default. 
Using a larger limit results in fewer calls to the service, but if\n responses are large enough to be a burden on the service they may be truncated.\n \"\"\"\n\n prev_num_results = sys.maxsize\n while prev_num_results > 0:\n uri = utils._limit_and_offset(uri, limit=limit, offset=offset)\n page = self.restGET(uri)\n results = page[\"results\"] if \"results\" in page else page[\"children\"]\n prev_num_results = len(results)\n\n for result in results:\n offset += 1\n yield result\n\n @tracer.start_as_current_span(\"Synapse::_POST_paginated\")\n def _POST_paginated(self, uri, body, **kwargs):\n \"\"\"\n :param uri: A URI that returns paginated results\n :param body: POST request payload\n\n :returns: A generator over some paginated results\n \"\"\"\n\n next_page_token = None\n while True:\n body[\"nextPageToken\"] = next_page_token\n response = self.restPOST(uri, body=json.dumps(body), **kwargs)\n next_page_token = response.get(\"nextPageToken\")\n for item in response[\"page\"]:\n yield item\n if next_page_token is None:\n break\n\n @tracer.start_as_current_span(\"Synapse::getSubmission\")\n def getSubmission(self, id, **kwargs):\n \"\"\"\n Gets a [synapseclient.evaluation.Submission][] object by its id.\n\n Arguments:\n id: The id of the submission to retrieve\n\n Returns:\n A [synapseclient.evaluation.Submission][] object\n\n See:\n\n - [synapseclient.Synapse.get][] for information\n on the *downloadFile*, *downloadLocation*, and *ifcollision* parameters\n \"\"\"\n\n submission_id = id_of(id)\n uri = Submission.getURI(submission_id)\n submission = Submission(**self.restGET(uri))\n\n # Pre-fetch the Entity tied to the Submission, if there is one\n if \"entityId\" in submission and submission[\"entityId\"] is not None:\n entityBundleJSON = json.loads(submission[\"entityBundleJSON\"])\n\n # getWithEntityBundle expects a bundle services v2 style\n # annotations dict, but the evaluations API may return\n # an older format annotations object in the encoded JSON\n # depending on when the original submission was made.\n annotations = entityBundleJSON.get(\"annotations\")\n if annotations:\n entityBundleJSON[\"annotations\"] = convert_old_annotation_json(\n annotations\n )\n\n related = self._getWithEntityBundle(\n entityBundle=entityBundleJSON,\n entity=submission[\"entityId\"],\n submission=submission_id,\n **kwargs,\n )\n submission.entity = related\n submission.filePath = related.get(\"path\", None)\n\n return submission\n\n @tracer.start_as_current_span(\"Synapse::getSubmissionStatus\")\n def getSubmissionStatus(self, submission):\n \"\"\"\n Retrieves the status of a Submission.\n\n Arguments:\n submission: The submission to lookup\n\n Returns:\n A [synapseclient.evaluation.SubmissionStatus][] object\n \"\"\"\n\n submission_id = id_of(submission)\n uri = SubmissionStatus.getURI(submission_id)\n val = self.restGET(uri)\n return SubmissionStatus(**val)\n\n 
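# --- Editor's example (a sketch, not part of the client source). Fetching a\n # submission without downloading its file, then checking its status; the\n # submission id is a hypothetical placeholder:\n #\n #     submission = syn.getSubmission(9690000, downloadFile=False)\n #     status = syn.getSubmissionStatus(submission)\n #     print(status.status)\n # ---------------------------------------------------------------------------\n\n 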
############################################################\n # CRUD for Wikis #\n ############################################################\n\n @tracer.start_as_current_span(\"Synapse::getWiki\")\n def getWiki(self, owner, subpageId=None, version=None):\n \"\"\"\n Get a [synapseclient.wiki.Wiki][] object from Synapse. Uses wiki2 API which supports versioning.\n\n Arguments:\n owner: The entity to which the Wiki is attached\n subpageId: The id of the specific sub-page or None to get the root Wiki page\n version: The version of the page to retrieve or None to retrieve the latest\n\n Returns:\n A [synapseclient.wiki.Wiki][] object\n \"\"\"\n uri = \"/entity/{ownerId}/wiki2\".format(ownerId=id_of(owner))\n if subpageId is not None:\n uri += \"/{wikiId}\".format(wikiId=subpageId)\n if version is not None:\n uri += \"?wikiVersion={version}\".format(version=version)\n\n wiki = self.restGET(uri)\n wiki[\"owner\"] = owner\n wiki = Wiki(**wiki)\n\n path = self.cache.get(wiki.markdownFileHandleId)\n if not path:\n cache_dir = self.cache.get_cache_dir(wiki.markdownFileHandleId)\n if not os.path.exists(cache_dir):\n os.makedirs(cache_dir)\n path = self._downloadFileHandle(\n wiki[\"markdownFileHandleId\"],\n wiki[\"id\"],\n \"WikiMarkdown\",\n os.path.join(cache_dir, str(wiki.markdownFileHandleId) + \".md\"),\n )\n try:\n import gzip\n\n with gzip.open(path) as f:\n markdown = f.read().decode(\"utf-8\")\n except IOError:\n # not gzipped; read raw bytes so the decode below still applies\n with open(path, \"rb\") as f:\n markdown = f.read().decode(\"utf-8\")\n\n wiki.markdown = markdown\n wiki.markdown_path = path\n\n return wiki\n\n @tracer.start_as_current_span(\"Synapse::getWikiHeaders\")\n def getWikiHeaders(self, owner):\n \"\"\"\n Retrieves the headers of all Wikis belonging to the owner (the entity to which the Wiki is attached).\n\n Arguments:\n owner: An Entity\n\n Returns:\n A list of Objects with three fields: id, title and parentId.\n \"\"\"\n\n uri = \"/entity/%s/wikiheadertree\" % id_of(owner)\n return [DictObject(**header) for header in self._GET_paginated(uri)]\n\n @tracer.start_as_current_span(\"Synapse::_storeWiki\")\n def _storeWiki(self, wiki, createOrUpdate): # type: (Wiki, bool) -> Wiki\n \"\"\"\n Stores or updates the given Wiki.\n\n :param wiki: A Wiki object\n\n :returns: An updated Wiki object\n \"\"\"\n # Make sure the file handle field is a list\n if \"attachmentFileHandleIds\" not in wiki:\n wiki[\"attachmentFileHandleIds\"] = []\n\n # Convert all attachments into file handles\n if wiki.get(\"attachments\") is not None:\n for attachment in wiki[\"attachments\"]:\n fileHandle = upload_synapse_s3(self, attachment)\n wiki[\"attachmentFileHandleIds\"].append(fileHandle[\"id\"])\n del wiki[\"attachments\"]\n\n # Perform an update if the Wiki has an ID\n if \"id\" in wiki:\n updated_wiki = Wiki(\n owner=wiki.ownerId, **self.restPUT(wiki.putURI(), wiki.json())\n )\n\n # Perform a create if the Wiki has no ID\n else:\n try:\n updated_wiki = Wiki(\n owner=wiki.ownerId, **self.restPOST(wiki.postURI(), wiki.json())\n )\n except SynapseHTTPError as err:\n # If already present we get an unhelpful SQL error\n if createOrUpdate and (\n (\n err.response.status_code == 400\n and \"DuplicateKeyException\" in err.message\n )\n or err.response.status_code == 409\n ):\n existing_wiki = self.getWiki(wiki.ownerId)\n\n # overwrite everything except for the etag (this will keep unmodified fields in the existing wiki)\n etag = existing_wiki[\"etag\"]\n existing_wiki.update(wiki)\n existing_wiki.etag = etag\n\n updated_wiki = Wiki(\n owner=wiki.ownerId,\n **self.restPUT(existing_wiki.putURI(), existing_wiki.json()),\n )\n else:\n raise\n return updated_wiki\n\n @tracer.start_as_current_span(\"Synapse::getWikiAttachments\")\n def getWikiAttachments(self, wiki):\n \"\"\"\n Retrieve the attachments to a wiki page.\n\n Arguments:\n wiki: The Wiki object for which the attachments are to be 
returned.\n\n Returns:\n A list of file handles for the files attached to the Wiki.\n \"\"\"\n uri = \"/entity/%s/wiki/%s/attachmenthandles\" % (wiki.ownerId, wiki.id)\n results = self.restGET(uri)\n file_handles = list(WikiAttachment(**fh) for fh in results[\"list\"])\n return file_handles\n\n ############################################################\n # Tables #\n ############################################################\n\n @tracer.start_as_current_span(\"Synapse::_waitForAsync\")\n def _waitForAsync(self, uri, request, endpoint=None):\n if endpoint is None:\n endpoint = self.repoEndpoint\n async_job_id = self.restPOST(\n uri + \"/start\", body=json.dumps(request), endpoint=endpoint\n )\n\n # https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/asynch/AsynchronousJobStatus.html\n sleep = self.table_query_sleep\n start_time = time.time()\n lastMessage, lastProgress, lastTotal, progressed = \"\", 0, 1, False\n while time.time() - start_time < self.table_query_timeout:\n result = self.restGET(\n uri + \"/get/%s\" % async_job_id[\"token\"], endpoint=endpoint\n )\n if result.get(\"jobState\", None) == \"PROCESSING\":\n progressed = True\n message = result.get(\"progressMessage\", lastMessage)\n progress = result.get(\"progressCurrent\", lastProgress)\n total = result.get(\"progressTotal\", lastTotal)\n if message != \"\":\n self._print_transfer_progress(\n progress, total, message, isBytes=False\n )\n # Reset the time if we made progress (fix SYNPY-214)\n if message != lastMessage or lastProgress != progress:\n start_time = time.time()\n lastMessage, lastProgress, lastTotal = message, progress, total\n sleep = min(\n self.table_query_max_sleep, sleep * self.table_query_backoff\n )\n doze(sleep)\n else:\n break\n else:\n raise SynapseTimeoutError(\n \"Timeout waiting for query results: %0.1f seconds \"\n % (time.time() - start_time)\n )\n if result.get(\"jobState\", None) == \"FAILED\":\n # guard against missing error fields so the concatenation cannot fail\n raise SynapseError(\n (result.get(\"errorMessage\") or \"\")\n + \"\\n\"\n + (result.get(\"errorDetails\") or \"\"),\n asynchronousJobStatus=result,\n )\n if progressed:\n self._print_transfer_progress(total, total, message, isBytes=False)\n return result\n\n @tracer.start_as_current_span(\"Synapse::getColumn\")\n def getColumn(self, id):\n \"\"\"\n Gets a Column object from Synapse by ID.\n\n See: [synapseclient.table.Column][]\n\n Arguments:\n id: The ID of the column to retrieve\n\n Returns:\n An object of type [synapseclient.table.Column][]\n\n\n Example: Using this function\n Getting a column\n\n column = syn.getColumn(123)\n \"\"\"\n return Column(**self.restGET(Column.getURI(id)))\n\n 
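# --- Editor's example (a sketch, not part of the client source). The method\n # below also accepts a plain string prefix; iterating columns whose names\n # start with a hypothetical prefix:\n #\n #     for col in syn.getColumns(\"age\"):\n #         print(col.name, col.columnType)\n # ---------------------------------------------------------------------------\n\n 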
like \"AVG(Foo)\"\n int(header)\n yield self.getColumn(header)\n except ValueError:\n # ignore aggregate column\n pass\n elif isinstance(x, SchemaBase) or utils.is_synapse_id_str(x):\n for col in self.getTableColumns(x):\n yield col\n elif isinstance(x, str):\n uri = \"/column?prefix=\" + x\n for result in self._GET_paginated(uri, limit=limit, offset=offset):\n yield Column(**result)\n else:\n ValueError(\"Can't get columns for a %s\" % type(x))\n\n @tracer.start_as_current_span(\"Synapse::create_snapshot_version\")\n def create_snapshot_version(\n self,\n table: typing.Union[\n EntityViewSchema, Schema, str, SubmissionViewSchema, Dataset\n ],\n comment: str = None,\n label: str = None,\n activity: typing.Union[Activity, str] = None,\n wait: bool = True,\n ) -> int:\n \"\"\"Create a new Table Version, new View version, or new Dataset version.\n\n Arguments:\n table: The schema of the Table/View, or its ID.\n comment: Optional snapshot comment.\n label: Optional snapshot label.\n activity: Optional activity ID applied to snapshot version.\n wait: True if this method should return the snapshot version after waiting for any necessary\n asynchronous table updates to complete. If False this method will return\n as soon as any updates are initiated.\n\n Returns:\n The snapshot version number if wait=True, None if wait=False\n \"\"\"\n ent = self.get(id_of(table), downloadFile=False)\n if isinstance(ent, (EntityViewSchema, SubmissionViewSchema, Dataset)):\n result = self._async_table_update(\n table,\n create_snapshot=True,\n comment=comment,\n label=label,\n activity=activity,\n wait=wait,\n )\n elif isinstance(ent, Schema):\n result = self._create_table_snapshot(\n table,\n comment=comment,\n label=label,\n activity=activity,\n )\n else:\n raise ValueError(\n \"This function only accepts Synapse ids of Tables or Views\"\n )\n\n # for consistency we return nothing if wait=False since we can't\n # supply the snapshot version on an async table update without waiting\n return result[\"snapshotVersionNumber\"] if wait else None\n\n @tracer.start_as_current_span(\"Synapse::_create_table_snapshot\")\n def _create_table_snapshot(\n self,\n table: typing.Union[Schema, str],\n comment: str = None,\n label: str = None,\n activity: typing.Union[Activity, str] = None,\n ) -> dict:\n \"\"\"Creates Table snapshot\n\n :param table: The schema of the Table\n :param comment: Optional snapshot comment.\n :param label: Optional snapshot label.\n :param activity: Optional activity ID or activity instance applied to snapshot version.\n\n :return: Snapshot Response\n \"\"\"\n\n # check the activity id or object is provided\n activity_id = None\n if isinstance(activity, collections.abc.Mapping):\n if \"id\" not in activity:\n activity = self._saveActivity(activity)\n activity_id = activity[\"id\"]\n elif activity is not None:\n activity_id = str(activity)\n\n snapshot_body = {\n \"snapshotComment\": comment,\n \"snapshotLabel\": label,\n \"snapshotActivityId\": activity_id,\n }\n new_body = {\n key: value for key, value in snapshot_body.items() if value is not None\n }\n snapshot = self.restPOST(\n \"/entity/{}/table/snapshot\".format(id_of(table)), body=json.dumps(new_body)\n )\n return snapshot\n\n @tracer.start_as_current_span(\"Synapse::_async_table_update\")\n def _async_table_update(\n self,\n table: typing.Union[EntityViewSchema, Schema, str, SubmissionViewSchema],\n changes: typing.List[dict] = [],\n create_snapshot: bool = False,\n comment: str = None,\n label: str = None,\n activity: str = None,\n wait: bool 
= True,\n ) -> dict:\n \"\"\"Creates view updates and snapshots\n\n :param table: The schema of the EntityView or its ID.\n :param changes: Array of Table changes\n :param create_snapshot: Create snapshot\n :param comment: Optional snapshot comment.\n :param label: Optional snapshot label.\n :param activity: Optional activity ID applied to snapshot version.\n :param wait: True to wait for async table update to complete\n\n :return: Snapshot Response\n \"\"\"\n snapshot_options = {\n \"snapshotComment\": comment,\n \"snapshotLabel\": label,\n \"snapshotActivityId\": activity,\n }\n new_snapshot = {\n key: value for key, value in snapshot_options.items() if value is not None\n }\n table_update_body = {\n \"changes\": changes,\n \"createSnapshot\": create_snapshot,\n \"snapshotOptions\": new_snapshot,\n }\n\n uri = \"/entity/{}/table/transaction/async\".format(id_of(table))\n\n if wait:\n result = self._waitForAsync(uri, table_update_body)\n\n else:\n result = self.restPOST(\n \"{}/start\".format(uri), body=json.dumps(table_update_body)\n )\n\n return result\n\n @tracer.start_as_current_span(\"Synapse::getTableColumns\")\n def getTableColumns(self, table):\n \"\"\"\n Retrieve the column models used in the given table schema.\n\n Arguments:\n table: The schema of the Table whose columns are to be retrieved\n\n Yields:\n A Generator over the Table's [columns][synapseclient.table.Column]\n \"\"\"\n uri = \"/entity/{id}/column\".format(id=id_of(table))\n # The returned object type for this service, PaginatedColumnModels, is a misnomer.\n # This service always returns the full list of results so the pagination does not not actually matter.\n for result in self.restGET(uri)[\"results\"]:\n yield Column(**result)\n\n @tracer.start_as_current_span(\"Synapse::tableQuery\")\n def tableQuery(self, query, resultsAs=\"csv\", **kwargs):\n \"\"\"\n Query a Synapse Table.\n\n\n :param query: query string in a `SQL-like syntax \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html>`_, for example\n \"SELECT * from syn12345\"\n\n :param resultsAs: select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as\n sets of rows (\"rowset\").\n\n\n :param query: query string in a `SQL-like syntax \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html>`_, for example\n \"SELECT * from syn12345\"\n\n :param resultsAs: select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as\n sets of rows (\"rowset\").\n\n You can receive query results either as a generator over rows or as a CSV file. For smallish tables, either\n method will work equally well. 
Use of a \"rowset\" generator allows rows to be processed one at a time and\n processing may be stopped before downloading the entire table.\n\n Optional keyword arguments differ for the two return types of `rowset` or `csv`\n\n Arguments:\n query: Query string in a [SQL-like syntax](https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html), for example: `\"SELECT * from syn12345\"`\n resultsAs: select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as sets of rows (\"rowset\")\n limit: (rowset only) Specify the maximum number of rows to be returned, defaults to None\n offset: (rowset only) Don't return the first n rows, defaults to None\n quoteCharacter: (csv only) default double quote\n escapeCharacter: (csv only) default backslash\n lineEnd: (csv only) defaults to os.linesep\n separator: (csv only) defaults to comma\n header: (csv only) True by default\n includeRowIdAndRowVersion: (csv only) True by default\n downloadLocation: (csv only) directory path to download the CSV file to\n\n\n Returns:\n A [TableQueryResult][synapseclient.table.TableQueryResult] or [CsvFileTable][synapseclient.table.CsvFileTable] object\n\n\n NOTE:\n When performing queries on frequently updated tables, the table can be inaccessible for a period leading\n to a timeout of the query. Since the results are guaranteed to eventually be returned you can change the\n max timeout by setting the table_query_timeout variable of the Synapse object:\n\n # Sets the max timeout to 5 minutes.\n syn.table_query_timeout = 300\n\n \"\"\"\n if resultsAs.lower() == \"rowset\":\n return TableQueryResult(self, query, **kwargs)\n elif resultsAs.lower() == \"csv\":\n # TODO: remove isConsistent because it has now been deprecated\n # from the backend\n if kwargs.get(\"isConsistent\") is not None:\n kwargs.pop(\"isConsistent\")\n return CsvFileTable.from_table_query(self, query, **kwargs)\n else:\n raise ValueError(\n \"Unknown return type requested from tableQuery: \" + str(resultsAs)\n )\n\n @tracer.start_as_current_span(\"Synapse::_queryTable\")\n def _queryTable(\n self, query, limit=None, offset=None, isConsistent=True, partMask=None\n ):\n \"\"\"\n Query a table and return the first page of results as a `QueryResultBundle \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/QueryResultBundle.html>`_.\n If the result contains a *nextPageToken*, following pages a retrieved by calling :py:meth:`~._queryTableNext`.\n\n :param partMask: Optional, default all. 
The 'partsMask' is a bit field for requesting\n different elements in the resulting JSON bundle.\n Query Results (queryResults) = 0x1\n Query Count (queryCount) = 0x2\n Select Columns (selectColumns) = 0x4\n Max Rows Per Page (maxRowsPerPage) = 0x8\n \"\"\"\n\n # See: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/QueryBundleRequest.html\n query_bundle_request = {\n \"concreteType\": \"org.sagebionetworks.repo.model.table.QueryBundleRequest\",\n \"query\": {\n \"sql\": query,\n \"isConsistent\": isConsistent,\n \"includeEntityEtag\": True,\n },\n }\n\n if partMask:\n query_bundle_request[\"partMask\"] = partMask\n if limit is not None:\n query_bundle_request[\"query\"][\"limit\"] = limit\n if offset is not None:\n query_bundle_request[\"query\"][\"offset\"] = offset\n query_bundle_request[\"query\"][\"isConsistent\"] = isConsistent\n\n uri = \"/entity/{id}/table/query/async\".format(\n id=extract_synapse_id_from_query(query)\n )\n\n return self._waitForAsync(uri=uri, request=query_bundle_request)\n\n @tracer.start_as_current_span(\"Synapse::_queryTableNext\")\n def _queryTableNext(self, nextPageToken, tableId):\n uri = \"/entity/{id}/table/query/nextPage/async\".format(id=tableId)\n return self._waitForAsync(uri=uri, request=nextPageToken)\n\n @tracer.start_as_current_span(\"Synapse::_uploadCsv\")\n def _uploadCsv(\n self,\n filepath,\n schema,\n updateEtag=None,\n quoteCharacter='\"',\n escapeCharacter=\"\\\\\",\n lineEnd=os.linesep,\n separator=\",\",\n header=True,\n linesToSkip=0,\n ):\n \"\"\"\n Send an `UploadToTableRequest \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/UploadToTableRequest.html>`_ to Synapse.\n\n :param filepath: Path of a `CSV <https://en.wikipedia.org/wiki/Comma-separated_values>`_ file.\n :param schema: A table entity or its Synapse ID.\n :param updateEtag: Any RowSet returned from Synapse will contain the current etag of the change set.\n To update any rows from a RowSet the etag must be provided with the POST.\n\n :returns: `UploadToTableResult \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/UploadToTableResult.html>`_\n \"\"\"\n\n fileHandleId = multipart_upload_file(self, filepath, content_type=\"text/csv\")\n\n uploadRequest = {\n \"concreteType\": \"org.sagebionetworks.repo.model.table.UploadToTableRequest\",\n \"csvTableDescriptor\": {\n \"isFirstLineHeader\": header,\n \"quoteCharacter\": quoteCharacter,\n \"escapeCharacter\": escapeCharacter,\n \"lineEnd\": lineEnd,\n \"separator\": separator,\n },\n \"linesToSkip\": linesToSkip,\n \"tableId\": id_of(schema),\n \"uploadFileHandleId\": fileHandleId,\n }\n\n if updateEtag:\n uploadRequest[\"updateEtag\"] = updateEtag\n\n response = self._async_table_update(schema, changes=[uploadRequest], wait=True)\n self._check_table_transaction_response(response)\n\n return response\n\n @tracer.start_as_current_span(\"Synapse::_check_table_transaction_response\")\n def _check_table_transaction_response(self, response):\n for result in response[\"results\"]:\n result_type = result[\"concreteType\"]\n\n if result_type in {\n concrete_types.ROW_REFERENCE_SET_RESULTS,\n concrete_types.TABLE_SCHEMA_CHANGE_RESPONSE,\n concrete_types.UPLOAD_TO_TABLE_RESULT,\n }:\n # if these fail, it we would have gotten an HttpError before the results came back\n pass\n elif result_type == concrete_types.ENTITY_UPDATE_RESULTS:\n # TODO: output full response to error file when the logging JIRA issue gets pulled in\n successful_updates = []\n failed_updates = 
[]\n for update_result in result[\"updateResults\"]:\n failure_code = update_result.get(\"failureCode\")\n failure_message = update_result.get(\"failureMessage\")\n entity_id = update_result.get(\"entityId\")\n if failure_code or failure_message:\n failed_updates.append(update_result)\n else:\n successful_updates.append(entity_id)\n\n if failed_updates:\n raise SynapseError(\n \"Not all of the entities were updated.\"\n \" Successful updates: %s. Failed updates: %s\"\n % (successful_updates, failed_updates)\n )\n\n else:\n warnings.warn(\n \"Unexpected result from a table transaction of type [%s].\"\n \" Please check the result to make sure it is correct. %s\"\n % (result_type, result)\n )\n\n @tracer.start_as_current_span(\"Synapse::_queryTableCsv\")\n def _queryTableCsv(\n self,\n query,\n quoteCharacter='\"',\n escapeCharacter=\"\\\\\",\n lineEnd=os.linesep,\n separator=\",\",\n header=True,\n includeRowIdAndRowVersion=True,\n downloadLocation=None,\n ):\n \"\"\"\n Query a Synapse Table and download a CSV file containing the results.\n\n Sends a `DownloadFromTableRequest \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/DownloadFromTableRequest.html>`_ to Synapse.\n\n :return: a tuple containing a `DownloadFromTableResult \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/DownloadFromTableResult.html>`_\n\n The DownloadFromTableResult object contains these fields:\n * headers: ARRAY<STRING>, The list of ColumnModel IDs that describes the rows of this set.\n * resultsFileHandleId: STRING, The resulting file handle ID can be used to download the CSV file created by\n this query.\n * concreteType: STRING\n * etag: STRING, Any RowSet returned from Synapse will contain the current etag of the change\n set.\n To update any rows from a RowSet the etag must be provided with the POST.\n * tableId: STRING, The ID of the table identified in the from clause of the table query.\n \"\"\"\n\n download_from_table_request = {\n \"concreteType\": \"org.sagebionetworks.repo.model.table.DownloadFromTableRequest\",\n \"csvTableDescriptor\": {\n \"isFirstLineHeader\": header,\n \"quoteCharacter\": quoteCharacter,\n \"escapeCharacter\": escapeCharacter,\n \"lineEnd\": lineEnd,\n \"separator\": separator,\n },\n \"sql\": query,\n \"writeHeader\": header,\n \"includeRowIdAndRowVersion\": includeRowIdAndRowVersion,\n \"includeEntityEtag\": True,\n }\n\n uri = \"/entity/{id}/table/download/csv/async\".format(\n id=extract_synapse_id_from_query(query)\n )\n download_from_table_result = self._waitForAsync(\n uri=uri, request=download_from_table_request\n )\n file_handle_id = download_from_table_result[\"resultsFileHandleId\"]\n cached_file_path = self.cache.get(\n file_handle_id=file_handle_id, path=downloadLocation\n )\n if cached_file_path is not None:\n return download_from_table_result, cached_file_path\n\n if downloadLocation:\n download_dir = self._ensure_download_location_is_directory(downloadLocation)\n else:\n download_dir = self.cache.get_cache_dir(file_handle_id)\n\n os.makedirs(download_dir, exist_ok=True)\n filename = f\"SYNAPSE_TABLE_QUERY_{file_handle_id}.csv\"\n path = self._downloadFileHandle(\n file_handle_id,\n extract_synapse_id_from_query(query),\n \"TableEntity\",\n os.path.join(download_dir, filename),\n )\n\n return download_from_table_result, path\n\n # This is redundant with syn.store(Column(...)) and will be removed unless people prefer this method.\n @tracer.start_as_current_span(\"Synapse::createColumn\")\n def createColumn(\n self, 
name, columnType, maximumSize=None, defaultValue=None, enumValues=None\n ):\n columnModel = Column(\n name=name,\n columnType=columnType,\n maximumSize=maximumSize,\n defaultValue=defaultValue,\n enumValue=enumValues,\n )\n return Column(**self.restPOST(\"/column\", json.dumps(columnModel)))\n\n @tracer.start_as_current_span(\"Synapse::createColumns\")\n def createColumns(self, columns: typing.List[Column]) -> typing.List[Column]:\n \"\"\"\n Creates a batch of [synapseclient.table.Column][]'s within a single request.\n\n Arguments:\n columns: A list of [synapseclient.table.Column][]'s\n\n Returns:\n A list of [synapseclient.table.Column][]'s that have been created in Synapse\n \"\"\"\n request_body = {\n \"concreteType\": \"org.sagebionetworks.repo.model.ListWrapper\",\n \"list\": list(columns),\n }\n response = self.restPOST(\"/column/batch\", json.dumps(request_body))\n return [Column(**col) for col in response[\"list\"]]\n\n @tracer.start_as_current_span(\"Synapse::_getColumnByName\")\n def _getColumnByName(self, schema, column_name):\n \"\"\"\n Given a schema and a column name, get the corresponding py:class:`Column` object.\n \"\"\"\n for column in self.getColumns(schema):\n if column.name == column_name:\n return column\n return None\n\n @tracer.start_as_current_span(\"Synapse::downloadTableColumns\")\n def downloadTableColumns(self, table, columns, downloadLocation=None, **kwargs):\n \"\"\"\n Bulk download of table-associated files.\n\n Arguments:\n table: Table query result\n columns: A list of column names as strings\n downloadLocation: Directory into which to download the files\n\n Returns:\n A dictionary from file handle ID to path in the local file system.\n\n For example, consider a Synapse table whose ID is \"syn12345\" with two columns of type FILEHANDLEID named 'foo'\n and 'bar'. The associated files are JSON encoded, so we might retrieve the files from Synapse and load for the\n second 100 of those rows as shown here:\n\n import json\n\n results = syn.tableQuery('SELECT * FROM syn12345 LIMIT 100 OFFSET 100')\n file_map = syn.downloadTableColumns(results, ['foo', 'bar'])\n\n for file_handle_id, path in file_map.items():\n with open(path) as f:\n data[file_handle_id] = f.read()\n\n \"\"\"\n\n RETRIABLE_FAILURE_CODES = [\"EXCEEDS_SIZE_LIMIT\"]\n MAX_DOWNLOAD_TRIES = 100\n max_files_per_request = kwargs.get(\"max_files_per_request\", 2500)\n # Rowset tableQuery result not allowed\n if isinstance(table, TableQueryResult):\n raise ValueError(\n \"downloadTableColumn doesn't work with rowsets. 
Please use default tableQuery settings.\"\n )\n if isinstance(columns, str):\n columns = [columns]\n if not isinstance(columns, collections.abc.Iterable):\n raise TypeError(\"Columns parameter requires a list of column names\")\n\n (\n file_handle_associations,\n file_handle_to_path_map,\n ) = self._build_table_download_file_handle_list(\n table,\n columns,\n downloadLocation,\n )\n\n self.logger.info(\n \"Downloading %d files, %d cached locally\"\n % (len(file_handle_associations), len(file_handle_to_path_map))\n )\n\n permanent_failures = collections.OrderedDict()\n\n attempts = 0\n while len(file_handle_associations) > 0 and attempts < MAX_DOWNLOAD_TRIES:\n attempts += 1\n\n file_handle_associations_batch = file_handle_associations[\n :max_files_per_request\n ]\n\n # ------------------------------------------------------------\n # call async service to build zip file\n # ------------------------------------------------------------\n\n # returns a BulkFileDownloadResponse:\n # https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/file/BulkFileDownloadResponse.html\n request = dict(\n concreteType=\"org.sagebionetworks.repo.model.file.BulkFileDownloadRequest\",\n requestedFiles=file_handle_associations_batch,\n )\n response = self._waitForAsync(\n uri=\"/file/bulk/async\",\n request=request,\n endpoint=self.fileHandleEndpoint,\n )\n\n # ------------------------------------------------------------\n # download zip file\n # ------------------------------------------------------------\n\n temp_dir = tempfile.mkdtemp()\n zipfilepath = os.path.join(temp_dir, \"table_file_download.zip\")\n try:\n zipfilepath = self._downloadFileHandle(\n response[\"resultZipFileHandleId\"],\n table.tableId,\n \"TableEntity\",\n zipfilepath,\n )\n # TODO handle case when no zip file is returned\n # TODO test case when we give it partial or all bad file handles\n # TODO test case with deleted fileHandleID\n # TODO return null for permanent failures\n\n # ------------------------------------------------------------\n # unzip into cache\n # ------------------------------------------------------------\n\n if downloadLocation:\n download_dir = self._ensure_download_location_is_directory(\n downloadLocation\n )\n\n with zipfile.ZipFile(zipfilepath) as zf:\n # the directory structure within the zip follows that of the cache:\n # {fileHandleId modulo 1000}/{fileHandleId}/{fileName}\n for summary in response[\"fileSummary\"]:\n if summary[\"status\"] == \"SUCCESS\":\n if not downloadLocation:\n download_dir = self.cache.get_cache_dir(\n summary[\"fileHandleId\"]\n )\n\n filepath = extract_zip_file_to_directory(\n zf, summary[\"zipEntryName\"], download_dir\n )\n self.cache.add(summary[\"fileHandleId\"], filepath)\n file_handle_to_path_map[summary[\"fileHandleId\"]] = filepath\n elif summary[\"failureCode\"] not in RETRIABLE_FAILURE_CODES:\n permanent_failures[summary[\"fileHandleId\"]] = summary\n finally:\n if os.path.exists(zipfilepath):\n os.remove(zipfilepath)\n\n # Do we have remaining files to download?\n file_handle_associations = [\n fha\n for fha in file_handle_associations\n if fha[\"fileHandleId\"] not in file_handle_to_path_map\n and fha[\"fileHandleId\"] not in permanent_failures.keys()\n ]\n\n # TODO if there are files we still haven't downloaded\n\n return file_handle_to_path_map\n\n @tracer.start_as_current_span(\"Synapse::_build_table_download_file_handle_list\")\n def _build_table_download_file_handle_list(self, table, columns, downloadLocation):\n # 
------------------------------------------------------------\n # build list of file handles to download\n # ------------------------------------------------------------\n cols_not_found = [\n c for c in columns if c not in [h.name for h in table.headers]\n ]\n if len(cols_not_found) > 0:\n raise ValueError(\n \"Columns not found: \"\n + \", \".join('\"' + col + '\"' for col in cols_not_found)\n )\n col_indices = [i for i, h in enumerate(table.headers) if h.name in columns]\n # see: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/file/BulkFileDownloadRequest.html\n file_handle_associations = []\n file_handle_to_path_map = collections.OrderedDict()\n seen_file_handle_ids = (\n set()\n ) # ensure not sending duplicate requests for the same FileHandle IDs\n for row in table:\n for col_index in col_indices:\n file_handle_id = row[col_index]\n if is_integer(file_handle_id):\n path_to_cached_file = self.cache.get(\n file_handle_id, path=downloadLocation\n )\n if path_to_cached_file:\n file_handle_to_path_map[file_handle_id] = path_to_cached_file\n elif file_handle_id not in seen_file_handle_ids:\n file_handle_associations.append(\n dict(\n associateObjectType=\"TableEntity\",\n fileHandleId=file_handle_id,\n associateObjectId=table.tableId,\n )\n )\n seen_file_handle_ids.add(file_handle_id)\n else:\n warnings.warn(\"Weird file handle: %s\" % file_handle_id)\n return file_handle_associations, file_handle_to_path_map\n\n @tracer.start_as_current_span(\"Synapse::_get_default_view_columns\")\n def _get_default_view_columns(self, view_type, view_type_mask=None):\n \"\"\"Get default view columns\"\"\"\n uri = f\"/column/tableview/defaults?viewEntityType={view_type}\"\n if view_type_mask:\n uri += f\"&viewTypeMask={view_type_mask}\"\n return [Column(**col) for col in self.restGET(uri)[\"list\"]]\n\n @tracer.start_as_current_span(\"Synapse::_get_annotation_view_columns\")\n def _get_annotation_view_columns(\n self, scope_ids: list, view_type: str, view_type_mask: str = None\n ) -> list:\n \"\"\"Get all the columns of a submission of entity view based on existing annotations\n\n :param scope_ids: List of Evaluation Queue or Project/Folder Ids\n :param view_type: submissionview or entityview\n :param view_type_mask: Bit mask representing the types to include in the view.\n\n :returns: list of columns\n \"\"\"\n columns = []\n next_page_token = None\n while True:\n view_scope = {\n \"concreteType\": \"org.sagebionetworks.repo.model.table.ViewColumnModelRequest\",\n \"viewScope\": {\n \"scope\": scope_ids,\n \"viewEntityType\": view_type,\n \"viewTypeMask\": view_type_mask,\n },\n }\n if next_page_token:\n view_scope[\"nextPageToken\"] = next_page_token\n response = self._waitForAsync(\n uri=\"/column/view/scope/async\", request=view_scope\n )\n columns.extend(Column(**column) for column in response[\"results\"])\n next_page_token = response.get(\"nextPageToken\")\n if next_page_token is None:\n break\n return columns\n\n ############################################################\n # CRUD for Entities (properties) #\n ############################################################\n\n @tracer.start_as_current_span(\"Synapse::_getEntity\")\n def _getEntity(self, entity, version=None):\n \"\"\"\n Get an entity from Synapse.\n\n :param entity: A Synapse ID, a dictionary representing an Entity, or a Synapse Entity object\n :param version: The version number to fetch\n\n :returns: A dictionary containing an Entity's properties\n \"\"\"\n\n uri = \"/entity/\" + id_of(entity)\n if version:\n uri += 
\"/version/%d\" % version\n return self.restGET(uri)\n\n @tracer.start_as_current_span(\"Synapse::_createEntity\")\n def _createEntity(self, entity):\n \"\"\"\n Create a new entity in Synapse.\n\n :param entity: A dictionary representing an Entity or a Synapse Entity object\n\n :returns: A dictionary containing an Entity's properties\n \"\"\"\n\n return self.restPOST(uri=\"/entity\", body=json.dumps(get_properties(entity)))\n\n @tracer.start_as_current_span(\"Synapse::_updateEntity\")\n def _updateEntity(self, entity, incrementVersion=True, versionLabel=None):\n \"\"\"\n Update an existing entity in Synapse.\n\n :param entity: A dictionary representing an Entity or a Synapse Entity object\n :param incrementVersion: whether to increment the entity version (if Versionable)\n :param versionLabel: a label for the entity version (if Versionable)\n\n\n :returns: A dictionary containing an Entity's properties\n \"\"\"\n\n uri = \"/entity/%s\" % id_of(entity)\n\n params = {}\n if is_versionable(entity):\n if versionLabel:\n # a versionLabel implicitly implies incrementing\n incrementVersion = True\n elif incrementVersion and \"versionNumber\" in entity:\n versionLabel = str(entity[\"versionNumber\"] + 1)\n\n if incrementVersion:\n entity[\"versionLabel\"] = versionLabel\n params[\"newVersion\"] = \"true\"\n\n return self.restPUT(uri, body=json.dumps(get_properties(entity)), params=params)\n\n @tracer.start_as_current_span(\"Synapse::findEntityId\")\n def findEntityId(self, name, parent=None):\n \"\"\"\n Find an Entity given its name and parent.\n\n Arguments:\n name: Name of the entity to find\n parent: An Entity object or the Id of an entity as a string. Omit if searching for a Project by name\n\n Returns:\n The Entity ID or None if not found\n \"\"\"\n # when we want to search for a project by name. 
set parentId as None instead of ROOT_ENTITY\n entity_lookup_request = {\n \"parentId\": id_of(parent) if parent else None,\n \"entityName\": name,\n }\n try:\n return self.restPOST(\n \"/entity/child\", body=json.dumps(entity_lookup_request)\n ).get(\"id\")\n except SynapseHTTPError as e:\n if (\n e.response.status_code == 404\n ): # a 404 error is raised if the entity does not exist\n return None\n raise\n\n ############################################################\n # Send Message #\n ############################################################\n @tracer.start_as_current_span(\"Synapse::sendMessage\")\n def sendMessage(\n self, userIds, messageSubject, messageBody, contentType=\"text/plain\"\n ):\n \"\"\"\n send a message via Synapse.\n\n Arguments:\n userIds: A list of user IDs to which the message is to be sent\n messageSubject: The subject for the message\n messageBody: The body of the message\n contentType: optional contentType of message body (default=\"text/plain\")\n Should be one of \"text/plain\" or \"text/html\"\n\n Returns:\n The metadata of the created message\n \"\"\"\n\n fileHandleId = multipart_upload_string(\n self, messageBody, content_type=contentType\n )\n message = dict(\n recipients=userIds, subject=messageSubject, fileHandleId=fileHandleId\n )\n return self.restPOST(uri=\"/message\", body=json.dumps(message))\n\n ############################################################\n # Low level Rest calls #\n ############################################################\n\n def _generate_headers(self, headers=None):\n \"\"\"Generate headers (auth headers produced separately by credentials object)\"\"\"\n\n if headers is None:\n headers = dict(self.default_headers)\n headers.update(synapseclient.USER_AGENT)\n\n return headers\n\n def _handle_synapse_http_error(self, response):\n \"\"\"Raise errors as appropriate for returned Synapse http status codes\"\"\"\n\n try:\n exceptions._raise_for_status(response, verbose=self.debug)\n except exceptions.SynapseHTTPError as ex:\n # if we get a unauthenticated or forbidden error and the user is not logged in\n # then we raise it as an authentication error.\n # we can't know for certain that logging in to their particular account will grant them\n # access to this resource but more than likely it's the cause of this error.\n if response.status_code in (401, 403) and not self.credentials:\n raise SynapseAuthenticationError(\n \"You are not logged in and do not have access to a requested resource.\"\n ) from ex\n\n raise\n\n def _rest_call(\n self,\n method,\n uri,\n data,\n endpoint,\n headers,\n retryPolicy,\n requests_session,\n **kwargs,\n ):\n uri, headers = self._build_uri_and_headers(\n uri, endpoint=endpoint, headers=headers\n )\n\n retryPolicy = self._build_retry_policy(retryPolicy)\n requests_session = requests_session or self._requests_session\n\n auth = kwargs.pop(\"auth\", self.credentials)\n requests_method_fn = getattr(requests_session, method)\n response = with_retry(\n lambda: requests_method_fn(\n uri,\n data=data,\n headers=headers,\n auth=auth,\n **kwargs,\n ),\n verbose=self.debug,\n **retryPolicy,\n )\n\n self._handle_synapse_http_error(response)\n return response\n\n @tracer.start_as_current_span(\"Synapse::restGET\")\n def restGET(\n self,\n uri,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n ):\n \"\"\"\n Sends an HTTP GET request to the Synapse server.\n\n Arguments:\n uri: URI on which get is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n 
headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: An external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns:\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"get\", uri, None, endpoint, headers, retryPolicy, requests_session, **kwargs\n )\n return self._return_rest_body(response)\n\n @tracer.start_as_current_span(\"Synapse::restPOST\")\n def restPOST(\n self,\n uri,\n body,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n ):\n \"\"\"\n Sends an HTTP POST request to the Synapse server.\n\n Arguments:\n uri: URI on which get is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n body: The payload to be delivered\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: an external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns:\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"post\",\n uri,\n body,\n endpoint,\n headers,\n retryPolicy,\n requests_session,\n **kwargs,\n )\n return self._return_rest_body(response)\n\n @tracer.start_as_current_span(\"Synapse::restPUT\")\n def restPUT(\n self,\n uri,\n body=None,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n ):\n \"\"\"\n Sends an HTTP PUT request to the Synapse server.\n\n Arguments:\n uri: URI on which get is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n body: The payload to be delivered\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: Sn external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"put\", uri, body, endpoint, headers, retryPolicy, requests_session, **kwargs\n )\n return self._return_rest_body(response)\n\n @tracer.start_as_current_span(\"Synapse::restDELETE\")\n def restDELETE(\n self,\n uri,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n ):\n \"\"\"\n Sends an HTTP DELETE request to the Synapse server.\n\n Arguments:\n uri: URI of resource to be deleted\n endpoint: Server endpoint, defaults to self.repoEndpoint\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: An external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n self._rest_call(\n \"delete\",\n uri,\n None,\n endpoint,\n headers,\n retryPolicy,\n requests_session,\n **kwargs,\n )\n\n def _build_uri_and_headers(self, uri, endpoint=None, headers=None):\n \"\"\"Returns a tuple of the URI and headers to request with.\"\"\"\n\n if endpoint is None:\n endpoint = self.repoEndpoint\n\n 
trace.get_current_span().set_attributes({\"server.address\": endpoint})\n\n # Check to see if the URI is incomplete (i.e. a Synapse URL)\n # In that case, append a Synapse endpoint to the URI\n parsedURL = urllib_urlparse.urlparse(uri)\n if parsedURL.netloc == \"\":\n uri = endpoint + uri\n\n if headers is None:\n headers = self._generate_headers()\n return uri, headers\n\n def _build_retry_policy(self, retryPolicy={}):\n \"\"\"Returns a retry policy to be passed onto _with_retry.\"\"\"\n\n defaults = dict(STANDARD_RETRY_PARAMS)\n defaults.update(retryPolicy)\n return defaults\n\n def _return_rest_body(self, response):\n \"\"\"Returns either a dictionary or a string depending on the 'content-type' of the response.\"\"\"\n trace.get_current_span().set_attributes(\n {\"http.response.status_code\": response.status_code}\n )\n if is_json(response.headers.get(\"content-type\", None)):\n return response.json()\n return response.text\n
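The low-level rest* helpers can be used to call Synapse REST services that have no dedicated client method; a minimal sketch, assuming an authenticated syn object (syn123 is a placeholder entity ID):
profile = syn.restGET('/userProfile')\nprint(profile['userName'])\n\n# Services without a purpose-built wrapper can be reached the same way\nannotations = syn.restGET('/entity/syn123/annotations2')\n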
"},{"location":"reference/client/#synapseclient.Synapse-functions","title":"Functions","text":""},{"location":"reference/client/#synapseclient.Synapse.login","title":"login(email=None, password=None, apiKey=None, sessionToken=None, rememberMe=False, silent=False, forced=False, authToken=None)
","text":"Valid combinations of login() arguments:
- email/username and password\n- email/username and apiKey (Base64 encoded string)\n- authToken\n- sessionToken (**DEPRECATED**)\n
If no login arguments are provided or only username is provided, login() will attempt to log in using information from these sources (in order of preference):
1. User's personal access token from the environment variable: SYNAPSE_AUTH_TOKEN
2. .synapseConfig file (in user home folder unless configured otherwise)
3. cached credentials from previous login() where rememberMe=True was passed as a parameter
PARAMETER DESCRIPTION
email
Synapse user name (or an email address associated with a Synapse account)
TYPE: str
DEFAULT: None
password
!!WILL BE DEPRECATED!! password. Please use authToken (Synapse personal access token)
TYPE: str
DEFAULT: None
apiKey
!!WILL BE DEPRECATED!! Base64 encoded Synapse API key
TYPE: str
DEFAULT: None
sessionToken
!!DEPRECATED FIELD!! User's current session token. Using this field will ignore the following fields: email, password, apiKey
TYPE: str
DEFAULT: None
rememberMe
Whether the authentication information should be cached in your operating system's credential storage.
TYPE: bool
DEFAULT: False
authToken
A bearer authorization token, e.g. a personal access token, can be used in lieu of a password or apiKey.
TYPE: str
DEFAULT: None
silent
Defaults to False. Suppresses the \"Welcome ...!\" message.
TYPE: bool
DEFAULT: False
forced
Defaults to False. Bypass the credential cache if set.
TYPE: bool
DEFAULT: False
GNOME Keyring (recommended) or KWallet should be installed for credential storage on Linux systems. If neither is installed and set up, credentials will be stored in a plain-text file with read and write permissions for the current user only (chmod 600). On Windows and Mac OS, a default credential store exists, so it will be preferred over the plain-text file. To install GNOME Keyring on Ubuntu:
sudo apt-get install gnome-keyring\n\nsudo apt-get install python-dbus #(for Python 2 installed via apt-get)\nOR\nsudo apt-get install python3-dbus #(for Python 3 installed via apt-get)\nOR\nsudo apt-get install libdbus-glib-1-dev #(for custom installation of Python or virtualenv)\nsudo pip install dbus-python #(may take a while to compile C code)\n
If you are on a headless Linux session (e.g. connecting via SSH), please run the following commands before running your Python session:
dbus-run-session -- bash #(replace 'bash' with 'sh' if bash is unavailable)\necho -n \"REPLACE_WITH_YOUR_KEYRING_PASSWORD\"|gnome-keyring-daemon -- unlock\n
Logging in
Using an auth token:
syn.login(authToken=\"authtoken\")\n#> Welcome, Me!\n
Using a username/password:
syn.login('my-username', 'secret-password', rememberMe=True)\n#> Welcome, Me!\n
After logging in with the rememberMe flag set, an API key will be cached and used to authenticate for future logins:
syn.login()\n#> Welcome, Me!\n
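Relying on the SYNAPSE_AUTH_TOKEN environment variable (the first source in the order of preference above); a minimal sketch, with the token value as a placeholder:
# in the shell, before starting Python:\n#   export SYNAPSE_AUTH_TOKEN=<your personal access token>\nimport synapseclient\nsyn = synapseclient.Synapse()\nsyn.login()\n#> Welcome, Me!\n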
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::login\")\ndef login(\n self,\n email: str = None,\n password: str = None,\n apiKey: str = None,\n sessionToken: str = None,\n rememberMe: bool = False,\n silent: bool = False,\n forced: bool = False,\n authToken: str = None,\n):\n \"\"\"\n Valid combinations of login() arguments:\n\n - email/username and password\n - email/username and apiKey (Base64 encoded string)\n - authToken\n - sessionToken (**DEPRECATED**)\n\n If no login arguments are provided or only username is provided, login() will attempt to log in using\n information from these sources (in order of preference):\n\n 1. User's personal access token from environment the variable: SYNAPSE_AUTH_TOKEN\n 2. .synapseConfig file (in user home folder unless configured otherwise)\n 3. cached credentials from previous `login()` where `rememberMe=True` was passed as a parameter\n\n Arguments:\n email: Synapse user name (or an email address associated with a Synapse account)\n password: **!!WILL BE DEPRECATED!!** password. Please use authToken (Synapse personal access token)\n apiKey: **!!WILL BE DEPRECATED!!** Base64 encoded Synapse API key\n sessionToken: **!!DEPRECATED FIELD!!** User's current session token. Using this field will ignore the\n following fields: email, password, apiKey\n rememberMe: Whether the authentication information should be cached in your operating system's\n credential storage.\n authToken: A bearer authorization token, e.g. a personal access token, can be used in lieu of a\n password or apiKey.\n silent: Defaults to False. Suppresses the \"Welcome ...!\" message.\n forced: Defaults to False. Bypass the credential cache if set.\n\n **GNOME Keyring** (recommended) or **KWallet** is recommended to be installed for credential storage on\n **Linux** systems.\n If it is not installed/setup, credentials will be stored as PLAIN-TEXT file with read and write permissions for\n the current user only (chmod 600).\n On Windows and Mac OS, a default credentials storage exists so it will be preferred over the plain-text file.\n To install GNOME Keyring on Ubuntu:\n\n sudo apt-get install gnome-keyring\n\n sudo apt-get install python-dbus #(for Python 2 installed via apt-get)\n OR\n sudo apt-get install python3-dbus #(for Python 3 installed via apt-get)\n OR\n sudo apt-get install libdbus-glib-1-dev #(for custom installation of Python or vitualenv)\n sudo pip install dbus-python #(may take a while to compile C code)\n\n If you are on a headless Linux session (e.g. 
connecting via SSH), please run the following commands before\n running your Python session:\n\n dbus-run-session -- bash #(replace 'bash' with 'sh' if bash is unavailable)\n echo -n \"REPLACE_WITH_YOUR_KEYRING_PASSWORD\"|gnome-keyring-daemon -- unlock\n\n Example: Logging in\n Using an auth token:\n\n syn.login(authToken=\"authtoken\")\n #> Welcome, Me!\n\n Using a username/password:\n\n syn.login('my-username', 'secret-password', rememberMe=True)\n syn.login('my-username', 'secret-password', rememberMe=True)\n #> Welcome, Me!\n syn.login('my-username', 'secret-password', rememberMe=True)\n #> Welcome, Me!\n\n After logging in with the *rememberMe* flag set, an API key will be cached and\n used to authenticate for future logins:\n\n syn.login()\n #> Welcome, Me!\n\n \"\"\"\n # Note: the order of the logic below reflects the ordering in the docstring above.\n\n # Check version before logging in\n if not self.skip_checks:\n version_check()\n\n # Make sure to invalidate the existing session\n self.logout()\n\n credential_provider_chain = get_default_credential_chain()\n # TODO: remove deprecated sessionToken when we move to a different solution\n self.credentials = credential_provider_chain.get_credentials(\n self,\n UserLoginArgs(\n email,\n password,\n apiKey,\n forced,\n sessionToken,\n authToken,\n ),\n )\n\n # Final check on login success\n if not self.credentials:\n raise SynapseNoCredentialsError(\"No credentials provided.\")\n\n # Save the API key in the cache\n if rememberMe:\n message = (\n \"The rememberMe parameter will be deprecated by early 2024. Please use the ~/.synapseConfig \"\n \"or SYNAPSE_AUTH_TOKEN environmental variable to set up your Synapse connection.\"\n )\n self.logger.warning(message)\n delete_stored_credentials(self.credentials.username)\n self.credentials.store_to_keyring()\n cached_sessions.set_most_recent_user(self.credentials.username)\n\n if not silent:\n profile = self.getUserProfile()\n # TODO-PY3: in Python2, do we need to ensure that this is encoded in utf-8\n self.logger.info(\n \"Welcome, %s!\\n\"\n % (\n profile[\"displayName\"]\n if \"displayName\" in profile\n else self.credentials.username\n )\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.logout","title":"logout(forgetMe=False)
","text":"Removes authentication information from the Synapse client.
PARAMETER DESCRIPTION
forgetMe
Set as True to clear any local storage of authentication information. See the flag \"rememberMe\" in synapseclient.Synapse.login
TYPE: bool
DEFAULT: False
None
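For example, logging out and also clearing credentials that were cached with rememberMe (a short sketch):
syn.logout(forgetMe=True)\n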
Source code in synapseclient/client.py
def logout(self, forgetMe: bool = False):\n \"\"\"\n Removes authentication information from the Synapse client.\n\n Arguments:\n forgetMe: Set as True to clear any local storage of authentication information.\n See the flag \"rememberMe\" in [synapseclient.Synapse.login][]\n\n Returns:\n None\n \"\"\"\n # Delete the user's API key from the cache\n if forgetMe and self.credentials:\n self.credentials.delete_from_keyring()\n\n self.credentials = None\n
"},{"location":"reference/client/#synapseclient.Synapse.get","title":"get(entity, **kwargs)
","text":"Gets a Synapse entity from the repository service.
PARAMETER DESCRIPTION
entity
A Synapse ID, a Synapse Entity object, a plain dictionary in which 'id' maps to a Synapse ID or a local file that is stored in Synapse (found by the file MD5)
version
The specific version to get. Defaults to the most recent version.
downloadFile
Whether associated file(s) should be downloaded. Defaults to True
downloadLocation
Directory into which to download the Synapse File Entity. Defaults to the local cache.
followLink
Whether the link returns the target Entity. Defaults to False
ifcollision
Determines how to handle file collisions. May be \"overwrite.local\", \"keep.local\", or \"keep.both\". Defaults to \"keep.both\".
limitSearch
A Synapse ID used to limit the search in Synapse if entity is specified as a local file. That is, if the file is stored in multiple locations in Synapse, only the ones in the specified folder/project will be returned.
RETURNS DESCRIPTION
A new Synapse Entity object of the appropriate type.
Using this function
Download file into cache
entity = syn.get('syn1906479')\nprint(entity.name)\nprint(entity.path)\n
Download file into current working directory
entity = syn.get('syn1906479', downloadLocation='.')\nprint(entity.name)\nprint(entity.path)\n
Determine the provenance of a locally stored file as indicated in Synapse
entity = syn.get('/path/to/file.txt', limitSearch='syn12312')\nprint(syn.getProvenance(entity))\n
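Fetch a specific version without downloading the file (a sketch reusing the entity ID from above):
entity = syn.get('syn1906479', version=1, downloadFile=False)\nprint(entity.versionNumber)\n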
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get\")\ndef get(self, entity, **kwargs):\n \"\"\"\n Gets a Synapse entity from the repository service.\n\n Arguments:\n entity: A Synapse ID, a Synapse Entity object, a plain dictionary in which 'id' maps to a\n Synapse ID or a local file that is stored in Synapse (found by the file MD5)\n version: The specific version to get.\n Defaults to the most recent version.\n downloadFile: Whether associated files(s) should be downloaded.\n Defaults to True\n downloadLocation: Directory where to download the Synapse File Entity.\n Defaults to the local cache.\n followLink: Whether the link returns the target Entity.\n Defaults to False\n ifcollision: Determines how to handle file collisions.\n May be \"overwrite.local\", \"keep.local\", or \"keep.both\".\n Defaults to \"keep.both\".\n limitSearch: A Synanpse ID used to limit the search in Synapse if entity is specified as a local\n file. That is, if the file is stored in multiple locations in Synapse only the ones\n in the specified folder/project will be returned.\n\n Returns:\n A new Synapse Entity object of the appropriate type.\n\n Example: Using this function\n Download file into cache\n\n entity = syn.get('syn1906479')\n print(entity.name)\n print(entity.path)\n\n Download file into current working directory\n\n entity = syn.get('syn1906479', downloadLocation='.')\n print(entity.name)\n print(entity.path)\n\n Determine the provenance of a locally stored file as indicated in Synapse\n\n entity = syn.get('/path/to/file.txt', limitSearch='syn12312')\n print(syn.getProvenance(entity))\n \"\"\"\n # If entity is a local file determine the corresponding synapse entity\n if isinstance(entity, str) and os.path.isfile(entity):\n bundle = self._getFromFile(entity, kwargs.pop(\"limitSearch\", None))\n kwargs[\"downloadFile\"] = False\n kwargs[\"path\"] = entity\n\n elif isinstance(entity, str) and not utils.is_synapse_id_str(entity):\n raise SynapseFileNotFoundError(\n (\n \"The parameter %s is neither a local file path \"\n \" or a valid entity id\" % entity\n )\n )\n # have not been saved entities\n elif isinstance(entity, Entity) and not entity.get(\"id\"):\n raise ValueError(\n \"Cannot retrieve entity that has not been saved.\"\n \" Please use syn.store() to save your entity and try again.\"\n )\n else:\n version = kwargs.get(\"version\", None)\n bundle = self._getEntityBundle(entity, version)\n # Check and warn for unmet access requirements\n self._check_entity_restrictions(\n bundle, entity, kwargs.get(\"downloadFile\", True)\n )\n\n return_data = self._getWithEntityBundle(\n entityBundle=bundle, entity=entity, **kwargs\n )\n trace.get_current_span().set_attributes(\n {\n \"synapse.id\": return_data.get(\"id\", \"\"),\n \"synapse.concrete_type\": return_data.get(\"concreteType\", \"\"),\n }\n )\n return return_data\n
"},{"location":"reference/client/#synapseclient.Synapse.store","title":"store(obj, *, createOrUpdate=True, forceVersion=True, versionLabel=None, isRestricted=False, activity=None, used=None, executed=None, activityName=None, activityDescription=None, opentelemetry_context=None)
","text":"Creates a new Entity or updates an existing Entity, uploading any files in the process.
PARAMETER DESCRIPTION
obj
A Synapse Entity, Evaluation, or Wiki
used
The Entity, Synapse ID, or URL used to create the object (can also be a list of these)
DEFAULT: None
executed
The Entity, Synapse ID, or URL representing code executed to create the object (can also be a list of these)
DEFAULT: None
activity
Activity object specifying the user's provenance.
DEFAULT: None
activityName
Activity name to be used in conjunction with used and executed.
DEFAULT: None
activityDescription
Activity description to be used in conjunction with used and executed.
DEFAULT: None
createOrUpdate
Indicates whether the method should automatically perform an update if the 'obj' conflicts with an existing Synapse object. Defaults to True.
DEFAULT: True
forceVersion
Indicates whether the method should increment the version of the object even if nothing has changed. Defaults to True.
DEFAULT: True
versionLabel
Arbitrary string used to label the version.
DEFAULT: None
isRestricted
If set to true, an email will be sent to the Synapse access control team to start the process of adding terms-of-use or review board approval for this entity. You will be contacted with regards to the specific data being restricted and the requirements of access.
DEFAULT: False
opentelemetry_context
OpenTelemetry context to propagate to this function for tracing. Used in cases where multi-threaded operations need to be linked to parent spans.
DEFAULT: None
A Synapse Entity, Evaluation, or Wiki
Using this function
Creating a new Project:
from synapseclient import Project\n\nproject = Project('My uniquely named project')\nproject = syn.store(project)\n
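Folders are stored the same way; a short sketch building on the project above:
from synapseclient import Folder\n\nfolder = Folder('My folder', parent=project)\nfolder = syn.store(folder)\n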
Adding files with provenance (aka: Activity):
from synapseclient import File, Activity\n\n# A synapse entity *syn1906480* contains data\n# entity *syn1917825* contains code\nactivity = Activity(\n 'Fancy Processing',\n description='No seriously, really fancy processing',\n used=['syn1906480', 'http://data_r_us.com/fancy/data.txt'],\n executed='syn1917825')\n\ntest_entity = File('/path/to/data/file.xyz', description='Fancy new data', parent=project)\ntest_entity = syn.store(test_entity, activity=activity)\n
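Updating an existing entity without incrementing its version; a sketch assuming test_entity from the previous example:
test_entity.description = 'Fancier new data'\ntest_entity = syn.store(test_entity, forceVersion=False)\n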
Source code in synapseclient/client.py
def store(\n self,\n obj,\n *,\n createOrUpdate=True,\n forceVersion=True,\n versionLabel=None,\n isRestricted=False,\n activity=None,\n used=None,\n executed=None,\n activityName=None,\n activityDescription=None,\n opentelemetry_context=None,\n):\n \"\"\"\n Creates a new Entity or updates an existing Entity, uploading any files in the process.\n\n Arguments:\n obj: A Synapse Entity, Evaluation, or Wiki\n used: The Entity, Synapse ID, or URL used to create the object (can also be a list of these)\n executed: The Entity, Synapse ID, or URL representing code executed to create the object\n (can also be a list of these)\n activity: Activity object specifying the user's provenance.\n activityName: Activity name to be used in conjunction with *used* and *executed*.\n activityDescription: Activity description to be used in conjunction with *used* and *executed*.\n createOrUpdate: Indicates whether the method should automatically perform an update if the 'obj'\n conflicts with an existing Synapse object. Defaults to True.\n forceVersion: Indicates whether the method should increment the version of the object even if nothing\n has changed. Defaults to True.\n versionLabel: Arbitrary string used to label the version.\n isRestricted: If set to true, an email will be sent to the Synapse access control team to start the\n process of adding terms-of-use or review board approval for this entity.\n You will be contacted with regards to the specific data being restricted and the\n requirements of access.\n opentelemetry_context: OpenTelemetry context to propogate to this function to use for tracing. Used\n cases where multi-threaded operations need to be linked to parent spans.\n\n Returns:\n A Synapse Entity, Evaluation, or Wiki\n\n Example: Using this function\n Creating a new Project:\n\n from synapseclient import Project\n\n project = Project('My uniquely named project')\n project = syn.store(project)\n\n Adding files with [provenance (aka: Activity)][synapseclient.Activity]:\n\n from synapseclient import File, Activity\n\n # A synapse entity *syn1906480* contains data\n # entity *syn1917825* contains code\n activity = Activity(\n 'Fancy Processing',\n description='No seriously, really fancy processing',\n used=['syn1906480', 'http://data_r_us.com/fancy/data.txt'],\n executed='syn1917825')\n\n test_entity = File('/path/to/data/file.xyz', description='Fancy new data', parent=project)\n test_entity = syn.store(test_entity, activity=activity)\n\n \"\"\"\n with tracer.start_as_current_span(\n \"Synapse::store\", context=opentelemetry_context\n ):\n trace.get_current_span().set_attributes(\n {\"thread.id\": threading.get_ident()}\n )\n # SYNPY-1031: activity must be Activity object or code will fail later\n if activity:\n if not isinstance(activity, synapseclient.Activity):\n raise ValueError(\"activity should be synapseclient.Activity object\")\n # _before_store hook\n # give objects a chance to do something before being stored\n if hasattr(obj, \"_before_synapse_store\"):\n obj._before_synapse_store(self)\n\n # _synapse_store hook\n # for objects that know how to store themselves\n if hasattr(obj, \"_synapse_store\"):\n return obj._synapse_store(self)\n\n # Handle all non-Entity objects\n if not (isinstance(obj, Entity) or type(obj) == dict):\n if isinstance(obj, Wiki):\n return self._storeWiki(obj, createOrUpdate)\n\n if \"id\" in obj: # If ID is present, update\n trace.get_current_span().set_attributes({\"synapse.id\": obj[\"id\"]})\n return type(obj)(**self.restPUT(obj.putURI(), obj.json()))\n\n try: 
# If no ID is present, attempt to POST the object\n trace.get_current_span().set_attributes({\"synapse.id\": \"\"})\n return type(obj)(**self.restPOST(obj.postURI(), obj.json()))\n\n except SynapseHTTPError as err:\n # If already present and we want to update attempt to get the object content\n if createOrUpdate and err.response.status_code == 409:\n newObj = self.restGET(obj.getByNameURI(obj.name))\n newObj.update(obj)\n obj = type(obj)(**newObj)\n trace.get_current_span().set_attributes(\n {\"synapse.id\": obj[\"id\"]}\n )\n obj.update(self.restPUT(obj.putURI(), obj.json()))\n return obj\n raise\n\n # If the input object is an Entity or a dictionary\n entity = obj\n properties, annotations, local_state = split_entity_namespaces(entity)\n bundle = None\n # Explicitly set an empty versionComment property if none is supplied,\n # otherwise an existing entity bundle's versionComment will be copied to the update.\n properties[\"versionComment\"] = (\n properties[\"versionComment\"] if \"versionComment\" in properties else None\n )\n\n # Anything with a path is treated as a cache-able item\n # Only Files are expected in the following logic\n if entity.get(\"path\", False) and not isinstance(obj, Folder):\n if \"concreteType\" not in properties:\n properties[\"concreteType\"] = File._synapse_entity_type\n # Make sure the path is fully resolved\n entity[\"path\"] = os.path.expanduser(entity[\"path\"])\n\n # Check if the File already exists in Synapse by fetching metadata on it\n bundle = self._getEntityBundle(entity)\n\n if bundle:\n if createOrUpdate:\n # update our properties from the existing bundle so that we have\n # enough to process this as an entity update.\n properties = {**bundle[\"entity\"], **properties}\n\n # Check if the file should be uploaded\n fileHandle = find_data_file_handle(bundle)\n if (\n fileHandle\n and fileHandle[\"concreteType\"]\n == \"org.sagebionetworks.repo.model.file.ExternalFileHandle\"\n ):\n # switching away from ExternalFileHandle or the url was updated\n needs_upload = entity[\"synapseStore\"] or (\n fileHandle[\"externalURL\"] != entity[\"externalURL\"]\n )\n else:\n # Check if we need to upload a new version of an existing\n # file. 
If the file referred to by entity['path'] has been\n # modified, we want to upload the new version.\n # If synapseStore is false then we must upload an ExternalFileHandle\n needs_upload = not entity[\n \"synapseStore\"\n ] or not self.cache.contains(\n bundle[\"entity\"][\"dataFileHandleId\"], entity[\"path\"]\n )\n elif entity.get(\"dataFileHandleId\", None) is not None:\n needs_upload = False\n else:\n needs_upload = True\n\n if needs_upload:\n local_state_fh = local_state.get(\"_file_handle\", {})\n synapseStore = local_state.get(\"synapseStore\", True)\n fileHandle = upload_file_handle(\n self,\n entity[\"parentId\"],\n local_state[\"path\"]\n if (synapseStore or local_state_fh.get(\"externalURL\") is None)\n else local_state_fh.get(\"externalURL\"),\n synapseStore=synapseStore,\n md5=local_state_fh.get(\"contentMd5\"),\n file_size=local_state_fh.get(\"contentSize\"),\n mimetype=local_state_fh.get(\"contentType\"),\n max_threads=self.max_threads,\n )\n properties[\"dataFileHandleId\"] = fileHandle[\"id\"]\n local_state[\"_file_handle\"] = fileHandle\n\n elif \"dataFileHandleId\" not in properties:\n # Handle the case where the Entity lacks an ID\n # But becomes an update() due to conflict\n properties[\"dataFileHandleId\"] = bundle[\"entity\"][\n \"dataFileHandleId\"\n ]\n\n # update the file_handle metadata if the FileEntity's FileHandle id has changed\n local_state_fh_id = local_state.get(\"_file_handle\", {}).get(\"id\")\n if (\n local_state_fh_id\n and properties[\"dataFileHandleId\"] != local_state_fh_id\n ):\n local_state[\"_file_handle\"] = find_data_file_handle(\n self._getEntityBundle(\n properties[\"id\"],\n requestedObjects={\n \"includeEntity\": True,\n \"includeFileHandles\": True,\n },\n )\n )\n\n # check if we already have the filehandleid cached somewhere\n cached_path = self.cache.get(properties[\"dataFileHandleId\"])\n if cached_path is None:\n local_state[\"path\"] = None\n local_state[\"cacheDir\"] = None\n local_state[\"files\"] = []\n else:\n local_state[\"path\"] = cached_path\n local_state[\"cacheDir\"] = os.path.dirname(cached_path)\n local_state[\"files\"] = [os.path.basename(cached_path)]\n\n # Create or update Entity in Synapse\n if \"id\" in properties:\n trace.get_current_span().set_attributes(\n {\"synapse.id\": properties[\"id\"]}\n )\n properties = self._updateEntity(properties, forceVersion, versionLabel)\n else:\n # If Link, get the target name, version number and concrete type and store in link properties\n if properties[\"concreteType\"] == \"org.sagebionetworks.repo.model.Link\":\n target_properties = self._getEntity(\n properties[\"linksTo\"][\"targetId\"],\n version=properties[\"linksTo\"].get(\"targetVersionNumber\"),\n )\n if target_properties[\"parentId\"] == properties[\"parentId\"]:\n raise ValueError(\n \"Cannot create a Link to an entity under the same parent.\"\n )\n properties[\"linksToClassName\"] = target_properties[\"concreteType\"]\n if (\n target_properties.get(\"versionNumber\") is not None\n and properties[\"linksTo\"].get(\"targetVersionNumber\") is not None\n ):\n properties[\"linksTo\"][\n \"targetVersionNumber\"\n ] = target_properties[\"versionNumber\"]\n properties[\"name\"] = target_properties[\"name\"]\n try:\n properties = self._createEntity(properties)\n except SynapseHTTPError as ex:\n if createOrUpdate and ex.response.status_code == 409:\n # Get the existing Entity's ID via the name and parent\n existing_entity_id = self.findEntityId(\n properties[\"name\"], properties.get(\"parentId\", None)\n )\n if existing_entity_id
is None:\n raise\n\n # get existing properties and annotations\n if not bundle:\n bundle = self._getEntityBundle(\n existing_entity_id,\n requestedObjects={\n \"includeEntity\": True,\n \"includeAnnotations\": True,\n },\n )\n\n properties = {**bundle[\"entity\"], **properties}\n\n # we additionally merge the annotations under the assumption that a missing annotation\n # from a resolved conflict represents a newer annotation that should be preserved\n # rather than an intentionally deleted annotation.\n annotations = {\n **from_synapse_annotations(bundle[\"annotations\"]),\n **annotations,\n }\n\n properties = self._updateEntity(\n properties, forceVersion, versionLabel\n )\n\n else:\n raise\n\n # Deal with access restrictions\n if isRestricted:\n self._createAccessRequirementIfNone(properties)\n\n # Update annotations\n if (not bundle and annotations) or (\n bundle and check_annotations_changed(bundle[\"annotations\"], annotations)\n ):\n annotations = self.set_annotations(\n Annotations(properties[\"id\"], properties[\"etag\"], annotations)\n )\n properties[\"etag\"] = annotations.etag\n\n # If the parameters 'used' or 'executed' are given, create an Activity object\n if used or executed:\n if activity is not None:\n raise SynapseProvenanceError(\n \"Provenance can be specified as an Activity object or as used/executed\"\n \" item(s), but not both.\"\n )\n activity = Activity(\n name=activityName,\n description=activityDescription,\n used=used,\n executed=executed,\n )\n\n # If we have an Activity, set it as the Entity's provenance record\n if activity:\n self.setProvenance(properties, activity)\n\n # 'etag' has changed, so get the new Entity\n properties = self._getEntity(properties)\n\n # Return the updated Entity object\n entity = Entity.create(properties, annotations, local_state)\n return_data = self.get(entity, downloadFile=False)\n\n trace.get_current_span().set_attributes(\n {\n \"synapse.id\": return_data.get(\"id\", \"\"),\n \"synapse.concrete_type\": entity.get(\"concreteType\", \"\"),\n }\n )\n return return_data\n
"},{"location":"reference/client/#synapseclient.Synapse.move","title":"move(entity, new_parent)
","text":"Move a Synapse entity to a new container.
PARAMETER DESCRIPTIONentity
A Synapse ID, a Synapse Entity object, or a local file that is stored in Synapse
new_parent
The new parent container (Folder or Project) to which the entity should be moved.
RETURNS DESCRIPTION
The Synapse Entity object that has been moved.
Using this functionMove a Synapse Entity object to a new parent container
entity = syn.move('syn456', 'syn123')\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::move\")\ndef move(self, entity, new_parent):\n \"\"\"\n Move a Synapse entity to a new container.\n\n Arguments:\n entity: A Synapse ID, a Synapse Entity object, or a local file that is stored in Synapse\n new_parent: The new parent container (Folder or Project) to which the entity should be moved.\n\n Returns:\n The Synapse Entity object that has been moved.\n\n Example: Using this function\n Move a Synapse Entity object to a new parent container\n\n entity = syn.move('syn456', 'syn123')\n \"\"\"\n\n entity = self.get(entity, downloadFile=False)\n entity.parentId = id_of(new_parent)\n entity = self.store(entity, forceVersion=False)\n trace.get_current_span().set_attributes(\n {\n \"synapse.id\": entity.get(\"id\", \"\"),\n \"synapse.parent_id\": entity.get(\"parentId\", \"\"),\n }\n )\n\n return entity\n
"},{"location":"reference/client/#synapseclient.Synapse.delete","title":"delete(obj, version=None)
","text":"Removes an object from Synapse.
PARAMETER DESCRIPTIONobj
An existing object stored on Synapse such as Evaluation, File, Project, or Wiki
version
For entities, specify a particular version to delete.
DEFAULT: None
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::delete\")\ndef delete(self, obj, version=None):\n \"\"\"\n Removes an object from Synapse.\n\n Arguments:\n obj: An existing object stored on Synapse such as Evaluation, File, Project, or Wiki\n version: For entities, specify a particular version to delete.\n \"\"\"\n # Handle all strings as the Entity ID for backward compatibility\n if isinstance(obj, str):\n entity_id = id_of(obj)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n if version:\n self.restDELETE(uri=f\"/entity/{entity_id}/version/{version}\")\n else:\n self.restDELETE(uri=f\"/entity/{entity_id}\")\n elif hasattr(obj, \"_synapse_delete\"):\n return obj._synapse_delete(self)\n else:\n try:\n if isinstance(obj, Versionable):\n self.restDELETE(obj.deleteURI(versionNumber=version))\n else:\n self.restDELETE(obj.deleteURI())\n except AttributeError:\n raise SynapseError(\n f\"Can't delete a {type(obj)}. Please specify a Synapse object or id\"\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.get_annotations","title":"get_annotations(entity, version=None)
","text":"Retrieve annotations for an Entity from the Synapse Repository as a Python dict.
Note that collapsing annotations from the native Synapse format to a Python dict may involve some loss of information. See _getRawAnnotations
to get annotations in the native format.
entity
An Entity or Synapse ID to lookup
TYPE: Union[str, Entity]
version
The version of the Entity to retrieve.
TYPE: Union[str, int]
DEFAULT: None
RETURNS DESCRIPTION
Annotations
A synapseclient.annotations.Annotations object, a dict that also has id and etag attributes
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_annotations\")\ndef get_annotations(\n self, entity: typing.Union[str, Entity], version: typing.Union[str, int] = None\n) -> Annotations:\n \"\"\"\n Retrieve annotations for an Entity from the Synapse Repository as a Python dict.\n\n Note that collapsing annotations from the native Synapse format to a Python dict may involve some loss of\n information. See `_getRawAnnotations` to get annotations in the native format.\n\n Arguments:\n entity: An Entity or Synapse ID to lookup\n version: The version of the Entity to retrieve.\n\n Returns:\n A [synapseclient.annotations.Annotations][] object, a dict that also has id and etag attributes\n \"\"\"\n return from_synapse_annotations(self._getRawAnnotations(entity, version))\n
"},{"location":"reference/client/#synapseclient.Synapse.set_annotations","title":"set_annotations(annotations)
","text":"Store annotations for an Entity in the Synapse Repository.
PARAMETER DESCRIPTIONannotations
A synapseclient.annotations.Annotations of annotation names and values, with the id and etag attributes set
TYPE: Annotations
RETURNS DESCRIPTION
The updated synapseclient.annotations.Annotations for the entity
Using this functionGetting annotations, adding a new annotation, and updating the annotations:
annos = syn.get_annotations('syn123')\n\n# annos will contain the id and etag associated with the entity upon retrieval\nprint(annos.id)\n# syn123\nprint(annos.etag)\n# 7bdb83e9-a50a-46e4-987a-4962559f090f (Usually some UUID in the form of a string)\n\n# returned annos object from get_annotations() can be used as if it were a dict\n\n# set key 'foo' to have value of 'bar' and 'baz'\nannos['foo'] = ['bar', 'baz']\n\n# single values will automatically be wrapped in a list once stored\nannos['qwerty'] = 'asdf'\n\n# store the annotations\nannos = syn.set_annotations(annos)\n\nprint(annos)\n# {'foo':['bar','baz'], 'qwerty':['asdf']}\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::set_annotations\")\ndef set_annotations(self, annotations: Annotations):\n \"\"\"\n Store annotations for an Entity in the Synapse Repository.\n\n Arguments:\n annotations: A [synapseclient.annotations.Annotations][] of annotation names and values,\n with the id and etag attribute set\n\n Returns:\n The updated [synapseclient.annotations.Annotations][] for the entity\n\n Example: Using this function\n Getting annotations, adding a new annotation, and updating the annotations:\n\n annos = syn.get_annotations('syn123')\n\n # annos will contain the id and etag associated with the entity upon retrieval\n print(annos.id)\n # syn123\n print(annos.etag)\n # 7bdb83e9-a50a-46e4-987a-4962559f090f (Usually some UUID in the form of a string)\n\n # returned annos object from get_annotations() can be used as if it were a dict\n\n # set key 'foo' to have value of 'bar' and 'baz'\n annos['foo'] = ['bar', 'baz']\n\n # single values will automatically be wrapped in a list once stored\n annos['qwerty'] = 'asdf'\n\n # store the annotations\n annos = syn.set_annotations(annos)\n\n print(annos)\n # {'foo':['bar','baz], 'qwerty':['asdf']}\n \"\"\"\n\n if not isinstance(annotations, Annotations):\n raise TypeError(\"Expected a synapseclient.Annotations object\")\n\n synapseAnnos = to_synapse_annotations(annotations)\n\n entity_id = id_of(annotations)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n return from_synapse_annotations(\n self.restPUT(\n f\"/entity/{entity_id}/annotations2\",\n body=json.dumps(synapseAnnos),\n )\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.tableQuery","title":"tableQuery(query, resultsAs='csv', **kwargs)
","text":"Query a Synapse Table.
You can receive query results either as a generator over rows or as a CSV file. For smallish tables, either method will work equally well. Use of a \"rowset\" generator allows rows to be processed one at a time and processing may be stopped before downloading the entire table.
Optional keyword arguments differ for the two return types of rowset
or csv
query
Query string in a SQL-like syntax, for example: \"SELECT * from syn12345\"
resultsAs
select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as sets of rows (\"rowset\")
DEFAULT: 'csv'
limit
(rowset only) Specify the maximum number of rows to be returned, defaults to None
offset
(rowset only) Don't return the first n rows, defaults to None
quoteCharacter
(csv only) defaults to double quote
escapeCharacter
(csv only) defaults to backslash
lineEnd
(csv only) defaults to os.linesep
separator
(csv only) defaults to comma
header
(csv only) True by default
includeRowIdAndRowVersion
(csv only) True by default
downloadLocation
(csv only) directory path to download the CSV file to
RETURNS DESCRIPTION
A TableQueryResult or CsvFileTable object
NOTE
When performing queries on frequently updated tables, the table can be inaccessible for a period, leading to a timeout of the query. Since the results are guaranteed to eventually be returned, you can change the max timeout by setting the table_query_timeout variable of the Synapse object:
# Sets the max timeout to 5 minutes.\nsyn.table_query_timeout = 300\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::tableQuery\")\ndef tableQuery(self, query, resultsAs=\"csv\", **kwargs):\n \"\"\"\n Query a Synapse Table.\n\n\n :param query: query string in a `SQL-like syntax \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html>`_, for example\n \"SELECT * from syn12345\"\n\n :param resultsAs: select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as\n sets of rows (\"rowset\").\n\n\n :param query: query string in a `SQL-like syntax \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html>`_, for example\n \"SELECT * from syn12345\"\n\n :param resultsAs: select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as\n sets of rows (\"rowset\").\n\n You can receive query results either as a generator over rows or as a CSV file. For smallish tables, either\n method will work equally well. Use of a \"rowset\" generator allows rows to be processed one at a time and\n processing may be stopped before downloading the entire table.\n\n Optional keyword arguments differ for the two return types of `rowset` or `csv`\n\n Arguments:\n query: Query string in a [SQL-like syntax](https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/web/controller/TableExamples.html), for example: `\"SELECT * from syn12345\"`\n resultsAs: select whether results are returned as a CSV file (\"csv\") or incrementally downloaded as sets of rows (\"rowset\")\n limit: (rowset only) Specify the maximum number of rows to be returned, defaults to None\n offset: (rowset only) Don't return the first n rows, defaults to None\n quoteCharacter: (csv only) default double quote\n escapeCharacter: (csv only) default backslash\n lineEnd: (csv only) defaults to os.linesep\n separator: (csv only) defaults to comma\n header: (csv only) True by default\n includeRowIdAndRowVersion: (csv only) True by default\n downloadLocation: (csv only) directory path to download the CSV file to\n\n\n Returns:\n A [TableQueryResult][synapseclient.table.TableQueryResult] or [CsvFileTable][synapseclient.table.CsvFileTable] object\n\n\n NOTE:\n When performing queries on frequently updated tables, the table can be inaccessible for a period leading\n to a timeout of the query. Since the results are guaranteed to eventually be returned you can change the\n max timeout by setting the table_query_timeout variable of the Synapse object:\n\n # Sets the max timeout to 5 minutes.\n syn.table_query_timeout = 300\n\n \"\"\"\n if resultsAs.lower() == \"rowset\":\n return TableQueryResult(self, query, **kwargs)\n elif resultsAs.lower() == \"csv\":\n # TODO: remove isConsistent because it has now been deprecated\n # from the backend\n if kwargs.get(\"isConsistent\") is not None:\n kwargs.pop(\"isConsistent\")\n return CsvFileTable.from_table_query(self, query, **kwargs)\n else:\n raise ValueError(\n \"Unknown return type requested from tableQuery: \" + str(resultsAs)\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.createColumns","title":"createColumns(columns)
","text":"Creates a batch of synapseclient.table.Column's within a single request.
PARAMETER DESCRIPTIONcolumns
A list of synapseclient.table.Column objects
TYPE: List[Column]
RETURNS DESCRIPTION
List[Column]
A list of synapseclient.table.Column objects that have been created in Synapse
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::createColumns\")\ndef createColumns(self, columns: typing.List[Column]) -> typing.List[Column]:\n \"\"\"\n Creates a batch of [synapseclient.table.Column][]'s within a single request.\n\n Arguments:\n columns: A list of [synapseclient.table.Column][]'s\n\n Returns:\n A list of [synapseclient.table.Column][]'s that have been created in Synapse\n \"\"\"\n request_body = {\n \"concreteType\": \"org.sagebionetworks.repo.model.ListWrapper\",\n \"list\": list(columns),\n }\n response = self.restPOST(\"/column/batch\", json.dumps(request_body))\n return [Column(**col) for col in response[\"list\"]]\n
"},{"location":"reference/client/#synapseclient.Synapse.getColumn","title":"getColumn(id)
","text":"Gets a Column object from Synapse by ID.
See: synapseclient.table.Column
PARAMETER DESCRIPTIONid
The ID of the column to retrieve
RETURNS DESCRIPTION
An object of type synapseclient.table.Column
Using this functionGetting a column
column = syn.getColumn(123)\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getColumn\")\ndef getColumn(self, id):\n \"\"\"\n Gets a Column object from Synapse by ID.\n\n See: [synapseclient.table.Column][]\n\n Arguments:\n id: The ID of the column to retrieve\n\n Returns:\n An object of type [synapseclient.table.Column][]\n\n\n Example: Using this function\n Getting a column\n\n column = syn.getColumn(123)\n \"\"\"\n return Column(**self.restGET(Column.getURI(id)))\n
"},{"location":"reference/client/#synapseclient.Synapse.getColumns","title":"getColumns(x, limit=100, offset=0)
","text":"Get the columns defined in Synapse either (1) corresponding to a set of column headers, (2) those for a given schema, or (3) those whose names start with a given prefix.
PARAMETER DESCRIPTIONx
A list of column headers, a Table Entity object (Schema/EntityViewSchema), a Table's Synapse ID, or a string prefix
limit
maximum number of columns to return (pagination parameter)
DEFAULT: 100
offset
the index of the first column to return (pagination parameter)
DEFAULT: 0
YIELDS DESCRIPTION
A generator over synapseclient.table.Column objects
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getColumns\")\ndef getColumns(self, x, limit=100, offset=0):\n \"\"\"\n Get the columns defined in Synapse either (1) corresponding to a set of column headers, (2) those for a given\n schema, or (3) those whose names start with a given prefix.\n\n Arguments:\n x: A list of column headers, a Table Entity object (Schema/EntityViewSchema), a Table's Synapse ID, or a\n string prefix\n limit: maximum number of columns to return (pagination parameter)\n offset: the index of the first column to return (pagination parameter)\n\n Yields:\n A generator over [synapseclient.table.Column][] objects\n \"\"\"\n if x is None:\n uri = \"/column\"\n for result in self._GET_paginated(uri, limit=limit, offset=offset):\n yield Column(**result)\n elif isinstance(x, (list, tuple)):\n for header in x:\n try:\n # if header is an integer, it's a columnID, otherwise it's an aggregate column, like \"AVG(Foo)\"\n int(header)\n yield self.getColumn(header)\n except ValueError:\n # ignore aggregate column\n pass\n elif isinstance(x, SchemaBase) or utils.is_synapse_id_str(x):\n for col in self.getTableColumns(x):\n yield col\n elif isinstance(x, str):\n uri = \"/column?prefix=\" + x\n for result in self._GET_paginated(uri, limit=limit, offset=offset):\n yield Column(**result)\n else:\n ValueError(\"Can't get columns for a %s\" % type(x))\n
"},{"location":"reference/client/#synapseclient.Synapse.getTableColumns","title":"getTableColumns(table)
","text":"Retrieve the column models used in the given table schema.
PARAMETER DESCRIPTIONtable
The schema of the Table whose columns are to be retrieved
YIELDS DESCRIPTION
A Generator over the Table's columns
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getTableColumns\")\ndef getTableColumns(self, table):\n \"\"\"\n Retrieve the column models used in the given table schema.\n\n Arguments:\n table: The schema of the Table whose columns are to be retrieved\n\n Yields:\n A Generator over the Table's [columns][synapseclient.table.Column]\n \"\"\"\n uri = \"/entity/{id}/column\".format(id=id_of(table))\n # The returned object type for this service, PaginatedColumnModels, is a misnomer.\n # This service always returns the full list of results so the pagination does not not actually matter.\n for result in self.restGET(uri)[\"results\"]:\n yield Column(**result)\n
"},{"location":"reference/client/#synapseclient.Synapse.downloadTableColumns","title":"downloadTableColumns(table, columns, downloadLocation=None, **kwargs)
","text":"Bulk download of table-associated files.
PARAMETER DESCRIPTIONtable
Table query result
columns
A list of column names as strings
downloadLocation
Directory into which to download the files
DEFAULT: None
RETURNS DESCRIPTION
A dictionary from file handle ID to path in the local file system.
For example, consider a Synapse table whose ID is "syn12345" with two columns of type FILEHANDLEID named 'foo' and 'bar'. The associated files are JSON encoded, so we might retrieve the files from Synapse and load them for the second hundred of those rows as shown here:
import json\n\nresults = syn.tableQuery('SELECT * FROM syn12345 LIMIT 100 OFFSET 100')\nfile_map = syn.downloadTableColumns(results, ['foo', 'bar'])\n\ndata = {}\nfor file_handle_id, path in file_map.items():\n with open(path) as f:\n data[file_handle_id] = json.load(f)\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::downloadTableColumns\")\ndef downloadTableColumns(self, table, columns, downloadLocation=None, **kwargs):\n \"\"\"\n Bulk download of table-associated files.\n\n Arguments:\n table: Table query result\n columns: A list of column names as strings\n downloadLocation: Directory into which to download the files\n\n Returns:\n A dictionary from file handle ID to path in the local file system.\n\n For example, consider a Synapse table whose ID is \"syn12345\" with two columns of type FILEHANDLEID named 'foo'\n and 'bar'. The associated files are JSON encoded, so we might retrieve the files from Synapse and load for the\n second 100 of those rows as shown here:\n\n import json\n\n results = syn.tableQuery('SELECT * FROM syn12345 LIMIT 100 OFFSET 100')\n file_map = syn.downloadTableColumns(results, ['foo', 'bar'])\n\n for file_handle_id, path in file_map.items():\n with open(path) as f:\n data[file_handle_id] = f.read()\n\n \"\"\"\n\n RETRIABLE_FAILURE_CODES = [\"EXCEEDS_SIZE_LIMIT\"]\n MAX_DOWNLOAD_TRIES = 100\n max_files_per_request = kwargs.get(\"max_files_per_request\", 2500)\n # Rowset tableQuery result not allowed\n if isinstance(table, TableQueryResult):\n raise ValueError(\n \"downloadTableColumn doesn't work with rowsets. Please use default tableQuery settings.\"\n )\n if isinstance(columns, str):\n columns = [columns]\n if not isinstance(columns, collections.abc.Iterable):\n raise TypeError(\"Columns parameter requires a list of column names\")\n\n (\n file_handle_associations,\n file_handle_to_path_map,\n ) = self._build_table_download_file_handle_list(\n table,\n columns,\n downloadLocation,\n )\n\n self.logger.info(\n \"Downloading %d files, %d cached locally\"\n % (len(file_handle_associations), len(file_handle_to_path_map))\n )\n\n permanent_failures = collections.OrderedDict()\n\n attempts = 0\n while len(file_handle_associations) > 0 and attempts < MAX_DOWNLOAD_TRIES:\n attempts += 1\n\n file_handle_associations_batch = file_handle_associations[\n :max_files_per_request\n ]\n\n # ------------------------------------------------------------\n # call async service to build zip file\n # ------------------------------------------------------------\n\n # returns a BulkFileDownloadResponse:\n # https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/file/BulkFileDownloadResponse.html\n request = dict(\n concreteType=\"org.sagebionetworks.repo.model.file.BulkFileDownloadRequest\",\n requestedFiles=file_handle_associations_batch,\n )\n response = self._waitForAsync(\n uri=\"/file/bulk/async\",\n request=request,\n endpoint=self.fileHandleEndpoint,\n )\n\n # ------------------------------------------------------------\n # download zip file\n # ------------------------------------------------------------\n\n temp_dir = tempfile.mkdtemp()\n zipfilepath = os.path.join(temp_dir, \"table_file_download.zip\")\n try:\n zipfilepath = self._downloadFileHandle(\n response[\"resultZipFileHandleId\"],\n table.tableId,\n \"TableEntity\",\n zipfilepath,\n )\n # TODO handle case when no zip file is returned\n # TODO test case when we give it partial or all bad file handles\n # TODO test case with deleted fileHandleID\n # TODO return null for permanent failures\n\n # ------------------------------------------------------------\n # unzip into cache\n # ------------------------------------------------------------\n\n if downloadLocation:\n download_dir = self._ensure_download_location_is_directory(\n downloadLocation\n )\n\n with 
zipfile.ZipFile(zipfilepath) as zf:\n # the directory structure within the zip follows that of the cache:\n # {fileHandleId modulo 1000}/{fileHandleId}/{fileName}\n for summary in response[\"fileSummary\"]:\n if summary[\"status\"] == \"SUCCESS\":\n if not downloadLocation:\n download_dir = self.cache.get_cache_dir(\n summary[\"fileHandleId\"]\n )\n\n filepath = extract_zip_file_to_directory(\n zf, summary[\"zipEntryName\"], download_dir\n )\n self.cache.add(summary[\"fileHandleId\"], filepath)\n file_handle_to_path_map[summary[\"fileHandleId\"]] = filepath\n elif summary[\"failureCode\"] not in RETRIABLE_FAILURE_CODES:\n permanent_failures[summary[\"fileHandleId\"]] = summary\n finally:\n if os.path.exists(zipfilepath):\n os.remove(zipfilepath)\n\n # Do we have remaining files to download?\n file_handle_associations = [\n fha\n for fha in file_handle_associations\n if fha[\"fileHandleId\"] not in file_handle_to_path_map\n and fha[\"fileHandleId\"] not in permanent_failures.keys()\n ]\n\n # TODO if there are files we still haven't downloaded\n\n return file_handle_to_path_map\n
"},{"location":"reference/client/#synapseclient.Synapse.getPermissions","title":"getPermissions(entity, principalId=None)
","text":"Get the permissions that a user or group has on an Entity. Arguments: entity: An Entity or Synapse ID to lookup principalId: Identifier of a user or group (defaults to PUBLIC users)
RETURNS DESCRIPTION
An array containing some combination of
['READ', 'CREATE', 'UPDATE', 'DELETE', 'CHANGE_PERMISSIONS', 'DOWNLOAD']
or an empty array
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getPermissions\")\ndef getPermissions(\n self,\n entity: Union[Entity, Evaluation, str, collections.abc.Mapping],\n principalId: str = None,\n):\n \"\"\"Get the permissions that a user or group has on an Entity.\n Arguments:\n entity: An Entity or Synapse ID to lookup\n principalId: Identifier of a user or group (defaults to PUBLIC users)\n\n Returns:\n An array containing some combination of\n ['READ', 'CREATE', 'UPDATE', 'DELETE', 'CHANGE_PERMISSIONS', 'DOWNLOAD']\n or an empty array\n \"\"\"\n principal_id = self._getUserbyPrincipalIdOrName(principalId)\n\n trace.get_current_span().set_attributes(\n {\"synapse.id\": id_of(entity), \"synapse.principal_id\": principal_id}\n )\n\n acl = self._getACL(entity)\n\n team_list = self._find_teams_for_principal(principal_id)\n team_ids = [int(team.id) for team in team_list]\n effective_permission_set = set()\n\n # This user_profile_bundle is being used to verify that the principal_id is a registered user of the system\n user_profile_bundle = self._get_user_bundle(principal_id, 1)\n\n # Loop over all permissions in the returned ACL and add it to the effective_permission_set\n # if the principalId in the ACL matches\n # 1) the one we are looking for,\n # 2) a team the entity is a member of,\n # 3) PUBLIC\n # 4) A user_profile_bundle exists for the principal_id\n for permissions in acl[\"resourceAccess\"]:\n if \"principalId\" in permissions and (\n permissions[\"principalId\"] == principal_id\n or permissions[\"principalId\"] in team_ids\n or permissions[\"principalId\"] == PUBLIC\n or (\n permissions[\"principalId\"] == AUTHENTICATED_USERS\n and user_profile_bundle is not None\n )\n ):\n effective_permission_set = effective_permission_set.union(\n permissions[\"accessType\"]\n )\n return list(effective_permission_set)\n
"},{"location":"reference/client/#synapseclient.Synapse.setPermissions","title":"setPermissions(entity, principalId=None, accessType=['READ', 'DOWNLOAD'], modify_benefactor=False, warn_if_inherits=True, overwrite=True)
","text":"Sets permission that a user or group has on an Entity. An Entity may have its own ACL or inherit its ACL from a benefactor.
PARAMETER DESCRIPTIONentity
An Entity or Synapse ID to modify
principalId
Identifier of a user or group. '273948' is for all registered Synapse users and '273949' is for public access.
DEFAULT: None
accessType
Type of permission to be granted. One or more of CREATE, READ, DOWNLOAD, UPDATE, DELETE, CHANGE_PERMISSIONS
DEFAULT: ['READ', 'DOWNLOAD']
modify_benefactor
Set as True when modifying a benefactor's ACL
DEFAULT: False
warn_if_inherits
Set as False when creating a new ACL; trying to modify the ACL of an Entity that inherits its ACL will result in a warning
DEFAULT: True
overwrite
By default this function overwrites existing permissions for the specified user. Set this flag to False to add new permissions non-destructively.
DEFAULT: True
RETURNS DESCRIPTION
An Access Control List object
Using this functionSetting permissions
# Grant all registered users download access\nsyn.setPermissions('syn1234','273948',['READ','DOWNLOAD'])\n# Grant the public view access\nsyn.setPermissions('syn1234','273949',['READ'])\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::setPermissions\")\ndef setPermissions(\n self,\n entity,\n principalId=None,\n accessType=[\"READ\", \"DOWNLOAD\"],\n modify_benefactor=False,\n warn_if_inherits=True,\n overwrite=True,\n):\n \"\"\"\n Sets permission that a user or group has on an Entity.\n An Entity may have its own ACL or inherit its ACL from a benefactor.\n\n Arguments:\n entity: An Entity or Synapse ID to modify\n principalId: Identifier of a user or group. '273948' is for all registered Synapse users\n and '273949' is for public access.\n accessType: Type of permission to be granted. One or more of CREATE, READ, DOWNLOAD, UPDATE,\n DELETE, CHANGE_PERMISSIONS\n modify_benefactor: Set as True when modifying a benefactor's ACL\n warn_if_inherits: Set as False, when creating a new ACL.\n Trying to modify the ACL of an Entity that inherits its ACL will result in a warning\n overwrite: By default this function overwrites existing permissions for the specified user.\n Set this flag to False to add new permissions non-destructively.\n\n Returns:\n An Access Control List object\n\n Example: Using this function\n Setting permissions\n\n # Grant all registered users download access\n syn.setPermissions('syn1234','273948',['READ','DOWNLOAD'])\n # Grant the public view access\n syn.setPermissions('syn1234','273949',['READ'])\n \"\"\"\n entity_id = id_of(entity)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n benefactor = self._getBenefactor(entity)\n if benefactor[\"id\"] != entity_id:\n if modify_benefactor:\n entity = benefactor\n elif warn_if_inherits:\n self.logger.warning(\n \"Creating an ACL for entity %s, which formerly inherited access control from a\"\n ' benefactor entity, \"%s\" (%s).\\n'\n % (entity_id, benefactor[\"name\"], benefactor[\"id\"])\n )\n\n acl = self._getACL(entity)\n\n principalId = self._getUserbyPrincipalIdOrName(principalId)\n\n # Find existing permissions\n permissions_to_update = None\n for permissions in acl[\"resourceAccess\"]:\n if (\n \"principalId\" in permissions\n and permissions[\"principalId\"] == principalId\n ):\n permissions_to_update = permissions\n break\n\n if accessType is None or accessType == []:\n # remove permissions\n if permissions_to_update and overwrite:\n acl[\"resourceAccess\"].remove(permissions_to_update)\n else:\n # add a 'resourceAccess' entry, if necessary\n if not permissions_to_update:\n permissions_to_update = {\"accessType\": [], \"principalId\": principalId}\n acl[\"resourceAccess\"].append(permissions_to_update)\n if overwrite:\n permissions_to_update[\"accessType\"] = accessType\n else:\n permissions_to_update[\"accessType\"] = list(\n set(permissions_to_update[\"accessType\"]) | set(accessType)\n )\n return self._storeACL(entity, acl)\n
"},{"location":"reference/client/#synapseclient.Synapse.getProvenance","title":"getProvenance(entity, version=None)
","text":"Retrieve provenance information for a Synapse Entity.
PARAMETER DESCRIPTIONentity
An Entity or Synapse ID to lookup
version
The version of the Entity to retrieve. Gets the most recent version if omitted
DEFAULT: None
RETURNS DESCRIPTION
Activity
An Activity object; raises an exception if no provenance record exists
RAISES DESCRIPTIONSynapseHTTPError
if no provenance record exists
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getProvenance\")\ndef getProvenance(self, entity, version=None) -> Activity:\n \"\"\"\n Retrieve provenance information for a Synapse Entity.\n\n Arguments:\n entity: An Entity or Synapse ID to lookup\n version: The version of the Entity to retrieve. Gets the most recent version if omitted\n\n Returns:\n An Activity object or raises exception if no provenance record exists\n\n Raises:\n SynapseHTTPError: if no provenance record exists\n \"\"\"\n\n # Get versionNumber from Entity\n if version is None and \"versionNumber\" in entity:\n version = entity[\"versionNumber\"]\n entity_id = id_of(entity)\n if version:\n uri = \"/entity/%s/version/%d/generatedBy\" % (entity_id, version)\n else:\n uri = \"/entity/%s/generatedBy\" % entity_id\n\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n return Activity(data=self.restGET(uri))\n
"},{"location":"reference/client/#synapseclient.Synapse.setProvenance","title":"setProvenance(entity, activity)
","text":"Stores a record of the code and data used to derive a Synapse entity.
PARAMETER DESCRIPTIONentity
An Entity or Synapse ID to modify
activity
A synapseclient.activity.Activity
RETURNS DESCRIPTION
Activity
An updated synapseclient.activity.Activity object
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::setProvenance\")\ndef setProvenance(self, entity, activity) -> Activity:\n \"\"\"\n Stores a record of the code and data used to derive a Synapse entity.\n\n Arguments:\n entity: An Entity or Synapse ID to modify\n activity: A [synapseclient.activity.Activity][]\n\n Returns:\n An updated [synapseclient.activity.Activity][] object\n \"\"\"\n\n # Assert that the entity was generated by a given Activity.\n activity = self._saveActivity(activity)\n\n entity_id = id_of(entity)\n # assert that an entity is generated by an activity\n uri = \"/entity/%s/generatedBy?generatedBy=%s\" % (entity_id, activity[\"id\"])\n activity = Activity(data=self.restPUT(uri))\n\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n return activity\n
"},{"location":"reference/client/#synapseclient.Synapse.deleteProvenance","title":"deleteProvenance(entity)
","text":"Removes provenance information from an Entity and deletes the associated Activity.
PARAMETER DESCRIPTIONentity
An Entity or Synapse ID to modify
Source code in
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::deleteProvenance\")\ndef deleteProvenance(self, entity) -> None:\n \"\"\"\n Removes provenance information from an Entity and deletes the associated Activity.\n\n Arguments:\n entity: An Entity or Synapse ID to modify\n \"\"\"\n\n activity = self.getProvenance(entity)\n if not activity:\n return\n entity_id = id_of(entity)\n trace.get_current_span().set_attributes({\"synapse.id\": entity_id})\n\n uri = \"/entity/%s/generatedBy\" % entity_id\n self.restDELETE(uri)\n\n # TODO: what happens if the activity is shared by more than one entity?\n uri = \"/activity/%s\" % activity[\"id\"]\n self.restDELETE(uri)\n
"},{"location":"reference/client/#synapseclient.Synapse.updateActivity","title":"updateActivity(activity)
","text":"Modifies an existing Activity.
PARAMETER DESCRIPTIONactivity
The Activity to be updated.
RETURNS DESCRIPTION
An updated Activity object
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::updateActivity\")\ndef updateActivity(self, activity):\n \"\"\"\n Modifies an existing Activity.\n\n Arguments:\n activity: The Activity to be updated.\n\n Returns:\n An updated Activity object\n \"\"\"\n if \"id\" not in activity:\n raise ValueError(\"The activity you want to update must exist on Synapse\")\n trace.get_current_span().set_attributes({\"synapse.id\": activity[\"id\"]})\n return self._saveActivity(activity)\n
"},{"location":"reference/client/#synapseclient.Synapse.findEntityId","title":"findEntityId(name, parent=None)
","text":"Find an Entity given its name and parent.
PARAMETER DESCRIPTIONname
Name of the entity to find
parent
An Entity object or the Id of an entity as a string. Omit if searching for a Project by name
DEFAULT: None
RETURNS DESCRIPTION
The Entity ID or None if not found
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::findEntityId\")\ndef findEntityId(self, name, parent=None):\n \"\"\"\n Find an Entity given its name and parent.\n\n Arguments:\n name: Name of the entity to find\n parent: An Entity object or the Id of an entity as a string. Omit if searching for a Project by name\n\n Returns:\n The Entity ID or None if not found\n \"\"\"\n # when we want to search for a project by name. set parentId as None instead of ROOT_ENTITY\n entity_lookup_request = {\n \"parentId\": id_of(parent) if parent else None,\n \"entityName\": name,\n }\n try:\n return self.restPOST(\n \"/entity/child\", body=json.dumps(entity_lookup_request)\n ).get(\"id\")\n except SynapseHTTPError as e:\n if (\n e.response.status_code == 404\n ): # a 404 error is raised if the entity does not exist\n return None\n raise\n
"},{"location":"reference/client/#synapseclient.Synapse.getChildren","title":"getChildren(parent, includeTypes=['folder', 'file', 'table', 'link', 'entityview', 'dockerrepo', 'submissionview', 'dataset', 'materializedview'], sortBy='NAME', sortDirection='ASC')
","text":"Retrieves all of the entities stored within a parent such as folder or project.
PARAMETER DESCRIPTIONparent
An id or an object of a Synapse container or None to retrieve all projects
includeTypes
Must be a list of entity types (i.e. ["folder", "file"]) which can be found here: http://docs.synapse.org/rest/org/sagebionetworks/repo/model/EntityType.html
DEFAULT: ['folder', 'file', 'table', 'link', 'entityview', 'dockerrepo', 'submissionview', 'dataset', 'materializedview']
sortBy
How results should be sorted. Can be NAME, or CREATED_ON
DEFAULT: 'NAME'
sortDirection
The direction of the result sort. Can be ASC, or DESC
DEFAULT: 'ASC'
YIELDS DESCRIPTION
An iterator that shows all the children of the container.
Also see:
synapseutils.walk
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getChildren\")\ndef getChildren(\n self,\n parent,\n includeTypes=[\n \"folder\",\n \"file\",\n \"table\",\n \"link\",\n \"entityview\",\n \"dockerrepo\",\n \"submissionview\",\n \"dataset\",\n \"materializedview\",\n ],\n sortBy=\"NAME\",\n sortDirection=\"ASC\",\n):\n \"\"\"\n Retrieves all of the entities stored within a parent such as folder or project.\n\n Arguments:\n parent: An id or an object of a Synapse container or None to retrieve all projects\n includeTypes: Must be a list of entity types (ie. [\"folder\",\"file\"]) which can be found here:\n http://docs.synapse.org/rest/org/sagebionetworks/repo/model/EntityType.html\n sortBy: How results should be sorted. Can be NAME, or CREATED_ON\n sortDirection: The direction of the result sort. Can be ASC, or DESC\n\n Yields:\n An iterator that shows all the children of the container.\n\n Also see:\n\n - [synapseutils.walk][]\n \"\"\"\n parentId = id_of(parent) if parent is not None else None\n\n trace.get_current_span().set_attributes({\"synapse.parent_id\": parentId})\n entityChildrenRequest = {\n \"parentId\": parentId,\n \"includeTypes\": includeTypes,\n \"sortBy\": sortBy,\n \"sortDirection\": sortDirection,\n \"nextPageToken\": None,\n }\n entityChildrenResponse = {\"nextPageToken\": \"first\"}\n while entityChildrenResponse.get(\"nextPageToken\") is not None:\n entityChildrenResponse = self.restPOST(\n \"/entity/children\", body=json.dumps(entityChildrenRequest)\n )\n for child in entityChildrenResponse[\"page\"]:\n yield child\n if entityChildrenResponse.get(\"nextPageToken\") is not None:\n entityChildrenRequest[\"nextPageToken\"] = entityChildrenResponse[\n \"nextPageToken\"\n ]\n
"},{"location":"reference/client/#synapseclient.Synapse.getTeam","title":"getTeam(id)
","text":"Finds a team with a given ID or name.
PARAMETER DESCRIPTIONid
The ID or name of the team or a Team object to retrieve.
RETURNS DESCRIPTION
An object of type synapseclient.team.Team
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getTeam\")\ndef getTeam(self, id):\n \"\"\"\n Finds a team with a given ID or name.\n\n Arguments:\n id: The ID or name of the team or a Team object to retrieve.\n\n Returns:\n An object of type [synapseclient.team.Team][]\n \"\"\"\n # Retrieves team id\n teamid = id_of(id)\n try:\n int(teamid)\n except (TypeError, ValueError):\n if isinstance(id, str):\n for team in self._findTeam(id):\n if team.name == id:\n teamid = team.id\n break\n else:\n raise ValueError('Can\\'t find team \"{}\"'.format(teamid))\n else:\n raise ValueError('Can\\'t find team \"{}\"'.format(teamid))\n return Team(**self.restGET(\"/team/%s\" % teamid))\n
"},{"location":"reference/client/#synapseclient.Synapse.getTeamMembers","title":"getTeamMembers(team)
","text":"Lists the members of the given team.
PARAMETER DESCRIPTIONteam
A synapseclient.team.Team object or a team's ID.
YIELDS DESCRIPTION
A generator over synapseclient.team.TeamMember objects.
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getTeamMembers\")\ndef getTeamMembers(self, team):\n \"\"\"\n Lists the members of the given team.\n\n Arguments:\n team: A [synapseclient.team.Team][] object or a team's ID.\n\n Yields:\n A generator over [synapseclient.team.TeamMember][] objects.\n\n \"\"\"\n for result in self._GET_paginated(\"/teamMembers/{id}\".format(id=id_of(team))):\n yield TeamMember(**result)\n
"},{"location":"reference/client/#synapseclient.Synapse.invite_to_team","title":"invite_to_team(team, user=None, inviteeEmail=None, message=None, force=False)
","text":"Invite user to a Synapse team via Synapse username or email (choose one or the other)
PARAMETER DESCRIPTIONteam
A synapseclient.team.Team object or a team's ID.
user
Synapse username or profile id of user
DEFAULT: None
inviteeEmail
Email of user
DEFAULT: None
message
Additional message for the user getting invited to the team.
DEFAULT: None
force
If an open invitation exists for the invitee, the old invite will be cancelled.
DEFAULT: False
RETURNS DESCRIPTION
MembershipInvitation or None if user is already a member
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::invite_to_team\")\ndef invite_to_team(\n self, team, user=None, inviteeEmail=None, message=None, force=False\n):\n \"\"\"Invite user to a Synapse team via Synapse username or email\n (choose one or the other)\n\n Arguments:\n syn: Synapse object\n team: A [synapseclient.team.Team][] object or a team's ID.\n user: Synapse username or profile id of user\n inviteeEmail: Email of user\n message: Additional message for the user getting invited to the team.\n force: If an open invitation exists for the invitee, the old invite will be cancelled.\n\n Returns:\n MembershipInvitation or None if user is already a member\n \"\"\"\n # Throw error if both user and email is specified and if both not\n # specified\n id_email_specified = inviteeEmail is not None and user is not None\n id_email_notspecified = inviteeEmail is None and user is None\n if id_email_specified or id_email_notspecified:\n raise ValueError(\"Must specify either 'user' or 'inviteeEmail'\")\n\n teamid = id_of(team)\n is_member = False\n open_invitations = self.get_team_open_invitations(teamid)\n\n if user is not None:\n inviteeId = self.getUserProfile(user)[\"ownerId\"]\n membership_status = self.get_membership_status(inviteeId, teamid)\n is_member = membership_status[\"isMember\"]\n open_invites_to_user = [\n invitation\n for invitation in open_invitations\n if invitation.get(\"inviteeId\") == inviteeId\n ]\n else:\n inviteeId = None\n open_invites_to_user = [\n invitation\n for invitation in open_invitations\n if invitation.get(\"inviteeEmail\") == inviteeEmail\n ]\n # Only invite if the invitee is not a member and\n # if invitee doesn't have an open invitation unless force=True\n if not is_member and (not open_invites_to_user or force):\n # Delete all old invitations\n for invite in open_invites_to_user:\n self._delete_membership_invitation(invite[\"id\"])\n return self.send_membership_invitation(\n teamid, inviteeId=inviteeId, inviteeEmail=inviteeEmail, message=message\n )\n if is_member:\n not_sent_reason = \"invitee is already a member\"\n else:\n not_sent_reason = (\n \"invitee already has an open invitation \"\n \"Set force=True to send new invite.\"\n )\n\n self.logger.warning(\"No invitation sent: {}\".format(not_sent_reason))\n # Return None if no invite is sent.\n return None\n
"},{"location":"reference/client/#synapseclient.Synapse.get_membership_status","title":"get_membership_status(userid, team)
","text":"Retrieve a user's Team Membership Status bundle. https://rest-docs.synapse.org/rest/GET/team/id/member/principalId/membershipStatus.html
PARAMETER DESCRIPTIONuserid
Synapse user ID
team
A synapseclient.team.Team object or a team's ID.
RETURNS DESCRIPTION
dict of TeamMembershipStatus
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_membership_status\")\ndef get_membership_status(self, userid, team):\n \"\"\"Retrieve a user's Team Membership Status bundle.\n https://rest-docs.synapse.org/rest/GET/team/id/member/principalId/membershipStatus.html\n\n Arguments:\n user: Synapse user ID\n team: A [synapseclient.team.Team][] object or a team's ID.\n\n Returns:\n dict of TeamMembershipStatus\n \"\"\"\n teamid = id_of(team)\n request = \"/team/{team}/member/{user}/membershipStatus\".format(\n team=teamid, user=userid\n )\n membership_status = self.restGET(request)\n return membership_status\n
"},{"location":"reference/client/#synapseclient.Synapse.get_team_open_invitations","title":"get_team_open_invitations(team)
","text":"Retrieve the open requests submitted to a Team https://rest-docs.synapse.org/rest/GET/team/id/openInvitation.html
PARAMETER DESCRIPTIONteam
A synapseclient.team.Team object or a team's ID.
YIELDS DESCRIPTION
Generator of MembershipInvitation
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_team_open_invitations\")\ndef get_team_open_invitations(self, team):\n \"\"\"Retrieve the open requests submitted to a Team\n https://rest-docs.synapse.org/rest/GET/team/id/openInvitation.html\n\n Arguments:\n team: A [synapseclient.team.Team][] object or a team's ID.\n\n Yields:\n Generator of MembershipRequest\n \"\"\"\n teamid = id_of(team)\n request = \"/team/{team}/openInvitation\".format(team=teamid)\n open_requests = self._GET_paginated(request)\n return open_requests\n
"},{"location":"reference/client/#synapseclient.Synapse.send_membership_invitation","title":"send_membership_invitation(teamId, inviteeId=None, inviteeEmail=None, message=None)
","text":"Create a membership invitation and send an email notification to the invitee.
PARAMETER DESCRIPTIONteamId
Synapse teamId
inviteeId
Synapse username or profile id of user
DEFAULT: None
inviteeEmail
Email of user
DEFAULT: None
message
Additional message for the user getting invited to the team.
DEFAULT: None
RETURNS DESCRIPTION
MembershipInvitation
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::send_membership_invitation\")\ndef send_membership_invitation(\n self, teamId, inviteeId=None, inviteeEmail=None, message=None\n):\n \"\"\"Create a membership invitation and send an email notification\n to the invitee.\n\n Arguments:\n teamId: Synapse teamId\n inviteeId: Synapse username or profile id of user\n inviteeEmail: Email of user\n message: Additional message for the user getting invited to the\n team.\n\n Returns:\n MembershipInvitation\n \"\"\"\n\n invite_request = {\"teamId\": str(teamId), \"message\": message}\n if inviteeEmail is not None:\n invite_request[\"inviteeEmail\"] = str(inviteeEmail)\n if inviteeId is not None:\n invite_request[\"inviteeId\"] = str(inviteeId)\n\n response = self.restPOST(\n \"/membershipInvitation\", body=json.dumps(invite_request)\n )\n return response\n
"},{"location":"reference/client/#synapseclient.Synapse.submit","title":"submit(evaluation, entity, name=None, team=None, silent=False, submitterAlias=None, teamName=None, dockerTag='latest')
","text":"Submit an Entity for evaluation.
PARAMETER DESCRIPTIONevaluation
Evaluation queue to submit to
entity
The Entity containing the Submissions
name
A name for this submission. In the absence of this parameter, the entity name will be used.
DEFAULT: None
team
(optional) A synapseclient.team.Team object, ID or name of a Team that is registered for the challenge
DEFAULT: None
silent
Set to True to suppress output.
DEFAULT: False
submitterAlias
(optional) A nickname, possibly for display in leaderboards in place of the submitter's name
DEFAULT: None
teamName
(deprecated) A synonym for submitterAlias
DEFAULT: None
dockerTag
(optional) The Docker tag must be specified if the entity is a DockerRepository.
DEFAULT: 'latest'
RETURNS DESCRIPTION
A synapseclient.evaluation.Submission object
In the case of challenges, a team can optionally be provided to give credit to members of the team that contributed to the submission. The team must be registered for the challenge with which the given evaluation is associated. The caller must be a member of the submitting team.
Using this functionGetting and submitting an evaluation
evaluation = syn.getEvaluation(123)\nentity = syn.get('syn456')\nsubmission = syn.submit(evaluation, entity, name='Our Final Answer', team='Blue Team')\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::submit\")\ndef submit(\n self,\n evaluation,\n entity,\n name=None,\n team=None,\n silent=False,\n submitterAlias=None,\n teamName=None,\n dockerTag=\"latest\",\n):\n \"\"\"\n Submit an Entity for [evaluation][synapseclient.evaluation.Evaluation].\n\n Arguments:\n evalation: Evaluation queue to submit to\n entity: The Entity containing the Submissions\n name: A name for this submission. In the absent of this parameter, the entity name will be used.\n (Optional) A [synapseclient.team.Team][] object, ID or name of a Team that is registered for the challenge\n team: (optional) A [synapseclient.team.Team][] object, ID or name of a Team that is registered for the challenge\n silent: Set to True to suppress output.\n submitterAlias: (optional) A nickname, possibly for display in leaderboards in place of the submitter's name\n teamName: (deprecated) A synonym for submitterAlias\n dockerTag: (optional) The Docker tag must be specified if the entity is a DockerRepository.\n\n Returns:\n A [synapseclient.evaluation.Submission][] object\n\n\n In the case of challenges, a team can optionally be provided to give credit to members of the team that\n contributed to the submission. The team must be registered for the challenge with which the given evaluation is\n associated. The caller must be a member of the submitting team.\n\n Example: Using this function\n Getting and submitting an evaluation\n\n evaluation = syn.getEvaluation(123)\n entity = syn.get('syn456')\n submission = syn.submit(evaluation, entity, name='Our Final Answer', team='Blue Team')\n \"\"\"\n\n require_param(evaluation, \"evaluation\")\n require_param(entity, \"entity\")\n\n evaluation_id = id_of(evaluation)\n\n entity_id = id_of(entity)\n if isinstance(entity, synapseclient.DockerRepository):\n # Edge case if dockerTag is specified as None\n if dockerTag is None:\n raise ValueError(\n \"A dockerTag is required to submit a DockerEntity. 
Cannot be None\"\n )\n docker_repository = entity[\"repositoryName\"]\n else:\n docker_repository = None\n\n if \"versionNumber\" not in entity:\n entity = self.get(entity, downloadFile=False)\n # version defaults to 1 to hack around required version field and allow submission of files/folders\n entity_version = entity.get(\"versionNumber\", 1)\n\n # default name of submission to name of entity\n if name is None and \"name\" in entity:\n name = entity[\"name\"]\n\n team_id = None\n if team:\n team = self.getTeam(team)\n team_id = id_of(team)\n\n contributors, eligibility_hash = self._get_contributors(evaluation_id, team)\n\n # for backward compatible until we remove supports for teamName\n if not submitterAlias:\n if teamName:\n submitterAlias = teamName\n elif team and \"name\" in team:\n submitterAlias = team[\"name\"]\n\n if isinstance(entity, synapseclient.DockerRepository):\n docker_digest = self._get_docker_digest(entity, dockerTag)\n else:\n docker_digest = None\n\n submission = {\n \"evaluationId\": evaluation_id,\n \"name\": name,\n \"entityId\": entity_id,\n \"versionNumber\": entity_version,\n \"dockerDigest\": docker_digest,\n \"dockerRepositoryName\": docker_repository,\n \"teamId\": team_id,\n \"contributors\": contributors,\n \"submitterAlias\": submitterAlias,\n }\n\n submitted = self._submit(submission, entity[\"etag\"], eligibility_hash)\n\n # if we want to display the receipt message, we need the full object\n if not silent:\n if not (isinstance(evaluation, Evaluation)):\n evaluation = self.getEvaluation(evaluation_id)\n if \"submissionReceiptMessage\" in evaluation:\n self.logger.info(evaluation[\"submissionReceiptMessage\"])\n\n return Submission(**submitted)\n
"},{"location":"reference/client/#synapseclient.Synapse.getConfigFile","title":"getConfigFile(configPath)
cached
","text":"Retrieves the client configuration information.
PARAMETER DESCRIPTIONconfigPath
Path to configuration file on local file system
TYPE: str
RawConfigParser
A RawConfigParser populated with properties from the user's configuration file.
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getConfigFile\")\n@functools.lru_cache()\ndef getConfigFile(self, configPath: str) -> configparser.RawConfigParser:\n \"\"\"\n Retrieves the client configuration information.\n\n Arguments:\n configPath: Path to configuration file on local file system\n\n Returns:\n A RawConfigParser populated with properties from the user's configuration file.\n \"\"\"\n\n try:\n config = configparser.RawConfigParser()\n config.read(configPath) # Does not fail if the file does not exist\n return config\n except configparser.Error as ex:\n raise ValueError(\n \"Error parsing Synapse config file: {}\".format(configPath)\n ) from ex\n
"},{"location":"reference/client/#synapseclient.Synapse.setEndpoints","title":"setEndpoints(repoEndpoint=None, authEndpoint=None, fileHandleEndpoint=None, portalEndpoint=None, skip_checks=False)
","text":"Sets the locations for each of the Synapse services (mostly useful for testing).
PARAMETER DESCRIPTIONrepoEndpoint
Location of synapse repository
TYPE: str
DEFAULT: None
authEndpoint
Location of authentication service
TYPE: str
DEFAULT: None
fileHandleEndpoint
Location of file service
TYPE: str
DEFAULT: None
portalEndpoint
Location of the website
TYPE: str
DEFAULT: None
skip_checks
Skip version and endpoint checks
TYPE: bool
DEFAULT: False
To switch between staging and production endpoints
syn.setEndpoints(**synapseclient.client.STAGING_ENDPOINTS)\nsyn.setEndpoints(**synapseclient.client.PRODUCTION_ENDPOINTS)\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::setEndpoints\")\ndef setEndpoints(\n self,\n repoEndpoint: str = None,\n authEndpoint: str = None,\n fileHandleEndpoint: str = None,\n portalEndpoint: str = None,\n skip_checks: bool = False,\n):\n \"\"\"\n Sets the locations for each of the Synapse services (mostly useful for testing).\n\n Arguments:\n repoEndpoint: Location of synapse repository\n authEndpoint: Location of authentication service\n fileHandleEndpoint: Location of file service\n portalEndpoint: Location of the website\n skip_checks: Skip version and endpoint checks\n\n Example: Switching endpoints\n To switch between staging and production endpoints\n\n syn.setEndpoints(**synapseclient.client.STAGING_ENDPOINTS)\n syn.setEndpoints(**synapseclient.client.PRODUCTION_ENDPOINTS)\n\n \"\"\"\n\n endpoints = {\n \"repoEndpoint\": repoEndpoint,\n \"authEndpoint\": authEndpoint,\n \"fileHandleEndpoint\": fileHandleEndpoint,\n \"portalEndpoint\": portalEndpoint,\n }\n\n # For unspecified endpoints, first look in the config file\n config = self.getConfigFile(self.configPath)\n for point in endpoints.keys():\n if endpoints[point] is None and config.has_option(\"endpoints\", point):\n endpoints[point] = config.get(\"endpoints\", point)\n\n # Endpoints default to production\n for point in endpoints.keys():\n if endpoints[point] is None:\n endpoints[point] = PRODUCTION_ENDPOINTS[point]\n\n # Update endpoints if we get redirected\n if not skip_checks:\n response = self._requests_session.get(\n endpoints[point],\n allow_redirects=False,\n headers=synapseclient.USER_AGENT,\n )\n if response.status_code == 301:\n endpoints[point] = response.headers[\"location\"]\n\n self.repoEndpoint = endpoints[\"repoEndpoint\"]\n self.authEndpoint = endpoints[\"authEndpoint\"]\n self.fileHandleEndpoint = endpoints[\"fileHandleEndpoint\"]\n self.portalEndpoint = endpoints[\"portalEndpoint\"]\n
"},{"location":"reference/client/#synapseclient.Synapse.invalidateAPIKey","title":"invalidateAPIKey()
","text":"Invalidates authentication across all clients.
RETURNS DESCRIPTIONNone
Source code in synapseclient/client.py
def invalidateAPIKey(self):\n \"\"\"Invalidates authentication across all clients.\n\n Returns:\n None\n \"\"\"\n\n # Logout globally\n if self._is_logged_in():\n self.restDELETE(\"/secretKey\", endpoint=self.authEndpoint)\n
"},{"location":"reference/client/#synapseclient.Synapse.get_user_profile_by_username","title":"get_user_profile_by_username(username=None, sessionToken=None)
cached
","text":"Get the details about a Synapse user. Retrieves information on the current user if 'id' is omitted or is empty string.
PARAMETER DESCRIPTIONusername
The userName of a user
TYPE: str
DEFAULT: None
sessionToken
The session token to use to find the user profile
TYPE: str
DEFAULT: None
UserProfile
The user profile for the user of interest.
Using this functionGetting your own profile
my_profile = syn.get_user_profile_by_username()\n
Getting another user's profile
freds_profile = syn.get_user_profile_by_username('fredcommo')\n
Source code in synapseclient/client.py
@functools.lru_cache()\ndef get_user_profile_by_username(\n    self,\n    username: str = None,\n    sessionToken: str = None,\n) -> UserProfile:\n    \"\"\"\n    Get the details about a Synapse user.\n    Retrieves information on the current user if 'username' is omitted or is an empty string.\n\n    Arguments:\n        username: The userName of a user\n        sessionToken: The session token to use to find the user profile\n\n    Returns:\n        The user profile for the user of interest.\n\n    Example: Using this function\n        Getting your own profile\n\n            my_profile = syn.get_user_profile_by_username()\n\n        Getting another user's profile\n\n            freds_profile = syn.get_user_profile_by_username('fredcommo')\n    \"\"\"\n    is_none = username is None\n    is_str = isinstance(username, str)\n    if not is_str and not is_none:\n        raise TypeError(\"username must be string or None\")\n    if is_str:\n        principals = self._findPrincipals(username)\n        for principal in principals:\n            if principal.get(\"userName\", None).lower() == username.lower():\n                id = principal[\"ownerId\"]\n                break\n        else:\n            raise ValueError(f\"Can't find user '{username}'\")\n    else:\n        id = \"\"\n    uri = f\"/userProfile/{id}\"\n    return UserProfile(\n        **self.restGET(\n            uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n        )\n    )\n
"},{"location":"reference/client/#synapseclient.Synapse.get_user_profile_by_id","title":"get_user_profile_by_id(id=None, sessionToken=None)
cached
","text":"Get the details about a Synapse user. Retrieves information on the current user if 'id' is omitted.
PARAMETER DESCRIPTIONid
The ownerId of a user
TYPE: int
DEFAULT: None
sessionToken
The session token to use to find the user profile
TYPE: str
DEFAULT: None
UserProfile
The user profile for the user of interest.
Using this functionGetting your own profile
my_profile = syn.get_user_profile_by_id()\n
Getting another user's profile
freds_profile = syn.get_user_profile_by_id(1234567)\n
Source code in synapseclient/client.py
@functools.lru_cache()\ndef get_user_profile_by_id(\n self,\n id: int = None,\n sessionToken: str = None,\n) -> UserProfile:\n \"\"\"\n Get the details about a Synapse user.\n Retrieves information on the current user if 'id' is omitted.\n\n Arguments:\n id: The ownerId of a user\n sessionToken: The session token to use to find the user profile\n\n Returns:\n The user profile for the user of interest.\n\n\n Example: Using this function\n Getting your own profile\n\n my_profile = syn.get_user_profile_by_id()\n\n Getting another user's profile\n\n freds_profile = syn.get_user_profile_by_id(1234567)\n \"\"\"\n if id:\n if not isinstance(id, int):\n raise TypeError(\"id must be an 'ownerId' integer\")\n else:\n id = \"\"\n uri = f\"/userProfile/{id}\"\n return UserProfile(\n **self.restGET(\n uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n )\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.getUserProfile","title":"getUserProfile(id=None, sessionToken=None)
cached
","text":"Get the details about a Synapse user. Retrieves information on the current user if 'id' is omitted.
PARAMETER DESCRIPTIONid
The 'userId' (aka 'ownerId') of a user or the userName
TYPE: Union[str, int, UserProfile, TeamMember]
DEFAULT: None
sessionToken
The session token to use to find the user profile
TYPE: str
DEFAULT: None
UserProfile
The user profile for the user of interest.
Using this functionGetting your own profile
my_profile = syn.getUserProfile()\n
Getting another user's profile
freds_profile = syn.getUserProfile('fredcommo')\n
Source code in synapseclient/client.py
@functools.lru_cache()\ndef getUserProfile(\n self,\n id: Union[str, int, UserProfile, TeamMember] = None,\n sessionToken: str = None,\n) -> UserProfile:\n \"\"\"\n Get the details about a Synapse user.\n Retrieves information on the current user if 'id' is omitted.\n\n Arguments:\n id: The 'userId' (aka 'ownerId') of a user or the userName\n sessionToken: The session token to use to find the user profile\n\n Returns:\n The user profile for the user of interest.\n\n Example: Using this function\n Getting your own profile\n\n my_profile = syn.getUserProfile()\n\n Getting another user's profile\n\n freds_profile = syn.getUserProfile('fredcommo')\n \"\"\"\n try:\n # if id is unset or a userID, this will succeed\n id = \"\" if id is None else int(id)\n except (TypeError, ValueError):\n if isinstance(id, collections.abc.Mapping) and \"ownerId\" in id:\n id = id.ownerId\n elif isinstance(id, TeamMember):\n id = id.member.ownerId\n else:\n principals = self._findPrincipals(id)\n if len(principals) == 1:\n id = principals[0][\"ownerId\"]\n else:\n for principal in principals:\n if principal.get(\"userName\", None).lower() == id.lower():\n id = principal[\"ownerId\"]\n break\n else: # no break\n raise ValueError('Can\\'t find user \"%s\": ' % id)\n uri = \"/userProfile/%s\" % id\n return UserProfile(\n **self.restGET(\n uri, headers={\"sessionToken\": sessionToken} if sessionToken else None\n )\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.is_certified","title":"is_certified(user)
","text":"Determines whether a Synapse user is a certified user.
PARAMETER DESCRIPTIONuser
Synapse username or Id
TYPE: Union[str, int]
bool
True if the Synapse user is certified
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::is_certified\")\ndef is_certified(self, user: typing.Union[str, int]) -> bool:\n \"\"\"Determines whether a Synapse user is a certified user.\n\n Arguments:\n user: Synapse username or Id\n\n Returns:\n True if the Synapse user is certified\n \"\"\"\n # Check if userid or username exists\n syn_user = self.getUserProfile(user)\n # Get passing record\n\n try:\n certification_status = self._get_certified_passing_record(\n syn_user[\"ownerId\"]\n )\n return certification_status[\"passed\"]\n except SynapseHTTPError as ex:\n if ex.response.status_code == 404:\n # user hasn't taken the quiz\n return False\n raise\n
"},{"location":"reference/client/#synapseclient.Synapse.is_synapse_id","title":"is_synapse_id(syn_id)
","text":"Checks if given synID is valid (attached to actual entity?)
PARAMETER DESCRIPTIONsyn_id
A Synapse ID
TYPE: str
bool
True if the Synapse ID is valid
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::is_synapse_id\")\ndef is_synapse_id(self, syn_id: str) -> bool:\n \"\"\"Checks if given synID is valid (attached to actual entity?)\n\n Arguments:\n syn_id: A Synapse ID\n\n Returns:\n True if the Synapse ID is valid\n \"\"\"\n if isinstance(syn_id, str):\n try:\n self.get(syn_id, downloadFile=False)\n except SynapseFileNotFoundError:\n return False\n except (\n SynapseHTTPError,\n SynapseAuthenticationError,\n ) as err:\n status = (\n err.__context__.response.status_code or err.response.status_code\n )\n if status in (400, 404):\n return False\n # Valid ID but user lacks permission or is not logged in\n elif status == 403:\n return True\n return True\n self.logger.warning(\"synID must be a string\")\n return False\n
"},{"location":"reference/client/#synapseclient.Synapse.onweb","title":"onweb(entity, subpageId=None)
","text":"Opens up a browser window to the entity page or wiki-subpage.
PARAMETER DESCRIPTIONentity
Either an Entity or a Synapse ID
subpageId
(Optional) ID of one of the wiki's sub-pages
DEFAULT: None
None
Source code in synapseclient/client.py
def onweb(self, entity, subpageId=None):\n \"\"\"Opens up a browser window to the entity page or wiki-subpage.\n\n Arguments:\n entity: Either an Entity or a Synapse ID\n subpageId: (Optional) ID of one of the wiki's sub-pages\n\n Returns:\n None\n \"\"\"\n if isinstance(entity, str) and os.path.isfile(entity):\n entity = self.get(entity, downloadFile=False)\n synId = id_of(entity)\n if subpageId is None:\n webbrowser.open(\"%s#!Synapse:%s\" % (self.portalEndpoint, synId))\n else:\n webbrowser.open(\n \"%s#!Wiki:%s/ENTITY/%s\" % (self.portalEndpoint, synId, subpageId)\n )\n
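Example (a sketch; syn123 and the wiki sub-page ID are placeholders):
syn.onweb('syn123')\nsyn.onweb('syn123', subpageId='409')\n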
"},{"location":"reference/client/#synapseclient.Synapse.printEntity","title":"printEntity(entity, ensure_ascii=True)
","text":"Pretty prints an Entity.
PARAMETER DESCRIPTIONentity
The entity to be printed.
ensure_ascii
If True, escapes all non-ASCII characters
DEFAULT: True
None
None
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::printEntity\")\ndef printEntity(self, entity, ensure_ascii=True) -> None:\n \"\"\"\n Pretty prints an Entity.\n\n Arguments:\n entity: The entity to be printed.\n ensure_ascii: If True, escapes all non-ASCII characters\n\n Returns:\n None\n \"\"\"\n\n if utils.is_synapse_id_str(entity):\n entity = self._getEntity(entity)\n try:\n self.logger.info(\n json.dumps(entity, sort_keys=True, indent=2, ensure_ascii=ensure_ascii)\n )\n except TypeError:\n self.logger.info(str(entity))\n
"},{"location":"reference/client/#synapseclient.Synapse.get_available_services","title":"get_available_services()
","text":"Get available Synapse services This is a beta feature and is subject to change
RETURNS DESCRIPTIONList[str]
List of available services
Source code in synapseclient/client.py
def get_available_services(self) -> typing.List[str]:\n    \"\"\"Get available Synapse services.\n    This is a beta feature and is subject to change.\n\n    Returns:\n        List of available services\n    \"\"\"\n    services = self._services.keys()\n    return list(services)\n
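Example:
services = syn.get_available_services()\nprint(services)\n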
"},{"location":"reference/client/#synapseclient.Synapse.service","title":"service(service_name)
","text":"Get available Synapse services This is a beta feature and is subject to change
PARAMETER DESCRIPTIONservice_name
name of the service
TYPE: str
synapseclient/client.py
def service(self, service_name: str):\n    \"\"\"Get a Synapse service by name.\n    This is a beta feature and is subject to change.\n\n    Arguments:\n        service_name: name of the service\n    \"\"\"\n    # This is to avoid circular imports\n    # TODO: revisit the import order and method https://stackoverflow.com/a/37126790\n    # To move this to the top\n    import synapseclient.services\n\n    assert isinstance(service_name, str)\n    service_name = service_name.lower().replace(\" \", \"_\")\n    assert service_name in self._services, (\n        f\"Unrecognized service ({service_name}). Run the 'get_available_\"\n        \"services()' method to get a list of available services.\"\n    )\n    service_attr = self._services[service_name]\n    service_cls = getattr(synapseclient.services, service_attr)\n    service = service_cls(self)\n    return service\n
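Example (a sketch; assumes the 'json_schema' service is available in your installed client version):
json_schema_service = syn.service('json_schema')\n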
"},{"location":"reference/client/#synapseclient.Synapse.clear_download_list","title":"clear_download_list()
","text":"Clear all files from download list
Source code in synapseclient/client.py
def clear_download_list(self):\n \"\"\"Clear all files from download list\"\"\"\n self.restDELETE(\"/download/list\")\n
"},{"location":"reference/client/#synapseclient.Synapse.create_external_s3_file_handle","title":"create_external_s3_file_handle(bucket_name, s3_file_key, file_path, *, parent=None, storage_location_id=None, mimetype=None)
","text":"Create an external S3 file handle for e.g. a file that has been uploaded directly to an external S3 storage location.
PARAMETER DESCRIPTIONbucket_name
Name of the S3 bucket
s3_file_key
S3 key of the uploaded object
file_path
Local path of the uploaded file
parent
Parent entity to create the file handle in, the file handle will be created in the default storage location of the parent. Mutually exclusive with storage_location_id
DEFAULT: None
storage_location_id
Explicit storage location id to create the file handle in, mutually exclusive with parent
DEFAULT: None
mimetype
Mimetype of the file, if known
DEFAULT: None
ValueError
If neither parent nor storage_location_id is specified, or if both are specified.
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::create_external_s3_file_handle\")\ndef create_external_s3_file_handle(\n self,\n bucket_name,\n s3_file_key,\n file_path,\n *,\n parent=None,\n storage_location_id=None,\n mimetype=None,\n):\n \"\"\"\n Create an external S3 file handle for e.g. a file that has been uploaded directly to\n an external S3 storage location.\n\n Arguments:\n bucket_name: Name of the S3 bucket\n s3_file_key: S3 key of the uploaded object\n file_path: Local path of the uploaded file\n parent: Parent entity to create the file handle in, the file handle will be created\n in the default storage location of the parent. Mutually exclusive with\n storage_location_id\n storage_location_id: Explicit storage location id to create the file handle in, mutually exclusive\n with parent\n mimetype: Mimetype of the file, if known\n\n Raises:\n ValueError: If neither parent nor storage_location_id is specified, or if both are specified.\n \"\"\"\n\n if storage_location_id:\n if parent:\n raise ValueError(\"Pass parent or storage_location_id, not both\")\n elif not parent:\n raise ValueError(\"One of parent or storage_location_id is required\")\n else:\n upload_destination = self._getDefaultUploadDestination(parent)\n storage_location_id = upload_destination[\"storageLocationId\"]\n\n if mimetype is None:\n mimetype, enc = mimetypes.guess_type(file_path, strict=False)\n\n file_handle = {\n \"concreteType\": concrete_types.S3_FILE_HANDLE,\n \"key\": s3_file_key,\n \"bucketName\": bucket_name,\n \"fileName\": os.path.basename(file_path),\n \"contentMd5\": utils.md5_for_file(file_path).hexdigest(),\n \"contentSize\": os.stat(file_path).st_size,\n \"storageLocationId\": storage_location_id,\n \"contentType\": mimetype,\n }\n\n return self.restPOST(\n \"/externalFileHandle/s3\",\n json.dumps(file_handle),\n endpoint=self.fileHandleEndpoint,\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.getMyStorageLocationSetting","title":"getMyStorageLocationSetting(storage_location_id)
","text":"Get a StorageLocationSetting by its id.
PARAMETER DESCRIPTIONstorage_location_id
id of the StorageLocationSetting to retrieve. The corresponding StorageLocationSetting must have been created by this user.
RETURNS DESCRIPTION
A dict describing the StorageLocationSetting retrieved by its id
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getMyStorageLocationSetting\")\ndef getMyStorageLocationSetting(self, storage_location_id):\n \"\"\"\n Get a StorageLocationSetting by its id.\n\n Arguments:\n storage_location_id: id of the StorageLocationSetting to retrieve.\n The corresponding StorageLocationSetting must have been created by this user.\n\n Returns:\n A dict describing the StorageLocationSetting retrieved by its id\n \"\"\"\n return self.restGET(\"/storageLocation/%s\" % storage_location_id)\n
"},{"location":"reference/client/#synapseclient.Synapse.createStorageLocationSetting","title":"createStorageLocationSetting(storage_type, **kwargs)
","text":"Creates an IMMUTABLE storage location based on the specified type.
For each storage_type, the following kwargs should be specified:
ExternalObjectStorage: (S3-like (e.g. AWS S3 or Openstack) bucket not accessed by Synapse)
ExternalS3Storage: (Amazon S3 bucket accessed by Synapse)
ExternalStorage: (SFTP or FTP storage location not accessed by Synapse)
ProxyStorage: (a proxy server that controls access to a storage)
storage_type
The type of the StorageLocationSetting to create
banner
(Optional) The optional banner to show every time a file is uploaded
description
(Optional) The description to show the user when the user has to choose which upload destination to use
kwargs
fields necessary for creation of the specified storage_type
DEFAULT: {}
A dict of the created StorageLocationSetting
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::createStorageLocationSetting\")\ndef createStorageLocationSetting(self, storage_type, **kwargs):\n \"\"\"\n Creates an IMMUTABLE storage location based on the specified type.\n\n For each storage_type, the following kwargs should be specified:\n\n **ExternalObjectStorage**: (S3-like (e.g. AWS S3 or Openstack) bucket not accessed by Synapse)\n\n - endpointUrl: endpoint URL of the S3 service (for example: 'https://s3.amazonaws.com')\n - bucket: the name of the bucket to use\n\n **ExternalS3Storage**: (Amazon S3 bucket accessed by Synapse)\n\n - bucket: the name of the bucket to use\n\n **ExternalStorage**: (SFTP or FTP storage location not accessed by Synapse)\n\n - url: the base URL for uploading to the external destination\n - supportsSubfolders(optional): does the destination support creating subfolders under the base url\n (default: false)\n\n **ProxyStorage**: (a proxy server that controls access to a storage)\n\n - secretKey: The encryption key used to sign all pre-signed URLs used to communicate with the proxy.\n - proxyUrl: The HTTPS URL of the proxy used for upload and download.\n\n Arguments:\n storage_type: The type of the StorageLocationSetting to create\n banner: (Optional) The optional banner to show every time a file is uploaded\n description: (Optional) The description to show the user when the user has to choose which upload destination to use\n kwargs: fields necessary for creation of the specified storage_type\n\n Returns:\n A dict of the created StorageLocationSetting\n \"\"\"\n upload_type_dict = {\n \"ExternalObjectStorage\": \"S3\",\n \"ExternalS3Storage\": \"S3\",\n \"ExternalStorage\": \"SFTP\",\n \"ProxyStorage\": \"PROXYLOCAL\",\n }\n\n if storage_type not in upload_type_dict:\n raise ValueError(\"Unknown storage_type: %s\", storage_type)\n\n # ProxyStorageLocationSettings has an extra 's' at the end >:(\n kwargs[\"concreteType\"] = (\n \"org.sagebionetworks.repo.model.project.\"\n + storage_type\n + \"LocationSetting\"\n + (\"s\" if storage_type == \"ProxyStorage\" else \"\")\n )\n kwargs[\"uploadType\"] = upload_type_dict[storage_type]\n\n return self.restPOST(\"/storageLocation\", body=json.dumps(kwargs))\n
"},{"location":"reference/client/#synapseclient.Synapse.create_s3_storage_location","title":"create_s3_storage_location(*, parent=None, folder_name=None, folder=None, bucket_name=None, base_key=None, sts_enabled=False)
","text":"Create a storage location in the given parent, either in the given folder or by creating a new folder in that parent with the given name. This will both create a StorageLocationSetting, and a ProjectSetting together, optionally creating a new folder in which to locate it, and optionally enabling this storage location for access via STS. If enabling an existing folder for STS, it must be empty.
PARAMETER DESCRIPTIONparent
The parent in which to locate the storage location (mutually exclusive with folder)
DEFAULT: None
folder_name
The name of a new folder to create (mutually exclusive with folder)
DEFAULT: None
folder
The existing folder in which to create the storage location (mutually exclusive with folder_name)
DEFAULT: None
bucket_name
The name of an S3 bucket, if this is an external storage location, if None will use Synapse S3 storage
DEFAULT: None
base_key
The base key within the bucket, None to use the bucket root, only applicable if bucket_name is passed
DEFAULT: None
sts_enabled
Whether this storage location should be STS enabled
DEFAULT: False
A 3-tuple of the synapse Folder, a the storage location setting, and the project setting dictionaries.
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::create_s3_storage_location\")\ndef create_s3_storage_location(\n self,\n *,\n parent=None,\n folder_name=None,\n folder=None,\n bucket_name=None,\n base_key=None,\n sts_enabled=False,\n):\n \"\"\"\n Create a storage location in the given parent, either in the given folder or by creating a new\n folder in that parent with the given name. This will both create a StorageLocationSetting,\n and a ProjectSetting together, optionally creating a new folder in which to locate it,\n and optionally enabling this storage location for access via STS. If enabling an existing folder for STS,\n it must be empty.\n\n Arguments:\n parent: The parent in which to locate the storage location (mutually exclusive with folder)\n folder_name: The name of a new folder to create (mutually exclusive with folder)\n folder: The existing folder in which to create the storage location (mutually exclusive with folder_name)\n bucket_name: The name of an S3 bucket, if this is an external storage location,\n if None will use Synapse S3 storage\n base_key: The base key of within the bucket, None to use the bucket root,\n only applicable if bucket_name is passed\n sts_enabled: Whether this storage location should be STS enabled\n\n Returns:\n A 3-tuple of the synapse Folder, a the storage location setting, and the project setting dictionaries.\n \"\"\"\n if folder_name and parent:\n if folder:\n raise ValueError(\n \"folder and folder_name are mutually exclusive, only one should be passed\"\n )\n\n folder = self.store(Folder(name=folder_name, parent=parent))\n\n elif not folder:\n raise ValueError(\"either folder or folder_name should be required\")\n\n storage_location_kwargs = {\n \"uploadType\": \"S3\",\n \"stsEnabled\": sts_enabled,\n }\n\n if bucket_name:\n storage_location_kwargs[\n \"concreteType\"\n ] = concrete_types.EXTERNAL_S3_STORAGE_LOCATION_SETTING\n storage_location_kwargs[\"bucket\"] = bucket_name\n if base_key:\n storage_location_kwargs[\"baseKey\"] = base_key\n else:\n storage_location_kwargs[\n \"concreteType\"\n ] = concrete_types.SYNAPSE_S3_STORAGE_LOCATION_SETTING\n\n storage_location_setting = self.restPOST(\n \"/storageLocation\", json.dumps(storage_location_kwargs)\n )\n\n storage_location_id = storage_location_setting[\"storageLocationId\"]\n project_setting = self.setStorageLocation(\n folder,\n storage_location_id,\n )\n\n return folder, storage_location_setting, project_setting\n
"},{"location":"reference/client/#synapseclient.Synapse.setStorageLocation","title":"setStorageLocation(entity, storage_location_id)
","text":"Sets the storage location for a Project or Folder
PARAMETER DESCRIPTIONentity
A Project or Folder to which the StorageLocationSetting is set
storage_location_id
A StorageLocation id or a list of StorageLocation ids. Pass in None for the default Synapse storage.
RETURNS DESCRIPTION
The created or updated settings as a dict.
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::setStorageLocation\")\ndef setStorageLocation(self, entity, storage_location_id):\n \"\"\"\n Sets the storage location for a Project or Folder\n\n Arguments:\n entity: A Project or Folder to which the StorageLocationSetting is set\n storage_location_id: A StorageLocation id or a list of StorageLocation ids. Pass in None for the default\n Synapse storage.\n\n Returns:\n The created or updated settings as a dict.\n \"\"\"\n if storage_location_id is None:\n storage_location_id = DEFAULT_STORAGE_LOCATION_ID\n locations = (\n storage_location_id\n if isinstance(storage_location_id, list)\n else [storage_location_id]\n )\n\n existing_setting = self.getProjectSetting(entity, \"upload\")\n if existing_setting is not None:\n existing_setting[\"locations\"] = locations\n self.restPUT(\"/projectSettings\", body=json.dumps(existing_setting))\n return self.getProjectSetting(entity, \"upload\")\n else:\n project_destination = {\n \"concreteType\": \"org.sagebionetworks.repo.model.project.UploadDestinationListSetting\",\n \"settingsType\": \"upload\",\n \"locations\": locations,\n \"projectId\": id_of(entity),\n }\n\n return self.restPOST(\n \"/projectSettings\", body=json.dumps(project_destination)\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.get_sts_storage_token","title":"get_sts_storage_token(entity, permission, *, output_format='json', min_remaining_life=None)
","text":"Get STS credentials for the given entity_id and permission, outputting it in the given format
PARAMETER DESCRIPTIONentity
The entity or entity id whose credentials are being returned
permission
One of:
read_only
read_write
output_format
One of:
json
: the dictionary returned from the Synapse STS API including expirationboto
: a dictionary compatible with a boto session (aws_access_key_id, etc)shell
: output commands for exporting credentials appropriate for the detected shellbash
: output commands for exporting credentials into a bash shellcmd
: output commands for exporting credentials into a windows cmd shellpowershell
: output commands for exporting credentials into a windows powershell DEFAULT: 'json'
min_remaining_life
The minimum allowable remaining life on a cached token to return. If a cached token has less than this amount of time left, a fresh token will be fetched
DEFAULT: None
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_sts_storage_token\")\ndef get_sts_storage_token(\n self, entity, permission, *, output_format=\"json\", min_remaining_life=None\n):\n \"\"\"Get STS credentials for the given entity_id and permission, outputting it in the given format\n\n Arguments:\n entity: The entity or entity id whose credentials are being returned\n permission: One of:\n\n - `read_only`\n - `read_write`\n output_format: One of:\n\n - `json`: the dictionary returned from the Synapse STS API including expiration\n - `boto`: a dictionary compatible with a boto session (aws_access_key_id, etc)\n - `shell`: output commands for exporting credentials appropriate for the detected shell\n - `bash`: output commands for exporting credentials into a bash shell\n - `cmd`: output commands for exporting credentials into a windows cmd shell\n - `powershell`: output commands for exporting credentials into a windows powershell\n min_remaining_life: The minimum allowable remaining life on a cached token to return. If a cached token\n has left than this amount of time left a fresh token will be fetched\n \"\"\"\n return sts_transfer.get_sts_credentials(\n self,\n id_of(entity),\n permission,\n output_format=output_format,\n min_remaining_life=min_remaining_life,\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.create_snapshot_version","title":"create_snapshot_version(table, comment=None, label=None, activity=None, wait=True)
","text":"Create a new Table Version, new View version, or new Dataset version.
PARAMETER DESCRIPTIONtable
The schema of the Table/View, or its ID.
TYPE: Union[EntityViewSchema, Schema, str, SubmissionViewSchema, Dataset]
comment
Optional snapshot comment.
TYPE: str
DEFAULT: None
label
Optional snapshot label.
TYPE: str
DEFAULT: None
activity
Optional activity ID applied to snapshot version.
TYPE: Union[Activity, str]
DEFAULT: None
wait
True if this method should return the snapshot version after waiting for any necessary asynchronous table updates to complete. If False this method will return as soon as any updates are initiated.
TYPE: bool
DEFAULT: True
int
The snapshot version number if wait=True, None if wait=False
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::create_snapshot_version\")\ndef create_snapshot_version(\n self,\n table: typing.Union[\n EntityViewSchema, Schema, str, SubmissionViewSchema, Dataset\n ],\n comment: str = None,\n label: str = None,\n activity: typing.Union[Activity, str] = None,\n wait: bool = True,\n) -> int:\n \"\"\"Create a new Table Version, new View version, or new Dataset version.\n\n Arguments:\n table: The schema of the Table/View, or its ID.\n comment: Optional snapshot comment.\n label: Optional snapshot label.\n activity: Optional activity ID applied to snapshot version.\n wait: True if this method should return the snapshot version after waiting for any necessary\n asynchronous table updates to complete. If False this method will return\n as soon as any updates are initiated.\n\n Returns:\n The snapshot version number if wait=True, None if wait=False\n \"\"\"\n ent = self.get(id_of(table), downloadFile=False)\n if isinstance(ent, (EntityViewSchema, SubmissionViewSchema, Dataset)):\n result = self._async_table_update(\n table,\n create_snapshot=True,\n comment=comment,\n label=label,\n activity=activity,\n wait=wait,\n )\n elif isinstance(ent, Schema):\n result = self._create_table_snapshot(\n table,\n comment=comment,\n label=label,\n activity=activity,\n )\n else:\n raise ValueError(\n \"This function only accepts Synapse ids of Tables or Views\"\n )\n\n # for consistency we return nothing if wait=False since we can't\n # supply the snapshot version on an async table update without waiting\n return result[\"snapshotVersionNumber\"] if wait else None\n
"},{"location":"reference/client/#synapseclient.Synapse.getConfigFile","title":"getConfigFile(configPath)
cached
","text":"Retrieves the client configuration information.
PARAMETER DESCRIPTIONconfigPath
Path to configuration file on local file system
TYPE: str
RawConfigParser
A RawConfigParser populated with properties from the user's configuration file.
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getConfigFile\")\n@functools.lru_cache()\ndef getConfigFile(self, configPath: str) -> configparser.RawConfigParser:\n \"\"\"\n Retrieves the client configuration information.\n\n Arguments:\n configPath: Path to configuration file on local file system\n\n Returns:\n A RawConfigParser populated with properties from the user's configuration file.\n \"\"\"\n\n try:\n config = configparser.RawConfigParser()\n config.read(configPath) # Does not fail if the file does not exist\n return config\n except configparser.Error as ex:\n raise ValueError(\n \"Error parsing Synapse config file: {}\".format(configPath)\n ) from ex\n
"},{"location":"reference/client/#synapseclient.Synapse.getEvaluation","title":"getEvaluation(id)
","text":"Gets an Evaluation object from Synapse.
PARAMETER DESCRIPTIONid
The ID of the synapseclient.evaluation.Evaluation to return.
RETURNS DESCRIPTION
An synapseclient.evaluation.Evaluation object
Using this functionCreating an Evaluation instance
evaluation = syn.getEvaluation(2005090)\n
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getEvaluation\")\ndef getEvaluation(self, id):\n \"\"\"\n Gets an Evaluation object from Synapse.\n\n Arguments:\n id: The ID of the [synapseclient.evaluation.Evaluation][] to return.\n\n Returns:\n An [synapseclient.evaluation.Evaluation][] object\n\n Example: Using this function\n Creating an Evaluation instance\n\n evaluation = syn.getEvaluation(2005090)\n \"\"\"\n\n evaluation_id = id_of(id)\n uri = Evaluation.getURI(evaluation_id)\n return Evaluation(**self.restGET(uri))\n
"},{"location":"reference/client/#synapseclient.Synapse.getEvaluationByContentSource","title":"getEvaluationByContentSource(entity)
","text":"Returns a generator over evaluations that derive their content from the given entity
PARAMETER DESCRIPTIONentity
The synapseclient.entity.Project whose Evaluations are to be fetched.
YIELDS DESCRIPTION
A generator over synapseclient.evaluation.Evaluation objects for the given synapseclient.entity.Project.
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getEvaluationByContentSource\")\ndef getEvaluationByContentSource(self, entity):\n \"\"\"\n Returns a generator over evaluations that derive their content from the given entity\n\n Arguments:\n entity: The [synapseclient.entity.Project][] whose Evaluations are to be fetched.\n\n Yields:\n A generator over [synapseclient.evaluation.Evaluation][] objects for the given [synapseclient.entity.Project][].\n \"\"\"\n\n entityId = id_of(entity)\n url = \"/entity/%s/evaluation\" % entityId\n\n for result in self._GET_paginated(url):\n yield Evaluation(**result)\n
"},{"location":"reference/client/#synapseclient.Synapse.getEvaluationByName","title":"getEvaluationByName(name)
","text":"Gets an Evaluation object from Synapse.
PARAMETER DESCRIPTIONName
The name of the synapseclient.evaluation.Evaluation to return.
RETURNS DESCRIPTION
An synapseclient.evaluation.Evaluation object
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getEvaluationByName\")\ndef getEvaluationByName(self, name):\n \"\"\"\n Gets an Evaluation object from Synapse.\n\n Arguments:\n Name: The name of the [synapseclient.evaluation.Evaluation][] to return.\n\n Returns:\n An [synapseclient.evaluation.Evaluation][] object\n \"\"\"\n uri = Evaluation.getByNameURI(name)\n return Evaluation(**self.restGET(uri))\n
"},{"location":"reference/client/#synapseclient.Synapse.getProjectSetting","title":"getProjectSetting(project, setting_type)
","text":"Gets the ProjectSetting for a project.
PARAMETER DESCRIPTIONproject
Project entity or its id as a string
setting_type
Type of setting. Choose from:
upload
external_sync
requester_pays
RETURNS DESCRIPTION
The ProjectSetting as a dict or None if no settings of the specified type exist.
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getProjectSetting\")\ndef getProjectSetting(self, project, setting_type):\n \"\"\"\n Gets the ProjectSetting for a project.\n\n Arguments:\n project: Project entity or its id as a string\n setting_type: Type of setting. Choose from:\n\n - `upload`\n - `external_sync`\n - `requester_pays`\n\n Returns:\n The ProjectSetting as a dict or None if no settings of the specified type exist.\n \"\"\"\n if setting_type not in {\"upload\", \"external_sync\", \"requester_pays\"}:\n raise ValueError(\"Invalid project_type: %s\" % setting_type)\n\n response = self.restGET(\n \"/projectSettings/{projectId}/type/{type}\".format(\n projectId=id_of(project), type=setting_type\n )\n )\n return (\n response if response else None\n ) # if no project setting, a empty string is returned as the response\n
"},{"location":"reference/client/#synapseclient.Synapse.getSubmission","title":"getSubmission(id, **kwargs)
","text":"Gets a synapseclient.evaluation.Submission object by its id.
PARAMETER DESCRIPTIONid
The id of the submission to retrieve
RETURNS DESCRIPTION
A synapseclient.evaluation.Submission object
See:
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getSubmission\")\ndef getSubmission(self, id, **kwargs):\n \"\"\"\n Gets a [synapseclient.evaluation.Submission][] object by its id.\n\n Arguments:\n id: The id of the submission to retrieve\n\n Returns:\n A [synapseclient.evaluation.Submission][] object\n\n\n :param id: The id of the submission to retrieve\n\n :return: a :py:class:`synapseclient.evaluation.Submission` object\n\n See:\n\n - [synapseclient.Synapse.get][] for information\n on the *downloadFile*, *downloadLocation*, and *ifcollision* parameters\n \"\"\"\n\n submission_id = id_of(id)\n uri = Submission.getURI(submission_id)\n submission = Submission(**self.restGET(uri))\n\n # Pre-fetch the Entity tied to the Submission, if there is one\n if \"entityId\" in submission and submission[\"entityId\"] is not None:\n entityBundleJSON = json.loads(submission[\"entityBundleJSON\"])\n\n # getWithEntityBundle expects a bundle services v2 style\n # annotations dict, but the evaluations API may return\n # an older format annotations object in the encoded JSON\n # depending on when the original submission was made.\n annotations = entityBundleJSON.get(\"annotations\")\n if annotations:\n entityBundleJSON[\"annotations\"] = convert_old_annotation_json(\n annotations\n )\n\n related = self._getWithEntityBundle(\n entityBundle=entityBundleJSON,\n entity=submission[\"entityId\"],\n submission=submission_id,\n **kwargs,\n )\n submission.entity = related\n submission.filePath = related.get(\"path\", None)\n\n return submission\n
"},{"location":"reference/client/#synapseclient.Synapse.getSubmissions","title":"getSubmissions(evaluation, status=None, myOwn=False, limit=20, offset=0)
","text":"PARAMETER DESCRIPTION evaluation
Evaluation to get submissions from.
status
Optionally filter submissions for a specific status. One of:
OPEN
CLOSED
SCORED
INVALID
VALIDATED
EVALUATION_IN_PROGRESS
RECEIVED
REJECTED
ACCEPTED
DEFAULT: None
myOwn
Determines if only your Submissions should be fetched. Defaults to False (all Submissions)
DEFAULT: False
limit
Limits the number of submissions in a single response. Because this method returns a generator and repeatedly fetches submissions, this argument limits the size of a single request, NOT the total number of submissions returned.
DEFAULT: 20
offset
Start iterating at a submission offset from the first submission.
DEFAULT: 0
A generator over synapseclient.evaluation.Submission objects for an Evaluation
Using this functionPrint submissions
for submission in syn.getSubmissions(1234567):\n print(submission['entityId'])\n
See:
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getSubmissions\")\ndef getSubmissions(self, evaluation, status=None, myOwn=False, limit=20, offset=0):\n \"\"\"\n Arguments:\n evaluation: Evaluation to get submissions from.\n status: Optionally filter submissions for a specific status.\n One of:\n\n - `OPEN`\n - `CLOSED`\n - `SCORED`\n - `INVALID`\n - `VALIDATED`\n - `EVALUATION_IN_PROGRESS`\n - `RECEIVED`\n - `REJECTED`\n - `ACCEPTED`\n myOwn: Determines if only your Submissions should be fetched.\n Defaults to False (all Submissions)\n limit: Limits the number of submissions in a single response.\n Because this method returns a generator and repeatedly\n fetches submissions, this argument is limiting the\n size of a single request and NOT the number of sub-\n missions returned in total.\n offset: Start iterating at a submission offset from the first submission.\n\n Yields:\n A generator over [synapseclient.evaluation.Submission][] objects for an Evaluation\n\n Example: Using this function\n Print submissions\n\n for submission in syn.getSubmissions(1234567):\n print(submission['entityId'])\n\n See:\n\n - [synapseclient.evaluation][]\n \"\"\"\n\n evaluation_id = id_of(evaluation)\n uri = \"/evaluation/%s/submission%s\" % (evaluation_id, \"\" if myOwn else \"/all\")\n\n if status is not None:\n uri += \"?status=%s\" % status\n\n for result in self._GET_paginated(uri, limit=limit, offset=offset):\n yield Submission(**result)\n
"},{"location":"reference/client/#synapseclient.Synapse.getSubmissionBundles","title":"getSubmissionBundles(evaluation, status=None, myOwn=False, limit=20, offset=0)
","text":"Retrieve submission bundles (submission and submissions status) for an evaluation queue, optionally filtered by submission status and/or owner.
PARAMETER DESCRIPTIONevaluation
Evaluation to get submissions from.
status
Optionally filter submissions for a specific status. One of:
OPEN
CLOSED
SCORED
INVALID
DEFAULT: None
myOwn
Determines if only your Submissions should be fetched. Defaults to False (all Submissions)
DEFAULT: False
limit
Limits the number of submissions coming back from the service in a single response.
DEFAULT: 20
offset
Start iterating at a submission offset from the first submission.
DEFAULT: 0
A generator over tuples containing a synapseclient.evaluation.Submission and a synapseclient.evaluation.SubmissionStatus.
Using this functionLoop over submissions
for submission, status in syn.getSubmissionBundles(evaluation):\n print(submission.name, \\\n submission.submitterAlias, \\\n status.status, \\\n status.score)\n
This may later be changed to return objects, pending some thought on how submissions along with related status and annotations should be represented in the clients.
See:
synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getSubmissionBundles\")\ndef getSubmissionBundles(\n self, evaluation, status=None, myOwn=False, limit=20, offset=0\n):\n \"\"\"\n Retrieve submission bundles (submission and submissions status) for an evaluation queue, optionally filtered by\n submission status and/or owner.\n\n Arguments:\n evaluation: Evaluation to get submissions from.\n status: Optionally filter submissions for a specific status.\n One of:\n\n - `OPEN`\n - `CLOSED`\n - `SCORED`\n - `INVALID`\n myOwn: Determines if only your Submissions should be fetched.\n Defaults to False (all Submissions)\n limit: Limits the number of submissions coming back from the\n service in a single response.\n offset: Start iterating at a submission offset from the first submission.\n\n Yields:\n A generator over tuples containing a [synapseclient.evaluation.Submission][] and a [synapseclient.evaluation.SubmissionStatus][].\n\n Example: Using this function\n Loop over submissions\n\n for submission, status in syn.getSubmissionBundles(evaluation):\n print(submission.name, \\\\\n submission.submitterAlias, \\\\\n status.status, \\\\\n status.score)\n\n This may later be changed to return objects, pending some thought on how submissions along with related status\n and annotations should be represented in the clients.\n\n See:\n\n - [synapseclient.evaluation][]\n \"\"\"\n for bundle in self._getSubmissionBundles(\n evaluation, status=status, myOwn=myOwn, limit=limit, offset=offset\n ):\n yield (\n Submission(**bundle[\"submission\"]),\n SubmissionStatus(**bundle[\"submissionStatus\"]),\n )\n
"},{"location":"reference/client/#synapseclient.Synapse.getSubmissionStatus","title":"getSubmissionStatus(submission)
","text":"Downloads the status of a Submission.
PARAMETER DESCRIPTIONsubmission
The submission to lookup
RETURNS DESCRIPTION
A synapseclient.evaluation.SubmissionStatus object
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getSubmissionStatus\")\ndef getSubmissionStatus(self, submission):\n \"\"\"\n Downloads the status of a Submission.\n\n Arguments:\n submission: The submission to lookup\n\n Returns:\n A [synapseclient.evaluation.SubmissionStatus][] object\n \"\"\"\n\n submission_id = id_of(submission)\n uri = SubmissionStatus.getURI(submission_id)\n val = self.restGET(uri)\n return SubmissionStatus(**val)\n
"},{"location":"reference/client/#synapseclient.Synapse.getWiki","title":"getWiki(owner, subpageId=None, version=None)
","text":"Get a synapseclient.wiki.Wiki object from Synapse. Uses wiki2 API which supports versioning.
PARAMETER DESCRIPTIONowner
The entity to which the Wiki is attached
subpageId
The id of the specific sub-page or None to get the root Wiki page
DEFAULT: None
version
The version of the page to retrieve or None to retrieve the latest
DEFAULT: None
A synapseclient.wiki.Wiki object
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getWiki\")\ndef getWiki(self, owner, subpageId=None, version=None):\n \"\"\"\n Get a [synapseclient.wiki.Wiki][] object from Synapse. Uses wiki2 API which supports versioning.\n\n Arguments:\n owner: The entity to which the Wiki is attached\n subpageId: The id of the specific sub-page or None to get the root Wiki page\n version: The version of the page to retrieve or None to retrieve the latest\n\n Returns:\n A [synapseclient.wiki.Wiki][] object\n \"\"\"\n uri = \"/entity/{ownerId}/wiki2\".format(ownerId=id_of(owner))\n if subpageId is not None:\n uri += \"/{wikiId}\".format(wikiId=subpageId)\n if version is not None:\n uri += \"?wikiVersion={version}\".format(version=version)\n\n wiki = self.restGET(uri)\n wiki[\"owner\"] = owner\n wiki = Wiki(**wiki)\n\n path = self.cache.get(wiki.markdownFileHandleId)\n if not path:\n cache_dir = self.cache.get_cache_dir(wiki.markdownFileHandleId)\n if not os.path.exists(cache_dir):\n os.makedirs(cache_dir)\n path = self._downloadFileHandle(\n wiki[\"markdownFileHandleId\"],\n wiki[\"id\"],\n \"WikiMarkdown\",\n os.path.join(cache_dir, str(wiki.markdownFileHandleId) + \".md\"),\n )\n try:\n import gzip\n\n with gzip.open(path) as f:\n markdown = f.read().decode(\"utf-8\")\n except IOError:\n with open(path) as f:\n markdown = f.read().decode(\"utf-8\")\n\n wiki.markdown = markdown\n wiki.markdown_path = path\n\n return wiki\n
"},{"location":"reference/client/#synapseclient.Synapse.getWikiAttachments","title":"getWikiAttachments(wiki)
","text":"Retrieve the attachments to a wiki page.
PARAMETER DESCRIPTIONwiki
The Wiki object for which the attachments are to be returned.
RETURNS DESCRIPTION
A list of file handles for the files attached to the Wiki.
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getWikiAttachments\")\ndef getWikiAttachments(self, wiki):\n \"\"\"\n Retrieve the attachments to a wiki page.\n\n Arguments:\n wiki: The Wiki object for which the attachments are to be returned.\n\n Returns:\n A list of file handles for the files attached to the Wiki.\n \"\"\"\n uri = \"/entity/%s/wiki/%s/attachmenthandles\" % (wiki.ownerId, wiki.id)\n results = self.restGET(uri)\n file_handles = list(WikiAttachment(**fh) for fh in results[\"list\"])\n return file_handles\n
"},{"location":"reference/client/#synapseclient.Synapse.getWikiHeaders","title":"getWikiHeaders(owner)
","text":"Retrieves the headers of all Wikis belonging to the owner (the entity to which the Wiki is attached).
PARAMETER DESCRIPTIONowner
An Entity
RETURNS DESCRIPTION
A list of Objects with three fields: id, title and parentId.
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::getWikiHeaders\")\ndef getWikiHeaders(self, owner):\n \"\"\"\n Retrieves the headers of all Wikis belonging to the owner (the entity to which the Wiki is attached).\n\n Arguments:\n owner: An Entity\n\n Returns:\n A list of Objects with three fields: id, title and parentId.\n \"\"\"\n\n uri = \"/entity/%s/wikiheadertree\" % id_of(owner)\n return [DictObject(**header) for header in self._GET_paginated(uri)]\n
"},{"location":"reference/client/#synapseclient.Synapse.get_download_list","title":"get_download_list(downloadLocation=None)
","text":"Download all files from your Synapse download list
PARAMETER DESCRIPTIONdownloadLocation
Directory to download files to.
TYPE: str
DEFAULT: None
str
Manifest file with file paths
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_download_list\")\ndef get_download_list(self, downloadLocation: str = None) -> str:\n \"\"\"Download all files from your Synapse download list\n\n Arguments:\n downloadLocation: Directory to download files to.\n\n Returns:\n Manifest file with file paths\n \"\"\"\n dl_list_path = self.get_download_list_manifest()\n downloaded_files = []\n new_manifest_path = f\"manifest_{time.time_ns()}.csv\"\n with open(dl_list_path) as manifest_f, open(\n new_manifest_path, \"w\"\n ) as write_obj:\n reader = csv.DictReader(manifest_f)\n columns = reader.fieldnames\n columns.extend([\"path\", \"error\"])\n # Write the downloaded paths to a new manifest file\n writer = csv.DictWriter(write_obj, fieldnames=columns)\n writer.writeheader()\n\n for row in reader:\n # You can add things to the download list that you don't have access to\n # So there must be a try catch here\n try:\n entity = self.get(row[\"ID\"], downloadLocation=downloadLocation)\n # Must include version number because you can have multiple versions of a\n # file in the download list\n downloaded_files.append(\n {\n \"fileEntityId\": row[\"ID\"],\n \"versionNumber\": row[\"versionNumber\"],\n }\n )\n row[\"path\"] = entity.path\n row[\"error\"] = \"\"\n except Exception:\n row[\"path\"] = \"\"\n row[\"error\"] = \"DOWNLOAD FAILED\"\n self.logger.error(\"Unable to download file\")\n writer.writerow(row)\n\n # Don't want to clear all the download list because you can add things\n # to the download list after initiating this command.\n # Files that failed to download should not be removed from download list\n # Remove all files from download list after the entire download is complete.\n # This is because if download fails midway, we want to return the full manifest\n if downloaded_files:\n # Only want to invoke this if there is a list of files to remove\n # or the API call will error\n self.remove_from_download_list(list_of_files=downloaded_files)\n else:\n self.logger.warning(\"A manifest was created, but no files were downloaded\")\n\n # Always remove original manifest file\n os.remove(dl_list_path)\n\n return new_manifest_path\n
"},{"location":"reference/client/#synapseclient.Synapse.get_download_list_manifest","title":"get_download_list_manifest()
","text":"Get the path of the download list manifest file
RETURNS DESCRIPTIONPath of download list manifest file
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::get_download_list_manifest\")\ndef get_download_list_manifest(self):\n \"\"\"Get the path of the download list manifest file\n\n Returns:\n Path of download list manifest file\n \"\"\"\n manifest = self._generate_manifest_from_download_list()\n # Get file handle download link\n file_result = self._getFileHandleDownload(\n fileHandleId=manifest[\"resultFileHandleId\"],\n objectId=manifest[\"resultFileHandleId\"],\n objectType=\"FileEntity\",\n )\n # Download the manifest\n downloaded_path = self._download_from_URL(\n url=file_result[\"preSignedURL\"],\n destination=\"./\",\n fileHandleId=file_result[\"fileHandleId\"],\n expected_md5=file_result[\"fileHandle\"].get(\"contentMd5\"),\n )\n trace.get_current_span().set_attributes(\n {\"synapse.file_handle_id\": file_result[\"fileHandleId\"]}\n )\n return downloaded_path\n
"},{"location":"reference/client/#synapseclient.Synapse.remove_from_download_list","title":"remove_from_download_list(list_of_files)
","text":"Remove a batch of files from download list
PARAMETER DESCRIPTIONlist_of_files
Array of files in the format of a mapping {fileEntityId: synid, versionNumber: version}
TYPE: List[Dict]
int
Number of files removed from download list
Source code in synapseclient/client.py
def remove_from_download_list(self, list_of_files: typing.List[typing.Dict]) -> int:\n \"\"\"Remove a batch of files from download list\n\n Arguments:\n list_of_files: Array of files in the format of a mapping {fileEntityId: synid, versionNumber: version}\n\n Returns:\n Number of files removed from download list\n \"\"\"\n request_body = {\"batchToRemove\": list_of_files}\n num_files_removed = self.restPOST(\n \"/download/list/remove\", body=json.dumps(request_body)\n )\n return num_files_removed\n
"},{"location":"reference/client/#synapseclient.Synapse.md5Query","title":"md5Query(md5)
","text":"Find the Entities which have attached file(s) which have the given MD5 hash.
PARAMETER DESCRIPTIONmd5
The MD5 to query for (hexadecimal string)
RETURNS DESCRIPTION
A list of Entity headers
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::md5Query\")\ndef md5Query(self, md5):\n \"\"\"\n Find the Entities which have attached file(s) which have the given MD5 hash.\n\n Arguments:\n md5: The MD5 to query for (hexadecimal string)\n\n Returns:\n A list of Entity headers\n \"\"\"\n\n return self.restGET(\"/entity/md5/%s\" % md5)[\"results\"]\n
"},{"location":"reference/client/#synapseclient.Synapse.sendMessage","title":"sendMessage(userIds, messageSubject, messageBody, contentType='text/plain')
","text":"send a message via Synapse.
PARAMETER DESCRIPTIONuserIds
A list of user IDs to which the message is to be sent
messageSubject
The subject for the message
messageBody
The body of the message
contentType
optional contentType of message body (default=\"text/plain\") Should be one of \"text/plain\" or \"text/html\"
DEFAULT: 'text/plain'
The metadata of the created message
Source code in synapseclient/client.py
@tracer.start_as_current_span(\"Synapse::sendMessage\")\ndef sendMessage(\n self, userIds, messageSubject, messageBody, contentType=\"text/plain\"\n):\n \"\"\"\n send a message via Synapse.\n\n Arguments:\n userIds: A list of user IDs to which the message is to be sent\n messageSubject: The subject for the message\n messageBody: The body of the message\n contentType: optional contentType of message body (default=\"text/plain\")\n Should be one of \"text/plain\" or \"text/html\"\n\n Returns:\n The metadata of the created message\n \"\"\"\n\n fileHandleId = multipart_upload_string(\n self, messageBody, content_type=contentType\n )\n message = dict(\n recipients=userIds, subject=messageSubject, fileHandleId=fileHandleId\n )\n return self.restPOST(uri=\"/message\", body=json.dumps(message))\n
"},{"location":"reference/client/#synapseclient.Synapse.uploadFileHandle","title":"uploadFileHandle(path, parent, synapseStore=True, mimetype=None, md5=None, file_size=None)
","text":"Uploads the file in the provided path (if necessary) to a storage location based on project settings. Returns a new FileHandle as a dict to represent the stored file.
PARAMETER DESCRIPTIONparent
Parent of the entity to which we upload.
path
File path to the file being uploaded
synapseStore
If False, will not upload the file, but instead create an ExternalFileHandle that references the file on the local machine. If True, will upload the file based on StorageLocation determined by the entity_parent_id
DEFAULT: True
mimetype
The MIME type metadata for the uploaded file
DEFAULT: None
md5
The MD5 checksum for the file, if known. Otherwise if the file is a local file, it will be calculated automatically.
DEFAULT: None
file_size
The size of the file, if known. Otherwise if the file is a local file, it will be calculated automatically.
DEFAULT: None
RETURNS DESCRIPTION
A dict representing the new FileHandle for the uploaded file
Source code insynapseclient/client.py
def uploadFileHandle(\n    self, path, parent, synapseStore=True, mimetype=None, md5=None, file_size=None\n):\n    \"\"\"Uploads the file in the provided path (if necessary) to a storage location based on project settings.\n    Returns a new FileHandle as a dict to represent the stored file.\n\n    Arguments:\n        parent: Parent of the entity to which we upload.\n        path: File path to the file being uploaded\n        synapseStore: If False, will not upload the file, but instead create an ExternalFileHandle that references\n            the file on the local machine.\n            If True, will upload the file based on StorageLocation determined by the entity_parent_id\n        mimetype: The MIME type metadata for the uploaded file\n        md5: The MD5 checksum for the file, if known. Otherwise if the file is a local file, it will be calculated\n            automatically.\n        file_size: The size of the file, if known. Otherwise if the file is a local file, it will be calculated\n            automatically.\n\n    Returns:\n        A dict representing the new FileHandle for the uploaded file\n    \"\"\"\n    return upload_file_handle(\n        self, parent, path, synapseStore, md5, file_size, mimetype\n    )\n
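A usage sketch (the path and parent ID are hypothetical):
file_handle = syn.uploadFileHandle('/path/to/data.csv', parent='syn123')\n# the returned dict carries the file handle metadata, including its 'id'\nprint(file_handle['id'])\n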
"},{"location":"reference/client/#synapseclient.Synapse.restGET","title":"restGET(uri, endpoint=None, headers=None, retryPolicy={}, requests_session=None, **kwargs)
","text":"Sends an HTTP GET request to the Synapse server.
PARAMETER DESCRIPTIONuri
URI on which get is performed
endpoint
Server endpoint, defaults to self.repoEndpoint
DEFAULT: None
headers
Dictionary of headers to use rather than the API-key-signed default set of headers
DEFAULT: None
requests_session
An external requests.Session object to use when making this specific call
DEFAULT: None
kwargs
Any other arguments taken by a request method
DEFAULT: {}
JSON encoding of response
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::restGET\")\ndef restGET(\n self,\n uri,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n):\n \"\"\"\n Sends an HTTP GET request to the Synapse server.\n\n Arguments:\n uri: URI on which get is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: An external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns:\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"get\", uri, None, endpoint, headers, retryPolicy, requests_session, **kwargs\n )\n return self._return_rest_body(response)\n
"},{"location":"reference/client/#synapseclient.Synapse.restPOST","title":"restPOST(uri, body, endpoint=None, headers=None, retryPolicy={}, requests_session=None, **kwargs)
","text":"Sends an HTTP POST request to the Synapse server.
PARAMETER DESCRIPTIONuri
URI on which the POST is performed
endpoint
Server endpoint, defaults to self.repoEndpoint
DEFAULT: None
body
The payload to be delivered
headers
Dictionary of headers to use rather than the API-key-signed default set of headers
DEFAULT: None
requests_session
An external requests.Session object to use when making this specific call
DEFAULT: None
kwargs
Any other arguments taken by a request method
DEFAULT: {}
JSON encoding of response
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::restPOST\")\ndef restPOST(\n self,\n uri,\n body,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n):\n \"\"\"\n Sends an HTTP POST request to the Synapse server.\n\n Arguments:\n uri: URI on which get is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n body: The payload to be delivered\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: an external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns:\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"post\",\n uri,\n body,\n endpoint,\n headers,\n retryPolicy,\n requests_session,\n **kwargs,\n )\n return self._return_rest_body(response)\n
"},{"location":"reference/client/#synapseclient.Synapse.restPUT","title":"restPUT(uri, body=None, endpoint=None, headers=None, retryPolicy={}, requests_session=None, **kwargs)
","text":"Sends an HTTP PUT request to the Synapse server.
PARAMETER DESCRIPTIONuri
URI on which the PUT is performed
endpoint
Server endpoint, defaults to self.repoEndpoint
DEFAULT: None
body
The payload to be delivered
DEFAULT: None
headers
Dictionary of headers to use rather than the API-key-signed default set of headers
DEFAULT: None
requests_session
An external requests.Session object to use when making this specific call
DEFAULT: None
kwargs
Any other arguments taken by a request method
DEFAULT: {}
JSON encoding of response
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::restPUT\")\ndef restPUT(\n self,\n uri,\n body=None,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n):\n \"\"\"\n Sends an HTTP PUT request to the Synapse server.\n\n Arguments:\n uri: URI on which get is performed\n endpoint: Server endpoint, defaults to self.repoEndpoint\n body: The payload to be delivered\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: Sn external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n Returns\n JSON encoding of response\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n response = self._rest_call(\n \"put\", uri, body, endpoint, headers, retryPolicy, requests_session, **kwargs\n )\n return self._return_rest_body(response)\n
"},{"location":"reference/client/#synapseclient.Synapse.restDELETE","title":"restDELETE(uri, endpoint=None, headers=None, retryPolicy={}, requests_session=None, **kwargs)
","text":"Sends an HTTP DELETE request to the Synapse server.
PARAMETER DESCRIPTIONuri
URI of resource to be deleted
endpoint
Server endpoint, defaults to self.repoEndpoint
DEFAULT: None
headers
Dictionary of headers to use rather than the API-key-signed default set of headers
DEFAULT: None
requests_session
An external requests.Session object to use when making this specific call
DEFAULT: None
kwargs
Any other arguments taken by a request method
DEFAULT: {}
Source code insynapseclient/client.py
@tracer.start_as_current_span(\"Synapse::restDELETE\")\ndef restDELETE(\n self,\n uri,\n endpoint=None,\n headers=None,\n retryPolicy={},\n requests_session=None,\n **kwargs,\n):\n \"\"\"\n Sends an HTTP DELETE request to the Synapse server.\n\n Arguments:\n uri: URI of resource to be deleted\n endpoint: Server endpoint, defaults to self.repoEndpoint\n headers: Dictionary of headers to use rather than the API-key-signed default set of headers\n requests_session: An external requests.Session object to use when making this specific call\n kwargs: Any other arguments taken by a [request](http://docs.python-requests.org/en/latest/) method\n\n \"\"\"\n trace.get_current_span().set_attributes({\"url.path\": uri})\n self._rest_call(\n \"delete\",\n uri,\n None,\n endpoint,\n headers,\n retryPolicy,\n requests_session,\n **kwargs,\n )\n
"},{"location":"reference/client/#more-information","title":"More information","text":"See also the Synapse API documentation
"},{"location":"reference/core/","title":"Core","text":"This section is for super users / developers only. These functions are subject to change as they are internal development functions. Use at your own risk.
"},{"location":"reference/core/#multipart-upload","title":"Multipart Upload","text":""},{"location":"reference/core/#synapseclient.core.upload.multipart_upload","title":"synapseclient.core.upload.multipart_upload
","text":"Implements the client side of Synapse multipart upload
_, which provides a robust means of uploading large files (into the 10s of GB). End users should not need to call any of these functions directly.
.. _Synapse multipart upload: https://rest-docs.synapse.org/rest/index.html#org.sagebionetworks.file.controller.UploadController
"},{"location":"reference/core/#synapseclient.core.upload.multipart_upload-classes","title":"Classes","text":""},{"location":"reference/core/#synapseclient.core.upload.multipart_upload.UploadAttempt","title":"UploadAttempt
","text":"Source code in synapseclient/core/upload/multipart_upload.py
class UploadAttempt:\n def __init__(\n self,\n syn,\n dest_file_name,\n upload_request_payload,\n part_request_body_provider_fn,\n md5_fn,\n max_threads: int,\n force_restart: bool,\n ):\n self._syn = syn\n self._dest_file_name = dest_file_name\n self._part_size = upload_request_payload[\"partSizeBytes\"]\n\n self._upload_request_payload = upload_request_payload\n\n self._part_request_body_provider_fn = part_request_body_provider_fn\n self._md5_fn = md5_fn\n\n self._max_threads = max_threads\n self._force_restart = force_restart\n\n self._lock = threading.Lock()\n self._aborted = False\n\n # populated later\n self._upload_id: str = None\n self._pre_signed_part_urls: Mapping[int, str] = None\n\n @classmethod\n def _get_remaining_part_numbers(cls, upload_status):\n part_numbers = []\n parts_state = upload_status[\"partsState\"]\n\n # parts are 1-based\n for i, part_status in enumerate(parts_state, 1):\n if part_status == \"0\":\n part_numbers.append(i)\n\n return len(parts_state), part_numbers\n\n @classmethod\n def _get_thread_session(cls):\n # get a lazily initialized requests.Session from the thread.\n # we want to share a requests.Session over the course of a thread\n # to take advantage of persistent http connection. we put it on a\n # thread local rather that in the task closure since a connection can\n # be reused across separate part uploads so no reason to restrict it\n # per worker task.\n session = getattr(_thread_local, \"session\", None)\n if not session:\n session = _thread_local.session = requests.Session()\n return session\n\n def _is_copy(self):\n # is this a copy or upload request\n return (\n self._upload_request_payload.get(\"concreteType\")\n == concrete_types.MULTIPART_UPLOAD_COPY_REQUEST\n )\n\n def _create_synapse_upload(self):\n return self._syn.restPOST(\n \"/file/multipart?forceRestart={}\".format(str(self._force_restart).lower()),\n json.dumps(self._upload_request_payload),\n endpoint=self._syn.fileHandleEndpoint,\n )\n\n def _fetch_pre_signed_part_urls(\n self,\n upload_id: str,\n part_numbers: List[int],\n requests_session: requests.Session = None,\n ) -> Mapping[int, str]:\n uri = \"/file/multipart/{upload_id}/presigned/url/batch\".format(\n upload_id=upload_id\n )\n body = {\n \"uploadId\": upload_id,\n \"partNumbers\": part_numbers,\n }\n\n response = self._syn.restPOST(\n uri,\n json.dumps(body),\n requests_session=requests_session,\n endpoint=self._syn.fileHandleEndpoint,\n )\n\n part_urls = {}\n for part in response[\"partPresignedUrls\"]:\n part_urls[part[\"partNumber\"]] = (\n part[\"uploadPresignedUrl\"],\n part.get(\"signedHeaders\", {}),\n )\n\n return part_urls\n\n def _refresh_pre_signed_part_urls(\n self,\n part_number: int,\n expired_url: str,\n ):\n \"\"\"Refresh all unfetched presigned urls, and return the refreshed\n url for the given part number. If an existing expired_url is passed\n and the url for the given part has already changed that new url\n will be returned without a refresh (i.e. 
it is assumed that another\n thread has already refreshed the url since the passed url expired).\n\n :param part_number: the part number whose refreshed url should\n be returned\n :param expired_url: the url that was detected as expired triggering\n this refresh\n\n \"\"\"\n with self._lock:\n current_url = self._pre_signed_part_urls[part_number]\n if current_url != expired_url:\n # if the url has already changed since the given url\n # was detected as expired we can assume that another\n # thread already refreshed the url and can avoid the extra\n # fetch.\n refreshed_url = current_url\n else:\n self._pre_signed_part_urls = self._fetch_pre_signed_part_urls(\n self._upload_id,\n list(self._pre_signed_part_urls.keys()),\n )\n\n refreshed_url = self._pre_signed_part_urls[part_number]\n\n return refreshed_url\n\n def _handle_part(self, part_number, otel_context: typing.Union[Context, None]):\n if otel_context:\n context.attach(otel_context)\n with tracer.start_as_current_span(\"UploadAttempt::_handle_part\"):\n trace.get_current_span().set_attributes(\n {\"thread.id\": threading.get_ident()}\n )\n with self._lock:\n if self._aborted:\n # this upload attempt has already been aborted\n # so we short circuit the attempt to upload this part\n raise SynapseUploadAbortedException(\n \"Upload aborted, skipping part {}\".format(part_number)\n )\n\n part_url, signed_headers = self._pre_signed_part_urls.get(part_number)\n\n session = self._get_thread_session()\n\n # obtain the body (i.e. the upload bytes) for the given part number.\n body = (\n self._part_request_body_provider_fn(part_number)\n if self._part_request_body_provider_fn\n else None\n )\n part_size = len(body) if body else 0\n for retry in range(2):\n\n def put_fn():\n with tracer.start_as_current_span(\"UploadAttempt::put_part\"):\n return session.put(part_url, body, headers=signed_headers)\n\n try:\n # use our backoff mechanism here, we have encountered 500s on puts to AWS signed urls\n response = with_retry(\n put_fn, retry_exceptions=[requests.exceptions.ConnectionError]\n )\n _raise_for_status(response)\n\n # completed upload part to s3 successfully\n break\n\n except SynapseHTTPError as ex:\n if ex.response.status_code == 403 and retry < 1:\n # we interpret this to mean our pre_signed url expired.\n self._syn.logger.debug(\n \"The pre-signed upload URL for part {} has expired.\"\n \"Refreshing urls and retrying.\\n\".format(part_number)\n )\n\n # we refresh all the urls and obtain this part's\n # specific url for the retry\n with tracer.start_as_current_span(\n \"UploadAttempt::refresh_pre_signed_part_urls\"\n ):\n (\n part_url,\n signed_headers,\n ) = self._refresh_pre_signed_part_urls(\n part_number,\n part_url,\n )\n\n else:\n raise\n\n md5_hex = self._md5_fn(body, response)\n\n # now tell synapse that we uploaded that part successfully\n self._syn.restPUT(\n \"/file/multipart/{upload_id}/add/{part_number}?partMD5Hex={md5}\".format(\n upload_id=self._upload_id,\n part_number=part_number,\n md5=md5_hex,\n ),\n requests_session=session,\n endpoint=self._syn.fileHandleEndpoint,\n )\n\n # remove so future batch pre_signed url fetches will exclude this part\n with self._lock:\n del self._pre_signed_part_urls[part_number]\n\n return part_number, part_size\n\n @tracer.start_as_current_span(\"UploadAttempt::_upload_parts\")\n def _upload_parts(self, part_count, remaining_part_numbers):\n trace.get_current_span().set_attributes({\"thread.id\": threading.get_ident()})\n time_upload_started = time.time()\n completed_part_count = part_count - 
len(remaining_part_numbers)\n file_size = self._upload_request_payload.get(\"fileSizeBytes\")\n\n if not self._is_copy():\n # we won't have bytes to measure during a copy so the byte oriented progress bar is not useful\n progress = previously_transferred = min(\n completed_part_count * self._part_size,\n file_size,\n )\n\n self._syn._print_transfer_progress(\n progress,\n file_size,\n prefix=\"Uploading\",\n postfix=self._dest_file_name,\n previouslyTransferred=previously_transferred,\n )\n\n self._pre_signed_part_urls = self._fetch_pre_signed_part_urls(\n self._upload_id,\n remaining_part_numbers,\n )\n\n futures = []\n with _executor(self._max_threads, False) as executor:\n # we don't wait on the shutdown since we do so ourselves below\n\n for part_number in remaining_part_numbers:\n futures.append(\n executor.submit(\n self._handle_part,\n part_number,\n context.get_current(),\n )\n )\n\n for result in concurrent.futures.as_completed(futures):\n try:\n _, part_size = result.result()\n\n if part_size and not self._is_copy():\n progress += part_size\n self._syn._print_transfer_progress(\n min(progress, file_size),\n file_size,\n prefix=\"Uploading\",\n postfix=self._dest_file_name,\n dt=time.time() - time_upload_started,\n previouslyTransferred=previously_transferred,\n )\n except (Exception, KeyboardInterrupt) as cause:\n with self._lock:\n self._aborted = True\n\n # wait for all threads to complete before\n # raising the exception, we don't want to return\n # control while there are still threads from this\n # upload attempt running\n concurrent.futures.wait(futures)\n\n if isinstance(cause, KeyboardInterrupt):\n raise SynapseUploadAbortedException(\"User interrupted upload\")\n raise SynapseUploadFailedException(\"Part upload failed\") from cause\n\n @tracer.start_as_current_span(\"UploadAttempt::_complete_upload\")\n def _complete_upload(self):\n upload_status_response = self._syn.restPUT(\n \"/file/multipart/{upload_id}/complete\".format(\n upload_id=self._upload_id,\n ),\n requests_session=self._get_thread_session(),\n endpoint=self._syn.fileHandleEndpoint,\n )\n\n upload_state = upload_status_response.get(\"state\")\n if upload_state != \"COMPLETED\":\n # at this point we think successfully uploaded all the parts\n # but the upload status isn't complete, we'll throw an error\n # and let a subsequent attempt try to reconcile\n raise SynapseUploadFailedException(\n \"Upload status has an unexpected state {}\".format(upload_state)\n )\n\n return upload_status_response\n\n def __call__(self):\n upload_status_response = self._create_synapse_upload()\n upload_state = upload_status_response.get(\"state\")\n\n if upload_state != \"COMPLETED\":\n self._upload_id = upload_status_response[\"uploadId\"]\n part_count, remaining_part_numbers = self._get_remaining_part_numbers(\n upload_status_response\n )\n\n # if no remaining part numbers then all the parts have been\n # uploaded but the upload has not been marked complete.\n if remaining_part_numbers:\n self._upload_parts(part_count, remaining_part_numbers)\n upload_status_response = self._complete_upload()\n\n return upload_status_response\n
"},{"location":"reference/core/#synapseclient.core.upload.multipart_upload-functions","title":"Functions","text":""},{"location":"reference/core/#synapseclient.core.upload.multipart_upload.shared_executor","title":"shared_executor(executor)
","text":"An outside process that will eventually trigger an upload through the this module can configure a shared Executor by running its code within this context manager.
Source code insynapseclient/core/upload/multipart_upload.py
@contextmanager\ndef shared_executor(executor):\n    \"\"\"An outside process that will eventually trigger an upload through this module\n    can configure a shared Executor by running its code within this context manager.\"\"\"\n    _thread_local.executor = executor\n    try:\n        yield\n    finally:\n        del _thread_local.executor\n
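A usage sketch, assuming that an upload started inside the context picks up the caller-managed executor via the module's thread-local state:
from concurrent.futures import ThreadPoolExecutor\n\nimport synapseclient\nfrom synapseclient.core.upload.multipart_upload import shared_executor\n\nsyn = synapseclient.login()\nwith ThreadPoolExecutor(max_workers=8) as executor:\n    with shared_executor(executor):\n        # 'syn123' and the path are hypothetical\n        syn.store(synapseclient.File('/path/to/data.csv', parent='syn123'))\n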
"},{"location":"reference/core/#synapseclient.core.upload.multipart_upload.multipart_upload_file","title":"multipart_upload_file(syn, file_path, dest_file_name=None, content_type=None, part_size=None, storage_location_id=None, preview=True, force_restart=False, max_threads=None)
","text":"Upload a file to a Synapse upload destination in chunks.
:param syn: a Synapse object
:param file_path: the file to upload
:param dest_file_name: upload as a different filename
:param content_type: the content type (https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.17)
:param part_size: Number of bytes per part. Minimum 5MB.
:param storage_location_id: an id indicating where the file should be stored. Retrieved from Synapse's UploadDestination
:param preview: True to generate a preview
:param force_restart: True to restart a previously initiated upload from scratch, False to try to resume
:param max_threads: number of concurrent threads to devote to upload

:returns: a File Handle ID

Keyword arguments are passed down to _multipart_upload and _start_multipart_upload.
Source code insynapseclient/core/upload/multipart_upload.py
@tracer.start_as_current_span(\"multipart_upload::multipart_upload_file\")\ndef multipart_upload_file(\n syn,\n file_path: str,\n dest_file_name: str = None,\n content_type: str = None,\n part_size: int = None,\n storage_location_id: str = None,\n preview: bool = True,\n force_restart: bool = False,\n max_threads: int = None,\n) -> str:\n \"\"\"\n Upload a file to a Synapse upload destination in chunks.\n\n :param syn: a Synapse object\n :param file_path: the file to upload\n :param dest_file_name: upload as a different filename\n :param content_type: `contentType`_\n :param part_size: Number of bytes per part. Minimum 5MB.\n :param storage_location_id: an id indicating where the file should be\n stored. Retrieved from Synapse's UploadDestination\n :param preview: True to generate a preview\n :param force_restart: True to restart a previously initiated upload\n from scratch, False to try to resume\n :param max_threads: number of concurrent threads to devote\n to upload\n\n :returns: a File Handle ID\n\n Keyword arguments are passed down to :py:func:`_multipart_upload` and :py:func:`_start_multipart_upload`.\n .. _contentType: https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.17\n\n \"\"\"\n\n trace.get_current_span().set_attributes(\n {\"synapse.storage_location_id\": storage_location_id}\n )\n\n if not os.path.exists(file_path):\n raise IOError('File \"{}\" not found.'.format(file_path))\n if os.path.isdir(file_path):\n raise IOError('File \"{}\" is a directory.'.format(file_path))\n\n file_size = os.path.getsize(file_path)\n if not dest_file_name:\n dest_file_name = os.path.basename(file_path)\n\n if content_type is None:\n mime_type, _ = mimetypes.guess_type(file_path, strict=False)\n content_type = mime_type or \"application/octet-stream\"\n\n callback_func = Spinner().print_tick if not syn.silent else None\n md5_hex = md5_for_file(file_path, callback=callback_func).hexdigest()\n\n part_size = _get_part_size(part_size, file_size)\n\n upload_request = {\n \"concreteType\": concrete_types.MULTIPART_UPLOAD_REQUEST,\n \"contentType\": content_type,\n \"contentMD5Hex\": md5_hex,\n \"fileName\": dest_file_name,\n \"fileSizeBytes\": file_size,\n \"generatePreview\": preview,\n \"partSizeBytes\": part_size,\n \"storageLocationId\": storage_location_id,\n }\n\n def part_fn(part_number):\n return _get_file_chunk(file_path, part_number, part_size)\n\n return _multipart_upload(\n syn,\n dest_file_name,\n upload_request,\n part_fn,\n md5_fn,\n force_restart=force_restart,\n max_threads=max_threads,\n )\n
"},{"location":"reference/core/#synapseclient.core.upload.multipart_upload.multipart_upload_string","title":"multipart_upload_string(syn, text, dest_file_name=None, part_size=None, content_type=None, storage_location_id=None, preview=True, force_restart=False, max_threads=None)
","text":"Upload a file to a Synapse upload destination in chunks.
:param syn: a Synapse object
:param text: a string to upload as a file.
:param dest_file_name: upload as a different filename
:param content_type: the content type (https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.17)
:param part_size: number of bytes per part. Minimum 5MB.
:param storage_location_id: an id indicating where the file should be stored. Retrieved from Synapse's UploadDestination
:param preview: True to generate a preview
:param force_restart: True to restart a previously initiated upload from scratch, False to try to resume
:param max_threads: number of concurrent threads to devote to upload

:returns: a File Handle ID

Keyword arguments are passed down to _multipart_upload and _start_multipart_upload.
Source code insynapseclient/core/upload/multipart_upload.py
@tracer.start_as_current_span(\"multipart_upload::multipart_upload_string\")\ndef multipart_upload_string(\n syn,\n text: str,\n dest_file_name: str = None,\n part_size: int = None,\n content_type: str = None,\n storage_location_id: str = None,\n preview: bool = True,\n force_restart: bool = False,\n max_threads: int = None,\n):\n \"\"\"\n Upload a file to a Synapse upload destination in chunks.\n\n :param syn: a Synapse object\n :param text: a string to upload as a file.\n :param dest_file_name: upload as a different filename\n :param content_type: `contentType`_\n :param part_size: number of bytes per part. Minimum 5MB.\n :param storage_location_id: an id indicating where the file should be\n stored. Retrieved from Synapse's UploadDestination\n :param preview: True to generate a preview\n :param force_restart: True to restart a previously initiated upload\n from scratch, False to try to resume\n :param max_threads: number of concurrent threads to devote\n to upload\n\n :returns: a File Handle ID\n\n Keyword arguments are passed down to\n :py:func:`_multipart_upload` and :py:func:`_start_multipart_upload`.\n\n .. _contentType:\n https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.17\n \"\"\"\n\n data = text.encode(\"utf-8\")\n file_size = len(data)\n md5_hex = md5_fn(data, None)\n\n if not dest_file_name:\n dest_file_name = \"message.txt\"\n\n if not content_type:\n content_type = \"text/plain; charset=utf-8\"\n\n part_size = _get_part_size(part_size, file_size)\n\n upload_request = {\n \"concreteType\": concrete_types.MULTIPART_UPLOAD_REQUEST,\n \"contentType\": content_type,\n \"contentMD5Hex\": md5_hex,\n \"fileName\": dest_file_name,\n \"fileSizeBytes\": file_size,\n \"generatePreview\": preview,\n \"partSizeBytes\": part_size,\n \"storageLocationId\": storage_location_id,\n }\n\n def part_fn(part_number):\n return _get_data_chunk(data, part_number, part_size)\n\n part_size = _get_part_size(part_size, file_size)\n return _multipart_upload(\n syn,\n dest_file_name,\n upload_request,\n part_fn,\n md5_fn,\n force_restart=force_restart,\n max_threads=max_threads,\n )\n
"},{"location":"reference/core/#synapseclient.core.upload.multipart_upload._multipart_upload","title":"synapseclient.core.upload.multipart_upload._multipart_upload(syn, dest_file_name, upload_request, part_fn, md5_fn, force_restart=False, max_threads=None)
","text":"Source code in synapseclient/core/upload/multipart_upload.py
def _multipart_upload(\n syn,\n dest_file_name,\n upload_request,\n part_fn,\n md5_fn,\n force_restart: bool = False,\n max_threads: int = None,\n):\n if max_threads is None:\n max_threads = pool_provider.DEFAULT_NUM_THREADS\n\n max_threads = max(max_threads, 1)\n\n retry = 0\n while True:\n try:\n upload_status_response = UploadAttempt(\n syn,\n dest_file_name,\n upload_request,\n part_fn,\n md5_fn,\n max_threads,\n # only force_restart the first time through (if requested).\n # a retry after a caught exception will not restart the upload\n # from scratch.\n force_restart and retry == 0,\n )()\n\n # success\n return upload_status_response[\"resultFileHandleId\"]\n\n except SynapseUploadFailedException:\n if retry < MAX_RETRIES:\n retry += 1\n else:\n raise\n
"},{"location":"reference/core/#utils","title":"Utils","text":""},{"location":"reference/core/#synapseclient.core.utils","title":"synapseclient.core.utils
","text":"Utility functions useful in the implementation and testing of the Synapse client.
"},{"location":"reference/core/#synapseclient.core.utils-classes","title":"Classes","text":""},{"location":"reference/core/#synapseclient.core.utils.threadsafe_iter","title":"threadsafe_iter
","text":"Takes an iterator/generator and makes it thread-safe by serializing call to the next
method of given iterator/generator. See: http://anandology.com/blog/using-iterators-and-generators/
synapseclient/core/utils.py
class threadsafe_iter:\n    \"\"\"Takes an iterator/generator and makes it thread-safe by serializing calls to the `next` method of the given\n    iterator/generator.\n    See: http://anandology.com/blog/using-iterators-and-generators/\n    \"\"\"\n\n    def __init__(self, it):\n        self.it = it\n        self.lock = threading.Lock()\n\n    def __iter__(self):\n        return self\n\n    def __next__(self):\n        with self.lock:\n            return next(self.it)\n
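A self-contained sketch showing several worker threads consuming one generator safely:
import threading\n\nsafe_iter = threadsafe_iter(iter(range(100)))\nresults = []\nresults_lock = threading.Lock()\n\ndef worker():\n    # each next() on safe_iter is serialized by its internal lock\n    for n in safe_iter:\n        with results_lock:\n            results.append(n * n)\n\nthreads = [threading.Thread(target=worker) for _ in range(4)]\nfor t in threads:\n    t.start()\nfor t in threads:\n    t.join()\nprint(len(results))  # 100\n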
"},{"location":"reference/core/#synapseclient.core.utils.deprecated_keyword_param","title":"deprecated_keyword_param
","text":"A decorator to use to warn when a keyword parameter from a function has been deprecated and is intended for future removal. Will emit a warning such a keyword is passed.
Source code insynapseclient/core/utils.py
class deprecated_keyword_param:\n    \"\"\"A decorator to use to warn when a keyword parameter from a function has been deprecated\n    and is intended for future removal. Will emit a warning when such a keyword is passed.\"\"\"\n\n    def __init__(self, keywords, version, reason):\n        self.keywords = set(keywords)\n        self.version = version\n        self.reason = reason\n\n    def __call__(self, fn):\n        def wrapper(*args, **kwargs):\n            found = self.keywords.intersection(kwargs)\n            if found:\n                warnings.warn(\n                    \"Parameter(s) {} deprecated since version {}; {}\".format(\n                        sorted(list(found)), self.version, self.reason\n                    ),\n                    category=DeprecationWarning,\n                    stacklevel=2,\n                )\n\n            return fn(*args, **kwargs)\n\n        return wrapper\n
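A usage sketch (the function and parameter names are hypothetical):
@deprecated_keyword_param(['old_arg'], version='2.0', reason='use new_arg instead')\ndef my_function(new_arg=None, old_arg=None):\n    return new_arg if new_arg is not None else old_arg\n\nmy_function(old_arg=5)  # emits a DeprecationWarning naming ['old_arg']\n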
"},{"location":"reference/core/#synapseclient.core.utils-functions","title":"Functions","text":""},{"location":"reference/core/#synapseclient.core.utils.md5_for_file","title":"md5_for_file(filename, block_size=2 * MB, callback=None)
","text":"Calculates the MD5 of the given file. See source <http://stackoverflow.com/questions/1131220/get-md5-hash-of-a-files-without-open-it-in-python>
_.
:param filename: The file to read in :param block_size: How much of the file to read in at once (bytes). Defaults to 2 MB :param callback: The callback function that help us show loading spinner on terminal. Defaults to None :returns: The MD5
Source code insynapseclient/core/utils.py
def md5_for_file(filename, block_size=2 * MB, callback=None):\n    \"\"\"\n    Calculates the MD5 of the given file.\n    See `source <http://stackoverflow.com/questions/1131220/get-md5-hash-of-a-files-without-open-it-in-python>`_.\n\n    :param filename: The file to read in\n    :param block_size: How much of the file to read in at once (bytes).\n        Defaults to 2 MB\n    :param callback: The callback function that helps us show a loading spinner on the terminal.\n        Defaults to None\n    :returns: The MD5 hash object\n    \"\"\"\n\n    md5 = hashlib.new(\"md5\", usedforsecurity=False)\n    with open(filename, \"rb\") as f:\n        while True:\n            if callback:\n                callback()\n            data = f.read(block_size)\n            if not data:\n                break\n            md5.update(data)\n    return md5\n
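For example (the path is hypothetical):
digest = md5_for_file('/path/to/data.csv').hexdigest()\nprint(digest)\n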
"},{"location":"reference/core/#synapseclient.core.utils.md5_fn","title":"md5_fn(part, _)
","text":"Calculate the MD5 of a file-like object.
:part -- A file-like object to read from.
:returns: The MD5
Source code insynapseclient/core/utils.py
def md5_fn(part, _):\n    \"\"\"Calculate the MD5 hex digest of a bytes-like object.\n\n    :param part: the bytes to hash (the second argument is ignored)\n\n    :returns: The MD5 hex digest\n    \"\"\"\n    md5 = hashlib.new(\"md5\", usedforsecurity=False)\n    md5.update(part)\n    return md5.hexdigest()\n
"},{"location":"reference/core/#synapseclient.core.utils.download_file","title":"download_file(url, localFilepath=None)
","text":"Downloads a remote file.
:param localFilepath: May be None, in which case a temporary file is created

:returns: localFilepath
Source code insynapseclient/core/utils.py
def download_file(url, localFilepath=None):\n    \"\"\"\n    Downloads a remote file.\n\n    :param localFilepath: May be None, in which case a temporary file is created\n\n    :returns: localFilepath\n    \"\"\"\n\n    f = None\n    try:\n        if localFilepath:\n            dir = os.path.dirname(localFilepath)\n            if not os.path.exists(dir):\n                os.makedirs(dir)\n            f = open(localFilepath, \"wb\")\n        else:\n            f = tempfile.NamedTemporaryFile(delete=False)\n            localFilepath = f.name\n\n        r = requests.get(url, stream=True)\n        toBeTransferred = float(r.headers[\"content-length\"])\n        for nChunks, chunk in enumerate(r.iter_content(chunk_size=1024 * 10)):\n            if chunk:\n                f.write(chunk)\n                printTransferProgress(nChunks * 1024 * 10, toBeTransferred)\n    finally:\n        if f:\n            f.close()\n            printTransferProgress(toBeTransferred, toBeTransferred)\n\n    return localFilepath\n
"},{"location":"reference/core/#synapseclient.core.utils.extract_filename","title":"extract_filename(content_disposition_header, default_filename=None)
","text":"Extract a filename from an HTTP content-disposition header field.
See this memo (http://tools.ietf.org/html/rfc6266) and this package (http://pypi.python.org/pypi/rfc6266) for cryptic details.
synapseclient/core/utils.py
def extract_filename(content_disposition_header, default_filename=None):\n \"\"\"\n Extract a filename from an HTTP content-disposition header field.\n\n See `this memo <http://tools.ietf.org/html/rfc6266>`_ and `this package <http://pypi.python.org/pypi/rfc6266>`_\n for cryptic details.\n \"\"\"\n\n if not content_disposition_header:\n return default_filename\n value, params = cgi.parse_header(content_disposition_header)\n return params.get(\"filename\", default_filename)\n
"},{"location":"reference/core/#synapseclient.core.utils.extract_user_name","title":"extract_user_name(profile)
","text":"Extract a displayable user name from a user's profile
Source code insynapseclient/core/utils.py
def extract_user_name(profile):\n \"\"\"\n Extract a displayable user name from a user's profile\n \"\"\"\n if \"userName\" in profile and profile[\"userName\"]:\n return profile[\"userName\"]\n elif \"displayName\" in profile and profile[\"displayName\"]:\n return profile[\"displayName\"]\n else:\n if (\n \"firstName\" in profile\n and profile[\"firstName\"]\n and \"lastName\" in profile\n and profile[\"lastName\"]\n ):\n return profile[\"firstName\"] + \" \" + profile[\"lastName\"]\n elif \"lastName\" in profile and profile[\"lastName\"]:\n return profile[\"lastName\"]\n elif \"firstName\" in profile and profile[\"firstName\"]:\n return profile[\"firstName\"]\n else:\n return str(profile.get(\"id\", \"Unknown-user\"))\n
"},{"location":"reference/core/#synapseclient.core.utils.id_of","title":"id_of(obj)
","text":"Try to figure out the Synapse ID of the given object.
:param obj: May be a string, Entity object, or dictionary
:returns: The ID or throws an exception
Source code insynapseclient/core/utils.py
def id_of(obj):\n \"\"\"\n Try to figure out the Synapse ID of the given object.\n\n :param obj: May be a string, Entity object, or dictionary\n\n :returns: The ID or throws an exception\n \"\"\"\n if isinstance(obj, str):\n return str(obj)\n if isinstance(obj, numbers.Number):\n return str(obj)\n\n id_attr_names = [\n \"id\",\n \"ownerId\",\n \"tableId\",\n ] # possible attribute names for a synapse Id\n for attribute_name in id_attr_names:\n syn_id = _get_from_members_items_or_properties(obj, attribute_name)\n if syn_id is not None:\n return str(syn_id)\n\n raise ValueError(\"Invalid parameters: couldn't find id of \" + str(obj))\n
"},{"location":"reference/core/#synapseclient.core.utils.concrete_type_of","title":"concrete_type_of(obj)
","text":"Return the concrete type of an object representing a Synapse entity. This is meant to operate either against an actual Entity object, or the lighter weight dictionary returned by Synapse#getChildren, both of which are Mappings.
Source code insynapseclient/core/utils.py
def concrete_type_of(obj: collections.abc.Mapping):\n \"\"\"\n Return the concrete type of an object representing a Synapse entity.\n This is meant to operate either against an actual Entity object, or the lighter\n weight dictionary returned by Synapse#getChildren, both of which are Mappings.\n \"\"\"\n concrete_type = None\n if isinstance(obj, collections.abc.Mapping):\n for key in (\"concreteType\", \"type\"):\n concrete_type = obj.get(key)\n if concrete_type:\n break\n\n if not isinstance(concrete_type, str) or not concrete_type.startswith(\n \"org.sagebionetworks.repo.model\"\n ):\n raise ValueError(\"Unable to determine concreteType\")\n\n return concrete_type\n
"},{"location":"reference/core/#synapseclient.core.utils.is_in_path","title":"is_in_path(id, path)
","text":"Determines whether id is in the path as returned from /entity/{id}/path
:param id: synapse id string :param path: object as returned from '/entity/{id}/path'
:returns: True or False
Source code insynapseclient/core/utils.py
def is_in_path(id, path):\n \"\"\"Determines whether id is in the path as returned from /entity/{id}/path\n\n :param id: synapse id string\n :param path: object as returned from '/entity/{id}/path'\n\n :returns: True or False\n \"\"\"\n return id in [item[\"id\"] for item in path[\"path\"]]\n
"},{"location":"reference/core/#synapseclient.core.utils.get_properties","title":"get_properties(entity)
","text":"Returns the dictionary of properties of the given Entity.
Source code insynapseclient/core/utils.py
def get_properties(entity):\n \"\"\"Returns the dictionary of properties of the given Entity.\"\"\"\n\n return entity.properties if hasattr(entity, \"properties\") else entity\n
"},{"location":"reference/core/#synapseclient.core.utils.is_url","title":"is_url(s)
","text":"Return True if the string appears to be a valid URL.
Source code insynapseclient/core/utils.py
def is_url(s):\n \"\"\"Return True if the string appears to be a valid URL.\"\"\"\n if isinstance(s, str):\n try:\n url_parts = urllib_parse.urlsplit(s)\n # looks like a Windows drive letter?\n if len(url_parts.scheme) == 1 and url_parts.scheme.isalpha():\n return False\n if url_parts.scheme == \"file\" and bool(url_parts.path):\n return True\n return bool(url_parts.scheme) and bool(url_parts.netloc)\n except Exception:\n return False\n return False\n
"},{"location":"reference/core/#synapseclient.core.utils.as_url","title":"as_url(s)
","text":"Tries to convert the input into a proper URL.
Source code insynapseclient/core/utils.py
def as_url(s):\n \"\"\"Tries to convert the input into a proper URL.\"\"\"\n url_parts = urllib_parse.urlsplit(s)\n # Windows drive letter?\n if len(url_parts.scheme) == 1 and url_parts.scheme.isalpha():\n return \"file:///%s\" % str(s).replace(\"\\\\\", \"/\")\n if url_parts.scheme:\n return url_parts.geturl()\n else:\n return \"file://%s\" % str(s)\n
"},{"location":"reference/core/#synapseclient.core.utils.guess_file_name","title":"guess_file_name(string)
","text":"Tries to derive a filename from an arbitrary string.
Source code insynapseclient/core/utils.py
def guess_file_name(string):\n \"\"\"Tries to derive a filename from an arbitrary string.\"\"\"\n path = normalize_path(urllib_parse.urlparse(string).path)\n tokens = [x for x in path.split(\"/\") if x != \"\"]\n if len(tokens) > 0:\n return tokens[-1]\n\n # Try scrubbing the path of illegal characters\n if len(path) > 0:\n path = re.sub(r\"[^a-zA-Z0-9_.+() -]\", \"\", path)\n if len(path) > 0:\n return path\n raise ValueError(\"Could not derive a name from %s\" % string)\n
"},{"location":"reference/core/#synapseclient.core.utils.normalize_path","title":"normalize_path(path)
","text":"Transforms a path into an absolute path with forward slashes only.
Source code insynapseclient/core/utils.py
def normalize_path(path):\n \"\"\"Transforms a path into an absolute path with forward slashes only.\"\"\"\n if path is None:\n return None\n return re.sub(r\"\\\\\", \"/\", os.path.normcase(os.path.abspath(path)))\n
"},{"location":"reference/core/#synapseclient.core.utils.equal_paths","title":"equal_paths(path1, path2)
","text":"Compare file paths in a platform neutral way
Source code insynapseclient/core/utils.py
def equal_paths(path1, path2):\n \"\"\"\n Compare file paths in a platform neutral way\n \"\"\"\n return normalize_path(path1) == normalize_path(path2)\n
"},{"location":"reference/core/#synapseclient.core.utils.file_url_to_path","title":"file_url_to_path(url, verify_exists=False)
","text":"Convert a file URL to a path, handling some odd cases around Windows paths.
:param url: a file URL :param verify_exists: If True, return the path only if the resulting file path exists on the local file system.
:returns: a path or None if the URL is not a file URL.
Source code insynapseclient/core/utils.py
def file_url_to_path(url, verify_exists=False):\n    \"\"\"\n    Convert a file URL to a path, handling some odd cases around Windows paths.\n\n    :param url: a file URL\n    :param verify_exists: If True, return the path only if the resulting file path exists on the local file\n        system.\n\n    :returns: a path or None if the URL is not a file URL.\n    \"\"\"\n    parts = urllib_parse.urlsplit(url)\n    if parts.scheme == \"file\" or parts.scheme == \"\":\n        path = parts.path\n        # A windows file URL, for example file:///c:/WINDOWS/asdf.txt\n        # will get back a path of: /c:/WINDOWS/asdf.txt, which we need to fix by\n        # lopping off the leading slash character. Apparently, the Python developers\n        # think this is not a bug: http://bugs.python.org/issue7965\n        if re.match(r\"\\/[A-Za-z]:\", path):\n            path = path[1:]\n        if os.path.exists(path) or not verify_exists:\n            return path\n    return None\n
"},{"location":"reference/core/#synapseclient.core.utils.is_same_base_url","title":"is_same_base_url(url1, url2)
","text":"Compares two urls to see if they are the same excluding up to the base path
:param url1: a URL :param url2: a second URL
:returns: Boolean
Source code insynapseclient/core/utils.py
def is_same_base_url(url1, url2):\n \"\"\"Compares two urls to see if they are the same excluding up to the base path\n\n :param url1: a URL\n :param url2: a second URL\n\n :returns: Boolean\n \"\"\"\n url1 = urllib_parse.urlsplit(url1)\n url2 = urllib_parse.urlsplit(url2)\n return url1.scheme == url2.scheme and url1.hostname == url2.hostname\n
"},{"location":"reference/core/#synapseclient.core.utils.is_synapse_id_str","title":"is_synapse_id_str(obj)
","text":"If the input is a Synapse ID return it, otherwise return None
Source code insynapseclient/core/utils.py
def is_synapse_id_str(obj):\n \"\"\"If the input is a Synapse ID return it, otherwise return None\"\"\"\n if isinstance(obj, str):\n m = re.match(r\"(syn\\d+$)\", obj)\n if m:\n return m.group(1)\n return None\n
"},{"location":"reference/core/#synapseclient.core.utils.datetime_or_none","title":"datetime_or_none(datetime_str)
","text":"Attempts to convert a string to a datetime object. Returns None if it fails.
Some of the expected formats of datetime_str are: - 2023-12-04T07:00:00Z - 2001-01-01 15:00:00+07:00 - 2001-01-01 15:00:00-07:00 - 2023-12-04 07:00:00+00:00 - 2019-01-01
:param datetime_str: The string to convert to a datetime object :return: The datetime object or None if the conversion fails
Source code insynapseclient/core/utils.py
def datetime_or_none(datetime_str: str) -> typing.Union[datetime.datetime, None]:\n \"\"\"Attempts to convert a string to a datetime object. Returns None if it fails.\n\n Some of the expected formats of datetime_str are:\n - 2023-12-04T07:00:00Z\n - 2001-01-01 15:00:00+07:00\n - 2001-01-01 15:00:00-07:00\n - 2023-12-04 07:00:00+00:00\n - 2019-01-01\n\n :param datetime_str: The string to convert to a datetime object\n :return: The datetime object or None if the conversion fails\n \"\"\"\n try:\n return datetime.datetime.fromisoformat(datetime_str.replace(\"Z\", \"+00:00\"))\n except Exception:\n return None\n
"},{"location":"reference/core/#synapseclient.core.utils.is_date","title":"is_date(dt)
","text":"Objects of class datetime.date and datetime.datetime will be recognized as dates
Source code insynapseclient/core/utils.py
def is_date(dt):\n \"\"\"Objects of class datetime.date and datetime.datetime will be recognized as dates\"\"\"\n return isinstance(dt, datetime.date) or isinstance(dt, datetime.datetime)\n
"},{"location":"reference/core/#synapseclient.core.utils.to_list","title":"to_list(value)
","text":"Convert the value (an iterable or a scalar value) to a list.
Source code insynapseclient/core/utils.py
def to_list(value):\n    \"\"\"Convert the value (an iterable or a scalar value) to a list.\"\"\"\n    if isinstance(value, collections.abc.Iterable) and not isinstance(value, str):\n        values = []\n        for val in value:\n            possible_datetime = None\n            if isinstance(val, str):\n                # check the element itself (not the enclosing iterable) for a\n                # date-like string and convert it to a datetime object\n                possible_datetime = datetime_or_none(val)\n            values.append(val if possible_datetime is None else possible_datetime)\n        return values\n    else:\n        possible_datetime = None\n        if isinstance(value, str):\n            possible_datetime = datetime_or_none(value)\n        return [value if possible_datetime is None else possible_datetime]\n
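A few illustrative calls; note that ISO-format date strings are converted to datetime objects:
to_list('foo')          # ['foo']\nto_list('2019-01-01')   # [datetime.datetime(2019, 1, 1, 0, 0)]\nto_list(('a', 'b'))     # ['a', 'b']\nto_list(1)              # [1]\n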
"},{"location":"reference/core/#synapseclient.core.utils.make_bogus_data_file","title":"make_bogus_data_file(n=100, seed=None)
","text":"Makes a bogus data file for testing. It is the caller's responsibility to clean up the file when finished.
:param n: How many random floating point numbers to be written into the file, separated by commas :param seed: Random seed for the random numbers
:returns: The name of the file
Source code insynapseclient/core/utils.py
def make_bogus_data_file(n=100, seed=None):\n \"\"\"\n Makes a bogus data file for testing. It is the caller's responsibility to clean up the file when finished.\n\n :param n: How many random floating point numbers to be written into the file, separated by commas\n :param seed: Random seed for the random numbers\n\n :returns: The name of the file\n \"\"\"\n\n if seed is not None:\n random.seed(seed)\n data = [random.gauss(mu=0.0, sigma=1.0) for i in range(n)]\n\n f = tempfile.NamedTemporaryFile(mode=\"w\", suffix=\".txt\", delete=False)\n try:\n f.write(\", \".join(str(n) for n in data))\n f.write(\"\\n\")\n finally:\n f.close()\n\n return normalize_path(f.name)\n
"},{"location":"reference/core/#synapseclient.core.utils.make_bogus_binary_file","title":"make_bogus_binary_file(n=1 * KB, filepath=None, printprogress=False)
","text":"Makes a bogus binary data file for testing. It is the caller's responsibility to clean up the file when finished.
:param n: How many bytes to write :param filepath: Where to write the data; a temporary file is created if None :param printprogress: Whether to print transfer progress while generating
:returns: The name of the file
Source code insynapseclient/core/utils.py
def make_bogus_binary_file(n=1 * KB, filepath=None, printprogress=False):\n \"\"\"\n Makes a bogus binary data file for testing. It is the caller's responsibility to clean up the file when finished.\n\n :param n: How many bytes to write\n\n :returns: The name of the file\n \"\"\"\n\n with open(filepath, \"wb\") if filepath else tempfile.NamedTemporaryFile(\n mode=\"wb\", suffix=\".dat\", delete=False\n ) as f:\n if not filepath:\n filepath = f.name\n progress = 0\n remaining = n\n while remaining > 0:\n buff_size = int(min(remaining, 1 * KB))\n f.write(os.urandom(buff_size))\n remaining -= buff_size\n if printprogress:\n progress += buff_size\n printTransferProgress(progress, n, \"Generated \", filepath)\n return normalize_path(filepath)\n
"},{"location":"reference/core/#synapseclient.core.utils.to_unix_epoch_time","title":"to_unix_epoch_time(dt)
","text":"Convert either datetime.date or datetime.datetime objects <http://docs.python.org/2/library/datetime.html>
_ to UNIX time.
Source code insynapseclient/core/utils.py
def to_unix_epoch_time(dt: typing.Union[datetime.date, datetime.datetime, str]) -> int:\n \"\"\"\n Convert either `datetime.date or datetime.datetime objects <http://docs.python.org/2/library/datetime.html>`_\n to UNIX time.\n \"\"\"\n if type(dt) == str:\n dt = datetime.datetime.fromisoformat(dt.replace(\"Z\", \"+00:00\"))\n if type(dt) == datetime.date:\n current_timezone = datetime.datetime.now().astimezone().tzinfo\n datetime_utc = datetime.datetime.combine(dt, datetime.time(0, 0, 0, 0)).replace(\n tzinfo=current_timezone\n )\n else:\n # If the datetime is not timezone aware, assume it is in the local timezone.\n # This is required in order for windows to work with the `astimezone` method.\n if dt.tzinfo is None:\n current_timezone = datetime.datetime.now().astimezone().tzinfo\n dt = dt.replace(tzinfo=current_timezone)\n datetime_utc = dt.astimezone(datetime.timezone.utc)\n return int((datetime_utc - UNIX_EPOCH).total_seconds() * 1000)\n
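A worked example:
import datetime\n\ndt = datetime.datetime(2020, 1, 1, tzinfo=datetime.timezone.utc)\nto_unix_epoch_time(dt)                      # 1577836800000 (milliseconds)\nto_unix_epoch_time('2020-01-01T00:00:00Z')  # 1577836800000\n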
"},{"location":"reference/core/#synapseclient.core.utils.to_unix_epoch_time_secs","title":"to_unix_epoch_time_secs(dt)
","text":"Convert either datetime.date or datetime.datetime objects <http://docs.python.org/2/library/datetime.html>
_ to UNIX time.
Source code insynapseclient/core/utils.py
def to_unix_epoch_time_secs(\n dt: typing.Union[datetime.date, datetime.datetime]\n) -> float:\n \"\"\"\n Convert either `datetime.date or datetime.datetime objects <http://docs.python.org/2/library/datetime.html>`_\n to UNIX time.\n \"\"\"\n if type(dt) == datetime.date:\n current_timezone = datetime.datetime.now().astimezone().tzinfo\n datetime_utc = datetime.datetime.combine(dt, datetime.time(0, 0, 0, 0)).replace(\n tzinfo=current_timezone\n )\n else:\n # If the datetime is not timezone aware, assume it is in the local timezone.\n # This is required in order for windows to work with the `astimezone` method.\n if dt.tzinfo is None:\n current_timezone = datetime.datetime.now().astimezone().tzinfo\n dt = dt.replace(tzinfo=current_timezone)\n datetime_utc = dt.astimezone(datetime.timezone.utc)\n return (datetime_utc - UNIX_EPOCH).total_seconds()\n
"},{"location":"reference/core/#synapseclient.core.utils.from_unix_epoch_time_secs","title":"from_unix_epoch_time_secs(secs)
","text":"Returns a Datetime object given milliseconds since midnight Jan 1, 1970.
Source code insynapseclient/core/utils.py
def from_unix_epoch_time_secs(secs):\n    \"\"\"Returns a Datetime object given seconds since midnight Jan 1, 1970.\"\"\"\n    if isinstance(secs, str):\n        secs = float(secs)\n\n    # utcfromtimestamp() fails for negative values (dates before 1970-1-1) on Windows\n    # so, here's a hack that enables ancient events, such as Chris's birthday to be\n    # converted from seconds since the UNIX epoch to higher level Datetime objects. Ha!\n    if platform.system() == \"Windows\" and secs < 0:\n        mirror_date = datetime.datetime.utcfromtimestamp(abs(secs)).replace(\n            tzinfo=datetime.timezone.utc\n        )\n\n        result = (UNIX_EPOCH - (mirror_date - UNIX_EPOCH)).replace(\n            tzinfo=datetime.timezone.utc\n        )\n\n        return result\n    datetime_instance = datetime.datetime.utcfromtimestamp(secs).replace(\n        tzinfo=datetime.timezone.utc\n    )\n\n    return datetime_instance\n
"},{"location":"reference/core/#synapseclient.core.utils.from_unix_epoch_time","title":"from_unix_epoch_time(ms)
","text":"Returns a Datetime object given milliseconds since midnight Jan 1, 1970.
Source code insynapseclient/core/utils.py
def from_unix_epoch_time(ms) -> datetime.datetime:\n \"\"\"Returns a Datetime object given milliseconds since midnight Jan 1, 1970.\"\"\"\n\n if isinstance(ms, str):\n ms = float(ms)\n return from_unix_epoch_time_secs(ms / 1000.0)\n
"},{"location":"reference/core/#synapseclient.core.utils.format_time_interval","title":"format_time_interval(seconds)
","text":"Format a time interval given in seconds to a readable value, e.g. \"5 minutes, 37 seconds\".
Source code insynapseclient/core/utils.py
def format_time_interval(seconds):\n \"\"\"Format a time interval given in seconds to a readable value, e.g. \\\"5 minutes, 37 seconds\\\".\"\"\"\n\n periods = (\n (\"year\", 60 * 60 * 24 * 365),\n (\"month\", 60 * 60 * 24 * 30),\n (\"day\", 60 * 60 * 24),\n (\"hour\", 60 * 60),\n (\"minute\", 60),\n (\"second\", 1),\n )\n\n result = []\n for period_name, period_seconds in periods:\n if seconds > period_seconds or period_name == \"second\":\n period_value, seconds = divmod(seconds, period_seconds)\n if period_value > 0 or period_name == \"second\":\n if period_value == 1:\n result.append(\"%d %s\" % (period_value, period_name))\n else:\n result.append(\"%d %ss\" % (period_value, period_name))\n return \", \".join(result)\n
"},{"location":"reference/core/#synapseclient.core.utils.itersubclasses","title":"itersubclasses(cls, _seen=None)
","text":"http://code.activestate.com/recipes/576949/ (r3)
itersubclasses(cls)
Generator over all subclasses of a given class, in depth first order.
>>> list(itersubclasses(int)) == [bool]
True
>>> class A(object): pass
>>> class B(A): pass
>>> class C(A): pass
>>> class D(B,C): pass
>>> class E(D): pass
>>> for cls in itersubclasses(A):
...     print(cls.__name__)
B
D
E
C
Source code insynapseclient/core/utils.py
def itersubclasses(cls, _seen=None):\n \"\"\"\n http://code.activestate.com/recipes/576949/ (r3)\n\n itersubclasses(cls)\n\n Generator over all subclasses of a given class, in depth first order.\n\n >>> list(itersubclasses(int)) == [bool]\n True\n >>> class A(object): pass\n >>> class B(A): pass\n >>> class C(A): pass\n >>> class D(B,C): pass\n >>> class E(D): pass\n >>>\n >>> for cls in itersubclasses(A):\n ... print(cls.__name__)\n B\n D\n E\n C\n >>> # get ALL (new-style) classes currently defined\n >>> [cls.__name__ for cls in itersubclasses(object)] #doctest: +ELLIPSIS\n ['type', ...'tuple', ...]\n \"\"\"\n\n if not isinstance(cls, type):\n raise TypeError(\n \"itersubclasses must be called with \" \"new-style classes, not %.100r\" % cls\n )\n if _seen is None:\n _seen = set()\n try:\n subs = cls.__subclasses__()\n except TypeError: # fails only when cls is type\n subs = cls.__subclasses__(cls)\n for sub in subs:\n if sub not in _seen:\n _seen.add(sub)\n yield sub\n for inner_sub in itersubclasses(sub, _seen):\n yield inner_sub\n
"},{"location":"reference/core/#synapseclient.core.utils.itersubclasses--get-all-new-style-classes-currently-defined","title":"get ALL (new-style) classes currently defined","text":"[cls.name for cls in itersubclasses(object)] #doctest: +ELLIPSIS ['type', ...'tuple', ...]
"},{"location":"reference/core/#synapseclient.core.utils.normalize_whitespace","title":"normalize_whitespace(s)
","text":"Strips the string and replace all whitespace sequences and other non-printable characters with a single space.
Source code insynapseclient/core/utils.py
def normalize_whitespace(s):\n    \"\"\"\n    Strips the string and replaces all whitespace sequences and other non-printable characters with a single space.\n    \"\"\"\n    assert isinstance(s, str)\n    return re.sub(r\"[\\x00-\\x20\\s]+\", \" \", s.strip())\n
"},{"location":"reference/core/#synapseclient.core.utils.query_limit_and_offset","title":"query_limit_and_offset(query, hard_limit=1000)
","text":"Extract limit and offset from the end of a query string.
:returns: A triple containing the query with limit and offset removed, the limit at most equal to the hard_limit, and the offset which defaults to 1
Source code insynapseclient/core/utils.py
def query_limit_and_offset(query, hard_limit=1000):\n \"\"\"\n Extract limit and offset from the end of a query string.\n\n :returns: A triple containing the query with limit and offset removed, the limit at most equal to the hard_limit,\n and the offset which\n defaults to 1\n \"\"\"\n # Regex a lower-case string to simplify matching\n tempQueryStr = query.lower()\n regex = r\"\\A(.*\\s)(offset|limit)\\s*(\\d*\\s*)\\Z\"\n\n # Continue to strip off and save the last limit/offset\n match = re.search(regex, tempQueryStr)\n options = {}\n while match is not None:\n options[match.group(2)] = int(match.group(3))\n tempQueryStr = match.group(1)\n match = re.search(regex, tempQueryStr)\n\n # Get a truncated version of the original query string (not in lower-case)\n query = query[: len(tempQueryStr)].strip()\n\n # Continue querying until the entire query has been fetched (or crash out)\n limit = min(options.get(\"limit\", hard_limit), hard_limit)\n offset = options.get(\"offset\", 1)\n\n return query, limit, offset\n
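For example:
query_limit_and_offset('select * from syn123 limit 5 offset 10')\n# -> ('select * from syn123', 5, 10)\n\nquery_limit_and_offset('select * from syn123 limit 5000')\n# -> ('select * from syn123', 1000, 1): limit is capped at hard_limit, offset defaults to 1\n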
"},{"location":"reference/core/#synapseclient.core.utils.extract_synapse_id_from_query","title":"extract_synapse_id_from_query(query)
","text":"An unfortunate hack to pull the synapse ID out of a table query of the form \"select column1, column2 from syn12345 where....\" needed to build URLs for table services.
Source code in synapseclient/core/utils.py
def extract_synapse_id_from_query(query):\n \"\"\"\n An unfortunate hack to pull the synapse ID out of a table query of the form \"select column1, column2 from syn12345\n where....\" needed to build URLs for table services.\n \"\"\"\n m = re.search(r\"from\\s+(syn\\d+)\", query, re.IGNORECASE)\n if m:\n return m.group(1)\n else:\n raise ValueError('Couldn\\'t extract synapse ID from query: \"%s\"' % query)\n
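For example::
from synapseclient.core.utils import extract_synapse_id_from_query\n\nextract_synapse_id_from_query(\"SELECT col1, col2 FROM syn12345 WHERE col1 > 2\")\n# returns \"syn12345\"; a query with no \"from synNNN\" clause raises ValueError\n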
"},{"location":"reference/core/#synapseclient.core.utils.printTransferProgress","title":"printTransferProgress(transferred, toBeTransferred, prefix='', postfix='', isBytes=True, dt=None, previouslyTransferred=0)
","text":"Prints a progress bar
:param transferred: a number of items/bytes completed
:param toBeTransferred: total number of items/bytes when completed
:param prefix: String printed before progress bar
:param postfix: String printed after progress bar
:param isBytes: A boolean indicating whether to convert bytes to kB, MB, GB etc.
:param dt: The time in seconds that has passed since the transfer started; used to calculate the rate
:param previouslyTransferred: the number of bytes that were already transferred before this transfer began (e.g. someone ctrl+c'd out of an upload and restarted it later)
Source code in synapseclient/core/utils.py
def printTransferProgress(\n transferred,\n toBeTransferred,\n prefix=\"\",\n postfix=\"\",\n isBytes=True,\n dt=None,\n previouslyTransferred=0,\n):\n \"\"\"Prints a progress bar\n\n :param transferred: a number of items/bytes completed\n :param toBeTransferred: total number of items/bytes when completed\n :param prefix: String printed before progress bar\n :param postfix: String printed after progress bar\n :param isBytes: A boolean indicating whether to convert bytes to kB, MB, GB etc.\n :param dt: The time in seconds that has passed since transfer started is used to calculate rate\n :param previouslyTransferred: the number of bytes that were already transferred before this transfer began\n (e.g. someone ctrl+c'd out of an upload and restarted it later)\n\n \"\"\"\n if not sys.stdout.isatty():\n return\n barLength = 20 # Modify this to change the length of the progress bar\n status = \"\"\n rate = \"\"\n if dt is not None and dt != 0:\n rate = (transferred - previouslyTransferred) / float(dt)\n rate = \"(%s/s)\" % humanizeBytes(rate) if isBytes else rate\n if toBeTransferred < 0:\n defaultToBeTransferred = barLength * 1 * MB\n if transferred > defaultToBeTransferred:\n progress = (\n float(transferred % defaultToBeTransferred) / defaultToBeTransferred\n )\n else:\n progress = float(transferred) / defaultToBeTransferred\n elif toBeTransferred == 0: # There is nothing to be transferred\n progress = 1\n status = \"Done...\\n\"\n else:\n progress = float(transferred) / toBeTransferred\n if progress >= 1:\n progress = 1\n status = \"Done...\\n\"\n block = int(round(barLength * progress))\n nbytes = humanizeBytes(transferred) if isBytes else transferred\n if toBeTransferred > 0:\n outOf = \"/%s\" % (humanizeBytes(toBeTransferred) if isBytes else toBeTransferred)\n percentage = \"%4.2f%%\" % (progress * 100)\n else:\n outOf = \"\"\n percentage = \"\"\n text = \"\\r%s [%s]%s %s%s %s %s %s \" % (\n prefix,\n \"#\" * block + \"-\" * (barLength - block),\n percentage,\n nbytes,\n outOf,\n rate,\n postfix,\n status,\n )\n sys.stdout.write(text)\n sys.stdout.flush()\n
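For example, a sketch of reporting progress from a copy loop (nothing is printed unless stdout is attached to a terminal)::
import time\nfrom synapseclient.core.utils import printTransferProgress\n\nstart = time.time()\ntotal = 4 * 256 * 1024\nfor i in range(1, 5):\n    # pretend another 256 kB arrived on each pass\n    printTransferProgress(i * 256 * 1024, total, prefix=\"data.bin\", dt=time.time() - start)\n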
"},{"location":"reference/core/#synapseclient.core.utils.touch","title":"touch(path, times=None)
","text":"Make sure a file exists. Update its access and modified times.
Source code in synapseclient/core/utils.py
def touch(path, times=None):\n \"\"\"\n Make sure a file exists. Update its access and modified times.\n \"\"\"\n basedir = os.path.dirname(path)\n if not os.path.exists(basedir):\n try:\n os.makedirs(basedir)\n except OSError as err:\n # alternate processes might be creating these at the same time\n if err.errno != errno.EEXIST:\n raise\n\n with open(path, \"a\"):\n os.utime(path, times)\n return path\n
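For example, with an illustrative path (missing parent directories are created as needed)::
from synapseclient.core.utils import touch\n\ntouch(\"/tmp/example_dir/placeholder.txt\")\n# returns \"/tmp/example_dir/placeholder.txt\", which is now guaranteed to exist\n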
"},{"location":"reference/core/#synapseclient.core.utils.is_json","title":"is_json(content_type)
","text":"detect if a content-type is JSON
Source code in synapseclient/core/utils.py
def is_json(content_type):\n \"\"\"detect if a content-type is JSON\"\"\"\n # The value of Content-Type defined here:\n # http://www.w3.org/Protocols/rfc2616/rfc2616-sec3.html#sec3.7\n return (\n content_type.lower().strip().startswith(\"application/json\")\n if content_type\n else False\n )\n
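For example::
from synapseclient.core.utils import is_json\n\nis_json(\"application/json; charset=UTF-8\")  # True\nis_json(\"text/plain\")  # False\nis_json(None)  # False\n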
"},{"location":"reference/core/#synapseclient.core.utils.find_data_file_handle","title":"find_data_file_handle(bundle)
","text":"Return the fileHandle whose ID matches the dataFileHandleId in an entity bundle
Source code in synapseclient/core/utils.py
def find_data_file_handle(bundle):\n \"\"\"Return the fileHandle whose ID matches the dataFileHandleId in an entity bundle\"\"\"\n for fileHandle in bundle[\"fileHandles\"]:\n if fileHandle[\"id\"] == bundle[\"entity\"][\"dataFileHandleId\"]:\n return fileHandle\n return None\n
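For example, with a minimal illustrative bundle (real bundles carry many more fields)::
from synapseclient.core.utils import find_data_file_handle\n\nbundle = {\n    \"entity\": {\"dataFileHandleId\": \"456\"},\n    \"fileHandles\": [{\"id\": \"123\"}, {\"id\": \"456\"}],\n}\nfind_data_file_handle(bundle)\n# returns {\"id\": \"456\"}\n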
"},{"location":"reference/core/#synapseclient.core.utils.unique_filename","title":"unique_filename(path)
","text":"Returns a unique path by appending (n) for some number n to the end of the filename.
Source code in synapseclient/core/utils.py
def unique_filename(path):\n \"\"\"Returns a unique path by appending (n) for some number n to the end of the filename.\"\"\"\n\n base, ext = os.path.splitext(path)\n counter = 0\n while os.path.exists(path):\n counter += 1\n path = base + (\"(%d)\" % counter) + ext\n\n return path\n
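For example, if /tmp/data.csv and /tmp/data(1).csv already exist::
from synapseclient.core.utils import unique_filename\n\nunique_filename(\"/tmp/data.csv\")\n# returns \"/tmp/data(2).csv\"; a path that does not exist yet is returned unchanged\n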
"},{"location":"reference/core/#synapseclient.core.utils.threadsafe_generator","title":"threadsafe_generator(f)
","text":"A decorator that takes a generator function and makes it thread-safe. See: http://anandology.com/blog/using-iterators-and-generators/
Source code in synapseclient/core/utils.py
def threadsafe_generator(f):\n \"\"\"A decorator that takes a generator function and makes it thread-safe.\n See: http://anandology.com/blog/using-iterators-and-generators/\n \"\"\"\n\n def g(*a, **kw):\n return threadsafe_iter(f(*a, **kw))\n\n return g\n
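For example, a sketch of wrapping a generator so that concurrent next() calls are serialized by the lock inside threadsafe_iter::
from synapseclient.core.utils import threadsafe_generator\n\n@threadsafe_generator\ndef chunks(data, size):\n    for i in range(0, len(data), size):\n        yield data[i : i + size]\n\nit = chunks(b\"0123456789\", 4)  # now safe to consume from multiple threads\n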
"},{"location":"reference/core/#synapseclient.core.utils.extract_prefix","title":"extract_prefix(keys)
","text":"Takes a list of strings and extracts a common prefix delimited by a dot, for example::
extract_prefix([\"entity.bang\", \"entity.bar\", \"entity.bat\"])\n# returns \"entity\"\n
Source code in synapseclient/core/utils.py
def extract_prefix(keys):\n \"\"\"\n Takes a list of strings and extracts a common prefix delimited by a dot,\n for example::\n\n extract_prefix([\"entity.bang\", \"entity.bar\", \"entity.bat\"])\n # returns \"entity\"\n\n \"\"\"\n prefixes = set()\n for key in keys:\n parts = key.split(\".\")\n if len(parts) > 1:\n prefixes.add(parts[0])\n else:\n return \"\"\n if len(prefixes) == 1:\n return prefixes.pop() + \".\"\n return \"\"\n
"},{"location":"reference/core/#synapseclient.core.utils.extract_zip_file_to_directory","title":"extract_zip_file_to_directory(zip_file, zip_entry_name, target_dir)
","text":"Extracts a specified file in a zip to the specified directory :param zip_file: an opened zip file. e.g. \"with zipfile.ZipFile(zipfilepath) as zip_file:\" :param zip_entry_name: the name of the file to be extracted from the zip e.g. folderInsideZipIfAny/fileName.txt :param target_dir: the directory to which the file will be extracted
:return: full path to the extracted file
Source code in synapseclient/core/utils.py
def extract_zip_file_to_directory(zip_file, zip_entry_name, target_dir):\n \"\"\"\n Extracts a specified file in a zip to the specified directory\n :param zip_file: an opened zip file. e.g. \"with zipfile.ZipFile(zipfilepath) as zip_file:\"\n :param zip_entry_name: the name of the file to be extracted from the zip e.g. folderInsideZipIfAny/fileName.txt\n :param target_dir: the directory to which the file will be extracted\n\n :return: full path to the extracted file\n \"\"\"\n file_base_name = os.path.basename(zip_entry_name) # base name of the file\n filepath = os.path.join(\n target_dir, file_base_name\n ) # file path to the cached file to write\n\n # Create the cache directory if it does not exist\n if not os.path.exists(target_dir):\n os.makedirs(target_dir)\n\n # write the file from the zip into the cache\n with open(filepath, \"wb\") as cache_file:\n cache_file.write(zip_file.read(zip_entry_name))\n\n return filepath\n
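For example, with illustrative file names::
import zipfile\nfrom synapseclient.core.utils import extract_zip_file_to_directory\n\nwith zipfile.ZipFile(\"archive.zip\") as zip_file:\n    path = extract_zip_file_to_directory(zip_file, \"folder_in_zip/data.txt\", \"/tmp/cache\")\n# path == \"/tmp/cache/data.txt\"\n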
"},{"location":"reference/core/#synapseclient.core.utils.topolgical_sort","title":"topolgical_sort(graph)
","text":"Given a graph in the form of a dictionary returns a sorted list
Adapted from: http://blog.jupo.org/2012/04/06/topological-sorting-acyclic-directed-graphs/
:param graph: a dictionary with values containing lists of keys referencing back into the dictionary
:returns: sorted list of items
Source code in synapseclient/core/utils.py
def topolgical_sort(graph):\n \"\"\"Given a graph in the form of a dictionary returns a sorted list\n\n Adapted from: http://blog.jupo.org/2012/04/06/topological-sorting-acyclic-directed-graphs/\n\n :param graph: a dictionary with values containing lists of keys referencing back into the dictionary\n\n :returns: sorted list of items\n \"\"\"\n graph_unsorted = graph.copy()\n graph_sorted = []\n # Convert the unsorted graph into a hash table. This gives us\n # constant-time lookup for checking if edges are unresolved\n\n # Run until the unsorted graph is empty.\n while graph_unsorted:\n # Go through each of the node/edges pairs in the unsorted\n # graph. If a set of edges doesn't contain any nodes that\n # haven't been resolved, that is, that are still in the\n # unsorted graph, remove the pair from the unsorted graph,\n # and append it to the sorted graph. Note here that by using\n # using the items() method for iterating, a copy of the\n # unsorted graph is used, allowing us to modify the unsorted\n # graph as we move through it. We also keep a flag for\n # checking that that graph is acyclic, which is true if any\n # nodes are resolved during each pass through the graph. If\n # not, we need to bail out as the graph therefore can't be\n # sorted.\n acyclic = False\n for node, edges in list(graph_unsorted.items()):\n for edge in edges:\n if edge in graph_unsorted:\n break\n else:\n acyclic = True\n del graph_unsorted[node]\n graph_sorted.append((node, edges))\n\n if not acyclic:\n # We've passed through all the unsorted nodes and\n # weren't able to resolve any of them, which means there\n # are nodes with cyclic edges that will never be resolved,\n # so we bail out with an error.\n raise RuntimeError(\n \"A cyclic dependency occurred.\"\n \" Some files in provenance reference each other circularly.\"\n )\n return graph_sorted\n
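For example, where each value lists the keys its node depends on, dependencies come out first::
from synapseclient.core.utils import topolgical_sort\n\ngraph = {\"c\": [\"a\", \"b\"], \"b\": [\"a\"], \"a\": []}\ntopolgical_sort(graph)\n# returns [(\"a\", []), (\"b\", [\"a\"]), (\"c\", [\"a\", \"b\"])]\n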
"},{"location":"reference/core/#synapseclient.core.utils.caller_module_name","title":"caller_module_name(current_frame)
","text":":param current_frame: use inspect.currentframe(). :return: the name of the module calling the function, foo(), in which this calling_module() is invoked. Ignores callers that belong in the same module as foo()
Source code in synapseclient/core/utils.py
def caller_module_name(current_frame):\n \"\"\"\n :param current_frame: use inspect.currentframe().\n :return: the name of the module calling the function, foo(), in which this calling_module() is invoked.\n Ignores callers that belong in the same module as foo()\n \"\"\"\n\n current_frame_filename = (\n current_frame.f_code.co_filename\n ) # filename in which foo() resides\n\n # go back a frame takes us to the frame calling foo()\n caller_frame = current_frame.f_back\n caller_filename = caller_frame.f_code.co_filename\n\n # find the first frame that does not have the same filename. this ensures that we don't consider functions within\n # the same module as foo() that use foo() as a helper function\n while caller_filename == current_frame_filename:\n caller_frame = caller_frame.f_back\n caller_filename = caller_frame.f_code.co_filename\n\n return inspect.getmodulename(caller_filename)\n
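For example, a sketch in which a hypothetical helper foo() reports the module of its caller::
import inspect\nfrom synapseclient.core.utils import caller_module_name\n\ndef foo():\n    # name of the module whose code called foo(), skipping any frames\n    # that live in the same file as foo() itself\n    return caller_module_name(inspect.currentframe())\n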
"},{"location":"reference/core/#synapseclient.core.utils.snake_case","title":"snake_case(string)
","text":"Convert the given string from CamelCase to snake_case
Source code in synapseclient/core/utils.py
def snake_case(string):\n \"\"\"Convert the given string from CamelCase to snake_case\"\"\"\n # https://stackoverflow.com/a/1176023\n return re.sub(r\"(?<!^)(?=[A-Z])\", \"_\", string).lower()\n
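For example::
from synapseclient.core.utils import snake_case\n\nsnake_case(\"dataFileHandleId\")\n# returns \"data_file_handle_id\"\n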
"},{"location":"reference/core/#synapseclient.core.utils.is_base64_encoded","title":"is_base64_encoded(input_string)
","text":"Return whether the given input string appears to be base64 encoded
Source code in synapseclient/core/utils.py
def is_base64_encoded(input_string):\n \"\"\"Return whether the given input string appears to be base64 encoded\"\"\"\n if not input_string:\n # None, empty string are not considered encoded\n return False\n try:\n # see if we can decode it and then reencode it back to the input\n byte_string = (\n input_string\n if isinstance(input_string, bytes)\n else str.encode(input_string)\n )\n return base64.b64encode(base64.b64decode(byte_string)) == byte_string\n except Exception:\n return False\n
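For example::
from synapseclient.core.utils import is_base64_encoded\n\nis_base64_encoded(\"aGVsbG8=\")  # True, decodes to b\"hello\"\nis_base64_encoded(\"not base64!\")  # False\nis_base64_encoded(\"\")  # False\n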
"},{"location":"reference/core/#versions","title":"Versions","text":""},{"location":"reference/core/#synapseclient.core.version_check","title":"synapseclient.core.version_check
","text":"Version Functions
Check for latest version and recommend upgrade::
synapseclient.check_for_updates()\n
Print release notes for installed version of client::
synapseclient.release_notes()\n
"},{"location":"reference/core/#synapseclient.core.version_check-functions","title":"Functions","text":""},{"location":"reference/core/#synapseclient.core.version_check.version_check","title":"version_check(current_version=None, version_url=_VERSION_URL, check_for_point_releases=False)
","text":"Gets the latest version information from version_url and check against the current version. Recommends upgrade, if a newer version exists.
:returns: True if current version is the latest release (or higher) version, False otherwise.
Source code in synapseclient/core/version_check.py
def version_check(\n current_version=None, version_url=_VERSION_URL, check_for_point_releases=False\n):\n \"\"\"\n Gets the latest version information from version_url and check against the current version.\n Recommends upgrade, if a newer version exists.\n\n :returns: True if current version is the latest release (or higher) version,\n False otherwise.\n \"\"\"\n\n try:\n if not current_version:\n current_version = synapseclient.__version__\n\n version_info = _get_version_info(version_url)\n\n current_base_version = _strip_dev_suffix(current_version)\n\n # Check blacklist\n if (\n current_base_version in version_info[\"blacklist\"]\n or current_version in version_info[\"blacklist\"]\n ):\n msg = (\n \"\\nPLEASE UPGRADE YOUR CLIENT\\n\\nUpgrading your SynapseClient is required. \"\n \"Please upgrade your client by typing:\\n\"\n \" pip install --upgrade synapseclient\\n\\n\"\n )\n raise SystemExit(msg)\n\n if \"message\" in version_info:\n sys.stderr.write(version_info[\"message\"] + \"\\n\")\n\n levels = 3 if check_for_point_releases else 2\n\n # Compare with latest version\n if _version_tuple(current_version, levels=levels) < _version_tuple(\n version_info[\"latestVersion\"], levels=levels\n ):\n sys.stderr.write(\n \"\\nUPGRADE AVAILABLE\\n\\nA more recent version of the Synapse Client (%s) \"\n \"is available. Your version (%s) can be upgraded by typing:\\n\"\n \" pip install --upgrade synapseclient\\n\\n\"\n % (\n version_info[\"latestVersion\"],\n current_version,\n )\n )\n if \"releaseNotes\" in version_info:\n sys.stderr.write(\n \"Python Synapse Client version %s release notes\\n\\n\"\n % version_info[\"latestVersion\"]\n )\n sys.stderr.write(version_info[\"releaseNotes\"] + \"\\n\\n\")\n return False\n\n except Exception as e:\n # Don't prevent the client from running if something goes wrong\n sys.stderr.write(\"Exception in version check: %s\\n\" % (str(e),))\n return False\n\n return True\n
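For example, a sketch of an explicit check against the released version (requires network access)::
from synapseclient.core.version_check import version_check\n\nup_to_date = version_check(check_for_point_releases=True)\n# writes an upgrade message to stderr and returns False when a newer release exists\n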
"},{"location":"reference/core/#synapseclient.core.version_check.check_for_updates","title":"check_for_updates()
","text":"Check for the existence of newer versions of the client, reporting both current release version and development version.
For help installing development versions of the client, see the docs for synapseclient or the README.md (https://github.com/Sage-Bionetworks/synapsePythonClient).
Source code in synapseclient/core/version_check.py
def check_for_updates():\n \"\"\"\n Check for the existence of newer versions of the client, reporting both current release version and development\n version.\n\n For help installing development versions of the client, see the docs for\n :py:mod:`synapseclient` or the `README.md <https://github.com/Sage-Bionetworks/synapsePythonClient>`_.\n \"\"\"\n sys.stderr.write(\"Python Synapse Client\\n\")\n sys.stderr.write(\"currently running version: %s\\n\" % synapseclient.__version__)\n\n release_version_info = _get_version_info(_VERSION_URL)\n sys.stderr.write(\n \"latest release version: %s\\n\" % release_version_info[\"latestVersion\"]\n )\n\n if _version_tuple(synapseclient.__version__, levels=3) < _version_tuple(\n release_version_info[\"latestVersion\"], levels=3\n ):\n print(\n (\n \"\\nUPGRADE AVAILABLE\\n\\nA more recent version of the Synapse Client (%s) is available. \"\n \"Your version (%s) can be upgraded by typing:\\n\"\n \" pip install --upgrade synapseclient\\n\\n\"\n )\n % (\n release_version_info[\"latestVersion\"],\n synapseclient.__version__,\n )\n )\n else:\n sys.stderr.write(\"\\nYour Synapse client is up to date!\\n\")\n
"},{"location":"reference/core/#synapseclient.core.version_check.release_notes","title":"release_notes(version_url=None)
","text":"Print release notes for the installed version of the client or latest release or development version if version_url is supplied.
:param version_url: Defaults to None, meaning release notes for the installed version. Alternatives are:
- synapseclient.version_check._VERSION_URL\n - synapseclient.version_check._DEV_VERSION_URL\n
Source code in synapseclient/core/version_check.py
def release_notes(version_url=None):\n \"\"\"\n Print release notes for the installed version of the client or latest release or development version if version_url\n is supplied.\n\n :param version_url: Defaults to None, meaning release notes for the installed version. Alternatives are:\n\n - synapseclient.version_check._VERSION_URL\n - synapseclient.version_check._DEV_VERSION_URL\n\n \"\"\"\n version_info = _get_version_info(version_url)\n sys.stderr.write(\n \"Python Synapse Client version %s release notes\\n\\n\"\n % version_info[\"latestVersion\"]\n )\n if \"releaseNotes\" in version_info:\n sys.stderr.write(version_info[\"releaseNotes\"] + \"\\n\")\n
"},{"location":"reference/docker_repository/","title":"DockerRepository","text":""},{"location":"reference/docker_repository/#synapseclient.entity.DockerRepository","title":"synapseclient.entity.DockerRepository
","text":" Bases: Entity
A Docker repository is a lightweight virtual machine image.
NOTE: store()-ing a DockerRepository created in the Python client will always result in it being treated as a reference to an external Docker repository that is not managed by Synapse. To upload a Docker image that is managed by Synapse, please use the official Docker client and read https://help.synapse.org/docs/Synapse-Docker-Registry.2011037752.html for instructions on uploading a Docker image to Synapse.
ATTRIBUTE DESCRIPTION
repositoryName
The name of the Docker Repository. Usually in the format: [host[:port]/]path. If host is not set, it will default to that of DockerHub. port can only be specified if the host is also specified.
parent
The parent project or folder
properties
A map of Synapse properties
annotations
A map of user defined annotations
local_state
Internal use only
Source code in
synapseclient/entity.py
class DockerRepository(Entity):\n \"\"\"\n A Docker repository is a lightweight virtual machine image.\n\n NOTE: store()-ing a DockerRepository created in the Python client will always result in it being treated as a\n reference to an external Docker repository that is not managed by synapse.\n To upload a docker image that is managed by Synapse please use the official Docker client and read\n https://help.synapse.org/docs/Synapse-Docker-Registry.2011037752.html for instructions on uploading a Docker Image to Synapse\n\n Attributes:\n repositoryName: The name of the Docker Repository. Usually in the format: [host[:port]/]path.\n If host is not set, it will default to that of DockerHub. port can only be specified\n if the host is also specified.\n parent: The parent project or folder\n properties: A map of Synapse properties\n annotations: A map of user defined annotations\n local_state: Internal use only\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.docker.DockerRepository\"\n\n _property_keys = Entity._property_keys + [\"repositoryName\"]\n\n def __init__(\n self,\n repositoryName=None,\n parent=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if repositoryName:\n kwargs[\"repositoryName\"] = repositoryName\n super(DockerRepository, self).__init__(\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n if \"repositoryName\" not in self:\n raise SynapseMalformedEntityError(\n \"DockerRepository must have a repositoryName.\"\n )\n
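For example, a sketch of storing a reference to an external repository (the parent project ID syn123 is illustrative)::
import synapseclient\nfrom synapseclient.entity import DockerRepository\n\nsyn = synapseclient.login()\nrepo = DockerRepository(\n    repositoryName=\"quay.io/example-org/example-image\",\n    parent=\"syn123\",\n)\nrepo = syn.store(repo)  # stored as a reference to the external repository\n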
"},{"location":"reference/entity/","title":"Entity","text":"The Entity class is the base class for all entities, including Project, Folder, File, and Link.
Entities are dictionary-like objects in which both object and dictionary notation (entity.foo
or entity['foo']
) can be used interchangeably.
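For example (the parent ID syn123 is illustrative)::
from synapseclient import File\n\ndata = File(\"/path/to/file/data.xyz\", parent=\"syn123\")\ndata.pi = 3.14  # attribute notation writes an annotation\ndata[\"pi\"]  # dictionary notation reads the same value: 3.14\n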
synapseclient.entity.Entity
","text":" Bases: MutableMapping
A Synapse entity is an object that has metadata, access control, and potentially a file. It can represent data, source code, or a folder that contains other entities.
Entities should typically be created using the constructors for specific subclasses such as synapseclient.Project, synapseclient.Folder or synapseclient.File.
ATTRIBUTE DESCRIPTION
id
The unique immutable ID for this entity. A new ID will be generated for new Entities. Once issued, this ID is guaranteed to never change or be re-issued
name
The name of this entity. Must be 256 characters or less. Names may only contain: letters, numbers, spaces, underscores, hyphens, periods, plus signs, apostrophes, and parentheses
description
The description of this entity. Must be 1000 characters or less.
parentId
The ID of the Entity that is the parent of this Entity.
entityType
concreteType
Indicates which implementation of Entity this object represents. The value is the fully qualified class name, e.g. org.sagebionetworks.repo.model.FileEntity.
etag
Synapse employs an Optimistic Concurrency Control (OCC) scheme to handle concurrent updates. Since the E-Tag changes every time an entity is updated it is used to detect when a client's current representation of an entity is out-of-date.
annotations
The dict of annotations for this entity.
accessControlList
createdOn
The date this entity was created.
createdBy
The ID of the user that created this entity.
modifiedOn
The date this entity was last modified.
modifiedBy
The ID of the user that last modified this entity.
Source code in
synapseclient/entity.py
class Entity(collections.abc.MutableMapping):\n \"\"\"\n A Synapse entity is an object that has metadata, access control, and potentially a file. It can represent data,\n source code, or a folder that contains other entities.\n\n Entities should typically be created using the constructors for specific subclasses\n such as [synapseclient.Project][], [synapseclient.Folder][] or [synapseclient.File][].\n\n Attributes:\n id: The unique immutable ID for this entity. A new ID will be generated for new\n Entities. Once issued, this ID is guaranteed to never change or be re-issued\n name: The name of this entity. Must be 256 characters or less. Names may only\n contain: letters, numbers, spaces, underscores, hyphens, periods, plus\n signs, apostrophes, and parentheses\n description: The description of this entity. Must be 1000 characters or less.\n parentId: The ID of the Entity that is the parent of this Entity.\n entityType:\n concreteType: Indicates which implementation of Entity this object represents.\n The value is the fully qualified class name, e.g.\n org.sagebionetworks.repo.model.FileEntity.\n etag: Synapse employs an Optimistic Concurrency Control (OCC) scheme to handle\n concurrent updates. Since the E-Tag changes every time an entity is\n updated it is used to detect when a client's current representation of\n an entity is out-of-date.\n annotations: The dict of annotations for this entity.\n accessControlList:\n createdOn: The date this entity was created.\n createdBy: The ID of the user that created this entity.\n modifiedOn: The date this entity was last modified.\n modifiedBy: The ID of the user that last modified this entity.\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.Entity\"\n _property_keys = [\n \"id\",\n \"name\",\n \"description\",\n \"parentId\",\n \"entityType\",\n \"concreteType\",\n \"uri\",\n \"etag\",\n \"annotations\",\n \"accessControlList\",\n \"createdOn\",\n \"createdBy\",\n \"modifiedOn\",\n \"modifiedBy\",\n ]\n _local_keys = []\n\n @classmethod\n def create(cls, properties=None, annotations=None, local_state=None):\n \"\"\"\n Create an Entity or a subclass given dictionaries of properties and annotations, as might be received from the\n Synapse Repository.\n\n Arguments:\n properties: A map of Synapse properties\n\n - If 'concreteType' is defined in properties, we create the proper subclass of Entity. If not, give back the\n type whose constructor was called.\n - If passed an Entity as input, create a new Entity using the input entity as a prototype.\n annotations: A map of user defined annotations\n local_state: Internal use only\n \"\"\"\n\n # Create a new Entity using an existing Entity as a prototype\n if isinstance(properties, Entity):\n if annotations is None:\n annotations = {}\n if local_state is None:\n local_state = {}\n annotations.update(properties.annotations)\n local_state.update(properties.local_state())\n properties = properties.properties\n if \"id\" in properties:\n del properties[\"id\"]\n\n if (\n cls == Entity\n and \"concreteType\" in properties\n and properties[\"concreteType\"] in entity_type_to_class\n ):\n cls = entity_type_to_class[properties[\"concreteType\"]]\n return cls(\n properties=properties, annotations=annotations, local_state=local_state\n )\n\n @classmethod\n def getURI(cls, id):\n return \"/entity/%s\" % id\n\n def __new__(cls, *args, **kwargs):\n obj = object.__new__(cls)\n\n # Make really sure that properties and annotations exist before\n # any object methods get invoked. 
This is important because the\n # dot operator magic methods have been overridden and depend on\n # properties and annotations existing.\n obj.__dict__[\"properties\"] = DictObject()\n obj.__dict__[\"annotations\"] = DictObject()\n return obj\n\n def __init__(\n self, properties=None, annotations=None, local_state=None, parent=None, **kwargs\n ):\n if properties:\n if isinstance(properties, collections.abc.Mapping):\n if \"annotations\" in properties and isinstance(\n properties[\"annotations\"], collections.abc.Mapping\n ):\n annotations.update(properties[\"annotations\"])\n del properties[\"annotations\"]\n\n # Re-map `items` to `datasetItems` to avoid namespace conflicts\n # between Dataset schema and the items() builtin method.\n if \"items\" in properties:\n properties[\"datasetItems\"] = properties[\"items\"]\n del properties[\"items\"]\n self.__dict__[\"properties\"].update(properties)\n else:\n raise SynapseMalformedEntityError(\n \"Unknown argument type: properties is a %s\" % str(type(properties))\n )\n\n if annotations:\n if isinstance(annotations, collections.abc.Mapping):\n self.__dict__[\"annotations\"].update(annotations)\n elif isinstance(annotations, str):\n self.properties[\"annotations\"] = annotations\n else:\n raise SynapseMalformedEntityError(\n \"Unknown argument type: annotations is a %s\"\n % str(type(annotations))\n )\n\n if local_state:\n if isinstance(local_state, collections.abc.Mapping):\n self.local_state(local_state)\n else:\n raise SynapseMalformedEntityError(\n \"Unknown argument type: local_state is a %s\"\n % str(type(local_state))\n )\n\n for key in self.__class__._local_keys:\n if key not in self.__dict__:\n self.__dict__[key] = None\n\n # Extract parentId from parent\n if \"parentId\" not in kwargs:\n if parent:\n try:\n kwargs[\"parentId\"] = id_of(parent)\n except Exception:\n if isinstance(parent, Entity) and \"id\" not in parent:\n raise SynapseMalformedEntityError(\n \"Couldn't find 'id' of parent.\"\n \" Has it been stored in Synapse?\"\n )\n else:\n raise SynapseMalformedEntityError(\n \"Couldn't find 'id' of parent.\"\n )\n\n # Note: that this will work properly if derived classes declare their internal state variable *before* invoking\n # super(...).__init__(...)\n for key, value in kwargs.items():\n self.__setitem__(key, value)\n\n if \"concreteType\" not in self:\n self[\"concreteType\"] = self.__class__._synapse_entity_type\n\n # Only project can be top-level. 
All other entity types require parentId don't enforce this for generic Entity\n if (\n \"parentId\" not in self\n and not isinstance(self, Project)\n and not type(self) == Entity\n ):\n raise SynapseMalformedEntityError(\n \"Entities of type %s must have a parentId.\" % type(self)\n )\n\n def postURI(self):\n return \"/entity\"\n\n def putURI(self):\n return \"/entity/%s\" % self.id\n\n def deleteURI(self, versionNumber=None):\n if versionNumber:\n return \"/entity/%s/version/%s\" % (self.id, versionNumber)\n else:\n return \"/entity/%s\" % self.id\n\n def local_state(self, state=None) -> dict:\n \"\"\"\n Set or get the object's internal state, excluding properties, or annotations.\n\n Arguments:\n state: A dictionary\n\n Returns:\n The object's internal state, excluding properties, or annotations.\n \"\"\"\n if state:\n for key, value in state.items():\n if key not in [\"annotations\", \"properties\"]:\n self.__dict__[key] = value\n result = {}\n for key, value in self.__dict__.items():\n if key not in [\"annotations\", \"properties\"] and not key.startswith(\"__\"):\n result[key] = value\n return result\n\n def __setattr__(self, key, value):\n return self.__setitem__(key, value)\n\n def __setitem__(self, key, value):\n if key in self.__dict__ or key in self.__class__._local_keys:\n # If we assign like so:\n # entity.annotations = {'foo';123, 'bar':'bat'}\n # Wrap the dictionary in a DictObject so we can\n # later do:\n # entity.annotations.foo = 'bar'\n if (key == \"annotations\" or key == \"properties\") and not isinstance(\n value, DictObject\n ):\n value = DictObject(value)\n self.__dict__[key] = value\n elif key in self.__class__._property_keys:\n self.properties[key] = value\n else:\n self.annotations[key] = value\n\n # TODO: def __delattr__\n\n def __getattr__(self, key):\n # Note: that __getattr__ is only called after an attempt to\n # look the key up in the object's dictionary has failed.\n try:\n return self.__getitem__(key)\n except KeyError:\n # Note that hasattr in Python2 is more permissive than Python3\n # about what exceptions it catches. In Python3, hasattr catches\n # only AttributeError\n raise AttributeError(key)\n\n def __getitem__(self, key):\n if key in self.__dict__:\n return self.__dict__[key]\n elif key in self.properties:\n return self.properties[key]\n elif key in self.annotations:\n return self.annotations[key]\n else:\n raise KeyError(key)\n\n def __delitem__(self, key):\n if key in self.properties:\n del self.properties[key]\n elif key in self.annotations:\n del self.annotations[key]\n\n def __iter__(self):\n return iter(self.keys())\n\n def __len__(self):\n return len(self.keys())\n\n # TODO shouldn't these include local_state as well? 
-jcb\n def keys(self):\n \"\"\"Returns a set of property and annotation keys\"\"\"\n return set(self.properties.keys()) | set(self.annotations.keys())\n\n def has_key(self, key):\n \"\"\"Is the given key a property or annotation?\"\"\"\n\n return key in self.properties or key in self.annotations\n\n def _write_kvps(self, f, dictionary, key_filter=None, key_aliases=None):\n for key in sorted(dictionary.keys()):\n if (not key_filter) or key_filter(key):\n f.write(\" \")\n f.write(str(key) if not key_aliases else key_aliases[key])\n f.write(\"=\")\n f.write(str(dictionary[key]))\n f.write(\"\\n\")\n\n def __str__(self):\n f = io.StringIO()\n\n f.write(\n \"%s: %s (%s)\\n\"\n % (\n self.__class__.__name__,\n self.properties.get(\"name\", \"None\"),\n self[\"id\"] if \"id\" in self else \"-\",\n )\n )\n\n self._str_localstate(f)\n\n f.write(\"properties:\\n\")\n self._write_kvps(f, self.properties)\n\n f.write(\"annotations:\\n\")\n self._write_kvps(f, self.annotations)\n\n return f.getvalue()\n\n def _str_localstate(self, f): # type: (io.StringIO) -> None\n \"\"\"\n Helper method for writing the string representation of the local state to a StringIO object\n :param f: a StringIO object to which the local state string will be written\n \"\"\"\n self._write_kvps(\n f,\n self.__dict__,\n lambda key: not (\n key in [\"properties\", \"annotations\"] or key.startswith(\"__\")\n ),\n )\n\n def __repr__(self):\n \"\"\"Returns an eval-able representation of the Entity.\"\"\"\n\n f = io.StringIO()\n f.write(self.__class__.__name__)\n f.write(\"(\")\n f.write(\n \", \".join(\n {\n \"%s=%s\"\n % (\n str(key),\n value.__repr__(),\n )\n for key, value in itertools.chain(\n list(\n [\n k_v\n for k_v in self.__dict__.items()\n if not (\n k_v[0] in [\"properties\", \"annotations\"]\n or k_v[0].startswith(\"__\")\n )\n ]\n ),\n self.properties.items(),\n self.annotations.items(),\n )\n }\n )\n )\n f.write(\")\")\n return f.getvalue()\n
"},{"location":"reference/entity/#synapseclient.entity.Entity-functions","title":"Functions","text":""},{"location":"reference/entity/#synapseclient.entity.Entity.create","title":"create(properties=None, annotations=None, local_state=None)
classmethod
","text":"Create an Entity or a subclass given dictionaries of properties and annotations, as might be received from the Synapse Repository.
PARAMETER DESCRIPTION
properties
A map of Synapse properties
DEFAULT: None
annotations
A map of user defined annotations
DEFAULT: None
local_state
Internal use only
DEFAULT: None
Source code in synapseclient/entity.py
@classmethod\ndef create(cls, properties=None, annotations=None, local_state=None):\n \"\"\"\n Create an Entity or a subclass given dictionaries of properties and annotations, as might be received from the\n Synapse Repository.\n\n Arguments:\n properties: A map of Synapse properties\n\n - If 'concreteType' is defined in properties, we create the proper subclass of Entity. If not, give back the\n type whose constructor was called.\n - If passed an Entity as input, create a new Entity using the input entity as a prototype.\n annotations: A map of user defined annotations\n local_state: Internal use only\n \"\"\"\n\n # Create a new Entity using an existing Entity as a prototype\n if isinstance(properties, Entity):\n if annotations is None:\n annotations = {}\n if local_state is None:\n local_state = {}\n annotations.update(properties.annotations)\n local_state.update(properties.local_state())\n properties = properties.properties\n if \"id\" in properties:\n del properties[\"id\"]\n\n if (\n cls == Entity\n and \"concreteType\" in properties\n and properties[\"concreteType\"] in entity_type_to_class\n ):\n cls = entity_type_to_class[properties[\"concreteType\"]]\n return cls(\n properties=properties, annotations=annotations, local_state=local_state\n )\n
"},{"location":"reference/entity/#synapseclient.entity.Entity.local_state","title":"local_state(state=None)
","text":"Set or get the object's internal state, excluding properties, or annotations.
PARAMETER DESCRIPTION
state
A dictionary
DEFAULT: None
dict
The object's internal state, excluding properties, or annotations.
Source code in synapseclient/entity.py
def local_state(self, state=None) -> dict:\n \"\"\"\n Set or get the object's internal state, excluding properties, or annotations.\n\n Arguments:\n state: A dictionary\n\n Returns:\n The object's internal state, excluding properties, or annotations.\n \"\"\"\n if state:\n for key, value in state.items():\n if key not in [\"annotations\", \"properties\"]:\n self.__dict__[key] = value\n result = {}\n for key, value in self.__dict__.items():\n if key not in [\"annotations\", \"properties\"] and not key.startswith(\"__\"):\n result[key] = value\n return result\n
"},{"location":"reference/entity/#synapseclient.entity.Entity.keys","title":"keys()
","text":"Returns a set of property and annotation keys
Source code in synapseclient/entity.py
def keys(self):\n \"\"\"Returns a set of property and annotation keys\"\"\"\n return set(self.properties.keys()) | set(self.annotations.keys())\n
"},{"location":"reference/entity/#synapseclient.entity.Entity.has_key","title":"has_key(key)
","text":"Is the given key a property or annotation?
Source code in synapseclient/entity.py
def has_key(self, key):\n \"\"\"Is the given key a property or annotation?\"\"\"\n\n return key in self.properties or key in self.annotations\n
"},{"location":"reference/entity/#synapseclient.entity.Versionable","title":"synapseclient.entity.Versionable
","text":" Bases: object
An entity for which Synapse will store a version history.
ATTRIBUTE DESCRIPTION
versionNumber
The version number issued to this version on the object.
versionLabel
The version label for this entity
versionComment
The version comment for this entity
versionUrl
versions
Source code in
synapseclient/entity.py
class Versionable(object):\n \"\"\"An entity for which Synapse will store a version history.\n\n Attributes:\n versionNumber: The version number issued to this version on the object.\n versionLabel: \tThe version label for this entity\n versionComment: The version comment for this entity\n versionUrl:\n versions:\n\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.Versionable\"\n _property_keys = [\n \"versionNumber\",\n \"versionLabel\",\n \"versionComment\",\n \"versionUrl\",\n \"versions\",\n ]\n
"},{"location":"reference/evaluation/","title":"Evaluation","text":""},{"location":"reference/evaluation/#synapseclient.evaluation","title":"synapseclient.evaluation
","text":"Evaluations
An Evaluation object represents a collection of Synapse Entities that will be processed in a particular way. This could mean scoring entries in a challenge or executing a processing pipeline.
Imports::
from synapseclient import Evaluation, Submission, SubmissionStatus\n
Evaluations can be retrieved by ID::
evaluation = syn.getEvaluation(1901877)\n
Like entities, evaluations are access controlled via ACLs. The synapseclient.Synapse.getPermissions and synapseclient.Synapse.setPermissions methods work for evaluations::
access = syn.getPermissions(evaluation, user_id)\n
The synapseclient.Synapse.submit method returns a Submission object::
entity = syn.get(synapse_id)\nsubmission = syn.submit(evaluation, entity, name='My Data', team='My Team')\n
The Submission object can then be used to check the status of the submission::
status = syn.getSubmissionStatus(submission)\n
Submission status objects can be updated, usually by changing the status and score fields, and stored back to Synapse using synapseclient.Synapse.store::
status.score = 0.99\nstatus.status = 'SCORED'\nstatus = syn.store(status)\n
See:
synapseclient.Synapse.getEvaluation
synapseclient.Synapse.getEvaluationByContentSource
synapseclient.Synapse.getEvaluationByName
synapseclient.Synapse.submit
synapseclient.Synapse.getSubmissions
synapseclient.Synapse.getSubmission
synapseclient.Synapse.getSubmissionStatus
synapseclient.Synapse.getPermissions
synapseclient.Synapse.setPermissions
"},{"location":"reference/evaluation/#synapseclient.evaluation-classes","title":"Classes","text":""},{"location":"reference/evaluation/#synapseclient.evaluation.Evaluation","title":"Evaluation
","text":" Bases: DictObject
An Evaluation Submission queue, allowing submissions, retrieval and scoring.
:param name: Name of the evaluation
:param description: A short description of the evaluation
:param contentSource: Synapse Project associated with the evaluation
:param submissionReceiptMessage: Message to display to users upon submission
:param submissionInstructionsMessage: Message to display to users detailing acceptable formatting for submissions.
To create an Evaluation (https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/Evaluation.html) and store it in Synapse::
evaluation = syn.store(Evaluation(\n name=\"Q1 Final\",\n description=\"Predict progression of MMSE scores for final scoring\",\n contentSource=\"syn2290704\"))\n
The contentSource field links the evaluation to its synapseclient.entity.Project. (Or, really, any Synapse ID, but sticking to projects is a good idea.)
Evaluations can be retrieved from Synapse by ID::
evaluation = syn.getEvaluation(1901877)\n
...by the Synapse ID of the content source (associated entity)::
evaluation = syn.getEvaluationByContentSource('syn12345')\n
...or by the name of the evaluation::
evaluation = syn.getEvaluationByName('Foo Challenge Question 1')\n
Source code in synapseclient/evaluation.py
class Evaluation(DictObject):\n \"\"\"\n An Evaluation Submission queue, allowing submissions, retrieval and scoring.\n\n :param name: Name of the evaluation\n :param description: A short description of the evaluation\n :param contentSource: Synapse Project associated with the evaluation\n :param submissionReceiptMessage: Message to display to users upon submission\n :param submissionInstructionsMessage: Message to display to users detailing acceptable formatting for submissions.\n\n `To create an Evaluation <https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/Evaluation.html>`_\n and store it in Synapse::\n\n evaluation = syn.store(Evaluation(\n name=\"Q1 Final\",\n description=\"Predict progression of MMSE scores for final scoring\",\n contentSource=\"syn2290704\"))\n\n The contentSource field links the evaluation to its :py:class:`synapseclient.entity.Project`.\n (Or, really, any synapse ID, but sticking to projects is a good idea.)\n\n `Evaluations <https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/Evaluation.html>`_ can be retrieved\n from Synapse by ID::\n\n evaluation = syn.getEvaluation(1901877)\n\n ...by the Synapse ID of the content source (associated entity)::\n\n evaluation = syn.getEvaluationByContentSource('syn12345')\n\n ...or by the name of the evaluation::\n\n evaluation = syn.getEvaluationByName('Foo Challenge Question 1')\n\n \"\"\"\n\n @classmethod\n def getByNameURI(cls, name: str):\n quoted_name = urllib_urlparse.quote(name)\n return f\"/evaluation/name/{quoted_name}\"\n\n @classmethod\n def getURI(cls, id: Union[str, int]):\n return f\"/evaluation/{id}\"\n\n def __init__(self, **kwargs):\n kwargs[\"contentSource\"] = kwargs.get(\"contentSource\", \"\")\n if not kwargs[\"contentSource\"].startswith(\n \"syn\"\n ): # Verify that synapse Id given\n raise ValueError(\n 'The \"contentSource\" parameter must be specified as a Synapse Entity when creating an'\n \" Evaluation\"\n )\n super(Evaluation, self).__init__(kwargs)\n\n def postURI(self):\n return \"/evaluation\"\n\n def putURI(self):\n return f\"/evaluation/{self.id}\"\n\n def deleteURI(self):\n return f\"/evaluation/{self.id}\"\n\n def getACLURI(self):\n return f\"/evaluation/{self.id}/acl\"\n\n def putACLURI(self):\n return \"/evaluation/acl\"\n
"},{"location":"reference/evaluation/#synapseclient.evaluation.Submission","title":"Submission
","text":" Bases: DictObject
Builds a Synapse submission object.
:param name: Name of submission
:param entityId: Synapse ID of the Entity to submit
:param evaluationId: ID of the Evaluation to which the Entity is to be submitted
:param versionNumber: Version number of the submitted Entity
:param submitterAlias: A pseudonym or team name for a challenge entry
Source code in synapseclient/evaluation.py
class Submission(DictObject):\n \"\"\"\n Builds an Synapse submission object.\n\n :param name: Name of submission\n :param entityId: Synapse ID of the Entity to submit\n :param evaluationId: ID of the Evaluation to which the Entity is to be submitted\n :param versionNumber: Version number of the submitted Entity\n :param submitterAlias: A pseudonym or team name for a challenge entry\n \"\"\"\n\n @classmethod\n def getURI(cls, id: Union[str, int]):\n return f\"/evaluation/submission/{id}\"\n\n def __init__(self, **kwargs):\n if not (\n \"evaluationId\" in kwargs\n and \"entityId\" in kwargs\n and \"versionNumber\" in kwargs\n ):\n raise KeyError\n\n super().__init__(kwargs)\n\n def postURI(self):\n return f\"/evaluation/submission?etag={self.etag}\"\n\n def putURI(self):\n return f\"/evaluation/submission/{self.id}\"\n\n def deleteURI(self):\n return f\"/evaluation/submission/{self.id}\"\n
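Submissions are normally created via synapseclient.Synapse.submit, but the object can also be built directly; a sketch with illustrative IDs::
from synapseclient.evaluation import Submission\n\nsubmission = Submission(\n    name=\"My Entry\",\n    entityId=\"syn123\",\n    evaluationId=\"9614543\",\n    versionNumber=1,\n    submitterAlias=\"Team Example\",\n)\n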
"},{"location":"reference/evaluation/#synapseclient.evaluation.SubmissionStatus","title":"SubmissionStatus
","text":" Bases: DictObject
Builds a Synapse submission status object. https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/SubmissionStatus.html
:param id: Unique immutable Synapse Id of the Submission
:param status: Status can be one of https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/SubmissionStatusEnum.html.
:param submissionAnnotations: synapseclient.Annotations to store annotations of submission
:param canCancel: Can this submission be cancelled?
:param cancelRequested: Has user requested to cancel this submission?
Source code in synapseclient/evaluation.py
class SubmissionStatus(DictObject):\n \"\"\"\n Builds an Synapse submission status object.\n https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/SubmissionStatus.html\n\n :param id: Unique immutable Synapse Id of the Submission\n :param status: Status can be one of\n https://rest-docs.synapse.org/rest/org/sagebionetworks/evaluation/model/SubmissionStatusEnum.html.\n :param submissionAnnotations: synapseclient.Annotations to store annotations of submission\n :param canCancel: Can this submission be cancelled?\n :param cancelRequested: Has user requested to cancel this submission?\n \"\"\"\n\n @classmethod\n def getURI(cls, id: Union[str, int]):\n return f\"/evaluation/submission/{id}/status\"\n\n def __init__(self, id: Union[str, int], etag: str, **kwargs):\n annotations = kwargs.pop(\"submissionAnnotations\", {})\n # If it is synapse annotations, turn into a format\n # that can be worked with otherwise, create\n # synapseclient.Annotations\n submission_annotations = _convert_to_annotation_cls(\n id=id, etag=etag, values=annotations\n )\n # In Python 3, the super(SubmissionStatus, self) call is equivalent to the parameterless super()\n super().__init__(\n id=id, etag=etag, submissionAnnotations=submission_annotations, **kwargs\n )\n\n # def postURI(self):\n # return '/evaluation/submission/%s/status' % self.id\n\n def putURI(self):\n return f\"/evaluation/submission/{self.id}/status\"\n\n # def deleteURI(self):\n # return '/evaluation/submission/%s/status' % self.id\n\n def json(self, ensure_ascii: bool = True):\n \"\"\"Overloaded json function, turning submissionAnnotations into\n synapse style annotations\"\"\"\n\n json_dict = self\n # If not synapse annotations, turn them into synapseclient.Annotations\n # must have id and etag to turn into synapse annotations\n if not is_synapse_annotations(self.submissionAnnotations):\n json_dict = self.copy()\n\n annotations = _convert_to_annotation_cls(\n id=self.id, etag=self.etag, values=self.submissionAnnotations\n )\n # Turn into synapse annotation\n json_dict[\"submissionAnnotations\"] = to_synapse_annotations(annotations)\n return json.dumps(\n json_dict, sort_keys=True, indent=2, ensure_ascii=ensure_ascii\n )\n
"},{"location":"reference/evaluation/#synapseclient.evaluation.SubmissionStatus-functions","title":"Functions","text":""},{"location":"reference/evaluation/#synapseclient.evaluation.SubmissionStatus.json","title":"json(ensure_ascii=True)
","text":"Overloaded json function, turning submissionAnnotations into synapse style annotations
Source code in synapseclient/evaluation.py
def json(self, ensure_ascii: bool = True):\n \"\"\"Overloaded json function, turning submissionAnnotations into\n synapse style annotations\"\"\"\n\n json_dict = self\n # If not synapse annotations, turn them into synapseclient.Annotations\n # must have id and etag to turn into synapse annotations\n if not is_synapse_annotations(self.submissionAnnotations):\n json_dict = self.copy()\n\n annotations = _convert_to_annotation_cls(\n id=self.id, etag=self.etag, values=self.submissionAnnotations\n )\n # Turn into synapse annotation\n json_dict[\"submissionAnnotations\"] = to_synapse_annotations(annotations)\n return json.dumps(\n json_dict, sort_keys=True, indent=2, ensure_ascii=ensure_ascii\n )\n
"},{"location":"reference/evaluation/#synapseclient.evaluation-functions","title":"Functions","text":""},{"location":"reference/exceptions/","title":"Exceptions","text":""},{"location":"reference/exceptions/#synapseclient.core.exceptions","title":"synapseclient.core.exceptions
","text":""},{"location":"reference/exceptions/#synapseclient.core.exceptions-classes","title":"Classes","text":""},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseError","title":"SynapseError
","text":" Bases: Exception
Generic exception thrown by the client.
Source code in synapseclient/core/exceptions.py
class SynapseError(Exception):\n \"\"\"Generic exception thrown by the client.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseMd5MismatchError","title":"SynapseMd5MismatchError
","text":" Bases: SynapseError
, IOError
Error raised when MD5 computed for a download file fails to match the MD5 of its file handle.
Source code in synapseclient/core/exceptions.py
class SynapseMd5MismatchError(SynapseError, IOError):\n \"\"\"Error raised when MD5 computed for a download file fails to match the MD5 of its file handle.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseFileNotFoundError","title":"SynapseFileNotFoundError
","text":" Bases: SynapseError
Error thrown when a local file is not found in Synapse.
Source code in synapseclient/core/exceptions.py
class SynapseFileNotFoundError(SynapseError):\n \"\"\"Error thrown when a local file is not found in Synapse.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseTimeoutError","title":"SynapseTimeoutError
","text":" Bases: SynapseError
Timed out waiting for response from Synapse.
Source code in synapseclient/core/exceptions.py
class SynapseTimeoutError(SynapseError):\n \"\"\"Timed out waiting for response from Synapse.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseAuthenticationError","title":"SynapseAuthenticationError
","text":" Bases: SynapseError
Unauthorized access.
Source code in synapseclient/core/exceptions.py
class SynapseAuthenticationError(SynapseError):\n \"\"\"Unauthorized access.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseNoCredentialsError","title":"SynapseNoCredentialsError
","text":" Bases: SynapseAuthenticationError
No credentials for authentication
Source code in synapseclient/core/exceptions.py
class SynapseNoCredentialsError(SynapseAuthenticationError):\n \"\"\"No credentials for authentication\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseFileCacheError","title":"SynapseFileCacheError
","text":" Bases: SynapseError
Error related to local file storage.
Source code in synapseclient/core/exceptions.py
class SynapseFileCacheError(SynapseError):\n \"\"\"Error related to local file storage.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseMalformedEntityError","title":"SynapseMalformedEntityError
","text":" Bases: SynapseError
Unexpected structure of Entities.
Source code in synapseclient/core/exceptions.py
class SynapseMalformedEntityError(SynapseError):\n \"\"\"Unexpected structure of Entities.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseUnmetAccessRestrictions","title":"SynapseUnmetAccessRestrictions
","text":" Bases: SynapseError
Request cannot be completed due to unmet access restrictions.
Source code in synapseclient/core/exceptions.py
class SynapseUnmetAccessRestrictions(SynapseError):\n \"\"\"Request cannot be completed due to unmet access restrictions.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseProvenanceError","title":"SynapseProvenanceError
","text":" Bases: SynapseError
Incorrect usage of provenance objects.
Source code in synapseclient/core/exceptions.py
class SynapseProvenanceError(SynapseError):\n \"\"\"Incorrect usage of provenance objects.\"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseHTTPError","title":"SynapseHTTPError
","text":" Bases: SynapseError
, HTTPError
Wraps recognized HTTP errors. See HTTPError (http://docs.python-requests.org/en/latest/api/?highlight=exceptions#requests.exceptions.HTTPError).
synapseclient/core/exceptions.py
class SynapseHTTPError(SynapseError, requests.exceptions.HTTPError):\n \"\"\"Wraps recognized HTTP errors. See\n `HTTPError <http://docs.python-requests.org/en/latest/api/?highlight=exceptions#requests.exceptions.HTTPError>`_\n \"\"\"\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseUploadAbortedException","title":"SynapseUploadAbortedException
","text":" Bases: SynapseError
Raised when a worker thread detects the upload was aborted and stops further processing.
Source code in synapseclient/core/exceptions.py
class SynapseUploadAbortedException(SynapseError):\n \"\"\"Raised when a worker thread detects the upload was\n aborted and stops further processing.\"\"\"\n\n pass\n
"},{"location":"reference/exceptions/#synapseclient.core.exceptions.SynapseUploadFailedException","title":"SynapseUploadFailedException
","text":" Bases: SynapseError
Raised when an upload failed. Should be chained to a cause Exception
Source code in synapseclient/core/exceptions.py
class SynapseUploadFailedException(SynapseError):\n \"\"\"Raised when an upload failed. Should be chained to a cause Exception\"\"\"\n\n pass\n
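For example, a sketch of distinguishing an HTTP failure from other client errors (syn123 is an illustrative ID)::
import synapseclient\nfrom synapseclient.core.exceptions import SynapseError, SynapseHTTPError\n\nsyn = synapseclient.login()\ntry:\n    entity = syn.get(\"syn123\")\nexcept SynapseHTTPError as err:\n    # subclasses requests.exceptions.HTTPError, so the underlying\n    # HTTP response is available when one was attached to the error\n    print(err.response.status_code)\nexcept SynapseError as err:\n    print(\"Other client error:\", err)\n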
"},{"location":"reference/file/","title":"File","text":""},{"location":"reference/file/#synapseclient.entity.File","title":"synapseclient.entity.File
","text":" Bases: Entity
, Versionable
Represents a file in Synapse.
When a File object is stored, the associated local file or its URL will be stored in Synapse. A File must have a path (or URL) and a parent. By default, the name of the file in Synapse matches the filename, but by specifying the name
attribute, the File Entity name can be different.
A Synapse File Entity has a name separate from the name of the actual file it represents. When a file is uploaded to Synapse, its filename is fixed, even though the name of the entity can be changed at any time. Synapse provides a way to change this filename and the content-type of the file for future downloads by creating a new version of the file with a modified copy of itself. This can be done with the synapseutils.copy_functions.changeFileMetaData function.
import synapseutils\ne = syn.get(synid)\nprint(os.path.basename(e.path)) ## prints, e.g., \"my_file.txt\"\ne = synapseutils.changeFileMetaData(syn, e, \"my_newname_file.txt\")\n
Setting fileNameOverride will not change the name of a copy of the file that's already downloaded into your local cache. Either rename the local copy manually or remove it from the cache and re-download::
syn.cache.remove(e.dataFileHandleId)\ne = syn.get(e)\nprint(os.path.basename(e.path)) ## prints \"my_newname_file.txt\"\n
PARAMETER DESCRIPTION path
Location to be represented by this File
DEFAULT: None
name
Name of the file in Synapse, not to be confused with the name within the path
parent
Project or Folder where this File is stored
DEFAULT: None
synapseStore
Whether the File should be uploaded or if only the path should be stored when synapseclient.Synapse.store is called on the File object.
DEFAULT: True
contentType
Manually specify the Content-Type header, for example \"image/png\" or \"application/json; charset=UTF-8\"
dataFileHandleId
Defining an existing dataFileHandleId will use the existing dataFileHandleId. The creator of the file must also be the owner of the dataFileHandleId to have permission to store the file.
properties
A map of Synapse properties
DEFAULT: None
annotations
A map of user defined annotations
DEFAULT: None
local_state
Internal use only
DEFAULT: None
Creating and storing a File
# The Entity name is derived from the path and is 'data.xyz'\ndata = File('/path/to/file/data.xyz', parent=folder)\ndata = syn.store(data)\n
Setting the name of the file in Synapse to 'my entity'
# The Entity name is specified as 'my entity'\ndata = File('/path/to/file/data.xyz', name=\"my entity\", parent=folder)\ndata = syn.store(data)\n
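Storing only a reference to the file, without uploading its contents (a sketch based on the synapseStore parameter above; the path is a placeholder)
# Only the file's location is recorded in Synapse; the contents are not uploaded\ndata = File('/path/to/file/data.xyz', parent=folder, synapseStore=False)\ndata = syn.store(data)\n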
Source code in synapseclient/entity.py
class File(Entity, Versionable):\n \"\"\"\n Represents a file in Synapse.\n\n When a File object is stored, the associated local file or its URL will be stored in Synapse. A File must have a\n path (or URL) and a parent. By default, the name of the file in Synapse matches the filename, but by specifying\n the `name` attribute, the File Entity name can be different.\n\n ## Changing File Names\n\n A Synapse File Entity has a name separate from the name of the actual file it represents. When a file is uploaded to\n Synapse, its filename is fixed, even though the name of the entity can be changed at any time. Synapse provides a way\n to change this filename and the content-type of the file for future downloads by creating a new version of the file\n with a modified copy of itself. This can be done with the synapseutils.copy_functions.changeFileMetaData function.\n\n import synapseutils\n e = syn.get(synid)\n print(os.path.basename(e.path)) ## prints, e.g., \"my_file.txt\"\n e = synapseutils.changeFileMetaData(syn, e, \"my_newname_file.txt\")\n\n Setting *fileNameOverride* will **not** change the name of a copy of the\n file that's already downloaded into your local cache. Either rename the\n local copy manually or remove it from the cache and re-download.:\n\n syn.cache.remove(e.dataFileHandleId)\n e = syn.get(e)\n print(os.path.basename(e.path)) ## prints \"my_newname_file.txt\"\n\n Parameters:\n path: Location to be represented by this File\n name: Name of the file in Synapse, not to be confused with the name within the path\n parent: Project or Folder where this File is stored\n synapseStore: Whether the File should be uploaded or if only the path should be stored when\n [synapseclient.Synapse.store][] is called on the File object.\n contentType: Manually specify Content-type header, for example \"application/png\" or\n \"application/json; charset=UTF-8\"\n dataFileHandleId: Defining an existing dataFileHandleId will use the existing dataFileHandleId\n The creator of the file must also be the owner of the dataFileHandleId to have\n permission to store the file.\n properties: A map of Synapse properties\n annotations: A map of user defined annotations\n local_state: Internal use only\n\n Example: Creating instances\n Creating and storing a File\n\n # The Entity name is derived from the path and is 'data.xyz'\n data = File('/path/to/file/data.xyz', parent=folder)\n data = syn.store(data)\n\n Setting the name of the file in Synapse to 'my entity'\n\n # The Entity name is specified as 'my entity'\n data = File('/path/to/file/data.xyz', name=\"my entity\", parent=folder)\n data = syn.store(data)\n \"\"\"\n\n # Note: externalURL technically should not be in the keys since it's only a field/member variable of\n # ExternalFileHandle, but for backwards compatibility it's included\n _file_handle_keys = [\n \"createdOn\",\n \"id\",\n \"concreteType\",\n \"contentSize\",\n \"createdBy\",\n \"etag\",\n \"fileName\",\n \"contentType\",\n \"contentMd5\",\n \"storageLocationId\",\n \"externalURL\",\n ]\n # Used for backwards compatability. The keys found below used to located in the entity's local_state\n # (i.e. 
__dict__).\n _file_handle_aliases = {\n \"md5\": \"contentMd5\",\n \"externalURL\": \"externalURL\",\n \"fileSize\": \"contentSize\",\n \"contentType\": \"contentType\",\n }\n _file_handle_aliases_inverse = {v: k for k, v in _file_handle_aliases.items()}\n\n _property_keys = (\n Entity._property_keys + Versionable._property_keys + [\"dataFileHandleId\"]\n )\n _local_keys = Entity._local_keys + [\n \"path\",\n \"cacheDir\",\n \"files\",\n \"synapseStore\",\n \"_file_handle\",\n ]\n _synapse_entity_type = \"org.sagebionetworks.repo.model.FileEntity\"\n\n # TODO: File(path=\"/path/to/file\", synapseStore=True, parentId=\"syn101\")\n def __init__(\n self,\n path=None,\n parent=None,\n synapseStore=True,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if path and \"name\" not in kwargs:\n kwargs[\"name\"] = utils.guess_file_name(path)\n self.__dict__[\"path\"] = path\n if path:\n cacheDir, basename = os.path.split(path)\n self.__dict__[\"cacheDir\"] = cacheDir\n self.__dict__[\"files\"] = [basename]\n else:\n self.__dict__[\"cacheDir\"] = None\n self.__dict__[\"files\"] = []\n self.__dict__[\"synapseStore\"] = synapseStore\n\n # pop the _file_handle from local properties because it is handled differently from other local_state\n self._update_file_handle(\n local_state.pop(\"_file_handle\", None) if (local_state is not None) else None\n )\n\n super(File, self).__init__(\n concreteType=File._synapse_entity_type,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n\n def _update_file_handle(self, file_handle_update_dict=None):\n \"\"\"\n Sets the file handle\n\n Should not need to be called by users\n \"\"\"\n\n # replace the file handle dict\n fh_dict = (\n DictObject(file_handle_update_dict)\n if file_handle_update_dict is not None\n else DictObject()\n )\n self.__dict__[\"_file_handle\"] = fh_dict\n\n if (\n file_handle_update_dict is not None\n and file_handle_update_dict.get(\"concreteType\")\n == \"org.sagebionetworks.repo.model.file.ExternalFileHandle\"\n and urllib_parse.urlparse(file_handle_update_dict.get(\"externalURL\")).scheme\n != \"sftp\"\n ):\n self.__dict__[\"synapseStore\"] = False\n\n # initialize all nonexistent keys to have value of None\n for key in self.__class__._file_handle_keys:\n if key not in fh_dict:\n fh_dict[key] = None\n\n def __setitem__(self, key, value):\n if key == \"_file_handle\":\n self._update_file_handle(value)\n elif key in self.__class__._file_handle_aliases:\n self._file_handle[self.__class__._file_handle_aliases[key]] = value\n else:\n\n def expand_and_convert_to_URL(path):\n return utils.as_url(os.path.expandvars(os.path.expanduser(path)))\n\n # hacky solution to allowing immediate switching into a ExternalFileHandle pointing to the current path\n # yes, there is boolean zen but I feel like it is easier to read/understand this way\n if (\n key == \"synapseStore\"\n and value is False\n and self[\"synapseStore\"] is True\n and utils.caller_module_name(inspect.currentframe()) != \"client\"\n ):\n self[\"externalURL\"] = expand_and_convert_to_URL(self[\"path\"])\n\n # hacky solution because we historically allowed modifying 'path' to indicate wanting to change to a new\n # ExternalFileHandle\n # don't change exernalURL if it's just the synapseclient setting metadata after a function call such as\n # syn.get()\n if (\n key == \"path\"\n and not self[\"synapseStore\"]\n and utils.caller_module_name(inspect.currentframe()) != \"client\"\n ):\n self[\"externalURL\"] = 
expand_and_convert_to_URL(value)\n self[\"contentMd5\"] = None\n self[\"contentSize\"] = None\n super(File, self).__setitem__(key, value)\n\n def __getitem__(self, item):\n if item in self.__class__._file_handle_aliases:\n return self._file_handle[self.__class__._file_handle_aliases[item]]\n else:\n return super(File, self).__getitem__(item)\n\n def _str_localstate(self, f):\n self._write_kvps(\n f,\n self._file_handle,\n lambda key: key\n in [\"externalURL\", \"contentMd5\", \"contentSize\", \"contentType\"],\n self._file_handle_aliases_inverse,\n )\n self._write_kvps(\n f,\n self.__dict__,\n lambda key: not (\n key in [\"properties\", \"annotations\", \"_file_handle\"]\n or key.startswith(\"__\")\n ),\n )\n
"},{"location":"reference/folder/","title":"Folder","text":""},{"location":"reference/folder/#synapseclient.entity.Folder","title":"synapseclient.entity.Folder
","text":" Bases: Entity
Represents a folder in Synapse.
Folders must have a name and a parent and can optionally have annotations.
ATTRIBUTE DESCRIPTION name
The name of the folder
parent
The parent project or folder
properties
A map of Synapse properties
annotations
A map of user defined annotations
local_state
Internal use only
Using this class
Creating an instance and storing the folder
folder = Folder(name='my data', parent=project)\nfolder = syn.store(folder)\n
Source code in synapseclient/entity.py
class Folder(Entity):\n \"\"\"\n Represents a folder in Synapse.\n\n Folders must have a name and a parent and can optionally have annotations.\n\n Attributes:\n name: The name of the folder\n parent: The parent project or folder\n properties: A map of Synapse properties\n annotations: A map of user defined annotations\n local_state: Internal use only\n\n Example: Using this class\n Creating an instance and storing the folder\n\n folder = Folder(name='my data', parent=project)\n folder = syn.store(folder)\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.Folder\"\n\n def __init__(\n self,\n name=None,\n parent=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if name:\n kwargs[\"name\"] = name\n super(Folder, self).__init__(\n concreteType=Folder._synapse_entity_type,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n
"},{"location":"reference/json_schema/","title":"JSON Schema","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema","title":"synapseclient.services.json_schema
","text":"JSON Schema
Warning: This is a beta implementation and is subject to change. Use at your own risk.
"},{"location":"reference/json_schema/#synapseclient.services.json_schema-classes","title":"Classes","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion","title":"JsonSchemaVersion
","text":"Json schema version response object
:param organization: JSON schema organization. :type organization: JsonSchemaOrganization :param name: Name of the JSON schema. :type name: str :param semantic_version: Version of JSON schema. Defaults to None. :type semantic_version: str, optional
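A hedged sketch of registering a schema version through the JsonSchemaService factories; the organization and schema names are placeholders, and the organization is assumed to already exist in Synapse
import synapseclient\nfrom synapseclient.services.json_schema import JsonSchemaService\n\nsyn = synapseclient.login()\njs = JsonSchemaService(syn)\norg = js.JsonSchemaOrganization('my.organization')  # placeholder name\nversion = js.JsonSchemaVersion(org, 'my.schema', semantic_version='0.1.0')\nversion.create({'type': 'object', 'properties': {'species': {'type': 'string'}}})\n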
Source code insynapseclient/services/json_schema.py
class JsonSchemaVersion:\n \"\"\"Json schema version response object\n\n :param organization: JSON schema organization.\n :type organization: JsonSchemaOrganization\n :param name: Name of the JSON schema.\n :type name: str\n :param semantic_version: Version of JSON schema. Defaults to None.\n :type semantic_version: str, optional\n \"\"\"\n\n def __init__(\n self,\n organization: JsonSchemaOrganization,\n name: str,\n semantic_version: str = None,\n ) -> None:\n self.organization = organization\n self.name = name\n self.semantic_version = semantic_version\n self.uri = None\n self.version_id = None\n self.created_on = None\n self.created_by = None\n self.json_sha256_hex = None\n self.set_service(self.organization.service)\n\n def __repr__(self):\n string = (\n f\"JsonSchemaVersion(org={self.organization.name!r}, name={self.name!r}, \"\n f\"version={self.semantic_version!r})\"\n )\n return string\n\n def set_service(self, service):\n self.service = service\n\n @property\n def raw(self):\n self.must_get()\n return self._raw\n\n def parse_response(self, response):\n self._raw = response\n self.uri = response[\"$id\"]\n self.version_id = response[\"versionId\"]\n self.created_on = response[\"createdOn\"]\n self.created_by = response[\"createdBy\"]\n self.json_sha256_hex = response[\"jsonSHA256Hex\"]\n\n @classmethod\n def from_response(cls, organization, response):\n semver = response.get(\"semanticVersion\")\n version = cls(organization, response[\"schemaName\"], semver)\n version.parse_response(response)\n return version\n\n def get(self):\n \"\"\"Get the JSON Schema Version\"\"\"\n if self.uri is not None:\n return True\n json_schema = self.organization.get_json_schema(self.name)\n if json_schema is None:\n return False\n raw_version = json_schema.get_version(self.semantic_version, raw=True)\n if raw_version is None:\n return False\n self.parse_response(raw_version)\n return True\n\n def must_get(self):\n already_exists = self.get()\n assert already_exists, (\n \"This operation requires that the JSON Schema name is created first.\"\n \"Call the 'create_version()' method to trigger the creation.\"\n )\n\n def create(\n self,\n json_schema_body: dict,\n dry_run: bool = False,\n ):\n \"\"\"Create JSON schema version\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param dry_run: Do not store to Synapse. 
Defaults to False.\n :type dry_run: bool, optional\n :returns: JSON Schema\n \"\"\"\n uri = f\"{self.organization.name}-{self.name}\"\n if self.semantic_version:\n uri = f\"{uri}-{self.semantic_version}\"\n json_schema_body[\"$id\"] = uri\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n self.parse_response(raw_version)\n return self\n\n def delete(self):\n \"\"\"Delete the JSON schema version\"\"\"\n self.must_get()\n response = self.service.delete_json_schema(self.uri)\n return response\n\n @property\n def body(self):\n self.must_get()\n json_schema_body = self.service.get_json_schema_body(self.uri)\n return json_schema_body\n\n def expand(self):\n \"\"\"Validate entities with schema\"\"\"\n self.must_get()\n response = self.service.json_schema_validation(self.uri)\n json_schema_body = response[\"validationSchema\"]\n return json_schema_body\n\n def bind_to_object(self, synapse_id: str):\n \"\"\"Bind schema to an entity\n\n :param synapse_id: Synapse Id to bind json schema to.\n :type synapse_id: str\n \"\"\"\n self.must_get()\n response = self.service.bind_json_schema_to_entity(synapse_id, self.uri)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion-functions","title":"Functions","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion.get","title":"get()
","text":"Get the JSON Schema Version
Source code insynapseclient/services/json_schema.py
def get(self):\n \"\"\"Get the JSON Schema Version\"\"\"\n if self.uri is not None:\n return True\n json_schema = self.organization.get_json_schema(self.name)\n if json_schema is None:\n return False\n raw_version = json_schema.get_version(self.semantic_version, raw=True)\n if raw_version is None:\n return False\n self.parse_response(raw_version)\n return True\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion.create","title":"create(json_schema_body, dry_run=False)
","text":"Create JSON schema version
PARAMETER DESCRIPTION json_schema_body
JSON schema body.
TYPE: dict
dry_run
Do not store to Synapse.
TYPE: bool DEFAULT: False
RETURNS DESCRIPTION JSON Schema
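A short hedged example of a dry run, reusing the version object from the class-level sketch above
response = version.create(\n    {'type': 'object', 'properties': {'species': {'type': 'string'}}},\n    dry_run=True,  # validated by the service but not stored\n)\n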
Source code insynapseclient/services/json_schema.py
def create(\n self,\n json_schema_body: dict,\n dry_run: bool = False,\n):\n \"\"\"Create JSON schema version\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param dry_run: Do not store to Synapse. Defaults to False.\n :type dry_run: bool, optional\n :returns: JSON Schema\n \"\"\"\n uri = f\"{self.organization.name}-{self.name}\"\n if self.semantic_version:\n uri = f\"{uri}-{self.semantic_version}\"\n json_schema_body[\"$id\"] = uri\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n self.parse_response(raw_version)\n return self\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion.delete","title":"delete()
","text":"Delete the JSON schema version
Source code insynapseclient/services/json_schema.py
def delete(self):\n \"\"\"Delete the JSON schema version\"\"\"\n self.must_get()\n response = self.service.delete_json_schema(self.uri)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion.expand","title":"expand()
","text":"Validate entities with schema
Source code insynapseclient/services/json_schema.py
def expand(self):\n \"\"\"Validate entities with schema\"\"\"\n self.must_get()\n response = self.service.json_schema_validation(self.uri)\n json_schema_body = response[\"validationSchema\"]\n return json_schema_body\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaVersion.bind_to_object","title":"bind_to_object(synapse_id)
","text":"Bind schema to an entity
PARAMETER DESCRIPTION synapse_id
Synapse Id to bind the JSON schema to.
TYPE: str
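A hedged one-liner, reusing the registered version from the class-level sketch above; 'syn123' is a placeholder entity Id
version.bind_to_object('syn123')  # placeholder Synapse Id\n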
Source code insynapseclient/services/json_schema.py
def bind_to_object(self, synapse_id: str):\n \"\"\"Bind schema to an entity\n\n :param synapse_id: Synapse Id to bind json schema to.\n :type synapse_id: str\n \"\"\"\n self.must_get()\n response = self.service.bind_json_schema_to_entity(synapse_id, self.uri)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchema","title":"JsonSchema
","text":"Json schema response object
:param organization: JSON schema organization. :type organization: JsonSchemaOrganization :param name: Name of the JSON schema. :type name: str
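A hedged sketch of looking up a schema and walking its versions, reusing the org handle from the JsonSchemaVersion sketch above; names are placeholders
schema = org.get_json_schema('my.schema')\nfor version in schema.list_versions():\n    print(version)\n# Fetch one specific semantic version\nv010 = schema.get_version('0.1.0')\n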
Source code insynapseclient/services/json_schema.py
class JsonSchema:\n \"\"\"Json schema response object\n\n :param organization: JSON schema organization.\n :type organization: JsonSchemaOrganization\n :param name: Name of the JSON schema.\n :type name: str\n \"\"\"\n\n def __init__(self, organization: JsonSchemaOrganization, name: str) -> None:\n self.organization = organization\n self.name = name\n self.id = None\n self.created_on = None\n self.created_by = None\n self._versions = dict()\n self.set_service(self.organization.service)\n\n def __repr__(self):\n string = f\"JsonSchema(org={self.organization.name!r}, name={self.name!r})\"\n return string\n\n def set_service(self, service):\n self.service = service\n\n @property\n def raw(self):\n self.must_get()\n return self._raw\n\n def parse_response(self, response):\n self._raw = response\n self.id = response[\"schemaId\"]\n self.created_on = response[\"createdOn\"]\n self.created_by = response[\"createdBy\"]\n\n @classmethod\n def from_response(cls, organization, response):\n json_schema = cls(organization, response[\"schemaName\"])\n json_schema.parse_response(response)\n return json_schema\n\n def get(self):\n \"\"\"Get Json schema\"\"\"\n if self.id is not None:\n return True\n response = self.organization.get_json_schema(self.name, raw=True)\n if response is None:\n return False\n self.parse_response(response)\n return True\n\n def must_get(self):\n already_exists = self.get()\n assert already_exists, (\n \"This operation requires that the JSON Schema name is created first.\"\n \"Call the 'create_version()' method to trigger the creation.\"\n )\n\n def list_versions(self):\n \"\"\"List versions of the json schema\"\"\"\n self.must_get()\n self._versions = dict()\n response = self.service.list_json_schema_versions(\n self.organization.name, self.name\n )\n for raw_version in response:\n semver = raw_version.get(\"semanticVersion\")\n version = JsonSchemaVersion.from_response(self.organization, raw_version)\n # Handle that multiple versions can have None/null as their semver\n if semver is None:\n update_none_version = (\n # Is this the first null version?\n semver not in self._versions\n # Or is the version ID higher (i.e., more recent)?\n or version.version_id > self._versions[semver].version_id\n )\n if update_none_version:\n self._versions[semver] = (raw_version, version)\n else:\n self._versions[semver] = (raw_version, version)\n # Skip versions w/o semver until the end\n if semver is not None:\n yield version\n # Return version w/o semver now (if applicable) to ensure latest is returned\n if None in self._versions:\n yield self._versions[None]\n\n def get_version(self, semantic_version: str = None, raw: bool = False):\n self.must_get()\n if semantic_version not in self._versions:\n list(self.list_versions())\n raw_version, version = self._versions.get(semantic_version, [None, None])\n return raw_version if raw else version\n\n def create(\n self,\n json_schema_body: dict,\n semantic_version: str = None,\n dry_run: bool = False,\n ):\n \"\"\"Create JSON schema\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param semantic_version: Version of JSON schema. Defaults to None.\n :type semantic_version: str, optional\n :param dry_run: Do not store to Synapse. 
Defaults to False.\n :type dry_run: bool, optional\n \"\"\"\n uri = f\"{self.organization.name}-{self.name}\"\n if semantic_version:\n uri = f\"{uri}-{semantic_version}\"\n json_schema_body[\"$id\"] = uri\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n version = JsonSchemaVersion.from_response(self.organization, raw_version)\n self._versions[semantic_version] = (raw_version, version)\n return version\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchema-functions","title":"Functions","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchema.get","title":"get()
","text":"Get Json schema
Source code insynapseclient/services/json_schema.py
def get(self):\n \"\"\"Get Json schema\"\"\"\n if self.id is not None:\n return True\n response = self.organization.get_json_schema(self.name, raw=True)\n if response is None:\n return False\n self.parse_response(response)\n return True\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchema.list_versions","title":"list_versions()
","text":"List versions of the json schema
Source code insynapseclient/services/json_schema.py
def list_versions(self):\n \"\"\"List versions of the json schema\"\"\"\n self.must_get()\n self._versions = dict()\n response = self.service.list_json_schema_versions(\n self.organization.name, self.name\n )\n for raw_version in response:\n semver = raw_version.get(\"semanticVersion\")\n version = JsonSchemaVersion.from_response(self.organization, raw_version)\n # Handle that multiple versions can have None/null as their semver\n if semver is None:\n update_none_version = (\n # Is this the first null version?\n semver not in self._versions\n # Or is the version ID higher (i.e., more recent)?\n or version.version_id > self._versions[semver].version_id\n )\n if update_none_version:\n self._versions[semver] = (raw_version, version)\n else:\n self._versions[semver] = (raw_version, version)\n # Skip versions w/o semver until the end\n if semver is not None:\n yield version\n # Return version w/o semver now (if applicable) to ensure latest is returned\n if None in self._versions:\n yield self._versions[None]\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchema.create","title":"create(json_schema_body, semantic_version=None, dry_run=False)
","text":"Create JSON schema
PARAMETER DESCRIPTION json_schema_body
JSON schema body.
TYPE: dict
semantic_version
Version of the JSON schema.
TYPE: str DEFAULT: None
dry_run
Do not store to Synapse.
TYPE: bool DEFAULT: False
Source code insynapseclient/services/json_schema.py
def create(\n self,\n json_schema_body: dict,\n semantic_version: str = None,\n dry_run: bool = False,\n):\n \"\"\"Create JSON schema\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param semantic_version: Version of JSON schema. Defaults to None.\n :type semantic_version: str, optional\n :param dry_run: Do not store to Synapse. Defaults to False.\n :type dry_run: bool, optional\n \"\"\"\n uri = f\"{self.organization.name}-{self.name}\"\n if semantic_version:\n uri = f\"{uri}-{semantic_version}\"\n json_schema_body[\"$id\"] = uri\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n version = JsonSchemaVersion.from_response(self.organization, raw_version)\n self._versions[semantic_version] = (raw_version, version)\n return version\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization","title":"JsonSchemaOrganization
","text":"Json Schema Organization
:param name: Name of JSON schema organization :type name: str
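A hedged sketch, reusing the logged-in js service handle from the JsonSchemaVersion sketch above so the instance is wired to the service; the name is a placeholder
org = js.JsonSchemaOrganization('my.organization')  # placeholder name\norg.create()  # no-op if the organization already exists\nfor schema in org.list_json_schemas():\n    print(schema)\n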
Source code insynapseclient/services/json_schema.py
class JsonSchemaOrganization:\n \"\"\"Json Schema Organization\n\n :param name: Name of JSON schema organization\n :type name: str\n \"\"\"\n\n def __init__(self, name: str) -> None:\n self.name = name\n self.id = None\n self.created_on = None\n self.created_by = None\n self._json_schemas = dict()\n self._raw_json_schemas = dict()\n\n def __repr__(self):\n string = f\"JsonSchemaOrganization(name={self.name!r})\"\n return string\n\n def set_service(self, service):\n self.service = service\n\n def get(self):\n \"\"\"Gets Json Schema organization\"\"\"\n if self.id is not None:\n return True\n try:\n response = self.service.get_organization(self.name)\n except SynapseHTTPError as e:\n error_msg = str(e)\n if \"not found\" in error_msg:\n return False\n else:\n raise e\n self.id = response[\"id\"]\n self.created_on = response[\"createdOn\"]\n self.created_by = response[\"createdBy\"]\n return True\n\n def must_get(self):\n already_exists = self.get()\n assert already_exists, (\n \"This operation requires that the organization is created first. \"\n \"Call the 'create()' method to trigger the creation.\"\n )\n\n @property\n def name(self):\n return self._name\n\n @name.setter\n def name(self, value):\n if len(value) < 6:\n raise ValueError(\"Name must be at least 6 characters.\")\n if len(value) > 250:\n raise ValueError(\"Name cannot exceed 250 characters. \")\n if value[0].isdigit():\n raise ValueError(\"Name must not start with a number.\")\n self._name = value\n\n @property\n def raw(self):\n self.must_get()\n return self._raw\n\n def parse_response(self, response):\n self._raw = response\n self.id = response[\"id\"]\n self.created_on = response[\"createdOn\"]\n self.created_by = response[\"createdBy\"]\n\n @classmethod\n def from_response(cls, response):\n organization = cls(response[\"name\"])\n organization.parse_response(response)\n return organization\n\n def create(self):\n \"\"\"Create the JSON schema organization\"\"\"\n already_exists = self.get()\n if already_exists:\n return\n response = self.service.create_organization(self.name)\n self.parse_response(response)\n return self\n\n def delete(self):\n \"\"\"Delete the JSON schema organization\"\"\"\n self.must_get()\n response = self.service.delete_organization(self.id)\n return response\n\n def get_acl(self):\n \"\"\"Get ACL of JSON schema organization\"\"\"\n self.must_get()\n response = self.service.get_organization_acl(self.id)\n return response\n\n def set_acl(\n self,\n principal_ids: Sequence[int],\n access_type: Sequence[str] = DEFAULT_ACCESS,\n etag: str = None,\n ):\n \"\"\"Set ACL of JSON schema organization\n\n :param principal_ids: List of Synapse user or team ids.\n :type principal_ids: list\n :param access_type: Access control list. Defaults to [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"].\n :type access_type: list, optional\n :param etag: Etag. 
Defaults to None.\n :type etag: str, optional\n \"\"\"\n self.must_get()\n if etag is None:\n acl = self.get_acl()\n etag = acl[\"etag\"]\n resource_access = [\n {\"principalId\": principal_id, \"accessType\": access_type}\n for principal_id in principal_ids\n ]\n response = self.service.update_organization_acl(self.id, resource_access, etag)\n return response\n\n def update_acl(\n self,\n principal_ids: Sequence[int],\n access_type: Sequence[str] = DEFAULT_ACCESS,\n etag: str = None,\n ):\n \"\"\"Update ACL of JSON schema organization\n\n :param principal_ids: List of Synapse user or team ids.\n :type principal_ids: list\n :param access_type: Access control list. Defaults to [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"].\n :type access_type: list, optional\n :param etag: Etag. Defaults to None.\n :type etag: str, optional\n \"\"\"\n self.must_get()\n principal_ids = set(principal_ids)\n acl = self.get_acl()\n resource_access = acl[\"resourceAccess\"]\n if etag is None:\n etag = acl[\"etag\"]\n for entry in resource_access:\n if entry[\"principalId\"] in principal_ids:\n entry[\"accessType\"] = access_type\n principal_ids.remove(entry[\"principalId\"])\n for principal_id in principal_ids:\n entry = {\n \"principalId\": principal_id,\n \"accessType\": access_type,\n }\n resource_access.append(entry)\n response = self.service.update_organization_acl(self.id, resource_access, etag)\n return response\n\n def list_json_schemas(self):\n \"\"\"List JSON schemas available from the organization\"\"\"\n self.must_get()\n response = self.service.list_json_schemas(self.name)\n for raw_json_schema in response:\n json_schema = JsonSchema.from_response(self, raw_json_schema)\n self._raw_json_schemas[json_schema.name] = raw_json_schema\n self._json_schemas[json_schema.name] = json_schema\n yield json_schema\n\n def get_json_schema(self, json_schema_name: str, raw: bool = False):\n \"\"\"Get JSON schema\n\n :param json_schema_name: Name of JSON schema.\n :type json_schema_name: str\n :param raw: Return raw JSON schema. Default is False.\n :type raw: bool, optional\n \"\"\"\n self.must_get()\n if json_schema_name not in self._json_schemas:\n list(self.list_json_schemas())\n if raw:\n json_schema = self._raw_json_schemas.get(json_schema_name)\n else:\n json_schema = self._json_schemas.get(json_schema_name)\n return json_schema\n\n def create_json_schema(\n self,\n json_schema_body: dict,\n name: str = None,\n semantic_version: str = None,\n dry_run: bool = False,\n ):\n \"\"\"Create JSON schema\n\n :param json_schema_body: JSON schema dict\n :type json_schema_body: dict\n :param name: Name of JSON schema. Defaults to None.\n :type name: str, optional\n :param semantic_version: Version of JSON schema. Defaults to None.\n :type semantic_version: str, optional\n :param dry_run: Don't store to Synapse. Defaults to False.\n :type dry_run: bool, optional\n \"\"\"\n if name:\n uri = f\"{self.name}-{name}\"\n if semantic_version:\n uri = f\"{uri}-{semantic_version}\"\n json_schema_body[\"$id\"] = uri\n else:\n assert (\n semantic_version is not None\n ), \"Specify both the name and the semantic version (not just the latter)\"\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n json_schema = JsonSchemaVersion.from_response(self, raw_version)\n return json_schema\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization-functions","title":"Functions","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.get","title":"get()
","text":"Gets Json Schema organization
Source code insynapseclient/services/json_schema.py
def get(self):\n \"\"\"Gets Json Schema organization\"\"\"\n if self.id is not None:\n return True\n try:\n response = self.service.get_organization(self.name)\n except SynapseHTTPError as e:\n error_msg = str(e)\n if \"not found\" in error_msg:\n return False\n else:\n raise e\n self.id = response[\"id\"]\n self.created_on = response[\"createdOn\"]\n self.created_by = response[\"createdBy\"]\n return True\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.create","title":"create()
","text":"Create the JSON schema organization
Source code insynapseclient/services/json_schema.py
def create(self):\n \"\"\"Create the JSON schema organization\"\"\"\n already_exists = self.get()\n if already_exists:\n return\n response = self.service.create_organization(self.name)\n self.parse_response(response)\n return self\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.delete","title":"delete()
","text":"Delete the JSON schema organization
Source code insynapseclient/services/json_schema.py
def delete(self):\n \"\"\"Delete the JSON schema organization\"\"\"\n self.must_get()\n response = self.service.delete_organization(self.id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.get_acl","title":"get_acl()
","text":"Get ACL of JSON schema organization
Source code insynapseclient/services/json_schema.py
def get_acl(self):\n \"\"\"Get ACL of JSON schema organization\"\"\"\n self.must_get()\n response = self.service.get_organization_acl(self.id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.set_acl","title":"set_acl(principal_ids, access_type=DEFAULT_ACCESS, etag=None)
","text":"Set ACL of JSON schema organization
PARAMETER DESCRIPTION principal_ids
List of Synapse user or team ids.
TYPE: list
access_type
Access control list.
TYPE: list DEFAULT: [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"]
etag
Etag.
TYPE: str DEFAULT: None
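A hedged example granting the default access types to a single principal, continuing from the sketch above; the Id is a placeholder
org.set_acl(principal_ids=[3324230])  # placeholder principal Id\n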
Source code insynapseclient/services/json_schema.py
def set_acl(\n self,\n principal_ids: Sequence[int],\n access_type: Sequence[str] = DEFAULT_ACCESS,\n etag: str = None,\n):\n \"\"\"Set ACL of JSON schema organization\n\n :param principal_ids: List of Synapse user or team ids.\n :type principal_ids: list\n :param access_type: Access control list. Defaults to [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"].\n :type access_type: list, optional\n :param etag: Etag. Defaults to None.\n :type etag: str, optional\n \"\"\"\n self.must_get()\n if etag is None:\n acl = self.get_acl()\n etag = acl[\"etag\"]\n resource_access = [\n {\"principalId\": principal_id, \"accessType\": access_type}\n for principal_id in principal_ids\n ]\n response = self.service.update_organization_acl(self.id, resource_access, etag)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.update_acl","title":"update_acl(principal_ids, access_type=DEFAULT_ACCESS, etag=None)
","text":"Update ACL of JSON schema organization
PARAMETER DESCRIPTION principal_ids
List of Synapse user or team ids.
TYPE: list
access_type
Access control list.
TYPE: list DEFAULT: [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"]
etag
Etag.
TYPE: str DEFAULT: None
Source code insynapseclient/services/json_schema.py
def update_acl(\n self,\n principal_ids: Sequence[int],\n access_type: Sequence[str] = DEFAULT_ACCESS,\n etag: str = None,\n):\n \"\"\"Update ACL of JSON schema organization\n\n :param principal_ids: List of Synapse user or team ids.\n :type principal_ids: list\n :param access_type: Access control list. Defaults to [\"CHANGE_PERMISSIONS\", \"DELETE\", \"READ\", \"CREATE\", \"UPDATE\"].\n :type access_type: list, optional\n :param etag: Etag. Defaults to None.\n :type etag: str, optional\n \"\"\"\n self.must_get()\n principal_ids = set(principal_ids)\n acl = self.get_acl()\n resource_access = acl[\"resourceAccess\"]\n if etag is None:\n etag = acl[\"etag\"]\n for entry in resource_access:\n if entry[\"principalId\"] in principal_ids:\n entry[\"accessType\"] = access_type\n principal_ids.remove(entry[\"principalId\"])\n for principal_id in principal_ids:\n entry = {\n \"principalId\": principal_id,\n \"accessType\": access_type,\n }\n resource_access.append(entry)\n response = self.service.update_organization_acl(self.id, resource_access, etag)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.list_json_schemas","title":"list_json_schemas()
","text":"List JSON schemas available from the organization
Source code insynapseclient/services/json_schema.py
def list_json_schemas(self):\n \"\"\"List JSON schemas available from the organization\"\"\"\n self.must_get()\n response = self.service.list_json_schemas(self.name)\n for raw_json_schema in response:\n json_schema = JsonSchema.from_response(self, raw_json_schema)\n self._raw_json_schemas[json_schema.name] = raw_json_schema\n self._json_schemas[json_schema.name] = json_schema\n yield json_schema\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.get_json_schema","title":"get_json_schema(json_schema_name, raw=False)
","text":"Get JSON schema
PARAMETER DESCRIPTION json_schema_name
Name of the JSON schema.
TYPE: str
raw
Return the raw JSON schema.
TYPE: bool DEFAULT: False
Source code insynapseclient/services/json_schema.py
def get_json_schema(self, json_schema_name: str, raw: bool = False):\n \"\"\"Get JSON schema\n\n :param json_schema_name: Name of JSON schema.\n :type json_schema_name: str\n :param raw: Return raw JSON schema. Default is False.\n :type raw: bool, optional\n \"\"\"\n self.must_get()\n if json_schema_name not in self._json_schemas:\n list(self.list_json_schemas())\n if raw:\n json_schema = self._raw_json_schemas.get(json_schema_name)\n else:\n json_schema = self._json_schemas.get(json_schema_name)\n return json_schema\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaOrganization.create_json_schema","title":"create_json_schema(json_schema_body, name=None, semantic_version=None, dry_run=False)
","text":"Create JSON schema
PARAMETER DESCRIPTION json_schema_body
JSON schema dict.
TYPE: dict
name
Name of the JSON schema.
TYPE: str DEFAULT: None
semantic_version
Version of the JSON schema.
TYPE: str DEFAULT: None
dry_run
Do not store to Synapse.
TYPE: bool DEFAULT: False
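A hedged example, continuing from the sketch above; note the displayed source asserts that a semantic version may only be supplied together with a name
new_version = org.create_json_schema(\n    {'type': 'object', 'properties': {'species': {'type': 'string'}}},\n    name='my.schema',\n    semantic_version='0.2.0',\n)\n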
Source code insynapseclient/services/json_schema.py
def create_json_schema(\n self,\n json_schema_body: dict,\n name: str = None,\n semantic_version: str = None,\n dry_run: bool = False,\n):\n \"\"\"Create JSON schema\n\n :param json_schema_body: JSON schema dict\n :type json_schema_body: dict\n :param name: Name of JSON schema. Defaults to None.\n :type name: str, optional\n :param semantic_version: Version of JSON schema. Defaults to None.\n :type semantic_version: str, optional\n :param dry_run: Don't store to Synapse. Defaults to False.\n :type dry_run: bool, optional\n \"\"\"\n if name:\n uri = f\"{self.name}-{name}\"\n if semantic_version:\n uri = f\"{uri}-{semantic_version}\"\n json_schema_body[\"$id\"] = uri\n else:\n assert (\n semantic_version is not None\n ), \"Specify both the name and the semantic version (not just the latter)\"\n response = self.service.create_json_schema(json_schema_body, dry_run)\n if dry_run:\n return response\n raw_version = response[\"newVersionInfo\"]\n json_schema = JsonSchemaVersion.from_response(self, raw_version)\n return json_schema\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService","title":"JsonSchemaService
","text":"Json Schema Service
:param synapse: Synapse connection :type synapse: Synapse
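A hedged end-to-end sketch using the service-level helpers; the schema URI and 'syn123' are placeholders
import synapseclient\nfrom synapseclient.services.json_schema import JsonSchemaService\n\nsyn = synapseclient.login()\njs = JsonSchemaService(syn)\njs.bind_json_schema('my.organization-my.schema-0.1.0', 'syn123')\nprint(js.validate('syn123'))\nprint(js.validation_stats('syn123'))\njs.unbind_json_schema('syn123')\n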
Source code insynapseclient/services/json_schema.py
class JsonSchemaService:\n \"\"\"Json Schema Service\n\n :param synapse: Synapse connection\n :type synapse: Synapse\n \"\"\"\n\n def __init__(self, synapse: Synapse = None) -> None:\n self.synapse = synapse\n\n @wraps(Synapse.login)\n def login(self, *args, **kwargs):\n synapse = Synapse()\n synapse.login(*args, **kwargs)\n self.synapse = synapse\n\n @wraps(JsonSchemaOrganization)\n def JsonSchemaOrganization(self, *args, **kwargs):\n instance = JsonSchemaOrganization(*args, **kwargs)\n instance.set_service(self)\n return instance\n\n @wraps(JsonSchemaVersion)\n def JsonSchemaVersion(self, *args, **kwargs):\n instance = JsonSchemaVersion(*args, **kwargs)\n instance.set_service(self)\n return instance\n\n @wraps(JsonSchema)\n def JsonSchema(self, *args, **kwargs):\n instance = JsonSchema(*args, **kwargs)\n instance.set_service(self)\n return instance\n\n def authentication_required(func):\n @wraps(func)\n def wrapper(self, *args, **kwargs):\n msg = (\n f\"`JsonSchemaService.{func.__name__}()` requests must be authenticated.\"\n \" Login using the `login()` method on the existing `JsonSchemaService`\"\n \" instance (e.g., `js.login()` or `js.login(authToken=...)`).\"\n )\n assert self.synapse is not None, msg\n try:\n result = func(self, *args, **kwargs)\n except SynapseAuthenticationError as e:\n raise SynapseAuthenticationError(msg).with_traceback(e.__traceback__)\n return result\n\n return wrapper\n\n @authentication_required\n def create_organization(self, organization_name: str):\n \"\"\"Create a new organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n request_body = {\"organizationName\": organization_name}\n response = self.synapse.restPOST(\n \"/schema/organization\", body=json.dumps(request_body)\n )\n return response\n\n @authentication_required\n def get_organization(self, organization_name: str):\n \"\"\"Get a organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n response = self.synapse.restGET(\n f\"/schema/organization?name={organization_name}\"\n )\n return response\n\n def list_organizations(self):\n \"\"\"List organizations\"\"\"\n request_body = {}\n response = self.synapse._POST_paginated(\n \"/schema/organization/list\", request_body\n )\n return response\n\n @authentication_required\n def delete_organization(self, organization_id: str):\n \"\"\"Delete organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n \"\"\"\n response = self.synapse.restDELETE(f\"/schema/organization/{organization_id}\")\n return response\n\n @authentication_required\n def get_organization_acl(self, organization_id: str):\n \"\"\"Get ACL associated with Organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/schema/organization/{organization_id}/acl\")\n return response\n\n @authentication_required\n def update_organization_acl(\n self,\n organization_id: str,\n resource_access: Sequence[Mapping[str, Sequence[str]]],\n etag: str,\n ):\n \"\"\"Get ACL associated with Organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n :param resource_access: Resource access array\n :type resource_access: list\n :param etag: Etag\n :type etag: str\n \"\"\"\n request_body = {\"resourceAccess\": resource_access, \"etag\": etag}\n response = self.synapse.restPUT(\n f\"/schema/organization/{organization_id}/acl\", 
body=json.dumps(request_body)\n )\n return response\n\n def list_json_schemas(self, organization_name: str):\n \"\"\"List JSON schemas for an organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n request_body = {\"organizationName\": organization_name}\n response = self.synapse._POST_paginated(\"/schema/list\", request_body)\n return response\n\n def list_json_schema_versions(self, organization_name: str, json_schema_name: str):\n \"\"\"List version information for each JSON schema\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n :param json_schema_name: JSON schema name\n :type json_schema_name: str\n \"\"\"\n request_body = {\n \"organizationName\": organization_name,\n \"schemaName\": json_schema_name,\n }\n response = self.synapse._POST_paginated(\"/schema/version/list\", request_body)\n return response\n\n @authentication_required\n def create_json_schema(self, json_schema_body: dict, dry_run: bool = False):\n \"\"\"Create a JSON schema\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param dry_run: Don't store to Synapse. Default to False.\n :type dry_run: bool, optional\n \"\"\"\n request_body = {\n \"concreteType\": \"org.sagebionetworks.repo.model.schema.CreateSchemaRequest\",\n \"schema\": json_schema_body,\n \"dryRun\": dry_run,\n }\n response = self.synapse._waitForAsync(\"/schema/type/create/async\", request_body)\n return response\n\n def get_json_schema_body(self, json_schema_uri: str):\n \"\"\"Get registered JSON schema with its $id\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n response = self.synapse.restGET(f\"/schema/type/registered/{json_schema_uri}\")\n return response\n\n @authentication_required\n def delete_json_schema(self, json_schema_uri: str):\n \"\"\"Delete the given schema using its $id\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n response = self.synapse.restDELETE(f\"/schema/type/registered/{json_schema_uri}\")\n return response\n\n @authentication_required\n def json_schema_validation(self, json_schema_uri: str):\n \"\"\"Use a JSON schema for validation\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n request_body = {\n \"concreteType\": (\n \"org.sagebionetworks.repo.model.schema.GetValidationSchemaRequest\"\n ),\n \"$id\": json_schema_uri,\n }\n response = self.synapse._waitForAsync(\n \"/schema/type/validation/async\", request_body\n )\n return response\n\n @authentication_required\n def bind_json_schema_to_entity(self, synapse_id: str, json_schema_uri: str):\n \"\"\"Bind a JSON schema to an entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n request_body = {\"entityId\": synapse_id, \"schema$id\": json_schema_uri}\n response = self.synapse.restPUT(\n f\"/entity/{synapse_id}/schema/binding\", body=json.dumps(request_body)\n )\n return response\n\n @authentication_required\n def get_json_schema_from_entity(self, synapse_id: str):\n \"\"\"Get bound schema from entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/entity/{synapse_id}/schema/binding\")\n return response\n\n @authentication_required\n def delete_json_schema_from_entity(self, synapse_id: str):\n \"\"\"Delete bound schema from entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n 
response = self.synapse.restDELETE(f\"/entity/{synapse_id}/schema/binding\")\n return response\n\n @authentication_required\n def validate_entity_with_json_schema(self, synapse_id: str):\n \"\"\"Get validation results of an entity against bound JSON schema\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/entity/{synapse_id}/schema/validation\")\n return response\n\n @authentication_required\n def get_json_schema_validation_statistics(self, synapse_id: str):\n \"\"\"Get the summary statistic of json schema validation results for\n a container entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(\n f\"/entity/{synapse_id}/schema/validation/statistics\"\n )\n return response\n\n @authentication_required\n def get_invalid_json_schema_validation(self, synapse_id: str):\n \"\"\"Get a single page of invalid JSON schema validation results for a container Entity\n (Project or Folder).\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n request_body = {\"containerId\": synapse_id}\n response = self.synapse._POST_paginated(\n f\"/entity/{synapse_id}/schema/validation/invalid\", request_body\n )\n return response\n\n # The methods below are here until they are integrated with Synapse/Entity\n\n def bind_json_schema(self, json_schema_uri: str, entity: Union[str, Entity]):\n \"\"\"Bind a JSON schema to an entity\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.bind_json_schema_to_entity(synapse_id, json_schema_uri)\n return response\n\n def get_json_schema(self, entity: Union[str, Entity]):\n \"\"\"Get a JSON schema associated to an Entity\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.get_json_schema_from_entity(synapse_id)\n return response\n\n def unbind_json_schema(self, entity: Union[str, Entity]):\n \"\"\"Unbind a JSON schema from an entity\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.delete_json_schema_from_entity(synapse_id)\n return response\n\n def validate(self, entity: Union[str, Entity]):\n \"\"\"Validate an entity based on the bound JSON schema\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.validate_entity_with_json_schema(synapse_id)\n return response\n\n def validation_stats(self, entity: Union[str, Entity]):\n \"\"\"Get validation statistics of an entity based on the bound JSON schema\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.get_json_schema_validation_statistics(synapse_id)\n return response\n\n def validate_children(self, entity: Union[str, Entity]):\n \"\"\"Validate an entity and it's children based on the bound JSON schema\n\n :param entity: Synapse Entity or Synapse Id of a project or folder.\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.get_invalid_json_schema_validation(synapse_id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService-functions","title":"Functions","text":""},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.create_organization","title":"create_organization(organization_name)
","text":"Create a new organization
PARAMETER DESCRIPTION organization_name
JSON schema organization name.
TYPE: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef create_organization(self, organization_name: str):\n \"\"\"Create a new organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n request_body = {\"organizationName\": organization_name}\n response = self.synapse.restPOST(\n \"/schema/organization\", body=json.dumps(request_body)\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_organization","title":"get_organization(organization_name)
","text":"Get a organization
:param organization_name: JSON schema organization name :type organization_name: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef get_organization(self, organization_name: str):\n \"\"\"Get a organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n response = self.synapse.restGET(\n f\"/schema/organization?name={organization_name}\"\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.list_organizations","title":"list_organizations()
","text":"List organizations
Source code insynapseclient/services/json_schema.py
def list_organizations(self):\n \"\"\"List organizations\"\"\"\n request_body = {}\n response = self.synapse._POST_paginated(\n \"/schema/organization/list\", request_body\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.delete_organization","title":"delete_organization(organization_id)
","text":"Delete organization
PARAMETER DESCRIPTION organization_id
JSON schema organization Id.
TYPE: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef delete_organization(self, organization_id: str):\n \"\"\"Delete organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n \"\"\"\n response = self.synapse.restDELETE(f\"/schema/organization/{organization_id}\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_organization_acl","title":"get_organization_acl(organization_id)
","text":"Get ACL associated with Organization
PARAMETER DESCRIPTION organization_id
JSON schema organization Id.
TYPE: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef get_organization_acl(self, organization_id: str):\n \"\"\"Get ACL associated with Organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/schema/organization/{organization_id}/acl\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.update_organization_acl","title":"update_organization_acl(organization_id, resource_access, etag)
","text":"Get ACL associated with Organization
:param organization_id: JSON schema organization Id :type organization_id: str :param resource_access: Resource access array :type resource_access: list :param etag: Etag :type etag: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef update_organization_acl(\n self,\n organization_id: str,\n resource_access: Sequence[Mapping[str, Sequence[str]]],\n etag: str,\n):\n \"\"\"Get ACL associated with Organization\n\n :param organization_id: JSON schema organization Id\n :type organization_id: str\n :param resource_access: Resource access array\n :type resource_access: list\n :param etag: Etag\n :type etag: str\n \"\"\"\n request_body = {\"resourceAccess\": resource_access, \"etag\": etag}\n response = self.synapse.restPUT(\n f\"/schema/organization/{organization_id}/acl\", body=json.dumps(request_body)\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.list_json_schemas","title":"list_json_schemas(organization_name)
","text":"List JSON schemas for an organization
PARAMETER DESCRIPTION organization_name
JSON schema organization name.
TYPE: str
Source code insynapseclient/services/json_schema.py
def list_json_schemas(self, organization_name: str):\n \"\"\"List JSON schemas for an organization\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n \"\"\"\n request_body = {\"organizationName\": organization_name}\n response = self.synapse._POST_paginated(\"/schema/list\", request_body)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.list_json_schema_versions","title":"list_json_schema_versions(organization_name, json_schema_name)
","text":"List version information for each JSON schema
PARAMETER DESCRIPTION organization_name
JSON schema organization name.
TYPE: str
json_schema_name
JSON schema name.
TYPE: str
Source code insynapseclient/services/json_schema.py
def list_json_schema_versions(self, organization_name: str, json_schema_name: str):\n \"\"\"List version information for each JSON schema\n\n :param organization_name: JSON schema organization name\n :type organization_name: str\n :param json_schema_name: JSON schema name\n :type json_schema_name: str\n \"\"\"\n request_body = {\n \"organizationName\": organization_name,\n \"schemaName\": json_schema_name,\n }\n response = self.synapse._POST_paginated(\"/schema/version/list\", request_body)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.create_json_schema","title":"create_json_schema(json_schema_body, dry_run=False)
","text":"Create a JSON schema
:param json_schema_body: JSON schema body :type json_schema_body: dict :param dry_run: Don't store to Synapse. Defaults to False. :type dry_run: bool, optional
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef create_json_schema(self, json_schema_body: dict, dry_run: bool = False):\n \"\"\"Create a JSON schema\n\n :param json_schema_body: JSON schema body\n :type json_schema_body: dict\n :param dry_run: Don't store to Synapse. Default to False.\n :type dry_run: bool, optional\n \"\"\"\n request_body = {\n \"concreteType\": \"org.sagebionetworks.repo.model.schema.CreateSchemaRequest\",\n \"schema\": json_schema_body,\n \"dryRun\": dry_run,\n }\n response = self.synapse._waitForAsync(\"/schema/type/create/async\", request_body)\n return response\n
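A minimal sketch of registering a schema, assuming syn.service(\"json_schema\") and an organization you can create schemas in; the organization name and schema body are placeholders:
import synapseclient\nsyn = synapseclient.login()\njs = syn.service(\"json_schema\")\nschema_body = {\n \"$id\": \"myorg-example.schema\", # placeholder <organization name>-<schema name>\n \"type\": \"object\",\n \"properties\": {\"species\": {\"type\": \"string\"}},\n}\n# dry_run=True validates the request without storing anything in Synapse\nresult = js.create_json_schema(schema_body, dry_run=True)\n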
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_json_schema_body","title":"get_json_schema_body(json_schema_uri)
","text":"Get registered JSON schema with its $id
:param json_schema_uri: JSON schema URI :type json_schema_uri: str
Source code insynapseclient/services/json_schema.py
def get_json_schema_body(self, json_schema_uri: str):\n \"\"\"Get registered JSON schema with its $id\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n response = self.synapse.restGET(f\"/schema/type/registered/{json_schema_uri}\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.delete_json_schema","title":"delete_json_schema(json_schema_uri)
","text":"Delete the given schema using its $id
:param json_schema_uri: JSON schema URI :type json_schema_uri: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef delete_json_schema(self, json_schema_uri: str):\n \"\"\"Delete the given schema using its $id\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n response = self.synapse.restDELETE(f\"/schema/type/registered/{json_schema_uri}\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.json_schema_validation","title":"json_schema_validation(json_schema_uri)
","text":"Use a JSON schema for validation
:param json_schema_uri: JSON schema URI :type json_schema_uri: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef json_schema_validation(self, json_schema_uri: str):\n \"\"\"Get the validation schema for a registered JSON schema\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n request_body = {\n \"concreteType\": (\n \"org.sagebionetworks.repo.model.schema.GetValidationSchemaRequest\"\n ),\n \"$id\": json_schema_uri,\n }\n response = self.synapse._waitForAsync(\n \"/schema/type/validation/async\", request_body\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.bind_json_schema_to_entity","title":"bind_json_schema_to_entity(synapse_id, json_schema_uri)
","text":"Bind a JSON schema to an entity
:param synapse_id: Synapse Id :type synapse_id: str :param json_schema_uri: JSON schema URI :type json_schema_uri: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef bind_json_schema_to_entity(self, synapse_id: str, json_schema_uri: str):\n \"\"\"Bind a JSON schema to an entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n \"\"\"\n request_body = {\"entityId\": synapse_id, \"schema$id\": json_schema_uri}\n response = self.synapse.restPUT(\n f\"/entity/{synapse_id}/schema/binding\", body=json.dumps(request_body)\n )\n return response\n
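A short sketch of binding a registered schema to an entity, assuming syn.service(\"json_schema\"); the Synapse id and schema URI are placeholders:
js = syn.service(\"json_schema\")\n# Bind the schema by its $id to the entity \"syn123\"\njs.bind_json_schema_to_entity(\"syn123\", \"myorg-example.schema\")\n# The bind_json_schema convenience wrapper also accepts an Entity object:\n# js.bind_json_schema(\"myorg-example.schema\", \"syn123\")\n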
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_json_schema_from_entity","title":"get_json_schema_from_entity(synapse_id)
","text":"Get bound schema from entity
:param synapse_id: Synapse Id :type synapse_id: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef get_json_schema_from_entity(self, synapse_id: str):\n \"\"\"Get bound schema from entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/entity/{synapse_id}/schema/binding\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.delete_json_schema_from_entity","title":"delete_json_schema_from_entity(synapse_id)
","text":"Delete bound schema from entity
:param synapse_id: Synapse Id :type synapse_id: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef delete_json_schema_from_entity(self, synapse_id: str):\n \"\"\"Delete bound schema from entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restDELETE(f\"/entity/{synapse_id}/schema/binding\")\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.validate_entity_with_json_schema","title":"validate_entity_with_json_schema(synapse_id)
","text":"Get validation results of an entity against bound JSON schema
:param synapse_id: Synapse Id :type synapse_id: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef validate_entity_with_json_schema(self, synapse_id: str):\n \"\"\"Get validation results of an entity against bound JSON schema\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(f\"/entity/{synapse_id}/schema/validation\")\n return response\n
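A short sketch of fetching validation results for an entity with a bound schema, assuming syn.service(\"json_schema\"); syn123 is a placeholder and the exact keys of the returned dict follow the REST response:
js = syn.service(\"json_schema\")\nresults = js.validate_entity_with_json_schema(\"syn123\")\n# Inspect the raw response rather than assuming specific keys\nprint(results)\n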
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_json_schema_validation_statistics","title":"get_json_schema_validation_statistics(synapse_id)
","text":"Get the summary statistic of json schema validation results for a container entity
:param synapse_id: Synapse Id :type synapse_id: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef get_json_schema_validation_statistics(self, synapse_id: str):\n \"\"\"Get the summary statistics of JSON schema validation results for\n a container entity\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n response = self.synapse.restGET(\n f\"/entity/{synapse_id}/schema/validation/statistics\"\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_invalid_json_schema_validation","title":"get_invalid_json_schema_validation(synapse_id)
","text":"Get a single page of invalid JSON schema validation results for a container Entity (Project or Folder).
:param synapse_id: Synapse Id :type synapse_id: str
Source code insynapseclient/services/json_schema.py
@authentication_required\ndef get_invalid_json_schema_validation(self, synapse_id: str):\n \"\"\"Get a single page of invalid JSON schema validation results for a container Entity\n (Project or Folder).\n\n :param synapse_id: Synapse Id\n :type synapse_id: str\n \"\"\"\n request_body = {\"containerId\": synapse_id}\n response = self.synapse._POST_paginated(\n f\"/entity/{synapse_id}/schema/validation/invalid\", request_body\n )\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.bind_json_schema","title":"bind_json_schema(json_schema_uri, entity)
","text":"Bind a JSON schema to an entity
:param json_schema_uri: JSON schema URI :type json_schema_uri: str :param entity: Synapse Entity or Synapse Id :type entity: str, Entity
Source code insynapseclient/services/json_schema.py
def bind_json_schema(self, json_schema_uri: str, entity: Union[str, Entity]):\n \"\"\"Bind a JSON schema to an entity\n\n :param json_schema_uri: JSON schema URI\n :type json_schema_uri: str\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.bind_json_schema_to_entity(synapse_id, json_schema_uri)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.get_json_schema","title":"get_json_schema(entity)
","text":"Get a JSON schema associated to an Entity
:param entity: Synapse Entity or Synapse Id :type entity: str, Entity
Source code insynapseclient/services/json_schema.py
def get_json_schema(self, entity: Union[str, Entity]):\n \"\"\"Get the JSON schema associated with an Entity\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.get_json_schema_from_entity(synapse_id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.unbind_json_schema","title":"unbind_json_schema(entity)
","text":"Unbind a JSON schema from an entity
:param entity: Synapse Entity or Synapse Id :type entity: str, Entity
Source code insynapseclient/services/json_schema.py
def unbind_json_schema(self, entity: Union[str, Entity]):\n \"\"\"Unbind a JSON schema from an entity\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.delete_json_schema_from_entity(synapse_id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.validate","title":"validate(entity)
","text":"Validate an entity based on the bound JSON schema
:param entity: Synapse Entity or Synapse Id :type entity: str, Entity
Source code insynapseclient/services/json_schema.py
def validate(self, entity: Union[str, Entity]):\n \"\"\"Validate an entity based on the bound JSON schema\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.validate_entity_with_json_schema(synapse_id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.validation_stats","title":"validation_stats(entity)
","text":"Get validation statistics of an entity based on the bound JSON schema
:param entity: Synapse Entity or Synapse Id :type entity: str, Entity
Source code insynapseclient/services/json_schema.py
def validation_stats(self, entity: Union[str, Entity]):\n \"\"\"Get validation statistics of an entity based on the bound JSON schema\n\n :param entity: Synapse Entity or Synapse Id\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.get_json_schema_validation_statistics(synapse_id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema.JsonSchemaService.validate_children","title":"validate_children(entity)
","text":"Validate an entity and it's children based on the bound JSON schema
:param entity: Synapse Entity or Synapse Id of a project or folder. :type entity: str, Entity
Source code insynapseclient/services/json_schema.py
def validate_children(self, entity: Union[str, Entity]):\n \"\"\"Validate an entity and its children based on the bound JSON schema\n\n :param entity: Synapse Entity or Synapse Id of a project or folder.\n :type entity: str, Entity\n \"\"\"\n synapse_id = id_of(entity)\n response = self.get_invalid_json_schema_validation(synapse_id)\n return response\n
"},{"location":"reference/json_schema/#synapseclient.services.json_schema-functions","title":"Functions","text":""},{"location":"reference/link/","title":"Link","text":""},{"location":"reference/link/#synapseclient.entity.Link","title":"synapseclient.entity.Link
","text":" Bases: Entity
Represents a link in Synapse.
Links must have a target ID and a parent. When you do synapseclient.Synapse.get on a Link object, the Link object is returned. If the target is desired, specify followLink=True in synapseclient.Synapse.get.
targetId
The ID of the entity to be linked
targetVersion
The version of the entity to be linked
parent
The parent project or folder
properties
A map of Synapse properties
annotations
A map of user defined annotations
local_state
Internal use only
Using this class
Creating an instance and storing the link
link = Link('targetID', parent=folder)\nlink = syn.store(link)\n
Source code in synapseclient/entity.py
class Link(Entity):\n \"\"\"\n Represents a link in Synapse.\n\n Links must have a target ID and a parent. When you do :py:func:`synapseclient.Synapse.get` on a Link object,\n the Link object is returned. If the target is desired, specify followLink=True in synapseclient.Synapse.get.\n\n Attributes:\n targetId: The ID of the entity to be linked\n targetVersion: The version of the entity to be linked\n parent: The parent project or folder\n properties: A map of Synapse properties\n annotations: A map of user defined annotations\n local_state: Internal use only\n\n\n Example: Using this class\n Creating an instance and storing the link\n\n link = Link('targetID', parent=folder)\n link = syn.store(link)\n \"\"\"\n\n _property_keys = Entity._property_keys + [\"linksTo\", \"linksToClassName\"]\n _local_keys = Entity._local_keys\n _synapse_entity_type = \"org.sagebionetworks.repo.model.Link\"\n\n def __init__(\n self,\n targetId=None,\n targetVersion=None,\n parent=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if targetId is not None and targetVersion is not None:\n kwargs[\"linksTo\"] = dict(\n targetId=utils.id_of(targetId), targetVersionNumber=targetVersion\n )\n elif targetId is not None and targetVersion is None:\n kwargs[\"linksTo\"] = dict(targetId=utils.id_of(targetId))\n elif properties is not None and \"linksTo\" in properties:\n pass\n else:\n raise SynapseMalformedEntityError(\"Must provide a target id\")\n super(Link, self).__init__(\n concreteType=Link._synapse_entity_type,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n
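A brief sketch of the followLink behavior described above; syn123 and syn456 are placeholder ids:
from synapseclient import Link\nlink = syn.store(Link(\"syn123\", parent=\"syn456\"))\n# get() on a Link returns the Link entity itself...\nsame_link = syn.get(link.id)\n# ...while followLink=True returns the linked target instead\ntarget = syn.get(link.id, followLink=True)\n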
"},{"location":"reference/project/","title":"Project","text":""},{"location":"reference/project/#synapseclient.entity.Project","title":"synapseclient.entity.Project
","text":" Bases: Entity
Represents a project in Synapse.
Projects in Synapse must be uniquely named. Trying to create a project with a name that's already taken, say 'My project', will result in an error.
ATTRIBUTE DESCRIPTION
name
The name of the project
properties
A map of Synapse properties
annotations
A map of user defined annotations
local_state
Internal use only
Using this class
Creating an instance and storing the project
project = Project('Foobarbat project')\nproject = syn.store(project)\n
Source code in synapseclient/entity.py
class Project(Entity):\n \"\"\"\n Represents a project in Synapse.\n\n Projects in Synapse must be uniquely named. Trying to create a project with a name that's already taken, say\n 'My project', will result in an error\n\n Attributes:\n name: The name of the project\n properties: A map of Synapse properties\n annotations: A map of user defined annotations\n local_state: Internal use only\n\n\n Example: Using this class\n Creating an instance and storing the project\n\n project = Project('Foobarbat project')\n project = syn.store(project)\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.Project\"\n\n def __init__(\n self, name=None, properties=None, annotations=None, local_state=None, **kwargs\n ):\n if name:\n kwargs[\"name\"] = name\n super(Project, self).__init__(\n concreteType=Project._synapse_entity_type,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n **kwargs,\n )\n
"},{"location":"reference/synapse_utils/","title":"Synapse Utils","text":""},{"location":"reference/synapse_utils/#synapseutils","title":"synapseutils
","text":""},{"location":"reference/synapse_utils/#synapseutils--overview","title":"Overview","text":"The synapseutils
package provides both higher-level beta functions and utilities for interacting with Synapse. The behavior of these functions is subject to change.
synapseutils.sync
","text":""},{"location":"reference/synapse_utils/#synapseutils.sync-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.sync.syncFromSynapse","title":"syncFromSynapse(syn, entity, path=None, ifcollision='overwrite.local', allFiles=None, followLink=False, manifest='all', downloadFile=True)
","text":"Synchronizes all the files in a folder (including subfolders) from Synapse and adds a readme manifest with file metadata.
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
entity
A Synapse ID or a Synapse Entity object of type file, folder, or project.
path
An optional path where the file hierarchy will be reproduced. If not specified, the files will be placed in the synapseCache by default.
DEFAULT: None
ifcollision
Determines how to handle file collisions. May be \"overwrite.local\", \"keep.local\", or \"keep.both\".
DEFAULT: 'overwrite.local'
followLink
Determines whether the link returns the target Entity.
DEFAULT: False
manifest
Determines whether a manifest file is created automatically. The accepted values are all, root, and suppress.
DEFAULT: 'all'
downloadFile
Determines whether the files are downloaded.
DEFAULT: True
List of entities (files, tables, links)
This function will crawl all subfolders of the project/folder specified by entity
and download all files that have not already been downloaded. If there are newer files in Synapse (or a local file has been edited outside of the cache) since the last download, then the local file will be replaced by the new file unless \"ifcollision\" is changed.
If the files are being downloaded to a specific location outside of the Synapse cache, a file (SYNAPSE_METADATA_MANIFEST.tsv) containing the metadata (annotations, storage location, and provenance) of all downloaded files will also be added to that path.
See also:
Download and print the paths of all downloaded files:
entities = syncFromSynapse(syn, \"syn1234\")\nfor f in entities:\n print(f.path)\n
Source code in synapseutils/sync.py
@tracer.start_as_current_span(\"sync::syncFromSynapse\")\ndef syncFromSynapse(\n syn,\n entity,\n path=None,\n ifcollision=\"overwrite.local\",\n allFiles=None,\n followLink=False,\n manifest=\"all\",\n downloadFile=True,\n):\n \"\"\"Synchronizes all the files in a folder (including subfolders) from Synapse and adds a readme manifest with file\n metadata.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: A Synapse ID, a Synapse Entity object of type file, folder or project.\n path: An optional path where the file hierarchy will be reproduced. If not specified the files will by default be placed in the synapseCache.\n ifcollision: Determines how to handle file collisions. Maybe \"overwrite.local\", \"keep.local\", or \"keep.both\".\n followLink: Determines whether the link returns the target Entity.\n manifest: Determines whether creating manifest file automatically. The optional values here (`all`, `root`, `suppress`).\n downloadFile: Determines whether downloading the files.\n\n Returns:\n List of entities ([files][synapseclient.File], [tables][synapseclient.Table], [links][synapseclient.Link])\n\n\n This function will crawl all subfolders of the project/folder specified by `entity` and download all files that have\n not already been downloaded. If there are newer files in Synapse (or a local file has been edited outside of the\n cache) since the last download then local the file will be replaced by the new file unless \"ifcollision\" is changed.\n\n If the files are being downloaded to a specific location outside of the Synapse cache a file\n (SYNAPSE_METADATA_MANIFEST.tsv) will also be added in the path that contains the metadata (annotations, storage\n location and provenance of all downloaded files).\n\n See also:\n\n - [synapseutils.sync.syncToSynapse][]\n\n Example: Using this function\n Download and print the paths of all downloaded files:\n\n entities = syncFromSynapse(syn, \"syn1234\")\n for f in entities:\n print(f.path)\n \"\"\"\n\n if manifest not in (\"all\", \"root\", \"suppress\"):\n raise ValueError(\n 'Value of manifest option should be one of the (\"all\", \"root\", \"suppress\")'\n )\n\n # we'll have the following threads:\n # 1. the entrant thread to this function walks the folder hierarchy and schedules files for download,\n # and then waits for all the file downloads to complete\n # 2. each file download will run in a separate thread in an Executor\n # 3. downloads that support S3 multipart concurrent downloads will be scheduled by the thread in #2 and have\n # their parts downloaded in additional threads in the same Executor\n # To support multipart downloads in #3 using the same Executor as the download thread #2, we need at least\n # 2 threads always, if those aren't available then we'll run single threaded to avoid a deadlock\n with _sync_executor(syn) as executor:\n sync_from_synapse = _SyncDownloader(syn, executor)\n files = sync_from_synapse.sync(\n entity, path, ifcollision, followLink, downloadFile, manifest\n )\n\n # the allFiles parameter used to be passed in as part of the recursive implementation of this function\n # with the public signature invoking itself. now that this isn't a recursive any longer we don't need\n # allFiles as a parameter (especially on the public signature) but it is retained for now for backwards\n # compatibility with external invokers.\n if allFiles is not None:\n allFiles.extend(files)\n files = allFiles\n\n return files\n
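Another sketch, mirroring a folder into a local directory while keeping both copies on a collision and writing a manifest only at the root; syn1234 and the local path are placeholders:
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nentities = synapseutils.syncFromSynapse(\n syn, \"syn1234\", path=\"./data\", ifcollision=\"keep.both\", manifest=\"root\"\n)\n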
"},{"location":"reference/synapse_utils/#synapseutils.sync.syncToSynapse","title":"syncToSynapse(syn, manifestFile, dryRun=False, sendMessages=True, retries=MAX_RETRIES)
","text":"Synchronizes files specified in the manifest file to Synapse.
Given a file describing all of the uploads, this function uploads the content to Synapse and optionally notifies you via Synapse messaging (email) at specific intervals, on errors, and on completion.
Read more about the manifest file format
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
manifestFile
A tsv file with file locations and metadata to be pushed to Synapse.
dryRun
Performs validation without uploading if set to True.
DEFAULT: False
sendMessages
Sends out messages on completion if set to True.
DEFAULT: True
None
None
Source code insynapseutils/sync.py
@tracer.start_as_current_span(\"sync::syncToSynapse\")\ndef syncToSynapse(\n syn, manifestFile, dryRun=False, sendMessages=True, retries=MAX_RETRIES\n) -> None:\n \"\"\"Synchronizes files specified in the manifest file to Synapse.\n\n Given a file describing all of the uploads, uploads the content to Synapse and optionally notifies you via Synapse\n messaging (email) at specific intervals, on errors and on completion.\n\n [Read more about the manifest file format](../../explanations/manifest_tsv/)\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n manifestFile: A tsv file with file locations and metadata to be pushed to Synapse.\n dryRun: Performs validation without uploading if set to True.\n sendMessages: Sends out messages on completion if set to True.\n\n Returns:\n None\n \"\"\"\n df = readManifestFile(syn, manifestFile)\n # have to check the size of each single file\n sizes = [\n os.stat(os.path.expandvars(os.path.expanduser(f))).st_size\n for f in df.path\n if not is_url(f)\n ]\n # Write output on what is getting pushed and estimated times - send out message.\n sys.stdout.write(\"=\" * 50 + \"\\n\")\n sys.stdout.write(\n \"We are about to upload %i files with a total size of %s.\\n \"\n % (len(df), utils.humanizeBytes(sum(sizes)))\n )\n sys.stdout.write(\"=\" * 50 + \"\\n\")\n\n if dryRun:\n return\n\n sys.stdout.write(\"Starting upload...\\n\")\n if sendMessages:\n notify_decorator = notifyMe(syn, \"Upload of %s\" % manifestFile, retries=retries)\n upload = notify_decorator(_manifest_upload)\n upload(syn, df)\n else:\n _manifest_upload(syn, df)\n
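A typical invocation, sketched with a placeholder manifest path: validate first with dryRun, then perform the actual upload.
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\n# Validate the manifest without uploading anything\nsynapseutils.syncToSynapse(syn, \"manifest.tsv\", dryRun=True, sendMessages=False)\n# Then do the real upload\nsynapseutils.syncToSynapse(syn, \"manifest.tsv\", sendMessages=False)\n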
"},{"location":"reference/synapse_utils/#synapseutils.sync.generateManifest","title":"generateManifest(syn, allFiles, filename, provenance_cache=None)
","text":"Generates a manifest file based on a list of entities objects.
Read more about the manifest file format
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
allFiles
A list of File Entity objects on Synapse (can't be Synapse IDs)
filename
file where manifest will be written
provenance_cache
an optional dict of known provenance dicts keyed by entity ids
DEFAULT: None
None
None
Source code insynapseutils/sync.py
def generateManifest(syn, allFiles, filename, provenance_cache=None) -> None:\n \"\"\"Generates a manifest file based on a list of entity objects.\n\n [Read more about the manifest file format](../../explanations/manifest_tsv/)\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n allFiles: A list of File Entity objects on Synapse (can't be Synapse IDs)\n filename: file where manifest will be written\n provenance_cache: an optional dict of known provenance dicts keyed by entity ids\n\n Returns:\n None\n \"\"\"\n keys, data = _extract_file_entity_metadata(\n syn, allFiles, provenance_cache=provenance_cache\n )\n _write_manifest_data(filename, keys, data)\n
"},{"location":"reference/synapse_utils/#synapseutils.sync.generate_sync_manifest","title":"generate_sync_manifest(syn, directory_path, parent_id, manifest_path)
","text":"Generate manifest for syncToSynapse() from a local directory.
Read more about the manifest file format
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
directory_path
Path to local directory to be pushed to Synapse.
parent_id
Synapse ID of the parent folder/project on Synapse.
manifest_path
Path to the manifest file to be generated.
RETURNS DESCRIPTION
None
None
Source code insynapseutils/sync.py
@tracer.start_as_current_span(\"sync::generate_sync_manifest\")\ndef generate_sync_manifest(syn, directory_path, parent_id, manifest_path) -> None:\n \"\"\"Generate manifest for syncToSynapse() from a local directory.\n\n [Read more about the manifest file format](../../explanations/manifest_tsv/)\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n directory_path: Path to local directory to be pushed to Synapse.\n parent_id: Synapse ID of the parent folder/project on Synapse.\n manifest_path: Path to the manifest file to be generated.\n\n Returns:\n None\n \"\"\"\n manifest_cols = [\"path\", \"parent\"]\n manifest_rows = _walk_directory_tree(syn, directory_path, parent_id)\n _write_manifest_data(manifest_path, manifest_cols, manifest_rows)\n
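A sketch of the generate-then-upload workflow; the local directory and syn1234 parent id are placeholders:
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nsynapseutils.generate_sync_manifest(syn, \"./data\", \"syn1234\", \"manifest.tsv\")\n# Review or edit manifest.tsv, then push it with syncToSynapse\nsynapseutils.syncToSynapse(syn, \"manifest.tsv\", sendMessages=False)\n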
"},{"location":"reference/synapse_utils/#synapseutils.sync.readManifestFile","title":"readManifestFile(syn, manifestFile)
","text":"Verifies a file manifest and returns a reordered dataframe ready for upload.
Read more about the manifest file format
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
manifestFile
A tsv file with file locations and metadata to be pushed to Synapse.
RETURNS DESCRIPTION
A pandas dataframe if the manifest is validated.
Source code insynapseutils/sync.py
@tracer.start_as_current_span(\"sync::readManifestFile\")\ndef readManifestFile(syn, manifestFile):\n \"\"\"Verifies a file manifest and returns a reordered dataframe ready for upload.\n\n [Read more about the manifest file format](../../explanations/manifest_tsv/)\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n manifestFile: A tsv file with file locations and metadata to be pushed to Synapse.\n\n Returns:\n A pandas dataframe if the manifest is validated.\n \"\"\"\n table.test_import_pandas()\n import pandas as pd\n\n if manifestFile is sys.stdin:\n sys.stdout.write(\"Validation and upload of: <stdin>\\n\")\n else:\n sys.stdout.write(\"Validation and upload of: %s\\n\" % manifestFile)\n # Read manifest file into pandas dataframe\n df = pd.read_csv(manifestFile, sep=\"\\t\")\n if \"synapseStore\" not in df:\n df = df.assign(synapseStore=None)\n df.loc[\n df[\"path\"].apply(is_url), \"synapseStore\"\n ] = False # override synapseStore values to False when path is a url\n df.loc[\n df[\"synapseStore\"].isnull(), \"synapseStore\"\n ] = True # remaining unset values default to True\n df.synapseStore = df.synapseStore.astype(bool)\n df = df.fillna(\"\")\n\n sys.stdout.write(\"Validating columns of manifest...\")\n for field in REQUIRED_FIELDS:\n sys.stdout.write(\".\")\n if field not in df.columns:\n sys.stdout.write(\"\\n\")\n raise ValueError(\"Manifest must contain a column of %s\" % field)\n sys.stdout.write(\"OK\\n\")\n\n sys.stdout.write(\"Validating that all paths exist...\")\n df.path = df.path.apply(_check_path_and_normalize)\n\n sys.stdout.write(\"OK\\n\")\n\n sys.stdout.write(\"Validating that all files are unique...\")\n # Both the path and the combination of entity name and parent must be unique\n if len(df.path) != len(set(df.path)):\n raise ValueError(\"All rows in manifest must contain a unique file to upload\")\n sys.stdout.write(\"OK\\n\")\n\n # Check each size of uploaded file\n sys.stdout.write(\"Validating that all the files are not empty...\")\n _check_size_each_file(df)\n sys.stdout.write(\"OK\\n\")\n\n # check the name of each file to be stored on Synapse\n name_column = \"name\"\n # Create entity name column from basename\n if name_column not in df.columns:\n filenames = [os.path.basename(path) for path in df[\"path\"]]\n df[\"name\"] = filenames\n\n sys.stdout.write(\"Validating file names... \\n\")\n _check_file_name(df)\n sys.stdout.write(\"OK\\n\")\n\n sys.stdout.write(\"Validating provenance...\")\n df = _sortAndFixProvenance(syn, df)\n sys.stdout.write(\"OK\\n\")\n\n sys.stdout.write(\"Validating that parents exist and are containers...\")\n parents = set(df.parent)\n for synId in parents:\n try:\n container = syn.get(synId, downloadFile=False)\n except SynapseHTTPError:\n sys.stdout.write(\n \"\\n%s in the parent column is not a valid Synapse Id\\n\" % synId\n )\n raise\n if not is_container(container):\n sys.stdout.write(\n \"\\n%s in the parent column is not a Folder or Project\\n\" % synId\n )\n raise SynapseHTTPError\n sys.stdout.write(\"OK\\n\")\n return df\n
"},{"location":"reference/synapse_utils/#synapseutils.copy_functions","title":"synapseutils.copy_functions
","text":""},{"location":"reference/synapse_utils/#synapseutils.copy_functions-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.copy_functions.copy","title":"copy(syn, entity, destinationId, skipCopyWikiPage=False, skipCopyAnnotations=False, **kwargs)
","text":"syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
entity
A synapse entity ID
destinationId
Synapse ID of a folder/project that the copied entity is being copied to
skipCopyWikiPage
Skip copying the wiki pages.
DEFAULT: False
skipCopyAnnotations
Skips copying the annotations.
DEFAULT: False
version
(File copy only) Can specify version of a file. Default to None
updateExisting
(File copy only) When the destination has an entity that has the same name, users can choose to update that entity. It must be the same entity type Default to False
setProvenance
(File copy only) Has three values to set the provenance of the copied entity: traceback: Sets to the source entity existing: Sets to source entity's original provenance (if it exists) None: No provenance is set
excludeTypes
(Folder/Project copy only) Accepts a list of entity types (file, table, link) which determines which entity types to not copy. Defaults to an empty list.
RETURNS DESCRIPTION
A mapping between the original and copied entity: {'syn1234':'syn33455'}
Using this functionSample copy:
import synapseutils\nimport synapseclient\nsyn = synapseclient.login()\nsynapseutils.copy(syn, ...)\n
Copying Files:
synapseutils.copy(syn, \"syn12345\", \"syn45678\", updateExisting=False, setProvenance = \"traceback\",version=None)\n
Copying Folders/Projects:
# This will copy everything in the project into the destinationId except files and tables.\nsynapseutils.copy(syn, \"syn123450\",\"syn345678\",excludeTypes=[\"file\",\"table\"])\n
Source code in synapseutils/copy_functions.py
def copy(\n syn,\n entity,\n destinationId,\n skipCopyWikiPage=False,\n skipCopyAnnotations=False,\n **kwargs,\n):\n \"\"\"\n - This function will assist users in copying entities (Tables, Links, Files, Folders, Projects),\n and will recursively copy everything in directories.\n - A Mapping of the old entities to the new entities will be created and all the wikis of each entity\n will also be copied over and links to synapse Ids will be updated.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: A synapse entity ID\n destinationId: Synapse ID of a folder/project that the copied entity is being copied to\n skipCopyWikiPage: Skip copying the wiki pages.\n skipCopyAnnotations: Skips copying the annotations.\n version: (File copy only) Can specify version of a file. Default to None\n updateExisting: (File copy only) When the destination has an entity that has the same name,\n users can choose to update that entity. It must be the same entity type\n Default to False\n setProvenance: (File copy only) Has three values to set the provenance of the copied entity:\n traceback: Sets to the source entity\n existing: Sets to source entity's original provenance (if it exists)\n None: No provenance is set\n excludeTypes: (Folder/Project copy only) Accepts a list of entity types (file, table, link) which determines\n which entity types to not copy. Defaults to an empty list.\n\n Returns:\n A mapping between the original and copied entity: {'syn1234':'syn33455'}\n\n Example: Using this function\n Sample copy:\n\n import synapseutils\n import synapseclient\n syn = synapseclient.login()\n synapseutils.copy(syn, ...)\n\n Copying Files:\n\n synapseutils.copy(syn, \"syn12345\", \"syn45678\", updateExisting=False, setProvenance = \"traceback\",version=None)\n\n Copying Folders/Projects:\n\n # This will copy everything in the project into the destinationId except files and tables.\n synapseutils.copy(syn, \"syn123450\",\"syn345678\",excludeTypes=[\"file\",\"table\"])\n \"\"\"\n updateLinks = kwargs.get(\"updateLinks\", True)\n updateSynIds = kwargs.get(\"updateSynIds\", True)\n entitySubPageId = kwargs.get(\"entitySubPageId\", None)\n destinationSubPageId = kwargs.get(\"destinationSubPageId\", None)\n\n mapping = _copyRecursive(\n syn, entity, destinationId, skipCopyAnnotations=skipCopyAnnotations, **kwargs\n )\n if not skipCopyWikiPage:\n for oldEnt in mapping:\n copyWiki(\n syn,\n oldEnt,\n mapping[oldEnt],\n entitySubPageId=entitySubPageId,\n destinationSubPageId=destinationSubPageId,\n updateLinks=updateLinks,\n updateSynIds=updateSynIds,\n entityMap=mapping,\n )\n return mapping\n
"},{"location":"reference/synapse_utils/#synapseutils.copy_functions.changeFileMetaData","title":"changeFileMetaData(syn, entity, downloadAs=None, contentType=None, forceVersion=True)
","text":"Change File Entity metadata like the download as name.
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
entity
Synapse entity Id or object.
contentType
Specify content type to change the content type of a filehandle.
DEFAULT: None
downloadAs
Specify filename to change the filename of a filehandle.
DEFAULT: None
forceVersion
Indicates whether the method should increment the version of the object even if nothing has changed. Defaults to True.
DEFAULT: True
Synapse Entity
Using this functionCan be used to change the filename or the file content-type without downloading:
file_entity = syn.get(synid)\nprint(os.path.basename(file_entity.path)) ## prints, e.g., \"my_file.txt\"\nfile_entity = synapseutils.changeFileMetaData(syn, file_entity, \"my_new_name_file.txt\")\n
Source code in synapseutils/copy_functions.py
def changeFileMetaData(\n syn, entity, downloadAs=None, contentType=None, forceVersion=True\n):\n \"\"\"\n Change File Entity metadata like the download as name.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: Synapse entity Id or object.\n contentType: Specify content type to change the content type of a filehandle.\n downloadAs: Specify filename to change the filename of a filehandle.\n forceVersion: Indicates whether the method should increment the version of the object even if nothing has changed. Defaults to True.\n\n Returns:\n Synapse Entity\n\n Example: Using this function\n Can be used to change the filename or the file content-type without downloading:\n\n file_entity = syn.get(synid)\n print(os.path.basename(file_entity.path)) ## prints, e.g., \"my_file.txt\"\n file_entity = synapseutils.changeFileMetaData(syn, file_entity, \"my_new_name_file.txt\")\n \"\"\"\n ent = syn.get(entity, downloadFile=False)\n fileResult = syn._getFileHandleDownload(ent.dataFileHandleId, ent.id)\n ent.contentType = ent.contentType if contentType is None else contentType\n downloadAs = (\n fileResult[\"fileHandle\"][\"fileName\"] if downloadAs is None else downloadAs\n )\n copiedFileHandle = copyFileHandles(\n syn,\n [ent.dataFileHandleId],\n [ent.concreteType.split(\".\")[-1]],\n [ent.id],\n [contentType],\n [downloadAs],\n )\n copyResult = copiedFileHandle[0]\n if copyResult.get(\"failureCode\") is not None:\n raise ValueError(\n \"%s dataFileHandleId: %s\"\n % (copyResult[\"failureCode\"], copyResult[\"originalFileHandleId\"])\n )\n ent.dataFileHandleId = copyResult[\"newFileHandle\"][\"id\"]\n ent = syn.store(ent, forceVersion=forceVersion)\n return ent\n
"},{"location":"reference/synapse_utils/#synapseutils.copy_functions.copyFileHandles","title":"copyFileHandles(syn, fileHandles, associateObjectTypes, associateObjectIds, newContentTypes=None, newFileNames=None)
","text":"Given a list of fileHandle Ids or Objects, copy the fileHandles
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
fileHandles
List of fileHandle Ids or Objects
associateObjectTypes
List of associated object types: FileEntity, TableEntity, WikiAttachment, UserProfileAttachment, MessageAttachment, TeamAttachment, SubmissionAttachment, VerificationSubmission (Must be the same length as fileHandles)
associateObjectIds
List of associated object Ids: If copying a file, the objectId is the synapse id, and if copying a wiki attachment, the object id is the wiki subpage id. (Must be the same length as fileHandles)
newContentTypes
(Optional) List of content types. Set each item to a new content type for each file handle, or leave the item as None to keep the original content type. Default None, which keeps all original content types.
DEFAULT: None
newFileNames
(Optional) List of filenames. Set each item to a new filename for each file handle, or leave the item as None to keep the original name. Default None, which keeps all original file names.
DEFAULT: None
List of batch filehandle copy results, can include failureCodes: UNAUTHORIZED and NOT_FOUND
RAISES DESCRIPTION
ValueError
If the lengths of all input arguments are not the same
Source code insynapseutils/copy_functions.py
def copyFileHandles(\n syn,\n fileHandles,\n associateObjectTypes,\n associateObjectIds,\n newContentTypes=None,\n newFileNames=None,\n):\n \"\"\"\n Given a list of fileHandle Ids or Objects, copy the fileHandles\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n fileHandles: List of fileHandle Ids or Objects\n associateObjectTypes: List of associated object types: FileEntity, TableEntity, WikiAttachment,\n UserProfileAttachment, MessageAttachment, TeamAttachment, SubmissionAttachment,\n VerificationSubmission (Must be the same length as fileHandles)\n associateObjectIds: List of associated object Ids: If copying a file, the objectId is the synapse id,\n and if copying a wiki attachment, the object id is the wiki subpage id.\n (Must be the same length as fileHandles)\n newContentTypes: (Optional) List of content types. Set each item to a new content type for each file\n handle, or leave the item as None to keep the original content type. Default None,\n which keeps all original content types.\n newFileNames: (Optional) List of filenames. Set each item to a new filename for each file handle,\n or leave the item as None to keep the original name. Default None, which keeps all\n original file names.\n\n Returns:\n List of batch filehandle copy results, can include failureCodes: UNAUTHORIZED and NOT_FOUND\n\n Raises:\n ValueError: If length of all input arguments are not the same\n \"\"\"\n\n # Check if length of all inputs are equal\n if not (\n len(fileHandles) == len(associateObjectTypes) == len(associateObjectIds)\n and (newContentTypes is None or len(newContentTypes) == len(associateObjectIds))\n and (newFileNames is None or len(newFileNames) == len(associateObjectIds))\n ):\n raise ValueError(\"Length of all input arguments must be the same\")\n\n # If no optional params passed, assign to empty list\n if newContentTypes is None:\n newContentTypes = []\n if newFileNames is None:\n newFileNames = []\n\n # Remove this line if we change API to only take fileHandleIds and not Objects\n file_handle_ids = [synapseclient.core.utils.id_of(handle) for handle in fileHandles]\n\n # division logic for POST call here\n master_copy_results_list = [] # list which holds all results from POST call\n for (\n batch_file_handles_ids,\n batch_assoc_obj_types,\n batch_assoc_obj_ids,\n batch_con_type,\n batch_file_name,\n ) in _batch_iterator_generator(\n [\n file_handle_ids,\n associateObjectTypes,\n associateObjectIds,\n newContentTypes,\n newFileNames,\n ],\n MAX_FILE_HANDLE_PER_COPY_REQUEST,\n ):\n batch_copy_results = _copy_file_handles_batch(\n syn,\n batch_file_handles_ids,\n batch_assoc_obj_types,\n batch_assoc_obj_ids,\n batch_con_type,\n batch_file_name,\n )\n master_copy_results_list.extend(batch_copy_results)\n\n return master_copy_results_list\n
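A sketch of copying the file handle behind a single File entity while renaming it; syn123 and the new file name are placeholders:
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nent = syn.get(\"syn123\", downloadFile=False)\ncopied = synapseutils.copyFileHandles(\n syn,\n [ent.dataFileHandleId],\n [\"FileEntity\"],\n [ent.id],\n newContentTypes=[\"text/plain\"],\n newFileNames=[\"renamed.txt\"],\n)\nprint(copied[0][\"newFileHandle\"][\"id\"])\n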
"},{"location":"reference/synapse_utils/#synapseutils.copy_functions.copyWiki","title":"copyWiki(syn, entity, destinationId, entitySubPageId=None, destinationSubPageId=None, updateLinks=True, updateSynIds=True, entityMap=None)
","text":"Copies wikis and updates internal links
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
entity
A synapse ID of an entity whose wiki you want to copy
destinationId
Synapse ID of a folder/project that the wiki wants to be copied to
updateLinks
Update all the internal links. (e.g. syn1234/wiki/34345 becomes syn3345/wiki/49508)
DEFAULT: True
updateSynIds
Update all the Synapse IDs referenced in the wikis (e.g. syn1234 becomes syn2345). Defaults to True but requires an entityMap
DEFAULT: True
entityMap
An entity map {'oldSynId':'newSynId'} to update the Synapse IDs referenced in the wiki.
DEFAULT: None
entitySubPageId
Can specify a subPageId to copy all of its subwikis. Defaults to None, which copies the entire wiki. The subPageId can be found in the wiki URL: in https://www.synapse.org/#!Synapse:syn123/wiki/1234, the subPageId is 1234.
DEFAULT: None
destinationSubPageId
Can specify destination subPageId to copy wikis to.
DEFAULT: None
A list of Objects with three fields: id, title and parentId.
Source code insynapseutils/copy_functions.py
def copyWiki(\n syn,\n entity,\n destinationId,\n entitySubPageId=None,\n destinationSubPageId=None,\n updateLinks=True,\n updateSynIds=True,\n entityMap=None,\n):\n \"\"\"\n Copies wikis and updates internal links\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: A synapse ID of an entity whose wiki you want to copy\n destinationId: Synapse ID of a folder/project that the wiki wants to be copied to\n updateLinks: Update all the internal links. (e.g. syn1234/wiki/34345 becomes syn3345/wiki/49508)\n updateSynIds: Update all the synapse ID's referenced in the wikis. (e.g. syn1234 becomes syn2345)\n Defaults to True but needs an entityMap\n entityMap: An entity map {'oldSynId','newSynId'} to update the synapse IDs referenced in the wiki.\n entitySubPageId: Can specify subPageId and copy all of its subwikis\n Defaults to None, which copies the entire wiki subPageId can be found:\n https://www.synapse.org/#!Synapse:syn123/wiki/1234\n In this case, 1234 is the subPageId.\n destinationSubPageId: Can specify destination subPageId to copy wikis to.\n\n Returns:\n A list of Objects with three fields: id, title and parentId.\n \"\"\"\n\n # Validate input parameters\n if entitySubPageId:\n entitySubPageId = str(int(entitySubPageId))\n if destinationSubPageId:\n destinationSubPageId = str(int(destinationSubPageId))\n\n oldOwn = syn.get(entity, downloadFile=False)\n # getWikiHeaders fails when there is no wiki\n\n try:\n oldWikiHeaders = syn.getWikiHeaders(oldOwn)\n except SynapseHTTPError as e:\n if e.response.status_code == 404:\n return []\n else:\n raise e\n\n newOwn = syn.get(destinationId, downloadFile=False)\n wikiIdMap = dict()\n newWikis = dict()\n # If entitySubPageId is given but not destinationSubPageId, set the pageId to \"\" (will get the root page)\n # A entitySubPage could be copied to a project without any wiki pages, this has to be checked\n newWikiPage = None\n if destinationSubPageId:\n try:\n newWikiPage = syn.getWiki(newOwn, destinationSubPageId)\n except SynapseHTTPError as e:\n if e.response.status_code == 404:\n pass\n else:\n raise e\n if entitySubPageId:\n oldWikiHeaders = _getSubWikiHeaders(oldWikiHeaders, entitySubPageId)\n\n if not oldWikiHeaders:\n return []\n\n for wikiHeader in oldWikiHeaders:\n wiki = syn.getWiki(oldOwn, wikiHeader[\"id\"])\n syn.logger.info(\"Got wiki %s\" % wikiHeader[\"id\"])\n if not wiki.get(\"attachmentFileHandleIds\"):\n new_file_handles = []\n else:\n results = [\n syn._getFileHandleDownload(\n filehandleId, wiki.id, objectType=\"WikiAttachment\"\n )\n for filehandleId in wiki[\"attachmentFileHandleIds\"]\n ]\n # Get rid of the previews\n nopreviews = [\n attach[\"fileHandle\"]\n for attach in results\n if not attach[\"fileHandle\"][\"isPreview\"]\n ]\n contentTypes = [attach[\"contentType\"] for attach in nopreviews]\n fileNames = [attach[\"fileName\"] for attach in nopreviews]\n copiedFileHandles = copyFileHandles(\n syn,\n nopreviews,\n [\"WikiAttachment\"] * len(nopreviews),\n [wiki.id] * len(nopreviews),\n contentTypes,\n fileNames,\n )\n # Check if failurecodes exist\n for filehandle in copiedFileHandles:\n if filehandle.get(\"failureCode\") is not None:\n raise ValueError(\n \"%s dataFileHandleId: %s\"\n % (\n filehandle[\"failureCode\"],\n filehandle[\"originalFileHandleId\"],\n )\n )\n new_file_handles = [\n filehandle[\"newFileHandle\"][\"id\"] for filehandle in copiedFileHandles\n ]\n # for some reason some wikis don't have titles?\n if hasattr(wikiHeader, \"parentId\"):\n newWikiPage 
= Wiki(\n owner=newOwn,\n title=wiki.get(\"title\", \"\"),\n markdown=wiki.markdown,\n fileHandles=new_file_handles,\n parentWikiId=wikiIdMap[wiki.parentWikiId],\n )\n newWikiPage = syn.store(newWikiPage)\n else:\n if destinationSubPageId is not None and newWikiPage is not None:\n newWikiPage[\"attachmentFileHandleIds\"] = new_file_handles\n newWikiPage[\"markdown\"] = wiki[\"markdown\"]\n newWikiPage[\"title\"] = wiki.get(\"title\", \"\")\n # Need to add logic to update titles here\n newWikiPage = syn.store(newWikiPage)\n else:\n newWikiPage = Wiki(\n owner=newOwn,\n title=wiki.get(\"title\", \"\"),\n markdown=wiki.markdown,\n fileHandles=new_file_handles,\n parentWikiId=destinationSubPageId,\n )\n newWikiPage = syn.store(newWikiPage)\n newWikis[newWikiPage[\"id\"]] = newWikiPage\n wikiIdMap[wiki[\"id\"]] = newWikiPage[\"id\"]\n\n if updateLinks:\n syn.logger.info(\"Updating internal links:\\n\")\n newWikis = _updateInternalLinks(newWikis, wikiIdMap, entity, destinationId)\n syn.logger.info(\"Done updating internal links.\\n\")\n\n if updateSynIds and entityMap is not None:\n syn.logger.info(\"Updating Synapse references:\\n\")\n newWikis = _updateSynIds(newWikis, wikiIdMap, entityMap)\n syn.logger.info(\"Done updating Synapse IDs.\\n\")\n\n syn.logger.info(\"Storing new Wikis\\n\")\n for oldWikiId in wikiIdMap.keys():\n newWikiId = wikiIdMap[oldWikiId]\n newWikis[newWikiId] = syn.store(newWikis[newWikiId])\n syn.logger.info(\"\\tStored: %s\\n\" % newWikiId)\n return syn.getWikiHeaders(newOwn)\n
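A sketch of copying an entire wiki between two projects; syn123 and syn456 are placeholder ids:
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nheaders = synapseutils.copyWiki(syn, \"syn123\", \"syn456\")\nfor header in headers:\n print(header[\"id\"], header.get(\"title\"))\n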
"},{"location":"reference/synapse_utils/#synapseutils.walk_functions","title":"synapseutils.walk_functions
","text":""},{"location":"reference/synapse_utils/#synapseutils.walk_functions-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.walk_functions.walk","title":"walk(syn, synId, includeTypes=['folder', 'file', 'table', 'link', 'entityview', 'dockerrepo', 'submissionview', 'dataset', 'materializedview'])
","text":"Traverse through the hierarchy of files and folders stored under the synId. Has the same behavior as os.walk()
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
synId
A synapse ID of a folder or project
includeTypes
Must be a list of entity types (i.e. [\"file\", \"table\"]). The \"folder\" type is always included so the hierarchy can be traversed
DEFAULT: ['folder', 'file', 'table', 'link', 'entityview', 'dockerrepo', 'submissionview', 'dataset', 'materializedview']
Traversing through a project and printing out the directory path, folders, and files
walkedPath = walk(syn, \"syn1234\", [\"file\"]) #Exclude tables and views\n\nfor dirpath, dirname, filename in walkedPath:\n print(dirpath)\n print(dirname) #All the folders in the directory path\n print(filename) #All the files in the directory path\n
Source code in synapseutils/walk_functions.py
def walk(\n syn,\n synId,\n includeTypes=[\n \"folder\",\n \"file\",\n \"table\",\n \"link\",\n \"entityview\",\n \"dockerrepo\",\n \"submissionview\",\n \"dataset\",\n \"materializedview\",\n ],\n):\n \"\"\"\n Traverse through the hierarchy of files and folders stored under the synId. Has the same behavior as os.walk()\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n synId: A synapse ID of a folder or project\n includeTypes: Must be a list of entity types (ie.[\"file\", \"table\"])\n The \"folder\" type is always included so the hierarchy can be traversed\n\n Example: Using this function\n Traversing through a project and printing out the directory path, folders, and files\n\n walkedPath = walk(syn, \"syn1234\", [\"file\"]) #Exclude tables and views\n\n for dirpath, dirname, filename in walkedPath:\n print(dirpath)\n print(dirname) #All the folders in the directory path\n print(filename) #All the files in the directory path\n\n \"\"\"\n # Ensure that \"folder\" is included so the hierarchy can be traversed\n if \"folder\" not in includeTypes:\n includeTypes.append(\"folder\")\n return _helpWalk(syn, synId, includeTypes)\n
"},{"location":"reference/synapse_utils/#synapseutils.monitor","title":"synapseutils.monitor
","text":""},{"location":"reference/synapse_utils/#synapseutils.monitor-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.monitor.notifyMe","title":"notifyMe(syn, messageSubject='', retries=0)
","text":"Function decorator that notifies you via email whenever an function completes running or there is a failure.
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
messageSubject
A string with subject line for sent out messages.
DEFAULT: ''
retries
Number of retries to attempt on failure
DEFAULT: 0
As a decorator:
# to decorate a function that you define\nfrom synapseutils import notifyMe\nimport synapseclient\nsyn = synapseclient.login()\n\n@notifyMe(syn, 'Long running function', retries=2)\ndef my_function(x):\n doing_something()\n return long_runtime_func(x)\n\nmy_function(123)\n
Wrapping a function:
# to wrap a function that already exists\nfrom synapseutils import notifyMe\nimport synapseclient\nsyn = synapseclient.login()\n\nnotify_decorator = notifyMe(syn, 'Long running query', retries=2)\nmy_query = notify_decorator(syn.tableQuery)\nresults = my_query(\"select id from syn1223\")\n
Source code in synapseutils/monitor.py
def notifyMe(syn, messageSubject=\"\", retries=0):\n \"\"\"Function decorator that notifies you via email whenever an function completes running or there is a failure.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n messageSubject: A string with subject line for sent out messages.\n retries: Number of retries to attempt on failure\n\n Example: Using this function\n As a decorator:\n\n # to decorate a function that you define\n from synapseutils import notifyMe\n import synapseclient\n syn = synapseclient.login()\n\n @notifyMe(syn, 'Long running function', retries=2)\n def my_function(x):\n doing_something()\n return long_runtime_func(x)\n\n my_function(123)\n\n Wrapping a function:\n\n # to wrap a function that already exists\n from synapseutils import notifyMe\n import synapseclient\n syn = synapseclient.login()\n\n notify_decorator = notifyMe(syn, 'Long running query', retries=2)\n my_query = notify_decorator(syn.tableQuery)\n results = my_query(\"select id from syn1223\")\n \"\"\"\n\n def notify_decorator(func):\n @functools.wraps(func)\n def with_retry_and_messaging(*args, **kwargs):\n attempt = 0\n destination = syn.getUserProfile()[\"ownerId\"]\n while attempt <= retries:\n try:\n output = func(*args, **kwargs)\n syn.sendMessage(\n [destination],\n messageSubject,\n messageBody=\"Call to %s completed successfully!\"\n % func.__name__,\n )\n return output\n except Exception as e:\n sys.stderr.write(traceback.format_exc())\n syn.sendMessage(\n [destination],\n messageSubject,\n messageBody=(\n \"Encountered a temporary Failure during upload. \"\n \"Will retry %i more times. \\n\\n Error message was:\\n%s\\n\\n%s\"\n % (retries - attempt, e, traceback.format_exc())\n ),\n )\n attempt += 1\n\n return with_retry_and_messaging\n\n return notify_decorator\n
"},{"location":"reference/synapse_utils/#synapseutils.monitor.with_progress_bar","title":"with_progress_bar(func, totalCalls, prefix='', postfix='', isBytes=False)
","text":"Wraps a function to add a progress bar based on the number of calls to that function.
PARAMETER DESCRIPTION
func
Function being wrapped with progress Bar
totalCalls
total number of items/bytes when completed
prefix
String printed before progress bar
DEFAULT: ''
postfix
String printed after progress bar
DEFAULT: ''
isBytes
A boolean indicating whether to convert bytes to kB, MB, GB, etc.
DEFAULT: False
A wrapped function that contains a progress bar
Source code insynapseutils/monitor.py
def with_progress_bar(func, totalCalls, prefix=\"\", postfix=\"\", isBytes=False):\n \"\"\"Wraps a function to add a progress bar based on the number of calls to that function.\n\n Arguments:\n func: Function being wrapped with progress Bar\n totalCalls: total number of items/bytes when completed\n prefix: String printed before progress bar\n postfix: String printed after progress bar\n isBytes: A boolean indicating whether to convert bytes to kB, MB, GB etc.\n\n Returns:\n A wrapped function that contains a progress bar\n \"\"\"\n completed = Value(\"d\", 0)\n lock = Lock()\n\n def progress(*args, **kwargs):\n with lock:\n completed.value += 1\n printTransferProgress(completed.value, totalCalls, prefix, postfix, isBytes)\n return func(*args, **kwargs)\n\n return progress\n
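A sketch of wrapping a per-item function so that each call advances the bar, assuming with_progress_bar is importable from synapseutils as documented here; the work items and process function are placeholders:
from synapseutils import with_progress_bar\n\nfiles = [\"a.txt\", \"b.txt\", \"c.txt\"]\ndef process(path):\n ... # placeholder for per-file work\n\nwrapped = with_progress_bar(process, len(files), prefix=\"Processing\")\nfor f in files:\n wrapped(f)\n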
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions","title":"synapseutils.migrate_functions
","text":""},{"location":"reference/synapse_utils/#synapseutils.migrate_functions-classes","title":"Classes","text":""},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.MigrationResult","title":"MigrationResult
","text":"A MigrationResult is a proxy object to the underlying sqlite db. It provides a programmatic interface that allows the caller to iterate over the file handles that were migrated without having to connect to or know the schema of the sqlite db, and also avoids the potential memory liability of putting everything into an in memory data structure that could be a liability when migrating a huge project of hundreds of thousands/millions of entities.
Note that this proxy object is not thread safe, since it accesses an underlying sqlite db.
Source code insynapseutils/migrate_functions.py
class MigrationResult:\n \"\"\"A MigrationResult is a proxy object to the underlying sqlite db.\n It provides a programmatic interface that allows the caller to iterate over the\n file handles that were migrated without having to connect to or know the schema\n of the sqlite db, and also avoids the potential memory liability of putting\n everything into an in memory data structure that could be a liability when\n migrating a huge project of hundreds of thousands/millions of entities.\n\n As this proxy object is not thread safe since it accesses an underlying sqlite db.\n \"\"\"\n\n def __init__(self, syn, db_path):\n self._syn = syn\n self.db_path = db_path\n\n def get_counts_by_status(self):\n \"\"\"\n Returns a dictionary of counts by the migration status of each indexed file/version.\n Keys are as follows:\n\n - `INDEXED` - the file/version has been indexed and will be migrated on a call to migrate_indexed_files\n - `MIGRATED` - the file/version has been migrated\n - `ALREADY_MIGRATED` - the file/version was already stored at the target storage location and no migration is needed\n - `ERRORED` - an error occurred while indexing or migrating the file/version\n \"\"\" # noqa\n import sqlite3\n\n with sqlite3.connect(self.db_path) as conn:\n cursor = conn.cursor()\n\n # for the purposes of these counts, containers (Projects and Folders) do not count.\n # we are counting actual files only\n result = cursor.execute(\n \"select status, count(*) from migrations where type in (?, ?) group by status\",\n (_MigrationType.FILE.value, _MigrationType.TABLE_ATTACHED_FILE.value),\n )\n\n counts_by_status = {status.name: 0 for status in _MigrationStatus}\n for row in result:\n status = row[0]\n count = row[1]\n counts_by_status[_MigrationStatus(status).name] = count\n\n return counts_by_status\n\n def get_migrations(self):\n \"\"\"\n A generator yielding each file/version in the migration index.\n A dictionary of the properties of the migration row is yielded as follows\n\n Yields:\n id: the Synapse id\n type: the concrete type of the entity\n version: the verson of the file entity (if applicable)\n row_id: the row of the table attached file (if applicable)\n col_id: the column id of the table attached file (if applicable)\n from_storage_location_id: - the previous storage location id where the file/version was stored\n from_file_handle_id: the id file handle of the existing file/version\n to_file_handle_id: if migrated, the new file handle id\n status: one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version\n exception: if an error was encountered indexing/migrating the file/version its stack is here\n \"\"\"\n import sqlite3\n\n with sqlite3.connect(self.db_path) as conn:\n cursor = conn.cursor()\n\n last_id = None\n column_names = None\n\n rowid = -1\n while True:\n results = cursor.execute(\n \"\"\"\n select\n rowid,\n\n id,\n type,\n version,\n row_id,\n col_id,\n from_storage_location_id,\n from_file_handle_id,\n to_file_handle_id,\n file_size,\n status,\n exception\n from migrations\n where\n rowid > ?\n and type in (?, ?)\n order by\n rowid\n limit ?\n \"\"\",\n (\n rowid,\n _MigrationType.FILE.value,\n _MigrationType.TABLE_ATTACHED_FILE.value,\n _get_batch_size(),\n ),\n )\n\n row_count = 0\n for row in results:\n row_count += 1\n\n # using the internal sqlite rowid for ordering only\n rowid = row[0]\n\n # exclude the sqlite internal rowid\n row_dict = _get_row_dict(cursor, row, False)\n entity_id = row_dict[\"id\"]\n if entity_id != last_id:\n # if the 
next row is dealing with a different entity than the last table\n # id then we discard any cached column names we looked up\n column_names = {}\n\n row_dict[\"type\"] = (\n \"file\"\n if row_dict[\"type\"] == _MigrationType.FILE.value\n else \"table\"\n )\n\n for int_arg in (\n \"version\",\n \"row_id\",\n \"from_storage_location_id\",\n \"from_file_handle_id\",\n \"to_file_handle_id\",\n ):\n int_val = row_dict.get(int_arg)\n if int_val is not None:\n row_dict[int_arg] = int(int_val)\n\n col_id = row_dict.pop(\"col_id\", None)\n if col_id is not None:\n column_name = column_names.get(col_id)\n\n # for usability we look up the actual column name from the id,\n # but that involves a lookup so we cache them for re-use across\n # rows that deal with the same table entity\n if column_name is None:\n column = self._syn.restGET(\"/column/{}\".format(col_id))\n column_name = column_names[col_id] = column[\"name\"]\n\n row_dict[\"col_name\"] = column_name\n\n row_dict[\"status\"] = _MigrationStatus(row_dict[\"status\"]).name\n\n yield row_dict\n\n last_id = entity_id\n\n if row_count == 0:\n # out of rows\n break\n\n def as_csv(self, path):\n \"\"\"\n Output a flat csv file of the contents of the Migration index.\n\n Arguments:\n path: The path to the csv file to be created\n\n Returns:\n None: But a csv file is created at the given path with the following columns:\n id: the Synapse id\n type: the concrete type of the entity\n version: the verson of the file entity (if applicable)\n row_id: the row of the table attached file (if applicable)\n col_name: the column name of the column the table attached file resides in (if applicable)\n from_storage_location_id: the previous storage location id where the file/version was stored\n from_file_handle_id: the id file handle of the existing file/version\n to_file_handle_id: if migrated, the new file handle id\n status: one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version\n exception: if an error was encountered indexing/migrating the file/version its stack is here\n\n \"\"\"\n\n with open(path, \"w\", newline=\"\") as csv_file:\n csv_writer = csv.writer(csv_file)\n\n # headers\n csv_writer.writerow(\n [\n \"id\",\n \"type\",\n \"version\",\n \"row_id\",\n \"col_name\",\n \"from_storage_location_id\",\n \"from_file_handle_id\",\n \"to_file_handle_id\",\n \"status\",\n \"exception\",\n ]\n )\n\n for row_dict in self.get_migrations():\n row_data = [\n row_dict[\"id\"],\n row_dict[\"type\"],\n row_dict.get(\"version\"),\n row_dict.get(\"row_id\"),\n row_dict.get(\"col_name\"),\n row_dict.get(\"from_storage_location_id\"),\n row_dict.get(\"from_file_handle_id\"),\n row_dict.get(\"to_file_handle_id\"),\n row_dict[\"status\"],\n row_dict.get(\"exception\"),\n ]\n\n csv_writer.writerow(row_data)\n
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.MigrationResult-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.MigrationResult.get_counts_by_status","title":"get_counts_by_status()
","text":"Returns a dictionary of counts by the migration status of each indexed file/version. Keys are as follows:
INDEXED
- the file/version has been indexed and will be migrated on a call to migrate_indexed_files
MIGRATED
- the file/version has been migrated
ALREADY_MIGRATED
- the file/version was already stored at the target storage location and no migration is needed
ERRORED
- an error occurred while indexing or migrating the file/version
Source code in synapseutils/migrate_functions.py
def get_counts_by_status(self):\n \"\"\"\n Returns a dictionary of counts by the migration status of each indexed file/version.\n Keys are as follows:\n\n - `INDEXED` - the file/version has been indexed and will be migrated on a call to migrate_indexed_files\n - `MIGRATED` - the file/version has been migrated\n - `ALREADY_MIGRATED` - the file/version was already stored at the target storage location and no migration is needed\n - `ERRORED` - an error occurred while indexing or migrating the file/version\n \"\"\" # noqa\n import sqlite3\n\n with sqlite3.connect(self.db_path) as conn:\n cursor = conn.cursor()\n\n # for the purposes of these counts, containers (Projects and Folders) do not count.\n # we are counting actual files only\n result = cursor.execute(\n \"select status, count(*) from migrations where type in (?, ?) group by status\",\n (_MigrationType.FILE.value, _MigrationType.TABLE_ATTACHED_FILE.value),\n )\n\n counts_by_status = {status.name: 0 for status in _MigrationStatus}\n for row in result:\n status = row[0]\n count = row[1]\n counts_by_status[_MigrationStatus(status).name] = count\n\n return counts_by_status\n
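For example, a minimal sketch of summarizing a run, assuming result is a MigrationResult returned by index_files_for_migration or migrate_indexed_files:
counts = result.get_counts_by_status()\nprint(\"indexed: {}, migrated: {}, errored: {}\".format(\n    counts[\"INDEXED\"], counts[\"MIGRATED\"], counts[\"ERRORED\"]))\n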
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.MigrationResult.get_migrations","title":"get_migrations()
","text":"A generator yielding each file/version in the migration index. A dictionary of the properties of the migration row is yielded as follows
YIELDS DESCRIPTION
id
the Synapse id
type
the concrete type of the entity
version
the version of the file entity (if applicable)
row_id
the row of the table attached file (if applicable)
col_id
the column id of the table attached file (if applicable)
from_storage_location_id
the previous storage location id where the file/version was stored
from_file_handle_id
the file handle id of the existing file/version
to_file_handle_id
if migrated, the new file handle id
status
one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version
exception
if an error was encountered while indexing/migrating the file/version, its stack trace is recorded here
Source code in synapseutils/migrate_functions.py
def get_migrations(self):\n \"\"\"\n A generator yielding each file/version in the migration index.\n A dictionary of the properties of the migration row is yielded as follows\n\n Yields:\n id: the Synapse id\n type: the concrete type of the entity\n version: the verson of the file entity (if applicable)\n row_id: the row of the table attached file (if applicable)\n col_id: the column id of the table attached file (if applicable)\n from_storage_location_id: - the previous storage location id where the file/version was stored\n from_file_handle_id: the id file handle of the existing file/version\n to_file_handle_id: if migrated, the new file handle id\n status: one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version\n exception: if an error was encountered indexing/migrating the file/version its stack is here\n \"\"\"\n import sqlite3\n\n with sqlite3.connect(self.db_path) as conn:\n cursor = conn.cursor()\n\n last_id = None\n column_names = None\n\n rowid = -1\n while True:\n results = cursor.execute(\n \"\"\"\n select\n rowid,\n\n id,\n type,\n version,\n row_id,\n col_id,\n from_storage_location_id,\n from_file_handle_id,\n to_file_handle_id,\n file_size,\n status,\n exception\n from migrations\n where\n rowid > ?\n and type in (?, ?)\n order by\n rowid\n limit ?\n \"\"\",\n (\n rowid,\n _MigrationType.FILE.value,\n _MigrationType.TABLE_ATTACHED_FILE.value,\n _get_batch_size(),\n ),\n )\n\n row_count = 0\n for row in results:\n row_count += 1\n\n # using the internal sqlite rowid for ordering only\n rowid = row[0]\n\n # exclude the sqlite internal rowid\n row_dict = _get_row_dict(cursor, row, False)\n entity_id = row_dict[\"id\"]\n if entity_id != last_id:\n # if the next row is dealing with a different entity than the last table\n # id then we discard any cached column names we looked up\n column_names = {}\n\n row_dict[\"type\"] = (\n \"file\"\n if row_dict[\"type\"] == _MigrationType.FILE.value\n else \"table\"\n )\n\n for int_arg in (\n \"version\",\n \"row_id\",\n \"from_storage_location_id\",\n \"from_file_handle_id\",\n \"to_file_handle_id\",\n ):\n int_val = row_dict.get(int_arg)\n if int_val is not None:\n row_dict[int_arg] = int(int_val)\n\n col_id = row_dict.pop(\"col_id\", None)\n if col_id is not None:\n column_name = column_names.get(col_id)\n\n # for usability we look up the actual column name from the id,\n # but that involves a lookup so we cache them for re-use across\n # rows that deal with the same table entity\n if column_name is None:\n column = self._syn.restGET(\"/column/{}\".format(col_id))\n column_name = column_names[col_id] = column[\"name\"]\n\n row_dict[\"col_name\"] = column_name\n\n row_dict[\"status\"] = _MigrationStatus(row_dict[\"status\"]).name\n\n yield row_dict\n\n last_id = entity_id\n\n if row_count == 0:\n # out of rows\n break\n
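A minimal sketch of consuming this generator, again assuming result is a MigrationResult from a prior run; it prints every file/version that errored:
for row in result.get_migrations():\n    # each row is a plain dict; status is one of the names listed above\n    if row[\"status\"] == \"ERRORED\":\n        print(row[\"id\"], row.get(\"version\"), row.get(\"exception\"))\n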
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.MigrationResult.as_csv","title":"as_csv(path)
","text":"Output a flat csv file of the contents of the Migration index.
PARAMETER DESCRIPTION
path
The path to the csv file to be created
RETURNS DESCRIPTION
None
But a csv file is created at the given path with the following columns:
id
the Synapse id
type
the concrete type of the entity
version
the version of the file entity (if applicable)
row_id
the row of the table attached file (if applicable)
col_name
the column name of the column the table attached file resides in (if applicable)
from_storage_location_id
the previous storage location id where the file/version was stored
from_file_handle_id
the file handle id of the existing file/version
to_file_handle_id
if migrated, the new file handle id
status
one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version
exception
if an error was encountered while indexing/migrating the file/version, its stack trace is recorded here
Source code in synapseutils/migrate_functions.py
def as_csv(self, path):\n \"\"\"\n Output a flat csv file of the contents of the Migration index.\n\n Arguments:\n path: The path to the csv file to be created\n\n Returns:\n None: But a csv file is created at the given path with the following columns:\n id: the Synapse id\n type: the concrete type of the entity\n version: the verson of the file entity (if applicable)\n row_id: the row of the table attached file (if applicable)\n col_name: the column name of the column the table attached file resides in (if applicable)\n from_storage_location_id: the previous storage location id where the file/version was stored\n from_file_handle_id: the id file handle of the existing file/version\n to_file_handle_id: if migrated, the new file handle id\n status: one of INDEXED, MIGRATED, ALREADY_MIGRATED, ERRORED indicating the status of the file/version\n exception: if an error was encountered indexing/migrating the file/version its stack is here\n\n \"\"\"\n\n with open(path, \"w\", newline=\"\") as csv_file:\n csv_writer = csv.writer(csv_file)\n\n # headers\n csv_writer.writerow(\n [\n \"id\",\n \"type\",\n \"version\",\n \"row_id\",\n \"col_name\",\n \"from_storage_location_id\",\n \"from_file_handle_id\",\n \"to_file_handle_id\",\n \"status\",\n \"exception\",\n ]\n )\n\n for row_dict in self.get_migrations():\n row_data = [\n row_dict[\"id\"],\n row_dict[\"type\"],\n row_dict.get(\"version\"),\n row_dict.get(\"row_id\"),\n row_dict.get(\"col_name\"),\n row_dict.get(\"from_storage_location_id\"),\n row_dict.get(\"from_file_handle_id\"),\n row_dict.get(\"to_file_handle_id\"),\n row_dict[\"status\"],\n row_dict.get(\"exception\"),\n ]\n\n csv_writer.writerow(row_data)\n
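For example, assuming result is a MigrationResult, the whole index can be dumped for manual review (the file name is a placeholder):
result.as_csv(\"migration_index.csv\")\n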
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.index_files_for_migration","title":"index_files_for_migration(syn, entity, dest_storage_location_id, db_path, source_storage_location_ids=None, file_version_strategy='new', include_table_files=False, continue_on_error=False)
","text":"Index the given entity for migration to a new storage location. This is the first step in migrating an entity to a new storage location using synapseutils.
This function will create a sqlite database at the given db_path that can be subsequently passed to the migrate_indexed_files function for actual migration. This function itself does not modify the given entity in any way.
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
TYPE: Synapse
entity
A Synapse entity whose files should be migrated. Can be a Project, Folder, File entity, or Table entity. If it is a container (a Project or Folder) its contents will be recursively indexed.
dest_storage_location_id
The id of the new storage location to be migrated to.
TYPE: str
db_path
A path on disk where a sqlite db can be created to store the contents of the created index.
TYPE: str
source_storage_location_ids
An optional iterable of storage location ids from which files will be migrated. If provided, files outside of one of the listed storage locations will not be indexed for migration. If not provided, then all files not already in the destination storage location will be indexed for migration.
TYPE: Iterable[str]
DEFAULT: None
file_version_strategy
One of \"new\" (default), \"all\", \"latest\", \"skip\" as follows:
new
: will create a new version of file entities in the new storage location, leaving existing versions unchanged
all
: all existing versions will be migrated in place to the new storage location
latest
: the latest version will be migrated in place to the new storage location
skip
: skip migrating file entities; use this if you want to migrate table attached files in a container while leaving its file entities unchanged
DEFAULT: 'new'
include_table_files
Whether to migrate files attached to tables. If False (default), only file entities in the container will be migrated and tables will be left untouched.
DEFAULT: False
continue_on_error
Whether errors encountered while indexing an entity (e.g. access errors) will be raised or instead recorded in the index, allowing index creation to continue. Default is False (any error is raised).
DEFAULT: False
RETURNS DESCRIPTION
A MigrationResult object that can be used to inspect the contents of the index or output the index to a CSV for manual inspection.
Source code in synapseutils/migrate_functions.py
def index_files_for_migration(\n syn: synapseclient.Synapse,\n entity,\n dest_storage_location_id: str,\n db_path: str,\n source_storage_location_ids: typing.Iterable[str] = None,\n file_version_strategy=\"new\",\n include_table_files=False,\n continue_on_error=False,\n):\n \"\"\"\n Index the given entity for migration to a new storage location. This is the first step in migrating an entity\n to a new storage location using synapseutils.\n\n This function will create a sqlite database at the given db_path that can be subsequently passed\n to the migrate_indexed_files function for actual migration. This function itself does not modify the given entity\n in any way.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: A Synapse entity whose files should be migrated. Can be a Project, Folder,\n File entity, or Table entity. If it is a container (a Project or Folder)\n its contents will be recursively indexed.\n dest_storage_location_id: The id of the new storage location to be migrated to.\n db_path: A path on disk where a sqlite db can be created to store the contents of the\n created index.\n source_storage_location_ids: An optional iterable of storage location ids that\n will be migrated. If provided, files outside of\n one of the listed storage locations will not be\n indexed for migration. If not provided, then all\n files not already in the destination storage\n location will be indexed for migrated.\n file_version_strategy: One of \"new\" (default), \"all\", \"latest\", \"skip\" as follows:\n\n - `new`: will create a new version of file entities in the new storage location, leaving existing versions unchanged\n - `all`: all existing versions will be migrated in place to the new storage location\n - `latest`: the latest version will be migrated in place to the new storage location\n - `skip`: skip migrating file entities. use this e.g. if wanting to e.g. migrate table attached files in a container while leaving the files unchanged\n include_table_files: Whether to migrate files attached to tables. If False (default) then e.g. only\n file entities in the container will be migrated and tables will be untouched.\n continue_on_error: Whether any errors encountered while indexing an entity (access etc) will be raised\n or instead just recorded in the index while allowing the index creation\n to continue. 
Default is False (any errors are raised).\n\n Returns:\n A MigrationResult object that can be used to inspect the contents of the index or output the index to a CSV for manual inspection.\n \"\"\" # noqa\n root_id = utils.id_of(entity)\n\n # accept an Iterable, but easier to work internally if we can assume a list of strings\n source_storage_location_ids = [str(s) for s in source_storage_location_ids or []]\n\n file_version_strategies = {\"new\", \"all\", \"latest\", \"skip\"}\n if file_version_strategy not in file_version_strategies:\n raise ValueError(\n \"Invalid file_version_strategy: {}, must be one of {}\".format(\n file_version_strategy, file_version_strategies\n )\n )\n\n if file_version_strategy == \"skip\" and not include_table_files:\n raise ValueError(\n \"Skipping both files entities and table attached files, nothing to migrate\"\n )\n\n _verify_storage_location_ownership(syn, dest_storage_location_id)\n\n test_import_sqlite3()\n import sqlite3\n\n with sqlite3.connect(db_path) as conn:\n cursor = conn.cursor()\n _ensure_schema(cursor)\n\n _verify_index_settings(\n cursor,\n db_path,\n root_id,\n dest_storage_location_id,\n source_storage_location_ids,\n file_version_strategy,\n include_table_files,\n )\n conn.commit()\n\n entity = syn.get(root_id, downloadFile=False)\n try:\n _index_entity(\n conn,\n cursor,\n syn,\n entity,\n None,\n dest_storage_location_id,\n source_storage_location_ids,\n file_version_strategy,\n include_table_files,\n continue_on_error,\n )\n\n except _IndexingError as indexing_ex:\n logging.exception(\n \"Aborted due to failure to index entity %s of type %s. Use the continue_on_error option to skip \"\n \"over entities due to individual failures.\",\n indexing_ex.entity_id,\n indexing_ex.concrete_type,\n )\n\n raise indexing_ex.__cause__\n\n return MigrationResult(syn, db_path)\n
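A sketch of a typical indexing call; the entity ID, storage location id, and db path below are placeholders:
import synapseclient\nimport synapseutils\n\nsyn = synapseclient.login()\nresult = synapseutils.index_files_for_migration(\n    syn,\n    \"syn123\",             # Project/Folder/File/Table to index\n    \"12345\",              # destination storage location id\n    \"/tmp/migration.db\",  # sqlite index is created at this path\n    file_version_strategy=\"new\",\n)\nprint(result.get_counts_by_status())\n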
"},{"location":"reference/synapse_utils/#synapseutils.migrate_functions.migrate_indexed_files","title":"migrate_indexed_files(syn, db_path, create_table_snapshots=True, continue_on_error=False, force=False)
","text":"Migrate files previously indexed in a sqlite database at the given db_path using the separate index_files_for_migration function. The files listed in the index will be migrated according to the configuration of that index.
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
TYPE: Synapse
db_path
A path on disk where a sqlite db was created using the index_files_for_migration function.
TYPE: str
create_table_snapshots
When updating the files in any table, whether a snapshot of the table is first created.
DEFAULT: True
continue_on_error
Whether any errors encountered while migrating will be raised or instead just recorded in the sqlite database while allowing the migration to continue. Default is False (any errors are raised).
DEFAULT: False
force
If running in an interactive shell, migration requires an interactive confirmation. This can be bypassed by using the force=True option.
DEFAULT: False
RETURNS DESCRIPTION
Union[MigrationResult, None]
A MigrationResult object that can be used to inspect the results of the migration.
Source code in synapseutils/migrate_functions.py
def migrate_indexed_files(\n syn: synapseclient.Synapse,\n db_path: str,\n create_table_snapshots=True,\n continue_on_error=False,\n force=False,\n) -> typing.Union[MigrationResult, None]:\n \"\"\"\n Migrate files previously indexed in a sqlite database at the given db_path using the separate\n index_files_for_migration function. The files listed in the index will be migrated according to the\n configuration of that index.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n db_path: A path on disk where a sqlite db was created using the index_files_for_migration function.\n create_table_snapshots: When updating the files in any table, whether the a snapshot of the table is\n first created.\n continue_on_error: Whether any errors encountered while migrating will be raised\n or instead just recorded in the sqlite database while allowing the migration\n to continue. Default is False (any errors are raised).\n force: If running in an interactive shell, migration requires an interactice confirmation.\n This can be bypassed by using the force=True option.\n\n Returns:\n A MigrationResult object that can be used to inspect the results of the migration.\n \"\"\"\n executor, max_concurrent_file_copies = _get_executor(syn)\n\n test_import_sqlite3()\n import sqlite3\n\n with sqlite3.connect(db_path) as conn:\n cursor = conn.cursor()\n\n _ensure_schema(cursor)\n settings = _retrieve_index_settings(cursor)\n if settings is None:\n # no settings were available at the index given\n raise ValueError(\n \"Unable to retrieve existing index settings from '{}'. \"\n \"Either this path does represent a previously created migration index file or the file is corrupt.\"\n )\n\n dest_storage_location_id = settings[\"dest_storage_location_id\"]\n if not _confirm_migration(cursor, force, dest_storage_location_id):\n logging.info(\"Migration aborted.\")\n return\n\n key = _MigrationKey(id=\"\", type=None, row_id=-1, col_id=-1, version=-1)\n\n futures = set()\n\n # we keep track of the file handles that are currently being migrated\n # so that if we encounter multiple entities associated with the same\n # file handle we can copy the file handle once and update all the entities\n # with the single copied file handle\n pending_file_handle_ids = set()\n completed_file_handle_ids = set()\n\n # we keep track of the entity keys (syn id + version) so that we know\n # if we encounter the same one twice. normally we wouldn't but when we backtrack\n # to update any entities skipped because of a shared file handle we might\n # query for the same key as is already being operated on.\n pending_keys = set()\n\n batch_size = _get_batch_size()\n while True:\n # we query for additional file or table associated file handles to migrate in batches\n # ordering by synapse id. there can be multiple file handles associated with a particular\n # synapse id (i.e. 
multiple file entity versions or multiple table attached files per table),\n # so the ordering and where clause need to account for that.\n # we also include in the query any unmigrated files that were skipped previously through\n # the query loop that share a file handle with a file handle id that is now finished.\n version = key.version if key.version is not None else -1\n row_id = key.row_id if key.row_id is not None else -1\n col_id = key.col_id if key.col_id is not None else -1\n\n query_kwargs = {\n \"indexed_status\": _MigrationStatus.INDEXED.value,\n \"id\": key.id,\n \"file_type\": _MigrationType.FILE.value,\n \"table_type\": _MigrationType.TABLE_ATTACHED_FILE.value,\n \"version\": version,\n \"row_id\": row_id,\n \"col_id\": col_id,\n # ensure that we aren't ever adding more items to the shared executor than allowed\n \"limit\": min(batch_size, max_concurrent_file_copies - len(futures)),\n }\n\n # we can't use both named and positional literals in a query, so we use named\n # literals and then inline a string for the values for our file handle ids\n # since these are a dynamic list of values\n pending_file_handle_in = \"('\" + \"','\".join(pending_file_handle_ids) + \"')\"\n completed_file_handle_in = (\n \"('\" + \"','\".join(completed_file_handle_ids) + \"')\"\n )\n\n results = cursor.execute(\n f\"\"\"\n select\n id,\n type,\n version,\n row_id,\n col_id,\n from_file_handle_id,\n file_size\n from migrations\n where\n status = :indexed_status\n and (\n (\n ((id > :id and type in (:file_type, :table_type))\n or (id = :id and type = :file_type and version is not null and version > :version)\n or (id = :id and type = :table_type and (row_id > :row_id or (row_id = :row_id and col_id > :col_id))))\n and from_file_handle_id not in {pending_file_handle_in}\n ) or\n (\n id <= :id\n and from_file_handle_id in {completed_file_handle_in}\n )\n )\n order by\n id,\n type,\n row_id,\n col_id,\n version\n limit :limit\n \"\"\", # noqa\n query_kwargs,\n )\n\n row_count = 0\n for row in results:\n row_count += 1\n\n row_dict = _get_row_dict(cursor, row, True)\n key_dict = {\n k: v\n for k, v in row_dict.items()\n if k in (\"id\", \"type\", \"version\", \"row_id\", \"col_id\")\n }\n\n last_key = key\n key = _MigrationKey(**key_dict)\n from_file_handle_id = row_dict[\"from_file_handle_id\"]\n\n if (\n key in pending_keys\n or from_file_handle_id in pending_file_handle_ids\n ):\n # if this record is already being migrated or it shares a file handle\n # with a record that is being migrated then skip this.\n # if it the record shares a file handle it will be picked up later\n # when its file handle is completed.\n continue\n\n file_size = row_dict[\"file_size\"]\n\n pending_keys.add(key)\n to_file_handle_id = _check_file_handle_exists(\n conn.cursor(), from_file_handle_id\n )\n if not to_file_handle_id:\n pending_file_handle_ids.add(from_file_handle_id)\n\n if key.type == _MigrationType.FILE.value:\n if key.version is None:\n migration_fn = _create_new_file_version\n\n else:\n migration_fn = _migrate_file_version\n\n elif key.type == _MigrationType.TABLE_ATTACHED_FILE.value:\n if last_key.id != key.id and create_table_snapshots:\n syn.create_snapshot_version(key.id)\n\n migration_fn = _migrate_table_attached_file\n\n else:\n raise ValueError(\n \"Unexpected type {} with id {}\".format(key.type, key.id)\n )\n\n def migration_task(\n syn,\n key,\n from_file_handle_id,\n to_file_handle_id,\n file_size,\n storage_location_id,\n ):\n # a closure to wrap the actual function call so that we an add some 
local variables\n # to the return tuple which will be consumed when the future is processed\n with shared_executor(executor):\n try:\n # instrument the shared executor in this thread so that we won't\n # create a new executor to perform the multipart copy\n to_file_handle_id = migration_fn(\n syn,\n key,\n from_file_handle_id,\n to_file_handle_id,\n file_size,\n storage_location_id,\n )\n return key, from_file_handle_id, to_file_handle_id\n except Exception as ex:\n raise _MigrationError(\n key, from_file_handle_id, to_file_handle_id\n ) from ex\n\n future = executor.submit(\n migration_task,\n syn,\n key,\n from_file_handle_id,\n to_file_handle_id,\n file_size,\n dest_storage_location_id,\n )\n futures.add(future)\n\n if row_count == 0 and not pending_file_handle_ids:\n # we've run out of migratable sqlite rows, we have nothing else\n # to submit, so we break out and wait for all remaining\n # tasks to conclude.\n break\n\n if len(futures) >= max_concurrent_file_copies or row_count < batch_size:\n # if we have no concurrency left to process any additional entities\n # or if we're near the end of he migration and have a small\n # remainder batch then we wait for one of the processing migrations\n # to finish. a small batch doesn't mean this is the last batch since\n # a completed file handle here could be associated with another\n # entity that we deferred before because it shared the same file handle id\n futures, completed_file_handle_ids = _wait_futures(\n conn,\n cursor,\n futures,\n pending_keys,\n concurrent.futures.FIRST_COMPLETED,\n continue_on_error,\n )\n\n pending_file_handle_ids -= completed_file_handle_ids\n\n if futures:\n # wait for all remaining migrations to conclude before returning\n _wait_futures(\n conn,\n cursor,\n futures,\n pending_keys,\n concurrent.futures.ALL_COMPLETED,\n continue_on_error,\n )\n\n return MigrationResult(syn, db_path)\n
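Continuing the indexing sketch above (same placeholder db path), the indexed files can then be migrated and the outcome exported; note the function returns None if the interactive confirmation is aborted:
result = synapseutils.migrate_indexed_files(\n    syn,\n    \"/tmp/migration.db\",\n    force=True,  # bypass the interactive confirmation, e.g. in scripts\n)\nif result is not None:\n    result.as_csv(\"migration_results.csv\")\n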
"},{"location":"reference/synapse_utils/#synapseutils.describe_functions","title":"synapseutils.describe_functions
","text":""},{"location":"reference/synapse_utils/#synapseutils.describe_functions-functions","title":"Functions","text":""},{"location":"reference/synapse_utils/#synapseutils.describe_functions.describe","title":"describe(syn, entity)
","text":"Gets a synapse entity and returns summary statistics about it.
PARAMETER DESCRIPTION
syn
A Synapse object with user's login, e.g. syn = synapseclient.login()
entity
synapse id of the entity to be described
TYPE: str
Describing columns of a table
import synapseclient\nimport synapseutils\nsyn = synapseclient.login()\nstatistics = synapseutils.describe(syn, entity=\"syn123\")\nprint(statistics)\n{\n \"column1\": {\n \"dtype\": \"object\",\n \"mode\": \"FOOBAR\"\n },\n \"column2\": {\n \"dtype\": \"int64\",\n \"mode\": 1,\n \"min\": 1,\n \"max\": 2,\n \"mean\": 1.4\n },\n \"column3\": {\n \"dtype\": \"bool\",\n \"mode\": false,\n \"min\": false,\n \"max\": true,\n \"mean\": 0.5\n }\n}\n
RETURNS DESCRIPTION
Union[dict, None]
A dict if the dataset is valid; None if not.
Source code in synapseutils/describe_functions.py
def describe(syn, entity: str) -> typing.Union[dict, None]:\n \"\"\"\n Gets a synapse entity and returns summary statistics about it.\n\n Arguments:\n syn: A Synapse object with user's login, e.g. syn = synapseclient.login()\n entity: synapse id of the entity to be described\n\n Example: Using this function\n Describing columns of a table\n\n import synapseclient\n import synapseutils\n syn = synapseclient.login()\n statistics = synapseutils.describe(syn, entity=\"syn123\")\n print(statistics)\n {\n \"column1\": {\n \"dtype\": \"object\",\n \"mode\": \"FOOBAR\"\n },\n \"column2\": {\n \"dtype\": \"int64\",\n \"mode\": 1,\n \"min\": 1,\n \"max\": 2,\n \"mean\": 1.4\n },\n \"column3\": {\n \"dtype\": \"bool\",\n \"mode\": false,\n \"min\": false,\n \"max\": true,\n \"mean\": 0.5\n }\n }\n\n Returns:\n A dict if the dataset is valid; None if not.\n \"\"\"\n df = _open_entity_as_df(syn=syn, entity=entity)\n\n if df is None:\n return None\n\n stats = _describe_wrapper(df)\n syn.logger.info(json.dumps(stats, indent=2, default=str))\n return stats\n
"},{"location":"reference/table_schema/","title":"Table Schema","text":""},{"location":"reference/table_schema/#synapseclient.table.Schema","title":"synapseclient.table.Schema
","text":" Bases: SchemaBase
A Schema is a synapseclient.entity.Entity that defines a set of columns in a table.
:param name: the name for the Table Schema object
:param description: User readable description of the schema
:param columns: a list of Column objects or their IDs
:param parent: the project in Synapse to which this table belongs
:param properties: A map of Synapse properties
:param annotations: A map of user defined annotations
:param local_state: Internal use only
Example::
cols = [Column(name='Isotope', columnType='STRING'),\n Column(name='Atomic Mass', columnType='INTEGER'),\n Column(name='Halflife', columnType='DOUBLE'),\n Column(name='Discovered', columnType='DATE')]\n\nschema = syn.store(Schema(name='MyTable', columns=cols, parent=project))\n
Source code in synapseclient/table.py
class Schema(SchemaBase):\n \"\"\"\n A Schema is an :py:class:`synapseclient.entity.Entity` that defines a set of columns in a table.\n\n :param name: the name for the Table Schema object\n :param description: User readable description of the schema\n :param columns: a list of :py:class:`Column` objects or their IDs\n :param parent: the project in Synapse to which this table belongs\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n\n cols = [Column(name='Isotope', columnType='STRING'),\n Column(name='Atomic Mass', columnType='INTEGER'),\n Column(name='Halflife', columnType='DOUBLE'),\n Column(name='Discovered', columnType='DATE')]\n\n schema = syn.store(Schema(name='MyTable', columns=cols, parent=project))\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.TableEntity\"\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n super(Schema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n
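Once the schema is stored, rows conforming to it can be added with Table; a minimal sketch continuing the example above (the row values are illustrative, and the DATE value is given as epoch milliseconds):
from synapseclient import Table\n\nrows = [\n    [\"Cs-137\", 137, 30.17, 1333238400000],\n    [\"Co-60\", 60, 5.27, 1333238400000],\n]\ntable = syn.store(Table(schema, rows))\n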
"},{"location":"reference/tables/","title":"Tables","text":""},{"location":"reference/tables/#synapseclient.table","title":"synapseclient.table
","text":""},{"location":"reference/tables/#synapseclient.table--tables","title":"Tables","text":"Synapse Tables enable storage of tabular data in Synapse in a form that can be queried using a SQL-like query language.
A table has a Schema and holds a set of rows conforming to that schema.
A Schema defines a series of Columns of types such as STRING, DOUBLE, INTEGER, BOOLEAN, DATE, FILEHANDLEID, ENTITYID, LINK, LARGETEXT, and USERID.
Read more about using Tables in Synapse in the tutorials section.
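A quick sketch of reading an existing table; the table ID and column name are placeholders, and asDataFrame requires pandas:
import synapseclient\n\nsyn = synapseclient.login()\nresults = syn.tableQuery(\"select * from syn123 where Halflife > 5\")\ndf = results.asDataFrame()\n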
"},{"location":"reference/tables/#synapseclient.table-classes","title":"Classes","text":""},{"location":"reference/tables/#synapseclient.table.SchemaBase","title":"SchemaBase
","text":" Bases: Entity
This is an Abstract Class for EntityViewSchema and Schema containing the common methods for both. You cannot create an object of this type.
Source code in synapseclient/table.py
class SchemaBase(Entity, metaclass=abc.ABCMeta):\n \"\"\"\n This is the an Abstract Class for EntityViewSchema and Schema containing the common methods for both.\n You can not create an object of this type.\n \"\"\"\n\n _property_keys = Entity._property_keys + [\"columnIds\"]\n _local_keys = Entity._local_keys + [\"columns_to_store\"]\n\n @property\n @abc.abstractmethod # forces subclasses to define _synapse_entity_type\n def _synapse_entity_type(self):\n pass\n\n @abc.abstractmethod\n def __init__(\n self, name, columns, properties, annotations, local_state, parent, **kwargs\n ):\n self.properties.setdefault(\"columnIds\", [])\n self.__dict__.setdefault(\"columns_to_store\", [])\n\n if name:\n kwargs[\"name\"] = name\n super(SchemaBase, self).__init__(\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n if columns:\n self.addColumns(columns)\n\n def addColumn(self, column):\n \"\"\"\n :param column: a column object or its ID\n \"\"\"\n if isinstance(column, str) or isinstance(column, int) or hasattr(column, \"id\"):\n self.properties.columnIds.append(id_of(column))\n elif isinstance(column, Column):\n if not self.__dict__.get(\"columns_to_store\", None):\n self.__dict__[\"columns_to_store\"] = []\n self.__dict__[\"columns_to_store\"].append(column)\n else:\n raise ValueError(\"Not a column? %s\" % str(column))\n\n def addColumns(self, columns):\n \"\"\"\n :param columns: a list of column objects or their ID\n \"\"\"\n for column in columns:\n self.addColumn(column)\n\n def removeColumn(self, column):\n \"\"\"\n :param column: a column object or its ID\n \"\"\"\n if isinstance(column, str) or isinstance(column, int) or hasattr(column, \"id\"):\n self.properties.columnIds.remove(id_of(column))\n elif isinstance(column, Column) and self.columns_to_store:\n self.columns_to_store.remove(column)\n else:\n ValueError(\"Can't remove column %s\" + str(column))\n\n def has_columns(self):\n \"\"\"Does this schema have columns specified?\"\"\"\n return bool(\n self.properties.get(\"columnIds\", None)\n or self.__dict__.get(\"columns_to_store\", None)\n )\n\n def _before_synapse_store(self, syn):\n if len(self.columns_to_store) + len(self.columnIds) > MAX_NUM_TABLE_COLUMNS:\n raise ValueError(\n \"Too many columns. The limit is %s columns per table\"\n % MAX_NUM_TABLE_COLUMNS\n )\n\n # store any columns before storing table\n if self.columns_to_store:\n self.properties.columnIds.extend(\n column.id for column in syn.createColumns(self.columns_to_store)\n )\n self.columns_to_store = []\n
"},{"location":"reference/tables/#synapseclient.table.SchemaBase-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.SchemaBase.addColumn","title":"addColumn(column)
","text":":param column: a column object or its ID
Source code in synapseclient/table.py
def addColumn(self, column):\n \"\"\"\n :param column: a column object or its ID\n \"\"\"\n if isinstance(column, str) or isinstance(column, int) or hasattr(column, \"id\"):\n self.properties.columnIds.append(id_of(column))\n elif isinstance(column, Column):\n if not self.__dict__.get(\"columns_to_store\", None):\n self.__dict__[\"columns_to_store\"] = []\n self.__dict__[\"columns_to_store\"].append(column)\n else:\n raise ValueError(\"Not a column? %s\" % str(column))\n
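For example, assuming schema is a Schema instance and syn a logged-in client, either a Column object or an existing column ID may be passed:
from synapseclient import Column\n\nschema.addColumn(Column(name=\"age\", columnType=\"INTEGER\"))\nschema.addColumn(\"12345\")  # an existing column ID\nschema = syn.store(schema)\n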
"},{"location":"reference/tables/#synapseclient.table.SchemaBase.addColumns","title":"addColumns(columns)
","text":":param columns: a list of column objects or their ID
Source code in synapseclient/table.py
def addColumns(self, columns):\n \"\"\"\n :param columns: a list of column objects or their ID\n \"\"\"\n for column in columns:\n self.addColumn(column)\n
"},{"location":"reference/tables/#synapseclient.table.SchemaBase.removeColumn","title":"removeColumn(column)
","text":":param column: a column object or its ID
Source code in synapseclient/table.py
def removeColumn(self, column):\n \"\"\"\n :param column: a column object or its ID\n \"\"\"\n if isinstance(column, str) or isinstance(column, int) or hasattr(column, \"id\"):\n self.properties.columnIds.remove(id_of(column))\n elif isinstance(column, Column) and self.columns_to_store:\n self.columns_to_store.remove(column)\n else:\n # raise (with %-formatting) so the failure is not silently discarded\n raise ValueError(\"Can't remove column %s\" % str(column))\n
"},{"location":"reference/tables/#synapseclient.table.SchemaBase.has_columns","title":"has_columns()
","text":"Does this schema have columns specified?
Source code in synapseclient/table.py
def has_columns(self):\n \"\"\"Does this schema have columns specified?\"\"\"\n return bool(\n self.properties.get(\"columnIds\", None)\n or self.__dict__.get(\"columns_to_store\", None)\n )\n
"},{"location":"reference/tables/#synapseclient.table.Schema","title":"Schema
","text":" Bases: SchemaBase
A Schema is a synapseclient.entity.Entity that defines a set of columns in a table.
:param name: the name for the Table Schema object
:param description: User readable description of the schema
:param columns: a list of Column objects or their IDs
:param parent: the project in Synapse to which this table belongs
:param properties: A map of Synapse properties
:param annotations: A map of user defined annotations
:param local_state: Internal use only
Example::
cols = [Column(name='Isotope', columnType='STRING'),\n Column(name='Atomic Mass', columnType='INTEGER'),\n Column(name='Halflife', columnType='DOUBLE'),\n Column(name='Discovered', columnType='DATE')]\n\nschema = syn.store(Schema(name='MyTable', columns=cols, parent=project))\n
Source code in synapseclient/table.py
class Schema(SchemaBase):\n \"\"\"\n A Schema is an :py:class:`synapseclient.entity.Entity` that defines a set of columns in a table.\n\n :param name: the name for the Table Schema object\n :param description: User readable description of the schema\n :param columns: a list of :py:class:`Column` objects or their IDs\n :param parent: the project in Synapse to which this table belongs\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n\n cols = [Column(name='Isotope', columnType='STRING'),\n Column(name='Atomic Mass', columnType='INTEGER'),\n Column(name='Halflife', columnType='DOUBLE'),\n Column(name='Discovered', columnType='DATE')]\n\n schema = syn.store(Schema(name='MyTable', columns=cols, parent=project))\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.TableEntity\"\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n super(Schema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n
"},{"location":"reference/tables/#synapseclient.table.MaterializedViewSchema","title":"MaterializedViewSchema
","text":" Bases: SchemaBase
A MaterializedViewSchema is a synapseclient.entity.Entity that defines a set of columns in a materialized view along with the SQL statement.
:param name: the name for the Materialized View Schema object
:param description: User readable description of the schema
:param definingSQL: The synapse SQL statement that defines the data in the materialized view. The SQL may contain JOIN clauses on multiple tables.
:param columns: a list of Column objects or their IDs
:param parent: the project in Synapse to which this Materialized View belongs
:param properties: A map of Synapse properties
:param annotations: A map of user defined annotations
:param local_state: Internal use only
Example::
defining_sql = \"SELECT * FROM syn111 F JOIN syn2222 P on (F.patient_id = P.patient_id)\"\n\nschema = syn.store(MaterializedViewSchema(name='MyTable', parent=project, definingSQL=defining_sql))\n
Source code in synapseclient/table.py
class MaterializedViewSchema(SchemaBase):\n \"\"\"\n A MaterializedViewSchema is an :py:class:`synapseclient.entity.Entity` that defines a set of columns in a\n materialized view along with the SQL statement.\n\n :param name: the name for the Materialized View Schema object\n :param description: User readable description of the schema\n :param definingSQL: The synapse SQL statement that defines the data in the materialized view. The SQL may\n contain JOIN clauses on multiple tables.\n :param columns: a list of :py:class:`Column` objects or their IDs\n :param parent: the project in Synapse to which this Materialized View belongs\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n\n defining_sql = \"SELECT * FROM syn111 F JOIN syn2222 P on (F.patient_id = P.patient_id)\"\n\n schema = syn.store(MaterializedViewSchema(name='MyTable', parent=project, definingSQL=defining_sql))\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.MaterializedView\"\n _property_keys = SchemaBase._property_keys + [\"definingSQL\"]\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n definingSQL=None,\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if definingSQL is not None:\n kwargs[\"definingSQL\"] = definingSQL\n super(MaterializedViewSchema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n
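Once stored, the materialized view can be queried like any other table; a sketch continuing the example above:
results = syn.tableQuery(f\"select * from {schema.id}\")\ndf = results.asDataFrame()\n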
"},{"location":"reference/tables/#synapseclient.table.ViewBase","title":"ViewBase
","text":" Bases: SchemaBase
This is a helper class for EntityViewSchema and SubmissionViewSchema containing the common methods for both.
Source code in synapseclient/table.py
class ViewBase(SchemaBase):\n \"\"\"\n This is a helper class for EntityViewSchema and SubmissionViewSchema\n containing the common methods for both.\n \"\"\"\n\n _synapse_entity_type = \"\"\n _property_keys = SchemaBase._property_keys + [\"viewTypeMask\", \"scopeIds\"]\n _local_keys = SchemaBase._local_keys + [\n \"addDefaultViewColumns\",\n \"addAnnotationColumns\",\n \"ignoredAnnotationColumnNames\",\n ]\n\n def add_scope(self, entities):\n \"\"\"\n :param entities: a Project, Folder, Evaluation object or its ID, can also be a list of them\n \"\"\"\n if isinstance(entities, list):\n # add ids to a temp list so that we don't partially modify scopeIds on an exception in id_of()\n temp_list = [id_of(entity) for entity in entities]\n self.scopeIds.extend(temp_list)\n else:\n self.scopeIds.append(id_of(entities))\n\n def _filter_duplicate_columns(self, syn, columns_to_add):\n \"\"\"\n If a column to be added has the same name and same type as an existing column, it will be considered a duplicate\n and not added.\n :param syn: a :py:class:`synapseclient.client.Synapse` object that is logged in\n :param columns_to_add: iterable collection of type :py:class:`synapseclient.table.Column` objects\n :return: a filtered list of columns to add\n \"\"\"\n\n # no point in making HTTP calls to retrieve existing Columns if we not adding any new columns\n if not columns_to_add:\n return columns_to_add\n\n # set up Column name/type tracking\n # map of str -> set(str), where str is the column type as a string and set is a set of column name strings\n column_type_to_annotation_names = {}\n\n # add to existing columns the columns that user has added but not yet created in synapse\n column_generator = (\n itertools.chain(syn.getColumns(self.columnIds), self.columns_to_store)\n if self.columns_to_store\n else syn.getColumns(self.columnIds)\n )\n\n for column in column_generator:\n column_name = column[\"name\"]\n column_type = column[\"columnType\"]\n\n column_type_to_annotation_names.setdefault(column_type, set()).add(\n column_name\n )\n\n valid_columns = []\n for column in columns_to_add:\n new_col_name = column[\"name\"]\n new_col_type = column[\"columnType\"]\n\n typed_col_name_set = column_type_to_annotation_names.setdefault(\n new_col_type, set()\n )\n if new_col_name not in typed_col_name_set:\n typed_col_name_set.add(new_col_name)\n valid_columns.append(column)\n return valid_columns\n\n def _before_synapse_store(self, syn):\n # get the default EntityView columns from Synapse and add them to the columns list\n additional_columns = []\n view_type = self._synapse_entity_type.split(\".\")[-1].lower()\n mask = self.get(\"viewTypeMask\")\n\n if self.addDefaultViewColumns:\n additional_columns.extend(\n syn._get_default_view_columns(view_type, view_type_mask=mask)\n )\n\n # get default annotations\n if self.addAnnotationColumns:\n anno_columns = [\n x\n for x in syn._get_annotation_view_columns(\n self.scopeIds, view_type, view_type_mask=mask\n )\n if x[\"name\"] not in self.ignoredAnnotationColumnNames\n ]\n additional_columns.extend(anno_columns)\n\n self.addColumns(self._filter_duplicate_columns(syn, additional_columns))\n\n # set these boolean flags to false so they are not repeated.\n self.addDefaultViewColumns = False\n self.addAnnotationColumns = False\n\n super(ViewBase, self)._before_synapse_store(syn)\n
"},{"location":"reference/tables/#synapseclient.table.ViewBase-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.ViewBase.add_scope","title":"add_scope(entities)
","text":":param entities: a Project, Folder, Evaluation object or its ID, can also be a list of them
Source code in synapseclient/table.py
def add_scope(self, entities):\n \"\"\"\n :param entities: a Project, Folder, Evaluation object or its ID, can also be a list of them\n \"\"\"\n if isinstance(entities, list):\n # add ids to a temp list so that we don't partially modify scopeIds on an exception in id_of()\n temp_list = [id_of(entity) for entity in entities]\n self.scopeIds.extend(temp_list)\n else:\n self.scopeIds.append(id_of(entities))\n
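For example, assuming view is an EntityViewSchema or SubmissionViewSchema (the IDs are placeholders):
view.add_scope([\"syn123\", \"syn456\"])  # add several containers at once\nview = syn.store(view)\n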
"},{"location":"reference/tables/#synapseclient.table.Dataset","title":"Dataset
","text":" Bases: ViewBase
A Dataset is an :py:class:synapseclient.entity.Entity
that defines a flat list of entities as a tableview (a.k.a. a \"dataset\").
:param name: The name for the Dataset object :param description: User readable description of the schema :param columns: A list of :py:class:Column
objects or their IDs :param parent: The Synapse Project to which this Dataset belongs :param properties: A map of Synapse properties :param annotations: A map of user defined annotations :param dataset_items: A list of items characterized by entityId and versionNumber :param folder: A list of Folder IDs :param local_state: Internal use only
Example::
from synapseclient import Dataset\n\n# Create a Dataset with pre-defined DatasetItems. Default Dataset columns\n# are used if no schema is provided.\ndataset_items = [\n {'entityId': \"syn000\", 'versionNumber': 1},\n {...},\n]\ndataset = syn.store(Dataset(\n name=\"My Dataset\",\n parent=project,\n dataset_items=dataset_items))\n\n# Add/remove specific Synapse IDs to/from the Dataset\ndataset.add_item({'entityId': \"syn111\", 'versionNumber': 1})\ndataset.remove_item(\"syn000\")\ndataset = syn.store(dataset)\n\n# Add a list of Synapse IDs to the Dataset\nnew_items = [\n {'entityId': \"syn222\", 'versionNumber': 2},\n {'entityId': \"syn333\", 'versionNumber': 1}\n]\ndataset.add_items(new_items)\ndataset = syn.store(dataset)\n
Folders can easily be added recursively to a dataset, that is, all files within the folder (including sub-folders) will be added. Note that using the following methods will add files with the latest version number ONLY. If another version number is desired, use :py:meth:synapseclient.table.add_item
or :py:meth:synapseclient.table.add_items
.
Example::
# Add a single Folder to the Dataset\ndataset.add_folder(\"syn123\")\n\n# Add a list of Folders, overwriting any existing files in the dataset\ndataset.add_folders([\"syn456\", \"syn789\"], force=True)\n\ndataset = syn.store(dataset)\n
empty() can be used to truncate a dataset, that is, remove all current items from the set.
Example::
dataset.empty()\ndataset = syn.store(dataset)\n
To get the number of entities in the dataset, use len().
Example::
print(f\"{dataset.name} has {len(dataset)} items.\")\n
To create a snapshot version of the Dataset, use :py:meth:synapseclient.client.create_snapshot_version
.
Example::
syn = synapseclient.login()\nsyn.create_snapshot_version(\n dataset.id,\n label=\"v1.0\",\n comment=\"This is version 1\")\n
Source code in synapseclient/table.py
class Dataset(ViewBase):\n \"\"\"\n A Dataset is an :py:class:`synapseclient.entity.Entity` that defines a\n flat list of entities as a tableview (a.k.a. a \"dataset\").\n\n :param name: The name for the Dataset object\n :param description: User readable description of the schema\n :param columns: A list of :py:class:`Column` objects or their IDs\n :param parent: The Synapse Project to which this Dataset belongs\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param dataset_items: A list of items characterized by entityId and versionNumber\n :param folder: A list of Folder IDs\n :param local_state: Internal use only\n\n Example::\n\n from synapseclient import Dataset\n\n # Create a Dataset with pre-defined DatasetItems. Default Dataset columns\n # are used if no schema is provided.\n dataset_items = [\n {'entityId': \"syn000\", 'versionNumber': 1},\n {...},\n ]\n dataset = syn.store(Dataset(\n name=\"My Dataset\",\n parent=project,\n dataset_items=dataset_items))\n\n # Add/remove specific Synapse IDs to/from the Dataset\n dataset.add_item({'entityId': \"syn111\", 'versionNumber': 1})\n dataset.remove_item(\"syn000\")\n dataset = syn.store(dataset)\n\n # Add a list of Synapse IDs to the Dataset\n new_items = [\n {'entityId': \"syn222\", 'versionNumber': 2},\n {'entityId': \"syn333\", 'versionNumber': 1}\n ]\n dataset.add_items(new_items)\n dataset = syn.store(dataset)\n\n Folders can easily be added recursively to a dataset, that is, all files\n within the folder (including sub-folders) will be added. Note that using\n the following methods will add files with the latest version number ONLY.\n If another version number is desired, use :py:meth:`synapseclient.table.add_item`\n or :py:meth:`synapseclient.table.add_items`.\n\n Example::\n\n # Add a single Folder to the Dataset\n dataset.add_folder(\"syn123\")\n\n # Add a list of Folders, overwriting any existing files in the dataset\n dataset.add_folders([\"syn456\", \"syn789\"], force=True)\n\n dataset = syn.store(dataset)\n\n empty() can be used to truncate a dataset, that is, remove all current\n items from the set.\n\n Example::\n\n dataset.empty()\n dataset = syn.store(dataset)\n\n To get the number of entities in the dataset, use len().\n\n Example::\n\n print(f\"{dataset.name} has {len(dataset)} items.\")\n\n To create a snapshot version of the Dataset, use\n :py:meth:`synapseclient.client.create_snapshot_version`.\n\n Example::\n\n syn = synapseclient.login()\n syn.create_snapshot_version(\n dataset.id,\n label=\"v1.0\",\n comment=\"This is version 1\")\n \"\"\"\n\n _synapse_entity_type: str = \"org.sagebionetworks.repo.model.table.Dataset\"\n _property_keys: List[str] = ViewBase._property_keys + [\"datasetItems\"]\n _local_keys: List[str] = ViewBase._local_keys + [\"folders_to_add\", \"force\"]\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n properties=None,\n addDefaultViewColumns=True,\n addAnnotationColumns=True,\n ignoredAnnotationColumnNames=[],\n annotations=None,\n local_state=None,\n dataset_items=None,\n folders=None,\n force=False,\n **kwargs,\n ):\n self.properties.setdefault(\"datasetItems\", [])\n self.__dict__.setdefault(\"folders_to_add\", set())\n self.ignoredAnnotationColumnNames = set(ignoredAnnotationColumnNames)\n self.viewTypeMask = EntityViewType.DATASET.value\n super(Dataset, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n 
)\n\n self.force = force\n if dataset_items:\n self.add_items(dataset_items, force)\n if folders:\n self.add_folders(folders, force)\n\n # HACK: make sure we don't try to add columns to schemas that we retrieve from synapse\n is_from_normal_constructor = not (properties or local_state)\n # allowing annotations because user might want to update annotations all at once\n self.addDefaultViewColumns = (\n addDefaultViewColumns and is_from_normal_constructor\n )\n self.addAnnotationColumns = addAnnotationColumns and is_from_normal_constructor\n\n def __len__(self):\n return len(self.properties.datasetItems)\n\n @staticmethod\n def _check_needed_keys(keys: List[str]):\n required_keys = {\"entityId\", \"versionNumber\"}\n if required_keys - keys:\n raise LookupError(\n \"DatasetItem missing a required property: %s\"\n % str(required_keys - keys)\n )\n return True\n\n def add_item(self, dataset_item: Dict[str, str], force: bool = True):\n \"\"\"\n :param dataset_item: a single dataset item\n :param force: force add item\n \"\"\"\n if isinstance(dataset_item, dict) and self._check_needed_keys(\n dataset_item.keys()\n ):\n if not self.has_item(dataset_item.get(\"entityId\")):\n self.properties.datasetItems.append(dataset_item)\n else:\n if force:\n self.remove_item(dataset_item.get(\"entityId\"))\n self.properties.datasetItems.append(dataset_item)\n else:\n raise ValueError(\n f\"Duplicate item found: {dataset_item.get('entityId')}. \"\n \"Set force=True to overwrite the existing item.\"\n )\n else:\n raise ValueError(\"Not a DatasetItem? %s\" % str(dataset_item))\n\n def add_items(self, dataset_items: List[Dict[str, str]], force: bool = True):\n \"\"\"\n :param dataset_items: a list of dataset items\n :param force: force add items\n \"\"\"\n for dataset_item in dataset_items:\n self.add_item(dataset_item, force)\n\n def remove_item(self, item_id: str):\n \"\"\"\n :param item_id: a single dataset item Synapse ID\n \"\"\"\n item_id = id_of(item_id)\n if item_id.startswith(\"syn\"):\n for i, curr_item in enumerate(self.properties.datasetItems):\n if curr_item.get(\"entityId\") == item_id:\n del self.properties.datasetItems[i]\n break\n else:\n raise ValueError(\"Not a Synapse ID: %s\" % str(item_id))\n\n def empty(self):\n self.properties.datasetItems = []\n\n def has_item(self, item_id):\n \"\"\"\n :param item_id: a single dataset item Synapse ID\n \"\"\"\n return any(item[\"entityId\"] == item_id for item in self.properties.datasetItems)\n\n def add_folder(self, folder: str, force: bool = True):\n \"\"\"\n :param folder: a single Synapse Folder ID\n :param force: force add items from folder\n \"\"\"\n if not self.__dict__.get(\"folders_to_add\", None):\n self.__dict__[\"folders_to_add\"] = set()\n self.__dict__[\"folders_to_add\"].add(folder)\n # if self.force != force:\n self.force = force\n\n def add_folders(self, folders: List[str], force: bool = True):\n \"\"\"\n :param folders: a list of Synapse Folder IDs\n :param force: force add items from folders\n \"\"\"\n if (\n isinstance(folders, list)\n or isinstance(folders, set)\n or isinstance(folders, tuple)\n ):\n self.force = force\n for folder in folders:\n self.add_folder(folder, force)\n else:\n raise ValueError(f\"Not a list of Folder IDs: {folders}\")\n\n def _add_folder_files(self, syn, folder):\n files = []\n children = syn.getChildren(folder)\n for child in children:\n if child.get(\"type\") == \"org.sagebionetworks.repo.model.Folder\":\n files.extend(self._add_folder_files(syn, child.get(\"id\")))\n elif child.get(\"type\") == 
\"org.sagebionetworks.repo.model.FileEntity\":\n files.append(\n {\n \"entityId\": child.get(\"id\"),\n \"versionNumber\": child.get(\"versionNumber\"),\n }\n )\n else:\n raise ValueError(f\"Not a Folder?: {folder}\")\n return files\n\n def _before_synapse_store(self, syn):\n # Add files from folders (if any) before storing dataset.\n if self.folders_to_add:\n for folder in self.folders_to_add:\n items_to_add = self._add_folder_files(syn, folder)\n self.add_items(items_to_add, self.force)\n self.folders_to_add = set()\n # Must set this scopeIds is used to get all annotations from the\n # entities\n self.scopeIds = [item[\"entityId\"] for item in self.properties.datasetItems]\n super()._before_synapse_store(syn)\n # Reset attribute to force-add items from folders.\n self.force = True\n # Remap `datasetItems` back to `items` before storing (since `items`\n # is the accepted field name in the API, not `datasetItems`).\n self.properties.items = self.properties.datasetItems\n
"},{"location":"reference/tables/#synapseclient.table.Dataset-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.Dataset.add_item","title":"add_item(dataset_item, force=True)
","text":":param dataset_item: a single dataset item :param force: force add item
Source code insynapseclient/table.py
def add_item(self, dataset_item: Dict[str, str], force: bool = True):\n \"\"\"\n :param dataset_item: a single dataset item\n :param force: force add item\n \"\"\"\n if isinstance(dataset_item, dict) and self._check_needed_keys(\n dataset_item.keys()\n ):\n if not self.has_item(dataset_item.get(\"entityId\")):\n self.properties.datasetItems.append(dataset_item)\n else:\n if force:\n self.remove_item(dataset_item.get(\"entityId\"))\n self.properties.datasetItems.append(dataset_item)\n else:\n raise ValueError(\n f\"Duplicate item found: {dataset_item.get('entityId')}. \"\n \"Set force=True to overwrite the existing item.\"\n )\n else:\n raise ValueError(\"Not a DatasetItem? %s\" % str(dataset_item))\n
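A minimal usage sketch (assumes a logged-in client syn and a dataset already retrieved with syn.get; the ID and version are hypothetical)::
dataset.add_item({'entityId': \"syn111\", 'versionNumber': 2}, force=True) # replaces any existing entry for syn111\ndataset = syn.store(dataset)\n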
"},{"location":"reference/tables/#synapseclient.table.Dataset.add_items","title":"add_items(dataset_items, force=True)
","text":":param dataset_items: a list of dataset items :param force: force add items
Source code insynapseclient/table.py
def add_items(self, dataset_items: List[Dict[str, str]], force: bool = True):\n \"\"\"\n :param dataset_items: a list of dataset items\n :param force: force add items\n \"\"\"\n for dataset_item in dataset_items:\n self.add_item(dataset_item, force)\n
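For example, several items can be staged and stored in one go (hypothetical IDs; assumes syn and dataset as above)::
new_items = [\n {'entityId': \"syn222\", 'versionNumber': 2},\n {'entityId': \"syn333\", 'versionNumber': 1},\n]\ndataset.add_items(new_items, force=True)\ndataset = syn.store(dataset)\n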
"},{"location":"reference/tables/#synapseclient.table.Dataset.remove_item","title":"remove_item(item_id)
","text":":param item_id: a single dataset item Synapse ID
Source code insynapseclient/table.py
def remove_item(self, item_id: str):\n \"\"\"\n :param item_id: a single dataset item Synapse ID\n \"\"\"\n item_id = id_of(item_id)\n if item_id.startswith(\"syn\"):\n for i, curr_item in enumerate(self.properties.datasetItems):\n if curr_item.get(\"entityId\") == item_id:\n del self.properties.datasetItems[i]\n break\n else:\n raise ValueError(\"Not a Synapse ID: %s\" % str(item_id))\n
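For example (hypothetical ID; since the argument is passed through id_of, an entity object also works)::
dataset.remove_item(\"syn222\")\ndataset = syn.store(dataset)\n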
"},{"location":"reference/tables/#synapseclient.table.Dataset.has_item","title":"has_item(item_id)
","text":":param item_id: a single dataset item Synapse ID
Source code insynapseclient/table.py
def has_item(self, item_id):\n \"\"\"\n :param item_id: a single dataset item Synapse ID\n \"\"\"\n return any(item[\"entityId\"] == item_id for item in self.properties.datasetItems)\n
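A short sketch of guarding an add with has_item (hypothetical ID)::
if not dataset.has_item(\"syn333\"):\n dataset.add_item({'entityId': \"syn333\", 'versionNumber': 1})\n dataset = syn.store(dataset)\n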
"},{"location":"reference/tables/#synapseclient.table.Dataset.add_folder","title":"add_folder(folder, force=True)
","text":":param folder: a single Synapse Folder ID :param force: force add items from folder
Source code insynapseclient/table.py
def add_folder(self, folder: str, force: bool = True):\n \"\"\"\n :param folder: a single Synapse Folder ID\n :param force: force add items from folder\n \"\"\"\n if not self.__dict__.get(\"folders_to_add\", None):\n self.__dict__[\"folders_to_add\"] = set()\n self.__dict__[\"folders_to_add\"].add(folder)\n self.force = force\n
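Note that the folder is only queued locally; its files are resolved and added the next time the dataset is stored. For example (hypothetical ID)::
dataset.add_folder(\"syn123\")\ndataset = syn.store(dataset) # files under syn123 are added here\n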
"},{"location":"reference/tables/#synapseclient.table.Dataset.add_folders","title":"add_folders(folders, force=True)
","text":":param folders: a list of Synapse Folder IDs :param force: force add items from folders
Source code insynapseclient/table.py
def add_folders(self, folders: List[str], force: bool = True):\n \"\"\"\n :param folders: a list of Synapse Folder IDs\n :param force: force add items from folders\n \"\"\"\n if isinstance(folders, (list, set, tuple)):\n self.force = force\n for folder in folders:\n self.add_folder(folder, force)\n else:\n raise ValueError(f\"Not a list of Folder IDs: {folders}\")\n
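For example (hypothetical IDs)::
dataset.add_folders([\"syn456\", \"syn789\"], force=True)\ndataset = syn.store(dataset)\n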
"},{"location":"reference/tables/#synapseclient.table.EntityViewSchema","title":"EntityViewSchema
","text":" Bases: ViewBase
An EntityViewSchema is a :py:class:synapseclient.entity.Entity
that displays all files/projects (depending on user choice) within a given set of scopes.
:param name: the name of the Entity View Table object :param columns: a list of :py:class:Column
objects or their IDs. These are optional. :param parent: the project in Synapse to which this table belongs :param scopes: a list of Projects/Folders or their ids :param type: This field is deprecated. Please use includeEntityTypes
:param includeEntityTypes: a list of entity types to include in the view. Supported entity types are:
- EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n If none is provided, the view will default to include EntityViewType.FILE.\n
:param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.) Defaults to True. The default columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in the EntityViewSchema's scope. Defaults to True. The annotation columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param ignoredAnnotationColumnNames: A list of strings representing annotation names. When addAnnotationColumns is True, the names in this list will not be automatically added as columns to the EntityViewSchema if they exist in any of the defined scopes. :param properties: A map of Synapse properties :param annotations: A map of user defined annotations :param local_state: Internal use only
Example::
from synapseclient import EntityViewType\n\nproject = syn.get(\"syn123\")\nschema = syn.store(EntityViewSchema(name='MyTable', parent=project, scopes=[project.id],\n includeEntityTypes=[EntityViewType.FILE]))\n
Source code in synapseclient/table.py
class EntityViewSchema(ViewBase):\n \"\"\"\n An EntityViewSchema is a :py:class:`synapseclient.entity.Entity` that displays all files/projects\n (depending on user choice) within a given set of scopes\n\n :param name: the name of the Entity View Table object\n :param columns: a list of :py:class:`Column` objects or their IDs. These are optional.\n :param parent: the project in Synapse to which this table belongs\n :param scopes: a list of Projects/Folders or their ids\n :param type: This field is deprecated. Please use `includeEntityTypes`\n :param includeEntityTypes: a list of entity types to include in the view. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n If none is provided, the view will default to include EntityViewType.FILE.\n :param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.)\n Defaults to True.\n The default columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in\n the EntityViewSchema's scope. Defaults to True.\n The annotation columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param ignoredAnnotationColumnNames: A list of strings representing annotation names.\n When addAnnotationColumns is True, the names in this list will not be\n automatically added as columns to the EntityViewSchema if they exist in any\n of the defined scopes.\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n\n from synapseclient import EntityViewType\n\n project = syn.get(\"syn123\")\n schema = syn.store(EntityViewSchema(name='MyTable', parent=project, scopes=[project.id],\n includeEntityTypes=[EntityViewType.FILE]))\n\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.EntityView\"\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n scopes=None,\n type=None,\n includeEntityTypes=None,\n addDefaultViewColumns=True,\n addAnnotationColumns=True,\n ignoredAnnotationColumnNames=[],\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if includeEntityTypes:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask(includeEntityTypes)\n elif type:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask_for_deprecated_type(type)\n elif properties and \"type\" in properties:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask_for_deprecated_type(\n properties[\"type\"]\n )\n properties[\"type\"] = None\n\n self.ignoredAnnotationColumnNames = set(ignoredAnnotationColumnNames)\n super(EntityViewSchema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n\n # This is a hacky solution to make sure we don't try to add columns to schemas that we retrieve from synapse\n is_from_normal_constructor = not (properties or local_state)\n # allowing annotations because user might want to update annotations all at once\n self.addDefaultViewColumns = (\n addDefaultViewColumns and is_from_normal_constructor\n )\n self.addAnnotationColumns = addAnnotationColumns and is_from_normal_constructor\n\n # set default values after constructor so we don't overwrite the values defined in 
properties using .get()\n # because properties, unlike local_state, do not have nonexistent keys assigned with a value of None\n if self.get(\"viewTypeMask\") is None:\n self.viewTypeMask = EntityViewType.FILE.value\n if self.get(\"scopeIds\") is None:\n self.scopeIds = []\n\n # add the scopes last so that we can append the passed in scopes to those defined in properties\n if scopes is not None:\n self.add_scope(scopes)\n\n def set_entity_types(self, includeEntityTypes):\n \"\"\"\n :param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous\n settings. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n \"\"\"\n self.viewTypeMask = _get_view_type_mask(includeEntityTypes)\n
"},{"location":"reference/tables/#synapseclient.table.EntityViewSchema-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.EntityViewSchema.set_entity_types","title":"set_entity_types(includeEntityTypes)
","text":":param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous settings. Supported entity types are:
- EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n
Source code in synapseclient/table.py
def set_entity_types(self, includeEntityTypes):\n \"\"\"\n :param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous\n settings. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n \"\"\"\n self.viewTypeMask = _get_view_type_mask(includeEntityTypes)\n
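A brief sketch of switching an existing view to show both files and folders (assumes a logged-in client syn; syn123 is a hypothetical view ID)::
from synapseclient import EntityViewType\n\nview = syn.get(\"syn123\")\nview.set_entity_types([EntityViewType.FILE, EntityViewType.FOLDER])\nview = syn.store(view)\n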
"},{"location":"reference/tables/#synapseclient.table.SubmissionViewSchema","title":"SubmissionViewSchema
","text":" Bases: ViewBase
A SubmissionViewSchema is a :py:class:synapseclient.entity.Entity
that displays all Submissions within a given set of Evaluation queue scopes.
:param name: the name of the Submission View Table object :param columns: a list of :py:class:Column
objects or their IDs. These are optional. :param parent: the project in Synapse to which this table belongs :param scopes: a list of Evaluation Queues or their ids :param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.) Defaults to True. The default columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in the SubmissionViewSchema's scope. Defaults to True. The annotation columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param ignoredAnnotationColumnNames: A list of strings representing annotation names. When addAnnotationColumns is True, the names in this list will not be automatically added as columns to the SubmissionViewSchema if they exist in any of the defined scopes. :param properties: A map of Synapse properties :param annotations: A map of user defined annotations :param local_state: Internal use only
Example::
from synapseclient import SubmissionViewSchema\n\nproject = syn.get(\"syn123\")\nschema = syn.store(SubmissionViewSchema(name='My Submission View', parent=project, scopes=['9614543']))\n
Source code in synapseclient/table.py
class SubmissionViewSchema(ViewBase):\n \"\"\"\n A SubmissionViewSchema is a :py:class:`synapseclient.entity.Entity` that displays all Submissions\n within a given set of Evaluation queue scopes\n\n :param name: the name of the Submission View Table object\n :param columns: a list of :py:class:`Column` objects or their IDs. These are optional.\n :param parent: the project in Synapse to which this table belongs\n :param scopes: a list of Evaluation Queues or their ids\n :param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.)\n Defaults to True.\n The default columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in\n the SubmissionViewSchema's scope. Defaults to True.\n The annotation columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param ignoredAnnotationColumnNames: A list of strings representing annotation names.\n When addAnnotationColumns is True, the names in this list will not be\n automatically added as columns to the SubmissionViewSchema if they exist in\n any of the defined scopes.\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n from synapseclient import SubmissionViewSchema\n\n project = syn.get(\"syn123\")\n schema = syn.store(SubmissionViewSchema(name='My Submission View', parent=project, scopes=['9614543']))\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.SubmissionView\"\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n scopes=None,\n addDefaultViewColumns=True,\n addAnnotationColumns=True,\n ignoredAnnotationColumnNames=[],\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n self.ignoredAnnotationColumnNames = set(ignoredAnnotationColumnNames)\n super(SubmissionViewSchema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n # This is a hacky solution to make sure we don't try to add columns to schemas that we retrieve from synapse\n is_from_normal_constructor = not (properties or local_state)\n # allowing annotations because user might want to update annotations all at once\n self.addDefaultViewColumns = (\n addDefaultViewColumns and is_from_normal_constructor\n )\n self.addAnnotationColumns = addAnnotationColumns and is_from_normal_constructor\n\n if self.get(\"scopeIds\") is None:\n self.scopeIds = []\n\n # add the scopes last so that we can append the passed in scopes to those defined in properties\n if scopes is not None:\n self.add_scope(scopes)\n
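A sketch of the annotation-column options (the queue ID and the ignored column name are hypothetical; assumes a logged-in client syn)::
from synapseclient import SubmissionViewSchema\n\nproject = syn.get(\"syn123\")\nschema = syn.store(SubmissionViewSchema(\n name='My Submission View',\n parent=project,\n scopes=['9614543'],\n ignoredAnnotationColumnNames=['internal_only']))\n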
"},{"location":"reference/tables/#synapseclient.table.SelectColumn","title":"SelectColumn
","text":" Bases: DictObject
Defines a column to be used in a table :py:class:synapseclient.table.Schema
.
:var id: An immutable ID issued by the platform :param columnType: Can be any of: \"STRING\", \"DOUBLE\", \"INTEGER\", \"BOOLEAN\", \"DATE\", \"FILEHANDLEID\", \"ENTITYID\" :param name: The display name of the column
:type id: string :type columnType: string :type name: string
Source code insynapseclient/table.py
class SelectColumn(DictObject):\n \"\"\"\n Defines a column to be used in a table :py:class:`synapseclient.table.Schema`.\n\n :var id: An immutable ID issued by the platform\n :param columnType: Can be any of: \"STRING\", \"DOUBLE\", \"INTEGER\", \"BOOLEAN\", \"DATE\", \"FILEHANDLEID\", \"ENTITYID\"\n :param name: The display name of the column\n\n :type id: string\n :type columnType: string\n :type name: string\n \"\"\"\n\n def __init__(self, id=None, columnType=None, name=None, **kwargs):\n super(SelectColumn, self).__init__()\n if id:\n self.id = id\n\n if name:\n self.name = name\n\n if columnType:\n self.columnType = columnType\n\n # Note: this param is only used to support forward compatibility.\n self.update(kwargs)\n\n @classmethod\n def from_column(cls, column):\n return cls(\n column.get(\"id\", None),\n column.get(\"columnType\", None),\n column.get(\"name\", None),\n )\n
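A minimal sketch of from_column (the column definition is hypothetical)::
from synapseclient.table import Column, SelectColumn\n\ncol = Column(name='genotype', columnType='STRING', maximumSize=50)\nselect_col = SelectColumn.from_column(col) # keeps only id, columnType and name\n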
"},{"location":"reference/tables/#synapseclient.table.Column","title":"Column
","text":" Bases: DictObject
Defines a column to be used in a table :py:class:synapseclient.table.Schema
or :py:class:synapseclient.table.EntityViewSchema
.
:var id: An immutable ID issued by the platform :param columnType: The column type determines the type of data that can be stored in a column. It can be any of: "STRING", "DOUBLE", "INTEGER", "BOOLEAN", "DATE", "FILEHANDLEID", "ENTITYID", "LINK", "LARGETEXT", "USERID". For more information, please see: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/ColumnType.html :param maximumSize: A parameter for columnTypes with a maximum size. For example, ColumnType.STRINGs have a default maximum size of 50 characters, but can be set to a maximumSize of 1 to 1000 characters. :param maximumListLength: Required if using a columnType with a "_LIST" suffix. Describes the maximum number of values that will appear in that list. Value range 1-100 inclusive. Default 100 :param name: The display name of the column :param enumValues: Columns of type STRING can be constrained to an enumeration of values set on this list. :param defaultValue: The default value for this column. Columns of type FILEHANDLEID and ENTITYID are not allowed to have default values.
:type id: string :type maximumSize: integer :type maximumListLength: integer :type columnType: string :type name: string :type enumValues: array of strings :type defaultValue: string
Source code insynapseclient/table.py
class Column(DictObject):\n \"\"\"\n Defines a column to be used in a table :py:class:`synapseclient.table.Schema` or\n :py:class:`synapseclient.table.EntityViewSchema`.\n\n :var id: An immutable ID issued by the platform\n :param columnType: The column type determines the type of data that can be stored in a column. It can be any\n of: \"STRING\", \"DOUBLE\", \"INTEGER\", \"BOOLEAN\", \"DATE\", \"FILEHANDLEID\", \"ENTITYID\", \"LINK\",\n \"LARGETEXT\", \"USERID\". For more information, please see:\n https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/ColumnType.html\n :param maximumSize: A parameter for columnTypes with a maximum size. For example, ColumnType.STRINGs have a\n default maximum size of 50 characters, but can be set to a maximumSize of 1 to 1000\n characters.\n :param maximumListLength: Required if using a columnType with a \"_LIST\" suffix. Describes the maximum number of\n values that will appear in that list. Value range 1-100 inclusive. Default 100\n :param name: The display name of the column\n :param enumValues: Columns of type STRING can be constrained to an enumeration of values set on this list.\n :param defaultValue: The default value for this column. Columns of type FILEHANDLEID and ENTITYID are not\n allowed to have default values.\n\n :type id: string\n :type maximumSize: integer\n :type maximumListLength: integer\n :type columnType: string\n :type name: string\n :type enumValues: array of strings\n :type defaultValue: string\n \"\"\"\n\n @classmethod\n def getURI(cls, id):\n return \"/column/%s\" % id\n\n def __init__(self, **kwargs):\n super(Column, self).__init__(kwargs)\n self[\"concreteType\"] = concrete_types.COLUMN_MODEL\n\n def postURI(self):\n return \"/column\"\n
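Columns are typically defined together with a Schema; a minimal sketch (assumes a logged-in client syn and an existing project)::
from synapseclient import Column, Schema\n\ncols = [\n Column(name='species', columnType='STRING', maximumSize=50),\n Column(name='age', columnType='INTEGER'),\n]\nschema = syn.store(Schema(name='My Table', columns=cols, parent=project))\n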
"},{"location":"reference/tables/#synapseclient.table.AppendableRowset","title":"AppendableRowset
","text":" Bases: DictObject
Abstract Base Class for :py:class:Rowset
and :py:class:PartialRowset
synapseclient/table.py
class AppendableRowset(DictObject, metaclass=abc.ABCMeta):\n \"\"\"Abstract Base Class for :py:class:`RowSet` and :py:class:`PartialRowset`\"\"\"\n\n @abc.abstractmethod\n def __init__(self, schema, **kwargs):\n if (\"tableId\" not in kwargs) and schema:\n kwargs[\"tableId\"] = id_of(schema)\n\n if not kwargs.get(\"tableId\", None):\n raise ValueError(\n \"Table schema ID must be defined to create a %s\" % type(self).__name__\n )\n super(AppendableRowset, self).__init__(kwargs)\n\n def _synapse_store(self, syn):\n \"\"\"\n Creates and POSTs an AppendableRowSetRequest_\n\n .. _AppendableRowSetRequest:\n https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/AppendableRowSetRequest.html\n \"\"\"\n append_rowset_request = {\n \"concreteType\": concrete_types.APPENDABLE_ROWSET_REQUEST,\n \"toAppend\": self,\n \"entityId\": self.tableId,\n }\n\n response = syn._async_table_update(\n self.tableId, [append_rowset_request], wait=True\n )\n syn._check_table_transaction_response(response)\n return response[\"results\"][0]\n
"},{"location":"reference/tables/#synapseclient.table.PartialRowset","title":"PartialRowset
","text":" Bases: AppendableRowset
A set of Partial Rows used for updating cells of a table. PartialRowsets allow you to push only the individual cells you wish to change instead of pushing entire rows with many unchanged cells.
Example::
#### the following code will change cells in a hypothetical table, syn123:\n#### these same steps will also work for using EntityView tables to change Entity annotations\n#\n# fooCol | barCol fooCol | barCol\n# ----------------- =======> ----------------------\n# foo1 | bar1 foo foo1 | bar1\n# foo2 | bar2 foo2 | bar bar 2\n
query_results = syn.tableQuery(\"SELECT * FROM syn123\")\n\n# The easiest way to know the rowId of the row you wish to change\n# is by converting the table to a pandas DataFrame with rowIdAndVersionInIndex=False\ndf = query_results.asDataFrame(rowIdAndVersionInIndex=False)\n\npartial_changes = {df['ROW_ID'][0]: {'fooCol': 'foo foo 1'},\n df['ROW_ID'][1]: {'barCol': 'bar bar 2'}}\n\n# you will need to pass in your original query result as an argument\n# so that we can perform column id translation and etag retrieval on your behalf:\npartial_rowset = PartialRowset.from_mapping(partial_changes, query_results)\nsyn.store(partial_rowset)\n
:param schema: The :py:class:Schema
of the table to update or its tableId as a string :param rows: A list of PartialRows
synapseclient/table.py
class PartialRowset(AppendableRowset):\n \"\"\"A set of Partial Rows used for updating cells of a table.\n PartialRowsets allow you to push only the individual cells you wish to change instead of pushing entire rows with\n many unchanged cells.\n\n Example::\n #### the following code will change cells in a hypothetical table, syn123:\n #### these same steps will also work for using EntityView tables to change Entity annotations\n #\n # fooCol | barCol fooCol | barCol\n # ----------------- =======> ----------------------\n # foo1 | bar1 foo foo1 | bar1\n # foo2 | bar2 foo2 | bar bar 2\n\n query_results = syn.tableQuery(\"SELECT * FROM syn123\")\n\n # The easiest way to know the rowId of the row you wish to change\n # is by converting the table to a pandas DataFrame with rowIdAndVersionInIndex=False\n df = query_results.asDataFrame(rowIdAndVersionInIndex=False)\n\n partial_changes = {df['ROW_ID'][0]: {'fooCol': 'foo foo 1'},\n df['ROW_ID'][1]: {'barCol': 'bar bar 2'}}\n\n # you will need to pass in your original query result as an argument\n # so that we can perform column id translation and etag retrieval on your behalf:\n partial_rowset = PartialRowset.from_mapping(partial_changes, query_results)\n syn.store(partial_rowset)\n\n :param schema: The :py:class:`Schema` of the table to update or its tableId as a string\n :param rows: A list of PartialRows\n \"\"\"\n\n @classmethod\n def from_mapping(cls, mapping, originalQueryResult):\n \"\"\"Creates a PartialRowset\n :param mapping: A mapping of mappings in the structure: {ROW_ID : {COLUMN_NAME: NEW_COL_VALUE}}\n :param originalQueryResult:\n :return: a PartialRowSet that can be syn.store()-ed to apply the changes\n \"\"\"\n if not isinstance(mapping, collections.abc.Mapping):\n raise ValueError(\"mapping must be a supported Mapping type such as 'dict'\")\n\n try:\n name_to_column_id = {\n col.name: col.id for col in originalQueryResult.headers if \"id\" in col\n }\n except AttributeError:\n raise ValueError(\n \"originalQueryResult must be the result of a syn.tableQuery()\"\n )\n\n row_ids = set(int(id) for id in mapping.keys())\n\n # row_ids in the originalQueryResult are not guaranteed to be in ascending order\n # iterate over all etags but only map the row_ids used for this partial update to their etags\n row_etags = {\n row_id: etag\n for row_id, row_version, etag in originalQueryResult.iter_row_metadata()\n if row_id in row_ids and etag is not None\n }\n\n partial_rows = [\n PartialRow(\n row_changes,\n row_id,\n etag=row_etags.get(int(row_id)),\n nameToColumnId=name_to_column_id,\n )\n for row_id, row_changes in mapping.items()\n ]\n\n return cls(originalQueryResult.tableId, partial_rows)\n\n def __init__(self, schema, rows):\n super(PartialRowset, self).__init__(schema)\n self.concreteType = concrete_types.PARTIAL_ROW_SET\n\n if isinstance(rows, PartialRow):\n self.rows = [rows]\n else:\n try:\n if all(isinstance(row, PartialRow) for row in rows):\n self.rows = list(rows)\n else:\n raise ValueError(\"rows must contain only values of type PartialRow\")\n except TypeError:\n raise ValueError(\"rows must be iterable\")\n
"},{"location":"reference/tables/#synapseclient.table.PartialRowset-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.PartialRowset.from_mapping","title":"from_mapping(mapping, originalQueryResult)
classmethod
","text":"Creates a PartialRowset :param mapping: A mapping of mappings in the structure: {ROW_ID : {COLUMN_NAME: NEW_COL_VALUE}} :param originalQueryResult: :return: a PartialRowSet that can be syn.store()-ed to apply the changes
Source code insynapseclient/table.py
@classmethod\ndef from_mapping(cls, mapping, originalQueryResult):\n \"\"\"Creates a PartialRowset\n :param mapping: A mapping of mappings in the structure: {ROW_ID : {COLUMN_NAME: NEW_COL_VALUE}}\n :param originalQueryResult:\n :return: a PartialRowSet that can be syn.store()-ed to apply the changes\n \"\"\"\n if not isinstance(mapping, collections.abc.Mapping):\n raise ValueError(\"mapping must be a supported Mapping type such as 'dict'\")\n\n try:\n name_to_column_id = {\n col.name: col.id for col in originalQueryResult.headers if \"id\" in col\n }\n except AttributeError:\n raise ValueError(\n \"originalQueryResult must be the result of a syn.tableQuery()\"\n )\n\n row_ids = set(int(id) for id in mapping.keys())\n\n # row_ids in the originalQueryResult are not guaranteed to be in ascending order\n # iterate over all etags but only map the row_ids used for this partial update to their etags\n row_etags = {\n row_id: etag\n for row_id, row_version, etag in originalQueryResult.iter_row_metadata()\n if row_id in row_ids and etag is not None\n }\n\n partial_rows = [\n PartialRow(\n row_changes,\n row_id,\n etag=row_etags.get(int(row_id)),\n nameToColumnId=name_to_column_id,\n )\n for row_id, row_changes in mapping.items()\n ]\n\n return cls(originalQueryResult.tableId, partial_rows)\n
"},{"location":"reference/tables/#synapseclient.table.RowSet","title":"RowSet
","text":" Bases: AppendableRowset
A Synapse object of type org.sagebionetworks.repo.model.table.RowSet <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/RowSet.html>
_.
:param schema: A :py:class:synapseclient.table.Schema
object that will be used to set the tableId :param headers: The list of SelectColumn objects that describe the fields in each row. :param columns: An alternative to 'headers', a list of column objects that describe the fields in each row. :param tableId: The ID of the TableEntity that owns these rows :param rows: The :py:class:synapseclient.table.Row
s of this set. The index of each row value aligns with the index of each header. :var etag: Any RowSet returned from Synapse will contain the current etag of the change set. To update any rows from a RowSet the etag must be provided with the POST.
:type headers: array of SelectColumns :type etag: string :type tableId: string :type rows: array of rows
Source code insynapseclient/table.py
class RowSet(AppendableRowset):\n \"\"\"\n A Synapse object of type `org.sagebionetworks.repo.model.table.RowSet \\\n <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/RowSet.html>`_.\n\n :param schema: A :py:class:`synapseclient.table.Schema` object that will be used to set the tableId\n :param headers: The list of SelectColumn objects that describe the fields in each row.\n :param columns: An alternative to 'headers', a list of column objects that describe the fields in each row.\n :param tableId: The ID of the TableEntity that owns these rows\n :param rows: The :py:class:`synapseclient.table.Row` s of this set. The index of each row value aligns with the\n index of each header.\n :var etag: Any RowSet returned from Synapse will contain the current etag of the change set. To update any\n rows from a RowSet the etag must be provided with the POST.\n\n :type headers: array of SelectColumns\n :type etag: string\n :type tableId: string\n :type rows: array of rows\n \"\"\"\n\n @classmethod\n def from_json(cls, json):\n headers = [SelectColumn(**header) for header in json.get(\"headers\", [])]\n rows = [cast_row(Row(**row), headers) for row in json.get(\"rows\", [])]\n return cls(\n headers=headers,\n rows=rows,\n **{key: json[key] for key in json.keys() if key not in [\"headers\", \"rows\"]},\n )\n\n def __init__(self, columns=None, schema=None, **kwargs):\n if \"headers\" not in kwargs:\n if columns and schema:\n raise ValueError(\n \"Please use either 'columns' or 'schema' as an argument, but not both.\"\n )\n if columns:\n kwargs.setdefault(\"headers\", []).extend(\n [SelectColumn.from_column(column) for column in columns]\n )\n elif schema and isinstance(schema, Schema):\n kwargs.setdefault(\"headers\", []).extend(\n [SelectColumn(id=id) for id in schema[\"columnIds\"]]\n )\n\n if not kwargs.get(\"headers\", None):\n raise ValueError(\"Column headers must be defined to create a RowSet\")\n kwargs[\"concreteType\"] = \"org.sagebionetworks.repo.model.table.RowSet\"\n\n super(RowSet, self).__init__(schema, **kwargs)\n\n def _synapse_store(self, syn):\n response = super(RowSet, self)._synapse_store(syn)\n return response.get(\"rowReferenceSet\", response)\n\n def _synapse_delete(self, syn):\n \"\"\"\n Delete the rows in the RowSet.\n Example::\n syn.delete(syn.tableQuery('select name from %s where no_good = true' % schema1.id))\n \"\"\"\n row_id_vers_generator = ((row.rowId, row.versionNumber) for row in self.rows)\n _delete_rows(syn, self.tableId, row_id_vers_generator)\n
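A minimal sketch of appending rows with a RowSet (assumes a stored Schema in schema and a logged-in client syn; the values are hypothetical and must follow the schema's column order)::
from synapseclient import Row, RowSet\n\nrow_set = RowSet(schema=schema,\n rows=[Row(['foo', 72]), Row(['bar', 73])])\nrow_reference_set = syn.store(row_set)\n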
"},{"location":"reference/tables/#synapseclient.table.Row","title":"Row
","text":" Bases: DictObject
A row <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/Row.html>
_ in a Table.
:param values: A list of values :param rowId: The immutable ID issued to a new row :param versionNumber: The version number of this row. Each row version is immutable, so when a row is updated a new version is created.
Source code insynapseclient/table.py
class Row(DictObject):\n \"\"\"\n A `row <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/Row.html>`_ in a Table.\n\n :param values: A list of values\n :param rowId: The immutable ID issued to a new row\n :param versionNumber: The version number of this row. Each row version is immutable, so when a row is updated a\n new version is created.\n \"\"\"\n\n def __init__(self, values, rowId=None, versionNumber=None, etag=None, **kwargs):\n super(Row, self).__init__()\n self.values = values\n if rowId is not None:\n self.rowId = rowId\n if versionNumber is not None:\n self.versionNumber = versionNumber\n if etag is not None:\n self.etag = etag\n\n # Note: this param is only used to support forward compatibility.\n self.update(kwargs)\n
"},{"location":"reference/tables/#synapseclient.table.PartialRow","title":"PartialRow
","text":" Bases: DictObject
This is a lower-level class for use in :py:class::PartialRowSet
to update individual cells within a table.
It is recommended you use :py:classmethod::PartialRowSet.from_mapping
to construct partial change sets to a table.
If you want to do the tedious parts yourself:
To change cells in the \"foo\"(colId:1234) and \"bar\"(colId:456) columns of a row with rowId=5 :: rowId = 5
#pass in with columnIds as key:\nPartialRow({123: 'fooVal', 456:'barVal'}, rowId)\n\n#pass in with a nameToColumnId argument\n\n#manually define:\nnameToColumnId = {'foo':123, 'bar':456}\n#OR if you have the result of a tableQuery() you can generate nameToColumnId using:\nquery_result = syn.tableQuery(\"SELECT * FROM syn123\")\nnameToColumnId = {col.name:col.id for col in query_result.headers}\n\nPartialRow({'foo': 'fooVal', 'bar':'barVal'}, rowId, nameToColumnId=nameToColumnId)\n
:param values: A Mapping where: - key is name of the column (or its columnId) to change in the desired row - value is the new desired value for that column :param rowId: The id of the row to be updated :param etag: used for updating File/Project Views(::py:class:EntityViewSchema
). Not necessary for a (::py:class:Schema
) Table :param nameToColumnId: Optional map column names to column Ids. If this is provided, the keys of your values
Mapping will be replaced with the column ids in the nameToColumnId
dict. Include this as an argument when you are providing the column names instead of columnIds as the keys to the values
Mapping.
synapseclient/table.py
class PartialRow(DictObject):\n \"\"\"This is a lower-level class for use in :py:class::`PartialRowSet` to update individual cells within a table.\n\n It is recommended you use :py:classmethod::`PartialRowSet.from_mapping` to construct partial change sets to a table.\n\n If you want to do the tedious parts yourself:\n\n To change cells in the \"foo\"(colId:123) and \"bar\"(colId:456) columns of a row with rowId=5 ::\n rowId = 5\n\n #pass in with columnIds as key:\n PartialRow({123: 'fooVal', 456:'barVal'}, rowId)\n\n #pass in with a nameToColumnId argument\n\n #manually define:\n nameToColumnId = {'foo':123, 'bar':456}\n #OR if you have the result of a tableQuery() you can generate nameToColumnId using:\n query_result = syn.tableQuery(\"SELECT * FROM syn123\")\n nameToColumnId = {col.name:col.id for col in query_result.headers}\n\n PartialRow({'foo': 'fooVal', 'bar':'barVal'}, rowId, nameToColumnId=nameToColumnId)\n\n :param values: A Mapping where:\n - key is name of the column (or its columnId) to change in the desired row\n - value is the new desired value for that column\n :param rowId: The id of the row to be updated\n :param etag: used for updating File/Project Views(::py:class:`EntityViewSchema`). Not necessary for a\n (::py:class:`Schema`) Table\n :param nameToColumnId: Optional map column names to column Ids. If this is provided, the keys of your `values`\n Mapping will be replaced with the column ids in the `nameToColumnId` dict. Include this\n as an argument when you are providing the column names instead of columnIds as the keys\n to the `values` Mapping.\n\n \"\"\"\n\n def __init__(self, values, rowId, etag=None, nameToColumnId=None):\n super(PartialRow, self).__init__()\n if not isinstance(values, collections.abc.Mapping):\n raise ValueError(\"values must be a Mapping\")\n\n rowId = int(rowId)\n\n self.values = [\n {\n \"key\": nameToColumnId[x_key] if nameToColumnId is not None else x_key,\n \"value\": x_value,\n }\n for x_key, x_value in values.items()\n ]\n self.rowId = rowId\n if etag is not None:\n self.etag = etag\n
"},{"location":"reference/tables/#synapseclient.table.TableAbstractBaseClass","title":"TableAbstractBaseClass
","text":" Bases: Iterable
, Sized
Abstract base class for Tables based on different data containers.
Source code insynapseclient/table.py
class TableAbstractBaseClass(collections.abc.Iterable, collections.abc.Sized):\n \"\"\"\n Abstract base class for Tables based on different data containers.\n \"\"\"\n\n RowMetadataTuple = collections.namedtuple(\n \"RowMetadataTuple\", [\"row_id\", \"row_version\", \"row_etag\"]\n )\n\n def __init__(self, schema, headers=None, etag=None):\n if isinstance(schema, Schema):\n self.schema = schema\n self.tableId = schema.id if schema and \"id\" in schema else None\n self.headers = (\n headers if headers else [SelectColumn(id=id) for id in schema.columnIds]\n )\n self.etag = etag\n elif isinstance(schema, str):\n self.schema = None\n self.tableId = schema\n self.headers = headers\n self.etag = etag\n else:\n raise ValueError(\"Must provide a schema or a synapse ID of a Table Entity\")\n\n def asDataFrame(self):\n raise NotImplementedError()\n\n def asRowSet(self):\n return RowSet(\n headers=self.headers,\n tableId=self.tableId,\n etag=self.etag,\n rows=[row if isinstance(row, Row) else Row(row) for row in self],\n )\n\n def _synapse_store(self, syn):\n raise NotImplementedError()\n\n def _synapse_delete(self, syn):\n \"\"\"\n Delete the rows that result from a table query.\n\n Example::\n syn.delete(syn.tableQuery('select name from %s where no_good = true' % schema1.id))\n \"\"\"\n row_id_vers_generator = (\n (metadata.row_id, metadata.row_version)\n for metadata in self.iter_row_metadata()\n )\n _delete_rows(syn, self.tableId, row_id_vers_generator)\n\n @abc.abstractmethod\n def iter_row_metadata(self):\n \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will\n be generated as (row_id, row_version, None)\n\n :return: a generator that gives :py:class::`collections.namedtuple` with format (row_id, row_version, row_etag)\n \"\"\"\n pass\n
"},{"location":"reference/tables/#synapseclient.table.TableAbstractBaseClass-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.TableAbstractBaseClass.iter_row_metadata","title":"iter_row_metadata()
abstractmethod
","text":"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will generated as (row_id, None)
:return: a generator that gives :py:class::collections.namedtuple
with format (row_id, row_etag)
synapseclient/table.py
@abc.abstractmethod\ndef iter_row_metadata(self):\n \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will\n be generated as (row_id, row_version, None)\n\n :return: a generator that gives :py:class::`collections.namedtuple` with format (row_id, row_version, row_etag)\n \"\"\"\n pass\n
"},{"location":"reference/tables/#synapseclient.table.RowSetTable","title":"RowSetTable
","text":" Bases: TableAbstractBaseClass
A Table object that wraps a RowSet.
Source code insynapseclient/table.py
class RowSetTable(TableAbstractBaseClass):\n \"\"\"\n A Table object that wraps a RowSet.\n \"\"\"\n\n def __init__(self, schema, rowset):\n super(RowSetTable, self).__init__(schema, etag=rowset.get(\"etag\", None))\n self.rowset = rowset\n\n def _synapse_store(self, syn):\n row_reference_set = syn.store(self.rowset)\n return RowSetTable(self.schema, row_reference_set)\n\n def asDataFrame(self):\n test_import_pandas()\n import pandas as pd\n\n if any(row[\"rowId\"] for row in self.rowset[\"rows\"]):\n rownames = row_labels_from_rows(self.rowset[\"rows\"])\n else:\n rownames = None\n\n series = collections.OrderedDict()\n for i, header in enumerate(self.rowset[\"headers\"]):\n series[header.name] = pd.Series(\n name=header.name,\n data=[row[\"values\"][i] for row in self.rowset[\"rows\"]],\n index=rownames,\n )\n\n return pd.DataFrame(data=series, index=rownames)\n\n def asRowSet(self):\n return self.rowset\n\n def __iter__(self):\n def iterate_rows(rows, headers):\n for row in rows:\n yield cast_values(row, headers)\n\n return iterate_rows(self.rowset[\"rows\"], self.rowset[\"headers\"])\n\n def __len__(self):\n return len(self.rowset[\"rows\"])\n\n def iter_row_metadata(self):\n raise NotImplementedError(\"iter_row_metadata is not supported for RowSetTable\")\n
"},{"location":"reference/tables/#synapseclient.table.TableQueryResult","title":"TableQueryResult
","text":" Bases: TableAbstractBaseClass
An object to wrap rows returned as a result of a table query. The TableQueryResult object can be used to iterate over results of a query.
Example ::
results = syn.tableQuery(\"select * from syn1234\")\nfor row in results:\n print(row)\n
Source code in synapseclient/table.py
class TableQueryResult(TableAbstractBaseClass):\n \"\"\"\n An object to wrap rows returned as a result of a table query.\n The TableQueryResult object can be used to iterate over results of a query.\n\n Example ::\n\n results = syn.tableQuery(\"select * from syn1234\")\n for row in results:\n print(row)\n \"\"\"\n\n def __init__(self, synapse, query, limit=None, offset=None, isConsistent=True):\n self.syn = synapse\n\n self.query = query\n self.limit = limit\n self.offset = offset\n self.isConsistent = isConsistent\n\n result = self.syn._queryTable(\n query=query, limit=limit, offset=offset, isConsistent=isConsistent\n )\n\n self.rowset = RowSet.from_json(result[\"queryResult\"][\"queryResults\"])\n\n self.columnModels = [Column(**col) for col in result.get(\"columnModels\", [])]\n self.nextPageToken = result[\"queryResult\"].get(\"nextPageToken\", None)\n self.count = result.get(\"queryCount\", None)\n self.maxRowsPerPage = result.get(\"maxRowsPerPage\", None)\n self.i = -1\n\n super(TableQueryResult, self).__init__(\n schema=self.rowset.get(\"tableId\", None),\n headers=self.rowset.headers,\n etag=self.rowset.get(\"etag\", None),\n )\n\n def _synapse_store(self, syn):\n raise SynapseError(\n \"A TableQueryResult is a read only object and can't be stored in Synapse. Convert to a\"\n \" DataFrame or RowSet instead.\"\n )\n\n def asDataFrame(self, rowIdAndVersionInIndex=True):\n \"\"\"\n Convert query result to a Pandas DataFrame.\n\n :param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version (and row_etag\n if it exists)\n\n \"\"\"\n test_import_pandas()\n import pandas as pd\n\n # To turn a TableQueryResult into a data frame, we add a page of rows\n # at a time on the untested theory that it's more efficient than\n # adding a single row at a time to the data frame.\n\n def construct_rownames(rowset, offset=0):\n try:\n return (\n row_labels_from_rows(rowset[\"rows\"])\n if rowIdAndVersionInIndex\n else None\n )\n except KeyError:\n # if we don't have row id and version, just number the rows\n # python3 cast range to list for safety\n return list(range(offset, offset + len(rowset[\"rows\"])))\n\n # first page of rows\n offset = 0\n rownames = construct_rownames(self.rowset, offset)\n offset += len(self.rowset[\"rows\"])\n series = collections.OrderedDict()\n\n if not rowIdAndVersionInIndex:\n # Since we use an OrderedDict this must happen before we construct the other columns\n # add row id, version, and etag as columns\n append_etag = False # only useful when (not rowIdAndVersionInIndex), hooray for lazy variables!\n series[\"ROW_ID\"] = pd.Series(\n name=\"ROW_ID\", data=[row[\"rowId\"] for row in self.rowset[\"rows\"]]\n )\n series[\"ROW_VERSION\"] = pd.Series(\n name=\"ROW_VERSION\",\n data=[row[\"versionNumber\"] for row in self.rowset[\"rows\"]],\n )\n\n row_etag = [row.get(\"etag\") for row in self.rowset[\"rows\"]]\n if any(row_etag):\n append_etag = True\n series[\"ROW_ETAG\"] = pd.Series(name=\"ROW_ETAG\", data=row_etag)\n\n for i, header in enumerate(self.rowset[\"headers\"]):\n column_name = header.name\n series[column_name] = pd.Series(\n name=column_name,\n data=[row[\"values\"][i] for row in self.rowset[\"rows\"]],\n index=rownames,\n )\n\n # subsequent pages of rows\n while self.nextPageToken:\n result = self.syn._queryTableNext(self.nextPageToken, self.tableId)\n self.rowset = RowSet.from_json(result[\"queryResults\"])\n self.nextPageToken = result.get(\"nextPageToken\", None)\n self.i = 0\n\n rownames = construct_rownames(self.rowset, offset)\n offset += len(self.rowset[\"rows\"])\n\n if not rowIdAndVersionInIndex:\n # append this page's row metadata; the result of pd.concat must be\n # assigned back (Series.append returned a new object, and its\n # result was previously discarded)\n series[\"ROW_ID\"] = pd.concat(\n [\n series[\"ROW_ID\"],\n pd.Series(\n name=\"ROW_ID\",\n data=[row[\"rowId\"] for row in self.rowset[\"rows\"]],\n ),\n ],\n ignore_index=True,\n )\n series[\"ROW_VERSION\"] = pd.concat(\n [\n series[\"ROW_VERSION\"],\n pd.Series(\n name=\"ROW_VERSION\",\n data=[row[\"versionNumber\"] for row in self.rowset[\"rows\"]],\n ),\n ],\n ignore_index=True,\n )\n if append_etag:\n series[\"ROW_ETAG\"] = pd.concat(\n [\n series[\"ROW_ETAG\"],\n pd.Series(\n name=\"ROW_ETAG\",\n data=[row.get(\"etag\") for row in self.rowset[\"rows\"]],\n ),\n ],\n ignore_index=True,\n )\n\n for i, header in enumerate(self.rowset[\"headers\"]):\n column_name = header.name\n series[column_name] = pd.concat(\n [\n series[column_name],\n pd.Series(\n name=column_name,\n data=[row[\"values\"][i] for row in self.rowset[\"rows\"]],\n index=rownames,\n ),\n ],\n # can't verify integrity when indices are just numbers instead of 'rowid_rowversion'\n verify_integrity=rowIdAndVersionInIndex,\n )\n\n return pd.DataFrame(data=series)\n\n def asRowSet(self):\n # Note that as of stack 60, an empty query will omit the headers field\n # see PLFM-3014\n return RowSet(\n headers=self.headers,\n tableId=self.tableId,\n etag=self.etag,\n rows=[row for row in self],\n )\n\n def __iter__(self):\n return self\n\n def next(self):\n \"\"\"\n Python 2 iterator\n \"\"\"\n self.i += 1\n if self.i >= len(self.rowset[\"rows\"]):\n if self.nextPageToken:\n result = self.syn._queryTableNext(self.nextPageToken, self.tableId)\n self.rowset = RowSet.from_json(result[\"queryResults\"])\n self.nextPageToken = result.get(\"nextPageToken\", None)\n self.i = 0\n else:\n raise StopIteration()\n return self.rowset[\"rows\"][self.i]\n\n def __next__(self):\n \"\"\"\n Python 3 iterator\n \"\"\"\n return self.next()\n\n def __len__(self):\n return len(self.rowset[\"rows\"])\n\n def iter_row_metadata(self):\n \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will\n be generated as (row_id, row_version, None)\n\n :return: a generator that gives :py:class::`collections.namedtuple` with format (row_id, row_version, row_etag)\n \"\"\"\n for row in self:\n yield type(self).RowMetadataTuple(\n int(row[\"rowId\"]), int(row[\"versionNumber\"]), row.get(\"etag\")\n )\n
"},{"location":"reference/tables/#synapseclient.table.TableQueryResult-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.TableQueryResult.asDataFrame","title":"asDataFrame(rowIdAndVersionInIndex=True)
","text":"Convert query result to a Pandas DataFrame.
:param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version (and row_etag if it exists)
Source code insynapseclient/table.py
def asDataFrame(self, rowIdAndVersionInIndex=True):\n \"\"\"\n Convert query result to a Pandas DataFrame.\n\n :param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version (and row_etag\n if it exists)\n\n \"\"\"\n test_import_pandas()\n import pandas as pd\n\n # To turn a TableQueryResult into a data frame, we add a page of rows\n # at a time on the untested theory that it's more efficient than\n # adding a single row at a time to the data frame.\n\n def construct_rownames(rowset, offset=0):\n try:\n return (\n row_labels_from_rows(rowset[\"rows\"])\n if rowIdAndVersionInIndex\n else None\n )\n except KeyError:\n # if we don't have row id and version, just number the rows\n # python3 cast range to list for safety\n return list(range(offset, offset + len(rowset[\"rows\"])))\n\n # first page of rows\n offset = 0\n rownames = construct_rownames(self.rowset, offset)\n offset += len(self.rowset[\"rows\"])\n series = collections.OrderedDict()\n\n if not rowIdAndVersionInIndex:\n # Since we use an OrderedDict this must happen before we construct the other columns\n # add row id, version, and etag as columns\n append_etag = False # only useful when (not rowIdAndVersionInIndex), hooray for lazy variables!\n series[\"ROW_ID\"] = pd.Series(\n name=\"ROW_ID\", data=[row[\"rowId\"] for row in self.rowset[\"rows\"]]\n )\n series[\"ROW_VERSION\"] = pd.Series(\n name=\"ROW_VERSION\",\n data=[row[\"versionNumber\"] for row in self.rowset[\"rows\"]],\n )\n\n row_etag = [row.get(\"etag\") for row in self.rowset[\"rows\"]]\n if any(row_etag):\n append_etag = True\n series[\"ROW_ETAG\"] = pd.Series(name=\"ROW_ETAG\", data=row_etag)\n\n for i, header in enumerate(self.rowset[\"headers\"]):\n column_name = header.name\n series[column_name] = pd.Series(\n name=column_name,\n data=[row[\"values\"][i] for row in self.rowset[\"rows\"]],\n index=rownames,\n )\n\n # subsequent pages of rows\n while self.nextPageToken:\n result = self.syn._queryTableNext(self.nextPageToken, self.tableId)\n self.rowset = RowSet.from_json(result[\"queryResults\"])\n self.nextPageToken = result.get(\"nextPageToken\", None)\n self.i = 0\n\n rownames = construct_rownames(self.rowset, offset)\n offset += len(self.rowset[\"rows\"])\n\n if not rowIdAndVersionInIndex:\n # append this page's row metadata; the result of pd.concat must be\n # assigned back (Series.append returned a new object, and its\n # result was previously discarded)\n series[\"ROW_ID\"] = pd.concat(\n [\n series[\"ROW_ID\"],\n pd.Series(\n name=\"ROW_ID\",\n data=[row[\"rowId\"] for row in self.rowset[\"rows\"]],\n ),\n ],\n ignore_index=True,\n )\n series[\"ROW_VERSION\"] = pd.concat(\n [\n series[\"ROW_VERSION\"],\n pd.Series(\n name=\"ROW_VERSION\",\n data=[row[\"versionNumber\"] for row in self.rowset[\"rows\"]],\n ),\n ],\n ignore_index=True,\n )\n if append_etag:\n series[\"ROW_ETAG\"] = pd.concat(\n [\n series[\"ROW_ETAG\"],\n pd.Series(\n name=\"ROW_ETAG\",\n data=[row.get(\"etag\") for row in self.rowset[\"rows\"]],\n ),\n ],\n ignore_index=True,\n )\n\n for i, header in enumerate(self.rowset[\"headers\"]):\n column_name = header.name\n series[column_name] = pd.concat(\n [\n series[column_name],\n pd.Series(\n name=column_name,\n data=[row[\"values\"][i] for row in self.rowset[\"rows\"]],\n index=rownames,\n ),\n ],\n # can't verify integrity when indices are just numbers instead of 'rowid_rowversion'\n verify_integrity=rowIdAndVersionInIndex,\n )\n\n return pd.DataFrame(data=series)\n
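For example (hypothetical table ID; assumes a logged-in client syn)::
results = syn.tableQuery(\"select * from syn123\")\ndf = results.asDataFrame(rowIdAndVersionInIndex=False)\n# ROW_ID and ROW_VERSION (and ROW_ETAG, for views) appear as ordinary columns\n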
"},{"location":"reference/tables/#synapseclient.table.TableQueryResult.next","title":"next()
","text":"Python 2 iterator
Source code insynapseclient/table.py
def next(self):\n \"\"\"\n Python 2 iterator\n \"\"\"\n self.i += 1\n if self.i >= len(self.rowset[\"rows\"]):\n if self.nextPageToken:\n result = self.syn._queryTableNext(self.nextPageToken, self.tableId)\n self.rowset = RowSet.from_json(result[\"queryResults\"])\n self.nextPageToken = result.get(\"nextPageToken\", None)\n self.i = 0\n else:\n raise StopIteration()\n return self.rowset[\"rows\"][self.i]\n
"},{"location":"reference/tables/#synapseclient.table.TableQueryResult.iter_row_metadata","title":"iter_row_metadata()
","text":"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will generated as (row_id, row_version,None)
:return: a generator that gives :py:class::collections.namedtuple
with format (row_id, row_version, row_etag)
synapseclient/table.py
def iter_row_metadata(self):\n \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will\n be generated as (row_id, row_version, None)\n\n :return: a generator that gives :py:class::`collections.namedtuple` with format (row_id, row_version, row_etag)\n \"\"\"\n for row in self:\n yield type(self).RowMetadataTuple(\n int(row[\"rowId\"]), int(row[\"versionNumber\"]), row.get(\"etag\")\n )\n
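For example (hypothetical table ID; assumes a logged-in client syn)::
results = syn.tableQuery(\"select * from syn123\")\nfor row_id, row_version, row_etag in results.iter_row_metadata():\n print(row_id, row_version, row_etag)\n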
"},{"location":"reference/tables/#synapseclient.table.CsvFileTable","title":"CsvFileTable
","text":" Bases: TableAbstractBaseClass
An object to wrap a CSV file that may be stored into a Synapse table or returned as a result of a table query.
Source code insynapseclient/table.py
class CsvFileTable(TableAbstractBaseClass):\n \"\"\"\n An object to wrap a CSV file that may be stored into a Synapse table or\n returned as a result of a table query.\n \"\"\"\n\n @classmethod\n def from_table_query(\n cls,\n synapse,\n query,\n quoteCharacter='\"',\n escapeCharacter=\"\\\\\",\n lineEnd=str(os.linesep),\n separator=\",\",\n header=True,\n includeRowIdAndRowVersion=True,\n downloadLocation=None,\n ):\n \"\"\"\n Create a Table object wrapping a CSV file resulting from querying a Synapse table.\n Mostly for internal use.\n \"\"\"\n\n download_from_table_result, path = synapse._queryTableCsv(\n query=query,\n quoteCharacter=quoteCharacter,\n escapeCharacter=escapeCharacter,\n lineEnd=lineEnd,\n separator=separator,\n header=header,\n includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n downloadLocation=downloadLocation,\n )\n\n # A dirty hack to find out if we got back row ID and Version\n # in particular, we don't get these back from aggregate queries\n with io.open(path, \"r\", encoding=\"utf-8\") as f:\n reader = csv.reader(\n f,\n delimiter=separator,\n escapechar=escapeCharacter,\n lineterminator=lineEnd,\n quotechar=quoteCharacter,\n )\n first_line = next(reader)\n if len(download_from_table_result[\"headers\"]) + 2 == len(first_line):\n includeRowIdAndRowVersion = True\n else:\n includeRowIdAndRowVersion = False\n\n self = cls(\n filepath=path,\n schema=download_from_table_result.get(\"tableId\", None),\n etag=download_from_table_result.get(\"etag\", None),\n quoteCharacter=quoteCharacter,\n escapeCharacter=escapeCharacter,\n lineEnd=lineEnd,\n separator=separator,\n header=header,\n includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n headers=[\n SelectColumn(**header)\n for header in download_from_table_result[\"headers\"]\n ],\n )\n\n return self\n\n @classmethod\n def from_data_frame(\n cls,\n schema,\n df,\n filepath=None,\n etag=None,\n quoteCharacter='\"',\n escapeCharacter=\"\\\\\",\n lineEnd=str(os.linesep),\n separator=\",\",\n header=True,\n includeRowIdAndRowVersion=None,\n headers=None,\n **kwargs,\n ):\n # infer columns from data frame if not specified\n if not headers:\n cols = as_table_columns(df)\n headers = [SelectColumn.from_column(col) for col in cols]\n\n # if the schema has no columns, use the inferred columns\n if isinstance(schema, Schema) and not schema.has_columns():\n schema.addColumns(cols)\n\n # convert row names in the format [row_id]_[version] or [row_id]_[version]_[etag] back to columns\n # etag is essentially a UUID\n etag_pattern = r\"[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[1-5][0-9a-fA-F]{3}-[89abAB][0-9a-fA-F]{3}-[0-9a-fA-F]{12}\"\n row_id_version_pattern = re.compile(r\"(\\d+)_(\\d+)(_(\" + etag_pattern + r\"))?\")\n\n row_id = []\n row_version = []\n row_etag = []\n for row_name in df.index.values:\n m = row_id_version_pattern.match(str(row_name))\n row_id.append(m.group(1) if m else None)\n row_version.append(m.group(2) if m else None)\n row_etag.append(m.group(4) if m else None)\n\n # include row ID and version, if we're asked to OR if it's encoded in row names\n if includeRowIdAndRowVersion or (\n includeRowIdAndRowVersion is None and any(row_id)\n ):\n df2 = df.copy()\n\n cls._insert_dataframe_column_if_not_exist(df2, 0, \"ROW_ID\", row_id)\n cls._insert_dataframe_column_if_not_exist(\n df2, 1, \"ROW_VERSION\", row_version\n )\n if any(row_etag):\n cls._insert_dataframe_column_if_not_exist(df2, 2, \"ROW_ETAG\", row_etag)\n\n df = df2\n includeRowIdAndRowVersion = True\n\n f = None\n try:\n if not filepath:\n temp_dir = 
tempfile.mkdtemp()\n filepath = os.path.join(temp_dir, \"table.csv\")\n\n f = io.open(filepath, mode=\"w\", encoding=\"utf-8\", newline=\"\")\n\n test_import_pandas()\n import pandas as pd\n\n if isinstance(schema, Schema):\n for col in schema.columns_to_store:\n if col[\"columnType\"] == \"DATE\":\n\n def _trailing_date_time_millisecond(t):\n if isinstance(t, str):\n return t[:-3]\n\n df[col.name] = pd.to_datetime(\n df[col.name], errors=\"coerce\"\n ).dt.strftime(\"%s%f\")\n df[col.name] = df[col.name].apply(\n lambda x: _trailing_date_time_millisecond(x)\n )\n\n df.to_csv(\n f,\n index=False,\n sep=separator,\n header=header,\n quotechar=quoteCharacter,\n escapechar=escapeCharacter,\n lineterminator=lineEnd,\n na_rep=kwargs.get(\"na_rep\", \"\"),\n float_format=\"%.12g\",\n )\n # NOTE: reason for float_format='%.12g':\n # pandas automatically converts int columns into float64 columns when some cells in the column have no\n # value. If we write the whole number back as a decimal (e.g. '3.0'), Synapse complains that we are writing\n # a float into an INTEGER (Synapse table type) column. Using the 'g' will strip off '.0' from whole number\n # values. pandas by default (with no float_format parameter) seems to keep 12 digits after the decimal, so we\n # use '%.12g'.\n # see SYNPY-267.\n finally:\n if f:\n f.close()\n\n return cls(\n schema=schema,\n filepath=filepath,\n etag=etag,\n quoteCharacter=quoteCharacter,\n escapeCharacter=escapeCharacter,\n lineEnd=lineEnd,\n separator=separator,\n header=header,\n includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n headers=headers,\n )\n\n @staticmethod\n def _insert_dataframe_column_if_not_exist(\n dataframe, insert_index, col_name, insert_column_data\n ):\n # if the column already exists, verify the column data is the same as what we parsed\n if col_name in dataframe.columns:\n if dataframe[col_name].tolist() != insert_column_data:\n raise SynapseError(\n (\n \"A column named '{0}' already exists and does not match the '{0}' values present in\"\n \" the DataFrame's row names. Please refrain from using or modifying '{0}' as a\"\n \" column for your data because it is necessary for version tracking in Synapse's\"\n \" tables\"\n ).format(col_name)\n )\n else:\n dataframe.insert(insert_index, col_name, insert_column_data)\n
@classmethod\n def from_list_of_rows(\n cls,\n schema,\n values,\n filepath=None,\n etag=None,\n quoteCharacter='\"',\n escapeCharacter=\"\\\\\",\n lineEnd=str(os.linesep),\n separator=\",\",\n linesToSkip=0,\n includeRowIdAndRowVersion=None,\n headers=None,\n ):\n # create CSV file\n f = None\n try:\n if not filepath:\n temp_dir = tempfile.mkdtemp()\n filepath = os.path.join(temp_dir, \"table.csv\")\n\n f = io.open(filepath, \"w\", encoding=\"utf-8\", newline=\"\")\n\n writer = csv.writer(\n f,\n quoting=csv.QUOTE_NONNUMERIC,\n delimiter=separator,\n escapechar=escapeCharacter,\n lineterminator=lineEnd,\n quotechar=quoteCharacter,\n skipinitialspace=linesToSkip,\n )\n\n # if we haven't explicitly set columns, try to grab them from\n # the schema object\n if (\n not headers\n and \"columns_to_store\" in schema\n and schema.columns_to_store is not None\n ):\n headers = [\n SelectColumn.from_column(col) for col in schema.columns_to_store\n ]\n\n # write headers?\n if headers:\n writer.writerow([header.name for header in headers])\n header = True\n else:\n header = False\n\n # write row data\n for row in values:\n writer.writerow(row)\n\n finally:\n if f:\n f.close()\n\n return cls(\n schema=schema,\n filepath=filepath,\n etag=etag,\n quoteCharacter=quoteCharacter,\n escapeCharacter=escapeCharacter,\n lineEnd=lineEnd,\n separator=separator,\n header=header,\n headers=headers,\n includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n )\n\n def __init__(\n self,\n schema,\n filepath,\n etag=None,\n quoteCharacter=DEFAULT_QUOTE_CHARACTER,\n escapeCharacter=DEFAULT_ESCAPSE_CHAR,\n lineEnd=str(os.linesep),\n separator=DEFAULT_SEPARATOR,\n header=True,\n linesToSkip=0,\n includeRowIdAndRowVersion=None,\n headers=None,\n ):\n self.filepath = filepath\n\n self.includeRowIdAndRowVersion = includeRowIdAndRowVersion\n\n # CsvTableDescriptor fields\n self.linesToSkip = linesToSkip\n self.quoteCharacter = quoteCharacter\n self.escapeCharacter = escapeCharacter\n self.lineEnd = lineEnd\n self.separator = separator\n self.header = header\n\n super(CsvFileTable, self).__init__(schema, headers=headers, etag=etag)\n\n self.setColumnHeaders(headers)\n\n def _synapse_store(self, syn):\n copied_self = copy.copy(self)\n return copied_self._update_self(syn)\n\n def _update_self(self, syn):\n if isinstance(self.schema, Schema) and self.schema.get(\"id\", None) is None:\n # store schema\n self.schema = syn.store(self.schema)\n self.tableId = self.schema.id\n\n result = syn._uploadCsv(\n self.filepath,\n self.schema if self.schema else self.tableId,\n updateEtag=self.etag,\n quoteCharacter=self.quoteCharacter,\n escapeCharacter=self.escapeCharacter,\n lineEnd=self.lineEnd,\n separator=self.separator,\n header=self.header,\n linesToSkip=self.linesToSkip,\n )\n\n upload_to_table_result = result[\"results\"][0]\n\n assert upload_to_table_result[\"concreteType\"] in (\n \"org.sagebionetworks.repo.model.table.EntityUpdateResults\",\n \"org.sagebionetworks.repo.model.table.UploadToTableResult\",\n ), \"Not an UploadToTableResult or EntityUpdateResults.\"\n if \"etag\" in upload_to_table_result:\n self.etag = upload_to_table_result[\"etag\"]\n return self\n\n def asDataFrame(self, rowIdAndVersionInIndex=True, convert_to_datetime=False):\n \"\"\"Convert query result to a Pandas DataFrame.\n\n
:param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version\n (and row_etag if it exists)\n :param convert_to_datetime: If set to True, will convert all Synapse DATE columns from UNIX timestamp\n integers into UTC datetime objects\n\n :return: Pandas dataframe with results\n \"\"\"\n test_import_pandas()\n import pandas as pd\n\n try:\n # Handle bug in pandas 0.19 requiring quotechar to be str not unicode or newstr\n quoteChar = self.quoteCharacter\n\n # determine which columns are DATE columns so we can convert millisecond timestamps into datetime objects\n date_columns = []\n list_columns = []\n dtype = {}\n\n if self.headers is not None:\n for select_column in self.headers:\n if select_column.columnType == \"STRING\":\n # we want to identify string columns so that pandas doesn't try to\n # automatically parse strings in a string column to other data types\n dtype[select_column.name] = str\n elif select_column.columnType in LIST_COLUMN_TYPES:\n list_columns.append(select_column.name)\n elif select_column.columnType == \"DATE\" and convert_to_datetime:\n date_columns.append(select_column.name)\n\n return _csv_to_pandas_df(\n self.filepath,\n separator=self.separator,\n quote_char=quoteChar,\n escape_char=self.escapeCharacter,\n contain_headers=self.header,\n lines_to_skip=self.linesToSkip,\n date_columns=date_columns,\n list_columns=list_columns,\n rowIdAndVersionInIndex=rowIdAndVersionInIndex,\n dtype=dtype,\n )\n except pd.parser.CParserError:\n return pd.DataFrame()\n\n
def asRowSet(self):\n # Extract row id and version, if present in rows\n row_id_col = None\n row_ver_col = None\n for i, header in enumerate(self.headers):\n if header.name == \"ROW_ID\":\n row_id_col = i\n elif header.name == \"ROW_VERSION\":\n row_ver_col = i\n\n def to_row_object(row, row_id_col=None, row_ver_col=None):\n if isinstance(row, Row):\n return row\n rowId = row[row_id_col] if row_id_col is not None else None\n versionNumber = row[row_ver_col] if row_ver_col is not None else None\n values = [\n elem for i, elem in enumerate(row) if i not in [row_id_col, row_ver_col]\n ]\n return Row(values, rowId=rowId, versionNumber=versionNumber)\n\n return RowSet(\n headers=[\n elem\n for i, elem in enumerate(self.headers)\n if i not in [row_id_col, row_ver_col]\n ],\n tableId=self.tableId,\n etag=self.etag,\n rows=[to_row_object(row, row_id_col, row_ver_col) for row in self],\n )\n\n def setColumnHeaders(self, headers):\n \"\"\"\n Set the list of :py:class:`synapseclient.table.SelectColumn` objects that will be used to convert fields to the\n appropriate data types.\n\n Column headers are automatically set when querying.\n \"\"\"\n if self.includeRowIdAndRowVersion:\n names = [header.name for header in headers]\n if \"ROW_ID\" not in names and \"ROW_VERSION\" not in names:\n headers = [\n SelectColumn(name=\"ROW_ID\", columnType=\"STRING\"),\n SelectColumn(name=\"ROW_VERSION\", columnType=\"STRING\"),\n ] + headers\n self.headers = headers\n\n def __iter__(self):\n def iterate_rows(filepath, headers):\n if not self.header or not self.headers:\n raise ValueError(\"Iteration not supported for table without headers.\")\n\n header_name = {header.name for header in headers}\n row_metadata_headers = {\"ROW_ID\", \"ROW_VERSION\", \"ROW_ETAG\"}\n num_row_metadata_in_headers = len(header_name & row_metadata_headers)\n with io.open(filepath, encoding=\"utf-8\", newline=self.lineEnd) as f:\n reader = csv.reader(\n f,\n delimiter=self.separator,\n escapechar=self.escapeCharacter,\n lineterminator=self.lineEnd,\n quotechar=self.quoteCharacter,\n )\n csv_header = set(next(reader))\n # the number of row metadata differences between the csv headers and self.headers\n num_metadata_cols_diff = (\n len(csv_header & row_metadata_headers) - num_row_metadata_in_headers\n )\n # we only process 2 cases:\n # 1. matching row metadata\n # 2. if metadata does not match, self.headers must not contain row metadata\n if num_metadata_cols_diff == 0 or num_row_metadata_in_headers == 0:\n for row in reader:\n yield cast_values(row[num_metadata_cols_diff:], headers)\n else:\n raise ValueError(\n \"There is mismatching row metadata in the csv file and in headers.\"\n )\n\n return iterate_rows(self.filepath, self.headers)\n\n def __len__(self):\n with io.open(self.filepath, encoding=\"utf-8\", newline=self.lineEnd) as f:\n if self.header: # ignore the header line\n f.readline()\n\n return sum(1 for line in f)\n\n def iter_row_metadata(self):\n \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row,\n it will be generated as (row_id, None)\n\n :return: a generator that gives :py:class:`collections.namedtuple` with format (row_id, row_etag)\n \"\"\"\n with io.open(self.filepath, encoding=\"utf-8\", newline=self.lineEnd) as f:\n reader = csv.reader(\n f,\n delimiter=self.separator,\n escapechar=self.escapeCharacter,\n lineterminator=self.lineEnd,\n quotechar=self.quoteCharacter,\n )\n header = next(reader)\n\n # The ROW_... headers are always in a predefined order\n row_id_index = header.index(\"ROW_ID\")\n row_version_index = header.index(\"ROW_VERSION\")\n try:\n row_etag_index = header.index(\"ROW_ETAG\")\n except ValueError:\n row_etag_index = None\n\n for row in reader:\n yield type(self).RowMetadataTuple(\n int(row[row_id_index]),\n int(row[row_version_index]),\n row[row_etag_index] if (row_etag_index is not None) else None,\n )\n
"},{"location":"reference/tables/#synapseclient.table.CsvFileTable-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.CsvFileTable.from_table_query","title":"from_table_query(synapse, query, quoteCharacter='\"', escapeCharacter='\\\\', lineEnd=str(os.linesep), separator=',', header=True, includeRowIdAndRowVersion=True, downloadLocation=None)
classmethod
","text":"Create a Table object wrapping a CSV file resulting from querying a Synapse table. Mostly for internal use.
Source code in synapseclient/table.py
@classmethod\ndef from_table_query(\n cls,\n synapse,\n query,\n quoteCharacter='\"',\n escapeCharacter=\"\\\\\",\n lineEnd=str(os.linesep),\n separator=\",\",\n header=True,\n includeRowIdAndRowVersion=True,\n downloadLocation=None,\n):\n \"\"\"\n Create a Table object wrapping a CSV file resulting from querying a Synapse table.\n Mostly for internal use.\n \"\"\"\n\n download_from_table_result, path = synapse._queryTableCsv(\n query=query,\n quoteCharacter=quoteCharacter,\n escapeCharacter=escapeCharacter,\n lineEnd=lineEnd,\n separator=separator,\n header=header,\n includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n downloadLocation=downloadLocation,\n )\n\n # A dirty hack to find out if we got back row ID and Version\n # in particular, we don't get these back from aggregate queries\n with io.open(path, \"r\", encoding=\"utf-8\") as f:\n reader = csv.reader(\n f,\n delimiter=separator,\n escapechar=escapeCharacter,\n lineterminator=lineEnd,\n quotechar=quoteCharacter,\n )\n first_line = next(reader)\n if len(download_from_table_result[\"headers\"]) + 2 == len(first_line):\n includeRowIdAndRowVersion = True\n else:\n includeRowIdAndRowVersion = False\n\n self = cls(\n filepath=path,\n schema=download_from_table_result.get(\"tableId\", None),\n etag=download_from_table_result.get(\"etag\", None),\n quoteCharacter=quoteCharacter,\n escapeCharacter=escapeCharacter,\n lineEnd=lineEnd,\n separator=separator,\n header=header,\n includeRowIdAndRowVersion=includeRowIdAndRowVersion,\n headers=[\n SelectColumn(**header)\n for header in download_from_table_result[\"headers\"]\n ],\n )\n\n return self\n
"},{"location":"reference/tables/#synapseclient.table.CsvFileTable.asDataFrame","title":"asDataFrame(rowIdAndVersionInIndex=True, convert_to_datetime=False)
","text":"Convert query result to a Pandas DataFrame.
:param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version (and row_etag if it exists) :param convert_to_datetime: If set to True, will convert all Synapse DATE columns from UNIX timestamp integers into UTC datetime objects
:return: Pandas dataframe with results
Source code in synapseclient/table.py
def asDataFrame(self, rowIdAndVersionInIndex=True, convert_to_datetime=False):\n \"\"\"Convert query result to a Pandas DataFrame.\n\n :param rowIdAndVersionInIndex: Make the dataframe index consist of the row_id and row_version\n (and row_etag if it exists)\n :param convert_to_datetime: If set to True, will convert all Synapse DATE columns from UNIX timestamp\n integers into UTC datetime objects\n\n :return: Pandas dataframe with results\n \"\"\"\n test_import_pandas()\n import pandas as pd\n\n try:\n # Handle bug in pandas 0.19 requiring quotechar to be str not unicode or newstr\n quoteChar = self.quoteCharacter\n\n # determine which columns are DATE columns so we can convert millisecond timestamps into datetime objects\n date_columns = []\n list_columns = []\n dtype = {}\n\n if self.headers is not None:\n for select_column in self.headers:\n if select_column.columnType == \"STRING\":\n # we want to identify string columns so that pandas doesn't try to\n # automatically parse strings in a string column to other data types\n dtype[select_column.name] = str\n elif select_column.columnType in LIST_COLUMN_TYPES:\n list_columns.append(select_column.name)\n elif select_column.columnType == \"DATE\" and convert_to_datetime:\n date_columns.append(select_column.name)\n\n return _csv_to_pandas_df(\n self.filepath,\n separator=self.separator,\n quote_char=quoteChar,\n escape_char=self.escapeCharacter,\n contain_headers=self.header,\n lines_to_skip=self.linesToSkip,\n date_columns=date_columns,\n list_columns=list_columns,\n rowIdAndVersionInIndex=rowIdAndVersionInIndex,\n dtype=dtype,\n )\n except pd.parser.CParserError:\n return pd.DataFrame()\n
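A minimal usage sketch (assuming syn is a logged-in client and syn123 is a hypothetical table ID; syn.tableQuery returns a CsvFileTable by default)::
results = syn.tableQuery(\"select * from syn123\")\ndf = results.asDataFrame(convert_to_datetime=True)\n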
"},{"location":"reference/tables/#synapseclient.table.CsvFileTable.setColumnHeaders","title":"setColumnHeaders(headers)
","text":"Set the list of :py:class:synapseclient.table.SelectColumn
objects that will be used to convert fields to the appropriate data types.
Column headers are automatically set when querying.
Source code in synapseclient/table.py
def setColumnHeaders(self, headers):\n \"\"\"\n Set the list of :py:class:`synapseclient.table.SelectColumn` objects that will be used to convert fields to the\n appropriate data types.\n\n Column headers are automatically set when querying.\n \"\"\"\n if self.includeRowIdAndRowVersion:\n names = [header.name for header in headers]\n if \"ROW_ID\" not in names and \"ROW_VERSION\" not in names:\n headers = [\n SelectColumn(name=\"ROW_ID\", columnType=\"STRING\"),\n SelectColumn(name=\"ROW_VERSION\", columnType=\"STRING\"),\n ] + headers\n self.headers = headers\n
"},{"location":"reference/tables/#synapseclient.table.CsvFileTable.iter_row_metadata","title":"iter_row_metadata()
","text":"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row, it will generated as (row_id, None)
:return: a generator that gives :py:class::collections.namedtuple
with format (row_id, row_etag)
Source code in synapseclient/table.py
def iter_row_metadata(self):\n \"\"\"Iterates the table results to get row_id and row_etag. If an etag does not exist for a row,\n it will be generated as (row_id, None)\n\n :return: a generator that gives :py:class:`collections.namedtuple` with format (row_id, row_etag)\n \"\"\"\n with io.open(self.filepath, encoding=\"utf-8\", newline=self.lineEnd) as f:\n reader = csv.reader(\n f,\n delimiter=self.separator,\n escapechar=self.escapeCharacter,\n lineterminator=self.lineEnd,\n quotechar=self.quoteCharacter,\n )\n header = next(reader)\n\n # The ROW_... headers are always in a predefined order\n row_id_index = header.index(\"ROW_ID\")\n row_version_index = header.index(\"ROW_VERSION\")\n try:\n row_etag_index = header.index(\"ROW_ETAG\")\n except ValueError:\n row_etag_index = None\n\n for row in reader:\n yield type(self).RowMetadataTuple(\n int(row[row_id_index]),\n int(row[row_version_index]),\n row[row_etag_index] if (row_etag_index is not None) else None,\n )\n
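A brief usage sketch (assuming table is a CsvFileTable returned by a query; the namedtuple fields are row_id, row_version and row_etag)::
for metadata in table.iter_row_metadata():\n print(metadata.row_id, metadata.row_etag)\n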
"},{"location":"reference/tables/#synapseclient.table-functions","title":"Functions","text":""},{"location":"reference/tables/#synapseclient.table.as_table_columns","title":"as_table_columns(values)
","text":"Return a list of Synapse table :py:class:Column
objects that correspond to the columns in the given values.
:param values: an object that holds the content of the tables - a string holding the path to a CSV file, a filehandle, or StringIO containing valid csv content - a Pandas DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>
_
:returns: A list of Synapse table :py:class:Column
objects
Example::
import pandas as pd\n\ndf = pd.DataFrame(dict(a=[1, 2, 3], b=[\"c\", \"d\", \"e\"]))\ncols = as_table_columns(df)\n
Source code in synapseclient/table.py
def as_table_columns(values):\n \"\"\"\n Return a list of Synapse table :py:class:`Column` objects that correspond to the columns in the given values.\n\n :param values: an object that holds the content of the tables\n - a string holding the path to a CSV file, a filehandle, or StringIO containing valid csv content\n - a Pandas `DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>`_\n\n :returns: A list of Synapse table :py:class:`Column` objects\n\n Example::\n\n import pandas as pd\n\n df = pd.DataFrame(dict(a=[1, 2, 3], b=[\"c\", \"d\", \"e\"]))\n cols = as_table_columns(df)\n \"\"\"\n test_import_pandas()\n import pandas as pd\n from pandas.api.types import infer_dtype\n\n df = None\n\n # pandas DataFrame\n if isinstance(values, pd.DataFrame):\n df = values\n # filename of a csv file\n # in Python 3, we could check whether values is an instance of io.IOBase;\n # for now, check if values has attr `read`\n elif isinstance(values, str) or hasattr(values, \"read\"):\n df = _csv_to_pandas_df(values)\n\n if df is None:\n raise ValueError(\"Values of type %s is not yet supported.\" % type(values))\n\n cols = list()\n for col in df:\n inferred_type = infer_dtype(df[col], skipna=True)\n columnType = PANDAS_TABLE_TYPE.get(inferred_type, \"STRING\")\n if columnType == \"STRING\":\n maxStrLen = df[col].str.len().max() # determine the length of the longest string\n if maxStrLen > 1000:\n cols.append(Column(name=col, columnType=\"LARGETEXT\", defaultValue=\"\"))\n else:\n size = int(\n round(min(1000, max(30, maxStrLen * 1.5)))\n )\n cols.append(\n Column(\n name=col,\n columnType=columnType,\n maximumSize=size,\n defaultValue=\"\",\n )\n )\n else:\n cols.append(Column(name=col, columnType=columnType))\n return cols\n
"},{"location":"reference/tables/#synapseclient.table.df2Table","title":"df2Table(df, syn, tableName, parentProject)
","text":"Creates a new table from data in pandas data frame. parameters: df, tableName, parentProject
Source code in synapseclient/table.py
def df2Table(df, syn, tableName, parentProject):\n \"\"\"Creates a new table from data in a pandas data frame.\n Parameters: df, tableName, parentProject\n \"\"\"\n\n # Create columns:\n cols = as_table_columns(df)\n cols = [syn.store(col) for col in cols]\n\n # Create Table Schema\n schema1 = Schema(name=tableName, columns=cols, parent=parentProject)\n schema1 = syn.store(schema1)\n\n # Add data to Table in batches of 1200 rows\n for i in range(0, df.shape[0] // 1200 + 1):\n start = i * 1200\n end = min((i + 1) * 1200, df.shape[0])\n rowset1 = RowSet(\n columns=cols,\n schema=schema1,\n rows=[Row(list(df.iloc[j, :])) for j in range(start, end)],\n )\n syn.store(rowset1)\n\n return schema1\n
"},{"location":"reference/tables/#synapseclient.table.to_boolean","title":"to_boolean(value)
","text":"Convert a string to boolean, case insensitively, where true values are: true, t, and 1 and false values are: false, f, 0. Raise a ValueError for all other values.
Source code in synapseclient/table.py
def to_boolean(value):\n \"\"\"\n Convert a string to boolean, case insensitively,\n where true values are: true, t, and 1 and false values are: false, f, 0.\n Raise a ValueError for all other values.\n \"\"\"\n if isinstance(value, bool):\n return value\n\n if isinstance(value, str):\n lower_value = value.lower()\n if lower_value in [\"true\", \"t\", \"1\"]:\n return True\n if lower_value in [\"false\", \"f\", \"0\"]:\n return False\n\n raise ValueError(\"Can't convert %s to boolean.\" % value)\n
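For example::
to_boolean(\"True\") # True\nto_boolean(\"f\") # False\nto_boolean(\"yes\") # raises ValueError\n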
"},{"location":"reference/tables/#synapseclient.table.cast_values","title":"cast_values(values, headers)
","text":"Convert a row of table query results from strings to the correct column type.
See: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/ColumnType.html
Source code in synapseclient/table.py
def cast_values(values, headers):\n \"\"\"\n Convert a row of table query results from strings to the correct column type.\n\n See: https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/table/ColumnType.html\n \"\"\"\n if len(values) != len(headers):\n raise ValueError(\n \"The number of columns in the csv file does not match the given headers. %d fields, %d headers\"\n % (len(values), len(headers))\n )\n\n result = []\n for header, field in zip(headers, values):\n columnType = header.get(\"columnType\", \"STRING\")\n\n # convert field to column type\n if field is None or field == \"\":\n result.append(None)\n elif columnType in {\n \"STRING\",\n \"ENTITYID\",\n \"FILEHANDLEID\",\n \"LARGETEXT\",\n \"USERID\",\n \"LINK\",\n }:\n result.append(field)\n elif columnType == \"DOUBLE\":\n result.append(float(field))\n elif columnType == \"INTEGER\":\n result.append(int(field))\n elif columnType == \"BOOLEAN\":\n result.append(to_boolean(field))\n elif columnType == \"DATE\":\n result.append(from_unix_epoch_time(field))\n elif columnType in {\n \"STRING_LIST\",\n \"INTEGER_LIST\",\n \"BOOLEAN_LIST\",\n \"ENTITYID_LIST\",\n \"USERID_LIST\",\n }:\n result.append(json.loads(field))\n elif columnType == \"DATE_LIST\":\n result.append(json.loads(field, parse_int=from_unix_epoch_time))\n else:\n # default to string for unknown column type\n result.append(field)\n\n return result\n
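A minimal sketch of the conversion (plain dicts stand in for the SelectColumn headers here)::
headers = [{\"name\": \"n\", \"columnType\": \"INTEGER\"}, {\"name\": \"flag\", \"columnType\": \"BOOLEAN\"}]\ncast_values([\"3\", \"true\"], headers) # returns [3, True]\n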
"},{"location":"reference/tables/#synapseclient.table.escape_column_name","title":"escape_column_name(column)
","text":"Escape the name of the given column for use in a Synapse table query statement :param column: a string or column dictionary object with a 'name' key
Source code in synapseclient/table.py
def escape_column_name(column):\n \"\"\"Escape the name of the given column for use in a Synapse table query statement\n :param column: a string or column dictionary object with a 'name' key\"\"\"\n col_name = (\n column[\"name\"] if isinstance(column, collections.abc.Mapping) else str(column)\n )\n escaped_name = col_name.replace('\"', '\"\"')\n return f'\"{escaped_name}\"'\n
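For example, embedded double quotes are doubled and the whole name is wrapped in quotes::
escape_column_name('total \"count\"') # returns '\"total \"\"count\"\"\"'\nescape_column_name({\"name\": \"year\"}) # returns '\"year\"'\n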
"},{"location":"reference/tables/#synapseclient.table.join_column_names","title":"join_column_names(columns)
","text":"Join the names of the given columns into a comma delimited list suitable for use in a Synapse table query :param columns: a sequence of column string names or dictionary objets with column 'name' keys
Source code in synapseclient/table.py
def join_column_names(columns):\n \"\"\"Join the names of the given columns into a comma delimited list suitable for use in a Synapse table query\n :param columns: a sequence of column string names or dictionary objects with column 'name' keys\n \"\"\"\n return \",\".join(escape_column_name(c) for c in columns)\n
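For example::
join_column_names([\"foo\", {\"name\": \"bar\"}]) # returns '\"foo\",\"bar\"'\n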
"},{"location":"reference/tables/#synapseclient.table.build_table","title":"build_table(name, parent, values)
","text":"Build a Table object
:param name: the name for the Table Schema object :param parent: the project in Synapse to which this table belongs :param values: an object that holds the content of the tables - a string holding the path to a CSV file - a Pandas DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>
_
:return: a Table object suitable for storing
Example::
path = \"/path/to/file.csv\"\ntable = build_table(\"simple_table\", \"syn123\", path)\ntable = syn.store(table)\n\nimport pandas as pd\n\ndf = pd.DataFrame(dict(a=[1, 2, 3], b=[\"c\", \"d\", \"e\"]))\ntable = build_table(\"simple_table\", \"syn123\", df)\ntable = syn.store(table)\n
Source code in synapseclient/table.py
def build_table(name, parent, values):\n \"\"\"\n Build a Table object\n\n :param name: the name for the Table Schema object\n :param parent: the project in Synapse to which this table belongs\n :param values: an object that holds the content of the tables\n - a string holding the path to a CSV file\n - a Pandas `DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>`_\n\n :return: a Table object suitable for storing\n\n Example::\n\n path = \"/path/to/file.csv\"\n table = build_table(\"simple_table\", \"syn123\", path)\n table = syn.store(table)\n\n import pandas as pd\n\n df = pd.DataFrame(dict(a=[1, 2, 3], b=[\"c\", \"d\", \"e\"]))\n table = build_table(\"simple_table\", \"syn123\", df)\n table = syn.store(table)\n \"\"\"\n test_import_pandas()\n import pandas as pd\n\n if not isinstance(values, pd.DataFrame) and not isinstance(values, str):\n raise ValueError(\"Values of type %s is not yet supported.\" % type(values))\n cols = as_table_columns(values)\n schema = Schema(name=name, columns=cols, parent=parent)\n headers = [SelectColumn.from_column(col) for col in cols]\n return Table(schema, values, headers=headers)\n
"},{"location":"reference/tables/#synapseclient.table.Table","title":"Table(schema, values, **kwargs)
","text":"Combine a table schema and a set of values into some type of Table object depending on what type of values are given.
:param schema: a table :py:class:Schema
object or Synapse Id of Table. :param values: an object that holds the content of the tables - a :py:class:RowSet
- a list of lists (or tuples) where each element is a row - a string holding the path to a CSV file - a Pandas DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>
- a dict which will be wrapped by a Pandas DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>
:return: a Table object suitable for storing
Usually, the immediate next step after creating a Table object is to store it::
table = syn.store(Table(schema, values))\n
End users should not need to know the details of these Table subclasses:
TableAbstractBaseClass
RowSetTable
TableQueryResult
CsvFileTable
Source code in synapseclient/table.py
def Table(schema, values, **kwargs):\n \"\"\"\n Combine a table schema and a set of values into some type of Table object\n depending on what type of values are given.\n\n :param schema: a table :py:class:`Schema` object or Synapse Id of Table.\n :param values: an object that holds the content of the tables\n - a :py:class:`RowSet`\n - a list of lists (or tuples) where each element is a row\n - a string holding the path to a CSV file\n - a Pandas `DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>`_\n - a dict which will be wrapped by a Pandas \\\n `DataFrame <http://pandas.pydata.org/pandas-docs/stable/api.html#dataframe>`_\n\n :return: a Table object suitable for storing\n\n Usually, the immediate next step after creating a Table object is to store it::\n\n table = syn.store(Table(schema, values))\n\n End users should not need to know the details of these Table subclasses:\n\n - :py:class:`TableAbstractBaseClass`\n - :py:class:`RowSetTable`\n - :py:class:`TableQueryResult`\n - :py:class:`CsvFileTable`\n \"\"\"\n\n try:\n import pandas as pd\n\n pandas_available = True\n except: # noqa\n pandas_available = False\n\n # a RowSet\n if isinstance(values, RowSet):\n return RowSetTable(schema, values, **kwargs)\n\n # a list of rows\n elif isinstance(values, (list, tuple)):\n return CsvFileTable.from_list_of_rows(schema, values, **kwargs)\n\n # filename of a csv file\n elif isinstance(values, str):\n return CsvFileTable(schema, filepath=values, **kwargs)\n\n # pandas DataFrame\n elif pandas_available and isinstance(values, pd.DataFrame):\n return CsvFileTable.from_data_frame(schema, values, **kwargs)\n\n # dict\n elif pandas_available and isinstance(values, dict):\n return CsvFileTable.from_data_frame(schema, pd.DataFrame(values), **kwargs)\n\n else:\n raise ValueError(\n \"Don't know how to make tables from values of type %s.\" % type(values)\n )\n
"},{"location":"reference/teams/","title":"Teams","text":""},{"location":"reference/teams/#synapseclient.team","title":"synapseclient.team
","text":"Functions that interact with Synapse Teams
"},{"location":"reference/teams/#synapseclient.team-classes","title":"Classes","text":""},{"location":"reference/teams/#synapseclient.team.UserProfile","title":"UserProfile
","text":" Bases: DictObject
Information about a Synapse user. In practice the constructor is not called directly by the client.
:param ownerId: A foreign key to the ID of the 'principal' object for the user. :param uri: The Uniform Resource Identifier (URI) for this entity. :param etag: Synapse employs an Optimistic Concurrency Control (OCC) scheme to handle concurrent updates. Since the E-Tag changes every time an entity is updated it is used to detect when a client's current representation of an entity is out-of-date. :param firstName: This person's given name (forename) :param lastName: This person's family name (surname) :param emails: The list of user email addresses registered to this user. :param userName: A name chosen by the user that uniquely identifies them. :param summary: A summary description about this person :param position: This person's current position title :param location: This person's location :param industry: The industry/discipline that this person is associated with :param company: This person's current affiliation :param profilePicureFileHandleId: The File Handle ID of the user's profile picture. :param url: A link to more information about this person :param notificationSettings: An object of type :py:class:org.sagebionetworks.repo.model.message.Settings
containing the user's preferences regarding when email notifications should be sent
Source code in synapseclient/team.py
class UserProfile(DictObject):\n \"\"\"\n Information about a Synapse user. In practice the constructor is not called directly by the client.\n\n :param ownerId: A foreign key to the ID of the 'principal' object for the user.\n :param uri: The Uniform Resource Identifier (URI) for this entity.\n :param etag: Synapse employs an Optimistic Concurrency Control (OCC) scheme to handle concurrent updates.\n Since the E-Tag changes every time an entity is updated it is used to detect when a client's current representation\n of an entity is out-of-date.\n :param firstName: This person's given name (forename)\n :param lastName: This person's family name (surname)\n :param emails: The list of user email addresses registered to this user.\n :param userName: A name chosen by the user that uniquely identifies them.\n :param summary: A summary description about this person\n :param position: This person's current position title\n :param location: This person's location\n :param industry: The industry/discipline that this person is associated with\n :param company: This person's current affiliation\n :param profilePicureFileHandleId: The File Handle ID of the user's profile picture.\n :param url: A link to more information about this person\n :param notificationSettings: An object of type :py:class:`org.sagebionetworks.repo.model.message.Settings`\n containing the user's preferences regarding when email notifications should be sent\n \"\"\"\n\n def __init__(self, **kwargs):\n super(UserProfile, self).__init__(kwargs)\n
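Profiles are typically retrieved through the client rather than constructed directly (a sketch, assuming syn is a logged-in client and the username is hypothetical)::
profile = syn.getUserProfile() # your own profile\nother = syn.getUserProfile(\"someusername\") # by username or ownerId\n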
"},{"location":"reference/teams/#synapseclient.team.UserGroupHeader","title":"UserGroupHeader
","text":" Bases: DictObject
Select metadata about a Synapse principal. In practice the constructor is not called directly by the client.
:param ownerId: A foreign key to the ID of the 'principal' object for the user. :param firstName: First Name :param lastName: Last Name :param userName: A name chosen by the user that uniquely identifies them. :param email: User's current email address :param isIndividual: True if this is a user, false if it is a group
Source code in synapseclient/team.py
class UserGroupHeader(DictObject):\n \"\"\"\n Select metadata about a Synapse principal. In practice the constructor is not called directly by the client.\n\n :param ownerId: A foreign key to the ID of the 'principal' object for the user.\n :param firstName: First Name\n :param lastName: Last Name\n :param userName: A name chosen by the user that uniquely identifies them.\n :param email: User's current email address\n :param isIndividual: True if this is a user, false if it is a group\n \"\"\"\n\n def __init__(self, **kwargs):\n super(UserGroupHeader, self).__init__(kwargs)\n
"},{"location":"reference/teams/#synapseclient.team.Team","title":"Team
","text":" Bases: DictObject
Represents a Synapse Team <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/Team.html>
_. User definable fields are:
:param icon: fileHandleId for icon image of the Team :param description: A short description of this Team. :param name: The name of the Team. :param canPublicJoin: true for teams which members can join without an invitation or approval
Source code in synapseclient/team.py
class Team(DictObject):\n \"\"\"\n Represents a `Synapse Team <https://rest-docs.synapse.org/rest/org/sagebionetworks/repo/model/Team.html>`_.\n User definable fields are:\n\n :param icon: fileHandleId for icon image of the Team\n :param description: A short description of this Team.\n :param name: The name of the Team.\n :param canPublicJoin: true for teams which members can join without an invitation or approval\n \"\"\"\n\n def __init__(self, **kwargs):\n super(Team, self).__init__(kwargs)\n\n @classmethod\n def getURI(cls, id):\n return \"/team/%s\" % id\n\n def postURI(self):\n return \"/team\"\n\n def putURI(self):\n return \"/team\"\n\n def deleteURI(self):\n return \"/team/%s\" % self.id\n\n def getACLURI(self):\n return \"/team/%s/acl\" % self.id\n\n def putACLURI(self):\n return \"/team/acl\"\n
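Teams are usually fetched rather than constructed directly (a sketch; the team name here is hypothetical)::
team = syn.getTeam(\"My Example Team\") # by name or ID\nfor member in syn.getTeamMembers(team):\n print(member.member.userName, member.isAdmin)\n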
"},{"location":"reference/teams/#synapseclient.team.TeamMember","title":"TeamMember
","text":" Bases: DictObject
Contains information about a user's membership in a Team. In practice the constructor is not called directly by the client.
:param teamId: the ID of the team :param member: An object of type :py:class:org.sagebionetworks.repo.model.UserGroupHeader
describing the member :param isAdmin: Whether the given member is an administrator of the team
Source code in synapseclient/team.py
class TeamMember(DictObject):\n \"\"\"\n Contains information about a user's membership in a Team. In practice the constructor is not called directly by\n the client.\n\n :param teamId: the ID of the team\n :param member: An object of type :py:class:`org.sagebionetworks.repo.model.UserGroupHeader` describing the member\n :param isAdmin: Whether the given member is an administrator of the team\n\n \"\"\"\n\n def __init__(self, **kwargs):\n if \"member\" in kwargs:\n kwargs[\"member\"] = UserGroupHeader(**kwargs[\"member\"])\n super(TeamMember, self).__init__(kwargs)\n
"},{"location":"reference/view_schema/","title":"Entity View Schema","text":""},{"location":"reference/view_schema/#synapseclient.table.EntityViewSchema","title":"synapseclient.table.EntityViewSchema
","text":" Bases: ViewBase
An EntityViewSchema is a :py:class:synapseclient.entity.Entity
that displays all files/projects (depending on user choice) within a given set of scopes.
:param name: the name of the Entity View Table object :param columns: a list of :py:class:Column
objects or their IDs. These are optional. :param parent: the project in Synapse to which this table belongs :param scopes: a list of Projects/Folders or their ids :param type: This field is deprecated. Please use includeEntityTypes
:param includeEntityTypes: a list of entity types to include in the view. Supported entity types are:
- EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n If none is provided, the view will default to include EntityViewType.FILE.\n
:param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.) Defaults to True. The default columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in the EntityViewSchema's scope. Defaults to True. The annotation columns will be added after a call to :py:meth:synapseclient.Synapse.store
. :param ignoredAnnotationColumnNames: A list of strings representing annotation names. When addAnnotationColumns is True, the names in this list will not be automatically added as columns to the EntityViewSchema if they exist in any of the defined scopes. :param properties: A map of Synapse properties :param annotations: A map of user defined annotations :param local_state: Internal use only
Example::
from synapseclient import EntityViewType\n\nproject_or_folder = syn.get(\"syn123\")\nschema = syn.store(EntityViewSchema(name='MyTable', parent=project_or_folder, scopes=[project_or_folder],\n includeEntityTypes=[EntityViewType.FILE]))\n
Source code in synapseclient/table.py
class EntityViewSchema(ViewBase):\n \"\"\"\n A EntityViewSchema is a :py:class:`synapseclient.entity.Entity` that displays all files/projects\n (depending on user choice) within a given set of scopes\n\n :param name: the name of the Entity View Table object\n :param columns: a list of :py:class:`Column` objects or their IDs. These are optional.\n :param parent: the project in Synapse to which this table belongs\n :param scopes: a list of Projects/Folders or their ids\n :param type: This field is deprecated. Please use `includeEntityTypes`\n :param includeEntityTypes: a list of entity types to include in the view. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n If none is provided, the view will default to include EntityViewType.FILE.\n :param addDefaultViewColumns: If true, adds all default columns (e.g. name, createdOn, modifiedBy etc.)\n Defaults to True.\n The default columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param addAnnotationColumns: If true, adds columns for all annotation keys defined across all Entities in\n the EntityViewSchema's scope. Defaults to True.\n The annotation columns will be added after a call to\n :py:meth:`synapseclient.Synapse.store`.\n :param ignoredAnnotationColumnNames: A list of strings representing annotation names.\n When addAnnotationColumns is True, the names in this list will not be\n automatically added as columns to the EntityViewSchema if they exist in any\n of the defined scopes.\n :param properties: A map of Synapse properties\n :param annotations: A map of user defined annotations\n :param local_state: Internal use only\n\n Example::\n\n from synapseclient import EntityViewType\n\n project_or_folder = syn.get(\"syn123\")\n schema = syn.store(EntityViewSchema(name='MyTable', parent=project, scopes=[project_or_folder_id, 'syn123'],\n includeEntityTypes=[EntityViewType.FILE]))\n\n \"\"\"\n\n _synapse_entity_type = \"org.sagebionetworks.repo.model.table.EntityView\"\n\n def __init__(\n self,\n name=None,\n columns=None,\n parent=None,\n scopes=None,\n type=None,\n includeEntityTypes=None,\n addDefaultViewColumns=True,\n addAnnotationColumns=True,\n ignoredAnnotationColumnNames=[],\n properties=None,\n annotations=None,\n local_state=None,\n **kwargs,\n ):\n if includeEntityTypes:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask(includeEntityTypes)\n elif type:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask_for_deprecated_type(type)\n elif properties and \"type\" in properties:\n kwargs[\"viewTypeMask\"] = _get_view_type_mask_for_deprecated_type(\n properties[\"type\"]\n )\n properties[\"type\"] = None\n\n self.ignoredAnnotationColumnNames = set(ignoredAnnotationColumnNames)\n super(EntityViewSchema, self).__init__(\n name=name,\n columns=columns,\n properties=properties,\n annotations=annotations,\n local_state=local_state,\n parent=parent,\n **kwargs,\n )\n\n # This is a hacky solution to make sure we don't try to add columns to schemas that we retrieve from synapse\n is_from_normal_constructor = not (properties or local_state)\n # allowing annotations because user might want to update annotations all at once\n self.addDefaultViewColumns = (\n addDefaultViewColumns and is_from_normal_constructor\n )\n self.addAnnotationColumns = addAnnotationColumns and is_from_normal_constructor\n\n # set default values after constructor so we don't overwrite the values defined in 
properties using .get()\n # because properties, unlike local_state, do not have nonexistent keys assigned with a value of None\n if self.get(\"viewTypeMask\") is None:\n self.viewTypeMask = EntityViewType.FILE.value\n if self.get(\"scopeIds\") is None:\n self.scopeIds = []\n\n # add the scopes last so that we can append the passed in scopes to those defined in properties\n if scopes is not None:\n self.add_scope(scopes)\n\n def set_entity_types(self, includeEntityTypes):\n \"\"\"\n :param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous\n settings. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n \"\"\"\n self.viewTypeMask = _get_view_type_mask(includeEntityTypes)\n
"},{"location":"reference/view_schema/#synapseclient.table.EntityViewSchema-functions","title":"Functions","text":""},{"location":"reference/view_schema/#synapseclient.table.EntityViewSchema.set_entity_types","title":"set_entity_types(includeEntityTypes)
","text":":param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous settings. Supported entity types are:
- EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n
Source code in synapseclient/table.py
def set_entity_types(self, includeEntityTypes):\n \"\"\"\n :param includeEntityTypes: a list of entity types to include in the view. This list will replace the previous\n settings. Supported entity types are:\n\n - EntityViewType.FILE,\n - EntityViewType.PROJECT,\n - EntityViewType.TABLE,\n - EntityViewType.FOLDER,\n - EntityViewType.VIEW,\n - EntityViewType.DOCKER\n\n \"\"\"\n self.viewTypeMask = _get_view_type_mask(includeEntityTypes)\n
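For example, to make an existing view show folders as well as files (a sketch, assuming schema is an EntityViewSchema retrieved with syn.get)::
from synapseclient import EntityViewType\n\nschema.set_entity_types([EntityViewType.FILE, EntityViewType.FOLDER])\nschema = syn.store(schema)\n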
"},{"location":"reference/wiki/","title":"Wiki","text":""},{"location":"reference/wiki/#synapseclient.wiki","title":"synapseclient.wiki
","text":"Wiki
A Wiki page requires a title, markdown and an owner object and can also include images.
Creating a Wiki\n
::
from synapseclient import Wiki\n\nentity = syn.get('syn123456')\n\ncontent = \"\"\"\n# My Wiki Page\n\nHere is a description of my **fantastic** project!\n\nAn attached image:\n${image?fileName=logo.png&align=none}\n\"\"\"\n\nwiki = Wiki(title='My Wiki Page',\n owner=entity,\n markdown=content,\n attachments=['/path/to/logo.png'])\n\nwiki = syn.store(wiki)\n
Embedding images\n
Note that in the above example, we've attached a logo graphic and embedded it in the web page.
Figures that are more than just decoration can be stored as Synapse entities allowing versioning and provenance information to be recorded. This is a better choice for figures with data behind them.
Updating a Wiki\n
::
entity = syn.get('syn123456')\nwiki = syn.getWiki(entity)\n\nwiki.markdown = \"\"\"\n# My Wiki Page\n\nHere is a description of my **fantastic** project! Let's\n*emphasize* the important stuff.\n\nAn embedded image that is also a Synapse entity:\n${image?synapseId=syn1824434&align=None&scale=66}\n\nNow we can track its provenance and keep multiple versions.\n\"\"\"\n\nwiki = syn.store(wiki)\n
Wiki Class\n
.. autoclass:: synapseclient.wiki.Wiki :members: __init__
Wiki methods\n
synapseclient.Synapse.getWiki
synapseclient.Synapse.getWikiHeaders
synapseclient.Synapse.store
synapseclient.Synapse.delete
Wiki
","text":" Bases: DictObject
Represents a wiki page in Synapse with content specified in markdown.
:param title: Title of the Wiki :param owner: Parent Entity that the Wiki will belong to :param markdown: Content of the Wiki (cannot be defined if markdownFile is defined) :param markdownFile: Path to a file which contains the content of the Wiki (cannot be defined if markdown is defined) :param attachments: List of paths to files to attach :param fileHandles: List of file handle IDs representing files to be attached :param parentWikiId: (optional) For sub-pages, specify parent wiki page
Source code in synapseclient/wiki.py
class Wiki(DictObject):\n \"\"\"\n Represents a wiki page in Synapse with content specified in markdown.\n\n :param title: Title of the Wiki\n :param owner: Parent Entity that the Wiki will belong to\n :param markdown: Content of the Wiki (cannot be defined if markdownFile is defined)\n :param markdownFile: Path to a file which contains the content of the Wiki (cannot be defined if markdown is defined)\n :param attachments: List of paths to files to attach\n :param fileHandles: List of file handle IDs representing files to be attached\n :param parentWikiId: (optional) For sub-pages, specify parent wiki page\n \"\"\"\n\n __PROPERTIES = (\n \"title\",\n \"markdown\",\n \"attachmentFileHandleIds\",\n \"id\",\n \"etag\",\n \"createdBy\",\n \"createdOn\",\n \"modifiedBy\",\n \"modifiedOn\",\n \"parentWikiId\",\n )\n\n def __init__(self, **kwargs):\n # Verify that the parameters are correct\n if \"owner\" not in kwargs:\n raise ValueError(\"Wiki constructor must have an owner specified\")\n\n # Initialize the file handle list to be an empty list\n if \"attachmentFileHandleIds\" not in kwargs:\n kwargs[\"attachmentFileHandleIds\"] = []\n\n # update the markdown\n self.update_markdown(\n kwargs.pop(\"markdown\", None), kwargs.pop(\"markdownFile\", None)\n )\n\n # Move the 'fileHandles' into the proper (wordier) bucket\n if \"fileHandles\" in kwargs:\n for handle in kwargs[\"fileHandles\"]:\n kwargs[\"attachmentFileHandleIds\"].append(handle)\n del kwargs[\"fileHandles\"]\n\n super(Wiki, self).__init__(kwargs)\n self.ownerId = id_of(self.owner)\n del self[\"owner\"]\n\n def json(self):\n \"\"\"Returns the JSON representation of the Wiki object.\"\"\"\n return json.dumps({k: v for k, v in self.items() if k in self.__PROPERTIES})\n\n def getURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n\n def postURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki\" % self.ownerId\n\n def putURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n\n def deleteURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n\n def update_markdown(self, markdown=None, markdown_file=None):\n \"\"\"\n Updates the wiki's markdown. Specify only one of markdown or markdown_file\n :param markdown: text that will become the markdown\n :param markdown_file: path to a file. Its contents will be the markdown\n \"\"\"\n if markdown and markdown_file:\n raise ValueError(\"Please use only one argument: markdown or markdownFile\")\n\n if markdown_file:\n # pop the 'markdownFile' kwargs because we don't actually need it in the dictionary to upload to synapse\n markdown_path = os.path.expandvars(os.path.expanduser(markdown_file))\n if not os.path.isfile(markdown_path):\n raise ValueError(markdown_file + \" is not a valid file\")\n with open(markdown_path, \"r\") as opened_markdown_file:\n markdown = opened_markdown_file.read()\n\n self[\"markdown\"] = markdown\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki-functions","title":"Functions","text":""},{"location":"reference/wiki/#synapseclient.wiki.Wiki.json","title":"json()
","text":"Returns the JSON representation of the Wiki object.
Source code in synapseclient/wiki.py
def json(self):\n \"\"\"Returns the JSON representation of the Wiki object.\"\"\"\n return json.dumps({k: v for k, v in self.items() if k in self.__PROPERTIES})\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki.getURI","title":"getURI()
","text":"For internal use.
Source code in synapseclient/wiki.py
def getURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki.postURI","title":"postURI()
","text":"For internal use.
Source code in synapseclient/wiki.py
def postURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki\" % self.ownerId\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki.putURI","title":"putURI()
","text":"For internal use.
Source code in synapseclient/wiki.py
def putURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki.deleteURI","title":"deleteURI()
","text":"For internal use.
Source code in synapseclient/wiki.py
def deleteURI(self):\n \"\"\"For internal use.\"\"\"\n\n return \"/entity/%s/wiki/%s\" % (self.ownerId, self.id)\n
"},{"location":"reference/wiki/#synapseclient.wiki.Wiki.update_markdown","title":"update_markdown(markdown=None, markdown_file=None)
","text":"Updates the wiki's markdown. Specify only one of markdown or markdown_file :param markdown: text that will become the markdown :param markdown_file: path to a file. Its contents will be the markdown
Source code in synapseclient/wiki.py
def update_markdown(self, markdown=None, markdown_file=None):\n \"\"\"\n Updates the wiki's markdown. Specify only one of markdown or markdown_file\n :param markdown: text that will become the markdown\n :param markdown_file: path to a file. Its contents will be the markdown\n \"\"\"\n if markdown and markdown_file:\n raise ValueError(\"Please use only one argument: markdown or markdownFile\")\n\n if markdown_file:\n # pop the 'markdownFile' kwargs because we don't actually need it in the dictionary to upload to synapse\n markdown_path = os.path.expandvars(os.path.expanduser(markdown_file))\n if not os.path.isfile(markdown_path):\n raise ValueError(markdown_file + \" is not a valid file\")\n with open(markdown_path, \"r\") as opened_markdown_file:\n markdown = opened_markdown_file.read()\n\n self[\"markdown\"] = markdown\n
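For example (the markdown file path is hypothetical)::
wiki = syn.getWiki(entity)\nwiki.update_markdown(markdown_file=\"/path/to/page.md\")\nwiki = syn.store(wiki)\n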
"},{"location":"reference/wiki/#synapseclient.wiki.WikiAttachment","title":"WikiAttachment
","text":" Bases: DictObject
Represents a wiki page attachment
Source code insynapseclient/wiki.py
class WikiAttachment(DictObject):\n \"\"\"\n Represents a wiki page attachment\n\n \"\"\"\n\n __PROPERTIES = (\"contentType\", \"fileName\", \"contentMd5\", \"contentSize\")\n\n def __init__(self, **kwargs):\n super(WikiAttachment, self).__init__(**kwargs)\n
"},{"location":"reference/wiki/#synapseclient.wiki-functions","title":"Functions","text":""},{"location":"tutorials/authentication/","title":"Authentication","text":"There are multiple ways one can login to Synapse. We recommend users to choose the method that fits their workflow.
"},{"location":"tutorials/authentication/#one-time-login","title":"One Time Login","text":"Use a personal access token token obtained from synapse.org under your Settings. Note that a token must minimally have the view scope to be used with the Synapse Python Client.
syn = synapseclient.login(authToken=\"authtoken\")\n
"},{"location":"tutorials/authentication/#use-environment-variable","title":"Use Environment Variable","text":"Setting the SYNAPSE_AUTH_TOKEN
environment variable will allow you to login to Synapse with a personal access token
The environment variable will take priority over credentials in the user's .synapseConfig
file.
In your shell, you can pass an environment variable to Python inline by defining it before the command:
SYNAPSE_AUTH_TOKEN='<my_personal_access_token>' python3\n
Alternatively you may export it first, then start: Python
export SYNAPSE_AUTH_TOKEN='<my_personal_access_token>'\npython3\n
Once you are inside Python, you may simply login without passing any arguments:
import synapseclient\nsyn = synapseclient.login()\n
To use the environment variable with the command line client, simply substitute python
for the synapse
command
SYNAPSE_AUTH_TOKEN='<my_personal_access_token>' synapse get syn123\nSYNAPSE_AUTH_TOKEN='<my_personal_access_token>' synapse store --parentid syn123 ~/foobar.txt\n
Or alternatively, for multiple commands:
export SYNAPSE_AUTH_TOKEN='<my_personal_access_token>'\nsynapse get syn123\nsynapse store --parentid syn123 ~/foobar.txt\n
"},{"location":"tutorials/authentication/#use-synapseconfig","title":"Use .synapseConfig
","text":"For writing code using the Synapse Python client that is easy to share with others, please do not include your credentials in the code. Instead, please use .synapseConfig
file to manage your credentials.
When installing the Synapse Python client, the .synapseConfig
is added to your home directory.
You may also create the ~/.synapseConfig
file by utilizing the command line client command and following the interactive prompts:
synapse config\n
The following describes how to add your credentials to the .synapseConfig
file without the use of the synapse config
command.
Open the .synapseConfig
file and find the following section:
#[authentication]\n#username = <username>\n#authtoken = <authtoken>\n
To enable this section, uncomment it. You don't need to specify your username when using authtoken as a pair, but if you do, it will be used to verify your identity. A personal access token generated from your synapse.org Settings can be used as your .synapseConfig authtoken.
[authentication]\nauthtoken = <authtoken>\n
Now, you can login without specifying any arguments:
import synapseclient\nsyn = synapseclient.login()\n
For legacy compatibility, the .synapseConfig
[authentication]
section will continue to support apikey
or username
+ password
pair until early 2024 when they are both deprecated in favor of personal access tokens (authtoken
) which can be scoped to certain functions and are revocable.
For more information, see:
The Synapse Python Client can be used from the command line via the synapse
command.
Note: The command line client is installed along with Installation of the Synapse Python client.
"},{"location":"tutorials/command_line_client/#usage","title":"Usage","text":"For help, type:
synapse -h\n
For help on specific commands, type:
synapse [command] -h\n
Logging in with an auth token environment variable, type:
synapse login -p $MY_SYNAPSE_TOKENWelcome, First Last!Logged in as: username (1234567)The usage is as follows:
synapse [-h] [--version] [-u SYNAPSEUSER] [-p SYNAPSEPASSWORD] [-c CONFIGPATH] [--debug] [--silent] [-s]\n [--otel {console,otlp}]\n {get,manifest,sync,store,add,mv,cp,get-download-list,associate,delete,query,submit,show,cat,list,config,set-provenance,get-provenance,set-annotations,get-annotations,create,store-table,onweb,login,test-encoding,get-sts-token,migrate}\n ...\n
"},{"location":"tutorials/command_line_client/#options","title":"Options","text":"Name Type Description Default --version
Flag Show program\u2019s version number and exit -u, --username
Option Username used to connect to Synapse -p, --password
Option Password, api key, or token used to connect to Synapse -c, --configPath
Option Path to configuration file used to connect to Synapse \u201c~/.synapseConfig\u201d --debug
Flag Set to debug mode, additional output and error messages are printed to the console False --silent
Flag Set to silent mode, console output is suppressed False -s, --skip-checks
Flag Suppress checking for version upgrade messages and endpoint redirection False --otel
Option Enable the usage of OpenTelemetry for tracing. Possible choices: console, otlp"},{"location":"tutorials/command_line_client/#subcommands","title":"Subcommands","text":"get
","text":"synapse get [-h] [-q queryString] [-v VERSION] [-r] [--followLink] [--limitSearch projId] [--downloadLocation path]\n [--multiThreaded] [--manifest {all,root,suppress}]\n [local path]\n
Name Type Description Default local path
Positional Synapse ID of form syn123 of desired data object. -q, --query
Named Optional query parameter, will fetch all of the entities returned by a query. -v, --version
Named Synapse version number of entity to retrieve. Most recent version -r, --recursive
Named Fetches content in Synapse recursively contained in the parentId specified by id. False --followLink
Named Determines whether the link returns the target Entity. False --limitSearch
Named Synapse ID of a container such as project or folder to limit search for files if using a path. --downloadLocation
Named Directory to download file to. \u201c./\u201d --multiThreaded
Named Download the file using a multi-threaded implementation. True --manifest
Named Determines whether a manifest file is created automatically. \u201call\u201d"},{"location":"tutorials/command_line_client/#manifest","title":"manifest
","text":"Generate manifest for uploading directory tree to Synapse.
synapse manifest [-h] --parent-id syn123 [--manifest-file OUTPUT] PATH\n
Name Type Description Default PATH
Positional A path to a file or folder whose manifest will be generated. --parent-id
Named Synapse ID of project or folder where to upload data. --manifest-file
Named A TSV output file path where the generated manifest is stored. stdout"},{"location":"tutorials/command_line_client/#sync","title":"sync
","text":"Synchronize files described in a manifest to Synapse.
synapse sync [-h] [--dryRun] [--sendMessages] [--retries INT] FILE\n
Name Type Description Default FILE
Positional A tsv file with file locations and metadata to be pushed to Synapse. See synapseutils.sync.syncToSynapse for details on the format of a manifest. --dryRun
Named Perform validation without uploading. False --sendMessages
Named Send notifications via Synapse messaging (email) at specific intervals, on errors and on completion. False --retries
Named Number of retries for failed uploads. 4"},{"location":"tutorials/command_line_client/#store","title":"store
","text":"Uploads and adds a file to Synapse.
synapse store [-h] (--parentid syn123 | --id syn123 | --type TYPE) [--name NAME]\n [--description DESCRIPTION | --descriptionFile DESCRIPTION_FILE_PATH] [--used [target [target ...]]]\n [--executed [target [target ...]]] [--limitSearch projId] [--noForceVersion] [--annotations ANNOTATIONS]\n [--replace]\n [FILE]\n
Name Type Description Default FILE
Positional File to be added to synapse. --parentid, --parentId
Named Synapse ID of project or folder where to upload data (must be specified if \u2013id is not used). --id
Named Optional Id of entity in Synapse to be updated. --type
Named Type of object, such as \u201cFile\u201d, \u201cFolder\u201d, or \u201cProject\u201d, to create in Synapse. \u201cFile\u201d --name
Named Name of data object in Synapse. --description
Named Description of data object in Synapse. --descriptionFile
Named Path to a markdown file containing description of project/folder. --used
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) from which the specified entity is derived. --executed
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) that was executed to generate the specified entity. --limitSearch
Named Synapse ID of a container such as project or folder to limit search for provenance files. --noForceVersion
Named Do not force a new version to be created if the contents of the file have not changed. False --annotations
Named Annotations to add as a JSON formatted string, should evaluate to a dictionary (key/value pairs). Example: \u2018{\u201cfoo\u201d: 1, \u201cbar\u201d:\u201dquux\u201d}\u2019 --replace
Named Replace all existing annotations with the given annotations. False"},{"location":"tutorials/command_line_client/#add","title":"add
","text":"Uploads and adds a file to Synapse.
synapse add [-h] (--parentid syn123 | --id syn123 | --type TYPE) [--name NAME]\n [--description DESCRIPTION | --descriptionFile DESCRIPTION_FILE_PATH] [--used [target [target ...]]]\n [--executed [target [target ...]]] [--limitSearch projId] [--noForceVersion] [--annotations ANNOTATIONS] [--replace]\n [FILE]\n
Name Type Description Default FILE
Positional File to be added to synapse. --parentid, --parentId
Named Synapse ID of project or folder where to upload data (must be specified if \u2013id is not used). --id
Named Optional Id of entity in Synapse to be updated. --type
Named Type of object, such as \u201cFile\u201d, \u201cFolder\u201d, or \u201cProject\u201d, to create in Synapse. \u201cFile\u201d --name
Named Name of data object in Synapse. --description
Named Description of data object in Synapse. --descriptionFile
Named Path to a markdown file containing description of project/folder. --used
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) from which the specified entity is derived. --executed
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) that was executed to generate the specified entity. --limitSearch
Named Synapse ID of a container such as project or folder to limit search for provenance files. --noForceVersion
Named Do not force a new version to be created if the contents of the file have not changed. False --annotations
Named Annotations to add as a JSON formatted string, should evaluate to a dictionary (key/value pairs). Example: \u2018{\u201cfoo\u201d: 1, \u201cbar\u201d:\u201dquux\u201d}\u2019 --replace
Named Replace all existing annotations with the given annotations. False"},{"location":"tutorials/command_line_client/#mv","title":"mv
","text":"Moves a file/folder in Synapse.
synapse mv [-h] --id syn123 --parentid syn123\n
Name Type Description --id
Named Id of entity in Synapse to be moved. --parentid, --parentId
Named Synapse ID of project or folder where file/folder will be moved."},{"location":"tutorials/command_line_client/#cp","title":"cp
","text":"Copies specific versions of synapse content such as files, folders and projects by recursively copying all sub-content.
synapse cp [-h] --destinationId syn123 [--version 1] [--setProvenance traceback] [--updateExisting] [--skipCopyAnnotations]\n [--excludeTypes [file table [file table ...]]] [--skipCopyWiki]\n syn123\n
Name Type Description Default syn123
Positional Id of entity in Synapse to be copied. --destinationId
Named Synapse ID of project or folder where file will be copied to. --version, -v
Named Synapse version number of File or Link to retrieve. This parameter cannot be used when copying Projects or Folders. Defaults to most recent version. Most recent version --setProvenance
Named Has three values to set the provenance of the copied entity: traceback sets it to the source entity; existing sets it to the source entity\u2019s original provenance (if it exists); None/none sets no provenance. \"traceback\" --updateExisting
Named Will update the file if there is already a file that is named the same in the destination False --skipCopyAnnotations
Named Do not copy the annotations False --excludeTypes
Named Accepts a list of entity types (file, table, link) which determines which entity types to not copy. [] --skipCopyWiki
Named Do not copy the wiki pages False"},{"location":"tutorials/command_line_client/#get-download-list","title":"get-download-list
","text":"Download files from the Synapse download cart.
synapse get-download-list [-h] [--downloadLocation path]\n
Name Type Description Default --downloadLocation
Named Directory to download file to. \"./\""},{"location":"tutorials/command_line_client/#associate","title":"associate
","text":"Associate local files with the files stored in Synapse so that calls to \u201csynapse get\u201d and \u201csynapse show\u201d don\u2019t re-download the files but use the already existing file.
synapse associate [-h] [--limitSearch projId] [-r] path\n
Name Type Description Default path
Positional Local file path. --limitSearch
Named Synapse ID of a container such as project or folder to limit search to. -r
Named Perform recursive association with all local files in a folder. False"},{"location":"tutorials/command_line_client/#delete","title":"delete
","text":"Removes a dataset from Synapse.
synapse delete [-h] [--version VERSION] syn123\n
Name Type Description syn123
Positional Synapse ID of form syn123 of desired data object. --version
Named Version number to delete of given entity."},{"location":"tutorials/command_line_client/#query","title":"query
","text":"Performs SQL like queries on Synapse.
synapse query [-h] [string [string ...]]\n
Name Type Description string
Positional A query string. Note that when using the command line query strings must be passed intact as a single string. In most shells this can mean wrapping the query in quotes as appropriate and escaping any quotes that may appear within the query string itself. Example: synapse query \"select \\\"column has spaces\\\" from syn123\"
. See Table Examples for more information."},{"location":"tutorials/command_line_client/#submit","title":"submit
","text":"Submit an entity or a file for evaluation.
synapse submit [-h] [--evaluationID EVALUATIONID] [--evaluationName EVALUATIONNAME] [--entity ENTITY] [--file FILE]\n [--parentId PARENTID] [--name NAME] [--teamName TEAMNAME] [--submitterAlias ALIAS] [--used [target [target ...]]]\n [--executed [target [target ...]]] [--limitSearch projId]\n
Name Type Description --evaluationID, --evaluationId, --evalID
Named Evaluation ID where the entity/file will be submitted. --evaluationName, --evalN
Named Evaluation Name where the entity/file will be submitted. --entity, --eid, --entityId, --id
Named Synapse ID of the entity to be submitted. --file, -f
Named File to be submitted to the challenge. --parentId, --parentid, --parent
Named Synapse ID of project or folder where to upload data. --name
Named Name of the submission. --teamName, --team
Named Submit on behalf of a registered team. --submitterAlias, --alias
Named A nickname, possibly for display in leaderboards. --used
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) from which the specified entity is derived. --executed
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) that was executed to generate the specified entity. --limitSearch
Named Synapse ID of a container such as project or folder to limit search for provenance files."},{"location":"tutorials/command_line_client/#show","title":"show
","text":"Show metadata for an entity.
synapse show [-h] [--limitSearch projId] syn123\n
Name Type Description syn123
Positional Synapse ID of form syn123 of desired synapse object. --limitSearch
Named Synapse ID of a container such as project or folder to limit search for provenance files."},{"location":"tutorials/command_line_client/#cat","title":"cat
","text":"Prints a dataset from Synapse.
synapse cat [-h] [-v VERSION] syn123\n
Name Type Description Default syn123
Positional Synapse ID of form syn123 of desired data object. -v, --version
Named Synapse version number of entity to display. Most recent version"},{"location":"tutorials/command_line_client/#list","title":"list
","text":"List Synapse entities contained by the given Project or Folder. Note: May not be supported in future versions of the client.
synapse list [-h] [-r] [-l] [-m] syn123\n
Name Type Description Default syn123
Positional Synapse ID of a project or folder. -r, --recursive
Named Recursively list contents of the subtree descending from the given Synapse ID. False -l, --long
Named List synapse entities in long format. False -m, --modified
Named List modified by and modified date. False"},{"location":"tutorials/command_line_client/#config","title":"config
","text":"Create or modify a Synapse configuration file.
synapse config [-h]\n
Name Type Description -h
Named Show the help message and exit."},{"location":"tutorials/command_line_client/#set-provenance","title":"set-provenance
","text":"Create provenance records.
synapse set-provenance [-h] --id syn123 [--name NAME] [--description DESCRIPTION] [-o [OUTPUT_FILE]]\n [--used [target [target ...]]] [--executed [target [target ...]]] [--limitSearch projId]\n
Name Type Description --id
Named Synapse ID of entity whose provenance we are accessing. --name
Named Name of the activity that generated the entity. --description
Named Description of the activity that generated the entity. -o, --output
Named Output the provenance record in JSON format. --used
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) from which the specified entity is derived. --executed
Named Synapse ID, a url, or a local file path (of a file previously uploaded to Synapse) that was executed to generate the specified entity. --limitSearch
Named Synapse ID of a container such as project or folder to limit search for provenance files."},{"location":"tutorials/command_line_client/#get-provenance","title":"get-provenance
","text":"Show provenance records.
synapse get-provenance [-h] --id syn123 [--version version] [-o [OUTPUT_FILE]]\n
Name Type Description --id
Named Synapse ID of entity whose provenance we are accessing. --version
Named Version of Synapse entity whose provenance we are accessing. -o, --output
Named Output the provenance record in JSON format."},{"location":"tutorials/command_line_client/#set-annotations","title":"set-annotations
","text":"Create annotations records.
synapse set-annotations [-h] --id syn123 --annotations ANNOTATIONS [-r]\n
Name Type Description Default --id
Named Synapse ID of entity whose annotations we are accessing. --annotations
Named Annotations to add as a JSON formatted string, should evaluate to a dictionary (key/value pairs). Example: \u2018{\u201cfoo\u201d: 1, \u201cbar\u201d:\u201dquux\u201d}\u2019. -r, --replace
Named Replace all existing annotations with the given annotations. False"},{"location":"tutorials/command_line_client/#get-annotations","title":"get-annotations
","text":"Show annotations records.
synapse get-annotations [-h] --id syn123 [-o [OUTPUT_FILE]]\n
Name Type Description --id
Named Synapse ID of entity whose annotations we are accessing. -o, --output
Named Output the annotations record in JSON format."},{"location":"tutorials/command_line_client/#create","title":"create
","text":"Creates folders or projects on Synapse.
synapse create [-h] [--parentid syn123] --name NAME [--description DESCRIPTION | --descriptionFile DESCRIPTION_FILE_PATH] type\n
Name Type Description type
Positional Type of object to create in Synapse, one of {Project, Folder}. --parentid, --parentId
Named Synapse ID of project or folder where to place folder [not used with project]. --name
Named Name of folder/project. --description
Named Description of project/folder. --descriptionFile
Named Path to a markdown file containing description of project/folder."},{"location":"tutorials/command_line_client/#store-table","title":"store-table
","text":"Creates a Synapse Table given a csv.
synapse store-table [-h] --name NAME [--parentid syn123] [--csv foo.csv]\n
Name Type Description --name
Named Name of Table. --parentid, --parentId
Named Synapse ID of project. --csv
Named Path to csv."},{"location":"tutorials/command_line_client/#onweb","title":"onweb
","text":"Opens Synapse website for Entity.
synapse onweb [-h] id\n
Name Type Description id
Positional Synapse id."},{"location":"tutorials/command_line_client/#login","title":"login
","text":"Login to Synapse and (optionally) cache credentials.
synapse login [-h] [-u SYNAPSEUSER] [-p SYNAPSEPASSWORD] [--rememberMe]\n
Name Type Description Default -u, --username
Named Username used to connect to Synapse. -p, --password
Named This will be deprecated. Password or api key used to connect to Synapse. --rememberMe, --remember-me
Named Cache credentials for automatic authentication on future interactions with Synapse. False"},{"location":"tutorials/command_line_client/#test-encoding","title":"test-encoding
","text":"Test character encoding to help diagnose problems.
synapse test-encoding [-h]\n
Name Type Description -h
Named Show the help message and exit."},{"location":"tutorials/command_line_client/#get-sts-token","title":"get-sts-token
","text":"Get an STS token for access to AWS S3 storage underlying Synapse.
synapse get-sts-token [-h] [-o {json,boto,shell,bash,cmd,powershell}] id {read_write,read_only}\n
Name Type Description Default id
Positional Synapse id. permission
Positional Possible choices: read_write, read_only. -o, --output
Named Possible choices: json, boto, shell, bash, cmd, powershell. \"shell\""},{"location":"tutorials/command_line_client/#migrate","title":"migrate
","text":"Migrate Synapse entities to a different storage location.
synapse migrate [-h] [--source_storage_location_ids [SOURCE_STORAGE_LOCATION_IDS [SOURCE_STORAGE_LOCATION_IDS ...]]]\n [--file_version_strategy FILE_VERSION_STRATEGY] [--include_table_files] [--continue_on_error]\n [--csv_log_path CSV_LOG_PATH] [--dryRun] [--force]\n id dest_storage_location_id db_path\n
Name Type Description Default id
Positional Synapse id. dest_storage_location_id
Positional Destination Synapse storage location id. db_path
Positional Local system path where a record keeping file can be stored. --source_storage_location_ids
Named Source Synapse storage location ids. If specified only files in these storage locations will be migrated. --file_version_strategy
Named One of \u2018new\u2019, \u2018latest\u2019, \u2018all\u2019, \u2018skip\u2019. New creates a new version of each entity, latest migrates the most recent version, all migrates all versions, skip avoids migrating file entities (use when exclusively targeting table attached files). \"new\" --include_table_files
Named Include table attached files when migrating. False --continue_on_error
Named Whether to continue processing other entities if migration of one fails. False --csv_log_path
Named Path where to log a csv documenting the changes from the migration. --dryRun
Named Dry run; files will be indexed but not migrated. False --force
Named Bypass interactive prompt confirming migration. False"},{"location":"tutorials/configuration/","title":"Configuration","text":"The Synapse Python client can be configured either programmatically or by using a configuration file. When you install the Synapse Python client, a .synapseConfig
file is added to your home directory. This configuration file stores a number of configuration options, including your Synapse authtoken, cache, and multi-threading settings.
A full example .synapseConfig
can be found in the github repository.
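If you keep a configuration file somewhere other than your home directory, the client can be pointed at it explicitly. This is a minimal sketch, assuming the configPath argument of the Synapse constructor; the path shown is hypothetical:
import synapseclient\n# point the client at a custom config file (defaults to ~/.synapseConfig)\nsyn = synapseclient.Synapse(configPath='/path/to/.synapseConfig')\nsyn.login()\n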
.synapseConfig
sections","text":""},{"location":"tutorials/configuration/#authentication","title":"[authentication]
","text":"See details on this section in the authentication document.
"},{"location":"tutorials/configuration/#cache","title":"[cache]
","text":"Your downloaded files are cached to avoid repeat downloads of the same file. change 'location' to use a different folder on your computer as the cache location
"},{"location":"tutorials/configuration/#endpoints","title":"[endpoints]
","text":"Configuring these will cause the Python client to use these as Synapse service endpoints instead of the default prod endpoints.
"},{"location":"tutorials/configuration/#transfer","title":"[transfer]
","text":"Settings to configure how Synapse uploads/downloads data.
You may also set the max_threads
programmatically via:
import synapseclient\nsyn = synapseclient.login()\nsyn.max_threads = 10\n
"},{"location":"tutorials/file_versioning/","title":"File Upload","text":"Files in Synapse are versionable. Please see Versioning for more information about how versions in Files works.
"},{"location":"tutorials/file_versioning/#uploading-a-new-version","title":"Uploading a New Version","text":"Uploading a new version follows the same steps as uploading a file for the first time - use the same file name and store it in the same location (e.g., the same parentId). It is recommended to add a comment to the new version in order to easily track differences at a glance. The example file raw_data.txt
will now have a version of 2 and a comment describing the change.
Explicit example:
import synapseclient\n\n# fetch the file in Synapse\nfile_to_update = syn.get('syn2222', downloadFile=False)\n\n# save the local path to the new version of the file\nfile_to_update.path = '/path/to/new/version/of/raw_data.txt'\n\n# add a version comment\nfile_to_update.versionComment = 'Added 5 random normally distributed numbers.'\n\n# store the new file\nupdated_file = syn.store(file_to_update)\n
Implicit example:
# Assuming that there is a file created with:\nsyn.store(File('path/to/old/raw_data.txt', parentId='syn123456'))\n\n# To create a new version of that file, make sure you store it with the exact same name\nnew_file = syn.store(File('path/to/new_version/raw_data.txt', parentId='syn123456'))\n
"},{"location":"tutorials/file_versioning/#updating-annotations-or-provenance-without-changing-versions","title":"Updating Annotations or Provenance without Changing Versions","text":"Any change to a File will automatically update its version. If this isn\u2019t the desired behavior, such as minor changes to the metadata, you can set forceVersion=False
with the Python client. For command line, the commands set-annotations
and set-provenance
will update the metadata without creating a new version. Adding/updating annotations and provenance in the web client will also not cause a version change.
Important: Because Provenance is tracked by version, set forceVersion=False
for minor changes to avoid breaking Provenance.
Setting annotations without changing version:
# Get file from Synapse, set download=False since we are only updating annotations\nfile = syn.get('syn56789', download=False)\n\n# Add annotations\nfile.annotations = {\"fileType\":\"bam\", \"assay\":\"RNA-seq\"}\n\n# Store the file without creating a new version\nfile = syn.store(file, forceVersion=False)\n
"},{"location":"tutorials/file_versioning/#setting-provenance-without-changing-version","title":"Setting Provenance without Changing Version","text":"To set Provenance without changing the file version:
# Get file from Synapse, set download=False since we are only updating provenance\nfile = syn.get('syn56789', download=False)\n\n# Add provenance\nfile = syn.setProvenance(file, activity = Activity(used = '/path/to/example_code'))\n\n# Store the file without creating a new version\nfile = syn.store(file, forceVersion=False)\n
"},{"location":"tutorials/file_versioning/#downloading-a-specific-version","title":"Downloading a Specific Version","text":"By default, the File downloaded will always be the most recent version. However, a specific version can be downloaded by passing the version parameter:
entity = syn.get(\"syn3260973\", version=1)\n
"},{"location":"tutorials/installation/","title":"Installation","text":"By following the instructions below, you are installing the synapseclient
, synapseutils
and the command line client.
The synapseclient package is available from PyPI. It can be installed or upgraded with pip. Due to the nature of Python, we highly recommend you set up your python environment with conda or pyenv and create virtual environments to control your Python dependencies for your work.
conda create -n synapseclient python=3.9\nconda activate synapseclient\n(sudo) pip install (--upgrade) synapseclient[pandas, pysftp]\n
pyenv install -v 3.9.13\npyenv global 3.9.13\npython -m venv env\nsource env/bin/activate\n(sudo) python3 -m pip install (--upgrade) synapseclient[pandas, pysftp]\n
The dependencies on pandas and pysftp are optional. The Synapse synapseclient.table
feature integrates with Pandas. Support for sftp is required for users of SFTP file storage. Both require native libraries to be compiled or installed separately from prebuilt binaries.
Source code and development versions are available on Github. Installing from source:
git clone https://github.com/Sage-Bionetworks/synapsePythonClient.git\ncd synapsePythonClient\n
You can stay on the master branch to get the latest stable release or check out the develop branch or a tagged revision:
git checkout <branch or tag>\n
Next, either install the package into the site-packages directory with pip install .
or use pip install -e .
to make the installation track the current checkout without having to reinstall:
pip install .\n
"},{"location":"tutorials/python_client/","title":"Working with the Python client","text":""},{"location":"tutorials/python_client/#authentication","title":"Authentication","text":"Most operations in Synapse require you to be logged in. Please follow instructions in authentication to configure your client:
import synapseclient\nsyn = synapseclient.Synapse()\nsyn.login()\n# If you aren't logged in, this following command will\n# show that you are an \"anonymous\" user.\nsyn.getUserProfile()\n
"},{"location":"tutorials/python_client/#accessing-data","title":"Accessing Data","text":"Synapse identifiers are used to refer to projects and data which are represented by synapseclient.entity
objects. For example, the entity syn1899498 represents a tab-delimited file containing a 100 by 4 matrix. Getting the entity retrieves an object that holds metadata describing the matrix, and also downloads the file to a local cache:
import synapseclient\n# This is a shortcut to login\nsyn = synapseclient.login()\nentity = syn.get('syn1899498')\n
View the entity's metadata in the Python console:
print(entity)\n
This is one simple way to read in a small matrix:
rows = []\nwith open(entity.path) as f:\n header = f.readline().split('\\t')\n for line in f:\n row = [float(x) for x in line.split('\\t')]\n rows.append(row)\n
View the entity in the browser:
syn.onweb('syn1899498')\n
You can create your own projects and upload your own data sets. Synapse stores entities in a hierarchical or tree structure. Projects are at the top level and must be uniquely named:
import synapseclient\nfrom synapseclient import Project, Folder, File\n\nsyn = synapseclient.login()\n# Project names must be globally unique\nproject = Project('My uniquely named project')\nproject = syn.store(project)\n
Creating a folder:
data_folder = Folder('Data', parent=project)\ndata_folder = syn.store(data_folder)\n
Adding files to the project. You will get an error if you try to store an empty file in Synapse. Here we create temporary files, but you can specify your own file path:
import tempfile\n\ntemp = tempfile.NamedTemporaryFile(prefix='your_file', suffix='.txt')\nwith open(temp.name, \"w\") as temp_f:\n temp_f.write(\"Example text\")\nfilepath = temp.name\ntest_entity = File(filepath, description='Fancy new data', parent=data_folder)\ntest_entity = syn.store(test_entity)\nprint(test_entity)\n
You may notice that there is a \"downloadAs\" name and an \"entity name\". By default, the client will use the file's name as the entity name, but you can configure the file to display a different name on Synapse:
test_second_entity = File(filepath, name=\"second file\", parent=data_folder)\ntest_second_entity = syn.store(test_second_entity)\nprint(test_second_entity)\n
In addition to simple data storage, Synapse entities can be annotated with key/value metadata, described in markdown documents (Wiki), and linked together in provenance graphs to create a reproducible record of a data analysis pipeline.
See also:
Annotations are arbitrary metadata attached to Synapse entities. There are different ways to creating annotations. Using the entity created from the previous step in the tutorial, for example:
# First method\ntest_ent = syn.get(test_entity.id)\ntest_ent.foo = \"foo\"\ntest_ent.bar = \"bar\"\nsyn.store(test_ent)\n\n# Second method\ntest_ent = syn.get(test_entity.id)\nannotations = {\"foo\": \"foo\", \"bar\": \"bar\"}\ntest_ent.annotations = annotations\nsyn.store(test_ent)\n
See:
Synapse supports versioning of many entity types. This tutorial will focus on File versions. Using the project/folder created earlier in this tutorial
Uploading a new version. Synapse leverages the entity name to version entities:
import tempfile\n\ntemp = tempfile.NamedTemporaryFile(prefix='second', suffix='.txt')\nwith open(temp.name, \"w\") as temp_f:\n temp_f.write(\"First text\")\n\nversion_entity = File(temp.name, parent=data_folder)\nversion_entity = syn.store(version_entity)\nprint(version_entity.versionNumber)\n\nwith open(temp.name, \"w\") as temp_f:\n temp_f.write(\"Second text\")\nversion_entity = File(temp.name, parent=data_folder)\nversion_entity = syn.store(version_entity)\nprint(version_entity.versionNumber)\n
Downloading a specific version. By default, Synapse downloads the latest version unless a version is specified:
version_1 = syn.get(version_entity, version=1)\n
"},{"location":"tutorials/python_client/#provenance","title":"Provenance","text":"Synapse provides tools for tracking 'provenance', or the transformation of raw data into processed results, by linking derived data objects to source data and the code used to perform the transformation:
# pass the provenance to the store function\nprovenance_ent = syn.store(\n version_entity,\n used=[version_1.id],\n executed=[\"https://github.com/Sage-Bionetworks/synapsePythonClient/tree/v2.7.2\"]\n)\n
See:
Views display rows and columns of information, and they can be shared and queried with SQL. Views are queries of other data already in Synapse. They allow you to see groups of files, tables, projects, or submissions and any associated annotations about those items.
Annotations are an essential component to building a view. Annotations are labels that you apply to your data, stored as key-value pairs in Synapse.
We will create a file view from the project above:
import synapseclient\nsyn = synapseclient.login()\n# Here we are using project.id from the earlier sections from this tutorial\nproject_id = project.id\nfileview = EntityViewSchema(\n name='MyTable',\n parent=project_id,\n scopes=[project_id]\n)\nfileview_ent = syn.store(fileview)\n
You can now query it to see all the files within the project. Note: it is highly recommended to install pandas
:
query = syn.tableQuery(f\"select * from {fileview_ent.id}\")\nquery_results = query.asDataFrame()\nprint(query_results)\n
See:
For more information see the Synapse Getting Started.
"},{"location":"tutorials/reticulate/","title":"Using synapseclient with R through reticulate","text":"This article describes using the Python synapseclient with R through the reticulate package, which provides an interface between R and Python libraries.
While the separate synapser R package exists and can be installed directly in an R environment without the need for reticulate, it is not currently compatible with an R environment that already includes reticulate. In such cases using the Python synapseclient is an alternative.
"},{"location":"tutorials/reticulate/#installation","title":"Installation","text":""},{"location":"tutorials/reticulate/#installing-reticulate","title":"Installing reticulate","text":"This article assumes that reticulate is installed and available in your R environment. If not it can be installed as follows:
install.packages(\"reticulate\")\n
"},{"location":"tutorials/reticulate/#installing-synapseclient","title":"Installing synapseclient","text":"The Python synapseclient can be installed either directly into the Python installation you intend to use with reticulate or from within R using the reticulate library.
synapseclient has the same requirements and dependencies when installed for use with reticulate as it does in other usage. In particular note that synapseclient requires a Python version of 3.6 or greater.
"},{"location":"tutorials/reticulate/#installing-into-python","title":"Installing into Python","text":"The Python synapseclient is available on the PyPi package repository and can be installed through Python tools that interface with the repository, such as pip. To install synapseclient for use with reticulate directly into a Python environment, first ensure that the current Python interpreter is the one you intend to use with reticulate. This may be a particular installation of Python, or a loaded virtual environment. See reticulate's Python version configuration documentation for more information on how reticulate can be configured to use particular Python environments.
For help installing a reticulate compatible Python, see the reticulate version of the SynapseShinyApp.
Once you have ensured you are interacting with your intended Python interpreter, follow the standard synapseclient installation instructions to install synapseclient.
"},{"location":"tutorials/reticulate/#installing-from-rreticulate","title":"Installing from R/Reticulate","text":"To install synapseclient from within R, first ensure that the reticulate library is loaded.
library(reticulate)\n
Once loaded, ensure that reticulate will use the Python installation you intend. You may need to provide reticulate a hint or otherwise point it at the proper Python installation.
Next install the synapseclient using reticulate's py_install command, e.g.
py_install(\"synapseclient\")\n
You may also want to install some of synapseclient's optional dependencies, such as Pandas for table support.
py_install(\"pandas\")\n
See synapseclient's installation instructions for more information on optional dependencies.
"},{"location":"tutorials/reticulate/#usage","title":"Usage","text":"Once synapseclient is installed it can be used once it is imported through R's import command:
synapseclient <- import(\"synapseclient\")\n
If you are using synapseclient with reticulate when writing an R package, you will want to wrap the import in an onLoad and use the delay_load option, .e.g.
synapseclient <- NULL\n\n.onLoad <- function(libname, pkgname) {\n synapseclient <<- reticulate::import(\"synapseclient\", delay_load = TRUE)\n}\n
This will allow users of your package to configure their reticulate usage properly regardless of when they load your package. More information on this technique can be found here.
If you are familiar with the synapser R package, many of the commands will be similar, but unlike in synapser where package functions and classes are made available in the global namespace through the search path, when using synapseclient through reticulate, classes are accessed through the imported synapseclient module and functionality is provided through an instantiated Synapse instance.
For example classes that were globally available are now available through the imported synapseclient module.
# File from synapser\nsynapseclient$File\n\n# Table from synapser\nsynapseclient$Table\n
And various syn functions are now methods on the Synapse object:
# using synapseclient with reticulate we must instantiate a Synapse instance\nsyn <- synapseclient$Synapse()\n\n# synLogin from synapser\nsyn$login()\n\n# synGet from synapser\nsyn$get(identifier)\n\n# synStore from syanpser\nsyn$store(entity)\n
Each synapse object has its own state, such as configuration and login credentials.
"},{"location":"tutorials/reticulate/#credentials","title":"Credentials","text":"synapseclient accessed through reticulate supports the same authentication options as it does when accessed directly from Python, for example:
syn <- synapseclient$synapse()\n\n# one time login\nsyn$login('<username', '<password>')\n\n# login and store credentials for future use\nsyn$login('<username', '<password>', rememberMe=TRUE)\n
See Managing Synapse Credentials for complete documentation on how synapseclient handles credentials and authentication.
"},{"location":"tutorials/reticulate/#accessing-data","title":"Accessing Data","text":"The following illustrates some examples of storing and retrieving data in Synapse using synapseclient through reticulate.
See here for more details on available data access APIs.
Create a project with a unique name
# use hex_digits to generate random string and use it to name a project\nhex_digits <- c(as.character(0:9), letters[1:6])\nprojectName <- sprintf(\"My unique project %s\", paste0(sample(hex_digits, 32, replace = TRUE), collapse = \"\"))\n\nproject <- synapseclient$Project(projectName)\nproject <- syn$store(project)\n
Create, store, and retrieve a file
filePath <- tempfile()\nconnection <- file(filePath)\nwriteChar(\"a \\t b \\t c \\n d \\t e \\t f \\n\", connection, eos = NULL)\nclose(connection)\n\nfile <- synapseclient$File(path = filePath, parent = project)\nfile <- syn$store(file)\nsynId <- file$properties$id\n\n# download the file using its identifier to specific path\nfileEntity <- syn$get(synId, downloadLocation=\"/path/to/folder\")\n\n# view the file meta data in the console\nprint(fileEntity)\n\n# view the file on the web\nsyn$onweb(synId)\n
Create folder and add files to the folder:
dataFolder <- synapseclient$Folder(\"Data\", parent = project)\ndataFolder <- syn$store(dataFolder)\n\nfilePath <- tempfile()\nconnection <- file(filePath)\nwriteChar(\"this is the content of the file\", connection, eos = NULL)\nclose(connection)\nfile <- synapseclient$File(path = filePath, parent = dataFolder)\nfile <- syn$store(file)\n
"},{"location":"tutorials/reticulate/#annotating-synapse-entities","title":"Annotating Synapse Entities","text":"This illustrates adding annotations to a Synapse entity.
# first retrieve the existing annotations object\nannotations <- syn$get_annotations(project)\n\nannotations$foo <- \"bar\"\nannotations$fooList <- list(\"bar\", \"baz\")\n\nsyn$set_annotations(annotations)\n
See here for more information on annotations.
"},{"location":"tutorials/reticulate/#activityprovenance","title":"Activity/Provenance","text":"This example illustrates creating an entity with associated provenance.
See here for more information on Activity/Provenance related APIs.
act <- synapseclient$Activity(\n name = \"clustering\",\n description = \"whizzy clustering\",\n used = c(\"syn1234\", \"syn1235\"),\n executed = \"syn4567\")\n
filePath <- tempfile()\nconnection <- file(filePath)\nwriteChar(\"some test\", connection, eos = NULL)\nclose(connection)\n\nfile = synapseclient$File(filePath, name=\"provenance_file.txt\", parent=project)\nfile <- syn$store(file, activity = act)\n
"},{"location":"tutorials/reticulate/#tables","title":"Tables","text":"These examples illustrate manipulating Synapse Tables. Note that you must have installed the Pandas dependency into the Python environment as described above in order to use this feature.
See here for more information on tables.
The following illustrates building a table from an R data frame. The schema will be generated from the data types of the values within the data frame.
# start with an R data frame\ngenes <- data.frame(\n Name = c(\"foo\", \"arg\", \"zap\", \"bah\", \"bnk\", \"xyz\"),\n Chromosome = c(1, 2, 2, 1, 1, 1),\n Start = c(12345, 20001, 30033, 40444, 51234, 61234),\n End = c(126000, 20200, 30999, 41444, 54567, 68686),\n Strand = c(\"+\", \"+\", \"-\", \"-\", \"+\", \"+\"),\n TranscriptionFactor = c(F, F, F, F, T, F))\n\n# build a Synapse table from the data frame.\n# a schema is automatically generated\n# note that reticulate will automatically convert from an R data frame to Pandas\ntable <- synapseclient$build_table(\"My Favorite Genes\", project, genes)\n\ntable <- syn$store(table)\n
Alternately the schema can be specified. At this time when using date values it is necessary to use a date string formatted in \"YYYY-MM-dd HH:mm:ss.mmm\" format or integer unix epoch millisecond value and explicitly specify the type in the schema due to how dates are translated to the Python client.
prez_birthdays <- data.frame(\n Name = c(\"George Washington\", \"Thomas Jefferson\", \"Abraham Lincoln\"),\n Time = c(\"1732-02-22 11:23:11.024\", \"1743-04-13 00:00:00.000\", \"1809-02-12 01:02:03.456\"))\n\ncols <- list(\n synapseclient$Column(name = \"Name\", columnType = \"STRING\", maximumSize = 20),\n synapseclient$Column(name = \"Time\", columnType = \"DATE\"))\n\nschema <- synapseclient$Schema(name = \"President Birthdays\", columns = cols, parent = project)\ntable <- synapseclient$Table(schema, prez_birthdays)\n\n# store the table in Synapse\ntable <- syn$store(table)\n
We can query a table as in the following:
tableId <- table$tableId\n\nresults <- syn$tableQuery(sprintf(\"select * from %s where Name='George Washington'\", tableId))\nresults$asDataFrame()\n
"},{"location":"tutorials/reticulate/#wikis","title":"Wikis","text":"This example illustrates creating a wiki.
See here for more information on wiki APIs.
content <- \"\n# My Wiki Page\nHere is a description of my **fantastic** project!\n\"\n\n# attachment\nfilePath <- tempfile()\nconnection <- file(filePath)\nwriteChar(\"this is the content of the file\", connection, eos = NULL)\nclose(connection)\nwiki <- synapseclient$Wiki(\n owner = project,\n title = \"My Wiki Page\",\n markdown = content,\n attachments = list(filePath)\n)\nwiki <- syn$store(wiki)\n
An existing wiki can be updated as follows.
wiki <- syn$getWiki(project)\nwiki$markdown <- \"\n# My Wiki Page\nHere is a description of my **fantastic** project! Let's\n*emphasize* the important stuff.\n\"\nwiki <- syn$store(wiki)\n
"},{"location":"tutorials/reticulate/#evaluations","title":"Evaluations","text":"An Evaluation is a Synapse construct useful for building processing pipelines and for scoring predictive modeling and data analysis challenges.
See here for more information on Evaluations.
Creating an Evaluation:
eval <- synapseclient$Evaluation(\n name = sprintf(\"My unique evaluation created on %s\", format(Sys.time(), \"%a %b %d %H%M%OS4 %Y\")),\n description = \"testing\",\n contentSource = project,\n submissionReceiptMessage = \"Thank you for your submission!\",\n submissionInstructionsMessage = \"This evaluation only accepts files.\")\n\neval <- syn$store(eval)\n\neval <- syn$getEvaluation(eval$id)\n
Submitting a file to an existing Evaluation:
# first create a file to submit\nfilePath <- tempfile()\nconnection <- file(filePath)\nwriteChar(\"this is my first submission\", connection, eos = NULL)\nclose(connection)\nfile <- synapseclient$File(path = filePath, parent = project)\nfile <- syn$store(file)\n# submit the created file\nsubmission <- syn$submit(eval, file)\n
List submissions:
submissions <- syn$getSubmissionBundles(eval)\n\n# submissions are returned as a generator\nlist(iterate(submissions))\n
Retrieving submission by id:
submission <- syn$getSubmission(submission$id)\n
Retrieving the submission status:
submissionStatus <- syn$getSubmissionStatus(submission)\nsubmissionStatus\n
Query an evaluation:
queryString <- sprintf(\"query=select * from evaluation_%s LIMIT %s OFFSET %s'\", eval$id, 10, 0)\nsyn$restGET(paste(\"/evaluation/submission/query?\", URLencode(queryString), sep = \"\"))\n
"},{"location":"tutorials/reticulate/#sharing-access-to-content","title":"Sharing Access to Content","text":"The following illustrates sharing access to a Synapse Entity.
See here for more information on Access Control including all available permissions.
# get permissions on an entity\n# to get permissions for a user/group pass a principalId identifier,\n# otherwise the assumed permission will apply to the public\n\n# make the project publicly accessible\nacl <- syn$setPermissions(project, accessType = list(\"READ\"))\n\nperms = syn$getPermissions(project)\n
"},{"location":"tutorials/reticulate/#views","title":"Views","text":"A view is a view of all entities (File, Folder, Project, Table, Docker Repository, View) within one or more Projects or Folders. Views can: The following examples illustrate some view operations.
See here for more information on Views. A view is implemented as a Table, see here for more information on Tables.
First create some files we can use in a view:
filePath1 <- tempfile()\nconnection <- file(filePath1)\nwriteChar(\"this is the content of the first file\", connection, eos = NULL)\nclose(connection)\nfile1 <- synapseclient$File(path = filePath1, parent = project)\nfile1 <- syn$store(file1)\nfilePath2 <- tempfile()\nconnection2 <- file(filePath2)\nwriteChar(\"this is the content of the second file\", connection, eos = NULL)\nclose(connection2)\nfile2 <- synapseclient$File(path = filePath2, parent = project)\nfile2 <- syn$store(file2)\n\n# add some annotations\nfileAnnotations1 <- syn$get_annotations(file1)\nfileAnnotations2 <- syn$get_annotations(file2)\n\nfileAnnotations1$contributor <- \"Sage\"\nfileAnnotations1$class <- \"V\"\nsyn$set_annotations(fileAnnotations1)\n\nfileAnnotations2$contributor = \"UW\"\nfileAnnotations2$rank = \"X\"\nsyn$set_annotations(fileAnnotations2)\n
Now create a view:
columns = c(\n synapseclient$Column(name = \"contributor\", columnType = \"STRING\"),\n synapseclient$Column(name = \"class\", columnType = \"STRING\"),\n synapseclient$Column(name = \"rank\", columnType = \"STRING\")\n)\n\nview <- synapseclient$EntityViewSchema(\n name = \"my first file view\",\n columns = columns,\n parent = project,\n scopes = project,\n includeEntityTypes = c(synapseclient$EntityViewType$FILE, synapseclient$EntityViewType$FOLDER),\n addDefaultViewColumns = TRUE\n)\n\nview <- syn$store(view)\n
We can now see content of our view (note that views are not created synchronously it may take a few seconds for the view table to be queryable).
queryResults <- syn$tableQuery(sprintf(\"select * from %s\", view$properties$id))\ndata <- queryResults$asDataFrame()\ndata\n
We can update annotations using a view as follows:
data[\"class\"] <- c(\"V\", \"VI\")\nsyn$store(synapseclient$Table(view$properties$id, data))\n\n# the change in annotations is reflected in get_annotations():\nsyn$get_annotations(file2$properties$id)\n
"},{"location":"tutorials/reticulate/#update-views-content","title":"Update View's Content","text":"# A view can contain different types of entity. To change the types of entity that will show up in a view:\nview <- syn$get(view$properties$id)\nview$set_entity_types(list(synapseclient$EntityViewType$FILE))\n
"},{"location":"tutorials/reticulate/#using-with-a-shiny-app","title":"Using with a Shiny App","text":"Reticulate and the Python synapseclient can be used to workaround an issue that exists when using synapser with a Shiny App. Since synapser shares a Synapse client instance within the R process, multiple users of a synapser integrated Shiny App may end up sharing a login if precautions aren't taken. When using reticulate with synapseclient, session scoped Synapse client objects can be created that avoid this issue.
See SynapseShinyApp for a sample application and a discussion of the issue, and the reticulate branch for an alternative implementation using reticulate with synapseclient.
"},{"location":"tutorials/tables/","title":"Tables","text":"Tables can be built up by adding sets of rows that follow a user-defined schema and queried using a SQL-like syntax.
"},{"location":"tutorials/tables/#creating-a-table-and-loading-it-with-data","title":"Creating a table and loading it with data","text":""},{"location":"tutorials/tables/#initial-setup","title":"Initial setup:","text":"import synapseclient\nfrom synapseclient import Project, File, Folder\nfrom synapseclient import Schema, Column, Table, Row, RowSet, as_table_columns, build_table\n\nsyn = synapseclient.Synapse()\nsyn.login()\n\nproject = syn.get('syn123')\n
"},{"location":"tutorials/tables/#example-data","title":"Example data","text":"First, let's load some data. Let's say we had a file, genes.csv:
Name,Chromosome,Start,End,Strand,TranscriptionFactor\nfoo,1,12345,12600,+,False\narg,2,20001,20200,+,False\nzap,2,30033,30999,-,False\nbah,1,40444,41444,-,False\nbnk,1,51234,54567,+,True\nxyz,1,61234,68686,+,False\n
"},{"location":"tutorials/tables/#creating-a-table-with-columns","title":"Creating a table with columns","text":"table = build_table('My Favorite Genes', project, \"/path/to/genes.csv\")\nsyn.store(table)\n
build_table will set the Table Schema which defines the columns of the table. To create a table with a custom Schema, first create the Schema:
cols = [\n Column(name='Name', columnType='STRING', maximumSize=20),\n Column(name='Chromosome', columnType='STRING', maximumSize=20),\n Column(name='Start', columnType='INTEGER'),\n Column(name='End', columnType='INTEGER'),\n Column(name='Strand', columnType='STRING', enumValues=['+', '-'], maximumSize=1),\n Column(name='TranscriptionFactor', columnType='BOOLEAN')]\n\nschema = Schema(name='My Favorite Genes', columns=cols, parent=project)\n
"},{"location":"tutorials/tables/#storing-the-table-in-synapse","title":"Storing the table in Synapse","text":"table = Table(schema, \"/path/to/genes.csv\")\ntable = syn.store(table)\n
The Table
function takes two arguments, a schema object and data in some form, which can be:
RowSet
objectWith a bit of luck, we now have a table populated with data. Let's try to query:
results = syn.tableQuery(\"select * from %s where Chromosome='1' and Start < 41000 and End > 20000\"\n % table.schema.id)\nfor row in results:\n print(row)\n
"},{"location":"tutorials/tables/#using-pandas-to-accomplish-setup-and-querying","title":"Using Pandas to accomplish setup and querying","text":"Pandas is a popular library for working with tabular data. If you have Pandas installed, the goal is that Synapse Tables will play nice with it.
Create a Synapse Table from a DataFrame:
import pandas as pd\n\ndf = pd.read_csv(\"/path/to/genes.csv\", index_col=False)\ntable = build_table('My Favorite Genes', project, df)\ntable = syn.store(table)\n
build_table
uses pandas DataFrame dtype to set the Table Schema
. To create a table with a custom Schema
, first create the Schema
:
schema = Schema(name='My Favorite Genes', columns=as_table_columns(df), parent=project)\ntable = syn.store(Table(schema, df))\n
Get query results as a DataFrame:
results = syn.tableQuery(\"select * from %s where Chromosome='2'\" % table.schema.id)\ndf = results.asDataFrame()\n
"},{"location":"tutorials/tables/#changing-data","title":"Changing Data","text":"Once the schema is settled, changes come in two flavors: appending new rows and updating existing ones.
Appending new rows is fairly straightforward. To continue the previous example, we might add some new genes from another file:
table = syn.store(Table(table.schema.id, \"/path/to/more_genes.csv\"))\n
To quickly add a few rows, use a list of row data:
new_rows = [[\"Qux1\", \"4\", 201001, 202001, \"+\", False],\n [\"Qux2\", \"4\", 203001, 204001, \"+\", False]]\ntable = syn.store(Table(schema, new_rows))\n
Updating rows requires an etag, which identifies the most recent change set plus row IDs and version numbers for each row to be modified. We get those by querying before updating. Minimizing changesets to contain only rows that actually change will make processing faster.
For example, let's update the names of some of our favorite genes:
results = syn.tableQuery(\"select * from %s where Chromosome='1'\" % table.schema.id)\ndf = results.asDataFrame()\ndf['Name'] = ['rzing', 'zing1', 'zing2', 'zing3']\n
Note that we're propagating the etag from the query results. Without it, we'd get an error saying something about an \"Invalid etag\":
table = syn.store(Table(schema, df, etag=results.etag))\n
The etag is used by the server to prevent concurrent users from making conflicting changes, a technique called optimistic concurrency. In case of a conflict, your update may be rejected. You then have to do another query and try your update again.
"},{"location":"tutorials/tables/#changing-table-structure","title":"Changing Table Structure","text":"Adding columns can be done using the methods Schema.addColumn
or addColumns
on the Schema
object:
schema = syn.get(\"syn000000\")\nbday_column = syn.store(Column(name='birthday', columnType='DATE'))\nschema.addColumn(bday_column)\nschema = syn.store(schema)\n
Renaming or otherwise modifying a column involves removing the column and adding a new column:
cols = syn.getTableColumns(schema)\nfor col in cols:\n if col.name == \"birthday\":\n schema.removeColumn(col)\nbday_column2 = syn.store(Column(name='birthday2', columnType='DATE'))\nschema.addColumn(bday_column2)\nschema = syn.store(schema)\n
"},{"location":"tutorials/tables/#table-attached-files","title":"Table attached files","text":"Synapse tables support a special column type called 'File' which contain a file handle, an identifier of a file stored in Synapse. Here's an example of how to upload files into Synapse, associate them with a table and read them back later:
# your synapse project\nimport tempfile\nproject = syn.get(...)\n\n# Create temporary files to store\ntemp = tempfile.NamedTemporaryFile()\nwith open(temp.name, \"w+\") as temp_d:\n temp_d.write(\"this is a test\")\n\ntemp2 = tempfile.NamedTemporaryFile()\nwith open(temp2.name, \"w+\") as temp_d:\n temp_d.write(\"this is a test 2\")\n\n# store the table's schema\ncols = [\n Column(name='artist', columnType='STRING', maximumSize=50),\n Column(name='album', columnType='STRING', maximumSize=50),\n Column(name='year', columnType='INTEGER'),\n Column(name='catalog', columnType='STRING', maximumSize=50),\n Column(name='cover', columnType='FILEHANDLEID')]\nschema = syn.store(Schema(name='Jazz Albums', columns=cols, parent=project))\n\n# the actual data\ndata = [[\"John Coltrane\", \"Blue Train\", 1957, \"BLP 1577\", temp.name],\n [\"Sonny Rollins\", \"Vol. 2\", 1957, \"BLP 1558\", temp.name],\n [\"Sonny Rollins\", \"Newk's Time\", 1958, \"BLP 4001\", temp2.name],\n [\"Kenny Burrel\", \"Kenny Burrel\", 1956, \"BLP 1543\", temp2.name]]\n\n# upload album covers\nfor row in data:\n file_handle = syn.uploadFileHandle(row[4], parent=project)\n row[4] = file_handle['id']\n\n# store the table data\nrow_reference_set = syn.store(RowSet(schema=schema, rows=[Row(r) for r in data]))\n\n# Later, we'll want to query the table and download our album covers\nresults = syn.tableQuery(f\"select artist, album, cover from {schema.id} where artist = 'Sonny Rollins'\")\ntest_files = syn.downloadTableColumns(results, ['cover'])\n
"},{"location":"tutorials/tables/#deleting-rows","title":"Deleting rows","text":"Query for the rows you want to delete and call syn.delete on the results:
results = syn.tableQuery(\"select * from %s where Chromosome='2'\" % table.schema.id)\na = syn.delete(results)\n
"},{"location":"tutorials/tables/#deleting-the-whole-table","title":"Deleting the whole table","text":"Deleting the schema deletes the whole table and all rows:
syn.delete(schema)\n
"},{"location":"tutorials/tables/#queries","title":"Queries","text":"The query language is quite similar to SQL select statements, except that joins are not supported. The documentation for the Synapse API has lots of query examples.
See:
The dependencies on pandas and pysftp are optional. The Synapse synapseclient.table
feature integrates with Pandas. Support for sftp is required for users of SFTP file storage. Both require native libraries to be compiled or installed separately from prebuilt binaries.
Source code and development versions are available on Github. Installing from source:
-git clone git://github.com/Sage-Bionetworks/synapsePythonClient.git
+git clone https://github.com/Sage-Bionetworks/synapsePythonClient.git
cd synapsePythonClient
You can stay on the master branch to get the latest stable release or check out the develop branch or a tagged revision: