Synchronization server
This server is scheduled at the start of Pype. It periodically checks the Avalon DB for 'representation' records that contain, in their `files.sites`, a record with the name 'gdrive' (or any other site name from `gdrive.json`) but without the field `created_dt`.
A missing `created_dt` denotes that the representation should be synced to GDrive. Records like these are created by the IntegrateNew process based on configuration. Leave `config.json.remote_site` empty to disable synchronization entirely.
One provider can have multiple sites. (The GDrive implementation is 'a provider'; a target folder on it is 'a site'.)
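The check described above can be sketched in Python. This is a minimal illustration, not the server's actual implementation: the document structure mirrors the description (`files.sites` records with a `name` and an optional `created_dt`), and the helper name `needs_sync` is made up for this example.

```python
def needs_sync(representation, remote_site):
    """Return True if any file of this representation still needs to be
    synced to `remote_site` (site record present, 'created_dt' missing)."""
    for file_info in representation.get("files", []):
        for site in file_info.get("sites", []):
            if site.get("name") == remote_site and "created_dt" not in site:
                return True
    return False


# Example representation skeleton, shaped as described above
repr_doc = {
    "type": "representation",
    "files": [
        {"sites": [
            {"name": "studio", "created_dt": "2023-01-01T00:00:00"},
            {"name": "gdrive"},  # no 'created_dt' -> waiting for sync
        ]}
    ],
}
```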
Quick HOWTOs:
I want to start syncing my newly published files:
- Check that the Sync server is enabled globally in `pype/settings/defaults/system_settings/modules.json`.
- Get credentials for a service account and share the target folder on GDrive with it.
- Set the path to the stored credentials file in `pype/settings/defaults/project_settings/global.json.credentials_url`.
- Set the name of the site, the root folder and the provider (`gdrive` in the case of Google Drive) in `pype/settings/defaults/project_settings/global.json.sites`.
- Update `pype/settings/defaults/project_settings/global.json.remote_site` to the name of the site you set in the previous step.
- Check that the project setting is enabled (in this `global.json` file).
- Start Pype and publish.
My published file is not syncing:
- Check that the representation record contains, for all `files.sites`, a skeleton in the format `{name: "MY_CONFIGURED_REMOTE_SITE"}`.
- Check whether that record already has `created_dt` filled in. That would denote that the file was synced, but someone might have removed it on the remote site afterwards.
- If that record contains the field `error`, check that the `tries` field hasn't reached the threshold in `config.json.retry_cnt`. If it has, fix the problem mentioned in the `error` field and delete the `tries` field.
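The retry rule from the last point can be sketched as follows. This is an illustration of the described behavior, not the server's actual code; the helper name `should_retry` is invented for the example, and the default threshold of 3 comes from the `retry_cnt` example value later in this document.

```python
def should_retry(site_record, retry_cnt=3):
    """Decide whether a failed file should be retried.

    Mirrors the troubleshooting rule above: a file with an 'error' is
    retried only while its 'tries' counter stays below the threshold
    from config.json.retry_cnt.
    """
    if "error" not in site_record:
        return False  # nothing failed; the regular sync loop handles it
    return site_record.get("tries", 0) < retry_cnt


failed = {"name": "gdrive", "error": "quota exceeded", "tries": 3}
assert not should_retry(failed)  # threshold reached, needs manual intervention
del failed["tries"]              # operator fixes the problem, deletes 'tries'
assert should_retry(failed)      # syncing resumes on the next loop
```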
I want to sync my already published files:
- Configure your Pype for syncing (see the first section of the HOWTOs).
- Manually add the skeleton `{name: "MY_CONFIGURED_REMOTE_SITE"}` to all `representation.files.sites`:

```
db.getCollection('MY_PROJECT').update(
    {type: "representation"},
    {$set: {"files.$[].sites.MY_CONFIGURED_REMOTE_SITE": {}}},
    true,
    true
)
```
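The same bulk update can also be issued from Python. The sketch below only builds the query and update documents equivalent to the mongo shell command above; the helper name and the collection/database names in the commented usage are illustrative, and running it requires a reachable MongoDB.

```python
def make_site_skeleton_update(site_name):
    """Build the (query, update) pair matching the mongo shell command
    above: add an empty site record to every file of every
    representation document."""
    query = {"type": "representation"}
    update = {"$set": {"files.$[].sites.{}".format(site_name): {}}}
    return query, update


# Usage with pymongo (requires a running MongoDB; names are examples):
# from pymongo import MongoClient
# coll = MongoClient()["avalon"]["MY_PROJECT"]
# query, update = make_site_skeleton_update("MY_CONFIGURED_REMOTE_SITE")
# coll.update_many(query, update, upsert=True)
```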
I want to create a new custom provider:
- Take `providers\abstract_provider.py` as a base class.
- Create a provider class in `providers` with a name matching the provider (eg. `gdrive.py` for the gdrive provider).
- Upload a provider icon in PNG format, 24x24, into `providers\resources`; its name must follow the name of the provider (eg. `gdrive.png` for the gdrive provider).
- Register the new provider in `providers\lib.py`; test how many files can be manipulated at the same time and check the provider's API for limits.
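The steps above can be sketched as a subclass skeleton. Note this is purely illustrative: the real interface lives in `providers\abstract_provider.py`, and the base class and method names below (`upload_file`, `download_file`, the `dropbox` provider) are assumptions made up for this example, not the module's actual API.

```python
# Illustrative stand-in for the real base class in
# providers\abstract_provider.py; actual method names may differ.
class AbstractProvider:
    def upload_file(self, source_path, target_path):
        raise NotImplementedError

    def download_file(self, source_path, local_path):
        raise NotImplementedError


class DropboxHandler(AbstractProvider):
    """Hypothetical 'dropbox.py' provider placed in the providers folder.

    A matching 24x24 'dropbox.png' would go into providers\resources,
    and the class would be registered in providers\lib.py.
    """

    def upload_file(self, source_path, target_path):
        # a real implementation would call the provider's API here
        return target_path

    def download_file(self, source_path, local_path):
        return local_path
```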
Needed configuration:
`pype/settings/defaults/project_settings/global.json.sync_server`:
- `"local_id": "local_0"` -- identifier of the user's Pype
- `"retry_cnt": 3` -- how many times to try to sync a file in case of error
- `"loop_delay": 60` -- how many seconds between sync loops
- `"publish_site": "studio"` -- which site the user currently uses; 'studio' by default, could be the same as 'local_id' if the user is working from home without connection to the studio infrastructure
- `"remote_site": "gdrive"` -- key of the site to synchronize to. Must match a site configured lower in this file. Used in IntegrateNew to prepare the skeleton for syncing in the representation record. Leave empty if no syncing is wanted.

This is a general configuration; 'local_id', 'publish_site' and 'remote_site' will be set and changed by some GUI in the future.
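Assembled, the general section described above would look like this sketch (values taken from the descriptions; adjust them to your setup):

```
{
    "local_id": "local_0",
    "retry_cnt": 3,
    "loop_delay": 60,
    "publish_site": "studio",
    "remote_site": "gdrive"
}
```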
`pype/settings/defaults/project_settings/global.json.sync_server.sites`:

```
{
    "provider": "gdrive",  -- type of provider, must be registered in 'sync_server\providers\lib.py'
    "credentials_url": "/my_secret_folder/credentials.json",  -- path to credentials for service account
    "root": {  -- "root": "/My Drive" in a simple scenario; config like this for multiroot projects
        "root_one": "/My Drive/work_folder",
        "root_two": "/My Drive/publish_folder"
    }
}
```