diff --git a/.github/ISSUE_TEMPLATE/bug_report.yml b/.github/ISSUE_TEMPLATE/bug_report.yml
index 1280e6a6e5..a2ca0d5e48 100644
--- a/.github/ISSUE_TEMPLATE/bug_report.yml
+++ b/.github/ISSUE_TEMPLATE/bug_report.yml
@@ -35,6 +35,10 @@ body:
label: Version
description: What version are you running? Look to OpenPype Tray
options:
+ - 3.16.0
+ - 3.16.0-nightly.2
+ - 3.16.0-nightly.1
+ - 3.15.12
- 3.15.12-nightly.4
- 3.15.12-nightly.3
- 3.15.12-nightly.2
@@ -135,6 +139,7 @@ body:
- 3.14.5-nightly.1
- 3.14.4
- 3.14.4-nightly.4
+
validations:
required: true
- type: dropdown
diff --git a/.github/workflows/miletone_release_trigger.yml b/.github/workflows/miletone_release_trigger.yml
index 4a031be7f9..d755f7eb9f 100644
--- a/.github/workflows/miletone_release_trigger.yml
+++ b/.github/workflows/miletone_release_trigger.yml
@@ -5,12 +5,6 @@ on:
inputs:
milestone:
required: true
- release-type:
- type: choice
- description: What release should be created
- options:
- - release
- - pre-release
milestone:
types: closed
diff --git a/.gitignore b/.gitignore
index 50f52f65a3..e5019a4e74 100644
--- a/.gitignore
+++ b/.gitignore
@@ -37,6 +37,7 @@ Temporary Items
###########
/build
/dist/
+/server_addon/package/*
/vendor/bin/*
/vendor/python/*
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 095e0d96e4..33dbdb14fa 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,6 +1,1457 @@
# Changelog
+## [3.16.0](https://github.com/ynput/OpenPype/tree/3.16.0)
+
+
+[Full Changelog](https://github.com/ynput/OpenPype/compare/...3.16.0)
+
+### **🆕 New features**
+
+
+
+General: Reduce usage of legacy io #4723
+
+Replace usages of `legacy_io` with getter methods or reuse already available information. Create plugins using CreateContext use the context from the CreateContext object. Loaders use getter functions from context tools. Publish plugins use information from instance.data or context.data. In some cases pieces of code were refactored a little, e.g. the fps getter in Maya.
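+
+A minimal sketch of the new pattern, assuming the context getters from `openpype.pipeline.context_tools` that this changelog references elsewhere (see #5225):
+
+```python
+# Before (legacy): context came from the mongo-backed session object, e.g.
+#   project_name = legacy_io.Session["AVALON_PROJECT"]
+
+# After: a dedicated getter, no legacy_io needed
+from openpype.pipeline.context_tools import get_current_project_name
+
+project_name = get_current_project_name()
+```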
+
+
+___
+
+
+
+
+
+Documentation: API docs reborn - yet again #4419
+
+## Feature
+
+Add functional base for API Documentation using Sphinx and AutoAPI.
+
+After unsuccessful #2512, #834 and #210 this is yet another try. But this time without the ambition to solve the whole issue. This just makes the Sphinx script work and nothing else. Any changes and improvements to the API docs should be made in subsequent PRs.
+
+## How to use it
+
+You can run:
+
+```sh
+cd .\docs
+make.bat html
+```
+
+or
+
+```sh
+cd ./docs
+make html
+```
+
+This will go over our code and generate **.rst** files in `/docs/source/autoapi` and from those it will generate full html documentation in `/docs/build/html`.
+
+During the build you'll see tons of red errors that are pointing to our issues:
+
+1) **Wrong imports**
+   Invalid imports are usually wrong relative imports (too deep) or circular imports.
+
+2) **Invalid doc-strings**
+   Doc-strings need to follow a certain syntax to be processed into documentation - this can be checked by running
+   `pydocstyle`, which is already included with OpenPype
+3) **Invalid markdown/rst files**
+   md/rst files can be included inside rst files using the `.. include::` directive. But they have to be properly formatted.
+
+
+## Editing rst templates
+
+Everything starts with `/docs/source/index.rst` - this file should be properly edited. Right now it just includes `readme.rst`, which in turn includes and parses the main `README.md`. This is the entry point to API documentation. All templates generated by AutoAPI are in `/docs/source/autoapi`. They should eventually be committed to the repository and edited too.
+
+## Steps for enhancing API documentation
+
+1) Run `/docs/make.bat html`
+2) Read the red errors/warnings - fix them in the code
+3) Run `/docs/make.bat html` again until there are no red lines
+4) Edit the rst files and add some meaningful content there
+
+> **Note**
+> This can (should) be merged as is without doc-string fixes in the code or changes in templates. All additional improvements on API documentation should be made in new PRs.
+
+> **Warning**
+> You need to add new dependencies to use it. Run `create_venv`.
+
+Connected to #2490
+___
+
+
+
+
+
+Global: custom location for OP local versions #4673
+
+This provides a configurable location to unzip OpenPype version zips. By default it was hardcoded to the artist's app data folder, which might be problematic/slow with roaming profiles. The location must be accessible with write permissions by the user running OP Tray (so `Program Files` might be problematic).
+
+
+___
+
+
+
+
+
+AYON: Update settings conversion #4837
+
+Updated the conversion script of AYON settings to v3 settings. The PR is related to changes in the addons repository https://github.com/ynput/ayon-addons/pull/6 . Changed how the conversion happens -> the conversion output does not start with OpenPype defaults but as an empty dictionary.
+
+
+___
+
+
+
+
+
+AYON: Implement integrate links publish plugin #4842
+
+Implemented entity links get/create functions. Added a new integrator which replaces the v3 integrator for links.
+
+
+___
+
+
+
+
+
+General: Version attributes integration #4991
+
+Implemented a unified integrate plugin to update version attributes after all integrations for AYON. The goal is to be able to update attribute values on a version in a unified way when all addon integrators are done, so e.g. ftrack can add the ftrack id to the matching version on the AYON server etc. The values can be stored under the `"versionAttributes"` key.
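+
+A hypothetical sketch of how an addon could contribute values, based on the `"versionAttributes"` key named above (the plugin name and attribute value are illustrative):
+
+```python
+import pyblish.api
+
+
+class CollectVersionAttributesExample(pyblish.api.InstancePlugin):
+    """Store values to be written by the unified version attributes integrator."""
+
+    order = pyblish.api.IntegratorOrder
+    label = "Collect Version Attributes (example)"
+
+    def process(self, instance):
+        # Values stored here are picked up after all addon integrators ran
+        attributes = instance.data.setdefault("versionAttributes", {})
+        attributes["ftrackId"] = "hypothetical-ftrack-id"
+```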
+
+
+___
+
+
+
+
+
+AYON: Staging versions can be used #4992
+
+Added ability to use staging versions in AYON mode.
+
+
+___
+
+
+
+
+
+AYON: Preparation for products #5038
+
+Prepared the AYON settings conversion script for `product` settings conversion.
+
+
+___
+
+
+
+
+
+Loader: Hide inactive versions in UI #5101
+
+Added support for the `active` argument to hide versions with active set to False in the Loader UI when in AYON mode.
+
+
+___
+
+
+
+
+
+General: CLI addon command #5109
+
+Added `addon` alias for `module` in OpenPype cli commands.
+
+
+___
+
+
+
+
+
+AYON: OpenPype as server addon #5199
+
+The OpenPype repository can be converted to an AYON addon for distribution. The addon has defined dependencies that are required to use it and are not in the base ayon-launcher (desktop application).
+
+
+___
+
+
+
+
+
+General: Runtime dependencies #5206
+
+Defined runtime dependencies in pyproject.toml. Moved the Python OCIO and OTIO modules there.
+
+
+___
+
+
+
+
+
+AYON: Bundle distribution #5209
+
+Since AYON server 0.3.0, addon versions are defined by bundles, which affects how addons, dependency packages and installers are handled. The only source of truth about which version of anything should be used is the server bundle.
+
+
+___
+
+
+
+
+
+Feature/blender handle q application #5264
+
+This changes the way the QApplication is run for Blender. It calls in the singleton (QApplication) during register. This is made so that other Qt applications and addons are able to run in Blender. In the previous implementation, if a QApplication was already running, all functionality of OpenPype became unavailable.
+
+
+___
+
+
+
+### **🚀 Enhancements**
+
+
+
+General: Connect to AYON server (base) #3924
+
+Initial implementation of being able to use an AYON server in the current OpenPype client. Added the ability to connect to an AYON server and use base queries.
+
+AYON mode has its own executable (and start script). To start in AYON mode just replace `start.py` with `ayon_start.py` (a tray start script was added to tools). Added the constant `AYON_SERVER_ENABLED` to `openpype/__init__.py` to know if AYON mode is enabled. In that case Mongo is not used at all and any attempts will cause crashes. I had to modify the `~/openpype/client` content to be able to do this switch. The Mongo implementation was moved to a `mongo` subfolder, with "star imports" used in the files from where the current imports are used. The logic of any tool or query in code was not changed at all. Since the functions were based on mongo queries, they don't use the full potential of the AYON server's abilities. ATM the implementation has a login UI, distribution of files from the server and replacement of mongo queries. The `ayon_api` module is used for queries; it is in live development, so versions may change from day to day.
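+
+A small usage sketch of the `ayon_api` queries described above (a sketch only - the module is in live development, so calls may shift between versions):
+
+```python
+import ayon_api
+
+# The global connection is configured during bootstrap (see ayon_start.py);
+# afterwards plain module-level calls can be used for queries.
+print(ayon_api.get_base_url())
+for project in ayon_api.get_projects():
+    print(project["name"])
+```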
+
+
+___
+
+
+
+
+
+Enhancement kitsu note with exceptions #4537
+
+Adding a setting to choose some exceptions to IntegrateKitsuNote task status changes.
+
+
+___
+
+
+
+
+
+General: Environment variable for default OCIO configs #4670
+
+Define an environment variable which leads to the root of the builtin OCIO configs, to be able to change the root without changing settings. The path in settings used `"{OPENPYPE_ROOT}/vendor/bin/ocioconfig/OpenColorIOConfig"`, which disallowed moving the root somewhere else. That will be needed in AYON, where configs won't be part of the desktop application but downloaded from the server.
+
+
+___
+
+
+
+
+
+AYON: Editorial hierarchy creation #4699
+
+Implemented an extract hierarchy to AYON plugin which creates entities in AYON using the ayon api.
+
+
+___
+
+
+
+
+
+AYON: Vendorize ayon api #4753
+
+Vendorized the ayon api into the OpenPype vendor directory. The reason is that `ayon-python-api` is in live development and will fix/add features often in the next few weeks/months, and because a dependency update requires a new release -> new build, we want to avoid the need of doing that as it would affect OpenPype development.
+
+
+___
+
+
+
+
+
+General: Update PySide 6 for MacOs #4764
+
+The new version of PySide6 does not have issues with the settings UI. It still breaks UI stylesheets so it is not changed for other platforms, but it is an enhancement from the previous state.
+
+
+___
+
+
+
+
+
+General: Removed unused cli commands #4902
+
+Removed the `texturecopy` and `launch` commands from the CLI.
+
+
+___
+
+
+
+
+
+AYON: Linux & MacOS launch script #4970
+
+Added shell script to launch tray in AYON mode.
+
+
+___
+
+
+
+
+
+General: Qt scale enhancement #5059
+
+Set the ~~'QT_SCALE_FACTOR_ROUNDING_POLICY'~~ scale factor rounding policy of QApplication to `PassThrough` so the scaling can be a 'float' number and not just an 'int' (150% -> 1.5 scale).
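+
+For illustration, a minimal sketch of the Qt call involved (Qt 5.14+; it must run before the QApplication is instantiated):
+
+```python
+from qtpy import QtCore, QtWidgets
+
+QtWidgets.QApplication.setHighDpiScaleFactorRoundingPolicy(
+    QtCore.Qt.HighDpiScaleFactorRoundingPolicy.PassThrough
+)
+app = QtWidgets.QApplication([])
+```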
+
+
+___
+
+
+
+
+
+CI: WPS linting instead of Hound (rebase) 2 #5115
+
+Because Hound, currently used to lint the code on GH, ships with really old flake8 support, it fails miserably on any newer Python syntax. This PR adds the WPS linter to GitHub workflows so it can step in.
+
+
+___
+
+
+
+
+
+Max: OP parameters only displays what is attached to the container #5229
+
+The OP parameter in 3dsmax now only displays what is currently attached to the container when deleting, while you can still see items which are not yet added when adding to the container.
+
+
+___
+
+
+
+
+
+Testing: improving logging during testing #5271
+
+Unit testing logging was crashing on more than one nested layer of inherited loggers.
+
+
+___
+
+
+
+
+
+Nuke: removing deprecated settings in baking #5275
+
+Removing deprecated settings for baking with reformat. This option was only for a single reformat node and it has been substituted with multiple reposition nodes.
+
+
+___
+
+
+
+### **🐛 Bug fixes**
+
+
+
+AYON: General fixes and updates #4975
+
+A few smaller fixes related to the AYON connection. Some of the fixes were taken from this PR.
+
+
+___
+
+
+
+
+
+Start script: Change returncode on validate or list versions #4515
+
+Change exit code from `1` to `0` when versions are printed or when version is validated.
+
+Return code `1` indicates an error, but no error actually happened here.
+
+
+___
+
+
+
+
+
+AYON: Change login UI works #4754
+
+Fixed the change of login UI. The login-change UI did show up and the new login was successful, but after restart the previous login was used. This change fixes the issue.
+
+
+___
+
+
+
+
+
+AYON: General issues #4763
+
+The vendorized `ayon_api` from the PR broke OpenPype launch, because `ayon_api` was not available. Moved `ayon_api` from the AYON specific subfolder to the `common` python vendor in OpenPype, and removed login in the AYON start script (which was invalid anyway). Also fixed compatibility with PySide6 by using `qtpy` instead of `Qt` and changing code which is not PySide6 compatible.
+
+
+___
+
+
+
+
+
+AYON: Small fixes #4841
+
+Bugfixes and enhancements related to AYON logic. Define the `BUILTIN_OCIO_ROOT` environment variable so OCIO configs are working. Use constants from the ayon api instead of hardcoding them in the codebase. Change the process name from "openpype" to "ayon". Don't execute the login dialog when the application is not yet running but use the `open` method instead. Fixed missing modules settings which were not taken from OpenPype defaults. Updated the ayon api to `0.1.17`.
+
+
+___
+
+
+
+
+
+Bugfix - Update gazu to 0.9.3 #4845
+
+This updates Gazu to 0.9.3 to make sure Gazu works with Kitsu and Zou 0.16.x+
+
+
+___
+
+
+
+
+
+Igniter: fix error reports in silent mode #4909
+
+Some errors of silent mode commands in Igniter were suppressed and not visible, for example in the Deadline log.
+
+
+___
+
+
+
+
+
+General: Remove ayon api from poetry lock #4964
+
+Remove AYON python api from pyproject.toml and poetry.lock again.
+
+
+___
+
+
+
+
+
+Ftrack: Fix AYON settings conversion #4967
+
+Fix conversion of ftrack settings in AYON mode.
+
+
+___
+
+
+
+
+
+AYON: ISO date format conversion issues #4981
+
+The function `datetime.fromisoformat` was replaced with `arrow.get`.
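+
+A short sketch of why: `arrow.get` accepts ISO variants that `datetime.fromisoformat` (before Python 3.11) rejects:
+
+```python
+import arrow
+
+# fromisoformat in Python <= 3.10 cannot parse the trailing "Z"
+created_at = arrow.get("2023-07-20T10:30:00.123456Z").datetime
+```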
+
+
+___
+
+
+
+
+
+AYON: Missing files on representations #4989
+
+Fix integration of files into representation in server database.
+
+
+___
+
+
+
+
+
+General: Fix Python 2 vendor for arrow #4993
+
+Moved remaining dependencies for arrow from ftrack to python 2 vendor.
+
+
+___
+
+
+
+
+
+General: Fix new load plugins for next minor release #5000
+
+Fix access to the `fname` attribute which is not available on load plugins anymore.
+
+
+___
+
+
+
+
+
+General: Fix mongo secure connection #5031
+
+Fix `ssl` and `tls` keys checks in mongo uri query string.
+
+
+___
+
+
+
+
+
+AYON: Fix site sync settings #5069
+
+Fixed settings for AYON variant of sync server.
+
+
+___
+
+
+
+
+
+General: Replace deprecated keyword argument in PyMongo #5080
+
+Use argument `tlsCAFile` instead of `ssl_ca_certs` to avoid deprecation warnings.
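+
+A minimal sketch of the replacement (the URI is hypothetical):
+
+```python
+import certifi
+from pymongo import MongoClient
+
+# `ssl_ca_certs` is deprecated since PyMongo 3.9 (removed in 4.0);
+# `tlsCAFile` is the supported spelling.
+client = MongoClient("mongodb://localhost:27017", tlsCAFile=certifi.where())
+```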
+
+
+___
+
+
+
+
+
+Igniter: QApplication is created #5081
+
+The function `_get_qt_app` now actually creates a new `QApplication` if one was not created yet.
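+
+A sketch of the usual singleton pattern this describes (the actual Igniter code may differ in details):
+
+```python
+from qtpy import QtWidgets
+
+
+def _get_qt_app():
+    """Return the running QApplication, creating one when missing."""
+    app = QtWidgets.QApplication.instance()
+    if app is None:
+        app = QtWidgets.QApplication([])
+    return app
+```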
+
+
+___
+
+
+
+
+
+General: Lower unidecode version #5090
+
+Use an older version of the Unidecode module to support Python 2.
+
+
+___
+
+
+
+
+
+General: Lower cryptography to 39.0.0 #5099
+
+Lower cryptography to 39.0.0 to avoid breaking DCCs like Maya and Nuke.
+
+
+___
+
+
+
+
+
+AYON: Global environments key fix #5118
+
+It seems that when converting AYON settings to OP settings, the `environments` setting is put under the `environments` key in `general`; however, when populating the environment, the `environment` key gets picked up, which does not contain the environment variables from the `core/environments` setting.
+
+
+___
+
+
+
+
+
+Add collector to tray publisher for getting frame range data #5152
+
+Add a collector to the tray publisher to get frame range data. Users can choose to enable this collector if they need this in the publisher. Resolves #5136.
+
+
+___
+
+
+
+
+
+Unreal: get current project settings not using unreal project name #5170
+
+There was a bug where the Unreal project name was used to query project settings. But the Unreal project name can differ from the "real" one because of naming convention rules set by Unreal. This fixes it by asking for the current project settings.
+
+
+___
+
+
+
+
+
+Substance Painter: Fix Collect Texture Set Images unable to copy.deepcopy due to QMenu #5238
+
+Fix `copy.deepcopy` of `instance.data`.
+
+
+___
+
+
+
+
+
+Ayon: server returns different key #5251
+
+The package returned from the server has `filename` instead of `name`.
+
+
+___
+
+
+
+
+
+Substance Painter: Fix default color management settings #5259
+
+The default settings for color management for Substance Painter were invalid: they were set to override the global config by default but specified no valid config paths of their own - and thus errored that the paths were not correct. This sets the defaults correctly to match other hosts. _I quickly checked - this seems to be the only host with the wrong default settings._
+
+
+___
+
+
+
+
+
+Nuke: fixing container data if windows path in value #5267
+
+Windows paths in container data are reformatted. Previously it was reported that Nuke was raising a `utf8 0xc0` error if backward slashes were in data values.
+
+
+___
+
+
+
+
+
+Houdini: fix typo error in collect arnold rop #5281
+
+Fixing a typo in `collect_arnold_rop.py`. Reference: #5280.
+
+
+___
+
+
+
+
+
+Slack - enhanced logging and protection against failure #5287
+
+Covered issues found in production on a customer site. A SlackAPI exception doesn't need to have 'error'; covered an uncaught exception.
+
+
+___
+
+
+
+
+
+Maya: Removed unnecessary import of pyblish.cli #5292
+
+This import resulted in adding an additional logging handler, which led to duplication of logs in hosts with plugins containing the `is_in_tests` method. The import is unnecessary for testing functionality.
+
+
+___
+
+
+
+### **🔀 Refactored code**
+
+
+
+Loader: Remove `context` argument from Loader.__init__() #4602
+
+Remove the previously required `context` argument.
+
+
+___
+
+
+
+
+
+Global: Remove legacy integrator #4786
+
+Remove the legacy integrator.
+
+
+___
+
+
+
+### **📃 Documentation**
+
+
+
+Next Minor Release #5291
+
+
+___
+
+
+
+### **Merged pull requests**
+
+
+
+Maya: Refactor to new publisher #4388
+
+**Refactor Maya to use the new publisher with new creators.**
+
+
+- [x] Legacy instance can be converted in UI using `SubsetConvertorPlugin`
+- [x] Fix support for old style "render" and "vrayscene" instance to the new per layer format.
+- [x] Context data is stored with scene
+- [x] Workfile instance converted to AutoCreator
+- [x] Converted Creator classes
+- [x] Create animation
+- [x] Create ass
+- [x] Create assembly
+- [x] Create camera
+- [x] Create layout
+- [x] Create look
+- [x] Create mayascene
+- [x] Create model
+- [x] Create multiverse look
+- [x] Create multiverse usd
+- [x] Create multiverse usd comp
+- [x] Create multiverse usd over
+- [x] Create pointcache
+- [x] Create proxy abc
+- [x] Create redshift proxy
+- [x] Create render
+- [x] Create rendersetup
+- [x] Create review
+- [x] Create rig
+- [x] Create setdress
+- [x] Create unreal skeletalmesh
+- [x] Create unreal staticmesh
+- [x] Create vrayproxy
+- [x] Create vrayscene
+- [x] Create xgen
+- [x] Create yeti cache
+- [x] Create yeti rig
+- [ ] Tested new Creator publishes
+- [x] Publish animation
+- [x] Publish ass
+- [x] Publish assembly
+- [x] Publish camera
+- [x] Publish layout
+- [x] Publish look
+- [x] Publish mayascene
+- [x] Publish model
+- [ ] Publish multiverse look
+- [ ] Publish multiverse usd
+- [ ] Publish multiverse usd comp
+- [ ] Publish multiverse usd over
+- [x] Publish pointcache
+- [x] Publish proxy abc
+- [x] Publish redshift proxy
+- [x] Publish render
+- [x] Publish rendersetup
+- [x] Publish review
+- [x] Publish rig
+- [x] Publish setdress
+- [x] Publish unreal skeletalmesh
+- [x] Publish unreal staticmesh
+- [x] Publish vrayproxy
+- [x] Publish vrayscene
+- [x] Publish xgen
+- [x] Publish yeti cache
+- [x] Publish yeti rig
+- [x] Publish workfile
+- [x] Rig loader correctly generates a new style animation creator instance
+- [ ] Validations / Error messages for common validation failures look nice and usable as a report.
+- [ ] Make Create Animation hidden to the user (should not create manually?)
+- [x] Correctly detect difference between **'creator_attributes'** and **'instance_data'** since both are "flattened" to the top node.
+
+
+___
+
+
+
+
+
+Start script: Fix possible issues with destination drive path #4478
+
+Drive paths for Windows now fix a possibly missing slash at the end of the destination path.
+
+The Windows `subst` command requires the destination path to end with a slash if it's a drive (it should be `G:\` not `G:`).
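+
+A hypothetical helper showing the normalization (the name and placement are illustrative):
+
+```python
+def ensure_drive_trailing_slash(destination):
+    # `subst` needs "G:\\" rather than "G:" as a drive destination
+    if len(destination) == 2 and destination.endswith(":"):
+        destination += "\\"
+    return destination
+```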
+
+
+___
+
+
+
+
+
+Global: Move PyOpenColorIO to vendor/python #4946
+
+So that DCCs don't conflict with their own.
+
+See https://github.com/ynput/OpenPype/pull/4267#issuecomment-1537153263 for the issue with Gaffer.
+
+I'm not sure if this is the correct approach, but I assume PySide/Shiboken is under `vendor/python` for this reason as well...
+___
+
+
+
+
+
+RuntimeError with Click on deadline publish #5065
+
+I changed Click to version 8.0 instead of 7.1.2 to solve this error:
+```
+2023-05-30 16:16:51: 0: STDOUT: Traceback (most recent call last):
+2023-05-30 16:16:51: 0: STDOUT: File "start.py", line 1126, in boot
+2023-05-30 16:16:51: 0: STDOUT: File "/prod/softprod/apps/openpype/LINUX/3.15/dependencies/click/core.py", line 829, in __call__
+2023-05-30 16:16:51: 0: STDOUT: return self.main(*args, **kwargs)
+2023-05-30 16:16:51: 0: STDOUT: File "/prod/softprod/apps/openpype/LINUX/3.15/dependencies/click/core.py", line 760, in main
+2023-05-30 16:16:51: 0: STDOUT: _verify_python3_env()
+2023-05-30 16:16:51: 0: STDOUT: File "/prod/softprod/apps/openpype/LINUX/3.15/dependencies/click/_unicodefun.py", line 126, in _verify_python3_env
+2023-05-30 16:16:51: 0: STDOUT: raise RuntimeError(
+2023-05-30 16:16:51: 0: STDOUT: RuntimeError: Click will abort further execution because Python 3 was configured to use ASCII as encoding for the environment. Consult https://click.palletsprojects.com/python3/ for mitigation steps.
+```
+
+
+___
+
+
+
+
+
+
+## [3.15.12](https://github.com/ynput/OpenPype/tree/3.15.12)
+
+
+[Full Changelog](https://github.com/ynput/OpenPype/compare/3.15.11...3.15.12)
+
+### **🆕 New features**
+
+
+
+Tray Publisher: User can set colorspace per instance explicitly #4901
+
+With this feature a user can set/override the colorspace for the representations of an instance explicitly instead of relying on the File Rules from project settings or alike. This way you can ingest any file and explicitly say "this file is colorspace X".
+
+
+___
+
+
+
+
+
+Review Family in Max #5001
+
+Review feature by creating a preview animation in 3dsmax. (The code is still being cleaned up, so there are going to be some updates until it is ready for review.)
+
+
+___
+
+
+
+
+
+AfterEffects: support for workfile template builder #5163
+
+This PR adds templated workfile builder functionality. It allows preparing an AE workfile with placeholders for automatically loading a particular representation of a particular subset of a particular asset from the context where the workfile is opened. Selection from multiple prepared workfiles is provided with the usage of templates; a specific type of task could use a particular workfile template etc. Artists can then build the workfile from a template when opening a new workfile.
+
+
+___
+
+
+
+
+
+CreatePlugin: Get next version helper #5242
+
+Implemented helper functions to get next available versions for create instances.
+
+
+___
+
+
+
+### **🚀 Enhancements**
+
+
+
+Maya: Improve Templates #4854
+
+Use a library method for fetching the reference node and support a parent in the hierarchy.
+
+
+___
+
+
+
+
+
+Bug: Maya - xgen sidecar files aren't moved when saving workfile as a new asset workfile changing context - OP-6222 #5215
+
+This PR manages the Xgen files when switching context in the Workfiles app.
+
+
+___
+
+
+
+
+
+node references to check for duplicates in Max #5192
+
+No duplicates for node references in Max when users try to select nodes before publishing.
+
+
+___
+
+
+
+
+
+Tweak profiles logging to debug level #5194
+
+Tweak profiles logging to debug level since they aren't artist facing logs.
+
+
+___
+
+
+
+
+
+Enhancement: Reduce more visual clutter for artists in new publisher reports #5208
+
+Got this from one of our artists' reports - figured some of these logs were definitely not for the artist, reduced those logs to debug level.
+
+
+___
+
+
+
+
+
+Cosmetics: Tweak pyblish repair actions (icon, logs, docstring) #5213
+
+- Add icon to RepairContextAction
+- logs to debug level
+- also add attempt repair for RepairAction for consistency
+- fix RepairContextAction docstring to mention correct argument name
+
+#### Additional info
+
+We should not forget to remove this ["deprecated" actions.py file](https://github.com/ynput/OpenPype/blob/3501d0d23a78fbaef106da2fffe946cb49bef855/openpype/action.py) in 3.16 (next-minor)
+
+## Testing notes:
+
+1. Run some fabulous repairs!
+
+___
+
+
+
+
+
+Maya: fix save file prompt on launch last workfile with color management enabled + restructure `set_colorspace` #5225
+
+- Only set `configFilePath` when OCIO env var is not set since it doesn't do anything if OCIO var is set anyway.
+- Set the Maya 2022+ default OCIO path using the resources path instead of "" to avoid Maya Save File on new file after launch
+- **Bugfix: This is what fixes the Save prompt on open last workfile feature with Global color management enabled**
+- Move all code related to applying the maya settings together after querying the settings
+- Swap around the `if use_workfile_settings` since the check was reversed
+- Use `get_current_project_name()` instead of environment vars
+
+
+___
+
+
+
+
+
+Enhancement: More descriptive error messages for Loaders #5227
+
+Tweak raised errors and error messages for loader errors.
+
+
+___
+
+
+
+
+
+Houdini: add select invalid action for ValidateSopOutputNode #5231
+
+This PR adds a `SelectROPAction` action to `houdini\api\action.py` and it's used in `Validate Output Node`. `SelectROPAction` is used to select the ROPs associated with the errored instances.
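+
+A rough sketch of what such an action does (assuming Houdini's `hou` module; the real implementation lives in `houdini\api\action.py`):
+
+```python
+import hou  # available only inside Houdini
+
+
+def select_rops(rop_paths):
+    """Select the ROP nodes associated with errored instances."""
+    for index, path in enumerate(rop_paths):
+        node = hou.node(path)
+        if node is not None:
+            # Clear the previous selection only for the first found node
+            node.setSelected(True, clear_all_selected=(index == 0))
+```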
+
+
+___
+
+
+
+
+
+Remove new lines from the delivery template string #5235
+
+If the delivery template has a newline symbol at the end, say it was copied from a text editor, the delivery process will fail with an `OSError` due to an incorrect destination path. To avoid that I added `rstrip()` to the `delivery_path` processing.
+
+
+___
+
+
+
+
+
+Houdini: better selection on pointcache creation #5250
+
+Houdini allows an `ObjNode` path as `sop_path` in the `ROP`, unlike OP/AYON which require `sop_path` to be set to a SOP node path explicitly. In this code, better selection is used to filter out selections that are invalid from the OP/AYON point of view. Valid selections are:
+- a `SopNode` that has a parent of type `geo` or `subnet`
+- an `ObjNode` of type `geo` that has
+  - a `SopNode` of type `output`
+  - a `SopNode` with render flag `on` (if there is no `SopNode` of type `output`)
+
+This effectively filters out:
+- empty `ObjNode`(s)
+- `ObjNode`(s) of other types like `cam` and `dopnet`
+- `SopNode`(s) whose parents are of other types like `cam` and `sop solver`
+
+
+___
+
+
+
+
+
+Update scene inventory even if any errors occurred during update #5252
+
+When selecting many items in the scene inventory to update versions and one of the items errors out, the updating stops. However, before this PR the scene inventory would also NOT refresh, making you think it did nothing. Also implemented as a method to allow some code deduplication.
+
+
+___
+
+
+
+### **🐛 Bug fixes**
+
+
+
+Maya: Convert frame values to integers #5188
+
+Convert frame values to integers.
+
+
+___
+
+
+
+
+
+Maya: fix the register_event_callback correctly collecting workfile save after #5214
+
+Fixing the bug of `register_event_callback` not being able to collect the "workfile_save_after" action for the lock file action.
+
+
+___
+
+
+
+
+
+Maya: aligning default settings to distributed aces 1.2 config #5233
+
+Maya colorspace settings defaults are set so they align with our distributed ACES 1.2 config file set in the global colorspace configs.
+
+
+___
+
+
+
+
+
+RepairAction and SelectInvalidAction filter instances failed on the exact plugin #5240
+
+RepairAction and SelectInvalidAction now actually filter to instances that failed on the exact plugin - not on "any failure".
+
+
+___
+
+
+
+
+
+Maya: Bugfix look update nodes by id with non-unique shape names (query with `fullPath`) #5257
+
+Fixes a bug where updating attributes on nodes with an assigned shader failed if a shape name existed more than once in the scene, due to the `cmds.listRelatives` call not being done with the `fullPath=True` flag. Original error:
+```python
+# Traceback (most recent call last):
+# File "E:\openpype\OpenPype\openpype\tools\sceneinventory\view.py", line 264, in
+# lambda: self._show_version_dialog(items))
+# File "E:\openpype\OpenPype\openpype\tools\sceneinventory\view.py", line 722, in _show_version_dialog
+# self._update_containers(items, version)
+# File "E:\openpype\OpenPype\openpype\tools\sceneinventory\view.py", line 849, in _update_containers
+# update_container(item, item_version)
+# File "E:\openpype\OpenPype\openpype\pipeline\load\utils.py", line 502, in update_container
+# return loader.update(container, new_representation)
+# File "E:\openpype\OpenPype\openpype\hosts\maya\plugins\load\load_look.py", line 119, in update
+# nodes_by_id[lib.get_id(n)].append(n)
+# File "E:\openpype\OpenPype\openpype\hosts\maya\api\lib.py", line 1420, in get_id
+# sel.add(node)
+```
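+
+A minimal sketch of the corrected call (the node name is hypothetical):
+
+```python
+from maya import cmds
+
+node = "pSphere1"  # hypothetical transform
+# Without fullPath=True, duplicate shape names come back as ambiguous
+# short names, which is what broke the look update above.
+shapes = cmds.listRelatives(node, shapes=True, fullPath=True) or []
+```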
+
+
+___
+
+
+
+
+
+Nuke: Create nodes with inpanel=False #5051
+
+This PR is meant to remove the annoyance of the UI changing focus to the properties window, just for the property window of the newly created node to disappear. Instead of using `node.hideControlPanel`, I'm implementing the concealment during the creation of the node, which will not change the focus of the current window.
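+
+For illustration, the creation-time equivalent (the node class is arbitrary):
+
+```python
+import nuke  # available only inside Nuke
+
+# Creating the node with inpanel=False keeps the properties panel from
+# grabbing focus, instead of hiding the panel after the fact.
+node = nuke.createNode("Reformat", inpanel=False)
+```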
+___
+
+
+
+
+
+Fix the reset frame range not setting up the right timeline in Max #5187
+
+Resolve #5181
+
+
+___
+
+
+
+
+
+Resolve: after launch automatization fixes #5193
+
+The workfile is now correctly created and aligned with the actual project. The launching mechanism is also fixed, so even if no workfile has been saved yet it will open the OpenPype menu automatically.
+
+
+___
+
+
+
+
+
+General: Revert backward incompatible change of path to template to multiplatform #5197
+
+Multi-platform support is still handled by usage of `work[root]` (or any other root that is accessible across platforms).
+
+
+___
+
+
+
+
+
+Nuke: root set format updating in node graph #5198
+
+The Nuke root node needs to be reset on some values so knobs can be updated in the node graph. This works the same way as when a user changes the frame number, so expressions update their values in knobs.
+
+
+___
+
+
+
+
+
+Hiero: fixing otio current project and cosmetics #5200
+
+OTIO was not returning the correct current project once an additional Untitled project was open in the project manager stack.
+
+
+___
+
+
+
+
+
+Max: Publisher instances dont hold its enabled disabled states when Publisher reopened again #5202
+
+Resolves #5183, a general MaxScript-to-Python conversion issue (e.g. bool conversion: true in MaxScript vs True in Python). (Also resolves the ValueError when you change the subset to publish in the list view menu.)
+
+
+___
+
+
+
+
+
+Burnins: Filter script is defined only for video streams #5205
+
+Burnins are working for inputs with audio.
+
+
+___
+
+
+
+
+
+Colorspace lib fix compatible python version comparison #5212
+
+Fix python version comparison.
+
+
+___
+
+
+
+
+
+Houdini: Fix `get_color_management_preferences` #5217
+
+Fix the issue described here where the logic for retrieving the current OCIO display and view was incorrectly trying to apply a regex to it.
+
+
+___
+
+
+
+
+
+Houdini: Redshift ROP image format bug #5218
+
+Problem:
+The "RS_outputFileFormat" parm value was missing,
+and there were more "image_format" options than the Redshift ROP supports.
+
+Fix:
+1) removed unnecessary formats from `image_format_enum`
+2) added the selected format value to `RS_outputFileFormat` (sketched below)
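+
+A sketch of step 2 (the node path and format value are hypothetical):
+
+```python
+import hou  # available only inside Houdini
+
+rop_node = hou.node("/out/Redshift_ROP1")  # hypothetical path
+# Write the chosen format to the parm that was previously left unset
+rop_node.parm("RS_outputFileFormat").set(".exr")  # hypothetical value
+```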
+___
+
+
+
+
+
+Colorspace: check PyOpenColorIO rather then python version #5223
+
+Fixing the previously merged PR (https://github.com/ynput/OpenPype/pull/5212) and applying a better way to check compatibility with the PyOpenColorIO Python API.
+
+
+___
+
+
+
+
+
+Validate delivery action representations status #5228
+
+- disable delivery button if no representations checked
+- fix macos combobox layout
+- add error message if no delivery templates found
+
+
+___
+
+
+
+
+
+Houdini: Add geometry check for pointcache family #5230
+
+When `sop_path` on the ABC ROP node points to a non-`SopNode`, the validators `validate_abc_primitive_to_detail.py` and `validate_primitive_hierarchy_paths.py` will error and crash when this line is executed: `geo = output_node.geometryAtFrame(frame)`
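+
+A sketch of the added guard (assuming the `hou` module; the parm name follows the description above):
+
+```python
+import hou  # available only inside Houdini
+
+
+def get_output_sop(rop_node):
+    """Return the SOP node the ROP points to, or raise a readable error."""
+    output_node = rop_node.parm("sop_path").evalAsNode()
+    if not isinstance(output_node, hou.SopNode):
+        raise RuntimeError(
+            "'%s' does not point to a SOP node" % rop_node.path())
+    return output_node
+```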
+
+
+___
+
+
+
+
+
+Houdini: Add geometry check for VDB family #5232
+
+When `sop_path` on the Geometry ROP node points to a non-`SopNode`, the validator `validate_vdb_output_node.py` will error and crash when this line is executed: `sop_node.geometryAtFrame(frame)`
+
+
+___
+
+
+
+
+
+Substance Painter: Include the setting only in publish tab #5234
+
+Instead of having two settings in both the create and publish tabs, there is solely one setting in the publish tab for users to set up the parameters. Resolves #5172.
+
+
+___
+
+
+
+
+
+Maya: Fix collecting arnold prefix when none #5243
+
+When no prefix is specified in render settings, the renderlayer collector would error.
+
+
+___
+
+
+
+
+
+Deadline: OPENPYPE_VERSION should only be added when running from build #5244
+
+When running from source, the environment variable `OPENPYPE_VERSION` should not be added. This is a bugfix for feature #4489.
+
+
+___
+
+
+
+
+
+Fix no prompt for "unsaved changes" showing when opening workfile in Houdini #5246
+
+Fix no prompt for "unsaved changes" showing when opening workfile in Houdini.
+
+
+___
+
+
+
+
+
+Fix no prompt for "unsaved changes" showing when opening workfile in Substance Painter #5248
+
+Fix no prompt for "unsaved changes" showing when opening workfile in Substance Painter.
+
+
+___
+
+
+
+
+
+General: add the os library before os.environ.get #5249
+
+Adding the `os` import to `creator_plugins.py` due to `os.environ.get` on line 667.
+
+
+___
+
+
+
+
+
+Maya: Fix set_attribute for enum attributes #5261
+
+Fix for #5260
+
+
+___
+
+
+
+
+
+Unreal: Move Qt imports away from module init #5268
+
+Importing `Window` creates errors in headless mode.
+```
+*** WRN: >>> { ModulesLoader }: [ FAILED to import host folder unreal ]
+=============================
+No Qt bindings could be found
+=============================
+Traceback (most recent call last):
+ File "C:\Users\tokejepsen\OpenPype\.venv\lib\site-packages\qtpy\__init__.py", line 252, in
+ from PySide6 import __version__ as PYSIDE_VERSION # analysis:ignore
+ModuleNotFoundError: No module named 'PySide6'
+
+During handling of the above exception, another exception occurred:
+
+Traceback (most recent call last):
+ File "C:\Users\tokejepsen\OpenPype\openpype\modules\base.py", line 385, in _load_modules
+ default_module = __import__(
+ File "C:\Users\tokejepsen\OpenPype\openpype\hosts\unreal\__init__.py", line 1, in
+ from .addon import UnrealAddon
+ File "C:\Users\tokejepsen\OpenPype\openpype\hosts\unreal\addon.py", line 4, in
+ from openpype.widgets.message_window import Window
+ File "C:\Users\tokejepsen\OpenPype\openpype\widgets\__init__.py", line 1, in
+ from .password_dialog import PasswordDialog
+ File "C:\Users\tokejepsen\OpenPype\openpype\widgets\password_dialog.py", line 1, in
+ from qtpy import QtWidgets, QtCore, QtGui
+ File "C:\Users\tokejepsen\OpenPype\.venv\lib\site-packages\qtpy\__init__.py", line 259, in
+    raise QtBindingsNotFoundError()
+qtpy.QtBindingsNotFoundError: No Qt bindings could be found
+```
+
+
+___
+
+
+
+### **🔀 Refactored code**
+
+
+
+Maya: Minor refactoring and code cleanup #5226
+
+Some small cleanup and refactoring of logic. Removing old comments, unused imports and some minor optimizations. Also removed the prints of the loader names of each container in the scene in `fix_incompatible_containers` + optimized by using a `set` and defining it only once. Moved some UI related code/tweaks to run `on_init` only if not in headless mode. Removed an empty `obj.py` file. Each commit message kind of describes why the change was made.
+
+
+___
+
+
+
+### **Merged pull requests**
+
+
+
+Bug: Template builder fails when loading data without outliner representation #5222
+
+I added assertion handling in case the container does not have a representation in the outliner.
+
+
+___
+
+
+
+
+
+AfterEffects - add container check validator to AE settings #5203
+
+Adds a check whether the scene contains only the latest versions of loaded containers.
+
+
+___
+
+
+
+
+
+
## [3.15.11](https://github.com/ynput/OpenPype/tree/3.15.11)
diff --git a/README.md b/README.md
index 8757e3db92..6caed8061c 100644
--- a/README.md
+++ b/README.md
@@ -3,7 +3,7 @@
[](#contributors-)
OpenPype
-====
+========
[](https://github.com/pypeclub/pype/actions/workflows/documentation.yml) 
@@ -47,7 +47,7 @@ It can be built and ran on all common platforms. We develop and test on the foll
For more details on requirements visit [requirements documentation](https://openpype.io/docs/dev_requirements)
Building OpenPype
--------------
+-----------------
To build OpenPype you currently need [Python 3.9](https://www.python.org/downloads/) as we are following
[vfx platform](https://vfxplatform.com). Because of some Linux distros comes with newer Python version
@@ -67,9 +67,9 @@ git clone --recurse-submodules git@github.com:Pypeclub/OpenPype.git
#### To build OpenPype:
-1) Run `.\tools\create_env.ps1` to create virtual environment in `.\venv`
+1) Run `.\tools\create_env.ps1` to create virtual environment in `.\venv`.
2) Run `.\tools\fetch_thirdparty_libs.ps1` to download third-party dependencies like ffmpeg and oiio. Those will be included in build.
-3) Run `.\tools\build.ps1` to build OpenPype executables in `.\build\`
+3) Run `.\tools\build.ps1` to build OpenPype executables in `.\build\`.
To create distributable OpenPype versions, run `./tools/create_zip.ps1` - that will
create zip file with name `openpype-vx.x.x.zip` parsed from current OpenPype repository and
@@ -88,38 +88,38 @@ some OpenPype dependencies like [CMake](https://cmake.org/) and **XCode Command
Easy way of installing everything necessary is to use [Homebrew](https://brew.sh):
1) Install **Homebrew**:
-```sh
-/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
-```
+ ```sh
+ /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
+ ```
2) Install **cmake**:
-```sh
-brew install cmake
-```
+ ```sh
+ brew install cmake
+ ```
3) Install [pyenv](https://github.com/pyenv/pyenv):
-```sh
-brew install pyenv
-echo 'eval "$(pyenv init -)"' >> ~/.zshrc
-pyenv init
-exec "$SHELL"
-PATH=$(pyenv root)/shims:$PATH
-```
+ ```sh
+ brew install pyenv
+ echo 'eval "$(pyenv init -)"' >> ~/.zshrc
+ pyenv init
+ exec "$SHELL"
+ PATH=$(pyenv root)/shims:$PATH
+ ```
-4) Pull in required Python version 3.9.x
-```sh
-# install Python build dependences
-brew install openssl readline sqlite3 xz zlib
+4) Pull in required Python version 3.9.x:
+ ```sh
+ # install Python build dependences
+ brew install openssl readline sqlite3 xz zlib
-# replace with up-to-date 3.9.x version
-pyenv install 3.9.6
-```
+ # replace with up-to-date 3.9.x version
+ pyenv install 3.9.6
+ ```
-5) Set local Python version
-```sh
-# switch to OpenPype source directory
-pyenv local 3.9.6
-```
+5) Set local Python version:
+ ```sh
+ # switch to OpenPype source directory
+ pyenv local 3.9.6
+ ```
#### To build OpenPype:
@@ -241,7 +241,7 @@ pyenv local 3.9.6
Running OpenPype
-------------
+----------------
OpenPype can by executed either from live sources (this repository) or from
*"frozen code"* - executables that can be build using steps described above.
@@ -289,7 +289,7 @@ To run tests, execute `.\tools\run_tests(.ps1|.sh)`.
Developer tools
--------------
+---------------
In case you wish to add your own tools to `.\tools` folder without git tracking, it is possible by adding it with `dev_*` suffix (example: `dev_clear_pyc(.ps1|.sh)`).
diff --git a/ayon_start.py b/ayon_start.py
new file mode 100644
index 0000000000..458c46bba6
--- /dev/null
+++ b/ayon_start.py
@@ -0,0 +1,483 @@
+# -*- coding: utf-8 -*-
+"""Main entry point for AYON command.
+
+Bootstrapping process of AYON.
+"""
+import os
+import sys
+import site
+import traceback
+import contextlib
+
+
+# Set log level when "--verbose" is passed
+if "--verbose" in sys.argv:
+ expected_values = (
+ "Expected: notset, debug, info, warning, error, critical"
+ " or integer [0-50]."
+ )
+ idx = sys.argv.index("--verbose")
+ sys.argv.pop(idx)
+ if idx < len(sys.argv):
+ value = sys.argv.pop(idx)
+ else:
+ raise RuntimeError((
+ f"Expect value after \"--verbose\" argument. {expected_values}"
+ ))
+
+ log_level = None
+ low_value = value.lower()
+ if low_value.isdigit():
+ log_level = int(low_value)
+ elif low_value == "notset":
+ log_level = 0
+ elif low_value == "debug":
+ log_level = 10
+ elif low_value == "info":
+ log_level = 20
+ elif low_value == "warning":
+ log_level = 30
+ elif low_value == "error":
+ log_level = 40
+ elif low_value == "critical":
+ log_level = 50
+
+ if log_level is None:
+ raise ValueError((
+ "Unexpected value after \"--verbose\" "
+ f"argument \"{value}\". {expected_values}"
+ ))
+
+ os.environ["OPENPYPE_LOG_LEVEL"] = str(log_level)
+ os.environ["AYON_LOG_LEVEL"] = str(log_level)
+
+# Enable debug mode, may affect log level if log level is not defined
+if "--debug" in sys.argv:
+ sys.argv.remove("--debug")
+ os.environ["AYON_DEBUG"] = "1"
+ os.environ["OPENPYPE_DEBUG"] = "1"
+
+if "--automatic-tests" in sys.argv:
+ sys.argv.remove("--automatic-tests")
+ os.environ["IS_TEST"] = "1"
+
+SKIP_HEADERS = False
+if "--skip-headers" in sys.argv:
+ sys.argv.remove("--skip-headers")
+ SKIP_HEADERS = True
+
+SKIP_BOOTSTRAP = False
+if "--skip-bootstrap" in sys.argv:
+ sys.argv.remove("--skip-bootstrap")
+ SKIP_BOOTSTRAP = True
+
+if "--use-staging" in sys.argv:
+ sys.argv.remove("--use-staging")
+ os.environ["AYON_USE_STAGING"] = "1"
+ os.environ["OPENPYPE_USE_STAGING"] = "1"
+
+if "--headless" in sys.argv:
+ os.environ["AYON_HEADLESS_MODE"] = "1"
+ os.environ["OPENPYPE_HEADLESS_MODE"] = "1"
+ sys.argv.remove("--headless")
+
+elif (
+ os.getenv("AYON_HEADLESS_MODE") != "1"
+ or os.getenv("OPENPYPE_HEADLESS_MODE") != "1"
+):
+ os.environ.pop("AYON_HEADLESS_MODE", None)
+ os.environ.pop("OPENPYPE_HEADLESS_MODE", None)
+
+elif (
+ os.getenv("AYON_HEADLESS_MODE")
+ != os.getenv("OPENPYPE_HEADLESS_MODE")
+):
+ os.environ["OPENPYPE_HEADLESS_MODE"] = (
+ os.environ["AYON_HEADLESS_MODE"]
+ )
+
+IS_BUILT_APPLICATION = getattr(sys, "frozen", False)
+HEADLESS_MODE_ENABLED = os.getenv("AYON_HEADLESS_MODE") == "1"
+
+_pythonpath = os.getenv("PYTHONPATH", "")
+_python_paths = _pythonpath.split(os.pathsep)
+if not IS_BUILT_APPLICATION:
+ # Code root defined by `start.py` directory
+ AYON_ROOT = os.path.dirname(os.path.abspath(__file__))
+ _dependencies_path = site.getsitepackages()[-1]
+else:
+ AYON_ROOT = os.path.dirname(sys.executable)
+
+    # add dependencies folder to sys.path for frozen code
+ _dependencies_path = os.path.normpath(
+ os.path.join(AYON_ROOT, "dependencies")
+ )
+# add stuff from `/dependencies` to PYTHONPATH.
+sys.path.append(_dependencies_path)
+_python_paths.append(_dependencies_path)
+
+# Vendored python modules that must not be in PYTHONPATH environment but
+# are required for OpenPype processes
+sys.path.insert(0, os.path.join(AYON_ROOT, "vendor", "python"))
+
+# Add common package to sys path
+# - common contains common code for bootstrapping and OpenPype processes
+sys.path.insert(0, os.path.join(AYON_ROOT, "common"))
+
+# This is content of 'core' addon which is ATM part of build
+common_python_vendor = os.path.join(
+ AYON_ROOT,
+ "openpype",
+ "vendor",
+ "python",
+ "common"
+)
+# Add tools dir to sys path for pyblish UI discovery
+tools_dir = os.path.join(AYON_ROOT, "openpype", "tools")
+for path in (AYON_ROOT, common_python_vendor, tools_dir):
+ while path in _python_paths:
+ _python_paths.remove(path)
+
+ while path in sys.path:
+ sys.path.remove(path)
+
+ _python_paths.insert(0, path)
+ sys.path.insert(0, path)
+
+os.environ["PYTHONPATH"] = os.pathsep.join(_python_paths)
+
+# enabled AYON state
+os.environ["USE_AYON_SERVER"] = "1"
+# Set this to point either to `python` from venv in case of live code
+# or to `ayon` or `ayon_console` in case of frozen code
+os.environ["AYON_EXECUTABLE"] = sys.executable
+os.environ["OPENPYPE_EXECUTABLE"] = sys.executable
+os.environ["AYON_ROOT"] = AYON_ROOT
+os.environ["OPENPYPE_ROOT"] = AYON_ROOT
+os.environ["OPENPYPE_REPOS_ROOT"] = AYON_ROOT
+os.environ["AYON_MENU_LABEL"] = "AYON"
+os.environ["AVALON_LABEL"] = "AYON"
+# Set name of pyblish UI import
+os.environ["PYBLISH_GUI"] = "pyblish_pype"
+# Set builtin OCIO root
+os.environ["BUILTIN_OCIO_ROOT"] = os.path.join(
+ AYON_ROOT,
+ "vendor",
+ "bin",
+ "ocioconfig",
+ "OpenColorIOConfigs"
+)
+
+import blessed # noqa: E402
+import certifi # noqa: E402
+
+
+if sys.__stdout__:
+ term = blessed.Terminal()
+
+ def _print(message: str):
+ if message.startswith("!!! "):
+ print(f'{term.orangered2("!!! ")}{message[4:]}')
+ elif message.startswith(">>> "):
+ print(f'{term.aquamarine3(">>> ")}{message[4:]}')
+ elif message.startswith("--- "):
+ print(f'{term.darkolivegreen3("--- ")}{message[4:]}')
+ elif message.startswith("*** "):
+ print(f'{term.gold("*** ")}{message[4:]}')
+        elif message.startswith("  - "):
+            print(f'{term.wheat("  - ")}{message[4:]}')
+        elif message.startswith("  . "):
+            print(f'{term.tan("  . ")}{message[4:]}')
+        elif message.startswith("     - "):
+            print(f'{term.seagreen3("     - ")}{message[7:]}')
+        elif message.startswith("     ! "):
+            print(f'{term.goldenrod("     ! ")}{message[7:]}')
+        elif message.startswith("     * "):
+            print(f'{term.aquamarine1("     * ")}{message[7:]}')
+        elif message.startswith("    "):
+            print(f'{term.darkseagreen3("    ")}{message[4:]}')
+ else:
+ print(message)
+else:
+ def _print(message: str):
+ print(message)
+
+
+# if SSL_CERT_FILE is not set prior to OpenPype launch, we set it to point
+# to certifi bundle to make sure we have reasonably new CA certificates.
+if not os.getenv("SSL_CERT_FILE"):
+ os.environ["SSL_CERT_FILE"] = certifi.where()
+elif os.getenv("SSL_CERT_FILE") != certifi.where():
+ _print("--- your system is set to use custom CA certificate bundle.")
+
+from ayon_api import get_base_url
+from ayon_api.constants import SERVER_URL_ENV_KEY, SERVER_API_ENV_KEY
+from ayon_common import is_staging_enabled
+from ayon_common.connection.credentials import (
+ ask_to_login_ui,
+ add_server,
+ need_server_or_login,
+ load_environments,
+ set_environments,
+ create_global_connection,
+ confirm_server_login,
+)
+from ayon_common.distribution import (
+ AyonDistribution,
+ BundleNotFoundError,
+ show_missing_bundle_information,
+)
+
+
+def set_global_environments() -> None:
+ """Set global OpenPype's environments."""
+ import acre
+
+ from openpype.settings import get_general_environments
+
+ general_env = get_general_environments()
+
+ # first resolve general environment because merge doesn't expect
+ # values to be list.
+ # TODO: switch to OpenPype environment functions
+ merged_env = acre.merge(
+ acre.compute(acre.parse(general_env), cleanup=False),
+ dict(os.environ)
+ )
+ env = acre.compute(
+ merged_env,
+ cleanup=False
+ )
+ os.environ.clear()
+ os.environ.update(env)
+
+ # Hardcoded default values
+ os.environ["PYBLISH_GUI"] = "pyblish_pype"
+ # Change scale factor only if is not set
+ if "QT_AUTO_SCREEN_SCALE_FACTOR" not in os.environ:
+ os.environ["QT_AUTO_SCREEN_SCALE_FACTOR"] = "1"
+
+
+def set_addons_environments():
+ """Set global environments for OpenPype modules.
+
+    This requires OpenPype to be in `sys.path`.
+ """
+
+ import acre
+ from openpype.modules import ModulesManager
+
+ modules_manager = ModulesManager()
+
+ # Merge environments with current environments and update values
+ if module_envs := modules_manager.collect_global_environments():
+ parsed_envs = acre.parse(module_envs)
+ env = acre.merge(parsed_envs, dict(os.environ))
+ os.environ.clear()
+ os.environ.update(env)
+
+
+def _connect_to_ayon_server():
+ load_environments()
+ if not need_server_or_login():
+ create_global_connection()
+ return
+
+ if HEADLESS_MODE_ENABLED:
+ _print("!!! Cannot open v4 Login dialog in headless mode.")
+ _print((
+ "!!! Please use `{}` to specify server address"
+ " and '{}' to specify user's token."
+ ).format(SERVER_URL_ENV_KEY, SERVER_API_ENV_KEY))
+ sys.exit(1)
+
+ current_url = os.environ.get(SERVER_URL_ENV_KEY)
+ url, token, username = ask_to_login_ui(current_url, always_on_top=True)
+ if url is not None and token is not None:
+ confirm_server_login(url, token, username)
+ return
+
+ if url is not None:
+ add_server(url, username)
+
+ _print("!!! Login was not successful.")
+ sys.exit(0)
+
+
+def _check_and_update_from_ayon_server():
+ """Gets addon info from v4, compares with local folder and updates it.
+
+ Raises:
+ RuntimeError
+ """
+
+ distribution = AyonDistribution()
+ bundle = None
+ bundle_name = None
+ try:
+ bundle = distribution.bundle_to_use
+ if bundle is not None:
+ bundle_name = bundle.name
+ except BundleNotFoundError as exc:
+ bundle_name = exc.bundle_name
+
+ if bundle is None:
+ url = get_base_url()
+ if not HEADLESS_MODE_ENABLED:
+ show_missing_bundle_information(url, bundle_name)
+
+ elif bundle_name:
+ _print((
+ f"!!! Requested release bundle '{bundle_name}'"
+ " is not available on server."
+ ))
+ _print(
+ "!!! Check if selected release bundle"
+ f" is available on the server '{url}'."
+ )
+
+ else:
+ mode = "staging" if is_staging_enabled() else "production"
+ _print(
+ f"!!! No release bundle is set as {mode} on the AYON server."
+ )
+ _print(
+ "!!! Make sure there is a release bundle set"
+ f" as \"{mode}\" on the AYON server '{url}'."
+ )
+ sys.exit(1)
+
+ distribution.distribute()
+ distribution.validate_distribution()
+ os.environ["AYON_BUNDLE_NAME"] = bundle_name
+
+ python_paths = [
+ path
+ for path in os.getenv("PYTHONPATH", "").split(os.pathsep)
+ if path
+ ]
+
+ for path in distribution.get_sys_paths():
+ sys.path.insert(0, path)
+ if path not in python_paths:
+ python_paths.append(path)
+ os.environ["PYTHONPATH"] = os.pathsep.join(python_paths)
+
+
+def boot():
+ """Bootstrap OpenPype."""
+
+ from openpype.version import __version__
+
+ # TODO load version
+ os.environ["OPENPYPE_VERSION"] = __version__
+ os.environ["AYON_VERSION"] = __version__
+
+ _connect_to_ayon_server()
+ _check_and_update_from_ayon_server()
+
+    # delete the OpenPype module and its submodules from cache so they are
+    # used from the specific version
+    modules_to_del = [
+        module_name
+        for module_name in tuple(sys.modules)
+        if module_name == "openpype" or module_name.startswith("openpype.")
+    ]
+
+ for module_name in modules_to_del:
+ with contextlib.suppress(AttributeError, KeyError):
+ del sys.modules[module_name]
+
+
+def main_cli():
+ from openpype import cli
+ from openpype.version import __version__
+ from openpype.lib import terminal as t
+
+ _print(">>> loading environments ...")
+ _print(" - global AYON ...")
+ set_global_environments()
+ _print(" - for addons ...")
+ set_addons_environments()
+
+ # print info when not running scripts defined in 'silent commands'
+ if not SKIP_HEADERS:
+ info = get_info(is_staging_enabled())
+ info.insert(0, f">>> Using AYON from [ {AYON_ROOT} ]")
+
+ t_width = 20
+ with contextlib.suppress(ValueError, OSError):
+ t_width = os.get_terminal_size().columns - 2
+
+ _header = f"*** AYON [{__version__}] "
+ info.insert(0, _header + "-" * (t_width - len(_header)))
+
+ for i in info:
+ t.echo(i)
+
+ try:
+ cli.main(obj={}, prog_name="ayon")
+ except Exception: # noqa
+ exc_info = sys.exc_info()
+ _print("!!! AYON crashed:")
+ traceback.print_exception(*exc_info)
+ sys.exit(1)
+
+
+def script_cli():
+ """Run and execute script."""
+
+ filepath = os.path.abspath(sys.argv[1])
+
+ # Find '__main__.py' in directory
+ if os.path.isdir(filepath):
+ new_filepath = os.path.join(filepath, "__main__.py")
+ if not os.path.exists(new_filepath):
+ raise RuntimeError(
+ f"can't find '__main__' module in '{filepath}'")
+ filepath = new_filepath
+
+ # Add parent dir to sys path
+ sys.path.insert(0, os.path.dirname(filepath))
+
+ # Read content and execute
+ with open(filepath, "r") as stream:
+ content = stream.read()
+
+ exec(compile(content, filepath, "exec"), globals())
+
+
+def get_info(use_staging=None) -> list:
+ """Print additional information to console."""
+
+ inf = []
+ if use_staging:
+ inf.append(("AYON variant", "staging"))
+ else:
+ inf.append(("AYON variant", "production"))
+ inf.append(("AYON bundle", os.getenv("AYON_BUNDLE")))
+
+ # NOTE add addons information
+
+ maximum = max(len(i[0]) for i in inf)
+ formatted = []
+ for info in inf:
+ padding = (maximum - len(info[0])) + 1
+ formatted.append(f'... {info[0]}:{" " * padding}[ {info[1]} ]')
+ return formatted
+
+
+def main():
+ if not SKIP_BOOTSTRAP:
+ boot()
+
+ args = list(sys.argv)
+ args.pop(0)
+ if args and os.path.exists(args[0]):
+ script_cli()
+ else:
+ main_cli()
+
+
+if __name__ == "__main__":
+ main()
diff --git a/common/ayon_common/__init__.py b/common/ayon_common/__init__.py
new file mode 100644
index 0000000000..ddabb7da2f
--- /dev/null
+++ b/common/ayon_common/__init__.py
@@ -0,0 +1,16 @@
+from .utils import (
+ IS_BUILT_APPLICATION,
+ is_staging_enabled,
+ get_local_site_id,
+ get_ayon_appdirs,
+ get_ayon_launch_args,
+)
+
+
+__all__ = (
+ "IS_BUILT_APPLICATION",
+ "is_staging_enabled",
+ "get_local_site_id",
+ "get_ayon_appdirs",
+ "get_ayon_launch_args",
+)
diff --git a/common/openpype_common/distribution/__init__.py b/common/ayon_common/connection/__init__.py
similarity index 100%
rename from common/openpype_common/distribution/__init__.py
rename to common/ayon_common/connection/__init__.py
diff --git a/common/ayon_common/connection/credentials.py b/common/ayon_common/connection/credentials.py
new file mode 100644
index 0000000000..7f70cb7992
--- /dev/null
+++ b/common/ayon_common/connection/credentials.py
@@ -0,0 +1,511 @@
+"""Handle credentials and connection to server for client application.
+
+Cache and store used server urls. Store/load API keys to/from keyring if
+needed. Store metadata about used urls, usernames for the urls and when the
+connection with the username was established.
+
+On bootstrap, a global connection is created with information about the site
+and client version. The connection object lives in 'ayon_api'.
+"""
+
+import os
+import json
+import platform
+import datetime
+import contextlib
+import subprocess
+import tempfile
+from typing import Optional, Union, Any
+
+import ayon_api
+
+from ayon_api.constants import SERVER_URL_ENV_KEY, SERVER_API_ENV_KEY
+from ayon_api.exceptions import UrlError
+from ayon_api.utils import (
+ validate_url,
+ is_token_valid,
+ logout_from_server,
+)
+
+from ayon_common.utils import (
+ get_ayon_appdirs,
+ get_local_site_id,
+ get_ayon_launch_args,
+ is_staging_enabled,
+)
+
+
+class ChangeUserResult:
+ def __init__(
+ self, logged_out, old_url, old_token, old_username,
+ new_url, new_token, new_username
+ ):
+ shutdown = logged_out
+ restart = new_url is not None and new_url != old_url
+ token_changed = new_token is not None and new_token != old_token
+
+ self.logged_out = logged_out
+ self.old_url = old_url
+ self.old_token = old_token
+ self.old_username = old_username
+ self.new_url = new_url
+ self.new_token = new_token
+ self.new_username = new_username
+
+ self.shutdown = shutdown
+ self.restart = restart
+ self.token_changed = token_changed
+
+
+def _get_servers_path():
+ return get_ayon_appdirs("used_servers.json")
+
+
+def get_servers_info_data():
+ """Metadata about used server on this machine.
+
+    Store data about all used server urls, the last used url and the username
+    for each url. Using this metadata we can remember which username was used
+    per url if the token stored in keyring loses its lifetime.
+
+ Returns:
+ dict[str, Any]: Information about servers.
+ """
+
+ data = {}
+ servers_info_path = _get_servers_path()
+ if not os.path.exists(servers_info_path):
+ dirpath = os.path.dirname(servers_info_path)
+ if not os.path.exists(dirpath):
+ os.makedirs(dirpath)
+
+ return data
+
+ with open(servers_info_path, "r") as stream:
+ with contextlib.suppress(BaseException):
+ data = json.load(stream)
+ return data
+
+
+def add_server(url: str, username: str):
+ """Add server to server info metadata.
+
+    This function will also mark the url as the last used url on the machine,
+    so it will be used on the next launch.
+
+ Args:
+ url (str): Server url.
+ username (str): Name of user used to log in.
+ """
+
+ servers_info_path = _get_servers_path()
+ data = get_servers_info_data()
+ data["last_server"] = url
+ if "urls" not in data:
+ data["urls"] = {}
+ data["urls"][url] = {
+ "updated_dt": datetime.datetime.now().strftime("%Y/%m/%d %H:%M:%S"),
+ "username": username,
+ }
+
+ with open(servers_info_path, "w") as stream:
+ json.dump(data, stream)
+
+
+def remove_server(url: str):
+ """Remove server url from servers information.
+
+    This should be used on logout to completely lose information about the
+    server on the machine.
+
+ Args:
+ url (str): Server url.
+ """
+
+ if not url:
+ return
+
+ servers_info_path = _get_servers_path()
+ data = get_servers_info_data()
+ if data.get("last_server") == url:
+ data["last_server"] = None
+
+ if "urls" in data:
+ data["urls"].pop(url, None)
+
+ with open(servers_info_path, "w") as stream:
+ json.dump(data, stream)
+
+
+def get_last_server(
+ data: Optional[dict[str, Any]] = None
+) -> Union[str, None]:
+ """Last server used to log in on this machine.
+
+ Args:
+ data (Optional[dict[str, Any]]): Prepared server information data.
+
+ Returns:
+ Union[str, None]: Last used server url.
+ """
+
+ if data is None:
+ data = get_servers_info_data()
+ return data.get("last_server")
+
+
+def get_last_username_by_url(
+ url: str,
+ data: Optional[dict[str, Any]] = None
+) -> Union[str, None]:
+ """Get last username which was used for passed url.
+
+ Args:
+ url (str): Server url.
+ data (Optional[dict[str, Any]]): Servers info.
+
+ Returns:
+ Union[str, None]: Username.
+ """
+
+ if not url:
+ return None
+
+ if data is None:
+ data = get_servers_info_data()
+
+ if urls := data.get("urls"):
+ if url_info := urls.get(url):
+ return url_info.get("username")
+ return None
+
+
+def get_last_server_with_username():
+ """Receive last server and username used in last connection.
+
+ Returns:
+ tuple[Union[str, None], Union[str, None]]: Url and username.
+ """
+
+ data = get_servers_info_data()
+ url = get_last_server(data)
+    username = get_last_username_by_url(url, data)
+ return url, username
+
+
+class TokenKeyring:
+    # Keyring entries require a username, use a hardcoded one
+ username_key = "username"
+
+ def __init__(self, url):
+ try:
+ import keyring
+
+ except Exception as exc:
+ raise NotImplementedError(
+ "Python module `keyring` is not available."
+ ) from exc
+
+ # hack for cx_freeze and Windows keyring backend
+ if platform.system().lower() == "windows":
+ from keyring.backends import Windows
+
+ keyring.set_keyring(Windows.WinVaultKeyring())
+
+ self._url = url
+ self._keyring_key = f"AYON/{url}"
+
+ def get_value(self):
+ import keyring
+
+ return keyring.get_password(self._keyring_key, self.username_key)
+
+ def set_value(self, value):
+ import keyring
+
+ if value is not None:
+ keyring.set_password(self._keyring_key, self.username_key, value)
+ return
+
+ with contextlib.suppress(keyring.errors.PasswordDeleteError):
+ keyring.delete_password(self._keyring_key, self.username_key)
+
+
+def load_token(url: str) -> Union[str, None]:
+ """Get token for url from keyring.
+
+ Args:
+ url (str): Server url.
+
+ Returns:
+ Union[str, None]: Token for passed url available in keyring.
+ """
+
+ return TokenKeyring(url).get_value()
+
+
+def store_token(url: str, token: str):
+ """Store token by url to keyring.
+
+ Args:
+ url (str): Server url.
+ token (str): User token to server.
+ """
+
+ TokenKeyring(url).set_value(token)
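+
+
+# Illustrative usage (sketch only, hypothetical values): tokens are stored
+# per server url under a fixed keyring "username":
+#     store_token("https://ayon.example.com", "api-key-123")
+#     load_token("https://ayon.example.com")  # -> "api-key-123"
+#     store_token("https://ayon.example.com", None)  # removes the entry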
+
+
+def ask_to_login_ui(
+ url: Optional[str] = None,
+ always_on_top: Optional[bool] = False
+) -> tuple[str, str, str]:
+ """Ask user to login using UI.
+
+    This should be used only when the user is not logged in at all or the
+    available credentials are invalid. To change credentials use the
+    'change_user_ui' function.
+
+    A subprocess is used to show the UI.
+
+ Args:
+ url (Optional[str]): Server url that could be prefilled in UI.
+ always_on_top (Optional[bool]): Window will be drawn on top of
+ other windows.
+
+ Returns:
+ tuple[str, str, str]: Url, user's token and username.
+ """
+
+ current_dir = os.path.dirname(os.path.abspath(__file__))
+ ui_dir = os.path.join(current_dir, "ui")
+
+ if url is None:
+ url = get_last_server()
+ username = get_last_username_by_url(url)
+ data = {
+ "url": url,
+ "username": username,
+ "always_on_top": always_on_top,
+ }
+
+ with tempfile.NamedTemporaryFile(
+ mode="w", prefix="ayon_login", suffix=".json", delete=False
+ ) as tmp:
+ output = tmp.name
+ json.dump(data, tmp)
+
+ code = subprocess.call(
+ get_ayon_launch_args(ui_dir, "--skip-bootstrap", output))
+ if code != 0:
+ raise RuntimeError("Failed to show login UI")
+
+ with open(output, "r") as stream:
+ data = json.load(stream)
+ os.remove(output)
+ return data["output"]
+
+
+def change_user_ui() -> ChangeUserResult:
+ """Change user using UI.
+
+    Show UI where the user can change credentials or url. Output contains all
+    information about old/new values of url, username and api key, and
+    whether the user confirmed or declined the change.
+
+ Returns:
+ ChangeUserResult: Information about user change.
+ """
+
+ from .ui import change_user
+
+ url, username = get_last_server_with_username()
+ token = load_token(url)
+ result = change_user(url, username, token)
+ new_url, new_token, new_username, logged_out = result
+
+ output = ChangeUserResult(
+ logged_out, url, token, username,
+ new_url, new_token, new_username
+ )
+ if output.logged_out:
+ logout(url, token)
+
+ elif output.token_changed:
+ change_token(
+ output.new_url,
+ output.new_token,
+ output.new_username,
+ output.old_url
+ )
+ return output
+
+
+def change_token(
+ url: str,
+ token: str,
+ username: Optional[str] = None,
+ old_url: Optional[str] = None
+):
+ """Change url and token in currently running session.
+
+    The function can also change the server url; in that case the previous
+    credentials are NOT removed from cache.
+
+ Args:
+ url (str): Url to server.
+ token (str): New token to be used for url connection.
+ username (Optional[str]): Username of logged user.
+ old_url (Optional[str]): Previous url. Value from 'get_last_server'
+ is used if not entered.
+ """
+
+ if old_url is None:
+ old_url = get_last_server()
+ if old_url and old_url == url:
+ remove_url_cache(old_url)
+
+ # TODO check if ayon_api is already connected
+ add_server(url, username)
+ store_token(url, token)
+ ayon_api.change_token(url, token)
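+
+
+# Illustrative usage (sketch only, hypothetical values):
+#     change_token("https://ayon.example.com", "new-api-key", "artist")
+# This stores the token to keyring, marks the url as last used and updates
+# the global 'ayon_api' connection of the running process.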
+
+
+def remove_url_cache(url: str):
+ """Clear cache for server url.
+
+ Args:
+ url (str): Server url which is removed from cache.
+ """
+
+ store_token(url, None)
+
+
+def remove_token_cache(url: str, token: str):
+ """Remove token from local cache of url.
+
+    Removal is skipped if the token cached under the passed url is not the
+    same as the passed token.
+
+ Args:
+ url (str): Url to server.
+ token (str): Token to be removed from url cache.
+ """
+
+ if load_token(url) == token:
+ remove_url_cache(url)
+
+
+def logout(url: str, token: str):
+ """Logout from server and throw token away.
+
+ Args:
+        url (str): Url to log out from.
+        token (str): Token used to log out.
+ """
+
+ remove_server(url)
+ ayon_api.close_connection()
+ ayon_api.set_environments(None, None)
+ remove_token_cache(url, token)
+ logout_from_server(url, token)
+
+
+def load_environments():
+ """Load environments on startup.
+
+ Handle environments needed for connection with server. Environments are
+ 'AYON_SERVER_URL' and 'AYON_API_KEY'.
+
+    Server is looked up from environment. An already set environment is not
+    changed. If the environment is not filled then the last server stored in
+    appdirs is used.
+
+    Token is skipped if url is not available. Otherwise it is also checked
+    from env and, if not available, 'load_token' is used to try to get the
+    token based on the server url.
+ """
+
+ server_url = os.environ.get(SERVER_URL_ENV_KEY)
+ if not server_url:
+ server_url = get_last_server()
+ if not server_url:
+ return
+ os.environ[SERVER_URL_ENV_KEY] = server_url
+
+ if not os.environ.get(SERVER_API_ENV_KEY):
+ if token := load_token(server_url):
+ os.environ[SERVER_API_ENV_KEY] = token
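+
+
+# Illustrative resolution (sketch only): with no environments set and
+# "https://ayon.example.com" stored as the last server, 'load_environments'
+# fills 'AYON_SERVER_URL' with that url and, when keyring holds a token for
+# it, also 'AYON_API_KEY'. Environments that are already set are untouched.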
+
+
+def set_environments(url: str, token: str):
+ """Change url and token environemnts in currently running process.
+
+ Args:
+ url (str): New server url.
+ token (str): User's token.
+ """
+
+ ayon_api.set_environments(url, token)
+
+
+def create_global_connection():
+ """Create global connection with site id and client version.
+
+    Make sure the global connection in 'ayon_api' has the site id and client
+    version set.
+
+ Set default settings variant to use based on 'is_staging_enabled'.
+ """
+
+ ayon_api.create_connection(
+ get_local_site_id(), os.environ.get("AYON_VERSION")
+ )
+ ayon_api.set_default_settings_variant(
+ "staging" if is_staging_enabled() else "production"
+ )
+
+
+def need_server_or_login() -> bool:
+ """Check if server url or login to the server are needed.
+
+ It is recommended to call 'load_environments' on startup before this check.
+ But in some cases this function could be called after startup.
+
+ Returns:
+        bool: 'True' if a server url or new login is needed, 'False' when a
+            valid server url and token are already available.
+ """
+
+ server_url = os.environ.get(SERVER_URL_ENV_KEY)
+ if not server_url:
+ return True
+
+ try:
+ server_url = validate_url(server_url)
+ except UrlError:
+ return True
+
+ token = os.environ.get(SERVER_API_ENV_KEY)
+ if token:
+ return not is_token_valid(server_url, token)
+
+ token = load_token(server_url)
+ if token:
+ return not is_token_valid(server_url, token)
+ return True
+
+
+def confirm_server_login(url, token, username):
+ """Confirm login of user and do necessary stepts to apply changes.
+
+ This should not be used on "change" of user but on first login.
+
+ Args:
+ url (str): Server url where user authenticated.
+ token (str): API token used for authentication to server.
+ username (Union[str, None]): Username related to API token.
+ """
+
+ add_server(url, username)
+ store_token(url, token)
+ set_environments(url, token)
+ create_global_connection()
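+
+
+# Illustrative bootstrap flow (sketch only) combining the helpers above:
+#     load_environments()
+#     if need_server_or_login():
+#         url, token, username = ask_to_login_ui()
+#         confirm_server_login(url, token, username)
+#     else:
+#         create_global_connection()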
diff --git a/common/ayon_common/connection/ui/__init__.py b/common/ayon_common/connection/ui/__init__.py
new file mode 100644
index 0000000000..96e573df0d
--- /dev/null
+++ b/common/ayon_common/connection/ui/__init__.py
@@ -0,0 +1,12 @@
+from .login_window import (
+ ServerLoginWindow,
+ ask_to_login,
+ change_user,
+)
+
+
+__all__ = (
+ "ServerLoginWindow",
+ "ask_to_login",
+ "change_user",
+)
diff --git a/common/ayon_common/connection/ui/__main__.py b/common/ayon_common/connection/ui/__main__.py
new file mode 100644
index 0000000000..719b2b8ef5
--- /dev/null
+++ b/common/ayon_common/connection/ui/__main__.py
@@ -0,0 +1,23 @@
+import sys
+import json
+
+from ayon_common.connection.ui.login_window import ask_to_login
+
+
+def main(output_path):
+ with open(output_path, "r") as stream:
+ data = json.load(stream)
+
+ url = data.get("url")
+ username = data.get("username")
+ always_on_top = data.get("always_on_top", False)
+ out_url, out_token, out_username = ask_to_login(
+ url, username, always_on_top=always_on_top)
+
+ data["output"] = [out_url, out_token, out_username]
+ with open(output_path, "w") as stream:
+ json.dump(data, stream)
+
+
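+# Illustrative invocation (sketch only): this module is launched by
+# 'ask_to_login_ui' as a subprocess with a path to a temporary json file,
+# e.g. 'ayon_login_xxxx.json', which serves as both input and output.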
+if __name__ == "__main__":
+ main(sys.argv[-1])
diff --git a/common/ayon_common/connection/ui/login_window.py b/common/ayon_common/connection/ui/login_window.py
new file mode 100644
index 0000000000..94c239852e
--- /dev/null
+++ b/common/ayon_common/connection/ui/login_window.py
@@ -0,0 +1,710 @@
+import traceback
+
+from qtpy import QtWidgets, QtCore, QtGui
+
+from ayon_api.exceptions import UrlError
+from ayon_api.utils import validate_url, login_to_server
+
+from ayon_common.resources import (
+ get_resource_path,
+ get_icon_path,
+ load_stylesheet,
+)
+from ayon_common.ui_utils import set_style_property, get_qt_app
+
+from .widgets import (
+ PressHoverButton,
+ PlaceholderLineEdit,
+)
+
+
+class LogoutConfirmDialog(QtWidgets.QDialog):
+ def __init__(self, *args, **kwargs):
+ super().__init__(*args, **kwargs)
+
+ self.setWindowTitle("Logout confirmation")
+
+ message_widget = QtWidgets.QWidget(self)
+
+ message_label = QtWidgets.QLabel(
+ (
+ "You are going to logout. This action will close this"
+ " application and will invalidate your login."
+ " All other applications launched with this login won't be"
+ " able to use it anymore.
"
+ "You can cancel logout and only change server and user login"
+ " in login dialog.
"
+ "Press OK to confirm logout."
+ ),
+ message_widget
+ )
+ message_label.setWordWrap(True)
+
+ message_layout = QtWidgets.QHBoxLayout(message_widget)
+ message_layout.setContentsMargins(0, 0, 0, 0)
+ message_layout.addWidget(message_label, 1)
+
+ sep_frame = QtWidgets.QFrame(self)
+ sep_frame.setObjectName("Separator")
+ sep_frame.setMinimumHeight(2)
+ sep_frame.setMaximumHeight(2)
+
+ footer_widget = QtWidgets.QWidget(self)
+
+ cancel_btn = QtWidgets.QPushButton("Cancel", footer_widget)
+ confirm_btn = QtWidgets.QPushButton("OK", footer_widget)
+
+ footer_layout = QtWidgets.QHBoxLayout(footer_widget)
+ footer_layout.setContentsMargins(0, 0, 0, 0)
+ footer_layout.addStretch(1)
+ footer_layout.addWidget(cancel_btn, 0)
+ footer_layout.addWidget(confirm_btn, 0)
+
+ main_layout = QtWidgets.QVBoxLayout(self)
+ main_layout.addWidget(message_widget, 0)
+ main_layout.addStretch(1)
+ main_layout.addWidget(sep_frame, 0)
+ main_layout.addWidget(footer_widget, 0)
+
+ cancel_btn.clicked.connect(self._on_cancel_click)
+ confirm_btn.clicked.connect(self._on_confirm_click)
+
+ self._cancel_btn = cancel_btn
+ self._confirm_btn = confirm_btn
+ self._result = False
+
+ def showEvent(self, event):
+ super().showEvent(event)
+ self._match_btns_sizes()
+
+ def resizeEvent(self, event):
+ super().resizeEvent(event)
+ self._match_btns_sizes()
+
+ def _match_btns_sizes(self):
+ width = max(
+ self._cancel_btn.sizeHint().width(),
+ self._confirm_btn.sizeHint().width()
+ )
+ self._cancel_btn.setMinimumWidth(width)
+ self._confirm_btn.setMinimumWidth(width)
+
+ def _on_cancel_click(self):
+ self._result = False
+ self.reject()
+
+ def _on_confirm_click(self):
+ self._result = True
+ self.accept()
+
+ def get_result(self):
+ return self._result
+
+
+class ServerLoginWindow(QtWidgets.QDialog):
+ default_width = 410
+ default_height = 170
+
+ def __init__(self, *args, **kwargs):
+ super().__init__(*args, **kwargs)
+
+ icon_path = get_icon_path()
+ icon = QtGui.QIcon(icon_path)
+ self.setWindowIcon(icon)
+ self.setWindowTitle("Login to server")
+
+ edit_icon_path = get_resource_path("edit.png")
+ edit_icon = QtGui.QIcon(edit_icon_path)
+
+ # --- URL page ---
+ login_widget = QtWidgets.QWidget(self)
+
+ user_cred_widget = QtWidgets.QWidget(login_widget)
+
+ url_label = QtWidgets.QLabel("URL:", user_cred_widget)
+
+ url_widget = QtWidgets.QWidget(user_cred_widget)
+
+ url_input = PlaceholderLineEdit(url_widget)
+ url_input.setPlaceholderText("< https://ayon.server.com >")
+
+ url_preview = QtWidgets.QLineEdit(url_widget)
+ url_preview.setReadOnly(True)
+ url_preview.setObjectName("LikeDisabledInput")
+
+ url_edit_btn = PressHoverButton(user_cred_widget)
+ url_edit_btn.setIcon(edit_icon)
+ url_edit_btn.setObjectName("PasswordBtn")
+
+ url_layout = QtWidgets.QHBoxLayout(url_widget)
+ url_layout.setContentsMargins(0, 0, 0, 0)
+ url_layout.addWidget(url_input, 1)
+ url_layout.addWidget(url_preview, 1)
+
+ # --- URL separator ---
+ url_cred_sep = QtWidgets.QFrame(self)
+ url_cred_sep.setObjectName("Separator")
+ url_cred_sep.setMinimumHeight(2)
+ url_cred_sep.setMaximumHeight(2)
+
+ # --- Login page ---
+ username_label = QtWidgets.QLabel("Username:", user_cred_widget)
+
+ username_widget = QtWidgets.QWidget(user_cred_widget)
+
+ username_input = PlaceholderLineEdit(username_widget)
+ username_input.setPlaceholderText("< Artist >")
+
+ username_preview = QtWidgets.QLineEdit(username_widget)
+ username_preview.setReadOnly(True)
+ username_preview.setObjectName("LikeDisabledInput")
+
+ username_edit_btn = PressHoverButton(user_cred_widget)
+ username_edit_btn.setIcon(edit_icon)
+ username_edit_btn.setObjectName("PasswordBtn")
+
+ username_layout = QtWidgets.QHBoxLayout(username_widget)
+ username_layout.setContentsMargins(0, 0, 0, 0)
+ username_layout.addWidget(username_input, 1)
+ username_layout.addWidget(username_preview, 1)
+
+ password_label = QtWidgets.QLabel("Password:", user_cred_widget)
+ password_input = PlaceholderLineEdit(user_cred_widget)
+ password_input.setPlaceholderText("< *********** >")
+ password_input.setEchoMode(PlaceholderLineEdit.Password)
+
+ api_label = QtWidgets.QLabel("API key:", user_cred_widget)
+ api_preview = QtWidgets.QLineEdit(user_cred_widget)
+ api_preview.setReadOnly(True)
+ api_preview.setObjectName("LikeDisabledInput")
+
+ show_password_icon_path = get_resource_path("eye.png")
+ show_password_icon = QtGui.QIcon(show_password_icon_path)
+ show_password_btn = PressHoverButton(user_cred_widget)
+ show_password_btn.setObjectName("PasswordBtn")
+ show_password_btn.setIcon(show_password_icon)
+ show_password_btn.setFocusPolicy(QtCore.Qt.ClickFocus)
+
+ cred_msg_sep = QtWidgets.QFrame(self)
+ cred_msg_sep.setObjectName("Separator")
+ cred_msg_sep.setMinimumHeight(2)
+ cred_msg_sep.setMaximumHeight(2)
+
+ # --- Credentials inputs ---
+ user_cred_layout = QtWidgets.QGridLayout(user_cred_widget)
+ user_cred_layout.setContentsMargins(0, 0, 0, 0)
+ row = 0
+
+ user_cred_layout.addWidget(url_label, row, 0, 1, 1)
+ user_cred_layout.addWidget(url_widget, row, 1, 1, 1)
+ user_cred_layout.addWidget(url_edit_btn, row, 2, 1, 1)
+ row += 1
+
+ user_cred_layout.addWidget(url_cred_sep, row, 0, 1, 3)
+ row += 1
+
+ user_cred_layout.addWidget(username_label, row, 0, 1, 1)
+ user_cred_layout.addWidget(username_widget, row, 1, 1, 1)
+ user_cred_layout.addWidget(username_edit_btn, row, 2, 2, 1)
+ row += 1
+
+ user_cred_layout.addWidget(api_label, row, 0, 1, 1)
+ user_cred_layout.addWidget(api_preview, row, 1, 1, 1)
+ row += 1
+
+ user_cred_layout.addWidget(password_label, row, 0, 1, 1)
+ user_cred_layout.addWidget(password_input, row, 1, 1, 1)
+ user_cred_layout.addWidget(show_password_btn, row, 2, 1, 1)
+ row += 1
+
+ user_cred_layout.addWidget(cred_msg_sep, row, 0, 1, 3)
+ row += 1
+
+ user_cred_layout.setColumnStretch(0, 0)
+ user_cred_layout.setColumnStretch(1, 1)
+ user_cred_layout.setColumnStretch(2, 0)
+
+ login_layout = QtWidgets.QVBoxLayout(login_widget)
+ login_layout.setContentsMargins(0, 0, 0, 0)
+ login_layout.addWidget(user_cred_widget, 1)
+
+ # --- Messages ---
+ # Messages for users (e.g. invalid url etc.)
+ message_label = QtWidgets.QLabel(self)
+ message_label.setWordWrap(True)
+ message_label.setTextInteractionFlags(QtCore.Qt.TextBrowserInteraction)
+
+ footer_widget = QtWidgets.QWidget(self)
+ logout_btn = QtWidgets.QPushButton("Logout", footer_widget)
+ user_message = QtWidgets.QLabel(footer_widget)
+ login_btn = QtWidgets.QPushButton("Login", footer_widget)
+ confirm_btn = QtWidgets.QPushButton("Confirm", footer_widget)
+
+ footer_layout = QtWidgets.QHBoxLayout(footer_widget)
+ footer_layout.setContentsMargins(0, 0, 0, 0)
+ footer_layout.addWidget(logout_btn, 0)
+ footer_layout.addWidget(user_message, 1)
+ footer_layout.addWidget(login_btn, 0)
+ footer_layout.addWidget(confirm_btn, 0)
+
+ main_layout = QtWidgets.QVBoxLayout(self)
+ main_layout.addWidget(login_widget, 0)
+ main_layout.addWidget(message_label, 0)
+ main_layout.addStretch(1)
+ main_layout.addWidget(footer_widget, 0)
+
+ url_input.textChanged.connect(self._on_url_change)
+ url_input.returnPressed.connect(self._on_url_enter_press)
+ username_input.textChanged.connect(self._on_user_change)
+ username_input.returnPressed.connect(self._on_username_enter_press)
+ password_input.returnPressed.connect(self._on_password_enter_press)
+ show_password_btn.change_state.connect(self._on_show_password)
+ url_edit_btn.clicked.connect(self._on_url_edit_click)
+ username_edit_btn.clicked.connect(self._on_username_edit_click)
+ logout_btn.clicked.connect(self._on_logout_click)
+ login_btn.clicked.connect(self._on_login_click)
+ confirm_btn.clicked.connect(self._on_login_click)
+
+ self._message_label = message_label
+
+ self._url_widget = url_widget
+ self._url_input = url_input
+ self._url_preview = url_preview
+ self._url_edit_btn = url_edit_btn
+
+ self._login_widget = login_widget
+
+ self._user_cred_widget = user_cred_widget
+ self._username_input = username_input
+ self._username_preview = username_preview
+ self._username_edit_btn = username_edit_btn
+
+ self._password_label = password_label
+ self._password_input = password_input
+ self._show_password_btn = show_password_btn
+ self._api_label = api_label
+ self._api_preview = api_preview
+
+ self._logout_btn = logout_btn
+ self._user_message = user_message
+ self._login_btn = login_btn
+ self._confirm_btn = confirm_btn
+
+ self._url_is_valid = None
+ self._credentials_are_valid = None
+ self._result = (None, None, None, False)
+ self._first_show = True
+
+ self._allow_logout = False
+ self._logged_in = False
+ self._url_edit_mode = False
+ self._username_edit_mode = False
+
+ def set_allow_logout(self, allow_logout):
+ if allow_logout is self._allow_logout:
+ return
+ self._allow_logout = allow_logout
+
+ self._update_states_by_edit_mode()
+
+ def _set_logged_in(self, logged_in):
+ if logged_in is self._logged_in:
+ return
+ self._logged_in = logged_in
+
+ self._update_states_by_edit_mode()
+
+ def _set_url_edit_mode(self, edit_mode):
+ if self._url_edit_mode is not edit_mode:
+ self._url_edit_mode = edit_mode
+ self._update_states_by_edit_mode()
+
+ def _set_username_edit_mode(self, edit_mode):
+ if self._username_edit_mode is not edit_mode:
+ self._username_edit_mode = edit_mode
+ self._update_states_by_edit_mode()
+
+ def _get_url_user_edit(self):
+ url_edit = True
+ if self._logged_in and not self._url_edit_mode:
+ url_edit = False
+ user_edit = url_edit
+ if not user_edit and self._logged_in and self._username_edit_mode:
+ user_edit = True
+ return url_edit, user_edit
+
+ def _update_states_by_edit_mode(self):
+ url_edit, user_edit = self._get_url_user_edit()
+
+ self._url_preview.setVisible(not url_edit)
+ self._url_input.setVisible(url_edit)
+ self._url_edit_btn.setVisible(self._allow_logout and not url_edit)
+
+ self._username_preview.setVisible(not user_edit)
+ self._username_input.setVisible(user_edit)
+ self._username_edit_btn.setVisible(
+ self._allow_logout and not user_edit
+ )
+
+ self._api_preview.setVisible(not user_edit)
+ self._api_label.setVisible(not user_edit)
+
+ self._password_label.setVisible(user_edit)
+ self._show_password_btn.setVisible(user_edit)
+ self._password_input.setVisible(user_edit)
+
+ self._logout_btn.setVisible(self._allow_logout and self._logged_in)
+ self._login_btn.setVisible(not self._allow_logout)
+ self._confirm_btn.setVisible(self._allow_logout)
+ self._update_login_btn_state(url_edit, user_edit)
+
+ def _update_login_btn_state(self, url_edit=None, user_edit=None, url=None):
+ if url_edit is None:
+ url_edit, user_edit = self._get_url_user_edit()
+
+ if url is None:
+ url = self._url_input.text()
+
+ enabled = bool(url) and (url_edit or user_edit)
+
+ self._login_btn.setEnabled(enabled)
+ self._confirm_btn.setEnabled(enabled)
+
+ def showEvent(self, event):
+ super().showEvent(event)
+ if self._first_show:
+ self._first_show = False
+ self._on_first_show()
+
+ def _on_first_show(self):
+ self.setStyleSheet(load_stylesheet())
+ self.resize(self.default_width, self.default_height)
+ self._center_window()
+ if self._allow_logout is None:
+ self.set_allow_logout(False)
+
+ self._update_states_by_edit_mode()
+ if not self._url_input.text():
+ widget = self._url_input
+ elif not self._username_input.text():
+ widget = self._username_input
+ else:
+ widget = self._password_input
+
+ self._set_input_focus(widget)
+
+ def result(self):
+ """Result url and token or login.
+
+ Returns:
+ Union[Tuple[str, str], Tuple[None, None]]: Url and token used for
+ login if was successful otherwise are both set to None.
+ """
+ return self._result
+
+ def _center_window(self):
+ """Move window to center of screen."""
+
+ if hasattr(QtWidgets.QApplication, "desktop"):
+ desktop = QtWidgets.QApplication.desktop()
+ screen_idx = desktop.screenNumber(self)
+ screen_geo = desktop.screenGeometry(screen_idx)
+ else:
+ screen = self.screen()
+ screen_geo = screen.geometry()
+
+ geo = self.frameGeometry()
+ geo.moveCenter(screen_geo.center())
+ if geo.y() < screen_geo.y():
+ geo.setY(screen_geo.y())
+ self.move(geo.topLeft())
+
+ def _on_url_change(self, text):
+ self._update_login_btn_state(url=text)
+ self._set_url_valid(None)
+ self._set_credentials_valid(None)
+ self._url_preview.setText(text)
+
+ def _set_url_valid(self, valid):
+ if valid is self._url_is_valid:
+ return
+
+ self._url_is_valid = valid
+ self._set_input_valid_state(self._url_input, valid)
+
+ def _set_credentials_valid(self, valid):
+ if self._credentials_are_valid is valid:
+ return
+
+ self._credentials_are_valid = valid
+ self._set_input_valid_state(self._username_input, valid)
+ self._set_input_valid_state(self._password_input, valid)
+
+ def _on_url_enter_press(self):
+ self._set_input_focus(self._username_input)
+
+ def _on_user_change(self, username):
+ self._username_preview.setText(username)
+
+ def _on_username_enter_press(self):
+ self._set_input_focus(self._password_input)
+
+ def _on_password_enter_press(self):
+ self._login()
+
+ def _on_show_password(self, show_password):
+ if show_password:
+ placeholder_text = "< MySecret124 >"
+ echo_mode = QtWidgets.QLineEdit.Normal
+ else:
+ placeholder_text = "< *********** >"
+ echo_mode = QtWidgets.QLineEdit.Password
+
+ self._password_input.setEchoMode(echo_mode)
+ self._password_input.setPlaceholderText(placeholder_text)
+
+ def _on_username_edit_click(self):
+ self._username_edit_mode = True
+ self._update_states_by_edit_mode()
+
+ def _on_url_edit_click(self):
+ self._url_edit_mode = True
+ self._update_states_by_edit_mode()
+
+ def _on_logout_click(self):
+ dialog = LogoutConfirmDialog(self)
+ dialog.exec_()
+ if dialog.get_result():
+ self._result = (None, None, None, True)
+ self.accept()
+
+ def _on_login_click(self):
+ self._login()
+
+ def _validate_url(self):
+ """Use url from input to connect and change window state on success.
+
+ Todos:
+ Threaded check.
+ """
+
+ url = self._url_input.text()
+ valid_url = None
+ try:
+ valid_url = validate_url(url)
+
+ except UrlError as exc:
+ parts = [f"{exc.title}"]
+ parts.extend(f"- {hint}" for hint in exc.hints)
+ self._set_message(" ".join(parts))
+
+ except KeyboardInterrupt:
+ # Reraise KeyboardInterrupt error
+ raise
+
+ except BaseException:
+ self._set_unexpected_error()
+ return
+
+ if valid_url is None:
+ return False
+
+ self._url_input.setText(valid_url)
+ return True
+
+ def _login(self):
+ if (
+ not self._login_btn.isEnabled()
+ and not self._confirm_btn.isEnabled()
+ ):
+ return
+
+ if not self._url_is_valid:
+ self._set_url_valid(self._validate_url())
+
+ if not self._url_is_valid:
+ self._set_input_focus(self._url_input)
+ self._set_credentials_valid(None)
+ return
+
+ self._clear_message()
+
+ url = self._url_input.text()
+ username = self._username_input.text()
+ password = self._password_input.text()
+ try:
+ token = login_to_server(url, username, password)
+ except BaseException:
+ self._set_unexpected_error()
+ return
+
+ if token is not None:
+ self._result = (url, token, username, False)
+ self.accept()
+ return
+
+ self._set_credentials_valid(False)
+ message_lines = ["Invalid credentials"]
+ if not username.strip():
+ message_lines.append("- Username is not filled")
+
+ if not password.strip():
+ message_lines.append("- Password is not filled")
+
+ if username and password:
+ message_lines.append("- Check your credentials")
+
+ self._set_message(" ".join(message_lines))
+ self._set_input_focus(self._username_input)
+
+ def _set_input_focus(self, widget):
+ widget.setFocus(QtCore.Qt.MouseFocusReason)
+
+ def _set_input_valid_state(self, widget, valid):
+ state = ""
+ if valid is True:
+ state = "valid"
+ elif valid is False:
+ state = "invalid"
+ set_style_property(widget, "state", state)
+
+ def _set_message(self, message):
+ self._message_label.setText(message)
+
+ def _clear_message(self):
+ self._message_label.setText("")
+
+ def _set_unexpected_error(self):
+ # TODO add traceback somewhere
+ # - maybe a button to show or copy?
+ traceback.print_exc()
+ lines = [
+ "Unexpected error happened",
+ "- Can be caused by wrong url (leading elsewhere)"
+ ]
+ self._set_message(" ".join(lines))
+
+ def set_url(self, url):
+ self._url_preview.setText(url)
+ self._url_input.setText(url)
+ self._validate_url()
+
+ def set_username(self, username):
+ self._username_preview.setText(username)
+ self._username_input.setText(username)
+
+ def _set_api_key(self, api_key):
+ if not api_key or len(api_key) < 3:
+ self._api_preview.setText(api_key or "")
+ return
+
+ api_key_len = len(api_key)
+ offset = 6
+ if api_key_len < offset:
+ offset = api_key_len // 2
+ api_key = api_key[:offset] + "." * (api_key_len - offset)
+
+ self._api_preview.setText(api_key)
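+
+    # Illustrative masking (hypothetical key): "abcdef123456" is previewed
+    # as "abcdef......" - the first six characters stay readable and the
+    # rest are replaced with dots.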
+
+ def set_logged_in(
+ self,
+ logged_in,
+ url=None,
+ username=None,
+ api_key=None,
+ allow_logout=None
+ ):
+ if url is not None:
+ self.set_url(url)
+
+ if username is not None:
+ self.set_username(username)
+
+ if api_key:
+ self._set_api_key(api_key)
+
+ if logged_in and allow_logout is None:
+ allow_logout = True
+
+ self._set_logged_in(logged_in)
+
+ if allow_logout:
+ self.set_allow_logout(True)
+ elif allow_logout is False:
+ self.set_allow_logout(False)
+
+
+def ask_to_login(url=None, username=None, always_on_top=False):
+ """Ask user to login using Qt dialog.
+
+    The function creates a new QApplication if one does not exist yet.
+
+ Args:
+ url (Optional[str]): Server url that will be prefilled in dialog.
+ username (Optional[str]): Username that will be prefilled in dialog.
+ always_on_top (Optional[bool]): Window will be drawn on top of
+ other windows.
+
+ Returns:
+        tuple[str, str, str]: Url, user's token and username. The url can
+            change during the dialog's lifetime, which is why it is returned.
+ """
+
+ app_instance = get_qt_app()
+
+ window = ServerLoginWindow()
+ if always_on_top:
+ window.setWindowFlags(
+ window.windowFlags()
+ | QtCore.Qt.WindowStaysOnTopHint
+ )
+
+ if url:
+ window.set_url(url)
+
+ if username:
+ window.set_username(username)
+
+ if not app_instance.startingUp():
+ window.exec_()
+ else:
+ window.open()
+ app_instance.exec_()
+ result = window.result()
+ out_url, out_token, out_username, _ = result
+ return out_url, out_token, out_username
+
+
+def change_user(url, username, api_key, always_on_top=False):
+ """Ask user to login using Qt dialog.
+
+ Function creates new QApplication if is not created yet.
+
+ Args:
+ url (str): Server url that will be prefilled in dialog.
+ username (str): Username that will be prefilled in dialog.
+ api_key (str): API key that will be prefilled in dialog.
+ always_on_top (Optional[bool]): Window will be drawn on top of
+ other windows.
+
+ Returns:
+        Tuple[Union[str, None], Union[str, None], Union[str, None], bool]:
+            Url, token, username and whether the user logged out. The url can
+            change during the dialog's lifetime, which is why it is returned.
+ """
+
+ app_instance = get_qt_app()
+ window = ServerLoginWindow()
+ if always_on_top:
+ window.setWindowFlags(
+ window.windowFlags()
+ | QtCore.Qt.WindowStaysOnTopHint
+ )
+ window.set_logged_in(True, url, username, api_key)
+
+ if not app_instance.startingUp():
+ window.exec_()
+ else:
+ window.open()
+ # This can become main Qt loop. Maybe should live elsewhere
+ app_instance.exec_()
+ return window.result()
diff --git a/common/ayon_common/connection/ui/widgets.py b/common/ayon_common/connection/ui/widgets.py
new file mode 100644
index 0000000000..78b73e056d
--- /dev/null
+++ b/common/ayon_common/connection/ui/widgets.py
@@ -0,0 +1,47 @@
+from qtpy import QtWidgets, QtCore, QtGui
+
+
+class PressHoverButton(QtWidgets.QPushButton):
+ """Keep track about mouse press/release and enter/leave."""
+
+ _mouse_pressed = False
+ _mouse_hovered = False
+ change_state = QtCore.Signal(bool)
+
+ def mousePressEvent(self, event):
+ self._mouse_pressed = True
+ self._mouse_hovered = True
+ self.change_state.emit(self._mouse_hovered)
+ super(PressHoverButton, self).mousePressEvent(event)
+
+ def mouseReleaseEvent(self, event):
+ self._mouse_pressed = False
+ self._mouse_hovered = False
+ self.change_state.emit(self._mouse_hovered)
+ super(PressHoverButton, self).mouseReleaseEvent(event)
+
+ def mouseMoveEvent(self, event):
+ mouse_pos = self.mapFromGlobal(QtGui.QCursor.pos())
+ under_mouse = self.rect().contains(mouse_pos)
+ if under_mouse != self._mouse_hovered:
+ self._mouse_hovered = under_mouse
+ self.change_state.emit(self._mouse_hovered)
+
+ super(PressHoverButton, self).mouseMoveEvent(event)
+
+
+class PlaceholderLineEdit(QtWidgets.QLineEdit):
+ """Set placeholder color of QLineEdit in Qt 5.12 and higher."""
+
+ def __init__(self, *args, **kwargs):
+ super(PlaceholderLineEdit, self).__init__(*args, **kwargs)
+ # Change placeholder palette color
+ if hasattr(QtGui.QPalette, "PlaceholderText"):
+ filter_palette = self.palette()
+ color = QtGui.QColor("#D3D8DE")
+ color.setAlpha(67)
+ filter_palette.setColor(
+ QtGui.QPalette.PlaceholderText,
+ color
+ )
+ self.setPalette(filter_palette)
diff --git a/common/openpype_common/distribution/README.md b/common/ayon_common/distribution/README.md
similarity index 96%
rename from common/openpype_common/distribution/README.md
rename to common/ayon_common/distribution/README.md
index 212eb267b8..f1c34ba722 100644
--- a/common/openpype_common/distribution/README.md
+++ b/common/ayon_common/distribution/README.md
@@ -5,7 +5,7 @@ Code in this folder is backend portion of Addon distribution logic for v4 server
Each host, module will be separate Addon in the future. Each v4 server could run different set of Addons.
-Client (running on artist machine) will in the first step ask v4 for list of enabled addons.
+Client (running on artist machine) will in the first step ask v4 for list of enabled addons.
(It expects list of json documents matching to `addon_distribution.py:AddonInfo` object.)
Next it will compare presence of enabled addon version in local folder. In the case of missing version of
an addon, client will use information in the addon to download (from http/shared local disk/git) zip file
@@ -15,4 +15,4 @@ Required part of addon distribution will be sharing of dependencies (python libr
Location of this folder might change in the future as it will be required for a clint to add this folder to sys.path reliably.
-This code needs to be independent on Openpype code as much as possible!
\ No newline at end of file
+This code needs to be independent on Openpype code as much as possible!
diff --git a/common/ayon_common/distribution/__init__.py b/common/ayon_common/distribution/__init__.py
new file mode 100644
index 0000000000..e3c0f0e161
--- /dev/null
+++ b/common/ayon_common/distribution/__init__.py
@@ -0,0 +1,9 @@
+from .control import AyonDistribution, BundleNotFoundError
+from .utils import show_missing_bundle_information
+
+
+__all__ = (
+ "AyonDistribution",
+ "BundleNotFoundError",
+ "show_missing_bundle_information",
+)
diff --git a/common/ayon_common/distribution/control.py b/common/ayon_common/distribution/control.py
new file mode 100644
index 0000000000..95c221d753
--- /dev/null
+++ b/common/ayon_common/distribution/control.py
@@ -0,0 +1,1116 @@
+import os
+import sys
+import json
+import traceback
+import collections
+import datetime
+import logging
+import shutil
+import threading
+import platform
+import attr
+from enum import Enum
+
+import ayon_api
+
+from ayon_common.utils import is_staging_enabled
+
+from .utils import (
+ get_addons_dir,
+ get_dependencies_dir,
+)
+from .downloaders import get_default_download_factory
+from .data_structures import (
+ AddonInfo,
+ DependencyItem,
+ Bundle,
+)
+
+NOT_SET = type("UNKNOWN", (), {"__bool__": lambda: False})()
+
+
+class BundleNotFoundError(Exception):
+ """Bundle name is defined but is not available on server.
+
+ Args:
+ bundle_name (str): Name of bundle that was not found.
+ """
+
+ def __init__(self, bundle_name):
+ self.bundle_name = bundle_name
+ super().__init__(
+ f"Bundle '{bundle_name}' is not available on server"
+ )
+
+
+class UpdateState(Enum):
+ UNKNOWN = "unknown"
+ UPDATED = "udated"
+ OUTDATED = "outdated"
+ UPDATE_FAILED = "failed"
+ MISS_SOURCE_FILES = "miss_source_files"
+
+
+class DistributeTransferProgress:
+ """Progress of single source item in 'DistributionItem'.
+
+ The item is to keep track of single source item.
+ """
+
+ def __init__(self):
+ self._transfer_progress = ayon_api.TransferProgress()
+ self._started = False
+ self._failed = False
+ self._fail_reason = None
+ self._unzip_started = False
+ self._unzip_finished = False
+ self._hash_check_started = False
+ self._hash_check_finished = False
+
+ def set_started(self):
+ """Call when source distribution starts."""
+
+ self._started = True
+
+ def set_failed(self, reason):
+ """Set source distribution as failed.
+
+ Args:
+ reason (str): Error message why the transfer failed.
+ """
+
+ self._failed = True
+ self._fail_reason = reason
+
+ def set_hash_check_started(self):
+ """Call just before hash check starts."""
+
+ self._hash_check_started = True
+
+ def set_hash_check_finished(self):
+ """Call just after hash check finishes."""
+
+ self._hash_check_finished = True
+
+ def set_unzip_started(self):
+ """Call just before unzip starts."""
+
+ self._unzip_started = True
+
+ def set_unzip_finished(self):
+ """Call just after unzip finishes."""
+
+ self._unzip_finished = True
+
+ @property
+ def is_running(self):
+ """Source distribution is in progress.
+
+ Returns:
+ bool: Transfer is in progress.
+ """
+
+ return bool(
+ self._started
+ and not self._failed
+ and not self._hash_check_finished
+ )
+
+ @property
+ def transfer_progress(self):
+ """Source file 'download' progress tracker.
+
+ Returns:
+            ayon_api.TransferProgress: Content download progress.
+ """
+
+ return self._transfer_progress
+
+ @property
+ def started(self):
+ return self._started
+
+ @property
+ def hash_check_started(self):
+ return self._hash_check_started
+
+ @property
+ def hash_check_finished(self):
+        return self._hash_check_finished
+
+ @property
+ def unzip_started(self):
+ return self._unzip_started
+
+ @property
+ def unzip_finished(self):
+ return self._unzip_finished
+
+ @property
+ def failed(self):
+ return self._failed or self._transfer_progress.failed
+
+ @property
+ def fail_reason(self):
+ return self._fail_reason or self._transfer_progress.fail_reason
+
+
+class DistributionItem:
+ """Distribution item with sources and target directories.
+
+    A distribution item can be an addon or a dependency package. The item may
+    already be distributed and not need any processing. The item keeps track
+    of the progress. The reason is to be able to use the distribution items
+    as source data for UI without implementing the same logic.
+
+ Distribution is "state" based. Distribution can be 'UPDATED' or 'OUTDATED'
+ at the initialization. If item is 'UPDATED' the distribution is skipped
+ and 'OUTDATED' will trigger the distribution process.
+
+    Because the distribution may have multiple sources, each source has its
+    own progress item.
+
+ Args:
+ state (UpdateState): Initial state (UpdateState.UPDATED or
+ UpdateState.OUTDATED).
+        unzip_dirpath (str): Path to directory where the file is unzipped.
+        download_dirpath (str): Path to directory where the zip is downloaded.
+ file_hash (str): Hash of file for validation.
+ factory (DownloadFactory): Downloaders factory object.
+ sources (List[SourceInfo]): Possible sources to receive the
+ distribution item.
+ downloader_data (Dict[str, Any]): More information for downloaders.
+ item_label (str): Label used in log outputs (and in UI).
+ logger (logging.Logger): Logger object.
+ """
+
+ def __init__(
+ self,
+ state,
+ unzip_dirpath,
+ download_dirpath,
+ file_hash,
+ factory,
+ sources,
+ downloader_data,
+ item_label,
+ logger=None,
+ ):
+ if logger is None:
+ logger = logging.getLogger(self.__class__.__name__)
+ self.log = logger
+ self.state = state
+ self.unzip_dirpath = unzip_dirpath
+ self.download_dirpath = download_dirpath
+ self.file_hash = file_hash
+ self.factory = factory
+ self.sources = [
+ (source, DistributeTransferProgress())
+ for source in sources
+ ]
+ self.downloader_data = downloader_data
+ self.item_label = item_label
+
+ self._need_distribution = state != UpdateState.UPDATED
+ self._current_source_progress = None
+ self._used_source_progress = None
+ self._used_source = None
+ self._dist_started = False
+ self._dist_finished = False
+
+ self._error_msg = None
+ self._error_detail = None
+
+ @property
+ def need_distribution(self):
+ """Need distribution based on initial state.
+
+ Returns:
+ bool: Need distribution.
+ """
+
+ return self._need_distribution
+
+ @property
+ def current_source_progress(self):
+ """Currently processed source progress object.
+
+ Returns:
+ Union[DistributeTransferProgress, None]: Transfer progress or None.
+ """
+
+ return self._current_source_progress
+
+ @property
+ def used_source_progress(self):
+ """Transfer progress that successfully distributed the item.
+
+ Returns:
+ Union[DistributeTransferProgress, None]: Transfer progress or None.
+ """
+
+ return self._used_source_progress
+
+ @property
+ def used_source(self):
+ """Data of source item.
+
+ Returns:
+ Union[Dict[str, Any], None]: SourceInfo data or None.
+ """
+
+ return self._used_source
+
+ @property
+ def error_message(self):
+ """Reason why distribution item failed.
+
+ Returns:
+ Union[str, None]: Error message.
+ """
+
+ return self._error_msg
+
+ @property
+ def error_detail(self):
+ """Detailed reason why distribution item failed.
+
+ Returns:
+ Union[str, None]: Detailed information (maybe traceback).
+ """
+
+ return self._error_detail
+
+ def _distribute(self):
+ if not self.sources:
+ message = (
+ f"{self.item_label}: Don't have"
+ " any sources to download from."
+ )
+ self.log.error(message)
+ self._error_msg = message
+ self.state = UpdateState.MISS_SOURCE_FILES
+ return
+
+ download_dirpath = self.download_dirpath
+ unzip_dirpath = self.unzip_dirpath
+ for source, source_progress in self.sources:
+ self._current_source_progress = source_progress
+ source_progress.set_started()
+
+ # Remove directory if exists
+ if os.path.isdir(unzip_dirpath):
+ self.log.debug(f"Cleaning {unzip_dirpath}")
+ shutil.rmtree(unzip_dirpath)
+
+ # Create directory
+ os.makedirs(unzip_dirpath)
+ if not os.path.isdir(download_dirpath):
+ os.makedirs(download_dirpath)
+
+ try:
+ downloader = self.factory.get_downloader(source.type)
+ except Exception:
+ message = f"Unknown downloader {source.type}"
+ source_progress.set_failed(message)
+ self.log.warning(message, exc_info=True)
+ continue
+
+ source_data = attr.asdict(source)
+ cleanup_args = (
+ source_data,
+ download_dirpath,
+ self.downloader_data
+ )
+
+ try:
+ zip_filepath = downloader.download(
+ source_data,
+ download_dirpath,
+ self.downloader_data,
+ source_progress.transfer_progress,
+ )
+ except Exception:
+ message = "Failed to download source"
+ source_progress.set_failed(message)
+ self.log.warning(
+ f"{self.item_label}: {message}",
+ exc_info=True
+ )
+ downloader.cleanup(*cleanup_args)
+ continue
+
+ source_progress.set_hash_check_started()
+ try:
+ downloader.check_hash(zip_filepath, self.file_hash)
+ except Exception:
+ message = "File hash does not match"
+ source_progress.set_failed(message)
+ self.log.warning(
+ f"{self.item_label}: {message}",
+ exc_info=True
+ )
+ downloader.cleanup(*cleanup_args)
+ continue
+
+ source_progress.set_hash_check_finished()
+ source_progress.set_unzip_started()
+ try:
+ downloader.unzip(zip_filepath, unzip_dirpath)
+ except Exception:
+ message = "Couldn't unzip source file"
+ source_progress.set_failed(message)
+ self.log.warning(
+ f"{self.item_label}: {message}",
+ exc_info=True
+ )
+ downloader.cleanup(*cleanup_args)
+ continue
+
+ source_progress.set_unzip_finished()
+ downloader.cleanup(*cleanup_args)
+ self.state = UpdateState.UPDATED
+ self._used_source = source_data
+ break
+
+ last_progress = self._current_source_progress
+ self._current_source_progress = None
+ if self.state == UpdateState.UPDATED:
+ self._used_source_progress = last_progress
+ self.log.info(f"{self.item_label}: Distributed")
+ return
+
+ self.log.error(f"{self.item_label}: Failed to distribute")
+ self._error_msg = "Failed to receive or install source files"
+
+ def distribute(self):
+ """Execute distribution logic."""
+
+ if not self.need_distribution or self._dist_started:
+ return
+
+ self._dist_started = True
+ try:
+ if self.state == UpdateState.OUTDATED:
+ self._distribute()
+
+ except Exception as exc:
+ self.state = UpdateState.UPDATE_FAILED
+ self._error_msg = str(exc)
+ self._error_detail = "".join(
+ traceback.format_exception(*sys.exc_info())
+ )
+ self.log.error(
+ f"{self.item_label}: Distibution filed",
+ exc_info=True
+ )
+
+ finally:
+ self._dist_finished = True
+ if self.state == UpdateState.OUTDATED:
+ self.state = UpdateState.UPDATE_FAILED
+ self._error_msg = "Distribution failed"
+
+ if (
+ self.state != UpdateState.UPDATED
+ and self.unzip_dirpath
+ and os.path.isdir(self.unzip_dirpath)
+ ):
+ self.log.debug(f"Cleaning {self.unzip_dirpath}")
+ shutil.rmtree(self.unzip_dirpath)
+
+
+class AyonDistribution:
+ """Distribution control.
+
+ Receive information from server what addons and dependency packages
+ should be available locally and prepare/validate their distribution.
+
+    Arguments are exposed mainly to allow testing of the class.
+
+ Args:
+ addon_dirpath (Optional[str]): Where addons will be stored.
+ dependency_dirpath (Optional[str]): Where dependencies will be stored.
+ dist_factory (Optional[DownloadFactory]): Factory which cares about
+ downloading of items based on source type.
+        addons_info (Optional[list[dict[str, Any]]]): List of prepared
+            addons' info.
+        dependency_packages_info (Optional[list[dict[str, Any]]]): Info
+            about packages from server.
+        bundles_info (Optional[Dict[str, Any]]): Info about
+            bundles.
+        bundle_name (Optional[str]): Name of bundle to use. If not passed,
+            the environment variable 'AYON_BUNDLE_NAME' is checked for a
+            value. When neither is available the bundle is defined by the
+            'use_staging' value.
+ use_staging (Optional[bool]): Use staging versions of an addon.
+ If not passed, 'is_staging_enabled' is used as default value.
+ """
+
+ def __init__(
+ self,
+ addon_dirpath=None,
+ dependency_dirpath=None,
+ dist_factory=None,
+ addons_info=NOT_SET,
+ dependency_packages_info=NOT_SET,
+ bundles_info=NOT_SET,
+ bundle_name=NOT_SET,
+ use_staging=None
+ ):
+ self._log = None
+
+ self._dist_started = False
+ self._dist_finished = False
+
+ self._addons_dirpath = addon_dirpath or get_addons_dir()
+ self._dependency_dirpath = dependency_dirpath or get_dependencies_dir()
+ self._dist_factory = (
+ dist_factory or get_default_download_factory()
+ )
+
+ if bundle_name is NOT_SET:
+ bundle_name = os.environ.get("AYON_BUNDLE_NAME", NOT_SET)
+
+ # Raw addons data from server
+ self._addons_info = addons_info
+ # Prepared data as Addon objects
+ self._addon_items = NOT_SET
+        # Distribution items of addons
+ # - only those addons and versions that should be distributed
+ self._addon_dist_items = NOT_SET
+
+ # Raw dependency packages data from server
+ self._dependency_packages_info = dependency_packages_info
+ # Prepared dependency packages as objects
+ self._dependency_packages_items = NOT_SET
+ # Dependency package item that should be used
+ self._dependency_package_item = NOT_SET
+ # Distribution item of dependency package
+ self._dependency_dist_item = NOT_SET
+
+ # Raw bundles data from server
+ self._bundles_info = bundles_info
+ # Bundles as objects
+ self._bundle_items = NOT_SET
+
+ # Bundle that should be used in production
+ self._production_bundle = NOT_SET
+ # Bundle that should be used in staging
+ self._staging_bundle = NOT_SET
+ # Boolean that defines if staging bundle should be used
+ self._use_staging = use_staging
+
+ # Specific bundle name should be used
+ self._bundle_name = bundle_name
+ # Final bundle that will be used
+ self._bundle = NOT_SET
+
+ @property
+ def use_staging(self):
+ """Staging version of a bundle should be used.
+
+ This value is completely ignored if specific bundle name should
+ be used.
+
+ Returns:
+ bool: True if staging version should be used.
+ """
+
+ if self._use_staging is None:
+ self._use_staging = is_staging_enabled()
+ return self._use_staging
+
+ @property
+ def log(self):
+ """Helper to access logger.
+
+ Returns:
+ logging.Logger: Logger instance.
+ """
+ if self._log is None:
+ self._log = logging.getLogger(self.__class__.__name__)
+ return self._log
+
+ @property
+ def bundles_info(self):
+ """
+
+ Returns:
+ dict[str, dict[str, Any]]: Bundles information from server.
+ """
+
+ if self._bundles_info is NOT_SET:
+ self._bundles_info = ayon_api.get_bundles()
+ return self._bundles_info
+
+ @property
+ def bundle_items(self):
+ """
+
+ Returns:
+ list[Bundle]: List of bundles info.
+ """
+
+ if self._bundle_items is NOT_SET:
+ self._bundle_items = [
+ Bundle.from_dict(info)
+ for info in self.bundles_info["bundles"]
+ ]
+ return self._bundle_items
+
+ def _prepare_production_staging_bundles(self):
+ production_bundle = None
+ staging_bundle = None
+ for bundle in self.bundle_items:
+ if bundle.is_production:
+ production_bundle = bundle
+ if bundle.is_staging:
+ staging_bundle = bundle
+ self._production_bundle = production_bundle
+ self._staging_bundle = staging_bundle
+
+ @property
+ def production_bundle(self):
+ """
+ Returns:
+ Union[Bundle, None]: Bundle that should be used in production.
+ """
+
+ if self._production_bundle is NOT_SET:
+ self._prepare_production_staging_bundles()
+ return self._production_bundle
+
+ @property
+ def staging_bundle(self):
+ """
+ Returns:
+ Union[Bundle, None]: Bundle that should be used in staging.
+ """
+
+ if self._staging_bundle is NOT_SET:
+ self._prepare_production_staging_bundles()
+ return self._staging_bundle
+
+ @property
+ def bundle_to_use(self):
+ """Bundle that will be used for distribution.
+
+ Bundle that should be used can be affected by 'bundle_name'
+ or 'use_staging'.
+
+ Returns:
+ Union[Bundle, None]: Bundle that will be used for distribution
+ or None.
+
+ Raises:
+ BundleNotFoundError: When bundle name to use is defined
+ but is not available on server.
+ """
+
+ if self._bundle is NOT_SET:
+ if self._bundle_name is not NOT_SET:
+ bundle = next(
+ (
+ bundle
+ for bundle in self.bundle_items
+ if bundle.name == self._bundle_name
+ ),
+ None
+ )
+ if bundle is None:
+ raise BundleNotFoundError(self._bundle_name)
+
+ self._bundle = bundle
+ elif self.use_staging:
+ self._bundle = self.staging_bundle
+ else:
+ self._bundle = self.production_bundle
+ return self._bundle
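+
+    # Illustrative resolution order (sketch only, hypothetical names):
+    #     AyonDistribution(bundle_name="my-bundle")  # always "my-bundle"
+    #     AyonDistribution(use_staging=True)         # staging bundle
+    #     AyonDistribution()                         # production bundle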
+
+ @property
+ def bundle_name_to_use(self):
+ bundle = self.bundle_to_use
+ return None if bundle is None else bundle.name
+
+ @property
+ def addons_info(self):
+ """Server information about available addons.
+
+ Returns:
+            Dict[str, dict[str, Any]]: Addon info by addon name.
+ """
+
+ if self._addons_info is NOT_SET:
+ server_info = ayon_api.get_addons_info(details=True)
+ self._addons_info = server_info["addons"]
+ return self._addons_info
+
+ @property
+ def addon_items(self):
+ """Information about available addons on server.
+
+        Addons may require distribution of files. For those addons a
+        'DistributionItem' handling the distribution itself is created.
+
+ Returns:
+ Dict[str, AddonInfo]: Addon info object by addon name.
+ """
+
+ if self._addon_items is NOT_SET:
+ addons_info = {}
+ for addon in self.addons_info:
+ addon_info = AddonInfo.from_dict(addon)
+ addons_info[addon_info.name] = addon_info
+ self._addon_items = addons_info
+ return self._addon_items
+
+ @property
+ def dependency_packages_info(self):
+ """Server information about available dependency packages.
+
+ Notes:
+ For testing purposes it is possible to pass dependency packages
+ information to '__init__'.
+
+ Returns:
+ list[dict[str, Any]]: Dependency packages information.
+ """
+
+ if self._dependency_packages_info is NOT_SET:
+ self._dependency_packages_info = (
+ ayon_api.get_dependency_packages())["packages"]
+ return self._dependency_packages_info
+
+ @property
+ def dependency_packages_items(self):
+ """Dependency packages as objects.
+
+ Returns:
+ dict[str, DependencyItem]: Dependency packages as objects by name.
+ """
+
+ if self._dependency_packages_items is NOT_SET:
+            dependency_package_items = {}
+            for item in self.dependency_packages_info:
+                item = DependencyItem.from_dict(item)
+                dependency_package_items[item.name] = item
+            self._dependency_packages_items = dependency_package_items
+ return self._dependency_packages_items
+
+ @property
+ def dependency_package_item(self):
+ """Dependency package item that should be used by bundle.
+
+ Returns:
+            Union[DependencyItem, None]: Dependency package item to use, or
+                None if the bundle does not have a dependency package
+                specified.
+ """
+
+ if self._dependency_package_item is NOT_SET:
+ dependency_package_item = None
+ bundle = self.bundle_to_use
+ if bundle is not None:
+ package_name = bundle.dependency_packages.get(
+ platform.system().lower()
+ )
+ dependency_package_item = self.dependency_packages_items.get(
+ package_name)
+ self._dependency_package_item = dependency_package_item
+ return self._dependency_package_item
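+
+    # Illustrative bundle data (hypothetical values): with
+    #     bundle.dependency_packages == {
+    #         "windows": "ayon_2305_win", "linux": "ayon_2305_lin"}
+    # a Windows machine resolves 'platform.system().lower()' to "windows"
+    # and therefore uses the "ayon_2305_win" dependency package item.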
+
+ def _prepare_current_addon_dist_items(self):
+ addons_metadata = self.get_addons_metadata()
+ output = []
+ addon_versions = {}
+ bundle = self.bundle_to_use
+ if bundle is not None:
+ addon_versions = bundle.addon_versions
+ for addon_name, addon_item in self.addon_items.items():
+ addon_version = addon_versions.get(addon_name)
+ # Addon is not in bundle -> Skip
+ if addon_version is None:
+ continue
+
+ addon_version_item = addon_item.versions.get(addon_version)
+ # Addon version is not available in addons info
+ # - TODO handle this case (raise error, skip, store, report, ...)
+ if addon_version_item is None:
+ print(
+ f"Version '{addon_version}' of addon '{addon_name}'"
+ " is not available on server."
+ )
+ continue
+
+ if not addon_version_item.require_distribution:
+ continue
+ full_name = addon_version_item.full_name
+ addon_dest = os.path.join(self._addons_dirpath, full_name)
+ self.log.debug(f"Checking {full_name} in {addon_dest}")
+ addon_in_metadata = (
+ addon_name in addons_metadata
+ and addon_version_item.version in addons_metadata[addon_name]
+ )
+ if addon_in_metadata and os.path.isdir(addon_dest):
+ self.log.debug(
+ f"Addon version folder {addon_dest} already exists."
+ )
+ state = UpdateState.UPDATED
+
+ else:
+ state = UpdateState.OUTDATED
+
+ downloader_data = {
+ "type": "addon",
+ "name": addon_name,
+ "version": addon_version
+ }
+
+ dist_item = DistributionItem(
+ state,
+ addon_dest,
+ addon_dest,
+ addon_version_item.hash,
+ self._dist_factory,
+ list(addon_version_item.sources),
+ downloader_data,
+ full_name,
+ self.log
+ )
+ output.append({
+ "dist_item": dist_item,
+ "addon_name": addon_name,
+ "addon_version": addon_version,
+ "addon_item": addon_item,
+ "addon_version_item": addon_version_item,
+ })
+ return output
+
+ def _prepare_dependency_progress(self):
+ package = self.dependency_package_item
+ if package is None:
+ return None
+
+ metadata = self.get_dependency_metadata()
+ downloader_data = {
+ "type": "dependency_package",
+ "name": package.name,
+ "platform": package.platform_name
+ }
+ zip_dir = package_dir = os.path.join(
+ self._dependency_dirpath, package.name
+ )
+ self.log.debug(f"Checking {package.name} in {package_dir}")
+
+ if not os.path.isdir(package_dir) or package.name not in metadata:
+ state = UpdateState.OUTDATED
+ else:
+ state = UpdateState.UPDATED
+
+ return DistributionItem(
+ state,
+ zip_dir,
+ package_dir,
+ package.checksum,
+ self._dist_factory,
+ package.sources,
+ downloader_data,
+ package.name,
+ self.log,
+ )
+
+ def get_addon_dist_items(self):
+ """Addon distribution items.
+
+        These items describe source files an addon requires to be available
+        on the machine. Each item may have 0-n sources from which it can be
+        obtained. If a file is already available its state will be 'UPDATED'.
+
+ Example output:
+ [
+ {
+ "dist_item": DistributionItem,
+ "addon_name": str,
+ "addon_version": str,
+ "addon_item": AddonInfo,
+ "addon_version_item": AddonVersionInfo
+ }, {
+ ...
+ }
+ ]
+
+ Returns:
+ list[dict[str, Any]]: Distribution items with addon version item.
+ """
+
+ if self._addon_dist_items is NOT_SET:
+ self._addon_dist_items = (
+ self._prepare_current_addon_dist_items())
+ return self._addon_dist_items
+
+ def get_dependency_dist_item(self):
+ """Dependency package distribution item.
+
+        The item describes source files the server requires to be available
+        on the machine. The item may have 0-n sources from which it can be
+        obtained. If the file is already available its state will be
+        'UPDATED'.
+
+        'None' is returned if the server does not have any dependency
+        package defined.
+
+ Returns:
+ Union[None, DistributionItem]: Dependency item or None if server
+ does not have specified any dependency package.
+ """
+
+ if self._dependency_dist_item is NOT_SET:
+ self._dependency_dist_item = self._prepare_dependency_progress()
+ return self._dependency_dist_item
+
+ def get_dependency_metadata_filepath(self):
+ """Path to distribution metadata file.
+
+        Metadata contains information about distributed packages: the used
+        source, expected file hash and time when the file was distributed.
+
+ Returns:
+ str: Path to a file where dependency package metadata are stored.
+ """
+
+ return os.path.join(self._dependency_dirpath, "dependency.json")
+
+ def get_addons_metadata_filepath(self):
+ """Path to addons metadata file.
+
+ Metadata contains information about distributed addons: the used
+ sources, expected file hashes and the times when files were distributed.
+
+ Returns:
+ str: Path to a file where addons metadata are stored.
+ """
+
+ return os.path.join(self._addons_dirpath, "addons.json")
+
+ def read_metadata_file(self, filepath, default_value=None):
+ """Read json file from path.
+
+ Returns the default value when the file does not exist or does not
+ contain valid json.
+
+ Args:
+ filepath (str): Path to json file.
+ default_value (Union[Dict[str, Any], List[Any], None]): Default
+ value if the file is not available (or not valid).
+
+ Returns:
+ Union[Dict[str, Any], List[Any]]: Value from file.
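+
+ Example (illustrative; assumes 'dist' is an initialized AyonDistribution
+ and the file content shown is hypothetical):
+ >>> dist.read_metadata_file(dist.get_addons_metadata_filepath(), {})
+ {'slack': {'1.0.0': {'source': {...}, 'file_hash': '...'}}}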
+ """
+
+ if default_value is None:
+ default_value = {}
+
+ if not os.path.exists(filepath):
+ return default_value
+
+ try:
+ with open(filepath, "r") as stream:
+ data = json.load(stream)
+ except ValueError:
+ data = default_value
+ return data
+
+ def save_metadata_file(self, filepath, data):
+ """Store data to json file.
+
+ Missing parent directories are created when the file does not exist yet.
+
+ Args:
+ filepath (str): Path to json file.
+ data (Union[Dict[str, Any], List[Any]]): Data to store into file.
+ """
+
+ if not os.path.exists(filepath):
+ dirpath = os.path.dirname(filepath)
+ if not os.path.exists(dirpath):
+ os.makedirs(dirpath)
+ with open(filepath, "w") as stream:
+ json.dump(data, stream, indent=4)
+
+ def get_dependency_metadata(self):
+ filepath = self.get_dependency_metadata_filepath()
+ return self.read_metadata_file(filepath, {})
+
+ def update_dependency_metadata(self, package_name, data):
+ dependency_metadata = self.get_dependency_metadata()
+ dependency_metadata[package_name] = data
+ filepath = self.get_dependency_metadata_filepath()
+ self.save_metadata_file(filepath, dependency_metadata)
+
+ def get_addons_metadata(self):
+ filepath = self.get_addons_metadata_filepath()
+ return self.read_metadata_file(filepath, {})
+
+ def update_addons_metadata(self, addons_information):
+ if not addons_information:
+ return
+ addons_metadata = self.get_addons_metadata()
+ for addon_name, version_value in addons_information.items():
+ if addon_name not in addons_metadata:
+ addons_metadata[addon_name] = {}
+ for addon_version, version_data in version_value.items():
+ addons_metadata[addon_name][addon_version] = version_data
+
+ filepath = self.get_addons_metadata_filepath()
+ self.save_metadata_file(filepath, addons_metadata)
+
+ def finish_distribution(self):
+ """Store metadata about distributed items."""
+
+ self._dist_finished = True
+ stored_time = datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")
+ dependency_dist_item = self.get_dependency_dist_item()
+ if (
+ dependency_dist_item is not None
+ and dependency_dist_item.need_distribution
+ and dependency_dist_item.state == UpdateState.UPDATED
+ ):
+ package = self.dependency_package
+ source = dependency_dist_item.used_source
+ if source is not None:
+ data = {
+ "source": source,
+ "file_hash": dependency_dist_item.file_hash,
+ "distributed_dt": stored_time
+ }
+ self.update_dependency_metadata(package.name, data)
+
+ addons_info = {}
+ for item in self.get_addon_dist_items():
+ dist_item = item["dist_item"]
+ if (
+ not dist_item.need_distribution
+ or dist_item.state != UpdateState.UPDATED
+ ):
+ continue
+
+ source_data = dist_item.used_source
+ if not source_data:
+ continue
+
+ addon_name = item["addon_name"]
+ addon_version = item["addon_version"]
+ addons_info.setdefault(addon_name, {})
+ addons_info[addon_name][addon_version] = {
+ "source": source_data,
+ "file_hash": dist_item.file_hash,
+ "distributed_dt": stored_time
+ }
+
+ self.update_addons_metadata(addons_info)
+
+ def get_all_distribution_items(self):
+ """Distribution items required by server.
+
+ Items contain dependency package item and all addons that are enabled
+ and have distribution requirements.
+
+ Items can be already available on machine.
+
+ Returns:
+ List[DistributionItem]: Distribution items required by server.
+ """
+
+ output = [
+ item["dist_item"]
+ for item in self.get_addon_dist_items()
+ ]
+ dependency_dist_item = self.get_dependency_dist_item()
+ if dependency_dist_item is not None:
+ output.insert(0, dependency_dist_item)
+
+ return output
+
+ def distribute(self, threaded=False):
+ """Distribute all missing items.
+
+ The method tries to distribute all items that are required by the
+ server.
+
+ This method does not handle failed items. To validate the result, call
+ 'validate_distribution' after this method finishes.
+
+ Args:
+ threaded (bool): Distribute items in threads.
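+
+ Example (illustrative; assumes 'dist' is an initialized
+ AyonDistribution):
+ >>> dist.distribute(threaded=True)
+ >>> dist.validate_distribution()  # raises RuntimeError on failure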
+ """
+
+ if self._dist_started:
+ raise RuntimeError("Distribution already started")
+ self._dist_started = True
+ threads = collections.deque()
+ for item in self.get_all_distribution_items():
+ if threaded:
+ threads.append(threading.Thread(target=item.distribute))
+ else:
+ item.distribute()
+
+ while threads:
+ thread = threads.popleft()
+ if thread.is_alive():
+ threads.append(thread)
+ else:
+ thread.join()
+
+ self.finish_distribution()
+
+ def validate_distribution(self):
+ """Check if all required distribution items are distributed.
+
+ Raises:
+ RuntimeError: If any of the items is not available.
+ """
+
+ invalid = []
+ dependency_package = self.get_dependency_dist_item()
+ if (
+ dependency_package is not None
+ and dependency_package.state != UpdateState.UPDATED
+ ):
+ invalid.append("Dependency package")
+
+ for item in self.get_addon_dist_items():
+ dist_item = item["dist_item"]
+ if dist_item.state != UpdateState.UPDATED:
+ invalid.append(item["addon_name"])
+
+ if not invalid:
+ return
+
+ raise RuntimeError("Failed to distribute {}".format(
+ ", ".join([f'"{item}"' for item in invalid])
+ ))
+
+ def get_sys_paths(self):
+ """Get all paths to python packages that should be added to python.
+
+ These paths lead to addon directories and python dependencies in
+ dependency package.
+
+ Todos:
+ Add dependency package directory to output. ATM is not structure of
+ dependency package 100% defined.
+
+ Returns:
+ List[str]: Paths that should be added to 'sys.path' and
+ 'PYTHONPATH'.
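+
+ Example (illustrative; assumes 'dist' is an initialized AyonDistribution
+ after a finished distribution):
+ >>> import sys
+ >>> for path in dist.get_sys_paths():
+ ...     sys.path.insert(0, path)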
+ """
+
+ output = []
+ for item in self.get_all_distribution_items():
+ if item.state != UpdateState.UPDATED:
+ continue
+ unzip_dirpath = item.unzip_dirpath
+ if unzip_dirpath and os.path.exists(unzip_dirpath):
+ output.append(unzip_dirpath)
+ return output
+
+
+def cli(*args):
+ raise NotImplementedError
diff --git a/common/ayon_common/distribution/data_structures.py b/common/ayon_common/distribution/data_structures.py
new file mode 100644
index 0000000000..aa93d4ed71
--- /dev/null
+++ b/common/ayon_common/distribution/data_structures.py
@@ -0,0 +1,265 @@
+import attr
+from enum import Enum
+
+
+class UrlType(Enum):
+ HTTP = "http"
+ GIT = "git"
+ FILESYSTEM = "filesystem"
+ SERVER = "server"
+
+
+@attr.s
+class MultiPlatformValue(object):
+ windows = attr.ib(default=None)
+ linux = attr.ib(default=None)
+ darwin = attr.ib(default=None)
+
+
+@attr.s
+class SourceInfo(object):
+ type = attr.ib()
+
+
+@attr.s
+class LocalSourceInfo(SourceInfo):
+ path = attr.ib(default=attr.Factory(MultiPlatformValue))
+
+
+@attr.s
+class WebSourceInfo(SourceInfo):
+ url = attr.ib(default=None)
+ headers = attr.ib(default=None)
+ filename = attr.ib(default=None)
+
+
+@attr.s
+class ServerSourceInfo(SourceInfo):
+ filename = attr.ib(default=None)
+ path = attr.ib(default=None)
+
+
+def convert_source(source):
+ """Create source object from data information.
+
+ Args:
+ source (Dict[str, Any]): Information about source.
+
+ Returns:
+ Union[None, SourceInfo]: Object with source information if type is
+ known.
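+
+ Example (illustrative; the url is hypothetical):
+ >>> convert_source(
+ ...     {"type": "http", "path": "https://example.com/addon.zip"})
+ WebSourceInfo(type='http', url='https://example.com/addon.zip', headers=None, filename=None)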
+ """
+
+ source_type = source.get("type")
+ if not source_type:
+ return None
+
+ if source_type == UrlType.FILESYSTEM.value:
+ return LocalSourceInfo(
+ type=source_type,
+ path=source["path"]
+ )
+
+ if source_type == UrlType.HTTP.value:
+ url = source["path"]
+ return WebSourceInfo(
+ type=source_type,
+ url=url,
+ headers=source.get("headers"),
+ filename=source.get("filename")
+ )
+
+ if source_type == UrlType.SERVER.value:
+ return ServerSourceInfo(
+ type=source_type,
+ filename=source.get("filename"),
+ path=source.get("path")
+ )
+
+
+def prepare_sources(src_sources):
+ sources = []
+ unknown_sources = []
+ for source in (src_sources or []):
+ dependency_source = convert_source(source)
+ if dependency_source is not None:
+ sources.append(dependency_source)
+ else:
+ print(f"Unknown source {source.get('type')}")
+ unknown_sources.append(source)
+ return sources, unknown_sources
+
+
+@attr.s
+class VersionData(object):
+ version_data = attr.ib(default=None)
+
+
+@attr.s
+class AddonVersionInfo(object):
+ version = attr.ib()
+ full_name = attr.ib()
+ title = attr.ib(default=None)
+ require_distribution = attr.ib(default=False)
+ sources = attr.ib(default=attr.Factory(list))
+ unknown_sources = attr.ib(default=attr.Factory(list))
+ hash = attr.ib(default=None)
+
+ @classmethod
+ def from_dict(
+ cls, addon_name, addon_title, addon_version, version_data
+ ):
+ """Addon version info.
+
+ Args:
+ addon_name (str): Name of addon.
+ addon_title (str): Title of addon.
+ addon_version (str): Version of addon.
+ version_data (dict[str, Any]): Addon version information from
+ server.
+
+ Returns:
+ AddonVersionInfo: Addon version info.
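+
+ Example (illustrative; a minimal hypothetical payload without client
+ sources):
+ >>> info = AddonVersionInfo.from_dict(
+ ...     "slack", "Slack addon", "1.0.0", {})
+ >>> info.full_name
+ 'slack_1.0.0'
+ >>> info.require_distribution
+ False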
+ """
+
+ full_name = f"{addon_name}_{addon_version}"
+ title = f"{addon_title} {addon_version}"
+
+ source_info = version_data.get("clientSourceInfo")
+ require_distribution = source_info is not None
+ sources, unknown_sources = prepare_sources(source_info)
+
+ return cls(
+ version=addon_version,
+ full_name=full_name,
+ require_distribution=require_distribution,
+ sources=sources,
+ unknown_sources=unknown_sources,
+ hash=version_data.get("hash"),
+ title=title
+ )
+
+
+@attr.s
+class AddonInfo(object):
+ """Object matching json payload from Server"""
+ name = attr.ib()
+ versions = attr.ib(default=attr.Factory(dict))
+ title = attr.ib(default=None)
+ description = attr.ib(default=None)
+ license = attr.ib(default=None)
+ authors = attr.ib(default=None)
+
+ @classmethod
+ def from_dict(cls, data):
+ """Addon info by available versions.
+
+ Args:
+ data (dict[str, Any]): Addon information from server. Should
+ contain information about every version under 'versions'.
+
+ Returns:
+ AddonInfo: Addon info with available versions.
+ """
+
+ # server payload contains info about all versions
+ addon_name = data["name"]
+ title = data.get("title") or addon_name
+
+ src_versions = data.get("versions") or {}
+ dst_versions = {
+ addon_version: AddonVersionInfo.from_dict(
+ addon_name, title, addon_version, version_data
+ )
+ for addon_version, version_data in src_versions.items()
+ }
+ return cls(
+ name=addon_name,
+ versions=dst_versions,
+ description=data.get("description"),
+ title=data.get("title") or addon_name,
+ license=data.get("license"),
+ authors=data.get("authors")
+ )
+
+
+@attr.s
+class DependencyItem(object):
+ """Object matching payload from Server about single dependency package"""
+ name = attr.ib()
+ platform_name = attr.ib()
+ checksum = attr.ib()
+ sources = attr.ib(default=attr.Factory(list))
+ unknown_sources = attr.ib(default=attr.Factory(list))
+ source_addons = attr.ib(default=attr.Factory(dict))
+ python_modules = attr.ib(default=attr.Factory(dict))
+
+ @classmethod
+ def from_dict(cls, package):
+ src_sources = package.get("sources") or []
+ for source in src_sources:
+ if source.get("type") == "server" and not source.get("filename"):
+ source["filename"] = package["filename"]
+ sources, unknown_sources = prepare_sources(src_sources)
+ return cls(
+ name=package["filename"],
+ platform_name=package["platform"],
+ sources=sources,
+ unknown_sources=unknown_sources,
+ checksum=package["checksum"],
+ source_addons=package["sourceAddons"],
+ python_modules=package["pythonModules"]
+ )
+
+
+@attr.s
+class Installer:
+ version = attr.ib()
+ filename = attr.ib()
+ platform_name = attr.ib()
+ size = attr.ib()
+ checksum = attr.ib()
+ python_version = attr.ib()
+ python_modules = attr.ib()
+ sources = attr.ib(default=attr.Factory(list))
+ unknown_sources = attr.ib(default=attr.Factory(list))
+
+ @classmethod
+ def from_dict(cls, installer_info):
+ sources, unknown_sources = prepare_sources(
+ installer_info.get("sources"))
+
+ return cls(
+ version=installer_info["version"],
+ filename=installer_info["filename"],
+ platform_name=installer_info["platform"],
+ size=installer_info["size"],
+ sources=sources,
+ unknown_sources=unknown_sources,
+ checksum=installer_info["checksum"],
+ python_version=installer_info["pythonVersion"],
+ python_modules=installer_info["pythonModules"]
+ )
+
+
+@attr.s
+class Bundle:
+ """Class representing bundle information."""
+
+ name = attr.ib()
+ installer_version = attr.ib()
+ addon_versions = attr.ib(default=attr.Factory(dict))
+ dependency_packages = attr.ib(default=attr.Factory(dict))
+ is_production = attr.ib(default=False)
+ is_staging = attr.ib(default=False)
+
+ @classmethod
+ def from_dict(cls, data):
+ return cls(
+ name=data["name"],
+ installer_version=data.get("installerVersion"),
+ addon_versions=data.get("addons", {}),
+ dependency_packages=data.get("dependencyPackages", {}),
+ is_production=data["isProduction"],
+ is_staging=data["isStaging"],
+ )
diff --git a/common/ayon_common/distribution/downloaders.py b/common/ayon_common/distribution/downloaders.py
new file mode 100644
index 0000000000..23280176c3
--- /dev/null
+++ b/common/ayon_common/distribution/downloaders.py
@@ -0,0 +1,250 @@
+import os
+import logging
+import platform
+from abc import ABCMeta, abstractmethod
+
+import ayon_api
+
+from .file_handler import RemoteFileHandler
+from .data_structures import UrlType
+
+
+class SourceDownloader(metaclass=ABCMeta):
+ """Abstract class for source downloader."""
+
+ log = logging.getLogger(__name__)
+
+ @classmethod
+ @abstractmethod
+ def download(cls, source, destination_dir, data, transfer_progress):
+ """Returns url of downloaded addon zip file.
+
+ Tranfer progress can be ignored, in that case file transfer won't
+ be shown as 0-100% but as 'running'. First step should be to set
+ destination content size and then add transferred chunk sizes.
+
+ Args:
+ source (dict): {type:"http", "url":"https://} ...}
+ destination_dir (str): local folder to unzip
+ data (dict): More information about download content. Always have
+ 'type' key in.
+ transfer_progress (ayon_api.TransferProgress): Progress of
+ transferred (copy/download) content.
+
+ Returns:
+ (str) local path to addon zip file
+ """
+
+ pass
+
+ @classmethod
+ @abstractmethod
+ def cleanup(cls, source, destination_dir, data):
+ """Cleanup files when distribution finishes or crashes.
+
+ Cleanup e.g. temporary files (downloaded zip) or other related stuff
+ to downloader.
+ """
+
+ pass
+
+ @classmethod
+ def check_hash(cls, addon_path, addon_hash, hash_type="sha256"):
+ """Compares 'hash' of downloaded 'addon_url' file.
+
+ Args:
+ addon_path (str): Local path to addon file.
+ addon_hash (str): Hash of downloaded file.
+ hash_type (str): Type of hash.
+
+ Raises:
+ ValueError: If the file does not exist or hashes don't match.
+ """
+
+ if not os.path.exists(addon_path):
+ raise ValueError(f"{addon_path} doesn't exist.")
+ if not RemoteFileHandler.check_integrity(
+ addon_path, addon_hash, hash_type=hash_type
+ ):
+ raise ValueError(f"{addon_path} doesn't match expected hash.")
+
+ @classmethod
+ def unzip(cls, addon_zip_path, destination_dir):
+ """Unzips local 'addon_zip_path' to 'destination'.
+
+ Args:
+ addon_zip_path (str): local path to addon zip file
+ destination_dir (str): local folder to unzip
+ """
+
+ RemoteFileHandler.unzip(addon_zip_path, destination_dir)
+ os.remove(addon_zip_path)
+
+
+class OSDownloader(SourceDownloader):
+ """Downloader using files from file drive."""
+
+ @classmethod
+ def download(cls, source, destination_dir, data, transfer_progress):
+ # OS doesn't need to download, unzip directly
+ addon_url = source["path"].get(platform.system().lower())
+ if not os.path.exists(addon_url):
+ raise ValueError(f"{addon_url} is not accessible")
+ return addon_url
+
+ @classmethod
+ def cleanup(cls, source, destination_dir, data):
+ # Nothing to do - download does not copy anything
+ pass
+
+
+class HTTPDownloader(SourceDownloader):
+ """Downloader using http or https protocol."""
+
+ CHUNK_SIZE = 100000
+
+ @staticmethod
+ def get_filename(source):
+ source_url = source["url"]
+ filename = source.get("filename")
+ if not filename:
+ filename = os.path.basename(source_url)
+ basename, ext = os.path.splitext(filename)
+ allowed_exts = set(RemoteFileHandler.IMPLEMENTED_ZIP_FORMATS)
+ if ext.lower().lstrip(".") not in allowed_exts:
+ filename = f"{basename}.zip"
+ return filename
+
+ @classmethod
+ def download(cls, source, destination_dir, data, transfer_progress):
+ source_url = source["url"]
+ cls.log.debug(f"Downloading {source_url} to {destination_dir}")
+ headers = source.get("headers")
+ filename = cls.get_filename(source)
+
+ # TODO use transfer progress
+ RemoteFileHandler.download_url(
+ source_url,
+ destination_dir,
+ filename,
+ headers=headers
+ )
+
+ return os.path.join(destination_dir, filename)
+
+ @classmethod
+ def cleanup(cls, source, destination_dir, data):
+ filename = cls.get_filename(source)
+ filepath = os.path.join(destination_dir, filename)
+ if os.path.exists(filepath) and os.path.isfile(filepath):
+ os.remove(filepath)
+
+
+class AyonServerDownloader(SourceDownloader):
+ """Downloads static resource file from AYON Server.
+
+ Expects the env var AYON_SERVER_URL to be set.
+ """
+
+ CHUNK_SIZE = 8192
+
+ @classmethod
+ def download(cls, source, destination_dir, data, transfer_progress):
+ path = source["path"]
+ filename = source["filename"]
+ if path and not filename:
+ filename = path.split("/")[-1]
+
+ cls.log.debug(f"Downloading {filename} to {destination_dir}")
+
+ _, ext = os.path.splitext(filename)
+ ext = ext.lower().lstrip(".")
+ valid_exts = set(RemoteFileHandler.IMPLEMENTED_ZIP_FORMATS)
+ if ext not in valid_exts:
+ raise ValueError((
+ f"Invalid file extension \"{ext}\"."
+ f" Expected {', '.join(valid_exts)}"
+ ))
+
+ if path:
+ filepath = os.path.join(destination_dir, filename)
+ return ayon_api.download_file(
+ path,
+ filepath,
+ chunk_size=cls.CHUNK_SIZE,
+ progress=transfer_progress
+ )
+
+ # dst_filepath = os.path.join(destination_dir, filename)
+ if data["type"] == "dependency_package":
+ return ayon_api.download_dependency_package(
+ data["name"],
+ destination_dir,
+ filename,
+ platform_name=data["platform"],
+ chunk_size=cls.CHUNK_SIZE,
+ progress=transfer_progress
+ )
+
+ if data["type"] == "addon":
+ return ayon_api.download_addon_private_file(
+ data["name"],
+ data["version"],
+ filename,
+ destination_dir,
+ chunk_size=cls.CHUNK_SIZE,
+ progress=transfer_progress
+ )
+
+ raise ValueError(f"Unknown type to download \"{data['type']}\"")
+
+ @classmethod
+ def cleanup(cls, source, destination_dir, data):
+ filename = source["filename"]
+ filepath = os.path.join(destination_dir, filename)
+ if os.path.exists(filepath) and os.path.isfile(filepath):
+ os.remove(filepath)
+
+
+class DownloadFactory:
+ """Factory for downloaders."""
+
+ def __init__(self):
+ self._downloaders = {}
+
+ def register_format(self, downloader_type, downloader):
+ """Register downloader for download type.
+
+ Args:
+ downloader_type (UrlType): Type of source.
+ downloader (SourceDownloader): Downloader class which takes care
+ of download, hash check and unzipping.
+ """
+
+ self._downloaders[downloader_type.value] = downloader
+
+ def get_downloader(self, downloader_type):
+ """Registered downloader for type.
+
+ Args:
+ downloader_type (UrlType): Type of source.
+
+ Returns:
+ SourceDownloader: Downloader object which takes care of file
+ distribution.
+
+ Raises:
+ ValueError: If type does not have registered downloader.
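+
+ Example (illustrative):
+ >>> factory = get_default_download_factory()
+ >>> downloader = factory.get_downloader(UrlType.HTTP.value)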
+ """
+
+ if downloader := self._downloaders.get(downloader_type):
+ return downloader()
+ raise ValueError(f"{downloader_type} not implemented")
+
+
+def get_default_download_factory():
+ download_factory = DownloadFactory()
+ download_factory.register_format(UrlType.FILESYSTEM, OSDownloader)
+ download_factory.register_format(UrlType.HTTP, HTTPDownloader)
+ download_factory.register_format(UrlType.SERVER, AyonServerDownloader)
+ return download_factory
diff --git a/common/openpype_common/distribution/file_handler.py b/common/ayon_common/distribution/file_handler.py
similarity index 66%
rename from common/openpype_common/distribution/file_handler.py
rename to common/ayon_common/distribution/file_handler.py
index e649f143e9..07f6962c98 100644
--- a/common/openpype_common/distribution/file_handler.py
+++ b/common/ayon_common/distribution/file_handler.py
@@ -9,21 +9,23 @@ import hashlib
import tarfile
import zipfile
+import requests
-USER_AGENT = "openpype"
+USER_AGENT = "AYON-launcher"
class RemoteFileHandler:
"""Download file from url, might be GDrive shareable link"""
- IMPLEMENTED_ZIP_FORMATS = ['zip', 'tar', 'tgz',
- 'tar.gz', 'tar.xz', 'tar.bz2']
+ IMPLEMENTED_ZIP_FORMATS = {
+ "zip", "tar", "tgz", "tar.gz", "tar.xz", "tar.bz2"
+ }
@staticmethod
def calculate_md5(fpath, chunk_size=10000):
md5 = hashlib.md5()
- with open(fpath, 'rb') as f:
- for chunk in iter(lambda: f.read(chunk_size), b''):
+ with open(fpath, "rb") as f:
+ for chunk in iter(lambda: f.read(chunk_size), b""):
md5.update(chunk)
return md5.hexdigest()
@@ -45,7 +47,7 @@ class RemoteFileHandler:
h = hashlib.sha256()
b = bytearray(128 * 1024)
mv = memoryview(b)
- with open(fpath, 'rb', buffering=0) as f:
+ with open(fpath, "rb", buffering=0) as f:
for n in iter(lambda: f.readinto(mv), 0):
h.update(mv[:n])
return h.hexdigest()
@@ -62,27 +64,32 @@ class RemoteFileHandler:
return True
if not hash_type:
raise ValueError("Provide hash type, md5 or sha256")
- if hash_type == 'md5':
+ if hash_type == "md5":
return RemoteFileHandler.check_md5(fpath, hash_value)
if hash_type == "sha256":
return RemoteFileHandler.check_sha256(fpath, hash_value)
@staticmethod
def download_url(
- url, root, filename=None,
- sha256=None, max_redirect_hops=3
+ url,
+ root,
+ filename=None,
+ max_redirect_hops=3,
+ headers=None
):
- """Download a file from a url and place it in root.
+ """Download a file from url and place it in root.
+
Args:
url (str): URL to download file from
root (str): Directory to place downloaded file in
filename (str, optional): Name to save the file under.
If None, use the basename of the URL
- sha256 (str, optional): sha256 checksum of the download.
- If None, do not check
- max_redirect_hops (int, optional): Maximum number of redirect
+ max_redirect_hops (Optional[int]): Maximum number of redirect
hops allowed
+ headers (Optional[dict[str, str]]): Additional required headers
+ - Authentication etc..
"""
+
root = os.path.expanduser(root)
if not filename:
filename = os.path.basename(url)
@@ -90,55 +97,44 @@ class RemoteFileHandler:
os.makedirs(root, exist_ok=True)
- # check if file is already present locally
- if RemoteFileHandler.check_integrity(fpath,
- sha256, hash_type="sha256"):
- print('Using downloaded and verified file: ' + fpath)
- return
-
# expand redirect chain if needed
- url = RemoteFileHandler._get_redirect_url(url,
- max_hops=max_redirect_hops)
+ url = RemoteFileHandler._get_redirect_url(
+ url, max_hops=max_redirect_hops, headers=headers)
# check if file is located on Google Drive
file_id = RemoteFileHandler._get_google_drive_file_id(url)
if file_id is not None:
return RemoteFileHandler.download_file_from_google_drive(
- file_id, root, filename, sha256)
+ file_id, root, filename)
# download the file
try:
- print('Downloading ' + url + ' to ' + fpath)
- RemoteFileHandler._urlretrieve(url, fpath)
- except (urllib.error.URLError, IOError) as e:
- if url[:5] == 'https':
- url = url.replace('https:', 'http:')
- print('Failed download. Trying https -> http instead.'
- ' Downloading ' + url + ' to ' + fpath)
- RemoteFileHandler._urlretrieve(url, fpath)
- else:
- raise e
+ print(f"Downloading {url} to {fpath}")
+ RemoteFileHandler._urlretrieve(url, fpath, headers=headers)
+ except (urllib.error.URLError, IOError) as exc:
+ if url[:5] != "https":
+ raise exc
- # check integrity of downloaded file
- if not RemoteFileHandler.check_integrity(fpath,
- sha256, hash_type="sha256"):
- raise RuntimeError("File not found or corrupted.")
+ url = url.replace("https:", "http:")
+ print((
+ "Failed download. Trying https -> http instead."
+ f" Downloading {url} to {fpath}"
+ ))
+ RemoteFileHandler._urlretrieve(url, fpath, headers=headers)
@staticmethod
- def download_file_from_google_drive(file_id, root,
- filename=None,
- sha256=None):
+ def download_file_from_google_drive(
+ file_id, root, filename=None
+ ):
"""Download a Google Drive file from and place it in root.
Args:
file_id (str): id of file to be downloaded
root (str): Directory to place downloaded file in
filename (str, optional): Name to save the file under.
If None, use the id of the file.
- sha256 (str, optional): sha256 checksum of the download.
- If None, do not check
"""
# Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url # noqa
- import requests
+
url = "https://docs.google.com/uc?export=download"
root = os.path.expanduser(root)
@@ -148,17 +144,16 @@ class RemoteFileHandler:
os.makedirs(root, exist_ok=True)
- if os.path.isfile(fpath) and RemoteFileHandler.check_integrity(
- fpath, sha256, hash_type="sha256"):
- print('Using downloaded and verified file: ' + fpath)
+ if os.path.isfile(fpath) and RemoteFileHandler.check_integrity(fpath):
+ print(f"Using downloaded and verified file: {fpath}")
else:
session = requests.Session()
- response = session.get(url, params={'id': file_id}, stream=True)
+ response = session.get(url, params={"id": file_id}, stream=True)
token = RemoteFileHandler._get_confirm_token(response)
if token:
- params = {'id': file_id, 'confirm': token}
+ params = {"id": file_id, "confirm": token}
response = session.get(url, params=params, stream=True)
response_content_generator = response.iter_content(32768)
@@ -186,28 +181,28 @@ class RemoteFileHandler:
destination_path = os.path.dirname(path)
_, archive_type = os.path.splitext(path)
- archive_type = archive_type.lstrip('.')
+ archive_type = archive_type.lstrip(".")
- if archive_type in ['zip']:
- print("Unzipping {}->{}".format(path, destination_path))
+ if archive_type in ["zip"]:
+ print(f"Unzipping {path}->{destination_path}")
zip_file = zipfile.ZipFile(path)
zip_file.extractall(destination_path)
zip_file.close()
elif archive_type in [
- 'tar', 'tgz', 'tar.gz', 'tar.xz', 'tar.bz2'
+ "tar", "tgz", "tar.gz", "tar.xz", "tar.bz2"
]:
- print("Unzipping {}->{}".format(path, destination_path))
- if archive_type == 'tar':
- tar_type = 'r:'
- elif archive_type.endswith('xz'):
- tar_type = 'r:xz'
- elif archive_type.endswith('gz'):
- tar_type = 'r:gz'
- elif archive_type.endswith('bz2'):
- tar_type = 'r:bz2'
+ print(f"Unzipping {path}->{destination_path}")
+ if archive_type == "tar":
+ tar_type = "r:"
+ elif archive_type.endswith("xz"):
+ tar_type = "r:xz"
+ elif archive_type.endswith("gz"):
+ tar_type = "r:gz"
+ elif archive_type.endswith("bz2"):
+ tar_type = "r:bz2"
else:
- tar_type = 'r:*'
+ tar_type = "r:*"
try:
tar_file = tarfile.open(path, tar_type)
except tarfile.ReadError:
@@ -216,29 +211,35 @@ class RemoteFileHandler:
tar_file.close()
@staticmethod
- def _urlretrieve(url, filename, chunk_size):
+ def _urlretrieve(url, filename, chunk_size=None, headers=None):
+ final_headers = {"User-Agent": USER_AGENT}
+ if headers:
+ final_headers.update(headers)
+
+ chunk_size = chunk_size or 8192
with open(filename, "wb") as fh:
with urllib.request.urlopen(
- urllib.request.Request(url,
- headers={"User-Agent": USER_AGENT})) \
- as response:
+ urllib.request.Request(url, headers=final_headers)
+ ) as response:
for chunk in iter(lambda: response.read(chunk_size), ""):
if not chunk:
break
fh.write(chunk)
@staticmethod
- def _get_redirect_url(url, max_hops):
+ def _get_redirect_url(url, max_hops, headers=None):
initial_url = url
- headers = {"Method": "HEAD", "User-Agent": USER_AGENT}
-
+ final_headers = {"Method": "HEAD", "User-Agent": USER_AGENT}
+ if headers:
+ final_headers.update(headers)
for _ in range(max_hops + 1):
with urllib.request.urlopen(
- urllib.request.Request(url, headers=headers)) as response:
+ urllib.request.Request(url, headers=final_headers)
+ ) as response:
if response.url == url or response.url is None:
return url
- url = response.url
+ return response.url
else:
raise RecursionError(
f"Request to {initial_url} exceeded {max_hops} redirects. "
@@ -248,7 +249,7 @@ class RemoteFileHandler:
@staticmethod
def _get_confirm_token(response):
for key, value in response.cookies.items():
- if key.startswith('download_warning'):
+ if key.startswith("download_warning"):
return value
# handle antivirus warning for big zips
diff --git a/common/ayon_common/distribution/tests/test_addon_distributtion.py b/common/ayon_common/distribution/tests/test_addon_distributtion.py
new file mode 100644
index 0000000000..3e7bd1bc6a
--- /dev/null
+++ b/common/ayon_common/distribution/tests/test_addon_distributtion.py
@@ -0,0 +1,248 @@
+import os
+import sys
+import copy
+import tempfile
+
+
+import attr
+import pytest
+
+current_dir = os.path.dirname(os.path.abspath(__file__))
+root_dir = os.path.abspath(os.path.join(current_dir, "..", "..", "..", ".."))
+sys.path.append(root_dir)
+
+from common.ayon_common.distribution.downloaders import (
+ DownloadFactory,
+ OSDownloader,
+ HTTPDownloader,
+)
+from common.ayon_common.distribution.control import (
+ AyonDistribution,
+ UpdateState,
+)
+from common.ayon_common.distribution.data_structures import (
+ AddonInfo,
+ UrlType,
+)
+
+
+@pytest.fixture
+def download_factory():
+ addon_downloader = DownloadFactory()
+ addon_downloader.register_format(UrlType.FILESYSTEM, OSDownloader)
+ addon_downloader.register_format(UrlType.HTTP, HTTPDownloader)
+
+ yield addon_downloader
+
+
+@pytest.fixture
+def http_downloader(download_factory):
+ yield download_factory.get_downloader(UrlType.HTTP.value)
+
+
+@pytest.fixture
+def temp_folder():
+ yield tempfile.mkdtemp(prefix="ayon_test_")
+
+
+@pytest.fixture
+def sample_bundles():
+ yield {
+ "bundles": [
+ {
+ "name": "TestBundle",
+ "createdAt": "2023-06-29T00:00:00.0+00:00",
+ "installerVersion": None,
+ "addons": {
+ "slack": "1.0.0"
+ },
+ "dependencyPackages": {},
+ "isProduction": True,
+ "isStaging": False
+ }
+ ],
+ "productionBundle": "TestBundle",
+ "stagingBundle": None
+ }
+
+
+@pytest.fixture
+def sample_addon_info():
+ yield {
+ "name": "slack",
+ "title": "Slack addon",
+ "versions": {
+ "1.0.0": {
+ "hasSettings": True,
+ "hasSiteSettings": False,
+ "clientPyproject": {
+ "tool": {
+ "poetry": {
+ "dependencies": {
+ "nxtools": "^1.6",
+ "orjson": "^3.6.7",
+ "typer": "^0.4.1",
+ "email-validator": "^1.1.3",
+ "python": "^3.10",
+ "fastapi": "^0.73.0"
+ }
+ }
+ }
+ },
+ "clientSourceInfo": [
+ {
+ "type": "http",
+ "path": "https://drive.google.com/file/d/1TcuV8c2OV8CcbPeWi7lxOdqWsEqQNPYy/view?usp=sharing", # noqa
+ "filename": "dummy.zip"
+ },
+ {
+ "type": "filesystem",
+ "path": {
+ "windows": "P:/sources/some_file.zip",
+ "linux": "/mnt/srv/sources/some_file.zip",
+ "darwin": "/Volumes/srv/sources/some_file.zip"
+ }
+ }
+ ],
+ "frontendScopes": {
+ "project": {
+ "sidebar": "hierarchy",
+ }
+ },
+ "hash": "4be25eb6215e91e5894d3c5475aeb1e379d081d3f5b43b4ee15b0891cf5f5658" # noqa
+ }
+ },
+ "description": ""
+ }
+
+
+def test_register(printer):
+ download_factory = DownloadFactory()
+
+ assert len(download_factory._downloaders) == 0, "Contains registered"
+
+ download_factory.register_format(UrlType.FILESYSTEM, OSDownloader)
+ assert len(download_factory._downloaders) == 1, "Should contain one"
+
+
+def test_get_downloader(printer, download_factory):
+ assert download_factory.get_downloader(UrlType.FILESYSTEM.value), "Should find" # noqa
+
+ with pytest.raises(ValueError):
+ download_factory.get_downloader("unknown"), "Shouldn't find"
+
+
+def test_addon_info(printer, sample_addon_info):
+ """Tests parsing of expected payload from v4 server into AadonInfo."""
+ valid_minimum = {
+ "name": "slack",
+ "versions": {
+ "1.0.0": {
+ "clientSourceInfo": [
+ {
+ "type": "filesystem",
+ "path": {
+ "windows": "P:/sources/some_file.zip",
+ "linux": "/mnt/srv/sources/some_file.zip",
+ "darwin": "/Volumes/srv/sources/some_file.zip"
+ }
+ }
+ ]
+ }
+ }
+ }
+
+ assert AddonInfo.from_dict(valid_minimum), "Missing required fields"
+
+ addon = AddonInfo.from_dict(sample_addon_info)
+ assert addon, "Should be created"
+ assert addon.name == "slack", "Incorrect name"
+ assert "1.0.0" in addon.versions, "Version is not in versions"
+
+ with pytest.raises(TypeError):
+ assert addon["name"], "Dict approach not implemented"
+
+ addon_as_dict = attr.asdict(addon)
+ assert addon_as_dict["name"], "Dict approach should work"
+
+
+def _get_dist_item(dist_items, name, version):
+ final_dist_info = next(
+ (
+ dist_info
+ for dist_info in dist_items
+ if (
+ dist_info["addon_name"] == name
+ and dist_info["addon_version"] == version
+ )
+ ),
+ {}
+ )
+ return final_dist_info["dist_item"]
+
+
+def test_update_addon_state(
+ printer, sample_addon_info, temp_folder, download_factory, sample_bundles
+):
+ """Tests possible cases of addon update."""
+
+ addon_version = list(sample_addon_info["versions"])[0]
+ broken_addon_info = copy.deepcopy(sample_addon_info)
+
+ # Cause crash because of invalid hash
+ broken_addon_info["versions"][addon_version]["hash"] = "brokenhash"
+ distribution = AyonDistribution(
+ addon_dirpath=temp_folder,
+ dependency_dirpath=temp_folder,
+ dist_factory=download_factory,
+ addons_info=[broken_addon_info],
+ dependency_packages_info=[],
+ bundles_info=sample_bundles
+ )
+ distribution.distribute()
+ dist_items = distribution.get_addon_dist_items()
+ slack_dist_item = _get_dist_item(
+ dist_items,
+ sample_addon_info["name"],
+ addon_version
+ )
+ slack_state = slack_dist_item.state
+ assert slack_state == UpdateState.UPDATE_FAILED, (
+ "Update should have failed because of wrong hash")
+
+ # Fix the hash and validate that the addon was updated
+ distribution = AyonDistribution(
+ addon_dirpath=temp_folder,
+ dependency_dirpath=temp_folder,
+ dist_factory=download_factory,
+ addons_info=[sample_addon_info],
+ dependency_packages_info=[],
+ bundles_info=sample_bundles
+ )
+ distribution.distribute()
+ dist_items = distribution.get_addon_dist_items()
+ slack_dist_item = _get_dist_item(
+ dist_items,
+ sample_addon_info["name"],
+ addon_version
+ )
+ assert slack_dist_item.state == UpdateState.UPDATED, (
+ "Addon should have been updated")
+
+ # Is UPDATED without calling distribute
+ distribution = AyonDistribution(
+ addon_dirpath=temp_folder,
+ dependency_dirpath=temp_folder,
+ dist_factory=download_factory,
+ addons_info=[sample_addon_info],
+ dependency_packages_info=[],
+ bundles_info=sample_bundles
+ )
+ dist_items = distribution.get_addon_dist_items()
+ slack_dist_item = _get_dist_item(
+ dist_items,
+ sample_addon_info["name"],
+ addon_version
+ )
+ assert slack_dist_item.state == UpdateState.UPDATED, (
+ "Addon should already exist")
diff --git a/common/ayon_common/distribution/ui/missing_bundle_window.py b/common/ayon_common/distribution/ui/missing_bundle_window.py
new file mode 100644
index 0000000000..ae7a6a2976
--- /dev/null
+++ b/common/ayon_common/distribution/ui/missing_bundle_window.py
@@ -0,0 +1,146 @@
+import sys
+
+from qtpy import QtWidgets, QtGui
+
+from ayon_common import is_staging_enabled
+from ayon_common.resources import (
+ get_icon_path,
+ load_stylesheet,
+)
+from ayon_common.ui_utils import get_qt_app
+
+
+class MissingBundleWindow(QtWidgets.QDialog):
+ default_width = 410
+ default_height = 170
+
+ def __init__(
+ self, url=None, bundle_name=None, use_staging=None, parent=None
+ ):
+ super().__init__(parent)
+
+ icon_path = get_icon_path()
+ icon = QtGui.QIcon(icon_path)
+ self.setWindowIcon(icon)
+ self.setWindowTitle("Missing Bundle")
+
+ self._url = url
+ self._bundle_name = bundle_name
+ self._use_staging = use_staging
+ self._first_show = True
+
+ info_label = QtWidgets.QLabel("", self)
+ info_label.setWordWrap(True)
+
+ btns_widget = QtWidgets.QWidget(self)
+ confirm_btn = QtWidgets.QPushButton("Exit", btns_widget)
+
+ btns_layout = QtWidgets.QHBoxLayout(btns_widget)
+ btns_layout.setContentsMargins(0, 0, 0, 0)
+ btns_layout.addStretch(1)
+ btns_layout.addWidget(confirm_btn, 0)
+
+ main_layout = QtWidgets.QVBoxLayout(self)
+ main_layout.addWidget(info_label, 0)
+ main_layout.addStretch(1)
+ main_layout.addWidget(btns_widget, 0)
+
+ confirm_btn.clicked.connect(self._on_confirm_click)
+
+ self._info_label = info_label
+ self._confirm_btn = confirm_btn
+
+ self._update_label()
+
+ def set_url(self, url):
+ if url == self._url:
+ return
+ self._url = url
+ self._update_label()
+
+ def set_bundle_name(self, bundle_name):
+ if bundle_name == self._bundle_name:
+ return
+ self._bundle_name = bundle_name
+ self._update_label()
+
+ def set_use_staging(self, use_staging):
+ if self._use_staging == use_staging:
+ return
+ self._use_staging = use_staging
+ self._update_label()
+
+ def showEvent(self, event):
+ super().showEvent(event)
+ if self._first_show:
+ self._first_show = False
+ self._on_first_show()
+ self._recalculate_sizes()
+
+ def resizeEvent(self, event):
+ super().resizeEvent(event)
+ self._recalculate_sizes()
+
+ def _recalculate_sizes(self):
+ hint = self._confirm_btn.sizeHint()
+ new_width = max((hint.width(), hint.height() * 3))
+ self._confirm_btn.setMinimumWidth(new_width)
+
+ def _on_first_show(self):
+ self.setStyleSheet(load_stylesheet())
+ self.resize(self.default_width, self.default_height)
+
+ def _on_confirm_click(self):
+ self.accept()
+ self.close()
+
+ def _update_label(self):
+ self._info_label.setText(self._get_label())
+
+ def _get_label(self):
+ url_part = f" {self._url}" if self._url else ""
+
+ if self._bundle_name:
+ return (
+ f"Requested release bundle {self._bundle_name}"
+ f" is not available on server{url_part}."
+ "
Try to restart AYON desktop launcher. Please"
+ " contact your administrator if issue persist."
+ )
+ mode = "staging" if self._use_staging else "production"
+ return (
+ f"No release bundle is set as {mode} on the AYON"
+ f" server{url_part} so there is nothing to launch."
+ "
Please contact your administrator"
+ " to resolve the issue."
+ )
+
+
+def main():
+ """Show message that server does not have set bundle to use.
+
+ It is possible to pass url as argument to show it in the message. To use
+ this feature, pass `--url ` as argument to this script.
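+
+ Example (illustrative invocation; the url and bundle name are
+ hypothetical)::
+
+ python missing_bundle_window.py --url https://ayon.example.com --bundle MyBundle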
+ """
+
+ url = None
+ bundle_name = None
+ if "--url" in sys.argv:
+ url_index = sys.argv.index("--url") + 1
+ if url_index < len(sys.argv):
+ url = sys.argv[url_index]
+
+ if "--bundle" in sys.argv:
+ bundle_index = sys.argv.index("--bundle") + 1
+ if bundle_index < len(sys.argv):
+ bundle_name = sys.argv[bundle_index]
+
+ use_staging = is_staging_enabled()
+ app = get_qt_app()
+ window = MissingBundleWindow(url, bundle_name, use_staging)
+ window.show()
+ app.exec_()
+
+
+if __name__ == "__main__":
+ main()
diff --git a/common/ayon_common/distribution/utils.py b/common/ayon_common/distribution/utils.py
new file mode 100644
index 0000000000..a8b755707a
--- /dev/null
+++ b/common/ayon_common/distribution/utils.py
@@ -0,0 +1,90 @@
+import os
+import subprocess
+
+from ayon_common.utils import get_ayon_appdirs, get_ayon_launch_args
+
+
+def get_local_dir(*subdirs):
+ """Get product directory in user's home directory.
+
+ Each user on a machine has their own local directory where updates,
+ addons etc. are downloaded.
+
+ Returns:
+ str: Path to product local directory.
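+
+ Example (illustrative; the resolved path is platform dependent, shown
+ here for a hypothetical Linux user):
+ >>> get_local_dir("addons")
+ '/home/user/.local/share/AYON/addons'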
+ """
+
+ if not subdirs:
+ raise ValueError("Must fill dir_name if nothing else provided!")
+
+ local_dir = get_ayon_appdirs(*subdirs)
+ if not os.path.isdir(local_dir):
+ try:
+ os.makedirs(local_dir)
+ except Exception: # TODO fix exception
+ raise RuntimeError(f"Cannot create {local_dir}")
+
+ return local_dir
+
+
+def get_addons_dir():
+ """Directory where addon packages are stored.
+
+ The path to addons is resolved using the python module 'appdirs', which
+ provides a platform-specific user data directory.
+
+ The path is stored in the environment variable 'AYON_ADDONS_DIR'.
+ The value of the environment variable can be overridden, but we highly
+ recommend using that option only for development purposes.
+
+ Returns:
+ str: Path to directory where addons should be downloaded.
+ """
+
+ addons_dir = os.environ.get("AYON_ADDONS_DIR")
+ if not addons_dir:
+ addons_dir = get_local_dir("addons")
+ os.environ["AYON_ADDONS_DIR"] = addons_dir
+ return addons_dir
+
+
+def get_dependencies_dir():
+ """Directory where dependency packages are stored.
+
+ The path to dependency packages is resolved using the python module
+ 'appdirs', which provides a platform-specific user data directory.
+
+ The path is stored in the environment variable 'AYON_DEPENDENCIES_DIR'.
+ The value of the environment variable can be overridden, but we highly
+ recommend using that option only for development purposes.
+
+ Returns:
+ str: Path to directory where dependency packages should be downloaded.
+ """
+
+ dependencies_dir = os.environ.get("AYON_DEPENDENCIES_DIR")
+ if not dependencies_dir:
+ dependencies_dir = get_local_dir("dependency_packages")
+ os.environ["AYON_DEPENDENCIES_DIR"] = dependencies_dir
+ return dependencies_dir
+
+
+def show_missing_bundle_information(url, bundle_name=None):
+ """Show missing bundle information window.
+
+ This function should be called when server does not have set bundle for
+ production or staging, or when bundle that should be used is not available
+ on server.
+
+ A subprocess is used to show the dialog. The call is blocking and
+ waits until the dialog is closed.
+
+ Args:
+ url (str): Server url where bundle is not set.
+ bundle_name (Optional[str]): Name of bundle that was not found.
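+
+ Example (illustrative; the url and bundle name are hypothetical):
+ >>> show_missing_bundle_information(
+ ...     "https://ayon.example.com", bundle_name="ProductionBundle")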
+ """
+
+ ui_dir = os.path.join(os.path.dirname(__file__), "ui")
+ script_path = os.path.join(ui_dir, "missing_bundle_window.py")
+ args = get_ayon_launch_args(script_path, "--skip-bootstrap", "--url", url)
+ if bundle_name:
+ args.extend(["--bundle", bundle_name])
+ subprocess.call(args)
diff --git a/common/ayon_common/resources/AYON.icns b/common/ayon_common/resources/AYON.icns
new file mode 100644
index 0000000000..2ec66cf3e0
Binary files /dev/null and b/common/ayon_common/resources/AYON.icns differ
diff --git a/common/ayon_common/resources/AYON.ico b/common/ayon_common/resources/AYON.ico
new file mode 100644
index 0000000000..e0ec3292f8
Binary files /dev/null and b/common/ayon_common/resources/AYON.ico differ
diff --git a/common/ayon_common/resources/AYON.png b/common/ayon_common/resources/AYON.png
new file mode 100644
index 0000000000..ed13aeea52
Binary files /dev/null and b/common/ayon_common/resources/AYON.png differ
diff --git a/common/ayon_common/resources/AYON_staging.png b/common/ayon_common/resources/AYON_staging.png
new file mode 100644
index 0000000000..75dadfd56c
Binary files /dev/null and b/common/ayon_common/resources/AYON_staging.png differ
diff --git a/common/ayon_common/resources/__init__.py b/common/ayon_common/resources/__init__.py
new file mode 100644
index 0000000000..2b516feff3
--- /dev/null
+++ b/common/ayon_common/resources/__init__.py
@@ -0,0 +1,25 @@
+import os
+
+from ayon_common.utils import is_staging_enabled
+
+RESOURCES_DIR = os.path.dirname(os.path.abspath(__file__))
+
+
+def get_resource_path(*args):
+ path_items = list(args)
+ path_items.insert(0, RESOURCES_DIR)
+ return os.path.sep.join(path_items)
+
+
+def get_icon_path():
+ if is_staging_enabled():
+ return get_resource_path("AYON_staging.png")
+ return get_resource_path("AYON.png")
+
+
+def load_stylesheet():
+ stylesheet_path = get_resource_path("stylesheet.css")
+
+ with open(stylesheet_path, "r") as stream:
+ content = stream.read()
+ return content
diff --git a/common/ayon_common/resources/edit.png b/common/ayon_common/resources/edit.png
new file mode 100644
index 0000000000..a5a07998a6
Binary files /dev/null and b/common/ayon_common/resources/edit.png differ
diff --git a/common/ayon_common/resources/eye.png b/common/ayon_common/resources/eye.png
new file mode 100644
index 0000000000..5a683e2974
Binary files /dev/null and b/common/ayon_common/resources/eye.png differ
diff --git a/common/ayon_common/resources/stylesheet.css b/common/ayon_common/resources/stylesheet.css
new file mode 100644
index 0000000000..01e664e9e8
--- /dev/null
+++ b/common/ayon_common/resources/stylesheet.css
@@ -0,0 +1,84 @@
+* {
+ font-size: 10pt;
+ font-family: "Noto Sans";
+ font-weight: 450;
+ outline: none;
+}
+
+QWidget {
+ color: #D3D8DE;
+ background: #2C313A;
+ border-radius: 0px;
+}
+
+QWidget:disabled {
+ color: #5b6779;
+}
+
+QLabel {
+ background: transparent;
+}
+
+QPushButton {
+ text-align:center center;
+ border: 0px solid transparent;
+ border-radius: 0.2em;
+ padding: 3px 5px 3px 5px;
+ background: #434a56;
+}
+
+QPushButton:hover {
+ background: rgba(168, 175, 189, 0.3);
+ color: #F0F2F5;
+}
+
+QPushButton:pressed {}
+
+QPushButton:disabled {
+ background: #434a56;
+}
+
+QLineEdit {
+ border: 1px solid #373D48;
+ border-radius: 0.3em;
+ background: #21252B;
+ padding: 0.1em;
+}
+
+QLineEdit:disabled {
+ background: #2C313A;
+}
+QLineEdit:hover {
+ border-color: rgba(168, 175, 189, .3);
+}
+QLineEdit:focus {
+ border-color: rgb(92, 173, 214);
+}
+
+QLineEdit[state="invalid"] {
+ border-color: #AA5050;
+}
+
+#Separator {
+ background: rgba(75, 83, 98, 127);
+}
+
+#PasswordBtn {
+ border: none;
+ padding: 0.1em;
+ background: transparent;
+}
+
+#PasswordBtn:hover {
+ background: #434a56;
+}
+
+#LikeDisabledInput {
+ background: #2C313A;
+}
+#LikeDisabledInput:hover {
+ border-color: #373D48;
+}
+#LikeDisabledInput:focus {
+ border-color: #373D48;
+}
diff --git a/common/ayon_common/ui_utils.py b/common/ayon_common/ui_utils.py
new file mode 100644
index 0000000000..a3894d0d9c
--- /dev/null
+++ b/common/ayon_common/ui_utils.py
@@ -0,0 +1,36 @@
+import sys
+from qtpy import QtWidgets, QtCore
+
+
+def set_style_property(widget, property_name, property_value):
+ """Set widget's property that may affect style.
+
+ Style of widget is polished if current property value is different.
+ """
+
+ cur_value = widget.property(property_name)
+ if cur_value == property_value:
+ return
+ widget.setProperty(property_name, property_value)
+ widget.style().polish(widget)
+
+
+def get_qt_app():
+ app = QtWidgets.QApplication.instance()
+ if app is not None:
+ return app
+
+ for attr_name in (
+ "AA_EnableHighDpiScaling",
+ "AA_UseHighDpiPixmaps",
+ ):
+ attr = getattr(QtCore.Qt, attr_name, None)
+ if attr is not None:
+ QtWidgets.QApplication.setAttribute(attr)
+
+ if hasattr(QtWidgets.QApplication, "setHighDpiScaleFactorRoundingPolicy"):
+ QtWidgets.QApplication.setHighDpiScaleFactorRoundingPolicy(
+ QtCore.Qt.HighDpiScaleFactorRoundingPolicy.PassThrough
+ )
+
+ return QtWidgets.QApplication(sys.argv)
diff --git a/common/ayon_common/utils.py b/common/ayon_common/utils.py
new file mode 100644
index 0000000000..c0d0c7c0b1
--- /dev/null
+++ b/common/ayon_common/utils.py
@@ -0,0 +1,90 @@
+import os
+import sys
+import appdirs
+
+IS_BUILT_APPLICATION = getattr(sys, "frozen", False)
+
+
+def get_ayon_appdirs(*args):
+ """Local app data directory of AYON client.
+
+ Args:
+ *args (Iterable[str]): Subdirectories/files in local app data dir.
+
+ Returns:
+ str: Path to directory/file in local app data dir.
+ """
+
+ return os.path.join(
+ appdirs.user_data_dir("AYON", "Ynput"),
+ *args
+ )
+
+
+def is_staging_enabled():
+ """Check if staging is enabled.
+
+ Returns:
+ bool: True if staging is enabled.
+ """
+
+ return os.getenv("AYON_USE_STAGING") == "1"
+
+
+def _create_local_site_id():
+ """Create a local site identifier.
+
+ Returns:
+ str: Randomly generated site id.
+ """
+
+ from coolname import generate_slug
+
+ new_id = generate_slug(3)
+
+ print("Created local site id \"{}\"".format(new_id))
+
+ return new_id
+
+
+def get_local_site_id():
+ """Get local site identifier.
+
+ Site id is created if it does not exist yet.
+
+ Returns:
+ str: Site id.
+ """
+
+ # used for background syncing
+ site_id = os.environ.get("AYON_SITE_ID")
+ if site_id:
+ return site_id
+
+ site_id_path = get_ayon_appdirs("site_id")
+ if os.path.exists(site_id_path):
+ with open(site_id_path, "r") as stream:
+ site_id = stream.read()
+
+ if not site_id:
+ site_id = _create_local_site_id()
+ with open(site_id_path, "w") as stream:
+ stream.write(site_id)
+ return site_id
+
+
+def get_ayon_launch_args(*args):
+ """Launch arguments that can be used to launch ayon process.
+
+ Args:
+ *args (str): Additional arguments.
+
+ Returns:
+ list[str]: Launch arguments.
+ """
+
+ output = [sys.executable]
+ if not IS_BUILT_APPLICATION:
+ output.append(sys.argv[0])
+ output.extend(args)
+ return output
diff --git a/common/openpype_common/distribution/addon_distribution.py b/common/openpype_common/distribution/addon_distribution.py
deleted file mode 100644
index 5e48639dec..0000000000
--- a/common/openpype_common/distribution/addon_distribution.py
+++ /dev/null
@@ -1,208 +0,0 @@
-import os
-from enum import Enum
-from abc import abstractmethod
-import attr
-import logging
-import requests
-import platform
-import shutil
-
-from .file_handler import RemoteFileHandler
-from .addon_info import AddonInfo
-
-
-class UpdateState(Enum):
- EXISTS = "exists"
- UPDATED = "updated"
- FAILED = "failed"
-
-
-class AddonDownloader:
- log = logging.getLogger(__name__)
-
- def __init__(self):
- self._downloaders = {}
-
- def register_format(self, downloader_type, downloader):
- self._downloaders[downloader_type.value] = downloader
-
- def get_downloader(self, downloader_type):
- downloader = self._downloaders.get(downloader_type)
- if not downloader:
- raise ValueError(f"{downloader_type} not implemented")
- return downloader()
-
- @classmethod
- @abstractmethod
- def download(cls, source, destination):
- """Returns url to downloaded addon zip file.
-
- Args:
- source (dict): {type:"http", "url":"https://} ...}
- destination (str): local folder to unzip
- Returns:
- (str) local path to addon zip file
- """
- pass
-
- @classmethod
- def check_hash(cls, addon_path, addon_hash):
- """Compares 'hash' of downloaded 'addon_url' file.
-
- Args:
- addon_path (str): local path to addon zip file
- addon_hash (str): sha256 hash of zip file
- Raises:
- ValueError if hashes doesn't match
- """
- if not os.path.exists(addon_path):
- raise ValueError(f"{addon_path} doesn't exist.")
- if not RemoteFileHandler.check_integrity(addon_path,
- addon_hash,
- hash_type="sha256"):
- raise ValueError(f"{addon_path} doesn't match expected hash.")
-
- @classmethod
- def unzip(cls, addon_zip_path, destination):
- """Unzips local 'addon_zip_path' to 'destination'.
-
- Args:
- addon_zip_path (str): local path to addon zip file
- destination (str): local folder to unzip
- """
- RemoteFileHandler.unzip(addon_zip_path, destination)
- os.remove(addon_zip_path)
-
- @classmethod
- def remove(cls, addon_url):
- pass
-
-
-class OSAddonDownloader(AddonDownloader):
-
- @classmethod
- def download(cls, source, destination):
- # OS doesnt need to download, unzip directly
- addon_url = source["path"].get(platform.system().lower())
- if not os.path.exists(addon_url):
- raise ValueError("{} is not accessible".format(addon_url))
- return addon_url
-
-
-class HTTPAddonDownloader(AddonDownloader):
- CHUNK_SIZE = 100000
-
- @classmethod
- def download(cls, source, destination):
- source_url = source["url"]
- cls.log.debug(f"Downloading {source_url} to {destination}")
- file_name = os.path.basename(destination)
- _, ext = os.path.splitext(file_name)
- if (ext.replace(".", '') not
- in set(RemoteFileHandler.IMPLEMENTED_ZIP_FORMATS)):
- file_name += ".zip"
- RemoteFileHandler.download_url(source_url,
- destination,
- filename=file_name)
-
- return os.path.join(destination, file_name)
-
-
-def get_addons_info(server_endpoint):
- """Returns list of addon information from Server"""
- # TODO temp
- # addon_info = AddonInfo(
- # **{"name": "openpype_slack",
- # "version": "1.0.0",
- # "addon_url": "c:/projects/openpype_slack_1.0.0.zip",
- # "type": UrlType.FILESYSTEM,
- # "hash": "4be25eb6215e91e5894d3c5475aeb1e379d081d3f5b43b4ee15b0891cf5f5658"}) # noqa
- #
- # http_addon = AddonInfo(
- # **{"name": "openpype_slack",
- # "version": "1.0.0",
- # "addon_url": "https://drive.google.com/file/d/1TcuV8c2OV8CcbPeWi7lxOdqWsEqQNPYy/view?usp=sharing", # noqa
- # "type": UrlType.HTTP,
- # "hash": "4be25eb6215e91e5894d3c5475aeb1e379d081d3f5b43b4ee15b0891cf5f5658"}) # noqa
-
- response = requests.get(server_endpoint)
- if not response.ok:
- raise Exception(response.text)
-
- addons_info = []
- for addon in response.json():
- addons_info.append(AddonInfo(**addon))
- return addons_info
-
-
-def update_addon_state(addon_infos, destination_folder, factory,
- log=None):
- """Loops through all 'addon_infos', compares local version, unzips.
-
- Loops through server provided list of dictionaries with information about
- available addons. Looks if each addon is already present and deployed.
- If isn't, addon zip gets downloaded and unzipped into 'destination_folder'.
- Args:
- addon_infos (list of AddonInfo)
- destination_folder (str): local path
- factory (AddonDownloader): factory to get appropriate downloader per
- addon type
- log (logging.Logger)
- Returns:
- (dict): {"addon_full_name": UpdateState.value
- (eg. "exists"|"updated"|"failed")
- """
- if not log:
- log = logging.getLogger(__name__)
-
- download_states = {}
- for addon in addon_infos:
- full_name = "{}_{}".format(addon.name, addon.version)
- addon_dest = os.path.join(destination_folder, full_name)
-
- if os.path.isdir(addon_dest):
- log.debug(f"Addon version folder {addon_dest} already exists.")
- download_states[full_name] = UpdateState.EXISTS.value
- continue
-
- for source in addon.sources:
- download_states[full_name] = UpdateState.FAILED.value
- try:
- downloader = factory.get_downloader(source.type)
- zip_file_path = downloader.download(attr.asdict(source),
- addon_dest)
- downloader.check_hash(zip_file_path, addon.hash)
- downloader.unzip(zip_file_path, addon_dest)
- download_states[full_name] = UpdateState.UPDATED.value
- break
- except Exception:
- log.warning(f"Error happened during updating {addon.name}",
- exc_info=True)
- if os.path.isdir(addon_dest):
- log.debug(f"Cleaning {addon_dest}")
- shutil.rmtree(addon_dest)
-
- return download_states
-
-
-def check_addons(server_endpoint, addon_folder, downloaders):
- """Main entry point to compare existing addons with those on server.
-
- Args:
- server_endpoint (str): url to v4 server endpoint
- addon_folder (str): local dir path for addons
- downloaders (AddonDownloader): factory of downloaders
-
- Raises:
- (RuntimeError) if any addon failed update
- """
- addons_info = get_addons_info(server_endpoint)
- result = update_addon_state(addons_info,
- addon_folder,
- downloaders)
- if UpdateState.FAILED.value in result.values():
- raise RuntimeError(f"Unable to update some addons {result}")
-
-
-def cli(*args):
- raise NotImplementedError
diff --git a/common/openpype_common/distribution/addon_info.py b/common/openpype_common/distribution/addon_info.py
deleted file mode 100644
index 00ece11f3b..0000000000
--- a/common/openpype_common/distribution/addon_info.py
+++ /dev/null
@@ -1,80 +0,0 @@
-import attr
-from enum import Enum
-
-
-class UrlType(Enum):
- HTTP = "http"
- GIT = "git"
- FILESYSTEM = "filesystem"
-
-
-@attr.s
-class MultiPlatformPath(object):
- windows = attr.ib(default=None)
- linux = attr.ib(default=None)
- darwin = attr.ib(default=None)
-
-
-@attr.s
-class AddonSource(object):
- type = attr.ib()
-
-
-@attr.s
-class LocalAddonSource(AddonSource):
- path = attr.ib(default=attr.Factory(MultiPlatformPath))
-
-
-@attr.s
-class WebAddonSource(AddonSource):
- url = attr.ib(default=None)
-
-
-@attr.s
-class VersionData(object):
- version_data = attr.ib(default=None)
-
-
-@attr.s
-class AddonInfo(object):
- """Object matching json payload from Server"""
- name = attr.ib()
- version = attr.ib()
- title = attr.ib(default=None)
- sources = attr.ib(default=attr.Factory(dict))
- hash = attr.ib(default=None)
- description = attr.ib(default=None)
- license = attr.ib(default=None)
- authors = attr.ib(default=None)
-
- @classmethod
- def from_dict(cls, data):
- sources = []
-
- production_version = data.get("productionVersion")
- if not production_version:
- return
-
- # server payload contains info about all versions
- # active addon must have 'productionVersion' and matching version info
- version_data = data.get("versions", {})[production_version]
-
- for source in version_data.get("clientSourceInfo", []):
- if source.get("type") == UrlType.FILESYSTEM.value:
- source_addon = LocalAddonSource(type=source["type"],
- path=source["path"])
- if source.get("type") == UrlType.HTTP.value:
- source_addon = WebAddonSource(type=source["type"],
- url=source["url"])
-
- sources.append(source_addon)
-
- return cls(name=data.get("name"),
- version=production_version,
- sources=sources,
- hash=data.get("hash"),
- description=data.get("description"),
- title=data.get("title"),
- license=data.get("license"),
- authors=data.get("authors"))
-
diff --git a/common/openpype_common/distribution/tests/test_addon_distributtion.py b/common/openpype_common/distribution/tests/test_addon_distributtion.py
deleted file mode 100644
index 765ea0596a..0000000000
--- a/common/openpype_common/distribution/tests/test_addon_distributtion.py
+++ /dev/null
@@ -1,167 +0,0 @@
-import pytest
-import attr
-import tempfile
-
-from common.openpype_common.distribution.addon_distribution import (
- AddonDownloader,
- OSAddonDownloader,
- HTTPAddonDownloader,
- AddonInfo,
- update_addon_state,
- UpdateState
-)
-from common.openpype_common.distribution.addon_info import UrlType
-
-
-@pytest.fixture
-def addon_downloader():
- addon_downloader = AddonDownloader()
- addon_downloader.register_format(UrlType.FILESYSTEM, OSAddonDownloader)
- addon_downloader.register_format(UrlType.HTTP, HTTPAddonDownloader)
-
- yield addon_downloader
-
-
-@pytest.fixture
-def http_downloader(addon_downloader):
- yield addon_downloader.get_downloader(UrlType.HTTP.value)
-
-
-@pytest.fixture
-def temp_folder():
- yield tempfile.mkdtemp()
-
-
-@pytest.fixture
-def sample_addon_info():
- addon_info = {
- "versions": {
- "1.0.0": {
- "clientPyproject": {
- "tool": {
- "poetry": {
- "dependencies": {
- "nxtools": "^1.6",
- "orjson": "^3.6.7",
- "typer": "^0.4.1",
- "email-validator": "^1.1.3",
- "python": "^3.10",
- "fastapi": "^0.73.0"
- }
- }
- }
- },
- "hasSettings": True,
- "clientSourceInfo": [
- {
- "type": "http",
- "url": "https://drive.google.com/file/d/1TcuV8c2OV8CcbPeWi7lxOdqWsEqQNPYy/view?usp=sharing" # noqa
- },
- {
- "type": "filesystem",
- "path": {
- "windows": ["P:/sources/some_file.zip",
- "W:/sources/some_file.zip"], # noqa
- "linux": ["/mnt/srv/sources/some_file.zip"],
- "darwin": ["/Volumes/srv/sources/some_file.zip"]
- }
- }
- ],
- "frontendScopes": {
- "project": {
- "sidebar": "hierarchy"
- }
- }
- }
- },
- "description": "",
- "title": "Slack addon",
- "name": "openpype_slack",
- "productionVersion": "1.0.0",
- "hash": "4be25eb6215e91e5894d3c5475aeb1e379d081d3f5b43b4ee15b0891cf5f5658" # noqa
- }
- yield addon_info
-
-
-def test_register(printer):
- addon_downloader = AddonDownloader()
-
- assert len(addon_downloader._downloaders) == 0, "Contains registered"
-
- addon_downloader.register_format(UrlType.FILESYSTEM, OSAddonDownloader)
- assert len(addon_downloader._downloaders) == 1, "Should contain one"
-
-
-def test_get_downloader(printer, addon_downloader):
- assert addon_downloader.get_downloader(UrlType.FILESYSTEM.value), "Should find" # noqa
-
- with pytest.raises(ValueError):
- addon_downloader.get_downloader("unknown"), "Shouldn't find"
-
-
-def test_addon_info(printer, sample_addon_info):
- """Tests parsing of expected payload from v4 server into AadonInfo."""
- valid_minimum = {
- "name": "openpype_slack",
- "productionVersion": "1.0.0",
- "versions": {
- "1.0.0": {
- "clientSourceInfo": [
- {
- "type": "filesystem",
- "path": {
- "windows": [
- "P:/sources/some_file.zip",
- "W:/sources/some_file.zip"],
- "linux": [
- "/mnt/srv/sources/some_file.zip"],
- "darwin": [
- "/Volumes/srv/sources/some_file.zip"] # noqa
- }
- }
- ]
- }
- }
- }
-
- assert AddonInfo.from_dict(valid_minimum), "Missing required fields"
-
- valid_minimum["versions"].pop("1.0.0")
- with pytest.raises(KeyError):
- assert not AddonInfo.from_dict(valid_minimum), "Must fail without version data" # noqa
-
- valid_minimum.pop("productionVersion")
- assert not AddonInfo.from_dict(
- valid_minimum), "none if not productionVersion" # noqa
-
- addon = AddonInfo.from_dict(sample_addon_info)
- assert addon, "Should be created"
- assert addon.name == "openpype_slack", "Incorrect name"
- assert addon.version == "1.0.0", "Incorrect version"
-
- with pytest.raises(TypeError):
- assert addon["name"], "Dict approach not implemented"
-
- addon_as_dict = attr.asdict(addon)
- assert addon_as_dict["name"], "Dict approach should work"
-
-
-def test_update_addon_state(printer, sample_addon_info,
- temp_folder, addon_downloader):
- """Tests possible cases of addon update."""
- addon_info = AddonInfo.from_dict(sample_addon_info)
- orig_hash = addon_info.hash
-
- addon_info.hash = "brokenhash"
- result = update_addon_state([addon_info], temp_folder, addon_downloader)
- assert result["openpype_slack_1.0.0"] == UpdateState.FAILED.value, \
- "Update should failed because of wrong hash"
-
- addon_info.hash = orig_hash
- result = update_addon_state([addon_info], temp_folder, addon_downloader)
- assert result["openpype_slack_1.0.0"] == UpdateState.UPDATED.value, \
- "Addon should have been updated"
-
- result = update_addon_state([addon_info], temp_folder, addon_downloader)
- assert result["openpype_slack_1.0.0"] == UpdateState.EXISTS.value, \
- "Addon should already exist"
diff --git a/docs/README.md b/docs/README.md
new file mode 100644
index 0000000000..102da990aa
--- /dev/null
+++ b/docs/README.md
@@ -0,0 +1,74 @@
+API Documentation
+=================
+
+This describes how to build and modify the API documentation using Sphinx and AutoAPI. The ground truth for the
+documentation should live directly in the sources - in docstrings and Markdown files. Sphinx and AutoAPI will crawl
+over them and generate RST files that are in turn used to generate the HTML documentation. For docstrings we prefer
+the "Google" style (parsed by Sphinx's "Napoleon" extension), but RST is also acceptable, mainly in cases where you
+need to use Sphinx directives.
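+
+For illustration, a minimal Google-style docstring could look like this (the function itself is hypothetical, not
+part of the codebase):
+
+```python
+def frames_to_seconds(frames, fps=25.0):
+    """Convert a frame count to a duration in seconds.
+
+    Args:
+        frames (int): Number of frames.
+        fps (float, optional): Frames per second. Defaults to 25.0.
+
+    Returns:
+        float: Duration in seconds.
+    """
+    return frames / fps
+```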
+
+Using only docstrings is not really viable, as some documentation should be written at a higher level - like an
+overview of a module or piece of functionality. Such pages should be written directly in RST files and committed
+to the repository.
+
+Configuration
+-------------
+Configuration is done in `/docs/source/conf.py`. The most important settings are:
+
+- `autodoc_mock_imports`: add modules that can't actually be imported by Sphinx in the running environment, like `nuke`, `maya`, etc.
+- `autoapi_ignore`: add patterns for directories that shouldn't be processed by **AutoAPI**, like vendor dirs, etc.
+- `html_theme_options`: options that influence how the HTML theme of the generated pages will look.
+- `myst_gfm_only`: a MyST parser option that sets which flavour of Markdown should be used.
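+
+For example, the relevant part of `/docs/source/conf.py` looks roughly like this (a trimmed-down sketch; the value
+lists are shortened here for illustration):
+
+```python
+# Modules Sphinx cannot import in the build environment are mocked.
+autodoc_mock_imports = ["maya", "nuke", "hou", "unreal"]
+
+# Glob patterns that AutoAPI skips entirely (vendored code, schemas, ...).
+autoapi_ignore = ["*vendor*", "*schemas*"]
+
+# Parse Markdown files as GitHub-flavoured Markdown.
+myst_gfm_only = True
+```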
+
+How to build it
+---------------
+
+On Windows, you can run:
+
+```sh
+cd .\docs
+make.bat html
+```
+
+and on Linux/macOS:
+
+```sh
+cd ./docs
+make html
+```
+
+This will go over our code and generate **.rst** files in `/docs/source/autoapi`, and from those it will generate
+the full HTML documentation in `/docs/build/html`.
+
+During the build you may see tons of red errors that point to issues in our code:
+
+1) **Wrong imports** -
+Invalid imports are usually wrong relative imports (too deep) or circular imports.
+2) **Invalid docstrings** -
+Docstrings need to follow some syntax to be processed into documentation - this can be checked by running
+`pydocstyle`, which is already included with OpenPype (see the example below).
+3) **Invalid markdown/rst files** -
+Markdown/RST files can be included inside RST files using the `.. include::` directive, but they have to be
+properly formatted.
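+
+For example, the docstring syntax of a single module can be checked like this (the path is illustrative):
+
+```sh
+pydocstyle openpype/lib/path_tools.py
+```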
+
+Editing RST templates
+---------------------
+Everything starts with `/docs/source/index.rst` - this file should be properly edited. Right now it just
+includes `readme.rst`, which in turn includes and parses the main `README.md`. This is the entry point to the API
+documentation. All templates generated by AutoAPI are in `/docs/source/autoapi`. They should eventually be
+committed to the repository and edited too.
+
+Steps for enhancing API documentation
+-------------------------------------
+
+1) Run `/docs/make.bat html`
+2) Read the red errors/warnings and fix them in the code
+3) Run `/docs/make.bat html` again - repeat until there are no red lines
+4) Edit the RST files and add some meaningful content there
+
+Resources
+=========
+
+- [ReStructuredText on Wikipedia](https://en.wikipedia.org/wiki/ReStructuredText)
+- [RST Quick Reference](https://docutils.sourceforge.io/docs/user/rst/quickref.html)
+- [Sphinx AutoAPI Documentation](https://sphinx-autoapi.readthedocs.io/en/latest/)
+- [Example of Google Style Python Docstrings](https://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_google.html)
+- [Sphinx Directives](https://www.sphinx-doc.org/en/master/usage/restructuredtext/directives.html)
diff --git a/docs/make.bat b/docs/make.bat
index 4d9eb83d9f..1d261df277 100644
--- a/docs/make.bat
+++ b/docs/make.bat
@@ -5,7 +5,7 @@ pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
- set SPHINXBUILD=sphinx-build
+ set SPHINXBUILD=..\.poetry\bin\poetry run sphinx-build
)
set SOURCEDIR=source
set BUILDDIR=build
diff --git a/docs/source/_static/AYON_tight_G.svg b/docs/source/_static/AYON_tight_G.svg
new file mode 100644
index 0000000000..2c5b73deea
--- /dev/null
+++ b/docs/source/_static/AYON_tight_G.svg
@@ -0,0 +1,38 @@
+
+
+
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/tests/__init__.py b/docs/source/_static/README.md
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/arrow/tests/__init__.py
rename to docs/source/_static/README.md
diff --git a/docs/source/_templates/autoapi/index.rst b/docs/source/_templates/autoapi/index.rst
new file mode 100644
index 0000000000..95d0ad8911
--- /dev/null
+++ b/docs/source/_templates/autoapi/index.rst
@@ -0,0 +1,15 @@
+API Reference
+=============
+
+This page contains auto-generated API reference documentation [#f1]_.
+
+.. toctree::
+ :titlesonly:
+
+ {% for page in pages %}
+ {% if page.top_level_object and page.display %}
+ {{ page.include_path }}
+ {% endif %}
+ {% endfor %}
+
+.. [#f1] Created with `sphinx-autoapi <https://github.com/readthedocs/sphinx-autoapi>`_
diff --git a/docs/source/_templates/autoapi/python/attribute.rst b/docs/source/_templates/autoapi/python/attribute.rst
new file mode 100644
index 0000000000..ebaba555ad
--- /dev/null
+++ b/docs/source/_templates/autoapi/python/attribute.rst
@@ -0,0 +1 @@
+{% extends "python/data.rst" %}
diff --git a/docs/source/_templates/autoapi/python/class.rst b/docs/source/_templates/autoapi/python/class.rst
new file mode 100644
index 0000000000..df5edffb62
--- /dev/null
+++ b/docs/source/_templates/autoapi/python/class.rst
@@ -0,0 +1,58 @@
+{% if obj.display %}
+.. py:{{ obj.type }}:: {{ obj.short_name }}{% if obj.args %}({{ obj.args }}){% endif %}
+{% for (args, return_annotation) in obj.overloads %}
+ {{ " " * (obj.type | length) }} {{ obj.short_name }}{% if args %}({{ args }}){% endif %}
+{% endfor %}
+
+
+ {% if obj.bases %}
+ {% if "show-inheritance" in autoapi_options %}
+ Bases: {% for base in obj.bases %}{{ base|link_objs }}{% if not loop.last %}, {% endif %}{% endfor %}
+ {% endif %}
+
+
+ {% if "show-inheritance-diagram" in autoapi_options and obj.bases != ["object"] %}
+ .. autoapi-inheritance-diagram:: {{ obj.obj["full_name"] }}
+ :parts: 1
+ {% if "private-members" in autoapi_options %}
+ :private-bases:
+ {% endif %}
+
+ {% endif %}
+ {% endif %}
+ {% if obj.docstring %}
+ {{ obj.docstring|indent(3) }}
+ {% endif %}
+ {% if "inherited-members" in autoapi_options %}
+ {% set visible_classes = obj.classes|selectattr("display")|list %}
+ {% else %}
+ {% set visible_classes = obj.classes|rejectattr("inherited")|selectattr("display")|list %}
+ {% endif %}
+ {% for klass in visible_classes %}
+ {{ klass.render()|indent(3) }}
+ {% endfor %}
+ {% if "inherited-members" in autoapi_options %}
+ {% set visible_properties = obj.properties|selectattr("display")|list %}
+ {% else %}
+ {% set visible_properties = obj.properties|rejectattr("inherited")|selectattr("display")|list %}
+ {% endif %}
+ {% for property in visible_properties %}
+ {{ property.render()|indent(3) }}
+ {% endfor %}
+ {% if "inherited-members" in autoapi_options %}
+ {% set visible_attributes = obj.attributes|selectattr("display")|list %}
+ {% else %}
+ {% set visible_attributes = obj.attributes|rejectattr("inherited")|selectattr("display")|list %}
+ {% endif %}
+ {% for attribute in visible_attributes %}
+ {{ attribute.render()|indent(3) }}
+ {% endfor %}
+ {% if "inherited-members" in autoapi_options %}
+ {% set visible_methods = obj.methods|selectattr("display")|list %}
+ {% else %}
+ {% set visible_methods = obj.methods|rejectattr("inherited")|selectattr("display")|list %}
+ {% endif %}
+ {% for method in visible_methods %}
+ {{ method.render()|indent(3) }}
+ {% endfor %}
+{% endif %}
diff --git a/docs/source/_templates/autoapi/python/data.rst b/docs/source/_templates/autoapi/python/data.rst
new file mode 100644
index 0000000000..3d12b2d0c7
--- /dev/null
+++ b/docs/source/_templates/autoapi/python/data.rst
@@ -0,0 +1,37 @@
+{% if obj.display %}
+.. py:{{ obj.type }}:: {{ obj.name }}
+ {%- if obj.annotation is not none %}
+
+ :type: {%- if obj.annotation %} {{ obj.annotation }}{%- endif %}
+
+ {%- endif %}
+
+ {%- if obj.value is not none %}
+
+ :value: {% if obj.value is string and obj.value.splitlines()|count > 1 -%}
+ Multiline-String
+
+ .. raw:: html
+
+ <details><summary>Show Value</summary>
+
+ .. code-block:: python
+
+ """{{ obj.value|indent(width=8,blank=true) }}"""
+
+ .. raw:: html
+
+ </details>
+
+ {%- else -%}
+ {%- if obj.value is string -%}
+ {{ "%r" % obj.value|string|truncate(100) }}
+ {%- else -%}
+ {{ obj.value|string|truncate(100) }}
+ {%- endif -%}
+ {%- endif %}
+ {%- endif %}
+
+
+ {{ obj.docstring|indent(3) }}
+{% endif %}
diff --git a/docs/source/_templates/autoapi/python/exception.rst b/docs/source/_templates/autoapi/python/exception.rst
new file mode 100644
index 0000000000..92f3d38fd5
--- /dev/null
+++ b/docs/source/_templates/autoapi/python/exception.rst
@@ -0,0 +1 @@
+{% extends "python/class.rst" %}
diff --git a/docs/source/_templates/autoapi/python/function.rst b/docs/source/_templates/autoapi/python/function.rst
new file mode 100644
index 0000000000..b00d5c2445
--- /dev/null
+++ b/docs/source/_templates/autoapi/python/function.rst
@@ -0,0 +1,15 @@
+{% if obj.display %}
+.. py:function:: {{ obj.short_name }}({{ obj.args }}){% if obj.return_annotation is not none %} -> {{ obj.return_annotation }}{% endif %}
+
+{% for (args, return_annotation) in obj.overloads %}
+ {{ obj.short_name }}({{ args }}){% if return_annotation is not none %} -> {{ return_annotation }}{% endif %}
+
+{% endfor %}
+ {% for property in obj.properties %}
+ :{{ property }}:
+ {% endfor %}
+
+ {% if obj.docstring %}
+ {{ obj.docstring|indent(3) }}
+ {% endif %}
+{% endif %}
diff --git a/docs/source/_templates/autoapi/python/method.rst b/docs/source/_templates/autoapi/python/method.rst
new file mode 100644
index 0000000000..723cb7bbe5
--- /dev/null
+++ b/docs/source/_templates/autoapi/python/method.rst
@@ -0,0 +1,19 @@
+{%- if obj.display %}
+.. py:method:: {{ obj.short_name }}({{ obj.args }}){% if obj.return_annotation is not none %} -> {{ obj.return_annotation }}{% endif %}
+
+{% for (args, return_annotation) in obj.overloads %}
+ {{ obj.short_name }}({{ args }}){% if return_annotation is not none %} -> {{ return_annotation }}{% endif %}
+
+{% endfor %}
+ {% if obj.properties %}
+ {% for property in obj.properties %}
+ :{{ property }}:
+ {% endfor %}
+
+ {% else %}
+
+ {% endif %}
+ {% if obj.docstring %}
+ {{ obj.docstring|indent(3) }}
+ {% endif %}
+{% endif %}
diff --git a/docs/source/_templates/autoapi/python/module.rst b/docs/source/_templates/autoapi/python/module.rst
new file mode 100644
index 0000000000..d2714f6c9d
--- /dev/null
+++ b/docs/source/_templates/autoapi/python/module.rst
@@ -0,0 +1,114 @@
+{% if not obj.display %}
+:orphan:
+
+{% endif %}
+:py:mod:`{{ obj.name }}`
+=========={{ "=" * obj.name|length }}
+
+.. py:module:: {{ obj.name }}
+
+{% if obj.docstring %}
+.. autoapi-nested-parse::
+
+ {{ obj.docstring|indent(3) }}
+
+{% endif %}
+
+{% block subpackages %}
+{% set visible_subpackages = obj.subpackages|selectattr("display")|list %}
+{% if visible_subpackages %}
+Subpackages
+-----------
+.. toctree::
+ :titlesonly:
+ :maxdepth: 3
+
+{% for subpackage in visible_subpackages %}
+ {{ subpackage.short_name }}/index.rst
+{% endfor %}
+
+
+{% endif %}
+{% endblock %}
+{% block submodules %}
+{% set visible_submodules = obj.submodules|selectattr("display")|list %}
+{% if visible_submodules %}
+Submodules
+----------
+.. toctree::
+ :titlesonly:
+ :maxdepth: 1
+
+{% for submodule in visible_submodules %}
+ {{ submodule.short_name }}/index.rst
+{% endfor %}
+
+
+{% endif %}
+{% endblock %}
+{% block content %}
+{% if obj.all is not none %}
+{% set visible_children = obj.children|selectattr("short_name", "in", obj.all)|list %}
+{% elif obj.type is equalto("package") %}
+{% set visible_children = obj.children|selectattr("display")|list %}
+{% else %}
+{% set visible_children = obj.children|selectattr("display")|rejectattr("imported")|list %}
+{% endif %}
+{% if visible_children %}
+{{ obj.type|title }} Contents
+{{ "-" * obj.type|length }}---------
+
+{% set visible_classes = visible_children|selectattr("type", "equalto", "class")|list %}
+{% set visible_functions = visible_children|selectattr("type", "equalto", "function")|list %}
+{% set visible_attributes = visible_children|selectattr("type", "equalto", "data")|list %}
+{% if "show-module-summary" in autoapi_options and (visible_classes or visible_functions) %}
+{% block classes scoped %}
+{% if visible_classes %}
+Classes
+~~~~~~~
+
+.. autoapisummary::
+
+{% for klass in visible_classes %}
+ {{ klass.id }}
+{% endfor %}
+
+
+{% endif %}
+{% endblock %}
+
+{% block functions scoped %}
+{% if visible_functions %}
+Functions
+~~~~~~~~~
+
+.. autoapisummary::
+
+{% for function in visible_functions %}
+ {{ function.id }}
+{% endfor %}
+
+
+{% endif %}
+{% endblock %}
+
+{% block attributes scoped %}
+{% if visible_attributes %}
+Attributes
+~~~~~~~~~~
+
+.. autoapisummary::
+
+{% for attribute in visible_attributes %}
+ {{ attribute.id }}
+{% endfor %}
+
+
+{% endif %}
+{% endblock %}
+{% endif %}
+{% for obj_item in visible_children %}
+{{ obj_item.render()|indent(0) }}
+{% endfor %}
+{% endif %}
+{% endblock %}
diff --git a/docs/source/_templates/autoapi/python/package.rst b/docs/source/_templates/autoapi/python/package.rst
new file mode 100644
index 0000000000..fb9a64965e
--- /dev/null
+++ b/docs/source/_templates/autoapi/python/package.rst
@@ -0,0 +1 @@
+{% extends "python/module.rst" %}
diff --git a/docs/source/_templates/autoapi/python/property.rst b/docs/source/_templates/autoapi/python/property.rst
new file mode 100644
index 0000000000..70af24236f
--- /dev/null
+++ b/docs/source/_templates/autoapi/python/property.rst
@@ -0,0 +1,15 @@
+{%- if obj.display %}
+.. py:property:: {{ obj.short_name }}
+ {% if obj.annotation %}
+ :type: {{ obj.annotation }}
+ {% endif %}
+ {% if obj.properties %}
+ {% for property in obj.properties %}
+ :{{ property }}:
+ {% endfor %}
+ {% endif %}
+
+ {% if obj.docstring %}
+ {{ obj.docstring|indent(3) }}
+ {% endif %}
+{% endif %}
diff --git a/docs/source/conf.py b/docs/source/conf.py
index 5b34ff8dc0..916a397e8e 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -17,18 +17,29 @@
import os
import sys
-pype_root = os.path.abspath('../..')
-sys.path.insert(0, pype_root)
+import revitron_sphinx_theme
+
+openpype_root = os.path.abspath('../..')
+sys.path.insert(0, openpype_root)
+# app = QApplication([])
+
+"""
repos = os.listdir(os.path.abspath("../../repos"))
-repos = [os.path.join(pype_root, "repos", repo) for repo in repos]
+repos = [os.path.join(openpype_root, "repos", repo) for repo in repos]
for repo in repos:
sys.path.append(repo)
+"""
+
+todo_include_todos = True
+autodoc_mock_imports = ["maya", "pymel", "nuke", "nukestudio", "nukescripts",
+ "hiero", "bpy", "fusion", "houdini", "hou", "unreal",
+ "__builtin__", "resolve", "pysync", "DaVinciResolveScript"]
# -- Project information -----------------------------------------------------
-project = 'pype'
-copyright = '2019, Orbi Tools'
-author = 'Orbi Tools'
+project = 'OpenPype'
+copyright = '2023 Ynput'
+author = 'Ynput'
# The short X.Y version
version = ''
@@ -52,11 +63,41 @@ extensions = [
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.mathjax',
- 'sphinx.ext.viewcode',
'sphinx.ext.autosummary',
- 'recommonmark'
+ 'revitron_sphinx_theme',
+ 'autoapi.extension',
+ 'myst_parser'
]
+##############################
+# Autoapi settings
+##############################
+
+autoapi_dirs = ['../../openpype', '../../igniter']
+
+# bypass modules with a lot of python2 content for now
+autoapi_ignore = [
+ "*vendor*",
+ "*schemas*",
+ "*startup/*",
+ "*/website*",
+ "*openpype/hooks*",
+ "*openpype/style*",
+ "openpype/tests*",
+ # too many levels of relative import:
+ "*/modules/sync_server/*"
+]
+autoapi_keep_files = True
+autoapi_options = [
+ 'members',
+ 'undoc-members',
+ 'show-inheritance',
+ 'show-module-summary'
+]
+autoapi_add_toctree_entry = True
+autoapi_template_dir = '_templates/autoapi'
+
+
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
@@ -64,7 +105,7 @@ templates_path = ['_templates']
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
-source_suffix = '.rst'
+source_suffix = ['.rst', '.md']
# The master toctree document.
master_doc = 'index'
@@ -74,12 +115,15 @@ master_doc = 'index'
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
-language = None
+language = "en"
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
-exclude_patterns = []
+exclude_patterns = [
+ "openpype.hosts.resolve.*",
+ "openpype.tools.*"
+ ]
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'friendly'
@@ -97,15 +141,22 @@ autosummary_generate = True
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
-html_theme = 'sphinx_rtd_theme'
+html_theme = 'revitron_sphinx_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#
html_theme_options = {
- 'collapse_navigation': False
+ 'collapse_navigation': True,
+ 'sticky_navigation': True,
+ 'navigation_depth': 4,
+ 'includehidden': True,
+ 'titles_only': False,
+ 'github_url': '',
}
+html_logo = '_static/AYON_tight_G.svg'
+
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
@@ -153,8 +204,8 @@ latex_elements = {
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
- (master_doc, 'pype.tex', 'pype Documentation',
- 'OrbiTools', 'manual'),
+ (master_doc, 'openpype.tex', 'OpenPype Documentation',
+ 'Ynput', 'manual'),
]
@@ -163,7 +214,7 @@ latex_documents = [
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
- (master_doc, 'pype', 'pype Documentation',
+ (master_doc, 'openpype', 'OpenPype Documentation',
[author], 1)
]
@@ -174,8 +225,8 @@ man_pages = [
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
- (master_doc, 'pype', 'pype Documentation',
- author, 'pype', 'One line description of project.',
+ (master_doc, 'OpenPype', 'OpenPype Documentation',
+ author, 'OpenPype', 'Pipeline for studios',
'Miscellaneous'),
]
@@ -207,7 +258,4 @@ intersphinx_mapping = {
'https://docs.python.org/3/': None
}
-# -- Options for todo extension ----------------------------------------------
-
-# If true, `todo` and `todoList` produce output, else they produce nothing.
-todo_include_todos = True
+myst_gfm_only = True
diff --git a/docs/source/igniter.bootstrap_repos.rst b/docs/source/igniter.bootstrap_repos.rst
deleted file mode 100644
index 7c6e0a0757..0000000000
--- a/docs/source/igniter.bootstrap_repos.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-igniter.bootstrap\_repos module
-===============================
-
-.. automodule:: igniter.bootstrap_repos
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/igniter.install_dialog.rst b/docs/source/igniter.install_dialog.rst
deleted file mode 100644
index bf30ec270e..0000000000
--- a/docs/source/igniter.install_dialog.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-igniter.install\_dialog module
-==============================
-
-.. automodule:: igniter.install_dialog
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/igniter.install_thread.rst b/docs/source/igniter.install_thread.rst
deleted file mode 100644
index 6c19516219..0000000000
--- a/docs/source/igniter.install_thread.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-igniter.install\_thread module
-==============================
-
-.. automodule:: igniter.install_thread
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/igniter.rst b/docs/source/igniter.rst
deleted file mode 100644
index b4aebe88b0..0000000000
--- a/docs/source/igniter.rst
+++ /dev/null
@@ -1,42 +0,0 @@
-igniter package
-===============
-
-.. automodule:: igniter
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-igniter.bootstrap\_repos module
--------------------------------
-
-.. automodule:: igniter.bootstrap_repos
- :members:
- :undoc-members:
- :show-inheritance:
-
-igniter.install\_dialog module
-------------------------------
-
-.. automodule:: igniter.install_dialog
- :members:
- :undoc-members:
- :show-inheritance:
-
-igniter.install\_thread module
-------------------------------
-
-.. automodule:: igniter.install_thread
- :members:
- :undoc-members:
- :show-inheritance:
-
-igniter.tools module
---------------------
-
-.. automodule:: igniter.tools
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/igniter.tools.rst b/docs/source/igniter.tools.rst
deleted file mode 100644
index 4fdbdf9d29..0000000000
--- a/docs/source/igniter.tools.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-igniter.tools module
-====================
-
-.. automodule:: igniter.tools
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/index.rst b/docs/source/index.rst
index b54d153894..f703468fca 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -1,14 +1,15 @@
-.. pype documentation master file, created by
+.. openpype documentation master file, created by
sphinx-quickstart on Mon May 13 17:18:23 2019.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
-Welcome to pype's documentation!
-================================
+Welcome to OpenPype's API documentation!
+========================================
.. toctree::
- readme
- modules
+
+ Readme
+
Indices and tables
==================
diff --git a/docs/source/modules.rst b/docs/source/modules.rst
deleted file mode 100644
index 1956d9ed04..0000000000
--- a/docs/source/modules.rst
+++ /dev/null
@@ -1,8 +0,0 @@
-igniter
-=======
-
-.. toctree::
- :maxdepth: 6
-
- igniter
- pype
\ No newline at end of file
diff --git a/docs/source/pype.action.rst b/docs/source/pype.action.rst
deleted file mode 100644
index 62a32e08b5..0000000000
--- a/docs/source/pype.action.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.action module
-==================
-
-.. automodule:: pype.action
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.api.rst b/docs/source/pype.api.rst
deleted file mode 100644
index af3602a895..0000000000
--- a/docs/source/pype.api.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.api module
-===============
-
-.. automodule:: pype.api
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.cli.rst b/docs/source/pype.cli.rst
deleted file mode 100644
index 7e4a336fa9..0000000000
--- a/docs/source/pype.cli.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.cli module
-===============
-
-.. automodule:: pype.cli
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.aftereffects.rst b/docs/source/pype.hosts.aftereffects.rst
deleted file mode 100644
index 3c2b2dda41..0000000000
--- a/docs/source/pype.hosts.aftereffects.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.aftereffects package
-===============================
-
-.. automodule:: pype.hosts.aftereffects
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.blender.action.rst b/docs/source/pype.hosts.blender.action.rst
deleted file mode 100644
index a6444b1efc..0000000000
--- a/docs/source/pype.hosts.blender.action.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.blender.action module
-================================
-
-.. automodule:: pype.hosts.blender.action
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.blender.plugin.rst b/docs/source/pype.hosts.blender.plugin.rst
deleted file mode 100644
index cf6a8feec8..0000000000
--- a/docs/source/pype.hosts.blender.plugin.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.blender.plugin module
-================================
-
-.. automodule:: pype.hosts.blender.plugin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.blender.rst b/docs/source/pype.hosts.blender.rst
deleted file mode 100644
index 19cb85e5f3..0000000000
--- a/docs/source/pype.hosts.blender.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.hosts.blender package
-==========================
-
-.. automodule:: pype.hosts.blender
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.hosts.blender.action module
---------------------------------
-
-.. automodule:: pype.hosts.blender.action
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.blender.plugin module
---------------------------------
-
-.. automodule:: pype.hosts.blender.plugin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.celaction.cli.rst b/docs/source/pype.hosts.celaction.cli.rst
deleted file mode 100644
index c8843b90bd..0000000000
--- a/docs/source/pype.hosts.celaction.cli.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.celaction.cli module
-===============================
-
-.. automodule:: pype.hosts.celaction.cli
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.celaction.rst b/docs/source/pype.hosts.celaction.rst
deleted file mode 100644
index 1aa236397e..0000000000
--- a/docs/source/pype.hosts.celaction.rst
+++ /dev/null
@@ -1,18 +0,0 @@
-pype.hosts.celaction package
-============================
-
-.. automodule:: pype.hosts.celaction
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.hosts.celaction.cli module
--------------------------------
-
-.. automodule:: pype.hosts.celaction.cli
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.fusion.lib.rst b/docs/source/pype.hosts.fusion.lib.rst
deleted file mode 100644
index 32b8f501f5..0000000000
--- a/docs/source/pype.hosts.fusion.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.fusion.lib module
-============================
-
-.. automodule:: pype.hosts.fusion.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.fusion.menu.rst b/docs/source/pype.hosts.fusion.menu.rst
deleted file mode 100644
index ec5bf76612..0000000000
--- a/docs/source/pype.hosts.fusion.menu.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.fusion.menu module
-=============================
-
-.. automodule:: pype.hosts.fusion.menu
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.fusion.pipeline.rst b/docs/source/pype.hosts.fusion.pipeline.rst
deleted file mode 100644
index ff2a6440a8..0000000000
--- a/docs/source/pype.hosts.fusion.pipeline.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.fusion.pipeline module
-=================================
-
-.. automodule:: pype.hosts.fusion.pipeline
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.fusion.rst b/docs/source/pype.hosts.fusion.rst
deleted file mode 100644
index 7c2fee827c..0000000000
--- a/docs/source/pype.hosts.fusion.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.hosts.fusion package
-=========================
-
-.. automodule:: pype.hosts.fusion
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.hosts.fusion.scripts
-
-Submodules
-----------
-
-pype.hosts.fusion.lib module
-----------------------------
-
-.. automodule:: pype.hosts.fusion.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.fusion.scripts.duplicate_with_inputs.rst b/docs/source/pype.hosts.fusion.scripts.duplicate_with_inputs.rst
deleted file mode 100644
index 2503c20f3b..0000000000
--- a/docs/source/pype.hosts.fusion.scripts.duplicate_with_inputs.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.fusion.scripts.duplicate\_with\_inputs module
-========================================================
-
-.. automodule:: pype.hosts.fusion.scripts.duplicate_with_inputs
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.fusion.scripts.fusion_switch_shot.rst b/docs/source/pype.hosts.fusion.scripts.fusion_switch_shot.rst
deleted file mode 100644
index 770300116f..0000000000
--- a/docs/source/pype.hosts.fusion.scripts.fusion_switch_shot.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.fusion.scripts.fusion\_switch\_shot module
-=====================================================
-
-.. automodule:: pype.hosts.fusion.scripts.fusion_switch_shot
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.fusion.scripts.rst b/docs/source/pype.hosts.fusion.scripts.rst
deleted file mode 100644
index 5de5f66652..0000000000
--- a/docs/source/pype.hosts.fusion.scripts.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.hosts.fusion.scripts package
-=================================
-
-.. automodule:: pype.hosts.fusion.scripts
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.hosts.fusion.scripts.fusion\_switch\_shot module
------------------------------------------------------
-
-.. automodule:: pype.hosts.fusion.scripts.fusion_switch_shot
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.fusion.scripts.publish\_filesequence module
-------------------------------------------------------
-
-.. automodule:: pype.hosts.fusion.scripts.publish_filesequence
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.fusion.scripts.set_rendermode.rst b/docs/source/pype.hosts.fusion.scripts.set_rendermode.rst
deleted file mode 100644
index 27bff63466..0000000000
--- a/docs/source/pype.hosts.fusion.scripts.set_rendermode.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.fusion.scripts.set\_rendermode module
-================================================
-
-.. automodule:: pype.hosts.fusion.scripts.set_rendermode
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.fusion.utils.rst b/docs/source/pype.hosts.fusion.utils.rst
deleted file mode 100644
index b6de3d0510..0000000000
--- a/docs/source/pype.hosts.fusion.utils.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.fusion.utils module
-==============================
-
-.. automodule:: pype.hosts.fusion.utils
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.harmony.rst b/docs/source/pype.hosts.harmony.rst
deleted file mode 100644
index 60e1fcdce6..0000000000
--- a/docs/source/pype.hosts.harmony.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.harmony package
-==========================
-
-.. automodule:: pype.hosts.harmony
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.hiero.events.rst b/docs/source/pype.hosts.hiero.events.rst
deleted file mode 100644
index 874abbffba..0000000000
--- a/docs/source/pype.hosts.hiero.events.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.hiero.events module
-==============================
-
-.. automodule:: pype.hosts.hiero.events
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.hiero.lib.rst b/docs/source/pype.hosts.hiero.lib.rst
deleted file mode 100644
index 8c0d33b03b..0000000000
--- a/docs/source/pype.hosts.hiero.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.hiero.lib module
-===========================
-
-.. automodule:: pype.hosts.hiero.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.hiero.menu.rst b/docs/source/pype.hosts.hiero.menu.rst
deleted file mode 100644
index baa1317e61..0000000000
--- a/docs/source/pype.hosts.hiero.menu.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.hiero.menu module
-============================
-
-.. automodule:: pype.hosts.hiero.menu
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.hiero.rst b/docs/source/pype.hosts.hiero.rst
deleted file mode 100644
index 9a7891b45e..0000000000
--- a/docs/source/pype.hosts.hiero.rst
+++ /dev/null
@@ -1,19 +0,0 @@
-pype.hosts.hiero package
-========================
-
-.. automodule:: pype.hosts.hiero
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-.. toctree::
- :maxdepth: 10
-
- pype.hosts.hiero.events
- pype.hosts.hiero.lib
- pype.hosts.hiero.menu
- pype.hosts.hiero.tags
- pype.hosts.hiero.workio
diff --git a/docs/source/pype.hosts.hiero.tags.rst b/docs/source/pype.hosts.hiero.tags.rst
deleted file mode 100644
index 0df33279d5..0000000000
--- a/docs/source/pype.hosts.hiero.tags.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.hiero.tags module
-============================
-
-.. automodule:: pype.hosts.hiero.tags
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.hiero.workio.rst b/docs/source/pype.hosts.hiero.workio.rst
deleted file mode 100644
index 11aae43212..0000000000
--- a/docs/source/pype.hosts.hiero.workio.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.hiero.workio module
-==============================
-
-.. automodule:: pype.hosts.hiero.workio
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.houdini.lib.rst b/docs/source/pype.hosts.houdini.lib.rst
deleted file mode 100644
index ba6e60d5f3..0000000000
--- a/docs/source/pype.hosts.houdini.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.houdini.lib module
-=============================
-
-.. automodule:: pype.hosts.houdini.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.houdini.rst b/docs/source/pype.hosts.houdini.rst
deleted file mode 100644
index 5db18ab3d4..0000000000
--- a/docs/source/pype.hosts.houdini.rst
+++ /dev/null
@@ -1,18 +0,0 @@
-pype.hosts.houdini package
-==========================
-
-.. automodule:: pype.hosts.houdini
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.hosts.houdini.lib module
------------------------------
-
-.. automodule:: pype.hosts.houdini.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.maya.action.rst b/docs/source/pype.hosts.maya.action.rst
deleted file mode 100644
index e1ad7e5d43..0000000000
--- a/docs/source/pype.hosts.maya.action.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.maya.action module
-=============================
-
-.. automodule:: pype.hosts.maya.action
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.maya.customize.rst b/docs/source/pype.hosts.maya.customize.rst
deleted file mode 100644
index 335e75b0d4..0000000000
--- a/docs/source/pype.hosts.maya.customize.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.maya.customize module
-================================
-
-.. automodule:: pype.hosts.maya.customize
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.maya.expected_files.rst b/docs/source/pype.hosts.maya.expected_files.rst
deleted file mode 100644
index 0ecf22e502..0000000000
--- a/docs/source/pype.hosts.maya.expected_files.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.maya.expected\_files module
-======================================
-
-.. automodule:: pype.hosts.maya.expected_files
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.maya.lib.rst b/docs/source/pype.hosts.maya.lib.rst
deleted file mode 100644
index 7d7dbe4502..0000000000
--- a/docs/source/pype.hosts.maya.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.maya.lib module
-==========================
-
-.. automodule:: pype.hosts.maya.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.maya.menu.rst b/docs/source/pype.hosts.maya.menu.rst
deleted file mode 100644
index 614e113769..0000000000
--- a/docs/source/pype.hosts.maya.menu.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.maya.menu module
-===========================
-
-.. automodule:: pype.hosts.maya.menu
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.maya.plugin.rst b/docs/source/pype.hosts.maya.plugin.rst
deleted file mode 100644
index 5796b40c70..0000000000
--- a/docs/source/pype.hosts.maya.plugin.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.maya.plugin module
-=============================
-
-.. automodule:: pype.hosts.maya.plugin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.maya.rst b/docs/source/pype.hosts.maya.rst
deleted file mode 100644
index 0beab888fc..0000000000
--- a/docs/source/pype.hosts.maya.rst
+++ /dev/null
@@ -1,58 +0,0 @@
-pype.hosts.maya package
-=======================
-
-.. automodule:: pype.hosts.maya
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.hosts.maya.action module
------------------------------
-
-.. automodule:: pype.hosts.maya.action
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.maya.customize module
---------------------------------
-
-.. automodule:: pype.hosts.maya.customize
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.maya.expected\_files module
---------------------------------------
-
-.. automodule:: pype.hosts.maya.expected_files
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.maya.lib module
---------------------------
-
-.. automodule:: pype.hosts.maya.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.maya.menu module
----------------------------
-
-.. automodule:: pype.hosts.maya.menu
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.maya.plugin module
------------------------------
-
-.. automodule:: pype.hosts.maya.plugin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.nuke.actions.rst b/docs/source/pype.hosts.nuke.actions.rst
deleted file mode 100644
index d5e8849a38..0000000000
--- a/docs/source/pype.hosts.nuke.actions.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.nuke.actions module
-==============================
-
-.. automodule:: pype.hosts.nuke.actions
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.nuke.lib.rst b/docs/source/pype.hosts.nuke.lib.rst
deleted file mode 100644
index c177a27f2d..0000000000
--- a/docs/source/pype.hosts.nuke.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.nuke.lib module
-==========================
-
-.. automodule:: pype.hosts.nuke.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.nuke.menu.rst b/docs/source/pype.hosts.nuke.menu.rst
deleted file mode 100644
index 190e488b95..0000000000
--- a/docs/source/pype.hosts.nuke.menu.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.nuke.menu module
-===========================
-
-.. automodule:: pype.hosts.nuke.menu
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.nuke.plugin.rst b/docs/source/pype.hosts.nuke.plugin.rst
deleted file mode 100644
index ddd5f1db89..0000000000
--- a/docs/source/pype.hosts.nuke.plugin.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.nuke.plugin module
-=============================
-
-.. automodule:: pype.hosts.nuke.plugin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.nuke.presets.rst b/docs/source/pype.hosts.nuke.presets.rst
deleted file mode 100644
index a69aa8a367..0000000000
--- a/docs/source/pype.hosts.nuke.presets.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.nuke.presets module
-==============================
-
-.. automodule:: pype.hosts.nuke.presets
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.nuke.rst b/docs/source/pype.hosts.nuke.rst
deleted file mode 100644
index 559de65927..0000000000
--- a/docs/source/pype.hosts.nuke.rst
+++ /dev/null
@@ -1,58 +0,0 @@
-pype.hosts.nuke package
-=======================
-
-.. automodule:: pype.hosts.nuke
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.hosts.nuke.actions module
-------------------------------
-
-.. automodule:: pype.hosts.nuke.actions
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.nuke.lib module
---------------------------
-
-.. automodule:: pype.hosts.nuke.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.nuke.menu module
----------------------------
-
-.. automodule:: pype.hosts.nuke.menu
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.nuke.plugin module
------------------------------
-
-.. automodule:: pype.hosts.nuke.plugin
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.nuke.presets module
-------------------------------
-
-.. automodule:: pype.hosts.nuke.presets
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.nuke.utils module
-----------------------------
-
-.. automodule:: pype.hosts.nuke.utils
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.nuke.utils.rst b/docs/source/pype.hosts.nuke.utils.rst
deleted file mode 100644
index 66974dc707..0000000000
--- a/docs/source/pype.hosts.nuke.utils.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.nuke.utils module
-============================
-
-.. automodule:: pype.hosts.nuke.utils
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.nukestudio.rst b/docs/source/pype.hosts.nukestudio.rst
deleted file mode 100644
index c718d699fa..0000000000
--- a/docs/source/pype.hosts.nukestudio.rst
+++ /dev/null
@@ -1,50 +0,0 @@
-pype.hosts.nukestudio package
-=============================
-
-.. automodule:: pype.hosts.nukestudio
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.hosts.nukestudio.events module
------------------------------------
-
-.. automodule:: pype.hosts.nukestudio.events
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.nukestudio.lib module
---------------------------------
-
-.. automodule:: pype.hosts.nukestudio.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.nukestudio.menu module
----------------------------------
-
-.. automodule:: pype.hosts.nukestudio.menu
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.nukestudio.tags module
----------------------------------
-
-.. automodule:: pype.hosts.nukestudio.tags
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.nukestudio.workio module
------------------------------------
-
-.. automodule:: pype.hosts.nukestudio.workio
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.photoshop.rst b/docs/source/pype.hosts.photoshop.rst
deleted file mode 100644
index f77ea79874..0000000000
--- a/docs/source/pype.hosts.photoshop.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.photoshop package
-============================
-
-.. automodule:: pype.hosts.photoshop
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.premiere.lib.rst b/docs/source/pype.hosts.premiere.lib.rst
deleted file mode 100644
index e2c2723841..0000000000
--- a/docs/source/pype.hosts.premiere.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.premiere.lib module
-==============================
-
-.. automodule:: pype.hosts.premiere.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.premiere.rst b/docs/source/pype.hosts.premiere.rst
deleted file mode 100644
index 7c38d52c22..0000000000
--- a/docs/source/pype.hosts.premiere.rst
+++ /dev/null
@@ -1,18 +0,0 @@
-pype.hosts.premiere package
-===========================
-
-.. automodule:: pype.hosts.premiere
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.hosts.premiere.lib module
-------------------------------
-
-.. automodule:: pype.hosts.premiere.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.action.rst b/docs/source/pype.hosts.resolve.action.rst
deleted file mode 100644
index 781694781f..0000000000
--- a/docs/source/pype.hosts.resolve.action.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.action module
-================================
-
-.. automodule:: pype.hosts.resolve.action
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.lib.rst b/docs/source/pype.hosts.resolve.lib.rst
deleted file mode 100644
index 5860f783cc..0000000000
--- a/docs/source/pype.hosts.resolve.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.lib module
-=============================
-
-.. automodule:: pype.hosts.resolve.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.menu.rst b/docs/source/pype.hosts.resolve.menu.rst
deleted file mode 100644
index df87dcde98..0000000000
--- a/docs/source/pype.hosts.resolve.menu.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.menu module
-==============================
-
-.. automodule:: pype.hosts.resolve.menu
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.otio.davinci_export.rst b/docs/source/pype.hosts.resolve.otio.davinci_export.rst
deleted file mode 100644
index 498f96a7ed..0000000000
--- a/docs/source/pype.hosts.resolve.otio.davinci_export.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.otio.davinci\_export module
-==============================================
-
-.. automodule:: pype.hosts.resolve.otio.davinci_export
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.otio.davinci_import.rst b/docs/source/pype.hosts.resolve.otio.davinci_import.rst
deleted file mode 100644
index 30f43cc9fe..0000000000
--- a/docs/source/pype.hosts.resolve.otio.davinci_import.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.otio.davinci\_import module
-==============================================
-
-.. automodule:: pype.hosts.resolve.otio.davinci_import
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.otio.rst b/docs/source/pype.hosts.resolve.otio.rst
deleted file mode 100644
index 523d8937ca..0000000000
--- a/docs/source/pype.hosts.resolve.otio.rst
+++ /dev/null
@@ -1,17 +0,0 @@
-pype.hosts.resolve.otio package
-===============================
-
-.. automodule:: pype.hosts.resolve.otio
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-.. toctree::
- :maxdepth: 10
-
- pype.hosts.resolve.otio.davinci_export
- pype.hosts.resolve.otio.davinci_import
- pype.hosts.resolve.otio.utils
diff --git a/docs/source/pype.hosts.resolve.otio.utils.rst b/docs/source/pype.hosts.resolve.otio.utils.rst
deleted file mode 100644
index 765f492732..0000000000
--- a/docs/source/pype.hosts.resolve.otio.utils.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.otio.utils module
-====================================
-
-.. automodule:: pype.hosts.resolve.otio.utils
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.pipeline.rst b/docs/source/pype.hosts.resolve.pipeline.rst
deleted file mode 100644
index 3efc24137b..0000000000
--- a/docs/source/pype.hosts.resolve.pipeline.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.pipeline module
-==================================
-
-.. automodule:: pype.hosts.resolve.pipeline
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.plugin.rst b/docs/source/pype.hosts.resolve.plugin.rst
deleted file mode 100644
index 26f6c56aef..0000000000
--- a/docs/source/pype.hosts.resolve.plugin.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.plugin module
-================================
-
-.. automodule:: pype.hosts.resolve.plugin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.preload_console.rst b/docs/source/pype.hosts.resolve.preload_console.rst
deleted file mode 100644
index 0d38ae14ea..0000000000
--- a/docs/source/pype.hosts.resolve.preload_console.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.preload\_console module
-==========================================
-
-.. automodule:: pype.hosts.resolve.preload_console
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.rst b/docs/source/pype.hosts.resolve.rst
deleted file mode 100644
index 368129e43e..0000000000
--- a/docs/source/pype.hosts.resolve.rst
+++ /dev/null
@@ -1,74 +0,0 @@
-pype.hosts.resolve package
-==========================
-
-.. automodule:: pype.hosts.resolve
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.hosts.resolve.action module
---------------------------------
-
-.. automodule:: pype.hosts.resolve.action
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.resolve.lib module
------------------------------
-
-.. automodule:: pype.hosts.resolve.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.resolve.menu module
-------------------------------
-
-.. automodule:: pype.hosts.resolve.menu
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.resolve.pipeline module
-----------------------------------
-
-.. automodule:: pype.hosts.resolve.pipeline
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.resolve.plugin module
---------------------------------
-
-.. automodule:: pype.hosts.resolve.plugin
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.resolve.preload\_console module
-------------------------------------------
-
-.. automodule:: pype.hosts.resolve.preload_console
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.resolve.utils module
--------------------------------
-
-.. automodule:: pype.hosts.resolve.utils
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.resolve.workio module
---------------------------------
-
-.. automodule:: pype.hosts.resolve.workio
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.todo-rendering.rst b/docs/source/pype.hosts.resolve.todo-rendering.rst
deleted file mode 100644
index 8ea80183ce..0000000000
--- a/docs/source/pype.hosts.resolve.todo-rendering.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.todo\-rendering module
-=========================================
-
-.. automodule:: pype.hosts.resolve.todo-rendering
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.utils.rst b/docs/source/pype.hosts.resolve.utils.rst
deleted file mode 100644
index e390a5d026..0000000000
--- a/docs/source/pype.hosts.resolve.utils.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.utils module
-===============================
-
-.. automodule:: pype.hosts.resolve.utils
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.resolve.workio.rst b/docs/source/pype.hosts.resolve.workio.rst
deleted file mode 100644
index 5dceb99d64..0000000000
--- a/docs/source/pype.hosts.resolve.workio.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.resolve.workio module
-================================
-
-.. automodule:: pype.hosts.resolve.workio
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.rst b/docs/source/pype.hosts.rst
deleted file mode 100644
index e2d9121501..0000000000
--- a/docs/source/pype.hosts.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.hosts package
-==================
-
-.. automodule:: pype.hosts
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.hosts.blender
- pype.hosts.celaction
- pype.hosts.fusion
- pype.hosts.harmony
- pype.hosts.houdini
- pype.hosts.maya
- pype.hosts.nuke
- pype.hosts.nukestudio
- pype.hosts.photoshop
- pype.hosts.premiere
- pype.hosts.resolve
- pype.hosts.unreal
diff --git a/docs/source/pype.hosts.tvpaint.api.rst b/docs/source/pype.hosts.tvpaint.api.rst
deleted file mode 100644
index 43273e8ec5..0000000000
--- a/docs/source/pype.hosts.tvpaint.api.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.tvpaint.api package
-==============================
-
-.. automodule:: pype.hosts.tvpaint.api
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.tvpaint.rst b/docs/source/pype.hosts.tvpaint.rst
deleted file mode 100644
index 561be3a9dc..0000000000
--- a/docs/source/pype.hosts.tvpaint.rst
+++ /dev/null
@@ -1,15 +0,0 @@
-pype.hosts.tvpaint package
-==========================
-
-.. automodule:: pype.hosts.tvpaint
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 10
-
- pype.hosts.tvpaint.api
diff --git a/docs/source/pype.hosts.unreal.lib.rst b/docs/source/pype.hosts.unreal.lib.rst
deleted file mode 100644
index b891e71c47..0000000000
--- a/docs/source/pype.hosts.unreal.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.unreal.lib module
-============================
-
-.. automodule:: pype.hosts.unreal.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.unreal.plugin.rst b/docs/source/pype.hosts.unreal.plugin.rst
deleted file mode 100644
index e3ef81c7c7..0000000000
--- a/docs/source/pype.hosts.unreal.plugin.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.hosts.unreal.plugin module
-===============================
-
-.. automodule:: pype.hosts.unreal.plugin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.hosts.unreal.rst b/docs/source/pype.hosts.unreal.rst
deleted file mode 100644
index f46140298b..0000000000
--- a/docs/source/pype.hosts.unreal.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.hosts.unreal package
-=========================
-
-.. automodule:: pype.hosts.unreal
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.hosts.unreal.lib module
-----------------------------
-
-.. automodule:: pype.hosts.unreal.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.hosts.unreal.plugin module
--------------------------------
-
-.. automodule:: pype.hosts.unreal.plugin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.launcher_actions.rst b/docs/source/pype.launcher_actions.rst
deleted file mode 100644
index c7525acbd1..0000000000
--- a/docs/source/pype.launcher_actions.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.launcher\_actions module
-=============================
-
-.. automodule:: pype.launcher_actions
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.abstract_collect_render.rst b/docs/source/pype.lib.abstract_collect_render.rst
deleted file mode 100644
index d6adadc271..0000000000
--- a/docs/source/pype.lib.abstract_collect_render.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.abstract\_collect\_render module
-=========================================
-
-.. automodule:: pype.lib.abstract_collect_render
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.abstract_expected_files.rst b/docs/source/pype.lib.abstract_expected_files.rst
deleted file mode 100644
index 904aeb3375..0000000000
--- a/docs/source/pype.lib.abstract_expected_files.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.abstract\_expected\_files module
-=========================================
-
-.. automodule:: pype.lib.abstract_expected_files
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.abstract_metaplugins.rst b/docs/source/pype.lib.abstract_metaplugins.rst
deleted file mode 100644
index 9f2751b630..0000000000
--- a/docs/source/pype.lib.abstract_metaplugins.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.abstract\_metaplugins module
-=====================================
-
-.. automodule:: pype.lib.abstract_metaplugins
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.abstract_submit_deadline.rst b/docs/source/pype.lib.abstract_submit_deadline.rst
deleted file mode 100644
index a57222add3..0000000000
--- a/docs/source/pype.lib.abstract_submit_deadline.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.abstract\_submit\_deadline module
-==========================================
-
-.. automodule:: pype.lib.abstract_submit_deadline
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.anatomy.rst b/docs/source/pype.lib.anatomy.rst
deleted file mode 100644
index 7bddb37c8a..0000000000
--- a/docs/source/pype.lib.anatomy.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.anatomy module
-=======================
-
-.. automodule:: pype.lib.anatomy
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.applications.rst b/docs/source/pype.lib.applications.rst
deleted file mode 100644
index 8d1ff9b2c6..0000000000
--- a/docs/source/pype.lib.applications.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.applications module
-============================
-
-.. automodule:: pype.lib.applications
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.avalon_context.rst b/docs/source/pype.lib.avalon_context.rst
deleted file mode 100644
index 067ea3380f..0000000000
--- a/docs/source/pype.lib.avalon_context.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.avalon\_context module
-===============================
-
-.. automodule:: pype.lib.avalon_context
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.config.rst b/docs/source/pype.lib.config.rst
deleted file mode 100644
index ce4c13f4e7..0000000000
--- a/docs/source/pype.lib.config.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.config module
-======================
-
-.. automodule:: pype.lib.config
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.deprecated.rst b/docs/source/pype.lib.deprecated.rst
deleted file mode 100644
index ec5ee58d67..0000000000
--- a/docs/source/pype.lib.deprecated.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.deprecated module
-==========================
-
-.. automodule:: pype.lib.deprecated
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.editorial.rst b/docs/source/pype.lib.editorial.rst
deleted file mode 100644
index d32e495e51..0000000000
--- a/docs/source/pype.lib.editorial.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.editorial module
-=========================
-
-.. automodule:: pype.lib.editorial
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.env_tools.rst b/docs/source/pype.lib.env_tools.rst
deleted file mode 100644
index cb470207c8..0000000000
--- a/docs/source/pype.lib.env_tools.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.env\_tools module
-==========================
-
-.. automodule:: pype.lib.env_tools
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.execute.rst b/docs/source/pype.lib.execute.rst
deleted file mode 100644
index 82c4ef0ad8..0000000000
--- a/docs/source/pype.lib.execute.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.execute module
-=======================
-
-.. automodule:: pype.lib.execute
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.ffmpeg_utils.rst b/docs/source/pype.lib.ffmpeg_utils.rst
deleted file mode 100644
index 968a3f39c8..0000000000
--- a/docs/source/pype.lib.ffmpeg_utils.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.ffmpeg\_utils module
-=============================
-
-.. automodule:: pype.lib.ffmpeg_utils
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.git_progress.rst b/docs/source/pype.lib.git_progress.rst
deleted file mode 100644
index 017cf4c3c7..0000000000
--- a/docs/source/pype.lib.git_progress.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.git\_progress module
-=============================
-
-.. automodule:: pype.lib.git_progress
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.log.rst b/docs/source/pype.lib.log.rst
deleted file mode 100644
index 6282178850..0000000000
--- a/docs/source/pype.lib.log.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.log module
-===================
-
-.. automodule:: pype.lib.log
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.mongo.rst b/docs/source/pype.lib.mongo.rst
deleted file mode 100644
index 34fbc6af7f..0000000000
--- a/docs/source/pype.lib.mongo.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.mongo module
-=====================
-
-.. automodule:: pype.lib.mongo
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.path_tools.rst b/docs/source/pype.lib.path_tools.rst
deleted file mode 100644
index c19c41eea3..0000000000
--- a/docs/source/pype.lib.path_tools.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.path\_tools module
-===========================
-
-.. automodule:: pype.lib.path_tools
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.plugin_tools.rst b/docs/source/pype.lib.plugin_tools.rst
deleted file mode 100644
index 6eadc5d3be..0000000000
--- a/docs/source/pype.lib.plugin_tools.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.plugin\_tools module
-=============================
-
-.. automodule:: pype.lib.plugin_tools
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.profiling.rst b/docs/source/pype.lib.profiling.rst
deleted file mode 100644
index 1fded0c8fd..0000000000
--- a/docs/source/pype.lib.profiling.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.profiling module
-=========================
-
-.. automodule:: pype.lib.profiling
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.python_module_tools.rst b/docs/source/pype.lib.python_module_tools.rst
deleted file mode 100644
index c916080bce..0000000000
--- a/docs/source/pype.lib.python_module_tools.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.python\_module\_tools module
-=====================================
-
-.. automodule:: pype.lib.python_module_tools
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.rst b/docs/source/pype.lib.rst
deleted file mode 100644
index ea880eea3e..0000000000
--- a/docs/source/pype.lib.rst
+++ /dev/null
@@ -1,90 +0,0 @@
-pype.lib package
-================
-
-.. automodule:: pype.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.lib.anatomy module
------------------------
-
-.. automodule:: pype.lib.anatomy
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.lib.config module
-----------------------
-
-.. automodule:: pype.lib.config
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.lib.execute module
------------------------
-
-.. automodule:: pype.lib.execute
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.lib.git\_progress module
------------------------------
-
-.. automodule:: pype.lib.git_progress
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.lib.lib module
--------------------
-
-.. automodule:: pype.lib.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.lib.log module
--------------------
-
-.. automodule:: pype.lib.log
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.lib.mongo module
----------------------
-
-.. automodule:: pype.lib.mongo
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.lib.profiling module
--------------------------
-
-.. automodule:: pype.lib.profiling
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.lib.terminal module
-------------------------
-
-.. automodule:: pype.lib.terminal
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.lib.user\_settings module
-------------------------------
-
-.. automodule:: pype.lib.user_settings
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.terminal.rst b/docs/source/pype.lib.terminal.rst
deleted file mode 100644
index dafe1d8f69..0000000000
--- a/docs/source/pype.lib.terminal.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.terminal module
-========================
-
-.. automodule:: pype.lib.terminal
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.terminal_splash.rst b/docs/source/pype.lib.terminal_splash.rst
deleted file mode 100644
index 06038f0f09..0000000000
--- a/docs/source/pype.lib.terminal_splash.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.terminal\_splash module
-================================
-
-.. automodule:: pype.lib.terminal_splash
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.lib.user_settings.rst b/docs/source/pype.lib.user_settings.rst
deleted file mode 100644
index 7b4e8ced78..0000000000
--- a/docs/source/pype.lib.user_settings.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.lib.user\_settings module
-==============================
-
-.. automodule:: pype.lib.user_settings
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.adobe_communicator.adobe_comunicator.rst b/docs/source/pype.modules.adobe_communicator.adobe_comunicator.rst
deleted file mode 100644
index aadbaa0dc5..0000000000
--- a/docs/source/pype.modules.adobe_communicator.adobe_comunicator.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.adobe\_communicator.adobe\_comunicator module
-==========================================================
-
-.. automodule:: pype.modules.adobe_communicator.adobe_comunicator
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.adobe_communicator.lib.publish.rst b/docs/source/pype.modules.adobe_communicator.lib.publish.rst
deleted file mode 100644
index a16bf1dd0a..0000000000
--- a/docs/source/pype.modules.adobe_communicator.lib.publish.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.adobe\_communicator.lib.publish module
-===================================================
-
-.. automodule:: pype.modules.adobe_communicator.lib.publish
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.adobe_communicator.lib.rest_api.rst b/docs/source/pype.modules.adobe_communicator.lib.rest_api.rst
deleted file mode 100644
index 457bebef99..0000000000
--- a/docs/source/pype.modules.adobe_communicator.lib.rest_api.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.adobe\_communicator.lib.rest\_api module
-=====================================================
-
-.. automodule:: pype.modules.adobe_communicator.lib.rest_api
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.adobe_communicator.lib.rst b/docs/source/pype.modules.adobe_communicator.lib.rst
deleted file mode 100644
index cdec4ce80e..0000000000
--- a/docs/source/pype.modules.adobe_communicator.lib.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.modules.adobe\_communicator.lib package
-============================================
-
-.. automodule:: pype.modules.adobe_communicator.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.adobe\_communicator.lib.publish module
----------------------------------------------------
-
-.. automodule:: pype.modules.adobe_communicator.lib.publish
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.adobe\_communicator.lib.rest\_api module
------------------------------------------------------
-
-.. automodule:: pype.modules.adobe_communicator.lib.rest_api
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.adobe_communicator.rst b/docs/source/pype.modules.adobe_communicator.rst
deleted file mode 100644
index f2fa40ced4..0000000000
--- a/docs/source/pype.modules.adobe_communicator.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.modules.adobe\_communicator package
-========================================
-
-.. automodule:: pype.modules.adobe_communicator
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.modules.adobe_communicator.lib
-
-Submodules
-----------
-
-pype.modules.adobe\_communicator.adobe\_comunicator module
-----------------------------------------------------------
-
-.. automodule:: pype.modules.adobe_communicator.adobe_comunicator
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.avalon_apps.avalon_app.rst b/docs/source/pype.modules.avalon_apps.avalon_app.rst
deleted file mode 100644
index 43f467e748..0000000000
--- a/docs/source/pype.modules.avalon_apps.avalon_app.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.avalon\_apps.avalon\_app module
-============================================
-
-.. automodule:: pype.modules.avalon_apps.avalon_app
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.avalon_apps.rest_api.rst b/docs/source/pype.modules.avalon_apps.rest_api.rst
deleted file mode 100644
index d89c979311..0000000000
--- a/docs/source/pype.modules.avalon_apps.rest_api.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.avalon\_apps.rest\_api module
-==========================================
-
-.. automodule:: pype.modules.avalon_apps.rest_api
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.avalon_apps.rst b/docs/source/pype.modules.avalon_apps.rst
deleted file mode 100644
index 4755eddae6..0000000000
--- a/docs/source/pype.modules.avalon_apps.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.modules.avalon\_apps package
-=================================
-
-.. automodule:: pype.modules.avalon_apps
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.avalon\_apps.avalon\_app module
---------------------------------------------
-
-.. automodule:: pype.modules.avalon_apps.avalon_app
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.avalon\_apps.rest\_api module
-------------------------------------------
-
-.. automodule:: pype.modules.avalon_apps.rest_api
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.base.rst b/docs/source/pype.modules.base.rst
deleted file mode 100644
index 7cd3cfbd44..0000000000
--- a/docs/source/pype.modules.base.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.base module
-========================
-
-.. automodule:: pype.modules.base
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.clockify.clockify.rst b/docs/source/pype.modules.clockify.clockify.rst
deleted file mode 100644
index a3deaab81d..0000000000
--- a/docs/source/pype.modules.clockify.clockify.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.clockify.clockify module
-=====================================
-
-.. automodule:: pype.modules.clockify.clockify
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.clockify.clockify_api.rst b/docs/source/pype.modules.clockify.clockify_api.rst
deleted file mode 100644
index 2facc550c5..0000000000
--- a/docs/source/pype.modules.clockify.clockify_api.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.clockify.clockify\_api module
-==========================================
-
-.. automodule:: pype.modules.clockify.clockify_api
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.clockify.clockify_module.rst b/docs/source/pype.modules.clockify.clockify_module.rst
deleted file mode 100644
index 85f8e75ad1..0000000000
--- a/docs/source/pype.modules.clockify.clockify_module.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.clockify.clockify\_module module
-=============================================
-
-.. automodule:: pype.modules.clockify.clockify_module
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.clockify.constants.rst b/docs/source/pype.modules.clockify.constants.rst
deleted file mode 100644
index e30a073bfc..0000000000
--- a/docs/source/pype.modules.clockify.constants.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.clockify.constants module
-======================================
-
-.. automodule:: pype.modules.clockify.constants
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.clockify.rst b/docs/source/pype.modules.clockify.rst
deleted file mode 100644
index 550ba049c2..0000000000
--- a/docs/source/pype.modules.clockify.rst
+++ /dev/null
@@ -1,42 +0,0 @@
-pype.modules.clockify package
-=============================
-
-.. automodule:: pype.modules.clockify
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.clockify.clockify module
--------------------------------------
-
-.. automodule:: pype.modules.clockify.clockify
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.clockify.clockify\_api module
-------------------------------------------
-
-.. automodule:: pype.modules.clockify.clockify_api
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.clockify.constants module
---------------------------------------
-
-.. automodule:: pype.modules.clockify.constants
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.clockify.widgets module
-------------------------------------
-
-.. automodule:: pype.modules.clockify.widgets
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.clockify.widgets.rst b/docs/source/pype.modules.clockify.widgets.rst
deleted file mode 100644
index e9809fb048..0000000000
--- a/docs/source/pype.modules.clockify.widgets.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.clockify.widgets module
-====================================
-
-.. automodule:: pype.modules.clockify.widgets
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.deadline.deadline_module.rst b/docs/source/pype.modules.deadline.deadline_module.rst
deleted file mode 100644
index 43e7198a8b..0000000000
--- a/docs/source/pype.modules.deadline.deadline_module.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.deadline.deadline\_module module
-=============================================
-
-.. automodule:: pype.modules.deadline.deadline_module
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.deadline.rst b/docs/source/pype.modules.deadline.rst
deleted file mode 100644
index 7633b2b950..0000000000
--- a/docs/source/pype.modules.deadline.rst
+++ /dev/null
@@ -1,15 +0,0 @@
-pype.modules.deadline package
-=============================
-
-.. automodule:: pype.modules.deadline
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-.. toctree::
- :maxdepth: 10
-
- pype.modules.deadline.deadline_module
diff --git a/docs/source/pype.modules.ftrack.ftrack_module.rst b/docs/source/pype.modules.ftrack.ftrack_module.rst
deleted file mode 100644
index 4188ffbed8..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_module.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.ftrack\_module module
-=========================================
-
-.. automodule:: pype.modules.ftrack.ftrack_module
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.ftrack_server.custom_db_connector.rst b/docs/source/pype.modules.ftrack.ftrack_server.custom_db_connector.rst
deleted file mode 100644
index b42c3e054d..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_server.custom_db_connector.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.ftrack\_server.custom\_db\_connector module
-===============================================================
-
-.. automodule:: pype.modules.ftrack.ftrack_server.custom_db_connector
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.ftrack_server.event_server_cli.rst b/docs/source/pype.modules.ftrack.ftrack_server.event_server_cli.rst
deleted file mode 100644
index d6404f965c..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_server.event_server_cli.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.ftrack\_server.event\_server\_cli module
-============================================================
-
-.. automodule:: pype.modules.ftrack.ftrack_server.event_server_cli
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.ftrack_server.ftrack_server.rst b/docs/source/pype.modules.ftrack.ftrack_server.ftrack_server.rst
deleted file mode 100644
index af2783c263..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_server.ftrack_server.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.ftrack\_server.ftrack\_server module
-========================================================
-
-.. automodule:: pype.modules.ftrack.ftrack_server.ftrack_server
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.ftrack_server.lib.rst b/docs/source/pype.modules.ftrack.ftrack_server.lib.rst
deleted file mode 100644
index 2ac4cef517..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_server.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.ftrack\_server.lib module
-=============================================
-
-.. automodule:: pype.modules.ftrack.ftrack_server.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.ftrack_server.rst b/docs/source/pype.modules.ftrack.ftrack_server.rst
deleted file mode 100644
index 417acc1a45..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_server.rst
+++ /dev/null
@@ -1,90 +0,0 @@
-pype.modules.ftrack.ftrack\_server package
-==========================================
-
-.. automodule:: pype.modules.ftrack.ftrack_server
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.ftrack.ftrack\_server.custom\_db\_connector module
----------------------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.ftrack_server.custom_db_connector
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.ftrack\_server.event\_server\_cli module
-------------------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.ftrack_server.event_server_cli
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.ftrack\_server.ftrack\_server module
---------------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.ftrack_server.ftrack_server
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.ftrack\_server.lib module
----------------------------------------------
-
-.. automodule:: pype.modules.ftrack.ftrack_server.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.ftrack\_server.socket\_thread module
---------------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.ftrack_server.socket_thread
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.ftrack\_server.sub\_event\_processor module
----------------------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.ftrack_server.sub_event_processor
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.ftrack\_server.sub\_event\_status module
-------------------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.ftrack_server.sub_event_status
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.ftrack\_server.sub\_event\_storer module
-------------------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.ftrack_server.sub_event_storer
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.ftrack\_server.sub\_legacy\_server module
--------------------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.ftrack_server.sub_legacy_server
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.ftrack\_server.sub\_user\_server module
------------------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.ftrack_server.sub_user_server
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.ftrack_server.socket_thread.rst b/docs/source/pype.modules.ftrack.ftrack_server.socket_thread.rst
deleted file mode 100644
index d8d24a8288..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_server.socket_thread.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.ftrack\_server.socket\_thread module
-========================================================
-
-.. automodule:: pype.modules.ftrack.ftrack_server.socket_thread
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.ftrack_server.sub_event_processor.rst b/docs/source/pype.modules.ftrack.ftrack_server.sub_event_processor.rst
deleted file mode 100644
index 04f863e347..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_server.sub_event_processor.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.ftrack\_server.sub\_event\_processor module
-===============================================================
-
-.. automodule:: pype.modules.ftrack.ftrack_server.sub_event_processor
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.ftrack_server.sub_event_status.rst b/docs/source/pype.modules.ftrack.ftrack_server.sub_event_status.rst
deleted file mode 100644
index 876b7313cf..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_server.sub_event_status.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.ftrack\_server.sub\_event\_status module
-============================================================
-
-.. automodule:: pype.modules.ftrack.ftrack_server.sub_event_status
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.ftrack_server.sub_event_storer.rst b/docs/source/pype.modules.ftrack.ftrack_server.sub_event_storer.rst
deleted file mode 100644
index 3d2d400d55..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_server.sub_event_storer.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.ftrack\_server.sub\_event\_storer module
-============================================================
-
-.. automodule:: pype.modules.ftrack.ftrack_server.sub_event_storer
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.ftrack_server.sub_legacy_server.rst b/docs/source/pype.modules.ftrack.ftrack_server.sub_legacy_server.rst
deleted file mode 100644
index d25cdfe8de..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_server.sub_legacy_server.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.ftrack\_server.sub\_legacy\_server module
-=============================================================
-
-.. automodule:: pype.modules.ftrack.ftrack_server.sub_legacy_server
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.ftrack_server.sub_user_server.rst b/docs/source/pype.modules.ftrack.ftrack_server.sub_user_server.rst
deleted file mode 100644
index c13095d5f1..0000000000
--- a/docs/source/pype.modules.ftrack.ftrack_server.sub_user_server.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.ftrack\_server.sub\_user\_server module
-===========================================================
-
-.. automodule:: pype.modules.ftrack.ftrack_server.sub_user_server
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.lib.avalon_sync.rst b/docs/source/pype.modules.ftrack.lib.avalon_sync.rst
deleted file mode 100644
index 954ec4d911..0000000000
--- a/docs/source/pype.modules.ftrack.lib.avalon_sync.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.lib.avalon\_sync module
-===========================================
-
-.. automodule:: pype.modules.ftrack.lib.avalon_sync
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.lib.credentials.rst b/docs/source/pype.modules.ftrack.lib.credentials.rst
deleted file mode 100644
index 3965dc406d..0000000000
--- a/docs/source/pype.modules.ftrack.lib.credentials.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.lib.credentials module
-==========================================
-
-.. automodule:: pype.modules.ftrack.lib.credentials
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.lib.ftrack_action_handler.rst b/docs/source/pype.modules.ftrack.lib.ftrack_action_handler.rst
deleted file mode 100644
index cec38f9b8a..0000000000
--- a/docs/source/pype.modules.ftrack.lib.ftrack_action_handler.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.lib.ftrack\_action\_handler module
-======================================================
-
-.. automodule:: pype.modules.ftrack.lib.ftrack_action_handler
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.lib.ftrack_app_handler.rst b/docs/source/pype.modules.ftrack.lib.ftrack_app_handler.rst
deleted file mode 100644
index 1f7395927d..0000000000
--- a/docs/source/pype.modules.ftrack.lib.ftrack_app_handler.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.lib.ftrack\_app\_handler module
-===================================================
-
-.. automodule:: pype.modules.ftrack.lib.ftrack_app_handler
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.lib.ftrack_base_handler.rst b/docs/source/pype.modules.ftrack.lib.ftrack_base_handler.rst
deleted file mode 100644
index 94fab7c940..0000000000
--- a/docs/source/pype.modules.ftrack.lib.ftrack_base_handler.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.lib.ftrack\_base\_handler module
-====================================================
-
-.. automodule:: pype.modules.ftrack.lib.ftrack_base_handler
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.lib.ftrack_event_handler.rst b/docs/source/pype.modules.ftrack.lib.ftrack_event_handler.rst
deleted file mode 100644
index 0b57219b50..0000000000
--- a/docs/source/pype.modules.ftrack.lib.ftrack_event_handler.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.lib.ftrack\_event\_handler module
-=====================================================
-
-.. automodule:: pype.modules.ftrack.lib.ftrack_event_handler
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.lib.rst b/docs/source/pype.modules.ftrack.lib.rst
deleted file mode 100644
index 32a219ab3a..0000000000
--- a/docs/source/pype.modules.ftrack.lib.rst
+++ /dev/null
@@ -1,58 +0,0 @@
-pype.modules.ftrack.lib package
-===============================
-
-.. automodule:: pype.modules.ftrack.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.ftrack.lib.avalon\_sync module
--------------------------------------------
-
-.. automodule:: pype.modules.ftrack.lib.avalon_sync
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.lib.credentials module
-------------------------------------------
-
-.. automodule:: pype.modules.ftrack.lib.credentials
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.lib.ftrack\_action\_handler module
-------------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.lib.ftrack_action_handler
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.lib.ftrack\_app\_handler module
----------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.lib.ftrack_app_handler
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.lib.ftrack\_base\_handler module
-----------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.lib.ftrack_base_handler
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.lib.ftrack\_event\_handler module
------------------------------------------------------
-
-.. automodule:: pype.modules.ftrack.lib.ftrack_event_handler
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.lib.settings.rst b/docs/source/pype.modules.ftrack.lib.settings.rst
deleted file mode 100644
index 255d52178a..0000000000
--- a/docs/source/pype.modules.ftrack.lib.settings.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.lib.settings module
-=======================================
-
-.. automodule:: pype.modules.ftrack.lib.settings
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.rst b/docs/source/pype.modules.ftrack.rst
deleted file mode 100644
index 13a92db808..0000000000
--- a/docs/source/pype.modules.ftrack.rst
+++ /dev/null
@@ -1,17 +0,0 @@
-pype.modules.ftrack package
-===========================
-
-.. automodule:: pype.modules.ftrack
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.modules.ftrack.ftrack_server
- pype.modules.ftrack.lib
- pype.modules.ftrack.tray
diff --git a/docs/source/pype.modules.ftrack.tray.ftrack_module.rst b/docs/source/pype.modules.ftrack.tray.ftrack_module.rst
deleted file mode 100644
index c4a370472c..0000000000
--- a/docs/source/pype.modules.ftrack.tray.ftrack_module.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.tray.ftrack\_module module
-==============================================
-
-.. automodule:: pype.modules.ftrack.tray.ftrack_module
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.tray.ftrack_tray.rst b/docs/source/pype.modules.ftrack.tray.ftrack_tray.rst
deleted file mode 100644
index 147647e9b4..0000000000
--- a/docs/source/pype.modules.ftrack.tray.ftrack_tray.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.tray.ftrack\_tray module
-============================================
-
-.. automodule:: pype.modules.ftrack.tray.ftrack_tray
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.tray.login_dialog.rst b/docs/source/pype.modules.ftrack.tray.login_dialog.rst
deleted file mode 100644
index dabc2e73a7..0000000000
--- a/docs/source/pype.modules.ftrack.tray.login_dialog.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.tray.login\_dialog module
-=============================================
-
-.. automodule:: pype.modules.ftrack.tray.login_dialog
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.tray.login_tools.rst b/docs/source/pype.modules.ftrack.tray.login_tools.rst
deleted file mode 100644
index 00ec690866..0000000000
--- a/docs/source/pype.modules.ftrack.tray.login_tools.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.ftrack.tray.login\_tools module
-============================================
-
-.. automodule:: pype.modules.ftrack.tray.login_tools
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.ftrack.tray.rst b/docs/source/pype.modules.ftrack.tray.rst
deleted file mode 100644
index 79772a9c3b..0000000000
--- a/docs/source/pype.modules.ftrack.tray.rst
+++ /dev/null
@@ -1,34 +0,0 @@
-pype.modules.ftrack.tray package
-================================
-
-.. automodule:: pype.modules.ftrack.tray
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.ftrack.tray.ftrack\_module module
-----------------------------------------------
-
-.. automodule:: pype.modules.ftrack.tray.ftrack_module
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.tray.login\_dialog module
----------------------------------------------
-
-.. automodule:: pype.modules.ftrack.tray.login_dialog
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.ftrack.tray.login\_tools module
---------------------------------------------
-
-.. automodule:: pype.modules.ftrack.tray.login_tools
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.idle_manager.idle_manager.rst b/docs/source/pype.modules.idle_manager.idle_manager.rst
deleted file mode 100644
index 8e93f97e6b..0000000000
--- a/docs/source/pype.modules.idle_manager.idle_manager.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.idle\_manager.idle\_manager module
-===============================================
-
-.. automodule:: pype.modules.idle_manager.idle_manager
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.idle_manager.rst b/docs/source/pype.modules.idle_manager.rst
deleted file mode 100644
index a3f7922999..0000000000
--- a/docs/source/pype.modules.idle_manager.rst
+++ /dev/null
@@ -1,18 +0,0 @@
-pype.modules.idle\_manager package
-==================================
-
-.. automodule:: pype.modules.idle_manager
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.idle\_manager.idle\_manager module
------------------------------------------------
-
-.. automodule:: pype.modules.idle_manager.idle_manager
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.launcher_action.rst b/docs/source/pype.modules.launcher_action.rst
deleted file mode 100644
index a63408e747..0000000000
--- a/docs/source/pype.modules.launcher_action.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.launcher\_action module
-====================================
-
-.. automodule:: pype.modules.launcher_action
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.log_viewer.log_view_module.rst b/docs/source/pype.modules.log_viewer.log_view_module.rst
deleted file mode 100644
index 8d80170a9c..0000000000
--- a/docs/source/pype.modules.log_viewer.log_view_module.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.log\_viewer.log\_view\_module module
-=================================================
-
-.. automodule:: pype.modules.log_viewer.log_view_module
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.log_viewer.rst b/docs/source/pype.modules.log_viewer.rst
deleted file mode 100644
index e275d56086..0000000000
--- a/docs/source/pype.modules.log_viewer.rst
+++ /dev/null
@@ -1,23 +0,0 @@
-pype.modules.log\_viewer package
-================================
-
-.. automodule:: pype.modules.log_viewer
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 10
-
- pype.modules.log_viewer.tray
-
-Submodules
-----------
-
-.. toctree::
- :maxdepth: 10
-
- pype.modules.log_viewer.log_view_module
diff --git a/docs/source/pype.modules.log_viewer.tray.app.rst b/docs/source/pype.modules.log_viewer.tray.app.rst
deleted file mode 100644
index 0948a05594..0000000000
--- a/docs/source/pype.modules.log_viewer.tray.app.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.log\_viewer.tray.app module
-========================================
-
-.. automodule:: pype.modules.log_viewer.tray.app
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.log_viewer.tray.models.rst b/docs/source/pype.modules.log_viewer.tray.models.rst
deleted file mode 100644
index 4da3887600..0000000000
--- a/docs/source/pype.modules.log_viewer.tray.models.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.log\_viewer.tray.models module
-===========================================
-
-.. automodule:: pype.modules.log_viewer.tray.models
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.log_viewer.tray.rst b/docs/source/pype.modules.log_viewer.tray.rst
deleted file mode 100644
index 5f4b92f627..0000000000
--- a/docs/source/pype.modules.log_viewer.tray.rst
+++ /dev/null
@@ -1,17 +0,0 @@
-pype.modules.log\_viewer.tray package
-=====================================
-
-.. automodule:: pype.modules.log_viewer.tray
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-.. toctree::
- :maxdepth: 10
-
- pype.modules.log_viewer.tray.app
- pype.modules.log_viewer.tray.models
- pype.modules.log_viewer.tray.widgets
diff --git a/docs/source/pype.modules.log_viewer.tray.widgets.rst b/docs/source/pype.modules.log_viewer.tray.widgets.rst
deleted file mode 100644
index cb57c96559..0000000000
--- a/docs/source/pype.modules.log_viewer.tray.widgets.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.log\_viewer.tray.widgets module
-============================================
-
-.. automodule:: pype.modules.log_viewer.tray.widgets
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.muster.muster.rst b/docs/source/pype.modules.muster.muster.rst
deleted file mode 100644
index d3ba1e7052..0000000000
--- a/docs/source/pype.modules.muster.muster.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.muster.muster module
-=================================
-
-.. automodule:: pype.modules.muster.muster
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.muster.rst b/docs/source/pype.modules.muster.rst
deleted file mode 100644
index d8d0f762f4..0000000000
--- a/docs/source/pype.modules.muster.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.modules.muster package
-===========================
-
-.. automodule:: pype.modules.muster
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.muster.muster module
----------------------------------
-
-.. automodule:: pype.modules.muster.muster
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.muster.widget\_login module
-----------------------------------------
-
-.. automodule:: pype.modules.muster.widget_login
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.muster.widget_login.rst b/docs/source/pype.modules.muster.widget_login.rst
deleted file mode 100644
index 1c59cec820..0000000000
--- a/docs/source/pype.modules.muster.widget_login.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.muster.widget\_login module
-========================================
-
-.. automodule:: pype.modules.muster.widget_login
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.rest_api.base_class.rst b/docs/source/pype.modules.rest_api.base_class.rst
deleted file mode 100644
index c2a1030a78..0000000000
--- a/docs/source/pype.modules.rest_api.base_class.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.rest\_api.base\_class module
-=========================================
-
-.. automodule:: pype.modules.rest_api.base_class
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.rest_api.lib.exceptions.rst b/docs/source/pype.modules.rest_api.lib.exceptions.rst
deleted file mode 100644
index d755420ad0..0000000000
--- a/docs/source/pype.modules.rest_api.lib.exceptions.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.rest\_api.lib.exceptions module
-============================================
-
-.. automodule:: pype.modules.rest_api.lib.exceptions
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.rest_api.lib.factory.rst b/docs/source/pype.modules.rest_api.lib.factory.rst
deleted file mode 100644
index 2131d1b8da..0000000000
--- a/docs/source/pype.modules.rest_api.lib.factory.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.rest\_api.lib.factory module
-=========================================
-
-.. automodule:: pype.modules.rest_api.lib.factory
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.rest_api.lib.handler.rst b/docs/source/pype.modules.rest_api.lib.handler.rst
deleted file mode 100644
index 6e340daf9b..0000000000
--- a/docs/source/pype.modules.rest_api.lib.handler.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.rest\_api.lib.handler module
-=========================================
-
-.. automodule:: pype.modules.rest_api.lib.handler
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.rest_api.lib.lib.rst b/docs/source/pype.modules.rest_api.lib.lib.rst
deleted file mode 100644
index 19663788e0..0000000000
--- a/docs/source/pype.modules.rest_api.lib.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.rest\_api.lib.lib module
-=====================================
-
-.. automodule:: pype.modules.rest_api.lib.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.rest_api.lib.rst b/docs/source/pype.modules.rest_api.lib.rst
deleted file mode 100644
index ed8288ee73..0000000000
--- a/docs/source/pype.modules.rest_api.lib.rst
+++ /dev/null
@@ -1,42 +0,0 @@
-pype.modules.rest\_api.lib package
-==================================
-
-.. automodule:: pype.modules.rest_api.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.rest\_api.lib.exceptions module
---------------------------------------------
-
-.. automodule:: pype.modules.rest_api.lib.exceptions
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.rest\_api.lib.factory module
------------------------------------------
-
-.. automodule:: pype.modules.rest_api.lib.factory
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.rest\_api.lib.handler module
------------------------------------------
-
-.. automodule:: pype.modules.rest_api.lib.handler
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.rest\_api.lib.lib module
--------------------------------------
-
-.. automodule:: pype.modules.rest_api.lib.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.rest_api.rest_api.rst b/docs/source/pype.modules.rest_api.rest_api.rst
deleted file mode 100644
index e3d951ac9f..0000000000
--- a/docs/source/pype.modules.rest_api.rest_api.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.rest\_api.rest\_api module
-=======================================
-
-.. automodule:: pype.modules.rest_api.rest_api
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.rest_api.rst b/docs/source/pype.modules.rest_api.rst
deleted file mode 100644
index 09c58c84f8..0000000000
--- a/docs/source/pype.modules.rest_api.rst
+++ /dev/null
@@ -1,34 +0,0 @@
-pype.modules.rest\_api package
-==============================
-
-.. automodule:: pype.modules.rest_api
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.modules.rest_api.lib
-
-Submodules
-----------
-
-pype.modules.rest\_api.base\_class module
------------------------------------------
-
-.. automodule:: pype.modules.rest_api.base_class
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.rest\_api.rest\_api module
----------------------------------------
-
-.. automodule:: pype.modules.rest_api.rest_api
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.rst b/docs/source/pype.modules.rst
deleted file mode 100644
index 148c2084b4..0000000000
--- a/docs/source/pype.modules.rst
+++ /dev/null
@@ -1,36 +0,0 @@
-pype.modules package
-====================
-
-.. automodule:: pype.modules
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.modules.adobe_communicator
- pype.modules.avalon_apps
- pype.modules.clockify
- pype.modules.ftrack
- pype.modules.idle_manager
- pype.modules.muster
- pype.modules.rest_api
- pype.modules.standalonepublish
- pype.modules.timers_manager
- pype.modules.user
- pype.modules.websocket_server
-
-Submodules
-----------
-
-pype.modules.base module
-------------------------
-
-.. automodule:: pype.modules.base
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.settings_action.rst b/docs/source/pype.modules.settings_action.rst
deleted file mode 100644
index 10f0881ced..0000000000
--- a/docs/source/pype.modules.settings_action.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.settings\_action module
-====================================
-
-.. automodule:: pype.modules.settings_action
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.standalonepublish.rst b/docs/source/pype.modules.standalonepublish.rst
deleted file mode 100644
index 2ed366af5c..0000000000
--- a/docs/source/pype.modules.standalonepublish.rst
+++ /dev/null
@@ -1,18 +0,0 @@
-pype.modules.standalonepublish package
-======================================
-
-.. automodule:: pype.modules.standalonepublish
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.standalonepublish.standalonepublish\_module module
----------------------------------------------------------------
-
-.. automodule:: pype.modules.standalonepublish.standalonepublish_module
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.standalonepublish.standalonepublish_module.rst b/docs/source/pype.modules.standalonepublish.standalonepublish_module.rst
deleted file mode 100644
index a78826a4b4..0000000000
--- a/docs/source/pype.modules.standalonepublish.standalonepublish_module.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.standalonepublish.standalonepublish\_module module
-===============================================================
-
-.. automodule:: pype.modules.standalonepublish.standalonepublish_module
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.standalonepublish_action.rst b/docs/source/pype.modules.standalonepublish_action.rst
deleted file mode 100644
index d51dbcefa0..0000000000
--- a/docs/source/pype.modules.standalonepublish_action.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.standalonepublish\_action module
-=============================================
-
-.. automodule:: pype.modules.standalonepublish_action
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.sync_server.rst b/docs/source/pype.modules.sync_server.rst
deleted file mode 100644
index a26dc7e212..0000000000
--- a/docs/source/pype.modules.sync_server.rst
+++ /dev/null
@@ -1,16 +0,0 @@
-pype.modules.sync\_server package
-=================================
-
-.. automodule:: pype.modules.sync_server
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-.. toctree::
- :maxdepth: 10
-
- pype.modules.sync_server.sync_server
- pype.modules.sync_server.utils
diff --git a/docs/source/pype.modules.sync_server.sync_server.rst b/docs/source/pype.modules.sync_server.sync_server.rst
deleted file mode 100644
index 36d6aa68ed..0000000000
--- a/docs/source/pype.modules.sync_server.sync_server.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.sync\_server.sync\_server module
-=============================================
-
-.. automodule:: pype.modules.sync_server.sync_server
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.sync_server.utils.rst b/docs/source/pype.modules.sync_server.utils.rst
deleted file mode 100644
index 325d5e435d..0000000000
--- a/docs/source/pype.modules.sync_server.utils.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.sync\_server.utils module
-======================================
-
-.. automodule:: pype.modules.sync_server.utils
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.timers_manager.rst b/docs/source/pype.modules.timers_manager.rst
deleted file mode 100644
index 6c971e9dc1..0000000000
--- a/docs/source/pype.modules.timers_manager.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.modules.timers\_manager package
-====================================
-
-.. automodule:: pype.modules.timers_manager
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.timers\_manager.timers\_manager module
----------------------------------------------------
-
-.. automodule:: pype.modules.timers_manager.timers_manager
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.timers\_manager.widget\_user\_idle module
-------------------------------------------------------
-
-.. automodule:: pype.modules.timers_manager.widget_user_idle
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.timers_manager.timers_manager.rst b/docs/source/pype.modules.timers_manager.timers_manager.rst
deleted file mode 100644
index fe18e4d15c..0000000000
--- a/docs/source/pype.modules.timers_manager.timers_manager.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.timers\_manager.timers\_manager module
-===================================================
-
-.. automodule:: pype.modules.timers_manager.timers_manager
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.timers_manager.widget_user_idle.rst b/docs/source/pype.modules.timers_manager.widget_user_idle.rst
deleted file mode 100644
index b072879c7a..0000000000
--- a/docs/source/pype.modules.timers_manager.widget_user_idle.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.timers\_manager.widget\_user\_idle module
-======================================================
-
-.. automodule:: pype.modules.timers_manager.widget_user_idle
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.user.rst b/docs/source/pype.modules.user.rst
deleted file mode 100644
index d181b263e5..0000000000
--- a/docs/source/pype.modules.user.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.modules.user package
-=========================
-
-.. automodule:: pype.modules.user
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.user.user\_module module
--------------------------------------
-
-.. automodule:: pype.modules.user.user_module
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.user.widget\_user module
--------------------------------------
-
-.. automodule:: pype.modules.user.widget_user
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.user.user_module.rst b/docs/source/pype.modules.user.user_module.rst
deleted file mode 100644
index a8e0cd6bad..0000000000
--- a/docs/source/pype.modules.user.user_module.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.user.user\_module module
-=====================================
-
-.. automodule:: pype.modules.user.user_module
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.user.widget_user.rst b/docs/source/pype.modules.user.widget_user.rst
deleted file mode 100644
index 2979e5ead4..0000000000
--- a/docs/source/pype.modules.user.widget_user.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.user.widget\_user module
-=====================================
-
-.. automodule:: pype.modules.user.widget_user
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.websocket_server.hosts.aftereffects.rst b/docs/source/pype.modules.websocket_server.hosts.aftereffects.rst
deleted file mode 100644
index 9f4720ae14..0000000000
--- a/docs/source/pype.modules.websocket_server.hosts.aftereffects.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.websocket\_server.hosts.aftereffects module
-========================================================
-
-.. automodule:: pype.modules.websocket_server.hosts.aftereffects
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.websocket_server.hosts.external_app_1.rst b/docs/source/pype.modules.websocket_server.hosts.external_app_1.rst
deleted file mode 100644
index 4ac69d9015..0000000000
--- a/docs/source/pype.modules.websocket_server.hosts.external_app_1.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.websocket\_server.hosts.external\_app\_1 module
-============================================================
-
-.. automodule:: pype.modules.websocket_server.hosts.external_app_1
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.websocket_server.hosts.photoshop.rst b/docs/source/pype.modules.websocket_server.hosts.photoshop.rst
deleted file mode 100644
index cbda61275a..0000000000
--- a/docs/source/pype.modules.websocket_server.hosts.photoshop.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.websocket\_server.hosts.photoshop module
-=====================================================
-
-.. automodule:: pype.modules.websocket_server.hosts.photoshop
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.websocket_server.hosts.rst b/docs/source/pype.modules.websocket_server.hosts.rst
deleted file mode 100644
index d5ce7c3f8e..0000000000
--- a/docs/source/pype.modules.websocket_server.hosts.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.modules.websocket\_server.hosts package
-============================================
-
-.. automodule:: pype.modules.websocket_server.hosts
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.modules.websocket\_server.hosts.external\_app\_1 module
-------------------------------------------------------------
-
-.. automodule:: pype.modules.websocket_server.hosts.external_app_1
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules.websocket\_server.hosts.photoshop module
------------------------------------------------------
-
-.. automodule:: pype.modules.websocket_server.hosts.photoshop
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.websocket_server.rst b/docs/source/pype.modules.websocket_server.rst
deleted file mode 100644
index a83d371df1..0000000000
--- a/docs/source/pype.modules.websocket_server.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.modules.websocket\_server package
-======================================
-
-.. automodule:: pype.modules.websocket_server
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.modules.websocket_server.hosts
-
-Submodules
-----------
-
-pype.modules.websocket\_server.websocket\_server module
--------------------------------------------------------
-
-.. automodule:: pype.modules.websocket_server.websocket_server
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules.websocket_server.websocket_server.rst b/docs/source/pype.modules.websocket_server.websocket_server.rst
deleted file mode 100644
index 354c9e6cf9..0000000000
--- a/docs/source/pype.modules.websocket_server.websocket_server.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules.websocket\_server.websocket\_server module
-=======================================================
-
-.. automodule:: pype.modules.websocket_server.websocket_server
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.modules_manager.rst b/docs/source/pype.modules_manager.rst
deleted file mode 100644
index a5f2327d65..0000000000
--- a/docs/source/pype.modules_manager.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.modules\_manager module
-============================
-
-.. automodule:: pype.modules_manager
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugin.rst b/docs/source/pype.plugin.rst
deleted file mode 100644
index c20bb77b2b..0000000000
--- a/docs/source/pype.plugin.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugin module
-==================
-
-.. automodule:: pype.plugin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_animation.rst b/docs/source/pype.plugins.maya.publish.collect_animation.rst
deleted file mode 100644
index 497c497057..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_animation.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_animation module
-===================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_animation
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_ass.rst b/docs/source/pype.plugins.maya.publish.collect_ass.rst
deleted file mode 100644
index a44e61ce98..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_ass.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_ass module
-=============================================
-
-.. automodule:: pype.plugins.maya.publish.collect_ass
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_assembly.rst b/docs/source/pype.plugins.maya.publish.collect_assembly.rst
deleted file mode 100644
index 5baa91818b..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_assembly.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_assembly module
-==================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_assembly
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_file_dependencies.rst b/docs/source/pype.plugins.maya.publish.collect_file_dependencies.rst
deleted file mode 100644
index efe857140e..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_file_dependencies.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_file\_dependencies module
-============================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_file_dependencies
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_ftrack_family.rst b/docs/source/pype.plugins.maya.publish.collect_ftrack_family.rst
deleted file mode 100644
index 872bbc69a4..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_ftrack_family.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_ftrack\_family module
-========================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_ftrack_family
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_history.rst b/docs/source/pype.plugins.maya.publish.collect_history.rst
deleted file mode 100644
index 5a98778c24..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_history.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_history module
-=================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_history
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_instances.rst b/docs/source/pype.plugins.maya.publish.collect_instances.rst
deleted file mode 100644
index 33c8b97597..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_instances.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_instances module
-===================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_instances
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_look.rst b/docs/source/pype.plugins.maya.publish.collect_look.rst
deleted file mode 100644
index 234fcf20d1..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_look.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_look module
-==============================================
-
-.. automodule:: pype.plugins.maya.publish.collect_look
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_maya_units.rst b/docs/source/pype.plugins.maya.publish.collect_maya_units.rst
deleted file mode 100644
index 0cb01b0fa7..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_maya_units.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_maya\_units module
-=====================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_maya_units
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_maya_workspace.rst b/docs/source/pype.plugins.maya.publish.collect_maya_workspace.rst
deleted file mode 100644
index 7447052004..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_maya_workspace.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_maya\_workspace module
-=========================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_maya_workspace
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_mayaascii.rst b/docs/source/pype.plugins.maya.publish.collect_mayaascii.rst
deleted file mode 100644
index 14fe826229..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_mayaascii.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_mayaascii module
-===================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_mayaascii
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_model.rst b/docs/source/pype.plugins.maya.publish.collect_model.rst
deleted file mode 100644
index b30bf3fb22..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_model.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_model module
-===============================================
-
-.. automodule:: pype.plugins.maya.publish.collect_model
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_remove_marked.rst b/docs/source/pype.plugins.maya.publish.collect_remove_marked.rst
deleted file mode 100644
index a0bf9498d7..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_remove_marked.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_remove\_marked module
-========================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_remove_marked
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_render.rst b/docs/source/pype.plugins.maya.publish.collect_render.rst
deleted file mode 100644
index 6de8827119..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_render.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_render module
-================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_render
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_render_layer_aovs.rst b/docs/source/pype.plugins.maya.publish.collect_render_layer_aovs.rst
deleted file mode 100644
index ab511fc5dd..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_render_layer_aovs.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_render\_layer\_aovs module
-=============================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_render_layer_aovs
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_renderable_camera.rst b/docs/source/pype.plugins.maya.publish.collect_renderable_camera.rst
deleted file mode 100644
index c98e8000a1..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_renderable_camera.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_renderable\_camera module
-============================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_renderable_camera
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_review.rst b/docs/source/pype.plugins.maya.publish.collect_review.rst
deleted file mode 100644
index d73127aa85..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_review.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_review module
-================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_review
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_rig.rst b/docs/source/pype.plugins.maya.publish.collect_rig.rst
deleted file mode 100644
index e7c0528482..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_rig.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_rig module
-=============================================
-
-.. automodule:: pype.plugins.maya.publish.collect_rig
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_scene.rst b/docs/source/pype.plugins.maya.publish.collect_scene.rst
deleted file mode 100644
index c5c2fef222..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_scene.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_scene module
-===============================================
-
-.. automodule:: pype.plugins.maya.publish.collect_scene
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_unreal_staticmesh.rst b/docs/source/pype.plugins.maya.publish.collect_unreal_staticmesh.rst
deleted file mode 100644
index 673f0865fd..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_unreal_staticmesh.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_unreal\_staticmesh module
-============================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_unreal_staticmesh
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_workscene_fps.rst b/docs/source/pype.plugins.maya.publish.collect_workscene_fps.rst
deleted file mode 100644
index ed4386a7ba..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_workscene_fps.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_workscene\_fps module
-========================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_workscene_fps
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_yeti_cache.rst b/docs/source/pype.plugins.maya.publish.collect_yeti_cache.rst
deleted file mode 100644
index 32ab50baca..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_yeti_cache.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_yeti\_cache module
-=====================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_yeti_cache
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.collect_yeti_rig.rst b/docs/source/pype.plugins.maya.publish.collect_yeti_rig.rst
deleted file mode 100644
index 8cf968b7c5..0000000000
--- a/docs/source/pype.plugins.maya.publish.collect_yeti_rig.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.collect\_yeti\_rig module
-===================================================
-
-.. automodule:: pype.plugins.maya.publish.collect_yeti_rig
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.determine_future_version.rst b/docs/source/pype.plugins.maya.publish.determine_future_version.rst
deleted file mode 100644
index 55c6155680..0000000000
--- a/docs/source/pype.plugins.maya.publish.determine_future_version.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.determine\_future\_version module
-===========================================================
-
-.. automodule:: pype.plugins.maya.publish.determine_future_version
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_animation.rst b/docs/source/pype.plugins.maya.publish.extract_animation.rst
deleted file mode 100644
index 3649723042..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_animation.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_animation module
-===================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_animation
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_ass.rst b/docs/source/pype.plugins.maya.publish.extract_ass.rst
deleted file mode 100644
index be8123e5d7..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_ass.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_ass module
-=============================================
-
-.. automodule:: pype.plugins.maya.publish.extract_ass
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_assembly.rst b/docs/source/pype.plugins.maya.publish.extract_assembly.rst
deleted file mode 100644
index b36e8f6d30..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_assembly.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_assembly module
-==================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_assembly
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_assproxy.rst b/docs/source/pype.plugins.maya.publish.extract_assproxy.rst
deleted file mode 100644
index fc97a2ee46..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_assproxy.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_assproxy module
-==================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_assproxy
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_camera_alembic.rst b/docs/source/pype.plugins.maya.publish.extract_camera_alembic.rst
deleted file mode 100644
index a9df3da011..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_camera_alembic.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_camera\_alembic module
-=========================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_camera_alembic
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_camera_mayaScene.rst b/docs/source/pype.plugins.maya.publish.extract_camera_mayaScene.rst
deleted file mode 100644
index db1799f52f..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_camera_mayaScene.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_camera\_mayaScene module
-===========================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_camera_mayaScene
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_fbx.rst b/docs/source/pype.plugins.maya.publish.extract_fbx.rst
deleted file mode 100644
index fffd5a6394..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_fbx.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_fbx module
-=============================================
-
-.. automodule:: pype.plugins.maya.publish.extract_fbx
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_look.rst b/docs/source/pype.plugins.maya.publish.extract_look.rst
deleted file mode 100644
index f2708678ce..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_look.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_look module
-==============================================
-
-.. automodule:: pype.plugins.maya.publish.extract_look
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_maya_scene_raw.rst b/docs/source/pype.plugins.maya.publish.extract_maya_scene_raw.rst
deleted file mode 100644
index 1e080dd0eb..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_maya_scene_raw.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_maya\_scene\_raw module
-==========================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_maya_scene_raw
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_model.rst b/docs/source/pype.plugins.maya.publish.extract_model.rst
deleted file mode 100644
index c78b49c777..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_model.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_model module
-===============================================
-
-.. automodule:: pype.plugins.maya.publish.extract_model
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_playblast.rst b/docs/source/pype.plugins.maya.publish.extract_playblast.rst
deleted file mode 100644
index 1aa284b370..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_playblast.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_playblast module
-===================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_playblast
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_pointcache.rst b/docs/source/pype.plugins.maya.publish.extract_pointcache.rst
deleted file mode 100644
index 97ebde4933..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_pointcache.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_pointcache module
-====================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_pointcache
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_rendersetup.rst b/docs/source/pype.plugins.maya.publish.extract_rendersetup.rst
deleted file mode 100644
index 86cb178f42..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_rendersetup.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_rendersetup module
-=====================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_rendersetup
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_rig.rst b/docs/source/pype.plugins.maya.publish.extract_rig.rst
deleted file mode 100644
index f6419c9473..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_rig.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_rig module
-=============================================
-
-.. automodule:: pype.plugins.maya.publish.extract_rig
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_thumbnail.rst b/docs/source/pype.plugins.maya.publish.extract_thumbnail.rst
deleted file mode 100644
index 2d03e11d55..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_thumbnail.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_thumbnail module
-===================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_thumbnail
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_vrayproxy.rst b/docs/source/pype.plugins.maya.publish.extract_vrayproxy.rst
deleted file mode 100644
index 5439ff59ca..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_vrayproxy.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_vrayproxy module
-===================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_vrayproxy
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_yeti_cache.rst b/docs/source/pype.plugins.maya.publish.extract_yeti_cache.rst
deleted file mode 100644
index 7ad84dfc70..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_yeti_cache.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_yeti\_cache module
-=====================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_yeti_cache
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.extract_yeti_rig.rst b/docs/source/pype.plugins.maya.publish.extract_yeti_rig.rst
deleted file mode 100644
index 76d483d91b..0000000000
--- a/docs/source/pype.plugins.maya.publish.extract_yeti_rig.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.extract\_yeti\_rig module
-===================================================
-
-.. automodule:: pype.plugins.maya.publish.extract_yeti_rig
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.increment_current_file_deadline.rst b/docs/source/pype.plugins.maya.publish.increment_current_file_deadline.rst
deleted file mode 100644
index 97126a6c77..0000000000
--- a/docs/source/pype.plugins.maya.publish.increment_current_file_deadline.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.increment\_current\_file\_deadline module
-===================================================================
-
-.. automodule:: pype.plugins.maya.publish.increment_current_file_deadline
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.rst b/docs/source/pype.plugins.maya.publish.rst
deleted file mode 100644
index dba0a9118c..0000000000
--- a/docs/source/pype.plugins.maya.publish.rst
+++ /dev/null
@@ -1,146 +0,0 @@
-pype.plugins.maya.publish package
-=================================
-
-.. automodule:: pype.plugins.maya.publish
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-.. toctree::
- :maxdepth: 10
-
- pype.plugins.maya.publish.collect_animation
- pype.plugins.maya.publish.collect_ass
- pype.plugins.maya.publish.collect_assembly
- pype.plugins.maya.publish.collect_file_dependencies
- pype.plugins.maya.publish.collect_ftrack_family
- pype.plugins.maya.publish.collect_history
- pype.plugins.maya.publish.collect_instances
- pype.plugins.maya.publish.collect_look
- pype.plugins.maya.publish.collect_maya_units
- pype.plugins.maya.publish.collect_maya_workspace
- pype.plugins.maya.publish.collect_mayaascii
- pype.plugins.maya.publish.collect_model
- pype.plugins.maya.publish.collect_remove_marked
- pype.plugins.maya.publish.collect_render
- pype.plugins.maya.publish.collect_render_layer_aovs
- pype.plugins.maya.publish.collect_renderable_camera
- pype.plugins.maya.publish.collect_review
- pype.plugins.maya.publish.collect_rig
- pype.plugins.maya.publish.collect_scene
- pype.plugins.maya.publish.collect_unreal_staticmesh
- pype.plugins.maya.publish.collect_workscene_fps
- pype.plugins.maya.publish.collect_yeti_cache
- pype.plugins.maya.publish.collect_yeti_rig
- pype.plugins.maya.publish.determine_future_version
- pype.plugins.maya.publish.extract_animation
- pype.plugins.maya.publish.extract_ass
- pype.plugins.maya.publish.extract_assembly
- pype.plugins.maya.publish.extract_assproxy
- pype.plugins.maya.publish.extract_camera_alembic
- pype.plugins.maya.publish.extract_camera_mayaScene
- pype.plugins.maya.publish.extract_fbx
- pype.plugins.maya.publish.extract_look
- pype.plugins.maya.publish.extract_maya_scene_raw
- pype.plugins.maya.publish.extract_model
- pype.plugins.maya.publish.extract_playblast
- pype.plugins.maya.publish.extract_pointcache
- pype.plugins.maya.publish.extract_rendersetup
- pype.plugins.maya.publish.extract_rig
- pype.plugins.maya.publish.extract_thumbnail
- pype.plugins.maya.publish.extract_vrayproxy
- pype.plugins.maya.publish.extract_yeti_cache
- pype.plugins.maya.publish.extract_yeti_rig
- pype.plugins.maya.publish.increment_current_file_deadline
- pype.plugins.maya.publish.save_scene
- pype.plugins.maya.publish.submit_maya_deadline
- pype.plugins.maya.publish.submit_maya_muster
- pype.plugins.maya.publish.validate_animation_content
- pype.plugins.maya.publish.validate_animation_out_set_related_node_ids
- pype.plugins.maya.publish.validate_ass_relative_paths
- pype.plugins.maya.publish.validate_assembly_name
- pype.plugins.maya.publish.validate_assembly_namespaces
- pype.plugins.maya.publish.validate_assembly_transforms
- pype.plugins.maya.publish.validate_attributes
- pype.plugins.maya.publish.validate_camera_attributes
- pype.plugins.maya.publish.validate_camera_contents
- pype.plugins.maya.publish.validate_color_sets
- pype.plugins.maya.publish.validate_current_renderlayer_renderable
- pype.plugins.maya.publish.validate_deadline_connection
- pype.plugins.maya.publish.validate_frame_range
- pype.plugins.maya.publish.validate_instance_has_members
- pype.plugins.maya.publish.validate_instance_subset
- pype.plugins.maya.publish.validate_instancer_content
- pype.plugins.maya.publish.validate_instancer_frame_ranges
- pype.plugins.maya.publish.validate_joints_hidden
- pype.plugins.maya.publish.validate_look_contents
- pype.plugins.maya.publish.validate_look_default_shaders_connections
- pype.plugins.maya.publish.validate_look_id_reference_edits
- pype.plugins.maya.publish.validate_look_members_unique
- pype.plugins.maya.publish.validate_look_no_default_shaders
- pype.plugins.maya.publish.validate_look_sets
- pype.plugins.maya.publish.validate_look_shading_group
- pype.plugins.maya.publish.validate_look_single_shader
- pype.plugins.maya.publish.validate_maya_units
- pype.plugins.maya.publish.validate_mesh_arnold_attributes
- pype.plugins.maya.publish.validate_mesh_has_uv
- pype.plugins.maya.publish.validate_mesh_lamina_faces
- pype.plugins.maya.publish.validate_mesh_no_negative_scale
- pype.plugins.maya.publish.validate_mesh_non_manifold
- pype.plugins.maya.publish.validate_mesh_non_zero_edge
- pype.plugins.maya.publish.validate_mesh_normals_unlocked
- pype.plugins.maya.publish.validate_mesh_overlapping_uvs
- pype.plugins.maya.publish.validate_mesh_shader_connections
- pype.plugins.maya.publish.validate_mesh_single_uv_set
- pype.plugins.maya.publish.validate_mesh_uv_set_map1
- pype.plugins.maya.publish.validate_mesh_vertices_have_edges
- pype.plugins.maya.publish.validate_model_content
- pype.plugins.maya.publish.validate_model_name
- pype.plugins.maya.publish.validate_muster_connection
- pype.plugins.maya.publish.validate_no_animation
- pype.plugins.maya.publish.validate_no_default_camera
- pype.plugins.maya.publish.validate_no_namespace
- pype.plugins.maya.publish.validate_no_null_transforms
- pype.plugins.maya.publish.validate_no_unknown_nodes
- pype.plugins.maya.publish.validate_no_vraymesh
- pype.plugins.maya.publish.validate_node_ids
- pype.plugins.maya.publish.validate_node_ids_deformed_shapes
- pype.plugins.maya.publish.validate_node_ids_in_database
- pype.plugins.maya.publish.validate_node_ids_related
- pype.plugins.maya.publish.validate_node_ids_unique
- pype.plugins.maya.publish.validate_node_no_ghosting
- pype.plugins.maya.publish.validate_render_image_rule
- pype.plugins.maya.publish.validate_render_no_default_cameras
- pype.plugins.maya.publish.validate_render_single_camera
- pype.plugins.maya.publish.validate_renderlayer_aovs
- pype.plugins.maya.publish.validate_rendersettings
- pype.plugins.maya.publish.validate_resources
- pype.plugins.maya.publish.validate_rig_contents
- pype.plugins.maya.publish.validate_rig_controllers
- pype.plugins.maya.publish.validate_rig_controllers_arnold_attributes
- pype.plugins.maya.publish.validate_rig_out_set_node_ids
- pype.plugins.maya.publish.validate_rig_output_ids
- pype.plugins.maya.publish.validate_scene_set_workspace
- pype.plugins.maya.publish.validate_shader_name
- pype.plugins.maya.publish.validate_shape_default_names
- pype.plugins.maya.publish.validate_shape_render_stats
- pype.plugins.maya.publish.validate_single_assembly
- pype.plugins.maya.publish.validate_skinCluster_deformer_set
- pype.plugins.maya.publish.validate_step_size
- pype.plugins.maya.publish.validate_transform_naming_suffix
- pype.plugins.maya.publish.validate_transform_zero
- pype.plugins.maya.publish.validate_unicode_strings
- pype.plugins.maya.publish.validate_unreal_mesh_triangulated
- pype.plugins.maya.publish.validate_unreal_staticmesh_naming
- pype.plugins.maya.publish.validate_unreal_up_axis
- pype.plugins.maya.publish.validate_vray_distributed_rendering
- pype.plugins.maya.publish.validate_vray_translator_settings
- pype.plugins.maya.publish.validate_vrayproxy
- pype.plugins.maya.publish.validate_vrayproxy_members
- pype.plugins.maya.publish.validate_yeti_renderscript_callbacks
- pype.plugins.maya.publish.validate_yeti_rig_cache_state
- pype.plugins.maya.publish.validate_yeti_rig_input_in_instance
- pype.plugins.maya.publish.validate_yeti_rig_settings
diff --git a/docs/source/pype.plugins.maya.publish.save_scene.rst b/docs/source/pype.plugins.maya.publish.save_scene.rst
deleted file mode 100644
index 2537bca03d..0000000000
--- a/docs/source/pype.plugins.maya.publish.save_scene.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.save\_scene module
-============================================
-
-.. automodule:: pype.plugins.maya.publish.save_scene
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.submit_maya_deadline.rst b/docs/source/pype.plugins.maya.publish.submit_maya_deadline.rst
deleted file mode 100644
index 0e521cec4e..0000000000
--- a/docs/source/pype.plugins.maya.publish.submit_maya_deadline.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.submit\_maya\_deadline module
-=======================================================
-
-.. automodule:: pype.plugins.maya.publish.submit_maya_deadline
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.submit_maya_muster.rst b/docs/source/pype.plugins.maya.publish.submit_maya_muster.rst
deleted file mode 100644
index 4ae263e157..0000000000
--- a/docs/source/pype.plugins.maya.publish.submit_maya_muster.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.submit\_maya\_muster module
-=====================================================
-
-.. automodule:: pype.plugins.maya.publish.submit_maya_muster
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_animation_content.rst b/docs/source/pype.plugins.maya.publish.validate_animation_content.rst
deleted file mode 100644
index 65191bb957..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_animation_content.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_animation\_content module
-=============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_animation_content
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_animation_out_set_related_node_ids.rst b/docs/source/pype.plugins.maya.publish.validate_animation_out_set_related_node_ids.rst
deleted file mode 100644
index ea289e84ed..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_animation_out_set_related_node_ids.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_animation\_out\_set\_related\_node\_ids module
-==================================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_animation_out_set_related_node_ids
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_ass_relative_paths.rst b/docs/source/pype.plugins.maya.publish.validate_ass_relative_paths.rst
deleted file mode 100644
index f35ef916cc..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_ass_relative_paths.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_ass\_relative\_paths module
-===============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_ass_relative_paths
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_assembly_name.rst b/docs/source/pype.plugins.maya.publish.validate_assembly_name.rst
deleted file mode 100644
index c8178226b2..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_assembly_name.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_assembly\_name module
-=========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_assembly_name
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_assembly_namespaces.rst b/docs/source/pype.plugins.maya.publish.validate_assembly_namespaces.rst
deleted file mode 100644
index 847b90281e..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_assembly_namespaces.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_assembly\_namespaces module
-===============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_assembly_namespaces
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_assembly_transforms.rst b/docs/source/pype.plugins.maya.publish.validate_assembly_transforms.rst
deleted file mode 100644
index b4348a2908..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_assembly_transforms.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_assembly\_transforms module
-===============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_assembly_transforms
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_attributes.rst b/docs/source/pype.plugins.maya.publish.validate_attributes.rst
deleted file mode 100644
index 862820a7c0..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_attributes.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_attributes module
-=====================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_attributes
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_camera_attributes.rst b/docs/source/pype.plugins.maya.publish.validate_camera_attributes.rst
deleted file mode 100644
index 054198f812..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_camera_attributes.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_camera\_attributes module
-=============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_camera_attributes
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_camera_contents.rst b/docs/source/pype.plugins.maya.publish.validate_camera_contents.rst
deleted file mode 100644
index 9cf6604f7a..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_camera_contents.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_camera\_contents module
-===========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_camera_contents
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_color_sets.rst b/docs/source/pype.plugins.maya.publish.validate_color_sets.rst
deleted file mode 100644
index 59bb5607bf..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_color_sets.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_color\_sets module
-======================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_color_sets
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_current_renderlayer_renderable.rst b/docs/source/pype.plugins.maya.publish.validate_current_renderlayer_renderable.rst
deleted file mode 100644
index 31c52477aa..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_current_renderlayer_renderable.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_current\_renderlayer\_renderable module
-===========================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_current_renderlayer_renderable
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_deadline_connection.rst b/docs/source/pype.plugins.maya.publish.validate_deadline_connection.rst
deleted file mode 100644
index 3f8c4b6313..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_deadline_connection.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_deadline\_connection module
-===============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_deadline_connection
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_frame_range.rst b/docs/source/pype.plugins.maya.publish.validate_frame_range.rst
deleted file mode 100644
index 0ccc8ed1cd..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_frame_range.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_frame\_range module
-=======================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_frame_range
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_instance_has_members.rst b/docs/source/pype.plugins.maya.publish.validate_instance_has_members.rst
deleted file mode 100644
index 862d32f114..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_instance_has_members.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_instance\_has\_members module
-=================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_instance_has_members
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_instance_subset.rst b/docs/source/pype.plugins.maya.publish.validate_instance_subset.rst
deleted file mode 100644
index f71febb73c..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_instance_subset.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_instance\_subset module
-===========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_instance_subset
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_instancer_content.rst b/docs/source/pype.plugins.maya.publish.validate_instancer_content.rst
deleted file mode 100644
index 761889dd4d..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_instancer_content.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_instancer\_content module
-=============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_instancer_content
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_instancer_frame_ranges.rst b/docs/source/pype.plugins.maya.publish.validate_instancer_frame_ranges.rst
deleted file mode 100644
index 85338c3e2d..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_instancer_frame_ranges.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_instancer\_frame\_ranges module
-===================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_instancer_frame_ranges
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_joints_hidden.rst b/docs/source/pype.plugins.maya.publish.validate_joints_hidden.rst
deleted file mode 100644
index ede5af0c67..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_joints_hidden.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_joints\_hidden module
-=========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_joints_hidden
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_look_contents.rst b/docs/source/pype.plugins.maya.publish.validate_look_contents.rst
deleted file mode 100644
index 946f924fb3..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_look_contents.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_look\_contents module
-=========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_look_contents
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_look_default_shaders_connections.rst b/docs/source/pype.plugins.maya.publish.validate_look_default_shaders_connections.rst
deleted file mode 100644
index e293cfc0f1..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_look_default_shaders_connections.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_look\_default\_shaders\_connections module
-==============================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_look_default_shaders_connections
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_look_id_reference_edits.rst b/docs/source/pype.plugins.maya.publish.validate_look_id_reference_edits.rst
deleted file mode 100644
index 007f4e2d03..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_look_id_reference_edits.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_look\_id\_reference\_edits module
-=====================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_look_id_reference_edits
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_look_members_unique.rst b/docs/source/pype.plugins.maya.publish.validate_look_members_unique.rst
deleted file mode 100644
index 3378e8a0f6..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_look_members_unique.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_look\_members\_unique module
-================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_look_members_unique
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_look_no_default_shaders.rst b/docs/source/pype.plugins.maya.publish.validate_look_no_default_shaders.rst
deleted file mode 100644
index 662e2c7621..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_look_no_default_shaders.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_look\_no\_default\_shaders module
-=====================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_look_no_default_shaders
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_look_sets.rst b/docs/source/pype.plugins.maya.publish.validate_look_sets.rst
deleted file mode 100644
index 5427331568..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_look_sets.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_look\_sets module
-=====================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_look_sets
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_look_shading_group.rst b/docs/source/pype.plugins.maya.publish.validate_look_shading_group.rst
deleted file mode 100644
index 259f4952b7..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_look_shading_group.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_look\_shading\_group module
-===============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_look_shading_group
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_look_single_shader.rst b/docs/source/pype.plugins.maya.publish.validate_look_single_shader.rst
deleted file mode 100644
index fa43283416..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_look_single_shader.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_look\_single\_shader module
-===============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_look_single_shader
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_maya_units.rst b/docs/source/pype.plugins.maya.publish.validate_maya_units.rst
deleted file mode 100644
index 16af19f6d9..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_maya_units.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_maya\_units module
-======================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_maya_units
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_arnold_attributes.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_arnold_attributes.rst
deleted file mode 100644
index ef18ad1457..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_arnold_attributes.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_arnold\_attributes module
-===================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_arnold_attributes
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_has_uv.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_has_uv.rst
deleted file mode 100644
index c6af7063c3..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_has_uv.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_has\_uv module
-========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_has_uv
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_lamina_faces.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_lamina_faces.rst
deleted file mode 100644
index 006488e77f..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_lamina_faces.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_lamina\_faces module
-==============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_lamina_faces
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_no_negative_scale.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_no_negative_scale.rst
deleted file mode 100644
index 8720f3d018..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_no_negative_scale.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_no\_negative\_scale module
-====================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_no_negative_scale
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_non_manifold.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_non_manifold.rst
deleted file mode 100644
index a69a4c6fc4..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_non_manifold.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_non\_manifold module
-==============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_non_manifold
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_non_zero_edge.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_non_zero_edge.rst
deleted file mode 100644
index 89ea60d1bc..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_non_zero_edge.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_non\_zero\_edge module
-================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_non_zero_edge
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_normals_unlocked.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_normals_unlocked.rst
deleted file mode 100644
index 7dfbd0717d..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_normals_unlocked.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_normals\_unlocked module
-==================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_normals_unlocked
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_overlapping_uvs.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_overlapping_uvs.rst
deleted file mode 100644
index f5df633124..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_overlapping_uvs.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_overlapping\_uvs module
-=================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_overlapping_uvs
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_shader_connections.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_shader_connections.rst
deleted file mode 100644
index b3cd77ab2a..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_shader_connections.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_shader\_connections module
-====================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_shader_connections
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_single_uv_set.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_single_uv_set.rst
deleted file mode 100644
index 29a1217437..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_single_uv_set.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_single\_uv\_set module
-================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_single_uv_set
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_uv_set_map1.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_uv_set_map1.rst
deleted file mode 100644
index 49d1b22497..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_uv_set_map1.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_uv\_set\_map1 module
-==============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_uv_set_map1
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_mesh_vertices_have_edges.rst b/docs/source/pype.plugins.maya.publish.validate_mesh_vertices_have_edges.rst
deleted file mode 100644
index 99e3047e3d..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_mesh_vertices_have_edges.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_mesh\_vertices\_have\_edges module
-======================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_mesh_vertices_have_edges
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_model_content.rst b/docs/source/pype.plugins.maya.publish.validate_model_content.rst
deleted file mode 100644
index dc0a415718..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_model_content.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_model\_content module
-=========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_model_content
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_model_name.rst b/docs/source/pype.plugins.maya.publish.validate_model_name.rst
deleted file mode 100644
index ea78ceea70..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_model_name.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_model\_name module
-======================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_model_name
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_muster_connection.rst b/docs/source/pype.plugins.maya.publish.validate_muster_connection.rst
deleted file mode 100644
index 4a4a1e926b..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_muster_connection.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_muster\_connection module
-=============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_muster_connection
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_no_animation.rst b/docs/source/pype.plugins.maya.publish.validate_no_animation.rst
deleted file mode 100644
index b42021369d..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_no_animation.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_no\_animation module
-========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_no_animation
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_no_default_camera.rst b/docs/source/pype.plugins.maya.publish.validate_no_default_camera.rst
deleted file mode 100644
index 59544369f6..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_no_default_camera.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_no\_default\_camera module
-==============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_no_default_camera
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_no_namespace.rst b/docs/source/pype.plugins.maya.publish.validate_no_namespace.rst
deleted file mode 100644
index bdf4ceb324..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_no_namespace.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_no\_namespace module
-========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_no_namespace
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_no_null_transforms.rst b/docs/source/pype.plugins.maya.publish.validate_no_null_transforms.rst
deleted file mode 100644
index 12beed8c33..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_no_null_transforms.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_no\_null\_transforms module
-===============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_no_null_transforms
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_no_unknown_nodes.rst b/docs/source/pype.plugins.maya.publish.validate_no_unknown_nodes.rst
deleted file mode 100644
index 12c977dbb9..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_no_unknown_nodes.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_no\_unknown\_nodes module
-=============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_no_unknown_nodes
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_no_vraymesh.rst b/docs/source/pype.plugins.maya.publish.validate_no_vraymesh.rst
deleted file mode 100644
index a1a0b9ee64..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_no_vraymesh.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_no\_vraymesh module
-=======================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_no_vraymesh
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_node_ids.rst b/docs/source/pype.plugins.maya.publish.validate_node_ids.rst
deleted file mode 100644
index 7b1d79100f..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_node_ids.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_node\_ids module
-====================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_node_ids
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_node_ids_deformed_shapes.rst b/docs/source/pype.plugins.maya.publish.validate_node_ids_deformed_shapes.rst
deleted file mode 100644
index 90ef81c5b5..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_node_ids_deformed_shapes.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_node\_ids\_deformed\_shapes module
-======================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_node_ids_deformed_shapes
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_node_ids_in_database.rst b/docs/source/pype.plugins.maya.publish.validate_node_ids_in_database.rst
deleted file mode 100644
index 5eb0047d16..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_node_ids_in_database.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_node\_ids\_in\_database module
-==================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_node_ids_in_database
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_node_ids_related.rst b/docs/source/pype.plugins.maya.publish.validate_node_ids_related.rst
deleted file mode 100644
index 1f030462ae..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_node_ids_related.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_node\_ids\_related module
-=============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_node_ids_related
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_node_ids_unique.rst b/docs/source/pype.plugins.maya.publish.validate_node_ids_unique.rst
deleted file mode 100644
index 20ba3a3a6d..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_node_ids_unique.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_node\_ids\_unique module
-============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_node_ids_unique
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_node_no_ghosting.rst b/docs/source/pype.plugins.maya.publish.validate_node_no_ghosting.rst
deleted file mode 100644
index 8315888630..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_node_no_ghosting.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_node\_no\_ghosting module
-=============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_node_no_ghosting
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_render_image_rule.rst b/docs/source/pype.plugins.maya.publish.validate_render_image_rule.rst
deleted file mode 100644
index 88870a9ea8..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_render_image_rule.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_render\_image\_rule module
-==============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_render_image_rule
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_render_no_default_cameras.rst b/docs/source/pype.plugins.maya.publish.validate_render_no_default_cameras.rst
deleted file mode 100644
index b464dbeab6..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_render_no_default_cameras.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_render\_no\_default\_cameras module
-=======================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_render_no_default_cameras
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_render_single_camera.rst b/docs/source/pype.plugins.maya.publish.validate_render_single_camera.rst
deleted file mode 100644
index 60a0cbd6fb..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_render_single_camera.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_render\_single\_camera module
-=================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_render_single_camera
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_renderlayer_aovs.rst b/docs/source/pype.plugins.maya.publish.validate_renderlayer_aovs.rst
deleted file mode 100644
index 65d5181065..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_renderlayer_aovs.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_renderlayer\_aovs module
-============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_renderlayer_aovs
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_rendersettings.rst b/docs/source/pype.plugins.maya.publish.validate_rendersettings.rst
deleted file mode 100644
index fce7dba5b8..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_rendersettings.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_rendersettings module
-=========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_rendersettings
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_resources.rst b/docs/source/pype.plugins.maya.publish.validate_resources.rst
deleted file mode 100644
index 0a866acdbb..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_resources.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_resources module
-====================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_resources
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_rig_contents.rst b/docs/source/pype.plugins.maya.publish.validate_rig_contents.rst
deleted file mode 100644
index dbd7d84bed..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_rig_contents.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_rig\_contents module
-========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_rig_contents
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_rig_controllers.rst b/docs/source/pype.plugins.maya.publish.validate_rig_controllers.rst
deleted file mode 100644
index 3bf075e8ad..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_rig_controllers.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_rig\_controllers module
-===========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_rig_controllers
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_rig_controllers_arnold_attributes.rst b/docs/source/pype.plugins.maya.publish.validate_rig_controllers_arnold_attributes.rst
deleted file mode 100644
index 67e9256f3a..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_rig_controllers_arnold_attributes.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_rig\_controllers\_arnold\_attributes module
-===============================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_rig_controllers_arnold_attributes
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_rig_out_set_node_ids.rst b/docs/source/pype.plugins.maya.publish.validate_rig_out_set_node_ids.rst
deleted file mode 100644
index e4f1cfc428..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_rig_out_set_node_ids.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_rig\_out\_set\_node\_ids module
-===================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_rig_out_set_node_ids
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_rig_output_ids.rst b/docs/source/pype.plugins.maya.publish.validate_rig_output_ids.rst
deleted file mode 100644
index e1d3b1a659..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_rig_output_ids.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_rig\_output\_ids module
-===========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_rig_output_ids
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_scene_set_workspace.rst b/docs/source/pype.plugins.maya.publish.validate_scene_set_workspace.rst
deleted file mode 100644
index daf2f152d9..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_scene_set_workspace.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_scene\_set\_workspace module
-================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_scene_set_workspace
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_shader_name.rst b/docs/source/pype.plugins.maya.publish.validate_shader_name.rst
deleted file mode 100644
index ae5b196a1d..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_shader_name.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_shader\_name module
-=======================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_shader_name
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_shape_default_names.rst b/docs/source/pype.plugins.maya.publish.validate_shape_default_names.rst
deleted file mode 100644
index 49effc932d..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_shape_default_names.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_shape\_default\_names module
-================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_shape_default_names
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_shape_render_stats.rst b/docs/source/pype.plugins.maya.publish.validate_shape_render_stats.rst
deleted file mode 100644
index 359af50a0f..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_shape_render_stats.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_shape\_render\_stats module
-===============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_shape_render_stats
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_single_assembly.rst b/docs/source/pype.plugins.maya.publish.validate_single_assembly.rst
deleted file mode 100644
index 090f57b3ff..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_single_assembly.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_single\_assembly module
-===========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_single_assembly
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_skinCluster_deformer_set.rst b/docs/source/pype.plugins.maya.publish.validate_skinCluster_deformer_set.rst
deleted file mode 100644
index 607a610097..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_skinCluster_deformer_set.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_skinCluster\_deformer\_set module
-=====================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_skinCluster_deformer_set
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_step_size.rst b/docs/source/pype.plugins.maya.publish.validate_step_size.rst
deleted file mode 100644
index bb883ea7b5..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_step_size.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_step\_size module
-=====================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_step_size
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_transform_naming_suffix.rst b/docs/source/pype.plugins.maya.publish.validate_transform_naming_suffix.rst
deleted file mode 100644
index 4d7edda78d..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_transform_naming_suffix.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_transform\_naming\_suffix module
-====================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_transform_naming_suffix
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_transform_zero.rst b/docs/source/pype.plugins.maya.publish.validate_transform_zero.rst
deleted file mode 100644
index 6d5cacfe00..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_transform_zero.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_transform\_zero module
-==========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_transform_zero
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_unicode_strings.rst b/docs/source/pype.plugins.maya.publish.validate_unicode_strings.rst
deleted file mode 100644
index 9cc17d6810..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_unicode_strings.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_unicode\_strings module
-===========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_unicode_strings
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_unreal_mesh_triangulated.rst b/docs/source/pype.plugins.maya.publish.validate_unreal_mesh_triangulated.rst
deleted file mode 100644
index 4dcb518194..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_unreal_mesh_triangulated.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_unreal\_mesh\_triangulated module
-=====================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_unreal_mesh_triangulated
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_unreal_staticmesh_naming.rst b/docs/source/pype.plugins.maya.publish.validate_unreal_staticmesh_naming.rst
deleted file mode 100644
index f7225ab395..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_unreal_staticmesh_naming.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_unreal\_staticmesh\_naming module
-=====================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_unreal_staticmesh_naming
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_unreal_up_axis.rst b/docs/source/pype.plugins.maya.publish.validate_unreal_up_axis.rst
deleted file mode 100644
index ff688c493f..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_unreal_up_axis.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_unreal\_up\_axis module
-===========================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_unreal_up_axis
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_vray_distributed_rendering.rst b/docs/source/pype.plugins.maya.publish.validate_vray_distributed_rendering.rst
deleted file mode 100644
index f5d05e6d76..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_vray_distributed_rendering.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_vray\_distributed\_rendering module
-=======================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_vray_distributed_rendering
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_vray_referenced_aovs.rst b/docs/source/pype.plugins.maya.publish.validate_vray_referenced_aovs.rst
deleted file mode 100644
index 16ad9666aa..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_vray_referenced_aovs.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_vray\_referenced\_aovs module
-=================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_vray_referenced_aovs
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_vray_translator_settings.rst b/docs/source/pype.plugins.maya.publish.validate_vray_translator_settings.rst
deleted file mode 100644
index a06a9531dd..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_vray_translator_settings.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_vray\_translator\_settings module
-=====================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_vray_translator_settings
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_vrayproxy.rst b/docs/source/pype.plugins.maya.publish.validate_vrayproxy.rst
deleted file mode 100644
index 081f58924a..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_vrayproxy.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_vrayproxy module
-====================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_vrayproxy
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_vrayproxy_members.rst b/docs/source/pype.plugins.maya.publish.validate_vrayproxy_members.rst
deleted file mode 100644
index 7c587f39b0..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_vrayproxy_members.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_vrayproxy\_members module
-=============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_vrayproxy_members
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_yeti_renderscript_callbacks.rst b/docs/source/pype.plugins.maya.publish.validate_yeti_renderscript_callbacks.rst
deleted file mode 100644
index 889d469b2f..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_yeti_renderscript_callbacks.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_yeti\_renderscript\_callbacks module
-========================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_yeti_renderscript_callbacks
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_yeti_rig_cache_state.rst b/docs/source/pype.plugins.maya.publish.validate_yeti_rig_cache_state.rst
deleted file mode 100644
index 4138b1e8a4..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_yeti_rig_cache_state.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_yeti\_rig\_cache\_state module
-==================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_yeti_rig_cache_state
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_yeti_rig_input_in_instance.rst b/docs/source/pype.plugins.maya.publish.validate_yeti_rig_input_in_instance.rst
deleted file mode 100644
index 37b862926c..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_yeti_rig_input_in_instance.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_yeti\_rig\_input\_in\_instance module
-=========================================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_yeti_rig_input_in_instance
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.publish.validate_yeti_rig_settings.rst b/docs/source/pype.plugins.maya.publish.validate_yeti_rig_settings.rst
deleted file mode 100644
index 9fd54193dc..0000000000
--- a/docs/source/pype.plugins.maya.publish.validate_yeti_rig_settings.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.plugins.maya.publish.validate\_yeti\_rig\_settings module
-==============================================================
-
-.. automodule:: pype.plugins.maya.publish.validate_yeti_rig_settings
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.plugins.maya.rst b/docs/source/pype.plugins.maya.rst
deleted file mode 100644
index 129cf5fce9..0000000000
--- a/docs/source/pype.plugins.maya.rst
+++ /dev/null
@@ -1,15 +0,0 @@
-pype.plugins.maya package
-=========================
-
-.. automodule:: pype.plugins.maya
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 10
-
- pype.plugins.maya.publish
diff --git a/docs/source/pype.plugins.rst b/docs/source/pype.plugins.rst
deleted file mode 100644
index 8e5e45ba5d..0000000000
--- a/docs/source/pype.plugins.rst
+++ /dev/null
@@ -1,15 +0,0 @@
-pype.plugins package
-====================
-
-.. automodule:: pype.plugins
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 10
-
- pype.plugins.maya
diff --git a/docs/source/pype.pype_commands.rst b/docs/source/pype.pype_commands.rst
deleted file mode 100644
index b8a416df7b..0000000000
--- a/docs/source/pype.pype_commands.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.pype\_commands module
-==========================
-
-.. automodule:: pype.pype_commands
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.resources.rst b/docs/source/pype.resources.rst
deleted file mode 100644
index 2fb5b92dce..0000000000
--- a/docs/source/pype.resources.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.resources package
-======================
-
-.. automodule:: pype.resources
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.rst b/docs/source/pype.rst
deleted file mode 100644
index 3589d2f3fe..0000000000
--- a/docs/source/pype.rst
+++ /dev/null
@@ -1,99 +0,0 @@
-pype package
-============
-
-.. automodule:: pype
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.hosts
- pype.lib
- pype.modules
- pype.resources
- pype.scripts
- pype.settings
- pype.tests
- pype.tools
- pype.vendor
- pype.widgets
-
-Submodules
-----------
-
-pype.action module
-------------------
-
-.. automodule:: pype.action
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.api module
----------------
-
-.. automodule:: pype.api
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.cli module
----------------
-
-.. automodule:: pype.cli
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.launcher\_actions module
------------------------------
-
-.. automodule:: pype.launcher_actions
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.modules\_manager module
-----------------------------
-
-.. automodule:: pype.modules_manager
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.plugin module
-------------------
-
-.. automodule:: pype.plugin
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.pype\_commands module
---------------------------
-
-.. automodule:: pype.pype_commands
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.setdress\_api module
--------------------------
-
-.. automodule:: pype.setdress_api
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.version module
--------------------
-
-.. automodule:: pype.version
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.export_maya_ass_job.rst b/docs/source/pype.scripts.export_maya_ass_job.rst
deleted file mode 100644
index c35cc49ddd..0000000000
--- a/docs/source/pype.scripts.export_maya_ass_job.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.export\_maya\_ass\_job module
-==========================================
-
-.. automodule:: pype.scripts.export_maya_ass_job
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.fusion_switch_shot.rst b/docs/source/pype.scripts.fusion_switch_shot.rst
deleted file mode 100644
index 39d3473d16..0000000000
--- a/docs/source/pype.scripts.fusion_switch_shot.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.fusion\_switch\_shot module
-========================================
-
-.. automodule:: pype.scripts.fusion_switch_shot
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.otio_burnin.rst b/docs/source/pype.scripts.otio_burnin.rst
deleted file mode 100644
index e6a93017f5..0000000000
--- a/docs/source/pype.scripts.otio_burnin.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.otio\_burnin module
-================================
-
-.. automodule:: pype.scripts.otio_burnin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.publish_deadline.rst b/docs/source/pype.scripts.publish_deadline.rst
deleted file mode 100644
index d134e17244..0000000000
--- a/docs/source/pype.scripts.publish_deadline.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.publish\_deadline module
-=====================================
-
-.. automodule:: pype.scripts.publish_deadline
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.publish_filesequence.rst b/docs/source/pype.scripts.publish_filesequence.rst
deleted file mode 100644
index 440d52caad..0000000000
--- a/docs/source/pype.scripts.publish_filesequence.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.publish\_filesequence module
-=========================================
-
-.. automodule:: pype.scripts.publish_filesequence
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.rst b/docs/source/pype.scripts.rst
deleted file mode 100644
index 5985771b97..0000000000
--- a/docs/source/pype.scripts.rst
+++ /dev/null
@@ -1,58 +0,0 @@
-pype.scripts package
-====================
-
-.. automodule:: pype.scripts
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.scripts.slates
-
-Submodules
-----------
-
-pype.scripts.export\_maya\_ass\_job module
-------------------------------------------
-
-.. automodule:: pype.scripts.export_maya_ass_job
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.scripts.fusion\_switch\_shot module
-----------------------------------------
-
-.. automodule:: pype.scripts.fusion_switch_shot
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.scripts.otio\_burnin module
---------------------------------
-
-.. automodule:: pype.scripts.otio_burnin
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.scripts.publish\_deadline module
--------------------------------------
-
-.. automodule:: pype.scripts.publish_deadline
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.scripts.publish\_filesequence module
------------------------------------------
-
-.. automodule:: pype.scripts.publish_filesequence
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.slates.rst b/docs/source/pype.scripts.slates.rst
deleted file mode 100644
index 74b4cb4343..0000000000
--- a/docs/source/pype.scripts.slates.rst
+++ /dev/null
@@ -1,15 +0,0 @@
-pype.scripts.slates package
-===========================
-
-.. automodule:: pype.scripts.slates
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.scripts.slates.slate_base
diff --git a/docs/source/pype.scripts.slates.slate_base.api.rst b/docs/source/pype.scripts.slates.slate_base.api.rst
deleted file mode 100644
index 0016a5c42a..0000000000
--- a/docs/source/pype.scripts.slates.slate_base.api.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.slates.slate\_base.api module
-==========================================
-
-.. automodule:: pype.scripts.slates.slate_base.api
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.slates.slate_base.base.rst b/docs/source/pype.scripts.slates.slate_base.base.rst
deleted file mode 100644
index 5e34d654b0..0000000000
--- a/docs/source/pype.scripts.slates.slate_base.base.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.slates.slate\_base.base module
-===========================================
-
-.. automodule:: pype.scripts.slates.slate_base.base
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.slates.slate_base.example.rst b/docs/source/pype.scripts.slates.slate_base.example.rst
deleted file mode 100644
index 95ebcc835a..0000000000
--- a/docs/source/pype.scripts.slates.slate_base.example.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.slates.slate\_base.example module
-==============================================
-
-.. automodule:: pype.scripts.slates.slate_base.example
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.slates.slate_base.font_factory.rst b/docs/source/pype.scripts.slates.slate_base.font_factory.rst
deleted file mode 100644
index c53efef554..0000000000
--- a/docs/source/pype.scripts.slates.slate_base.font_factory.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.slates.slate\_base.font\_factory module
-====================================================
-
-.. automodule:: pype.scripts.slates.slate_base.font_factory
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.slates.slate_base.items.rst b/docs/source/pype.scripts.slates.slate_base.items.rst
deleted file mode 100644
index 25abb11bb9..0000000000
--- a/docs/source/pype.scripts.slates.slate_base.items.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.slates.slate\_base.items module
-============================================
-
-.. automodule:: pype.scripts.slates.slate_base.items
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.slates.slate_base.layer.rst b/docs/source/pype.scripts.slates.slate_base.layer.rst
deleted file mode 100644
index 8681e3accf..0000000000
--- a/docs/source/pype.scripts.slates.slate_base.layer.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.slates.slate\_base.layer module
-============================================
-
-.. automodule:: pype.scripts.slates.slate_base.layer
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.slates.slate_base.lib.rst b/docs/source/pype.scripts.slates.slate_base.lib.rst
deleted file mode 100644
index c4ef2c912e..0000000000
--- a/docs/source/pype.scripts.slates.slate_base.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.slates.slate\_base.lib module
-==========================================
-
-.. automodule:: pype.scripts.slates.slate_base.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.slates.slate_base.main_frame.rst b/docs/source/pype.scripts.slates.slate_base.main_frame.rst
deleted file mode 100644
index 5093c28a74..0000000000
--- a/docs/source/pype.scripts.slates.slate_base.main_frame.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.scripts.slates.slate\_base.main\_frame module
-==================================================
-
-.. automodule:: pype.scripts.slates.slate_base.main_frame
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.scripts.slates.slate_base.rst b/docs/source/pype.scripts.slates.slate_base.rst
deleted file mode 100644
index 00726c04bf..0000000000
--- a/docs/source/pype.scripts.slates.slate_base.rst
+++ /dev/null
@@ -1,74 +0,0 @@
-pype.scripts.slates.slate\_base package
-=======================================
-
-.. automodule:: pype.scripts.slates.slate_base
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.scripts.slates.slate\_base.api module
-------------------------------------------
-
-.. automodule:: pype.scripts.slates.slate_base.api
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.scripts.slates.slate\_base.base module
--------------------------------------------
-
-.. automodule:: pype.scripts.slates.slate_base.base
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.scripts.slates.slate\_base.example module
-----------------------------------------------
-
-.. automodule:: pype.scripts.slates.slate_base.example
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.scripts.slates.slate\_base.font\_factory module
-----------------------------------------------------
-
-.. automodule:: pype.scripts.slates.slate_base.font_factory
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.scripts.slates.slate\_base.items module
---------------------------------------------
-
-.. automodule:: pype.scripts.slates.slate_base.items
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.scripts.slates.slate\_base.layer module
---------------------------------------------
-
-.. automodule:: pype.scripts.slates.slate_base.layer
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.scripts.slates.slate\_base.lib module
-------------------------------------------
-
-.. automodule:: pype.scripts.slates.slate_base.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.scripts.slates.slate\_base.main\_frame module
---------------------------------------------------
-
-.. automodule:: pype.scripts.slates.slate_base.main_frame
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.setdress_api.rst b/docs/source/pype.setdress_api.rst
deleted file mode 100644
index 95638ea64d..0000000000
--- a/docs/source/pype.setdress_api.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.setdress\_api module
-=========================
-
-.. automodule:: pype.setdress_api
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.settings.constants.rst b/docs/source/pype.settings.constants.rst
deleted file mode 100644
index ac652089c8..0000000000
--- a/docs/source/pype.settings.constants.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.settings.constants module
-==============================
-
-.. automodule:: pype.settings.constants
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.settings.handlers.rst b/docs/source/pype.settings.handlers.rst
deleted file mode 100644
index 60ea0ae952..0000000000
--- a/docs/source/pype.settings.handlers.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.settings.handlers module
-=============================
-
-.. automodule:: pype.settings.handlers
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.settings.lib.rst b/docs/source/pype.settings.lib.rst
deleted file mode 100644
index d6e3e8bd06..0000000000
--- a/docs/source/pype.settings.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.settings.lib module
-========================
-
-.. automodule:: pype.settings.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.settings.rst b/docs/source/pype.settings.rst
deleted file mode 100644
index 5bf131d555..0000000000
--- a/docs/source/pype.settings.rst
+++ /dev/null
@@ -1,18 +0,0 @@
-pype.settings package
-=====================
-
-.. automodule:: pype.settings
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.settings.lib module
-------------------------
-
-.. automodule:: pype.settings.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tests.lib.rst b/docs/source/pype.tests.lib.rst
deleted file mode 100644
index 375ebd0258..0000000000
--- a/docs/source/pype.tests.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tests.lib module
-=====================
-
-.. automodule:: pype.tests.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tests.rst b/docs/source/pype.tests.rst
deleted file mode 100644
index 3f34cdcd77..0000000000
--- a/docs/source/pype.tests.rst
+++ /dev/null
@@ -1,42 +0,0 @@
-pype.tests package
-==================
-
-.. automodule:: pype.tests
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.tests.lib module
----------------------
-
-.. automodule:: pype.tests.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tests.test\_avalon\_plugin\_presets module
------------------------------------------------
-
-.. automodule:: pype.tests.test_avalon_plugin_presets
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tests.test\_mongo\_performance module
-------------------------------------------
-
-.. automodule:: pype.tests.test_mongo_performance
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tests.test\_pyblish\_filter module
----------------------------------------
-
-.. automodule:: pype.tests.test_pyblish_filter
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tests.test_avalon_plugin_presets.rst b/docs/source/pype.tests.test_avalon_plugin_presets.rst
deleted file mode 100644
index b4ff802256..0000000000
--- a/docs/source/pype.tests.test_avalon_plugin_presets.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tests.test\_avalon\_plugin\_presets module
-===============================================
-
-.. automodule:: pype.tests.test_avalon_plugin_presets
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tests.test_lib_restructuralization.rst b/docs/source/pype.tests.test_lib_restructuralization.rst
deleted file mode 100644
index 8d426fcb6b..0000000000
--- a/docs/source/pype.tests.test_lib_restructuralization.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tests.test\_lib\_restructuralization module
-================================================
-
-.. automodule:: pype.tests.test_lib_restructuralization
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tests.test_mongo_performance.rst b/docs/source/pype.tests.test_mongo_performance.rst
deleted file mode 100644
index 4686247e59..0000000000
--- a/docs/source/pype.tests.test_mongo_performance.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tests.test\_mongo\_performance module
-==========================================
-
-.. automodule:: pype.tests.test_mongo_performance
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tests.test_pyblish_filter.rst b/docs/source/pype.tests.test_pyblish_filter.rst
deleted file mode 100644
index 196ec02433..0000000000
--- a/docs/source/pype.tests.test_pyblish_filter.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tests.test\_pyblish\_filter module
-=======================================
-
-.. automodule:: pype.tests.test_pyblish_filter
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.assetcreator.app.rst b/docs/source/pype.tools.assetcreator.app.rst
deleted file mode 100644
index b46281b07a..0000000000
--- a/docs/source/pype.tools.assetcreator.app.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.assetcreator.app module
-==================================
-
-.. automodule:: pype.tools.assetcreator.app
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.assetcreator.model.rst b/docs/source/pype.tools.assetcreator.model.rst
deleted file mode 100644
index 752791d07c..0000000000
--- a/docs/source/pype.tools.assetcreator.model.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.assetcreator.model module
-====================================
-
-.. automodule:: pype.tools.assetcreator.model
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.assetcreator.rst b/docs/source/pype.tools.assetcreator.rst
deleted file mode 100644
index b95c3b3c60..0000000000
--- a/docs/source/pype.tools.assetcreator.rst
+++ /dev/null
@@ -1,34 +0,0 @@
-pype.tools.assetcreator package
-===============================
-
-.. automodule:: pype.tools.assetcreator
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.tools.assetcreator.app module
-----------------------------------
-
-.. automodule:: pype.tools.assetcreator.app
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.assetcreator.model module
-------------------------------------
-
-.. automodule:: pype.tools.assetcreator.model
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.assetcreator.widget module
--------------------------------------
-
-.. automodule:: pype.tools.assetcreator.widget
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.assetcreator.widget.rst b/docs/source/pype.tools.assetcreator.widget.rst
deleted file mode 100644
index 23ed335306..0000000000
--- a/docs/source/pype.tools.assetcreator.widget.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.assetcreator.widget module
-=====================================
-
-.. automodule:: pype.tools.assetcreator.widget
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.launcher.actions.rst b/docs/source/pype.tools.launcher.actions.rst
deleted file mode 100644
index e2ec217d4b..0000000000
--- a/docs/source/pype.tools.launcher.actions.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.launcher.actions module
-==================================
-
-.. automodule:: pype.tools.launcher.actions
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.launcher.delegates.rst b/docs/source/pype.tools.launcher.delegates.rst
deleted file mode 100644
index e8a7519cd5..0000000000
--- a/docs/source/pype.tools.launcher.delegates.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.launcher.delegates module
-====================================
-
-.. automodule:: pype.tools.launcher.delegates
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.launcher.flickcharm.rst b/docs/source/pype.tools.launcher.flickcharm.rst
deleted file mode 100644
index 5105d3235e..0000000000
--- a/docs/source/pype.tools.launcher.flickcharm.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.launcher.flickcharm module
-=====================================
-
-.. automodule:: pype.tools.launcher.flickcharm
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.launcher.lib.rst b/docs/source/pype.tools.launcher.lib.rst
deleted file mode 100644
index 28db8a6540..0000000000
--- a/docs/source/pype.tools.launcher.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.launcher.lib module
-==============================
-
-.. automodule:: pype.tools.launcher.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.launcher.models.rst b/docs/source/pype.tools.launcher.models.rst
deleted file mode 100644
index 701826284e..0000000000
--- a/docs/source/pype.tools.launcher.models.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.launcher.models module
-=================================
-
-.. automodule:: pype.tools.launcher.models
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.launcher.rst b/docs/source/pype.tools.launcher.rst
deleted file mode 100644
index c4782bf9bb..0000000000
--- a/docs/source/pype.tools.launcher.rst
+++ /dev/null
@@ -1,66 +0,0 @@
-pype.tools.launcher package
-===========================
-
-.. automodule:: pype.tools.launcher
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.tools.launcher.actions module
-----------------------------------
-
-.. automodule:: pype.tools.launcher.actions
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.launcher.delegates module
-------------------------------------
-
-.. automodule:: pype.tools.launcher.delegates
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.launcher.flickcharm module
--------------------------------------
-
-.. automodule:: pype.tools.launcher.flickcharm
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.launcher.lib module
-------------------------------
-
-.. automodule:: pype.tools.launcher.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.launcher.models module
----------------------------------
-
-.. automodule:: pype.tools.launcher.models
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.launcher.widgets module
-----------------------------------
-
-.. automodule:: pype.tools.launcher.widgets
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.launcher.window module
----------------------------------
-
-.. automodule:: pype.tools.launcher.window
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.launcher.widgets.rst b/docs/source/pype.tools.launcher.widgets.rst
deleted file mode 100644
index 400a5b7a2c..0000000000
--- a/docs/source/pype.tools.launcher.widgets.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.launcher.widgets module
-==================================
-
-.. automodule:: pype.tools.launcher.widgets
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.launcher.window.rst b/docs/source/pype.tools.launcher.window.rst
deleted file mode 100644
index ae92207795..0000000000
--- a/docs/source/pype.tools.launcher.window.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.launcher.window module
-=================================
-
-.. automodule:: pype.tools.launcher.window
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.app.rst b/docs/source/pype.tools.pyblish_pype.app.rst
deleted file mode 100644
index a70aada725..0000000000
--- a/docs/source/pype.tools.pyblish_pype.app.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.app module
-===================================
-
-.. automodule:: pype.tools.pyblish_pype.app
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.awesome.rst b/docs/source/pype.tools.pyblish_pype.awesome.rst
deleted file mode 100644
index 50a81ac5e8..0000000000
--- a/docs/source/pype.tools.pyblish_pype.awesome.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.awesome module
-=======================================
-
-.. automodule:: pype.tools.pyblish_pype.awesome
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.compat.rst b/docs/source/pype.tools.pyblish_pype.compat.rst
deleted file mode 100644
index 4beee41e00..0000000000
--- a/docs/source/pype.tools.pyblish_pype.compat.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.compat module
-======================================
-
-.. automodule:: pype.tools.pyblish_pype.compat
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.constants.rst b/docs/source/pype.tools.pyblish_pype.constants.rst
deleted file mode 100644
index bab67a2270..0000000000
--- a/docs/source/pype.tools.pyblish_pype.constants.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.constants module
-=========================================
-
-.. automodule:: pype.tools.pyblish_pype.constants
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.control.rst b/docs/source/pype.tools.pyblish_pype.control.rst
deleted file mode 100644
index c2f8c0031e..0000000000
--- a/docs/source/pype.tools.pyblish_pype.control.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.control module
-=======================================
-
-.. automodule:: pype.tools.pyblish_pype.control
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.delegate.rst b/docs/source/pype.tools.pyblish_pype.delegate.rst
deleted file mode 100644
index 8796c9830f..0000000000
--- a/docs/source/pype.tools.pyblish_pype.delegate.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.delegate module
-========================================
-
-.. automodule:: pype.tools.pyblish_pype.delegate
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.mock.rst b/docs/source/pype.tools.pyblish_pype.mock.rst
deleted file mode 100644
index 8c22e80856..0000000000
--- a/docs/source/pype.tools.pyblish_pype.mock.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.mock module
-====================================
-
-.. automodule:: pype.tools.pyblish_pype.mock
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.model.rst b/docs/source/pype.tools.pyblish_pype.model.rst
deleted file mode 100644
index 983b06cc8a..0000000000
--- a/docs/source/pype.tools.pyblish_pype.model.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.model module
-=====================================
-
-.. automodule:: pype.tools.pyblish_pype.model
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.rst b/docs/source/pype.tools.pyblish_pype.rst
deleted file mode 100644
index 9479b5399f..0000000000
--- a/docs/source/pype.tools.pyblish_pype.rst
+++ /dev/null
@@ -1,130 +0,0 @@
-pype.tools.pyblish\_pype package
-================================
-
-.. automodule:: pype.tools.pyblish_pype
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.tools.pyblish_pype.vendor
-
-Submodules
-----------
-
-pype.tools.pyblish\_pype.app module
------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.app
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.awesome module
----------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.awesome
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.compat module
---------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.compat
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.constants module
------------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.constants
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.control module
----------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.control
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.delegate module
-----------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.delegate
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.mock module
-------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.mock
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.model module
--------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.model
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.settings module
-----------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.settings
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.util module
-------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.util
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.version module
----------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.version
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.view module
-------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.view
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.widgets module
----------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.widgets
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.window module
---------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.window
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.settings.rst b/docs/source/pype.tools.pyblish_pype.settings.rst
deleted file mode 100644
index 2e4e95cca0..0000000000
--- a/docs/source/pype.tools.pyblish_pype.settings.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.settings module
-========================================
-
-.. automodule:: pype.tools.pyblish_pype.settings
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.util.rst b/docs/source/pype.tools.pyblish_pype.util.rst
deleted file mode 100644
index fa34295f12..0000000000
--- a/docs/source/pype.tools.pyblish_pype.util.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.util module
-====================================
-
-.. automodule:: pype.tools.pyblish_pype.util
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.vendor.qtawesome.animation.rst b/docs/source/pype.tools.pyblish_pype.vendor.qtawesome.animation.rst
deleted file mode 100644
index a892128308..0000000000
--- a/docs/source/pype.tools.pyblish_pype.vendor.qtawesome.animation.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.vendor.qtawesome.animation module
-==========================================================
-
-.. automodule:: pype.tools.pyblish_pype.vendor.qtawesome.animation
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.vendor.qtawesome.iconic_font.rst b/docs/source/pype.tools.pyblish_pype.vendor.qtawesome.iconic_font.rst
deleted file mode 100644
index 4f4337348f..0000000000
--- a/docs/source/pype.tools.pyblish_pype.vendor.qtawesome.iconic_font.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.vendor.qtawesome.iconic\_font module
-=============================================================
-
-.. automodule:: pype.tools.pyblish_pype.vendor.qtawesome.iconic_font
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.vendor.qtawesome.rst b/docs/source/pype.tools.pyblish_pype.vendor.qtawesome.rst
deleted file mode 100644
index 68b2ec4659..0000000000
--- a/docs/source/pype.tools.pyblish_pype.vendor.qtawesome.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.tools.pyblish\_pype.vendor.qtawesome package
-=================================================
-
-.. automodule:: pype.tools.pyblish_pype.vendor.qtawesome
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.tools.pyblish\_pype.vendor.qtawesome.animation module
-----------------------------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.vendor.qtawesome.animation
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.pyblish\_pype.vendor.qtawesome.iconic\_font module
--------------------------------------------------------------
-
-.. automodule:: pype.tools.pyblish_pype.vendor.qtawesome.iconic_font
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.vendor.rst b/docs/source/pype.tools.pyblish_pype.vendor.rst
deleted file mode 100644
index 69e6096053..0000000000
--- a/docs/source/pype.tools.pyblish_pype.vendor.rst
+++ /dev/null
@@ -1,15 +0,0 @@
-pype.tools.pyblish\_pype.vendor package
-=======================================
-
-.. automodule:: pype.tools.pyblish_pype.vendor
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.tools.pyblish_pype.vendor.qtawesome
diff --git a/docs/source/pype.tools.pyblish_pype.version.rst b/docs/source/pype.tools.pyblish_pype.version.rst
deleted file mode 100644
index a6ddcd5ce8..0000000000
--- a/docs/source/pype.tools.pyblish_pype.version.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.version module
-=======================================
-
-.. automodule:: pype.tools.pyblish_pype.version
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.view.rst b/docs/source/pype.tools.pyblish_pype.view.rst
deleted file mode 100644
index 21d34d9daa..0000000000
--- a/docs/source/pype.tools.pyblish_pype.view.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.view module
-====================================
-
-.. automodule:: pype.tools.pyblish_pype.view
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.widgets.rst b/docs/source/pype.tools.pyblish_pype.widgets.rst
deleted file mode 100644
index 8a0d3c380a..0000000000
--- a/docs/source/pype.tools.pyblish_pype.widgets.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.widgets module
-=======================================
-
-.. automodule:: pype.tools.pyblish_pype.widgets
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.pyblish_pype.window.rst b/docs/source/pype.tools.pyblish_pype.window.rst
deleted file mode 100644
index 10f7b1a36e..0000000000
--- a/docs/source/pype.tools.pyblish_pype.window.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.pyblish\_pype.window module
-======================================
-
-.. automodule:: pype.tools.pyblish_pype.window
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.rst b/docs/source/pype.tools.rst
deleted file mode 100644
index d82ed3384a..0000000000
--- a/docs/source/pype.tools.rst
+++ /dev/null
@@ -1,19 +0,0 @@
-pype.tools package
-==================
-
-.. automodule:: pype.tools
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.tools.assetcreator
- pype.tools.launcher
- pype.tools.pyblish_pype
- pype.tools.settings
- pype.tools.standalonepublish
diff --git a/docs/source/pype.tools.settings.rst b/docs/source/pype.tools.settings.rst
deleted file mode 100644
index ef54851ab1..0000000000
--- a/docs/source/pype.tools.settings.rst
+++ /dev/null
@@ -1,15 +0,0 @@
-pype.tools.settings package
-===========================
-
-.. automodule:: pype.tools.settings
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.tools.settings.settings
diff --git a/docs/source/pype.tools.settings.settings.rst b/docs/source/pype.tools.settings.settings.rst
deleted file mode 100644
index 793914e1a8..0000000000
--- a/docs/source/pype.tools.settings.settings.rst
+++ /dev/null
@@ -1,16 +0,0 @@
-pype.tools.settings.settings package
-====================================
-
-.. automodule:: pype.tools.settings.settings
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.tools.settings.settings.style
- pype.tools.settings.settings.widgets
diff --git a/docs/source/pype.tools.settings.settings.style.rst b/docs/source/pype.tools.settings.settings.style.rst
deleted file mode 100644
index 228322245c..0000000000
--- a/docs/source/pype.tools.settings.settings.style.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.settings.settings.style package
-==========================================
-
-.. automodule:: pype.tools.settings.settings.style
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.settings.settings.widgets.anatomy_types.rst b/docs/source/pype.tools.settings.settings.widgets.anatomy_types.rst
deleted file mode 100644
index ca951c82f0..0000000000
--- a/docs/source/pype.tools.settings.settings.widgets.anatomy_types.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.settings.settings.widgets.anatomy\_types module
-==========================================================
-
-.. automodule:: pype.tools.settings.settings.widgets.anatomy_types
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.settings.settings.widgets.base.rst b/docs/source/pype.tools.settings.settings.widgets.base.rst
deleted file mode 100644
index 8964d6f628..0000000000
--- a/docs/source/pype.tools.settings.settings.widgets.base.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.settings.settings.widgets.base module
-================================================
-
-.. automodule:: pype.tools.settings.settings.widgets.base
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.settings.settings.widgets.item_types.rst b/docs/source/pype.tools.settings.settings.widgets.item_types.rst
deleted file mode 100644
index 5e505538a7..0000000000
--- a/docs/source/pype.tools.settings.settings.widgets.item_types.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.settings.settings.widgets.item\_types module
-=======================================================
-
-.. automodule:: pype.tools.settings.settings.widgets.item_types
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.settings.settings.widgets.lib.rst b/docs/source/pype.tools.settings.settings.widgets.lib.rst
deleted file mode 100644
index ae100c74b2..0000000000
--- a/docs/source/pype.tools.settings.settings.widgets.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.settings.settings.widgets.lib module
-===============================================
-
-.. automodule:: pype.tools.settings.settings.widgets.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.settings.settings.widgets.multiselection_combobox.rst b/docs/source/pype.tools.settings.settings.widgets.multiselection_combobox.rst
deleted file mode 100644
index 004f2ae21f..0000000000
--- a/docs/source/pype.tools.settings.settings.widgets.multiselection_combobox.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.settings.settings.widgets.multiselection\_combobox module
-====================================================================
-
-.. automodule:: pype.tools.settings.settings.widgets.multiselection_combobox
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.settings.settings.widgets.rst b/docs/source/pype.tools.settings.settings.widgets.rst
deleted file mode 100644
index 8734280a08..0000000000
--- a/docs/source/pype.tools.settings.settings.widgets.rst
+++ /dev/null
@@ -1,74 +0,0 @@
-pype.tools.settings.settings.widgets package
-============================================
-
-.. automodule:: pype.tools.settings.settings.widgets
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.tools.settings.settings.widgets.anatomy\_types module
-----------------------------------------------------------
-
-.. automodule:: pype.tools.settings.settings.widgets.anatomy_types
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.settings.settings.widgets.base module
-------------------------------------------------
-
-.. automodule:: pype.tools.settings.settings.widgets.base
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.settings.settings.widgets.item\_types module
--------------------------------------------------------
-
-.. automodule:: pype.tools.settings.settings.widgets.item_types
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.settings.settings.widgets.lib module
------------------------------------------------
-
-.. automodule:: pype.tools.settings.settings.widgets.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.settings.settings.widgets.multiselection\_combobox module
---------------------------------------------------------------------
-
-.. automodule:: pype.tools.settings.settings.widgets.multiselection_combobox
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.settings.settings.widgets.tests module
--------------------------------------------------
-
-.. automodule:: pype.tools.settings.settings.widgets.tests
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.settings.settings.widgets.widgets module
----------------------------------------------------
-
-.. automodule:: pype.tools.settings.settings.widgets.widgets
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.settings.settings.widgets.window module
---------------------------------------------------
-
-.. automodule:: pype.tools.settings.settings.widgets.window
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.settings.settings.widgets.tests.rst b/docs/source/pype.tools.settings.settings.widgets.tests.rst
deleted file mode 100644
index fe8d6dabef..0000000000
--- a/docs/source/pype.tools.settings.settings.widgets.tests.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.settings.settings.widgets.tests module
-=================================================
-
-.. automodule:: pype.tools.settings.settings.widgets.tests
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.settings.settings.widgets.widgets.rst b/docs/source/pype.tools.settings.settings.widgets.widgets.rst
deleted file mode 100644
index 238e584ac3..0000000000
--- a/docs/source/pype.tools.settings.settings.widgets.widgets.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.settings.settings.widgets.widgets module
-===================================================
-
-.. automodule:: pype.tools.settings.settings.widgets.widgets
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.settings.settings.widgets.window.rst b/docs/source/pype.tools.settings.settings.widgets.window.rst
deleted file mode 100644
index d67678012f..0000000000
--- a/docs/source/pype.tools.settings.settings.widgets.window.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.settings.settings.widgets.window module
-==================================================
-
-.. automodule:: pype.tools.settings.settings.widgets.window
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.app.rst b/docs/source/pype.tools.standalonepublish.app.rst
deleted file mode 100644
index 74776b80fe..0000000000
--- a/docs/source/pype.tools.standalonepublish.app.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.app module
-=======================================
-
-.. automodule:: pype.tools.standalonepublish.app
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.publish.rst b/docs/source/pype.tools.standalonepublish.publish.rst
deleted file mode 100644
index 47ad57e7fb..0000000000
--- a/docs/source/pype.tools.standalonepublish.publish.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.publish module
-===========================================
-
-.. automodule:: pype.tools.standalonepublish.publish
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.rst b/docs/source/pype.tools.standalonepublish.rst
deleted file mode 100644
index 5ca8194b61..0000000000
--- a/docs/source/pype.tools.standalonepublish.rst
+++ /dev/null
@@ -1,34 +0,0 @@
-pype.tools.standalonepublish package
-====================================
-
-.. automodule:: pype.tools.standalonepublish
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.tools.standalonepublish.widgets
-
-Submodules
-----------
-
-pype.tools.standalonepublish.app module
----------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.app
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.publish module
--------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.publish
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.model_asset.rst b/docs/source/pype.tools.standalonepublish.widgets.model_asset.rst
deleted file mode 100644
index 84d0ca2d93..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.model_asset.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.model\_asset module
-========================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_asset
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.model_filter_proxy_exact_match.rst b/docs/source/pype.tools.standalonepublish.widgets.model_filter_proxy_exact_match.rst
deleted file mode 100644
index 0c3ae79b99..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.model_filter_proxy_exact_match.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.model\_filter\_proxy\_exact\_match module
-==============================================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_filter_proxy_exact_match
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.model_filter_proxy_recursive_sort.rst b/docs/source/pype.tools.standalonepublish.widgets.model_filter_proxy_recursive_sort.rst
deleted file mode 100644
index b828b75030..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.model_filter_proxy_recursive_sort.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.model\_filter\_proxy\_recursive\_sort module
-=================================================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_filter_proxy_recursive_sort
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.model_node.rst b/docs/source/pype.tools.standalonepublish.widgets.model_node.rst
deleted file mode 100644
index 4789b14501..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.model_node.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.model\_node module
-=======================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_node
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.model_tasks_template.rst b/docs/source/pype.tools.standalonepublish.widgets.model_tasks_template.rst
deleted file mode 100644
index dbee838530..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.model_tasks_template.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.model\_tasks\_template module
-==================================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_tasks_template
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.model_tree.rst b/docs/source/pype.tools.standalonepublish.widgets.model_tree.rst
deleted file mode 100644
index 38eecb095a..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.model_tree.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.model\_tree module
-=======================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_tree
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.model_tree_view_deselectable.rst b/docs/source/pype.tools.standalonepublish.widgets.model_tree_view_deselectable.rst
deleted file mode 100644
index 9afb505113..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.model_tree_view_deselectable.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.model\_tree\_view\_deselectable module
-===========================================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_tree_view_deselectable
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.resources.rst b/docs/source/pype.tools.standalonepublish.widgets.resources.rst
deleted file mode 100644
index a0eddae63e..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.resources.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.resources package
-======================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.resources
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.rst b/docs/source/pype.tools.standalonepublish.widgets.rst
deleted file mode 100644
index 65bbcb62fc..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.rst
+++ /dev/null
@@ -1,146 +0,0 @@
-pype.tools.standalonepublish.widgets package
-============================================
-
-.. automodule:: pype.tools.standalonepublish.widgets
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.tools.standalonepublish.widgets.resources
-
-Submodules
-----------
-
-pype.tools.standalonepublish.widgets.model\_asset module
---------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_asset
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.model\_filter\_proxy\_exact\_match module
-------------------------------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_filter_proxy_exact_match
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.model\_filter\_proxy\_recursive\_sort module
----------------------------------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_filter_proxy_recursive_sort
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.model\_node module
--------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_node
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.model\_tasks\_template module
-------------------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_tasks_template
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.model\_tree module
--------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_tree
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.model\_tree\_view\_deselectable module
----------------------------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.model_tree_view_deselectable
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.widget\_asset module
----------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_asset
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.widget\_component\_item module
--------------------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_component_item
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.widget\_components module
---------------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_components
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.widget\_components\_list module
---------------------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_components_list
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.widget\_drop\_empty module
----------------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_drop_empty
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.widget\_drop\_frame module
----------------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_drop_frame
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.widget\_family module
-----------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_family
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.widget\_family\_desc module
-----------------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_family_desc
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.tools.standalonepublish.widgets.widget\_shadow module
-----------------------------------------------------------
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_shadow
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.widget_asset.rst b/docs/source/pype.tools.standalonepublish.widgets.widget_asset.rst
deleted file mode 100644
index 51a3763628..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.widget_asset.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.widget\_asset module
-=========================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_asset
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.widget_component_item.rst b/docs/source/pype.tools.standalonepublish.widgets.widget_component_item.rst
deleted file mode 100644
index 3495fdf192..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.widget_component_item.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.widget\_component\_item module
-===================================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_component_item
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.widget_components.rst b/docs/source/pype.tools.standalonepublish.widgets.widget_components.rst
deleted file mode 100644
index be7c19af9f..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.widget_components.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.widget\_components module
-==============================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_components
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.widget_components_list.rst b/docs/source/pype.tools.standalonepublish.widgets.widget_components_list.rst
deleted file mode 100644
index 051efe07fe..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.widget_components_list.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.widget\_components\_list module
-====================================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_components_list
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.widget_drop_empty.rst b/docs/source/pype.tools.standalonepublish.widgets.widget_drop_empty.rst
deleted file mode 100644
index b5b0a6acac..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.widget_drop_empty.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.widget\_drop\_empty module
-===============================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_drop_empty
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.widget_drop_frame.rst b/docs/source/pype.tools.standalonepublish.widgets.widget_drop_frame.rst
deleted file mode 100644
index 6b3e3690e0..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.widget_drop_frame.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.widget\_drop\_frame module
-===============================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_drop_frame
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.widget_family.rst b/docs/source/pype.tools.standalonepublish.widgets.widget_family.rst
deleted file mode 100644
index 24c9d5496f..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.widget_family.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.widget\_family module
-==========================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_family
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.widget_family_desc.rst b/docs/source/pype.tools.standalonepublish.widgets.widget_family_desc.rst
deleted file mode 100644
index 5a7f92177f..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.widget_family_desc.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.widget\_family\_desc module
-================================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_family_desc
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.standalonepublish.widgets.widget_shadow.rst b/docs/source/pype.tools.standalonepublish.widgets.widget_shadow.rst
deleted file mode 100644
index 19f5c22198..0000000000
--- a/docs/source/pype.tools.standalonepublish.widgets.widget_shadow.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.standalonepublish.widgets.widget\_shadow module
-==========================================================
-
-.. automodule:: pype.tools.standalonepublish.widgets.widget_shadow
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.tray.pype_tray.rst b/docs/source/pype.tools.tray.pype_tray.rst
deleted file mode 100644
index 9fc49c5763..0000000000
--- a/docs/source/pype.tools.tray.pype_tray.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.tray.pype\_tray module
-=================================
-
-.. automodule:: pype.tools.tray.pype_tray
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.tray.rst b/docs/source/pype.tools.tray.rst
deleted file mode 100644
index b28059d170..0000000000
--- a/docs/source/pype.tools.tray.rst
+++ /dev/null
@@ -1,15 +0,0 @@
-pype.tools.tray package
-=======================
-
-.. automodule:: pype.tools.tray
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-.. toctree::
- :maxdepth: 10
-
- pype.tools.tray.pype_tray
diff --git a/docs/source/pype.tools.workfiles.app.rst b/docs/source/pype.tools.workfiles.app.rst
deleted file mode 100644
index a3a46b8a07..0000000000
--- a/docs/source/pype.tools.workfiles.app.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.workfiles.app module
-===============================
-
-.. automodule:: pype.tools.workfiles.app
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.workfiles.model.rst b/docs/source/pype.tools.workfiles.model.rst
deleted file mode 100644
index 44cea32b97..0000000000
--- a/docs/source/pype.tools.workfiles.model.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.workfiles.model module
-=================================
-
-.. automodule:: pype.tools.workfiles.model
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.tools.workfiles.rst b/docs/source/pype.tools.workfiles.rst
deleted file mode 100644
index 147c4cebbe..0000000000
--- a/docs/source/pype.tools.workfiles.rst
+++ /dev/null
@@ -1,17 +0,0 @@
-pype.tools.workfiles package
-============================
-
-.. automodule:: pype.tools.workfiles
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-.. toctree::
- :maxdepth: 10
-
- pype.tools.workfiles.app
- pype.tools.workfiles.model
- pype.tools.workfiles.view
diff --git a/docs/source/pype.tools.workfiles.view.rst b/docs/source/pype.tools.workfiles.view.rst
deleted file mode 100644
index acd32ed250..0000000000
--- a/docs/source/pype.tools.workfiles.view.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.tools.workfiles.view module
-================================
-
-.. automodule:: pype.tools.workfiles.view
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.backports.configparser.helpers.rst b/docs/source/pype.vendor.backports.configparser.helpers.rst
deleted file mode 100644
index 8d44d0a8c4..0000000000
--- a/docs/source/pype.vendor.backports.configparser.helpers.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.backports.configparser.helpers module
-=================================================
-
-.. automodule:: pype.vendor.backports.configparser.helpers
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.backports.configparser.rst b/docs/source/pype.vendor.backports.configparser.rst
deleted file mode 100644
index 4f778a4a87..0000000000
--- a/docs/source/pype.vendor.backports.configparser.rst
+++ /dev/null
@@ -1,18 +0,0 @@
-pype.vendor.backports.configparser package
-==========================================
-
-.. automodule:: pype.vendor.backports.configparser
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.vendor.backports.configparser.helpers module
--------------------------------------------------
-
-.. automodule:: pype.vendor.backports.configparser.helpers
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.backports.functools_lru_cache.rst b/docs/source/pype.vendor.backports.functools_lru_cache.rst
deleted file mode 100644
index 26f2746cec..0000000000
--- a/docs/source/pype.vendor.backports.functools_lru_cache.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.backports.functools\_lru\_cache module
-==================================================
-
-.. automodule:: pype.vendor.backports.functools_lru_cache
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.backports.rst b/docs/source/pype.vendor.backports.rst
deleted file mode 100644
index ff9efc29c5..0000000000
--- a/docs/source/pype.vendor.backports.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-pype.vendor.backports package
-=============================
-
-.. automodule:: pype.vendor.backports
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.vendor.backports.configparser
-
-Submodules
-----------
-
-pype.vendor.backports.functools\_lru\_cache module
---------------------------------------------------
-
-.. automodule:: pype.vendor.backports.functools_lru_cache
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.builtins.rst b/docs/source/pype.vendor.builtins.rst
deleted file mode 100644
index e21fb768ed..0000000000
--- a/docs/source/pype.vendor.builtins.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.builtins package
-============================
-
-.. automodule:: pype.vendor.builtins
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture.rst b/docs/source/pype.vendor.capture.rst
deleted file mode 100644
index d42e073fb5..0000000000
--- a/docs/source/pype.vendor.capture.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.capture module
-==========================
-
-.. automodule:: pype.vendor.capture
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture_gui.accordion.rst b/docs/source/pype.vendor.capture_gui.accordion.rst
deleted file mode 100644
index cca228f151..0000000000
--- a/docs/source/pype.vendor.capture_gui.accordion.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.capture\_gui.accordion module
-=========================================
-
-.. automodule:: pype.vendor.capture_gui.accordion
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture_gui.app.rst b/docs/source/pype.vendor.capture_gui.app.rst
deleted file mode 100644
index 291296834e..0000000000
--- a/docs/source/pype.vendor.capture_gui.app.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.capture\_gui.app module
-===================================
-
-.. automodule:: pype.vendor.capture_gui.app
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture_gui.colorpicker.rst b/docs/source/pype.vendor.capture_gui.colorpicker.rst
deleted file mode 100644
index c9e56500f2..0000000000
--- a/docs/source/pype.vendor.capture_gui.colorpicker.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.capture\_gui.colorpicker module
-===========================================
-
-.. automodule:: pype.vendor.capture_gui.colorpicker
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture_gui.lib.rst b/docs/source/pype.vendor.capture_gui.lib.rst
deleted file mode 100644
index e94a3bd196..0000000000
--- a/docs/source/pype.vendor.capture_gui.lib.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.capture\_gui.lib module
-===================================
-
-.. automodule:: pype.vendor.capture_gui.lib
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture_gui.plugin.rst b/docs/source/pype.vendor.capture_gui.plugin.rst
deleted file mode 100644
index 2e8f58c873..0000000000
--- a/docs/source/pype.vendor.capture_gui.plugin.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.capture\_gui.plugin module
-======================================
-
-.. automodule:: pype.vendor.capture_gui.plugin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture_gui.presets.rst b/docs/source/pype.vendor.capture_gui.presets.rst
deleted file mode 100644
index c81b4e1c23..0000000000
--- a/docs/source/pype.vendor.capture_gui.presets.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.capture\_gui.presets module
-=======================================
-
-.. automodule:: pype.vendor.capture_gui.presets
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture_gui.rst b/docs/source/pype.vendor.capture_gui.rst
deleted file mode 100644
index f7efce3501..0000000000
--- a/docs/source/pype.vendor.capture_gui.rst
+++ /dev/null
@@ -1,82 +0,0 @@
-pype.vendor.capture\_gui package
-================================
-
-.. automodule:: pype.vendor.capture_gui
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.vendor.capture_gui.vendor
-
-Submodules
-----------
-
-pype.vendor.capture\_gui.accordion module
------------------------------------------
-
-.. automodule:: pype.vendor.capture_gui.accordion
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.capture\_gui.app module
------------------------------------
-
-.. automodule:: pype.vendor.capture_gui.app
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.capture\_gui.colorpicker module
--------------------------------------------
-
-.. automodule:: pype.vendor.capture_gui.colorpicker
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.capture\_gui.lib module
------------------------------------
-
-.. automodule:: pype.vendor.capture_gui.lib
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.capture\_gui.plugin module
---------------------------------------
-
-.. automodule:: pype.vendor.capture_gui.plugin
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.capture\_gui.presets module
----------------------------------------
-
-.. automodule:: pype.vendor.capture_gui.presets
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.capture\_gui.tokens module
---------------------------------------
-
-.. automodule:: pype.vendor.capture_gui.tokens
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.capture\_gui.version module
----------------------------------------
-
-.. automodule:: pype.vendor.capture_gui.version
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture_gui.tokens.rst b/docs/source/pype.vendor.capture_gui.tokens.rst
deleted file mode 100644
index 9e144a4d37..0000000000
--- a/docs/source/pype.vendor.capture_gui.tokens.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.capture\_gui.tokens module
-======================================
-
-.. automodule:: pype.vendor.capture_gui.tokens
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture_gui.vendor.Qt.rst b/docs/source/pype.vendor.capture_gui.vendor.Qt.rst
deleted file mode 100644
index 447e6dd812..0000000000
--- a/docs/source/pype.vendor.capture_gui.vendor.Qt.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.capture\_gui.vendor.Qt module
-=========================================
-
-.. automodule:: pype.vendor.capture_gui.vendor.Qt
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture_gui.vendor.rst b/docs/source/pype.vendor.capture_gui.vendor.rst
deleted file mode 100644
index 0befc4bbb7..0000000000
--- a/docs/source/pype.vendor.capture_gui.vendor.rst
+++ /dev/null
@@ -1,18 +0,0 @@
-pype.vendor.capture\_gui.vendor package
-=======================================
-
-.. automodule:: pype.vendor.capture_gui.vendor
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.vendor.capture\_gui.vendor.Qt module
------------------------------------------
-
-.. automodule:: pype.vendor.capture_gui.vendor.Qt
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.capture_gui.version.rst b/docs/source/pype.vendor.capture_gui.version.rst
deleted file mode 100644
index 3f0cfbabfd..0000000000
--- a/docs/source/pype.vendor.capture_gui.version.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.capture\_gui.version module
-=======================================
-
-.. automodule:: pype.vendor.capture_gui.version
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.accessor.base.rst b/docs/source/pype.vendor.ftrack_api_old.accessor.base.rst
deleted file mode 100644
index 5155df82aa..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.accessor.base.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.accessor.base module
-=================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.accessor.base
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.accessor.disk.rst b/docs/source/pype.vendor.ftrack_api_old.accessor.disk.rst
deleted file mode 100644
index 3040fe18fd..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.accessor.disk.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.accessor.disk module
-=================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.accessor.disk
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.accessor.rst b/docs/source/pype.vendor.ftrack_api_old.accessor.rst
deleted file mode 100644
index 1f7b522460..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.accessor.rst
+++ /dev/null
@@ -1,34 +0,0 @@
-pype.vendor.ftrack\_api\_old.accessor package
-=============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.accessor
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.vendor.ftrack\_api\_old.accessor.base module
--------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.accessor.base
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.accessor.disk module
--------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.accessor.disk
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.accessor.server module
----------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.accessor.server
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.accessor.server.rst b/docs/source/pype.vendor.ftrack_api_old.accessor.server.rst
deleted file mode 100644
index db835f99c4..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.accessor.server.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.accessor.server module
-===================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.accessor.server
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.attribute.rst b/docs/source/pype.vendor.ftrack_api_old.attribute.rst
deleted file mode 100644
index 54276ceb2a..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.attribute.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.attribute module
-=============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.attribute
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.cache.rst b/docs/source/pype.vendor.ftrack_api_old.cache.rst
deleted file mode 100644
index 396bc5a1cd..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.cache.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.cache module
-=========================================
-
-.. automodule:: pype.vendor.ftrack_api_old.cache
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.collection.rst b/docs/source/pype.vendor.ftrack_api_old.collection.rst
deleted file mode 100644
index de911619fc..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.collection.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.collection module
-==============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.collection
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.data.rst b/docs/source/pype.vendor.ftrack_api_old.data.rst
deleted file mode 100644
index 2f67185cee..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.data.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.data module
-========================================
-
-.. automodule:: pype.vendor.ftrack_api_old.data
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.entity.asset_version.rst b/docs/source/pype.vendor.ftrack_api_old.entity.asset_version.rst
deleted file mode 100644
index 7ad3d87fd9..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.entity.asset_version.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.entity.asset\_version module
-=========================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.asset_version
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.entity.base.rst b/docs/source/pype.vendor.ftrack_api_old.entity.base.rst
deleted file mode 100644
index b87428f817..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.entity.base.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.entity.base module
-===============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.base
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.entity.component.rst b/docs/source/pype.vendor.ftrack_api_old.entity.component.rst
deleted file mode 100644
index 27901ab786..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.entity.component.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.entity.component module
-====================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.component
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.entity.factory.rst b/docs/source/pype.vendor.ftrack_api_old.entity.factory.rst
deleted file mode 100644
index caada5c3c8..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.entity.factory.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.entity.factory module
-==================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.factory
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.entity.job.rst b/docs/source/pype.vendor.ftrack_api_old.entity.job.rst
deleted file mode 100644
index 6f4ca18323..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.entity.job.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.entity.job module
-==============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.job
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.entity.location.rst b/docs/source/pype.vendor.ftrack_api_old.entity.location.rst
deleted file mode 100644
index 2f0b380349..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.entity.location.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.entity.location module
-===================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.location
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.entity.note.rst b/docs/source/pype.vendor.ftrack_api_old.entity.note.rst
deleted file mode 100644
index c04e959402..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.entity.note.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.entity.note module
-===============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.note
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.entity.project_schema.rst b/docs/source/pype.vendor.ftrack_api_old.entity.project_schema.rst
deleted file mode 100644
index 6332a2e523..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.entity.project_schema.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.entity.project\_schema module
-==========================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.project_schema
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.entity.rst b/docs/source/pype.vendor.ftrack_api_old.entity.rst
deleted file mode 100644
index bb43a7621b..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.entity.rst
+++ /dev/null
@@ -1,82 +0,0 @@
-pype.vendor.ftrack\_api\_old.entity package
-===========================================
-
-.. automodule:: pype.vendor.ftrack_api_old.entity
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.vendor.ftrack\_api\_old.entity.asset\_version module
----------------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.asset_version
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.entity.base module
------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.base
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.entity.component module
-----------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.component
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.entity.factory module
---------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.factory
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.entity.job module
-----------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.job
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.entity.location module
----------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.location
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.entity.note module
------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.note
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.entity.project\_schema module
-----------------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.project_schema
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.entity.user module
------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.user
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.entity.user.rst b/docs/source/pype.vendor.ftrack_api_old.entity.user.rst
deleted file mode 100644
index c0fe6574a6..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.entity.user.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.entity.user module
-===============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.entity.user
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.event.base.rst b/docs/source/pype.vendor.ftrack_api_old.event.base.rst
deleted file mode 100644
index 74b86e3bb2..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.event.base.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.event.base module
-==============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.event.base
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.event.expression.rst b/docs/source/pype.vendor.ftrack_api_old.event.expression.rst
deleted file mode 100644
index 860678797b..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.event.expression.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.event.expression module
-====================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.event.expression
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.event.hub.rst b/docs/source/pype.vendor.ftrack_api_old.event.hub.rst
deleted file mode 100644
index d09d52eedf..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.event.hub.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.event.hub module
-=============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.event.hub
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.event.rst b/docs/source/pype.vendor.ftrack_api_old.event.rst
deleted file mode 100644
index 2db27bf7f8..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.event.rst
+++ /dev/null
@@ -1,50 +0,0 @@
-pype.vendor.ftrack\_api\_old.event package
-==========================================
-
-.. automodule:: pype.vendor.ftrack_api_old.event
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.vendor.ftrack\_api\_old.event.base module
-----------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.event.base
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.event.expression module
-----------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.event.expression
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.event.hub module
----------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.event.hub
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.event.subscriber module
-----------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.event.subscriber
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.event.subscription module
-------------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.event.subscription
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.event.subscriber.rst b/docs/source/pype.vendor.ftrack_api_old.event.subscriber.rst
deleted file mode 100644
index a9bd13aabc..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.event.subscriber.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.event.subscriber module
-====================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.event.subscriber
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.event.subscription.rst b/docs/source/pype.vendor.ftrack_api_old.event.subscription.rst
deleted file mode 100644
index 423fa9a688..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.event.subscription.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.event.subscription module
-======================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.event.subscription
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.exception.rst b/docs/source/pype.vendor.ftrack_api_old.exception.rst
deleted file mode 100644
index 54dbeeac36..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.exception.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.exception module
-=============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.exception
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.formatter.rst b/docs/source/pype.vendor.ftrack_api_old.formatter.rst
deleted file mode 100644
index 75a23eefca..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.formatter.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.formatter module
-=============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.formatter
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.inspection.rst b/docs/source/pype.vendor.ftrack_api_old.inspection.rst
deleted file mode 100644
index 2b8849b3d0..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.inspection.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.inspection module
-==============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.inspection
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.logging.rst b/docs/source/pype.vendor.ftrack_api_old.logging.rst
deleted file mode 100644
index a10fa10c26..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.logging.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.logging module
-===========================================
-
-.. automodule:: pype.vendor.ftrack_api_old.logging
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.operation.rst b/docs/source/pype.vendor.ftrack_api_old.operation.rst
deleted file mode 100644
index a1d9d606f8..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.operation.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.operation module
-=============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.operation
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.plugin.rst b/docs/source/pype.vendor.ftrack_api_old.plugin.rst
deleted file mode 100644
index 0f26c705d2..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.plugin.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.plugin module
-==========================================
-
-.. automodule:: pype.vendor.ftrack_api_old.plugin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.query.rst b/docs/source/pype.vendor.ftrack_api_old.query.rst
deleted file mode 100644
index 5cf5aba0e4..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.query.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.query module
-=========================================
-
-.. automodule:: pype.vendor.ftrack_api_old.query
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.resource_identifier_transformer.base.rst b/docs/source/pype.vendor.ftrack_api_old.resource_identifier_transformer.base.rst
deleted file mode 100644
index dccf51ea71..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.resource_identifier_transformer.base.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.resource\_identifier\_transformer.base module
-==========================================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.resource_identifier_transformer.base
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.resource_identifier_transformer.rst b/docs/source/pype.vendor.ftrack_api_old.resource_identifier_transformer.rst
deleted file mode 100644
index 342ecd9321..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.resource_identifier_transformer.rst
+++ /dev/null
@@ -1,18 +0,0 @@
-pype.vendor.ftrack\_api\_old.resource\_identifier\_transformer package
-======================================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.resource_identifier_transformer
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.vendor.ftrack\_api\_old.resource\_identifier\_transformer.base module
---------------------------------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.resource_identifier_transformer.base
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.rst b/docs/source/pype.vendor.ftrack_api_old.rst
deleted file mode 100644
index 51d0a29357..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.rst
+++ /dev/null
@@ -1,126 +0,0 @@
-pype.vendor.ftrack\_api\_old package
-====================================
-
-.. automodule:: pype.vendor.ftrack_api_old
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.vendor.ftrack_api_old.accessor
- pype.vendor.ftrack_api_old.entity
- pype.vendor.ftrack_api_old.event
- pype.vendor.ftrack_api_old.resource_identifier_transformer
- pype.vendor.ftrack_api_old.structure
-
-Submodules
-----------
-
-pype.vendor.ftrack\_api\_old.attribute module
----------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.attribute
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.cache module
------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.cache
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.collection module
-----------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.collection
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.data module
-----------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.data
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.exception module
----------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.exception
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.formatter module
----------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.formatter
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.inspection module
-----------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.inspection
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.logging module
--------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.logging
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.operation module
----------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.operation
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.plugin module
-------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.plugin
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.query module
------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.query
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.session module
--------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.session
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.symbol module
-------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.symbol
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.session.rst b/docs/source/pype.vendor.ftrack_api_old.session.rst
deleted file mode 100644
index beecdeb6af..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.session.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.session module
-===========================================
-
-.. automodule:: pype.vendor.ftrack_api_old.session
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.structure.base.rst b/docs/source/pype.vendor.ftrack_api_old.structure.base.rst
deleted file mode 100644
index 617d8aaed7..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.structure.base.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.structure.base module
-==================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.structure.base
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.structure.entity_id.rst b/docs/source/pype.vendor.ftrack_api_old.structure.entity_id.rst
deleted file mode 100644
index ab6fd0997a..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.structure.entity_id.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.structure.entity\_id module
-========================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.structure.entity_id
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.structure.id.rst b/docs/source/pype.vendor.ftrack_api_old.structure.id.rst
deleted file mode 100644
index 6b887b7917..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.structure.id.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.structure.id module
-================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.structure.id
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.structure.origin.rst b/docs/source/pype.vendor.ftrack_api_old.structure.origin.rst
deleted file mode 100644
index 8ad5fbdc11..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.structure.origin.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.structure.origin module
-====================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.structure.origin
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.structure.rst b/docs/source/pype.vendor.ftrack_api_old.structure.rst
deleted file mode 100644
index 2402430589..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.structure.rst
+++ /dev/null
@@ -1,50 +0,0 @@
-pype.vendor.ftrack\_api\_old.structure package
-==============================================
-
-.. automodule:: pype.vendor.ftrack_api_old.structure
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.vendor.ftrack\_api\_old.structure.base module
---------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.structure.base
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.structure.entity\_id module
---------------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.structure.entity_id
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.structure.id module
-------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.structure.id
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.structure.origin module
-----------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.structure.origin
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.ftrack\_api\_old.structure.standard module
-------------------------------------------------------
-
-.. automodule:: pype.vendor.ftrack_api_old.structure.standard
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.structure.standard.rst b/docs/source/pype.vendor.ftrack_api_old.structure.standard.rst
deleted file mode 100644
index 800201084f..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.structure.standard.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.structure.standard module
-======================================================
-
-.. automodule:: pype.vendor.ftrack_api_old.structure.standard
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.ftrack_api_old.symbol.rst b/docs/source/pype.vendor.ftrack_api_old.symbol.rst
deleted file mode 100644
index bc358d374a..0000000000
--- a/docs/source/pype.vendor.ftrack_api_old.symbol.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.ftrack\_api\_old.symbol module
-==========================================
-
-.. automodule:: pype.vendor.ftrack_api_old.symbol
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.pysync.rst b/docs/source/pype.vendor.pysync.rst
deleted file mode 100644
index fbe5b33fb7..0000000000
--- a/docs/source/pype.vendor.pysync.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.vendor.pysync module
-=========================
-
-.. automodule:: pype.vendor.pysync
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.vendor.rst b/docs/source/pype.vendor.rst
deleted file mode 100644
index 23aa17f7ab..0000000000
--- a/docs/source/pype.vendor.rst
+++ /dev/null
@@ -1,37 +0,0 @@
-pype.vendor package
-===================
-
-.. automodule:: pype.vendor
- :members:
- :undoc-members:
- :show-inheritance:
-
-Subpackages
------------
-
-.. toctree::
- :maxdepth: 6
-
- pype.vendor.backports
- pype.vendor.builtins
- pype.vendor.capture_gui
- pype.vendor.ftrack_api_old
-
-Submodules
-----------
-
-pype.vendor.capture module
---------------------------
-
-.. automodule:: pype.vendor.capture
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.vendor.pysync module
--------------------------
-
-.. automodule:: pype.vendor.pysync
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.version.rst b/docs/source/pype.version.rst
deleted file mode 100644
index 7ec69dc423..0000000000
--- a/docs/source/pype.version.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.version module
-===================
-
-.. automodule:: pype.version
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.widgets.message_window.rst b/docs/source/pype.widgets.message_window.rst
deleted file mode 100644
index 60be203837..0000000000
--- a/docs/source/pype.widgets.message_window.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.widgets.message\_window module
-===================================
-
-.. automodule:: pype.widgets.message_window
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.widgets.popup.rst b/docs/source/pype.widgets.popup.rst
deleted file mode 100644
index 7186ff48de..0000000000
--- a/docs/source/pype.widgets.popup.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.widgets.popup module
-=========================
-
-.. automodule:: pype.widgets.popup
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.widgets.project_settings.rst b/docs/source/pype.widgets.project_settings.rst
deleted file mode 100644
index 9589cf5479..0000000000
--- a/docs/source/pype.widgets.project_settings.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-pype.widgets.project\_settings module
-=====================================
-
-.. automodule:: pype.widgets.project_settings
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/pype.widgets.rst b/docs/source/pype.widgets.rst
deleted file mode 100644
index 1f09318b67..0000000000
--- a/docs/source/pype.widgets.rst
+++ /dev/null
@@ -1,34 +0,0 @@
-pype.widgets package
-====================
-
-.. automodule:: pype.widgets
- :members:
- :undoc-members:
- :show-inheritance:
-
-Submodules
-----------
-
-pype.widgets.message\_window module
------------------------------------
-
-.. automodule:: pype.widgets.message_window
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.widgets.popup module
--------------------------
-
-.. automodule:: pype.widgets.popup
- :members:
- :undoc-members:
- :show-inheritance:
-
-pype.widgets.project\_settings module
--------------------------------------
-
-.. automodule:: pype.widgets.project_settings
- :members:
- :undoc-members:
- :show-inheritance:
diff --git a/docs/source/readme.rst b/docs/source/readme.rst
index 823c0df3c8..138b88bba8 100644
--- a/docs/source/readme.rst
+++ b/docs/source/readme.rst
@@ -1,2 +1,6 @@
-.. title:: Pype Readme
+===============
+OpenPype Readme
+===============
+
.. include:: ../../README.md
+ :parser: myst_parser.sphinx_
diff --git a/igniter/__init__.py b/igniter/__init__.py
index aa1b1d209e..16ffb940f6 100644
--- a/igniter/__init__.py
+++ b/igniter/__init__.py
@@ -19,21 +19,37 @@ if "OpenPypeVersion" not in sys.modules:
sys.modules["OpenPypeVersion"] = OpenPypeVersion
+def _get_qt_app():
+ from qtpy import QtWidgets, QtCore
+
+ app = QtWidgets.QApplication.instance()
+ if app is not None:
+ return app
+
+ for attr_name in (
+ "AA_EnableHighDpiScaling",
+ "AA_UseHighDpiPixmaps",
+ ):
+ attr = getattr(QtCore.Qt, attr_name, None)
+ if attr is not None:
+ QtWidgets.QApplication.setAttribute(attr)
+
+ if hasattr(QtWidgets.QApplication, "setHighDpiScaleFactorRoundingPolicy"):
+ QtWidgets.QApplication.setHighDpiScaleFactorRoundingPolicy(
+ QtCore.Qt.HighDpiScaleFactorRoundingPolicy.PassThrough
+ )
+
+ return QtWidgets.QApplication(sys.argv)
+
+
def open_dialog():
"""Show Igniter dialog."""
if os.getenv("OPENPYPE_HEADLESS_MODE"):
print("!!! Can't open dialog in headless mode. Exiting.")
sys.exit(1)
- from qtpy import QtWidgets, QtCore
from .install_dialog import InstallDialog
- scale_attr = getattr(QtCore.Qt, "AA_EnableHighDpiScaling", None)
- if scale_attr is not None:
- QtWidgets.QApplication.setAttribute(scale_attr)
-
- app = QtWidgets.QApplication.instance()
- if not app:
- app = QtWidgets.QApplication(sys.argv)
+ app = _get_qt_app()
d = InstallDialog()
d.open()
@@ -47,16 +63,10 @@ def open_update_window(openpype_version):
if os.getenv("OPENPYPE_HEADLESS_MODE"):
print("!!! Can't open dialog in headless mode. Exiting.")
sys.exit(1)
- from qtpy import QtWidgets, QtCore
+
from .update_window import UpdateWindow
- scale_attr = getattr(QtCore.Qt, "AA_EnableHighDpiScaling", None)
- if scale_attr is not None:
- QtWidgets.QApplication.setAttribute(scale_attr)
-
- app = QtWidgets.QApplication.instance()
- if not app:
- app = QtWidgets.QApplication(sys.argv)
+ app = _get_qt_app()
d = UpdateWindow(version=openpype_version)
d.open()
@@ -71,16 +81,10 @@ def show_message_dialog(title, message):
if os.getenv("OPENPYPE_HEADLESS_MODE"):
print("!!! Can't open dialog in headless mode. Exiting.")
sys.exit(1)
- from qtpy import QtWidgets, QtCore
+
from .message_dialog import MessageDialog
- scale_attr = getattr(QtCore.Qt, "AA_EnableHighDpiScaling", None)
- if scale_attr is not None:
- QtWidgets.QApplication.setAttribute(scale_attr)
-
- app = QtWidgets.QApplication.instance()
- if not app:
- app = QtWidgets.QApplication(sys.argv)
+ app = _get_qt_app()
dialog = MessageDialog(title, message)
dialog.open()
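The three dialog helpers above now share a single `_get_qt_app()` factory instead of three copies of the same boilerplate. A minimal, self-contained sketch of the same reuse-or-create pattern (the `demo()` dialog is hypothetical; assumes qtpy with a Qt 5-era binding such as PyQt5 or PySide2):

```python
import sys

from qtpy import QtWidgets


def demo():
    # Reuse a QApplication created by the host process (e.g. a DCC);
    # construct our own only when none exists yet.
    app = QtWidgets.QApplication.instance()
    owns_app = app is None
    if owns_app:
        # High-DPI attributes must be applied before this constructor
        # runs, which is why _get_qt_app() sets them first.
        app = QtWidgets.QApplication(sys.argv)

    dialog = QtWidgets.QMessageBox()
    dialog.setText("Igniter-style dialog")
    dialog.show()

    # Drive the event loop only when we own the application instance.
    if owns_app:
        app.exec_()


if __name__ == "__main__":
    demo()
```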
diff --git a/igniter/bootstrap_repos.py b/igniter/bootstrap_repos.py
index 6c7c834062..4cf00375bf 100644
--- a/igniter/bootstrap_repos.py
+++ b/igniter/bootstrap_repos.py
@@ -25,7 +25,8 @@ from .user_settings import (
from .tools import (
get_openpype_global_settings,
get_openpype_path_from_settings,
- get_expected_studio_version_str
+ get_expected_studio_version_str,
+ get_local_openpype_path_from_settings
)
@@ -61,6 +62,8 @@ class OpenPypeVersion(semver.VersionInfo):
"""
path = None
+
+ _local_openpype_path = None
# this should match any string complying with https://semver.org/
    _VERSION_REGEX = re.compile(r"(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>[a-zA-Z\d\-.]*))?(?:\+(?P<buildmetadata>[a-zA-Z\d\-.]*))?")  # noqa: E501
_installed_version = None
@@ -289,6 +292,23 @@ class OpenPypeVersion(semver.VersionInfo):
"""
return os.getenv("OPENPYPE_PATH")
+ @classmethod
+ def get_local_openpype_path(cls):
+ """Path to unzipped versions.
+
+        Defaults to the user data directory, but can be overridden by
+        global settings.
+ """
+ if cls._local_openpype_path:
+ return cls._local_openpype_path
+
+ settings = get_openpype_global_settings(os.environ["OPENPYPE_MONGO"])
+ data_dir = get_local_openpype_path_from_settings(settings)
+ if not data_dir:
+ data_dir = Path(user_data_dir("openpype", "pypeclub"))
+ cls._local_openpype_path = data_dir
+ return data_dir
+
@classmethod
def openpype_path_is_set(cls):
"""Path to OpenPype zip directory is set."""
@@ -319,9 +339,8 @@ class OpenPypeVersion(semver.VersionInfo):
list: of compatible versions available on the machine.
"""
- # DEPRECATED: backwards compatible way to look for versions in root
- dir_to_search = Path(user_data_dir("openpype", "pypeclub"))
- versions = OpenPypeVersion.get_versions_from_directory(dir_to_search)
+ dir_to_search = cls.get_local_openpype_path()
+ versions = cls.get_versions_from_directory(dir_to_search)
return list(sorted(set(versions)))
@@ -533,17 +552,15 @@ class BootstrapRepos:
"""
# vendor and app used to construct user data dir
- self._vendor = "pypeclub"
- self._app = "openpype"
+ self._message = message
self._log = log.getLogger(str(__class__))
- self.data_dir = Path(user_data_dir(self._app, self._vendor))
+ self.set_data_dir(None)
self.secure_registry = OpenPypeSecureRegistry("mongodb")
self.registry = OpenPypeSettingsRegistry()
self.zip_filter = [".pyc", "__pycache__"]
self.openpype_filter = [
"openpype", "schema", "LICENSE"
]
- self._message = message
# dummy progress reporter
def empty_progress(x: int):
@@ -554,6 +571,13 @@ class BootstrapRepos:
progress_callback = empty_progress
self._progress_callback = progress_callback
+ def set_data_dir(self, data_dir):
+ if not data_dir:
+ self.data_dir = Path(user_data_dir("openpype", "pypeclub"))
+ else:
+ self._print(f"overriding local folder: {data_dir}")
+ self.data_dir = data_dir
+
@staticmethod
def get_version_path_from_list(
version: str, version_list: list) -> Union[Path, None]:
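One subtle point in the `BootstrapRepos.__init__` hunk above: `self._message` is now assigned before `self.set_data_dir(None)` runs, because the override branch of `set_data_dir` reports progress. A condensed sketch of that ordering constraint (the `_print` helper body is an assumption; the real one emits through the message signal):

```python
class Bootstrap:
    def __init__(self, message=None):
        # '_message' must exist before set_data_dir() is called,
        # because the override branch below uses self._print().
        self._message = message
        self.set_data_dir(None)

    def set_data_dir(self, data_dir):
        if not data_dir:
            self.data_dir = "<user data dir>"  # appdirs fallback
        else:
            self._print(f"overriding local folder: {data_dir}")
            self.data_dir = data_dir

    def _print(self, text):
        # Assumed behaviour: forward to a Qt signal when one is
        # attached, otherwise fall back to stdout.
        if self._message:
            self._message.emit(text, False)
        else:
            print(text)
```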
diff --git a/igniter/install_thread.py b/igniter/install_thread.py
index 4723e6adfb..1d55213de7 100644
--- a/igniter/install_thread.py
+++ b/igniter/install_thread.py
@@ -14,7 +14,11 @@ from .bootstrap_repos import (
OpenPypeVersion
)
-from .tools import validate_mongo_connection
+from .tools import (
+ get_openpype_global_settings,
+ get_local_openpype_path_from_settings,
+ validate_mongo_connection
+)
class InstallThread(QtCore.QThread):
@@ -80,6 +84,15 @@ class InstallThread(QtCore.QThread):
return
os.environ["OPENPYPE_MONGO"] = self._mongo
+ if not validate_mongo_connection(self._mongo):
+ self.message.emit(f"Cannot connect to {self._mongo}", True)
+ self._set_result(-1)
+ return
+
+ global_settings = get_openpype_global_settings(self._mongo)
+ data_dir = get_local_openpype_path_from_settings(global_settings)
+ bs.set_data_dir(data_dir)
+
self.message.emit(
f"Detecting installed OpenPype versions in {bs.data_dir}",
False)
diff --git a/igniter/tools.py b/igniter/tools.py
index 79235b2329..9dea203f0c 100644
--- a/igniter/tools.py
+++ b/igniter/tools.py
@@ -40,7 +40,7 @@ def should_add_certificate_path_to_mongo_url(mongo_url):
add_certificate = False
# Check if url 'ssl' or 'tls' are set to 'true'
for key in ("ssl", "tls"):
- if key in query and "true" in query["ssl"]:
+ if key in query and "true" in query[key]:
add_certificate = True
break
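The one-character change above is a behavioural fix, not cleanup: the loop tested `key in query` but always indexed `query["ssl"]`, so a URL that enabled only `tls=true` raised `KeyError` instead of turning the certificate on. The corrected logic, reduced to a standalone check using only the standard library:

```python
from urllib.parse import parse_qs, urlparse


def needs_certificate(mongo_url):
    # 'ssl=true' or 'tls=true' in the query string means a CA bundle
    # should be passed to the Mongo client.
    query = parse_qs(urlparse(mongo_url).query)
    return any(
        key in query and "true" in query[key]
        for key in ("ssl", "tls")
    )


assert needs_certificate("mongodb://host/?tls=true")   # raised KeyError before
assert needs_certificate("mongodb://host/?ssl=true")
assert not needs_certificate("mongodb://host/?retryWrites=true")
```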
@@ -73,7 +73,7 @@ def validate_mongo_connection(cnx: str) -> (bool, str):
}
# Add certificate path if should be required
if should_add_certificate_path_to_mongo_url(cnx):
- kwargs["ssl_ca_certs"] = certifi.where()
+ kwargs["tlsCAFile"] = certifi.where()
try:
client = MongoClient(cnx, **kwargs)
@@ -147,7 +147,7 @@ def get_openpype_global_settings(url: str) -> dict:
"""
kwargs = {}
if should_add_certificate_path_to_mongo_url(url):
- kwargs["ssl_ca_certs"] = certifi.where()
+ kwargs["tlsCAFile"] = certifi.where()
try:
# Create mongo connection
@@ -188,6 +188,26 @@ def get_openpype_path_from_settings(settings: dict) -> Union[str, None]:
return next((path for path in paths if os.path.exists(path)), None)
+def get_local_openpype_path_from_settings(settings: dict) -> Union[Path, None]:
+    """Get OpenPype local path from global settings.
+
+    Used to download and unzip OpenPype versions.
+
+    Args:
+        settings (dict): Global settings document from the database.
+
+    Returns:
+        Union[Path, None]: Path to the local OpenPype directory, or None
+        if it is not set in the settings.
+ """
+ path = (
+ settings
+ .get("local_openpype_path", {})
+ .get(platform.system().lower())
+ )
+ if path:
+ return Path(path)
+ return None
+
+
def get_expected_studio_version_str(
staging=False, global_settings=None
) -> str:
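Two notes on this file: `kwargs["tlsCAFile"]` is the option name PyMongo 4 expects (the legacy `ssl_ca_certs` keyword was removed there), and the new `get_local_openpype_path_from_settings` reads a platform-keyed dictionary from global settings. An illustrative settings document showing the shape it expects (paths are examples only):

```python
import platform
from pathlib import Path

# Example shape of the global settings document; values are illustrative.
settings = {
    "local_openpype_path": {
        "windows": "C:/openpype/versions",
        "linux": "/opt/openpype/versions",
        "darwin": "/Users/Shared/openpype/versions",
    }
}

key = platform.system().lower()  # 'windows', 'linux' or 'darwin'
path = settings.get("local_openpype_path", {}).get(key)
print(Path(path) if path else None)
```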
diff --git a/igniter/update_thread.py b/igniter/update_thread.py
index e98c95f892..0223477d0a 100644
--- a/igniter/update_thread.py
+++ b/igniter/update_thread.py
@@ -48,6 +48,8 @@ class UpdateThread(QtCore.QThread):
"""
bs = BootstrapRepos(
progress_callback=self.set_progress, message=self.message)
+
+ bs.set_data_dir(OpenPypeVersion.get_local_openpype_path())
version_path = bs.install_version(self._openpype_version)
self._set_result(version_path)
diff --git a/openpype/__init__.py b/openpype/__init__.py
index 810664707a..e6b77b1853 100644
--- a/openpype/__init__.py
+++ b/openpype/__init__.py
@@ -3,3 +3,5 @@ import os
PACKAGE_DIR = os.path.dirname(os.path.abspath(__file__))
PLUGINS_DIR = os.path.join(PACKAGE_DIR, "plugins")
+
+AYON_SERVER_ENABLED = os.environ.get("USE_AYON_SERVER") == "1"
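`AYON_SERVER_ENABLED` is evaluated once at import time, so downstream modules can pick an implementation the moment they are imported, which is exactly what `openpype/client/entities.py` does below. A minimal, runnable sketch of that gating pattern (both implementations here are hypothetical stand-ins):

```python
import os

AYON_SERVER_ENABLED = os.environ.get("USE_AYON_SERVER") == "1"


def _mongo_get_projects():
    # Stand-in for the Mongo-backed query functions.
    return ["project-from-mongo"]


def _ayon_get_projects():
    # Stand-in for the AYON server-backed query functions.
    return ["project-from-ayon-server"]


# Pick the implementation once, at import time.
get_projects = _ayon_get_projects if AYON_SERVER_ENABLED else _mongo_get_projects

print(get_projects())
```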
diff --git a/openpype/cli.py b/openpype/cli.py
index 54af42920d..bc837cdeba 100644
--- a/openpype/cli.py
+++ b/openpype/cli.py
@@ -5,11 +5,24 @@ import sys
import code
import click
-# import sys
from .pype_commands import PypeCommands
-@click.group(invoke_without_command=True)
+class AliasedGroup(click.Group):
+ def __init__(self, *args, **kwargs):
+ super().__init__(*args, **kwargs)
+ self._aliases = {}
+
+ def set_alias(self, src_name, dst_name):
+ self._aliases[dst_name] = src_name
+
+ def get_command(self, ctx, cmd_name):
+ if cmd_name in self._aliases:
+ cmd_name = self._aliases[cmd_name]
+ return super().get_command(ctx, cmd_name)
+
+
+@click.group(cls=AliasedGroup, invoke_without_command=True)
@click.pass_context
@click.option("--use-version",
expose_value=False, help="use specified version")
@@ -58,16 +71,20 @@ def tray():
@PypeCommands.add_modules
-@main.group(help="Run command line arguments of OpenPype modules")
+@main.group(help="Run command line arguments of OpenPype addons")
@click.pass_context
def module(ctx):
- """Module specific commands created dynamically.
+    """Addon-specific commands created dynamically.
- These commands are generated dynamically by currently loaded addon/modules.
+ These commands are generated dynamically by currently loaded addons.
"""
pass
+# Add 'addon' as alias for module
+main.set_alias("module", "addon")
+
+
@main.command()
@click.option("--ftrack-url", envvar="FTRACK_SERVER",
help="Ftrack server url")
@@ -200,85 +217,6 @@ def remotepublish(project, path, user=None, targets=None):
PypeCommands.remotepublish(project, path, user, targets=targets)
-@main.command()
-@click.option("-p", "--project", required=True,
- help="name of project asset is under")
-@click.option("-a", "--asset", required=True,
- help="name of asset to which we want to copy textures")
-@click.option("--path", required=True,
- help="path where textures are found",
- type=click.Path(exists=True))
-def texturecopy(project, asset, path):
- """Copy specified textures to provided asset path.
-
- It validates if project and asset exists. Then it will use speedcopy to
- copy all textures found in all directories under --path to destination
- folder, determined by template texture in anatomy. I will use source
- filename and automatically rise version number on directory.
-
- Result will be copied without directory structure so it will be flat then.
- Nothing is written to database.
- """
-
- PypeCommands().texture_copy(project, asset, path)
-
-
-@main.command(context_settings={"ignore_unknown_options": True})
-@click.option("--app", help="Registered application name")
-@click.option("--project", help="Project name",
- default=lambda: os.environ.get('AVALON_PROJECT', ''))
-@click.option("--asset", help="Asset name",
- default=lambda: os.environ.get('AVALON_ASSET', ''))
-@click.option("--task", help="Task name",
- default=lambda: os.environ.get('AVALON_TASK', ''))
-@click.option("--tools", help="List of tools to add")
-@click.option("--user", help="Pype user name",
- default=lambda: os.environ.get('OPENPYPE_USERNAME', ''))
-@click.option("-fs",
- "--ftrack-server",
- help="Registered application name",
- default=lambda: os.environ.get('FTRACK_SERVER', ''))
-@click.option("-fu",
- "--ftrack-user",
- help="Registered application name",
- default=lambda: os.environ.get('FTRACK_API_USER', ''))
-@click.option("-fk",
- "--ftrack-key",
- help="Registered application name",
- default=lambda: os.environ.get('FTRACK_API_KEY', ''))
-@click.argument('arguments', nargs=-1)
-def launch(app, project, asset, task,
- ftrack_server, ftrack_user, ftrack_key, tools, arguments, user):
- """Launch registered application name in Pype context.
-
- You can define applications in pype-config toml files. Project, asset name
- and task name must be provided (even if they are not used by app itself).
- Optionally you can specify ftrack credentials if needed.
-
- ARGUMENTS are passed to launched application.
-
- """
- # TODO: this needs to switch for Settings
- if ftrack_server:
- os.environ["FTRACK_SERVER"] = ftrack_server
-
- if ftrack_server:
- os.environ["FTRACK_API_USER"] = ftrack_user
-
- if ftrack_server:
- os.environ["FTRACK_API_KEY"] = ftrack_key
-
- if user:
- os.environ["OPENPYPE_USERNAME"] = user
-
- # test required
- if not project or not asset or not task:
- print("!!! Missing required arguments")
- return
-
- PypeCommands().run_application(app, project, asset, task, tools, arguments)
-
-
@main.command(context_settings={"ignore_unknown_options": True})
def projectmanager():
PypeCommands().launch_project_manager()
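The `AliasedGroup` introduced above is generic; wrapped in a toy CLI it shows how `addon` resolves to the existing `module` command (the demo command is hypothetical):

```python
import click


class AliasedGroup(click.Group):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._aliases = {}

    def set_alias(self, src_name, dst_name):
        self._aliases[dst_name] = src_name

    def get_command(self, ctx, cmd_name):
        # Translate a registered alias to its target before normal lookup.
        return super().get_command(ctx, self._aliases.get(cmd_name, cmd_name))


@click.group(cls=AliasedGroup)
def main():
    pass


@main.command()
def module():
    click.echo("running module command")


# 'addon' now behaves exactly like 'module'.
main.set_alias("module", "addon")

if __name__ == "__main__":
    # Both `python cli.py module` and `python cli.py addon` print the
    # same output.
    main()
```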
diff --git a/openpype/client/entities.py b/openpype/client/entities.py
index adbdd7a47c..5d9654c611 100644
--- a/openpype/client/entities.py
+++ b/openpype/client/entities.py
@@ -1,1553 +1,6 @@
-"""Unclear if these will have public functions like these.
+from openpype import AYON_SERVER_ENABLED
-Goal is that most of functions here are called on (or with) an object
-that has project name as a context (e.g. on 'ProjectEntity'?).
-
-+ We will need more specific functions doing very specific queries really fast.
-"""
-
-import re
-import collections
-
-import six
-from bson.objectid import ObjectId
-
-from .mongo import get_project_database, get_project_connection
-
-PatternType = type(re.compile(""))
-
-
-def _prepare_fields(fields, required_fields=None):
- if not fields:
- return None
-
- output = {
- field: True
- for field in fields
- }
- if "_id" not in output:
- output["_id"] = True
-
- if required_fields:
- for key in required_fields:
- output[key] = True
- return output
-
-
-def convert_id(in_id):
- """Helper function for conversion of id from string to ObjectId.
-
- Args:
- in_id (Union[str, ObjectId, Any]): Entity id that should be converted
- to right type for queries.
-
- Returns:
- Union[ObjectId, Any]: Converted ids to ObjectId or in type.
- """
-
- if isinstance(in_id, six.string_types):
- return ObjectId(in_id)
- return in_id
-
-
-def convert_ids(in_ids):
- """Helper function for conversion of ids from string to ObjectId.
-
- Args:
- in_ids (Iterable[Union[str, ObjectId, Any]]): List of entity ids that
- should be converted to right type for queries.
-
- Returns:
- List[ObjectId]: Converted ids to ObjectId.
- """
-
- _output = set()
- for in_id in in_ids:
- if in_id is not None:
- _output.add(convert_id(in_id))
- return list(_output)
-
-
-def get_projects(active=True, inactive=False, fields=None):
- """Yield all project entity documents.
-
- Args:
- active (Optional[bool]): Include active projects. Defaults to True.
- inactive (Optional[bool]): Include inactive projects.
- Defaults to False.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Yields:
- dict: Project entity data which can be reduced to specified 'fields'.
- None is returned if project with specified filters was not found.
- """
- mongodb = get_project_database()
- for project_name in mongodb.collection_names():
- if project_name in ("system.indexes",):
- continue
- project_doc = get_project(
- project_name, active=active, inactive=inactive, fields=fields
- )
- if project_doc is not None:
- yield project_doc
-
-
-def get_project(project_name, active=True, inactive=True, fields=None):
- """Return project entity document by project name.
-
- Args:
- project_name (str): Name of project.
- active (Optional[bool]): Allow active project. Defaults to True.
- inactive (Optional[bool]): Allow inactive project. Defaults to True.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Project entity data which can be reduced to
- specified 'fields'. None is returned if project with specified
- filters was not found.
- """
- # Skip if both are disabled
- if not active and not inactive:
- return None
-
- query_filter = {"type": "project"}
- # Keep query untouched if both should be available
- if active and inactive:
- pass
-
- # Add filter to keep only active
- elif active:
- query_filter["$or"] = [
- {"data.active": {"$exists": False}},
- {"data.active": True},
- ]
-
- # Add filter to keep only inactive
- elif inactive:
- query_filter["$or"] = [
- {"data.active": {"$exists": False}},
- {"data.active": False},
- ]
-
- conn = get_project_connection(project_name)
- return conn.find_one(query_filter, _prepare_fields(fields))
-
-
-def get_whole_project(project_name):
- """Receive all documents from project.
-
- Helper that can be used to get all document from whole project. For example
- for backups etc.
-
- Returns:
- Cursor: Query cursor as iterable which returns all documents from
- project collection.
- """
-
- conn = get_project_connection(project_name)
- return conn.find({})
-
-
-def get_asset_by_id(project_name, asset_id, fields=None):
- """Receive asset data by its id.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- asset_id (Union[str, ObjectId]): Asset's id.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Asset entity data which can be reduced to
- specified 'fields'. None is returned if asset with specified
- filters was not found.
- """
-
- asset_id = convert_id(asset_id)
- if not asset_id:
- return None
-
- query_filter = {"type": "asset", "_id": asset_id}
- conn = get_project_connection(project_name)
- return conn.find_one(query_filter, _prepare_fields(fields))
-
-
-def get_asset_by_name(project_name, asset_name, fields=None):
- """Receive asset data by its name.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- asset_name (str): Asset's name.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Asset entity data which can be reduced to
- specified 'fields'. None is returned if asset with specified
- filters was not found.
- """
-
- if not asset_name:
- return None
-
- query_filter = {"type": "asset", "name": asset_name}
- conn = get_project_connection(project_name)
- return conn.find_one(query_filter, _prepare_fields(fields))
-
-
-# NOTE this could be just public function?
-# - any better variable name instead of 'standard'?
-# - same approach can be used for rest of types
-def _get_assets(
- project_name,
- asset_ids=None,
- asset_names=None,
- parent_ids=None,
- standard=True,
- archived=False,
- fields=None
-):
- """Assets for specified project by passed filters.
-
- Passed filters (ids and names) are always combined so all conditions must
- match.
-
- To receive all assets from project just keep filters empty.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- asset_ids (Iterable[Union[str, ObjectId]]): Asset ids that should
- be found.
- asset_names (Iterable[str]): Name assets that should be found.
- parent_ids (Iterable[Union[str, ObjectId]]): Parent asset ids.
- standard (bool): Query standard assets (type 'asset').
- archived (bool): Query archived assets (type 'archived_asset').
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Cursor: Query cursor as iterable which returns asset documents matching
- passed filters.
- """
-
- asset_types = []
- if standard:
- asset_types.append("asset")
- if archived:
- asset_types.append("archived_asset")
-
- if not asset_types:
- return []
-
- if len(asset_types) == 1:
- query_filter = {"type": asset_types[0]}
- else:
- query_filter = {"type": {"$in": asset_types}}
-
- if asset_ids is not None:
- asset_ids = convert_ids(asset_ids)
- if not asset_ids:
- return []
- query_filter["_id"] = {"$in": asset_ids}
-
- if asset_names is not None:
- if not asset_names:
- return []
- query_filter["name"] = {"$in": list(asset_names)}
-
- if parent_ids is not None:
- parent_ids = convert_ids(parent_ids)
- if not parent_ids:
- return []
- query_filter["data.visualParent"] = {"$in": parent_ids}
-
- conn = get_project_connection(project_name)
-
- return conn.find(query_filter, _prepare_fields(fields))
-
-
-def get_assets(
- project_name,
- asset_ids=None,
- asset_names=None,
- parent_ids=None,
- archived=False,
- fields=None
-):
- """Assets for specified project by passed filters.
-
- Passed filters (ids and names) are always combined so all conditions must
- match.
-
- To receive all assets from project just keep filters empty.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- asset_ids (Iterable[Union[str, ObjectId]]): Asset ids that should
- be found.
- asset_names (Iterable[str]): Name assets that should be found.
- parent_ids (Iterable[Union[str, ObjectId]]): Parent asset ids.
- archived (bool): Add also archived assets.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Cursor: Query cursor as iterable which returns asset documents matching
- passed filters.
- """
-
- return _get_assets(
- project_name,
- asset_ids,
- asset_names,
- parent_ids,
- True,
- archived,
- fields
- )
-
-
-def get_archived_assets(
- project_name,
- asset_ids=None,
- asset_names=None,
- parent_ids=None,
- fields=None
-):
- """Archived assets for specified project by passed filters.
-
- Passed filters (ids and names) are always combined so all conditions must
- match.
-
- To receive all archived assets from project just keep filters empty.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- asset_ids (Iterable[Union[str, ObjectId]]): Asset ids that should
- be found.
- asset_names (Iterable[str]): Name assets that should be found.
- parent_ids (Iterable[Union[str, ObjectId]]): Parent asset ids.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Cursor: Query cursor as iterable which returns asset documents matching
- passed filters.
- """
-
- return _get_assets(
- project_name, asset_ids, asset_names, parent_ids, False, True, fields
- )
-
-
-def get_asset_ids_with_subsets(project_name, asset_ids=None):
- """Find out which assets have existing subsets.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- asset_ids (Iterable[Union[str, ObjectId]]): Look only for entered
- asset ids.
-
- Returns:
- Iterable[ObjectId]: Asset ids that have existing subsets.
- """
-
- subset_query = {
- "type": "subset"
- }
- if asset_ids is not None:
- asset_ids = convert_ids(asset_ids)
- if not asset_ids:
- return []
- subset_query["parent"] = {"$in": asset_ids}
-
- conn = get_project_connection(project_name)
- result = conn.aggregate([
- {
- "$match": subset_query
- },
- {
- "$group": {
- "_id": "$parent",
- "count": {"$sum": 1}
- }
- }
- ])
- asset_ids_with_subsets = []
- for item in result:
- asset_id = item["_id"]
- count = item["count"]
- if count > 0:
- asset_ids_with_subsets.append(asset_id)
- return asset_ids_with_subsets
-
-
-def get_subset_by_id(project_name, subset_id, fields=None):
- """Single subset entity data by its id.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- subset_id (Union[str, ObjectId]): Id of subset which should be found.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Subset entity data which can be reduced to
- specified 'fields'. None is returned if subset with specified
- filters was not found.
- """
-
- subset_id = convert_id(subset_id)
- if not subset_id:
- return None
-
- query_filters = {"type": "subset", "_id": subset_id}
- conn = get_project_connection(project_name)
- return conn.find_one(query_filters, _prepare_fields(fields))
-
-
-def get_subset_by_name(project_name, subset_name, asset_id, fields=None):
- """Single subset entity data by its name and its version id.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- subset_name (str): Name of subset.
- asset_id (Union[str, ObjectId]): Id of parent asset.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Subset entity data which can be reduced to
- specified 'fields'. None is returned if subset with specified
- filters was not found.
- """
- if not subset_name:
- return None
-
- asset_id = convert_id(asset_id)
- if not asset_id:
- return None
-
- query_filters = {
- "type": "subset",
- "name": subset_name,
- "parent": asset_id
- }
- conn = get_project_connection(project_name)
- return conn.find_one(query_filters, _prepare_fields(fields))
-
-
-def get_subsets(
- project_name,
- subset_ids=None,
- subset_names=None,
- asset_ids=None,
- names_by_asset_ids=None,
- archived=False,
- fields=None
-):
- """Subset entities data from one project filtered by entered filters.
-
- Filters are additive (all conditions must pass to return subset).
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- subset_ids (Iterable[Union[str, ObjectId]]): Subset ids that should be
- queried. Filter ignored if 'None' is passed.
- subset_names (Iterable[str]): Subset names that should be queried.
- Filter ignored if 'None' is passed.
- asset_ids (Iterable[Union[str, ObjectId]]): Asset ids under which
- should look for the subsets. Filter ignored if 'None' is passed.
- names_by_asset_ids (dict[ObjectId, List[str]]): Complex filtering
- using asset ids and list of subset names under the asset.
- archived (bool): Look for archived subsets too.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Cursor: Iterable cursor yielding all matching subsets.
- """
-
- subset_types = ["subset"]
- if archived:
- subset_types.append("archived_subset")
-
- if len(subset_types) == 1:
- query_filter = {"type": subset_types[0]}
- else:
- query_filter = {"type": {"$in": subset_types}}
-
- if asset_ids is not None:
- asset_ids = convert_ids(asset_ids)
- if not asset_ids:
- return []
- query_filter["parent"] = {"$in": asset_ids}
-
- if subset_ids is not None:
- subset_ids = convert_ids(subset_ids)
- if not subset_ids:
- return []
- query_filter["_id"] = {"$in": subset_ids}
-
- if subset_names is not None:
- if not subset_names:
- return []
- query_filter["name"] = {"$in": list(subset_names)}
-
- if names_by_asset_ids is not None:
- or_query = []
- for asset_id, names in names_by_asset_ids.items():
- if asset_id and names:
- or_query.append({
- "parent": convert_id(asset_id),
- "name": {"$in": list(names)}
- })
- if not or_query:
- return []
- query_filter["$or"] = or_query
-
- conn = get_project_connection(project_name)
- return conn.find(query_filter, _prepare_fields(fields))
-
-
-def get_subset_families(project_name, subset_ids=None):
- """Set of main families of subsets.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- subset_ids (Iterable[Union[str, ObjectId]]): Subset ids that should
- be queried. All subsets from project are used if 'None' is passed.
-
- Returns:
- set[str]: Main families of matching subsets.
- """
-
- subset_filter = {
- "type": "subset"
- }
- if subset_ids is not None:
- if not subset_ids:
- return set()
- subset_filter["_id"] = {"$in": list(subset_ids)}
-
- conn = get_project_connection(project_name)
- result = list(conn.aggregate([
- {"$match": subset_filter},
- {"$project": {
- "family": {"$arrayElemAt": ["$data.families", 0]}
- }},
- {"$group": {
- "_id": "family_group",
- "families": {"$addToSet": "$family"}
- }}
- ]))
- if result:
- return set(result[0]["families"])
- return set()
-
-
-def get_version_by_id(project_name, version_id, fields=None):
- """Single version entity data by its id.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- version_id (Union[str, ObjectId]): Id of version which should be found.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Version entity data which can be reduced to
- specified 'fields'. None is returned if version with specified
- filters was not found.
- """
-
- version_id = convert_id(version_id)
- if not version_id:
- return None
-
- query_filter = {
- "type": {"$in": ["version", "hero_version"]},
- "_id": version_id
- }
- conn = get_project_connection(project_name)
- return conn.find_one(query_filter, _prepare_fields(fields))
-
-
-def get_version_by_name(project_name, version, subset_id, fields=None):
- """Single version entity data by its name and subset id.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- version (int): name of version entity (its version).
- subset_id (Union[str, ObjectId]): Id of version which should be found.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Version entity data which can be reduced to
- specified 'fields'. None is returned if version with specified
- filters was not found.
- """
-
- subset_id = convert_id(subset_id)
- if not subset_id:
- return None
-
- conn = get_project_connection(project_name)
- query_filter = {
- "type": "version",
- "parent": subset_id,
- "name": version
- }
- return conn.find_one(query_filter, _prepare_fields(fields))
-
-
-def version_is_latest(project_name, version_id):
- """Is version the latest from its subset.
-
- Note:
- Hero versions are considered as latest.
-
- Todo:
- Maybe raise exception when version was not found?
-
- Args:
- project_name (str):Name of project where to look for queried entities.
- version_id (Union[str, ObjectId]): Version id which is checked.
-
- Returns:
- bool: True if is latest version from subset else False.
- """
-
- version_id = convert_id(version_id)
- if not version_id:
- return False
- version_doc = get_version_by_id(
- project_name, version_id, fields=["_id", "type", "parent"]
- )
- # What to do when version is not found?
- if not version_doc:
- return False
-
- if version_doc["type"] == "hero_version":
- return True
-
- last_version = get_last_version_by_subset_id(
- project_name, version_doc["parent"], fields=["_id"]
- )
- return last_version["_id"] == version_id
-
-
-def _get_versions(
- project_name,
- subset_ids=None,
- version_ids=None,
- versions=None,
- standard=True,
- hero=False,
- fields=None
-):
- version_types = []
- if standard:
- version_types.append("version")
-
- if hero:
- version_types.append("hero_version")
-
- if not version_types:
- return []
- elif len(version_types) == 1:
- query_filter = {"type": version_types[0]}
- else:
- query_filter = {"type": {"$in": version_types}}
-
- if subset_ids is not None:
- subset_ids = convert_ids(subset_ids)
- if not subset_ids:
- return []
- query_filter["parent"] = {"$in": subset_ids}
-
- if version_ids is not None:
- version_ids = convert_ids(version_ids)
- if not version_ids:
- return []
- query_filter["_id"] = {"$in": version_ids}
-
- if versions is not None:
- versions = list(versions)
- if not versions:
- return []
-
- if len(versions) == 1:
- query_filter["name"] = versions[0]
- else:
- query_filter["name"] = {"$in": versions}
-
- conn = get_project_connection(project_name)
-
- return conn.find(query_filter, _prepare_fields(fields))
-
-
-def get_versions(
- project_name,
- version_ids=None,
- subset_ids=None,
- versions=None,
- hero=False,
- fields=None
-):
- """Version entities data from one project filtered by entered filters.
-
- Filters are additive (all conditions must pass to return subset).
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- version_ids (Iterable[Union[str, ObjectId]]): Version ids that will
- be queried. Filter ignored if 'None' is passed.
- subset_ids (Iterable[str]): Subset ids that will be queried.
- Filter ignored if 'None' is passed.
- versions (Iterable[int]): Version names (as integers).
- Filter ignored if 'None' is passed.
- hero (bool): Look also for hero versions.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Cursor: Iterable cursor yielding all matching versions.
- """
-
- return _get_versions(
- project_name,
- subset_ids,
- version_ids,
- versions,
- standard=True,
- hero=hero,
- fields=fields
- )
-
-
-def get_hero_version_by_subset_id(project_name, subset_id, fields=None):
- """Hero version by subset id.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- subset_id (Union[str, ObjectId]): Subset id under which
- is hero version.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Hero version entity data which can be reduced to
- specified 'fields'. None is returned if hero version with specified
- filters was not found.
- """
-
- subset_id = convert_id(subset_id)
- if not subset_id:
- return None
-
- versions = list(_get_versions(
- project_name,
- subset_ids=[subset_id],
- standard=False,
- hero=True,
- fields=fields
- ))
- if versions:
- return versions[0]
- return None
-
-
-def get_hero_version_by_id(project_name, version_id, fields=None):
- """Hero version by its id.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- version_id (Union[str, ObjectId]): Hero version id.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Hero version entity data which can be reduced to
- specified 'fields'. None is returned if hero version with specified
- filters was not found.
- """
-
- version_id = convert_id(version_id)
- if not version_id:
- return None
-
- versions = list(_get_versions(
- project_name,
- version_ids=[version_id],
- standard=False,
- hero=True,
- fields=fields
- ))
- if versions:
- return versions[0]
- return None
-
-
-def get_hero_versions(
- project_name,
- subset_ids=None,
- version_ids=None,
- fields=None
-):
- """Hero version entities data from one project filtered by entered filters.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- subset_ids (Iterable[Union[str, ObjectId]]): Subset ids for which
- should look for hero versions. Filter ignored if 'None' is passed.
- version_ids (Iterable[Union[str, ObjectId]]): Hero version ids. Filter
- ignored if 'None' is passed.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Cursor|list: Iterable yielding hero versions matching passed filters.
- """
-
- return _get_versions(
- project_name,
- subset_ids,
- version_ids,
- standard=False,
- hero=True,
- fields=fields
- )
-
-
-def get_output_link_versions(project_name, version_id, fields=None):
- """Versions where passed version was used as input.
-
- Question:
- Not 100% sure about the usage of the function, so the name and
- docstring may not match what it does.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- version_id (Union[str, ObjectId]): Version id which can be used
- as input link for other versions.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Iterable: Iterable cursor yielding versions that are used as input
- links for passed version.
- """
-
- version_id = convert_id(version_id)
- if not version_id:
- return []
-
- conn = get_project_connection(project_name)
- # Does make sense to look for hero versions?
- query_filter = {
- "type": "version",
- "data.inputLinks.id": version_id
- }
- return conn.find(query_filter, _prepare_fields(fields))
-
-
-def get_last_versions(project_name, subset_ids, active=None, fields=None):
- """Latest versions for entered subset_ids.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- subset_ids (Iterable[Union[str, ObjectId]]): List of subset ids.
- active (Optional[bool]): If True only active, if False only inactive
- versions are returned. Filter ignored if 'None' is passed.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- dict[ObjectId, Dict]: Key is subset id and value is last version entity data.
- """
-
- subset_ids = convert_ids(subset_ids)
- if not subset_ids:
- return {}
-
- if fields is not None:
- fields = list(fields)
- if not fields:
- return {}
-
- # Avoid double query if only name and _id are requested
- name_needed = False
- limit_query = False
- if fields:
- fields_s = set(fields)
- if "name" in fields_s:
- name_needed = True
- fields_s.remove("name")
-
- for field in ("_id", "parent"):
- if field in fields_s:
- fields_s.remove(field)
- limit_query = len(fields_s) == 0
-
- group_item = {
- "_id": "$parent",
- "_version_id": {"$last": "$_id"}
- }
- # Add name if name is needed (only for limit query)
- if name_needed:
- group_item["name"] = {"$last": "$name"}
-
- aggregate_filter = {
- "type": "version",
- "parent": {"$in": subset_ids}
- }
- if active is False:
- aggregate_filter["data.active"] = active
- elif active is True:
- aggregate_filter["$or"] = [
- {"data.active": {"$exists": 0}},
- {"data.active": active},
- ]
-
- aggregation_pipeline = [
- # Find all versions of those subsets
- {"$match": aggregate_filter},
- # Sorting versions all together
- {"$sort": {"name": 1}},
- # Group them by "parent", but only take the last
- {"$group": group_item}
- ]
-
- conn = get_project_connection(project_name)
- aggregate_result = conn.aggregate(aggregation_pipeline)
- if limit_query:
- output = {}
- for item in aggregate_result:
- subset_id = item["_id"]
- item_data = {"_id": item["_version_id"], "parent": subset_id}
- if name_needed:
- item_data["name"] = item["name"]
- output[subset_id] = item_data
- return output
-
- version_ids = [
- doc["_version_id"]
- for doc in aggregate_result
- ]
-
- fields = _prepare_fields(fields, ["parent"])
-
- version_docs = get_versions(
- project_name, version_ids=version_ids, fields=fields
- )
-
- return {
- version_doc["parent"]: version_doc
- for version_doc in version_docs
- }
-
-
-def get_last_version_by_subset_id(project_name, subset_id, fields=None):
- """Last version for passed subset id.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- subset_id (Union[str, ObjectId]): Id of subset for which the last version should be found.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Version entity data which can be reduced to
- specified 'fields'. None is returned if version with specified
- filters was not found.
- """
-
- subset_id = convert_id(subset_id)
- if not subset_id:
- return None
-
- last_versions = get_last_versions(
- project_name, subset_ids=[subset_id], fields=fields
- )
- return last_versions.get(subset_id)
-
-
-def get_last_version_by_subset_name(
- project_name, subset_name, asset_id=None, asset_name=None, fields=None
-):
- """Last version for passed subset name under asset id/name.
-
- It is required to pass 'asset_id' or 'asset_name'. Asset id is recommended
- if available.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- subset_name (str): Name of subset.
- asset_id (Union[str, ObjectId]): Asset id which is parent of passed
- subset name.
- asset_name (str): Asset name which is parent of passed subset name.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Version entity data which can be reduced to
- specified 'fields'. None is returned if version with specified
- filters was not found.
- """
-
- if not asset_id and not asset_name:
- return None
-
- if not asset_id:
- asset_doc = get_asset_by_name(project_name, asset_name, fields=["_id"])
- if not asset_doc:
- return None
- asset_id = asset_doc["_id"]
- subset_doc = get_subset_by_name(
- project_name, subset_name, asset_id, fields=["_id"]
- )
- if not subset_doc:
- return None
- return get_last_version_by_subset_id(
- project_name, subset_doc["_id"], fields=fields
- )
-
-
-def get_representation_by_id(project_name, representation_id, fields=None):
- """Representation entity data by its id.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- representation_id (Union[str, ObjectId]): Representation id.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Representation entity data which can be reduced to
- specified 'fields'. None is returned if representation with
- specified filters was not found.
- """
-
- if not representation_id:
- return None
-
- repre_types = ["representation", "archived_representation"]
- query_filter = {
- "type": {"$in": repre_types}
- }
- if representation_id is not None:
- query_filter["_id"] = convert_id(representation_id)
-
- conn = get_project_connection(project_name)
-
- return conn.find_one(query_filter, _prepare_fields(fields))
-
-
-def get_representation_by_name(
- project_name, representation_name, version_id, fields=None
-):
- """Representation entity data by its name and its version id.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- representation_name (str): Representation name.
- version_id (Union[str, ObjectId]): Id of parent version entity.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[dict[str, Any], None]: Representation entity data which can be
- reduced to specified 'fields'. None is returned if representation
- with specified filters was not found.
- """
-
- version_id = convert_id(version_id)
- if not version_id or not representation_name:
- return None
- repre_types = ["representation", "archived_representations"]
- query_filter = {
- "type": {"$in": repre_types},
- "name": representation_name,
- "parent": version_id
- }
-
- conn = get_project_connection(project_name)
- return conn.find_one(query_filter, _prepare_fields(fields))
-
-
-def _flatten_dict(data):
- flatten_queue = collections.deque()
- flatten_queue.append(data)
- output = {}
- while flatten_queue:
- item = flatten_queue.popleft()
- for key, value in item.items():
- if not isinstance(value, dict):
- output[key] = value
- continue
-
- tmp = {}
- for subkey, subvalue in value.items():
- new_key = "{}.{}".format(key, subkey)
- tmp[new_key] = subvalue
- flatten_queue.append(tmp)
- return output
-
-
-def _regex_filters(filters):
- output = []
- for key, value in filters.items():
- regexes = []
- a_values = []
- if isinstance(value, PatternType):
- regexes.append(value)
- elif isinstance(value, (list, tuple, set)):
- for item in value:
- if isinstance(item, PatternType):
- regexes.append(item)
- else:
- a_values.append(item)
- else:
- a_values.append(value)
-
- key_filters = []
- if len(a_values) == 1:
- key_filters.append({key: a_values[0]})
- elif a_values:
- key_filters.append({key: {"$in": a_values}})
-
- for regex in regexes:
- key_filters.append({key: {"$regex": regex}})
-
- if len(key_filters) == 1:
- output.append(key_filters[0])
- else:
- output.append({"$or": key_filters})
-
- return output
-
-
-def _get_representations(
- project_name,
- representation_ids,
- representation_names,
- version_ids,
- context_filters,
- names_by_version_ids,
- standard,
- archived,
- fields
-):
- default_output = []
- repre_types = []
- if standard:
- repre_types.append("representation")
- if archived:
- repre_types.append("archived_representation")
-
- if not repre_types:
- return default_output
-
- if len(repre_types) == 1:
- query_filter = {"type": repre_types[0]}
- else:
- query_filter = {"type": {"$in": repre_types}}
-
- if representation_ids is not None:
- representation_ids = convert_ids(representation_ids)
- if not representation_ids:
- return default_output
- query_filter["_id"] = {"$in": representation_ids}
-
- if representation_names is not None:
- if not representation_names:
- return default_output
- query_filter["name"] = {"$in": list(representation_names)}
-
- if version_ids is not None:
- version_ids = convert_ids(version_ids)
- if not version_ids:
- return default_output
- query_filter["parent"] = {"$in": version_ids}
-
- or_queries = []
- if names_by_version_ids is not None:
- or_query = []
- for version_id, names in names_by_version_ids.items():
- if version_id and names:
- or_query.append({
- "parent": convert_id(version_id),
- "name": {"$in": list(names)}
- })
- if not or_query:
- return default_output
- or_queries.append(or_query)
-
- if context_filters is not None:
- if not context_filters:
- return []
- _flatten_filters = _flatten_dict(context_filters)
- flatten_filters = {}
- for key, value in _flatten_filters.items():
- if not key.startswith("context"):
- key = "context.{}".format(key)
- flatten_filters[key] = value
-
- for item in _regex_filters(flatten_filters):
- for key, value in item.items():
- if key != "$or":
- query_filter[key] = value
-
- elif value:
- or_queries.append(value)
-
- if len(or_queries) == 1:
- query_filter["$or"] = or_queries[0]
- elif or_queries:
- and_query = []
- for or_query in or_queries:
- if isinstance(or_query, list):
- or_query = {"$or": or_query}
- and_query.append(or_query)
- query_filter["$and"] = and_query
-
- conn = get_project_connection(project_name)
-
- return conn.find(query_filter, _prepare_fields(fields))
-
-
-def get_representations(
- project_name,
- representation_ids=None,
- representation_names=None,
- version_ids=None,
- context_filters=None,
- names_by_version_ids=None,
- archived=False,
- standard=True,
- fields=None
-):
- """Representation entities data from one project filtered by filters.
-
- Filters are additive (all conditions must pass to return a representation).
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- representation_ids (Iterable[Union[str, ObjectId]]): Representation ids
- used as filter. Filter ignored if 'None' is passed.
- representation_names (Iterable[str]): Representations names used
- as filter. Filter ignored if 'None' is passed.
- version_ids (Iterable[str]): Subset ids used as parent filter. Filter
- ignored if 'None' is passed.
- context_filters (Dict[str, List[str, PatternType]]): Filter by
- representation context fields.
- names_by_version_ids (dict[ObjectId, list[str]]): Complex filtering
- using version ids and list of names under the version.
- archived (bool): Output will also contain archived representations.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Cursor: Iterable cursor yielding all matching representations.
- """
-
- return _get_representations(
- project_name=project_name,
- representation_ids=representation_ids,
- representation_names=representation_names,
- version_ids=version_ids,
- context_filters=context_filters,
- names_by_version_ids=names_by_version_ids,
- standard=standard,
- archived=archived,
- fields=fields
- )
-
-
-def get_archived_representations(
- project_name,
- representation_ids=None,
- representation_names=None,
- version_ids=None,
- context_filters=None,
- names_by_version_ids=None,
- fields=None
-):
- """Archived representation entities data from project with applied filters.
-
- Filters are additive (all conditions must pass to return a representation).
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- representation_ids (Iterable[Union[str, ObjectId]]): Representation ids
- used as filter. Filter ignored if 'None' is passed.
- representation_names (Iterable[str]): Representations names used
- as filter. Filter ignored if 'None' is passed.
- version_ids (Iterable[str]): Subset ids used as parent filter. Filter
- ignored if 'None' is passed.
- context_filters (Dict[str, List[str, PatternType]]): Filter by
- representation context fields.
- names_by_version_ids (dict[ObjectId, List[str]]): Complex filtering
- using version ids and list of names under the version.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Cursor: Iterable cursor yielding all matching representations.
- """
-
- return _get_representations(
- project_name=project_name,
- representation_ids=representation_ids,
- representation_names=representation_names,
- version_ids=version_ids,
- context_filters=context_filters,
- names_by_version_ids=names_by_version_ids,
- standard=False,
- archived=True,
- fields=fields
- )
-
-
-def get_representations_parents(project_name, representations):
- """Prepare parents of representation entities.
-
- Each item of returned dictionary contains version, subset, asset
- and project in that order.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- representations (List[dict]): Representation entities with at least
- '_id' and 'parent' keys.
-
- Returns:
- dict[ObjectId, tuple]: Parents by representation id.
- """
-
- repre_docs_by_version_id = collections.defaultdict(list)
- version_docs_by_version_id = {}
- version_docs_by_subset_id = collections.defaultdict(list)
- subset_docs_by_subset_id = {}
- subset_docs_by_asset_id = collections.defaultdict(list)
- output = {}
- for repre_doc in representations:
- repre_id = repre_doc["_id"]
- version_id = repre_doc["parent"]
- output[repre_id] = (None, None, None, None)
- repre_docs_by_version_id[version_id].append(repre_doc)
-
- version_docs = get_versions(
- project_name,
- version_ids=repre_docs_by_version_id.keys(),
- hero=True
- )
- for version_doc in version_docs:
- version_id = version_doc["_id"]
- subset_id = version_doc["parent"]
- version_docs_by_version_id[version_id] = version_doc
- version_docs_by_subset_id[subset_id].append(version_doc)
-
- subset_docs = get_subsets(
- project_name, subset_ids=version_docs_by_subset_id.keys()
- )
- for subset_doc in subset_docs:
- subset_id = subset_doc["_id"]
- asset_id = subset_doc["parent"]
- subset_docs_by_subset_id[subset_id] = subset_doc
- subset_docs_by_asset_id[asset_id].append(subset_doc)
-
- asset_docs = get_assets(
- project_name, asset_ids=subset_docs_by_asset_id.keys()
- )
- asset_docs_by_id = {
- asset_doc["_id"]: asset_doc
- for asset_doc in asset_docs
- }
-
- project_doc = get_project(project_name)
-
- for version_id, repre_docs in repre_docs_by_version_id.items():
- asset_doc = None
- subset_doc = None
- version_doc = version_docs_by_version_id.get(version_id)
- if version_doc:
- subset_id = version_doc["parent"]
- subset_doc = subset_docs_by_subset_id.get(subset_id)
- if subset_doc:
- asset_id = subset_doc["parent"]
- asset_doc = asset_docs_by_id.get(asset_id)
-
- for repre_doc in repre_docs:
- repre_id = repre_doc["_id"]
- output[repre_id] = (
- version_doc, subset_doc, asset_doc, project_doc
- )
- return output
-
-
-def get_representation_parents(project_name, representation):
- """Prepare parents of representation entity.
-
- Each item of returned dictionary contains version, subset, asset
- and project in that order.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- representation (dict): Representation entities with at least
- '_id' and 'parent' keys.
-
- Returns:
- dict[ObjectId, tuple]: Parents by representation id.
- """
-
- if not representation:
- return None
-
- repre_id = representation["_id"]
- parents_by_repre_id = get_representations_parents(
- project_name, [representation]
- )
- return parents_by_repre_id[repre_id]
-
-
-def get_thumbnail_id_from_source(project_name, src_type, src_id):
- """Receive thumbnail id from source entity.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- src_type (str): Type of source entity ('asset', 'version').
- src_id (Union[str, ObjectId]): Id of source entity.
-
- Returns:
- Union[ObjectId, None]: Thumbnail id assigned to the entity, or None
- if the source entity does not have any thumbnail id assigned.
- """
-
- if not src_type or not src_id:
- return None
-
- query_filter = {"_id": convert_id(src_id)}
-
- conn = get_project_connection(project_name)
- src_doc = conn.find_one(query_filter, {"data.thumbnail_id"})
- if src_doc:
- return src_doc.get("data", {}).get("thumbnail_id")
- return None
-
-
-def get_thumbnails(project_name, thumbnail_ids, fields=None):
- """Receive thumbnails entity data.
-
- Thumbnail entity can be used to receive binary content of thumbnail based
- on its content and ThumbnailResolvers.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- thumbnail_ids (Iterable[Union[str, ObjectId]]): Ids of thumbnail
- entities.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- cursor: Cursor of queried documents.
- """
-
- if thumbnail_ids:
- thumbnail_ids = convert_ids(thumbnail_ids)
-
- if not thumbnail_ids:
- return []
- query_filter = {
- "type": "thumbnail",
- "_id": {"$in": thumbnail_ids}
- }
- conn = get_project_connection(project_name)
- return conn.find(query_filter, _prepare_fields(fields))
-
-
-def get_thumbnail(project_name, thumbnail_id, fields=None):
- """Receive thumbnail entity data.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- thumbnail_id (Union[str, ObjectId]): Id of thumbnail entity.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Thumbnail entity data which can be reduced to
- specified 'fields'. None is returned if thumbnail with specified
- filters was not found.
- """
-
- if not thumbnail_id:
- return None
- query_filter = {"type": "thumbnail", "_id": convert_id(thumbnail_id)}
- conn = get_project_connection(project_name)
- return conn.find_one(query_filter, _prepare_fields(fields))
-
-
-def get_workfile_info(
- project_name, asset_id, task_name, filename, fields=None
-):
- """Document with workfile information.
-
- Warning:
- Query is based on filename and context, which does not mean it will
- always find the right and expected result. The information has limited
- usage and is not recommended as a source of information about workfiles.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- asset_id (Union[str, ObjectId]): Id of asset entity.
- task_name (str): Task name on asset.
- filename (str): Name of workfile.
- fields (Optional[Iterable[str]]): Fields that should be returned. All
- fields are returned if 'None' is passed.
-
- Returns:
- Union[Dict, None]: Workfile entity data which can be reduced to
- specified 'fields'. None is returned if workfile with specified
- filters was not found.
- """
-
- if not asset_id or not task_name or not filename:
- return None
-
- query_filter = {
- "type": "workfile",
- "parent": convert_id(asset_id),
- "task_name": task_name,
- "filename": filename
- }
- conn = get_project_connection(project_name)
- return conn.find_one(query_filter, _prepare_fields(fields))
-
-
-"""
-## Custom data storage:
-- Settings - OP settings overrides and local settings
-- Logging - logs from Logger
-- Webpublisher - jobs
-- Ftrack - events
-- Maya - Shaders
- - openpype/hosts/maya/api/shader_definition_editor.py
- - openpype/hosts/maya/plugins/publish/validate_model_name.py
-
-## Global publish plugins
-- openpype/plugins/publish/extract_hierarchy_avalon.py
- Create:
- - asset
- Update:
- - asset
-
-## Lib
-- openpype/lib/avalon_context.py
- Update:
- - workfile data
-- openpype/lib/project_backpack.py
- Update:
- - project
-"""
+if not AYON_SERVER_ENABLED:
+ from .mongo.entities import *
+else:
+ from .server.entities import *
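With this dispatch, existing call sites keep their import paths and the backend (Mongo or AYON server) is chosen once at import time. A minimal sketch with a hypothetical project name and id, assuming `get_asset_by_id` is exported by both backends:

```python
# Backend is selected by AYON_SERVER_ENABLED when openpype.client.entities
# is first imported; callers are unaffected by the split.
from openpype.client.entities import get_asset_by_id

asset_doc = get_asset_by_id("demo_project", "507f1f77bcf86cd799439011")
```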
diff --git a/openpype/client/entity_links.py b/openpype/client/entity_links.py
index b74b4ce7f6..e18970de90 100644
--- a/openpype/client/entity_links.py
+++ b/openpype/client/entity_links.py
@@ -1,243 +1,6 @@
-from .mongo import get_project_connection
-from .entities import (
- get_assets,
- get_asset_by_id,
- get_version_by_id,
- get_representation_by_id,
- convert_id,
-)
+from openpype import AYON_SERVER_ENABLED
-
-def get_linked_asset_ids(project_name, asset_doc=None, asset_id=None):
- """Extract linked asset ids from asset document.
-
- One of asset document or asset id must be passed.
-
- Note:
- Asset links now work only from asset to assets.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- asset_doc (dict): Asset document from DB.
- asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
- asset document.
-
- Returns:
- List[Union[ObjectId, str]]: Asset ids of input links.
- """
-
- output = []
- if not asset_doc and not asset_id:
- return output
-
- if not asset_doc:
- asset_doc = get_asset_by_id(
- project_name, asset_id, fields=["data.inputLinks"]
- )
-
- input_links = asset_doc["data"].get("inputLinks")
- if not input_links:
- return output
-
- for item in input_links:
- # Backwards compatibility for "_id" key which was replaced with
- # "id"
- if "_id" in item:
- link_id = item["_id"]
- else:
- link_id = item["id"]
- output.append(link_id)
- return output
-
-
-def get_linked_assets(
- project_name, asset_doc=None, asset_id=None, fields=None
-):
- """Return linked assets based on passed asset document.
-
- One of asset document or asset id must be passed.
-
- Args:
- project_name (str): Name of project where to look for queried entities.
- asset_doc (Dict[str, Any]): Asset document from database.
- asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
- asset document.
- fields (Iterable[str]): Fields that should be returned. All fields are
- returned if 'None' is passed.
-
- Returns:
- List[Dict[str, Any]]: Asset documents of input links for passed
- asset doc.
- """
-
- if not asset_doc:
- if not asset_id:
- return []
- asset_doc = get_asset_by_id(
- project_name,
- asset_id,
- fields=["data.inputLinks"]
- )
- if not asset_doc:
- return []
-
- link_ids = get_linked_asset_ids(project_name, asset_doc=asset_doc)
- if not link_ids:
- return []
-
- return list(get_assets(project_name, asset_ids=link_ids, fields=fields))
-
-
-def get_linked_representation_id(
- project_name, repre_doc=None, repre_id=None, link_type=None, max_depth=None
-):
- """Returns list of linked ids of particular type (if provided).
-
- One of representation document or representation id must be passed.
- Note:
- Representation links now work only from representation through version
- back to representations.
-
- Args:
- project_name (str): Name of project where to look for links.
- repre_doc (Dict[str, Any]): Representation document.
- repre_id (Union[ObjectId, str]): Representation id.
- link_type (str): Type of link (e.g. 'reference', ...).
- max_depth (int): Limit recursion level. Default: 0
-
- Returns:
- List[ObjectId]: Linked representation ids.
- """
-
- if repre_doc:
- repre_id = repre_doc["_id"]
-
- if repre_id:
- repre_id = convert_id(repre_id)
-
- if not repre_id and not repre_doc:
- return []
-
- version_id = None
- if repre_doc:
- version_id = repre_doc.get("parent")
-
- if not version_id:
- repre_doc = get_representation_by_id(
- project_name, repre_id, fields=["parent"]
- )
- version_id = repre_doc["parent"]
-
- if not version_id:
- return []
-
- version_doc = get_version_by_id(
- project_name, version_id, fields=["type", "version_id"]
- )
- if version_doc["type"] == "hero_version":
- version_id = version_doc["version_id"]
-
- if max_depth is None:
- max_depth = 0
-
- match = {
- "_id": version_id,
- # Links are not stored to hero versions at this moment so filter
- # is limited to just versions
- "type": "version"
- }
-
- graph_lookup = {
- "from": project_name,
- "startWith": "$data.inputLinks.id",
- "connectFromField": "data.inputLinks.id",
- "connectToField": "_id",
- "as": "outputs_recursive",
- "depthField": "depth"
- }
- if max_depth != 0:
- # We offset by -1 since 0 basically means no recursion
- # but the recursion only happens after the initial lookup
- # for outputs.
- graph_lookup["maxDepth"] = max_depth - 1
-
- query_pipeline = [
- # Match
- {"$match": match},
- # Recursive graph lookup for inputs
- {"$graphLookup": graph_lookup}
- ]
- conn = get_project_connection(project_name)
- result = conn.aggregate(query_pipeline)
- referenced_version_ids = _process_referenced_pipeline_result(
- result, link_type
- )
- if not referenced_version_ids:
- return []
-
- ref_ids = conn.distinct(
- "_id",
- filter={
- "parent": {"$in": list(referenced_version_ids)},
- "type": "representation"
- }
- )
-
- return list(ref_ids)
-
-
-def _process_referenced_pipeline_result(result, link_type):
- """Filters result from pipeline for particular link_type.
-
- Pipeline cannot use link_type directly in a query.
-
- Returns:
- set: Referenced version ids filtered by link type.
- """
-
- referenced_version_ids = set()
- correctly_linked_ids = set()
- for item in result:
- input_links = item.get("data", {}).get("inputLinks")
- if not input_links:
- continue
-
- _filter_input_links(
- input_links,
- link_type,
- correctly_linked_ids
- )
-
- # outputs_recursive in random order, sort by depth
- outputs_recursive = item.get("outputs_recursive")
- if not outputs_recursive:
- continue
-
- for output in sorted(outputs_recursive, key=lambda o: o["depth"]):
- output_links = output.get("data", {}).get("inputLinks")
- if not output_links and output["type"] != "hero_version":
- continue
-
- # Leaf
- if output["_id"] not in correctly_linked_ids:
- continue
-
- _filter_input_links(
- output_links,
- link_type,
- correctly_linked_ids
- )
-
- referenced_version_ids.add(output["_id"])
-
- return referenced_version_ids
-
-
-def _filter_input_links(input_links, link_type, correctly_linked_ids):
- if not input_links: # to handle hero versions
- return
-
- for input_link in input_links:
- if link_type and input_link["type"] != link_type:
- continue
-
- link_id = input_link.get("id") or input_link.get("_id")
- if link_id is not None:
- correctly_linked_ids.add(link_id)
+if not AYON_SERVER_ENABLED:
+ from .mongo.entity_links import *
+else:
+ from .server.entity_links import *
diff --git a/openpype/client/mongo/__init__.py b/openpype/client/mongo/__init__.py
new file mode 100644
index 0000000000..5c5143a731
--- /dev/null
+++ b/openpype/client/mongo/__init__.py
@@ -0,0 +1,20 @@
+from .mongo import (
+ MongoEnvNotSet,
+ get_default_components,
+ should_add_certificate_path_to_mongo_url,
+ validate_mongo_connection,
+ OpenPypeMongoConnection,
+ get_project_database,
+ get_project_connection,
+)
+
+
+__all__ = (
+ "MongoEnvNotSet",
+ "get_default_components",
+ "should_add_certificate_path_to_mongo_url",
+ "validate_mongo_connection",
+ "OpenPypeMongoConnection",
+ "get_project_database",
+ "get_project_connection",
+)
diff --git a/openpype/client/mongo/entities.py b/openpype/client/mongo/entities.py
new file mode 100644
index 0000000000..260fde4594
--- /dev/null
+++ b/openpype/client/mongo/entities.py
@@ -0,0 +1,1555 @@
+"""Unclear if these will have public functions like these.
+
+Goal is that most of functions here are called on (or with) an object
+that has project name as a context (e.g. on 'ProjectEntity'?).
+
++ We will need more specific functions doing very specific queries really fast.
+"""
+
+import re
+import collections
+
+import six
+from bson.objectid import ObjectId
+
+from .mongo import get_project_database, get_project_connection
+
+PatternType = type(re.compile(""))
+
+
+def _prepare_fields(fields, required_fields=None):
+ if not fields:
+ return None
+
+ output = {
+ field: True
+ for field in fields
+ }
+ if "_id" not in output:
+ output["_id"] = True
+
+ if required_fields:
+ for key in required_fields:
+ output[key] = True
+ return output
+
+
+def convert_id(in_id):
+ """Helper function for conversion of id from string to ObjectId.
+
+ Args:
+ in_id (Union[str, ObjectId, Any]): Entity id that should be converted
+ to right type for queries.
+
+ Returns:
+ Union[ObjectId, Any]: Converted ids to ObjectId or in type.
+ """
+
+ if isinstance(in_id, six.string_types):
+ return ObjectId(in_id)
+ return in_id
+
+
+def convert_ids(in_ids):
+ """Helper function for conversion of ids from string to ObjectId.
+
+ Args:
+ in_ids (Iterable[Union[str, ObjectId, Any]]): List of entity ids that
+ should be converted to right type for queries.
+
+ Returns:
+ List[ObjectId]: Converted ids to ObjectId.
+ """
+
+ _output = set()
+ for in_id in in_ids:
+ if in_id is not None:
+ _output.add(convert_id(in_id))
+ return list(_output)
+
+
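A minimal sketch of the converter behavior; it needs only `bson`, no database connection, and the id below is a hypothetical placeholder:

```python
from bson.objectid import ObjectId

from openpype.client.mongo.entities import convert_id, convert_ids

raw_id = "507f1f77bcf86cd799439011"  # hypothetical 24-char hex id
# Strings become ObjectId, ObjectId instances pass through unchanged.
assert convert_id(raw_id) == ObjectId(raw_id)
assert convert_id(ObjectId(raw_id)) == ObjectId(raw_id)
# None entries are dropped and duplicates collapse via the internal set.
assert convert_ids([raw_id, None, ObjectId(raw_id)]) == [ObjectId(raw_id)]
```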
+def get_projects(active=True, inactive=False, fields=None):
+ """Yield all project entity documents.
+
+ Args:
+ active (Optional[bool]): Include active projects. Defaults to True.
+ inactive (Optional[bool]): Include inactive projects.
+ Defaults to False.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Yields:
+ dict: Project entity data which can be reduced to specified 'fields'.
+ None is returned if project with specified filters was not found.
+ """
+ mongodb = get_project_database()
+ for project_name in mongodb.collection_names():
+ if project_name in ("system.indexes",):
+ continue
+ project_doc = get_project(
+ project_name, active=active, inactive=inactive, fields=fields
+ )
+ if project_doc is not None:
+ yield project_doc
+
+
+def get_project(project_name, active=True, inactive=True, fields=None):
+ """Return project entity document by project name.
+
+ Args:
+ project_name (str): Name of project.
+ active (Optional[bool]): Allow active project. Defaults to True.
+ inactive (Optional[bool]): Allow inactive project. Defaults to True.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Project entity data which can be reduced to
+ specified 'fields'. None is returned if project with specified
+ filters was not found.
+ """
+ # Skip if both are disabled
+ if not active and not inactive:
+ return None
+
+ query_filter = {"type": "project"}
+ # Keep query untouched if both should be available
+ if active and inactive:
+ pass
+
+ # Add filter to keep only active
+ elif active:
+ query_filter["$or"] = [
+ {"data.active": {"$exists": False}},
+ {"data.active": True},
+ ]
+
+ # Add filter to keep only inactive
+ elif inactive:
+ query_filter["$or"] = [
+ {"data.active": {"$exists": False}},
+ {"data.active": False},
+ ]
+
+ conn = get_project_connection(project_name)
+ return conn.find_one(query_filter, _prepare_fields(fields))
+
+
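A usage sketch, assuming a reachable MongoDB configured through OpenPype's environment (e.g. `OPENPYPE_MONGO`) and a hypothetical project name:

```python
from openpype.client.mongo.entities import get_project

# Only "_id", "name" and "data.code" are projected; _prepare_fields
# always adds "_id" to the projection.
project_doc = get_project("demo_project", fields=["name", "data.code"])
if project_doc is None:
    print("Project not found")
else:
    print(project_doc["name"], project_doc.get("data", {}).get("code"))
```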
+def get_whole_project(project_name):
+ """Receive all documents from project.
+
+ Helper that can be used to get all documents from a whole project, for
+ example for backups.
+
+ Returns:
+ Cursor: Query cursor as iterable which returns all documents from
+ project collection.
+ """
+
+ conn = get_project_connection(project_name)
+ return conn.find({})
+
+
+def get_asset_by_id(project_name, asset_id, fields=None):
+ """Receive asset data by its id.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ asset_id (Union[str, ObjectId]): Asset's id.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Asset entity data which can be reduced to
+ specified 'fields'. None is returned if asset with specified
+ filters was not found.
+ """
+
+ asset_id = convert_id(asset_id)
+ if not asset_id:
+ return None
+
+ query_filter = {"type": "asset", "_id": asset_id}
+ conn = get_project_connection(project_name)
+ return conn.find_one(query_filter, _prepare_fields(fields))
+
+
+def get_asset_by_name(project_name, asset_name, fields=None):
+ """Receive asset data by its name.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ asset_name (str): Asset's name.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Asset entity data which can be reduced to
+ specified 'fields'. None is returned if asset with specified
+ filters was not found.
+ """
+
+ if not asset_name:
+ return None
+
+ query_filter = {"type": "asset", "name": asset_name}
+ conn = get_project_connection(project_name)
+ return conn.find_one(query_filter, _prepare_fields(fields))
+
+
+# NOTE this could be just public function?
+# - any better variable name instead of 'standard'?
+# - same approach can be used for rest of types
+def _get_assets(
+ project_name,
+ asset_ids=None,
+ asset_names=None,
+ parent_ids=None,
+ standard=True,
+ archived=False,
+ fields=None
+):
+ """Assets for specified project by passed filters.
+
+ Passed filters (ids and names) are always combined so all conditions must
+ match.
+
+ To receive all assets from project just keep filters empty.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ asset_ids (Iterable[Union[str, ObjectId]]): Asset ids that should
+ be found.
+ asset_names (Iterable[str]): Names of assets that should be found.
+ parent_ids (Iterable[Union[str, ObjectId]]): Parent asset ids.
+ standard (bool): Query standard assets (type 'asset').
+ archived (bool): Query archived assets (type 'archived_asset').
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Cursor: Query cursor as iterable which returns asset documents matching
+ passed filters.
+ """
+
+ asset_types = []
+ if standard:
+ asset_types.append("asset")
+ if archived:
+ asset_types.append("archived_asset")
+
+ if not asset_types:
+ return []
+
+ if len(asset_types) == 1:
+ query_filter = {"type": asset_types[0]}
+ else:
+ query_filter = {"type": {"$in": asset_types}}
+
+ if asset_ids is not None:
+ asset_ids = convert_ids(asset_ids)
+ if not asset_ids:
+ return []
+ query_filter["_id"] = {"$in": asset_ids}
+
+ if asset_names is not None:
+ if not asset_names:
+ return []
+ query_filter["name"] = {"$in": list(asset_names)}
+
+ if parent_ids is not None:
+ parent_ids = convert_ids(parent_ids)
+ if not parent_ids:
+ return []
+ query_filter["data.visualParent"] = {"$in": parent_ids}
+
+ conn = get_project_connection(project_name)
+
+ return conn.find(query_filter, _prepare_fields(fields))
+
+
+def get_assets(
+ project_name,
+ asset_ids=None,
+ asset_names=None,
+ parent_ids=None,
+ archived=False,
+ fields=None
+):
+ """Assets for specified project by passed filters.
+
+ Passed filters (ids and names) are always combined so all conditions must
+ match.
+
+ To receive all assets from project just keep filters empty.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ asset_ids (Iterable[Union[str, ObjectId]]): Asset ids that should
+ be found.
+ asset_names (Iterable[str]): Names of assets that should be found.
+ parent_ids (Iterable[Union[str, ObjectId]]): Parent asset ids.
+ archived (bool): Add also archived assets.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Cursor: Query cursor as iterable which returns asset documents matching
+ passed filters.
+ """
+
+ return _get_assets(
+ project_name,
+ asset_ids,
+ asset_names,
+ parent_ids,
+ True,
+ archived,
+ fields
+ )
+
+
+def get_archived_assets(
+ project_name,
+ asset_ids=None,
+ asset_names=None,
+ parent_ids=None,
+ fields=None
+):
+ """Archived assets for specified project by passed filters.
+
+ Passed filters (ids and names) are always combined so all conditions must
+ match.
+
+ To receive all archived assets from project just keep filters empty.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ asset_ids (Iterable[Union[str, ObjectId]]): Asset ids that should
+ be found.
+ asset_names (Iterable[str]): Names of assets that should be found.
+ parent_ids (Iterable[Union[str, ObjectId]]): Parent asset ids.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Cursor: Query cursor as iterable which returns asset documents matching
+ passed filters.
+ """
+
+ return _get_assets(
+ project_name, asset_ids, asset_names, parent_ids, False, True, fields
+ )
+
+
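All filters combine with AND, and an empty iterable short-circuits to `[]` without touching the database. A sketch with hypothetical asset names (same connection assumptions as above):

```python
from openpype.client.mongo.entities import get_assets

asset_docs = get_assets(
    "demo_project",
    asset_names=["chr_hero", "chr_villain"],  # hypothetical names
    fields=["name", "data.visualParent"],
)
for asset_doc in asset_docs:
    print(asset_doc["_id"], asset_doc["name"])
```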
+def get_asset_ids_with_subsets(project_name, asset_ids=None):
+ """Find out which assets have existing subsets.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ asset_ids (Iterable[Union[str, ObjectId]]): Look only for entered
+ asset ids.
+
+ Returns:
+ Iterable[ObjectId]: Asset ids that have existing subsets.
+ """
+
+ subset_query = {
+ "type": "subset"
+ }
+ if asset_ids is not None:
+ asset_ids = convert_ids(asset_ids)
+ if not asset_ids:
+ return []
+ subset_query["parent"] = {"$in": asset_ids}
+
+ conn = get_project_connection(project_name)
+ result = conn.aggregate([
+ {
+ "$match": subset_query
+ },
+ {
+ "$group": {
+ "_id": "$parent",
+ "count": {"$sum": 1}
+ }
+ }
+ ])
+ asset_ids_with_subsets = []
+ for item in result:
+ asset_id = item["_id"]
+ count = item["count"]
+ if count > 0:
+ asset_ids_with_subsets.append(asset_id)
+ return asset_ids_with_subsets
+
+
+def get_subset_by_id(project_name, subset_id, fields=None):
+ """Single subset entity data by its id.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ subset_id (Union[str, ObjectId]): Id of subset which should be found.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Subset entity data which can be reduced to
+ specified 'fields'. None is returned if subset with specified
+ filters was not found.
+ """
+
+ subset_id = convert_id(subset_id)
+ if not subset_id:
+ return None
+
+ query_filters = {"type": "subset", "_id": subset_id}
+ conn = get_project_connection(project_name)
+ return conn.find_one(query_filters, _prepare_fields(fields))
+
+
+def get_subset_by_name(project_name, subset_name, asset_id, fields=None):
+ """Single subset entity data by its name and its version id.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ subset_name (str): Name of subset.
+ asset_id (Union[str, ObjectId]): Id of parent asset.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Subset entity data which can be reduced to
+ specified 'fields'. None is returned if subset with specified
+ filters was not found.
+ """
+ if not subset_name:
+ return None
+
+ asset_id = convert_id(asset_id)
+ if not asset_id:
+ return None
+
+ query_filters = {
+ "type": "subset",
+ "name": subset_name,
+ "parent": asset_id
+ }
+ conn = get_project_connection(project_name)
+ return conn.find_one(query_filters, _prepare_fields(fields))
+
+
+def get_subsets(
+ project_name,
+ subset_ids=None,
+ subset_names=None,
+ asset_ids=None,
+ names_by_asset_ids=None,
+ archived=False,
+ fields=None
+):
+ """Subset entities data from one project filtered by entered filters.
+
+ Filters are additive (all conditions must pass to return a subset).
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ subset_ids (Iterable[Union[str, ObjectId]]): Subset ids that should be
+ queried. Filter ignored if 'None' is passed.
+ subset_names (Iterable[str]): Subset names that should be queried.
+ Filter ignored if 'None' is passed.
+ asset_ids (Iterable[Union[str, ObjectId]]): Asset ids under which
+ should look for the subsets. Filter ignored if 'None' is passed.
+ names_by_asset_ids (dict[ObjectId, List[str]]): Complex filtering
+ using asset ids and list of subset names under the asset.
+ archived (bool): Look for archived subsets too.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Cursor: Iterable cursor yielding all matching subsets.
+ """
+
+ subset_types = ["subset"]
+ if archived:
+ subset_types.append("archived_subset")
+
+ if len(subset_types) == 1:
+ query_filter = {"type": subset_types[0]}
+ else:
+ query_filter = {"type": {"$in": subset_types}}
+
+ if asset_ids is not None:
+ asset_ids = convert_ids(asset_ids)
+ if not asset_ids:
+ return []
+ query_filter["parent"] = {"$in": asset_ids}
+
+ if subset_ids is not None:
+ subset_ids = convert_ids(subset_ids)
+ if not subset_ids:
+ return []
+ query_filter["_id"] = {"$in": subset_ids}
+
+ if subset_names is not None:
+ if not subset_names:
+ return []
+ query_filter["name"] = {"$in": list(subset_names)}
+
+ if names_by_asset_ids is not None:
+ or_query = []
+ for asset_id, names in names_by_asset_ids.items():
+ if asset_id and names:
+ or_query.append({
+ "parent": convert_id(asset_id),
+ "name": {"$in": list(names)}
+ })
+ if not or_query:
+ return []
+ query_filter["$or"] = or_query
+
+ conn = get_project_connection(project_name)
+ return conn.find(query_filter, _prepare_fields(fields))
+
+
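The `names_by_asset_ids` filter expands into an `$or` of per-asset conditions. A sketch with a hypothetical asset id and subset names:

```python
from bson.objectid import ObjectId

from openpype.client.mongo.entities import get_subsets

names_by_asset_ids = {
    ObjectId("507f1f77bcf86cd799439011"): ["modelMain", "rigMain"],
}
# Expands to: {"type": "subset", "$or": [
#     {"parent": ObjectId(...), "name": {"$in": ["modelMain", "rigMain"]}}]}
subset_docs = list(get_subsets(
    "demo_project",
    names_by_asset_ids=names_by_asset_ids,
    fields=["name", "parent"],
))
```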
+def get_subset_families(project_name, subset_ids=None):
+ """Set of main families of subsets.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ subset_ids (Iterable[Union[str, ObjectId]]): Subset ids that should
+ be queried. All subsets from project are used if 'None' is passed.
+
+ Returns:
+ set[str]: Main families of matching subsets.
+ """
+
+ subset_filter = {
+ "type": "subset"
+ }
+ if subset_ids is not None:
+ if not subset_ids:
+ return set()
+ subset_filter["_id"] = {"$in": list(subset_ids)}
+
+ conn = get_project_connection(project_name)
+ result = list(conn.aggregate([
+ {"$match": subset_filter},
+ {"$project": {
+ "family": {"$arrayElemAt": ["$data.families", 0]}
+ }},
+ {"$group": {
+ "_id": "family_group",
+ "families": {"$addToSet": "$family"}
+ }}
+ ]))
+ if result:
+ return set(result[0]["families"])
+ return set()
+
+
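The aggregation takes the first item of each subset's `data.families` as its main family and collects those into a set; a short sketch:

```python
from openpype.client.mongo.entities import get_subset_families

# With subset_ids=None every subset of the project is considered.
families = get_subset_families("demo_project")
print(sorted(families))  # hypothetical output: ["model", "rig"]
```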
+def get_version_by_id(project_name, version_id, fields=None):
+ """Single version entity data by its id.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ version_id (Union[str, ObjectId]): Id of version which should be found.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Version entity data which can be reduced to
+ specified 'fields'. None is returned if version with specified
+ filters was not found.
+ """
+
+ version_id = convert_id(version_id)
+ if not version_id:
+ return None
+
+ query_filter = {
+ "type": {"$in": ["version", "hero_version"]},
+ "_id": version_id
+ }
+ conn = get_project_connection(project_name)
+ return conn.find_one(query_filter, _prepare_fields(fields))
+
+
+def get_version_by_name(project_name, version, subset_id, fields=None):
+ """Single version entity data by its name and subset id.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ version (int): Name of version entity (the version number).
+ subset_id (Union[str, ObjectId]): Id of subset under which the version
+ should be found.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Version entity data which can be reduced to
+ specified 'fields'. None is returned if version with specified
+ filters was not found.
+ """
+
+ subset_id = convert_id(subset_id)
+ if not subset_id:
+ return None
+
+ conn = get_project_connection(project_name)
+ query_filter = {
+ "type": "version",
+ "parent": subset_id,
+ "name": version
+ }
+ return conn.find_one(query_filter, _prepare_fields(fields))
+
+
+def version_is_latest(project_name, version_id):
+ """Is version the latest from its subset.
+
+ Note:
+ Hero versions are always considered the latest.
+
+ Todo:
+ Maybe raise exception when version was not found?
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ version_id (Union[str, ObjectId]): Version id which is checked.
+
+ Returns:
+ bool: True if is latest version from subset else False.
+ """
+
+ version_id = convert_id(version_id)
+ if not version_id:
+ return False
+ version_doc = get_version_by_id(
+ project_name, version_id, fields=["_id", "type", "parent"]
+ )
+ # What to do when version is not found?
+ if not version_doc:
+ return False
+
+ if version_doc["type"] == "hero_version":
+ return True
+
+ last_version = get_last_version_by_subset_id(
+ project_name, version_doc["parent"], fields=["_id"]
+ )
+ return last_version["_id"] == version_id
+
+
+def _get_versions(
+ project_name,
+ subset_ids=None,
+ version_ids=None,
+ versions=None,
+ standard=True,
+ hero=False,
+ fields=None
+):
+ version_types = []
+ if standard:
+ version_types.append("version")
+
+ if hero:
+ version_types.append("hero_version")
+
+ if not version_types:
+ return []
+ elif len(version_types) == 1:
+ query_filter = {"type": version_types[0]}
+ else:
+ query_filter = {"type": {"$in": version_types}}
+
+ if subset_ids is not None:
+ subset_ids = convert_ids(subset_ids)
+ if not subset_ids:
+ return []
+ query_filter["parent"] = {"$in": subset_ids}
+
+ if version_ids is not None:
+ version_ids = convert_ids(version_ids)
+ if not version_ids:
+ return []
+ query_filter["_id"] = {"$in": version_ids}
+
+ if versions is not None:
+ versions = list(versions)
+ if not versions:
+ return []
+
+ if len(versions) == 1:
+ query_filter["name"] = versions[0]
+ else:
+ query_filter["name"] = {"$in": versions}
+
+ conn = get_project_connection(project_name)
+
+ return conn.find(query_filter, _prepare_fields(fields))
+
+
+def get_versions(
+ project_name,
+ version_ids=None,
+ subset_ids=None,
+ versions=None,
+ hero=False,
+ fields=None
+):
+ """Version entities data from one project filtered by entered filters.
+
+ Filters are additive (all conditions must pass to return a version).
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ version_ids (Iterable[Union[str, ObjectId]]): Version ids that will
+ be queried. Filter ignored if 'None' is passed.
+ subset_ids (Iterable[str]): Subset ids that will be queried.
+ Filter ignored if 'None' is passed.
+ versions (Iterable[int]): Version names (as integers).
+ Filter ignored if 'None' is passed.
+ hero (bool): Look also for hero versions.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Cursor: Iterable cursor yielding all matching versions.
+ """
+
+ return _get_versions(
+ project_name,
+ subset_ids,
+ version_ids,
+ versions,
+ standard=True,
+ hero=hero,
+ fields=fields
+ )
+
+
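A sketch querying specific version numbers of one subset; the id is a hypothetical placeholder and is converted by `convert_ids` internally:

```python
from openpype.client.mongo.entities import get_versions

version_docs = list(get_versions(
    "demo_project",
    subset_ids=["507f1f77bcf86cd799439011"],
    versions=[1, 2],
    fields=["name", "parent"],
))
for version_doc in version_docs:
    print(version_doc["parent"], version_doc["name"])
```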
+def get_hero_version_by_subset_id(project_name, subset_id, fields=None):
+ """Hero version by subset id.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ subset_id (Union[str, ObjectId]): Subset id under which
+ the hero version is stored.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Hero version entity data which can be reduced to
+ specified 'fields'. None is returned if hero version with specified
+ filters was not found.
+ """
+
+ subset_id = convert_id(subset_id)
+ if not subset_id:
+ return None
+
+ versions = list(_get_versions(
+ project_name,
+ subset_ids=[subset_id],
+ standard=False,
+ hero=True,
+ fields=fields
+ ))
+ if versions:
+ return versions[0]
+ return None
+
+
+def get_hero_version_by_id(project_name, version_id, fields=None):
+ """Hero version by its id.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ version_id (Union[str, ObjectId]): Hero version id.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Hero version entity data which can be reduced to
+ specified 'fields'. None is returned if hero version with specified
+ filters was not found.
+ """
+
+ version_id = convert_id(version_id)
+ if not version_id:
+ return None
+
+ versions = list(_get_versions(
+ project_name,
+ version_ids=[version_id],
+ standard=False,
+ hero=True,
+ fields=fields
+ ))
+ if versions:
+ return versions[0]
+ return None
+
+
+def get_hero_versions(
+ project_name,
+ subset_ids=None,
+ version_ids=None,
+ fields=None
+):
+ """Hero version entities data from one project filtered by entered filters.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ subset_ids (Iterable[Union[str, ObjectId]]): Subset ids for which
+ should look for hero versions. Filter ignored if 'None' is passed.
+ version_ids (Iterable[Union[str, ObjectId]]): Hero version ids. Filter
+ ignored if 'None' is passed.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Cursor|list: Iterable yielding hero versions matching passed filters.
+ """
+
+ return _get_versions(
+ project_name,
+ subset_ids,
+ version_ids,
+ standard=False,
+ hero=True,
+ fields=fields
+ )
+
+
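All three hero getters reuse `_get_versions` with `standard=False, hero=True`. A sketch of the single-subset convenience (hypothetical id):

```python
from openpype.client.mongo.entities import get_hero_version_by_subset_id

hero_doc = get_hero_version_by_subset_id(
    "demo_project", "507f1f77bcf86cd799439011", fields=["version_id"]
)
if hero_doc is not None:
    # A hero version references the version it mirrors via "version_id".
    print(hero_doc["version_id"])
```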
+def get_output_link_versions(project_name, version_id, fields=None):
+ """Versions where passed version was used as input.
+
+ Question:
+ Not 100% sure about the usage of the function, so the name and
+ docstring may not match what it does.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ version_id (Union[str, ObjectId]): Version id which can be used
+ as input link for other versions.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Iterable: Iterable cursor yielding versions that are used as input
+ links for passed version.
+ """
+
+ version_id = convert_id(version_id)
+ if not version_id:
+ return []
+
+ conn = get_project_connection(project_name)
+ # Does make sense to look for hero versions?
+ query_filter = {
+ "type": "version",
+ "data.inputLinks.id": version_id
+ }
+ return conn.find(query_filter, _prepare_fields(fields))
+
+
+def get_last_versions(project_name, subset_ids, active=None, fields=None):
+ """Latest versions for entered subset_ids.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ subset_ids (Iterable[Union[str, ObjectId]]): List of subset ids.
+ active (Optional[bool]): If True only active, if False only inactive
+ versions are returned. Filter ignored if 'None' is passed.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ dict[ObjectId, Dict]: Key is subset id and value is last version entity data.
+ """
+
+ subset_ids = convert_ids(subset_ids)
+ if not subset_ids:
+ return {}
+
+ if fields is not None:
+ fields = list(fields)
+ if not fields:
+ return {}
+
+ # Avoid double query if only name and _id are requested
+ name_needed = False
+ limit_query = False
+ if fields:
+ fields_s = set(fields)
+ if "name" in fields_s:
+ name_needed = True
+ fields_s.remove("name")
+
+ for field in ("_id", "parent"):
+ if field in fields_s:
+ fields_s.remove(field)
+ limit_query = len(fields_s) == 0
+
+ group_item = {
+ "_id": "$parent",
+ "_version_id": {"$last": "$_id"}
+ }
+ # Add name if name is needed (only for limit query)
+ if name_needed:
+ group_item["name"] = {"$last": "$name"}
+
+ aggregate_filter = {
+ "type": "version",
+ "parent": {"$in": subset_ids}
+ }
+ if active is False:
+ aggregate_filter["data.active"] = active
+ elif active is True:
+ aggregate_filter["$or"] = [
+ {"data.active": {"$exists": 0}},
+ {"data.active": active},
+ ]
+
+ aggregation_pipeline = [
+ # Find all versions of those subsets
+ {"$match": aggregate_filter},
+ # Sorting versions all together
+ {"$sort": {"name": 1}},
+ # Group them by "parent", but only take the last
+ {"$group": group_item}
+ ]
+
+ conn = get_project_connection(project_name)
+ aggregate_result = conn.aggregate(aggregation_pipeline)
+ if limit_query:
+ output = {}
+ for item in aggregate_result:
+ subset_id = item["_id"]
+ item_data = {"_id": item["_version_id"], "parent": subset_id}
+ if name_needed:
+ item_data["name"] = item["name"]
+ output[subset_id] = item_data
+ return output
+
+ version_ids = [
+ doc["_version_id"]
+ for doc in aggregate_result
+ ]
+
+ fields = _prepare_fields(fields, ["parent"])
+
+ version_docs = get_versions(
+ project_name, version_ids=version_ids, fields=fields
+ )
+
+ return {
+ version_doc["parent"]: version_doc
+ for version_doc in version_docs
+ }
+
+
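When only `_id`, `parent` and `name` are requested the aggregation result is returned directly; any other field triggers a second `get_versions` query. A sketch staying on the fast path (hypothetical id):

```python
from openpype.client.mongo.entities import get_last_versions

last_by_subset_id = get_last_versions(
    "demo_project",
    subset_ids=["507f1f77bcf86cd799439011"],
    fields=["_id", "name", "parent"],  # keeps the aggregation-only path
)
for subset_id, version_doc in last_by_subset_id.items():
    print(subset_id, version_doc["name"])
```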
+def get_last_version_by_subset_id(project_name, subset_id, fields=None):
+ """Last version for passed subset id.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ subset_id (Union[str, ObjectId]): Id of subset for which the last version should be found.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Version entity data which can be reduced to
+ specified 'fields'. None is returned if version with specified
+ filters was not found.
+ """
+
+ subset_id = convert_id(subset_id)
+ if not subset_id:
+ return None
+
+ last_versions = get_last_versions(
+ project_name, subset_ids=[subset_id], fields=fields
+ )
+ return last_versions.get(subset_id)
+
+
+def get_last_version_by_subset_name(
+ project_name, subset_name, asset_id=None, asset_name=None, fields=None
+):
+ """Last version for passed subset name under asset id/name.
+
+ It is required to pass 'asset_id' or 'asset_name'. Asset id is recommended
+ if available.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ subset_name (str): Name of subset.
+ asset_id (Union[str, ObjectId]): Asset id which is parent of passed
+ subset name.
+ asset_name (str): Asset name which is parent of passed subset name.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Version entity data which can be reduced to
+ specified 'fields'. None is returned if version with specified
+ filters was not found.
+ """
+
+ if not asset_id and not asset_name:
+ return None
+
+ if not asset_id:
+ asset_doc = get_asset_by_name(project_name, asset_name, fields=["_id"])
+ if not asset_doc:
+ return None
+ asset_id = asset_doc["_id"]
+ subset_doc = get_subset_by_name(
+ project_name, subset_name, asset_id, fields=["_id"]
+ )
+ if not subset_doc:
+ return None
+ return get_last_version_by_subset_id(
+ project_name, subset_doc["_id"], fields=fields
+ )
+
+
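A sketch resolving purely by names, which costs two extra lookups compared to passing `asset_id` directly (names are hypothetical):

```python
from openpype.client.mongo.entities import get_last_version_by_subset_name

last_version_doc = get_last_version_by_subset_name(
    "demo_project",
    subset_name="modelMain",
    asset_name="chr_hero",
    fields=["name"],
)
print(last_version_doc["name"] if last_version_doc else "not found")
```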
+def get_representation_by_id(project_name, representation_id, fields=None):
+ """Representation entity data by its id.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ representation_id (Union[str, ObjectId]): Representation id.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[Dict, None]: Representation entity data which can be reduced to
+ specified 'fields'. None is returned if representation with
+ specified filters was not found.
+ """
+
+ if not representation_id:
+ return None
+
+ repre_types = ["representation", "archived_representation"]
+ query_filter = {
+ "type": {"$in": repre_types}
+ }
+ if representation_id is not None:
+ query_filter["_id"] = convert_id(representation_id)
+
+ conn = get_project_connection(project_name)
+
+ return conn.find_one(query_filter, _prepare_fields(fields))
+
+
+def get_representation_by_name(
+ project_name, representation_name, version_id, fields=None
+):
+ """Representation entity data by its name and its version id.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ representation_name (str): Representation name.
+ version_id (Union[str, ObjectId]): Id of parent version entity.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Union[dict[str, Any], None]: Representation entity data which can be
+ reduced to specified 'fields'. None is returned if representation
+ with specified filters was not found.
+ """
+
+ version_id = convert_id(version_id)
+ if not version_id or not representation_name:
+ return None
+ repre_types = ["representation", "archived_representations"]
+ query_filter = {
+ "type": {"$in": repre_types},
+ "name": representation_name,
+ "parent": version_id
+ }
+
+ conn = get_project_connection(project_name)
+ return conn.find_one(query_filter, _prepare_fields(fields))
+
+
+def _flatten_dict(data):
+ flatten_queue = collections.deque()
+ flatten_queue.append(data)
+ output = {}
+ while flatten_queue:
+ item = flatten_queue.popleft()
+ for key, value in item.items():
+ if not isinstance(value, dict):
+ output[key] = value
+ continue
+
+ tmp = {}
+ for subkey, subvalue in value.items():
+ new_key = "{}.{}".format(key, subkey)
+ tmp[new_key] = subvalue
+ flatten_queue.append(tmp)
+ return output
+
+
+def _regex_filters(filters):
+ output = []
+ for key, value in filters.items():
+ regexes = []
+ a_values = []
+ if isinstance(value, PatternType):
+ regexes.append(value)
+ elif isinstance(value, (list, tuple, set)):
+ for item in value:
+ if isinstance(item, PatternType):
+ regexes.append(item)
+ else:
+ a_values.append(item)
+ else:
+ a_values.append(value)
+
+ key_filters = []
+ if len(a_values) == 1:
+ key_filters.append({key: a_values[0]})
+ elif a_values:
+ key_filters.append({key: {"$in": a_values}})
+
+ for regex in regexes:
+ key_filters.append({key: {"$regex": regex}})
+
+ if len(key_filters) == 1:
+ output.append(key_filters[0])
+ else:
+ output.append({"$or": key_filters})
+
+ return output
+
+
+def _get_representations(
+ project_name,
+ representation_ids,
+ representation_names,
+ version_ids,
+ context_filters,
+ names_by_version_ids,
+ standard,
+ archived,
+ fields
+):
+ default_output = []
+ repre_types = []
+ if standard:
+ repre_types.append("representation")
+ if archived:
+ repre_types.append("archived_representation")
+
+ if not repre_types:
+ return default_output
+
+ if len(repre_types) == 1:
+ query_filter = {"type": repre_types[0]}
+ else:
+ query_filter = {"type": {"$in": repre_types}}
+
+ if representation_ids is not None:
+ representation_ids = convert_ids(representation_ids)
+ if not representation_ids:
+ return default_output
+ query_filter["_id"] = {"$in": representation_ids}
+
+ if representation_names is not None:
+ if not representation_names:
+ return default_output
+ query_filter["name"] = {"$in": list(representation_names)}
+
+ if version_ids is not None:
+ version_ids = convert_ids(version_ids)
+ if not version_ids:
+ return default_output
+ query_filter["parent"] = {"$in": version_ids}
+
+ or_queries = []
+ if names_by_version_ids is not None:
+ or_query = []
+ for version_id, names in names_by_version_ids.items():
+ if version_id and names:
+ or_query.append({
+ "parent": convert_id(version_id),
+ "name": {"$in": list(names)}
+ })
+ if not or_query:
+ return default_output
+ or_queries.append(or_query)
+
+ if context_filters is not None:
+ if not context_filters:
+            return default_output
+ _flatten_filters = _flatten_dict(context_filters)
+ flatten_filters = {}
+ for key, value in _flatten_filters.items():
+ if not key.startswith("context"):
+ key = "context.{}".format(key)
+ flatten_filters[key] = value
+
+ for item in _regex_filters(flatten_filters):
+ for key, value in item.items():
+ if key != "$or":
+ query_filter[key] = value
+
+ elif value:
+ or_queries.append(value)
+
+ if len(or_queries) == 1:
+ query_filter["$or"] = or_queries[0]
+ elif or_queries:
+ and_query = []
+ for or_query in or_queries:
+ if isinstance(or_query, list):
+ or_query = {"$or": or_query}
+ and_query.append(or_query)
+ query_filter["$and"] = and_query
+
+ conn = get_project_connection(project_name)
+
+ return conn.find(query_filter, _prepare_fields(fields))
+
+
+def get_representations(
+ project_name,
+ representation_ids=None,
+ representation_names=None,
+ version_ids=None,
+ context_filters=None,
+ names_by_version_ids=None,
+ archived=False,
+ standard=True,
+ fields=None
+):
+ """Representation entities data from one project filtered by filters.
+
+    Filters are additive (all conditions must pass for a representation
+    to be returned).
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ representation_ids (Iterable[Union[str, ObjectId]]): Representation ids
+ used as filter. Filter ignored if 'None' is passed.
+ representation_names (Iterable[str]): Representations names used
+ as filter. Filter ignored if 'None' is passed.
+        version_ids (Iterable[str]): Version ids used as parent filter. Filter
+ ignored if 'None' is passed.
+ context_filters (Dict[str, List[str, PatternType]]): Filter by
+ representation context fields.
+ names_by_version_ids (dict[ObjectId, list[str]]): Complex filtering
+ using version ids and list of names under the version.
+        archived (bool): Output will also contain archived representations.
+        standard (bool): Output will contain standard (not archived)
+            representations. Defaults to 'True'.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Cursor: Iterable cursor yielding all matching representations.
+ """
+
+ return _get_representations(
+ project_name=project_name,
+ representation_ids=representation_ids,
+ representation_names=representation_names,
+ version_ids=version_ids,
+ context_filters=context_filters,
+ names_by_version_ids=names_by_version_ids,
+ standard=standard,
+ archived=archived,
+ fields=fields
+ )
+
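Since `context_filters` accepts compiled regular expressions next to plain values (see `_regex_filters` above), a usage sketch could look like this (project and context values are hypothetical):

```python
import re

from openpype.client import get_representations

repre_docs = get_representations(
    "demo_project",
    representation_names=["exr", "mov"],
    context_filters={
        # Plain values end up as equality or "$in" filters ...
        "task": ["compositing"],
        # ... compiled patterns end up as "$regex" filters.
        "subset": re.compile(r"^render"),
    },
    fields=["name", "parent", "context"],
)
for repre_doc in repre_docs:
    print(repre_doc["name"], repre_doc["context"]["subset"])
```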
+
+def get_archived_representations(
+ project_name,
+ representation_ids=None,
+ representation_names=None,
+ version_ids=None,
+ context_filters=None,
+ names_by_version_ids=None,
+ fields=None
+):
+ """Archived representation entities data from project with applied filters.
+
+    Filters are additive (all conditions must pass for a representation
+    to be returned).
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ representation_ids (Iterable[Union[str, ObjectId]]): Representation ids
+ used as filter. Filter ignored if 'None' is passed.
+ representation_names (Iterable[str]): Representations names used
+ as filter. Filter ignored if 'None' is passed.
+        version_ids (Iterable[str]): Version ids used as parent filter. Filter
+ ignored if 'None' is passed.
+ context_filters (Dict[str, List[str, PatternType]]): Filter by
+ representation context fields.
+ names_by_version_ids (dict[ObjectId, List[str]]): Complex filtering
+ using version ids and list of names under the version.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+ Cursor: Iterable cursor yielding all matching representations.
+ """
+
+ return _get_representations(
+ project_name=project_name,
+ representation_ids=representation_ids,
+ representation_names=representation_names,
+ version_ids=version_ids,
+ context_filters=context_filters,
+ names_by_version_ids=names_by_version_ids,
+ standard=False,
+ archived=True,
+ fields=fields
+ )
+
+
+def get_representations_parents(project_name, representations):
+ """Prepare parents of representation entities.
+
+ Each item of returned dictionary contains version, subset, asset
+ and project in that order.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ representations (List[dict]): Representation entities with at least
+ '_id' and 'parent' keys.
+
+ Returns:
+ dict[ObjectId, tuple]: Parents by representation id.
+ """
+
+ repre_docs_by_version_id = collections.defaultdict(list)
+ version_docs_by_version_id = {}
+ version_docs_by_subset_id = collections.defaultdict(list)
+ subset_docs_by_subset_id = {}
+ subset_docs_by_asset_id = collections.defaultdict(list)
+ output = {}
+ for repre_doc in representations:
+ repre_id = repre_doc["_id"]
+ version_id = repre_doc["parent"]
+ output[repre_id] = (None, None, None, None)
+ repre_docs_by_version_id[version_id].append(repre_doc)
+
+ version_docs = get_versions(
+ project_name,
+ version_ids=repre_docs_by_version_id.keys(),
+ hero=True
+ )
+ for version_doc in version_docs:
+ version_id = version_doc["_id"]
+ subset_id = version_doc["parent"]
+ version_docs_by_version_id[version_id] = version_doc
+ version_docs_by_subset_id[subset_id].append(version_doc)
+
+ subset_docs = get_subsets(
+ project_name, subset_ids=version_docs_by_subset_id.keys()
+ )
+ for subset_doc in subset_docs:
+ subset_id = subset_doc["_id"]
+ asset_id = subset_doc["parent"]
+ subset_docs_by_subset_id[subset_id] = subset_doc
+ subset_docs_by_asset_id[asset_id].append(subset_doc)
+
+ asset_docs = get_assets(
+ project_name, asset_ids=subset_docs_by_asset_id.keys()
+ )
+ asset_docs_by_id = {
+ asset_doc["_id"]: asset_doc
+ for asset_doc in asset_docs
+ }
+
+ project_doc = get_project(project_name)
+
+ for version_id, repre_docs in repre_docs_by_version_id.items():
+ asset_doc = None
+ subset_doc = None
+ version_doc = version_docs_by_version_id.get(version_id)
+ if version_doc:
+ subset_id = version_doc["parent"]
+ subset_doc = subset_docs_by_subset_id.get(subset_id)
+ if subset_doc:
+ asset_id = subset_doc["parent"]
+ asset_doc = asset_docs_by_id.get(asset_id)
+
+ for repre_doc in repre_docs:
+ repre_id = repre_doc["_id"]
+ output[repre_id] = (
+ version_doc, subset_doc, asset_doc, project_doc
+ )
+ return output
+
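To illustrate the returned structure, a short sketch (the representation id is hypothetical):

```python
from openpype.client import (
    get_representations,
    get_representations_parents,
)

repre_docs = list(get_representations(
    "demo_project",
    representation_ids=["633f1f3e9a2b4c0011aabbcc"],
))
parents = get_representations_parents("demo_project", repre_docs)
for repre_doc in repre_docs:
    # Each value is a (version, subset, asset, project) tuple.
    version_doc, subset_doc, asset_doc, project_doc = (
        parents[repre_doc["_id"]]
    )
```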
+
+def get_representation_parents(project_name, representation):
+ """Prepare parents of representation entity.
+
+    Returned tuple contains version, subset, asset and project documents
+    in that order.
+
+    Args:
+        project_name (str): Name of project where to look for queried entities.
+        representation (dict): Representation entity with at least
+            '_id' and 'parent' keys.
+
+    Returns:
+        Union[tuple, None]: Parent documents (version, subset, asset,
+            project), or None if no representation was passed.
+ """
+
+ if not representation:
+ return None
+
+ repre_id = representation["_id"]
+ parents_by_repre_id = get_representations_parents(
+ project_name, [representation]
+ )
+ return parents_by_repre_id[repre_id]
+
+
+def get_thumbnail_id_from_source(project_name, src_type, src_id):
+ """Receive thumbnail id from source entity.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ src_type (str): Type of source entity ('asset', 'version').
+ src_id (Union[str, ObjectId]): Id of source entity.
+
+ Returns:
+        Union[ObjectId, None]: Thumbnail id assigned to the entity, or None
+            if the source entity does not have any thumbnail id assigned.
+ """
+
+ if not src_type or not src_id:
+ return None
+
+ query_filter = {"_id": convert_id(src_id)}
+
+ conn = get_project_connection(project_name)
+ src_doc = conn.find_one(query_filter, {"data.thumbnail_id"})
+ if src_doc:
+ return src_doc.get("data", {}).get("thumbnail_id")
+ return None
+
+
+def get_thumbnails(project_name, thumbnail_ids, fields=None):
+ """Receive thumbnails entity data.
+
+ Thumbnail entity can be used to receive binary content of thumbnail based
+ on its content and ThumbnailResolvers.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ thumbnail_ids (Iterable[Union[str, ObjectId]]): Ids of thumbnail
+ entities.
+ fields (Optional[Iterable[str]]): Fields that should be returned. All
+ fields are returned if 'None' is passed.
+
+ Returns:
+        Cursor: Cursor of queried thumbnail documents.
+ """
+
+ if thumbnail_ids:
+ thumbnail_ids = convert_ids(thumbnail_ids)
+
+ if not thumbnail_ids:
+ return []
+ query_filter = {
+ "type": "thumbnail",
+ "_id": {"$in": thumbnail_ids}
+ }
+ conn = get_project_connection(project_name)
+ return conn.find(query_filter, _prepare_fields(fields))
+
+
+def get_thumbnail(
+ project_name, thumbnail_id, entity_type, entity_id, fields=None
+):
+ """Receive thumbnail entity data.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+        thumbnail_id (Union[str, ObjectId]): Id of thumbnail entity.
+        entity_type (str): Type of entity for which thumbnail is queried
+            (unused in this implementation).
+        entity_id (Union[str, ObjectId]): Id of entity for which thumbnail
+            is queried (unused in this implementation).
+        fields (Optional[Iterable[str]]): Fields that should be returned. All
+            fields are returned if 'None' is passed.
+
+    Returns:
+        Union[Dict, None]: Thumbnail entity data which can be reduced to
+            specified 'fields'. None is returned if thumbnail with specified
+            filters was not found.
+ """
+
+ if not thumbnail_id:
+ return None
+ query_filter = {"type": "thumbnail", "_id": convert_id(thumbnail_id)}
+ conn = get_project_connection(project_name)
+ return conn.find_one(query_filter, _prepare_fields(fields))
+
+
+def get_workfile_info(
+ project_name, asset_id, task_name, filename, fields=None
+):
+ """Document with workfile information.
+
+    Warning:
+        Query is based on filename and context which does not mean it will
+        always find the right and expected result. The information has
+        limited usage and is not recommended as a source of truth about
+        the workfile.
+
+    Args:
+        project_name (str): Name of project where to look for queried entities.
+        asset_id (Union[str, ObjectId]): Id of asset entity.
+        task_name (str): Task name on asset.
+        filename (str): Filename of the workfile.
+        fields (Optional[Iterable[str]]): Fields that should be returned. All
+            fields are returned if 'None' is passed.
+
+    Returns:
+        Union[Dict, None]: Workfile entity data which can be reduced to
+            specified 'fields'. None is returned if workfile with specified
+            filters was not found.
+ """
+
+ if not asset_id or not task_name or not filename:
+ return None
+
+ query_filter = {
+ "type": "workfile",
+ "parent": convert_id(asset_id),
+ "task_name": task_name,
+ "filename": filename
+ }
+ conn = get_project_connection(project_name)
+ return conn.find_one(query_filter, _prepare_fields(fields))
+
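A minimal lookup sketch (all context values hypothetical; the 'note' key is just an example of data a workfile document may carry):

```python
from openpype.client import get_workfile_info

workfile_doc = get_workfile_info(
    "demo_project",
    asset_id="633f1f3e9a2b4c0011aabbcc",
    task_name="modeling",
    filename="characterA_modeling_v001.ma",
)
if workfile_doc is not None:
    print(workfile_doc.get("data", {}).get("note"))
```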
+
+"""
+## Custom data storage:
+- Settings - OP settings overrides and local settings
+- Logging - logs from Logger
+- Webpublisher - jobs
+- Ftrack - events
+- Maya - Shaders
+ - openpype/hosts/maya/api/shader_definition_editor.py
+ - openpype/hosts/maya/plugins/publish/validate_model_name.py
+
+## Global publish plugins
+- openpype/plugins/publish/extract_hierarchy_avalon.py
+ Create:
+ - asset
+ Update:
+ - asset
+
+## Lib
+- openpype/lib/avalon_context.py
+ Update:
+ - workfile data
+- openpype/lib/project_backpack.py
+ Update:
+ - project
+"""
diff --git a/openpype/client/mongo/entity_links.py b/openpype/client/mongo/entity_links.py
new file mode 100644
index 0000000000..c97a828118
--- /dev/null
+++ b/openpype/client/mongo/entity_links.py
@@ -0,0 +1,244 @@
+from .mongo import get_project_connection
+from .entities import (
+ get_assets,
+ get_asset_by_id,
+ get_version_by_id,
+ get_representation_by_id,
+ convert_id,
+)
+
+
+def get_linked_asset_ids(project_name, asset_doc=None, asset_id=None):
+ """Extract linked asset ids from asset document.
+
+ One of asset document or asset id must be passed.
+
+    Note:
+        Asset links now work only from asset to assets.
+
+    Args:
+        project_name (str): Name of project where to look for queried
+            entities.
+        asset_doc (dict): Asset document from DB.
+        asset_id (Union[str, ObjectId]): Asset id. Can be used instead of
+            asset document.
+
+    Returns:
+        List[Union[ObjectId, str]]: Asset ids of input links.
+    """
+
+ output = []
+ if not asset_doc and not asset_id:
+ return output
+
+    if not asset_doc:
+        asset_doc = get_asset_by_id(
+            project_name, asset_id, fields=["data.inputLinks"]
+        )
+        # Asset with passed id was not found
+        if not asset_doc:
+            return output
+
+ input_links = asset_doc["data"].get("inputLinks")
+ if not input_links:
+ return output
+
+ for item in input_links:
+ # Backwards compatibility for "_id" key which was replaced with
+ # "id"
+ if "_id" in item:
+ link_id = item["_id"]
+ else:
+ link_id = item["id"]
+ output.append(link_id)
+ return output
+
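A usage sketch (the asset id is hypothetical, and the helpers are assumed importable from `openpype.client.mongo.entity_links` as defined in this file):

```python
from openpype.client.mongo.entity_links import (
    get_linked_asset_ids,
    get_linked_assets,
)

asset_id = "633f1f3e9a2b4c0011aabbcc"  # hypothetical id

# Only the ids of input links ...
link_ids = get_linked_asset_ids("demo_project", asset_id=asset_id)

# ... or the linked asset documents directly.
linked_assets = get_linked_assets(
    "demo_project", asset_id=asset_id, fields=["name"]
)
```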
+
+def get_linked_assets(
+ project_name, asset_doc=None, asset_id=None, fields=None
+):
+ """Return linked assets based on passed asset document.
+
+ One of asset document or asset id must be passed.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ asset_doc (Dict[str, Any]): Asset document from database.
+ asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
+ asset document.
+ fields (Iterable[str]): Fields that should be returned. All fields are
+ returned if 'None' is passed.
+
+ Returns:
+ List[Dict[str, Any]]: Asset documents of input links for passed
+ asset doc.
+ """
+
+ if not asset_doc:
+ if not asset_id:
+ return []
+ asset_doc = get_asset_by_id(
+ project_name,
+ asset_id,
+ fields=["data.inputLinks"]
+ )
+ if not asset_doc:
+ return []
+
+ link_ids = get_linked_asset_ids(project_name, asset_doc=asset_doc)
+ if not link_ids:
+ return []
+
+ return list(get_assets(project_name, asset_ids=link_ids, fields=fields))
+
+
+def get_linked_representation_id(
+ project_name, repre_doc=None, repre_id=None, link_type=None, max_depth=None
+):
+ """Returns list of linked ids of particular type (if provided).
+
+    One of representation document or representation id must be passed.
+
+    Note:
+        Representation links now work only from representation through
+        version back to representations.
+
+ Args:
+ project_name (str): Name of project where look for links.
+ repre_doc (Dict[str, Any]): Representation document.
+ repre_id (Union[ObjectId, str]): Representation id.
+ link_type (str): Type of link (e.g. 'reference', ...).
+        max_depth (int): Limit recursion level. Unlimited when 'None' or 0
+            is passed.
+
+    Returns:
+        List[ObjectId]: Linked representation ids.
+ """
+
+ if repre_doc:
+ repre_id = repre_doc["_id"]
+
+ if repre_id:
+ repre_id = convert_id(repre_id)
+
+ if not repre_id and not repre_doc:
+ return []
+
+ version_id = None
+ if repre_doc:
+ version_id = repre_doc.get("parent")
+
+ if not version_id:
+        repre_doc = get_representation_by_id(
+            project_name, repre_id, fields=["parent"]
+        )
+        # Representation with passed id was not found
+        if repre_doc:
+            version_id = repre_doc["parent"]
+
+ if not version_id:
+ return []
+
+ version_doc = get_version_by_id(
+ project_name, version_id, fields=["type", "version_id"]
+ )
+ if version_doc["type"] == "hero_version":
+ version_id = version_doc["version_id"]
+
+ if max_depth is None:
+ max_depth = 0
+
+ match = {
+ "_id": version_id,
+ # Links are not stored to hero versions at this moment so filter
+ # is limited to just versions
+ "type": "version"
+ }
+
+ graph_lookup = {
+ "from": project_name,
+ "startWith": "$data.inputLinks.id",
+ "connectFromField": "data.inputLinks.id",
+ "connectToField": "_id",
+ "as": "outputs_recursive",
+ "depthField": "depth"
+ }
+ if max_depth != 0:
+ # We offset by -1 since 0 basically means no recursion
+ # but the recursion only happens after the initial lookup
+ # for outputs.
+ graph_lookup["maxDepth"] = max_depth - 1
+
+ query_pipeline = [
+ # Match
+ {"$match": match},
+ # Recursive graph lookup for inputs
+ {"$graphLookup": graph_lookup}
+ ]
+
+ conn = get_project_connection(project_name)
+ result = conn.aggregate(query_pipeline)
+ referenced_version_ids = _process_referenced_pipeline_result(
+ result, link_type
+ )
+ if not referenced_version_ids:
+ return []
+
+ ref_ids = conn.distinct(
+ "_id",
+ filter={
+ "parent": {"$in": list(referenced_version_ids)},
+ "type": "representation"
+ }
+ )
+
+ return list(ref_ids)
+
+
+def _process_referenced_pipeline_result(result, link_type):
+ """Filters result from pipeline for particular link_type.
+
+ Pipeline cannot use link_type directly in a query.
+
+    Returns:
+        Set[ObjectId]: Ids of referenced versions matching the link type.
+ """
+
+ referenced_version_ids = set()
+ correctly_linked_ids = set()
+ for item in result:
+ input_links = item.get("data", {}).get("inputLinks")
+ if not input_links:
+ continue
+
+ _filter_input_links(
+ input_links,
+ link_type,
+ correctly_linked_ids
+ )
+
+ # outputs_recursive in random order, sort by depth
+ outputs_recursive = item.get("outputs_recursive")
+ if not outputs_recursive:
+ continue
+
+ for output in sorted(outputs_recursive, key=lambda o: o["depth"]):
+ output_links = output.get("data", {}).get("inputLinks")
+ if not output_links and output["type"] != "hero_version":
+ continue
+
+ # Leaf
+ if output["_id"] not in correctly_linked_ids:
+ continue
+
+ _filter_input_links(
+ output_links,
+ link_type,
+ correctly_linked_ids
+ )
+
+ referenced_version_ids.add(output["_id"])
+
+ return referenced_version_ids
+
+
+def _filter_input_links(input_links, link_type, correctly_linked_ids):
+ if not input_links: # to handle hero versions
+ return
+
+ for input_link in input_links:
+ if link_type and input_link["type"] != link_type:
+ continue
+
+ link_id = input_link.get("id") or input_link.get("_id")
+ if link_id is not None:
+ correctly_linked_ids.add(link_id)
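A sketch of querying representation links through the `$graphLookup` pipeline implemented above (the representation id is hypothetical):

```python
from openpype.client.mongo.entity_links import get_linked_representation_id

linked_repre_ids = get_linked_representation_id(
    "demo_project",
    repre_id="633f1f3e9a2b4c0011aabbcc",
    link_type="reference",
    # Follow links through at most two versions; 0 or None
    # would leave the recursion unlimited.
    max_depth=2,
)
```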
diff --git a/openpype/client/mongo.py b/openpype/client/mongo/mongo.py
similarity index 98%
rename from openpype/client/mongo.py
rename to openpype/client/mongo/mongo.py
index 251041c028..2be426efeb 100644
--- a/openpype/client/mongo.py
+++ b/openpype/client/mongo/mongo.py
@@ -11,6 +11,7 @@ from bson.json_util import (
CANONICAL_JSON_OPTIONS
)
+from openpype import AYON_SERVER_ENABLED
if sys.version_info[0] == 2:
from urlparse import urlparse, parse_qs
else:
@@ -134,7 +135,7 @@ def should_add_certificate_path_to_mongo_url(mongo_url):
add_certificate = False
# Check if url 'ssl' or 'tls' are set to 'true'
for key in ("ssl", "tls"):
- if key in query and "true" in query["ssl"]:
+ if key in query and "true" in query[key]:
add_certificate = True
break
@@ -206,6 +207,8 @@ class OpenPypeMongoConnection:
@classmethod
def create_connection(cls, mongo_url, timeout=None, retry_attempts=None):
+ if AYON_SERVER_ENABLED:
+ raise RuntimeError("Created mongo connection in AYON mode")
parsed = urlparse(mongo_url)
# Force validation of scheme
if parsed.scheme not in ["mongodb", "mongodb+srv"]:
@@ -221,7 +224,7 @@ class OpenPypeMongoConnection:
"serverSelectionTimeoutMS": timeout
}
if should_add_certificate_path_to_mongo_url(mongo_url):
- kwargs["ssl_ca_certs"] = certifi.where()
+ kwargs["tlsCAFile"] = certifi.where()
mongo_client = pymongo.MongoClient(mongo_url, **kwargs)
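The second hunk above fixes a loop that always inspected `query["ssl"]` even when iterating over the `tls` key. A simplified standalone mirror of the corrected check (not the module's full function, which handles more conditions):

```python
from urllib.parse import urlparse, parse_qs

def should_add_certificate(mongo_url):
    # Either 'ssl=true' or 'tls=true' in the query string
    # should trigger adding a certificate path.
    query = parse_qs(urlparse(mongo_url).query)
    return any(
        "true" in query.get(key, [])
        for key in ("ssl", "tls")
    )

print(should_add_certificate("mongodb://host/?tls=true"))   # True
print(should_add_certificate("mongodb://host/?ssl=false"))  # False
```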
diff --git a/openpype/client/mongo/operations.py b/openpype/client/mongo/operations.py
new file mode 100644
index 0000000000..3537aa4a3d
--- /dev/null
+++ b/openpype/client/mongo/operations.py
@@ -0,0 +1,632 @@
+import re
+import copy
+import collections
+
+from bson.objectid import ObjectId
+from pymongo import DeleteOne, InsertOne, UpdateOne
+
+from openpype.client.operations_base import (
+ REMOVED_VALUE,
+ CreateOperation,
+ UpdateOperation,
+ DeleteOperation,
+ BaseOperationsSession
+)
+from .mongo import get_project_connection
+from .entities import get_project
+
+
+PROJECT_NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_"
+PROJECT_NAME_REGEX = re.compile(
+ "^[{}]+$".format(PROJECT_NAME_ALLOWED_SYMBOLS)
+)
+
+CURRENT_PROJECT_SCHEMA = "openpype:project-3.0"
+CURRENT_PROJECT_CONFIG_SCHEMA = "openpype:config-2.0"
+CURRENT_ASSET_DOC_SCHEMA = "openpype:asset-3.0"
+CURRENT_SUBSET_SCHEMA = "openpype:subset-3.0"
+CURRENT_VERSION_SCHEMA = "openpype:version-3.0"
+CURRENT_HERO_VERSION_SCHEMA = "openpype:hero_version-1.0"
+CURRENT_REPRESENTATION_SCHEMA = "openpype:representation-2.0"
+CURRENT_WORKFILE_INFO_SCHEMA = "openpype:workfile-1.0"
+CURRENT_THUMBNAIL_SCHEMA = "openpype:thumbnail-1.0"
+
+
+def _create_or_convert_to_mongo_id(mongo_id):
+ if mongo_id is None:
+ return ObjectId()
+ return ObjectId(mongo_id)
+
+
+def new_project_document(
+ project_name, project_code, config, data=None, entity_id=None
+):
+ """Create skeleton data of project document.
+
+ Args:
+ project_name (str): Name of project. Used as identifier of a project.
+        project_code (str): Shorter version of project name without spaces
+            and special characters (in most cases). Should also be
+            considered a unique name across projects.
+        config (Dict[str, Any]): Project config consisting of roots,
+            templates, applications and other project Anatomy related data.
+        data (Dict[str, Any]): Project data with information about its
+            attributes (e.g. 'fps' etc.) or integration specific keys.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of project document.
+ """
+
+ if data is None:
+ data = {}
+
+ data["code"] = project_code
+
+ return {
+ "_id": _create_or_convert_to_mongo_id(entity_id),
+ "name": project_name,
+ "type": CURRENT_PROJECT_SCHEMA,
+ "entity_data": data,
+ "config": config
+ }
+
+
+def new_asset_document(
+ name, project_id, parent_id, parents, data=None, entity_id=None
+):
+ """Create skeleton data of asset document.
+
+ Args:
+ name (str): Is considered as unique identifier of asset in project.
+        project_id (Union[str, ObjectId]): Id of project document.
+ parent_id (Union[str, ObjectId]): Id of parent asset.
+ parents (List[str]): List of parent assets names.
+ data (Dict[str, Any]): Asset document data. Empty dictionary is used
+ if not passed. Value of 'parent_id' is used to fill 'visualParent'.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of asset document.
+ """
+
+ if data is None:
+ data = {}
+ if parent_id is not None:
+ parent_id = ObjectId(parent_id)
+ data["visualParent"] = parent_id
+ data["parents"] = parents
+
+ return {
+ "_id": _create_or_convert_to_mongo_id(entity_id),
+ "type": "asset",
+ "name": name,
+ "parent": ObjectId(project_id),
+ "data": data,
+ "schema": CURRENT_ASSET_DOC_SCHEMA
+ }
+
+
+def new_subset_document(name, family, asset_id, data=None, entity_id=None):
+ """Create skeleton data of subset document.
+
+ Args:
+ name (str): Is considered as unique identifier of subset under asset.
+ family (str): Subset's family.
+ asset_id (Union[str, ObjectId]): Id of parent asset.
+ data (Dict[str, Any]): Subset document data. Empty dictionary is used
+ if not passed. Value of 'family' is used to fill 'family'.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of subset document.
+ """
+
+ if data is None:
+ data = {}
+ data["family"] = family
+ return {
+ "_id": _create_or_convert_to_mongo_id(entity_id),
+ "schema": CURRENT_SUBSET_SCHEMA,
+ "type": "subset",
+ "name": name,
+ "data": data,
+ "parent": asset_id
+ }
+
+
+def new_version_doc(version, subset_id, data=None, entity_id=None):
+ """Create skeleton data of version document.
+
+ Args:
+ version (int): Is considered as unique identifier of version
+ under subset.
+ subset_id (Union[str, ObjectId]): Id of parent subset.
+ data (Dict[str, Any]): Version document data.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of version document.
+ """
+
+ if data is None:
+ data = {}
+
+ return {
+ "_id": _create_or_convert_to_mongo_id(entity_id),
+ "schema": CURRENT_VERSION_SCHEMA,
+ "type": "version",
+ "name": int(version),
+ "parent": subset_id,
+ "data": data
+ }
+
+
+def new_hero_version_doc(version_id, subset_id, data=None, entity_id=None):
+ """Create skeleton data of hero version document.
+
+ Args:
+        version_id (ObjectId): Id of the version from which the hero
+            version is created.
+ subset_id (Union[str, ObjectId]): Id of parent subset.
+ data (Dict[str, Any]): Version document data.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+        Dict[str, Any]: Skeleton of hero version document.
+ """
+
+ if data is None:
+ data = {}
+
+ return {
+ "_id": _create_or_convert_to_mongo_id(entity_id),
+ "schema": CURRENT_HERO_VERSION_SCHEMA,
+ "type": "hero_version",
+ "version_id": version_id,
+ "parent": subset_id,
+ "data": data
+ }
+
+
+def new_representation_doc(
+ name, version_id, context, data=None, entity_id=None
+):
+ """Create skeleton data of asset document.
+
+ Args:
+ version (int): Is considered as unique identifier of version
+ under subset.
+ version_id (Union[str, ObjectId]): Id of parent version.
+ context (Dict[str, Any]): Representation context used for fill template
+ of to query.
+ data (Dict[str, Any]): Representation document data.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of version document.
+ """
+
+ if data is None:
+ data = {}
+
+ return {
+ "_id": _create_or_convert_to_mongo_id(entity_id),
+ "schema": CURRENT_REPRESENTATION_SCHEMA,
+ "type": "representation",
+ "parent": version_id,
+ "name": name,
+ "data": data,
+
+ # Imprint shortcut to context for performance reasons.
+ "context": context
+ }
+
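A sketch of chaining the skeleton builders (the ids are generated locally and the context keys are hypothetical):

```python
from bson.objectid import ObjectId

from openpype.client.mongo.operations import (
    new_version_doc,
    new_representation_doc,
)

subset_id = ObjectId()  # would normally come from an existing subset
version_doc = new_version_doc(3, subset_id, data={"comment": "cleanup"})
repre_doc = new_representation_doc(
    "exr",
    version_doc["_id"],
    context={"asset": "characterA", "subset": "renderMain", "version": 3},
)
```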
+
+def new_thumbnail_doc(data=None, entity_id=None):
+ """Create skeleton data of thumbnail document.
+
+ Args:
+ data (Dict[str, Any]): Thumbnail document data.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of thumbnail document.
+ """
+
+ if data is None:
+ data = {}
+
+ return {
+ "_id": _create_or_convert_to_mongo_id(entity_id),
+ "type": "thumbnail",
+ "schema": CURRENT_THUMBNAIL_SCHEMA,
+ "data": data
+ }
+
+
+def new_workfile_info_doc(
+ filename, asset_id, task_name, files, data=None, entity_id=None
+):
+ """Create skeleton data of workfile info document.
+
+ Workfile document is at this moment used primarily for artist notes.
+
+ Args:
+ filename (str): Filename of workfile.
+        asset_id (Union[str, ObjectId]): Id of asset under which the
+            workfile lives.
+        task_name (str): Task under which the workfile was created.
+        files (List[str]): List of rootless filepaths related to workfile.
+        data (Dict[str, Any]): Additional metadata.
+        entity_id (Union[str, ObjectId]): Predefined id of document. New id
+            is created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of workfile info document.
+ """
+
+ if not data:
+ data = {}
+
+ return {
+ "_id": _create_or_convert_to_mongo_id(entity_id),
+ "type": "workfile",
+ "parent": ObjectId(asset_id),
+ "task_name": task_name,
+ "filename": filename,
+ "data": data,
+ "files": files
+ }
+
+
+def _prepare_update_data(old_doc, new_doc, replace):
+ changes = {}
+ for key, value in new_doc.items():
+ if key not in old_doc or value != old_doc[key]:
+ changes[key] = value
+
+ if replace:
+ for key in old_doc.keys():
+ if key not in new_doc:
+ changes[key] = REMOVED_VALUE
+ return changes
+
+
+def prepare_subset_update_data(old_doc, new_doc, replace=True):
+ """Compare two subset documents and prepare update data.
+
+    Based on compared values this will create update data for
+    'MongoUpdateOperation'.
+
+ Empty output means that documents are identical.
+
+ Returns:
+ Dict[str, Any]: Changes between old and new document.
+ """
+
+ return _prepare_update_data(old_doc, new_doc, replace)
+
+
+def prepare_version_update_data(old_doc, new_doc, replace=True):
+ """Compare two version documents and prepare update data.
+
+    Based on compared values this will create update data for
+    'MongoUpdateOperation'.
+
+ Empty output means that documents are identical.
+
+ Returns:
+ Dict[str, Any]: Changes between old and new document.
+ """
+
+ return _prepare_update_data(old_doc, new_doc, replace)
+
+
+def prepare_hero_version_update_data(old_doc, new_doc, replace=True):
+ """Compare two hero version documents and prepare update data.
+
+    Based on compared values this will create update data for
+    'MongoUpdateOperation'.
+
+ Empty output means that documents are identical.
+
+ Returns:
+ Dict[str, Any]: Changes between old and new document.
+ """
+
+ return _prepare_update_data(old_doc, new_doc, replace)
+
+
+def prepare_representation_update_data(old_doc, new_doc, replace=True):
+ """Compare two representation documents and prepare update data.
+
+    Based on compared values this will create update data for
+    'MongoUpdateOperation'.
+
+ Empty output means that documents are identical.
+
+ Returns:
+ Dict[str, Any]: Changes between old and new document.
+ """
+
+ return _prepare_update_data(old_doc, new_doc, replace)
+
+
+def prepare_workfile_info_update_data(old_doc, new_doc, replace=True):
+ """Compare two workfile info documents and prepare update data.
+
+    Based on compared values this will create update data for
+    'MongoUpdateOperation'.
+
+ Empty output means that documents are identical.
+
+ Returns:
+ Dict[str, Any]: Changes between old and new document.
+ """
+
+ return _prepare_update_data(old_doc, new_doc, replace)
+
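The `prepare_*_update_data` helpers all share `_prepare_update_data`; with `replace=True`, keys missing from the new document are marked with `REMOVED_VALUE` and later turned into a `$unset`. A small sketch:

```python
from openpype.client.operations_base import REMOVED_VALUE
from openpype.client.mongo.operations import prepare_subset_update_data

old_doc = {"name": "modelMain", "data": {"family": "model"}, "obsolete": 1}
new_doc = {"name": "modelMain", "data": {"family": "model", "group": "hero"}}

changes = prepare_subset_update_data(old_doc, new_doc)
# 'data' differs, 'obsolete' disappeared, 'name' is unchanged.
assert changes["data"] == {"family": "model", "group": "hero"}
assert changes["obsolete"] is REMOVED_VALUE
assert "name" not in changes
```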
+
+class MongoCreateOperation(CreateOperation):
+ """Operation to create an entity.
+
+ Args:
+ project_name (str): On which project operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'asset', 'representation' etc.
+ data (Dict[str, Any]): Data of entity that will be created.
+ """
+
+ operation_name = "create"
+
+ def __init__(self, project_name, entity_type, data):
+ super(MongoCreateOperation, self).__init__(
+ project_name, entity_type, data
+ )
+
+ if "_id" not in self._data:
+ self._data["_id"] = ObjectId()
+ else:
+ self._data["_id"] = ObjectId(self._data["_id"])
+
+ @property
+ def entity_id(self):
+ return self._data["_id"]
+
+ def to_mongo_operation(self):
+ return InsertOne(copy.deepcopy(self._data))
+
+
+class MongoUpdateOperation(UpdateOperation):
+ """Operation to update an entity.
+
+ Args:
+ project_name (str): On which project operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'asset', 'representation' etc.
+ entity_id (Union[str, ObjectId]): Identifier of an entity.
+ update_data (Dict[str, Any]): Key -> value changes that will be set in
+ database. If value is set to 'REMOVED_VALUE' the key will be
+ removed. Only first level of dictionary is checked (on purpose).
+ """
+
+ operation_name = "update"
+
+ def __init__(self, project_name, entity_type, entity_id, update_data):
+ super(MongoUpdateOperation, self).__init__(
+ project_name, entity_type, entity_id, update_data
+ )
+
+ self._entity_id = ObjectId(self._entity_id)
+
+ def to_mongo_operation(self):
+ unset_data = {}
+ set_data = {}
+ for key, value in self._update_data.items():
+ if value is REMOVED_VALUE:
+ unset_data[key] = None
+ else:
+ set_data[key] = value
+
+ op_data = {}
+ if unset_data:
+ op_data["$unset"] = unset_data
+ if set_data:
+ op_data["$set"] = set_data
+
+ if not op_data:
+ return None
+
+ return UpdateOne(
+ {"_id": self.entity_id},
+ op_data
+ )
+
+
+class MongoDeleteOperation(DeleteOperation):
+ """Operation to delete an entity.
+
+ Args:
+ project_name (str): On which project operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'asset', 'representation' etc.
+ entity_id (Union[str, ObjectId]): Entity id that will be removed.
+ """
+
+ operation_name = "delete"
+
+ def __init__(self, project_name, entity_type, entity_id):
+ super(MongoDeleteOperation, self).__init__(
+ project_name, entity_type, entity_id
+ )
+
+ self._entity_id = ObjectId(self._entity_id)
+
+ def to_mongo_operation(self):
+ return DeleteOne({"_id": self.entity_id})
+
+
+class MongoOperationsSession(BaseOperationsSession):
+ """Session storing operations that should happen in an order.
+
+    At this moment it does not handle anything special and can be considered
+    a plain list of operations that will happen one after another. If the
+    same entity is created multiple times it is not handled in any special
+    way and document values are not validated.
+
+    All operations must be related to a single project.
+ """
+
+ def commit(self):
+ """Commit session operations."""
+
+ operations, self._operations = self._operations, []
+ if not operations:
+ return
+
+ operations_by_project = collections.defaultdict(list)
+ for operation in operations:
+ operations_by_project[operation.project_name].append(operation)
+
+ for project_name, operations in operations_by_project.items():
+ bulk_writes = []
+ for operation in operations:
+ mongo_op = operation.to_mongo_operation()
+ if mongo_op is not None:
+ bulk_writes.append(mongo_op)
+
+ if bulk_writes:
+ collection = get_project_connection(project_name)
+ collection.bulk_write(bulk_writes)
+
+ def create_entity(self, project_name, entity_type, data):
+ """Fast access to 'MongoCreateOperation'.
+
+ Returns:
+            MongoCreateOperation: Object of create operation.
+ """
+
+ operation = MongoCreateOperation(project_name, entity_type, data)
+ self.add(operation)
+ return operation
+
+ def update_entity(self, project_name, entity_type, entity_id, update_data):
+ """Fast access to 'MongoUpdateOperation'.
+
+ Returns:
+ MongoUpdateOperation: Object of update operation.
+ """
+
+ operation = MongoUpdateOperation(
+ project_name, entity_type, entity_id, update_data
+ )
+ self.add(operation)
+ return operation
+
+ def delete_entity(self, project_name, entity_type, entity_id):
+ """Fast access to 'MongoDeleteOperation'.
+
+ Returns:
+ MongoDeleteOperation: Object of delete operation.
+ """
+
+ operation = MongoDeleteOperation(project_name, entity_type, entity_id)
+ self.add(operation)
+ return operation
+
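Putting the session together with a skeleton builder, a hedged sketch (project name and id are hypothetical):

```python
from openpype.client.mongo.operations import (
    MongoOperationsSession,
    new_asset_document,
)

session = MongoOperationsSession()
asset_doc = new_asset_document(
    "characterB",
    project_id="633f1f3e9a2b4c0011aabbcc",  # hypothetical project doc id
    parent_id=None,
    parents=[],
)
session.create_entity("demo_project", "asset", asset_doc)
session.update_entity(
    "demo_project", "asset", asset_doc["_id"], {"data.fps": 25}
)
# Operations are grouped per project and sent as one bulk write.
session.commit()
```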
+
+def create_project(
+ project_name,
+ project_code,
+ library_project=False,
+):
+ """Create project using OpenPype settings.
+
+    This project creation function is not validating the project document on
+    creation. That is because the project document is created blindly with
+    only the minimum required information about the project, which is its
+    name, code, type and schema.
+
+ Entered project name must be unique and project must not exist yet.
+
+ Note:
+        This function is here to be OP v4 ready but in v3 it has more logic
+        to do. That's why the inner imports are in the body.
+
+ Args:
+ project_name(str): New project name. Should be unique.
+ project_code(str): Project's code should be unique too.
+ library_project(bool): Project is library project.
+
+ Raises:
+ ValueError: When project name already exists in MongoDB.
+
+ Returns:
+ dict: Created project document.
+ """
+
+ from openpype.settings import ProjectSettings, SaveWarningExc
+ from openpype.pipeline.schema import validate
+
+ if get_project(project_name, fields=["name"]):
+ raise ValueError("Project with name \"{}\" already exists".format(
+ project_name
+ ))
+
+ if not PROJECT_NAME_REGEX.match(project_name):
+ raise ValueError((
+ "Project name \"{}\" contain invalid characters"
+ ).format(project_name))
+
+ project_doc = {
+ "type": "project",
+ "name": project_name,
+ "data": {
+ "code": project_code,
+ "library_project": library_project
+ },
+ "schema": CURRENT_PROJECT_SCHEMA
+ }
+
+ op_session = MongoOperationsSession()
+ # Insert document with basic data
+ create_op = op_session.create_entity(
+ project_name, project_doc["type"], project_doc
+ )
+ op_session.commit()
+
+ # Load ProjectSettings for the project and save it to store all attributes
+ # and Anatomy
+ try:
+ project_settings_entity = ProjectSettings(project_name)
+ project_settings_entity.save()
+ except SaveWarningExc as exc:
+ print(str(exc))
+ except Exception:
+ op_session.delete_entity(
+ project_name, project_doc["type"], create_op.entity_id
+ )
+ op_session.commit()
+ raise
+
+ project_doc = get_project(project_name)
+
+ try:
+ # Validate created project document
+ validate(project_doc)
+ except Exception:
+        # Remove project if it is not valid
+ op_session.delete_entity(
+ project_name, project_doc["type"], create_op.entity_id
+ )
+ op_session.commit()
+ raise
+
+ return project_doc
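Finally, a sketch of calling `create_project` (names are hypothetical; the call requires a configured OpenPype settings environment):

```python
from openpype.client.mongo.operations import create_project

project_doc = create_project("demo_project", "demo")
print(project_doc["name"], project_doc["data"]["code"])
```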
diff --git a/openpype/client/operations.py b/openpype/client/operations.py
index e8c9d28636..8bc09dffd3 100644
--- a/openpype/client/operations.py
+++ b/openpype/client/operations.py
@@ -1,797 +1,24 @@
-import re
-import uuid
-import copy
-import collections
-from abc import ABCMeta, abstractmethod, abstractproperty
+from openpype import AYON_SERVER_ENABLED
-import six
-from bson.objectid import ObjectId
-from pymongo import DeleteOne, InsertOne, UpdateOne
+from .operations_base import REMOVED_VALUE
+if not AYON_SERVER_ENABLED:
+ from .mongo.operations import *
+ OperationsSession = MongoOperationsSession
-from .mongo import get_project_connection
-from .entities import get_project
-
-REMOVED_VALUE = object()
-
-PROJECT_NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_"
-PROJECT_NAME_REGEX = re.compile(
- "^[{}]+$".format(PROJECT_NAME_ALLOWED_SYMBOLS)
-)
-
-CURRENT_PROJECT_SCHEMA = "openpype:project-3.0"
-CURRENT_PROJECT_CONFIG_SCHEMA = "openpype:config-2.0"
-CURRENT_ASSET_DOC_SCHEMA = "openpype:asset-3.0"
-CURRENT_SUBSET_SCHEMA = "openpype:subset-3.0"
-CURRENT_VERSION_SCHEMA = "openpype:version-3.0"
-CURRENT_HERO_VERSION_SCHEMA = "openpype:hero_version-1.0"
-CURRENT_REPRESENTATION_SCHEMA = "openpype:representation-2.0"
-CURRENT_WORKFILE_INFO_SCHEMA = "openpype:workfile-1.0"
-CURRENT_THUMBNAIL_SCHEMA = "openpype:thumbnail-1.0"
-
-
-def _create_or_convert_to_mongo_id(mongo_id):
- if mongo_id is None:
- return ObjectId()
- return ObjectId(mongo_id)
-
-
-def new_project_document(
- project_name, project_code, config, data=None, entity_id=None
-):
- """Create skeleton data of project document.
-
- Args:
- project_name (str): Name of project. Used as identifier of a project.
- project_code (str): Shorter version of projet without spaces and
- special characters (in most of cases). Should be also considered
- as unique name across projects.
- config (Dic[str, Any]): Project config consist of roots, templates,
- applications and other project Anatomy related data.
- data (Dict[str, Any]): Project data with information about it's
- attributes (e.g. 'fps' etc.) or integration specific keys.
- entity_id (Union[str, ObjectId]): Predefined id of document. New id is
- created if not passed.
-
- Returns:
- Dict[str, Any]: Skeleton of project document.
- """
-
- if data is None:
- data = {}
-
- data["code"] = project_code
-
- return {
- "_id": _create_or_convert_to_mongo_id(entity_id),
- "name": project_name,
- "type": CURRENT_PROJECT_SCHEMA,
- "entity_data": data,
- "config": config
- }
-
-
-def new_asset_document(
- name, project_id, parent_id, parents, data=None, entity_id=None
-):
- """Create skeleton data of asset document.
-
- Args:
- name (str): Is considered as unique identifier of asset in project.
- project_id (Union[str, ObjectId]): Id of project doument.
- parent_id (Union[str, ObjectId]): Id of parent asset.
- parents (List[str]): List of parent assets names.
- data (Dict[str, Any]): Asset document data. Empty dictionary is used
- if not passed. Value of 'parent_id' is used to fill 'visualParent'.
- entity_id (Union[str, ObjectId]): Predefined id of document. New id is
- created if not passed.
-
- Returns:
- Dict[str, Any]: Skeleton of asset document.
- """
-
- if data is None:
- data = {}
- if parent_id is not None:
- parent_id = ObjectId(parent_id)
- data["visualParent"] = parent_id
- data["parents"] = parents
-
- return {
- "_id": _create_or_convert_to_mongo_id(entity_id),
- "type": "asset",
- "name": name,
- "parent": ObjectId(project_id),
- "data": data,
- "schema": CURRENT_ASSET_DOC_SCHEMA
- }
-
-
-def new_subset_document(name, family, asset_id, data=None, entity_id=None):
- """Create skeleton data of subset document.
-
- Args:
- name (str): Is considered as unique identifier of subset under asset.
- family (str): Subset's family.
- asset_id (Union[str, ObjectId]): Id of parent asset.
- data (Dict[str, Any]): Subset document data. Empty dictionary is used
- if not passed. Value of 'family' is used to fill 'family'.
- entity_id (Union[str, ObjectId]): Predefined id of document. New id is
- created if not passed.
-
- Returns:
- Dict[str, Any]: Skeleton of subset document.
- """
-
- if data is None:
- data = {}
- data["family"] = family
- return {
- "_id": _create_or_convert_to_mongo_id(entity_id),
- "schema": CURRENT_SUBSET_SCHEMA,
- "type": "subset",
- "name": name,
- "data": data,
- "parent": asset_id
- }
-
-
-def new_version_doc(version, subset_id, data=None, entity_id=None):
- """Create skeleton data of version document.
-
- Args:
- version (int): Is considered as unique identifier of version
- under subset.
- subset_id (Union[str, ObjectId]): Id of parent subset.
- data (Dict[str, Any]): Version document data.
- entity_id (Union[str, ObjectId]): Predefined id of document. New id is
- created if not passed.
-
- Returns:
- Dict[str, Any]: Skeleton of version document.
- """
-
- if data is None:
- data = {}
-
- return {
- "_id": _create_or_convert_to_mongo_id(entity_id),
- "schema": CURRENT_VERSION_SCHEMA,
- "type": "version",
- "name": int(version),
- "parent": subset_id,
- "data": data
- }
-
-
-def new_hero_version_doc(version_id, subset_id, data=None, entity_id=None):
- """Create skeleton data of hero version document.
-
- Args:
- version_id (ObjectId): Is considered as unique identifier of version
- under subset.
- subset_id (Union[str, ObjectId]): Id of parent subset.
- data (Dict[str, Any]): Version document data.
- entity_id (Union[str, ObjectId]): Predefined id of document. New id is
- created if not passed.
-
- Returns:
- Dict[str, Any]: Skeleton of version document.
- """
-
- if data is None:
- data = {}
-
- return {
- "_id": _create_or_convert_to_mongo_id(entity_id),
- "schema": CURRENT_HERO_VERSION_SCHEMA,
- "type": "hero_version",
- "version_id": version_id,
- "parent": subset_id,
- "data": data
- }
-
-
-def new_representation_doc(
- name, version_id, context, data=None, entity_id=None
-):
- """Create skeleton data of asset document.
-
- Args:
- version (int): Is considered as unique identifier of version
- under subset.
- version_id (Union[str, ObjectId]): Id of parent version.
- context (Dict[str, Any]): Representation context used for fill template
- of to query.
- data (Dict[str, Any]): Representation document data.
- entity_id (Union[str, ObjectId]): Predefined id of document. New id is
- created if not passed.
-
- Returns:
- Dict[str, Any]: Skeleton of version document.
- """
-
- if data is None:
- data = {}
-
- return {
- "_id": _create_or_convert_to_mongo_id(entity_id),
- "schema": CURRENT_REPRESENTATION_SCHEMA,
- "type": "representation",
- "parent": version_id,
- "name": name,
- "data": data,
- # Imprint shortcut to context for performance reasons.
- "context": context
- }
-
-
-def new_thumbnail_doc(data=None, entity_id=None):
- """Create skeleton data of thumbnail document.
-
- Args:
- data (Dict[str, Any]): Thumbnail document data.
- entity_id (Union[str, ObjectId]): Predefined id of document. New id is
- created if not passed.
-
- Returns:
- Dict[str, Any]: Skeleton of thumbnail document.
- """
-
- if data is None:
- data = {}
-
- return {
- "_id": _create_or_convert_to_mongo_id(entity_id),
- "type": "thumbnail",
- "schema": CURRENT_THUMBNAIL_SCHEMA,
- "data": data
- }
-
-
-def new_workfile_info_doc(
- filename, asset_id, task_name, files, data=None, entity_id=None
-):
- """Create skeleton data of workfile info document.
-
- Workfile document is at this moment used primarily for artist notes.
-
- Args:
- filename (str): Filename of workfile.
- asset_id (Union[str, ObjectId]): Id of asset under which workfile live.
- task_name (str): Task under which was workfile created.
- files (List[str]): List of rootless filepaths related to workfile.
- data (Dict[str, Any]): Additional metadata.
-
- Returns:
- Dict[str, Any]: Skeleton of workfile info document.
- """
-
- if not data:
- data = {}
-
- return {
- "_id": _create_or_convert_to_mongo_id(entity_id),
- "type": "workfile",
- "parent": ObjectId(asset_id),
- "task_name": task_name,
- "filename": filename,
- "data": data,
- "files": files
- }
-
-
-def _prepare_update_data(old_doc, new_doc, replace):
- changes = {}
- for key, value in new_doc.items():
- if key not in old_doc or value != old_doc[key]:
- changes[key] = value
-
- if replace:
- for key in old_doc.keys():
- if key not in new_doc:
- changes[key] = REMOVED_VALUE
- return changes
-
-
-def prepare_subset_update_data(old_doc, new_doc, replace=True):
- """Compare two subset documents and prepare update data.
-
- Based on compared values will create update data for 'UpdateOperation'.
-
- Empty output means that documents are identical.
-
- Returns:
- Dict[str, Any]: Changes between old and new document.
- """
-
- return _prepare_update_data(old_doc, new_doc, replace)
-
-
-def prepare_version_update_data(old_doc, new_doc, replace=True):
- """Compare two version documents and prepare update data.
-
- Based on compared values will create update data for 'UpdateOperation'.
-
- Empty output means that documents are identical.
-
- Returns:
- Dict[str, Any]: Changes between old and new document.
- """
-
- return _prepare_update_data(old_doc, new_doc, replace)
-
-
-def prepare_hero_version_update_data(old_doc, new_doc, replace=True):
- """Compare two hero version documents and prepare update data.
-
- Based on compared values will create update data for 'UpdateOperation'.
-
- Empty output means that documents are identical.
-
- Returns:
- Dict[str, Any]: Changes between old and new document.
- """
-
- return _prepare_update_data(old_doc, new_doc, replace)
-
-
-def prepare_representation_update_data(old_doc, new_doc, replace=True):
- """Compare two representation documents and prepare update data.
-
- Based on compared values will create update data for 'UpdateOperation'.
-
- Empty output means that documents are identical.
-
- Returns:
- Dict[str, Any]: Changes between old and new document.
- """
-
- return _prepare_update_data(old_doc, new_doc, replace)
-
-
-def prepare_workfile_info_update_data(old_doc, new_doc, replace=True):
- """Compare two workfile info documents and prepare update data.
-
- Based on compared values will create update data for 'UpdateOperation'.
-
- Empty output means that documents are identical.
-
- Returns:
- Dict[str, Any]: Changes between old and new document.
- """
-
- return _prepare_update_data(old_doc, new_doc, replace)
-
-
-@six.add_metaclass(ABCMeta)
-class AbstractOperation(object):
- """Base operation class.
-
- Operation represent a call into database. The call can create, change or
- remove data.
-
- Args:
- project_name (str): On which project operation will happen.
- entity_type (str): Type of entity on which change happens.
- e.g. 'asset', 'representation' etc.
- """
-
- def __init__(self, project_name, entity_type):
- self._project_name = project_name
- self._entity_type = entity_type
- self._id = str(uuid.uuid4())
-
- @property
- def project_name(self):
- return self._project_name
-
- @property
- def id(self):
- """Identifier of operation."""
-
- return self._id
-
- @property
- def entity_type(self):
- return self._entity_type
-
- @abstractproperty
- def operation_name(self):
- """Stringified type of operation."""
-
- pass
-
- @abstractmethod
- def to_mongo_operation(self):
- """Convert operation to Mongo batch operation."""
-
- pass
-
- def to_data(self):
- """Convert operation to data that can be converted to json or others.
-
- Warning:
- Current state returns ObjectId objects which cannot be parsed by
- json.
-
- Returns:
- Dict[str, Any]: Description of operation.
- """
-
- return {
- "id": self._id,
- "entity_type": self.entity_type,
- "project_name": self.project_name,
- "operation": self.operation_name
- }
-
-
-class CreateOperation(AbstractOperation):
- """Operation to create an entity.
-
- Args:
- project_name (str): On which project operation will happen.
- entity_type (str): Type of entity on which change happens.
- e.g. 'asset', 'representation' etc.
- data (Dict[str, Any]): Data of entity that will be created.
- """
-
- operation_name = "create"
-
- def __init__(self, project_name, entity_type, data):
- super(CreateOperation, self).__init__(project_name, entity_type)
-
- if not data:
- data = {}
- else:
- data = copy.deepcopy(dict(data))
-
- if "_id" not in data:
- data["_id"] = ObjectId()
- else:
- data["_id"] = ObjectId(data["_id"])
-
- self._entity_id = data["_id"]
- self._data = data
-
- def __setitem__(self, key, value):
- self.set_value(key, value)
-
- def __getitem__(self, key):
- return self.data[key]
-
- def set_value(self, key, value):
- self.data[key] = value
-
- def get(self, key, *args, **kwargs):
- return self.data.get(key, *args, **kwargs)
-
- @property
- def entity_id(self):
- return self._entity_id
-
- @property
- def data(self):
- return self._data
-
- def to_mongo_operation(self):
- return InsertOne(copy.deepcopy(self._data))
-
- def to_data(self):
- output = super(CreateOperation, self).to_data()
- output["data"] = copy.deepcopy(self.data)
- return output
-
-
-class UpdateOperation(AbstractOperation):
- """Operation to update an entity.
-
- Args:
- project_name (str): On which project operation will happen.
- entity_type (str): Type of entity on which change happens.
- e.g. 'asset', 'representation' etc.
- entity_id (Union[str, ObjectId]): Identifier of an entity.
- update_data (Dict[str, Any]): Key -> value changes that will be set in
- database. If value is set to 'REMOVED_VALUE' the key will be
- removed. Only first level of dictionary is checked (on purpose).
- """
-
- operation_name = "update"
-
- def __init__(self, project_name, entity_type, entity_id, update_data):
- super(UpdateOperation, self).__init__(project_name, entity_type)
-
- self._entity_id = ObjectId(entity_id)
- self._update_data = update_data
-
- @property
- def entity_id(self):
- return self._entity_id
-
- @property
- def update_data(self):
- return self._update_data
-
- def to_mongo_operation(self):
- unset_data = {}
- set_data = {}
- for key, value in self._update_data.items():
- if value is REMOVED_VALUE:
- unset_data[key] = None
- else:
- set_data[key] = value
-
- op_data = {}
- if unset_data:
- op_data["$unset"] = unset_data
- if set_data:
- op_data["$set"] = set_data
-
- if not op_data:
- return None
-
- return UpdateOne(
- {"_id": self.entity_id},
- op_data
- )
-
- def to_data(self):
- changes = {}
- for key, value in self._update_data.items():
- if value is REMOVED_VALUE:
- value = None
- changes[key] = value
-
- output = super(UpdateOperation, self).to_data()
- output.update({
- "entity_id": self.entity_id,
- "changes": changes
- })
- return output
-
-
-class DeleteOperation(AbstractOperation):
- """Operation to delete an entity.
-
- Args:
- project_name (str): On which project operation will happen.
- entity_type (str): Type of entity on which change happens.
- e.g. 'asset', 'representation' etc.
- entity_id (Union[str, ObjectId]): Entity id that will be removed.
- """
-
- operation_name = "delete"
-
- def __init__(self, project_name, entity_type, entity_id):
- super(DeleteOperation, self).__init__(project_name, entity_type)
-
- self._entity_id = ObjectId(entity_id)
-
- @property
- def entity_id(self):
- return self._entity_id
-
- def to_mongo_operation(self):
- return DeleteOne({"_id": self.entity_id})
-
- def to_data(self):
- output = super(DeleteOperation, self).to_data()
- output["entity_id"] = self.entity_id
- return output
-
-
-class OperationsSession(object):
- """Session storing operations that should happen in an order.
-
- At this moment does not handle anything special can be sonsidered as
- stupid list of operations that will happen after each other. If creation
- of same entity is there multiple times it's handled in any way and document
- values are not validated.
-
- All operations must be related to single project.
-
- Args:
- project_name (str): Project name to which are operations related.
- """
-
- def __init__(self):
- self._operations = []
-
- def add(self, operation):
- """Add operation to be processed.
-
- Args:
- operation (BaseOperation): Operation that should be processed.
- """
- if not isinstance(
- operation,
- (CreateOperation, UpdateOperation, DeleteOperation)
- ):
- raise TypeError("Expected Operation object got {}".format(
- str(type(operation))
- ))
-
- self._operations.append(operation)
-
- def append(self, operation):
- """Add operation to be processed.
-
- Args:
- operation (BaseOperation): Operation that should be processed.
- """
-
- self.add(operation)
-
- def extend(self, operations):
- """Add operations to be processed.
-
- Args:
- operations (List[BaseOperation]): Operations that should be
- processed.
- """
-
- for operation in operations:
- self.add(operation)
-
- def remove(self, operation):
- """Remove operation."""
-
- self._operations.remove(operation)
-
- def clear(self):
- """Clear all registered operations."""
-
- self._operations = []
-
- def to_data(self):
- return [
- operation.to_data()
- for operation in self._operations
- ]
-
- def commit(self):
- """Commit session operations."""
-
- operations, self._operations = self._operations, []
- if not operations:
- return
-
- operations_by_project = collections.defaultdict(list)
- for operation in operations:
- operations_by_project[operation.project_name].append(operation)
-
- for project_name, operations in operations_by_project.items():
- bulk_writes = []
- for operation in operations:
- mongo_op = operation.to_mongo_operation()
- if mongo_op is not None:
- bulk_writes.append(mongo_op)
-
- if bulk_writes:
- collection = get_project_connection(project_name)
- collection.bulk_write(bulk_writes)
-
- def create_entity(self, project_name, entity_type, data):
- """Fast access to 'CreateOperation'.
-
- Returns:
- CreateOperation: Object of update operation.
- """
-
- operation = CreateOperation(project_name, entity_type, data)
- self.add(operation)
- return operation
-
- def update_entity(self, project_name, entity_type, entity_id, update_data):
- """Fast access to 'UpdateOperation'.
-
- Returns:
- UpdateOperation: Object of update operation.
- """
-
- operation = UpdateOperation(
- project_name, entity_type, entity_id, update_data
- )
- self.add(operation)
- return operation
-
- def delete_entity(self, project_name, entity_type, entity_id):
- """Fast access to 'DeleteOperation'.
-
- Returns:
- DeleteOperation: Object of delete operation.
- """
-
- operation = DeleteOperation(project_name, entity_type, entity_id)
- self.add(operation)
- return operation
-
-
-def create_project(
- project_name,
- project_code,
- library_project=False,
-):
- """Create project using OpenPype settings.
-
- This project creation function is not validating project document on
- creation. It is because project document is created blindly with only
- minimum required information about project which is it's name, code, type
- and schema.
-
- Entered project name must be unique and project must not exist yet.
-
- Note:
- This function is here to be OP v4 ready but in v3 has more logic
- to do. That's why inner imports are in the body.
-
- Args:
- project_name(str): New project name. Should be unique.
- project_code(str): Project's code should be unique too.
- library_project(bool): Project is library project.
-
- Raises:
- ValueError: When project name already exists in MongoDB.
-
- Returns:
- dict: Created project document.
- """
-
- from openpype.settings import ProjectSettings, SaveWarningExc
- from openpype.pipeline.schema import validate
-
- if get_project(project_name, fields=["name"]):
- raise ValueError("Project with name \"{}\" already exists".format(
- project_name
- ))
-
- if not PROJECT_NAME_REGEX.match(project_name):
- raise ValueError((
- "Project name \"{}\" contain invalid characters"
- ).format(project_name))
-
- project_doc = {
- "type": "project",
- "name": project_name,
- "data": {
- "code": project_code,
- "library_project": library_project,
- },
- "schema": CURRENT_PROJECT_SCHEMA
- }
-
- op_session = OperationsSession()
- # Insert document with basic data
- create_op = op_session.create_entity(
- project_name, project_doc["type"], project_doc
+else:
+ from ayon_api.server_api import (
+ PROJECT_NAME_ALLOWED_SYMBOLS,
+ PROJECT_NAME_REGEX,
+ )
+ from .server.operations import *
+ from .mongo.operations import (
+ CURRENT_PROJECT_SCHEMA,
+ CURRENT_PROJECT_CONFIG_SCHEMA,
+ CURRENT_ASSET_DOC_SCHEMA,
+ CURRENT_SUBSET_SCHEMA,
+ CURRENT_VERSION_SCHEMA,
+ CURRENT_HERO_VERSION_SCHEMA,
+ CURRENT_REPRESENTATION_SCHEMA,
+ CURRENT_WORKFILE_INFO_SCHEMA,
+ CURRENT_THUMBNAIL_SCHEMA
)
- op_session.commit()
-
- # Load ProjectSettings for the project and save it to store all attributes
- # and Anatomy
- try:
- project_settings_entity = ProjectSettings(project_name)
- project_settings_entity.save()
- except SaveWarningExc as exc:
- print(str(exc))
- except Exception:
- op_session.delete_entity(
- project_name, project_doc["type"], create_op.entity_id
- )
- op_session.commit()
- raise
-
- project_doc = get_project(project_name)
-
- try:
- # Validate created project document
- validate(project_doc)
- except Exception:
- # Remove project if is not valid
- op_session.delete_entity(
- project_name, project_doc["type"], create_op.entity_id
- )
- op_session.commit()
- raise
-
- return project_doc
diff --git a/openpype/client/operations_base.py b/openpype/client/operations_base.py
new file mode 100644
index 0000000000..887b237b1c
--- /dev/null
+++ b/openpype/client/operations_base.py
@@ -0,0 +1,289 @@
+import uuid
+import copy
+from abc import ABCMeta, abstractmethod, abstractproperty
+import six
+
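+# Sentinel used in update data: setting a key to 'REMOVED_VALUE' marks it
+# for removal from the entity instead of assigning a value.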
+REMOVED_VALUE = object()
+
+
+@six.add_metaclass(ABCMeta)
+class AbstractOperation(object):
+ """Base operation class.
+
+    An operation represents a single call into the database. The call can
+    create, change or remove data.
+
+ Args:
+ project_name (str): On which project operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'asset', 'representation' etc.
+ """
+
+ def __init__(self, project_name, entity_type):
+ self._project_name = project_name
+ self._entity_type = entity_type
+ self._id = str(uuid.uuid4())
+
+ @property
+ def project_name(self):
+ return self._project_name
+
+ @property
+ def id(self):
+ """Identifier of operation."""
+
+ return self._id
+
+ @property
+ def entity_type(self):
+ return self._entity_type
+
+ @abstractproperty
+ def operation_name(self):
+ """Stringified type of operation."""
+
+ pass
+
+ def to_data(self):
+ """Convert operation to data that can be converted to json or others.
+
+ Warning:
+ Current state returns ObjectId objects which cannot be parsed by
+ json.
+
+ Returns:
+ Dict[str, Any]: Description of operation.
+ """
+
+ return {
+ "id": self._id,
+ "entity_type": self.entity_type,
+ "project_name": self.project_name,
+ "operation": self.operation_name
+ }
+
+
+class CreateOperation(AbstractOperation):
+ """Operation to create an entity.
+
+ Args:
+ project_name (str): On which project operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'asset', 'representation' etc.
+ data (Dict[str, Any]): Data of entity that will be created.
+ """
+
+ operation_name = "create"
+
+ def __init__(self, project_name, entity_type, data):
+ super(CreateOperation, self).__init__(project_name, entity_type)
+
+ if not data:
+ data = {}
+ else:
+ data = copy.deepcopy(dict(data))
+ self._data = data
+
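+    # Dict-like helpers so callers can read and adjust the entity data
+    # before the operation is committed.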
+ def __setitem__(self, key, value):
+ self.set_value(key, value)
+
+ def __getitem__(self, key):
+ return self.data[key]
+
+ def set_value(self, key, value):
+ self.data[key] = value
+
+ def get(self, key, *args, **kwargs):
+ return self.data.get(key, *args, **kwargs)
+
+ @abstractproperty
+ def entity_id(self):
+ pass
+
+ @property
+ def data(self):
+ return self._data
+
+ def to_data(self):
+ output = super(CreateOperation, self).to_data()
+ output["data"] = copy.deepcopy(self.data)
+ return output
+
+
+class UpdateOperation(AbstractOperation):
+ """Operation to update an entity.
+
+ Args:
+ project_name (str): On which project operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'asset', 'representation' etc.
+ entity_id (Union[str, ObjectId]): Identifier of an entity.
+ update_data (Dict[str, Any]): Key -> value changes that will be set in
+ database. If value is set to 'REMOVED_VALUE' the key will be
+ removed. Only first level of dictionary is checked (on purpose).
+ """
+
+ operation_name = "update"
+
+ def __init__(self, project_name, entity_type, entity_id, update_data):
+ super(UpdateOperation, self).__init__(project_name, entity_type)
+
+ self._entity_id = entity_id
+ self._update_data = update_data
+
+ @property
+ def entity_id(self):
+ return self._entity_id
+
+ @property
+ def update_data(self):
+ return self._update_data
+
+ def to_data(self):
+ changes = {}
+ for key, value in self._update_data.items():
+ if value is REMOVED_VALUE:
+ value = None
+ changes[key] = value
+
+ output = super(UpdateOperation, self).to_data()
+ output.update({
+ "entity_id": self.entity_id,
+ "changes": changes
+ })
+ return output
+
+
+class DeleteOperation(AbstractOperation):
+ """Operation to delete an entity.
+
+ Args:
+ project_name (str): On which project operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'asset', 'representation' etc.
+ entity_id (Union[str, ObjectId]): Entity id that will be removed.
+ """
+
+ operation_name = "delete"
+
+ def __init__(self, project_name, entity_type, entity_id):
+ super(DeleteOperation, self).__init__(project_name, entity_type)
+
+ self._entity_id = entity_id
+
+ @property
+ def entity_id(self):
+ return self._entity_id
+
+ def to_data(self):
+ output = super(DeleteOperation, self).to_data()
+ output["entity_id"] = self.entity_id
+ return output
+
+
+class BaseOperationsSession(object):
+ """Session storing operations that should happen in an order.
+
+    At this moment the session does not handle anything special and can be
+    considered a plain list of operations that are executed one after
+    another. Creating the same entity multiple times is not handled in any
+    way and document values are not validated.
+ """
+
+ def __init__(self):
+ self._operations = []
+
+ def __len__(self):
+ return len(self._operations)
+
+ def add(self, operation):
+ """Add operation to be processed.
+
+ Args:
+ operation (BaseOperation): Operation that should be processed.
+ """
+ if not isinstance(
+ operation,
+ (CreateOperation, UpdateOperation, DeleteOperation)
+ ):
+ raise TypeError("Expected Operation object got {}".format(
+ str(type(operation))
+ ))
+
+ self._operations.append(operation)
+
+ def append(self, operation):
+ """Add operation to be processed.
+
+ Args:
+ operation (BaseOperation): Operation that should be processed.
+ """
+
+ self.add(operation)
+
+ def extend(self, operations):
+ """Add operations to be processed.
+
+ Args:
+ operations (List[BaseOperation]): Operations that should be
+ processed.
+ """
+
+ for operation in operations:
+ self.add(operation)
+
+ def remove(self, operation):
+ """Remove operation."""
+
+ self._operations.remove(operation)
+
+ def clear(self):
+ """Clear all registered operations."""
+
+ self._operations = []
+
+ def to_data(self):
+ return [
+ operation.to_data()
+ for operation in self._operations
+ ]
+
+ @abstractmethod
+ def commit(self):
+ """Commit session operations."""
+ pass
+
+ def create_entity(self, project_name, entity_type, data):
+ """Fast access to 'CreateOperation'.
+
+ Returns:
+ CreateOperation: Object of update operation.
+ """
+
+ operation = CreateOperation(project_name, entity_type, data)
+ self.add(operation)
+ return operation
+
+ def update_entity(self, project_name, entity_type, entity_id, update_data):
+ """Fast access to 'UpdateOperation'.
+
+ Returns:
+ UpdateOperation: Object of update operation.
+ """
+
+ operation = UpdateOperation(
+ project_name, entity_type, entity_id, update_data
+ )
+ self.add(operation)
+ return operation
+
+ def delete_entity(self, project_name, entity_type, entity_id):
+ """Fast access to 'DeleteOperation'.
+
+ Returns:
+ DeleteOperation: Object of delete operation.
+ """
+
+ operation = DeleteOperation(project_name, entity_type, entity_id)
+ self.add(operation)
+ return operation
diff --git a/openpype/client/server/__init__.py b/openpype/client/server/__init__.py
new file mode 100644
index 0000000000..e69de29bb2
diff --git a/openpype/client/server/constants.py b/openpype/client/server/constants.py
new file mode 100644
index 0000000000..1d3f94c702
--- /dev/null
+++ b/openpype/client/server/constants.py
@@ -0,0 +1,18 @@
+# --- Folders ---
+DEFAULT_FOLDER_FIELDS = {
+ "id",
+ "name",
+ "path",
+ "parentId",
+ "active",
+ "parents",
+ "thumbnailId"
+}
+
+REPRESENTATION_FILES_FIELDS = {
+ "files.name",
+ "files.hash",
+ "files.id",
+ "files.path",
+ "files.size",
+}
diff --git a/openpype/client/server/conversion_utils.py b/openpype/client/server/conversion_utils.py
new file mode 100644
index 0000000000..dc95bbeda5
--- /dev/null
+++ b/openpype/client/server/conversion_utils.py
@@ -0,0 +1,1328 @@
+import os
+import arrow
+import collections
+import json
+
+import six
+
+from openpype.client.operations_base import REMOVED_VALUE
+from openpype.client.mongo.operations import (
+ CURRENT_PROJECT_SCHEMA,
+ CURRENT_ASSET_DOC_SCHEMA,
+ CURRENT_SUBSET_SCHEMA,
+ CURRENT_VERSION_SCHEMA,
+ CURRENT_HERO_VERSION_SCHEMA,
+ CURRENT_REPRESENTATION_SCHEMA,
+ CURRENT_WORKFILE_INFO_SCHEMA,
+)
+from .constants import REPRESENTATION_FILES_FIELDS
+from .utils import create_entity_id, prepare_entity_changes
+
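+# Mappings of v3 document fields to the v4 entity fields needed to rebuild
+# them. Values are sets because one v3 field may map to multiple v4 fields.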
+# --- Project entity ---
+PROJECT_FIELDS_MAPPING_V3_V4 = {
+ "_id": {"name"},
+ "name": {"name"},
+ "data": {"data", "code"},
+ "data.library_project": {"library"},
+ "data.code": {"code"},
+ "data.active": {"active"},
+}
+
+# TODO this should not be hardcoded but received from server!!!
+# --- Folder entity ---
+FOLDER_FIELDS_MAPPING_V3_V4 = {
+ "_id": {"id"},
+ "name": {"name"},
+ "label": {"label"},
+ "data": {
+ "parentId", "parents", "active", "tasks", "thumbnailId"
+ },
+ "data.visualParent": {"parentId"},
+ "data.parents": {"parents"},
+ "data.active": {"active"},
+ "data.thumbnail_id": {"thumbnailId"},
+ "data.entityType": {"folderType"}
+}
+
+# --- Subset entity ---
+SUBSET_FIELDS_MAPPING_V3_V4 = {
+ "_id": {"id"},
+ "name": {"name"},
+ "data.active": {"active"},
+ "parent": {"folderId"}
+}
+
+# --- Version entity ---
+VERSION_FIELDS_MAPPING_V3_V4 = {
+ "_id": {"id"},
+ "name": {"version"},
+ "parent": {"productId"}
+}
+
+# --- Representation entity ---
+REPRESENTATION_FIELDS_MAPPING_V3_V4 = {
+ "_id": {"id"},
+ "name": {"name"},
+ "parent": {"versionId"},
+ "context": {"context"},
+ "files": {"files"},
+}
+
+
+def project_fields_v3_to_v4(fields, con):
+ """Convert project fields from v3 to v4 structure.
+
+ Args:
+ fields (Union[Iterable(str), None]): fields to be converted.
+
+ Returns:
+ Union[Set(str), None]: Converted fields to v4 fields.
+ """
+
+ # TODO config fields
+ # - config.apps
+ # - config.groups
+ if not fields:
+ return None
+
+ project_attribs = con.get_attributes_for_type("project")
+ output = set()
+ for field in fields:
+ # If config is needed the rest api call must be used
+ if field.startswith("config"):
+ return None
+
+ if field in PROJECT_FIELDS_MAPPING_V3_V4:
+ output |= PROJECT_FIELDS_MAPPING_V3_V4[field]
+ if field == "data":
+ output |= {
+ "attrib.{}".format(attr)
+ for attr in project_attribs
+ }
+
+ elif field.startswith("data"):
+ field_parts = field.split(".")
+ field_parts.pop(0)
+ data_key = ".".join(field_parts)
+ if data_key in project_attribs:
+ output.add("attrib.{}".format(data_key))
+ else:
+ output.add("data")
+ print("Requested specific key from data {}".format(data_key))
+
+ else:
+ raise ValueError("Unknown field mapping for {}".format(field))
+
+ if "name" not in output:
+ output.add("name")
+ return output
+
+
+def _get_default_template_name(templates):
+ default_template = None
+ for name, template in templates.items():
+ if name == "default":
+ return "default"
+
+ if default_template is None:
+ default_template = name
+
+ return default_template
+
+
+def _template_replacements_to_v3(template):
+ return (
+ template
+ .replace("{folder[name]}", "{asset}")
+ .replace("{product[name]}", "{subset}")
+ .replace("{product[type]}", "{family}")
+ )
+
+
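+# v4 templates store 'directory' and 'file' separately; v3 expects 'folder',
+# 'file' and a combined 'path'.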
+def _convert_template_item(template):
+ # Others won't have 'directory'
+ if "directory" not in template:
+ return
+ folder = _template_replacements_to_v3(template.pop("directory"))
+ template["folder"] = folder
+ template["file"] = _template_replacements_to_v3(template["file"])
+ template["path"] = "/".join(
+ (folder, template["file"])
+ )
+
+
+def _fill_template_category(templates, cat_templates, cat_key):
+ default_template_name = _get_default_template_name(cat_templates)
+ for template_name, cat_template in cat_templates.items():
+ _convert_template_item(cat_template)
+ if template_name == default_template_name:
+ templates[cat_key] = cat_template
+ else:
+ new_name = "{}_{}".format(cat_key, template_name)
+ templates["others"][new_name] = cat_template
+
+
+def convert_v4_project_to_v3(project):
+ """Convert Project entity data from v4 structure to v3 structure.
+
+ Args:
+ project (Dict[str, Any]): Project entity queried from v4 server.
+
+ Returns:
+ Dict[str, Any]: Project converted to v3 structure.
+ """
+
+ if not project:
+ return project
+
+ project_name = project["name"]
+ output = {
+ "_id": project_name,
+ "name": project_name,
+ "schema": CURRENT_PROJECT_SCHEMA,
+ "type": "project"
+ }
+
+ data = project.get("data") or {}
+ attribs = project.get("attrib") or {}
+ apps_attr = attribs.pop("applications", None) or []
+ applications = [
+ {"name": app_name}
+ for app_name in apps_attr
+ ]
+ data.update(attribs)
+ if "tools" in data:
+ data["tools_env"] = data.pop("tools")
+
+ data["entityType"] = "Project"
+
+ config = {}
+ project_config = project.get("config")
+
+ if project_config:
+ config["apps"] = applications
+ config["roots"] = project_config["roots"]
+
+ templates = project_config["templates"]
+ templates["defaults"] = templates.pop("common", None) or {}
+
+ others_templates = templates.pop("others", None) or {}
+ new_others_templates = {}
+ templates["others"] = new_others_templates
+ for name, template in others_templates.items():
+ _convert_template_item(template)
+ new_others_templates[name] = template
+
+ for key in (
+ "work",
+ "publish",
+ "hero"
+ ):
+ cat_templates = templates.pop(key)
+ _fill_template_category(templates, cat_templates, key)
+
+ delivery_templates = templates.pop("delivery", None) or {}
+ new_delivery_templates = {}
+ for name, delivery_template in delivery_templates.items():
+ new_delivery_templates[name] = "/".join(
+ (delivery_template["directory"], delivery_template["file"])
+ )
+ templates["delivery"] = new_delivery_templates
+
+ config["templates"] = templates
+
+ if "taskTypes" in project:
+ task_types = project["taskTypes"]
+ new_task_types = {}
+ for task_type in task_types:
+ name = task_type.pop("name")
+ new_task_types[name] = task_type
+
+ config["tasks"] = new_task_types
+
+ if config:
+ output["config"] = config
+
+ for data_key, key in (
+ ("library_project", "library"),
+ ("code", "code"),
+ ("active", "active")
+ ):
+ if key in project:
+ data[data_key] = project[key]
+
+ if "attrib" in project:
+ for key, value in project["attrib"].items():
+ data[key] = value
+
+ if data:
+ output["data"] = data
+ return output
+
+
+def folder_fields_v3_to_v4(fields, con):
+ """Convert folder fields from v3 to v4 structure.
+
+ Args:
+ fields (Union[Iterable(str), None]): fields to be converted.
+
+ Returns:
+ Union[Set(str), None]: Converted fields to v4 fields.
+ """
+
+ if not fields:
+ return None
+
+ folder_attributes = con.get_attributes_for_type("folder")
+ output = set()
+ for field in fields:
+ if field in ("schema", "type", "parent"):
+ continue
+
+ if field in FOLDER_FIELDS_MAPPING_V3_V4:
+ output |= FOLDER_FIELDS_MAPPING_V3_V4[field]
+ if field == "data":
+ output |= {
+ "attrib.{}".format(attr)
+ for attr in folder_attributes
+ }
+
+ elif field.startswith("data"):
+ field_parts = field.split(".")
+ field_parts.pop(0)
+ data_key = ".".join(field_parts)
+ if data_key == "label":
+ output.add("name")
+
+ elif data_key in ("icon", "color"):
+ continue
+
+ elif data_key.startswith("tasks"):
+ output.add("tasks")
+
+ elif data_key in folder_attributes:
+ output.add("attrib.{}".format(data_key))
+
+ else:
+ output.add("data")
+ print("Requested specific key from data {}".format(data_key))
+
+ else:
+ raise ValueError("Unknown field mapping for {}".format(field))
+
+ if "id" not in output:
+ output.add("id")
+ return output
+
+
+def convert_v4_tasks_to_v3(tasks):
+ """Convert v4 task item to v3 task.
+
+ Args:
+        tasks (List[Dict[str, Any]]): Task entities.
+
+ Returns:
+ Dict[str, Dict[str, Any]]: Tasks in v3 variant ready for v3 asset.
+ """
+
+ output = {}
+ for task in tasks:
+ task_name = task["name"]
+ new_task = {
+ "type": task["taskType"]
+ }
+ output[task_name] = new_task
+ return output
+
+
+def convert_v4_folder_to_v3(folder, project_name):
+ """Convert v4 folder to v3 asset.
+
+ Args:
+ folder (Dict[str, Any]): Folder entity data.
+ project_name (str): Project name from which folder was queried.
+
+ Returns:
+ Dict[str, Any]: Converted v4 folder to v3 asset.
+ """
+
+ output = {
+ "_id": folder["id"],
+ "parent": project_name,
+ "type": "asset",
+ "schema": CURRENT_ASSET_DOC_SCHEMA
+ }
+
+ output_data = folder.get("data") or {}
+
+ if "name" in folder:
+ output["name"] = folder["name"]
+ output_data["label"] = folder["name"]
+
+ if "folderType" in folder:
+ output_data["entityType"] = folder["folderType"]
+
+ for src_key, dst_key in (
+ ("parentId", "visualParent"),
+ ("active", "active"),
+ ("thumbnailId", "thumbnail_id"),
+ ("parents", "parents"),
+ ):
+ if src_key in folder:
+ output_data[dst_key] = folder[src_key]
+
+ if "attrib" in folder:
+ output_data.update(folder["attrib"])
+
+ if "tools" in output_data:
+ output_data["tools_env"] = output_data.pop("tools")
+
+ if "tasks" in folder:
+ output_data["tasks"] = convert_v4_tasks_to_v3(folder["tasks"])
+
+ output["data"] = output_data
+
+ return output
+
+
+def subset_fields_v3_to_v4(fields, con):
+ """Convert subset fields from v3 to v4 structure.
+
+ Args:
+ fields (Union[Iterable(str), None]): fields to be converted.
+
+ Returns:
+ Union[Set(str), None]: Converted fields to v4 fields.
+ """
+
+ if not fields:
+ return None
+
+ product_attributes = con.get_attributes_for_type("product")
+
+ output = set()
+ for field in fields:
+ if field in ("schema", "type"):
+ continue
+
+ if field in SUBSET_FIELDS_MAPPING_V3_V4:
+ output |= SUBSET_FIELDS_MAPPING_V3_V4[field]
+
+ elif field == "data":
+ output.add("productType")
+ output.add("active")
+ output |= {
+ "attrib.{}".format(attr)
+ for attr in product_attributes
+ }
+
+ elif field.startswith("data"):
+ field_parts = field.split(".")
+ field_parts.pop(0)
+ data_key = ".".join(field_parts)
+ if data_key in ("family", "families"):
+ output.add("productType")
+
+ elif data_key in product_attributes:
+ output.add("attrib.{}".format(data_key))
+
+ else:
+ output.add("data")
+ print("Requested specific key from data {}".format(data_key))
+
+ else:
+ raise ValueError("Unknown field mapping for {}".format(field))
+
+ if "id" not in output:
+ output.add("id")
+ return output
+
+
+def convert_v4_subset_to_v3(subset):
+ output = {
+ "_id": subset["id"],
+ "type": "subset",
+ "schema": CURRENT_SUBSET_SCHEMA
+ }
+ if "folderId" in subset:
+ output["parent"] = subset["folderId"]
+
+ output_data = subset.get("data") or {}
+
+ if "name" in subset:
+ output["name"] = subset["name"]
+
+ if "active" in subset:
+ output_data["active"] = subset["active"]
+
+ if "attrib" in subset:
+ attrib = subset["attrib"]
+ if "productGroup" in attrib:
+ attrib["subsetGroup"] = attrib.pop("productGroup")
+ output_data.update(attrib)
+
+ family = subset.get("productType")
+ if family:
+ output_data["family"] = family
+ output_data["families"] = [family]
+
+ output["data"] = output_data
+
+ return output
+
+
+def version_fields_v3_to_v4(fields, con):
+ """Convert version fields from v3 to v4 structure.
+
+ Args:
+ fields (Union[Iterable(str), None]): fields to be converted.
+
+ Returns:
+ Union[Set(str), None]: Converted fields to v4 fields.
+ """
+
+ if not fields:
+ return None
+
+ version_attributes = con.get_attributes_for_type("version")
+
+ output = set()
+ for field in fields:
+ if field in ("type", "schema", "version_id"):
+ continue
+
+ if field in VERSION_FIELDS_MAPPING_V3_V4:
+ output |= VERSION_FIELDS_MAPPING_V3_V4[field]
+
+ elif field == "data":
+ output |= {
+ "attrib.{}".format(attr)
+ for attr in version_attributes
+ }
+ output |= {
+ "author",
+ "createdAt",
+ "thumbnailId",
+ }
+
+ elif field.startswith("data"):
+ field_parts = field.split(".")
+ field_parts.pop(0)
+ data_key = ".".join(field_parts)
+ if data_key in version_attributes:
+ output.add("attrib.{}".format(data_key))
+
+ elif data_key == "thumbnail_id":
+ output.add("thumbnailId")
+
+ elif data_key == "time":
+ output.add("createdAt")
+
+ elif data_key == "author":
+ output.add("author")
+
+ elif data_key in ("tags", ):
+ continue
+
+ else:
+ output.add("data")
+ print("Requested specific key from data {}".format(data_key))
+
+ else:
+ raise ValueError("Unknown field mapping for {}".format(field))
+
+ if "id" not in output:
+ output.add("id")
+ return output
+
+
+def convert_v4_version_to_v3(version):
+ """Convert v4 version entity to v4 version.
+
+ Args:
+ version (Dict[str, Any]): Queried v4 version entity.
+
+ Returns:
+        Dict[str, Any]: Converted version entity to v3 structure.
+ """
+
+ version_num = version["version"]
+ if version_num < 0:
+ output = {
+ "_id": version["id"],
+ "type": "hero_version",
+ "schema": CURRENT_HERO_VERSION_SCHEMA,
+ }
+ if "productId" in version:
+ output["parent"] = version["productId"]
+
+ if "data" in version:
+ output["data"] = version["data"]
+ return output
+
+ output = {
+ "_id": version["id"],
+ "type": "version",
+ "name": version_num,
+ "schema": CURRENT_VERSION_SCHEMA
+ }
+ if "productId" in version:
+ output["parent"] = version["productId"]
+
+ output_data = version.get("data") or {}
+ if "attrib" in version:
+ output_data.update(version["attrib"])
+
+ for src_key, dst_key in (
+ ("active", "active"),
+ ("thumbnailId", "thumbnail_id"),
+ ("author", "author")
+ ):
+ if src_key in version:
+ output_data[dst_key] = version[src_key]
+
+ if "createdAt" in version:
+ created_at = arrow.get(version["createdAt"])
+ output_data["time"] = created_at.strftime("%Y%m%dT%H%M%SZ")
+
+ output["data"] = output_data
+
+ return output
+
+
+def representation_fields_v3_to_v4(fields, con):
+ """Convert representation fields from v3 to v4 structure.
+
+ Args:
+ fields (Union[Iterable(str), None]): fields to be converted.
+
+ Returns:
+ Union[Set(str), None]: Converted fields to v4 fields.
+ """
+
+ if not fields:
+ return None
+
+ representation_attributes = con.get_attributes_for_type("representation")
+
+ output = set()
+ for field in fields:
+ if field in ("type", "schema"):
+ continue
+
+ if field in REPRESENTATION_FIELDS_MAPPING_V3_V4:
+ output |= REPRESENTATION_FIELDS_MAPPING_V3_V4[field]
+
+ elif field.startswith("context"):
+ output.add("context")
+
+ # TODO: 'files' can have specific attributes but the keys in v3 and v4
+ # are not the same (content is not the same)
+ elif field.startswith("files"):
+ output |= REPRESENTATION_FILES_FIELDS
+
+ elif field.startswith("data"):
+ output |= {
+ "attrib.{}".format(attr)
+ for attr in representation_attributes
+ }
+
+ else:
+ raise ValueError("Unknown field mapping for {}".format(field))
+
+ if "id" not in output:
+ output.add("id")
+ return output
+
+
+def convert_v4_representation_to_v3(representation):
+ """Convert v4 representation to v3 representation.
+
+ Args:
+ representation (Dict[str, Any]): Queried representation from v4 server.
+
+ Returns:
+ Dict[str, Any]: Converted representation to v3 structure.
+ """
+
+ output = {
+ "type": "representation",
+ "schema": CURRENT_REPRESENTATION_SCHEMA,
+ }
+ if "id" in representation:
+ output["_id"] = representation["id"]
+
+ for v3_key, v4_key in (
+ ("name", "name"),
+ ("parent", "versionId")
+ ):
+ if v4_key in representation:
+ output[v3_key] = representation[v4_key]
+
+ if "context" in representation:
+ context = representation["context"]
+ if isinstance(context, six.string_types):
+ context = json.loads(context)
+
+ if "folder" in context:
+ _c_folder = context.pop("folder")
+ context["asset"] = _c_folder["name"]
+
+ if "product" in context:
+ _c_product = context.pop("product")
+ context["family"] = _c_product["type"]
+ context["subset"] = _c_product["name"]
+
+ output["context"] = context
+
+ if "files" in representation:
+ files = representation["files"]
+ new_files = []
+        # GraphQL returns files as a list
+ if isinstance(files, list):
+ for file_info in files:
+ file_info["_id"] = file_info["id"]
+ new_files.append(file_info)
+
+        # The REST endpoint returns files as a dictionary
+        elif isinstance(files, dict):
+            for file_id, file_info in files.items():
+ file_info["_id"] = file_id
+ new_files.append(file_info)
+
+ for file_info in new_files:
+ if not file_info.get("sites"):
+ file_info["sites"] = [{
+ "name": "studio"
+ }]
+
+ output["files"] = new_files
+
+ if representation.get("active") is False:
+ output["type"] = "archived_representation"
+ output["old_id"] = output["_id"]
+
+ output_data = representation.get("data") or {}
+ if "attrib" in representation:
+ output_data.update(representation["attrib"])
+
+ for key, data_key in (
+ ("active", "active"),
+ ):
+ if key in representation:
+ output_data[data_key] = representation[key]
+
+ if "template" in output_data:
+ output_data["template"] = (
+ output_data["template"]
+ .replace("{folder[name]}", "{asset}")
+ .replace("{product[name]}", "{subset}")
+ .replace("{product[type]}", "{family}")
+ )
+
+ output["data"] = output_data
+
+ return output
+
+
+def workfile_info_fields_v3_to_v4(fields):
+ if not fields:
+ return None
+
+ new_fields = set()
+ fields = set(fields)
+ for v3_key, v4_key in (
+ ("_id", "id"),
+ ("files", "path"),
+ ("filename", "name"),
+ ("data", "data"),
+ ):
+ if v3_key in fields:
+ new_fields.add(v4_key)
+
+ if "parent" in fields or "task_name" in fields:
+ new_fields.add("taskId")
+
+ return new_fields
+
+
+def convert_v4_workfile_info_to_v3(workfile_info, task):
+ output = {
+ "type": "workfile",
+ "schema": CURRENT_WORKFILE_INFO_SCHEMA,
+ }
+ if "id" in workfile_info:
+ output["_id"] = workfile_info["id"]
+
+ if "path" in workfile_info:
+ output["files"] = [workfile_info["path"]]
+
+ if "name" in workfile_info:
+ output["filename"] = workfile_info["name"]
+
+ if "taskId" in workfile_info:
+ output["task_name"] = task["name"]
+ output["parent"] = task["folderId"]
+
+ return output
+
+
+def convert_create_asset_to_v4(asset, project, con):
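+    """Convert v3 asset document to v4 folder entity data.
+
+    Keys known to folder attributes go to 'attrib', anything else stays
+    in 'data'.
+    """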
+ folder_attributes = con.get_attributes_for_type("folder")
+
+ asset_data = asset["data"]
+ parent_id = asset_data["visualParent"]
+
+ folder = {
+ "name": asset["name"],
+ "parentId": parent_id,
+ }
+ entity_id = asset.get("_id")
+ if entity_id:
+ folder["id"] = entity_id
+
+ attribs = {}
+ data = {}
+ for key, value in asset_data.items():
+ if key in (
+ "visualParent",
+ "thumbnail_id",
+ "parents",
+ "inputLinks",
+ "avalon_mongo_id",
+ ):
+ continue
+
+ if key not in folder_attributes:
+ data[key] = value
+ elif value is not None:
+ attribs[key] = value
+
+ if attribs:
+ folder["attrib"] = attribs
+
+ if data:
+ folder["data"] = data
+ return folder
+
+
+def convert_create_task_to_v4(task, project, con):
+ if not project["taskTypes"]:
+ raise ValueError(
+ "Project \"{}\" does not have any task types".format(
+ project["name"]))
+
+ task_type = task["type"]
+ if task_type not in project["taskTypes"]:
+ task_type = tuple(project["taskTypes"].keys())[0]
+
+ return {
+ "name": task["name"],
+ "taskType": task_type,
+ "folderId": task["folderId"]
+ }
+
+
+def convert_create_subset_to_v4(subset, con):
+ product_attributes = con.get_attributes_for_type("product")
+
+ subset_data = subset["data"]
+ product_type = subset_data.get("family")
+ if not product_type:
+ product_type = subset_data["families"][0]
+
+ converted_product = {
+ "name": subset["name"],
+ "productType": product_type,
+ "folderId": subset["parent"],
+ }
+ entity_id = subset.get("_id")
+ if entity_id:
+ converted_product["id"] = entity_id
+
+ attribs = {}
+ data = {}
+ if "subsetGroup" in subset_data:
+ subset_data["productGroup"] = subset_data.pop("subsetGroup")
+ for key, value in subset_data.items():
+ if key not in product_attributes:
+ data[key] = value
+ elif value is not None:
+ attribs[key] = value
+
+ if attribs:
+ converted_product["attrib"] = attribs
+
+ if data:
+ converted_product["data"] = data
+
+ return converted_product
+
+
+def convert_create_version_to_v4(version, con):
+ version_attributes = con.get_attributes_for_type("version")
+ converted_version = {
+ "version": version["name"],
+ "productId": version["parent"],
+ }
+ entity_id = version.get("_id")
+ if entity_id:
+ converted_version["id"] = entity_id
+
+ version_data = version["data"]
+ attribs = {}
+ data = {}
+ for key, value in version_data.items():
+ if key not in version_attributes:
+ data[key] = value
+ elif value is not None:
+ attribs[key] = value
+
+ if attribs:
+ converted_version["attrib"] = attribs
+
+ if data:
+ converted_version["data"] = attribs
+
+ return converted_version
+
+
+def convert_create_hero_version_to_v4(hero_version, project_name, con):
+ if "version_id" in hero_version:
+ version_id = hero_version["version_id"]
+ version = con.get_version_by_id(project_name, version_id)
+ version["version"] = - version["version"]
+
+ for auto_key in (
+ "name",
+ "createdAt",
+ "updatedAt",
+ "author",
+ ):
+ version.pop(auto_key, None)
+
+ return version
+
+ version_attributes = con.get_attributes_for_type("version")
+ converted_version = {
+ "version": hero_version["version"],
+ "productId": hero_version["parent"],
+ }
+ entity_id = hero_version.get("_id")
+ if entity_id:
+ converted_version["id"] = entity_id
+
+ version_data = hero_version["data"]
+ attribs = {}
+ data = {}
+ for key, value in version_data.items():
+ if key not in version_attributes:
+ data[key] = value
+ elif value is not None:
+ attribs[key] = value
+
+ if attribs:
+ converted_version["attrib"] = attribs
+
+ if data:
+ converted_version["data"] = attribs
+
+ return converted_version
+
+
+def convert_create_representation_to_v4(representation, con):
+ representation_attributes = con.get_attributes_for_type("representation")
+
+ converted_representation = {
+ "name": representation["name"],
+ "versionId": representation["parent"],
+ }
+ entity_id = representation.get("_id")
+ if entity_id:
+ converted_representation["id"] = entity_id
+
+ if representation.get("type") == "archived_representation":
+ converted_representation["active"] = False
+
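+    # v4 stores representation files as a list of dicts. Keep only 'hash',
+    # 'path' and 'size' from the v3 file and generate the rest.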
+ new_files = []
+ for file_item in representation["files"]:
+ new_file_item = {
+ key: value
+ for key, value in file_item.items()
+ if key in ("hash", "path", "size")
+ }
+ new_file_item.update({
+ "id": create_entity_id(),
+ "hash_type": "op3",
+ "name": os.path.basename(new_file_item["path"])
+ })
+ new_files.append(new_file_item)
+
+ converted_representation["files"] = new_files
+
+ context = representation["context"]
+ context["folder"] = {
+ "name": context.pop("asset", None)
+ }
+ context["product"] = {
+ "type": context.pop("family", None),
+ "name": context.pop("subset", None),
+ }
+
+ attribs = {}
+ data = {
+ "context": context,
+ }
+
+ representation_data = representation["data"]
+ representation_data["template"] = (
+ representation_data["template"]
+ .replace("{asset}", "{folder[name]}")
+ .replace("{subset}", "{product[name]}")
+ .replace("{family}", "{product[type]}")
+ )
+
+ for key, value in representation_data.items():
+ if key not in representation_attributes:
+ data[key] = value
+ elif value is not None:
+ attribs[key] = value
+
+ if attribs:
+ converted_representation["attrib"] = attribs
+
+ if data:
+ converted_representation["data"] = data
+
+ return converted_representation
+
+
+def convert_create_workfile_info_to_v4(data, project_name, con):
+ folder_id = data["parent"]
+ task_name = data["task_name"]
+ task = con.get_task_by_name(project_name, folder_id, task_name)
+ if not task:
+ return None
+
+ workfile_attributes = con.get_attributes_for_type("workfile")
+ filename = data["filename"]
+ possible_attribs = {
+ "extension": os.path.splitext(filename)[-1]
+ }
+ attribs = {}
+ for attr in workfile_attributes:
+ if attr in possible_attribs:
+ attribs[attr] = possible_attribs[attr]
+
+ output = {
+ "path": data["files"][0],
+ "name": filename,
+ "taskId": task["id"]
+ }
+ if "_id" in data:
+ output["id"] = data["_id"]
+
+ if attribs:
+ output["attrib"] = attribs
+
+ output_data = data.get("data")
+ if output_data:
+ output["data"] = output_data
+ return output
+
+
+def _from_flat_dict(data):
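+    """Convert dictionary with dot-separated keys into a nested dictionary.
+
+    Example: {"data.fps": 25} -> {"data": {"fps": 25}}
+    """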
+ output = {}
+ for key, value in data.items():
+ output_value = output
+ subkeys = key.split(".")
+ last_key = subkeys.pop(-1)
+ for subkey in subkeys:
+ if subkey not in output_value:
+ output_value[subkey] = {}
+ output_value = output_value[subkey]
+
+ output_value[last_key] = value
+ return output
+
+
+def _to_flat_dict(data):
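+    """Flatten a nested dictionary into dot-separated keys.
+
+    Inverse of '_from_flat_dict': {"data": {"fps": 25}} -> {"data.fps": 25}
+    """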
+ output = {}
+ flat_queue = collections.deque()
+ flat_queue.append(([], data))
+ while flat_queue:
+ item = flat_queue.popleft()
+ parent_keys, data = item
+ for key, value in data.items():
+ keys = list(parent_keys)
+ keys.append(key)
+ if isinstance(value, dict):
+ flat_queue.append((keys, value))
+ else:
+ full_key = ".".join(keys)
+ output[full_key] = value
+
+ return output
+
+
+def convert_update_folder_to_v4(project_name, asset_id, update_data, con):
+ new_update_data = {}
+
+ folder_attributes = con.get_attributes_for_type("folder")
+ full_update_data = _from_flat_dict(update_data)
+ data = full_update_data.get("data")
+
+ has_new_parent = False
+ has_task_changes = False
+ parent_id = None
+ tasks = None
+ new_data = {}
+ attribs = {}
+ if "type" in update_data:
+ new_update_data["active"] = update_data["type"] == "asset"
+
+ if data:
+ if "thumbnail_id" in data:
+ new_update_data["thumbnailId"] = data.pop("thumbnail_id")
+
+ if "tasks" in data:
+ tasks = data.pop("tasks")
+ has_task_changes = True
+
+ if "visualParent" in data:
+ has_new_parent = True
+ parent_id = data.pop("visualParent")
+
+        for key, value in data.items():
+            if key in folder_attributes:
+                attribs[key] = value
+            else:
+                new_data[key] = value
+
+    # Pass collected attributes on, as the other 'convert_update_*'
+    # helpers do.
+    if attribs:
+        new_update_data["attribs"] = attribs
+
+ if "name" in update_data:
+ new_update_data["name"] = update_data["name"]
+
+ if "type" in update_data:
+ new_type = update_data["type"]
+ if new_type == "asset":
+ new_update_data["active"] = True
+ elif new_type == "archived_asset":
+ new_update_data["active"] = False
+
+ if has_new_parent:
+ new_update_data["parentId"] = parent_id
+
+ if new_data:
+ print("Folder has new data: {}".format(new_data))
+ new_update_data["data"] = new_data
+
+ if has_task_changes:
+ raise ValueError("Task changes of folder are not implemented")
+
+ return _to_flat_dict(new_update_data)
+
+
+def convert_update_subset_to_v4(project_name, subset_id, update_data, con):
+ new_update_data = {}
+
+ product_attributes = con.get_attributes_for_type("product")
+ full_update_data = _from_flat_dict(update_data)
+ data = full_update_data.get("data")
+ new_data = {}
+ attribs = {}
+ if data:
+ if "family" in data:
+ family = data.pop("family")
+ new_update_data["productType"] = family
+
+ if "families" in data:
+ families = data.pop("families")
+ if "productType" not in new_update_data:
+ new_update_data["productType"] = families[0]
+
+ if "subsetGroup" in data:
+ data["productGroup"] = data.pop("subsetGroup")
+ for key, value in data.items():
+ if key in product_attributes:
+ if value is REMOVED_VALUE:
+ value = None
+ attribs[key] = value
+
+ elif value is not REMOVED_VALUE:
+ new_data[key] = value
+
+ if attribs:
+ new_update_data["attribs"] = attribs
+
+ if "name" in update_data:
+ new_update_data["name"] = update_data["name"]
+
+ if "type" in update_data:
+ new_type = update_data["type"]
+ if new_type == "subset":
+ new_update_data["active"] = True
+ elif new_type == "archived_subset":
+ new_update_data["active"] = False
+
+ if "parent" in update_data:
+ new_update_data["folderId"] = update_data["parent"]
+
+ flat_data = _to_flat_dict(new_update_data)
+ if new_data:
+ print("Subset has new data: {}".format(new_data))
+ flat_data["data"] = new_data
+
+ return flat_data
+
+
+def convert_update_version_to_v4(project_name, version_id, update_data, con):
+ new_update_data = {}
+
+ version_attributes = con.get_attributes_for_type("version")
+ full_update_data = _from_flat_dict(update_data)
+ data = full_update_data.get("data")
+ new_data = {}
+ attribs = {}
+ if data:
+ if "author" in data:
+ new_update_data["author"] = data.pop("author")
+
+ if "thumbnail_id" in data:
+ new_update_data["thumbnailId"] = data.pop("thumbnail_id")
+
+ for key, value in data.items():
+ if key in version_attributes:
+ if value is REMOVED_VALUE:
+ value = None
+ attribs[key] = value
+
+ elif value is not REMOVED_VALUE:
+ new_data[key] = value
+
+ if attribs:
+ new_update_data["attribs"] = attribs
+
+ if "name" in update_data:
+ new_update_data["version"] = update_data["name"]
+
+ if "type" in update_data:
+ new_type = update_data["type"]
+ if new_type == "version":
+ new_update_data["active"] = True
+ elif new_type == "archived_version":
+ new_update_data["active"] = False
+
+ if "parent" in update_data:
+ new_update_data["productId"] = update_data["parent"]
+
+ flat_data = _to_flat_dict(new_update_data)
+ if new_data:
+ print("Version has new data: {}".format(new_data))
+ flat_data["data"] = new_data
+ return flat_data
+
+
+def convert_update_hero_version_to_v4(
+ project_name, hero_version_id, update_data, con
+):
+ if "version_id" not in update_data:
+ return None
+
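+    # Hero version content is replaced by the source version: copy it,
+    # negate the version number and diff against the current hero document.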
+ version_id = update_data["version_id"]
+ hero_version = con.get_hero_version_by_id(project_name, hero_version_id)
+ version = con.get_version_by_id(project_name, version_id)
+ version["version"] = - version["version"]
+ version["id"] = hero_version_id
+
+ for auto_key in (
+ "name",
+ "createdAt",
+ "updatedAt",
+ "author",
+ ):
+ version.pop(auto_key, None)
+
+ return prepare_entity_changes(hero_version, version)
+
+
+def convert_update_representation_to_v4(
+ project_name, repre_id, update_data, con
+):
+ new_update_data = {}
+
+ folder_attributes = con.get_attributes_for_type("folder")
+ full_update_data = _from_flat_dict(update_data)
+ data = full_update_data.get("data")
+
+ new_data = {}
+ attribs = {}
+ if data:
+ for key, value in data.items():
+            if key in representation_attributes:
+ attribs[key] = value
+ else:
+ new_data[key] = value
+
+ if "template" in attribs:
+ attribs["template"] = (
+ attribs["template"]
+ .replace("{asset}", "{folder[name]}")
+ .replace("{family}", "{product[type]}")
+ .replace("{subset}", "{product[name]}")
+ )
+
+ if "name" in update_data:
+ new_update_data["name"] = update_data["name"]
+
+ if "type" in update_data:
+ new_type = update_data["type"]
+ if new_type == "representation":
+ new_update_data["active"] = True
+ elif new_type == "archived_representation":
+ new_update_data["active"] = False
+
+ if "parent" in update_data:
+ new_update_data["versionId"] = update_data["parent"]
+
+ if "context" in update_data:
+ context = update_data["context"]
+ if "asset" in context:
+ context["folder"] = {"name": context.pop("asset")}
+
+ if "family" in context or "subset" in context:
+ context["product"] = {
+ "name": context.pop("subset"),
+ "type": context.pop("family"),
+ }
+ new_data["context"] = context
+
+ if "files" in update_data:
+ new_files = update_data["files"]
+ if isinstance(new_files, dict):
+ new_files = list(new_files.values())
+
+ for item in new_files:
+ for key in tuple(item.keys()):
+ if key not in ("hash", "path", "size"):
+ item.pop(key)
+ item.update({
+ "id": create_entity_id(),
+ "name": os.path.basename(item["path"]),
+ "hash_type": "op3",
+ })
+ new_update_data["files"] = new_files
+
+ flat_data = _to_flat_dict(new_update_data)
+ if new_data:
+ print("Representation has new data: {}".format(new_data))
+ flat_data["data"] = new_data
+
+ return flat_data
+
+
+def convert_update_workfile_info_to_v4(update_data):
+ return {
+ key: value
+ for key, value in update_data.items()
+ if key.startswith("data")
+ }
diff --git a/openpype/client/server/entities.py b/openpype/client/server/entities.py
new file mode 100644
index 0000000000..9579f13add
--- /dev/null
+++ b/openpype/client/server/entities.py
@@ -0,0 +1,694 @@
+import collections
+
+from ayon_api import get_server_api_connection
+
+from openpype.client.mongo.operations import CURRENT_THUMBNAIL_SCHEMA
+
+from .openpype_comp import get_folders_with_tasks
+from .conversion_utils import (
+ project_fields_v3_to_v4,
+ convert_v4_project_to_v3,
+
+ folder_fields_v3_to_v4,
+ convert_v4_folder_to_v3,
+
+ subset_fields_v3_to_v4,
+ convert_v4_subset_to_v3,
+
+ version_fields_v3_to_v4,
+ convert_v4_version_to_v3,
+
+ representation_fields_v3_to_v4,
+ convert_v4_representation_to_v3,
+
+ workfile_info_fields_v3_to_v4,
+ convert_v4_workfile_info_to_v3,
+)
+
+
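+# Getters follow the same pattern: convert requested v3 fields to v4 fields,
+# query the server through 'ayon_api' and convert results back to v3
+# document structure.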
+def get_projects(active=True, inactive=False, library=None, fields=None):
+ if not active and not inactive:
+ return
+
+ if active and inactive:
+ active = None
+ elif active:
+ active = True
+ elif inactive:
+ active = False
+
+ con = get_server_api_connection()
+ fields = project_fields_v3_to_v4(fields, con)
+ for project in con.get_projects(active, library, fields=fields):
+ yield convert_v4_project_to_v3(project)
+
+
+def get_project(project_name, active=True, inactive=False, fields=None):
+    # 'active' and 'inactive' are accepted for signature compatibility but
+    # are not used for filtering here.
+ con = get_server_api_connection()
+ fields = project_fields_v3_to_v4(fields, con)
+ return convert_v4_project_to_v3(
+ con.get_project(project_name, fields=fields)
+ )
+
+
+def get_whole_project(*args, **kwargs):
+ raise NotImplementedError("'get_whole_project' not implemented")
+
+
+def _get_subsets(
+ project_name,
+ subset_ids=None,
+ subset_names=None,
+ folder_ids=None,
+ names_by_folder_ids=None,
+ archived=False,
+ fields=None
+):
+ # Convert fields and add minimum required fields
+ con = get_server_api_connection()
+ fields = subset_fields_v3_to_v4(fields, con)
+ if fields is not None:
+ for key in (
+ "id",
+ "active"
+ ):
+ fields.add(key)
+
+ active = None
+ if archived:
+ active = False
+
+ for subset in con.get_products(
+ project_name,
+ subset_ids,
+ subset_names,
+ folder_ids,
+ names_by_folder_ids,
+ active,
+ fields
+ ):
+ yield convert_v4_subset_to_v3(subset)
+
+
+def _get_versions(
+ project_name,
+ version_ids=None,
+ subset_ids=None,
+ versions=None,
+ hero=True,
+ standard=True,
+ latest=None,
+ active=None,
+ fields=None
+):
+ con = get_server_api_connection()
+
+ fields = version_fields_v3_to_v4(fields, con)
+
+ # Make sure 'productId' and 'version' are available when hero versions
+ # are queried
+ if fields and hero:
+ fields = set(fields)
+ fields |= {"productId", "version"}
+
+ queried_versions = con.get_versions(
+ project_name,
+ version_ids,
+ subset_ids,
+ versions,
+ hero,
+ standard,
+ latest,
+ active=active,
+ fields=fields
+ )
+
+ versions = []
+ hero_versions = []
+ for version in queried_versions:
+ if version["version"] < 0:
+ hero_versions.append(version)
+ else:
+ versions.append(convert_v4_version_to_v3(version))
+
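+    # Hero versions have negative version numbers in v4. Find the standard
+    # versions they mirror so the converted documents can expose 'version_id'.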
+ if hero_versions:
+ subset_ids = set()
+ versions_nums = set()
+ for hero_version in hero_versions:
+ versions_nums.add(abs(hero_version["version"]))
+ subset_ids.add(hero_version["productId"])
+
+ hero_eq_versions = con.get_versions(
+ project_name,
+ product_ids=subset_ids,
+ versions=versions_nums,
+ hero=False,
+ fields=["id", "version", "productId"]
+ )
+ hero_eq_by_subset_id = collections.defaultdict(list)
+ for version in hero_eq_versions:
+ hero_eq_by_subset_id[version["productId"]].append(version)
+
+ for hero_version in hero_versions:
+ abs_version = abs(hero_version["version"])
+ subset_id = hero_version["productId"]
+ version_id = None
+ for version in hero_eq_by_subset_id.get(subset_id, []):
+ if version["version"] == abs_version:
+ version_id = version["id"]
+ break
+ conv_hero = convert_v4_version_to_v3(hero_version)
+ conv_hero["version_id"] = version_id
+ versions.append(conv_hero)
+
+ return versions
+
+
+def get_asset_by_id(project_name, asset_id, fields=None):
+ assets = get_assets(
+ project_name, asset_ids=[asset_id], fields=fields
+ )
+ for asset in assets:
+ return asset
+ return None
+
+
+def get_asset_by_name(project_name, asset_name, fields=None):
+ assets = get_assets(
+ project_name, asset_names=[asset_name], fields=fields
+ )
+ for asset in assets:
+ return asset
+ return None
+
+
+def get_assets(
+ project_name,
+ asset_ids=None,
+ asset_names=None,
+ parent_ids=None,
+ archived=False,
+ fields=None
+):
+ if not project_name:
+ return
+
+ active = True
+ if archived:
+ active = False
+
+ con = get_server_api_connection()
+ fields = folder_fields_v3_to_v4(fields, con)
+ kwargs = dict(
+ folder_ids=asset_ids,
+ folder_names=asset_names,
+ parent_ids=parent_ids,
+ active=active,
+ fields=fields
+ )
+
+ if fields is None or "tasks" in fields:
+ folders = get_folders_with_tasks(con, project_name, **kwargs)
+
+ else:
+ folders = con.get_folders(project_name, **kwargs)
+
+ for folder in folders:
+ yield convert_v4_folder_to_v3(folder, project_name)
+
+
+def get_archived_assets(
+ project_name,
+ asset_ids=None,
+ asset_names=None,
+ parent_ids=None,
+ fields=None
+):
+ return get_assets(
+ project_name,
+ asset_ids,
+ asset_names,
+ parent_ids,
+ True,
+ fields
+ )
+
+
+def get_asset_ids_with_subsets(project_name, asset_ids=None):
+ con = get_server_api_connection()
+ return con.get_folder_ids_with_products(project_name, asset_ids)
+
+
+def get_subset_by_id(project_name, subset_id, fields=None):
+ subsets = get_subsets(
+ project_name, subset_ids=[subset_id], fields=fields
+ )
+ for subset in subsets:
+ return subset
+ return None
+
+
+def get_subset_by_name(project_name, subset_name, asset_id, fields=None):
+ subsets = get_subsets(
+ project_name,
+ subset_names=[subset_name],
+ asset_ids=[asset_id],
+ fields=fields
+ )
+ for subset in subsets:
+ return subset
+ return None
+
+
+def get_subsets(
+ project_name,
+ subset_ids=None,
+ subset_names=None,
+ asset_ids=None,
+ names_by_asset_ids=None,
+ archived=False,
+ fields=None
+):
+ return _get_subsets(
+ project_name,
+ subset_ids,
+ subset_names,
+ asset_ids,
+ names_by_asset_ids,
+ archived,
+ fields=fields
+ )
+
+
+def get_subset_families(project_name, subset_ids=None):
+ con = get_server_api_connection()
+ return con.get_product_type_names(project_name, subset_ids)
+
+
+def get_version_by_id(project_name, version_id, fields=None):
+ versions = get_versions(
+ project_name,
+ version_ids=[version_id],
+ fields=fields,
+ hero=True
+ )
+ for version in versions:
+ return version
+ return None
+
+
+def get_version_by_name(project_name, version, subset_id, fields=None):
+ versions = get_versions(
+ project_name,
+ subset_ids=[subset_id],
+ versions=[version],
+ fields=fields
+ )
+ for version in versions:
+ return version
+ return None
+
+
+def get_versions(
+ project_name,
+ version_ids=None,
+ subset_ids=None,
+ versions=None,
+ hero=False,
+ fields=None
+):
+ return _get_versions(
+ project_name,
+ version_ids,
+ subset_ids,
+ versions,
+ hero=hero,
+ standard=True,
+ fields=fields
+ )
+
+
+def get_hero_version_by_id(project_name, version_id, fields=None):
+ versions = get_hero_versions(
+ project_name,
+ version_ids=[version_id],
+ fields=fields
+ )
+ for version in versions:
+ return version
+ return None
+
+
+def get_hero_version_by_subset_id(
+ project_name, subset_id, fields=None
+):
+ versions = get_hero_versions(
+ project_name,
+ subset_ids=[subset_id],
+ fields=fields
+ )
+ for version in versions:
+ return version
+ return None
+
+
+def get_hero_versions(
+ project_name, subset_ids=None, version_ids=None, fields=None
+):
+ return _get_versions(
+ project_name,
+ version_ids=version_ids,
+ subset_ids=subset_ids,
+ hero=True,
+ standard=False,
+ fields=fields
+ )
+
+
+def get_last_versions(project_name, subset_ids, active=None, fields=None):
+ if fields:
+ fields = set(fields)
+ fields.add("parent")
+
+ versions = _get_versions(
+ project_name,
+ subset_ids=subset_ids,
+ latest=True,
+ hero=False,
+ active=active,
+ fields=fields
+ )
+ return {
+ version["parent"]: version
+ for version in versions
+ }
+
+
+def get_last_version_by_subset_id(project_name, subset_id, fields=None):
+ versions = _get_versions(
+ project_name,
+ subset_ids=[subset_id],
+ latest=True,
+ hero=False,
+ fields=fields
+ )
+ if not versions:
+ return None
+ return versions[0]
+
+
+def get_last_version_by_subset_name(
+ project_name,
+ subset_name,
+ asset_id=None,
+ asset_name=None,
+ fields=None
+):
+ if not asset_id and not asset_name:
+ return None
+
+ if not asset_id:
+ asset = get_asset_by_name(
+ project_name, asset_name, fields=["_id"]
+ )
+ if not asset:
+ return None
+ asset_id = asset["_id"]
+
+ subset = get_subset_by_name(
+ project_name, subset_name, asset_id, fields=["_id"]
+ )
+ if not subset:
+ return None
+ return get_last_version_by_subset_id(
+ project_name, subset["id"], fields=fields
+ )
+
+
+def get_output_link_versions(project_name, version_id, fields=None):
+ if not version_id:
+ return []
+
+ con = get_server_api_connection()
+ version_links = con.get_version_links(
+ project_name, version_id, link_direction="out")
+
+ version_ids = {
+ link["entityId"]
+ for link in version_links
+ if link["entityType"] == "version"
+ }
+ if not version_ids:
+ return []
+
+ return get_versions(project_name, version_ids=version_ids, fields=fields)
+
+
+def version_is_latest(project_name, version_id):
+ con = get_server_api_connection()
+ return con.version_is_latest(project_name, version_id)
+
+
+def get_representation_by_id(project_name, representation_id, fields=None):
+ representations = get_representations(
+ project_name,
+ representation_ids=[representation_id],
+ fields=fields
+ )
+ for representation in representations:
+ return representation
+ return None
+
+
+def get_representation_by_name(
+ project_name, representation_name, version_id, fields=None
+):
+ representations = get_representations(
+ project_name,
+ representation_names=[representation_name],
+ version_ids=[version_id],
+ fields=fields
+ )
+ for representation in representations:
+ return representation
+ return None
+
+
+def get_representations(
+ project_name,
+ representation_ids=None,
+ representation_names=None,
+ version_ids=None,
+ context_filters=None,
+ names_by_version_ids=None,
+ archived=False,
+ standard=True,
+ fields=None
+):
+ if context_filters is not None:
+ # TODO should we add the support?
+        # - there was the ability to filter using regex
+ raise ValueError("OP v4 can't filter by representation context.")
+
+ if not archived and not standard:
+ return
+
+ if archived and not standard:
+ active = False
+ elif not archived and standard:
+ active = True
+ else:
+ active = None
+
+ con = get_server_api_connection()
+ fields = representation_fields_v3_to_v4(fields, con)
+ if fields and active is not None:
+ fields.add("active")
+
+ representations = con.get_representations(
+ project_name,
+ representation_ids,
+ representation_names,
+ version_ids,
+ names_by_version_ids,
+ active,
+ fields=fields
+ )
+ for representation in representations:
+ yield convert_v4_representation_to_v3(representation)
+
+
+def get_representation_parents(project_name, representation):
+ if not representation:
+ return None
+
+ repre_id = representation["_id"]
+ parents_by_repre_id = get_representations_parents(
+ project_name, [representation]
+ )
+ return parents_by_repre_id[repre_id]
+
+
+def get_representations_parents(project_name, representations):
+ repre_ids = {
+ repre["_id"]
+ for repre in representations
+ }
+ con = get_server_api_connection()
+ parents_by_repre_id = con.get_representations_parents(project_name,
+ repre_ids)
+ folder_ids = set()
+    for parents in parents_by_repre_id.values():
+ folder_ids.add(parents[2]["id"])
+
+ tasks_by_folder_id = {}
+
+ new_parents = {}
+    for repre_id, parents in parents_by_repre_id.items():
+ version, subset, folder, project = parents
+ folder_tasks = tasks_by_folder_id.get(folder["id"]) or {}
+ folder["tasks"] = folder_tasks
+ new_parents[repre_id] = (
+ convert_v4_version_to_v3(version),
+ convert_v4_subset_to_v3(subset),
+ convert_v4_folder_to_v3(folder, project_name),
+ project
+ )
+ return new_parents
+
+
+def get_archived_representations(
+ project_name,
+ representation_ids=None,
+ representation_names=None,
+ version_ids=None,
+ context_filters=None,
+ names_by_version_ids=None,
+ fields=None
+):
+ return get_representations(
+ project_name,
+ representation_ids=representation_ids,
+ representation_names=representation_names,
+ version_ids=version_ids,
+ context_filters=context_filters,
+ names_by_version_ids=names_by_version_ids,
+ archived=True,
+ standard=False,
+ fields=fields
+ )
+
+
+def get_thumbnail(
+ project_name, thumbnail_id, entity_type, entity_id, fields=None
+):
+ """Receive thumbnail entity data.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ thumbnail_id (Union[str, ObjectId]): Id of thumbnail entity.
+ entity_type (str): Type of entity for which the thumbnail should be
+ received.
+ entity_id (str): Id of entity for which the thumbnail should be
+ received.
+ fields (Iterable[str]): Fields that should be returned. All fields are
+ returned if 'None' is passed.
+
+ Returns:
+ None: If thumbnail with specified id was not found.
+ Dict: Thumbnail entity data which can be reduced to specified 'fields'.
+ """
+
+ if not thumbnail_id or not entity_type or not entity_id:
+ return None
+
+ if entity_type == "asset":
+ entity_type = "folder"
+
+ elif entity_type == "hero_version":
+ entity_type = "version"
+
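+    # No thumbnail query is needed; build a v3-like thumbnail document from
+    # the passed context.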
+ return {
+ "_id": thumbnail_id,
+ "type": "thumbnail",
+ "schema": CURRENT_THUMBNAIL_SCHEMA,
+ "data": {
+ "entity_type": entity_type,
+ "entity_id": entity_id
+ }
+ }
+
+
+def get_thumbnails(project_name, thumbnail_contexts, fields=None):
+ """Get thumbnail entities.
+
+ Warning:
+        This function is not OpenPype compatible. There is no usage of this
+        function in the codebase so there is nothing to convert. The previous
+ implementation cannot be AYON compatible without entity types.
+ """
+
+ # Thumbnail items are dictionaries (unhashable), so collect them in a list
+ thumbnail_items = []
+ for thumbnail_context in thumbnail_contexts:
+ thumbnail_id, entity_type, entity_id = thumbnail_context
+ thumbnail_item = get_thumbnail(
+ project_name, thumbnail_id, entity_type, entity_id
+ )
+ if thumbnail_item:
+ thumbnail_items.append(thumbnail_item)
+ return thumbnail_items
+
+
+def get_thumbnail_id_from_source(project_name, src_type, src_id):
+ """Receive thumbnail id from source entity.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ src_type (str): Type of source entity ('asset', 'version').
+ src_id (Union[str, ObjectId]): Id of source entity.
+
+ Returns:
+ ObjectId: Thumbnail id assigned to entity.
+ None: If source entity does not have any thumbnail id assigned.
+ """
+
+ if not src_type or not src_id:
+ return None
+
+ if src_type == "version":
+ version = get_version_by_id(
+ project_name, src_id, fields=["data.thumbnail_id"]
+ ) or {}
+ return version.get("data", {}).get("thumbnail_id")
+
+ if src_type == "asset":
+ asset = get_asset_by_id(
+ project_name, src_id, fields=["data.thumbnail_id"]
+ ) or {}
+ return asset.get("data", {}).get("thumbnail_id")
+
+ return None
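+
+ # Usage sketch (hypothetical version id):
+ # thumbnail_id = get_thumbnail_id_from_source("my_project", "version", version_id)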
+
+
+def get_workfile_info(
+ project_name, asset_id, task_name, filename, fields=None
+):
+ if not asset_id or not task_name or not filename:
+ return None
+
+ con = get_server_api_connection()
+ task = con.get_task_by_name(
+ project_name, asset_id, task_name, fields=["id", "name", "folderId"]
+ )
+ if not task:
+ return None
+
+ fields = workfile_info_fields_v3_to_v4(fields)
+
+ for workfile_info in con.get_workfiles_info(
+ project_name, task_ids=[task["id"]], fields=fields
+ ):
+ if workfile_info["name"] == filename:
+ return convert_v4_workfile_info_to_v3(workfile_info, task)
+ return None
diff --git a/openpype/client/server/entity_links.py b/openpype/client/server/entity_links.py
new file mode 100644
index 0000000000..d8395aabe7
--- /dev/null
+++ b/openpype/client/server/entity_links.py
@@ -0,0 +1,156 @@
+import ayon_api
+from ayon_api import get_folder_links, get_versions_links
+
+from .entities import get_assets, get_representation_by_id
+
+
+def get_linked_asset_ids(project_name, asset_doc=None, asset_id=None):
+ """Extract linked asset ids from asset document.
+
+ One of asset document or asset id must be passed.
+
+ Note:
+ Asset links currently work only from asset to asset.
+
+ Args:
+ project_name (str): Project where to look for asset.
+ asset_doc (dict): Asset document from DB.
+ asset_id (str): Asset id to find its document.
+
+ Returns:
+ List[Union[ObjectId, str]]: Asset ids of input links.
+ """
+
+ output = []
+ if not asset_doc and not asset_id:
+ return output
+
+ if not asset_id:
+ asset_id = asset_doc["_id"]
+
+ links = get_folder_links(project_name, asset_id, link_direction="in")
+ return [
+ link["entityId"]
+ for link in links
+ if link["entityType"] == "folder"
+ ]
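+
+ # Usage sketch (hypothetical asset document): collect ids of folders
+ # linked to the asset as input links.
+ # linked_ids = get_linked_asset_ids("my_project", asset_doc=asset_doc)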
+
+
+def get_linked_assets(
+ project_name, asset_doc=None, asset_id=None, fields=None
+):
+ """Return linked assets based on passed asset document.
+
+ One of asset document or asset id must be passed.
+
+ Args:
+ project_name (str): Name of project where to look for queried entities.
+ asset_doc (Dict[str, Any]): Asset document from database.
+ asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
+ asset document.
+ fields (Iterable[str]): Fields that should be returned. All fields are
+ returned if 'None' is passed.
+
+ Returns:
+ List[Dict[str, Any]]: Asset documents of input links for passed
+ asset doc.
+ """
+
+ link_ids = get_linked_asset_ids(project_name, asset_doc, asset_id)
+ if not link_ids:
+ return []
+ return list(get_assets(project_name, asset_ids=link_ids, fields=fields))
+
+
+def get_linked_representation_id(
+ project_name, repre_doc=None, repre_id=None, link_type=None, max_depth=None
+):
+ """Returns list of linked ids of particular type (if provided).
+
+ One of representation document or representation id must be passed.
+
+ Note:
+ Representation links currently work only from representation through
+ version back to representations.
+
+ Todos:
+ Missing depth query. Not sure how it did find more representations in
+ depth, probably links to version?
+
+ Args:
+ project_name (str): Name of project where look for links.
+ repre_doc (Dict[str, Any]): Representation document.
+ repre_id (Union[ObjectId, str]): Representation id.
+ link_type (str): Type of link (e.g. 'reference', ...).
+ max_depth (int): Limit recursion level. 'None' and 0 are treated as 1.
+
+ Returns:
+ List[ObjectId]: Linked representation ids.
+ """
+
+ if repre_doc:
+ repre_id = repre_doc["_id"]
+
+ if not repre_id and not repre_doc:
+ return []
+
+ version_id = None
+ if repre_doc:
+ version_id = repre_doc.get("parent")
+
+ if not version_id:
+ repre_doc = get_representation_by_id(
+ project_name, repre_id, fields=["parent"]
+ )
+ if repre_doc:
+ version_id = repre_doc["parent"]
+
+ if not version_id:
+ return []
+
+ if max_depth is None or max_depth == 0:
+ max_depth = 1
+
+ link_types = None
+ if link_type:
+ link_types = [link_type]
+
+ # Store already found version ids to avoid recursion, and also to store
+ # output -> Don't forget to remove 'version_id' at the end!!!
+ linked_version_ids = {version_id}
+ # Each loop of depth will reset this variable
+ versions_to_check = {version_id}
+ for _ in range(max_depth):
+ if not versions_to_check:
+ break
+
+ links = get_versions_links(
+ project_name,
+ versions_to_check,
+ link_types=link_types,
+ link_direction="out")
+
+ versions_to_check = set()
+ for link in links:
+ # Care only about version links
+ if link["entityType"] != "version":
+ continue
+ entity_id = link["entityId"]
+ # Skip already found linked version ids
+ if entity_id in linked_version_ids:
+ continue
+ linked_version_ids.add(entity_id)
+ versions_to_check.add(entity_id)
+
+ linked_version_ids.remove(version_id)
+ if not linked_version_ids:
+ return []
+
+ representations = ayon_api.get_representations(
+ project_name,
+ version_ids=linked_version_ids,
+ fields=["id"])
+ return [
+ repre["id"]
+ for repre in representations
+ ]
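+
+ # Usage sketch (hypothetical ids): representations reachable through
+ # 'reference' version links, up to two levels deep.
+ # linked_repre_ids = get_linked_representation_id(
+ # "my_project", repre_id=repre_id, link_type="reference", max_depth=2
+ # )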
diff --git a/openpype/client/server/openpype_comp.py b/openpype/client/server/openpype_comp.py
new file mode 100644
index 0000000000..a123fe3167
--- /dev/null
+++ b/openpype/client/server/openpype_comp.py
@@ -0,0 +1,156 @@
+import collections
+from ayon_api.graphql import GraphQlQuery, FIELD_VALUE, fields_to_dict
+
+from .constants import DEFAULT_FOLDER_FIELDS
+
+
+def folders_tasks_graphql_query(fields):
+ query = GraphQlQuery("FoldersQuery")
+ project_name_var = query.add_variable("projectName", "String!")
+ folder_ids_var = query.add_variable("folderIds", "[String!]")
+ parent_folder_ids_var = query.add_variable("parentFolderIds", "[String!]")
+ folder_paths_var = query.add_variable("folderPaths", "[String!]")
+ folder_names_var = query.add_variable("folderNames", "[String!]")
+ has_products_var = query.add_variable("folderHasProducts", "Boolean!")
+
+ project_field = query.add_field("project")
+ project_field.set_filter("name", project_name_var)
+
+ folders_field = project_field.add_field_with_edges("folders")
+ folders_field.set_filter("ids", folder_ids_var)
+ folders_field.set_filter("parentIds", parent_folder_ids_var)
+ folders_field.set_filter("names", folder_names_var)
+ folders_field.set_filter("paths", folder_paths_var)
+ folders_field.set_filter("hasProducts", has_products_var)
+
+ fields = set(fields)
+ fields.discard("tasks")
+ tasks_field = folders_field.add_field_with_edges("tasks")
+ tasks_field.add_field("name")
+ tasks_field.add_field("taskType")
+
+ nested_fields = fields_to_dict(fields)
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, folders_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
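+
+ # Usage sketch (hypothetical connection 'con'): build the query, bind
+ # filter values and execute it, mirroring 'get_folders_with_tasks' below.
+ # query = folders_tasks_graphql_query({"id", "name", "path"})
+ # query.set_variable_value("projectName", "my_project")
+ # data = query.query(con)
+ # folders = data["project"]["folders"]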
+
+
+def get_folders_with_tasks(
+ con,
+ project_name,
+ folder_ids=None,
+ folder_paths=None,
+ folder_names=None,
+ parent_ids=None,
+ active=True,
+ fields=None
+):
+ """Query folders with tasks from server.
+
+ This exists for compatibility with v3, where tasks were stored on
+ assets. Querying folders together with their tasks is inefficient, so
+ this was added only as a compatibility function.
+
+ Todos:
+ Folder name won't be a unique identifier, so folder path filtering
+ should be added.
+
+ Notes:
+ Filter 'active' doesn't have a direct filter in GraphQl.
+
+ Args:
+ con (ServerAPI): Connection to server.
+ project_name (str): Name of project where folders are.
+ folder_ids (Iterable[str]): Folder ids to filter.
+ folder_paths (Iterable[str]): Folder paths used for filtering.
+ folder_names (Iterable[str]): Folder names used for filtering.
+ parent_ids (Iterable[str]): Ids of folder parents. Use 'None'
+ if folder is direct child of project.
+ active (Union[bool, None]): Filter active/inactive folders. Both
+ are returned if set to None.
+ fields (Union[Iterable[str], None]): Fields to be queried
+ for folder. All possible folder fields are returned if 'None'
+ is passed.
+
+ Returns:
+ List[Dict[str, Any]]: Queried folder entities.
+ """
+
+ if not project_name:
+ return []
+
+ filters = {
+ "projectName": project_name
+ }
+ if folder_ids is not None:
+ folder_ids = set(folder_ids)
+ if not folder_ids:
+ return []
+ filters["folderIds"] = list(folder_ids)
+
+ if folder_paths is not None:
+ folder_paths = set(folder_paths)
+ if not folder_paths:
+ return []
+ filters["folderPaths"] = list(folder_paths)
+
+ if folder_names is not None:
+ folder_names = set(folder_names)
+ if not folder_names:
+ return []
+ filters["folderNames"] = list(folder_names)
+
+ if parent_ids is not None:
+ parent_ids = set(parent_ids)
+ if not parent_ids:
+ return []
+ if None in parent_ids:
+ # Replace 'None' with '"root"' which is used during GraphQl
+ # query for parent ids filter for folders without folder
+ # parent
+ parent_ids.remove(None)
+ parent_ids.add("root")
+
+ if project_name in parent_ids:
+ # Replace project name with '"root"' which is used during
+ # GraphQl query for parent ids filter for folders without
+ # folder parent
+ parent_ids.remove(project_name)
+ parent_ids.add("root")
+
+ filters["parentFolderIds"] = list(parent_ids)
+
+ if fields:
+ fields = set(fields)
+ else:
+ fields = con.get_default_fields_for_type("folder")
+ fields |= DEFAULT_FOLDER_FIELDS
+
+ if active is not None:
+ fields.add("active")
+
+ query = folders_tasks_graphql_query(fields)
+ for attr, filter_value in filters.items():
+ query.set_variable_value(attr, filter_value)
+
+ parsed_data = query.query(con)
+ folders = parsed_data["project"]["folders"]
+ if active is None:
+ return folders
+ return [
+ folder
+ for folder in folders
+ if folder["active"] is active
+ ]
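+
+ # Usage sketch (hypothetical server connection and folder path):
+ # folders = get_folders_with_tasks(
+ # con, "my_project", folder_paths=["/shots/sh010"], fields={"id", "name"}
+ # )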
diff --git a/openpype/client/server/operations.py b/openpype/client/server/operations.py
new file mode 100644
index 0000000000..eeb55784e1
--- /dev/null
+++ b/openpype/client/server/operations.py
@@ -0,0 +1,881 @@
+import copy
+import json
+import collections
+import uuid
+import datetime
+
+from bson.objectid import ObjectId
+from ayon_api import get_server_api_connection
+
+from openpype.client.operations_base import (
+ REMOVED_VALUE,
+ CreateOperation,
+ UpdateOperation,
+ DeleteOperation,
+ BaseOperationsSession
+)
+
+from openpype.client.mongo.operations import (
+ CURRENT_THUMBNAIL_SCHEMA,
+ CURRENT_REPRESENTATION_SCHEMA,
+ CURRENT_HERO_VERSION_SCHEMA,
+ CURRENT_VERSION_SCHEMA,
+ CURRENT_SUBSET_SCHEMA,
+ CURRENT_ASSET_DOC_SCHEMA,
+ CURRENT_PROJECT_SCHEMA,
+)
+
+from .conversion_utils import (
+ convert_create_asset_to_v4,
+ convert_create_task_to_v4,
+ convert_create_subset_to_v4,
+ convert_create_version_to_v4,
+ convert_create_hero_version_to_v4,
+ convert_create_representation_to_v4,
+ convert_create_workfile_info_to_v4,
+
+ convert_update_folder_to_v4,
+ convert_update_subset_to_v4,
+ convert_update_version_to_v4,
+ convert_update_hero_version_to_v4,
+ convert_update_representation_to_v4,
+ convert_update_workfile_info_to_v4,
+)
+from .utils import create_entity_id
+
+
+def _create_or_convert_to_id(entity_id=None):
+ if entity_id is None:
+ return create_entity_id()
+
+ if isinstance(entity_id, ObjectId):
+ raise TypeError("Type of 'ObjectId' is not supported anymore.")
+
+ # Validate if can be converted to uuid
+ uuid.UUID(entity_id)
+ return entity_id
+
+
+def new_project_document(
+ project_name, project_code, config, data=None, entity_id=None
+):
+ """Create skeleton data of project document.
+
+ Args:
+ project_name (str): Name of project. Used as identifier of a project.
+ project_code (str): Shorter version of project name without spaces
+ and special characters (in most cases). Should also be considered
+ a unique name across projects.
+ config (Dict[str, Any]): Project config consisting of roots,
+ templates, applications and other project Anatomy related data.
+ data (Dict[str, Any]): Project data with information about its
+ attributes (e.g. 'fps' etc.) or integration specific keys.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of project document.
+ """
+
+ if data is None:
+ data = {}
+
+ data["code"] = project_code
+
+ return {
+ "_id": _create_or_convert_to_id(entity_id),
+ "name": project_name,
+ "type": CURRENT_PROJECT_SCHEMA,
+ "entity_data": data,
+ "config": config
+ }
+
+
+def new_asset_document(
+ name, project_id, parent_id, parents, data=None, entity_id=None
+):
+ """Create skeleton data of asset document.
+
+ Args:
+ name (str): Is considered as unique identifier of asset in project.
+ project_id (Union[str, ObjectId]): Id of project document.
+ parent_id (Union[str, ObjectId]): Id of parent asset.
+ parents (List[str]): List of parent assets names.
+ data (Dict[str, Any]): Asset document data. Empty dictionary is used
+ if not passed. Value of 'parent_id' is used to fill 'visualParent'.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of asset document.
+ """
+
+ if data is None:
+ data = {}
+ if parent_id is not None:
+ parent_id = _create_or_convert_to_id(parent_id)
+ data["visualParent"] = parent_id
+ data["parents"] = parents
+
+ return {
+ "_id": _create_or_convert_to_id(entity_id),
+ "type": "asset",
+ "name": name,
+ # This will be ignored
+ "parent": project_id,
+ "data": data,
+ "schema": CURRENT_ASSET_DOC_SCHEMA
+ }
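+
+ # Usage sketch (hypothetical ids and data): asset document skeleton
+ # parented under an existing project and parent asset.
+ # asset_doc = new_asset_document(
+ # "sh010", project_id, parent_id, ["shots"], data={"fps": 25}
+ # )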
+
+
+def new_subset_document(name, family, asset_id, data=None, entity_id=None):
+ """Create skeleton data of subset document.
+
+ Args:
+ name (str): Is considered as unique identifier of subset under asset.
+ family (str): Subset's family.
+ asset_id (Union[str, ObjectId]): Id of parent asset.
+ data (Dict[str, Any]): Subset document data. Empty dictionary is used
+ if not passed. Value of 'family' is used to fill 'family'.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of subset document.
+ """
+
+ if data is None:
+ data = {}
+ data["family"] = family
+ return {
+ "_id": _create_or_convert_to_id(entity_id),
+ "schema": CURRENT_SUBSET_SCHEMA,
+ "type": "subset",
+ "name": name,
+ "data": data,
+ "parent": _create_or_convert_to_id(asset_id)
+ }
+
+
+def new_version_doc(version, subset_id, data=None, entity_id=None):
+ """Create skeleton data of version document.
+
+ Args:
+ version (int): Is considered as unique identifier of version
+ under subset.
+ subset_id (Union[str, ObjectId]): Id of parent subset.
+ data (Dict[str, Any]): Version document data.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of version document.
+ """
+
+ if data is None:
+ data = {}
+
+ return {
+ "_id": _create_or_convert_to_id(entity_id),
+ "schema": CURRENT_VERSION_SCHEMA,
+ "type": "version",
+ "name": int(version),
+ "parent": _create_or_convert_to_id(subset_id),
+ "data": data
+ }
+
+
+def new_hero_version_doc(subset_id, data, version=None, entity_id=None):
+ """Create skeleton data of hero version document.
+
+ Args:
+ subset_id (Union[str, ObjectId]): Id of parent subset.
+ data (Dict[str, Any]): Version document data.
+ version (int): Version number of the source version.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of hero version document.
+ """
+
+ if version is None:
+ version = -1
+ elif version > 0:
+ version = -version
+
+ return {
+ "_id": _create_or_convert_to_id(entity_id),
+ "schema": CURRENT_HERO_VERSION_SCHEMA,
+ "type": "hero_version",
+ "version": version,
+ "parent": _create_or_convert_to_id(subset_id),
+ "data": data
+ }
+
+
+def new_representation_doc(
+ name, version_id, context, data=None, entity_id=None
+):
+ """Create skeleton data of representation document.
+
+ Args:
+ name (str): Representation name considered as unique identifier
+ of representation under version.
+ version_id (Union[str, ObjectId]): Id of parent version.
+ context (Dict[str, Any]): Representation context used to fill
+ templates or to query.
+ data (Dict[str, Any]): Representation document data.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of representation document.
+ """
+
+ if data is None:
+ data = {}
+
+ return {
+ "_id": _create_or_convert_to_id(entity_id),
+ "schema": CURRENT_REPRESENTATION_SCHEMA,
+ "type": "representation",
+ "parent": _create_or_convert_to_id(version_id),
+ "name": name,
+ "data": data,
+
+ # Imprint shortcut to context for performance reasons.
+ "context": context
+ }
+
+
+def new_thumbnail_doc(data=None, entity_id=None):
+ """Create skeleton data of thumbnail document.
+
+ Args:
+ data (Dict[str, Any]): Thumbnail document data.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of thumbnail document.
+ """
+
+ if data is None:
+ data = {}
+
+ return {
+ "_id": _create_or_convert_to_id(entity_id),
+ "type": "thumbnail",
+ "schema": CURRENT_THUMBNAIL_SCHEMA,
+ "data": data
+ }
+
+
+def new_workfile_info_doc(
+ filename, asset_id, task_name, files, data=None, entity_id=None
+):
+ """Create skeleton data of workfile info document.
+
+ Workfile document is at this moment used primarily for artist notes.
+
+ Args:
+ filename (str): Filename of workfile.
+ asset_id (Union[str, ObjectId]): Id of asset under which the workfile lives.
+ task_name (str): Task under which was workfile created.
+ files (List[str]): List of rootless filepaths related to workfile.
+ data (Dict[str, Any]): Additional metadata.
+ entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of workfile info document.
+ """
+
+ if not data:
+ data = {}
+
+ return {
+ "_id": _create_or_convert_to_id(entity_id),
+ "type": "workfile",
+ "parent": _create_or_convert_to_id(asset_id),
+ "task_name": task_name,
+ "filename": filename,
+ "data": data,
+ "files": files
+ }
+
+
+def _prepare_update_data(old_doc, new_doc, replace):
+ changes = {}
+ for key, value in new_doc.items():
+ if key not in old_doc or value != old_doc[key]:
+ changes[key] = value
+
+ if replace:
+ for key in old_doc.keys():
+ if key not in new_doc:
+ changes[key] = REMOVED_VALUE
+ return changes
+
+
+def prepare_subset_update_data(old_doc, new_doc, replace=True):
+ """Compare two subset documents and prepare update data.
+
+ Based on the compared values, creates update data for 'UpdateOperation'.
+
+ Empty output means that documents are identical.
+
+ Returns:
+ Dict[str, Any]: Changes between old and new document.
+ """
+
+ return _prepare_update_data(old_doc, new_doc, replace)
+
+
+def prepare_version_update_data(old_doc, new_doc, replace=True):
+ """Compare two version documents and prepare update data.
+
+ Based on the compared values, creates update data for 'UpdateOperation'.
+
+ Empty output means that documents are identical.
+
+ Returns:
+ Dict[str, Any]: Changes between old and new document.
+ """
+
+ return _prepare_update_data(old_doc, new_doc, replace)
+
+
+def prepare_hero_version_update_data(old_doc, new_doc, replace=True):
+ """Compare two hero version documents and prepare update data.
+
+ Based on the compared values, creates update data for 'UpdateOperation'.
+
+ Empty output means that documents are identical.
+
+ Returns:
+ Dict[str, Any]: Changes between old and new document.
+ """
+
+ changes = _prepare_update_data(old_doc, new_doc, replace)
+ changes.pop("version_id", None)
+ return changes
+
+
+def prepare_representation_update_data(old_doc, new_doc, replace=True):
+ """Compare two representation documents and prepare update data.
+
+ Based on the compared values, creates update data for 'UpdateOperation'.
+
+ Empty output means that documents are identical.
+
+ Returns:
+ Dict[str, Any]: Changes between old and new document.
+ """
+
+ changes = _prepare_update_data(old_doc, new_doc, replace)
+ context = changes.get("data", {}).get("context")
+ # Make sure that both 'family' and 'subset' are in changes if
+ # one of them changed (they'll both become 'product').
+ if (
+ context
+ and ("family" in context or "subset" in context)
+ ):
+ context["family"] = new_doc["data"]["context"]["family"]
+ context["subset"] = new_doc["data"]["context"]["subset"]
+
+ return changes
+
+
+def prepare_workfile_info_update_data(old_doc, new_doc, replace=True):
+ """Compare two workfile info documents and prepare update data.
+
+ Based on the compared values, creates update data for 'UpdateOperation'.
+
+ Empty output means that documents are identical.
+
+ Returns:
+ Dict[str, Any]: Changes between old and new document.
+ """
+
+ return _prepare_update_data(old_doc, new_doc, replace)
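+
+ # Usage sketch (hypothetical documents): compute changes between two
+ # subset documents and send them as an update operation.
+ # changes = prepare_subset_update_data(old_subset_doc, new_subset_doc)
+ # if changes:
+ # session.update_entity(project_name, "subset", subset_id, changes)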
+
+
+class FailedOperations(Exception):
+ pass
+
+
+def entity_data_json_default(value):
+ if isinstance(value, datetime.datetime):
+ return int(value.timestamp())
+
+ raise TypeError(
+ "Object of type {} is not JSON serializable".format(str(type(value)))
+ )
+
+
+def failed_json_default(value):
+ return "< Failed value {} > {}".format(type(value), str(value))
+
+
+class ServerCreateOperation(CreateOperation):
+ """Opeartion to create an entity.
+
+ Args:
+ project_name (str): On which project operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'asset', 'representation' etc.
+ data (Dict[str, Any]): Data of entity that will be created.
+ session (OperationsSession): Session the operation belongs to.
+ """
+
+ def __init__(self, project_name, entity_type, data, session):
+ self._session = session
+
+ if not data:
+ data = {}
+ data = copy.deepcopy(data)
+ if entity_type == "project":
+ raise ValueError("Project cannot be created using operations")
+
+ tasks = None
+ if entity_type in "asset":
+ # TODO handle tasks
+ entity_type = "folder"
+ if "data" in data:
+ tasks = data["data"].get("tasks")
+
+ project = self._session.get_project(project_name)
+ new_data = convert_create_asset_to_v4(data, project, self.con)
+
+ elif entity_type == "task":
+ project = self._session.get_project(project_name)
+ new_data = convert_create_task_to_v4(data, project, self.con)
+
+ elif entity_type == "subset":
+ new_data = convert_create_subset_to_v4(data, self.con)
+ entity_type = "product"
+
+ elif entity_type == "version":
+ new_data = convert_create_version_to_v4(data, self.con)
+
+ elif entity_type == "hero_version":
+ new_data = convert_create_hero_version_to_v4(
+ data, project_name, self.con
+ )
+ entity_type = "version"
+
+ elif entity_type in ("representation", "archived_representation"):
+ new_data = convert_create_representation_to_v4(data, self.con)
+ entity_type = "representation"
+
+ elif entity_type == "workfile":
+ new_data = convert_create_workfile_info_to_v4(
+ data, project_name, self.con
+ )
+
+ else:
+ raise ValueError(
+ "Unhandled entity type \"{}\"".format(entity_type)
+ )
+
+ # Simple check if data can be dumped into json
+ # - should raise error on 'ObjectId' object
+ try:
+ new_data = json.loads(
+ json.dumps(new_data, default=entity_data_json_default)
+ )
+
+ except Exception:
+ raise ValueError("Couldn't json parse body: {}".format(
+ json.dumps(new_data, default=failed_json_default)
+ ))
+
+ super(ServerCreateOperation, self).__init__(
+ project_name, entity_type, new_data
+ )
+
+ if "id" not in self._data:
+ self._data["id"] = create_entity_id()
+
+ if tasks:
+ copied_tasks = copy.deepcopy(tasks)
+ for task_name, task in copied_tasks.items():
+ task["name"] = task_name
+ task["folderId"] = self._data["id"]
+ self.session.create_entity(
+ project_name, "task", task, nested_id=self.id
+ )
+
+ @property
+ def con(self):
+ return self.session.con
+
+ @property
+ def session(self):
+ return self._session
+
+ @property
+ def entity_id(self):
+ return self._data["id"]
+
+ def to_server_operation(self):
+ return {
+ "id": self.id,
+ "type": "create",
+ "entityType": self.entity_type,
+ "entityId": self.entity_id,
+ "data": self._data
+ }
+
+
+class ServerUpdateOperation(UpdateOperation):
+ """Operation to update an entity.
+
+ Args:
+ project_name (str): On which project operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'asset', 'representation' etc.
+ entity_id (Union[str, ObjectId]): Identifier of an entity.
+ update_data (Dict[str, Any]): Key -> value changes that will be set in
+ database. If value is set to 'REMOVED_VALUE' the key will be
+ removed. Only first level of dictionary is checked (on purpose).
+ session (OperationsSession): Session the operation belongs to.
+ """
+
+ def __init__(
+ self, project_name, entity_type, entity_id, update_data, session
+ ):
+ self._session = session
+
+ update_data = copy.deepcopy(update_data)
+ if entity_type == "project":
+ raise ValueError("Project cannot be created using operations")
+
+ if entity_type in ("asset", "archived_asset"):
+ new_update_data = convert_update_folder_to_v4(
+ project_name, entity_id, update_data, self.con
+ )
+ entity_type = "folder"
+
+ elif entity_type == "subset":
+ new_update_data = convert_update_subset_to_v4(
+ project_name, entity_id, update_data, self.con
+ )
+ entity_type = "product"
+
+ elif entity_type == "version":
+ new_update_data = convert_update_version_to_v4(
+ project_name, entity_id, update_data, self.con
+ )
+
+ elif entity_type == "hero_version":
+ new_update_data = convert_update_hero_version_to_v4(
+ project_name, entity_id, update_data, self.con
+ )
+ entity_type = "version"
+
+ elif entity_type in ("representation", "archived_representation"):
+ new_update_data = convert_update_representation_to_v4(
+ project_name, entity_id, update_data, self.con
+ )
+ entity_type = "representation"
+
+ elif entity_type == "workfile":
+ new_update_data = convert_update_workfile_info_to_v4(
+ project_name, entity_id, update_data, self.con
+ )
+
+ else:
+ raise ValueError(
+ "Unhandled entity type \"{}\"".format(entity_type)
+ )
+
+ try:
+ new_update_data = json.loads(
+ json.dumps(new_update_data, default=entity_data_json_default)
+ )
+
+ except Exception:
+ raise ValueError("Couldn't json parse body: {}".format(
+ json.dumps(new_update_data, default=failed_json_default)
+ ))
+
+ super(ServerUpdateOperation, self).__init__(
+ project_name, entity_type, entity_id, new_update_data
+ )
+
+ @property
+ def con(self):
+ return self.session.con
+
+ @property
+ def session(self):
+ return self._session
+
+ def to_server_operation(self):
+ if not self._update_data:
+ return None
+
+ update_data = {}
+ for key, value in self._update_data.items():
+ if value is REMOVED_VALUE:
+ value = None
+ update_data[key] = value
+
+ return {
+ "id": self.id,
+ "type": "update",
+ "entityType": self.entity_type,
+ "entityId": self.entity_id,
+ "data": update_data
+ }
+
+
+class ServerDeleteOperation(DeleteOperation):
+ """Opeartion to delete an entity.
+
+ Args:
+ project_name (str): On which project operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'asset', 'representation' etc.
+ entity_id (Union[str, ObjectId]): Entity id that will be removed.
+ session (OperationsSession): Session the operation belongs to.
+ """
+
+ def __init__(self, project_name, entity_type, entity_id, session):
+ self._session = session
+
+ if entity_type == "asset":
+ entity_type == "folder"
+
+ elif entity_type == "hero_version":
+ entity_type = "version"
+
+ elif entity_type == "subset":
+ entity_type = "product"
+
+ super(ServerDeleteOperation, self).__init__(
+ project_name, entity_type, entity_id
+ )
+
+ @property
+ def con(self):
+ return self.session.con
+
+ @property
+ def session(self):
+ return self._session
+
+ def to_server_operation(self):
+ return {
+ "id": self.id,
+ "type": self.operation_name,
+ "entityId": self.entity_id,
+ "entityType": self.entity_type,
+ }
+
+
+class OperationsSession(BaseOperationsSession):
+ def __init__(self, con=None, *args, **kwargs):
+ super(OperationsSession, self).__init__(*args, **kwargs)
+ if con is None:
+ con = get_server_api_connection()
+ self._con = con
+ self._project_cache = {}
+ self._nested_operations = collections.defaultdict(list)
+
+ @property
+ def con(self):
+ return self._con
+
+ def get_project(self, project_name):
+ if project_name not in self._project_cache:
+ self._project_cache[project_name] = self.con.get_project(
+ project_name)
+ return copy.deepcopy(self._project_cache[project_name])
+
+ def commit(self):
+ """Commit session operations."""
+
+ operations, self._operations = self._operations, []
+ if not operations:
+ return
+
+ operations_by_project = collections.defaultdict(list)
+ for operation in operations:
+ operations_by_project[operation.project_name].append(operation)
+
+ body_by_id = {}
+ results = []
+ for project_name, operations in operations_by_project.items():
+ operations_body = []
+ for operation in operations:
+ body = operation.to_server_operation()
+ if body is not None:
+ try:
+ json.dumps(body)
+ except Exception:
+ raise ValueError("Couldn't json parse body: {}".format(
+ json.dumps(
+ body, indent=4, default=failed_json_default
+ )
+ ))
+
+ body_by_id[operation.id] = body
+ operations_body.append(body)
+
+ if operations_body:
+ result = self._con.post(
+ "projects/{}/operations".format(project_name),
+ operations=operations_body,
+ canFail=False
+ )
+ results.append(result.data)
+
+ for result in results:
+ if result.get("success"):
+ continue
+
+ if "operations" not in result:
+ raise FailedOperations(
+ "Operation failed. Content: {}".format(str(result))
+ )
+
+ for op_result in result["operations"]:
+ if not op_result["success"]:
+ operation_id = op_result["id"]
+ raise FailedOperations((
+ "Operation \"{}\" failed with data:\n{}\nError: {}."
+ ).format(
+ operation_id,
+ json.dumps(body_by_id[operation_id], indent=4),
+ op_result.get("error", "unknown"),
+ ))
+
+ def create_entity(self, project_name, entity_type, data, nested_id=None):
+ """Fast access to 'ServerCreateOperation'.
+
+ Args:
+ project_name (str): On which project the creation happens.
+ entity_type (str): Which entity type will be created.
+ data (Dict[str, Any]): Entity data.
+ nested_id (str): Id of the operation that triggered this one.
+ Operations can trigger suboperations, but they must be added to
+ the operations list after their parent is added.
+
+ Returns:
+ ServerCreateOperation: Object of create operation.
+ """
+
+ operation = ServerCreateOperation(
+ project_name, entity_type, data, self
+ )
+
+ if nested_id:
+ self._nested_operations[nested_id].append(operation)
+ else:
+ self.add(operation)
+ if operation.id in self._nested_operations:
+ self.extend(self._nested_operations.pop(operation.id))
+
+ return operation
+
+ def update_entity(
+ self, project_name, entity_type, entity_id, update_data, nested_id=None
+ ):
+ """Fast access to 'ServerUpdateOperation'.
+
+ Returns:
+ ServerUpdateOperation: Object of update operation.
+ """
+
+ operation = ServerUpdateOperation(
+ project_name, entity_type, entity_id, update_data, self
+ )
+ if nested_id:
+ self._nested_operations[nested_id].append(operation)
+ else:
+ self.add(operation)
+ if operation.id in self._nested_operations:
+ self.extend(self._nested_operations.pop(operation.id))
+ return operation
+
+ def delete_entity(
+ self, project_name, entity_type, entity_id, nested_id=None
+ ):
+ """Fast access to 'ServerDeleteOperation'.
+
+ Returns:
+ ServerDeleteOperation: Object of delete operation.
+ """
+
+ operation = ServerDeleteOperation(
+ project_name, entity_type, entity_id, self
+ )
+ if nested_id:
+ self._nested_operations[nested_id].append(operation)
+ else:
+ self.add(operation)
+ if operation.id in self._nested_operations:
+ self.extend(self._nested_operations.pop(operation.id))
+ return operation
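+
+ # Usage sketch (hypothetical entity data): queue operations and send
+ # them to the server in a single batch per project.
+ # session = OperationsSession()
+ # session.create_entity("my_project", "subset", subset_data)
+ # session.delete_entity("my_project", "representation", repre_id)
+ # session.commit()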
+
+
+def create_project(
+ project_name,
+ project_code,
+ library_project=False,
+ preset_name=None,
+ con=None
+):
+ """Create project using OpenPype settings.
+
+ This project creation function does not validate the project document
+ on creation. The document is created blindly with only the minimum
+ required information about the project: its name, code, type
+ and schema.
+
+ Entered project name must be unique and project must not exist yet.
+
+ Note:
+ This function is here to be OP v4 ready, but the v3 variant has more
+ logic to handle. That's why inner imports are in the body.
+
+ Args:
+ project_name (str): New project name. Should be unique.
+ project_code (str): Project's code should be unique too.
+ library_project (bool): Project is library project.
+ preset_name (str): Name of anatomy preset. Default is used if not
+ passed.
+ con (ServerAPI): Connection to server with logged user.
+
+ Raises:
+ ValueError: When a project with the same name already exists.
+
+ Returns:
+ dict: Created project document.
+ """
+
+ if con is None:
+ con = get_server_api_connection()
+
+ return con.create_project(
+ project_name,
+ project_code,
+ library_project,
+ preset_name
+ )
+
+
+def delete_project(project_name, con=None):
+ if con is None:
+ con = get_server_api_connection()
+
+ return con.delete_project(project_name)
+
+
+def create_thumbnail(project_name, src_filepath, thumbnail_id=None, con=None):
+ if con is None:
+ con = get_server_api_connection()
+ return con.create_thumbnail(project_name, src_filepath, thumbnail_id)
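+
+ # Usage sketch (hypothetical names and filepath):
+ # project = create_project("my_project", "myproj")
+ # thumbnail_id = create_thumbnail("my_project", "/path/to/thumbnail.jpg")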
diff --git a/openpype/client/server/utils.py b/openpype/client/server/utils.py
new file mode 100644
index 0000000000..ed128cfad9
--- /dev/null
+++ b/openpype/client/server/utils.py
@@ -0,0 +1,109 @@
+import uuid
+
+from openpype.client.operations_base import REMOVED_VALUE
+
+
+def create_entity_id():
+ return uuid.uuid1().hex
+
+
+def prepare_attribute_changes(old_entity, new_entity, replace=False):
+ """Prepare changes of attributes on entities.
+
+ Compare 'attrib' of old and new entity data to prepare only changed
+ values that should be sent to server for update.
+
+ Example:
+ >>> # Limited entity data to 'attrib'
+ >>> old_entity = {
+ ... "attrib": {"attr_1": 1, "attr_2": "MyString", "attr_3": True}
+ ... }
+ >>> new_entity = {
+ ... "attrib": {"attr_1": 2, "attr_3": True, "attr_4": 3}
+ ... }
+ >>> # Changes if replacement should not happen
+ >>> expected_changes = {
+ ... "attr_1": 2,
+ ... "attr_4": 3
+ ... }
+ >>> changes = prepare_attribute_changes(old_entity, new_entity)
+ >>> changes == expected_changes
+ True
+
+ >>> # Changes if replacement should happen
+ >>> expected_changes_replace = {
+ ... "attr_1": 2,
+ ... "attr_2": REMOVED_VALUE,
+ ... "attr_4": 3
+ ... }
+ >>> changes_replace = prepare_attribute_changes(
+ ... old_entity, new_entity, True)
+ >>> changes_replace == expected_changes_replace
+ True
+
+ Args:
+ old_entity (dict[str, Any]): Data of entity queried from server.
+ new_entity (dict[str, Any]): Entity data with applied changes.
+ replace (bool): New entity should fully replace all old entity values.
+
+ Returns:
+ Dict[str, Any]: Values from new entity only if value has changed.
+ """
+
+ attrib_changes = {}
+ new_attrib = new_entity.get("attrib")
+ old_attrib = old_entity.get("attrib")
+ if new_attrib is None:
+ if not replace:
+ return attrib_changes
+ new_attrib = {}
+
+ if old_attrib is None:
+ return new_attrib
+
+ for attr, new_attr_value in new_attrib.items():
+ old_attr_value = old_attrib.get(attr)
+ if old_attr_value != new_attr_value:
+ attrib_changes[attr] = new_attr_value
+
+ if replace:
+ for attr in old_attrib:
+ if attr not in new_attrib:
+ attrib_changes[attr] = REMOVED_VALUE
+
+ return attrib_changes
+
+
+def prepare_entity_changes(old_entity, new_entity, replace=False):
+ """Prepare changes of AYON entities.
+
+ Compare old and new entity to filter values from new data that changed.
+
+ Args:
+ old_entity (dict[str, Any]): Data of entity queried from server.
+ new_entity (dict[str, Any]): Entity data with applied changes.
+ replace (bool): All attributes should be replaced by new values. So
+ all attribute values that are not on new entity will be removed.
+
+ Returns:
+ Dict[str, Any]: Only values from new entity that changed.
+ """
+
+ changes = {}
+ for key, new_value in new_entity.items():
+ if key == "attrib":
+ continue
+
+ old_value = old_entity.get(key)
+ if old_value != new_value:
+ changes[key] = new_value
+
+ if replace:
+ for key in old_entity:
+ if key not in new_entity:
+ changes[key] = REMOVED_VALUE
+
+ attr_changes = prepare_attribute_changes(old_entity, new_entity, replace)
+ if attr_changes:
+ changes["attrib"] = attr_changes
+ return changes
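+
+ # Usage sketch (hypothetical entities): 'changes' holds only modified
+ # keys, with attribute diffs nested under "attrib".
+ # changes = prepare_entity_changes(old_folder_entity, new_folder_entity)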
diff --git a/openpype/hooks/pre_global_host_data.py b/openpype/hooks/pre_global_host_data.py
index 8a178915fb..260e28a18b 100644
--- a/openpype/hooks/pre_global_host_data.py
+++ b/openpype/hooks/pre_global_host_data.py
@@ -5,7 +5,7 @@ from openpype.lib import (
prepare_app_environments,
prepare_context_environments
)
-from openpype.pipeline import AvalonMongoDB, Anatomy
+from openpype.pipeline import Anatomy
class GlobalHostDataHook(PreLaunchHook):
@@ -26,7 +26,6 @@ class GlobalHostDataHook(PreLaunchHook):
"app": app,
- "dbcon": self.data["dbcon"],
"project_doc": self.data["project_doc"],
"asset_doc": self.data["asset_doc"],
@@ -62,13 +61,6 @@ class GlobalHostDataHook(PreLaunchHook):
# Anatomy
self.data["anatomy"] = Anatomy(project_name)
- # Mongo connection
- dbcon = AvalonMongoDB()
- dbcon.Session["AVALON_PROJECT"] = project_name
- dbcon.install()
-
- self.data["dbcon"] = dbcon
-
# Project document
project_doc = get_project(project_name)
self.data["project_doc"] = project_doc
diff --git a/openpype/host/dirmap.py b/openpype/host/dirmap.py
index 42bf80ecec..e77f06e9d6 100644
--- a/openpype/host/dirmap.py
+++ b/openpype/host/dirmap.py
@@ -149,7 +149,7 @@ class HostDirmap(object):
Returns:
dict : { "source-path": [XXX], "destination-path": [YYYY]}
"""
- project_name = os.getenv("AVALON_PROJECT")
+ project_name = self.project_name
mapping = {}
if (not self.sync_module.enabled or
diff --git a/openpype/hosts/aftereffects/api/launch_logic.py b/openpype/hosts/aftereffects/api/launch_logic.py
index ea71122042..e90c3dc5b8 100644
--- a/openpype/hosts/aftereffects/api/launch_logic.py
+++ b/openpype/hosts/aftereffects/api/launch_logic.py
@@ -13,13 +13,13 @@ from wsrpc_aiohttp import (
WebSocketAsync
)
-from qtpy import QtCore, QtWidgets
+from qtpy import QtCore
from openpype.lib import Logger
-from openpype.tools.utils import host_tools
from openpype.tests.lib import is_in_tests
from openpype.pipeline import install_host, legacy_io
from openpype.modules import ModulesManager
+from openpype.tools.utils import host_tools, get_openpype_qt_app
from openpype.tools.adobe_webserver.app import WebServerTool
from .ws_stub import get_stub
@@ -43,7 +43,7 @@ def main(*subprocess_args):
install_host(host)
os.environ["OPENPYPE_LOG_NO_COLORS"] = "False"
- app = QtWidgets.QApplication([])
+ app = get_openpype_qt_app()
app.setQuitOnLastWindowClosed(False)
launcher = ProcessLauncher(subprocess_args)
diff --git a/openpype/hosts/aftereffects/api/pipeline.py b/openpype/hosts/aftereffects/api/pipeline.py
index 5566ca9e5b..8fc7a70dd8 100644
--- a/openpype/hosts/aftereffects/api/pipeline.py
+++ b/openpype/hosts/aftereffects/api/pipeline.py
@@ -23,6 +23,7 @@ from openpype.host import (
ILoadHost,
IPublishHost
)
+from openpype.tools.utils import get_openpype_qt_app
from .launch_logic import get_stub
from .ws_stub import ConnectionNotEstablishedYet
@@ -236,10 +237,7 @@ def check_inventory():
return
# Warn about outdated containers.
- _app = QtWidgets.QApplication.instance()
- if not _app:
- print("Starting new QApplication..")
- _app = QtWidgets.QApplication([])
+ _app = get_openpype_qt_app()
message_box = QtWidgets.QMessageBox()
message_box.setIcon(QtWidgets.QMessageBox.Warning)
diff --git a/openpype/hosts/aftereffects/plugins/load/load_background.py b/openpype/hosts/aftereffects/plugins/load/load_background.py
index e7c29fee5a..16f45074aa 100644
--- a/openpype/hosts/aftereffects/plugins/load/load_background.py
+++ b/openpype/hosts/aftereffects/plugins/load/load_background.py
@@ -33,9 +33,10 @@ class BackgroundLoader(api.AfterEffectsLoader):
existing_items,
"{}_{}".format(context["asset"]["name"], name))
- layers = get_background_layers(self.fname)
+ path = self.filepath_from_context(context)
+ layers = get_background_layers(path)
if not layers:
- raise ValueError("No layers found in {}".format(self.fname))
+ raise ValueError("No layers found in {}".format(path))
comp = stub.import_background(None, stub.LOADED_ICON + comp_name,
layers)
diff --git a/openpype/hosts/aftereffects/plugins/load/load_file.py b/openpype/hosts/aftereffects/plugins/load/load_file.py
index 33a86aa505..def7c927ab 100644
--- a/openpype/hosts/aftereffects/plugins/load/load_file.py
+++ b/openpype/hosts/aftereffects/plugins/load/load_file.py
@@ -29,32 +29,32 @@ class FileLoader(api.AfterEffectsLoader):
import_options = {}
- file = self.fname
+ path = self.filepath_from_context(context)
repr_cont = context["representation"]["context"]
- if "#" not in file:
+ if "#" not in path:
frame = repr_cont.get("frame")
if frame:
padding = len(frame)
- file = file.replace(frame, "#" * padding)
+ path = path.replace(frame, "#" * padding)
import_options['sequence'] = True
- if not file:
+ if not path:
repr_id = context["representation"]["_id"]
self.log.warning(
"Representation id `{}` is failing to load".format(repr_id))
return
- file = file.replace("\\", "/")
- if '.psd' in file:
+ path = path.replace("\\", "/")
+ if '.psd' in path:
import_options['ImportAsType'] = 'ImportAsType.COMP'
- comp = stub.import_file(self.fname, stub.LOADED_ICON + comp_name,
+ comp = stub.import_file(path, stub.LOADED_ICON + comp_name,
import_options)
if not comp:
self.log.warning(
- "Representation id `{}` is failing to load".format(file))
+ "Representation `{}` is failing to load".format(path))
self.log.warning("Check host app for alert error.")
return
diff --git a/openpype/hosts/aftereffects/plugins/publish/collect_workfile.py b/openpype/hosts/aftereffects/plugins/publish/collect_workfile.py
index c21c3623c3..dc557f67fc 100644
--- a/openpype/hosts/aftereffects/plugins/publish/collect_workfile.py
+++ b/openpype/hosts/aftereffects/plugins/publish/collect_workfile.py
@@ -1,7 +1,6 @@
import os
import pyblish.api
-from openpype.pipeline import legacy_io
from openpype.pipeline.create import get_subset_name
@@ -44,7 +43,7 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
instance.data["publish"] = instance.data["active"] # for DL
def _get_new_instance(self, context, scene_file):
- task = legacy_io.Session["AVALON_TASK"]
+ task = context.data["task"]
version = context.data["version"]
asset_entity = context.data["assetEntity"]
project_entity = context.data["projectEntity"]
diff --git a/openpype/hosts/aftereffects/plugins/publish/validate_instance_asset.py b/openpype/hosts/aftereffects/plugins/publish/validate_instance_asset.py
index 6c36136b20..36f6035d23 100644
--- a/openpype/hosts/aftereffects/plugins/publish/validate_instance_asset.py
+++ b/openpype/hosts/aftereffects/plugins/publish/validate_instance_asset.py
@@ -1,6 +1,6 @@
import pyblish.api
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_asset_name
from openpype.pipeline.publish import (
ValidateContentsOrder,
PublishXmlValidationError,
@@ -30,7 +30,7 @@ class ValidateInstanceAssetRepair(pyblish.api.Action):
for instance in instances:
data = stub.read(instance[0])
- data["asset"] = legacy_io.Session["AVALON_ASSET"]
+ data["asset"] = get_current_asset_name()
stub.imprint(instance[0].instance_id, data)
@@ -54,7 +54,7 @@ class ValidateInstanceAsset(pyblish.api.InstancePlugin):
def process(self, instance):
instance_asset = instance.data["asset"]
- current_asset = legacy_io.Session["AVALON_ASSET"]
+ current_asset = get_current_asset_name()
msg = (
f"Instance asset {instance_asset} is not the same "
f"as current context {current_asset}."
diff --git a/openpype/hosts/blender/api/ops.py b/openpype/hosts/blender/api/ops.py
index 91cbfe524f..2c1b7245cd 100644
--- a/openpype/hosts/blender/api/ops.py
+++ b/openpype/hosts/blender/api/ops.py
@@ -16,7 +16,7 @@ import bpy
import bpy.utils.previews
from openpype import style
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_asset_name, get_current_task_name
from openpype.tools.utils import host_tools
from .workio import OpenFileCacher
@@ -283,7 +283,7 @@ class LaunchLoader(LaunchQtApp):
def before_window_show(self):
self._window.set_context(
- {"asset": legacy_io.Session["AVALON_ASSET"]},
+ {"asset": get_current_asset_name()},
refresh=True
)
@@ -331,8 +331,8 @@ class LaunchWorkFiles(LaunchQtApp):
def execute(self, context):
result = super().execute(context)
self._window.set_context({
- "asset": legacy_io.Session["AVALON_ASSET"],
- "task": legacy_io.Session["AVALON_TASK"]
+ "asset": get_current_asset_name(),
+ "task": get_current_task_name()
})
return result
@@ -362,8 +362,8 @@ class TOPBAR_MT_avalon(bpy.types.Menu):
else:
pyblish_menu_icon_id = 0
- asset = legacy_io.Session['AVALON_ASSET']
- task = legacy_io.Session['AVALON_TASK']
+ asset = get_current_asset_name()
+ task = get_current_task_name()
context_label = f"{asset}, {task}"
context_label_item = layout.row()
context_label_item.operator(
@@ -411,6 +411,7 @@ def register():
pcoll.load("pyblish_menu_icon", str(pyblish_icon_file.absolute()), 'IMAGE')
PREVIEW_COLLECTIONS["avalon"] = pcoll
+ BlenderApplication.get_app()
for cls in classes:
bpy.utils.register_class(cls)
bpy.types.TOPBAR_MT_editor_menus.append(draw_avalon_menu)
diff --git a/openpype/hosts/blender/api/pipeline.py b/openpype/hosts/blender/api/pipeline.py
index 0f756d8cb6..eb696ec184 100644
--- a/openpype/hosts/blender/api/pipeline.py
+++ b/openpype/hosts/blender/api/pipeline.py
@@ -14,6 +14,8 @@ from openpype.client import get_asset_by_name
from openpype.pipeline import (
schema,
legacy_io,
+ get_current_project_name,
+ get_current_asset_name,
register_loader_plugin_path,
register_creator_plugin_path,
deregister_loader_plugin_path,
@@ -112,8 +114,8 @@ def message_window(title, message):
def set_start_end_frames():
- project_name = legacy_io.active_project()
- asset_name = legacy_io.Session["AVALON_ASSET"]
+ project_name = get_current_project_name()
+ asset_name = get_current_asset_name()
asset_doc = get_asset_by_name(project_name, asset_name)
scene = bpy.context.scene
diff --git a/openpype/hosts/blender/api/plugin.py b/openpype/hosts/blender/api/plugin.py
index 1274795c6b..fb87d08cce 100644
--- a/openpype/hosts/blender/api/plugin.py
+++ b/openpype/hosts/blender/api/plugin.py
@@ -243,7 +243,8 @@ class AssetLoader(LoaderPlugin):
"""
# TODO (jasper): make it possible to add the asset several times by
# just re-using the collection
- assert Path(self.fname).exists(), f"{self.fname} doesn't exist."
+ filepath = self.filepath_from_context(context)
+ assert Path(filepath).exists(), f"{filepath} doesn't exist."
asset = context["asset"]["name"]
subset = context["subset"]["name"]
diff --git a/openpype/hosts/blender/plugins/create/create_action.py b/openpype/hosts/blender/plugins/create/create_action.py
index 54b3a501a7..0203ba74c0 100644
--- a/openpype/hosts/blender/plugins/create/create_action.py
+++ b/openpype/hosts/blender/plugins/create/create_action.py
@@ -2,7 +2,7 @@
import bpy
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_task_name
import openpype.hosts.blender.api.plugin
from openpype.hosts.blender.api import lib
@@ -22,7 +22,7 @@ class CreateAction(openpype.hosts.blender.api.plugin.Creator):
name = openpype.hosts.blender.api.plugin.asset_name(asset, subset)
collection = bpy.data.collections.new(name=name)
bpy.context.scene.collection.children.link(collection)
- self.data['task'] = legacy_io.Session.get('AVALON_TASK')
+ self.data['task'] = get_current_task_name()
lib.imprint(collection, self.data)
if (self.options or {}).get("useSelection"):
diff --git a/openpype/hosts/blender/plugins/create/create_animation.py b/openpype/hosts/blender/plugins/create/create_animation.py
index a0e9e5e399..bc2840952b 100644
--- a/openpype/hosts/blender/plugins/create/create_animation.py
+++ b/openpype/hosts/blender/plugins/create/create_animation.py
@@ -2,7 +2,7 @@
import bpy
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_task_name
from openpype.hosts.blender.api import plugin, lib, ops
from openpype.hosts.blender.api.pipeline import AVALON_INSTANCES
@@ -37,7 +37,7 @@ class CreateAnimation(plugin.Creator):
# asset_group.empty_display_type = 'SINGLE_ARROW'
asset_group = bpy.data.collections.new(name=name)
instances.children.link(asset_group)
- self.data['task'] = legacy_io.Session.get('AVALON_TASK')
+ self.data['task'] = get_current_task_name()
lib.imprint(asset_group, self.data)
if (self.options or {}).get("useSelection"):
diff --git a/openpype/hosts/blender/plugins/create/create_camera.py b/openpype/hosts/blender/plugins/create/create_camera.py
index ada512d7ac..6defe02fe5 100644
--- a/openpype/hosts/blender/plugins/create/create_camera.py
+++ b/openpype/hosts/blender/plugins/create/create_camera.py
@@ -2,7 +2,7 @@
import bpy
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_task_name
from openpype.hosts.blender.api import plugin, lib, ops
from openpype.hosts.blender.api.pipeline import AVALON_INSTANCES
@@ -35,7 +35,7 @@ class CreateCamera(plugin.Creator):
asset_group = bpy.data.objects.new(name=name, object_data=None)
asset_group.empty_display_type = 'SINGLE_ARROW'
instances.objects.link(asset_group)
- self.data['task'] = legacy_io.Session.get('AVALON_TASK')
+ self.data['task'] = get_current_task_name()
print(f"self.data: {self.data}")
lib.imprint(asset_group, self.data)
diff --git a/openpype/hosts/blender/plugins/create/create_layout.py b/openpype/hosts/blender/plugins/create/create_layout.py
index 5949a4b86e..68cfaa41ac 100644
--- a/openpype/hosts/blender/plugins/create/create_layout.py
+++ b/openpype/hosts/blender/plugins/create/create_layout.py
@@ -2,7 +2,7 @@
import bpy
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_task_name
from openpype.hosts.blender.api import plugin, lib, ops
from openpype.hosts.blender.api.pipeline import AVALON_INSTANCES
@@ -34,7 +34,7 @@ class CreateLayout(plugin.Creator):
asset_group = bpy.data.objects.new(name=name, object_data=None)
asset_group.empty_display_type = 'SINGLE_ARROW'
instances.objects.link(asset_group)
- self.data['task'] = legacy_io.Session.get('AVALON_TASK')
+ self.data['task'] = get_current_task_name()
lib.imprint(asset_group, self.data)
# Add selected objects to instance
diff --git a/openpype/hosts/blender/plugins/create/create_model.py b/openpype/hosts/blender/plugins/create/create_model.py
index fedc708943..e5204b5b53 100644
--- a/openpype/hosts/blender/plugins/create/create_model.py
+++ b/openpype/hosts/blender/plugins/create/create_model.py
@@ -2,7 +2,7 @@
import bpy
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_task_name
from openpype.hosts.blender.api import plugin, lib, ops
from openpype.hosts.blender.api.pipeline import AVALON_INSTANCES
@@ -34,7 +34,7 @@ class CreateModel(plugin.Creator):
asset_group = bpy.data.objects.new(name=name, object_data=None)
asset_group.empty_display_type = 'SINGLE_ARROW'
instances.objects.link(asset_group)
- self.data['task'] = legacy_io.Session.get('AVALON_TASK')
+ self.data['task'] = get_current_task_name()
lib.imprint(asset_group, self.data)
# Add selected objects to instance
diff --git a/openpype/hosts/blender/plugins/create/create_pointcache.py b/openpype/hosts/blender/plugins/create/create_pointcache.py
index 38707fd3b1..6220f68dc5 100644
--- a/openpype/hosts/blender/plugins/create/create_pointcache.py
+++ b/openpype/hosts/blender/plugins/create/create_pointcache.py
@@ -2,7 +2,7 @@
import bpy
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_task_name
import openpype.hosts.blender.api.plugin
from openpype.hosts.blender.api import lib
@@ -22,7 +22,7 @@ class CreatePointcache(openpype.hosts.blender.api.plugin.Creator):
name = openpype.hosts.blender.api.plugin.asset_name(asset, subset)
collection = bpy.data.collections.new(name=name)
bpy.context.scene.collection.children.link(collection)
- self.data['task'] = legacy_io.Session.get('AVALON_TASK')
+ self.data['task'] = get_current_task_name()
lib.imprint(collection, self.data)
if (self.options or {}).get("useSelection"):
diff --git a/openpype/hosts/blender/plugins/create/create_review.py b/openpype/hosts/blender/plugins/create/create_review.py
index bf4ea6a7cd..914f249891 100644
--- a/openpype/hosts/blender/plugins/create/create_review.py
+++ b/openpype/hosts/blender/plugins/create/create_review.py
@@ -2,7 +2,7 @@
import bpy
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_task_name
from openpype.hosts.blender.api import plugin, lib, ops
from openpype.hosts.blender.api.pipeline import AVALON_INSTANCES
@@ -33,7 +33,7 @@ class CreateReview(plugin.Creator):
name = plugin.asset_name(asset, subset)
asset_group = bpy.data.collections.new(name=name)
instances.children.link(asset_group)
- self.data['task'] = legacy_io.Session.get('AVALON_TASK')
+ self.data['task'] = get_current_task_name()
lib.imprint(asset_group, self.data)
if (self.options or {}).get("useSelection"):
diff --git a/openpype/hosts/blender/plugins/create/create_rig.py b/openpype/hosts/blender/plugins/create/create_rig.py
index 0abd306c6b..2e04fb71c1 100644
--- a/openpype/hosts/blender/plugins/create/create_rig.py
+++ b/openpype/hosts/blender/plugins/create/create_rig.py
@@ -2,7 +2,7 @@
import bpy
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_task_name
from openpype.hosts.blender.api import plugin, lib, ops
from openpype.hosts.blender.api.pipeline import AVALON_INSTANCES
@@ -34,7 +34,7 @@ class CreateRig(plugin.Creator):
asset_group = bpy.data.objects.new(name=name, object_data=None)
asset_group.empty_display_type = 'SINGLE_ARROW'
instances.objects.link(asset_group)
- self.data['task'] = legacy_io.Session.get('AVALON_TASK')
+ self.data['task'] = get_current_task_name()
lib.imprint(asset_group, self.data)
# Add selected objects to instance
diff --git a/openpype/hosts/blender/plugins/load/import_workfile.py b/openpype/hosts/blender/plugins/load/import_workfile.py
index bbdf1c7ea0..4f5016d422 100644
--- a/openpype/hosts/blender/plugins/load/import_workfile.py
+++ b/openpype/hosts/blender/plugins/load/import_workfile.py
@@ -52,7 +52,8 @@ class AppendBlendLoader(plugin.AssetLoader):
color = "#775555"
def load(self, context, name=None, namespace=None, data=None):
- append_workfile(context, self.fname, False)
+ path = self.filepath_from_context(context)
+ append_workfile(context, path, False)
# We do not containerize imported content, it remains unmanaged
return
@@ -76,7 +77,8 @@ class ImportBlendLoader(plugin.AssetLoader):
color = "#775555"
def load(self, context, name=None, namespace=None, data=None):
- append_workfile(context, self.fname, True)
+ path = self.filepath_from_context(context)
+ append_workfile(context, path, True)
# We do not containerize imported content, it remains unmanaged
return
diff --git a/openpype/hosts/blender/plugins/load/load_abc.py b/openpype/hosts/blender/plugins/load/load_abc.py
index c1d73eff02..292925c833 100644
--- a/openpype/hosts/blender/plugins/load/load_abc.py
+++ b/openpype/hosts/blender/plugins/load/load_abc.py
@@ -111,7 +111,7 @@ class CacheModelLoader(plugin.AssetLoader):
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
asset = context["asset"]["name"]
subset = context["subset"]["name"]
diff --git a/openpype/hosts/blender/plugins/load/load_action.py b/openpype/hosts/blender/plugins/load/load_action.py
index 3c8fe988f0..3447e67ebf 100644
--- a/openpype/hosts/blender/plugins/load/load_action.py
+++ b/openpype/hosts/blender/plugins/load/load_action.py
@@ -43,7 +43,7 @@ class BlendActionLoader(openpype.hosts.blender.api.plugin.AssetLoader):
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
asset = context["asset"]["name"]
subset = context["subset"]["name"]
lib_container = openpype.hosts.blender.api.plugin.asset_name(asset, subset)
diff --git a/openpype/hosts/blender/plugins/load/load_animation.py b/openpype/hosts/blender/plugins/load/load_animation.py
index 6b8d4abd04..3e7f808903 100644
--- a/openpype/hosts/blender/plugins/load/load_animation.py
+++ b/openpype/hosts/blender/plugins/load/load_animation.py
@@ -34,7 +34,7 @@ class BlendAnimationLoader(plugin.AssetLoader):
context: Full parenthood of representation to load
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
with bpy.data.libraries.load(
libpath, link=True, relative=False
diff --git a/openpype/hosts/blender/plugins/load/load_audio.py b/openpype/hosts/blender/plugins/load/load_audio.py
index 3f4fcc17de..ac8f363316 100644
--- a/openpype/hosts/blender/plugins/load/load_audio.py
+++ b/openpype/hosts/blender/plugins/load/load_audio.py
@@ -38,7 +38,7 @@ class AudioLoader(plugin.AssetLoader):
context: Full parenthood of representation to load
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
asset = context["asset"]["name"]
subset = context["subset"]["name"]
diff --git a/openpype/hosts/blender/plugins/load/load_camera_abc.py b/openpype/hosts/blender/plugins/load/load_camera_abc.py
index 21b48f409f..e5afecff66 100644
--- a/openpype/hosts/blender/plugins/load/load_camera_abc.py
+++ b/openpype/hosts/blender/plugins/load/load_camera_abc.py
@@ -81,7 +81,9 @@ class AbcCameraLoader(plugin.AssetLoader):
context: Full parenthood of representation to load
options: Additional settings dictionary
"""
- libpath = self.fname
+
+ libpath = self.filepath_from_context(context)
+
asset = context["asset"]["name"]
subset = context["subset"]["name"]
diff --git a/openpype/hosts/blender/plugins/load/load_camera_blend.py b/openpype/hosts/blender/plugins/load/load_camera_blend.py
index f00027f0b4..bd4820bf78 100644
--- a/openpype/hosts/blender/plugins/load/load_camera_blend.py
+++ b/openpype/hosts/blender/plugins/load/load_camera_blend.py
@@ -110,7 +110,7 @@ class BlendCameraLoader(plugin.AssetLoader):
context: Full parenthood of representation to load
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
asset = context["asset"]["name"]
subset = context["subset"]["name"]
diff --git a/openpype/hosts/blender/plugins/load/load_camera_fbx.py b/openpype/hosts/blender/plugins/load/load_camera_fbx.py
index 97f844e610..b9d05dda0a 100644
--- a/openpype/hosts/blender/plugins/load/load_camera_fbx.py
+++ b/openpype/hosts/blender/plugins/load/load_camera_fbx.py
@@ -86,7 +86,7 @@ class FbxCameraLoader(plugin.AssetLoader):
context: Full parenthood of representation to load
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
asset = context["asset"]["name"]
subset = context["subset"]["name"]
diff --git a/openpype/hosts/blender/plugins/load/load_fbx.py b/openpype/hosts/blender/plugins/load/load_fbx.py
index ee2e7d175c..e129ea6754 100644
--- a/openpype/hosts/blender/plugins/load/load_fbx.py
+++ b/openpype/hosts/blender/plugins/load/load_fbx.py
@@ -130,7 +130,7 @@ class FbxModelLoader(plugin.AssetLoader):
context: Full parenthood of representation to load
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
asset = context["asset"]["name"]
subset = context["subset"]["name"]
diff --git a/openpype/hosts/blender/plugins/load/load_layout_blend.py b/openpype/hosts/blender/plugins/load/load_layout_blend.py
index 7d2fd23444..03ccbce3d7 100644
--- a/openpype/hosts/blender/plugins/load/load_layout_blend.py
+++ b/openpype/hosts/blender/plugins/load/load_layout_blend.py
@@ -267,7 +267,7 @@ class BlendLayoutLoader(plugin.AssetLoader):
context: Full parenthood of representation to load
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
asset = context["asset"]["name"]
subset = context["subset"]["name"]
representation = str(context["representation"]["_id"])
diff --git a/openpype/hosts/blender/plugins/load/load_layout_json.py b/openpype/hosts/blender/plugins/load/load_layout_json.py
index eca098627e..81683b8de8 100644
--- a/openpype/hosts/blender/plugins/load/load_layout_json.py
+++ b/openpype/hosts/blender/plugins/load/load_layout_json.py
@@ -144,7 +144,7 @@ class JsonLayoutLoader(plugin.AssetLoader):
context: Full parenthood of representation to load
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
asset = context["asset"]["name"]
subset = context["subset"]["name"]
diff --git a/openpype/hosts/blender/plugins/load/load_look.py b/openpype/hosts/blender/plugins/load/load_look.py
index 70d1b95f02..c121f55633 100644
--- a/openpype/hosts/blender/plugins/load/load_look.py
+++ b/openpype/hosts/blender/plugins/load/load_look.py
@@ -92,7 +92,7 @@ class BlendLookLoader(plugin.AssetLoader):
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
asset = context["asset"]["name"]
subset = context["subset"]["name"]
diff --git a/openpype/hosts/blender/plugins/load/load_model.py b/openpype/hosts/blender/plugins/load/load_model.py
index 0a5d98ffa0..3210a4e841 100644
--- a/openpype/hosts/blender/plugins/load/load_model.py
+++ b/openpype/hosts/blender/plugins/load/load_model.py
@@ -113,7 +113,7 @@ class BlendModelLoader(plugin.AssetLoader):
context: Full parenthood of representation to load
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
asset = context["asset"]["name"]
subset = context["subset"]["name"]
diff --git a/openpype/hosts/blender/plugins/load/load_rig.py b/openpype/hosts/blender/plugins/load/load_rig.py
index 1d23a70061..b9b5ad935f 100644
--- a/openpype/hosts/blender/plugins/load/load_rig.py
+++ b/openpype/hosts/blender/plugins/load/load_rig.py
@@ -181,7 +181,7 @@ class BlendRigLoader(plugin.AssetLoader):
context: Full parenthood of representation to load
options: Additional settings dictionary
"""
- libpath = self.fname
+ libpath = self.filepath_from_context(context)
asset = context["asset"]["name"]
subset = context["subset"]["name"]
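All of the Blender loaders above trade the deprecated `self.fname` attribute for `filepath_from_context(context)`; a hedged sketch of the resulting `load()` shape (the `SketchLoader` class is hypothetical):

```python
from openpype.pipeline import load

class SketchLoader(load.LoaderPlugin):
    """Hypothetical loader illustrating the filepath_from_context pattern."""

    def load(self, context, name=None, namespace=None, data=None):
        # Resolve the representation path from the loading context
        # instead of reading the deprecated self.fname attribute.
        libpath = self.filepath_from_context(context)
        asset = context["asset"]["name"]
        subset = context["subset"]["name"]
        # ...host-specific import of libpath follows here...
```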
diff --git a/openpype/hosts/blender/plugins/publish/collect_current_file.py b/openpype/hosts/blender/plugins/publish/collect_current_file.py
index c3097a0694..c2d8a96a18 100644
--- a/openpype/hosts/blender/plugins/publish/collect_current_file.py
+++ b/openpype/hosts/blender/plugins/publish/collect_current_file.py
@@ -2,7 +2,7 @@ import os
import bpy
import pyblish.api
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_task_name, get_current_asset_name
from openpype.hosts.blender.api import workio
@@ -37,7 +37,7 @@ class CollectBlenderCurrentFile(pyblish.api.ContextPlugin):
folder, file = os.path.split(current_file)
filename, ext = os.path.splitext(file)
- task = legacy_io.Session["AVALON_TASK"]
+ task = get_current_task_name()
data = {}
@@ -47,7 +47,7 @@ class CollectBlenderCurrentFile(pyblish.api.ContextPlugin):
data.update({
"subset": subset,
- "asset": os.getenv("AVALON_ASSET", None),
+ "asset": get_current_asset_name(),
"label": subset,
"publish": True,
"family": "workfile",
diff --git a/openpype/hosts/blender/plugins/publish/collect_review.py b/openpype/hosts/blender/plugins/publish/collect_review.py
index d6abd9d967..82b3ca11eb 100644
--- a/openpype/hosts/blender/plugins/publish/collect_review.py
+++ b/openpype/hosts/blender/plugins/publish/collect_review.py
@@ -1,7 +1,6 @@
import bpy
import pyblish.api
-from openpype.pipeline import legacy_io
class CollectReview(pyblish.api.InstancePlugin):
@@ -39,7 +38,7 @@ class CollectReview(pyblish.api.InstancePlugin):
if not instance.data.get("remove"):
- task = legacy_io.Session.get("AVALON_TASK")
+ task = instance.context.data["task"]
instance.data.update({
"subset": f"{task}Review",
diff --git a/openpype/hosts/celaction/plugins/publish/collect_celaction_instances.py b/openpype/hosts/celaction/plugins/publish/collect_celaction_instances.py
index 35ac7fc264..c815c1edd4 100644
--- a/openpype/hosts/celaction/plugins/publish/collect_celaction_instances.py
+++ b/openpype/hosts/celaction/plugins/publish/collect_celaction_instances.py
@@ -1,6 +1,5 @@
import os
import pyblish.api
-from openpype.pipeline import legacy_io
class CollectCelactionInstances(pyblish.api.ContextPlugin):
@@ -10,7 +9,7 @@ class CollectCelactionInstances(pyblish.api.ContextPlugin):
order = pyblish.api.CollectorOrder + 0.1
def process(self, context):
- task = legacy_io.Session["AVALON_TASK"]
+ task = context.data["task"]
current_file = context.data["currentFile"]
staging_dir = os.path.dirname(current_file)
scene_file = os.path.basename(current_file)
diff --git a/openpype/hosts/flame/api/menu.py b/openpype/hosts/flame/api/menu.py
index 5f9dc57a61..e8bdf32ebd 100644
--- a/openpype/hosts/flame/api/menu.py
+++ b/openpype/hosts/flame/api/menu.py
@@ -1,7 +1,9 @@
-import os
-from qtpy import QtWidgets
from copy import deepcopy
from pprint import pformat
+
+from qtpy import QtWidgets
+
+from openpype.pipeline import get_current_project_name
from openpype.tools.utils.host_tools import HostToolsHelper
menu_group_name = 'OpenPype'
@@ -61,10 +63,10 @@ class _FlameMenuApp(object):
self.framework.prefs_global, self.name)
self.mbox = QtWidgets.QMessageBox()
-
+ project_name = get_current_project_name()
self.menu = {
"actions": [{
- 'name': os.getenv("AVALON_PROJECT", "project"),
+ 'name': project_name or "project",
'isEnabled': False
}],
"name": self.menu_group_name
diff --git a/openpype/hosts/flame/plugins/load/load_clip.py b/openpype/hosts/flame/plugins/load/load_clip.py
index dfb2d2b6f0..338833b449 100644
--- a/openpype/hosts/flame/plugins/load/load_clip.py
+++ b/openpype/hosts/flame/plugins/load/load_clip.py
@@ -82,8 +82,9 @@ class LoadClip(opfapi.ClipLoader):
os.makedirs(openclip_dir)
# prepare clip data from context and send it to openClipLoader
+ path = self.filepath_from_context(context)
loading_context = {
- "path": self.fname.replace("\\", "/"),
+ "path": path.replace("\\", "/"),
"colorspace": colorspace,
"version": "v{:0>3}".format(version_name),
"layer_rename_template": self.layer_rename_template,
diff --git a/openpype/hosts/flame/plugins/load/load_clip_batch.py b/openpype/hosts/flame/plugins/load/load_clip_batch.py
index 5c5a77f0d0..ca43b94ee9 100644
--- a/openpype/hosts/flame/plugins/load/load_clip_batch.py
+++ b/openpype/hosts/flame/plugins/load/load_clip_batch.py
@@ -81,9 +81,10 @@ class LoadClipBatch(opfapi.ClipLoader):
if not os.path.exists(openclip_dir):
os.makedirs(openclip_dir)
- # prepare clip data from context ad send it to openClipLoader
+ # prepare clip data from context and send it to openClipLoader
+ path = self.filepath_from_context(context)
loading_context = {
- "path": self.fname.replace("\\", "/"),
+ "path": path.replace("\\", "/"),
"colorspace": colorspace,
"version": "v{:0>3}".format(version_name),
"layer_rename_template": self.layer_rename_template,
diff --git a/openpype/hosts/flame/plugins/publish/collect_timeline_otio.py b/openpype/hosts/flame/plugins/publish/collect_timeline_otio.py
index 917041e053..f8cfa9e963 100644
--- a/openpype/hosts/flame/plugins/publish/collect_timeline_otio.py
+++ b/openpype/hosts/flame/plugins/publish/collect_timeline_otio.py
@@ -2,7 +2,6 @@ import pyblish.api
import openpype.hosts.flame.api as opfapi
from openpype.hosts.flame.otio import flame_export
-from openpype.pipeline import legacy_io
from openpype.pipeline.create import get_subset_name
@@ -19,7 +18,7 @@ class CollecTimelineOTIO(pyblish.api.ContextPlugin):
# main
asset_doc = context.data["assetEntity"]
- task_name = legacy_io.Session["AVALON_TASK"]
+ task_name = context.data["task"]
project = opfapi.get_current_project()
sequence = opfapi.get_current_sequence(opfapi.CTX.selection)
diff --git a/openpype/hosts/fusion/api/lib.py b/openpype/hosts/fusion/api/lib.py
index cba8c38c2f..d96557571b 100644
--- a/openpype/hosts/fusion/api/lib.py
+++ b/openpype/hosts/fusion/api/lib.py
@@ -14,7 +14,7 @@ from openpype.client import (
)
from openpype.pipeline import (
switch_container,
- legacy_io,
+ get_current_project_name,
)
from openpype.pipeline.context_tools import get_current_project_asset
@@ -206,7 +206,7 @@ def switch_item(container,
# Collect any of current asset, subset and representation if not provided
# so we can use the original name from those.
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
if any(not x for x in [asset_name, subset_name, representation_name]):
repre_id = container["representation"]
representation = get_representation_by_id(project_name, repre_id)
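`legacy_io.active_project()` and the various `AVALON_PROJECT` environment reads collapse into one getter throughout this PR; the equivalence, as a sketch:

```python
from openpype.pipeline import get_current_project_name

# Replaces both legacy spellings:
#   legacy_io.active_project()
#   legacy_io.Session["AVALON_PROJECT"] / os.getenv("AVALON_PROJECT")
project_name = get_current_project_name()
```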
diff --git a/openpype/hosts/fusion/api/menu.py b/openpype/hosts/fusion/api/menu.py
index 92f38a64c2..50250a6656 100644
--- a/openpype/hosts/fusion/api/menu.py
+++ b/openpype/hosts/fusion/api/menu.py
@@ -12,7 +12,7 @@ from openpype.hosts.fusion.api.lib import (
set_asset_framerange,
set_asset_resolution,
)
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_asset_name
from openpype.resources import get_openpype_icon_filepath
from .pipeline import FusionEventHandler
@@ -125,7 +125,7 @@ class OpenPypeMenu(QtWidgets.QWidget):
def on_task_changed(self):
# Update current context label
- label = legacy_io.Session["AVALON_ASSET"]
+ label = get_current_asset_name()
self.asset_label.setText(label)
def register_callback(self, name, fn):
diff --git a/openpype/hosts/fusion/deploy/Scripts/Comp/OpenPype/switch_ui.py b/openpype/hosts/fusion/deploy/Scripts/Comp/OpenPype/switch_ui.py
index f08dc0bf2c..87322235f5 100644
--- a/openpype/hosts/fusion/deploy/Scripts/Comp/OpenPype/switch_ui.py
+++ b/openpype/hosts/fusion/deploy/Scripts/Comp/OpenPype/switch_ui.py
@@ -11,7 +11,7 @@ from openpype.client import get_assets
from openpype import style
from openpype.pipeline import (
install_host,
- legacy_io,
+ get_current_project_name,
)
from openpype.hosts.fusion import api
from openpype.pipeline.context_tools import get_workdir_from_session
@@ -167,7 +167,7 @@ class App(QtWidgets.QWidget):
return items
def collect_asset_names(self):
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
asset_docs = get_assets(project_name, fields=["name"])
asset_names = {
asset_doc["name"]
diff --git a/openpype/hosts/fusion/plugins/create/create_workfile.py b/openpype/hosts/fusion/plugins/create/create_workfile.py
index 40721ea88a..8acaaa172f 100644
--- a/openpype/hosts/fusion/plugins/create/create_workfile.py
+++ b/openpype/hosts/fusion/plugins/create/create_workfile.py
@@ -5,7 +5,6 @@ from openpype.client import get_asset_by_name
from openpype.pipeline import (
AutoCreator,
CreatedInstance,
- legacy_io,
)
@@ -64,10 +63,10 @@ class FusionWorkfileCreator(AutoCreator):
existing_instance = instance
break
- project_name = legacy_io.Session["AVALON_PROJECT"]
- asset_name = legacy_io.Session["AVALON_ASSET"]
- task_name = legacy_io.Session["AVALON_TASK"]
- host_name = legacy_io.Session["AVALON_APP"]
+ project_name = self.create_context.get_current_project_name()
+ asset_name = self.create_context.get_current_asset_name()
+ task_name = self.create_context.get_current_task_name()
+ host_name = self.create_context.host_name
if existing_instance is None:
asset_doc = get_asset_by_name(project_name, asset_name)
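The workfile auto-creators now read context from their `CreateContext` rather than `legacy_io.Session`; a stripped-down sketch of the pattern (class body illustrative only):

```python
from openpype.pipeline import AutoCreator

class SketchWorkfileCreator(AutoCreator):
    """Illustrative only; mirrors the Fusion and Houdini workfile creators."""

    def create(self):
        # Context comes from CreateContext, not legacy_io.Session.
        project_name = self.create_context.get_current_project_name()
        asset_name = self.create_context.get_current_asset_name()
        task_name = self.create_context.get_current_task_name()
        host_name = self.create_context.host_name
        # ...find or create the workfile instance here...
```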
diff --git a/openpype/hosts/fusion/plugins/load/load_alembic.py b/openpype/hosts/fusion/plugins/load/load_alembic.py
index 11bf59af12..9b6d1e12b4 100644
--- a/openpype/hosts/fusion/plugins/load/load_alembic.py
+++ b/openpype/hosts/fusion/plugins/load/load_alembic.py
@@ -32,7 +32,7 @@ class FusionLoadAlembicMesh(load.LoaderPlugin):
comp = get_current_comp()
with comp_lock_and_undo_chunk(comp, "Create tool"):
- path = self.fname
+ path = self.filepath_from_context(context)
args = (-32768, -32768)
tool = comp.AddTool(self.tool_type, *args)
diff --git a/openpype/hosts/fusion/plugins/load/load_fbx.py b/openpype/hosts/fusion/plugins/load/load_fbx.py
index c73ad78394..d15d2c33d7 100644
--- a/openpype/hosts/fusion/plugins/load/load_fbx.py
+++ b/openpype/hosts/fusion/plugins/load/load_fbx.py
@@ -45,7 +45,7 @@ class FusionLoadFBXMesh(load.LoaderPlugin):
# Create the Loader with the filename path set
comp = get_current_comp()
with comp_lock_and_undo_chunk(comp, "Create tool"):
- path = self.fname
+ path = self.filepath_from_context(context)
args = (-32768, -32768)
tool = comp.AddTool(self.tool_type, *args)
diff --git a/openpype/hosts/fusion/plugins/load/load_sequence.py b/openpype/hosts/fusion/plugins/load/load_sequence.py
index 552e282587..20be5faaba 100644
--- a/openpype/hosts/fusion/plugins/load/load_sequence.py
+++ b/openpype/hosts/fusion/plugins/load/load_sequence.py
@@ -1,10 +1,7 @@
import contextlib
import openpype.pipeline.load as load
-from openpype.pipeline.load import (
- get_representation_context,
- get_representation_path_from_context,
-)
+from openpype.pipeline.load import get_representation_context
from openpype.hosts.fusion.api import (
imprint_container,
get_current_comp,
@@ -157,7 +154,7 @@ class FusionLoadSequence(load.LoaderPlugin):
namespace = context["asset"]["name"]
# Use the first file for now
- path = get_representation_path_from_context(context)
+ path = self.filepath_from_context(context)
# Create the Loader with the filename path set
comp = get_current_comp()
@@ -228,7 +225,7 @@ class FusionLoadSequence(load.LoaderPlugin):
comp = tool.Comp()
context = get_representation_context(representation)
- path = get_representation_path_from_context(context)
+ path = self.filepath_from_context(context)
# Get start frame from version data
start = self._get_start(context["version"], tool)
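The `update()` half of the same pattern rebuilds a full loading context from the representation before resolving the path; a sketch using the helpers visible in the diff above (the standalone function is hypothetical):

```python
from openpype.pipeline.load import get_representation_context

def resolve_updated_path(loader, representation):
    # Rebuild the loading context from the representation document,
    # then resolve the file path through the same plugin helper
    # that load() uses.
    context = get_representation_context(representation)
    return loader.filepath_from_context(context)
```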
diff --git a/openpype/hosts/fusion/plugins/load/load_workfile.py b/openpype/hosts/fusion/plugins/load/load_workfile.py
index b49d104a15..14e36ca8fd 100644
--- a/openpype/hosts/fusion/plugins/load/load_workfile.py
+++ b/openpype/hosts/fusion/plugins/load/load_workfile.py
@@ -27,6 +27,7 @@ class FusionLoadWorkfile(load.LoaderPlugin):
# Get needed elements
bmd = get_bmd_library()
comp = get_current_comp()
+ path = self.filepath_from_context(context)
# Paste the content of the file into the current comp
- comp.Paste(bmd.readfile(self.fname))
+ comp.Paste(bmd.readfile(path))
diff --git a/openpype/hosts/harmony/api/README.md b/openpype/hosts/harmony/api/README.md
index 12f21f551a..be3920fe29 100644
--- a/openpype/hosts/harmony/api/README.md
+++ b/openpype/hosts/harmony/api/README.md
@@ -610,7 +610,7 @@ class ImageSequenceLoader(load.LoaderPlugin):
def update(self, container, representation):
node = container.pop("node")
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version = get_version_by_id(project_name, representation["parent"])
files = []
for f in version["data"]["files"]:
diff --git a/openpype/hosts/harmony/plugins/load/load_background.py b/openpype/hosts/harmony/plugins/load/load_background.py
index c28a87791e..853d347c2e 100644
--- a/openpype/hosts/harmony/plugins/load/load_background.py
+++ b/openpype/hosts/harmony/plugins/load/load_background.py
@@ -238,7 +238,8 @@ class BackgroundLoader(load.LoaderPlugin):
def load(self, context, name=None, namespace=None, data=None):
- with open(self.fname) as json_file:
+ path = self.filepath_from_context(context)
+ with open(path) as json_file:
data = json.load(json_file)
layers = list()
@@ -251,7 +252,7 @@ class BackgroundLoader(load.LoaderPlugin):
if layer.get("filename"):
layers.append(layer["filename"])
- bg_folder = os.path.dirname(self.fname)
+ bg_folder = os.path.dirname(path)
subset_name = context["subset"]["name"]
# read_node_name += "_{}".format(uuid.uuid4())
diff --git a/openpype/hosts/harmony/plugins/load/load_imagesequence.py b/openpype/hosts/harmony/plugins/load/load_imagesequence.py
index b95d25f507..754f82e5d5 100644
--- a/openpype/hosts/harmony/plugins/load/load_imagesequence.py
+++ b/openpype/hosts/harmony/plugins/load/load_imagesequence.py
@@ -34,7 +34,7 @@ class ImageSequenceLoader(load.LoaderPlugin):
data (dict, optional): Additional data passed into loader.
"""
- fname = Path(self.fname)
+ fname = Path(self.filepath_from_context(context))
self_name = self.__class__.__name__
collections, remainder = clique.assemble(
os.listdir(fname.parent.as_posix())
diff --git a/openpype/hosts/harmony/plugins/publish/collect_farm_render.py b/openpype/hosts/harmony/plugins/publish/collect_farm_render.py
index f6b26eb3e8..5e9b9094a7 100644
--- a/openpype/hosts/harmony/plugins/publish/collect_farm_render.py
+++ b/openpype/hosts/harmony/plugins/publish/collect_farm_render.py
@@ -5,7 +5,6 @@ from pathlib import Path
import attr
from openpype.lib import get_formatted_current_time
-from openpype.pipeline import legacy_io
from openpype.pipeline import publish
from openpype.pipeline.publish import RenderInstance
import openpype.hosts.harmony.api as harmony
@@ -99,6 +98,8 @@ class CollectFarmRender(publish.AbstractCollectRender):
self_name = self.__class__.__name__
+ asset_name = context.data["asset"]
+
for node in context.data["allNodes"]:
data = harmony.read(node)
@@ -141,7 +142,7 @@ class CollectFarmRender(publish.AbstractCollectRender):
source=context.data["currentFile"],
label=node.split("/")[1],
subset=subset_name,
- asset=legacy_io.Session["AVALON_ASSET"],
+ asset=asset_name,
task=task_name,
attachTo=False,
setMembers=[node],
diff --git a/openpype/hosts/harmony/plugins/publish/collect_palettes.py b/openpype/hosts/harmony/plugins/publish/collect_palettes.py
index bbd60d1c55..e19057e302 100644
--- a/openpype/hosts/harmony/plugins/publish/collect_palettes.py
+++ b/openpype/hosts/harmony/plugins/publish/collect_palettes.py
@@ -1,6 +1,5 @@
# -*- coding: utf-8 -*-
"""Collect palettes from Harmony."""
-import os
import json
import re
@@ -32,6 +31,7 @@ class CollectPalettes(pyblish.api.ContextPlugin):
if (not any([re.search(pattern, task_name)
for pattern in self.allowed_tasks])):
return
+ asset_name = context.data["asset"]
for name, id in palettes.items():
instance = context.create_instance(name)
@@ -39,7 +39,7 @@ class CollectPalettes(pyblish.api.ContextPlugin):
"id": id,
"family": "harmony.palette",
'families': [],
- "asset": os.environ["AVALON_ASSET"],
+ "asset": asset_name,
"subset": "{}{}".format("palette", name)
})
self.log.info(
diff --git a/openpype/hosts/harmony/plugins/publish/collect_workfile.py b/openpype/hosts/harmony/plugins/publish/collect_workfile.py
index 3624147435..4492ab37a5 100644
--- a/openpype/hosts/harmony/plugins/publish/collect_workfile.py
+++ b/openpype/hosts/harmony/plugins/publish/collect_workfile.py
@@ -36,5 +36,5 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
"family": family,
"families": [family],
"representations": [],
- "asset": os.environ["AVALON_ASSET"]
+ "asset": context.data["asset"]
})
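The Harmony collectors now read the asset and task from data collected globally on the publish context instead of environment variables; a minimal sketch (the plugin name is hypothetical):

```python
import pyblish.api

class CollectSketch(pyblish.api.ContextPlugin):
    """Hypothetical collector; mirrors the Harmony plugins above."""

    order = pyblish.api.CollectorOrder

    def process(self, context):
        # Collected once globally, replacing os.environ["AVALON_ASSET"]
        # and legacy_io.Session["AVALON_TASK"] lookups.
        asset_name = context.data["asset"]
        task_name = context.data["task"]
        # ...create instances using asset_name / task_name...
```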
diff --git a/openpype/hosts/harmony/plugins/publish/extract_template.py b/openpype/hosts/harmony/plugins/publish/extract_template.py
index 458bf25a3c..e75459fe1e 100644
--- a/openpype/hosts/harmony/plugins/publish/extract_template.py
+++ b/openpype/hosts/harmony/plugins/publish/extract_template.py
@@ -75,7 +75,7 @@ class ExtractTemplate(publish.Extractor):
instance.data["representations"] = [representation]
instance.data["version_name"] = "{}_{}".format(
- instance.data["subset"], os.environ["AVALON_TASK"])
+ instance.data["subset"], instance.context.data["task"])
def get_backdrops(self, node: str) -> list:
"""Get backdrops for the node.
diff --git a/openpype/hosts/harmony/plugins/publish/validate_instances.py b/openpype/hosts/harmony/plugins/publish/validate_instances.py
index ac367082ef..7183de6048 100644
--- a/openpype/hosts/harmony/plugins/publish/validate_instances.py
+++ b/openpype/hosts/harmony/plugins/publish/validate_instances.py
@@ -1,8 +1,7 @@
-import os
-
import pyblish.api
import openpype.hosts.harmony.api as harmony
+from openpype.pipeline import get_current_asset_name
from openpype.pipeline.publish import (
ValidateContentsOrder,
PublishXmlValidationError,
@@ -30,7 +29,7 @@ class ValidateInstanceRepair(pyblish.api.Action):
for instance in instances:
data = harmony.read(instance.data["setMembers"][0])
- data["asset"] = os.environ["AVALON_ASSET"]
+ data["asset"] = get_current_asset_name()
harmony.imprint(instance.data["setMembers"][0], data)
@@ -44,7 +43,7 @@ class ValidateInstance(pyblish.api.InstancePlugin):
def process(self, instance):
instance_asset = instance.data["asset"]
- current_asset = os.environ["AVALON_ASSET"]
+ current_asset = get_current_asset_name()
msg = (
"Instance asset is not the same as current asset:"
f"\nInstance: {instance_asset}\nCurrent: {current_asset}"
diff --git a/openpype/hosts/harmony/plugins/publish/validate_scene_settings.py b/openpype/hosts/harmony/plugins/publish/validate_scene_settings.py
index 6e4c6955e4..866f12076a 100644
--- a/openpype/hosts/harmony/plugins/publish/validate_scene_settings.py
+++ b/openpype/hosts/harmony/plugins/publish/validate_scene_settings.py
@@ -67,7 +67,9 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin):
expected_settings["frameEndHandle"] = expected_settings["frameEnd"] +\
expected_settings["handleEnd"]
- if (any(re.search(pattern, os.getenv('AVALON_TASK'))
+ task_name = instance.context.data["task"]
+
+ if (any(re.search(pattern, task_name)
for pattern in self.skip_resolution_check)):
self.log.info("Skipping resolution check because of "
"task name and pattern {}".format(
diff --git a/openpype/hosts/hiero/api/lib.py b/openpype/hosts/hiero/api/lib.py
index 09d73f5cc2..bf719160d1 100644
--- a/openpype/hosts/hiero/api/lib.py
+++ b/openpype/hosts/hiero/api/lib.py
@@ -22,9 +22,7 @@ except ImportError:
from openpype.client import get_project
from openpype.settings import get_project_settings
-from openpype.pipeline import (
- get_current_project_name, legacy_io, Anatomy
-)
+from openpype.pipeline import Anatomy, get_current_project_name
from openpype.pipeline.load import filter_containers
from openpype.lib import Logger
from . import tags
@@ -626,7 +624,7 @@ def get_publish_attribute(tag):
def sync_avalon_data_to_workfile():
# import session to get project dir
- project_name = legacy_io.Session["AVALON_PROJECT"]
+ project_name = get_current_project_name()
anatomy = Anatomy(project_name)
work_template = anatomy.templates["work"]["path"]
@@ -821,7 +819,7 @@ class PublishAction(QtWidgets.QAction):
# # create root node and save all metadata
# root_node = hiero.core.nuke.RootNode()
#
-# anatomy = Anatomy(os.environ["AVALON_PROJECT"])
+# anatomy = Anatomy(get_current_project_name())
# work_template = anatomy.templates["work"]["path"]
# root_path = anatomy.root_value_for_template(work_template)
#
@@ -1041,7 +1039,7 @@ def _set_hrox_project_knobs(doc, **knobs):
def apply_colorspace_project():
- project_name = os.getenv("AVALON_PROJECT")
+ project_name = get_current_project_name()
# get path to the active project
project = get_current_project(remove_untitled=True)
current_file = project.path()
@@ -1110,7 +1108,7 @@ def apply_colorspace_project():
def apply_colorspace_clips():
- project_name = os.getenv("AVALON_PROJECT")
+ project_name = get_current_project_name()
project = get_current_project(remove_untitled=True)
clips = project.clips()
@@ -1264,7 +1262,7 @@ def check_inventory_versions(track_items=None):
if not containers:
return
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
filter_result = filter_containers(containers, project_name)
for container in filter_result.latest:
set_track_color(container["_item"], clip_color_last)
diff --git a/openpype/hosts/hiero/api/menu.py b/openpype/hosts/hiero/api/menu.py
index 6baeb38cc0..9967e9c875 100644
--- a/openpype/hosts/hiero/api/menu.py
+++ b/openpype/hosts/hiero/api/menu.py
@@ -4,12 +4,18 @@ import sys
import hiero.core
from hiero.ui import findMenuAction
+from qtpy import QtGui
+
from openpype.lib import Logger
-from openpype.pipeline import legacy_io
from openpype.tools.utils import host_tools
+from openpype.settings import get_project_settings
+from openpype.pipeline import (
+ get_current_project_name,
+ get_current_asset_name,
+ get_current_task_name
+)
from . import tags
-from openpype.settings import get_project_settings
log = Logger.get_logger(__name__)
@@ -17,6 +23,13 @@ self = sys.modules[__name__]
self._change_context_menu = None
+def get_context_label():
+ return "{}, {}".format(
+ get_current_asset_name(),
+ get_current_task_name()
+ )
+
+
def update_menu_task_label():
"""Update the task label in Avalon menu to current session"""
@@ -27,10 +40,7 @@ def update_menu_task_label():
log.warning("Can't find menuItem: {}".format(object_name))
return
- label = "{}, {}".format(
- legacy_io.Session["AVALON_ASSET"],
- legacy_io.Session["AVALON_TASK"]
- )
+ label = get_context_label()
menu = found_menu.menu()
self._change_context_menu = label
@@ -43,7 +53,6 @@ def menu_install():
"""
- from qtpy import QtGui
from . import (
publish, launch_workfiles_app, reload_config,
apply_colorspace_project, apply_colorspace_clips
@@ -56,10 +65,7 @@ def menu_install():
menu_name = os.environ['AVALON_LABEL']
- context_label = "{0}, {1}".format(
- legacy_io.Session["AVALON_ASSET"],
- legacy_io.Session["AVALON_TASK"]
- )
+ context_label = get_context_label()
self._change_context_menu = context_label
@@ -154,7 +160,7 @@ def add_scripts_menu():
return
# load configuration of custom menu
- project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+ project_settings = get_project_settings(get_current_project_name())
config = project_settings["hiero"]["scriptsmenu"]["definition"]
_menu = project_settings["hiero"]["scriptsmenu"]["name"]
diff --git a/openpype/hosts/hiero/api/plugin.py b/openpype/hosts/hiero/api/plugin.py
index a3f8a6c524..65a4009756 100644
--- a/openpype/hosts/hiero/api/plugin.py
+++ b/openpype/hosts/hiero/api/plugin.py
@@ -12,6 +12,7 @@ from openpype.settings import get_current_project_settings
from openpype.lib import Logger
from openpype.pipeline import LoaderPlugin, LegacyCreator
from openpype.pipeline.context_tools import get_current_project_asset
+from openpype.pipeline.load import get_representation_path_from_context
from . import lib
log = Logger.get_logger(__name__)
@@ -393,7 +394,7 @@ class ClipLoader:
active_bin = None
data = dict()
- def __init__(self, cls, context, **options):
+ def __init__(self, cls, context, path, **options):
""" Initialize object
Arguments:
@@ -406,6 +407,7 @@ class ClipLoader:
self.__dict__.update(cls.__dict__)
self.context = context
self.active_project = lib.get_current_project()
+ self.fname = path
# try to get value from options or evaluate key value for `handles`
self.with_handles = options.get("handles") or bool(
@@ -467,7 +469,7 @@ class ClipLoader:
self.data["track_name"] = "_".join([subset, representation])
self.data["versionData"] = self.context["version"]["data"]
# gets file path
- file = self.fname
+ file = get_representation_path_from_context(self.context)
if not file:
repr_id = repr["_id"]
log.warning(
diff --git a/openpype/hosts/hiero/api/tags.py b/openpype/hosts/hiero/api/tags.py
index cb7bc14edb..02d8205414 100644
--- a/openpype/hosts/hiero/api/tags.py
+++ b/openpype/hosts/hiero/api/tags.py
@@ -5,7 +5,7 @@ import hiero
from openpype.client import get_project, get_assets
from openpype.lib import Logger
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_project_name
log = Logger.get_logger(__name__)
@@ -142,7 +142,7 @@ def add_tags_to_workfile():
nks_pres_tags = tag_data()
# Get project task types.
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
project_doc = get_project(project_name)
tasks = project_doc["config"]["tasks"]
nks_pres_tags["[Tasks]"] = {}
diff --git a/openpype/hosts/hiero/plugins/load/load_clip.py b/openpype/hosts/hiero/plugins/load/load_clip.py
index c9bebfa8b2..05bd12d185 100644
--- a/openpype/hosts/hiero/plugins/load/load_clip.py
+++ b/openpype/hosts/hiero/plugins/load/load_clip.py
@@ -3,8 +3,8 @@ from openpype.client import (
get_last_version_by_subset_id
)
from openpype.pipeline import (
- legacy_io,
get_representation_path,
+ get_current_project_name,
)
from openpype.lib.transcoding import (
VIDEO_EXTENSIONS,
@@ -87,7 +87,8 @@ class LoadClip(phiero.SequenceLoader):
})
# load clip to timeline and get main variables
- track_item = phiero.ClipLoader(self, context, **options).load()
+ path = self.filepath_from_context(context)
+ track_item = phiero.ClipLoader(self, context, path, **options).load()
namespace = namespace or track_item.name()
version = context['version']
version_data = version.get("data", {})
@@ -147,7 +148,7 @@ class LoadClip(phiero.SequenceLoader):
track_item = phiero.get_track_items(
track_item_name=namespace).pop()
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
version_data = version_doc.get("data", {})
@@ -210,7 +211,7 @@ class LoadClip(phiero.SequenceLoader):
@classmethod
def set_item_color(cls, track_item, version_doc):
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
last_version_doc = get_last_version_by_subset_id(
project_name, version_doc["parent"], fields=["_id"]
)
diff --git a/openpype/hosts/hiero/plugins/load/load_effects.py b/openpype/hosts/hiero/plugins/load/load_effects.py
index b61cca9731..31147d013f 100644
--- a/openpype/hosts/hiero/plugins/load/load_effects.py
+++ b/openpype/hosts/hiero/plugins/load/load_effects.py
@@ -9,8 +9,8 @@ from openpype.client import (
from openpype.pipeline import (
AVALON_CONTAINER_ID,
load,
- legacy_io,
- get_representation_path
+ get_representation_path,
+ get_current_project_name
)
from openpype.hosts.hiero import api as phiero
from openpype.lib import Logger
@@ -59,7 +59,8 @@ class LoadEffects(load.LoaderPlugin):
}
# getting file path
- file = self.fname.replace("\\", "/")
+ file = self.filepath_from_context(context)
+ file = file.replace("\\", "/")
if self._shared_loading(
file,
@@ -167,7 +168,7 @@ class LoadEffects(load.LoaderPlugin):
namespace = container['namespace']
# get timeline in out data
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
version_data = version_doc["data"]
clip_in = version_data["clipIn"]
diff --git a/openpype/hosts/hiero/plugins/publish/precollect_workfile.py b/openpype/hosts/hiero/plugins/publish/precollect_workfile.py
index 1f477c1639..5a66581531 100644
--- a/openpype/hosts/hiero/plugins/publish/precollect_workfile.py
+++ b/openpype/hosts/hiero/plugins/publish/precollect_workfile.py
@@ -7,7 +7,6 @@ from qtpy.QtGui import QPixmap
import hiero.ui
-from openpype.pipeline import legacy_io
from openpype.hosts.hiero.api.otio import hiero_export
@@ -19,7 +18,7 @@ class PrecollectWorkfile(pyblish.api.ContextPlugin):
def process(self, context):
- asset = legacy_io.Session["AVALON_ASSET"]
+ asset = context.data["asset"]
subset = "workfile"
active_timeline = hiero.ui.activeSequence()
project = active_timeline.project()
diff --git a/openpype/hosts/hiero/plugins/publish_old_workflow/collect_assetbuilds.py b/openpype/hosts/hiero/plugins/publish_old_workflow/collect_assetbuilds.py
index 5f96533052..767f7c30f7 100644
--- a/openpype/hosts/hiero/plugins/publish_old_workflow/collect_assetbuilds.py
+++ b/openpype/hosts/hiero/plugins/publish_old_workflow/collect_assetbuilds.py
@@ -1,6 +1,5 @@
from pyblish import api
from openpype.client import get_assets
-from openpype.pipeline import legacy_io
class CollectAssetBuilds(api.ContextPlugin):
@@ -18,7 +17,7 @@ class CollectAssetBuilds(api.ContextPlugin):
hosts = ["hiero"]
def process(self, context):
- project_name = legacy_io.active_project()
+ project_name = context.data["projectName"]
asset_builds = {}
for asset in get_assets(project_name):
if asset["data"]["entityType"] == "AssetBuild":
diff --git a/openpype/hosts/houdini/api/lib.py b/openpype/hosts/houdini/api/lib.py
index a32e9d8d61..b03f8c8fc1 100644
--- a/openpype/hosts/houdini/api/lib.py
+++ b/openpype/hosts/houdini/api/lib.py
@@ -10,7 +10,7 @@ import json
import six
from openpype.client import get_asset_by_name
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_project_name, get_current_asset_name
from openpype.pipeline.context_tools import get_current_project_asset
import hou
@@ -78,8 +78,8 @@ def generate_ids(nodes, asset_id=None):
"""
if asset_id is None:
- project_name = legacy_io.active_project()
- asset_name = legacy_io.Session["AVALON_ASSET"]
+ project_name = get_current_project_name()
+ asset_name = get_current_asset_name()
# Get the asset ID from the database for the asset of current context
asset_doc = get_asset_by_name(project_name, asset_name, fields=["_id"])
@@ -474,8 +474,8 @@ def maintained_selection():
def reset_framerange():
"""Set frame range to current asset"""
- project_name = legacy_io.active_project()
- asset_name = legacy_io.Session["AVALON_ASSET"]
+ project_name = get_current_project_name()
+ asset_name = get_current_asset_name()
# Get the asset ID from the database for the asset of current context
asset_doc = get_asset_by_name(project_name, asset_name)
asset_data = asset_doc["data"]
diff --git a/openpype/hosts/houdini/api/shelves.py b/openpype/hosts/houdini/api/shelves.py
index 6e0f367f62..21e44e494a 100644
--- a/openpype/hosts/houdini/api/shelves.py
+++ b/openpype/hosts/houdini/api/shelves.py
@@ -4,6 +4,7 @@ import logging
import platform
from openpype.settings import get_project_settings
+from openpype.pipeline import get_current_project_name
import hou
@@ -17,7 +18,8 @@ def generate_shelves():
current_os = platform.system().lower()
# load configuration of houdini shelves
- project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+ project_name = get_current_project_name()
+ project_settings = get_project_settings(project_name)
shelves_set_config = project_settings["houdini"]["shelves"]
if not shelves_set_config:
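Settings lookups follow the same substitution; the recurring shape, as a sketch:

```python
from openpype.pipeline import get_current_project_name
from openpype.settings import get_project_settings

# Replaces get_project_settings(os.getenv("AVALON_PROJECT")).
project_settings = get_project_settings(get_current_project_name())
```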
diff --git a/openpype/hosts/houdini/plugins/create/create_hda.py b/openpype/hosts/houdini/plugins/create/create_hda.py
index 5f95b2efb4..c4093bfbc6 100644
--- a/openpype/hosts/houdini/plugins/create/create_hda.py
+++ b/openpype/hosts/houdini/plugins/create/create_hda.py
@@ -4,7 +4,6 @@ from openpype.client import (
get_asset_by_name,
get_subsets,
)
-from openpype.pipeline import legacy_io
from openpype.hosts.houdini.api import plugin
@@ -21,7 +20,7 @@ class CreateHDA(plugin.HoudiniCreator):
# type: (str) -> bool
"""Check if existing subset name versions already exists."""
# Get all subsets of the current asset
- project_name = legacy_io.active_project()
+ project_name = self.project_name
asset_doc = get_asset_by_name(
project_name, self.data["asset"], fields=["_id"]
)
diff --git a/openpype/hosts/houdini/plugins/create/create_workfile.py b/openpype/hosts/houdini/plugins/create/create_workfile.py
index 1a8537adcd..cc45a6c2a8 100644
--- a/openpype/hosts/houdini/plugins/create/create_workfile.py
+++ b/openpype/hosts/houdini/plugins/create/create_workfile.py
@@ -4,7 +4,6 @@ from openpype.hosts.houdini.api import plugin
from openpype.hosts.houdini.api.lib import read, imprint
from openpype.hosts.houdini.api.pipeline import CONTEXT_CONTAINER
from openpype.pipeline import CreatedInstance, AutoCreator
-from openpype.pipeline import legacy_io
from openpype.client import get_asset_by_name
import hou
@@ -27,9 +26,9 @@ class CreateWorkfile(plugin.HoudiniCreatorBase, AutoCreator):
), None)
project_name = self.project_name
- asset_name = legacy_io.Session["AVALON_ASSET"]
- task_name = legacy_io.Session["AVALON_TASK"]
- host_name = legacy_io.Session["AVALON_APP"]
+ asset_name = self.create_context.get_current_asset_name()
+ task_name = self.create_context.get_current_task_name()
+ host_name = self.host_name
if current_instance is None:
asset_doc = get_asset_by_name(project_name, asset_name)
diff --git a/openpype/hosts/houdini/plugins/load/load_alembic.py b/openpype/hosts/houdini/plugins/load/load_alembic.py
index c6f0ebf2f9..48bd730ebe 100644
--- a/openpype/hosts/houdini/plugins/load/load_alembic.py
+++ b/openpype/hosts/houdini/plugins/load/load_alembic.py
@@ -20,7 +20,8 @@ class AbcLoader(load.LoaderPlugin):
import hou
# Format file name, Houdini only wants forward slashes
- file_path = os.path.normpath(self.fname)
+ file_path = self.filepath_from_context(context)
+ file_path = os.path.normpath(file_path)
file_path = file_path.replace("\\", "/")
# Get the root node
diff --git a/openpype/hosts/houdini/plugins/load/load_alembic_archive.py b/openpype/hosts/houdini/plugins/load/load_alembic_archive.py
index 47d2e1b896..3a577f72b4 100644
--- a/openpype/hosts/houdini/plugins/load/load_alembic_archive.py
+++ b/openpype/hosts/houdini/plugins/load/load_alembic_archive.py
@@ -21,7 +21,8 @@ class AbcArchiveLoader(load.LoaderPlugin):
import hou
# Format file name, Houdini only wants forward slashes
- file_path = os.path.normpath(self.fname)
+ file_path = self.filepath_from_context(context)
+ file_path = os.path.normpath(file_path)
file_path = file_path.replace("\\", "/")
# Get the root node
diff --git a/openpype/hosts/houdini/plugins/load/load_bgeo.py b/openpype/hosts/houdini/plugins/load/load_bgeo.py
index 86e8675c02..22680178c0 100644
--- a/openpype/hosts/houdini/plugins/load/load_bgeo.py
+++ b/openpype/hosts/houdini/plugins/load/load_bgeo.py
@@ -43,9 +43,10 @@ class BgeoLoader(load.LoaderPlugin):
file_node.destroy()
# Explicitly create a file node
+ path = self.filepath_from_context(context)
file_node = container.createNode("file", node_name=node_name)
file_node.setParms(
- {"file": self.format_path(self.fname, context["representation"])})
+ {"file": self.format_path(path, context["representation"])})
# Set display on last node
file_node.setDisplayFlag(True)
diff --git a/openpype/hosts/houdini/plugins/load/load_camera.py b/openpype/hosts/houdini/plugins/load/load_camera.py
index 6365508f4e..7b4a04809e 100644
--- a/openpype/hosts/houdini/plugins/load/load_camera.py
+++ b/openpype/hosts/houdini/plugins/load/load_camera.py
@@ -94,7 +94,8 @@ class CameraLoader(load.LoaderPlugin):
import hou
# Format file name, Houdini only wants forward slashes
- file_path = os.path.normpath(self.fname)
+ file_path = self.filepath_from_context(context)
+ file_path = os.path.normpath(file_path)
file_path = file_path.replace("\\", "/")
# Get the root node
diff --git a/openpype/hosts/houdini/plugins/load/load_hda.py b/openpype/hosts/houdini/plugins/load/load_hda.py
index 2438570c6e..57edc341a3 100644
--- a/openpype/hosts/houdini/plugins/load/load_hda.py
+++ b/openpype/hosts/houdini/plugins/load/load_hda.py
@@ -21,7 +21,8 @@ class HdaLoader(load.LoaderPlugin):
import hou
# Format file name, Houdini only wants forward slashes
- file_path = os.path.normpath(self.fname)
+ file_path = self.filepath_from_context(context)
+ file_path = os.path.normpath(file_path)
file_path = file_path.replace("\\", "/")
# Get the root node
diff --git a/openpype/hosts/houdini/plugins/load/load_image.py b/openpype/hosts/houdini/plugins/load/load_image.py
index 26bc569c53..663a93e48b 100644
--- a/openpype/hosts/houdini/plugins/load/load_image.py
+++ b/openpype/hosts/houdini/plugins/load/load_image.py
@@ -55,7 +55,8 @@ class ImageLoader(load.LoaderPlugin):
def load(self, context, name=None, namespace=None, data=None):
# Format file name, Houdini only wants forward slashes
- file_path = os.path.normpath(self.fname)
+ file_path = self.filepath_from_context(context)
+ file_path = os.path.normpath(file_path)
file_path = file_path.replace("\\", "/")
file_path = self._get_file_sequence(file_path)
diff --git a/openpype/hosts/houdini/plugins/load/load_usd_layer.py b/openpype/hosts/houdini/plugins/load/load_usd_layer.py
index 1f0ec25128..1528cf549f 100644
--- a/openpype/hosts/houdini/plugins/load/load_usd_layer.py
+++ b/openpype/hosts/houdini/plugins/load/load_usd_layer.py
@@ -26,7 +26,8 @@ class USDSublayerLoader(load.LoaderPlugin):
import hou
# Format file name, Houdini only wants forward slashes
- file_path = os.path.normpath(self.fname)
+ file_path = self.filepath_from_context(context)
+ file_path = os.path.normpath(file_path)
file_path = file_path.replace("\\", "/")
# Get the root node
diff --git a/openpype/hosts/houdini/plugins/load/load_usd_reference.py b/openpype/hosts/houdini/plugins/load/load_usd_reference.py
index f66d05395e..8402ad072c 100644
--- a/openpype/hosts/houdini/plugins/load/load_usd_reference.py
+++ b/openpype/hosts/houdini/plugins/load/load_usd_reference.py
@@ -26,7 +26,8 @@ class USDReferenceLoader(load.LoaderPlugin):
import hou
# Format file name, Houdini only wants forward slashes
- file_path = os.path.normpath(self.fname)
+ file_path = self.filepath_from_context(context)
+ file_path = os.path.normpath(file_path)
file_path = file_path.replace("\\", "/")
# Get the root node
diff --git a/openpype/hosts/houdini/plugins/load/load_vdb.py b/openpype/hosts/houdini/plugins/load/load_vdb.py
index 87900502c5..bcc4f200d3 100644
--- a/openpype/hosts/houdini/plugins/load/load_vdb.py
+++ b/openpype/hosts/houdini/plugins/load/load_vdb.py
@@ -40,8 +40,9 @@ class VdbLoader(load.LoaderPlugin):
# Explicitly create a file node
file_node = container.createNode("file", node_name=node_name)
+ path = self.filepath_from_context(context)
file_node.setParms(
- {"file": self.format_path(self.fname, context["representation"])})
+ {"file": self.format_path(path, context["representation"])})
# Set display on last node
file_node.setDisplayFlag(True)
diff --git a/openpype/hosts/houdini/plugins/load/show_usdview.py b/openpype/hosts/houdini/plugins/load/show_usdview.py
index 2737bc40fa..7b03a0738a 100644
--- a/openpype/hosts/houdini/plugins/load/show_usdview.py
+++ b/openpype/hosts/houdini/plugins/load/show_usdview.py
@@ -20,7 +20,8 @@ class ShowInUsdview(load.LoaderPlugin):
usdview = find_executable("usdview")
- filepath = os.path.normpath(self.fname)
+ filepath = self.filepath_from_context(context)
+ filepath = os.path.normpath(filepath)
filepath = filepath.replace("\\", "/")
if not os.path.exists(filepath):
diff --git a/openpype/hosts/houdini/plugins/publish/collect_arnold_rop.py b/openpype/hosts/houdini/plugins/publish/collect_arnold_rop.py
index 614785487f..43b8428c60 100644
--- a/openpype/hosts/houdini/plugins/publish/collect_arnold_rop.py
+++ b/openpype/hosts/houdini/plugins/publish/collect_arnold_rop.py
@@ -50,7 +50,7 @@ class CollectArnoldROPRenderProducts(pyblish.api.InstancePlugin):
num_aovs = rop.evalParm("ar_aovs")
for index in range(1, num_aovs + 1):
# Skip disabled AOVs
- if not rop.evalParm("ar_enable_aovP{}".format(index)):
+ if not rop.evalParm("ar_enable_aov{}".format(index)):
continue
if rop.evalParm("ar_aov_exr_enable_layer_name{}".format(index)):
diff --git a/openpype/hosts/houdini/plugins/publish/collect_usd_bootstrap.py b/openpype/hosts/houdini/plugins/publish/collect_usd_bootstrap.py
index 81274c670e..14a8e3c056 100644
--- a/openpype/hosts/houdini/plugins/publish/collect_usd_bootstrap.py
+++ b/openpype/hosts/houdini/plugins/publish/collect_usd_bootstrap.py
@@ -1,7 +1,6 @@
import pyblish.api
from openpype.client import get_subset_by_name, get_asset_by_name
-from openpype.pipeline import legacy_io
import openpype.lib.usdlib as usdlib
@@ -51,7 +50,7 @@ class CollectUsdBootstrap(pyblish.api.InstancePlugin):
self.log.debug("Add bootstrap for: %s" % bootstrap)
- project_name = legacy_io.active_project()
+ project_name = instance.context.data["projectName"]
asset = get_asset_by_name(project_name, instance.data["asset"])
assert asset, "Asset must exist: %s" % asset
diff --git a/openpype/hosts/houdini/plugins/publish/extract_usd_layered.py b/openpype/hosts/houdini/plugins/publish/extract_usd_layered.py
index 8422a3bc3e..d6193f13c1 100644
--- a/openpype/hosts/houdini/plugins/publish/extract_usd_layered.py
+++ b/openpype/hosts/houdini/plugins/publish/extract_usd_layered.py
@@ -14,7 +14,6 @@ from openpype.client import (
)
from openpype.pipeline import (
get_representation_path,
- legacy_io,
publish,
)
import openpype.hosts.houdini.api.usd as hou_usdlib
@@ -250,7 +249,7 @@ class ExtractUSDLayered(publish.Extractor):
# Set up the dependencies for publish if they have new content
# compared to previous publishes
- project_name = legacy_io.active_project()
+ project_name = instance.context.data["projectName"]
for dependency in active_dependencies:
dependency_fname = dependency.data["usdFilename"]
diff --git a/openpype/hosts/houdini/plugins/publish/validate_primitive_hierarchy_paths.py b/openpype/hosts/houdini/plugins/publish/validate_primitive_hierarchy_paths.py
index cd5e724ab3..3da5665f58 100644
--- a/openpype/hosts/houdini/plugins/publish/validate_primitive_hierarchy_paths.py
+++ b/openpype/hosts/houdini/plugins/publish/validate_primitive_hierarchy_paths.py
@@ -1,10 +1,19 @@
# -*- coding: utf-8 -*-
import pyblish.api
-from openpype.pipeline.publish import ValidateContentsOrder
from openpype.pipeline import PublishValidationError
+from openpype.pipeline.publish import (
+ ValidateContentsOrder,
+ RepairAction,
+)
+
import hou
+class AddDefaultPathAction(RepairAction):
+ label = "Add a default path attribute"
+ icon = "mdi.pencil-plus-outline"
+
+
class ValidatePrimitiveHierarchyPaths(pyblish.api.InstancePlugin):
"""Validate all primitives build hierarchy from attribute when enabled.
@@ -18,6 +27,7 @@ class ValidatePrimitiveHierarchyPaths(pyblish.api.InstancePlugin):
families = ["pointcache"]
hosts = ["houdini"]
label = "Validate Prims Hierarchy Path"
+ actions = [AddDefaultPathAction]
def process(self, instance):
invalid = self.get_invalid(instance)
@@ -36,10 +46,10 @@ class ValidatePrimitiveHierarchyPaths(pyblish.api.InstancePlugin):
if output_node is None:
cls.log.error(
"SOP Output node in '%s' does not exist. "
- "Ensure a valid SOP output path is set." % rop_node.path()
+ "Ensure a valid SOP output path is set.", rop_node.path()
)
- return [rop_node.path()]
+ return [rop_node]
build_from_path = rop_node.parm("build_from_path").eval()
if not build_from_path:
@@ -56,9 +66,17 @@ class ValidatePrimitiveHierarchyPaths(pyblish.api.InstancePlugin):
"value set, but 'Build Hierarchy from Attribute'"
"is enabled."
)
- return [rop_node.path()]
+ return [rop_node]
- cls.log.debug("Checking for attribute: %s" % path_attr)
+ cls.log.debug("Checking for attribute: %s", path_attr)
+
+ if not hasattr(output_node, "geometry"):
+ # In the case someone has explicitly set an Object
+ # node instead of a SOP node in Geometry context
+ # then for now we ignore - this allows us to also
+ # export object transforms.
+ cls.log.warning("No geometry output node found, skipping check..")
+ return
if not hasattr(output_node, "geometry"):
# In the case someone has explicitly set an Object
@@ -89,17 +107,17 @@ class ValidatePrimitiveHierarchyPaths(pyblish.api.InstancePlugin):
if not attrib:
cls.log.info(
"Geometry Primitives are missing "
- "path attribute: `%s`" % path_attr
+ "path attribute: `%s`", path_attr
)
- return [output_node.path()]
+ return [output_node]
# Ensure at least a single string value is present
if not attrib.strings():
cls.log.info(
"Primitive path attribute has no "
- "string values: %s" % path_attr
+ "string values: %s", path_attr
)
- return [output_node.path()]
+ return [output_node]
paths = geo.primStringAttribValues(path_attr)
# Ensure all primitives are set to a valid path
@@ -109,6 +127,65 @@ class ValidatePrimitiveHierarchyPaths(pyblish.api.InstancePlugin):
num_prims = len(geo.iterPrims()) # faster than len(geo.prims())
cls.log.info(
"Prims have no value for attribute `%s` "
- "(%s of %s prims)" % (path_attr, len(invalid_prims), num_prims)
+ "(%s of %s prims)", path_attr, len(invalid_prims), num_prims
+ )
+ return [output_node]
+
+ @classmethod
+ def repair(cls, instance):
+ """Add a default path attribute Action.
+
+ It is a helper action more than a repair action,
+ used to add a default single value for the path.
+ """
+
+ rop_node = hou.node(instance.data["instance_node"])
+ output_node = rop_node.parm("sop_path").evalAsNode()
+
+ if not output_node:
+ cls.log.debug(
+ "Action isn't performed, invalid SOP Path on %s",
+ rop_node
+ )
+ return
+
+ # This check prevents the action from running multiple times.
+ # get_invalid only returns [output_node] when the
+ # path attribute is the problem.
+ if cls.get_invalid(instance) != [output_node]:
+ return
+
+ path_attr = rop_node.parm("path_attrib").eval()
+
+ path_node = output_node.parent().createNode("name", "AUTO_PATH")
+ path_node.parm("attribname").set(path_attr)
+ path_node.parm("name1").set('`opname("..")`/`opname("..")`Shape')
+
+ cls.log.debug(
+ "'%s' was created. It adds '%s' with a default single value",
+ path_node, path_attr
+ )
+
+ path_node.setGenericFlag(hou.nodeFlag.DisplayComment, True)
+ path_node.setComment(
+ 'Auto path node was created automatically by '
+ '"Add a default path attribute"'
+ '\nFeel free to modify or replace it.'
+ )
+
+ if output_node.type().name() in ["null", "output"]:
+ # Connect before
+ path_node.setFirstInput(output_node.input(0))
+ path_node.moveToGoodPosition()
+ output_node.setFirstInput(path_node)
+ output_node.moveToGoodPosition()
+ else:
+ # Connect after
+ path_node.setFirstInput(output_node)
+ rop_node.parm("sop_path").set(path_node.path())
+ path_node.moveToGoodPosition()
+
+ cls.log.debug(
+ "SOP path on '%s' updated to new output node '%s'",
+ rop_node, path_node
)
- return [output_node.path()]
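The new `AddDefaultPathAction` follows the usual `RepairAction` contract: the action subclass only carries the label and icon, and triggering it calls the owning plugin's `repair()` classmethod. A stripped-down sketch (names hypothetical):

```python
import pyblish.api
from openpype.pipeline.publish import RepairAction

class SketchRepairAction(RepairAction):
    label = "Fix the thing"          # shown in the publisher UI
    icon = "mdi.pencil-plus-outline"

class ValidateSketch(pyblish.api.InstancePlugin):
    """Hypothetical validator wiring up a repair action."""

    label = "Validate Sketch"
    actions = [SketchRepairAction]

    def process(self, instance):
        ...

    @classmethod
    def repair(cls, instance):
        # RepairAction delegates here when the artist runs the action.
        ...
```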
diff --git a/openpype/hosts/houdini/plugins/publish/validate_usd_shade_model_exists.py b/openpype/hosts/houdini/plugins/publish/validate_usd_shade_model_exists.py
index c4f118ac3b..0db782d545 100644
--- a/openpype/hosts/houdini/plugins/publish/validate_usd_shade_model_exists.py
+++ b/openpype/hosts/houdini/plugins/publish/validate_usd_shade_model_exists.py
@@ -4,7 +4,6 @@ import re
import pyblish.api
from openpype.client import get_subset_by_name
-from openpype.pipeline import legacy_io
from openpype.pipeline.publish import ValidateContentsOrder
from openpype.pipeline import PublishValidationError
@@ -18,7 +17,7 @@ class ValidateUSDShadeModelExists(pyblish.api.InstancePlugin):
label = "USD Shade model exists"
def process(self, instance):
- project_name = legacy_io.active_project()
+ project_name = instance.context.data["projectName"]
asset_name = instance.data["asset"]
subset = instance.data["subset"]
diff --git a/openpype/hosts/houdini/vendor/husdoutputprocessors/avalon_uri_processor.py b/openpype/hosts/houdini/vendor/husdoutputprocessors/avalon_uri_processor.py
index 48019e0a82..310d057a11 100644
--- a/openpype/hosts/houdini/vendor/husdoutputprocessors/avalon_uri_processor.py
+++ b/openpype/hosts/houdini/vendor/husdoutputprocessors/avalon_uri_processor.py
@@ -5,7 +5,7 @@ import husdoutputprocessors.base as base
import colorbleed.usdlib as usdlib
from openpype.client import get_asset_by_name
-from openpype.pipeline import legacy_io, Anatomy
+from openpype.pipeline import Anatomy, get_current_project_name
class AvalonURIOutputProcessor(base.OutputProcessorBase):
@@ -122,7 +122,7 @@ class AvalonURIOutputProcessor(base.OutputProcessorBase):
"""
- PROJECT = legacy_io.Session["AVALON_PROJECT"]
+ PROJECT = get_current_project_name()
anatomy = Anatomy(PROJECT)
asset_doc = get_asset_by_name(PROJECT, asset)
if not asset_doc:
diff --git a/openpype/hosts/max/api/lib_renderproducts.py b/openpype/hosts/max/api/lib_renderproducts.py
index 3074f8e170..90608737c2 100644
--- a/openpype/hosts/max/api/lib_renderproducts.py
+++ b/openpype/hosts/max/api/lib_renderproducts.py
@@ -7,15 +7,18 @@ import os
from pymxs import runtime as rt
from openpype.hosts.max.api.lib import get_current_renderer
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_project_name
from openpype.settings import get_project_settings
class RenderProducts(object):
def __init__(self, project_settings=None):
- self._project_settings = project_settings or get_project_settings(
- legacy_io.Session["AVALON_PROJECT"])
+ self._project_settings = project_settings
+ if not self._project_settings:
+ self._project_settings = get_project_settings(
+ get_current_project_name()
+ )
def get_beauty(self, container):
render_dir = os.path.dirname(rt.rendOutputFilename)
diff --git a/openpype/hosts/max/api/lib_rendersettings.py b/openpype/hosts/max/api/lib_rendersettings.py
index 91e4a5bf9b..1b62edabee 100644
--- a/openpype/hosts/max/api/lib_rendersettings.py
+++ b/openpype/hosts/max/api/lib_rendersettings.py
@@ -2,7 +2,7 @@ import os
from pymxs import runtime as rt
from openpype.lib import Logger
from openpype.settings import get_project_settings
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_project_name
from openpype.pipeline.context_tools import get_current_project_asset
from openpype.hosts.max.api.lib import (
@@ -31,7 +31,7 @@ class RenderSettings(object):
self._project_settings = project_settings
if not self._project_settings:
self._project_settings = get_project_settings(
- legacy_io.Session["AVALON_PROJECT"]
+ get_current_project_name()
)
def set_render_camera(self, selection):
diff --git a/openpype/hosts/max/api/plugin.py b/openpype/hosts/max/api/plugin.py
index 14b0653f40..36b4ea32d4 100644
--- a/openpype/hosts/max/api/plugin.py
+++ b/openpype/hosts/max/api/plugin.py
@@ -15,6 +15,7 @@ MS_CUSTOM_ATTRIB = """attributes "openPypeData"
parameters main rollout:OPparams
(
all_handles type:#maxObjectTab tabSize:0 tabSizeVariable:on
+ sel_list type:#stringTab tabSize:0 tabSizeVariable:on
)
rollout OPparams "OP Parameters"
@@ -30,11 +31,42 @@ MS_CUSTOM_ATTRIB = """attributes "openPypeData"
handle_name = obj_name + "<" + handle as string + ">"
return handle_name
)
+ fn nodes_to_add node =
+ (
+ sceneObjs = #()
+ if classOf node == Container do return false
+ n = node as string
+ for obj in Objects do
+ (
+ tmp_obj = obj as string
+ append sceneObjs tmp_obj
+ )
+ if sel_list != undefined do
+ (
+ for obj in sel_list do
+ (
+ idx = findItem sceneObjs obj
+ if idx do
+ (
+ deleteItem sceneObjs idx
+ )
+ )
+ )
+ idx = findItem sceneObjs n
+ if idx then return true else false
+ )
+
+ fn nodes_to_rmv node =
+ (
+ n = node as string
+ idx = findItem sel_list n
+ if idx then return true else false
+ )
on button_add pressed do
(
current_selection = selectByName title:"Select Objects to add to
- the Container" buttontext:"Add"
+ the Container" buttontext:"Add" filter:nodes_to_add
if current_selection == undefined then return False
temp_arr = #()
i_node_arr = #()
@@ -46,8 +78,12 @@ MS_CUSTOM_ATTRIB = """attributes "openPypeData"
if idx do (
continue
)
+
+ name = c as string
+
append temp_arr handle_name
append i_node_arr node_ref
+ append sel_list name
)
all_handles = join i_node_arr all_handles
list_node.items = join temp_arr list_node.items
@@ -56,7 +92,7 @@ MS_CUSTOM_ATTRIB = """attributes "openPypeData"
on button_del pressed do
(
current_selection = selectByName title:"Select Objects to remove
- from the Container" buttontext:"Remove"
+ from the Container" buttontext:"Remove" filter: nodes_to_rmv
if current_selection == undefined then return False
temp_arr = #()
i_node_arr = #()
@@ -67,6 +103,7 @@ MS_CUSTOM_ATTRIB = """attributes "openPypeData"
(
node_ref = NodeTransformMonitor node:c as string
handle_name = node_to_name c
+ n = c as string
tmp_all_handles = #()
for i in all_handles do
(
@@ -84,6 +121,11 @@ MS_CUSTOM_ATTRIB = """attributes "openPypeData"
(
new_temp_arr = DeleteItem list_node.items idx
)
+ idx = finditem sel_list n
+ if idx do
+ (
+ sel_list = DeleteItem sel_list idx
+ )
)
all_handles = join i_node_arr new_i_node_arr
list_node.items = join temp_arr new_temp_arr
diff --git a/openpype/hosts/max/plugins/load/load_camera_fbx.py b/openpype/hosts/max/plugins/load/load_camera_fbx.py
index c51900dbb7..62284b23d9 100644
--- a/openpype/hosts/max/plugins/load/load_camera_fbx.py
+++ b/openpype/hosts/max/plugins/load/load_camera_fbx.py
@@ -17,7 +17,8 @@ class FbxLoader(load.LoaderPlugin):
def load(self, context, name=None, namespace=None, data=None):
from pymxs import runtime as rt
- filepath = os.path.normpath(self.fname)
+ filepath = self.filepath_from_context(context)
+ filepath = os.path.normpath(filepath)
rt.FBXImporterSetParam("Animation", True)
rt.FBXImporterSetParam("Camera", True)
rt.FBXImporterSetParam("AxisConversionMethod", True)
diff --git a/openpype/hosts/max/plugins/load/load_max_scene.py b/openpype/hosts/max/plugins/load/load_max_scene.py
index e3fb34f5bc..76cd3bf367 100644
--- a/openpype/hosts/max/plugins/load/load_max_scene.py
+++ b/openpype/hosts/max/plugins/load/load_max_scene.py
@@ -19,7 +19,9 @@ class MaxSceneLoader(load.LoaderPlugin):
def load(self, context, name=None, namespace=None, data=None):
from pymxs import runtime as rt
- path = os.path.normpath(self.fname)
+
+ path = self.filepath_from_context(context)
+ path = os.path.normpath(path)
# import the max scene by using "merge file"
path = path.replace('\\', '/')
rt.MergeMaxFile(path)
diff --git a/openpype/hosts/max/plugins/load/load_model.py b/openpype/hosts/max/plugins/load/load_model.py
index 58c6d3c889..cff82a593c 100644
--- a/openpype/hosts/max/plugins/load/load_model.py
+++ b/openpype/hosts/max/plugins/load/load_model.py
@@ -18,7 +18,7 @@ class ModelAbcLoader(load.LoaderPlugin):
def load(self, context, name=None, namespace=None, data=None):
from pymxs import runtime as rt
- file_path = os.path.normpath(self.fname)
+ file_path = os.path.normpath(self.filepath_from_context(context))
abc_before = {
c
diff --git a/openpype/hosts/max/plugins/load/load_model_fbx.py b/openpype/hosts/max/plugins/load/load_model_fbx.py
index 663f79f9f5..12f526ab95 100644
--- a/openpype/hosts/max/plugins/load/load_model_fbx.py
+++ b/openpype/hosts/max/plugins/load/load_model_fbx.py
@@ -17,7 +17,7 @@ class FbxModelLoader(load.LoaderPlugin):
def load(self, context, name=None, namespace=None, data=None):
from pymxs import runtime as rt
- filepath = os.path.normpath(self.fname)
+ filepath = os.path.normpath(self.filepath_from_context(context))
rt.FBXImporterSetParam("Animation", False)
rt.FBXImporterSetParam("Cameras", False)
rt.FBXImporterSetParam("Preserveinstances", True)
diff --git a/openpype/hosts/max/plugins/load/load_model_obj.py b/openpype/hosts/max/plugins/load/load_model_obj.py
index 77d4e08cfb..18a19414fa 100644
--- a/openpype/hosts/max/plugins/load/load_model_obj.py
+++ b/openpype/hosts/max/plugins/load/load_model_obj.py
@@ -18,7 +18,7 @@ class ObjLoader(load.LoaderPlugin):
def load(self, context, name=None, namespace=None, data=None):
from pymxs import runtime as rt
- filepath = os.path.normpath(self.fname)
+ filepath = os.path.normpath(self.filepath_from_context(context))
self.log.debug("Executing command to import..")
rt.Execute(f'importFile @"{filepath}" #noPrompt using:ObjImp')
diff --git a/openpype/hosts/max/plugins/load/load_model_usd.py b/openpype/hosts/max/plugins/load/load_model_usd.py
index 2b34669278..48b50b9b18 100644
--- a/openpype/hosts/max/plugins/load/load_model_usd.py
+++ b/openpype/hosts/max/plugins/load/load_model_usd.py
@@ -20,7 +20,7 @@ class ModelUSDLoader(load.LoaderPlugin):
from pymxs import runtime as rt
# asset_filepath
- filepath = os.path.normpath(self.fname)
+ filepath = os.path.normpath(self.filepath_from_context(context))
import_options = rt.USDImporter.CreateOptions()
base_filename = os.path.basename(filepath)
filename, ext = os.path.splitext(base_filename)
diff --git a/openpype/hosts/max/plugins/load/load_pointcache.py b/openpype/hosts/max/plugins/load/load_pointcache.py
index cadbe7cac2..290503e053 100644
--- a/openpype/hosts/max/plugins/load/load_pointcache.py
+++ b/openpype/hosts/max/plugins/load/load_pointcache.py
@@ -23,7 +23,8 @@ class AbcLoader(load.LoaderPlugin):
def load(self, context, name=None, namespace=None, data=None):
from pymxs import runtime as rt
- file_path = os.path.normpath(self.fname)
+ file_path = self.filepath_from_context(context)
+ file_path = os.path.normpath(file_path)
abc_before = {
c
diff --git a/openpype/hosts/max/plugins/load/load_pointcloud.py b/openpype/hosts/max/plugins/load/load_pointcloud.py
index 8634e1d51f..2a1175167a 100644
--- a/openpype/hosts/max/plugins/load/load_pointcloud.py
+++ b/openpype/hosts/max/plugins/load/load_pointcloud.py
@@ -18,7 +18,7 @@ class PointCloudLoader(load.LoaderPlugin):
"""load point cloud by tyCache"""
from pymxs import runtime as rt
- filepath = os.path.normpath(self.fname)
+ filepath = os.path.normpath(self.filepath_from_context(context))
obj = rt.tyCache()
obj.filename = filepath
diff --git a/openpype/hosts/max/plugins/publish/collect_workfile.py b/openpype/hosts/max/plugins/publish/collect_workfile.py
index 3500b2735c..0eb4bb731e 100644
--- a/openpype/hosts/max/plugins/publish/collect_workfile.py
+++ b/openpype/hosts/max/plugins/publish/collect_workfile.py
@@ -4,7 +4,6 @@ import os
import pyblish.api
from pymxs import runtime as rt
-from openpype.pipeline import legacy_io
class CollectWorkfile(pyblish.api.ContextPlugin):
@@ -26,7 +25,7 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
filename, ext = os.path.splitext(file)
- task = legacy_io.Session["AVALON_TASK"]
+ task = context.data["task"]
data = {}
@@ -36,7 +35,7 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
data.update({
"subset": subset,
- "asset": os.getenv("AVALON_ASSET", None),
+ "asset": context.data["asset"],
"label": subset,
"publish": True,
"family": 'workfile',
diff --git a/openpype/hosts/max/plugins/publish/validate_pointcloud.py b/openpype/hosts/max/plugins/publish/validate_pointcloud.py
index 1ff6eb126f..295a23f1f6 100644
--- a/openpype/hosts/max/plugins/publish/validate_pointcloud.py
+++ b/openpype/hosts/max/plugins/publish/validate_pointcloud.py
@@ -1,15 +1,6 @@
import pyblish.api
from openpype.pipeline import PublishValidationError
from pymxs import runtime as rt
-from openpype.settings import get_project_settings
-from openpype.pipeline import legacy_io
-
-
-def get_setting(project_setting=None):
- project_setting = get_project_settings(
- legacy_io.Session["AVALON_PROJECT"]
- )
- return project_setting["max"]["PointCloud"]
class ValidatePointCloud(pyblish.api.InstancePlugin):
@@ -108,6 +99,9 @@ class ValidatePointCloud(pyblish.api.InstancePlugin):
f"Validating tyFlow custom attributes for {container}")
selection_list = instance.data["members"]
+
+ project_setting = instance.data["project_setting"]
+ attr_settings = project_setting["max"]["PointCloud"]["attribute"]
for sel in selection_list:
obj = sel.baseobject
anim_names = rt.GetSubAnimNames(obj)
@@ -118,8 +112,7 @@ class ValidatePointCloud(pyblish.api.InstancePlugin):
event_name = sub_anim.name
opt = "${0}.{1}.export_particles".format(sel.name,
event_name)
- attributes = get_setting()["attribute"]
- for key, value in attributes.items():
+ for key, value in attr_settings.items():
custom_attr = "{0}.PRTChannels_{1}".format(opt,
value)
try:
diff --git a/openpype/hosts/maya/api/action.py b/openpype/hosts/maya/api/action.py
index 3b8e2c1848..277f4cc238 100644
--- a/openpype/hosts/maya/api/action.py
+++ b/openpype/hosts/maya/api/action.py
@@ -4,7 +4,6 @@ from __future__ import absolute_import
import pyblish.api
from openpype.client import get_asset_by_name
-from openpype.pipeline import legacy_io
from openpype.pipeline.publish import get_errored_instances_from_context
@@ -80,7 +79,7 @@ class GenerateUUIDsOnInvalidAction(pyblish.api.Action):
asset_doc = instance.data.get("assetEntity")
if not asset_doc:
asset_name = instance.data["asset"]
- project_name = legacy_io.active_project()
+ project_name = instance.context.data["projectName"]
self.log.info((
"Asset is not stored on instance."
" Querying by name \"{}\" from project \"{}\""
diff --git a/openpype/hosts/maya/api/commands.py b/openpype/hosts/maya/api/commands.py
index 3e31875fd8..46494413b7 100644
--- a/openpype/hosts/maya/api/commands.py
+++ b/openpype/hosts/maya/api/commands.py
@@ -3,7 +3,7 @@
from maya import cmds
from openpype.client import get_asset_by_name, get_project
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_project_name, get_current_asset_name
class ToolWindows:
@@ -85,8 +85,8 @@ def reset_resolution():
resolution_height = 1080
# Get resolution from asset
- project_name = legacy_io.active_project()
- asset_name = legacy_io.Session["AVALON_ASSET"]
+ project_name = get_current_project_name()
+ asset_name = get_current_asset_name()
asset_doc = get_asset_by_name(project_name, asset_name)
resolution = _resolution_from_document(asset_doc)
# Try get resolution from project
diff --git a/openpype/hosts/maya/api/lib.py b/openpype/hosts/maya/api/lib.py
index ef8ddf8bac..cdc722a409 100644
--- a/openpype/hosts/maya/api/lib.py
+++ b/openpype/hosts/maya/api/lib.py
@@ -25,7 +25,8 @@ from openpype.client import (
)
from openpype.settings import get_project_settings
from openpype.pipeline import (
- legacy_io,
+ get_current_project_name,
+ get_current_asset_name,
discover_loader_plugins,
loaders_from_representation,
get_representation_path,
@@ -1413,8 +1414,8 @@ def generate_ids(nodes, asset_id=None):
if asset_id is None:
# Get the asset ID from the database for the asset of current context
- project_name = legacy_io.active_project()
- asset_name = legacy_io.Session["AVALON_ASSET"]
+ project_name = get_current_project_name()
+ asset_name = get_current_asset_name()
asset_doc = get_asset_by_name(project_name, asset_name, fields=["_id"])
assert asset_doc, "No current asset found in Session"
asset_id = asset_doc['_id']
@@ -1614,17 +1615,15 @@ def get_container_members(container):
# region LOOKDEV
-def list_looks(asset_id):
+def list_looks(project_name, asset_id):
"""Return all look subsets for the given asset
This assumes all look subsets start with "look*" in their names.
"""
-
# # get all subsets with look leading in
# the name associated with the asset
# TODO this should probably look for family 'look' instead of checking
# subset name that can not start with family
- project_name = legacy_io.active_project()
subset_docs = get_subsets(project_name, asset_ids=[asset_id])
return [
subset_doc
@@ -1646,7 +1645,7 @@ def assign_look_by_version(nodes, version_id):
None
"""
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
# Get representations of shader file and relationships
look_representation = get_representation_by_name(
@@ -1712,7 +1711,7 @@ def assign_look(nodes, subset="lookDefault"):
parts = pype_id.split(":", 1)
grouped[parts[0]].append(node)
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
subset_docs = get_subsets(
project_name, subset_names=[subset], asset_ids=grouped.keys()
)
@@ -2226,6 +2225,35 @@ def set_scene_resolution(width, height, pixelAspect):
cmds.setAttr("%s.pixelAspect" % control_node, pixelAspect)
+def get_fps_for_current_context():
+ """Get fps that should be set for current context.
+
+ Todos:
+ - Skip project value.
+ - Merge logic with 'get_frame_range' and 'reset_scene_resolution' ->
+ all the values in the functions can be collected at one place as
+ they have same requirements.
+
+ Returns:
+ Union[int, float]: FPS value.
+ """
+
+ project_name = get_current_project_name()
+ asset_name = get_current_asset_name()
+ asset_doc = get_asset_by_name(
+ project_name, asset_name, fields=["data.fps"]
+ ) or {}
+ fps = asset_doc.get("data", {}).get("fps")
+ if not fps:
+ project_doc = get_project(project_name, fields=["data.fps"]) or {}
+ fps = project_doc.get("data", {}).get("fps")
+
+ if not fps:
+ fps = 25
+
+ return convert_to_maya_fps(fps)
+
+
def get_frame_range(include_animation_range=False):
"""Get the current assets frame range and handles.
@@ -2300,10 +2328,7 @@ def reset_frame_range(playback=True, render=True, fps=True):
fps (bool, Optional): Whether to set scene FPS. Defaults to True.
"""
if fps:
- fps = convert_to_maya_fps(
- float(legacy_io.Session.get("AVALON_FPS", 25))
- )
- set_scene_fps(fps)
+ set_scene_fps(get_fps_for_current_context())
frame_range = get_frame_range(include_animation_range=True)
if not frame_range:
@@ -2339,7 +2364,7 @@ def reset_scene_resolution():
None
"""
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
project_doc = get_project(project_name)
project_data = project_doc["data"]
asset_data = get_current_project_asset()["data"]
@@ -2372,19 +2397,9 @@ def set_context_settings():
None
"""
- # Todo (Wijnand): apply renderer and resolution of project
- project_name = legacy_io.active_project()
- project_doc = get_project(project_name)
- project_data = project_doc["data"]
- asset_doc = get_current_project_asset(fields=["data.fps"])
- asset_data = asset_doc.get("data", {})
# Set project fps
- fps = convert_to_maya_fps(
- asset_data.get("fps", project_data.get("fps", 25))
- )
- legacy_io.Session["AVALON_FPS"] = str(fps)
- set_scene_fps(fps)
+ set_scene_fps(get_fps_for_current_context())
reset_scene_resolution()
@@ -2404,9 +2419,7 @@ def validate_fps():
"""
- expected_fps = convert_to_maya_fps(
- get_current_project_asset(fields=["data.fps"])["data"]["fps"]
- )
+ expected_fps = get_fps_for_current_context()
current_fps = mel.eval('currentTimeUnitToFPS()')
fps_match = current_fps == expected_fps
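The new `get_fps_for_current_context` helper centralizes the asset -> project -> default fallback chain that `reset_frame_range`, `set_context_settings` and `validate_fps` previously duplicated. A usage sketch, assuming both helpers stay importable from `openpype.hosts.maya.api.lib`:

```python
from openpype.hosts.maya.api.lib import (
    get_fps_for_current_context,
    set_scene_fps,
)

# Asset fps wins, then project fps, then the 25 fallback; the returned
# value is already converted to an fps Maya accepts.
set_scene_fps(get_fps_for_current_context())
```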
diff --git a/openpype/hosts/maya/api/lib_rendersettings.py b/openpype/hosts/maya/api/lib_rendersettings.py
index eaa728a2f6..f54633c04d 100644
--- a/openpype/hosts/maya/api/lib_rendersettings.py
+++ b/openpype/hosts/maya/api/lib_rendersettings.py
@@ -6,13 +6,9 @@ import six
import sys
from openpype.lib import Logger
-from openpype.settings import (
- get_project_settings,
- get_current_project_settings
-)
+from openpype.settings import get_project_settings
-from openpype.pipeline import legacy_io
-from openpype.pipeline import CreatorError
+from openpype.pipeline import CreatorError, get_current_project_name
from openpype.pipeline.context_tools import get_current_project_asset
from openpype.hosts.maya.api.lib import reset_frame_range
@@ -27,21 +23,6 @@ class RenderSettings(object):
'mayahardware2': 'defaultRenderGlobals.imageFilePrefix'
}
- _image_prefixes = {
- 'vray': get_current_project_settings()["maya"]["RenderSettings"]["vray_renderer"]["image_prefix"], # noqa
- 'arnold': get_current_project_settings()["maya"]["RenderSettings"]["arnold_renderer"]["image_prefix"], # noqa
- 'renderman': get_current_project_settings()["maya"]["RenderSettings"]["renderman_renderer"]["image_prefix"], # noqa
- 'redshift': get_current_project_settings()["maya"]["RenderSettings"]["redshift_renderer"]["image_prefix"] # noqa
- }
-
- # Renderman only
- _image_dir = {
- 'renderman': get_current_project_settings()["maya"]["RenderSettings"]["renderman_renderer"]["image_dir"], # noqa
- 'cryptomatte': get_current_project_settings()["maya"]["RenderSettings"]["renderman_renderer"]["cryptomatte_dir"], # noqa
- 'imageDisplay': get_current_project_settings()["maya"]["RenderSettings"]["renderman_renderer"]["imageDisplay_dir"], # noqa
- "watermark": get_current_project_settings()["maya"]["RenderSettings"]["renderman_renderer"]["watermark_dir"] # noqa
- }
-
_aov_chars = {
"dot": ".",
"dash": "-",
@@ -55,11 +36,30 @@ class RenderSettings(object):
return cls._image_prefix_nodes[renderer]
def __init__(self, project_settings=None):
- self._project_settings = project_settings
- if not self._project_settings:
- self._project_settings = get_project_settings(
- legacy_io.Session["AVALON_PROJECT"]
+ if not project_settings:
+ project_settings = get_project_settings(
+ get_current_project_name()
)
+ render_settings = project_settings["maya"]["RenderSettings"]
+ image_prefixes = {
+ "vray": render_settings["vray_renderer"]["image_prefix"],
+ "arnold": render_settings["arnold_renderer"]["image_prefix"],
+ "renderman": render_settings["renderman_renderer"]["image_prefix"],
+ "redshift": render_settings["redshift_renderer"]["image_prefix"]
+ }
+
+ # TODO probably should be stored to more explicit attribute
+ # Renderman only
+ renderman_settings = render_settings["renderman_renderer"]
+ _image_dir = {
+ "renderman": renderman_settings["image_dir"],
+ "cryptomatte": renderman_settings["cryptomatte_dir"],
+ "imageDisplay": renderman_settings["imageDisplay_dir"],
+ "watermark": renderman_settings["watermark_dir"]
+ }
+ self._image_prefixes = image_prefixes
+ self._image_dir = _image_dir
+ self._project_settings = project_settings
def set_default_renderer_settings(self, renderer=None):
"""Set basic settings based on renderer."""
diff --git a/openpype/hosts/maya/api/menu.py b/openpype/hosts/maya/api/menu.py
index 645d6f5a1c..715f54686c 100644
--- a/openpype/hosts/maya/api/menu.py
+++ b/openpype/hosts/maya/api/menu.py
@@ -7,7 +7,11 @@ import maya.utils
import maya.cmds as cmds
from openpype.settings import get_project_settings
-from openpype.pipeline import legacy_io
+from openpype.pipeline import (
+ get_current_project_name,
+ get_current_asset_name,
+ get_current_task_name
+)
from openpype.pipeline.workfile import BuildWorkfile
from openpype.tools.utils import host_tools
from openpype.hosts.maya.api import lib, lib_rendersettings
@@ -35,6 +39,13 @@ def _get_menu(menu_name=None):
return widgets.get(menu_name)
+def get_context_label():
+ return "{}, {}".format(
+ get_current_asset_name(),
+ get_current_task_name()
+ )
+
+
def install():
if cmds.about(batch=True):
log.info("Skipping openpype.menu initialization in batch mode..")
@@ -45,19 +56,15 @@ def install():
parent_widget = get_main_window()
cmds.menu(
MENU_NAME,
- label=legacy_io.Session["AVALON_LABEL"],
+ label=os.environ.get("AVALON_LABEL") or "OpenPype",
tearOff=True,
parent="MayaWindow"
)
# Create context menu
- context_label = "{}, {}".format(
- legacy_io.Session["AVALON_ASSET"],
- legacy_io.Session["AVALON_TASK"]
- )
cmds.menuItem(
"currentContext",
- label=context_label,
+ label=get_context_label(),
parent=MENU_NAME,
enable=False
)
@@ -195,7 +202,8 @@ def install():
return
# load configuration of custom menu
- project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+ project_name = get_current_project_name()
+ project_settings = get_project_settings(project_name)
config = project_settings["maya"]["scriptsmenu"]["definition"]
_menu = project_settings["maya"]["scriptsmenu"]["name"]
@@ -252,8 +260,5 @@ def update_menu_task_label():
log.warning("Can't find menuItem: {}".format(object_name))
return
- label = "{}, {}".format(
- legacy_io.Session["AVALON_ASSET"],
- legacy_io.Session["AVALON_TASK"]
- )
+ label = get_context_label()
cmds.menuItem(object_name, edit=True, label=label)
diff --git a/openpype/hosts/maya/api/pipeline.py b/openpype/hosts/maya/api/pipeline.py
index 2f2ab83f79..60495ac652 100644
--- a/openpype/hosts/maya/api/pipeline.py
+++ b/openpype/hosts/maya/api/pipeline.py
@@ -27,6 +27,9 @@ from openpype.lib import (
)
from openpype.pipeline import (
legacy_io,
+ get_current_project_name,
+ get_current_asset_name,
+ get_current_task_name,
register_loader_plugin_path,
register_inventory_action_path,
register_creator_plugin_path,
@@ -75,7 +78,7 @@ class MayaHost(HostBase, IWorkfileHost, ILoadHost, IPublishHost):
self._op_events = {}
def install(self):
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
project_settings = get_project_settings(project_name)
# process path mapping
dirmap_processor = MayaDirmap("maya", project_name, project_settings)
@@ -320,7 +323,7 @@ def _remove_workfile_lock():
def handle_workfile_locks():
if lib.IS_HEADLESS:
return False
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
return is_workfile_lock_enabled(MayaHost.name, project_name)
@@ -657,9 +660,9 @@ def on_task_changed():
lib.update_content_on_context_change()
msg = " project: {}\n asset: {}\n task:{}".format(
- legacy_io.active_project(),
- legacy_io.Session["AVALON_ASSET"],
- legacy_io.Session["AVALON_TASK"]
+ get_current_project_name(),
+ get_current_asset_name(),
+ get_current_task_name()
)
lib.show_message(
@@ -674,7 +677,7 @@ def before_workfile_open():
def before_workfile_save(event):
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
if handle_workfile_locks():
_remove_workfile_lock()
workdir_path = event["workdir_path"]
diff --git a/openpype/hosts/maya/api/plugin.py b/openpype/hosts/maya/api/plugin.py
index 553dcaf3aa..2b5aee9700 100644
--- a/openpype/hosts/maya/api/plugin.py
+++ b/openpype/hosts/maya/api/plugin.py
@@ -1,36 +1,24 @@
-import os
import json
-from abc import (
- ABCMeta
-)
-import six
-import re
+import os
+from abc import ABCMeta
+import qargparse
+import six
from maya import cmds
from maya.app.renderSetup.model import renderSetup
-import qargparse
-
-from openpype.lib import Logger
+from openpype.lib import BoolDef, Logger
+from openpype.pipeline import AVALON_CONTAINER_ID, Anatomy, CreatedInstance
+from openpype.pipeline import Creator as NewCreator
from openpype.pipeline import (
- legacy_io,
- LoaderPlugin,
- get_representation_path,
- AVALON_CONTAINER_ID,
- Anatomy,
- LegacyCreator,
- Creator as NewCreator,
- CreatedInstance,
- CreatorError
-)
-from openpype.lib import BoolDef
-from .lib import imprint, read
+ CreatorError, LegacyCreator, LoaderPlugin, get_representation_path,
+ legacy_io)
from openpype.pipeline.load import LoadError
from openpype.settings import get_project_settings
-from .pipeline import containerise
from . import lib
-
+from .lib import imprint, read
+from .pipeline import containerise
log = Logger.get_logger()
@@ -497,7 +485,8 @@ class ReferenceLoader(Loader):
namespace=None,
options=None
):
- assert os.path.exists(self.fname), "%s does not exist." % self.fname
+ path = self.filepath_from_context(context)
+ assert os.path.exists(path), "%s does not exist." % path
asset = context['asset']
subset = context['subset']
@@ -581,6 +570,7 @@ class ReferenceLoader(Loader):
def update(self, container, representation):
from maya import cmds
+
from openpype.hosts.maya.api.lib import get_container_members
node = container["objectName"]
diff --git a/openpype/hosts/maya/api/setdress.py b/openpype/hosts/maya/api/setdress.py
index 0bb1f186eb..7624aacd0f 100644
--- a/openpype/hosts/maya/api/setdress.py
+++ b/openpype/hosts/maya/api/setdress.py
@@ -18,13 +18,13 @@ from openpype.client import (
)
from openpype.pipeline import (
schema,
- legacy_io,
discover_loader_plugins,
loaders_from_representation,
load_container,
update_container,
remove_container,
get_representation_path,
+ get_current_project_name,
)
from openpype.hosts.maya.api.lib import (
matrix_equals,
@@ -289,7 +289,7 @@ def update_package_version(container, version):
"""
# Versioning (from `core.maya.pipeline`)
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
current_representation = get_representation_by_id(
project_name, container["representation"]
)
@@ -332,7 +332,7 @@ def update_package(set_container, representation):
"""
# Load the original package data
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
current_representation = get_representation_by_id(
project_name, set_container["representation"]
)
@@ -380,7 +380,7 @@ def update_scene(set_container, containers, current_data, new_data, new_file):
"""
set_namespace = set_container['namespace']
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
# Update the setdress hierarchy alembic
set_root = get_container_transforms(set_container, root=True)
diff --git a/openpype/hosts/maya/hooks/pre_copy_mel.py b/openpype/hosts/maya/hooks/pre_copy_mel.py
index 6f90af4b7c..9cea829ad7 100644
--- a/openpype/hosts/maya/hooks/pre_copy_mel.py
+++ b/openpype/hosts/maya/hooks/pre_copy_mel.py
@@ -10,10 +10,11 @@ class PreCopyMel(PreLaunchHook):
app_groups = ["maya"]
def execute(self):
- project_name = self.launch_context.env.get("AVALON_PROJECT")
+ project_doc = self.data["project_doc"]
workdir = self.launch_context.env.get("AVALON_WORKDIR")
if not workdir:
self.log.warning("BUG: Workdir is not filled.")
return
- create_workspace_mel(workdir, project_name)
+ project_settings = self.data["project_settings"]
+ create_workspace_mel(workdir, project_doc["name"], project_settings)
diff --git a/openpype/hosts/maya/lib.py b/openpype/hosts/maya/lib.py
index ffb2f0b27c..765c60381b 100644
--- a/openpype/hosts/maya/lib.py
+++ b/openpype/hosts/maya/lib.py
@@ -3,7 +3,7 @@ from openpype.settings import get_project_settings
from openpype.lib import Logger
-def create_workspace_mel(workdir, project_name):
+def create_workspace_mel(workdir, project_name, project_settings=None):
dst_filepath = os.path.join(workdir, "workspace.mel")
if os.path.exists(dst_filepath):
return
@@ -11,8 +11,9 @@ def create_workspace_mel(workdir, project_name):
if not os.path.exists(workdir):
os.makedirs(workdir)
- project_setting = get_project_settings(project_name)
- mel_script = project_setting["maya"].get("mel_workspace")
+ if not project_settings:
+ project_settings = get_project_settings(project_name)
+ mel_script = project_settings["maya"].get("mel_workspace")
# Skip if mel script in settings is empty
if not mel_script:
diff --git a/openpype/hosts/maya/plugins/inventory/import_modelrender.py b/openpype/hosts/maya/plugins/inventory/import_modelrender.py
index 8a7390bc8d..4db8c4f2f6 100644
--- a/openpype/hosts/maya/plugins/inventory/import_modelrender.py
+++ b/openpype/hosts/maya/plugins/inventory/import_modelrender.py
@@ -8,7 +8,7 @@ from openpype.client import (
from openpype.pipeline import (
InventoryAction,
get_representation_context,
- legacy_io,
+ get_current_project_name,
)
from openpype.hosts.maya.api.lib import (
maintained_selection,
@@ -35,7 +35,7 @@ class ImportModelRender(InventoryAction):
def process(self, containers):
from maya import cmds
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
for container in containers:
con_name = container["objectName"]
nodes = []
@@ -68,7 +68,7 @@ class ImportModelRender(InventoryAction):
from maya import cmds
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
repre_docs = get_representations(
project_name, version_ids=[version_id], fields=["_id", "name"]
)
diff --git a/openpype/hosts/maya/plugins/inventory/import_reference.py b/openpype/hosts/maya/plugins/inventory/import_reference.py
index afb1e0e17f..ecc424209d 100644
--- a/openpype/hosts/maya/plugins/inventory/import_reference.py
+++ b/openpype/hosts/maya/plugins/inventory/import_reference.py
@@ -1,7 +1,7 @@
from maya import cmds
from openpype.pipeline import InventoryAction
-from openpype.hosts.maya.api.plugin import get_reference_node
+from openpype.hosts.maya.api.lib import get_reference_node
class ImportReference(InventoryAction):
diff --git a/openpype/hosts/maya/plugins/load/_load_animation.py b/openpype/hosts/maya/plugins/load/_load_animation.py
index 2ba5fe6b64..49792b2806 100644
--- a/openpype/hosts/maya/plugins/load/_load_animation.py
+++ b/openpype/hosts/maya/plugins/load/_load_animation.py
@@ -35,7 +35,8 @@ class AbcLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
# hero_001 (abc)
# asset_counter{optional}
- file_url = self.prepare_root_value(self.fname,
+ path = self.filepath_from_context(context)
+ file_url = self.prepare_root_value(path,
context["project"]["name"])
nodes = cmds.file(file_url,
namespace=namespace,
diff --git a/openpype/hosts/maya/plugins/load/actions.py b/openpype/hosts/maya/plugins/load/actions.py
index 4855f3eed0..348657e592 100644
--- a/openpype/hosts/maya/plugins/load/actions.py
+++ b/openpype/hosts/maya/plugins/load/actions.py
@@ -138,8 +138,9 @@ class ImportMayaLoader(load.LoaderPlugin):
suffix="_",
)
+ path = self.filepath_from_context(context)
with maintained_selection():
- nodes = cmds.file(self.fname,
+ nodes = cmds.file(path,
i=True,
preserveReferences=True,
namespace=namespace,
diff --git a/openpype/hosts/maya/plugins/load/load_arnold_standin.py b/openpype/hosts/maya/plugins/load/load_arnold_standin.py
index 29215bc5c2..b5cc4d629b 100644
--- a/openpype/hosts/maya/plugins/load/load_arnold_standin.py
+++ b/openpype/hosts/maya/plugins/load/load_arnold_standin.py
@@ -89,11 +89,12 @@ class ArnoldStandinLoader(load.LoaderPlugin):
cmds.parent(standin, root)
# Set the standin filepath
+ repre_path = self.filepath_from_context(context)
path, operator = self._setup_proxy(
- standin_shape, self.fname, namespace
+ standin_shape, repre_path, namespace
)
cmds.setAttr(standin_shape + ".dso", path, type="string")
- sequence = is_sequence(os.listdir(os.path.dirname(self.fname)))
+ sequence = is_sequence(os.listdir(os.path.dirname(repre_path)))
cmds.setAttr(standin_shape + ".useFrameExtension", sequence)
         fps = float(version["data"].get("fps")) or get_current_session_fps()
diff --git a/openpype/hosts/maya/plugins/load/load_assembly.py b/openpype/hosts/maya/plugins/load/load_assembly.py
index 275f21be5d..0a2733e03c 100644
--- a/openpype/hosts/maya/plugins/load/load_assembly.py
+++ b/openpype/hosts/maya/plugins/load/load_assembly.py
@@ -30,7 +30,7 @@ class AssemblyLoader(load.LoaderPlugin):
)
containers = setdress.load_package(
- filepath=self.fname,
+ filepath=self.filepath_from_context(context),
name=name,
namespace=namespace
)
diff --git a/openpype/hosts/maya/plugins/load/load_audio.py b/openpype/hosts/maya/plugins/load/load_audio.py
index 9e7fd96bdb..265b15f4ae 100644
--- a/openpype/hosts/maya/plugins/load/load_audio.py
+++ b/openpype/hosts/maya/plugins/load/load_audio.py
@@ -6,7 +6,7 @@ from openpype.client import (
get_version_by_id,
)
from openpype.pipeline import (
- legacy_io,
+ get_current_project_name,
load,
get_representation_path,
)
@@ -68,7 +68,7 @@ class AudioLoader(load.LoaderPlugin):
)
# Set frame range.
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version = get_version_by_id(
project_name, representation["parent"], fields=["parent"]
)
diff --git a/openpype/hosts/maya/plugins/load/load_gpucache.py b/openpype/hosts/maya/plugins/load/load_gpucache.py
index 794b21eb5d..344f2fd060 100644
--- a/openpype/hosts/maya/plugins/load/load_gpucache.py
+++ b/openpype/hosts/maya/plugins/load/load_gpucache.py
@@ -37,7 +37,8 @@ class GpuCacheLoader(load.LoaderPlugin):
label = "{}:{}".format(namespace, name)
root = cmds.group(name=label, empty=True)
- settings = get_project_settings(os.environ['AVALON_PROJECT'])
+ project_name = context["project"]["name"]
+ settings = get_project_settings(project_name)
colors = settings['maya']['load']['colors']
c = colors.get('model')
if c is not None:
@@ -56,7 +57,8 @@ class GpuCacheLoader(load.LoaderPlugin):
name="{0}Shape".format(transform_name))
# Set the cache filepath
- cmds.setAttr(cache + '.cacheFileName', self.fname, type="string")
+ path = self.filepath_from_context(context)
+ cmds.setAttr(cache + '.cacheFileName', path, type="string")
cmds.setAttr(cache + '.cacheGeomPath', "|", type="string") # root
# Lock parenting of the transform and cache
diff --git a/openpype/hosts/maya/plugins/load/load_image.py b/openpype/hosts/maya/plugins/load/load_image.py
index 552bcc33af..3b1f5442ce 100644
--- a/openpype/hosts/maya/plugins/load/load_image.py
+++ b/openpype/hosts/maya/plugins/load/load_image.py
@@ -4,7 +4,8 @@ import copy
from openpype.lib import EnumDef
from openpype.pipeline import (
load,
- get_representation_context
+ get_representation_context,
+ get_current_host_name,
)
from openpype.pipeline.load.utils import get_representation_path_from_context
from openpype.pipeline.colorspace import (
@@ -266,7 +267,7 @@ class FileNodeLoader(load.LoaderPlugin):
# Assume colorspace from filepath based on project settings
project_name = context["project"]["name"]
- host_name = os.environ.get("AVALON_APP")
+ host_name = get_current_host_name()
project_settings = get_project_settings(project_name)
config_data = get_imageio_config(
diff --git a/openpype/hosts/maya/plugins/load/load_image_plane.py b/openpype/hosts/maya/plugins/load/load_image_plane.py
index bf13708e9b..117f4f4202 100644
--- a/openpype/hosts/maya/plugins/load/load_image_plane.py
+++ b/openpype/hosts/maya/plugins/load/load_image_plane.py
@@ -6,9 +6,9 @@ from openpype.client import (
get_version_by_id,
)
from openpype.pipeline import (
- legacy_io,
load,
- get_representation_path
+ get_representation_path,
+ get_current_project_name,
)
from openpype.hosts.maya.api.pipeline import containerise
from openpype.hosts.maya.api.lib import (
@@ -221,7 +221,7 @@ class ImagePlaneLoader(load.LoaderPlugin):
type="string")
# Set frame range.
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version = get_version_by_id(
project_name, representation["parent"], fields=["parent"]
)
diff --git a/openpype/hosts/maya/plugins/load/load_look.py b/openpype/hosts/maya/plugins/load/load_look.py
index 8f3e017658..20617c77bf 100644
--- a/openpype/hosts/maya/plugins/load/load_look.py
+++ b/openpype/hosts/maya/plugins/load/load_look.py
@@ -7,14 +7,14 @@ from qtpy import QtWidgets
from openpype.client import get_representation_by_name
from openpype.pipeline import (
- legacy_io,
+ get_current_project_name,
get_representation_path,
)
import openpype.hosts.maya.api.plugin
from openpype.hosts.maya.api import lib
from openpype.widgets.message_window import ScrollMessageBox
-from openpype.hosts.maya.api.plugin import get_reference_node
+from openpype.hosts.maya.api.lib import get_reference_node
class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
@@ -29,11 +29,13 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
color = "orange"
def process_reference(self, context, name, namespace, options):
- import maya.cmds as cmds
+ from maya import cmds
with lib.maintained_selection():
- file_url = self.prepare_root_value(self.fname,
- context["project"]["name"])
+ file_url = self.prepare_root_value(
+ file_url=self.filepath_from_context(context),
+ project_name=context["project"]["name"]
+ )
nodes = cmds.file(file_url,
namespace=namespace,
reference=True,
@@ -76,7 +78,7 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
shader_nodes = cmds.ls(members, type='shadingEngine')
nodes = set(self._get_nodes_with_shader(shader_nodes))
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
json_representation = get_representation_by_name(
project_name, "json", representation["parent"]
)
@@ -113,8 +115,8 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
# region compute lookup
nodes_by_id = defaultdict(list)
- for n in nodes:
- nodes_by_id[lib.get_id(n)].append(n)
+ for node in nodes:
+ nodes_by_id[lib.get_id(node)].append(node)
lib.apply_attributes(attributes, nodes_by_id)
def _get_nodes_with_shader(self, shader_nodes):
@@ -125,14 +127,16 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
Returns
node names
"""
- import maya.cmds as cmds
+ from maya import cmds
- nodes_list = []
for shader in shader_nodes:
- connections = cmds.listConnections(cmds.listHistory(shader, f=1),
+ future = cmds.listHistory(shader, future=True)
+ connections = cmds.listConnections(future,
type='mesh')
if connections:
- for connection in connections:
- nodes_list.extend(cmds.listRelatives(connection,
- shapes=True))
- return nodes_list
+ # Ensure unique entries only to optimize query and results
+ connections = list(set(connections))
+ return cmds.listRelatives(connections,
+ shapes=True,
+ fullPath=True) or []
+ return []
diff --git a/openpype/hosts/maya/plugins/load/load_matchmove.py b/openpype/hosts/maya/plugins/load/load_matchmove.py
index ee3332bd09..46d1be8300 100644
--- a/openpype/hosts/maya/plugins/load/load_matchmove.py
+++ b/openpype/hosts/maya/plugins/load/load_matchmove.py
@@ -1,7 +1,6 @@
from maya import mel
from openpype.pipeline import load
-
class MatchmoveLoader(load.LoaderPlugin):
"""
This will run matchmove script to create track in scene.
@@ -18,11 +17,12 @@ class MatchmoveLoader(load.LoaderPlugin):
color = "orange"
def load(self, context, name, namespace, data):
- if self.fname.lower().endswith(".py"):
- exec(open(self.fname).read())
+ path = self.filepath_from_context(context)
+ if path.lower().endswith(".py"):
+ exec(open(path).read())
- elif self.fname.lower().endswith(".mel"):
- mel.eval('source "{}"'.format(self.fname))
+ elif path.lower().endswith(".mel"):
+ mel.eval('source "{}"'.format(path))
else:
self.log.error("Unsupported script type")
diff --git a/openpype/hosts/maya/plugins/load/load_multiverse_usd.py b/openpype/hosts/maya/plugins/load/load_multiverse_usd.py
index 9e0d38df46..d08fcd904e 100644
--- a/openpype/hosts/maya/plugins/load/load_multiverse_usd.py
+++ b/openpype/hosts/maya/plugins/load/load_multiverse_usd.py
@@ -36,6 +36,8 @@ class MultiverseUsdLoader(load.LoaderPlugin):
suffix="_",
)
+ path = self.filepath_from_context(context)
+
# Make sure we can load the plugin
cmds.loadPlugin("MultiverseForMaya", quiet=True)
import multiverse
@@ -46,7 +48,7 @@ class MultiverseUsdLoader(load.LoaderPlugin):
with maintained_selection():
cmds.namespace(addNamespace=namespace)
with namespaced(namespace, new=False):
- shape = multiverse.CreateUsdCompound(self.fname)
+ shape = multiverse.CreateUsdCompound(path)
transform = cmds.listRelatives(
shape, parent=True, fullPath=True)[0]
diff --git a/openpype/hosts/maya/plugins/load/load_multiverse_usd_over.py b/openpype/hosts/maya/plugins/load/load_multiverse_usd_over.py
index 8a25508ac2..be8d78607b 100644
--- a/openpype/hosts/maya/plugins/load/load_multiverse_usd_over.py
+++ b/openpype/hosts/maya/plugins/load/load_multiverse_usd_over.py
@@ -50,9 +50,10 @@ class MultiverseUsdOverLoader(load.LoaderPlugin):
cmds.loadPlugin("MultiverseForMaya", quiet=True)
import multiverse
+ path = self.filepath_from_context(context)
nodes = current_usd
with maintained_selection():
- multiverse.AddUsdCompoundAssetPath(current_usd[0], self.fname)
+ multiverse.AddUsdCompoundAssetPath(current_usd[0], path)
namespace = current_usd[0].split("|")[1].split(":")[0]
diff --git a/openpype/hosts/maya/plugins/load/load_redshift_proxy.py b/openpype/hosts/maya/plugins/load/load_redshift_proxy.py
index c288e23ded..b3fbfb2ed9 100644
--- a/openpype/hosts/maya/plugins/load/load_redshift_proxy.py
+++ b/openpype/hosts/maya/plugins/load/load_redshift_proxy.py
@@ -46,18 +46,19 @@ class RedshiftProxyLoader(load.LoaderPlugin):
# Ensure Redshift for Maya is loaded.
cmds.loadPlugin("redshift4maya", quiet=True)
+ path = self.filepath_from_context(context)
with maintained_selection():
cmds.namespace(addNamespace=namespace)
with namespaced(namespace, new=False):
- nodes, group_node = self.create_rs_proxy(
- name, self.fname)
+ nodes, group_node = self.create_rs_proxy(name, path)
self[:] = nodes
if not nodes:
return
# colour the group node
- settings = get_project_settings(os.environ['AVALON_PROJECT'])
+ project_name = context["project"]["name"]
+ settings = get_project_settings(project_name)
colors = settings['maya']['load']['colors']
c = colors.get(family)
if c is not None:
diff --git a/openpype/hosts/maya/plugins/load/load_reference.py b/openpype/hosts/maya/plugins/load/load_reference.py
index deadd5b9d3..d339aff69c 100644
--- a/openpype/hosts/maya/plugins/load/load_reference.py
+++ b/openpype/hosts/maya/plugins/load/load_reference.py
@@ -118,15 +118,16 @@ class ReferenceLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
except ValueError:
family = "model"
+ project_name = context["project"]["name"]
# True by default to keep legacy behaviours
attach_to_root = options.get("attach_to_root", True)
group_name = options["group_name"]
+ path = self.filepath_from_context(context)
with maintained_selection():
cmds.loadPlugin("AbcImport.mll", quiet=True)
- file_url = self.prepare_root_value(self.fname,
- context["project"]["name"])
+ file_url = self.prepare_root_value(path, project_name)
nodes = cmds.file(file_url,
namespace=namespace,
sharedReferenceFile=False,
@@ -162,7 +163,7 @@ class ReferenceLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
with parent_nodes(roots, parent=None):
cmds.xform(group_name, zeroTransformPivots=True)
- settings = get_project_settings(os.environ['AVALON_PROJECT'])
+ settings = get_project_settings(project_name)
display_handle = settings['maya']['load'].get(
'reference_loader', {}
diff --git a/openpype/hosts/maya/plugins/load/load_rendersetup.py b/openpype/hosts/maya/plugins/load/load_rendersetup.py
index 7a2d8b1002..8b85f11958 100644
--- a/openpype/hosts/maya/plugins/load/load_rendersetup.py
+++ b/openpype/hosts/maya/plugins/load/load_rendersetup.py
@@ -43,8 +43,9 @@ class RenderSetupLoader(load.LoaderPlugin):
prefix="_" if asset[0].isdigit() else "",
suffix="_",
)
- self.log.info(">>> loading json [ {} ]".format(self.fname))
- with open(self.fname, "r") as file:
+ path = self.filepath_from_context(context)
+ self.log.info(">>> loading json [ {} ]".format(path))
+ with open(path, "r") as file:
renderSetup.instance().decode(
json.load(file), renderSetup.DECODE_AND_OVERWRITE, None)
diff --git a/openpype/hosts/maya/plugins/load/load_vdb_to_arnold.py b/openpype/hosts/maya/plugins/load/load_vdb_to_arnold.py
index 8a386cecfd..0f674a69c4 100644
--- a/openpype/hosts/maya/plugins/load/load_vdb_to_arnold.py
+++ b/openpype/hosts/maya/plugins/load/load_vdb_to_arnold.py
@@ -48,7 +48,8 @@ class LoadVDBtoArnold(load.LoaderPlugin):
label = "{}:{}".format(namespace, name)
root = cmds.group(name=label, empty=True)
- settings = get_project_settings(os.environ['AVALON_PROJECT'])
+ project_name = context["project"]["name"]
+ settings = get_project_settings(project_name)
colors = settings['maya']['load']['colors']
c = colors.get(family)
@@ -65,8 +66,9 @@ class LoadVDBtoArnold(load.LoaderPlugin):
name="{}Shape".format(root),
parent=root)
+ path = self.filepath_from_context(context)
self._set_path(grid_node,
- path=self.fname,
+ path=path,
representation=context["representation"])
# Lock the shape node so the user can't delete the transform/shape
diff --git a/openpype/hosts/maya/plugins/load/load_vdb_to_redshift.py b/openpype/hosts/maya/plugins/load/load_vdb_to_redshift.py
index 1f02321dc8..28cfdc7129 100644
--- a/openpype/hosts/maya/plugins/load/load_vdb_to_redshift.py
+++ b/openpype/hosts/maya/plugins/load/load_vdb_to_redshift.py
@@ -67,7 +67,8 @@ class LoadVDBtoRedShift(load.LoaderPlugin):
label = "{}:{}".format(namespace, name)
root = cmds.createNode("transform", name=label)
- settings = get_project_settings(os.environ['AVALON_PROJECT'])
+ project_name = context["project"]["name"]
+ settings = get_project_settings(project_name)
colors = settings['maya']['load']['colors']
c = colors.get(family)
@@ -85,7 +86,7 @@ class LoadVDBtoRedShift(load.LoaderPlugin):
parent=root)
self._set_path(volume_node,
- path=self.fname,
+ path=self.filepath_from_context(context),
representation=context["representation"])
nodes = [root, volume_node]
diff --git a/openpype/hosts/maya/plugins/load/load_vdb_to_vray.py b/openpype/hosts/maya/plugins/load/load_vdb_to_vray.py
index 9267c59c02..46f2dd674d 100644
--- a/openpype/hosts/maya/plugins/load/load_vdb_to_vray.py
+++ b/openpype/hosts/maya/plugins/load/load_vdb_to_vray.py
@@ -88,8 +88,9 @@ class LoadVDBtoVRay(load.LoaderPlugin):
from openpype.hosts.maya.api.lib import unique_namespace
from openpype.hosts.maya.api.pipeline import containerise
- assert os.path.exists(self.fname), (
- "Path does not exist: %s" % self.fname
+ path = self.filepath_from_context(context)
+ assert os.path.exists(path), (
+ "Path does not exist: %s" % path
)
try:
@@ -126,7 +127,8 @@ class LoadVDBtoVRay(load.LoaderPlugin):
label = "{}:{}_VDB".format(namespace, name)
root = cmds.group(name=label, empty=True)
- settings = get_project_settings(os.environ['AVALON_PROJECT'])
+ project_name = context["project"]["name"]
+ settings = get_project_settings(project_name)
colors = settings['maya']['load']['colors']
c = colors.get(family)
@@ -146,7 +148,7 @@ class LoadVDBtoVRay(load.LoaderPlugin):
cmds.connectAttr("time1.outTime", grid_node + ".currentTime")
# Set path
- self._set_path(grid_node, self.fname, show_preset_popup=True)
+ self._set_path(grid_node, path, show_preset_popup=True)
# Lock the shape node so the user can't delete the transform/shape
# as if it was referenced
diff --git a/openpype/hosts/maya/plugins/load/load_vrayproxy.py b/openpype/hosts/maya/plugins/load/load_vrayproxy.py
index 64184f9e7b..9d926a33ed 100644
--- a/openpype/hosts/maya/plugins/load/load_vrayproxy.py
+++ b/openpype/hosts/maya/plugins/load/load_vrayproxy.py
@@ -12,9 +12,9 @@ import maya.cmds as cmds
from openpype.client import get_representation_by_name
from openpype.settings import get_project_settings
from openpype.pipeline import (
- legacy_io,
load,
- get_representation_path
+ get_current_project_name,
+ get_representation_path,
)
from openpype.hosts.maya.api.lib import (
maintained_selection,
@@ -53,7 +53,9 @@ class VRayProxyLoader(load.LoaderPlugin):
family = "vrayproxy"
# get all representations for this version
- self.fname = self._get_abc(context["version"]["_id"]) or self.fname
+ filename = self._get_abc(context["version"]["_id"])
+ if not filename:
+ filename = self.filepath_from_context(context)
asset_name = context['asset']["name"]
namespace = namespace or unique_namespace(
@@ -69,14 +71,15 @@ class VRayProxyLoader(load.LoaderPlugin):
cmds.namespace(addNamespace=namespace)
with namespaced(namespace, new=False):
nodes, group_node = self.create_vray_proxy(
- name, filename=self.fname)
+ name, filename=filename)
self[:] = nodes
if not nodes:
return
# colour the group node
- settings = get_project_settings(os.environ['AVALON_PROJECT'])
+ project_name = context["project"]["name"]
+ settings = get_project_settings(project_name)
colors = settings['maya']['load']['colors']
c = colors.get(family)
if c is not None:
@@ -185,12 +188,12 @@ class VRayProxyLoader(load.LoaderPlugin):
"""
self.log.debug(
"Looking for abc in published representations of this version.")
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
abc_rep = get_representation_by_name(project_name, "abc", version_id)
if abc_rep:
self.log.debug("Found, we'll link alembic to vray proxy.")
file_name = get_representation_path(abc_rep)
- self.log.debug("File: {}".format(self.fname))
+ self.log.debug("File: {}".format(file_name))
return file_name
return ""
diff --git a/openpype/hosts/maya/plugins/load/load_vrayscene.py b/openpype/hosts/maya/plugins/load/load_vrayscene.py
index d87992f9a7..3a2c3a47f2 100644
--- a/openpype/hosts/maya/plugins/load/load_vrayscene.py
+++ b/openpype/hosts/maya/plugins/load/load_vrayscene.py
@@ -46,15 +46,18 @@ class VRaySceneLoader(load.LoaderPlugin):
with maintained_selection():
cmds.namespace(addNamespace=namespace)
with namespaced(namespace, new=False):
- nodes, root_node = self.create_vray_scene(name,
- filename=self.fname)
+ nodes, root_node = self.create_vray_scene(
+ name,
+ filename=self.filepath_from_context(context)
+ )
self[:] = nodes
if not nodes:
return
# colour the group node
- settings = get_project_settings(os.environ['AVALON_PROJECT'])
+ project_name = context["project"]["name"]
+ settings = get_project_settings(project_name)
colors = settings['maya']['load']['colors']
c = colors.get(family)
if c is not None:
diff --git a/openpype/hosts/maya/plugins/load/load_xgen.py b/openpype/hosts/maya/plugins/load/load_xgen.py
index 16f2e8e842..323f8d7eda 100644
--- a/openpype/hosts/maya/plugins/load/load_xgen.py
+++ b/openpype/hosts/maya/plugins/load/load_xgen.py
@@ -48,7 +48,8 @@ class XgenLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
return
maya_filepath = self.prepare_root_value(
- self.fname, context["project"]["name"]
+ file_url=self.filepath_from_context(context),
+ project_name=context["project"]["name"]
)
# Reference xgen. Xgen does not like being referenced in under a group.
diff --git a/openpype/hosts/maya/plugins/load/load_yeti_cache.py b/openpype/hosts/maya/plugins/load/load_yeti_cache.py
index 5ba381050a..5cded13d4e 100644
--- a/openpype/hosts/maya/plugins/load/load_yeti_cache.py
+++ b/openpype/hosts/maya/plugins/load/load_yeti_cache.py
@@ -60,15 +60,17 @@ class YetiCacheLoader(load.LoaderPlugin):
cmds.loadPlugin("pgYetiMaya", quiet=True)
# Create Yeti cache nodes according to settings
- settings = self.read_settings(self.fname)
+ path = self.filepath_from_context(context)
+ settings = self.read_settings(path)
nodes = []
for node in settings["nodes"]:
nodes.extend(self.create_node(namespace, node))
group_name = "{}:{}".format(namespace, name)
group_node = cmds.group(nodes, name=group_name)
+ project_name = context["project"]["name"]
- settings = get_project_settings(os.environ['AVALON_PROJECT'])
+ settings = get_project_settings(project_name)
colors = settings['maya']['load']['colors']
c = colors.get(family)
diff --git a/openpype/hosts/maya/plugins/load/load_yeti_rig.py b/openpype/hosts/maya/plugins/load/load_yeti_rig.py
index b8066871b0..c9dfe9478b 100644
--- a/openpype/hosts/maya/plugins/load/load_yeti_rig.py
+++ b/openpype/hosts/maya/plugins/load/load_yeti_rig.py
@@ -20,9 +20,10 @@ class YetiRigLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
self, context, name=None, namespace=None, options=None
):
group_name = options['group_name']
+ path = self.filepath_from_context(context)
with lib.maintained_selection():
file_url = self.prepare_root_value(
- self.fname, context["project"]["name"]
+ path, context["project"]["name"]
)
nodes = cmds.file(
file_url,
diff --git a/openpype/hosts/maya/plugins/publish/collect_review.py b/openpype/hosts/maya/plugins/publish/collect_review.py
index 84de971915..6cb10f9066 100644
--- a/openpype/hosts/maya/plugins/publish/collect_review.py
+++ b/openpype/hosts/maya/plugins/publish/collect_review.py
@@ -98,7 +98,7 @@ class CollectReview(pyblish.api.InstancePlugin):
# representations. Once plugins like Extract Review start
# using representations, this should be removed from here
# as Extract Playblast is already adding fps to representation.
- data['fps'] = instance.context.data['fps']
+ data['fps'] = context.data['fps']
data['review_width'] = instance.data['review_width']
data['review_height'] = instance.data['review_height']
data["isolate"] = instance.data["isolate"]
diff --git a/openpype/hosts/maya/plugins/publish/extract_layout.py b/openpype/hosts/maya/plugins/publish/extract_layout.py
index 7921fca069..bf5b4fc0e7 100644
--- a/openpype/hosts/maya/plugins/publish/extract_layout.py
+++ b/openpype/hosts/maya/plugins/publish/extract_layout.py
@@ -6,7 +6,7 @@ from maya import cmds
from maya.api import OpenMaya as om
from openpype.client import get_representation_by_id
-from openpype.pipeline import legacy_io, publish
+from openpype.pipeline import publish
class ExtractLayout(publish.Extractor):
@@ -30,7 +30,7 @@ class ExtractLayout(publish.Extractor):
json_data = []
# TODO representation queries can be refactored to be faster
- project_name = legacy_io.active_project()
+ project_name = instance.context.data["projectName"]
for asset in cmds.sets(str(instance), query=True):
# Find the container
diff --git a/openpype/hosts/maya/plugins/publish/submit_maya_muster.py b/openpype/hosts/maya/plugins/publish/submit_maya_muster.py
index 298c3bd345..8e219eae85 100644
--- a/openpype/hosts/maya/plugins/publish/submit_maya_muster.py
+++ b/openpype/hosts/maya/plugins/publish/submit_maya_muster.py
@@ -265,6 +265,8 @@ class MayaSubmitMuster(pyblish.api.InstancePlugin):
context = instance.context
workspace = context.data["workspaceDir"]
+ project_name = context.data["projectName"]
+ asset_name = context.data["asset"]
filepath = None
@@ -371,8 +373,8 @@ class MayaSubmitMuster(pyblish.api.InstancePlugin):
"jobId": -1,
"startOn": 0,
"parentId": -1,
- "project": os.environ.get('AVALON_PROJECT') or scene,
- "shot": os.environ.get('AVALON_ASSET') or scene,
+ "project": project_name or scene,
+ "shot": asset_name or scene,
"camera": instance.data.get("cameras")[0],
"dependMode": 0,
"packetSize": 4,
diff --git a/openpype/hosts/maya/startup/userSetup.py b/openpype/hosts/maya/startup/userSetup.py
index ae6a999d98..f2899cdb37 100644
--- a/openpype/hosts/maya/startup/userSetup.py
+++ b/openpype/hosts/maya/startup/userSetup.py
@@ -1,7 +1,7 @@
import os
from openpype.settings import get_project_settings
-from openpype.pipeline import install_host
+from openpype.pipeline import install_host, get_current_project_name
from openpype.hosts.maya.api import MayaHost
from maya import cmds
@@ -12,10 +12,11 @@ install_host(host)
print("Starting OpenPype usersetup...")
-project_settings = get_project_settings(os.environ['AVALON_PROJECT'])
+project_name = get_current_project_name()
+settings = get_project_settings(project_name)
# Loading plugins explicitly.
-explicit_plugins_loading = project_settings["maya"]["explicit_plugins_loading"]
+explicit_plugins_loading = settings["maya"]["explicit_plugins_loading"]
if explicit_plugins_loading["enabled"]:
def _explicit_load_plugins():
for plugin in explicit_plugins_loading["plugins_to_load"]:
@@ -46,17 +47,16 @@ if bool(int(os.environ.get(key, "0"))):
)
# Build a shelf.
-shelf_preset = project_settings['maya'].get('project_shelf')
-
+shelf_preset = settings['maya'].get('project_shelf')
if shelf_preset:
- project = os.environ["AVALON_PROJECT"]
-
- icon_path = os.path.join(os.environ['OPENPYPE_PROJECT_SCRIPTS'],
- project, "icons")
+ icon_path = os.path.join(
+ os.environ['OPENPYPE_PROJECT_SCRIPTS'],
+ project_name,
+ "icons")
icon_path = os.path.abspath(icon_path)
for i in shelf_preset['imports']:
- import_string = "from {} import {}".format(project, i)
+ import_string = "from {} import {}".format(project_name, i)
print(import_string)
exec(import_string)
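
The `userSetup.py` hunk above is representative of the whole refactor: direct `AVALON_*` environment reads are replaced with pipeline context getters. A minimal sketch of the pattern, assuming only the two imports shown in the diff (the wrapper function itself is hypothetical):

```python
from openpype.pipeline import get_current_project_name
from openpype.settings import get_project_settings


def load_maya_project_settings():
    # Before this refactor:
    #   get_project_settings(os.environ["AVALON_PROJECT"])
    # The getter resolves the same value through the pipeline context
    # instead of reading the environment variable directly.
    project_name = get_current_project_name()
    return get_project_settings(project_name)
```
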
diff --git a/openpype/hosts/maya/tools/mayalookassigner/app.py b/openpype/hosts/maya/tools/mayalookassigner/app.py
index 64fc04dfc4..b5ce7ada34 100644
--- a/openpype/hosts/maya/tools/mayalookassigner/app.py
+++ b/openpype/hosts/maya/tools/mayalookassigner/app.py
@@ -4,9 +4,9 @@ import logging
from qtpy import QtWidgets, QtCore
-from openpype.client import get_last_version_by_subset_id
from openpype import style
-from openpype.pipeline import legacy_io
+from openpype.client import get_last_version_by_subset_id
+from openpype.pipeline import get_current_project_name
from openpype.tools.utils.lib import qt_app_context
from openpype.hosts.maya.api.lib import (
assign_look_by_version,
@@ -216,7 +216,7 @@ class MayaLookAssignerWindow(QtWidgets.QWidget):
selection = self.assign_selected.isChecked()
asset_nodes = self.asset_outliner.get_nodes(selection=selection)
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
start = time.time()
for i, (asset, item) in enumerate(asset_nodes.items()):
diff --git a/openpype/hosts/maya/tools/mayalookassigner/commands.py b/openpype/hosts/maya/tools/mayalookassigner/commands.py
index c5e6c973cf..a1290aa68d 100644
--- a/openpype/hosts/maya/tools/mayalookassigner/commands.py
+++ b/openpype/hosts/maya/tools/mayalookassigner/commands.py
@@ -1,14 +1,14 @@
-from collections import defaultdict
-import logging
import os
+import logging
+from collections import defaultdict
import maya.cmds as cmds
-from openpype.client import get_asset_by_id
+from openpype.client import get_assets
from openpype.pipeline import (
- legacy_io,
remove_container,
registered_host,
+ get_current_project_name,
)
from openpype.hosts.maya.api import lib
@@ -126,18 +126,24 @@ def create_items_from_nodes(nodes):
log.warning("No id hashes")
return asset_view_items
- project_name = legacy_io.active_project()
- for _id, id_nodes in id_hashes.items():
- asset = get_asset_by_id(project_name, _id, fields=["name"])
+ project_name = get_current_project_name()
+ asset_ids = set(id_hashes.keys())
+ asset_docs = get_assets(project_name, asset_ids, fields=["name"])
+ asset_docs_by_id = {
+ str(asset_doc["_id"]): asset_doc
+ for asset_doc in asset_docs
+ }
+ for asset_id, id_nodes in id_hashes.items():
+ asset_doc = asset_docs_by_id.get(asset_id)
# Skip if asset id is not found
- if not asset:
+ if not asset_doc:
log.warning("Id not found in the database, skipping '%s'." % _id)
log.warning("Nodes: %s" % id_nodes)
continue
# Collect available look subsets for this asset
- looks = lib.list_looks(asset["_id"])
+ looks = lib.list_looks(project_name, asset_doc["_id"])
# Collect namespaces the asset is found in
namespaces = set()
@@ -146,8 +152,8 @@ def create_items_from_nodes(nodes):
namespaces.add(namespace)
asset_view_items.append({
- "label": asset["name"],
- "asset": asset,
+ "label": asset_doc["name"],
+ "asset": asset_doc,
"looks": looks,
"namespaces": namespaces
})
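
The `commands.py` change above also converts one query per asset id into a single batched query plus an in-memory index. A condensed sketch of that lookup pattern (assuming `get_assets` yields documents with an ObjectId under `"_id"`, as the hunk does):

```python
from openpype.client import get_assets


def index_assets_by_id(project_name, asset_ids):
    # One database round-trip for all ids instead of one query per id.
    asset_docs = get_assets(project_name, asset_ids, fields=["name"])
    return {
        str(asset_doc["_id"]): asset_doc
        for asset_doc in asset_docs
    }
```
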
diff --git a/openpype/hosts/maya/tools/mayalookassigner/vray_proxies.py b/openpype/hosts/maya/tools/mayalookassigner/vray_proxies.py
index c875fec7f0..97fb832f71 100644
--- a/openpype/hosts/maya/tools/mayalookassigner/vray_proxies.py
+++ b/openpype/hosts/maya/tools/mayalookassigner/vray_proxies.py
@@ -6,7 +6,7 @@ import logging
from maya import cmds
from openpype.client import get_last_version_by_subset_name
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_project_name
import openpype.hosts.maya.lib as maya_lib
from . import lib
from .alembic import get_alembic_ids_cache
@@ -76,7 +76,7 @@ def vrayproxy_assign_look(vrayproxy, subset="lookDefault"):
asset_id = node_id.split(":", 1)[0]
node_ids_by_asset_id[asset_id].add(node_id)
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
for asset_id, node_ids in node_ids_by_asset_id.items():
# Get latest look version
diff --git a/openpype/hosts/nuke/api/lib.py b/openpype/hosts/nuke/api/lib.py
index 0efc46edaf..364c8eeff4 100644
--- a/openpype/hosts/nuke/api/lib.py
+++ b/openpype/hosts/nuke/api/lib.py
@@ -42,8 +42,9 @@ from openpype.pipeline.template_data import get_template_data_with_names
from openpype.pipeline import (
get_current_project_name,
discover_legacy_creator_plugins,
- legacy_io,
Anatomy,
+    get_current_host_name,
+    get_current_asset_name,
)
from openpype.pipeline.context_tools import (
get_current_project_asset,
@@ -970,7 +972,7 @@ def check_inventory_versions():
if not repre_ids:
return
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
# Find representations based on found containers
repre_docs = get_representations(
project_name,
@@ -1128,11 +1130,15 @@ def format_anatomy(data):
anatomy = Anatomy()
log.debug("__ anatomy.templates: {}".format(anatomy.templates))
- padding = int(
- anatomy.templates["render"].get(
- "frame_padding"
+ padding = None
+    if "frame_padding" in anatomy.templates:
+ padding = int(anatomy.templates["frame_padding"])
+    elif "render" in anatomy.templates:
+ padding = int(
+ anatomy.templates["render"].get(
+ "frame_padding"
+ )
)
- )
version = data.get("version", None)
if not version:
@@ -1142,7 +1148,7 @@ def format_anatomy(data):
project_name = anatomy.project_name
asset_name = data["asset"]
task_name = data["task"]
- host_name = os.environ["AVALON_APP"]
+ host_name = get_current_host_name()
context_data = get_template_data_with_names(
project_name, asset_name, task_name, host_name
)
@@ -1470,7 +1476,7 @@ def create_write_node_legacy(
if knob["name"] == "file_type":
representation = knob["value"]
- host_name = os.environ.get("AVALON_APP")
+ host_name = get_current_host_name()
try:
data.update({
"app": host_name,
@@ -1929,15 +1935,18 @@ class WorkfileSettings(object):
def __init__(self, root_node=None, nodes=None, **kwargs):
project_doc = kwargs.get("project")
if project_doc is None:
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
project_doc = get_project(project_name)
+ else:
+ project_name = project_doc["name"]
Context._project_doc = project_doc
+ self._project_name = project_name
self._asset = (
kwargs.get("asset_name")
- or legacy_io.Session["AVALON_ASSET"]
+ or get_current_asset_name()
)
- self._asset_entity = get_current_project_asset(self._asset)
+ self._asset_entity = get_asset_by_name(project_name, self._asset)
self._root_node = root_node or nuke.root()
self._nodes = self.get_nodes(nodes=nodes)
@@ -2330,7 +2339,7 @@ Reopening Nuke should synchronize these paths and resolve any discrepancies.
def reset_resolution(self):
"""Set resolution to project resolution."""
log.info("Resetting resolution")
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
asset_data = self._asset_entity["data"]
format_data = {
@@ -2409,7 +2418,7 @@ Reopening Nuke should synchronize these paths and resolve any discrepancies.
from .utils import set_context_favorites
work_dir = os.getenv("AVALON_WORKDIR")
- asset = os.getenv("AVALON_ASSET")
+ asset = get_current_asset_name()
favorite_items = OrderedDict()
# project
@@ -2832,7 +2841,8 @@ def add_scripts_menu():
return
# load configuration of custom menu
- project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+ project_name = get_current_project_name()
+ project_settings = get_project_settings(project_name)
config = project_settings["nuke"]["scriptsmenu"]["definition"]
_menu = project_settings["nuke"]["scriptsmenu"]["name"]
@@ -2850,7 +2860,8 @@ def add_scripts_menu():
def add_scripts_gizmo():
# load configuration of custom menu
- project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+ project_name = get_current_project_name()
+ project_settings = get_project_settings(project_name)
platform_name = platform.system().lower()
for gizmo_settings in project_settings["nuke"]["gizmo"]:
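
The `format_anatomy` hunk above introduces a fallback chain for the frame padding lookup. Condensed into a standalone function (a sketch, assuming `anatomy.templates` behaves like a plain dict):

```python
def get_frame_padding(templates):
    # Prefer the new top-level "frame_padding" key, fall back to the
    # legacy per-template location, and return None when neither exists.
    if "frame_padding" in templates:
        return int(templates["frame_padding"])
    if "render" in templates:
        return int(templates["render"].get("frame_padding"))
    return None
```
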
diff --git a/openpype/hosts/nuke/api/pipeline.py b/openpype/hosts/nuke/api/pipeline.py
index 8406a251e9..cdfc8aa512 100644
--- a/openpype/hosts/nuke/api/pipeline.py
+++ b/openpype/hosts/nuke/api/pipeline.py
@@ -20,6 +20,8 @@ from openpype.pipeline import (
register_creator_plugin_path,
register_inventory_action_path,
AVALON_CONTAINER_ID,
+ get_current_asset_name,
+ get_current_task_name,
)
from openpype.pipeline.workfile import BuildWorkfile
from openpype.tools.utils import host_tools
@@ -211,6 +213,13 @@ def _show_workfiles():
host_tools.show_workfiles(parent=None, on_top=False)
+def get_context_label():
+ return "{0}, {1}".format(
+ get_current_asset_name(),
+ get_current_task_name()
+ )
+
+
def _install_menu():
"""Install Avalon menu into Nuke's main menu bar."""
@@ -220,9 +229,7 @@ def _install_menu():
menu = menubar.addMenu(MENU_LABEL)
if not ASSIST:
- label = "{0}, {1}".format(
- os.environ["AVALON_ASSET"], os.environ["AVALON_TASK"]
- )
+ label = get_context_label()
Context.context_label = label
context_action = menu.addCommand(label)
context_action.setEnabled(False)
@@ -338,9 +345,7 @@ def change_context_label():
menubar = nuke.menu("Nuke")
menu = menubar.findItem(MENU_LABEL)
- label = "{0}, {1}".format(
- os.environ["AVALON_ASSET"], os.environ["AVALON_TASK"]
- )
+ label = get_context_label()
rm_item = [
(i, item) for i, item in enumerate(menu.items())
diff --git a/openpype/hosts/nuke/api/plugin.py b/openpype/hosts/nuke/api/plugin.py
index 7035da2bb5..cfdb407d26 100644
--- a/openpype/hosts/nuke/api/plugin.py
+++ b/openpype/hosts/nuke/api/plugin.py
@@ -19,7 +19,7 @@ from openpype.pipeline import (
CreatorError,
Creator as NewCreator,
CreatedInstance,
- legacy_io
+ get_current_task_name
)
from .lib import (
INSTANCE_DATA_KNOB,
@@ -824,41 +824,6 @@ class ExporterReviewMov(ExporterReview):
add_tags = []
self.publish_on_farm = farm
read_raw = kwargs["read_raw"]
-
- # TODO: remove this when `reformat_nodes_config`
- # is changed in settings
- reformat_node_add = kwargs["reformat_node_add"]
- reformat_node_config = kwargs["reformat_node_config"]
-
- # TODO: make this required in future
- reformat_nodes_config = kwargs.get("reformat_nodes_config", {})
-
- # TODO: remove this once deprecated is removed
- # make sure only reformat_nodes_config is used in future
- if reformat_node_add and reformat_nodes_config.get("enabled"):
- self.log.warning(
- "`reformat_node_add` is deprecated. "
- "Please use only `reformat_nodes_config` instead.")
- reformat_nodes_config = None
-
- # TODO: reformat code when backward compatibility is not needed
- # warning if reformat_nodes_config is not set
- if not reformat_nodes_config:
- self.log.warning(
- "Please set `reformat_nodes_config` in settings. "
- "Using `reformat_node_config` instead."
- )
- reformat_nodes_config = {
- "enabled": reformat_node_add,
- "reposition_nodes": [
- {
- "node_class": "Reformat",
- "knobs": reformat_node_config
- }
- ]
- }
-
-
bake_viewer_process = kwargs["bake_viewer_process"]
bake_viewer_input_process_node = kwargs[
"bake_viewer_input_process"]
@@ -897,6 +862,7 @@ class ExporterReviewMov(ExporterReview):
self._shift_to_previous_node_and_temp(subset, r_node, "Read... `{}`")
# add reformat node
+ reformat_nodes_config = kwargs["reformat_nodes_config"]
if reformat_nodes_config["enabled"]:
reposition_nodes = reformat_nodes_config["reposition_nodes"]
for reposition_node in reposition_nodes:
@@ -1173,7 +1139,7 @@ def convert_to_valid_instaces():
from openpype.hosts.nuke.api import workio
- task_name = legacy_io.Session["AVALON_TASK"]
+ task_name = get_current_task_name()
# save into new workfile
current_file = workio.current_file()
diff --git a/openpype/hosts/nuke/plugins/create/workfile_creator.py b/openpype/hosts/nuke/plugins/create/workfile_creator.py
index 72ef61e63f..c4e0753abc 100644
--- a/openpype/hosts/nuke/plugins/create/workfile_creator.py
+++ b/openpype/hosts/nuke/plugins/create/workfile_creator.py
@@ -3,7 +3,6 @@ from openpype.client import get_asset_by_name
from openpype.pipeline import (
AutoCreator,
CreatedInstance,
- legacy_io,
)
from openpype.hosts.nuke.api import (
INSTANCE_DATA_KNOB,
@@ -27,10 +26,10 @@ class WorkfileCreator(AutoCreator):
root_node, api.INSTANCE_DATA_KNOB
)
- project_name = legacy_io.Session["AVALON_PROJECT"]
- asset_name = legacy_io.Session["AVALON_ASSET"]
- task_name = legacy_io.Session["AVALON_TASK"]
- host_name = legacy_io.Session["AVALON_APP"]
+ project_name = self.create_context.get_current_project_name()
+ asset_name = self.create_context.get_current_asset_name()
+ task_name = self.create_context.get_current_task_name()
+ host_name = self.create_context.host_name
asset_doc = get_asset_by_name(project_name, asset_name)
subset_name = self.get_subset_name(
diff --git a/openpype/hosts/nuke/plugins/load/load_backdrop.py b/openpype/hosts/nuke/plugins/load/load_backdrop.py
index 67c7877e60..fe82d70b5e 100644
--- a/openpype/hosts/nuke/plugins/load/load_backdrop.py
+++ b/openpype/hosts/nuke/plugins/load/load_backdrop.py
@@ -6,8 +6,8 @@ from openpype.client import (
get_last_version_by_subset_id,
)
from openpype.pipeline import (
- legacy_io,
load,
+ get_current_project_name,
get_representation_path,
)
from openpype.hosts.nuke.api.lib import (
@@ -72,7 +72,7 @@ class LoadBackdropNodes(load.LoaderPlugin):
data_imprint.update({k: version_data[k]})
# getting file path
- file = self.fname.replace("\\", "/")
+ file = self.filepath_from_context(context).replace("\\", "/")
# adding nodes to node graph
# just in case we are in group lets jump out of it
@@ -190,7 +190,7 @@ class LoadBackdropNodes(load.LoaderPlugin):
# get main variables
# Get version from io
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
# get corresponding node
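
The loader hunk above, and the many that follow, repeat one pattern: the stateful `self.fname` attribute is replaced by resolving the path from the load context. A hedged sketch of a loader using the new call (`filepath_from_context` is the API used throughout this diff; the loader body is illustrative):

```python
from openpype.pipeline import load


class ExampleLoader(load.LoaderPlugin):
    families = ["*"]
    representations = ["*"]

    def load(self, context, name, namespace, options):
        # Resolve the representation path from the context argument
        # instead of the mutable ``self.fname`` attribute.
        path = self.filepath_from_context(context).replace("\\", "/")
        self.log.info("file: {}".format(path))
```
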
diff --git a/openpype/hosts/nuke/plugins/load/load_camera_abc.py b/openpype/hosts/nuke/plugins/load/load_camera_abc.py
index 40822c9eb7..fec4ee556e 100644
--- a/openpype/hosts/nuke/plugins/load/load_camera_abc.py
+++ b/openpype/hosts/nuke/plugins/load/load_camera_abc.py
@@ -5,8 +5,8 @@ from openpype.client import (
get_last_version_by_subset_id
)
from openpype.pipeline import (
- legacy_io,
load,
+ get_current_project_name,
get_representation_path,
)
from openpype.hosts.nuke.api import (
@@ -57,7 +57,7 @@ class AlembicCameraLoader(load.LoaderPlugin):
data_imprint.update({k: version_data[k]})
# getting file path
- file = self.fname.replace("\\", "/")
+ file = self.filepath_from_context(context).replace("\\", "/")
with maintained_selection():
camera_node = nuke.createNode(
@@ -108,7 +108,7 @@ class AlembicCameraLoader(load.LoaderPlugin):
None
"""
# Get version from io
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
object_name = container['objectName']
@@ -180,7 +180,7 @@ class AlembicCameraLoader(load.LoaderPlugin):
""" Coloring a node by correct color by actual version
"""
# get all versions in list
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
last_version_doc = get_last_version_by_subset_id(
project_name, version_doc["parent"], fields=["_id"]
)
diff --git a/openpype/hosts/nuke/plugins/load/load_clip.py b/openpype/hosts/nuke/plugins/load/load_clip.py
index ee74582544..5539324fb7 100644
--- a/openpype/hosts/nuke/plugins/load/load_clip.py
+++ b/openpype/hosts/nuke/plugins/load/load_clip.py
@@ -8,7 +8,7 @@ from openpype.client import (
get_last_version_by_subset_id,
)
from openpype.pipeline import (
- legacy_io,
+ get_current_project_name,
get_representation_path,
)
from openpype.hosts.nuke.api.lib import (
@@ -99,7 +99,8 @@ class LoadClip(plugin.NukeLoader):
representation = self._representation_with_hash_in_frame(
representation
)
- filepath = get_representation_path(representation).replace("\\", "/")
+ filepath = self.filepath_from_context(context)
+ filepath = filepath.replace("\\", "/")
self.log.debug("_ filepath: {}".format(filepath))
start_at_workfile = options.get(
@@ -154,7 +155,7 @@ class LoadClip(plugin.NukeLoader):
read_node["file"].setValue(filepath)
used_colorspace = self._set_colorspace(
- read_node, version_data, representation["data"])
+ read_node, version_data, representation["data"], filepath)
self._set_range_to_node(read_node, first, last, start_at_workfile)
@@ -164,8 +165,8 @@ class LoadClip(plugin.NukeLoader):
"handleStart", "handleEnd"]
data_imprint = {}
- for k in add_keys:
- if k == 'version':
+ for key in add_keys:
+ if key == 'version':
version_doc = context["version"]
if version_doc["type"] == "hero_version":
version = "hero"
@@ -173,17 +174,20 @@ class LoadClip(plugin.NukeLoader):
version = version_doc.get("name")
if version:
- data_imprint[k] = version
+ data_imprint[key] = version
- elif k == 'colorspace':
- colorspace = representation["data"].get(k)
- colorspace = colorspace or version_data.get(k)
+ elif key == 'colorspace':
+ colorspace = representation["data"].get(key)
+ colorspace = colorspace or version_data.get(key)
data_imprint["db_colorspace"] = colorspace
if used_colorspace:
data_imprint["used_colorspace"] = used_colorspace
else:
- data_imprint[k] = context["version"]['data'].get(
- k, str(None))
+ value_ = context["version"]['data'].get(
+ key, str(None))
+                if isinstance(value_, str):
+ value_ = value_.replace("\\", "/")
+ data_imprint[key] = value_
data_imprint["objectName"] = read_name
@@ -266,7 +270,7 @@ class LoadClip(plugin.NukeLoader):
if "addRetime" in key
]
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
version_data = version_doc.get("data", {})
@@ -303,8 +307,7 @@ class LoadClip(plugin.NukeLoader):
# we will switch off undo-ing
with viewer_update_and_undo_stop():
used_colorspace = self._set_colorspace(
- read_node, version_data, representation["data"],
- path=filepath)
+ read_node, version_data, representation["data"], filepath)
self._set_range_to_node(read_node, first, last, start_at_workfile)
@@ -451,9 +454,9 @@ class LoadClip(plugin.NukeLoader):
return self.node_name_template.format(**name_data)
- def _set_colorspace(self, node, version_data, repre_data, path=None):
+ def _set_colorspace(self, node, version_data, repre_data, path):
output_color = None
- path = path or self.fname.replace("\\", "/")
+ path = path.replace("\\", "/")
# get colorspace
colorspace = repre_data.get("colorspace")
colorspace = colorspace or version_data.get("colorspace")
diff --git a/openpype/hosts/nuke/plugins/load/load_effects.py b/openpype/hosts/nuke/plugins/load/load_effects.py
index eb1c905c4d..89597e76cc 100644
--- a/openpype/hosts/nuke/plugins/load/load_effects.py
+++ b/openpype/hosts/nuke/plugins/load/load_effects.py
@@ -8,8 +8,8 @@ from openpype.client import (
get_last_version_by_subset_id,
)
from openpype.pipeline import (
- legacy_io,
load,
+ get_current_project_name,
get_representation_path,
)
from openpype.hosts.nuke.api import (
@@ -72,7 +72,7 @@ class LoadEffects(load.LoaderPlugin):
data_imprint.update({k: version_data[k]})
# getting file path
- file = self.fname.replace("\\", "/")
+ file = self.filepath_from_context(context).replace("\\", "/")
# getting data from json file with unicode conversion
with open(file, "r") as f:
@@ -155,7 +155,7 @@ class LoadEffects(load.LoaderPlugin):
"""
# get main variables
# Get version from io
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
# get corresponding node
diff --git a/openpype/hosts/nuke/plugins/load/load_effects_ip.py b/openpype/hosts/nuke/plugins/load/load_effects_ip.py
index 03be8654ed..efe67be4aa 100644
--- a/openpype/hosts/nuke/plugins/load/load_effects_ip.py
+++ b/openpype/hosts/nuke/plugins/load/load_effects_ip.py
@@ -8,8 +8,8 @@ from openpype.client import (
get_last_version_by_subset_id,
)
from openpype.pipeline import (
- legacy_io,
load,
+ get_current_project_name,
get_representation_path,
)
from openpype.hosts.nuke.api import lib
@@ -73,7 +73,7 @@ class LoadEffectsInputProcess(load.LoaderPlugin):
data_imprint.update({k: version_data[k]})
# getting file path
- file = self.fname.replace("\\", "/")
+ file = self.filepath_from_context(context).replace("\\", "/")
# getting data from json file with unicode conversion
with open(file, "r") as f:
@@ -160,7 +160,7 @@ class LoadEffectsInputProcess(load.LoaderPlugin):
# get main variables
# Get version from io
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
# get corresponding node
diff --git a/openpype/hosts/nuke/plugins/load/load_gizmo.py b/openpype/hosts/nuke/plugins/load/load_gizmo.py
index 2aa7c49723..6b848ee276 100644
--- a/openpype/hosts/nuke/plugins/load/load_gizmo.py
+++ b/openpype/hosts/nuke/plugins/load/load_gizmo.py
@@ -5,8 +5,8 @@ from openpype.client import (
get_last_version_by_subset_id,
)
from openpype.pipeline import (
- legacy_io,
load,
+ get_current_project_name,
get_representation_path,
)
from openpype.hosts.nuke.api.lib import (
@@ -73,7 +73,7 @@ class LoadGizmo(load.LoaderPlugin):
data_imprint.update({k: version_data[k]})
# getting file path
- file = self.fname.replace("\\", "/")
+ file = self.filepath_from_context(context).replace("\\", "/")
# adding nodes to node graph
# just in case we are in group lets jump out of it
@@ -106,7 +106,7 @@ class LoadGizmo(load.LoaderPlugin):
# get main variables
# Get version from io
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
# get corresponding node
diff --git a/openpype/hosts/nuke/plugins/load/load_gizmo_ip.py b/openpype/hosts/nuke/plugins/load/load_gizmo_ip.py
index 2514a28299..a8e1218cbe 100644
--- a/openpype/hosts/nuke/plugins/load/load_gizmo_ip.py
+++ b/openpype/hosts/nuke/plugins/load/load_gizmo_ip.py
@@ -6,8 +6,8 @@ from openpype.client import (
get_last_version_by_subset_id,
)
from openpype.pipeline import (
- legacy_io,
load,
+ get_current_project_name,
get_representation_path,
)
from openpype.hosts.nuke.api.lib import (
@@ -75,7 +75,7 @@ class LoadGizmoInputProcess(load.LoaderPlugin):
data_imprint.update({k: version_data[k]})
# getting file path
- file = self.fname.replace("\\", "/")
+ file = self.filepath_from_context(context).replace("\\", "/")
# adding nodes to node graph
# just in case we are in group lets jump out of it
@@ -113,7 +113,7 @@ class LoadGizmoInputProcess(load.LoaderPlugin):
# get main variables
# Get version from io
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
# get corresponding node
diff --git a/openpype/hosts/nuke/plugins/load/load_image.py b/openpype/hosts/nuke/plugins/load/load_image.py
index 0a79ddada7..d8c0a82206 100644
--- a/openpype/hosts/nuke/plugins/load/load_image.py
+++ b/openpype/hosts/nuke/plugins/load/load_image.py
@@ -7,8 +7,8 @@ from openpype.client import (
get_last_version_by_subset_id,
)
from openpype.pipeline import (
- legacy_io,
load,
+ get_current_project_name,
get_representation_path,
)
from openpype.hosts.nuke.api.lib import (
@@ -86,7 +86,7 @@ class LoadImage(load.LoaderPlugin):
if namespace is None:
namespace = context['asset']['name']
- file = self.fname
+ file = self.filepath_from_context(context)
if not file:
repr_id = context["representation"]["_id"]
@@ -201,7 +201,7 @@ class LoadImage(load.LoaderPlugin):
format(frame_number, "0{}".format(padding)))
# Get start frame from version data
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
last_version_doc = get_last_version_by_subset_id(
project_name, version_doc["parent"], fields=["_id"]
diff --git a/openpype/hosts/nuke/plugins/load/load_matchmove.py b/openpype/hosts/nuke/plugins/load/load_matchmove.py
index a7d124d472..f942422c00 100644
--- a/openpype/hosts/nuke/plugins/load/load_matchmove.py
+++ b/openpype/hosts/nuke/plugins/load/load_matchmove.py
@@ -18,8 +18,9 @@ class MatchmoveLoader(load.LoaderPlugin):
color = "orange"
def load(self, context, name, namespace, data):
- if self.fname.lower().endswith(".py"):
- exec(open(self.fname).read())
+ path = self.filepath_from_context(context)
+ if path.lower().endswith(".py"):
+ exec(open(path).read())
else:
msg = "Unsupported script type"
diff --git a/openpype/hosts/nuke/plugins/load/load_model.py b/openpype/hosts/nuke/plugins/load/load_model.py
index 36781993ea..0bdcd93dff 100644
--- a/openpype/hosts/nuke/plugins/load/load_model.py
+++ b/openpype/hosts/nuke/plugins/load/load_model.py
@@ -5,8 +5,8 @@ from openpype.client import (
get_last_version_by_subset_id,
)
from openpype.pipeline import (
- legacy_io,
load,
+ get_current_project_name,
get_representation_path,
)
from openpype.hosts.nuke.api.lib import maintained_selection
@@ -55,7 +55,7 @@ class AlembicModelLoader(load.LoaderPlugin):
data_imprint.update({k: version_data[k]})
# getting file path
- file = self.fname.replace("\\", "/")
+ file = self.filepath_from_context(context).replace("\\", "/")
with maintained_selection():
model_node = nuke.createNode(
@@ -112,7 +112,7 @@ class AlembicModelLoader(load.LoaderPlugin):
None
"""
# Get version from io
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
object_name = container['objectName']
# get corresponding node
@@ -187,7 +187,7 @@ class AlembicModelLoader(load.LoaderPlugin):
def node_version_color(self, version, node):
""" Coloring a node by correct color by actual version"""
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
last_version_doc = get_last_version_by_subset_id(
project_name, version["parent"], fields=["_id"]
)
diff --git a/openpype/hosts/nuke/plugins/load/load_script_precomp.py b/openpype/hosts/nuke/plugins/load/load_script_precomp.py
index b74fdf481a..48d4a0900a 100644
--- a/openpype/hosts/nuke/plugins/load/load_script_precomp.py
+++ b/openpype/hosts/nuke/plugins/load/load_script_precomp.py
@@ -5,7 +5,7 @@ from openpype.client import (
get_last_version_by_subset_id,
)
from openpype.pipeline import (
- legacy_io,
+ get_current_project_name,
load,
get_representation_path,
)
@@ -43,8 +43,8 @@ class LinkAsGroup(load.LoaderPlugin):
if namespace is None:
namespace = context['asset']['name']
- file = self.fname.replace("\\", "/")
- self.log.info("file: {}\n".format(self.fname))
+ file = self.filepath_from_context(context).replace("\\", "/")
+ self.log.info("file: {}\n".format(file))
precomp_name = context["representation"]["context"]["subset"]
@@ -123,7 +123,7 @@ class LinkAsGroup(load.LoaderPlugin):
root = get_representation_path(representation).replace("\\", "/")
# Get start frame from version data
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version_doc = get_version_by_id(project_name, representation["parent"])
last_version_doc = get_last_version_by_subset_id(
project_name, version_doc["parent"], fields=["_id"]
diff --git a/openpype/hosts/nuke/plugins/publish/collect_reads.py b/openpype/hosts/nuke/plugins/publish/collect_reads.py
index 831ae29a27..38938a3dda 100644
--- a/openpype/hosts/nuke/plugins/publish/collect_reads.py
+++ b/openpype/hosts/nuke/plugins/publish/collect_reads.py
@@ -2,8 +2,6 @@ import os
import re
import nuke
import pyblish.api
-from openpype.client import get_asset_by_name
-from openpype.pipeline import legacy_io
class CollectNukeReads(pyblish.api.InstancePlugin):
@@ -15,16 +13,9 @@ class CollectNukeReads(pyblish.api.InstancePlugin):
families = ["source"]
def process(self, instance):
- node = instance.data["transientData"]["node"]
-
- project_name = legacy_io.active_project()
- asset_name = legacy_io.Session["AVALON_ASSET"]
- asset_doc = get_asset_by_name(project_name, asset_name)
-
- self.log.debug("asset_doc: {}".format(asset_doc["data"]))
-
self.log.debug("checking instance: {}".format(instance))
+ node = instance.data["transientData"]["node"]
if node.Class() != "Read":
return
diff --git a/openpype/hosts/photoshop/api/README.md b/openpype/hosts/photoshop/api/README.md
index 4a36746cb2..7bd2bcb1bf 100644
--- a/openpype/hosts/photoshop/api/README.md
+++ b/openpype/hosts/photoshop/api/README.md
@@ -210,8 +210,9 @@ class ImageLoader(load.LoaderPlugin):
representations = ["*"]
def load(self, context, name=None, namespace=None, data=None):
+ path = self.filepath_from_context(context)
with photoshop.maintained_selection():
- layer = stub.import_smart_object(self.fname)
+ layer = stub.import_smart_object(path)
self[:] = [layer]
diff --git a/openpype/hosts/photoshop/api/pipeline.py b/openpype/hosts/photoshop/api/pipeline.py
index 73dc80260c..56ae2a4c25 100644
--- a/openpype/hosts/photoshop/api/pipeline.py
+++ b/openpype/hosts/photoshop/api/pipeline.py
@@ -6,11 +6,8 @@ import pyblish.api
from openpype.lib import register_event_callback, Logger
from openpype.pipeline import (
- legacy_io,
register_loader_plugin_path,
register_creator_plugin_path,
- deregister_loader_plugin_path,
- deregister_creator_plugin_path,
AVALON_CONTAINER_ID,
)
@@ -23,6 +20,7 @@ from openpype.host import (
from openpype.pipeline.load import any_outdated_containers
from openpype.hosts.photoshop import PHOTOSHOP_HOST_DIR
+from openpype.tools.utils import get_openpype_qt_app
from . import lib
@@ -111,14 +109,6 @@ class PhotoshopHost(HostBase, IWorkfileHost, ILoadHost, IPublishHost):
item["id"] = "publish_context"
_get_stub().imprint(item["id"], item)
- def get_context_title(self):
- """Returns title for Creator window"""
-
- project_name = legacy_io.Session["AVALON_PROJECT"]
- asset_name = legacy_io.Session["AVALON_ASSET"]
- task_name = legacy_io.Session["AVALON_TASK"]
- return "{}/{}/{}".format(project_name, asset_name, task_name)
-
def list_instances(self):
"""List all created instances to publish from current workfile.
@@ -174,10 +164,7 @@ def check_inventory():
return
# Warn about outdated containers.
- _app = QtWidgets.QApplication.instance()
- if not _app:
- print("Starting new QApplication..")
- _app = QtWidgets.QApplication([])
+ _app = get_openpype_qt_app()
message_box = QtWidgets.QMessageBox()
message_box.setIcon(QtWidgets.QMessageBox.Warning)
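
The Photoshop `pipeline.py` hunk above swaps ad-hoc `QApplication` bootstrapping for the shared `get_openpype_qt_app` helper. The removed branch was roughly this sketch (the shared helper presumably layers OpenPype-specific setup on top):

```python
from qtpy import QtWidgets


def ensure_qt_app():
    # Reuse a running QApplication when one exists,
    # otherwise create a fresh one.
    app = QtWidgets.QApplication.instance()
    if app is None:
        print("Starting new QApplication..")
        app = QtWidgets.QApplication([])
    return app
```
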
diff --git a/openpype/hosts/photoshop/plugins/load/load_image.py b/openpype/hosts/photoshop/plugins/load/load_image.py
index 91a9787781..eb770bbd20 100644
--- a/openpype/hosts/photoshop/plugins/load/load_image.py
+++ b/openpype/hosts/photoshop/plugins/load/load_image.py
@@ -22,7 +22,8 @@ class ImageLoader(photoshop.PhotoshopLoader):
name
)
with photoshop.maintained_selection():
- layer = self.import_layer(self.fname, layer_name, stub)
+ path = self.filepath_from_context(context)
+ layer = self.import_layer(path, layer_name, stub)
self[:] = [layer]
namespace = namespace or layer_name
diff --git a/openpype/hosts/photoshop/plugins/load/load_image_from_sequence.py b/openpype/hosts/photoshop/plugins/load/load_image_from_sequence.py
index c25c5a8f2c..f9fceb80bb 100644
--- a/openpype/hosts/photoshop/plugins/load/load_image_from_sequence.py
+++ b/openpype/hosts/photoshop/plugins/load/load_image_from_sequence.py
@@ -29,11 +29,13 @@ class ImageFromSequenceLoader(photoshop.PhotoshopLoader):
options = []
def load(self, context, name=None, namespace=None, data=None):
+
+ path = self.filepath_from_context(context)
if data.get("frame"):
- self.fname = os.path.join(
- os.path.dirname(self.fname), data["frame"]
+ path = os.path.join(
+ os.path.dirname(path), data["frame"]
)
- if not os.path.exists(self.fname):
+ if not os.path.exists(path):
return
stub = self.get_stub()
@@ -42,7 +44,7 @@ class ImageFromSequenceLoader(photoshop.PhotoshopLoader):
)
with photoshop.maintained_selection():
- layer = stub.import_smart_object(self.fname, layer_name)
+ layer = stub.import_smart_object(path, layer_name)
self[:] = [layer]
namespace = namespace or layer_name
diff --git a/openpype/hosts/photoshop/plugins/load/load_reference.py b/openpype/hosts/photoshop/plugins/load/load_reference.py
index 1f32a5d23c..5772e243d5 100644
--- a/openpype/hosts/photoshop/plugins/load/load_reference.py
+++ b/openpype/hosts/photoshop/plugins/load/load_reference.py
@@ -23,7 +23,8 @@ class ReferenceLoader(photoshop.PhotoshopLoader):
stub.get_layers(), context["asset"]["name"], name
)
with photoshop.maintained_selection():
- layer = self.import_layer(self.fname, layer_name, stub)
+ path = self.filepath_from_context(context)
+ layer = self.import_layer(path, layer_name, stub)
self[:] = [layer]
namespace = namespace or layer_name
diff --git a/openpype/hosts/photoshop/plugins/publish/validate_instance_asset.py b/openpype/hosts/photoshop/plugins/publish/validate_instance_asset.py
index b9d721dbdb..1a4932fe99 100644
--- a/openpype/hosts/photoshop/plugins/publish/validate_instance_asset.py
+++ b/openpype/hosts/photoshop/plugins/publish/validate_instance_asset.py
@@ -1,6 +1,6 @@
import pyblish.api
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_asset_name
from openpype.pipeline.publish import (
ValidateContentsOrder,
PublishXmlValidationError,
@@ -28,10 +28,10 @@ class ValidateInstanceAssetRepair(pyblish.api.Action):
# Apply pyblish.logic to get the instances for the plug-in
instances = pyblish.api.instances_by_plugin(failed, plugin)
stub = photoshop.stub()
+ current_asset_name = get_current_asset_name()
for instance in instances:
data = stub.read(instance[0])
-
- data["asset"] = legacy_io.Session["AVALON_ASSET"]
+ data["asset"] = current_asset_name
stub.imprint(instance[0], data)
@@ -55,7 +55,7 @@ class ValidateInstanceAsset(OptionalPyblishPluginMixin,
def process(self, instance):
instance_asset = instance.data["asset"]
- current_asset = legacy_io.Session["AVALON_ASSET"]
+ current_asset = get_current_asset_name()
if instance_asset != current_asset:
msg = (
diff --git a/openpype/hosts/resolve/api/plugin.py b/openpype/hosts/resolve/api/plugin.py
index e5846c2fc2..59c27f29da 100644
--- a/openpype/hosts/resolve/api/plugin.py
+++ b/openpype/hosts/resolve/api/plugin.py
@@ -291,7 +291,7 @@ class ClipLoader:
active_bin = None
data = dict()
- def __init__(self, cls, context, **options):
+ def __init__(self, cls, context, path, **options):
""" Initialize object
Arguments:
@@ -304,6 +304,7 @@ class ClipLoader:
self.__dict__.update(cls.__dict__)
self.context = context
self.active_project = lib.get_current_project()
+ self.fname = path
# try to get value from options or evaluate key value for `handles`
self.with_handles = options.get("handles") or bool(
diff --git a/openpype/hosts/resolve/plugins/load/load_clip.py b/openpype/hosts/resolve/plugins/load/load_clip.py
index 05bfb003d6..3a59ecea80 100644
--- a/openpype/hosts/resolve/plugins/load/load_clip.py
+++ b/openpype/hosts/resolve/plugins/load/load_clip.py
@@ -7,7 +7,7 @@ from openpype.client import (
# from openpype.hosts import resolve
from openpype.pipeline import (
get_representation_path,
- legacy_io,
+ get_current_project_name,
)
from openpype.hosts.resolve.api import lib, plugin
from openpype.hosts.resolve.api.pipeline import (
@@ -55,8 +55,9 @@ class LoadClip(plugin.TimelineItemLoader):
})
# load clip to timeline and get main variables
+ path = self.filepath_from_context(context)
timeline_item = plugin.ClipLoader(
- self, context, **options).load()
+ self, context, path, **options).load()
namespace = namespace or timeline_item.GetName()
version = context['version']
version_data = version.get("data", {})
@@ -109,16 +110,16 @@ class LoadClip(plugin.TimelineItemLoader):
namespace = container['namespace']
timeline_item_data = lib.get_pype_timeline_item_by_name(namespace)
timeline_item = timeline_item_data["clip"]["item"]
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
version = get_version_by_id(project_name, representation["parent"])
version_data = version.get("data", {})
version_name = version.get("name", None)
colorspace = version_data.get("colorspace", None)
object_name = "{}_{}".format(name, namespace)
- self.fname = get_representation_path(representation)
+ path = get_representation_path(representation)
context["version"] = {"data": version_data}
- loader = plugin.ClipLoader(self, context)
+ loader = plugin.ClipLoader(self, context, path)
timeline_item = loader.update(timeline_item)
# add additional metadata from the version to imprint Avalon knob
@@ -152,7 +153,7 @@ class LoadClip(plugin.TimelineItemLoader):
# define version name
version_name = version.get("name", None)
# get all versions in list
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
last_version_doc = get_last_version_by_subset_id(
project_name,
version["parent"],
diff --git a/openpype/hosts/resolve/plugins/publish/precollect_workfile.py b/openpype/hosts/resolve/plugins/publish/precollect_workfile.py
index 0f94216556..a2f3eaed7a 100644
--- a/openpype/hosts/resolve/plugins/publish/precollect_workfile.py
+++ b/openpype/hosts/resolve/plugins/publish/precollect_workfile.py
@@ -1,8 +1,8 @@
import pyblish.api
from pprint import pformat
+from openpype.pipeline import get_current_asset_name
from openpype.hosts.resolve import api as rapi
-from openpype.pipeline import legacy_io
from openpype.hosts.resolve.otio import davinci_export
@@ -14,7 +14,7 @@ class PrecollectWorkfile(pyblish.api.ContextPlugin):
def process(self, context):
- asset = legacy_io.Session["AVALON_ASSET"]
+ asset = get_current_asset_name()
subset = "workfile"
project = rapi.get_current_project()
fps = project.GetSetting("timelineFrameRate")
diff --git a/openpype/hosts/standalonepublisher/plugins/publish/validate_texture_workfiles.py b/openpype/hosts/standalonepublisher/plugins/publish/validate_texture_workfiles.py
index a7ae02a2eb..fd2d4a9f36 100644
--- a/openpype/hosts/standalonepublisher/plugins/publish/validate_texture_workfiles.py
+++ b/openpype/hosts/standalonepublisher/plugins/publish/validate_texture_workfiles.py
@@ -1,7 +1,5 @@
-import os
import pyblish.api
-from openpype.settings import get_project_settings
from openpype.pipeline.publish import (
ValidateContentsOrder,
PublishXmlValidationError,
@@ -21,27 +19,30 @@ class ValidateTextureBatchWorkfiles(pyblish.api.InstancePlugin):
optional = True
def process(self, instance):
- if instance.data["family"] == "workfile":
- ext = instance.data["representations"][0]["ext"]
- main_workfile_extensions = self.get_main_workfile_extensions()
- if ext not in main_workfile_extensions:
- self.log.warning("Only secondary workfile present!")
- return
+ if instance.data["family"] != "workfile":
+ return
- if not instance.data.get("resources"):
- msg = "No secondary workfile present for workfile '{}'". \
- format(instance.data["name"])
- ext = main_workfile_extensions[0]
- formatting_data = {"file_name": instance.data["name"],
- "extension": ext}
+ ext = instance.data["representations"][0]["ext"]
+ main_workfile_extensions = self.get_main_workfile_extensions(
+ instance
+ )
+ if ext not in main_workfile_extensions:
+ self.log.warning("Only secondary workfile present!")
+ return
- raise PublishXmlValidationError(self, msg,
- formatting_data=formatting_data
- )
+ if not instance.data.get("resources"):
+ msg = "No secondary workfile present for workfile '{}'". \
+ format(instance.data["name"])
+ ext = main_workfile_extensions[0]
+ formatting_data = {"file_name": instance.data["name"],
+ "extension": ext}
+
+ raise PublishXmlValidationError(
+ self, msg, formatting_data=formatting_data)
@staticmethod
- def get_main_workfile_extensions():
- project_settings = get_project_settings(os.environ["AVALON_PROJECT"])
+ def get_main_workfile_extensions(instance):
+ project_settings = instance.context.data["project_settings"]
try:
extensions = (project_settings["standalonepublisher"]
diff --git a/openpype/hosts/substancepainter/plugins/load/load_mesh.py b/openpype/hosts/substancepainter/plugins/load/load_mesh.py
index 822095641d..57db869a11 100644
--- a/openpype/hosts/substancepainter/plugins/load/load_mesh.py
+++ b/openpype/hosts/substancepainter/plugins/load/load_mesh.py
@@ -47,7 +47,8 @@ class SubstanceLoadProjectMesh(load.LoaderPlugin):
if not substance_painter.project.is_open():
# Allow to 'initialize' a new project
- result = prompt_new_file_with_mesh(mesh_filepath=self.fname)
+ path = self.filepath_from_context(context)
+ result = prompt_new_file_with_mesh(mesh_filepath=path)
if not result:
self.log.info("User cancelled new project prompt.")
return
@@ -65,7 +66,7 @@ class SubstanceLoadProjectMesh(load.LoaderPlugin):
else:
raise LoadError("Reload of mesh failed")
- path = self.fname
+ path = self.filepath_from_context(context)
substance_painter.project.reload_mesh(path,
settings,
on_mesh_reload)
diff --git a/openpype/hosts/substancepainter/plugins/publish/collect_textureset_images.py b/openpype/hosts/substancepainter/plugins/publish/collect_textureset_images.py
index d11abd1019..316f72509e 100644
--- a/openpype/hosts/substancepainter/plugins/publish/collect_textureset_images.py
+++ b/openpype/hosts/substancepainter/plugins/publish/collect_textureset_images.py
@@ -114,7 +114,7 @@ class CollectTextureSet(pyblish.api.InstancePlugin):
# Clone the instance
image_instance = context.create_instance(image_subset)
image_instance[:] = instance[:]
- image_instance.data.update(copy.deepcopy(instance.data))
+ image_instance.data.update(copy.deepcopy(dict(instance.data)))
image_instance.data["name"] = image_subset
image_instance.data["label"] = image_subset
image_instance.data["subset"] = image_subset
diff --git a/openpype/hosts/traypublisher/plugins/publish/collect_frame_range_asset_entity.py b/openpype/hosts/traypublisher/plugins/publish/collect_frame_range_asset_entity.py
new file mode 100644
index 0000000000..c18e10e438
--- /dev/null
+++ b/openpype/hosts/traypublisher/plugins/publish/collect_frame_range_asset_entity.py
@@ -0,0 +1,42 @@
+import pyblish.api
+from openpype.pipeline import OptionalPyblishPluginMixin
+
+
+class CollectFrameDataFromAssetEntity(pyblish.api.InstancePlugin,
+ OptionalPyblishPluginMixin):
+ """Collect Frame Range data From Asset Entity
+
+ Frame range data will only be collected if the keys
+ are not yet collected for the instance.
+ """
+
+ order = pyblish.api.CollectorOrder + 0.491
+ label = "Collect Frame Data From Asset Entity"
+ families = ["plate", "pointcache",
+ "vdbcache", "online",
+ "render"]
+ hosts = ["traypublisher"]
+ optional = True
+
+ def process(self, instance):
+ if not self.is_active(instance.data):
+ return
+ missing_keys = []
+ for key in (
+ "fps",
+ "frameStart",
+ "frameEnd",
+ "handleStart",
+ "handleEnd"
+ ):
+ if key not in instance.data:
+ missing_keys.append(key)
+ keys_set = []
+ for key in missing_keys:
+ asset_data = instance.data["assetEntity"]["data"]
+ if key in asset_data:
+ instance.data[key] = asset_data[key]
+ keys_set.append(key)
+ if keys_set:
+ self.log.debug(f"Frame range data {keys_set} "
+ "has been collected from asset entity.")
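
A worked example of the new collector above (values are illustrative):

```python
# Instance data before the plugin runs; "frameStart" was already
# collected, the remaining frame keys were not.
instance_data = {
    "frameStart": 1001,
    "assetEntity": {"data": {"fps": 25.0, "frameEnd": 1100}},
}

# After processing:
#   fps      -> 25.0  (copied from the asset entity)
#   frameEnd -> 1100  (copied from the asset entity)
#   frameStart stays 1001; keys absent from the asset data stay unset.
```
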
diff --git a/openpype/hosts/tvpaint/plugins/load/load_image.py b/openpype/hosts/tvpaint/plugins/load/load_image.py
index 5283d04355..a400738019 100644
--- a/openpype/hosts/tvpaint/plugins/load/load_image.py
+++ b/openpype/hosts/tvpaint/plugins/load/load_image.py
@@ -77,8 +77,9 @@ class ImportImage(plugin.Loader):
)
# Fill import script with filename and layer name
# - filename must not contain backslashes
+ path = self.filepath_from_context(context).replace("\\", "/")
george_script = self.import_script.format(
- self.fname.replace("\\", "/"),
+ path,
layer_name,
load_options_str
)
diff --git a/openpype/hosts/tvpaint/plugins/load/load_reference_image.py b/openpype/hosts/tvpaint/plugins/load/load_reference_image.py
index 7f7a68cc41..edc116a8e4 100644
--- a/openpype/hosts/tvpaint/plugins/load/load_reference_image.py
+++ b/openpype/hosts/tvpaint/plugins/load/load_reference_image.py
@@ -86,10 +86,12 @@ class LoadImage(plugin.Loader):
subset_name = context["subset"]["name"]
layer_name = self.get_unique_layer_name(asset_name, subset_name)
+ path = self.filepath_from_context(context)
+
# Fill import script with filename and layer name
# - filename must not contain backslashes
george_script = self.import_script.format(
- self.fname.replace("\\", "/"),
+ path.replace("\\", "/"),
layer_name,
load_options_str
)
@@ -271,9 +273,6 @@ class LoadImage(plugin.Loader):
# Remove old layers
self._remove_layers(layer_ids=layer_ids_to_remove)
- # Change `fname` to new representation
- self.fname = self.filepath_from_context(context)
-
name = container["name"]
namespace = container["namespace"]
new_container = self.load(context, name, namespace, {})
diff --git a/openpype/hosts/tvpaint/plugins/load/load_sound.py b/openpype/hosts/tvpaint/plugins/load/load_sound.py
index f312db262a..3003280eef 100644
--- a/openpype/hosts/tvpaint/plugins/load/load_sound.py
+++ b/openpype/hosts/tvpaint/plugins/load/load_sound.py
@@ -60,9 +60,10 @@ class ImportSound(plugin.Loader):
output_filepath = output_file.name.replace("\\", "/")
# Prepare george script
+ path = self.filepath_from_context(context).replace("\\", "/")
import_script = "\n".join(self.import_script_lines)
george_script = import_script.format(
- self.fname.replace("\\", "/"),
+ path,
output_filepath
)
self.log.info("*** George script:\n{}\n***".format(george_script))
diff --git a/openpype/hosts/tvpaint/plugins/load/load_workfile.py b/openpype/hosts/tvpaint/plugins/load/load_workfile.py
index fc7588f56e..2155a1bbd5 100644
--- a/openpype/hosts/tvpaint/plugins/load/load_workfile.py
+++ b/openpype/hosts/tvpaint/plugins/load/load_workfile.py
@@ -3,7 +3,7 @@ import os
from openpype.lib import StringTemplate
from openpype.pipeline import (
registered_host,
- legacy_io,
+ get_current_context,
Anatomy,
)
from openpype.pipeline.workfile import (
@@ -31,18 +31,18 @@ class LoadWorkfile(plugin.Loader):
def load(self, context, name, namespace, options):
# Load context of current workfile as first thing
# - which context and extension has
- host = registered_host()
- current_file = host.get_current_workfile()
-
- context = get_current_workfile_context()
-
- filepath = self.fname.replace("\\", "/")
+ filepath = self.filepath_from_context(context)
+ filepath = filepath.replace("\\", "/")
if not os.path.exists(filepath):
raise FileExistsError(
"The loaded file does not exist. Try downloading it first."
)
+ host = registered_host()
+ current_file = host.get_current_workfile()
+ work_context = get_current_workfile_context()
+
george_script = "tv_LoadProject '\"'\"{}\"'\"'".format(
filepath
)
@@ -50,14 +50,15 @@ class LoadWorkfile(plugin.Loader):
# Save workfile.
host_name = "tvpaint"
- project_name = context.get("project")
- asset_name = context.get("asset")
- task_name = context.get("task")
- # Far cases when there is workfile without context
+ project_name = work_context.get("project")
+ asset_name = work_context.get("asset")
+ task_name = work_context.get("task")
+        # For cases when there is a workfile without work_context
if not asset_name:
- project_name = legacy_io.active_project()
- asset_name = legacy_io.Session["AVALON_ASSET"]
- task_name = legacy_io.Session["AVALON_TASK"]
+ context = get_current_context()
+ project_name = context["project_name"]
+ asset_name = context["asset_name"]
+ task_name = context["task_name"]
template_key = get_workfile_template_key_from_context(
asset_name,
diff --git a/openpype/hosts/unreal/hooks/pre_workfile_preparation.py b/openpype/hosts/unreal/hooks/pre_workfile_preparation.py
index f01609d314..760d55077a 100644
--- a/openpype/hosts/unreal/hooks/pre_workfile_preparation.py
+++ b/openpype/hosts/unreal/hooks/pre_workfile_preparation.py
@@ -111,6 +111,7 @@ class UnrealPrelaunchHook(PreLaunchHook):
ue_project_worker = UEProjectGenerationWorker()
ue_project_worker.setup(
engine_version,
+ self.data["project_name"],
unreal_project_name,
engine_path,
project_dir
diff --git a/openpype/hosts/unreal/lib.py b/openpype/hosts/unreal/lib.py
index 97771472cf..67e7891344 100644
--- a/openpype/hosts/unreal/lib.py
+++ b/openpype/hosts/unreal/lib.py
@@ -1,17 +1,16 @@
# -*- coding: utf-8 -*-
"""Unreal launching and project tools."""
+import json
import os
import platform
-import json
-
+import re
+import subprocess
+from collections import OrderedDict
+from distutils import dir_util
+from pathlib import Path
from typing import List
-from distutils import dir_util
-import subprocess
-import re
-from pathlib import Path
-from collections import OrderedDict
from openpype.settings import get_project_settings
@@ -179,6 +178,7 @@ def _parse_launcher_locations(install_json_path: str) -> dict:
def create_unreal_project(project_name: str,
+ unreal_project_name: str,
ue_version: str,
pr_dir: Path,
engine_path: Path,
@@ -193,7 +193,8 @@ def create_unreal_project(project_name: str,
folder and enable this plugin.
Args:
- project_name (str): Name of the project.
+ project_name (str): Name of the project in AYON.
+ unreal_project_name (str): Name of the project in Unreal.
ue_version (str): Unreal engine version (like 4.23).
pr_dir (Path): Path to directory where project will be created.
engine_path (Path): Path to Unreal Engine installation.
@@ -211,8 +212,12 @@ def create_unreal_project(project_name: str,
Returns:
None
+ Deprecated:
+ since 3.16.0
+
"""
env = env or os.environ
+
preset = get_project_settings(project_name)["unreal"]["project_setup"]
ue_id = ".".join(ue_version.split(".")[:2])
# get unreal engine identifier
@@ -230,7 +235,7 @@ def create_unreal_project(project_name: str,
ue_editor_exe: Path = get_editor_exe_path(engine_path, ue_version)
cmdlet_project: Path = get_path_to_cmdlet_project(ue_version)
- project_file = pr_dir / f"{project_name}.uproject"
+ project_file = pr_dir / f"{unreal_project_name}.uproject"
print("--- Generating a new project ...")
commandlet_cmd = [f'{ue_editor_exe.as_posix()}',
@@ -251,8 +256,9 @@ def create_unreal_project(project_name: str,
return_code = gen_process.wait()
if return_code and return_code != 0:
- raise RuntimeError(f'Failed to generate \'{project_name}\' project! '
- f'Exited with return code {return_code}')
+ raise RuntimeError(
+ (f"Failed to generate '{unreal_project_name}' project! "
+ f"Exited with return code {return_code}"))
print("--- Project has been generated successfully.")
@@ -282,7 +288,7 @@ def create_unreal_project(project_name: str,
subprocess.run(command1)
command2 = [u_build_tool.as_posix(),
- f"-ModuleWithSuffix={project_name},3555", arch,
+ f"-ModuleWithSuffix={unreal_project_name},3555", arch,
"Development", "-TargetType=Editor",
f'-Project={project_file}',
f'{project_file}',
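
The `create_unreal_project` signature change above separates the AYON project name (used for the settings lookup) from the Unreal project name (used for the `.uproject` file and build module). A call-site sketch with illustrative values (note the hunk also marks the function as deprecated since 3.16.0):

```python
from pathlib import Path

from openpype.hosts.unreal.lib import create_unreal_project

create_unreal_project(
    project_name="MyAyonProject",           # settings read for this name
    unreal_project_name="MyUnrealProject",  # becomes MyUnrealProject.uproject
    ue_version="5.1",
    pr_dir=Path("C:/projects/MyUnrealProject"),
    engine_path=Path("C:/Program Files/Epic Games/UE_5.1"),
)
```
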
diff --git a/openpype/hosts/unreal/plugins/load/load_alembic_animation.py b/openpype/hosts/unreal/plugins/load/load_alembic_animation.py
index 52eea4122a..cb60197a4c 100644
--- a/openpype/hosts/unreal/plugins/load/load_alembic_animation.py
+++ b/openpype/hosts/unreal/plugins/load/load_alembic_animation.py
@@ -87,7 +87,8 @@ class AnimationAlembicLoader(plugin.Loader):
if not unreal.EditorAssetLibrary.does_directory_exist(asset_dir):
unreal.EditorAssetLibrary.make_directory(asset_dir)
- task = self.get_task(self.fname, asset_dir, asset_name, False)
+ path = self.filepath_from_context(context)
+ task = self.get_task(path, asset_dir, asset_name, False)
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
asset_tools.import_asset_tasks([task])
diff --git a/openpype/hosts/unreal/plugins/load/load_animation.py b/openpype/hosts/unreal/plugins/load/load_animation.py
index a5ecb677e8..7ed85ee411 100644
--- a/openpype/hosts/unreal/plugins/load/load_animation.py
+++ b/openpype/hosts/unreal/plugins/load/load_animation.py
@@ -26,7 +26,7 @@ class AnimationFBXLoader(plugin.Loader):
icon = "cube"
color = "orange"
- def _process(self, asset_dir, asset_name, instance_name):
+ def _process(self, path, asset_dir, asset_name, instance_name):
automated = False
actor = None
@@ -55,7 +55,7 @@ class AnimationFBXLoader(plugin.Loader):
asset_doc = get_current_project_asset(fields=["data.fps"])
- task.set_editor_property('filename', self.fname)
+ task.set_editor_property('filename', path)
task.set_editor_property('destination_path', asset_dir)
task.set_editor_property('destination_name', asset_name)
task.set_editor_property('replace_existing', False)
@@ -177,14 +177,15 @@ class AnimationFBXLoader(plugin.Loader):
EditorAssetLibrary.make_directory(asset_dir)
- libpath = self.fname.replace("fbx", "json")
+ path = self.filepath_from_context(context)
+ libpath = path.replace(".fbx", ".json")
with open(libpath, "r") as fp:
data = json.load(fp)
instance_name = data.get("instance_name")
- animation = self._process(asset_dir, asset_name, instance_name)
+ animation = self._process(path, asset_dir, asset_name, instance_name)
asset_content = EditorAssetLibrary.list_assets(
hierarchy_dir, recursive=True, include_folder=False)
diff --git a/openpype/hosts/unreal/plugins/load/load_camera.py b/openpype/hosts/unreal/plugins/load/load_camera.py
index 59ea14697d..d663ce20ea 100644
--- a/openpype/hosts/unreal/plugins/load/load_camera.py
+++ b/openpype/hosts/unreal/plugins/load/load_camera.py
@@ -12,7 +12,7 @@ from unreal import (
from openpype.client import get_asset_by_name
from openpype.pipeline import (
AYON_CONTAINER_ID,
- legacy_io,
+ get_current_project_name,
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api.pipeline import (
@@ -184,7 +184,7 @@ class CameraLoader(plugin.Loader):
frame_ranges[i + 1][0], frame_ranges[i + 1][1],
[level])
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
data = get_asset_by_name(project_name, asset)["data"]
cam_seq.set_display_rate(
unreal.FrameRate(data.get("fps"), 1.0))
@@ -200,12 +200,13 @@ class CameraLoader(plugin.Loader):
settings.set_editor_property('reduce_keys', False)
if cam_seq:
+ path = self.filepath_from_context(context)
self._import_camera(
EditorLevelLibrary.get_editor_world(),
cam_seq,
cam_seq.get_bindings(),
settings,
- self.fname
+ path
)
# Set range of all sections
@@ -389,7 +390,7 @@ class CameraLoader(plugin.Loader):
# Set range of all sections
# Changing the range of the section is not enough. We need to change
# the frame of all the keys in the section.
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
asset = container.get('asset')
data = get_asset_by_name(project_name, asset)["data"]
diff --git a/openpype/hosts/unreal/plugins/load/load_geometrycache_abc.py b/openpype/hosts/unreal/plugins/load/load_geometrycache_abc.py
index 3a292fdbd1..13ba236a7d 100644
--- a/openpype/hosts/unreal/plugins/load/load_geometrycache_abc.py
+++ b/openpype/hosts/unreal/plugins/load/load_geometrycache_abc.py
@@ -111,8 +111,9 @@ class PointCacheAlembicLoader(plugin.Loader):
if frame_start == frame_end:
frame_end += 1
+ path = self.filepath_from_context(context)
task = self.get_task(
- self.fname, asset_dir, asset_name, False, frame_start, frame_end)
+ path, asset_dir, asset_name, False, frame_start, frame_end)
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501
diff --git a/openpype/hosts/unreal/plugins/load/load_layout.py b/openpype/hosts/unreal/plugins/load/load_layout.py
index 86b2e1456c..3b82da5068 100644
--- a/openpype/hosts/unreal/plugins/load/load_layout.py
+++ b/openpype/hosts/unreal/plugins/load/load_layout.py
@@ -23,7 +23,7 @@ from openpype.pipeline import (
load_container,
get_representation_path,
AYON_CONTAINER_ID,
- legacy_io,
+ get_current_project_name,
)
from openpype.pipeline.context_tools import get_current_project_asset
from openpype.settings import get_current_project_settings
@@ -302,7 +302,7 @@ class LayoutLoader(plugin.Loader):
if not version_ids:
return output
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
repre_docs = get_representations(
project_name,
representation_names=["fbx", "abc"],
@@ -603,7 +603,7 @@ class LayoutLoader(plugin.Loader):
frame_ranges[i + 1][0], frame_ranges[i + 1][1],
[level])
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
data = get_asset_by_name(project_name, asset)["data"]
shot.set_display_rate(
unreal.FrameRate(data.get("fps"), 1.0))
@@ -618,7 +618,8 @@ class LayoutLoader(plugin.Loader):
EditorLevelLibrary.load_level(level)
- loaded_assets = self._process(self.fname, asset_dir, shot)
+ path = self.filepath_from_context(context)
+ loaded_assets = self._process(path, asset_dir, shot)
for s in sequences:
EditorAssetLibrary.save_asset(s.get_path_name())
diff --git a/openpype/hosts/unreal/plugins/load/load_layout_existing.py b/openpype/hosts/unreal/plugins/load/load_layout_existing.py
index 929a9a1399..c53e92930a 100644
--- a/openpype/hosts/unreal/plugins/load/load_layout_existing.py
+++ b/openpype/hosts/unreal/plugins/load/load_layout_existing.py
@@ -11,7 +11,7 @@ from openpype.pipeline import (
load_container,
get_representation_path,
AYON_CONTAINER_ID,
- legacy_io,
+ get_current_project_name,
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as upipeline
@@ -380,7 +380,8 @@ class ExistingLayoutLoader(plugin.Loader):
raise AssertionError("Current level not saved")
project_name = context["project"]["name"]
- containers = self._process(self.fname, project_name)
+ path = self.filepath_from_context(context)
+ containers = self._process(path, project_name)
curr_level_path = Path(
curr_level.get_outer().get_path_name()).parent.as_posix()
@@ -410,7 +411,7 @@ class ExistingLayoutLoader(plugin.Loader):
asset_dir = container.get('namespace')
source_path = get_representation_path(representation)
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
containers = self._process(source_path, project_name)
data = {
diff --git a/openpype/hosts/unreal/plugins/load/load_skeletalmesh_abc.py b/openpype/hosts/unreal/plugins/load/load_skeletalmesh_abc.py
index 7591d5582f..0b0030ff77 100644
--- a/openpype/hosts/unreal/plugins/load/load_skeletalmesh_abc.py
+++ b/openpype/hosts/unreal/plugins/load/load_skeletalmesh_abc.py
@@ -89,7 +89,8 @@ class SkeletalMeshAlembicLoader(plugin.Loader):
if not unreal.EditorAssetLibrary.does_directory_exist(asset_dir):
unreal.EditorAssetLibrary.make_directory(asset_dir)
- task = self.get_task(self.fname, asset_dir, asset_name, False)
+ path = self.filepath_from_context(context)
+ task = self.get_task(path, asset_dir, asset_name, False)
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501
diff --git a/openpype/hosts/unreal/plugins/load/load_skeletalmesh_fbx.py b/openpype/hosts/unreal/plugins/load/load_skeletalmesh_fbx.py
index e9676cde3a..09cd37b9db 100644
--- a/openpype/hosts/unreal/plugins/load/load_skeletalmesh_fbx.py
+++ b/openpype/hosts/unreal/plugins/load/load_skeletalmesh_fbx.py
@@ -23,7 +23,7 @@ class SkeletalMeshFBXLoader(plugin.Loader):
def load(self, context, name, namespace, options):
"""Load and containerise representation into Content Browser.
- This is two step process. First, import FBX to temporary path and
+ This is a two-step process. First, import FBX to a temporary path and
then call `containerise()` on it - this moves all content to new
directory and then it will create AssetContainer there and imprint it
with metadata. This will mark this path as container.
@@ -65,7 +65,8 @@ class SkeletalMeshFBXLoader(plugin.Loader):
task = unreal.AssetImportTask()
- task.set_editor_property('filename', self.fname)
+ path = self.filepath_from_context(context)
+ task.set_editor_property('filename', path)
task.set_editor_property('destination_path', asset_dir)
task.set_editor_property('destination_name', asset_name)
task.set_editor_property('replace_existing', False)
diff --git a/openpype/hosts/unreal/plugins/load/load_staticmesh_abc.py b/openpype/hosts/unreal/plugins/load/load_staticmesh_abc.py
index befc7b0ac9..98e6d962b1 100644
--- a/openpype/hosts/unreal/plugins/load/load_staticmesh_abc.py
+++ b/openpype/hosts/unreal/plugins/load/load_staticmesh_abc.py
@@ -98,8 +98,9 @@ class StaticMeshAlembicLoader(plugin.Loader):
if not unreal.EditorAssetLibrary.does_directory_exist(asset_dir):
unreal.EditorAssetLibrary.make_directory(asset_dir)
+ path = self.filepath_from_context(context)
task = self.get_task(
- self.fname, asset_dir, asset_name, False, default_conversion)
+ path, asset_dir, asset_name, False, default_conversion)
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501
diff --git a/openpype/hosts/unreal/plugins/load/load_staticmesh_fbx.py b/openpype/hosts/unreal/plugins/load/load_staticmesh_fbx.py
index e416256486..fa26e252f5 100644
--- a/openpype/hosts/unreal/plugins/load/load_staticmesh_fbx.py
+++ b/openpype/hosts/unreal/plugins/load/load_staticmesh_fbx.py
@@ -88,7 +88,8 @@ class StaticMeshFBXLoader(plugin.Loader):
unreal.EditorAssetLibrary.make_directory(asset_dir)
- task = self.get_task(self.fname, asset_dir, asset_name, False)
+ path = self.filepath_from_context(context)
+ task = self.get_task(path, asset_dir, asset_name, False)
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501
diff --git a/openpype/hosts/unreal/plugins/load/load_uasset.py b/openpype/hosts/unreal/plugins/load/load_uasset.py
index 30f63abe39..88aaac41e8 100644
--- a/openpype/hosts/unreal/plugins/load/load_uasset.py
+++ b/openpype/hosts/unreal/plugins/load/load_uasset.py
@@ -64,8 +64,9 @@ class UAssetLoader(plugin.Loader):
destination_path = asset_dir.replace(
"/Game", Path(unreal.Paths.project_content_dir()).as_posix(), 1)
+ path = self.filepath_from_context(context)
shutil.copy(
- self.fname,
+ path,
f"{destination_path}/{name}_{unique_number:02}.{self.extension}")
# Create Asset Container
diff --git a/openpype/hosts/unreal/plugins/publish/extract_layout.py b/openpype/hosts/unreal/plugins/publish/extract_layout.py
index 57e7957575..d30d04551d 100644
--- a/openpype/hosts/unreal/plugins/publish/extract_layout.py
+++ b/openpype/hosts/unreal/plugins/publish/extract_layout.py
@@ -8,7 +8,7 @@ from unreal import EditorLevelLibrary as ell
from unreal import EditorAssetLibrary as eal
from openpype.client import get_representation_by_name
-from openpype.pipeline import legacy_io, publish
+from openpype.pipeline import publish
class ExtractLayout(publish.Extractor):
@@ -32,7 +32,7 @@ class ExtractLayout(publish.Extractor):
"Wrong level loaded"
json_data = []
- project_name = legacy_io.active_project()
+ project_name = instance.context.data["projectName"]
for member in instance[:]:
actor = ell.get_actor_reference(member)
diff --git a/openpype/hosts/unreal/ue_workers.py b/openpype/hosts/unreal/ue_workers.py
index 2b7e1375e6..3a0f976957 100644
--- a/openpype/hosts/unreal/ue_workers.py
+++ b/openpype/hosts/unreal/ue_workers.py
@@ -3,16 +3,17 @@ import os
import platform
import re
import subprocess
+import tempfile
from distutils import dir_util
+from distutils.dir_util import copy_tree
from pathlib import Path
from typing import List, Union
-import tempfile
-from distutils.dir_util import copy_tree
-
-import openpype.hosts.unreal.lib as ue_lib
from qtpy import QtCore
+import openpype.hosts.unreal.lib as ue_lib
+from openpype.settings import get_project_settings
+
def parse_comp_progress(line: str, progress_signal: QtCore.Signal(int)):
match = re.search(r"\[[1-9]+/[0-9]+]", line)
@@ -54,24 +55,36 @@ class UEProjectGenerationWorker(QtCore.QObject):
dev_mode = False
def setup(self, ue_version: str,
- project_name,
+ project_name: str,
+ unreal_project_name: str,
engine_path: Path,
project_dir: Path,
dev_mode: bool = False,
env: dict = None):
+ """Set the worker with necessary parameters.
+
+ Args:
+ ue_version (str): Unreal Engine version.
+ project_name (str): Name of the project in AYON.
+ unreal_project_name (str): Name of the project in Unreal.
+ engine_path (Path): Path to the Unreal Engine.
+ project_dir (Path): Path to the project directory.
+ dev_mode (bool, optional): Whether to run the project in dev mode.
+ Defaults to False.
+ env (dict, optional): Environment variables. Defaults to None.
+
+ """
self.ue_version = ue_version
self.project_dir = project_dir
self.env = env or os.environ
- preset = ue_lib.get_project_settings(
- project_name
- )["unreal"]["project_setup"]
+ preset = get_project_settings(project_name)["unreal"]["project_setup"]
if dev_mode or preset["dev_mode"]:
self.dev_mode = True
- self.project_name = project_name
+ self.project_name = unreal_project_name
self.engine_path = engine_path
def run(self):
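For reference, a hedged usage sketch of the extended `setup()` signature described by the new docstring (engine and project paths are illustrative):

```python
from pathlib import Path

from openpype.hosts.unreal.ue_workers import UEProjectGenerationWorker

worker = UEProjectGenerationWorker()
worker.setup(
    ue_version="5.1",                       # Unreal Engine version
    project_name="my_ayon_project",         # project name in AYON (settings lookup)
    unreal_project_name="MyUnrealProject",  # project name in Unreal
    engine_path=Path("C:/Epic/UE_5.1"),
    project_dir=Path("C:/projects/MyUnrealProject"),
)
```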
diff --git a/openpype/lib/applications.py b/openpype/lib/applications.py
index 8adae34827..f47e11926c 100644
--- a/openpype/lib/applications.py
+++ b/openpype/lib/applications.py
@@ -1371,14 +1371,9 @@ def get_app_environments_for_context(
"""
from openpype.modules import ModulesManager
- from openpype.pipeline import AvalonMongoDB, Anatomy
+ from openpype.pipeline import Anatomy
from openpype.lib.openpype_version import is_running_staging
- # Avalon database connection
- dbcon = AvalonMongoDB()
- dbcon.Session["AVALON_PROJECT"] = project_name
- dbcon.install()
-
# Project document
project_doc = get_project(project_name)
asset_doc = get_asset_by_name(project_name, asset_name)
@@ -1400,7 +1395,6 @@ def get_app_environments_for_context(
"app": app,
- "dbcon": dbcon,
"project_doc": project_doc,
"asset_doc": asset_doc,
@@ -1415,9 +1409,6 @@ def get_app_environments_for_context(
prepare_app_environments(data, env_group, modules_manager)
prepare_context_environments(data, env_group, modules_manager)
- # Discard avalon connection
- dbcon.uninstall()
-
return data["env"]
diff --git a/openpype/lib/execute.py b/openpype/lib/execute.py
index 6f52efdfcc..6c1425fc63 100644
--- a/openpype/lib/execute.py
+++ b/openpype/lib/execute.py
@@ -5,6 +5,8 @@ import platform
import json
import tempfile
+from openpype import AYON_SERVER_ENABLED
+
from .log import Logger
from .vendor_bin_utils import find_executable
@@ -321,19 +323,22 @@ def get_openpype_execute_args(*args):
It is possible to pass any arguments that will be added after the
executable.
"""
- pype_executable = os.environ["OPENPYPE_EXECUTABLE"]
- pype_args = [pype_executable]
+ executable = os.environ["OPENPYPE_EXECUTABLE"]
+ launch_args = [executable]
- executable_filename = os.path.basename(pype_executable)
+ executable_filename = os.path.basename(executable)
if "python" in executable_filename.lower():
- pype_args.append(
- os.path.join(os.environ["OPENPYPE_ROOT"], "start.py")
+ filename = "start.py"
+ if AYON_SERVER_ENABLED:
+ filename = "ayon_start.py"
+ launch_args.append(
+ os.path.join(os.environ["OPENPYPE_ROOT"], filename)
)
if args:
- pype_args.extend(args)
+ launch_args.extend(args)
- return pype_args
+ return launch_args
def get_linux_launcher_args(*args):
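With this change the same helper transparently targets either entry script, so callers do not need to know which mode is active. A usage sketch (the CLI argument is illustrative):

```python
import subprocess

from openpype.lib.execute import get_openpype_execute_args

# Resolves to [executable, ...] for a frozen build, or to
# [python, start.py|ayon_start.py, ...] when running from code.
args = get_openpype_execute_args("--help")
subprocess.call(args)
```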
diff --git a/openpype/lib/local_settings.py b/openpype/lib/local_settings.py
index c6c9699240..3fb35a7e7b 100644
--- a/openpype/lib/local_settings.py
+++ b/openpype/lib/local_settings.py
@@ -29,6 +29,7 @@ except ImportError:
import six
import appdirs
+from openpype import AYON_SERVER_ENABLED
from openpype.settings import (
get_local_settings,
get_system_settings
@@ -517,11 +518,54 @@ def _create_local_site_id(registry=None):
return new_id
+def get_ayon_appdirs(*args):
+ """Local app data directory of AYON client.
+
+ Args:
+ *args (Iterable[str]): Subdirectories/files in local app data dir.
+
+ Returns:
+ str: Path to directory/file in local app data dir.
+ """
+
+ return os.path.join(
+ appdirs.user_data_dir("AYON", "Ynput"),
+ *args
+ )
+
+
+def _get_ayon_local_site_id():
+ # used for background syncing
+ site_id = os.environ.get("AYON_SITE_ID")
+ if site_id:
+ return site_id
+
+ site_id_path = get_ayon_appdirs("site_id")
+ if os.path.exists(site_id_path):
+ with open(site_id_path, "r") as stream:
+ site_id = stream.read()
+
+ if site_id:
+ return site_id
+
+ try:
+ from ayon_common.utils import get_local_site_id as _get_local_site_id
+ site_id = _get_local_site_id()
+ except ImportError:
+ raise ValueError("Couldn't access local site id")
+
+ return site_id
+
+
def get_local_site_id():
"""Get local site identifier.
Identifier is created if it does not exist yet.
"""
+
+ if AYON_SERVER_ENABLED:
+ return _get_ayon_local_site_id()
+
# override local id from environment
# used for background syncing
if os.environ.get("OPENPYPE_LOCAL_ID"):
diff --git a/openpype/lib/log.py b/openpype/lib/log.py
index 26dcd86eec..dc2e6615fe 100644
--- a/openpype/lib/log.py
+++ b/openpype/lib/log.py
@@ -24,6 +24,7 @@ import traceback
import threading
import copy
+from openpype import AYON_SERVER_ENABLED
from openpype.client.mongo import (
MongoEnvNotSet,
get_default_components,
@@ -212,7 +213,7 @@ class Logger:
log_mongo_url_components = None
# Database name in Mongo
- log_database_name = os.environ["OPENPYPE_DATABASE_NAME"]
+ log_database_name = os.environ.get("OPENPYPE_DATABASE_NAME")
# Collection name under database in Mongo
log_collection_name = "logs"
@@ -326,12 +327,17 @@ class Logger:
# Change initialization state to prevent runtime changes
# if is executed during runtime
cls.initialized = False
- cls.log_mongo_url_components = get_default_components()
+ if not AYON_SERVER_ENABLED:
+ cls.log_mongo_url_components = get_default_components()
# Define if should logging to mongo be used
- use_mongo_logging = bool(log4mongo is not None)
- if use_mongo_logging:
- use_mongo_logging = os.environ.get("OPENPYPE_LOG_TO_SERVER") == "1"
+ if AYON_SERVER_ENABLED:
+ use_mongo_logging = False
+ else:
+ use_mongo_logging = (
+ log4mongo is not None
+ and os.environ.get("OPENPYPE_LOG_TO_SERVER") == "1"
+ )
# Set mongo id for process (ONLY ONCE)
if use_mongo_logging and cls.mongo_process_id is None:
@@ -453,6 +459,9 @@ class Logger:
if not cls.use_mongo_logging:
return
+ if not cls.log_database_name:
+ raise ValueError("Database name for logs is not set")
+
client = log4mongo.handlers._connection
if not client:
client = cls.get_log_mongo_connection()
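Summarizing the gating above: Mongo logging can only be active in OpenPype mode, with `log4mongo` importable and an explicit opt-in through the environment. A standalone sketch of the same decision:

```python
import os

from openpype import AYON_SERVER_ENABLED

try:
    import log4mongo
except ImportError:
    log4mongo = None

# Mongo logging is disabled entirely in AYON mode; in OpenPype mode it
# additionally requires log4mongo and an explicit opt-in.
use_mongo_logging = (
    not AYON_SERVER_ENABLED
    and log4mongo is not None
    and os.environ.get("OPENPYPE_LOG_TO_SERVER") == "1"
)
```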
diff --git a/openpype/lib/openpype_version.py b/openpype/lib/openpype_version.py
index e052002468..bdf7099f61 100644
--- a/openpype/lib/openpype_version.py
+++ b/openpype/lib/openpype_version.py
@@ -13,6 +13,7 @@ import os
import sys
import openpype.version
+from openpype import AYON_SERVER_ENABLED
from .python_module_tools import import_filepath
@@ -88,6 +89,9 @@ def is_running_staging():
bool: Using staging version or not.
"""
+ if AYON_SERVER_ENABLED:
+ return is_staging_enabled()
+
if os.environ.get("OPENPYPE_IS_STAGING") == "1":
return True
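In AYON mode the staging check delegates to `is_staging_enabled()`; in OpenPype mode the environment variable still wins. Call sites are unchanged either way:

```python
from openpype.lib.openpype_version import is_running_staging

# True when the active AYON bundle is a staging bundle, or when
# OPENPYPE_IS_STAGING=1 in OpenPype mode.
if is_running_staging():
    print("Running a staging version")
```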
diff --git a/openpype/lib/pype_info.py b/openpype/lib/pype_info.py
index 8370ecc88f..2f57d76850 100644
--- a/openpype/lib/pype_info.py
+++ b/openpype/lib/pype_info.py
@@ -5,6 +5,7 @@ import platform
import getpass
import socket
+from openpype import AYON_SERVER_ENABLED
from openpype.settings.lib import get_local_settings
from .execute import get_openpype_execute_args
from .local_settings import get_local_site_id
@@ -33,6 +34,21 @@ def get_openpype_info():
}
+def get_ayon_info():
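+ """Basic information about the AYON application."""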
+ executable_args = get_openpype_execute_args()
+ if is_running_from_build():
+ version_type = "build"
+ else:
+ version_type = "code"
+ return {
+ "build_verison": get_build_version(),
+ "version_type": version_type,
+ "executable": executable_args[-1],
+ "ayon_root": os.environ["AYON_ROOT"],
+ "server_url": os.environ["AYON_SERVER_URL"]
+ }
+
+
def get_workstation_info():
"""Basic information about workstation."""
host_name = socket.gethostname()
@@ -52,12 +68,17 @@ def get_workstation_info():
def get_all_current_info():
"""All information about current process in one dictionary."""
- return {
- "pype": get_openpype_info(),
+
+ output = {
"workstation": get_workstation_info(),
"env": os.environ.copy(),
"local_settings": get_local_settings()
}
+ if AYON_SERVER_ENABLED:
+ output["ayon"] = get_ayon_info()
+ else:
+ output["openpype"] = get_openpype_info()
+ return output
def extract_pype_info_to_file(dirpath):
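The resulting dictionary shape differs only in the application key. A quick sketch of what a caller can rely on:

```python
from openpype import AYON_SERVER_ENABLED
from openpype.lib.pype_info import get_all_current_info

info = get_all_current_info()
# Keys common to both modes.
assert {"workstation", "env", "local_settings"} <= set(info)
# The application block is keyed by mode.
app_key = "ayon" if AYON_SERVER_ENABLED else "openpype"
assert app_key in info
```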
diff --git a/openpype/lib/usdlib.py b/openpype/lib/usdlib.py
index 5ef1d38f87..cb96a0c1d0 100644
--- a/openpype/lib/usdlib.py
+++ b/openpype/lib/usdlib.py
@@ -9,7 +9,7 @@ except ImportError:
from mvpxr import Usd, UsdGeom, Sdf, Kind
from openpype.client import get_project, get_asset_by_name
-from openpype.pipeline import legacy_io, Anatomy
+from openpype.pipeline import Anatomy, get_current_project_name
log = logging.getLogger(__name__)
@@ -126,7 +126,7 @@ def create_model(filename, asset, variant_subsets):
"""
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
asset_doc = get_asset_by_name(project_name, asset)
assert asset_doc, "Asset not found: %s" % asset
@@ -177,7 +177,7 @@ def create_shade(filename, asset, variant_subsets):
"""
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
asset_doc = get_asset_by_name(project_name, asset)
assert asset_doc, "Asset not found: %s" % asset
@@ -213,7 +213,7 @@ def create_shade_variation(filename, asset, model_variant, shade_variants):
"""
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
asset_doc = get_asset_by_name(project_name, asset)
assert asset_doc, "Asset not found: %s" % asset
@@ -314,7 +314,7 @@ def get_usd_master_path(asset, subset, representation):
"""
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
anatomy = Anatomy(project_name)
project_doc = get_project(
project_name,
diff --git a/openpype/modules/base.py b/openpype/modules/base.py
index fb9b4e1096..9b3637c48a 100644
--- a/openpype/modules/base.py
+++ b/openpype/modules/base.py
@@ -1,5 +1,6 @@
# -*- coding: utf-8 -*-
"""Base class for Pype Modules."""
+import copy
import os
import sys
import json
@@ -12,8 +13,12 @@ import collections
import traceback
from uuid import uuid4
from abc import ABCMeta, abstractmethod
-import six
+import six
+import appdirs
+import ayon_api
+
+from openpype import AYON_SERVER_ENABLED
from openpype.settings import (
get_system_settings,
SYSTEM_SETTINGS_KEY,
@@ -30,8 +35,9 @@ from openpype.settings.lib import (
from openpype.lib import (
Logger,
import_filepath,
- import_module_from_dirpath
+ import_module_from_dirpath,
)
+from openpype.lib.openpype_version import is_staging_enabled
from .interfaces import (
OpenPypeInterface,
@@ -186,7 +192,11 @@ def get_dynamic_modules_dirs():
Returns:
list: Paths loaded from studio overrides.
"""
+
output = []
+ if AYON_SERVER_ENABLED:
+ return output
+
value = get_studio_system_settings_overrides()
for key in ("modules", "addon_paths", platform.system().lower()):
if key not in value:
@@ -299,6 +309,134 @@ def load_modules(force=False):
time.sleep(0.1)
+def _get_ayon_addons_information():
+ """Receive information about addons to use from server.
+
+ Todos:
+ Actually ask server for the information.
+ Allow project name as an optional argument to query information
+ about addons used by a specific project.
+
+ Returns:
+ List[Dict[str, Any]]: List of addon information to use.
+ """
+
+ output = []
+ bundle_name = os.getenv("AYON_BUNDLE_NAME")
+ bundles = ayon_api.get_bundles()["bundles"]
+ final_bundle = next(
+ (
+ bundle
+ for bundle in bundles
+ if bundle["name"] == bundle_name
+ ),
+ None
+ )
+ if final_bundle is None:
+ return output
+
+ bundle_addons = final_bundle["addons"]
+ addons = ayon_api.get_addons_info()["addons"]
+ for addon in addons:
+ name = addon["name"]
+ versions = addon.get("versions")
+ addon_version = bundle_addons.get(name)
+ if addon_version is None or not versions:
+ continue
+ version = versions.get(addon_version)
+ if version:
+ version = copy.deepcopy(version)
+ version["name"] = name
+ version["version"] = addon_version
+ output.append(version)
+ return output
+
+
+def _load_ayon_addons(openpype_modules, modules_key, log):
+ """Load AYON addons based on information from server.
+
+ This function should not trigger downloading of any addons but only use
+ what is already available on the machine (at least in the first
+ stages of development).
+
+ Args:
+ openpype_modules (_ModuleClass): Module object where modules are
+ stored.
+ modules_key (str): Key under which addons are imported into
+ 'sys.modules'.
+ log (logging.Logger): Logger object.
+
+ Returns:
+ List[str]: List of v3 addons to skip loading because a v4
+ alternative is imported.
+ """
+
+ v3_addons_to_skip = []
+
+ addons_info = _get_ayon_addons_information()
+ if not addons_info:
+ return v3_addons_to_skip
+ addons_dir = os.path.join(
+ appdirs.user_data_dir("AYON", "Ynput"),
+ "addons"
+ )
+ if not os.path.exists(addons_dir):
+ log.warning("Addons directory does not exists. Path \"{}\"".format(
+ addons_dir
+ ))
+ return v3_addons_to_skip
+
+ for addon_info in addons_info:
+ addon_name = addon_info["name"]
+ addon_version = addon_info["version"]
+
+ folder_name = "{}_{}".format(addon_name, addon_version)
+ addon_dir = os.path.join(addons_dir, folder_name)
+ if not os.path.exists(addon_dir):
+ log.warning((
+ "Directory for addon {} {} does not exists. Path \"{}\""
+ ).format(addon_name, addon_version, addon_dir))
+ continue
+
+ sys.path.insert(0, addon_dir)
+ imported_modules = []
+ for name in os.listdir(addon_dir):
+ path = os.path.join(addon_dir, name)
+ basename, ext = os.path.splitext(name)
+ is_dir = os.path.isdir(path)
+ is_py_file = ext.lower() == ".py"
+ if not is_py_file and not is_dir:
+ continue
+
+ try:
+ mod = __import__(basename, fromlist=("",))
+ imported_modules.append(mod)
+ except BaseException:
+ log.warning(
+ "Failed to import \"{}\"".format(basename),
+ exc_info=True
+ )
+
+ if not imported_modules:
+ log.warning("Addon {} {} has no content to import".format(
+ addon_name, addon_version
+ ))
+ continue
+
+ if len(imported_modules) == 1:
+ mod = imported_modules[0]
+ addon_alias = getattr(mod, "V3_ALIAS", None)
+ if not addon_alias:
+ addon_alias = addon_name
+ v3_addons_to_skip.append(addon_alias)
+ new_import_str = "{}.{}".format(modules_key, addon_alias)
+
+ sys.modules[new_import_str] = mod
+ setattr(openpype_modules, addon_alias, mod)
+
+ else:
+ log.info("More then one module was imported")
+
+ return v3_addons_to_skip
+
+
def _load_modules():
# Key under which will be modules imported in `sys.modules`
modules_key = "openpype_modules"
@@ -308,6 +446,12 @@ def _load_modules():
log = Logger.get_logger("ModulesLoader")
+ ignore_addon_names = []
+ if AYON_SERVER_ENABLED:
+ ignore_addon_names = _load_ayon_addons(
+ openpype_modules, modules_key, log
+ )
+
# Look for OpenPype modules in paths defined with `get_module_dirs`
# - dynamically imported OpenPype modules and addons
module_dirs = get_module_dirs()
@@ -351,6 +495,9 @@ def _load_modules():
fullpath = os.path.join(dirpath, filename)
basename, ext = os.path.splitext(filename)
+ if basename in ignore_addon_names:
+ continue
+
# Validations
if os.path.isdir(fullpath):
# Check existence of init file
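Addon discovery pairs the bundle named by `AYON_BUNDLE_NAME` with the server's addon list, then imports each addon from a local directory. A sketch of the expected on-disk layout (the addon name and version are illustrative):

```python
import os

import appdirs


def ayon_addon_dir(addon_name, addon_version):
    # Mirrors the lookup in _load_ayon_addons:
    # <user data dir>/AYON/addons/<name>_<version>
    addons_dir = os.path.join(
        appdirs.user_data_dir("AYON", "Ynput"), "addons"
    )
    return os.path.join(
        addons_dir, "{}_{}".format(addon_name, addon_version)
    )


print(ayon_addon_dir("example_addon", "1.0.0"))  # hypothetical addon
```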
diff --git a/openpype/modules/deadline/plugins/publish/submit_houdini_render_deadline.py b/openpype/modules/deadline/plugins/publish/submit_houdini_render_deadline.py
index 254914a850..af341ca8e8 100644
--- a/openpype/modules/deadline/plugins/publish/submit_houdini_render_deadline.py
+++ b/openpype/modules/deadline/plugins/publish/submit_houdini_render_deadline.py
@@ -88,7 +88,6 @@ class HoudiniSubmitDeadline(abstract_submit_deadline.AbstractSubmitDeadline):
"AVALON_APP_NAME",
"OPENPYPE_DEV",
"OPENPYPE_LOG_NO_COLORS",
- "OPENPYPE_VERSION"
]
# Add OpenPype version if we are running from build.
diff --git a/openpype/modules/deadline/plugins/publish/submit_max_deadline.py b/openpype/modules/deadline/plugins/publish/submit_max_deadline.py
index b6a30e36b7..fff7a4ced5 100644
--- a/openpype/modules/deadline/plugins/publish/submit_max_deadline.py
+++ b/openpype/modules/deadline/plugins/publish/submit_max_deadline.py
@@ -20,6 +20,7 @@ from openpype.hosts.max.api.lib import (
from openpype.hosts.max.api.lib_rendersettings import RenderSettings
from openpype_modules.deadline import abstract_submit_deadline
from openpype_modules.deadline.abstract_submit_deadline import DeadlineJobInfo
+from openpype.lib import is_running_from_build
@attr.s
@@ -110,9 +111,13 @@ class MaxSubmitDeadline(abstract_submit_deadline.AbstractSubmitDeadline,
"AVALON_TASK",
"AVALON_APP_NAME",
"OPENPYPE_DEV",
- "OPENPYPE_VERSION",
"IS_TEST"
]
+
+ # Add OpenPype version if we are running from build.
+ if is_running_from_build():
+ keys.append("OPENPYPE_VERSION")
+
# Add mongo url if it's enabled
if self._instance.context.data.get("deadlinePassMongoUrl"):
keys.append("OPENPYPE_MONGO")
@@ -179,20 +184,18 @@ class MaxSubmitDeadline(abstract_submit_deadline.AbstractSubmitDeadline,
}
self.log.debug("Submitting 3dsMax render..")
- payload = self._use_published_name(payload_data)
+ project_settings = instance.context.data["project_settings"]
+ payload = self._use_published_name(payload_data, project_settings)
job_info, plugin_info = payload
self.submit(self.assemble_payload(job_info, plugin_info))
- def _use_published_name(self, data):
+ def _use_published_name(self, data, project_settings):
instance = self._instance
job_info = copy.deepcopy(self.job_info)
plugin_info = copy.deepcopy(self.plugin_info)
plugin_data = {}
- project_setting = get_project_settings(
- legacy_io.Session["AVALON_PROJECT"]
- )
- multipass = get_multipass_setting(project_setting)
+ multipass = get_multipass_setting(project_settings)
if multipass:
plugin_data["DisableMultipass"] = 0
else:
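Project settings are now read once from the publish context instead of being re-queried per plugin. A hedged helper sketch of that pattern (assumes the standard `project_settings` key that OpenPype's collectors store on the pyblish context):

```python
def get_context_setting(instance, *keys):
    """Read a nested project setting from the publish context.

    Sketch only; "project_settings" is assumed to be populated by the
    global settings collector before this plugin runs.
    """
    value = instance.context.data["project_settings"]
    for key in keys:
        value = value[key]
    return value
```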
diff --git a/openpype/modules/deadline/plugins/publish/submit_maya_remote_publish_deadline.py b/openpype/modules/deadline/plugins/publish/submit_maya_remote_publish_deadline.py
index 3b04f6d3bc..39120f7c8a 100644
--- a/openpype/modules/deadline/plugins/publish/submit_maya_remote_publish_deadline.py
+++ b/openpype/modules/deadline/plugins/publish/submit_maya_remote_publish_deadline.py
@@ -110,8 +110,8 @@ class MayaSubmitRemotePublishDeadline(
# TODO replace legacy_io with context.data
environment["AVALON_PROJECT"] = project_name
- environment["AVALON_ASSET"] = legacy_io.Session["AVALON_ASSET"]
- environment["AVALON_TASK"] = legacy_io.Session["AVALON_TASK"]
+ environment["AVALON_ASSET"] = instance.context.data["asset"]
+ environment["AVALON_TASK"] = instance.context.data["task"]
environment["AVALON_APP_NAME"] = os.environ.get("AVALON_APP_NAME")
environment["OPENPYPE_LOG_NO_COLORS"] = "1"
environment["OPENPYPE_REMOTE_JOB"] = "1"
diff --git a/openpype/modules/deadline/plugins/publish/submit_publish_job.py b/openpype/modules/deadline/plugins/publish/submit_publish_job.py
index 457ebfd0fe..0529fb8a70 100644
--- a/openpype/modules/deadline/plugins/publish/submit_publish_job.py
+++ b/openpype/modules/deadline/plugins/publish/submit_publish_job.py
@@ -182,6 +182,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin,
instance_version = instance.data.get("version")  # take this if it exists
if instance_version != 1:
override_version = instance_version
+
output_dir = self._get_publish_folder(
anatomy,
deepcopy(instance.data["anatomyData"]),
@@ -197,9 +198,9 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin,
create_metadata_path(instance, anatomy)
environment = {
- "AVALON_PROJECT": legacy_io.Session["AVALON_PROJECT"],
- "AVALON_ASSET": legacy_io.Session["AVALON_ASSET"],
- "AVALON_TASK": legacy_io.Session["AVALON_TASK"],
+ "AVALON_PROJECT": instance.context.data["projectName"],
+ "AVALON_ASSET": instance.context.data["asset"],
+ "AVALON_TASK": instance.context.data["task"],
"OPENPYPE_USERNAME": instance.context.data["user"],
"OPENPYPE_PUBLISH_JOB": "1",
"OPENPYPE_RENDER_JOB": "0",
@@ -317,6 +318,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin,
return deadline_publish_job_id
+
def process(self, instance):
# type: (pyblish.api.Instance) -> None
"""Process plugin.
@@ -547,8 +549,9 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin,
be stored
based on 'publish' template
"""
+
+ project_name = self.context.data["projectName"]
if not version:
- project_name = legacy_io.active_project()
version = get_last_version_by_subset_name(
project_name,
subset,
@@ -571,7 +574,6 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin,
else:
# solve deprecated situation when `folder` key is not underneath
# `publish` anatomy
- project_name = legacy_io.Session["AVALON_PROJECT"]
self.log.warning((
"Deprecation warning: Anatomy does not have set `folder`"
" key underneath `publish` (in global of for project `{}`)."
diff --git a/openpype/modules/ftrack/ftrack_module.py b/openpype/modules/ftrack/ftrack_module.py
index d61b5f0b26..b5152ff9c4 100644
--- a/openpype/modules/ftrack/ftrack_module.py
+++ b/openpype/modules/ftrack/ftrack_module.py
@@ -123,18 +123,7 @@ class FtrackModule(
# Add Python 2 modules
python_paths = [
# `python-ftrack-api`
- os.path.join(python_2_vendor, "ftrack-python-api", "source"),
- # `arrow`
- os.path.join(python_2_vendor, "arrow"),
- # `builtins` from `python-future`
- # - `python-future` is strict Python 2 module that cause crashes
- # of Python 3 scripts executed through OpenPype
- # (burnin script etc.)
- os.path.join(python_2_vendor, "builtins"),
- # `backports.functools_lru_cache`
- os.path.join(
- python_2_vendor, "backports.functools_lru_cache"
- )
+ os.path.join(python_2_vendor, "ftrack-python-api", "source")
]
# Load PYTHONPATH from current launch context
diff --git a/openpype/modules/ftrack/plugins/publish/collect_ftrack_api.py b/openpype/modules/ftrack/plugins/publish/collect_ftrack_api.py
index e13b7e65cd..fe3275ce2c 100644
--- a/openpype/modules/ftrack/plugins/publish/collect_ftrack_api.py
+++ b/openpype/modules/ftrack/plugins/publish/collect_ftrack_api.py
@@ -1,8 +1,6 @@
import logging
import pyblish.api
-from openpype.pipeline import legacy_io
-
class CollectFtrackApi(pyblish.api.ContextPlugin):
""" Collects an ftrack session and the current task id. """
@@ -24,9 +22,9 @@ class CollectFtrackApi(pyblish.api.ContextPlugin):
self.log.debug("Ftrack user: \"{0}\"".format(session.api_user))
# Collect task
- project_name = legacy_io.Session["AVALON_PROJECT"]
- asset_name = legacy_io.Session["AVALON_ASSET"]
- task_name = legacy_io.Session["AVALON_TASK"]
+ project_name = context.data["projectName"]
+ asset_name = context.data["asset"]
+ task_name = context.data["task"]
# Find project entity
project_query = 'Project where full_name is "{0}"'.format(project_name)
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/.gitignore b/openpype/modules/ftrack/python2_vendor/arrow/.gitignore
deleted file mode 100644
index 0448d0cf0c..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/.gitignore
+++ /dev/null
@@ -1,211 +0,0 @@
-README.rst.new
-
-# Small entry point file for debugging tasks
-test.py
-
-# Byte-compiled / optimized / DLL files
-__pycache__/
-*.py[cod]
-*$py.class
-
-# C extensions
-*.so
-
-# Distribution / packaging
-.Python
-build/
-develop-eggs/
-dist/
-downloads/
-eggs/
-.eggs/
-lib/
-lib64/
-parts/
-sdist/
-var/
-wheels/
-pip-wheel-metadata/
-share/python-wheels/
-*.egg-info/
-.installed.cfg
-*.egg
-
-# PyInstaller
-# Usually these files are written by a python script from a template
-# before PyInstaller builds the exe, so as to inject date/other infos into it.
-*.manifest
-*.spec
-
-# Installer logs
-pip-log.txt
-pip-delete-this-directory.txt
-
-# Unit test / coverage reports
-htmlcov/
-.tox/
-.nox/
-.coverage
-.coverage.*
-.cache
-nosetests.xml
-coverage.xml
-*.cover
-.hypothesis/
-.pytest_cache/
-
-# Translations
-*.mo
-*.pot
-
-# Django stuff:
-*.log
-local_settings.py
-db.sqlite3
-db.sqlite3-journal
-
-# Flask stuff:
-instance/
-.webassets-cache
-
-# Scrapy stuff:
-.scrapy
-
-# Sphinx documentation
-docs/_build/
-
-# PyBuilder
-target/
-
-# Jupyter Notebook
-.ipynb_checkpoints
-
-# IPython
-profile_default/
-ipython_config.py
-
-# pyenv
-.python-version
-
-# celery beat schedule file
-celerybeat-schedule
-
-# SageMath parsed files
-*.sage.py
-
-# Environments
-.env
-.venv
-env/
-venv/
-ENV/
-local/
-env.bak/
-venv.bak/
-
-# Spyder project settings
-.spyderproject
-.spyproject
-
-# Rope project settings
-.ropeproject
-
-# mkdocs documentation
-/site
-
-# mypy
-.mypy_cache/
-.dmypy.json
-dmypy.json
-
-# Pyre type checker
-.pyre/
-
-# Swap
-[._]*.s[a-v][a-z]
-[._]*.sw[a-p]
-[._]s[a-rt-v][a-z]
-[._]ss[a-gi-z]
-[._]sw[a-p]
-
-# Session
-Session.vim
-Sessionx.vim
-
-# Temporary
-.netrwhist
-*~
-# Auto-generated tag files
-tags
-# Persistent undo
-[._]*.un~
-
-.idea/
-.vscode/
-
-# General
-.DS_Store
-.AppleDouble
-.LSOverride
-
-# Icon must end with two \r
-Icon
-
-
-# Thumbnails
-._*
-
-# Files that might appear in the root of a volume
-.DocumentRevisions-V100
-.fseventsd
-.Spotlight-V100
-.TemporaryItems
-.Trashes
-.VolumeIcon.icns
-.com.apple.timemachine.donotpresent
-
-# Directories potentially created on remote AFP share
-.AppleDB
-.AppleDesktop
-Network Trash Folder
-Temporary Items
-.apdisk
-
-*~
-
-# temporary files which can be created if a process still has a handle open of a deleted file
-.fuse_hidden*
-
-# KDE directory preferences
-.directory
-
-# Linux trash folder which might appear on any partition or disk
-.Trash-*
-
-# .nfs files are created when an open file is removed but is still being accessed
-.nfs*
-
-# Windows thumbnail cache files
-Thumbs.db
-Thumbs.db:encryptable
-ehthumbs.db
-ehthumbs_vista.db
-
-# Dump file
-*.stackdump
-
-# Folder config file
-[Dd]esktop.ini
-
-# Recycle Bin used on file shares
-$RECYCLE.BIN/
-
-# Windows Installer files
-*.cab
-*.msi
-*.msix
-*.msm
-*.msp
-
-# Windows shortcuts
-*.lnk
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/.pre-commit-config.yaml b/openpype/modules/ftrack/python2_vendor/arrow/.pre-commit-config.yaml
deleted file mode 100644
index 1f5128595b..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/.pre-commit-config.yaml
+++ /dev/null
@@ -1,41 +0,0 @@
-default_language_version:
- python: python3
-repos:
- - repo: https://github.com/pre-commit/pre-commit-hooks
- rev: v3.2.0
- hooks:
- - id: trailing-whitespace
- - id: end-of-file-fixer
- - id: fix-encoding-pragma
- exclude: ^arrow/_version.py
- - id: requirements-txt-fixer
- - id: check-ast
- - id: check-yaml
- - id: check-case-conflict
- - id: check-docstring-first
- - id: check-merge-conflict
- - id: debug-statements
- - repo: https://github.com/timothycrosley/isort
- rev: 5.4.2
- hooks:
- - id: isort
- - repo: https://github.com/asottile/pyupgrade
- rev: v2.7.2
- hooks:
- - id: pyupgrade
- - repo: https://github.com/pre-commit/pygrep-hooks
- rev: v1.6.0
- hooks:
- - id: python-no-eval
- - id: python-check-blanket-noqa
- - id: rst-backticks
- - repo: https://github.com/psf/black
- rev: 20.8b1
- hooks:
- - id: black
- args: [--safe, --quiet]
- - repo: https://gitlab.com/pycqa/flake8
- rev: 3.8.3
- hooks:
- - id: flake8
- additional_dependencies: [flake8-bugbear]
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/CHANGELOG.rst b/openpype/modules/ftrack/python2_vendor/arrow/CHANGELOG.rst
deleted file mode 100644
index 0b55a4522c..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/CHANGELOG.rst
+++ /dev/null
@@ -1,598 +0,0 @@
-Changelog
-=========
-
-0.17.0 (2020-10-2)
--------------------
-
-- [WARN] Arrow will **drop support** for Python 2.7 and 3.5 in the upcoming 1.0.0 release. This is the last major release to support Python 2.7 and Python 3.5.
-- [NEW] Arrow now properly handles imaginary datetimes during DST shifts. For example:
-
-..code-block:: python
- >>> just_before = arrow.get(2013, 3, 31, 1, 55, tzinfo="Europe/Paris")
- >>> just_before.shift(minutes=+10)
-
-
-..code-block:: python
- >>> before = arrow.get("2018-03-10 23:00:00", "YYYY-MM-DD HH:mm:ss", tzinfo="US/Pacific")
- >>> after = arrow.get("2018-03-11 04:00:00", "YYYY-MM-DD HH:mm:ss", tzinfo="US/Pacific")
- >>> result=[(t, t.to("utc")) for t in arrow.Arrow.range("hour", before, after)]
- >>> for r in result:
- ... print(r)
- ...
- (, )
- (, )
- (, )
- (, )
- (, )
-
-- [NEW] Added ``humanize`` week granularity translation for Tagalog.
-- [CHANGE] Calls to the ``timestamp`` property now emit a ``DeprecationWarning``. In a future release, ``timestamp`` will be changed to a method to align with Python's datetime module. If you would like to continue using the property, please change your code to use the ``int_timestamp`` or ``float_timestamp`` properties instead.
-- [CHANGE] Expanded and improved Catalan locale.
-- [FIX] Fixed a bug that caused ``Arrow.range()`` to incorrectly cut off ranges in certain scenarios when using month, quarter, or year endings.
-- [FIX] Fixed a bug that caused day of week token parsing to be case sensitive.
-- [INTERNAL] A number of functions were reordered in arrow.py for better organization and grouping of related methods. This change will have no impact on usage.
-- [INTERNAL] A minimum tox version is now enforced for compatibility reasons. Contributors must use tox >3.18.0 going forward.
-
-0.16.0 (2020-08-23)
--------------------
-
-- [WARN] Arrow will **drop support** for Python 2.7 and 3.5 in the upcoming 1.0.0 release. The 0.16.x and 0.17.x releases are the last to support Python 2.7 and 3.5.
-- [NEW] Implemented `PEP 495 `_ to handle ambiguous datetimes. This is achieved by the addition of the ``fold`` attribute for Arrow objects. For example:
-
-.. code-block:: python
-
- >>> before = Arrow(2017, 10, 29, 2, 0, tzinfo='Europe/Stockholm')
-
- >>> before.fold
- 0
- >>> before.ambiguous
- True
- >>> after = Arrow(2017, 10, 29, 2, 0, tzinfo='Europe/Stockholm', fold=1)
-
- >>> after = before.replace(fold=1)
-
-
-- [NEW] Added ``normalize_whitespace`` flag to ``arrow.get``. This is useful for parsing log files and/or any files that may contain inconsistent spacing. For example:
-
-.. code-block:: python
-
- >>> arrow.get("Jun 1 2005 1:33PM", "MMM D YYYY H:mmA", normalize_whitespace=True)
-
- >>> arrow.get("2013-036 \t 04:05:06Z", normalize_whitespace=True)
-
-
-0.15.8 (2020-07-23)
--------------------
-
-- [WARN] Arrow will **drop support** for Python 2.7 and 3.5 in the upcoming 1.0.0 release. The 0.15.x, 0.16.x, and 0.17.x releases are the last to support Python 2.7 and 3.5.
-- [NEW] Added ``humanize`` week granularity translation for Czech.
-- [FIX] ``arrow.get`` will now pick sane defaults when weekdays are passed with particular token combinations, see `#446 `_.
-- [INTERNAL] Moved arrow to an organization. The repo can now be found `here `_.
-- [INTERNAL] Started issuing deprecation warnings for Python 2.7 and 3.5.
-- [INTERNAL] Added Python 3.9 to CI pipeline.
-
-0.15.7 (2020-06-19)
--------------------
-
-- [NEW] Added a number of built-in format strings. See the `docs `_ for a complete list of supported formats. For example:
-
-.. code-block:: python
-
- >>> arw = arrow.utcnow()
- >>> arw.format(arrow.FORMAT_COOKIE)
- 'Wednesday, 27-May-2020 10:30:35 UTC'
-
-- [NEW] Arrow is now fully compatible with Python 3.9 and PyPy3.
-- [NEW] Added Makefile, tox.ini, and requirements.txt files to the distribution bundle.
-- [NEW] Added French Canadian and Swahili locales.
-- [NEW] Added ``humanize`` week granularity translation for Hebrew, Greek, Macedonian, Swedish, Slovak.
-- [FIX] ms and μs timestamps are now normalized in ``arrow.get()``, ``arrow.fromtimestamp()``, and ``arrow.utcfromtimestamp()``. For example:
-
-.. code-block:: python
-
- >>> ts = 1591161115194556
- >>> arw = arrow.get(ts)
-
- >>> arw.timestamp
- 1591161115
-
-- [FIX] Refactored and updated Macedonian, Hebrew, Korean, and Portuguese locales.
-
-0.15.6 (2020-04-29)
--------------------
-
-- [NEW] Added support for parsing and formatting `ISO 8601 week dates `_ via a new token ``W``, for example:
-
-.. code-block:: python
-
- >>> arrow.get("2013-W29-6", "W")
-
- >>> utc=arrow.utcnow()
- >>> utc
-
- >>> utc.format("W")
- '2020-W04-4'
-
-- [NEW] Formatting with ``x`` token (microseconds) is now possible, for example:
-
-.. code-block:: python
-
- >>> dt = arrow.utcnow()
- >>> dt.format("x")
- '1585669870688329'
- >>> dt.format("X")
- '1585669870'
-
-- [NEW] Added ``humanize`` week granularity translation for German, Italian, Polish & Taiwanese locales.
-- [FIX] Consolidated and simplified German locales.
-- [INTERNAL] Moved testing suite from nosetest/Chai to pytest/pytest-mock.
-- [INTERNAL] Converted xunit-style setup and teardown functions in tests to pytest fixtures.
-- [INTERNAL] Setup Github Actions for CI alongside Travis.
-- [INTERNAL] Help support Arrow's future development by donating to the project on `Open Collective `_.
-
-0.15.5 (2020-01-03)
--------------------
-
-- [WARN] Python 2 reached EOL on 2020-01-01. arrow will **drop support** for Python 2 in a future release to be decided (see `#739 `_).
-- [NEW] Added bounds parameter to ``span_range``, ``interval`` and ``span`` methods. This allows you to include or exclude the start and end values.
-- [NEW] ``arrow.get()`` can now create arrow objects from a timestamp with a timezone, for example:
-
-.. code-block:: python
-
- >>> arrow.get(1367900664, tzinfo=tz.gettz('US/Pacific'))
-
-
-- [NEW] ``humanize`` can now combine multiple levels of granularity, for example:
-
-.. code-block:: python
-
- >>> later140 = arrow.utcnow().shift(seconds=+8400)
- >>> later140.humanize(granularity="minute")
- 'in 139 minutes'
- >>> later140.humanize(granularity=["hour", "minute"])
- 'in 2 hours and 19 minutes'
-
-- [NEW] Added Hong Kong locale (``zh_hk``).
-- [NEW] Added ``humanize`` week granularity translation for Dutch.
-- [NEW] Numbers are now displayed when using the seconds granularity in ``humanize``.
-- [CHANGE] ``range`` now supports both the singular and plural forms of the ``frames`` argument (e.g. day and days).
-- [FIX] Improved parsing of strings that contain punctuation.
-- [FIX] Improved behaviour of ``humanize`` when singular seconds are involved.
-
-0.15.4 (2019-11-02)
--------------------
-
-- [FIX] Fixed an issue that caused package installs to fail on Conda Forge.
-
-0.15.3 (2019-11-02)
--------------------
-
-- [NEW] ``factory.get()`` can now create arrow objects from a ISO calendar tuple, for example:
-
-.. code-block:: python
-
- >>> arrow.get((2013, 18, 7))
-
-
-- [NEW] Added a new token ``x`` to allow parsing of integer timestamps with milliseconds and microseconds.
-- [NEW] Formatting now supports escaping of characters using the same syntax as parsing, for example:
-
-.. code-block:: python
-
- >>> arw = arrow.now()
- >>> fmt = "YYYY-MM-DD h [h] m"
- >>> arw.format(fmt)
- '2019-11-02 3 h 32'
-
-- [NEW] Added ``humanize`` week granularity translations for Chinese, Spanish and Vietnamese.
-- [CHANGE] Added ``ParserError`` to module exports.
-- [FIX] Added support for midnight at end of day. See `#703 `_ for details.
-- [INTERNAL] Created Travis build for macOS.
-- [INTERNAL] Test parsing and formatting against full timezone database.
-
-0.15.2 (2019-09-14)
--------------------
-
-- [NEW] Added ``humanize`` week granularity translations for Portuguese and Brazilian Portuguese.
-- [NEW] Embedded changelog within docs and added release dates to versions.
-- [FIX] Fixed a bug that caused test failures on Windows only, see `#668 `_ for details.
-
-0.15.1 (2019-09-10)
--------------------
-
-- [NEW] Added ``humanize`` week granularity translations for Japanese.
-- [FIX] Fixed a bug that caused Arrow to fail when passed a negative timestamp string.
-- [FIX] Fixed a bug that caused Arrow to fail when passed a datetime object with ``tzinfo`` of type ``StaticTzInfo``.
-
-0.15.0 (2019-09-08)
--------------------
-
-- [NEW] Added support for DDD and DDDD ordinal date tokens. The following functionality is now possible: ``arrow.get("1998-045")``, ``arrow.get("1998-45", "YYYY-DDD")``, ``arrow.get("1998-045", "YYYY-DDDD")``.
-- [NEW] ISO 8601 basic format for dates and times is now supported (e.g. ``YYYYMMDDTHHmmssZ``).
-- [NEW] Added ``humanize`` week granularity translations for French, Russian and Swiss German locales.
-- [CHANGE] Timestamps of type ``str`` are no longer supported **without a format string** in the ``arrow.get()`` method. This change was made to support the ISO 8601 basic format and to address bugs such as `#447 `_.
-
-The following will NOT work in v0.15.0:
-
-.. code-block:: python
-
- >>> arrow.get("1565358758")
- >>> arrow.get("1565358758.123413")
-
-The following will work in v0.15.0:
-
-.. code-block:: python
-
- >>> arrow.get("1565358758", "X")
- >>> arrow.get("1565358758.123413", "X")
- >>> arrow.get(1565358758)
- >>> arrow.get(1565358758.123413)
-
-- [CHANGE] When a meridian token (a|A) is passed and no meridians are available for the specified locale (e.g. unsupported or untranslated) a ``ParserError`` is raised.
-- [CHANGE] The timestamp token (``X``) will now match float timestamps of type ``str``: ``arrow.get(“1565358758.123415”, “X”)``.
-- [CHANGE] Strings with leading and/or trailing whitespace will no longer be parsed without a format string. Please see `the docs `_ for ways to handle this.
-- [FIX] The timestamp token (``X``) will now only match on strings that **strictly contain integers and floats**, preventing incorrect matches.
-- [FIX] Most instances of ``arrow.get()`` returning an incorrect ``Arrow`` object from a partial parsing match have been eliminated. The following issue have been addressed: `#91 `_, `#196 `_, `#396 `_, `#434 `_, `#447 `_, `#456 `_, `#519 `_, `#538 `_, `#560 `_.
-
-0.14.7 (2019-09-04)
--------------------
-
-- [CHANGE] ``ArrowParseWarning`` will no longer be printed on every call to ``arrow.get()`` with a datetime string. The purpose of the warning was to start a conversation about the upcoming 0.15.0 changes and we appreciate all the feedback that the community has given us!
-
-0.14.6 (2019-08-28)
--------------------
-
-- [NEW] Added support for ``week`` granularity in ``Arrow.humanize()``. For example, ``arrow.utcnow().shift(weeks=-1).humanize(granularity="week")`` outputs "a week ago". This change introduced two new untranslated words, ``week`` and ``weeks``, to all locale dictionaries, so locale contributions are welcome!
-- [NEW] Fully translated the Brazilian Portugese locale.
-- [CHANGE] Updated the Macedonian locale to inherit from a Slavic base.
-- [FIX] Fixed a bug that caused ``arrow.get()`` to ignore tzinfo arguments of type string (e.g. ``arrow.get(tzinfo="Europe/Paris")``).
-- [FIX] Fixed a bug that occurred when ``arrow.Arrow()`` was instantiated with a ``pytz`` tzinfo object.
-- [FIX] Fixed a bug that caused Arrow to fail when passed a sub-second token, that when rounded, had a value greater than 999999 (e.g. ``arrow.get("2015-01-12T01:13:15.9999995")``). Arrow should now accurately propagate the rounding for large sub-second tokens.
-
-0.14.5 (2019-08-09)
--------------------
-
-- [NEW] Added Afrikaans locale.
-- [CHANGE] Removed deprecated ``replace`` shift functionality. Users looking to pass plural properties to the ``replace`` function to shift values should use ``shift`` instead.
-- [FIX] Fixed bug that occurred when ``factory.get()`` was passed a locale kwarg.
-
-0.14.4 (2019-07-30)
--------------------
-
-- [FIX] Fixed a regression in 0.14.3 that prevented a tzinfo argument of type string to be passed to the ``get()`` function. Functionality such as ``arrow.get("2019072807", "YYYYMMDDHH", tzinfo="UTC")`` should work as normal again.
-- [CHANGE] Moved ``backports.functools_lru_cache`` dependency from ``extra_requires`` to ``install_requires`` for ``Python 2.7`` installs to fix `#495 `_.
-
-0.14.3 (2019-07-28)
--------------------
-
-- [NEW] Added full support for Python 3.8.
-- [CHANGE] Added warnings for upcoming factory.get() parsing changes in 0.15.0. Please see `#612 `_ for full details.
-- [FIX] Extensive refactor and update of documentation.
-- [FIX] factory.get() can now construct from kwargs.
-- [FIX] Added meridians to Spanish Locale.
-
-0.14.2 (2019-06-06)
--------------------
-
-- [CHANGE] Travis CI builds now use tox to lint and run tests.
-- [FIX] Fixed UnicodeDecodeError on certain locales (#600).
-
-0.14.1 (2019-06-06)
--------------------
-
-- [FIX] Fixed ``ImportError: No module named 'dateutil'`` (#598).
-
-0.14.0 (2019-06-06)
--------------------
-
-- [NEW] Added provisional support for Python 3.8.
-- [CHANGE] Removed support for EOL Python 3.4.
-- [FIX] Updated setup.py with modern Python standards.
-- [FIX] Upgraded dependencies to latest versions.
-- [FIX] Enabled flake8 and black on travis builds.
-- [FIX] Formatted code using black and isort.
-
-0.13.2 (2019-05-30)
--------------------
-
-- [NEW] Add is_between method.
-- [FIX] Improved humanize behaviour for near zero durations (#416).
-- [FIX] Correct humanize behaviour with future days (#541).
-- [FIX] Documentation updates.
-- [FIX] Improvements to German Locale.
-
-0.13.1 (2019-02-17)
--------------------
-
-- [NEW] Add support for Python 3.7.
-- [CHANGE] Remove deprecation decorators for Arrow.range(), Arrow.span_range() and Arrow.interval(), all now return generators, wrap with list() to get old behavior.
-- [FIX] Documentation and docstring updates.
-
-0.13.0 (2019-01-09)
--------------------
-
-- [NEW] Added support for Python 3.6.
-- [CHANGE] Drop support for Python 2.6/3.3.
-- [CHANGE] Return generator instead of list for Arrow.range(), Arrow.span_range() and Arrow.interval().
-- [FIX] Make arrow.get() work with str & tzinfo combo.
-- [FIX] Make sure special RegEx characters are escaped in format string.
-- [NEW] Added support for ZZZ when formatting.
-- [FIX] Stop using datetime.utcnow() in internals, use datetime.now(UTC) instead.
-- [FIX] Return NotImplemented instead of TypeError in arrow math internals.
-- [NEW] Added Estonian Locale.
-- [FIX] Small fixes to Greek locale.
-- [FIX] TagalogLocale improvements.
-- [FIX] Added test requirements to setup.
-- [FIX] Improve docs for get, now and utcnow methods.
-- [FIX] Correct typo in depreciation warning.
-
-0.12.1
-------
-
-- [FIX] Allow universal wheels to be generated and reliably installed.
-- [FIX] Make humanize respect only_distance when granularity argument is also given.
-
-0.12.0
-------
-
-- [FIX] Compatibility fix for Python 2.x
-
-0.11.0
-------
-
-- [FIX] Fix grammar of ArabicLocale
-- [NEW] Add Nepali Locale
-- [FIX] Fix month name + rename AustriaLocale -> AustrianLocale
-- [FIX] Fix typo in Basque Locale
-- [FIX] Fix grammar in PortugueseBrazilian locale
-- [FIX] Remove pip --user-mirrors flag
-- [NEW] Add Indonesian Locale
-
-0.10.0
-------
-
-- [FIX] Fix getattr off by one for quarter
-- [FIX] Fix negative offset for UTC
-- [FIX] Update arrow.py
-
-0.9.0
------
-
-- [NEW] Remove duplicate code
-- [NEW] Support gnu date iso 8601
-- [NEW] Add support for universal wheels
-- [NEW] Slovenian locale
-- [NEW] Slovak locale
-- [NEW] Romanian locale
-- [FIX] respect limit even if end is defined range
-- [FIX] Separate replace & shift functions
-- [NEW] Added tox
-- [FIX] Fix supported Python versions in documentation
-- [NEW] Azerbaijani locale added, locale issue fixed in Turkish.
-- [FIX] Format ParserError's raise message
-
-0.8.0
------
-
-- []
-
-0.7.1
------
-
-- [NEW] Esperanto locale (batisteo)
-
-0.7.0
------
-
-- [FIX] Parse localized strings #228 (swistakm)
-- [FIX] Modify tzinfo parameter in ``get`` api #221 (bottleimp)
-- [FIX] Fix Czech locale (PrehistoricTeam)
-- [FIX] Raise TypeError when adding/subtracting non-dates (itsmeolivia)
-- [FIX] Fix pytz conversion error (Kudo)
-- [FIX] Fix overzealous time truncation in span_range (kdeldycke)
-- [NEW] Humanize for time duration #232 (ybrs)
-- [NEW] Add Thai locale (sipp11)
-- [NEW] Adding Belarusian (be) locale (oire)
-- [NEW] Search date in strings (beenje)
-- [NEW] Note that arrow's tokens differ from strptime's. (offby1)
-
-0.6.0
------
-
-- [FIX] Added support for Python 3
-- [FIX] Avoid truncating oversized epoch timestamps. Fixes #216.
-- [FIX] Fixed month abbreviations for Ukrainian
-- [FIX] Fix typo timezone
-- [FIX] A couple of dialect fixes and two new languages
-- [FIX] Spanish locale: ``Miercoles`` should have acute accent
-- [Fix] Fix Finnish grammar
-- [FIX] Fix typo in 'Arrow.floor' docstring
-- [FIX] Use read() utility to open README
-- [FIX] span_range for week frame
-- [NEW] Add minimal support for fractional seconds longer than six digits.
-- [NEW] Adding locale support for Marathi (mr)
-- [NEW] Add count argument to span method
-- [NEW] Improved docs
-
-0.5.1 - 0.5.4
--------------
-
-- [FIX] test the behavior of simplejson instead of calling for_json directly (tonyseek)
-- [FIX] Add Hebrew Locale (doodyparizada)
-- [FIX] Update documentation location (andrewelkins)
-- [FIX] Update setup.py Development Status level (andrewelkins)
-- [FIX] Case insensitive month match (cshowe)
-
-0.5.0
------
-
-- [NEW] struct_time addition. (mhworth)
-- [NEW] Version grep (eirnym)
-- [NEW] Default to ISO 8601 format (emonty)
-- [NEW] Raise TypeError on comparison (sniekamp)
-- [NEW] Adding Macedonian(mk) locale (krisfremen)
-- [FIX] Fix for ISO seconds and fractional seconds (sdispater) (andrewelkins)
-- [FIX] Use correct Dutch wording for "hours" (wbolster)
-- [FIX] Complete the list of english locales (indorilftw)
-- [FIX] Change README to reStructuredText (nyuszika7h)
-- [FIX] Parse lower-cased 'h' (tamentis)
-- [FIX] Slight modifications to Dutch locale (nvie)
-
-0.4.4
------
-
-- [NEW] Include the docs in the released tarball
-- [NEW] Czech localization Czech localization for Arrow
-- [NEW] Add fa_ir to locales
-- [FIX] Fixes parsing of time strings with a final Z
-- [FIX] Fixes ISO parsing and formatting for fractional seconds
-- [FIX] test_fromtimestamp sp
-- [FIX] some typos fixed
-- [FIX] removed an unused import statement
-- [FIX] docs table fix
-- [FIX] Issue with specify 'X' template and no template at all to arrow.get
-- [FIX] Fix "import" typo in docs/index.rst
-- [FIX] Fix unit tests for zero passed
-- [FIX] Update layout.html
-- [FIX] In Norwegian and new Norwegian months and weekdays should not be capitalized
-- [FIX] Fixed discrepancy between specifying 'X' to arrow.get and specifying no template
-
-0.4.3
------
-
-- [NEW] Turkish locale (Emre)
-- [NEW] Arabic locale (Mosab Ahmad)
-- [NEW] Danish locale (Holmars)
-- [NEW] Icelandic locale (Holmars)
-- [NEW] Hindi locale (Atmb4u)
-- [NEW] Malayalam locale (Atmb4u)
-- [NEW] Finnish locale (Stormpat)
-- [NEW] Portuguese locale (Danielcorreia)
-- [NEW] ``h`` and ``hh`` strings are now supported (Averyonghub)
-- [FIX] An incorrect inflection in the Polish locale has been fixed (Avalanchy)
-- [FIX] ``arrow.get`` now properly handles ``Date`` (Jaapz)
-- [FIX] Tests are now declared in ``setup.py`` and the manifest (Pypingou)
-- [FIX] ``__version__`` has been added to ``__init__.py`` (Sametmax)
-- [FIX] ISO 8601 strings can be parsed without a separator (Ivandiguisto / Root)
-- [FIX] Documentation is now more clear regarding some inputs on ``arrow.get`` (Eriktaubeneck)
-- [FIX] Some documentation links have been fixed (Vrutsky)
-- [FIX] Error messages for parse errors are now more descriptive (Maciej Albin)
-- [FIX] The parser now correctly checks for separators in strings (Mschwager)
-
-0.4.2
------
-
-- [NEW] Factory ``get`` method now accepts a single ``Arrow`` argument.
-- [NEW] Tokens SSSS, SSSSS and SSSSSS are supported in parsing.
-- [NEW] ``Arrow`` objects have a ``float_timestamp`` property.
-- [NEW] Vietnamese locale (Iu1nguoi)
-- [NEW] Factory ``get`` method now accepts a list of format strings (Dgilland)
-- [NEW] A MANIFEST.in file has been added (Pypingou)
-- [NEW] Tests can be run directly from ``setup.py`` (Pypingou)
-- [FIX] Arrow docs now list 'day of week' format tokens correctly (Rudolphfroger)
-- [FIX] Several issues with the Korean locale have been resolved (Yoloseem)
-- [FIX] ``humanize`` now correctly returns unicode (Shvechikov)
-- [FIX] ``Arrow`` objects now pickle / unpickle correctly (Yoloseem)
-
-0.4.1
------
-
-- [NEW] Table / explanation of formatting & parsing tokens in docs
-- [NEW] Brazilian locale (Augusto2112)
-- [NEW] Dutch locale (OrangeTux)
-- [NEW] Italian locale (Pertux)
-- [NEW] Austrain locale (LeChewbacca)
-- [NEW] Tagalog locale (Marksteve)
-- [FIX] Corrected spelling and day numbers in German locale (LeChewbacca)
-- [FIX] Factory ``get`` method should now handle unicode strings correctly (Bwells)
-- [FIX] Midnight and noon should now parse and format correctly (Bwells)
-
-0.4.0
------
-
-- [NEW] Format-free ISO 8601 parsing in factory ``get`` method
-- [NEW] Support for 'week' / 'weeks' in ``span``, ``range``, ``span_range``, ``floor`` and ``ceil``
-- [NEW] Support for 'weeks' in ``replace``
-- [NEW] Norwegian locale (Martinp)
-- [NEW] Japanese locale (CortYuming)
-- [FIX] Timezones no longer show the wrong sign when formatted (Bean)
-- [FIX] Microseconds are parsed correctly from strings (Bsidhom)
-- [FIX] Locale day-of-week is no longer off by one (Cynddl)
-- [FIX] Corrected plurals of Ukrainian and Russian nouns (Catchagain)
-- [CHANGE] Old 0.1 ``arrow`` module method removed
-- [CHANGE] Dropped timestamp support in ``range`` and ``span_range`` (never worked correctly)
-- [CHANGE] Dropped parsing of single string as tz string in factory ``get`` method (replaced by ISO 8601)
-
-0.3.5
------
-
-- [NEW] French locale (Cynddl)
-- [NEW] Spanish locale (Slapresta)
-- [FIX] Ranges handle multiple timezones correctly (Ftobia)
-
-0.3.4
------
-
-- [FIX] Humanize no longer sometimes returns the wrong month delta
-- [FIX] ``__format__`` works correctly with no format string
-
-0.3.3
------
-
-- [NEW] Python 2.6 support
-- [NEW] Initial support for locale-based parsing and formatting
-- [NEW] ArrowFactory class, now proxied as the module API
-- [NEW] ``factory`` api method to obtain a factory for a custom type
-- [FIX] Python 3 support and tests completely ironed out
-
-0.3.2
------
-
-- [NEW] Python 3+ support
-
-0.3.1
------
-
-- [FIX] The old ``arrow`` module function handles timestamps correctly as it used to
-
-0.3.0
------
-
-- [NEW] ``Arrow.replace`` method
-- [NEW] Accept timestamps, datetimes and Arrows for datetime inputs, where reasonable
-- [FIX] ``range`` and ``span_range`` respect end and limit parameters correctly
-- [CHANGE] Arrow objects are no longer mutable
-- [CHANGE] Plural attribute name semantics altered: single -> absolute, plural -> relative
-- [CHANGE] Plural names no longer supported as properties (e.g. ``arrow.utcnow().years``)
-
-0.2.1
------
-
-- [NEW] Support for localized humanization
-- [NEW] English, Russian, Greek, Korean, Chinese locales
-
-0.2.0
------
-
-- **REWRITE**
-- [NEW] Date parsing
-- [NEW] Date formatting
-- [NEW] ``floor``, ``ceil`` and ``span`` methods
-- [NEW] ``datetime`` interface implementation
-- [NEW] ``clone`` method
-- [NEW] ``get``, ``now`` and ``utcnow`` API methods
-
-0.1.6
------
-
-- [NEW] Humanized time deltas
-- [NEW] ``__eq__`` implemented
-- [FIX] Issues with conversions related to daylight saving time resolved
-- [CHANGE] ``__str__`` uses ISO formatting
-
-0.1.5
------
-
-- **Started tracking changes**
-- [NEW] Parsing of ISO-formatted time zone offsets (e.g. '+02:30', '-05:00')
-- [NEW] Resolved some issues with timestamps and delta / Olson time zones
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/LICENSE b/openpype/modules/ftrack/python2_vendor/arrow/LICENSE
deleted file mode 100644
index 2bef500de7..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/LICENSE
+++ /dev/null
@@ -1,201 +0,0 @@
- Apache License
- Version 2.0, January 2004
- http://www.apache.org/licenses/
-
- TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
- 1. Definitions.
-
- "License" shall mean the terms and conditions for use, reproduction,
- and distribution as defined by Sections 1 through 9 of this document.
-
- "Licensor" shall mean the copyright owner or entity authorized by
- the copyright owner that is granting the License.
-
- "Legal Entity" shall mean the union of the acting entity and all
- other entities that control, are controlled by, or are under common
- control with that entity. For the purposes of this definition,
- "control" means (i) the power, direct or indirect, to cause the
- direction or management of such entity, whether by contract or
- otherwise, or (ii) ownership of fifty percent (50%) or more of the
- outstanding shares, or (iii) beneficial ownership of such entity.
-
- "You" (or "Your") shall mean an individual or Legal Entity
- exercising permissions granted by this License.
-
- "Source" form shall mean the preferred form for making modifications,
- including but not limited to software source code, documentation
- source, and configuration files.
-
- "Object" form shall mean any form resulting from mechanical
- transformation or translation of a Source form, including but
- not limited to compiled object code, generated documentation,
- and conversions to other media types.
-
- "Work" shall mean the work of authorship, whether in Source or
- Object form, made available under the License, as indicated by a
- copyright notice that is included in or attached to the work
- (an example is provided in the Appendix below).
-
- "Derivative Works" shall mean any work, whether in Source or Object
- form, that is based on (or derived from) the Work and for which the
- editorial revisions, annotations, elaborations, or other modifications
- represent, as a whole, an original work of authorship. For the purposes
- of this License, Derivative Works shall not include works that remain
- separable from, or merely link (or bind by name) to the interfaces of,
- the Work and Derivative Works thereof.
-
- "Contribution" shall mean any work of authorship, including
- the original version of the Work and any modifications or additions
- to that Work or Derivative Works thereof, that is intentionally
- submitted to Licensor for inclusion in the Work by the copyright owner
- or by an individual or Legal Entity authorized to submit on behalf of
- the copyright owner. For the purposes of this definition, "submitted"
- means any form of electronic, verbal, or written communication sent
- to the Licensor or its representatives, including but not limited to
- communication on electronic mailing lists, source code control systems,
- and issue tracking systems that are managed by, or on behalf of, the
- Licensor for the purpose of discussing and improving the Work, but
- excluding communication that is conspicuously marked or otherwise
- designated in writing by the copyright owner as "Not a Contribution."
-
- "Contributor" shall mean Licensor and any individual or Legal Entity
- on behalf of whom a Contribution has been received by Licensor and
- subsequently incorporated within the Work.
-
- 2. Grant of Copyright License. Subject to the terms and conditions of
- this License, each Contributor hereby grants to You a perpetual,
- worldwide, non-exclusive, no-charge, royalty-free, irrevocable
- copyright license to reproduce, prepare Derivative Works of,
- publicly display, publicly perform, sublicense, and distribute the
- Work and such Derivative Works in Source or Object form.
-
- 3. Grant of Patent License. Subject to the terms and conditions of
- this License, each Contributor hereby grants to You a perpetual,
- worldwide, non-exclusive, no-charge, royalty-free, irrevocable
- (except as stated in this section) patent license to make, have made,
- use, offer to sell, sell, import, and otherwise transfer the Work,
- where such license applies only to those patent claims licensable
- by such Contributor that are necessarily infringed by their
- Contribution(s) alone or by combination of their Contribution(s)
- with the Work to which such Contribution(s) was submitted. If You
- institute patent litigation against any entity (including a
- cross-claim or counterclaim in a lawsuit) alleging that the Work
- or a Contribution incorporated within the Work constitutes direct
- or contributory patent infringement, then any patent licenses
- granted to You under this License for that Work shall terminate
- as of the date such litigation is filed.
-
- 4. Redistribution. You may reproduce and distribute copies of the
- Work or Derivative Works thereof in any medium, with or without
- modifications, and in Source or Object form, provided that You
- meet the following conditions:
-
- (a) You must give any other recipients of the Work or
- Derivative Works a copy of this License; and
-
- (b) You must cause any modified files to carry prominent notices
- stating that You changed the files; and
-
- (c) You must retain, in the Source form of any Derivative Works
- that You distribute, all copyright, patent, trademark, and
- attribution notices from the Source form of the Work,
- excluding those notices that do not pertain to any part of
- the Derivative Works; and
-
- (d) If the Work includes a "NOTICE" text file as part of its
- distribution, then any Derivative Works that You distribute must
- include a readable copy of the attribution notices contained
- within such NOTICE file, excluding those notices that do not
- pertain to any part of the Derivative Works, in at least one
- of the following places: within a NOTICE text file distributed
- as part of the Derivative Works; within the Source form or
- documentation, if provided along with the Derivative Works; or,
- within a display generated by the Derivative Works, if and
- wherever such third-party notices normally appear. The contents
- of the NOTICE file are for informational purposes only and
- do not modify the License. You may add Your own attribution
- notices within Derivative Works that You distribute, alongside
- or as an addendum to the NOTICE text from the Work, provided
- that such additional attribution notices cannot be construed
- as modifying the License.
-
- You may add Your own copyright statement to Your modifications and
- may provide additional or different license terms and conditions
- for use, reproduction, or distribution of Your modifications, or
- for any such Derivative Works as a whole, provided Your use,
- reproduction, and distribution of the Work otherwise complies with
- the conditions stated in this License.
-
- 5. Submission of Contributions. Unless You explicitly state otherwise,
- any Contribution intentionally submitted for inclusion in the Work
- by You to the Licensor shall be under the terms and conditions of
- this License, without any additional terms or conditions.
- Notwithstanding the above, nothing herein shall supersede or modify
- the terms of any separate license agreement you may have executed
- with Licensor regarding such Contributions.
-
- 6. Trademarks. This License does not grant permission to use the trade
- names, trademarks, service marks, or product names of the Licensor,
- except as required for reasonable and customary use in describing the
- origin of the Work and reproducing the content of the NOTICE file.
-
- 7. Disclaimer of Warranty. Unless required by applicable law or
- agreed to in writing, Licensor provides the Work (and each
- Contributor provides its Contributions) on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
- implied, including, without limitation, any warranties or conditions
- of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
- PARTICULAR PURPOSE. You are solely responsible for determining the
- appropriateness of using or redistributing the Work and assume any
- risks associated with Your exercise of permissions under this License.
-
- 8. Limitation of Liability. In no event and under no legal theory,
- whether in tort (including negligence), contract, or otherwise,
- unless required by applicable law (such as deliberate and grossly
- negligent acts) or agreed to in writing, shall any Contributor be
- liable to You for damages, including any direct, indirect, special,
- incidental, or consequential damages of any character arising as a
- result of this License or out of the use or inability to use the
- Work (including but not limited to damages for loss of goodwill,
- work stoppage, computer failure or malfunction, or any and all
- other commercial damages or losses), even if such Contributor
- has been advised of the possibility of such damages.
-
- 9. Accepting Warranty or Additional Liability. While redistributing
- the Work or Derivative Works thereof, You may choose to offer,
- and charge a fee for, acceptance of support, warranty, indemnity,
- or other liability obligations and/or rights consistent with this
- License. However, in accepting such obligations, You may act only
- on Your own behalf and on Your sole responsibility, not on behalf
- of any other Contributor, and only if You agree to indemnify,
- defend, and hold each Contributor harmless for any liability
- incurred by, or claims asserted against, such Contributor by reason
- of your accepting any such warranty or additional liability.
-
- END OF TERMS AND CONDITIONS
-
- APPENDIX: How to apply the Apache License to your work.
-
- To apply the Apache License to your work, attach the following
- boilerplate notice, with the fields enclosed by brackets "[]"
- replaced with your own identifying information. (Don't include
- the brackets!) The text should be enclosed in the appropriate
- comment syntax for the file format. We also recommend that a
- file or class name and description of purpose be included on the
- same "printed page" as the copyright notice for easier
- identification within third-party archives.
-
- Copyright 2019 Chris Smith
-
- Licensed under the Apache License, Version 2.0 (the "License");
- you may not use this file except in compliance with the License.
- You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/MANIFEST.in b/openpype/modules/ftrack/python2_vendor/arrow/MANIFEST.in
deleted file mode 100644
index d9955ed96a..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/MANIFEST.in
+++ /dev/null
@@ -1,3 +0,0 @@
-include LICENSE CHANGELOG.rst README.rst Makefile requirements.txt tox.ini
-recursive-include tests *.py
-recursive-include docs *.py *.rst *.bat Makefile
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/Makefile b/openpype/modules/ftrack/python2_vendor/arrow/Makefile
deleted file mode 100644
index f294985dc6..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/Makefile
+++ /dev/null
@@ -1,44 +0,0 @@
-.PHONY: auto test docs clean
-
-auto: build38
-
-build27: PYTHON_VER = python2.7
-build35: PYTHON_VER = python3.5
-build36: PYTHON_VER = python3.6
-build37: PYTHON_VER = python3.7
-build38: PYTHON_VER = python3.8
-build39: PYTHON_VER = python3.9
-
-build27 build35 build36 build37 build38 build39: clean
- virtualenv venv --python=$(PYTHON_VER)
- . venv/bin/activate; \
- pip install -r requirements.txt; \
- pre-commit install
-
-test:
- rm -f .coverage coverage.xml
- . venv/bin/activate; pytest
-
-lint:
- . venv/bin/activate; pre-commit run --all-files --show-diff-on-failure
-
-docs:
- rm -rf docs/_build
- . venv/bin/activate; cd docs; make html
-
-clean: clean-dist
- rm -rf venv .pytest_cache ./**/__pycache__
- rm -f .coverage coverage.xml ./**/*.pyc
-
-clean-dist:
- rm -rf dist build .egg .eggs arrow.egg-info
-
-build-dist:
- . venv/bin/activate; \
- pip install -U setuptools twine wheel; \
- python setup.py sdist bdist_wheel
-
-upload-dist:
- . venv/bin/activate; twine upload dist/*
-
-publish: test clean-dist build-dist upload-dist clean-dist
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/README.rst b/openpype/modules/ftrack/python2_vendor/arrow/README.rst
deleted file mode 100644
index 69f6c50d81..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/README.rst
+++ /dev/null
@@ -1,133 +0,0 @@
-Arrow: Better dates & times for Python
-======================================
-
-.. start-inclusion-marker-do-not-remove
-
-.. image:: https://github.com/arrow-py/arrow/workflows/tests/badge.svg?branch=master
- :alt: Build Status
- :target: https://github.com/arrow-py/arrow/actions?query=workflow%3Atests+branch%3Amaster
-
-.. image:: https://codecov.io/gh/arrow-py/arrow/branch/master/graph/badge.svg
- :alt: Coverage
- :target: https://codecov.io/gh/arrow-py/arrow
-
-.. image:: https://img.shields.io/pypi/v/arrow.svg
- :alt: PyPI Version
- :target: https://pypi.python.org/pypi/arrow
-
-.. image:: https://img.shields.io/pypi/pyversions/arrow.svg
- :alt: Supported Python Versions
- :target: https://pypi.python.org/pypi/arrow
-
-.. image:: https://img.shields.io/pypi/l/arrow.svg
- :alt: License
- :target: https://pypi.python.org/pypi/arrow
-
-.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
- :alt: Code Style: Black
- :target: https://github.com/psf/black
-
-
-**Arrow** is a Python library that offers a sensible and human-friendly approach to creating, manipulating, formatting and converting dates, times and timestamps. It implements and updates the datetime type, plugging gaps in functionality and providing an intelligent module API that supports many common creation scenarios. Simply put, it helps you work with dates and times with fewer imports and a lot less code.
-
-Arrow is named after the `arrow of time <https://en.wikipedia.org/wiki/Arrow_of_time>`_ and is heavily inspired by `moment.js <https://momentjs.com>`_ and `requests <https://requests.readthedocs.io>`_.
-
-Why use Arrow over built-in modules?
-------------------------------------
-
-Python's standard library and some other low-level modules have near-complete date, time and timezone functionality, but don't work very well from a usability perspective (see the short contrast sketch after this list):
-
-- Too many modules: datetime, time, calendar, dateutil, pytz and more
-- Too many types: date, time, datetime, tzinfo, timedelta, relativedelta, etc.
-- Timezones and timestamp conversions are verbose and unpleasant
-- Timezone naivety is the norm
-- Gaps in functionality: ISO 8601 parsing, timespans, humanization
-
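For example, getting the current UTC time as US/Pacific, first with ``datetime`` plus ``dateutil`` and then with Arrow. This is a minimal sketch added for contrast (it is not part of the original README); outputs vary with the current time and are omitted:

.. code-block:: python

    >>> from datetime import datetime
    >>> from dateutil import tz
    >>> # standard library + dateutil: naive by default; attach a zone, then convert
    >>> datetime.utcnow().replace(tzinfo=tz.tzutc()).astimezone(tz.gettz('US/Pacific'))

    >>> import arrow
    >>> # Arrow: timezone-aware and UTC by default; one call to convert
    >>> arrow.utcnow().to('US/Pacific')
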
-Features
---------
-
-- Fully-implemented, drop-in replacement for datetime
-- Supports Python 2.7, 3.5, 3.6, 3.7, 3.8 and 3.9
-- Timezone-aware and UTC by default
-- Provides super-simple creation options for many common input scenarios
-- :code:`shift` method with support for relative offsets, including weeks
-- Formats and parses strings automatically
-- Wide support for ISO 8601
-- Timezone conversion
-- Timestamp available as a property
-- Generates time spans, ranges, floors and ceilings for time frames ranging from microsecond to year
-- Humanizes and supports a growing list of contributed locales
-- Extensible for your own Arrow-derived types
-
-Quick Start
------------
-
-Installation
-~~~~~~~~~~~~
-
-To install Arrow, use `pip <https://pip.pypa.io>`_ or `pipenv <https://pipenv.pypa.io>`_:
-
-.. code-block:: console
-
- $ pip install -U arrow
-
-Example Usage
-~~~~~~~~~~~~~
-
-.. code-block:: python
-
- >>> import arrow
- >>> arrow.get('2013-05-11T21:23:58.970460+07:00')
- <Arrow [2013-05-11T21:23:58.970460+07:00]>
-
- >>> utc = arrow.utcnow()
- >>> utc
- <Arrow [2013-05-11T21:23:58.970460+00:00]>
-
- >>> utc = utc.shift(hours=-1)
- >>> utc
- <Arrow [2013-05-11T20:23:58.970460+00:00]>
-
- >>> local = utc.to('US/Pacific')
- >>> local
- <Arrow [2013-05-11T13:23:58.970460-07:00]>
-
- >>> local.timestamp
- 1368303838
-
- >>> local.format()
- '2013-05-11 13:23:58 -07:00'
-
- >>> local.format('YYYY-MM-DD HH:mm:ss ZZ')
- '2013-05-11 13:23:58 -07:00'
-
- >>> local.humanize()
- 'an hour ago'
-
- >>> local.humanize(locale='ko_kr')
- '1시간 전'
-
-.. end-inclusion-marker-do-not-remove
-
-Documentation
--------------
-
-For full documentation, please visit `arrow.readthedocs.io <https://arrow.readthedocs.io>`_.
-
-Contributing
-------------
-
-Contributions are welcome for both code and localizations (adding and updating locales). Begin by gaining familiarity with the Arrow library and its features. Then, jump into contributing:
-
-#. Find an issue or feature to tackle on the `issue tracker <https://github.com/arrow-py/arrow/issues>`_. Issues marked with the `"good first issue" label <https://github.com/arrow-py/arrow/labels/good%20first%20issue>`_ may be a great place to start!
-#. Fork `this repository <https://github.com/arrow-py/arrow>`_ on GitHub and begin making changes in a branch.
-#. Add a few tests to ensure that the bug was fixed or the feature works as expected.
-#. Run the entire test suite and linting checks by running one of the following commands: :code:`tox` (if you have `tox <https://tox.readthedocs.io>`_ installed) **OR** :code:`make build38 && make test && make lint` (if you do not have Python 3.8 installed, replace :code:`build38` with the latest Python version on your system).
-#. Submit a pull request and await feedback 😃.
-
-If you have any questions along the way, feel free to ask them `here `_.
-
-Support Arrow
--------------
-
-`Open Collective <https://opencollective.com>`_ is an online funding platform that provides tools to raise money and share your finances with full transparency. It is the platform of choice for individuals and companies to make one-time or recurring donations directly to the project. If you are interested in making a financial contribution, please visit the `Arrow collective <https://opencollective.com/arrow>`_.
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/docs/Makefile b/openpype/modules/ftrack/python2_vendor/arrow/docs/Makefile
deleted file mode 100644
index d4bb2cbb9e..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/docs/Makefile
+++ /dev/null
@@ -1,20 +0,0 @@
-# Minimal makefile for Sphinx documentation
-#
-
-# You can set these variables from the command line, and also
-# from the environment for the first two.
-SPHINXOPTS ?=
-SPHINXBUILD ?= sphinx-build
-SOURCEDIR = .
-BUILDDIR = _build
-
-# Put it first so that "make" without argument is like "make help".
-help:
- @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
-
-.PHONY: help Makefile
-
-# Catch-all target: route all unknown targets to Sphinx using the new
-# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
-%: Makefile
- @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/docs/conf.py b/openpype/modules/ftrack/python2_vendor/arrow/docs/conf.py
deleted file mode 100644
index aaf3c50822..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/docs/conf.py
+++ /dev/null
@@ -1,62 +0,0 @@
-# -*- coding: utf-8 -*-
-
-# -- Path setup --------------------------------------------------------------
-
-import io
-import os
-import sys
-
-sys.path.insert(0, os.path.abspath(".."))
-
-about = {}
-with io.open("../arrow/_version.py", "r", encoding="utf-8") as f:
- exec(f.read(), about)
-
-# -- Project information -----------------------------------------------------
-
-project = u"Arrow 🏹"
-copyright = "2020, Chris Smith"
-author = "Chris Smith"
-
-release = about["__version__"]
-
-# -- General configuration ---------------------------------------------------
-
-extensions = ["sphinx.ext.autodoc"]
-
-templates_path = []
-
-exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]
-
-master_doc = "index"
-source_suffix = ".rst"
-pygments_style = "sphinx"
-
-language = None
-
-# -- Options for HTML output -------------------------------------------------
-
-html_theme = "alabaster"
-html_theme_path = []
-html_static_path = []
-
-html_show_sourcelink = False
-html_show_sphinx = False
-html_show_copyright = True
-
-# https://alabaster.readthedocs.io/en/latest/customization.html
-html_theme_options = {
- "description": "Arrow is a sensible and human-friendly approach to dates, times and timestamps.",
- "github_user": "arrow-py",
- "github_repo": "arrow",
- "github_banner": True,
- "show_related": False,
- "show_powered_by": False,
- "github_button": True,
- "github_type": "star",
- "github_count": "true", # must be a string
-}
-
-html_sidebars = {
- "**": ["about.html", "localtoc.html", "relations.html", "searchbox.html"]
-}
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/docs/index.rst b/openpype/modules/ftrack/python2_vendor/arrow/docs/index.rst
deleted file mode 100644
index e2830b04f3..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/docs/index.rst
+++ /dev/null
@@ -1,566 +0,0 @@
-Arrow: Better dates & times for Python
-======================================
-
-Release v\ |release| (`Installation`_) (`Changelog `_)
-
-.. include:: ../README.rst
- :start-after: start-inclusion-marker-do-not-remove
- :end-before: end-inclusion-marker-do-not-remove
-
-User's Guide
-------------
-
-Creation
-~~~~~~~~
-
-Get 'now' easily:
-
-.. code-block:: python
-
- >>> arrow.utcnow()
-
-
- >>> arrow.now()
-
-
- >>> arrow.now('US/Pacific')
-
-
-Create from timestamps (:code:`int` or :code:`float`):
-
-.. code-block:: python
-
- >>> arrow.get(1367900664)
- <Arrow [2013-05-07T04:24:24+00:00]>
-
- >>> arrow.get(1367900664.152325)
- <Arrow [2013-05-07T04:24:24.152325+00:00]>
-
-Use a naive or timezone-aware datetime, or flexibly specify a timezone:
-
-.. code-block:: python
-
- >>> arrow.get(datetime.utcnow())
-
-
- >>> arrow.get(datetime(2013, 5, 5), 'US/Pacific')
- <Arrow [2013-05-05T00:00:00-07:00]>
-
- >>> from dateutil import tz
- >>> arrow.get(datetime(2013, 5, 5), tz.gettz('US/Pacific'))
- <Arrow [2013-05-05T00:00:00-07:00]>
-
- >>> arrow.get(datetime.now(tz.gettz('US/Pacific')))
-
-
-Parse from a string:
-
-.. code-block:: python
-
- >>> arrow.get('2013-05-05 12:30:45', 'YYYY-MM-DD HH:mm:ss')
- <Arrow [2013-05-05T12:30:45+00:00]>
-
-Search a date in a string:
-
-.. code-block:: python
-
- >>> arrow.get('June was born in May 1980', 'MMMM YYYY')
- <Arrow [1980-05-01T00:00:00+00:00]>
-
-Some ISO 8601 compliant strings are recognized and parsed without a format string:
-
- >>> arrow.get('2013-09-30T15:34:00.000-07:00')
- <Arrow [2013-09-30T15:34:00-07:00]>
-
-Arrow objects can be instantiated directly too, with the same arguments as a datetime:
-
-.. code-block:: python
-
- >>> arrow.get(2013, 5, 5)
- <Arrow [2013-05-05T00:00:00+00:00]>
-
- >>> arrow.Arrow(2013, 5, 5)
- <Arrow [2013-05-05T00:00:00+00:00]>
-
-Properties
-~~~~~~~~~~
-
-Get a datetime or timestamp representation:
-
-.. code-block:: python
-
- >>> a = arrow.utcnow()
- >>> a.datetime
- datetime.datetime(2013, 5, 7, 4, 38, 15, 447644, tzinfo=tzutc())
-
- >>> a.timestamp
- 1367901495
-
-Get a naive datetime, and tzinfo:
-
-.. code-block:: python
-
- >>> a.naive
- datetime.datetime(2013, 5, 7, 4, 38, 15, 447644)
-
- >>> a.tzinfo
- tzutc()
-
-Get any datetime value:
-
-.. code-block:: python
-
- >>> a.year
- 2013
-
-Call datetime functions that return properties:
-
-.. code-block:: python
-
- >>> a.date()
- datetime.date(2013, 5, 7)
-
- >>> a.time()
- datetime.time(4, 38, 15, 447644)
-
-Replace & Shift
-~~~~~~~~~~~~~~~
-
-Get a new :class:`Arrow <arrow.arrow.Arrow>` object, with altered attributes, just as you would with a datetime:
-
-.. code-block:: python
-
- >>> arw = arrow.utcnow()
- >>> arw
-
-
- >>> arw.replace(hour=4, minute=40)
-
-
-Or, get one with attributes shifted forward or backward:
-
-.. code-block:: python
-
- >>> arw.shift(weeks=+3)
-
-
-Even replace the timezone without altering other attributes:
-
-.. code-block:: python
-
- >>> arw.replace(tzinfo='US/Pacific')
-
-
-Move between the earlier and later moments of an ambiguous time:
-
-.. code-block:: python
-
- >>> paris_transition = arrow.Arrow(2019, 10, 27, 2, tzinfo="Europe/Paris", fold=0)
- >>> paris_transition
- <Arrow [2019-10-27T02:00:00+02:00]>
- >>> paris_transition.ambiguous
- True
- >>> paris_transition.replace(fold=1)
- <Arrow [2019-10-27T02:00:00+01:00]>
-
-Format
-~~~~~~
-
-.. code-block:: python
-
- >>> arrow.utcnow().format('YYYY-MM-DD HH:mm:ss ZZ')
- '2013-05-07 05:23:16 -00:00'
-
-Convert
-~~~~~~~
-
-Convert from UTC to other timezones by name or tzinfo:
-
-.. code-block:: python
-
- >>> utc = arrow.utcnow()
- >>> utc
-
-
- >>> utc.to('US/Pacific')
-
-
- >>> utc.to(tz.gettz('US/Pacific'))
-
-
-Or using shorthand:
-
-.. code-block:: python
-
- >>> utc.to('local')
-
-
- >>> utc.to('local').to('utc')
-
-
-
-Humanize
-~~~~~~~~
-
-Humanize relative to now:
-
-.. code-block:: python
-
- >>> past = arrow.utcnow().shift(hours=-1)
- >>> past.humanize()
- 'an hour ago'
-
-Or another Arrow, or datetime:
-
-.. code-block:: python
-
- >>> present = arrow.utcnow()
- >>> future = present.shift(hours=2)
- >>> future.humanize(present)
- 'in 2 hours'
-
-Indicate time as relative or include only the distance
-
-.. code-block:: python
-
- >>> present = arrow.utcnow()
- >>> future = present.shift(hours=2)
- >>> future.humanize(present)
- 'in 2 hours'
- >>> future.humanize(present, only_distance=True)
- '2 hours'
-
-
-Indicate a specific time granularity (or multiple):
-
-.. code-block:: python
-
- >>> present = arrow.utcnow()
- >>> future = present.shift(minutes=66)
- >>> future.humanize(present, granularity="minute")
- 'in 66 minutes'
- >>> future.humanize(present, granularity=["hour", "minute"])
- 'in an hour and 6 minutes'
- >>> present.humanize(future, granularity=["hour", "minute"])
- 'an hour and 6 minutes ago'
- >>> future.humanize(present, only_distance=True, granularity=["hour", "minute"])
- 'an hour and 6 minutes'
-
-Support for a growing number of locales (see ``locales.py`` for supported languages):
-
-.. code-block:: python
-
-
- >>> a = arrow.utcnow()
- >>> future = a.shift(hours=2)
- >>> future.humanize(a, locale='ru')
- 'через 2 час(а,ов)'
-
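Locale names can also be resolved programmatically: ``arrow.locales.get_locale`` returns the locale instance for a supported name and raises ``ValueError`` for an unknown one. A small illustrative sketch (not from the original docs):

.. code-block:: python

    >>> from arrow import locales
    >>> locales.get_locale('ru')  # a supported name resolves to a locale instance
    <arrow.locales.RussianLocale object at 0x...>
    >>> # an unsupported name raises ValueError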
-
-Ranges & Spans
-~~~~~~~~~~~~~~
-
-Get the time span of any unit:
-
-.. code-block:: python
-
- >>> arrow.utcnow().span('hour')
- (, )
-
-Or just get the floor and ceiling:
-
-.. code-block:: python
-
- >>> arrow.utcnow().floor('hour')
-
-
- >>> arrow.utcnow().ceil('hour')
-
-
-You can also get a range of time spans:
-
-.. code-block:: python
-
- >>> start = datetime(2013, 5, 5, 12, 30)
- >>> end = datetime(2013, 5, 5, 17, 15)
- >>> for r in arrow.Arrow.span_range('hour', start, end):
- ... print(r)
- ...
- (<Arrow [2013-05-05T12:00:00+00:00]>, <Arrow [2013-05-05T12:59:59.999999+00:00]>)
- (<Arrow [2013-05-05T13:00:00+00:00]>, <Arrow [2013-05-05T13:59:59.999999+00:00]>)
- (<Arrow [2013-05-05T14:00:00+00:00]>, <Arrow [2013-05-05T14:59:59.999999+00:00]>)
- (<Arrow [2013-05-05T15:00:00+00:00]>, <Arrow [2013-05-05T15:59:59.999999+00:00]>)
- (<Arrow [2013-05-05T16:00:00+00:00]>, <Arrow [2013-05-05T16:59:59.999999+00:00]>)
-
-Or just iterate over a range of time:
-
-.. code-block:: python
-
- >>> start = datetime(2013, 5, 5, 12, 30)
- >>> end = datetime(2013, 5, 5, 17, 15)
- >>> for r in arrow.Arrow.range('hour', start, end):
- ... print(repr(r))
- ...
- <Arrow [2013-05-05T12:30:00+00:00]>
- <Arrow [2013-05-05T13:30:00+00:00]>
- <Arrow [2013-05-05T14:30:00+00:00]>
- <Arrow [2013-05-05T15:30:00+00:00]>
- <Arrow [2013-05-05T16:30:00+00:00]>
-
-.. toctree::
- :maxdepth: 2
-
-Factories
-~~~~~~~~~
-
-Use factories to harness Arrow's module API for a custom Arrow-derived type. First, derive your type:
-
-.. code-block:: python
-
- >>> class CustomArrow(arrow.Arrow):
- ...
- ... def days_till_xmas(self):
- ...
- ... xmas = arrow.Arrow(self.year, 12, 25)
- ...
- ... if self > xmas:
- ... xmas = xmas.shift(years=1)
- ...
- ... return (xmas - self).days
-
-
-Then get and use a factory for it:
-
-.. code-block:: python
-
- >>> factory = arrow.ArrowFactory(CustomArrow)
- >>> custom = factory.utcnow()
- >>> custom
-
-
- >>> custom.days_till_xmas()
- 211
-
-Supported Tokens
-~~~~~~~~~~~~~~~~
-
-Use the following tokens for parsing and formatting. Note that they are **not** the same as the tokens for `strptime <https://docs.python.org/3/library/datetime.html#strftime-and-strptime-behavior>`_:
-
-+--------------------------------+--------------+-------------------------------------------+
-| |Token |Output |
-+================================+==============+===========================================+
-|**Year** |YYYY |2000, 2001, 2002 ... 2012, 2013 |
-+--------------------------------+--------------+-------------------------------------------+
-| |YY |00, 01, 02 ... 12, 13 |
-+--------------------------------+--------------+-------------------------------------------+
-|**Month** |MMMM |January, February, March ... [#t1]_ |
-+--------------------------------+--------------+-------------------------------------------+
-| |MMM |Jan, Feb, Mar ... [#t1]_ |
-+--------------------------------+--------------+-------------------------------------------+
-| |MM |01, 02, 03 ... 11, 12 |
-+--------------------------------+--------------+-------------------------------------------+
-| |M |1, 2, 3 ... 11, 12 |
-+--------------------------------+--------------+-------------------------------------------+
-|**Day of Year** |DDDD |001, 002, 003 ... 364, 365 |
-+--------------------------------+--------------+-------------------------------------------+
-| |DDD |1, 2, 3 ... 364, 365 |
-+--------------------------------+--------------+-------------------------------------------+
-|**Day of Month** |DD |01, 02, 03 ... 30, 31 |
-+--------------------------------+--------------+-------------------------------------------+
-| |D |1, 2, 3 ... 30, 31 |
-+--------------------------------+--------------+-------------------------------------------+
-| |Do |1st, 2nd, 3rd ... 30th, 31st |
-+--------------------------------+--------------+-------------------------------------------+
-|**Day of Week** |dddd |Monday, Tuesday, Wednesday ... [#t2]_ |
-+--------------------------------+--------------+-------------------------------------------+
-| |ddd |Mon, Tue, Wed ... [#t2]_ |
-+--------------------------------+--------------+-------------------------------------------+
-| |d |1, 2, 3 ... 6, 7 |
-+--------------------------------+--------------+-------------------------------------------+
-|**ISO week date** |W |2011-W05-4, 2019-W17 |
-+--------------------------------+--------------+-------------------------------------------+
-|**Hour** |HH |00, 01, 02 ... 23, 24 |
-+--------------------------------+--------------+-------------------------------------------+
-| |H |0, 1, 2 ... 23, 24 |
-+--------------------------------+--------------+-------------------------------------------+
-| |hh |01, 02, 03 ... 11, 12 |
-+--------------------------------+--------------+-------------------------------------------+
-| |h |1, 2, 3 ... 11, 12 |
-+--------------------------------+--------------+-------------------------------------------+
-|**AM / PM** |A |AM, PM, am, pm [#t1]_ |
-+--------------------------------+--------------+-------------------------------------------+
-| |a |am, pm [#t1]_ |
-+--------------------------------+--------------+-------------------------------------------+
-|**Minute** |mm |00, 01, 02 ... 58, 59 |
-+--------------------------------+--------------+-------------------------------------------+
-| |m |0, 1, 2 ... 58, 59 |
-+--------------------------------+--------------+-------------------------------------------+
-|**Second** |ss |00, 01, 02 ... 58, 59 |
-+--------------------------------+--------------+-------------------------------------------+
-| |s |0, 1, 2 ... 58, 59 |
-+--------------------------------+--------------+-------------------------------------------+
-|**Sub-second** |S... |0, 02, 003, 000006, 123123123123... [#t3]_ |
-+--------------------------------+--------------+-------------------------------------------+
-|**Timezone** |ZZZ |Asia/Baku, Europe/Warsaw, GMT ... [#t4]_ |
-+--------------------------------+--------------+-------------------------------------------+
-| |ZZ |-07:00, -06:00 ... +06:00, +07:00, +08, Z |
-+--------------------------------+--------------+-------------------------------------------+
-| |Z |-0700, -0600 ... +0600, +0700, +08, Z |
-+--------------------------------+--------------+-------------------------------------------+
-|**Seconds Timestamp** |X |1381685817, 1381685817.915482 ... [#t5]_ |
-+--------------------------------+--------------+-------------------------------------------+
-|**ms or µs Timestamp** |x |1569980330813, 1569980330813221 |
-+--------------------------------+--------------+-------------------------------------------+
-
-.. rubric:: Footnotes
-
-.. [#t1] localization support for parsing and formatting
-.. [#t2] localization support only for formatting
-.. [#t3] the result is truncated to microseconds, with `half-to-even rounding <https://en.wikipedia.org/wiki/Rounding#Round_half_to_even>`_.
-.. [#t4] timezone names from `tz database <https://en.wikipedia.org/wiki/Tz_database>`_ provided via dateutil package, note that abbreviations such as MST, PDT, BRST are unlikely to parse due to ambiguity. Use the full IANA zone name instead (Asia/Shanghai, Europe/London, America/Chicago etc).
-.. [#t5] this token cannot be used for parsing timestamps out of natural language strings due to compatibility reasons
-
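For instance, several of the tokens above can be combined in a single ``format`` call. This is an added illustration (not from the original docs), with a fixed date so the output is deterministic:

.. code-block:: python

    >>> arw = arrow.Arrow(2013, 5, 5)
    >>> arw.format('dddd, MMMM Do YYYY')  # weekday, month name, ordinal day
    'Sunday, May 5th 2013'
    >>> arw.format('W')                   # ISO week date
    '2013-W18-7'
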
-Built-in Formats
-++++++++++++++++
-
-There are several formatting standards that are provided as built-in tokens.
-
-.. code-block:: python
-
- >>> arw = arrow.utcnow()
- >>> arw.format(arrow.FORMAT_ATOM)
- '2020-05-27 10:30:35+00:00'
- >>> arw.format(arrow.FORMAT_COOKIE)
- 'Wednesday, 27-May-2020 10:30:35 UTC'
- >>> arw.format(arrow.FORMAT_RSS)
- 'Wed, 27 May 2020 10:30:35 +0000'
- >>> arw.format(arrow.FORMAT_RFC822)
- 'Wed, 27 May 20 10:30:35 +0000'
- >>> arw.format(arrow.FORMAT_RFC850)
- 'Wednesday, 27-May-20 10:30:35 UTC'
- >>> arw.format(arrow.FORMAT_RFC1036)
- 'Wed, 27 May 20 10:30:35 +0000'
- >>> arw.format(arrow.FORMAT_RFC1123)
- 'Wed, 27 May 2020 10:30:35 +0000'
- >>> arw.format(arrow.FORMAT_RFC2822)
- 'Wed, 27 May 2020 10:30:35 +0000'
- >>> arw.format(arrow.FORMAT_RFC3339)
- '2020-05-27 10:30:35+00:00'
- >>> arw.format(arrow.FORMAT_W3C)
- '2020-05-27 10:30:35+00:00'
-
-Escaping Formats
-~~~~~~~~~~~~~~~~
-
-Tokens, phrases, and regular expressions in a format string can be escaped when parsing and formatting by enclosing them within square brackets.
-
-Tokens & Phrases
-++++++++++++++++
-
-Any `token `_ or phrase can be escaped as follows:
-
-.. code-block:: python
-
- >>> fmt = "YYYY-MM-DD h [h] m"
- >>> arw = arrow.get("2018-03-09 8 h 40", fmt)
-
- >>> arw.format(fmt)
- '2018-03-09 8 h 40'
-
- >>> fmt = "YYYY-MM-DD h [hello] m"
- >>> arw = arrow.get("2018-03-09 8 hello 40", fmt)
-
- >>> arw.format(fmt)
- '2018-03-09 8 hello 40'
-
- >>> fmt = "YYYY-MM-DD h [hello world] m"
- >>> arw = arrow.get("2018-03-09 8 hello world 40", fmt)
-
- >>> arw.format(fmt)
- '2018-03-09 8 hello world 40'
-
-This can be useful for parsing dates in different locales such as French, in which it is common to format time strings as "8 h 40" rather than "8:40".
-
-Regular Expressions
-+++++++++++++++++++
-
-You can also escape regular expressions by enclosing them within square brackets. In the following example, we are using the regular expression :code:`\s+` to match any number of whitespace characters that separate the tokens. This is useful if you do not know the number of spaces between tokens ahead of time (e.g. in log files).
-
-.. code-block:: python
-
- >>> fmt = r"ddd[\s+]MMM[\s+]DD[\s+]HH:mm:ss[\s+]YYYY"
- >>> arrow.get("Mon Sep 08 16:41:45 2014", fmt)
-
-
- >>> arrow.get("Mon \tSep 08 16:41:45 2014", fmt)
-
-
- >>> arrow.get("Mon Sep 08 16:41:45 2014", fmt)
-
-
-Punctuation
-~~~~~~~~~~~
-
-Date and time formats may be fenced on either side by one punctuation character from the following list: ``, . ; : ? ! " \` ' [ ] { } ( ) < >``
-
-.. code-block:: python
-
- >>> arrow.get("Cool date: 2019-10-31T09:12:45.123456+04:30.", "YYYY-MM-DDTHH:mm:ss.SZZ")
-
-
- >>> arrow.get("Tomorrow (2019-10-31) is Halloween!", "YYYY-MM-DD")
-
-
- >>> arrow.get("Halloween is on 2019.10.31.", "YYYY.MM.DD")
-
-
- >>> arrow.get("It's Halloween tomorrow (2019-10-31)!", "YYYY-MM-DD")
- # Raises exception because there are multiple punctuation marks following the date
-
-Redundant Whitespace
-~~~~~~~~~~~~~~~~~~~~
-
-Redundant whitespace characters (spaces, tabs, and newlines) can be normalized automatically by passing in the ``normalize_whitespace`` flag to ``arrow.get``:
-
-.. code-block:: python
-
- >>> arrow.get('\t \n 2013-05-05T12:30:45.123456 \t \n', normalize_whitespace=True)
- <Arrow [2013-05-05T12:30:45.123456+00:00]>
-
- >>> arrow.get('2013-05-05 T \n 12:30:45\t123456', 'YYYY-MM-DD T HH:mm:ss S', normalize_whitespace=True)
- <Arrow [2013-05-05T12:30:45.123456+00:00]>
-
-API Guide
----------
-
-arrow.arrow
-~~~~~~~~~~~
-
-.. automodule:: arrow.arrow
- :members:
-
-arrow.factory
-~~~~~~~~~~~~~
-
-.. automodule:: arrow.factory
- :members:
-
-arrow.api
-~~~~~~~~~
-
-.. automodule:: arrow.api
- :members:
-
-arrow.locale
-~~~~~~~~~~~~
-
-.. automodule:: arrow.locales
- :members:
- :undoc-members:
-
-Release History
----------------
-
-.. toctree::
- :maxdepth: 2
-
- releases
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/docs/make.bat b/openpype/modules/ftrack/python2_vendor/arrow/docs/make.bat
deleted file mode 100644
index 922152e96a..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/docs/make.bat
+++ /dev/null
@@ -1,35 +0,0 @@
-@ECHO OFF
-
-pushd %~dp0
-
-REM Command file for Sphinx documentation
-
-if "%SPHINXBUILD%" == "" (
- set SPHINXBUILD=sphinx-build
-)
-set SOURCEDIR=.
-set BUILDDIR=_build
-
-if "%1" == "" goto help
-
-%SPHINXBUILD% >NUL 2>NUL
-if errorlevel 9009 (
- echo.
- echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
- echo.installed, then set the SPHINXBUILD environment variable to point
- echo.to the full path of the 'sphinx-build' executable. Alternatively you
- echo.may add the Sphinx directory to PATH.
- echo.
- echo.If you don't have Sphinx installed, grab it from
- echo.http://sphinx-doc.org/
- exit /b 1
-)
-
-%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
-goto end
-
-:help
-%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
-
-:end
-popd
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/docs/releases.rst b/openpype/modules/ftrack/python2_vendor/arrow/docs/releases.rst
deleted file mode 100644
index 22e1e59c8c..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/docs/releases.rst
+++ /dev/null
@@ -1,3 +0,0 @@
-.. _releases:
-
-.. include:: ../CHANGELOG.rst
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/requirements.txt b/openpype/modules/ftrack/python2_vendor/arrow/requirements.txt
deleted file mode 100644
index df565d8384..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/requirements.txt
+++ /dev/null
@@ -1,14 +0,0 @@
-backports.functools_lru_cache==1.6.1; python_version == "2.7"
-dateparser==0.7.*
-pre-commit==1.21.*; python_version <= "3.5"
-pre-commit==2.6.*; python_version >= "3.6"
-pytest==4.6.*; python_version == "2.7"
-pytest==6.0.*; python_version >= "3.5"
-pytest-cov==2.10.*
-pytest-mock==2.0.*; python_version == "2.7"
-pytest-mock==3.2.*; python_version >= "3.5"
-python-dateutil==2.8.*
-pytz==2019.*
-simplejson==3.17.*
-sphinx==1.8.*; python_version == "2.7"
-sphinx==3.2.*; python_version >= "3.5"
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/setup.cfg b/openpype/modules/ftrack/python2_vendor/arrow/setup.cfg
deleted file mode 100644
index 2a9acf13da..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/setup.cfg
+++ /dev/null
@@ -1,2 +0,0 @@
-[bdist_wheel]
-universal = 1
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/setup.py b/openpype/modules/ftrack/python2_vendor/arrow/setup.py
deleted file mode 100644
index dc4f0e77d5..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/setup.py
+++ /dev/null
@@ -1,50 +0,0 @@
-# -*- coding: utf-8 -*-
-import io
-
-from setuptools import setup
-
-with io.open("README.rst", "r", encoding="utf-8") as f:
- readme = f.read()
-
-about = {}
-with io.open("arrow/_version.py", "r", encoding="utf-8") as f:
- exec(f.read(), about)
-
-setup(
- name="arrow",
- version=about["__version__"],
- description="Better dates & times for Python",
- long_description=readme,
- long_description_content_type="text/x-rst",
- url="https://arrow.readthedocs.io",
- author="Chris Smith",
- author_email="crsmithdev@gmail.com",
- license="Apache 2.0",
- packages=["arrow"],
- zip_safe=False,
- python_requires=">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*",
- install_requires=[
- "python-dateutil>=2.7.0",
- "backports.functools_lru_cache>=1.2.1;python_version=='2.7'",
- ],
- classifiers=[
- "Development Status :: 4 - Beta",
- "Intended Audience :: Developers",
- "License :: OSI Approved :: Apache Software License",
- "Topic :: Software Development :: Libraries :: Python Modules",
- "Programming Language :: Python :: 2",
- "Programming Language :: Python :: 2.7",
- "Programming Language :: Python :: 3",
- "Programming Language :: Python :: 3.5",
- "Programming Language :: Python :: 3.6",
- "Programming Language :: Python :: 3.7",
- "Programming Language :: Python :: 3.8",
- "Programming Language :: Python :: 3.9",
- ],
- keywords="arrow date time datetime timestamp timezone humanize",
- project_urls={
- "Repository": "https://github.com/arrow-py/arrow",
- "Bug Reports": "https://github.com/arrow-py/arrow/issues",
- "Documentation": "https://arrow.readthedocs.io",
- },
-)
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/tests/conftest.py b/openpype/modules/ftrack/python2_vendor/arrow/tests/conftest.py
deleted file mode 100644
index 5bc8a4af2e..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/tests/conftest.py
+++ /dev/null
@@ -1,76 +0,0 @@
-# -*- coding: utf-8 -*-
-from datetime import datetime
-
-import pytest
-from dateutil import tz as dateutil_tz
-
-from arrow import arrow, factory, formatter, locales, parser
-
-
-@pytest.fixture(scope="class")
-def time_utcnow(request):
- request.cls.arrow = arrow.Arrow.utcnow()
-
-
-@pytest.fixture(scope="class")
-def time_2013_01_01(request):
- request.cls.now = arrow.Arrow.utcnow()
- request.cls.arrow = arrow.Arrow(2013, 1, 1)
- request.cls.datetime = datetime(2013, 1, 1)
-
-
-@pytest.fixture(scope="class")
-def time_2013_02_03(request):
- request.cls.arrow = arrow.Arrow(2013, 2, 3, 12, 30, 45, 1)
-
-
-@pytest.fixture(scope="class")
-def time_2013_02_15(request):
- request.cls.datetime = datetime(2013, 2, 15, 3, 41, 22, 8923)
- request.cls.arrow = arrow.Arrow.fromdatetime(request.cls.datetime)
-
-
-@pytest.fixture(scope="class")
-def time_1975_12_25(request):
- request.cls.datetime = datetime(
- 1975, 12, 25, 14, 15, 16, tzinfo=dateutil_tz.gettz("America/New_York")
- )
- request.cls.arrow = arrow.Arrow.fromdatetime(request.cls.datetime)
-
-
-@pytest.fixture(scope="class")
-def arrow_formatter(request):
- request.cls.formatter = formatter.DateTimeFormatter()
-
-
-@pytest.fixture(scope="class")
-def arrow_factory(request):
- request.cls.factory = factory.ArrowFactory()
-
-
-@pytest.fixture(scope="class")
-def lang_locales(request):
- request.cls.locales = locales._locales
-
-
-@pytest.fixture(scope="class")
-def lang_locale(request):
- # As locale test classes are prefixed with Test, we are dynamically getting the locale by the test class name.
- # TestEnglishLocale -> EnglishLocale
- name = request.cls.__name__[4:]
- request.cls.locale = locales.get_locale_by_class_name(name)
-
-
-@pytest.fixture(scope="class")
-def dt_parser(request):
- request.cls.parser = parser.DateTimeParser()
-
-
-@pytest.fixture(scope="class")
-def dt_parser_regex(request):
- request.cls.format_regex = parser.DateTimeParser._FORMAT_RE
-
-
-@pytest.fixture(scope="class")
-def tzinfo_parser(request):
- request.cls.parser = parser.TzinfoParser()
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_api.py b/openpype/modules/ftrack/python2_vendor/arrow/tests/test_api.py
deleted file mode 100644
index 9b19a27cd9..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_api.py
+++ /dev/null
@@ -1,28 +0,0 @@
-# -*- coding: utf-8 -*-
-import arrow
-
-
-class TestModule:
- def test_get(self, mocker):
- mocker.patch("arrow.api._factory.get", return_value="result")
-
- assert arrow.api.get() == "result"
-
- def test_utcnow(self, mocker):
- mocker.patch("arrow.api._factory.utcnow", return_value="utcnow")
-
- assert arrow.api.utcnow() == "utcnow"
-
- def test_now(self, mocker):
- mocker.patch("arrow.api._factory.now", tz="tz", return_value="now")
-
- assert arrow.api.now("tz") == "now"
-
- def test_factory(self):
- class MockCustomArrowClass(arrow.Arrow):
- pass
-
- result = arrow.api.factory(MockCustomArrowClass)
-
- assert isinstance(result, arrow.factory.ArrowFactory)
- assert isinstance(result.utcnow(), MockCustomArrowClass)
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_arrow.py b/openpype/modules/ftrack/python2_vendor/arrow/tests/test_arrow.py
deleted file mode 100644
index b0bd20a5e3..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_arrow.py
+++ /dev/null
@@ -1,2150 +0,0 @@
-# -*- coding: utf-8 -*-
-from __future__ import absolute_import, unicode_literals
-
-import calendar
-import pickle
-import sys
-import time
-from datetime import date, datetime, timedelta
-
-import dateutil
-import pytest
-import pytz
-import simplejson as json
-from dateutil import tz
-from dateutil.relativedelta import FR, MO, SA, SU, TH, TU, WE
-
-from arrow import arrow
-
-from .utils import assert_datetime_equality
-
-
-class TestTestArrowInit:
- def test_init_bad_input(self):
-
- with pytest.raises(TypeError):
- arrow.Arrow(2013)
-
- with pytest.raises(TypeError):
- arrow.Arrow(2013, 2)
-
- with pytest.raises(ValueError):
- arrow.Arrow(2013, 2, 2, 12, 30, 45, 9999999)
-
- def test_init(self):
-
- result = arrow.Arrow(2013, 2, 2)
- self.expected = datetime(2013, 2, 2, tzinfo=tz.tzutc())
- assert result._datetime == self.expected
-
- result = arrow.Arrow(2013, 2, 2, 12)
- self.expected = datetime(2013, 2, 2, 12, tzinfo=tz.tzutc())
- assert result._datetime == self.expected
-
- result = arrow.Arrow(2013, 2, 2, 12, 30)
- self.expected = datetime(2013, 2, 2, 12, 30, tzinfo=tz.tzutc())
- assert result._datetime == self.expected
-
- result = arrow.Arrow(2013, 2, 2, 12, 30, 45)
- self.expected = datetime(2013, 2, 2, 12, 30, 45, tzinfo=tz.tzutc())
- assert result._datetime == self.expected
-
- result = arrow.Arrow(2013, 2, 2, 12, 30, 45, 999999)
- self.expected = datetime(2013, 2, 2, 12, 30, 45, 999999, tzinfo=tz.tzutc())
- assert result._datetime == self.expected
-
- result = arrow.Arrow(
- 2013, 2, 2, 12, 30, 45, 999999, tzinfo=tz.gettz("Europe/Paris")
- )
- self.expected = datetime(
- 2013, 2, 2, 12, 30, 45, 999999, tzinfo=tz.gettz("Europe/Paris")
- )
- assert result._datetime == self.expected
-
- # regression tests for issue #626
- def test_init_pytz_timezone(self):
-
- result = arrow.Arrow(
- 2013, 2, 2, 12, 30, 45, 999999, tzinfo=pytz.timezone("Europe/Paris")
- )
- self.expected = datetime(
- 2013, 2, 2, 12, 30, 45, 999999, tzinfo=tz.gettz("Europe/Paris")
- )
- assert result._datetime == self.expected
- assert_datetime_equality(result._datetime, self.expected, 1)
-
- def test_init_with_fold(self):
- before = arrow.Arrow(2017, 10, 29, 2, 0, tzinfo="Europe/Stockholm")
- after = arrow.Arrow(2017, 10, 29, 2, 0, tzinfo="Europe/Stockholm", fold=1)
-
- assert hasattr(before, "fold")
- assert hasattr(after, "fold")
-
- # PEP-495 requires the comparisons below to be true
- assert before == after
- assert before.utcoffset() != after.utcoffset()
-
-
-class TestTestArrowFactory:
- def test_now(self):
-
- result = arrow.Arrow.now()
-
- assert_datetime_equality(
- result._datetime, datetime.now().replace(tzinfo=tz.tzlocal())
- )
-
- def test_utcnow(self):
-
- result = arrow.Arrow.utcnow()
-
- assert_datetime_equality(
- result._datetime, datetime.utcnow().replace(tzinfo=tz.tzutc())
- )
-
- assert result.fold == 0
-
- def test_fromtimestamp(self):
-
- timestamp = time.time()
-
- result = arrow.Arrow.fromtimestamp(timestamp)
- assert_datetime_equality(
- result._datetime, datetime.now().replace(tzinfo=tz.tzlocal())
- )
-
- result = arrow.Arrow.fromtimestamp(timestamp, tzinfo=tz.gettz("Europe/Paris"))
- assert_datetime_equality(
- result._datetime,
- datetime.fromtimestamp(timestamp, tz.gettz("Europe/Paris")),
- )
-
- result = arrow.Arrow.fromtimestamp(timestamp, tzinfo="Europe/Paris")
- assert_datetime_equality(
- result._datetime,
- datetime.fromtimestamp(timestamp, tz.gettz("Europe/Paris")),
- )
-
- with pytest.raises(ValueError):
- arrow.Arrow.fromtimestamp("invalid timestamp")
-
- def test_utcfromtimestamp(self):
-
- timestamp = time.time()
-
- result = arrow.Arrow.utcfromtimestamp(timestamp)
- assert_datetime_equality(
- result._datetime, datetime.utcnow().replace(tzinfo=tz.tzutc())
- )
-
- with pytest.raises(ValueError):
- arrow.Arrow.utcfromtimestamp("invalid timestamp")
-
- def test_fromdatetime(self):
-
- dt = datetime(2013, 2, 3, 12, 30, 45, 1)
-
- result = arrow.Arrow.fromdatetime(dt)
-
- assert result._datetime == dt.replace(tzinfo=tz.tzutc())
-
- def test_fromdatetime_dt_tzinfo(self):
-
- dt = datetime(2013, 2, 3, 12, 30, 45, 1, tzinfo=tz.gettz("US/Pacific"))
-
- result = arrow.Arrow.fromdatetime(dt)
-
- assert result._datetime == dt.replace(tzinfo=tz.gettz("US/Pacific"))
-
- def test_fromdatetime_tzinfo_arg(self):
-
- dt = datetime(2013, 2, 3, 12, 30, 45, 1)
-
- result = arrow.Arrow.fromdatetime(dt, tz.gettz("US/Pacific"))
-
- assert result._datetime == dt.replace(tzinfo=tz.gettz("US/Pacific"))
-
- def test_fromdate(self):
-
- dt = date(2013, 2, 3)
-
- result = arrow.Arrow.fromdate(dt, tz.gettz("US/Pacific"))
-
- assert result._datetime == datetime(2013, 2, 3, tzinfo=tz.gettz("US/Pacific"))
-
- def test_strptime(self):
-
- formatted = datetime(2013, 2, 3, 12, 30, 45).strftime("%Y-%m-%d %H:%M:%S")
-
- result = arrow.Arrow.strptime(formatted, "%Y-%m-%d %H:%M:%S")
- assert result._datetime == datetime(2013, 2, 3, 12, 30, 45, tzinfo=tz.tzutc())
-
- result = arrow.Arrow.strptime(
- formatted, "%Y-%m-%d %H:%M:%S", tzinfo=tz.gettz("Europe/Paris")
- )
- assert result._datetime == datetime(
- 2013, 2, 3, 12, 30, 45, tzinfo=tz.gettz("Europe/Paris")
- )
-
-
-@pytest.mark.usefixtures("time_2013_02_03")
-class TestTestArrowRepresentation:
- def test_repr(self):
-
- result = self.arrow.__repr__()
-
- assert result == "".format(self.arrow._datetime.isoformat())
-
- def test_str(self):
-
- result = self.arrow.__str__()
-
- assert result == self.arrow._datetime.isoformat()
-
- def test_hash(self):
-
- result = self.arrow.__hash__()
-
- assert result == self.arrow._datetime.__hash__()
-
- def test_format(self):
-
- result = "{:YYYY-MM-DD}".format(self.arrow)
-
- assert result == "2013-02-03"
-
- def test_bare_format(self):
-
- result = self.arrow.format()
-
- assert result == "2013-02-03 12:30:45+00:00"
-
- def test_format_no_format_string(self):
-
- result = "{}".format(self.arrow)
-
- assert result == str(self.arrow)
-
- def test_clone(self):
-
- result = self.arrow.clone()
-
- assert result is not self.arrow
- assert result._datetime == self.arrow._datetime
-
-
-@pytest.mark.usefixtures("time_2013_01_01")
-class TestArrowAttribute:
- def test_getattr_base(self):
-
- with pytest.raises(AttributeError):
- self.arrow.prop
-
- def test_getattr_week(self):
-
- assert self.arrow.week == 1
-
- def test_getattr_quarter(self):
- # start dates
- q1 = arrow.Arrow(2013, 1, 1)
- q2 = arrow.Arrow(2013, 4, 1)
- q3 = arrow.Arrow(2013, 8, 1)
- q4 = arrow.Arrow(2013, 10, 1)
- assert q1.quarter == 1
- assert q2.quarter == 2
- assert q3.quarter == 3
- assert q4.quarter == 4
-
- # end dates
- q1 = arrow.Arrow(2013, 3, 31)
- q2 = arrow.Arrow(2013, 6, 30)
- q3 = arrow.Arrow(2013, 9, 30)
- q4 = arrow.Arrow(2013, 12, 31)
- assert q1.quarter == 1
- assert q2.quarter == 2
- assert q3.quarter == 3
- assert q4.quarter == 4
-
- def test_getattr_dt_value(self):
-
- assert self.arrow.year == 2013
-
- def test_tzinfo(self):
-
- self.arrow.tzinfo = tz.gettz("PST")
- assert self.arrow.tzinfo == tz.gettz("PST")
-
- def test_naive(self):
-
- assert self.arrow.naive == self.arrow._datetime.replace(tzinfo=None)
-
- def test_timestamp(self):
-
- assert self.arrow.timestamp == calendar.timegm(
- self.arrow._datetime.utctimetuple()
- )
-
- with pytest.warns(DeprecationWarning):
- self.arrow.timestamp
-
- def test_int_timestamp(self):
-
- assert self.arrow.int_timestamp == calendar.timegm(
- self.arrow._datetime.utctimetuple()
- )
-
- def test_float_timestamp(self):
-
- result = self.arrow.float_timestamp - self.arrow.timestamp
-
- assert result == self.arrow.microsecond
-
- def test_getattr_fold(self):
-
- # UTC is always unambiguous
- assert self.now.fold == 0
-
- ambiguous_dt = arrow.Arrow(
- 2017, 10, 29, 2, 0, tzinfo="Europe/Stockholm", fold=1
- )
- assert ambiguous_dt.fold == 1
-
- with pytest.raises(AttributeError):
- ambiguous_dt.fold = 0
-
- def test_getattr_ambiguous(self):
-
- assert not self.now.ambiguous
-
- ambiguous_dt = arrow.Arrow(2017, 10, 29, 2, 0, tzinfo="Europe/Stockholm")
-
- assert ambiguous_dt.ambiguous
-
- def test_getattr_imaginary(self):
-
- assert not self.now.imaginary
-
- imaginary_dt = arrow.Arrow(2013, 3, 31, 2, 30, tzinfo="Europe/Paris")
-
- assert imaginary_dt.imaginary
-
-
-@pytest.mark.usefixtures("time_utcnow")
-class TestArrowComparison:
- def test_eq(self):
-
- assert self.arrow == self.arrow
- assert self.arrow == self.arrow.datetime
- assert not (self.arrow == "abc")
-
- def test_ne(self):
-
- assert not (self.arrow != self.arrow)
- assert not (self.arrow != self.arrow.datetime)
- assert self.arrow != "abc"
-
- def test_gt(self):
-
- arrow_cmp = self.arrow.shift(minutes=1)
-
- assert not (self.arrow > self.arrow)
- assert not (self.arrow > self.arrow.datetime)
-
- with pytest.raises(TypeError):
- self.arrow > "abc"
-
- assert self.arrow < arrow_cmp
- assert self.arrow < arrow_cmp.datetime
-
- def test_ge(self):
-
- with pytest.raises(TypeError):
- self.arrow >= "abc"
-
- assert self.arrow >= self.arrow
- assert self.arrow >= self.arrow.datetime
-
- def test_lt(self):
-
- arrow_cmp = self.arrow.shift(minutes=1)
-
- assert not (self.arrow < self.arrow)
- assert not (self.arrow < self.arrow.datetime)
-
- with pytest.raises(TypeError):
- self.arrow < "abc"
-
- assert self.arrow < arrow_cmp
- assert self.arrow < arrow_cmp.datetime
-
- def test_le(self):
-
- with pytest.raises(TypeError):
- self.arrow <= "abc"
-
- assert self.arrow <= self.arrow
- assert self.arrow <= self.arrow.datetime
-
-
-@pytest.mark.usefixtures("time_2013_01_01")
-class TestArrowMath:
- def test_add_timedelta(self):
-
- result = self.arrow.__add__(timedelta(days=1))
-
- assert result._datetime == datetime(2013, 1, 2, tzinfo=tz.tzutc())
-
- def test_add_other(self):
-
- with pytest.raises(TypeError):
- self.arrow + 1
-
- def test_radd(self):
-
- result = self.arrow.__radd__(timedelta(days=1))
-
- assert result._datetime == datetime(2013, 1, 2, tzinfo=tz.tzutc())
-
- def test_sub_timedelta(self):
-
- result = self.arrow.__sub__(timedelta(days=1))
-
- assert result._datetime == datetime(2012, 12, 31, tzinfo=tz.tzutc())
-
- def test_sub_datetime(self):
-
- result = self.arrow.__sub__(datetime(2012, 12, 21, tzinfo=tz.tzutc()))
-
- assert result == timedelta(days=11)
-
- def test_sub_arrow(self):
-
- result = self.arrow.__sub__(arrow.Arrow(2012, 12, 21, tzinfo=tz.tzutc()))
-
- assert result == timedelta(days=11)
-
- def test_sub_other(self):
-
- with pytest.raises(TypeError):
- self.arrow - object()
-
- def test_rsub_datetime(self):
-
- result = self.arrow.__rsub__(datetime(2012, 12, 21, tzinfo=tz.tzutc()))
-
- assert result == timedelta(days=-11)
-
- def test_rsub_other(self):
-
- with pytest.raises(TypeError):
- timedelta(days=1) - self.arrow
-
-
-@pytest.mark.usefixtures("time_utcnow")
-class TestArrowDatetimeInterface:
- def test_date(self):
-
- result = self.arrow.date()
-
- assert result == self.arrow._datetime.date()
-
- def test_time(self):
-
- result = self.arrow.time()
-
- assert result == self.arrow._datetime.time()
-
- def test_timetz(self):
-
- result = self.arrow.timetz()
-
- assert result == self.arrow._datetime.timetz()
-
- def test_astimezone(self):
-
- other_tz = tz.gettz("US/Pacific")
-
- result = self.arrow.astimezone(other_tz)
-
- assert result == self.arrow._datetime.astimezone(other_tz)
-
- def test_utcoffset(self):
-
- result = self.arrow.utcoffset()
-
- assert result == self.arrow._datetime.utcoffset()
-
- def test_dst(self):
-
- result = self.arrow.dst()
-
- assert result == self.arrow._datetime.dst()
-
- def test_timetuple(self):
-
- result = self.arrow.timetuple()
-
- assert result == self.arrow._datetime.timetuple()
-
- def test_utctimetuple(self):
-
- result = self.arrow.utctimetuple()
-
- assert result == self.arrow._datetime.utctimetuple()
-
- def test_toordinal(self):
-
- result = self.arrow.toordinal()
-
- assert result == self.arrow._datetime.toordinal()
-
- def test_weekday(self):
-
- result = self.arrow.weekday()
-
- assert result == self.arrow._datetime.weekday()
-
- def test_isoweekday(self):
-
- result = self.arrow.isoweekday()
-
- assert result == self.arrow._datetime.isoweekday()
-
- def test_isocalendar(self):
-
- result = self.arrow.isocalendar()
-
- assert result == self.arrow._datetime.isocalendar()
-
- def test_isoformat(self):
-
- result = self.arrow.isoformat()
-
- assert result == self.arrow._datetime.isoformat()
-
- def test_simplejson(self):
-
- result = json.dumps({"v": self.arrow.for_json()}, for_json=True)
-
- assert json.loads(result)["v"] == self.arrow._datetime.isoformat()
-
- def test_ctime(self):
-
- result = self.arrow.ctime()
-
- assert result == self.arrow._datetime.ctime()
-
- def test_strftime(self):
-
- result = self.arrow.strftime("%Y")
-
- assert result == self.arrow._datetime.strftime("%Y")
-
-
-class TestArrowFalsePositiveDst:
- """These tests relate to issues #376 and #551.
- The key points in both issues are that arrow will assign a UTC timezone if none is provided and
- .to() will change other attributes to be correct whereas .replace() only changes the specified attribute.
-
- Issue 376
- >>> arrow.get('2016-11-06').to('America/New_York').ceil('day')
- < Arrow [2016-11-05T23:59:59.999999-04:00] >
-
- Issue 551
- >>> just_before = arrow.get('2018-11-04T01:59:59.999999')
- >>> just_before
- 2018-11-04T01:59:59.999999+00:00
- >>> just_after = just_before.shift(microseconds=1)
- >>> just_after
- 2018-11-04T02:00:00+00:00
- >>> just_before_eastern = just_before.replace(tzinfo='US/Eastern')
- >>> just_before_eastern
- 2018-11-04T01:59:59.999999-04:00
- >>> just_after_eastern = just_after.replace(tzinfo='US/Eastern')
- >>> just_after_eastern
- 2018-11-04T02:00:00-05:00
- """
-
- def test_dst(self):
- self.before_1 = arrow.Arrow(
- 2016, 11, 6, 3, 59, tzinfo=tz.gettz("America/New_York")
- )
- self.before_2 = arrow.Arrow(2016, 11, 6, tzinfo=tz.gettz("America/New_York"))
- self.after_1 = arrow.Arrow(2016, 11, 6, 4, tzinfo=tz.gettz("America/New_York"))
- self.after_2 = arrow.Arrow(
- 2016, 11, 6, 23, 59, tzinfo=tz.gettz("America/New_York")
- )
- self.before_3 = arrow.Arrow(
- 2018, 11, 4, 3, 59, tzinfo=tz.gettz("America/New_York")
- )
- self.before_4 = arrow.Arrow(2018, 11, 4, tzinfo=tz.gettz("America/New_York"))
- self.after_3 = arrow.Arrow(2018, 11, 4, 4, tzinfo=tz.gettz("America/New_York"))
- self.after_4 = arrow.Arrow(
- 2018, 11, 4, 23, 59, tzinfo=tz.gettz("America/New_York")
- )
- assert self.before_1.day == self.before_2.day
- assert self.after_1.day == self.after_2.day
- assert self.before_3.day == self.before_4.day
- assert self.after_3.day == self.after_4.day
-
-
-class TestArrowConversion:
- def test_to(self):
-
- dt_from = datetime.now()
- arrow_from = arrow.Arrow.fromdatetime(dt_from, tz.gettz("US/Pacific"))
-
- self.expected = dt_from.replace(tzinfo=tz.gettz("US/Pacific")).astimezone(
- tz.tzutc()
- )
-
- assert arrow_from.to("UTC").datetime == self.expected
- assert arrow_from.to(tz.tzutc()).datetime == self.expected
-
- # issue #368
- def test_to_pacific_then_utc(self):
- result = arrow.Arrow(2018, 11, 4, 1, tzinfo="-08:00").to("US/Pacific").to("UTC")
- assert result == arrow.Arrow(2018, 11, 4, 9)
-
- # issue #368
- def test_to_amsterdam_then_utc(self):
- result = arrow.Arrow(2016, 10, 30).to("Europe/Amsterdam")
- assert result.utcoffset() == timedelta(seconds=7200)
-
- # regression test for #690
- def test_to_israel_same_offset(self):
-
- result = arrow.Arrow(2019, 10, 27, 2, 21, 1, tzinfo="+03:00").to("Israel")
- expected = arrow.Arrow(2019, 10, 27, 1, 21, 1, tzinfo="Israel")
-
- assert result == expected
- assert result.utcoffset() != expected.utcoffset()
-
- # issue 315
- def test_anchorage_dst(self):
- before = arrow.Arrow(2016, 3, 13, 1, 59, tzinfo="America/Anchorage")
- after = arrow.Arrow(2016, 3, 13, 2, 1, tzinfo="America/Anchorage")
-
- assert before.utcoffset() != after.utcoffset()
-
- # issue 476
- def test_chicago_fall(self):
-
- result = arrow.Arrow(2017, 11, 5, 2, 1, tzinfo="-05:00").to("America/Chicago")
- expected = arrow.Arrow(2017, 11, 5, 1, 1, tzinfo="America/Chicago")
-
- assert result == expected
- assert result.utcoffset() != expected.utcoffset()
-
- def test_toronto_gap(self):
-
- before = arrow.Arrow(2011, 3, 13, 6, 30, tzinfo="UTC").to("America/Toronto")
- after = arrow.Arrow(2011, 3, 13, 7, 30, tzinfo="UTC").to("America/Toronto")
-
- assert before.datetime.replace(tzinfo=None) == datetime(2011, 3, 13, 1, 30)
- assert after.datetime.replace(tzinfo=None) == datetime(2011, 3, 13, 3, 30)
-
- assert before.utcoffset() != after.utcoffset()
-
- def test_sydney_gap(self):
-
- before = arrow.Arrow(2012, 10, 6, 15, 30, tzinfo="UTC").to("Australia/Sydney")
- after = arrow.Arrow(2012, 10, 6, 16, 30, tzinfo="UTC").to("Australia/Sydney")
-
- assert before.datetime.replace(tzinfo=None) == datetime(2012, 10, 7, 1, 30)
- assert after.datetime.replace(tzinfo=None) == datetime(2012, 10, 7, 3, 30)
-
- assert before.utcoffset() != after.utcoffset()
-
-
-class TestArrowPickling:
- def test_pickle_and_unpickle(self):
-
- dt = arrow.Arrow.utcnow()
-
- pickled = pickle.dumps(dt)
-
- unpickled = pickle.loads(pickled)
-
- assert unpickled == dt
-
-
-class TestArrowReplace:
- def test_not_attr(self):
-
- with pytest.raises(AttributeError):
- arrow.Arrow.utcnow().replace(abc=1)
-
- def test_replace(self):
-
- arw = arrow.Arrow(2013, 5, 5, 12, 30, 45)
-
- assert arw.replace(year=2012) == arrow.Arrow(2012, 5, 5, 12, 30, 45)
- assert arw.replace(month=1) == arrow.Arrow(2013, 1, 5, 12, 30, 45)
- assert arw.replace(day=1) == arrow.Arrow(2013, 5, 1, 12, 30, 45)
- assert arw.replace(hour=1) == arrow.Arrow(2013, 5, 5, 1, 30, 45)
- assert arw.replace(minute=1) == arrow.Arrow(2013, 5, 5, 12, 1, 45)
- assert arw.replace(second=1) == arrow.Arrow(2013, 5, 5, 12, 30, 1)
-
- def test_replace_tzinfo(self):
-
- arw = arrow.Arrow.utcnow().to("US/Eastern")
-
- result = arw.replace(tzinfo=tz.gettz("US/Pacific"))
-
- assert result == arw.datetime.replace(tzinfo=tz.gettz("US/Pacific"))
-
- def test_replace_fold(self):
-
- before = arrow.Arrow(2017, 11, 5, 1, tzinfo="America/New_York")
- after = before.replace(fold=1)
-
- assert before.fold == 0
- assert after.fold == 1
- assert before == after
- assert before.utcoffset() != after.utcoffset()
-
- def test_replace_fold_and_other(self):
-
- arw = arrow.Arrow(2013, 5, 5, 12, 30, 45)
-
- assert arw.replace(fold=1, minute=50) == arrow.Arrow(2013, 5, 5, 12, 50, 45)
- assert arw.replace(minute=50, fold=1) == arrow.Arrow(2013, 5, 5, 12, 50, 45)
-
- def test_replace_week(self):
-
- with pytest.raises(AttributeError):
- arrow.Arrow.utcnow().replace(week=1)
-
- def test_replace_quarter(self):
-
- with pytest.raises(AttributeError):
- arrow.Arrow.utcnow().replace(quarter=1)
-
- def test_replace_quarter_and_fold(self):
- with pytest.raises(AttributeError):
- arrow.utcnow().replace(fold=1, quarter=1)
-
- with pytest.raises(AttributeError):
- arrow.utcnow().replace(quarter=1, fold=1)
-
- def test_replace_other_kwargs(self):
-
- with pytest.raises(AttributeError):
- arrow.utcnow().replace(abc="def")
-
-
-class TestArrowShift:
- def test_not_attr(self):
-
- now = arrow.Arrow.utcnow()
-
- with pytest.raises(AttributeError):
- now.shift(abc=1)
-
- with pytest.raises(AttributeError):
- now.shift(week=1)
-
- def test_shift(self):
-
- arw = arrow.Arrow(2013, 5, 5, 12, 30, 45)
-
- assert arw.shift(years=1) == arrow.Arrow(2014, 5, 5, 12, 30, 45)
- assert arw.shift(quarters=1) == arrow.Arrow(2013, 8, 5, 12, 30, 45)
- assert arw.shift(quarters=1, months=1) == arrow.Arrow(2013, 9, 5, 12, 30, 45)
- assert arw.shift(months=1) == arrow.Arrow(2013, 6, 5, 12, 30, 45)
- assert arw.shift(weeks=1) == arrow.Arrow(2013, 5, 12, 12, 30, 45)
- assert arw.shift(days=1) == arrow.Arrow(2013, 5, 6, 12, 30, 45)
- assert arw.shift(hours=1) == arrow.Arrow(2013, 5, 5, 13, 30, 45)
- assert arw.shift(minutes=1) == arrow.Arrow(2013, 5, 5, 12, 31, 45)
- assert arw.shift(seconds=1) == arrow.Arrow(2013, 5, 5, 12, 30, 46)
- assert arw.shift(microseconds=1) == arrow.Arrow(2013, 5, 5, 12, 30, 45, 1)
-
- # Remember: Python's weekday 0 is Monday
- assert arw.shift(weekday=0) == arrow.Arrow(2013, 5, 6, 12, 30, 45)
- assert arw.shift(weekday=1) == arrow.Arrow(2013, 5, 7, 12, 30, 45)
- assert arw.shift(weekday=2) == arrow.Arrow(2013, 5, 8, 12, 30, 45)
- assert arw.shift(weekday=3) == arrow.Arrow(2013, 5, 9, 12, 30, 45)
- assert arw.shift(weekday=4) == arrow.Arrow(2013, 5, 10, 12, 30, 45)
- assert arw.shift(weekday=5) == arrow.Arrow(2013, 5, 11, 12, 30, 45)
- assert arw.shift(weekday=6) == arw
-
- with pytest.raises(IndexError):
- arw.shift(weekday=7)
-
- # Use dateutil.relativedelta's convenient day instances
- assert arw.shift(weekday=MO) == arrow.Arrow(2013, 5, 6, 12, 30, 45)
- assert arw.shift(weekday=MO(0)) == arrow.Arrow(2013, 5, 6, 12, 30, 45)
- assert arw.shift(weekday=MO(1)) == arrow.Arrow(2013, 5, 6, 12, 30, 45)
- assert arw.shift(weekday=MO(2)) == arrow.Arrow(2013, 5, 13, 12, 30, 45)
- assert arw.shift(weekday=TU) == arrow.Arrow(2013, 5, 7, 12, 30, 45)
- assert arw.shift(weekday=TU(0)) == arrow.Arrow(2013, 5, 7, 12, 30, 45)
- assert arw.shift(weekday=TU(1)) == arrow.Arrow(2013, 5, 7, 12, 30, 45)
- assert arw.shift(weekday=TU(2)) == arrow.Arrow(2013, 5, 14, 12, 30, 45)
- assert arw.shift(weekday=WE) == arrow.Arrow(2013, 5, 8, 12, 30, 45)
- assert arw.shift(weekday=WE(0)) == arrow.Arrow(2013, 5, 8, 12, 30, 45)
- assert arw.shift(weekday=WE(1)) == arrow.Arrow(2013, 5, 8, 12, 30, 45)
- assert arw.shift(weekday=WE(2)) == arrow.Arrow(2013, 5, 15, 12, 30, 45)
- assert arw.shift(weekday=TH) == arrow.Arrow(2013, 5, 9, 12, 30, 45)
- assert arw.shift(weekday=TH(0)) == arrow.Arrow(2013, 5, 9, 12, 30, 45)
- assert arw.shift(weekday=TH(1)) == arrow.Arrow(2013, 5, 9, 12, 30, 45)
- assert arw.shift(weekday=TH(2)) == arrow.Arrow(2013, 5, 16, 12, 30, 45)
- assert arw.shift(weekday=FR) == arrow.Arrow(2013, 5, 10, 12, 30, 45)
- assert arw.shift(weekday=FR(0)) == arrow.Arrow(2013, 5, 10, 12, 30, 45)
- assert arw.shift(weekday=FR(1)) == arrow.Arrow(2013, 5, 10, 12, 30, 45)
- assert arw.shift(weekday=FR(2)) == arrow.Arrow(2013, 5, 17, 12, 30, 45)
- assert arw.shift(weekday=SA) == arrow.Arrow(2013, 5, 11, 12, 30, 45)
- assert arw.shift(weekday=SA(0)) == arrow.Arrow(2013, 5, 11, 12, 30, 45)
- assert arw.shift(weekday=SA(1)) == arrow.Arrow(2013, 5, 11, 12, 30, 45)
- assert arw.shift(weekday=SA(2)) == arrow.Arrow(2013, 5, 18, 12, 30, 45)
- assert arw.shift(weekday=SU) == arw
- assert arw.shift(weekday=SU(0)) == arw
- assert arw.shift(weekday=SU(1)) == arw
- assert arw.shift(weekday=SU(2)) == arrow.Arrow(2013, 5, 12, 12, 30, 45)
-
- def test_shift_negative(self):
-
- arw = arrow.Arrow(2013, 5, 5, 12, 30, 45)
-
- assert arw.shift(years=-1) == arrow.Arrow(2012, 5, 5, 12, 30, 45)
- assert arw.shift(quarters=-1) == arrow.Arrow(2013, 2, 5, 12, 30, 45)
- assert arw.shift(quarters=-1, months=-1) == arrow.Arrow(2013, 1, 5, 12, 30, 45)
- assert arw.shift(months=-1) == arrow.Arrow(2013, 4, 5, 12, 30, 45)
- assert arw.shift(weeks=-1) == arrow.Arrow(2013, 4, 28, 12, 30, 45)
- assert arw.shift(days=-1) == arrow.Arrow(2013, 5, 4, 12, 30, 45)
- assert arw.shift(hours=-1) == arrow.Arrow(2013, 5, 5, 11, 30, 45)
- assert arw.shift(minutes=-1) == arrow.Arrow(2013, 5, 5, 12, 29, 45)
- assert arw.shift(seconds=-1) == arrow.Arrow(2013, 5, 5, 12, 30, 44)
- assert arw.shift(microseconds=-1) == arrow.Arrow(2013, 5, 5, 12, 30, 44, 999999)
-
- # Not sure how practical these negative weekdays are
- assert arw.shift(weekday=-1) == arw.shift(weekday=SU)
- assert arw.shift(weekday=-2) == arw.shift(weekday=SA)
- assert arw.shift(weekday=-3) == arw.shift(weekday=FR)
- assert arw.shift(weekday=-4) == arw.shift(weekday=TH)
- assert arw.shift(weekday=-5) == arw.shift(weekday=WE)
- assert arw.shift(weekday=-6) == arw.shift(weekday=TU)
- assert arw.shift(weekday=-7) == arw.shift(weekday=MO)
-
- with pytest.raises(IndexError):
- arw.shift(weekday=-8)
-
- assert arw.shift(weekday=MO(-1)) == arrow.Arrow(2013, 4, 29, 12, 30, 45)
- assert arw.shift(weekday=TU(-1)) == arrow.Arrow(2013, 4, 30, 12, 30, 45)
- assert arw.shift(weekday=WE(-1)) == arrow.Arrow(2013, 5, 1, 12, 30, 45)
- assert arw.shift(weekday=TH(-1)) == arrow.Arrow(2013, 5, 2, 12, 30, 45)
- assert arw.shift(weekday=FR(-1)) == arrow.Arrow(2013, 5, 3, 12, 30, 45)
- assert arw.shift(weekday=SA(-1)) == arrow.Arrow(2013, 5, 4, 12, 30, 45)
- assert arw.shift(weekday=SU(-1)) == arw
- assert arw.shift(weekday=SU(-2)) == arrow.Arrow(2013, 4, 28, 12, 30, 45)
-
- def test_shift_quarters_bug(self):
-
- arw = arrow.Arrow(2013, 5, 5, 12, 30, 45)
-
- # The value of the last-read argument was used instead of the ``quarters`` argument.
- # Recall that the keyword argument dict, like all dicts, is unordered, so only certain
- # combinations of arguments would exhibit this.
- assert arw.shift(quarters=0, years=1) == arrow.Arrow(2014, 5, 5, 12, 30, 45)
- assert arw.shift(quarters=0, months=1) == arrow.Arrow(2013, 6, 5, 12, 30, 45)
- assert arw.shift(quarters=0, weeks=1) == arrow.Arrow(2013, 5, 12, 12, 30, 45)
- assert arw.shift(quarters=0, days=1) == arrow.Arrow(2013, 5, 6, 12, 30, 45)
- assert arw.shift(quarters=0, hours=1) == arrow.Arrow(2013, 5, 5, 13, 30, 45)
- assert arw.shift(quarters=0, minutes=1) == arrow.Arrow(2013, 5, 5, 12, 31, 45)
- assert arw.shift(quarters=0, seconds=1) == arrow.Arrow(2013, 5, 5, 12, 30, 46)
- assert arw.shift(quarters=0, microseconds=1) == arrow.Arrow(
- 2013, 5, 5, 12, 30, 45, 1
- )
-
- def test_shift_positive_imaginary(self):
-
- # Avoid shifting into imaginary datetimes; take DST and other timezone changes into account.
-
- new_york = arrow.Arrow(2017, 3, 12, 1, 30, tzinfo="America/New_York")
- assert new_york.shift(hours=+1) == arrow.Arrow(
- 2017, 3, 12, 3, 30, tzinfo="America/New_York"
- )
-
- # pendulum example
- paris = arrow.Arrow(2013, 3, 31, 1, 50, tzinfo="Europe/Paris")
- assert paris.shift(minutes=+20) == arrow.Arrow(
- 2013, 3, 31, 3, 10, tzinfo="Europe/Paris"
- )
-
- canberra = arrow.Arrow(2018, 10, 7, 1, 30, tzinfo="Australia/Canberra")
- assert canberra.shift(hours=+1) == arrow.Arrow(
- 2018, 10, 7, 3, 30, tzinfo="Australia/Canberra"
- )
-
- kiev = arrow.Arrow(2018, 3, 25, 2, 30, tzinfo="Europe/Kiev")
- assert kiev.shift(hours=+1) == arrow.Arrow(
- 2018, 3, 25, 4, 30, tzinfo="Europe/Kiev"
- )
-
- # Edge case, the entire day of 2011-12-30 is imaginary in this zone!
- apia = arrow.Arrow(2011, 12, 29, 23, tzinfo="Pacific/Apia")
- assert apia.shift(hours=+2) == arrow.Arrow(
- 2011, 12, 31, 1, tzinfo="Pacific/Apia"
- )
-
- def test_shift_negative_imaginary(self):
-
- new_york = arrow.Arrow(2011, 3, 13, 3, 30, tzinfo="America/New_York")
- assert new_york.shift(hours=-1) == arrow.Arrow(
- 2011, 3, 13, 3, 30, tzinfo="America/New_York"
- )
- assert new_york.shift(hours=-2) == arrow.Arrow(
- 2011, 3, 13, 1, 30, tzinfo="America/New_York"
- )
-
- london = arrow.Arrow(2019, 3, 31, 2, tzinfo="Europe/London")
- assert london.shift(hours=-1) == arrow.Arrow(
- 2019, 3, 31, 2, tzinfo="Europe/London"
- )
- assert london.shift(hours=-2) == arrow.Arrow(
- 2019, 3, 31, 0, tzinfo="Europe/London"
- )
-
- # edge case, crossing the international dateline
- apia = arrow.Arrow(2011, 12, 31, 1, tzinfo="Pacific/Apia")
- assert apia.shift(hours=-2) == arrow.Arrow(
- 2011, 12, 31, 23, tzinfo="Pacific/Apia"
- )
-
- @pytest.mark.skipif(
- dateutil.__version__ < "2.7.1", reason="old tz database (2018d needed)"
- )
- def test_shift_kiritimati(self):
- # corrected in the 2018d tz database release; fails with earlier versions
-
- kiritimati = arrow.Arrow(1994, 12, 30, 12, 30, tzinfo="Pacific/Kiritimati")
- assert kiritimati.shift(days=+1) == arrow.Arrow(
- 1995, 1, 1, 12, 30, tzinfo="Pacific/Kiritimati"
- )
-
- @pytest.mark.skipif(
- sys.version_info < (3, 6), reason="unsupported before python 3.6"
- )
- def test_shift_imaginary_seconds(self):
- # offset has a seconds component
- monrovia = arrow.Arrow(1972, 1, 6, 23, tzinfo="Africa/Monrovia")
- assert monrovia.shift(hours=+1, minutes=+30) == arrow.Arrow(
- 1972, 1, 7, 1, 14, 30, tzinfo="Africa/Monrovia"
- )
-
-
-class TestArrowRange:
- def test_year(self):
-
- result = list(
- arrow.Arrow.range(
- "year", datetime(2013, 1, 2, 3, 4, 5), datetime(2016, 4, 5, 6, 7, 8)
- )
- )
-
- assert result == [
- arrow.Arrow(2013, 1, 2, 3, 4, 5),
- arrow.Arrow(2014, 1, 2, 3, 4, 5),
- arrow.Arrow(2015, 1, 2, 3, 4, 5),
- arrow.Arrow(2016, 1, 2, 3, 4, 5),
- ]
-
- def test_quarter(self):
-
- result = list(
- arrow.Arrow.range(
- "quarter", datetime(2013, 2, 3, 4, 5, 6), datetime(2013, 5, 6, 7, 8, 9)
- )
- )
-
- assert result == [
- arrow.Arrow(2013, 2, 3, 4, 5, 6),
- arrow.Arrow(2013, 5, 3, 4, 5, 6),
- ]
-
- def test_month(self):
-
- result = list(
- arrow.Arrow.range(
- "month", datetime(2013, 2, 3, 4, 5, 6), datetime(2013, 5, 6, 7, 8, 9)
- )
- )
-
- assert result == [
- arrow.Arrow(2013, 2, 3, 4, 5, 6),
- arrow.Arrow(2013, 3, 3, 4, 5, 6),
- arrow.Arrow(2013, 4, 3, 4, 5, 6),
- arrow.Arrow(2013, 5, 3, 4, 5, 6),
- ]
-
- def test_week(self):
-
- result = list(
- arrow.Arrow.range(
- "week", datetime(2013, 9, 1, 2, 3, 4), datetime(2013, 10, 1, 2, 3, 4)
- )
- )
-
- assert result == [
- arrow.Arrow(2013, 9, 1, 2, 3, 4),
- arrow.Arrow(2013, 9, 8, 2, 3, 4),
- arrow.Arrow(2013, 9, 15, 2, 3, 4),
- arrow.Arrow(2013, 9, 22, 2, 3, 4),
- arrow.Arrow(2013, 9, 29, 2, 3, 4),
- ]
-
- def test_day(self):
-
- result = list(
- arrow.Arrow.range(
- "day", datetime(2013, 1, 2, 3, 4, 5), datetime(2013, 1, 5, 6, 7, 8)
- )
- )
-
- assert result == [
- arrow.Arrow(2013, 1, 2, 3, 4, 5),
- arrow.Arrow(2013, 1, 3, 3, 4, 5),
- arrow.Arrow(2013, 1, 4, 3, 4, 5),
- arrow.Arrow(2013, 1, 5, 3, 4, 5),
- ]
-
- def test_hour(self):
-
- result = list(
- arrow.Arrow.range(
- "hour", datetime(2013, 1, 2, 3, 4, 5), datetime(2013, 1, 2, 6, 7, 8)
- )
- )
-
- assert result == [
- arrow.Arrow(2013, 1, 2, 3, 4, 5),
- arrow.Arrow(2013, 1, 2, 4, 4, 5),
- arrow.Arrow(2013, 1, 2, 5, 4, 5),
- arrow.Arrow(2013, 1, 2, 6, 4, 5),
- ]
-
- result = list(
- arrow.Arrow.range(
- "hour", datetime(2013, 1, 2, 3, 4, 5), datetime(2013, 1, 2, 3, 4, 5)
- )
- )
-
- assert result == [arrow.Arrow(2013, 1, 2, 3, 4, 5)]
-
- def test_minute(self):
-
- result = list(
- arrow.Arrow.range(
- "minute", datetime(2013, 1, 2, 3, 4, 5), datetime(2013, 1, 2, 3, 7, 8)
- )
- )
-
- assert result == [
- arrow.Arrow(2013, 1, 2, 3, 4, 5),
- arrow.Arrow(2013, 1, 2, 3, 5, 5),
- arrow.Arrow(2013, 1, 2, 3, 6, 5),
- arrow.Arrow(2013, 1, 2, 3, 7, 5),
- ]
-
- def test_second(self):
-
- result = list(
- arrow.Arrow.range(
- "second", datetime(2013, 1, 2, 3, 4, 5), datetime(2013, 1, 2, 3, 4, 8)
- )
- )
-
- assert result == [
- arrow.Arrow(2013, 1, 2, 3, 4, 5),
- arrow.Arrow(2013, 1, 2, 3, 4, 6),
- arrow.Arrow(2013, 1, 2, 3, 4, 7),
- arrow.Arrow(2013, 1, 2, 3, 4, 8),
- ]
-
- def test_arrow(self):
-
- result = list(
- arrow.Arrow.range(
- "day",
- arrow.Arrow(2013, 1, 2, 3, 4, 5),
- arrow.Arrow(2013, 1, 5, 6, 7, 8),
- )
- )
-
- assert result == [
- arrow.Arrow(2013, 1, 2, 3, 4, 5),
- arrow.Arrow(2013, 1, 3, 3, 4, 5),
- arrow.Arrow(2013, 1, 4, 3, 4, 5),
- arrow.Arrow(2013, 1, 5, 3, 4, 5),
- ]
-
- def test_naive_tz(self):
-
- result = arrow.Arrow.range(
- "year", datetime(2013, 1, 2, 3), datetime(2016, 4, 5, 6), "US/Pacific"
- )
-
- for r in result:
- assert r.tzinfo == tz.gettz("US/Pacific")
-
- def test_aware_same_tz(self):
-
- result = arrow.Arrow.range(
- "day",
- arrow.Arrow(2013, 1, 1, tzinfo=tz.gettz("US/Pacific")),
- arrow.Arrow(2013, 1, 3, tzinfo=tz.gettz("US/Pacific")),
- )
-
- for r in result:
- assert r.tzinfo == tz.gettz("US/Pacific")
-
- def test_aware_different_tz(self):
-
- result = arrow.Arrow.range(
- "day",
- datetime(2013, 1, 1, tzinfo=tz.gettz("US/Eastern")),
- datetime(2013, 1, 3, tzinfo=tz.gettz("US/Pacific")),
- )
-
- for r in result:
- assert r.tzinfo == tz.gettz("US/Eastern")
-
- def test_aware_tz(self):
-
- result = arrow.Arrow.range(
- "day",
- datetime(2013, 1, 1, tzinfo=tz.gettz("US/Eastern")),
- datetime(2013, 1, 3, tzinfo=tz.gettz("US/Pacific")),
- tz=tz.gettz("US/Central"),
- )
-
- for r in result:
- assert r.tzinfo == tz.gettz("US/Central")
-
- def test_imaginary(self):
- # issue #72, avoid duplication in utc column
-
- before = arrow.Arrow(2018, 3, 10, 23, tzinfo="US/Pacific")
- after = arrow.Arrow(2018, 3, 11, 4, tzinfo="US/Pacific")
-
- pacific_range = [t for t in arrow.Arrow.range("hour", before, after)]
- utc_range = [t.to("utc") for t in arrow.Arrow.range("hour", before, after)]
-
- assert len(pacific_range) == len(set(pacific_range))
- assert len(utc_range) == len(set(utc_range))
-
- def test_unsupported(self):
-
- with pytest.raises(AttributeError):
- next(arrow.Arrow.range("abc", datetime.utcnow(), datetime.utcnow()))
-
- def test_range_over_months_ending_on_different_days(self):
- # regression test for issue #842
- result = list(arrow.Arrow.range("month", datetime(2015, 1, 31), limit=4))
- assert result == [
- arrow.Arrow(2015, 1, 31),
- arrow.Arrow(2015, 2, 28),
- arrow.Arrow(2015, 3, 31),
- arrow.Arrow(2015, 4, 30),
- ]
-
- result = list(arrow.Arrow.range("month", datetime(2015, 1, 30), limit=3))
- assert result == [
- arrow.Arrow(2015, 1, 30),
- arrow.Arrow(2015, 2, 28),
- arrow.Arrow(2015, 3, 30),
- ]
-
- result = list(arrow.Arrow.range("month", datetime(2015, 2, 28), limit=3))
- assert result == [
- arrow.Arrow(2015, 2, 28),
- arrow.Arrow(2015, 3, 28),
- arrow.Arrow(2015, 4, 28),
- ]
-
- result = list(arrow.Arrow.range("month", datetime(2015, 3, 31), limit=3))
- assert result == [
- arrow.Arrow(2015, 3, 31),
- arrow.Arrow(2015, 4, 30),
- arrow.Arrow(2015, 5, 31),
- ]
-
- def test_range_over_quarter_months_ending_on_different_days(self):
- result = list(arrow.Arrow.range("quarter", datetime(2014, 11, 30), limit=3))
- assert result == [
- arrow.Arrow(2014, 11, 30),
- arrow.Arrow(2015, 2, 28),
- arrow.Arrow(2015, 5, 30),
- ]
-
- def test_range_over_year_maintains_end_date_across_leap_year(self):
- result = list(arrow.Arrow.range("year", datetime(2012, 2, 29), limit=5))
- assert result == [
- arrow.Arrow(2012, 2, 29),
- arrow.Arrow(2013, 2, 28),
- arrow.Arrow(2014, 2, 28),
- arrow.Arrow(2015, 2, 28),
- arrow.Arrow(2016, 2, 29),
- ]
-
-
-class TestArrowSpanRange:
- def test_year(self):
-
- result = list(
- arrow.Arrow.span_range("year", datetime(2013, 2, 1), datetime(2016, 3, 31))
- )
-
- assert result == [
- (
- arrow.Arrow(2013, 1, 1),
- arrow.Arrow(2013, 12, 31, 23, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2014, 1, 1),
- arrow.Arrow(2014, 12, 31, 23, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2015, 1, 1),
- arrow.Arrow(2015, 12, 31, 23, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2016, 1, 1),
- arrow.Arrow(2016, 12, 31, 23, 59, 59, 999999),
- ),
- ]
-
- def test_quarter(self):
-
- result = list(
- arrow.Arrow.span_range(
- "quarter", datetime(2013, 2, 2), datetime(2013, 5, 15)
- )
- )
-
- assert result == [
- (arrow.Arrow(2013, 1, 1), arrow.Arrow(2013, 3, 31, 23, 59, 59, 999999)),
- (arrow.Arrow(2013, 4, 1), arrow.Arrow(2013, 6, 30, 23, 59, 59, 999999)),
- ]
-
- def test_month(self):
-
- result = list(
- arrow.Arrow.span_range("month", datetime(2013, 1, 2), datetime(2013, 4, 15))
- )
-
- assert result == [
- (arrow.Arrow(2013, 1, 1), arrow.Arrow(2013, 1, 31, 23, 59, 59, 999999)),
- (arrow.Arrow(2013, 2, 1), arrow.Arrow(2013, 2, 28, 23, 59, 59, 999999)),
- (arrow.Arrow(2013, 3, 1), arrow.Arrow(2013, 3, 31, 23, 59, 59, 999999)),
- (arrow.Arrow(2013, 4, 1), arrow.Arrow(2013, 4, 30, 23, 59, 59, 999999)),
- ]
-
- def test_week(self):
-
- result = list(
- arrow.Arrow.span_range("week", datetime(2013, 2, 2), datetime(2013, 2, 28))
- )
-
- assert result == [
- (arrow.Arrow(2013, 1, 28), arrow.Arrow(2013, 2, 3, 23, 59, 59, 999999)),
- (arrow.Arrow(2013, 2, 4), arrow.Arrow(2013, 2, 10, 23, 59, 59, 999999)),
- (
- arrow.Arrow(2013, 2, 11),
- arrow.Arrow(2013, 2, 17, 23, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 2, 18),
- arrow.Arrow(2013, 2, 24, 23, 59, 59, 999999),
- ),
- (arrow.Arrow(2013, 2, 25), arrow.Arrow(2013, 3, 3, 23, 59, 59, 999999)),
- ]
-
- def test_day(self):
-
- result = list(
- arrow.Arrow.span_range(
- "day", datetime(2013, 1, 1, 12), datetime(2013, 1, 4, 12)
- )
- )
-
- assert result == [
- (
- arrow.Arrow(2013, 1, 1, 0),
- arrow.Arrow(2013, 1, 1, 23, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 2, 0),
- arrow.Arrow(2013, 1, 2, 23, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 3, 0),
- arrow.Arrow(2013, 1, 3, 23, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 4, 0),
- arrow.Arrow(2013, 1, 4, 23, 59, 59, 999999),
- ),
- ]
-
- def test_days(self):
-
- result = list(
- arrow.Arrow.span_range(
- "days", datetime(2013, 1, 1, 12), datetime(2013, 1, 4, 12)
- )
- )
-
- assert result == [
- (
- arrow.Arrow(2013, 1, 1, 0),
- arrow.Arrow(2013, 1, 1, 23, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 2, 0),
- arrow.Arrow(2013, 1, 2, 23, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 3, 0),
- arrow.Arrow(2013, 1, 3, 23, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 4, 0),
- arrow.Arrow(2013, 1, 4, 23, 59, 59, 999999),
- ),
- ]
-
- def test_hour(self):
-
- result = list(
- arrow.Arrow.span_range(
- "hour", datetime(2013, 1, 1, 0, 30), datetime(2013, 1, 1, 3, 30)
- )
- )
-
- assert result == [
- (
- arrow.Arrow(2013, 1, 1, 0),
- arrow.Arrow(2013, 1, 1, 0, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 1, 1),
- arrow.Arrow(2013, 1, 1, 1, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 1, 2),
- arrow.Arrow(2013, 1, 1, 2, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 1, 3),
- arrow.Arrow(2013, 1, 1, 3, 59, 59, 999999),
- ),
- ]
-
- result = list(
- arrow.Arrow.span_range(
- "hour", datetime(2013, 1, 1, 3, 30), datetime(2013, 1, 1, 3, 30)
- )
- )
-
- assert result == [
- (arrow.Arrow(2013, 1, 1, 3), arrow.Arrow(2013, 1, 1, 3, 59, 59, 999999))
- ]
-
- def test_minute(self):
-
- result = list(
- arrow.Arrow.span_range(
- "minute", datetime(2013, 1, 1, 0, 0, 30), datetime(2013, 1, 1, 0, 3, 30)
- )
- )
-
- assert result == [
- (
- arrow.Arrow(2013, 1, 1, 0, 0),
- arrow.Arrow(2013, 1, 1, 0, 0, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 1, 0, 1),
- arrow.Arrow(2013, 1, 1, 0, 1, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 1, 0, 2),
- arrow.Arrow(2013, 1, 1, 0, 2, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 1, 0, 3),
- arrow.Arrow(2013, 1, 1, 0, 3, 59, 999999),
- ),
- ]
-
- def test_second(self):
-
- result = list(
- arrow.Arrow.span_range(
- "second", datetime(2013, 1, 1), datetime(2013, 1, 1, 0, 0, 3)
- )
- )
-
- assert result == [
- (
- arrow.Arrow(2013, 1, 1, 0, 0, 0),
- arrow.Arrow(2013, 1, 1, 0, 0, 0, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 1, 0, 0, 1),
- arrow.Arrow(2013, 1, 1, 0, 0, 1, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 1, 0, 0, 2),
- arrow.Arrow(2013, 1, 1, 0, 0, 2, 999999),
- ),
- (
- arrow.Arrow(2013, 1, 1, 0, 0, 3),
- arrow.Arrow(2013, 1, 1, 0, 0, 3, 999999),
- ),
- ]
-
- def test_naive_tz(self):
-
- tzinfo = tz.gettz("US/Pacific")
-
- result = arrow.Arrow.span_range(
- "hour", datetime(2013, 1, 1, 0), datetime(2013, 1, 1, 3, 59), "US/Pacific"
- )
-
- for f, c in result:
- assert f.tzinfo == tzinfo
- assert c.tzinfo == tzinfo
-
- def test_aware_same_tz(self):
-
- tzinfo = tz.gettz("US/Pacific")
-
- result = arrow.Arrow.span_range(
- "hour",
- datetime(2013, 1, 1, 0, tzinfo=tzinfo),
- datetime(2013, 1, 1, 2, 59, tzinfo=tzinfo),
- )
-
- for f, c in result:
- assert f.tzinfo == tzinfo
- assert c.tzinfo == tzinfo
-
- def test_aware_different_tz(self):
-
- tzinfo1 = tz.gettz("US/Pacific")
- tzinfo2 = tz.gettz("US/Eastern")
-
- result = arrow.Arrow.span_range(
- "hour",
- datetime(2013, 1, 1, 0, tzinfo=tzinfo1),
- datetime(2013, 1, 1, 2, 59, tzinfo=tzinfo2),
- )
-
- for f, c in result:
- assert f.tzinfo == tzinfo1
- assert c.tzinfo == tzinfo1
-
- def test_aware_tz(self):
-
- result = arrow.Arrow.span_range(
- "hour",
- datetime(2013, 1, 1, 0, tzinfo=tz.gettz("US/Eastern")),
- datetime(2013, 1, 1, 2, 59, tzinfo=tz.gettz("US/Eastern")),
- tz="US/Central",
- )
-
- for f, c in result:
- assert f.tzinfo == tz.gettz("US/Central")
- assert c.tzinfo == tz.gettz("US/Central")
-
- def test_bounds_param_is_passed(self):
-
- result = list(
- arrow.Arrow.span_range(
- "quarter", datetime(2013, 2, 2), datetime(2013, 5, 15), bounds="[]"
- )
- )
-
- assert result == [
- (arrow.Arrow(2013, 1, 1), arrow.Arrow(2013, 4, 1)),
- (arrow.Arrow(2013, 4, 1), arrow.Arrow(2013, 7, 1)),
- ]
-
-
-class TestArrowInterval:
- def test_incorrect_input(self):
- with pytest.raises(ValueError):
- list(
- arrow.Arrow.interval(
- "month", datetime(2013, 1, 2), datetime(2013, 4, 15), 0
- )
- )
-
- def test_correct(self):
- result = list(
- arrow.Arrow.interval(
- "hour", datetime(2013, 5, 5, 12, 30), datetime(2013, 5, 5, 17, 15), 2
- )
- )
-
- assert result == [
- (
- arrow.Arrow(2013, 5, 5, 12),
- arrow.Arrow(2013, 5, 5, 13, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 5, 5, 14),
- arrow.Arrow(2013, 5, 5, 15, 59, 59, 999999),
- ),
- (
- arrow.Arrow(2013, 5, 5, 16),
- arrow.Arrow(2013, 5, 5, 17, 59, 59, 999999),
- ),
- ]
-
- def test_bounds_param_is_passed(self):
- result = list(
- arrow.Arrow.interval(
- "hour",
- datetime(2013, 5, 5, 12, 30),
- datetime(2013, 5, 5, 17, 15),
- 2,
- bounds="[]",
- )
- )
-
- assert result == [
- (arrow.Arrow(2013, 5, 5, 12), arrow.Arrow(2013, 5, 5, 14)),
- (arrow.Arrow(2013, 5, 5, 14), arrow.Arrow(2013, 5, 5, 16)),
- (arrow.Arrow(2013, 5, 5, 16), arrow.Arrow(2013, 5, 5, 18)),
- ]
-
-
-@pytest.mark.usefixtures("time_2013_02_15")
-class TestArrowSpan:
- def test_span_attribute(self):
-
- with pytest.raises(AttributeError):
- self.arrow.span("span")
-
- def test_span_year(self):
-
- floor, ceil = self.arrow.span("year")
-
- assert floor == datetime(2013, 1, 1, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 12, 31, 23, 59, 59, 999999, tzinfo=tz.tzutc())
-
- def test_span_quarter(self):
-
- floor, ceil = self.arrow.span("quarter")
-
- assert floor == datetime(2013, 1, 1, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 3, 31, 23, 59, 59, 999999, tzinfo=tz.tzutc())
-
- def test_span_quarter_count(self):
-
- floor, ceil = self.arrow.span("quarter", 2)
-
- assert floor == datetime(2013, 1, 1, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 6, 30, 23, 59, 59, 999999, tzinfo=tz.tzutc())
-
- def test_span_year_count(self):
-
- floor, ceil = self.arrow.span("year", 2)
-
- assert floor == datetime(2013, 1, 1, tzinfo=tz.tzutc())
- assert ceil == datetime(2014, 12, 31, 23, 59, 59, 999999, tzinfo=tz.tzutc())
-
- def test_span_month(self):
-
- floor, ceil = self.arrow.span("month")
-
- assert floor == datetime(2013, 2, 1, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 2, 28, 23, 59, 59, 999999, tzinfo=tz.tzutc())
-
- def test_span_week(self):
-
- floor, ceil = self.arrow.span("week")
-
- assert floor == datetime(2013, 2, 11, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 2, 17, 23, 59, 59, 999999, tzinfo=tz.tzutc())
-
- def test_span_day(self):
-
- floor, ceil = self.arrow.span("day")
-
- assert floor == datetime(2013, 2, 15, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 2, 15, 23, 59, 59, 999999, tzinfo=tz.tzutc())
-
- def test_span_hour(self):
-
- floor, ceil = self.arrow.span("hour")
-
- assert floor == datetime(2013, 2, 15, 3, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 2, 15, 3, 59, 59, 999999, tzinfo=tz.tzutc())
-
- def test_span_minute(self):
-
- floor, ceil = self.arrow.span("minute")
-
- assert floor == datetime(2013, 2, 15, 3, 41, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 2, 15, 3, 41, 59, 999999, tzinfo=tz.tzutc())
-
- def test_span_second(self):
-
- floor, ceil = self.arrow.span("second")
-
- assert floor == datetime(2013, 2, 15, 3, 41, 22, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 2, 15, 3, 41, 22, 999999, tzinfo=tz.tzutc())
-
- def test_span_microsecond(self):
-
- floor, ceil = self.arrow.span("microsecond")
-
- assert floor == datetime(2013, 2, 15, 3, 41, 22, 8923, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 2, 15, 3, 41, 22, 8923, tzinfo=tz.tzutc())
-
- def test_floor(self):
-
- floor, ceil = self.arrow.span("month")
-
- assert floor == self.arrow.floor("month")
- assert ceil == self.arrow.ceil("month")
-
- def test_span_inclusive_inclusive(self):
-
- floor, ceil = self.arrow.span("hour", bounds="[]")
-
- assert floor == datetime(2013, 2, 15, 3, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 2, 15, 4, tzinfo=tz.tzutc())
-
- def test_span_exclusive_inclusive(self):
-
- floor, ceil = self.arrow.span("hour", bounds="(]")
-
- assert floor == datetime(2013, 2, 15, 3, 0, 0, 1, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 2, 15, 4, tzinfo=tz.tzutc())
-
- def test_span_exclusive_exclusive(self):
-
- floor, ceil = self.arrow.span("hour", bounds="()")
-
- assert floor == datetime(2013, 2, 15, 3, 0, 0, 1, tzinfo=tz.tzutc())
- assert ceil == datetime(2013, 2, 15, 3, 59, 59, 999999, tzinfo=tz.tzutc())
-
- def test_bounds_are_validated(self):
-
- with pytest.raises(ValueError):
- floor, ceil = self.arrow.span("hour", bounds="][")
-
-
-@pytest.mark.usefixtures("time_2013_01_01")
-class TestArrowHumanize:
- def test_granularity(self):
-
- assert self.now.humanize(granularity="second") == "just now"
-
- later1 = self.now.shift(seconds=1)
- assert self.now.humanize(later1, granularity="second") == "just now"
- assert later1.humanize(self.now, granularity="second") == "just now"
- assert self.now.humanize(later1, granularity="minute") == "0 minutes ago"
- assert later1.humanize(self.now, granularity="minute") == "in 0 minutes"
-
- later100 = self.now.shift(seconds=100)
- assert self.now.humanize(later100, granularity="second") == "100 seconds ago"
- assert later100.humanize(self.now, granularity="second") == "in 100 seconds"
- assert self.now.humanize(later100, granularity="minute") == "a minute ago"
- assert later100.humanize(self.now, granularity="minute") == "in a minute"
- assert self.now.humanize(later100, granularity="hour") == "0 hours ago"
- assert later100.humanize(self.now, granularity="hour") == "in 0 hours"
-
- later4000 = self.now.shift(seconds=4000)
- assert self.now.humanize(later4000, granularity="minute") == "66 minutes ago"
- assert later4000.humanize(self.now, granularity="minute") == "in 66 minutes"
- assert self.now.humanize(later4000, granularity="hour") == "an hour ago"
- assert later4000.humanize(self.now, granularity="hour") == "in an hour"
- assert self.now.humanize(later4000, granularity="day") == "0 days ago"
- assert later4000.humanize(self.now, granularity="day") == "in 0 days"
-
- later105 = self.now.shift(seconds=10 ** 5)
- assert self.now.humanize(later105, granularity="hour") == "27 hours ago"
- assert later105.humanize(self.now, granularity="hour") == "in 27 hours"
- assert self.now.humanize(later105, granularity="day") == "a day ago"
- assert later105.humanize(self.now, granularity="day") == "in a day"
- assert self.now.humanize(later105, granularity="week") == "0 weeks ago"
- assert later105.humanize(self.now, granularity="week") == "in 0 weeks"
- assert self.now.humanize(later105, granularity="month") == "0 months ago"
- assert later105.humanize(self.now, granularity="month") == "in 0 months"
- assert self.now.humanize(later105, granularity=["month"]) == "0 months ago"
- assert later105.humanize(self.now, granularity=["month"]) == "in 0 months"
-
- later106 = self.now.shift(seconds=3 * 10 ** 6)
- assert self.now.humanize(later106, granularity="day") == "34 days ago"
- assert later106.humanize(self.now, granularity="day") == "in 34 days"
- assert self.now.humanize(later106, granularity="week") == "4 weeks ago"
- assert later106.humanize(self.now, granularity="week") == "in 4 weeks"
- assert self.now.humanize(later106, granularity="month") == "a month ago"
- assert later106.humanize(self.now, granularity="month") == "in a month"
- assert self.now.humanize(later106, granularity="year") == "0 years ago"
- assert later106.humanize(self.now, granularity="year") == "in 0 years"
-
- later506 = self.now.shift(seconds=50 * 10 ** 6)
- assert self.now.humanize(later506, granularity="week") == "82 weeks ago"
- assert later506.humanize(self.now, granularity="week") == "in 82 weeks"
- assert self.now.humanize(later506, granularity="month") == "18 months ago"
- assert later506.humanize(self.now, granularity="month") == "in 18 months"
- assert self.now.humanize(later506, granularity="year") == "a year ago"
- assert later506.humanize(self.now, granularity="year") == "in a year"
-
- later108 = self.now.shift(seconds=10 ** 8)
- assert self.now.humanize(later108, granularity="year") == "3 years ago"
- assert later108.humanize(self.now, granularity="year") == "in 3 years"
-
- later108onlydistance = self.now.shift(seconds=10 ** 8)
- assert (
- self.now.humanize(
- later108onlydistance, only_distance=True, granularity="year"
- )
- == "3 years"
- )
- assert (
- later108onlydistance.humanize(
- self.now, only_distance=True, granularity="year"
- )
- == "3 years"
- )
-
- with pytest.raises(AttributeError):
- self.now.humanize(later108, granularity="years")
-
- def test_multiple_granularity(self):
- assert self.now.humanize(granularity="second") == "just now"
- assert self.now.humanize(granularity=["second"]) == "just now"
- assert (
- self.now.humanize(granularity=["year", "month", "day", "hour", "second"])
- == "in 0 years 0 months 0 days 0 hours and 0 seconds"
- )
-
- later4000 = self.now.shift(seconds=4000)
- assert (
- later4000.humanize(self.now, granularity=["hour", "minute"])
- == "in an hour and 6 minutes"
- )
- assert (
- self.now.humanize(later4000, granularity=["hour", "minute"])
- == "an hour and 6 minutes ago"
- )
- assert (
- later4000.humanize(
- self.now, granularity=["hour", "minute"], only_distance=True
- )
- == "an hour and 6 minutes"
- )
- assert (
- later4000.humanize(self.now, granularity=["day", "hour", "minute"])
- == "in 0 days an hour and 6 minutes"
- )
- assert (
- self.now.humanize(later4000, granularity=["day", "hour", "minute"])
- == "0 days an hour and 6 minutes ago"
- )
-
- later105 = self.now.shift(seconds=10 ** 5)
- assert (
- self.now.humanize(later105, granularity=["hour", "day", "minute"])
- == "a day 3 hours and 46 minutes ago"
- )
- with pytest.raises(AttributeError):
- self.now.humanize(later105, granularity=["error", "second"])
-
- later108onlydistance = self.now.shift(seconds=10 ** 8)
- assert (
- self.now.humanize(
- later108onlydistance, only_distance=True, granularity=["year"]
- )
- == "3 years"
- )
- assert (
- self.now.humanize(
- later108onlydistance, only_distance=True, granularity=["month", "week"]
- )
- == "37 months and 4 weeks"
- )
- assert (
- self.now.humanize(
- later108onlydistance, only_distance=True, granularity=["year", "second"]
- )
- == "3 years and 5327200 seconds"
- )
-
- one_min_one_sec_ago = self.now.shift(minutes=-1, seconds=-1)
- assert (
- one_min_one_sec_ago.humanize(self.now, granularity=["minute", "second"])
- == "a minute and a second ago"
- )
-
- one_min_two_secs_ago = self.now.shift(minutes=-1, seconds=-2)
- assert (
- one_min_two_secs_ago.humanize(self.now, granularity=["minute", "second"])
- == "a minute and 2 seconds ago"
- )
-
- def test_seconds(self):
-
- later = self.now.shift(seconds=10)
-
- # regression test for issue #727
- assert self.now.humanize(later) == "10 seconds ago"
- assert later.humanize(self.now) == "in 10 seconds"
-
- assert self.now.humanize(later, only_distance=True) == "10 seconds"
- assert later.humanize(self.now, only_distance=True) == "10 seconds"
-
- def test_minute(self):
-
- later = self.now.shift(minutes=1)
-
- assert self.now.humanize(later) == "a minute ago"
- assert later.humanize(self.now) == "in a minute"
-
- assert self.now.humanize(later, only_distance=True) == "a minute"
- assert later.humanize(self.now, only_distance=True) == "a minute"
-
- def test_minutes(self):
-
- later = self.now.shift(minutes=2)
-
- assert self.now.humanize(later) == "2 minutes ago"
- assert later.humanize(self.now) == "in 2 minutes"
-
- assert self.now.humanize(later, only_distance=True) == "2 minutes"
- assert later.humanize(self.now, only_distance=True) == "2 minutes"
-
- def test_hour(self):
-
- later = self.now.shift(hours=1)
-
- assert self.now.humanize(later) == "an hour ago"
- assert later.humanize(self.now) == "in an hour"
-
- assert self.now.humanize(later, only_distance=True) == "an hour"
- assert later.humanize(self.now, only_distance=True) == "an hour"
-
- def test_hours(self):
-
- later = self.now.shift(hours=2)
-
- assert self.now.humanize(later) == "2 hours ago"
- assert later.humanize(self.now) == "in 2 hours"
-
- assert self.now.humanize(later, only_distance=True) == "2 hours"
- assert later.humanize(self.now, only_distance=True) == "2 hours"
-
- def test_day(self):
-
- later = self.now.shift(days=1)
-
- assert self.now.humanize(later) == "a day ago"
- assert later.humanize(self.now) == "in a day"
-
- # regression test for issue #697
- less_than_48_hours = self.now.shift(
- days=1, hours=23, seconds=59, microseconds=999999
- )
- assert self.now.humanize(less_than_48_hours) == "a day ago"
- assert less_than_48_hours.humanize(self.now) == "in a day"
-
- less_than_48_hours_date = less_than_48_hours._datetime.date()
- with pytest.raises(TypeError):
- # the "other" argument of humanize() does not accept raw datetime.date objects
- self.now.humanize(less_than_48_hours_date)
-
- # convert from date to arrow object
- less_than_48_hours_date = arrow.Arrow.fromdate(less_than_48_hours_date)
- assert self.now.humanize(less_than_48_hours_date) == "a day ago"
- assert less_than_48_hours_date.humanize(self.now) == "in a day"
-
- assert self.now.humanize(later, only_distance=True) == "a day"
- assert later.humanize(self.now, only_distance=True) == "a day"
-
- def test_days(self):
-
- later = self.now.shift(days=2)
-
- assert self.now.humanize(later) == "2 days ago"
- assert later.humanize(self.now) == "in 2 days"
-
- assert self.now.humanize(later, only_distance=True) == "2 days"
- assert later.humanize(self.now, only_distance=True) == "2 days"
-
- # Regression tests for humanize bug referenced in issue 541
- later = self.now.shift(days=3)
- assert later.humanize(self.now) == "in 3 days"
-
- later = self.now.shift(days=3, seconds=1)
- assert later.humanize(self.now) == "in 3 days"
-
- later = self.now.shift(days=4)
- assert later.humanize(self.now) == "in 4 days"
-
- def test_week(self):
-
- later = self.now.shift(weeks=1)
-
- assert self.now.humanize(later) == "a week ago"
- assert later.humanize(self.now) == "in a week"
-
- assert self.now.humanize(later, only_distance=True) == "a week"
- assert later.humanize(self.now, only_distance=True) == "a week"
-
- def test_weeks(self):
-
- later = self.now.shift(weeks=2)
-
- assert self.now.humanize(later) == "2 weeks ago"
- assert later.humanize(self.now) == "in 2 weeks"
-
- assert self.now.humanize(later, only_distance=True) == "2 weeks"
- assert later.humanize(self.now, only_distance=True) == "2 weeks"
-
- def test_month(self):
-
- later = self.now.shift(months=1)
-
- assert self.now.humanize(later) == "a month ago"
- assert later.humanize(self.now) == "in a month"
-
- assert self.now.humanize(later, only_distance=True) == "a month"
- assert later.humanize(self.now, only_distance=True) == "a month"
-
- def test_months(self):
-
- later = self.now.shift(months=2)
- earlier = self.now.shift(months=-2)
-
- assert earlier.humanize(self.now) == "2 months ago"
- assert later.humanize(self.now) == "in 2 months"
-
- assert self.now.humanize(later, only_distance=True) == "2 months"
- assert later.humanize(self.now, only_distance=True) == "2 months"
-
- def test_year(self):
-
- later = self.now.shift(years=1)
-
- assert self.now.humanize(later) == "a year ago"
- assert later.humanize(self.now) == "in a year"
-
- assert self.now.humanize(later, only_distance=True) == "a year"
- assert later.humanize(self.now, only_distance=True) == "a year"
-
- def test_years(self):
-
- later = self.now.shift(years=2)
-
- assert self.now.humanize(later) == "2 years ago"
- assert later.humanize(self.now) == "in 2 years"
-
- assert self.now.humanize(later, only_distance=True) == "2 years"
- assert later.humanize(self.now, only_distance=True) == "2 years"
-
- arw = arrow.Arrow(2014, 7, 2)
-
- result = arw.humanize(self.datetime)
-
- assert result == "in 2 years"
-
- def test_arrow(self):
-
- arw = arrow.Arrow.fromdatetime(self.datetime)
-
- result = arw.humanize(arrow.Arrow.fromdatetime(self.datetime))
-
- assert result == "just now"
-
- def test_datetime_tzinfo(self):
-
- arw = arrow.Arrow.fromdatetime(self.datetime)
-
- result = arw.humanize(self.datetime.replace(tzinfo=tz.tzutc()))
-
- assert result == "just now"
-
- def test_other(self):
-
- arw = arrow.Arrow.fromdatetime(self.datetime)
-
- with pytest.raises(TypeError):
- arw.humanize(object())
-
- def test_invalid_locale(self):
-
- arw = arrow.Arrow.fromdatetime(self.datetime)
-
- with pytest.raises(ValueError):
- arw.humanize(locale="klingon")
-
- def test_none(self):
-
- arw = arrow.Arrow.utcnow()
-
- result = arw.humanize()
-
- assert result == "just now"
-
- result = arw.humanize(None)
-
- assert result == "just now"
-
- def test_untranslated_granularity(self, mocker):
-
- arw = arrow.Arrow.utcnow()
- later = arw.shift(weeks=1)
-
- # simulate an untranslated timeframe key
- mocker.patch.dict("arrow.locales.EnglishLocale.timeframes")
- del arrow.locales.EnglishLocale.timeframes["week"]
- with pytest.raises(ValueError):
- arw.humanize(later, granularity="week")
-
-
-@pytest.mark.usefixtures("time_2013_01_01")
-class TestArrowHumanizeTestsWithLocale:
- def test_now(self):
-
- arw = arrow.Arrow(2013, 1, 1, 0, 0, 0)
-
- result = arw.humanize(self.datetime, locale="ru")
-
- assert result == "сейчас"
-
- def test_seconds(self):
- arw = arrow.Arrow(2013, 1, 1, 0, 0, 44)
-
- result = arw.humanize(self.datetime, locale="ru")
-
- assert result == "через 44 несколько секунд"
-
- def test_years(self):
-
- arw = arrow.Arrow(2011, 7, 2)
-
- result = arw.humanize(self.datetime, locale="ru")
-
- assert result == "2 года назад"
-
-
-class TestArrowIsBetween:
- def test_start_before_end(self):
- target = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- start = arrow.Arrow.fromdatetime(datetime(2013, 5, 8))
- end = arrow.Arrow.fromdatetime(datetime(2013, 5, 5))
- result = target.is_between(start, end)
- assert not result
-
- def test_exclusive_exclusive_bounds(self):
- target = arrow.Arrow.fromdatetime(datetime(2013, 5, 5, 12, 30, 27))
- start = arrow.Arrow.fromdatetime(datetime(2013, 5, 5, 12, 30, 10))
- end = arrow.Arrow.fromdatetime(datetime(2013, 5, 5, 12, 30, 36))
- result = target.is_between(start, end, "()")
- assert result
- result = target.is_between(start, end)
- assert result
-
- def test_exclusive_exclusive_bounds_same_date(self):
- target = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- start = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- end = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- result = target.is_between(start, end, "()")
- assert not result
-
- def test_inclusive_exclusive_bounds(self):
- target = arrow.Arrow.fromdatetime(datetime(2013, 5, 6))
- start = arrow.Arrow.fromdatetime(datetime(2013, 5, 4))
- end = arrow.Arrow.fromdatetime(datetime(2013, 5, 6))
- result = target.is_between(start, end, "[)")
- assert not result
-
- def test_exclusive_inclusive_bounds(self):
- target = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- start = arrow.Arrow.fromdatetime(datetime(2013, 5, 5))
- end = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- result = target.is_between(start, end, "(]")
- assert result
-
- def test_inclusive_inclusive_bounds_same_date(self):
- target = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- start = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- end = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- result = target.is_between(start, end, "[]")
- assert result
-
- def test_type_error_exception(self):
- with pytest.raises(TypeError):
- target = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- start = datetime(2013, 5, 5)
- end = arrow.Arrow.fromdatetime(datetime(2013, 5, 8))
- target.is_between(start, end)
-
- with pytest.raises(TypeError):
- target = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- start = arrow.Arrow.fromdatetime(datetime(2013, 5, 5))
- end = datetime(2013, 5, 8)
- target.is_between(start, end)
-
- with pytest.raises(TypeError):
- target.is_between(None, None)
-
- def test_value_error_exception(self):
- target = arrow.Arrow.fromdatetime(datetime(2013, 5, 7))
- start = arrow.Arrow.fromdatetime(datetime(2013, 5, 5))
- end = arrow.Arrow.fromdatetime(datetime(2013, 5, 8))
- with pytest.raises(ValueError):
- target.is_between(start, end, "][")
- with pytest.raises(ValueError):
- target.is_between(start, end, "")
- with pytest.raises(ValueError):
- target.is_between(start, end, "]")
- with pytest.raises(ValueError):
- target.is_between(start, end, "[")
- with pytest.raises(ValueError):
- target.is_between(start, end, "hello")
-
-
-class TestArrowUtil:
- def test_get_datetime(self):
-
- get_datetime = arrow.Arrow._get_datetime
-
- arw = arrow.Arrow.utcnow()
- dt = datetime.utcnow()
- timestamp = time.time()
-
- assert get_datetime(arw) == arw.datetime
- assert get_datetime(dt) == dt
- assert (
- get_datetime(timestamp) == arrow.Arrow.utcfromtimestamp(timestamp).datetime
- )
-
- with pytest.raises(ValueError) as raise_ctx:
- get_datetime("abc")
- assert "not recognized as a datetime or timestamp" in str(raise_ctx.value)
-
- def test_get_tzinfo(self):
-
- get_tzinfo = arrow.Arrow._get_tzinfo
-
- with pytest.raises(ValueError) as raise_ctx:
- get_tzinfo("abc")
- assert "not recognized as a timezone" in str(raise_ctx.value)
-
- def test_get_iteration_params(self):
-
- assert arrow.Arrow._get_iteration_params("end", None) == ("end", sys.maxsize)
- assert arrow.Arrow._get_iteration_params(None, 100) == (arrow.Arrow.max, 100)
- assert arrow.Arrow._get_iteration_params(100, 120) == (100, 120)
-
- with pytest.raises(ValueError):
- arrow.Arrow._get_iteration_params(None, None)
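The fold/gap behaviour asserted above (`TestArrowFalsePositiveDst`, `TestArrowReplace.test_replace_fold`, `TestArrowShift.test_shift_positive_imaginary`) condenses into a short, self-contained sketch. This is an illustrative aside, not part of the deleted file, and it assumes the same vendored `arrow` package is importable:

```python
import arrow

# 2017-11-05 01:xx happens twice in America/New_York; fold=1 selects the
# second (post-transition) occurrence: same wall clock, different offset.
before = arrow.Arrow(2017, 11, 5, 1, tzinfo="America/New_York")
after = before.replace(fold=1)
assert before == after
assert before.utcoffset() != after.utcoffset()

# 2013-03-31 02:30 never exists in Europe/Paris (spring-forward gap).
assert arrow.Arrow(2013, 3, 31, 2, 30, tzinfo="Europe/Paris").imaginary

# shift() steps over the gap instead of producing an imaginary time.
ny = arrow.Arrow(2017, 3, 12, 1, 30, tzinfo="America/New_York")
assert ny.shift(hours=+1) == arrow.Arrow(
    2017, 3, 12, 3, 30, tzinfo="America/New_York"
)
```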
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_factory.py b/openpype/modules/ftrack/python2_vendor/arrow/tests/test_factory.py
deleted file mode 100644
index 2b8df5168f..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_factory.py
+++ /dev/null
@@ -1,390 +0,0 @@
-# -*- coding: utf-8 -*-
-import time
-from datetime import date, datetime
-
-import pytest
-from dateutil import tz
-
-from arrow.parser import ParserError
-
-from .utils import assert_datetime_equality
-
-
-@pytest.mark.usefixtures("arrow_factory")
-class TestGet:
- def test_no_args(self):
-
- assert_datetime_equality(
- self.factory.get(), datetime.utcnow().replace(tzinfo=tz.tzutc())
- )
-
- def test_timestamp_one_arg_no_arg(self):
-
- no_arg = self.factory.get(1406430900).timestamp
- one_arg = self.factory.get("1406430900", "X").timestamp
-
- assert no_arg == one_arg
-
- def test_one_arg_none(self):
-
- assert_datetime_equality(
- self.factory.get(None), datetime.utcnow().replace(tzinfo=tz.tzutc())
- )
-
- def test_struct_time(self):
-
- assert_datetime_equality(
- self.factory.get(time.gmtime()),
- datetime.utcnow().replace(tzinfo=tz.tzutc()),
- )
-
- def test_one_arg_timestamp(self):
-
- int_timestamp = int(time.time())
- timestamp_dt = datetime.utcfromtimestamp(int_timestamp).replace(
- tzinfo=tz.tzutc()
- )
-
- assert self.factory.get(int_timestamp) == timestamp_dt
-
- with pytest.raises(ParserError):
- self.factory.get(str(int_timestamp))
-
- float_timestamp = time.time()
- timestamp_dt = datetime.utcfromtimestamp(float_timestamp).replace(
- tzinfo=tz.tzutc()
- )
-
- assert self.factory.get(float_timestamp) == timestamp_dt
-
- with pytest.raises(ParserError):
- self.factory.get(str(float_timestamp))
-
- # Regression test for issue #216
- # Python 3 raises OverflowError, Python 2 raises ValueError
- timestamp = 99999999999999999999999999.99999999999999999999999999
- with pytest.raises((OverflowError, ValueError)):
- self.factory.get(timestamp)
-
- def test_one_arg_expanded_timestamp(self):
-
- millisecond_timestamp = 1591328104308
- microsecond_timestamp = 1591328104308505
-
- # Regression test for issue #796
- assert self.factory.get(millisecond_timestamp) == datetime.utcfromtimestamp(
- 1591328104.308
- ).replace(tzinfo=tz.tzutc())
- assert self.factory.get(microsecond_timestamp) == datetime.utcfromtimestamp(
- 1591328104.308505
- ).replace(tzinfo=tz.tzutc())
-
- def test_one_arg_timestamp_with_tzinfo(self):
-
- timestamp = time.time()
- timestamp_dt = datetime.fromtimestamp(timestamp, tz=tz.tzutc()).astimezone(
- tz.gettz("US/Pacific")
- )
- timezone = tz.gettz("US/Pacific")
-
- assert_datetime_equality(
- self.factory.get(timestamp, tzinfo=timezone), timestamp_dt
- )
-
- def test_one_arg_arrow(self):
-
- arw = self.factory.utcnow()
- result = self.factory.get(arw)
-
- assert arw == result
-
- def test_one_arg_datetime(self):
-
- dt = datetime.utcnow().replace(tzinfo=tz.tzutc())
-
- assert self.factory.get(dt) == dt
-
- def test_one_arg_date(self):
-
- d = date.today()
- dt = datetime(d.year, d.month, d.day, tzinfo=tz.tzutc())
-
- assert self.factory.get(d) == dt
-
- def test_one_arg_tzinfo(self):
-
- self.expected = (
- datetime.utcnow()
- .replace(tzinfo=tz.tzutc())
- .astimezone(tz.gettz("US/Pacific"))
- )
-
- assert_datetime_equality(
- self.factory.get(tz.gettz("US/Pacific")), self.expected
- )
-
- # regression test for issue #658
- def test_one_arg_dateparser_datetime(self):
- dateparser = pytest.importorskip("dateparser")
- expected = datetime(1990, 1, 1).replace(tzinfo=tz.tzutc())
- # dateparser outputs: datetime.datetime(1990, 1, 1, 0, 0, tzinfo=<StaticTzInfo 'UTC+00:00'>)
- parsed_date = dateparser.parse("1990-01-01T00:00:00+00:00")
- dt_output = self.factory.get(parsed_date)._datetime.replace(tzinfo=tz.tzutc())
- assert dt_output == expected
-
- def test_kwarg_tzinfo(self):
-
- self.expected = (
- datetime.utcnow()
- .replace(tzinfo=tz.tzutc())
- .astimezone(tz.gettz("US/Pacific"))
- )
-
- assert_datetime_equality(
- self.factory.get(tzinfo=tz.gettz("US/Pacific")), self.expected
- )
-
- def test_kwarg_tzinfo_string(self):
-
- self.expected = (
- datetime.utcnow()
- .replace(tzinfo=tz.tzutc())
- .astimezone(tz.gettz("US/Pacific"))
- )
-
- assert_datetime_equality(self.factory.get(tzinfo="US/Pacific"), self.expected)
-
- with pytest.raises(ParserError):
- self.factory.get(tzinfo="US/PacificInvalidTzinfo")
-
- def test_kwarg_normalize_whitespace(self):
- result = self.factory.get(
- "Jun 1 2005 1:33PM",
- "MMM D YYYY H:mmA",
- tzinfo=tz.tzutc(),
- normalize_whitespace=True,
- )
- assert result._datetime == datetime(2005, 6, 1, 13, 33, tzinfo=tz.tzutc())
-
- result = self.factory.get(
- "\t 2013-05-05T12:30:45.123456 \t \n",
- tzinfo=tz.tzutc(),
- normalize_whitespace=True,
- )
- assert result._datetime == datetime(
- 2013, 5, 5, 12, 30, 45, 123456, tzinfo=tz.tzutc()
- )
-
- def test_one_arg_iso_str(self):
-
- dt = datetime.utcnow()
-
- assert_datetime_equality(
- self.factory.get(dt.isoformat()), dt.replace(tzinfo=tz.tzutc())
- )
-
- def test_one_arg_iso_calendar(self):
-
- pairs = [
- (datetime(2004, 1, 4), (2004, 1, 7)),
- (datetime(2008, 12, 30), (2009, 1, 2)),
- (datetime(2010, 1, 2), (2009, 53, 6)),
- (datetime(2000, 2, 29), (2000, 9, 2)),
- (datetime(2005, 1, 1), (2004, 53, 6)),
- (datetime(2010, 1, 4), (2010, 1, 1)),
- (datetime(2010, 1, 3), (2009, 53, 7)),
- (datetime(2003, 12, 29), (2004, 1, 1)),
- ]
-
- for pair in pairs:
- dt, iso = pair
- assert self.factory.get(iso) == self.factory.get(dt)
-
- with pytest.raises(TypeError):
- self.factory.get((2014, 7, 1, 4))
-
- with pytest.raises(TypeError):
- self.factory.get((2014, 7))
-
- with pytest.raises(ValueError):
- self.factory.get((2014, 70, 1))
-
- with pytest.raises(ValueError):
- self.factory.get((2014, 7, 10))
-
- def test_one_arg_other(self):
-
- with pytest.raises(TypeError):
- self.factory.get(object())
-
- def test_one_arg_bool(self):
-
- with pytest.raises(TypeError):
- self.factory.get(False)
-
- with pytest.raises(TypeError):
- self.factory.get(True)
-
- def test_two_args_datetime_tzinfo(self):
-
- result = self.factory.get(datetime(2013, 1, 1), tz.gettz("US/Pacific"))
-
- assert result._datetime == datetime(2013, 1, 1, tzinfo=tz.gettz("US/Pacific"))
-
- def test_two_args_datetime_tz_str(self):
-
- result = self.factory.get(datetime(2013, 1, 1), "US/Pacific")
-
- assert result._datetime == datetime(2013, 1, 1, tzinfo=tz.gettz("US/Pacific"))
-
- def test_two_args_date_tzinfo(self):
-
- result = self.factory.get(date(2013, 1, 1), tz.gettz("US/Pacific"))
-
- assert result._datetime == datetime(2013, 1, 1, tzinfo=tz.gettz("US/Pacific"))
-
- def test_two_args_date_tz_str(self):
-
- result = self.factory.get(date(2013, 1, 1), "US/Pacific")
-
- assert result._datetime == datetime(2013, 1, 1, tzinfo=tz.gettz("US/Pacific"))
-
- def test_two_args_datetime_other(self):
-
- with pytest.raises(TypeError):
- self.factory.get(datetime.utcnow(), object())
-
- def test_two_args_date_other(self):
-
- with pytest.raises(TypeError):
- self.factory.get(date.today(), object())
-
- def test_two_args_str_str(self):
-
- result = self.factory.get("2013-01-01", "YYYY-MM-DD")
-
- assert result._datetime == datetime(2013, 1, 1, tzinfo=tz.tzutc())
-
- def test_two_args_str_tzinfo(self):
-
- result = self.factory.get("2013-01-01", tzinfo=tz.gettz("US/Pacific"))
-
- assert_datetime_equality(
- result._datetime, datetime(2013, 1, 1, tzinfo=tz.gettz("US/Pacific"))
- )
-
- def test_two_args_twitter_format(self):
-
- # format returned by the Twitter API for created_at:
- twitter_date = "Fri Apr 08 21:08:54 +0000 2016"
- result = self.factory.get(twitter_date, "ddd MMM DD HH:mm:ss Z YYYY")
-
- assert result._datetime == datetime(2016, 4, 8, 21, 8, 54, tzinfo=tz.tzutc())
-
- def test_two_args_str_list(self):
-
- result = self.factory.get("2013-01-01", ["MM/DD/YYYY", "YYYY-MM-DD"])
-
- assert result._datetime == datetime(2013, 1, 1, tzinfo=tz.tzutc())
-
- def test_two_args_unicode_unicode(self):
-
- result = self.factory.get(u"2013-01-01", u"YYYY-MM-DD")
-
- assert result._datetime == datetime(2013, 1, 1, tzinfo=tz.tzutc())
-
- def test_two_args_other(self):
-
- with pytest.raises(TypeError):
- self.factory.get(object(), object())
-
- def test_three_args_with_tzinfo(self):
-
- timefmt = "YYYYMMDD"
- d = "20150514"
-
- assert self.factory.get(d, timefmt, tzinfo=tz.tzlocal()) == datetime(
- 2015, 5, 14, tzinfo=tz.tzlocal()
- )
-
- def test_three_args(self):
-
- assert self.factory.get(2013, 1, 1) == datetime(2013, 1, 1, tzinfo=tz.tzutc())
-
- def test_full_kwargs(self):
-
- assert (
- self.factory.get(
- year=2016,
- month=7,
- day=14,
- hour=7,
- minute=16,
- second=45,
- microsecond=631092,
- )
- == datetime(2016, 7, 14, 7, 16, 45, 631092, tzinfo=tz.tzutc())
- )
-
- def test_three_kwargs(self):
-
- assert self.factory.get(year=2016, month=7, day=14) == datetime(
- 2016, 7, 14, 0, 0, tzinfo=tz.tzutc()
- )
-
- def test_tzinfo_string_kwargs(self):
- result = self.factory.get("2019072807", "YYYYMMDDHH", tzinfo="UTC")
- assert result._datetime == datetime(2019, 7, 28, 7, 0, 0, 0, tzinfo=tz.tzutc())
-
- def test_insufficient_kwargs(self):
-
- with pytest.raises(TypeError):
- self.factory.get(year=2016)
-
- with pytest.raises(TypeError):
- self.factory.get(year=2016, month=7)
-
- def test_locale(self):
- result = self.factory.get("2010", "YYYY", locale="ja")
- assert result._datetime == datetime(2010, 1, 1, 0, 0, 0, 0, tzinfo=tz.tzutc())
-
- # regression test for issue #701
- result = self.factory.get(
- "Montag, 9. September 2019, 16:15-20:00", "dddd, D. MMMM YYYY", locale="de"
- )
- assert result._datetime == datetime(2019, 9, 9, 0, 0, 0, 0, tzinfo=tz.tzutc())
-
- def test_locale_kwarg_only(self):
- res = self.factory.get(locale="ja")
- assert res.tzinfo == tz.tzutc()
-
- def test_locale_with_tzinfo(self):
- res = self.factory.get(locale="ja", tzinfo=tz.gettz("Asia/Tokyo"))
- assert res.tzinfo == tz.gettz("Asia/Tokyo")
-
-
-@pytest.mark.usefixtures("arrow_factory")
-class TestUtcNow:
- def test_utcnow(self):
-
- assert_datetime_equality(
- self.factory.utcnow()._datetime,
- datetime.utcnow().replace(tzinfo=tz.tzutc()),
- )
-
-
-@pytest.mark.usefixtures("arrow_factory")
-class TestNow:
- def test_no_tz(self):
-
- assert_datetime_equality(self.factory.now(), datetime.now(tz.tzlocal()))
-
- def test_tzinfo(self):
-
- assert_datetime_equality(
- self.factory.now(tz.gettz("EST")), datetime.now(tz.gettz("EST"))
- )
-
- def test_tz_str(self):
-
- assert_datetime_equality(self.factory.now("EST"), datetime.now(tz.gettz("EST")))
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_formatter.py b/openpype/modules/ftrack/python2_vendor/arrow/tests/test_formatter.py
deleted file mode 100644
index e97aeb5dcc..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_formatter.py
+++ /dev/null
@@ -1,282 +0,0 @@
-# -*- coding: utf-8 -*-
-from datetime import datetime
-
-import pytest
-import pytz
-from dateutil import tz as dateutil_tz
-
-from arrow import (
- FORMAT_ATOM,
- FORMAT_COOKIE,
- FORMAT_RFC822,
- FORMAT_RFC850,
- FORMAT_RFC1036,
- FORMAT_RFC1123,
- FORMAT_RFC2822,
- FORMAT_RFC3339,
- FORMAT_RSS,
- FORMAT_W3C,
-)
-
-from .utils import make_full_tz_list
-
-
-@pytest.mark.usefixtures("arrow_formatter")
-class TestFormatterFormatToken:
- def test_format(self):
-
- dt = datetime(2013, 2, 5, 12, 32, 51)
-
- result = self.formatter.format(dt, "MM-DD-YYYY hh:mm:ss a")
-
- assert result == "02-05-2013 12:32:51 pm"
-
- def test_year(self):
-
- dt = datetime(2013, 1, 1)
- assert self.formatter._format_token(dt, "YYYY") == "2013"
- assert self.formatter._format_token(dt, "YY") == "13"
-
- def test_month(self):
-
- dt = datetime(2013, 1, 1)
- assert self.formatter._format_token(dt, "MMMM") == "January"
- assert self.formatter._format_token(dt, "MMM") == "Jan"
- assert self.formatter._format_token(dt, "MM") == "01"
- assert self.formatter._format_token(dt, "M") == "1"
-
- def test_day(self):
-
- dt = datetime(2013, 2, 1)
- assert self.formatter._format_token(dt, "DDDD") == "032"
- assert self.formatter._format_token(dt, "DDD") == "32"
- assert self.formatter._format_token(dt, "DD") == "01"
- assert self.formatter._format_token(dt, "D") == "1"
- assert self.formatter._format_token(dt, "Do") == "1st"
-
- assert self.formatter._format_token(dt, "dddd") == "Friday"
- assert self.formatter._format_token(dt, "ddd") == "Fri"
- assert self.formatter._format_token(dt, "d") == "5"
-
- def test_hour(self):
-
- dt = datetime(2013, 1, 1, 2)
- assert self.formatter._format_token(dt, "HH") == "02"
- assert self.formatter._format_token(dt, "H") == "2"
-
- dt = datetime(2013, 1, 1, 13)
- assert self.formatter._format_token(dt, "HH") == "13"
- assert self.formatter._format_token(dt, "H") == "13"
-
- dt = datetime(2013, 1, 1, 2)
- assert self.formatter._format_token(dt, "hh") == "02"
- assert self.formatter._format_token(dt, "h") == "2"
-
- dt = datetime(2013, 1, 1, 13)
- assert self.formatter._format_token(dt, "hh") == "01"
- assert self.formatter._format_token(dt, "h") == "1"
-
- # test that 12-hour time converts to '12' at midnight
- dt = datetime(2013, 1, 1, 0)
- assert self.formatter._format_token(dt, "hh") == "12"
- assert self.formatter._format_token(dt, "h") == "12"
-
- def test_minute(self):
-
- dt = datetime(2013, 1, 1, 0, 1)
- assert self.formatter._format_token(dt, "mm") == "01"
- assert self.formatter._format_token(dt, "m") == "1"
-
- def test_second(self):
-
- dt = datetime(2013, 1, 1, 0, 0, 1)
- assert self.formatter._format_token(dt, "ss") == "01"
- assert self.formatter._format_token(dt, "s") == "1"
-
- def test_sub_second(self):
-
- dt = datetime(2013, 1, 1, 0, 0, 0, 123456)
- assert self.formatter._format_token(dt, "SSSSSS") == "123456"
- assert self.formatter._format_token(dt, "SSSSS") == "12345"
- assert self.formatter._format_token(dt, "SSSS") == "1234"
- assert self.formatter._format_token(dt, "SSS") == "123"
- assert self.formatter._format_token(dt, "SS") == "12"
- assert self.formatter._format_token(dt, "S") == "1"
-
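- # 2000 microseconds exercises the zero padding of each sub-second token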
- dt = datetime(2013, 1, 1, 0, 0, 0, 2000)
- assert self.formatter._format_token(dt, "SSSSSS") == "002000"
- assert self.formatter._format_token(dt, "SSSSS") == "00200"
- assert self.formatter._format_token(dt, "SSSS") == "0020"
- assert self.formatter._format_token(dt, "SSS") == "002"
- assert self.formatter._format_token(dt, "SS") == "00"
- assert self.formatter._format_token(dt, "S") == "0"
-
- def test_timestamp(self):
-
- timestamp = 1588437009.8952794
- dt = datetime.utcfromtimestamp(timestamp)
- expected = str(int(timestamp))
- assert self.formatter._format_token(dt, "X") == expected
-
- # compare against whole microseconds, because a float timestamp may carry
- # more than 6 digits of sub-second precision
- expected = str(int(timestamp * 1000000))
- assert self.formatter._format_token(dt, "x") == expected
-
- def test_timezone(self):
-
- dt = datetime.utcnow().replace(tzinfo=dateutil_tz.gettz("US/Pacific"))
-
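- # US/Pacific is UTC-08:00 (PST) or UTC-07:00 (PDT) depending on when the test runs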
- result = self.formatter._format_token(dt, "ZZ")
- assert result == "-07:00" or result == "-08:00"
-
- result = self.formatter._format_token(dt, "Z")
- assert result == "-0700" or result == "-0800"
-
- @pytest.mark.parametrize("full_tz_name", make_full_tz_list())
- def test_timezone_formatter(self, full_tz_name):
-
- # This test would fail around a DST transition if it used "now" as the date
- dt = datetime(1986, 2, 14, tzinfo=pytz.timezone("UTC")).replace(
- tzinfo=dateutil_tz.gettz(full_tz_name)
- )
- abbreviation = dt.tzname()
-
- result = self.formatter._format_token(dt, "ZZZ")
- assert result == abbreviation
-
- def test_am_pm(self):
-
- dt = datetime(2012, 1, 1, 11)
- assert self.formatter._format_token(dt, "a") == "am"
- assert self.formatter._format_token(dt, "A") == "AM"
-
- dt = datetime(2012, 1, 1, 13)
- assert self.formatter._format_token(dt, "a") == "pm"
- assert self.formatter._format_token(dt, "A") == "PM"
-
- def test_week(self):
- dt = datetime(2017, 5, 19)
- assert self.formatter._format_token(dt, "W") == "2017-W20-5"
-
- # make sure the week is zero-padded when needed
- dt_early = datetime(2011, 1, 20)
- assert self.formatter._format_token(dt_early, "W") == "2011-W03-4"
-
- def test_nonsense(self):
- dt = datetime(2012, 1, 1, 11)
- assert self.formatter._format_token(dt, None) is None
- assert self.formatter._format_token(dt, "NONSENSE") is None
-
- def test_escape(self):
-
- assert (
- self.formatter.format(
- datetime(2015, 12, 10, 17, 9), "MMMM D, YYYY [at] h:mma"
- )
- == "December 10, 2015 at 5:09pm"
- )
-
- assert (
- self.formatter.format(
- datetime(2015, 12, 10, 17, 9), "[MMMM] M D, YYYY [at] h:mma"
- )
- == "MMMM 12 10, 2015 at 5:09pm"
- )
-
- assert (
- self.formatter.format(
- datetime(1990, 11, 25),
- "[It happened on] MMMM Do [in the year] YYYY [a long time ago]",
- )
- == "It happened on November 25th in the year 1990 a long time ago"
- )
-
- assert (
- self.formatter.format(
- datetime(1990, 11, 25),
- "[It happened on] MMMM Do [in the][ year] YYYY [a long time ago]",
- )
- == "It happened on November 25th in the year 1990 a long time ago"
- )
-
- assert (
- self.formatter.format(
- datetime(1, 1, 1), "[I'm][ entirely][ escaped,][ weee!]"
- )
- == "I'm entirely escaped, weee!"
- )
-
- # Special RegEx characters
- assert (
- self.formatter.format(
- datetime(2017, 12, 31, 2, 0), "MMM DD, YYYY |^${}().*+?<>-& h:mm A"
- )
- == "Dec 31, 2017 |^${}().*+?<>-& 2:00 AM"
- )
-
- # Escaping is atomic: brackets inside brackets are treated literally
- assert self.formatter.format(datetime(1, 1, 1), "[[[ ]]") == "[[ ]"
-
-
-@pytest.mark.usefixtures("arrow_formatter", "time_1975_12_25")
-class TestFormatterBuiltinFormats:
- def test_atom(self):
- assert (
- self.formatter.format(self.datetime, FORMAT_ATOM)
- == "1975-12-25 14:15:16-05:00"
- )
-
- def test_cookie(self):
- assert (
- self.formatter.format(self.datetime, FORMAT_COOKIE)
- == "Thursday, 25-Dec-1975 14:15:16 EST"
- )
-
- def test_rfc_822(self):
- assert (
- self.formatter.format(self.datetime, FORMAT_RFC822)
- == "Thu, 25 Dec 75 14:15:16 -0500"
- )
-
- def test_rfc_850(self):
- assert (
- self.formatter.format(self.datetime, FORMAT_RFC850)
- == "Thursday, 25-Dec-75 14:15:16 EST"
- )
-
- def test_rfc_1036(self):
- assert (
- self.formatter.format(self.datetime, FORMAT_RFC1036)
- == "Thu, 25 Dec 75 14:15:16 -0500"
- )
-
- def test_rfc_1123(self):
- assert (
- self.formatter.format(self.datetime, FORMAT_RFC1123)
- == "Thu, 25 Dec 1975 14:15:16 -0500"
- )
-
- def test_rfc_2822(self):
- assert (
- self.formatter.format(self.datetime, FORMAT_RFC2822)
- == "Thu, 25 Dec 1975 14:15:16 -0500"
- )
-
- def test_rfc3339(self):
- assert (
- self.formatter.format(self.datetime, FORMAT_RFC3339)
- == "1975-12-25 14:15:16-05:00"
- )
-
- def test_rss(self):
- assert (
- self.formatter.format(self.datetime, FORMAT_RSS)
- == "Thu, 25 Dec 1975 14:15:16 -0500"
- )
-
- def test_w3c(self):
- assert (
- self.formatter.format(self.datetime, FORMAT_W3C)
- == "1975-12-25 14:15:16-05:00"
- )
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_locales.py b/openpype/modules/ftrack/python2_vendor/arrow/tests/test_locales.py
deleted file mode 100644
index 006ccdd5ba..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_locales.py
+++ /dev/null
@@ -1,1352 +0,0 @@
-# -*- coding: utf-8 -*-
-from __future__ import unicode_literals
-
-import pytest
-
-from arrow import arrow, locales
-
-
-@pytest.mark.usefixtures("lang_locales")
-class TestLocaleValidation:
- """Validate locales to ensure that translations are valid and complete"""
-
- def test_locale_validation(self):
-
- for _, locale_cls in self.locales.items():
- # 7 days + 1 spacer to allow for 1-indexing of the days of the week
- assert len(locale_cls.day_names) == 8
- assert locale_cls.day_names[0] == ""
- # ensure that all strings from index 1 onward are valid (not blank or None)
- assert all(locale_cls.day_names[1:])
-
- assert len(locale_cls.day_abbreviations) == 8
- assert locale_cls.day_abbreviations[0] == ""
- assert all(locale_cls.day_abbreviations[1:])
-
- # 12 months + 1 spacer to allow for 1-indexing of months
- assert len(locale_cls.month_names) == 13
- assert locale_cls.month_names[0] == ""
- assert all(locale_cls.month_names[1:])
-
- assert len(locale_cls.month_abbreviations) == 13
- assert locale_cls.month_abbreviations[0] == ""
- assert all(locale_cls.month_abbreviations[1:])
-
- assert len(locale_cls.names) > 0
- assert locale_cls.past is not None
- assert locale_cls.future is not None
-
-
-class TestModule:
- def test_get_locale(self, mocker):
- mock_locale = mocker.Mock()
- mock_locale_cls = mocker.Mock()
- mock_locale_cls.return_value = mock_locale
-
- with pytest.raises(ValueError):
- arrow.locales.get_locale("locale_name")
-
- cls_dict = arrow.locales._locales
- mocker.patch.dict(cls_dict, {"locale_name": mock_locale_cls})
-
- result = arrow.locales.get_locale("locale_name")
-
- assert result == mock_locale
-
- def test_get_locale_by_class_name(self, mocker):
- mock_locale_cls = mocker.Mock()
- mock_locale_obj = mock_locale_cls.return_value = mocker.Mock()
-
- globals_fn = mocker.Mock()
- globals_fn.return_value = {"NonExistentLocale": mock_locale_cls}
-
- with pytest.raises(ValueError):
- arrow.locales.get_locale_by_class_name("NonExistentLocale")
-
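- # the lookup goes through the module-level globals, so patch that to expose the fake class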
- mocker.patch.object(locales, "globals", globals_fn)
- result = arrow.locales.get_locale_by_class_name("NonExistentLocale")
-
- mock_locale_cls.assert_called_once_with()
- assert result == mock_locale_obj
-
- def test_locales(self):
-
- assert len(locales._locales) > 0
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestEnglishLocale:
- def test_describe(self):
- assert self.locale.describe("now", only_distance=True) == "instantly"
- assert self.locale.describe("now", only_distance=False) == "just now"
-
- def test_format_timeframe(self):
-
- assert self.locale._format_timeframe("hours", 2) == "2 hours"
- assert self.locale._format_timeframe("hour", 0) == "an hour"
-
- def test_format_relative_now(self):
-
- result = self.locale._format_relative("just now", "now", 0)
-
- assert result == "just now"
-
- def test_format_relative_past(self):
-
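- # positive delta means a time in the future: "in an hour"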
- result = self.locale._format_relative("an hour", "hour", 1)
-
- assert result == "in an hour"
-
- def test_format_relative_future(self):
-
- result = self.locale._format_relative("an hour", "hour", -1)
-
- assert result == "an hour ago"
-
- def test_ordinal_number(self):
- assert self.locale.ordinal_number(0) == "0th"
- assert self.locale.ordinal_number(1) == "1st"
- assert self.locale.ordinal_number(2) == "2nd"
- assert self.locale.ordinal_number(3) == "3rd"
- assert self.locale.ordinal_number(4) == "4th"
- assert self.locale.ordinal_number(10) == "10th"
- assert self.locale.ordinal_number(11) == "11th"
- assert self.locale.ordinal_number(12) == "12th"
- assert self.locale.ordinal_number(13) == "13th"
- assert self.locale.ordinal_number(14) == "14th"
- assert self.locale.ordinal_number(21) == "21st"
- assert self.locale.ordinal_number(22) == "22nd"
- assert self.locale.ordinal_number(23) == "23rd"
- assert self.locale.ordinal_number(24) == "24th"
-
- assert self.locale.ordinal_number(100) == "100th"
- assert self.locale.ordinal_number(101) == "101st"
- assert self.locale.ordinal_number(102) == "102nd"
- assert self.locale.ordinal_number(103) == "103rd"
- assert self.locale.ordinal_number(104) == "104th"
- assert self.locale.ordinal_number(110) == "110th"
- assert self.locale.ordinal_number(111) == "111th"
- assert self.locale.ordinal_number(112) == "112th"
- assert self.locale.ordinal_number(113) == "113th"
- assert self.locale.ordinal_number(114) == "114th"
- assert self.locale.ordinal_number(121) == "121st"
- assert self.locale.ordinal_number(122) == "122nd"
- assert self.locale.ordinal_number(123) == "123rd"
- assert self.locale.ordinal_number(124) == "124th"
-
- def test_meridian_invalid_token(self):
- assert self.locale.meridian(7, None) is None
- assert self.locale.meridian(7, "B") is None
- assert self.locale.meridian(7, "NONSENSE") is None
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestItalianLocale:
- def test_ordinal_number(self):
- assert self.locale.ordinal_number(1) == "1º"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestSpanishLocale:
- def test_ordinal_number(self):
- assert self.locale.ordinal_number(1) == "1º"
-
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("now", 0) == "ahora"
- assert self.locale._format_timeframe("seconds", 1) == "1 segundos"
- assert self.locale._format_timeframe("seconds", 3) == "3 segundos"
- assert self.locale._format_timeframe("seconds", 30) == "30 segundos"
- assert self.locale._format_timeframe("minute", 1) == "un minuto"
- assert self.locale._format_timeframe("minutes", 4) == "4 minutos"
- assert self.locale._format_timeframe("minutes", 40) == "40 minutos"
- assert self.locale._format_timeframe("hour", 1) == "una hora"
- assert self.locale._format_timeframe("hours", 5) == "5 horas"
- assert self.locale._format_timeframe("hours", 23) == "23 horas"
- assert self.locale._format_timeframe("day", 1) == "un día"
- assert self.locale._format_timeframe("days", 6) == "6 días"
- assert self.locale._format_timeframe("days", 12) == "12 días"
- assert self.locale._format_timeframe("week", 1) == "una semana"
- assert self.locale._format_timeframe("weeks", 2) == "2 semanas"
- assert self.locale._format_timeframe("weeks", 3) == "3 semanas"
- assert self.locale._format_timeframe("month", 1) == "un mes"
- assert self.locale._format_timeframe("months", 7) == "7 meses"
- assert self.locale._format_timeframe("months", 11) == "11 meses"
- assert self.locale._format_timeframe("year", 1) == "un año"
- assert self.locale._format_timeframe("years", 8) == "8 años"
- assert self.locale._format_timeframe("years", 12) == "12 años"
-
- assert self.locale._format_timeframe("now", 0) == "ahora"
- assert self.locale._format_timeframe("seconds", -1) == "1 segundos"
- assert self.locale._format_timeframe("seconds", -9) == "9 segundos"
- assert self.locale._format_timeframe("seconds", -12) == "12 segundos"
- assert self.locale._format_timeframe("minute", -1) == "un minuto"
- assert self.locale._format_timeframe("minutes", -2) == "2 minutos"
- assert self.locale._format_timeframe("minutes", -10) == "10 minutos"
- assert self.locale._format_timeframe("hour", -1) == "una hora"
- assert self.locale._format_timeframe("hours", -3) == "3 horas"
- assert self.locale._format_timeframe("hours", -11) == "11 horas"
- assert self.locale._format_timeframe("day", -1) == "un día"
- assert self.locale._format_timeframe("days", -2) == "2 días"
- assert self.locale._format_timeframe("days", -12) == "12 días"
- assert self.locale._format_timeframe("week", -1) == "una semana"
- assert self.locale._format_timeframe("weeks", -2) == "2 semanas"
- assert self.locale._format_timeframe("weeks", -3) == "3 semanas"
- assert self.locale._format_timeframe("month", -1) == "un mes"
- assert self.locale._format_timeframe("months", -3) == "3 meses"
- assert self.locale._format_timeframe("months", -13) == "13 meses"
- assert self.locale._format_timeframe("year", -1) == "un año"
- assert self.locale._format_timeframe("years", -4) == "4 años"
- assert self.locale._format_timeframe("years", -14) == "14 años"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestFrenchLocale:
- def test_ordinal_number(self):
- assert self.locale.ordinal_number(1) == "1er"
- assert self.locale.ordinal_number(2) == "2e"
-
- def test_month_abbreviation(self):
- assert "juil" in self.locale.month_abbreviations
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestFrenchCanadianLocale:
- def test_month_abbreviation(self):
- assert "juill" in self.locale.month_abbreviations
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestRussianLocale:
- def test_plurals2(self):
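- # plural form tracks the count: 1 and 21 -> час; 2-4 and 22-24 -> часа; 0, 5-20 and 25 -> часов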
- assert self.locale._format_timeframe("hours", 0) == "0 часов"
- assert self.locale._format_timeframe("hours", 1) == "1 час"
- assert self.locale._format_timeframe("hours", 2) == "2 часа"
- assert self.locale._format_timeframe("hours", 4) == "4 часа"
- assert self.locale._format_timeframe("hours", 5) == "5 часов"
- assert self.locale._format_timeframe("hours", 21) == "21 час"
- assert self.locale._format_timeframe("hours", 22) == "22 часа"
- assert self.locale._format_timeframe("hours", 25) == "25 часов"
-
- # feminine grammatical gender should be tested separately
- assert self.locale._format_timeframe("minutes", 0) == "0 минут"
- assert self.locale._format_timeframe("minutes", 1) == "1 минуту"
- assert self.locale._format_timeframe("minutes", 2) == "2 минуты"
- assert self.locale._format_timeframe("minutes", 4) == "4 минуты"
- assert self.locale._format_timeframe("minutes", 5) == "5 минут"
- assert self.locale._format_timeframe("minutes", 21) == "21 минуту"
- assert self.locale._format_timeframe("minutes", 22) == "22 минуты"
- assert self.locale._format_timeframe("minutes", 25) == "25 минут"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestPolishLocale:
- def test_plurals(self):
-
- assert self.locale._format_timeframe("seconds", 0) == "0 sekund"
- assert self.locale._format_timeframe("second", 1) == "sekundę"
- assert self.locale._format_timeframe("seconds", 2) == "2 sekundy"
- assert self.locale._format_timeframe("seconds", 5) == "5 sekund"
- assert self.locale._format_timeframe("seconds", 21) == "21 sekund"
- assert self.locale._format_timeframe("seconds", 22) == "22 sekundy"
- assert self.locale._format_timeframe("seconds", 25) == "25 sekund"
-
- assert self.locale._format_timeframe("minutes", 0) == "0 minut"
- assert self.locale._format_timeframe("minute", 1) == "minutę"
- assert self.locale._format_timeframe("minutes", 2) == "2 minuty"
- assert self.locale._format_timeframe("minutes", 5) == "5 minut"
- assert self.locale._format_timeframe("minutes", 21) == "21 minut"
- assert self.locale._format_timeframe("minutes", 22) == "22 minuty"
- assert self.locale._format_timeframe("minutes", 25) == "25 minut"
-
- assert self.locale._format_timeframe("hours", 0) == "0 godzin"
- assert self.locale._format_timeframe("hour", 1) == "godzinę"
- assert self.locale._format_timeframe("hours", 2) == "2 godziny"
- assert self.locale._format_timeframe("hours", 5) == "5 godzin"
- assert self.locale._format_timeframe("hours", 21) == "21 godzin"
- assert self.locale._format_timeframe("hours", 22) == "22 godziny"
- assert self.locale._format_timeframe("hours", 25) == "25 godzin"
-
- assert self.locale._format_timeframe("weeks", 0) == "0 tygodni"
- assert self.locale._format_timeframe("week", 1) == "tydzień"
- assert self.locale._format_timeframe("weeks", 2) == "2 tygodnie"
- assert self.locale._format_timeframe("weeks", 5) == "5 tygodni"
- assert self.locale._format_timeframe("weeks", 21) == "21 tygodni"
- assert self.locale._format_timeframe("weeks", 22) == "22 tygodnie"
- assert self.locale._format_timeframe("weeks", 25) == "25 tygodni"
-
- assert self.locale._format_timeframe("months", 0) == "0 miesięcy"
- assert self.locale._format_timeframe("month", 1) == "miesiąc"
- assert self.locale._format_timeframe("months", 2) == "2 miesiące"
- assert self.locale._format_timeframe("months", 5) == "5 miesięcy"
- assert self.locale._format_timeframe("months", 21) == "21 miesięcy"
- assert self.locale._format_timeframe("months", 22) == "22 miesiące"
- assert self.locale._format_timeframe("months", 25) == "25 miesięcy"
-
- assert self.locale._format_timeframe("years", 0) == "0 lat"
- assert self.locale._format_timeframe("year", 1) == "rok"
- assert self.locale._format_timeframe("years", 2) == "2 lata"
- assert self.locale._format_timeframe("years", 5) == "5 lat"
- assert self.locale._format_timeframe("years", 21) == "21 lat"
- assert self.locale._format_timeframe("years", 22) == "22 lata"
- assert self.locale._format_timeframe("years", 25) == "25 lat"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestIcelandicLocale:
- def test_format_timeframe(self):
-
- assert self.locale._format_timeframe("minute", -1) == "einni mínútu"
- assert self.locale._format_timeframe("minute", 1) == "eina mínútu"
-
- assert self.locale._format_timeframe("hours", -2) == "2 tímum"
- assert self.locale._format_timeframe("hours", 2) == "2 tíma"
- assert self.locale._format_timeframe("now", 0) == "rétt í þessu"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestMalayalamLocale:
- def test_format_timeframe(self):
-
- assert self.locale._format_timeframe("hours", 2) == "2 മണിക്കൂർ"
- assert self.locale._format_timeframe("hour", 0) == "ഒരു മണിക്കൂർ"
-
- def test_format_relative_now(self):
-
- result = self.locale._format_relative("ഇപ്പോൾ", "now", 0)
-
- assert result == "ഇപ്പോൾ"
-
- def test_format_relative_past(self):
-
- result = self.locale._format_relative("ഒരു മണിക്കൂർ", "hour", 1)
- assert result == "ഒരു മണിക്കൂർ ശേഷം"
-
- def test_format_relative_future(self):
-
- result = self.locale._format_relative("ഒരു മണിക്കൂർ", "hour", -1)
- assert result == "ഒരു മണിക്കൂർ മുമ്പ്"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestHindiLocale:
- def test_format_timeframe(self):
-
- assert self.locale._format_timeframe("hours", 2) == "2 घंटे"
- assert self.locale._format_timeframe("hour", 0) == "एक घंटा"
-
- def test_format_relative_now(self):
-
- result = self.locale._format_relative("अभी", "now", 0)
- assert result == "अभी"
-
- def test_format_relative_past(self):
-
- result = self.locale._format_relative("एक घंटा", "hour", 1)
- assert result == "एक घंटा बाद"
-
- def test_format_relative_future(self):
-
- result = self.locale._format_relative("एक घंटा", "hour", -1)
- assert result == "एक घंटा पहले"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestCzechLocale:
- def test_format_timeframe(self):
-
- assert self.locale._format_timeframe("hours", 2) == "2 hodiny"
- assert self.locale._format_timeframe("hours", 5) == "5 hodin"
- assert self.locale._format_timeframe("hour", 0) == "0 hodin"
- assert self.locale._format_timeframe("hours", -2) == "2 hodinami"
- assert self.locale._format_timeframe("hours", -5) == "5 hodinami"
- assert self.locale._format_timeframe("now", 0) == "Teď"
-
- assert self.locale._format_timeframe("weeks", 2) == "2 týdny"
- assert self.locale._format_timeframe("weeks", 5) == "5 týdnů"
- assert self.locale._format_timeframe("week", 0) == "0 týdnů"
- assert self.locale._format_timeframe("weeks", -2) == "2 týdny"
- assert self.locale._format_timeframe("weeks", -5) == "5 týdny"
-
- def test_format_relative_now(self):
-
- result = self.locale._format_relative("Teď", "now", 0)
- assert result == "Teď"
-
- def test_format_relative_future(self):
-
- result = self.locale._format_relative("hodinu", "hour", 1)
- assert result == "Za hodinu"
-
- def test_format_relative_past(self):
-
- result = self.locale._format_relative("hodinou", "hour", -1)
- assert result == "Před hodinou"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestSlovakLocale:
- def test_format_timeframe(self):
-
- assert self.locale._format_timeframe("seconds", -5) == "5 sekundami"
- assert self.locale._format_timeframe("seconds", -2) == "2 sekundami"
- assert self.locale._format_timeframe("second", -1) == "sekundou"
- assert self.locale._format_timeframe("second", 0) == "0 sekúnd"
- assert self.locale._format_timeframe("second", 1) == "sekundu"
- assert self.locale._format_timeframe("seconds", 2) == "2 sekundy"
- assert self.locale._format_timeframe("seconds", 5) == "5 sekúnd"
-
- assert self.locale._format_timeframe("minutes", -5) == "5 minútami"
- assert self.locale._format_timeframe("minutes", -2) == "2 minútami"
- assert self.locale._format_timeframe("minute", -1) == "minútou"
- assert self.locale._format_timeframe("minute", 0) == "0 minút"
- assert self.locale._format_timeframe("minute", 1) == "minútu"
- assert self.locale._format_timeframe("minutes", 2) == "2 minúty"
- assert self.locale._format_timeframe("minutes", 5) == "5 minút"
-
- assert self.locale._format_timeframe("hours", -5) == "5 hodinami"
- assert self.locale._format_timeframe("hours", -2) == "2 hodinami"
- assert self.locale._format_timeframe("hour", -1) == "hodinou"
- assert self.locale._format_timeframe("hour", 0) == "0 hodín"
- assert self.locale._format_timeframe("hour", 1) == "hodinu"
- assert self.locale._format_timeframe("hours", 2) == "2 hodiny"
- assert self.locale._format_timeframe("hours", 5) == "5 hodín"
-
- assert self.locale._format_timeframe("days", -5) == "5 dňami"
- assert self.locale._format_timeframe("days", -2) == "2 dňami"
- assert self.locale._format_timeframe("day", -1) == "dňom"
- assert self.locale._format_timeframe("day", 0) == "0 dní"
- assert self.locale._format_timeframe("day", 1) == "deň"
- assert self.locale._format_timeframe("days", 2) == "2 dni"
- assert self.locale._format_timeframe("days", 5) == "5 dní"
-
- assert self.locale._format_timeframe("weeks", -5) == "5 týždňami"
- assert self.locale._format_timeframe("weeks", -2) == "2 týždňami"
- assert self.locale._format_timeframe("week", -1) == "týždňom"
- assert self.locale._format_timeframe("week", 0) == "0 týždňov"
- assert self.locale._format_timeframe("week", 1) == "týždeň"
- assert self.locale._format_timeframe("weeks", 2) == "2 týždne"
- assert self.locale._format_timeframe("weeks", 5) == "5 týždňov"
-
- assert self.locale._format_timeframe("months", -5) == "5 mesiacmi"
- assert self.locale._format_timeframe("months", -2) == "2 mesiacmi"
- assert self.locale._format_timeframe("month", -1) == "mesiacom"
- assert self.locale._format_timeframe("month", 0) == "0 mesiacov"
- assert self.locale._format_timeframe("month", 1) == "mesiac"
- assert self.locale._format_timeframe("months", 2) == "2 mesiace"
- assert self.locale._format_timeframe("months", 5) == "5 mesiacov"
-
- assert self.locale._format_timeframe("years", -5) == "5 rokmi"
- assert self.locale._format_timeframe("years", -2) == "2 rokmi"
- assert self.locale._format_timeframe("year", -1) == "rokom"
- assert self.locale._format_timeframe("year", 0) == "0 rokov"
- assert self.locale._format_timeframe("year", 1) == "rok"
- assert self.locale._format_timeframe("years", 2) == "2 roky"
- assert self.locale._format_timeframe("years", 5) == "5 rokov"
-
- assert self.locale._format_timeframe("now", 0) == "Teraz"
-
- def test_format_relative_now(self):
-
- result = self.locale._format_relative("Teraz", "now", 0)
- assert result == "Teraz"
-
- def test_format_relative_future(self):
-
- result = self.locale._format_relative("hodinu", "hour", 1)
- assert result == "O hodinu"
-
- def test_format_relative_past(self):
-
- result = self.locale._format_relative("hodinou", "hour", -1)
- assert result == "Pred hodinou"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestBulgarianLocale:
- def test_plurals2(self):
- assert self.locale._format_timeframe("hours", 0) == "0 часа"
- assert self.locale._format_timeframe("hours", 1) == "1 час"
- assert self.locale._format_timeframe("hours", 2) == "2 часа"
- assert self.locale._format_timeframe("hours", 4) == "4 часа"
- assert self.locale._format_timeframe("hours", 5) == "5 часа"
- assert self.locale._format_timeframe("hours", 21) == "21 час"
- assert self.locale._format_timeframe("hours", 22) == "22 часа"
- assert self.locale._format_timeframe("hours", 25) == "25 часа"
-
- # feminine grammatical gender should be tested separately
- assert self.locale._format_timeframe("minutes", 0) == "0 минути"
- assert self.locale._format_timeframe("minutes", 1) == "1 минута"
- assert self.locale._format_timeframe("minutes", 2) == "2 минути"
- assert self.locale._format_timeframe("minutes", 4) == "4 минути"
- assert self.locale._format_timeframe("minutes", 5) == "5 минути"
- assert self.locale._format_timeframe("minutes", 21) == "21 минута"
- assert self.locale._format_timeframe("minutes", 22) == "22 минути"
- assert self.locale._format_timeframe("minutes", 25) == "25 минути"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestMacedonianLocale:
- def test_singles_mk(self):
- assert self.locale._format_timeframe("second", 1) == "една секунда"
- assert self.locale._format_timeframe("minute", 1) == "една минута"
- assert self.locale._format_timeframe("hour", 1) == "еден саат"
- assert self.locale._format_timeframe("day", 1) == "еден ден"
- assert self.locale._format_timeframe("week", 1) == "една недела"
- assert self.locale._format_timeframe("month", 1) == "еден месец"
- assert self.locale._format_timeframe("year", 1) == "една година"
-
- def test_meridians_mk(self):
- assert self.locale.meridian(7, "A") == "претпладне"
- assert self.locale.meridian(18, "A") == "попладне"
- assert self.locale.meridian(10, "a") == "дп"
- assert self.locale.meridian(22, "a") == "пп"
-
- def test_describe_mk(self):
- assert self.locale.describe("second", only_distance=True) == "една секунда"
- assert self.locale.describe("second", only_distance=False) == "за една секунда"
- assert self.locale.describe("minute", only_distance=True) == "една минута"
- assert self.locale.describe("minute", only_distance=False) == "за една минута"
- assert self.locale.describe("hour", only_distance=True) == "еден саат"
- assert self.locale.describe("hour", only_distance=False) == "за еден саат"
- assert self.locale.describe("day", only_distance=True) == "еден ден"
- assert self.locale.describe("day", only_distance=False) == "за еден ден"
- assert self.locale.describe("week", only_distance=True) == "една недела"
- assert self.locale.describe("week", only_distance=False) == "за една недела"
- assert self.locale.describe("month", only_distance=True) == "еден месец"
- assert self.locale.describe("month", only_distance=False) == "за еден месец"
- assert self.locale.describe("year", only_distance=True) == "една година"
- assert self.locale.describe("year", only_distance=False) == "за една година"
-
- def test_relative_mk(self):
- # time
- assert self.locale._format_relative("сега", "now", 0) == "сега"
- assert self.locale._format_relative("1 секунда", "seconds", 1) == "за 1 секунда"
- assert self.locale._format_relative("1 минута", "minutes", 1) == "за 1 минута"
- assert self.locale._format_relative("1 саат", "hours", 1) == "за 1 саат"
- assert self.locale._format_relative("1 ден", "days", 1) == "за 1 ден"
- assert self.locale._format_relative("1 недела", "weeks", 1) == "за 1 недела"
- assert self.locale._format_relative("1 месец", "months", 1) == "за 1 месец"
- assert self.locale._format_relative("1 година", "years", 1) == "за 1 година"
- assert (
- self.locale._format_relative("1 секунда", "seconds", -1) == "пред 1 секунда"
- )
- assert (
- self.locale._format_relative("1 минута", "minutes", -1) == "пред 1 минута"
- )
- assert self.locale._format_relative("1 саат", "hours", -1) == "пред 1 саат"
- assert self.locale._format_relative("1 ден", "days", -1) == "пред 1 ден"
- assert self.locale._format_relative("1 недела", "weeks", -1) == "пред 1 недела"
- assert self.locale._format_relative("1 месец", "months", -1) == "пред 1 месец"
- assert self.locale._format_relative("1 година", "years", -1) == "пред 1 година"
-
- def test_plurals_mk(self):
- # Seconds
- assert self.locale._format_timeframe("seconds", 0) == "0 секунди"
- assert self.locale._format_timeframe("seconds", 1) == "1 секунда"
- assert self.locale._format_timeframe("seconds", 2) == "2 секунди"
- assert self.locale._format_timeframe("seconds", 4) == "4 секунди"
- assert self.locale._format_timeframe("seconds", 5) == "5 секунди"
- assert self.locale._format_timeframe("seconds", 21) == "21 секунда"
- assert self.locale._format_timeframe("seconds", 22) == "22 секунди"
- assert self.locale._format_timeframe("seconds", 25) == "25 секунди"
-
- # Minutes
- assert self.locale._format_timeframe("minutes", 0) == "0 минути"
- assert self.locale._format_timeframe("minutes", 1) == "1 минута"
- assert self.locale._format_timeframe("minutes", 2) == "2 минути"
- assert self.locale._format_timeframe("minutes", 4) == "4 минути"
- assert self.locale._format_timeframe("minutes", 5) == "5 минути"
- assert self.locale._format_timeframe("minutes", 21) == "21 минута"
- assert self.locale._format_timeframe("minutes", 22) == "22 минути"
- assert self.locale._format_timeframe("minutes", 25) == "25 минути"
-
- # Hours
- assert self.locale._format_timeframe("hours", 0) == "0 саати"
- assert self.locale._format_timeframe("hours", 1) == "1 саат"
- assert self.locale._format_timeframe("hours", 2) == "2 саати"
- assert self.locale._format_timeframe("hours", 4) == "4 саати"
- assert self.locale._format_timeframe("hours", 5) == "5 саати"
- assert self.locale._format_timeframe("hours", 21) == "21 саат"
- assert self.locale._format_timeframe("hours", 22) == "22 саати"
- assert self.locale._format_timeframe("hours", 25) == "25 саати"
-
- # Days
- assert self.locale._format_timeframe("days", 0) == "0 дена"
- assert self.locale._format_timeframe("days", 1) == "1 ден"
- assert self.locale._format_timeframe("days", 2) == "2 дена"
- assert self.locale._format_timeframe("days", 3) == "3 дена"
- assert self.locale._format_timeframe("days", 21) == "21 ден"
-
- # Weeks
- assert self.locale._format_timeframe("weeks", 0) == "0 недели"
- assert self.locale._format_timeframe("weeks", 1) == "1 недела"
- assert self.locale._format_timeframe("weeks", 2) == "2 недели"
- assert self.locale._format_timeframe("weeks", 4) == "4 недели"
- assert self.locale._format_timeframe("weeks", 5) == "5 недели"
- assert self.locale._format_timeframe("weeks", 21) == "21 недела"
- assert self.locale._format_timeframe("weeks", 22) == "22 недели"
- assert self.locale._format_timeframe("weeks", 25) == "25 недели"
-
- # Months
- assert self.locale._format_timeframe("months", 0) == "0 месеци"
- assert self.locale._format_timeframe("months", 1) == "1 месец"
- assert self.locale._format_timeframe("months", 2) == "2 месеци"
- assert self.locale._format_timeframe("months", 4) == "4 месеци"
- assert self.locale._format_timeframe("months", 5) == "5 месеци"
- assert self.locale._format_timeframe("months", 21) == "21 месец"
- assert self.locale._format_timeframe("months", 22) == "22 месеци"
- assert self.locale._format_timeframe("months", 25) == "25 месеци"
-
- # Years
- assert self.locale._format_timeframe("years", 1) == "1 година"
- assert self.locale._format_timeframe("years", 2) == "2 години"
- assert self.locale._format_timeframe("years", 5) == "5 години"
-
- def test_multi_describe_mk(self):
- describe = self.locale.describe_multi
-
- fulltest = [("years", 5), ("weeks", 1), ("hours", 1), ("minutes", 6)]
- assert describe(fulltest) == "за 5 години 1 недела 1 саат 6 минути"
- seconds4000_0days = [("days", 0), ("hours", 1), ("minutes", 6)]
- assert describe(seconds4000_0days) == "за 0 дена 1 саат 6 минути"
- seconds4000 = [("hours", 1), ("minutes", 6)]
- assert describe(seconds4000) == "за 1 саат 6 минути"
- assert describe(seconds4000, only_distance=True) == "1 саат 6 минути"
- seconds3700 = [("hours", 1), ("minutes", 1)]
- assert describe(seconds3700) == "за 1 саат 1 минута"
- seconds300_0hours = [("hours", 0), ("minutes", 5)]
- assert describe(seconds300_0hours) == "за 0 саати 5 минути"
- seconds300 = [("minutes", 5)]
- assert describe(seconds300) == "за 5 минути"
- seconds60 = [("minutes", 1)]
- assert describe(seconds60) == "за 1 минута"
- assert describe(seconds60, only_distance=True) == "1 минута"
- seconds60 = [("seconds", 1)]
- assert describe(seconds60) == "за 1 секунда"
- assert describe(seconds60, only_distance=True) == "1 секунда"
-
-
-@pytest.mark.usefixtures("time_2013_01_01")
-@pytest.mark.usefixtures("lang_locale")
-class TestHebrewLocale:
- def test_couple_of_timeframe(self):
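- # Hebrew uses dedicated dual forms for exactly two (יומיים, שעתיים, שבועיים, ...)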
- assert self.locale._format_timeframe("days", 1) == "יום"
- assert self.locale._format_timeframe("days", 2) == "יומיים"
- assert self.locale._format_timeframe("days", 3) == "3 ימים"
-
- assert self.locale._format_timeframe("hours", 1) == "שעה"
- assert self.locale._format_timeframe("hours", 2) == "שעתיים"
- assert self.locale._format_timeframe("hours", 3) == "3 שעות"
-
- assert self.locale._format_timeframe("week", 1) == "שבוע"
- assert self.locale._format_timeframe("weeks", 2) == "שבועיים"
- assert self.locale._format_timeframe("weeks", 3) == "3 שבועות"
-
- assert self.locale._format_timeframe("months", 1) == "חודש"
- assert self.locale._format_timeframe("months", 2) == "חודשיים"
- assert self.locale._format_timeframe("months", 4) == "4 חודשים"
-
- assert self.locale._format_timeframe("years", 1) == "שנה"
- assert self.locale._format_timeframe("years", 2) == "שנתיים"
- assert self.locale._format_timeframe("years", 5) == "5 שנים"
-
- def test_describe_multi(self):
- describe = self.locale.describe_multi
-
- fulltest = [("years", 5), ("weeks", 1), ("hours", 1), ("minutes", 6)]
- assert describe(fulltest) == "בעוד 5 שנים, שבוע, שעה ו־6 דקות"
- seconds4000_0days = [("days", 0), ("hours", 1), ("minutes", 6)]
- assert describe(seconds4000_0days) == "בעוד 0 ימים, שעה ו־6 דקות"
- seconds4000 = [("hours", 1), ("minutes", 6)]
- assert describe(seconds4000) == "בעוד שעה ו־6 דקות"
- assert describe(seconds4000, only_distance=True) == "שעה ו־6 דקות"
- seconds3700 = [("hours", 1), ("minutes", 1)]
- assert describe(seconds3700) == "בעוד שעה ודקה"
- seconds300_0hours = [("hours", 0), ("minutes", 5)]
- assert describe(seconds300_0hours) == "בעוד 0 שעות ו־5 דקות"
- seconds300 = [("minutes", 5)]
- assert describe(seconds300) == "בעוד 5 דקות"
- seconds60 = [("minutes", 1)]
- assert describe(seconds60) == "בעוד דקה"
- assert describe(seconds60, only_distance=True) == "דקה"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestMarathiLocale:
- def test_dateCoreFunctionality(self):
- dt = arrow.Arrow(2015, 4, 11, 17, 30, 00)
- assert self.locale.month_name(dt.month) == "एप्रिल"
- assert self.locale.month_abbreviation(dt.month) == "एप्रि"
- assert self.locale.day_name(dt.isoweekday()) == "शनिवार"
- assert self.locale.day_abbreviation(dt.isoweekday()) == "शनि"
-
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("hours", 2) == "2 तास"
- assert self.locale._format_timeframe("hour", 0) == "एक तास"
-
- def test_format_relative_now(self):
- result = self.locale._format_relative("सद्य", "now", 0)
- assert result == "सद्य"
-
- def test_format_relative_past(self):
- result = self.locale._format_relative("एक तास", "hour", 1)
- assert result == "एक तास नंतर"
-
- def test_format_relative_future(self):
- result = self.locale._format_relative("एक तास", "hour", -1)
- assert result == "एक तास आधी"
-
- # Not currently implemented
- def test_ordinal_number(self):
- assert self.locale.ordinal_number(1) == "1"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestFinnishLocale:
- def test_format_timeframe(self):
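- # each frame carries two forms: the second combines with "kuluttua" (in ...), the first with "sitten" (... ago)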
- assert self.locale._format_timeframe("hours", 2) == ("2 tuntia", "2 tunnin")
- assert self.locale._format_timeframe("hour", 0) == ("tunti", "tunnin")
-
- def test_format_relative_now(self):
- result = self.locale._format_relative(["juuri nyt", "juuri nyt"], "now", 0)
- assert result == "juuri nyt"
-
- def test_format_relative_past(self):
- result = self.locale._format_relative(["tunti", "tunnin"], "hour", 1)
- assert result == "tunnin kuluttua"
-
- def test_format_relative_future(self):
- result = self.locale._format_relative(["tunti", "tunnin"], "hour", -1)
- assert result == "tunti sitten"
-
- def test_ordinal_number(self):
- assert self.locale.ordinal_number(1) == "1."
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestGermanLocale:
- def test_ordinal_number(self):
- assert self.locale.ordinal_number(1) == "1."
-
- def test_define(self):
- assert self.locale.describe("minute", only_distance=True) == "eine Minute"
- assert self.locale.describe("minute", only_distance=False) == "in einer Minute"
- assert self.locale.describe("hour", only_distance=True) == "eine Stunde"
- assert self.locale.describe("hour", only_distance=False) == "in einer Stunde"
- assert self.locale.describe("day", only_distance=True) == "ein Tag"
- assert self.locale.describe("day", only_distance=False) == "in einem Tag"
- assert self.locale.describe("week", only_distance=True) == "eine Woche"
- assert self.locale.describe("week", only_distance=False) == "in einer Woche"
- assert self.locale.describe("month", only_distance=True) == "ein Monat"
- assert self.locale.describe("month", only_distance=False) == "in einem Monat"
- assert self.locale.describe("year", only_distance=True) == "ein Jahr"
- assert self.locale.describe("year", only_distance=False) == "in einem Jahr"
-
- def test_weekday(self):
- dt = arrow.Arrow(2015, 4, 11, 17, 30, 00)
- assert self.locale.day_name(dt.isoweekday()) == "Samstag"
- assert self.locale.day_abbreviation(dt.isoweekday()) == "Sa"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestHungarianLocale:
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("hours", 2) == "2 óra"
- assert self.locale._format_timeframe("hour", 0) == "egy órával"
- assert self.locale._format_timeframe("hours", -2) == "2 órával"
- assert self.locale._format_timeframe("now", 0) == "éppen most"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestEsperantoLocale:
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("hours", 2) == "2 horoj"
- assert self.locale._format_timeframe("hour", 0) == "un horo"
- assert self.locale._format_timeframe("hours", -2) == "2 horoj"
- assert self.locale._format_timeframe("now", 0) == "nun"
-
- def test_ordinal_number(self):
- assert self.locale.ordinal_number(1) == "1a"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestThaiLocale:
- def test_year_full(self):
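- # Thai dates use the Buddhist Era, which is the Gregorian year plus 543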
- assert self.locale.year_full(2015) == "2558"
-
- def test_year_abbreviation(self):
- assert self.locale.year_abbreviation(2015) == "58"
-
- def test_format_relative_now(self):
- result = self.locale._format_relative("ขณะนี้", "now", 0)
- assert result == "ขณะนี้"
-
- def test_format_relative_past(self):
- result = self.locale._format_relative("1 ชั่วโมง", "hour", 1)
- assert result == "ในอีก 1 ชั่วโมง"
- result = self.locale._format_relative("{0} ชั่วโมง", "hours", 2)
- assert result == "ในอีก {0} ชั่วโมง"
- result = self.locale._format_relative("ไม่กี่วินาที", "seconds", 42)
- assert result == "ในอีกไม่กี่วินาที"
-
- def test_format_relative_future(self):
- result = self.locale._format_relative("1 ชั่วโมง", "hour", -1)
- assert result == "1 ชั่วโมง ที่ผ่านมา"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestBengaliLocale:
- def test_ordinal_number(self):
- assert self.locale._ordinal_number(0) == "0তম"
- assert self.locale._ordinal_number(1) == "1ম"
- assert self.locale._ordinal_number(3) == "3য়"
- assert self.locale._ordinal_number(4) == "4র্থ"
- assert self.locale._ordinal_number(5) == "5ম"
- assert self.locale._ordinal_number(6) == "6ষ্ঠ"
- assert self.locale._ordinal_number(10) == "10ম"
- assert self.locale._ordinal_number(11) == "11তম"
- assert self.locale._ordinal_number(42) == "42তম"
- assert self.locale._ordinal_number(-1) is None
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestRomanianLocale:
- def test_timeframes(self):
-
- assert self.locale._format_timeframe("hours", 2) == "2 ore"
- assert self.locale._format_timeframe("months", 2) == "2 luni"
-
- assert self.locale._format_timeframe("days", 2) == "2 zile"
- assert self.locale._format_timeframe("years", 2) == "2 ani"
-
- assert self.locale._format_timeframe("hours", 3) == "3 ore"
- assert self.locale._format_timeframe("months", 4) == "4 luni"
- assert self.locale._format_timeframe("days", 3) == "3 zile"
- assert self.locale._format_timeframe("years", 5) == "5 ani"
-
- def test_relative_timeframes(self):
- assert self.locale._format_relative("acum", "now", 0) == "acum"
- assert self.locale._format_relative("o oră", "hour", 1) == "peste o oră"
- assert self.locale._format_relative("o oră", "hour", -1) == "o oră în urmă"
- assert self.locale._format_relative("un minut", "minute", 1) == "peste un minut"
- assert (
- self.locale._format_relative("un minut", "minute", -1) == "un minut în urmă"
- )
- assert (
- self.locale._format_relative("câteva secunde", "seconds", -1)
- == "câteva secunde în urmă"
- )
- assert (
- self.locale._format_relative("câteva secunde", "seconds", 1)
- == "peste câteva secunde"
- )
- assert self.locale._format_relative("o zi", "day", -1) == "o zi în urmă"
- assert self.locale._format_relative("o zi", "day", 1) == "peste o zi"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestArabicLocale:
- def test_timeframes(self):
-
- # single
- assert self.locale._format_timeframe("minute", 1) == "دقيقة"
- assert self.locale._format_timeframe("hour", 1) == "ساعة"
- assert self.locale._format_timeframe("day", 1) == "يوم"
- assert self.locale._format_timeframe("month", 1) == "شهر"
- assert self.locale._format_timeframe("year", 1) == "سنة"
-
- # double
- assert self.locale._format_timeframe("minutes", 2) == "دقيقتين"
- assert self.locale._format_timeframe("hours", 2) == "ساعتين"
- assert self.locale._format_timeframe("days", 2) == "يومين"
- assert self.locale._format_timeframe("months", 2) == "شهرين"
- assert self.locale._format_timeframe("years", 2) == "سنتين"
-
- # up to ten
- assert self.locale._format_timeframe("minutes", 3) == "3 دقائق"
- assert self.locale._format_timeframe("hours", 4) == "4 ساعات"
- assert self.locale._format_timeframe("days", 5) == "5 أيام"
- assert self.locale._format_timeframe("months", 6) == "6 أشهر"
- assert self.locale._format_timeframe("years", 10) == "10 سنوات"
-
- # more than ten
- assert self.locale._format_timeframe("minutes", 11) == "11 دقيقة"
- assert self.locale._format_timeframe("hours", 19) == "19 ساعة"
- assert self.locale._format_timeframe("months", 24) == "24 شهر"
- assert self.locale._format_timeframe("days", 50) == "50 يوم"
- assert self.locale._format_timeframe("years", 115) == "115 سنة"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestNepaliLocale:
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("hours", 3) == "3 घण्टा"
- assert self.locale._format_timeframe("hour", 0) == "एक घण्टा"
-
- def test_format_relative_now(self):
- result = self.locale._format_relative("अहिले", "now", 0)
- assert result == "अहिले"
-
- def test_format_relative_future(self):
- result = self.locale._format_relative("एक घण्टा", "hour", 1)
- assert result == "एक घण्टा पछी"
-
- def test_format_relative_past(self):
- result = self.locale._format_relative("एक घण्टा", "hour", -1)
- assert result == "एक घण्टा पहिले"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestIndonesianLocale:
- def test_timeframes(self):
- assert self.locale._format_timeframe("hours", 2) == "2 jam"
- assert self.locale._format_timeframe("months", 2) == "2 bulan"
-
- assert self.locale._format_timeframe("days", 2) == "2 hari"
- assert self.locale._format_timeframe("years", 2) == "2 tahun"
-
- assert self.locale._format_timeframe("hours", 3) == "3 jam"
- assert self.locale._format_timeframe("months", 4) == "4 bulan"
- assert self.locale._format_timeframe("days", 3) == "3 hari"
- assert self.locale._format_timeframe("years", 5) == "5 tahun"
-
- def test_format_relative_now(self):
- assert self.locale._format_relative("baru saja", "now", 0) == "baru saja"
-
- def test_format_relative_past(self):
- assert self.locale._format_relative("1 jam", "hour", 1) == "dalam 1 jam"
- assert self.locale._format_relative("1 detik", "seconds", 1) == "dalam 1 detik"
-
- def test_format_relative_future(self):
- assert self.locale._format_relative("1 jam", "hour", -1) == "1 jam yang lalu"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestTagalogLocale:
- def test_singles_tl(self):
- assert self.locale._format_timeframe("second", 1) == "isang segundo"
- assert self.locale._format_timeframe("minute", 1) == "isang minuto"
- assert self.locale._format_timeframe("hour", 1) == "isang oras"
- assert self.locale._format_timeframe("day", 1) == "isang araw"
- assert self.locale._format_timeframe("week", 1) == "isang linggo"
- assert self.locale._format_timeframe("month", 1) == "isang buwan"
- assert self.locale._format_timeframe("year", 1) == "isang taon"
-
- def test_meridians_tl(self):
- assert self.locale.meridian(7, "A") == "ng umaga"
- assert self.locale.meridian(18, "A") == "ng hapon"
- assert self.locale.meridian(10, "a") == "nu"
- assert self.locale.meridian(22, "a") == "nh"
-
- def test_describe_tl(self):
- assert self.locale.describe("second", only_distance=True) == "isang segundo"
- assert (
- self.locale.describe("second", only_distance=False)
- == "isang segundo mula ngayon"
- )
- assert self.locale.describe("minute", only_distance=True) == "isang minuto"
- assert (
- self.locale.describe("minute", only_distance=False)
- == "isang minuto mula ngayon"
- )
- assert self.locale.describe("hour", only_distance=True) == "isang oras"
- assert (
- self.locale.describe("hour", only_distance=False)
- == "isang oras mula ngayon"
- )
- assert self.locale.describe("day", only_distance=True) == "isang araw"
- assert (
- self.locale.describe("day", only_distance=False) == "isang araw mula ngayon"
- )
- assert self.locale.describe("week", only_distance=True) == "isang linggo"
- assert (
- self.locale.describe("week", only_distance=False)
- == "isang linggo mula ngayon"
- )
- assert self.locale.describe("month", only_distance=True) == "isang buwan"
- assert (
- self.locale.describe("month", only_distance=False)
- == "isang buwan mula ngayon"
- )
- assert self.locale.describe("year", only_distance=True) == "isang taon"
- assert (
- self.locale.describe("year", only_distance=False)
- == "isang taon mula ngayon"
- )
-
- def test_relative_tl(self):
- # time
- assert self.locale._format_relative("ngayon", "now", 0) == "ngayon"
- assert (
- self.locale._format_relative("1 segundo", "seconds", 1)
- == "1 segundo mula ngayon"
- )
- assert (
- self.locale._format_relative("1 minuto", "minutes", 1)
- == "1 minuto mula ngayon"
- )
- assert (
- self.locale._format_relative("1 oras", "hours", 1) == "1 oras mula ngayon"
- )
- assert self.locale._format_relative("1 araw", "days", 1) == "1 araw mula ngayon"
- assert (
- self.locale._format_relative("1 linggo", "weeks", 1)
- == "1 linggo mula ngayon"
- )
- assert (
- self.locale._format_relative("1 buwan", "months", 1)
- == "1 buwan mula ngayon"
- )
- assert (
- self.locale._format_relative("1 taon", "years", 1) == "1 taon mula ngayon"
- )
- assert (
- self.locale._format_relative("1 segundo", "seconds", -1)
- == "nakaraang 1 segundo"
- )
- assert (
- self.locale._format_relative("1 minuto", "minutes", -1)
- == "nakaraang 1 minuto"
- )
- assert self.locale._format_relative("1 oras", "hours", -1) == "nakaraang 1 oras"
- assert self.locale._format_relative("1 araw", "days", -1) == "nakaraang 1 araw"
- assert (
- self.locale._format_relative("1 linggo", "weeks", -1)
- == "nakaraang 1 linggo"
- )
- assert (
- self.locale._format_relative("1 buwan", "months", -1) == "nakaraang 1 buwan"
- )
- assert self.locale._format_relative("1 taon", "years", -1) == "nakaraang 1 taon"
-
- def test_plurals_tl(self):
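- # Tagalog nouns do not inflect for number, so the same form appears for every count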
- # Seconds
- assert self.locale._format_timeframe("seconds", 0) == "0 segundo"
- assert self.locale._format_timeframe("seconds", 1) == "1 segundo"
- assert self.locale._format_timeframe("seconds", 2) == "2 segundo"
- assert self.locale._format_timeframe("seconds", 4) == "4 segundo"
- assert self.locale._format_timeframe("seconds", 5) == "5 segundo"
- assert self.locale._format_timeframe("seconds", 21) == "21 segundo"
- assert self.locale._format_timeframe("seconds", 22) == "22 segundo"
- assert self.locale._format_timeframe("seconds", 25) == "25 segundo"
-
- # Minutes
- assert self.locale._format_timeframe("minutes", 0) == "0 minuto"
- assert self.locale._format_timeframe("minutes", 1) == "1 minuto"
- assert self.locale._format_timeframe("minutes", 2) == "2 minuto"
- assert self.locale._format_timeframe("minutes", 4) == "4 minuto"
- assert self.locale._format_timeframe("minutes", 5) == "5 minuto"
- assert self.locale._format_timeframe("minutes", 21) == "21 minuto"
- assert self.locale._format_timeframe("minutes", 22) == "22 minuto"
- assert self.locale._format_timeframe("minutes", 25) == "25 minuto"
-
- # Hours
- assert self.locale._format_timeframe("hours", 0) == "0 oras"
- assert self.locale._format_timeframe("hours", 1) == "1 oras"
- assert self.locale._format_timeframe("hours", 2) == "2 oras"
- assert self.locale._format_timeframe("hours", 4) == "4 oras"
- assert self.locale._format_timeframe("hours", 5) == "5 oras"
- assert self.locale._format_timeframe("hours", 21) == "21 oras"
- assert self.locale._format_timeframe("hours", 22) == "22 oras"
- assert self.locale._format_timeframe("hours", 25) == "25 oras"
-
- # Days
- assert self.locale._format_timeframe("days", 0) == "0 araw"
- assert self.locale._format_timeframe("days", 1) == "1 araw"
- assert self.locale._format_timeframe("days", 2) == "2 araw"
- assert self.locale._format_timeframe("days", 3) == "3 araw"
- assert self.locale._format_timeframe("days", 21) == "21 araw"
-
- # Weeks
- assert self.locale._format_timeframe("weeks", 0) == "0 linggo"
- assert self.locale._format_timeframe("weeks", 1) == "1 linggo"
- assert self.locale._format_timeframe("weeks", 2) == "2 linggo"
- assert self.locale._format_timeframe("weeks", 4) == "4 linggo"
- assert self.locale._format_timeframe("weeks", 5) == "5 linggo"
- assert self.locale._format_timeframe("weeks", 21) == "21 linggo"
- assert self.locale._format_timeframe("weeks", 22) == "22 linggo"
- assert self.locale._format_timeframe("weeks", 25) == "25 linggo"
-
- # Months
- assert self.locale._format_timeframe("months", 0) == "0 buwan"
- assert self.locale._format_timeframe("months", 1) == "1 buwan"
- assert self.locale._format_timeframe("months", 2) == "2 buwan"
- assert self.locale._format_timeframe("months", 4) == "4 buwan"
- assert self.locale._format_timeframe("months", 5) == "5 buwan"
- assert self.locale._format_timeframe("months", 21) == "21 buwan"
- assert self.locale._format_timeframe("months", 22) == "22 buwan"
- assert self.locale._format_timeframe("months", 25) == "25 buwan"
-
- # Years
- assert self.locale._format_timeframe("years", 1) == "1 taon"
- assert self.locale._format_timeframe("years", 2) == "2 taon"
- assert self.locale._format_timeframe("years", 5) == "5 taon"
-
- def test_multi_describe_tl(self):
- describe = self.locale.describe_multi
-
- fulltest = [("years", 5), ("weeks", 1), ("hours", 1), ("minutes", 6)]
- assert describe(fulltest) == "5 taon 1 linggo 1 oras 6 minuto mula ngayon"
- seconds4000_0days = [("days", 0), ("hours", 1), ("minutes", 6)]
- assert describe(seconds4000_0days) == "0 araw 1 oras 6 minuto mula ngayon"
- seconds4000 = [("hours", 1), ("minutes", 6)]
- assert describe(seconds4000) == "1 oras 6 minuto mula ngayon"
- assert describe(seconds4000, only_distance=True) == "1 oras 6 minuto"
- seconds3700 = [("hours", 1), ("minutes", 1)]
- assert describe(seconds3700) == "1 oras 1 minuto mula ngayon"
- seconds300_0hours = [("hours", 0), ("minutes", 5)]
- assert describe(seconds300_0hours) == "0 oras 5 minuto mula ngayon"
- seconds300 = [("minutes", 5)]
- assert describe(seconds300) == "5 minuto mula ngayon"
- seconds60 = [("minutes", 1)]
- assert describe(seconds60) == "1 minuto mula ngayon"
- assert describe(seconds60, only_distance=True) == "1 minuto"
- seconds60 = [("seconds", 1)]
- assert describe(seconds60) == "1 segundo mula ngayon"
- assert describe(seconds60, only_distance=True) == "1 segundo"
-
- def test_ordinal_number_tl(self):
- assert self.locale.ordinal_number(0) == "ika-0"
- assert self.locale.ordinal_number(1) == "ika-1"
- assert self.locale.ordinal_number(2) == "ika-2"
- assert self.locale.ordinal_number(3) == "ika-3"
- assert self.locale.ordinal_number(10) == "ika-10"
- assert self.locale.ordinal_number(23) == "ika-23"
- assert self.locale.ordinal_number(100) == "ika-100"
- assert self.locale.ordinal_number(103) == "ika-103"
- assert self.locale.ordinal_number(114) == "ika-114"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestEstonianLocale:
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("now", 0) == "just nüüd"
- assert self.locale._format_timeframe("second", 1) == "ühe sekundi"
- assert self.locale._format_timeframe("seconds", 3) == "3 sekundi"
- assert self.locale._format_timeframe("seconds", 30) == "30 sekundi"
- assert self.locale._format_timeframe("minute", 1) == "ühe minuti"
- assert self.locale._format_timeframe("minutes", 4) == "4 minuti"
- assert self.locale._format_timeframe("minutes", 40) == "40 minuti"
- assert self.locale._format_timeframe("hour", 1) == "tunni aja"
- assert self.locale._format_timeframe("hours", 5) == "5 tunni"
- assert self.locale._format_timeframe("hours", 23) == "23 tunni"
- assert self.locale._format_timeframe("day", 1) == "ühe päeva"
- assert self.locale._format_timeframe("days", 6) == "6 päeva"
- assert self.locale._format_timeframe("days", 12) == "12 päeva"
- assert self.locale._format_timeframe("month", 1) == "ühe kuu"
- assert self.locale._format_timeframe("months", 7) == "7 kuu"
- assert self.locale._format_timeframe("months", 11) == "11 kuu"
- assert self.locale._format_timeframe("year", 1) == "ühe aasta"
- assert self.locale._format_timeframe("years", 8) == "8 aasta"
- assert self.locale._format_timeframe("years", 12) == "12 aasta"
-
- assert self.locale._format_timeframe("now", 0) == "just nüüd"
- assert self.locale._format_timeframe("second", -1) == "üks sekund"
- assert self.locale._format_timeframe("seconds", -9) == "9 sekundit"
- assert self.locale._format_timeframe("seconds", -12) == "12 sekundit"
- assert self.locale._format_timeframe("minute", -1) == "üks minut"
- assert self.locale._format_timeframe("minutes", -2) == "2 minutit"
- assert self.locale._format_timeframe("minutes", -10) == "10 minutit"
- assert self.locale._format_timeframe("hour", -1) == "tund aega"
- assert self.locale._format_timeframe("hours", -3) == "3 tundi"
- assert self.locale._format_timeframe("hours", -11) == "11 tundi"
- assert self.locale._format_timeframe("day", -1) == "üks päev"
- assert self.locale._format_timeframe("days", -2) == "2 päeva"
- assert self.locale._format_timeframe("days", -12) == "12 päeva"
- assert self.locale._format_timeframe("month", -1) == "üks kuu"
- assert self.locale._format_timeframe("months", -3) == "3 kuud"
- assert self.locale._format_timeframe("months", -13) == "13 kuud"
- assert self.locale._format_timeframe("year", -1) == "üks aasta"
- assert self.locale._format_timeframe("years", -4) == "4 aastat"
- assert self.locale._format_timeframe("years", -14) == "14 aastat"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestPortugueseLocale:
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("now", 0) == "agora"
- assert self.locale._format_timeframe("second", 1) == "um segundo"
- assert self.locale._format_timeframe("seconds", 30) == "30 segundos"
- assert self.locale._format_timeframe("minute", 1) == "um minuto"
- assert self.locale._format_timeframe("minutes", 40) == "40 minutos"
- assert self.locale._format_timeframe("hour", 1) == "uma hora"
- assert self.locale._format_timeframe("hours", 23) == "23 horas"
- assert self.locale._format_timeframe("day", 1) == "um dia"
- assert self.locale._format_timeframe("days", 12) == "12 dias"
- assert self.locale._format_timeframe("month", 1) == "um mês"
- assert self.locale._format_timeframe("months", 11) == "11 meses"
- assert self.locale._format_timeframe("year", 1) == "um ano"
- assert self.locale._format_timeframe("years", 12) == "12 anos"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestBrazilianPortugueseLocale:
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("now", 0) == "agora"
- assert self.locale._format_timeframe("second", 1) == "um segundo"
- assert self.locale._format_timeframe("seconds", 30) == "30 segundos"
- assert self.locale._format_timeframe("minute", 1) == "um minuto"
- assert self.locale._format_timeframe("minutes", 40) == "40 minutos"
- assert self.locale._format_timeframe("hour", 1) == "uma hora"
- assert self.locale._format_timeframe("hours", 23) == "23 horas"
- assert self.locale._format_timeframe("day", 1) == "um dia"
- assert self.locale._format_timeframe("days", 12) == "12 dias"
- assert self.locale._format_timeframe("month", 1) == "um mês"
- assert self.locale._format_timeframe("months", 11) == "11 meses"
- assert self.locale._format_timeframe("year", 1) == "um ano"
- assert self.locale._format_timeframe("years", 12) == "12 anos"
- assert self.locale._format_relative("uma hora", "hour", -1) == "faz uma hora"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestHongKongLocale:
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("now", 0) == "剛才"
- assert self.locale._format_timeframe("second", 1) == "1秒"
- assert self.locale._format_timeframe("seconds", 30) == "30秒"
- assert self.locale._format_timeframe("minute", 1) == "1分鐘"
- assert self.locale._format_timeframe("minutes", 40) == "40分鐘"
- assert self.locale._format_timeframe("hour", 1) == "1小時"
- assert self.locale._format_timeframe("hours", 23) == "23小時"
- assert self.locale._format_timeframe("day", 1) == "1天"
- assert self.locale._format_timeframe("days", 12) == "12天"
- assert self.locale._format_timeframe("week", 1) == "1星期"
- assert self.locale._format_timeframe("weeks", 38) == "38星期"
- assert self.locale._format_timeframe("month", 1) == "1個月"
- assert self.locale._format_timeframe("months", 11) == "11個月"
- assert self.locale._format_timeframe("year", 1) == "1年"
- assert self.locale._format_timeframe("years", 12) == "12年"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestChineseTWLocale:
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("now", 0) == "剛才"
- assert self.locale._format_timeframe("second", 1) == "1秒"
- assert self.locale._format_timeframe("seconds", 30) == "30秒"
- assert self.locale._format_timeframe("minute", 1) == "1分鐘"
- assert self.locale._format_timeframe("minutes", 40) == "40分鐘"
- assert self.locale._format_timeframe("hour", 1) == "1小時"
- assert self.locale._format_timeframe("hours", 23) == "23小時"
- assert self.locale._format_timeframe("day", 1) == "1天"
- assert self.locale._format_timeframe("days", 12) == "12天"
- assert self.locale._format_timeframe("week", 1) == "1週"
- assert self.locale._format_timeframe("weeks", 38) == "38週"
- assert self.locale._format_timeframe("month", 1) == "1個月"
- assert self.locale._format_timeframe("months", 11) == "11個月"
- assert self.locale._format_timeframe("year", 1) == "1年"
- assert self.locale._format_timeframe("years", 12) == "12年"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestSwahiliLocale:
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("now", 0) == "sasa hivi"
- assert self.locale._format_timeframe("second", 1) == "sekunde"
- assert self.locale._format_timeframe("seconds", 3) == "sekunde 3"
- assert self.locale._format_timeframe("seconds", 30) == "sekunde 30"
- assert self.locale._format_timeframe("minute", 1) == "dakika moja"
- assert self.locale._format_timeframe("minutes", 4) == "dakika 4"
- assert self.locale._format_timeframe("minutes", 40) == "dakika 40"
- assert self.locale._format_timeframe("hour", 1) == "saa moja"
- assert self.locale._format_timeframe("hours", 5) == "saa 5"
- assert self.locale._format_timeframe("hours", 23) == "saa 23"
- assert self.locale._format_timeframe("day", 1) == "siku moja"
- assert self.locale._format_timeframe("days", 6) == "siku 6"
- assert self.locale._format_timeframe("days", 12) == "siku 12"
- assert self.locale._format_timeframe("month", 1) == "mwezi moja"
- assert self.locale._format_timeframe("months", 7) == "miezi 7"
- assert self.locale._format_timeframe("week", 1) == "wiki moja"
- assert self.locale._format_timeframe("weeks", 2) == "wiki 2"
- assert self.locale._format_timeframe("months", 11) == "miezi 11"
- assert self.locale._format_timeframe("year", 1) == "mwaka moja"
- assert self.locale._format_timeframe("years", 8) == "miaka 8"
- assert self.locale._format_timeframe("years", 12) == "miaka 12"
-
- def test_format_relative_now(self):
- result = self.locale._format_relative("sasa hivi", "now", 0)
- assert result == "sasa hivi"
-
- def test_format_relative_future(self):
- result = self.locale._format_relative("saa moja", "hour", 1)
- assert result == "muda wa saa moja"
-
- def test_format_relative_past(self):
- result = self.locale._format_relative("saa moja", "hour", -1)
- assert result == "saa moja iliyopita"
-
-
-@pytest.mark.usefixtures("lang_locale")
-class TestKoreanLocale:
- def test_format_timeframe(self):
- assert self.locale._format_timeframe("now", 0) == "지금"
- assert self.locale._format_timeframe("second", 1) == "1초"
- assert self.locale._format_timeframe("seconds", 2) == "2초"
- assert self.locale._format_timeframe("minute", 1) == "1분"
- assert self.locale._format_timeframe("minutes", 2) == "2분"
- assert self.locale._format_timeframe("hour", 1) == "한시간"
- assert self.locale._format_timeframe("hours", 2) == "2시간"
- assert self.locale._format_timeframe("day", 1) == "하루"
- assert self.locale._format_timeframe("days", 2) == "2일"
- assert self.locale._format_timeframe("week", 1) == "1주"
- assert self.locale._format_timeframe("weeks", 2) == "2주"
- assert self.locale._format_timeframe("month", 1) == "한달"
- assert self.locale._format_timeframe("months", 2) == "2개월"
- assert self.locale._format_timeframe("year", 1) == "1년"
- assert self.locale._format_timeframe("years", 2) == "2년"
-
- def test_format_relative(self):
- assert self.locale._format_relative("지금", "now", 0) == "지금"
-
- assert self.locale._format_relative("1초", "second", 1) == "1초 후"
- assert self.locale._format_relative("2초", "seconds", 2) == "2초 후"
- assert self.locale._format_relative("1분", "minute", 1) == "1분 후"
- assert self.locale._format_relative("2분", "minutes", 2) == "2분 후"
- assert self.locale._format_relative("한시간", "hour", 1) == "한시간 후"
- assert self.locale._format_relative("2시간", "hours", 2) == "2시간 후"
- assert self.locale._format_relative("하루", "day", 1) == "내일"
- assert self.locale._format_relative("2일", "days", 2) == "모레"
- assert self.locale._format_relative("3일", "days", 3) == "글피"
- assert self.locale._format_relative("4일", "days", 4) == "그글피"
- assert self.locale._format_relative("5일", "days", 5) == "5일 후"
- assert self.locale._format_relative("1주", "week", 1) == "1주 후"
- assert self.locale._format_relative("2주", "weeks", 2) == "2주 후"
- assert self.locale._format_relative("한달", "month", 1) == "한달 후"
- assert self.locale._format_relative("2개월", "months", 2) == "2개월 후"
- assert self.locale._format_relative("1년", "year", 1) == "내년"
- assert self.locale._format_relative("2년", "years", 2) == "내후년"
- assert self.locale._format_relative("3년", "years", 3) == "3년 후"
-
- assert self.locale._format_relative("1초", "second", -1) == "1초 전"
- assert self.locale._format_relative("2초", "seconds", -2) == "2초 전"
- assert self.locale._format_relative("1분", "minute", -1) == "1분 전"
- assert self.locale._format_relative("2분", "minutes", -2) == "2분 전"
- assert self.locale._format_relative("한시간", "hour", -1) == "한시간 전"
- assert self.locale._format_relative("2시간", "hours", -2) == "2시간 전"
- assert self.locale._format_relative("하루", "day", -1) == "어제"
- assert self.locale._format_relative("2일", "days", -2) == "그제"
- assert self.locale._format_relative("3일", "days", -3) == "그끄제"
- assert self.locale._format_relative("4일", "days", -4) == "4일 전"
- assert self.locale._format_relative("1주", "week", -1) == "1주 전"
- assert self.locale._format_relative("2주", "weeks", -2) == "2주 전"
- assert self.locale._format_relative("한달", "month", -1) == "한달 전"
- assert self.locale._format_relative("2개월", "months", -2) == "2개월 전"
- assert self.locale._format_relative("1년", "year", -1) == "작년"
- assert self.locale._format_relative("2년", "years", -2) == "제작년"
- assert self.locale._format_relative("3년", "years", -3) == "3년 전"
-
- def test_ordinal_number(self):
- assert self.locale.ordinal_number(0) == "0번째"
- assert self.locale.ordinal_number(1) == "첫번째"
- assert self.locale.ordinal_number(2) == "두번째"
- assert self.locale.ordinal_number(3) == "세번째"
- assert self.locale.ordinal_number(4) == "네번째"
- assert self.locale.ordinal_number(5) == "다섯번째"
- assert self.locale.ordinal_number(6) == "여섯번째"
- assert self.locale.ordinal_number(7) == "일곱번째"
- assert self.locale.ordinal_number(8) == "여덟번째"
- assert self.locale.ordinal_number(9) == "아홉번째"
- assert self.locale.ordinal_number(10) == "열번째"
- assert self.locale.ordinal_number(11) == "11번째"
- assert self.locale.ordinal_number(12) == "12번째"
- assert self.locale.ordinal_number(100) == "100번째"
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_parser.py b/openpype/modules/ftrack/python2_vendor/arrow/tests/test_parser.py
deleted file mode 100644
index 9fb4e68f3c..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_parser.py
+++ /dev/null
@@ -1,1657 +0,0 @@
-# -*- coding: utf-8 -*-
-from __future__ import unicode_literals
-
-import calendar
-import os
-import time
-from datetime import datetime
-
-import pytest
-from dateutil import tz
-
-import arrow
-from arrow import formatter, parser
-from arrow.constants import MAX_TIMESTAMP_US
-from arrow.parser import DateTimeParser, ParserError, ParserMatchError
-
-from .utils import make_full_tz_list
-
-
-@pytest.mark.usefixtures("dt_parser")
-class TestDateTimeParser:
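- # _parse_multiformat tries each format in order and returns the first
- # successful parse; only ParserError is swallowed between attempts, so
- # any other exception propagates immediately (third test below).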
- def test_parse_multiformat(self, mocker):
- mocker.patch(
- "arrow.parser.DateTimeParser.parse",
- string="str",
- fmt="fmt_a",
- side_effect=parser.ParserError,
- )
-
- with pytest.raises(parser.ParserError):
- self.parser._parse_multiformat("str", ["fmt_a"])
-
- mock_datetime = mocker.Mock()
- mocker.patch(
- "arrow.parser.DateTimeParser.parse",
- string="str",
- fmt="fmt_b",
- return_value=mock_datetime,
- )
-
- result = self.parser._parse_multiformat("str", ["fmt_a", "fmt_b"])
- assert result == mock_datetime
-
- def test_parse_multiformat_all_fail(self, mocker):
- mocker.patch(
- "arrow.parser.DateTimeParser.parse",
- string="str",
- fmt="fmt_a",
- side_effect=parser.ParserError,
- )
-
- mocker.patch(
- "arrow.parser.DateTimeParser.parse",
- string="str",
- fmt="fmt_b",
- side_effect=parser.ParserError,
- )
-
- with pytest.raises(parser.ParserError):
- self.parser._parse_multiformat("str", ["fmt_a", "fmt_b"])
-
- def test_parse_multiformat_unexpected_fail(self, mocker):
- class UnexpectedError(Exception):
- pass
-
- mocker.patch(
- "arrow.parser.DateTimeParser.parse",
- string="str",
- fmt="fmt_a",
- side_effect=UnexpectedError,
- )
-
- with pytest.raises(UnexpectedError):
- self.parser._parse_multiformat("str", ["fmt_a", "fmt_b"])
-
- def test_parse_token_nonsense(self):
- parts = {}
- self.parser._parse_token("NONSENSE", "1900", parts)
- assert parts == {}
-
- def test_parse_token_invalid_meridians(self):
- parts = {}
- self.parser._parse_token("A", "a..m", parts)
- assert parts == {}
- self.parser._parse_token("a", "p..m", parts)
- assert parts == {}
-
- def test_parser_no_caching(self, mocker):
-
- mocked_parser = mocker.patch(
- "arrow.parser.DateTimeParser._generate_pattern_re", fmt="fmt_a"
- )
- self.parser = parser.DateTimeParser(cache_size=0)
- for _ in range(100):
- self.parser._generate_pattern_re("fmt_a")
- assert mocked_parser.call_count == 100
-
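- # cache_size bounds an internal cache of compiled format patterns: size 1
- # holds only the most recent fmt (so alternating formats recompiles every
- # time), while size 2 keeps both fmt_a and fmt_b compiled.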
- def test_parser_1_line_caching(self, mocker):
- mocked_parser = mocker.patch("arrow.parser.DateTimeParser._generate_pattern_re")
- self.parser = parser.DateTimeParser(cache_size=1)
-
- for _ in range(100):
- self.parser._generate_pattern_re(fmt="fmt_a")
- assert mocked_parser.call_count == 1
- assert mocked_parser.call_args_list[0] == mocker.call(fmt="fmt_a")
-
- for _ in range(100):
- self.parser._generate_pattern_re(fmt="fmt_b")
- assert mocked_parser.call_count == 2
- assert mocked_parser.call_args_list[1] == mocker.call(fmt="fmt_b")
-
- for _ in range(100):
- self.parser._generate_pattern_re(fmt="fmt_a")
- assert mocked_parser.call_count == 3
- assert mocked_parser.call_args_list[2] == mocker.call(fmt="fmt_a")
-
- def test_parser_multiple_line_caching(self, mocker):
- mocked_parser = mocker.patch("arrow.parser.DateTimeParser._generate_pattern_re")
- self.parser = parser.DateTimeParser(cache_size=2)
-
- for _ in range(100):
- self.parser._generate_pattern_re(fmt="fmt_a")
- assert mocked_parser.call_count == 1
- assert mocked_parser.call_args_list[0] == mocker.call(fmt="fmt_a")
-
- for _ in range(100):
- self.parser._generate_pattern_re(fmt="fmt_b")
- assert mocked_parser.call_count == 2
- assert mocked_parser.call_args_list[1] == mocker.call(fmt="fmt_b")
-
- # fmt_a and fmt_b are in the cache, so no new calls should be made
- for _ in range(100):
- self.parser._generate_pattern_re(fmt="fmt_a")
- for _ in range(100):
- self.parser._generate_pattern_re(fmt="fmt_b")
- assert mocked_parser.call_count == 2
- assert mocked_parser.call_args_list[0] == mocker.call(fmt="fmt_a")
- assert mocked_parser.call_args_list[1] == mocker.call(fmt="fmt_b")
-
- def test_YY_and_YYYY_format_list(self):
-
- assert self.parser.parse("15/01/19", ["DD/MM/YY", "DD/MM/YYYY"]) == datetime(
- 2019, 1, 15
- )
-
- # Regression test for issue #580
- assert self.parser.parse("15/01/2019", ["DD/MM/YY", "DD/MM/YYYY"]) == datetime(
- 2019, 1, 15
- )
-
- assert (
- self.parser.parse(
- "15/01/2019T04:05:06.789120Z",
- ["D/M/YYThh:mm:ss.SZ", "D/M/YYYYThh:mm:ss.SZ"],
- )
- == datetime(2019, 1, 15, 4, 5, 6, 789120, tzinfo=tz.tzutc())
- )
-
- # regression test for issue #447
- def test_timestamp_format_list(self):
- # should not match on the "X" token
- assert (
- self.parser.parse(
- "15 Jul 2000",
- ["MM/DD/YYYY", "YYYY-MM-DD", "X", "DD-MMMM-YYYY", "D MMM YYYY"],
- )
- == datetime(2000, 7, 15)
- )
-
- with pytest.raises(ParserError):
- self.parser.parse("15 Jul", "X")
-
-
-@pytest.mark.usefixtures("dt_parser")
-class TestDateTimeParserParse:
- def test_parse_list(self, mocker):
-
- mocker.patch(
- "arrow.parser.DateTimeParser._parse_multiformat",
- string="str",
- formats=["fmt_a", "fmt_b"],
- return_value="result",
- )
-
- result = self.parser.parse("str", ["fmt_a", "fmt_b"])
- assert result == "result"
-
- def test_parse_unrecognized_token(self, mocker):
-
- mocker.patch.dict("arrow.parser.DateTimeParser._BASE_INPUT_RE_MAP")
- del arrow.parser.DateTimeParser._BASE_INPUT_RE_MAP["YYYY"]
-
- # need to make another local parser to apply patch changes
- _parser = parser.DateTimeParser()
- with pytest.raises(parser.ParserError):
- _parser.parse("2013-01-01", "YYYY-MM-DD")
-
- def test_parse_parse_no_match(self):
-
- with pytest.raises(ParserError):
- self.parser.parse("01-01", "YYYY-MM-DD")
-
- def test_parse_separators(self):
-
- with pytest.raises(ParserError):
- self.parser.parse("1403549231", "YYYY-MM-DD")
-
- def test_parse_numbers(self):
-
- self.expected = datetime(2012, 1, 1, 12, 5, 10)
- assert (
- self.parser.parse("2012-01-01 12:05:10", "YYYY-MM-DD HH:mm:ss")
- == self.expected
- )
-
- def test_parse_year_two_digit(self):
-
- self.expected = datetime(1979, 1, 1, 12, 5, 10)
- assert (
- self.parser.parse("79-01-01 12:05:10", "YY-MM-DD HH:mm:ss") == self.expected
- )
-
- def test_parse_timestamp(self):
-
- tz_utc = tz.tzutc()
- int_timestamp = int(time.time())
- self.expected = datetime.fromtimestamp(int_timestamp, tz=tz_utc)
- assert self.parser.parse("{:d}".format(int_timestamp), "X") == self.expected
-
- float_timestamp = time.time()
- self.expected = datetime.fromtimestamp(float_timestamp, tz=tz_utc)
- assert self.parser.parse("{:f}".format(float_timestamp), "X") == self.expected
-
- # test handling of ns timestamp (arrow will round to 6 digits regardless)
- self.expected = datetime.fromtimestamp(float_timestamp, tz=tz_utc)
- assert (
- self.parser.parse("{:f}123".format(float_timestamp), "X") == self.expected
- )
-
- # test ps timestamp (arrow will round to 6 digits regardless)
- self.expected = datetime.fromtimestamp(float_timestamp, tz=tz_utc)
- assert (
- self.parser.parse("{:f}123456".format(float_timestamp), "X")
- == self.expected
- )
-
- # NOTE: negative timestamps cannot be handled by datetime on Windows.
- # Must use timedelta to handle them. ref: https://stackoverflow.com/questions/36179914
- if os.name != "nt":
- # regression test for issue #662
- negative_int_timestamp = -int_timestamp
- self.expected = datetime.fromtimestamp(negative_int_timestamp, tz=tz_utc)
- assert (
- self.parser.parse("{:d}".format(negative_int_timestamp), "X")
- == self.expected
- )
-
- negative_float_timestamp = -float_timestamp
- self.expected = datetime.fromtimestamp(negative_float_timestamp, tz=tz_utc)
- assert (
- self.parser.parse("{:f}".format(negative_float_timestamp), "X")
- == self.expected
- )
-
- # NOTE: timestamps cannot be parsed from natural language strings (by removing the ^...$) because it will
- # break cases like "15 Jul 2000" and a format list (see issue #447)
- with pytest.raises(ParserError):
- natural_lang_string = "Meet me at {} at the restaurant.".format(
- float_timestamp
- )
- self.parser.parse(natural_lang_string, "X")
-
- with pytest.raises(ParserError):
- self.parser.parse("1565982019.", "X")
-
- with pytest.raises(ParserError):
- self.parser.parse(".1565982019", "X")
-
- def test_parse_expanded_timestamp(self):
- # test expanded timestamps that include milliseconds
- # and microseconds as multiples rather than decimals
- # requested in issue #357
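- # e.g. for 1569982581.413132: the 10-digit integer part parses as seconds,
- # the 13-digit value 1569982581413 as milliseconds, and the 16-digit value
- # 1569982581413132 as microseconds.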
-
- tz_utc = tz.tzutc()
- timestamp = 1569982581.413132
- timestamp_milli = int(round(timestamp * 1000))
- timestamp_micro = int(round(timestamp * 1000000))
-
- # "x" token should parse integer timestamps below MAX_TIMESTAMP normally
- self.expected = datetime.fromtimestamp(int(timestamp), tz=tz_utc)
- assert self.parser.parse("{:d}".format(int(timestamp)), "x") == self.expected
-
- self.expected = datetime.fromtimestamp(round(timestamp, 3), tz=tz_utc)
- assert self.parser.parse("{:d}".format(timestamp_milli), "x") == self.expected
-
- self.expected = datetime.fromtimestamp(timestamp, tz=tz_utc)
- assert self.parser.parse("{:d}".format(timestamp_micro), "x") == self.expected
-
- # anything above max µs timestamp should fail
- with pytest.raises(ValueError):
- self.parser.parse("{:d}".format(int(MAX_TIMESTAMP_US) + 1), "x")
-
- # floats are not allowed with the "x" token
- with pytest.raises(ParserMatchError):
- self.parser.parse("{:f}".format(timestamp), "x")
-
- def test_parse_names(self):
-
- self.expected = datetime(2012, 1, 1)
-
- assert self.parser.parse("January 1, 2012", "MMMM D, YYYY") == self.expected
- assert self.parser.parse("Jan 1, 2012", "MMM D, YYYY") == self.expected
-
- def test_parse_pm(self):
-
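- # 12-hour clock edge cases: "12 am" maps to 00:00 and "12 pm" to 12:00.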
- self.expected = datetime(1, 1, 1, 13, 0, 0)
- assert self.parser.parse("1 pm", "H a") == self.expected
- assert self.parser.parse("1 pm", "h a") == self.expected
-
- self.expected = datetime(1, 1, 1, 1, 0, 0)
- assert self.parser.parse("1 am", "H A") == self.expected
- assert self.parser.parse("1 am", "h A") == self.expected
-
- self.expected = datetime(1, 1, 1, 0, 0, 0)
- assert self.parser.parse("12 am", "H A") == self.expected
- assert self.parser.parse("12 am", "h A") == self.expected
-
- self.expected = datetime(1, 1, 1, 12, 0, 0)
- assert self.parser.parse("12 pm", "H A") == self.expected
- assert self.parser.parse("12 pm", "h A") == self.expected
-
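- # Z and ZZ accept numeric UTC offsets (without and with a colon), while
- # ZZZ accepts timezone names only; bare offsets are rejected for ZZZ in
- # test_parse_tz_name_zzz below.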
- def test_parse_tz_hours_only(self):
-
- self.expected = datetime(2025, 10, 17, 5, 30, 10, tzinfo=tz.tzoffset(None, 0))
- parsed = self.parser.parse("2025-10-17 05:30:10+00", "YYYY-MM-DD HH:mm:ssZ")
- assert parsed == self.expected
-
- def test_parse_tz_zz(self):
-
- self.expected = datetime(2013, 1, 1, tzinfo=tz.tzoffset(None, -7 * 3600))
- assert self.parser.parse("2013-01-01 -07:00", "YYYY-MM-DD ZZ") == self.expected
-
- @pytest.mark.parametrize("full_tz_name", make_full_tz_list())
- def test_parse_tz_name_zzz(self, full_tz_name):
-
- self.expected = datetime(2013, 1, 1, tzinfo=tz.gettz(full_tz_name))
- assert (
- self.parser.parse("2013-01-01 {}".format(full_tz_name), "YYYY-MM-DD ZZZ")
- == self.expected
- )
-
- # note that offsets are not timezones
- with pytest.raises(ParserError):
- self.parser.parse("2013-01-01 12:30:45.9+1000", "YYYY-MM-DDZZZ")
-
- with pytest.raises(ParserError):
- self.parser.parse("2013-01-01 12:30:45.9+10:00", "YYYY-MM-DDZZZ")
-
- with pytest.raises(ParserError):
- self.parser.parse("2013-01-01 12:30:45.9-10", "YYYY-MM-DDZZZ")
-
- def test_parse_subsecond(self):
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 900000)
- assert (
- self.parser.parse("2013-01-01 12:30:45.9", "YYYY-MM-DD HH:mm:ss.S")
- == self.expected
- )
-
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 980000)
- assert (
- self.parser.parse("2013-01-01 12:30:45.98", "YYYY-MM-DD HH:mm:ss.SS")
- == self.expected
- )
-
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 987000)
- assert (
- self.parser.parse("2013-01-01 12:30:45.987", "YYYY-MM-DD HH:mm:ss.SSS")
- == self.expected
- )
-
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 987600)
- assert (
- self.parser.parse("2013-01-01 12:30:45.9876", "YYYY-MM-DD HH:mm:ss.SSSS")
- == self.expected
- )
-
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 987650)
- assert (
- self.parser.parse("2013-01-01 12:30:45.98765", "YYYY-MM-DD HH:mm:ss.SSSSS")
- == self.expected
- )
-
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 987654)
- assert (
- self.parser.parse(
- "2013-01-01 12:30:45.987654", "YYYY-MM-DD HH:mm:ss.SSSSSS"
- )
- == self.expected
- )
-
- def test_parse_subsecond_rounding(self):
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 987654)
- datetime_format = "YYYY-MM-DD HH:mm:ss.S"
-
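- # parsing keeps at most six fractional digits; the seventh digit decides
- # the rounding, with both "half" cases below settling on 987654.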
- # round up
- string = "2013-01-01 12:30:45.9876539"
- assert self.parser.parse(string, datetime_format) == self.expected
- assert self.parser.parse_iso(string) == self.expected
-
- # round down
- string = "2013-01-01 12:30:45.98765432"
- assert self.parser.parse(string, datetime_format) == self.expected
- assert self.parser.parse_iso(string) == self.expected
-
- # round half-up
- string = "2013-01-01 12:30:45.987653521"
- assert self.parser.parse(string, datetime_format) == self.expected
- assert self.parser.parse_iso(string) == self.expected
-
- # round half-down
- string = "2013-01-01 12:30:45.9876545210"
- assert self.parser.parse(string, datetime_format) == self.expected
- assert self.parser.parse_iso(string) == self.expected
-
- # overflow (zero out the subseconds and increment the seconds)
- # regression tests for issue #636
- def test_parse_subsecond_rounding_overflow(self):
- datetime_format = "YYYY-MM-DD HH:mm:ss.S"
-
- self.expected = datetime(2013, 1, 1, 12, 30, 46)
- string = "2013-01-01 12:30:45.9999995"
- assert self.parser.parse(string, datetime_format) == self.expected
- assert self.parser.parse_iso(string) == self.expected
-
- self.expected = datetime(2013, 1, 1, 12, 31, 0)
- string = "2013-01-01 12:30:59.9999999"
- assert self.parser.parse(string, datetime_format) == self.expected
- assert self.parser.parse_iso(string) == self.expected
-
- self.expected = datetime(2013, 1, 2, 0, 0, 0)
- string = "2013-01-01 23:59:59.9999999"
- assert self.parser.parse(string, datetime_format) == self.expected
- assert self.parser.parse_iso(string) == self.expected
-
- # 6 digits should remain unrounded
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 999999)
- string = "2013-01-01 12:30:45.999999"
- assert self.parser.parse(string, datetime_format) == self.expected
- assert self.parser.parse_iso(string) == self.expected
-
- # Regression tests for issue #560
- def test_parse_long_year(self):
- with pytest.raises(ParserError):
- self.parser.parse("09 January 123456789101112", "DD MMMM YYYY")
-
- with pytest.raises(ParserError):
- self.parser.parse("123456789101112 09 January", "YYYY DD MMMM")
-
- with pytest.raises(ParserError):
- self.parser.parse("68096653015/01/19", "YY/M/DD")
-
- def test_parse_with_extra_words_at_start_and_end_invalid(self):
- input_format_pairs = [
- ("blah2016", "YYYY"),
- ("blah2016blah", "YYYY"),
- ("2016blah", "YYYY"),
- ("2016-05blah", "YYYY-MM"),
- ("2016-05-16blah", "YYYY-MM-DD"),
- ("2016-05-16T04:05:06.789120blah", "YYYY-MM-DDThh:mm:ss.S"),
- ("2016-05-16T04:05:06.789120ZblahZ", "YYYY-MM-DDThh:mm:ss.SZ"),
- ("2016-05-16T04:05:06.789120Zblah", "YYYY-MM-DDThh:mm:ss.SZ"),
- ("2016-05-16T04:05:06.789120blahZ", "YYYY-MM-DDThh:mm:ss.SZ"),
- ]
-
- for pair in input_format_pairs:
- with pytest.raises(ParserError):
- self.parser.parse(pair[0], pair[1])
-
- def test_parse_with_extra_words_at_start_and_end_valid(self):
- # Spaces surrounding the parsable date are ok because we
- # allow the parsing of natural language input. Additionally, a single
- # character of specific punctuation before or after the date is okay.
- # See docs for full list of valid punctuation.
-
- assert self.parser.parse("blah 2016 blah", "YYYY") == datetime(2016, 1, 1)
-
- assert self.parser.parse("blah 2016", "YYYY") == datetime(2016, 1, 1)
-
- assert self.parser.parse("2016 blah", "YYYY") == datetime(2016, 1, 1)
-
- # test one additional space along with space divider
- assert self.parser.parse(
- "blah 2016-05-16 04:05:06.789120", "YYYY-MM-DD hh:mm:ss.S"
- ) == datetime(2016, 5, 16, 4, 5, 6, 789120)
-
- assert self.parser.parse(
- "2016-05-16 04:05:06.789120 blah", "YYYY-MM-DD hh:mm:ss.S"
- ) == datetime(2016, 5, 16, 4, 5, 6, 789120)
-
- # test one additional space along with T divider
- assert self.parser.parse(
- "blah 2016-05-16T04:05:06.789120", "YYYY-MM-DDThh:mm:ss.S"
- ) == datetime(2016, 5, 16, 4, 5, 6, 789120)
-
- assert self.parser.parse(
- "2016-05-16T04:05:06.789120 blah", "YYYY-MM-DDThh:mm:ss.S"
- ) == datetime(2016, 5, 16, 4, 5, 6, 789120)
-
- assert (
- self.parser.parse(
- "Meet me at 2016-05-16T04:05:06.789120 at the restaurant.",
- "YYYY-MM-DDThh:mm:ss.S",
- )
- == datetime(2016, 5, 16, 4, 5, 6, 789120)
- )
-
- assert (
- self.parser.parse(
- "Meet me at 2016-05-16 04:05:06.789120 at the restaurant.",
- "YYYY-MM-DD hh:mm:ss.S",
- )
- == datetime(2016, 5, 16, 4, 5, 6, 789120)
- )
-
- # regression test for issue #701
- # tests cases of a partial match surrounded by punctuation
- # for the list of valid punctuation, see documentation
- def test_parse_with_punctuation_fences(self):
- assert self.parser.parse(
- "Meet me at my house on Halloween (2019-31-10)", "YYYY-DD-MM"
- ) == datetime(2019, 10, 31)
-
- assert self.parser.parse(
- "Monday, 9. September 2019, 16:15-20:00", "dddd, D. MMMM YYYY"
- ) == datetime(2019, 9, 9)
-
- assert self.parser.parse("A date is 11.11.2011.", "DD.MM.YYYY") == datetime(
- 2011, 11, 11
- )
-
- with pytest.raises(ParserMatchError):
- self.parser.parse("11.11.2011.1 is not a valid date.", "DD.MM.YYYY")
-
- with pytest.raises(ParserMatchError):
- self.parser.parse(
- "This date has too many punctuation marks following it (11.11.2011).",
- "DD.MM.YYYY",
- )
-
- def test_parse_with_leading_and_trailing_whitespace(self):
- assert self.parser.parse(" 2016", "YYYY") == datetime(2016, 1, 1)
-
- assert self.parser.parse("2016 ", "YYYY") == datetime(2016, 1, 1)
-
- assert self.parser.parse(" 2016 ", "YYYY") == datetime(2016, 1, 1)
-
- assert self.parser.parse(
- " 2016-05-16 04:05:06.789120 ", "YYYY-MM-DD hh:mm:ss.S"
- ) == datetime(2016, 5, 16, 4, 5, 6, 789120)
-
- assert self.parser.parse(
- " 2016-05-16T04:05:06.789120 ", "YYYY-MM-DDThh:mm:ss.S"
- ) == datetime(2016, 5, 16, 4, 5, 6, 789120)
-
- def test_parse_YYYY_DDDD(self):
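- # DDDD is a zero-padded day of year: 31+28+31+30 = 120 days through
- # April, so day 136 of 1998 falls on 1998-05-16.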
- assert self.parser.parse("1998-136", "YYYY-DDDD") == datetime(1998, 5, 16)
-
- assert self.parser.parse("1998-006", "YYYY-DDDD") == datetime(1998, 1, 6)
-
- with pytest.raises(ParserError):
- self.parser.parse("1998-456", "YYYY-DDDD")
-
- def test_parse_YYYY_DDD(self):
- assert self.parser.parse("1998-6", "YYYY-DDD") == datetime(1998, 1, 6)
-
- assert self.parser.parse("1998-136", "YYYY-DDD") == datetime(1998, 5, 16)
-
- with pytest.raises(ParserError):
- self.parser.parse("1998-756", "YYYY-DDD")
-
- # month cannot be passed with DDD and DDDD tokens
- def test_parse_YYYY_MM_DDDD(self):
- with pytest.raises(ParserError):
- self.parser.parse("2015-01-009", "YYYY-MM-DDDD")
-
- # year is required with the DDD and DDDD tokens
- def test_parse_DDD_only(self):
- with pytest.raises(ParserError):
- self.parser.parse("5", "DDD")
-
- def test_parse_DDDD_only(self):
- with pytest.raises(ParserError):
- self.parser.parse("145", "DDDD")
-
- def test_parse_ddd_and_dddd(self):
- fr_parser = parser.DateTimeParser("fr")
-
- # The day-of-week token should be ignored when an explicit date is given:
- # 2019-10-17 is a Thursday, so getting the same date back proves the
- # weekday was ignored.
- expected = datetime(2019, 10, 17)
- assert self.parser.parse("Tue 2019-10-17", "ddd YYYY-MM-DD") == expected
- assert fr_parser.parse("mar 2019-10-17", "ddd YYYY-MM-DD") == expected
- assert self.parser.parse("Tuesday 2019-10-17", "dddd YYYY-MM-DD") == expected
- assert fr_parser.parse("mardi 2019-10-17", "dddd YYYY-MM-DD") == expected
-
- # Get first Tuesday after epoch
- expected = datetime(1970, 1, 6)
- assert self.parser.parse("Tue", "ddd") == expected
- assert fr_parser.parse("mar", "ddd") == expected
- assert self.parser.parse("Tuesday", "dddd") == expected
- assert fr_parser.parse("mardi", "dddd") == expected
-
- # Get first Tuesday in 2020
- expected = datetime(2020, 1, 7)
- assert self.parser.parse("Tue 2020", "ddd YYYY") == expected
- assert fr_parser.parse("mar 2020", "ddd YYYY") == expected
- assert self.parser.parse("Tuesday 2020", "dddd YYYY") == expected
- assert fr_parser.parse("mardi 2020", "dddd YYYY") == expected
-
- # Get first Tuesday in February 2020
- expected = datetime(2020, 2, 4)
- assert self.parser.parse("Tue 02 2020", "ddd MM YYYY") == expected
- assert fr_parser.parse("mar 02 2020", "ddd MM YYYY") == expected
- assert self.parser.parse("Tuesday 02 2020", "dddd MM YYYY") == expected
- assert fr_parser.parse("mardi 02 2020", "dddd MM YYYY") == expected
-
- # Get first Tuesday in February after epoch
- expected = datetime(1970, 2, 3)
- assert self.parser.parse("Tue 02", "ddd MM") == expected
- assert fr_parser.parse("mar 02", "ddd MM") == expected
- assert self.parser.parse("Tuesday 02", "dddd MM") == expected
- assert fr_parser.parse("mardi 02", "dddd MM") == expected
-
- # Times remain intact
- expected = datetime(2020, 2, 4, 10, 25, 54, 123456, tz.tzoffset(None, -3600))
- assert (
- self.parser.parse(
- "Tue 02 2020 10:25:54.123456-01:00", "ddd MM YYYY HH:mm:ss.SZZ"
- )
- == expected
- )
- assert (
- fr_parser.parse(
- "mar 02 2020 10:25:54.123456-01:00", "ddd MM YYYY HH:mm:ss.SZZ"
- )
- == expected
- )
- assert (
- self.parser.parse(
- "Tuesday 02 2020 10:25:54.123456-01:00", "dddd MM YYYY HH:mm:ss.SZZ"
- )
- == expected
- )
- assert (
- fr_parser.parse(
- "mardi 02 2020 10:25:54.123456-01:00", "dddd MM YYYY HH:mm:ss.SZZ"
- )
- == expected
- )
-
- def test_parse_ddd_and_dddd_ignore_case(self):
- # Regression test for issue #851
- expected = datetime(2019, 6, 24)
- assert (
- self.parser.parse("MONDAY, June 24, 2019", "dddd, MMMM DD, YYYY")
- == expected
- )
-
- def test_parse_ddd_and_dddd_then_format(self):
- # Regression test for issue #446
- arw_formatter = formatter.DateTimeFormatter()
- assert arw_formatter.format(self.parser.parse("Mon", "ddd"), "ddd") == "Mon"
- assert (
- arw_formatter.format(self.parser.parse("Monday", "dddd"), "dddd")
- == "Monday"
- )
- assert arw_formatter.format(self.parser.parse("Tue", "ddd"), "ddd") == "Tue"
- assert (
- arw_formatter.format(self.parser.parse("Tuesday", "dddd"), "dddd")
- == "Tuesday"
- )
- assert arw_formatter.format(self.parser.parse("Wed", "ddd"), "ddd") == "Wed"
- assert (
- arw_formatter.format(self.parser.parse("Wednesday", "dddd"), "dddd")
- == "Wednesday"
- )
- assert arw_formatter.format(self.parser.parse("Thu", "ddd"), "ddd") == "Thu"
- assert (
- arw_formatter.format(self.parser.parse("Thursday", "dddd"), "dddd")
- == "Thursday"
- )
- assert arw_formatter.format(self.parser.parse("Fri", "ddd"), "ddd") == "Fri"
- assert (
- arw_formatter.format(self.parser.parse("Friday", "dddd"), "dddd")
- == "Friday"
- )
- assert arw_formatter.format(self.parser.parse("Sat", "ddd"), "ddd") == "Sat"
- assert (
- arw_formatter.format(self.parser.parse("Saturday", "dddd"), "dddd")
- == "Saturday"
- )
- assert arw_formatter.format(self.parser.parse("Sun", "ddd"), "ddd") == "Sun"
- assert (
- arw_formatter.format(self.parser.parse("Sunday", "dddd"), "dddd")
- == "Sunday"
- )
-
- def test_parse_HH_24(self):
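- # hour 24 denotes midnight at the end of the given day, so the date
- # rolls over; anything past 24:00:00.0 (minutes, seconds or a nonzero
- # fraction) is rejected below.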
- assert self.parser.parse(
- "2019-10-30T24:00:00", "YYYY-MM-DDTHH:mm:ss"
- ) == datetime(2019, 10, 31, 0, 0, 0, 0)
- assert self.parser.parse("2019-10-30T24:00", "YYYY-MM-DDTHH:mm") == datetime(
- 2019, 10, 31, 0, 0, 0, 0
- )
- assert self.parser.parse("2019-10-30T24", "YYYY-MM-DDTHH") == datetime(
- 2019, 10, 31, 0, 0, 0, 0
- )
- assert self.parser.parse(
- "2019-10-30T24:00:00.0", "YYYY-MM-DDTHH:mm:ss.S"
- ) == datetime(2019, 10, 31, 0, 0, 0, 0)
- assert self.parser.parse(
- "2019-10-31T24:00:00", "YYYY-MM-DDTHH:mm:ss"
- ) == datetime(2019, 11, 1, 0, 0, 0, 0)
- assert self.parser.parse(
- "2019-12-31T24:00:00", "YYYY-MM-DDTHH:mm:ss"
- ) == datetime(2020, 1, 1, 0, 0, 0, 0)
- assert self.parser.parse(
- "2019-12-31T23:59:59.9999999", "YYYY-MM-DDTHH:mm:ss.S"
- ) == datetime(2020, 1, 1, 0, 0, 0, 0)
-
- with pytest.raises(ParserError):
- self.parser.parse("2019-12-31T24:01:00", "YYYY-MM-DDTHH:mm:ss")
-
- with pytest.raises(ParserError):
- self.parser.parse("2019-12-31T24:00:01", "YYYY-MM-DDTHH:mm:ss")
-
- with pytest.raises(ParserError):
- self.parser.parse("2019-12-31T24:00:00.1", "YYYY-MM-DDTHH:mm:ss.S")
-
- with pytest.raises(ParserError):
- self.parser.parse("2019-12-31T24:00:00.999999", "YYYY-MM-DDTHH:mm:ss.S")
-
- def test_parse_W(self):
-
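- # ISO week dates: "2011-W05-4" is weekday 4 (Thursday) of week 05 of
- # 2011, i.e. 2011-02-03; when the weekday is omitted, the week's Monday
- # (2011-01-31) is returned.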
- assert self.parser.parse("2011-W05-4", "W") == datetime(2011, 2, 3)
- assert self.parser.parse("2011W054", "W") == datetime(2011, 2, 3)
- assert self.parser.parse("2011-W05", "W") == datetime(2011, 1, 31)
- assert self.parser.parse("2011W05", "W") == datetime(2011, 1, 31)
- assert self.parser.parse("2011-W05-4T14:17:01", "WTHH:mm:ss") == datetime(
- 2011, 2, 3, 14, 17, 1
- )
- assert self.parser.parse("2011W054T14:17:01", "WTHH:mm:ss") == datetime(
- 2011, 2, 3, 14, 17, 1
- )
- assert self.parser.parse("2011-W05T14:17:01", "WTHH:mm:ss") == datetime(
- 2011, 1, 31, 14, 17, 1
- )
- assert self.parser.parse("2011W05T141701", "WTHHmmss") == datetime(
- 2011, 1, 31, 14, 17, 1
- )
- assert self.parser.parse("2011W054T141701", "WTHHmmss") == datetime(
- 2011, 2, 3, 14, 17, 1
- )
-
- bad_inputs = [
- "201W22",
- "1995-W1-4",
- "2001-W34-90",
- "2001--W34",
- "2011-W03--3",
- "thstrdjtrsrd676776r65",
- "2002-W66-1T14:17:01",
- "2002-W23-03T14:17:01",
- ]
-
- for bad_input in bad_inputs:
- with pytest.raises(ParserError):
- self.parser.parse(bad_input, "W")
-
- def test_parse_normalize_whitespace(self):
- assert self.parser.parse(
- "Jun 1 2005 1:33PM", "MMM D YYYY H:mmA", normalize_whitespace=True
- ) == datetime(2005, 6, 1, 13, 33)
-
- with pytest.raises(ParserError):
- self.parser.parse("Jun 1 2005 1:33PM", "MMM D YYYY H:mmA")
-
- assert (
- self.parser.parse(
- "\t 2013-05-05 T \n 12:30:45\t123456 \t \n",
- "YYYY-MM-DD T HH:mm:ss S",
- normalize_whitespace=True,
- )
- == datetime(2013, 5, 5, 12, 30, 45, 123456)
- )
-
- with pytest.raises(ParserError):
- self.parser.parse(
- "\t 2013-05-05 T \n 12:30:45\t123456 \t \n",
- "YYYY-MM-DD T HH:mm:ss S",
- )
-
- assert self.parser.parse(
- " \n Jun 1\t 2005\n ", "MMM D YYYY", normalize_whitespace=True
- ) == datetime(2005, 6, 1)
-
- with pytest.raises(ParserError):
- self.parser.parse(" \n Jun 1\t 2005\n ", "MMM D YYYY")
-
-
-@pytest.mark.usefixtures("dt_parser_regex")
-class TestDateTimeParserRegex:
- def test_format_year(self):
-
- assert self.format_regex.findall("YYYY-YY") == ["YYYY", "YY"]
-
- def test_format_month(self):
-
- assert self.format_regex.findall("MMMM-MMM-MM-M") == ["MMMM", "MMM", "MM", "M"]
-
- def test_format_day(self):
-
- assert self.format_regex.findall("DDDD-DDD-DD-D") == ["DDDD", "DDD", "DD", "D"]
-
- def test_format_hour(self):
-
- assert self.format_regex.findall("HH-H-hh-h") == ["HH", "H", "hh", "h"]
-
- def test_format_minute(self):
-
- assert self.format_regex.findall("mm-m") == ["mm", "m"]
-
- def test_format_second(self):
-
- assert self.format_regex.findall("ss-s") == ["ss", "s"]
-
- def test_format_subsecond(self):
-
- assert self.format_regex.findall("SSSSSS-SSSSS-SSSS-SSS-SS-S") == [
- "SSSSSS",
- "SSSSS",
- "SSSS",
- "SSS",
- "SS",
- "S",
- ]
-
- def test_format_tz(self):
-
- assert self.format_regex.findall("ZZZ-ZZ-Z") == ["ZZZ", "ZZ", "Z"]
-
- def test_format_am_pm(self):
-
- assert self.format_regex.findall("A-a") == ["A", "a"]
-
- def test_format_timestamp(self):
-
- assert self.format_regex.findall("X") == ["X"]
-
- def test_format_timestamp_milli(self):
-
- assert self.format_regex.findall("x") == ["x"]
-
- def test_escape(self):
-
- escape_regex = parser.DateTimeParser._ESCAPE_RE
-
- assert escape_regex.findall("2018-03-09 8 [h] 40 [hello]") == ["[h]", "[hello]"]
-
- def test_month_names(self):
- p = parser.DateTimeParser("en_us")
-
- text = "_".join(calendar.month_name[1:])
-
- result = p._input_re_map["MMMM"].findall(text)
-
- assert result == calendar.month_name[1:]
-
- def test_month_abbreviations(self):
- p = parser.DateTimeParser("en_us")
-
- text = "_".join(calendar.month_abbr[1:])
-
- result = p._input_re_map["MMM"].findall(text)
-
- assert result == calendar.month_abbr[1:]
-
- def test_digits(self):
-
- assert parser.DateTimeParser._ONE_OR_TWO_DIGIT_RE.findall("4-56") == ["4", "56"]
- assert parser.DateTimeParser._ONE_OR_TWO_OR_THREE_DIGIT_RE.findall(
- "4-56-789"
- ) == ["4", "56", "789"]
- assert parser.DateTimeParser._ONE_OR_MORE_DIGIT_RE.findall(
- "4-56-789-1234-12345"
- ) == ["4", "56", "789", "1234", "12345"]
- assert parser.DateTimeParser._TWO_DIGIT_RE.findall("12-3-45") == ["12", "45"]
- assert parser.DateTimeParser._THREE_DIGIT_RE.findall("123-4-56") == ["123"]
- assert parser.DateTimeParser._FOUR_DIGIT_RE.findall("1234-56") == ["1234"]
-
- def test_tz(self):
- tz_z_re = parser.DateTimeParser._TZ_Z_RE
- assert tz_z_re.findall("-0700") == [("-", "07", "00")]
- assert tz_z_re.findall("+07") == [("+", "07", "")]
- assert tz_z_re.search("15/01/2019T04:05:06.789120Z") is not None
- assert tz_z_re.search("15/01/2019T04:05:06.789120") is None
-
- tz_zz_re = parser.DateTimeParser._TZ_ZZ_RE
- assert tz_zz_re.findall("-07:00") == [("-", "07", "00")]
- assert tz_zz_re.findall("+07") == [("+", "07", "")]
- assert tz_zz_re.search("15/01/2019T04:05:06.789120Z") is not None
- assert tz_zz_re.search("15/01/2019T04:05:06.789120") is None
-
- tz_name_re = parser.DateTimeParser._TZ_NAME_RE
- assert tz_name_re.findall("Europe/Warsaw") == ["Europe/Warsaw"]
- assert tz_name_re.findall("GMT") == ["GMT"]
-
- def test_timestamp(self):
- timestamp_re = parser.DateTimeParser._TIMESTAMP_RE
- assert timestamp_re.findall("1565707550.452729") == ["1565707550.452729"]
- assert timestamp_re.findall("-1565707550.452729") == ["-1565707550.452729"]
- assert timestamp_re.findall("-1565707550") == ["-1565707550"]
- assert timestamp_re.findall("1565707550") == ["1565707550"]
- assert timestamp_re.findall("1565707550.") == []
- assert timestamp_re.findall(".1565707550") == []
-
- def test_timestamp_milli(self):
- timestamp_expanded_re = parser.DateTimeParser._TIMESTAMP_EXPANDED_RE
- assert timestamp_expanded_re.findall("-1565707550") == ["-1565707550"]
- assert timestamp_expanded_re.findall("1565707550") == ["1565707550"]
- assert timestamp_expanded_re.findall("1565707550.452729") == []
- assert timestamp_expanded_re.findall("1565707550.") == []
- assert timestamp_expanded_re.findall(".1565707550") == []
-
- def test_time(self):
- time_re = parser.DateTimeParser._TIME_RE
- time_separators = [":", ""]
-
- for sep in time_separators:
- assert time_re.findall("12") == [("12", "", "", "", "")]
- assert time_re.findall("12{sep}35".format(sep=sep)) == [
- ("12", "35", "", "", "")
- ]
- assert time_re.findall("12{sep}35{sep}46".format(sep=sep)) == [
- ("12", "35", "46", "", "")
- ]
- assert time_re.findall("12{sep}35{sep}46.952313".format(sep=sep)) == [
- ("12", "35", "46", ".", "952313")
- ]
- assert time_re.findall("12{sep}35{sep}46,952313".format(sep=sep)) == [
- ("12", "35", "46", ",", "952313")
- ]
-
- assert time_re.findall("12:") == []
- assert time_re.findall("12:35:46.") == []
- assert time_re.findall("12:35:46,") == []
-
-
-@pytest.mark.usefixtures("dt_parser")
-class TestDateTimeParserISO:
- def test_YYYY(self):
-
- assert self.parser.parse_iso("2013") == datetime(2013, 1, 1)
-
- def test_YYYY_DDDD(self):
- assert self.parser.parse_iso("1998-136") == datetime(1998, 5, 16)
-
- assert self.parser.parse_iso("1998-006") == datetime(1998, 1, 6)
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("1998-456")
-
- # 2016 is a leap year, so Feb 29 exists (leap day)
- assert self.parser.parse_iso("2016-059") == datetime(2016, 2, 28)
- assert self.parser.parse_iso("2016-060") == datetime(2016, 2, 29)
- assert self.parser.parse_iso("2016-061") == datetime(2016, 3, 1)
-
- # 2017 is not a leap year, so Feb 29 does not exist
- assert self.parser.parse_iso("2017-059") == datetime(2017, 2, 28)
- assert self.parser.parse_iso("2017-060") == datetime(2017, 3, 1)
- assert self.parser.parse_iso("2017-061") == datetime(2017, 3, 2)
-
- # Since 2016 is a leap year, the 366th day falls in the same year
- assert self.parser.parse_iso("2016-366") == datetime(2016, 12, 31)
-
- # Since 2017 is not a leap year, the 366th day falls in the next year
- assert self.parser.parse_iso("2017-366") == datetime(2018, 1, 1)
-
- def test_YYYY_DDDD_HH_mm_ssZ(self):
-
- assert self.parser.parse_iso("2013-036 04:05:06+01:00") == datetime(
- 2013, 2, 5, 4, 5, 6, tzinfo=tz.tzoffset(None, 3600)
- )
-
- assert self.parser.parse_iso("2013-036 04:05:06Z") == datetime(
- 2013, 2, 5, 4, 5, 6, tzinfo=tz.tzutc()
- )
-
- def test_YYYY_MM_DDDD(self):
- with pytest.raises(ParserError):
- self.parser.parse_iso("2014-05-125")
-
- def test_YYYY_MM(self):
-
- for separator in DateTimeParser.SEPARATORS:
- assert self.parser.parse_iso(separator.join(("2013", "02"))) == datetime(
- 2013, 2, 1
- )
-
- def test_YYYY_MM_DD(self):
-
- for separator in DateTimeParser.SEPARATORS:
- assert self.parser.parse_iso(
- separator.join(("2013", "02", "03"))
- ) == datetime(2013, 2, 3)
-
- def test_YYYY_MM_DDTHH_mmZ(self):
-
- assert self.parser.parse_iso("2013-02-03T04:05+01:00") == datetime(
- 2013, 2, 3, 4, 5, tzinfo=tz.tzoffset(None, 3600)
- )
-
- def test_YYYY_MM_DDTHH_mm(self):
-
- assert self.parser.parse_iso("2013-02-03T04:05") == datetime(2013, 2, 3, 4, 5)
-
- def test_YYYY_MM_DDTHH(self):
-
- assert self.parser.parse_iso("2013-02-03T04") == datetime(2013, 2, 3, 4)
-
- def test_YYYY_MM_DDTHHZ(self):
-
- assert self.parser.parse_iso("2013-02-03T04+01:00") == datetime(
- 2013, 2, 3, 4, tzinfo=tz.tzoffset(None, 3600)
- )
-
- def test_YYYY_MM_DDTHH_mm_ssZ(self):
-
- assert self.parser.parse_iso("2013-02-03T04:05:06+01:00") == datetime(
- 2013, 2, 3, 4, 5, 6, tzinfo=tz.tzoffset(None, 3600)
- )
-
- def test_YYYY_MM_DDTHH_mm_ss(self):
-
- assert self.parser.parse_iso("2013-02-03T04:05:06") == datetime(
- 2013, 2, 3, 4, 5, 6
- )
-
- def test_YYYY_MM_DD_HH_mmZ(self):
-
- assert self.parser.parse_iso("2013-02-03 04:05+01:00") == datetime(
- 2013, 2, 3, 4, 5, tzinfo=tz.tzoffset(None, 3600)
- )
-
- def test_YYYY_MM_DD_HH_mm(self):
-
- assert self.parser.parse_iso("2013-02-03 04:05") == datetime(2013, 2, 3, 4, 5)
-
- def test_YYYY_MM_DD_HH(self):
-
- assert self.parser.parse_iso("2013-02-03 04") == datetime(2013, 2, 3, 4)
-
- def test_invalid_time(self):
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2013-02-03T")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2013-02-03 044")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2013-02-03 04:05:06.")
-
- def test_YYYY_MM_DD_HH_mm_ssZ(self):
-
- assert self.parser.parse_iso("2013-02-03 04:05:06+01:00") == datetime(
- 2013, 2, 3, 4, 5, 6, tzinfo=tz.tzoffset(None, 3600)
- )
-
- def test_YYYY_MM_DD_HH_mm_ss(self):
-
- assert self.parser.parse_iso("2013-02-03 04:05:06") == datetime(
- 2013, 2, 3, 4, 5, 6
- )
-
- def test_YYYY_MM_DDTHH_mm_ss_S(self):
-
- assert self.parser.parse_iso("2013-02-03T04:05:06.7") == datetime(
- 2013, 2, 3, 4, 5, 6, 700000
- )
-
- assert self.parser.parse_iso("2013-02-03T04:05:06.78") == datetime(
- 2013, 2, 3, 4, 5, 6, 780000
- )
-
- assert self.parser.parse_iso("2013-02-03T04:05:06.789") == datetime(
- 2013, 2, 3, 4, 5, 6, 789000
- )
-
- assert self.parser.parse_iso("2013-02-03T04:05:06.7891") == datetime(
- 2013, 2, 3, 4, 5, 6, 789100
- )
-
- assert self.parser.parse_iso("2013-02-03T04:05:06.78912") == datetime(
- 2013, 2, 3, 4, 5, 6, 789120
- )
-
- # ISO 8601:2004(E), ISO, 2004-12-01, 4.2.2.4 ... the decimal fraction
- # shall be divided from the integer part by the decimal sign specified
- # in ISO 31-0, i.e. the comma [,] or full stop [.]. Of these, the comma
- # is the preferred sign.
- assert self.parser.parse_iso("2013-02-03T04:05:06,789123678") == datetime(
- 2013, 2, 3, 4, 5, 6, 789124
- )
-
- # there is no limit on the number of decimal places
- assert self.parser.parse_iso("2013-02-03T04:05:06.789123678") == datetime(
- 2013, 2, 3, 4, 5, 6, 789124
- )
-
- def test_YYYY_MM_DDTHH_mm_ss_SZ(self):
-
- assert self.parser.parse_iso("2013-02-03T04:05:06.7+01:00") == datetime(
- 2013, 2, 3, 4, 5, 6, 700000, tzinfo=tz.tzoffset(None, 3600)
- )
-
- assert self.parser.parse_iso("2013-02-03T04:05:06.78+01:00") == datetime(
- 2013, 2, 3, 4, 5, 6, 780000, tzinfo=tz.tzoffset(None, 3600)
- )
-
- assert self.parser.parse_iso("2013-02-03T04:05:06.789+01:00") == datetime(
- 2013, 2, 3, 4, 5, 6, 789000, tzinfo=tz.tzoffset(None, 3600)
- )
-
- assert self.parser.parse_iso("2013-02-03T04:05:06.7891+01:00") == datetime(
- 2013, 2, 3, 4, 5, 6, 789100, tzinfo=tz.tzoffset(None, 3600)
- )
-
- assert self.parser.parse_iso("2013-02-03T04:05:06.78912+01:00") == datetime(
- 2013, 2, 3, 4, 5, 6, 789120, tzinfo=tz.tzoffset(None, 3600)
- )
-
- assert self.parser.parse_iso("2013-02-03 04:05:06.78912Z") == datetime(
- 2013, 2, 3, 4, 5, 6, 789120, tzinfo=tz.tzutc()
- )
-
- def test_W(self):
-
- assert self.parser.parse_iso("2011-W05-4") == datetime(2011, 2, 3)
-
- assert self.parser.parse_iso("2011-W05-4T14:17:01") == datetime(
- 2011, 2, 3, 14, 17, 1
- )
-
- assert self.parser.parse_iso("2011W054") == datetime(2011, 2, 3)
-
- assert self.parser.parse_iso("2011W054T141701") == datetime(
- 2011, 2, 3, 14, 17, 1
- )
-
- def test_invalid_Z(self):
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2013-02-03T04:05:06.78912z")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2013-02-03T04:05:06.78912zz")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2013-02-03T04:05:06.78912Zz")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2013-02-03T04:05:06.78912ZZ")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2013-02-03T04:05:06.78912+Z")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2013-02-03T04:05:06.78912-Z")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2013-02-03T04:05:06.78912 Z")
-
- def test_parse_subsecond(self):
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 900000)
- assert self.parser.parse_iso("2013-01-01 12:30:45.9") == self.expected
-
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 980000)
- assert self.parser.parse_iso("2013-01-01 12:30:45.98") == self.expected
-
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 987000)
- assert self.parser.parse_iso("2013-01-01 12:30:45.987") == self.expected
-
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 987600)
- assert self.parser.parse_iso("2013-01-01 12:30:45.9876") == self.expected
-
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 987650)
- assert self.parser.parse_iso("2013-01-01 12:30:45.98765") == self.expected
-
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 987654)
- assert self.parser.parse_iso("2013-01-01 12:30:45.987654") == self.expected
-
- # use comma as subsecond separator
- self.expected = datetime(2013, 1, 1, 12, 30, 45, 987654)
- assert self.parser.parse_iso("2013-01-01 12:30:45,987654") == self.expected
-
- def test_gnu_date(self):
- """Regression tests for parsing output from GNU date."""
- # date -Ins
- assert self.parser.parse_iso("2016-11-16T09:46:30,895636557-0800") == datetime(
- 2016, 11, 16, 9, 46, 30, 895636, tzinfo=tz.tzoffset(None, -3600 * 8)
- )
-
- # date --rfc-3339=ns
- assert self.parser.parse_iso("2016-11-16 09:51:14.682141526-08:00") == datetime(
- 2016, 11, 16, 9, 51, 14, 682142, tzinfo=tz.tzoffset(None, -3600 * 8)
- )
-
- def test_isoformat(self):
-
- dt = datetime.utcnow()
-
- assert self.parser.parse_iso(dt.isoformat()) == dt
-
- def test_parse_iso_normalize_whitespace(self):
- assert self.parser.parse_iso(
- "2013-036 \t 04:05:06Z", normalize_whitespace=True
- ) == datetime(2013, 2, 5, 4, 5, 6, tzinfo=tz.tzutc())
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2013-036 \t 04:05:06Z")
-
- assert self.parser.parse_iso(
- "\t 2013-05-05T12:30:45.123456 \t \n", normalize_whitespace=True
- ) == datetime(2013, 5, 5, 12, 30, 45, 123456)
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("\t 2013-05-05T12:30:45.123456 \t \n")
-
- def test_parse_iso_with_leading_and_trailing_whitespace(self):
- datetime_string = " 2016-11-15T06:37:19.123456"
- with pytest.raises(ParserError):
- self.parser.parse_iso(datetime_string)
-
- datetime_string = " 2016-11-15T06:37:19.123456 "
- with pytest.raises(ParserError):
- self.parser.parse_iso(datetime_string)
-
- datetime_string = "2016-11-15T06:37:19.123456 "
- with pytest.raises(ParserError):
- self.parser.parse_iso(datetime_string)
-
- datetime_string = "2016-11-15T 06:37:19.123456"
- with pytest.raises(ParserError):
- self.parser.parse_iso(datetime_string)
-
- # leading whitespace
- datetime_string = " 2016-11-15 06:37:19.123456"
- with pytest.raises(ParserError):
- self.parser.parse_iso(datetime_string)
-
- # trailing whitespace
- datetime_string = "2016-11-15 06:37:19.123456 "
- with pytest.raises(ParserError):
- self.parser.parse_iso(datetime_string)
-
- datetime_string = " 2016-11-15 06:37:19.123456 "
- with pytest.raises(ParserError):
- self.parser.parse_iso(datetime_string)
-
- # two dividing spaces
- datetime_string = "2016-11-15 06:37:19.123456"
- with pytest.raises(ParserError):
- self.parser.parse_iso(datetime_string)
-
- def test_parse_iso_with_extra_words_at_start_and_end_invalid(self):
- test_inputs = [
- "blah2016",
- "blah2016blah",
- "blah 2016 blah",
- "blah 2016",
- "2016 blah",
- "blah 2016-05-16 04:05:06.789120",
- "2016-05-16 04:05:06.789120 blah",
- "blah 2016-05-16T04:05:06.789120",
- "2016-05-16T04:05:06.789120 blah",
- "2016blah",
- "2016-05blah",
- "2016-05-16blah",
- "2016-05-16T04:05:06.789120blah",
- "2016-05-16T04:05:06.789120ZblahZ",
- "2016-05-16T04:05:06.789120Zblah",
- "2016-05-16T04:05:06.789120blahZ",
- "Meet me at 2016-05-16T04:05:06.789120 at the restaurant.",
- "Meet me at 2016-05-16 04:05:06.789120 at the restaurant.",
- ]
-
- for ti in test_inputs:
- with pytest.raises(ParserError):
- self.parser.parse_iso(ti)
-
- def test_iso8601_basic_format(self):
- assert self.parser.parse_iso("20180517") == datetime(2018, 5, 17)
-
- assert self.parser.parse_iso("20180517T10") == datetime(2018, 5, 17, 10)
-
- assert self.parser.parse_iso("20180517T105513.843456") == datetime(
- 2018, 5, 17, 10, 55, 13, 843456
- )
-
- assert self.parser.parse_iso("20180517T105513Z") == datetime(
- 2018, 5, 17, 10, 55, 13, tzinfo=tz.tzutc()
- )
-
- assert self.parser.parse_iso("20180517T105513.843456-0700") == datetime(
- 2018, 5, 17, 10, 55, 13, 843456, tzinfo=tz.tzoffset(None, -25200)
- )
-
- assert self.parser.parse_iso("20180517T105513-0700") == datetime(
- 2018, 5, 17, 10, 55, 13, tzinfo=tz.tzoffset(None, -25200)
- )
-
- assert self.parser.parse_iso("20180517T105513-07") == datetime(
- 2018, 5, 17, 10, 55, 13, tzinfo=tz.tzoffset(None, -25200)
- )
-
- # ordinal in basic format: YYYYDDD
- assert self.parser.parse_iso("1998136") == datetime(1998, 5, 16)
-
- # timezone requires +- separator
- with pytest.raises(ParserError):
- self.parser.parse_iso("20180517T1055130700")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("20180517T10551307")
-
- # too many digits in date
- with pytest.raises(ParserError):
- self.parser.parse_iso("201860517T105513Z")
-
- # too many digits in time
- with pytest.raises(ParserError):
- self.parser.parse_iso("20180517T1055213Z")
-
- def test_midnight_end_day(self):
- assert self.parser.parse_iso("2019-10-30T24:00:00") == datetime(
- 2019, 10, 31, 0, 0, 0, 0
- )
- assert self.parser.parse_iso("2019-10-30T24:00") == datetime(
- 2019, 10, 31, 0, 0, 0, 0
- )
- assert self.parser.parse_iso("2019-10-30T24:00:00.0") == datetime(
- 2019, 10, 31, 0, 0, 0, 0
- )
- assert self.parser.parse_iso("2019-10-31T24:00:00") == datetime(
- 2019, 11, 1, 0, 0, 0, 0
- )
- assert self.parser.parse_iso("2019-12-31T24:00:00") == datetime(
- 2020, 1, 1, 0, 0, 0, 0
- )
- assert self.parser.parse_iso("2019-12-31T23:59:59.9999999") == datetime(
- 2020, 1, 1, 0, 0, 0, 0
- )
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2019-12-31T24:01:00")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2019-12-31T24:00:01")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2019-12-31T24:00:00.1")
-
- with pytest.raises(ParserError):
- self.parser.parse_iso("2019-12-31T24:00:00.999999")
-
-
-@pytest.mark.usefixtures("tzinfo_parser")
-class TestTzinfoParser:
- def test_parse_local(self):
-
- assert self.parser.parse("local") == tz.tzlocal()
-
- def test_parse_utc(self):
-
- assert self.parser.parse("utc") == tz.tzutc()
- assert self.parser.parse("UTC") == tz.tzutc()
-
- def test_parse_iso(self):
-
- assert self.parser.parse("01:00") == tz.tzoffset(None, 3600)
- assert self.parser.parse("11:35") == tz.tzoffset(None, 11 * 3600 + 2100)
- assert self.parser.parse("+01:00") == tz.tzoffset(None, 3600)
- assert self.parser.parse("-01:00") == tz.tzoffset(None, -3600)
-
- assert self.parser.parse("0100") == tz.tzoffset(None, 3600)
- assert self.parser.parse("+0100") == tz.tzoffset(None, 3600)
- assert self.parser.parse("-0100") == tz.tzoffset(None, -3600)
-
- assert self.parser.parse("01") == tz.tzoffset(None, 3600)
- assert self.parser.parse("+01") == tz.tzoffset(None, 3600)
- assert self.parser.parse("-01") == tz.tzoffset(None, -3600)
-
- def test_parse_str(self):
-
- assert self.parser.parse("US/Pacific") == tz.gettz("US/Pacific")
-
- def test_parse_fails(self):
-
- with pytest.raises(parser.ParserError):
- self.parser.parse("fail")
-
-
-@pytest.mark.usefixtures("dt_parser")
-class TestDateTimeParserMonthName:
- def test_shortmonth_capitalized(self):
-
- assert self.parser.parse("2013-Jan-01", "YYYY-MMM-DD") == datetime(2013, 1, 1)
-
- def test_shortmonth_allupper(self):
-
- assert self.parser.parse("2013-JAN-01", "YYYY-MMM-DD") == datetime(2013, 1, 1)
-
- def test_shortmonth_alllower(self):
-
- assert self.parser.parse("2013-jan-01", "YYYY-MMM-DD") == datetime(2013, 1, 1)
-
- def test_month_capitalized(self):
-
- assert self.parser.parse("2013-January-01", "YYYY-MMMM-DD") == datetime(
- 2013, 1, 1
- )
-
- def test_month_allupper(self):
-
- assert self.parser.parse("2013-JANUARY-01", "YYYY-MMMM-DD") == datetime(
- 2013, 1, 1
- )
-
- def test_month_alllower(self):
-
- assert self.parser.parse("2013-january-01", "YYYY-MMMM-DD") == datetime(
- 2013, 1, 1
- )
-
- def test_localized_month_name(self):
- parser_ = parser.DateTimeParser("fr_fr")
-
- assert parser_.parse("2013-Janvier-01", "YYYY-MMMM-DD") == datetime(2013, 1, 1)
-
- def test_localized_month_abbreviation(self):
- parser_ = parser.DateTimeParser("it_it")
-
- assert parser_.parse("2013-Gen-01", "YYYY-MMM-DD") == datetime(2013, 1, 1)
-
-
-@pytest.mark.usefixtures("dt_parser")
-class TestDateTimeParserMeridians:
- def test_meridians_lowercase(self):
- assert self.parser.parse("2013-01-01 5am", "YYYY-MM-DD ha") == datetime(
- 2013, 1, 1, 5
- )
-
- assert self.parser.parse("2013-01-01 5pm", "YYYY-MM-DD ha") == datetime(
- 2013, 1, 1, 17
- )
-
- def test_meridians_capitalized(self):
- assert self.parser.parse("2013-01-01 5AM", "YYYY-MM-DD hA") == datetime(
- 2013, 1, 1, 5
- )
-
- assert self.parser.parse("2013-01-01 5PM", "YYYY-MM-DD hA") == datetime(
- 2013, 1, 1, 17
- )
-
- def test_localized_meridians_lowercase(self):
- parser_ = parser.DateTimeParser("hu_hu")
- assert parser_.parse("2013-01-01 5 de", "YYYY-MM-DD h a") == datetime(
- 2013, 1, 1, 5
- )
-
- assert parser_.parse("2013-01-01 5 du", "YYYY-MM-DD h a") == datetime(
- 2013, 1, 1, 17
- )
-
- def test_localized_meridians_capitalized(self):
- parser_ = parser.DateTimeParser("hu_hu")
- assert parser_.parse("2013-01-01 5 DE", "YYYY-MM-DD h A") == datetime(
- 2013, 1, 1, 5
- )
-
- assert parser_.parse("2013-01-01 5 DU", "YYYY-MM-DD h A") == datetime(
- 2013, 1, 1, 17
- )
-
- # regression test for issue #607
- def test_es_meridians(self):
- parser_ = parser.DateTimeParser("es")
-
- assert parser_.parse(
- "Junio 30, 2019 - 08:00 pm", "MMMM DD, YYYY - hh:mm a"
- ) == datetime(2019, 6, 30, 20, 0)
-
- with pytest.raises(ParserError):
- parser_.parse(
- "Junio 30, 2019 - 08:00 pasdfasdfm", "MMMM DD, YYYY - hh:mm a"
- )
-
- def test_fr_meridians(self):
- parser_ = parser.DateTimeParser("fr")
-
- # the French locale always uses a 24 hour clock, so it does not support meridians
- with pytest.raises(ParserError):
- parser_.parse("Janvier 30, 2019 - 08:00 pm", "MMMM DD, YYYY - hh:mm a")
-
-
-@pytest.mark.usefixtures("dt_parser")
-class TestDateTimeParserMonthOrdinalDay:
- def test_english(self):
- parser_ = parser.DateTimeParser("en_us")
-
- assert parser_.parse("January 1st, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 1
- )
- assert parser_.parse("January 2nd, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 2
- )
- assert parser_.parse("January 3rd, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 3
- )
- assert parser_.parse("January 4th, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 4
- )
- assert parser_.parse("January 11th, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 11
- )
- assert parser_.parse("January 12th, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 12
- )
- assert parser_.parse("January 13th, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 13
- )
- assert parser_.parse("January 21st, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 21
- )
- assert parser_.parse("January 31st, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 31
- )
-
- with pytest.raises(ParserError):
- parser_.parse("January 1th, 2013", "MMMM Do, YYYY")
-
- with pytest.raises(ParserError):
- parser_.parse("January 11st, 2013", "MMMM Do, YYYY")
-
- def test_italian(self):
- parser_ = parser.DateTimeParser("it_it")
-
- assert parser_.parse("Gennaio 1º, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 1
- )
-
- def test_spanish(self):
- parser_ = parser.DateTimeParser("es_es")
-
- assert parser_.parse("Enero 1º, 2013", "MMMM Do, YYYY") == datetime(2013, 1, 1)
-
- def test_french(self):
- parser_ = parser.DateTimeParser("fr_fr")
-
- assert parser_.parse("Janvier 1er, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 1
- )
-
- assert parser_.parse("Janvier 2e, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 2
- )
-
- assert parser_.parse("Janvier 11e, 2013", "MMMM Do, YYYY") == datetime(
- 2013, 1, 11
- )
-
-
-@pytest.mark.usefixtures("dt_parser")
-class TestDateTimeParserSearchDate:
- def test_parse_search(self):
-
- assert self.parser.parse(
- "Today is 25 of September of 2003", "DD of MMMM of YYYY"
- ) == datetime(2003, 9, 25)
-
- def test_parse_search_with_numbers(self):
-
- assert self.parser.parse(
- "2000 people met the 2012-01-01 12:05:10", "YYYY-MM-DD HH:mm:ss"
- ) == datetime(2012, 1, 1, 12, 5, 10)
-
- assert self.parser.parse(
- "Call 01-02-03 on 79-01-01 12:05:10", "YY-MM-DD HH:mm:ss"
- ) == datetime(1979, 1, 1, 12, 5, 10)
-
- def test_parse_search_with_names(self):
-
- assert self.parser.parse("June was born in May 1980", "MMMM YYYY") == datetime(
- 1980, 5, 1
- )
-
- def test_parse_search_locale_with_names(self):
- p = parser.DateTimeParser("sv_se")
-
- assert p.parse("Jan föddes den 31 Dec 1980", "DD MMM YYYY") == datetime(
- 1980, 12, 31
- )
-
- assert p.parse("Jag föddes den 25 Augusti 1975", "DD MMMM YYYY") == datetime(
- 1975, 8, 25
- )
-
- def test_parse_search_fails(self):
-
- with pytest.raises(parser.ParserError):
- self.parser.parse("Jag föddes den 25 Augusti 1975", "DD MMMM YYYY")
-
- def test_escape(self):
-
- format = "MMMM D, YYYY [at] h:mma"
- assert self.parser.parse(
- "Thursday, December 10, 2015 at 5:09pm", format
- ) == datetime(2015, 12, 10, 17, 9)
-
- format = "[MMMM] M D, YYYY [at] h:mma"
- assert self.parser.parse("MMMM 12 10, 2015 at 5:09pm", format) == datetime(
- 2015, 12, 10, 17, 9
- )
-
- format = "[It happened on] MMMM Do [in the year] YYYY [a long time ago]"
- assert self.parser.parse(
- "It happened on November 25th in the year 1990 a long time ago", format
- ) == datetime(1990, 11, 25)
-
- format = "[It happened on] MMMM Do [in the][ year] YYYY [a long time ago]"
- assert self.parser.parse(
- "It happened on November 25th in the year 1990 a long time ago", format
- ) == datetime(1990, 11, 25)
-
- format = "[I'm][ entirely][ escaped,][ weee!]"
- assert self.parser.parse("I'm entirely escaped, weee!", format) == datetime(
- 1, 1, 1
- )
-
- # Special RegEx characters
- format = "MMM DD, YYYY |^${}().*+?<>-& h:mm A"
- assert self.parser.parse(
- "Dec 31, 2017 |^${}().*+?<>-& 2:00 AM", format
- ) == datetime(2017, 12, 31, 2, 0)
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_util.py b/openpype/modules/ftrack/python2_vendor/arrow/tests/test_util.py
deleted file mode 100644
index e48b4de066..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/tests/test_util.py
+++ /dev/null
@@ -1,81 +0,0 @@
-# -*- coding: utf-8 -*-
-import time
-from datetime import datetime
-
-import pytest
-
-from arrow import util
-
-
-class TestUtil:
- def test_next_weekday(self):
- # Get first Monday after epoch
- assert util.next_weekday(datetime(1970, 1, 1), 0) == datetime(1970, 1, 5)
-
- # Get first Tuesday after epoch
- assert util.next_weekday(datetime(1970, 1, 1), 1) == datetime(1970, 1, 6)
-
- # Get first Wednesday after epoch
- assert util.next_weekday(datetime(1970, 1, 1), 2) == datetime(1970, 1, 7)
-
- # Get first Thursday after epoch
- assert util.next_weekday(datetime(1970, 1, 1), 3) == datetime(1970, 1, 1)
-
- # Get first Friday after epoch
- assert util.next_weekday(datetime(1970, 1, 1), 4) == datetime(1970, 1, 2)
-
- # Get first Saturday after epoch
- assert util.next_weekday(datetime(1970, 1, 1), 5) == datetime(1970, 1, 3)
-
- # Get first Sunday after epoch
- assert util.next_weekday(datetime(1970, 1, 1), 6) == datetime(1970, 1, 4)
-
- # Weekdays are 0-indexed
- with pytest.raises(ValueError):
- util.next_weekday(datetime(1970, 1, 1), 7)
-
- with pytest.raises(ValueError):
- util.next_weekday(datetime(1970, 1, 1), -1)
-
- def test_total_seconds(self):
- td = datetime(2019, 1, 1) - datetime(2018, 1, 1)
- assert util.total_seconds(td) == td.total_seconds()
-
- def test_is_timestamp(self):
- timestamp_float = time.time()
- timestamp_int = int(timestamp_float)
-
- assert util.is_timestamp(timestamp_int)
- assert util.is_timestamp(timestamp_float)
- assert util.is_timestamp(str(timestamp_int))
- assert util.is_timestamp(str(timestamp_float))
-
- assert not util.is_timestamp(True)
- assert not util.is_timestamp(False)
-
- class InvalidTimestamp:
- pass
-
- assert not util.is_timestamp(InvalidTimestamp())
-
- full_datetime = "2019-06-23T13:12:42"
- assert not util.is_timestamp(full_datetime)
-
- def test_normalize_timestamp(self):
- timestamp = 1591161115.194556
- millisecond_timestamp = 1591161115194
- microsecond_timestamp = 1591161115194556
-
- assert util.normalize_timestamp(timestamp) == timestamp
- assert util.normalize_timestamp(millisecond_timestamp) == 1591161115.194
- assert util.normalize_timestamp(microsecond_timestamp) == 1591161115.194556
-
- with pytest.raises(ValueError):
- util.normalize_timestamp(3e17)
-
- def test_iso_gregorian(self):
- with pytest.raises(ValueError):
- util.iso_to_gregorian(2013, 0, 5)
-
- with pytest.raises(ValueError):
- util.iso_to_gregorian(2013, 8, 0)
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/tests/utils.py b/openpype/modules/ftrack/python2_vendor/arrow/tests/utils.py
deleted file mode 100644
index 2a048feb3f..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/tests/utils.py
+++ /dev/null
@@ -1,16 +0,0 @@
-# -*- coding: utf-8 -*-
-import pytz
-from dateutil.zoneinfo import get_zonefile_instance
-
-from arrow import util
-
-
-def make_full_tz_list():
- dateutil_zones = set(get_zonefile_instance().zones)
- pytz_zones = set(pytz.all_timezones)
- return dateutil_zones.union(pytz_zones)
-
-
-def assert_datetime_equality(dt1, dt2, within=10):
- assert dt1.tzinfo == dt2.tzinfo
- assert abs(util.total_seconds(dt1 - dt2)) < within
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/tox.ini b/openpype/modules/ftrack/python2_vendor/arrow/tox.ini
deleted file mode 100644
index 46576b12e3..0000000000
--- a/openpype/modules/ftrack/python2_vendor/arrow/tox.ini
+++ /dev/null
@@ -1,53 +0,0 @@
-[tox]
-minversion = 3.18.0
-envlist = py{py3,27,35,36,37,38,39},lint,docs
-skip_missing_interpreters = true
-
-[gh-actions]
-python =
- pypy3: pypy3
- 2.7: py27
- 3.5: py35
- 3.6: py36
- 3.7: py37
- 3.8: py38
- 3.9: py39
-
-[testenv]
-deps = -rrequirements.txt
-allowlist_externals = pytest
-commands = pytest
-
-[testenv:lint]
-basepython = python3
-skip_install = true
-deps = pre-commit
-commands =
- pre-commit install
- pre-commit run --all-files --show-diff-on-failure
-
-[testenv:docs]
-basepython = python3
-skip_install = true
-changedir = docs
-deps =
- doc8
- sphinx
- python-dateutil
-allowlist_externals = make
-commands =
- doc8 index.rst ../README.rst --extension .rst --ignore D001
- make html SPHINXOPTS="-W --keep-going"
-
-[pytest]
-addopts = -v --cov-branch --cov=arrow --cov-fail-under=100 --cov-report=term-missing --cov-report=xml
-testpaths = tests
-
-[isort]
-line_length = 88
-multi_line_output = 3
-include_trailing_comma = true
-
-[flake8]
-per-file-ignores = arrow/__init__.py:F401
-ignore = E203,E501,W503
diff --git a/openpype/modules/log_viewer/log_view_module.py b/openpype/modules/log_viewer/log_view_module.py
index e9dba2041c..1cafbe4fbd 100644
--- a/openpype/modules/log_viewer/log_view_module.py
+++ b/openpype/modules/log_viewer/log_view_module.py
@@ -1,3 +1,4 @@
+from openpype import AYON_SERVER_ENABLED
from openpype.modules import OpenPypeModule, ITrayModule
@@ -7,6 +8,8 @@ class LogViewModule(OpenPypeModule, ITrayModule):
def initialize(self, modules_settings):
logging_settings = modules_settings[self.name]
self.enabled = logging_settings["enabled"]
+ if AYON_SERVER_ENABLED:
+ self.enabled = False
# Tray attributes
self.window = None
diff --git a/openpype/modules/project_manager_action.py b/openpype/modules/project_manager_action.py
index 5f74dd9ee5..bf55e1544d 100644
--- a/openpype/modules/project_manager_action.py
+++ b/openpype/modules/project_manager_action.py
@@ -1,3 +1,4 @@
+from openpype import AYON_SERVER_ENABLED
from openpype.modules import OpenPypeModule, ITrayAction
@@ -11,6 +12,9 @@ class ProjectManagerAction(OpenPypeModule, ITrayAction):
module_settings = modules_settings.get(self.name)
if module_settings:
enabled = module_settings.get("enabled", enabled)
+
+ if AYON_SERVER_ENABLED:
+ enabled = False
self.enabled = enabled
# Tray attributes
diff --git a/openpype/modules/royalrender/plugins/publish/collect_sequences_from_job.py b/openpype/modules/royalrender/plugins/publish/collect_sequences_from_job.py
index 4c123e4134..1bfee19e3d 100644
--- a/openpype/modules/royalrender/plugins/publish/collect_sequences_from_job.py
+++ b/openpype/modules/royalrender/plugins/publish/collect_sequences_from_job.py
@@ -189,7 +189,7 @@ class CollectSequencesFromJob(pyblish.api.ContextPlugin):
"families": list(families),
"subset": subset,
"asset": data.get(
- "asset", legacy_io.Session["AVALON_ASSET"]
+ "asset", context.data["asset"]
),
"stagingDir": root,
"frameStart": start,
diff --git a/openpype/modules/settings_action.py b/openpype/modules/settings_action.py
index 90092a133d..5950fbd910 100644
--- a/openpype/modules/settings_action.py
+++ b/openpype/modules/settings_action.py
@@ -1,3 +1,4 @@
+from openpype import AYON_SERVER_ENABLED
from openpype.modules import OpenPypeModule, ITrayAction
@@ -10,6 +11,8 @@ class SettingsAction(OpenPypeModule, ITrayAction):
def initialize(self, _modules_settings):
# This action is always enabled
self.enabled = True
+ if AYON_SERVER_ENABLED:
+ self.enabled = False
# User role
# TODO should be changeable
@@ -80,6 +83,8 @@ class LocalSettingsAction(OpenPypeModule, ITrayAction):
def initialize(self, _modules_settings):
# This action is always enabled
self.enabled = True
+ if AYON_SERVER_ENABLED:
+ self.enabled = False
# Tray attributes
self.settings_window = None
diff --git a/openpype/modules/shotgrid/plugins/publish/collect_shotgrid_entities.py b/openpype/modules/shotgrid/plugins/publish/collect_shotgrid_entities.py
index 0b03ac2e5d..43f5d1ef0e 100644
--- a/openpype/modules/shotgrid/plugins/publish/collect_shotgrid_entities.py
+++ b/openpype/modules/shotgrid/plugins/publish/collect_shotgrid_entities.py
@@ -14,7 +14,7 @@ class CollectShotgridEntities(pyblish.api.ContextPlugin):
avalon_project = context.data.get("projectEntity")
avalon_asset = context.data.get("assetEntity")
- avalon_task_name = os.getenv("AVALON_TASK")
+ avalon_task_name = context.data.get("task")
self.log.info(avalon_project)
self.log.info(avalon_asset)
diff --git a/openpype/modules/slack/plugins/publish/integrate_slack_api.py b/openpype/modules/slack/plugins/publish/integrate_slack_api.py
index 86c97586d2..4c5a39318a 100644
--- a/openpype/modules/slack/plugins/publish/integrate_slack_api.py
+++ b/openpype/modules/slack/plugins/publish/integrate_slack_api.py
@@ -350,6 +350,10 @@ class SlackPython3Operations(AbstractSlackOperations):
self.log.warning("Cannot pull user info, "
"mentions won't work", exc_info=True)
return [], []
+ except Exception:
+ self.log.warning("Cannot pull user info, "
+ "mentions won't work", exc_info=True)
+ return [], []
return users, groups
@@ -377,8 +381,12 @@ class SlackPython3Operations(AbstractSlackOperations):
return response.data["ts"], file_ids
except SlackApiError as e:
# # You will get a SlackApiError if "ok" is False
- error_str = self._enrich_error(str(e.response["error"]), channel)
- self.log.warning("Error happened {}".format(error_str))
+ if e.response.get("error"):
+ error_str = self._enrich_error(str(e.response["error"]), channel)
+ else:
+ error_str = self._enrich_error(str(e), channel)
+ self.log.warning("Error happened: {}".format(error_str),
+ exc_info=True)
except Exception as e:
error_str = self._enrich_error(str(e), channel)
self.log.warning("Not SlackAPI error", exc_info=True)
@@ -448,12 +456,14 @@ class SlackPython2Operations(AbstractSlackOperations):
if response.get("error"):
error_str = self._enrich_error(str(response.get("error")),
channel)
- self.log.warning("Error happened: {}".format(error_str))
+ self.log.warning("Error happened: {}".format(error_str),
+ exc_info=True)
else:
return response["ts"], file_ids
except Exception as e:
# You will get a SlackApiError if "ok" is False
error_str = self._enrich_error(str(e), channel)
- self.log.warning("Error happened: {}".format(error_str))
+ self.log.warning("Error happened: {}".format(error_str),
+ exc_info=True)
return None, []
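
The hunks above make the Slack integration tolerate error payloads without an `"error"` key and log full tracebacks. A minimal sketch of the same pattern, assuming the `slack_sdk` package; the `post_message` helper and its arguments are illustrative, not part of the actual plugin:

```python
from slack_sdk.errors import SlackApiError

def post_message(client, channel, text, log):
    """Post a message and degrade gracefully on Slack errors (sketch)."""
    try:
        response = client.chat_postMessage(channel=channel, text=text)
        return response.data["ts"]
    except SlackApiError as exc:
        # The "error" key may be missing from the payload, so fall back
        # to the stringified exception instead of raising a KeyError.
        error_str = exc.response.get("error") or str(exc)
        log.warning("Error happened: {}".format(error_str), exc_info=True)
    except Exception:
        log.warning("Not SlackAPI error", exc_info=True)
    return None
```
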
diff --git a/openpype/plugins/load/add_site.py b/openpype/modules/sync_server/plugins/load/add_site.py
similarity index 100%
rename from openpype/plugins/load/add_site.py
rename to openpype/modules/sync_server/plugins/load/add_site.py
diff --git a/openpype/plugins/load/remove_site.py b/openpype/modules/sync_server/plugins/load/remove_site.py
similarity index 100%
rename from openpype/plugins/load/remove_site.py
rename to openpype/modules/sync_server/plugins/load/remove_site.py
diff --git a/openpype/modules/sync_server/sync_server.py b/openpype/modules/sync_server/sync_server.py
index 98065b68a0..1b7b2dc3a6 100644
--- a/openpype/modules/sync_server/sync_server.py
+++ b/openpype/modules/sync_server/sync_server.py
@@ -536,8 +536,8 @@ class SyncServerThread(threading.Thread):
_site_is_working(self.module, project_name, remote_site,
remote_site_config)]):
self.log.debug(
- "Some of the sites {} - {} is not working properly".format(
- local_site, remote_site
+ "Some of the sites {} - {} in {} is not working properly".format( # noqa
+ local_site, remote_site, project_name
)
)
diff --git a/openpype/modules/sync_server/sync_server_module.py b/openpype/modules/sync_server/sync_server_module.py
index b85b045bd9..67856f0d8e 100644
--- a/openpype/modules/sync_server/sync_server_module.py
+++ b/openpype/modules/sync_server/sync_server_module.py
@@ -15,7 +15,7 @@ from openpype.client import (
get_representations,
get_representation_by_id,
)
-from openpype.modules import OpenPypeModule, ITrayModule
+from openpype.modules import OpenPypeModule, ITrayModule, IPluginPaths
from openpype.settings import (
get_project_settings,
get_system_settings,
@@ -39,7 +39,7 @@ from .utils import time_function, SyncStatus, SiteAlreadyPresentError
log = Logger.get_logger("SyncServer")
-class SyncServerModule(OpenPypeModule, ITrayModule):
+class SyncServerModule(OpenPypeModule, ITrayModule, IPluginPaths):
"""
Synchronization server that is syncing published files from local to
any of implemented providers (like GDrive, S3 etc.)
@@ -136,6 +136,13 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
# projects that long tasks are running on
self.projects_processed = set()
+ def get_plugin_paths(self):
+ """Deadline plugin paths."""
+ current_dir = os.path.dirname(os.path.abspath(__file__))
+ return {
+ "load": [os.path.join(current_dir, "plugins", "load")]
+ }
+
""" Start of Public API """
def add_site(self, project_name, representation_id, site_name=None,
force=False, priority=None, reset_timer=False):
@@ -204,6 +211,58 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
if remove_local_files:
self._remove_local_file(project_name, representation_id, site_name)
+ def get_progress_for_repre(self, doc, active_site, remote_site):
+ """
+ Calculates average progress for representation.
+ If site has created_dt >> fully available >> progress == 1
+ Could be calculated in aggregate if it would be too slow
+ Args:
+ doc(dict): representation dict
+ Returns:
+ (dict) with active and remote sites progress
+ {'studio': 1.0, 'gdrive': -1} - gdrive site is not present
+ -1 is used to highlight the site should be added
+ {'studio': 1.0, 'gdrive': 0.0} - gdrive site is present, not
+ uploaded yet
+ """
+ progress = {active_site: -1,
+ remote_site: -1}
+ if not doc:
+ return progress
+
+ files = {active_site: 0, remote_site: 0}
+ doc_files = doc.get("files") or []
+ for doc_file in doc_files:
+ if not isinstance(doc_file, dict):
+ continue
+
+ sites = doc_file.get("sites") or []
+ for site in sites:
+ if (
+ # Pype 2 compatibility
+ not isinstance(site, dict)
+ # Check if site name is one of progress sites
+ or site["name"] not in progress
+ ):
+ continue
+
+ files[site["name"]] += 1
+ norm_progress = max(progress[site["name"]], 0)
+ if site.get("created_dt"):
+ progress[site["name"]] = norm_progress + 1
+ elif site.get("progress"):
+ progress[site["name"]] = norm_progress + site["progress"]
+ else: # site exists, might have failed, do not add again
+ progress[site["name"]] = 0
+
+ # for example 13 fully avail. files out of 26 >> 13/26 = 0.5
+ avg_progress = {}
+ avg_progress[active_site] = \
+ progress[active_site] / max(files[active_site], 1)
+ avg_progress[remote_site] = \
+ progress[remote_site] / max(files[remote_site], 1)
+ return avg_progress
+
def compute_resource_sync_sites(self, project_name):
"""Get available resource sync sites state for publish process.
diff --git a/openpype/pipeline/__init__.py b/openpype/pipeline/__init__.py
index d656d58adc..5c15a5fa82 100644
--- a/openpype/pipeline/__init__.py
+++ b/openpype/pipeline/__init__.py
@@ -88,6 +88,7 @@ from .context_tools import (
deregister_host,
get_process_id,
+ get_global_context,
get_current_context,
get_current_host_name,
get_current_project_name,
@@ -186,6 +187,7 @@ __all__ = (
"deregister_host",
"get_process_id",
+ "get_global_context",
"get_current_context",
"get_current_host_name",
"get_current_project_name",
diff --git a/openpype/pipeline/anatomy.py b/openpype/pipeline/anatomy.py
index 30748206a3..029b5cc1ff 100644
--- a/openpype/pipeline/anatomy.py
+++ b/openpype/pipeline/anatomy.py
@@ -5,17 +5,19 @@ import platform
import collections
import numbers
+import ayon_api
import six
import time
+from openpype import AYON_SERVER_ENABLED
from openpype.settings.lib import (
get_local_settings,
)
from openpype.settings.constants import (
DEFAULT_PROJECT_KEY
)
-
from openpype.client import get_project
+from openpype.lib import Logger, get_local_site_id
from openpype.lib.path_templates import (
TemplateUnsolved,
TemplateResult,
@@ -23,7 +25,6 @@ from openpype.lib.path_templates import (
TemplatesDict,
FormatObject,
)
-from openpype.lib.log import Logger
from openpype.modules import ModulesManager
log = Logger.get_logger(__name__)
@@ -475,6 +476,13 @@ class Anatomy(BaseAnatomy):
Union[Dict[str, str], None]): Local root overrides.
"""
+ if AYON_SERVER_ENABLED:
+ if not project_name:
+ return
+ return ayon_api.get_project_roots_for_site(
+ project_name, get_local_site_id()
+ )
+
if local_settings is None:
local_settings = get_local_settings()
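
With AYON enabled, `Anatomy` now asks the server for site-specific root overrides instead of reading local settings. A condensed sketch of that branch, assuming a configured `ayon_api` connection (the helper name below is illustrative; the call itself mirrors the hunk above):

```python
import ayon_api
from openpype.lib import get_local_site_id

def get_site_root_overrides(project_name):
    """Return root overrides for the local site, or None (sketch)."""
    if not project_name:
        return None
    return ayon_api.get_project_roots_for_site(
        project_name, get_local_site_id()
    )
```
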
diff --git a/openpype/pipeline/context_tools.py b/openpype/pipeline/context_tools.py
index 97a5c1ba69..c12b76cc74 100644
--- a/openpype/pipeline/context_tools.py
+++ b/openpype/pipeline/context_tools.py
@@ -320,7 +320,7 @@ def get_current_host_name():
"""Current host name.
Function is based on currently registered host integration or environment
- variant 'AVALON_APP'.
+ variable 'AVALON_APP'.
Returns:
Union[str, None]: Name of host integration in current process or None.
@@ -333,6 +333,26 @@ def get_current_host_name():
def get_global_context():
+ """Global context defined in environment variables.
+
+ Values here may not reflect current context of host integration. The
+ function can be used on startup before a host is registered.
+
+ Use 'get_current_context' to make sure you'll get current host integration
+ context info.
+
+ Example:
+ {
+ "project_name": "Commercial",
+ "asset_name": "Bunny",
+ "task_name": "Animation",
+ }
+
+ Returns:
+ dict[str, Union[str, None]]: Context defined with environment
+ variables.
+ """
+
return {
"project_name": os.environ.get("AVALON_PROJECT"),
"asset_name": os.environ.get("AVALON_ASSET"),
diff --git a/openpype/pipeline/create/creator_plugins.py b/openpype/pipeline/create/creator_plugins.py
index 947a90ef08..c9edbbfd71 100644
--- a/openpype/pipeline/create/creator_plugins.py
+++ b/openpype/pipeline/create/creator_plugins.py
@@ -660,12 +660,12 @@ def discover_convertor_plugins(*args, **kwargs):
def discover_legacy_creator_plugins():
- from openpype.lib import Logger
+ from openpype.pipeline import get_current_project_name
log = Logger.get_logger("CreatorDiscover")
plugins = discover(LegacyCreator)
- project_name = os.environ.get("AVALON_PROJECT")
+ project_name = get_current_project_name()
system_settings = get_system_settings()
project_settings = get_project_settings(project_name)
for plugin in plugins:
diff --git a/openpype/pipeline/delivery.py b/openpype/pipeline/delivery.py
index ddde45d4da..bbd01f7a4e 100644
--- a/openpype/pipeline/delivery.py
+++ b/openpype/pipeline/delivery.py
@@ -178,7 +178,9 @@ def deliver_sequence(
anatomy_data,
format_dict,
report_items,
- log
+ log,
+ has_renumbered_frame=False,
+ new_frame_start=0
):
""" For Pype2(mainly - works in 3 too) where representation might not
contain files.
@@ -294,17 +296,30 @@ def deliver_sequence(
src_head = src_collection.head
src_tail = src_collection.tail
uploaded = 0
+ first_frame = min(src_collection.indexes)
for index in src_collection.indexes:
src_padding = src_collection.format("{padding}") % index
src_file_name = "{}{}{}".format(src_head, src_padding, src_tail)
src = os.path.normpath(
os.path.join(dir_path, src_file_name)
)
-
- dst_padding = dst_collection.format("{padding}") % index
+ dst_index = index
+ if has_renumbered_frame:
+ # Calculate offset between first frame and current frame
+ # - '0' for first frame
+ offset = new_frame_start - first_frame
+ # Add offset to new frame start
+ dst_index = index + offset
+ if dst_index < 0:
+ msg = "Renumber frame has a smaller number than original frame" # noqa
+ report_items[msg].append(src_file_name)
+ log.warning("{} <{}>".format(msg, context))
+ return report_items, 0
+ dst_padding = dst_collection.format("{padding}") % dst_index
dst = "{}{}{}".format(dst_head, dst_padding, dst_tail)
log.debug("Copying single: {} -> {}".format(src, dst))
_copy_file(src, dst)
+
uploaded += 1
return report_items, uploaded
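
Worked example of the renumbering offset introduced above: every destination index is shifted by `new_frame_start - first_frame`, so the first source frame always lands on `new_frame_start`.

```python
src_indexes = [1001, 1002, 1003]   # frames found on disk
new_frame_start = 1

first_frame = min(src_indexes)
offset = new_frame_start - first_frame          # -1000
dst_indexes = [index + offset for index in src_indexes]
print(dst_indexes)  # [1, 2, 3]
```
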
diff --git a/openpype/pipeline/legacy_io.py b/openpype/pipeline/legacy_io.py
index bde2b24c2a..60fa035c22 100644
--- a/openpype/pipeline/legacy_io.py
+++ b/openpype/pipeline/legacy_io.py
@@ -4,6 +4,7 @@ import sys
import logging
import functools
+from openpype import AYON_SERVER_ENABLED
from . import schema
from .mongodb import AvalonMongoDB, session_data_from_environment
@@ -39,8 +40,9 @@ def install():
_connection_object.Session.update(session)
_connection_object.install()
- module._mongo_client = _connection_object.mongo_client
- module._database = module.database = _connection_object.database
+ if not AYON_SERVER_ENABLED:
+ module._mongo_client = _connection_object.mongo_client
+ module._database = module.database = _connection_object.database
module._is_installed = True
diff --git a/openpype/pipeline/load/plugins.py b/openpype/pipeline/load/plugins.py
index e380d65bbe..f87fb3312d 100644
--- a/openpype/pipeline/load/plugins.py
+++ b/openpype/pipeline/load/plugins.py
@@ -39,9 +39,6 @@ class LoaderPlugin(list):
log = logging.getLogger("SubsetLoader")
log.propagate = True
- def __init__(self, context):
- self.fname = self.filepath_from_context(context)
-
@classmethod
def apply_settings(cls, project_settings, system_settings):
host_name = os.environ.get("AVALON_APP")
@@ -246,9 +243,6 @@ class SubsetLoaderPlugin(LoaderPlugin):
namespace (str, optional): Use pre-defined namespace
"""
- def __init__(self, context):
- pass
-
def discover_loader_plugins(project_name=None):
from openpype.lib import Logger
diff --git a/openpype/pipeline/load/utils.py b/openpype/pipeline/load/utils.py
index 2c40280ccd..42418be40e 100644
--- a/openpype/pipeline/load/utils.py
+++ b/openpype/pipeline/load/utils.py
@@ -314,7 +314,12 @@ def load_with_repre_context(
)
)
- loader = Loader(repre_context)
+ loader = Loader()
+
+ # Backwards compatibility: originally the loader's __init__ took the
+ # representation context and set the `fname` attribute to the filename to load
+ loader.fname = get_representation_path_from_context(repre_context)
+
return loader.load(repre_context, name, namespace, options)
@@ -338,8 +343,7 @@ def load_with_subset_context(
)
)
- loader = Loader(subset_context)
- return loader.load(subset_context, name, namespace, options)
+ return Loader().load(subset_context, name, namespace, options)
def load_with_subset_contexts(
@@ -364,8 +368,7 @@ def load_with_subset_contexts(
"Running '{}' on '{}'".format(Loader.__name__, joined_subset_names)
)
- loader = Loader(subset_contexts)
- return loader.load(subset_contexts, name, namespace, options)
+ return Loader().load(subset_contexts, name, namespace, options)
def load_container(
@@ -447,8 +450,7 @@ def remove_container(container):
.format(container.get("loader"))
)
- loader = Loader(get_representation_context(container["representation"]))
- return loader.remove(container)
+ return Loader().remove(container)
def update_container(container, version=-1):
@@ -498,8 +500,7 @@ def update_container(container, version=-1):
.format(container.get("loader"))
)
- loader = Loader(get_representation_context(container["representation"]))
- return loader.update(container, new_representation)
+ return Loader().update(container, new_representation)
def switch_container(container, representation, loader_plugin=None):
@@ -635,7 +636,7 @@ def get_representation_path(representation, root=None, dbcon=None):
root = registered_root()
- def path_from_represenation():
+ def path_from_representation():
try:
template = representation["data"]["template"]
except KeyError:
@@ -759,7 +760,7 @@ def get_representation_path(representation, root=None, dbcon=None):
return os.path.normpath(path)
return (
- path_from_represenation() or
+ path_from_representation() or
path_from_config() or
path_from_data()
)
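
Since `LoaderPlugin.__init__` no longer receives the context, loader plugins should resolve paths via `filepath_from_context` rather than the legacy `self.fname`, which `load_with_repre_context` now sets only for backwards compatibility. A minimal sketch of an updated loader (the class and its family/representation filters are illustrative):

```python
from openpype.pipeline import load

class PrintPathLoader(load.LoaderPlugin):
    """Hypothetical loader showing the new path resolution."""
    families = ["*"]
    representations = ["*"]
    label = "Print Path"

    def load(self, context, name=None, namespace=None, data=None):
        # Resolve the file path from the representation context instead
        # of relying on the deprecated `self.fname` attribute.
        path = self.filepath_from_context(context)
        self.log.info("Loading: {}".format(path))
```
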
diff --git a/openpype/pipeline/mongodb.py b/openpype/pipeline/mongodb.py
index be2b67a5e7..41a44c7373 100644
--- a/openpype/pipeline/mongodb.py
+++ b/openpype/pipeline/mongodb.py
@@ -5,6 +5,7 @@ import logging
import pymongo
from uuid import uuid4
+from openpype import AYON_SERVER_ENABLED
from openpype.client import OpenPypeMongoConnection
from . import schema
@@ -187,7 +188,8 @@ class AvalonMongoDB:
return
self._installed = True
- self._database = self.mongo_client[str(os.environ["AVALON_DB"])]
+ if not AYON_SERVER_ENABLED:
+ self._database = self.mongo_client[str(os.environ["AVALON_DB"])]
def uninstall(self):
"""Close any connection to the database"""
diff --git a/openpype/pipeline/template_data.py b/openpype/pipeline/template_data.py
index 627eba5c3d..fd21930ecc 100644
--- a/openpype/pipeline/template_data.py
+++ b/openpype/pipeline/template_data.py
@@ -128,7 +128,7 @@ def get_task_template_data(project_doc, asset_doc, task_name):
Args:
project_doc (Dict[str, Any]): Queried project document.
asset_doc (Dict[str, Any]): Queried asset document.
- tas_name (str): Name of task for which data should be returned.
+ task_name (str): Name of task for which data should be returned.
Returns:
Dict[str, Dict[str, str]]: Template data
diff --git a/openpype/pipeline/thumbnail.py b/openpype/pipeline/thumbnail.py
index 39f3e17893..9d4a6f3e48 100644
--- a/openpype/pipeline/thumbnail.py
+++ b/openpype/pipeline/thumbnail.py
@@ -2,6 +2,7 @@ import os
import copy
import logging
+from openpype import AYON_SERVER_ENABLED
from openpype.client import get_project
from . import legacy_io
from .anatomy import Anatomy
@@ -131,6 +132,32 @@ class BinaryThumbnail(ThumbnailResolver):
return thumbnail_entity["data"].get("binary_data")
+class ServerThumbnailResolver(ThumbnailResolver):
+ def process(self, thumbnail_entity, thumbnail_type):
+ if not AYON_SERVER_ENABLED:
+ return None
+ data = thumbnail_entity["data"]
+ entity_type = data.get("entity_type")
+ entity_id = data.get("entity_id")
+ if not entity_type or not entity_id:
+ return None
+
+ from openpype.client.server.server_api import get_server_api_connection
+
+ project_name = self.dbcon.active_project()
+ thumbnail_id = thumbnail_entity["_id"]
+ con = get_server_api_connection()
+ filepath = con.get_thumbnail(
+ project_name, entity_type, entity_id, thumbnail_id
+ )
+ content = None
+ if filepath:
+ with open(filepath, "rb") as stream:
+ content = stream.read()
+
+ return content
+
+
# Thumbnail resolvers
def discover_thumbnail_resolvers():
return discover(ThumbnailResolver)
@@ -146,3 +173,4 @@ def register_thumbnail_resolver_path(path):
register_thumbnail_resolver(TemplateResolver)
register_thumbnail_resolver(BinaryThumbnail)
+register_thumbnail_resolver(ServerThumbnailResolver)
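
The new `ServerThumbnailResolver` follows the resolver contract: `process` returns thumbnail bytes, or `None` to let the next registered resolver try. A sketch of a custom resolver using the same registration hook; the class name and the `cached_path` key are illustrative assumptions:

```python
from openpype.pipeline.thumbnail import (
    ThumbnailResolver,
    register_thumbnail_resolver,
)

class DiskCacheThumbnail(ThumbnailResolver):
    def process(self, thumbnail_entity, thumbnail_type):
        # Return binary content if a cached file exists, otherwise None
        # so the remaining resolvers get a chance.
        filepath = thumbnail_entity["data"].get("cached_path")
        if not filepath:
            return None
        with open(filepath, "rb") as stream:
            return stream.read()

register_thumbnail_resolver(DiskCacheThumbnail)
```
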
diff --git a/openpype/pipeline/workfile/build_workfile.py b/openpype/pipeline/workfile/build_workfile.py
index 8329487839..7b153d37b9 100644
--- a/openpype/pipeline/workfile/build_workfile.py
+++ b/openpype/pipeline/workfile/build_workfile.py
@@ -9,7 +9,6 @@ from '~/openpype/pipeline/workfile/workfile_template_builder'. Which gives
more abilities to define how build happens but require more code to achive it.
"""
-import os
import re
import collections
import json
@@ -26,7 +25,6 @@ from openpype.lib import (
filter_profiles,
Logger,
)
-from openpype.pipeline import legacy_io
from openpype.pipeline.load import (
discover_loader_plugins,
IncompatibleLoaderError,
@@ -102,11 +100,17 @@ class BuildWorkfile:
List[Dict[str, Any]]: Loaded containers during build.
"""
+ from openpype.pipeline.context_tools import (
+ get_current_project_name,
+ get_current_asset_name,
+ get_current_task_name,
+ )
+
loaded_containers = []
# Get current asset name and entity
- project_name = legacy_io.active_project()
- current_asset_name = legacy_io.Session["AVALON_ASSET"]
+ project_name = get_current_project_name()
+ current_asset_name = get_current_asset_name()
current_asset_entity = get_asset_by_name(
project_name, current_asset_name
)
@@ -135,7 +139,7 @@ class BuildWorkfile:
return loaded_containers
# Get current task name
- current_task_name = legacy_io.Session["AVALON_TASK"]
+ current_task_name = get_current_task_name()
# Load workfile presets for task
self.build_presets = self.get_build_presets(
@@ -236,9 +240,14 @@ class BuildWorkfile:
Dict[str, Any]: preset per entered task name
"""
- host_name = os.environ["AVALON_APP"]
+ from openpype.pipeline.context_tools import (
+ get_current_host_name,
+ get_current_project_name,
+ )
+
+ host_name = get_current_host_name()
project_settings = get_project_settings(
- legacy_io.Session["AVALON_PROJECT"]
+ get_current_project_name()
)
host_settings = project_settings.get(host_name) or {}
@@ -651,13 +660,15 @@ class BuildWorkfile:
```
"""
+ from openpype.pipeline.context_tools import get_current_project_name
+
output = {}
if not asset_docs:
return output
asset_docs_by_ids = {asset["_id"]: asset for asset in asset_docs}
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
subsets = list(get_subsets(
project_name, asset_ids=asset_docs_by_ids.keys()
))
diff --git a/openpype/pipeline/workfile/workfile_template_builder.py b/openpype/pipeline/workfile/workfile_template_builder.py
index e1013b2645..bdb13415bf 100644
--- a/openpype/pipeline/workfile/workfile_template_builder.py
+++ b/openpype/pipeline/workfile/workfile_template_builder.py
@@ -28,8 +28,7 @@ from openpype.settings import (
get_project_settings,
get_system_settings,
)
-from openpype.host import IWorkfileHost
-from openpype.host import HostBase
+from openpype.host import IWorkfileHost, HostBase
from openpype.lib import (
Logger,
StringTemplate,
@@ -37,7 +36,7 @@ from openpype.lib import (
attribute_definitions,
)
from openpype.lib.attribute_definitions import get_attributes_keys
-from openpype.pipeline import legacy_io, Anatomy
+from openpype.pipeline import Anatomy
from openpype.pipeline.load import (
get_loaders_by_name,
get_contexts_for_repre_docs,
@@ -125,15 +124,30 @@ class AbstractTemplateBuilder(object):
@property
def project_name(self):
- return legacy_io.active_project()
+ if isinstance(self._host, HostBase):
+ return self._host.get_current_project_name()
+ return os.getenv("AVALON_PROJECT")
@property
def current_asset_name(self):
- return legacy_io.Session["AVALON_ASSET"]
+ if isinstance(self._host, HostBase):
+ return self._host.get_current_asset_name()
+ return os.getenv("AVALON_ASSET")
@property
def current_task_name(self):
- return legacy_io.Session["AVALON_TASK"]
+ if isinstance(self._host, HostBase):
+ return self._host.get_current_task_name()
+ return os.getenv("AVALON_TASK")
+
+ def get_current_context(self):
+ if isinstance(self._host, HostBase):
+ return self._host.get_current_context()
+ return {
+ "project_name": self.project_name,
+ "asset_name": self.current_asset_name,
+ "task_name": self.current_task_name
+ }
@property
def system_settings(self):
@@ -790,10 +804,9 @@ class AbstractTemplateBuilder(object):
fill_data["root"] = anatomy.roots
fill_data["project"] = {
"name": project_name,
- "code": anatomy["attributes"]["code"]
+ "code": anatomy.project_code,
}
-
result = StringTemplate.format_template(path, fill_data)
if result.solved:
path = result.normalized()
@@ -1705,9 +1718,10 @@ class PlaceholderCreateMixin(object):
creator_plugin = self.builder.get_creators_by_name()[creator_name]
# create subset name
- project_name = legacy_io.Session["AVALON_PROJECT"]
- task_name = legacy_io.Session["AVALON_TASK"]
- asset_name = legacy_io.Session["AVALON_ASSET"]
+ context = self._builder.get_current_context()
+ project_name = context["project_name"]
+ asset_name = context["asset_name"]
+ task_name = context["task_name"]
if legacy_create:
asset_doc = get_asset_by_name(
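
The template builder now prefers the host integration's own context and only falls back to environment variables. A condensed, free-standing version of the pattern used by the properties above:

```python
import os
from openpype.host import HostBase

def current_context(host):
    """Host-aware context with an environment fallback (sketch)."""
    if isinstance(host, HostBase):
        return host.get_current_context()
    return {
        "project_name": os.getenv("AVALON_PROJECT"),
        "asset_name": os.getenv("AVALON_ASSET"),
        "task_name": os.getenv("AVALON_TASK"),
    }
```
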
diff --git a/openpype/plugins/load/copy_file.py b/openpype/plugins/load/copy_file.py
index 163f56a83a..7fd56c8a6a 100644
--- a/openpype/plugins/load/copy_file.py
+++ b/openpype/plugins/load/copy_file.py
@@ -14,8 +14,9 @@ class CopyFile(load.LoaderPlugin):
color = get_default_entity_icon_color()
def load(self, context, name=None, namespace=None, data=None):
- self.log.info("Added copy to clipboard: {0}".format(self.fname))
- self.copy_file_to_clipboard(self.fname)
+ path = self.filepath_from_context(context)
+ self.log.info("Added copy to clipboard: {0}".format(path))
+ self.copy_file_to_clipboard(path)
@staticmethod
def copy_file_to_clipboard(path):
diff --git a/openpype/plugins/load/copy_file_path.py b/openpype/plugins/load/copy_file_path.py
index 569e5c8780..b055494e85 100644
--- a/openpype/plugins/load/copy_file_path.py
+++ b/openpype/plugins/load/copy_file_path.py
@@ -14,8 +14,9 @@ class CopyFilePath(load.LoaderPlugin):
color = "#999999"
def load(self, context, name=None, namespace=None, data=None):
- self.log.info("Added file path to clipboard: {0}".format(self.fname))
- self.copy_path_to_clipboard(self.fname)
+ path = self.filepath_from_context(context)
+ self.log.info("Added file path to clipboard: {0}".format(path))
+ self.copy_path_to_clipboard(path)
@staticmethod
def copy_path_to_clipboard(path):
diff --git a/openpype/plugins/load/delivery.py b/openpype/plugins/load/delivery.py
index 4bd4f6e9cf..3b493989bd 100644
--- a/openpype/plugins/load/delivery.py
+++ b/openpype/plugins/load/delivery.py
@@ -95,6 +95,12 @@ class DeliveryOptionsDialog(QtWidgets.QDialog):
template_label.setCursor(QtGui.QCursor(QtCore.Qt.IBeamCursor))
template_label.setTextInteractionFlags(QtCore.Qt.TextSelectableByMouse)
+ renumber_frame = QtWidgets.QCheckBox()
+
+ first_frame_start = QtWidgets.QSpinBox()
+ max_int = (1 << 32) // 2
+ first_frame_start.setRange(0, max_int - 1)
+
root_line_edit = QtWidgets.QLineEdit()
repre_checkboxes_layout = QtWidgets.QFormLayout()
@@ -118,6 +124,8 @@ class DeliveryOptionsDialog(QtWidgets.QDialog):
input_layout.addRow("Selected representations", selected_label)
input_layout.addRow("Delivery template", dropdown)
input_layout.addRow("Template value", template_label)
+ input_layout.addRow("Renumber Frame", renumber_frame)
+ input_layout.addRow("Renumber start frame", first_frame_start)
input_layout.addRow("Root", root_line_edit)
input_layout.addRow("Representations", repre_checkboxes_layout)
@@ -145,6 +153,8 @@ class DeliveryOptionsDialog(QtWidgets.QDialog):
self.selected_label = selected_label
self.template_label = template_label
self.dropdown = dropdown
+ self.first_frame_start = first_frame_start
+ self.renumber_frame = renumber_frame
self.root_line_edit = root_line_edit
self.progress_bar = progress_bar
self.text_area = text_area
@@ -181,6 +191,8 @@ class DeliveryOptionsDialog(QtWidgets.QDialog):
datetime_data = get_datetime_data()
template_name = self.dropdown.currentText()
format_dict = get_format_dict(self.anatomy, self.root_line_edit.text())
+ renumber_frame = self.renumber_frame.isChecked()
+ frame_offset = self.first_frame_start.value()
for repre in self._representations:
if repre["name"] not in selected_repres:
continue
@@ -218,9 +230,31 @@ class DeliveryOptionsDialog(QtWidgets.QDialog):
src_paths.append(src_path)
sources_and_frames = collect_frames(src_paths)
+ frames = set(sources_and_frames.values())
+ frames.discard(None)
+ first_frame = None
+ if frames:
+ first_frame = min(frames)
+
for src_path, frame in sources_and_frames.items():
args[0] = src_path
- if frame:
+ # Renumber frames
+ if renumber_frame and frame is not None:
+ # Calculate offset between
+ # first frame and current frame
+ # - '0' for first frame
+ offset = frame_offset - int(first_frame)
+ # Add offset to new frame start
+ dst_frame = int(frame) + offset
+ if dst_frame < 0:
+ msg = "Renumber frame has a smaller number than original frame" # noqa
+ report_items[msg].append(src_path)
+ self.log.warning("{} <{}>".format(
+ msg, dst_frame))
+ continue
+ frame = dst_frame
+
+ if frame is not None:
anatomy_data["frame"] = frame
new_report_items, uploaded = deliver_single_file(*args)
report_items.update(new_report_items)
diff --git a/openpype/plugins/load/open_djv.py b/openpype/plugins/load/open_djv.py
index 9c36e7f405..5c679f6a51 100644
--- a/openpype/plugins/load/open_djv.py
+++ b/openpype/plugins/load/open_djv.py
@@ -33,9 +33,11 @@ class OpenInDJV(load.LoaderPlugin):
color = "orange"
def load(self, context, name, namespace, data):
- directory = os.path.dirname(self.fname)
import clique
+ path = self.filepath_from_context(context)
+ directory = os.path.dirname(path)
+
pattern = clique.PATTERNS["frames"]
files = os.listdir(directory)
collections, remainder = clique.assemble(
@@ -48,7 +50,7 @@ class OpenInDJV(load.LoaderPlugin):
sequence = collections[0]
first_image = list(sequence)[0]
else:
- first_image = self.fname
+ first_image = path
filepath = os.path.normpath(os.path.join(directory, first_image))
self.log.info("Opening : {}".format(filepath))
diff --git a/openpype/plugins/load/open_file.py b/openpype/plugins/load/open_file.py
index 00b2ecd7c5..5c4f4901d1 100644
--- a/openpype/plugins/load/open_file.py
+++ b/openpype/plugins/load/open_file.py
@@ -28,7 +28,7 @@ class OpenFile(load.LoaderPlugin):
def load(self, context, name, namespace, data):
- path = self.fname
+ path = self.filepath_from_context(context)
if not os.path.exists(path):
raise RuntimeError("File not found: {}".format(path))
diff --git a/openpype/plugins/publish/collect_current_context.py b/openpype/plugins/publish/collect_current_context.py
index 7e42700d7d..166d75e5de 100644
--- a/openpype/plugins/publish/collect_current_context.py
+++ b/openpype/plugins/publish/collect_current_context.py
@@ -6,7 +6,7 @@ Provides:
"""
import pyblish.api
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_current_context
class CollectCurrentContext(pyblish.api.ContextPlugin):
@@ -19,24 +19,20 @@ class CollectCurrentContext(pyblish.api.ContextPlugin):
label = "Collect Current context"
def process(self, context):
- # Make sure 'legacy_io' is intalled
- legacy_io.install()
-
# Check if values are already set
project_name = context.data.get("projectName")
asset_name = context.data.get("asset")
task_name = context.data.get("task")
+
+ current_context = get_current_context()
if not project_name:
- project_name = legacy_io.current_project()
- context.data["projectName"] = project_name
+ context.data["projectName"] = current_context["project_name"]
if not asset_name:
- asset_name = legacy_io.Session.get("AVALON_ASSET")
- context.data["asset"] = asset_name
+ context.data["asset"] = current_context["asset_name"]
if not task_name:
- task_name = legacy_io.Session.get("AVALON_TASK")
- context.data["task"] = task_name
+ context.data["task"] = current_context["task_name"]
# QUESTION should we be explicit with keys? (the same on instances)
# - 'asset' -> 'assetName'
diff --git a/openpype/plugins/publish/collect_sequence_frame_data.py b/openpype/plugins/publish/collect_sequence_frame_data.py
new file mode 100644
index 0000000000..c200b245e9
--- /dev/null
+++ b/openpype/plugins/publish/collect_sequence_frame_data.py
@@ -0,0 +1,53 @@
+import pyblish.api
+import clique
+
+
+class CollectSequenceFrameData(pyblish.api.InstancePlugin):
+ """Collect Sequence Frame Data
+ If the representation includes files with frame numbers,
+ then set `frameStart` and `frameEnd` for the instance to the
+ start and end frame respectively
+ """
+
+ order = pyblish.api.CollectorOrder + 0.2
+ label = "Collect Sequence Frame Data"
+ families = ["plate", "pointcache",
+ "vdbcache", "online",
+ "render"]
+ hosts = ["traypublisher"]
+
+ def process(self, instance):
+ frame_data = self.get_frame_data_from_repre_sequence(instance)
+ if not frame_data:
+ # no frame data available, skip collecting the frame range data
+ return
+ for key, value in frame_data.items():
+ if key not in instance.data:
+ instance.data[key] = value
+ self.log.debug(f"Collected Frame range data '{key}':{value} ")
+
+ def get_frame_data_from_repre_sequence(self, instance):
+ repres = instance.data.get("representations")
+ if repres:
+ first_repre = repres[0]
+ if "ext" not in first_repre:
+ self.log.warning("Cannot find file extension"
+ " in representation data")
+ return
+
+ files = first_repre["files"]
+ collections, remainder = clique.assemble(files)
+ if not collections:
+ # No sequences detected and we can't retrieve
+ # frame range
+ self.log.debug(
+ "No sequences detected in the representation data."
+ " Skipping collecting frame range data.")
+ return
+ collection = collections[0]
+ repres_frames = list(collection.indexes)
+
+ return {
+ "frameStart": repres_frames[0],
+ "frameEnd": repres_frames[-1],
+ }
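
A quick sketch of how `clique` derives the frame range the collector stores; the file names are illustrative:

```python
import clique

files = ["shot010.1001.exr", "shot010.1002.exr", "shot010.1003.exr"]
collections, remainder = clique.assemble(files)

collection = collections[0]
frames = list(collection.indexes)
print(frames[0], frames[-1])  # 1001 1003
```
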
diff --git a/openpype/plugins/publish/extract_hierarchy_avalon.py b/openpype/plugins/publish/extract_hierarchy_avalon.py
index 493780645c..1d57545bc0 100644
--- a/openpype/plugins/publish/extract_hierarchy_avalon.py
+++ b/openpype/plugins/publish/extract_hierarchy_avalon.py
@@ -1,6 +1,7 @@
import collections
from copy import deepcopy
import pyblish.api
+from openpype import AYON_SERVER_ENABLED
from openpype.client import (
get_assets,
get_archived_assets
@@ -16,6 +17,9 @@ class ExtractHierarchyToAvalon(pyblish.api.ContextPlugin):
families = ["clip", "shot"]
def process(self, context):
+ if AYON_SERVER_ENABLED:
+ return
+
if "hierarchyContext" not in context.data:
self.log.info("skipping IntegrateHierarchyToAvalon")
return
diff --git a/openpype/plugins/publish/extract_hierarchy_to_ayon.py b/openpype/plugins/publish/extract_hierarchy_to_ayon.py
new file mode 100644
index 0000000000..915650ae41
--- /dev/null
+++ b/openpype/plugins/publish/extract_hierarchy_to_ayon.py
@@ -0,0 +1,234 @@
+import collections
+import copy
+import json
+import uuid
+import pyblish.api
+
+from ayon_api import slugify_string
+from ayon_api.entity_hub import EntityHub
+
+from openpype import AYON_SERVER_ENABLED
+
+
+def _default_json_parse(value):
+ return str(value)
+
+
+class ExtractHierarchyToAYON(pyblish.api.ContextPlugin):
+ """Create entities in AYON based on collected data."""
+
+ order = pyblish.api.ExtractorOrder - 0.01
+ label = "Extract Hierarchy To AYON"
+ families = ["clip", "shot"]
+
+ def process(self, context):
+ if not AYON_SERVER_ENABLED:
+ return
+
+ hierarchy_context = context.data.get("hierarchyContext")
+ if not hierarchy_context:
+ self.log.info("Skipping")
+ return
+
+ project_name = context.data["projectName"]
+ hierarchy_context = self._filter_hierarchy(context)
+ if not hierarchy_context:
+ self.log.info("All folders were filtered out")
+ return
+
+ self.log.debug("Hierarchy_context: {}".format(
+ json.dumps(hierarchy_context, default=_default_json_parse)
+ ))
+
+ entity_hub = EntityHub(project_name)
+ project = entity_hub.project_entity
+
+ hierarchy_match_queue = collections.deque()
+ hierarchy_match_queue.append((project, hierarchy_context))
+ while hierarchy_match_queue:
+ item = hierarchy_match_queue.popleft()
+ entity, entity_info = item
+
+ # Update attributes of entities
+ for attr_name, attr_value in entity_info["attributes"].items():
+ if attr_name in entity.attribs:
+ entity.attribs[attr_name] = attr_value
+
+ # Check if info has any children to sync
+ children_info = entity_info["children"]
+ tasks_info = entity_info["tasks"]
+ if not tasks_info and not children_info:
+ continue
+
+ # Prepare children by lowercased name to easily find matching entities
+ children_by_low_name = {
+ child.name.lower(): child
+ for child in entity.children
+ }
+
+ # Create tasks if they are not available
+ for task_info in tasks_info:
+ task_label = task_info["name"]
+ task_name = slugify_string(task_label)
+ if task_name == task_label:
+ task_label = None
+ task_entity = children_by_low_name.get(task_name.lower())
+ # TODO propagate updates of tasks if there are any
+ # TODO check if existing entity has 'task' type
+ if task_entity is None:
+ task_entity = entity_hub.add_new_task(
+ task_info["type"],
+ parent_id=entity.id,
+ name=task_name
+ )
+
+ if task_label:
+ task_entity.label = task_label
+
+ # Create/Update sub-folders
+ for child_info in children_info:
+ child_label = child_info["name"]
+ child_name = slugify_string(child_label)
+ if child_name == child_label:
+ child_label = None
+ # TODO check if existing entity has 'folder' type
+ child_entity = children_by_low_name.get(child_name.lower())
+ if child_entity is None:
+ child_entity = entity_hub.add_new_folder(
+ child_info["entity_type"],
+ parent_id=entity.id,
+ name=child_name
+ )
+
+ if child_label:
+ child_entity.label = child_label
+
+ # Add folder to queue
+ hierarchy_match_queue.append((child_entity, child_info))
+
+ entity_hub.commit_changes()
+
+ def _filter_hierarchy(self, context):
+ """Filter hierarchy context by active folder names.
+
+ Hierarchy context is filtered down to folder names used by active
+ instances.
+
+ The hierarchy context is also changed to a unified structure which
+ suits the logic of entity creation.
+
+ Output example:
+ {
+ "name": "MyProject",
+ "entity_type": "Project",
+ "attributes": {},
+ "tasks": [],
+ "children": [
+ {
+ "name": "seq_01",
+ "entity_type": "Sequence",
+ "attributes": {},
+ "tasks": [],
+ "children": [
+ ...
+ ]
+ },
+ ...
+ ]
+ }
+
+ Todos:
+            Change how active folders are defined (names won't be enough in
+ AYON).
+
+ Args:
+ context (pyblish.api.Context): Pyblish context.
+
+ Returns:
+ dict[str, Any]: Hierarchy structure filtered by folder names.
+ """
+
+ # filter only the active publishing instances
+ active_folder_names = set()
+ for instance in context:
+ if instance.data.get("publish") is not False:
+ active_folder_names.add(instance.data.get("asset"))
+
+ active_folder_names.discard(None)
+
+ self.log.debug("Active folder names: {}".format(active_folder_names))
+ if not active_folder_names:
+ return None
+
+ project_item = None
+ project_children_context = None
+ for key, value in context.data["hierarchyContext"].items():
+ project_item = copy.deepcopy(value)
+ project_children_context = project_item.pop("childs", None)
+ project_item["name"] = key
+ project_item["tasks"] = []
+ project_item["attributes"] = project_item.pop(
+ "custom_attributes", {}
+ )
+ project_item["children"] = []
+
+ if not project_children_context:
+ return None
+
+ project_id = uuid.uuid4().hex
+ items_by_id = {project_id: project_item}
+ parent_id_by_item_id = {project_id: None}
+ valid_ids = set()
+
+ hierarchy_queue = collections.deque()
+ hierarchy_queue.append((project_id, project_children_context))
+ while hierarchy_queue:
+ queue_item = hierarchy_queue.popleft()
+ parent_id, children_context = queue_item
+ if not children_context:
+ continue
+
+ for asset_name, asset_info in children_context.items():
+ if (
+ asset_name not in active_folder_names
+ and not asset_info.get("childs")
+ ):
+ continue
+ item_id = uuid.uuid4().hex
+ new_item = copy.deepcopy(asset_info)
+ new_item["name"] = asset_name
+ new_item["children"] = []
+ new_children_context = new_item.pop("childs", None)
+ tasks = new_item.pop("tasks", {})
+ task_items = []
+ for task_name, task_info in tasks.items():
+ task_info["name"] = task_name
+ task_items.append(task_info)
+ new_item["tasks"] = task_items
+ new_item["attributes"] = new_item.pop("custom_attributes", {})
+
+ items_by_id[item_id] = new_item
+ parent_id_by_item_id[item_id] = parent_id
+
+ if asset_name in active_folder_names:
+ valid_ids.add(item_id)
+ hierarchy_queue.append((item_id, new_children_context))
+
+ if not valid_ids:
+ return None
+
+ for item_id in set(valid_ids):
+ parent_id = parent_id_by_item_id[item_id]
+ while parent_id is not None and parent_id not in valid_ids:
+ valid_ids.add(parent_id)
+ parent_id = parent_id_by_item_id[parent_id]
+
+ valid_ids.discard(project_id)
+ for item_id in valid_ids:
+ parent_id = parent_id_by_item_id[item_id]
+ item = items_by_id[item_id]
+ parent_item = items_by_id[parent_id]
+ parent_item["children"].append(item)
+
+ if not project_item["children"]:
+ return None
+ return project_item
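
The subtle part of `_filter_hierarchy` above is the parent propagation: after collecting `valid_ids` for folders with active instances, every ancestor chain has to be marked valid as well, so the kept branches still reach the project root. A minimal standalone sketch of that step, using a hypothetical flat hierarchy instead of the plugin's real input:

```python
# Hypothetical flat hierarchy: item id -> parent id (None = project root).
parent_id_by_item_id = {
    "proj": None,
    "seq_01": "proj",
    "sh_010": "seq_01",
    "sh_020": "seq_01",
    "seq_02": "proj",
}

# Only one shot is active, e.g. only its instance is being published.
valid_ids = {"sh_010"}

# Walk up from every valid id and mark all ancestors as valid too,
# mirroring the while-loop in ExtractHierarchyToAYON._filter_hierarchy.
for item_id in set(valid_ids):
    parent_id = parent_id_by_item_id[item_id]
    while parent_id is not None and parent_id not in valid_ids:
        valid_ids.add(parent_id)
        parent_id = parent_id_by_item_id[parent_id]

print(sorted(valid_ids))  # ['proj', 'seq_01', 'sh_010'] - seq_02 dropped
```
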
diff --git a/openpype/plugins/publish/integrate.py b/openpype/plugins/publish/integrate.py
index f392cf67f7..e76f9ce9c4 100644
--- a/openpype/plugins/publish/integrate.py
+++ b/openpype/plugins/publish/integrate.py
@@ -148,14 +148,8 @@ class IntegrateAsset(pyblish.api.InstancePlugin):
"project", "asset", "task", "subset", "version", "representation",
"family", "hierarchy", "username", "user", "output"
]
- skip_host_families = []
def process(self, instance):
- if self._temp_skip_instance_by_settings(instance):
- return
-
- # Mark instance as processed for legacy integrator
- instance.data["processedWithNewIntegrator"] = True
# Instance should be integrated on a farm
if instance.data.get("farm"):
@@ -201,39 +195,6 @@ class IntegrateAsset(pyblish.api.InstancePlugin):
# the try, except.
file_transactions.finalize()
- def _temp_skip_instance_by_settings(self, instance):
- """Decide if instance will be processed with new or legacy integrator.
-
-        This is a temporary solution until all use cases are tested with
-        the new (this) integrator plugin.
- """
-
- host_name = instance.context.data["hostName"]
- instance_family = instance.data["family"]
- instance_families = set(instance.data.get("families") or [])
-
- skip = False
- for item in self.skip_host_families:
- if host_name not in item["host"]:
- continue
-
- families = set(item["families"])
- if instance_family in families:
- skip = True
- break
-
- for family in instance_families:
- if family in families:
- skip = True
- break
-
- if skip:
- break
-
- if skip:
- self.log.debug("Instance is marked to be skipped by settings.")
- return skip
-
def filter_representations(self, instance):
         # Prepare representations that should be integrated
repres = instance.data.get("representations")
diff --git a/openpype/plugins/publish/integrate_hero_version.py b/openpype/plugins/publish/integrate_hero_version.py
index b71207c24f..b7feeac6a4 100644
--- a/openpype/plugins/publish/integrate_hero_version.py
+++ b/openpype/plugins/publish/integrate_hero_version.py
@@ -6,6 +6,7 @@ import shutil
import pyblish.api
+from openpype import AYON_SERVER_ENABLED
from openpype.client import (
get_version_by_id,
get_hero_version_by_subset_id,
@@ -195,11 +196,20 @@ class IntegrateHeroVersion(pyblish.api.InstancePlugin):
entity_id = None
if old_version:
entity_id = old_version["_id"]
- new_hero_version = new_hero_version_doc(
- src_version_entity["_id"],
- src_version_entity["parent"],
- entity_id=entity_id
- )
+
+ if AYON_SERVER_ENABLED:
+ new_hero_version = new_hero_version_doc(
+ src_version_entity["parent"],
+ copy.deepcopy(src_version_entity["data"]),
+ src_version_entity["name"],
+ entity_id=entity_id
+ )
+ else:
+ new_hero_version = new_hero_version_doc(
+ src_version_entity["_id"],
+ src_version_entity["parent"],
+ entity_id=entity_id
+ )
if old_version:
self.log.debug("Replacing old hero version.")
diff --git a/openpype/plugins/publish/integrate_inputlinks.py b/openpype/plugins/publish/integrate_inputlinks.py
index 6964f2d938..3baa462a81 100644
--- a/openpype/plugins/publish/integrate_inputlinks.py
+++ b/openpype/plugins/publish/integrate_inputlinks.py
@@ -3,6 +3,7 @@ from collections import OrderedDict
from bson.objectid import ObjectId
import pyblish.api
+from openpype import AYON_SERVER_ENABLED
from openpype.pipeline import legacy_io
@@ -34,6 +35,7 @@ class IntegrateInputLinks(pyblish.api.ContextPlugin):
plugin.
"""
+
workfile = None
publishing = []
@@ -133,3 +135,7 @@ class IntegrateInputLinks(pyblish.api.ContextPlugin):
{"_id": version_doc["_id"]},
{"$set": {"data.inputLinks": input_links}}
)
+
+
+if AYON_SERVER_ENABLED:
+ del IntegrateInputLinks
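
The module-level `del` above is worth a note: pyblish discovers plugins by inspecting module attributes, so deleting the class name when AYON mode is active hides the legacy plugin from discovery without any extra registration logic. A minimal sketch of the same pattern, with a hypothetical flag and plugin:

```python
import pyblish.api

SOME_BACKEND_ENABLED = False  # stand-in for openpype.AYON_SERVER_ENABLED


class CollectSomething(pyblish.api.ContextPlugin):
    """Hypothetical plugin that should exist for one backend only."""

    order = pyblish.api.CollectorOrder
    label = "Collect Something"

    def process(self, context):
        self.log.info("Runs only when the backend flag allows it.")


# Removing the name from the module hides the class from
# pyblish.api.discover(), which scans module attributes for plugins.
if SOME_BACKEND_ENABLED:
    del CollectSomething
```
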
diff --git a/openpype/plugins/publish/integrate_inputlinks_ayon.py b/openpype/plugins/publish/integrate_inputlinks_ayon.py
new file mode 100644
index 0000000000..180524cd08
--- /dev/null
+++ b/openpype/plugins/publish/integrate_inputlinks_ayon.py
@@ -0,0 +1,160 @@
+import collections
+
+import pyblish.api
+from ayon_api import create_link, make_sure_link_type_exists
+
+from openpype import AYON_SERVER_ENABLED
+
+
+class IntegrateInputLinksAYON(pyblish.api.ContextPlugin):
+ """Connecting version level dependency links"""
+
+ order = pyblish.api.IntegratorOrder + 0.2
+ label = "Connect Dependency InputLinks AYON"
+
+ def process(self, context):
+ """Connect dependency links for all instances, globally
+
+ Code steps:
+        - filter instances that have an integrated version
+            - i.e. have a "versionEntity" entry in data
+        - separate the workfile instance from the filtered instances
+        - when a workfile instance is available:
+            - link all `loadedVersions` as inputs of the workfile
+            - link the workfile as input of all other integrated versions
+        - link a version's inputs if its instance has an "inputVersions"
+            entry
+
+ inputVersions:
+ The "inputVersions" in instance.data should be a list of
+ version ids (str), which are the dependencies of the publishing
+            instance that should be extracted from the working scene by the
+            DCC-specific publish plugin.
+ """
+
+ workfile_instance, other_instances = self.split_instances(context)
+
+ # Variable where links are stored in submethods
+ new_links_by_type = collections.defaultdict(list)
+
+ self.create_workfile_links(
+ workfile_instance, other_instances, new_links_by_type)
+
+ self.create_generative_links(other_instances, new_links_by_type)
+
+ self.create_links_on_server(context, new_links_by_type)
+
+ def split_instances(self, context):
+ workfile_instance = None
+ other_instances = []
+
+ for instance in context:
+ # Skip inactive instances
+ if not instance.data.get("publish", True):
+ continue
+
+ version_doc = instance.data.get("versionEntity")
+ if not version_doc:
+ self.log.debug(
+ "Instance {} doesn't have version.".format(instance))
+ continue
+
+ family = instance.data.get("family")
+ if family == "workfile":
+ workfile_instance = instance
+ else:
+ other_instances.append(instance)
+ return workfile_instance, other_instances
+
+ def add_link(self, new_links_by_type, link_type, input_id, output_id):
+ """Add dependency link data into temporary variable.
+
+ Args:
+            new_links_by_type (dict[str, list[tuple[str, str]]]): Object
+                where output is stored.
+            link_type (str): Type of link, one of 'reference' or
+                'generative'.
+ input_id (str): Input version id.
+ output_id (str): Output version id.
+ """
+
+ new_links_by_type[link_type].append((input_id, output_id))
+
+ def create_workfile_links(
+ self, workfile_instance, other_instances, new_links_by_type
+ ):
+ if workfile_instance is None:
+ self.log.warn("No workfile in this publish session.")
+ return
+
+ workfile_version_id = workfile_instance.data["versionEntity"]["_id"]
+ # link workfile to all publishing versions
+ for instance in other_instances:
+ self.add_link(
+ new_links_by_type,
+ "generative",
+ workfile_version_id,
+ instance.data["versionEntity"]["_id"],
+ )
+
+        loaded_versions = workfile_instance.context.data.get(
+            "loadedVersions")
+ if not loaded_versions:
+ return
+
+ # link all loaded versions in scene into workfile
+ for version in loaded_versions:
+ self.add_link(
+ new_links_by_type,
+ "reference",
+ version["version"],
+ workfile_version_id,
+ )
+
+ def create_generative_links(self, other_instances, new_links_by_type):
+ for instance in other_instances:
+ input_versions = instance.data.get("inputVersions")
+ if not input_versions:
+ continue
+
+ version_entity = instance.data["versionEntity"]
+ for input_version in input_versions:
+ self.add_link(
+ new_links_by_type,
+ "generative",
+ input_version,
+ version_entity["_id"],
+ )
+
+ def create_links_on_server(self, context, new_links):
+ """Create new links on server.
+
+ Args:
+            new_links (dict[str, list[tuple[str, str]]]): Version links
+                by link type.
+ """
+
+ if not new_links:
+ return
+
+ project_name = context.data["projectName"]
+
+ # Make sure link types are available on server
+ for link_type in new_links.keys():
+ make_sure_link_type_exists(
+ project_name, link_type, "version", "version"
+ )
+
+        # Create the links themselves
+ for link_type, items in new_links.items():
+ for item in items:
+ input_id, output_id = item
+ create_link(
+ project_name,
+ link_type,
+ input_id,
+ "version",
+ output_id,
+ "version"
+ )
+
+
+if not AYON_SERVER_ENABLED:
+ del IntegrateInputLinksAYON
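
Note how the plugin above first accumulates every link in `new_links_by_type` and talks to the server only once at the end, after making sure the link types exist. A rough sketch of the accumulation shape, with placeholder version ids:

```python
import collections

# link type -> list of (input_version_id, output_version_id) tuples,
# the same shape as the plugin's 'new_links_by_type' variable.
new_links_by_type = collections.defaultdict(list)


def add_link(links, link_type, input_id, output_id):
    links[link_type].append((input_id, output_id))


workfile_version_id = "wf-version-id"  # placeholder ids
published_version_ids = ["v-001", "v-002"]

# The workfile is a 'generative' input of every published version...
for version_id in published_version_ids:
    add_link(new_links_by_type, "generative", workfile_version_id, version_id)

# ...and every version loaded in the scene is a 'reference' input
# of the workfile.
for loaded_id in ["loaded-a", "loaded-b"]:
    add_link(new_links_by_type, "reference", loaded_id, workfile_version_id)

for link_type, items in new_links_by_type.items():
    print(link_type, items)
```
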
diff --git a/openpype/plugins/publish/integrate_legacy.py b/openpype/plugins/publish/integrate_legacy.py
deleted file mode 100644
index c238cca633..0000000000
--- a/openpype/plugins/publish/integrate_legacy.py
+++ /dev/null
@@ -1,1299 +0,0 @@
-import os
-from os.path import getsize
-import logging
-import sys
-import copy
-import clique
-import errno
-import six
-import re
-import shutil
-from collections import deque, defaultdict
-from datetime import datetime
-
-from bson.objectid import ObjectId
-from pymongo import DeleteOne, InsertOne
-import pyblish.api
-
-from openpype.client import (
- get_asset_by_name,
- get_subset_by_id,
- get_subset_by_name,
- get_version_by_id,
- get_version_by_name,
- get_representations,
- get_archived_representations,
-)
-from openpype.lib import (
- prepare_template_data,
- create_hard_link,
- StringTemplate,
- TemplateUnsolved,
- source_hash,
- filter_profiles,
- get_local_site_id,
-)
-from openpype.pipeline import legacy_io
-from openpype.pipeline.publish import get_publish_template_name
-
-# this is needed until speedcopy for linux is fixed
-if sys.platform == "win32":
- from speedcopy import copyfile
-else:
- from shutil import copyfile
-
-log = logging.getLogger(__name__)
-
-
-class IntegrateAssetNew(pyblish.api.InstancePlugin):
- """Resolve any dependency issues
-
- This plug-in resolves any paths which, if not updated might break
- the published file.
-
- The order of families is important, when working with lookdev you want to
- first publish the texture, update the texture paths in the nodes and then
- publish the shading network. Same goes for file dependent assets.
-
- Requirements for instance to be correctly integrated
-
- instance.data['representations'] - must be a list and each member
- must be a dictionary with following data:
- 'files': list of filenames for sequence, string for single file.
- Only the filename is allowed, without the folder path.
- 'stagingDir': "path/to/folder/with/files"
- 'name': representation name (usually the same as extension)
- 'ext': file extension
- optional data
- "frameStart"
- "frameEnd"
- 'fps'
- "data": additional metadata for each representation.
- """
-
- label = "Integrate Asset (legacy)"
- # Make sure it happens after new integrator
- order = pyblish.api.IntegratorOrder + 0.00001
- families = ["workfile",
- "pointcache",
- "pointcloud",
- "proxyAbc",
- "camera",
- "animation",
- "model",
- "maxScene",
- "mayaAscii",
- "mayaScene",
- "setdress",
- "layout",
- "ass",
- "vdbcache",
- "scene",
- "vrayproxy",
- "vrayscene_layer",
- "render",
- "prerender",
- "imagesequence",
- "review",
- "rendersetup",
- "rig",
- "plate",
- "look",
- "audio",
- "yetiRig",
- "yeticache",
- "nukenodes",
- "gizmo",
- "source",
- "matchmove",
- "image",
- "assembly",
- "fbx",
- "gltf",
- "textures",
- "action",
- "harmony.template",
- "harmony.palette",
- "editorial",
- "background",
- "camerarig",
- "redshiftproxy",
- "effect",
- "xgen",
- "hda",
- "usd",
- "staticMesh",
- "skeletalMesh",
- "mvLook",
- "mvUsdComposition",
- "mvUsdOverride",
- "simpleUnrealTexture"
- ]
- exclude_families = ["render.farm"]
- db_representation_context_keys = [
- "project", "asset", "task", "subset", "version", "representation",
- "family", "hierarchy", "task", "username", "user"
- ]
- default_template_name = "publish"
-
- # suffix to denote temporary files, use without '.'
- TMP_FILE_EXT = 'tmp'
-
- # file_url : file_size of all published and uploaded files
- integrated_file_sizes = {}
-
- # Attributes set by settings
- subset_grouping_profiles = None
-
- def process(self, instance):
- if instance.data.get("processedWithNewIntegrator"):
- self.log.debug(
- "Instance was already processed with new integrator"
- )
- return
-
- for ef in self.exclude_families:
- if (
- instance.data["family"] == ef or
- ef in instance.data["families"]):
- self.log.debug("Excluded family '{}' in '{}' or {}".format(
- ef, instance.data["family"], instance.data["families"]))
- return
-
- # instance should be published on a farm
- if instance.data.get("farm"):
- return
-
-        # Prepare representations that should be integrated
-        repres = instance.data.get("representations")
-        # Raise error if instance doesn't have any representations
- if not repres:
- raise ValueError(
- "Instance {} has no files to transfer".format(
- instance.data["family"]
- )
- )
-
- # Validate type of stored representations
- if not isinstance(repres, (list, tuple)):
- raise TypeError(
- "Instance 'files' must be a list, got: {0} {1}".format(
- str(type(repres)), str(repres)
- )
- )
-
- # Filter representations
- filtered_repres = []
- for repre in repres:
- if "delete" in repre.get("tags", []):
- continue
- filtered_repres.append(repre)
-
-        # Skip instance if there are no representations to integrate
-        #   - all representations were excluded from integration
- if not filtered_repres:
- self.log.warning((
- "Skipping, there are no representations"
- " to integrate for instance {}"
- ).format(instance.data["family"]))
- return
-
- self.integrated_file_sizes = {}
- try:
- self.register(instance, filtered_repres)
- self.log.info("Integrated Asset in to the database ...")
- self.log.info("instance.data: {}".format(instance.data))
- self.handle_destination_files(self.integrated_file_sizes,
- 'finalize')
- except Exception:
- # clean destination
- self.log.critical("Error when registering", exc_info=True)
- self.handle_destination_files(self.integrated_file_sizes, 'remove')
- six.reraise(*sys.exc_info())
-
- def register(self, instance, repres):
- # Required environment variables
- anatomy_data = instance.data["anatomyData"]
-
- legacy_io.install()
-
- context = instance.context
-
- project_entity = instance.data["projectEntity"]
- project_name = project_entity["name"]
-
- context_asset_name = None
- context_asset_doc = context.data.get("assetEntity")
- if context_asset_doc:
- context_asset_name = context_asset_doc["name"]
-
- asset_name = instance.data["asset"]
- asset_entity = instance.data.get("assetEntity")
- if not asset_entity or asset_entity["name"] != context_asset_name:
- asset_entity = get_asset_by_name(project_name, asset_name)
- assert asset_entity, (
- "No asset found by the name \"{0}\" in project \"{1}\""
- ).format(asset_name, project_entity["name"])
-
- instance.data["assetEntity"] = asset_entity
-
- # update anatomy data with asset specific keys
- # - name should already been set
- hierarchy = ""
- parents = asset_entity["data"]["parents"]
- if parents:
- hierarchy = "/".join(parents)
- anatomy_data["hierarchy"] = hierarchy
-
- # Make sure task name in anatomy data is same as on instance.data
- asset_tasks = (
- asset_entity.get("data", {}).get("tasks")
- ) or {}
- task_name = instance.data.get("task")
- if task_name:
- task_info = asset_tasks.get(task_name) or {}
- task_type = task_info.get("type")
-
- project_task_types = project_entity["config"]["tasks"]
- task_code = project_task_types.get(task_type, {}).get("short_name")
- anatomy_data["task"] = {
- "name": task_name,
- "type": task_type,
- "short": task_code
- }
-
- elif "task" in anatomy_data:
- # Just set 'task_name' variable to context task
- task_name = anatomy_data["task"]["name"]
- task_type = anatomy_data["task"]["type"]
-
- else:
- task_name = None
- task_type = None
-
- # Fill family in anatomy data
- anatomy_data["family"] = instance.data.get("family")
-
- stagingdir = instance.data.get("stagingDir")
- if not stagingdir:
- self.log.debug((
- "{0} is missing reference to staging directory."
- " Will try to get it from representation."
- ).format(instance))
-
- else:
- self.log.debug(
- "Establishing staging directory @ {0}".format(stagingdir)
- )
-
- subset = self.get_subset(project_name, asset_entity, instance)
- instance.data["subsetEntity"] = subset
-
- version_number = instance.data["version"]
- self.log.debug("Next version: v{}".format(version_number))
-
- version_data = self.create_version_data(context, instance)
-
- version_data_instance = instance.data.get('versionData')
- if version_data_instance:
- version_data.update(version_data_instance)
-
- # TODO rename method from `create_version` to
- # `prepare_version` or similar...
- version = self.create_version(
- subset=subset,
- version_number=version_number,
- data=version_data
- )
-
- self.log.debug("Creating version ...")
-
- new_repre_names_low = [
- _repre["name"].lower()
- for _repre in repres
- ]
-
- existing_version = get_version_by_name(
- project_name, version_number, subset["_id"]
- )
-
- if existing_version is None:
- version_id = legacy_io.insert_one(version).inserted_id
- else:
-            # Check if instance has set `append` mode which causes that
-            # only duplicated representations are set to archive
- append_repres = instance.data.get("append", False)
-
- # Update version data
- # TODO query by _id and
- legacy_io.update_many({
- 'type': 'version',
- 'parent': subset["_id"],
- 'name': version_number
- }, {
- '$set': version
- })
- version_id = existing_version['_id']
-
- # Find representations of existing version and archive them
- current_repres = list(get_representations(
- project_name, version_ids=[version_id]
- ))
- bulk_writes = []
- for repre in current_repres:
- if append_repres:
- # archive only duplicated representations
- if repre["name"].lower() not in new_repre_names_low:
- continue
- # Representation must change type,
- # `_id` must be stored to other key and replaced with new
- # - that is because new representations should have same ID
- repre_id = repre["_id"]
- bulk_writes.append(DeleteOne({"_id": repre_id}))
-
- repre["orig_id"] = repre_id
- repre["_id"] = ObjectId()
- repre["type"] = "archived_representation"
- bulk_writes.append(InsertOne(repre))
-
- # bulk updates
- if bulk_writes:
- legacy_io.database[project_name].bulk_write(
- bulk_writes
- )
-
- version = get_version_by_id(project_name, version_id)
- instance.data["versionEntity"] = version
-
- existing_repres = list(get_archived_representations(
- project_name,
- version_ids=[version_id]
- ))
-
- instance.data['version'] = version['name']
-
- intent_value = instance.context.data.get("intent")
- if intent_value and isinstance(intent_value, dict):
- intent_value = intent_value.get("value")
-
- if intent_value:
- anatomy_data["intent"] = intent_value
-
- anatomy = instance.context.data['anatomy']
-
- # Find the representations to transfer amongst the files
- # Each should be a single representation (as such, a single extension)
- representations = []
- destination_list = []
-
- orig_transfers = []
- if 'transfers' not in instance.data:
- instance.data['transfers'] = []
- else:
- orig_transfers = list(instance.data['transfers'])
-
- family = self.main_family_from_instance(instance)
-
- template_name = get_publish_template_name(
- project_name,
- instance.context.data["hostName"],
- family,
- task_name=task_info.get("name"),
- task_type=task_info.get("type"),
- project_settings=instance.context.data["project_settings"],
- logger=self.log
- )
-
- published_representations = {}
- for idx, repre in enumerate(repres):
- published_files = []
-
- # create template data for Anatomy
- template_data = copy.deepcopy(anatomy_data)
- if intent_value is not None:
- template_data["intent"] = intent_value
-
- resolution_width = repre.get("resolutionWidth")
- resolution_height = repre.get("resolutionHeight")
- fps = instance.data.get("fps")
-
- if resolution_width:
- template_data["resolution_width"] = resolution_width
-            if resolution_height:
-                template_data["resolution_height"] = resolution_height
-            if fps:
-                template_data["fps"] = fps
-
- if "originalBasename" in instance.data:
- template_data.update({
- "originalBasename": instance.data.get("originalBasename")
- })
-
- files = repre['files']
- if repre.get('stagingDir'):
- stagingdir = repre['stagingDir']
-
- if repre.get("outputName"):
- template_data["output"] = repre['outputName']
-
- template_data["representation"] = repre["name"]
-
- ext = repre["ext"]
- if ext.startswith("."):
- self.log.warning((
- "Implementaion warning: <\"{}\">"
- " Representation's extension stored under \"ext\" key "
- " started with dot (\"{}\")."
- ).format(repre["name"], ext))
- ext = ext[1:]
- repre["ext"] = ext
- template_data["ext"] = ext
-
- self.log.info(template_name)
- template = os.path.normpath(
- anatomy.templates[template_name]["path"])
-
- sequence_repre = isinstance(files, list)
- repre_context = None
- if sequence_repre:
- self.log.debug(
- "files: {}".format(files))
- src_collections, remainder = clique.assemble(files)
- self.log.debug(
- "src_tail_collections: {}".format(str(src_collections)))
- src_collection = src_collections[0]
-
- # Assert that each member has identical suffix
- src_head = src_collection.format("{head}")
- src_tail = src_collection.format("{tail}")
-
- # fix dst_padding
- valid_files = [x for x in files if src_collection.match(x)]
- padd_len = len(
- valid_files[0].replace(src_head, "").replace(src_tail, "")
- )
- src_padding_exp = "%0{}d".format(padd_len)
-
- test_dest_files = list()
- for i in [1, 2]:
- template_data["representation"] = repre['ext']
- if not repre.get("udim"):
- template_data["frame"] = src_padding_exp % i
- else:
- template_data["udim"] = src_padding_exp % i
-
- template_obj = anatomy.templates_obj[template_name]["path"]
- template_filled = template_obj.format_strict(template_data)
- if repre_context is None:
- repre_context = template_filled.used_values
- test_dest_files.append(
- os.path.normpath(template_filled)
- )
- if not repre.get("udim"):
- template_data["frame"] = repre_context["frame"]
- else:
- template_data["udim"] = repre_context["udim"]
-
- self.log.debug(
- "test_dest_files: {}".format(str(test_dest_files)))
-
- dst_collections, remainder = clique.assemble(test_dest_files)
- dst_collection = dst_collections[0]
- dst_head = dst_collection.format("{head}")
- dst_tail = dst_collection.format("{tail}")
-
- index_frame_start = None
-
- # TODO use frame padding from right template group
- if repre.get("frameStart") is not None:
- frame_start_padding = int(
- anatomy.templates["render"].get(
- "frame_padding",
- anatomy.templates["render"].get("padding")
- )
- )
-
- index_frame_start = int(repre.get("frameStart"))
-
- # exception for slate workflow
- if index_frame_start and "slate" in instance.data["families"]:
- index_frame_start -= 1
-
- dst_padding_exp = src_padding_exp
- dst_start_frame = None
- collection_start = list(src_collection.indexes)[0]
- for i in src_collection.indexes:
- # TODO 1.) do not count padding in each index iteration
- # 2.) do not count dst_padding from src_padding before
- # index_frame_start check
- frame_number = i - collection_start
- src_padding = src_padding_exp % i
-
- src_file_name = "{0}{1}{2}".format(
- src_head, src_padding, src_tail)
-
- dst_padding = src_padding_exp % frame_number
-
- if index_frame_start is not None:
- dst_padding_exp = "%0{}d".format(frame_start_padding)
- dst_padding = dst_padding_exp % (index_frame_start + frame_number) # noqa: E501
- elif repre.get("udim"):
- dst_padding = int(i)
-
- dst = "{0}{1}{2}".format(
- dst_head,
- dst_padding,
- dst_tail
- )
-
- self.log.debug("destination: `{}`".format(dst))
- src = os.path.join(stagingdir, src_file_name)
-
- self.log.debug("source: {}".format(src))
- instance.data["transfers"].append([src, dst])
-
- published_files.append(dst)
-
- # for adding first frame into db
- if not dst_start_frame:
- dst_start_frame = dst_padding
-
- # Store used frame value to template data
- if repre.get("frame"):
- template_data["frame"] = dst_start_frame
-
- dst = "{0}{1}{2}".format(
- dst_head,
- dst_start_frame,
- dst_tail
- )
- repre['published_path'] = dst
-
- else:
- # Single file
- # _______
- # | |\
- # | |
- # | |
- # | |
- # |_______|
- #
- template_data.pop("frame", None)
- fname = files
- assert not os.path.isabs(fname), (
- "Given file name is a full path"
- )
-
- template_data["representation"] = repre['ext']
- # Store used frame value to template data
- if repre.get("udim"):
- template_data["udim"] = repre["udim"][0]
- src = os.path.join(stagingdir, fname)
- template_obj = anatomy.templates_obj[template_name]["path"]
- template_filled = template_obj.format_strict(template_data)
- repre_context = template_filled.used_values
- dst = os.path.normpath(template_filled)
-
- instance.data["transfers"].append([src, dst])
-
- published_files.append(dst)
- repre['published_path'] = dst
- self.log.debug("__ dst: {}".format(dst))
-
- if not instance.data.get("publishDir"):
- instance.data["publishDir"] = (
- anatomy.templates_obj[template_name]["folder"]
- .format_strict(template_data)
- )
- if repre.get("udim"):
- repre_context["udim"] = repre.get("udim") # store list
-
- repre["publishedFiles"] = published_files
-
- for key in self.db_representation_context_keys:
- value = template_data.get(key)
- if not value:
- continue
- repre_context[key] = template_data[key]
-
- # Use previous representation's id if there are any
- repre_id = None
- repre_name_low = repre["name"].lower()
- for _repre in existing_repres:
- # NOTE should we check lowered names?
- if repre_name_low == _repre["name"]:
- repre_id = _repre["orig_id"]
- break
-
- # Create new id if existing representations does not match
- if repre_id is None:
- repre_id = ObjectId()
-
- data = repre.get("data") or {}
- data.update({'path': dst, 'template': template})
- representation = {
- "_id": repre_id,
- "schema": "openpype:representation-2.0",
- "type": "representation",
- "parent": version_id,
- "name": repre['name'],
- "data": data,
- "dependencies": instance.data.get("dependencies", "").split(),
-
- # Imprint shortcut to context
- # for performance reasons.
- "context": repre_context
- }
-
- if repre.get("outputName"):
- representation["context"]["output"] = repre['outputName']
-
- if sequence_repre and repre.get("frameStart") is not None:
- representation['context']['frame'] = (
- dst_padding_exp % int(repre.get("frameStart"))
- )
-
- # any file that should be physically copied is expected in
- # 'transfers' or 'hardlinks'
- if instance.data.get('transfers', False) or \
- instance.data.get('hardlinks', False):
-                # could throw exception, will be caught in 'process'
-                # all DB integration is done together further below,
-                # so no rollback is needed
- self.log.debug("Integrating source files to destination ...")
- self.integrated_file_sizes.update(self.integrate(instance))
- self.log.debug("Integrated files {}".
- format(self.integrated_file_sizes))
-
- # get 'files' info for representation and all attached resources
- self.log.debug("Preparing files information ...")
- representation["files"] = self.get_files_info(
- instance,
- self.integrated_file_sizes)
-
- self.log.debug("__ representation: {}".format(representation))
- destination_list.append(dst)
- self.log.debug("__ destination_list: {}".format(destination_list))
- instance.data['destination_list'] = destination_list
- representations.append(representation)
- published_representations[repre_id] = {
- "representation": representation,
- "anatomy_data": template_data,
- "published_files": published_files
- }
- self.log.debug("__ representations: {}".format(representations))
- # reset transfers for next representation
- # instance.data['transfers'] is used as a global variable
- # in current codebase
- instance.data['transfers'] = list(orig_transfers)
-
- # Remove old representations if there are any (before insertion of new)
- if existing_repres:
- repre_ids_to_remove = []
- for repre in existing_repres:
- repre_ids_to_remove.append(repre["_id"])
- legacy_io.delete_many({"_id": {"$in": repre_ids_to_remove}})
-
- for rep in instance.data["representations"]:
- self.log.debug("__ rep: {}".format(rep))
-
- legacy_io.insert_many(representations)
- instance.data["published_representations"] = (
- published_representations
- )
- # self.log.debug("Representation: {}".format(representations))
- self.log.info("Registered {} items".format(len(representations)))
-
- def integrate(self, instance):
- """ Move the files.
-
- Through `instance.data["transfers"]`
-
- Args:
- instance: the instance to integrate
- Returns:
- integrated_file_sizes: dictionary of destination file url and
- its size in bytes
- """
- # store destination url and size for reporting and rollback
- integrated_file_sizes = {}
- transfers = list(instance.data.get("transfers", list()))
- for src, dest in transfers:
- if os.path.normpath(src) != os.path.normpath(dest):
- dest = self.get_dest_temp_url(dest)
- self.copy_file(src, dest)
- # TODO needs to be updated during site implementation
- integrated_file_sizes[dest] = os.path.getsize(dest)
-
- # Produce hardlinked copies
- # Note: hardlink can only be produced between two files on the same
- # server/disk and editing one of the two will edit both files at once.
- # As such it is recommended to only make hardlinks between static files
- # to ensure publishes remain safe and non-edited.
- hardlinks = instance.data.get("hardlinks", list())
- for src, dest in hardlinks:
- dest = self.get_dest_temp_url(dest)
- self.log.debug("Hardlinking file ... {} -> {}".format(src, dest))
- if not os.path.exists(dest):
- self.hardlink_file(src, dest)
-
- # TODO needs to be updated during site implementation
- integrated_file_sizes[dest] = os.path.getsize(dest)
-
- return integrated_file_sizes
-
- def copy_file(self, src, dst):
- """ Copy given source to destination
-
- Arguments:
- src (str): the source file which needs to be copied
-            dst (str): the destination of the source file
- Returns:
- None
- """
- src = os.path.normpath(src)
- dst = os.path.normpath(dst)
- self.log.debug("Copying file ... {} -> {}".format(src, dst))
- dirname = os.path.dirname(dst)
- try:
- os.makedirs(dirname)
- except OSError as e:
- if e.errno == errno.EEXIST:
- pass
- else:
- self.log.critical("An unexpected error occurred.")
- six.reraise(*sys.exc_info())
-
-        # copy file with speedcopy and check that the file sizes match
- while True:
- if not shutil._samefile(src, dst):
- copyfile(src, dst)
- else:
- self.log.critical(
- "files are the same {} to {}".format(src, dst)
- )
- os.remove(dst)
- try:
- shutil.copyfile(src, dst)
- self.log.debug("Copying files with shutil...")
- except OSError as e:
- self.log.critical("Cannot copy {} to {}".format(src, dst))
- self.log.critical(e)
- six.reraise(*sys.exc_info())
-            if getsize(src) == getsize(dst):
- break
-
- def hardlink_file(self, src, dst):
- dirname = os.path.dirname(dst)
-
- try:
- os.makedirs(dirname)
- except OSError as e:
- if e.errno == errno.EEXIST:
- pass
- else:
- self.log.critical("An unexpected error occurred.")
- six.reraise(*sys.exc_info())
-
- create_hard_link(src, dst)
-
- def get_subset(self, project_name, asset, instance):
- subset_name = instance.data["subset"]
- subset = get_subset_by_name(project_name, subset_name, asset["_id"])
-
- if subset is None:
- self.log.info("Subset '%s' not found, creating ..." % subset_name)
- self.log.debug("families. %s" % instance.data.get('families'))
- self.log.debug(
- "families. %s" % type(instance.data.get('families')))
-
- family = instance.data.get("family")
- families = []
- if family:
- families.append(family)
-
- for _family in (instance.data.get("families") or []):
- if _family not in families:
- families.append(_family)
-
- _id = legacy_io.insert_one({
- "schema": "openpype:subset-3.0",
- "type": "subset",
- "name": subset_name,
- "data": {
- "families": families
- },
- "parent": asset["_id"]
- }).inserted_id
-
- subset = get_subset_by_id(project_name, _id)
-
-        # QUESTION Why is the group changed and its families updated
-        #   in 'get_subset'?
- self._set_subset_group(instance, subset["_id"])
-
- # Update families on subset.
- families = [instance.data["family"]]
- families.extend(instance.data.get("families", []))
- legacy_io.update_many(
- {"type": "subset", "_id": ObjectId(subset["_id"])},
- {"$set": {"data.families": families}}
- )
-
- return subset
-
- def _set_subset_group(self, instance, subset_id):
- """
- Mark subset as belonging to group in DB.
-
- Uses Settings > Global > Publish plugins > IntegrateAssetNew
-
- Args:
- instance (dict): processed instance
- subset_id (str): DB's subset _id
-
- """
- # Fist look into instance data
- subset_group = instance.data.get("subsetGroup")
- if not subset_group:
- subset_group = self._get_subset_group(instance)
-
- if subset_group:
- legacy_io.update_many({
- 'type': 'subset',
- '_id': ObjectId(subset_id)
- }, {'$set': {'data.subsetGroup': subset_group}})
-
- def _get_subset_group(self, instance):
- """Look into subset group profiles set by settings.
-
- Attribute 'subset_grouping_profiles' is defined by OpenPype settings.
- """
- # Skip if 'subset_grouping_profiles' is empty
- if not self.subset_grouping_profiles:
- return None
-
- # QUESTION
- # - is there a chance that task name is not filled in anatomy
- # data?
- # - should we use context task in that case?
- anatomy_data = instance.data["anatomyData"]
- task_name = None
- task_type = None
- if "task" in anatomy_data:
- task_name = anatomy_data["task"]["name"]
- task_type = anatomy_data["task"]["type"]
- filtering_criteria = {
- "families": instance.data["family"],
- "hosts": instance.context.data["hostName"],
- "tasks": task_name,
- "task_types": task_type
- }
- matching_profile = filter_profiles(
- self.subset_grouping_profiles,
- filtering_criteria
- )
-        # Skip if there is no matching profile
- if not matching_profile:
- return None
-
- filled_template = None
- template = matching_profile["template"]
- fill_pairs = (
- ("family", filtering_criteria["families"]),
- ("task", filtering_criteria["tasks"]),
- ("host", filtering_criteria["hosts"]),
- ("subset", instance.data["subset"]),
- ("renderlayer", instance.data.get("renderlayer"))
- )
- fill_pairs = prepare_template_data(fill_pairs)
-
- try:
- filled_template = StringTemplate.format_strict_template(
- template, fill_pairs
- )
- except (KeyError, TemplateUnsolved):
- keys = []
- if fill_pairs:
- keys = fill_pairs.keys()
-
- msg = "Subset grouping failed. " \
- "Only {} are expected in Settings".format(','.join(keys))
- self.log.warning(msg)
-
- return filled_template
-
- def create_version(self, subset, version_number, data=None):
- """ Copy given source to destination
-
- Args:
- subset (dict): the registered subset of the asset
- version_number (int): the version number
-
- Returns:
- dict: collection of data to create a version
- """
-
- return {"schema": "openpype:version-3.0",
- "type": "version",
- "parent": subset["_id"],
- "name": version_number,
- "data": data}
-
- def create_version_data(self, context, instance):
- """Create the data collection for the version
-
- Args:
- context: the current context
- instance: the current instance being published
-
- Returns:
-            dict: the version data gathered from context and instance
- """
-
- families = []
- current_families = instance.data.get("families", list())
- instance_family = instance.data.get("family", None)
-
- if instance_family is not None:
- families.append(instance_family)
- families += current_families
-
- # create relative source path for DB
- source = instance.data.get("source")
- if not source:
- source = context.data["currentFile"]
- anatomy = instance.context.data["anatomy"]
- source = self.get_rootless_path(anatomy, source)
-
- self.log.debug("Source: {}".format(source))
- version_data = {
- "families": families,
- "time": context.data["time"],
- "author": context.data["user"],
- "source": source,
- "comment": instance.data["comment"],
- "machine": context.data.get("machine"),
- "fps": context.data.get(
- "fps", instance.data.get("fps")
- )
- }
-
- intent_value = instance.context.data.get("intent")
- if intent_value and isinstance(intent_value, dict):
- intent_value = intent_value.get("value")
-
- if intent_value:
- version_data["intent"] = intent_value
-
-        # Include optional data if present in instance data
- optionals = [
- "frameStart", "frameEnd", "step",
- "handleEnd", "handleStart", "sourceHashes"
- ]
- for key in optionals:
- if key in instance.data:
- version_data[key] = instance.data[key]
-
- return version_data
-
- def main_family_from_instance(self, instance):
- """Returns main family of entered instance."""
- family = instance.data.get("family")
- if not family:
- family = instance.data["families"][0]
- return family
-
- def get_rootless_path(self, anatomy, path):
- """ Returns, if possible, path without absolute portion from host
- (eg. 'c:\' or '/opt/..')
- This information is host dependent and shouldn't be captured.
- Example:
- 'c:/projects/MyProject1/Assets/publish...' >
- '{root}/MyProject1/Assets...'
-
- Args:
- anatomy: anatomy part from instance
- path: path (absolute)
- Returns:
- path: modified path if possible, or unmodified path
- + warning logged
- """
- success, rootless_path = (
- anatomy.find_root_template_from_path(path)
- )
- if success:
- path = rootless_path
- else:
- self.log.warning((
- "Could not find root path for remapping \"{}\"."
- " This may cause issues on farm."
- ).format(path))
- return path
-
- def get_files_info(self, instance, integrated_file_sizes):
- """ Prepare 'files' portion for attached resources and main asset.
- Combining records from 'transfers' and 'hardlinks' parts from
- instance.
- All attached resources should be added, currently without
- Context info.
-
- Arguments:
- instance: the current instance being published
- integrated_file_sizes: dictionary of destination path (absolute)
- and its file size
- Returns:
- output_resources: array of dictionaries to be added to 'files' key
- in representation
- """
- resources = list(instance.data.get("transfers", []))
- resources.extend(list(instance.data.get("hardlinks", [])))
-
- self.log.debug("get_resource_files_info.resources:{}".
- format(resources))
-
- output_resources = []
- anatomy = instance.context.data["anatomy"]
- for _src, dest in resources:
- path = self.get_rootless_path(anatomy, dest)
- dest = self.get_dest_temp_url(dest)
- file_hash = source_hash(dest)
- if self.TMP_FILE_EXT and \
- ',{}'.format(self.TMP_FILE_EXT) in file_hash:
- file_hash = file_hash.replace(',{}'.format(self.TMP_FILE_EXT),
- '')
-
- file_info = self.prepare_file_info(path,
- integrated_file_sizes[dest],
- file_hash,
- instance=instance)
- output_resources.append(file_info)
-
- return output_resources
-
- def get_dest_temp_url(self, dest):
- """ Enhance destination path with TMP_FILE_EXT to denote temporary
- file.
- Temporary files will be renamed after successful registration
- into DB and full copy to destination
-
- Arguments:
- dest: destination url of published file (absolute)
- Returns:
- dest: destination path + '.TMP_FILE_EXT'
- """
- if self.TMP_FILE_EXT and '.{}'.format(self.TMP_FILE_EXT) not in dest:
- dest += '.{}'.format(self.TMP_FILE_EXT)
- return dest
-
- def prepare_file_info(self, path, size=None, file_hash=None,
- sites=None, instance=None):
- """ Prepare information for one file (asset or resource)
-
- Arguments:
- path: destination url of published file (rootless)
- size(optional): size of file in bytes
- file_hash(optional): hash of file for synchronization validation
- sites(optional): array of published locations,
- [ {'name':'studio', 'created_dt':date} by default
- keys expected ['studio', 'site1', 'gdrive1']
- instance(dict, optional): to get collected settings
- Returns:
- rec: dictionary with filled info
- """
- local_site = 'studio' # default
- remote_site = None
-        always_accessible = []
- sync_project_presets = None
-
- rec = {
- "_id": ObjectId(),
- "path": path
- }
- if size:
- rec["size"] = size
-
- if file_hash:
- rec["hash"] = file_hash
-
- if sites:
- rec["sites"] = sites
- else:
- system_sync_server_presets = (
- instance.context.data["system_settings"]
- ["modules"]
- ["sync_server"])
- log.debug("system_sett:: {}".format(system_sync_server_presets))
-
- if system_sync_server_presets["enabled"]:
- sync_project_presets = (
- instance.context.data["project_settings"]
- ["global"]
- ["sync_server"])
-
- if sync_project_presets and sync_project_presets["enabled"]:
- local_site, remote_site = self._get_sites(sync_project_presets)
-
- always_accesible = sync_project_presets["config"]. \
- get("always_accessible_on", [])
-
- already_attached_sites = {}
- meta = {"name": local_site, "created_dt": datetime.now()}
- rec["sites"] = [meta]
- already_attached_sites[meta["name"]] = meta["created_dt"]
-
- if sync_project_presets and sync_project_presets["enabled"]:
- if remote_site and \
- remote_site not in already_attached_sites.keys():
- # add remote
- meta = {"name": remote_site.strip()}
- rec["sites"].append(meta)
- already_attached_sites[meta["name"]] = None
-
- # add alternative sites
- rec, already_attached_sites = self._add_alternative_sites(
- system_sync_server_presets, already_attached_sites, rec)
-
- # add skeleton for site where it should be always synced to
-        for always_on_site in set(always_accessible):
- if always_on_site not in already_attached_sites.keys():
- meta = {"name": always_on_site.strip()}
- rec["sites"].append(meta)
- already_attached_sites[meta["name"]] = None
-
- log.debug("final sites:: {}".format(rec["sites"]))
-
- return rec
-
- def _get_sites(self, sync_project_presets):
- """Returns tuple (local_site, remote_site)"""
- local_site_id = get_local_site_id()
- local_site = sync_project_presets["config"]. \
- get("active_site", "studio").strip()
-
- if local_site == 'local':
- local_site = local_site_id
-
- remote_site = sync_project_presets["config"].get("remote_site")
-
- if remote_site == 'local':
- remote_site = local_site_id
-
- return local_site, remote_site
-
- def _add_alternative_sites(self,
- system_sync_server_presets,
- already_attached_sites,
- rec):
- """Loop through all configured sites and add alternatives.
-
- See SyncServerModule.handle_alternate_site
- """
- conf_sites = system_sync_server_presets.get("sites", {})
-
- alt_site_pairs = self._get_alt_site_pairs(conf_sites)
-
- already_attached_keys = list(already_attached_sites.keys())
- for added_site in already_attached_keys:
- real_created = already_attached_sites[added_site]
- for alt_site in alt_site_pairs.get(added_site, []):
- if alt_site in already_attached_sites.keys():
- continue
- meta = {"name": alt_site}
- # alt site inherits state of 'created_dt'
- if real_created:
- meta["created_dt"] = real_created
- rec["sites"].append(meta)
- already_attached_sites[meta["name"]] = real_created
-
- return rec, already_attached_sites
-
- def _get_alt_site_pairs(self, conf_sites):
- """Returns dict of site and its alternative sites.
-
- If `site` has alternative site, it means that alt_site has 'site' as
- alternative site
- Args:
- conf_sites (dict)
- Returns:
- (dict): {'site': [alternative sites]...}
- """
- alt_site_pairs = defaultdict(list)
- for site_name, site_info in conf_sites.items():
- alt_sites = set(site_info.get("alternative_sites", []))
- alt_site_pairs[site_name].extend(alt_sites)
-
- for alt_site in alt_sites:
- alt_site_pairs[alt_site].append(site_name)
-
- for site_name, alt_sites in alt_site_pairs.items():
- sites_queue = deque(alt_sites)
- while sites_queue:
- alt_site = sites_queue.popleft()
-
- # safety against wrong config
- # {"SFTP": {"alternative_site": "SFTP"}
- if alt_site == site_name or alt_site not in alt_site_pairs:
- continue
-
- for alt_alt_site in alt_site_pairs[alt_site]:
- if (
- alt_alt_site != site_name
- and alt_alt_site not in alt_sites
- ):
- alt_sites.append(alt_alt_site)
- sites_queue.append(alt_alt_site)
-
- return alt_site_pairs
-
- def handle_destination_files(self, integrated_file_sizes, mode):
- """ Clean destination files
- Called when error happened during integrating to DB or to disk
- OR called to rename uploaded files from temporary name to final to
- highlight publishing in progress/broken
- Used to clean unwanted files
-
- Arguments:
- integrated_file_sizes: dictionary, file urls as keys, size as value
- mode: 'remove' - clean files,
- 'finalize' - rename files,
- remove TMP_FILE_EXT suffix denoting temp file
- """
- if integrated_file_sizes:
- for file_url, _file_size in integrated_file_sizes.items():
- if not os.path.exists(file_url):
- self.log.debug(
- "File {} was not found.".format(file_url)
- )
- continue
-
- try:
- if mode == 'remove':
- self.log.debug("Removing file {}".format(file_url))
- os.remove(file_url)
- if mode == 'finalize':
- new_name = re.sub(
- r'\.{}$'.format(self.TMP_FILE_EXT),
- '',
- file_url
- )
-
- if os.path.exists(new_name):
- self.log.debug(
- "Overwriting file {} to {}".format(
- file_url, new_name
- )
- )
- shutil.copy(file_url, new_name)
- os.remove(file_url)
- else:
- self.log.debug(
- "Renaming file {} to {}".format(
- file_url, new_name
- )
- )
- os.rename(file_url, new_name)
- except OSError:
- self.log.error("Cannot {} file {}".format(mode, file_url),
- exc_info=True)
- six.reraise(*sys.exc_info())
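
One piece of the removed legacy integrator that deserves a note is `_get_alt_site_pairs`: it turns the `alternative_sites` configuration into a symmetric mapping and then expands it transitively with a BFS, so a file attached to one site also gets attached to every site reachable through a chain of alternatives. A standalone sketch of that expansion with a made-up config:

```python
from collections import defaultdict, deque

# Hypothetical sync-server sites config.
conf_sites = {
    "studio": {"alternative_sites": ["sftp"]},
    "sftp": {},
    "gdrive": {"alternative_sites": ["sftp"]},
}

# Make the pairs symmetric first...
alt_site_pairs = defaultdict(list)
for site_name, site_info in conf_sites.items():
    alt_sites = set(site_info.get("alternative_sites", []))
    alt_site_pairs[site_name].extend(alt_sites)
    for alt_site in alt_sites:
        alt_site_pairs[alt_site].append(site_name)

# ...then BFS so alternatives of alternatives are included as well.
for site_name, alt_sites in alt_site_pairs.items():
    sites_queue = deque(alt_sites)
    while sites_queue:
        alt_site = sites_queue.popleft()
        if alt_site == site_name or alt_site not in alt_site_pairs:
            continue
        for alt_alt_site in alt_site_pairs[alt_site]:
            if alt_alt_site != site_name and alt_alt_site not in alt_sites:
                alt_sites.append(alt_alt_site)
                sites_queue.append(alt_alt_site)

print(dict(alt_site_pairs))
# {'studio': ['sftp', 'gdrive'], 'sftp': ['studio', 'gdrive'],
#  'gdrive': ['sftp', 'studio']}
```
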
diff --git a/openpype/plugins/publish/integrate_thumbnail.py b/openpype/plugins/publish/integrate_thumbnail.py
index 2e87d8fc86..9929d8f754 100644
--- a/openpype/plugins/publish/integrate_thumbnail.py
+++ b/openpype/plugins/publish/integrate_thumbnail.py
@@ -18,6 +18,7 @@ import collections
import six
import pyblish.api
+from openpype import AYON_SERVER_ENABLED
from openpype.client import get_versions
from openpype.client.operations import OperationsSession, new_thumbnail_doc
from openpype.pipeline.publish import get_publish_instance_label
@@ -39,6 +40,10 @@ class IntegrateThumbnails(pyblish.api.ContextPlugin):
]
def process(self, context):
+ if AYON_SERVER_ENABLED:
+ self.log.info("AYON is enabled. Skipping v3 thumbnail integration")
+ return
+
# Filter instances which can be used for integration
filtered_instance_items = self._prepare_instances(context)
if not filtered_instance_items:
diff --git a/openpype/plugins/publish/integrate_thumbnail_ayon.py b/openpype/plugins/publish/integrate_thumbnail_ayon.py
new file mode 100644
index 0000000000..ba5664c69f
--- /dev/null
+++ b/openpype/plugins/publish/integrate_thumbnail_ayon.py
@@ -0,0 +1,207 @@
+""" Integrate Thumbnails for Openpype use in Loaders.
+
+    This thumbnail is different from the 'thumbnail' representation, which
+    can be uploaded to Ftrack or used like any other representation in
+    Loaders to pull into a scene.
+
+    This one is used only as an image describing the content of a published
+    item and shows up only in the right column section of the Loader.
+"""
+
+import os
+import collections
+
+import pyblish.api
+
+from openpype import AYON_SERVER_ENABLED
+from openpype.client import get_versions
+from openpype.client.operations import OperationsSession
+
+InstanceFilterResult = collections.namedtuple(
+ "InstanceFilterResult",
+ ["instance", "thumbnail_path", "version_id"]
+)
+
+
+class IntegrateThumbnailsAYON(pyblish.api.ContextPlugin):
+ """Integrate Thumbnails for Openpype use in Loaders."""
+
+ label = "Integrate Thumbnails to AYON"
+ order = pyblish.api.IntegratorOrder + 0.01
+
+ required_context_keys = [
+ "project", "asset", "task", "subset", "version"
+ ]
+
+ def process(self, context):
+ if not AYON_SERVER_ENABLED:
+ self.log.info("AYON is not enabled. Skipping")
+ return
+
+ # Filter instances which can be used for integration
+ filtered_instance_items = self._prepare_instances(context)
+ if not filtered_instance_items:
+ self.log.info(
+ "All instances were filtered. Thumbnail integration skipped."
+ )
+ return
+
+ project_name = context.data["projectName"]
+
+ # Collect version ids from all filtered instance
+ version_ids = {
+ instance_items.version_id
+ for instance_items in filtered_instance_items
+ }
+ # Query versions
+ version_docs = get_versions(
+ project_name,
+ version_ids=version_ids,
+ hero=True,
+ fields=["_id", "type", "name"]
+ )
+ # Store version by their id (converted to string)
+ version_docs_by_str_id = {
+ str(version_doc["_id"]): version_doc
+ for version_doc in version_docs
+ }
+ self._integrate_thumbnails(
+ filtered_instance_items,
+ version_docs_by_str_id,
+ project_name
+ )
+
+ def _prepare_instances(self, context):
+ context_thumbnail_path = context.get("thumbnailPath")
+ valid_context_thumbnail = bool(
+ context_thumbnail_path
+ and os.path.exists(context_thumbnail_path)
+ )
+
+ filtered_instances = []
+ for instance in context:
+ instance_label = self._get_instance_label(instance)
+ # Skip instances without published representations
+ # - there is no place where to put the thumbnail
+ published_repres = instance.data.get("published_representations")
+ if not published_repres:
+ self.log.debug((
+ "There are no published representations"
+ " on the instance {}."
+ ).format(instance_label))
+ continue
+
+ # Find thumbnail path on instance
+ thumbnail_path = self._get_instance_thumbnail_path(
+ published_repres)
+ if thumbnail_path:
+ self.log.debug((
+ "Found thumbnail path for instance \"{}\"."
+ " Thumbnail path: {}"
+ ).format(instance_label, thumbnail_path))
+
+ elif valid_context_thumbnail:
+ # Use context thumbnail path if is available
+ thumbnail_path = context_thumbnail_path
+ self.log.debug((
+ "Using context thumbnail path for instance \"{}\"."
+ " Thumbnail path: {}"
+ ).format(instance_label, thumbnail_path))
+
+ # Skip instance if thumbnail path is not available for it
+ if not thumbnail_path:
+ self.log.info((
+ "Skipping thumbnail integration for instance \"{}\"."
+ " Instance and context"
+ " thumbnail paths are not available."
+ ).format(instance_label))
+ continue
+
+ version_id = str(self._get_version_id(published_repres))
+ filtered_instances.append(
+ InstanceFilterResult(instance, thumbnail_path, version_id)
+ )
+ return filtered_instances
+
+ def _get_version_id(self, published_representations):
+ for repre_info in published_representations.values():
+ return repre_info["representation"]["parent"]
+
+ def _get_instance_thumbnail_path(self, published_representations):
+ thumb_repre_doc = None
+ for repre_info in published_representations.values():
+ repre_doc = repre_info["representation"]
+ if repre_doc["name"].lower() == "thumbnail":
+ thumb_repre_doc = repre_doc
+ break
+
+ if thumb_repre_doc is None:
+ self.log.debug(
+ "There is not representation with name \"thumbnail\""
+ )
+ return None
+
+ path = thumb_repre_doc["data"]["path"]
+ if not os.path.exists(path):
+ self.log.warning(
+ "Thumbnail file cannot be found. Path: {}".format(path)
+ )
+ return None
+ return os.path.normpath(path)
+
+ def _integrate_thumbnails(
+ self,
+ filtered_instance_items,
+ version_docs_by_str_id,
+ project_name
+ ):
+ from openpype.client.server.operations import create_thumbnail
+
+ op_session = OperationsSession()
+
+ for instance_item in filtered_instance_items:
+ instance, thumbnail_path, version_id = instance_item
+ instance_label = self._get_instance_label(instance)
+ version_doc = version_docs_by_str_id.get(version_id)
+ if not version_doc:
+ self.log.warning((
+ "Version entity for instance \"{}\" was not found."
+ ).format(instance_label))
+ continue
+
+ thumbnail_id = create_thumbnail(project_name, thumbnail_path)
+
+ # Set thumbnail id for version
+ op_session.update_entity(
+ project_name,
+ version_doc["type"],
+ version_doc["_id"],
+ {"data.thumbnail_id": thumbnail_id}
+ )
+ if version_doc["type"] == "hero_version":
+ version_name = "Hero"
+ else:
+ version_name = version_doc["name"]
+ self.log.debug("Setting thumbnail for version \"{}\" <{}>".format(
+ version_name, version_id
+ ))
+
+ asset_entity = instance.data["assetEntity"]
+ op_session.update_entity(
+ project_name,
+ asset_entity["type"],
+ asset_entity["_id"],
+ {"data.thumbnail_id": thumbnail_id}
+ )
+ self.log.debug("Setting thumbnail for asset \"{}\" <{}>".format(
+ asset_entity["name"], version_id
+ ))
+
+ op_session.commit()
+
+ def _get_instance_label(self, instance):
+ return (
+ instance.data.get("label")
+ or instance.data.get("name")
+ or "N/A"
+ )
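
The `InstanceFilterResult` namedtuple above keeps each instance, its thumbnail path and its version id together through the filtering and integration passes. A tiny sketch of why that shape is convenient, with dummy values:

```python
import collections

InstanceFilterResult = collections.namedtuple(
    "InstanceFilterResult",
    ["instance", "thumbnail_path", "version_id"]
)

filtered_items = [
    InstanceFilterResult("instanceA", "/tmp/a_thumb.jpg", "v-001"),
    InstanceFilterResult("instanceB", "/tmp/b_thumb.jpg", "v-002"),
]

# Named access reads better than indexing when collecting version ids...
version_ids = {item.version_id for item in filtered_items}
print(version_ids)  # {'v-001', 'v-002'}

# ...and tuple unpacking still works where positional access is natural,
# as in IntegrateThumbnailsAYON._integrate_thumbnails.
for instance, thumbnail_path, version_id in filtered_items:
    print(instance, thumbnail_path, version_id)
```
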
diff --git a/openpype/plugins/publish/integrate_version_attrs.py b/openpype/plugins/publish/integrate_version_attrs.py
new file mode 100644
index 0000000000..ed179ae319
--- /dev/null
+++ b/openpype/plugins/publish/integrate_version_attrs.py
@@ -0,0 +1,93 @@
+import pyblish.api
+import ayon_api
+
+from openpype import AYON_SERVER_ENABLED
+from openpype.client.operations import OperationsSession
+
+
+class IntegrateVersionAttributes(pyblish.api.ContextPlugin):
+ """Integrate version attributes from predefined key.
+
+ Any integration after 'IntegrateAsset' can fill 'versionAttributes' with
+ attribute key & value to be updated on created version.
+
+ The integration must make sure the attribute is available for the version
+    entity, otherwise an error would be raised.
+
+ Example of 'versionAttributes':
+ {
+ "ftrack_id": "0123456789-101112-131415",
+ "syncsketch_id": "987654321-012345-678910"
+ }
+ """
+
+ label = "Integrate Version Attributes"
+ order = pyblish.api.IntegratorOrder + 0.5
+
+ def process(self, context):
+ available_attributes = ayon_api.get_attributes_for_type("version")
+ skipped_attributes = set()
+ project_name = context.data["projectName"]
+ op_session = OperationsSession()
+ for instance in context:
+ label = self.get_instance_label(instance)
+ version_entity = instance.data.get("versionEntity")
+ if not version_entity:
+ continue
+ attributes = instance.data.get("versionAttributes")
+ if not attributes:
+ self.log.debug((
+ "Skipping instance {} because it does not specify"
+ " version attributes to set."
+ ).format(label))
+ continue
+
+ filtered_attributes = {}
+ for attr, value in attributes.items():
+ if attr not in available_attributes:
+ skipped_attributes.add(attr)
+ else:
+ filtered_attributes[attr] = value
+
+ if not filtered_attributes:
+ self.log.debug((
+ "Skipping instance {} because all version attributes were"
+ " filtered out."
+ ).format(label))
+ continue
+
+ self.log.debug("Updating attributes on version {} to {}".format(
+ version_entity["_id"], str(filtered_attributes)
+ ))
+ op_session.update_entity(
+ project_name,
+ "version",
+ version_entity["_id"],
+ {"attrib": filtered_attributes}
+ )
+
+ if skipped_attributes:
+ self.log.warning((
+ "Skipped version attributes integration because they're"
+ " not available on the server: {}"
+ ).format(str(skipped_attributes)))
+
+ if len(op_session):
+ op_session.commit()
+ self.log.info("Updated version attributes")
+ else:
+ self.log.debug("There are no version attributes to update")
+
+ @staticmethod
+ def get_instance_label(instance):
+ return (
+ instance.data.get("label")
+ or instance.data.get("name")
+ or instance.data.get("subset")
+ or str(instance)
+ )
+
+
+# Discover the plugin only in AYON mode
+if not AYON_SERVER_ENABLED:
+ del IntegrateVersionAttributes
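
As a usage sketch: any integrator ordered between `IntegrateAsset` and `IntegrateVersionAttributes` can fill `versionAttributes` on an instance. A minimal hypothetical plugin follows; the `ftrack_id` attribute name is illustrative only and must exist as a version attribute on the server:

```python
import pyblish.api


class CollectFtrackVersionId(pyblish.api.InstancePlugin):
    """Hypothetical sketch of an integration filling 'versionAttributes'."""

    # After 'IntegrateAsset' and before 'IntegrateVersionAttributes',
    # which runs at IntegratorOrder + 0.5
    order = pyblish.api.IntegratorOrder + 0.4

    def process(self, instance):
        attributes = instance.data.setdefault("versionAttributes", {})
        # Assumed attribute name, for illustration only
        attributes["ftrack_id"] = "0123456789-101112-131415"
```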
diff --git a/openpype/pype_commands.py b/openpype/pype_commands.py
index 56a0fe60cd..8a3f25a026 100644
--- a/openpype/pype_commands.py
+++ b/openpype/pype_commands.py
@@ -90,7 +90,10 @@ class PypeCommands:
from openpype.lib import Logger
from openpype.lib.applications import get_app_environments_for_context
from openpype.modules import ModulesManager
- from openpype.pipeline import install_openpype_plugins
+ from openpype.pipeline import (
+ install_openpype_plugins,
+ get_global_context,
+ )
from openpype.tools.utils.host_tools import show_publish
from openpype.tools.utils.lib import qt_app_context
@@ -112,12 +115,14 @@ class PypeCommands:
if not any(paths):
raise RuntimeError("No publish paths specified")
- if os.getenv("AVALON_APP_NAME"):
+ app_full_name = os.getenv("AVALON_APP_NAME")
+ if app_full_name:
+ context = get_global_context()
env = get_app_environments_for_context(
- os.environ["AVALON_PROJECT"],
- os.environ["AVALON_ASSET"],
- os.environ["AVALON_TASK"],
- os.environ["AVALON_APP_NAME"]
+ context["project_name"],
+ context["asset_name"],
+ context["task_name"],
+ app_full_name
)
os.environ.update(env)
@@ -260,12 +265,6 @@ class PypeCommands:
main(output_path, project_name, asset_name, strict)
- def texture_copy(self, project, asset, path):
- pass
-
- def run_application(self, app, project, asset, task, tools, arguments):
- pass
-
def validate_jsons(self):
pass
@@ -291,7 +290,14 @@ class PypeCommands:
folder = "../tests"
# disable warnings and show captured stdout even if success
- args = ["--disable-pytest-warnings", "-rP", folder]
+ args = [
+ "--disable-pytest-warnings",
+ "--capture=sys",
+ "--print",
+ "-W ignore::DeprecationWarning",
+ "-rP",
+ folder
+ ]
if mark:
args.extend(["-m", mark])
diff --git a/openpype/resources/__init__.py b/openpype/resources/__init__.py
index 0d7778e546..b8671f517a 100644
--- a/openpype/resources/__init__.py
+++ b/openpype/resources/__init__.py
@@ -1,4 +1,5 @@
import os
+from openpype import AYON_SERVER_ENABLED
from openpype.lib.openpype_version import is_running_staging
RESOURCES_DIR = os.path.dirname(os.path.abspath(__file__))
@@ -40,11 +41,17 @@ def get_liberation_font_path(bold=False, italic=False):
def get_openpype_production_icon_filepath():
- return get_resource("icons", "openpype_icon.png")
+ filename = "openpype_icon.png"
+ if AYON_SERVER_ENABLED:
+ filename = "AYON_icon.png"
+ return get_resource("icons", filename)
def get_openpype_staging_icon_filepath():
- return get_resource("icons", "openpype_icon_staging.png")
+ filename = "openpype_icon_staging.png"
+ if AYON_SERVER_ENABLED:
+ filename = "AYON_icon_staging.png"
+ return get_resource("icons", filename)
def get_openpype_icon_filepath(staging=None):
@@ -60,7 +67,12 @@ def get_openpype_splash_filepath(staging=None):
if staging is None:
staging = is_running_staging()
- if staging:
+ if AYON_SERVER_ENABLED:
+ if staging:
+ splash_file_name = "AYON_splash_staging.png"
+ else:
+ splash_file_name = "AYON_splash.png"
+ elif staging:
splash_file_name = "openpype_splash_staging.png"
else:
splash_file_name = "openpype_splash.png"
diff --git a/openpype/resources/icons/AYON_icon.png b/openpype/resources/icons/AYON_icon.png
new file mode 100644
index 0000000000..ed13aeea52
Binary files /dev/null and b/openpype/resources/icons/AYON_icon.png differ
diff --git a/openpype/resources/icons/AYON_icon_staging.png b/openpype/resources/icons/AYON_icon_staging.png
new file mode 100644
index 0000000000..75dadfd56c
Binary files /dev/null and b/openpype/resources/icons/AYON_icon_staging.png differ
diff --git a/openpype/resources/icons/AYON_splash.png b/openpype/resources/icons/AYON_splash.png
new file mode 100644
index 0000000000..734aefb740
Binary files /dev/null and b/openpype/resources/icons/AYON_splash.png differ
diff --git a/openpype/resources/icons/AYON_splash_staging.png b/openpype/resources/icons/AYON_splash_staging.png
new file mode 100644
index 0000000000..2923413664
Binary files /dev/null and b/openpype/resources/icons/AYON_splash_staging.png differ
diff --git a/openpype/scripts/fusion_switch_shot.py b/openpype/scripts/fusion_switch_shot.py
index fc22f060a2..8ecf4fb5ea 100644
--- a/openpype/scripts/fusion_switch_shot.py
+++ b/openpype/scripts/fusion_switch_shot.py
@@ -15,6 +15,7 @@ from openpype.pipeline import (
install_host,
registered_host,
legacy_io,
+ get_current_project_name,
)
from openpype.pipeline.context_tools import get_workdir_from_session
@@ -130,7 +131,7 @@ def update_frame_range(comp, representations):
"""
version_ids = [r["parent"] for r in representations]
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
versions = list(get_versions(project_name, version_ids=version_ids))
start = min(v["data"]["frameStart"] for v in versions)
@@ -161,7 +162,7 @@ def switch(asset_name, filepath=None, new=True):
# Assert asset name exists
# It is better to do this here then to wait till switch_shot does it
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
asset = get_asset_by_name(project_name, asset_name)
assert asset, "Could not find '%s' in the database" % asset_name
diff --git a/openpype/settings/ayon_settings.py b/openpype/settings/ayon_settings.py
new file mode 100644
index 0000000000..d2a2afbee0
--- /dev/null
+++ b/openpype/settings/ayon_settings.py
@@ -0,0 +1,1386 @@
+"""Helper functionality to convert AYON settings to OpenPype v3 settings.
+
+The settings are converted so that v3 code can work with AYON settings. Once
+the code of an addon is converted to a full AYON addon which expects AYON
+settings, the conversion function can be removed.
+
+The conversion is hardcoded -> there is no other way to achieve the result.
+
+Main entrypoints are functions:
+- convert_project_settings - convert settings to project settings
+- convert_system_settings - convert settings to system settings
+# Both getters below cache their values
+- get_ayon_project_settings - replacement for 'get_project_settings'
+- get_ayon_system_settings - replacement for 'get_system_settings'
+"""
+import os
+import collections
+import json
+import copy
+import time
+
+import six
+import ayon_api
+
+
+def _convert_color(color_value):
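+    """Convert an AYON color value to a v3 list of RGBA integers (0-255).
+
+    Accepts a hex string (e.g. "#ff0000") or an RGB(A) list. A missing
+    alpha channel is filled with 255; a float alpha (range 0-1) is scaled
+    to an integer.
+    """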
+ if isinstance(color_value, six.string_types):
+ color_value = color_value.lstrip("#")
+ color_value_len = len(color_value)
+ _color_value = []
+ for idx in range(color_value_len // 2):
+ _color_value.append(int(color_value[idx:idx + 2], 16))
+ for _ in range(4 - len(_color_value)):
+ _color_value.append(255)
+ return _color_value
+
+ if isinstance(color_value, list):
+        # WARNING R,G,B can be 'int' or 'float'
+        # - the 'float' variant uses values in range 0-1
+ if len(color_value) == 3:
+ # Add alpha
+ color_value.append(255)
+ else:
+            # Convert float alpha to int
+ alpha = int(color_value[3] * 255)
+ if alpha > 255:
+ alpha = 255
+ elif alpha < 0:
+ alpha = 0
+ color_value[3] = alpha
+ return color_value
+
+
+def _convert_host_imageio(host_settings):
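+    """Convert a host's 'imageio' settings in place.
+
+    Renames the 'ocio_config' subkey to 'filepath' (until fixed on server)
+    and converts the list of file rules to a dict keyed by rule name.
+    """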
+ if "imageio" not in host_settings:
+ return
+
+ # --- imageio ---
+ ayon_imageio = host_settings["imageio"]
+ # TODO remove when fixed on server
+ if "ocio_config" in ayon_imageio["ocio_config"]:
+ ayon_imageio["ocio_config"]["filepath"] = (
+ ayon_imageio["ocio_config"].pop("ocio_config")
+ )
+ # Convert file rules
+ imageio_file_rules = ayon_imageio["file_rules"]
+ new_rules = {}
+ for rule in imageio_file_rules["rules"]:
+ name = rule.pop("name")
+ new_rules[name] = rule
+ imageio_file_rules["rules"] = new_rules
+
+
+def _convert_applications_groups(groups, clear_metadata):
+ environment_key = "environment"
+ if isinstance(groups, dict):
+ new_groups = []
+ for name, item in groups.items():
+ item["name"] = name
+ new_groups.append(item)
+ groups = new_groups
+
+ output = {}
+ group_dynamic_labels = {}
+ for group in groups:
+ group_name = group.pop("name")
+ if "label" in group:
+ group_dynamic_labels[group_name] = group["label"]
+
+ tool_group_envs = group[environment_key]
+ if isinstance(tool_group_envs, six.string_types):
+ group[environment_key] = json.loads(tool_group_envs)
+
+ variants = {}
+ variant_dynamic_labels = {}
+ for variant in group.pop("variants"):
+ variant_name = variant.pop("name")
+ label = variant.get("label")
+ if label and label != variant_name:
+ variant_dynamic_labels[variant_name] = label
+ variant_envs = variant[environment_key]
+ if isinstance(variant_envs, six.string_types):
+ variant[environment_key] = json.loads(variant_envs)
+ variants[variant_name] = variant
+ group["variants"] = variants
+
+ if not clear_metadata:
+ variants["__dynamic_keys_labels__"] = variant_dynamic_labels
+ output[group_name] = group
+
+ if not clear_metadata:
+ output["__dynamic_keys_labels__"] = group_dynamic_labels
+ return output
+
+
+def _convert_applications_system_settings(
+ ayon_settings, output, clear_metadata
+):
+ # Addon settings
+ addon_settings = ayon_settings["applications"]
+
+ # Remove project settings
+ addon_settings.pop("only_available", None)
+
+ # Applications settings
+ ayon_apps = addon_settings["applications"]
+ if "adsk_3dsmax" in ayon_apps:
+ ayon_apps["3dsmax"] = ayon_apps.pop("adsk_3dsmax")
+
+ additional_apps = ayon_apps.pop("additional_apps")
+ applications = _convert_applications_groups(
+ ayon_apps, clear_metadata
+ )
+ applications["additional_apps"] = _convert_applications_groups(
+ additional_apps, clear_metadata
+ )
+
+ # Tools settings
+ tools = _convert_applications_groups(
+ addon_settings["tool_groups"], clear_metadata
+ )
+
+ output["applications"] = applications
+ output["tools"] = {"tool_groups": tools}
+
+
+def _convert_general(ayon_settings, output, default_settings):
+ # TODO get studio name/code
+ core_settings = ayon_settings["core"]
+ environments = core_settings["environments"]
+ if isinstance(environments, six.string_types):
+ environments = json.loads(environments)
+
+ general = default_settings["general"]
+ general.update({
+ "log_to_server": False,
+ "studio_name": core_settings["studio_name"],
+ "studio_code": core_settings["studio_code"],
+ "environment": environments
+ })
+ output["general"] = general
+
+
+def _convert_kitsu_system_settings(ayon_settings, output):
+ output["modules"]["kitsu"] = {
+ "server": ayon_settings["kitsu"]["server"]
+ }
+
+
+def _convert_ftrack_system_settings(ayon_settings, output, defaults):
+    # Ftrack contains a few keys that are needed for initialization in
+    # OpenPype mode; some are used in different places
+ ftrack_settings = defaults["modules"]["ftrack"]
+ ftrack_settings["ftrack_server"] = (
+ ayon_settings["ftrack"]["ftrack_server"])
+ output["modules"]["ftrack"] = ftrack_settings
+
+
+def _convert_shotgrid_system_settings(ayon_settings, output):
+ ayon_shotgrid = ayon_settings["shotgrid"]
+    # Skip conversion if a different AYON addon is used
+ if "leecher_manager_url" not in ayon_shotgrid:
+ output["shotgrid"] = ayon_shotgrid
+ return
+
+ shotgrid_settings = {}
+ for key in (
+ "leecher_manager_url",
+ "leecher_backend_url",
+ "filter_projects_by_login",
+ ):
+ shotgrid_settings[key] = ayon_shotgrid[key]
+
+ new_items = {}
+ for item in ayon_shotgrid["shotgrid_settings"]:
+ name = item.pop("name")
+ new_items[name] = item
+ shotgrid_settings["shotgrid_settings"] = new_items
+
+ output["modules"]["shotgrid"] = shotgrid_settings
+
+
+def _convert_timers_manager_system_settings(ayon_settings, output):
+ ayon_manager = ayon_settings["timers_manager"]
+ manager_settings = {
+ key: ayon_manager[key]
+ for key in {
+ "auto_stop", "full_time", "message_time", "disregard_publishing"
+ }
+ }
+ output["modules"]["timers_manager"] = manager_settings
+
+
+def _convert_clockify_system_settings(ayon_settings, output):
+ output["modules"]["clockify"] = ayon_settings["clockify"]
+
+
+def _convert_deadline_system_settings(ayon_settings, output):
+ ayon_deadline = ayon_settings["deadline"]
+ deadline_settings = {
+ "deadline_urls": {
+ item["name"]: item["value"]
+ for item in ayon_deadline["deadline_urls"]
+ }
+ }
+ output["modules"]["deadline"] = deadline_settings
+
+
+def _convert_muster_system_settings(ayon_settings, output):
+ ayon_muster = ayon_settings["muster"]
+ templates_mapping = {
+ item["name"]: item["value"]
+ for item in ayon_muster["templates_mapping"]
+ }
+ output["modules"]["muster"] = {
+ "templates_mapping": templates_mapping,
+ "MUSTER_REST_URL": ayon_muster["MUSTER_REST_URL"]
+ }
+
+
+def _convert_royalrender_system_settings(ayon_settings, output):
+ ayon_royalrender = ayon_settings["royalrender"]
+ output["modules"]["royalrender"] = {
+ "rr_paths": {
+ item["name"]: item["value"]
+ for item in ayon_royalrender["rr_paths"]
+ }
+ }
+
+
+def _convert_modules_system(
+ ayon_settings, output, addon_versions, default_settings
+):
+ # TODO add all modules
+ # TODO add 'enabled' values
+ for key, func in (
+ ("kitsu", _convert_kitsu_system_settings),
+ ("shotgrid", _convert_shotgrid_system_settings),
+ ("timers_manager", _convert_timers_manager_system_settings),
+ ("clockify", _convert_clockify_system_settings),
+ ("deadline", _convert_deadline_system_settings),
+ ("muster", _convert_muster_system_settings),
+ ("royalrender", _convert_royalrender_system_settings),
+ ):
+ if key in ayon_settings:
+ func(ayon_settings, output)
+
+ if "ftrack" in ayon_settings:
+ _convert_ftrack_system_settings(
+ ayon_settings, output, default_settings)
+
+ output_modules = output["modules"]
+ # TODO remove when not needed
+ for module_name, value in default_settings["modules"].items():
+ if module_name not in output_modules:
+ output_modules[module_name] = value
+
+ for module_name, value in default_settings["modules"].items():
+ if "enabled" not in value or module_name not in output_modules:
+ continue
+
+ ayon_module_name = module_name
+ if module_name == "sync_server":
+ ayon_module_name = "sitesync"
+ output_modules[module_name]["enabled"] = (
+ ayon_module_name in addon_versions)
+
+    # Missing module conversions
+    # - "sync_server" -> renamed to 'sitesync'
+    # - "slack" -> only 'enabled' is converted
+    # - "job_queue" -> completely missing in AYON
+
+
+def convert_system_settings(ayon_settings, default_settings, addon_versions):
+ default_settings = copy.deepcopy(default_settings)
+ output = {
+ "modules": {}
+ }
+ if "applications" in ayon_settings:
+ _convert_applications_system_settings(ayon_settings, output, False)
+
+ if "core" in ayon_settings:
+ _convert_general(ayon_settings, output, default_settings)
+
+ _convert_modules_system(
+ ayon_settings,
+ output,
+ addon_versions,
+ default_settings
+ )
+ for key, value in default_settings.items():
+ if key not in output:
+ output[key] = value
+ return output
+
+
+# --------- Project settings ---------
+def _convert_applications_project_settings(ayon_settings, output):
+ if "applications" not in ayon_settings:
+ return
+
+ output["applications"] = {
+ "only_available": ayon_settings["applications"]["only_available"]
+ }
+
+
+def _convert_blender_project_settings(ayon_settings, output):
+ if "blender" not in ayon_settings:
+ return
+ ayon_blender = ayon_settings["blender"]
+ _convert_host_imageio(ayon_blender)
+
+ ayon_publish = ayon_blender["publish"]
+
+ for plugin in ("ExtractThumbnail", "ExtractPlayblast"):
+ plugin_settings = ayon_publish[plugin]
+ plugin_settings["presets"] = json.loads(plugin_settings["presets"])
+
+ output["blender"] = ayon_blender
+
+
+def _convert_celaction_project_settings(ayon_settings, output):
+ if "celaction" not in ayon_settings:
+ return
+
+ ayon_celaction = ayon_settings["celaction"]
+ _convert_host_imageio(ayon_celaction)
+
+ output["celaction"] = ayon_celaction
+
+
+def _convert_flame_project_settings(ayon_settings, output):
+ if "flame" not in ayon_settings:
+ return
+
+ ayon_flame = ayon_settings["flame"]
+
+ ayon_publish_flame = ayon_flame["publish"]
+ # Plugin 'ExtractSubsetResources' renamed to 'ExtractProductResources'
+ if "ExtractSubsetResources" in ayon_publish_flame:
+ ayon_product_resources = ayon_publish_flame["ExtractSubsetResources"]
+ else:
+ ayon_product_resources = (
+ ayon_publish_flame.pop("ExtractProductResources"))
+ ayon_publish_flame["ExtractSubsetResources"] = ayon_product_resources
+
+ # 'ExtractSubsetResources' changed model of 'export_presets_mapping'
+ # - some keys were moved under 'other_parameters'
+ new_subset_resources = {}
+ for item in ayon_product_resources.pop("export_presets_mapping"):
+ name = item.pop("name")
+ if "other_parameters" in item:
+ other_parameters = item.pop("other_parameters")
+ item.update(other_parameters)
+ new_subset_resources[name] = item
+
+ ayon_product_resources["export_presets_mapping"] = new_subset_resources
+
+    # 'imageio' changed model
+    # - the 'project' subkey is missing; its content is stored in the root
+    #   of the 'imageio' model
+ _convert_host_imageio(ayon_flame)
+ ayon_imageio_flame = ayon_flame["imageio"]
+ if "project" not in ayon_imageio_flame:
+ profile_mapping = ayon_imageio_flame.pop("profilesMapping")
+ ayon_flame["imageio"] = {
+ "project": ayon_imageio_flame,
+ "profilesMapping": profile_mapping
+ }
+
+ ayon_load_flame = ayon_flame["load"]
+ for plugin_name in ("LoadClip", "LoadClipBatch"):
+ plugin_settings = ayon_load_flame[plugin_name]
+ plugin_settings["families"] = plugin_settings.pop("product_types")
+ plugin_settings["clip_name_template"] = (
+ plugin_settings["clip_name_template"]
+ .replace("{folder[name]}", "{asset}")
+ .replace("{product[name]}", "{subset}")
+ )
+ plugin_settings["layer_rename_template"] = (
+ plugin_settings["layer_rename_template"]
+ .replace("{folder[name]}", "{asset}")
+ .replace("{product[name]}", "{subset}")
+ )
+
+ output["flame"] = ayon_flame
+
+
+def _convert_fusion_project_settings(ayon_settings, output):
+ if "fusion" not in ayon_settings:
+ return
+
+ ayon_fusion = ayon_settings["fusion"]
+ _convert_host_imageio(ayon_fusion)
+
+ ayon_imageio_fusion = ayon_fusion["imageio"]
+
+ if "ocioSettings" in ayon_imageio_fusion:
+ ayon_ocio_setting = ayon_imageio_fusion.pop("ocioSettings")
+ paths = ayon_ocio_setting.pop("ocioPathModel")
+ for key, value in tuple(paths.items()):
+ new_value = []
+ if value:
+ new_value.append(value)
+ paths[key] = new_value
+
+ ayon_ocio_setting["configFilePath"] = paths
+ ayon_imageio_fusion["ocio"] = ayon_ocio_setting
+ elif "ocio" in ayon_imageio_fusion:
+ paths = ayon_imageio_fusion["ocio"].pop("configFilePath")
+ for key, value in tuple(paths.items()):
+ new_value = []
+ if value:
+ new_value.append(value)
+ paths[key] = new_value
+ ayon_imageio_fusion["ocio"]["configFilePath"] = paths
+
+ _convert_host_imageio(ayon_imageio_fusion)
+
+ ayon_create_saver = ayon_fusion["create"]["CreateSaver"]
+ ayon_create_saver["temp_rendering_path_template"] = (
+ ayon_create_saver["temp_rendering_path_template"]
+ .replace("{product[name]}", "{subset}")
+ .replace("{product[type]}", "{family}")
+ .replace("{folder[name]}", "{asset}")
+ .replace("{task[name]}", "{task}")
+ )
+
+ output["fusion"] = ayon_fusion
+
+
+def _convert_maya_project_settings(ayon_settings, output):
+ if "maya" not in ayon_settings:
+ return
+
+ ayon_maya = ayon_settings["maya"]
+
+ # Change key of render settings
+ ayon_maya["RenderSettings"] = ayon_maya.pop("render_settings")
+
+ # Convert extensions mapping
+ ayon_maya["ext_mapping"] = {
+ item["name"]: item["value"]
+ for item in ayon_maya["ext_mapping"]
+ }
+
+ # Publish UI filters
+ new_filters = {}
+ for item in ayon_maya["filters"]:
+ new_filters[item["name"]] = {
+ subitem["name"]: subitem["value"]
+ for subitem in item["value"]
+ }
+ ayon_maya["filters"] = new_filters
+
+ # Maya dirmap
+ ayon_maya_dirmap = ayon_maya.pop("maya_dirmap")
+ ayon_maya_dirmap_path = ayon_maya_dirmap["paths"]
+ ayon_maya_dirmap_path["source-path"] = (
+ ayon_maya_dirmap_path.pop("source_path")
+ )
+ ayon_maya_dirmap_path["destination-path"] = (
+ ayon_maya_dirmap_path.pop("destination_path")
+ )
+ ayon_maya["maya-dirmap"] = ayon_maya_dirmap
+
+ # Create plugins
+ ayon_create = ayon_maya["create"]
+ ayon_create_static_mesh = ayon_create["CreateUnrealStaticMesh"]
+ if "static_mesh_prefixes" in ayon_create_static_mesh:
+ ayon_create_static_mesh["static_mesh_prefix"] = (
+ ayon_create_static_mesh.pop("static_mesh_prefixes")
+ )
+
+ # --- Publish (START) ---
+ ayon_publish = ayon_maya["publish"]
+ try:
+ attributes = json.loads(
+ ayon_publish["ValidateAttributes"]["attributes"]
+ )
+ except ValueError:
+ attributes = {}
+ ayon_publish["ValidateAttributes"]["attributes"] = attributes
+
+ try:
+ SUFFIX_NAMING_TABLE = json.loads(
+ ayon_publish
+ ["ValidateTransformNamingSuffix"]
+ ["SUFFIX_NAMING_TABLE"]
+ )
+ except ValueError:
+ SUFFIX_NAMING_TABLE = {}
+ ayon_publish["ValidateTransformNamingSuffix"]["SUFFIX_NAMING_TABLE"] = (
+ SUFFIX_NAMING_TABLE
+ )
+
+ validate_frame_range = ayon_publish["ValidateFrameRange"]
+ if "exclude_product_types" in validate_frame_range:
+ validate_frame_range["exclude_families"] = (
+ validate_frame_range.pop("exclude_product_types"))
+
+ # Extract playblast capture settings
+ validate_rendern_settings = ayon_publish["ValidateRenderSettings"]
+ for key in (
+ "arnold_render_attributes",
+ "vray_render_attributes",
+ "redshift_render_attributes",
+ "renderman_render_attributes",
+ ):
+ if key not in validate_rendern_settings:
+ continue
+ validate_rendern_settings[key] = [
+ [item["type"], item["value"]]
+ for item in validate_rendern_settings[key]
+ ]
+
+ plugin_path_attributes = ayon_publish["ValidatePluginPathAttributes"]
+ plugin_path_attributes["attribute"] = {
+ item["name"]: item["value"]
+ for item in plugin_path_attributes["attribute"]
+ }
+
+ ayon_capture_preset = ayon_publish["ExtractPlayblast"]["capture_preset"]
+ display_options = ayon_capture_preset["DisplayOptions"]
+ for key in ("background", "backgroundBottom", "backgroundTop"):
+ display_options[key] = _convert_color(display_options[key])
+
+ for src_key, dst_key in (
+ ("DisplayOptions", "Display Options"),
+ ("ViewportOptions", "Viewport Options"),
+ ("CameraOptions", "Camera Options"),
+ ):
+ ayon_capture_preset[dst_key] = ayon_capture_preset.pop(src_key)
+
+ viewport_options = ayon_capture_preset["Viewport Options"]
+ viewport_options["pluginObjects"] = {
+ item["name"]: item["value"]
+ for item in viewport_options["pluginObjects"]
+ }
+
+ # Extract Camera Alembic bake attributes
+ try:
+ bake_attributes = json.loads(
+ ayon_publish["ExtractCameraAlembic"]["bake_attributes"]
+ )
+ except ValueError:
+ bake_attributes = []
+ ayon_publish["ExtractCameraAlembic"]["bake_attributes"] = bake_attributes
+
+ # --- Publish (END) ---
+ for renderer_settings in ayon_maya["RenderSettings"].values():
+ if (
+ not isinstance(renderer_settings, dict)
+ or "additional_options" not in renderer_settings
+ ):
+ continue
+ renderer_settings["additional_options"] = [
+ [item["attribute"], item["value"]]
+ for item in renderer_settings["additional_options"]
+ ]
+
+ # Workfile build
+ ayon_workfile_build = ayon_maya["workfile_build"]
+ for item in ayon_workfile_build["profiles"]:
+ for key in ("current_context", "linked_assets"):
+ for subitem in item[key]:
+ if "families" in subitem:
+ break
+ subitem["families"] = subitem.pop("product_types")
+ subitem["subset_name_filters"] = subitem.pop(
+ "product_name_filters")
+
+ _convert_host_imageio(ayon_maya)
+
+ ayon_maya_load = ayon_maya["load"]
+ load_colors = ayon_maya_load["colors"]
+ for key, color in tuple(load_colors.items()):
+ load_colors[key] = _convert_color(color)
+
+ reference_loader = ayon_maya_load["reference_loader"]
+ reference_loader["namespace"] = (
+ reference_loader["namespace"]
+ .replace("{folder[name]}", "{asset_name}")
+ .replace("{product[name]}", "{subset}")
+ )
+
+ output["maya"] = ayon_maya
+
+
+def _convert_nuke_knobs(knobs):
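+    """Convert AYON knob definitions to v3 knob settings.
+
+    The knob value is stored under a key matching the knob type, where
+    'boolean' is an alias of 'bool'. Vector and color values are flattened
+    to lists; 'formatable' knobs keep 'template' and 'to_type' instead of
+    a plain value.
+    """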
+ new_knobs = []
+ for knob in knobs:
+ knob_type = knob["type"]
+
+ if knob_type == "boolean":
+ knob_type = "bool"
+
+ if knob_type != "bool":
+ value = knob[knob_type]
+ elif knob_type in knob:
+ value = knob[knob_type]
+ else:
+ value = knob["boolean"]
+
+ new_knob = {
+ "type": knob_type,
+ "name": knob["name"],
+ }
+ new_knobs.append(new_knob)
+
+ if knob_type == "formatable":
+ new_knob["template"] = value["template"]
+ new_knob["to_type"] = value["to_type"]
+ continue
+
+ value_key = "value"
+ if knob_type == "expression":
+ value_key = "expression"
+
+ elif knob_type == "color_gui":
+ value = _convert_color(value)
+
+ elif knob_type == "vector_2d":
+ value = [value["x"], value["y"]]
+
+ elif knob_type == "vector_3d":
+ value = [value["x"], value["y"], value["z"]]
+
+ new_knob[value_key] = value
+ return new_knobs
+
+
+def _convert_nuke_project_settings(ayon_settings, output):
+ if "nuke" not in ayon_settings:
+ return
+
+ ayon_nuke = ayon_settings["nuke"]
+
+ # --- Dirmap ---
+ dirmap = ayon_nuke.pop("dirmap")
+ for src_key, dst_key in (
+ ("source_path", "source-path"),
+ ("destination_path", "destination-path"),
+ ):
+ dirmap["paths"][dst_key] = dirmap["paths"].pop(src_key)
+ ayon_nuke["nuke-dirmap"] = dirmap
+
+ # --- Filters ---
+ new_gui_filters = {}
+ for item in ayon_nuke.pop("filters"):
+ subvalue = {}
+ key = item["name"]
+ for subitem in item["value"]:
+ subvalue[subitem["name"]] = subitem["value"]
+ new_gui_filters[key] = subvalue
+ ayon_nuke["filters"] = new_gui_filters
+
+ # --- Load ---
+ ayon_load = ayon_nuke["load"]
+ ayon_load["LoadClip"]["_representations"] = (
+ ayon_load["LoadClip"].pop("representations_include")
+ )
+ ayon_load["LoadImage"]["_representations"] = (
+ ayon_load["LoadImage"].pop("representations_include")
+ )
+
+ # --- Create ---
+ ayon_create = ayon_nuke["create"]
+ for creator_name in (
+ "CreateWritePrerender",
+ "CreateWriteImage",
+ "CreateWriteRender",
+ ):
+ create_plugin_settings = ayon_create[creator_name]
+ create_plugin_settings["temp_rendering_path_template"] = (
+ create_plugin_settings["temp_rendering_path_template"]
+ .replace("{product[name]}", "{subset}")
+ .replace("{product[type]}", "{family}")
+ .replace("{task[name]}", "{task}")
+ .replace("{folder[name]}", "{asset}")
+ )
+ new_prenodes = {}
+ for prenode in create_plugin_settings["prenodes"]:
+ name = prenode.pop("name")
+ prenode["knobs"] = _convert_nuke_knobs(prenode["knobs"])
+ new_prenodes[name] = prenode
+
+ create_plugin_settings["prenodes"] = new_prenodes
+
+ # --- Publish ---
+ ayon_publish = ayon_nuke["publish"]
+ slate_mapping = ayon_publish["ExtractSlateFrame"]["key_value_mapping"]
+ for key in tuple(slate_mapping.keys()):
+ value = slate_mapping[key]
+ slate_mapping[key] = [value["enabled"], value["template"]]
+
+ ayon_publish["ValidateKnobs"]["knobs"] = json.loads(
+ ayon_publish["ValidateKnobs"]["knobs"]
+ )
+
+ new_review_data_outputs = {}
+ for item in ayon_publish["ExtractReviewDataMov"]["outputs"]:
+ item_filter = item["filter"]
+ if "product_names" in item_filter:
+ item_filter["subsets"] = item_filter.pop("product_names")
+ item_filter["families"] = item_filter.pop("product_types")
+
+ item["reformat_node_config"] = _convert_nuke_knobs(
+ item["reformat_node_config"])
+
+ for node in item["reformat_nodes_config"]["reposition_nodes"]:
+ node["knobs"] = _convert_nuke_knobs(node["knobs"])
+
+ name = item.pop("name")
+ new_review_data_outputs[name] = item
+ ayon_publish["ExtractReviewDataMov"]["outputs"] = new_review_data_outputs
+
+ collect_instance_data = ayon_publish["CollectInstanceData"]
+ if "sync_workfile_version_on_product_types" in collect_instance_data:
+ collect_instance_data["sync_workfile_version_on_families"] = (
+ collect_instance_data.pop(
+ "sync_workfile_version_on_product_types"))
+
+ # TODO 'ExtractThumbnail' does not have ideal schema in v3
+ ayon_extract_thumbnail = ayon_publish["ExtractThumbnail"]
+ new_thumbnail_nodes = {}
+ for item in ayon_extract_thumbnail["nodes"]:
+ name = item["nodeclass"]
+ value = []
+ for knob in _convert_nuke_knobs(item["knobs"]):
+ knob_name = knob["name"]
+            # May raise KeyError for knob types without a plain value
+            # (e.g. 'formatable' knobs)
+ if knob["type"] == "expression":
+ knob_value = knob["expression"]
+ else:
+ knob_value = knob["value"]
+ value.append([knob_name, knob_value])
+ new_thumbnail_nodes[name] = value
+
+ ayon_extract_thumbnail["nodes"] = new_thumbnail_nodes
+
+ if "reposition_nodes" in ayon_extract_thumbnail:
+ for item in ayon_extract_thumbnail["reposition_nodes"]:
+ item["knobs"] = _convert_nuke_knobs(item["knobs"])
+
+ # --- ImageIO ---
+    # NOTE 'monitorOutLut' may not be in v3 yet (it should be)
+ _convert_host_imageio(ayon_nuke)
+ ayon_imageio = ayon_nuke["imageio"]
+ for item in ayon_imageio["nodes"]["requiredNodes"]:
+ item["knobs"] = _convert_nuke_knobs(item["knobs"])
+ for item in ayon_imageio["nodes"]["overrideNodes"]:
+ item["knobs"] = _convert_nuke_knobs(item["knobs"])
+
+ output["nuke"] = ayon_nuke
+
+
+def _convert_hiero_project_settings(ayon_settings, output):
+ if "hiero" not in ayon_settings:
+ return
+
+ ayon_hiero = ayon_settings["hiero"]
+ _convert_host_imageio(ayon_hiero)
+
+ new_gui_filters = {}
+ for item in ayon_hiero.pop("filters"):
+ subvalue = {}
+ key = item["name"]
+ for subitem in item["value"]:
+ subvalue[subitem["name"]] = subitem["value"]
+ new_gui_filters[key] = subvalue
+ ayon_hiero["filters"] = new_gui_filters
+
+ ayon_load_clip = ayon_hiero["load"]["LoadClip"]
+ if "product_types" in ayon_load_clip:
+ ayon_load_clip["families"] = ayon_load_clip.pop("product_types")
+
+ ayon_load_clip = ayon_hiero["load"]["LoadClip"]
+ ayon_load_clip["clip_name_template"] = (
+ ayon_load_clip["clip_name_template"]
+ .replace("{folder[name]}", "{asset}")
+ .replace("{product[name]}", "{subset}")
+ )
+
+ output["hiero"] = ayon_hiero
+
+
+def _convert_photoshop_project_settings(ayon_settings, output):
+ if "photoshop" not in ayon_settings:
+ return
+
+ ayon_photoshop = ayon_settings["photoshop"]
+ _convert_host_imageio(ayon_photoshop)
+
+ ayon_publish_photoshop = ayon_photoshop["publish"]
+
+ ayon_colorcoded = ayon_publish_photoshop["CollectColorCodedInstances"]
+ if "flatten_product_type_template" in ayon_colorcoded:
+ ayon_colorcoded["flatten_subset_template"] = (
+ ayon_colorcoded.pop("flatten_product_type_template"))
+
+ collect_review = ayon_publish_photoshop["CollectReview"]
+ if "active" in collect_review:
+ collect_review["publish"] = collect_review.pop("active")
+
+ output["photoshop"] = ayon_photoshop
+
+
+def _convert_tvpaint_project_settings(ayon_settings, output):
+ if "tvpaint" not in ayon_settings:
+ return
+ ayon_tvpaint = ayon_settings["tvpaint"]
+
+ _convert_host_imageio(ayon_tvpaint)
+
+ filters = {}
+ for item in ayon_tvpaint["filters"]:
+ value = item["value"]
+ try:
+ value = json.loads(value)
+
+ except ValueError:
+ value = {}
+ filters[item["name"]] = value
+ ayon_tvpaint["filters"] = filters
+
+ ayon_publish_settings = ayon_tvpaint["publish"]
+ for plugin_name in (
+ "ValidateProjectSettings",
+ "ValidateMarks",
+ "ValidateStartFrame",
+ "ValidateAssetName",
+ ):
+ ayon_value = ayon_publish_settings[plugin_name]
+ for src_key, dst_key in (
+ ("action_enabled", "optional"),
+ ("action_enable", "active"),
+ ):
+ if src_key in ayon_value:
+ ayon_value[dst_key] = ayon_value.pop(src_key)
+
+ extract_sequence_setting = ayon_publish_settings["ExtractSequence"]
+ extract_sequence_setting["review_bg"] = _convert_color(
+ extract_sequence_setting["review_bg"]
+ )
+
+ output["tvpaint"] = ayon_tvpaint
+
+
+def _convert_traypublisher_project_settings(ayon_settings, output):
+ if "traypublisher" not in ayon_settings:
+ return
+
+ ayon_traypublisher = ayon_settings["traypublisher"]
+
+ _convert_host_imageio(ayon_traypublisher)
+
+ ayon_editorial_simple = (
+ ayon_traypublisher["editorial_creators"]["editorial_simple"]
+ )
+ # Subset -> Product type conversion
+ if "product_type_presets" in ayon_editorial_simple:
+ family_presets = ayon_editorial_simple.pop("product_type_presets")
+ for item in family_presets:
+ item["family"] = item.pop("product_type")
+ ayon_editorial_simple["family_presets"] = family_presets
+
+ if "shot_metadata_creator" in ayon_editorial_simple:
+ shot_metadata_creator = ayon_editorial_simple.pop(
+ "shot_metadata_creator"
+ )
+ if isinstance(shot_metadata_creator["clip_name_tokenizer"], dict):
+ shot_metadata_creator["clip_name_tokenizer"] = [
+ {"name": "_sequence_", "regex": "(sc\\d{3})"},
+ {"name": "_shot_", "regex": "(sh\\d{3})"},
+ ]
+ ayon_editorial_simple.update(shot_metadata_creator)
+
+ ayon_editorial_simple["clip_name_tokenizer"] = {
+ item["name"]: item["regex"]
+ for item in ayon_editorial_simple["clip_name_tokenizer"]
+ }
+
+ if "shot_subset_creator" in ayon_editorial_simple:
+ ayon_editorial_simple.update(
+ ayon_editorial_simple.pop("shot_subset_creator"))
+ for item in ayon_editorial_simple["shot_hierarchy"]["parents"]:
+ item["type"] = item.pop("parent_type")
+
+ # Simple creators
+ ayon_simple_creators = ayon_traypublisher["simple_creators"]
+ for item in ayon_simple_creators:
+ if "product_type" not in item:
+ break
+ item["family"] = item.pop("product_type")
+
+ shot_add_tasks = ayon_editorial_simple["shot_add_tasks"]
+ if isinstance(shot_add_tasks, dict):
+ shot_add_tasks = []
+ new_shot_add_tasks = {
+ item["name"]: item["task_type"]
+ for item in shot_add_tasks
+ }
+ ayon_editorial_simple["shot_add_tasks"] = new_shot_add_tasks
+
+ output["traypublisher"] = ayon_traypublisher
+
+
+def _convert_webpublisher_project_settings(ayon_settings, output):
+ if "webpublisher" not in ayon_settings:
+ return
+
+ ayon_webpublisher = ayon_settings["webpublisher"]
+ _convert_host_imageio(ayon_webpublisher)
+
+ ayon_publish = ayon_webpublisher["publish"]
+
+ ayon_collect_files = ayon_publish["CollectPublishedFiles"]
+ ayon_collect_files["task_type_to_family"] = {
+ item["name"]: item["value"]
+ for item in ayon_collect_files["task_type_to_family"]
+ }
+
+ output["webpublisher"] = ayon_webpublisher
+
+
+def _convert_deadline_project_settings(ayon_settings, output):
+ if "deadline" not in ayon_settings:
+ return
+
+ ayon_deadline = ayon_settings["deadline"]
+
+ for key in ("deadline_urls",):
+ ayon_deadline.pop(key)
+
+ ayon_deadline_publish = ayon_deadline["publish"]
+ limit_groups = {
+ item["name"]: item["value"]
+ for item in ayon_deadline_publish["NukeSubmitDeadline"]["limit_groups"]
+ }
+ ayon_deadline_publish["NukeSubmitDeadline"]["limit_groups"] = limit_groups
+
+ maya_submit = ayon_deadline_publish["MayaSubmitDeadline"]
+ for json_key in ("jobInfo", "pluginInfo"):
+ src_text = maya_submit.pop(json_key)
+ try:
+ value = json.loads(src_text)
+ except ValueError:
+ value = {}
+ maya_submit[json_key] = value
+
+ nuke_submit = ayon_deadline_publish["NukeSubmitDeadline"]
+ nuke_submit["env_search_replace_values"] = {
+ item["name"]: item["value"]
+ for item in nuke_submit.pop("env_search_replace_values")
+ }
+ nuke_submit["limit_groups"] = {
+ item["name"]: item["value"] for item in nuke_submit.pop("limit_groups")
+ }
+
+ process_subsetted_job = ayon_deadline_publish["ProcessSubmittedJobOnFarm"]
+ process_subsetted_job["aov_filter"] = {
+ item["name"]: item["value"]
+ for item in process_subsetted_job.pop("aov_filter")
+ }
+
+ output["deadline"] = ayon_deadline
+
+
+def _convert_royalrender_project_settings(ayon_settings, output):
+ if "royalrender" not in ayon_settings:
+ return
+ ayon_royalrender = ayon_settings["royalrender"]
+ output["royalrender"] = {
+ "publish": ayon_royalrender["publish"]
+ }
+
+
+def _convert_kitsu_project_settings(ayon_settings, output):
+ if "kitsu" not in ayon_settings:
+ return
+
+ ayon_kitsu_settings = ayon_settings["kitsu"]
+ ayon_kitsu_settings.pop("server")
+
+ integrate_note = ayon_kitsu_settings["publish"]["IntegrateKitsuNote"]
+ status_change_conditions = integrate_note["status_change_conditions"]
+ if "product_type_requirements" in status_change_conditions:
+ status_change_conditions["family_requirements"] = (
+ status_change_conditions.pop("product_type_requirements"))
+
+ output["kitsu"] = ayon_kitsu_settings
+
+
+def _convert_shotgrid_project_settings(ayon_settings, output):
+ if "shotgrid" not in ayon_settings:
+ return
+
+ ayon_shotgrid = ayon_settings["shotgrid"]
+    # This means that a different variant of the addon is used
+ if "leecher_backend_url" not in ayon_shotgrid:
+ return
+
+ for key in {
+ "leecher_backend_url",
+ "filter_projects_by_login",
+ "shotgrid_settings",
+ "leecher_manager_url",
+ }:
+ ayon_shotgrid.pop(key)
+
+ asset_field = ayon_shotgrid["fields"]["asset"]
+ asset_field["type"] = asset_field.pop("asset_type")
+
+ task_field = ayon_shotgrid["fields"]["task"]
+ if "task" in task_field:
+ task_field["step"] = task_field.pop("task")
+
+ output["shotgrid"] = ayon_settings["shotgrid"]
+
+
+def _convert_slack_project_settings(ayon_settings, output):
+ if "slack" not in ayon_settings:
+ return
+
+ ayon_slack = ayon_settings["slack"]
+ ayon_slack.pop("enabled", None)
+ for profile in ayon_slack["publish"]["CollectSlackFamilies"]["profiles"]:
+ profile["tasks"] = profile.pop("task_names")
+ profile["subsets"] = profile.pop("subset_names")
+
+ output["slack"] = ayon_slack
+
+
+def _convert_global_project_settings(ayon_settings, output, default_settings):
+ if "core" not in ayon_settings:
+ return
+
+ ayon_core = ayon_settings["core"]
+
+ _convert_host_imageio(ayon_core)
+
+ for key in (
+ "environments",
+ "studio_name",
+ "studio_code",
+ ):
+ ayon_core.pop(key)
+
+ # Publish conversion
+ ayon_publish = ayon_core["publish"]
+
+ ayon_collect_audio = ayon_publish["CollectAudio"]
+ if "audio_product_name" in ayon_collect_audio:
+ ayon_collect_audio["audio_subset_name"] = (
+ ayon_collect_audio.pop("audio_product_name"))
+
+ for profile in ayon_publish["ExtractReview"]["profiles"]:
+ if "product_types" in profile:
+ profile["families"] = profile.pop("product_types")
+ new_outputs = {}
+ for output_def in profile.pop("outputs"):
+ name = output_def.pop("name")
+ new_outputs[name] = output_def
+
+ output_def_filter = output_def["filter"]
+ if "product_names" in output_def_filter:
+ output_def_filter["subsets"] = (
+ output_def_filter.pop("product_names"))
+
+ for color_key in ("overscan_color", "bg_color"):
+ output_def[color_key] = _convert_color(output_def[color_key])
+
+ letter_box = output_def["letter_box"]
+ for color_key in ("fill_color", "line_color"):
+ letter_box[color_key] = _convert_color(letter_box[color_key])
+
+ if "output_width" in output_def:
+ output_def["width"] = output_def.pop("output_width")
+
+ if "output_height" in output_def:
+ output_def["height"] = output_def.pop("output_height")
+
+ profile["outputs"] = new_outputs
+
+ # Extract Burnin plugin
+ extract_burnin = ayon_publish["ExtractBurnin"]
+ extract_burnin_options = extract_burnin["options"]
+ for color_key in ("font_color", "bg_color"):
+ extract_burnin_options[color_key] = _convert_color(
+ extract_burnin_options[color_key]
+ )
+
+ for profile in extract_burnin["profiles"]:
+ extract_burnin_defs = profile["burnins"]
+ if "product_names" in profile:
+ profile["subsets"] = profile.pop("product_names")
+ profile["families"] = profile.pop("product_types")
+
+ for burnin_def in extract_burnin_defs:
+ for key in (
+ "TOP_LEFT",
+ "TOP_CENTERED",
+ "TOP_RIGHT",
+ "BOTTOM_LEFT",
+ "BOTTOM_CENTERED",
+ "BOTTOM_RIGHT",
+ ):
+ burnin_def[key] = (
+ burnin_def[key]
+ .replace("{product[name]}", "{subset}")
+ .replace("{Product[name]}", "{Subset}")
+ .replace("{PRODUCT[NAME]}", "{SUBSET}")
+ .replace("{product[type]}", "{family}")
+ .replace("{Product[type]}", "{Family}")
+ .replace("{PRODUCT[TYPE]}", "{FAMILY}")
+ .replace("{folder[name]}", "{asset}")
+ .replace("{Folder[name]}", "{Asset}")
+ .replace("{FOLDER[NAME]}", "{ASSET}")
+ )
+ profile["burnins"] = {
+ extract_burnin_def.pop("name"): extract_burnin_def
+ for extract_burnin_def in extract_burnin_defs
+ }
+
+ ayon_integrate_hero = ayon_publish["IntegrateHeroVersion"]
+ for profile in ayon_integrate_hero["template_name_profiles"]:
+ if "product_types" not in profile:
+ break
+ profile["families"] = profile.pop("product_types")
+
+ if "IntegrateProductGroup" in ayon_publish:
+ subset_group = ayon_publish.pop("IntegrateProductGroup")
+ subset_group_profiles = subset_group.pop("product_grouping_profiles")
+ for profile in subset_group_profiles:
+ profile["families"] = profile.pop("product_types")
+ subset_group["subset_grouping_profiles"] = subset_group_profiles
+ ayon_publish["IntegrateSubsetGroup"] = subset_group
+
+ # Cleanup plugin
+ ayon_cleanup = ayon_publish["CleanUp"]
+ if "patterns" in ayon_cleanup:
+ ayon_cleanup["paterns"] = ayon_cleanup.pop("patterns")
+
+    # Project environments - parse JSON string to dict; the folder
+    # structure is only re-serialized to normalize the JSON string
+ ayon_core["project_environments"] = json.loads(
+ ayon_core["project_environments"]
+ )
+ ayon_core["project_folder_structure"] = json.dumps(json.loads(
+ ayon_core["project_folder_structure"]
+ ))
+
+ # Tools settings
+ ayon_tools = ayon_core["tools"]
+ ayon_create_tool = ayon_tools["creator"]
+ if "product_name_profiles" in ayon_create_tool:
+ product_name_profiles = ayon_create_tool.pop("product_name_profiles")
+ for profile in product_name_profiles:
+ profile["families"] = profile.pop("product_types")
+ ayon_create_tool["subset_name_profiles"] = product_name_profiles
+
+ for profile in ayon_create_tool["subset_name_profiles"]:
+ template = profile["template"]
+ profile["template"] = (
+ template
+ .replace("{task[name]}", "{task}")
+ .replace("{Task[name]}", "{Task}")
+ .replace("{TASK[NAME]}", "{TASK}")
+ .replace("{product[type]}", "{family}")
+ .replace("{Product[type]}", "{Family}")
+ .replace("{PRODUCT[TYPE]}", "{FAMILY}")
+ .replace("{folder[name]}", "{asset}")
+ .replace("{Folder[name]}", "{Asset}")
+ .replace("{FOLDER[NAME]}", "{ASSET}")
+ )
+
+ product_smart_select_key = "families_smart_select"
+ if "product_types_smart_select" in ayon_create_tool:
+ product_smart_select_key = "product_types_smart_select"
+
+ new_smart_select_families = {
+ item["name"]: item["task_names"]
+ for item in ayon_create_tool.pop(product_smart_select_key)
+ }
+ ayon_create_tool["families_smart_select"] = new_smart_select_families
+
+ ayon_loader_tool = ayon_tools["loader"]
+ if "product_type_filter_profiles" in ayon_loader_tool:
+ product_type_filter_profiles = (
+ ayon_loader_tool.pop("product_type_filter_profiles"))
+ for profile in product_type_filter_profiles:
+ profile["filter_families"] = profile.pop("filter_product_types")
+
+ ayon_loader_tool["family_filter_profiles"] = (
+ product_type_filter_profiles)
+
+ ayon_publish_tool = ayon_tools["publish"]
+ for profile in ayon_publish_tool["hero_template_name_profiles"]:
+ if "product_types" in profile:
+ profile["families"] = profile.pop("product_types")
+
+ for profile in ayon_publish_tool["template_name_profiles"]:
+ if "product_types" in profile:
+ profile["families"] = profile.pop("product_types")
+
+ ayon_core["sync_server"] = (
+ default_settings["global"]["sync_server"]
+ )
+ output["global"] = ayon_core
+
+
+def convert_project_settings(ayon_settings, default_settings):
+ # Missing settings
+ # - standalonepublisher
+ default_settings = copy.deepcopy(default_settings)
+ output = {}
+ exact_match = {
+ "aftereffects",
+ "harmony",
+ "houdini",
+ "resolve",
+ "unreal",
+ }
+ for key in exact_match:
+ if key in ayon_settings:
+ output[key] = ayon_settings[key]
+ _convert_host_imageio(output[key])
+
+ _convert_applications_project_settings(ayon_settings, output)
+ _convert_blender_project_settings(ayon_settings, output)
+ _convert_celaction_project_settings(ayon_settings, output)
+ _convert_flame_project_settings(ayon_settings, output)
+ _convert_fusion_project_settings(ayon_settings, output)
+ _convert_maya_project_settings(ayon_settings, output)
+ _convert_nuke_project_settings(ayon_settings, output)
+ _convert_hiero_project_settings(ayon_settings, output)
+ _convert_photoshop_project_settings(ayon_settings, output)
+ _convert_tvpaint_project_settings(ayon_settings, output)
+ _convert_traypublisher_project_settings(ayon_settings, output)
+ _convert_webpublisher_project_settings(ayon_settings, output)
+
+ _convert_deadline_project_settings(ayon_settings, output)
+ _convert_royalrender_project_settings(ayon_settings, output)
+ _convert_kitsu_project_settings(ayon_settings, output)
+ _convert_shotgrid_project_settings(ayon_settings, output)
+ _convert_slack_project_settings(ayon_settings, output)
+
+ _convert_global_project_settings(ayon_settings, output, default_settings)
+
+ for key, value in default_settings.items():
+ if key not in output:
+ output[key] = value
+
+ return output
+
+
+class CacheItem:
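+    """Cached value that expires 'lifetime' seconds after the last update."""
+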
+ lifetime = 10
+
+ def __init__(self, value, outdate_time=None):
+ self._value = value
+ if outdate_time is None:
+ outdate_time = time.time() + self.lifetime
+ self._outdate_time = outdate_time
+
+ @classmethod
+ def create_outdated(cls):
+ return cls({}, 0)
+
+ def get_value(self):
+ return copy.deepcopy(self._value)
+
+ def update_value(self, value):
+ self._value = value
+ self._outdate_time = time.time() + self.lifetime
+
+ @property
+ def is_outdated(self):
+ return time.time() > self._outdate_time
+
+
+class _AyonSettingsCache:
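+    """Short-lived cache of AYON addon settings and addon versions.
+
+    Values are re-queried from the server once their 'CacheItem' lifetime
+    passes. Settings are cached per project; 'None' is used for studio
+    settings.
+    """
+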
+ use_bundles = None
+ variant = None
+ addon_versions = CacheItem.create_outdated()
+ studio_settings = CacheItem.create_outdated()
+ cache_by_project_name = collections.defaultdict(
+ CacheItem.create_outdated)
+
+ @classmethod
+ def _use_bundles(cls):
+ if _AyonSettingsCache.use_bundles is None:
+ major, minor, _, _, _ = ayon_api.get_server_version_tuple()
+ _AyonSettingsCache.use_bundles = major == 0 and minor >= 3
+ return _AyonSettingsCache.use_bundles
+
+ @classmethod
+ def _get_variant(cls):
+ if _AyonSettingsCache.variant is None:
+ from openpype.lib.openpype_version import is_staging_enabled
+
+ _AyonSettingsCache.variant = (
+ "staging" if is_staging_enabled() else "production"
+ )
+ return _AyonSettingsCache.variant
+
+ @classmethod
+ def _get_bundle_name(cls):
+ return os.environ["AYON_BUNDLE_NAME"]
+
+ @classmethod
+ def get_value_by_project(cls, project_name):
+ cache_item = _AyonSettingsCache.cache_by_project_name[project_name]
+ if cache_item.is_outdated:
+ if cls._use_bundles():
+ value = ayon_api.get_addons_settings(
+ bundle_name=cls._get_bundle_name(),
+ project_name=project_name
+ )
+ else:
+ value = ayon_api.get_addons_settings(project_name)
+ cache_item.update_value(value)
+ return cache_item.get_value()
+
+ @classmethod
+ def _get_addon_versions_from_bundle(cls):
+ expected_bundle = cls._get_bundle_name()
+ bundles = ayon_api.get_bundles()["bundles"]
+ bundle = next(
+ (
+ bundle
+ for bundle in bundles
+ if bundle["name"] == expected_bundle
+ ),
+ None
+ )
+ if bundle is not None:
+ return bundle["addons"]
+ return {}
+
+ @classmethod
+ def get_addon_versions(cls):
+ cache_item = _AyonSettingsCache.addon_versions
+ if cache_item.is_outdated:
+ if cls._use_bundles():
+ addons = cls._get_addon_versions_from_bundle()
+ else:
+ settings_data = ayon_api.get_addons_settings(
+ only_values=False, variant=cls._get_variant())
+ addons = settings_data["versions"]
+ cache_item.update_value(addons)
+
+ return cache_item.get_value()
+
+
+def get_ayon_project_settings(default_values, project_name):
+ ayon_settings = _AyonSettingsCache.get_value_by_project(project_name)
+ return convert_project_settings(ayon_settings, default_values)
+
+
+def get_ayon_system_settings(default_values):
+ addon_versions = _AyonSettingsCache.get_addon_versions()
+ ayon_settings = _AyonSettingsCache.get_value_by_project(None)
+
+ return convert_system_settings(
+ ayon_settings, default_values, addon_versions
+ )
diff --git a/openpype/settings/defaults/project_settings/global.json b/openpype/settings/defaults/project_settings/global.json
index 802b964375..b6eb2f52f1 100644
--- a/openpype/settings/defaults/project_settings/global.json
+++ b/openpype/settings/defaults/project_settings/global.json
@@ -3,8 +3,8 @@
"activate_global_color_management": false,
"ocio_config": {
"filepath": [
- "{OPENPYPE_ROOT}/vendor/bin/ocioconfig/OpenColorIOConfigs/aces_1.2/config.ocio",
- "{OPENPYPE_ROOT}/vendor/bin/ocioconfig/OpenColorIOConfigs/nuke-default/config.ocio"
+ "{BUILTIN_OCIO_ROOT}/aces_1.2/config.ocio",
+ "{BUILTIN_OCIO_ROOT}/nuke-default/config.ocio"
]
},
"file_rules": {
@@ -53,7 +53,8 @@
},
"ValidateEditorialAssetName": {
"enabled": true,
- "optional": false
+ "optional": false,
+ "active": true
},
"ValidateVersion": {
"enabled": true,
@@ -300,74 +301,6 @@
}
]
},
- "IntegrateAssetNew": {
- "subset_grouping_profiles": [
- {
- "families": [],
- "hosts": [],
- "task_types": [],
- "tasks": [],
- "template": ""
- }
- ],
- "template_name_profiles": [
- {
- "families": [],
- "hosts": [],
- "task_types": [],
- "tasks": [],
- "template_name": "publish"
- },
- {
- "families": [
- "review",
- "render",
- "prerender"
- ],
- "hosts": [],
- "task_types": [],
- "tasks": [],
- "template_name": "render"
- },
- {
- "families": [
- "simpleUnrealTexture"
- ],
- "hosts": [
- "standalonepublisher"
- ],
- "task_types": [],
- "tasks": [],
- "template_name": "simpleUnrealTexture"
- },
- {
- "families": [
- "staticMesh",
- "skeletalMesh"
- ],
- "hosts": [
- "maya"
- ],
- "task_types": [],
- "tasks": [],
- "template_name": "maya2unreal"
- },
- {
- "families": [
- "online"
- ],
- "hosts": [
- "traypublisher"
- ],
- "task_types": [],
- "tasks": [],
- "template_name": "online"
- }
- ]
- },
- "IntegrateAsset": {
- "skip_host_families": []
- },
"IntegrateHeroVersion": {
"enabled": true,
"optional": true,
diff --git a/openpype/settings/defaults/project_settings/harmony.json b/openpype/settings/defaults/project_settings/harmony.json
index 02f51d1d2b..b424b43cc1 100644
--- a/openpype/settings/defaults/project_settings/harmony.json
+++ b/openpype/settings/defaults/project_settings/harmony.json
@@ -10,22 +10,6 @@
"rules": {}
}
},
- "load": {
- "ImageSequenceLoader": {
- "family": [
- "shot",
- "render",
- "image",
- "plate",
- "reference"
- ],
- "representations": [
- "jpeg",
- "png",
- "jpg"
- ]
- }
- },
"publish": {
"CollectPalettes": {
"allowed_tasks": [
diff --git a/openpype/settings/defaults/project_settings/substancepainter.json b/openpype/settings/defaults/project_settings/substancepainter.json
index 4adeff98ef..2f9344d435 100644
--- a/openpype/settings/defaults/project_settings/substancepainter.json
+++ b/openpype/settings/defaults/project_settings/substancepainter.json
@@ -2,11 +2,11 @@
"imageio": {
"activate_host_color_management": true,
"ocio_config": {
- "override_global_config": true,
+ "override_global_config": false,
"filepath": []
},
"file_rules": {
- "activate_host_rules": true,
+ "activate_host_rules": false,
"rules": {}
}
},
diff --git a/openpype/settings/defaults/project_settings/traypublisher.json b/openpype/settings/defaults/project_settings/traypublisher.json
index 4c2c2f1391..dda958ebcd 100644
--- a/openpype/settings/defaults/project_settings/traypublisher.json
+++ b/openpype/settings/defaults/project_settings/traypublisher.json
@@ -329,6 +329,11 @@
}
},
"publish": {
+ "CollectFrameDataFromAssetEntity": {
+ "enabled": true,
+ "optional": true,
+ "active": true
+ },
"ValidateFrameRange": {
"enabled": true,
"optional": true,
diff --git a/openpype/settings/defaults/project_settings/tvpaint.json b/openpype/settings/defaults/project_settings/tvpaint.json
index 1f4f468656..fdbd6d5d0f 100644
--- a/openpype/settings/defaults/project_settings/tvpaint.json
+++ b/openpype/settings/defaults/project_settings/tvpaint.json
@@ -60,11 +60,6 @@
255,
255,
255
- ],
- "families_to_review": [
- "review",
- "renderlayer",
- "renderscene"
]
},
"ValidateProjectSettings": {
diff --git a/openpype/settings/defaults/system_settings/general.json b/openpype/settings/defaults/system_settings/general.json
index d2994d1a62..496c37cd4d 100644
--- a/openpype/settings/defaults/system_settings/general.json
+++ b/openpype/settings/defaults/system_settings/general.json
@@ -15,6 +15,11 @@
"darwin": [],
"linux": []
},
+ "local_openpype_path": {
+ "windows": "",
+ "darwin": "",
+ "linux": ""
+ },
"production_version": "",
"staging_version": "",
"version_check_interval": 5
diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_harmony.json b/openpype/settings/entities/schemas/projects_schema/schema_project_harmony.json
index 98a815f2d4..f081c48b23 100644
--- a/openpype/settings/entities/schemas/projects_schema/schema_project_harmony.json
+++ b/openpype/settings/entities/schemas/projects_schema/schema_project_harmony.json
@@ -18,34 +18,6 @@
}
]
},
- {
- "type": "dict",
- "collapsible": true,
- "key": "load",
- "label": "Loader plugins",
- "children": [
- {
- "type": "dict",
- "collapsible": true,
- "key": "ImageSequenceLoader",
- "label": "Load Image Sequence",
- "children": [
- {
- "type": "list",
- "key": "family",
- "label": "Families",
- "object_type": "text"
- },
- {
- "type": "list",
- "key": "representations",
- "label": "Representations",
- "object_type": "text"
- }
- ]
- }
- ]
- },
{
"type": "dict",
"collapsible": true,
diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_traypublisher.json b/openpype/settings/entities/schemas/projects_schema/schema_project_traypublisher.json
index e75e2887db..184fc657be 100644
--- a/openpype/settings/entities/schemas/projects_schema/schema_project_traypublisher.json
+++ b/openpype/settings/entities/schemas/projects_schema/schema_project_traypublisher.json
@@ -349,6 +349,10 @@
"type": "schema_template",
"name": "template_validate_plugin",
"template_data": [
+ {
+ "key": "CollectFrameDataFromAssetEntity",
+ "label": "Collect frame range from asset entity"
+ },
{
"key": "ValidateFrameRange",
"label": "Validate frame range"
diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_tvpaint.json b/openpype/settings/entities/schemas/projects_schema/schema_project_tvpaint.json
index 45fc13bdde..e9255f426e 100644
--- a/openpype/settings/entities/schemas/projects_schema/schema_project_tvpaint.json
+++ b/openpype/settings/entities/schemas/projects_schema/schema_project_tvpaint.json
@@ -273,18 +273,6 @@
"key": "review_bg",
"label": "Review BG color",
"use_alpha": false
- },
- {
- "type": "enum",
- "key": "families_to_review",
- "label": "Families to review",
- "multiselection": true,
- "enum_items": [
- {"review": "review"},
- {"renderpass": "renderPass"},
- {"renderlayer": "renderLayer"},
- {"renderscene": "renderScene"}
- ]
}
]
},
diff --git a/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json b/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json
index 3164cfb62d..c7e91fd22d 100644
--- a/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json
+++ b/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json
@@ -118,6 +118,11 @@
"type": "boolean",
"key": "optional",
"label": "Optional"
+ },
+ {
+ "type": "boolean",
+ "key": "active",
+ "label": "Active"
}
]
},
@@ -888,142 +893,6 @@
}
]
},
- {
- "type": "dict",
- "collapsible": true,
- "key": "IntegrateAssetNew",
- "label": "IntegrateAsset (Legacy)",
- "is_group": true,
- "children": [
- {
- "type": "label",
- "label": "NOTE: Subset grouping profiles settings were moved to Integrate Subset Group. Please move values there."
- },
- {
- "type": "list",
- "key": "subset_grouping_profiles",
- "label": "Subset grouping profiles (DEPRECATED)",
- "use_label_wrap": true,
- "object_type": {
- "type": "dict",
- "children": [
- {
- "key": "families",
- "label": "Families",
- "type": "list",
- "object_type": "text"
- },
- {
- "type": "hosts-enum",
- "key": "hosts",
- "label": "Hosts",
- "multiselection": true
- },
- {
- "key": "task_types",
- "label": "Task types",
- "type": "task-types-enum"
- },
- {
- "key": "tasks",
- "label": "Task names",
- "type": "list",
- "object_type": "text"
- },
- {
- "type": "separator"
- },
- {
- "type": "text",
- "key": "template",
- "label": "Template"
- }
- ]
- }
- },
- {
- "type": "label",
- "label": "NOTE: Publish template profiles settings were moved to Tools/Publish/Template name profiles. Please move values there."
- },
- {
- "type": "list",
- "key": "template_name_profiles",
- "label": "Template name profiles (DEPRECATED)",
- "use_label_wrap": true,
- "object_type": {
- "type": "dict",
- "children": [
- {
- "type": "label",
- "label": ""
- },
- {
- "key": "families",
- "label": "Families",
- "type": "list",
- "object_type": "text"
- },
- {
- "type": "hosts-enum",
- "key": "hosts",
- "label": "Hosts",
- "multiselection": true
- },
- {
- "key": "task_types",
- "label": "Task types",
- "type": "task-types-enum"
- },
- {
- "key": "tasks",
- "label": "Task names",
- "type": "list",
- "object_type": "text"
- },
- {
- "type": "separator"
- },
- {
- "type": "text",
- "key": "template_name",
- "label": "Template name"
- }
- ]
- }
- }
- ]
- },
- {
- "type": "dict",
- "collapsible": true,
- "key": "IntegrateAsset",
- "label": "Integrate Asset",
- "is_group": true,
- "children": [
- {
- "type": "list",
- "key": "skip_host_families",
- "label": "Skip hosts and families",
- "use_label_wrap": true,
- "object_type": {
- "type": "dict",
- "children": [
- {
- "type": "hosts-enum",
- "key": "host",
- "label": "Host"
- },
- {
- "type": "list",
- "key": "families",
- "label": "Families",
- "object_type": "text"
- }
- ]
- }
- }
- ]
- },
{
"type": "dict",
"collapsible": true,
diff --git a/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_tools.json b/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_tools.json
index 85ec482e73..23fc7c9351 100644
--- a/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_tools.json
+++ b/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_tools.json
@@ -320,10 +320,6 @@
"key": "publish",
"label": "Publish",
"children": [
- {
- "type": "label",
- "label": "NOTE: For backwards compatibility can be value empty and in that case are used values from IntegrateAssetNew. This will change in future so please move all values here as soon as possible."
- },
{
"type": "list",
"key": "template_name_profiles",
diff --git a/openpype/settings/entities/schemas/system_schema/schema_general.json b/openpype/settings/entities/schemas/system_schema/schema_general.json
index d6c22fe54c..2609441061 100644
--- a/openpype/settings/entities/schemas/system_schema/schema_general.json
+++ b/openpype/settings/entities/schemas/system_schema/schema_general.json
@@ -128,8 +128,12 @@
{
"type": "collapsible-wrap",
"label": "OpenPype deployment control",
- "collapsible": false,
+ "collapsible": true,
"children": [
+ {
+ "type": "label",
+ "label": "Define location accessible by artist machine to check for zip updates with Openpype code."
+ },
{
"type": "path",
"key": "openpype_path",
@@ -138,6 +142,18 @@
"multipath": true,
"require_restart": true
},
+ {
+ "type": "label",
+ "label": "Define custom location for artist machine where to unzip versions of Openpype code. By default it is in user app data folder."
+ },
+ {
+ "type": "path",
+ "key": "local_openpype_path",
+ "label": "Custom Local Versions Folder",
+ "multiplatform": true,
+ "multipath": false,
+ "require_restart": true
+ },
{
"type": "splitter"
},
diff --git a/openpype/settings/handlers.py b/openpype/settings/handlers.py
index a1f3331ccc..1d4c838f1a 100644
--- a/openpype/settings/handlers.py
+++ b/openpype/settings/handlers.py
@@ -7,10 +7,14 @@ from abc import ABCMeta, abstractmethod
import six
import openpype.version
-from openpype.client.mongo import OpenPypeMongoConnection
-from openpype.client.entities import get_project_connection, get_project
+from openpype.client.mongo import (
+ OpenPypeMongoConnection,
+ get_project_connection,
+)
+from openpype.client.entities import get_project
from openpype.lib.pype_info import get_workstation_info
+
from .constants import (
GLOBAL_SETTINGS_KEY,
SYSTEM_SETTINGS_KEY,
@@ -185,6 +189,7 @@ class SettingsStateInfo:
class SettingsHandler(object):
global_keys = {
"openpype_path",
+ "local_openpype_path",
"admin_password",
"log_to_server",
"disk_mapping",
diff --git a/openpype/settings/lib.py b/openpype/settings/lib.py
index 73554df236..ce62dde43f 100644
--- a/openpype/settings/lib.py
+++ b/openpype/settings/lib.py
@@ -4,6 +4,9 @@ import functools
import logging
import platform
import copy
+
+from openpype import AYON_SERVER_ENABLED
+
from .exceptions import (
SaveWarningExc
)
@@ -18,6 +21,11 @@ from .constants import (
DEFAULT_PROJECT_KEY
)
+from .ayon_settings import (
+ get_ayon_project_settings,
+ get_ayon_system_settings
+)
+
log = logging.getLogger(__name__)
# Py2 + Py3 json decode exception
@@ -40,36 +48,17 @@ _SETTINGS_HANDLER = None
_LOCAL_SETTINGS_HANDLER = None
-def require_handler(func):
- @functools.wraps(func)
- def wrapper(*args, **kwargs):
- global _SETTINGS_HANDLER
- if _SETTINGS_HANDLER is None:
- _SETTINGS_HANDLER = create_settings_handler()
- return func(*args, **kwargs)
- return wrapper
-
-
-def require_local_handler(func):
- @functools.wraps(func)
- def wrapper(*args, **kwargs):
- global _LOCAL_SETTINGS_HANDLER
- if _LOCAL_SETTINGS_HANDLER is None:
- _LOCAL_SETTINGS_HANDLER = create_local_settings_handler()
- return func(*args, **kwargs)
- return wrapper
-
-
-def create_settings_handler():
- from .handlers import MongoSettingsHandler
- # Handler can't be created in global space on initialization but only when
- # needed. Plus here may be logic: Which handler is used (in future).
- return MongoSettingsHandler()
-
-
-def create_local_settings_handler():
- from .handlers import MongoLocalSettingsHandler
- return MongoLocalSettingsHandler()
+def clear_metadata_from_settings(values):
+ """Remove all metadata keys from loaded settings."""
+ if isinstance(values, dict):
+ for key in tuple(values.keys()):
+ if key in METADATA_KEYS:
+ values.pop(key)
+ else:
+ clear_metadata_from_settings(values[key])
+ elif isinstance(values, list):
+ for item in values:
+ clear_metadata_from_settings(item)
def calculate_changes(old_value, new_value):
@@ -91,6 +80,42 @@ def calculate_changes(old_value, new_value):
return changes
+def create_settings_handler():
+ if AYON_SERVER_ENABLED:
+ raise RuntimeError("Mongo settings handler was triggered in AYON mode")
+ from .handlers import MongoSettingsHandler
+ # Handler can't be created in global space on initialization, only when
+ # needed. Logic deciding which handler is used may be added here in future.
+ return MongoSettingsHandler()
+
+
+def create_local_settings_handler():
+ if AYON_SERVER_ENABLED:
+ raise RuntimeError("Mongo settings handler was triggered in AYON mode")
+ from .handlers import MongoLocalSettingsHandler
+ return MongoLocalSettingsHandler()
+
+
+def require_handler(func):
+ @functools.wraps(func)
+ def wrapper(*args, **kwargs):
+ global _SETTINGS_HANDLER
+ if _SETTINGS_HANDLER is None:
+ _SETTINGS_HANDLER = create_settings_handler()
+ return func(*args, **kwargs)
+ return wrapper
+
+
+def require_local_handler(func):
+ @functools.wraps(func)
+ def wrapper(*args, **kwargs):
+ global _LOCAL_SETTINGS_HANDLER
+ if _LOCAL_SETTINGS_HANDLER is None:
+ _LOCAL_SETTINGS_HANDLER = create_local_settings_handler()
+ return func(*args, **kwargs)
+ return wrapper
+
+
@require_handler
def get_system_last_saved_info():
return _SETTINGS_HANDLER.get_system_last_saved_info()
@@ -494,10 +519,17 @@ def save_local_settings(data):
@require_local_handler
-def get_local_settings():
+def _get_local_settings():
return _LOCAL_SETTINGS_HANDLER.get_local_settings()
+def get_local_settings():
+ if not AYON_SERVER_ENABLED:
+ return _get_local_settings()
+ # TODO implement ayon implementation
+ return {}
+
+
def load_openpype_default_settings():
"""Load openpype default settings."""
return load_jsons_from_dir(DEFAULTS_DIR)
@@ -890,7 +922,7 @@ def apply_local_settings_on_project_settings(
sync_server_config["remote_site"] = remote_site
-def get_system_settings(clear_metadata=True, exclude_locals=None):
+def _get_system_settings(clear_metadata=True, exclude_locals=None):
"""System settings with applied studio overrides."""
default_values = get_default_settings()[SYSTEM_SETTINGS_KEY]
studio_values = get_studio_system_settings_overrides()
@@ -992,7 +1024,7 @@ def get_anatomy_settings(
return result
-def get_project_settings(
+def _get_project_settings(
project_name, clear_metadata=True, exclude_locals=None
):
"""Project settings with applied studio and project overrides."""
@@ -1043,7 +1075,7 @@ def get_current_project_settings():
@require_handler
-def get_global_settings():
+def _get_global_settings():
default_settings = load_openpype_default_settings()
default_values = default_settings["system_settings"]["general"]
studio_values = _SETTINGS_HANDLER.get_global_settings()
@@ -1053,7 +1085,14 @@ def get_global_settings():
}
-def get_general_environments():
+def get_global_settings():
+ if not AYON_SERVER_ENABLED:
+ return _get_global_settings()
+ default_settings = load_openpype_default_settings()
+ return default_settings["system_settings"]["general"]
+
+
+def _get_general_environments():
"""Get general environments.
Function is implemented to be able load general environments without using
@@ -1082,14 +1121,24 @@ def get_general_environments():
return environments
-def clear_metadata_from_settings(values):
- """Remove all metadata keys from loaded settings."""
- if isinstance(values, dict):
- for key in tuple(values.keys()):
- if key in METADATA_KEYS:
- values.pop(key)
- else:
- clear_metadata_from_settings(values[key])
- elif isinstance(values, list):
- for item in values:
- clear_metadata_from_settings(item)
+def get_general_environments():
+ if not AYON_SERVER_ENABLED:
+ return _get_general_environments()
+ value = get_system_settings()
+ return value["general"]["environment"]
+
+
+def get_system_settings(*args, **kwargs):
+ if not AYON_SERVER_ENABLED:
+ return _get_system_settings(*args, **kwargs)
+
+ default_settings = get_default_settings()[SYSTEM_SETTINGS_KEY]
+ return get_ayon_system_settings(default_settings)
+
+
+def get_project_settings(project_name, *args, **kwargs):
+ if not AYON_SERVER_ENABLED:
+ return _get_project_settings(project_name, *args, **kwargs)
+
+ default_settings = get_default_settings()[PROJECT_SETTINGS_KEY]
+ return get_ayon_project_settings(default_settings, project_name)
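The `openpype/settings/lib.py` changes above reorganize the module so every public getter dispatches on `AYON_SERVER_ENABLED`: the Mongo-backed handlers stay behind private `_get_*` functions, while AYON mode routes to the `ayon_settings` helpers. A minimal sketch of that dispatch pattern, with hypothetical stand-in backends instead of the real OpenPype internals:

```python
# Sketch of the settings dispatch introduced above. The backend functions
# are hypothetical stand-ins; in OpenPype the flag is imported as
# 'from openpype import AYON_SERVER_ENABLED'.
AYON_SERVER_ENABLED = False


def _get_project_settings_mongo(project_name):
    # Stand-in for the Mongo handler path ('_get_project_settings').
    return {"project": project_name, "source": "mongo"}


def _get_project_settings_ayon(project_name):
    # Stand-in for 'get_ayon_project_settings'.
    return {"project": project_name, "source": "ayon"}


def get_project_settings(project_name):
    """Single public entry point; callers never pick a backend."""
    if not AYON_SERVER_ENABLED:
        return _get_project_settings_mongo(project_name)
    return _get_project_settings_ayon(project_name)


print(get_project_settings("demo")["source"])  # -> "mongo"
```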
diff --git a/openpype/tests/lib.py b/openpype/tests/lib.py
index 1fa5fb8054..c7d4423aba 100644
--- a/openpype/tests/lib.py
+++ b/openpype/tests/lib.py
@@ -5,7 +5,6 @@ import tempfile
import contextlib
import pyblish
-import pyblish.cli
import pyblish.plugin
from pyblish.vendor import six
diff --git a/openpype/tools/adobe_webserver/app.py b/openpype/tools/adobe_webserver/app.py
index 3911baf7ac..49d61d3883 100644
--- a/openpype/tools/adobe_webserver/app.py
+++ b/openpype/tools/adobe_webserver/app.py
@@ -16,7 +16,7 @@ from wsrpc_aiohttp import (
WSRPCClient
)
-from openpype.pipeline import legacy_io
+from openpype.pipeline import get_global_context
log = logging.getLogger(__name__)
@@ -80,9 +80,10 @@ class WebServerTool:
loop=asyncio.get_event_loop())
await client.connect()
- project = legacy_io.Session["AVALON_PROJECT"]
- asset = legacy_io.Session["AVALON_ASSET"]
- task = legacy_io.Session["AVALON_TASK"]
+ context = get_global_context()
+ project = context["project_name"]
+ asset = context["asset_name"]
+ task = context["task_name"]
log.info("Sending context change to {}-{}-{}".format(project,
asset,
task))
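This hunk is representative of the `legacy_io` cleanup throughout this section: session lookups such as `legacy_io.Session["AVALON_PROJECT"]` are replaced by context getters. A small sketch of the new call shape, assuming it runs inside a configured OpenPype process (the key names match the diff above):

```python
# Sketch of reading the current context after the 'legacy_io' cleanup.
# Requires a configured OpenPype process; key names match the diff above.
from openpype.pipeline import get_global_context

context = get_global_context()
print("{}-{}-{}".format(
    context["project_name"],
    context["asset_name"],
    context["task_name"],
))
```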
diff --git a/openpype/tools/context_dialog/window.py b/openpype/tools/context_dialog/window.py
index 86c53b55c5..4fe41c9949 100644
--- a/openpype/tools/context_dialog/window.py
+++ b/openpype/tools/context_dialog/window.py
@@ -5,7 +5,7 @@ from qtpy import QtWidgets, QtCore, QtGui
from openpype import style
from openpype.pipeline import AvalonMongoDB
-from openpype.tools.utils.lib import center_window
+from openpype.tools.utils.lib import center_window, get_openpype_qt_app
from openpype.tools.utils.assets_widget import SingleSelectAssetsWidget
from openpype.tools.utils.constants import (
PROJECT_NAME_ROLE
@@ -376,9 +376,7 @@ def main(
strict=True
):
# Run Qt application
- app = QtWidgets.QApplication.instance()
- if app is None:
- app = QtWidgets.QApplication([])
+ app = get_openpype_qt_app()
window = ContextDialog()
window.set_strict(strict)
window.set_context(project_name, asset_name)
diff --git a/openpype/tools/creator/widgets.py b/openpype/tools/creator/widgets.py
index 74f75811ff..0ebbd905e5 100644
--- a/openpype/tools/creator/widgets.py
+++ b/openpype/tools/creator/widgets.py
@@ -5,6 +5,7 @@ from qtpy import QtWidgets, QtCore, QtGui
import qtawesome
+from openpype import AYON_SERVER_ENABLED
from openpype.pipeline.create import SUBSET_NAME_ALLOWED_SYMBOLS
from openpype.tools.utils import ErrorMessageBox
@@ -42,10 +43,13 @@ class CreateErrorMessageBox(ErrorMessageBox):
def _get_report_data(self):
report_message = (
- "Failed to create Subset: \"{subset}\" Family: \"{family}\""
+ "Failed to create {subset_label}: \"{subset}\""
+ " {family_label}: \"{family}\""
" in Asset: \"{asset}\""
"\n\nError: {message}"
).format(
+ subset_label="Product" if AYON_SERVER_ENABLED else "Subset",
+ family_label="Type" if AYON_SERVER_ENABLED else "Family",
subset=self._subset_name,
family=self._family,
asset=self._asset_name,
@@ -57,9 +61,13 @@ class CreateErrorMessageBox(ErrorMessageBox):
def _create_content(self, content_layout):
item_name_template = (
- "Family: {} "
- "Subset: {} "
- "Asset: {} "
+ "{}: {{}} "
+ "{}: {{}} "
+ "{}: {{}} "
+ ).format(
+ "Product type" if AYON_SERVER_ENABLED else "Family",
+ "Product name" if AYON_SERVER_ENABLED else "Subset",
+ "Folder" if AYON_SERVER_ENABLED else "Asset"
)
exc_msg_template = "{}"
@@ -151,15 +159,21 @@ class VariantLineEdit(QtWidgets.QLineEdit):
def as_empty(self):
self._set_border("empty")
- self.report.emit("Empty subset name ..")
+ self.report.emit("Empty {} name ..".format(
+ "product" if AYON_SERVER_ENABLED else "subset"
+ ))
def as_exists(self):
self._set_border("exists")
- self.report.emit("Existing subset, appending next version.")
+ self.report.emit("Existing {}, appending next version.".format(
+ "product" if AYON_SERVER_ENABLED else "subset"
+ ))
def as_new(self):
self._set_border("new")
- self.report.emit("New subset, creating first version.")
+ self.report.emit("New {}, creating first version.".format(
+ "product" if AYON_SERVER_ENABLED else "subset"
+ ))
def _set_border(self, status):
qcolor, style = self.colors[status]
diff --git a/openpype/tools/creator/window.py b/openpype/tools/creator/window.py
index 57e2c49576..47f27a262a 100644
--- a/openpype/tools/creator/window.py
+++ b/openpype/tools/creator/window.py
@@ -8,7 +8,11 @@ from openpype.client import get_asset_by_name, get_subsets
from openpype import style
from openpype.settings import get_current_project_settings
from openpype.tools.utils.lib import qt_app_context
-from openpype.pipeline import legacy_io
+from openpype.pipeline import (
+ get_current_project_name,
+ get_current_asset_name,
+ get_current_task_name,
+)
from openpype.pipeline.create import (
SUBSET_NAME_ALLOWED_SYMBOLS,
legacy_create,
@@ -216,7 +220,7 @@ class CreatorWindow(QtWidgets.QDialog):
self._set_valid_state(False)
return
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
asset_doc = None
if creator_plugin:
# Get the asset from the database which match with the name
@@ -237,7 +241,7 @@ class CreatorWindow(QtWidgets.QDialog):
return
asset_id = asset_doc["_id"]
- task_name = legacy_io.Session["AVALON_TASK"]
+ task_name = get_current_task_name()
# Calculate subset name with Creator plugin
subset_name = creator_plugin.get_subset_name(
@@ -369,7 +373,7 @@ class CreatorWindow(QtWidgets.QDialog):
self.setStyleSheet(style.load_stylesheet())
def refresh(self):
- self._asset_name_input.setText(legacy_io.Session["AVALON_ASSET"])
+ self._asset_name_input.setText(get_current_asset_name())
self._creators_model.reset()
@@ -382,7 +386,7 @@ class CreatorWindow(QtWidgets.QDialog):
)
current_index = None
family = None
- task_name = legacy_io.Session.get("AVALON_TASK", None)
+ task_name = get_current_task_name() or None
lowered_task_name = task_name.lower()
if task_name:
for _family, _task_names in pype_project_setting.items():
diff --git a/openpype/tools/loader/app.py b/openpype/tools/loader/app.py
index 302fe6c366..b305233247 100644
--- a/openpype/tools/loader/app.py
+++ b/openpype/tools/loader/app.py
@@ -223,7 +223,7 @@ class LoaderWindow(QtWidgets.QDialog):
lib.schedule(self._refresh, 50, channel="mongo")
def on_assetschanged(self, *args):
- self.echo("Fetching asset..")
+ self.echo("Fetching hierarchy..")
lib.schedule(self._assetschanged, 50, channel="mongo")
def on_subsetschanged(self, *args):
diff --git a/openpype/tools/loader/model.py b/openpype/tools/loader/model.py
index e58e02f89a..5115f39a69 100644
--- a/openpype/tools/loader/model.py
+++ b/openpype/tools/loader/model.py
@@ -7,6 +7,7 @@ from uuid import uuid4
from qtpy import QtCore, QtGui
import qtawesome
+from openpype import AYON_SERVER_ENABLED
from openpype.client import (
get_assets,
get_subsets,
@@ -143,9 +144,9 @@ class SubsetsModel(BaseRepresentationModel, TreeModel):
]
column_labels_mapping = {
- "subset": "Subset",
- "asset": "Asset",
- "family": "Family",
+ "subset": "Product" if AYON_SERVER_ENABLED else "Subset",
+ "asset": "Folder" if AYON_SERVER_ENABLED else "Asset",
+ "family": "Product type" if AYON_SERVER_ENABLED else "Family",
"version": "Version",
"time": "Time",
"author": "Author",
@@ -1173,9 +1174,9 @@ class RepresentationModel(TreeModel, BaseRepresentationModel):
repre_groups_items[doc["name"]] = 0
group = group_item
- progress = lib.get_progress_for_repre(
- doc, self.active_site, self.remote_site
- )
+ progress = self.sync_server.get_progress_for_repre(
+ doc,
+ self.active_site, self.remote_site)
active_site_icon = self._icons.get(self.active_provider)
remote_site_icon = self._icons.get(self.remote_provider)
diff --git a/openpype/tools/loader/widgets.py b/openpype/tools/loader/widgets.py
index b3aa381d14..5dd3af08d6 100644
--- a/openpype/tools/loader/widgets.py
+++ b/openpype/tools/loader/widgets.py
@@ -886,7 +886,9 @@ class ThumbnailWidget(QtWidgets.QLabel):
self.set_pixmap()
return
- thumbnail_ent = get_thumbnail(project_name, thumbnail_id)
+ thumbnail_ent = get_thumbnail(
+ project_name, thumbnail_id, src_type, src_id
+ )
if not thumbnail_ent:
return
diff --git a/openpype/tools/publisher/widgets/assets_widget.py b/openpype/tools/publisher/widgets/assets_widget.py
index a750d8d540..c536f93c9b 100644
--- a/openpype/tools/publisher/widgets/assets_widget.py
+++ b/openpype/tools/publisher/widgets/assets_widget.py
@@ -2,6 +2,7 @@ import collections
from qtpy import QtWidgets, QtCore, QtGui
+from openpype import AYON_SERVER_ENABLED
from openpype.tools.utils import (
PlaceholderLineEdit,
RecursiveSortFilterProxyModel,
@@ -187,7 +188,8 @@ class AssetsDialog(QtWidgets.QDialog):
proxy_model.setFilterCaseSensitivity(QtCore.Qt.CaseInsensitive)
filter_input = PlaceholderLineEdit(self)
- filter_input.setPlaceholderText("Filter assets..")
+ filter_input.setPlaceholderText("Filter {}..".format(
+ "folders" if AYON_SERVER_ENABLED else "assets"))
asset_view = AssetDialogView(self)
asset_view.setModel(proxy_model)
diff --git a/openpype/tools/publisher/widgets/create_widget.py b/openpype/tools/publisher/widgets/create_widget.py
index b7605b1188..1940d16eb8 100644
--- a/openpype/tools/publisher/widgets/create_widget.py
+++ b/openpype/tools/publisher/widgets/create_widget.py
@@ -2,6 +2,7 @@ import re
from qtpy import QtWidgets, QtCore, QtGui
+from openpype import AYON_SERVER_ENABLED
from openpype.pipeline.create import (
SUBSET_NAME_ALLOWED_SYMBOLS,
PRE_CREATE_THUMBNAIL_KEY,
@@ -203,7 +204,9 @@ class CreateWidget(QtWidgets.QWidget):
variant_subset_layout.setHorizontalSpacing(INPUTS_LAYOUT_HSPACING)
variant_subset_layout.setVerticalSpacing(INPUTS_LAYOUT_VSPACING)
variant_subset_layout.addRow("Variant", variant_widget)
- variant_subset_layout.addRow("Subset", subset_name_input)
+ variant_subset_layout.addRow(
+ "Product" if AYON_SERVER_ENABLED else "Subset",
+ subset_name_input)
creator_basics_layout = QtWidgets.QVBoxLayout(creator_basics_widget)
creator_basics_layout.setContentsMargins(0, 0, 0, 0)
diff --git a/openpype/tools/publisher/widgets/widgets.py b/openpype/tools/publisher/widgets/widgets.py
index 0b13f26d57..1bbe73381f 100644
--- a/openpype/tools/publisher/widgets/widgets.py
+++ b/openpype/tools/publisher/widgets/widgets.py
@@ -9,6 +9,7 @@ import collections
from qtpy import QtWidgets, QtCore, QtGui
import qtawesome
+from openpype import AYON_SERVER_ENABLED
from openpype.lib.attribute_definitions import UnknownDef
from openpype.tools.attribute_defs import create_widget_for_attr_def
from openpype.tools import resources
@@ -1116,10 +1117,16 @@ class GlobalAttrsWidget(QtWidgets.QWidget):
main_layout.setHorizontalSpacing(INPUTS_LAYOUT_HSPACING)
main_layout.setVerticalSpacing(INPUTS_LAYOUT_VSPACING)
main_layout.addRow("Variant", variant_input)
- main_layout.addRow("Asset", asset_value_widget)
+ main_layout.addRow(
+ "Folder" if AYON_SERVER_ENABLED else "Asset",
+ asset_value_widget)
main_layout.addRow("Task", task_value_widget)
- main_layout.addRow("Family", family_value_widget)
- main_layout.addRow("Subset", subset_value_widget)
+ main_layout.addRow(
+ "Product type" if AYON_SERVER_ENABLED else "Family",
+ family_value_widget)
+ main_layout.addRow(
+ "Product name" if AYON_SERVER_ENABLED else "Subset",
+ subset_value_widget)
main_layout.addRow(btns_layout)
variant_input.value_changed.connect(self._on_variant_change)
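The creator and publisher widgets above all apply the same conditional labeling, renaming Subset/Family/Asset to Product/Product type/Folder when AYON is enabled. The pattern, reduced to a standalone illustrative helper (the mapping mirrors the labels in the diffs):

```python
# The AYON label-swap pattern from the UI diffs above, as a standalone
# helper. In OpenPype the flag comes from
# 'from openpype import AYON_SERVER_ENABLED'; here it is a plain constant.
AYON_SERVER_ENABLED = True

_LABELS = {
    "subset": ("Subset", "Product"),
    "family": ("Family", "Product type"),
    "asset": ("Asset", "Folder"),
}


def ui_label(key):
    openpype_label, ayon_label = _LABELS[key]
    return ayon_label if AYON_SERVER_ENABLED else openpype_label


assert ui_label("asset") == "Folder"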
diff --git a/openpype/tools/push_to_project/app.py b/openpype/tools/push_to_project/app.py
index 9ca5fd83e9..b3ec33f353 100644
--- a/openpype/tools/push_to_project/app.py
+++ b/openpype/tools/push_to_project/app.py
@@ -1,6 +1,6 @@
import click
-from qtpy import QtWidgets, QtCore
+from openpype.tools.utils import get_openpype_qt_app
from openpype.tools.push_to_project.window import PushToContextSelectWindow
@@ -15,20 +15,7 @@ def main(project, version):
version (str): Version id.
"""
- app = QtWidgets.QApplication.instance()
- if not app:
- # 'AA_EnableHighDpiScaling' must be set before app instance creation
- high_dpi_scale_attr = getattr(
- QtCore.Qt, "AA_EnableHighDpiScaling", None
- )
- if high_dpi_scale_attr is not None:
- QtWidgets.QApplication.setAttribute(high_dpi_scale_attr)
-
- app = QtWidgets.QApplication([])
-
- attr = getattr(QtCore.Qt, "AA_UseHighDpiPixmaps", None)
- if attr is not None:
- app.setAttribute(attr)
+ app = get_openpype_qt_app()
window = PushToContextSelectWindow()
window.show()
diff --git a/openpype/tools/sceneinventory/lib.py b/openpype/tools/sceneinventory/lib.py
index 5db3c479c5..4b1860342a 100644
--- a/openpype/tools/sceneinventory/lib.py
+++ b/openpype/tools/sceneinventory/lib.py
@@ -28,55 +28,3 @@ def get_site_icons():
return icons
-
-def get_progress_for_repre(repre_doc, active_site, remote_site):
- """
- Calculates average progress for representation.
-
- If site has created_dt >> fully available >> progress == 1
-
- Could be calculated in aggregate if it would be too slow
- Args:
- repre_doc(dict): representation dict
- Returns:
- (dict) with active and remote sites progress
- {'studio': 1.0, 'gdrive': -1} - gdrive site is not present
- -1 is used to highlight the site should be added
- {'studio': 1.0, 'gdrive': 0.0} - gdrive site is present, not
- uploaded yet
- """
- progress = {active_site: -1, remote_site: -1}
- if not repre_doc:
- return progress
-
- files = {active_site: 0, remote_site: 0}
- doc_files = repre_doc.get("files") or []
- for doc_file in doc_files:
- if not isinstance(doc_file, dict):
- continue
-
- sites = doc_file.get("sites") or []
- for site in sites:
- if (
- # Pype 2 compatibility
- not isinstance(site, dict)
- # Check if site name is one of progress sites
- or site["name"] not in progress
- ):
- continue
-
- files[site["name"]] += 1
- norm_progress = max(progress[site["name"]], 0)
- if site.get("created_dt"):
- progress[site["name"]] = norm_progress + 1
- elif site.get("progress"):
- progress[site["name"]] = norm_progress + site["progress"]
- else: # site exists, might be failed, do not add again
- progress[site["name"]] = 0
-
- # for example 13 fully avail. files out of 26 >> 13/26 = 0.5
- avg_progress = {
- active_site: progress[active_site] / max(files[active_site], 1),
- remote_site: progress[remote_site] / max(files[remote_site], 1)
- }
- return avg_progress
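The removed helper (duplicated in `openpype/tools/utils/lib.py` below) is replaced by `sync_server.get_progress_for_repre(...)` at the call sites. For reference, its averaging logic can be reproduced standalone: each file contributes 1.0 for a site with `created_dt`, its partial `progress` otherwise, and `-1` marks a site absent from the document.

```python
# Standalone reproduction of the removed progress math (illustrative;
# the production implementation now lives on the sync server module).
def progress_for_repre(repre_doc, active_site, remote_site):
    progress = {active_site: -1, remote_site: -1}
    files = {active_site: 0, remote_site: 0}
    for doc_file in (repre_doc or {}).get("files") or []:
        for site in doc_file.get("sites") or []:
            name = site.get("name")
            if name not in progress:
                continue
            files[name] += 1
            current = max(progress[name], 0)
            if site.get("created_dt"):
                progress[name] = current + 1
            elif site.get("progress"):
                progress[name] = current + site["progress"]
            else:
                progress[name] = 0  # site exists but nothing transferred
    # e.g. 13 fully available files out of 26 -> 0.5
    return {
        site: progress[site] / max(files[site], 1)
        for site in (active_site, remote_site)
    }


doc = {"files": [{"sites": [{"name": "studio", "created_dt": "2023-01-01"}]}]}
print(progress_for_repre(doc, "studio", "gdrive"))
# -> {'studio': 1.0, 'gdrive': -1.0}
```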
diff --git a/openpype/tools/sceneinventory/model.py b/openpype/tools/sceneinventory/model.py
index 5cc849bb9e..1cfcd0d8c0 100644
--- a/openpype/tools/sceneinventory/model.py
+++ b/openpype/tools/sceneinventory/model.py
@@ -15,7 +15,7 @@ from openpype.client import (
get_representation_by_id,
)
from openpype.pipeline import (
- legacy_io,
+ get_current_project_name,
schema,
HeroVersionType,
registered_host,
@@ -27,7 +27,6 @@ from openpype.modules import ModulesManager
from .lib import (
get_site_icons,
walk_hierarchy,
- get_progress_for_repre
)
@@ -63,7 +62,7 @@ class InventoryModel(TreeModel):
if not self.sync_enabled:
return
- project_name = legacy_io.current_project()
+ project_name = get_current_project_name()
active_site = sync_server.get_active_site(project_name)
remote_site = sync_server.get_remote_site(project_name)
@@ -80,7 +79,7 @@ class InventoryModel(TreeModel):
project_name, remote_site
)
- # self.sync_server = sync_server
+ self.sync_server = sync_server
self.active_site = active_site
self.active_provider = active_provider
self.remote_site = remote_site
@@ -321,7 +320,7 @@ class InventoryModel(TreeModel):
"""
# NOTE: @iLLiCiTiT this need refactor
- project_name = legacy_io.active_project()
+ project_name = get_current_project_name()
self.beginResetModel()
@@ -445,7 +444,7 @@ class InventoryModel(TreeModel):
group_node["group"] = subset["data"].get("subsetGroup")
if self.sync_enabled:
- progress = get_progress_for_repre(
+ progress = self.sync_server.get_progress_for_repre(
representation, self.active_site, self.remote_site
)
group_node["active_site"] = self.active_site
diff --git a/openpype/tools/sceneinventory/view.py b/openpype/tools/sceneinventory/view.py
index 57e6e24411..d22b2bdd0f 100644
--- a/openpype/tools/sceneinventory/view.py
+++ b/openpype/tools/sceneinventory/view.py
@@ -23,7 +23,6 @@ from openpype.pipeline import (
)
from openpype.modules import ModulesManager
from openpype.tools.utils.lib import (
- get_progress_for_repre,
iter_model_rows,
format_version
)
@@ -361,7 +360,7 @@ class SceneInventoryView(QtWidgets.QTreeView):
if not repre_doc:
continue
- progress = get_progress_for_repre(
+ progress = self.sync_server.get_progress_for_repre(
repre_doc,
active_site,
remote_site
diff --git a/openpype/tools/settings/__init__.py b/openpype/tools/settings/__init__.py
index a5b1ea51a5..04f64e13f1 100644
--- a/openpype/tools/settings/__init__.py
+++ b/openpype/tools/settings/__init__.py
@@ -1,7 +1,8 @@
import sys
-from qtpy import QtWidgets, QtGui
+from qtpy import QtGui
from openpype import style
+from openpype.tools.utils import get_openpype_qt_app
from .lib import (
BTN_FIXED_SIZE,
CHILD_OFFSET
@@ -24,9 +25,7 @@ def main(user_role=None):
user_role, ", ".join(allowed_roles)
))
- app = QtWidgets.QApplication.instance()
- if not app:
- app = QtWidgets.QApplication(sys.argv)
+ app = get_openpype_qt_app()
app.setWindowIcon(QtGui.QIcon(style.app_icon_path()))
widget = MainWidget(user_role)
diff --git a/openpype/tools/standalonepublish/widgets/widget_asset.py b/openpype/tools/standalonepublish/widgets/widget_asset.py
index 5da25a0c3e..669366dd1d 100644
--- a/openpype/tools/standalonepublish/widgets/widget_asset.py
+++ b/openpype/tools/standalonepublish/widgets/widget_asset.py
@@ -2,6 +2,7 @@ import contextlib
from qtpy import QtWidgets, QtCore
import qtawesome
+from openpype import AYON_SERVER_ENABLED
from openpype.client import (
get_projects,
get_project,
@@ -181,7 +182,8 @@ class AssetWidget(QtWidgets.QWidget):
filter = PlaceholderLineEdit()
filter.textChanged.connect(proxy.setFilterFixedString)
- filter.setPlaceholderText("Filter assets..")
+ filter.setPlaceholderText("Filter {}..".format(
+ "folders" if AYON_SERVER_ENABLED else "assets"))
header.addWidget(filter)
header.addWidget(refresh)
diff --git a/openpype/tools/tray/pype_info_widget.py b/openpype/tools/tray/pype_info_widget.py
index c616ad4dba..dc222b79b5 100644
--- a/openpype/tools/tray/pype_info_widget.py
+++ b/openpype/tools/tray/pype_info_widget.py
@@ -2,11 +2,14 @@ import os
import json
import collections
+import ayon_api
from qtpy import QtCore, QtGui, QtWidgets
from openpype import style
from openpype import resources
+from openpype import AYON_SERVER_ENABLED
from openpype.settings.lib import get_local_settings
+from openpype.lib import get_openpype_execute_args
from openpype.lib.pype_info import (
get_all_current_info,
get_openpype_info,
@@ -327,8 +330,9 @@ class PypeInfoSubWidget(QtWidgets.QWidget):
main_layout.addWidget(self._create_openpype_info_widget(), 0)
main_layout.addWidget(self._create_separator(), 0)
main_layout.addWidget(self._create_workstation_widget(), 0)
- main_layout.addWidget(self._create_separator(), 0)
- main_layout.addWidget(self._create_local_settings_widget(), 0)
+ if not AYON_SERVER_ENABLED:
+ main_layout.addWidget(self._create_separator(), 0)
+ main_layout.addWidget(self._create_local_settings_widget(), 0)
main_layout.addWidget(self._create_separator(), 0)
main_layout.addWidget(self._create_environ_widget(), 1)
@@ -425,31 +429,59 @@ class PypeInfoSubWidget(QtWidgets.QWidget):
def _create_openpype_info_widget(self):
"""Create widget with information about OpenPype application."""
- # Get pype info data
- pype_info = get_openpype_info()
- # Modify version key/values
- version_value = "{} ({})".format(
- pype_info.pop("version", self.not_applicable),
- pype_info.pop("version_type", self.not_applicable)
- )
- pype_info["version_value"] = version_value
- # Prepare label mapping
- key_label_mapping = {
- "version_value": "Running version:",
- "build_verison": "Build version:",
- "executable": "OpenPype executable:",
- "pype_root": "OpenPype location:",
- "mongo_url": "OpenPype Mongo URL:"
- }
- # Prepare keys order
- keys_order = [
- "version_value",
- "build_verison",
- "executable",
- "pype_root",
- "mongo_url"
- ]
- for key in pype_info.keys():
+ if AYON_SERVER_ENABLED:
+ executable_args = get_openpype_execute_args()
+ username = "N/A"
+ user_info = ayon_api.get_user()
+ if user_info:
+ username = user_info.get("name") or username
+ full_name = user_info.get("attrib", {}).get("fullName")
+ if full_name:
+ username = "{} ({})".format(full_name, username)
+ info_values = {
+ "executable": executable_args[-1],
+ "server_url": os.environ["AYON_SERVER_URL"],
+ "username": username
+ }
+ key_label_mapping = {
+ "executable": "AYON Executable:",
+ "server_url": "AYON Server:",
+ "username": "AYON Username:"
+ }
+ # Prepare keys order
+ keys_order = [
+ "server_url",
+ "username",
+ "executable",
+ ]
+
+ else:
+ # Get pype info data
+ info_values = get_openpype_info()
+ # Modify version key/values
+ version_value = "{} ({})".format(
+ info_values.pop("version", self.not_applicable),
+ info_values.pop("version_type", self.not_applicable)
+ )
+ info_values["version_value"] = version_value
+ # Prepare label mapping
+ key_label_mapping = {
+ "version_value": "Running version:",
+ "build_verison": "Build version:",
+ "executable": "OpenPype executable:",
+ "pype_root": "OpenPype location:",
+ "mongo_url": "OpenPype Mongo URL:"
+ }
+ # Prepare keys order
+ keys_order = [
+ "version_value",
+ "build_verison",
+ "executable",
+ "pype_root",
+ "mongo_url"
+ ]
+
+ for key in info_values.keys():
if key not in keys_order:
keys_order.append(key)
@@ -466,9 +498,9 @@ class PypeInfoSubWidget(QtWidgets.QWidget):
info_layout.addWidget(title_label, 0, 0, 1, 2)
for key in keys_order:
- if key not in pype_info:
+ if key not in info_values:
continue
- value = pype_info[key]
+ value = info_values[key]
label = key_label_mapping.get(key, key)
row = info_layout.rowCount()
info_layout.addWidget(
diff --git a/openpype/tools/tray/pype_tray.py b/openpype/tools/tray/pype_tray.py
index fdc0a8094d..a5876ca721 100644
--- a/openpype/tools/tray/pype_tray.py
+++ b/openpype/tools/tray/pype_tray.py
@@ -8,6 +8,7 @@ import platform
from qtpy import QtCore, QtGui, QtWidgets
import openpype.version
+from openpype import AYON_SERVER_ENABLED
from openpype import resources, style
from openpype.lib import (
Logger,
@@ -35,7 +36,8 @@ from openpype.settings import (
from openpype.tools.utils import (
WrappedCallbackItem,
paint_image_with_color,
- get_warning_pixmap
+ get_warning_pixmap,
+ get_openpype_qt_app,
)
from .pype_info_widget import PypeInfoWidget
@@ -589,6 +591,11 @@ class TrayManager:
self.tray_widget.showMessage(*args, **kwargs)
def _add_version_item(self):
+ if AYON_SERVER_ENABLED:
+ login_action = QtWidgets.QAction("Login", self.tray_widget)
+ login_action.triggered.connect(self._on_ayon_login)
+ self.tray_widget.menu.addAction(login_action)
+
subversion = os.environ.get("OPENPYPE_SUBVERSION")
client_name = os.environ.get("OPENPYPE_CLIENT")
@@ -614,6 +621,19 @@ class TrayManager:
self._restart_action = restart_action
+ def _on_ayon_login(self):
+ self.execute_in_main_thread(self._show_ayon_login)
+
+ def _show_ayon_login(self):
+ from ayon_common.connection.credentials import change_user_ui
+
+ result = change_user_ui()
+ if result.shutdown:
+ self.exit()
+
+ elif result.restart or result.token_changed:
+ self.restart()
+
def _on_restart_action(self):
self.restart(use_expected_version=True)
@@ -839,37 +859,7 @@ class PypeTrayStarter(QtCore.QObject):
def main():
- log = Logger.get_logger(__name__)
- app = QtWidgets.QApplication.instance()
-
- high_dpi_scale_attr = None
- if not app:
- # 'AA_EnableHighDpiScaling' must be set before app instance creation
- high_dpi_scale_attr = getattr(
- QtCore.Qt, "AA_EnableHighDpiScaling", None
- )
- if high_dpi_scale_attr is not None:
- QtWidgets.QApplication.setAttribute(high_dpi_scale_attr)
-
- app = QtWidgets.QApplication([])
-
- if high_dpi_scale_attr is None:
- log.debug((
- "Attribute 'AA_EnableHighDpiScaling' was not set."
- " UI quality may be affected."
- ))
-
- for attr_name in (
- "AA_UseHighDpiPixmaps",
- ):
- attr = getattr(QtCore.Qt, attr_name, None)
- if attr is None:
- log.debug((
- "Missing QtCore.Qt attribute \"{}\"."
- " UI quality may be affected."
- ).format(attr_name))
- else:
- app.setAttribute(attr)
+ app = get_openpype_qt_app()
starter = PypeTrayStarter(app)
diff --git a/openpype/tools/traypublisher/window.py b/openpype/tools/traypublisher/window.py
index 3ac1b4c4ad..a1ed38dcc0 100644
--- a/openpype/tools/traypublisher/window.py
+++ b/openpype/tools/traypublisher/window.py
@@ -17,7 +17,7 @@ from openpype.pipeline import install_host
from openpype.hosts.traypublisher.api import TrayPublisherHost
from openpype.tools.publisher.control_qt import QtPublisherController
from openpype.tools.publisher.window import PublisherWindow
-from openpype.tools.utils import PlaceholderLineEdit
+from openpype.tools.utils import PlaceholderLineEdit, get_openpype_qt_app
from openpype.tools.utils.constants import PROJECT_NAME_ROLE
from openpype.tools.utils.models import (
ProjectModel,
@@ -263,9 +263,7 @@ def main():
host = TrayPublisherHost()
install_host(host)
- app_instance = QtWidgets.QApplication.instance()
- if app_instance is None:
- app_instance = QtWidgets.QApplication([])
+ app_instance = get_openpype_qt_app()
if platform.system().lower() == "windows":
import ctypes
diff --git a/openpype/tools/utils/__init__.py b/openpype/tools/utils/__init__.py
index 10bd527692..f35bfaee70 100644
--- a/openpype/tools/utils/__init__.py
+++ b/openpype/tools/utils/__init__.py
@@ -25,6 +25,7 @@ from .lib import (
set_style_property,
DynamicQThread,
qt_app_context,
+ get_openpype_qt_app,
get_asset_icon,
get_asset_icon_by_name,
get_asset_icon_name_from_doc,
@@ -68,6 +69,7 @@ __all__ = (
"set_style_property",
"DynamicQThread",
"qt_app_context",
+ "get_openpype_qt_app",
"get_asset_icon",
"get_asset_icon_by_name",
"get_asset_icon_name_from_doc",
diff --git a/openpype/tools/utils/assets_widget.py b/openpype/tools/utils/assets_widget.py
index ffbdd995d6..a45d762c73 100644
--- a/openpype/tools/utils/assets_widget.py
+++ b/openpype/tools/utils/assets_widget.py
@@ -5,6 +5,7 @@ import qtpy
from qtpy import QtWidgets, QtCore, QtGui
import qtawesome
+from openpype import AYON_SERVER_ENABLED
from openpype.client import (
get_project,
get_assets,
@@ -607,7 +608,8 @@ class AssetsWidget(QtWidgets.QWidget):
refresh_btn.setToolTip("Refresh items")
filter_input = PlaceholderLineEdit(header_widget)
- filter_input.setPlaceholderText("Filter assets..")
+ filter_input.setPlaceholderText("Filter {}..".format(
+ "folders" if AYON_SERVER_ENABLED else "assets"))
# Header
header_layout = QtWidgets.QHBoxLayout(header_widget)
diff --git a/openpype/tools/utils/host_tools.py b/openpype/tools/utils/host_tools.py
index ac242d24d2..bc4b7867c2 100644
--- a/openpype/tools/utils/host_tools.py
+++ b/openpype/tools/utils/host_tools.py
@@ -10,7 +10,7 @@ from openpype.host import IWorkfileHost, ILoadHost
from openpype.lib import Logger
from openpype.pipeline import (
registered_host,
- legacy_io,
+ get_current_asset_name,
)
from .lib import qt_app_context
@@ -96,7 +96,7 @@ class HostToolsHelper:
use_context = False
if use_context:
- context = {"asset": legacy_io.Session["AVALON_ASSET"]}
+ context = {"asset": get_current_asset_name()}
loader_tool.set_context(context, refresh=True)
else:
loader_tool.refresh()
diff --git a/openpype/tools/utils/lib.py b/openpype/tools/utils/lib.py
index 58ece7c68f..82ca23c848 100644
--- a/openpype/tools/utils/lib.py
+++ b/openpype/tools/utils/lib.py
@@ -14,11 +14,16 @@ from openpype.client import (
from openpype.style import (
get_default_entity_icon_color,
get_objected_colors,
+ get_app_icon_path,
)
from openpype.resources import get_image_path
from openpype.lib import filter_profiles, Logger
from openpype.settings import get_project_settings
-from openpype.pipeline import registered_host
+from openpype.pipeline import (
+ registered_host,
+ get_current_context,
+ get_current_host_name,
+)
from .constants import CHECKED_INT, UNCHECKED_INT
@@ -46,7 +51,6 @@ def checkstate_enum_to_int(state):
return 2
-
def center_window(window):
"""Move window to center of it's screen."""
@@ -149,6 +153,36 @@ def qt_app_context():
yield app
+def get_openpype_qt_app():
+ """Main Qt application initialized for OpenPype processed.
+
+ This function should be used only inside OpenPype process and never inside
+ other processes.
+ """
+
+ app = QtWidgets.QApplication.instance()
+ if app is None:
+ for attr_name in (
+ "AA_EnableHighDpiScaling",
+ "AA_UseHighDpiPixmaps",
+ ):
+ attr = getattr(QtCore.Qt, attr_name, None)
+ if attr is not None:
+ QtWidgets.QApplication.setAttribute(attr)
+
+ if hasattr(
+ QtWidgets.QApplication, "setHighDpiScaleFactorRoundingPolicy"
+ ):
+ QtWidgets.QApplication.setHighDpiScaleFactorRoundingPolicy(
+ QtCore.Qt.HighDpiScaleFactorRoundingPolicy.PassThrough
+ )
+
+ app = QtWidgets.QApplication(sys.argv)
+
+ app.setWindowIcon(QtGui.QIcon(get_app_icon_path()))
+ return app
+
+
class SharedObjects:
jobs = {}
icons = {}
@@ -496,10 +530,11 @@ class FamilyConfigCache:
return
# Update the icons from the project configuration
- project_name = os.environ.get("AVALON_PROJECT")
- asset_name = os.environ.get("AVALON_ASSET")
- task_name = os.environ.get("AVALON_TASK")
- host_name = os.environ.get("AVALON_APP")
+ context = get_current_context()
+ project_name = context["project_name"]
+ asset_name = context["asset_name"]
+ task_name = context["task_name"]
+ host_name = get_current_host_name()
if not all((project_name, asset_name, task_name)):
return
@@ -752,61 +787,6 @@ def get_repre_icons():
return icons
-def get_progress_for_repre(doc, active_site, remote_site):
- """
- Calculates average progress for representation.
-
- If site has created_dt >> fully available >> progress == 1
-
- Could be calculated in aggregate if it would be too slow
- Args:
- doc(dict): representation dict
- Returns:
- (dict) with active and remote sites progress
- {'studio': 1.0, 'gdrive': -1} - gdrive site is not present
- -1 is used to highlight the site should be added
- {'studio': 1.0, 'gdrive': 0.0} - gdrive site is present, not
- uploaded yet
- """
- progress = {active_site: -1,
- remote_site: -1}
- if not doc:
- return progress
-
- files = {active_site: 0, remote_site: 0}
- doc_files = doc.get("files") or []
- for doc_file in doc_files:
- if not isinstance(doc_file, dict):
- continue
-
- sites = doc_file.get("sites") or []
- for site in sites:
- if (
- # Pype 2 compatibility
- not isinstance(site, dict)
- # Check if site name is one of progress sites
- or site["name"] not in progress
- ):
- continue
-
- files[site["name"]] += 1
- norm_progress = max(progress[site["name"]], 0)
- if site.get("created_dt"):
- progress[site["name"]] = norm_progress + 1
- elif site.get("progress"):
- progress[site["name"]] = norm_progress + site["progress"]
- else: # site exists, might be failed, do not add again
- progress[site["name"]] = 0
-
- # for example 13 fully avail. files out of 26 >> 13/26 = 0.5
- avg_progress = {}
- avg_progress[active_site] = \
- progress[active_site] / max(files[active_site], 1)
- avg_progress[remote_site] = \
- progress[remote_site] / max(files[remote_site], 1)
- return avg_progress
-
-
def is_sync_loader(loader):
return is_remove_site_loader(loader) or is_add_site_loader(loader)
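With the duplicate removed here, Qt bootstrapping is centralized in `get_openpype_qt_app` above: high-DPI attributes are set before the `QApplication` is created and the window icon is applied once. A typical tool entry point then shrinks to a sketch like this (assuming an OpenPype environment with Qt bindings; the widget is a stand-in for a real tool window):

```python
# Sketch of a tool entry point after centralizing Qt bootstrapping.
# Assumes an OpenPype environment with Qt bindings available.
from qtpy import QtWidgets

from openpype.tools.utils import get_openpype_qt_app


def main():
    app = get_openpype_qt_app()  # reuses or creates the shared QApplication
    window = QtWidgets.QWidget()  # stand-in for a real tool window
    window.show()
    app.exec_()


if __name__ == "__main__":
    main()
```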
diff --git a/openpype/tools/workfiles/files_widget.py b/openpype/tools/workfiles/files_widget.py
index 2f338cf516..e4715a0340 100644
--- a/openpype/tools/workfiles/files_widget.py
+++ b/openpype/tools/workfiles/files_widget.py
@@ -21,6 +21,7 @@ from openpype.pipeline import (
registered_host,
legacy_io,
Anatomy,
+ get_current_project_name,
)
from openpype.pipeline.context_tools import (
compute_session_changes,
@@ -99,7 +100,7 @@ class FilesWidget(QtWidgets.QWidget):
self._task_type = None
# Pype's anatomy object for current project
- project_name = legacy_io.Session["AVALON_PROJECT"]
+ project_name = get_current_project_name()
self.anatomy = Anatomy(project_name)
self.project_name = project_name
# Template key used to get work template from anatomy templates
diff --git a/openpype/tools/workfiles/window.py b/openpype/tools/workfiles/window.py
index 53f8894665..50c39d4a40 100644
--- a/openpype/tools/workfiles/window.py
+++ b/openpype/tools/workfiles/window.py
@@ -15,7 +15,12 @@ from openpype.client.operations import (
)
from openpype import style
from openpype import resources
-from openpype.pipeline import Anatomy
+from openpype.pipeline import (
+ Anatomy,
+ get_current_project_name,
+ get_current_asset_name,
+ get_current_task_name,
+)
from openpype.pipeline import legacy_io
from openpype.tools.utils.assets_widget import SingleSelectAssetsWidget
from openpype.tools.utils.tasks_widget import TasksWidget
@@ -285,8 +290,8 @@ class Window(QtWidgets.QWidget):
if use_context is None or use_context is True:
context = {
- "asset": legacy_io.Session["AVALON_ASSET"],
- "task": legacy_io.Session["AVALON_TASK"]
+ "asset": get_current_asset_name(),
+ "task": get_current_task_name()
}
self.set_context(context)
@@ -296,7 +301,7 @@ class Window(QtWidgets.QWidget):
@property
def project_name(self):
- return legacy_io.Session["AVALON_PROJECT"]
+ return get_current_project_name()
def showEvent(self, event):
super(Window, self).showEvent(event)
@@ -325,7 +330,7 @@ class Window(QtWidgets.QWidget):
workfile_doc = None
if asset_id and task_name and filepath:
filename = os.path.split(filepath)[1]
- project_name = legacy_io.active_project()
+ project_name = self.project_name
workfile_doc = get_workfile_info(
project_name, asset_id, task_name, filename
)
@@ -356,7 +361,7 @@ class Window(QtWidgets.QWidget):
if not update_data:
return
- project_name = legacy_io.active_project()
+ project_name = self.project_name
session = OperationsSession()
session.update_entity(
@@ -373,7 +378,7 @@ class Window(QtWidgets.QWidget):
return
filename = os.path.split(filepath)[1]
- project_name = legacy_io.active_project()
+ project_name = self.project_name
return get_workfile_info(
project_name, asset_id, task_name, filename
)
@@ -385,7 +390,7 @@ class Window(QtWidgets.QWidget):
workdir, filename = os.path.split(filepath)
- project_name = legacy_io.active_project()
+ project_name = self.project_name
asset_id = self.assets_widget.get_selected_asset_id()
task_name = self.tasks_widget.get_selected_task_name()
diff --git a/openpype/vendor/python/common/ayon_api/__init__.py b/openpype/vendor/python/common/ayon_api/__init__.py
new file mode 100644
index 0000000000..4b4e0f3359
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/__init__.py
@@ -0,0 +1,366 @@
+from .version import __version__
+from .utils import (
+ TransferProgress,
+ slugify_string,
+ create_dependency_package_basename,
+)
+from .server_api import (
+ ServerAPI,
+)
+
+from ._api import (
+ GlobalServerAPI,
+ ServiceContext,
+
+ init_service,
+ get_service_name,
+ get_service_addon_name,
+ get_service_addon_version,
+ get_service_addon_settings,
+
+ is_connection_created,
+ create_connection,
+ close_connection,
+ change_token,
+ set_environments,
+ get_server_api_connection,
+ get_site_id,
+ set_site_id,
+ get_client_version,
+ set_client_version,
+ get_default_settings_variant,
+ set_default_settings_variant,
+
+ get_base_url,
+ get_rest_url,
+
+ raw_get,
+ raw_post,
+ raw_put,
+ raw_patch,
+ raw_delete,
+
+ get,
+ post,
+ put,
+ patch,
+ delete,
+
+ get_event,
+ get_events,
+ dispatch_event,
+ update_event,
+ enroll_event_job,
+
+ download_file,
+ upload_file,
+
+ query_graphql,
+
+ get_addons_info,
+ get_addon_url,
+ download_addon_private_file,
+
+ get_installers,
+ create_installer,
+ update_installer,
+ delete_installer,
+ download_installer,
+ upload_installer,
+
+ get_dependencies_info,
+ update_dependency_info,
+ get_dependency_packages,
+ create_dependency_package,
+ update_dependency_package,
+ delete_dependency_package,
+
+ download_dependency_package,
+ upload_dependency_package,
+
+ get_bundles,
+ create_bundle,
+ update_bundle,
+ delete_bundle,
+
+ get_info,
+ get_server_version,
+ get_server_version_tuple,
+ get_user,
+ get_users,
+
+ get_attributes_for_type,
+ get_default_fields_for_type,
+
+ get_project_anatomy_preset,
+ get_project_anatomy_presets,
+ get_project_roots_by_site,
+ get_project_roots_for_site,
+
+ get_addon_site_settings_schema,
+ get_addon_settings_schema,
+
+ get_addon_studio_settings,
+ get_addon_project_settings,
+ get_addon_settings,
+ get_bundle_settings,
+ get_addons_studio_settings,
+ get_addons_project_settings,
+ get_addons_settings,
+
+ get_project_names,
+ get_projects,
+ get_project,
+ create_project,
+ update_project,
+ delete_project,
+
+ get_folder_by_id,
+ get_folder_by_name,
+ get_folder_by_path,
+ get_folders,
+ get_folders_hierarchy,
+
+ get_tasks,
+
+ get_folder_ids_with_products,
+ get_product_by_id,
+ get_product_by_name,
+ get_products,
+ get_product_types,
+ get_project_product_types,
+ get_product_type_names,
+
+ get_version_by_id,
+ get_version_by_name,
+ version_is_latest,
+ get_versions,
+ get_hero_version_by_product_id,
+ get_hero_version_by_id,
+ get_hero_versions,
+ get_last_versions,
+ get_last_version_by_product_id,
+ get_last_version_by_product_name,
+ get_representation_by_id,
+ get_representation_by_name,
+ get_representations,
+ get_representations_parents,
+ get_representation_parents,
+ get_repre_ids_by_context_filters,
+
+ get_workfiles_info,
+ get_workfile_info,
+ get_workfile_info_by_id,
+
+ get_thumbnail,
+ get_folder_thumbnail,
+ get_version_thumbnail,
+ get_workfile_thumbnail,
+ create_thumbnail,
+ update_thumbnail,
+
+ get_full_link_type_name,
+ get_link_types,
+ get_link_type,
+ create_link_type,
+ delete_link_type,
+ make_sure_link_type_exists,
+
+ create_link,
+ delete_link,
+ get_entities_links,
+ get_folder_links,
+ get_folders_links,
+ get_task_links,
+ get_tasks_links,
+ get_product_links,
+ get_products_links,
+ get_version_links,
+ get_versions_links,
+ get_representations_links,
+ get_representation_links,
+
+ send_batch_operations,
+)
+
+
+__all__ = (
+ "__version__",
+
+ "TransferProgress",
+ "slugify_string",
+ "create_dependency_package_basename",
+
+ "ServerAPI",
+
+ "GlobalServerAPI",
+ "ServiceContext",
+
+ "init_service",
+ "get_service_name",
+ "get_service_addon_name",
+ "get_service_addon_version",
+ "get_service_addon_settings",
+
+ "is_connection_created",
+ "create_connection",
+ "close_connection",
+ "change_token",
+ "set_environments",
+ "get_server_api_connection",
+ "get_site_id",
+ "set_site_id",
+ "get_client_version",
+ "set_client_version",
+ "get_default_settings_variant",
+ "set_default_settings_variant",
+
+ "get_base_url",
+ "get_rest_url",
+
+ "raw_get",
+ "raw_post",
+ "raw_put",
+ "raw_patch",
+ "raw_delete",
+
+ "get",
+ "post",
+ "put",
+ "patch",
+ "delete",
+
+ "get_event",
+ "get_events",
+ "dispatch_event",
+ "update_event",
+ "enroll_event_job",
+
+ "download_file",
+ "upload_file",
+
+ "query_graphql",
+
+ "get_addons_info",
+ "get_addon_url",
+ "download_addon_private_file",
+
+ "get_installers",
+ "create_installer",
+ "update_installer",
+ "delete_installer",
+ "download_installer",
+ "upload_installer",
+
+ "get_dependencies_info",
+ "update_dependency_info",
+ "get_dependency_packages",
+ "create_dependency_package",
+ "update_dependency_package",
+ "delete_dependency_package",
+
+ "download_dependency_package",
+ "upload_dependency_package",
+
+ "get_bundles",
+ "create_bundle",
+ "update_bundle",
+ "delete_bundle",
+
+ "get_info",
+ "get_server_version",
+ "get_server_version_tuple",
+ "get_user",
+ "get_users",
+
+ "get_attributes_for_type",
+ "get_default_fields_for_type",
+
+ "get_project_anatomy_preset",
+ "get_project_anatomy_presets",
+ "get_project_roots_by_site",
+ "get_project_roots_for_site",
+
+ "get_addon_site_settings_schema",
+ "get_addon_settings_schema",
+ "get_addon_studio_settings",
+ "get_addon_project_settings",
+ "get_addon_settings",
+ "get_bundle_settings",
+ "get_addons_studio_settings",
+ "get_addons_project_settings",
+ "get_addons_settings",
+
+ "get_project_names",
+ "get_projects",
+ "get_project",
+ "create_project",
+ "update_project",
+ "delete_project",
+
+ "get_folder_by_id",
+ "get_folder_by_name",
+ "get_folder_by_path",
+ "get_folders",
+
+ "get_tasks",
+
+ "get_folder_ids_with_products",
+ "get_product_by_id",
+ "get_product_by_name",
+ "get_products",
+ "get_product_types",
+ "get_project_product_types",
+ "get_product_type_names",
+
+ "get_version_by_id",
+ "get_version_by_name",
+ "version_is_latest",
+ "get_versions",
+ "get_hero_version_by_product_id",
+ "get_hero_version_by_id",
+ "get_hero_versions",
+ "get_last_versions",
+ "get_last_version_by_product_id",
+ "get_last_version_by_product_name",
+ "get_representation_by_id",
+ "get_representation_by_name",
+ "get_representations",
+ "get_representations_parents",
+ "get_representation_parents",
+ "get_repre_ids_by_context_filters",
+
+ "get_workfiles_info",
+ "get_workfile_info",
+ "get_workfile_info_by_id",
+
+ "get_thumbnail",
+ "get_folder_thumbnail",
+ "get_version_thumbnail",
+ "get_workfile_thumbnail",
+ "create_thumbnail",
+ "update_thumbnail",
+
+ "get_full_link_type_name",
+ "get_link_types",
+ "get_link_type",
+ "create_link_type",
+ "delete_link_type",
+ "make_sure_link_type_exists",
+
+ "create_link",
+ "delete_link",
+ "get_entities_links",
+ "get_folder_links",
+ "get_folders_links",
+ "get_task_links",
+ "get_tasks_links",
+ "get_product_links",
+ "get_products_links",
+ "get_version_links",
+ "get_versions_links",
+ "get_representations_links",
+ "get_representation_links",
+
+ "send_batch_operations",
+)
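The vendored `ayon_api` package exposes these names as module-level functions backed by a lazily created `GlobalServerAPI` singleton (see `_api.py` below). A minimal usage sketch, assuming `AYON_SERVER_URL` and `AYON_API_KEY` point at a reachable server and that a project named `demo` exists:

```python
# Minimal usage sketch of the vendored 'ayon_api' package. Assumes the
# 'AYON_SERVER_URL' and 'AYON_API_KEY' environment variables are set and
# that a project named "demo" exists on the server.
import ayon_api

print(ayon_api.get_server_version())
user = ayon_api.get_user()
print(user["name"])
for folder in ayon_api.get_folders("demo"):
    print(folder["id"])
```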
diff --git a/openpype/vendor/python/common/ayon_api/_api.py b/openpype/vendor/python/common/ayon_api/_api.py
new file mode 100644
index 0000000000..82ffdc7527
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/_api.py
@@ -0,0 +1,1186 @@
+"""Singleton based server api for direct access.
+
+This implementation will probably be the most used part of the package. It
+gives the option to have a singleton connection to a server URL based on
+environment variable values. All public functions and classes are imported
+in '__init__.py' so they're available directly on top module import.
+"""
+
+import os
+import socket
+
+from .constants import (
+ SERVER_URL_ENV_KEY,
+ SERVER_API_ENV_KEY,
+)
+from .server_api import ServerAPI
+from .exceptions import FailedServiceInit
+
+
+class GlobalServerAPI(ServerAPI):
+ """Extended server api which also handles storing tokens and url.
+
+ The created object expects the environment variable 'AYON_SERVER_URL'
+ to be set. It also expects 'AYON_API_KEY' to be filled, but that can be
+ done afterwards by calling the 'login' method.
+ """
+
+ def __init__(self, site_id=None, client_version=None):
+ url = self.get_url()
+ token = self.get_token()
+
+ super(GlobalServerAPI, self).__init__(url, token, site_id, client_version)
+
+ self.validate_server_availability()
+ self.create_session()
+
+ def login(self, username, password):
+ """Login to the server or change user.
+
+ If user is the same as current user and token is available the
+ login is skipped.
+ """
+
+ previous_token = self._access_token
+ super(GlobalServerAPI, self).login(username, password)
+ if self.has_valid_token and previous_token != self._access_token:
+ os.environ[SERVER_API_ENV_KEY] = self._access_token
+
+ @staticmethod
+ def get_url():
+ return os.environ.get(SERVER_URL_ENV_KEY)
+
+ @staticmethod
+ def get_token():
+ return os.environ.get(SERVER_API_ENV_KEY)
+
+ @staticmethod
+ def set_environments(url, token):
+ """Change url and token environemnts in currently running process.
+
+ Args:
+ url (str): New server url.
+ token (str): User's token.
+ """
+
+ os.environ[SERVER_URL_ENV_KEY] = url or ""
+ os.environ[SERVER_API_ENV_KEY] = token or ""
+
+
+class GlobalContext:
+ """Singleton connection holder.
+
+ The goal is to avoid creating a connection on import, which can be
+ dangerous in some cases.
+ """
+
+ _connection = None
+
+ @classmethod
+ def is_connection_created(cls):
+ return cls._connection is not None
+
+ @classmethod
+ def change_token(cls, url, token):
+ GlobalServerAPI.set_environments(url, token)
+ if cls._connection is None:
+ return
+
+ if cls._connection.get_base_url() == url:
+ cls._connection.set_token(token)
+ else:
+ cls.close_connection()
+
+ @classmethod
+ def close_connection(cls):
+ if cls._connection is not None:
+ cls._connection.close_session()
+ cls._connection = None
+
+ @classmethod
+ def create_connection(cls, *args, **kwargs):
+ if cls._connection is not None:
+ cls.close_connection()
+ cls._connection = GlobalServerAPI(*args, **kwargs)
+ return cls._connection
+
+ @classmethod
+ def get_server_api_connection(cls):
+ if cls._connection is None:
+ cls.create_connection()
+ return cls._connection
+
+
+class ServiceContext:
+ """Helper for services running under server.
+
+ When a service is running from the server, the process receives
+ connection information from environment variables. This class helps to
+ initialize the values without knowing the environment variables (which
+ may change over time).
+
+ All that must be done is to call the 'init_service' function/method. The
+ arguments are for cases when the service is running in a specific
+ environment and their values are e.g. loaded from a private file, or for
+ testing purposes.
+ """
+
+ token = None
+ server_url = None
+ addon_name = None
+ addon_version = None
+ service_name = None
+
+ @staticmethod
+ def get_value_from_envs(env_keys, value=None):
+ if value:
+ return value
+
+ for env_key in env_keys:
+ value = os.environ.get(env_key)
+ if value:
+ break
+ return value
+
+ @classmethod
+ def init_service(
+ cls,
+ token=None,
+ server_url=None,
+ addon_name=None,
+ addon_version=None,
+ service_name=None,
+ connect=True
+ ):
+ token = cls.get_value_from_envs(
+ ("AY_API_KEY", "AYON_API_KEY"),
+ token
+ )
+ server_url = cls.get_value_from_envs(
+ ("AY_SERVER_URL", "AYON_SERVER_URL"),
+ server_url
+ )
+ if not server_url:
+ raise FailedServiceInit("URL to server is not set")
+
+ if not token:
+ raise FailedServiceInit(
+ "Token to server {} is not set".format(server_url)
+ )
+
+ addon_name = cls.get_value_from_envs(
+ ("AY_ADDON_NAME", "AYON_ADDON_NAME"),
+ addon_name
+ )
+ addon_version = cls.get_value_from_envs(
+ ("AY_ADDON_VERSION", "AYON_ADDON_VERSION"),
+ addon_version
+ )
+ service_name = cls.get_value_from_envs(
+ ("AY_SERVICE_NAME", "AYON_SERVICE_NAME"),
+ service_name
+ )
+
+ cls.token = token
+ cls.server_url = server_url
+ cls.addon_name = addon_name
+ cls.addon_version = addon_version
+ cls.service_name = service_name or socket.gethostname()
+
+ # Make sure required environments for GlobalServerAPI are set
+ GlobalServerAPI.set_environments(cls.server_url, cls.token)
+
+ if connect:
+ print("Connecting to server \"{}\"".format(server_url))
+ con = GlobalContext.get_server_api_connection()
+ user = con.get_user()
+ print("Logged in as user \"{}\"".format(user["name"]))
+
+
+def init_service(*args, **kwargs):
+ """Initialize current connection from service.
+
+ The service expects specific environment variables. The variables must
+ all be set to make the connection work as a service.
+ """
+
+ ServiceContext.init_service(*args, **kwargs)
+
+
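+# Hedged sketch of service initialization (the environment variable values
+# are placeholders; a real service receives them from the server):
+#
+#   os.environ["AYON_SERVER_URL"] = "https://ayon.example.com"
+#   os.environ["AYON_API_KEY"] = "<service api key>"
+#   init_service()
+
+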
+def get_service_addon_name():
+ """Name of addon which initialized service connection.
+
+    Service context must be initialized to be able to use this function. Call
+    'init_service' on your service start to do so.
+
+ Returns:
+ Union[str, None]: Name of addon or None.
+ """
+
+ return ServiceContext.addon_name
+
+
+def get_service_addon_version():
+ """Version of addon which initialized service connection.
+
+    Service context must be initialized to be able to use this function. Call
+    'init_service' on your service start to do so.
+
+ Returns:
+ Union[str, None]: Version of addon or None.
+ """
+
+ return ServiceContext.addon_version
+
+
+def get_service_name():
+ """Name of service.
+
+    Service context must be initialized to be able to use this function. Call
+    'init_service' on your service start to do so.
+
+ Returns:
+ Union[str, None]: Name of service if service was registered.
+ """
+
+ return ServiceContext.service_name
+
+
+def get_service_addon_settings():
+    """Addon settings of the addon which initialized the service.
+
+    Service context must be initialized to be able to use this function. Call
+    'init_service' on your service start to do so.
+
+ Returns:
+ Dict[str, Any]: Addon settings.
+
+ Raises:
+ ValueError: When service was not initialized.
+ """
+
+ addon_name = get_service_addon_name()
+ addon_version = get_service_addon_version()
+ if addon_name is None or addon_version is None:
+ raise ValueError("Service is not initialized")
+ return get_addon_settings(addon_name, addon_version)
+
+
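+# For example, a service would typically fetch its own settings right after
+# initialization (a hedged sketch, not taken from real service code):
+#
+#   init_service()
+#   settings = get_service_addon_settings()
+
+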
+def is_connection_created():
+ """Is global connection created.
+
+ Returns:
+        bool: True if the global connection was created.
+ """
+
+ return GlobalContext.is_connection_created()
+
+
+def create_connection(site_id=None, client_version=None):
+ """Create global connection.
+
+ Args:
+ site_id (str): Machine site id/name.
+ client_version (str): Desktop app version.
+
+ Returns:
+ GlobalServerAPI: Created connection.
+ """
+
+ return GlobalContext.create_connection(site_id, client_version)
+
+
+def close_connection():
+ """Close global connection if is connected."""
+
+ GlobalContext.close_connection()
+
+
+def change_token(url, token):
+ """Change connection token for url.
+
+    This function can also be used to change the url.
+
+ Args:
+ url (str): Server url.
+ token (str): API key token.
+ """
+
+ GlobalContext.change_token(url, token)
+
+
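+# Example (placeholder values): point the global connection to a different
+# server and token in one call:
+#
+#   change_token("https://staging.example.com", "<other api key>")
+
+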
+def set_environments(url, token):
+ """Set global environments for global connection.
+
+ Args:
+ url (Union[str, None]): Url to server or None to unset environments.
+ token (Union[str, None]): API key token to be used for connection.
+ """
+
+ GlobalServerAPI.set_environments(url, token)
+
+
+def get_server_api_connection():
+ """Access to global scope object of GlobalServerAPI.
+
+    This access expects the environment variables 'AYON_SERVER_URL'
+    and 'AYON_API_KEY' to be set.
+
+ Returns:
+ GlobalServerAPI: Object of connection to server.
+ """
+
+ return GlobalContext.get_server_api_connection()
+
+
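+# All module-level helpers below follow the same pattern: resolve the global
+# connection and forward the call. For example (hypothetical project name):
+#
+#   project = get_project("my_project")
+#
+# is equivalent to:
+#
+#   con = get_server_api_connection()
+#   project = con.get_project("my_project")
+
+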
+def get_site_id():
+ con = get_server_api_connection()
+ return con.get_site_id()
+
+
+def set_site_id(site_id):
+ """Set site id of already connected client connection.
+
+    Site id is a human-readable machine id used in the AYON desktop
+    application.
+
+ Args:
+ site_id (Union[str, None]): Site id used in connection.
+ """
+
+ con = get_server_api_connection()
+ con.set_site_id(site_id)
+
+
+def get_client_version():
+ """Version of client used to connect to server.
+
+    Client version is the version of the AYON desktop application build.
+
+ Returns:
+ str: Client version string used in connection.
+ """
+
+ con = get_server_api_connection()
+ return con.get_client_version()
+
+
+def set_client_version(client_version):
+ """Set version of already connected client connection.
+
+    Client version is the version of the AYON desktop application.
+
+ Args:
+ client_version (Union[str, None]): Client version string.
+ """
+
+ con = get_server_api_connection()
+ con.set_client_version(client_version)
+
+
+def get_default_settings_variant():
+ """Default variant used for settings.
+
+ Returns:
+        Union[str, None]: Name of variant or None.
+    """
+
+    con = get_server_api_connection()
+    return con.get_default_settings_variant()
+
+
+def set_default_settings_variant(variant):
+ """Change default variant for addon settings.
+
+ Note:
+ It is recommended to set only 'production' or 'staging' variants
+ as default variant.
+
+ Args:
+ variant (Union[str, None]): Settings variant name.
+ """
+
+ con = get_server_api_connection()
+ return con.set_default_settings_variant(variant)
+
+
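+# Example: switch the global connection to the staging settings variant
+# (a sketch; available variant names depend on the server setup):
+#
+#   set_default_settings_variant("staging")
+
+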
+def get_base_url():
+ con = get_server_api_connection()
+ return con.get_base_url()
+
+
+def get_rest_url():
+ con = get_server_api_connection()
+ return con.get_rest_url()
+
+
+def raw_get(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.raw_get(*args, **kwargs)
+
+
+def raw_post(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.raw_post(*args, **kwargs)
+
+
+def raw_put(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.raw_put(*args, **kwargs)
+
+
+def raw_patch(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.raw_patch(*args, **kwargs)
+
+
+def raw_delete(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.raw_delete(*args, **kwargs)
+
+
+def get(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get(*args, **kwargs)
+
+
+def post(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.post(*args, **kwargs)
+
+
+def put(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.put(*args, **kwargs)
+
+
+def patch(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.patch(*args, **kwargs)
+
+
+def delete(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.delete(*args, **kwargs)
+
+
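+# The 'raw_*' helpers return the raw response of the REST call, while 'get',
+# 'post', etc. wrap the same call on the global connection. A hedged sketch
+# (the endpoint path is an assumption):
+#
+#   response = get("info")
+
+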
+def get_event(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_event(*args, **kwargs)
+
+
+def get_events(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_events(*args, **kwargs)
+
+
+def dispatch_event(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.dispatch_event(*args, **kwargs)
+
+
+def update_event(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.update_event(*args, **kwargs)
+
+
+def enroll_event_job(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.enroll_event_job(*args, **kwargs)
+
+
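+# A rough sketch of how a service could process events with these helpers
+# (topic names and the exact 'enroll_event_job' arguments are assumptions):
+#
+#   job_event = enroll_event_job(
+#       "source.topic", "target.topic", sender=get_service_name()
+#   )
+#   if job_event:
+#       ...  # process the source event
+#       update_event(job_event["id"], status="finished")
+
+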
+def download_file(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.download_file(*args, **kwargs)
+
+
+def upload_file(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.upload_file(*args, **kwargs)
+
+
+def query_graphql(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.query_graphql(*args, **kwargs)
+
+
+def get_users(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_users(*args, **kwargs)
+
+
+def get_user(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_user(*args, **kwargs)
+
+
+def get_attributes_for_type(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_attributes_for_type(*args, **kwargs)
+
+
+def get_addons_info(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_addons_info(*args, **kwargs)
+
+
+def get_addon_url(addon_name, addon_version, *subpaths):
+ con = get_server_api_connection()
+ return con.get_addon_url(addon_name, addon_version, *subpaths)
+
+
+def download_addon_private_file(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.download_addon_private_file(*args, **kwargs)
+
+
+def get_info(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_info(*args, **kwargs)
+
+
+def get_server_version(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_server_version(*args, **kwargs)
+
+
+def get_server_version_tuple(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_server_version_tuple(*args, **kwargs)
+
+
+# Installers
+def get_installers(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_installers(*args, **kwargs)
+
+
+def create_installer(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.create_installer(*args, **kwargs)
+
+
+def update_installer(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.update_installer(*args, **kwargs)
+
+
+def delete_installer(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.delete_installer(*args, **kwargs)
+
+
+def download_installer(*args, **kwargs):
+ con = get_server_api_connection()
+ con.download_installer(*args, **kwargs)
+
+
+def upload_installer(*args, **kwargs):
+ con = get_server_api_connection()
+ con.upload_installer(*args, **kwargs)
+
+
+# Dependency packages
+def get_dependencies_info(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_dependencies_info(*args, **kwargs)
+
+
+def update_dependency_info(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.update_dependency_info(*args, **kwargs)
+
+
+def download_dependency_package(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.download_dependency_package(*args, **kwargs)
+
+
+def upload_dependency_package(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.upload_dependency_package(*args, **kwargs)
+
+
+def get_dependency_packages(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_dependency_packages(*args, **kwargs)
+
+
+def create_dependency_package(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.create_dependency_package(*args, **kwargs)
+
+
+def update_dependency_package(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.update_dependency_package(*args, **kwargs)
+
+
+def delete_dependency_package(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.delete_dependency_package(*args, **kwargs)
+
+
+def get_project_anatomy_presets(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_project_anatomy_presets(*args, **kwargs)
+
+
+def get_bundles(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_bundles(*args, **kwargs)
+
+
+def create_bundle(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.create_bundle(*args, **kwargs)
+
+
+def update_bundle(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.update_bundle(*args, **kwargs)
+
+
+def delete_bundle(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.delete_bundle(*args, **kwargs)
+
+
+def get_project_anatomy_preset(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_project_anatomy_preset(*args, **kwargs)
+
+
+def get_project_roots_by_site(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_project_roots_by_site(*args, **kwargs)
+
+
+def get_project_roots_for_site(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_project_roots_for_site(*args, **kwargs)
+
+
+def get_addon_settings_schema(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_addon_settings_schema(*args, **kwargs)
+
+
+def get_addon_site_settings_schema(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_addon_site_settings_schema(*args, **kwargs)
+
+
+def get_addon_studio_settings(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_addon_studio_settings(*args, **kwargs)
+
+
+def get_addon_project_settings(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_addon_project_settings(*args, **kwargs)
+
+
+def get_addon_settings(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_addon_settings(*args, **kwargs)
+
+
+def get_addon_site_settings(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_addon_site_settings(*args, **kwargs)
+
+
+def get_bundle_settings(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_bundle_settings(*args, **kwargs)
+
+
+def get_addons_studio_settings(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_addons_studio_settings(*args, **kwargs)
+
+
+def get_addons_project_settings(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_addons_project_settings(*args, **kwargs)
+
+
+def get_addons_settings(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_addons_settings(*args, **kwargs)
+
+
+def get_project_names(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_project_names(*args, **kwargs)
+
+
+def get_project(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_project(*args, **kwargs)
+
+
+def get_projects(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_projects(*args, **kwargs)
+
+
+def get_folders(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_folders(*args, **kwargs)
+
+
+def get_folders_hierarchy(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_folders_hierarchy(*args, **kwargs)
+
+
+def get_tasks(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_tasks(*args, **kwargs)
+
+
+def get_folder_by_id(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_folder_by_id(*args, **kwargs)
+
+
+def get_folder_by_path(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_folder_by_path(*args, **kwargs)
+
+
+def get_folder_by_name(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_folder_by_name(*args, **kwargs)
+
+
+def get_folder_ids_with_products(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_folder_ids_with_products(*args, **kwargs)
+
+
+def get_product_types(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_product_types(*args, **kwargs)
+
+
+def get_project_product_types(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_project_product_types(*args, **kwargs)
+
+
+def get_product_type_names(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_product_type_names(*args, **kwargs)
+
+
+def get_products(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_products(*args, **kwargs)
+
+
+def get_product_by_id(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_product_by_id(*args, **kwargs)
+
+
+def get_product_by_name(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_product_by_name(*args, **kwargs)
+
+
+def get_versions(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_versions(*args, **kwargs)
+
+
+def get_version_by_id(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_version_by_id(*args, **kwargs)
+
+
+def get_version_by_name(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_version_by_name(*args, **kwargs)
+
+
+def get_hero_version_by_id(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_hero_version_by_id(*args, **kwargs)
+
+
+def get_hero_version_by_product_id(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_hero_version_by_product_id(*args, **kwargs)
+
+
+def get_hero_versions(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_hero_versions(*args, **kwargs)
+
+
+def get_last_versions(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_last_versions(*args, **kwargs)
+
+
+def get_last_version_by_product_id(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_last_version_by_product_id(*args, **kwargs)
+
+
+def get_last_version_by_product_name(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_last_version_by_product_name(*args, **kwargs)
+
+
+def version_is_latest(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.version_is_latest(*args, **kwargs)
+
+
+def get_representations(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_representations(*args, **kwargs)
+
+
+def get_representation_by_id(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_representation_by_id(*args, **kwargs)
+
+
+def get_representation_by_name(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_representation_by_name(*args, **kwargs)
+
+
+def get_representation_parents(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_representation_parents(*args, **kwargs)
+
+
+def get_representations_parents(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_representations_parents(*args, **kwargs)
+
+
+def get_repre_ids_by_context_filters(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_repre_ids_by_context_filters(*args, **kwargs)
+
+
+def get_workfiles_info(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_workfiles_info(*args, **kwargs)
+
+
+def get_workfile_info(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_workfile_info(*args, **kwargs)
+
+
+def get_workfile_info_by_id(*args, **kwargs):
+ con = get_server_api_connection()
+ return con.get_workfile_info_by_id(*args, **kwargs)
+
+
+def create_project(
+ project_name,
+ project_code,
+ library_project=False,
+ preset_name=None
+):
+ con = get_server_api_connection()
+ return con.create_project(
+ project_name,
+ project_code,
+ library_project,
+ preset_name
+ )
+
+
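+# Example (placeholder names): create a library project through the global
+# connection (the return value comes from 'ServerAPI.create_project'):
+#
+#   create_project("my_library", "mylib", library_project=True)
+
+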
+def update_project(project_name, *args, **kwargs):
+ con = get_server_api_connection()
+ return con.update_project(project_name, *args, **kwargs)
+
+
+def delete_project(project_name):
+ con = get_server_api_connection()
+ return con.delete_project(project_name)
+
+
+def get_thumbnail(project_name, entity_type, entity_id, thumbnail_id=None):
+ con = get_server_api_connection()
+    return con.get_thumbnail(
+        project_name, entity_type, entity_id, thumbnail_id
+    )
+
+
+def get_folder_thumbnail(project_name, folder_id, thumbnail_id=None):
+ con = get_server_api_connection()
+ return con.get_folder_thumbnail(project_name, folder_id, thumbnail_id)
+
+
+def get_version_thumbnail(project_name, version_id, thumbnail_id=None):
+ con = get_server_api_connection()
+ return con.get_version_thumbnail(project_name, version_id, thumbnail_id)
+
+
+def get_workfile_thumbnail(project_name, workfile_id, thumbnail_id=None):
+ con = get_server_api_connection()
+ return con.get_workfile_thumbnail(project_name, workfile_id, thumbnail_id)
+
+
+def create_thumbnail(project_name, src_filepath, thumbnail_id=None):
+ con = get_server_api_connection()
+ return con.create_thumbnail(project_name, src_filepath, thumbnail_id)
+
+
+def update_thumbnail(project_name, thumbnail_id, src_filepath):
+ con = get_server_api_connection()
+ return con.update_thumbnail(project_name, thumbnail_id, src_filepath)
+
+
+def get_default_fields_for_type(entity_type):
+ con = get_server_api_connection()
+ return con.get_default_fields_for_type(entity_type)
+
+
+def get_full_link_type_name(link_type_name, input_type, output_type):
+ con = get_server_api_connection()
+ return con.get_full_link_type_name(
+ link_type_name, input_type, output_type)
+
+
+def get_link_types(project_name):
+ con = get_server_api_connection()
+ return con.get_link_types(project_name)
+
+
+def get_link_type(project_name, link_type_name, input_type, output_type):
+ con = get_server_api_connection()
+ return con.get_link_type(
+ project_name, link_type_name, input_type, output_type)
+
+
+def create_link_type(
+ project_name, link_type_name, input_type, output_type, data=None):
+ con = get_server_api_connection()
+ return con.create_link_type(
+ project_name, link_type_name, input_type, output_type, data=data)
+
+
+def delete_link_type(project_name, link_type_name, input_type, output_type):
+ con = get_server_api_connection()
+ return con.delete_link_type(
+ project_name, link_type_name, input_type, output_type)
+
+
+def make_sure_link_type_exists(
+ project_name, link_type_name, input_type, output_type, data=None
+):
+ con = get_server_api_connection()
+ return con.make_sure_link_type_exists(
+ project_name, link_type_name, input_type, output_type, data=data
+ )
+
+
+def create_link(
+ project_name,
+ link_type_name,
+ input_id,
+ input_type,
+ output_id,
+ output_type
+):
+ con = get_server_api_connection()
+ return con.create_link(
+ project_name,
+ link_type_name,
+ input_id, input_type,
+ output_id, output_type
+ )
+
+
+def delete_link(project_name, link_id):
+ con = get_server_api_connection()
+ return con.delete_link(project_name, link_id)
+
+
+def get_entities_links(
+ project_name,
+ entity_type,
+ entity_ids=None,
+ link_types=None,
+ link_direction=None
+):
+ con = get_server_api_connection()
+ return con.get_entities_links(
+ project_name,
+ entity_type,
+ entity_ids,
+ link_types,
+ link_direction
+ )
+
+
+def get_folders_links(
+ project_name,
+ folder_ids=None,
+ link_types=None,
+ link_direction=None
+):
+ con = get_server_api_connection()
+ return con.get_folders_links(
+ project_name,
+ folder_ids,
+ link_types,
+ link_direction
+ )
+
+
+def get_folder_links(
+ project_name,
+ folder_id,
+ link_types=None,
+ link_direction=None
+):
+ con = get_server_api_connection()
+ return con.get_folder_links(
+ project_name,
+ folder_id,
+ link_types,
+ link_direction
+ )
+
+
+def get_tasks_links(
+ project_name,
+ task_ids=None,
+ link_types=None,
+ link_direction=None
+):
+ con = get_server_api_connection()
+ return con.get_tasks_links(
+ project_name,
+ task_ids,
+ link_types,
+ link_direction
+ )
+
+
+def get_task_links(
+ project_name,
+ task_id,
+ link_types=None,
+ link_direction=None
+):
+ con = get_server_api_connection()
+ return con.get_task_links(
+ project_name,
+ task_id,
+ link_types,
+ link_direction
+ )
+
+
+def get_products_links(
+ project_name,
+ product_ids=None,
+ link_types=None,
+ link_direction=None
+):
+ con = get_server_api_connection()
+ return con.get_products_links(
+ project_name,
+ product_ids,
+ link_types,
+ link_direction
+ )
+
+
+def get_product_links(
+ project_name,
+ product_id,
+ link_types=None,
+ link_direction=None
+):
+ con = get_server_api_connection()
+ return con.get_product_links(
+ project_name,
+ product_id,
+ link_types,
+ link_direction
+ )
+
+
+def get_versions_links(
+ project_name,
+ version_ids=None,
+ link_types=None,
+ link_direction=None
+):
+ con = get_server_api_connection()
+ return con.get_versions_links(
+ project_name,
+ version_ids,
+ link_types,
+ link_direction
+ )
+
+
+def get_version_links(
+ project_name,
+ version_id,
+ link_types=None,
+ link_direction=None
+):
+ con = get_server_api_connection()
+ return con.get_version_links(
+ project_name,
+ version_id,
+ link_types,
+ link_direction
+ )
+
+
+def get_representations_links(
+ project_name,
+ representation_ids=None,
+ link_types=None,
+ link_direction=None
+):
+ con = get_server_api_connection()
+ return con.get_representations_links(
+ project_name,
+ representation_ids,
+ link_types,
+ link_direction
+ )
+
+
+def get_representation_links(
+ project_name,
+ representation_id,
+ link_types=None,
+ link_direction=None
+):
+ con = get_server_api_connection()
+ return con.get_representation_links(
+ project_name,
+ representation_id,
+ link_types,
+ link_direction
+ )
+
+
+def send_batch_operations(
+ project_name,
+ operations,
+ can_fail=False,
+ raise_on_fail=True
+):
+ con = get_server_api_connection()
+ return con.send_batch_operations(
+ project_name,
+ operations,
+ can_fail=can_fail,
+ raise_on_fail=raise_on_fail
+ )
diff --git a/openpype/vendor/python/common/ayon_api/constants.py b/openpype/vendor/python/common/ayon_api/constants.py
new file mode 100644
index 0000000000..e2b05a5cae
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/constants.py
@@ -0,0 +1,115 @@
+# Environments where server url and api key are stored for global connection
+SERVER_URL_ENV_KEY = "AYON_SERVER_URL"
+SERVER_API_ENV_KEY = "AYON_API_KEY"
+# Backwards compatibility
+SERVER_TOKEN_ENV_KEY = SERVER_API_ENV_KEY
+
+# --- Product types ---
+DEFAULT_PRODUCT_TYPE_FIELDS = {
+ "name",
+ "icon",
+ "color",
+}
+
+# --- Project ---
+DEFAULT_PROJECT_FIELDS = {
+ "active",
+ "name",
+ "code",
+ "config",
+ "createdAt",
+}
+
+# --- Folders ---
+DEFAULT_FOLDER_FIELDS = {
+ "id",
+ "name",
+ "label",
+ "folderType",
+ "path",
+ "parentId",
+ "active",
+ "thumbnailId",
+}
+
+# --- Tasks ---
+DEFAULT_TASK_FIELDS = {
+ "id",
+ "name",
+ "label",
+ "taskType",
+ "folderId",
+ "active",
+ "assignees",
+}
+
+# --- Products ---
+DEFAULT_PRODUCT_FIELDS = {
+ "id",
+ "name",
+ "folderId",
+ "active",
+ "productType",
+}
+
+# --- Versions ---
+DEFAULT_VERSION_FIELDS = {
+ "id",
+ "name",
+ "version",
+ "productId",
+ "taskId",
+ "active",
+ "author",
+ "thumbnailId",
+ "createdAt",
+ "updatedAt",
+}
+
+# --- Representations ---
+DEFAULT_REPRESENTATION_FIELDS = {
+ "id",
+ "name",
+ "context",
+ "createdAt",
+ "active",
+ "versionId",
+}
+
+REPRESENTATION_FILES_FIELDS = {
+ "files.name",
+ "files.hash",
+ "files.id",
+ "files.path",
+ "files.size",
+}
+
+# --- Workfile info ---
+DEFAULT_WORKFILE_INFO_FIELDS = {
+ "active",
+ "createdAt",
+ "createdBy",
+ "id",
+ "name",
+ "path",
+ "projectName",
+ "taskId",
+ "thumbnailId",
+ "updatedAt",
+ "updatedBy",
+}
+
+DEFAULT_EVENT_FIELDS = {
+ "id",
+ "hash",
+ "createdAt",
+ "dependsOn",
+ "description",
+ "project",
+ "retries",
+ "sender",
+ "status",
+ "topic",
+ "updatedAt",
+ "user",
+}
diff --git a/openpype/vendor/python/common/ayon_api/entity_hub.py b/openpype/vendor/python/common/ayon_api/entity_hub.py
new file mode 100644
index 0000000000..ab1e2584d7
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/entity_hub.py
@@ -0,0 +1,1907 @@
+import copy
+import collections
+from abc import ABCMeta, abstractmethod
+
+import six
+from ._api import get_server_api_connection
+from .utils import create_entity_id, convert_entity_id
+
+UNKNOWN_VALUE = object()
+PROJECT_PARENT_ID = object()
+_NOT_SET = object()
+
+
+class EntityHub(object):
+ """Helper to create, update or remove entities in project.
+
+    The hub is a guide for operations with folder entities and updates of the
+    project. The project entity must already exist on the server (it can only
+    be updated).
+
+    The object caches entities queried from the server. They won't be
+    re-queried once they were received, so it is recommended to create a new
+    hub or clear the cache frequently.
+
+ Todos:
+ Listen to server events about entity changes to be able update already
+ queried entities.
+
+ Args:
+ project_name (str): Name of project where changes will happen.
+ connection (ServerAPI): Connection to server with logged user.
+        allow_data_changes (bool): This option gives the ability to change
+            the 'data' key on entities. This is not recommended as 'data' may
+            be used for secure information and would also slow down server
+            queries. Content of the 'data' key can't be queried using
+            GraphQl.
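+
+    Example:
+        A minimal sketch (the project name, folder type and folder name are
+        placeholders; the folder type must exist in the project
+        configuration)::
+
+            hub = EntityHub("my_project")
+            hub.add_new_folder("Folder", name="my_folder")
+            hub.commit_changes()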
+ """
+
+ def __init__(
+ self, project_name, connection=None, allow_data_changes=False
+ ):
+ if not connection:
+ connection = get_server_api_connection()
+ self._connection = connection
+
+ self._project_name = project_name
+ self._entities_by_id = {}
+ self._entities_by_parent_id = collections.defaultdict(list)
+ self._project_entity = UNKNOWN_VALUE
+
+ self._allow_data_changes = allow_data_changes
+
+ self._path_reset_queue = None
+
+ @property
+ def allow_data_changes(self):
+        """Entity hub allows changes of the 'data' key on entities.
+
+        Data are private and not all users may have access to them. Also,
+        getting 'data' for an entity requires REST api calls, which means
+        querying each entity one-by-one from the server.
+
+ Returns:
+ bool: Data changes are allowed.
+ """
+
+ return self._allow_data_changes
+
+ @property
+ def project_name(self):
+ """Project name which is maintained by hub.
+
+ Returns:
+ str: Name of project.
+ """
+
+ return self._project_name
+
+ @property
+ def project_entity(self):
+ """Project entity.
+
+ Returns:
+ ProjectEntity: Project entity.
+ """
+
+ if self._project_entity is UNKNOWN_VALUE:
+ self.fill_project_from_server()
+ return self._project_entity
+
+ def get_attributes_for_type(self, entity_type):
+ """Get attributes available for a type.
+
+ Attributes are based on entity types.
+
+ Todos:
+ Use attribute schema to validate values on entities.
+
+ Args:
+            entity_type (Literal["project", "folder", "task"]): Entity type
+                for which attributes should be received.
+
+ Returns:
+ Dict[str, Dict[str, Any]]: Attribute schemas that are available
+ for entered entity type.
+ """
+
+ return self._connection.get_attributes_for_type(entity_type)
+
+ def get_entity_by_id(self, entity_id):
+        """Receive entity by its id without knowing the entity type.
+
+        The entity must already exist in the cached objects.
+
+ Args:
+ entity_id (str): Id of entity.
+
+ Returns:
+ Union[BaseEntity, None]: Entity object or None.
+ """
+
+ return self._entities_by_id.get(entity_id)
+
+ def get_folder_by_id(self, entity_id, allow_query=True):
+ """Get folder entity by id.
+
+ Args:
+ entity_id (str): Id of folder entity.
+            allow_query (bool): Try to query entity from server if it is not
+                available in cache.
+
+ Returns:
+ Union[FolderEntity, None]: Object of folder or 'None'.
+ """
+
+ if allow_query:
+ return self.get_or_query_entity_by_id(entity_id, ["folder"])
+ return self._entities_by_id.get(entity_id)
+
+ def get_task_by_id(self, entity_id, allow_query=True):
+ """Get task entity by id.
+
+ Args:
+ entity_id (str): Id of task entity.
+            allow_query (bool): Try to query entity from server if it is not
+                available in cache.
+
+ Returns:
+            Union[TaskEntity, None]: Object of task or 'None'.
+ """
+
+ if allow_query:
+ return self.get_or_query_entity_by_id(entity_id, ["task"])
+ return self._entities_by_id.get(entity_id)
+
+ def get_or_query_entity_by_id(self, entity_id, entity_types):
+        """Get or query entity based on its id and possible entity types.
+
+        This is a helper function for cases when the entity id is known but
+        the entity type may have multiple possible options.
+
+        Args:
+            entity_id (str): Entity id.
+            entity_types (Iterable[str]): Possible entity types that the id
+                can represent, e.g. '["folder", "project"]'.
+
+        Returns:
+            Union[BaseEntity, None]: Found entity or None.
+ """
+
+ existing_entity = self._entities_by_id.get(entity_id)
+ if existing_entity is not None:
+ return existing_entity
+
+ if not entity_types:
+ return None
+
+ entity_data = None
+ for entity_type in entity_types:
+ if entity_type == "folder":
+ entity_data = self._connection.get_folder_by_id(
+ self.project_name,
+ entity_id,
+ fields=self._get_folder_fields(),
+ own_attributes=True
+ )
+ elif entity_type == "task":
+ entity_data = self._connection.get_task_by_id(
+ self.project_name,
+ entity_id,
+ own_attributes=True
+ )
+ else:
+ raise ValueError(
+                    "Unknown entity type \"{}\"".format(entity_type)
+ )
+
+ if entity_data:
+ break
+
+ if not entity_data:
+ return None
+
+ if entity_type == "folder":
+ return self.add_folder(entity_data)
+ elif entity_type == "task":
+ return self.add_task(entity_data)
+
+ return None
+
+ @property
+ def entities(self):
+ """Iterator over available entities.
+
+ Returns:
+ Iterator[BaseEntity]: All queried/created entities cached in hub.
+ """
+
+ for entity in self._entities_by_id.values():
+ yield entity
+
+ def add_new_folder(self, *args, created=True, **kwargs):
+ """Create folder object and add it to entity hub.
+
+ Args:
+ folder_type (str): Type of folder. Folder type must be available in
+ config of project folder types.
+ entity_id (Union[str, None]): Id of the entity. New id is created if
+ not passed.
+ parent_id (Union[str, None]): Id of parent entity.
+ name (str): Name of entity.
+ label (Optional[str]): Folder label.
+            path (Optional[str]): Folder path. The path consists of all
+                parent names with slash ('/') used as separator.
+ attribs (Dict[str, Any]): Attribute values.
+ data (Dict[str, Any]): Entity data (custom data).
+ thumbnail_id (Union[str, None]): Id of entity's thumbnail.
+ active (bool): Is entity active.
+ created (Optional[bool]): Entity is new. When 'None' is passed the
+ value is defined based on value of 'entity_id'.
+
+ Returns:
+ FolderEntity: Added folder entity.
+ """
+
+ folder_entity = FolderEntity(
+ *args, **kwargs, created=created, entity_hub=self
+ )
+ self.add_entity(folder_entity)
+ return folder_entity
+
+ def add_new_task(self, *args, created=True, **kwargs):
+        """Create task object and add it to entity hub.
+
+ Args:
+            task_type (str): Type of task. Task type must be available in
+                config of project task types.
+ entity_id (Union[str, None]): Id of the entity. New id is created if
+ not passed.
+ parent_id (Union[str, None]): Id of parent entity.
+ name (str): Name of entity.
+ label (Optional[str]): Folder label.
+ attribs (Dict[str, Any]): Attribute values.
+ data (Dict[str, Any]): Entity data (custom data).
+ thumbnail_id (Union[str, None]): Id of entity's thumbnail.
+ active (bool): Is entity active.
+ created (Optional[bool]): Entity is new. When 'None' is passed the
+ value is defined based on value of 'entity_id'.
+
+ Returns:
+ TaskEntity: Added task entity.
+ """
+
+ task_entity = TaskEntity(
+ *args, **kwargs, created=created, entity_hub=self
+ )
+ self.add_entity(task_entity)
+ return task_entity
+
+ def add_folder(self, folder):
+ """Create folder object and add it to entity hub.
+
+ Args:
+ folder (Dict[str, Any]): Folder entity data.
+
+ Returns:
+ FolderEntity: Added folder entity.
+ """
+
+ folder_entity = FolderEntity.from_entity_data(folder, entity_hub=self)
+ self.add_entity(folder_entity)
+ return folder_entity
+
+ def add_task(self, task):
+ """Create task object and add it to entity hub.
+
+ Args:
+ task (Dict[str, Any]): Task entity data.
+
+ Returns:
+ TaskEntity: Added task entity.
+ """
+
+ task_entity = TaskEntity.from_entity_data(task, entity_hub=self)
+ self.add_entity(task_entity)
+ return task_entity
+
+ def add_entity(self, entity):
+ """Add entity to hub cache.
+
+ Args:
+ entity (BaseEntity): Entity that should be added to hub's cache.
+ """
+
+ self._entities_by_id[entity.id] = entity
+ parent_children = self._entities_by_parent_id[entity.parent_id]
+ if entity not in parent_children:
+ parent_children.append(entity)
+
+ if entity.parent_id is PROJECT_PARENT_ID:
+ return
+
+ parent = self._entities_by_id.get(entity.parent_id)
+ if parent is not None:
+ parent.add_child(entity.id)
+
+ def folder_path_reseted(self, folder_id):
+ """Method called from 'FolderEntity' on path reset.
+
+ This should reset cache of folder paths on all children entities.
+
+        The path cache is always propagated from top to bottom, so if an
+        entity does not have a cached path it means that none of its children
+        can have it cached.
+ """
+
+ if self._path_reset_queue is not None:
+ self._path_reset_queue.append(folder_id)
+ return
+
+ self._path_reset_queue = collections.deque()
+ self._path_reset_queue.append(folder_id)
+        while self._path_reset_queue:
+            folder_id = self._path_reset_queue.popleft()
+            children = self._entities_by_parent_id[folder_id]
+ for child in children:
+ # Get child path but don't trigger cache
+ path = child.get_path(False)
+ if path is not None:
+                    # Reset its path cache if it is set
+ child.reset_path()
+ else:
+ self._path_reset_queue.append(child.id)
+
+ self._path_reset_queue = None
+
+ def unset_entity_parent(self, entity_id, parent_id):
+ entity = self._entities_by_id.get(entity_id)
+ parent = self._entities_by_id.get(parent_id)
+ children_ids = UNKNOWN_VALUE
+ if parent is not None:
+ children_ids = parent.get_children_ids(False)
+
+ has_set_parent = False
+ if entity is not None:
+ has_set_parent = entity.parent_id == parent_id
+
+ new_parent_id = None
+ if has_set_parent:
+ entity.parent_id = new_parent_id
+
+ if children_ids is not UNKNOWN_VALUE and entity_id in children_ids:
+ parent.remove_child(entity_id)
+
+ if entity is None or not has_set_parent:
+ self.reset_immutable_for_hierarchy_cache(parent_id)
+ return
+
+ orig_parent_children = self._entities_by_parent_id[parent_id]
+ if entity in orig_parent_children:
+ orig_parent_children.remove(entity)
+
+ new_parent_children = self._entities_by_parent_id[new_parent_id]
+ if entity not in new_parent_children:
+ new_parent_children.append(entity)
+ self.reset_immutable_for_hierarchy_cache(parent_id)
+
+ def set_entity_parent(self, entity_id, parent_id, orig_parent_id=_NOT_SET):
+ parent = self._entities_by_id.get(parent_id)
+ entity = self._entities_by_id.get(entity_id)
+ if entity is None:
+ if parent is not None:
+ children_ids = parent.get_children_ids(False)
+ if (
+ children_ids is not UNKNOWN_VALUE
+ and entity_id in children_ids
+ ):
+ parent.remove_child(entity_id)
+ self.reset_immutable_for_hierarchy_cache(parent.id)
+ return
+
+ if orig_parent_id is _NOT_SET:
+ orig_parent_id = entity.parent_id
+ if orig_parent_id == parent_id:
+ return
+
+ orig_parent_children = self._entities_by_parent_id[orig_parent_id]
+ if entity in orig_parent_children:
+ orig_parent_children.remove(entity)
+ self.reset_immutable_for_hierarchy_cache(orig_parent_id)
+
+ orig_parent = self._entities_by_id.get(orig_parent_id)
+ if orig_parent is not None:
+ orig_parent.remove_child(entity_id)
+
+ parent_children = self._entities_by_parent_id[parent_id]
+ if entity not in parent_children:
+ parent_children.append(entity)
+
+ entity.parent_id = parent_id
+ if parent is None or parent.get_children_ids(False) is UNKNOWN_VALUE:
+ return
+
+ parent.add_child(entity_id)
+ self.reset_immutable_for_hierarchy_cache(parent_id)
+
+ def _query_entity_children(self, entity):
+ folder_fields = self._get_folder_fields()
+ tasks = []
+ folders = []
+ if entity.entity_type == "project":
+ folders = list(self._connection.get_folders(
+ entity["name"],
+ parent_ids=[entity.id],
+ fields=folder_fields,
+ own_attributes=True
+ ))
+
+ elif entity.entity_type == "folder":
+ folders = list(self._connection.get_folders(
+ self.project_entity["name"],
+ parent_ids=[entity.id],
+ fields=folder_fields,
+ own_attributes=True
+ ))
+
+ tasks = list(self._connection.get_tasks(
+ self.project_entity["name"],
+ folder_ids=[entity.id],
+ own_attributes=True
+ ))
+
+ children_ids = {
+ child.id
+ for child in self._entities_by_parent_id[entity.id]
+ }
+ for folder in folders:
+ folder_entity = self._entities_by_id.get(folder["id"])
+ if folder_entity is not None:
+ if folder_entity.parent_id == entity.id:
+ children_ids.add(folder_entity.id)
+ continue
+
+ folder_entity = self.add_folder(folder)
+ children_ids.add(folder_entity.id)
+
+ for task in tasks:
+ task_entity = self._entities_by_id.get(task["id"])
+ if task_entity is not None:
+ if task_entity.parent_id == entity.id:
+ children_ids.add(task_entity.id)
+ continue
+
+ task_entity = self.add_task(task)
+ children_ids.add(task_entity.id)
+
+ entity.fill_children_ids(children_ids)
+
+ def get_entity_children(self, entity, allow_query=True):
+ children_ids = entity.get_children_ids(allow_query=False)
+ if children_ids is not UNKNOWN_VALUE:
+ return entity.get_children()
+
+ if children_ids is UNKNOWN_VALUE and not allow_query:
+ return UNKNOWN_VALUE
+
+ self._query_entity_children(entity)
+
+ return entity.get_children()
+
+ def delete_entity(self, entity):
+ parent_id = entity.parent_id
+ if parent_id is None:
+ return
+
+ parent = self._entities_by_id.get(parent_id)
+ if parent is not None:
+ parent.remove_child(entity.id)
+
+ def reset_immutable_for_hierarchy_cache(
+ self, entity_id, bottom_to_top=True
+ ):
+ if bottom_to_top is None or entity_id is None:
+ return
+
+ reset_queue = collections.deque()
+ reset_queue.append(entity_id)
+ if bottom_to_top:
+ while reset_queue:
+ entity_id = reset_queue.popleft()
+ entity = self.get_entity_by_id(entity_id)
+ if entity is None:
+ continue
+ entity.reset_immutable_for_hierarchy_cache(None)
+ reset_queue.append(entity.parent_id)
+ else:
+ while reset_queue:
+ entity_id = reset_queue.popleft()
+ entity = self.get_entity_by_id(entity_id)
+ if entity is None:
+ continue
+ entity.reset_immutable_for_hierarchy_cache(None)
+ for child in self._entities_by_parent_id[entity.id]:
+ reset_queue.append(child.id)
+
+ def fill_project_from_server(self):
+ """Query project data from server and create project entity.
+
+ This method will invalidate previous object of Project entity.
+
+ Returns:
+ ProjectEntity: Entity that was updated with server data.
+
+ Raises:
+ ValueError: When project was not found on server.
+ """
+
+ project_name = self.project_name
+ project = self._connection.get_project(
+ project_name,
+ own_attributes=True
+ )
+ if not project:
+ raise ValueError(
+ "Project \"{}\" was not found.".format(project_name)
+ )
+
+ self._project_entity = ProjectEntity(
+ project["code"],
+ parent_id=PROJECT_PARENT_ID,
+ entity_id=project["name"],
+ library=project["library"],
+ folder_types=project["folderTypes"],
+ task_types=project["taskTypes"],
+ name=project["name"],
+ attribs=project["ownAttrib"],
+ data=project["data"],
+ active=project["active"],
+ entity_hub=self
+ )
+ self.add_entity(self._project_entity)
+ return self._project_entity
+
+ def _get_folder_fields(self):
+ folder_fields = set(
+ self._connection.get_default_fields_for_type("folder")
+ )
+ folder_fields.add("hasProducts")
+ if self._allow_data_changes:
+ folder_fields.add("data")
+ return folder_fields
+
+ def query_entities_from_server(self):
+ """Query whole project at once."""
+
+ project_entity = self.fill_project_from_server()
+
+ folder_fields = self._get_folder_fields()
+
+ folders = self._connection.get_folders(
+ project_entity.name,
+ fields=folder_fields,
+ own_attributes=True
+ )
+ tasks = self._connection.get_tasks(
+ project_entity.name,
+ own_attributes=True
+ )
+ folders_by_parent_id = collections.defaultdict(list)
+ for folder in folders:
+ parent_id = folder["parentId"]
+ folders_by_parent_id[parent_id].append(folder)
+
+ tasks_by_parent_id = collections.defaultdict(list)
+ for task in tasks:
+ parent_id = task["folderId"]
+ tasks_by_parent_id[parent_id].append(task)
+
+ lock_queue = collections.deque()
+ hierarchy_queue = collections.deque()
+ hierarchy_queue.append((None, project_entity))
+ while hierarchy_queue:
+ item = hierarchy_queue.popleft()
+ parent_id, parent_entity = item
+
+ lock_queue.append(parent_entity)
+
+ children_ids = set()
+ for folder in folders_by_parent_id[parent_id]:
+ folder_entity = self.add_folder(folder)
+ children_ids.add(folder_entity.id)
+ folder_entity.has_published_content = folder["hasProducts"]
+ hierarchy_queue.append((folder_entity.id, folder_entity))
+
+ for task in tasks_by_parent_id[parent_id]:
+ task_entity = self.add_task(task)
+ lock_queue.append(task_entity)
+ children_ids.add(task_entity.id)
+
+ parent_entity.fill_children_ids(children_ids)
+
+ # Lock entities when all are added to hub
+ # - lock only entities added in this method
+ while lock_queue:
+ entity = lock_queue.popleft()
+ entity.lock()
+
+ def lock(self):
+ if self._project_entity is None:
+ return
+
+ for entity in self._entities_by_id.values():
+ entity.lock()
+
+ def _get_top_entities(self):
+ all_ids = set(self._entities_by_id.keys())
+ return [
+ entity
+ for entity in self._entities_by_id.values()
+ if entity.parent_id not in all_ids
+ ]
+
+ def _split_entities(self):
+ top_entities = self._get_top_entities()
+ entities_queue = collections.deque(top_entities)
+ removed_entity_ids = []
+ created_entity_ids = []
+ other_entity_ids = []
+ while entities_queue:
+ entity = entities_queue.popleft()
+ removed = entity.removed
+ if removed:
+ removed_entity_ids.append(entity.id)
+ elif entity.created:
+ created_entity_ids.append(entity.id)
+ else:
+ other_entity_ids.append(entity.id)
+
+ for child in tuple(self._entities_by_parent_id[entity.id]):
+ if removed:
+ self.unset_entity_parent(child.id, entity.id)
+ entities_queue.append(child)
+ return created_entity_ids, other_entity_ids, removed_entity_ids
+
+ def _get_update_body(self, entity, changes=None):
+ if changes is None:
+ changes = entity.changes
+
+ if not changes:
+ return None
+ return {
+ "type": "update",
+ "entityType": entity.entity_type,
+ "entityId": entity.id,
+ "data": changes
+ }
+
+ def _get_create_body(self, entity):
+ return {
+ "type": "create",
+ "entityType": entity.entity_type,
+ "entityId": entity.id,
+ "data": entity.to_create_body_data()
+ }
+
+ def _get_delete_body(self, entity):
+ return {
+ "type": "delete",
+ "entityType": entity.entity_type,
+ "entityId": entity.id
+ }
+
+ def _pre_commit_types_changes(
+ self, project_changes, orig_types, changes_key, post_changes
+ ):
+ """Compare changes of types on a project.
+
+ Compare old and new types. Change project changes content if some old
+ types were removed. In that case the final change of types will
+ happen when all other entities have changed.
+
+ Args:
+ project_changes (dict[str, Any]): Project changes.
+ orig_types (list[dict[str, Any]]): Original types.
+ changes_key (Literal[folderTypes, taskTypes]): Key of type changes
+ in project changes.
+ post_changes (dict[str, Any]): An object where post changes will
+ be stored.
+ """
+
+ if changes_key not in project_changes:
+ return
+
+ new_types = project_changes[changes_key]
+
+ orig_types_by_name = {
+ type_info["name"]: type_info
+ for type_info in orig_types
+ }
+ new_names = {
+ type_info["name"]
+ for type_info in new_types
+ }
+ diff_names = set(orig_types_by_name) - new_names
+ if not diff_names:
+ return
+
+ # Create copy of folder type changes to post changes
+        # - post changes will be committed at the end
+ post_changes[changes_key] = copy.deepcopy(new_types)
+
+ for type_name in diff_names:
+ new_types.append(orig_types_by_name[type_name])
+
+ def _pre_commit_project(self):
+ """Some project changes cannot be committed before hierarchy changes.
+
+        It is not possible to change folder types or task types if there are
+        existing hierarchy items using the removed types. For that purpose a
+        union of all old and new types is committed first, and post changes
+        are prepared to be committed after all existing entities are changed.
+
+ Returns:
+ dict[str, Any]: Changes that will be committed after hierarchy
+ changes.
+ """
+
+ project_changes = self.project_entity.changes
+
+ post_changes = {}
+ if not project_changes:
+ return post_changes
+
+ self._pre_commit_types_changes(
+ project_changes,
+ self.project_entity.get_orig_folder_types(),
+            "folderTypes",
+ post_changes
+ )
+ self._pre_commit_types_changes(
+ project_changes,
+ self.project_entity.get_orig_task_types(),
+            "taskTypes",
+ post_changes
+ )
+ self._connection.update_project(self.project_name, **project_changes)
+ return post_changes
+
+ def commit_changes(self):
+ """Commit any changes that happened on entities.
+
+ Todos:
+ Use Operations Session instead of known operations body.
+ """
+
+        post_project_changes = self._pre_commit_project()
+        # Project changes were already committed in '_pre_commit_project'.
+        # Locking the project entity discards them from 'changes' so they
+        # are not sent again.
+        self.project_entity.lock()
+
+ operations_body = []
+
+ created_entity_ids, other_entity_ids, removed_entity_ids = (
+ self._split_entities()
+ )
+ processed_ids = set()
+ for entity_id in other_entity_ids:
+ if entity_id in processed_ids:
+ continue
+
+ entity = self._entities_by_id[entity_id]
+ changes = entity.changes
+ processed_ids.add(entity_id)
+ if not changes:
+ continue
+
+ bodies = [self._get_update_body(entity, changes)]
+ # Parent was created and was not yet added to operations body
+ parent_queue = collections.deque()
+ parent_queue.append(entity.parent_id)
+ while parent_queue:
+ # Make sure entity's parents are created
+ parent_id = parent_queue.popleft()
+ if (
+ parent_id is UNKNOWN_VALUE
+ or parent_id in processed_ids
+ or parent_id not in created_entity_ids
+ ):
+ continue
+
+ parent = self._entities_by_id.get(parent_id)
+ processed_ids.add(parent.id)
+ bodies.append(self._get_create_body(parent))
+ parent_queue.append(parent.id)
+
+ operations_body.extend(reversed(bodies))
+
+ for entity_id in created_entity_ids:
+ if entity_id in processed_ids:
+ continue
+ entity = self._entities_by_id[entity_id]
+ processed_ids.add(entity_id)
+ operations_body.append(self._get_create_body(entity))
+
+ for entity_id in reversed(removed_entity_ids):
+ if entity_id in processed_ids:
+ continue
+
+ entity = self._entities_by_id.pop(entity_id)
+ parent_children = self._entities_by_parent_id[entity.parent_id]
+ if entity in parent_children:
+ parent_children.remove(entity)
+
+ if not entity.created:
+ operations_body.append(self._get_delete_body(entity))
+
+ self._connection.send_batch_operations(
+ self.project_name, operations_body
+ )
+ if post_project_changes:
+ self._connection.update_project(
+ self.project_name, **post_project_changes)
+
+ self.lock()
+
+
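+# For reference, 'commit_changes' sends an operations body in this shape
+# (ids and values are illustrative):
+#
+#   [
+#       {"type": "create", "entityType": "folder", "entityId": "<id>",
+#        "data": {...}},
+#       {"type": "update", "entityType": "task", "entityId": "<id>",
+#        "data": {"name": "new_name"}},
+#       {"type": "delete", "entityType": "folder", "entityId": "<id>"},
+#   ]
+
+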
+class AttributeValue(object):
+ def __init__(self, value):
+ self._value = value
+ self._origin_value = copy.deepcopy(value)
+
+ def get_value(self):
+ return self._value
+
+ def set_value(self, value):
+ self._value = value
+
+ value = property(get_value, set_value)
+
+ @property
+ def changed(self):
+ return self._value != self._origin_value
+
+ def lock(self):
+ self._origin_value = copy.deepcopy(self._value)
+
+
+class Attributes(object):
+ """Object representing attribs of entity.
+
+ Todos:
+ This could be enhanced to know attribute schema and validate values
+ based on the schema.
+
+ Args:
+ attrib_keys (Iterable[str]): Keys that are available in attribs of the
+ entity.
+ values (Union[None, Dict[str, Any]]): Values of attributes.
+ """
+
+ def __init__(self, attrib_keys, values=UNKNOWN_VALUE):
+ if values in (UNKNOWN_VALUE, None):
+ values = {}
+ self._attributes = {
+ key: AttributeValue(values.get(key))
+ for key in attrib_keys
+ }
+
+ def __contains__(self, key):
+ return key in self._attributes
+
+ def __getitem__(self, key):
+ return self._attributes[key].value
+
+ def __setitem__(self, key, value):
+ self._attributes[key].set_value(value)
+
+ def __iter__(self):
+ for key in self._attributes:
+ yield key
+
+ def keys(self):
+ return self._attributes.keys()
+
+ def values(self):
+ for attribute in self._attributes.values():
+ yield attribute.value
+
+ def items(self):
+ for key, attribute in self._attributes.items():
+ yield key, attribute.value
+
+ def get(self, key, default=None):
+ """Get value of attribute.
+
+ Args:
+ key (str): Attribute name.
+ default (Any): Default value to return when attribute was not
+ found.
+ """
+
+ attribute = self._attributes.get(key)
+ if attribute is None:
+ return default
+ return attribute.value
+
+ def set(self, key, value):
+ """Change value of attribute.
+
+ Args:
+ key (str): Attribute name.
+ value (Any): New value of the attribute.
+ """
+
+ self[key] = value
+
+ def get_attribute(self, key):
+ """Access to attribute object.
+
+ Args:
+ key (str): Name of attribute.
+
+ Returns:
+ AttributeValue: Object of attribute value.
+
+ Raises:
+ KeyError: When attribute is not available.
+ """
+
+ return self._attributes[key]
+
+ def lock(self):
+ for attribute in self._attributes.values():
+ attribute.lock()
+
+ @property
+ def changes(self):
+ """Attribute value changes.
+
+ Returns:
+ Dict[str, Any]: Key mapping with new values.
+ """
+
+ return {
+ attr_key: attribute.value
+ for attr_key, attribute in self._attributes.items()
+ if attribute.changed
+ }
+
+ def to_dict(self, ignore_none=True):
+ output = {}
+ for key, value in self.items():
+ if (
+ value is UNKNOWN_VALUE
+ or (ignore_none and value is None)
+ ):
+ continue
+
+ output[key] = value
+ return output
+
+
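+# Hedged usage sketch for 'Attributes' (attribute names depend on the
+# server's attribute configuration; "frameStart" is an assumption):
+#
+#   entity.attribs.set("frameStart", 1001)
+#   entity.attribs.changes  # -> {"frameStart": 1001}
+
+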
+@six.add_metaclass(ABCMeta)
+class BaseEntity(object):
+ """Object representation of entity from server which is capturing changes.
+
+    All data on the created object are expected to be the "current data" on
+    the server entity, unless the entity has 'created' set to 'True'. So if
+    new data should be stored to the server entity, fill the entity with
+    server data first and then change them.
+
+    Calling the 'lock' method will mark the entity as "saved" and all changes
+    made on the entity are set as the "current data" on the server.
+
+ Args:
+ entity_id (Union[str, None]): Id of the entity. New id is created if
+ not passed.
+ parent_id (Union[str, None]): Id of parent entity.
+ name (str): Name of entity.
+ attribs (Dict[str, Any]): Attribute values.
+ data (Dict[str, Any]): Entity data (custom data).
+ thumbnail_id (Union[str, None]): Id of entity's thumbnail.
+ active (bool): Is entity active.
+ entity_hub (EntityHub): Object of entity hub which created object of
+ the entity.
+ created (Optional[bool]): Entity is new. When 'None' is passed the
+ value is defined based on value of 'entity_id'.
+ """
+
+ def __init__(
+ self,
+ entity_id=None,
+ parent_id=UNKNOWN_VALUE,
+ name=UNKNOWN_VALUE,
+ attribs=UNKNOWN_VALUE,
+ data=UNKNOWN_VALUE,
+ thumbnail_id=UNKNOWN_VALUE,
+ active=UNKNOWN_VALUE,
+ entity_hub=None,
+ created=None
+ ):
+ if entity_hub is None:
+ raise ValueError("Missing required kwarg 'entity_hub'")
+
+ self._entity_hub = entity_hub
+
+ if created is None:
+ created = entity_id is None
+
+ entity_id = self._prepare_entity_id(entity_id)
+
+ if data is None:
+ data = {}
+
+ children_ids = UNKNOWN_VALUE
+ if created:
+ children_ids = set()
+
+ if not created and parent_id is UNKNOWN_VALUE:
+ raise ValueError("Existing entity is missing parent id.")
+
+ # These are public without any validation at this moment
+ # may change in future (e.g. name will have regex validation)
+ self._entity_id = entity_id
+
+ self._parent_id = parent_id
+ self._name = name
+ self.active = active
+ self._created = created
+ self._thumbnail_id = thumbnail_id
+ self._attribs = Attributes(
+ self._get_attributes_for_type(self.entity_type),
+ attribs
+ )
+ self._data = data
+ self._children_ids = children_ids
+
+ self._orig_parent_id = parent_id
+ self._orig_name = name
+ self._orig_data = copy.deepcopy(data)
+ self._orig_thumbnail_id = thumbnail_id
+ self._orig_active = active
+
+ self._immutable_for_hierarchy_cache = None
+
+ def __repr__(self):
+ return "<{} - {}>".format(self.__class__.__name__, self.id)
+
+ def __getitem__(self, item):
+ return getattr(self, item)
+
+ def __setitem__(self, item, value):
+ return setattr(self, item, value)
+
+ def _prepare_entity_id(self, entity_id):
+ entity_id = convert_entity_id(entity_id)
+ if entity_id is None:
+ entity_id = create_entity_id()
+ return entity_id
+
+ @property
+ def id(self):
+ """Access to entity id under which is entity available on server.
+
+ Returns:
+ str: Entity id.
+ """
+
+ return self._entity_id
+
+ @property
+ def removed(self):
+ return self._parent_id is None
+
+ @property
+ def orig_parent_id(self):
+ return self._orig_parent_id
+
+ @property
+ def attribs(self):
+ """Entity attributes based on server configuration.
+
+ Returns:
+ Attributes: Attributes object handling changes and values of
+ attributes on entity.
+ """
+
+ return self._attribs
+
+ @property
+ def data(self):
+ """Entity custom data that are not stored by any deterministic model.
+
+ Be aware that 'data' can't be queried using GraphQl and cannot be
+ updated partially.
+
+ Returns:
+ Dict[str, Any]: Custom data on entity.
+ """
+
+ return self._data
+
+ @property
+ def project_name(self):
+ """Quick access to project from entity hub.
+
+ Returns:
+ str: Name of project under which entity lives.
+ """
+
+ return self._entity_hub.project_name
+
+ @property
+ @abstractmethod
+ def entity_type(self):
+        """Entity type corresponding to the server.
+
+ Returns:
+ Literal[project, folder, task]: Entity type.
+ """
+
+ pass
+
+ @property
+ @abstractmethod
+ def parent_entity_types(self):
+        """Possible entity types of the parent entity.
+
+ Returns:
+ Iterable[str]: Possible entity types of parent.
+ """
+
+ pass
+
+ @property
+ @abstractmethod
+ def changes(self):
+ """Receive entity changes.
+
+ Returns:
+ Union[Dict[str, Any], None]: All values that have changed on
+ entity. New entity must return None.
+ """
+
+ pass
+
+ @classmethod
+ @abstractmethod
+ def from_entity_data(cls, entity_data, entity_hub):
+ """Create entity based on queried data from server.
+
+ Args:
+ entity_data (Dict[str, Any]): Entity data from server.
+ entity_hub (EntityHub): Hub which handle the entity.
+
+ Returns:
+ BaseEntity: Object of the class.
+ """
+
+ pass
+
+ @abstractmethod
+ def to_create_body_data(self):
+ """Convert object of entity to data for server on creation.
+
+ Returns:
+ Dict[str, Any]: Entity data.
+ """
+
+ pass
+
+ @property
+ def immutable_for_hierarchy(self):
+ """Entity is immutable for hierarchy changes.
+
+ Hierarchy changes can be considered as change of name or parents.
+
+ Returns:
+ bool: Entity is immutable for hierarchy changes.
+ """
+
+ if self._immutable_for_hierarchy_cache is not None:
+ return self._immutable_for_hierarchy_cache
+
+ immutable_for_hierarchy = self._immutable_for_hierarchy
+ if immutable_for_hierarchy is not None:
+ self._immutable_for_hierarchy_cache = immutable_for_hierarchy
+ return self._immutable_for_hierarchy_cache
+
+ for child in self._entity_hub.get_entity_children(self):
+ if child.immutable_for_hierarchy:
+ self._immutable_for_hierarchy_cache = True
+ return self._immutable_for_hierarchy_cache
+
+ self._immutable_for_hierarchy_cache = False
+ return self._immutable_for_hierarchy_cache
+
+ @property
+ def _immutable_for_hierarchy(self):
+ """Override this method to define if entity object is immutable.
+
+ This property was added to define immutable state of Folder entities
+ which is used in property 'immutable_for_hierarchy'.
+
+ Returns:
+ Union[bool, None]: Bool explicitly telling if the entity is
+ immutable, otherwise None.
+ """
+
+ return None
+
+ @property
+ def has_cached_immutable_hierarchy(self):
+ return self._immutable_for_hierarchy_cache is not None
+
+ def reset_immutable_for_hierarchy_cache(self, bottom_to_top=True):
+ """Clear cache of immutable hierarchy property.
+
+ This is used when entity changed parent or a child was added.
+
+ Args:
+ bottom_to_top (bool): Direction of cache reset, from bottom of
+ hierarchy to top or from top to bottom.
+ """
+
+ self._immutable_for_hierarchy_cache = None
+ self._entity_hub.reset_immutable_for_hierarchy_cache(
+ self.id, bottom_to_top
+ )
+
+ def _get_default_changes(self):
+ """Collect changes of common data on entity.
+
+ Returns:
+ Dict[str, Any]: Changes on entity. Keys with their new values.
+ """
+
+ changes = {}
+ if self._orig_name != self._name:
+ changes["name"] = self._name
+
+ if self._entity_hub.allow_data_changes:
+ if self._orig_data != self._data:
+ changes["data"] = self._data
+
+ if self._orig_thumbnail_id != self._thumbnail_id:
+ changes["thumbnailId"] = self._thumbnail_id
+
+ if self._orig_active != self.active:
+ changes["active"] = self.active
+
+ attrib_changes = self.attribs.changes
+ if attrib_changes:
+ changes["attrib"] = attrib_changes
+ return changes
+
+ def _get_attributes_for_type(self, entity_type):
+ return self._entity_hub.get_attributes_for_type(entity_type)
+
+ def lock(self):
+ """Lock entity as 'saved' so all changes are discarded."""
+
+ self._orig_parent_id = self._parent_id
+ self._orig_name = self._name
+ self._orig_data = copy.deepcopy(self._data)
+ self._orig_thumbnail_id = self.thumbnail_id
+ self._attribs.lock()
+
+ self._immutable_for_hierarchy_cache = None
+ self._created = False
+
+ def _get_entity_by_id(self, entity_id):
+ return self._entity_hub.get_entity_by_id(entity_id)
+
+ def get_name(self):
+ return self._name
+
+ def set_name(self, name):
+ self._name = name
+
+ name = property(get_name, set_name)
+
+ def get_parent_id(self):
+ """Parent entity id.
+
+ Returns:
+ Union[str, None]: Id of parent entity or None if not set.
+ """
+
+ return self._parent_id
+
+ def set_parent_id(self, parent_id):
+ """Change parent by id.
+
+ Args:
+ parent_id (Union[str, None]): Id of new parent for entity.
+
+ Raises:
+ ValueError: If parent was not found by id.
+ TypeError: If validation of parent does not pass.
+ """
+
+ if parent_id != self._parent_id:
+ orig_parent_id = self._parent_id
+ self._parent_id = parent_id
+ self._entity_hub.set_entity_parent(
+ self.id, parent_id, orig_parent_id
+ )
+
+ parent_id = property(get_parent_id, set_parent_id)
+
+ def get_parent(self, allow_query=True):
+ """Parent entity.
+
+ Returns:
+ Union[BaseEntity, None]: Parent object.
+ """
+
+ parent = self._entity_hub.get_entity_by_id(self._parent_id)
+ if parent is not None:
+ return parent
+
+ if not allow_query:
+ return self._parent_id
+
+ if self._parent_id is UNKNOWN_VALUE:
+ return self._parent_id
+
+ return self._entity_hub.get_or_query_entity_by_id(
+ self._parent_id, self.parent_entity_types
+ )
+
+ def set_parent(self, parent):
+ """Change parent object.
+
+ Args:
+ parent (BaseEntity): New parent for entity.
+
+ Raises:
+ TypeError: If validation of parent does not pass.
+ """
+
+ parent_id = None
+ if parent is not None:
+ parent_id = parent.id
+ self._entity_hub.set_entity_parent(self.id, parent_id)
+
+ parent = property(get_parent, set_parent)
+
+ def get_children_ids(self, allow_query=True):
+ """Access to children objects.
+
+ Todos:
+ Children should be maybe handled by EntityHub instead of entities
+ themselves. That would simplify 'set_entity_parent',
+ 'unset_entity_parent' and other logic related to changing
+ hierarchy.
+
+ Returns:
+ Union[List[str], Type[UNKNOWN_VALUE]]: Children iterator.
+ """
+
+ if self._children_ids is UNKNOWN_VALUE:
+ if not allow_query:
+ return self._children_ids
+ self._entity_hub.get_entity_children(self, True)
+ return set(self._children_ids)
+
+ children_ids = property(get_children_ids)
+
+ def get_children(self, allow_query=True):
+ """Access to children objects.
+
+ Returns:
+ Union[List[BaseEntity], Type[UNKNOWN_VALUE]]: Children iterator.
+ """
+
+ if self._children_ids is UNKNOWN_VALUE:
+ if not allow_query:
+ return self._children_ids
+ return self._entity_hub.get_entity_children(self, True)
+
+ return [
+ self._entity_hub.get_entity_by_id(children_id)
+ for children_id in self._children_ids
+ ]
+
+ children = property(get_children)
+
+ def add_child(self, child):
+ """Add child entity.
+
+ Args:
+ child (BaseEntity): Child object to add.
+
+ Raises:
+ TypeError: When child object has invalid type to be a child.
+ """
+
+ child_id = child
+ if isinstance(child_id, BaseEntity):
+ child_id = child.id
+
+ if self._children_ids is not UNKNOWN_VALUE:
+ self._children_ids.add(child_id)
+
+ self._entity_hub.set_entity_parent(child_id, self.id)
+
+ def remove_child(self, child):
+ """Remove child entity.
+
+ Is ignored if child is not in children.
+
+ Args:
+ child (Union[str, BaseEntity]): Child object or child id to remove.
+ """
+
+ child_id = child
+ if isinstance(child_id, BaseEntity):
+ child_id = child.id
+
+ if self._children_ids is not UNKNOWN_VALUE:
+ self._children_ids.discard(child_id)
+ self._entity_hub.unset_entity_parent(child_id, self.id)
+
+ def get_thumbnail_id(self):
+ """Thumbnail id of entity.
+
+ Returns:
+ Union[str, None]: Id of thumbnail or None if not set.
+ """
+
+ return self._thumbnail_id
+
+ def set_thumbnail_id(self, thumbnail_id):
+ """Change thumbnail id.
+
+ Args:
+ thumbnail_id (Union[str, None]): Id of thumbnail for entity.
+ """
+
+ self._thumbnail_id = thumbnail_id
+
+ thumbnail_id = property(get_thumbnail_id, set_thumbnail_id)
+
+ @property
+ def created(self):
+ """Entity is new.
+
+ Returns:
+ bool: Entity is newly created.
+ """
+
+ return self._created
+
+ def fill_children_ids(self, children_ids):
+ """Fill children ids on entity.
+
+ Warning:
+ This is not an api call but is called from entity hub.
+ """
+
+ self._children_ids = set(children_ids)
+
+
+class ProjectEntity(BaseEntity):
+ """Entity representing project on AYON server.
+
+ Args:
+ project_code (str): Project code.
+ library (bool): Is project library project.
+ folder_types (list[dict[str, Any]]): Folder types definition.
+ task_types (list[dict[str, Any]]): Task types definition.
+ entity_id (Optional[str]): Id of the entity. New id is created if
+ not passed.
+ parent_id (Union[str, None]): Id of parent entity.
+ name (str): Name of entity.
+ attribs (Dict[str, Any]): Attribute values.
+ data (Dict[str, Any]): Entity data (custom data).
+ thumbnail_id (Union[str, None]): Id of entity's thumbnail.
+ active (bool): Is entity active.
+ entity_hub (EntityHub): Object of entity hub which created object of
+ the entity.
+ created (Optional[bool]): Entity is new. When 'None' is passed the
+ value is defined based on value of 'entity_id'.
+ """
+
+ entity_type = "project"
+ parent_entity_types = []
+ # TODO These are hardcoded but maybe should be received from server?
+ default_folder_type_icon = "folder"
+ default_task_type_icon = "task_alt"
+
+ def __init__(
+ self, project_code, library, folder_types, task_types, *args, **kwargs
+ ):
+ super(ProjectEntity, self).__init__(*args, **kwargs)
+
+ self._project_code = project_code
+ self._library_project = library
+ self._folder_types = folder_types
+ self._task_types = task_types
+
+ self._orig_project_code = project_code
+ self._orig_library_project = library
+ self._orig_folder_types = copy.deepcopy(folder_types)
+ self._orig_task_types = copy.deepcopy(task_types)
+
+ def _prepare_entity_id(self, entity_id):
+ if entity_id != self.project_name:
+ raise ValueError(
+ "Unexpected entity id value \"{}\". Expected \"{}\"".format(
+ entity_id, self.project_name))
+ return entity_id
+
+ def get_parent(self, *args, **kwargs):
+ return None
+
+ def set_parent(self, parent):
+ raise ValueError(
+ "Parent of project cannot be set to {}".format(parent)
+ )
+
+ parent = property(get_parent, set_parent)
+
+ def get_orig_folder_types(self):
+ return copy.deepcopy(self._orig_folder_types)
+
+ def get_folder_types(self):
+ return copy.deepcopy(self._folder_types)
+
+ def set_folder_types(self, folder_types):
+ new_folder_types = []
+ for folder_type in folder_types:
+ if "icon" not in folder_type:
+ folder_type["icon"] = self.default_folder_type_icon
+ new_folder_types.append(folder_type)
+ self._folder_types = new_folder_types
+
+ def get_orig_task_types(self):
+ return copy.deepcopy(self._orig_task_types)
+
+ def get_task_types(self):
+ return copy.deepcopy(self._task_types)
+
+ def set_task_types(self, task_types):
+ new_task_types = []
+ for task_type in task_types:
+ if "icon" not in task_type:
+ task_type["icon"] = self.default_task_type_icon
+ new_task_types.append(task_type)
+ self._task_types = new_task_types
+
+ folder_types = property(get_folder_types, set_folder_types)
+ task_types = property(get_task_types, set_task_types)
+
+ def lock(self):
+ super(ProjectEntity, self).lock()
+ self._orig_folder_types = copy.deepcopy(self._folder_types)
+ self._orig_task_types = copy.deepcopy(self._task_types)
+
+ @property
+ def changes(self):
+ changes = self._get_default_changes()
+ if self._orig_folder_types != self._folder_types:
+ changes["folderTypes"] = self.get_folder_types()
+
+ if self._orig_task_types != self._task_types:
+ changes["taskTypes"] = self.get_task_types()
+
+ return changes
+
+ @classmethod
+ def from_entity_data(cls, project, entity_hub):
+ return cls(
+ project["code"],
+ parent_id=PROJECT_PARENT_ID,
+ entity_id=project["name"],
+ library=project["library"],
+ folder_types=project["folderTypes"],
+ task_types=project["taskTypes"],
+ name=project["name"],
+ attribs=project["ownAttrib"],
+ data=project["data"],
+ active=project["active"],
+ entity_hub=entity_hub
+ )
+
+ def to_create_body_data(self):
+ raise NotImplementedError(
+ "ProjectEntity does not support conversion to entity data"
+ )
+
+
+class FolderEntity(BaseEntity):
+ """Entity representing a folder on AYON server.
+
+ Args:
+ folder_type (str): Type of folder. Folder type must be available in
+ config of project folder types.
+ entity_id (Union[str, None]): Id of the entity. New id is created if
+ not passed.
+ parent_id (Union[str, None]): Id of parent entity.
+ name (str): Name of entity.
+ attribs (Dict[str, Any]): Attribute values.
+ data (Dict[str, Any]): Entity data (custom data).
+ thumbnail_id (Union[str, None]): Id of entity's thumbnail.
+ active (bool): Is entity active.
+ label (Optional[str]): Folder label.
+ path (Optional[str]): Folder path. Path consists of all parent names
+ with slash ('/') used as separator.
+ entity_hub (EntityHub): Object of entity hub which created object of
+ the entity.
+ created (Optional[bool]): Entity is new. When 'None' is passed the
+ value is defined based on value of 'entity_id'.
+ """
+
+ entity_type = "folder"
+ parent_entity_types = ["folder", "project"]
+
+ def __init__(self, folder_type, *args, label=None, path=None, **kwargs):
+ super(FolderEntity, self).__init__(*args, **kwargs)
+ # Autofill project as parent of folder if it is not yet set
+ # - this can be guessed only if folder was just created
+ if self.created and self._parent_id is UNKNOWN_VALUE:
+ self._parent_id = self.project_name
+
+ self._folder_type = folder_type
+ self._label = label
+
+ self._orig_folder_type = folder_type
+ self._orig_label = label
+ # Know if folder has any products
+ # - is used to know if folder allows hierarchy changes
+ self._has_published_content = False
+ self._path = path
+
+ def get_folder_type(self):
+ return self._folder_type
+
+ def set_folder_type(self, folder_type):
+ self._folder_type = folder_type
+
+ folder_type = property(get_folder_type, set_folder_type)
+
+ def get_label(self):
+ return self._label
+
+ def set_label(self, label):
+ self._label = label
+
+ label = property(get_label, set_label)
+
+ def get_path(self, dynamic_value=True):
+ if not dynamic_value:
+ return self._path
+
+ if self._path is None:
+ parent = self.parent
+ path = self.name
+ if parent.entity_type == "folder":
+ parent_path = parent.path
+ path = "/".join([parent_path, path])
+ self._path = path
+ return self._path
+
+ def reset_path(self):
+ self._path = None
+ self._entity_hub.folder_path_reseted(self.id)
+
+ path = property(get_path)
+
+ def get_has_published_content(self):
+ return self._has_published_content
+
+ def set_has_published_content(self, has_published_content):
+ if self._has_published_content is has_published_content:
+ return
+
+ self._has_published_content = has_published_content
+ # Reset immutable cache of parents
+ self._entity_hub.reset_immutable_for_hierarchy_cache(self.id)
+
+ has_published_content = property(
+ get_has_published_content, set_has_published_content
+ )
+
+ @property
+ def _immutable_for_hierarchy(self):
+ if self.has_published_content:
+ return True
+ return None
+
+ def lock(self):
+ super(FolderEntity, self).lock()
+ self._orig_folder_type = self._folder_type
+
+ @property
+ def changes(self):
+ changes = self._get_default_changes()
+
+ if self._orig_parent_id != self._parent_id:
+ parent_id = self._parent_id
+ if parent_id == self.project_name:
+ parent_id = None
+ changes["parentId"] = parent_id
+
+ if self._orig_folder_type != self._folder_type:
+ changes["folderType"] = self._folder_type
+
+ label = self._label
+ if self._name == label:
+ label = None
+
+ if label != self._orig_label:
+ changes["label"] = label
+
+ return changes
+
+ @classmethod
+ def from_entity_data(cls, folder, entity_hub):
+ parent_id = folder["parentId"]
+ if parent_id is None:
+ parent_id = entity_hub.project_entity.id
+ return cls(
+ folder["folderType"],
+ label=folder["label"],
+ path=folder["path"],
+ entity_id=folder["id"],
+ parent_id=parent_id,
+ name=folder["name"],
+ data=folder.get("data"),
+ attribs=folder["ownAttrib"],
+ active=folder["active"],
+ thumbnail_id=folder["thumbnailId"],
+ created=False,
+ entity_hub=entity_hub
+ )
+
+ def to_create_body_data(self):
+ parent_id = self._parent_id
+ if parent_id is UNKNOWN_VALUE:
+ raise ValueError("Folder does not have set 'parent_id'")
+
+ if parent_id == self.project_name:
+ parent_id = None
+
+ if not self.name or self.name is UNKNOWN_VALUE:
+ raise ValueError("Folder does not have set 'name'")
+
+ output = {
+ "name": self.name,
+ "folderType": self.folder_type,
+ "parentId": parent_id,
+ }
+ attrib = self.attribs.to_dict()
+ if attrib:
+ output["attrib"] = attrib
+
+ if self.active is not UNKNOWN_VALUE:
+ output["active"] = self.active
+
+ if self.thumbnail_id is not UNKNOWN_VALUE:
+ output["thumbnailId"] = self.thumbnail_id
+
+ if self._entity_hub.allow_data_changes:
+ output["data"] = self._data
+ return output
+
+
+class TaskEntity(BaseEntity):
+ """Entity representing a task on AYON server.
+
+ Args:
+ task_type (str): Type of task. Task type must be available in config
+ of project task types.
+ entity_id (Union[str, None]): Id of the entity. New id is created if
+ not passed.
+ parent_id (Union[str, None]): Id of parent entity.
+ name (str): Name of entity.
+ label (Optional[str]): Task label.
+ attribs (Dict[str, Any]): Attribute values.
+ data (Dict[str, Any]): Entity data (custom data).
+ thumbnail_id (Union[str, None]): Id of entity's thumbnail.
+ active (bool): Is entity active.
+ entity_hub (EntityHub): Object of entity hub which created object of
+ the entity.
+ created (Optional[bool]): Entity is new. When 'None' is passed the
+ value is defined based on value of 'entity_id'.
+ """
+
+ entity_type = "task"
+ parent_entity_types = ["folder"]
+
+ def __init__(self, task_type, *args, label=None, **kwargs):
+ super(TaskEntity, self).__init__(*args, **kwargs)
+
+ self._task_type = task_type
+ self._label = label
+
+ self._orig_task_type = task_type
+ self._orig_label = label
+
+ self._children_ids = set()
+
+ def lock(self):
+ super(TaskEntity, self).lock()
+ self._orig_task_type = self._task_type
+
+ def get_task_type(self):
+ return self._task_type
+
+ def set_task_type(self, task_type):
+ self._task_type = task_type
+
+ task_type = property(get_task_type, set_task_type)
+
+ def get_label(self):
+ return self._label
+
+ def set_label(self, label):
+ self._label = label
+
+ label = property(get_label, set_label)
+
+ def add_child(self, child):
+ raise ValueError("Task does not support to add children")
+
+ @property
+ def changes(self):
+ changes = self._get_default_changes()
+
+ if self._orig_parent_id != self._parent_id:
+ changes["folderId"] = self._parent_id
+
+ if self._orig_task_type != self._task_type:
+ changes["taskType"] = self._task_type
+
+ label = self._label
+ if self._name == label:
+ label = None
+
+ if label != self._orig_label:
+ changes["label"] = label
+
+ return changes
+
+ @classmethod
+ def from_entity_data(cls, task, entity_hub):
+ return cls(
+ task["taskType"],
+ entity_id=task["id"],
+ label=task["label"],
+ parent_id=task["folderId"],
+ name=task["name"],
+ data=task.get("data"),
+ attribs=task["ownAttrib"],
+ active=task["active"],
+ created=False,
+ entity_hub=entity_hub
+ )
+
+ def to_create_body_data(self):
+ if self.parent_id is UNKNOWN_VALUE:
+ raise ValueError("Task does not have set 'parent_id'")
+
+ output = {
+ "name": self.name,
+ "taskType": self.task_type,
+ "folderId": self.parent_id,
+ "attrib": self.attribs.to_dict(),
+ }
+ attrib = self.attribs.to_dict()
+ if attrib:
+ output["attrib"] = attrib
+
+ if self.active is not UNKNOWN_VALUE:
+ output["active"] = self.active
+
+ if (
+ self._entity_hub.allow_data_changes
+ and self._data is not UNKNOWN_VALUE
+ ):
+ output["data"] = self._data
+ return output
diff --git a/openpype/vendor/python/common/ayon_api/events.py b/openpype/vendor/python/common/ayon_api/events.py
new file mode 100644
index 0000000000..aa256f6cfc
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/events.py
@@ -0,0 +1,52 @@
+import copy
+
+
+class ServerEvent(object):
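+ """Object describing an event that can be sent to AYON server.
+
+ Only 'topic' is required, the remaining arguments are optional.
+ Use 'to_data' to convert the event to a plain dictionary.
+ """
+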
+ def __init__(
+ self,
+ topic,
+ sender=None,
+ event_hash=None,
+ project_name=None,
+ username=None,
+ dependencies=None,
+ description=None,
+ summary=None,
+ payload=None,
+ finished=True,
+ store=True,
+ ):
+ if dependencies is None:
+ dependencies = []
+ if payload is None:
+ payload = {}
+ if summary is None:
+ summary = {}
+
+ self.topic = topic
+ self.sender = sender
+ self.event_hash = event_hash
+ self.project_name = project_name
+ self.username = username
+ self.dependencies = dependencies
+ self.description = description
+ self.summary = summary
+ self.payload = payload
+ self.finished = finished
+ self.store = store
+
+ def to_data(self):
+ return {
+ "topic": self.topic,
+ "sender": self.sender,
+ "hash": self.event_hash,
+ "project": self.project_name,
+ "user": self.username,
+ "dependencies": copy.deepcopy(self.dependencies),
+ "description": self.description,
+ "description": self.description,
+ "summary": copy.deepcopy(self.summary),
+ "payload": self.payload,
+ "finished": self.finished,
+ "store": self.store
+ }
diff --git a/openpype/vendor/python/common/ayon_api/exceptions.py b/openpype/vendor/python/common/ayon_api/exceptions.py
new file mode 100644
index 0000000000..db4917e90a
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/exceptions.py
@@ -0,0 +1,107 @@
+import copy
+
+
+class UrlError(Exception):
+ """Url cannot be parsed as url.
+
+ Exception may contain hints of possible fixes of url that can be used in
+ UI if needed.
+ """
+
+ def __init__(self, message, title, hints=None):
+ if hints is None:
+ hints = []
+
+ self.title = title
+ self.hints = hints
+ super(UrlError, self).__init__(message)
+
+
+class ServerError(Exception):
+ pass
+
+
+class UnauthorizedError(ServerError):
+ pass
+
+
+class AuthenticationError(ServerError):
+ pass
+
+
+class ServerNotReached(ServerError):
+ pass
+
+
+class RequestError(Exception):
+ def __init__(self, message, response):
+ self.response = response
+ super(RequestError, self).__init__(message)
+
+
+class HTTPRequestError(RequestError):
+ pass
+
+
+class GraphQlQueryFailed(Exception):
+ def __init__(self, errors, query, variables):
+ if variables is None:
+ variables = {}
+
+ error_messages = []
+ for error in errors:
+ msg = error["message"]
+ path = error.get("path")
+ if path:
+ msg += " on item '{}'".format("/".join(path))
+ locations = error.get("locations")
+ if locations:
+ _locations = [
+ "Line {} Column {}".format(
+ location["line"], location["column"]
+ )
+ for location in locations
+ ]
+
+ msg += " ({})".format(" and ".join(_locations))
+ error_messages.append(msg)
+
+ message = "GraphQl query Failed"
+ if error_messages:
+ message = "{}: {}".format(message, " | ".join(error_messages))
+
+ self.errors = errors
+ self.query = query
+ self.variables = copy.deepcopy(variables)
+ super(GraphQlQueryFailed, self).__init__(message)
+
+
+class MissingEntityError(Exception):
+ pass
+
+
+class ProjectNotFound(MissingEntityError):
+ def __init__(self, project_name, message=None):
+ if not message:
+ message = "Project \"{}\" was not found".format(project_name)
+ self.project_name = project_name
+ super(ProjectNotFound, self).__init__(message)
+
+
+class FolderNotFound(MissingEntityError):
+ def __init__(self, project_name, folder_id, message=None):
+ self.project_name = project_name
+ self.folder_id = folder_id
+ if not message:
+ message = (
+ "Folder with id \"{}\" was not found in project \"{}\""
+ ).format(folder_id, project_name)
+ super(FolderNotFound, self).__init__(message)
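+ # Dict-like access to entity values, e.g. 'entity["name"]'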
+
+
+class FailedOperations(Exception):
+ pass
+
+
+class FailedServiceInit(Exception):
+ pass
diff --git a/openpype/vendor/python/common/ayon_api/graphql.py b/openpype/vendor/python/common/ayon_api/graphql.py
new file mode 100644
index 0000000000..854f207a00
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/graphql.py
@@ -0,0 +1,983 @@
+import copy
+import numbers
+from abc import ABCMeta, abstractmethod
+
+import six
+
+from .exceptions import GraphQlQueryFailed
+
+FIELD_VALUE = object()
+
+
+def fields_to_dict(fields):
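+ """Convert dot-separated field paths into nested dictionaries.
+
+ For example, fields {"id", "attrib.fps"} become
+ {"id": FIELD_VALUE, "attrib": {"fps": FIELD_VALUE}}.
+ """
+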
+ if not fields:
+ return None
+
+ output = {}
+ for field in fields:
+ hierarchy = field.split(".")
+ last = hierarchy.pop(-1)
+ value = output
+ for part in hierarchy:
+ if value is FIELD_VALUE:
+ break
+
+ if part not in value:
+ value[part] = {}
+ value = value[part]
+
+ if value is not FIELD_VALUE:
+ value[last] = FIELD_VALUE
+ return output
+
+
+class QueryVariable(object):
+ """Object representing single varible used in GraphQlQuery.
+
+ Variable definition is in GraphQl query header but it's value is used
+ in fields.
+
+ Args:
+ variable_name (str): Name of variable in query.
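+
+ Example:
+ A variable created with name "projectName" is rendered as
+ "$projectName" inside the query string.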
+ """
+
+ def __init__(self, variable_name):
+ self._variable_name = variable_name
+ self._name = "${}".format(variable_name)
+
+ @property
+ def name(self):
+ """Name used in field filter."""
+
+ return self._name
+
+ @property
+ def variable_name(self):
+ """Name of variable in query definition."""
+
+ return self._variable_name
+
+ def __hash__(self):
+ return self._name.__hash__()
+
+ def __str__(self):
+ return self._name
+
+ def __format__(self, *args, **kwargs):
+ return self._name.__format__(*args, **kwargs)
+
+
+class GraphQlQuery:
+ """GraphQl query which can have fields to query.
+
+ Single-use object which can be used only for one query. The object and
+ its children keep track of paging and progress.
+
+ Args:
+ name (str): Name of query.
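+
+ Example:
+ Minimal usage sketch. Variables without a value are skipped, so a
+ value must be set before the query string is calculated:
+
+ query = GraphQlQuery("ProjectQuery")
+ name_var = query.add_variable("projectName", "String!")
+ project_field = query.add_field("project")
+ project_field.set_filter("name", name_var)
+ project_field.add_field("code")
+ query.set_variable_value("projectName", "demo")
+ print(query.calculate_query())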
+ """
+
+ offset = 2
+
+ def __init__(self, name):
+ self._name = name
+ self._variables = {}
+ self._children = []
+ self._has_multiple_edge_fields = None
+
+ @property
+ def indent(self):
+ """Indentation for preparation of query string.
+
+ Returns:
+ int: Indent spaces.
+ """
+
+ return 0
+
+ @property
+ def child_indent(self):
+ """Indentation for preparation of query string used by children.
+
+ Returns:
+ int: Indent spaces for children.
+ """
+
+ return self.indent
+
+ @property
+ def need_query(self):
+ """Still need query from server.
+
+ Needed for edges which use pagination.
+
+ Returns:
+ bool: If still need query from server.
+ """
+
+ for child in self._children:
+ if child.need_query:
+ return True
+ return False
+
+ @property
+ def has_multiple_edge_fields(self):
+ if self._has_multiple_edge_fields is None:
+ edge_counter = 0
+ for child in self._children:
+ edge_counter += child.sum_edge_fields(2)
+ if edge_counter > 1:
+ break
+ self._has_multiple_edge_fields = edge_counter > 1
+
+ return self._has_multiple_edge_fields
+
+ def add_variable(self, key, value_type, value=None):
+ """Add variable to query.
+
+ Args:
+ key (str): Variable name.
+ value_type (str): Type of expected value in variables. This is
+ graphql type e.g. "[String!]", "Int", "Boolean", etc.
+ value (Any): Default value for variable. Can be changed later.
+
+ Returns:
+ QueryVariable: Created variable object.
+
+ Raises:
+ KeyError: If variable was already added before.
+ """
+
+ if key in self._variables:
+ raise KeyError(
+ "Variable \"{}\" was already set with type {}.".format(
+ key, value_type
+ )
+ )
+
+ variable = QueryVariable(key)
+ self._variables[key] = {
+ "type": value_type,
+ "variable": variable,
+ "value": value
+ }
+ return variable
+
+ def get_variable(self, key):
+ """Variable object.
+
+ Args:
+ key (str): Variable name added to headers.
+
+ Returns:
+ QueryVariable: Variable object used in query string.
+ """
+
+ return self._variables[key]["variable"]
+
+ def get_variable_value(self, key, default=None):
+ """Get Current value of variable.
+
+ Args:
+ key (str): Variable name.
+ default (Any): Default value returned when variable is not set.
+
+ Returns:
+ Any: Variable value.
+ """
+
+ variable_item = self._variables.get(key)
+ if variable_item:
+ return variable_item["value"]
+ return default
+
+ def set_variable_value(self, key, value):
+ """Set value for variable.
+
+ Args:
+ key (str): Variable name under which the value is stored.
+ value (Any): Variable value used in query. Variable is not used
+ if value is 'None'.
+ """
+
+ self._variables[key]["value"] = value
+
+ def get_variables_values(self):
+ """Calculate variable values used that should be used in query.
+
+ Variables with value set to 'None' are skipped.
+
+ Returns:
+ Dict[str, Any]: Variable values by their name.
+ """
+
+ output = {}
+ for key, item in self._variables.items():
+ value = item["value"]
+ if value is not None:
+ output[key] = item["value"]
+
+ return output
+
+ def add_obj_field(self, field):
+ """Add field object to children.
+
+ Args:
+ field (BaseGraphQlQueryField): Add field to query children.
+ """
+
+ if field in self._children:
+ return
+
+ self._children.append(field)
+ field.set_parent(self)
+
+ def add_field_with_edges(self, name):
+ """Add field with edges to query.
+
+ Args:
+ name (str): Field name e.g. 'tasks'.
+
+ Returns:
+ GraphQlQueryEdgeField: Created field object.
+ """
+
+ item = GraphQlQueryEdgeField(name, self)
+ self.add_obj_field(item)
+ return item
+
+ def add_field(self, name):
+ """Add field to query.
+
+ Args:
+ name (str): Field name e.g. 'id'.
+
+ Returns:
+ GraphQlQueryField: Created field object.
+ """
+
+ item = GraphQlQueryField(name, self)
+ self.add_obj_field(item)
+ return item
+
+ def calculate_query(self):
+ """Calculate query string which is sent to server.
+
+ Returns:
+ str: GraphQl string with variables and headers.
+
+ Raises:
+ ValueError: Query has no fields.
+ """
+
+ if not self._children:
+ raise ValueError("Missing fields to query")
+
+ variables = []
+ for item in self._variables.values():
+ if item["value"] is None:
+ continue
+
+ variables.append(
+ "{}: {}".format(item["variable"], item["type"])
+ )
+
+ variables_str = ""
+ if variables:
+ variables_str = "({})".format(",".join(variables))
+ header = "query {}{}".format(self._name, variables_str)
+
+ output = []
+ output.append(header + " {")
+ for field in self._children:
+ output.append(field.calculate_query())
+ output.append("}")
+
+ return "\n".join(output)
+
+ def parse_result(self, data, output, progress_data):
+ """Parse data from response for output.
+
+ Output is stored to passed 'output' variable. That's because of paging
+ during which objects must have access to both new and previous values.
+
+ Args:
+ data (Dict[str, Any]): Data received using calculated query.
+ output (Dict[str, Any]): Where parsed data are stored.
+ progress_data (Dict[str, Any]): Data used to keep track of paging
+ progress.
+ """
+
+ if not data:
+ return
+
+ for child in self._children:
+ child.parse_result(data, output, progress_data)
+
+ def query(self, con):
+ """Do a query from server.
+
+ Args:
+ con (ServerAPI): Connection to server with 'query' method.
+
+ Returns:
+ Dict[str, Any]: Parsed output from GraphQl query.
+ """
+
+ progress_data = {}
+ output = {}
+ while self.need_query:
+ query_str = self.calculate_query()
+ variables = self.get_variables_values()
+ response = con.query_graphql(query_str, variables)
+ if response.errors:
+ raise GraphQlQueryFailed(response.errors, query_str, variables)
+ self.parse_result(response.data["data"], output, progress_data)
+
+ return output
+
+ def continuous_query(self, con):
+ """Do a query from server.
+
+ Args:
+ con (ServerAPI): Connection to server with 'query' method.
+
+ Returns:
+ Dict[str, Any]: Parsed output from GraphQl query.
+ """
+
+ progress_data = {}
+ if self.has_multiple_edge_fields:
+ output = {}
+ while self.need_query:
+ query_str = self.calculate_query()
+ variables = self.get_variables_values()
+ response = con.query_graphql(query_str, variables)
+ if response.errors:
+ raise GraphQlQueryFailed(
+ response.errors, query_str, variables
+ )
+ self.parse_result(response.data["data"], output, progress_data)
+
+ yield output
+
+ else:
+ while self.need_query:
+ output = {}
+ query_str = self.calculate_query()
+ variables = self.get_variables_values()
+ response = con.query_graphql(query_str, variables)
+ if response.errors:
+ raise GraphQlQueryFailed(
+ response.errors, query_str, variables
+ )
+
+ self.parse_result(response.data["data"], output, progress_data)
+
+ yield output
+
+
+@six.add_metaclass(ABCMeta)
+class BaseGraphQlQueryField(object):
+ """Field in GraphQl query.
+
+ Args:
+ name (str): Name of field.
+ parent (Union[BaseGraphQlQueryField, GraphQlQuery]): Parent object of a
+ field.
+ """
+
+ def __init__(self, name, parent):
+ if isinstance(parent, GraphQlQuery):
+ query_item = parent
+ else:
+ query_item = parent.query_item
+
+ self._name = name
+ self._parent = parent
+
+ self._filters = {}
+
+ self._children = []
+ # Value is changed on first parse of result
+ self._need_query = True
+
+ self._query_item = query_item
+
+ self._path = None
+
+ def __repr__(self):
+ return "<{} {}>".format(self.__class__.__name__, self.path)
+
+ def add_variable(self, key, value_type, value=None):
+ """Add variable to query.
+
+ Args:
+ key (str): Variable name.
+ value_type (str): Type of expected value in variables. This is
+ graphql type e.g. "[String!]", "Int", "Boolean", etc.
+ value (Any): Default value for variable. Can be changed later.
+
+ Returns:
+ QueryVariable: Created variable object.
+
+ Raises:
+ KeyError: If variable was already added before.
+ """
+
+ return self._parent.add_variable(key, value_type, value)
+
+ def get_variable(self, key):
+ """Variable object.
+
+ Args:
+ key (str): Variable name added to headers.
+
+ Returns:
+ QueryVariable: Variable object used in query string.
+ """
+
+ return self._parent.get_variable(key)
+
+ @property
+ def need_query(self):
+ """Still need query from server.
+
+ Needed for edges which use pagination. Look into children values too.
+
+ Returns:
+ bool: If still need query from server.
+ """
+
+ if self._need_query:
+ return True
+
+ for child in self._children_iter():
+ if child.need_query:
+ return True
+ return False
+
+ def _children_iter(self):
+ """Iterate over all children fields of object.
+
+ Returns:
+ Iterator[BaseGraphQlQueryField]: Children fields.
+ """
+
+ for child in self._children:
+ yield child
+
+ def sum_edge_fields(self, max_limit=None):
+ """Check how many edge fields query has.
+
+ In case there are multiple edge fields, or they are nested, the
+ query can't yield mid-cursor results.
+
+ Args:
+ max_limit (int): Skip rest of counting if counter is bigger than
+ entered number.
+
+ Returns:
+ int: Count of edge fields.
+ """
+
+ counter = 0
+ if isinstance(self, GraphQlQueryEdgeField):
+ counter = 1
+
+ for child in self._children_iter():
+ counter += child.sum_edge_fields(max_limit)
+ if max_limit is not None and counter >= max_limit:
+ break
+ return counter
+
+ @property
+ def offset(self):
+ return self._query_item.offset
+
+ @property
+ def indent(self):
+ return self._parent.child_indent + self.offset
+
+ @property
+ @abstractmethod
+ def child_indent(self):
+ pass
+
+ @property
+ def query_item(self):
+ return self._query_item
+
+ @property
+ @abstractmethod
+ def has_edges(self):
+ pass
+
+ @property
+ def child_has_edges(self):
+ for child in self._children_iter():
+ if child.has_edges or child.child_has_edges:
+ return True
+ return False
+
+ @property
+ def path(self):
+ """Field path for debugging purposes.
+
+ Returns:
+ str: Field path in query.
+ """
+
+ if self._path is None:
+ if isinstance(self._parent, GraphQlQuery):
+ path = self._name
+ else:
+ path = "/".join((self._parent.path, self._name))
+ self._path = path
+ return self._path
+
+ def reset_cursor(self):
+ for child in self._children_iter():
+ child.reset_cursor()
+
+ def get_variable_value(self, *args, **kwargs):
+ return self._query_item.get_variable_value(*args, **kwargs)
+
+ def set_variable_value(self, *args, **kwargs):
+ return self._query_item.set_variable_value(*args, **kwargs)
+
+ def set_filter(self, key, value):
+ self._filters[key] = value
+
+ def has_filter(self, key):
+ return key in self._filters
+
+ def remove_filter(self, key):
+ self._filters.pop(key, None)
+
+ def set_parent(self, parent):
+ if self._parent is parent:
+ return
+ self._parent = parent
+ parent.add_obj_field(self)
+
+ def add_obj_field(self, field):
+ if field in self._children:
+ return
+
+ self._children.append(field)
+ field.set_parent(self)
+
+ def add_field_with_edges(self, name):
+ item = GraphQlQueryEdgeField(name, self)
+ self.add_obj_field(item)
+ return item
+
+ def add_field(self, name):
+ item = GraphQlQueryField(name, self)
+ self.add_obj_field(item)
+ return item
+
+ def _filter_value_to_str(self, value):
+ if isinstance(value, QueryVariable):
+ if self.get_variable_value(value.variable_name) is None:
+ return None
+ return str(value)
+
+ if isinstance(value, numbers.Number):
+ return str(value)
+
+ if isinstance(value, six.string_types):
+ return '"{}"'.format(value)
+
+ if isinstance(value, (list, set, tuple)):
+ return "[{}]".format(
+ ", ".join(
+ self._filter_value_to_str(item)
+ for item in iter(value)
+ )
+ )
+ raise TypeError(
+ "Unknown type to convert '{}'".format(str(type(value)))
+ )
+
+ def get_filters(self):
+ """Receive filters for item.
+
+ By default just use copy of set filters.
+
+ Returns:
+ Dict[str, Any]: Fields filters.
+ """
+
+ return copy.deepcopy(self._filters)
+
+ def _filters_to_string(self):
+ filters = self.get_filters()
+ if not filters:
+ return ""
+
+ filter_items = []
+ for key, value in filters.items():
+ string_value = self._filter_value_to_str(value)
+ if string_value is None:
+ continue
+
+ filter_items.append("{}: {}".format(key, string_value))
+
+ if not filter_items:
+ return ""
+ return "({})".format(", ".join(filter_items))
+
+ def _fake_children_parse(self):
+ """Mark children as they don't need query."""
+
+ for child in self._children_iter():
+ child.parse_result({}, {}, {})
+
+ @abstractmethod
+ def calculate_query(self):
+ pass
+
+ @abstractmethod
+ def parse_result(self, data, output, progress_data):
+ pass
+
+
+class GraphQlQueryField(BaseGraphQlQueryField):
+ has_edges = False
+
+ @property
+ def child_indent(self):
+ return self.indent
+
+ def parse_result(self, data, output, progress_data):
+ if not isinstance(data, dict):
+ raise TypeError("{} Expected 'dict' type got '{}'".format(
+ self._name, str(type(data))
+ ))
+
+ self._need_query = False
+ value = data.get(self._name)
+ if value is None:
+ self._fake_children_parse()
+ if self._name in data:
+ output[self._name] = None
+ return
+
+ if not self._children:
+ output[self._name] = value
+ return
+
+ output_value = output.get(self._name)
+ if isinstance(value, dict):
+ if output_value is None:
+ output_value = {}
+ output[self._name] = output_value
+
+ for child in self._children:
+ child.parse_result(value, output_value, progress_data)
+ return
+
+ if output_value is None:
+ output_value = []
+ output[self._name] = output_value
+
+ if not value:
+ self._fake_children_parse()
+ return
+
+ diff = len(value) - len(output_value)
+ if diff > 0:
+ for _ in range(diff):
+ output_value.append({})
+
+ for idx, item in enumerate(value):
+ item_value = output_value[idx]
+ for child in self._children:
+ child.parse_result(item, item_value, progress_data)
+
+ def calculate_query(self):
+ offset = self.indent * " "
+ header = "{}{}{}".format(
+ offset,
+ self._name,
+ self._filters_to_string()
+ )
+ if not self._children:
+ return header
+
+ output = []
+ output.append(header + " {")
+
+ output.extend([
+ field.calculate_query()
+ for field in self._children
+ ])
+ output.append(offset + "}")
+
+ return "\n".join(output)
+
+
+class GraphQlQueryEdgeField(BaseGraphQlQueryField):
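+ """Query field which uses edges and cursor based pagination."""
+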
+ has_edges = True
+
+ def __init__(self, *args, **kwargs):
+ super(GraphQlQueryEdgeField, self).__init__(*args, **kwargs)
+ self._cursor = None
+ self._edge_children = []
+
+ @property
+ def child_indent(self):
+ offset = self.offset * 2
+ return self.indent + offset
+
+ def _children_iter(self):
+ for child in super(GraphQlQueryEdgeField, self)._children_iter():
+ yield child
+
+ for child in self._edge_children:
+ yield child
+
+ def add_obj_field(self, field):
+ if field in self._edge_children:
+ return
+
+ super(GraphQlQueryEdgeField, self).add_obj_field(field)
+
+ def add_obj_edge_field(self, field):
+ if field in self._edge_children or field in self._children:
+ return
+
+ self._edge_children.append(field)
+ field.set_parent(self)
+
+ def add_edge_field(self, name):
+ item = GraphQlQueryField(name, self)
+ self.add_obj_edge_field(item)
+ return item
+
+ def reset_cursor(self):
+ # Reset cursor only for edges
+ self._cursor = None
+ self._need_query = True
+
+ super(GraphQlQueryEdgeField, self).reset_cursor()
+
+ def parse_result(self, data, output, progress_data):
+ if not isinstance(data, dict):
+ raise TypeError("{} Expected 'dict' type got '{}'".format(
+ self._name, str(type(data))
+ ))
+
+ value = data.get(self._name)
+ if value is None:
+ self._fake_children_parse()
+ self._need_query = False
+ return
+
+ if self._name in output:
+ node_values = output[self._name]
+ else:
+ node_values = []
+ output[self._name] = node_values
+
+ handle_cursors = self.child_has_edges
+ if handle_cursors:
+ cursor_key = self._get_cursor_key()
+ if cursor_key in progress_data:
+ nodes_by_cursor = progress_data[cursor_key]
+ else:
+ nodes_by_cursor = {}
+ progress_data[cursor_key] = nodes_by_cursor
+
+ page_info = value["pageInfo"]
+ new_cursor = page_info["endCursor"]
+ self._need_query = page_info["hasNextPage"]
+ edges = value["edges"]
+ # Fake result parse
+ if not edges:
+ self._fake_children_parse()
+
+ for edge in edges:
+ if not handle_cursors:
+ edge_value = {}
+ node_values.append(edge_value)
+ else:
+ edge_cursor = edge["cursor"]
+ edge_value = nodes_by_cursor.get(edge_cursor)
+ if edge_value is None:
+ edge_value = {}
+ nodes_by_cursor[edge_cursor] = edge_value
+ node_values.append(edge_value)
+
+ for child in self._edge_children:
+ child.parse_result(edge, edge_value, progress_data)
+
+ for child in self._children:
+ child.parse_result(edge["node"], edge_value, progress_data)
+
+ if not self._need_query:
+ return
+
+ change_cursor = True
+ for child in self._children_iter():
+ if child.need_query:
+ change_cursor = False
+
+ if change_cursor:
+ for child in self._children_iter():
+ child.reset_cursor()
+ self._cursor = new_cursor
+
+ def _get_cursor_key(self):
+ return "{}/__cursor__".format(self.path)
+
+ def get_filters(self):
+ filters = super(GraphQlQueryEdgeField, self).get_filters()
+
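+ # Paginate edges by 300 items per request; continue after last cursor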
+ filters["first"] = 300
+ if self._cursor:
+ filters["after"] = self._cursor
+ return filters
+
+ def calculate_query(self):
+ if not self._children and not self._edge_children:
+ raise ValueError("Missing child definitions for edges {}".format(
+ self.path
+ ))
+
+ offset = self.indent * " "
+ header = "{}{}{}".format(
+ offset,
+ self._name,
+ self._filters_to_string()
+ )
+
+ output = []
+ output.append(header + " {")
+
+ edges_offset = offset + self.offset * " "
+ node_offset = edges_offset + self.offset * " "
+ output.append(edges_offset + "edges {")
+ for field in self._edge_children:
+ output.append(field.calculate_query())
+
+ if self._children:
+ output.append(node_offset + "node {")
+
+ for field in self._children:
+ output.append(
+ field.calculate_query()
+ )
+
+ output.append(node_offset + "}")
+ if self.child_has_edges:
+ output.append(node_offset + "cursor")
+
+ output.append(edges_offset + "}")
+
+ # Add page information
+ output.append(edges_offset + "pageInfo {")
+ for page_key in (
+ "endCursor",
+ "hasNextPage",
+ ):
+ output.append(node_offset + page_key)
+ output.append(edges_offset + "}")
+ output.append(offset + "}")
+
+ return "\n".join(output)
+
+
+INTROSPECTION_QUERY = """
+ query IntrospectionQuery {
+ __schema {
+ queryType { name }
+ mutationType { name }
+ subscriptionType { name }
+ types {
+ ...FullType
+ }
+ directives {
+ name
+ description
+ locations
+ args {
+ ...InputValue
+ }
+ }
+ }
+ }
+ fragment FullType on __Type {
+ kind
+ name
+ description
+ fields(includeDeprecated: true) {
+ name
+ description
+ args {
+ ...InputValue
+ }
+ type {
+ ...TypeRef
+ }
+ isDeprecated
+ deprecationReason
+ }
+ inputFields {
+ ...InputValue
+ }
+ interfaces {
+ ...TypeRef
+ }
+ enumValues(includeDeprecated: true) {
+ name
+ description
+ isDeprecated
+ deprecationReason
+ }
+ possibleTypes {
+ ...TypeRef
+ }
+ }
+ fragment InputValue on __InputValue {
+ name
+ description
+ type { ...TypeRef }
+ defaultValue
+ }
+ fragment TypeRef on __Type {
+ kind
+ name
+ ofType {
+ kind
+ name
+ ofType {
+ kind
+ name
+ ofType {
+ kind
+ name
+ ofType {
+ kind
+ name
+ ofType {
+ kind
+ name
+ ofType {
+ kind
+ name
+ ofType {
+ kind
+ name
+ }
+ }
+ }
+ }
+ }
+ }
+ }
+ }
+"""
diff --git a/openpype/vendor/python/common/ayon_api/graphql_queries.py b/openpype/vendor/python/common/ayon_api/graphql_queries.py
new file mode 100644
index 0000000000..4af8c53e4e
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/graphql_queries.py
@@ -0,0 +1,464 @@
+import collections
+
+from .graphql import FIELD_VALUE, GraphQlQuery
+
+
+def fields_to_dict(fields):
+ if not fields:
+ return None
+
+ output = {}
+ for field in fields:
+ hierarchy = field.split(".")
+ last = hierarchy.pop(-1)
+ value = output
+ for part in hierarchy:
+ if value is FIELD_VALUE:
+ break
+
+ if part not in value:
+ value[part] = {}
+ value = value[part]
+
+ if value is not FIELD_VALUE:
+ value[last] = FIELD_VALUE
+ return output
+
+
+def add_links_fields(entity_field, nested_fields):
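+ """Add 'links' fields to an entity field when requested.
+
+ The 'links' key is popped from 'nested_fields' so the caller does
+ not process it again as a regular field.
+ """
+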
+ if "links" not in nested_fields:
+ return
+ links_fields = nested_fields.pop("links")
+
+ link_edge_fields = {
+ "id",
+ "linkType",
+ "projectName",
+ "entityType",
+ "entityId",
+ "direction",
+ "description",
+ "author",
+ }
+ if isinstance(links_fields, dict):
+ simple_fields = set(links_fields)
+ simple_variant = len(simple_fields - link_edge_fields) == 0
+ else:
+ simple_variant = True
+ simple_fields = link_edge_fields
+
+ link_field = entity_field.add_field_with_edges("links")
+
+ link_type_var = link_field.add_variable("linkTypes", "[String!]")
+ link_dir_var = link_field.add_variable("linkDirection", "String!")
+ link_field.set_filter("linkTypes", link_type_var)
+ link_field.set_filter("direction", link_dir_var)
+
+ if simple_variant:
+ for key in simple_fields:
+ link_field.add_edge_field(key)
+ return
+
+ query_queue = collections.deque()
+ for key, value in links_fields.items():
+ if key in link_edge_fields:
+ link_field.add_edge_field(key)
+ continue
+ query_queue.append((key, value, link_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+
+
+def project_graphql_query(fields):
+ query = GraphQlQuery("ProjectQuery")
+ project_name_var = query.add_variable("projectName", "String!")
+ project_field = query.add_field("project")
+ project_field.set_filter("name", project_name_var)
+
+ nested_fields = fields_to_dict(fields)
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, project_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
+
+
+def projects_graphql_query(fields):
+ query = GraphQlQuery("ProjectsQuery")
+ projects_field = query.add_field_with_edges("projects")
+
+ nested_fields = fields_to_dict(fields)
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, projects_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
+
+
+def product_types_query(fields):
+ query = GraphQlQuery("ProductTypes")
+ product_types_field = query.add_field("productTypes")
+
+ nested_fields = fields_to_dict(fields)
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, product_types_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
+
+
+def project_product_types_query(fields):
+ query = GraphQlQuery("ProjectProductTypes")
+ project_query = query.add_field("project")
+ project_name_var = query.add_variable("projectName", "String!")
+ project_query.set_filter("name", project_name_var)
+ product_types_field = project_query.add_field("productTypes")
+ nested_fields = fields_to_dict(fields)
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, product_types_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
+
+
+def folders_graphql_query(fields):
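+ """Build GraphQl query for folders with given fields.
+
+ Usage sketch, assuming 'con' is a connection object providing
+ 'query_graphql' (e.g. ServerAPI):
+
+ query = folders_graphql_query({"id", "name", "path"})
+ query.set_variable_value("projectName", "my_project")
+ for parsed_data in query.continuous_query(con):
+ for folder in parsed_data["project"]["folders"]:
+ print(folder["path"])
+ """
+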
+ query = GraphQlQuery("FoldersQuery")
+ project_name_var = query.add_variable("projectName", "String!")
+ folder_ids_var = query.add_variable("folderIds", "[String!]")
+ parent_folder_ids_var = query.add_variable("parentFolderIds", "[String!]")
+ folder_paths_var = query.add_variable("folderPaths", "[String!]")
+ folder_names_var = query.add_variable("folderNames", "[String!]")
+ has_products_var = query.add_variable("folderHasProducts", "Boolean!")
+
+ project_field = query.add_field("project")
+ project_field.set_filter("name", project_name_var)
+
+ folders_field = project_field.add_field_with_edges("folders")
+ folders_field.set_filter("ids", folder_ids_var)
+ folders_field.set_filter("parentIds", parent_folder_ids_var)
+ folders_field.set_filter("names", folder_names_var)
+ folders_field.set_filter("paths", folder_paths_var)
+ folders_field.set_filter("hasProducts", has_products_var)
+
+ nested_fields = fields_to_dict(fields)
+ add_links_fields(folders_field, nested_fields)
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, folders_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
+
+
+def tasks_graphql_query(fields):
+ query = GraphQlQuery("TasksQuery")
+ project_name_var = query.add_variable("projectName", "String!")
+ task_ids_var = query.add_variable("taskIds", "[String!]")
+ task_names_var = query.add_variable("taskNames", "[String!]")
+ task_types_var = query.add_variable("taskTypes", "[String!]")
+ folder_ids_var = query.add_variable("folderIds", "[String!]")
+
+ project_field = query.add_field("project")
+ project_field.set_filter("name", project_name_var)
+
+ tasks_field = project_field.add_field_with_edges("tasks")
+ tasks_field.set_filter("ids", task_ids_var)
+ # WARNING: At the moment this was created, 'names' filter was not supported
+ tasks_field.set_filter("names", task_names_var)
+ tasks_field.set_filter("taskTypes", task_types_var)
+ tasks_field.set_filter("folderIds", folder_ids_var)
+
+ nested_fields = fields_to_dict(fields)
+ add_links_fields(tasks_field, nested_fields)
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, tasks_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
+
+
+def products_graphql_query(fields):
+ query = GraphQlQuery("ProductsQuery")
+
+ project_name_var = query.add_variable("projectName", "String!")
+ folder_ids_var = query.add_variable("folderIds", "[String!]")
+ product_ids_var = query.add_variable("productIds", "[String!]")
+ product_names_var = query.add_variable("productNames", "[String!]")
+
+ project_field = query.add_field("project")
+ project_field.set_filter("name", project_name_var)
+
+ products_field = project_field.add_field_with_edges("products")
+ products_field.set_filter("ids", product_ids_var)
+ products_field.set_filter("names", product_names_var)
+ products_field.set_filter("folderIds", folder_ids_var)
+
+ nested_fields = fields_to_dict(set(fields))
+ add_links_fields(products_field, nested_fields)
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, products_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
+
+
+def versions_graphql_query(fields):
+ query = GraphQlQuery("VersionsQuery")
+
+ project_name_var = query.add_variable("projectName", "String!")
+ product_ids_var = query.add_variable("productIds", "[String!]")
+ version_ids_var = query.add_variable("versionIds", "[String!]")
+ versions_var = query.add_variable("versions", "[Int!]")
+ hero_only_var = query.add_variable("heroOnly", "Boolean")
+ latest_only_var = query.add_variable("latestOnly", "Boolean")
+ hero_or_latest_only_var = query.add_variable(
+ "heroOrLatestOnly", "Boolean"
+ )
+
+ project_field = query.add_field("project")
+ project_field.set_filter("name", project_name_var)
+
+ products_field = project_field.add_field_with_edges("versions")
+ products_field.set_filter("ids", version_ids_var)
+ products_field.set_filter("productIds", product_ids_var)
+ products_field.set_filter("versions", versions_var)
+ products_field.set_filter("heroOnly", hero_only_var)
+ products_field.set_filter("latestOnly", latest_only_var)
+ products_field.set_filter("heroOrLatestOnly", hero_or_latest_only_var)
+
+ nested_fields = fields_to_dict(set(fields))
+ add_links_fields(products_field, nested_fields)
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, products_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
+
+
+def representations_graphql_query(fields):
+ query = GraphQlQuery("RepresentationsQuery")
+
+ project_name_var = query.add_variable("projectName", "String!")
+ repre_ids_var = query.add_variable("representationIds", "[String!]")
+ repre_names_var = query.add_variable("representationNames", "[String!]")
+ version_ids_var = query.add_variable("versionIds", "[String!]")
+
+ project_field = query.add_field("project")
+ project_field.set_filter("name", project_name_var)
+
+ repres_field = project_field.add_field_with_edges("representations")
+ repres_field.set_filter("ids", repre_ids_var)
+ repres_field.set_filter("versionIds", version_ids_var)
+ repres_field.set_filter("names", repre_names_var)
+
+ nested_fields = fields_to_dict(set(fields))
+ add_links_fields(repres_field, nested_fields)
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, repres_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
+
+
+def representations_parents_qraphql_query(
+ version_fields, product_fields, folder_fields
+):
+ query = GraphQlQuery("RepresentationsParentsQuery")
+
+ project_name_var = query.add_variable("projectName", "String!")
+ repre_ids_var = query.add_variable("representationIds", "[String!]")
+
+ project_field = query.add_field("project")
+ project_field.set_filter("name", project_name_var)
+
+ repres_field = project_field.add_field_with_edges("representations")
+ repres_field.add_field("id")
+ repres_field.set_filter("ids", repre_ids_var)
+ version_field = repres_field.add_field("version")
+
+ fields_queue = collections.deque()
+ for key, value in fields_to_dict(version_fields).items():
+ fields_queue.append((key, value, version_field))
+
+ product_field = version_field.add_field("product")
+ for key, value in fields_to_dict(product_fields).items():
+ fields_queue.append((key, value, product_field))
+
+ folder_field = product_field.add_field("folder")
+ for key, value in fields_to_dict(folder_fields).items():
+ fields_queue.append((key, value, folder_field))
+
+ while fields_queue:
+ item = fields_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ fields_queue.append((k, v, field))
+
+ return query
+
+
+def workfiles_info_graphql_query(fields):
+ query = GraphQlQuery("WorkfilesInfo")
+ project_name_var = query.add_variable("projectName", "String!")
+ workfiles_info_ids = query.add_variable("workfileIds", "[String!]")
+ task_ids_var = query.add_variable("taskIds", "[String!]")
+ paths_var = query.add_variable("paths", "[String!]")
+
+ project_field = query.add_field("project")
+ project_field.set_filter("name", project_name_var)
+
+ workfiles_field = project_field.add_field_with_edges("workfiles")
+ workfiles_field.set_filter("ids", workfiles_info_ids)
+ workfiles_field.set_filter("taskIds", task_ids_var)
+ workfiles_field.set_filter("paths", paths_var)
+
+ nested_fields = fields_to_dict(set(fields))
+ add_links_fields(workfiles_field, nested_fields)
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, workfiles_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
+
+
+def events_graphql_query(fields):
+ query = GraphQlQuery("Events")
+ topics_var = query.add_variable("eventTopics", "[String!]")
+ projects_var = query.add_variable("projectNames", "[String!]")
+ states_var = query.add_variable("eventStates", "[String!]")
+ users_var = query.add_variable("eventUsers", "[String!]")
+ include_logs_var = query.add_variable("includeLogsFilter", "Boolean!")
+
+ events_field = query.add_field_with_edges("events")
+ events_field.set_filter("topics", topics_var)
+ events_field.set_filter("projects", projects_var)
+ events_field.set_filter("states", states_var)
+ events_field.set_filter("users", users_var)
+ events_field.set_filter("includeLogs", include_logs_var)
+
+ nested_fields = fields_to_dict(set(fields))
+
+ query_queue = collections.deque()
+ for key, value in nested_fields.items():
+ query_queue.append((key, value, events_field))
+
+ while query_queue:
+ item = query_queue.popleft()
+ key, value, parent = item
+ field = parent.add_field(key)
+ if value is FIELD_VALUE:
+ continue
+
+ for k, v in value.items():
+ query_queue.append((k, v, field))
+ return query
diff --git a/openpype/vendor/python/common/ayon_api/operations.py b/openpype/vendor/python/common/ayon_api/operations.py
new file mode 100644
index 0000000000..7cf610a566
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/operations.py
@@ -0,0 +1,706 @@
+import copy
+import collections
+import uuid
+from abc import ABCMeta, abstractmethod
+
+import six
+
+from ._api import get_server_api_connection
+from .utils import create_entity_id, REMOVED_VALUE
+
+
+def _create_or_convert_to_id(entity_id=None):
+ if entity_id is None:
+ return create_entity_id()
+
+ # Validate that the value can be converted to uuid
+ uuid.UUID(entity_id)
+ return entity_id
+
+
+def new_folder_entity(
+ name,
+ folder_type,
+ parent_id=None,
+ attribs=None,
+ data=None,
+ thumbnail_id=None,
+ entity_id=None
+):
+ """Create skeleton data of folder entity.
+
+ Args:
+ name (str): Folder name, considered a unique identifier of the folder in the project.
+ folder_type (str): Type of folder.
+ parent_id (Optional[str]): Id of parent folder.
+ attribs (Optional[Dict[str, Any]]): Explicitly set attributes
+ of folder.
+ data (Optional[Dict[str, Any]]): Custom folder data. Empty dictionary
+ is used if not passed.
+ thumbnail_id (Optional[str]): Id of thumbnail related to folder.
+ entity_id (Optional[str]): Predefined id of entity. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of folder entity.
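+
+ Example:
+ >>> # A minimal sketch; the name and folder type are placeholders.
+ >>> folder = new_folder_entity("sh010", "Shot")
+ >>> folder["folderType"]
+ 'Shot'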
+ """
+
+ if attribs is None:
+ attribs = {}
+
+ if data is None:
+ data = {}
+
+ if parent_id is not None:
+ parent_id = _create_or_convert_to_id(parent_id)
+
+ return {
+ "id": _create_or_convert_to_id(entity_id),
+ "name": name,
+ # This will be ignored
+ "folderType": folder_type,
+ "parentId": parent_id,
+ "data": data,
+ "attrib": attribs,
+ "thumbnailId": thumbnail_id
+ }
+
+
+def new_product_entity(
+ name,
+ product_type,
+ folder_id,
+ status=None,
+ attribs=None,
+ data=None,
+ entity_id=None
+):
+ """Create skeleton data of product entity.
+
+ Args:
+ name (str): Product name, considered a unique identifier of the
+ product under its folder.
+ product_type (str): Product type.
+ folder_id (str): Id of parent folder.
+ status (Optional[str]): Product status.
+ attribs (Optional[Dict[str, Any]]): Explicitly set attributes
+ of product.
+ data (Optional[Dict[str, Any]]): product entity data. Empty dictionary
+ is used if not passed.
+ entity_id (Optional[str]): Predefined id of entity. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of product entity.
+ """
+
+ if attribs is None:
+ attribs = {}
+
+ if data is None:
+ data = {}
+
+ output = {
+ "id": _create_or_convert_to_id(entity_id),
+ "name": name,
+ "productType": product_type,
+ "attrib": attribs,
+ "data": data,
+ "folderId": _create_or_convert_to_id(folder_id),
+ }
+ if status:
+ output["status"] = status
+ return output
+
+
+def new_version_entity(
+ version,
+ product_id,
+ task_id=None,
+ thumbnail_id=None,
+ author=None,
+ attribs=None,
+ data=None,
+ entity_id=None
+):
+ """Create skeleton data of version entity.
+
+ Args:
+ version (int): Version number, considered a unique identifier of the
+ version under its product.
+ product_id (str): Id of parent product.
+ task_id (Optional[str]): Id of task under which product was created.
+ thumbnail_id (Optional[str]): Thumbnail related to version.
+ author (Optional[str]): Name of version author.
+ attribs (Optional[Dict[str, Any]]): Explicitly set attributes
+ of version.
+ data (Optional[Dict[str, Any]]): Version entity custom data.
+ entity_id (Optional[str]): Predefined id of entity. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of version entity.
+ """
+
+ if attribs is None:
+ attribs = {}
+
+ if data is None:
+ data = {}
+
+ output = {
+ "id": _create_or_convert_to_id(entity_id),
+ "version": int(version),
+ "productId": _create_or_convert_to_id(product_id),
+ "attrib": attribs,
+ "data": data
+ }
+ if task_id:
+ output["taskId"] = task_id
+ if thumbnail_id:
+ output["thumbnailId"] = thumbnail_id
+ if author:
+ output["author"] = author
+ return output
+
+
+def new_hero_version_entity(
+ version,
+ product_id,
+ task_id=None,
+ thumbnail_id=None,
+ author=None,
+ attribs=None,
+ data=None,
+ entity_id=None
+):
+ """Create skeleton data of hero version entity.
+
+ Args:
+ version (int): Version number, considered a unique identifier of the
+ version under its product. Should be the same as the standard
+ version if there is any.
+ product_id (str): Id of parent product.
+ task_id (Optional[str]): Id of task under which product was created.
+ thumbnail_id (Optional[str]): Thumbnail related to version.
+ author (Optional[str]): Name of version author.
+ attribs (Optional[Dict[str, Any]]): Explicitly set attributes
+ of version.
+ data (Optional[Dict[str, Any]]): Version entity data.
+ entity_id (Optional[str]): Predefined id of entity. New id is
+ created if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of version entity.
+ """
+
+ if attribs is None:
+ attribs = {}
+
+ if data is None:
+ data = {}
+
+ output = {
+ "id": _create_or_convert_to_id(entity_id),
+ "version": -abs(int(version)),
+ "productId": product_id,
+ "attrib": attribs,
+ "data": data
+ }
+ if task_id:
+ output["taskId"] = task_id
+ if thumbnail_id:
+ output["thumbnailId"] = thumbnail_id
+ if author:
+ output["author"] = author
+ return output
+
+
+def new_representation_entity(
+ name, version_id, attribs=None, data=None, entity_id=None
+):
+ """Create skeleton data of representation entity.
+
+ Args:
+ name (str): Representation name, considered a unique identifier
+ of the representation under its version.
+ version_id (str): Id of parent version.
+ attribs (Optional[Dict[str, Any]]): Explicitly set attributes
+ of representation.
+ data (Optional[Dict[str, Any]]): Representation entity data.
+ entity_id (Optional[str]): Predefined id of entity. New id is created
+ if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of representation entity.
+ """
+
+ if attribs is None:
+ attribs = {}
+
+ if data is None:
+ data = {}
+
+ return {
+ "id": _create_or_convert_to_id(entity_id),
+ "versionId": _create_or_convert_to_id(version_id),
+ "name": name,
+ "data": data,
+ "attrib": attribs
+ }
+
+
+def new_workfile_info_doc(
+ filename, folder_id, task_name, files, data=None, entity_id=None
+):
+ """Create skeleton data of workfile info entity.
+
+ Workfile entity is at this moment used primarily for artist notes.
+
+ Args:
+ filename (str): Filename of workfile.
+ folder_id (str): Id of folder under which the workfile lives.
+ task_name (str): Task under which the workfile was created.
+ files (List[str]): List of rootless filepaths related to workfile.
+ data (Optional[Dict[str, Any]]): Additional metadata.
+ entity_id (Optional[str]): Predefined id of entity. New id is created
+ if not passed.
+
+ Returns:
+ Dict[str, Any]: Skeleton of workfile info entity.
+ """
+
+ if not data:
+ data = {}
+
+ return {
+ "id": _create_or_convert_to_id(entity_id),
+ "parent": _create_or_convert_to_id(folder_id),
+ "task_name": task_name,
+ "filename": filename,
+ "data": data,
+ "files": files
+ }
+
+
+@six.add_metaclass(ABCMeta)
+class AbstractOperation(object):
+ """Base operation class.
+
+ Operation represents a call to the database. The call can create, change or
+ remove data.
+
+ Args:
+ project_name (str): Project on which the operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'folder', 'representation' etc.
+ """
+
+ def __init__(self, project_name, entity_type, session):
+ self._project_name = project_name
+ self._entity_type = entity_type
+ self._session = session
+ self._id = str(uuid.uuid4())
+
+ @property
+ def project_name(self):
+ return self._project_name
+
+ @property
+ def id(self):
+ """Identifier of operation."""
+
+ return self._id
+
+ @property
+ def entity_type(self):
+ return self._entity_type
+
+ @property
+ @abstractmethod
+ def operation_name(self):
+ """Stringified type of operation."""
+
+ pass
+
+ def to_data(self):
+ """Convert opration to data that can be converted to json or others.
+
+ Returns:
+ Dict[str, Any]: Description of operation.
+ """
+
+ return {
+ "id": self._id,
+ "entity_type": self.entity_type,
+ "project_name": self.project_name,
+ "operation": self.operation_name
+ }
+
+
+class CreateOperation(AbstractOperation):
+ """Opeartion to create an entity.
+
+ Args:
+ project_name (str): Project on which the operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'folder', 'representation' etc.
+ data (Dict[str, Any]): Data of entity that will be created.
+ """
+
+ operation_name = "create"
+
+ def __init__(self, project_name, entity_type, data, session):
+ if not data:
+ data = {}
+ else:
+ data = copy.deepcopy(dict(data))
+
+ if "id" not in data:
+ data["id"] = create_entity_id()
+
+ self._data = data
+ super(CreateOperation, self).__init__(
+ project_name, entity_type, session
+ )
+
+ def __setitem__(self, key, value):
+ self.set_value(key, value)
+
+ def __getitem__(self, key):
+ return self.data[key]
+
+ def set_value(self, key, value):
+ self.data[key] = value
+
+ def get(self, key, *args, **kwargs):
+ return self.data.get(key, *args, **kwargs)
+
+ @property
+ def con(self):
+ return self.session.con
+
+ @property
+ def session(self):
+ return self._session
+
+ @property
+ def entity_id(self):
+ return self._data["id"]
+
+ @property
+ def data(self):
+ return self._data
+
+ def to_data(self):
+ output = super(CreateOperation, self).to_data()
+ output["data"] = copy.deepcopy(self.data)
+ return output
+
+ def to_server_operation(self):
+ return {
+ "id": self.id,
+ "type": "create",
+ "entityType": self.entity_type,
+ "entityId": self.entity_id,
+ "data": self._data
+ }
+
+
+class UpdateOperation(AbstractOperation):
+ """Operation to update an entity.
+
+ Args:
+ project_name (str): Project on which the operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'folder', 'representation' etc.
+ entity_id (str): Identifier of an entity.
+ update_data (Dict[str, Any]): Key -> value changes that will be set in
+ database. If value is set to 'REMOVED_VALUE' the key will be
+ removed. Only first level of dictionary is checked (on purpose).
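+
+ Example:
+ >>> # A minimal sketch; the project name and entity id are
+ >>> # placeholders. 'REMOVED_VALUE' marks a key for removal.
+ >>> session = OperationsSession()
+ >>> op = UpdateOperation(
+ ... "my_project",
+ ... "folder",
+ ... "<folder-id>",
+ ... {"label": "New label", "thumbnailId": REMOVED_VALUE},
+ ... session
+ ... )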
+ """
+
+ operation_name = "update"
+
+ def __init__(
+ self, project_name, entity_type, entity_id, update_data, session
+ ):
+ super(UpdateOperation, self).__init__(
+ project_name, entity_type, session
+ )
+
+ self._entity_id = entity_id
+ self._update_data = update_data
+
+ @property
+ def entity_id(self):
+ return self._entity_id
+
+ @property
+ def update_data(self):
+ return self._update_data
+
+ @property
+ def con(self):
+ return self.session.con
+
+ @property
+ def session(self):
+ return self._session
+
+ def to_data(self):
+ changes = {}
+ for key, value in self._update_data.items():
+ if value is REMOVED_VALUE:
+ value = None
+ changes[key] = value
+
+ output = super(UpdateOperation, self).to_data()
+ output.update({
+ "entity_id": self.entity_id,
+ "changes": changes
+ })
+ return output
+
+ def to_server_operation(self):
+ if not self._update_data:
+ return None
+
+ update_data = {}
+ for key, value in self._update_data.items():
+ if value is REMOVED_VALUE:
+ value = None
+ update_data[key] = value
+
+ return {
+ "id": self.id,
+ "type": "update",
+ "entityType": self.entity_type,
+ "entityId": self.entity_id,
+ "data": update_data
+ }
+
+
+class DeleteOperation(AbstractOperation):
+ """Opeartion to delete an entity.
+
+ Args:
+ project_name (str): Project on which the operation will happen.
+ entity_type (str): Type of entity on which change happens.
+ e.g. 'folder', 'representation' etc.
+ entity_id (str): Entity id that will be removed.
+ """
+
+ operation_name = "delete"
+
+ def __init__(self, project_name, entity_type, entity_id, session):
+ self._entity_id = entity_id
+
+ super(DeleteOperation, self).__init__(
+ project_name, entity_type, session
+ )
+
+ @property
+ def entity_id(self):
+ return self._entity_id
+
+ @property
+ def con(self):
+ return self.session.con
+
+ @property
+ def session(self):
+ return self._session
+
+ def to_data(self):
+ output = super(DeleteOperation, self).to_data()
+ output["entity_id"] = self.entity_id
+ return output
+
+ def to_server_operation(self):
+ return {
+ "id": self.id,
+ "type": self.operation_name,
+ "entityId": self.entity_id,
+ "entityType": self.entity_type,
+ }
+
+
+class OperationsSession(object):
+ """Session storing operations that should happen in an order.
+
+ At this moment does not handle anything special can be sonsidered as
+ stupid list of operations that will happen after each other. If creation
+ of same entity is there multiple times it's handled in any way and entity
+ values are not validated.
+
+ Operations are grouped by project and sent per project on commit.
+
+ Args:
+ con (Optional[ServerAPI]): Connection to server. The default
+ connection is used if not passed.
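+
+ Example:
+ >>> # A minimal sketch; the project name is a placeholder.
+ >>> session = OperationsSession()
+ >>> folder = new_folder_entity("sh010", "Shot")
+ >>> op = session.create_entity("my_project", "folder", folder)
+ >>> op = session.update_entity(
+ ... "my_project", "folder", folder["id"], {"label": "Shot 10"}
+ ... )
+ >>> session.commit()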
+ """
+
+ def __init__(self, con=None):
+ if con is None:
+ con = get_server_api_connection()
+ self._con = con
+ self._project_cache = {}
+ self._operations = []
+ self._nested_operations = collections.defaultdict(list)
+
+ @property
+ def con(self):
+ return self._con
+
+ def get_project(self, project_name):
+ if project_name not in self._project_cache:
+ self._project_cache[project_name] = self.con.get_project(
+ project_name)
+ return copy.deepcopy(self._project_cache[project_name])
+
+ def __len__(self):
+ return len(self._operations)
+
+ def add(self, operation):
+ """Add operation to be processed.
+
+ Args:
+ operation (Union[CreateOperation, UpdateOperation,
+ DeleteOperation]): Operation that should be processed.
+ """
+ if not isinstance(
+ operation,
+ (CreateOperation, UpdateOperation, DeleteOperation)
+ ):
+ raise TypeError("Expected Operation object got {}".format(
+ str(type(operation))
+ ))
+
+ self._operations.append(operation)
+
+ def append(self, operation):
+ """Add operation to be processed.
+
+ Args:
+ operation (Union[CreateOperation, UpdateOperation,
+ DeleteOperation]): Operation that should be processed.
+ """
+
+ self.add(operation)
+
+ def extend(self, operations):
+ """Add operations to be processed.
+
+ Args:
+ operations (List[Union[CreateOperation, UpdateOperation,
+ DeleteOperation]]): Operations that should be processed.
+ """
+
+ for operation in operations:
+ self.add(operation)
+
+ def remove(self, operation):
+ """Remove operation."""
+
+ self._operations.remove(operation)
+
+ def clear(self):
+ """Clear all registered operations."""
+
+ self._operations = []
+
+ def to_data(self):
+ return [
+ operation.to_data()
+ for operation in self._operations
+ ]
+
+ def commit(self):
+ """Commit session operations."""
+
+ operations, self._operations = self._operations, []
+ if not operations:
+ return
+
+ operations_by_project = collections.defaultdict(list)
+ for operation in operations:
+ operations_by_project[operation.project_name].append(operation)
+
+ for project_name, operations in operations_by_project.items():
+ operations_body = []
+ for operation in operations:
+ body = operation.to_server_operation()
+ if body is not None:
+ operations_body.append(body)
+
+ self._con.send_batch_operations(
+ project_name, operations_body, can_fail=False
+ )
+
+ def create_entity(self, project_name, entity_type, data, nested_id=None):
+ """Fast access to 'CreateOperation'.
+
+ Args:
+ project_name (str): Project on which the creation happens.
+ entity_type (str): Which entity type will be created.
+ data (Dict[str, Any]): Entity data.
+ nested_id (Optional[str]): Id of the operation that triggered this
+ one. Operations can trigger suboperations, but those must be
+ added to the operations list after their parent is added.
+
+ Returns:
+ CreateOperation: Object of create operation.
+ """
+
+ operation = CreateOperation(
+ project_name, entity_type, data, self
+ )
+
+ if nested_id:
+ self._nested_operations[nested_id].append(operation)
+ else:
+ self.add(operation)
+ if operation.id in self._nested_operations:
+ self.extend(self._nested_operations.pop(operation.id))
+
+ return operation
+
+ def update_entity(
+ self, project_name, entity_type, entity_id, update_data, nested_id=None
+ ):
+ """Fast access to 'UpdateOperation'.
+
+ Returns:
+ UpdateOperation: Object of update operation.
+ """
+
+ operation = UpdateOperation(
+ project_name, entity_type, entity_id, update_data, self
+ )
+ if nested_id:
+ self._nested_operations[nested_id].append(operation)
+ else:
+ self.add(operation)
+ if operation.id in self._nested_operations:
+ self.extend(self._nested_operations.pop(operation.id))
+ return operation
+
+ def delete_entity(
+ self, project_name, entity_type, entity_id, nested_id=None
+ ):
+ """Fast access to 'DeleteOperation'.
+
+ Returns:
+ DeleteOperation: Object of delete operation.
+ """
+
+ operation = DeleteOperation(
+ project_name, entity_type, entity_id, self
+ )
+ if nested_id:
+ self._nested_operations[nested_id].append(operation)
+ else:
+ self.add(operation)
+ if operation.id in self._nested_operations:
+ self.extend(self._nested_operations.pop(operation.id))
+ return operation
diff --git a/openpype/vendor/python/common/ayon_api/server_api.py b/openpype/vendor/python/common/ayon_api/server_api.py
new file mode 100644
index 0000000000..c886fed976
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/server_api.py
@@ -0,0 +1,5739 @@
+import os
+import re
+import io
+import json
+import logging
+import collections
+import datetime
+import platform
+import copy
+import uuid
+from contextlib import contextmanager
+try:
+ from http import HTTPStatus
+except ImportError:
+ HTTPStatus = None
+
+import requests
+from requests.exceptions import JSONDecodeError as RequestsJSONDecodeError
+
+from .constants import (
+ DEFAULT_PRODUCT_TYPE_FIELDS,
+ DEFAULT_PROJECT_FIELDS,
+ DEFAULT_FOLDER_FIELDS,
+ DEFAULT_TASK_FIELDS,
+ DEFAULT_PRODUCT_FIELDS,
+ DEFAULT_VERSION_FIELDS,
+ DEFAULT_REPRESENTATION_FIELDS,
+ REPRESENTATION_FILES_FIELDS,
+ DEFAULT_WORKFILE_INFO_FIELDS,
+ DEFAULT_EVENT_FIELDS,
+)
+from .thumbnails import ThumbnailCache
+from .graphql import GraphQlQuery, INTROSPECTION_QUERY
+from .graphql_queries import (
+ project_graphql_query,
+ projects_graphql_query,
+ project_product_types_query,
+ product_types_query,
+ folders_graphql_query,
+ tasks_graphql_query,
+ products_graphql_query,
+ versions_graphql_query,
+ representations_graphql_query,
+ representations_parents_qraphql_query,
+ workfiles_info_graphql_query,
+ events_graphql_query,
+)
+from .exceptions import (
+ FailedOperations,
+ UnauthorizedError,
+ AuthenticationError,
+ ServerNotReached,
+ ServerError,
+ HTTPRequestError,
+)
+from .utils import (
+ RepresentationParents,
+ prepare_query_string,
+ logout_from_server,
+ create_entity_id,
+ entity_data_json_default,
+ failed_json_default,
+ TransferProgress,
+ create_dependency_package_basename,
+)
+
+PatternType = type(re.compile(""))
+JSONDecodeError = getattr(json, "JSONDecodeError", ValueError)
+# This should be collected from server schema
+PROJECT_NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_"
+PROJECT_NAME_REGEX = re.compile(
+ "^[{}]+$".format(PROJECT_NAME_ALLOWED_SYMBOLS)
+)
+
+VERSION_REGEX = re.compile(
+ r"(?P<major>0|[1-9]\d*)"
+ r"\.(?P<minor>0|[1-9]\d*)"
+ r"\.(?P<patch>0|[1-9]\d*)"
+ r"(?:-(?P<prerelease>[a-zA-Z\d\-.]*))?"
+ r"(?:\+(?P<buildmetadata>[a-zA-Z\d\-.]*))?"
+)
+
+
+def _get_description(response):
+ if HTTPStatus is None:
+ return str(response.orig_response)
+ return HTTPStatus(response.status).description
+
+
+class RequestType:
+ def __init__(self, name):
+ self.name = name
+
+ def __hash__(self):
+ return self.name.__hash__()
+
+
+class RequestTypes:
+ get = RequestType("GET")
+ post = RequestType("POST")
+ put = RequestType("PUT")
+ patch = RequestType("PATCH")
+ delete = RequestType("DELETE")
+
+
+class RestApiResponse(object):
+ """API Response."""
+
+ def __init__(self, response, data=None):
+ if response is None:
+ status_code = 500
+ else:
+ status_code = response.status_code
+ self._response = response
+ self.status = status_code
+ self._data = data
+
+ @property
+ def text(self):
+ return self._response.text
+
+ @property
+ def orig_response(self):
+ return self._response
+
+ @property
+ def headers(self):
+ return self._response.headers
+
+ @property
+ def data(self):
+ if self._data is None:
+ try:
+ self._data = self.orig_response.json()
+ except RequestsJSONDecodeError:
+ self._data = {}
+ return self._data
+
+ @property
+ def content(self):
+ return self._response.content
+
+ @property
+ def content_type(self):
+ return self.headers.get("Content-Type")
+
+ @property
+ def detail(self):
+ detail = self.get("detail")
+ if detail:
+ return detail
+ return _get_description(self)
+
+ @property
+ def status_code(self):
+ return self.status
+
+ def raise_for_status(self, message=None):
+ try:
+ self._response.raise_for_status()
+ except requests.exceptions.HTTPError as exc:
+ if message is None:
+ message = str(exc)
+ raise HTTPRequestError(message, exc.response)
+
+ def __enter__(self, *args, **kwargs):
+ return self._response.__enter__(*args, **kwargs)
+
+ def __contains__(self, key):
+ return key in self.data
+
+ def __repr__(self):
+ return "<{} [{}]>".format(self.__class__.__name__, self.status)
+
+ def __len__(self):
+ return 200 <= self.status < 400
+
+ def __getitem__(self, key):
+ return self.data[key]
+
+ def get(self, key, default=None):
+ data = self.data
+ if isinstance(data, dict):
+ return self.data.get(key, default)
+ return default
+
+
+class GraphQlResponse:
+ def __init__(self, data):
+ self.data = data
+ self.errors = data.get("errors")
+
+ def __len__(self):
+ if self.errors:
+ return 0
+ return 1
+
+ def __repr__(self):
+ if self.errors:
+ return "<{} errors={}>".format(
+ self.__class__.__name__, self.errors[0]['message']
+ )
+ return "<{}>".format(self.__class__.__name__)
+
+
+def fill_own_attribs(entity):
+ if not entity or not entity.get("attrib"):
+ return
+
+ attributes = set(entity["ownAttrib"])
+
+ own_attrib = {}
+ entity["ownAttrib"] = own_attrib
+
+ for key, value in entity["attrib"].items():
+ if key not in attributes:
+ own_attrib[key] = None
+ else:
+ own_attrib[key] = copy.deepcopy(value)
+
+
+class _AsUserStack:
+ """Handle stack of users used over server api connection in service mode.
+
+ ServerAPI can behave as other users if it is using special API key.
+
+ Examples:
+ >>> stack = _AsUserStack()
+ >>> stack.set_default_username("DefaultName")
+ >>> print(stack.username)
+ DefaultName
+ >>> with stack.as_user("Other1"):
+ ... print(stack.username)
+ ... with stack.as_user("Other2"):
+ ... print(stack.username)
+ ... print(stack.username)
+ ... stack.clear()
+ ... print(stack.username)
+ Other1
+ Other2
+ Other1
+ None
+ >>> print(stack.username)
+ None
+ >>> stack.set_default_username("DefaultName")
+ >>> print(stack.username)
+ DefaultName
+ """
+
+ def __init__(self):
+ self._users_by_id = {}
+ self._user_ids = []
+ self._last_user = None
+ self._default_user = None
+
+ def clear(self):
+ self._users_by_id = {}
+ self._user_ids = []
+ self._last_user = None
+ self._default_user = None
+
+ @property
+ def username(self):
+ # Use '_user_ids' for the boolean check to keep the ability to
+ # "unset" the default user
+ if self._user_ids:
+ return self._last_user
+ return self._default_user
+
+ def get_default_username(self):
+ return self._default_user
+
+ def set_default_username(self, username=None):
+ self._default_user = username
+
+ default_username = property(get_default_username, set_default_username)
+
+ @contextmanager
+ def as_user(self, username):
+ self._last_user = username
+ user_id = uuid.uuid4().hex
+ self._user_ids.append(user_id)
+ self._users_by_id[user_id] = username
+ try:
+ yield
+ finally:
+ self._users_by_id.pop(user_id, None)
+ if not self._user_ids:
+ return
+
+ # First check if is the user id the last one
+ was_last = self._user_ids[-1] == user_id
+ # Remove id from variables
+ if user_id in self._user_ids:
+ self._user_ids.remove(user_id)
+
+ if not was_last:
+ return
+
+ new_last_user = None
+ if self._user_ids:
+ new_last_user = self._users_by_id.get(self._user_ids[-1])
+ self._last_user = new_last_user
+
+
+class ServerAPI(object):
+ """Base handler of connection to server.
+
+ Requires a url to the server which is used as a base for api and graphql
+ calls.
+
+ Logging in causes a session to be used for the requests.
+
+ Args:
+ base_url (str): Example: http://localhost:5000
+ token (Optional[str]): Access token (api key) to server.
+ site_id (Optional[str]): Unique name of site. Should be the same when
+ connection is created from the same machine under same user.
+ client_version (Optional[str]): Version of client application (used in
+ desktop client application).
+ default_settings_variant (Optional[Literal["production", "staging"]]):
+ Settings variant used by default if a settings method does not
+ receive one explicitly (defaults to 'production').
+ ssl_verify (Union[bool, str, None]): Verify SSL certificate.
+ Looks for env variable value 'AYON_CA_FILE' by default. If not
+ available then 'True' is used.
+ cert (Optional[str]): Path to certificate file. Looks for env
+ variable value 'AYON_CERT_FILE' by default.
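+
+ Example:
+ >>> # A minimal sketch; the url and credentials are placeholders.
+ >>> con = ServerAPI("http://localhost:5000")
+ >>> con.login("username", "password")
+ >>> version = con.get_server_version()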
+ """
+
+ def __init__(
+ self,
+ base_url,
+ token=None,
+ site_id=None,
+ client_version=None,
+ default_settings_variant=None,
+ ssl_verify=None,
+ cert=None,
+ ):
+ if not base_url:
+ raise ValueError("Invalid server URL {}".format(str(base_url)))
+
+ base_url = base_url.rstrip("/")
+ self._base_url = base_url
+ self._rest_url = "{}/api".format(base_url)
+ self._graphql_url = "{}/graphql".format(base_url)
+ self._log = None
+ self._access_token = token
+ self._site_id = site_id
+ self._client_version = client_version
+ self._default_settings_variant = (
+ default_settings_variant
+ or "production"
+ )
+
+ if ssl_verify is None:
+ # Custom AYON env variable for CA file or 'True'
+ # - that should cover most default behaviors in 'requests'
+ # with 'certifi'
+ ssl_verify = os.environ.get("AYON_CA_FILE") or True
+
+ if cert is None:
+ cert = os.environ.get("AYON_CERT_FILE")
+
+ self._ssl_verify = ssl_verify
+ self._cert = cert
+
+ self._access_token_is_service = None
+ self._token_is_valid = None
+ self._server_available = None
+ self._server_version = None
+ self._server_version_tuple = None
+
+ self._session = None
+
+ self._base_functions_mapping = {
+ RequestTypes.get: requests.get,
+ RequestTypes.post: requests.post,
+ RequestTypes.put: requests.put,
+ RequestTypes.patch: requests.patch,
+ RequestTypes.delete: requests.delete
+ }
+ self._session_functions_mapping = {}
+
+ # Attributes cache
+ self._attributes_schema = None
+ self._entity_type_attributes_cache = {}
+
+ self._as_user_stack = _AsUserStack()
+ self._thumbnail_cache = ThumbnailCache(True)
+
+ @property
+ def log(self):
+ if self._log is None:
+ self._log = logging.getLogger(self.__class__.__name__)
+ return self._log
+
+ def get_base_url(self):
+ return self._base_url
+
+ def get_rest_url(self):
+ return self._rest_url
+
+ base_url = property(get_base_url)
+ rest_url = property(get_rest_url)
+
+ def get_ssl_verify(self):
+ """Enable ssl verification.
+
+ Returns:
+ bool: Current state of ssl verification.
+ """
+
+ return self._ssl_verify
+
+ def set_ssl_verify(self, ssl_verify):
+ """Change ssl verification state.
+
+ Args:
+ ssl_verify (Union[bool, str, None]): Enabled/disable
+ ssl verification, can be a path to file.
+ """
+
+ if self._ssl_verify == ssl_verify:
+ return
+ self._ssl_verify = ssl_verify
+ if self._session is not None:
+ self._session.verify = ssl_verify
+
+ def get_cert(self):
+ """Current cert file used for connection to server.
+
+ Returns:
+ Union[str, None]: Path to cert file.
+ """
+
+ return self._cert
+
+ def set_cert(self, cert):
+ """Change cert file used for connection to server.
+
+ Args:
+ cert (Union[str, None]): Path to cert file.
+ """
+
+ if cert == self._cert:
+ return
+ self._cert = cert
+ if self._session is not None:
+ self._session.cert = cert
+
+ ssl_verify = property(get_ssl_verify, set_ssl_verify)
+ cert = property(get_cert, set_cert)
+
+ @property
+ def access_token(self):
+ """Access token used for authorization to server.
+
+ Returns:
+ Union[str, None]: Token string or None if not authorized yet.
+ """
+
+ return self._access_token
+
+ def get_site_id(self):
+ """Site id used for connection.
+
+ Site id tells the server from which machine/site the connection was
+ created and is used for default site overrides when settings are received.
+
+ Returns:
+ Union[str, None]: Site id value or None if not filled.
+ """
+
+ return self._site_id
+
+ def set_site_id(self, site_id):
+ """Change site id of connection.
+
+ Behaves as a specific site for the server. It affects the default
+ behavior of settings getter methods.
+
+ Args:
+ site_id (Union[str, None]): Site id value, or 'None' to unset.
+ """
+
+ if self._site_id == site_id:
+ return
+ self._site_id = site_id
+ # Update session headers on site id change
+ self._update_session_headers()
+
+ site_id = property(get_site_id, set_site_id)
+
+ def get_client_version(self):
+ """Version of client used to connect to server.
+
+ Client version is the version of the AYON desktop client application.
+
+ Returns:
+ str: Client version string used in connection.
+ """
+
+ return self._client_version
+
+ def set_client_version(self, client_version):
+ """Set version of client used to connect to server.
+
+ Client version is the version of the AYON desktop client application.
+
+ Args:
+ client_version (Union[str, None]): Client version string.
+ """
+
+ if self._client_version == client_version:
+ return
+
+ self._client_version = client_version
+ self._update_session_headers()
+
+ client_version = property(get_client_version, set_client_version)
+
+ def get_default_settings_variant(self):
+ """Default variant used for settings.
+
+ Returns:
+ Union[str, None]: Name of variant or None.
+ """
+
+ return self._default_settings_variant
+
+ def set_default_settings_variant(self, variant):
+ """Change default variant for addon settings.
+
+ Note:
+ It is recommended to set only 'production' or 'staging' variants
+ as default variant.
+
+ Args:
+ variant (Literal['production', 'staging']): Settings variant name.
+ """
+
+ if variant not in ("production", "staging"):
+ raise ValueError((
+ "Invalid variant name {}. Expected 'production' or 'staging'"
+ ).format(variant))
+ self._default_settings_variant = variant
+
+ default_settings_variant = property(
+ get_default_settings_variant,
+ set_default_settings_variant
+ )
+
+ def get_default_service_username(self):
+ """Default username used for callbacks when used with service API key.
+
+ Returns:
+ Union[str, None]: Username if any was filled.
+ """
+
+ return self._as_user_stack.get_default_username()
+
+ def set_default_service_username(self, username=None):
+ """Service API will work as other user.
+
+ Service API keys can work as another user. This can be done temporarily
+ using the 'as_user' context manager, or a default username can be set
+ for cases when the 'as_user' context manager is not entered.
+
+ Args:
+ username (Optional[str]): Username to work as when using
+ a service API key.
+
+ Raises:
+ ValueError: When connection is not yet authenticated or api key
+ is not service token.
+ """
+
+ current_username = self._as_user_stack.get_default_username()
+ if current_username == username:
+ return
+
+ if not self.has_valid_token:
+ raise ValueError(
+ "Authentication of connection did not happen yet."
+ )
+
+ if not self._access_token_is_service:
+ raise ValueError(
+ "Can't set service username. API key is not a service token."
+ )
+
+ self._as_user_stack.set_default_username(username)
+ if self._as_user_stack.username == username:
+ self._update_session_headers()
+
+ @contextmanager
+ def as_username(self, username):
+ """Service API will temporarily work as other user.
+
+ This method can be used only if service API key is logged in.
+
+ Args:
+ username (Union[str, None]): Username to work as when using
+ a service API key.
+
+ Raises:
+ ValueError: When connection is not yet authenticated or api key
+ is not service token.
+ """
+
+ if not self.has_valid_token:
+ raise ValueError(
+ "Authentication of connection did not happen yet."
+ )
+
+ if not self._access_token_is_service:
+ raise ValueError(
+ "Can't set service username. API key is not a service token."
+ )
+
+ with self._as_user_stack.as_user(username) as o:
+ self._update_session_headers()
+ try:
+ yield o
+ finally:
+ self._update_session_headers()
+
+ @property
+ def is_server_available(self):
+ if self._server_available is None:
+ response = requests.get(
+ self._base_url,
+ cert=self._cert,
+ verify=self._ssl_verify
+ )
+ self._server_available = response.status_code == 200
+ return self._server_available
+
+ @property
+ def has_valid_token(self):
+ if self._access_token is None:
+ return False
+
+ if self._token_is_valid is None:
+ self.validate_token()
+ return self._token_is_valid
+
+ def validate_server_availability(self):
+ if not self.is_server_available:
+ raise ServerNotReached("Server \"{}\" can't be reached".format(
+ self._base_url
+ ))
+
+ def validate_token(self):
+ try:
+ # TODO add other possible validations
+ # - existence of 'user' key in info
+ # - validate that 'site_id' is in 'sites' in info
+ self.get_info()
+ self.get_user()
+ self._token_is_valid = True
+
+ except UnauthorizedError:
+ self._token_is_valid = False
+ return self._token_is_valid
+
+ def set_token(self, token):
+ self.reset_token()
+ self._access_token = token
+ self.get_user()
+
+ def reset_token(self):
+ self._access_token = None
+ self._token_is_valid = None
+ self.close_session()
+
+ def create_session(self):
+ if self._session is not None:
+ raise ValueError("Session is already created.")
+
+ self._as_user_stack.clear()
+ # Validate token before session creation
+ self.validate_token()
+
+ session = requests.Session()
+ session.cert = self._cert
+ session.verify = self._ssl_verify
+ session.headers.update(self.get_headers())
+
+ self._session_functions_mapping = {
+ RequestTypes.get: session.get,
+ RequestTypes.post: session.post,
+ RequestTypes.put: session.put,
+ RequestTypes.patch: session.patch,
+ RequestTypes.delete: session.delete
+ }
+ self._session = session
+
+ def close_session(self):
+ if self._session is None:
+ return
+
+ session = self._session
+ self._session = None
+ self._session_functions_mapping = {}
+ session.close()
+
+ def _update_session_headers(self):
+ if self._session is None:
+ return
+
+ # Header keys that may change over time
+ for key, value in (
+ ("X-as-user", self._as_user_stack.username),
+ ("x-ayon-version", self._client_version),
+ ("x-ayon-site-id", self._site_id),
+ ):
+ if value is not None:
+ self._session.headers[key] = value
+ elif key in self._session.headers:
+ self._session.headers.pop(key)
+
+ def get_info(self):
+ """Get information about current used api key.
+
+ By default, the 'info' contains only 'uptime' and 'version'. With a
+ logged-in user the info also contains information about the user and
+ the machines on which the user is logged in.
+
+ Todos:
+ Use this method for validation of token instead of 'get_user'.
+
+ Returns:
+ dict[str, Any]: Information from server.
+ """
+
+ response = self.get("info")
+ return response.data
+
+ def get_server_version(self):
+ """Get server version.
+
+ Version should match semantic version (https://semver.org/).
+
+ Returns:
+ str: Server version.
+ """
+
+ if self._server_version is None:
+ self._server_version = self.get_info()["version"]
+ return self._server_version
+
+ def get_server_version_tuple(self):
+ """Get server version as tuple.
+
+ Version should match semantic version (https://semver.org/).
+
+ The first three parts of the version are returned as integers and the
+ prerelease and build metadata parts as strings or None.
+
+ Returns:
+ Tuple[int, int, int, Union[str, None], Union[str, None]]: Server
+ version.
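+
+ Example:
+ >>> # A minimal sketch; the value depends on the server version.
+ >>> con.get_server_version_tuple()
+ (1, 0, 0, None, None)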
+ """
+
+ if self._server_version_tuple is None:
+ re_match = VERSION_REGEX.fullmatch(
+ self.get_server_version())
+ self._server_version_tuple = (
+ int(re_match.group("major")),
+ int(re_match.group("minor")),
+ int(re_match.group("patch")),
+ re_match.group("prerelease"),
+ re_match.group("buildmetadata")
+ )
+ return self._server_version_tuple
+
+ server_version = property(get_server_version)
+ server_version_tuple = property(get_server_version_tuple)
+
+ def _get_user_info(self):
+ if self._access_token is None:
+ return None
+
+ if self._access_token_is_service is not None:
+ response = self.get("users/me")
+ return response.data
+
+ self._access_token_is_service = False
+ response = self.get("users/me")
+ if response.status == 200:
+ return response.data
+
+ self._access_token_is_service = True
+ response = self.get("users/me")
+ if response.status == 200:
+ return response.data
+
+ self._access_token_is_service = None
+ return None
+
+ def get_users(self):
+ # TODO how to find out if user has permission?
+ users = self.get("users")
+ return users.data
+
+ def get_user(self, username=None):
+ output = None
+ if username is None:
+ output = self._get_user_info()
+ else:
+ response = self.get("users/{}".format(username))
+ if response.status == 200:
+ output = response.data
+
+ if output is None:
+ raise UnauthorizedError("User is not authorized.")
+ return output
+
+ def get_headers(self, content_type=None):
+ if content_type is None:
+ content_type = "application/json"
+
+ headers = {
+ "Content-Type": content_type,
+ "x-ayon-platform": platform.system().lower(),
+ "x-ayon-hostname": platform.node(),
+ }
+ if self._site_id is not None:
+ headers["x-ayon-site-id"] = self._site_id
+
+ if self._client_version is not None:
+ headers["x-ayon-version"] = self._client_version
+
+ if self._access_token:
+ if self._access_token_is_service:
+ headers["X-Api-Key"] = self._access_token
+ username = self._as_user_stack.username
+ if username:
+ headers["X-as-user"] = username
+ else:
+ headers["Authorization"] = "Bearer {}".format(
+ self._access_token)
+ return headers
+
+ def login(self, username, password):
+ if self.has_valid_token:
+ try:
+ user_info = self.get_user()
+ except UnauthorizedError:
+ user_info = {}
+
+ current_username = user_info.get("name")
+ if current_username == username:
+ self.close_session()
+ self.create_session()
+ return
+
+ self.reset_token()
+
+ self.validate_server_availability()
+
+ response = self.post(
+ "auth/login",
+ name=username,
+ password=password
+ )
+ if response.status_code != 200:
+ _detail = response.data.get("detail")
+ details = ""
+ if _detail:
+ details = " {}".format(_detail)
+
+ raise AuthenticationError("Login failed {}".format(details))
+
+ self._access_token = response["token"]
+
+ if not self.has_valid_token:
+ raise AuthenticationError("Invalid credentials")
+ self.create_session()
+
+ def logout(self, soft=False):
+ if self._access_token:
+ if not soft:
+ self._logout()
+ self.reset_token()
+
+ def _logout(self):
+ logout_from_server(self._base_url, self._access_token)
+
+ def _do_rest_request(self, function, url, **kwargs):
+ if self._session is None:
+ if "headers" not in kwargs:
+ kwargs["headers"] = self.get_headers()
+
+ if isinstance(function, RequestType):
+ function = self._base_functions_mapping[function]
+
+ elif isinstance(function, RequestType):
+ function = self._session_functions_mapping[function]
+
+ try:
+ response = function(url, **kwargs)
+
+ except ConnectionRefusedError:
+ new_response = RestApiResponse(
+ None,
+ {"detail": "Unable to connect the server. Connection refused"}
+ )
+ except requests.exceptions.ConnectionError:
+ new_response = RestApiResponse(
+ None,
+ {"detail": "Unable to connect the server. Connection error"}
+ )
+ else:
+ content_type = response.headers.get("Content-Type")
+ if content_type == "application/json":
+ try:
+ new_response = RestApiResponse(response)
+ except JSONDecodeError:
+ new_response = RestApiResponse(
+ None,
+ {
+ "detail": "The response is not a JSON: {}".format(
+ response.text)
+ }
+ )
+
+ elif content_type in ("image/jpeg", "image/png"):
+ new_response = RestApiResponse(response)
+
+ else:
+ new_response = RestApiResponse(response)
+
+ self.log.debug("Response {}".format(str(new_response)))
+ return new_response
+
+ def raw_post(self, entrypoint, **kwargs):
+ entrypoint = entrypoint.lstrip("/").rstrip("/")
+ self.log.debug("Executing [POST] {}".format(entrypoint))
+ url = "{}/{}".format(self._rest_url, entrypoint)
+ return self._do_rest_request(
+ RequestTypes.post,
+ url,
+ **kwargs
+ )
+
+ def raw_put(self, entrypoint, **kwargs):
+ entrypoint = entrypoint.lstrip("/").rstrip("/")
+ self.log.debug("Executing [PUT] {}".format(entrypoint))
+ url = "{}/{}".format(self._rest_url, entrypoint)
+ return self._do_rest_request(
+ RequestTypes.put,
+ url,
+ **kwargs
+ )
+
+ def raw_patch(self, entrypoint, **kwargs):
+ entrypoint = entrypoint.lstrip("/").rstrip("/")
+ self.log.debug("Executing [PATCH] {}".format(entrypoint))
+ url = "{}/{}".format(self._rest_url, entrypoint)
+ return self._do_rest_request(
+ RequestTypes.patch,
+ url,
+ **kwargs
+ )
+
+ def raw_get(self, entrypoint, **kwargs):
+ entrypoint = entrypoint.lstrip("/").rstrip("/")
+ self.log.debug("Executing [GET] {}".format(entrypoint))
+ url = "{}/{}".format(self._rest_url, entrypoint)
+ return self._do_rest_request(
+ RequestTypes.get,
+ url,
+ **kwargs
+ )
+
+ def raw_delete(self, entrypoint, **kwargs):
+ entrypoint = entrypoint.lstrip("/").rstrip("/")
+ self.log.debug("Executing [DELETE] {}".format(entrypoint))
+ url = "{}/{}".format(self._rest_url, entrypoint)
+ return self._do_rest_request(
+ RequestTypes.delete,
+ url,
+ **kwargs
+ )
+
+ def post(self, entrypoint, **kwargs):
+ return self.raw_post(entrypoint, json=kwargs)
+
+ def put(self, entrypoint, **kwargs):
+ return self.raw_put(entrypoint, json=kwargs)
+
+ def patch(self, entrypoint, **kwargs):
+ return self.raw_patch(entrypoint, json=kwargs)
+
+ def get(self, entrypoint, **kwargs):
+ return self.raw_get(entrypoint, params=kwargs)
+
+ def delete(self, entrypoint, **kwargs):
+ return self.raw_delete(entrypoint, params=kwargs)
+
+ def get_event(self, event_id):
+ """Query full event data by id.
+
+ Events received via the event server do not contain full information.
+ To get the full event information it is necessary to query it
+ explicitly.
+
+ Args:
+ event_id (str): Id of event.
+
+ Returns:
+ dict[str, Any]: Full event data.
+ """
+
+ response = self.get("events/{}".format(event_id))
+ response.raise_for_status()
+ return response.data
+
+ def get_events(
+ self,
+ topics=None,
+ project_names=None,
+ states=None,
+ users=None,
+ include_logs=None,
+ fields=None
+ ):
+ """Get events from server with filtering options.
+
+ Notes:
+ Not all events happen on a project.
+
+ Args:
+ topics (Optional[Iterable[str]]): Name of topics.
+ project_names (Optional[Iterable[str]]): Project on which
+ event happened.
+ states (Optional[Iterable[str]]): Filtering by states.
+ users (Optional[Iterable[str]]): Filtering by users
+ who created/triggered an event.
+ include_logs (Optional[bool]): Also query log events.
+ fields (Optional[Iterable[str]]): Fields that should be received
+ for each event.
+
+ Returns:
+ Generator[dict[str, Any]]: Available events matching filters.
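+
+ Example:
+ >>> # A minimal sketch; the topic and project name are placeholders.
+ >>> for event in con.get_events(
+ ... topics={"entity.folder.created"},
+ ... project_names={"my_project"}
+ ... ):
+ ... print(event)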
+ """
+
+ filters = {}
+ if topics is not None:
+ topics = set(topics)
+ if not topics:
+ return
+ filters["eventTopics"] = list(topics)
+
+ if project_names is not None:
+ project_names = set(project_names)
+ if not project_names:
+ return
+ filters["projectNames"] = list(project_names)
+
+ if states is not None:
+ states = set(states)
+ if not states:
+ return
+ filters["eventStates"] = list(states)
+
+ if users is not None:
+ users = set(users)
+ if not users:
+ return
+ filters["eventUsers"] = list(users)
+
+ if include_logs is None:
+ include_logs = False
+ filters["includeLogsFilter"] = include_logs
+
+ if not fields:
+ fields = DEFAULT_EVENT_FIELDS
+
+ query = events_graphql_query(set(fields))
+ for attr, filter_value in filters.items():
+ query.set_variable_value(attr, filter_value)
+
+ for parsed_data in query.continuous_query(self):
+ for event in parsed_data["events"]:
+ yield event
+
+ def update_event(
+ self,
+ event_id,
+ sender=None,
+ project_name=None,
+ status=None,
+ description=None,
+ summary=None,
+ payload=None
+ ):
+ kwargs = {
+ key: value
+ for key, value in (
+ ("sender", sender),
+ ("project", project_name),
+ ("status", status),
+ ("description", description),
+ ("summary", summary),
+ ("payload", payload),
+ )
+ if value is not None
+ }
+ response = self.patch(
+ "events/{}".format(event_id),
+ **kwargs
+ )
+ response.raise_for_status()
+
+ def dispatch_event(
+ self,
+ topic,
+ sender=None,
+ event_hash=None,
+ project_name=None,
+ username=None,
+ dependencies=None,
+ description=None,
+ summary=None,
+ payload=None,
+ finished=True,
+ store=True,
+ ):
+ """Dispatch event to server.
+
+ Args:
+ topic (str): Event topic used for filtering of listeners.
+ sender (Optional[str]): Sender of event.
+ event_hash (Optional[str]): Event hash.
+ project_name (Optional[str]): Project name.
+ username (Optional[str]): Username which triggered event.
+ dependencies (Optional[list[str]]): List of event id dependencies.
+ description (Optional[str]): Description of event.
+ summary (Optional[dict[str, Any]]): Summary of event that can be used
+ for simple filtering on listeners.
+ payload (Optional[dict[str, Any]]): Full payload of event data with
+ all details.
+ finished (Optional[bool]): Mark event as finished on dispatch.
+ store (Optional[bool]): Store event in event queue for possible
+ future processing, otherwise the event is sent only
+ to active listeners.
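+
+ Example:
+ >>> # A minimal sketch; the topic, sender and payload
+ >>> # are placeholders.
+ >>> dispatched = con.dispatch_event(
+ ... "myservice.something.happened",
+ ... sender="my-service",
+ ... description="Something happened",
+ ... payload={"key": "value"}
+ ... )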
+ """
+
+ if summary is None:
+ summary = {}
+ if payload is None:
+ payload = {}
+ event_data = {
+ "topic": topic,
+ "sender": sender,
+ "hash": event_hash,
+ "project": project_name,
+ "user": username,
+ "dependencies": dependencies,
+ "description": description,
+ "summary": summary,
+ "payload": payload,
+ "finished": finished,
+ "store": store,
+ }
+ if self.post("events", **event_data):
+ self.log.debug("Dispatched event {}".format(topic))
+ return True
+ self.log.error("Unable to dispatch event {}".format(topic))
+ return False
+
+ def enroll_event_job(
+ self,
+ source_topic,
+ target_topic,
+ sender,
+ description=None,
+ sequential=None
+ ):
+ """Enroll job based on events.
+
+ Enroll will find the first unprocessed event with 'source_topic',
+ create a new event with 'target_topic' for it and return the new
+ event data.
+
+ Use 'sequential' to ensure that only a single target event is created
+ at a time. When set to 'True', creation of new target events is blocked
+ while there is at least one unfinished event with the target topic.
+ This helps when the order of events matters and more than one process
+ using the same target is running at the same time.
+ - Make sure the new event's status is updated to 'finished'
+ when you're done with the logic
+
+ Target topic should not clash with other processes/services.
+
+ The created target event has a 'dependsOn' key containing the id of
+ the source event.
+
+ Use-case:
+ - Service 1 is creating events with topic 'my.leech'
+ - Service 2 processes 'my.leech' and uses target topic 'my.process'
+ - this service can run on 1-n machines
+ - all events must be processed in a sequence by their creation
+ time and only one event can be processed at a time
+ - in this case 'sequential' should be set to 'True' so only
+ one machine is actually processing events, but if one goes
+ down there are others that can take its place
+ - Service 3 processes 'my.leech' and uses target topic 'my.discover'
+ - this service can run on 1-n machines
+ - order of events is not important
+ - 'sequential' should be 'False'
+
+ Args:
+ source_topic (str): Source topic to enroll.
+ target_topic (str): Topic of dependent event.
+ sender (str): Identifier of sender (e.g. service name or username).
+ description (Optional[str]): Human readable text shown
+ in target event.
+ sequential (Optional[bool]): The source topic must be processed
+ in sequence.
+
+ Returns:
+ Union[None, dict[str, Any]]: None if there is no event matching
+ the filters, otherwise the created event with 'target_topic'.
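+
+ Example:
+ >>> # A minimal sketch; topics and sender are placeholders and the
+ >>> # returned event is assumed to contain its id.
+ >>> job = con.enroll_event_job(
+ ... "my.leech",
+ ... "my.process",
+ ... "my-service",
+ ... sequential=True
+ ... )
+ >>> if job is not None:
+ ... con.update_event(job["id"], status="finished")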
+ """
+
+ kwargs = {
+ "sourceTopic": source_topic,
+ "targetTopic": target_topic,
+ "sender": sender,
+ }
+ if sequential is not None:
+ kwargs["sequential"] = sequential
+ if description is not None:
+ kwargs["description"] = description
+ response = self.post("enroll", **kwargs)
+ if response.status_code == 204:
+ return None
+ elif response.status_code >= 400:
+ self.log.error(response.text)
+ return None
+
+ return response.data
+
+ def _download_file(self, url, filepath, chunk_size, progress):
+ dst_directory = os.path.dirname(filepath)
+ if not os.path.exists(dst_directory):
+ os.makedirs(dst_directory)
+
+ kwargs = {"stream": True}
+ if self._session is None:
+ kwargs["headers"] = self.get_headers()
+ get_func = self._base_functions_mapping[RequestTypes.get]
+ else:
+ get_func = self._session_functions_mapping[RequestTypes.get]
+
+ with open(filepath, "wb") as f_stream:
+ with get_func(url, **kwargs) as response:
+ response.raise_for_status()
+ progress.set_content_size(response.headers["Content-length"])
+ for chunk in response.iter_content(chunk_size=chunk_size):
+ f_stream.write(chunk)
+ progress.add_transferred_chunk(len(chunk))
+
+ def download_file(
+ self, endpoint, filepath, chunk_size=None, progress=None
+ ):
+ """Download file from AYON server.
+
+ Endpoint can be a full url (it must start with the 'base_url' of the
+ api object).
+
+ A progress object can be used to track the download. It is useful when
+ the download happens in a thread and another thread wants to watch
+ the changes over time.
+
+ Args:
+ endpoint (str): Endpoint or URL to file that should be downloaded.
+ filepath (str): Path where file will be downloaded.
+ chunk_size (Optional[int]): Size of chunks that are received
+ in single loop.
+ progress (Optional[TransferProgress]): Object that gives ability
+ to track download progress.
+ """
+
+ if not chunk_size:
+ # 1 MB chunk by default
+ chunk_size = 1024 * 1024
+
+ if endpoint.startswith(self._base_url):
+ url = endpoint
+ else:
+ endpoint = endpoint.lstrip("/").rstrip("/")
+ url = "{}/{}".format(self._rest_url, endpoint)
+
+ # Create dummy object so the function does not have to check
+ # 'progress' variable everywhere
+ if progress is None:
+ progress = TransferProgress()
+
+ progress.set_source_url(url)
+ progress.set_destination_url(filepath)
+ progress.set_started()
+ try:
+ self._download_file(url, filepath, chunk_size, progress)
+
+ except Exception as exc:
+ progress.set_failed(str(exc))
+ raise
+
+ finally:
+ progress.set_transfer_done()
+ return progress
+
+ def _upload_file(self, url, filepath, progress, request_type=None):
+ if request_type is None:
+ request_type = RequestTypes.put
+ kwargs = {}
+ if self._session is None:
+ kwargs["headers"] = self.get_headers()
+ post_func = self._base_functions_mapping[request_type]
+ else:
+ post_func = self._session_functions_mapping[request_type]
+
+ with open(filepath, "rb") as stream:
+ stream.seek(0, io.SEEK_END)
+ size = stream.tell()
+ stream.seek(0)
+ progress.set_content_size(size)
+ response = post_func(url, data=stream, **kwargs)
+ response.raise_for_status()
+ progress.set_transferred_size(size)
+
+ def upload_file(
+ self, endpoint, filepath, progress=None, request_type=None
+ ):
+ """Upload file to server.
+
+ Todos:
+ Uploading with more detailed progress.
+
+ Args:
+ endpoint (str): Endpoint or url where file will be uploaded.
+ filepath (str): Source filepath.
+ progress (Optional[TransferProgress]): Object that gives ability
+ to track upload progress.
+ request_type (Optional[RequestType]): Type of request that will
+ be used to upload file.
+ """
+
+ if endpoint.startswith(self._base_url):
+ url = endpoint
+ else:
+ endpoint = endpoint.lstrip("/").rstrip("/")
+ url = "{}/{}".format(self._rest_url, endpoint)
+
+ # Create dummy object so the function does not have to check
+ # 'progress' variable everywhere
+ if progress is None:
+ progress = TransferProgress()
+
+ progress.set_source_url(filepath)
+ progress.set_destination_url(url)
+ progress.set_started()
+
+ try:
+ self._upload_file(url, filepath, progress, request_type)
+
+ except Exception as exc:
+ progress.set_failed(str(exc))
+ raise
+
+ finally:
+ progress.set_transfer_done()
+
+ def trigger_server_restart(self):
+ """Trigger server restart.
+
+ Restart may be required when a specific value changed on the server.
+ """
+
+ result = self.post("system/restart")
+ if result.status_code != 204:
+ # TODO add better exception
+ raise ValueError("Failed to restart server")
+
+ def query_graphql(self, query, variables=None):
+ """Execute GraphQl query.
+
+ Args:
+ query (str): GraphQl query string.
+ variables (Optional[dict[str, Any]]): Variables that can be
+ used in query.
+
+ Returns:
+ GraphQlResponse: Response from server.
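+
+ Example:
+ >>> # A minimal sketch; the query is a placeholder and must match
+ >>> # the server GraphQl schema.
+ >>> response = con.query_graphql(
+ ... "{ projects { edges { node { name } } } }"
+ ... )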
+ """
+
+ data = {"query": query, "variables": variables or {}}
+ response = self._do_rest_request(
+ RequestTypes.post,
+ self._graphql_url,
+ json=data
+ )
+ response.raise_for_status()
+ return GraphQlResponse(response)
+
+ def get_graphql_schema(self):
+ return self.query_graphql(INTROSPECTION_QUERY).data
+
+ def get_server_schema(self):
+ """Get server schema with info, url paths, components etc.
+
+ Todos:
+ Cache schema - How to find out when it is outdated?
+
+ Returns:
+ dict[str, Any]: Full server schema.
+ """
+
+ url = "{}/openapi.json".format(self._base_url)
+ response = self._do_rest_request(RequestTypes.get, url)
+ if response:
+ return response.data
+ return None
+
+ def get_schemas(self):
+ """Get components schema.
+
+ Component names do not match entity type names, e.g. 'project' is
+ under 'ProjectModel'. We should find out some mapping. Also, there
+ are properties which don't carry information about the referenced
+ object, e.g. 'config' has just an object definition without a
+ reference schema.
+
+ Returns:
+ dict[str, Any]: Component schemas.
+ """
+
+ server_schema = self.get_server_schema()
+ return server_schema["components"]["schemas"]
+
+ def get_attributes_schema(self, use_cache=True):
+ if not use_cache:
+ self.reset_attributes_schema()
+
+ if self._attributes_schema is None:
+ result = self.get("attributes")
+ if result.status_code != 200:
+ raise UnauthorizedError(
+ "User must be authorized to receive attributes"
+ )
+ self._attributes_schema = result.data
+ return copy.deepcopy(self._attributes_schema)
+
+ def reset_attributes_schema(self):
+ self._attributes_schema = None
+ self._entity_type_attributes_cache = {}
+
+ def set_attribute_config(
+ self, attribute_name, data, scope, position=None, builtin=False
+ ):
+ if position is None:
+ attributes = self.get("attributes").data["attributes"]
+ origin_attr = next(
+ (
+ attr for attr in attributes
+ if attr["name"] == attribute_name
+ ),
+ None
+ )
+ if origin_attr:
+ position = origin_attr["position"]
+ else:
+ position = len(attributes)
+
+ response = self.put(
+ "attributes/{}".format(attribute_name),
+ data=data,
+ scope=scope,
+ position=position,
+ builtin=builtin
+ )
+ if response.status_code != 204:
+ # TODO raise different exception
+ raise ValueError(
+ "Attribute \"{}\" was not created/updated. {}".format(
+ attribute_name, response.detail
+ )
+ )
+
+ self.reset_attributes_schema()
+
+ def remove_attribute_config(self, attribute_name):
+ """Remove attribute from server.
+
+ This can't be undone, please use it carefully.
+
+ Args:
+ attribute_name (str): Name of attribute to remove.
+ """
+
+ response = self.delete("attributes/{}".format(attribute_name))
+ response.raise_for_status(
+ "Attribute \"{}\" was not created/updated. {}".format(
+ attribute_name, response.detail
+ )
+ )
+
+ self.reset_attributes_schema()
+
+ def get_attributes_for_type(self, entity_type):
+ """Get attribute schemas available for an entity type.
+
+ ```
+ # Example attribute schema
+ {
+ # Common
+ "type": "integer",
+ "title": "Clip Out",
+ "description": null,
+ "example": 1,
+ "default": 1,
+ # These can be filled based on value of 'type'
+ "gt": null,
+ "ge": null,
+ "lt": null,
+ "le": null,
+ "minLength": null,
+ "maxLength": null,
+ "minItems": null,
+ "maxItems": null,
+ "regex": null,
+ "enum": null
+ }
+ ```
+
+ Args:
+ entity_type (str): Entity type for which attributes should be
+ received.
+
+ Returns:
+ dict[str, dict[str, Any]]: Attribute schemas that are available
+ for the entered entity type.
+ """
+ attributes = self._entity_type_attributes_cache.get(entity_type)
+ if attributes is None:
+ attributes_schema = self.get_attributes_schema()
+ attributes = {}
+ for attr in attributes_schema["attributes"]:
+ if entity_type not in attr["scope"]:
+ continue
+ attr_name = attr["name"]
+ attributes[attr_name] = attr["data"]
+
+ self._entity_type_attributes_cache[entity_type] = attributes
+
+ return copy.deepcopy(attributes)
+
+ def get_default_fields_for_type(self, entity_type):
+ """Default fields for entity type.
+
+ Returns the most commonly used fields queried from server.
+
+ Args:
+ entity_type (str): Name of entity type.
+
+ Returns:
+ set[str]: Fields that should be queried from server.
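+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api'
+ >>> fields = api.get_default_fields_for_type("folder")
+ >>> "attrib.fps" in fields # if 'fps' attribute has folder scope
+ True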
+ """
+
+ attributes = self.get_attributes_for_type(entity_type)
+ if entity_type == "project":
+ return DEFAULT_PROJECT_FIELDS | {
+ "attrib.{}".format(attr)
+ for attr in attributes
+ }
+
+ if entity_type == "folder":
+ return DEFAULT_FOLDER_FIELDS | {
+ "attrib.{}".format(attr)
+ for attr in attributes
+ }
+
+ if entity_type == "task":
+ return DEFAULT_TASK_FIELDS | {
+ "attrib.{}".format(attr)
+ for attr in attributes
+ }
+
+ if entity_type == "product":
+ return DEFAULT_PRODUCT_FIELDS | {
+ "attrib.{}".format(attr)
+ for attr in attributes
+ }
+
+ if entity_type == "version":
+ return DEFAULT_VERSION_FIELDS | {
+ "attrib.{}".format(attr)
+ for attr in attributes
+ }
+
+ if entity_type == "representation":
+ return (
+ DEFAULT_REPRESENTATION_FIELDS
+ | REPRESENTATION_FILES_FIELDS
+ | {
+ "attrib.{}".format(attr)
+ for attr in attributes
+ }
+ )
+
+ if entity_type == "productType":
+ return DEFAULT_PRODUCT_TYPE_FIELDS
+
+ raise ValueError("Unknown entity type \"{}\"".format(entity_type))
+
+ def get_addons_info(self, details=True):
+ """Get information about addons available on server.
+
+ Args:
+ details (Optional[bool]): Detailed data with information on how
+ to get client code.
+
+ Returns:
+ dict[str, Any]: Information about addons.
+ """
+
+ endpoint = "addons"
+ if details:
+ endpoint += "?details=1"
+ response = self.get(endpoint)
+ response.raise_for_status()
+ return response.data
+
+ def get_addon_url(self, addon_name, addon_version, *subpaths):
+ """Calculate url to addon route.
+
+ Example:
+ >>> api = ServerAPI("https://your.url.com")
+ >>> api.get_addon_url(
+ ... "example", "1.0.0", "private", "my.zip")
+ 'https://your.url.com/addons/example/1.0.0/private/my.zip'
+
+ Args:
+ addon_name (str): Name of addon.
+ addon_version (str): Version of addon.
+ subpaths (tuple[str]): Any number of subpaths appended to
+ the addon url.
+
+ Returns:
+ str: Final url.
+ """
+
+ ending = ""
+ if subpaths:
+ ending = "/{}".format("/".join(subpaths))
+ return "{}/addons/{}/{}{}".format(
+ self._base_url,
+ addon_name,
+ addon_version,
+ ending
+ )
+
+ def download_addon_private_file(
+ self,
+ addon_name,
+ addon_version,
+ filename,
+ destination_dir,
+ destination_filename=None,
+ chunk_size=None,
+ progress=None,
+ ):
+ """Download a file from addon private files.
+
+ This method requires an authorized token. Private files are
+ not under the '/api' endpoint.
+
+ Args:
+ addon_name (str): Addon name.
+ addon_version (str): Addon version.
+ filename (str): Filename in private folder on server.
+ destination_dir (str): Where the file should be downloaded.
+ destination_filename (Optional[str]): Name of destination
+ filename. Source filename is used if not passed.
+ chunk_size (Optional[int]): Download chunk size.
+ progress (Optional[TransferProgress]): Object that gives ability
+ to track download progress.
+
+ Returns:
+ str: Filepath to downloaded file.
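+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api'
+ >>> # and addon "example" 1.0.0 with private file "my.zip"
+ >>> api.download_addon_private_file(
+ ... "example", "1.0.0", "my.zip", "/tmp/downloads")
+ '/tmp/downloads/my.zip'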
+ """
+
+ if not destination_filename:
+ destination_filename = filename
+ dst_filepath = os.path.join(destination_dir, destination_filename)
+ # Filename can contain "subfolders"
+ dst_dirpath = os.path.dirname(dst_filepath)
+ if not os.path.exists(dst_dirpath):
+ os.makedirs(dst_dirpath)
+
+ url = self.get_addon_url(
+ addon_name,
+ addon_version,
+ "private",
+ filename
+ )
+ self.download_file(
+ url, dst_filepath, chunk_size=chunk_size, progress=progress
+ )
+ return dst_filepath
+
+ def get_installers(self, version=None, platform_name=None):
+ """Information about desktop application installers on server.
+
+ Desktop application installers are helpers to download/update AYON
+ desktop application for artists.
+
+ Args:
+ version (Optional[str]): Filter installers by version.
+ platform_name (Optional[str]): Filter installers by platform name.
+
+ Returns:
+ list[dict[str, Any]]: Information about found installers.
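+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api'
+ >>> installers = api.get_installers(platform_name="windows")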
+ """
+
+ query_fields = [
+ "{}={}".format(key, value)
+ for key, value in (
+ ("version", version),
+ ("platform", platform_name),
+ )
+ if value
+ ]
+ query = ""
+ if query_fields:
+ query = "?{}".format(",".join(query_fields))
+
+ response = self.get("desktop/installers{}".format(query))
+ response.raise_for_status()
+ return response.data
+
+ def create_installer(
+ self,
+ filename,
+ version,
+ python_version,
+ platform_name,
+ python_modules,
+ runtime_python_modules,
+ checksum,
+ checksum_algorithm,
+ file_size,
+ sources=None,
+ ):
+ """Create new installer information on server.
+
+ This step will create only metadata. Make sure to upload the
+ installer to the server using the 'upload_installer' method.
+
+ Runtime python modules are modules that are required to run AYON
+ desktop application, but are not added to PYTHONPATH for any
+ subprocess.
+
+ Args:
+ filename (str): Installer filename.
+ version (str): Version of installer.
+ python_version (str): Version of Python.
+ platform_name (str): Name of platform.
+ python_modules (dict[str, str]): Python modules that are available
+ in installer.
+ runtime_python_modules (dict[str, str]): Runtime python modules
+ that are available in installer.
+ checksum (str): Installer file checksum.
+ checksum_algorithm (str): Type of checksum used to create checksum.
+ file_size (int): File size.
+ sources (Optional[list[dict[str, Any]]]): List of sources that
+ can be used to download file.
+ """
+
+ body = {
+ "filename": filename,
+ "version": version,
+ "pythonVersion": python_version,
+ "platform": platform_name,
+ "pythonModules": python_modules,
+ "runtimePythonModules": runtime_python_modules,
+ "checksum": checksum,
+ "checksumAlgorithm": checksum_algorithm,
+ "size": file_size,
+ }
+ if sources:
+ body["sources"] = sources
+
+ response = self.post("desktop/installers", **body)
+ response.raise_for_status()
+
+ def update_installer(self, filename, sources):
+ """Update installer information on server.
+
+ Args:
+ filename (str): Installer filename.
+ sources (list[dict[str, Any]]): List of sources that
+ can be used to download file. Fully replaces existing sources.
+ """
+
+ response = self.patch(
+ "desktop/installers/{}".format(filename),
+ sources=sources
+ )
+ response.raise_for_status()
+
+ def delete_installer(self, filename):
+ """Delete installer from server.
+
+ Args:
+ filename (str): Installer filename.
+ """
+
+ response = self.delete("desktop/installers/{}".format(filename))
+ response.raise_for_status()
+
+ def download_installer(
+ self,
+ filename,
+ dst_filepath,
+ chunk_size=None,
+ progress=None
+ ):
+ """Download installer file from server.
+
+ Args:
+ filename (str): Installer filename.
+ dst_filepath (str): Destination filepath.
+ chunk_size (Optional[int]): Download chunk size.
+ progress (Optional[TransferProgress]): Object that gives ability
+ to track download progress.
+ """
+
+ self.download_file(
+ "desktop/installers/{}".format(filename),
+ dst_filepath,
+ chunk_size=chunk_size,
+ progress=progress
+ )
+
+ def upload_installer(self, src_filepath, dst_filename, progress=None):
+ """Upload installer file to server.
+
+ Args:
+ src_filepath (str): Source filepath.
+ dst_filename (str): Destination filename.
+ progress (Optional[TransferProgress]): Object that gives ability
+ to track upload progress.
+ """
+
+ self.upload_file(
+ "desktop/installers/{}".format(dst_filename),
+ src_filepath,
+ progress=progress
+ )
+
+ def get_dependencies_info(self):
+ """Information about dependency packages on server.
+
+ Example data structure:
+ {
+ "packages": [
+ {
+ "name": str,
+ "platform": str,
+ "checksum": str,
+ "sources": list[dict[str, Any]],
+ "supportedAddons": dict[str, str],
+ "pythonModules": dict[str, str]
+ }
+ ],
+ "productionPackage": str
+ }
+
+ Deprecated:
+ Deprecated since server version 0.2.1. Use
+ 'get_dependency_packages' instead.
+
+ Returns:
+ dict[str, Any]: Information about dependency packages known for
+ server.
+ """
+
+ major, minor, patch, _, _ = self.server_version_tuple
+ if major == 0 and (minor < 2 or (minor == 2 and patch < 1)):
+ result = self.get("dependencies")
+ return result.data
+ packages = self.get_dependency_packages()
+ packages["productionPackage"] = None
+ return packages
+
+ def update_dependency_info(
+ self,
+ name,
+ platform_name,
+ size,
+ checksum,
+ checksum_algorithm=None,
+ supported_addons=None,
+ python_modules=None,
+ sources=None
+ ):
+ """Update or create dependency package for identifiers.
+
+ The endpoint can be used to create or update dependency package.
+
+ Deprecated:
+ Deprecated for server version 0.2.1. Use
+ 'create_dependency_package' instead.
+
+ Args:
+ name (str): Name of dependency package.
+ platform_name (Literal["windows", "linux", "darwin"]): Platform
+ for which is dependency package targeted.
+ size (int): Size of dependency package in bytes.
+ checksum (str): Checksum of archive file where dependencies are.
+ checksum_algorithm (Optional[str]): Algorithm used to calculate
+ checksum. 'md5' is used by default (defined by server).
+ supported_addons (Optional[dict[str, str]]): Names of addons for
+ which the package was created.
+ '{"<addon name>": "<addon version>", ...}'
+ python_modules (Optional[dict[str, str]]): Python modules in
+ dependencies package.
+ '{"<module name>": "<module version>", ...}'
+ sources (Optional[list[dict[str, Any]]]): Information about
+ sources where dependency package is available.
+ """
+
+ kwargs = {
+ key: value
+ for key, value in (
+ ("checksumAlgorithm", checksum_algorithm),
+ ("supportedAddons", supported_addons),
+ ("pythonModules", python_modules),
+ ("sources", sources),
+ )
+ if value
+ }
+
+ response = self.put(
+ "dependencies",
+ name=name,
+ platform=platform_name,
+ size=size,
+ checksum=checksum,
+ **kwargs
+ )
+ response.raise_for_status("Failed to create/update dependency")
+ return response.data
+
+ def get_dependency_packages(self):
+ """Information about dependency packages on server.
+
+ To download dependency package, use 'download_dependency_package'
+ method and pass in 'filename'.
+
+ Example data structure:
+ {
+ "packages": [
+ {
+ "filename": str,
+ "platform": str,
+ "checksum": str,
+ "checksumAlgorithm": str,
+ "size": int,
+ "sources": list[dict[str, Any]],
+ "supportedAddons": dict[str, str],
+ "pythonModules": dict[str, str]
+ }
+ ]
+ }
+
+ Returns:
+ dict[str, Any]: Information about dependency packages known for
+ server.
+ """
+
+ result = self.get("desktop/dependency_packages")
+ result.raise_for_status()
+ return result.data
+
+ def _get_dependency_package_route(
+ self, filename=None, platform_name=None
+ ):
+ major, minor, patch, _, _ = self.server_version_tuple
+ if major == 0 and (minor > 2 or (minor == 2 and patch >= 1)):
+ base = "desktop/dependency_packages"
+ if not filename:
+ return base
+ return "{}/{}".format(base, filename)
+
+ # Backwards compatibility for AYON server 0.2.0 and lower
+ if platform_name is None:
+ platform_name = platform.system().lower()
+ base = "dependencies"
+ if not filename:
+ return base
+ return "{}/{}/{}".format(base, filename, platform_name)
+
+ def create_dependency_package(
+ self,
+ filename,
+ python_modules,
+ source_addons,
+ installer_version,
+ checksum,
+ checksum_algorithm,
+ file_size,
+ sources=None,
+ platform_name=None,
+ ):
+ """Create dependency package on server.
+
+ The package will be created on the server; the package archive file
+ must also be uploaded (using 'upload_dependency_package').
+
+ Args:
+ filename (str): Filename of dependency package.
+ python_modules (dict[str, str]): Python modules in dependency
+ package.
+ '{"<module name>": "<module version>", ...}'
+ source_addons (dict[str, str]): Names of addons for which the
+ dependency package is created.
+ '{"<addon name>": "<addon version>", ...}'
+ installer_version (str): Version of installer for which was
+ package created.
+ checksum (str): Checksum of archive file where dependencies are.
+ checksum_algorithm (str): Algorithm used to calculate checksum.
+ file_size (Optional[int]): Size of file.
+ sources (Optional[list[dict[str, Any]]]): Information about
+ sources from where it is possible to get file.
+ platform_name (Optional[str]): Name of platform for which is
+ dependency package targeted. Default value is
+ current platform.
+ """
+
+ post_body = {
+ "filename": filename,
+ "pythonModules": python_modules,
+ "sourceAddons": source_addons,
+ "installerVersion": installer_version,
+ "checksum": checksum,
+ "checksumAlgorithm": checksum_algorithm,
+ "size": file_size,
+ "platform": platform_name or platform.system().lower(),
+ }
+ if sources:
+ post_body["sources"] = sources
+
+ route = self._get_dependency_package_route()
+ response = self.post(route, **post_body)
+ response.raise_for_status()
+
+ def update_dependency_package(self, filename, sources):
+ """Update dependency package metadata on server.
+
+ Args:
+ filename (str): Filename of dependency package.
+ sources (list[dict[str, Any]]): Information about
+ sources from where it is possible to get file. Fully replaces
+ existing sources.
+ """
+
+ response = self.patch(
+ self._get_dependency_package_route(filename),
+ sources=sources
+ )
+ response.raise_for_status()
+
+ def delete_dependency_package(self, filename, platform_name=None):
+ """Remove dependency package for specific platform.
+
+ Args:
+ filename (str): Filename of dependency package. Or name of package
+ for server version 0.2.0 or lower.
+ platform_name (Optional[str]): Which platform of the package
+ should be removed. Current platform is used if not passed.
+ Deprecated since version 0.2.1
+ """
+
+ route = self._get_dependency_package_route(filename, platform_name)
+ response = self.delete(route)
+ response.raise_for_status("Failed to delete dependency file")
+ return response.data
+
+ def download_dependency_package(
+ self,
+ src_filename,
+ dst_directory,
+ dst_filename,
+ platform_name=None,
+ chunk_size=None,
+ progress=None,
+ ):
+ """Download dependency package from server.
+
+ This method requires an authorized token. The package
+ is only downloaded.
+
+ Args:
+ src_filename (str): Filename of dependency package.
+ For server version 0.2.0 and lower it is name of package
+ to download.
+ dst_directory (str): Where the file should be downloaded.
+ dst_filename (str): Name of destination filename.
+ platform_name (Optional[str]): Name of platform for which the
+ dependency package is targeted. Default value is
+ current platform. Deprecated since server version 0.2.1.
+ chunk_size (Optional[int]): Download chunk size.
+ progress (Optional[TransferProgress]): Object that gives ability
+ to track download progress.
+
+ Returns:
+ str: Filepath to downloaded file.
+ """
+
+ route = self._get_dependency_package_route(src_filename, platform_name)
+ package_filepath = os.path.join(dst_directory, dst_filename)
+ self.download_file(
+ route,
+ package_filepath,
+ chunk_size=chunk_size,
+ progress=progress
+ )
+ return package_filepath
+
+ def upload_dependency_package(
+ self, src_filepath, dst_filename, platform_name=None, progress=None
+ ):
+ """Upload dependency package to server.
+
+ Args:
+ src_filepath (str): Path to a package file.
+ dst_filename (str): Dependency package filename or name of package
+ for server version 0.2.0 or lower. Must be unique.
+ platform_name (Optional[str]): For which platform is the
+ package targeted. Deprecated since server version 0.2.1.
+ progress (Optional[TransferProgress]): Object to keep track about
+ upload state.
+ """
+
+ route = self._get_dependency_package_route(dst_filename, platform_name)
+ self.upload_file(route, src_filepath, progress=progress)
+
+ def create_dependency_package_basename(self, platform_name=None):
+ """Create basename for dependency package file.
+
+ Deprecated:
+ Use 'create_dependency_package_basename' from `ayon_api` or
+ `ayon_api.utils` instead.
+
+ Args:
+ platform_name (Optional[str]): Name of platform for which the
+ bundle is targeted. Default value is current platform.
+
+ Returns:
+ str: Dependency package name with timestamp and platform.
+ """
+
+ return create_dependency_package_basename(platform_name)
+
+ def _get_bundles_route(self):
+ major, minor, patch, _, _ = self.server_version_tuple
+ # Backwards compatibility for AYON server 0.3.0
+ # - first version where bundles were available
+ if major == 0 and minor == 3 and patch == 0:
+ return "desktop/bundles"
+ return "bundles"
+
+ def get_bundles(self):
+ """Server bundles with basic information.
+
+ Example output:
+ {
+ "bundles": [
+ {
+ "name": "my_bundle",
+ "createdAt": "2023-06-12T15:37:02.420260",
+ "installerVersion": "1.0.0",
+ "addons": {
+ "core": "1.2.3"
+ },
+ "dependencyPackages": {
+ "windows": "a_windows_package123.zip",
+ "linux": "a_linux_package123.zip",
+ "darwin": "a_mac_package123.zip"
+ },
+ "isProduction": False,
+ "isStaging": False
+ }
+ ],
+ "productionBundle": "my_bundle",
+ "stagingBundle": "test_bundle"
+ }
+
+ Returns:
+ dict[str, Any]: Server bundles with basic information.
+ """
+
+ response = self.get(self._get_bundles_route())
+ response.raise_for_status()
+ return response.data
+
+ def create_bundle(
+ self,
+ name,
+ addon_versions,
+ installer_version,
+ dependency_packages=None,
+ is_production=None,
+ is_staging=None
+ ):
+ """Create bundle on server.
+
+ Bundle cannot be changed once it is created. Only 'isProduction',
+ 'isStaging' and dependency packages can be changed after creation.
+
+ Args:
+ name (str): Name of bundle.
+ addon_versions (dict[str, str]): Addon versions.
+ installer_version (Union[str, None]): Installer version.
+ dependency_packages (Optional[dict[str, str]]): Dependency
+ package names. Keys are platform names and values are name of
+ packages.
+ is_production (Optional[bool]): Bundle will be marked as
+ production.
+ is_staging (Optional[bool]): Bundle will be marked as staging.
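+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api'.
+ >>> # Values mirror the 'get_bundles' example output above.
+ >>> api.create_bundle(
+ ... "my_bundle",
+ ... {"core": "1.2.3"},
+ ... "1.0.0",
+ ... is_staging=True
+ ... )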
+ """
+
+ body = {
+ "name": name,
+ "installerVersion": installer_version,
+ "addons": addon_versions,
+ }
+ for key, value in (
+ ("dependencyPackages", dependency_packages),
+ ("isProduction", is_production),
+ ("isStaging", is_staging),
+ ):
+ if value is not None:
+ body[key] = value
+
+ response = self.post(self._get_bundles_route(), **body)
+ response.raise_for_status()
+
+ def update_bundle(
+ self,
+ bundle_name,
+ dependency_packages=None,
+ is_production=None,
+ is_staging=None
+ ):
+ """Update bundle on server.
+
+ Dependency packages can be updated only for a single platform.
+ Others will be left untouched. Use 'None' value to unset dependency
+ package from bundle.
+
+ Args:
+ bundle_name (str): Name of bundle.
+ dependency_packages (Optional[dict[str, str]]): Dependency package
+ names that should be used with the bundle.
+ is_production (Optional[bool]): Bundle will be marked as
+ production.
+ is_staging (Optional[bool]): Bundle will be marked as staging.
+ """
+
+ body = {
+ key: value
+ for key, value in (
+ ("dependencyPackages", dependency_packages),
+ ("isProduction", is_production),
+ ("isStaging", is_staging),
+ )
+ if value is not None
+ }
+ response = self.patch(
+ "{}/{}".format(self._get_bundles_route(), bundle_name),
+ **body
+ )
+ response.raise_for_status()
+
+ def delete_bundle(self, bundle_name):
+ """Delete bundle from server.
+
+ Args:
+ bundle_name (str): Name of bundle to delete.
+ """
+
+ response = self.delete(
+ "{}/{}".format(self._get_bundles_route(), bundle_name)
+ )
+ response.raise_for_status()
+
+ # Anatomy presets
+ def get_project_anatomy_presets(self):
+ """Anatomy presets available on server.
+
+ Content has basic information about presets. Example output:
+ [
+ {
+ "name": "netflix_VFX",
+ "primary": false,
+ "version": "1.0.0"
+ },
+ {
+ ...
+ },
+ ...
+ ]
+
+ Returns:
+ list[dict[str, str]]: Anatomy presets available on server.
+ """
+
+ result = self.get("anatomy/presets")
+ result.raise_for_status()
+ return result.data.get("presets") or []
+
+ def get_project_anatomy_preset(self, preset_name=None):
+ """Anatomy preset values by name.
+
+ Get anatomy preset values by preset name. Primary preset is returned
+ if preset name is set to 'None'.
+
+ Args:
+ preset_name (Optional[str]): Preset name.
+
+ Returns:
+ dict[str, Any]: Anatomy preset values.
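+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api'
+ >>> # Primary preset is returned when no name is passed
+ >>> primary = api.get_project_anatomy_preset()
+ >>> # Named preset (name from 'get_project_anatomy_presets')
+ >>> preset = api.get_project_anatomy_preset("netflix_VFX")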
+ """
+
+ if preset_name is None:
+ preset_name = "_"
+ result = self.get("anatomy/presets/{}".format(preset_name))
+ result.raise_for_status()
+ return result.data
+
+ def get_project_roots_by_site(self, project_name):
+ """Root overrides per site name.
+
+ Result is based on the logged-in user and can't be received for
+ another user on the server.
+
+ Output will contain only roots per site id used by logged user.
+
+ Args:
+ project_name (str): Name of project.
+
+ Returns:
+ dict[str, dict[str, str]]: Root values by root name by site id.
+ """
+
+ result = self.get("projects/{}/roots".format(project_name))
+ result.raise_for_status()
+ return result.data
+
+ def get_project_roots_for_site(self, project_name, site_id=None):
+ """Root overrides for site.
+
+ If site id is not passed, the site id set on the current api object
+ is used instead.
+
+ Args:
+ project_name (str): Name of project.
+ site_id (Optional[str]): Id of site for which to receive
+ site overrides.
+
+ Returns:
+ dict[str, str]: Root values by root name. Empty dictionary is
+ returned if site does not have overrides.
+ """
+
+ if site_id is None:
+ site_id = self.site_id
+
+ if site_id is None:
+ return {}
+ roots = self.get_project_roots_by_site(project_name)
+ return roots.get(site_id, {})
+
+ def get_addon_settings_schema(
+ self, addon_name, addon_version, project_name=None
+ ):
+ """Sudio/Project settings schema of an addon.
+
+ Project schema may look different, as some enums are based on project
+ values.
+
+ Args:
+ addon_name (str): Name of addon.
+ addon_version (str): Version of addon.
+ project_name (Optional[str]): Schema for a specific project, or
+ the default studio schema if not passed.
+
+ Returns:
+ dict[str, Any]: Schema of studio/project settings.
+ """
+
+ args = tuple()
+ if project_name:
+ args = (project_name, )
+
+ endpoint = self.get_addon_url(
+ addon_name, addon_version, "schema", *args
+ )
+ result = self.get(endpoint)
+ result.raise_for_status()
+ return result.data
+
+ def get_addon_site_settings_schema(self, addon_name, addon_version):
+ """Site settings schema of an addon.
+
+ Args:
+ addon_name (str): Name of addon.
+ addon_version (str): Version of addon.
+
+ Returns:
+ dict[str, Any]: Schema of site settings.
+ """
+
+ result = self.get("addons/{}/{}/siteSettings/schema".format(
+ addon_name, addon_version
+ ))
+ result.raise_for_status()
+ return result.data
+
+ def get_addon_studio_settings(
+ self,
+ addon_name,
+ addon_version,
+ variant=None
+ ):
+ """Addon studio settings.
+
+ Receive studio settings for specific version of an addon.
+
+ Args:
+ addon_name (str): Name of addon.
+ addon_version (str): Version of addon.
+ variant (Optional[Literal['production', 'staging']]): Name of
+ settings variant. 'default_settings_variant' is used by default.
+
+ Returns:
+ dict[str, Any]: Addon settings.
+ """
+
+ if variant is None:
+ variant = self.default_settings_variant
+
+ query_items = {}
+ if variant:
+ query_items["variant"] = variant
+ query = prepare_query_string(query_items)
+
+ result = self.get(
+ "addons/{}/{}/settings{}".format(addon_name, addon_version, query)
+ )
+ result.raise_for_status()
+ return result.data
+
+ def get_addon_project_settings(
+ self,
+ addon_name,
+ addon_version,
+ project_name,
+ variant=None,
+ site_id=None,
+ use_site=True
+ ):
+ """Addon project settings.
+
+ Receive project settings for specific version of an addon. The
+ settings may include site overrides when enabled.
+
+ Site id is filled with current connection site id if not passed. To
+ make sure no site id is used, set 'use_site' to 'False'.
+
+ Args:
+ addon_name (str): Name of addon.
+ addon_version (str): Version of addon.
+ project_name (str): Name of project for which the settings are
+ received.
+ variant (Optional[Literal['production', 'staging']]): Name of
+ settings variant. 'default_settings_variant' is used by default.
+ site_id (Optional[str]): Name of site which is used for site
+ overrides. Is filled with connection 'site_id' attribute
+ if not passed.
+ use_site (Optional[bool]): Set to 'False' to disable site
+ overrides. In that case no site overrides will be applied.
+
+ Returns:
+ dict[str, Any]: Addon settings.
+ """
+
+ if not use_site:
+ site_id = None
+ elif not site_id:
+ site_id = self.site_id
+
+ query_items = {}
+ if site_id:
+ query_items["site"] = site_id
+
+ if variant is None:
+ variant = self.default_settings_variant
+
+ if variant:
+ query_items["variant"] = variant
+
+ query = prepare_query_string(query_items)
+ result = self.get(
+ "addons/{}/{}/settings/{}{}".format(
+ addon_name, addon_version, project_name, query
+ )
+ )
+ result.raise_for_status()
+ return result.data
+
+ def get_addon_settings(
+ self,
+ addon_name,
+ addon_version,
+ project_name=None,
+ variant=None,
+ site_id=None,
+ use_site=True
+ ):
+ """Receive addon settings.
+
+ Receive addon settings based on project name value. Some arguments may
+ be ignored if 'project_name' is set to 'None'.
+
+ Args:
+ addon_name (str): Name of addon.
+ addon_version (str): Version of addon.
+ project_name (Optional[str]): Name of project for which the
+ settings are received. Studio settings values are received
+ if it is 'None'.
+ variant (Optional[Literal['production', 'staging']]): Name of
+ settings variant. 'default_settings_variant' is used by default.
+ site_id (Optional[str]): Name of site which is used for site
+ overrides. Is filled with connection 'site_id' attribute
+ if not passed.
+ use_site (Optional[bool]): Set to 'False' to disable site
+ overrides. In that case no site overrides will be applied.
+
+ Returns:
+ dict[str, Any]: Addon settings.
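+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api'
+ >>> # and a hypothetical project "demo_project"
+ >>> studio_settings = api.get_addon_settings("example", "1.0.0")
+ >>> project_settings = api.get_addon_settings(
+ ... "example", "1.0.0", "demo_project")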
+ """
+
+ if project_name is None:
+ return self.get_addon_studio_settings(
+ addon_name, addon_version, variant
+ )
+ return self.get_addon_project_settings(
+ addon_name, addon_version, project_name, variant, site_id, use_site
+ )
+
+ def get_addon_site_settings(
+ self, addon_name, addon_version, site_id=None
+ ):
+ """Site settings of an addon.
+
+ If site id is not available an empty dictionary is returned.
+
+ Args:
+ addon_name (str): Name of addon.
+ addon_version (str): Version of addon.
+ site_id (Optional[str]): Name of site for which settings should
+ be returned. Connection's 'site_id' attribute is used
+ if not passed.
+
+ Returns:
+ dict[str, Any]: Site settings.
+ """
+
+ if site_id is None:
+ site_id = self.site_id
+
+ if not site_id:
+ return {}
+
+ query = prepare_query_string({"site": site_id})
+ result = self.get("addons/{}/{}/siteSettings{}".format(
+ addon_name, addon_version, query
+ ))
+ result.raise_for_status()
+ return result.data
+
+ def get_bundle_settings(
+ self,
+ bundle_name=None,
+ project_name=None,
+ variant=None,
+ site_id=None,
+ use_site=True
+ ):
+ """Get complete set of settings for given data.
+
+ If project is not passed then studio settings are returned. If variant
+ is not passed 'default_settings_variant' is used. If bundle name is
+ not passed then current production/staging bundle is used, based on
+ variant value.
+
+ Output contains addon settings and site settings in single dictionary.
+
+ TODOs:
+ - Test how it behaves if there is no bundle.
+ - Test how it behaves if there is no production/staging
+ bundle.
+
+ Warnings:
+ For AYON server < 0.3.0 bundle name will be ignored.
+
+ Example output:
+ {
+ "addons": [
+ {
+ "name": "addon-name",
+ "version": "addon-version",
+ "settings": {...}
+ "siteSettings": {...}
+ }
+ ]
+ }
+
+ Returns:
+ dict[str, Any]: All settings for single bundle.
+ """
+
+ major, minor, _, _, _ = self.server_version_tuple
+ query_values = {
+ key: value
+ for key, value in (
+ ("project_name", project_name),
+ ("variant", variant or self.default_settings_variant),
+ ("bundle_name", bundle_name),
+ )
+ if value
+ }
+ if use_site:
+ if not site_id:
+ site_id = self.site_id
+ if site_id:
+ query_values["site_id"] = site_id
+
+ if major == 0 and minor >= 3:
+ url = "settings"
+ else:
+ # Backward compatibility for AYON server < 0.3.0
+ url = "settings/addons"
+ query_values.pop("bundle_name", None)
+ for new_key, old_key in (
+ ("project_name", "project"),
+ ("site_id", "site"),
+ ):
+ if new_key in query_values:
+ query_values[old_key] = query_values.pop(new_key)
+
+ query = prepare_query_string(query_values)
+ response = self.get("{}{}".format(url, query))
+ response.raise_for_status()
+ return response.data
+
+ def get_addons_studio_settings(
+ self,
+ bundle_name=None,
+ variant=None,
+ site_id=None,
+ use_site=True,
+ only_values=True
+ ):
+ """All addons settings in one bulk.
+
+ Warnings:
+ Behavior of this function changed with AYON server version 0.3.0.
+ Structure of output from server changed. If using
+ 'only_values=True' then output should be same as before.
+
+ Args:
+ bundle_name (Optional[str]): Name of bundle for which settings
+ should be received.
+ variant (Optional[Literal['production', 'staging']]): Name of
+ settings variant. 'default_settings_variant' is used by default.
+ site_id (Optional[str]): Id of site for which to receive
+ site overrides.
+ use_site (bool): Set to 'False' to disable site overrides.
+ In that case no site overrides will be applied.
+ only_values (Optional[bool]): Output will contain only settings
+ values without metadata about addons.
+
+ Returns:
+ dict[str, Any]: Settings of all addons on server.
+ """
+
+ output = self.get_bundle_settings(
+ bundle_name=bundle_name,
+ variant=variant,
+ site_id=site_id,
+ use_site=use_site
+ )
+ if only_values:
+ major, minor, patch, _, _ = self.server_version_tuple
+ if major == 0 and minor >= 3:
+ output = {
+ addon["name"]: addon["settings"]
+ for addon in output["addons"]
+ }
+ else:
+ # Backward compatibility for AYON server < 0.3.0
+ output = output["settings"]
+ return output
+
+ def get_addons_project_settings(
+ self,
+ project_name,
+ bundle_name=None,
+ variant=None,
+ site_id=None,
+ use_site=True,
+ only_values=True
+ ):
+ """Project settings of all addons.
+
+ Server returns information about used addon versions, so full output
+ looks like:
+ {
+ "settings": {...},
+ "addons": {...}
+ }
+
+ The output can be limited to values only via the 'only_values'
+ argument, which is set to 'True' by default. In that case output
+ contains only value of 'settings' key.
+
+ Warnings:
+ Behavior of this function changed with AYON server version 0.3.0.
+ Structure of output from server changed. If using
+ 'only_values=True' then output should be same as before.
+
+ Args:
+ project_name (str): Name of project for which settings are
+ received.
+ bundle_name (Optional[str]): Name of bundle for which settings
+ should be received.
+ variant (Optional[Literal['production', 'staging']]): Name of
+ settings variant. 'default_settings_variant' is used by default.
+ site_id (Optional[str]): Id of site for which to receive
+ site overrides.
+ use_site (bool): Set to 'False' to disable site overrides.
+ In that case no site overrides will be applied.
+ only_values (Optional[bool]): Output will contain only settings
+ values without metadata about addons.
+
+ Returns:
+ dict[str, Any]: Settings of all addons on server for passed
+ project.
+ """
+
+ if not project_name:
+ raise ValueError("Project name must be passed.")
+
+ output = self.get_bundle_settings(
+ project_name=project_name,
+ bundle_name=bundle_name,
+ variant=variant,
+ site_id=site_id,
+ use_site=use_site
+ )
+ if only_values:
+ major, minor, patch, _, _ = self.server_version_tuple
+ if major == 0 and minor >= 3:
+ output = {
+ addon["name"]: addon["settings"]
+ for addon in output["addons"]
+ }
+ else:
+ # Backward compatibility for AYON server < 0.3.0
+ output = output["settings"]
+ return output
+
+ def get_addons_settings(
+ self,
+ bundle_name=None,
+ project_name=None,
+ variant=None,
+ site_id=None,
+ use_site=True,
+ only_values=True
+ ):
+ """Universal function to receive all addon settings.
+
+ Based on 'project_name' receives studio settings or project
+ settings. If project name is not passed, 'site_id' is ignored.
+
+ Warnings:
+ Behavior of this function changed with AYON server version 0.3.0.
+ Structure of output from server changed. If using
+ 'only_values=True' then output should be same as before.
+
+ Args:
+ bundle_name (Optional[str]): Name of bundle for which settings
+ should be received.
+ project_name (Optional[str]): Name of project for which settings
+ should be received.
+ variant (Optional[Literal['production', 'staging']]): Name of
+ settings variant. 'default_settings_variant' is used by default.
+ site_id (Optional[str]): Id of site for which to receive
+ site overrides.
+ use_site (Optional[bool]): Set to 'False' to disable site
+ overrides. In that case no site overrides will be applied.
+ only_values (Optional[bool]): Only settings values will be
+ returned. Set to 'True' by default.
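+
+ Returns:
+ dict[str, Any]: Settings of all addons for studio or project.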
+ """
+
+ if project_name is None:
+ return self.get_addons_studio_settings(
+ bundle_name=bundle_name,
+ variant=variant,
+ site_id=site_id,
+ use_site=use_site,
+ only_values=only_values
+ )
+
+ return self.get_addons_project_settings(
+ project_name=project_name,
+ bundle_name=bundle_name,
+ variant=variant,
+ site_id=site_id,
+ use_site=use_site,
+ only_values=only_values
+ )
+
+ # Entity getters
+ def get_rest_project(self, project_name):
+ """Query project by name.
+
+ This call returns project with anatomy data.
+
+ Args:
+ project_name (str): Name of project.
+
+ Returns:
+ Union[dict[str, Any], None]: Project entity data or 'None' if
+ project was not found.
+ """
+
+ if not project_name:
+ return None
+
+ response = self.get("projects/{}".format(project_name))
+ if response.status == 200:
+ return response.data
+ return None
+
+ def get_rest_projects(self, active=True, library=None):
+ """Query available project entities.
+
+ User must be logged in.
+
+ Args:
+ active (Optional[bool]): Filter active/inactive projects. Both
+ are returned if 'None' is passed.
+ library (Optional[bool]): Filter standard/library projects. Both
+ are returned if 'None' is passed.
+
+ Returns:
+ Generator[dict[str, Any]]: Available projects.
+ """
+
+ for project_name in self.get_project_names(active, library):
+ project = self.get_rest_project(project_name)
+ if project:
+ yield project
+
+ def get_rest_entity_by_id(self, project_name, entity_type, entity_id):
+ """Get entity using REST on a project by its id.
+
+ Args:
+ project_name (str): Name of project where entity is.
+ entity_type (Literal["folder", "task", "product", "version",
+ "representation"]): The entity type which should be received.
+ entity_id (str): Id of entity.
+
+ Returns:
+ Union[dict[str, Any], None]: Received entity data or 'None'
+ if entity was not found.
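+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api',
+ >>> # a hypothetical project and entity id
+ >>> folder = api.get_rest_entity_by_id(
+ ... "demo_project", "folder", "0123456789abcdef")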
+ """
+
+ if not all((project_name, entity_type, entity_id)):
+ return None
+
+ entity_endpoint = "{}s".format(entity_type)
+ response = self.get("projects/{}/{}/{}".format(
+ project_name, entity_endpoint, entity_id
+ ))
+ if response.status == 200:
+ return response.data
+ return None
+
+ def get_rest_folder(self, project_name, folder_id):
+ return self.get_rest_entity_by_id(project_name, "folder", folder_id)
+
+ def get_rest_task(self, project_name, task_id):
+ return self.get_rest_entity_by_id(project_name, "task", task_id)
+
+ def get_rest_product(self, project_name, product_id):
+ return self.get_rest_entity_by_id(project_name, "product", product_id)
+
+ def get_rest_version(self, project_name, version_id):
+ return self.get_rest_entity_by_id(project_name, "version", version_id)
+
+ def get_rest_representation(self, project_name, representation_id):
+ return self.get_rest_entity_by_id(
+ project_name, "representation", representation_id
+ )
+
+ def get_project_names(self, active=True, library=None):
+ """Receive available project names.
+
+ User must be logged in.
+
+ Args:
+ active (Optional[bool]): Filter active/inactive projects. Both
+ are returned if 'None' is passed.
+ library (Optional[bool]): Filter standard/library projects. Both
+ are returned if 'None' is passed.
+
+ Returns:
+ list[str]: List of available project names.
+ """
+
+ query_keys = {}
+ if active is not None:
+ query_keys["active"] = "true" if active else "false"
+
+ if library is not None:
+ query_keys["library"] = "true" if library else "false"
+ query = ""
+ if query_keys:
+ query = "?{}".format(",".join([
+ "{}={}".format(key, value)
+ for key, value in query_keys.items()
+ ]))
+
+ response = self.get("projects{}".format(query), **query_keys)
+ response.raise_for_status()
+ data = response.data
+ project_names = []
+ if data:
+ for project in data["projects"]:
+ project_names.append(project["name"])
+ return project_names
+
+ def get_projects(
+ self, active=True, library=None, fields=None, own_attributes=False
+ ):
+ """Get projects.
+
+ Args:
+ active (Optional[bool]): Filter active or inactive projects.
+ Filter is disabled when 'None' is passed.
+ library (Optional[bool]): Filter library projects. Filter is
+ disabled when 'None' is passed.
+ fields (Optional[Iterable[str]]): Fields to be queried
+ for project.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Generator[dict[str, Any]]: Queried projects.
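+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api'
+ >>> for project in api.get_projects(fields={"name", "code"}):
+ ... print(project["name"])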
+ """
+
+ if fields is None:
+ use_rest = True
+ else:
+ use_rest = False
+ fields = set(fields)
+ if own_attributes:
+ fields.add("ownAttrib")
+ for field in fields:
+ if field.startswith("config"):
+ use_rest = True
+ break
+
+ if use_rest:
+ for project in self.get_rest_projects(active, library):
+ if own_attributes:
+ fill_own_attribs(project)
+ yield project
+
+ else:
+ query = projects_graphql_query(fields)
+ for parsed_data in query.continuous_query(self):
+ for project in parsed_data["projects"]:
+ if own_attributes:
+ fill_own_attribs(project)
+ yield project
+
+ def get_project(self, project_name, fields=None, own_attributes=False):
+ """Get project.
+
+ Args:
+ project_name (str): Name of project.
+ fields (Optional[Iterable[str]]): Fields to be queried
+ for project.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict[str, Any], None]: Project entity data or None
+ if project was not found.
+ """
+
+ use_rest = True
+ if fields is not None:
+ use_rest = False
+ _fields = set()
+ for field in fields:
+ if field.startswith("config") or field == "data":
+ use_rest = True
+ break
+ _fields.add(field)
+
+ fields = _fields
+
+ if use_rest:
+ project = self.get_rest_project(project_name)
+ if own_attributes:
+ fill_own_attribs(project)
+ return project
+
+ if own_attributes:
+ field.add("ownAttrib")
+ query = project_graphql_query(fields)
+ query.set_variable_value("projectName", project_name)
+
+ parsed_data = query.query(self)
+
+ project = parsed_data["project"]
+ if project is not None:
+ project["name"] = project_name
+ if own_attributes:
+ fill_own_attribs(project)
+ return project
+
+ def get_folders(
+ self,
+ project_name,
+ folder_ids=None,
+ folder_paths=None,
+ folder_names=None,
+ parent_ids=None,
+ active=True,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query folders from server.
+
+ Todos:
+ Folder name won't be a unique identifier, so we should add
+ folder path filtering.
+
+ Notes:
+ Filter 'active' doesn't have a direct filter in GraphQl.
+
+ Args:
+ project_name (str): Name of project.
+ folder_ids (Optional[Iterable[str]]): Folder ids to filter.
+ folder_paths (Optional[Iterable[str]]): Folder paths used
+ for filtering.
+ folder_names (Optional[Iterable[str]]): Folder names used
+ for filtering.
+ parent_ids (Optional[Iterable[str]]): Ids of folder parents.
+ Use 'None' if folder is direct child of project.
+ active (Optional[bool]): Filter active/inactive folders.
+ Both are returned if is set to None.
+ fields (Optional[Iterable[str]]): Fields to be queried for
+ folder. All possible folder fields are returned
+ if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Generator[dict[str, Any]]: Queried folder entities.
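+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api'
+ >>> # and a hypothetical project. 'None' in 'parent_ids' matches
+ >>> # folders directly under the project root.
+ >>> folders = list(api.get_folders(
+ ... "demo_project", parent_ids=[None]))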
+ """
+
+ if not project_name:
+ return
+
+ filters = {
+ "projectName": project_name
+ }
+ if folder_ids is not None:
+ folder_ids = set(folder_ids)
+ if not folder_ids:
+ return
+ filters["folderIds"] = list(folder_ids)
+
+ if folder_paths is not None:
+ folder_paths = set(folder_paths)
+ if not folder_paths:
+ return
+ filters["folderPaths"] = list(folder_paths)
+
+ if folder_names is not None:
+ folder_names = set(folder_names)
+ if not folder_names:
+ return
+ filters["folderNames"] = list(folder_names)
+
+ if parent_ids is not None:
+ parent_ids = set(parent_ids)
+ if not parent_ids:
+ return
+ if None in parent_ids:
+ # Replace 'None' with '"root"' which is used during GraphQl
+ # query for parent ids filter for folders without folder
+ # parent
+ parent_ids.remove(None)
+ parent_ids.add("root")
+
+ if project_name in parent_ids:
+ # Replace project name with '"root"' which is used during
+ # GraphQl query for parent ids filter for folders without
+ # folder parent
+ parent_ids.remove(project_name)
+ parent_ids.add("root")
+
+ filters["parentFolderIds"] = list(parent_ids)
+
+ if fields:
+ fields = set(fields)
+ else:
+ fields = self.get_default_fields_for_type("folder")
+
+ use_rest = False
+ if "data" in fields:
+ use_rest = True
+ fields = {"id"}
+
+ if active is not None:
+ fields.add("active")
+
+ if own_attributes and not use_rest:
+ fields.add("ownAttrib")
+
+ query = folders_graphql_query(fields)
+ for attr, filter_value in filters.items():
+ query.set_variable_value(attr, filter_value)
+
+ for parsed_data in query.continuous_query(self):
+ for folder in parsed_data["project"]["folders"]:
+ if active is not None and active is not folder["active"]:
+ continue
+
+ if use_rest:
+ folder = self.get_rest_folder(project_name, folder["id"])
+
+ if own_attributes:
+ fill_own_attribs(folder)
+ yield folder
+
+ def get_folder_by_id(
+ self,
+ project_name,
+ folder_id,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query folder entity by id.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+ folder_id (str): Folder id.
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict, None]: Folder entity data or None if was not found.
+ """
+
+ folders = self.get_folders(
+ project_name,
+ folder_ids=[folder_id],
+ active=None,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for folder in folders:
+ return folder
+ return None
+
+ def get_folder_by_path(
+ self,
+ project_name,
+ folder_path,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query folder entity by path.
+
+ Folder path is a path to folder with all parent names joined by slash.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+ folder_path (str): Folder path.
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict, None]: Folder entity data or None if was not found.
+ """
+
+ folders = self.get_folders(
+ project_name,
+ folder_paths=[folder_path],
+ active=None,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for folder in folders:
+ return folder
+ return None
+
+ def get_folder_by_name(
+ self,
+ project_name,
+ folder_name,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query folder entity by path.
+
+ Warnings:
+ Folder name is not a unique identifier of a folder. Function is
+ kept for OpenPype 3 compatibility.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+ folder_name (str): Folder name.
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict, None]: Folder entity data or None if was not found.
+ """
+
+ folders = self.get_folders(
+ project_name,
+ folder_names=[folder_name],
+ active=None,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for folder in folders:
+ return folder
+ return None
+
+ def get_folder_ids_with_products(self, project_name, folder_ids=None):
+ """Find folders which have at least one product.
+
+ Folders that have at least one product should be immutable, so
+ their path should not change, i.e. a change of the folder name or
+ the name of any parent is not possible.
+
+ Args:
+ project_name (str): Name of project.
+ folder_ids (Optional[Iterable[str]]): Limit folder ids filtering
+ to a set of folders. If set to None all folders on project are
+ checked.
+
+ Returns:
+ set[str]: Folder ids that have at least one product.
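+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api'
+ >>> folder_ids = api.get_folder_ids_with_products("demo_project")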
+ """
+
+ if folder_ids is not None:
+ folder_ids = set(folder_ids)
+ if not folder_ids:
+ return set()
+
+ query = folders_graphql_query({"id"})
+ query.set_variable_value("projectName", project_name)
+ query.set_variable_value("folderHasProducts", True)
+ if folder_ids:
+ query.set_variable_value("folderIds", list(folder_ids))
+
+ parsed_data = query.query(self)
+ folders = parsed_data["project"]["folders"]
+ return {
+ folder["id"]
+ for folder in folders
+ }
+
+ def get_tasks(
+ self,
+ project_name,
+ task_ids=None,
+ task_names=None,
+ task_types=None,
+ folder_ids=None,
+ active=True,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query task entities from server.
+
+ Args:
+ project_name (str): Name of project.
+ task_ids (Iterable[str]): Task ids to filter.
+ task_names (Iterable[str]): Task names used for filtering.
+ task_types (Iterable[str]): Task types used for filtering.
+ folder_ids (Iterable[str]): Ids of folders (task parents)
+ used for filtering.
+ active (Optional[bool]): Filter active/inactive tasks.
+ Both are returned if is set to None.
+ fields (Optional[Iterable[str]]): Fields to be queried for
+ task. All possible task fields are returned
+ if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Generator[dict[str, Any]]: Queried task entities.
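+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api'
+ >>> # and a hypothetical task type
+ >>> tasks = list(api.get_tasks(
+ ... "demo_project", task_types=["Compositing"]))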
+ """
+
+ if not project_name:
+ return
+
+ filters = {
+ "projectName": project_name
+ }
+
+ if task_ids is not None:
+ task_ids = set(task_ids)
+ if not task_ids:
+ return
+ filters["taskIds"] = list(task_ids)
+
+ if task_names is not None:
+ task_names = set(task_names)
+ if not task_names:
+ return
+ filters["taskNames"] = list(task_names)
+
+ if task_types is not None:
+ task_types = set(task_types)
+ if not task_types:
+ return
+ filters["taskTypes"] = list(task_types)
+
+ if folder_ids is not None:
+ folder_ids = set(folder_ids)
+ if not folder_ids:
+ return
+ filters["folderIds"] = list(folder_ids)
+
+ if not fields:
+ fields = self.get_default_fields_for_type("task")
+
+ fields = set(fields)
+
+ use_rest = False
+ if "data" in fields:
+ use_rest = True
+ fields = {"id"}
+
+ if active is not None:
+ fields.add("active")
+
+ if own_attributes:
+ fields.add("ownAttrib")
+
+ query = tasks_graphql_query(fields)
+ for attr, filter_value in filters.items():
+ query.set_variable_value(attr, filter_value)
+
+ for parsed_data in query.continuous_query(self):
+ for task in parsed_data["project"]["tasks"]:
+ if active is not None and active is not task["active"]:
+ continue
+
+ if use_rest:
+ task = self.get_rest_task(project_name, task["id"])
+
+ if own_attributes:
+ fill_own_attribs(task)
+ yield task
+
+ def get_task_by_name(
+ self,
+ project_name,
+ folder_id,
+ task_name,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query task entity by name and folder id.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+ folder_id (str): Folder id.
+ task_name (str): Task name.
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict, None]: Task entity data or None if was not found.
+ """
+
+ for task in self.get_tasks(
+ project_name,
+ folder_ids=[folder_id],
+ task_names=[task_name],
+ active=None,
+ fields=fields,
+ own_attributes=own_attributes
+ ):
+ return task
+ return None
+
+ def get_task_by_id(
+ self,
+ project_name,
+ task_id,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query task entity by id.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+ task_id (str): Task id.
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict, None]: Task entity data or None if was not found.
+ """
+
+ for task in self.get_tasks(
+ project_name,
+ task_ids=[task_id],
+ active=None,
+ fields=fields,
+ own_attributes=own_attributes
+ ):
+ return task
+ return None
+
+ def _filter_product(
+ self, project_name, product, active, own_attributes, use_rest
+ ):
+ if active is not None and product["active"] is not active:
+ return None
+
+ if use_rest:
+ product = self.get_rest_product(project_name, product["id"])
+
+ if own_attributes:
+ fill_own_attribs(product)
+
+ return product
+
+ def get_products(
+ self,
+ project_name,
+ product_ids=None,
+ product_names=None,
+ folder_ids=None,
+ names_by_folder_ids=None,
+ active=True,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query products from server.
+
+ Todos:
+ Separate 'names_by_folder_ids' filtering into a separate method.
+ It cannot be combined with some other filters.
+
+ Args:
+ project_name (str): Name of project.
+ product_ids (Optional[Iterable[str]]): Product ids to filter.
+ product_names (Optional[Iterable[str]]): Product names used for
+ filtering.
+ folder_ids (Optional[Iterable[str]]): Ids of folders (product
+ parents) used for filtering.
+ names_by_folder_ids (Optional[dict[str, Iterable[str]]]): Product
+ name filtering by folder id.
+ active (Optional[bool]): Filter active/inactive products.
+ Both are returned if is set to None.
+ fields (Optional[Iterable[str]]): Fields to be queried for
+ product. All possible product fields are returned
+ if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Generator[dict[str, Any]]: Queried product entities.
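+
+ Example:
+ >>> # Minimal usage sketch; assumes a connected ServerAPI 'api'
+ >>> # and hypothetical folder ids/product names
+ >>> products = list(api.get_products(
+ ... "demo_project",
+ ... names_by_folder_ids={
+ ... "0123456789abcdef": {"modelMain", "renderMain"}
+ ... }
+ ... ))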
+ """
+
+ if not project_name:
+ return
+
+ if product_ids is not None:
+ product_ids = set(product_ids)
+ if not product_ids:
+ return
+
+ filter_product_names = None
+ if product_names is not None:
+ filter_product_names = set(product_names)
+ if not filter_product_names:
+ return
+
+ filter_folder_ids = None
+ if folder_ids is not None:
+ filter_folder_ids = set(folder_ids)
+ if not filter_folder_ids:
+ return
+
+ # This will disable 'folder_ids' and 'product_names' filters
+ # - maybe could be enhanced in future?
+ if names_by_folder_ids is not None:
+ filter_product_names = set()
+ filter_folder_ids = set()
+
+ for folder_id, names in names_by_folder_ids.items():
+ if folder_id and names:
+ filter_folder_ids.add(folder_id)
+ filter_product_names |= set(names)
+
+ if not filter_product_names or not filter_folder_ids:
+ return
+
+ # Convert fields and add minimum required fields
+ if fields:
+ fields = set(fields) | {"id"}
+ else:
+ fields = self.get_default_fields_for_type("product")
+
+ use_rest = False
+ if "data" in fields:
+ use_rest = True
+ fields = {"id"}
+
+ if active is not None:
+ fields.add("active")
+
+ if own_attributes:
+ fields.add("ownAttrib")
+
+ # Add 'name' and 'folderId' if 'names_by_folder_ids' filter is entered
+ if names_by_folder_ids:
+ fields.add("name")
+ fields.add("folderId")
+
+ # Prepare filters for query
+ filters = {
+ "projectName": project_name
+ }
+ if filter_folder_ids:
+ filters["folderIds"] = list(filter_folder_ids)
+
+ if product_ids:
+ filters["productIds"] = list(product_ids)
+
+ if filter_product_names:
+ filters["productNames"] = list(filter_product_names)
+
+ query = products_graphql_query(fields)
+ for attr, filter_value in filters.items():
+ query.set_variable_value(attr, filter_value)
+
+ parsed_data = query.query(self)
+
+ products = parsed_data.get("project", {}).get("products", [])
+ # Filter products by 'names_by_folder_ids'
+ if names_by_folder_ids:
+ products_by_folder_id = collections.defaultdict(list)
+ for product in products:
+ filtered_product = self._filter_product(
+ project_name, product, active, own_attributes, use_rest
+ )
+ if filtered_product is not None:
+ folder_id = filtered_product["folderId"]
+ products_by_folder_id[folder_id].append(filtered_product)
+
+ for folder_id, names in names_by_folder_ids.items():
+ for folder_product in products_by_folder_id[folder_id]:
+ if folder_product["name"] in names:
+ yield folder_product
+
+ else:
+ for product in products:
+ filtered_product = self._filter_product(
+ project_name, product, active, own_attributes, use_rest
+ )
+ if filtered_product is not None:
+ yield filtered_product
+
+ def get_product_by_id(
+ self,
+ project_name,
+ product_id,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query product entity by id.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+ product_id (str): Product id.
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict, None]: Product entity data or None if was not found.
+ """
+
+ products = self.get_products(
+ project_name,
+ product_ids=[product_id],
+ active=None,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for product in products:
+ return product
+ return None
+
+ def get_product_by_name(
+ self,
+ project_name,
+ product_name,
+ folder_id,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query product entity by name and folder id.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+ product_name (str): Product name.
+ folder_id (str): Folder id (Folder is a parent of products).
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict, None]: Product entity data or None if was not found.
+ """
+
+ products = self.get_products(
+ project_name,
+ product_names=[product_name],
+ folder_ids=[folder_id],
+ active=None,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for product in products:
+ return product
+ return None
+
+ def get_product_types(self, fields=None):
+ """Types of products.
+
+ This is server wide information. Product types have 'name', 'icon' and
+ 'color'.
+
+ Args:
+ fields (Optional[Iterable[str]]): Product types fields to query.
+
+ Returns:
+ list[dict[str, Any]]: Product types information.
+ """
+
+ if not fields:
+ fields = self.get_default_fields_for_type("productType")
+
+ query = product_types_query(fields)
+
+ parsed_data = query.query(self)
+
+ return parsed_data.get("productTypes", [])
+
+ def get_project_product_types(self, project_name, fields=None):
+ """Types of products available on a project.
+
+ Filter only product types available on project.
+
+ Args:
+ project_name (str): Name of project where to look for
+ product types.
+ fields (Optional[Iterable[str]]): Product types fields to query.
+
+ Returns:
+ list[dict[str, Any]]: Product types information.
+ """
+
+ if not fields:
+ fields = self.get_default_fields_for_type("productType")
+
+ query = project_product_types_query(fields)
+ query.set_variable_value("projectName", project_name)
+
+ parsed_data = query.query(self)
+
+ return parsed_data.get("project", {}).get("productTypes", [])
+
+ def get_product_type_names(self, project_name=None, product_ids=None):
+ """Product type names.
+
+ Warnings:
+ This function will probably be removed. It depends on whether
+ the 'product_ids' filter has a real use-case.
+
+ Args:
+ project_name (Optional[str]): Name of project where to look for
+ queried entities.
+ product_ids (Optional[Iterable[str]]): Product ids filter. Can be
+ used only with 'project_name'.
+
+ Returns:
+ set[str]: Product type names.
+ """
+
+ if project_name and product_ids:
+ products = self.get_products(
+ project_name,
+ product_ids=product_ids,
+ fields=["productType"],
+ active=None,
+ )
+ return {
+ product["productType"]
+ for product in products
+ }
+
+ return {
+ product_info["name"]
+ for product_info in self.get_project_product_types(
+ project_name, fields=["name"]
+ )
+ }
+
+ def get_versions(
+ self,
+ project_name,
+ version_ids=None,
+ product_ids=None,
+ versions=None,
+ hero=True,
+ standard=True,
+ latest=None,
+ active=True,
+ fields=None,
+ own_attributes=False
+ ):
+ """Get version entities based on passed filters from server.
+
+ Args:
+ project_name (str): Name of project where to look for versions.
+ version_ids (Optional[Iterable[str]]): Version ids used for
+ version filtering.
+ product_ids (Optional[Iterable[str]]): Product ids used for
+ version filtering.
+ versions (Optional[Iterable[int]]): Versions we're interested in.
+ hero (Optional[bool]): Receive also hero versions when set to true.
+ standard (Optional[bool]): Receive versions which are not hero when
+ set to true.
+ latest (Optional[bool]): Return only latest version of standard
+ versions. This can be combined only with 'standard' attribute
+ set to True.
+ active (Optional[bool]): Receive active/inactive entities.
+ Both are returned when 'None' is passed.
+ fields (Optional[Iterable[str]]): Fields to be queried
+                for version. All possible version fields are returned
+ if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Generator[dict[str, Any]]: Queried version entities.
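+
+        Example:
+            Illustrative sketch of the filter combinations; ids and project
+            name are made up and 'con' stands for a connection object.
+
+            >>> product_ids = ["9fd8706ec2d211ed8e8c6c9466b19aa8"]
+            >>> # Only the latest standard version of each product
+            >>> latest = list(con.get_versions(
+            ...     "demo_project",
+            ...     product_ids=product_ids,
+            ...     latest=True,
+            ...     hero=False))
+            >>> # Only hero versions (negative 'version' values)
+            >>> heroes = list(con.get_versions(
+            ...     "demo_project",
+            ...     product_ids=product_ids,
+            ...     hero=True,
+            ...     standard=False))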
+ """
+
+ if not fields:
+ fields = self.get_default_fields_for_type("version")
+ fields = set(fields)
+
+ if active is not None:
+ fields.add("active")
+
+        # Make sure the minimum required fields are queried
+ fields |= {"id", "version"}
+
+ use_rest = False
+ if "data" in fields:
+ use_rest = True
+ fields = {"id"}
+
+ if own_attributes:
+ fields.add("ownAttrib")
+
+ filters = {
+ "projectName": project_name
+ }
+ if version_ids is not None:
+ version_ids = set(version_ids)
+ if not version_ids:
+ return
+ filters["versionIds"] = list(version_ids)
+
+ if product_ids is not None:
+ product_ids = set(product_ids)
+ if not product_ids:
+ return
+ filters["productIds"] = list(product_ids)
+
+ # TODO versions can't be used as filter at this moment!
+ if versions is not None:
+ versions = set(versions)
+ if not versions:
+ return
+ filters["versions"] = list(versions)
+
+ if not hero and not standard:
+ return
+
+ queries = []
+ # Add filters based on 'hero' and 'standard'
+        # NOTE: There is no filter to "ignore" hero versions or to get
+        #   both latest and hero versions
+        # - if both latest and hero versions should be returned, it must be
+        #   done in 2 graphql queries
+        if standard and not latest:
+            # This queries all versions, standard and hero
+            # - hero versions must be filtered out during the loop if they
+            #   are not enabled
+ query = versions_graphql_query(fields)
+ for attr, filter_value in filters.items():
+ query.set_variable_value(attr, filter_value)
+ queries.append(query)
+ else:
+ if hero:
+ # Add hero query if hero is enabled
+ hero_query = versions_graphql_query(fields)
+ for attr, filter_value in filters.items():
+ hero_query.set_variable_value(attr, filter_value)
+
+ hero_query.set_variable_value("heroOnly", True)
+ queries.append(hero_query)
+
+ if standard:
+ standard_query = versions_graphql_query(fields)
+ for attr, filter_value in filters.items():
+ standard_query.set_variable_value(attr, filter_value)
+
+ if latest:
+ standard_query.set_variable_value("latestOnly", True)
+ queries.append(standard_query)
+
+ for query in queries:
+ for parsed_data in query.continuous_query(self):
+ for version in parsed_data["project"]["versions"]:
+ if active is not None and version["active"] is not active:
+ continue
+
+ if not hero and version["version"] < 0:
+ continue
+
+ if use_rest:
+ version = self.get_rest_version(
+ project_name, version["id"]
+ )
+
+ if own_attributes:
+ fill_own_attribs(version)
+
+ yield version
+
+ def get_version_by_id(
+ self,
+ project_name,
+ version_id,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query version entity by id.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+ version_id (str): Version id.
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+            Union[dict, None]: Version entity data or None if it was not found.
+ """
+
+ versions = self.get_versions(
+ project_name,
+ version_ids=[version_id],
+ active=None,
+ hero=True,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for version in versions:
+ return version
+ return None
+
+ def get_version_by_name(
+ self,
+ project_name,
+ version,
+ product_id,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query version entity by version and product id.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+ version (int): Version of version entity.
+ product_id (str): Product id. Product is a parent of version.
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+            Union[dict, None]: Version entity data or None if it was not found.
+ """
+
+ versions = self.get_versions(
+ project_name,
+ product_ids=[product_id],
+ versions=[version],
+ active=None,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for version in versions:
+ return version
+ return None
+
+ def get_hero_version_by_id(
+ self,
+ project_name,
+ version_id,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query hero version entity by id.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+            version_id (str): Hero version id.
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+            Union[dict, None]: Version entity data or None if it was not found.
+ """
+
+ versions = self.get_hero_versions(
+ project_name,
+ version_ids=[version_id],
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for version in versions:
+ return version
+ return None
+
+ def get_hero_version_by_product_id(
+ self,
+ project_name,
+ product_id,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query hero version entity by product id.
+
+ Only one hero version is available on a product.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+            product_id (str): Product id.
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+            Union[dict, None]: Version entity data or None if it was not found.
+ """
+
+ versions = self.get_hero_versions(
+ project_name,
+ product_ids=[product_id],
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for version in versions:
+ return version
+ return None
+
+ def get_hero_versions(
+ self,
+ project_name,
+ product_ids=None,
+ version_ids=None,
+ active=True,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query hero versions by multiple filters.
+
+ Only one hero version is available on a product.
+
+ Args:
+ project_name (str): Name of project where to look for queried
+ entities.
+ product_ids (Optional[Iterable[str]]): Product ids.
+ version_ids (Optional[Iterable[str]]): Version ids.
+ active (Optional[bool]): Receive active/inactive entities.
+ Both are returned when 'None' is passed.
+ fields (Optional[Iterable[str]]): Fields that should be returned.
+ All fields are returned if 'None' is passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+            Generator[dict[str, Any]]: Queried hero version entities.
+ """
+
+ return self.get_versions(
+ project_name,
+ version_ids=version_ids,
+ product_ids=product_ids,
+ hero=True,
+ standard=False,
+ active=active,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+
+ def get_last_versions(
+ self,
+ project_name,
+ product_ids,
+ active=True,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query last version entities by product ids.
+
+ Args:
+            project_name (str): Project where to look for versions.
+            product_ids (Iterable[str]): Product ids.
+            active (Optional[bool]): Receive active/inactive entities.
+                Both are returned when 'None' is passed.
+            fields (Optional[Iterable[str]]): Fields to be queried
+                for versions.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ dict[str, dict[str, Any]]: Last versions by product id.
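+
+        Example:
+            Illustrative sketch; ids are made up and 'con' stands for
+            a connection object.
+
+            >>> product_ids = ["9fd8706ec2d211ed8e8c6c9466b19aa8"]
+            >>> last_versions = con.get_last_versions(
+            ...     "demo_project", product_ids)
+            >>> last_version = last_versions[product_ids[0]]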
+ """
+
+        if fields:
+            fields = set(fields)
+            # Make sure product id is queried so the output can be mapped
+            fields.add("productId")
+
+        versions = self.get_versions(
+            project_name,
+            product_ids=product_ids,
+            latest=True,
+            active=active,
+            fields=fields,
+            own_attributes=own_attributes
+        )
+        return {
+            version["productId"]: version
+            for version in versions
+        }
+
+ def get_last_version_by_product_id(
+ self,
+ project_name,
+ product_id,
+ active=True,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query last version entity by product id.
+
+ Args:
+            project_name (str): Project where to look for versions.
+            product_id (str): Product id.
+            active (Optional[bool]): Receive active/inactive entities.
+                Both are returned when 'None' is passed.
+            fields (Optional[Iterable[str]]): Fields to be queried
+                for versions.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict[str, Any], None]: Queried version entity or None.
+ """
+
+ versions = self.get_versions(
+ project_name,
+ product_ids=[product_id],
+ latest=True,
+ active=active,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for version in versions:
+ return version
+ return None
+
+ def get_last_version_by_product_name(
+ self,
+ project_name,
+ product_name,
+ folder_id,
+ active=True,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query last version entity by product name and folder id.
+
+ Args:
+            project_name (str): Project where to look for versions.
+            product_name (str): Product name.
+            folder_id (str): Folder id.
+            active (Optional[bool]): Receive active/inactive entities.
+                Both are returned when 'None' is passed.
+            fields (Optional[Iterable[str]]): Fields to be queried
+                for versions.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict[str, Any], None]: Queried version entity or None.
+ """
+
+ if not folder_id:
+ return None
+
+ product = self.get_product_by_name(
+ project_name, product_name, folder_id, fields=["_id"]
+ )
+ if not product:
+ return None
+ return self.get_last_version_by_product_id(
+ project_name,
+ product["id"],
+ active=active,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+
+ def version_is_latest(self, project_name, version_id):
+ """Is version latest from a product.
+
+ Args:
+            project_name (str): Project where to look for the version.
+ version_id (str): Version id.
+
+ Returns:
+ bool: Version is latest or not.
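+
+        Example:
+            Illustrative sketch; the version id is made up and 'con'
+            stands for a connection object.
+
+            >>> if not con.version_is_latest(
+            ...     "demo_project", "59a212c0d2e211eda0e20242ac120002"
+            ... ):
+            ...     print("A newer version exists")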
+ """
+
+ query = GraphQlQuery("VersionIsLatest")
+ project_name_var = query.add_variable(
+ "projectName", "String!", project_name
+ )
+ version_id_var = query.add_variable(
+ "versionId", "String!", version_id
+ )
+ project_query = query.add_field("project")
+ project_query.set_filter("name", project_name_var)
+ version_query = project_query.add_field("version")
+ version_query.set_filter("id", version_id_var)
+ product_query = version_query.add_field("product")
+ latest_version_query = product_query.add_field("latestVersion")
+ latest_version_query.add_field("id")
+
+ parsed_data = query.query(self)
+ latest_version = (
+ parsed_data["project"]["version"]["product"]["latestVersion"]
+ )
+ return latest_version["id"] == version_id
+
+ def get_representations(
+ self,
+ project_name,
+ representation_ids=None,
+ representation_names=None,
+ version_ids=None,
+ names_by_version_ids=None,
+ active=True,
+ fields=None,
+ own_attributes=False
+ ):
+ """Get representation entities based on passed filters from server.
+
+ Todos:
+            Add a separate function for 'names_by_version_ids' filtering,
+                because it can't be combined with the other filters.
+
+ Args:
+ project_name (str): Name of project where to look for versions.
+ representation_ids (Optional[Iterable[str]]): Representation ids
+ used for representation filtering.
+ representation_names (Optional[Iterable[str]]): Representation
+ names used for representation filtering.
+ version_ids (Optional[Iterable[str]]): Version ids used for
+ representation filtering. Versions are parents of
+ representations.
+            names_by_version_ids (Optional[dict[str, Iterable[str]]]): Find
+                representations by names and version ids. This filter
+                discards all other filters.
+ active (Optional[bool]): Receive active/inactive entities.
+ Both are returned when 'None' is passed.
+ fields (Optional[Iterable[str]]): Fields to be queried for
+ representation. All possible fields are returned if 'None' is
+ passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Generator[dict[str, Any]]: Queried representation entities.
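+
+        Example:
+            Illustrative sketch of the 'names_by_version_ids' filter; ids
+            are made up and 'con' stands for a connection object.
+
+            >>> names_by_version_ids = {
+            ...     "59a212c0d2e211eda0e20242ac120002": {"exr", "thumbnail"}
+            ... }
+            >>> repres = list(con.get_representations(
+            ...     "demo_project",
+            ...     names_by_version_ids=names_by_version_ids))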
+ """
+
+ if not fields:
+ fields = self.get_default_fields_for_type("representation")
+ fields = set(fields)
+
+ use_rest = False
+ if "data" in fields:
+ use_rest = True
+ fields = {"id"}
+
+ if active is not None:
+ fields.add("active")
+
+ if own_attributes:
+ fields.add("ownAttrib")
+
+ filters = {
+ "projectName": project_name
+ }
+
+ if representation_ids is not None:
+ representation_ids = set(representation_ids)
+ if not representation_ids:
+ return
+ filters["representationIds"] = list(representation_ids)
+
+        version_ids_filter = None
+        representation_names_filter = None
+        if names_by_version_ids is not None:
+            version_ids_filter = set()
+            representation_names_filter = set()
+            for version_id, names in names_by_version_ids.items():
+                version_ids_filter.add(version_id)
+                representation_names_filter |= set(names)
+
+            if not version_ids_filter or not representation_names_filter:
+                return
+
+        else:
+            if representation_names is not None:
+                representation_names_filter = set(representation_names)
+                if not representation_names_filter:
+                    return
+
+            if version_ids is not None:
+                version_ids_filter = set(version_ids)
+                if not version_ids_filter:
+                    return
+
+        if version_ids_filter:
+            filters["versionIds"] = list(version_ids_filter)
+
+        if representation_names_filter:
+            filters["representationNames"] = list(representation_names_filter)
+
+ query = representations_graphql_query(fields)
+
+ for attr, filter_value in filters.items():
+ query.set_variable_value(attr, filter_value)
+
+ for parsed_data in query.continuous_query(self):
+ for repre in parsed_data["project"]["representations"]:
+ if active is not None and active is not repre["active"]:
+ continue
+
+ if use_rest:
+ repre = self.get_rest_representation(
+ project_name, repre["id"]
+ )
+
+ if "context" in repre:
+ orig_context = repre["context"]
+ context = {}
+ if orig_context and orig_context != "null":
+ context = json.loads(orig_context)
+ repre["context"] = context
+
+ if own_attributes:
+ fill_own_attribs(repre)
+ yield repre
+
+ def get_representation_by_id(
+ self,
+ project_name,
+ representation_id,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query representation entity from server based on id filter.
+
+ Args:
+ project_name (str): Project where to look for representation.
+ representation_id (str): Id of representation.
+ fields (Optional[Iterable[str]]): fields to be queried
+ for representations.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict[str, Any], None]: Queried representation entity or None.
+ """
+
+ representations = self.get_representations(
+ project_name,
+ representation_ids=[representation_id],
+ active=None,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for representation in representations:
+ return representation
+ return None
+
+ def get_representation_by_name(
+ self,
+ project_name,
+ representation_name,
+ version_id,
+ fields=None,
+ own_attributes=False
+ ):
+ """Query representation entity by name and version id.
+
+ Args:
+ project_name (str): Project where to look for representation.
+ representation_name (str): Representation name.
+ version_id (str): Version id.
+ fields (Optional[Iterable[str]]): fields to be queried
+ for representations.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict[str, Any], None]: Queried representation entity or None.
+ """
+
+ representations = self.get_representations(
+ project_name,
+ representation_names=[representation_name],
+ version_ids=[version_id],
+ active=None,
+ fields=fields,
+ own_attributes=own_attributes
+ )
+ for representation in representations:
+ return representation
+ return None
+
+ def get_representations_parents(self, project_name, representation_ids):
+ """Find representations parents by representation id.
+
+ Representation parent entities up to project.
+
+ Args:
+ project_name (str): Project where to look for entities.
+ representation_ids (Iterable[str]): Representation ids.
+
+ Returns:
+ dict[str, RepresentationParents]: Parent entities by
+ representation id.
+ """
+
+ if not representation_ids:
+ return {}
+
+ project = self.get_project(project_name)
+ repre_ids = set(representation_ids)
+ output = {
+ repre_id: RepresentationParents(None, None, None, None)
+ for repre_id in representation_ids
+ }
+
+ version_fields = self.get_default_fields_for_type("version")
+ product_fields = self.get_default_fields_for_type("product")
+ folder_fields = self.get_default_fields_for_type("folder")
+
+ query = representations_parents_qraphql_query(
+ version_fields, product_fields, folder_fields
+ )
+ query.set_variable_value("projectName", project_name)
+ query.set_variable_value("representationIds", list(repre_ids))
+
+ parsed_data = query.query(self)
+ for repre in parsed_data["project"]["representations"]:
+ repre_id = repre["id"]
+ version = repre.pop("version")
+ product = version.pop("product")
+ folder = product.pop("folder")
+ output[repre_id] = RepresentationParents(
+ version, product, folder, project
+ )
+
+ return output
+
+ def get_representation_parents(self, project_name, representation_id):
+ """Find representation parents by representation id.
+
+ Representation parent entities up to project.
+
+ Args:
+ project_name (str): Project where to look for entities.
+ representation_id (str): Representation id.
+
+ Returns:
+ RepresentationParents: Representation parent entities.
+ """
+
+ if not representation_id:
+ return None
+
+ parents_by_repre_id = self.get_representations_parents(
+ project_name, [representation_id]
+ )
+ return parents_by_repre_id[representation_id]
+
+ def get_repre_ids_by_context_filters(
+ self,
+ project_name,
+ context_filters,
+ representation_names=None,
+ version_ids=None
+ ):
+ """Find representation ids which match passed context filters.
+
+        Each representation has a context integrated on the representation
+        entity in the database. The context may contain project, folder,
+        task name or product name, product type and many more. This
+        implementation gives the option to quickly filter representations
+        based on the representation data in the database.
+
+        Context filters have a defined structure. To define a filter of
+        a nested subfield use dot '.' as a delimiter (for example
+        'task.name'). Filter values can be regex filters; 'str' or
+        're.Pattern' values can be used.
+
+ Args:
+ project_name (str): Project where to look for representations.
+ context_filters (dict[str, list[str]]): Filters of context fields.
+ representation_names (Optional[Iterable[str]]): Representation
+ names, can be used as additional filter for representations
+ by their names.
+ version_ids (Optional[Iterable[str]]): Version ids, can be used
+ as additional filter for representations by their parent ids.
+
+ Returns:
+ list[str]: Representation ids that match passed filters.
+
+ Example:
+ The function returns just representation ids so if entities are
+                required for functionality they must be queried afterwards by
+ their ids.
+ >>> project_name = "testProject"
+ >>> filters = {
+ ... "task.name": ["[aA]nimation"],
+ ... "product": [".*[Mm]ain"]
+ ... }
+ >>> repre_ids = get_repre_ids_by_context_filters(
+ ... project_name, filters)
+ >>> repres = get_representations(project_name, repre_ids)
+ """
+
+ if not isinstance(context_filters, dict):
+ raise TypeError(
+ "Expected 'dict' got {}".format(str(type(context_filters)))
+ )
+
+ filter_body = {}
+ if representation_names is not None:
+ if not representation_names:
+ return []
+ filter_body["names"] = list(set(representation_names))
+
+ if version_ids is not None:
+ if not version_ids:
+ return []
+ filter_body["versionIds"] = list(set(version_ids))
+
+ body_context_filters = []
+ for key, filters in context_filters.items():
+ if not isinstance(filters, (set, list, tuple)):
+ raise TypeError(
+ "Expected 'set', 'list', 'tuple' got {}".format(
+ str(type(filters))))
+
+ new_filters = set()
+ for filter_value in filters:
+ if isinstance(filter_value, PatternType):
+ filter_value = filter_value.pattern
+ new_filters.add(filter_value)
+
+ body_context_filters.append({
+ "key": key,
+ "values": list(new_filters)
+ })
+
+ response = self.post(
+ "projects/{}/repreContextFilter".format(project_name),
+ context=body_context_filters,
+ **filter_body
+ )
+ response.raise_for_status()
+ return response.data["ids"]
+
+ def get_workfiles_info(
+ self,
+ project_name,
+ workfile_ids=None,
+ task_ids=None,
+ paths=None,
+ fields=None,
+ own_attributes=False
+ ):
+ """Workfile info entities by passed filters.
+
+ Args:
+ project_name (str): Project under which the entity is located.
+ workfile_ids (Optional[Iterable[str]]): Workfile ids.
+ task_ids (Optional[Iterable[str]]): Task ids.
+ paths (Optional[Iterable[str]]): Rootless workfiles paths.
+ fields (Optional[Iterable[str]]): Fields to be queried for
+                workfile info. All possible fields are returned if 'None' is
+ passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+            Generator[dict[str, Any]]: Queried workfile info entities.
+ """
+
+ filters = {"projectName": project_name}
+ if task_ids is not None:
+ task_ids = set(task_ids)
+ if not task_ids:
+ return
+ filters["taskIds"] = list(task_ids)
+
+ if paths is not None:
+ paths = set(paths)
+ if not paths:
+ return
+ filters["paths"] = list(paths)
+
+ if workfile_ids is not None:
+ workfile_ids = set(workfile_ids)
+ if not workfile_ids:
+ return
+ filters["workfileIds"] = list(workfile_ids)
+
+ if not fields:
+ fields = DEFAULT_WORKFILE_INFO_FIELDS
+ fields = set(fields)
+ if own_attributes:
+ fields.add("ownAttrib")
+
+ query = workfiles_info_graphql_query(fields)
+
+ for attr, filter_value in filters.items():
+ query.set_variable_value(attr, filter_value)
+
+ for parsed_data in query.continuous_query(self):
+ for workfile_info in parsed_data["project"]["workfiles"]:
+ if own_attributes:
+ fill_own_attribs(workfile_info)
+ yield workfile_info
+
+ def get_workfile_info(
+ self, project_name, task_id, path, fields=None, own_attributes=False
+ ):
+ """Workfile info entity by task id and workfile path.
+
+ Args:
+ project_name (str): Project under which the entity is located.
+ task_id (str): Task id.
+ path (str): Rootless workfile path.
+ fields (Optional[Iterable[str]]): Fields to be queried for
+                workfile info. All possible fields are returned if 'None' is
+ passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict[str, Any], None]: Workfile info entity or None.
+ """
+
+ if not task_id or not path:
+ return None
+
+ for workfile_info in self.get_workfiles_info(
+ project_name,
+ task_ids=[task_id],
+ paths=[path],
+ fields=fields,
+ own_attributes=own_attributes
+ ):
+ return workfile_info
+ return None
+
+ def get_workfile_info_by_id(
+ self, project_name, workfile_id, fields=None, own_attributes=False
+ ):
+ """Workfile info entity by id.
+
+ Args:
+ project_name (str): Project under which the entity is located.
+ workfile_id (str): Workfile info id.
+ fields (Optional[Iterable[str]]): Fields to be queried for
+                workfile info. All possible fields are returned if 'None' is
+ passed.
+ own_attributes (Optional[bool]): Attribute values that are
+ not explicitly set on entity will have 'None' value.
+
+ Returns:
+ Union[dict[str, Any], None]: Workfile info entity or None.
+ """
+
+ if not workfile_id:
+ return None
+
+ for workfile_info in self.get_workfiles_info(
+ project_name,
+ workfile_ids=[workfile_id],
+ fields=fields,
+ own_attributes=own_attributes
+ ):
+ return workfile_info
+ return None
+
+ def get_thumbnail(
+ self, project_name, entity_type, entity_id, thumbnail_id=None
+ ):
+ """Get thumbnail from server.
+
+        Permissions of thumbnails are related to entities, so thumbnails must
+        be queried per entity. Therefore an entity type and entity id are
+        required to be passed.
+
+        If a thumbnail id is passed, the logic can look into locally cached
+        thumbnails before calling the server, which can reduce loading time.
+        If a thumbnail id is not passed, the thumbnail is always downloaded,
+        even if it is already cached.
+
+ Notes:
+ It is recommended to use one of prepared entity type specific
+ methods 'get_folder_thumbnail', 'get_version_thumbnail' or
+ 'get_workfile_thumbnail'.
+            We recommend passing the thumbnail id if you have access to it.
+                Each entity that allows thumbnails has a 'thumbnailId' field,
+                so it can be queried.
+
+ Args:
+ project_name (str): Project under which the entity is located.
+ entity_type (str): Entity type which passed entity id represents.
+ entity_id (str): Entity id for which thumbnail should be returned.
+ thumbnail_id (Optional[str]): Prepared thumbnail id from entity.
+ Used only to check if thumbnail was already cached.
+
+ Returns:
+ Union[str, None]: Path to downloaded thumbnail or none if entity
+ does not have any (or if user does not have permissions).
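+
+        Example:
+            Illustrative sketch; ids are made up and 'con' stands for
+            a connection object.
+
+            >>> thumbnail_path = con.get_thumbnail(
+            ...     "demo_project",
+            ...     "folder",
+            ...     "b1df109676db11ed8e8c6c9466b19aa8",
+            ...     thumbnail_id="9fd8706ec2d211ed8e8c6c9466b19aa8")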
+ """
+
+        # Look for the thumbnail in the cache and return the path if found
+ filepath = self._thumbnail_cache.get_thumbnail_filepath(
+ project_name, thumbnail_id
+ )
+ if filepath:
+ return filepath
+
+ if entity_type in (
+ "folder",
+ "version",
+ "workfile",
+ ):
+ entity_type += "s"
+
+ # Receive thumbnail content from server
+ result = self.raw_get("projects/{}/{}/{}/thumbnail".format(
+ project_name,
+ entity_type,
+ entity_id
+ ))
+
+ if result.content_type is None:
+ return None
+
+        # The response is expected to contain a thumbnail id, otherwise the
+        # content cannot be cached and a filepath returned
+ thumbnail_id = result.headers.get("X-Thumbnail-Id")
+ if thumbnail_id is None:
+ return None
+
+ # Cache thumbnail and return path
+ return self._thumbnail_cache.store_thumbnail(
+ project_name,
+ thumbnail_id,
+ result.content,
+ result.content_type
+ )
+
+ def get_folder_thumbnail(
+ self, project_name, folder_id, thumbnail_id=None
+ ):
+ """Prepared method to receive thumbnail for folder entity.
+
+ Args:
+ project_name (str): Project under which the entity is located.
+ folder_id (str): Folder id for which thumbnail should be returned.
+ thumbnail_id (Optional[str]): Prepared thumbnail id from entity.
+ Used only to check if thumbnail was already cached.
+
+ Returns:
+ Union[str, None]: Path to downloaded thumbnail or none if entity
+ does not have any (or if user does not have permissions).
+ """
+
+ return self.get_thumbnail(
+ project_name, "folder", folder_id, thumbnail_id
+ )
+
+ def get_version_thumbnail(
+ self, project_name, version_id, thumbnail_id=None
+ ):
+ """Prepared method to receive thumbnail for version entity.
+
+ Args:
+ project_name (str): Project under which the entity is located.
+ version_id (str): Version id for which thumbnail should be
+ returned.
+ thumbnail_id (Optional[str]): Prepared thumbnail id from entity.
+ Used only to check if thumbnail was already cached.
+
+ Returns:
+ Union[str, None]: Path to downloaded thumbnail or none if entity
+ does not have any (or if user does not have permissions).
+ """
+
+ return self.get_thumbnail(
+ project_name, "version", version_id, thumbnail_id
+ )
+
+ def get_workfile_thumbnail(
+ self, project_name, workfile_id, thumbnail_id=None
+ ):
+ """Prepared method to receive thumbnail for workfile entity.
+
+ Args:
+ project_name (str): Project under which the entity is located.
+            workfile_id (str): Workfile id for which thumbnail should be
+ returned.
+ thumbnail_id (Optional[str]): Prepared thumbnail id from entity.
+ Used only to check if thumbnail was already cached.
+
+ Returns:
+ Union[str, None]: Path to downloaded thumbnail or none if entity
+ does not have any (or if user does not have permissions).
+ """
+
+ return self.get_thumbnail(
+ project_name, "workfile", workfile_id, thumbnail_id
+ )
+
+ def _get_thumbnail_mime_type(self, thumbnail_path):
+ """Get thumbnail mime type on thumbnail creation based on source path.
+
+ Args:
+            thumbnail_path (str): Path to thumbnail source file.
+
+ Returns:
+ str: Mime type used for thumbnail creation.
+
+ Raises:
+ ValueError: Mime type cannot be determined.
+ """
+
+ ext = os.path.splitext(thumbnail_path)[-1].lower()
+ if ext == ".png":
+ return "image/png"
+
+ elif ext in (".jpeg", ".jpg"):
+ return "image/jpeg"
+
+ raise ValueError(
+ "Thumbnail source file has unknown extensions {}".format(ext))
+
+ def create_thumbnail(self, project_name, src_filepath, thumbnail_id=None):
+ """Create new thumbnail on server from passed path.
+
+ Args:
+ project_name (str): Project where the thumbnail will be created
+ and can be used.
+ src_filepath (str): Filepath to thumbnail which should be uploaded.
+            thumbnail_id (Optional[str]): Prepared id of thumbnail.
+
+ Returns:
+ str: Created thumbnail id.
+
+ Raises:
+ ValueError: When thumbnail source cannot be processed.
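+
+        Example:
+            Illustrative sketch; the path is made up and 'con' stands for
+            a connection object.
+
+            >>> thumbnail_id = con.create_thumbnail(
+            ...     "demo_project", "/path/to/thumbnail.png")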
+ """
+
+ if not os.path.exists(src_filepath):
+ raise ValueError("Entered filepath does not exist.")
+
+ if thumbnail_id:
+ self.update_thumbnail(
+ project_name,
+ thumbnail_id,
+ src_filepath
+ )
+ return thumbnail_id
+
+ mime_type = self._get_thumbnail_mime_type(src_filepath)
+ with open(src_filepath, "rb") as stream:
+ content = stream.read()
+
+ response = self.raw_post(
+ "projects/{}/thumbnails".format(project_name),
+ headers={"Content-Type": mime_type},
+ data=content
+ )
+ response.raise_for_status()
+ return response.data["id"]
+
+ def update_thumbnail(self, project_name, thumbnail_id, src_filepath):
+ """Change thumbnail content by id.
+
+        Update can also be used to create a new thumbnail.
+
+ Args:
+ project_name (str): Project where the thumbnail will be created
+ and can be used.
+ thumbnail_id (str): Thumbnail id to update.
+ src_filepath (str): Filepath to thumbnail which should be uploaded.
+
+ Raises:
+ ValueError: When thumbnail source cannot be processed.
+ """
+
+ if not os.path.exists(src_filepath):
+ raise ValueError("Entered filepath does not exist.")
+
+ mime_type = self._get_thumbnail_mime_type(src_filepath)
+ with open(src_filepath, "rb") as stream:
+ content = stream.read()
+
+ response = self.raw_put(
+ "projects/{}/thumbnails/{}".format(project_name, thumbnail_id),
+ headers={"Content-Type": mime_type},
+ data=content
+ )
+ response.raise_for_status()
+
+ def create_project(
+ self,
+ project_name,
+ project_code,
+ library_project=False,
+ preset_name=None
+ ):
+ """Create project using Ayon settings.
+
+        This project creation function does not validate the project entity
+        on creation. The project entity is created blindly with only the
+        minimum required information, which is name and code.
+
+ Entered project name must be unique and project must not exist yet.
+
+ Note:
+            This function is here to be OP v4 ready, but in v3 it has more
+            logic to handle. That's why inner imports are in the body.
+
+ Args:
+ project_name (str): New project name. Should be unique.
+ project_code (str): Project's code should be unique too.
+ library_project (Optional[bool]): Project is library project.
+ preset_name (Optional[str]): Name of anatomy preset. Default is
+ used if not passed.
+
+ Raises:
+ ValueError: When project name already exists.
+
+ Returns:
+ dict[str, Any]: Created project entity.
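+
+        Example:
+            Illustrative sketch; 'con' stands for a connection object.
+
+            >>> project = con.create_project(
+            ...     "demo_project", "demo", library_project=False)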
+ """
+
+ if self.get_project(project_name):
+ raise ValueError("Project with name \"{}\" already exists".format(
+ project_name
+ ))
+
+ if not PROJECT_NAME_REGEX.match(project_name):
+ raise ValueError((
+ "Project name \"{}\" contain invalid characters"
+ ).format(project_name))
+
+ preset = self.get_project_anatomy_preset(preset_name)
+
+ result = self.post(
+ "projects",
+ name=project_name,
+ code=project_code,
+ anatomy=preset,
+ library=library_project
+ )
+
+ if result.status != 201:
+ details = "Unknown details ({})".format(result.status)
+ if result.data:
+ details = result.data.get("detail") or details
+ raise ValueError("Failed to create project \"{}\": {}".format(
+ project_name, details
+ ))
+
+ return self.get_project(project_name)
+
+ def update_project(
+ self,
+ project_name,
+ library=None,
+ folder_types=None,
+ task_types=None,
+ link_types=None,
+ statuses=None,
+ tags=None,
+ config=None,
+ attrib=None,
+ data=None,
+ active=None,
+ project_code=None,
+ **changes
+ ):
+ """Update project entity on server.
+
+ Args:
+ project_name (str): Name of project.
+ library (Optional[bool]): Change library state.
+ folder_types (Optional[list[dict[str, Any]]]): Folder type
+ definitions.
+ task_types (Optional[list[dict[str, Any]]]): Task type
+ definitions.
+ link_types (Optional[list[dict[str, Any]]]): Link type
+ definitions.
+ statuses (Optional[list[dict[str, Any]]]): Status definitions.
+ tags (Optional[list[dict[str, Any]]]): List of tags available to
+ set on entities.
+            config (Optional[dict[str, Any]]): Project anatomy config
+ with templates and roots.
+ attrib (Optional[dict[str, Any]]): Project attributes to change.
+            data (Optional[dict[str, Any]]): Custom data of a project. This
+                value will fully override the existing project data.
+ active (Optional[bool]): Change active state of a project.
+ project_code (Optional[str]): Change project code. Not recommended
+ during production.
+ **changes: Other changed keys based on Rest API documentation.
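+
+        Example:
+            Illustrative sketch; the attribute name is an assumption and
+            'con' stands for a connection object.
+
+            >>> con.update_project(
+            ...     "demo_project",
+            ...     attrib={"fps": 24})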
+ """
+
+ changes.update({
+ key: value
+ for key, value in (
+ ("library", library),
+ ("folderTypes", folder_types),
+ ("taskTypes", task_types),
+ ("linkTypes", link_types),
+ ("statuses", statuses),
+ ("tags", tags),
+ ("config", config),
+ ("attrib", attrib),
+ ("data", data),
+ ("active", active),
+ ("code", project_code),
+ )
+ if value is not None
+ })
+ response = self.patch(
+ "projects/{}".format(project_name),
+ **changes
+ )
+ response.raise_for_status()
+
+ def delete_project(self, project_name):
+ """Delete project from server.
+
+        This will completely remove the project from the server with no way
+        to revert.
+
+ Args:
+ project_name (str): Project name that will be removed.
+ """
+
+ if not self.get_project(project_name):
+ raise ValueError("Project with name \"{}\" was not found".format(
+ project_name
+ ))
+
+ result = self.delete("projects/{}".format(project_name))
+ if result.status_code != 204:
+ raise ValueError(
+ "Failed to delete project \"{}\". {}".format(
+ project_name, result.data["detail"]
+ )
+ )
+
+ # --- Links ---
+ def get_full_link_type_name(self, link_type_name, input_type, output_type):
+ """Calculate full link type name used for query from server.
+
+ Args:
+ link_type_name (str): Type of link.
+ input_type (str): Input entity type of link.
+ output_type (str): Output entity type of link.
+
+ Returns:
+ str: Full name of link type used for query from server.
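+
+        Example:
+            >>> # 'con' stands for a connection object
+            >>> con.get_full_link_type_name(
+            ...     "reference", "folder", "folder")
+            'reference|folder|folder'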
+ """
+
+ return "|".join([link_type_name, input_type, output_type])
+
+ def get_link_types(self, project_name):
+ """All link types available on a project.
+
+ Example output:
+ [
+ {
+ "name": "reference|folder|folder",
+ "link_type": "reference",
+ "input_type": "folder",
+ "output_type": "folder",
+ "data": {}
+ }
+ ]
+
+ Args:
+ project_name (str): Name of project where to look for link types.
+
+ Returns:
+ list[dict[str, Any]]: Link types available on project.
+ """
+
+ response = self.get("projects/{}/links/types".format(project_name))
+ response.raise_for_status()
+ return response.data["types"]
+
+ def get_link_type(
+ self, project_name, link_type_name, input_type, output_type
+ ):
+ """Get link type data.
+
+        There is no dedicated REST endpoint to get a single link type,
+ so method 'get_link_types' is used.
+
+ Example output:
+ {
+ "name": "reference|folder|folder",
+ "link_type": "reference",
+ "input_type": "folder",
+ "output_type": "folder",
+ "data": {}
+ }
+
+ Args:
+ project_name (str): Project where link type is available.
+ link_type_name (str): Name of link type.
+ input_type (str): Input entity type of link.
+ output_type (str): Output entity type of link.
+
+ Returns:
+ Union[None, dict[str, Any]]: Link type information.
+ """
+
+ full_type_name = self.get_full_link_type_name(
+ link_type_name, input_type, output_type
+ )
+ for link_type in self.get_link_types(project_name):
+ if link_type["name"] == full_type_name:
+ return link_type
+ return None
+
+ def create_link_type(
+ self, project_name, link_type_name, input_type, output_type, data=None
+ ):
+ """Create or update link type on server.
+
+ Warning:
+            Because PUT is used for creation, calling this method on an
+                existing link type will update it.
+
+ Args:
+ project_name (str): Project where link type is created.
+ link_type_name (str): Name of link type.
+ input_type (str): Input entity type of link.
+ output_type (str): Output entity type of link.
+ data (Optional[dict[str, Any]]): Additional data related to link.
+
+ Raises:
+ HTTPRequestError: Server error happened.
+ """
+
+ if data is None:
+ data = {}
+ full_type_name = self.get_full_link_type_name(
+ link_type_name, input_type, output_type
+ )
+ response = self.put(
+ "projects/{}/links/types/{}".format(project_name, full_type_name),
+ **data
+ )
+ response.raise_for_status()
+
+ def delete_link_type(
+ self, project_name, link_type_name, input_type, output_type
+ ):
+ """Remove link type from project.
+
+ Args:
+ project_name (str): Project where link type is created.
+ link_type_name (str): Name of link type.
+ input_type (str): Input entity type of link.
+ output_type (str): Output entity type of link.
+
+ Raises:
+ HTTPRequestError: Server error happened.
+ """
+
+ full_type_name = self.get_full_link_type_name(
+ link_type_name, input_type, output_type
+ )
+ response = self.delete(
+ "projects/{}/links/types/{}".format(project_name, full_type_name))
+ response.raise_for_status()
+
+ def make_sure_link_type_exists(
+ self, project_name, link_type_name, input_type, output_type, data=None
+ ):
+ """Make sure link type exists on a project.
+
+ Args:
+ project_name (str): Name of project.
+ link_type_name (str): Name of link type.
+ input_type (str): Input entity type of link.
+ output_type (str): Output entity type of link.
+ data (Optional[dict[str, Any]]): Link type related data.
+ """
+
+ link_type = self.get_link_type(
+ project_name, link_type_name, input_type, output_type)
+ if (
+ link_type
+ and (data is None or data == link_type["data"])
+ ):
+ return
+ self.create_link_type(
+ project_name, link_type_name, input_type, output_type, data
+ )
+
+ def create_link(
+ self,
+ project_name,
+ link_type_name,
+ input_id,
+ input_type,
+ output_id,
+ output_type
+ ):
+ """Create link between 2 entities.
+
+        Link has a type which must already exist on a project.
+
+ Example output:
+ {
+ "id": "59a212c0d2e211eda0e20242ac120002"
+ }
+
+ Args:
+ project_name (str): Project where the link is created.
+ link_type_name (str): Type of link.
+ input_id (str): Id of input entity.
+ input_type (str): Entity type of input entity.
+ output_id (str): Id of output entity.
+ output_type (str): Entity type of output entity.
+
+ Returns:
+ dict[str, str]: Information about link.
+
+ Raises:
+ HTTPRequestError: Server error happened.
+ """
+
+ full_link_type_name = self.get_full_link_type_name(
+ link_type_name, input_type, output_type)
+ response = self.post(
+ "projects/{}/links".format(project_name),
+ link=full_link_type_name,
+ input=input_id,
+ output=output_id
+ )
+ response.raise_for_status()
+ return response.data
+
+ def delete_link(self, project_name, link_id):
+ """Remove link by id.
+
+ Args:
+ project_name (str): Project where link exists.
+ link_id (str): Id of link.
+
+ Raises:
+ HTTPRequestError: Server error happened.
+ """
+
+ response = self.delete(
+ "projects/{}/links/{}".format(project_name, link_id)
+ )
+ response.raise_for_status()
+
+ def _prepare_link_filters(self, filters, link_types, link_direction):
+ """Add links filters for GraphQl queries.
+
+ Args:
+ filters (dict[str, Any]): Object where filters will be added.
+ link_types (Union[Iterable[str], None]): Link types filters.
+ link_direction (Union[Literal["in", "out"], None]): Direction of
+ link "in", "out" or 'None' for both.
+
+ Returns:
+            bool: Filters are valid and the server query can proceed.
+ """
+
+ if link_types is not None:
+ link_types = set(link_types)
+ if not link_types:
+ return False
+ filters["linkTypes"] = list(link_types)
+
+ if link_direction is not None:
+ if link_direction not in ("in", "out"):
+ return False
+ filters["linkDirection"] = link_direction
+ return True
+
+ def get_entities_links(
+ self,
+ project_name,
+ entity_type,
+ entity_ids=None,
+ link_types=None,
+ link_direction=None
+ ):
+ """Helper method to get links from server for entity types.
+
+ Example output:
+ [
+ {
+ "id": "59a212c0d2e211eda0e20242ac120002",
+ "linkType": "reference",
+ "description": "reference link between folders",
+ "projectName": "my_project",
+ "author": "frantadmin",
+ "entityId": "b1df109676db11ed8e8c6c9466b19aa8",
+ "entityType": "folder",
+ "direction": "out"
+ },
+ ...
+ ]
+
+ Args:
+ project_name (str): Project where links are.
+ entity_type (Literal["folder", "task", "product",
+ "version", "representations"]): Entity type.
+ entity_ids (Optional[Iterable[str]]): Ids of entities for which
+ links should be received.
+ link_types (Optional[Iterable[str]]): Link type filters.
+ link_direction (Optional[Literal["in", "out"]]): Link direction
+ filter.
+
+ Returns:
+ dict[str, list[dict[str, Any]]]: Link info by entity ids.
+ """
+
+ if entity_type == "folder":
+ query_func = folders_graphql_query
+ id_filter_key = "folderIds"
+ project_sub_key = "folders"
+ elif entity_type == "task":
+ query_func = tasks_graphql_query
+ id_filter_key = "taskIds"
+ project_sub_key = "tasks"
+ elif entity_type == "product":
+ query_func = products_graphql_query
+ id_filter_key = "productIds"
+ project_sub_key = "products"
+ elif entity_type == "version":
+ query_func = versions_graphql_query
+ id_filter_key = "versionIds"
+ project_sub_key = "versions"
+ elif entity_type == "representation":
+ query_func = representations_graphql_query
+ id_filter_key = "representationIds"
+ project_sub_key = "representations"
+ else:
+ raise ValueError("Unknown type \"{}\". Expected {}".format(
+ entity_type,
+ ", ".join(
+ ("folder", "task", "product", "version", "representation")
+ )
+ ))
+
+ output = collections.defaultdict(list)
+ filters = {
+ "projectName": project_name
+ }
+ if entity_ids is not None:
+ entity_ids = set(entity_ids)
+ if not entity_ids:
+ return output
+ filters[id_filter_key] = list(entity_ids)
+
+ if not self._prepare_link_filters(filters, link_types, link_direction):
+ return output
+
+ query = query_func({"id", "links"})
+ for attr, filter_value in filters.items():
+ query.set_variable_value(attr, filter_value)
+
+ for parsed_data in query.continuous_query(self):
+ for entity in parsed_data["project"][project_sub_key]:
+ entity_id = entity["id"]
+ output[entity_id].extend(entity["links"])
+ return output
+
+ def get_folders_links(
+ self,
+ project_name,
+ folder_ids=None,
+ link_types=None,
+ link_direction=None
+ ):
+ """Query folders links from server.
+
+ Args:
+ project_name (str): Project where links are.
+ folder_ids (Optional[Iterable[str]]): Ids of folders for which
+ links should be received.
+ link_types (Optional[Iterable[str]]): Link type filters.
+ link_direction (Optional[Literal["in", "out"]]): Link direction
+ filter.
+
+ Returns:
+ dict[str, list[dict[str, Any]]]: Link info by folder ids.
+ """
+
+ return self.get_entities_links(
+ project_name, "folder", folder_ids, link_types, link_direction
+ )
+
+ def get_folder_links(
+ self,
+ project_name,
+ folder_id,
+ link_types=None,
+ link_direction=None
+ ):
+ """Query folder links from server.
+
+ Args:
+ project_name (str): Project where links are.
+ folder_id (str): Folder id for which links should be received.
+ link_types (Optional[Iterable[str]]): Link type filters.
+ link_direction (Optional[Literal["in", "out"]]): Link direction
+ filter.
+
+ Returns:
+ list[dict[str, Any]]: Link info of folder.
+ """
+
+ return self.get_folders_links(
+ project_name, [folder_id], link_types, link_direction
+ )[folder_id]
+
+ def get_tasks_links(
+ self,
+ project_name,
+ task_ids=None,
+ link_types=None,
+ link_direction=None
+ ):
+ """Query tasks links from server.
+
+ Args:
+ project_name (str): Project where links are.
+ task_ids (Optional[Iterable[str]]): Ids of tasks for which
+ links should be received.
+ link_types (Optional[Iterable[str]]): Link type filters.
+ link_direction (Optional[Literal["in", "out"]]): Link direction
+ filter.
+
+ Returns:
+ dict[str, list[dict[str, Any]]]: Link info by task ids.
+ """
+
+ return self.get_entities_links(
+ project_name, "task", task_ids, link_types, link_direction
+ )
+
+ def get_task_links(
+ self,
+ project_name,
+ task_id,
+ link_types=None,
+ link_direction=None
+ ):
+ """Query task links from server.
+
+ Args:
+ project_name (str): Project where links are.
+ task_id (str): Task id for which links should be received.
+ link_types (Optional[Iterable[str]]): Link type filters.
+ link_direction (Optional[Literal["in", "out"]]): Link direction
+ filter.
+
+ Returns:
+ list[dict[str, Any]]: Link info of task.
+ """
+
+ return self.get_tasks_links(
+ project_name, [task_id], link_types, link_direction
+ )[task_id]
+
+ def get_products_links(
+ self,
+ project_name,
+ product_ids=None,
+ link_types=None,
+ link_direction=None
+ ):
+ """Query products links from server.
+
+ Args:
+ project_name (str): Project where links are.
+ product_ids (Optional[Iterable[str]]): Ids of products for which
+ links should be received.
+ link_types (Optional[Iterable[str]]): Link type filters.
+ link_direction (Optional[Literal["in", "out"]]): Link direction
+ filter.
+
+ Returns:
+ dict[str, list[dict[str, Any]]]: Link info by product ids.
+ """
+
+ return self.get_entities_links(
+ project_name, "product", product_ids, link_types, link_direction
+ )
+
+ def get_product_links(
+ self,
+ project_name,
+ product_id,
+ link_types=None,
+ link_direction=None
+ ):
+ """Query product links from server.
+
+ Args:
+ project_name (str): Project where links are.
+ product_id (str): Product id for which links should be received.
+ link_types (Optional[Iterable[str]]): Link type filters.
+ link_direction (Optional[Literal["in", "out"]]): Link direction
+ filter.
+
+ Returns:
+ list[dict[str, Any]]: Link info of product.
+ """
+
+ return self.get_products_links(
+ project_name, [product_id], link_types, link_direction
+ )[product_id]
+
+ def get_versions_links(
+ self,
+ project_name,
+ version_ids=None,
+ link_types=None,
+ link_direction=None
+ ):
+ """Query versions links from server.
+
+ Args:
+ project_name (str): Project where links are.
+ version_ids (Optional[Iterable[str]]): Ids of versions for which
+ links should be received.
+ link_types (Optional[Iterable[str]]): Link type filters.
+ link_direction (Optional[Literal["in", "out"]]): Link direction
+ filter.
+
+ Returns:
+ dict[str, list[dict[str, Any]]]: Link info by version ids.
+ """
+
+ return self.get_entities_links(
+ project_name, "version", version_ids, link_types, link_direction
+ )
+
+ def get_version_links(
+ self,
+ project_name,
+ version_id,
+ link_types=None,
+ link_direction=None
+ ):
+ """Query version links from server.
+
+ Args:
+ project_name (str): Project where links are.
+ version_id (str): Version id for which links should be received.
+ link_types (Optional[Iterable[str]]): Link type filters.
+ link_direction (Optional[Literal["in", "out"]]): Link direction
+ filter.
+
+ Returns:
+ list[dict[str, Any]]: Link info of version.
+ """
+
+ return self.get_versions_links(
+ project_name, [version_id], link_types, link_direction
+ )[version_id]
+
+ def get_representations_links(
+ self,
+ project_name,
+ representation_ids=None,
+ link_types=None,
+ link_direction=None
+ ):
+ """Query representations links from server.
+
+ Args:
+ project_name (str): Project where links are.
+ representation_ids (Optional[Iterable[str]]): Ids of
+ representations for which links should be received.
+ link_types (Optional[Iterable[str]]): Link type filters.
+ link_direction (Optional[Literal["in", "out"]]): Link direction
+ filter.
+
+ Returns:
+ dict[str, list[dict[str, Any]]]: Link info by representation ids.
+ """
+
+ return self.get_entities_links(
+ project_name,
+ "representation",
+ representation_ids,
+ link_types,
+ link_direction
+ )
+
+ def get_representation_links(
+ self,
+ project_name,
+ representation_id,
+ link_types=None,
+ link_direction=None
+ ):
+ """Query representation links from server.
+
+ Args:
+ project_name (str): Project where links are.
+ representation_id (str): Representation id for which links
+ should be received.
+ link_types (Optional[Iterable[str]]): Link type filters.
+ link_direction (Optional[Literal["in", "out"]]): Link direction
+ filter.
+
+ Returns:
+ list[dict[str, Any]]: Link info of representation.
+ """
+
+ return self.get_representations_links(
+ project_name, [representation_id], link_types, link_direction
+ )[representation_id]
+
+ # --- Batch operations processing ---
+ def send_batch_operations(
+ self,
+ project_name,
+ operations,
+ can_fail=False,
+ raise_on_fail=True
+ ):
+ """Post multiple CRUD operations to server.
+
+        When multiple changes should be made on the server side this is the
+        best way to go. It is possible to pass multiple operations that are
+        processed on the server side and applied as a transaction.
+
+ Args:
+ project_name (str): On which project should be operations
+ processed.
+ operations (list[dict[str, Any]]): Operations to be processed.
+ can_fail (Optional[bool]): Server will try to process all
+ operations even if one of them fails.
+ raise_on_fail (Optional[bool]): Raise exception if an operation
+ fails. You can handle failed operations on your own
+ when set to 'False'.
+
+ Raises:
+ ValueError: Operations can't be converted to json string.
+ FailedOperations: When output does not contain server operations
+ or 'raise_on_fail' is enabled and any operation fails.
+
+ Returns:
+ list[dict[str, Any]]: Operations result with process details.
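+
+        Example:
+            Illustrative sketch of an operations payload; the exact keys of
+            each operation follow the server's Rest API documentation, the
+            ids are made up and 'con' stands for a connection object.
+
+            >>> operations = [
+            ...     {
+            ...         "type": "update",
+            ...         "entityType": "folder",
+            ...         "entityId": "b1df109676db11ed8e8c6c9466b19aa8",
+            ...         "data": {"attrib": {"fps": 24}},
+            ...     },
+            ... ]
+            >>> results = con.send_batch_operations(
+            ...     "demo_project", operations)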
+ """
+
+ if not operations:
+ return []
+
+ body_by_id = {}
+ operations_body = []
+ for operation in operations:
+ if not operation:
+ continue
+
+ op_id = operation.get("id")
+ if not op_id:
+ op_id = create_entity_id()
+ operation["id"] = op_id
+
+ try:
+ body = json.loads(
+ json.dumps(operation, default=entity_data_json_default)
+ )
+            except Exception:
+ raise ValueError("Couldn't json parse body: {}".format(
+ json.dumps(
+ operation, indent=4, default=failed_json_default
+ )
+ ))
+
+ body_by_id[op_id] = body
+ operations_body.append(body)
+
+ if not operations_body:
+ return []
+
+ result = self.post(
+ "projects/{}/operations".format(project_name),
+ operations=operations_body,
+ canFail=can_fail
+ )
+
+ op_results = result.get("operations")
+ if op_results is None:
+ raise FailedOperations(
+ "Operation failed. Content: {}".format(str(result))
+ )
+
+ if result.get("success") or not raise_on_fail:
+ return op_results
+
+ for op_result in op_results:
+ if not op_result["success"]:
+ operation_id = op_result["id"]
+ raise FailedOperations((
+ "Operation \"{}\" failed with data:\n{}\nDetail: {}."
+ ).format(
+ operation_id,
+ json.dumps(body_by_id[operation_id], indent=4),
+ op_result["detail"],
+ ))
+ return op_results
diff --git a/openpype/vendor/python/common/ayon_api/thumbnails.py b/openpype/vendor/python/common/ayon_api/thumbnails.py
new file mode 100644
index 0000000000..11734ca762
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/thumbnails.py
@@ -0,0 +1,219 @@
+import os
+import time
+import collections
+
+import appdirs
+
+FileInfo = collections.namedtuple(
+ "FileInfo",
+ ("path", "size", "modification_time")
+)
+
+
+class ThumbnailCache:
+ """Cache of thumbnails on local storage.
+
+    Thumbnails are cached into a predefined directory under appdirs. Each
+    project has its own subfolder with thumbnails -> that's because each
+    project has its own thumbnail id validation. File names are thumbnail
+    ids with a matching extension. Extensions are predefined (.png and
+    .jpeg).
+
+    Cache has a cleanup mechanism which is triggered on initialization by
+    default.
+
+    The cleanup has 2 levels:
+    1. soft cleanup which removes all files that are older than 'days_alive'
+    2. max size cleanup which removes the oldest files until the thumbnails
+        folder contains less than 'max_filesize'
+        - this is time consuming so it's not triggered automatically
+
+ Args:
+ cleanup (bool): Trigger soft cleanup (Cleanup expired thumbnails).
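+
+    Example:
+        Illustrative sketch of direct usage; the project name and
+        thumbnail id are made up.
+
+        >>> cache = ThumbnailCache(cleanup=True)
+        >>> path = cache.get_thumbnail_filepath(
+        ...     "demo_project", "9fd8706ec2d211ed8e8c6c9466b19aa8")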
+ """
+
+ # Lifetime of thumbnails (in seconds)
+ # - default 3 days
+ days_alive = 3 * 24 * 60 * 60
+ # Max size of thumbnail directory (in bytes)
+    # - default 2 GiB
+ max_filesize = 2 * 1024 * 1024 * 1024
+
+ def __init__(self, cleanup=True):
+ self._thumbnails_dir = None
+ if cleanup:
+ self.cleanup()
+
+ def get_thumbnails_dir(self):
+ """Root directory where thumbnails are stored.
+
+ Returns:
+ str: Path to thumbnails root.
+ """
+
+ if self._thumbnails_dir is None:
+ directory = appdirs.user_data_dir("AYON", "Ynput")
+ self._thumbnails_dir = os.path.join(directory, "thumbnails")
+ return self._thumbnails_dir
+
+ thumbnails_dir = property(get_thumbnails_dir)
+
+ def get_thumbnails_dir_file_info(self):
+ """Get information about all files in thumbnails directory.
+
+ Returns:
+ List[FileInfo]: List of file information about all files.
+ """
+
+ thumbnails_dir = self.thumbnails_dir
+ files_info = []
+ if not os.path.exists(thumbnails_dir):
+ return files_info
+
+ for root, _, filenames in os.walk(thumbnails_dir):
+ for filename in filenames:
+ path = os.path.join(root, filename)
+ files_info.append(FileInfo(
+ path, os.path.getsize(path), os.path.getmtime(path)
+ ))
+ return files_info
+
+ def get_thumbnails_dir_size(self, files_info=None):
+ """Got full size of thumbnail directory.
+
+ Args:
+ files_info (List[FileInfo]): Prepared file information about
+ files in thumbnail directory.
+
+ Returns:
+ int: File size of all files in thumbnail directory.
+ """
+
+ if files_info is None:
+ files_info = self.get_thumbnails_dir_file_info()
+
+ if not files_info:
+ return 0
+
+ return sum(
+ file_info.size
+ for file_info in files_info
+ )
+
+ def cleanup(self, check_max_size=False):
+ """Cleanup thumbnails directory.
+
+ Args:
+ check_max_size (bool): Also cleanup files to match max size of
+ thumbnails directory.
+ """
+
+ thumbnails_dir = self.get_thumbnails_dir()
+        # Skip if thumbnails dir does not exist yet
+ if not os.path.exists(thumbnails_dir):
+ return
+
+ self._soft_cleanup(thumbnails_dir)
+ if check_max_size:
+ self._max_size_cleanup(thumbnails_dir)
+
+ def _soft_cleanup(self, thumbnails_dir):
+ current_time = time.time()
+ for root, _, filenames in os.walk(thumbnails_dir):
+ for filename in filenames:
+ path = os.path.join(root, filename)
+ modification_time = os.path.getmtime(path)
+ if current_time - modification_time > self.days_alive:
+ os.remove(path)
+
+ def _max_size_cleanup(self, thumbnails_dir):
+ files_info = self.get_thumbnails_dir_file_info()
+ size = self.get_thumbnails_dir_size(files_info)
+ if size < self.max_filesize:
+ return
+
+ sorted_file_info = collections.deque(
+ sorted(files_info, key=lambda item: item.modification_time)
+ )
+ diff = size - self.max_filesize
+ while diff > 0:
+ if not sorted_file_info:
+ break
+
+ file_info = sorted_file_info.popleft()
+ diff -= file_info.size
+ os.remove(file_info.path)
+
+ def get_thumbnail_filepath(self, project_name, thumbnail_id):
+ """Get thumbnail by thumbnail id.
+
+ Args:
+ project_name (str): Name of project.
+ thumbnail_id (str): Thumbnail id.
+
+ Returns:
+ Union[str, None]: Path to thumbnail image or None if thumbnail
+ is not cached yet.
+ """
+
+ if not thumbnail_id:
+ return None
+
+ for ext in (
+ ".png",
+ ".jpeg",
+ ):
+ filepath = os.path.join(
+ self.thumbnails_dir, project_name, thumbnail_id + ext
+ )
+ if os.path.exists(filepath):
+ return filepath
+ return None
+
+ def get_project_dir(self, project_name):
+ """Path to root directory for specific project.
+
+ Args:
+ project_name (str): Name of project for which root directory path
+ should be returned.
+
+ Returns:
+ str: Path to root of project's thumbnails.
+ """
+
+ return os.path.join(self.thumbnails_dir, project_name)
+
+ def make_sure_project_dir_exists(self, project_name):
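+        """Create project's thumbnail directory if it does not exist.
+
+        Returns:
+            str: Path to project's thumbnail directory.
+        """
+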
+ project_dir = self.get_project_dir(project_name)
+ if not os.path.exists(project_dir):
+ os.makedirs(project_dir)
+ return project_dir
+
+ def store_thumbnail(self, project_name, thumbnail_id, content, mime_type):
+ """Store thumbnail to cache folder.
+
+ Args:
+            project_name (str): Project the thumbnail belongs to.
+            thumbnail_id (str): Id of thumbnail.
+            content (bytes): Byte content of thumbnail file.
+            mime_type (str): Mime type of content (e.g. "image/png").
+
+ Returns:
+ str: Path to cached thumbnail image file.
+ """
+
+ if mime_type == "image/png":
+ ext = ".png"
+ elif mime_type == "image/jpeg":
+ ext = ".jpeg"
+ else:
+ raise ValueError(
+ "Unknown mime type for thumbnail \"{}\"".format(mime_type))
+
+ project_dir = self.make_sure_project_dir_exists(project_name)
+ thumbnail_path = os.path.join(project_dir, thumbnail_id + ext)
+ with open(thumbnail_path, "wb") as stream:
+ stream.write(content)
+
+ current_time = time.time()
+ os.utime(thumbnail_path, (current_time, current_time))
+
+ return thumbnail_path
diff --git a/openpype/vendor/python/common/ayon_api/utils.py b/openpype/vendor/python/common/ayon_api/utils.py
new file mode 100644
index 0000000000..69fd8e9b41
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/utils.py
@@ -0,0 +1,471 @@
+import re
+import datetime
+import uuid
+import string
+import platform
+import collections
+try:
+ # Python 3
+ from urllib.parse import urlparse, urlencode
+except ImportError:
+ # Python 2
+ from urlparse import urlparse
+ from urllib import urlencode
+
+import requests
+import unidecode
+
+from .exceptions import UrlError
+
+REMOVED_VALUE = object()
+SLUGIFY_WHITELIST = string.ascii_letters + string.digits
+SLUGIFY_SEP_WHITELIST = " ,./\\;:!|*^#@~+-_="
+
+RepresentationParents = collections.namedtuple(
+ "RepresentationParents",
+ ("version", "product", "folder", "project")
+)
+
+
+def prepare_query_string(key_values):
+ """Prepare data to query string.
+
+ If there are any values a query starting with '?' is returned otherwise
+ an empty string.
+
+ Args:
+ dict[str, Any]: Query values.
+
+ Returns:
+ str: Query string.
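+
+    Example:
+        >>> prepare_query_string({"archived": True, "limit": 10})
+        '?archived=True&limit=10'
+        >>> prepare_query_string({})
+        ''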
+ """
+
+ if not key_values:
+ return ""
+ return "?{}".format(urlencode(key_values))
+
+
+def create_entity_id():
+ return uuid.uuid1().hex
+
+
+def convert_entity_id(entity_id):
+ if not entity_id:
+ return None
+
+ if isinstance(entity_id, uuid.UUID):
+ return entity_id.hex
+
+ try:
+ return uuid.UUID(entity_id).hex
+
+ except (TypeError, ValueError, AttributeError):
+ pass
+ return None
+
+
+def convert_or_create_entity_id(entity_id=None):
+ output = convert_entity_id(entity_id)
+ if output is None:
+ output = create_entity_id()
+ return output
+
+
+def entity_data_json_default(value):
+ if isinstance(value, datetime.datetime):
+ return int(value.timestamp())
+
+ raise TypeError(
+ "Object of type {} is not JSON serializable".format(str(type(value)))
+ )
+
+
+def slugify_string(
+ input_string,
+ separator="_",
+ slug_whitelist=SLUGIFY_WHITELIST,
+ split_chars=SLUGIFY_SEP_WHITELIST,
+ min_length=1,
+ lower=False,
+ make_set=False,
+):
+ """Slugify a text string.
+
+    This function transliterates the input string to ASCII, removes
+    special characters and joins the resulting elements using the
+    specified separator.
+
+ Args:
+ input_string (str): Input string to slugify
+ separator (str): A string used to separate returned elements
+ (default: "_")
+        slug_whitelist (str): Characters allowed in the output
+            (default: ascii letters and digits)
+ split_chars (str): Set of characters used for word splitting
+ (there is a sane default)
+ lower (bool): Convert to lower-case (default: False)
+ make_set (bool): Return "set" object instead of string.
+ min_length (int): Minimal length of an element (word).
+
+ Returns:
+        Union[str, Set[str]]: Slugified string, or a set of slugified
+            parts when 'make_set' is True.
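+
+    Example:
+        >>> slugify_string("Shot 01/Layout (v2)")
+        'Shot_01_Layout_v2'
+        >>> sorted(slugify_string(
+        ...     "Shot 01/Layout (v2)", lower=True, make_set=True))
+        ['01', 'layout', 'shot', 'v2']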
+ """
+
+ tmp_string = unidecode.unidecode(input_string)
+ if lower:
+ tmp_string = tmp_string.lower()
+
+ parts = [
+ # Remove all characters that are not in whitelist
+ re.sub("[^{}]".format(re.escape(slug_whitelist)), "", part)
+ # Split text into part by split characters
+ for part in re.split("[{}]".format(re.escape(split_chars)), tmp_string)
+ ]
+ # Filter text parts by length
+ filtered_parts = [
+ part
+ for part in parts
+ if len(part) >= min_length
+ ]
+ if make_set:
+ return set(filtered_parts)
+ return separator.join(filtered_parts)
+
+
+def failed_json_default(value):
+ return "< Failed value {} > {}".format(type(value), str(value))
+
+
+def prepare_attribute_changes(old_entity, new_entity, replace=False):
+ attrib_changes = {}
+ new_attrib = new_entity.get("attrib")
+ old_attrib = old_entity.get("attrib")
+ if new_attrib is None:
+ if not replace:
+ return attrib_changes
+ new_attrib = {}
+
+ if old_attrib is None:
+ return new_attrib
+
+ for attr, new_attr_value in new_attrib.items():
+ old_attr_value = old_attrib.get(attr)
+ if old_attr_value != new_attr_value:
+ attrib_changes[attr] = new_attr_value
+
+ if replace:
+ for attr in old_attrib:
+ if attr not in new_attrib:
+ attrib_changes[attr] = REMOVED_VALUE
+
+ return attrib_changes
+
+
+def prepare_entity_changes(old_entity, new_entity, replace=False):
+ """Prepare changes of entities."""
+
+ changes = {}
+ for key, new_value in new_entity.items():
+ if key == "attrib":
+ continue
+
+ old_value = old_entity.get(key)
+ if old_value != new_value:
+ changes[key] = new_value
+
+ if replace:
+ for key in old_entity:
+ if key not in new_entity:
+ changes[key] = REMOVED_VALUE
+
+ attr_changes = prepare_attribute_changes(old_entity, new_entity, replace)
+ if attr_changes:
+ changes["attrib"] = attr_changes
+ return changes
+
+
+def _try_parse_url(url):
+ try:
+ return urlparse(url)
+ except BaseException:
+ return None
+
+
+def _try_connect_to_server(url):
+ try:
+        # TODO add validation that the url leads to an Ayon server
+        # - this won't fail if the url leads to e.g. 'google.com'
+ requests.get(url)
+
+ except BaseException:
+ return False
+ return True
+
+
+def login_to_server(url, username, password):
+ """Use login to the server to receive token.
+
+ Args:
+ url (str): Server url.
+ username (str): User's username.
+ password (str): User's password.
+
+ Returns:
+        Union[str, None]: User's token if login was successful.
+ Otherwise 'None'.
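+
+    Example (server url and credentials are illustrative):
+        >>> token = login_to_server(
+        ...     "https://ayon.example.com", "artist", "secret")
+        >>> if token is None:
+        ...     print("Invalid credentials")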
+ """
+
+ headers = {"Content-Type": "application/json"}
+ response = requests.post(
+ "{}/api/auth/login".format(url),
+ headers=headers,
+ json={
+ "name": username,
+ "password": password
+ }
+ )
+ token = None
+ # 200 - success
+ # 401 - invalid credentials
+ # * - other issues
+ if response.status_code == 200:
+ token = response.json()["token"]
+ return token
+
+
+def logout_from_server(url, token):
+ """Logout from server and throw token away.
+
+ Args:
+ url (str): Url from which should be logged out.
+ token (str): Token which should be used to log out.
+ """
+
+ headers = {
+ "Content-Type": "application/json",
+ "Authorization": "Bearer {}".format(token)
+ }
+ requests.post(
+ url + "/api/auth/logout",
+ headers=headers
+ )
+
+
+def is_token_valid(url, token):
+ """Check if token is valid.
+
+ Args:
+ url (str): Server url.
+ token (str): User's token.
+
+ Returns:
+ bool: True if token is valid.
+ """
+
+ headers = {
+ "Content-Type": "application/json",
+ "Authorization": "Bearer {}".format(token)
+ }
+ response = requests.get(
+ "{}/api/users/me".format(url),
+ headers=headers
+ )
+ return response.status_code == 200
+
+
+def validate_url(url):
+ """Validate url if is valid and server is available.
+
+ Validation checks if can be parsed as url and contains scheme.
+
+ Function will try to autofix url thus will return modified url when
+ connection to server works.
+
+ ```python
+ my_url = "my.server.url"
+ try:
+ # Store new url
+ validated_url = validate_url(my_url)
+
+ except UrlError:
+ # Handle invalid url
+ ...
+ ```
+
+ Args:
+ url (str): Server url.
+
+ Returns:
+        str: Url which was used to connect to server.
+
+ Raises:
+ UrlError: Error with short description and hints for user.
+ """
+
+    stripped_url = url.strip()
+    if not stripped_url:
+ raise UrlError(
+ "Invalid url format. Url is empty.",
+ title="Invalid url format",
+ hints=["url seems to be empty"]
+ )
+
+ # Not sure if this is good idea?
+ modified_url = stripperd_url.rstrip("/")
+ parsed_url = _try_parse_url(modified_url)
+ universal_hints = [
+ "does the url work in browser?"
+ ]
+ if parsed_url is None:
+ raise UrlError(
+ "Invalid url format. Url cannot be parsed as url \"{}\".".format(
+ modified_url
+ ),
+ title="Invalid url format",
+ hints=universal_hints
+ )
+
+    # Try to add 'https://' scheme if it is missing
+    # - UrlError below is raised if both attempts fail
+ if not parsed_url.scheme:
+ new_url = "https://" + modified_url
+ if _try_connect_to_server(new_url):
+ return new_url
+
+ if _try_connect_to_server(modified_url):
+ return modified_url
+
+    hints = []
+    if "/" in parsed_url.path or not parsed_url.scheme:
+        new_path = parsed_url.path.split("/")[0]
+        if parsed_url.scheme:
+            # Keep scheme and netloc, drop the rest of the path
+            new_url = "{}://{}{}".format(
+                parsed_url.scheme, parsed_url.netloc, new_path
+            )
+        else:
+            new_url = "https://" + new_path
+
+        hints.append(
+            "did you mean \"{}\"?".format(new_url)
+        )
+
+ raise UrlError(
+ "Couldn't connect to server on \"{}\"".format(url),
+ title="Couldn't connect to server",
+ hints=hints + universal_hints
+ )
+
+
+class TransferProgress:
+ """Object to store progress of download/upload from/to server."""
+
+ def __init__(self):
+ self._started = False
+ self._transfer_done = False
+        self._transferred = 0
+ self._content_size = None
+
+ self._failed = False
+ self._fail_reason = None
+
+ self._source_url = "N/A"
+ self._destination_url = "N/A"
+
+ def get_content_size(self):
+ return self._content_size
+
+ def set_content_size(self, content_size):
+ if self._content_size is not None:
+ raise ValueError("Content size was set more then once")
+ self._content_size = content_size
+
+ def get_started(self):
+ return self._started
+
+ def set_started(self):
+ if self._started:
+ raise ValueError("Progress already started")
+ self._started = True
+
+ def get_transfer_done(self):
+ return self._transfer_done
+
+ def set_transfer_done(self):
+ if self._transfer_done:
+ raise ValueError("Progress was already marked as done")
+ if not self._started:
+ raise ValueError("Progress didn't start yet")
+ self._transfer_done = True
+
+ def get_failed(self):
+ return self._failed
+
+ def get_fail_reason(self):
+ return self._fail_reason
+
+ def set_failed(self, reason):
+ self._fail_reason = reason
+ self._failed = True
+
+    def get_transferred_size(self):
+        return self._transferred
+
+    def set_transferred_size(self, transferred):
+        self._transferred = transferred
+
+    def add_transferred_chunk(self, chunk_size):
+        self._transferred += chunk_size
+
+ def get_source_url(self):
+ return self._source_url
+
+ def set_source_url(self, url):
+ self._source_url = url
+
+ def get_destination_url(self):
+ return self._destination_url
+
+ def set_destination_url(self, url):
+ self._destination_url = url
+
+ @property
+ def is_running(self):
+        if (
+            not self.started
+            or self.transfer_done
+            or self.failed
+        ):
+            return False
+        return True
+
+ @property
+ def transfer_progress(self):
+ if self._content_size is None:
+ return None
+        return (self._transferred * 100.0) / float(self._content_size)
+
+ content_size = property(get_content_size, set_content_size)
+ started = property(get_started)
+ transfer_done = property(get_transfer_done)
+ failed = property(get_failed)
+ fail_reason = property(get_fail_reason)
+ source_url = property(get_source_url, set_source_url)
+ destination_url = property(get_destination_url, set_destination_url)
+ transferred_size = property(get_transferred_size, set_transferred_size)
+
+
+def create_dependency_package_basename(platform_name=None):
+ """Create basename for dependency package file.
+
+ Args:
+ platform_name (Optional[str]): Name of platform for which the
+ bundle is targeted. Default value is current platform.
+
+ Returns:
+ str: Dependency package name with timestamp and platform.
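+
+    Example (timestamp in the output is illustrative):
+        >>> create_dependency_package_basename("windows")
+        'ayon_2306141030_windows'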
+ """
+
+ if platform_name is None:
+ platform_name = platform.system().lower()
+
+ now_date = datetime.datetime.now()
+ time_stamp = now_date.strftime("%y%m%d%H%M")
+ return "ayon_{}_{}".format(time_stamp, platform_name)
diff --git a/openpype/vendor/python/common/ayon_api/version.py b/openpype/vendor/python/common/ayon_api/version.py
new file mode 100644
index 0000000000..238f6e9426
--- /dev/null
+++ b/openpype/vendor/python/common/ayon_api/version.py
@@ -0,0 +1,2 @@
+"""Package declaring Python API for Ayon server."""
+__version__ = "0.3.2"
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/arrow/__init__.py b/openpype/vendor/python/python_2/arrow/__init__.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/arrow/arrow/__init__.py
rename to openpype/vendor/python/python_2/arrow/__init__.py
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/arrow/_version.py b/openpype/vendor/python/python_2/arrow/_version.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/arrow/arrow/_version.py
rename to openpype/vendor/python/python_2/arrow/_version.py
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/arrow/api.py b/openpype/vendor/python/python_2/arrow/api.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/arrow/arrow/api.py
rename to openpype/vendor/python/python_2/arrow/api.py
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/arrow/arrow.py b/openpype/vendor/python/python_2/arrow/arrow.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/arrow/arrow/arrow.py
rename to openpype/vendor/python/python_2/arrow/arrow.py
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/arrow/constants.py b/openpype/vendor/python/python_2/arrow/constants.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/arrow/arrow/constants.py
rename to openpype/vendor/python/python_2/arrow/constants.py
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/arrow/factory.py b/openpype/vendor/python/python_2/arrow/factory.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/arrow/arrow/factory.py
rename to openpype/vendor/python/python_2/arrow/factory.py
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/arrow/formatter.py b/openpype/vendor/python/python_2/arrow/formatter.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/arrow/arrow/formatter.py
rename to openpype/vendor/python/python_2/arrow/formatter.py
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/arrow/locales.py b/openpype/vendor/python/python_2/arrow/locales.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/arrow/arrow/locales.py
rename to openpype/vendor/python/python_2/arrow/locales.py
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/arrow/parser.py b/openpype/vendor/python/python_2/arrow/parser.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/arrow/arrow/parser.py
rename to openpype/vendor/python/python_2/arrow/parser.py
diff --git a/openpype/modules/ftrack/python2_vendor/arrow/arrow/util.py b/openpype/vendor/python/python_2/arrow/util.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/arrow/arrow/util.py
rename to openpype/vendor/python/python_2/arrow/util.py
diff --git a/openpype/modules/ftrack/python2_vendor/backports.functools_lru_cache/backports/__init__.py b/openpype/vendor/python/python_2/backports/__init__.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/backports.functools_lru_cache/backports/__init__.py
rename to openpype/vendor/python/python_2/backports/__init__.py
diff --git a/openpype/modules/ftrack/python2_vendor/backports.functools_lru_cache/backports/configparser/__init__.py b/openpype/vendor/python/python_2/backports/configparser/__init__.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/backports.functools_lru_cache/backports/configparser/__init__.py
rename to openpype/vendor/python/python_2/backports/configparser/__init__.py
diff --git a/openpype/modules/ftrack/python2_vendor/backports.functools_lru_cache/backports/configparser/helpers.py b/openpype/vendor/python/python_2/backports/configparser/helpers.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/backports.functools_lru_cache/backports/configparser/helpers.py
rename to openpype/vendor/python/python_2/backports/configparser/helpers.py
diff --git a/openpype/modules/ftrack/python2_vendor/backports.functools_lru_cache/backports/functools_lru_cache.py b/openpype/vendor/python/python_2/backports/functools_lru_cache.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/backports.functools_lru_cache/backports/functools_lru_cache.py
rename to openpype/vendor/python/python_2/backports/functools_lru_cache.py
diff --git a/openpype/modules/ftrack/python2_vendor/builtins/builtins/__init__.py b/openpype/vendor/python/python_2/builtins/__init__.py
similarity index 100%
rename from openpype/modules/ftrack/python2_vendor/builtins/builtins/__init__.py
rename to openpype/vendor/python/python_2/builtins/__init__.py
diff --git a/openpype/version.py b/openpype/version.py
index cdd546c4a0..2d396e5d30 100644
--- a/openpype/version.py
+++ b/openpype/version.py
@@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
-__version__ = "3.15.12-nightly.4"
+__version__ = "3.16.0-nightly.2"
diff --git a/poetry.lock b/poetry.lock
index d5b80d0f0a..50f8150638 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -18,106 +18,106 @@ resolved_reference = "126f7a188cfe36718f707f42ebbc597e86aa86c3"
[[package]]
name = "aiohttp"
-version = "3.8.3"
+version = "3.8.4"
description = "Async http client/server framework (asyncio)"
category = "main"
optional = false
python-versions = ">=3.6"
files = [
- {file = "aiohttp-3.8.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ba71c9b4dcbb16212f334126cc3d8beb6af377f6703d9dc2d9fb3874fd667ee9"},
- {file = "aiohttp-3.8.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d24b8bb40d5c61ef2d9b6a8f4528c2f17f1c5d2d31fed62ec860f6006142e83e"},
- {file = "aiohttp-3.8.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f88df3a83cf9df566f171adba39d5bd52814ac0b94778d2448652fc77f9eb491"},
- {file = "aiohttp-3.8.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b97decbb3372d4b69e4d4c8117f44632551c692bb1361b356a02b97b69e18a62"},
- {file = "aiohttp-3.8.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:309aa21c1d54b8ef0723181d430347d7452daaff93e8e2363db8e75c72c2fb2d"},
- {file = "aiohttp-3.8.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ad5383a67514e8e76906a06741febd9126fc7c7ff0f599d6fcce3e82b80d026f"},
- {file = "aiohttp-3.8.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:20acae4f268317bb975671e375493dbdbc67cddb5f6c71eebdb85b34444ac46b"},
- {file = "aiohttp-3.8.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:05a3c31c6d7cd08c149e50dc7aa2568317f5844acd745621983380597f027a18"},
- {file = "aiohttp-3.8.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d6f76310355e9fae637c3162936e9504b4767d5c52ca268331e2756e54fd4ca5"},
- {file = "aiohttp-3.8.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:256deb4b29fe5e47893fa32e1de2d73c3afe7407738bd3c63829874661d4822d"},
- {file = "aiohttp-3.8.3-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:5c59fcd80b9049b49acd29bd3598cada4afc8d8d69bd4160cd613246912535d7"},
- {file = "aiohttp-3.8.3-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:059a91e88f2c00fe40aed9031b3606c3f311414f86a90d696dd982e7aec48142"},
- {file = "aiohttp-3.8.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:2feebbb6074cdbd1ac276dbd737b40e890a1361b3cc30b74ac2f5e24aab41f7b"},
- {file = "aiohttp-3.8.3-cp310-cp310-win32.whl", hash = "sha256:5bf651afd22d5f0c4be16cf39d0482ea494f5c88f03e75e5fef3a85177fecdeb"},
- {file = "aiohttp-3.8.3-cp310-cp310-win_amd64.whl", hash = "sha256:653acc3880459f82a65e27bd6526e47ddf19e643457d36a2250b85b41a564715"},
- {file = "aiohttp-3.8.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:86fc24e58ecb32aee09f864cb11bb91bc4c1086615001647dbfc4dc8c32f4008"},
- {file = "aiohttp-3.8.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:75e14eac916f024305db517e00a9252714fce0abcb10ad327fb6dcdc0d060f1d"},
- {file = "aiohttp-3.8.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d1fde0f44029e02d02d3993ad55ce93ead9bb9b15c6b7ccd580f90bd7e3de476"},
- {file = "aiohttp-3.8.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ab94426ddb1ecc6a0b601d832d5d9d421820989b8caa929114811369673235c"},
- {file = "aiohttp-3.8.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:89d2e02167fa95172c017732ed7725bc8523c598757f08d13c5acca308e1a061"},
- {file = "aiohttp-3.8.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:02f9a2c72fc95d59b881cf38a4b2be9381b9527f9d328771e90f72ac76f31ad8"},
- {file = "aiohttp-3.8.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c7149272fb5834fc186328e2c1fa01dda3e1fa940ce18fded6d412e8f2cf76d"},
- {file = "aiohttp-3.8.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:512bd5ab136b8dc0ffe3fdf2dfb0c4b4f49c8577f6cae55dca862cd37a4564e2"},
- {file = "aiohttp-3.8.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:7018ecc5fe97027214556afbc7c502fbd718d0740e87eb1217b17efd05b3d276"},
- {file = "aiohttp-3.8.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:88c70ed9da9963d5496d38320160e8eb7e5f1886f9290475a881db12f351ab5d"},
- {file = "aiohttp-3.8.3-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:da22885266bbfb3f78218dc40205fed2671909fbd0720aedba39b4515c038091"},
- {file = "aiohttp-3.8.3-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:e65bc19919c910127c06759a63747ebe14f386cda573d95bcc62b427ca1afc73"},
- {file = "aiohttp-3.8.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:08c78317e950e0762c2983f4dd58dc5e6c9ff75c8a0efeae299d363d439c8e34"},
- {file = "aiohttp-3.8.3-cp311-cp311-win32.whl", hash = "sha256:45d88b016c849d74ebc6f2b6e8bc17cabf26e7e40c0661ddd8fae4c00f015697"},
- {file = "aiohttp-3.8.3-cp311-cp311-win_amd64.whl", hash = "sha256:96372fc29471646b9b106ee918c8eeb4cca423fcbf9a34daa1b93767a88a2290"},
- {file = "aiohttp-3.8.3-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:c971bf3786b5fad82ce5ad570dc6ee420f5b12527157929e830f51c55dc8af77"},
- {file = "aiohttp-3.8.3-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ff25f48fc8e623d95eca0670b8cc1469a83783c924a602e0fbd47363bb54aaca"},
- {file = "aiohttp-3.8.3-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e381581b37db1db7597b62a2e6b8b57c3deec95d93b6d6407c5b61ddc98aca6d"},
- {file = "aiohttp-3.8.3-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:db19d60d846283ee275d0416e2a23493f4e6b6028825b51290ac05afc87a6f97"},
- {file = "aiohttp-3.8.3-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:25892c92bee6d9449ffac82c2fe257f3a6f297792cdb18ad784737d61e7a9a85"},
- {file = "aiohttp-3.8.3-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:398701865e7a9565d49189f6c90868efaca21be65c725fc87fc305906be915da"},
- {file = "aiohttp-3.8.3-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:4a4fbc769ea9b6bd97f4ad0b430a6807f92f0e5eb020f1e42ece59f3ecfc4585"},
- {file = "aiohttp-3.8.3-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:b29bfd650ed8e148f9c515474a6ef0ba1090b7a8faeee26b74a8ff3b33617502"},
- {file = "aiohttp-3.8.3-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:1e56b9cafcd6531bab5d9b2e890bb4937f4165109fe98e2b98ef0dcfcb06ee9d"},
- {file = "aiohttp-3.8.3-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:ec40170327d4a404b0d91855d41bfe1fe4b699222b2b93e3d833a27330a87a6d"},
- {file = "aiohttp-3.8.3-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:2df5f139233060578d8c2c975128fb231a89ca0a462b35d4b5fcf7c501ebdbe1"},
- {file = "aiohttp-3.8.3-cp36-cp36m-win32.whl", hash = "sha256:f973157ffeab5459eefe7b97a804987876dd0a55570b8fa56b4e1954bf11329b"},
- {file = "aiohttp-3.8.3-cp36-cp36m-win_amd64.whl", hash = "sha256:437399385f2abcd634865705bdc180c8314124b98299d54fe1d4c8990f2f9494"},
- {file = "aiohttp-3.8.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:09e28f572b21642128ef31f4e8372adb6888846f32fecb288c8b0457597ba61a"},
- {file = "aiohttp-3.8.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6f3553510abdbec67c043ca85727396ceed1272eef029b050677046d3387be8d"},
- {file = "aiohttp-3.8.3-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e168a7560b7c61342ae0412997b069753f27ac4862ec7867eff74f0fe4ea2ad9"},
- {file = "aiohttp-3.8.3-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:db4c979b0b3e0fa7e9e69ecd11b2b3174c6963cebadeecfb7ad24532ffcdd11a"},
- {file = "aiohttp-3.8.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e164e0a98e92d06da343d17d4e9c4da4654f4a4588a20d6c73548a29f176abe2"},
- {file = "aiohttp-3.8.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e8a78079d9a39ca9ca99a8b0ac2fdc0c4d25fc80c8a8a82e5c8211509c523363"},
- {file = "aiohttp-3.8.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:21b30885a63c3f4ff5b77a5d6caf008b037cb521a5f33eab445dc566f6d092cc"},
- {file = "aiohttp-3.8.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:4b0f30372cef3fdc262f33d06e7b411cd59058ce9174ef159ad938c4a34a89da"},
- {file = "aiohttp-3.8.3-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:8135fa153a20d82ffb64f70a1b5c2738684afa197839b34cc3e3c72fa88d302c"},
- {file = "aiohttp-3.8.3-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:ad61a9639792fd790523ba072c0555cd6be5a0baf03a49a5dd8cfcf20d56df48"},
- {file = "aiohttp-3.8.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:978b046ca728073070e9abc074b6299ebf3501e8dee5e26efacb13cec2b2dea0"},
- {file = "aiohttp-3.8.3-cp37-cp37m-win32.whl", hash = "sha256:0d2c6d8c6872df4a6ec37d2ede71eff62395b9e337b4e18efd2177de883a5033"},
- {file = "aiohttp-3.8.3-cp37-cp37m-win_amd64.whl", hash = "sha256:21d69797eb951f155026651f7e9362877334508d39c2fc37bd04ff55b2007091"},
- {file = "aiohttp-3.8.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ca9af5f8f5812d475c5259393f52d712f6d5f0d7fdad9acdb1107dd9e3cb7eb"},
- {file = "aiohttp-3.8.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1d90043c1882067f1bd26196d5d2db9aa6d268def3293ed5fb317e13c9413ea4"},
- {file = "aiohttp-3.8.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d737fc67b9a970f3234754974531dc9afeea11c70791dcb7db53b0cf81b79784"},
- {file = "aiohttp-3.8.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ebf909ea0a3fc9596e40d55d8000702a85e27fd578ff41a5500f68f20fd32e6c"},
- {file = "aiohttp-3.8.3-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5835f258ca9f7c455493a57ee707b76d2d9634d84d5d7f62e77be984ea80b849"},
- {file = "aiohttp-3.8.3-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:da37dcfbf4b7f45d80ee386a5f81122501ec75672f475da34784196690762f4b"},
- {file = "aiohttp-3.8.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87f44875f2804bc0511a69ce44a9595d5944837a62caecc8490bbdb0e18b1342"},
- {file = "aiohttp-3.8.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:527b3b87b24844ea7865284aabfab08eb0faf599b385b03c2aa91fc6edd6e4b6"},
- {file = "aiohttp-3.8.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:d5ba88df9aa5e2f806650fcbeedbe4f6e8736e92fc0e73b0400538fd25a4dd96"},
- {file = "aiohttp-3.8.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:e7b8813be97cab8cb52b1375f41f8e6804f6507fe4660152e8ca5c48f0436017"},
- {file = "aiohttp-3.8.3-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:2dea10edfa1a54098703cb7acaa665c07b4e7568472a47f4e64e6319d3821ccf"},
- {file = "aiohttp-3.8.3-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:713d22cd9643ba9025d33c4af43943c7a1eb8547729228de18d3e02e278472b6"},
- {file = "aiohttp-3.8.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:2d252771fc85e0cf8da0b823157962d70639e63cb9b578b1dec9868dd1f4f937"},
- {file = "aiohttp-3.8.3-cp38-cp38-win32.whl", hash = "sha256:66bd5f950344fb2b3dbdd421aaa4e84f4411a1a13fca3aeb2bcbe667f80c9f76"},
- {file = "aiohttp-3.8.3-cp38-cp38-win_amd64.whl", hash = "sha256:84b14f36e85295fe69c6b9789b51a0903b774046d5f7df538176516c3e422446"},
- {file = "aiohttp-3.8.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:16c121ba0b1ec2b44b73e3a8a171c4f999b33929cd2397124a8c7fcfc8cd9e06"},
- {file = "aiohttp-3.8.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:8d6aaa4e7155afaf994d7924eb290abbe81a6905b303d8cb61310a2aba1c68ba"},
- {file = "aiohttp-3.8.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:43046a319664a04b146f81b40e1545d4c8ac7b7dd04c47e40bf09f65f2437346"},
- {file = "aiohttp-3.8.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:599418aaaf88a6d02a8c515e656f6faf3d10618d3dd95866eb4436520096c84b"},
- {file = "aiohttp-3.8.3-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:92a2964319d359f494f16011e23434f6f8ef0434acd3cf154a6b7bec511e2fb7"},
- {file = "aiohttp-3.8.3-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:73a4131962e6d91109bca6536416aa067cf6c4efb871975df734f8d2fd821b37"},
- {file = "aiohttp-3.8.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:598adde339d2cf7d67beaccda3f2ce7c57b3b412702f29c946708f69cf8222aa"},
- {file = "aiohttp-3.8.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:75880ed07be39beff1881d81e4a907cafb802f306efd6d2d15f2b3c69935f6fb"},
- {file = "aiohttp-3.8.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:a0239da9fbafd9ff82fd67c16704a7d1bccf0d107a300e790587ad05547681c8"},
- {file = "aiohttp-3.8.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:4e3a23ec214e95c9fe85a58470b660efe6534b83e6cbe38b3ed52b053d7cb6ad"},
- {file = "aiohttp-3.8.3-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:47841407cc89a4b80b0c52276f3cc8138bbbfba4b179ee3acbd7d77ae33f7ac4"},
- {file = "aiohttp-3.8.3-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:54d107c89a3ebcd13228278d68f1436d3f33f2dd2af5415e3feaeb1156e1a62c"},
- {file = "aiohttp-3.8.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c37c5cce780349d4d51739ae682dec63573847a2a8dcb44381b174c3d9c8d403"},
- {file = "aiohttp-3.8.3-cp39-cp39-win32.whl", hash = "sha256:f178d2aadf0166be4df834c4953da2d7eef24719e8aec9a65289483eeea9d618"},
- {file = "aiohttp-3.8.3-cp39-cp39-win_amd64.whl", hash = "sha256:88e5be56c231981428f4f506c68b6a46fa25c4123a2e86d156c58a8369d31ab7"},
- {file = "aiohttp-3.8.3.tar.gz", hash = "sha256:3828fb41b7203176b82fe5d699e0d845435f2374750a44b480ea6b930f6be269"},
+ {file = "aiohttp-3.8.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:5ce45967538fb747370308d3145aa68a074bdecb4f3a300869590f725ced69c1"},
+ {file = "aiohttp-3.8.4-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b744c33b6f14ca26b7544e8d8aadff6b765a80ad6164fb1a430bbadd593dfb1a"},
+ {file = "aiohttp-3.8.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1a45865451439eb320784918617ba54b7a377e3501fb70402ab84d38c2cd891b"},
+ {file = "aiohttp-3.8.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a86d42d7cba1cec432d47ab13b6637bee393a10f664c425ea7b305d1301ca1a3"},
+ {file = "aiohttp-3.8.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ee3c36df21b5714d49fc4580247947aa64bcbe2939d1b77b4c8dcb8f6c9faecc"},
+ {file = "aiohttp-3.8.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:176a64b24c0935869d5bbc4c96e82f89f643bcdf08ec947701b9dbb3c956b7dd"},
+ {file = "aiohttp-3.8.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c844fd628851c0bc309f3c801b3a3d58ce430b2ce5b359cd918a5a76d0b20cb5"},
+ {file = "aiohttp-3.8.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5393fb786a9e23e4799fec788e7e735de18052f83682ce2dfcabaf1c00c2c08e"},
+ {file = "aiohttp-3.8.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e4b09863aae0dc965c3ef36500d891a3ff495a2ea9ae9171e4519963c12ceefd"},
+ {file = "aiohttp-3.8.4-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:adfbc22e87365a6e564c804c58fc44ff7727deea782d175c33602737b7feadb6"},
+ {file = "aiohttp-3.8.4-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:147ae376f14b55f4f3c2b118b95be50a369b89b38a971e80a17c3fd623f280c9"},
+ {file = "aiohttp-3.8.4-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:eafb3e874816ebe2a92f5e155f17260034c8c341dad1df25672fb710627c6949"},
+ {file = "aiohttp-3.8.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c6cc15d58053c76eacac5fa9152d7d84b8d67b3fde92709195cb984cfb3475ea"},
+ {file = "aiohttp-3.8.4-cp310-cp310-win32.whl", hash = "sha256:59f029a5f6e2d679296db7bee982bb3d20c088e52a2977e3175faf31d6fb75d1"},
+ {file = "aiohttp-3.8.4-cp310-cp310-win_amd64.whl", hash = "sha256:fe7ba4a51f33ab275515f66b0a236bcde4fb5561498fe8f898d4e549b2e4509f"},
+ {file = "aiohttp-3.8.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:3d8ef1a630519a26d6760bc695842579cb09e373c5f227a21b67dc3eb16cfea4"},
+ {file = "aiohttp-3.8.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5b3f2e06a512e94722886c0827bee9807c86a9f698fac6b3aee841fab49bbfb4"},
+ {file = "aiohttp-3.8.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3a80464982d41b1fbfe3154e440ba4904b71c1a53e9cd584098cd41efdb188ef"},
+ {file = "aiohttp-3.8.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b631e26df63e52f7cce0cce6507b7a7f1bc9b0c501fcde69742130b32e8782f"},
+ {file = "aiohttp-3.8.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3f43255086fe25e36fd5ed8f2ee47477408a73ef00e804cb2b5cba4bf2ac7f5e"},
+ {file = "aiohttp-3.8.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4d347a172f866cd1d93126d9b239fcbe682acb39b48ee0873c73c933dd23bd0f"},
+ {file = "aiohttp-3.8.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a3fec6a4cb5551721cdd70473eb009d90935b4063acc5f40905d40ecfea23e05"},
+ {file = "aiohttp-3.8.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:80a37fe8f7c1e6ce8f2d9c411676e4bc633a8462844e38f46156d07a7d401654"},
+ {file = "aiohttp-3.8.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d1e6a862b76f34395a985b3cd39a0d949ca80a70b6ebdea37d3ab39ceea6698a"},
+ {file = "aiohttp-3.8.4-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:cd468460eefef601ece4428d3cf4562459157c0f6523db89365202c31b6daebb"},
+ {file = "aiohttp-3.8.4-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:618c901dd3aad4ace71dfa0f5e82e88b46ef57e3239fc7027773cb6d4ed53531"},
+ {file = "aiohttp-3.8.4-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:652b1bff4f15f6287550b4670546a2947f2a4575b6c6dff7760eafb22eacbf0b"},
+ {file = "aiohttp-3.8.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80575ba9377c5171407a06d0196b2310b679dc752d02a1fcaa2bc20b235dbf24"},
+ {file = "aiohttp-3.8.4-cp311-cp311-win32.whl", hash = "sha256:bbcf1a76cf6f6dacf2c7f4d2ebd411438c275faa1dc0c68e46eb84eebd05dd7d"},
+ {file = "aiohttp-3.8.4-cp311-cp311-win_amd64.whl", hash = "sha256:6e74dd54f7239fcffe07913ff8b964e28b712f09846e20de78676ce2a3dc0bfc"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:880e15bb6dad90549b43f796b391cfffd7af373f4646784795e20d92606b7a51"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bb96fa6b56bb536c42d6a4a87dfca570ff8e52de2d63cabebfd6fb67049c34b6"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4a6cadebe132e90cefa77e45f2d2f1a4b2ce5c6b1bfc1656c1ddafcfe4ba8131"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f352b62b45dff37b55ddd7b9c0c8672c4dd2eb9c0f9c11d395075a84e2c40f75"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7ab43061a0c81198d88f39aaf90dae9a7744620978f7ef3e3708339b8ed2ef01"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c9cb1565a7ad52e096a6988e2ee0397f72fe056dadf75d17fa6b5aebaea05622"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:1b3ea7edd2d24538959c1c1abf97c744d879d4e541d38305f9bd7d9b10c9ec41"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:7c7837fe8037e96b6dd5cfcf47263c1620a9d332a87ec06a6ca4564e56bd0f36"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:3b90467ebc3d9fa5b0f9b6489dfb2c304a1db7b9946fa92aa76a831b9d587e99"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:cab9401de3ea52b4b4c6971db5fb5c999bd4260898af972bf23de1c6b5dd9d71"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:d1f9282c5f2b5e241034a009779e7b2a1aa045f667ff521e7948ea9b56e0c5ff"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-win32.whl", hash = "sha256:5e14f25765a578a0a634d5f0cd1e2c3f53964553a00347998dfdf96b8137f777"},
+ {file = "aiohttp-3.8.4-cp36-cp36m-win_amd64.whl", hash = "sha256:4c745b109057e7e5f1848c689ee4fb3a016c8d4d92da52b312f8a509f83aa05e"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:aede4df4eeb926c8fa70de46c340a1bc2c6079e1c40ccf7b0eae1313ffd33519"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ddaae3f3d32fc2cb4c53fab020b69a05c8ab1f02e0e59665c6f7a0d3a5be54f"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c4eb3b82ca349cf6fadcdc7abcc8b3a50ab74a62e9113ab7a8ebc268aad35bb9"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9bcb89336efa095ea21b30f9e686763f2be4478f1b0a616969551982c4ee4c3b"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c08e8ed6fa3d477e501ec9db169bfac8140e830aa372d77e4a43084d8dd91ab"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c6cd05ea06daca6ad6a4ca3ba7fe7dc5b5de063ff4daec6170ec0f9979f6c332"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7a00a9ed8d6e725b55ef98b1b35c88013245f35f68b1b12c5cd4100dddac333"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:de04b491d0e5007ee1b63a309956eaed959a49f5bb4e84b26c8f5d49de140fa9"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:40653609b3bf50611356e6b6554e3a331f6879fa7116f3959b20e3528783e699"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:dbf3a08a06b3f433013c143ebd72c15cac33d2914b8ea4bea7ac2c23578815d6"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:854f422ac44af92bfe172d8e73229c270dc09b96535e8a548f99c84f82dde241"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-win32.whl", hash = "sha256:aeb29c84bb53a84b1a81c6c09d24cf33bb8432cc5c39979021cc0f98c1292a1a"},
+ {file = "aiohttp-3.8.4-cp37-cp37m-win_amd64.whl", hash = "sha256:db3fc6120bce9f446d13b1b834ea5b15341ca9ff3f335e4a951a6ead31105480"},
+ {file = "aiohttp-3.8.4-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:fabb87dd8850ef0f7fe2b366d44b77d7e6fa2ea87861ab3844da99291e81e60f"},
+ {file = "aiohttp-3.8.4-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:91f6d540163f90bbaef9387e65f18f73ffd7c79f5225ac3d3f61df7b0d01ad15"},
+ {file = "aiohttp-3.8.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d265f09a75a79a788237d7f9054f929ced2e69eb0bb79de3798c468d8a90f945"},
+ {file = "aiohttp-3.8.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3d89efa095ca7d442a6d0cbc755f9e08190ba40069b235c9886a8763b03785da"},
+ {file = "aiohttp-3.8.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4dac314662f4e2aa5009977b652d9b8db7121b46c38f2073bfeed9f4049732cd"},
+ {file = "aiohttp-3.8.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fe11310ae1e4cd560035598c3f29d86cef39a83d244c7466f95c27ae04850f10"},
+ {file = "aiohttp-3.8.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6ddb2a2026c3f6a68c3998a6c47ab6795e4127315d2e35a09997da21865757f8"},
+ {file = "aiohttp-3.8.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e75b89ac3bd27d2d043b234aa7b734c38ba1b0e43f07787130a0ecac1e12228a"},
+ {file = "aiohttp-3.8.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:6e601588f2b502c93c30cd5a45bfc665faaf37bbe835b7cfd461753068232074"},
+ {file = "aiohttp-3.8.4-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a5d794d1ae64e7753e405ba58e08fcfa73e3fad93ef9b7e31112ef3c9a0efb52"},
+ {file = "aiohttp-3.8.4-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:a1f4689c9a1462f3df0a1f7e797791cd6b124ddbee2b570d34e7f38ade0e2c71"},
+ {file = "aiohttp-3.8.4-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:3032dcb1c35bc330134a5b8a5d4f68c1a87252dfc6e1262c65a7e30e62298275"},
+ {file = "aiohttp-3.8.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:8189c56eb0ddbb95bfadb8f60ea1b22fcfa659396ea36f6adcc521213cd7b44d"},
+ {file = "aiohttp-3.8.4-cp38-cp38-win32.whl", hash = "sha256:33587f26dcee66efb2fff3c177547bd0449ab7edf1b73a7f5dea1e38609a0c54"},
+ {file = "aiohttp-3.8.4-cp38-cp38-win_amd64.whl", hash = "sha256:e595432ac259af2d4630008bf638873d69346372d38255774c0e286951e8b79f"},
+ {file = "aiohttp-3.8.4-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:5a7bdf9e57126dc345b683c3632e8ba317c31d2a41acd5800c10640387d193ed"},
+ {file = "aiohttp-3.8.4-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:22f6eab15b6db242499a16de87939a342f5a950ad0abaf1532038e2ce7d31567"},
+ {file = "aiohttp-3.8.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:7235604476a76ef249bd64cb8274ed24ccf6995c4a8b51a237005ee7a57e8643"},
+ {file = "aiohttp-3.8.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ea9eb976ffdd79d0e893869cfe179a8f60f152d42cb64622fca418cd9b18dc2a"},
+ {file = "aiohttp-3.8.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:92c0cea74a2a81c4c76b62ea1cac163ecb20fb3ba3a75c909b9fa71b4ad493cf"},
+ {file = "aiohttp-3.8.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:493f5bc2f8307286b7799c6d899d388bbaa7dfa6c4caf4f97ef7521b9cb13719"},
+ {file = "aiohttp-3.8.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0a63f03189a6fa7c900226e3ef5ba4d3bd047e18f445e69adbd65af433add5a2"},
+ {file = "aiohttp-3.8.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:10c8cefcff98fd9168cdd86c4da8b84baaa90bf2da2269c6161984e6737bf23e"},
+ {file = "aiohttp-3.8.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:bca5f24726e2919de94f047739d0a4fc01372801a3672708260546aa2601bf57"},
+ {file = "aiohttp-3.8.4-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:03baa76b730e4e15a45f81dfe29a8d910314143414e528737f8589ec60cf7391"},
+ {file = "aiohttp-3.8.4-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:8c29c77cc57e40f84acef9bfb904373a4e89a4e8b74e71aa8075c021ec9078c2"},
+ {file = "aiohttp-3.8.4-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:03543dcf98a6619254b409be2d22b51f21ec66272be4ebda7b04e6412e4b2e14"},
+ {file = "aiohttp-3.8.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:17b79c2963db82086229012cff93ea55196ed31f6493bb1ccd2c62f1724324e4"},
+ {file = "aiohttp-3.8.4-cp39-cp39-win32.whl", hash = "sha256:34ce9f93a4a68d1272d26030655dd1b58ff727b3ed2a33d80ec433561b03d67a"},
+ {file = "aiohttp-3.8.4-cp39-cp39-win_amd64.whl", hash = "sha256:41a86a69bb63bb2fc3dc9ad5ea9f10f1c9c8e282b471931be0268ddd09430b04"},
+ {file = "aiohttp-3.8.4.tar.gz", hash = "sha256:bf2e1a9162c1e441bf805a1fd166e249d574ca04e03b34f97e2928769e91ab5c"},
]
[package.dependencies]
aiosignal = ">=1.1.2"
async-timeout = ">=4.0.0a3,<5.0"
attrs = ">=17.3.0"
-charset-normalizer = ">=2.0,<3.0"
+charset-normalizer = ">=2.0,<4.0"
frozenlist = ">=1.1.1"
multidict = ">=4.5,<7.0"
yarl = ">=1.0,<2.0"
@@ -210,7 +210,7 @@ develop = false
type = "git"
url = "https://github.com/ActiveState/appdirs.git"
reference = "master"
-resolved_reference = "211708144ddcbba1f02e26a43efec9aef57bc9fc"
+resolved_reference = "8734277956c1df3b85385e6b308e954910533884"
[[package]]
name = "arrow"
@@ -229,19 +229,19 @@ python-dateutil = ">=2.7.0"
[[package]]
name = "astroid"
-version = "2.13.2"
+version = "2.15.5"
description = "An abstract syntax tree for Python with inference support."
category = "dev"
optional = false
python-versions = ">=3.7.2"
files = [
- {file = "astroid-2.13.2-py3-none-any.whl", hash = "sha256:8f6a8d40c4ad161d6fc419545ae4b2f275ed86d1c989c97825772120842ee0d2"},
- {file = "astroid-2.13.2.tar.gz", hash = "sha256:3bc7834720e1a24ca797fd785d77efb14f7a28ee8e635ef040b6e2d80ccb3303"},
+ {file = "astroid-2.15.5-py3-none-any.whl", hash = "sha256:078e5212f9885fa85fbb0cf0101978a336190aadea6e13305409d099f71b2324"},
+ {file = "astroid-2.15.5.tar.gz", hash = "sha256:1039262575027b441137ab4a62a793a9b43defb42c32d5670f38686207cd780f"},
]
[package.dependencies]
lazy-object-proxy = ">=1.4.0"
-typing-extensions = ">=4.0.0"
+typing-extensions = {version = ">=4.0.0", markers = "python_version < \"3.11\""}
wrapt = {version = ">=1.11,<2", markers = "python_version < \"3.11\""}
[[package]]
@@ -269,33 +269,33 @@ files = [
[[package]]
name = "attrs"
-version = "22.2.0"
+version = "23.1.0"
description = "Classes Without Boilerplate"
category = "main"
optional = false
-python-versions = ">=3.6"
+python-versions = ">=3.7"
files = [
- {file = "attrs-22.2.0-py3-none-any.whl", hash = "sha256:29e95c7f6778868dbd49170f98f8818f78f3dc5e0e37c0b1f474e3561b240836"},
- {file = "attrs-22.2.0.tar.gz", hash = "sha256:c9227bfc2f01993c03f68db37d1d15c9690188323c067c641f1a35ca58185f99"},
+ {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
+ {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
]
[package.extras]
-cov = ["attrs[tests]", "coverage-enable-subprocess", "coverage[toml] (>=5.3)"]
-dev = ["attrs[docs,tests]"]
-docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope.interface"]
-tests = ["attrs[tests-no-zope]", "zope.interface"]
-tests-no-zope = ["cloudpickle", "cloudpickle", "hypothesis", "hypothesis", "mypy (>=0.971,<0.990)", "mypy (>=0.971,<0.990)", "pympler", "pympler", "pytest (>=4.3.0)", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-mypy-plugins", "pytest-xdist[psutil]", "pytest-xdist[psutil]"]
+cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
+dev = ["attrs[docs,tests]", "pre-commit"]
+docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
+tests = ["attrs[tests-no-zope]", "zope-interface"]
+tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
[[package]]
name = "autopep8"
-version = "2.0.1"
+version = "2.0.2"
description = "A tool that automatically formats Python code to conform to the PEP 8 style guide"
category = "dev"
optional = false
python-versions = ">=3.6"
files = [
- {file = "autopep8-2.0.1-py2.py3-none-any.whl", hash = "sha256:be5bc98c33515b67475420b7b1feafc8d32c1a69862498eda4983b45bffd2687"},
- {file = "autopep8-2.0.1.tar.gz", hash = "sha256:d27a8929d8dcd21c0f4b3859d2d07c6c25273727b98afc984c039df0f0d86566"},
+ {file = "autopep8-2.0.2-py2.py3-none-any.whl", hash = "sha256:86e9303b5e5c8160872b2f5ef611161b2893e9bfe8ccc7e2f76385947d57a2f1"},
+ {file = "autopep8-2.0.2.tar.gz", hash = "sha256:f9849cdd62108cb739dbcdbfb7fdcc9a30d1b63c4cc3e1c1f893b5360941b61c"},
]
[package.dependencies]
@@ -304,19 +304,16 @@ tomli = {version = "*", markers = "python_version < \"3.11\""}
[[package]]
name = "babel"
-version = "2.11.0"
+version = "2.12.1"
description = "Internationalization utilities"
category = "dev"
optional = false
-python-versions = ">=3.6"
+python-versions = ">=3.7"
files = [
- {file = "Babel-2.11.0-py3-none-any.whl", hash = "sha256:1ad3eca1c885218f6dce2ab67291178944f810a10a9b5f3cb8382a5a232b64fe"},
- {file = "Babel-2.11.0.tar.gz", hash = "sha256:5ef4b3226b0180dedded4229651c8b0e1a3a6a2837d45a073272f313e4cf97f6"},
+ {file = "Babel-2.12.1-py3-none-any.whl", hash = "sha256:b4246fb7677d3b98f501a39d43396d3cafdc8eadb045f4a31be01863f655c610"},
+ {file = "Babel-2.12.1.tar.gz", hash = "sha256:cc2d99999cd01d44420ae725a21c9e3711b3aadc7976d6147f622d8581963455"},
]
-[package.dependencies]
-pytz = ">=2015.7"
-
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -352,16 +349,33 @@ files = [
tests = ["pytest (>=3.2.1,!=3.3.0)"]
typecheck = ["mypy"]
+[[package]]
+name = "bidict"
+version = "0.22.1"
+description = "The bidirectional mapping library for Python."
+category = "main"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "bidict-0.22.1-py3-none-any.whl", hash = "sha256:6ef212238eb884b664f28da76f33f1d28b260f665fc737b413b287d5487d1e7b"},
+ {file = "bidict-0.22.1.tar.gz", hash = "sha256:1e0f7f74e4860e6d0943a05d4134c63a2fad86f3d4732fb265bd79e4e856d81d"},
+]
+
+[package.extras]
+docs = ["furo", "sphinx", "sphinx-copybutton"]
+lint = ["pre-commit"]
+test = ["hypothesis", "pytest", "pytest-benchmark[histogram]", "pytest-cov", "pytest-xdist", "sortedcollections", "sortedcontainers", "sphinx"]
+
[[package]]
name = "blessed"
-version = "1.19.1"
+version = "1.20.0"
description = "Easy, practical library for making terminal apps, by providing an elegant, well-documented interface to Colors, Keyboard input, and screen Positioning capabilities."
category = "main"
optional = false
python-versions = ">=2.7"
files = [
- {file = "blessed-1.19.1-py2.py3-none-any.whl", hash = "sha256:63b8554ae2e0e7f43749b6715c734cc8f3883010a809bf16790102563e6cf25b"},
- {file = "blessed-1.19.1.tar.gz", hash = "sha256:9a0d099695bf621d4680dd6c73f6ad547f6a3442fbdbe80c4b1daa1edbc492fc"},
+ {file = "blessed-1.20.0-py2.py3-none-any.whl", hash = "sha256:0c542922586a265e699188e52d5f5ac5ec0dd517e5a1041d90d2bbf23f906058"},
+ {file = "blessed-1.20.0.tar.gz", hash = "sha256:2cdd67f8746e048f00df47a2880f4d6acbcdb399031b604e34ba8f71d5787680"},
]
[package.dependencies]
@@ -371,26 +385,26 @@ wcwidth = ">=0.1.4"
[[package]]
name = "cachetools"
-version = "5.2.1"
+version = "5.3.1"
description = "Extensible memoizing collections and decorators"
category = "main"
optional = false
-python-versions = "~=3.7"
+python-versions = ">=3.7"
files = [
- {file = "cachetools-5.2.1-py3-none-any.whl", hash = "sha256:8462eebf3a6c15d25430a8c27c56ac61340b2ecf60c9ce57afc2b97e450e47da"},
- {file = "cachetools-5.2.1.tar.gz", hash = "sha256:5991bc0e08a1319bb618d3195ca5b6bc76646a49c21d55962977197b301cc1fe"},
+ {file = "cachetools-5.3.1-py3-none-any.whl", hash = "sha256:95ef631eeaea14ba2e36f06437f36463aac3a096799e876ee55e5cdccb102590"},
+ {file = "cachetools-5.3.1.tar.gz", hash = "sha256:dce83f2d9b4e1f732a8cd44af8e8fab2dbe46201467fc98b3ef8f269092bf62b"},
]
[[package]]
name = "certifi"
-version = "2022.12.7"
+version = "2023.5.7"
description = "Python package for providing Mozilla's CA Bundle."
category = "main"
optional = false
python-versions = ">=3.6"
files = [
- {file = "certifi-2022.12.7-py3-none-any.whl", hash = "sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18"},
- {file = "certifi-2022.12.7.tar.gz", hash = "sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3"},
+ {file = "certifi-2023.5.7-py3-none-any.whl", hash = "sha256:c6c2e98f5c7869efca1f8916fed228dd91539f9f1b444c314c06eef02980c716"},
+ {file = "certifi-2023.5.7.tar.gz", hash = "sha256:0f0d56dc5a6ad56fd4ba36484d6cc34451e1c6548c61daad8c320169f91eddc7"},
]
[[package]]
@@ -484,31 +498,104 @@ files = [
[[package]]
name = "charset-normalizer"
-version = "2.1.1"
+version = "3.1.0"
description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
category = "main"
optional = false
-python-versions = ">=3.6.0"
+python-versions = ">=3.7.0"
files = [
- {file = "charset-normalizer-2.1.1.tar.gz", hash = "sha256:5a3d016c7c547f69d6f81fb0db9449ce888b418b5b9952cc5e6e66843e9dd845"},
- {file = "charset_normalizer-2.1.1-py3-none-any.whl", hash = "sha256:83e9a75d1911279afd89352c68b45348559d1fc0506b054b346651b5e7fee29f"},
+ {file = "charset-normalizer-3.1.0.tar.gz", hash = "sha256:34e0a2f9c370eb95597aae63bf85eb5e96826d81e3dcf88b8886012906f509b5"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e0ac8959c929593fee38da1c2b64ee9778733cdf03c482c9ff1d508b6b593b2b"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d7fc3fca01da18fbabe4625d64bb612b533533ed10045a2ac3dd194bfa656b60"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:04eefcee095f58eaabe6dc3cc2262f3bcd776d2c67005880894f447b3f2cb9c1"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:20064ead0717cf9a73a6d1e779b23d149b53daf971169289ed2ed43a71e8d3b0"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1435ae15108b1cb6fffbcea2af3d468683b7afed0169ad718451f8db5d1aff6f"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c84132a54c750fda57729d1e2599bb598f5fa0344085dbde5003ba429a4798c0"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75f2568b4189dda1c567339b48cba4ac7384accb9c2a7ed655cd86b04055c795"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:11d3bcb7be35e7b1bba2c23beedac81ee893ac9871d0ba79effc7fc01167db6c"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:891cf9b48776b5c61c700b55a598621fdb7b1e301a550365571e9624f270c203"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:5f008525e02908b20e04707a4f704cd286d94718f48bb33edddc7d7b584dddc1"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:b06f0d3bf045158d2fb8837c5785fe9ff9b8c93358be64461a1089f5da983137"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:49919f8400b5e49e961f320c735388ee686a62327e773fa5b3ce6721f7e785ce"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:22908891a380d50738e1f978667536f6c6b526a2064156203d418f4856d6e86a"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-win32.whl", hash = "sha256:12d1a39aa6b8c6f6248bb54550efcc1c38ce0d8096a146638fd4738e42284448"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:65ed923f84a6844de5fd29726b888e58c62820e0769b76565480e1fdc3d062f8"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9a3267620866c9d17b959a84dd0bd2d45719b817245e49371ead79ed4f710d19"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6734e606355834f13445b6adc38b53c0fd45f1a56a9ba06c2058f86893ae8017"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f8303414c7b03f794347ad062c0516cee0e15f7a612abd0ce1e25caf6ceb47df"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aaf53a6cebad0eae578f062c7d462155eada9c172bd8c4d250b8c1d8eb7f916a"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3dc5b6a8ecfdc5748a7e429782598e4f17ef378e3e272eeb1340ea57c9109f41"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e1b25e3ad6c909f398df8921780d6a3d120d8c09466720226fc621605b6f92b1"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0ca564606d2caafb0abe6d1b5311c2649e8071eb241b2d64e75a0d0065107e62"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b82fab78e0b1329e183a65260581de4375f619167478dddab510c6c6fb04d9b6"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:bd7163182133c0c7701b25e604cf1611c0d87712e56e88e7ee5d72deab3e76b5"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:11d117e6c63e8f495412d37e7dc2e2fff09c34b2d09dbe2bee3c6229577818be"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:cf6511efa4801b9b38dc5546d7547d5b5c6ef4b081c60b23e4d941d0eba9cbeb"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:abc1185d79f47c0a7aaf7e2412a0eb2c03b724581139193d2d82b3ad8cbb00ac"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:cb7b2ab0188829593b9de646545175547a70d9a6e2b63bf2cd87a0a391599324"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-win32.whl", hash = "sha256:c36bcbc0d5174a80d6cccf43a0ecaca44e81d25be4b7f90f0ed7bcfbb5a00909"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:cca4def576f47a09a943666b8f829606bcb17e2bc2d5911a46c8f8da45f56755"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0c95f12b74681e9ae127728f7e5409cbbef9cd914d5896ef238cc779b8152373"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fca62a8301b605b954ad2e9c3666f9d97f63872aa4efcae5492baca2056b74ab"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ac0aa6cd53ab9a31d397f8303f92c42f534693528fafbdb997c82bae6e477ad9"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c3af8e0f07399d3176b179f2e2634c3ce9c1301379a6b8c9c9aeecd481da494f"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3a5fc78f9e3f501a1614a98f7c54d3969f3ad9bba8ba3d9b438c3bc5d047dd28"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:628c985afb2c7d27a4800bfb609e03985aaecb42f955049957814e0491d4006d"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:74db0052d985cf37fa111828d0dd230776ac99c740e1a758ad99094be4f1803d"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:1e8fcdd8f672a1c4fc8d0bd3a2b576b152d2a349782d1eb0f6b8e52e9954731d"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:04afa6387e2b282cf78ff3dbce20f0cc071c12dc8f685bd40960cc68644cfea6"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:dd5653e67b149503c68c4018bf07e42eeed6b4e956b24c00ccdf93ac79cdff84"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d2686f91611f9e17f4548dbf050e75b079bbc2a82be565832bc8ea9047b61c8c"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-win32.whl", hash = "sha256:4155b51ae05ed47199dc5b2a4e62abccb274cee6b01da5b895099b61b1982974"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-win_amd64.whl", hash = "sha256:322102cdf1ab682ecc7d9b1c5eed4ec59657a65e1c146a0da342b78f4112db23"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:e633940f28c1e913615fd624fcdd72fdba807bf53ea6925d6a588e84e1151531"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3a06f32c9634a8705f4ca9946d667609f52cf130d5548881401f1eb2c39b1e2c"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7381c66e0561c5757ffe616af869b916c8b4e42b367ab29fedc98481d1e74e14"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3573d376454d956553c356df45bb824262c397c6e26ce43e8203c4c540ee0acb"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e89df2958e5159b811af9ff0f92614dabf4ff617c03a4c1c6ff53bf1c399e0e1"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:78cacd03e79d009d95635e7d6ff12c21eb89b894c354bd2b2ed0b4763373693b"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:de5695a6f1d8340b12a5d6d4484290ee74d61e467c39ff03b39e30df62cf83a0"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1c60b9c202d00052183c9be85e5eaf18a4ada0a47d188a83c8f5c5b23252f649"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:f645caaf0008bacf349875a974220f1f1da349c5dbe7c4ec93048cdc785a3326"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:ea9f9c6034ea2d93d9147818f17c2a0860d41b71c38b9ce4d55f21b6f9165a11"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:80d1543d58bd3d6c271b66abf454d437a438dff01c3e62fdbcd68f2a11310d4b"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:73dc03a6a7e30b7edc5b01b601e53e7fc924b04e1835e8e407c12c037e81adbd"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6f5c2e7bc8a4bf7c426599765b1bd33217ec84023033672c1e9a8b35eaeaaaf8"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-win32.whl", hash = "sha256:12a2b561af122e3d94cdb97fe6fb2bb2b82cef0cdca131646fdb940a1eda04f0"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:3160a0fd9754aab7d47f95a6b63ab355388d890163eb03b2d2b87ab0a30cfa59"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:38e812a197bf8e71a59fe55b757a84c1f946d0ac114acafaafaf21667a7e169e"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6baf0baf0d5d265fa7944feb9f7451cc316bfe30e8df1a61b1bb08577c554f31"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8f25e17ab3039b05f762b0a55ae0b3632b2e073d9c8fc88e89aca31a6198e88f"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3747443b6a904001473370d7810aa19c3a180ccd52a7157aacc264a5ac79265e"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b116502087ce8a6b7a5f1814568ccbd0e9f6cfd99948aa59b0e241dc57cf739f"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d16fd5252f883eb074ca55cb622bc0bee49b979ae4e8639fff6ca3ff44f9f854"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21fa558996782fc226b529fdd2ed7866c2c6ec91cee82735c98a197fae39f706"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6f6c7a8a57e9405cad7485f4c9d3172ae486cfef1344b5ddd8e5239582d7355e"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ac3775e3311661d4adace3697a52ac0bab17edd166087d493b52d4f4f553f9f0"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:10c93628d7497c81686e8e5e557aafa78f230cd9e77dd0c40032ef90c18f2230"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:6f4f4668e1831850ebcc2fd0b1cd11721947b6dc7c00bf1c6bd3c929ae14f2c7"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:0be65ccf618c1e7ac9b849c315cc2e8a8751d9cfdaa43027d4f6624bd587ab7e"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:53d0a3fa5f8af98a1e261de6a3943ca631c526635eb5817a87a59d9a57ebf48f"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-win32.whl", hash = "sha256:a04f86f41a8916fe45ac5024ec477f41f886b3c435da2d4e3d2709b22ab02af1"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:830d2948a5ec37c386d3170c483063798d7879037492540f10a475e3fd6f244b"},
+ {file = "charset_normalizer-3.1.0-py3-none-any.whl", hash = "sha256:3d9098b479e78c85080c98e1e35ff40b4a31d8953102bb0fd7d1b6f8a2111a3d"},
]
-[package.extras]
-unicode-backport = ["unicodedata2"]
-
[[package]]
name = "click"
-version = "7.1.2"
+version = "8.1.3"
description = "Composable command line interface toolkit"
category = "main"
optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+python-versions = ">=3.7"
files = [
- {file = "click-7.1.2-py2.py3-none-any.whl", hash = "sha256:dacca89f4bfadd5de3d7489b7c8a566eee0d3676333fbb50030263894c38c0dc"},
- {file = "click-7.1.2.tar.gz", hash = "sha256:d2b5255c7c6349bc1bd1e59e08cd12acbbd63ce649f2588755783aa94dfb6b1a"},
+ {file = "click-8.1.3-py3-none-any.whl", hash = "sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48"},
+ {file = "click-8.1.3.tar.gz", hash = "sha256:7682dc8afb30297001674575ea00d1814d808d6a36af415a82bd481d37ba7b8e"},
]
+[package.dependencies]
+colorama = {version = "*", markers = "platform_system == \"Windows\""}
+
[[package]]
name = "clique"
version = "1.6.1"
@@ -530,7 +617,7 @@ test = ["pytest (>=2.3.5,<5)", "pytest-cov (>=2,<3)", "pytest-runner (>=2.7,<3)"
name = "colorama"
version = "0.4.6"
description = "Cross-platform colored terminal text."
-category = "dev"
+category = "main"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
files = [
@@ -567,63 +654,72 @@ files = [
[[package]]
name = "coverage"
-version = "7.0.5"
+version = "7.2.7"
description = "Code coverage measurement for Python"
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
- {file = "coverage-7.0.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2a7f23bbaeb2a87f90f607730b45564076d870f1fb07b9318d0c21f36871932b"},
- {file = "coverage-7.0.5-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c18d47f314b950dbf24a41787ced1474e01ca816011925976d90a88b27c22b89"},
- {file = "coverage-7.0.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ef14d75d86f104f03dea66c13188487151760ef25dd6b2dbd541885185f05f40"},
- {file = "coverage-7.0.5-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:66e50680e888840c0995f2ad766e726ce71ca682e3c5f4eee82272c7671d38a2"},
- {file = "coverage-7.0.5-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a9fed35ca8c6e946e877893bbac022e8563b94404a605af1d1e6accc7eb73289"},
- {file = "coverage-7.0.5-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d8d04e755934195bdc1db45ba9e040b8d20d046d04d6d77e71b3b34a8cc002d0"},
- {file = "coverage-7.0.5-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:7e109f1c9a3ece676597831874126555997c48f62bddbcace6ed17be3e372de8"},
- {file = "coverage-7.0.5-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0a1890fca2962c4f1ad16551d660b46ea77291fba2cc21c024cd527b9d9c8809"},
- {file = "coverage-7.0.5-cp310-cp310-win32.whl", hash = "sha256:be9fcf32c010da0ba40bf4ee01889d6c737658f4ddff160bd7eb9cac8f094b21"},
- {file = "coverage-7.0.5-cp310-cp310-win_amd64.whl", hash = "sha256:cbfcba14a3225b055a28b3199c3d81cd0ab37d2353ffd7f6fd64844cebab31ad"},
- {file = "coverage-7.0.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:30b5fec1d34cc932c1bc04017b538ce16bf84e239378b8f75220478645d11fca"},
- {file = "coverage-7.0.5-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1caed2367b32cc80a2b7f58a9f46658218a19c6cfe5bc234021966dc3daa01f0"},
- {file = "coverage-7.0.5-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d254666d29540a72d17cc0175746cfb03d5123db33e67d1020e42dae611dc196"},
- {file = "coverage-7.0.5-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:19245c249aa711d954623d94f23cc94c0fd65865661f20b7781210cb97c471c0"},
- {file = "coverage-7.0.5-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7b05ed4b35bf6ee790832f68932baf1f00caa32283d66cc4d455c9e9d115aafc"},
- {file = "coverage-7.0.5-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:29de916ba1099ba2aab76aca101580006adfac5646de9b7c010a0f13867cba45"},
- {file = "coverage-7.0.5-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:e057e74e53db78122a3979f908973e171909a58ac20df05c33998d52e6d35757"},
- {file = "coverage-7.0.5-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:411d4ff9d041be08fdfc02adf62e89c735b9468f6d8f6427f8a14b6bb0a85095"},
- {file = "coverage-7.0.5-cp311-cp311-win32.whl", hash = "sha256:52ab14b9e09ce052237dfe12d6892dd39b0401690856bcfe75d5baba4bfe2831"},
- {file = "coverage-7.0.5-cp311-cp311-win_amd64.whl", hash = "sha256:1f66862d3a41674ebd8d1a7b6f5387fe5ce353f8719040a986551a545d7d83ea"},
- {file = "coverage-7.0.5-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b69522b168a6b64edf0c33ba53eac491c0a8f5cc94fa4337f9c6f4c8f2f5296c"},
- {file = "coverage-7.0.5-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:436e103950d05b7d7f55e39beeb4d5be298ca3e119e0589c0227e6d0b01ee8c7"},
- {file = "coverage-7.0.5-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b8c56bec53d6e3154eaff6ea941226e7bd7cc0d99f9b3756c2520fc7a94e6d96"},
- {file = "coverage-7.0.5-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7a38362528a9115a4e276e65eeabf67dcfaf57698e17ae388599568a78dcb029"},
- {file = "coverage-7.0.5-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:f67472c09a0c7486e27f3275f617c964d25e35727af952869dd496b9b5b7f6a3"},
- {file = "coverage-7.0.5-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:220e3fa77d14c8a507b2d951e463b57a1f7810a6443a26f9b7591ef39047b1b2"},
- {file = "coverage-7.0.5-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:ecb0f73954892f98611e183f50acdc9e21a4653f294dfbe079da73c6378a6f47"},
- {file = "coverage-7.0.5-cp37-cp37m-win32.whl", hash = "sha256:d8f3e2e0a1d6777e58e834fd5a04657f66affa615dae61dd67c35d1568c38882"},
- {file = "coverage-7.0.5-cp37-cp37m-win_amd64.whl", hash = "sha256:9e662e6fc4f513b79da5d10a23edd2b87685815b337b1a30cd11307a6679148d"},
- {file = "coverage-7.0.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:790e4433962c9f454e213b21b0fd4b42310ade9c077e8edcb5113db0818450cb"},
- {file = "coverage-7.0.5-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:49640bda9bda35b057b0e65b7c43ba706fa2335c9a9896652aebe0fa399e80e6"},
- {file = "coverage-7.0.5-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d66187792bfe56f8c18ba986a0e4ae44856b1c645336bd2c776e3386da91e1dd"},
- {file = "coverage-7.0.5-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:276f4cd0001cd83b00817c8db76730938b1ee40f4993b6a905f40a7278103b3a"},
- {file = "coverage-7.0.5-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95304068686545aa368b35dfda1cdfbbdbe2f6fe43de4a2e9baa8ebd71be46e2"},
- {file = "coverage-7.0.5-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:17e01dd8666c445025c29684d4aabf5a90dc6ef1ab25328aa52bedaa95b65ad7"},
- {file = "coverage-7.0.5-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:ea76dbcad0b7b0deb265d8c36e0801abcddf6cc1395940a24e3595288b405ca0"},
- {file = "coverage-7.0.5-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:50a6adc2be8edd7ee67d1abc3cd20678987c7b9d79cd265de55941e3d0d56499"},
- {file = "coverage-7.0.5-cp38-cp38-win32.whl", hash = "sha256:e4ce984133b888cc3a46867c8b4372c7dee9cee300335e2925e197bcd45b9e16"},
- {file = "coverage-7.0.5-cp38-cp38-win_amd64.whl", hash = "sha256:4a950f83fd3f9bca23b77442f3a2b2ea4ac900944d8af9993743774c4fdc57af"},
- {file = "coverage-7.0.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3c2155943896ac78b9b0fd910fb381186d0c345911f5333ee46ac44c8f0e43ab"},
- {file = "coverage-7.0.5-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:54f7e9705e14b2c9f6abdeb127c390f679f6dbe64ba732788d3015f7f76ef637"},
- {file = "coverage-7.0.5-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ee30375b409d9a7ea0f30c50645d436b6f5dfee254edffd27e45a980ad2c7f4"},
- {file = "coverage-7.0.5-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b78729038abea6a5df0d2708dce21e82073463b2d79d10884d7d591e0f385ded"},
- {file = "coverage-7.0.5-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:13250b1f0bd023e0c9f11838bdeb60214dd5b6aaf8e8d2f110c7e232a1bff83b"},
- {file = "coverage-7.0.5-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:2c407b1950b2d2ffa091f4e225ca19a66a9bd81222f27c56bd12658fc5ca1209"},
- {file = "coverage-7.0.5-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:c76a3075e96b9c9ff00df8b5f7f560f5634dffd1658bafb79eb2682867e94f78"},
- {file = "coverage-7.0.5-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:f26648e1b3b03b6022b48a9b910d0ae209e2d51f50441db5dce5b530fad6d9b1"},
- {file = "coverage-7.0.5-cp39-cp39-win32.whl", hash = "sha256:ba3027deb7abf02859aca49c865ece538aee56dcb4871b4cced23ba4d5088904"},
- {file = "coverage-7.0.5-cp39-cp39-win_amd64.whl", hash = "sha256:949844af60ee96a376aac1ded2a27e134b8c8d35cc006a52903fc06c24a3296f"},
- {file = "coverage-7.0.5-pp37.pp38.pp39-none-any.whl", hash = "sha256:b9727ac4f5cf2cbf87880a63870b5b9730a8ae3a4a360241a0fdaa2f71240ff0"},
- {file = "coverage-7.0.5.tar.gz", hash = "sha256:051afcbd6d2ac39298d62d340f94dbb6a1f31de06dfaf6fcef7b759dd3860c45"},
+ {file = "coverage-7.2.7-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d39b5b4f2a66ccae8b7263ac3c8170994b65266797fb96cbbfd3fb5b23921db8"},
+ {file = "coverage-7.2.7-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:6d040ef7c9859bb11dfeb056ff5b3872436e3b5e401817d87a31e1750b9ae2fb"},
+ {file = "coverage-7.2.7-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ba90a9563ba44a72fda2e85302c3abc71c5589cea608ca16c22b9804262aaeb6"},
+ {file = "coverage-7.2.7-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e7d9405291c6928619403db1d10bd07888888ec1abcbd9748fdaa971d7d661b2"},
+ {file = "coverage-7.2.7-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:31563e97dae5598556600466ad9beea39fb04e0229e61c12eaa206e0aa202063"},
+ {file = "coverage-7.2.7-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:ebba1cd308ef115925421d3e6a586e655ca5a77b5bf41e02eb0e4562a111f2d1"},
+ {file = "coverage-7.2.7-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:cb017fd1b2603ef59e374ba2063f593abe0fc45f2ad9abdde5b4d83bd922a353"},
+ {file = "coverage-7.2.7-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d62a5c7dad11015c66fbb9d881bc4caa5b12f16292f857842d9d1871595f4495"},
+ {file = "coverage-7.2.7-cp310-cp310-win32.whl", hash = "sha256:ee57190f24fba796e36bb6d3aa8a8783c643d8fa9760c89f7a98ab5455fbf818"},
+ {file = "coverage-7.2.7-cp310-cp310-win_amd64.whl", hash = "sha256:f75f7168ab25dd93110c8a8117a22450c19976afbc44234cbf71481094c1b850"},
+ {file = "coverage-7.2.7-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:06a9a2be0b5b576c3f18f1a241f0473575c4a26021b52b2a85263a00f034d51f"},
+ {file = "coverage-7.2.7-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:5baa06420f837184130752b7c5ea0808762083bf3487b5038d68b012e5937dbe"},
+ {file = "coverage-7.2.7-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fdec9e8cbf13a5bf63290fc6013d216a4c7232efb51548594ca3631a7f13c3a3"},
+ {file = "coverage-7.2.7-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:52edc1a60c0d34afa421c9c37078817b2e67a392cab17d97283b64c5833f427f"},
+ {file = "coverage-7.2.7-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:63426706118b7f5cf6bb6c895dc215d8a418d5952544042c8a2d9fe87fcf09cb"},
+ {file = "coverage-7.2.7-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:afb17f84d56068a7c29f5fa37bfd38d5aba69e3304af08ee94da8ed5b0865833"},
+ {file = "coverage-7.2.7-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:48c19d2159d433ccc99e729ceae7d5293fbffa0bdb94952d3579983d1c8c9d97"},
+ {file = "coverage-7.2.7-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:0e1f928eaf5469c11e886fe0885ad2bf1ec606434e79842a879277895a50942a"},
+ {file = "coverage-7.2.7-cp311-cp311-win32.whl", hash = "sha256:33d6d3ea29d5b3a1a632b3c4e4f4ecae24ef170b0b9ee493883f2df10039959a"},
+ {file = "coverage-7.2.7-cp311-cp311-win_amd64.whl", hash = "sha256:5b7540161790b2f28143191f5f8ec02fb132660ff175b7747b95dcb77ac26562"},
+ {file = "coverage-7.2.7-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:f2f67fe12b22cd130d34d0ef79206061bfb5eda52feb6ce0dba0644e20a03cf4"},
+ {file = "coverage-7.2.7-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a342242fe22407f3c17f4b499276a02b01e80f861f1682ad1d95b04018e0c0d4"},
+ {file = "coverage-7.2.7-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:171717c7cb6b453aebac9a2ef603699da237f341b38eebfee9be75d27dc38e01"},
+ {file = "coverage-7.2.7-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:49969a9f7ffa086d973d91cec8d2e31080436ef0fb4a359cae927e742abfaaa6"},
+ {file = "coverage-7.2.7-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:b46517c02ccd08092f4fa99f24c3b83d8f92f739b4657b0f146246a0ca6a831d"},
+ {file = "coverage-7.2.7-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:a3d33a6b3eae87ceaefa91ffdc130b5e8536182cd6dfdbfc1aa56b46ff8c86de"},
+ {file = "coverage-7.2.7-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:976b9c42fb2a43ebf304fa7d4a310e5f16cc99992f33eced91ef6f908bd8f33d"},
+ {file = "coverage-7.2.7-cp312-cp312-win32.whl", hash = "sha256:8de8bb0e5ad103888d65abef8bca41ab93721647590a3f740100cd65c3b00511"},
+ {file = "coverage-7.2.7-cp312-cp312-win_amd64.whl", hash = "sha256:9e31cb64d7de6b6f09702bb27c02d1904b3aebfca610c12772452c4e6c21a0d3"},
+ {file = "coverage-7.2.7-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:58c2ccc2f00ecb51253cbe5d8d7122a34590fac9646a960d1430d5b15321d95f"},
+ {file = "coverage-7.2.7-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d22656368f0e6189e24722214ed8d66b8022db19d182927b9a248a2a8a2f67eb"},
+ {file = "coverage-7.2.7-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a895fcc7b15c3fc72beb43cdcbdf0ddb7d2ebc959edac9cef390b0d14f39f8a9"},
+ {file = "coverage-7.2.7-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e84606b74eb7de6ff581a7915e2dab7a28a0517fbe1c9239eb227e1354064dcd"},
+ {file = "coverage-7.2.7-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:0a5f9e1dbd7fbe30196578ca36f3fba75376fb99888c395c5880b355e2875f8a"},
+ {file = "coverage-7.2.7-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:419bfd2caae268623dd469eff96d510a920c90928b60f2073d79f8fe2bbc5959"},
+ {file = "coverage-7.2.7-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:2aee274c46590717f38ae5e4650988d1af340fe06167546cc32fe2f58ed05b02"},
+ {file = "coverage-7.2.7-cp37-cp37m-win32.whl", hash = "sha256:61b9a528fb348373c433e8966535074b802c7a5d7f23c4f421e6c6e2f1697a6f"},
+ {file = "coverage-7.2.7-cp37-cp37m-win_amd64.whl", hash = "sha256:b1c546aca0ca4d028901d825015dc8e4d56aac4b541877690eb76490f1dc8ed0"},
+ {file = "coverage-7.2.7-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:54b896376ab563bd38453cecb813c295cf347cf5906e8b41d340b0321a5433e5"},
+ {file = "coverage-7.2.7-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:3d376df58cc111dc8e21e3b6e24606b5bb5dee6024f46a5abca99124b2229ef5"},
+ {file = "coverage-7.2.7-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5e330fc79bd7207e46c7d7fd2bb4af2963f5f635703925543a70b99574b0fea9"},
+ {file = "coverage-7.2.7-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e9d683426464e4a252bf70c3498756055016f99ddaec3774bf368e76bbe02b6"},
+ {file = "coverage-7.2.7-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d13c64ee2d33eccf7437961b6ea7ad8673e2be040b4f7fd4fd4d4d28d9ccb1e"},
+ {file = "coverage-7.2.7-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:b7aa5f8a41217360e600da646004f878250a0d6738bcdc11a0a39928d7dc2050"},
+ {file = "coverage-7.2.7-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:8fa03bce9bfbeeef9f3b160a8bed39a221d82308b4152b27d82d8daa7041fee5"},
+ {file = "coverage-7.2.7-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:245167dd26180ab4c91d5e1496a30be4cd721a5cf2abf52974f965f10f11419f"},
+ {file = "coverage-7.2.7-cp38-cp38-win32.whl", hash = "sha256:d2c2db7fd82e9b72937969bceac4d6ca89660db0a0967614ce2481e81a0b771e"},
+ {file = "coverage-7.2.7-cp38-cp38-win_amd64.whl", hash = "sha256:2e07b54284e381531c87f785f613b833569c14ecacdcb85d56b25c4622c16c3c"},
+ {file = "coverage-7.2.7-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:537891ae8ce59ef63d0123f7ac9e2ae0fc8b72c7ccbe5296fec45fd68967b6c9"},
+ {file = "coverage-7.2.7-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:06fb182e69f33f6cd1d39a6c597294cff3143554b64b9825d1dc69d18cc2fff2"},
+ {file = "coverage-7.2.7-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:201e7389591af40950a6480bd9edfa8ed04346ff80002cec1a66cac4549c1ad7"},
+ {file = "coverage-7.2.7-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f6951407391b639504e3b3be51b7ba5f3528adbf1a8ac3302b687ecababf929e"},
+ {file = "coverage-7.2.7-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6f48351d66575f535669306aa7d6d6f71bc43372473b54a832222803eb956fd1"},
+ {file = "coverage-7.2.7-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b29019c76039dc3c0fd815c41392a044ce555d9bcdd38b0fb60fb4cd8e475ba9"},
+ {file = "coverage-7.2.7-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:81c13a1fc7468c40f13420732805a4c38a105d89848b7c10af65a90beff25250"},
+ {file = "coverage-7.2.7-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:975d70ab7e3c80a3fe86001d8751f6778905ec723f5b110aed1e450da9d4b7f2"},
+ {file = "coverage-7.2.7-cp39-cp39-win32.whl", hash = "sha256:7ee7d9d4822c8acc74a5e26c50604dff824710bc8de424904c0982e25c39c6cb"},
+ {file = "coverage-7.2.7-cp39-cp39-win_amd64.whl", hash = "sha256:eb393e5ebc85245347950143969b241d08b52b88a3dc39479822e073a1a8eb27"},
+ {file = "coverage-7.2.7-pp37.pp38.pp39-none-any.whl", hash = "sha256:b7b4c971f05e6ae490fef852c218b0e79d4e52f79ef0c8475566584a8fb3e01d"},
+ {file = "coverage-7.2.7.tar.gz", hash = "sha256:924d94291ca674905fe9481f12294eb11f2d3d3fd1adb20314ba89e94f44ed59"},
]
[package.dependencies]
@@ -765,24 +861,6 @@ files = [
{file = "cx_Logging-3.1.0.tar.gz", hash = "sha256:8a06834d8527aa904a68b25c9c1a5fa09f0dfdc94dbd9f86b81cd8d2f7a0e487"},
]
-[[package]]
-name = "deprecated"
-version = "1.2.13"
-description = "Python @deprecated decorator to deprecate old python classes, functions or methods."
-category = "main"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
-files = [
- {file = "Deprecated-1.2.13-py2.py3-none-any.whl", hash = "sha256:64756e3e14c8c5eea9795d93c524551432a0be75629f8f29e67ab8caf076c76d"},
- {file = "Deprecated-1.2.13.tar.gz", hash = "sha256:43ac5335da90c31c24ba028af536a91d41d53f9e6901ddb021bcc572ce44e38d"},
-]
-
-[package.dependencies]
-wrapt = ">=1.10,<2"
-
-[package.extras]
-dev = ["PyTest", "PyTest (<5)", "PyTest-Cov", "PyTest-Cov (<2.6)", "bump2version (<1)", "configparser (<5)", "importlib-metadata (<3)", "importlib-resources (<4)", "sphinx (<2)", "sphinxcontrib-websupport (<2)", "tox", "zipp (<2)"]
-
[[package]]
name = "dill"
version = "0.3.6"
@@ -863,14 +941,14 @@ stone = ">=2"
[[package]]
name = "enlighten"
-version = "1.11.1"
+version = "1.11.2"
description = "Enlighten Progress Bar"
category = "main"
optional = false
python-versions = "*"
files = [
- {file = "enlighten-1.11.1-py2.py3-none-any.whl", hash = "sha256:e825eb534ca80778bb7d46e5581527b2a6fae559b6cf09e290a7952c6e11961e"},
- {file = "enlighten-1.11.1.tar.gz", hash = "sha256:57abd98a3d3f83484ef9f91f9255f4d23c8b3097ecdb647c7b9b0049d600b7f8"},
+ {file = "enlighten-1.11.2-py2.py3-none-any.whl", hash = "sha256:98c9eb20e022b6a57f1c8d4f17e16760780b6881e6d658c40f52d21255ea45f3"},
+ {file = "enlighten-1.11.2.tar.gz", hash = "sha256:9284861dee5a272e0e1a3758cd3f3b7180b1bd1754875da76876f2a7f46ccb61"},
]
[package.dependencies]
@@ -879,30 +957,30 @@ prefixed = ">=0.3.2"
[[package]]
name = "evdev"
-version = "1.6.0"
+version = "1.6.1"
description = "Bindings to the Linux input handling subsystem"
category = "main"
optional = false
python-versions = "*"
files = [
- {file = "evdev-1.6.0.tar.gz", hash = "sha256:ecfa01b5c84f7e8c6ced3367ac95288f43cd84efbfd7dd7d0cdbfc0d18c87a6a"},
+ {file = "evdev-1.6.1.tar.gz", hash = "sha256:299db8628cc73b237fc1cc57d3c2948faa0756e2a58b6194b5bf81dc2081f1e3"},
]
[[package]]
name = "filelock"
-version = "3.9.0"
+version = "3.12.0"
description = "A platform independent file lock."
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
- {file = "filelock-3.9.0-py3-none-any.whl", hash = "sha256:f58d535af89bb9ad5cd4df046f741f8553a418c01a7856bf0d173bbc9f6bd16d"},
- {file = "filelock-3.9.0.tar.gz", hash = "sha256:7b319f24340b51f55a2bf7a12ac0755a9b03e718311dac567a0f4f7fabd2f5de"},
+ {file = "filelock-3.12.0-py3-none-any.whl", hash = "sha256:ad98852315c2ab702aeb628412cbf7e95b7ce8c3bf9565670b4eaecf1db370a9"},
+ {file = "filelock-3.12.0.tar.gz", hash = "sha256:fc03ae43288c013d2ea83c8597001b1129db351aad9c57fe2409327916b8e718"},
]
[package.extras]
-docs = ["furo (>=2022.12.7)", "sphinx (>=5.3)", "sphinx-autodoc-typehints (>=1.19.5)"]
-testing = ["covdefaults (>=2.2.2)", "coverage (>=7.0.1)", "pytest (>=7.2)", "pytest-cov (>=4)", "pytest-timeout (>=2.1)"]
+docs = ["furo (>=2023.3.27)", "sphinx (>=6.1.3)", "sphinx-autodoc-typehints (>=1.23,!=1.23.4)"]
+testing = ["covdefaults (>=2.3)", "coverage (>=7.2.3)", "diff-cover (>=7.5)", "pytest (>=7.3.1)", "pytest-cov (>=4)", "pytest-mock (>=3.10)", "pytest-timeout (>=2.1)"]
[[package]]
name = "flake8"
@@ -1007,14 +1085,14 @@ files = [
[[package]]
name = "ftrack-python-api"
-version = "2.3.3"
+version = "2.5.0"
description = "Python API for ftrack."
category = "main"
optional = false
-python-versions = ">=2.7.9, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, < 3.10"
+python-versions = ">=2.7.9, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
files = [
- {file = "ftrack-python-api-2.3.3.tar.gz", hash = "sha256:358f37e5b1c5635eab107c19e27a0c890d512877f78af35b1ac416e90c037295"},
- {file = "ftrack_python_api-2.3.3-py2.py3-none-any.whl", hash = "sha256:82834c4d5def5557a2ea547a7e6f6ba84d3129e8f90457d8bbd85b287a2c39f6"},
+ {file = "ftrack-python-api-2.5.0.tar.gz", hash = "sha256:95205022552b1abadec5e9dcb225762b8e8b9f16ebeadba374e56c25e69e6954"},
+ {file = "ftrack_python_api-2.5.0-py2.py3-none-any.whl", hash = "sha256:59ef3f1d47e5c1df8c3f7ebcc937bbc9a5613b147f9ed083f10cff6370f0750d"},
]
[package.dependencies]
@@ -1041,23 +1119,23 @@ files = [
[[package]]
name = "gazu"
-version = "0.8.34"
+version = "0.9.3"
description = "Gazu is a client for Zou, the API to store the data of your CG production."
category = "main"
optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+python-versions = ">= 2.7, != 3.0.*, != 3.1.*, != 3.2.*, != 3.3.*, != 3.4.*, != 3.5.*, != 3.6.1, != 3.6.2"
files = [
- {file = "gazu-0.8.34-py2.py3-none-any.whl", hash = "sha256:a78a8c5e61108aeaab6185646af78b0402dbdb29097e8ba5882bd55410f38c4b"},
+ {file = "gazu-0.9.3-py2.py3-none-any.whl", hash = "sha256:daa6e4bdaa364b68a048ad97837aec011a0060d12edc3a5ac6ae34c13a05cb2b"},
]
[package.dependencies]
-deprecated = "1.2.13"
-python-socketio = {version = "4.6.1", extras = ["client"], markers = "python_version >= \"3.5\""}
-requests = ">=2.25.1,<=2.28.1"
+python-socketio = {version = "5.8.0", extras = ["client"], markers = "python_version != \"2.7\""}
+requests = ">=2.25.1"
[package.extras]
dev = ["wheel"]
-test = ["black (<=22.8.0)", "pre-commit (<=2.20.0)", "pytest (<=7.1.3)", "pytest-cov (<=3.0.0)", "requests-mock (==1.10.0)"]
+lint = ["black (==23.3.0)", "pre-commit (==3.2.2)"]
+test = ["pytest", "pytest-cov", "requests-mock"]
[[package]]
name = "gitdb"
@@ -1076,14 +1154,14 @@ smmap = ">=3.0.1,<6"
[[package]]
name = "gitpython"
-version = "3.1.30"
-description = "GitPython is a python library used to interact with Git repositories"
+version = "3.1.31"
+description = "GitPython is a Python library used to interact with Git repositories"
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
- {file = "GitPython-3.1.30-py3-none-any.whl", hash = "sha256:cd455b0000615c60e286208ba540271af9fe531fa6a87cc590a7298785ab2882"},
- {file = "GitPython-3.1.30.tar.gz", hash = "sha256:769c2d83e13f5d938b7688479da374c4e3d49f71549aaf462b646db9602ea6f8"},
+ {file = "GitPython-3.1.31-py3-none-any.whl", hash = "sha256:f04893614f6aa713a60cbbe1e6a97403ef633103cdd0ef5eb6efe0deb98dbe8d"},
+ {file = "GitPython-3.1.31.tar.gz", hash = "sha256:8ce3bcf69adfdf7c7d503e78fd3b1c492af782d58893b650adb2ac8912ddd573"},
]
[package.dependencies]
@@ -1134,14 +1212,14 @@ uritemplate = ">=3.0.0,<4dev"
[[package]]
name = "google-auth"
-version = "2.16.0"
+version = "2.17.3"
description = "Google Authentication Library"
category = "main"
optional = false
python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*"
files = [
- {file = "google-auth-2.16.0.tar.gz", hash = "sha256:ed7057a101af1146f0554a769930ac9de506aeca4fd5af6543ebe791851a9fbd"},
- {file = "google_auth-2.16.0-py2.py3-none-any.whl", hash = "sha256:5045648c821fb72384cdc0e82cc326df195f113a33049d9b62b74589243d2acc"},
+ {file = "google-auth-2.17.3.tar.gz", hash = "sha256:ce311e2bc58b130fddf316df57c9b3943c2a7b4f6ec31de9663a9333e4064efc"},
+ {file = "google_auth-2.17.3-py2.py3-none-any.whl", hash = "sha256:f586b274d3eb7bd932ea424b1c702a30e0393a2e2bc4ca3eae8263ffd8be229f"},
]
[package.dependencies]
@@ -1176,14 +1254,14 @@ six = "*"
[[package]]
name = "googleapis-common-protos"
-version = "1.58.0"
+version = "1.59.0"
description = "Common protobufs used in Google APIs"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "googleapis-common-protos-1.58.0.tar.gz", hash = "sha256:c727251ec025947d545184ba17e3578840fc3a24a0516a020479edab660457df"},
- {file = "googleapis_common_protos-1.58.0-py2.py3-none-any.whl", hash = "sha256:ca3befcd4580dab6ad49356b46bf165bb68ff4b32389f028f1abd7c10ab9519a"},
+ {file = "googleapis-common-protos-1.59.0.tar.gz", hash = "sha256:4168fcb568a826a52f23510412da405abd93f4d23ba544bb68d943b14ba3cb44"},
+ {file = "googleapis_common_protos-1.59.0-py2.py3-none-any.whl", hash = "sha256:b287dc48449d1d41af0c69f4ea26242b5ae4c3d7249a38b0984c86a4caffff1f"},
]
[package.dependencies]
@@ -1194,14 +1272,14 @@ grpc = ["grpcio (>=1.44.0,<2.0.0dev)"]
[[package]]
name = "httplib2"
-version = "0.21.0"
+version = "0.22.0"
description = "A comprehensive HTTP client library."
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
- {file = "httplib2-0.21.0-py3-none-any.whl", hash = "sha256:987c8bb3eb82d3fa60c68699510a692aa2ad9c4bd4f123e51dfb1488c14cdd01"},
- {file = "httplib2-0.21.0.tar.gz", hash = "sha256:fc144f091c7286b82bec71bdbd9b27323ba709cc612568d3000893bfd9cb4b34"},
+ {file = "httplib2-0.22.0-py3-none-any.whl", hash = "sha256:14ae0a53c1ba8f3d37e9e27cf37eabb0fb9980f435ba405d546948b009dd64dc"},
+ {file = "httplib2-0.22.0.tar.gz", hash = "sha256:d7a10bc5ef5ab08322488bde8c726eeee5c8618723fdb399597ec58f3d82df81"},
]
[package.dependencies]
@@ -1209,14 +1287,14 @@ pyparsing = {version = ">=2.4.2,<3.0.0 || >3.0.0,<3.0.1 || >3.0.1,<3.0.2 || >3.0
[[package]]
name = "identify"
-version = "2.5.13"
+version = "2.5.24"
description = "File identification library for Python"
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
- {file = "identify-2.5.13-py2.py3-none-any.whl", hash = "sha256:8aa48ce56e38c28b6faa9f261075dea0a942dfbb42b341b4e711896cbb40f3f7"},
- {file = "identify-2.5.13.tar.gz", hash = "sha256:abb546bca6f470228785338a01b539de8a85bbf46491250ae03363956d8ebb10"},
+ {file = "identify-2.5.24-py2.py3-none-any.whl", hash = "sha256:986dbfb38b1140e763e413e6feb44cd731faf72d1909543178aa79b0e258265d"},
+ {file = "identify-2.5.24.tar.gz", hash = "sha256:0aac67d5b4812498056d28a9a512a483f5085cc28640b02b258a59dac34301d4"},
]
[package.extras]
@@ -1248,14 +1326,14 @@ files = [
[[package]]
name = "importlib-metadata"
-version = "6.0.0"
+version = "6.6.0"
description = "Read metadata from Python packages"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "importlib_metadata-6.0.0-py3-none-any.whl", hash = "sha256:7efb448ec9a5e313a57655d35aa54cd3e01b7e1fbcf72dce1bf06119420f5bad"},
- {file = "importlib_metadata-6.0.0.tar.gz", hash = "sha256:e354bedeb60efa6affdcc8ae121b73544a7aa74156d047311948f6d711cd378d"},
+ {file = "importlib_metadata-6.6.0-py3-none-any.whl", hash = "sha256:43dd286a2cd8995d5eaef7fee2066340423b818ed3fd70adf0bad5f1fac53fed"},
+ {file = "importlib_metadata-6.6.0.tar.gz", hash = "sha256:92501cdf9cc66ebd3e612f1b4f0c0765dfa42f0fa38ffb319b6bd84dd675d705"},
]
[package.dependencies]
@@ -1280,19 +1358,19 @@ files = [
[[package]]
name = "isort"
-version = "5.11.4"
+version = "5.12.0"
description = "A Python utility / library to sort Python imports."
category = "dev"
optional = false
-python-versions = ">=3.7.0"
+python-versions = ">=3.8.0"
files = [
- {file = "isort-5.11.4-py3-none-any.whl", hash = "sha256:c033fd0edb91000a7f09527fe5c75321878f98322a77ddcc81adbd83724afb7b"},
- {file = "isort-5.11.4.tar.gz", hash = "sha256:6db30c5ded9815d813932c04c2f85a360bcdd35fed496f4d8f35495ef0a261b6"},
+ {file = "isort-5.12.0-py3-none-any.whl", hash = "sha256:f84c2818376e66cf843d497486ea8fed8700b340f308f076c6fb1229dff318b6"},
+ {file = "isort-5.12.0.tar.gz", hash = "sha256:8bef7dde241278824a6d83f44a544709b065191b95b6e50894bdc722fcba0504"},
]
[package.extras]
-colors = ["colorama (>=0.4.3,<0.5.0)"]
-pipfile-deprecated-finder = ["pipreqs", "requirementslib"]
+colors = ["colorama (>=0.4.3)"]
+pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"]
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
@@ -1448,48 +1526,58 @@ files = [
[[package]]
name = "lief"
-version = "0.12.3"
+version = "0.13.1"
description = "Library to instrument executable formats"
category = "dev"
optional = false
-python-versions = ">=3.6"
+python-versions = ">=3.8"
files = [
- {file = "lief-0.12.3-cp310-cp310-macosx_10_14_arm64.whl", hash = "sha256:66724f337e6a36cea1a9380f13b59923f276c49ca837becae2e7be93a2e245d9"},
- {file = "lief-0.12.3-cp310-cp310-macosx_10_14_x86_64.whl", hash = "sha256:6d18aafa2028587c98f6d4387bec94346e92f2b5a8a5002f70b1cf35b1c045cc"},
- {file = "lief-0.12.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d4f69d125caaa8d5ddb574f29cc83101e165ebea1a9f18ad042eb3544081a797"},
- {file = "lief-0.12.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c078d6230279ffd3bca717c79664fb8368666f610b577deb24b374607936e9c1"},
- {file = "lief-0.12.3-cp310-cp310-win32.whl", hash = "sha256:e3a6af926532d0aac9e7501946134513d63217bacba666e6f7f5a0b7e15ba236"},
- {file = "lief-0.12.3-cp310-cp310-win_amd64.whl", hash = "sha256:0750b72e3aa161e1fb0e2e7f571121ae05d2428aafd742ff05a7656ad2288447"},
- {file = "lief-0.12.3-cp311-cp311-macosx_10_14_arm64.whl", hash = "sha256:b5c123cb99a7879d754c059e299198b34e7e30e3b64cf22e8962013db0099f47"},
- {file = "lief-0.12.3-cp311-cp311-macosx_10_14_x86_64.whl", hash = "sha256:8bc58fa26a830df6178e36f112cb2bbdd65deff593f066d2d51434ff78386ba5"},
- {file = "lief-0.12.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:74ac6143ac6ccd813c9b068d9c5f1f9d55c8813c8b407387eb57de01c3db2d74"},
- {file = "lief-0.12.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:04eb6b70d646fb5bd6183575928ee23715550f161f2832cbcd8c6ff2071fb408"},
- {file = "lief-0.12.3-cp311-cp311-win32.whl", hash = "sha256:7e2d0a53c403769b04adcf8df92e83c5e25f9103a052aa7f17b0a9cf057735fb"},
- {file = "lief-0.12.3-cp311-cp311-win_amd64.whl", hash = "sha256:7f6395c12ee1bc4a5162f567cba96d0c72dfb660e7902e84d4f3029daf14fe33"},
- {file = "lief-0.12.3-cp36-cp36m-macosx_10_14_x86_64.whl", hash = "sha256:71327fdc764fd2b1f3cd371d8ac5e0b801bde32b71cfcf7dccee506d46768539"},
- {file = "lief-0.12.3-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d320fb80ed5b42b354b8e4f251ab05a51929c162c57c377b5e95ad4b1c1b415d"},
- {file = "lief-0.12.3-cp36-cp36m-win32.whl", hash = "sha256:176fa6c342dd480195cda34a20f62ac76dfae103b22ca7583b762e0b434ee1f3"},
- {file = "lief-0.12.3-cp36-cp36m-win_amd64.whl", hash = "sha256:3a18fe108fb82a2640864deef933731afe77413b1226551796ef2c373a1b3a2a"},
- {file = "lief-0.12.3-cp37-cp37m-macosx_10_14_x86_64.whl", hash = "sha256:c73e990cd2737d1060b8c1e8edcc128832806995b69d1d6bf191409e2cea7bde"},
- {file = "lief-0.12.3-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:5fa2b1c8ffe47ee66b2507c2bb4e3fd628965532b7888c0627d10e690b5ef20c"},
- {file = "lief-0.12.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f224e9a261e88099f86160f121d088d30894c2946e3e551cf11c678daadcf2b"},
- {file = "lief-0.12.3-cp37-cp37m-win32.whl", hash = "sha256:3481d7c9fb3d3a1acff53851f40efd1a5a05d354312d367294bc2e310b736826"},
- {file = "lief-0.12.3-cp37-cp37m-win_amd64.whl", hash = "sha256:4e5173e1be5ebf43594f4eb187cbcb04758761942bc0a1e685ea1cb9047dc0d9"},
- {file = "lief-0.12.3-cp38-cp38-macosx_10_14_x86_64.whl", hash = "sha256:54d6a45e01260b9c8bf1c99f58257cff5338aee5c02eacfeee789f9d15cf38c6"},
- {file = "lief-0.12.3-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:4501dc399fb15dc7a3c8df4a76264a86be6d581d99098dafc3a67626149d8ff1"},
- {file = "lief-0.12.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c848aadac0816268aeb9dde7cefdb54bf24f78e664a19e97e74c92d3be1bb147"},
- {file = "lief-0.12.3-cp38-cp38-win32.whl", hash = "sha256:d7e35f9ee9dd6e79add3b343f83659b71c05189e5cb224e02a1902ddc7654e96"},
- {file = "lief-0.12.3-cp38-cp38-win_amd64.whl", hash = "sha256:b00667257b43e93d94166c959055b6147d46d302598f3ee55c194b40414c89cc"},
- {file = "lief-0.12.3-cp39-cp39-macosx_10_14_arm64.whl", hash = "sha256:e6a1b5b389090d524621c2455795e1262f62dc9381bedd96f0cd72b878c4066d"},
- {file = "lief-0.12.3-cp39-cp39-macosx_10_14_x86_64.whl", hash = "sha256:ae773196df814202c0c51056163a1478941b299512b09660a3c37be3c7fac81e"},
- {file = "lief-0.12.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:66ddf88917ec7b00752687c476bb2771dc8ec19bd7e4c0dcff1f8ef774cad4e9"},
- {file = "lief-0.12.3-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:4a47f410032c63ac3be051d963d0337d6b47f0e94bfe8e946ab4b6c428f4d0f8"},
- {file = "lief-0.12.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dbd11367c2259bd1131a6c8755dcde33314324de5ea029227bfbc7d3755871e6"},
- {file = "lief-0.12.3-cp39-cp39-win32.whl", hash = "sha256:2ce53e311918c3e5b54c815ef420a747208d2a88200c41cd476f3dd1eb876bcf"},
- {file = "lief-0.12.3-cp39-cp39-win_amd64.whl", hash = "sha256:446e53ccf0ebd1616c5d573470662ff71ca6df3cd62ec1764e303764f3f03cca"},
- {file = "lief-0.12.3.zip", hash = "sha256:62e81d2f1a827d43152aed12446a604627e8833493a51dca027026eed0ce7128"},
+ {file = "lief-0.13.1-cp310-cp310-macosx_10_14_x86_64.whl", hash = "sha256:b53317d78f8b7528e3f2f358b3f9334a1a84fae88c5aec1a3b7717ed31bfb066"},
+ {file = "lief-0.13.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bb8b285a6c670df590c36fc0c19b9d2e32b99f17e57afa29bb3052f1d55aa50f"},
+ {file = "lief-0.13.1-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:be871116faa698b6d9da76b0caec2ec5b7e7b8781cfb3a4ac0c4e348fb37ab49"},
+ {file = "lief-0.13.1-cp310-cp310-manylinux_2_24_x86_64.whl", hash = "sha256:c6839df875e912edd3fc553ab5d1b916527adee9c57ba85c69314a93f7ba2e15"},
+ {file = "lief-0.13.1-cp310-cp310-win32.whl", hash = "sha256:b1f295dbb57094443926ac6051bee9a1945d92344f470da1cb506060eb2f91ac"},
+ {file = "lief-0.13.1-cp310-cp310-win_amd64.whl", hash = "sha256:8439805a389cc67b6d4ea7d757a3211f22298edce53c5b064fdf8bf05fabba54"},
+ {file = "lief-0.13.1-cp311-cp311-macosx_10_14_x86_64.whl", hash = "sha256:3cfbc6c50f9e3a8015cd5ee88dfe83f423562c025439143bbd5c086a3f9fe599"},
+ {file = "lief-0.13.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:661abaa48bc032b9a7529e0b73d2ced3e4a1f13381592f6b9e940750b07a5ac2"},
+ {file = "lief-0.13.1-cp311-cp311-manylinux2014_aarch64.whl", hash = "sha256:23617d96d162081f8bf315d9b0494845891f8d0f04ad60991b83367ee9e261aa"},
+ {file = "lief-0.13.1-cp311-cp311-manylinux_2_24_x86_64.whl", hash = "sha256:aa7f45c5125be80a513624d3a5f6bd50751c2edc6de5357fde218580111c8535"},
+ {file = "lief-0.13.1-cp311-cp311-win32.whl", hash = "sha256:018b542f09fe2305e1585a3e63a7e5132927b835062b456e5c8c571db7784d1e"},
+ {file = "lief-0.13.1-cp311-cp311-win_amd64.whl", hash = "sha256:bfbf8885a3643ea9aaf663d039f50ca58b228886c3fe412725b22851aeda3b77"},
+ {file = "lief-0.13.1-cp38-cp38-macosx_10_14_x86_64.whl", hash = "sha256:a0472636ab15b9afecf8b5d55966912af8cb4de2f05b98fc05c87d51880d0208"},
+ {file = "lief-0.13.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:ccfba33c02f21d4ede26ab85eb6539a00e74e236569c13dcbab2e157b73673c4"},
+ {file = "lief-0.13.1-cp38-cp38-manylinux_2_24_x86_64.whl", hash = "sha256:e414d6c23f26053f4824d080885ab1b75482122796cba7d09cbf157900646289"},
+ {file = "lief-0.13.1-cp38-cp38-win32.whl", hash = "sha256:a18fee5cf69adf9d5ee977778ccd46c39c450960f806231b26b69011f81bc712"},
+ {file = "lief-0.13.1-cp38-cp38-win_amd64.whl", hash = "sha256:04c87039d1e68ebc467f83136179626403547dd1ce851541345f8ca0b1fe6c5b"},
+ {file = "lief-0.13.1-cp39-cp39-macosx_10_14_x86_64.whl", hash = "sha256:0283a4c749afe58be8e21cdd9be79c657c51ca9b8346f75f4b97349b1f022851"},
+ {file = "lief-0.13.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:95a4b6d1f8dba9360aecf7542e54ce5eb02c0e88f2d827b5445594d5d51109f5"},
+ {file = "lief-0.13.1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:16753bd72b1e3932d94d088a93b64e08c1f6c8bce1b064b47fe66ed73d9562b2"},
+ {file = "lief-0.13.1-cp39-cp39-manylinux_2_24_x86_64.whl", hash = "sha256:965fadb1301d1a81f16067e4fa743d2be3f6aa71391a83b752ff811ec74b0766"},
+ {file = "lief-0.13.1-cp39-cp39-win32.whl", hash = "sha256:57bdb0471760c4ff520f5e5d005e503cc7ea3ebe22df307bb579a1a561b8c4e9"},
+ {file = "lief-0.13.1-cp39-cp39-win_amd64.whl", hash = "sha256:a3c900f49c3d3135c728faeb386d13310bb3511eb2d4e1c9b109b48ae2658361"},
]
+[[package]]
+name = "linkify-it-py"
+version = "2.0.2"
+description = "Links recognition library with FULL unicode support."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "linkify-it-py-2.0.2.tar.gz", hash = "sha256:19f3060727842c254c808e99d465c80c49d2c7306788140987a1a7a29b0d6ad2"},
+ {file = "linkify_it_py-2.0.2-py3-none-any.whl", hash = "sha256:a3a24428f6c96f27370d7fe61d2ac0be09017be5190d68d8658233171f1b6541"},
+]
+
+[package.dependencies]
+uc-micro-py = "*"
+
+[package.extras]
+benchmark = ["pytest", "pytest-benchmark"]
+dev = ["black", "flake8", "isort", "pre-commit", "pyproject-flake8"]
+doc = ["myst-parser", "sphinx", "sphinx-book-theme"]
+test = ["coverage", "pytest", "pytest-cov"]
+
[[package]]
name = "log4mongo"
version = "1.7.0"
@@ -1504,6 +1592,47 @@ files = [
[package.dependencies]
pymongo = "*"
+[[package]]
+name = "m2r2"
+version = "0.3.3.post2"
+description = "Markdown and reStructuredText in a single file."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "m2r2-0.3.3.post2-py3-none-any.whl", hash = "sha256:86157721eb6eabcd54d4eea7195890cc58fa6188b8d0abea633383cfbb5e11e3"},
+ {file = "m2r2-0.3.3.post2.tar.gz", hash = "sha256:e62bcb0e74b3ce19cda0737a0556b04cf4a43b785072fcef474558f2c1482ca8"},
+]
+
+[package.dependencies]
+docutils = ">=0.19"
+mistune = "0.8.4"
+
+[[package]]
+name = "markdown-it-py"
+version = "2.2.0"
+description = "Python port of markdown-it. Markdown parsing, done right!"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "markdown-it-py-2.2.0.tar.gz", hash = "sha256:7c9a5e412688bc771c67432cbfebcdd686c93ce6484913dccf06cb5a0bea35a1"},
+ {file = "markdown_it_py-2.2.0-py3-none-any.whl", hash = "sha256:5a35f8d1870171d9acc47b99612dc146129b631baf04970128b568f190d0cc30"},
+]
+
+[package.dependencies]
+mdurl = ">=0.1,<1.0"
+
+[package.extras]
+benchmarking = ["psutil", "pytest", "pytest-benchmark"]
+code-style = ["pre-commit (>=3.0,<4.0)"]
+compare = ["commonmark (>=0.9,<1.0)", "markdown (>=3.4,<4.0)", "mistletoe (>=1.0,<2.0)", "mistune (>=2.0,<3.0)", "panflute (>=2.3,<3.0)"]
+linkify = ["linkify-it-py (>=1,<3)"]
+plugins = ["mdit-py-plugins"]
+profiling = ["gprof2dot"]
+rtd = ["attrs", "myst-parser", "pyyaml", "sphinx", "sphinx-copybutton", "sphinx-design", "sphinx_book_theme"]
+testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"]
+
[[package]]
name = "markupsafe"
version = "2.0.1"
@@ -1595,6 +1724,50 @@ files = [
{file = "mccabe-0.7.0.tar.gz", hash = "sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325"},
]
+[[package]]
+name = "mdit-py-plugins"
+version = "0.3.5"
+description = "Collection of plugins for markdown-it-py"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "mdit-py-plugins-0.3.5.tar.gz", hash = "sha256:eee0adc7195e5827e17e02d2a258a2ba159944a0748f59c5099a4a27f78fcf6a"},
+ {file = "mdit_py_plugins-0.3.5-py3-none-any.whl", hash = "sha256:ca9a0714ea59a24b2b044a1831f48d817dd0c817e84339f20e7889f392d77c4e"},
+]
+
+[package.dependencies]
+markdown-it-py = ">=1.0.0,<3.0.0"
+
+[package.extras]
+code-style = ["pre-commit"]
+rtd = ["attrs", "myst-parser (>=0.16.1,<0.17.0)", "sphinx-book-theme (>=0.1.0,<0.2.0)"]
+testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"]
+
+[[package]]
+name = "mdurl"
+version = "0.1.2"
+description = "Markdown URL utilities"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8"},
+ {file = "mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba"},
+]
+
+[[package]]
+name = "mistune"
+version = "0.8.4"
+description = "The fastest markdown parser in pure Python"
+category = "dev"
+optional = false
+python-versions = "*"
+files = [
+ {file = "mistune-0.8.4-py2.py3-none-any.whl", hash = "sha256:88a1051873018da288eee8538d476dffe1262495144b33ecb586c4ab266bb8d4"},
+ {file = "mistune-0.8.4.tar.gz", hash = "sha256:59a3429db53c50b5c6bcc8a07f8848cb00d7dc8bdb431a4ab41920d201d4756e"},
+]
+
[[package]]
name = "multidict"
version = "6.0.4"
@@ -1680,189 +1853,80 @@ files = [
]
[[package]]
-name = "mypy"
-version = "1.2.0"
-description = "Optional static typing for Python"
+name = "myst-parser"
+version = "0.18.1"
+description = "An extended commonmark compliant parser, with bridges to docutils & sphinx."
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
- {file = "mypy-1.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:701189408b460a2ff42b984e6bd45c3f41f0ac9f5f58b8873bbedc511900086d"},
- {file = "mypy-1.2.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fe91be1c51c90e2afe6827601ca14353bbf3953f343c2129fa1e247d55fd95ba"},
- {file = "mypy-1.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d26b513225ffd3eacece727f4387bdce6469192ef029ca9dd469940158bc89e"},
- {file = "mypy-1.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:3a2d219775a120581a0ae8ca392b31f238d452729adbcb6892fa89688cb8306a"},
- {file = "mypy-1.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:2e93a8a553e0394b26c4ca683923b85a69f7ccdc0139e6acd1354cc884fe0128"},
- {file = "mypy-1.2.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3efde4af6f2d3ccf58ae825495dbb8d74abd6d176ee686ce2ab19bd025273f41"},
- {file = "mypy-1.2.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:695c45cea7e8abb6f088a34a6034b1d273122e5530aeebb9c09626cea6dca4cb"},
- {file = "mypy-1.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d0e9464a0af6715852267bf29c9553e4555b61f5904a4fc538547a4d67617937"},
- {file = "mypy-1.2.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:8293a216e902ac12779eb7a08f2bc39ec6c878d7c6025aa59464e0c4c16f7eb9"},
- {file = "mypy-1.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:f46af8d162f3d470d8ffc997aaf7a269996d205f9d746124a179d3abe05ac602"},
- {file = "mypy-1.2.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:031fc69c9a7e12bcc5660b74122ed84b3f1c505e762cc4296884096c6d8ee140"},
- {file = "mypy-1.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:390bc685ec209ada4e9d35068ac6988c60160b2b703072d2850457b62499e336"},
- {file = "mypy-1.2.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:4b41412df69ec06ab141808d12e0bf2823717b1c363bd77b4c0820feaa37249e"},
- {file = "mypy-1.2.0-cp37-cp37m-win_amd64.whl", hash = "sha256:4e4a682b3f2489d218751981639cffc4e281d548f9d517addfd5a2917ac78119"},
- {file = "mypy-1.2.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:a197ad3a774f8e74f21e428f0de7f60ad26a8d23437b69638aac2764d1e06a6a"},
- {file = "mypy-1.2.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:c9a084bce1061e55cdc0493a2ad890375af359c766b8ac311ac8120d3a472950"},
- {file = "mypy-1.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eaeaa0888b7f3ccb7bcd40b50497ca30923dba14f385bde4af78fac713d6d6f6"},
- {file = "mypy-1.2.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:bea55fc25b96c53affab852ad94bf111a3083bc1d8b0c76a61dd101d8a388cf5"},
- {file = "mypy-1.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:4c8d8c6b80aa4a1689f2a179d31d86ae1367ea4a12855cc13aa3ba24bb36b2d8"},
- {file = "mypy-1.2.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:70894c5345bea98321a2fe84df35f43ee7bb0feec117a71420c60459fc3e1eed"},
- {file = "mypy-1.2.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4a99fe1768925e4a139aace8f3fb66db3576ee1c30b9c0f70f744ead7e329c9f"},
- {file = "mypy-1.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:023fe9e618182ca6317ae89833ba422c411469156b690fde6a315ad10695a521"},
- {file = "mypy-1.2.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:4d19f1a239d59f10fdc31263d48b7937c585810288376671eaf75380b074f238"},
- {file = "mypy-1.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:2de7babe398cb7a85ac7f1fd5c42f396c215ab3eff731b4d761d68d0f6a80f48"},
- {file = "mypy-1.2.0-py3-none-any.whl", hash = "sha256:d8e9187bfcd5ffedbe87403195e1fc340189a68463903c39e2b63307c9fa0394"},
- {file = "mypy-1.2.0.tar.gz", hash = "sha256:f70a40410d774ae23fcb4afbbeca652905a04de7948eaf0b1789c8d1426b72d1"},
+ {file = "myst-parser-0.18.1.tar.gz", hash = "sha256:79317f4bb2c13053dd6e64f9da1ba1da6cd9c40c8a430c447a7b146a594c246d"},
+ {file = "myst_parser-0.18.1-py3-none-any.whl", hash = "sha256:61b275b85d9f58aa327f370913ae1bec26ebad372cc99f3ab85c8ec3ee8d9fb8"},
]
[package.dependencies]
-mypy-extensions = ">=1.0.0"
-tomli = {version = ">=1.1.0", markers = "python_version < \"3.11\""}
-typing-extensions = ">=3.10"
+docutils = ">=0.15,<0.20"
+jinja2 = "*"
+markdown-it-py = ">=1.0.0,<3.0.0"
+mdit-py-plugins = ">=0.3.1,<0.4.0"
+pyyaml = "*"
+sphinx = ">=4,<6"
+typing-extensions = "*"
[package.extras]
-dmypy = ["psutil (>=4.0)"]
-install-types = ["pip"]
-python2 = ["typed-ast (>=1.4.0,<2)"]
-reports = ["lxml"]
-
-[[package]]
-name = "mypy-extensions"
-version = "1.0.0"
-description = "Type system extensions for programs checked with the mypy type checker."
-category = "dev"
-optional = false
-python-versions = ">=3.5"
-files = [
- {file = "mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d"},
- {file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
-]
+code-style = ["pre-commit (>=2.12,<3.0)"]
+linkify = ["linkify-it-py (>=1.0,<2.0)"]
+rtd = ["ipython", "sphinx-book-theme", "sphinx-design", "sphinxcontrib.mermaid (>=0.7.1,<0.8.0)", "sphinxext-opengraph (>=0.6.3,<0.7.0)", "sphinxext-rediraffe (>=0.2.7,<0.3.0)"]
+testing = ["beautifulsoup4", "coverage[toml]", "pytest (>=6,<7)", "pytest-cov", "pytest-param-files (>=0.3.4,<0.4.0)", "pytest-regressions", "sphinx (<5.2)", "sphinx-pytest"]
[[package]]
name = "nodeenv"
-version = "1.7.0"
+version = "1.8.0"
description = "Node.js virtual environment builder"
category = "dev"
optional = false
python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*"
files = [
- {file = "nodeenv-1.7.0-py2.py3-none-any.whl", hash = "sha256:27083a7b96a25f2f5e1d8cb4b6317ee8aeda3bdd121394e5ac54e498028a042e"},
- {file = "nodeenv-1.7.0.tar.gz", hash = "sha256:e0e7f7dfb85fc5394c6fe1e8fa98131a2473e04311a45afb6508f7cf1836fa2b"},
+ {file = "nodeenv-1.8.0-py2.py3-none-any.whl", hash = "sha256:df865724bb3c3adc86b3876fa209771517b0cfe596beff01a92700e0e8be4cec"},
+ {file = "nodeenv-1.8.0.tar.gz", hash = "sha256:d51e0c37e64fbf47d017feac3145cdbb58836d7eee8c6f6d3b6880c5456227d2"},
]
[package.dependencies]
setuptools = "*"
-[[package]]
-name = "opencolorio"
-version = "2.2.1"
-description = "OpenColorIO (OCIO) is a complete color management solution geared towards motion picture production with an emphasis on visual effects and computer animation."
-category = "main"
-optional = false
-python-versions = "*"
-files = [
- {file = "opencolorio-2.2.1-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:a9feec76e450325f12203264194d905a938d5e7944772b806886f9531e406d42"},
- {file = "opencolorio-2.2.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:7eeae01328b359408940a1f29d53b15b034755413d95d08781b76084ee14cbb1"},
- {file = "opencolorio-2.2.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:85b63a9162e99f0f29ef4074017d1b6e8caf59096043fb91cbacfc5bc01fa0b9"},
- {file = "opencolorio-2.2.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:67d19ea54daff2b209b91981da415aa41ea8e3a60fecd5dd843ae13272d38dcf"},
- {file = "opencolorio-2.2.1-cp310-cp310-win_amd64.whl", hash = "sha256:da0043a1007d269b5da3c8ca1de8c63926b38bf5e08cfade6cb8f2f5f6b663b9"},
- {file = "opencolorio-2.2.1-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:62180cec075cae8dff56eeb977132eb9755d7fe312d8d34236cba838cb9314b3"},
- {file = "opencolorio-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:24b7bfc4b77c04845de847373e58232c48838042d5e45e027b8bf64bada988e3"},
- {file = "opencolorio-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41cadab13b18dbedd992df2056c787cf38bf89a5b0903b90f701d5228ac496f9"},
- {file = "opencolorio-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa278dd4414791a5605e685b562b6ad1c729a4a44c1c906151f5bca10c0ff10e"},
- {file = "opencolorio-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:7b44858c26b662ec42b089f8f85ea3aa63aa04e0e58e902a4cbf8cae0fbd4c6c"},
- {file = "opencolorio-2.2.1-cp37-cp37m-macosx_10_13_x86_64.whl", hash = "sha256:07fce0d36a6041b524b2122b9f55fbd03e029def5a22e93822041b652b60590a"},
- {file = "opencolorio-2.2.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ae043bc588d9ee98f54fe9524481eba5634d6dd70d0c70e1bd242b60a3a81731"},
- {file = "opencolorio-2.2.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3a4ad1a4ed5742a7dda41f0548274e8328b2774ce04dfc31fd5dfbacabc4c166"},
- {file = "opencolorio-2.2.1-cp37-cp37m-win_amd64.whl", hash = "sha256:9bd885e34898c204db19a9e6926c551a74bda6d8e7d3ef27596630e3422b99b1"},
- {file = "opencolorio-2.2.1-cp38-cp38-macosx_10_13_x86_64.whl", hash = "sha256:86ed205bec96fd84e882d431c181560df0cf6f0f73150976303b6f3ff1d9d5ed"},
- {file = "opencolorio-2.2.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:c1bf1c19baa86203e2329194ea837161520dae5c94e4f04b7659e9bfe4f1a6a9"},
- {file = "opencolorio-2.2.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:639f7052da7086d999c0d84e424453fb44abc8f2d22ec8601d20d8ee9d90384b"},
- {file = "opencolorio-2.2.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f7e3208c5c1ac63a6e921398db661fdd9309b17253b285f227818713f3faec92"},
- {file = "opencolorio-2.2.1-cp38-cp38-win_amd64.whl", hash = "sha256:68814600c0d8c07b552e1f1e4e32d45bffba4cb49b41481e5d4dd0bc56a206ea"},
- {file = "opencolorio-2.2.1-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:cb5337ac2804dbb90c856b423d2799d3dc35f9c948da25d8e6506d1dd8200df7"},
- {file = "opencolorio-2.2.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:425593a96de7927aa7cda065dc3729e881de1d0b72c43e704e02962adb63b4ad"},
- {file = "opencolorio-2.2.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8be9f6af01e4c710de4cc03c9b6de04ef0844bf611e9100abf045ec62a4c685a"},
- {file = "opencolorio-2.2.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e84788002aa28151409f2367a040e9d39ffea0a9129777451bd0c55ac87d9d47"},
- {file = "opencolorio-2.2.1-cp39-cp39-win_amd64.whl", hash = "sha256:d92802922bc4e2ff3e9a06d44b6055efd1863abb1aaf0243849d35b077b72253"},
- {file = "opencolorio-2.2.1.tar.gz", hash = "sha256:283abb8db5bc18ab9686e08255a9245deaba3d7837be5363b7a69b0902b73a78"},
-]
-
-[[package]]
-name = "opentimelineio"
-version = "0.14.1"
-description = "Editorial interchange format and API"
-category = "main"
-optional = false
-python-versions = ">2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*, !=3.9.0"
-files = [
- {file = "OpenTimelineIO-0.14.1-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:d5466742d1de323e922965e64ca7099f6dd756774d5f8b404a11d6ec6e7c5fe0"},
- {file = "OpenTimelineIO-0.14.1-cp27-cp27m-manylinux2010_i686.whl", hash = "sha256:3f5187eb0cd8f607bfcc5c1d58ce878734975a0a6a91360a2605ad831198ed89"},
- {file = "OpenTimelineIO-0.14.1-cp27-cp27m-manylinux2010_x86_64.whl", hash = "sha256:a2b64bf817d3065f7302c748bcc1d5938971e157c42e67fcb4e5e3612358813b"},
- {file = "OpenTimelineIO-0.14.1-cp27-cp27m-win32.whl", hash = "sha256:4cde33ea83ba041332bae55474fc155219871396b82031dd54d3e857973805b6"},
- {file = "OpenTimelineIO-0.14.1-cp27-cp27m-win_amd64.whl", hash = "sha256:d5dc153867c688ad4f39cbac78eda069cfe4f17376d9444d202f8073efa6cbd4"},
- {file = "OpenTimelineIO-0.14.1-cp27-cp27mu-manylinux2010_i686.whl", hash = "sha256:e07390dd1e0f82e5a5880ef2d498cbcbf482b4e5bfb4b9026342578a2fad358d"},
- {file = "OpenTimelineIO-0.14.1-cp27-cp27mu-manylinux2010_x86_64.whl", hash = "sha256:4c1c522df397536c7620d44e32302165a9ef9bbbf0de83a5a0621f0a75047cc9"},
- {file = "OpenTimelineIO-0.14.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:e368a1d64366e3fdf1eadd10077a135833fdc893ff65f8dc43a91254cb7ee6fa"},
- {file = "OpenTimelineIO-0.14.1-cp37-cp37m-manylinux2010_i686.whl", hash = "sha256:cf2cd94d11d0ae0fc78418cc0d17f2fe3bf85598b9b109f98b2301272a87bff5"},
- {file = "OpenTimelineIO-0.14.1-cp37-cp37m-manylinux2010_x86_64.whl", hash = "sha256:7af41f43ef72fbf3c0ae2e47cabd7715eb348726c9e5e430ab36ce2357181cf4"},
- {file = "OpenTimelineIO-0.14.1-cp37-cp37m-win32.whl", hash = "sha256:55dbb859d16535ba5dab8a66a78aef8db55f030d771b6e5b91e94241b6db65bd"},
- {file = "OpenTimelineIO-0.14.1-cp37-cp37m-win_amd64.whl", hash = "sha256:08eaef8fbc423c25e94e189eb788c92c16916ae74d16ebcab34ba889e980c6ad"},
- {file = "OpenTimelineIO-0.14.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:10b34a6997d6d6edb9b8a1c93718a1e90e8202d930559cdce2ad369e0473327f"},
- {file = "OpenTimelineIO-0.14.1-cp38-cp38-manylinux2010_i686.whl", hash = "sha256:c6b44986da8c7a64f8f549795279f0af05ec875a425d11600585dab0b3269ec2"},
- {file = "OpenTimelineIO-0.14.1-cp38-cp38-manylinux2010_x86_64.whl", hash = "sha256:45e1774d9f7215190a7c1e5b70dfc237f4a03b79b0539902d9ec8074707450f9"},
- {file = "OpenTimelineIO-0.14.1-cp38-cp38-win32.whl", hash = "sha256:1ee0e72320309b8dedf0e2f40fc2b8d3dd2c854db0aba28a84a038d7177a1208"},
- {file = "OpenTimelineIO-0.14.1-cp38-cp38-win_amd64.whl", hash = "sha256:bd58e9fdc765623e160ab3ec32e9199bcb3906a6f3c06cca7564fbb7c18d2d28"},
- {file = "OpenTimelineIO-0.14.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f8d6e15f793577de59cc01e49600898ab12dbdc260dbcba83936c00965f0090a"},
- {file = "OpenTimelineIO-0.14.1-cp39-cp39-manylinux2010_i686.whl", hash = "sha256:50644c5e43076a3717b77645657545d0be19376ecb4c6f2e4103670052d726d4"},
- {file = "OpenTimelineIO-0.14.1-cp39-cp39-manylinux2010_x86_64.whl", hash = "sha256:a44f77fb5dbfd60d992ac2acc6782a7b0a26452db3a069425b8bd73b2f3bb336"},
- {file = "OpenTimelineIO-0.14.1-cp39-cp39-win32.whl", hash = "sha256:63fb0d1258f490bcebf6325067db64a0f0dc405b8b905ee2bb625f04d04a8082"},
- {file = "OpenTimelineIO-0.14.1-cp39-cp39-win_amd64.whl", hash = "sha256:8a303b2f3dfba542f588b227575f1967f7a9da854b34f620504e1ecb8d551f5f"},
- {file = "OpenTimelineIO-0.14.1.tar.gz", hash = "sha256:0b9adc0fd303b978af120259d6b1d23e0623800615b4a3e2eb9f9fb2c70d5d13"},
-]
-
-[package.dependencies]
-pyaaf2 = ">=1.4.0,<1.5.0"
-
-[package.extras]
-dev = ["check-manifest", "coverage (>=4.5)", "flake8 (>=3.5)", "urllib3 (>=1.24.3)"]
-view = ["PySide2 (>=5.11,<6.0)"]
-
[[package]]
name = "packaging"
-version = "23.0"
+version = "23.1"
description = "Core utilities for Python packages"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "packaging-23.0-py3-none-any.whl", hash = "sha256:714ac14496c3e68c99c29b00845f7a2b85f3bb6f1078fd9f72fd20f0570002b2"},
- {file = "packaging-23.0.tar.gz", hash = "sha256:b6ad297f8907de0fa2fe1ccbd26fdaf387f5f47c7275fedf8cce89f99446cf97"},
+ {file = "packaging-23.1-py3-none-any.whl", hash = "sha256:994793af429502c4ea2ebf6bf664629d07c1a9fe974af92966e4b8d2df7edc61"},
+ {file = "packaging-23.1.tar.gz", hash = "sha256:a392980d2b6cffa644431898be54b0045151319d1e7ec34f0cfed48767dd334f"},
]
[[package]]
name = "paramiko"
-version = "2.12.0"
+version = "3.2.0"
description = "SSH2 protocol library"
category = "main"
optional = false
-python-versions = "*"
+python-versions = ">=3.6"
files = [
- {file = "paramiko-2.12.0-py2.py3-none-any.whl", hash = "sha256:b2df1a6325f6996ef55a8789d0462f5b502ea83b3c990cbb5bbe57345c6812c4"},
- {file = "paramiko-2.12.0.tar.gz", hash = "sha256:376885c05c5d6aa6e1f4608aac2a6b5b0548b1add40274477324605903d9cd49"},
+ {file = "paramiko-3.2.0-py3-none-any.whl", hash = "sha256:df0f9dd8903bc50f2e10580af687f3015bf592a377cd438d2ec9546467a14eb8"},
+ {file = "paramiko-3.2.0.tar.gz", hash = "sha256:93cdce625a8a1dc12204439d45033f3261bdb2c201648cfcdc06f9fd0f94ec29"},
]
[package.dependencies]
-bcrypt = ">=3.1.3"
-cryptography = ">=2.5"
-pynacl = ">=1.0.1"
-six = "*"
+bcrypt = ">=3.2"
+cryptography = ">=3.3"
+pynacl = ">=1.5"
[package.extras]
-all = ["bcrypt (>=3.1.3)", "gssapi (>=1.4.1)", "invoke (>=1.3)", "pyasn1 (>=0.1.7)", "pynacl (>=1.0.1)", "pywin32 (>=2.1.8)"]
-ed25519 = ["bcrypt (>=3.1.3)", "pynacl (>=1.0.1)"]
+all = ["gssapi (>=1.4.1)", "invoke (>=2.0)", "pyasn1 (>=0.1.7)", "pywin32 (>=2.1.8)"]
gssapi = ["gssapi (>=1.4.1)", "pyasn1 (>=0.1.7)", "pywin32 (>=2.1.8)"]
-invoke = ["invoke (>=1.3)"]
+invoke = ["invoke (>=2.0)"]
[[package]]
name = "parso"
@@ -1882,18 +1946,19 @@ testing = ["docopt", "pytest (<6.0.0)"]
[[package]]
name = "patchelf"
-version = "0.17.2.0"
+version = "0.17.2.1"
description = "A small utility to modify the dynamic linker and RPATH of ELF executables."
category = "dev"
optional = false
python-versions = "*"
files = [
- {file = "patchelf-0.17.2.0-py2.py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.musllinux_1_1_aarch64.whl", hash = "sha256:b8d86f32e1414d6964d5d166ddd2cf829d156fba0d28d32a3bd0192f987f4529"},
- {file = "patchelf-0.17.2.0-py2.py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.musllinux_1_1_ppc64le.whl", hash = "sha256:9233a0f2fc73820c5bd468f27507bdf0c9ac543f07c7f9888bb7cf910b1be22f"},
- {file = "patchelf-0.17.2.0-py2.py3-none-manylinux_2_17_s390x.manylinux2014_s390x.musllinux_1_1_s390x.whl", hash = "sha256:6601d7d831508bcdd3d8ebfa6435c2379bf11e41af2409ae7b88de572926841c"},
- {file = "patchelf-0.17.2.0-py2.py3-none-manylinux_2_5_i686.manylinux1_i686.musllinux_1_1_i686.whl", hash = "sha256:c62a34f0c25e6c2d6ae44389f819a00ccdf3f292ad1b814fbe1cc23cb27023ce"},
- {file = "patchelf-0.17.2.0-py2.py3-none-manylinux_2_5_x86_64.manylinux1_x86_64.musllinux_1_1_x86_64.whl", hash = "sha256:1b9fd14f300341dc020ae05c49274dd1fa6727eabb4e61dd7fb6fb3600acd26e"},
- {file = "patchelf-0.17.2.0.tar.gz", hash = "sha256:dedf987a83d7f6d6f5512269e57f5feeec36719bd59567173b6d9beabe019efe"},
+ {file = "patchelf-0.17.2.1-py2.py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.musllinux_1_1_aarch64.whl", hash = "sha256:fc329da0e8f628bd836dfb8eaf523547e342351fa8f739bf2b3fe4a6db5a297c"},
+ {file = "patchelf-0.17.2.1-py2.py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.musllinux_1_1_armv7l.whl", hash = "sha256:ccb266a94edf016efe80151172c26cff8c2ec120a57a1665d257b0442784195d"},
+ {file = "patchelf-0.17.2.1-py2.py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.musllinux_1_1_ppc64le.whl", hash = "sha256:f47b5bdd6885cfb20abdd14c707d26eb6f499a7f52e911865548d4aa43385502"},
+ {file = "patchelf-0.17.2.1-py2.py3-none-manylinux_2_17_s390x.manylinux2014_s390x.musllinux_1_1_s390x.whl", hash = "sha256:a9e6ebb0874a11f7ed56d2380bfaa95f00612b23b15f896583da30c2059fcfa8"},
+ {file = "patchelf-0.17.2.1-py2.py3-none-manylinux_2_5_i686.manylinux1_i686.musllinux_1_1_i686.whl", hash = "sha256:3c8d58f0e4c1929b1c7c45ba8da5a84a8f1aa6a82a46e1cfb2e44a4d40f350e5"},
+ {file = "patchelf-0.17.2.1-py2.py3-none-manylinux_2_5_x86_64.manylinux1_x86_64.musllinux_1_1_x86_64.whl", hash = "sha256:d1a9bc0d4fd80c038523ebdc451a1cce75237cfcc52dbd1aca224578001d5927"},
+ {file = "patchelf-0.17.2.1.tar.gz", hash = "sha256:a6eb0dd452ce4127d0d5e1eb26515e39186fa609364274bc1b0b77539cfa7031"},
]
[package.extras]
@@ -1916,110 +1981,99 @@ six = "*"
[[package]]
name = "pillow"
-version = "9.4.0"
+version = "9.5.0"
description = "Python Imaging Library (Fork)"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "Pillow-9.4.0-1-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:1b4b4e9dda4f4e4c4e6896f93e84a8f0bcca3b059de9ddf67dac3c334b1195e1"},
- {file = "Pillow-9.4.0-1-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:fb5c1ad6bad98c57482236a21bf985ab0ef42bd51f7ad4e4538e89a997624e12"},
- {file = "Pillow-9.4.0-1-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:f0caf4a5dcf610d96c3bd32932bfac8aee61c96e60481c2a0ea58da435e25acd"},
- {file = "Pillow-9.4.0-1-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:3f4cc516e0b264c8d4ccd6b6cbc69a07c6d582d8337df79be1e15a5056b258c9"},
- {file = "Pillow-9.4.0-1-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:b8c2f6eb0df979ee99433d8b3f6d193d9590f735cf12274c108bd954e30ca858"},
- {file = "Pillow-9.4.0-1-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = "sha256:b70756ec9417c34e097f987b4d8c510975216ad26ba6e57ccb53bc758f490dab"},
- {file = "Pillow-9.4.0-1-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:43521ce2c4b865d385e78579a082b6ad1166ebed2b1a2293c3be1d68dd7ca3b9"},
- {file = "Pillow-9.4.0-2-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:9d9a62576b68cd90f7075876f4e8444487db5eeea0e4df3ba298ee38a8d067b0"},
- {file = "Pillow-9.4.0-2-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:87708d78a14d56a990fbf4f9cb350b7d89ee8988705e58e39bdf4d82c149210f"},
- {file = "Pillow-9.4.0-2-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:8a2b5874d17e72dfb80d917213abd55d7e1ed2479f38f001f264f7ce7bae757c"},
- {file = "Pillow-9.4.0-2-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:83125753a60cfc8c412de5896d10a0a405e0bd88d0470ad82e0869ddf0cb3848"},
- {file = "Pillow-9.4.0-2-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:9e5f94742033898bfe84c93c831a6f552bb629448d4072dd312306bab3bd96f1"},
- {file = "Pillow-9.4.0-2-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = "sha256:013016af6b3a12a2f40b704677f8b51f72cb007dac785a9933d5c86a72a7fe33"},
- {file = "Pillow-9.4.0-2-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:99d92d148dd03fd19d16175b6d355cc1b01faf80dae93c6c3eb4163709edc0a9"},
- {file = "Pillow-9.4.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:2968c58feca624bb6c8502f9564dd187d0e1389964898f5e9e1fbc8533169157"},
- {file = "Pillow-9.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c5c1362c14aee73f50143d74389b2c158707b4abce2cb055b7ad37ce60738d47"},
- {file = "Pillow-9.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bd752c5ff1b4a870b7661234694f24b1d2b9076b8bf337321a814c612665f343"},
- {file = "Pillow-9.4.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9a3049a10261d7f2b6514d35bbb7a4dfc3ece4c4de14ef5876c4b7a23a0e566d"},
- {file = "Pillow-9.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:16a8df99701f9095bea8a6c4b3197da105df6f74e6176c5b410bc2df2fd29a57"},
- {file = "Pillow-9.4.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:94cdff45173b1919350601f82d61365e792895e3c3a3443cf99819e6fbf717a5"},
- {file = "Pillow-9.4.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:ed3e4b4e1e6de75fdc16d3259098de7c6571b1a6cc863b1a49e7d3d53e036070"},
- {file = "Pillow-9.4.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d5b2f8a31bd43e0f18172d8ac82347c8f37ef3e0b414431157718aa234991b28"},
- {file = "Pillow-9.4.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:09b89ddc95c248ee788328528e6a2996e09eaccddeeb82a5356e92645733be35"},
- {file = "Pillow-9.4.0-cp310-cp310-win32.whl", hash = "sha256:f09598b416ba39a8f489c124447b007fe865f786a89dbfa48bb5cf395693132a"},
- {file = "Pillow-9.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:f6e78171be3fb7941f9910ea15b4b14ec27725865a73c15277bc39f5ca4f8391"},
- {file = "Pillow-9.4.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:3fa1284762aacca6dc97474ee9c16f83990b8eeb6697f2ba17140d54b453e133"},
- {file = "Pillow-9.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:eaef5d2de3c7e9b21f1e762f289d17b726c2239a42b11e25446abf82b26ac132"},
- {file = "Pillow-9.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a4dfdae195335abb4e89cc9762b2edc524f3c6e80d647a9a81bf81e17e3fb6f0"},
- {file = "Pillow-9.4.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6abfb51a82e919e3933eb137e17c4ae9c0475a25508ea88993bb59faf82f3b35"},
- {file = "Pillow-9.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:451f10ef963918e65b8869e17d67db5e2f4ab40e716ee6ce7129b0cde2876eab"},
- {file = "Pillow-9.4.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:6663977496d616b618b6cfa43ec86e479ee62b942e1da76a2c3daa1c75933ef4"},
- {file = "Pillow-9.4.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:60e7da3a3ad1812c128750fc1bc14a7ceeb8d29f77e0a2356a8fb2aa8925287d"},
- {file = "Pillow-9.4.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:19005a8e58b7c1796bc0167862b1f54a64d3b44ee5d48152b06bb861458bc0f8"},
- {file = "Pillow-9.4.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:f715c32e774a60a337b2bb8ad9839b4abf75b267a0f18806f6f4f5f1688c4b5a"},
- {file = "Pillow-9.4.0-cp311-cp311-win32.whl", hash = "sha256:b222090c455d6d1a64e6b7bb5f4035c4dff479e22455c9eaa1bdd4c75b52c80c"},
- {file = "Pillow-9.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:ba6612b6548220ff5e9df85261bddc811a057b0b465a1226b39bfb8550616aee"},
- {file = "Pillow-9.4.0-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:5f532a2ad4d174eb73494e7397988e22bf427f91acc8e6ebf5bb10597b49c493"},
- {file = "Pillow-9.4.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5dd5a9c3091a0f414a963d427f920368e2b6a4c2f7527fdd82cde8ef0bc7a327"},
- {file = "Pillow-9.4.0-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ef21af928e807f10bf4141cad4746eee692a0dd3ff56cfb25fce076ec3cc8abe"},
- {file = "Pillow-9.4.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:847b114580c5cc9ebaf216dd8c8dbc6b00a3b7ab0131e173d7120e6deade1f57"},
- {file = "Pillow-9.4.0-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:653d7fb2df65efefbcbf81ef5fe5e5be931f1ee4332c2893ca638c9b11a409c4"},
- {file = "Pillow-9.4.0-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:46f39cab8bbf4a384ba7cb0bc8bae7b7062b6a11cfac1ca4bc144dea90d4a9f5"},
- {file = "Pillow-9.4.0-cp37-cp37m-win32.whl", hash = "sha256:7ac7594397698f77bce84382929747130765f66406dc2cd8b4ab4da68ade4c6e"},
- {file = "Pillow-9.4.0-cp37-cp37m-win_amd64.whl", hash = "sha256:46c259e87199041583658457372a183636ae8cd56dbf3f0755e0f376a7f9d0e6"},
- {file = "Pillow-9.4.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:0e51f608da093e5d9038c592b5b575cadc12fd748af1479b5e858045fff955a9"},
- {file = "Pillow-9.4.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:765cb54c0b8724a7c12c55146ae4647e0274a839fb6de7bcba841e04298e1011"},
- {file = "Pillow-9.4.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:519e14e2c49fcf7616d6d2cfc5c70adae95682ae20f0395e9280db85e8d6c4df"},
- {file = "Pillow-9.4.0-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d197df5489004db87d90b918033edbeee0bd6df3848a204bca3ff0a903bef837"},
- {file = "Pillow-9.4.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0845adc64fe9886db00f5ab68c4a8cd933ab749a87747555cec1c95acea64b0b"},
- {file = "Pillow-9.4.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:e1339790c083c5a4de48f688b4841f18df839eb3c9584a770cbd818b33e26d5d"},
- {file = "Pillow-9.4.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:a96e6e23f2b79433390273eaf8cc94fec9c6370842e577ab10dabdcc7ea0a66b"},
- {file = "Pillow-9.4.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:7cfc287da09f9d2a7ec146ee4d72d6ea1342e770d975e49a8621bf54eaa8f30f"},
- {file = "Pillow-9.4.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d7081c084ceb58278dd3cf81f836bc818978c0ccc770cbbb202125ddabec6628"},
- {file = "Pillow-9.4.0-cp38-cp38-win32.whl", hash = "sha256:df41112ccce5d47770a0c13651479fbcd8793f34232a2dd9faeccb75eb5d0d0d"},
- {file = "Pillow-9.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:7a21222644ab69ddd9967cfe6f2bb420b460dae4289c9d40ff9a4896e7c35c9a"},
- {file = "Pillow-9.4.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:0f3269304c1a7ce82f1759c12ce731ef9b6e95b6df829dccd9fe42912cc48569"},
- {file = "Pillow-9.4.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:cb362e3b0976dc994857391b776ddaa8c13c28a16f80ac6522c23d5257156bed"},
- {file = "Pillow-9.4.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a2e0f87144fcbbe54297cae708c5e7f9da21a4646523456b00cc956bd4c65815"},
- {file = "Pillow-9.4.0-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:28676836c7796805914b76b1837a40f76827ee0d5398f72f7dcc634bae7c6264"},
- {file = "Pillow-9.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0884ba7b515163a1a05440a138adeb722b8a6ae2c2b33aea93ea3118dd3a899e"},
- {file = "Pillow-9.4.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:53dcb50fbdc3fb2c55431a9b30caeb2f7027fcd2aeb501459464f0214200a503"},
- {file = "Pillow-9.4.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:e8c5cf126889a4de385c02a2c3d3aba4b00f70234bfddae82a5eaa3ee6d5e3e6"},
- {file = "Pillow-9.4.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:6c6b1389ed66cdd174d040105123a5a1bc91d0aa7059c7261d20e583b6d8cbd2"},
- {file = "Pillow-9.4.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0dd4c681b82214b36273c18ca7ee87065a50e013112eea7d78c7a1b89a739153"},
- {file = "Pillow-9.4.0-cp39-cp39-win32.whl", hash = "sha256:6d9dfb9959a3b0039ee06c1a1a90dc23bac3b430842dcb97908ddde05870601c"},
- {file = "Pillow-9.4.0-cp39-cp39-win_amd64.whl", hash = "sha256:54614444887e0d3043557d9dbc697dbb16cfb5a35d672b7a0fcc1ed0cf1c600b"},
- {file = "Pillow-9.4.0-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = "sha256:b9b752ab91e78234941e44abdecc07f1f0d8f51fb62941d32995b8161f68cfe5"},
- {file = "Pillow-9.4.0-pp38-pypy38_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d3b56206244dc8711f7e8b7d6cad4663917cd5b2d950799425076681e8766286"},
- {file = "Pillow-9.4.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aabdab8ec1e7ca7f1434d042bf8b1e92056245fb179790dc97ed040361f16bfd"},
- {file = "Pillow-9.4.0-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:db74f5562c09953b2c5f8ec4b7dfd3f5421f31811e97d1dbc0a7c93d6e3a24df"},
- {file = "Pillow-9.4.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:e9d7747847c53a16a729b6ee5e737cf170f7a16611c143d95aa60a109a59c336"},
- {file = "Pillow-9.4.0-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:b52ff4f4e002f828ea6483faf4c4e8deea8d743cf801b74910243c58acc6eda3"},
- {file = "Pillow-9.4.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:575d8912dca808edd9acd6f7795199332696d3469665ef26163cd090fa1f8bfa"},
- {file = "Pillow-9.4.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c3c4ed2ff6760e98d262e0cc9c9a7f7b8a9f61aa4d47c58835cdaf7b0b8811bb"},
- {file = "Pillow-9.4.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e621b0246192d3b9cb1dc62c78cfa4c6f6d2ddc0ec207d43c0dedecb914f152a"},
- {file = "Pillow-9.4.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:8f127e7b028900421cad64f51f75c051b628db17fb00e099eb148761eed598c9"},
- {file = "Pillow-9.4.0.tar.gz", hash = "sha256:a1c2d7780448eb93fbcc3789bf3916aa5720d942e37945f4056680317f1cd23e"},
+ {file = "Pillow-9.5.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:ace6ca218308447b9077c14ea4ef381ba0b67ee78d64046b3f19cf4e1139ad16"},
+ {file = "Pillow-9.5.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d3d403753c9d5adc04d4694d35cf0391f0f3d57c8e0030aac09d7678fa8030aa"},
+ {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5ba1b81ee69573fe7124881762bb4cd2e4b6ed9dd28c9c60a632902fe8db8b38"},
+ {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fe7e1c262d3392afcf5071df9afa574544f28eac825284596ac6db56e6d11062"},
+ {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f36397bf3f7d7c6a3abdea815ecf6fd14e7fcd4418ab24bae01008d8d8ca15e"},
+ {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:252a03f1bdddce077eff2354c3861bf437c892fb1832f75ce813ee94347aa9b5"},
+ {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:85ec677246533e27770b0de5cf0f9d6e4ec0c212a1f89dfc941b64b21226009d"},
+ {file = "Pillow-9.5.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:b416f03d37d27290cb93597335a2f85ed446731200705b22bb927405320de903"},
+ {file = "Pillow-9.5.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:1781a624c229cb35a2ac31cc4a77e28cafc8900733a864870c49bfeedacd106a"},
+ {file = "Pillow-9.5.0-cp310-cp310-win32.whl", hash = "sha256:8507eda3cd0608a1f94f58c64817e83ec12fa93a9436938b191b80d9e4c0fc44"},
+ {file = "Pillow-9.5.0-cp310-cp310-win_amd64.whl", hash = "sha256:d3c6b54e304c60c4181da1c9dadf83e4a54fd266a99c70ba646a9baa626819eb"},
+ {file = "Pillow-9.5.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:7ec6f6ce99dab90b52da21cf0dc519e21095e332ff3b399a357c187b1a5eee32"},
+ {file = "Pillow-9.5.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:560737e70cb9c6255d6dcba3de6578a9e2ec4b573659943a5e7e4af13f298f5c"},
+ {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:96e88745a55b88a7c64fa49bceff363a1a27d9a64e04019c2281049444a571e3"},
+ {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d9c206c29b46cfd343ea7cdfe1232443072bbb270d6a46f59c259460db76779a"},
+ {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cfcc2c53c06f2ccb8976fb5c71d448bdd0a07d26d8e07e321c103416444c7ad1"},
+ {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:a0f9bb6c80e6efcde93ffc51256d5cfb2155ff8f78292f074f60f9e70b942d99"},
+ {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:8d935f924bbab8f0a9a28404422da8af4904e36d5c33fc6f677e4c4485515625"},
+ {file = "Pillow-9.5.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fed1e1cf6a42577953abbe8e6cf2fe2f566daebde7c34724ec8803c4c0cda579"},
+ {file = "Pillow-9.5.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:c1170d6b195555644f0616fd6ed929dfcf6333b8675fcca044ae5ab110ded296"},
+ {file = "Pillow-9.5.0-cp311-cp311-win32.whl", hash = "sha256:54f7102ad31a3de5666827526e248c3530b3a33539dbda27c6843d19d72644ec"},
+ {file = "Pillow-9.5.0-cp311-cp311-win_amd64.whl", hash = "sha256:cfa4561277f677ecf651e2b22dc43e8f5368b74a25a8f7d1d4a3a243e573f2d4"},
+ {file = "Pillow-9.5.0-cp311-cp311-win_arm64.whl", hash = "sha256:965e4a05ef364e7b973dd17fc765f42233415974d773e82144c9bbaaaea5d089"},
+ {file = "Pillow-9.5.0-cp312-cp312-win32.whl", hash = "sha256:22baf0c3cf0c7f26e82d6e1adf118027afb325e703922c8dfc1d5d0156bb2eeb"},
+ {file = "Pillow-9.5.0-cp312-cp312-win_amd64.whl", hash = "sha256:432b975c009cf649420615388561c0ce7cc31ce9b2e374db659ee4f7d57a1f8b"},
+ {file = "Pillow-9.5.0-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:5d4ebf8e1db4441a55c509c4baa7a0587a0210f7cd25fcfe74dbbce7a4bd1906"},
+ {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:375f6e5ee9620a271acb6820b3d1e94ffa8e741c0601db4c0c4d3cb0a9c224bf"},
+ {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:99eb6cafb6ba90e436684e08dad8be1637efb71c4f2180ee6b8f940739406e78"},
+ {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2dfaaf10b6172697b9bceb9a3bd7b951819d1ca339a5ef294d1f1ac6d7f63270"},
+ {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:763782b2e03e45e2c77d7779875f4432e25121ef002a41829d8868700d119392"},
+ {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:35f6e77122a0c0762268216315bf239cf52b88865bba522999dc38f1c52b9b47"},
+ {file = "Pillow-9.5.0-cp37-cp37m-win32.whl", hash = "sha256:aca1c196f407ec7cf04dcbb15d19a43c507a81f7ffc45b690899d6a76ac9fda7"},
+ {file = "Pillow-9.5.0-cp37-cp37m-win_amd64.whl", hash = "sha256:322724c0032af6692456cd6ed554bb85f8149214d97398bb80613b04e33769f6"},
+ {file = "Pillow-9.5.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:a0aa9417994d91301056f3d0038af1199eb7adc86e646a36b9e050b06f526597"},
+ {file = "Pillow-9.5.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f8286396b351785801a976b1e85ea88e937712ee2c3ac653710a4a57a8da5d9c"},
+ {file = "Pillow-9.5.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c830a02caeb789633863b466b9de10c015bded434deb3ec87c768e53752ad22a"},
+ {file = "Pillow-9.5.0-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fbd359831c1657d69bb81f0db962905ee05e5e9451913b18b831febfe0519082"},
+ {file = "Pillow-9.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8fc330c3370a81bbf3f88557097d1ea26cd8b019d6433aa59f71195f5ddebbf"},
+ {file = "Pillow-9.5.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:7002d0797a3e4193c7cdee3198d7c14f92c0836d6b4a3f3046a64bd1ce8df2bf"},
+ {file = "Pillow-9.5.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:229e2c79c00e85989a34b5981a2b67aa079fd08c903f0aaead522a1d68d79e51"},
+ {file = "Pillow-9.5.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:9adf58f5d64e474bed00d69bcd86ec4bcaa4123bfa70a65ce72e424bfb88ed96"},
+ {file = "Pillow-9.5.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:662da1f3f89a302cc22faa9f14a262c2e3951f9dbc9617609a47521c69dd9f8f"},
+ {file = "Pillow-9.5.0-cp38-cp38-win32.whl", hash = "sha256:6608ff3bf781eee0cd14d0901a2b9cc3d3834516532e3bd673a0a204dc8615fc"},
+ {file = "Pillow-9.5.0-cp38-cp38-win_amd64.whl", hash = "sha256:e49eb4e95ff6fd7c0c402508894b1ef0e01b99a44320ba7d8ecbabefddcc5569"},
+ {file = "Pillow-9.5.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:482877592e927fd263028c105b36272398e3e1be3269efda09f6ba21fd83ec66"},
+ {file = "Pillow-9.5.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3ded42b9ad70e5f1754fb7c2e2d6465a9c842e41d178f262e08b8c85ed8a1d8e"},
+ {file = "Pillow-9.5.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c446d2245ba29820d405315083d55299a796695d747efceb5717a8b450324115"},
+ {file = "Pillow-9.5.0-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8aca1152d93dcc27dc55395604dcfc55bed5f25ef4c98716a928bacba90d33a3"},
+ {file = "Pillow-9.5.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:608488bdcbdb4ba7837461442b90ea6f3079397ddc968c31265c1e056964f1ef"},
+ {file = "Pillow-9.5.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:60037a8db8750e474af7ffc9faa9b5859e6c6d0a50e55c45576bf28be7419705"},
+ {file = "Pillow-9.5.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:07999f5834bdc404c442146942a2ecadd1cb6292f5229f4ed3b31e0a108746b1"},
+ {file = "Pillow-9.5.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:a127ae76092974abfbfa38ca2d12cbeddcdeac0fb71f9627cc1135bedaf9d51a"},
+ {file = "Pillow-9.5.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:489f8389261e5ed43ac8ff7b453162af39c3e8abd730af8363587ba64bb2e865"},
+ {file = "Pillow-9.5.0-cp39-cp39-win32.whl", hash = "sha256:9b1af95c3a967bf1da94f253e56b6286b50af23392a886720f563c547e48e964"},
+ {file = "Pillow-9.5.0-cp39-cp39-win_amd64.whl", hash = "sha256:77165c4a5e7d5a284f10a6efaa39a0ae8ba839da344f20b111d62cc932fa4e5d"},
+ {file = "Pillow-9.5.0-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = "sha256:833b86a98e0ede388fa29363159c9b1a294b0905b5128baf01db683672f230f5"},
+ {file = "Pillow-9.5.0-pp38-pypy38_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:aaf305d6d40bd9632198c766fb64f0c1a83ca5b667f16c1e79e1661ab5060140"},
+ {file = "Pillow-9.5.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0852ddb76d85f127c135b6dd1f0bb88dbb9ee990d2cd9aa9e28526c93e794fba"},
+ {file = "Pillow-9.5.0-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:91ec6fe47b5eb5a9968c79ad9ed78c342b1f97a091677ba0e012701add857829"},
+ {file = "Pillow-9.5.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:cb841572862f629b99725ebaec3287fc6d275be9b14443ea746c1dd325053cbd"},
+ {file = "Pillow-9.5.0-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:c380b27d041209b849ed246b111b7c166ba36d7933ec6e41175fd15ab9eb1572"},
+ {file = "Pillow-9.5.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7c9af5a3b406a50e313467e3565fc99929717f780164fe6fbb7704edba0cebbe"},
+ {file = "Pillow-9.5.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5671583eab84af046a397d6d0ba25343c00cd50bce03787948e0fff01d4fd9b1"},
+ {file = "Pillow-9.5.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:84a6f19ce086c1bf894644b43cd129702f781ba5751ca8572f08aa40ef0ab7b7"},
+ {file = "Pillow-9.5.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:1e7723bd90ef94eda669a3c2c19d549874dd5badaeefabefd26053304abe5799"},
+ {file = "Pillow-9.5.0.tar.gz", hash = "sha256:bf548479d336726d7a0eceb6e767e179fbde37833ae42794602631a070d630f1"},
]
[package.extras]
-docs = ["furo", "olefile", "sphinx (>=2.4)", "sphinx-copybutton", "sphinx-inline-tabs", "sphinx-issues (>=3.0.1)", "sphinx-removed-in", "sphinxext-opengraph"]
+docs = ["furo", "olefile", "sphinx (>=2.4)", "sphinx-copybutton", "sphinx-inline-tabs", "sphinx-removed-in", "sphinxext-opengraph"]
tests = ["check-manifest", "coverage", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout"]
[[package]]
name = "platformdirs"
-version = "2.6.2"
+version = "3.5.1"
description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
- {file = "platformdirs-2.6.2-py3-none-any.whl", hash = "sha256:83c8f6d04389165de7c9b6f0c682439697887bca0aa2f1c87ef1826be3584490"},
- {file = "platformdirs-2.6.2.tar.gz", hash = "sha256:e1fea1fe471b9ff8332e229df3cb7de4f53eeea4998d3b6bfff542115e998bd2"},
+ {file = "platformdirs-3.5.1-py3-none-any.whl", hash = "sha256:e2378146f1964972c03c085bb5662ae80b2b8c06226c54b2ff4aa9483e8a13a5"},
+ {file = "platformdirs-3.5.1.tar.gz", hash = "sha256:412dae91f52a6f84830f39a8078cecd0e866cb72294a5c66808e74d5e88d251f"},
]
[package.extras]
-docs = ["furo (>=2022.12.7)", "proselint (>=0.13)", "sphinx (>=5.3)", "sphinx-autodoc-typehints (>=1.19.5)"]
-test = ["appdirs (==1.4.4)", "covdefaults (>=2.2.2)", "pytest (>=7.2)", "pytest-cov (>=4)", "pytest-mock (>=3.10)"]
+docs = ["furo (>=2023.3.27)", "proselint (>=0.13)", "sphinx (>=6.2.1)", "sphinx-autodoc-typehints (>=1.23,!=1.23.4)"]
+test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=7.3.1)", "pytest-cov (>=4)", "pytest-mock (>=3.10)"]
[[package]]
name = "pluggy"
@@ -2049,16 +2103,31 @@ files = [
{file = "ply-3.11.tar.gz", hash = "sha256:00c7c1aaa88358b9c765b6d3000c6eec0ba42abca5351b095321aef446081da3"},
]
+[[package]]
+name = "pockets"
+version = "0.9.1"
+description = "A collection of helpful Python tools!"
+category = "dev"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pockets-0.9.1-py2.py3-none-any.whl", hash = "sha256:68597934193c08a08eb2bf6a1d85593f627c22f9b065cc727a4f03f669d96d86"},
+ {file = "pockets-0.9.1.tar.gz", hash = "sha256:9320f1a3c6f7a9133fe3b571f283bcf3353cd70249025ae8d618e40e9f7e92b3"},
+]
+
+[package.dependencies]
+six = ">=1.5.2"
+
[[package]]
name = "pre-commit"
-version = "2.21.0"
+version = "3.3.2"
description = "A framework for managing and maintaining multi-language pre-commit hooks."
category = "dev"
optional = false
-python-versions = ">=3.7"
+python-versions = ">=3.8"
files = [
- {file = "pre_commit-2.21.0-py2.py3-none-any.whl", hash = "sha256:e2f91727039fc39a92f58a588a25b87f936de6567eed4f0e673e0507edc75bad"},
- {file = "pre_commit-2.21.0.tar.gz", hash = "sha256:31ef31af7e474a8d8995027fefdfcf509b5c913ff31f2015b4ec4beb26a6f658"},
+ {file = "pre_commit-3.3.2-py2.py3-none-any.whl", hash = "sha256:8056bc52181efadf4aac792b1f4f255dfd2fb5a350ded7335d251a68561e8cb6"},
+ {file = "pre_commit-3.3.2.tar.gz", hash = "sha256:66e37bec2d882de1f17f88075047ef8962581f83c234ac08da21a0c58953d1f0"},
]
[package.dependencies]
@@ -2070,38 +2139,37 @@ virtualenv = ">=20.10.0"
[[package]]
name = "prefixed"
-version = "0.6.0"
+version = "0.7.0"
description = "Prefixed alternative numeric library"
category = "main"
optional = false
python-versions = "*"
files = [
- {file = "prefixed-0.6.0-py2.py3-none-any.whl", hash = "sha256:5ab094773dc71df68cc78151c81510b9521dcc6b58a4acb78442b127d4e400fa"},
- {file = "prefixed-0.6.0.tar.gz", hash = "sha256:b39fbfac72618fa1eeb5b3fd9ed1341f10dd90df75499cb4c38a6c3ef47cdd94"},
+ {file = "prefixed-0.7.0-py2.py3-none-any.whl", hash = "sha256:537b0e4ff4516c4578f277a41d7104f769d6935ae9cdb0f88fed82ec7b3c0ca5"},
+ {file = "prefixed-0.7.0.tar.gz", hash = "sha256:0b54d15e602eb8af4ac31b1db21a37ea95ce5890e0741bb0dd9ded493cefbbe9"},
]
[[package]]
name = "protobuf"
-version = "4.21.12"
+version = "4.23.2"
description = ""
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "protobuf-4.21.12-cp310-abi3-win32.whl", hash = "sha256:b135410244ebe777db80298297a97fbb4c862c881b4403b71bac9d4107d61fd1"},
- {file = "protobuf-4.21.12-cp310-abi3-win_amd64.whl", hash = "sha256:89f9149e4a0169cddfc44c74f230d7743002e3aa0b9472d8c28f0388102fc4c2"},
- {file = "protobuf-4.21.12-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:299ea899484ee6f44604deb71f424234f654606b983cb496ea2a53e3c63ab791"},
- {file = "protobuf-4.21.12-cp37-abi3-manylinux2014_aarch64.whl", hash = "sha256:d1736130bce8cf131ac7957fa26880ca19227d4ad68b4888b3be0dea1f95df97"},
- {file = "protobuf-4.21.12-cp37-abi3-manylinux2014_x86_64.whl", hash = "sha256:78a28c9fa223998472886c77042e9b9afb6fe4242bd2a2a5aced88e3f4422aa7"},
- {file = "protobuf-4.21.12-cp37-cp37m-win32.whl", hash = "sha256:3d164928ff0727d97022957c2b849250ca0e64777ee31efd7d6de2e07c494717"},
- {file = "protobuf-4.21.12-cp37-cp37m-win_amd64.whl", hash = "sha256:f45460f9ee70a0ec1b6694c6e4e348ad2019275680bd68a1d9314b8c7e01e574"},
- {file = "protobuf-4.21.12-cp38-cp38-win32.whl", hash = "sha256:6ab80df09e3208f742c98443b6166bcb70d65f52cfeb67357d52032ea1ae9bec"},
- {file = "protobuf-4.21.12-cp38-cp38-win_amd64.whl", hash = "sha256:1f22ac0ca65bb70a876060d96d914dae09ac98d114294f77584b0d2644fa9c30"},
- {file = "protobuf-4.21.12-cp39-cp39-win32.whl", hash = "sha256:27f4d15021da6d2b706ddc3860fac0a5ddaba34ab679dc182b60a8bb4e1121cc"},
- {file = "protobuf-4.21.12-cp39-cp39-win_amd64.whl", hash = "sha256:237216c3326d46808a9f7c26fd1bd4b20015fb6867dc5d263a493ef9a539293b"},
- {file = "protobuf-4.21.12-py2.py3-none-any.whl", hash = "sha256:a53fd3f03e578553623272dc46ac2f189de23862e68565e83dde203d41b76fc5"},
- {file = "protobuf-4.21.12-py3-none-any.whl", hash = "sha256:b98d0148f84e3a3c569e19f52103ca1feacdac0d2df8d6533cf983d1fda28462"},
- {file = "protobuf-4.21.12.tar.gz", hash = "sha256:7cd532c4566d0e6feafecc1059d04c7915aec8e182d1cf7adee8b24ef1e2e6ab"},
+ {file = "protobuf-4.23.2-cp310-abi3-win32.whl", hash = "sha256:384dd44cb4c43f2ccddd3645389a23ae61aeb8cfa15ca3a0f60e7c3ea09b28b3"},
+ {file = "protobuf-4.23.2-cp310-abi3-win_amd64.whl", hash = "sha256:09310bce43353b46d73ba7e3bca78273b9bc50349509b9698e64d288c6372c2a"},
+ {file = "protobuf-4.23.2-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:b2cfab63a230b39ae603834718db74ac11e52bccaaf19bf20f5cce1a84cf76df"},
+ {file = "protobuf-4.23.2-cp37-abi3-manylinux2014_aarch64.whl", hash = "sha256:c52cfcbfba8eb791255edd675c1fe6056f723bf832fa67f0442218f8817c076e"},
+ {file = "protobuf-4.23.2-cp37-abi3-manylinux2014_x86_64.whl", hash = "sha256:86df87016d290143c7ce3be3ad52d055714ebaebb57cc659c387e76cfacd81aa"},
+ {file = "protobuf-4.23.2-cp37-cp37m-win32.whl", hash = "sha256:281342ea5eb631c86697e1e048cb7e73b8a4e85f3299a128c116f05f5c668f8f"},
+ {file = "protobuf-4.23.2-cp37-cp37m-win_amd64.whl", hash = "sha256:ce744938406de1e64b91410f473736e815f28c3b71201302612a68bf01517fea"},
+ {file = "protobuf-4.23.2-cp38-cp38-win32.whl", hash = "sha256:6c081863c379bb1741be8f8193e893511312b1d7329b4a75445d1ea9955be69e"},
+ {file = "protobuf-4.23.2-cp38-cp38-win_amd64.whl", hash = "sha256:25e3370eda26469b58b602e29dff069cfaae8eaa0ef4550039cc5ef8dc004511"},
+ {file = "protobuf-4.23.2-cp39-cp39-win32.whl", hash = "sha256:efabbbbac1ab519a514579ba9ec52f006c28ae19d97915951f69fa70da2c9e91"},
+ {file = "protobuf-4.23.2-cp39-cp39-win_amd64.whl", hash = "sha256:54a533b971288af3b9926e53850c7eb186886c0c84e61daa8444385a4720297f"},
+ {file = "protobuf-4.23.2-py3-none-any.whl", hash = "sha256:8da6070310d634c99c0db7df48f10da495cc283fd9e9234877f0cd182d43ab7f"},
+ {file = "protobuf-4.23.2.tar.gz", hash = "sha256:20874e7ca4436f683b64ebdbee2129a5a2c301579a67d1a7dda2cdf62fb7f5f7"},
]
[[package]]
@@ -2116,54 +2184,43 @@ files = [
{file = "py-1.11.0.tar.gz", hash = "sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719"},
]
-[[package]]
-name = "pyaaf2"
-version = "1.4.0"
-description = "A python module for reading and writing advanced authoring format files"
-category = "main"
-optional = false
-python-versions = "*"
-files = [
- {file = "pyaaf2-1.4.0.tar.gz", hash = "sha256:160d3c26c7cfef7176d0bdb0e55772156570435982c3abfa415e89639f76e71b"},
-]
-
[[package]]
name = "pyasn1"
-version = "0.4.8"
-description = "ASN.1 types and codecs"
+version = "0.5.0"
+description = "Pure-Python implementation of ASN.1 types and DER/BER/CER codecs (X.208)"
category = "main"
optional = false
-python-versions = "*"
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
files = [
- {file = "pyasn1-0.4.8-py2.py3-none-any.whl", hash = "sha256:39c7e2ec30515947ff4e87fb6f456dfc6e84857d34be479c9d4a4ba4bf46aa5d"},
- {file = "pyasn1-0.4.8.tar.gz", hash = "sha256:aef77c9fb94a3ac588e87841208bdec464471d9871bd5050a287cc9a475cd0ba"},
+ {file = "pyasn1-0.5.0-py2.py3-none-any.whl", hash = "sha256:87a2121042a1ac9358cabcaf1d07680ff97ee6404333bacca15f76aa8ad01a57"},
+ {file = "pyasn1-0.5.0.tar.gz", hash = "sha256:97b7290ca68e62a832558ec3976f15cbf911bf5d7c7039d8b861c2a0ece69fde"},
]
[[package]]
name = "pyasn1-modules"
-version = "0.2.8"
-description = "A collection of ASN.1-based protocols modules."
+version = "0.3.0"
+description = "A collection of ASN.1-based protocols modules"
category = "main"
optional = false
-python-versions = "*"
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
files = [
- {file = "pyasn1-modules-0.2.8.tar.gz", hash = "sha256:905f84c712230b2c592c19470d3ca8d552de726050d1d1716282a1f6146be65e"},
- {file = "pyasn1_modules-0.2.8-py2.py3-none-any.whl", hash = "sha256:a50b808ffeb97cb3601dd25981f6b016cbb3d31fbf57a8b8a87428e6158d0c74"},
+ {file = "pyasn1_modules-0.3.0-py2.py3-none-any.whl", hash = "sha256:d3ccd6ed470d9ffbc716be08bd90efbd44d0734bc9303818f7336070984a162d"},
+ {file = "pyasn1_modules-0.3.0.tar.gz", hash = "sha256:5bd01446b736eb9d31512a30d46c1ac3395d676c6f3cafa4c03eb54b9925631c"},
]
[package.dependencies]
-pyasn1 = ">=0.4.6,<0.5.0"
+pyasn1 = ">=0.4.6,<0.6.0"
[[package]]
name = "pyblish-base"
-version = "1.8.8"
+version = "1.8.11"
description = "Plug-in driven automation framework for content"
category = "main"
optional = false
python-versions = "*"
files = [
- {file = "pyblish-base-1.8.8.tar.gz", hash = "sha256:85a2c034dbb86345bf95018f5b7b3c36c7dda29ea4d93c10d167f147b69a7b22"},
- {file = "pyblish_base-1.8.8-py2.py3-none-any.whl", hash = "sha256:67ea253a05d007ab4a175e44e778928ea7bdb0e9707573e1100417bbf0451a53"},
+ {file = "pyblish-base-1.8.11.tar.gz", hash = "sha256:86dfeec0567430eb7eb25f89a18312054147a729ec66f6ac8c7e421fd15b66e1"},
+ {file = "pyblish_base-1.8.11-py2.py3-none-any.whl", hash = "sha256:c321be7020c946fe9dfa11941241bd985a572c5009198b4f9810e5afad1f0b4b"},
]
[[package]]
@@ -2192,20 +2249,21 @@ files = [
[[package]]
name = "pydocstyle"
-version = "3.0.0"
+version = "6.3.0"
description = "Python docstring style checker"
category = "dev"
optional = false
-python-versions = "*"
+python-versions = ">=3.6"
files = [
- {file = "pydocstyle-3.0.0-py2-none-any.whl", hash = "sha256:2258f9b0df68b97bf3a6c29003edc5238ff8879f1efb6f1999988d934e432bd8"},
- {file = "pydocstyle-3.0.0-py3-none-any.whl", hash = "sha256:ed79d4ec5e92655eccc21eb0c6cf512e69512b4a97d215ace46d17e4990f2039"},
- {file = "pydocstyle-3.0.0.tar.gz", hash = "sha256:5741c85e408f9e0ddf873611085e819b809fca90b619f5fd7f34bd4959da3dd4"},
+ {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
+ {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
]
[package.dependencies]
-six = "*"
-snowballstemmer = "*"
+snowballstemmer = ">=2.2.0"
+
+[package.extras]
+toml = ["tomli (>=1.2.3)"]
[[package]]
name = "pyflakes"
@@ -2221,14 +2279,14 @@ files = [
[[package]]
name = "pygments"
-version = "2.14.0"
+version = "2.15.1"
description = "Pygments is a syntax highlighting package written in Python."
category = "dev"
optional = false
-python-versions = ">=3.6"
+python-versions = ">=3.7"
files = [
- {file = "Pygments-2.14.0-py3-none-any.whl", hash = "sha256:fa7bd7bd2771287c0de303af8bfdfc731f51bd2c6a47ab69d117138893b82717"},
- {file = "Pygments-2.14.0.tar.gz", hash = "sha256:b3ed06a9e8ac9a9aae5a6f5dbe78a8a58655d17b43b93c078f094ddc476ae297"},
+ {file = "Pygments-2.15.1-py3-none-any.whl", hash = "sha256:db2db3deb4b4179f399a09054b023b6a586b76499d36965813c71aa8ed7b5fd1"},
+ {file = "Pygments-2.15.1.tar.gz", hash = "sha256:8ace4d3c1dd481894b2005f560ead0f9f19ee64fe983366be1a21e171d12775c"},
]
[package.extras]
@@ -2236,18 +2294,18 @@ plugins = ["importlib-metadata"]
[[package]]
name = "pylint"
-version = "2.15.10"
+version = "2.17.4"
description = "python code static checker"
category = "dev"
optional = false
python-versions = ">=3.7.2"
files = [
- {file = "pylint-2.15.10-py3-none-any.whl", hash = "sha256:9df0d07e8948a1c3ffa3b6e2d7e6e63d9fb457c5da5b961ed63106594780cc7e"},
- {file = "pylint-2.15.10.tar.gz", hash = "sha256:b3dc5ef7d33858f297ac0d06cc73862f01e4f2e74025ec3eff347ce0bc60baf5"},
+ {file = "pylint-2.17.4-py3-none-any.whl", hash = "sha256:7a1145fb08c251bdb5cca11739722ce64a63db479283d10ce718b2460e54123c"},
+ {file = "pylint-2.17.4.tar.gz", hash = "sha256:5dcf1d9e19f41f38e4e85d10f511e5b9c35e1aa74251bf95cdd8cb23584e2db1"},
]
[package.dependencies]
-astroid = ">=2.12.13,<=2.14.0-dev0"
+astroid = ">=2.15.4,<=2.17.0-dev0"
colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
dill = {version = ">=0.2", markers = "python_version < \"3.11\""}
isort = ">=4.2.5,<6"
@@ -2438,83 +2496,83 @@ six = "*"
[[package]]
name = "pyobjc-core"
-version = "9.0.1"
+version = "9.1.1"
description = "Python<->ObjC Interoperability Module"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "pyobjc-core-9.0.1.tar.gz", hash = "sha256:5ce1510bb0bdff527c597079a42b2e13a19b7592e76850be7960a2775b59c929"},
- {file = "pyobjc_core-9.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:b614406d46175b1438a9596b664bf61952323116704d19bc1dea68052a0aad98"},
- {file = "pyobjc_core-9.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:bd397e729f6271c694fb70df8f5d3d3c9b2f2b8ac02fbbdd1757ca96027b94bb"},
- {file = "pyobjc_core-9.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:d919934eaa6d1cf1505ff447a5c2312be4c5651efcb694eb9f59e86f5bd25e6b"},
- {file = "pyobjc_core-9.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:67d67ca8b164f38ceacce28a18025845c3ec69613f3301935d4d2c4ceb22e3fd"},
- {file = "pyobjc_core-9.0.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:39d11d71f6161ac0bd93cffc8ea210bb0178b56d16a7408bf74283d6ecfa7430"},
- {file = "pyobjc_core-9.0.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:25be1c4d530e473ed98b15063b8d6844f0733c98914de6f09fe1f7652b772bbc"},
+ {file = "pyobjc-core-9.1.1.tar.gz", hash = "sha256:4b6cb9053b5fcd3c0e76b8c8105a8110786b20f3403c5643a688c5ec51c55c6b"},
+ {file = "pyobjc_core-9.1.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:4bd07049fd9fe5b40e4b7c468af9cf942508387faf383a5acb043d20627bad2c"},
+ {file = "pyobjc_core-9.1.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:1a8307527621729ff2ab67860e7ed84f76ad0da881b248c2ef31e0da0088e4ba"},
+ {file = "pyobjc_core-9.1.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:083004d28b92ccb483a41195c600728854843b0486566aba2d6e63eef51f80e6"},
+ {file = "pyobjc_core-9.1.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:d61e9517d451bc062a7fae8b3648f4deba4fa54a24926fa1cf581b90ef4ced5a"},
+ {file = "pyobjc_core-9.1.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:1626909916603a3b04c07c721cf1af0e0b892cec85bb3db98d05ba024f1786fc"},
+ {file = "pyobjc_core-9.1.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:2dde96462b52e952515d142e2afbb6913624a02c13582047e06211e6c3993728"},
]
[[package]]
name = "pyobjc-framework-applicationservices"
-version = "9.0.1"
+version = "9.1.1"
description = "Wrappers for the framework ApplicationServices on macOS"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "pyobjc-framework-ApplicationServices-9.0.1.tar.gz", hash = "sha256:e3a350781fdcab6c1da4343dfc54ae3c0523e59e61147432f61dcfb365752fde"},
- {file = "pyobjc_framework_ApplicationServices-9.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:c4214febf3cc2e417ae15d45b6502e5c20f1097cd042b025760d019fe69b07b6"},
- {file = "pyobjc_framework_ApplicationServices-9.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:c62693e01ba272fbadcd66677881311d2d63fda84b9662533fcc883c54be76d7"},
- {file = "pyobjc_framework_ApplicationServices-9.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:6829df4dc4cf012bdc221d4e0296d6699b33ca89741569df153989a0c18aa40e"},
- {file = "pyobjc_framework_ApplicationServices-9.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:5af5d12871499c429dd68c5ec4be56c631ec8439aa953c266eed9afdffb5ec2b"},
- {file = "pyobjc_framework_ApplicationServices-9.0.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:724da9dfae6ab0505b90340231a685720288caecfcca335b08903102e97a93dc"},
- {file = "pyobjc_framework_ApplicationServices-9.0.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8e1dbfc8f482c433ce642724d4bed0c527c7f2f2f8b9ba1ac3f778a68cf1538d"},
+ {file = "pyobjc-framework-ApplicationServices-9.1.1.tar.gz", hash = "sha256:50c613bee364150bbd6cd992ca32b0848a780922cb57d112f6a4a56e29802e19"},
+ {file = "pyobjc_framework_ApplicationServices-9.1.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f9286c05d80a6aafc7388a4c2a35801db9ea6bab960acf2df079110debb659cb"},
+ {file = "pyobjc_framework_ApplicationServices-9.1.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:3db1c79d7420052320529432e8562cd339a7ef0841df83a85bbf3648abb55b6b"},
+ {file = "pyobjc_framework_ApplicationServices-9.1.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:baf5a0d72c9e2d2a3b402823a2ea53eccdc27b8b9319d61cee7d753a30cb9411"},
+ {file = "pyobjc_framework_ApplicationServices-9.1.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:7cc5aad93bb6b178f838fe9b78cdcf1217c7baab157b1f3525e0acf696cc3490"},
+ {file = "pyobjc_framework_ApplicationServices-9.1.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:a03434605873b9f83255a0b16bbc539d06afd77f5969a3b11a1fc293dfd56680"},
+ {file = "pyobjc_framework_ApplicationServices-9.1.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:9f18e9b92674be0e503a2bd451a328693450e6f80ee510bbc375238b14117e24"},
]
[package.dependencies]
-pyobjc-core = ">=9.0.1"
-pyobjc-framework-Cocoa = ">=9.0.1"
-pyobjc-framework-Quartz = ">=9.0.1"
+pyobjc-core = ">=9.1.1"
+pyobjc-framework-Cocoa = ">=9.1.1"
+pyobjc-framework-Quartz = ">=9.1.1"
[[package]]
name = "pyobjc-framework-cocoa"
-version = "9.0.1"
+version = "9.1.1"
description = "Wrappers for the Cocoa frameworks on macOS"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "pyobjc-framework-Cocoa-9.0.1.tar.gz", hash = "sha256:a8b53b3426f94307a58e2f8214dc1094c19afa9dcb96f21be12f937d968b2df3"},
- {file = "pyobjc_framework_Cocoa-9.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:5f94b0f92a62b781e633e58f09bcaded63d612f9b1e15202f5f372ea59e4aebd"},
- {file = "pyobjc_framework_Cocoa-9.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:f062c3bb5cc89902e6d164aa9a66ffc03638645dd5f0468b6f525ac997c86e51"},
- {file = "pyobjc_framework_Cocoa-9.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0b374c0a9d32ba4fc5610ab2741cb05a005f1dfb82a47dbf2dbb2b3a34b73ce5"},
- {file = "pyobjc_framework_Cocoa-9.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:8928080cebbce91ac139e460d3dfc94c7cb6935be032dcae9c0a51b247f9c2d9"},
- {file = "pyobjc_framework_Cocoa-9.0.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:9d2bd86a0a98d906f762f5dc59f2fc67cce32ae9633b02ff59ac8c8a33dd862d"},
- {file = "pyobjc_framework_Cocoa-9.0.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:2a41053cbcee30e1e8914efa749c50b70bf782527d5938f2bc2a6393740969ce"},
+ {file = "pyobjc-framework-Cocoa-9.1.1.tar.gz", hash = "sha256:345c32b6d1f3db45f635e400f2d0d6c0f0f7349d45ec823f76fc1df43d13caeb"},
+ {file = "pyobjc_framework_Cocoa-9.1.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:9176a4276f3b4b4758e9b9ca10698be5341ceffaeaa4fa055133417179e6bc37"},
+ {file = "pyobjc_framework_Cocoa-9.1.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5e1e96fb3461f46ff951413515f2029e21be268b0e033db6abee7b64ec8e93d3"},
+ {file = "pyobjc_framework_Cocoa-9.1.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:083b195c496d30c6b9dd86126a6093c4b95e0138e9b052b13e54103fcc0b4872"},
+ {file = "pyobjc_framework_Cocoa-9.1.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:a1b3333b1aa045608848bd68bbab4c31171f36aeeaa2fabeb4527c6f6f1e33cd"},
+ {file = "pyobjc_framework_Cocoa-9.1.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:54c017354671f0d955432986c42218e452ca69906a101c8e7acde8510432303a"},
+ {file = "pyobjc_framework_Cocoa-9.1.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:10c0075688ce95b92caf59e368585fffdcc98c919bc345067af070222f5d01d2"},
]
[package.dependencies]
-pyobjc-core = ">=9.0.1"
+pyobjc-core = ">=9.1.1"
[[package]]
name = "pyobjc-framework-quartz"
-version = "9.0.1"
+version = "9.1.1"
description = "Wrappers for the Quartz frameworks on macOS"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "pyobjc-framework-Quartz-9.0.1.tar.gz", hash = "sha256:7e2e37fc5c01bbdc37c1355d886e6184d1977043d5a05d1d956573fa8503dac3"},
- {file = "pyobjc_framework_Quartz-9.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:13a546a2af7c1c5c2bbf88cce6891896a449e92466415ad14d9a5ee93fba6ef3"},
- {file = "pyobjc_framework_Quartz-9.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:93ee6e339ab6928115a92188a0162ec80bf62cd0bd908d54695c1b9f9381ea45"},
- {file = "pyobjc_framework_Quartz-9.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:066ffbe26de1456f79a6d9467dabd6a3b9ef228318a0ba3f3fedbdbc0e2d3444"},
- {file = "pyobjc_framework_Quartz-9.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:0c9b553be6ef672e0886b0d2c77d1841b1a942c7b1dc9a67f6e1376dc5493513"},
- {file = "pyobjc_framework_Quartz-9.0.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:7b39f85d0b747b0a13a11d0d538001b757c82d05e656eab437167b5b118307df"},
- {file = "pyobjc_framework_Quartz-9.0.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:0bedb6e1b7789d5b24fd5c790f0d53e4c62930313c97a891068bfa0e966ccc0b"},
+ {file = "pyobjc-framework-Quartz-9.1.1.tar.gz", hash = "sha256:8d03bc52bd6d90f00f274fd709b82e53dc5dfca19f3fc744997634e03faaa159"},
+ {file = "pyobjc_framework_Quartz-9.1.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:32602f46353a5eadb0843a0940635c8ec103f47d5b1ce84284604e01c6393fa8"},
+ {file = "pyobjc_framework_Quartz-9.1.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7b3a56f52f9bb7fbd45c5a5f0de312ee9c104dfce6e1731015048d9e65a95e43"},
+ {file = "pyobjc_framework_Quartz-9.1.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b3138773dfb269e6e3894e20dcfaf90102bad84ba44aa2bba8683b8426a69cdd"},
+ {file = "pyobjc_framework_Quartz-9.1.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3b583e6953e9c65525db908c33c1c97cead3ac8aa0cf2759fcc568666a1b7373"},
+ {file = "pyobjc_framework_Quartz-9.1.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:c3efcbba62e9c5351c2a9469faabb7f400f214cd8cf98f57798d6b6c93c76efb"},
+ {file = "pyobjc_framework_Quartz-9.1.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:a82d43c6c5fe0f5d350cfc97212bef7c572e345aa9c6e23909d23dace6448c99"},
]
[package.dependencies]
-pyobjc-core = ">=9.0.1"
-pyobjc-framework-Cocoa = ">=9.0.1"
+pyobjc-core = ">=9.1.1"
+pyobjc-framework-Cocoa = ">=9.1.1"
[[package]]
name = "pyparsing"
@@ -2569,14 +2627,14 @@ testing = ["argcomplete", "hypothesis (>=3.56)", "mock", "nose", "requests", "xm
[[package]]
name = "pytest-cov"
-version = "4.0.0"
+version = "4.1.0"
description = "Pytest plugin for measuring coverage."
category = "dev"
optional = false
-python-versions = ">=3.6"
+python-versions = ">=3.7"
files = [
- {file = "pytest-cov-4.0.0.tar.gz", hash = "sha256:996b79efde6433cdbd0088872dbc5fb3ed7fe1578b68cdbba634f14bb8dd0470"},
- {file = "pytest_cov-4.0.0-py3-none-any.whl", hash = "sha256:2feb1b751d66a8bd934e5edfa2e961d11309dc37b73b0eabe73b5945fee20f6b"},
+ {file = "pytest-cov-4.1.0.tar.gz", hash = "sha256:3904b13dfbfec47f003b8e77fd5b589cd11904a21ddf1ab38a64f204d6a10ef6"},
+ {file = "pytest_cov-4.1.0-py3-none-any.whl", hash = "sha256:6ba70b9e97e69fcc3fb45bfeab2d0a138fb65c4d0d6a41ef33983ad114be8c3a"},
]
[package.dependencies]
@@ -2621,43 +2679,40 @@ six = ">=1.5"
[[package]]
name = "python-engineio"
-version = "3.14.2"
-description = "Engine.IO server"
+version = "4.4.1"
+description = "Engine.IO server and client for Python"
category = "main"
optional = false
-python-versions = "*"
+python-versions = ">=3.6"
files = [
- {file = "python-engineio-3.14.2.tar.gz", hash = "sha256:eab4553f2804c1ce97054c8b22cf0d5a9ab23128075248b97e1a5b2f29553085"},
- {file = "python_engineio-3.14.2-py2.py3-none-any.whl", hash = "sha256:5a9e6086d192463b04a1428ff1f85b6ba631bbb19d453b144ffc04f530542b84"},
+ {file = "python-engineio-4.4.1.tar.gz", hash = "sha256:eb3663ecb300195926b526386f712dff84cd092c818fb7b62eeeda9160120c29"},
+ {file = "python_engineio-4.4.1-py3-none-any.whl", hash = "sha256:28ab67f94cba2e5f598cbb04428138fd6bb8b06d3478c939412da445f24f0773"},
]
-[package.dependencies]
-six = ">=1.9.0"
-
[package.extras]
asyncio-client = ["aiohttp (>=3.4)"]
client = ["requests (>=2.21.0)", "websocket-client (>=0.54.0)"]
[[package]]
name = "python-socketio"
-version = "4.6.1"
-description = "Socket.IO server"
+version = "5.8.0"
+description = "Socket.IO server and client for Python"
category = "main"
optional = false
-python-versions = "*"
+python-versions = ">=3.6"
files = [
- {file = "python-socketio-4.6.1.tar.gz", hash = "sha256:cd1f5aa492c1eb2be77838e837a495f117e17f686029ebc03d62c09e33f4fa10"},
- {file = "python_socketio-4.6.1-py2.py3-none-any.whl", hash = "sha256:5a21da53fdbdc6bb6c8071f40e13d100e0b279ad997681c2492478e06f370523"},
+ {file = "python-socketio-5.8.0.tar.gz", hash = "sha256:e714f4dddfaaa0cb0e37a1e2deef2bb60590a5b9fea9c343dd8ca5e688416fd9"},
+ {file = "python_socketio-5.8.0-py3-none-any.whl", hash = "sha256:7adb8867aac1c2929b9c1429f1c02e12ca4c36b67c807967393e367dfbb01441"},
]
[package.dependencies]
-python-engineio = ">=3.13.0,<4"
+bidict = ">=0.21.0"
+python-engineio = ">=4.3.0"
requests = {version = ">=2.21.0", optional = true, markers = "extra == \"client\""}
-six = ">=1.9.0"
websocket-client = {version = ">=0.54.0", optional = true, markers = "extra == \"client\""}
[package.extras]
-asyncio-client = ["aiohttp (>=3.4)", "websockets (>=7.0)"]
+asyncio-client = ["aiohttp (>=3.4)"]
client = ["requests (>=2.21.0)", "websocket-client (>=0.54.0)"]
[[package]]
@@ -2686,18 +2741,6 @@ files = [
{file = "python3-xlib-0.15.tar.gz", hash = "sha256:dc4245f3ae4aa5949c1d112ee4723901ade37a96721ba9645f2bfa56e5b383f8"},
]
-[[package]]
-name = "pytz"
-version = "2022.7.1"
-description = "World timezone definitions, modern and historical"
-category = "dev"
-optional = false
-python-versions = "*"
-files = [
- {file = "pytz-2022.7.1-py2.py3-none-any.whl", hash = "sha256:78f4f37d8198e0627c5f1143240bb0206b8691d8d7ac6d78fee88b78733f8c4a"},
- {file = "pytz-2022.7.1.tar.gz", hash = "sha256:01a0681c4b9684a28304615eba55d1ab31ae00bf68ec157ec3708a8182dbbcd0"},
-]
-
[[package]]
name = "pywin32"
version = "301"
@@ -2782,16 +2825,19 @@ files = [
[[package]]
name = "qt-py"
-version = "1.3.7"
+version = "1.3.8"
description = "Python 2 & 3 compatibility wrapper around all Qt bindings - PySide, PySide2, PyQt4 and PyQt5."
category = "main"
optional = false
python-versions = "*"
files = [
- {file = "Qt.py-1.3.7-py2.py3-none-any.whl", hash = "sha256:150099d1c6f64c9621a2c9d79d45102ec781c30ee30ee69fc082c6e9be7324fe"},
- {file = "Qt.py-1.3.7.tar.gz", hash = "sha256:803c7bdf4d6230f9a466be19d55934a173eabb61406d21cb91e80c2a3f773b1f"},
+ {file = "Qt.py-1.3.8-py2.py3-none-any.whl", hash = "sha256:665b9d4cfefaff2d697876d5027e145a0e0b1ba62dda9652ea114db134bc9911"},
+ {file = "Qt.py-1.3.8.tar.gz", hash = "sha256:6d330928f7ec8db8e329b19116c52482b6abfaccfa5edef0248e57d012300895"},
]
+[package.dependencies]
+types-PySide2 = "*"
+
[[package]]
name = "qtawesome"
version = "0.7.3"
@@ -2810,14 +2856,14 @@ six = "*"
[[package]]
name = "qtpy"
-version = "2.3.0"
+version = "2.3.1"
description = "Provides an abstraction layer on top of the various Qt bindings (PyQt5/6 and PySide2/6)."
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "QtPy-2.3.0-py3-none-any.whl", hash = "sha256:8d6d544fc20facd27360ea189592e6135c614785f0dec0b4f083289de6beb408"},
- {file = "QtPy-2.3.0.tar.gz", hash = "sha256:0603c9c83ccc035a4717a12908bf6bc6cb22509827ea2ec0e94c2da7c9ed57c5"},
+ {file = "QtPy-2.3.1-py3-none-any.whl", hash = "sha256:5193d20e0b16e4d9d3bc2c642d04d9f4e2c892590bd1b9c92bfe38a95d5a2e12"},
+ {file = "QtPy-2.3.1.tar.gz", hash = "sha256:a8c74982d6d172ce124d80cafd39653df78989683f760f2281ba91a6e7b9de8b"},
]
[package.dependencies]
@@ -2845,26 +2891,48 @@ sphinx = ">=1.3.1"
[[package]]
name = "requests"
-version = "2.28.1"
+version = "2.31.0"
description = "Python HTTP for Humans."
category = "main"
optional = false
-python-versions = ">=3.7, <4"
+python-versions = ">=3.7"
files = [
- {file = "requests-2.28.1-py3-none-any.whl", hash = "sha256:8fefa2a1a1365bf5520aac41836fbee479da67864514bdb821f31ce07ce65349"},
- {file = "requests-2.28.1.tar.gz", hash = "sha256:7c5599b102feddaa661c826c56ab4fee28bfd17f5abca1ebbe3e7f19d7c97983"},
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
]
[package.dependencies]
certifi = ">=2017.4.17"
-charset-normalizer = ">=2,<3"
+charset-normalizer = ">=2,<4"
idna = ">=2.5,<4"
-urllib3 = ">=1.21.1,<1.27"
+urllib3 = ">=1.21.1,<3"
[package.extras]
socks = ["PySocks (>=1.5.6,!=1.5.7)"]
use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+[[package]]
+name = "revitron-sphinx-theme"
+version = "0.7.2"
+description = "Revitron theme for Sphinx"
+category = "dev"
+optional = false
+python-versions = "*"
+files = []
+develop = false
+
+[package.dependencies]
+sphinx = "*"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client"]
+
+[package.source]
+type = "git"
+url = "https://github.com/revitron/revitron-sphinx-theme.git"
+reference = "master"
+resolved_reference = "c0779c66365d9d258d93575ebaff7db9d3aee282"
+
[[package]]
name = "rsa"
version = "4.9"
@@ -2955,19 +3023,19 @@ files = [
[[package]]
name = "slack-sdk"
-version = "3.19.5"
+version = "3.21.3"
description = "The Slack API Platform SDK for Python"
category = "main"
optional = false
python-versions = ">=3.6.0"
files = [
- {file = "slack_sdk-3.19.5-py2.py3-none-any.whl", hash = "sha256:0b52bb32a87c71f638b9eb47e228dffeebf89de5e762684ef848276f9f186c84"},
- {file = "slack_sdk-3.19.5.tar.gz", hash = "sha256:47fb4af596243fe6585a92f3034de21eb2104a55cc9fd59a92ef3af17cf9ddd8"},
+ {file = "slack_sdk-3.21.3-py2.py3-none-any.whl", hash = "sha256:de3c07b92479940b61cd68c566f49fbc9974c8f38f661d26244078f3903bb9cc"},
+ {file = "slack_sdk-3.21.3.tar.gz", hash = "sha256:20829bdc1a423ec93dac903470975ebf3bc76fd3fd91a4dadc0eeffc940ecb0c"},
]
[package.extras]
-optional = ["SQLAlchemy (>=1,<2)", "aiodns (>1.0)", "aiohttp (>=3.7.3,<4)", "boto3 (<=2)", "websocket-client (>=1,<2)", "websockets (>=10,<11)"]
-testing = ["Flask (>=1,<2)", "Flask-Sockets (>=0.2,<1)", "Jinja2 (==3.0.3)", "Werkzeug (<2)", "black (==22.8.0)", "boto3 (<=2)", "click (==8.0.4)", "codecov (>=2,<3)", "databases (>=0.5)", "flake8 (>=5,<6)", "itsdangerous (==1.1.0)", "moto (>=3,<4)", "psutil (>=5,<6)", "pytest (>=6.2.5,<7)", "pytest-asyncio (<1)", "pytest-cov (>=2,<3)"]
+optional = ["SQLAlchemy (>=1.4,<3)", "aiodns (>1.0)", "aiohttp (>=3.7.3,<4)", "boto3 (<=2)", "websocket-client (>=1,<2)", "websockets (>=10,<11)"]
+testing = ["Flask (>=1,<2)", "Flask-Sockets (>=0.2,<1)", "Jinja2 (==3.0.3)", "Werkzeug (<2)", "black (==22.8.0)", "boto3 (<=2)", "click (==8.0.4)", "databases (>=0.5)", "flake8 (>=5,<6)", "itsdangerous (==1.1.0)", "moto (>=3,<4)", "psutil (>=5,<6)", "pytest (>=6.2.5,<7)", "pytest-asyncio (<1)", "pytest-cov (>=2,<3)"]
[[package]]
name = "smmap"
@@ -3007,27 +3075,27 @@ files = [
[[package]]
name = "sphinx"
-version = "6.1.3"
+version = "5.3.0"
description = "Python documentation generator"
category = "dev"
optional = false
-python-versions = ">=3.8"
+python-versions = ">=3.6"
files = [
- {file = "Sphinx-6.1.3.tar.gz", hash = "sha256:0dac3b698538ffef41716cf97ba26c1c7788dba73ce6f150c1ff5b4720786dd2"},
- {file = "sphinx-6.1.3-py3-none-any.whl", hash = "sha256:807d1cb3d6be87eb78a381c3e70ebd8d346b9a25f3753e9947e866b2786865fc"},
+ {file = "Sphinx-5.3.0.tar.gz", hash = "sha256:51026de0a9ff9fc13c05d74913ad66047e104f56a129ff73e174eb5c3ee794b5"},
+ {file = "sphinx-5.3.0-py3-none-any.whl", hash = "sha256:060ca5c9f7ba57a08a1219e547b269fadf125ae25b06b9fa7f66768efb652d6d"},
]
[package.dependencies]
alabaster = ">=0.7,<0.8"
babel = ">=2.9"
colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
-docutils = ">=0.18,<0.20"
+docutils = ">=0.14,<0.20"
imagesize = ">=1.3"
importlib-metadata = {version = ">=4.8", markers = "python_version < \"3.10\""}
Jinja2 = ">=3.0"
packaging = ">=21.0"
-Pygments = ">=2.13"
-requests = ">=2.25.0"
+Pygments = ">=2.12"
+requests = ">=2.5.0"
snowballstemmer = ">=2.0"
sphinxcontrib-applehelp = "*"
sphinxcontrib-devhelp = "*"
@@ -3038,37 +3106,43 @@ sphinxcontrib-serializinghtml = ">=1.1.5"
[package.extras]
docs = ["sphinxcontrib-websupport"]
-lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
-test = ["cython", "html5lib", "pytest (>=4.6)"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-bugbear", "flake8-comprehensions", "flake8-simplify", "isort", "mypy (>=0.981)", "sphinx-lint", "types-requests", "types-typed-ast"]
+test = ["cython", "html5lib", "pytest (>=4.6)", "typed_ast"]
[[package]]
-name = "sphinx-rtd-theme"
-version = "0.5.1"
-description = "Read the Docs theme for Sphinx"
+name = "sphinx-autoapi"
+version = "2.1.0"
+description = "Sphinx API documentation generator"
category = "dev"
optional = false
-python-versions = "*"
+python-versions = ">=3.7"
files = [
- {file = "sphinx_rtd_theme-0.5.1-py2.py3-none-any.whl", hash = "sha256:fa6bebd5ab9a73da8e102509a86f3fcc36dec04a0b52ea80e5a033b2aba00113"},
- {file = "sphinx_rtd_theme-0.5.1.tar.gz", hash = "sha256:eda689eda0c7301a80cf122dad28b1861e5605cbf455558f3775e1e8200e83a5"},
+ {file = "sphinx-autoapi-2.1.0.tar.gz", hash = "sha256:5b5c58064214d5a846c9c81d23f00990a64654b9bca10213231db54a241bc50f"},
+ {file = "sphinx_autoapi-2.1.0-py2.py3-none-any.whl", hash = "sha256:b25c7b2cda379447b8c36b6a0e3bdf76e02fd64f7ca99d41c6cbdf130a01768f"},
]
[package.dependencies]
-sphinx = "*"
+astroid = ">=2.7"
+Jinja2 = "*"
+PyYAML = "*"
+sphinx = ">=5.2.0"
+unidecode = "*"
[package.extras]
-dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client"]
+docs = ["sphinx", "sphinx-rtd-theme"]
+dotnet = ["sphinxcontrib-dotnetdomain"]
+go = ["sphinxcontrib-golangdomain"]
[[package]]
name = "sphinxcontrib-applehelp"
-version = "1.0.3"
+version = "1.0.4"
description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
category = "dev"
optional = false
python-versions = ">=3.8"
files = [
- {file = "sphinxcontrib.applehelp-1.0.3-py3-none-any.whl", hash = "sha256:ba0f2a22e6eeada8da6428d0d520215ee8864253f32facf958cca81e426f661d"},
- {file = "sphinxcontrib.applehelp-1.0.3.tar.gz", hash = "sha256:83749f09f6ac843b8cb685277dbc818a8bf2d76cc19602699094fe9a74db529e"},
+ {file = "sphinxcontrib-applehelp-1.0.4.tar.gz", hash = "sha256:828f867945bbe39817c210a1abfd1bc4895c8b73fcaade56d45357a348a07d7e"},
+ {file = "sphinxcontrib_applehelp-1.0.4-py3-none-any.whl", hash = "sha256:29d341f67fb0f6f586b23ad80e072c8e6ad0b48417db2bde114a4c9746feb228"},
]
[package.extras]
@@ -3093,14 +3167,14 @@ test = ["pytest"]
[[package]]
name = "sphinxcontrib-htmlhelp"
-version = "2.0.0"
+version = "2.0.1"
description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
category = "dev"
optional = false
-python-versions = ">=3.6"
+python-versions = ">=3.8"
files = [
- {file = "sphinxcontrib-htmlhelp-2.0.0.tar.gz", hash = "sha256:f5f8bb2d0d629f398bf47d0d69c07bc13b65f75a81ad9e2f71a63d4b7a2f6db2"},
- {file = "sphinxcontrib_htmlhelp-2.0.0-py2.py3-none-any.whl", hash = "sha256:d412243dfb797ae3ec2b59eca0e52dac12e75a241bf0e4eb861e450d06c6ed07"},
+ {file = "sphinxcontrib-htmlhelp-2.0.1.tar.gz", hash = "sha256:0cbdd302815330058422b98a113195c9249825d681e18f11e8b1f78a2f11efff"},
+ {file = "sphinxcontrib_htmlhelp-2.0.1-py3-none-any.whl", hash = "sha256:c38cb46dccf316c79de6e5515e1770414b797162b23cd3d06e67020e1d2a6903"},
]
[package.extras]
@@ -3122,6 +3196,22 @@ files = [
[package.extras]
test = ["flake8", "mypy", "pytest"]
+[[package]]
+name = "sphinxcontrib-napoleon"
+version = "0.7"
+description = "Sphinx \"napoleon\" extension."
+category = "dev"
+optional = false
+python-versions = "*"
+files = [
+ {file = "sphinxcontrib-napoleon-0.7.tar.gz", hash = "sha256:407382beed396e9f2d7f3043fad6afda95719204a1e1a231ac865f40abcbfcf8"},
+ {file = "sphinxcontrib_napoleon-0.7-py2.py3-none-any.whl", hash = "sha256:711e41a3974bdf110a484aec4c1a556799eb0b3f3b897521a018ad7e2db13fef"},
+]
+
+[package.dependencies]
+pockets = ">=0.3"
+six = ">=1.5.2"
+
[[package]]
name = "sphinxcontrib-qthelp"
version = "1.0.3"
@@ -3208,26 +3298,64 @@ files = [
[[package]]
name = "tomlkit"
-version = "0.11.6"
+version = "0.11.8"
description = "Style preserving TOML library"
category = "dev"
optional = false
-python-versions = ">=3.6"
+python-versions = ">=3.7"
files = [
- {file = "tomlkit-0.11.6-py3-none-any.whl", hash = "sha256:07de26b0d8cfc18f871aec595fda24d95b08fef89d147caa861939f37230bf4b"},
- {file = "tomlkit-0.11.6.tar.gz", hash = "sha256:71b952e5721688937fb02cf9d354dbcf0785066149d2855e44531ebdd2b65d73"},
+ {file = "tomlkit-0.11.8-py3-none-any.whl", hash = "sha256:8c726c4c202bdb148667835f68d68780b9a003a9ec34167b6c673b38eff2a171"},
+ {file = "tomlkit-0.11.8.tar.gz", hash = "sha256:9330fc7faa1db67b541b28e62018c17d20be733177d290a13b24c62d1614e0c3"},
+]
+
+[[package]]
+name = "types-pyside2"
+version = "5.15.2.1.5"
+description = "The most accurate stubs for PySide2"
+category = "main"
+optional = false
+python-versions = "*"
+files = [
+ {file = "types_PySide2-5.15.2.1.5-py2.py3-none-any.whl", hash = "sha256:4bbee2c8f09961101013d05bb5c506b7351b3020494fc8b5c3b73c95014fa1b0"},
]
[[package]]
name = "typing-extensions"
-version = "4.4.0"
+version = "4.6.2"
description = "Backported and Experimental Type Hints for Python 3.7+"
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
- {file = "typing_extensions-4.4.0-py3-none-any.whl", hash = "sha256:16fa4864408f655d35ec496218b85f79b3437c829e93320c7c9215ccfd92489e"},
- {file = "typing_extensions-4.4.0.tar.gz", hash = "sha256:1511434bb92bf8dd198c12b1cc812e800d4181cfcb867674e0f8279cc93087aa"},
+ {file = "typing_extensions-4.6.2-py3-none-any.whl", hash = "sha256:3a8b36f13dd5fdc5d1b16fe317f5668545de77fa0b8e02006381fd49d731ab98"},
+ {file = "typing_extensions-4.6.2.tar.gz", hash = "sha256:06006244c70ac8ee83fa8282cb188f697b8db25bc8b4df07be1873c43897060c"},
+]
+
+[[package]]
+name = "uc-micro-py"
+version = "1.0.2"
+description = "Micro subset of unicode data files for linkify-it-py projects."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "uc-micro-py-1.0.2.tar.gz", hash = "sha256:30ae2ac9c49f39ac6dce743bd187fcd2b574b16ca095fa74cd9396795c954c54"},
+ {file = "uc_micro_py-1.0.2-py3-none-any.whl", hash = "sha256:8c9110c309db9d9e87302e2f4ad2c3152770930d88ab385cd544e7a7e75f3de0"},
+]
+
+[package.extras]
+test = ["coverage", "pytest", "pytest-cov"]
+
+[[package]]
+name = "unidecode"
+version = "1.2.0"
+description = "ASCII transliterations of Unicode text"
+category = "main"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "Unidecode-1.2.0-py2.py3-none-any.whl", hash = "sha256:12435ef2fc4cdfd9cf1035a1db7e98b6b047fe591892e81f34e94959591fad00"},
+ {file = "Unidecode-1.2.0.tar.gz", hash = "sha256:8d73a97d387a956922344f6b74243c2c6771594659778744b2dbdaad8f6b727d"},
]
[[package]]
@@ -3244,41 +3372,42 @@ files = [
[[package]]
name = "urllib3"
-version = "1.26.14"
+version = "2.0.2"
description = "HTTP library with thread-safe connection pooling, file post, and more."
category = "main"
optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
+python-versions = ">=3.7"
files = [
- {file = "urllib3-1.26.14-py2.py3-none-any.whl", hash = "sha256:75edcdc2f7d85b137124a6c3c9fc3933cdeaa12ecb9a6a959f22797a0feca7e1"},
- {file = "urllib3-1.26.14.tar.gz", hash = "sha256:076907bf8fd355cde77728471316625a4d2f7e713c125f51953bb5b3eecf4f72"},
+ {file = "urllib3-2.0.2-py3-none-any.whl", hash = "sha256:d055c2f9d38dc53c808f6fdc8eab7360b6fdbbde02340ed25cfbcd817c62469e"},
+ {file = "urllib3-2.0.2.tar.gz", hash = "sha256:61717a1095d7e155cdb737ac7bb2f4324a858a1e2e6466f6d03ff630ca68d3cc"},
]
[package.extras]
-brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)", "brotlipy (>=0.6.0)"]
-secure = ["certifi", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "ipaddress", "pyOpenSSL (>=0.14)", "urllib3-secure-extra"]
-socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
[[package]]
name = "virtualenv"
-version = "20.17.1"
+version = "20.23.0"
description = "Virtual Python Environment builder"
category = "dev"
optional = false
-python-versions = ">=3.6"
+python-versions = ">=3.7"
files = [
- {file = "virtualenv-20.17.1-py3-none-any.whl", hash = "sha256:ce3b1684d6e1a20a3e5ed36795a97dfc6af29bc3970ca8dab93e11ac6094b3c4"},
- {file = "virtualenv-20.17.1.tar.gz", hash = "sha256:f8b927684efc6f1cc206c9db297a570ab9ad0e51c16fa9e45487d36d1905c058"},
+ {file = "virtualenv-20.23.0-py3-none-any.whl", hash = "sha256:6abec7670e5802a528357fdc75b26b9f57d5d92f29c5462ba0fbe45feacc685e"},
+ {file = "virtualenv-20.23.0.tar.gz", hash = "sha256:a85caa554ced0c0afbd0d638e7e2d7b5f92d23478d05d17a76daeac8f279f924"},
]
[package.dependencies]
distlib = ">=0.3.6,<1"
-filelock = ">=3.4.1,<4"
-platformdirs = ">=2.4,<3"
+filelock = ">=3.11,<4"
+platformdirs = ">=3.2,<4"
[package.extras]
-docs = ["proselint (>=0.13)", "sphinx (>=5.3)", "sphinx-argparse (>=0.3.2)", "sphinx-rtd-theme (>=1)", "towncrier (>=22.8)"]
-testing = ["coverage (>=6.2)", "coverage-enable-subprocess (>=1)", "flaky (>=3.7)", "packaging (>=21.3)", "pytest (>=7.0.1)", "pytest-env (>=0.6.2)", "pytest-freezegun (>=0.4.2)", "pytest-mock (>=3.6.1)", "pytest-randomly (>=3.10.3)", "pytest-timeout (>=2.1)"]
+docs = ["furo (>=2023.3.27)", "proselint (>=0.13)", "sphinx (>=6.1.3)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=22.12)"]
+test = ["covdefaults (>=2.3)", "coverage (>=7.2.3)", "coverage-enable-subprocess (>=1)", "flaky (>=3.7)", "packaging (>=23.1)", "pytest (>=7.3.1)", "pytest-env (>=0.8.1)", "pytest-freezegun (>=0.4.2)", "pytest-mock (>=3.10)", "pytest-randomly (>=3.12)", "pytest-timeout (>=2.1)", "setuptools (>=67.7.1)", "time-machine (>=2.9)"]
[[package]]
name = "wcwidth"
@@ -3309,91 +3438,102 @@ six = "*"
[[package]]
name = "wheel"
-version = "0.38.4"
+version = "0.40.0"
description = "A built-package format for Python"
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
- {file = "wheel-0.38.4-py3-none-any.whl", hash = "sha256:b60533f3f5d530e971d6737ca6d58681ee434818fab630c83a734bb10c083ce8"},
- {file = "wheel-0.38.4.tar.gz", hash = "sha256:965f5259b566725405b05e7cf774052044b1ed30119b5d586b2703aafe8719ac"},
+ {file = "wheel-0.40.0-py3-none-any.whl", hash = "sha256:d236b20e7cb522daf2390fa84c55eea81c5c30190f90f29ae2ca1ad8355bf247"},
+ {file = "wheel-0.40.0.tar.gz", hash = "sha256:cd1196f3faee2b31968d626e1731c94f99cbdb67cf5a46e4f5656cbee7738873"},
]
[package.extras]
-test = ["pytest (>=3.0.0)"]
+test = ["pytest (>=6.0.0)"]
[[package]]
name = "wrapt"
-version = "1.14.1"
+version = "1.15.0"
description = "Module for decorators, wrappers and monkey patching."
-category = "main"
+category = "dev"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7"
files = [
- {file = "wrapt-1.14.1-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:1b376b3f4896e7930f1f772ac4b064ac12598d1c38d04907e696cc4d794b43d3"},
- {file = "wrapt-1.14.1-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:903500616422a40a98a5a3c4ff4ed9d0066f3b4c951fa286018ecdf0750194ef"},
- {file = "wrapt-1.14.1-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:5a9a0d155deafd9448baff28c08e150d9b24ff010e899311ddd63c45c2445e28"},
- {file = "wrapt-1.14.1-cp27-cp27m-manylinux2010_i686.whl", hash = "sha256:ddaea91abf8b0d13443f6dac52e89051a5063c7d014710dcb4d4abb2ff811a59"},
- {file = "wrapt-1.14.1-cp27-cp27m-manylinux2010_x86_64.whl", hash = "sha256:36f582d0c6bc99d5f39cd3ac2a9062e57f3cf606ade29a0a0d6b323462f4dd87"},
- {file = "wrapt-1.14.1-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:7ef58fb89674095bfc57c4069e95d7a31cfdc0939e2a579882ac7d55aadfd2a1"},
- {file = "wrapt-1.14.1-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:e2f83e18fe2f4c9e7db597e988f72712c0c3676d337d8b101f6758107c42425b"},
- {file = "wrapt-1.14.1-cp27-cp27mu-manylinux2010_i686.whl", hash = "sha256:ee2b1b1769f6707a8a445162ea16dddf74285c3964f605877a20e38545c3c462"},
- {file = "wrapt-1.14.1-cp27-cp27mu-manylinux2010_x86_64.whl", hash = "sha256:833b58d5d0b7e5b9832869f039203389ac7cbf01765639c7309fd50ef619e0b1"},
- {file = "wrapt-1.14.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:80bb5c256f1415f747011dc3604b59bc1f91c6e7150bd7db03b19170ee06b320"},
- {file = "wrapt-1.14.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:07f7a7d0f388028b2df1d916e94bbb40624c59b48ecc6cbc232546706fac74c2"},
- {file = "wrapt-1.14.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:02b41b633c6261feff8ddd8d11c711df6842aba629fdd3da10249a53211a72c4"},
- {file = "wrapt-1.14.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2fe803deacd09a233e4762a1adcea5db5d31e6be577a43352936179d14d90069"},
- {file = "wrapt-1.14.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:257fd78c513e0fb5cdbe058c27a0624c9884e735bbd131935fd49e9fe719d310"},
- {file = "wrapt-1.14.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:4fcc4649dc762cddacd193e6b55bc02edca674067f5f98166d7713b193932b7f"},
- {file = "wrapt-1.14.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:11871514607b15cfeb87c547a49bca19fde402f32e2b1c24a632506c0a756656"},
- {file = "wrapt-1.14.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:8ad85f7f4e20964db4daadcab70b47ab05c7c1cf2a7c1e51087bfaa83831854c"},
- {file = "wrapt-1.14.1-cp310-cp310-win32.whl", hash = "sha256:a9a52172be0b5aae932bef82a79ec0a0ce87288c7d132946d645eba03f0ad8a8"},
- {file = "wrapt-1.14.1-cp310-cp310-win_amd64.whl", hash = "sha256:6d323e1554b3d22cfc03cd3243b5bb815a51f5249fdcbb86fda4bf62bab9e164"},
- {file = "wrapt-1.14.1-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:43ca3bbbe97af00f49efb06e352eae40434ca9d915906f77def219b88e85d907"},
- {file = "wrapt-1.14.1-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:6b1a564e6cb69922c7fe3a678b9f9a3c54e72b469875aa8018f18b4d1dd1adf3"},
- {file = "wrapt-1.14.1-cp35-cp35m-manylinux2010_i686.whl", hash = "sha256:00b6d4ea20a906c0ca56d84f93065b398ab74b927a7a3dbd470f6fc503f95dc3"},
- {file = "wrapt-1.14.1-cp35-cp35m-manylinux2010_x86_64.whl", hash = "sha256:a85d2b46be66a71bedde836d9e41859879cc54a2a04fad1191eb50c2066f6e9d"},
- {file = "wrapt-1.14.1-cp35-cp35m-win32.whl", hash = "sha256:dbcda74c67263139358f4d188ae5faae95c30929281bc6866d00573783c422b7"},
- {file = "wrapt-1.14.1-cp35-cp35m-win_amd64.whl", hash = "sha256:b21bb4c09ffabfa0e85e3a6b623e19b80e7acd709b9f91452b8297ace2a8ab00"},
- {file = "wrapt-1.14.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:9e0fd32e0148dd5dea6af5fee42beb949098564cc23211a88d799e434255a1f4"},
- {file = "wrapt-1.14.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9736af4641846491aedb3c3f56b9bc5568d92b0692303b5a305301a95dfd38b1"},
- {file = "wrapt-1.14.1-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5b02d65b9ccf0ef6c34cba6cf5bf2aab1bb2f49c6090bafeecc9cd81ad4ea1c1"},
- {file = "wrapt-1.14.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21ac0156c4b089b330b7666db40feee30a5d52634cc4560e1905d6529a3897ff"},
- {file = "wrapt-1.14.1-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:9f3e6f9e05148ff90002b884fbc2a86bd303ae847e472f44ecc06c2cd2fcdb2d"},
- {file = "wrapt-1.14.1-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:6e743de5e9c3d1b7185870f480587b75b1cb604832e380d64f9504a0535912d1"},
- {file = "wrapt-1.14.1-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:d79d7d5dc8a32b7093e81e97dad755127ff77bcc899e845f41bf71747af0c569"},
- {file = "wrapt-1.14.1-cp36-cp36m-win32.whl", hash = "sha256:81b19725065dcb43df02b37e03278c011a09e49757287dca60c5aecdd5a0b8ed"},
- {file = "wrapt-1.14.1-cp36-cp36m-win_amd64.whl", hash = "sha256:b014c23646a467558be7da3d6b9fa409b2c567d2110599b7cf9a0c5992b3b471"},
- {file = "wrapt-1.14.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:88bd7b6bd70a5b6803c1abf6bca012f7ed963e58c68d76ee20b9d751c74a3248"},
- {file = "wrapt-1.14.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b5901a312f4d14c59918c221323068fad0540e34324925c8475263841dbdfe68"},
- {file = "wrapt-1.14.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d77c85fedff92cf788face9bfa3ebaa364448ebb1d765302e9af11bf449ca36d"},
- {file = "wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d649d616e5c6a678b26d15ece345354f7c2286acd6db868e65fcc5ff7c24a77"},
- {file = "wrapt-1.14.1-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:7d2872609603cb35ca513d7404a94d6d608fc13211563571117046c9d2bcc3d7"},
- {file = "wrapt-1.14.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:ee6acae74a2b91865910eef5e7de37dc6895ad96fa23603d1d27ea69df545015"},
- {file = "wrapt-1.14.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:2b39d38039a1fdad98c87279b48bc5dce2c0ca0d73483b12cb72aa9609278e8a"},
- {file = "wrapt-1.14.1-cp37-cp37m-win32.whl", hash = "sha256:60db23fa423575eeb65ea430cee741acb7c26a1365d103f7b0f6ec412b893853"},
- {file = "wrapt-1.14.1-cp37-cp37m-win_amd64.whl", hash = "sha256:709fe01086a55cf79d20f741f39325018f4df051ef39fe921b1ebe780a66184c"},
- {file = "wrapt-1.14.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:8c0ce1e99116d5ab21355d8ebe53d9460366704ea38ae4d9f6933188f327b456"},
- {file = "wrapt-1.14.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:e3fb1677c720409d5f671e39bac6c9e0e422584e5f518bfd50aa4cbbea02433f"},
- {file = "wrapt-1.14.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:642c2e7a804fcf18c222e1060df25fc210b9c58db7c91416fb055897fc27e8cc"},
- {file = "wrapt-1.14.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7b7c050ae976e286906dd3f26009e117eb000fb2cf3533398c5ad9ccc86867b1"},
- {file = "wrapt-1.14.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ef3f72c9666bba2bab70d2a8b79f2c6d2c1a42a7f7e2b0ec83bb2f9e383950af"},
- {file = "wrapt-1.14.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:01c205616a89d09827986bc4e859bcabd64f5a0662a7fe95e0d359424e0e071b"},
- {file = "wrapt-1.14.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:5a0f54ce2c092aaf439813735584b9537cad479575a09892b8352fea5e988dc0"},
- {file = "wrapt-1.14.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:2cf71233a0ed05ccdabe209c606fe0bac7379fdcf687f39b944420d2a09fdb57"},
- {file = "wrapt-1.14.1-cp38-cp38-win32.whl", hash = "sha256:aa31fdcc33fef9eb2552cbcbfee7773d5a6792c137b359e82879c101e98584c5"},
- {file = "wrapt-1.14.1-cp38-cp38-win_amd64.whl", hash = "sha256:d1967f46ea8f2db647c786e78d8cc7e4313dbd1b0aca360592d8027b8508e24d"},
- {file = "wrapt-1.14.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3232822c7d98d23895ccc443bbdf57c7412c5a65996c30442ebe6ed3df335383"},
- {file = "wrapt-1.14.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:988635d122aaf2bdcef9e795435662bcd65b02f4f4c1ae37fbee7401c440b3a7"},
- {file = "wrapt-1.14.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9cca3c2cdadb362116235fdbd411735de4328c61425b0aa9f872fd76d02c4e86"},
- {file = "wrapt-1.14.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d52a25136894c63de15a35bc0bdc5adb4b0e173b9c0d07a2be9d3ca64a332735"},
- {file = "wrapt-1.14.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40e7bc81c9e2b2734ea4bc1aceb8a8f0ceaac7c5299bc5d69e37c44d9081d43b"},
- {file = "wrapt-1.14.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b9b7a708dd92306328117d8c4b62e2194d00c365f18eff11a9b53c6f923b01e3"},
- {file = "wrapt-1.14.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:6a9a25751acb379b466ff6be78a315e2b439d4c94c1e99cb7266d40a537995d3"},
- {file = "wrapt-1.14.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:34aa51c45f28ba7f12accd624225e2b1e5a3a45206aa191f6f9aac931d9d56fe"},
- {file = "wrapt-1.14.1-cp39-cp39-win32.whl", hash = "sha256:dee0ce50c6a2dd9056c20db781e9c1cfd33e77d2d569f5d1d9321c641bb903d5"},
- {file = "wrapt-1.14.1-cp39-cp39-win_amd64.whl", hash = "sha256:dee60e1de1898bde3b238f18340eec6148986da0455d8ba7848d50470a7a32fb"},
- {file = "wrapt-1.14.1.tar.gz", hash = "sha256:380a85cf89e0e69b7cfbe2ea9f765f004ff419f34194018a6827ac0e3edfed4d"},
+ {file = "wrapt-1.15.0-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:ca1cccf838cd28d5a0883b342474c630ac48cac5df0ee6eacc9c7290f76b11c1"},
+ {file = "wrapt-1.15.0-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:e826aadda3cae59295b95343db8f3d965fb31059da7de01ee8d1c40a60398b29"},
+ {file = "wrapt-1.15.0-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:5fc8e02f5984a55d2c653f5fea93531e9836abbd84342c1d1e17abc4a15084c2"},
+ {file = "wrapt-1.15.0-cp27-cp27m-manylinux2010_i686.whl", hash = "sha256:96e25c8603a155559231c19c0349245eeb4ac0096fe3c1d0be5c47e075bd4f46"},
+ {file = "wrapt-1.15.0-cp27-cp27m-manylinux2010_x86_64.whl", hash = "sha256:40737a081d7497efea35ab9304b829b857f21558acfc7b3272f908d33b0d9d4c"},
+ {file = "wrapt-1.15.0-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:f87ec75864c37c4c6cb908d282e1969e79763e0d9becdfe9fe5473b7bb1e5f09"},
+ {file = "wrapt-1.15.0-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:1286eb30261894e4c70d124d44b7fd07825340869945c79d05bda53a40caa079"},
+ {file = "wrapt-1.15.0-cp27-cp27mu-manylinux2010_i686.whl", hash = "sha256:493d389a2b63c88ad56cdc35d0fa5752daac56ca755805b1b0c530f785767d5e"},
+ {file = "wrapt-1.15.0-cp27-cp27mu-manylinux2010_x86_64.whl", hash = "sha256:58d7a75d731e8c63614222bcb21dd992b4ab01a399f1f09dd82af17bbfc2368a"},
+ {file = "wrapt-1.15.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:21f6d9a0d5b3a207cdf7acf8e58d7d13d463e639f0c7e01d82cdb671e6cb7923"},
+ {file = "wrapt-1.15.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ce42618f67741d4697684e501ef02f29e758a123aa2d669e2d964ff734ee00ee"},
+ {file = "wrapt-1.15.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41d07d029dd4157ae27beab04d22b8e261eddfc6ecd64ff7000b10dc8b3a5727"},
+ {file = "wrapt-1.15.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:54accd4b8bc202966bafafd16e69da9d5640ff92389d33d28555c5fd4f25ccb7"},
+ {file = "wrapt-1.15.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2fbfbca668dd15b744418265a9607baa970c347eefd0db6a518aaf0cfbd153c0"},
+ {file = "wrapt-1.15.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:76e9c727a874b4856d11a32fb0b389afc61ce8aaf281ada613713ddeadd1cfec"},
+ {file = "wrapt-1.15.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e20076a211cd6f9b44a6be58f7eeafa7ab5720eb796975d0c03f05b47d89eb90"},
+ {file = "wrapt-1.15.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:a74d56552ddbde46c246b5b89199cb3fd182f9c346c784e1a93e4dc3f5ec9975"},
+ {file = "wrapt-1.15.0-cp310-cp310-win32.whl", hash = "sha256:26458da5653aa5b3d8dc8b24192f574a58984c749401f98fff994d41d3f08da1"},
+ {file = "wrapt-1.15.0-cp310-cp310-win_amd64.whl", hash = "sha256:75760a47c06b5974aa5e01949bf7e66d2af4d08cb8c1d6516af5e39595397f5e"},
+ {file = "wrapt-1.15.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ba1711cda2d30634a7e452fc79eabcadaffedf241ff206db2ee93dd2c89a60e7"},
+ {file = "wrapt-1.15.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:56374914b132c702aa9aa9959c550004b8847148f95e1b824772d453ac204a72"},
+ {file = "wrapt-1.15.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a89ce3fd220ff144bd9d54da333ec0de0399b52c9ac3d2ce34b569cf1a5748fb"},
+ {file = "wrapt-1.15.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3bbe623731d03b186b3d6b0d6f51865bf598587c38d6f7b0be2e27414f7f214e"},
+ {file = "wrapt-1.15.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3abbe948c3cbde2689370a262a8d04e32ec2dd4f27103669a45c6929bcdbfe7c"},
+ {file = "wrapt-1.15.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:b67b819628e3b748fd3c2192c15fb951f549d0f47c0449af0764d7647302fda3"},
+ {file = "wrapt-1.15.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:7eebcdbe3677e58dd4c0e03b4f2cfa346ed4049687d839adad68cc38bb559c92"},
+ {file = "wrapt-1.15.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:74934ebd71950e3db69960a7da29204f89624dde411afbfb3b4858c1409b1e98"},
+ {file = "wrapt-1.15.0-cp311-cp311-win32.whl", hash = "sha256:bd84395aab8e4d36263cd1b9308cd504f6cf713b7d6d3ce25ea55670baec5416"},
+ {file = "wrapt-1.15.0-cp311-cp311-win_amd64.whl", hash = "sha256:a487f72a25904e2b4bbc0817ce7a8de94363bd7e79890510174da9d901c38705"},
+ {file = "wrapt-1.15.0-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:4ff0d20f2e670800d3ed2b220d40984162089a6e2c9646fdb09b85e6f9a8fc29"},
+ {file = "wrapt-1.15.0-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:9ed6aa0726b9b60911f4aed8ec5b8dd7bf3491476015819f56473ffaef8959bd"},
+ {file = "wrapt-1.15.0-cp35-cp35m-manylinux2010_i686.whl", hash = "sha256:896689fddba4f23ef7c718279e42f8834041a21342d95e56922e1c10c0cc7afb"},
+ {file = "wrapt-1.15.0-cp35-cp35m-manylinux2010_x86_64.whl", hash = "sha256:75669d77bb2c071333417617a235324a1618dba66f82a750362eccbe5b61d248"},
+ {file = "wrapt-1.15.0-cp35-cp35m-win32.whl", hash = "sha256:fbec11614dba0424ca72f4e8ba3c420dba07b4a7c206c8c8e4e73f2e98f4c559"},
+ {file = "wrapt-1.15.0-cp35-cp35m-win_amd64.whl", hash = "sha256:fd69666217b62fa5d7c6aa88e507493a34dec4fa20c5bd925e4bc12fce586639"},
+ {file = "wrapt-1.15.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:b0724f05c396b0a4c36a3226c31648385deb6a65d8992644c12a4963c70326ba"},
+ {file = "wrapt-1.15.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bbeccb1aa40ab88cd29e6c7d8585582c99548f55f9b2581dfc5ba68c59a85752"},
+ {file = "wrapt-1.15.0-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:38adf7198f8f154502883242f9fe7333ab05a5b02de7d83aa2d88ea621f13364"},
+ {file = "wrapt-1.15.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:578383d740457fa790fdf85e6d346fda1416a40549fe8db08e5e9bd281c6a475"},
+ {file = "wrapt-1.15.0-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:a4cbb9ff5795cd66f0066bdf5947f170f5d63a9274f99bdbca02fd973adcf2a8"},
+ {file = "wrapt-1.15.0-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:af5bd9ccb188f6a5fdda9f1f09d9f4c86cc8a539bd48a0bfdc97723970348418"},
+ {file = "wrapt-1.15.0-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:b56d5519e470d3f2fe4aa7585f0632b060d532d0696c5bdfb5e8319e1d0f69a2"},
+ {file = "wrapt-1.15.0-cp36-cp36m-win32.whl", hash = "sha256:77d4c1b881076c3ba173484dfa53d3582c1c8ff1f914c6461ab70c8428b796c1"},
+ {file = "wrapt-1.15.0-cp36-cp36m-win_amd64.whl", hash = "sha256:077ff0d1f9d9e4ce6476c1a924a3332452c1406e59d90a2cf24aeb29eeac9420"},
+ {file = "wrapt-1.15.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:5c5aa28df055697d7c37d2099a7bc09f559d5053c3349b1ad0c39000e611d317"},
+ {file = "wrapt-1.15.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3a8564f283394634a7a7054b7983e47dbf39c07712d7b177b37e03f2467a024e"},
+ {file = "wrapt-1.15.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:780c82a41dc493b62fc5884fb1d3a3b81106642c5c5c78d6a0d4cbe96d62ba7e"},
+ {file = "wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e169e957c33576f47e21864cf3fc9ff47c223a4ebca8960079b8bd36cb014fd0"},
+ {file = "wrapt-1.15.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b02f21c1e2074943312d03d243ac4388319f2456576b2c6023041c4d57cd7019"},
+ {file = "wrapt-1.15.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:f2e69b3ed24544b0d3dbe2c5c0ba5153ce50dcebb576fdc4696d52aa22db6034"},
+ {file = "wrapt-1.15.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d787272ed958a05b2c86311d3a4135d3c2aeea4fc655705f074130aa57d71653"},
+ {file = "wrapt-1.15.0-cp37-cp37m-win32.whl", hash = "sha256:02fce1852f755f44f95af51f69d22e45080102e9d00258053b79367d07af39c0"},
+ {file = "wrapt-1.15.0-cp37-cp37m-win_amd64.whl", hash = "sha256:abd52a09d03adf9c763d706df707c343293d5d106aea53483e0ec8d9e310ad5e"},
+ {file = "wrapt-1.15.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:cdb4f085756c96a3af04e6eca7f08b1345e94b53af8921b25c72f096e704e145"},
+ {file = "wrapt-1.15.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:230ae493696a371f1dbffaad3dafbb742a4d27a0afd2b1aecebe52b740167e7f"},
+ {file = "wrapt-1.15.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63424c681923b9f3bfbc5e3205aafe790904053d42ddcc08542181a30a7a51bd"},
+ {file = "wrapt-1.15.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d6bcbfc99f55655c3d93feb7ef3800bd5bbe963a755687cbf1f490a71fb7794b"},
+ {file = "wrapt-1.15.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c99f4309f5145b93eca6e35ac1a988f0dc0a7ccf9ccdcd78d3c0adf57224e62f"},
+ {file = "wrapt-1.15.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:b130fe77361d6771ecf5a219d8e0817d61b236b7d8b37cc045172e574ed219e6"},
+ {file = "wrapt-1.15.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:96177eb5645b1c6985f5c11d03fc2dbda9ad24ec0f3a46dcce91445747e15094"},
+ {file = "wrapt-1.15.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d5fe3e099cf07d0fb5a1e23d399e5d4d1ca3e6dfcbe5c8570ccff3e9208274f7"},
+ {file = "wrapt-1.15.0-cp38-cp38-win32.whl", hash = "sha256:abd8f36c99512755b8456047b7be10372fca271bf1467a1caa88db991e7c421b"},
+ {file = "wrapt-1.15.0-cp38-cp38-win_amd64.whl", hash = "sha256:b06fa97478a5f478fb05e1980980a7cdf2712015493b44d0c87606c1513ed5b1"},
+ {file = "wrapt-1.15.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:2e51de54d4fb8fb50d6ee8327f9828306a959ae394d3e01a1ba8b2f937747d86"},
+ {file = "wrapt-1.15.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0970ddb69bba00670e58955f8019bec4a42d1785db3faa043c33d81de2bf843c"},
+ {file = "wrapt-1.15.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:76407ab327158c510f44ded207e2f76b657303e17cb7a572ffe2f5a8a48aa04d"},
+ {file = "wrapt-1.15.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cd525e0e52a5ff16653a3fc9e3dd827981917d34996600bbc34c05d048ca35cc"},
+ {file = "wrapt-1.15.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9d37ac69edc5614b90516807de32d08cb8e7b12260a285ee330955604ed9dd29"},
+ {file = "wrapt-1.15.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:078e2a1a86544e644a68422f881c48b84fef6d18f8c7a957ffd3f2e0a74a0d4a"},
+ {file = "wrapt-1.15.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:2cf56d0e237280baed46f0b5316661da892565ff58309d4d2ed7dba763d984b8"},
+ {file = "wrapt-1.15.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:7dc0713bf81287a00516ef43137273b23ee414fe41a3c14be10dd95ed98a2df9"},
+ {file = "wrapt-1.15.0-cp39-cp39-win32.whl", hash = "sha256:46ed616d5fb42f98630ed70c3529541408166c22cdfd4540b88d5f21006b0eff"},
+ {file = "wrapt-1.15.0-cp39-cp39-win_amd64.whl", hash = "sha256:eef4d64c650f33347c1f9266fa5ae001440b232ad9b98f1f43dfe7a79435c0a6"},
+ {file = "wrapt-1.15.0-py3-none-any.whl", hash = "sha256:64b1df0f83706b4ef4cfb4fb0e4c2669100fd7ecacfb59e091fad300d4e04640"},
+ {file = "wrapt-1.15.0.tar.gz", hash = "sha256:d06730c6aed78cee4126234cf2d071e01b44b915e725a6cb439a879ec9754a3a"},
]
[[package]]
@@ -3419,86 +3559,86 @@ ujson = ["ujson"]
[[package]]
name = "yarl"
-version = "1.8.2"
+version = "1.9.2"
description = "Yet another URL library"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "yarl-1.8.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:bb81f753c815f6b8e2ddd2eef3c855cf7da193b82396ac013c661aaa6cc6b0a5"},
- {file = "yarl-1.8.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:47d49ac96156f0928f002e2424299b2c91d9db73e08c4cd6742923a086f1c863"},
- {file = "yarl-1.8.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:3fc056e35fa6fba63248d93ff6e672c096f95f7836938241ebc8260e062832fe"},
- {file = "yarl-1.8.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:58a3c13d1c3005dbbac5c9f0d3210b60220a65a999b1833aa46bd6677c69b08e"},
- {file = "yarl-1.8.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:10b08293cda921157f1e7c2790999d903b3fd28cd5c208cf8826b3b508026996"},
- {file = "yarl-1.8.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:de986979bbd87272fe557e0a8fcb66fd40ae2ddfe28a8b1ce4eae22681728fef"},
- {file = "yarl-1.8.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c4fcfa71e2c6a3cb568cf81aadc12768b9995323186a10827beccf5fa23d4f8"},
- {file = "yarl-1.8.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ae4d7ff1049f36accde9e1ef7301912a751e5bae0a9d142459646114c70ecba6"},
- {file = "yarl-1.8.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:bf071f797aec5b96abfc735ab97da9fd8f8768b43ce2abd85356a3127909d146"},
- {file = "yarl-1.8.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:74dece2bfc60f0f70907c34b857ee98f2c6dd0f75185db133770cd67300d505f"},
- {file = "yarl-1.8.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:df60a94d332158b444301c7f569659c926168e4d4aad2cfbf4bce0e8fb8be826"},
- {file = "yarl-1.8.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:63243b21c6e28ec2375f932a10ce7eda65139b5b854c0f6b82ed945ba526bff3"},
- {file = "yarl-1.8.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:cfa2bbca929aa742b5084fd4663dd4b87c191c844326fcb21c3afd2d11497f80"},
- {file = "yarl-1.8.2-cp310-cp310-win32.whl", hash = "sha256:b05df9ea7496df11b710081bd90ecc3a3db6adb4fee36f6a411e7bc91a18aa42"},
- {file = "yarl-1.8.2-cp310-cp310-win_amd64.whl", hash = "sha256:24ad1d10c9db1953291f56b5fe76203977f1ed05f82d09ec97acb623a7976574"},
- {file = "yarl-1.8.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:2a1fca9588f360036242f379bfea2b8b44cae2721859b1c56d033adfd5893634"},
- {file = "yarl-1.8.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:f37db05c6051eff17bc832914fe46869f8849de5b92dc4a3466cd63095d23dfd"},
- {file = "yarl-1.8.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:77e913b846a6b9c5f767b14dc1e759e5aff05502fe73079f6f4176359d832581"},
- {file = "yarl-1.8.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0978f29222e649c351b173da2b9b4665ad1feb8d1daa9d971eb90df08702668a"},
- {file = "yarl-1.8.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:388a45dc77198b2460eac0aca1efd6a7c09e976ee768b0d5109173e521a19daf"},
- {file = "yarl-1.8.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2305517e332a862ef75be8fad3606ea10108662bc6fe08509d5ca99503ac2aee"},
- {file = "yarl-1.8.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42430ff511571940d51e75cf42f1e4dbdded477e71c1b7a17f4da76c1da8ea76"},
- {file = "yarl-1.8.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3150078118f62371375e1e69b13b48288e44f6691c1069340081c3fd12c94d5b"},
- {file = "yarl-1.8.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:c15163b6125db87c8f53c98baa5e785782078fbd2dbeaa04c6141935eb6dab7a"},
- {file = "yarl-1.8.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:4d04acba75c72e6eb90745447d69f84e6c9056390f7a9724605ca9c56b4afcc6"},
- {file = "yarl-1.8.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:e7fd20d6576c10306dea2d6a5765f46f0ac5d6f53436217913e952d19237efc4"},
- {file = "yarl-1.8.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:75c16b2a900b3536dfc7014905a128a2bea8fb01f9ee26d2d7d8db0a08e7cb2c"},
- {file = "yarl-1.8.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:6d88056a04860a98341a0cf53e950e3ac9f4e51d1b6f61a53b0609df342cc8b2"},
- {file = "yarl-1.8.2-cp311-cp311-win32.whl", hash = "sha256:fb742dcdd5eec9f26b61224c23baea46c9055cf16f62475e11b9b15dfd5c117b"},
- {file = "yarl-1.8.2-cp311-cp311-win_amd64.whl", hash = "sha256:8c46d3d89902c393a1d1e243ac847e0442d0196bbd81aecc94fcebbc2fd5857c"},
- {file = "yarl-1.8.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:ceff9722e0df2e0a9e8a79c610842004fa54e5b309fe6d218e47cd52f791d7ef"},
- {file = "yarl-1.8.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3f6b4aca43b602ba0f1459de647af954769919c4714706be36af670a5f44c9c1"},
- {file = "yarl-1.8.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1684a9bd9077e922300ecd48003ddae7a7474e0412bea38d4631443a91d61077"},
- {file = "yarl-1.8.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ebb78745273e51b9832ef90c0898501006670d6e059f2cdb0e999494eb1450c2"},
- {file = "yarl-1.8.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3adeef150d528ded2a8e734ebf9ae2e658f4c49bf413f5f157a470e17a4a2e89"},
- {file = "yarl-1.8.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57a7c87927a468e5a1dc60c17caf9597161d66457a34273ab1760219953f7f4c"},
- {file = "yarl-1.8.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:efff27bd8cbe1f9bd127e7894942ccc20c857aa8b5a0327874f30201e5ce83d0"},
- {file = "yarl-1.8.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:a783cd344113cb88c5ff7ca32f1f16532a6f2142185147822187913eb989f739"},
- {file = "yarl-1.8.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:705227dccbe96ab02c7cb2c43e1228e2826e7ead880bb19ec94ef279e9555b5b"},
- {file = "yarl-1.8.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:34c09b43bd538bf6c4b891ecce94b6fa4f1f10663a8d4ca589a079a5018f6ed7"},
- {file = "yarl-1.8.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:a48f4f7fea9a51098b02209d90297ac324241bf37ff6be6d2b0149ab2bd51b37"},
- {file = "yarl-1.8.2-cp37-cp37m-win32.whl", hash = "sha256:0414fd91ce0b763d4eadb4456795b307a71524dbacd015c657bb2a39db2eab89"},
- {file = "yarl-1.8.2-cp37-cp37m-win_amd64.whl", hash = "sha256:d881d152ae0007809c2c02e22aa534e702f12071e6b285e90945aa3c376463c5"},
- {file = "yarl-1.8.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5df5e3d04101c1e5c3b1d69710b0574171cc02fddc4b23d1b2813e75f35a30b1"},
- {file = "yarl-1.8.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:7a66c506ec67eb3159eea5096acd05f5e788ceec7b96087d30c7d2865a243918"},
- {file = "yarl-1.8.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:2b4fa2606adf392051d990c3b3877d768771adc3faf2e117b9de7eb977741229"},
- {file = "yarl-1.8.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1e21fb44e1eff06dd6ef971d4bdc611807d6bd3691223d9c01a18cec3677939e"},
- {file = "yarl-1.8.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:93202666046d9edadfe9f2e7bf5e0782ea0d497b6d63da322e541665d65a044e"},
- {file = "yarl-1.8.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fc77086ce244453e074e445104f0ecb27530d6fd3a46698e33f6c38951d5a0f1"},
- {file = "yarl-1.8.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dd68a92cab699a233641f5929a40f02a4ede8c009068ca8aa1fe87b8c20ae3"},
- {file = "yarl-1.8.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1b372aad2b5f81db66ee7ec085cbad72c4da660d994e8e590c997e9b01e44901"},
- {file = "yarl-1.8.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:e6f3515aafe0209dd17fb9bdd3b4e892963370b3de781f53e1746a521fb39fc0"},
- {file = "yarl-1.8.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:dfef7350ee369197106805e193d420b75467b6cceac646ea5ed3049fcc950a05"},
- {file = "yarl-1.8.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:728be34f70a190566d20aa13dc1f01dc44b6aa74580e10a3fb159691bc76909d"},
- {file = "yarl-1.8.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:ff205b58dc2929191f68162633d5e10e8044398d7a45265f90a0f1d51f85f72c"},
- {file = "yarl-1.8.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:baf211dcad448a87a0d9047dc8282d7de59473ade7d7fdf22150b1d23859f946"},
- {file = "yarl-1.8.2-cp38-cp38-win32.whl", hash = "sha256:272b4f1599f1b621bf2aabe4e5b54f39a933971f4e7c9aa311d6d7dc06965165"},
- {file = "yarl-1.8.2-cp38-cp38-win_amd64.whl", hash = "sha256:326dd1d3caf910cd26a26ccbfb84c03b608ba32499b5d6eeb09252c920bcbe4f"},
- {file = "yarl-1.8.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:f8ca8ad414c85bbc50f49c0a106f951613dfa5f948ab69c10ce9b128d368baf8"},
- {file = "yarl-1.8.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:418857f837347e8aaef682679f41e36c24250097f9e2f315d39bae3a99a34cbf"},
- {file = "yarl-1.8.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ae0eec05ab49e91a78700761777f284c2df119376e391db42c38ab46fd662b77"},
- {file = "yarl-1.8.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:009a028127e0a1755c38b03244c0bea9d5565630db9c4cf9572496e947137a87"},
- {file = "yarl-1.8.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3edac5d74bb3209c418805bda77f973117836e1de7c000e9755e572c1f7850d0"},
- {file = "yarl-1.8.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:da65c3f263729e47351261351b8679c6429151ef9649bba08ef2528ff2c423b2"},
- {file = "yarl-1.8.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0ef8fb25e52663a1c85d608f6dd72e19bd390e2ecaf29c17fb08f730226e3a08"},
- {file = "yarl-1.8.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bcd7bb1e5c45274af9a1dd7494d3c52b2be5e6bd8d7e49c612705fd45420b12d"},
- {file = "yarl-1.8.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:44ceac0450e648de86da8e42674f9b7077d763ea80c8ceb9d1c3e41f0f0a9951"},
- {file = "yarl-1.8.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:97209cc91189b48e7cfe777237c04af8e7cc51eb369004e061809bcdf4e55220"},
- {file = "yarl-1.8.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:48dd18adcf98ea9cd721a25313aef49d70d413a999d7d89df44f469edfb38a06"},
- {file = "yarl-1.8.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:e59399dda559688461762800d7fb34d9e8a6a7444fd76ec33220a926c8be1516"},
- {file = "yarl-1.8.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:d617c241c8c3ad5c4e78a08429fa49e4b04bedfc507b34b4d8dceb83b4af3588"},
- {file = "yarl-1.8.2-cp39-cp39-win32.whl", hash = "sha256:cb6d48d80a41f68de41212f3dfd1a9d9898d7841c8f7ce6696cf2fd9cb57ef83"},
- {file = "yarl-1.8.2-cp39-cp39-win_amd64.whl", hash = "sha256:6604711362f2dbf7160df21c416f81fac0de6dbcf0b5445a2ef25478ecc4c778"},
- {file = "yarl-1.8.2.tar.gz", hash = "sha256:49d43402c6e3013ad0978602bf6bf5328535c48d192304b91b97a3c6790b1562"},
+ {file = "yarl-1.9.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:8c2ad583743d16ddbdf6bb14b5cd76bf43b0d0006e918809d5d4ddf7bde8dd82"},
+ {file = "yarl-1.9.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:82aa6264b36c50acfb2424ad5ca537a2060ab6de158a5bd2a72a032cc75b9eb8"},
+ {file = "yarl-1.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c0c77533b5ed4bcc38e943178ccae29b9bcf48ffd1063f5821192f23a1bd27b9"},
+ {file = "yarl-1.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ee4afac41415d52d53a9833ebae7e32b344be72835bbb589018c9e938045a560"},
+ {file = "yarl-1.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9bf345c3a4f5ba7f766430f97f9cc1320786f19584acc7086491f45524a551ac"},
+ {file = "yarl-1.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2a96c19c52ff442a808c105901d0bdfd2e28575b3d5f82e2f5fd67e20dc5f4ea"},
+ {file = "yarl-1.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:891c0e3ec5ec881541f6c5113d8df0315ce5440e244a716b95f2525b7b9f3608"},
+ {file = "yarl-1.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c3a53ba34a636a256d767c086ceb111358876e1fb6b50dfc4d3f4951d40133d5"},
+ {file = "yarl-1.9.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:566185e8ebc0898b11f8026447eacd02e46226716229cea8db37496c8cdd26e0"},
+ {file = "yarl-1.9.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:2b0738fb871812722a0ac2154be1f049c6223b9f6f22eec352996b69775b36d4"},
+ {file = "yarl-1.9.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:32f1d071b3f362c80f1a7d322bfd7b2d11e33d2adf395cc1dd4df36c9c243095"},
+ {file = "yarl-1.9.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:e9fdc7ac0d42bc3ea78818557fab03af6181e076a2944f43c38684b4b6bed8e3"},
+ {file = "yarl-1.9.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:56ff08ab5df8429901ebdc5d15941b59f6253393cb5da07b4170beefcf1b2528"},
+ {file = "yarl-1.9.2-cp310-cp310-win32.whl", hash = "sha256:8ea48e0a2f931064469bdabca50c2f578b565fc446f302a79ba6cc0ee7f384d3"},
+ {file = "yarl-1.9.2-cp310-cp310-win_amd64.whl", hash = "sha256:50f33040f3836e912ed16d212f6cc1efb3231a8a60526a407aeb66c1c1956dde"},
+ {file = "yarl-1.9.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:646d663eb2232d7909e6601f1a9107e66f9791f290a1b3dc7057818fe44fc2b6"},
+ {file = "yarl-1.9.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:aff634b15beff8902d1f918012fc2a42e0dbae6f469fce134c8a0dc51ca423bb"},
+ {file = "yarl-1.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a83503934c6273806aed765035716216cc9ab4e0364f7f066227e1aaea90b8d0"},
+ {file = "yarl-1.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b25322201585c69abc7b0e89e72790469f7dad90d26754717f3310bfe30331c2"},
+ {file = "yarl-1.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:22a94666751778629f1ec4280b08eb11815783c63f52092a5953faf73be24191"},
+ {file = "yarl-1.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ec53a0ea2a80c5cd1ab397925f94bff59222aa3cf9c6da938ce05c9ec20428d"},
+ {file = "yarl-1.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:159d81f22d7a43e6eabc36d7194cb53f2f15f498dbbfa8edc8a3239350f59fe7"},
+ {file = "yarl-1.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:832b7e711027c114d79dffb92576acd1bd2decc467dec60e1cac96912602d0e6"},
+ {file = "yarl-1.9.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:95d2ecefbcf4e744ea952d073c6922e72ee650ffc79028eb1e320e732898d7e8"},
+ {file = "yarl-1.9.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:d4e2c6d555e77b37288eaf45b8f60f0737c9efa3452c6c44626a5455aeb250b9"},
+ {file = "yarl-1.9.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:783185c75c12a017cc345015ea359cc801c3b29a2966c2655cd12b233bf5a2be"},
+ {file = "yarl-1.9.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:b8cc1863402472f16c600e3e93d542b7e7542a540f95c30afd472e8e549fc3f7"},
+ {file = "yarl-1.9.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:822b30a0f22e588b32d3120f6d41e4ed021806418b4c9f0bc3048b8c8cb3f92a"},
+ {file = "yarl-1.9.2-cp311-cp311-win32.whl", hash = "sha256:a60347f234c2212a9f0361955007fcf4033a75bf600a33c88a0a8e91af77c0e8"},
+ {file = "yarl-1.9.2-cp311-cp311-win_amd64.whl", hash = "sha256:be6b3fdec5c62f2a67cb3f8c6dbf56bbf3f61c0f046f84645cd1ca73532ea051"},
+ {file = "yarl-1.9.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:38a3928ae37558bc1b559f67410df446d1fbfa87318b124bf5032c31e3447b74"},
+ {file = "yarl-1.9.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ac9bb4c5ce3975aeac288cfcb5061ce60e0d14d92209e780c93954076c7c4367"},
+ {file = "yarl-1.9.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3da8a678ca8b96c8606bbb8bfacd99a12ad5dd288bc6f7979baddd62f71c63ef"},
+ {file = "yarl-1.9.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:13414591ff516e04fcdee8dc051c13fd3db13b673c7a4cb1350e6b2ad9639ad3"},
+ {file = "yarl-1.9.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf74d08542c3a9ea97bb8f343d4fcbd4d8f91bba5ec9d5d7f792dbe727f88938"},
+ {file = "yarl-1.9.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6e7221580dc1db478464cfeef9b03b95c5852cc22894e418562997df0d074ccc"},
+ {file = "yarl-1.9.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:494053246b119b041960ddcd20fd76224149cfea8ed8777b687358727911dd33"},
+ {file = "yarl-1.9.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:52a25809fcbecfc63ac9ba0c0fb586f90837f5425edfd1ec9f3372b119585e45"},
+ {file = "yarl-1.9.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:e65610c5792870d45d7b68c677681376fcf9cc1c289f23e8e8b39c1485384185"},
+ {file = "yarl-1.9.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:1b1bba902cba32cdec51fca038fd53f8beee88b77efc373968d1ed021024cc04"},
+ {file = "yarl-1.9.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:662e6016409828ee910f5d9602a2729a8a57d74b163c89a837de3fea050c7582"},
+ {file = "yarl-1.9.2-cp37-cp37m-win32.whl", hash = "sha256:f364d3480bffd3aa566e886587eaca7c8c04d74f6e8933f3f2c996b7f09bee1b"},
+ {file = "yarl-1.9.2-cp37-cp37m-win_amd64.whl", hash = "sha256:6a5883464143ab3ae9ba68daae8e7c5c95b969462bbe42e2464d60e7e2698368"},
+ {file = "yarl-1.9.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5610f80cf43b6202e2c33ba3ec2ee0a2884f8f423c8f4f62906731d876ef4fac"},
+ {file = "yarl-1.9.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b9a4e67ad7b646cd6f0938c7ebfd60e481b7410f574c560e455e938d2da8e0f4"},
+ {file = "yarl-1.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:83fcc480d7549ccebe9415d96d9263e2d4226798c37ebd18c930fce43dfb9574"},
+ {file = "yarl-1.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5fcd436ea16fee7d4207c045b1e340020e58a2597301cfbcfdbe5abd2356c2fb"},
+ {file = "yarl-1.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:84e0b1599334b1e1478db01b756e55937d4614f8654311eb26012091be109d59"},
+ {file = "yarl-1.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3458a24e4ea3fd8930e934c129b676c27452e4ebda80fbe47b56d8c6c7a63a9e"},
+ {file = "yarl-1.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:838162460b3a08987546e881a2bfa573960bb559dfa739e7800ceeec92e64417"},
+ {file = "yarl-1.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f4e2d08f07a3d7d3e12549052eb5ad3eab1c349c53ac51c209a0e5991bbada78"},
+ {file = "yarl-1.9.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:de119f56f3c5f0e2fb4dee508531a32b069a5f2c6e827b272d1e0ff5ac040333"},
+ {file = "yarl-1.9.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:149ddea5abf329752ea5051b61bd6c1d979e13fbf122d3a1f9f0c8be6cb6f63c"},
+ {file = "yarl-1.9.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:674ca19cbee4a82c9f54e0d1eee28116e63bc6fd1e96c43031d11cbab8b2afd5"},
+ {file = "yarl-1.9.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:9b3152f2f5677b997ae6c804b73da05a39daa6a9e85a512e0e6823d81cdad7cc"},
+ {file = "yarl-1.9.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5415d5a4b080dc9612b1b63cba008db84e908b95848369aa1da3686ae27b6d2b"},
+ {file = "yarl-1.9.2-cp38-cp38-win32.whl", hash = "sha256:f7a3d8146575e08c29ed1cd287068e6d02f1c7bdff8970db96683b9591b86ee7"},
+ {file = "yarl-1.9.2-cp38-cp38-win_amd64.whl", hash = "sha256:63c48f6cef34e6319a74c727376e95626f84ea091f92c0250a98e53e62c77c72"},
+ {file = "yarl-1.9.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:75df5ef94c3fdc393c6b19d80e6ef1ecc9ae2f4263c09cacb178d871c02a5ba9"},
+ {file = "yarl-1.9.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c027a6e96ef77d401d8d5a5c8d6bc478e8042f1e448272e8d9752cb0aff8b5c8"},
+ {file = "yarl-1.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f3b078dbe227f79be488ffcfc7a9edb3409d018e0952cf13f15fd6512847f3f7"},
+ {file = "yarl-1.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:59723a029760079b7d991a401386390c4be5bfec1e7dd83e25a6a0881859e716"},
+ {file = "yarl-1.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b03917871bf859a81ccb180c9a2e6c1e04d2f6a51d953e6a5cdd70c93d4e5a2a"},
+ {file = "yarl-1.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c1012fa63eb6c032f3ce5d2171c267992ae0c00b9e164efe4d73db818465fac3"},
+ {file = "yarl-1.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a74dcbfe780e62f4b5a062714576f16c2f3493a0394e555ab141bf0d746bb955"},
+ {file = "yarl-1.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8c56986609b057b4839968ba901944af91b8e92f1725d1a2d77cbac6972b9ed1"},
+ {file = "yarl-1.9.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:2c315df3293cd521033533d242d15eab26583360b58f7ee5d9565f15fee1bef4"},
+ {file = "yarl-1.9.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:b7232f8dfbd225d57340e441d8caf8652a6acd06b389ea2d3222b8bc89cbfca6"},
+ {file = "yarl-1.9.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:53338749febd28935d55b41bf0bcc79d634881195a39f6b2f767870b72514caf"},
+ {file = "yarl-1.9.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:066c163aec9d3d073dc9ffe5dd3ad05069bcb03fcaab8d221290ba99f9f69ee3"},
+ {file = "yarl-1.9.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8288d7cd28f8119b07dd49b7230d6b4562f9b61ee9a4ab02221060d21136be80"},
+ {file = "yarl-1.9.2-cp39-cp39-win32.whl", hash = "sha256:b124e2a6d223b65ba8768d5706d103280914d61f5cae3afbc50fc3dfcc016623"},
+ {file = "yarl-1.9.2-cp39-cp39-win_amd64.whl", hash = "sha256:61016e7d582bc46a5378ffdd02cd0314fb8ba52f40f9cf4d9a5e7dbef88dee18"},
+ {file = "yarl-1.9.2.tar.gz", hash = "sha256:04ab9d4b9f587c06d801c2abfe9317b77cdf996c65a90d5e84ecc45010823571"},
]
[package.dependencies]
@@ -3507,21 +3647,24 @@ multidict = ">=4.0"
[[package]]
name = "zipp"
-version = "3.11.0"
+version = "3.15.0"
description = "Backport of pathlib-compatible object wrapper for zip files"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
- {file = "zipp-3.11.0-py3-none-any.whl", hash = "sha256:83a28fcb75844b5c0cdaf5aa4003c2d728c77e05f5aeabe8e95e56727005fbaa"},
- {file = "zipp-3.11.0.tar.gz", hash = "sha256:a7a22e05929290a67401440b39690ae6563279bced5f314609d9d03798f56766"},
+ {file = "zipp-3.15.0-py3-none-any.whl", hash = "sha256:48904fc76a60e542af151aded95726c1a5c34ed43ab4134b597665c86d7ad556"},
+ {file = "zipp-3.15.0.tar.gz", hash = "sha256:112929ad649da941c23de50f356a2b5570c954b65150642bccdd66bf194d224b"},
]
[package.extras]
-docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)"]
-testing = ["flake8 (<5)", "func-timeout", "jaraco.functools", "jaraco.itertools", "more-itertools", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)"]
+docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"]
+testing = ["big-O", "flake8 (<5)", "jaraco.functools", "jaraco.itertools", "more-itertools", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)"]
+
+[extras]
+docs = []
[metadata]
lock-version = "2.0"
python-versions = ">=3.9.1,<3.10"
-content-hash = "9d3a574b1b6f42ae05d4f0fa6d65677ee54a51c53d984dd3f44d02f234962dbb"
+content-hash = "d2b8da22dcd11e0b03f19b9b79e51f205156c5ce75e41cc0225392e9afd8803b"
diff --git a/pyproject.toml b/pyproject.toml
index be020ac3aa..5b58257310 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "OpenPype"
-version = "3.15.11" # OpenPype
+version = "3.16.0" # OpenPype
description = "Open VFX and Animation pipeline with support."
authors = ["OpenPype Team "]
license = "MIT License"
@@ -32,23 +32,23 @@ python = ">=3.9.1,<3.10"
aiohttp = "^3.7"
aiohttp_json_rpc = "*" # TVPaint server
acre = { git = "https://github.com/pypeclub/acre.git" }
-opentimelineio = "^0.14"
appdirs = { git = "https://github.com/ActiveState/appdirs.git", branch = "master" }
blessed = "^1.17" # openpype terminal formatting
coolname = "*"
clique = "1.6.*"
-Click = "^7"
+Click = "^8"
dnspython = "^2.1.0"
ftrack-python-api = "^2.3.3"
+arrow = "^0.17"
shotgun_api3 = {git = "https://github.com/shotgunsoftware/python-api.git", rev = "v3.3.3"}
-gazu = "^0.8.34"
+gazu = "^0.9.3"
google-api-python-client = "^1.12.8" # sync server google support (should be separate?)
jsonschema = "^2.6.0"
keyring = "^22.0.1"
log4mongo = "^1.7"
pathlib2= "^2.3.5" # deadline submit publish job only (single place, maybe not needed?)
Pillow = "^9.0" # used in TVPaint and for slates
-pyblish-base = "^1.8.8"
+pyblish-base = "^1.8.11"
pynput = "^1.7.2" # idle manager in tray
pymongo = "^3.11.2"
"Qt.py" = "^1.3.3"
@@ -70,7 +70,8 @@ requests = "^2.25.1"
pysftp = "^0.2.9"
dropbox = "^11.20.0"
aiohttp-middlewares = "^2.0.0"
-opencolorio = "^2.2.0"
+Unidecode = "1.2.0"
+cryptography = "39.0.0"
[tool.poetry.dev-dependencies]
flake8 = "^6.0"
@@ -81,14 +82,19 @@ GitPython = "^3.1.17"
jedi = "^0.13"
Jinja2 = "^3"
markupsafe = "2.0.1"
-pycodestyle = "^2.5.0"
-pydocstyle = "^3.0.0"
+pycodestyle = "*"
+pydocstyle = "*"
+linkify-it-py = "^2.0.0"
+myst-parser = "^0.18.1"
pylint = "^2.4.4"
pytest = "^6.1"
pytest-cov = "*"
pytest-print = "*"
-Sphinx = "^6.1"
-sphinx-rtd-theme = "*"
+Sphinx = "^5.3"
+m2r2 = "^0.3.3.post2"
+sphinx-autoapi = "^2.0.1"
+sphinxcontrib-napoleon = "^0.7"
+revitron-sphinx-theme = { git = "https://github.com/revitron/revitron-sphinx-theme.git", branch = "master" }
recommonmark = "*"
wheel = "*"
enlighten = "*" # cool terminal progress bars
@@ -119,12 +125,18 @@ version = "5.15.2"
[openpype.qtbinding.darwin]
package = "PySide6"
-version = "6.4.1"
+version = "6.4.3"
[openpype.qtbinding.linux]
package = "PySide2"
version = "5.15.2"
+# Python dependencies that will be available only at runtime of the
+# OpenPype process - they do not interfere with DCC dependencies
+[openpype.runtime-deps]
+opencolorio = "2.2.1"
+opentimelineio = "0.14.1"
+
# TODO: we will need to handle different linux flavours here and
# also different macos versions too.
[openpype.thirdparty.ffmpeg.windows]
@@ -166,3 +178,6 @@ ignore = ["website", "docs", ".git"]
reportMissingImports = true
reportMissingTypeStubs = false
+
+[tool.poetry.extras]
+docs = ["Sphinx", "furo", "sphinxcontrib-napoleon"]
diff --git a/server_addon/README.md b/server_addon/README.md
new file mode 100644
index 0000000000..fa9a6001d2
--- /dev/null
+++ b/server_addon/README.md
@@ -0,0 +1,21 @@
+# OpenPype addon for AYON server
+Converts OpenPype into an AYON addon that can be installed on an AYON server. The addon's versioning follows the versioning of OpenPype.
+
+## Intro
+OpenPype is transitioning to AYON, a dedicated server with its own database, moving away from MongoDB. During this transition period, OpenPype will remain compatible with both MongoDB and AYON. However, we will gradually update the codebase to align with AYON's data structure and separate individual components into addons.
+
+Currently, OpenPype has an AYON mode, which means it utilizes the AYON server instead of MongoDB through conversion utilities. Initially, we added the AYON executable alongside the OpenPype executables to enable AYON mode. While this approach worked, updating to new code versions would require a complete reinstallation. To address this, we have decided to create a new repository specifically for the base desktop application logic, which we currently refer to as the AYON Launcher. This Launcher will replace the executables generated by the OpenPype build and convert the OpenPype code into a server addon, resulting in smaller updates.
+
+Since the implementation of the AYON Launcher is not yet fully completed, we will maintain both methods of starting AYON mode for now. Once the AYON Launcher is finished, we will remove the AYON executables from the OpenPype codebase entirely.
+
+During this transitional period, the AYON Launcher addon will be required, as it serves as the entry point for using the AYON Launcher.
+
+## How to start
+The `create_ayon_addon.py` Python script contains the logic for creating the server addon from the OpenPype codebase. Just run it:
+```shell
+./.poetry/bin/poetry run python ./server_addon/create_ayon_addon.py
+```
+
+It will create a `./package/openpype/{version}/*` directory with all files necessary for the AYON server; the `{version}` part is the `__version__` read from `openpype/version.py`. You can then copy `./package/openpype/` to the server's addons directory, or zip the folder and upload it to the AYON server (a zipping sketch follows below). Restart the server to update the addons information, add the addon version to a server bundle and set the bundle for production or staging usage.
+
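+If you prefer the zip route, here is a minimal sketch using Python's standard
+library (the `3.16.0` version string and the relative paths are illustrative,
+assuming the script is run from the `server_addon` directory):
+
+```python
+import shutil
+
+# Create ./package/openpype-3.16.0.zip containing the generated
+# ./package/openpype/{version} folder
+shutil.make_archive(
+    "./package/openpype-3.16.0",  # output base name (".zip" is appended)
+    "zip",
+    root_dir="./package",  # archive paths are relative to this directory
+    base_dir="openpype"  # only include this folder from root_dir
+)
+```
+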
+Once the addon is on the server and enabled, you can just run the AYON launcher. The content will be downloaded and used automatically.
diff --git a/server_addon/client/pyproject.toml b/server_addon/client/pyproject.toml
new file mode 100644
index 0000000000..6d5ac92ca7
--- /dev/null
+++ b/server_addon/client/pyproject.toml
@@ -0,0 +1,25 @@
+[project]
+name="openpype"
+description="OpenPype addon for AYON server."
+
+[tool.poetry.dependencies]
+python = ">=3.9.1,<3.10"
+aiohttp_json_rpc = "*" # TVPaint server
+aiohttp-middlewares = "^2.0.0"
+wsrpc_aiohttp = "^3.1.1" # websocket server
+clique = "1.6.*"
+shotgun_api3 = {git = "https://github.com/shotgunsoftware/python-api.git", rev = "v3.3.3"}
+gazu = "^0.9.3"
+google-api-python-client = "^1.12.8" # sync server google support (should be separate?)
+jsonschema = "^2.6.0"
+pymongo = "^3.11.2"
+log4mongo = "^1.7"
+pathlib2= "^2.3.5" # deadline submit publish job only (single place, maybe not needed?)
+pyblish-base = "^1.8.11"
+pynput = "^1.7.2" # Timers manager - TODO replace
+"Qt.py" = "^1.3.3"
+qtawesome = "0.7.3"
+speedcopy = "^2.1"
+slack-sdk = "^3.6.0"
+pysftp = "^0.2.9"
+dropbox = "^11.20.0"
diff --git a/server_addon/create_ayon_addon.py b/server_addon/create_ayon_addon.py
new file mode 100644
index 0000000000..657f416441
--- /dev/null
+++ b/server_addon/create_ayon_addon.py
@@ -0,0 +1,140 @@
+import os
+import re
+import shutil
+import zipfile
+import collections
+from pathlib import Path
+from typing import Any, Optional, Iterable
+
+# Patterns of directories to be skipped for server part of addon
+IGNORE_DIR_PATTERNS: list[re.Pattern] = [
+ re.compile(pattern)
+ for pattern in {
+ # Skip directories starting with '.'
+ r"^\.",
+ # Skip any pycache folders
+ "^__pycache__$"
+ }
+]
+
+# Patterns of files to be skipped for server part of addon
+IGNORE_FILE_PATTERNS: list[re.Pattern] = [
+ re.compile(pattern)
+ for pattern in {
+ # Skip files starting with '.'
+ # NOTE this could skip wanted dotfiles in some cases
+ r"^\.",
+ # Skip '.pyc' files
+ r"\.pyc$"
+ }
+]
+
+
+def _value_match_regexes(value: str, regexes: Iterable[re.Pattern]) -> bool:
+ return any(
+ regex.search(value)
+ for regex in regexes
+ )
+
+
+def find_files_in_subdir(
+ src_path: str,
+ ignore_file_patterns: Optional[list[re.Pattern]] = None,
+ ignore_dir_patterns: Optional[list[re.Pattern]] = None
+):
+ """Find all files to copy in subdirectories of given path.
+
+ All files that match any of the patterns in 'ignore_file_patterns' will
+ be skipped and any directories that match any of the patterns in
+ 'ignore_dir_patterns' will be skipped with all subfiles.
+
+ Args:
+ src_path (str): Path to directory to search in.
+ ignore_file_patterns (Optional[list[re.Pattern]]): List of regexes
+ to match files to ignore.
+ ignore_dir_patterns (Optional[list[re.Pattern]]): List of regexes
+ to match directories to ignore.
+
+ Returns:
+ list[tuple[str, str]]: List of tuples with the path to the file and
+ its file path relative to 'src_path'.
+ """
+
+ if ignore_file_patterns is None:
+ ignore_file_patterns = IGNORE_FILE_PATTERNS
+
+ if ignore_dir_patterns is None:
+ ignore_dir_patterns = IGNORE_DIR_PATTERNS
+ output: list[tuple[str, str]] = []
+
+ hierarchy_queue = collections.deque()
+ hierarchy_queue.append((src_path, []))
+ while hierarchy_queue:
+ item: tuple[str, list[str]] = hierarchy_queue.popleft()
+ dirpath, parents = item
+ for name in os.listdir(dirpath):
+ path = os.path.join(dirpath, name)
+ if os.path.isfile(path):
+ if not _value_match_regexes(name, ignore_file_patterns):
+ items = list(parents)
+ items.append(name)
+ output.append((path, os.path.sep.join(items)))
+ continue
+
+ if not _value_match_regexes(name, ignore_dir_patterns):
+ items = list(parents)
+ items.append(name)
+ hierarchy_queue.append((path, items))
+
+ return output
+
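+# Example usage (illustrative path): iterate over files found under a
+# directory, skipping pycache folders and dotfiles via the default
+# ignore patterns:
+#     for path, sub_path in find_files_in_subdir("/path/to/openpype"):
+#         print(sub_path)
+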
+
+def main():
+ openpype_addon_dir = Path(os.path.dirname(os.path.abspath(__file__)))
+ server_dir = openpype_addon_dir / "server"
+ package_root = openpype_addon_dir / "package"
+ pyproject_path = openpype_addon_dir / "client" / "pyproject.toml"
+
+ root_dir = openpype_addon_dir.parent
+ openpype_dir = root_dir / "openpype"
+ version_path = openpype_dir / "version.py"
+
+ # Read version
+ version_content: dict[str, Any] = {}
+ with open(str(version_path), "r") as stream:
+ exec(stream.read(), version_content)
+ addon_version: str = version_content["__version__"]
+
+ output_dir = package_root / "openpype" / addon_version
+ private_dir = output_dir / "private"
+
+ # Make sure package dir is empty
+ if package_root.exists():
+ shutil.rmtree(str(package_root))
+ # Make sure output dir is created
+ output_dir.mkdir(parents=True)
+
+ # Copy version
+ shutil.copy(str(version_path), str(output_dir))
+ for subitem in server_dir.iterdir():
+ shutil.copy(str(subitem), str(output_dir / subitem.name))
+
+ # Make sure private dir exists
+ private_dir.mkdir(parents=True)
+
+ # Copy pyproject.toml
+ shutil.copy(
+ str(pyproject_path),
+ (private_dir / pyproject_path.name)
+ )
+
+ # Zip client
+ zip_filepath = private_dir / "client.zip"
+ with zipfile.ZipFile(zip_filepath, "w", zipfile.ZIP_DEFLATED) as zipf:
+ # Add client code content to zip
+ for path, sub_path in find_files_in_subdir(str(openpype_dir)):
+ zipf.write(path, f"{openpype_dir.name}/{sub_path}")
+
+
+if __name__ == "__main__":
+ main()
diff --git a/server_addon/server/__init__.py b/server_addon/server/__init__.py
new file mode 100644
index 0000000000..df24c73c76
--- /dev/null
+++ b/server_addon/server/__init__.py
@@ -0,0 +1,9 @@
+from ayon_server.addons import BaseServerAddon
+
+from .version import __version__
+
+
+class OpenPypeAddon(BaseServerAddon):
+ name = "openpype"
+ title = "OpenPype"
+ version = __version__
diff --git a/setup.py b/setup.py
index ab6e22bccc..260728dde6 100644
--- a/setup.py
+++ b/setup.py
@@ -89,7 +89,6 @@ install_requires = [
"keyring",
"clique",
"jsonschema",
- "opentimelineio",
"pathlib2",
"pkg_resources",
"PIL",
@@ -126,6 +125,7 @@ bin_includes = [
include_files = [
"igniter",
"openpype",
+ "common",
"schema",
"LICENSE",
"README.md"
@@ -158,11 +158,35 @@ bdist_mac_options = dict(
)
executables = [
- Executable("start.py", base=base,
- target_name="openpype_gui", icon=icon_path.as_posix()),
- Executable("start.py", base=None,
- target_name="openpype_console", icon=icon_path.as_posix())
+ Executable(
+ "start.py",
+ base=base,
+ target_name="openpype_gui",
+ icon=icon_path.as_posix()
+ ),
+ Executable(
+ "start.py",
+ base=None,
+ target_name="openpype_console",
+ icon=icon_path.as_posix()
+ ),
+ Executable(
+ "ayon_start.py",
+ base=base,
+ target_name="ayon",
+ icon=icon_path.as_posix()
+ ),
]
+if IS_WINDOWS:
+ executables.append(
+ Executable(
+ "ayon_start.py",
+ base=None,
+ target_name="ayon_console",
+ icon=icon_path.as_posix()
+ )
+ )
+
if IS_LINUX:
executables.append(
Executable(
diff --git a/start.py b/start.py
index 91e5c29a53..3b020c76c0 100644
--- a/start.py
+++ b/start.py
@@ -133,6 +133,10 @@ else:
vendor_python_path = os.path.join(OPENPYPE_ROOT, "vendor", "python")
sys.path.insert(0, vendor_python_path)
+# Add common package to sys path
+# - common contains common code for bootstrapping and OpenPype processes
+sys.path.insert(0, os.path.join(OPENPYPE_ROOT, "common"))
+
import blessed # noqa: E402
import certifi # noqa: E402
@@ -140,8 +144,8 @@ import certifi # noqa: E402
if sys.__stdout__:
term = blessed.Terminal()
- def _print(message: str):
- if silent_mode:
+ def _print(message: str, force=False):
+ if silent_mode and not force:
return
if message.startswith("!!! "):
print(f'{term.orangered2("!!! ")}{message[4:]}')
@@ -197,6 +201,15 @@ if "--headless" in sys.argv:
elif os.getenv("OPENPYPE_HEADLESS_MODE") != "1":
os.environ.pop("OPENPYPE_HEADLESS_MODE", None)
+# Set builtin ocio root
+os.environ["BUILTIN_OCIO_ROOT"] = os.path.join(
+ OPENPYPE_ROOT,
+ "vendor",
+ "bin",
+ "ocioconfig",
+ "OpenColorIOConfigs"
+)
+
# Enabled logging debug mode when "--debug" is passed
if "--verbose" in sys.argv:
expected_values = (
@@ -255,6 +268,7 @@ from igniter import BootstrapRepos # noqa: E402
from igniter.tools import (
get_openpype_global_settings,
get_openpype_path_from_settings,
+ get_local_openpype_path_from_settings,
validate_mongo_connection,
OpenPypeVersionNotFound,
OpenPypeVersionIncompatible
@@ -348,8 +362,15 @@ def run_disk_mapping_commands(settings):
mappings = disk_mapping.get(low_platform) or []
for source, destination in mappings:
- destination = destination.rstrip('/')
- source = source.rstrip('/')
+ if low_platform == "windows":
+ destination = destination.replace("/", "\\").rstrip("\\")
+ source = source.replace("/", "\\").rstrip("\\")
+ # Add slash after ':' ('G:' -> 'G:\')
+ if destination.endswith(":"):
+ destination += "\\"
+ else:
+ destination = destination.rstrip("/")
+ source = source.rstrip("/")
if low_platform == "darwin":
scr = f'do shell script "ln -s {source} {destination}" with administrator privileges' # noqa
@@ -507,8 +528,8 @@ def _process_arguments() -> tuple:
not use_version_value
or not use_version_value.startswith("=")
):
- _print("!!! Please use option --use-version like:")
- _print(" --use-version=3.0.0")
+ _print("!!! Please use option --use-version like:", True)
+ _print(" --use-version=3.0.0", True)
sys.exit(1)
version_str = use_version_value[1:]
@@ -525,14 +546,14 @@ def _process_arguments() -> tuple:
break
if use_version is None:
- _print("!!! Requested version isn't in correct format.")
+ _print("!!! Requested version isn't in correct format.", True)
_print((" Use --list-versions to find out"
- " proper version string."))
+ " proper version string."), True)
sys.exit(1)
if arg == "--validate-version":
- _print("!!! Please use option --validate-version like:")
- _print(" --validate-version=3.0.0")
+ _print("!!! Please use option --validate-version like:", True)
+ _print(" --validate-version=3.0.0", True)
sys.exit(1)
if arg.startswith("--validate-version="):
@@ -543,9 +564,9 @@ def _process_arguments() -> tuple:
sys.argv.remove(arg)
commands.append("validate")
else:
- _print("!!! Requested version isn't in correct format.")
+ _print("!!! Requested version isn't in correct format.", True)
_print((" Use --list-versions to find out"
- " proper version string."))
+ " proper version string."), True)
sys.exit(1)
if "--list-versions" in sys.argv:
@@ -556,7 +577,7 @@ def _process_arguments() -> tuple:
# this is helper to run igniter before anything else
if "igniter" in sys.argv:
if os.getenv("OPENPYPE_HEADLESS_MODE") == "1":
- _print("!!! Cannot open Igniter dialog in headless mode.")
+ _print("!!! Cannot open Igniter dialog in headless mode.", True)
sys.exit(1)
return_code = igniter.open_dialog()
@@ -606,9 +627,9 @@ def _determine_mongodb() -> str:
if not openpype_mongo:
_print("*** No DB connection string specified.")
if os.getenv("OPENPYPE_HEADLESS_MODE") == "1":
- _print("!!! Cannot open Igniter dialog in headless mode.")
- _print(
- "!!! Please use `OPENPYPE_MONGO` to specify server address.")
+ _print("!!! Cannot open Igniter dialog in headless mode.", True)
+ _print(("!!! Please use `OPENPYPE_MONGO` to specify "
+ "server address."), True)
sys.exit(1)
_print("--- launching setup UI ...")
@@ -783,7 +804,7 @@ def _find_frozen_openpype(use_version: str = None,
try:
version_path = bootstrap.extract_openpype(openpype_version)
except OSError as e:
- _print("!!! failed: {}".format(str(e)))
+ _print("!!! failed: {}".format(str(e)), True)
sys.exit(1)
else:
# cleanup zip after extraction
@@ -899,7 +920,7 @@ def _boot_validate_versions(use_version, local_version):
v: OpenPypeVersion
found = [v for v in openpype_versions if str(v) == use_version]
if not found:
- _print(f"!!! Version [ {use_version} ] not found.")
+ _print(f"!!! Version [ {use_version} ] not found.", True)
list_versions(openpype_versions, local_version)
sys.exit(1)
@@ -908,7 +929,8 @@ def _boot_validate_versions(use_version, local_version):
use_version, openpype_versions
)
valid, message = bootstrap.validate_openpype_version(version_path)
- _print(f'{">>> " if valid else "!!! "}{message}')
+ _print(f'{">>> " if valid else "!!! "}{message}', not valid)
+ return valid
def _boot_print_versions(openpype_root):
@@ -935,7 +957,7 @@ def _boot_print_versions(openpype_root):
def _boot_handle_missing_version(local_version, message):
- _print(message)
+ _print(message, True)
if os.environ.get("OPENPYPE_HEADLESS_MODE") == "1":
openpype_versions = bootstrap.find_openpype(
include_zips=True)
@@ -983,7 +1005,7 @@ def boot():
openpype_mongo = _determine_mongodb()
except RuntimeError as e:
# without mongodb url we are done for.
- _print(f"!!! {e}")
+ _print(f"!!! {e}", True)
sys.exit(1)
os.environ["OPENPYPE_MONGO"] = openpype_mongo
@@ -1018,14 +1040,18 @@ def boot():
# find its versions there and bootstrap them.
openpype_path = get_openpype_path_from_settings(global_settings)
+ # Check if local versions should be installed in custom folder and not in
+ # user app data
+ data_dir = get_local_openpype_path_from_settings(global_settings)
+ bootstrap.set_data_dir(data_dir)
if getattr(sys, 'frozen', False):
local_version = bootstrap.get_version(Path(OPENPYPE_ROOT))
else:
local_version = OpenPypeVersion.get_installed_version_str()
if "validate" in commands:
- _boot_validate_versions(use_version, local_version)
- sys.exit(1)
+ valid = _boot_validate_versions(use_version, local_version)
+ sys.exit(0 if valid else 1)
if not openpype_path:
_print("*** Cannot get OpenPype path from database.")
@@ -1035,7 +1061,7 @@ def boot():
if "print_versions" in commands:
_boot_print_versions(OPENPYPE_ROOT)
- sys.exit(1)
+ sys.exit(0)
# ------------------------------------------------------------------------
# Find OpenPype versions
@@ -1052,13 +1078,13 @@ def boot():
except RuntimeError as e:
# no version to run
- _print(f"!!! {e}")
+ _print(f"!!! {e}", True)
sys.exit(1)
# validate version
- _print(f">>> Validating version [ {str(version_path)} ]")
+ _print(f">>> Validating version in frozen [ {str(version_path)} ]")
result = bootstrap.validate_openpype_version(version_path)
if not result[0]:
- _print(f"!!! Invalid version: {result[1]}")
+ _print(f"!!! Invalid version: {result[1]}", True)
sys.exit(1)
_print("--- version is valid")
else:
@@ -1126,7 +1152,7 @@ def boot():
cli.main(obj={}, prog_name="openpype")
except Exception: # noqa
exc_info = sys.exc_info()
- _print("!!! OpenPype crashed:")
+ _print("!!! OpenPype crashed:", True)
traceback.print_exception(*exc_info)
sys.exit(1)
diff --git a/tests/lib/testing_classes.py b/tests/lib/testing_classes.py
index 300024dc98..f04607dc27 100644
--- a/tests/lib/testing_classes.py
+++ b/tests/lib/testing_classes.py
@@ -12,7 +12,7 @@ import requests
import re
from tests.lib.db_handler import DBHandler
-from common.openpype_common.distribution.file_handler import RemoteFileHandler
+from common.ayon_common.distribution.file_handler import RemoteFileHandler
from openpype.modules import ModulesManager
from openpype.settings import get_project_settings
diff --git a/tools/fetch_thirdparty_libs.py b/tools/fetch_thirdparty_libs.py
index 70257caa46..c2dc4636d0 100644
--- a/tools/fetch_thirdparty_libs.py
+++ b/tools/fetch_thirdparty_libs.py
@@ -67,40 +67,45 @@ def _print(msg: str, message_type: int = 0) -> None:
print(f"{header}{msg}")
-def install_qtbinding(pyproject, openpype_root, platform_name):
- _print("Handling Qt binding framework ...")
- qtbinding_def = pyproject["openpype"]["qtbinding"][platform_name]
- package = qtbinding_def["package"]
- version = qtbinding_def.get("version")
-
- qtbinding_arg = None
+def _pip_install(openpype_root, package, version=None):
+ arg = None
if package and version:
- qtbinding_arg = f"{package}=={version}"
+ arg = f"{package}=={version}"
elif package:
- qtbinding_arg = package
+ arg = package
- if not qtbinding_arg:
- _print("Didn't find Qt binding to install")
+ if not arg:
+ _print("Couldn't find package to install")
sys.exit(1)
- _print(f"We'll install {qtbinding_arg}")
+ _print(f"We'll install {arg}")
python_vendor_dir = openpype_root / "vendor" / "python"
try:
subprocess.run(
[
sys.executable,
- "-m", "pip", "install", "--upgrade", qtbinding_arg,
+ "-m", "pip", "install", "--upgrade", arg,
"-t", str(python_vendor_dir)
],
check=True,
stdout=subprocess.DEVNULL
)
except subprocess.CalledProcessError as e:
- _print("Error during PySide2 installation.", 1)
+ _print(f"Error during {package} installation.", 1)
_print(str(e), 1)
sys.exit(1)
+
+def install_qtbinding(pyproject, openpype_root, platform_name):
+ _print("Handling Qt binding framework ...")
+ qtbinding_def = pyproject["openpype"]["qtbinding"][platform_name]
+ package = qtbinding_def["package"]
+ version = qtbinding_def.get("version")
+ _pip_install(openpype_root, package, version)
+
+ python_vendor_dir = openpype_root / "vendor" / "python"
+
# Remove libraries for QtSql which don't have available libraries
# by default and Postgre library would require to modify rpath of
# dependency
@@ -112,6 +117,13 @@ def install_qtbinding(pyproject, openpype_root, platform_name):
os.remove(str(filepath))
+def install_runtime_dependencies(pyproject, openpype_root):
+ _print("Installing Runtime Dependencies ...")
+ runtime_deps = pyproject["openpype"]["runtime-deps"]
+ for package, version in runtime_deps.items():
+ _pip_install(openpype_root, package, version)
+
+
def install_thirdparty(pyproject, openpype_root, platform_name):
_print("Processing third-party dependencies ...")
try:
@@ -221,6 +233,7 @@ def main():
pyproject = toml.load(openpype_root / "pyproject.toml")
platform_name = platform.system().lower()
install_qtbinding(pyproject, openpype_root, platform_name)
+ install_runtime_dependencies(pyproject, openpype_root)
install_thirdparty(pyproject, openpype_root, platform_name)
end_time = time.time_ns()
total_time = (end_time - start_time) / 1000000000
diff --git a/tools/run_tray_ayon.ps1 b/tools/run_tray_ayon.ps1
new file mode 100644
index 0000000000..54a80f93fd
--- /dev/null
+++ b/tools/run_tray_ayon.ps1
@@ -0,0 +1,41 @@
+<#
+.SYNOPSIS
+ Helper script to run AYON Tray.
+
+.DESCRIPTION
+ Runs AYON Tray from sources using the Poetry environment, with debug logging enabled.
+
+.EXAMPLE
+
+PS> .\run_tray_ayon.ps1
+
+#>
+$current_dir = Get-Location
+$script_dir = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
+$ayon_root = (Get-Item $script_dir).parent.FullName
+
+# Install PSWriteColor to support colorized output to terminal
+$env:PSModulePath = $env:PSModulePath + ";$($ayon_root)\tools\modules\powershell"
+
+$env:_INSIDE_OPENPYPE_TOOL = "1"
+
+# make sure Poetry is in PATH
+if (-not (Test-Path 'env:POETRY_HOME')) {
+ $env:POETRY_HOME = "$ayon_root\.poetry"
+}
+$env:PATH = "$($env:PATH);$($env:POETRY_HOME)\bin"
+
+
+Set-Location -Path $ayon_root
+
+Write-Color -Text ">>> ", "Reading Poetry ... " -Color Green, Gray -NoNewline
+if (-not (Test-Path -PathType Container -Path "$($env:POETRY_HOME)\bin")) {
+ Write-Color -Text "NOT FOUND" -Color Yellow
+ Write-Color -Text "*** ", "We need to install Poetry create virtual env first ..." -Color Yellow, Gray
+ & "$ayon_root\tools\create_env.ps1"
+} else {
+ Write-Color -Text "OK" -Color Green
+}
+
+& "$($env:POETRY_HOME)\bin\poetry" run python "$($ayon_root)\ayon_start.py" tray --debug
+Set-Location -Path $current_dir
diff --git a/tools/run_tray_ayon.sh b/tools/run_tray_ayon.sh
new file mode 100755
index 0000000000..3039750b87
--- /dev/null
+++ b/tools/run_tray_ayon.sh
@@ -0,0 +1,78 @@
+#!/usr/bin/env bash
+# Run AYON Tray
+
+# Colors for terminal
+
+RST='\033[0m' # Text Reset
+
+# Regular Colors
+Black='\033[0;30m' # Black
+Red='\033[0;31m' # Red
+Green='\033[0;32m' # Green
+Yellow='\033[0;33m' # Yellow
+Blue='\033[0;34m' # Blue
+Purple='\033[0;35m' # Purple
+Cyan='\033[0;36m' # Cyan
+White='\033[0;37m' # White
+
+# Bold
+BBlack='\033[1;30m' # Black
+BRed='\033[1;31m' # Red
+BGreen='\033[1;32m' # Green
+BYellow='\033[1;33m' # Yellow
+BBlue='\033[1;34m' # Blue
+BPurple='\033[1;35m' # Purple
+BCyan='\033[1;36m' # Cyan
+BWhite='\033[1;37m' # White
+
+# Bold High Intensity
+BIBlack='\033[1;90m' # Black
+BIRed='\033[1;91m' # Red
+BIGreen='\033[1;92m' # Green
+BIYellow='\033[1;93m' # Yellow
+BIBlue='\033[1;94m' # Blue
+BIPurple='\033[1;95m' # Purple
+BICyan='\033[1;96m' # Cyan
+BIWhite='\033[1;97m' # White
+
+
+##############################################################################
+# Return absolute path
+# Globals:
+# None
+# Arguments:
+# Path to resolve
+# Returns:
+# Absolute path (echoed to stdout)
+###############################################################################
+realpath () {
+ echo $(cd $(dirname "$1"); pwd)/$(basename "$1")
+}
+
+# Main
+main () {
+ # Directories
+ ayon_root=$(realpath $(dirname $(dirname "${BASH_SOURCE[0]}")))
+
+ _inside_openpype_tool="1"
+
+ if [[ -z $POETRY_HOME ]]; then
+ export POETRY_HOME="$ayon_root/.poetry"
+ fi
+
+ echo -e "${BIGreen}>>>${RST} Reading Poetry ... \c"
+ if [ -f "$POETRY_HOME/bin/poetry" ]; then
+ echo -e "${BIGreen}OK${RST}"
+ else
+ echo -e "${BIYellow}NOT FOUND${RST}"
+ echo -e "${BIYellow}***${RST} We need to install Poetry and virtual env ..."
+ . "$ayon_root/tools/create_env.sh" || { echo -e "${BIRed}!!!${RST} Poetry installation failed"; return; }
+ fi
+
+ pushd "$ayon_root" > /dev/null || return > /dev/null
+
+ echo -e "${BIGreen}>>>${RST} Running AYON Tray with debug option ..."
+ "$POETRY_HOME/bin/poetry" run python3 "$ayon_root/ayon_start.py" tray --debug
+}
+
+main
diff --git a/website/docs/admin_openpype_commands.md b/website/docs/admin_openpype_commands.md
index 131b6c0a51..a149d78aa2 100644
--- a/website/docs/admin_openpype_commands.md
+++ b/website/docs/admin_openpype_commands.md
@@ -40,7 +40,6 @@ For more information [see here](admin_use.md#run-openpype).
| module | Run command line arguments for modules. | |
| repack-version | Tool to re-create version zip. | [📑](#repack-version-arguments) |
| tray | Launch OpenPype Tray. | [📑](#tray-arguments)
-| launch | Launch application in Pype environment. | [📑](#launch-arguments) |
| publish | Pype takes JSON from provided path and use it to publish data in it. | [📑](#publish-arguments) |
| extractenvironments | Extract environment variables for entered context to a json file. | [📑](#extractenvironments-arguments) |
| run | Execute given python script within OpenPype environment. | [📑](#run-arguments) |
@@ -54,26 +53,6 @@ For more information [see here](admin_use.md#run-openpype).
```shell
openpype_console tray
```
----
-
-### `launch` arguments {#launch-arguments}
-
-| Argument | Description |
-| --- | --- |
-| `--app` | Application name - this should be the key for application from Settings. |
-| `--project` | Project name (default taken from `AVALON_PROJECT` if set) |
-| `--asset` | Asset name (default taken from `AVALON_ASSET` if set) |
-| `--task` | Task name (default taken from `AVALON_TASK` is set) |
-| `--tools` | *Optional: Additional tools to add* |
-| `--user` | *Optional: User on behalf to run* |
-| `--ftrack-server` / `-fs` | *Optional: Ftrack server URL* |
-| `--ftrack-user` / `-fu` | *Optional: Ftrack user* |
-| `--ftrack-key` / `-fk` | *Optional: Ftrack API key* |
-
-For example to run Python interactive console in Pype context:
-```shell
-pype launch --app python --project my_project --asset my_asset --task my_task
-```
---
### `publish` arguments {#publish-arguments}
diff --git a/website/docs/dev_colorspace.md b/website/docs/dev_colorspace.md
index c4b8e74d73..cb07cb18a0 100644
--- a/website/docs/dev_colorspace.md
+++ b/website/docs/dev_colorspace.md
@@ -80,7 +80,7 @@ from openpype.pipeline.colorspace import (
class YourLoader(api.Loader):
def load(self, context, name=None, namespace=None, options=None):
- path = self.fname
+ path = self.filepath_from_context(context)
colorspace_data = context["representation"]["data"].get("colorspaceData", {})
colorspace = (
colorspace_data.get("colorspace")