diff --git a/CHANGELOG.md b/CHANGELOG.md
index 70e23e0ff8..364555f8b2 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,18 +1,264 @@
# Changelog
-## [2.14.0](https://github.com/pypeclub/pype/tree/2.14.0) (2020-11-24)
+## [2.16.1](https://github.com/pypeclub/pype/tree/2.16.1) (2021-04-13)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.16.0...2.16.1)
+
+**Enhancements:**
+
+- Nuke: comp renders mix up [\#1301](https://github.com/pypeclub/pype/pull/1301)
+- Validate project settings [\#1297](https://github.com/pypeclub/pype/pull/1297)
+- After Effects: added SubsetManager [\#1234](https://github.com/pypeclub/pype/pull/1234)
+
+**Fixed bugs:**
+
+- Ftrack custom attributes in bulks [\#1312](https://github.com/pypeclub/pype/pull/1312)
+- Ftrack optional pypclub role [\#1303](https://github.com/pypeclub/pype/pull/1303)
+- AE remove orphaned instance from workfile - fix self.stub [\#1282](https://github.com/pypeclub/pype/pull/1282)
+- Avalon schema names [\#1242](https://github.com/pypeclub/pype/pull/1242)
+- Handle duplication of Task name [\#1226](https://github.com/pypeclub/pype/pull/1226)
+- Modified path of plugin loads for Harmony and TVPaint [\#1217](https://github.com/pypeclub/pype/pull/1217)
+- Regex checks in profiles filtering [\#1214](https://github.com/pypeclub/pype/pull/1214)
+- Bulk mov strict task [\#1204](https://github.com/pypeclub/pype/pull/1204)
+- Update custom ftrack session attributes [\#1202](https://github.com/pypeclub/pype/pull/1202)
+- Nuke: write node colorspace ignore `default\(\)` label [\#1199](https://github.com/pypeclub/pype/pull/1199)
+- Nuke: reverse search to make it more versatile [\#1178](https://github.com/pypeclub/pype/pull/1178)
+
+**Merged pull requests:**
+
+- Forward compatible ftrack group [\#1243](https://github.com/pypeclub/pype/pull/1243)
+- Error message in pyblish UI [\#1206](https://github.com/pypeclub/pype/pull/1206)
+- Nuke: deadline submission with search replaced env values from preset [\#1194](https://github.com/pypeclub/pype/pull/1194)
+
+## [2.16.0](https://github.com/pypeclub/pype/tree/2.16.0) (2021-03-22)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.3...2.16.0)
+
+**Enhancements:**
+
+- Nuke: deadline submit limit group filter [\#1167](https://github.com/pypeclub/pype/pull/1167)
+- Maya: support for Deadline Group and Limit Groups - backport 2.x [\#1156](https://github.com/pypeclub/pype/pull/1156)
+- Maya: fixes for Redshift support [\#1152](https://github.com/pypeclub/pype/pull/1152)
+- Nuke: adding preset for a Read node name to all img and mov Loaders [\#1146](https://github.com/pypeclub/pype/pull/1146)
+- nuke deadline submit with environ var from presets overrides [\#1142](https://github.com/pypeclub/pype/pull/1142)
+- Change timers after task change [\#1138](https://github.com/pypeclub/pype/pull/1138)
+- Nuke: shortcuts for Pype menu [\#1127](https://github.com/pypeclub/pype/pull/1127)
+- Nuke: workfile template [\#1124](https://github.com/pypeclub/pype/pull/1124)
+- Sites local settings by site name [\#1117](https://github.com/pypeclub/pype/pull/1117)
+- Reset loader's asset selection on context change [\#1106](https://github.com/pypeclub/pype/pull/1106)
+- Bulk mov render publishing [\#1101](https://github.com/pypeclub/pype/pull/1101)
+- Photoshop: mark publishable instances [\#1093](https://github.com/pypeclub/pype/pull/1093)
+- Added ability to define BG color for extract review [\#1088](https://github.com/pypeclub/pype/pull/1088)
+- TVPaint extractor enhancement [\#1080](https://github.com/pypeclub/pype/pull/1080)
+- Photoshop: added support for .psb in workfiles [\#1078](https://github.com/pypeclub/pype/pull/1078)
+- Optionally add task to subset name [\#1072](https://github.com/pypeclub/pype/pull/1072)
+- Only extend clip range when collecting. [\#1008](https://github.com/pypeclub/pype/pull/1008)
+- Collect audio for farm reviews. [\#1073](https://github.com/pypeclub/pype/pull/1073)
+
+
+**Fixed bugs:**
+
+- Fix path spaces in jpeg extractor [\#1174](https://github.com/pypeclub/pype/pull/1174)
+- Maya: Bugfix: superclass for CreateCameraRig [\#1166](https://github.com/pypeclub/pype/pull/1166)
+- Maya: Submit to Deadline - fix typo in condition [\#1163](https://github.com/pypeclub/pype/pull/1163)
+- Avoid dot in repre extension [\#1125](https://github.com/pypeclub/pype/pull/1125)
+- Fix versions variable usage in standalone publisher [\#1090](https://github.com/pypeclub/pype/pull/1090)
+- Collect instance data fix subset query [\#1082](https://github.com/pypeclub/pype/pull/1082)
+- Fix getting the camera name. [\#1067](https://github.com/pypeclub/pype/pull/1067)
+- Nuke: Ensure "NUKE\_TEMP\_DIR" is not part of the Deadline job environment. [\#1064](https://github.com/pypeclub/pype/pull/1064)
+
+## [2.15.3](https://github.com/pypeclub/pype/tree/2.15.3) (2021-02-26)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.2...2.15.3)
+
+**Enhancements:**
+
+- Maya: speedup renderable camera collection [\#1053](https://github.com/pypeclub/pype/pull/1053)
+- Harmony - add regex search to filter allowed task names for collectin… [\#1047](https://github.com/pypeclub/pype/pull/1047)
+
+**Fixed bugs:**
+
+- Ftrack integrate hierarchy fix [\#1085](https://github.com/pypeclub/pype/pull/1085)
+- Explicit subset filter in anatomy instance data [\#1059](https://github.com/pypeclub/pype/pull/1059)
+- TVPaint frame offset [\#1057](https://github.com/pypeclub/pype/pull/1057)
+- Auto fix unicode strings [\#1046](https://github.com/pypeclub/pype/pull/1046)
+
+## [2.15.2](https://github.com/pypeclub/pype/tree/2.15.2) (2021-02-19)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.1...2.15.2)
+
+**Enhancements:**
+
+- Maya: Vray scene publishing [\#1013](https://github.com/pypeclub/pype/pull/1013)
+
+**Fixed bugs:**
+
+- Fix entity move under project [\#1040](https://github.com/pypeclub/pype/pull/1040)
+- smaller nuke fixes from production [\#1036](https://github.com/pypeclub/pype/pull/1036)
+- TVPaint thumbnail extract fix [\#1031](https://github.com/pypeclub/pype/pull/1031)
+
+## [2.15.1](https://github.com/pypeclub/pype/tree/2.15.1) (2021-02-12)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.0...2.15.1)
+
+**Enhancements:**
+
+- Delete version as loader action [\#1011](https://github.com/pypeclub/pype/pull/1011)
+- Delete old versions [\#445](https://github.com/pypeclub/pype/pull/445)
+
+**Fixed bugs:**
+
+- PS - remove obsolete functions from pywin32 [\#1006](https://github.com/pypeclub/pype/pull/1006)
+- Clone description of review session objects. [\#922](https://github.com/pypeclub/pype/pull/922)
+
+## [2.15.0](https://github.com/pypeclub/pype/tree/2.15.0) (2021-02-09)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.6...2.15.0)
+
+**Enhancements:**
+
+- Resolve - loading and updating clips [\#932](https://github.com/pypeclub/pype/pull/932)
+- Release/2.15.0 [\#926](https://github.com/pypeclub/pype/pull/926)
+- Photoshop: add option for template.psd and prelaunch hook [\#894](https://github.com/pypeclub/pype/pull/894)
+- Nuke: deadline presets [\#993](https://github.com/pypeclub/pype/pull/993)
+- Maya: Alembic only set attributes that exist. [\#986](https://github.com/pypeclub/pype/pull/986)
+- Harmony: render local and handle fixes [\#981](https://github.com/pypeclub/pype/pull/981)
+- PSD Bulk export of ANIM group [\#965](https://github.com/pypeclub/pype/pull/965)
+- AE - added prelaunch hook for opening last or workfile from template [\#944](https://github.com/pypeclub/pype/pull/944)
+- PS - safer handling of loading of workfile [\#941](https://github.com/pypeclub/pype/pull/941)
+- Maya: Handling Arnold referenced AOVs [\#938](https://github.com/pypeclub/pype/pull/938)
+- TVPaint: switch layer IDs for layer names during identification [\#903](https://github.com/pypeclub/pype/pull/903)
+- TVPaint audio/sound loader [\#893](https://github.com/pypeclub/pype/pull/893)
+- Clone review session with children. [\#891](https://github.com/pypeclub/pype/pull/891)
+- Simple compositing data packager for freelancers [\#884](https://github.com/pypeclub/pype/pull/884)
+- Harmony deadline submission [\#881](https://github.com/pypeclub/pype/pull/881)
+- Maya: Optionally hide image planes from reviews. [\#840](https://github.com/pypeclub/pype/pull/840)
+- Maya: handle referenced AOVs for Vray [\#824](https://github.com/pypeclub/pype/pull/824)
+- DWAA/DWAB support on windows [\#795](https://github.com/pypeclub/pype/pull/795)
+- Unreal: animation, layout and setdress updates [\#695](https://github.com/pypeclub/pype/pull/695)
+
+**Fixed bugs:**
+
+- Maya: Looks - disable hardlinks [\#995](https://github.com/pypeclub/pype/pull/995)
+- Fix Ftrack custom attribute update [\#982](https://github.com/pypeclub/pype/pull/982)
+- Prores ks in burnin script [\#960](https://github.com/pypeclub/pype/pull/960)
+- terminal.py crash on import [\#839](https://github.com/pypeclub/pype/pull/839)
+- Extract review handle bizarre pixel aspect ratio [\#990](https://github.com/pypeclub/pype/pull/990)
+- Nuke: add nuke related env var to submission [\#988](https://github.com/pypeclub/pype/pull/988)
+- Nuke: missing preset's variable [\#984](https://github.com/pypeclub/pype/pull/984)
+- Get creator by name fix [\#979](https://github.com/pypeclub/pype/pull/979)
+- Fix update of project's tasks on Ftrack sync [\#972](https://github.com/pypeclub/pype/pull/972)
+- nuke: wrong frame offset in mov loader [\#971](https://github.com/pypeclub/pype/pull/971)
+- Create project structure action fix multiroot [\#967](https://github.com/pypeclub/pype/pull/967)
+- PS: remove pywin installation from hook [\#964](https://github.com/pypeclub/pype/pull/964)
+- Prores ks in burnin script [\#959](https://github.com/pypeclub/pype/pull/959)
+- Subset family is now stored in subset document [\#956](https://github.com/pypeclub/pype/pull/956)
+- DJV new version arguments [\#954](https://github.com/pypeclub/pype/pull/954)
+- TV Paint: Fix single frame Sequence [\#953](https://github.com/pypeclub/pype/pull/953)
+- nuke: missing `file` knob update [\#933](https://github.com/pypeclub/pype/pull/933)
+- Photoshop: Create from single layer was failing [\#920](https://github.com/pypeclub/pype/pull/920)
+- Nuke: baking mov with correct colorspace inherited from write [\#909](https://github.com/pypeclub/pype/pull/909)
+- Launcher fix actions discover [\#896](https://github.com/pypeclub/pype/pull/896)
+- Get the correct file path for the updated mov. [\#889](https://github.com/pypeclub/pype/pull/889)
+- Maya: Deadline submitter - shared data access violation [\#831](https://github.com/pypeclub/pype/pull/831)
+- Maya: Take into account vray master AOV switch [\#822](https://github.com/pypeclub/pype/pull/822)
+
+**Merged pull requests:**
+
+- Refactor blender to 3.0 format [\#934](https://github.com/pypeclub/pype/pull/934)
+
+## [2.14.6](https://github.com/pypeclub/pype/tree/2.14.6) (2021-01-15)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.5...2.14.6)
+
+**Fixed bugs:**
+
+- Nuke: improving of hashing path [\#885](https://github.com/pypeclub/pype/pull/885)
+
+**Merged pull requests:**
+
+- Hiero: cut videos with correct seconds [\#892](https://github.com/pypeclub/pype/pull/892)
+- Faster sync to avalon preparation [\#869](https://github.com/pypeclub/pype/pull/869)
+
+## [2.14.5](https://github.com/pypeclub/pype/tree/2.14.5) (2021-01-06)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.4...2.14.5)
+
+**Merged pull requests:**
+
+- Pype logger refactor [\#866](https://github.com/pypeclub/pype/pull/866)
+
+## [2.14.4](https://github.com/pypeclub/pype/tree/2.14.4) (2020-12-18)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.3...2.14.4)
+
+**Merged pull requests:**
+
+- Fix - AE - added explicit cast to int [\#837](https://github.com/pypeclub/pype/pull/837)
+
+## [2.14.3](https://github.com/pypeclub/pype/tree/2.14.3) (2020-12-16)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.2...2.14.3)
+
+**Fixed bugs:**
+
+- TVPaint repair invalid metadata [\#809](https://github.com/pypeclub/pype/pull/809)
+- Feature/push hier value to nonhier action [\#807](https://github.com/pypeclub/pype/pull/807)
+- Harmony: fix palette and image sequence loader [\#806](https://github.com/pypeclub/pype/pull/806)
+
+**Merged pull requests:**
+
+- respecting space in path [\#823](https://github.com/pypeclub/pype/pull/823)
+
+## [2.14.2](https://github.com/pypeclub/pype/tree/2.14.2) (2020-12-04)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.1...2.14.2)
+
+**Enhancements:**
+
+- Collapsible wrapper in settings [\#767](https://github.com/pypeclub/pype/pull/767)
+
+**Fixed bugs:**
+
+- Harmony: template extraction and palettes thumbnails on mac [\#768](https://github.com/pypeclub/pype/pull/768)
+- TVPaint store context to workfile metadata \(764\) [\#766](https://github.com/pypeclub/pype/pull/766)
+- Extract review audio cut fix [\#763](https://github.com/pypeclub/pype/pull/763)
+
+**Merged pull requests:**
+
+- AE: fix publish after background load [\#781](https://github.com/pypeclub/pype/pull/781)
+- TVPaint store members key [\#769](https://github.com/pypeclub/pype/pull/769)
+
+## [2.14.1](https://github.com/pypeclub/pype/tree/2.14.1) (2020-11-27)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.0...2.14.1)
+
+**Enhancements:**
+
+- Settings required keys in modifiable dict [\#770](https://github.com/pypeclub/pype/pull/770)
+- Extract review may not add audio to output [\#761](https://github.com/pypeclub/pype/pull/761)
+
+**Fixed bugs:**
+
+- After Effects: frame range, file format and render source scene fixes [\#760](https://github.com/pypeclub/pype/pull/760)
+- Hiero: trimming review with clip event number [\#754](https://github.com/pypeclub/pype/pull/754)
+- TVPaint: fix updating of loaded subsets [\#752](https://github.com/pypeclub/pype/pull/752)
+- Maya: Vray handling of default aov [\#748](https://github.com/pypeclub/pype/pull/748)
+- Maya: multiple renderable cameras in layer didn't work [\#744](https://github.com/pypeclub/pype/pull/744)
+- Ftrack integrate custom attributes fix [\#742](https://github.com/pypeclub/pype/pull/742)
+
+## [2.14.0](https://github.com/pypeclub/pype/tree/2.14.0) (2020-11-23)
[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.7...2.14.0)
**Enhancements:**
+- Render publish plugins abstraction [\#687](https://github.com/pypeclub/pype/pull/687)
- Shot asset build trigger status [\#736](https://github.com/pypeclub/pype/pull/736)
- Maya: add camera rig publishing option [\#721](https://github.com/pypeclub/pype/pull/721)
- Sort instances by label in pyblish gui [\#719](https://github.com/pypeclub/pype/pull/719)
- Synchronize ftrack hierarchical and shot attributes [\#716](https://github.com/pypeclub/pype/pull/716)
- 686 standalonepublisher editorial from image sequences [\#699](https://github.com/pypeclub/pype/pull/699)
-- TV Paint: initial implementation of creators and local rendering [\#693](https://github.com/pypeclub/pype/pull/693)
-- Render publish plugins abstraction [\#687](https://github.com/pypeclub/pype/pull/687)
- Ask user to select non-default camera from scene or create a new. [\#678](https://github.com/pypeclub/pype/pull/678)
- TVPaint: image loader with options [\#675](https://github.com/pypeclub/pype/pull/675)
- Maya: Camera name can be added to burnins. [\#674](https://github.com/pypeclub/pype/pull/674)
@@ -21,25 +267,33 @@
**Fixed bugs:**
+- Bugfix Hiero Review / Plate representation publish [\#743](https://github.com/pypeclub/pype/pull/743)
+- Asset fetch second fix [\#726](https://github.com/pypeclub/pype/pull/726)
- TVPaint extract review fix [\#740](https://github.com/pypeclub/pype/pull/740)
- After Effects: Reviews were not being sent to ftrack [\#738](https://github.com/pypeclub/pype/pull/738)
-- Asset fetch second fix [\#726](https://github.com/pypeclub/pype/pull/726)
- Maya: vray proxy was not loading [\#722](https://github.com/pypeclub/pype/pull/722)
- Maya: Vray expected file fixes [\#682](https://github.com/pypeclub/pype/pull/682)
+- Missing audio on farm submission. [\#639](https://github.com/pypeclub/pype/pull/639)
**Deprecated:**
- Removed artist view from pyblish gui [\#717](https://github.com/pypeclub/pype/pull/717)
- Maya: disable legacy override check for cameras [\#715](https://github.com/pypeclub/pype/pull/715)
+**Merged pull requests:**
+
+- Application manager [\#728](https://github.com/pypeclub/pype/pull/728)
+- Feature \#664 3.0 lib refactor [\#706](https://github.com/pypeclub/pype/pull/706)
+- Lib from illicit part 2 [\#700](https://github.com/pypeclub/pype/pull/700)
+- 3.0 lib refactor - path tools [\#697](https://github.com/pypeclub/pype/pull/697)
## [2.13.7](https://github.com/pypeclub/pype/tree/2.13.7) (2020-11-19)
[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.6...2.13.7)
-**Merged pull requests:**
+**Fixed bugs:**
-- fix\(SP\): getting fps from context instead of nonexistent entity [\#729](https://github.com/pypeclub/pype/pull/729)
+- Standalone Publisher: getting fps from context instead of nonexistent entity [\#729](https://github.com/pypeclub/pype/pull/729)
# Changelog
diff --git a/HISTORY.md b/HISTORY.md
index b8b96fb4c3..053059a9ea 100644
--- a/HISTORY.md
+++ b/HISTORY.md
@@ -1,3 +1,268 @@
+## [2.16.0](https://github.com/pypeclub/pype/tree/2.16.0) (2021-03-22)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.3...2.16.0)
+
+**Enhancements:**
+
+- Nuke: deadline submit limit group filter [\#1167](https://github.com/pypeclub/pype/pull/1167)
+- Maya: support for Deadline Group and Limit Groups - backport 2.x [\#1156](https://github.com/pypeclub/pype/pull/1156)
+- Maya: fixes for Redshift support [\#1152](https://github.com/pypeclub/pype/pull/1152)
+- Nuke: adding preset for a Read node name to all img and mov Loaders [\#1146](https://github.com/pypeclub/pype/pull/1146)
+- nuke deadline submit with environ var from presets overrides [\#1142](https://github.com/pypeclub/pype/pull/1142)
+- Change timers after task change [\#1138](https://github.com/pypeclub/pype/pull/1138)
+- Nuke: shortcuts for Pype menu [\#1127](https://github.com/pypeclub/pype/pull/1127)
+- Nuke: workfile template [\#1124](https://github.com/pypeclub/pype/pull/1124)
+- Sites local settings by site name [\#1117](https://github.com/pypeclub/pype/pull/1117)
+- Reset loader's asset selection on context change [\#1106](https://github.com/pypeclub/pype/pull/1106)
+- Bulk mov render publishing [\#1101](https://github.com/pypeclub/pype/pull/1101)
+- Photoshop: mark publishable instances [\#1093](https://github.com/pypeclub/pype/pull/1093)
+- Added ability to define BG color for extract review [\#1088](https://github.com/pypeclub/pype/pull/1088)
+- TVPaint extractor enhancement [\#1080](https://github.com/pypeclub/pype/pull/1080)
+- Photoshop: added support for .psb in workfiles [\#1078](https://github.com/pypeclub/pype/pull/1078)
+- Optionally add task to subset name [\#1072](https://github.com/pypeclub/pype/pull/1072)
+- Only extend clip range when collecting. [\#1008](https://github.com/pypeclub/pype/pull/1008)
+- Collect audio for farm reviews. [\#1073](https://github.com/pypeclub/pype/pull/1073)
+
+
+**Fixed bugs:**
+
+- Fix path spaces in jpeg extractor [\#1174](https://github.com/pypeclub/pype/pull/1174)
+- Maya: Bugfix: superclass for CreateCameraRig [\#1166](https://github.com/pypeclub/pype/pull/1166)
+- Maya: Submit to Deadline - fix typo in condition [\#1163](https://github.com/pypeclub/pype/pull/1163)
+- Avoid dot in repre extension [\#1125](https://github.com/pypeclub/pype/pull/1125)
+- Fix versions variable usage in standalone publisher [\#1090](https://github.com/pypeclub/pype/pull/1090)
+- Collect instance data fix subset query [\#1082](https://github.com/pypeclub/pype/pull/1082)
+- Fix getting the camera name. [\#1067](https://github.com/pypeclub/pype/pull/1067)
+- Nuke: Ensure "NUKE\_TEMP\_DIR" is not part of the Deadline job environment. [\#1064](https://github.com/pypeclub/pype/pull/1064)
+
+## [2.15.3](https://github.com/pypeclub/pype/tree/2.15.3) (2021-02-26)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.2...2.15.3)
+
+**Enhancements:**
+
+- Maya: speedup renderable camera collection [\#1053](https://github.com/pypeclub/pype/pull/1053)
+- Harmony - add regex search to filter allowed task names for collectin… [\#1047](https://github.com/pypeclub/pype/pull/1047)
+
+**Fixed bugs:**
+
+- Ftrack integrate hierarchy fix [\#1085](https://github.com/pypeclub/pype/pull/1085)
+- Explicit subset filter in anatomy instance data [\#1059](https://github.com/pypeclub/pype/pull/1059)
+- TVPaint frame offset [\#1057](https://github.com/pypeclub/pype/pull/1057)
+- Auto fix unicode strings [\#1046](https://github.com/pypeclub/pype/pull/1046)
+
+## [2.15.2](https://github.com/pypeclub/pype/tree/2.15.2) (2021-02-19)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.1...2.15.2)
+
+**Enhancements:**
+
+- Maya: Vray scene publishing [\#1013](https://github.com/pypeclub/pype/pull/1013)
+
+**Fixed bugs:**
+
+- Fix entity move under project [\#1040](https://github.com/pypeclub/pype/pull/1040)
+- smaller nuke fixes from production [\#1036](https://github.com/pypeclub/pype/pull/1036)
+- TVPaint thumbnail extract fix [\#1031](https://github.com/pypeclub/pype/pull/1031)
+
+## [2.15.1](https://github.com/pypeclub/pype/tree/2.15.1) (2021-02-12)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.0...2.15.1)
+
+**Enhancements:**
+
+- Delete version as loader action [\#1011](https://github.com/pypeclub/pype/pull/1011)
+- Delete old versions [\#445](https://github.com/pypeclub/pype/pull/445)
+
+**Fixed bugs:**
+
+- PS - remove obsolete functions from pywin32 [\#1006](https://github.com/pypeclub/pype/pull/1006)
+- Clone description of review session objects. [\#922](https://github.com/pypeclub/pype/pull/922)
+
+## [2.15.0](https://github.com/pypeclub/pype/tree/2.15.0) (2021-02-09)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.6...2.15.0)
+
+**Enhancements:**
+
+- Resolve - loading and updating clips [\#932](https://github.com/pypeclub/pype/pull/932)
+- Release/2.15.0 [\#926](https://github.com/pypeclub/pype/pull/926)
+- Photoshop: add option for template.psd and prelaunch hook [\#894](https://github.com/pypeclub/pype/pull/894)
+- Nuke: deadline presets [\#993](https://github.com/pypeclub/pype/pull/993)
+- Maya: Alembic only set attributes that exist. [\#986](https://github.com/pypeclub/pype/pull/986)
+- Harmony: render local and handle fixes [\#981](https://github.com/pypeclub/pype/pull/981)
+- PSD Bulk export of ANIM group [\#965](https://github.com/pypeclub/pype/pull/965)
+- AE - added prelaunch hook for opening last or workfile from template [\#944](https://github.com/pypeclub/pype/pull/944)
+- PS - safer handling of loading of workfile [\#941](https://github.com/pypeclub/pype/pull/941)
+- Maya: Handling Arnold referenced AOVs [\#938](https://github.com/pypeclub/pype/pull/938)
+- TVPaint: switch layer IDs for layer names during identification [\#903](https://github.com/pypeclub/pype/pull/903)
+- TVPaint audio/sound loader [\#893](https://github.com/pypeclub/pype/pull/893)
+- Clone review session with children. [\#891](https://github.com/pypeclub/pype/pull/891)
+- Simple compositing data packager for freelancers [\#884](https://github.com/pypeclub/pype/pull/884)
+- Harmony deadline submission [\#881](https://github.com/pypeclub/pype/pull/881)
+- Maya: Optionally hide image planes from reviews. [\#840](https://github.com/pypeclub/pype/pull/840)
+- Maya: handle referenced AOVs for Vray [\#824](https://github.com/pypeclub/pype/pull/824)
+- DWAA/DWAB support on windows [\#795](https://github.com/pypeclub/pype/pull/795)
+- Unreal: animation, layout and setdress updates [\#695](https://github.com/pypeclub/pype/pull/695)
+
+**Fixed bugs:**
+
+- Maya: Looks - disable hardlinks [\#995](https://github.com/pypeclub/pype/pull/995)
+- Fix Ftrack custom attribute update [\#982](https://github.com/pypeclub/pype/pull/982)
+- Prores ks in burnin script [\#960](https://github.com/pypeclub/pype/pull/960)
+- terminal.py crash on import [\#839](https://github.com/pypeclub/pype/pull/839)
+- Extract review handle bizarre pixel aspect ratio [\#990](https://github.com/pypeclub/pype/pull/990)
+- Nuke: add nuke related env var to submission [\#988](https://github.com/pypeclub/pype/pull/988)
+- Nuke: missing preset's variable [\#984](https://github.com/pypeclub/pype/pull/984)
+- Get creator by name fix [\#979](https://github.com/pypeclub/pype/pull/979)
+- Fix update of project's tasks on Ftrack sync [\#972](https://github.com/pypeclub/pype/pull/972)
+- nuke: wrong frame offset in mov loader [\#971](https://github.com/pypeclub/pype/pull/971)
+- Create project structure action fix multiroot [\#967](https://github.com/pypeclub/pype/pull/967)
+- PS: remove pywin installation from hook [\#964](https://github.com/pypeclub/pype/pull/964)
+- Prores ks in burnin script [\#959](https://github.com/pypeclub/pype/pull/959)
+- Subset family is now stored in subset document [\#956](https://github.com/pypeclub/pype/pull/956)
+- DJV new version arguments [\#954](https://github.com/pypeclub/pype/pull/954)
+- TV Paint: Fix single frame Sequence [\#953](https://github.com/pypeclub/pype/pull/953)
+- nuke: missing `file` knob update [\#933](https://github.com/pypeclub/pype/pull/933)
+- Photoshop: Create from single layer was failing [\#920](https://github.com/pypeclub/pype/pull/920)
+- Nuke: baking mov with correct colorspace inherited from write [\#909](https://github.com/pypeclub/pype/pull/909)
+- Launcher fix actions discover [\#896](https://github.com/pypeclub/pype/pull/896)
+- Get the correct file path for the updated mov. [\#889](https://github.com/pypeclub/pype/pull/889)
+- Maya: Deadline submitter - shared data access violation [\#831](https://github.com/pypeclub/pype/pull/831)
+- Maya: Take into account vray master AOV switch [\#822](https://github.com/pypeclub/pype/pull/822)
+
+**Merged pull requests:**
+
+- Refactor blender to 3.0 format [\#934](https://github.com/pypeclub/pype/pull/934)
+
+## [2.14.6](https://github.com/pypeclub/pype/tree/2.14.6) (2021-01-15)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.5...2.14.6)
+
+**Fixed bugs:**
+
+- Nuke: improving of hashing path [\#885](https://github.com/pypeclub/pype/pull/885)
+
+**Merged pull requests:**
+
+- Hiero: cut videos with correct seconds [\#892](https://github.com/pypeclub/pype/pull/892)
+- Faster sync to avalon preparation [\#869](https://github.com/pypeclub/pype/pull/869)
+
+## [2.14.5](https://github.com/pypeclub/pype/tree/2.14.5) (2021-01-06)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.4...2.14.5)
+
+**Merged pull requests:**
+
+- Pype logger refactor [\#866](https://github.com/pypeclub/pype/pull/866)
+
+## [2.14.4](https://github.com/pypeclub/pype/tree/2.14.4) (2020-12-18)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.3...2.14.4)
+
+**Merged pull requests:**
+
+- Fix - AE - added explicit cast to int [\#837](https://github.com/pypeclub/pype/pull/837)
+
+## [2.14.3](https://github.com/pypeclub/pype/tree/2.14.3) (2020-12-16)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.2...2.14.3)
+
+**Fixed bugs:**
+
+- TVPaint repair invalid metadata [\#809](https://github.com/pypeclub/pype/pull/809)
+- Feature/push hier value to nonhier action [\#807](https://github.com/pypeclub/pype/pull/807)
+- Harmony: fix palette and image sequence loader [\#806](https://github.com/pypeclub/pype/pull/806)
+
+**Merged pull requests:**
+
+- respecting space in path [\#823](https://github.com/pypeclub/pype/pull/823)
+
+## [2.14.2](https://github.com/pypeclub/pype/tree/2.14.2) (2020-12-04)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.1...2.14.2)
+
+**Enhancements:**
+
+- Collapsible wrapper in settings [\#767](https://github.com/pypeclub/pype/pull/767)
+
+**Fixed bugs:**
+
+- Harmony: template extraction and palettes thumbnails on mac [\#768](https://github.com/pypeclub/pype/pull/768)
+- TVPaint store context to workfile metadata \(764\) [\#766](https://github.com/pypeclub/pype/pull/766)
+- Extract review audio cut fix [\#763](https://github.com/pypeclub/pype/pull/763)
+
+**Merged pull requests:**
+
+- AE: fix publish after background load [\#781](https://github.com/pypeclub/pype/pull/781)
+- TVPaint store members key [\#769](https://github.com/pypeclub/pype/pull/769)
+
+## [2.14.1](https://github.com/pypeclub/pype/tree/2.14.1) (2020-11-27)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.0...2.14.1)
+
+**Enhancements:**
+
+- Settings required keys in modifiable dict [\#770](https://github.com/pypeclub/pype/pull/770)
+- Extract review may not add audio to output [\#761](https://github.com/pypeclub/pype/pull/761)
+
+**Fixed bugs:**
+
+- After Effects: frame range, file format and render source scene fixes [\#760](https://github.com/pypeclub/pype/pull/760)
+- Hiero: trimming review with clip event number [\#754](https://github.com/pypeclub/pype/pull/754)
+- TVPaint: fix updating of loaded subsets [\#752](https://github.com/pypeclub/pype/pull/752)
+- Maya: Vray handling of default aov [\#748](https://github.com/pypeclub/pype/pull/748)
+- Maya: multiple renderable cameras in layer didn't work [\#744](https://github.com/pypeclub/pype/pull/744)
+- Ftrack integrate custom attributes fix [\#742](https://github.com/pypeclub/pype/pull/742)
+
+## [2.14.0](https://github.com/pypeclub/pype/tree/2.14.0) (2020-11-23)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.7...2.14.0)
+
+**Enhancements:**
+
+- Render publish plugins abstraction [\#687](https://github.com/pypeclub/pype/pull/687)
+- Shot asset build trigger status [\#736](https://github.com/pypeclub/pype/pull/736)
+- Maya: add camera rig publishing option [\#721](https://github.com/pypeclub/pype/pull/721)
+- Sort instances by label in pyblish gui [\#719](https://github.com/pypeclub/pype/pull/719)
+- Synchronize ftrack hierarchical and shot attributes [\#716](https://github.com/pypeclub/pype/pull/716)
+- 686 standalonepublisher editorial from image sequences [\#699](https://github.com/pypeclub/pype/pull/699)
+- Ask user to select non-default camera from scene or create a new. [\#678](https://github.com/pypeclub/pype/pull/678)
+- TVPaint: image loader with options [\#675](https://github.com/pypeclub/pype/pull/675)
+- Maya: Camera name can be added to burnins. [\#674](https://github.com/pypeclub/pype/pull/674)
+- After Effects: base integration with loaders [\#667](https://github.com/pypeclub/pype/pull/667)
+- Harmony: Javascript refactoring and overall stability improvements [\#666](https://github.com/pypeclub/pype/pull/666)
+
+**Fixed bugs:**
+
+- Bugfix Hiero Review / Plate representation publish [\#743](https://github.com/pypeclub/pype/pull/743)
+- Asset fetch second fix [\#726](https://github.com/pypeclub/pype/pull/726)
+- TVPaint extract review fix [\#740](https://github.com/pypeclub/pype/pull/740)
+- After Effects: Reviews were not being sent to ftrack [\#738](https://github.com/pypeclub/pype/pull/738)
+- Maya: vray proxy was not loading [\#722](https://github.com/pypeclub/pype/pull/722)
+- Maya: Vray expected file fixes [\#682](https://github.com/pypeclub/pype/pull/682)
+- Missing audio on farm submission. [\#639](https://github.com/pypeclub/pype/pull/639)
+
+**Deprecated:**
+
+- Removed artist view from pyblish gui [\#717](https://github.com/pypeclub/pype/pull/717)
+- Maya: disable legacy override check for cameras [\#715](https://github.com/pypeclub/pype/pull/715)
+
+**Merged pull requests:**
+
+- Application manager [\#728](https://github.com/pypeclub/pype/pull/728)
+- Feature \#664 3.0 lib refactor [\#706](https://github.com/pypeclub/pype/pull/706)
+- Lib from illicit part 2 [\#700](https://github.com/pypeclub/pype/pull/700)
+- 3.0 lib refactor - path tools [\#697](https://github.com/pypeclub/pype/pull/697)
+
+## [2.13.7](https://github.com/pypeclub/pype/tree/2.13.7) (2020-11-19)
+
+[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.6...2.13.7)
+
+**Fixed bugs:**
+
+- Standalone Publisher: getting fps from context instead of nonexistent entity [\#729](https://github.com/pypeclub/pype/pull/729)
+
# Changelog
## [2.13.6](https://github.com/pypeclub/pype/tree/2.13.6) (2020-11-15)
@@ -789,4 +1054,7 @@ A large cleanup release. Most of the change are under the hood.
- _(avalon)_ subsets in maya 2019 weren't behaving correctly in the outliner
\* *This Changelog was automatically generated by [github_changelog_generator](https://github.com/github-changelog-generator/github-changelog-generator)*
diff --git a/openpype/hosts/nuke/plugins/publish/precollect_instances.py b/openpype/hosts/nuke/plugins/publish/precollect_instances.py
index 2d25b29826..92f96ea48d 100644
--- a/openpype/hosts/nuke/plugins/publish/precollect_instances.py
+++ b/openpype/hosts/nuke/plugins/publish/precollect_instances.py
@@ -80,25 +80,31 @@ class PreCollectNukeInstances(pyblish.api.ContextPlugin):
# Add all nodes in group instances.
if node.Class() == "Group":
- # check if it is write node in family
- if "write" in families:
+ # only alter families for render family
+ if "write" in families_ak:
target = node["render"].value()
if target == "Use existing frames":
# Local rendering
self.log.info("flagged for no render")
- families.append("render")
+ families.append(family)
elif target == "Local":
# Local rendering
self.log.info("flagged for local render")
- families.append("{}.local".format("render"))
+ families.append("{}.local".format(family))
elif target == "On farm":
# Farm rendering
self.log.info("flagged for farm render")
instance.data["transfer"] = False
- families.append("{}.farm".format("render"))
+ families.append("{}.farm".format(family))
+
+            # shuffle family to `write` as it is the main family
+ # this will be changed later on in process
if "render" in families:
families.remove("render")
family = "write"
+ elif "prerender" in families:
+ families.remove("prerender")
+ family = "write"
node.begin()
for i in nuke.allNodes():
diff --git a/openpype/hosts/nuke/plugins/publish/precollect_writes.py b/openpype/hosts/nuke/plugins/publish/precollect_writes.py
index a519609f52..57303bd42e 100644
--- a/openpype/hosts/nuke/plugins/publish/precollect_writes.py
+++ b/openpype/hosts/nuke/plugins/publish/precollect_writes.py
@@ -108,6 +108,8 @@ class CollectNukeWrites(pyblish.api.InstancePlugin):
# Add version data to instance
version_data = {
+ "families": [f.replace(".local", "").replace(".farm", "")
+ for f in families if "write" not in f],
"colorspace": node["colorspace"].value(),
}
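The added `families` entry strips the `.local`/`.farm` render-target suffixes and drops the `write` family itself. A standalone sketch of that comprehension, with hypothetical family values:

```python
# Hypothetical input mimicking instance families collected for a write node.
families = ["write", "render.local", "prerender.farm"]

# Same suffix-stripping and "write" exclusion as in version_data above.
clean = [f.replace(".local", "").replace(".farm", "")
         for f in families if "write" not in f]

assert clean == ["render", "prerender"]
```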
diff --git a/openpype/hosts/tvpaint/plugins/publish/collect_instances.py b/openpype/hosts/tvpaint/plugins/publish/collect_instances.py
index e03833b96b..6353715a76 100644
--- a/openpype/hosts/tvpaint/plugins/publish/collect_instances.py
+++ b/openpype/hosts/tvpaint/plugins/publish/collect_instances.py
@@ -18,7 +18,7 @@ class CollectInstances(pyblish.api.ContextPlugin):
))
for instance_data in workfile_instances:
- instance_data["fps"] = context.data["fps"]
+ instance_data["fps"] = context.data["sceneFps"]
# Store workfile instance data to instance data
instance_data["originData"] = copy.deepcopy(instance_data)
@@ -32,6 +32,11 @@ class CollectInstances(pyblish.api.ContextPlugin):
subset_name = instance_data["subset"]
name = instance_data.get("name", subset_name)
instance_data["name"] = name
+ instance_data["label"] = "{} [{}-{}]".format(
+ name,
+ context.data["sceneFrameStart"],
+ context.data["sceneFrameEnd"]
+ )
active = instance_data.get("active", True)
instance_data["active"] = active
diff --git a/openpype/hosts/tvpaint/plugins/publish/collect_workfile_data.py b/openpype/hosts/tvpaint/plugins/publish/collect_workfile_data.py
index af1dd46594..95add0b79c 100644
--- a/openpype/hosts/tvpaint/plugins/publish/collect_workfile_data.py
+++ b/openpype/hosts/tvpaint/plugins/publish/collect_workfile_data.py
@@ -141,11 +141,11 @@ class CollectWorkfileData(pyblish.api.ContextPlugin):
"currentFile": workfile_path,
"sceneWidth": width,
"sceneHeight": height,
- "pixelAspect": pixel_apsect,
- "frameStart": frame_start,
- "frameEnd": frame_end,
- "fps": frame_rate,
- "fieldOrder": field_order
+ "scenePixelAspect": pixel_apsect,
+ "sceneFrameStart": frame_start,
+ "sceneFrameEnd": frame_end,
+ "sceneFps": frame_rate,
+ "sceneFieldOrder": field_order
}
self.log.debug(
"Scene data: {}".format(json.dumps(scene_data, indent=4))
diff --git a/openpype/hosts/tvpaint/plugins/publish/validate_project_settings.py b/openpype/hosts/tvpaint/plugins/publish/validate_project_settings.py
new file mode 100644
index 0000000000..fead3393ae
--- /dev/null
+++ b/openpype/hosts/tvpaint/plugins/publish/validate_project_settings.py
@@ -0,0 +1,36 @@
+import json
+
+import pyblish.api
+
+
+class ValidateProjectSettings(pyblish.api.ContextPlugin):
+    """Validate project settings against database."""
+
+ label = "Validate Project Settings"
+ order = pyblish.api.ValidatorOrder
+ optional = True
+
+ def process(self, context):
+ scene_data = {
+ "frameStart": context.data.get("sceneFrameStart"),
+ "frameEnd": context.data.get("sceneFrameEnd"),
+ "fps": context.data.get("sceneFps"),
+ "resolutionWidth": context.data.get("sceneWidth"),
+ "resolutionHeight": context.data.get("sceneHeight"),
+ "pixelAspect": context.data.get("scenePixelAspect")
+ }
+ invalid = {}
+ for k in scene_data.keys():
+ expected_value = context.data["assetEntity"]["data"][k]
+ if scene_data[k] != expected_value:
+ invalid[k] = {
+ "current": scene_data[k], "expected": expected_value
+ }
+
+ if invalid:
+ raise AssertionError(
+                "Project settings do not match database:\n{}".format(
+ json.dumps(invalid, sort_keys=True, indent=4)
+ )
+ )
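The validator above collects every mismatching key before raising, so the artist sees all deviations at once. The compare-and-collect pattern in isolation, with made-up keys and values:

```python
# Hypothetical scene values vs. expected database values; numbers are made up.
scene_data = {"fps": 25, "frameStart": 1001, "frameEnd": 1100}
expected = {"fps": 24, "frameStart": 1001, "frameEnd": 1100}

# Collect every key whose current value deviates from the expected one.
invalid = {
    key: {"current": value, "expected": expected[key]}
    for key, value in scene_data.items()
    if value != expected[key]
}

assert invalid == {"fps": {"current": 25, "expected": 24}}
```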
diff --git a/openpype/modules/__init__.py b/openpype/modules/__init__.py
index 4b120647e1..d7c6d99fe6 100644
--- a/openpype/modules/__init__.py
+++ b/openpype/modules/__init__.py
@@ -41,7 +41,7 @@ from .log_viewer import LogViewModule
from .muster import MusterModule
from .deadline import DeadlineModule
from .standalonepublish_action import StandAlonePublishAction
-from .sync_server import SyncServer
+from .sync_server import SyncServerModule
__all__ = (
@@ -82,5 +82,5 @@ __all__ = (
"DeadlineModule",
"StandAlonePublishAction",
- "SyncServer"
+ "SyncServerModule"
)
diff --git a/openpype/modules/deadline/plugins/publish/submit_publish_job.py b/openpype/modules/deadline/plugins/publish/submit_publish_job.py
index a2e21fb766..ea953441a2 100644
--- a/openpype/modules/deadline/plugins/publish/submit_publish_job.py
+++ b/openpype/modules/deadline/plugins/publish/submit_publish_job.py
@@ -102,7 +102,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
hosts = ["fusion", "maya", "nuke", "celaction", "aftereffects", "harmony"]
- families = ["render.farm", "prerender",
+ families = ["render.farm", "prerender.farm",
"renderlayer", "imagesequence", "vrayscene"]
aov_filter = {"maya": [r".+(?:\.|_)([Bb]eauty)(?:\.|_).*"],
diff --git a/openpype/modules/ftrack/event_handlers_server/action_prepare_project.py b/openpype/modules/ftrack/event_handlers_server/action_prepare_project.py
new file mode 100644
index 0000000000..8248bf532e
--- /dev/null
+++ b/openpype/modules/ftrack/event_handlers_server/action_prepare_project.py
@@ -0,0 +1,365 @@
+import json
+
+from openpype.api import ProjectSettings
+
+from openpype.modules.ftrack.lib import ServerAction
+from openpype.modules.ftrack.lib.avalon_sync import (
+ get_pype_attr,
+ CUST_ATTR_AUTO_SYNC
+)
+
+
+class PrepareProjectServer(ServerAction):
+ """Prepare project attributes in Anatomy."""
+
+ identifier = "prepare.project.server"
+ label = "OpenPype Admin"
+ variant = "- Prepare Project (Server)"
+ description = "Set basic attributes on the project"
+
+ settings_key = "prepare_project"
+
+ role_list = ["Pypeclub", "Administrator", "Project Manager"]
+
+ item_splitter = {"type": "label", "value": "---"}
+
+ def discover(self, session, entities, event):
+ """Show only on project."""
+ if (
+ len(entities) != 1
+ or entities[0].entity_type.lower() != "project"
+ ):
+ return False
+
+ return self.valid_roles(session, entities, event)
+
+ def interface(self, session, entities, event):
+ if event['data'].get('values', {}):
+ return
+
+ # Inform user that this may take a while
+ self.show_message(event, "Preparing data... Please wait", True)
+ self.log.debug("Preparing data which will be shown")
+
+ self.log.debug("Loading custom attributes")
+
+ project_entity = entities[0]
+ project_name = project_entity["full_name"]
+
+ try:
+ project_settings = ProjectSettings(project_name)
+ except ValueError:
+ return {
+ "message": "Project is not synchronized yet",
+ "success": False
+ }
+
+ project_anatom_settings = project_settings["project_anatomy"]
+ root_items = self.prepare_root_items(project_anatom_settings)
+
+ ca_items, multiselect_enumerators = (
+ self.prepare_custom_attribute_items(project_anatom_settings)
+ )
+
+ self.log.debug("Heavy items are ready. Preparing last items group.")
+
+ title = "Prepare Project"
+ items = []
+
+ # Add root items
+ items.extend(root_items)
+
+ items.append(self.item_splitter)
+ items.append({
+ "type": "label",
+            "value": "Set basic Attributes:"
+ })
+
+ items.extend(ca_items)
+
+ # This item will be last (before enumerators)
+ # - sets value of auto synchronization
+ auto_sync_name = "avalon_auto_sync"
+ auto_sync_value = project_entity["custom_attributes"].get(
+ CUST_ATTR_AUTO_SYNC, False
+ )
+ auto_sync_item = {
+ "name": auto_sync_name,
+ "type": "boolean",
+ "value": auto_sync_value,
+ "label": "AutoSync to Avalon"
+ }
+ # Add autosync attribute
+ items.append(auto_sync_item)
+
+ # Add enumerator items at the end
+ for item in multiselect_enumerators:
+ items.append(item)
+
+ return {
+ "items": items,
+ "title": title
+ }
+
+ def prepare_root_items(self, project_anatom_settings):
+ self.log.debug("Root items preparation begins.")
+
+ root_items = []
+ root_items.append({
+ "type": "label",
+            "value": "Check your Project root settings"
+ })
+ root_items.append({
+ "type": "label",
+ "value": (
+                "NOTE: Roots are crucial for path filling"
+                " (and creating folder structure)."
+ )
+ })
+ root_items.append({
+ "type": "label",
+ "value": (
+ "WARNING: Do not change roots on running project,"
+                " that will cause workflow issues."
+ )
+ })
+
+ empty_text = "Enter root path here..."
+
+ roots_entity = project_anatom_settings["roots"]
+ for root_name, root_entity in roots_entity.items():
+ root_items.append(self.item_splitter)
+ root_items.append({
+ "type": "label",
+ "value": "Root: \"{}\"".format(root_name)
+ })
+ for platform_name, value_entity in root_entity.items():
+ root_items.append({
+ "label": platform_name,
+ "name": "__root__{}__{}".format(root_name, platform_name),
+ "type": "text",
+ "value": value_entity.value,
+ "empty_text": empty_text
+ })
+
+ root_items.append({
+ "type": "hidden",
+ "name": "__rootnames__",
+ "value": json.dumps(list(roots_entity.keys()))
+ })
+
+ self.log.debug("Root items preparation ended.")
+ return root_items
+
+ def _attributes_to_set(self, project_anatom_settings):
+ attributes_to_set = {}
+
+ attribute_values_by_key = {}
+ for key, entity in project_anatom_settings["attributes"].items():
+ attribute_values_by_key[key] = entity.value
+
+ cust_attrs, hier_cust_attrs = get_pype_attr(self.session, True)
+
+ for attr in hier_cust_attrs:
+ key = attr["key"]
+ if key.startswith("avalon_"):
+ continue
+ attributes_to_set[key] = {
+ "label": attr["label"],
+ "object": attr,
+ "default": attribute_values_by_key.get(key)
+ }
+
+ for attr in cust_attrs:
+ if attr["entity_type"].lower() != "show":
+ continue
+ key = attr["key"]
+ if key.startswith("avalon_"):
+ continue
+ attributes_to_set[key] = {
+ "label": attr["label"],
+ "object": attr,
+ "default": attribute_values_by_key.get(key)
+ }
+
+ # Sort by label
+ attributes_to_set = dict(sorted(
+ attributes_to_set.items(),
+ key=lambda x: x[1]["label"]
+ ))
+ return attributes_to_set
+
+ def prepare_custom_attribute_items(self, project_anatom_settings):
+ items = []
+ multiselect_enumerators = []
+ attributes_to_set = self._attributes_to_set(project_anatom_settings)
+
+ self.log.debug("Preparing interface for keys: \"{}\"".format(
+ str([key for key in attributes_to_set])
+ ))
+
+ for key, in_data in attributes_to_set.items():
+ attr = in_data["object"]
+
+ # initial item definition
+ item = {
+ "name": key,
+ "label": in_data["label"]
+ }
+
+ # cust attr type - may have different visualization
+ type_name = attr["type"]["name"].lower()
+ easy_types = ["text", "boolean", "date", "number"]
+
+ easy_type = False
+ if type_name in easy_types:
+ easy_type = True
+
+ elif type_name == "enumerator":
+
+ attr_config = json.loads(attr["config"])
+ attr_config_data = json.loads(attr_config["data"])
+
+ if attr_config["multiSelect"] is True:
+ multiselect_enumerators.append(self.item_splitter)
+ multiselect_enumerators.append({
+ "type": "label",
+ "value": in_data["label"]
+ })
+
+ default = in_data["default"]
+ names = []
+ for option in sorted(
+ attr_config_data, key=lambda x: x["menu"]
+ ):
+ name = option["value"]
+ new_name = "__{}__{}".format(key, name)
+ names.append(new_name)
+ item = {
+ "name": new_name,
+ "type": "boolean",
+ "label": "- {}".format(option["menu"])
+ }
+ if default:
+ if isinstance(default, (list, tuple)):
+ if name in default:
+ item["value"] = True
+ else:
+ if name == default:
+ item["value"] = True
+
+ multiselect_enumerators.append(item)
+
+ multiselect_enumerators.append({
+ "type": "hidden",
+ "name": "__hidden__{}".format(key),
+ "value": json.dumps(names)
+ })
+ else:
+ easy_type = True
+ item["data"] = attr_config_data
+
+ else:
+                self.log.warning((
+                    "Custom attribute \"{}\" has type \"{}\""
+                    " which is not handled."
+                ).format(key, type_name))
+ items.append({
+ "type": "label",
+ "value": (
+                        "!!! Can't handle Custom attribute type \"{}\""
+ " (key: \"{}\")"
+ ).format(type_name, key)
+ })
+
+ if easy_type:
+ item["type"] = type_name
+
+ # default value in interface
+ default = in_data["default"]
+ if default is not None:
+ item["value"] = default
+
+ items.append(item)
+
+ return items, multiselect_enumerators
+
+ def launch(self, session, entities, event):
+ if not event['data'].get('values', {}):
+ return
+
+ in_data = event['data']['values']
+
+ root_values = {}
+ root_key = "__root__"
+ for key in tuple(in_data.keys()):
+ if key.startswith(root_key):
+ _key = key[len(root_key):]
+ root_values[_key] = in_data.pop(key)
+
+ root_names = in_data.pop("__rootnames__", None)
+ root_data = {}
+ for root_name in json.loads(root_names):
+ root_data[root_name] = {}
+ for key, value in tuple(root_values.items()):
+ prefix = "{}__".format(root_name)
+ if not key.startswith(prefix):
+ continue
+
+ _key = key[len(prefix):]
+ root_data[root_name][_key] = value
+
+ # Find hidden items for multiselect enumerators
+ keys_to_process = []
+ for key in in_data:
+ if key.startswith("__hidden__"):
+ keys_to_process.append(key)
+
+ self.log.debug("Preparing data for Multiselect Enumerators")
+ enumerators = {}
+ for key in keys_to_process:
+ new_key = key.replace("__hidden__", "")
+ enumerator_items = in_data.pop(key)
+ enumerators[new_key] = json.loads(enumerator_items)
+
+ # find values set for multiselect enumerator
+ for key, enumerator_items in enumerators.items():
+ in_data[key] = []
+
+ name = "__{}__".format(key)
+
+ for item in enumerator_items:
+ value = in_data.pop(item)
+ if value is True:
+ new_key = item.replace(name, "")
+ in_data[key].append(new_key)
+
+ self.log.debug("Setting Custom Attribute values")
+
+ project_name = entities[0]["full_name"]
+ project_settings = ProjectSettings(project_name)
+ project_anatomy_settings = project_settings["project_anatomy"]
+ project_anatomy_settings["roots"] = root_data
+
+ custom_attribute_values = {}
+ attributes_entity = project_anatomy_settings["attributes"]
+ for key, value in in_data.items():
+ if key not in attributes_entity:
+ custom_attribute_values[key] = value
+ else:
+ attributes_entity[key] = value
+
+ project_settings.save()
+
+ entity = entities[0]
+ for key, value in custom_attribute_values.items():
+ entity["custom_attributes"][key] = value
+ self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
+
+ return True
+
+
+def register(session):
+    """Register plugin. Called when used as a plugin."""
+ PrepareProjectServer(session).register()
diff --git a/openpype/modules/ftrack/event_handlers_user/action_prepare_project.py b/openpype/modules/ftrack/event_handlers_user/action_prepare_project.py
index 7f674310fc..bd25f995fe 100644
--- a/openpype/modules/ftrack/event_handlers_user/action_prepare_project.py
+++ b/openpype/modules/ftrack/event_handlers_user/action_prepare_project.py
@@ -1,31 +1,34 @@
-import os
import json
-from openpype.modules.ftrack.lib import BaseAction, statics_icon
-from openpype.api import config, Anatomy
-from openpype.modules.ftrack.lib.avalon_sync import get_pype_attr
+from openpype.api import ProjectSettings
+
+from openpype.modules.ftrack.lib import (
+ BaseAction,
+ statics_icon
+)
+from openpype.modules.ftrack.lib.avalon_sync import (
+ get_pype_attr,
+ CUST_ATTR_AUTO_SYNC
+)
-class PrepareProject(BaseAction):
- '''Edit meta data action.'''
+class PrepareProjectLocal(BaseAction):
+ """Prepare project attributes in Anatomy."""
- #: Action identifier.
- identifier = 'prepare.project'
- #: Action label.
- label = 'Prepare Project'
- #: Action description.
- description = 'Set basic attributes on the project'
- #: roles that are allowed to register this action
+ identifier = "prepare.project.local"
+ label = "Prepare Project"
+ description = "Set basic attributes on the project"
icon = statics_icon("ftrack", "action_icons", "PrepareProject.svg")
+ role_list = ["Pypeclub", "Administrator", "Project Manager"]
+
settings_key = "prepare_project"
    # Key to store info about triggering create folder structure
- create_project_structure_key = "create_folder_structure"
- item_splitter = {'type': 'label', 'value': '---'}
+ item_splitter = {"type": "label", "value": "---"}
def discover(self, session, entities, event):
- ''' Validation '''
+ """Show only on project."""
if (
len(entities) != 1
or entities[0].entity_type.lower() != "project"
@@ -44,27 +47,22 @@ class PrepareProject(BaseAction):
self.log.debug("Loading custom attributes")
- project_name = entities[0]["full_name"]
+ project_entity = entities[0]
+ project_name = project_entity["full_name"]
- project_defaults = (
- config.get_presets(project_name)
- .get("ftrack", {})
- .get("project_defaults", {})
- )
-
- anatomy = Anatomy(project_name)
- if not anatomy.roots:
+ try:
+ project_settings = ProjectSettings(project_name)
+ except ValueError:
return {
- "success": False,
- "message": (
- "Have issues with loading Roots for project \"{}\"."
- ).format(anatomy.project_name)
+ "message": "Project is not synchronized yet",
+ "success": False
}
- root_items = self.prepare_root_items(anatomy)
+ project_anatom_settings = project_settings["project_anatomy"]
+ root_items = self.prepare_root_items(project_anatom_settings)
ca_items, multiselect_enumerators = (
- self.prepare_custom_attribute_items(project_defaults)
+ self.prepare_custom_attribute_items(project_anatom_settings)
)
self.log.debug("Heavy items are ready. Preparing last items group.")
@@ -74,19 +72,6 @@ class PrepareProject(BaseAction):
# Add root items
items.extend(root_items)
- items.append(self.item_splitter)
-
- # Ask if want to trigger Action Create Folder Structure
- items.append({
- "type": "label",
-            "value": "Want to create basic Folder Structure?"
- })
- items.append({
- "name": self.create_project_structure_key,
- "type": "boolean",
- "value": False,
- "label": "Check if Yes"
- })
items.append(self.item_splitter)
items.append({
@@ -99,10 +84,13 @@ class PrepareProject(BaseAction):
# This item will be last (before enumerators)
# - sets value of auto synchronization
auto_sync_name = "avalon_auto_sync"
+ auto_sync_value = project_entity["custom_attributes"].get(
+ CUST_ATTR_AUTO_SYNC, False
+ )
auto_sync_item = {
"name": auto_sync_name,
"type": "boolean",
- "value": project_defaults.get(auto_sync_name, False),
+ "value": auto_sync_value,
"label": "AutoSync to Avalon"
}
# Add autosync attribute
@@ -117,13 +105,10 @@ class PrepareProject(BaseAction):
"title": title
}
- def prepare_root_items(self, anatomy):
- root_items = []
+ def prepare_root_items(self, project_anatom_settings):
self.log.debug("Root items preparation begins.")
- root_names = anatomy.root_names()
- roots = anatomy.roots
-
+ root_items = []
root_items.append({
"type": "label",
            "value": "Check your Project root settings"
@@ -143,85 +128,40 @@ class PrepareProject(BaseAction):
)
})
- default_roots = anatomy.roots
- while isinstance(default_roots, dict):
- key = tuple(default_roots.keys())[0]
- default_roots = default_roots[key]
-
empty_text = "Enter root path here..."
- # Root names is None when anatomy templates contain "{root}"
- all_platforms = ["windows", "linux", "darwin"]
- if root_names is None:
- root_items.append(self.item_splitter)
- # find first possible key
- for platform in all_platforms:
- value = default_roots.raw_data.get(platform) or ""
- root_items.append({
- "label": platform,
- "name": "__root__{}".format(platform),
- "type": "text",
- "value": value,
- "empty_text": empty_text
- })
- return root_items
-
- root_name_data = {}
- missing_roots = []
- for root_name in root_names:
- root_name_data[root_name] = {}
- if not isinstance(roots, dict):
- missing_roots.append(root_name)
- continue
-
- root_item = roots.get(root_name)
- if not root_item:
- missing_roots.append(root_name)
- continue
-
- for platform in all_platforms:
- root_name_data[root_name][platform] = (
- root_item.raw_data.get(platform) or ""
- )
-
- if missing_roots:
- default_values = {}
- for platform in all_platforms:
- default_values[platform] = (
- default_roots.raw_data.get(platform) or ""
- )
-
- for root_name in missing_roots:
- root_name_data[root_name] = default_values
-
- root_names = list(root_name_data.keys())
- root_items.append({
- "type": "hidden",
- "name": "__rootnames__",
- "value": json.dumps(root_names)
- })
-
- for root_name, values in root_name_data.items():
+ roots_entity = project_anatom_settings["roots"]
+ for root_name, root_entity in roots_entity.items():
root_items.append(self.item_splitter)
root_items.append({
"type": "label",
"value": "Root: \"{}\"".format(root_name)
})
- for platform, value in values.items():
+ for platform_name, value_entity in root_entity.items():
root_items.append({
- "label": platform,
- "name": "__root__{}{}".format(root_name, platform),
+ "label": platform_name,
+ "name": "__root__{}__{}".format(root_name, platform_name),
"type": "text",
- "value": value,
+ "value": value_entity.value,
"empty_text": empty_text
})
+ root_items.append({
+ "type": "hidden",
+ "name": "__rootnames__",
+ "value": json.dumps(list(roots_entity.keys()))
+ })
+
self.log.debug("Root items preparation ended.")
return root_items
- def _attributes_to_set(self, project_defaults):
+ def _attributes_to_set(self, project_anatom_settings):
attributes_to_set = {}
+ attribute_values_by_key = {}
+ for key, entity in project_anatom_settings["attributes"].items():
+ attribute_values_by_key[key] = entity.value
+
cust_attrs, hier_cust_attrs = get_pype_attr(self.session, True)
for attr in hier_cust_attrs:
@@ -231,7 +171,7 @@ class PrepareProject(BaseAction):
attributes_to_set[key] = {
"label": attr["label"],
"object": attr,
- "default": project_defaults.get(key)
+ "default": attribute_values_by_key.get(key)
}
for attr in cust_attrs:
@@ -243,7 +183,7 @@ class PrepareProject(BaseAction):
attributes_to_set[key] = {
"label": attr["label"],
"object": attr,
- "default": project_defaults.get(key)
+ "default": attribute_values_by_key.get(key)
}
# Sort by label
@@ -253,10 +193,10 @@ class PrepareProject(BaseAction):
))
return attributes_to_set
- def prepare_custom_attribute_items(self, project_defaults):
+ def prepare_custom_attribute_items(self, project_anatom_settings):
items = []
multiselect_enumerators = []
- attributes_to_set = self._attributes_to_set(project_defaults)
+ attributes_to_set = self._attributes_to_set(project_anatom_settings)
self.log.debug("Preparing interface for keys: \"{}\"".format(
str([key for key in attributes_to_set])
@@ -363,24 +303,15 @@ class PrepareProject(BaseAction):
root_names = in_data.pop("__rootnames__", None)
root_data = {}
- if root_names:
- for root_name in json.loads(root_names):
- root_data[root_name] = {}
- for key, value in tuple(root_values.items()):
- if key.startswith(root_name):
- _key = key[len(root_name):]
- root_data[root_name][_key] = value
+ for root_name in json.loads(root_names):
+ root_data[root_name] = {}
+ for key, value in tuple(root_values.items()):
+ prefix = "{}__".format(root_name)
+ if not key.startswith(prefix):
+ continue
- else:
- for key, value in root_values.items():
- root_data[key] = value
-
- # TODO implement creating of anatomy for new projects
- # project_name = entities[0]["full_name"]
- # anatomy = Anatomy(project_name)
-
- # pop out info about creating project structure
- create_proj_struct = in_data.pop(self.create_project_structure_key)
+ _key = key[len(prefix):]
+ root_data[root_name][_key] = value
# Find hidden items for multiselect enumerators
keys_to_process = []
@@ -407,54 +338,31 @@ class PrepareProject(BaseAction):
new_key = item.replace(name, "")
in_data[key].append(new_key)
- self.log.debug("Setting Custom Attribute values:")
- entity = entities[0]
+ self.log.debug("Setting Custom Attribute values")
+
+ project_name = entities[0]["full_name"]
+ project_settings = ProjectSettings(project_name)
+ project_anatomy_settings = project_settings["project_anatomy"]
+ project_anatomy_settings["roots"] = root_data
+
+ custom_attribute_values = {}
+ attributes_entity = project_anatomy_settings["attributes"]
for key, value in in_data.items():
+ if key not in attributes_entity:
+ custom_attribute_values[key] = value
+ else:
+ attributes_entity[key] = value
+
+ project_settings.save()
+
+ entity = entities[0]
+ for key, value in custom_attribute_values.items():
entity["custom_attributes"][key] = value
self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
- session.commit()
-
- # Create project structure
- self.create_project_specific_config(entities[0]["full_name"], in_data)
-
- # Trigger Create Project Structure action
- if create_proj_struct is True:
- self.trigger_action("create.project.structure", event)
-
return True
- def create_project_specific_config(self, project_name, json_data):
- self.log.debug("*** Creating project specifig configs ***")
- project_specific_path = project_overrides_dir_path(project_name)
- if not os.path.exists(project_specific_path):
- os.makedirs(project_specific_path)
- self.log.debug((
- "Project specific config folder for project \"{}\" created."
- ).format(project_name))
-
- # Presets ####################################
- self.log.debug("--- Processing Presets Begins: ---")
-
- project_defaults_dir = os.path.normpath(os.path.join(
- project_specific_path, "presets", "ftrack"
- ))
- project_defaults_path = os.path.normpath(os.path.join(
- project_defaults_dir, "project_defaults.json"
- ))
- # Create folder if not exist
- if not os.path.exists(project_defaults_dir):
- self.log.debug("Creating Ftrack Presets folder: \"{}\"".format(
- project_defaults_dir
- ))
- os.makedirs(project_defaults_dir)
-
- with open(project_defaults_path, 'w') as file_stream:
- json.dump(json_data, file_stream, indent=4)
-
- self.log.debug("*** Creating project specifig configs Finished ***")
-
def register(session):
'''Register plugin. Called when used as an plugin.'''
- PrepareProject(session).register()
+ PrepareProjectLocal(session).register()
diff --git a/openpype/modules/sync_server/__init__.py b/openpype/modules/sync_server/__init__.py
index 7123536fcf..a814f0db62 100644
--- a/openpype/modules/sync_server/__init__.py
+++ b/openpype/modules/sync_server/__init__.py
@@ -1,5 +1,5 @@
-from openpype.modules.sync_server.sync_server import SyncServer
+from openpype.modules.sync_server.sync_server_module import SyncServerModule
def tray_init(tray_widget, main_widget):
- return SyncServer()
+ return SyncServerModule()
diff --git a/openpype/modules/sync_server/providers/__init__.py b/openpype/modules/sync_server/providers/__init__.py
new file mode 100644
index 0000000000..e69de29bb2
diff --git a/openpype/modules/sync_server/providers/abstract_provider.py b/openpype/modules/sync_server/providers/abstract_provider.py
index 001d4c4d50..a60595ba93 100644
--- a/openpype/modules/sync_server/providers/abstract_provider.py
+++ b/openpype/modules/sync_server/providers/abstract_provider.py
@@ -1,16 +1,23 @@
-from abc import ABCMeta, abstractmethod
+import abc
+import six
+from openpype.api import Logger
+
+log = Logger().get_logger("SyncServer")
-class AbstractProvider(metaclass=ABCMeta):
+@six.add_metaclass(abc.ABCMeta)
+class AbstractProvider:
- def __init__(self, site_name, tree=None, presets=None):
+ def __init__(self, project_name, site_name, tree=None, presets=None):
self.presets = None
self.active = False
self.site_name = site_name
self.presets = presets
- @abstractmethod
+ super(AbstractProvider, self).__init__()
+
+ @abc.abstractmethod
def is_active(self):
"""
Returns True if provider is activated, eg. has working credentials.
@@ -18,36 +25,54 @@ class AbstractProvider(metaclass=ABCMeta):
(boolean)
"""
- @abstractmethod
- def upload_file(self, source_path, target_path, overwrite=True):
+ @abc.abstractmethod
+ def upload_file(self, source_path, path,
+ server, collection, file, representation, site,
+ overwrite=False):
"""
Copy file from 'source_path' to 'target_path' on provider.
Use 'overwrite' boolean to rewrite existing file on provider
Args:
- source_path (string): absolute path on local system
- target_path (string): absolute path on provider (GDrive etc.)
- overwrite (boolean): True if overwite existing
+            source_path (string): absolute path on local system
+ path (string): absolute path with or without name of the file
+ overwrite (boolean): replace existing file
+
+ arguments for saving progress:
+ server (SyncServer): server instance to call update_db on
+ collection (str): name of collection
+ file (dict): info about uploaded file (matches structure from db)
+ representation (dict): complete repre containing 'file'
+ site (str): site name
Returns:
(string) file_id of created file, raises exception
"""
pass
- @abstractmethod
- def download_file(self, source_path, local_path, overwrite=True):
+ @abc.abstractmethod
+ def download_file(self, source_path, local_path,
+ server, collection, file, representation, site,
+ overwrite=False):
"""
Download file from provider into local system
Args:
source_path (string): absolute path on provider
- local_path (string): absolute path on local
- overwrite (bool): default set to True
+ local_path (string): absolute path with or without name of the file
+ overwrite (boolean): replace existing file
+
+ arguments for saving progress:
+ server (SyncServer): server instance to call update_db on
+ collection (str): name of collection
+ file (dict): info about uploaded file (matches structure from db)
+ representation (dict): complete repre containing 'file'
+ site (str): site name
Returns:
None
"""
pass
- @abstractmethod
+ @abc.abstractmethod
def delete_file(self, path):
"""
Deletes file from 'path'. Expects path to specific file.
@@ -60,7 +85,7 @@ class AbstractProvider(metaclass=ABCMeta):
"""
pass
- @abstractmethod
+ @abc.abstractmethod
def list_folder(self, folder_path):
"""
List all files and subfolders of particular path non-recursively.
@@ -72,7 +97,7 @@ class AbstractProvider(metaclass=ABCMeta):
"""
pass
- @abstractmethod
+ @abc.abstractmethod
def create_folder(self, folder_path):
"""
Create all nonexistent folders and subfolders in 'path'.
@@ -85,7 +110,7 @@ class AbstractProvider(metaclass=ABCMeta):
"""
pass
- @abstractmethod
+ @abc.abstractmethod
def get_tree(self):
"""
Creates folder structure for providers which do not provide
@@ -94,16 +119,50 @@ class AbstractProvider(metaclass=ABCMeta):
"""
pass
- @abstractmethod
- def resolve_path(self, path, root_config, anatomy=None):
+ @abc.abstractmethod
+ def get_roots_config(self, anatomy=None):
"""
- Replaces root placeholders with appropriate real value from
- 'root_configs' (from Settings or Local Settings) or Anatomy
- (mainly for 'studio' site)
+ Returns root values for path resolving
- Args:
- path(string): path with '{root[work]}/...'
- root_config(dict): from Settings or Local Settings
- anatomy (Anatomy): prepared anatomy object for project
+ Takes value from Anatomy which takes values from Settings
+ overridden by Local Settings
+
+ Returns:
+ (dict) - {"root": {"root": "/My Drive"}}
+ OR
+ {"root": {"root_ONE": "value", "root_TWO": "value"}}
+ Format is important for usage of Python's format(**) approach
"""
pass
+
+ def resolve_path(self, path, root_config=None, anatomy=None):
+ """
+ Replaces all root placeholders with proper values
+
+ Args:
+ path(string): root[work]/folder...
+ root_config (dict): {'work': "c:/..."...}
+ anatomy (Anatomy): object of Anatomy
+ Returns:
+ (string): proper url
+ """
+ if not root_config:
+ root_config = self.get_roots_config(anatomy)
+
+ if root_config and not root_config.get("root"):
+ root_config = {"root": root_config}
+
+ try:
+ if not root_config:
+ raise KeyError
+
+ path = path.format(**root_config)
+ except KeyError:
+ try:
+ path = anatomy.fill_root(path)
+ except KeyError:
+ msg = "Error in resolving local root from anatomy"
+ log.error(msg)
+ raise ValueError(msg)
+
+ return path
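The nested `{"root": {...}}` shape required by `get_roots_config` exists so that Python's `str.format(**config)` can expand `{root[work]}`-style placeholders, which is what `resolve_path` relies on. A minimal sketch of that mechanism (paths and root names are illustrative):

```python
# Minimal sketch of the root-placeholder resolution used by resolve_path.
# The nested {"root": {...}} shape is what makes str.format(**config)
# work with templates like "{root[work]}/...". Paths are illustrative.
root_config = {"root": {"work": "C:/projects", "publish": "P:/publish"}}

template = "{root[work]}/Test/Assets/Cylinder/v010/file.ma"
resolved = template.format(**root_config)
print(resolved)  # C:/projects/Test/Assets/Cylinder/v010/file.ma

# A single-root string value is normalized into the same nested shape
# first, mirroring the {"root": root_config} wrapping in resolve_path:
single = "/My Drive"
normalized = {"root": {"root": single} if isinstance(single, str) else single}
```

When the keys in the template are missing from the config, `resolve_path` falls back to `anatomy.fill_root`, so the `**`-expansion above is only the first attempt.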
diff --git a/openpype/modules/sync_server/providers/gdrive.py b/openpype/modules/sync_server/providers/gdrive.py
index 6c01bc4e6f..f1ea24f601 100644
--- a/openpype/modules/sync_server/providers/gdrive.py
+++ b/openpype/modules/sync_server/providers/gdrive.py
@@ -10,6 +10,7 @@ from openpype.api import get_system_settings
from ..utils import time_function
import time
+
SCOPES = ['https://www.googleapis.com/auth/drive.metadata.readonly',
'https://www.googleapis.com/auth/drive.file',
'https://www.googleapis.com/auth/drive.readonly'] # for write|delete
@@ -45,9 +46,10 @@ class GDriveHandler(AbstractProvider):
MY_DRIVE_STR = 'My Drive' # name of root folder of regular Google drive
CHUNK_SIZE = 2097152 # must be divisible by 256!
- def __init__(self, site_name, tree=None, presets=None):
+ def __init__(self, project_name, site_name, tree=None, presets=None):
self.presets = None
self.active = False
+ self.project_name = project_name
self.site_name = site_name
self.presets = presets
@@ -65,137 +67,6 @@ class GDriveHandler(AbstractProvider):
self._tree = tree
self.active = True
- def _get_gd_service(self):
- """
- Authorize client with 'credentials.json', uses service account.
- Service account needs to have target folder shared with.
- Produces service that communicates with GDrive API.
-
- Returns:
- None
- """
- creds = service_account.Credentials.from_service_account_file(
- self.presets["credentials_url"],
- scopes=SCOPES)
- service = build('drive', 'v3',
- credentials=creds, cache_discovery=False)
- return service
-
- def _prepare_root_info(self):
- """
- Prepare info about roots and theirs folder ids from 'presets'.
- Configuration might be for single or multiroot projects.
- Regular My Drive and Shared drives are implemented, their root
- folder ids need to be queried in slightly different way.
-
- Returns:
- (dicts) of dicts where root folders are keys
- """
- roots = {}
- for path in self.get_roots_config().values():
- if self.MY_DRIVE_STR in path:
- roots[self.MY_DRIVE_STR] = self.service.files()\
- .get(fileId='root').execute()
- else:
- shared_drives = []
- page_token = None
-
- while True:
- response = self.service.drives().list(
- pageSize=100,
- pageToken=page_token).execute()
- shared_drives.extend(response.get('drives', []))
- page_token = response.get('nextPageToken', None)
- if page_token is None:
- break
-
- folders = path.split('/')
- if len(folders) < 2:
- raise ValueError("Wrong root folder definition {}".
- format(path))
-
- for shared_drive in shared_drives:
- if folders[1] in shared_drive["name"]:
- roots[shared_drive["name"]] = {
- "name": shared_drive["name"],
- "id": shared_drive["id"]}
- if self.MY_DRIVE_STR not in roots: # add My Drive always
- roots[self.MY_DRIVE_STR] = self.service.files() \
- .get(fileId='root').execute()
-
- return roots
-
- @time_function
- def _build_tree(self, folders):
- """
- Create in-memory structure resolving paths to folder id as
- recursive querying might be slower.
- Initialized in the time of class initialization.
- Maybe should be persisted
- Tree is structure of path to id:
- '/ROOT': {'id': '1234567'}
- '/ROOT/PROJECT_FOLDER': {'id':'222222'}
- '/ROOT/PROJECT_FOLDER/Assets': {'id': '3434545'}
- Args:
- folders (list): list of dictionaries with folder metadata
- Returns:
- (dictionary) path as a key, folder id as a value
- """
- log.debug("build_tree len {}".format(len(folders)))
- root_ids = []
- default_root_id = None
- tree = {}
- ending_by = {}
- for root_name, root in self.root.items(): # might be multiple roots
- if root["id"] not in root_ids:
- tree["/" + root_name] = {"id": root["id"]}
- ending_by[root["id"]] = "/" + root_name
- root_ids.append(root["id"])
-
- if self.MY_DRIVE_STR == root_name:
- default_root_id = root["id"]
-
- no_parents_yet = {}
- while folders:
- folder = folders.pop(0)
- parents = folder.get("parents", [])
- # weird cases, shared folders, etc, parent under root
- if not parents:
- parent = default_root_id
- else:
- parent = parents[0]
-
- if folder["id"] in root_ids: # do not process root
- continue
-
- if parent in ending_by:
- path_key = ending_by[parent] + "/" + folder["name"]
- ending_by[folder["id"]] = path_key
- tree[path_key] = {"id": folder["id"]}
- else:
- no_parents_yet.setdefault(parent, []).append((folder["id"],
- folder["name"]))
- loop_cnt = 0
- # break if looped more then X times - safety against infinite loop
- while no_parents_yet and loop_cnt < 20:
-
- keys = list(no_parents_yet.keys())
- for parent in keys:
- if parent in ending_by.keys():
- subfolders = no_parents_yet.pop(parent)
- for folder_id, folder_name in subfolders:
- path_key = ending_by[parent] + "/" + folder_name
- ending_by[folder_id] = path_key
- tree[path_key] = {"id": folder_id}
- loop_cnt += 1
-
- if len(no_parents_yet) > 0:
- log.debug("Some folders path are not resolved {}".
- format(no_parents_yet))
- log.debug("Remove deleted folders from trash.")
-
- return tree
-
def is_active(self):
"""
Returns True if provider is activated, eg. has working credentials.
@@ -204,6 +75,21 @@ class GDriveHandler(AbstractProvider):
"""
return self.active
+ def get_roots_config(self, anatomy=None):
+ """
+ Returns root values for path resolving
+
+ Use only Settings as GDrive cannot be modified by Local Settings
+
+ Returns:
+ (dict) - {"root": {"root": "/My Drive"}}
+ OR
+ {"root": {"root_ONE": "value", "root_TWO": "value"}}
+ Format is important for usage of Python's format(**) approach
+ """
+ # GDrive roots cannot be locally overridden
+ return self.presets['root']
+
def get_tree(self):
"""
Building of the folder tree could be potentially expensive,
@@ -217,26 +103,6 @@ class GDriveHandler(AbstractProvider):
self._tree = self._build_tree(self.list_folders())
return self._tree
- def get_roots_config(self):
- """
- Returns value from presets of roots. It calculates with multi
- roots. Config should be simple key value, or dictionary.
-
- Examples:
- "root": "/My Drive"
- OR
- "root": {"root_ONE": "value", "root_TWO":"value}
- Returns:
- (dict) - {"root": {"root": "/My Drive"}}
- OR
- {"root": {"root_ONE": "value", "root_TWO":"value}}
- Format is importing for usage of python's format ** approach
- """
- roots = self.presets["root"]
- if isinstance(roots, str):
- roots = {"root": roots}
- return roots
-
def create_folder(self, path):
"""
Create all nonexistent folders and subfolders in 'path'.
@@ -510,20 +376,6 @@ class GDriveHandler(AbstractProvider):
self.service.files().delete(fileId=file["id"],
supportsAllDrives=True).execute()
- def _get_folder_metadata(self, path):
- """
- Get info about folder with 'path'
- Args:
- path (string):
-
- Returns:
- (dictionary) with metadata or raises ValueError
- """
- try:
- return self.get_tree()[path]
- except Exception:
- raise ValueError("Uknown folder id {}".format(id))
-
def list_folder(self, folder_path):
"""
List all files and subfolders of particular path non-recursively.
@@ -678,15 +530,151 @@ class GDriveHandler(AbstractProvider):
return
return provider_presets
- def resolve_path(self, path, root_config, anatomy=None):
- if not root_config.get("root"):
- root_config = {"root": root_config}
+ def _get_gd_service(self):
+ """
+ Authorize client with 'credentials.json', uses service account.
+ Service account needs to have target folder shared with.
+ Produces service that communicates with GDrive API.
+ Returns:
+ None
+ """
+ creds = service_account.Credentials.from_service_account_file(
+ self.presets["credentials_url"],
+ scopes=SCOPES)
+ service = build('drive', 'v3',
+ credentials=creds, cache_discovery=False)
+ return service
+
+ def _prepare_root_info(self):
+ """
+ Prepare info about roots and their folder ids from 'presets'.
+ Configuration might be for single or multiroot projects.
+ Regular My Drive and Shared drives are implemented; their root
+ folder ids need to be queried in slightly different ways.
+
+ Returns:
+ (dict) of dicts where root folders are keys
+ """
+ roots = {}
+ config_roots = self.get_roots_config()
+ for path in config_roots.values():
+ if self.MY_DRIVE_STR in path:
+ roots[self.MY_DRIVE_STR] = self.service.files()\
+ .get(fileId='root').execute()
+ else:
+ shared_drives = []
+ page_token = None
+
+ while True:
+ response = self.service.drives().list(
+ pageSize=100,
+ pageToken=page_token).execute()
+ shared_drives.extend(response.get('drives', []))
+ page_token = response.get('nextPageToken', None)
+ if page_token is None:
+ break
+
+ folders = path.split('/')
+ if len(folders) < 2:
+ raise ValueError("Wrong root folder definition {}".
+ format(path))
+
+ for shared_drive in shared_drives:
+ if folders[1] in shared_drive["name"]:
+ roots[shared_drive["name"]] = {
+ "name": shared_drive["name"],
+ "id": shared_drive["id"]}
+ if self.MY_DRIVE_STR not in roots: # add My Drive always
+ roots[self.MY_DRIVE_STR] = self.service.files() \
+ .get(fileId='root').execute()
+
+ return roots
+
+ @time_function
+ def _build_tree(self, folders):
+ """
+ Create an in-memory structure resolving paths to folder ids, as
+ recursive querying might be slower.
+ Initialized at class initialization time.
+ Might be worth persisting.
+ Tree is structure of path to id:
+ '/ROOT': {'id': '1234567'}
+ '/ROOT/PROJECT_FOLDER': {'id':'222222'}
+ '/ROOT/PROJECT_FOLDER/Assets': {'id': '3434545'}
+ Args:
+ folders (list): list of dictionaries with folder metadata
+ Returns:
+ (dictionary) path as a key, folder id as a value
+ """
+ log.debug("build_tree len {}".format(len(folders)))
+ root_ids = []
+ default_root_id = None
+ tree = {}
+ ending_by = {}
+ for root_name, root in self.root.items(): # might be multiple roots
+ if root["id"] not in root_ids:
+ tree["/" + root_name] = {"id": root["id"]}
+ ending_by[root["id"]] = "/" + root_name
+ root_ids.append(root["id"])
+
+ if self.MY_DRIVE_STR == root_name:
+ default_root_id = root["id"]
+
+ no_parents_yet = {}
+ while folders:
+ folder = folders.pop(0)
+ parents = folder.get("parents", [])
+ # weird cases, shared folders, etc, parent under root
+ if not parents:
+ parent = default_root_id
+ else:
+ parent = parents[0]
+
+ if folder["id"] in root_ids: # do not process root
+ continue
+
+ if parent in ending_by:
+ path_key = ending_by[parent] + "/" + folder["name"]
+ ending_by[folder["id"]] = path_key
+ tree[path_key] = {"id": folder["id"]}
+ else:
+ no_parents_yet.setdefault(parent, []).append((folder["id"],
+ folder["name"]))
+ loop_cnt = 0
+ # break if looped more than X times - safety against infinite loop
+ while no_parents_yet and loop_cnt < 20:
+
+ keys = list(no_parents_yet.keys())
+ for parent in keys:
+ if parent in ending_by.keys():
+ subfolders = no_parents_yet.pop(parent)
+ for folder_id, folder_name in subfolders:
+ path_key = ending_by[parent] + "/" + folder_name
+ ending_by[folder_id] = path_key
+ tree[path_key] = {"id": folder_id}
+ loop_cnt += 1
+
+ if len(no_parents_yet) > 0:
+ log.debug("Some folder paths were not resolved {}".
+ format(no_parents_yet))
+ log.debug("Remove deleted folders from trash.")
+
+ return tree
+
+ def _get_folder_metadata(self, path):
+ """
+ Get info about folder with 'path'
+ Args:
+ path (string): path key into the prebuilt folder tree
+
+ Returns:
+ (dictionary) with metadata or raises ValueError
+ """
try:
- return path.format(**root_config)
- except KeyError:
- msg = "Error in resolving remote root, unknown key"
- log.error(msg)
+ return self.get_tree()[path]
+ except Exception:
+ raise ValueError("Unknown folder path {}".format(path))
def _handle_q(self, q, trashed=False):
""" API list call contain trashed and hidden files/folder by default.
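The `_build_tree` method moved above resolves flat folder metadata into a path-to-id map: folders whose parent path is not known yet are parked in `no_parents_yet` and retried, with a loop counter guarding against unresolvable input. A simplified, self-contained sketch of that two-pass idea (the names and data are illustrative, not the real GDrive API):

```python
# Hedged sketch of the two-pass tree build from _build_tree: folders
# whose parent path is not yet resolved are parked and retried later,
# with a capped retry loop as a safety against unresolvable input.
def build_tree(folders, roots):
    # roots: {root_name: folder_id}
    tree = {}
    ending_by = {}  # folder_id -> resolved path
    for name, fid in roots.items():
        tree["/" + name] = {"id": fid}
        ending_by[fid] = "/" + name

    no_parents_yet = {}
    for folder in folders:
        parent = (folder.get("parents") or [None])[0]
        if parent in ending_by:
            path = ending_by[parent] + "/" + folder["name"]
            ending_by[folder["id"]] = path
            tree[path] = {"id": folder["id"]}
        else:
            no_parents_yet.setdefault(parent, []).append(
                (folder["id"], folder["name"]))

    loop_cnt = 0
    while no_parents_yet and loop_cnt < 20:  # cap against infinite loop
        for parent in list(no_parents_yet):
            if parent in ending_by:
                for fid, fname in no_parents_yet.pop(parent):
                    path = ending_by[parent] + "/" + fname
                    ending_by[fid] = path
                    tree[path] = {"id": fid}
        loop_cnt += 1
    return tree

folders = [
    {"id": "B", "name": "Assets", "parents": ["A"]},  # parent seen later
    {"id": "A", "name": "PROJ", "parents": ["R"]},
]
tree = build_tree(folders, {"ROOT": "R"})
print(tree["/ROOT/PROJ/Assets"])  # {'id': 'B'}
```

The retry loop is what lets the second folder ("Assets") resolve even though its parent appeared later in the listing.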
diff --git a/openpype/modules/sync_server/providers/lib.py b/openpype/modules/sync_server/providers/lib.py
index 144594ecbe..58947e115d 100644
--- a/openpype/modules/sync_server/providers/lib.py
+++ b/openpype/modules/sync_server/providers/lib.py
@@ -1,4 +1,3 @@
-from enum import Enum
from .gdrive import GDriveHandler
from .local_drive import LocalDriveHandler
@@ -25,7 +24,8 @@ class ProviderFactory:
"""
self.providers[provider] = (creator, batch_limit)
- def get_provider(self, provider, site_name, tree=None, presets=None):
+ def get_provider(self, provider, project_name, site_name,
+ tree=None, presets=None):
"""
Returns new instance of provider client for specific site.
One provider could have multiple sites.
@@ -37,6 +37,7 @@ class ProviderFactory:
provider (string): 'gdrive','S3'
site_name (string): descriptor of site, different service accounts
must have different site name
+ project_name (string): different projects could have different sites
tree (dictionary): - folder paths to folder id structure
presets (dictionary): config for provider and site (eg.
"credentials_url"..)
@@ -44,7 +45,8 @@ class ProviderFactory:
(implementation of AbstractProvider)
"""
creator_info = self._get_creator_info(provider)
- site = creator_info[0](site_name, tree, presets) # call init
+ # call init
+ site = creator_info[0](project_name, site_name, tree, presets)
return site
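The `ProviderFactory` change above threads `project_name` through to each handler's constructor. A toy sketch of the registration/lookup pattern (`DummyHandler` is a hypothetical stand-in for `GDriveHandler`/`LocalDriveHandler`):

```python
# Illustrative sketch of the ProviderFactory registration/lookup pattern;
# DummyHandler stands in for the real provider handlers.
class DummyHandler:
    def __init__(self, project_name, site_name, tree=None, presets=None):
        self.project_name = project_name
        self.site_name = site_name

class ProviderFactory:
    def __init__(self):
        self.providers = {}  # provider name -> (creator class, batch limit)

    def register_provider(self, provider, creator, batch_limit):
        self.providers[provider] = (creator, batch_limit)

    def get_provider(self, provider, project_name, site_name,
                     tree=None, presets=None):
        creator, _limit = self.providers[provider]
        return creator(project_name, site_name, tree, presets)

factory = ProviderFactory()
factory.register_provider("dummy", DummyHandler, 7)
site = factory.get_provider("dummy", "projectA", "john_local")
print(site.project_name)  # projectA
```

Keeping the batch limit next to the creator lets callers query provider API limits without instantiating a handler.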
diff --git a/openpype/modules/sync_server/providers/local_drive.py b/openpype/modules/sync_server/providers/local_drive.py
index fa8dd4c183..1f4fca80eb 100644
--- a/openpype/modules/sync_server/providers/local_drive.py
+++ b/openpype/modules/sync_server/providers/local_drive.py
@@ -4,7 +4,7 @@ import shutil
import threading
import time
-from openpype.api import Logger
+from openpype.api import Logger, Anatomy
from .abstract_provider import AbstractProvider
log = Logger().get_logger("SyncServer")
@@ -12,6 +12,14 @@ log = Logger().get_logger("SyncServer")
class LocalDriveHandler(AbstractProvider):
""" Handles required operations on mounted disks with OS """
+ def __init__(self, project_name, site_name, tree=None, presets=None):
+ self.presets = None
+ self.active = False
+ self.project_name = project_name
+ self.site_name = site_name
+
+ self.active = self.is_active()
+
def is_active(self):
return True
@@ -82,27 +90,37 @@ class LocalDriveHandler(AbstractProvider):
os.makedirs(folder_path, exist_ok=True)
return folder_path
+ def get_roots_config(self, anatomy=None):
+ """
+ Returns root values for path resolving
+
+ Takes value from Anatomy which takes values from Settings
+ overridden by Local Settings
+
+ Returns:
+ (dict) - {"root": {"root": "/My Drive"}}
+ OR
+ {"root": {"root_ONE": "value", "root_TWO":"value}}
+ Format is importing for usage of python's format ** approach
+ """
+ if not anatomy:
+ anatomy = Anatomy(self.project_name,
+ self._normalize_site_name(self.site_name))
+
+ return {'root': anatomy.roots}
+
def get_tree(self):
return
- def resolve_path(self, path, root_config, anatomy=None):
- if root_config and not root_config.get("root"):
- root_config = {"root": root_config}
+ def get_configurable_items_for_site(self):
+ """
+ Returns list of items that should be configurable by the user
- try:
- if not root_config:
- raise KeyError
-
- path = path.format(**root_config)
- except KeyError:
- try:
- path = anatomy.fill_root(path)
- except KeyError:
- msg = "Error in resolving local root from anatomy"
- log.error(msg)
- raise ValueError(msg)
-
- return path
+ Returns:
+ (list of dict)
+ [{key:"root", label:"root", value:"valueFromSettings"}]
+ """
+ pass
def _copy(self, source_path, target_path):
print("copying {}->{}".format(source_path, target_path))
@@ -133,3 +151,9 @@ class LocalDriveHandler(AbstractProvider):
)
target_file_size = os.path.getsize(target_path)
time.sleep(0.5)
+
+ def _normalize_site_name(self, site_name):
+ """Transform user id to 'local' for Local settings"""
+ if site_name != 'studio':
+ return 'local'
+ return site_name
diff --git a/openpype/modules/sync_server/sync_server.py b/openpype/modules/sync_server/sync_server.py
index 62a5dc675c..e97c0e8844 100644
--- a/openpype/modules/sync_server/sync_server.py
+++ b/openpype/modules/sync_server/sync_server.py
@@ -1,1391 +1,225 @@
-from openpype.api import (
- Anatomy,
- get_project_settings,
- get_local_site_id)
-
+"""Python 3 only implementation."""
+import os
+import asyncio
import threading
import concurrent.futures
from concurrent.futures._base import CancelledError
-from enum import Enum
-from datetime import datetime
-
from .providers import lib
-import os
-from bson.objectid import ObjectId
-
-from avalon.api import AvalonMongoDB
-from .utils import time_function
-
-import six
from openpype.lib import PypeLogger
-from .. import PypeModule, ITrayModule
-from .providers.local_drive import LocalDriveHandler
-if six.PY2:
- web = asyncio = STATIC_DIR = WebSocketAsync = None
-else:
- import asyncio
+from .utils import SyncStatus
+
log = PypeLogger().get_logger("SyncServer")
-class SyncStatus(Enum):
- DO_NOTHING = 0
- DO_UPLOAD = 1
- DO_DOWNLOAD = 2
-
-
-class SyncServer(PypeModule, ITrayModule):
+async def upload(module, collection, file, representation, provider_name,
+ remote_site_name, tree=None, preset=None):
"""
- Synchronization server that is syncing published files from local to
- any of implemented providers (like GDrive, S3 etc.)
- Runs in the background and checks all representations, looks for files
- that are marked to be in different location than 'studio' (temporary),
- checks if 'created_dt' field is present denoting successful sync
- with provider destination.
- Sites structure is created during publish OR by calling 'add_site'
- method.
+ Upload single 'file' of a 'representation' to 'provider'.
+ Source url is taken from 'file' portion, where {root} placeholder
+ is replaced by 'representation.Context.root'
+ Provider could be one of implemented in provider.py.
- By default it will always contain 1 record with
- "name" == self.presets["active_site"] and
- filled "created_dt" AND 1 or multiple records for all defined
- remote sites, where "created_dt" is not present.
- This highlights that file should be uploaded to
- remote destination
+ Updates MongoDB: fills in the id of the file from the provider
+ (i.e. file_id from GDrive) and 'created_dt', the time of upload
- ''' - example of synced file test_Cylinder_lookMain_v010.ma to GDrive
- "files" : [
- {
- "path" : "{root}/Test/Assets/Cylinder/publish/look/lookMain/v010/
- test_Cylinder_lookMain_v010.ma",
- "_id" : ObjectId("5eeb25e411e06a16209ab78f"),
- "hash" : "test_Cylinder_lookMain_v010,ma|1592468963,24|4822",
- "size" : NumberLong(4822),
- "sites" : [
- {
- "name": "john_local_XD4345",
- "created_dt" : ISODate("2020-05-22T08:05:44.000Z")
- },
- {
- "id" : ObjectId("5eeb25e411e06a16209ab78f"),
- "name": "gdrive",
- "created_dt" : ISODate("2020-05-55T08:54:35.833Z")
- ]
- }
- },
- '''
- Each Tray app has assigned its own self.presets["local_id"]
- used in sites as a name.
- Tray is searching only for records where name matches its
- self.presets["active_site"] + self.presets["remote_site"].
- "active_site" could be storage in studio ('studio'), or specific
- "local_id" when user is working disconnected from home.
- If the local record has its "created_dt" filled, it is a source and
- process will try to upload the file to all defined remote sites.
+ 'provider_name' doesn't have to match 'site_name'; a single
+ provider (GDrive) might have multiple sites ('projectA',
+ 'projectB')
- Remote files "id" is real id that could be used in appropriate API.
- Local files have "id" too, for conformity, contains just file name.
- It is expected that multiple providers will be implemented in separate
- classes and registered in 'providers.py'.
+ Args:
+ module(SyncServerModule): object to run SyncServerModule API
+ collection (str): source collection
+ file (dictionary): of file from representation in Mongo
+ representation (dictionary): of representation
+ provider_name (string): gdrive, gdc etc.
+ remote_site_name (string): site on provider; a single provider
+ (gdrive) could have multiple sites (different accounts, credentials)
+ tree (dictionary): injected memory structure for performance
+ preset (dictionary): site config ('credentials_url', 'root'...)
"""
- # limit querying DB to look for X number of representations that should
- # be sync, we try to run more loops with less records
- # actual number of files synced could be lower as providers can have
- # different limits imposed by its API
- # set 0 to no limit
- REPRESENTATION_LIMIT = 100
- DEFAULT_SITE = 'studio'
- LOCAL_SITE = 'local'
- LOG_PROGRESS_SEC = 5 # how often log progress to DB
+ # create ids sequentially, upload file in parallel later
+ with module.lock:
+ # this part modifies structure on 'remote_site', only single
+ # thread can do that at a time, upload/download to prepared
+ # structure should be run in parallel
+ remote_handler = lib.factory.get_provider(provider_name,
+ collection,
+ remote_site_name,
+ tree=tree,
+ presets=preset)
- name = "sync_server"
- label = "Sync Server"
-
- def initialize(self, module_settings):
- """
- Called during Module Manager creation.
-
- Collects needed data, checks asyncio presence.
- Sets 'enabled' according to global settings for the module.
- Shouldnt be doing any initialization, thats a job for 'tray_init'
- """
- self.enabled = module_settings[self.name]["enabled"]
- if asyncio is None:
- raise AssertionError(
- "SyncServer module requires Python 3.5 or higher."
+ file_path = file.get("path", "")
+ try:
+ local_file_path, remote_file_path = resolve_paths(module,
+ file_path, collection, remote_site_name, remote_handler
)
- # some parts of code need to run sequentially, not in async
- self.lock = None
- self.connection = None # connection to avalon DB to update state
- # settings for all enabled projects for sync
- self.sync_project_settings = None
- self.sync_server_thread = None # asyncio requires new thread
-
- self.action_show_widget = None
- self._paused = False
- self._paused_projects = set()
- self._paused_representations = set()
- self._anatomies = {}
-
- """ Start of Public API """
- def add_site(self, collection, representation_id, site_name=None):
- """
- Adds new site to representation to be synced.
-
- 'collection' must have synchronization enabled (globally or
- project only)
-
- Used as a API endpoint from outside applications (Loader etc)
-
- Args:
- collection (string): project name (must match DB)
- representation_id (string): MongoDB _id value
- site_name (string): name of configured and active site
-
- Returns:
- throws ValueError if any issue
- """
- if not self.get_sync_project_setting(collection):
- raise ValueError("Project not configured")
-
- if not site_name:
- site_name = self.DEFAULT_SITE
-
- self.reset_provider_for_file(collection,
- representation_id,
- site_name=site_name)
-
- # public facing API
- def remove_site(self, collection, representation_id, site_name,
- remove_local_files=False):
- """
- Removes 'site_name' for particular 'representation_id' on
- 'collection'
-
- Args:
- collection (string): project name (must match DB)
- representation_id (string): MongoDB _id value
- site_name (string): name of configured and active site
- remove_local_files (bool): remove only files for 'local_id'
- site
-
- Returns:
- throws ValueError if any issue
- """
- if not self.get_sync_project_setting(collection):
- raise ValueError("Project not configured")
-
- self.reset_provider_for_file(collection,
- representation_id,
- site_name=site_name,
- remove=True)
- if remove_local_files:
- self._remove_local_file(collection, representation_id, site_name)
-
- def clear_project(self, collection, site_name):
- """
- Clear 'collection' of 'site_name' and its local files
-
- Works only on real local sites, not on 'studio'
- """
- query = {
- "type": "representation",
- "files.sites.name": site_name
- }
-
- representations = list(
- self.connection.database[collection].find(query))
- if not representations:
- self.log.debug("No repre found")
- return
-
- for repre in representations:
- self.remove_site(collection, repre.get("_id"), site_name, True)
-
- def pause_representation(self, collection, representation_id, site_name):
- """
- Sets 'representation_id' as paused, eg. no syncing should be
- happening on it.
-
- Args:
- collection (string): project name
- representation_id (string): MongoDB objectId value
- site_name (string): 'gdrive', 'studio' etc.
- """
- log.info("Pausing SyncServer for {}".format(representation_id))
- self._paused_representations.add(representation_id)
- self.reset_provider_for_file(collection, representation_id,
- site_name=site_name, pause=True)
-
- def unpause_representation(self, collection, representation_id, site_name):
- """
- Sets 'representation_id' as unpaused.
-
- Does not fail or warn if repre wasn't paused.
-
- Args:
- collection (string): project name
- representation_id (string): MongoDB objectId value
- site_name (string): 'gdrive', 'studio' etc.
- """
- log.info("Unpausing SyncServer for {}".format(representation_id))
- try:
- self._paused_representations.remove(representation_id)
- except KeyError:
- pass
- # self.paused_representations is not persistent
- self.reset_provider_for_file(collection, representation_id,
- site_name=site_name, pause=False)
-
- def is_representation_paused(self, representation_id,
- check_parents=False, project_name=None):
- """
- Returns if 'representation_id' is paused or not.
-
- Args:
- representation_id (string): MongoDB objectId value
- check_parents (bool): check if parent project or server itself
- are not paused
- project_name (string): project to check if paused
-
- if 'check_parents', 'project_name' should be set too
- Returns:
- (bool)
- """
- condition = representation_id in self._paused_representations
- if check_parents and project_name:
- condition = condition or \
- self.is_project_paused(project_name) or \
- self.is_paused()
- return condition
-
- def pause_project(self, project_name):
- """
- Sets 'project_name' as paused, eg. no syncing should be
- happening on all representation inside.
-
- Args:
- project_name (string): collection name
- """
- log.info("Pausing SyncServer for {}".format(project_name))
- self._paused_projects.add(project_name)
-
- def unpause_project(self, project_name):
- """
- Sets 'project_name' as unpaused
-
- Does not fail or warn if project wasn't paused.
-
- Args:
- project_name (string): collection name
- """
- log.info("Unpausing SyncServer for {}".format(project_name))
- try:
- self._paused_projects.remove(project_name)
- except KeyError:
- pass
-
- def is_project_paused(self, project_name, check_parents=False):
- """
- Returns if 'project_name' is paused or not.
-
- Args:
- project_name (string): collection name
- check_parents (bool): check if server itself
- is not paused
- Returns:
- (bool)
- """
- condition = project_name in self._paused_projects
- if check_parents:
- condition = condition or self.is_paused()
- return condition
-
- def pause_server(self):
- """
- Pause sync server
-
- It won't check anything, not uploading/downloading...
- """
- log.info("Pausing SyncServer")
- self._paused = True
-
- def unpause_server(self):
- """
- Unpause server
- """
- log.info("Unpausing SyncServer")
- self._paused = False
-
- def is_paused(self):
- """ Is server paused """
- return self._paused
-
- def get_active_sites(self, project_name):
- """
- Returns list of active sites for 'project_name'.
-
- By default it returns ['studio'], this site is default
- and always present even if SyncServer is not enabled. (for publish)
-
- Used mainly for Local settings for user override.
-
- Args:
- project_name (string):
-
- Returns:
- (list) of strings
- """
- return self.get_active_sites_from_settings(
- get_project_settings(project_name))
-
- def get_active_sites_from_settings(self, settings):
- """
- List available active sites from incoming 'settings'. Used for
- returning 'default' values for Local Settings
-
- Args:
- settings (dict): full settings (global + project)
- Returns:
- (list) of strings
- """
- sync_settings = self._parse_sync_settings_from_settings(settings)
-
- return self._get_active_sites_from_settings(sync_settings)
-
- def get_active_site(self, project_name):
- """
- Returns active (mine) site for 'project_name' from settings
-
- Returns:
- (string)
- """
- active_site = self.get_sync_project_setting(
- project_name)['config']['active_site']
- if active_site == self.LOCAL_SITE:
- return get_local_site_id()
- return active_site
-
- # remote sites
- def get_remote_sites(self, project_name):
- """
- Returns all remote sites configured on 'project_name'.
-
- If 'project_name' is not enabled for syncing returns [].
-
- Used by Local setting to allow user choose remote site.
-
- Args:
- project_name (string):
-
- Returns:
- (list) of strings
- """
- return self.get_remote_sites_from_settings(
- get_project_settings(project_name))
-
- def get_remote_sites_from_settings(self, settings):
- """
- Get remote sites for returning 'default' values for Local Settings
- """
- sync_settings = self._parse_sync_settings_from_settings(settings)
-
- return self._get_remote_sites_from_settings(sync_settings)
-
- def get_remote_site(self, project_name):
- """
- Returns remote (theirs) site for 'project_name' from settings
- """
- remote_site = self.get_sync_project_setting(
- project_name)['config']['remote_site']
- if remote_site == self.LOCAL_SITE:
- return get_local_site_id()
-
- return remote_site
-
- """ End of Public API """
-
- def get_local_file_path(self, collection, file_path):
- """
- Externalized for app
- """
- local_file_path, _ = self._resolve_paths(file_path, collection)
-
- return local_file_path
-
- def _get_remote_sites_from_settings(self, sync_settings):
- if not self.enabled or not sync_settings['enabled']:
- return []
-
- remote_sites = [self.DEFAULT_SITE, self.LOCAL_SITE]
- if sync_settings:
- remote_sites.extend(sync_settings.get("sites").keys())
-
- return list(set(remote_sites))
-
- def _get_active_sites_from_settings(self, sync_settings):
- sites = [self.DEFAULT_SITE]
- if self.enabled and sync_settings['enabled']:
- sites.append(self.LOCAL_SITE)
-
- return sites
-
- def connect_with_modules(self, *_a, **kw):
- return
-
- def tray_init(self):
- """
- Actual initialization of Sync Server.
-
- Called when tray is initialized, it checks if module should be
- enabled. If not, no initialization necessary.
- """
- if not self.enabled:
- return
-
- self.sync_project_settings = None
- self.lock = threading.Lock()
-
- self.connection = AvalonMongoDB()
- self.connection.install()
-
- try:
- self.set_sync_project_settings()
- self.sync_server_thread = SyncServerThread(self)
- from .tray.app import SyncServerWindow
- self.widget = SyncServerWindow(self)
- except ValueError:
- log.info("No system setting for sync. Not syncing.", exc_info=True)
- self.enabled = False
- except KeyError:
- log.info((
- "No presets set for SyncServer OR "
- "provided credentials are invalid, "
- "no syncing possible: {}").
- format(str(self.sync_project_settings)), exc_info=True)
- self.enabled = False
-
- def tray_start(self):
- """
- Triggered when Tray is started.
-
- Checks if configuration presets are available and if there is
- any provider ('gdrive', 'S3') that is activated
- (eg. has valid credentials).
+ except Exception as exp:
+ log.warning(str(exp), exc_info=True)
+
+ target_folder = os.path.dirname(remote_file_path)
+ folder_id = remote_handler.create_folder(target_folder)
+
+ if not folder_id:
+ err = "Folder {} wasn't created. Check permissions.". \
+ format(target_folder)
+ raise NotADirectoryError(err)
+
+ loop = asyncio.get_running_loop()
+ file_id = await loop.run_in_executor(None,
+ remote_handler.upload_file,
+ local_file_path,
+ remote_file_path,
+ module,
+ collection,
+ file,
+ representation,
+ remote_site_name,
+ True
+ )
+ return file_id
+
+
+async def download(module, collection, file, representation, provider_name,
+ remote_site_name, tree=None, preset=None):
+ """
+ Downloads file to local folder denoted in representation.Context.
+
+ Args:
+ module(SyncServerModule): object to run SyncServerModule API
+ collection (str): source collection
+ file (dictionary) : info about processed file
+ representation (dictionary): repr that 'file' belongs to
+ provider_name (string): 'gdrive' etc
+ remote_site_name (string): site on provider, single provider (gdrive)
+ could have multiple sites (different accounts, credentials)
+ tree (dictionary): injected memory structure for performance
+ preset (dictionary): site config ('credentials_url', 'root'...)
Returns:
- None
- """
- if self.sync_project_settings and self.enabled:
- self.sync_server_thread.start()
- else:
- log.info("No presets or active providers. " +
- "Synchronization not possible.")
+ (string) - 'name' of local file
+ """
+ with module.lock:
+ remote_handler = lib.factory.get_provider(provider_name,
+ collection,
+ remote_site_name,
+ tree=tree,
+ presets=preset)
- def tray_exit(self):
- """
- Stops sync thread if running.
+ file_path = file.get("path", "")
+ local_file_path, remote_file_path = resolve_paths(
+ module, file_path, collection, remote_site_name, remote_handler
+ )
- Called from Module Manager
- """
- if not self.sync_server_thread:
- return
+ local_folder = os.path.dirname(local_file_path)
+ os.makedirs(local_folder, exist_ok=True)
- if not self.is_running:
- return
- try:
- log.info("Stopping sync server")
- self.sync_server_thread.is_running = False
- self.sync_server_thread.stop()
- except Exception:
- log.warning(
- "Error happened while stopping sync server",
- exc_info=True
- )
+ local_site = module.get_active_site(collection)
- def tray_menu(self, parent_menu):
- if not self.enabled:
- return
+ loop = asyncio.get_running_loop()
+ file_id = await loop.run_in_executor(None,
+ remote_handler.download_file,
+ remote_file_path,
+ local_file_path,
+ module,
+ collection,
+ file,
+ representation,
+ local_site,
+ True
+ )
+ return file_id
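Both `upload` and `download` above hand the blocking provider call to the default thread executor so the event loop can schedule other transfers in the meantime. A minimal, self-contained sketch of that pattern (the blocking function here is a stand-in, not the real provider handler):

```python
import asyncio

def blocking_transfer(source, target):
    # stand-in for remote_handler.upload_file / download_file,
    # which block on network I/O
    return "{} -> {}".format(source, target)

async def run_transfer():
    loop = asyncio.get_running_loop()
    # None = default ThreadPoolExecutor; positional args follow the callable
    return await loop.run_in_executor(None, blocking_transfer, "local", "remote")

result = asyncio.run(run_transfer())
```

Because `run_in_executor` returns an awaitable, many such transfers can be gathered concurrently while each individual provider call still runs in a plain worker thread.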
- from Qt import QtWidgets
- """Add menu or action to Tray(or parent)'s menu"""
- action = QtWidgets.QAction("SyncServer", parent_menu)
- action.triggered.connect(self.show_widget)
- parent_menu.addAction(action)
- parent_menu.addSeparator()
- self.action_show_widget = action
+def resolve_paths(module, file_path, collection,
+ remote_site_name=None, remote_handler=None):
+ """
+ Returns tuple of local and remote file paths with {root}
+ placeholders replaced with proper values from Settings or Anatomy
- @property
- def is_running(self):
- return self.sync_server_thread.is_running
+ Moved to module level because of Python 2 hosts (importing GDriveHandler there is an issue)
- def get_anatomy(self, project_name):
- """
- Get already created or newly created anatomy for project
-
- Args:
- project_name (string):
-
- Return:
- (Anatomy)
- """
- return self._anatomies.get(project_name) or Anatomy(project_name)
-
- def set_sync_project_settings(self):
- """
- Set sync_project_settings for all projects (caching)
-
- For performance
- """
- sync_project_settings = {}
- if not self.connection:
- self.connection = AvalonMongoDB()
- self.connection.install()
-
- for collection in self.connection.database.collection_names(False):
- sync_settings = self._parse_sync_settings_from_settings(
- get_project_settings(collection))
- if sync_settings:
- default_sites = self._get_default_site_configs()
- sync_settings['sites'].update(default_sites)
- sync_project_settings[collection] = sync_settings
-
- if not sync_project_settings:
- log.info("No enabled and configured projects for sync.")
-
- self.sync_project_settings = sync_project_settings
-
- def get_sync_project_settings(self, refresh=False):
- """
- Collects all projects which have enabled syncing and their settings
Args:
- refresh (bool): refresh presets from settings - used when user
- changes site in Local Settings or any time up-to-date values
- are necessary
+ module(SyncServerModule): object to run SyncServerModule API
+ file_path(string): path with {root}
+ collection(string): project name
+ remote_site_name(string): remote site
+ remote_handler(AbstractProvider): implementation
Returns:
- (dict): of settings, keys are project names
- {'projectA':{enabled: True, sites:{}...}
- """
- # presets set already, do not call again and again
- if refresh or not self.sync_project_settings:
- self.set_sync_project_settings()
+ (string, string) - proper absolute paths; remote path is '' when no remote_handler is passed
+ """
+ remote_file_path = ''
+ if remote_handler:
+ remote_file_path = remote_handler.resolve_path(file_path)
- return self.sync_project_settings
+ local_handler = lib.factory.get_provider(
+ 'local_drive', collection, module.get_active_site(collection))
+ local_file_path = local_handler.resolve_path(file_path)
- def get_sync_project_setting(self, project_name):
- """ Handles pulling sync_server's settings for enabled 'project_name'
+ return local_file_path, remote_file_path
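`resolve_paths` delegates the actual `{root}` substitution to the provider handlers (Settings-based roots for the remote side, Anatomy for the local one). A simplified sketch of what that substitution amounts to, with a hypothetical roots mapping:

```python
# Hypothetical root configuration; real values come from Settings/Anatomy.
roots = {"work": "/mnt/studio/work"}

def resolve_root(file_path, roots):
    # replace keyed {root[key]} placeholders, then any bare {root}
    for key, value in roots.items():
        file_path = file_path.replace("{root[%s]}" % key, value)
    return file_path.replace("{root}", next(iter(roots.values())))

local_path = resolve_root("{root[work]}/projA/assets/char/model.ma", roots)
```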
- Args:
- project_name (str): used in project settings
- Returns:
- (dict): settings dictionary for the enabled project,
- empty if no settings or sync is disabled
- """
- # presets set already, do not call again and again
- # self.log.debug("project preset {}".format(self.presets))
- if self.sync_project_settings and \
- self.sync_project_settings.get(project_name):
- return self.sync_project_settings.get(project_name)
- settings = get_project_settings(project_name)
- return self._parse_sync_settings_from_settings(settings)
+def site_is_working(module, project_name, site_name):
+ """
+ Confirm that 'site_name' is configured correctly for 'project_name'.
- def site_is_working(self, project_name, site_name):
- """
- Confirm that 'site_name' is configured correctly for 'project_name'
- Args:
- project_name(string):
- site_name(string):
- Returns
- (bool)
- """
- if self._get_configured_sites(project_name).get(site_name):
- return True
- return False
+ Must be here as lib.factory access doesn't work in Python 2 hosts.
- def _parse_sync_settings_from_settings(self, settings):
- """ settings from api.get_project_settings, TODO rename """
- sync_settings = settings.get("global").get("sync_server")
- if not sync_settings:
- log.info("No project setting, not syncing.")
- return {}
- if sync_settings.get("enabled"):
- return sync_settings
+ Args:
+ module (SyncServerModule)
+ project_name(string):
+ site_name(string):
+ Returns:
+ (bool)
+ """
+ if _get_configured_sites(module, project_name).get(site_name):
+ return True
+ return False
+
+def _get_configured_sites(module, project_name):
+ """
+ Loops through settings, looks for configured sites and checks
+ their handlers for particular 'project_name'.
+
+ Args:
+ module (SyncServerModule): object to run SyncServerModule API
+ project_name (string): project of interest
+ Returns:
+ (dict of dict)
+ {'ProjectA': {'studio':True, 'gdrive':False}}
+ """
+ settings = module.get_sync_project_setting(project_name)
+ return _get_configured_sites_from_setting(module, project_name, settings)
+
+
+def _get_configured_sites_from_setting(module, project_name, project_setting):
+ if not project_setting.get("enabled"):
return {}
- def _get_configured_sites(self, project_name):
- """
- Loops through settings and looks for configured sites and checks
- its handlers for particular 'project_name'.
-
- Args:
- project_setting(dict): dictionary from Settings
- only_project_name(string, optional): only interested in
- particular project
- Returns:
- (dict of dict)
- {'ProjectA': {'studio':True, 'gdrive':False}}
- """
- settings = self.get_sync_project_setting(project_name)
- return self._get_configured_sites_from_setting(settings)
-
- def _get_configured_sites_from_setting(self, project_setting):
- if not project_setting.get("enabled"):
- return {}
-
- initiated_handlers = {}
- configured_sites = {}
- all_sites = self._get_default_site_configs()
- all_sites.update(project_setting.get("sites"))
- for site_name, config in all_sites.items():
- handler = initiated_handlers. \
- get((config["provider"], site_name))
- if not handler:
- handler = lib.factory.get_provider(config["provider"],
- site_name,
- presets=config)
- initiated_handlers[(config["provider"], site_name)] = \
- handler
-
- if handler.is_active():
- configured_sites[site_name] = True
-
- return configured_sites
-
- def _get_default_site_configs(self):
- """
- Returns skeleton settings for 'studio' and user's local site
- """
- default_config = {'provider': 'local_drive'}
- all_sites = {self.DEFAULT_SITE: default_config,
- get_local_site_id(): default_config}
- return all_sites
-
- def get_provider_for_site(self, project_name, site):
- """
- Return provider name for site.
- """
- site_preset = self.get_sync_project_setting(project_name)["sites"].\
- get(site)
- if site_preset:
- return site_preset["provider"]
-
- return "NA"
-
- @time_function
- def get_sync_representations(self, collection, active_site, remote_site):
- """
- Get representations that should be synced, these could be
- recognised by presence of document in 'files.sites', where key is
- a provider (GDrive, S3) and value is empty document or document
- without 'created_dt' field. (Don't put null to 'created_dt'!).
-
- Querying of 'to-be-synced' files is offloaded to MongoDB for
- better performance. Goal is to get as few representations as
- possible.
- Args:
- collection (string): name of collection (in most cases matches
- project name)
- active_site (string): identifier of current active site (could be
- 'local_0' when working from home, 'studio' when working in the
- studio (default)
- remote_site (string): identifier of remote site I want to sync to
-
- Returns:
- (list) of dictionaries
- """
- log.debug("Check representations for : {}".format(collection))
- self.connection.Session["AVALON_PROJECT"] = collection
- # retry_cnt - number of attempts to sync specific file before giving up
- retries_arr = self._get_retries_arr(collection)
- query = {
- "type": "representation",
- "$or": [
- {"$and": [
- {
- "files.sites": {
- "$elemMatch": {
- "name": active_site,
- "created_dt": {"$exists": True}
- }
- }}, {
- "files.sites": {
- "$elemMatch": {
- "name": {"$in": [remote_site]},
- "created_dt": {"$exists": False},
- "tries": {"$in": retries_arr}
- }
- }
- }]},
- {"$and": [
- {
- "files.sites": {
- "$elemMatch": {
- "name": active_site,
- "created_dt": {"$exists": False},
- "tries": {"$in": retries_arr}
- }
- }}, {
- "files.sites": {
- "$elemMatch": {
- "name": {"$in": [remote_site]},
- "created_dt": {"$exists": True}
- }
- }
- }
- ]}
- ]
- }
- log.debug("active_site:{} - remote_site:{}".format(active_site,
- remote_site))
- log.debug("query: {}".format(query))
- representations = self.connection.find(query)
-
- return representations
-
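The query in `get_sync_representations` pairs two `$elemMatch` clauses per branch: one site must already carry `created_dt`, the other must lack it while still being under the retry limit. The 'needs upload' branch built as a plain dict (site names here are illustrative):

```python
retries_arr = [0, 1, 2, None]  # what _get_retries_arr yields for retry_cnt == 3
needs_upload = {"$and": [
    {"files.sites": {"$elemMatch": {
        "name": "studio",                     # active site: file present
        "created_dt": {"$exists": True}}}},
    {"files.sites": {"$elemMatch": {
        "name": {"$in": ["gdrive"]},          # remote site: file missing
        "created_dt": {"$exists": False},
        "tries": {"$in": retries_arr}}}},     # and not over the retry limit
]}
```

Each `$elemMatch` must be satisfied by a single array element, which is why the two site conditions cannot be merged into one clause.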
- def check_status(self, file, local_site, remote_site, config_preset):
- """
- Check synchronization status for single 'file' of single
- 'representation' by single 'provider'.
- (Eg. check if 'scene.ma' of lookdev.v10 should be synced to GDrive)
-
- Always compares the local record, eg. site with
- 'name' == self.presets[PROJECT_NAME]['config']["active_site"]
-
- Args:
- file (dictionary): of file from representation in Mongo
- local_site (string): - local side of compare (usually 'studio')
- remote_site (string): - gdrive etc.
- config_preset (dict): config about active site, retries
- Returns:
- (string) - one of SyncStatus
- """
- sites = file.get("sites") or []
- # if isinstance(sites, list): # temporary, old format of 'sites'
- # return SyncStatus.DO_NOTHING
- _, remote_rec = self._get_site_rec(sites, remote_site) or {}
- if remote_rec: # sync remote target
- created_dt = remote_rec.get("created_dt")
- if not created_dt:
- tries = self._get_tries_count_from_rec(remote_rec)
- # file will be skipped if unsuccessfully tried over threshold
- # error metadata needs to be purged manually in DB to reset
- if tries < int(config_preset["retry_cnt"]):
- return SyncStatus.DO_UPLOAD
- else:
- _, local_rec = self._get_site_rec(sites, local_site) or {}
- if not local_rec or not local_rec.get("created_dt"):
- tries = self._get_tries_count_from_rec(local_rec)
- # file will be skipped if unsuccessfully tried over
- # threshold times, error metadata needs to be purged
- # manually in DB to reset
- if tries < int(config_preset["retry_cnt"]):
- return SyncStatus.DO_DOWNLOAD
-
- return SyncStatus.DO_NOTHING
-
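Stripped of the Mongo record plumbing, the decision in `check_status` reduces to comparing the two site records of one file. A pure-Python sketch (plain status strings stand in for the SyncStatus members):

```python
def check_status(sites, local_site, remote_site, retry_cnt):
    def rec(name):
        # first site record matching 'name', or None
        return next((s for s in sites if s.get("name") == name), None)

    remote = rec(remote_site)
    if remote:  # remote target listed but possibly not synced yet
        if not remote.get("created_dt") and remote.get("tries", 0) < retry_cnt:
            return "DO_UPLOAD"
    else:
        local = rec(local_site)
        if not local or not local.get("created_dt"):
            if (local or {}).get("tries", 0) < retry_cnt:
                return "DO_DOWNLOAD"
    return "DO_NOTHING"
```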
- async def upload(self, collection, file, representation, provider_name,
- remote_site_name, tree=None, preset=None):
- """
- Upload single 'file' of a 'representation' to 'provider'.
- Source url is taken from 'file' portion, where {root} placeholder
- is replaced by 'representation.Context.root'
- Provider could be one of implemented in provider.py.
-
- Updates MongoDB, fills in id of file from provider (ie. file_id
- from GDrive), 'created_dt' - time of upload
-
- 'provider_name' doesn't have to match 'site_name', single
- provider (GDrive) might have multiple sites ('projectA',
- 'projectB')
-
- Args:
- collection (str): source collection
- file (dictionary): of file from representation in Mongo
- representation (dictionary): of representation
- provider_name (string): gdrive, gdc etc.
- site_name (string): site on provider, single provider(gdrive) could
- have multiple sites (different accounts, credentials)
- tree (dictionary): injected memory structure for performance
- preset (dictionary): site config ('credentials_url', 'root'...)
-
- """
- # create ids sequentially, upload file in parallel later
- with self.lock:
- # this part modifies structure on 'remote_site', only single
- # thread can do that at a time, upload/download to prepared
- # structure should be run in parallel
- remote_handler = lib.factory.get_provider(provider_name,
- remote_site_name,
- tree=tree,
- presets=preset)
-
- file_path = file.get("path", "")
- local_file_path, remote_file_path = self._resolve_paths(
- file_path, collection, remote_site_name, remote_handler
- )
-
- target_folder = os.path.dirname(remote_file_path)
- folder_id = remote_handler.create_folder(target_folder)
-
- if not folder_id:
- err = "Folder {} wasn't created. Check permissions.".\
- format(target_folder)
- raise NotADirectoryError(err)
-
- loop = asyncio.get_running_loop()
- file_id = await loop.run_in_executor(None,
- remote_handler.upload_file,
- local_file_path,
- remote_file_path,
- self,
- collection,
- file,
- representation,
- remote_site_name,
- True
- )
- return file_id
-
- async def download(self, collection, file, representation, provider_name,
- remote_site_name, tree=None, preset=None):
- """
- Downloads file to local folder denoted in representation.Context.
-
- Args:
- collection (str): source collection
- file (dictionary) : info about processed file
- representation (dictionary): repr that 'file' belongs to
- provider_name (string): 'gdrive' etc
- site_name (string): site on provider, single provider(gdrive) could
- have multiple sites (different accounts, credentials)
- tree (dictionary): injected memory structure for performance
- preset (dictionary): site config ('credentials_url', 'root'...)
-
- Returns:
- (string) - 'name' of local file
- """
- with self.lock:
- remote_handler = lib.factory.get_provider(provider_name,
- remote_site_name,
- tree=tree,
- presets=preset)
-
- file_path = file.get("path", "")
- local_file_path, remote_file_path = self._resolve_paths(
- file_path, collection, remote_site_name, remote_handler
- )
-
- local_folder = os.path.dirname(local_file_path)
- os.makedirs(local_folder, exist_ok=True)
-
- local_site = self.get_active_site(collection)
-
- loop = asyncio.get_running_loop()
- file_id = await loop.run_in_executor(None,
- remote_handler.download_file,
- remote_file_path,
- local_file_path,
- self,
- collection,
- file,
- representation,
- local_site,
- True
- )
- return file_id
-
- def update_db(self, collection, new_file_id, file, representation,
- site, error=None, progress=None):
- """
- Update 'provider' portion of records in DB with success (file_id)
- or error (exception)
-
- Args:
- collection (string): name of project - force to db connection as
- each file might come from different collection
- new_file_id (string):
- file (dictionary): info about processed file (pulled from DB)
- representation (dictionary): parent repr of file (from DB)
- site (string): label ('gdrive', 'S3')
- error (string): exception message
- progress (float): 0-1 of progress of upload/download
-
- Returns:
- None
- """
- representation_id = representation.get("_id")
- file_id = file.get("_id")
- query = {
- "_id": representation_id
- }
-
- update = {}
- if new_file_id:
- update["$set"] = self._get_success_dict(new_file_id)
- # reset previous errors if any
- update["$unset"] = self._get_error_dict("", "", "")
- elif progress is not None:
- update["$set"] = self._get_progress_dict(progress)
- else:
- tries = self._get_tries_count(file, site)
- tries += 1
-
- update["$set"] = self._get_error_dict(error, tries)
-
- arr_filter = [
- {'s.name': site},
- {'f._id': ObjectId(file_id)}
- ]
-
- self.connection.database[collection].update_one(
- query,
- update,
- upsert=True,
- array_filters=arr_filter
- )
-
- if progress is not None:
- return
-
- status = 'failed'
- error_str = 'with error {}'.format(error)
- if new_file_id:
- status = 'succeeded with id {}'.format(new_file_id)
- error_str = ''
-
- source_file = file.get("path", "")
- log.debug("File for {} - {source_file} process {status} {error_str}".
- format(representation_id,
- status=status,
- source_file=source_file,
- error_str=error_str))
-
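`update_db` relies on MongoDB filtered positional operators: `$[f]` and `$[s]` in the update paths are bound by `array_filters`, so only the matching file and site sub-documents are touched. The success-case update document, sketched with placeholder ids (values in `$unset` are ignored by MongoDB; only the keys matter):

```python
from datetime import datetime

new_file_id = "remote-id-123"     # hypothetical id returned by the provider
site = "gdrive"
file_id = "placeholder-file-id"   # would be an ObjectId in the real call

update = {
    # $[f]/$[s] are resolved against array_filters below
    "$set": {
        "files.$[f].sites.$[s].id": new_file_id,
        "files.$[f].sites.$[s].created_dt": datetime.now(),
    },
    # clear any previous error metadata
    "$unset": {
        "files.$[f].sites.$[s].last_failed_dt": "",
        "files.$[f].sites.$[s].error": "",
        "files.$[f].sites.$[s].tries": "",
        "files.$[f].sites.$[s].progress": "",
    },
}
arr_filter = [{"s.name": site}, {"f._id": file_id}]
# connection.database[collection].update_one(
#     query, update, upsert=True, array_filters=arr_filter)
```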
- def _get_file_info(self, files, _id):
- """
- Return record from list of records which name matches to 'provider'
- Could be possibly refactored with '_get_provider_rec' together.
-
- Args:
- files (list): of dictionaries with info about published files
- _id (string): _id of specific file
-
- Returns:
- (int, dictionary): index from list and record with metadata
- about site (if/when created, errors..)
- OR (-1, None) if not present
- """
- for index, rec in enumerate(files):
- if rec.get("_id") == _id:
- return index, rec
-
- return -1, None
-
- def _get_site_rec(self, sites, site_name):
- """
- Return record from list of records which name matches to
- 'remote_site_name'
-
- Args:
- sites (list): of dictionaries
- site_name (string): 'local_XXX', 'gdrive'
-
- Returns:
- (int, dictionary): index from list and record with metadata
- about site (if/when created, errors..)
- OR (-1, None) if not present
- """
- for index, rec in enumerate(sites):
- if rec.get("name") == site_name:
- return index, rec
-
- return -1, None
-
- def reset_provider_for_file(self, collection, representation_id,
- side=None, file_id=None, site_name=None,
- remove=False, pause=None):
- """
- Reset information about synchronization for particular 'file_id'
- and provider.
- Useful for testing or forcing file to be reuploaded.
-
- 'side' and 'site_name' are disjunctive.
-
- 'side' is used for resetting local or remote side for
- current user for repre.
-
- 'site_name' is used to set synchronization for particular site.
- Should be used when repre should be synced to new site.
-
- Args:
- collection (string): name of project (eg. collection) in DB
- representation_id(string): _id of representation
- file_id (string): file _id in representation
- side (string): local or remote side
- site_name (string): for adding new site
- remove (bool): if True remove site altogether
- pause (bool or None): if True - pause, False - unpause
-
- Returns:
- throws ValueError
- """
- query = {
- "_id": ObjectId(representation_id)
- }
-
- representation = list(self.connection.database[collection].find(query))
- if not representation:
- raise ValueError("Representation {} not found in {}".
- format(representation_id, collection))
- if side and site_name:
- raise ValueError("Misconfiguration, only one of side and " +
- "site_name arguments should be passed.")
-
- local_site = self.get_active_site(collection)
- remote_site = self.get_remote_site(collection)
-
- if side:
- if side == 'local':
- site_name = local_site
- else:
- site_name = remote_site
-
- elem = {"name": site_name}
-
- if file_id: # reset site for particular file
- self._reset_site_for_file(collection, query,
- elem, file_id, site_name)
- elif side: # reset site for whole representation
- self._reset_site(collection, query, elem, site_name)
- elif remove: # remove site for whole representation
- self._remove_site(collection, query, representation, site_name)
- elif pause is not None:
- self._pause_unpause_site(collection, query,
- representation, site_name, pause)
- else: # add new site to all files for representation
- self._add_site(collection, query, representation, elem, site_name)
-
- def _update_site(self, collection, query, update, arr_filter):
- """
- Auxiliary method to call update_one function on DB
-
- Used for refactoring ugly reset_provider_for_file
- """
- self.connection.database[collection].update_one(
- query,
- update,
- upsert=True,
- array_filters=arr_filter
- )
-
- def _reset_site_for_file(self, collection, query,
- elem, file_id, site_name):
- """
- Resets 'site_name' for 'file_id' on representation in 'query' on
- 'collection'
- """
- update = {
- "$set": {"files.$[f].sites.$[s]": elem}
- }
- arr_filter = [
- {'s.name': site_name},
- {'f._id': ObjectId(file_id)}
- ]
-
- self._update_site(collection, query, update, arr_filter)
-
- def _reset_site(self, collection, query, elem, site_name):
- """
- Resets 'site_name' for all files of representation in 'query'
- """
- update = {
- "$set": {"files.$[].sites.$[s]": elem}
- }
-
- arr_filter = [
- {'s.name': site_name}
- ]
-
- self._update_site(collection, query, update, arr_filter)
-
- def _remove_site(self, collection, query, representation, site_name):
- """
- Removes 'site_name' for 'representation' in 'query'
-
- Throws ValueError if 'site_name' not found on 'representation'
- """
- found = False
- for file in representation.pop().get("files"):
- for site in file.get("sites"):
- if site["name"] == site_name:
- found = True
- break
- if not found:
- msg = "Site {} not found".format(site_name)
- log.info(msg)
- raise ValueError(msg)
-
- update = {
- "$pull": {"files.$[].sites": {"name": site_name}}
- }
- arr_filter = []
-
- self._update_site(collection, query, update, arr_filter)
-
- def _pause_unpause_site(self, collection, query,
- representation, site_name, pause):
- """
- Pauses/unpauses all files for 'representation' based on 'pause'
-
- Throws ValueError if 'site_name' not found on 'representation'
- """
- found = False
- site = None
- for file in representation.pop().get("files"):
- for site in file.get("sites"):
- if site["name"] == site_name:
- found = True
- break
- if not found:
- msg = "Site {} not found".format(site_name)
- log.info(msg)
- raise ValueError(msg)
-
- if pause:
- site['paused'] = pause
- else:
- if site.get('paused'):
- site.pop('paused')
-
- update = {
- "$set": {"files.$[].sites.$[s]": site}
- }
-
- arr_filter = [
- {'s.name': site_name}
- ]
-
- self._update_site(collection, query, update, arr_filter)
-
- def _add_site(self, collection, query, representation, elem, site_name):
- """
- Adds 'site_name' to 'representation' on 'collection'
-
- Throws ValueError if already present
- """
- for file in representation.pop().get("files"):
- for site in file.get("sites"):
- if site["name"] == site_name:
- msg = "Site {} already present".format(site_name)
- log.info(msg)
- raise ValueError(msg)
-
- update = {
- "$push": {"files.$[].sites": elem}
- }
-
- arr_filter = []
-
- self._update_site(collection, query, update, arr_filter)
-
- def _remove_local_file(self, collection, representation_id, site_name):
- """
- Removes all local files for 'site_name' of 'representation_id'
-
- Args:
- collection (string): project name (must match DB)
- representation_id (string): MongoDB _id value
- site_name (string): name of configured and active site
-
- Returns:
- only logs, catches IndexError and OSError
- """
- my_local_site = get_local_site_id()
- if my_local_site != site_name:
- self.log.warning("Cannot remove non-local file for {}".
- format(site_name))
- return
-
- provider_name = self.get_provider_for_site(collection, site_name)
- handler = lib.factory.get_provider(provider_name, site_name)
-
- if handler and isinstance(handler, LocalDriveHandler):
- query = {
- "_id": ObjectId(representation_id)
- }
-
- representation = list(
- self.connection.database[collection].find(query))
- if not representation:
- self.log.debug("No repre {} found".format(
- representation_id))
- return
-
- representation = representation.pop()
- local_file_path = ''
- for file in representation.get("files"):
- local_file_path, _ = self._resolve_paths(file.get("path", ""),
- collection
- )
- try:
- self.log.debug("Removing {}".format(local_file_path))
- os.remove(local_file_path)
- except IndexError:
- msg = "No file set for {}".format(representation_id)
- self.log.debug(msg)
- raise ValueError(msg)
- except OSError:
- msg = "File {} cannot be removed".format(file["path"])
- self.log.warning(msg)
- raise ValueError(msg)
-
- try:
- folder = os.path.dirname(local_file_path)
- os.rmdir(folder)
- except OSError:
- msg = "folder {} cannot be removed".format(folder)
- self.log.warning(msg)
- raise ValueError(msg)
-
- def get_loop_delay(self, project_name):
- """
- Return count of seconds before next synchronization loop starts
- after finish of previous loop.
- Returns:
- (int): in seconds
- """
- ld = self.sync_project_settings[project_name]["config"]["loop_delay"]
- return int(ld)
-
- def show_widget(self):
- """Show dialog to enter credentials"""
- self.widget.show()
-
- def _get_success_dict(self, new_file_id):
- """
- Provide success metadata ("id", "created_dt") to be stored in Db.
- Used in $set: "DICT" part of query.
- Sites are array inside of array(file), so real indexes for both
- file and site are needed for upgrade in DB.
- Args:
- new_file_id: id of created file
- Returns:
- (dictionary)
- """
- val = {"files.$[f].sites.$[s].id": new_file_id,
- "files.$[f].sites.$[s].created_dt": datetime.now()}
- return val
-
- def _get_error_dict(self, error="", tries="", progress=""):
- """
- Provide error metadata to be stored in Db.
- Used for set (error and tries provided) or unset mode.
- Args:
- error: (string) - message
- tries: how many times failed
- Returns:
- (dictionary)
- """
- val = {"files.$[f].sites.$[s].last_failed_dt": datetime.now(),
- "files.$[f].sites.$[s].error": error,
- "files.$[f].sites.$[s].tries": tries,
- "files.$[f].sites.$[s].progress": progress
- }
- return val
-
- def _get_tries_count_from_rec(self, rec):
- """
- Get number of failed attempts to sync from site record
- Args:
- rec (dictionary): info about specific site record
- Returns:
- (int) - number of failed attempts
- """
- if not rec:
- return 0
- return rec.get("tries", 0)
-
- def _get_tries_count(self, file, provider):
- """
- Get number of failed attempts to sync
- Args:
- file (dictionary): info about specific file
- provider (string): name of site ('gdrive' or specific user site)
- Returns:
- (int) - number of failed attempts
- """
- _, rec = self._get_site_rec(file.get("sites", []), provider)
- return rec.get("tries", 0)
-
- def _get_progress_dict(self, progress):
- """
- Provide progress metadata to be stored in Db.
- Used during upload/download for GUI to show.
- Args:
- progress: (float) - 0-1 progress of upload/download
- Returns:
- (dictionary)
- """
- val = {"files.$[f].sites.$[s].progress": progress}
- return val
-
- def _resolve_paths(self, file_path, collection,
- remote_site_name=None, remote_handler=None):
- """
- Returns tuple of local and remote file paths with {root}
- placeholders replaced with proper values from Settings or Anatomy
-
- Args:
- file_path(string): path with {root}
- collection(string): project name
- remote_site_name(string): remote site
- remote_handler(AbstractProvider): implementation
- Returns:
- (string, string) - proper absolute paths
- """
- remote_file_path = ''
- if remote_handler:
- root_configs = self._get_roots_config(self.sync_project_settings,
- collection,
- remote_site_name)
-
- remote_file_path = remote_handler.resolve_path(file_path,
- root_configs)
-
- local_handler = lib.factory.get_provider(
- 'local_drive', self.get_active_site(collection))
- local_file_path = local_handler.resolve_path(
- file_path, None, self.get_anatomy(collection))
-
- return local_file_path, remote_file_path
-
- def _get_retries_arr(self, project_name):
- """
- Returns array with allowed values in 'tries' field. If repre
- contains these values, it means synchronization was attempted
- but failed. We try up to 'self.presets["retry_cnt"]' times before
- giving up and skipping representation.
- Returns:
- (list)
- """
- retry_cnt = self.sync_project_settings[project_name].\
- get("config")["retry_cnt"]
- arr = [i for i in range(int(retry_cnt))]
- arr.append(None)
-
- return arr
-
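The retry array doubles as a query filter: a file is retried while its 'tries' value is below retry_cnt, and `None` matches files carrying no 'tries' field at all. For example:

```python
retry_cnt = 3
arr = [i for i in range(int(retry_cnt))]
arr.append(None)  # matches files never tried before
```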
- def _get_roots_config(self, presets, project_name, site_name):
- """
- Returns configured root(s) for 'project_name' and 'site_name' from
- settings ('presets')
- """
- return presets[project_name]['sites'][site_name]['root']
-
+ initiated_handlers = {}
+ configured_sites = {}
+ all_sites = module._get_default_site_configs()
+ all_sites.update(project_setting.get("sites"))
+ for site_name, config in all_sites.items():
+ handler = initiated_handlers. \
+ get((config["provider"], site_name))
+ if not handler:
+ handler = lib.factory.get_provider(config["provider"],
+ project_name,
+ site_name,
+ presets=config)
+ initiated_handlers[(config["provider"], site_name)] = \
+ handler
+
+ if handler.is_active():
+ configured_sites[site_name] = True
+
+ return configured_sites
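`_get_configured_sites_from_setting` constructs each provider handler at most once by memoizing it under a `(provider, site_name)` key. The pattern in isolation (the factory argument is a stand-in for `lib.factory.get_provider`):

```python
initiated_handlers = {}

def get_handler(provider, site_name, factory):
    key = (provider, site_name)
    if key not in initiated_handlers:
        # construct the (potentially expensive) handler only once
        initiated_handlers[key] = factory(provider, site_name)
    return initiated_handlers[key]

first = get_handler("gdrive", "studio", lambda p, s: object())
second = get_handler("gdrive", "studio", lambda p, s: object())
```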
class SyncServerThread(threading.Thread):
"""
@@ -1437,7 +271,7 @@ class SyncServerThread(threading.Thread):
import time
start_time = None
self.module.set_sync_project_settings() # clean cache
- for collection, preset in self.module.get_sync_project_settings().\
+ for collection, preset in self.module.sync_project_settings.\
items():
start_time = time.time()
local_site, remote_site = self._working_sites(collection)
@@ -1462,6 +296,7 @@ class SyncServerThread(threading.Thread):
site_preset = preset.get('sites')[remote_site]
remote_provider = site_preset['provider']
handler = lib.factory.get_provider(remote_provider,
+ collection,
remote_site,
presets=site_preset)
limit = lib.factory.get_provider_batch_limit(
@@ -1491,13 +326,14 @@ class SyncServerThread(threading.Thread):
tree = handler.get_tree()
limit -= 1
task = asyncio.create_task(
- self.module.upload(collection,
- file,
- sync,
- remote_provider,
- remote_site,
- tree,
- site_preset))
+ upload(self.module,
+ collection,
+ file,
+ sync,
+ remote_provider,
+ remote_site,
+ tree,
+ site_preset))
task_files_to_process.append(task)
+ # store info for exception handling
files_processed_info.append((file,
@@ -1510,13 +346,14 @@ class SyncServerThread(threading.Thread):
tree = handler.get_tree()
limit -= 1
task = asyncio.create_task(
- self.module.download(collection,
- file,
- sync,
- remote_provider,
- remote_site,
- tree,
- site_preset))
+ download(self.module,
+ collection,
+ file,
+ sync,
+ remote_provider,
+ remote_site,
+ tree,
+ site_preset))
task_files_to_process.append(task)
files_processed_info.append((file,
@@ -1592,8 +429,8 @@ class SyncServerThread(threading.Thread):
remote_site))
return None, None
- if not all([self.module.site_is_working(collection, local_site),
- self.module.site_is_working(collection, remote_site)]):
+ if not all([site_is_working(self.module, collection, local_site),
+ site_is_working(self.module, collection, remote_site)]):
log.debug("Some of the sites {} - {} is not ".format(local_site,
remote_site) +
"working properly")
diff --git a/openpype/modules/sync_server/sync_server_module.py b/openpype/modules/sync_server/sync_server_module.py
new file mode 100644
index 0000000000..59c3787789
--- /dev/null
+++ b/openpype/modules/sync_server/sync_server_module.py
@@ -0,0 +1,1193 @@
+import os
+from bson.objectid import ObjectId
+from datetime import datetime
+import threading
+
+from avalon.api import AvalonMongoDB
+
+from .. import PypeModule, ITrayModule
+from openpype.api import (
+ Anatomy,
+ get_project_settings,
+ get_local_site_id)
+from openpype.lib import PypeLogger
+
+from .providers.local_drive import LocalDriveHandler
+
+from .utils import time_function, SyncStatus
+
+
+log = PypeLogger().get_logger("SyncServer")
+
+
+class SyncServerModule(PypeModule, ITrayModule):
+ """
+ Synchronization server that is syncing published files from local to
+ any of implemented providers (like GDrive, S3 etc.)
+ Runs in the background and checks all representations, looks for files
+ that are marked to be in different location than 'studio' (temporary),
+ checks if 'created_dt' field is present denoting successful sync
+ with provider destination.
+ Sites structure is created during publish OR by calling 'add_site'
+ method.
+
+ By default it will always contain 1 record with
+ "name" == self.presets["active_site"] and
+ filled "created_dt" AND 1 or multiple records for all defined
+ remote sites, where "created_dt" is not present.
+ This highlights that the file should be uploaded to the
+ remote destination.
+
+ ''' - example of synced file test_Cylinder_lookMain_v010.ma to GDrive
+ "files" : [
+ {
+ "path" : "{root}/Test/Assets/Cylinder/publish/look/lookMain/v010/
+ test_Cylinder_lookMain_v010.ma",
+ "_id" : ObjectId("5eeb25e411e06a16209ab78f"),
+ "hash" : "test_Cylinder_lookMain_v010,ma|1592468963,24|4822",
+ "size" : NumberLong(4822),
+ "sites" : [
+ {
+ "name": "john_local_XD4345",
+ "created_dt" : ISODate("2020-05-22T08:05:44.000Z")
+ },
+ {
+ "id" : ObjectId("5eeb25e411e06a16209ab78f"),
+ "name": "gdrive",
+ "created_dt" : ISODate("2020-05-25T08:54:35.833Z")
+ }
+ ]
+ },
+ '''
+ Each Tray app has its own self.presets["local_id"] assigned,
+ used in sites as a name.
+ Tray searches only for records where the name matches its
+ self.presets["active_site"] + self.presets["remote_site"].
+ "active_site" could be the studio storage ('studio') or a specific
+ "local_id" when the user works disconnected from the studio.
+ If the local record has its "created_dt" filled, it is a source and
+ the process will try to upload the file to all defined remote sites.
+
+ A remote file's "id" is a real id usable in the appropriate provider
+ API. Local files have an "id" too for conformity; it contains just
+ the file name.
+ It is expected that multiple providers will be implemented in separate
+ classes and registered in 'providers.py'.
+
+ """
+ # limit querying DB to look for X number of representations that
+ # should be synced; we prefer to run more loops with fewer records
+ # actual number of files synced could be lower as providers can have
+ # different limits imposed by their APIs
+ # set 0 for no limit
+ REPRESENTATION_LIMIT = 100
+ DEFAULT_SITE = 'studio'
+ LOCAL_SITE = 'local'
+ LOG_PROGRESS_SEC = 5 # how often log progress to DB
+
+ name = "sync_server"
+ label = "Sync Queue"
+
+ def initialize(self, module_settings):
+ """
+ Called during Module Manager creation.
+
+ Collects needed data, checks asyncio presence.
+ Sets 'enabled' according to global settings for the module.
+ Shouldn't do any initialization, that's a job for 'tray_init'
+ """
+ self.enabled = module_settings[self.name]["enabled"]
+
+ # some parts of code need to run sequentially, not in async
+ self.lock = None
+ # settings for all enabled projects for sync
+ self._sync_project_settings = None
+ self.sync_server_thread = None # asyncio requires new thread
+
+ self.action_show_widget = None
+ self._paused = False
+ self._paused_projects = set()
+ self._paused_representations = set()
+ self._anatomies = {}
+
+ self._connection = None
+
+ """ Start of Public API """
+ def add_site(self, collection, representation_id, site_name=None,
+ force=False):
+ """
+ Adds new site to representation to be synced.
+
+ 'collection' must have synchronization enabled (globally or
+ project only)
+
+ Used as an API endpoint from outside applications (Loader etc.)
+
+ Args:
+ collection (string): project name (must match DB)
+ representation_id (string): MongoDB _id value
+ site_name (string): name of configured and active site
+ force (bool): reset site if exists
+
+ Raises:
+ ValueError: if project is not configured for sync
+ """
+ if not self.get_sync_project_setting(collection):
+ raise ValueError("Project not configured")
+
+ if not site_name:
+ site_name = self.DEFAULT_SITE
+
+ self.reset_provider_for_file(collection,
+ representation_id,
+ site_name=site_name, force=force)
+
+ # public facing API
+ def remove_site(self, collection, representation_id, site_name,
+ remove_local_files=False):
+ """
+ Removes 'site_name' for particular 'representation_id' on
+ 'collection'
+
+ Args:
+ collection (string): project name (must match DB)
+ representation_id (string): MongoDB _id value
+ site_name (string): name of configured and active site
+ remove_local_files (bool): remove only files for 'local_id'
+ site
+
+ Raises:
+ ValueError: if project is not configured for sync
+ """
+ if not self.get_sync_project_setting(collection):
+ raise ValueError("Project not configured")
+
+ self.reset_provider_for_file(collection,
+ representation_id,
+ site_name=site_name,
+ remove=True)
+ if remove_local_files:
+ self._remove_local_file(collection, representation_id, site_name)
+
+ def clear_project(self, collection, site_name):
+ """
+ Clear 'collection' of 'site_name' and its local files
+
+ Works only on real local sites, not on 'studio'
+ """
+ query = {
+ "type": "representation",
+ "files.sites.name": site_name
+ }
+
+ representations = list(
+ self.connection.database[collection].find(query))
+ if not representations:
+ self.log.debug("No repre found")
+ return
+
+ for repre in representations:
+ self.remove_site(collection, repre.get("_id"), site_name, True)
+
+ def pause_representation(self, collection, representation_id, site_name):
+ """
+ Sets 'representation_id' as paused, eg. no syncing should be
+ happening on it.
+
+ Args:
+ collection (string): project name
+ representation_id (string): MongoDB objectId value
+ site_name (string): 'gdrive', 'studio' etc.
+ """
+ log.info("Pausing SyncServer for {}".format(representation_id))
+ self._paused_representations.add(representation_id)
+ self.reset_provider_for_file(collection, representation_id,
+ site_name=site_name, pause=True)
+
+ def unpause_representation(self, collection, representation_id, site_name):
+ """
+ Sets 'representation_id' as unpaused.
+
+ Does not fail or warn if repre wasn't paused.
+
+ Args:
+ collection (string): project name
+ representation_id (string): MongoDB objectId value
+ site_name (string): 'gdrive', 'studio' etc.
+ """
+ log.info("Unpausing SyncServer for {}".format(representation_id))
+ try:
+ self._paused_representations.remove(representation_id)
+ except KeyError:
+ pass
+ # self.paused_representations is not persistent
+ self.reset_provider_for_file(collection, representation_id,
+ site_name=site_name, pause=False)
+
+ def is_representation_paused(self, representation_id,
+ check_parents=False, project_name=None):
+ """
+ Returns if 'representation_id' is paused or not.
+
+ Args:
+ representation_id (string): MongoDB objectId value
+ check_parents (bool): check if parent project or server itself
+ are not paused
+ project_name (string): project to check if paused
+
+ if 'check_parents', 'project_name' should be set too
+ Returns:
+ (bool)
+ """
+ condition = representation_id in self._paused_representations
+ if check_parents and project_name:
+ condition = condition or \
+ self.is_project_paused(project_name) or \
+ self.is_paused()
+ return condition
+
+ def pause_project(self, project_name):
+ """
+ Sets 'project_name' as paused, eg. no syncing should be
+ happening on all representations inside.
+
+ Args:
+ project_name (string): collection name
+ """
+ log.info("Pausing SyncServer for {}".format(project_name))
+ self._paused_projects.add(project_name)
+
+ def unpause_project(self, project_name):
+ """
+ Sets 'project_name' as unpaused
+
+ Does not fail or warn if project wasn't paused.
+
+ Args:
+ project_name (string): collection name
+ """
+ log.info("Unpausing SyncServer for {}".format(project_name))
+ try:
+ self._paused_projects.remove(project_name)
+ except KeyError:
+ pass
+
+ def is_project_paused(self, project_name, check_parents=False):
+ """
+ Returns if 'project_name' is paused or not.
+
+ Args:
+ project_name (string): collection name
+ check_parents (bool): check if server itself
+ is not paused
+ Returns:
+ (bool)
+ """
+ condition = project_name in self._paused_projects
+ if check_parents:
+ condition = condition or self.is_paused()
+ return condition
+
+ def pause_server(self):
+ """
+ Pause sync server
+
+ It won't check anything, not uploading/downloading...
+ """
+ log.info("Pausing SyncServer")
+ self._paused = True
+
+ def unpause_server(self):
+ """
+ Unpause server
+ """
+ log.info("Unpausing SyncServer")
+ self._paused = False
+
+ def is_paused(self):
+ """ Is server paused """
+ return self._paused
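The pause methods above cascade from a single representation through its project up to the whole server. A minimal, dependency-free sketch of that hierarchy (class and attribute names are illustrative, not the module's real API):

```python
class PauseState:
    """Sketch of the three-level pause hierarchy: repre -> project -> server."""

    def __init__(self):
        self.server_paused = False
        self.paused_projects = set()
        self.paused_representations = set()

    def is_representation_paused(self, repre_id,
                                 check_parents=False, project_name=None):
        paused = repre_id in self.paused_representations
        if check_parents and project_name:
            # a paused project or a paused server pauses everything below it
            paused = (paused
                      or project_name in self.paused_projects
                      or self.server_paused)
        return paused


state = PauseState()
state.paused_projects.add("test_project")
print(state.is_representation_paused("repre_1"))  # False
print(state.is_representation_paused(
    "repre_1", check_parents=True, project_name="test_project"))  # True
```

Note that `check_parents` is opt-in, which is why pausing a project alone does not change answers for callers that query only the representation level.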
+
+ def get_active_sites(self, project_name):
+ """
+ Returns list of active sites for 'project_name'.
+
+ By default it returns ['studio']; this site is the default
+ and always present even if SyncServer is not enabled (for publish).
+
+ Used mainly for Local settings for user override.
+
+ Args:
+ project_name (string):
+
+ Returns:
+ (list) of strings
+ """
+ return self.get_active_sites_from_settings(
+ get_project_settings(project_name))
+
+ def get_active_sites_from_settings(self, settings):
+ """
+ List available active sites from incoming 'settings'. Used for
+ returning 'default' values for Local Settings
+
+ Args:
+ settings (dict): full settings (global + project)
+ Returns:
+ (list) of strings
+ """
+ sync_settings = self._parse_sync_settings_from_settings(settings)
+
+ return self._get_enabled_sites_from_settings(sync_settings)
+
+ def get_configurable_items_for_site(self, project_name, site_name):
+ """
+ Returns list of items that should be configurable by User
+
+ Returns:
+ (list of dict)
+ [{key:"root", label:"root", value:"valueFromSettings"}]
+ """
+ # if project_name is None: ..for get_default_project_settings
+ # return handler.get_configurable_items()
+ pass
+
+ def get_active_site(self, project_name):
+ """
+ Returns active (mine) site for 'project_name' from settings
+
+ Returns:
+ (string)
+ """
+ active_site = self.get_sync_project_setting(
+ project_name)['config']['active_site']
+ if active_site == self.LOCAL_SITE:
+ return get_local_site_id()
+ return active_site
+
+ # remote sites
+ def get_remote_sites(self, project_name):
+ """
+ Returns all remote sites configured on 'project_name'.
+
+ If 'project_name' is not enabled for syncing returns [].
+
+ Used by Local setting to allow user choose remote site.
+
+ Args:
+ project_name (string):
+
+ Returns:
+ (list) of strings
+ """
+ return self.get_remote_sites_from_settings(
+ get_project_settings(project_name))
+
+ def get_remote_sites_from_settings(self, settings):
+ """
+ Get remote sites for returning 'default' values for Local Settings
+ """
+ sync_settings = self._parse_sync_settings_from_settings(settings)
+
+ return self._get_remote_sites_from_settings(sync_settings)
+
+ def get_remote_site(self, project_name):
+ """
+ Returns remote (theirs) site for 'project_name' from settings
+ """
+ remote_site = self.get_sync_project_setting(
+ project_name)['config']['remote_site']
+ if remote_site == self.LOCAL_SITE:
+ return get_local_site_id()
+
+ return remote_site
+
+ """ End of Public API """
+
+ def get_local_file_path(self, collection, site_name, file_path):
+ """
+ Externalized for app
+ """
+ handler = LocalDriveHandler(collection, site_name)
+ local_file_path = handler.resolve_path(file_path)
+
+ return local_file_path
+
+ def _get_remote_sites_from_settings(self, sync_settings):
+ if not self.enabled or not sync_settings.get('enabled'):
+ return []
+
+ remote_sites = [self.DEFAULT_SITE, self.LOCAL_SITE]
+ if sync_settings:
+ remote_sites.extend(sync_settings.get("sites").keys())
+
+ return list(set(remote_sites))
+
+ def _get_enabled_sites_from_settings(self, sync_settings):
+ sites = [self.DEFAULT_SITE]
+ if self.enabled and sync_settings.get('enabled'):
+ sites.append(self.LOCAL_SITE)
+
+ return sites
+
+ def connect_with_modules(self, *_a, **kw):
+ return
+
+ def tray_init(self):
+ """
+ Actual initialization of Sync Server.
+
+ Called when tray is initialized, it checks if module should be
+ enabled. If not, no initialization necessary.
+ """
+ # import only in tray, because of Python2 hosts
+ from .sync_server import SyncServerThread
+
+ if not self.enabled:
+ return
+
+ self.lock = threading.Lock()
+
+ try:
+ self.sync_server_thread = SyncServerThread(self)
+ from .tray.app import SyncServerWindow
+ self.widget = SyncServerWindow(self)
+ except ValueError:
+ log.info("No system setting for sync. Not syncing.", exc_info=True)
+ self.enabled = False
+ except KeyError:
+ log.info(
+ "No presets set for SyncServer or "
+ "credentials provided are invalid, "
+ "no syncing possible", exc_info=True)
+ self.enabled = False
+
+ def tray_start(self):
+ """
+ Triggered when Tray is started.
+
+ Checks if configuration presets are available and if there is
+ any provider ('gdrive', 'S3') that is activated
+ (eg. has valid credentials).
+
+ Returns:
+ None
+ """
+ if self.sync_project_settings and self.enabled:
+ self.sync_server_thread.start()
+ else:
+ log.info("No presets or active providers. " +
+ "Synchronization not possible.")
+
+ def tray_exit(self):
+ """
+ Stops sync thread if running.
+
+ Called from Module Manager
+ """
+ if not self.sync_server_thread:
+ return
+
+ if not self.is_running:
+ return
+ try:
+ log.info("Stopping sync server server")
+ self.sync_server_thread.is_running = False
+ self.sync_server_thread.stop()
+ except Exception:
+ log.warning(
+ "Error has happened during Killing sync server",
+ exc_info=True
+ )
+
+ def tray_menu(self, parent_menu):
+ """Add menu or action to Tray (or parent)'s menu"""
+ if not self.enabled:
+ return
+
+ from Qt import QtWidgets
+ action = QtWidgets.QAction(self.label, parent_menu)
+ action.triggered.connect(self.show_widget)
+ parent_menu.addAction(action)
+ parent_menu.addSeparator()
+
+ self.action_show_widget = action
+
+ @property
+ def is_running(self):
+ return self.sync_server_thread.is_running
+
+ def get_anatomy(self, project_name):
+ """
+ Get already created or newly created anatomy for project
+
+ Args:
+ project_name (string):
+
+ Return:
+ (Anatomy)
+ """
+ return self._anatomies.get(project_name) or Anatomy(project_name)
+
+ @property
+ def connection(self):
+ if self._connection is None:
+ self._connection = AvalonMongoDB()
+
+ return self._connection
+
+ @property
+ def sync_project_settings(self):
+ if self._sync_project_settings is None:
+ self.set_sync_project_settings()
+
+ return self._sync_project_settings
+
+ def set_sync_project_settings(self):
+ """
+ Set sync_project_settings for all projects (caching)
+
+ For performance
+ """
+ sync_project_settings = {}
+
+ for collection in self.connection.database.collection_names(False):
+ sync_settings = self._parse_sync_settings_from_settings(
+ get_project_settings(collection))
+ if sync_settings:
+ default_sites = self._get_default_site_configs()
+ sync_settings['sites'].update(default_sites)
+ sync_project_settings[collection] = sync_settings
+
+ if not sync_project_settings:
+ log.info("No enabled and configured projects for sync.")
+
+ self._sync_project_settings = sync_project_settings
+
+ def get_sync_project_setting(self, project_name):
+ """ Handles pulling sync_server's settings for enabled 'project_name'
+
+ Args:
+ project_name (str): used in project settings
+ Returns:
+ (dict): settings dictionary for the enabled project,
+ empty if no settings or sync is disabled
+ """
+ # presets set already, do not call again and again
+ # self.log.debug("project preset {}".format(self.presets))
+ if self.sync_project_settings and \
+ self.sync_project_settings.get(project_name):
+ return self.sync_project_settings.get(project_name)
+
+ settings = get_project_settings(project_name)
+ return self._parse_sync_settings_from_settings(settings)
+
+ def _parse_sync_settings_from_settings(self, settings):
+ """ settings from api.get_project_settings, TOOD rename """
+ sync_settings = settings.get("global").get("sync_server")
+ if not sync_settings:
+ log.info("No project setting not syncing.")
+ return {}
+ if sync_settings.get("enabled"):
+ return sync_settings
+
+ return {}
+
+ def _get_default_site_configs(self):
+ """
+ Returns skeleton settings for 'studio' and user's local site
+ """
+ default_config = {'provider': 'local_drive'}
+ # use copies so per-site modifications do not leak between sites
+ all_sites = {self.DEFAULT_SITE: dict(default_config),
+ get_local_site_id(): dict(default_config)}
+ return all_sites
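`_get_default_site_configs` together with `set_sync_project_settings` guarantees that the two local-drive sites always exist and that project-configured sites are layered over them. A standalone sketch of that merge (site names taken from the docstring example, helper name hypothetical):

```python
def merge_site_configs(project_sites, local_site_id):
    # 'studio' and the user's own local site always exist and use the
    # local_drive provider; project-configured sites are layered on top
    defaults = {
        "studio": {"provider": "local_drive"},
        local_site_id: {"provider": "local_drive"},
    }
    merged = dict(defaults)
    merged.update(project_sites)
    return merged


sites = merge_site_configs({"gdrive": {"provider": "gdrive"}},
                           "john_local_XD4345")
print(sorted(sites))  # ['gdrive', 'john_local_XD4345', 'studio']
```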
+
+ def get_provider_for_site(self, project_name, site):
+ """
+ Return provider name for site.
+ """
+ site_preset = self.get_sync_project_setting(project_name)["sites"].\
+ get(site)
+ if site_preset:
+ return site_preset["provider"]
+
+ return "NA"
+
+ @time_function
+ def get_sync_representations(self, collection, active_site, remote_site):
+ """
+ Get representations that should be synced, these could be
+ recognised by presence of document in 'files.sites', where key is
+ a provider (GDrive, S3) and value is empty document or document
+ without 'created_dt' field. (Don't put null to 'created_dt'!).
+
+ Querying of 'to-be-synced' files is offloaded to MongoDB for
+ better performance. The goal is to get as few representations as
+ possible.
+ Args:
+ collection (string): name of collection (in most cases matches
+ project name)
+ active_site (string): identifier of current active site (could be
+ 'local_0' when working from home, 'studio' when working in the
+ studio - default)
+ remote_site (string): identifier of remote site to sync to
+
+ Returns:
+ (list) of dictionaries
+ """
+ log.debug("Check representations for : {}".format(collection))
+ self.connection.Session["AVALON_PROJECT"] = collection
+ # retry_cnt - number of attempts to sync specific file before giving up
+ retries_arr = self._get_retries_arr(collection)
+ query = {
+ "type": "representation",
+ "$or": [
+ {"$and": [
+ {
+ "files.sites": {
+ "$elemMatch": {
+ "name": active_site,
+ "created_dt": {"$exists": True}
+ }
+ }}, {
+ "files.sites": {
+ "$elemMatch": {
+ "name": {"$in": [remote_site]},
+ "created_dt": {"$exists": False},
+ "tries": {"$in": retries_arr}
+ }
+ }
+ }]},
+ {"$and": [
+ {
+ "files.sites": {
+ "$elemMatch": {
+ "name": active_site,
+ "created_dt": {"$exists": False},
+ "tries": {"$in": retries_arr}
+ }
+ }}, {
+ "files.sites": {
+ "$elemMatch": {
+ "name": {"$in": [remote_site]},
+ "created_dt": {"$exists": True}
+ }
+ }
+ }
+ ]}
+ ]
+ }
+ log.debug("active_site:{} - remote_site:{}".format(active_site,
+ remote_site))
+ log.debug("query: {}".format(query))
+ representations = self.connection.find(query)
+
+ return representations
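The `"tries": {"$in": retries_arr}` clauses in the query above rely on the `_get_retries_arr` helper (removed at the top of this diff): the allowed values are `0 .. retry_cnt-1` plus `None`, because in MongoDB a `$in` list containing null also matches documents missing the field, i.e. files never tried. A sketch of that helper and its Python-side meaning:

```python
def get_retries_arr(retry_cnt):
    # allowed values of the 'tries' field for a file that may still be
    # synced: 0 .. retry_cnt-1 for failed attempts, plus None - in a
    # MongoDB "$in" clause null also matches documents where the field
    # is missing entirely, i.e. files that were never tried
    return list(range(int(retry_cnt))) + [None]


def may_retry(file_tries, retry_cnt):
    # python-side equivalent of the {"tries": {"$in": retries_arr}} clause
    return file_tries in get_retries_arr(retry_cnt)


print(get_retries_arr(3))   # [0, 1, 2, None]
print(may_retry(None, 3))   # True  - never tried yet
print(may_retry(3, 3))      # False - retry threshold exhausted
```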
+
+ def check_status(self, file, local_site, remote_site, config_preset):
+ """
+ Check synchronization status for single 'file' of single
+ 'representation' by single 'provider'.
+ (Eg. check if 'scene.ma' of lookdev.v10 should be synced to GDrive)
+
+ Always compares the local record, eg. site with
+ 'name' == self.presets[PROJECT_NAME]['config']["active_site"]
+
+ Args:
+ file (dictionary): of file from representation in Mongo
+ local_site (string): - local side of compare (usually 'studio')
+ remote_site (string): - gdrive etc.
+ config_preset (dict): config about active site, retries
+ Returns:
+ (string) - one of SyncStatus
+ """
+ sites = file.get("sites") or []
+ # if isinstance(sites, list): # temporary, old format of 'sites'
+ # return SyncStatus.DO_NOTHING
+ _, remote_rec = self._get_site_rec(sites, remote_site) or (-1, None)
+ if remote_rec: # sync remote target
+ created_dt = remote_rec.get("created_dt")
+ if not created_dt:
+ tries = self._get_tries_count_from_rec(remote_rec)
+ # file will be skipped if unsuccessfully tried over threshold
+ # error metadata needs to be purged manually in DB to reset
+ if tries < int(config_preset["retry_cnt"]):
+ return SyncStatus.DO_UPLOAD
+ else:
+ _, local_rec = self._get_site_rec(sites, local_site) or (-1, None)
+ if not local_rec or not local_rec.get("created_dt"):
+ tries = self._get_tries_count_from_rec(local_rec)
+ # file will be skipped if unsuccessfully tried over
+ # threshold times, error metadata needs to be purged
+ # manually in DB to reset
+ if tries < int(config_preset["retry_cnt"]):
+ return SyncStatus.DO_DOWNLOAD
+
+ return SyncStatus.DO_NOTHING
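The branching in `check_status` is easier to see stripped of the module's plumbing. A dependency-free mirror of the decision (string results stand in for the `SyncStatus` constants; `file_rec` follows the document schema from the class docstring):

```python
def check_status(file_rec, local_site, remote_site, retry_cnt):
    """Simplified mirror of the upload/download decision above."""
    def site_rec(name):
        for rec in file_rec.get("sites") or []:
            if rec.get("name") == name:
                return rec
        return None

    remote_rec = site_rec(remote_site)
    if remote_rec:  # remote site is requested for this file
        if not remote_rec.get("created_dt") \
                and remote_rec.get("tries", 0) < retry_cnt:
            return "DO_UPLOAD"
    else:  # no remote record -> maybe the local side needs the file
        local_rec = site_rec(local_site)
        if (not local_rec or not local_rec.get("created_dt")) \
                and (local_rec or {}).get("tries", 0) < retry_cnt:
            return "DO_DOWNLOAD"
    return "DO_NOTHING"


file_rec = {"sites": [
    {"name": "studio", "created_dt": "2020-05-22"},
    {"name": "gdrive"},  # present but no 'created_dt' -> not uploaded yet
]}
print(check_status(file_rec, "studio", "gdrive", 3))  # DO_UPLOAD
```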
+
+ def update_db(self, collection, new_file_id, file, representation,
+ site, error=None, progress=None):
+ """
+ Update 'provider' portion of records in DB with success (file_id)
+ or error (exception)
+
+ Args:
+ collection (string): name of project - force to db connection as
+ each file might come from different collection
+ new_file_id (string):
+ file (dictionary): info about processed file (pulled from DB)
+ representation (dictionary): parent repr of file (from DB)
+ site (string): label ('gdrive', 'S3')
+ error (string): exception message
+ progress (float): 0-1 of progress of upload/download
+
+ Returns:
+ None
+ """
+ representation_id = representation.get("_id")
+ file_id = file.get("_id")
+ query = {
+ "_id": representation_id
+ }
+
+ update = {}
+ if new_file_id:
+ update["$set"] = self._get_success_dict(new_file_id)
+ # reset previous errors if any
+ update["$unset"] = self._get_error_dict("", "", "")
+ elif progress is not None:
+ update["$set"] = self._get_progress_dict(progress)
+ else:
+ tries = self._get_tries_count(file, site)
+ tries += 1
+
+ update["$set"] = self._get_error_dict(error, tries)
+
+ arr_filter = [
+ {'s.name': site},
+ {'f._id': ObjectId(file_id)}
+ ]
+
+ self.connection.database[collection].update_one(
+ query,
+ update,
+ upsert=True,
+ array_filters=arr_filter
+ )
+
+ if progress is not None:
+ return
+
+ status = 'failed'
+ error_str = 'with error {}'.format(error)
+ if new_file_id:
+ status = 'succeeded with id {}'.format(new_file_id)
+ error_str = ''
+
+ source_file = file.get("path", "")
+ log.debug("File for {} - {source_file} process {status} {error_str}".
+ format(representation_id,
+ status=status,
+ source_file=source_file,
+ error_str=error_str))
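`update_db` produces one of three update documents (success, progress, error), all addressing the nested site record through the `$[f]`/`$[s]` positional placeholders that `array_filters` resolves. A sketch of those three shapes (`build_update` is a hypothetical helper, not part of the module):

```python
from datetime import datetime


def build_update(new_file_id=None, progress=None, error=None, tries=0):
    # mirrors the three branches of update_db; "$[f]" / "$[s]" are
    # positional placeholders resolved by MongoDB arrayFilters
    prefix = "files.$[f].sites.$[s]"
    if new_file_id:
        return {
            "$set": {prefix + ".id": new_file_id,
                     prefix + ".created_dt": datetime.now()},
            # clear any error metadata left from previous failed attempts
            "$unset": {prefix + ".last_failed_dt": "",
                       prefix + ".error": "",
                       prefix + ".tries": ""},
        }
    if progress is not None:
        return {"$set": {prefix + ".progress": progress}}
    return {"$set": {prefix + ".last_failed_dt": datetime.now(),
                     prefix + ".error": error,
                     prefix + ".tries": tries + 1}}


print(build_update(progress=0.5))
# {'$set': {'files.$[f].sites.$[s].progress': 0.5}}
```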
+
+ def _get_file_info(self, files, _id):
+ """
+ Return record from list of records which name matches to 'provider'
+ Could be possibly refactored with '_get_provider_rec' together.
+
+ Args:
+ files (list): of dictionaries with info about published files
+ _id (string): _id of specific file
+
+ Returns:
+ (int, dictionary): index from list and record with metadata
+ about site (if/when created, errors..)
+ OR (-1, None) if not present
+ """
+ for index, rec in enumerate(files):
+ if rec.get("_id") == _id:
+ return index, rec
+
+ return -1, None
+
+ def _get_site_rec(self, sites, site_name):
+ """
+ Return record from list of records which name matches to
+ 'remote_site_name'
+
+ Args:
+ sites (list): of dictionaries
+ site_name (string): 'local_XXX', 'gdrive'
+
+ Returns:
+ (int, dictionary): index from list and record with metadata
+ about site (if/when created, errors..)
+ OR (-1, None) if not present
+ """
+ for index, rec in enumerate(sites):
+ if rec.get("name") == site_name:
+ return index, rec
+
+ return -1, None
+
+ def reset_provider_for_file(self, collection, representation_id,
+ side=None, file_id=None, site_name=None,
+ remove=False, pause=None, force=False):
+ """
+ Reset information about synchronization for particular 'file_id'
+ and provider.
+ Useful for testing or forcing file to be reuploaded.
+
+ 'side' and 'site_name' are disjunctive.
+
+ 'side' is used for resetting local or remote side for
+ current user for repre.
+
+ 'site_name' is used to set synchronization for particular site.
+ Should be used when repre should be synced to new site.
+
+ Args:
+ collection (string): name of project (eg. collection) in DB
+ representation_id(string): _id of representation
+ file_id (string): file _id in representation
+ side (string): local or remote side
+ site_name (string): for adding new site
+ remove (bool): if True remove site altogether
+ pause (bool or None): if True - pause, False - unpause
+ force (bool): hard reset - currently only for add_site
+
+ Raises:
+ ValueError: if representation not found or arguments misconfigured
+ """
+ query = {
+ "_id": ObjectId(representation_id)
+ }
+
+ representation = list(self.connection.database[collection].find(query))
+ if not representation:
+ raise ValueError("Representation {} not found in {}".
+ format(representation_id, collection))
+ if side and site_name:
+ raise ValueError("Misconfiguration, only one of side and " +
+ "site_name arguments should be passed.")
+
+ local_site = self.get_active_site(collection)
+ remote_site = self.get_remote_site(collection)
+
+ if side:
+ if side == 'local':
+ site_name = local_site
+ else:
+ site_name = remote_site
+
+ elem = {"name": site_name}
+
+ if file_id: # reset site for particular file
+ self._reset_site_for_file(collection, query,
+ elem, file_id, site_name)
+ elif side: # reset site for whole representation
+ self._reset_site(collection, query, elem, site_name)
+ elif remove: # remove site for whole representation
+ self._remove_site(collection, query, representation, site_name)
+ elif pause is not None:
+ self._pause_unpause_site(collection, query,
+ representation, site_name, pause)
+ else: # add new site to all files for representation
+ self._add_site(collection, query, representation, elem, site_name,
+ force)
+
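The tail of `reset_provider_for_file` is effectively a dispatch table over its keyword arguments. A sketch of that dispatch, returning the name of the helper the method would call (`dispatch_reset` is a hypothetical stand-in):

```python
def dispatch_reset(file_id=None, side=None, remove=False, pause=None):
    # mirrors the if/elif chain at the end of reset_provider_for_file
    if file_id:            # reset site for one particular file
        return "_reset_site_for_file"
    if side:               # reset site for the whole representation
        return "_reset_site"
    if remove:             # remove site from the whole representation
        return "_remove_site"
    if pause is not None:  # pause / unpause all files of the site
        return "_pause_unpause_site"
    return "_add_site"     # default: add new site to all files


print(dispatch_reset(remove=True))   # _remove_site
print(dispatch_reset(pause=False))   # _pause_unpause_site
print(dispatch_reset())              # _add_site
```

The order matters: `file_id` wins over `side`, so a caller resetting a single file never falls through to a whole-representation reset.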
+ def _update_site(self, collection, query, update, arr_filter):
+ """
+ Auxiliary method to call update_one function on DB
+
+ Used for refactoring ugly reset_provider_for_file
+ """
+ self.connection.database[collection].update_one(
+ query,
+ update,
+ upsert=True,
+ array_filters=arr_filter
+ )
+
+ def _reset_site_for_file(self, collection, query,
+ elem, file_id, site_name):
+ """
+ Resets 'site_name' for 'file_id' on representation in 'query' on
+ 'collection'
+ """
+ update = {
+ "$set": {"files.$[f].sites.$[s]": elem}
+ }
+ arr_filter = [
+ {'s.name': site_name},
+ {'f._id': ObjectId(file_id)}
+ ]
+
+ self._update_site(collection, query, update, arr_filter)
+
+ def _reset_site(self, collection, query, elem, site_name):
+ """
+ Resets 'site_name' for all files of representation in 'query'
+ """
+ update = {
+ "$set": {"files.$[].sites.$[s]": elem}
+ }
+
+ arr_filter = [
+ {'s.name': site_name}
+ ]
+
+ self._update_site(collection, query, update, arr_filter)
+
+ def _remove_site(self, collection, query, representation, site_name):
+ """
+ Removes 'site_name' for 'representation' in 'query'
+
+ Throws ValueError if 'site_name' not found on 'representation'
+ """
+ found = False
+ for repre_file in representation.pop().get("files"):
+ for site in repre_file.get("sites"):
+ if site["name"] == site_name:
+ found = True
+ break
+ if not found:
+ msg = "Site {} not found".format(site_name)
+ log.info(msg)
+ raise ValueError(msg)
+
+ update = {
+ "$pull": {"files.$[].sites": {"name": site_name}}
+ }
+ arr_filter = []
+
+ self._update_site(collection, query, update, arr_filter)
+
+ def _pause_unpause_site(self, collection, query,
+ representation, site_name, pause):
+ """
+ Pauses/unpauses all files for 'representation' based on 'pause'
+
+ Throws ValueError if 'site_name' not found on 'representation'
+ """
+ found = False
+ site = None
+ for repre_file in representation.pop().get("files"):
+ for site in repre_file.get("sites"):
+ if site["name"] == site_name:
+ found = True
+ break
+ if found: # break outer loop too, keep matched 'site'
+ break
+ if not found:
+ msg = "Site {} not found".format(site_name)
+ log.info(msg)
+ raise ValueError(msg)
+
+ if pause:
+ site['paused'] = pause
+ else:
+ if site.get('paused'):
+ site.pop('paused')
+
+ update = {
+ "$set": {"files.$[].sites.$[s]": site}
+ }
+
+ arr_filter = [
+ {'s.name': site_name}
+ ]
+
+ self._update_site(collection, query, update, arr_filter)
+
+ def _add_site(self, collection, query, representation, elem, site_name,
+ force=False):
+ """
+ Adds 'site_name' to 'representation' on 'collection'
+
+ Use 'force' to remove existing or raises ValueError
+ """
+ for repre_file in representation.pop().get("files"):
+ for site in repre_file.get("sites"):
+ if site["name"] == site_name:
+ if force:
+ self._reset_site_for_file(collection, query,
+ elem, repre_file["_id"],
+ site_name)
+ return
+ else:
+ msg = "Site {} already present".format(site_name)
+ log.info(msg)
+ raise ValueError(msg)
+
+ update = {
+ "$push": {"files.$[].sites": elem}
+ }
+
+ arr_filter = []
+
+ self._update_site(collection, query, update, arr_filter)
+
+ def _remove_local_file(self, collection, representation_id, site_name):
+ """
+ Removes all local files for 'site_name' of 'representation_id'
+
+ Args:
+ collection (string): project name (must match DB)
+ representation_id (string): MongoDB _id value
+ site_name (string): name of configured and active site
+
+ Raises:
+ ValueError: on IndexError or OSError during file removal
+ """
+ my_local_site = get_local_site_id()
+ if my_local_site != site_name:
+ self.log.warning("Cannot remove non local file for {}".
+ format(site_name))
+ return
+
+ provider_name = self.get_provider_for_site(collection, site_name)
+
+ if provider_name == 'local_drive':
+ query = {
+ "_id": ObjectId(representation_id)
+ }
+
+ representation = list(
+ self.connection.database[collection].find(query))
+ if not representation:
+ self.log.debug("No repre {} found".format(
+ representation_id))
+ return
+
+ representation = representation.pop()
+ local_file_path = ''
+ for file in representation.get("files"):
+ local_file_path = self.get_local_file_path(collection,
+ site_name,
+ file.get("path", "")
+ )
+ try:
+ self.log.debug("Removing {}".format(local_file_path))
+ os.remove(local_file_path)
+ except IndexError:
+ msg = "No file set for {}".format(representation_id)
+ self.log.debug(msg)
+ raise ValueError(msg)
+ except OSError:
+ msg = "File {} cannot be removed".format(file["path"])
+ self.log.warning(msg)
+ raise ValueError(msg)
+
+ folder = None
+ try:
+ folder = os.path.dirname(local_file_path)
+ os.rmdir(folder)
+ except OSError:
+ msg = "folder {} cannot be removed".format(folder)
+ self.log.warning(msg)
+ raise ValueError(msg)
+
+ def get_loop_delay(self, project_name):
+ """
+ Return count of seconds before next synchronization loop starts
+ after finish of previous loop.
+ Returns:
+ (int): in seconds
+ """
+ ld = self.sync_project_settings[project_name]["config"]["loop_delay"]
+ return int(ld)
+
+ def show_widget(self):
+ """Show dialog to enter credentials"""
+ self.widget.show()
+
+ def _get_success_dict(self, new_file_id):
+ """
+ Provide success metadata ("id", "created_dt") to be stored in Db.
+ Used in $set: "DICT" part of query.
+ Sites are array inside of array(file), so real indexes for both
+ file and site are needed for upgrade in DB.
+ Args:
+ new_file_id: id of created file
+ Returns:
+ (dictionary)
+ """
+ val = {"files.$[f].sites.$[s].id": new_file_id,
+ "files.$[f].sites.$[s].created_dt": datetime.now()}
+ return val
+
+ def _get_error_dict(self, error="", tries="", progress=""):
+ """
+ Provide error metadata to be stored in Db.
+ Used for set (error and tries provided) or unset mode.
+ Args:
+ error: (string) - message
+ tries: how many times failed
+ Returns:
+ (dictionary)
+ """
+ val = {"files.$[f].sites.$[s].last_failed_dt": datetime.now(),
+ "files.$[f].sites.$[s].error": error,
+ "files.$[f].sites.$[s].tries": tries,
+ "files.$[f].sites.$[s].progress": progress
+ }
+ return val
+
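The `$set` payloads built by `_get_success_dict` and `_get_error_dict` can be sketched stand-alone. The filtered positional operators `$[f]`/`$[s]` address one file and one site inside the nested arrays; the commented `update_one`/`array_filters` call is an assumption about how a pymongo-style caller would apply the payload, not code from this module:

```python
from datetime import datetime


def success_update(new_file_id):
    """$set payload marking a site record as successfully synced."""
    return {
        "files.$[f].sites.$[s].id": new_file_id,
        "files.$[f].sites.$[s].created_dt": datetime.now(),
    }


def error_update(error, tries, progress):
    """$set payload recording a failed sync attempt."""
    return {
        "files.$[f].sites.$[s].last_failed_dt": datetime.now(),
        "files.$[f].sites.$[s].error": error,
        "files.$[f].sites.$[s].tries": tries,
        "files.$[f].sites.$[s].progress": progress,
    }


# A pymongo-style caller would apply such a payload with array filters:
# collection.update_one(
#     {"_id": representation_id},
#     {"$set": success_update(new_file_id)},
#     array_filters=[{"f._id": file_id}, {"s.name": site_name}],
# )
```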
+ def _get_tries_count_from_rec(self, rec):
+ """
+ Get number of failed attempts to sync from site record
+ Args:
+ rec (dictionary): info about specific site record
+ Returns:
+ (int) - number of failed attempts
+ """
+ if not rec:
+ return 0
+ return rec.get("tries", 0)
+
+ def _get_tries_count(self, file, provider):
+ """
+ Get number of failed attempts to sync
+ Args:
+ file (dictionary): info about specific file
+ provider (string): name of site ('gdrive' or specific user site)
+ Returns:
+ (int) - number of failed attempts
+ """
+ _, rec = self._get_site_rec(file.get("sites", []), provider)
+ return rec.get("tries", 0)
+
+ def _get_progress_dict(self, progress):
+ """
+ Provide progress metadata to be stored in Db.
+ Used during upload/download for GUI to show.
+ Args:
+ progress: (float) - 0-1 progress of upload/download
+ Returns:
+ (dictionary)
+ """
+ val = {"files.$[f].sites.$[s].progress": progress}
+ return val
+
+ def _get_retries_arr(self, project_name):
+ """
+ Returns array with allowed values in 'tries' field. If repre
+ contains these values, it means it was tried to be synchronized
+ but failed. We try up to 'self.presets["retry_cnt"]' times before
+ giving up and skipping representation.
+ Returns:
+ (list)
+ """
+ retry_cnt = self.sync_project_settings[project_name].\
+ get("config")["retry_cnt"]
+ arr = list(range(int(retry_cnt)))
+ arr.append(None)
+
+ return arr
+
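The 'tries' filter built above can be sketched in isolation; `retry_cnt` here stands in for the value read from project settings:

```python
def retries_array(retry_cnt):
    """Values of 'tries' that still qualify for another sync attempt.

    None covers site records where no attempt was recorded yet.
    """
    arr = list(range(int(retry_cnt)))
    arr.append(None)
    return arr


print(retries_array(3))  # -> [0, 1, 2, None]
```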
+ def _get_roots_config(self, presets, project_name, site_name):
+ """
+ Returns configured root(s) for 'project_name' and 'site_name' from
+ settings ('presets')
+ """
+ return presets[project_name]['sites'][site_name]['root']
diff --git a/openpype/modules/sync_server/tray/app.py b/openpype/modules/sync_server/tray/app.py
index 476e9d16e8..25fbf0e49a 100644
--- a/openpype/modules/sync_server/tray/app.py
+++ b/openpype/modules/sync_server/tray/app.py
@@ -1,35 +1,17 @@
from Qt import QtWidgets, QtCore, QtGui
-from Qt.QtCore import Qt
-import attr
-import os
-import sys
-import subprocess
-
-from openpype.tools.settings import (
- ProjectListWidget,
- style
-)
-
-from avalon.tools.delegates import PrettyTimeDelegate, pretty_timestamp
-from bson.objectid import ObjectId
+from openpype.tools.settings import style
from openpype.lib import PypeLogger
-from openpype.api import get_local_site_id
+from openpype import resources
+
+from openpype.modules.sync_server.tray.widgets import (
+ SyncProjectListWidget,
+ SyncRepresentationWidget
+)
log = PypeLogger().get_logger("SyncServer")
-STATUS = {
- 0: 'In Progress',
- 1: 'Failed',
- 2: 'Queued',
- 3: 'Paused',
- 4: 'Synced OK',
- -1: 'Not available'
-}
-
-DUMMY_PROJECT = "No project configured"
-
class SyncServerWindow(QtWidgets.QDialog):
"""
@@ -44,8 +26,8 @@ class SyncServerWindow(QtWidgets.QDialog):
self.setFocusPolicy(QtCore.Qt.StrongFocus)
self.setStyleSheet(style.load_stylesheet())
- self.setWindowIcon(QtGui.QIcon(style.app_icon_path()))
- self.resize(1400, 800)
+ self.setWindowIcon(QtGui.QIcon(resources.pype_icon_filepath()))
+ self.resize(1450, 700)
self.timer = QtCore.QTimer()
self.timer.timeout.connect(self._hide_message)
@@ -134,1912 +116,3 @@ class SyncServerWindow(QtWidgets.QDialog):
"""
self.message.setText("")
self.message.hide()
-
-
-class SyncProjectListWidget(ProjectListWidget):
- """
- Lists all projects that are synchronized to choose from
- """
-
- def __init__(self, sync_server, parent):
- super(SyncProjectListWidget, self).__init__(parent)
- self.sync_server = sync_server
- self.project_list.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)
- self.project_list.customContextMenuRequested.connect(
- self._on_context_menu)
- self.project_name = None
- self.local_site = None
- self.icons = {}
-
- def validate_context_change(self):
- return True
-
- def refresh(self):
- model = self.project_list.model()
- model.clear()
-
- project_name = None
- for project_name in self.sync_server.get_sync_project_settings().\
- keys():
- if self.sync_server.is_paused() or \
- self.sync_server.is_project_paused(project_name):
- icon = self._get_icon("paused")
- else:
- icon = self._get_icon("synced")
-
- model.appendRow(QtGui.QStandardItem(icon, project_name))
-
- if len(self.sync_server.get_sync_project_settings().keys()) == 0:
- model.appendRow(QtGui.QStandardItem(DUMMY_PROJECT))
-
- self.current_project = self.project_list.currentIndex().data(
- QtCore.Qt.DisplayRole
- )
- if not self.current_project:
- self.current_project = self.project_list.model().item(0). \
- data(QtCore.Qt.DisplayRole)
-
- if project_name:
- self.local_site = self.sync_server.get_active_site(project_name)
-
- def _get_icon(self, status):
- if not self.icons.get(status):
- resource_path = os.path.dirname(__file__)
- resource_path = os.path.join(resource_path, "..",
- "resources")
- pix_url = "{}/{}.png".format(resource_path, status)
- icon = QtGui.QIcon(pix_url)
- self.icons[status] = icon
- else:
- icon = self.icons[status]
- return icon
-
- def _on_context_menu(self, point):
- point_index = self.project_list.indexAt(point)
- if not point_index.isValid():
- return
-
- self.project_name = point_index.data(QtCore.Qt.DisplayRole)
-
- menu = QtWidgets.QMenu()
- actions_mapping = {}
-
- if self.sync_server.is_project_paused(self.project_name):
- action = QtWidgets.QAction("Unpause")
- actions_mapping[action] = self._unpause
- else:
- action = QtWidgets.QAction("Pause")
- actions_mapping[action] = self._pause
- menu.addAction(action)
-
- if self.local_site == get_local_site_id():
- action = QtWidgets.QAction("Clear local project")
- actions_mapping[action] = self._clear_project
- menu.addAction(action)
-
- result = menu.exec_(QtGui.QCursor.pos())
- if result:
- to_run = actions_mapping[result]
- if to_run:
- to_run()
-
- def _pause(self):
- if self.project_name:
- self.sync_server.pause_project(self.project_name)
- self.project_name = None
- self.refresh()
-
- def _unpause(self):
- if self.project_name:
- self.sync_server.unpause_project(self.project_name)
- self.project_name = None
- self.refresh()
-
- def _clear_project(self):
- if self.project_name:
- self.sync_server.clear_project(self.project_name, self.local_site)
- self.project_name = None
- self.refresh()
-
-
-class ProjectModel(QtCore.QAbstractListModel):
- def __init__(self, *args, projects=None, **kwargs):
- super(ProjectModel, self).__init__(*args, **kwargs)
- self.projects = projects or []
-
- def data(self, index, role):
- if role == Qt.DisplayRole:
- # See below for the data structure.
- status, text = self.projects[index.row()]
- # Return the todo text only.
- return text
-
- def rowCount(self, index):
- return len(self.todos)
-
-
-class SyncRepresentationWidget(QtWidgets.QWidget):
- """
- Summary dialog with list of representations that matches current
- settings 'local_site' and 'remote_site'.
- """
- active_changed = QtCore.Signal() # active index changed
- message_generated = QtCore.Signal(str)
-
- default_widths = (
- ("asset", 210),
- ("subset", 190),
- ("version", 10),
- ("representation", 90),
- ("created_dt", 100),
- ("sync_dt", 100),
- ("local_site", 60),
- ("remote_site", 70),
- ("files_count", 70),
- ("files_size", 70),
- ("priority", 20),
- ("state", 50)
- )
-
- def __init__(self, sync_server, project=None, parent=None):
- super(SyncRepresentationWidget, self).__init__(parent)
-
- self.sync_server = sync_server
-
- self._selected_id = None # keep last selected _id
- self.representation_id = None
- self.site_name = None # to pause/unpause representation
-
- self.filter = QtWidgets.QLineEdit()
- self.filter.setPlaceholderText("Filter representations..")
-
- top_bar_layout = QtWidgets.QHBoxLayout()
- top_bar_layout.addWidget(self.filter)
-
- self.table_view = QtWidgets.QTableView()
- headers = [item[0] for item in self.default_widths]
-
- model = SyncRepresentationModel(sync_server, headers, project)
- self.table_view.setModel(model)
- self.table_view.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)
- self.table_view.setSelectionMode(
- QtWidgets.QAbstractItemView.SingleSelection)
- self.table_view.setSelectionBehavior(
- QtWidgets.QAbstractItemView.SelectRows)
- self.table_view.horizontalHeader().setSortIndicator(
- -1, Qt.AscendingOrder)
- self.table_view.setSortingEnabled(True)
- self.table_view.setAlternatingRowColors(True)
- self.table_view.verticalHeader().hide()
-
- time_delegate = PrettyTimeDelegate(self)
- column = self.table_view.model().get_header_index("created_dt")
- self.table_view.setItemDelegateForColumn(column, time_delegate)
- column = self.table_view.model().get_header_index("sync_dt")
- self.table_view.setItemDelegateForColumn(column, time_delegate)
-
- column = self.table_view.model().get_header_index("local_site")
- delegate = ImageDelegate(self)
- self.table_view.setItemDelegateForColumn(column, delegate)
-
- column = self.table_view.model().get_header_index("remote_site")
- delegate = ImageDelegate(self)
- self.table_view.setItemDelegateForColumn(column, delegate)
-
- column = self.table_view.model().get_header_index("files_size")
- delegate = SizeDelegate(self)
- self.table_view.setItemDelegateForColumn(column, delegate)
-
- for column_name, width in self.default_widths:
- idx = model.get_header_index(column_name)
- self.table_view.setColumnWidth(idx, width)
-
- layout = QtWidgets.QVBoxLayout(self)
- layout.setContentsMargins(0, 0, 0, 0)
- layout.addLayout(top_bar_layout)
- layout.addWidget(self.table_view)
-
- self.table_view.doubleClicked.connect(self._double_clicked)
- self.filter.textChanged.connect(lambda: model.set_filter(
- self.filter.text()))
- self.table_view.customContextMenuRequested.connect(
- self._on_context_menu)
-
- self.table_view.model().modelReset.connect(self._set_selection)
-
- self.selection_model = self.table_view.selectionModel()
- self.selection_model.selectionChanged.connect(self._selection_changed)
-
- def _selection_changed(self, new_selection):
- index = self.selection_model.currentIndex()
- self._selected_id = \
- self.table_view.model().data(index, Qt.UserRole)
-
- def _set_selection(self):
- """
- Sets selection to 'self._selected_id' if exists.
-
- Keep selection during model refresh.
- """
- if self._selected_id:
- index = self.table_view.model().get_index(self._selected_id)
- if index and index.isValid():
- mode = QtCore.QItemSelectionModel.Select | \
- QtCore.QItemSelectionModel.Rows
- self.selection_model.setCurrentIndex(index, mode)
- else:
- self._selected_id = None
-
- def _double_clicked(self, index):
- """
- Opens representation dialog with all files after doubleclick
- """
- _id = self.table_view.model().data(index, Qt.UserRole)
- detail_window = SyncServerDetailWindow(
- self.sync_server, _id, self.table_view.model()._project)
- detail_window.exec()
-
- def _on_context_menu(self, point):
- """
- Shows menu with loader actions on Right-click.
- """
- point_index = self.table_view.indexAt(point)
- if not point_index.isValid():
- return
-
- self.item = self.table_view.model()._data[point_index.row()]
- self.representation_id = self.item._id
- log.debug("menu representation _id:: {}".
- format(self.representation_id))
-
- menu = QtWidgets.QMenu()
- actions_mapping = {}
-
- action = QtWidgets.QAction("Open in explorer")
- actions_mapping[action] = self._open_in_explorer
- menu.addAction(action)
-
- local_site, local_progress = self.item.local_site.split()
- remote_site, remote_progress = self.item.remote_site.split()
- local_progress = float(local_progress)
- remote_progress = float(remote_progress)
-
- # progress smaller then 1.0 --> in progress or queued
- if local_progress < 1.0:
- self.site_name = local_site
- else:
- self.site_name = remote_site
-
- if self.item.state in [STATUS[0], STATUS[2]]:
- action = QtWidgets.QAction("Pause")
- actions_mapping[action] = self._pause
- menu.addAction(action)
-
- if self.item.state == STATUS[3]:
- action = QtWidgets.QAction("Unpause")
- actions_mapping[action] = self._unpause
- menu.addAction(action)
-
- # if self.item.state == STATUS[1]:
- # action = QtWidgets.QAction("Open error detail")
- # actions_mapping[action] = self._show_detail
- # menu.addAction(action)
-
- if remote_progress == 1.0:
- action = QtWidgets.QAction("Reset local site")
- actions_mapping[action] = self._reset_local_site
- menu.addAction(action)
-
- if local_progress == 1.0:
- action = QtWidgets.QAction("Reset remote site")
- actions_mapping[action] = self._reset_remote_site
- menu.addAction(action)
-
- if local_site != self.sync_server.DEFAULT_SITE:
- action = QtWidgets.QAction("Completely remove from local")
- actions_mapping[action] = self._remove_site
- menu.addAction(action)
- else:
- action = QtWidgets.QAction("Mark for sync to local")
- actions_mapping[action] = self._add_site
- menu.addAction(action)
-
- if not actions_mapping:
- action = QtWidgets.QAction("< No action >")
- actions_mapping[action] = None
- menu.addAction(action)
-
- result = menu.exec_(QtGui.QCursor.pos())
- if result:
- to_run = actions_mapping[result]
- if to_run:
- to_run()
-
- self.table_view.model().refresh()
-
- def _pause(self):
- self.sync_server.pause_representation(self.table_view.model()._project,
- self.representation_id,
- self.site_name)
- self.site_name = None
- self.message_generated.emit("Paused {}".format(self.representation_id))
-
- def _unpause(self):
- self.sync_server.unpause_representation(
- self.table_view.model()._project,
- self.representation_id,
- self.site_name)
- self.site_name = None
- self.message_generated.emit("Unpaused {}".format(
- self.representation_id))
-
- # temporary here for testing, will be removed TODO
- def _add_site(self):
- log.info(self.representation_id)
- project_name = self.table_view.model()._project
- local_site_name = self.sync_server.get_my_local_site()
- try:
- self.sync_server.add_site(
- project_name,
- self.representation_id,
- local_site_name
- )
- self.message_generated.emit(
- "Site {} added for {}".format(local_site_name,
- self.representation_id))
- except ValueError as exp:
- self.message_generated.emit("Error {}".format(str(exp)))
-
- def _remove_site(self):
- """
- Removes site record AND files.
-
- This is ONLY for representations stored on local site, which
- cannot be same as SyncServer.DEFAULT_SITE.
-
- This could only happen when artist work on local machine, not
- connected to studio mounted drives.
- """
- log.info("Removing {}".format(self.representation_id))
- try:
- local_site = get_local_site_id()
- self.sync_server.remove_site(
- self.table_view.model()._project,
- self.representation_id,
- local_site,
- True
- )
- self.message_generated.emit("Site {} removed".format(local_site))
- except ValueError as exp:
- self.message_generated.emit("Error {}".format(str(exp)))
-
- def _reset_local_site(self):
- """
- Removes errors or success metadata for particular file >> forces
- redo of upload/download
- """
- self.sync_server.reset_provider_for_file(
- self.table_view.model()._project,
- self.representation_id,
- 'local'
- )
-
- def _reset_remote_site(self):
- """
- Removes errors or success metadata for particular file >> forces
- redo of upload/download
- """
- self.sync_server.reset_provider_for_file(
- self.table_view.model()._project,
- self.representation_id,
- 'remote'
- )
-
- def _open_in_explorer(self):
- if not self.item:
- return
-
- fpath = self.item.path
- project = self.table_view.model()._project
- fpath = self.sync_server.get_local_file_path(project, fpath)
-
- fpath = os.path.normpath(os.path.dirname(fpath))
- if os.path.isdir(fpath):
- if 'win' in sys.platform: # windows
- subprocess.Popen('explorer "%s"' % fpath)
- elif sys.platform == 'darwin': # macOS
- subprocess.Popen(['open', fpath])
- else: # linux
- try:
- subprocess.Popen(['xdg-open', fpath])
- except OSError:
- raise OSError('unsupported xdg-open call??')
-
-
-class SyncRepresentationModel(QtCore.QAbstractTableModel):
- """
- Model for summary of representations.
-
- Groups files information per representation. Allows sorting and
- full text filtering.
-
- Allows pagination, most of heavy lifting is being done on DB side.
- Single model matches to single collection. When project is changed,
- model is reset and refreshed.
-
- Args:
- sync_server (SyncServer) - object to call server operations (update
- db status, set site status...)
- header (list) - names of visible columns
- project (string) - collection name, all queries must be called on
- a specific collection
-
- """
- PAGE_SIZE = 20 # default page size to query for
- REFRESH_SEC = 5000 # in seconds, requery DB for new status
- DEFAULT_SORT = {
- "updated_dt_remote": -1,
- "_id": 1
- }
- SORT_BY_COLUMN = [
- "context.asset", # asset
- "context.subset", # subset
- "context.version", # version
- "context.representation", # representation
- "updated_dt_local", # local created_dt
- "updated_dt_remote", # remote created_dt
- "avg_progress_local", # local progress
- "avg_progress_remote", # remote progress
- "files_count", # count of files
- "files_size", # file size of all files
- "context.asset", # priority TODO
- "status" # state
- ]
-
- @attr.s
- class SyncRepresentation:
- """
- Auxiliary object for easier handling.
-
- Fields must contain all header values (+ any arbitrary values).
- """
- _id = attr.ib()
- asset = attr.ib()
- subset = attr.ib()
- version = attr.ib()
- representation = attr.ib()
- created_dt = attr.ib(default=None)
- sync_dt = attr.ib(default=None)
- local_site = attr.ib(default=None)
- remote_site = attr.ib(default=None)
- files_count = attr.ib(default=None)
- files_size = attr.ib(default=None)
- priority = attr.ib(default=None)
- state = attr.ib(default=None)
- path = attr.ib(default=None)
-
- def __init__(self, sync_server, header, project=None):
- super(SyncRepresentationModel, self).__init__()
- self._header = header
- self._data = []
- self._project = project
- self._rec_loaded = 0
- self._total_records = 0 # how many documents query actually found
- self.filter = None
-
- self._initialized = False
- if not self._project or self._project == DUMMY_PROJECT:
- return
-
- self.sync_server = sync_server
- # TODO think about admin mode
- # this is for regular user, always only single local and single remote
- self.local_site = self.sync_server.get_active_site(self._project)
- self.remote_site = self.sync_server.get_remote_site(self._project)
-
- self.projection = self.get_default_projection()
-
- self.sort = self.DEFAULT_SORT
-
- self.query = self.get_default_query()
- self.default_query = list(self.get_default_query())
-
- representations = self.dbcon.aggregate(self.query)
- self.refresh(representations)
-
- self.timer = QtCore.QTimer()
- self.timer.timeout.connect(self.tick)
- self.timer.start(self.REFRESH_SEC)
-
- @property
- def dbcon(self):
- """
- Database object with preselected project (collection) to run DB
- operations (find, aggregate).
-
- All queries should go through this (because of collection).
- """
- return self.sync_server.connection.database[self._project]
-
- def data(self, index, role):
- item = self._data[index.row()]
-
- if role == Qt.DisplayRole:
- return attr.asdict(item)[self._header[index.column()]]
- if role == Qt.UserRole:
- return item._id
-
- def rowCount(self, index):
- return len(self._data)
-
- def columnCount(self, index):
- return len(self._header)
-
- def headerData(self, section, orientation, role):
- if role == Qt.DisplayRole:
- if orientation == Qt.Horizontal:
- return str(self._header[section])
-
- def tick(self):
- """
- Triggers refresh of model.
-
- Because of pagination, prepared (sorting, filtering) query needs
- to be run on DB every X seconds.
- """
- self.refresh(representations=None, load_records=self._rec_loaded)
- self.timer.start(self.REFRESH_SEC)
-
- def get_header_index(self, value):
- """
- Returns index of 'value' in headers
-
- Args:
- value (str): header name value
- Returns:
- (int)
- """
- return self._header.index(value)
-
- def refresh(self, representations=None, load_records=0):
- """
- Reloads representations from DB if necessary, adds them to model.
-
- Runs periodically (every X seconds) or by demand (change of
- sorting, filtering etc.)
-
- Emits 'modelReset' signal.
-
- Args:
- representations (PaginationResult object): pass result of
- aggregate query from outside - mostly for testing only
- load_records (int) - enforces how many records should be
- actually queried (scrolled a couple of times to list more
- than single page of records)
- """
- if self.sync_server.is_paused() or \
- self.sync_server.is_project_paused(self._project):
- return
-
- self.beginResetModel()
- self._data = []
- self._rec_loaded = 0
-
- if not representations:
- self.query = self.get_default_query(load_records)
- representations = self.dbcon.aggregate(self.query)
-
- self._add_page_records(self.local_site, self.remote_site,
- representations)
- self.endResetModel()
-
- def _add_page_records(self, local_site, remote_site, representations):
- """
- Process all records from 'representation' and add them to storage.
-
- Args:
- local_site (str): name of local site (mine)
- remote_site (str): name of cloud provider (theirs)
- representations (Mongo Cursor) - mimics result set, 1 object
- with paginatedResults array and totalCount array
- """
- result = representations.next()
- count = 0
- total_count = result.get("totalCount")
- if total_count:
- count = total_count.pop().get('count')
- self._total_records = count
-
- local_provider = _translate_provider_for_icon(self.sync_server,
- self._project,
- local_site)
- remote_provider = _translate_provider_for_icon(self.sync_server,
- self._project,
- remote_site)
-
- for repre in result.get("paginatedResults"):
- context = repre.get("context").pop()
- files = repre.get("files", [])
- if isinstance(files, dict): # aggregate returns dictionary
- files = [files]
-
- # representation without files doesnt concern us
- if not files:
- continue
-
- local_updated = remote_updated = None
- if repre.get('updated_dt_local'):
- local_updated = \
- repre.get('updated_dt_local').strftime("%Y%m%dT%H%M%SZ")
-
- if repre.get('updated_dt_remote'):
- remote_updated = \
- repre.get('updated_dt_remote').strftime("%Y%m%dT%H%M%SZ")
-
- avg_progress_remote = _convert_progress(
- repre.get('avg_progress_remote', '0'))
- avg_progress_local = _convert_progress(
- repre.get('avg_progress_local', '0'))
-
- if context.get("version"):
- version = "v{:0>3d}".format(context.get("version"))
- else:
- version = "hero"
-
- item = self.SyncRepresentation(
- repre.get("_id"),
- context.get("asset"),
- context.get("subset"),
- version,
- context.get("representation"),
- local_updated,
- remote_updated,
- '{} {}'.format(local_provider, avg_progress_local),
- '{} {}'.format(remote_provider, avg_progress_remote),
- repre.get("files_count", 1),
- repre.get("files_size", 0),
- 1,
- STATUS[repre.get("status", -1)],
- files[0].get('path')
- )
-
- self._data.append(item)
- self._rec_loaded += 1
-
- def canFetchMore(self, index):
- """
- Check if there are more records than currently loaded
- """
- # 'skip' might be suboptimal when representation hits 500k+
- return self._total_records > self._rec_loaded
-
- def fetchMore(self, index):
- """
- Add more record to model.
-
- Called when 'canFetchMore' returns true, which means there are
- more records in DB than loaded.
- """
- log.debug("fetchMore")
- items_to_fetch = min(self._total_records - self._rec_loaded,
- self.PAGE_SIZE)
- self.query = self.get_default_query(self._rec_loaded)
- representations = self.dbcon.aggregate(self.query)
- self.beginInsertRows(index,
- self._rec_loaded,
- self._rec_loaded + items_to_fetch - 1)
-
- self._add_page_records(self.local_site, self.remote_site,
- representations)
-
- self.endInsertRows()
-
- def sort(self, index, order):
- """
- Summary sort per representation.
-
- Sort is happening on a DB side, model is reset, db queried
- again.
-
- Args:
- index (int): column index
- order (int): 0|
- """
- # limit unwanted first re-sorting by view
- if index < 0:
- return
-
- self._rec_loaded = 0
- if order == 0:
- order = 1
- else:
- order = -1
-
- self.sort = {self.SORT_BY_COLUMN[index]: order, '_id': 1}
- self.query = self.get_default_query()
- # import json
- # log.debug(json.dumps(self.query, indent=4).replace('False', 'false').\
- # replace('True', 'true').replace('None', 'null'))
-
- representations = self.dbcon.aggregate(self.query)
- self.refresh(representations)
-
- def set_filter(self, filter):
- """
- Adds text value filtering
-
- Args:
- filter (str): string inputted by user
- """
- self.filter = filter
- self.refresh()
-
- def set_project(self, project):
- """
- Changes project, called after project selection is changed
-
- Args:
- project (str): name of project
- """
- self._project = project
- self.sync_server.set_sync_project_settings()
- self.local_site = self.sync_server.get_active_site(self._project)
- self.remote_site = self.sync_server.get_remote_site(self._project)
- self.refresh()
-
- def get_index(self, id):
- """
- Get index of 'id' value.
-
- Used for keeping selection after refresh.
-
- Args:
- id (str): MongoDB _id
- Returns:
- (QModelIndex)
- """
- for i in range(self.rowCount(None)):
- index = self.index(i, 0)
- value = self.data(index, Qt.UserRole)
- if value == id:
- return index
- return None
-
- def get_default_query(self, limit=0):
- """
- Returns basic aggregate query for main table.
-
- Main table provides summary information about representation,
- which could have multiple files. Details are accessible after
- double click on representation row.
- Columns:
- 'created_dt' - max of created or updated (when failed) per repr
- 'sync_dt' - same for remote side
- 'local_site' - progress of repr on local side, 1 = finished
- 'remote_site' - progress on remote side, calculates from files
- 'state' -
- 0 - in progress
- 1 - failed
- 2 - queued
- 3 - paused
- 4 - finished on both sides
-
- are calculated and must be calculated in DB because of
- pagination
-
- Args:
- limit (int): how many records should be returned, by default
- it 'PAGE_SIZE' for performance.
- Should be overridden by value of loaded records for refresh
- functionality (got more records by scrolling, refresh
- shouldn't reset that)
- """
- if limit == 0:
- limit = SyncRepresentationModel.PAGE_SIZE
-
- return [
- {"$match": self._get_match_part()},
- {'$unwind': '$files'},
- # merge potentially unwinded records back to single per repre
- {'$addFields': {
- 'order_remote': {
- '$filter': {'input': '$files.sites', 'as': 'p',
- 'cond': {'$eq': ['$$p.name', self.remote_site]}
- }},
- 'order_local': {
- '$filter': {'input': '$files.sites', 'as': 'p',
- 'cond': {'$eq': ['$$p.name', self.local_site]}
- }}
- }},
- {'$addFields': {
- # prepare progress per file, presence of 'created_dt' denotes
- # successfully finished load/download
- 'progress_remote': {'$first': {
- '$cond': [{'$size': "$order_remote.progress"},
- "$order_remote.progress",
- {'$cond': [
- {'$size': "$order_remote.created_dt"},
- [1],
- [0]
- ]}
- ]}},
- 'progress_local': {'$first': {
- '$cond': [{'$size': "$order_local.progress"},
- "$order_local.progress",
- {'$cond': [
- {'$size': "$order_local.created_dt"},
- [1],
- [0]
- ]}
- ]}},
- # file might be successfully created or failed, not both
- 'updated_dt_remote': {'$first': {
- '$cond': [{'$size': "$order_remote.created_dt"},
- "$order_remote.created_dt",
- {'$cond': [
- {'$size': "$order_remote.last_failed_dt"},
- "$order_remote.last_failed_dt",
- []
- ]}
- ]}},
- 'updated_dt_local': {'$first': {
- '$cond': [{'$size': "$order_local.created_dt"},
- "$order_local.created_dt",
- {'$cond': [
- {'$size': "$order_local.last_failed_dt"},
- "$order_local.last_failed_dt",
- []
- ]}
- ]}},
- 'files_size': {'$ifNull': ["$files.size", 0]},
- 'failed_remote': {
- '$cond': [{'$size': "$order_remote.last_failed_dt"},
- 1,
- 0]},
- 'failed_local': {
- '$cond': [{'$size': "$order_local.last_failed_dt"},
- 1,
- 0]},
- 'failed_local_tries': {
- '$cond': [{'$size': '$order_local.tries'},
- {'$first': '$order_local.tries'},
- 0]},
- 'failed_remote_tries': {
- '$cond': [{'$size': '$order_remote.tries'},
- {'$first': '$order_remote.tries'},
- 0]},
- 'paused_remote': {
- '$cond': [{'$size': "$order_remote.paused"},
- 1,
- 0]},
- 'paused_local': {
- '$cond': [{'$size': "$order_local.paused"},
- 1,
- 0]},
- }},
- {'$group': {
- '_id': '$_id',
- # pass through context - same for representation
- 'context': {'$addToSet': '$context'},
- 'data': {'$addToSet': '$data'},
- # pass through files as a list
- 'files': {'$addToSet': '$files'},
- # count how many files
- 'files_count': {'$sum': 1},
- 'files_size': {'$sum': '$files_size'},
- # sum avg progress, finished = 1
- 'avg_progress_remote': {'$avg': "$progress_remote"},
- 'avg_progress_local': {'$avg': "$progress_local"},
- # select last touch of file
- 'updated_dt_remote': {'$max': "$updated_dt_remote"},
- 'failed_remote': {'$sum': '$failed_remote'},
- 'failed_local': {'$sum': '$failed_local'},
- 'failed_remote_tries': {'$sum': '$failed_remote_tries'},
- 'failed_local_tries': {'$sum': '$failed_local_tries'},
- 'paused_remote': {'$sum': '$paused_remote'},
- 'paused_local': {'$sum': '$paused_local'},
- 'updated_dt_local': {'$max': "$updated_dt_local"}
- }},
- {"$project": self.projection},
- {"$sort": self.sort},
- {
- '$facet': {
- 'paginatedResults': [{'$skip': self._rec_loaded},
- {'$limit': limit}],
- 'totalCount': [{'$count': 'count'}]
- }
- }
- ]
-
- def _get_match_part(self):
- """
- Extend match part with filter if present.
-
- Filter is set by user input. Each model has different fields to be
- checked.
- If performance issues are found, '$text' and text indexes should
- be investigated.
-
- Fulltext searches in:
- context.subset
- context.asset
- context.representation names AND _id (ObjectId)
- """
- base_match = {
- "type": "representation",
- 'files.sites.name': {'$all': [self.local_site,
- self.remote_site]}
- }
- if not self.filter:
- return base_match
- else:
- regex_str = '.*{}.*'.format(self.filter)
- base_match['$or'] = [
- {'context.subset': {'$regex': regex_str, '$options': 'i'}},
- {'context.asset': {'$regex': regex_str, '$options': 'i'}},
- {'context.representation': {'$regex': regex_str,
- '$options': 'i'}}]
-
- if ObjectId.is_valid(self.filter):
- base_match['$or'] = [{'_id': ObjectId(self.filter)}]
-
- return base_match
-
- def get_default_projection(self):
- """
- Projection part for aggregate query.
-
- All fields with '1' will be returned, no others.
-
- Returns:
- (dict)
- """
- return {
- "context.subset": 1,
- "context.asset": 1,
- "context.version": 1,
- "context.representation": 1,
- "data.path": 1,
- "files": 1,
- 'files_count': 1,
- "files_size": 1,
- 'avg_progress_remote': 1,
- 'avg_progress_local': 1,
- 'updated_dt_remote': 1,
- 'updated_dt_local': 1,
- 'paused_remote': 1,
- 'paused_local': 1,
- 'status': {
- '$switch': {
- 'branches': [
- {
- 'case': {
- '$or': ['$paused_remote', '$paused_local']},
- 'then': 3 # Paused
- },
- {
- 'case': {
- '$or': [
- {'$gte': ['$failed_local_tries', 3]},
- {'$gte': ['$failed_remote_tries', 3]}
- ]},
- 'then': 1},
- {
- 'case': {
- '$or': [{'$eq': ['$avg_progress_remote', 0]},
- {'$eq': ['$avg_progress_local', 0]}]},
- 'then': 2 # Queued
- },
- {
- 'case': {'$or': [{'$and': [
- {'$gt': ['$avg_progress_remote', 0]},
- {'$lt': ['$avg_progress_remote', 1]}
- ]},
- {'$and': [
- {'$gt': ['$avg_progress_local', 0]},
- {'$lt': ['$avg_progress_local', 1]}
- ]}
- ]},
- 'then': 0 # In progress
- },
- {
- 'case': {'$and': [
- {'$eq': ['$avg_progress_remote', 1]},
- {'$eq': ['$avg_progress_local', 1]}
- ]},
- 'then': 4 # Synced OK
- },
- ],
- 'default': -1
- }
- }
- }
-
-
-class SyncServerDetailWindow(QtWidgets.QDialog):
- def __init__(self, sync_server, _id, project, parent=None):
-        log.debug(
-            "SyncServerDetailWindow _id: {}".format(_id))
- super(SyncServerDetailWindow, self).__init__(parent)
- self.setWindowFlags(QtCore.Qt.Window)
- self.setFocusPolicy(QtCore.Qt.StrongFocus)
-
- self.setStyleSheet(style.load_stylesheet())
- self.setWindowIcon(QtGui.QIcon(style.app_icon_path()))
- self.resize(1000, 400)
-
- body = QtWidgets.QWidget()
- footer = QtWidgets.QWidget()
- footer.setFixedHeight(20)
-
- container = SyncRepresentationDetailWidget(sync_server, _id, project,
- parent=self)
- body_layout = QtWidgets.QHBoxLayout(body)
- body_layout.addWidget(container)
- body_layout.setContentsMargins(0, 0, 0, 0)
-
- self.message = QtWidgets.QLabel()
- self.message.hide()
-
- footer_layout = QtWidgets.QVBoxLayout(footer)
- footer_layout.addWidget(self.message)
- footer_layout.setContentsMargins(0, 0, 0, 0)
-
- layout = QtWidgets.QVBoxLayout(self)
- layout.addWidget(body)
- layout.addWidget(footer)
-
-        self.setLayout(layout)
- self.setWindowTitle("Sync Representation Detail")
-
-
-class SyncRepresentationDetailWidget(QtWidgets.QWidget):
- """
- Widget to display list of synchronizable files for single repre.
-
- Args:
- _id (str): representation _id
- project (str): name of project with repre
- parent (QDialog): SyncServerDetailWindow
- """
- active_changed = QtCore.Signal() # active index changed
-
- default_widths = (
- ("file", 290),
- ("created_dt", 120),
- ("sync_dt", 120),
- ("local_site", 60),
- ("remote_site", 60),
- ("size", 60),
- ("priority", 20),
- ("state", 90)
- )
-
- def __init__(self, sync_server, _id=None, project=None, parent=None):
- super(SyncRepresentationDetailWidget, self).__init__(parent)
-
- log.debug("Representation_id:{}".format(_id))
- self.representation_id = _id
- self.item = None # set to item that mouse was clicked over
- self.project = project
-
- self.sync_server = sync_server
-
- self._selected_id = None
-
- self.filter = QtWidgets.QLineEdit()
-        self.filter.setPlaceholderText("Filter files..")
-
- top_bar_layout = QtWidgets.QHBoxLayout()
- top_bar_layout.addWidget(self.filter)
-
- self.table_view = QtWidgets.QTableView()
- headers = [item[0] for item in self.default_widths]
-
- model = SyncRepresentationDetailModel(sync_server, headers, _id,
- project)
- self.table_view.setModel(model)
- self.table_view.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)
- self.table_view.setSelectionMode(
- QtWidgets.QAbstractItemView.SingleSelection)
- self.table_view.setSelectionBehavior(
- QtWidgets.QTableView.SelectRows)
- self.table_view.horizontalHeader().setSortIndicator(-1,
- Qt.AscendingOrder)
- self.table_view.setSortingEnabled(True)
- self.table_view.setAlternatingRowColors(True)
- self.table_view.verticalHeader().hide()
-
- time_delegate = PrettyTimeDelegate(self)
- column = self.table_view.model().get_header_index("created_dt")
- self.table_view.setItemDelegateForColumn(column, time_delegate)
- column = self.table_view.model().get_header_index("sync_dt")
- self.table_view.setItemDelegateForColumn(column, time_delegate)
-
- column = self.table_view.model().get_header_index("local_site")
- delegate = ImageDelegate(self)
- self.table_view.setItemDelegateForColumn(column, delegate)
-
- column = self.table_view.model().get_header_index("remote_site")
- delegate = ImageDelegate(self)
- self.table_view.setItemDelegateForColumn(column, delegate)
-
- column = self.table_view.model().get_header_index("size")
- delegate = SizeDelegate(self)
- self.table_view.setItemDelegateForColumn(column, delegate)
-
- for column_name, width in self.default_widths:
- idx = model.get_header_index(column_name)
- self.table_view.setColumnWidth(idx, width)
-
- layout = QtWidgets.QVBoxLayout(self)
- layout.setContentsMargins(0, 0, 0, 0)
- layout.addLayout(top_bar_layout)
- layout.addWidget(self.table_view)
-
- self.filter.textChanged.connect(lambda: model.set_filter(
- self.filter.text()))
- self.table_view.customContextMenuRequested.connect(
- self._on_context_menu)
-
- self.table_view.model().modelReset.connect(self._set_selection)
-
- self.selection_model = self.table_view.selectionModel()
- self.selection_model.selectionChanged.connect(self._selection_changed)
-
- def _selection_changed(self):
- index = self.selection_model.currentIndex()
- self._selected_id = self.table_view.model().data(index, Qt.UserRole)
-
- def _set_selection(self):
- """
- Sets selection to 'self._selected_id' if exists.
-
- Keep selection during model refresh.
- """
- if self._selected_id:
- index = self.table_view.model().get_index(self._selected_id)
- if index.isValid():
- mode = QtCore.QItemSelectionModel.Select | \
- QtCore.QItemSelectionModel.Rows
- self.selection_model.setCurrentIndex(index, mode)
- else:
- self._selected_id = None
-
- def _show_detail(self):
- """
- Shows windows with error message for failed sync of a file.
- """
- dt = max(self.item.created_dt, self.item.sync_dt)
- detail_window = SyncRepresentationErrorWindow(self.item._id,
- self.project,
- dt,
- self.item.tries,
- self.item.error)
-        detail_window.exec_()
-
- def _on_context_menu(self, point):
- """
- Shows menu with loader actions on Right-click.
- """
- point_index = self.table_view.indexAt(point)
- if not point_index.isValid():
- return
-
- self.item = self.table_view.model()._data[point_index.row()]
-
- menu = QtWidgets.QMenu()
- actions_mapping = {}
-
- action = QtWidgets.QAction("Open in explorer")
- actions_mapping[action] = self._open_in_explorer
- menu.addAction(action)
-
- if self.item.state == STATUS[1]:
- action = QtWidgets.QAction("Open error detail")
- actions_mapping[action] = self._show_detail
- menu.addAction(action)
-
- remote_site, remote_progress = self.item.remote_site.split()
- if float(remote_progress) == 1.0:
- action = QtWidgets.QAction("Reset local site")
- actions_mapping[action] = self._reset_local_site
- menu.addAction(action)
-
- local_site, local_progress = self.item.local_site.split()
- if float(local_progress) == 1.0:
- action = QtWidgets.QAction("Reset remote site")
- actions_mapping[action] = self._reset_remote_site
- menu.addAction(action)
-
- if not actions_mapping:
- action = QtWidgets.QAction("< No action >")
- actions_mapping[action] = None
- menu.addAction(action)
-
- result = menu.exec_(QtGui.QCursor.pos())
- if result:
- to_run = actions_mapping[result]
- if to_run:
- to_run()
-
- def _reset_local_site(self):
- """
-        Removes error or success metadata for a particular file, forcing
-        the sync server to redo the download to the local site.
- """
- self.sync_server.reset_provider_for_file(
- self.table_view.model()._project,
- self.representation_id,
- 'local',
- self.item._id)
- self.table_view.model().refresh()
-
- def _reset_remote_site(self):
- """
-        Removes error or success metadata for a particular file, forcing
-        the sync server to redo the upload to the remote site.
- """
- self.sync_server.reset_provider_for_file(
- self.table_view.model()._project,
- self.representation_id,
- 'remote',
- self.item._id)
- self.table_view.model().refresh()
-
- def _open_in_explorer(self):
- if not self.item:
- return
-
- fpath = self.item.path
- project = self.table_view.model()._project
- fpath = self.sync_server.get_local_file_path(project, fpath)
-
- fpath = os.path.normpath(os.path.dirname(fpath))
- if os.path.isdir(fpath):
-            if sys.platform.startswith('win'):  # windows
-                subprocess.Popen('explorer "%s"' % fpath)
-            elif sys.platform == 'darwin':  # macOS
-                subprocess.Popen(['open', fpath])
-            else:  # linux
-                try:
-                    subprocess.Popen(['xdg-open', fpath])
-                except OSError:
-                    raise OSError('xdg-open call failed')
-
-
-class SyncRepresentationDetailModel(QtCore.QAbstractTableModel):
- """
-    List of all synchronizable files per single representation.
-
- Used in detail window accessible after clicking on single repre in the
- summary.
-
- Args:
- sync_server (SyncServer) - object to call server operations (update
- db status, set site status...)
- header (list) - names of visible columns
- _id (string) - MongoDB _id of representation
- project (string) - collection name, all queries must be called on
- a specific collection
- """
- PAGE_SIZE = 30
- # TODO add filter filename
- DEFAULT_SORT = {
- "files.path": 1
- }
- SORT_BY_COLUMN = [
- "files.path",
- "updated_dt_local", # local created_dt
- "updated_dt_remote", # remote created_dt
- "progress_local", # local progress
- "progress_remote", # remote progress
-        "size", # file size
- "context.asset", # priority TODO
- "status" # state
- ]
-
- @attr.s
- class SyncRepresentationDetail:
- """
- Auxiliary object for easier handling.
-
- Fields must contain all header values (+ any arbitrary values).
- """
- _id = attr.ib()
- file = attr.ib()
- created_dt = attr.ib(default=None)
- sync_dt = attr.ib(default=None)
- local_site = attr.ib(default=None)
- remote_site = attr.ib(default=None)
- size = attr.ib(default=None)
- priority = attr.ib(default=None)
- state = attr.ib(default=None)
- tries = attr.ib(default=None)
- error = attr.ib(default=None)
- path = attr.ib(default=None)
-
- def __init__(self, sync_server, header, _id, project=None):
- super(SyncRepresentationDetailModel, self).__init__()
- self._header = header
- self._data = []
- self._project = project
- self._rec_loaded = 0
- self._total_records = 0 # how many documents query actually found
- self.filter = None
- self._id = _id
- self._initialized = False
-
- self.sync_server = sync_server
- # TODO think about admin mode
- # this is for regular user, always only single local and single remote
- self.local_site = self.sync_server.get_active_site(self._project)
- self.remote_site = self.sync_server.get_remote_site(self._project)
-
- self.sort = self.DEFAULT_SORT
-
- # in case we would like to hide/show some columns
- self.projection = self.get_default_projection()
-
- self.query = self.get_default_query()
- representations = self.dbcon.aggregate(self.query)
- self.refresh(representations)
-
- self.timer = QtCore.QTimer()
- self.timer.timeout.connect(self.tick)
- self.timer.start(SyncRepresentationModel.REFRESH_SEC)
-
- @property
- def dbcon(self):
- return self.sync_server.connection.database[self._project]
-
- def tick(self):
- self.refresh(representations=None, load_records=self._rec_loaded)
- self.timer.start(SyncRepresentationModel.REFRESH_SEC)
-
- def get_header_index(self, value):
- """
- Returns index of 'value' in headers
-
- Args:
- value (str): header name value
- Returns:
- (int)
- """
- return self._header.index(value)
-
- def data(self, index, role):
- item = self._data[index.row()]
- if role == Qt.DisplayRole:
- return attr.asdict(item)[self._header[index.column()]]
- if role == Qt.UserRole:
- return item._id
-
- def rowCount(self, index):
- return len(self._data)
-
- def columnCount(self, index):
- return len(self._header)
-
- def headerData(self, section, orientation, role):
- if role == Qt.DisplayRole:
- if orientation == Qt.Horizontal:
- return str(self._header[section])
-
- def refresh(self, representations=None, load_records=0):
- if self.sync_server.is_paused():
- return
-
- self.beginResetModel()
- self._data = []
- self._rec_loaded = 0
-
- if not representations:
- self.query = self.get_default_query(load_records)
- representations = self.dbcon.aggregate(self.query)
-
- self._add_page_records(self.local_site, self.remote_site,
- representations)
- self.endResetModel()
-
- def _add_page_records(self, local_site, remote_site, representations):
- """
- Process all records from 'representation' and add them to storage.
-
- Args:
- local_site (str): name of local site (mine)
- remote_site (str): name of cloud provider (theirs)
- representations (Mongo Cursor) - mimics result set, 1 object
- with paginatedResults array and totalCount array
- """
- # representations is a Cursor, get first
- result = representations.next()
- count = 0
- total_count = result.get("totalCount")
- if total_count:
- count = total_count.pop().get('count')
- self._total_records = count
-
- local_provider = _translate_provider_for_icon(self.sync_server,
- self._project,
- local_site)
- remote_provider = _translate_provider_for_icon(self.sync_server,
- self._project,
- remote_site)
-
- for repre in result.get("paginatedResults"):
- files = repre.get("files", [])
- if isinstance(files, dict): # aggregate returns dictionary
- files = [files]
-
- for file in files:
- local_updated = remote_updated = None
- if repre.get('updated_dt_local'):
- local_updated = \
- repre.get('updated_dt_local').strftime(
- "%Y%m%dT%H%M%SZ")
-
- if repre.get('updated_dt_remote'):
- remote_updated = \
- repre.get('updated_dt_remote').strftime(
- "%Y%m%dT%H%M%SZ")
-
- progress_remote = _convert_progress(
- repre.get('progress_remote', '0'))
- progress_local = _convert_progress(
- repre.get('progress_local', '0'))
-
- errors = []
- if repre.get('failed_remote_error'):
- errors.append(repre.get('failed_remote_error'))
- if repre.get('failed_local_error'):
- errors.append(repre.get('failed_local_error'))
-
- item = self.SyncRepresentationDetail(
- file.get("_id"),
- os.path.basename(file["path"]),
- local_updated,
- remote_updated,
- '{} {}'.format(local_provider, progress_local),
- '{} {}'.format(remote_provider, progress_remote),
- file.get('size', 0),
- 1,
- STATUS[repre.get("status", -1)],
- repre.get("tries"),
- '\n'.join(errors),
-                    file.get('path')
-                )
- self._data.append(item)
- self._rec_loaded += 1
-
- def canFetchMore(self, index):
- """
- Check if there are more records than currently loaded
- """
- # 'skip' might be suboptimal when representation hits 500k+
- return self._total_records > self._rec_loaded
-
- def fetchMore(self, index):
- """
- Add more record to model.
-
- Called when 'canFetchMore' returns true, which means there are
- more records in DB than loaded.
- 'self._buffer' is used to stash cursor to limit requery
- """
- log.debug("fetchMore")
- items_to_fetch = min(self._total_records - self._rec_loaded,
- self.PAGE_SIZE)
- self.query = self.get_default_query(self._rec_loaded)
- representations = self.dbcon.aggregate(self.query)
- self.beginInsertRows(index,
- self._rec_loaded,
- self._rec_loaded + items_to_fetch - 1)
-
- self._add_page_records(self.local_site, self.remote_site,
- representations)
-
- self.endInsertRows()
-
- def sort(self, index, order):
- # limit unwanted first re-sorting by view
- if index < 0:
- return
-
- self._rec_loaded = 0 # change sort - reset from start
-
- if order == 0:
- order = 1
- else:
- order = -1
-
- self.sort = {self.SORT_BY_COLUMN[index]: order}
- self.query = self.get_default_query()
-
- representations = self.dbcon.aggregate(self.query)
- self.refresh(representations)
-
- def set_filter(self, filter):
- self.filter = filter
- self.refresh()
-
- def get_index(self, id):
- """
- Get index of 'id' value.
-
- Used for keeping selection after refresh.
-
- Args:
- id (str): MongoDB _id
- Returns:
- (QModelIndex)
- """
- for i in range(self.rowCount(None)):
- index = self.index(i, 0)
- value = self.data(index, Qt.UserRole)
- if value == id:
- return index
- return None
-
- def get_default_query(self, limit=0):
- """
-        Returns the aggregation query used when no extra sorting, filtering
-        or projection is requested.
-
- Called for basic table view.
-
- Returns:
- [(dict)] - list with single dict - appropriate for aggregate
- function for MongoDB
- """
- if limit == 0:
- limit = SyncRepresentationModel.PAGE_SIZE
-
- return [
- {"$match": self._get_match_part()},
- {"$unwind": "$files"},
- {'$addFields': {
- 'order_remote': {
- '$filter': {'input': '$files.sites', 'as': 'p',
- 'cond': {'$eq': ['$$p.name', self.remote_site]}
- }},
- 'order_local': {
- '$filter': {'input': '$files.sites', 'as': 'p',
- 'cond': {'$eq': ['$$p.name', self.local_site]}
- }}
- }},
- {'$addFields': {
- # prepare progress per file, presence of 'created_dt' denotes
-            # successfully finished upload/download
- 'progress_remote': {'$first': {
- '$cond': [{'$size': "$order_remote.progress"},
- "$order_remote.progress",
- {'$cond': [
- {'$size': "$order_remote.created_dt"},
- [1],
- [0]
- ]}
- ]}},
- 'progress_local': {'$first': {
- '$cond': [{'$size': "$order_local.progress"},
- "$order_local.progress",
- {'$cond': [
- {'$size': "$order_local.created_dt"},
- [1],
- [0]
- ]}
- ]}},
- # file might be successfully created or failed, not both
- 'updated_dt_remote': {'$first': {
- '$cond': [
- {'$size': "$order_remote.created_dt"},
- "$order_remote.created_dt",
- {
- '$cond': [
- {'$size': "$order_remote.last_failed_dt"},
- "$order_remote.last_failed_dt",
- []
- ]
- }
- ]
- }},
- 'updated_dt_local': {'$first': {
- '$cond': [
- {'$size': "$order_local.created_dt"},
- "$order_local.created_dt",
- {
- '$cond': [
- {'$size': "$order_local.last_failed_dt"},
- "$order_local.last_failed_dt",
- []
- ]
- }
- ]
- }},
- 'paused_remote': {
- '$cond': [{'$size': "$order_remote.paused"},
- 1,
- 0]},
- 'paused_local': {
- '$cond': [{'$size': "$order_local.paused"},
- 1,
- 0]},
- 'failed_remote': {
- '$cond': [{'$size': "$order_remote.last_failed_dt"},
- 1,
- 0]},
- 'failed_local': {
- '$cond': [{'$size': "$order_local.last_failed_dt"},
- 1,
- 0]},
- 'failed_remote_error': {'$first': {
- '$cond': [{'$size': "$order_remote.error"},
- "$order_remote.error",
- [""]]}},
- 'failed_local_error': {'$first': {
- '$cond': [{'$size': "$order_local.error"},
- "$order_local.error",
- [""]]}},
- 'tries': {'$first': {
- '$cond': [
- {'$size': "$order_local.tries"},
- "$order_local.tries",
- {'$cond': [
- {'$size': "$order_remote.tries"},
- "$order_remote.tries",
- []
- ]}
- ]}}
- }},
- {"$project": self.projection},
- {"$sort": self.sort},
- {
- '$facet': {
- 'paginatedResults': [{'$skip': self._rec_loaded},
- {'$limit': limit}],
- 'totalCount': [{'$count': 'count'}]
- }
- }
- ]
-
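The tail of the pipeline above uses a common MongoDB pattern: `$facet` returns the paginated page and the total match count in a single round trip. A minimal, illustrative sketch of just that stage (not part of the diff; the helper name `pagination_facet` is hypothetical):

```python
def pagination_facet(skip, limit):
    # One aggregation yields both the requested page and the total count,
    # mirroring the 'paginatedResults'/'totalCount' facet in the query above.
    return {
        '$facet': {
            'paginatedResults': [{'$skip': skip}, {'$limit': limit}],
            'totalCount': [{'$count': 'count'}]
        }
    }


stage = pagination_facet(skip=40, limit=20)
```

The model appends this stage after `$match`, `$addFields`, `$project` and `$sort`, so skipping happens on the already sorted result set.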
- def _get_match_part(self):
- """
-        Returns different content for the '$match' stage when filtering
-        by file name is active.
-
- Returns:
- (dict)
- """
- if not self.filter:
- return {
- "type": "representation",
- "_id": self._id
- }
- else:
- regex_str = '.*{}.*'.format(self.filter)
- return {
- "type": "representation",
- "_id": self._id,
- '$or': [{'files.path': {'$regex': regex_str, '$options': 'i'}}]
- }
-
- def get_default_projection(self):
- """
- Projection part for aggregate query.
-
- All fields with '1' will be returned, no others.
-
- Returns:
- (dict)
- """
- return {
- "files": 1,
- 'progress_remote': 1,
- 'progress_local': 1,
- 'updated_dt_remote': 1,
- 'updated_dt_local': 1,
- 'paused_remote': 1,
- 'paused_local': 1,
- 'failed_remote_error': 1,
- 'failed_local_error': 1,
- 'tries': 1,
- 'status': {
- '$switch': {
- 'branches': [
- {
- 'case': {
- '$or': ['$paused_remote', '$paused_local']},
- 'then': 3 # Paused
- },
- {
- 'case': {
- '$and': [{'$or': ['$failed_remote',
- '$failed_local']},
- {'$eq': ['$tries', 3]}]},
- 'then': 1 # Failed (3 tries)
- },
- {
- 'case': {
- '$or': [{'$eq': ['$progress_remote', 0]},
- {'$eq': ['$progress_local', 0]}]},
- 'then': 2 # Queued
- },
- {
- 'case': {
- '$or': ['$failed_remote', '$failed_local']},
- 'then': 1 # Failed
- },
- {
- 'case': {'$or': [{'$and': [
- {'$gt': ['$progress_remote', 0]},
- {'$lt': ['$progress_remote', 1]}
- ]},
- {'$and': [
- {'$gt': ['$progress_local', 0]},
- {'$lt': ['$progress_local', 1]}
- ]}
- ]},
- 'then': 0 # In Progress
- },
- {
- 'case': {'$and': [
- {'$eq': ['$progress_remote', 1]},
- {'$eq': ['$progress_local', 1]}
- ]},
- 'then': 4 # Synced OK
- },
- ],
- 'default': -1
- }
- },
- 'data.path': 1
- }
-
-
-class ImageDelegate(QtWidgets.QStyledItemDelegate):
- """
- Prints icon of site and progress of synchronization
- """
-
- def __init__(self, parent=None):
- super(ImageDelegate, self).__init__(parent)
- self.icons = {}
-
- def paint(self, painter, option, index):
- option = QtWidgets.QStyleOptionViewItem(option)
- option.showDecorationSelected = True
-
- if (option.showDecorationSelected and
- (option.state & QtWidgets.QStyle.State_Selected)):
- painter.setOpacity(0.20) # highlight color is a bit off
- painter.fillRect(option.rect,
- option.palette.highlight())
- painter.setOpacity(1)
-
- d = index.data(QtCore.Qt.DisplayRole)
- if d:
- provider, value = d.split()
- else:
- return
-
- if not self.icons.get(provider):
- resource_path = os.path.dirname(__file__)
- resource_path = os.path.join(resource_path, "..",
- "providers", "resources")
- pix_url = "{}/{}.png".format(resource_path, provider)
- pixmap = QtGui.QPixmap(pix_url)
- self.icons[provider] = pixmap
- else:
- pixmap = self.icons[provider]
-
- point = QtCore.QPoint(option.rect.x() +
- (option.rect.width() - pixmap.width()) / 2,
- option.rect.y() +
- (option.rect.height() - pixmap.height()) / 2)
- painter.drawPixmap(point, pixmap)
-
- painter.setOpacity(0.5)
- overlay_rect = option.rect
- overlay_rect.setHeight(overlay_rect.height() * (1.0 - float(value)))
- painter.fillRect(overlay_rect,
- QtGui.QBrush(QtGui.QColor(0, 0, 0, 200)))
- painter.setOpacity(1)
-
-
-class SyncRepresentationErrorWindow(QtWidgets.QDialog):
- def __init__(self, _id, project, dt, tries, msg, parent=None):
- super(SyncRepresentationErrorWindow, self).__init__(parent)
- self.setWindowFlags(QtCore.Qt.Window)
- self.setFocusPolicy(QtCore.Qt.StrongFocus)
-
- self.setStyleSheet(style.load_stylesheet())
- self.setWindowIcon(QtGui.QIcon(style.app_icon_path()))
- self.resize(250, 200)
-
- body = QtWidgets.QWidget()
- footer = QtWidgets.QWidget()
- footer.setFixedHeight(20)
-
- container = SyncRepresentationErrorWidget(_id, project, dt, tries, msg,
- parent=self)
- body_layout = QtWidgets.QHBoxLayout(body)
- body_layout.addWidget(container)
- body_layout.setContentsMargins(0, 0, 0, 0)
-
- message = QtWidgets.QLabel()
- message.hide()
-
- footer_layout = QtWidgets.QVBoxLayout(footer)
- footer_layout.addWidget(message)
- footer_layout.setContentsMargins(0, 0, 0, 0)
-
- layout = QtWidgets.QVBoxLayout(self)
- layout.addWidget(body)
- layout.addWidget(footer)
-
-        self.setLayout(layout)
- self.setWindowTitle("Sync Representation Error Detail")
-
-
-class SyncRepresentationErrorWidget(QtWidgets.QWidget):
- """
- Dialog to show when sync error happened, prints error message
- """
-
- def __init__(self, _id, project, dt, tries, msg, parent=None):
- super(SyncRepresentationErrorWidget, self).__init__(parent)
-
- layout = QtWidgets.QFormLayout(self)
- layout.addRow(QtWidgets.QLabel("Last update date"),
- QtWidgets.QLabel(pretty_timestamp(dt)))
- layout.addRow(QtWidgets.QLabel("Retries"),
- QtWidgets.QLabel(str(tries)))
- layout.addRow(QtWidgets.QLabel("Error message"),
- QtWidgets.QLabel(msg))
-
-
-class SizeDelegate(QtWidgets.QStyledItemDelegate):
- """
- Pretty print for file size
- """
-
- def __init__(self, parent=None):
- super(SizeDelegate, self).__init__(parent)
-
- def displayText(self, value, locale):
- if value is None:
- # Ignore None value
- return
-
- return self._pretty_size(value)
-
- def _pretty_size(self, value, suffix='B'):
- for unit in ['', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi']:
- if abs(value) < 1024.0:
- return "%3.1f%s%s" % (value, unit, suffix)
- value /= 1024.0
- return "%.1f%s%s" % (value, 'Yi', suffix)
-
-
-def _convert_progress(value):
- try:
- progress = float(value)
- except (ValueError, TypeError):
- progress = 0.0
-
- return progress
-
-
-def _translate_provider_for_icon(sync_server, project, site):
- """
- Get provider for 'site'
-
-    This is used for getting an icon: 'studio' should have a different icon
-    than local sites, even though the provider 'local_drive' is the same.
-
- """
- if site == sync_server.DEFAULT_SITE:
- return sync_server.DEFAULT_SITE
- return sync_server.get_provider_for_site(project, site)
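The `$switch` in `get_default_projection` encodes the status precedence (Paused wins over Failed, Failed over Queued, and so on). A pure-Python mirror of that branch order may make it easier to follow; this is an illustrative sketch only (the real computation runs inside MongoDB's aggregation pipeline, and the `compute_status` name is hypothetical):

```python
def compute_status(paused, failed, progress_remote, progress_local, tries=0):
    """Mirror the '$switch' branch order used by the aggregation."""
    if paused:
        return 3  # Paused
    if failed and tries >= 3:
        return 1  # Failed (3 tries exhausted)
    if progress_remote == 0 or progress_local == 0:
        return 2  # Queued
    if failed:
        return 1  # Failed
    if 0 < progress_remote < 1 or 0 < progress_local < 1:
        return 0  # In Progress
    if progress_remote == 1 and progress_local == 1:
        return 4  # Synced OK
    return -1  # Not available
```

Note that branch order matters: a file with zero progress but recorded failures is still reported as Queued unless the retry limit was hit, exactly as in the pipeline above.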
diff --git a/openpype/modules/sync_server/tray/lib.py b/openpype/modules/sync_server/tray/lib.py
new file mode 100644
index 0000000000..0282d79ea1
--- /dev/null
+++ b/openpype/modules/sync_server/tray/lib.py
@@ -0,0 +1,52 @@
+from Qt import QtCore
+
+from openpype.lib import PypeLogger
+
+
+log = PypeLogger().get_logger("SyncServer")
+
+STATUS = {
+ 0: 'In Progress',
+    1: 'Failed',
+    2: 'Queued',
+ 3: 'Paused',
+ 4: 'Synced OK',
+ -1: 'Not available'
+}
+
+DUMMY_PROJECT = "No project configured"
+
+ProviderRole = QtCore.Qt.UserRole + 2
+ProgressRole = QtCore.Qt.UserRole + 4
+DateRole = QtCore.Qt.UserRole + 6
+FailedRole = QtCore.Qt.UserRole + 8
+
+
+def pretty_size(value, suffix='B'):
+ for unit in ['', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi']:
+ if abs(value) < 1024.0:
+ return "%3.1f%s%s" % (value, unit, suffix)
+ value /= 1024.0
+ return "%.1f%s%s" % (value, 'Yi', suffix)
+
+
+def convert_progress(value):
+ try:
+ progress = float(value)
+ except (ValueError, TypeError):
+ progress = 0.0
+
+ return progress
+
+
+def translate_provider_for_icon(sync_server, project, site):
+ """
+ Get provider for 'site'
+
+    This is used for getting an icon: 'studio' should have a different icon
+    than local sites, even though the provider 'local_drive' is the same.
+
+ """
+ if site == sync_server.DEFAULT_SITE:
+ return sync_server.DEFAULT_SITE
+ return sync_server.get_provider_for_site(project, site)
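The helpers in the new `lib.py` are pure functions, so they can be exercised standalone. A short usage sketch, with the bodies copied inline so it runs without the module:

```python
def pretty_size(value, suffix='B'):
    # Human-readable file size, mirrors lib.pretty_size.
    for unit in ['', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi']:
        if abs(value) < 1024.0:
            return "%3.1f%s%s" % (value, unit, suffix)
        value /= 1024.0
    return "%.1f%s%s" % (value, 'Yi', suffix)


def convert_progress(value):
    # Tolerant conversion of stored progress values, mirrors
    # lib.convert_progress: anything unparsable falls back to 0.0.
    try:
        return float(value)
    except (ValueError, TypeError):
        return 0.0


print(pretty_size(1536))        # 1.5KiB
print(convert_progress("0.5"))  # 0.5
print(convert_progress(None))   # 0.0
```

The fallback to `0.0` matters for the models above: missing or malformed progress fields simply render a file as not yet synced rather than raising.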
diff --git a/openpype/modules/sync_server/tray/models.py b/openpype/modules/sync_server/tray/models.py
new file mode 100644
index 0000000000..3cc53c6ec4
--- /dev/null
+++ b/openpype/modules/sync_server/tray/models.py
@@ -0,0 +1,1124 @@
+import os
+import attr
+from bson.objectid import ObjectId
+
+from Qt import QtCore
+from Qt.QtCore import Qt
+
+from avalon.tools.delegates import pretty_timestamp
+
+from openpype.lib import PypeLogger
+
+from openpype.modules.sync_server.tray import lib
+
+
+log = PypeLogger().get_logger("SyncServer")
+
+
+class ProjectModel(QtCore.QAbstractListModel):
+ def __init__(self, *args, projects=None, **kwargs):
+ super(ProjectModel, self).__init__(*args, **kwargs)
+ self.projects = projects or []
+
+    def data(self, index, role):
+        if role == Qt.DisplayRole:
+            # items are (status, name) tuples, show only the name
+            _status, name = self.projects[index.row()]
+            return name
+
+    def rowCount(self, _index):
+        return len(self.projects)
+
+    def columnCount(self, _index):
+        return 1
+
+
+class _SyncRepresentationModel(QtCore.QAbstractTableModel):
+
+ COLUMN_LABELS = []
+
+ PAGE_SIZE = 20 # default page size to query for
+    REFRESH_SEC = 5000  # in milliseconds, requery DB for new status
+
+ @property
+ def dbcon(self):
+ """
+ Database object with preselected project (collection) to run DB
+ operations (find, aggregate).
+
+        All queries should go through this property so they run against
+        the correct collection.
+ """
+ return self.sync_server.connection.database[self.project]
+
+ @property
+ def project(self):
+ """Returns project"""
+ return self._project
+
+ def rowCount(self, _index):
+ return len(self._data)
+
+ def columnCount(self, _index):
+ return len(self._header)
+
+ def headerData(self, section, orientation, role):
+ if role == Qt.DisplayRole:
+ if orientation == Qt.Horizontal:
+ return self.COLUMN_LABELS[section][1]
+
+ def get_header_index(self, value):
+ """
+ Returns index of 'value' in headers
+
+ Args:
+ value (str): header name value
+ Returns:
+ (int)
+ """
+ return self._header.index(value)
+
+ def refresh(self, representations=None, load_records=0):
+ """
+ Reloads representations from DB if necessary, adds them to model.
+
+ Runs periodically (every X seconds) or by demand (change of
+ sorting, filtering etc.)
+
+ Emits 'modelReset' signal.
+
+ Args:
+ representations (PaginationResult object): pass result of
+ aggregate query from outside - mostly for testing only
+            load_records (int) - how many records should actually be
+                queried; grows when the user has scrolled past a single
+                page of records
+ """
+ if self.sync_server.is_paused() or \
+ self.sync_server.is_project_paused(self.project):
+ return
+ self.refresh_started.emit()
+ self.beginResetModel()
+ self._data = []
+ self._rec_loaded = 0
+
+ if not representations:
+ self.query = self.get_default_query(load_records)
+ representations = self.dbcon.aggregate(self.query)
+
+ self.add_page_records(self.local_site, self.remote_site,
+ representations)
+ self.endResetModel()
+ self.refresh_finished.emit()
+
+ def tick(self):
+ """
+ Triggers refresh of model.
+
+ Because of pagination, prepared (sorting, filtering) query needs
+ to be run on DB every X seconds.
+ """
+ self.refresh(representations=None, load_records=self._rec_loaded)
+ self.timer.start(self.REFRESH_SEC)
+
+ def canFetchMore(self, _index):
+ """
+ Check if there are more records than currently loaded
+ """
+ # 'skip' might be suboptimal when representation hits 500k+
+ return self._total_records > self._rec_loaded
+
+ def fetchMore(self, index):
+ """
+ Add more record to model.
+
+ Called when 'canFetchMore' returns true, which means there are
+ more records in DB than loaded.
+ """
+ log.debug("fetchMore")
+ items_to_fetch = min(self._total_records - self._rec_loaded,
+ self.PAGE_SIZE)
+ self.query = self.get_default_query(self._rec_loaded)
+ representations = self.dbcon.aggregate(self.query)
+ self.beginInsertRows(index,
+ self._rec_loaded,
+ self._rec_loaded + items_to_fetch - 1)
+
+ self.add_page_records(self.local_site, self.remote_site,
+ representations)
+
+ self.endInsertRows()
+
+ def sort(self, index, order):
+ """
+ Summary sort per representation.
+
+ Sort is happening on a DB side, model is reset, db queried
+ again.
+
+ Args:
+ index (int): column index
+            order (int): Qt.AscendingOrder (0) or Qt.DescendingOrder (1)
+ """
+ # limit unwanted first re-sorting by view
+ if index < 0:
+ return
+
+ self._rec_loaded = 0
+ if order == 0:
+ order = 1
+ else:
+ order = -1
+
+ self.sort = {self.SORT_BY_COLUMN[index]: order, '_id': 1}
+ self.query = self.get_default_query()
+
+ representations = self.dbcon.aggregate(self.query)
+ self.refresh(representations)
+
+ def set_filter(self, word_filter):
+ """
+ Adds text value filtering
+
+ Args:
+ word_filter (str): string inputted by user
+ """
+ self.word_filter = word_filter
+ self.refresh()
+
+ def set_project(self, project):
+ """
+ Changes project, called after project selection is changed
+
+ Args:
+ project (str): name of project
+ """
+ self._project = project
+ self.sync_server.set_sync_project_settings()
+ self.local_site = self.sync_server.get_active_site(self.project)
+ self.remote_site = self.sync_server.get_remote_site(self.project)
+ self.refresh()
+
+ def get_index(self, id):
+ """
+ Get index of 'id' value.
+
+ Used for keeping selection after refresh.
+
+ Args:
+ id (str): MongoDB _id
+ Returns:
+ (QModelIndex)
+ """
+ for i in range(self.rowCount(None)):
+ index = self.index(i, 0)
+ value = self.data(index, Qt.UserRole)
+ if value == id:
+ return index
+ return None
+
+
+class SyncRepresentationSummaryModel(_SyncRepresentationModel):
+ """
+ Model for summary of representations.
+
+ Groups files information per representation. Allows sorting and
+ full text filtering.
+
+ Allows pagination, most of heavy lifting is being done on DB side.
+ Single model matches to single collection. When project is changed,
+ model is reset and refreshed.
+
+ Args:
+ sync_server (SyncServer) - object to call server operations (update
+ db status, set site status...)
+ header (list) - names of visible columns
+ project (string) - collection name, all queries must be called on
+ a specific collection
+
+ """
+ COLUMN_LABELS = [
+ ("asset", "Asset"),
+ ("subset", "Subset"),
+ ("version", "Version"),
+ ("representation", "Representation"),
+ ("local_site", "Active site"),
+ ("remote_site", "Remote site"),
+ ("files_count", "Files"),
+ ("files_size", "Size"),
+ ("priority", "Priority"),
+ ("state", "Status")
+ ]
+
+ DEFAULT_SORT = {
+ "updated_dt_remote": -1,
+ "_id": 1
+ }
+ SORT_BY_COLUMN = [
+ "context.asset", # asset
+ "context.subset", # subset
+ "context.version", # version
+ "context.representation", # representation
+ "updated_dt_local", # local created_dt
+ "updated_dt_remote", # remote created_dt
+ "files_count", # count of files
+ "files_size", # file size of all files
+ "context.asset", # priority TODO
+ "status" # state
+ ]
+
+ refresh_started = QtCore.Signal()
+ refresh_finished = QtCore.Signal()
+
+ @attr.s
+ class SyncRepresentation:
+ """
+ Auxiliary object for easier handling.
+
+ Fields must contain all header values (+ any arbitrary values).
+ """
+ _id = attr.ib()
+ asset = attr.ib()
+ subset = attr.ib()
+ version = attr.ib()
+ representation = attr.ib()
+ created_dt = attr.ib(default=None)
+ sync_dt = attr.ib(default=None)
+ local_site = attr.ib(default=None)
+ remote_site = attr.ib(default=None)
+ local_provider = attr.ib(default=None)
+ remote_provider = attr.ib(default=None)
+ local_progress = attr.ib(default=None)
+ remote_progress = attr.ib(default=None)
+ files_count = attr.ib(default=None)
+ files_size = attr.ib(default=None)
+ priority = attr.ib(default=None)
+ state = attr.ib(default=None)
+ path = attr.ib(default=None)
+
+ def __init__(self, sync_server, header, project=None):
+ super(SyncRepresentationSummaryModel, self).__init__()
+ self._header = header
+ self._data = []
+ self._project = project
+ self._rec_loaded = 0
+ self._total_records = 0 # how many documents query actually found
+ self.word_filter = None
+
+ self._initialized = False
+ if not self._project or self._project == lib.DUMMY_PROJECT:
+ return
+
+ self.sync_server = sync_server
+ # TODO think about admin mode
+ # this is for regular user, always only single local and single remote
+ self.local_site = self.sync_server.get_active_site(self.project)
+ self.remote_site = self.sync_server.get_remote_site(self.project)
+
+ self.projection = self.get_default_projection()
+
+ self.sort = self.DEFAULT_SORT
+
+ self.query = self.get_default_query()
+ self.default_query = list(self.get_default_query())
+
+ representations = self.dbcon.aggregate(self.query)
+ self.refresh(representations)
+
+ self.timer = QtCore.QTimer()
+ self.timer.timeout.connect(self.tick)
+ self.timer.start(self.REFRESH_SEC)
+
+ def data(self, index, role):
+ item = self._data[index.row()]
+
+ header_value = self._header[index.column()]
+ if role == lib.ProviderRole:
+ if header_value == 'local_site':
+ return item.local_provider
+ if header_value == 'remote_site':
+ return item.remote_provider
+
+ if role == lib.ProgressRole:
+ if header_value == 'local_site':
+ return item.local_progress
+ if header_value == 'remote_site':
+ return item.remote_progress
+
+ if role == lib.DateRole:
+ if header_value == 'local_site':
+ if item.created_dt:
+ return pretty_timestamp(item.created_dt)
+ if header_value == 'remote_site':
+ if item.sync_dt:
+ return pretty_timestamp(item.sync_dt)
+
+ if role == lib.FailedRole:
+ if header_value == 'local_site':
+ return item.state == lib.STATUS[2] and item.local_progress < 1
+ if header_value == 'remote_site':
+ return item.state == lib.STATUS[2] and item.remote_progress < 1
+
+ if role == Qt.DisplayRole:
+ # because of ImageDelegate
+ if header_value in ['remote_site', 'local_site']:
+ return ""
+
+ return attr.asdict(item)[self._header[index.column()]]
+ if role == Qt.UserRole:
+ return item._id
+
+ def add_page_records(self, local_site, remote_site, representations):
+ """
+ Process all records from 'representation' and add them to storage.
+
+ Args:
+ local_site (str): name of local site (mine)
+ remote_site (str): name of cloud provider (theirs)
+ representations (Mongo Cursor) - mimics result set, 1 object
+ with paginatedResults array and totalCount array
+ """
+ result = representations.next()
+ count = 0
+ total_count = result.get("totalCount")
+ if total_count:
+ count = total_count.pop().get('count')
+ self._total_records = count
+
+ local_provider = lib.translate_provider_for_icon(self.sync_server,
+ self.project,
+ local_site)
+ remote_provider = lib.translate_provider_for_icon(self.sync_server,
+ self.project,
+ remote_site)
+
+ for repre in result.get("paginatedResults"):
+ context = repre.get("context").pop()
+ files = repre.get("files", [])
+ if isinstance(files, dict): # aggregate returns dictionary
+ files = [files]
+
+ # representation without files doesn't concern us
+ if not files:
+ continue
+
+ local_updated = remote_updated = None
+ if repre.get('updated_dt_local'):
+ local_updated = \
+ repre.get('updated_dt_local').strftime("%Y%m%dT%H%M%SZ")
+
+ if repre.get('updated_dt_remote'):
+ remote_updated = \
+ repre.get('updated_dt_remote').strftime("%Y%m%dT%H%M%SZ")
+
+ avg_progress_remote = lib.convert_progress(
+ repre.get('avg_progress_remote', '0'))
+ avg_progress_local = lib.convert_progress(
+ repre.get('avg_progress_local', '0'))
+
+ if context.get("version"):
+ version = "v{:0>3d}".format(context.get("version"))
+ else:
+ version = "master"
+
+ item = self.SyncRepresentation(
+ repre.get("_id"),
+ context.get("asset"),
+ context.get("subset"),
+ version,
+ context.get("representation"),
+ local_updated,
+ remote_updated,
+ local_site,
+ remote_site,
+ local_provider,
+ remote_provider,
+ avg_progress_local,
+ avg_progress_remote,
+ repre.get("files_count", 1),
+ lib.pretty_size(repre.get("files_size", 0)),
+ 1,
+ lib.STATUS[repre.get("status", -1)],
+ files[0].get('path')
+ )
+
+ self._data.append(item)
+ self._rec_loaded += 1
+
+ def get_default_query(self, limit=0):
+ """
+ Returns basic aggregate query for main table.
+
+ The main table provides summary information about a representation,
+ which may have multiple files. Details are accessible by double
+ clicking on a representation row.
+ Columns:
+ 'created_dt' - max of created or updated (when failed) per repr
+ 'sync_dt' - same for remote side
+ 'local_site' - progress of repr on local side, 1 = finished
+ 'remote_site' - progress on remote side, calculates from files
+ 'state' -
+ 0 - in progress
+ 1 - queued
+ 2 - failed
+ 3 - paused
+ 4 - finished on both sides
+
+ These columns are calculated, and must be calculated in the DB,
+ because of pagination
+
+ Args:
+ limit (int): how many records should be returned; defaults to
+ 'PAGE_SIZE' for performance.
+ Should be overridden with the number of already loaded records
+ for refresh functionality (more records were fetched by
+ scrolling; a refresh shouldn't reset that)
+ """
+ if limit == 0:
+ limit = SyncRepresentationSummaryModel.PAGE_SIZE
+
+ return [
+ {"$match": self.get_match_part()},
+ {'$unwind': '$files'},
+ # merge potentially unwound records back to single per repre
+ {'$addFields': {
+ 'order_remote': {
+ '$filter': {'input': '$files.sites', 'as': 'p',
+ 'cond': {'$eq': ['$$p.name', self.remote_site]}
+ }},
+ 'order_local': {
+ '$filter': {'input': '$files.sites', 'as': 'p',
+ 'cond': {'$eq': ['$$p.name', self.local_site]}
+ }}
+ }},
+ {'$addFields': {
+ # prepare progress per file, presence of 'created_dt' denotes
+ # successfully finished load/download
+ 'progress_remote': {'$first': {
+ '$cond': [{'$size': "$order_remote.progress"},
+ "$order_remote.progress",
+ {'$cond': [
+ {'$size': "$order_remote.created_dt"},
+ [1],
+ [0]
+ ]}
+ ]}},
+ 'progress_local': {'$first': {
+ '$cond': [{'$size': "$order_local.progress"},
+ "$order_local.progress",
+ {'$cond': [
+ {'$size': "$order_local.created_dt"},
+ [1],
+ [0]
+ ]}
+ ]}},
+ # file might be successfully created or failed, not both
+ 'updated_dt_remote': {'$first': {
+ '$cond': [{'$size': "$order_remote.created_dt"},
+ "$order_remote.created_dt",
+ {'$cond': [
+ {'$size': "$order_remote.last_failed_dt"},
+ "$order_remote.last_failed_dt",
+ []
+ ]}
+ ]}},
+ 'updated_dt_local': {'$first': {
+ '$cond': [{'$size': "$order_local.created_dt"},
+ "$order_local.created_dt",
+ {'$cond': [
+ {'$size': "$order_local.last_failed_dt"},
+ "$order_local.last_failed_dt",
+ []
+ ]}
+ ]}},
+ 'files_size': {'$ifNull': ["$files.size", 0]},
+ 'failed_remote': {
+ '$cond': [{'$size': "$order_remote.last_failed_dt"},
+ 1,
+ 0]},
+ 'failed_local': {
+ '$cond': [{'$size': "$order_local.last_failed_dt"},
+ 1,
+ 0]},
+ 'failed_local_tries': {
+ '$cond': [{'$size': '$order_local.tries'},
+ {'$first': '$order_local.tries'},
+ 0]},
+ 'failed_remote_tries': {
+ '$cond': [{'$size': '$order_remote.tries'},
+ {'$first': '$order_remote.tries'},
+ 0]},
+ 'paused_remote': {
+ '$cond': [{'$size': "$order_remote.paused"},
+ 1,
+ 0]},
+ 'paused_local': {
+ '$cond': [{'$size': "$order_local.paused"},
+ 1,
+ 0]},
+ }},
+ {'$group': {
+ '_id': '$_id',
+ # pass through context - same for representation
+ 'context': {'$addToSet': '$context'},
+ 'data': {'$addToSet': '$data'},
+ # pass through files as a list
+ 'files': {'$addToSet': '$files'},
+ # count how many files
+ 'files_count': {'$sum': 1},
+ 'files_size': {'$sum': '$files_size'},
+ # sum avg progress, finished = 1
+ 'avg_progress_remote': {'$avg': "$progress_remote"},
+ 'avg_progress_local': {'$avg': "$progress_local"},
+ # select last touch of file
+ 'updated_dt_remote': {'$max': "$updated_dt_remote"},
+ 'failed_remote': {'$sum': '$failed_remote'},
+ 'failed_local': {'$sum': '$failed_local'},
+ 'failed_remote_tries': {'$sum': '$failed_remote_tries'},
+ 'failed_local_tries': {'$sum': '$failed_local_tries'},
+ 'paused_remote': {'$sum': '$paused_remote'},
+ 'paused_local': {'$sum': '$paused_local'},
+ 'updated_dt_local': {'$max': "$updated_dt_local"}
+ }},
+ {"$project": self.projection},
+ {"$sort": self.sort},
+ {
+ '$facet': {
+ 'paginatedResults': [{'$skip': self._rec_loaded},
+ {'$limit': limit}],
+ 'totalCount': [{'$count': 'count'}]
+ }
+ }
+ ]
+
+ def get_match_part(self):
+ """
+ Extend match part with word_filter if present.
+
+ Filter is set by user input. Each model has different fields to be
+ checked.
+ If performance issues are found, '$text' and text indexes should
+ be investigated.
+
+ Fulltext searches in:
+ context.subset
+ context.asset
+ context.representation names AND _id (ObjectId)
+ """
+ base_match = {
+ "type": "representation",
+ 'files.sites.name': {'$all': [self.local_site,
+ self.remote_site]}
+ }
+ if not self.word_filter:
+ return base_match
+ else:
+ regex_str = '.*{}.*'.format(self.word_filter)
+ base_match['$or'] = [
+ {'context.subset': {'$regex': regex_str, '$options': 'i'}},
+ {'context.asset': {'$regex': regex_str, '$options': 'i'}},
+ {'context.representation': {'$regex': regex_str,
+ '$options': 'i'}}]
+
+ if ObjectId.is_valid(self.word_filter):
+ base_match['$or'] = [{'_id': ObjectId(self.word_filter)}]
+
+ return base_match
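The filter-to-match translation above can be exercised without a database; the sketch below reproduces the same branching with a regex stand-in for `ObjectId.is_valid` (the `build_match` helper is hypothetical, shown only for illustration):

```python
import re

# stand-in for bson ObjectId.is_valid: 24 hex characters
OBJECT_ID_RE = re.compile(r"^[0-9a-fA-F]{24}$")


def build_match(local_site, remote_site, word_filter=None):
    # base: representations known to both sites
    base_match = {
        "type": "representation",
        "files.sites.name": {"$all": [local_site, remote_site]},
    }
    if not word_filter:
        return base_match

    # fulltext branch: case-insensitive regex over context fields
    regex_str = ".*{}.*".format(word_filter)
    base_match["$or"] = [
        {"context.subset": {"$regex": regex_str, "$options": "i"}},
        {"context.asset": {"$regex": regex_str, "$options": "i"}},
        {"context.representation": {"$regex": regex_str, "$options": "i"}},
    ]
    # an ObjectId-like filter replaces the fulltext branches entirely
    if OBJECT_ID_RE.match(word_filter):
        base_match["$or"] = [{"_id": word_filter}]
    return base_match
```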
+
+ def get_default_projection(self):
+ """
+ Projection part for aggregate query.
+
+ All fields with '1' will be returned, no others.
+
+ Returns:
+ (dict)
+ """
+ return {
+ "context.subset": 1,
+ "context.asset": 1,
+ "context.version": 1,
+ "context.representation": 1,
+ "data.path": 1,
+ "files": 1,
+ 'files_count': 1,
+ "files_size": 1,
+ 'avg_progress_remote': 1,
+ 'avg_progress_local': 1,
+ 'updated_dt_remote': 1,
+ 'updated_dt_local': 1,
+ 'paused_remote': 1,
+ 'paused_local': 1,
+ 'status': {
+ '$switch': {
+ 'branches': [
+ {
+ 'case': {
+ '$or': ['$paused_remote', '$paused_local']},
+ 'then': 3 # Paused
+ },
+ {
+ 'case': {
+ '$or': [
+ {'$gte': ['$failed_local_tries', 3]},
+ {'$gte': ['$failed_remote_tries', 3]}
+ ]},
+ 'then': 2}, # Failed
+ {
+ 'case': {
+ '$or': [{'$eq': ['$avg_progress_remote', 0]},
+ {'$eq': ['$avg_progress_local', 0]}]},
+ 'then': 1 # Queued
+ },
+ {
+ 'case': {'$or': [{'$and': [
+ {'$gt': ['$avg_progress_remote', 0]},
+ {'$lt': ['$avg_progress_remote', 1]}
+ ]},
+ {'$and': [
+ {'$gt': ['$avg_progress_local', 0]},
+ {'$lt': ['$avg_progress_local', 1]}
+ ]}
+ ]},
+ 'then': 0 # In progress
+ },
+ {
+ 'case': {'$and': [
+ {'$eq': ['$avg_progress_remote', 1]},
+ {'$eq': ['$avg_progress_local', 1]}
+ ]},
+ 'then': 4 # Synced OK
+ },
+ ],
+ 'default': -1
+ }
+ }
+ }
+
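The `$switch` above encodes a precedence between states. A framework-free sketch of the same decision order, with the local/remote pause and tries fields collapsed into single arguments for brevity (`summary_status` is a hypothetical helper, not part of the model):

```python
def summary_status(paused, failed_tries, progress_local, progress_remote):
    """Mirror the '$switch' branch order of get_default_projection."""
    if paused:
        return 3  # Paused wins over everything
    if failed_tries >= 3:
        return 2  # Failed after 3 tries
    if progress_local == 0 or progress_remote == 0:
        return 1  # Queued: nothing transferred on one side yet
    if 0 < progress_local < 1 or 0 < progress_remote < 1:
        return 0  # In progress on either side
    if progress_local == 1 and progress_remote == 1:
        return 4  # Synced OK on both sides
    return -1  # default branch
```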
+
+class SyncRepresentationDetailModel(_SyncRepresentationModel):
+ """
+ List of all synchronizable files for a single representation.
+
+ Used in the detail window opened by double clicking a representation
+ in the summary.
+
+ Args:
+ sync_server (SyncServer) - object to call server operations (update
+ db status, set site status...)
+ header (list) - names of visible columns
+ _id (string) - MongoDB _id of representation
+ project (string) - collection name, all queries must be called on
+ a specific collection
+ """
+ COLUMN_LABELS = [
+ ("file", "File name"),
+ ("local_site", "Active site"),
+ ("remote_site", "Remote site"),
+ ("files_size", "Size"),
+ ("priority", "Priority"),
+ ("state", "Status")
+ ]
+
+ PAGE_SIZE = 30
+ DEFAULT_SORT = {
+ "files.path": 1
+ }
+ SORT_BY_COLUMN = [
+ "files.path",
+ "updated_dt_local", # local created_dt
+ "updated_dt_remote", # remote created_dt
+ "size", # remote progress
+ "context.asset", # priority TODO
+ "status" # state
+ ]
+
+ refresh_started = QtCore.Signal()
+ refresh_finished = QtCore.Signal()
+
+ @attr.s
+ class SyncRepresentationDetail:
+ """
+ Auxiliary object for easier handling.
+
+ Fields must contain all header values (+ any arbitrary values).
+ """
+ _id = attr.ib()
+ file = attr.ib()
+ created_dt = attr.ib(default=None)
+ sync_dt = attr.ib(default=None)
+ local_site = attr.ib(default=None)
+ remote_site = attr.ib(default=None)
+ local_provider = attr.ib(default=None)
+ remote_provider = attr.ib(default=None)
+ local_progress = attr.ib(default=None)
+ remote_progress = attr.ib(default=None)
+ size = attr.ib(default=None)
+ priority = attr.ib(default=None)
+ state = attr.ib(default=None)
+ tries = attr.ib(default=None)
+ error = attr.ib(default=None)
+ path = attr.ib(default=None)
+
+ def __init__(self, sync_server, header, _id,
+ project=None):
+ super(SyncRepresentationDetailModel, self).__init__()
+ self._header = header
+ self._data = []
+ self._project = project
+ self._rec_loaded = 0
+ self._total_records = 0 # how many documents query actually found
+ self.word_filter = None
+ self._id = _id
+ self._initialized = False
+
+ self.sync_server = sync_server
+ # TODO think about admin mode
+ # this is for regular user, always only single local and single remote
+ self.local_site = self.sync_server.get_active_site(self.project)
+ self.remote_site = self.sync_server.get_remote_site(self.project)
+
+ self.sort = self.DEFAULT_SORT
+
+ # in case we would like to hide/show some columns
+ self.projection = self.get_default_projection()
+
+ self.query = self.get_default_query()
+ representations = self.dbcon.aggregate(self.query)
+ self.refresh(representations)
+
+ self.timer = QtCore.QTimer()
+ self.timer.timeout.connect(self.tick)
+ self.timer.start(SyncRepresentationSummaryModel.REFRESH_SEC)
+
+ def data(self, index, role):
+ item = self._data[index.row()]
+
+ header_value = self._header[index.column()]
+ if role == lib.ProviderRole:
+ if header_value == 'local_site':
+ return item.local_provider
+ if header_value == 'remote_site':
+ return item.remote_provider
+
+ if role == lib.ProgressRole:
+ if header_value == 'local_site':
+ return item.local_progress
+ if header_value == 'remote_site':
+ return item.remote_progress
+
+ if role == lib.DateRole:
+ if header_value == 'local_site':
+ if item.created_dt:
+ return pretty_timestamp(item.created_dt)
+ if header_value == 'remote_site':
+ if item.sync_dt:
+ return pretty_timestamp(item.sync_dt)
+
+ if role == lib.FailedRole:
+ if header_value == 'local_site':
+ return item.state == lib.STATUS[2] and item.local_progress < 1
+ if header_value == 'remote_site':
+ return item.state == lib.STATUS[2] and item.remote_progress < 1
+
+ if role == Qt.DisplayRole:
+ # because of ImageDelegate
+ if header_value in ['remote_site', 'local_site']:
+ return ""
+ return attr.asdict(item)[self._header[index.column()]]
+ if role == Qt.UserRole:
+ return item._id
+
+ def add_page_records(self, local_site, remote_site, representations):
+ """
+ Process all records from 'representation' and add them to storage.
+
+ Args:
+ local_site (str): name of local site (mine)
+ remote_site (str): name of cloud provider (theirs)
+ representations (Mongo Cursor) - mimics result set, 1 object
+ with paginatedResults array and totalCount array
+ """
+ # representations is a Cursor, get first
+ result = representations.next()
+ count = 0
+ total_count = result.get("totalCount")
+ if total_count:
+ count = total_count.pop().get('count')
+ self._total_records = count
+
+ local_provider = lib.translate_provider_for_icon(self.sync_server,
+ self.project,
+ local_site)
+ remote_provider = lib.translate_provider_for_icon(self.sync_server,
+ self.project,
+ remote_site)
+
+ for repre in result.get("paginatedResults"):
+ files = repre.get("files", [])
+ if isinstance(files, dict): # aggregate returns dictionary
+ files = [files]
+
+ for file in files:
+ local_updated = remote_updated = None
+ if repre.get('updated_dt_local'):
+ local_updated = \
+ repre.get('updated_dt_local').strftime(
+ "%Y%m%dT%H%M%SZ")
+
+ if repre.get('updated_dt_remote'):
+ remote_updated = \
+ repre.get('updated_dt_remote').strftime(
+ "%Y%m%dT%H%M%SZ")
+
+ remote_progress = lib.convert_progress(
+ repre.get('progress_remote', '0'))
+ local_progress = lib.convert_progress(
+ repre.get('progress_local', '0'))
+
+ errors = []
+ if repre.get('failed_remote_error'):
+ errors.append(repre.get('failed_remote_error'))
+ if repre.get('failed_local_error'):
+ errors.append(repre.get('failed_local_error'))
+
+ item = self.SyncRepresentationDetail(
+ file.get("_id"),
+ os.path.basename(file["path"]),
+ local_updated,
+ remote_updated,
+ local_site,
+ remote_site,
+ local_provider,
+ remote_provider,
+ local_progress,
+ remote_progress,
+ lib.pretty_size(file.get('size', 0)),
+ 1,
+ lib.STATUS[repre.get("status", -1)],
+ repre.get("tries"),
+ '\n'.join(errors),
+ file.get('path')
+
+ )
+ self._data.append(item)
+ self._rec_loaded += 1
+
+ def get_default_query(self, limit=0):
+ """
+ Returns the query used when no extra sorting, filtering or
+ projection is needed.
+
+ Called for basic table view.
+
+ Returns:
+ [dict] - aggregation pipeline (list of stages) suitable for
+ MongoDB 'aggregate'
+ """
+ if limit == 0:
+ limit = SyncRepresentationSummaryModel.PAGE_SIZE
+
+ return [
+ {"$match": self.get_match_part()},
+ {"$unwind": "$files"},
+ {'$addFields': {
+ 'order_remote': {
+ '$filter': {'input': '$files.sites', 'as': 'p',
+ 'cond': {'$eq': ['$$p.name', self.remote_site]}
+ }},
+ 'order_local': {
+ '$filter': {'input': '$files.sites', 'as': 'p',
+ 'cond': {'$eq': ['$$p.name', self.local_site]}
+ }}
+ }},
+ {'$addFields': {
+ # prepare progress per file, presence of 'created_dt' denotes
+ # successfully finished load/download
+ 'progress_remote': {'$first': {
+ '$cond': [{'$size': "$order_remote.progress"},
+ "$order_remote.progress",
+ {'$cond': [
+ {'$size': "$order_remote.created_dt"},
+ [1],
+ [0]
+ ]}
+ ]}},
+ 'progress_local': {'$first': {
+ '$cond': [{'$size': "$order_local.progress"},
+ "$order_local.progress",
+ {'$cond': [
+ {'$size': "$order_local.created_dt"},
+ [1],
+ [0]
+ ]}
+ ]}},
+ # file might be successfully created or failed, not both
+ 'updated_dt_remote': {'$first': {
+ '$cond': [
+ {'$size': "$order_remote.created_dt"},
+ "$order_remote.created_dt",
+ {
+ '$cond': [
+ {'$size': "$order_remote.last_failed_dt"},
+ "$order_remote.last_failed_dt",
+ []
+ ]
+ }
+ ]
+ }},
+ 'updated_dt_local': {'$first': {
+ '$cond': [
+ {'$size': "$order_local.created_dt"},
+ "$order_local.created_dt",
+ {
+ '$cond': [
+ {'$size': "$order_local.last_failed_dt"},
+ "$order_local.last_failed_dt",
+ []
+ ]
+ }
+ ]
+ }},
+ 'paused_remote': {
+ '$cond': [{'$size': "$order_remote.paused"},
+ 1,
+ 0]},
+ 'paused_local': {
+ '$cond': [{'$size': "$order_local.paused"},
+ 1,
+ 0]},
+ 'failed_remote': {
+ '$cond': [{'$size': "$order_remote.last_failed_dt"},
+ 1,
+ 0]},
+ 'failed_local': {
+ '$cond': [{'$size': "$order_local.last_failed_dt"},
+ 1,
+ 0]},
+ 'failed_remote_error': {'$first': {
+ '$cond': [{'$size': "$order_remote.error"},
+ "$order_remote.error",
+ [""]]}},
+ 'failed_local_error': {'$first': {
+ '$cond': [{'$size': "$order_local.error"},
+ "$order_local.error",
+ [""]]}},
+ 'tries': {'$first': {
+ '$cond': [
+ {'$size': "$order_local.tries"},
+ "$order_local.tries",
+ {'$cond': [
+ {'$size': "$order_remote.tries"},
+ "$order_remote.tries",
+ []
+ ]}
+ ]}}
+ }},
+ {"$project": self.projection},
+ {"$sort": self.sort},
+ {
+ '$facet': {
+ 'paginatedResults': [{'$skip': self._rec_loaded},
+ {'$limit': limit}],
+ 'totalCount': [{'$count': 'count'}]
+ }
+ }
+ ]
+
+ def get_match_part(self):
+ """
+ Returns the 'match' stage, extended with a file name filter
+ when one is present
+
+ Returns:
+ (dict)
+ """
+ if not self.word_filter:
+ return {
+ "type": "representation",
+ "_id": self._id
+ }
+ else:
+ regex_str = '.*{}.*'.format(self.word_filter)
+ return {
+ "type": "representation",
+ "_id": self._id,
+ '$or': [{'files.path': {'$regex': regex_str, '$options': 'i'}}]
+ }
+
+ def get_default_projection(self):
+ """
+ Projection part for aggregate query.
+
+ All fields with '1' will be returned, no others.
+
+ Returns:
+ (dict)
+ """
+ return {
+ "files": 1,
+ 'progress_remote': 1,
+ 'progress_local': 1,
+ 'updated_dt_remote': 1,
+ 'updated_dt_local': 1,
+ 'paused_remote': 1,
+ 'paused_local': 1,
+ 'failed_remote_error': 1,
+ 'failed_local_error': 1,
+ 'tries': 1,
+ 'status': {
+ '$switch': {
+ 'branches': [
+ {
+ 'case': {
+ '$or': ['$paused_remote', '$paused_local']},
+ 'then': 3 # Paused
+ },
+ {
+ 'case': {
+ '$and': [{'$or': ['$failed_remote',
+ '$failed_local']},
+ {'$eq': ['$tries', 3]}]},
+ 'then': 2 # Failed (3 tries)
+ },
+ {
+ 'case': {
+ '$or': [{'$eq': ['$progress_remote', 0]},
+ {'$eq': ['$progress_local', 0]}]},
+ 'then': 1 # Queued
+ },
+ {
+ 'case': {
+ '$or': ['$failed_remote', '$failed_local']},
+ 'then': 2 # Failed
+ },
+ {
+ 'case': {'$or': [{'$and': [
+ {'$gt': ['$progress_remote', 0]},
+ {'$lt': ['$progress_remote', 1]}
+ ]},
+ {'$and': [
+ {'$gt': ['$progress_local', 0]},
+ {'$lt': ['$progress_local', 1]}
+ ]}
+ ]},
+ 'then': 0 # In Progress
+ },
+ {
+ 'case': {'$and': [
+ {'$eq': ['$progress_remote', 1]},
+ {'$eq': ['$progress_local', 1]}
+ ]},
+ 'then': 4 # Synced OK
+ },
+ ],
+ 'default': -1
+ }
+ },
+ 'data.path': 1
+ }
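Both models end their pipelines with the same `$facet` pagination tail: one aggregate round-trip returns a page of documents plus the total match count. A minimal sketch of that pattern with placeholder `match`/`projection`/`sort` stages (`paged_pipeline` is illustrative, not an actual helper in this module):

```python
def paged_pipeline(match, projection, sort, records_loaded, page_size=30):
    """Build a paginated aggregate pipeline, mimicking get_default_query."""
    return [
        {"$match": match},
        {"$project": projection},
        {"$sort": sort},
        {"$facet": {
            # skip rows the view already holds, fetch one more page
            "paginatedResults": [{"$skip": records_loaded},
                                 {"$limit": page_size}],
            # total number of matching documents, for lazy loading
            "totalCount": [{"$count": "count"}],
        }},
    ]
```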
diff --git a/openpype/modules/sync_server/tray/widgets.py b/openpype/modules/sync_server/tray/widgets.py
new file mode 100644
index 0000000000..5071ffa2b0
--- /dev/null
+++ b/openpype/modules/sync_server/tray/widgets.py
@@ -0,0 +1,820 @@
+import os
+import subprocess
+import sys
+
+from Qt import QtWidgets, QtCore, QtGui
+from Qt.QtCore import Qt
+
+from openpype.tools.settings import (
+ ProjectListWidget,
+ style
+)
+
+from openpype.api import get_local_site_id
+from openpype.lib import PypeLogger
+
+from avalon.tools.delegates import pretty_timestamp
+
+from openpype.modules.sync_server.tray.models import (
+ SyncRepresentationSummaryModel,
+ SyncRepresentationDetailModel
+)
+
+from openpype.modules.sync_server.tray import lib
+
+log = PypeLogger().get_logger("SyncServer")
+
+
+class SyncProjectListWidget(ProjectListWidget):
+ """
+ Lists all projects that are synchronized to choose from
+ """
+
+ def __init__(self, sync_server, parent):
+ super(SyncProjectListWidget, self).__init__(parent)
+ self.sync_server = sync_server
+ self.project_list.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)
+ self.project_list.customContextMenuRequested.connect(
+ self._on_context_menu)
+ self.project_name = None
+ self.local_site = None
+ self.icons = {}
+
+ def validate_context_change(self):
+ return True
+
+ def refresh(self):
+ model = self.project_list.model()
+ model.clear()
+
+ project_name = None
+ for project_name in self.sync_server.sync_project_settings.\
+ keys():
+ if self.sync_server.is_paused() or \
+ self.sync_server.is_project_paused(project_name):
+ icon = self._get_icon("paused")
+ else:
+ icon = self._get_icon("synced")
+
+ model.appendRow(QtGui.QStandardItem(icon, project_name))
+
+ if len(self.sync_server.sync_project_settings.keys()) == 0:
+ model.appendRow(QtGui.QStandardItem(lib.DUMMY_PROJECT))
+
+ self.current_project = self.project_list.currentIndex().data(
+ QtCore.Qt.DisplayRole
+ )
+ if not self.current_project:
+ self.current_project = self.project_list.model().item(0). \
+ data(QtCore.Qt.DisplayRole)
+
+ if project_name:
+ self.local_site = self.sync_server.get_active_site(project_name)
+
+ def _get_icon(self, status):
+ if not self.icons.get(status):
+ resource_path = os.path.dirname(__file__)
+ resource_path = os.path.join(resource_path, "..",
+ "resources")
+ pix_url = "{}/{}.png".format(resource_path, status)
+ icon = QtGui.QIcon(pix_url)
+ self.icons[status] = icon
+ else:
+ icon = self.icons[status]
+ return icon
+
+ def _on_context_menu(self, point):
+ point_index = self.project_list.indexAt(point)
+ if not point_index.isValid():
+ return
+
+ self.project_name = point_index.data(QtCore.Qt.DisplayRole)
+
+ menu = QtWidgets.QMenu()
+ menu.setStyleSheet(style.load_stylesheet())
+ actions_mapping = {}
+
+ if self.sync_server.is_project_paused(self.project_name):
+ action = QtWidgets.QAction("Unpause")
+ actions_mapping[action] = self._unpause
+ else:
+ action = QtWidgets.QAction("Pause")
+ actions_mapping[action] = self._pause
+ menu.addAction(action)
+
+ if self.local_site == get_local_site_id():
+ action = QtWidgets.QAction("Clear local project")
+ actions_mapping[action] = self._clear_project
+ menu.addAction(action)
+
+ result = menu.exec_(QtGui.QCursor.pos())
+ if result:
+ to_run = actions_mapping[result]
+ if to_run:
+ to_run()
+
+ def _pause(self):
+ if self.project_name:
+ self.sync_server.pause_project(self.project_name)
+ self.project_name = None
+ self.refresh()
+
+ def _unpause(self):
+ if self.project_name:
+ self.sync_server.unpause_project(self.project_name)
+ self.project_name = None
+ self.refresh()
+
+ def _clear_project(self):
+ if self.project_name:
+ self.sync_server.clear_project(self.project_name, self.local_site)
+ self.project_name = None
+ self.refresh()
+
+
+class SyncRepresentationWidget(QtWidgets.QWidget):
+ """
+ Summary widget with a list of representations that match the
+ current 'local_site' and 'remote_site' settings.
+ """
+ active_changed = QtCore.Signal() # active index changed
+ message_generated = QtCore.Signal(str)
+
+ default_widths = (
+ ("asset", 220),
+ ("subset", 190),
+ ("version", 55),
+ ("representation", 95),
+ ("local_site", 170),
+ ("remote_site", 170),
+ ("files_count", 50),
+ ("files_size", 60),
+ ("priority", 50),
+ ("state", 110)
+ )
+
+ def __init__(self, sync_server, project=None, parent=None):
+ super(SyncRepresentationWidget, self).__init__(parent)
+
+ self.sync_server = sync_server
+
+ self._selected_id = None # keep last selected _id
+ self.representation_id = None
+ self.site_name = None # to pause/unpause representation
+
+ self.filter = QtWidgets.QLineEdit()
+ self.filter.setPlaceholderText("Filter representations...")
+
+ self._scrollbar_pos = None
+
+ top_bar_layout = QtWidgets.QHBoxLayout()
+ top_bar_layout.addWidget(self.filter)
+
+ self.table_view = QtWidgets.QTableView()
+ headers = [item[0] for item in self.default_widths]
+
+ model = SyncRepresentationSummaryModel(sync_server, headers, project)
+ self.table_view.setModel(model)
+ self.table_view.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)
+ self.table_view.setSelectionMode(
+ QtWidgets.QAbstractItemView.SingleSelection)
+ self.table_view.setSelectionBehavior(
+ QtWidgets.QAbstractItemView.SelectRows)
+ self.table_view.horizontalHeader().setSortIndicator(
+ -1, Qt.AscendingOrder)
+ self.table_view.setSortingEnabled(True)
+ self.table_view.horizontalHeader().setSortIndicatorShown(True)
+ self.table_view.setAlternatingRowColors(True)
+ self.table_view.verticalHeader().hide()
+
+ column = self.table_view.model().get_header_index("local_site")
+ delegate = ImageDelegate(self)
+ self.table_view.setItemDelegateForColumn(column, delegate)
+
+ column = self.table_view.model().get_header_index("remote_site")
+ delegate = ImageDelegate(self)
+ self.table_view.setItemDelegateForColumn(column, delegate)
+
+ for column_name, width in self.default_widths:
+ idx = model.get_header_index(column_name)
+ self.table_view.setColumnWidth(idx, width)
+
+ layout = QtWidgets.QVBoxLayout(self)
+ layout.setContentsMargins(0, 0, 0, 0)
+ layout.addLayout(top_bar_layout)
+ layout.addWidget(self.table_view)
+
+ self.table_view.doubleClicked.connect(self._double_clicked)
+ self.filter.textChanged.connect(lambda: model.set_filter(
+ self.filter.text()))
+ self.table_view.customContextMenuRequested.connect(
+ self._on_context_menu)
+
+ model.refresh_started.connect(self._save_scrollbar)
+ model.refresh_finished.connect(self._set_scrollbar)
+ self.table_view.model().modelReset.connect(self._set_selection)
+
+ self.selection_model = self.table_view.selectionModel()
+ self.selection_model.selectionChanged.connect(self._selection_changed)
+
+ def _selection_changed(self, _new_selection):
+ index = self.selection_model.currentIndex()
+ self._selected_id = \
+ self.table_view.model().data(index, Qt.UserRole)
+
+ def _set_selection(self):
+ """
+ Sets selection to 'self._selected_id' if exists.
+
+ Keep selection during model refresh.
+ """
+ if self._selected_id:
+ index = self.table_view.model().get_index(self._selected_id)
+ if index and index.isValid():
+ mode = QtCore.QItemSelectionModel.Select | \
+ QtCore.QItemSelectionModel.Rows
+ self.selection_model.setCurrentIndex(index, mode)
+ else:
+ self._selected_id = None
+
+ def _double_clicked(self, index):
+ """
+ Opens the representation detail dialog with all files on double click
+ """
+ _id = self.table_view.model().data(index, Qt.UserRole)
+ detail_window = SyncServerDetailWindow(
+ self.sync_server, _id, self.table_view.model().project)
+ detail_window.exec()
+
+ def _on_context_menu(self, point):
+ """
+ Shows a menu with sync actions on right-click.
+ """
+ point_index = self.table_view.indexAt(point)
+ if not point_index.isValid():
+ return
+
+ self.item = self.table_view.model()._data[point_index.row()]
+ self.representation_id = self.item._id
+ log.debug("menu representation _id:: {}".
+ format(self.representation_id))
+
+ menu = QtWidgets.QMenu()
+ menu.setStyleSheet(style.load_stylesheet())
+ actions_mapping = {}
+ actions_kwargs_mapping = {}
+
+ local_site = self.item.local_site
+ local_progress = self.item.local_progress
+ remote_site = self.item.remote_site
+ remote_progress = self.item.remote_progress
+
+ for site, progress in {local_site: local_progress,
+ remote_site: remote_progress}.items():
+ project = self.table_view.model().project
+ provider = self.sync_server.get_provider_for_site(project,
+ site)
+ if provider == 'local_drive':
+ if 'studio' in site:
+ txt = " studio version"
+ else:
+ txt = " local version"
+ action = QtWidgets.QAction("Open in explorer" + txt)
+ if progress == 1.0:
+ actions_mapping[action] = self._open_in_explorer
+ actions_kwargs_mapping[action] = {'site': site}
+ menu.addAction(action)
+
+ # progress smaller than 1.0 --> in progress or queued
+ if local_progress < 1.0:
+ self.site_name = local_site
+ else:
+ self.site_name = remote_site
+
+ if self.item.state in [lib.STATUS[0], lib.STATUS[1]]:
+ action = QtWidgets.QAction("Pause")
+ actions_mapping[action] = self._pause
+ menu.addAction(action)
+
+ if self.item.state == lib.STATUS[3]:
+ action = QtWidgets.QAction("Unpause")
+ actions_mapping[action] = self._unpause
+ menu.addAction(action)
+
+ # if self.item.state == lib.STATUS[1]:
+ # action = QtWidgets.QAction("Open error detail")
+ # actions_mapping[action] = self._show_detail
+ # menu.addAction(action)
+
+ if remote_progress == 1.0:
+ action = QtWidgets.QAction("Re-sync Active site")
+ actions_mapping[action] = self._reset_local_site
+ menu.addAction(action)
+
+ if local_progress == 1.0:
+ action = QtWidgets.QAction("Re-sync Remote site")
+ actions_mapping[action] = self._reset_remote_site
+ menu.addAction(action)
+
+ if local_site != self.sync_server.DEFAULT_SITE:
+ action = QtWidgets.QAction("Completely remove from local")
+ actions_mapping[action] = self._remove_site
+ menu.addAction(action)
+ else:
+ action = QtWidgets.QAction("Mark for sync to local")
+ actions_mapping[action] = self._add_site
+ menu.addAction(action)
+
+ if not actions_mapping:
+ action = QtWidgets.QAction("< No action >")
+ actions_mapping[action] = None
+ menu.addAction(action)
+
+ result = menu.exec_(QtGui.QCursor.pos())
+ if result:
+ to_run = actions_mapping[result]
+ to_run_kwargs = actions_kwargs_mapping.get(result, {})
+ if to_run:
+ to_run(**to_run_kwargs)
+
+ self.table_view.model().refresh()
+
+ def _pause(self):
+ self.sync_server.pause_representation(self.table_view.model().project,
+ self.representation_id,
+ self.site_name)
+ self.site_name = None
+ self.message_generated.emit("Paused {}".format(self.representation_id))
+
+ def _unpause(self):
+ self.sync_server.unpause_representation(
+ self.table_view.model().project,
+ self.representation_id,
+ self.site_name)
+ self.site_name = None
+ self.message_generated.emit("Unpaused {}".format(
+ self.representation_id))
+
+ # TODO temporary here for testing, will be removed
+ def _add_site(self):
+ log.info(self.representation_id)
+ project_name = self.table_view.model().project
+ local_site_name = get_local_site_id()
+ try:
+ self.sync_server.add_site(
+ project_name,
+ self.representation_id,
+ local_site_name
+ )
+ self.message_generated.emit(
+ "Site {} added for {}".format(local_site_name,
+ self.representation_id))
+ except ValueError as exp:
+ self.message_generated.emit("Error {}".format(str(exp)))
+
+ def _remove_site(self):
+ """
+ Removes site record AND files.
+
+ This is ONLY for representations stored on the local site, which
+ cannot be the same as SyncServer.DEFAULT_SITE.
+
+ This can only happen when an artist works on a local machine that
+ is not connected to studio mounted drives.
+ """
+ log.info("Removing {}".format(self.representation_id))
+ try:
+ local_site = get_local_site_id()
+ self.sync_server.remove_site(
+ self.table_view.model().project,
+ self.representation_id,
+ local_site,
+ True)
+ self.message_generated.emit("Site {} removed".format(local_site))
+ except ValueError as exp:
+ self.message_generated.emit("Error {}".format(str(exp)))
+ self.table_view.model().refresh(
+ load_records=self.table_view.model()._rec_loaded)
+
+ def _reset_local_site(self):
+ """
+ Removes error or success metadata for a particular file, forcing
+ a redo of the upload/download.
+ """
+ self.sync_server.reset_provider_for_file(
+ self.table_view.model().project,
+ self.representation_id,
+ 'local')
+ self.table_view.model().refresh(
+ load_records=self.table_view.model()._rec_loaded)
+
+ def _reset_remote_site(self):
+ """
+ Removes error or success metadata for a particular file, forcing
+ a redo of the upload/download.
+ """
+ self.sync_server.reset_provider_for_file(
+ self.table_view.model().project,
+ self.representation_id,
+ 'remote')
+ self.table_view.model().refresh(
+ load_records=self.table_view.model()._rec_loaded)
+
+ def _open_in_explorer(self, site):
+ if not self.item:
+ return
+
+ fpath = self.item.path
+ project = self.table_view.model().project
+ fpath = self.sync_server.get_local_file_path(project,
+ site,
+ fpath)
+
+ fpath = os.path.normpath(os.path.dirname(fpath))
+ if os.path.isdir(fpath):
+ if sys.platform.startswith('win'): # windows ('win' in sys.platform would also match 'darwin')
+ subprocess.Popen('explorer "%s"' % fpath)
+ elif sys.platform == 'darwin': # macOS
+ subprocess.Popen(['open', fpath])
+ else: # linux
+ try:
+ subprocess.Popen(['xdg-open', fpath])
+ except OSError:
+ raise OSError("Failed to open '{}' with xdg-open".format(fpath))
+
+ def _save_scrollbar(self):
+ self._scrollbar_pos = self.table_view.verticalScrollBar().value()
+
+ def _set_scrollbar(self):
+ if self._scrollbar_pos:
+ self.table_view.verticalScrollBar().setValue(self._scrollbar_pos)
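The context-menu wiring above follows one pattern in both widgets: build a dict from QAction to callback (plus optional kwargs), then dispatch whichever action `menu.exec_` returns. A minimal sketch of that dispatch, with plain strings standing in for QAction objects (the names here are illustrative, not part of the codebase):

```python
def run_menu_action(result, actions_mapping, actions_kwargs_mapping):
    """Run the callback mapped to the chosen menu entry, if any."""
    to_run = actions_mapping.get(result)
    if to_run is None:
        return None  # e.g. the "< No action >" placeholder maps to None
    return to_run(**actions_kwargs_mapping.get(result, {}))


calls = []
actions = {"Open in explorer": lambda site: calls.append(site)}
kwargs = {"Open in explorer": {"site": "studio"}}
run_menu_action("Open in explorer", actions, kwargs)
print(calls)  # ['studio']
```

Keeping the kwargs in a second mapping (rather than using `functools.partial`) matches the structure of the code above, where only some actions carry arguments.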
+
+
+class SyncRepresentationDetailWidget(QtWidgets.QWidget):
+ """
+ Widget to display the list of synchronizable files for a single
+ representation.
+
+ Args:
+ _id (str): representation _id
+ project (str): name of project with repre
+ parent (QDialog): SyncServerDetailWindow
+ """
+ active_changed = QtCore.Signal() # active index changed
+
+ default_widths = (
+ ("file", 290),
+ ("local_site", 185),
+ ("remote_site", 185),
+ ("size", 60),
+ ("priority", 25),
+ ("state", 110)
+ )
+
+ def __init__(self, sync_server, _id=None, project=None, parent=None):
+ super(SyncRepresentationDetailWidget, self).__init__(parent)
+
+ log.debug("Representation_id:{}".format(_id))
+ self.representation_id = _id
+ self.item = None # set to item that mouse was clicked over
+ self.project = project
+
+ self.sync_server = sync_server
+
+ self._selected_id = None
+
+ self.filter = QtWidgets.QLineEdit()
+ self.filter.setPlaceholderText("Filter representations...")
+
+ self._scrollbar_pos = None
+
+ top_bar_layout = QtWidgets.QHBoxLayout()
+ top_bar_layout.addWidget(self.filter)
+
+ self.table_view = QtWidgets.QTableView()
+ headers = [item[0] for item in self.default_widths]
+
+ model = SyncRepresentationDetailModel(sync_server, headers, _id,
+ project)
+ self.table_view.setModel(model)
+ self.table_view.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)
+ self.table_view.setSelectionMode(
+ QtWidgets.QAbstractItemView.SingleSelection)
+ self.table_view.setSelectionBehavior(
+ QtWidgets.QTableView.SelectRows)
+ self.table_view.horizontalHeader().setSortIndicator(-1,
+ Qt.AscendingOrder)
+ self.table_view.setSortingEnabled(True)
+ self.table_view.horizontalHeader().setSortIndicatorShown(True)
+ self.table_view.setAlternatingRowColors(True)
+ self.table_view.verticalHeader().hide()
+
+ column = self.table_view.model().get_header_index("local_site")
+ delegate = ImageDelegate(self)
+ self.table_view.setItemDelegateForColumn(column, delegate)
+
+ column = self.table_view.model().get_header_index("remote_site")
+ delegate = ImageDelegate(self)
+ self.table_view.setItemDelegateForColumn(column, delegate)
+
+ for column_name, width in self.default_widths:
+ idx = model.get_header_index(column_name)
+ self.table_view.setColumnWidth(idx, width)
+
+ layout = QtWidgets.QVBoxLayout(self)
+ layout.setContentsMargins(0, 0, 0, 0)
+ layout.addLayout(top_bar_layout)
+ layout.addWidget(self.table_view)
+
+ self.filter.textChanged.connect(lambda: model.set_filter(
+ self.filter.text()))
+ self.table_view.customContextMenuRequested.connect(
+ self._on_context_menu)
+
+ model.refresh_started.connect(self._save_scrollbar)
+ model.refresh_finished.connect(self._set_scrollbar)
+ self.table_view.model().modelReset.connect(self._set_selection)
+
+ self.selection_model = self.table_view.selectionModel()
+ self.selection_model.selectionChanged.connect(self._selection_changed)
+
+ def _selection_changed(self):
+ index = self.selection_model.currentIndex()
+ self._selected_id = self.table_view.model().data(index, Qt.UserRole)
+
+ def _set_selection(self):
+ """
+ Sets selection to 'self._selected_id' if it exists.
+
+ Keeps selection during model refresh.
+ """
+ if self._selected_id:
+ index = self.table_view.model().get_index(self._selected_id)
+ if index and index.isValid():
+ mode = QtCore.QItemSelectionModel.Select | \
+ QtCore.QItemSelectionModel.Rows
+ self.selection_model.setCurrentIndex(index, mode)
+ else:
+ self._selected_id = None
+
+ def _show_detail(self):
+ """
+ Shows a window with the error message for a failed sync of a file.
+ """
+ dt = max(self.item.created_dt, self.item.sync_dt)
+ detail_window = SyncRepresentationErrorWindow(self.item._id,
+ self.project,
+ dt,
+ self.item.tries,
+ self.item.error)
+ detail_window.exec()
+
+ def _on_context_menu(self, point):
+ """
+ Shows a menu with sync actions on right-click.
+ """
+ point_index = self.table_view.indexAt(point)
+ if not point_index.isValid():
+ return
+
+ self.item = self.table_view.model()._data[point_index.row()]
+
+ menu = QtWidgets.QMenu()
+ menu.setStyleSheet(style.load_stylesheet())
+ actions_mapping = {}
+ actions_kwargs_mapping = {}
+
+ local_site = self.item.local_site
+ local_progress = self.item.local_progress
+ remote_site = self.item.remote_site
+ remote_progress = self.item.remote_progress
+
+ for site, progress in {local_site: local_progress,
+ remote_site: remote_progress}.items():
+ project = self.table_view.model().project
+ provider = self.sync_server.get_provider_for_site(project,
+ site)
+ if provider == 'local_drive':
+ if 'studio' in site:
+ txt = " studio version"
+ else:
+ txt = " local version"
+ action = QtWidgets.QAction("Open in explorer" + txt)
+ if progress == 1.0:
+ actions_mapping[action] = self._open_in_explorer
+ actions_kwargs_mapping[action] = {'site': site}
+ menu.addAction(action)
+
+ if self.item.state == lib.STATUS[2]:
+ action = QtWidgets.QAction("Open error detail")
+ actions_mapping[action] = self._show_detail
+ menu.addAction(action)
+
+ if float(remote_progress) == 1.0:
+ action = QtWidgets.QAction("Re-sync active site")
+ actions_mapping[action] = self._reset_local_site
+ menu.addAction(action)
+
+ if float(local_progress) == 1.0:
+ action = QtWidgets.QAction("Re-sync remote site")
+ actions_mapping[action] = self._reset_remote_site
+ menu.addAction(action)
+
+ if not actions_mapping:
+ action = QtWidgets.QAction("< No action >")
+ actions_mapping[action] = None
+ menu.addAction(action)
+
+ result = menu.exec_(QtGui.QCursor.pos())
+ if result:
+ to_run = actions_mapping[result]
+ to_run_kwargs = actions_kwargs_mapping.get(result, {})
+ if to_run:
+ to_run(**to_run_kwargs)
+
+ def _reset_local_site(self):
+ """
+ Removes error or success metadata for a particular file, forcing
+ a redo of the upload/download.
+ """
+ self.sync_server.reset_provider_for_file(
+ self.table_view.model().project,
+ self.representation_id,
+ 'local',
+ self.item._id)
+ self.table_view.model().refresh(
+ load_records=self.table_view.model()._rec_loaded)
+
+ def _reset_remote_site(self):
+ """
+ Removes error or success metadata for a particular file, forcing
+ a redo of the upload/download.
+ """
+ self.sync_server.reset_provider_for_file(
+ self.table_view.model().project,
+ self.representation_id,
+ 'remote',
+ self.item._id)
+ self.table_view.model().refresh(
+ load_records=self.table_view.model()._rec_loaded)
+
+ def _open_in_explorer(self, site):
+ if not self.item:
+ return
+
+ fpath = self.item.path
+ project = self.project
+ fpath = self.sync_server.get_local_file_path(project, site, fpath)
+
+ fpath = os.path.normpath(os.path.dirname(fpath))
+ if os.path.isdir(fpath):
+ if sys.platform.startswith('win'): # windows ('win' in sys.platform would also match 'darwin')
+ subprocess.Popen('explorer "%s"' % fpath)
+ elif sys.platform == 'darwin': # macOS
+ subprocess.Popen(['open', fpath])
+ else: # linux
+ try:
+ subprocess.Popen(['xdg-open', fpath])
+ except OSError:
+ raise OSError("Failed to open '{}' with xdg-open".format(fpath))
+
+ def _save_scrollbar(self):
+ self._scrollbar_pos = self.table_view.verticalScrollBar().value()
+
+ def _set_scrollbar(self):
+ if self._scrollbar_pos:
+ self.table_view.verticalScrollBar().setValue(self._scrollbar_pos)
+
+
+class SyncRepresentationErrorWidget(QtWidgets.QWidget):
+ """
+ Widget shown when a sync error happened, printing the error message
+ """
+
+ def __init__(self, _id, dt, tries, msg, parent=None):
+ super(SyncRepresentationErrorWidget, self).__init__(parent)
+
+ layout = QtWidgets.QHBoxLayout(self)
+
+ txts = []
+ txts.append("{}: {}".format("Last update date", pretty_timestamp(dt)))
+ txts.append("{}: {}".format("Retries", str(tries)))
+ txts.append("{}: {}".format("Error message", msg))
+
+ text_area = QtWidgets.QPlainTextEdit("\n\n".join(txts))
+ text_area.setReadOnly(True)
+ layout.addWidget(text_area)
+
+
+class ImageDelegate(QtWidgets.QStyledItemDelegate):
+ """
+ Paints the site icon and the progress of synchronization
+ """
+
+ def __init__(self, parent=None):
+ super(ImageDelegate, self).__init__(parent)
+ self.icons = {}
+
+ def paint(self, painter, option, index):
+ super(ImageDelegate, self).paint(painter, option, index)
+ option = QtWidgets.QStyleOptionViewItem(option)
+ option.showDecorationSelected = True
+
+ provider = index.data(lib.ProviderRole)
+ value = index.data(lib.ProgressRole)
+ date_value = index.data(lib.DateRole)
+ is_failed = index.data(lib.FailedRole)
+
+ if not self.icons.get(provider):
+ resource_path = os.path.dirname(__file__)
+ resource_path = os.path.join(resource_path, "..",
+ "providers", "resources")
+ pix_url = "{}/{}.png".format(resource_path, provider)
+ pixmap = QtGui.QPixmap(pix_url)
+ self.icons[provider] = pixmap
+ else:
+ pixmap = self.icons[provider]
+
+ padding = 10
+ point = QtCore.QPoint(option.rect.x() + padding,
+ option.rect.y() +
+ (option.rect.height() - pixmap.height()) / 2)
+ painter.drawPixmap(point, pixmap)
+
+ overlay_rect = option.rect.translated(0, 0)
+ overlay_rect.setHeight(overlay_rect.height() * (1.0 - float(value)))
+ painter.fillRect(overlay_rect,
+ QtGui.QBrush(QtGui.QColor(0, 0, 0, 100)))
+ text_rect = option.rect.translated(10, 0)
+ painter.drawText(text_rect,
+ QtCore.Qt.AlignCenter,
+ date_value)
+
+ if is_failed:
+ overlay_rect = option.rect.translated(0, 0)
+ painter.fillRect(overlay_rect,
+ QtGui.QBrush(QtGui.QColor(255, 0, 0, 35)))
+
+
+class SyncServerDetailWindow(QtWidgets.QDialog):
+ def __init__(self, sync_server, _id, project, parent=None):
+ log.debug(
+ "!!! SyncServerDetailWindow _id:: {}".format(_id))
+ super(SyncServerDetailWindow, self).__init__(parent)
+ self.setWindowFlags(QtCore.Qt.Window)
+ self.setFocusPolicy(QtCore.Qt.StrongFocus)
+
+ self.setStyleSheet(style.load_stylesheet())
+ self.setWindowIcon(QtGui.QIcon(style.app_icon_path()))
+ self.resize(1000, 400)
+
+ body = QtWidgets.QWidget()
+ footer = QtWidgets.QWidget()
+ footer.setFixedHeight(20)
+
+ container = SyncRepresentationDetailWidget(sync_server, _id, project,
+ parent=self)
+ body_layout = QtWidgets.QHBoxLayout(body)
+ body_layout.addWidget(container)
+ body_layout.setContentsMargins(0, 0, 0, 0)
+
+ self.message = QtWidgets.QLabel()
+ self.message.hide()
+
+ footer_layout = QtWidgets.QVBoxLayout(footer)
+ footer_layout.addWidget(self.message)
+ footer_layout.setContentsMargins(0, 0, 0, 0)
+
+ layout = QtWidgets.QVBoxLayout(self)
+ layout.addWidget(body)
+ layout.addWidget(footer)
+
+ self.setLayout(body_layout)
+ self.setWindowTitle("Sync Representation Detail")
+
+
+class SyncRepresentationErrorWindow(QtWidgets.QDialog):
+ def __init__(self, _id, project, dt, tries, msg, parent=None):
+ super(SyncRepresentationErrorWindow, self).__init__(parent)
+ self.setWindowFlags(QtCore.Qt.Window)
+ self.setFocusPolicy(QtCore.Qt.StrongFocus)
+
+ self.setStyleSheet(style.load_stylesheet())
+ self.setWindowIcon(QtGui.QIcon(style.app_icon_path()))
+ self.resize(900, 150)
+
+ body = QtWidgets.QWidget()
+
+ container = SyncRepresentationErrorWidget(_id, dt, tries, msg,
+ parent=self)
+ body_layout = QtWidgets.QHBoxLayout(body)
+ body_layout.addWidget(container)
+ body_layout.setContentsMargins(0, 0, 0, 0)
+
+ message = QtWidgets.QLabel()
+ message.hide()
+
+ layout = QtWidgets.QVBoxLayout(self)
+ layout.addWidget(body)
+
+ self.setLayout(body_layout)
+ self.setWindowTitle("Sync Representation Error Detail")
diff --git a/openpype/modules/sync_server/utils.py b/openpype/modules/sync_server/utils.py
index 0762766783..36f3444399 100644
--- a/openpype/modules/sync_server/utils.py
+++ b/openpype/modules/sync_server/utils.py
@@ -1,8 +1,14 @@
import time
-from openpype.api import Logger
+from openpype.api import Logger
log = Logger().get_logger("SyncServer")
+class SyncStatus:
+ DO_NOTHING = 0
+ DO_UPLOAD = 1
+ DO_DOWNLOAD = 2
+
+
def time_function(method):
""" Decorator to print how much time function took.
For debugging.
diff --git a/openpype/plugins/load/add_site.py b/openpype/plugins/load/add_site.py
new file mode 100644
index 0000000000..09448d553c
--- /dev/null
+++ b/openpype/plugins/load/add_site.py
@@ -0,0 +1,33 @@
+from avalon import api
+from openpype.modules import ModulesManager
+
+
+class AddSyncSite(api.Loader):
+ """Add sync site to representation"""
+ representations = ["*"]
+ families = ["*"]
+
+ label = "Add Sync Site"
+ order = 2 # lower means better
+ icon = "download"
+ color = "#999999"
+
+ def load(self, context, name=None, namespace=None, data=None):
+ self.log.info("Adding {} to representation: {}".format(
+ data["site_name"], data["_id"]))
+ self.add_site_to_representation(data["project_name"],
+ data["_id"],
+ data["site_name"])
+ self.log.debug("Site added.")
+
+ @staticmethod
+ def add_site_to_representation(project_name, representation_id, site_name):
+ """Adds new site to representation_id, resets if exists"""
+ manager = ModulesManager()
+ sync_server = manager.modules_by_name["sync_server"]
+ sync_server.add_site(project_name, representation_id, site_name,
+ force=True)
+
+ def filepath_from_context(self, context):
+ """No real file loading"""
+ return ""
diff --git a/openpype/plugins/load/delete_old_versions.py b/openpype/plugins/load/delete_old_versions.py
index e5132e0f8a..8e3999e9c4 100644
--- a/openpype/plugins/load/delete_old_versions.py
+++ b/openpype/plugins/load/delete_old_versions.py
@@ -15,11 +15,12 @@ from openpype.api import Anatomy
class DeleteOldVersions(api.Loader):
-
+ """Deletes specific number of old version"""
representations = ["*"]
families = ["*"]
label = "Delete Old Versions"
+ order = 35
icon = "trash"
color = "#d8d8d8"
@@ -421,8 +422,9 @@ class DeleteOldVersions(api.Loader):
class CalculateOldVersions(DeleteOldVersions):
-
+ """Calculate file size of old versions"""
label = "Calculate Old Versions"
+ order = 30
options = [
qargparse.Integer(
diff --git a/openpype/plugins/load/remove_site.py b/openpype/plugins/load/remove_site.py
new file mode 100644
index 0000000000..aedb5d1f2f
--- /dev/null
+++ b/openpype/plugins/load/remove_site.py
@@ -0,0 +1,33 @@
+from avalon import api
+from openpype.modules import ModulesManager
+
+
+class RemoveSyncSite(api.Loader):
+ """Remove sync site and its files on representation"""
+ representations = ["*"]
+ families = ["*"]
+
+ label = "Remove Sync Site"
+ order = 4
+ icon = "download"
+ color = "#999999"
+
+ def load(self, context, name=None, namespace=None, data=None):
+ self.log.info("Removing {} on representation: {}".format(
+ data["site_name"], data["_id"]))
+ self.remove_site_on_representation(data["project_name"],
+ data["_id"],
+ data["site_name"])
+ self.log.debug("Site removed.")
+
+ @staticmethod
+ def remove_site_on_representation(project_name, representation_id,
+ site_name):
+ manager = ModulesManager()
+ sync_server = manager.modules_by_name["sync_server"]
+ sync_server.remove_site(project_name, representation_id,
+ site_name, True)
+
+ def filepath_from_context(self, context):
+ """No real file loading"""
+ return ""
diff --git a/openpype/plugins/publish/integrate_new.py b/openpype/plugins/publish/integrate_new.py
index 0d36828ccf..ea90f284b2 100644
--- a/openpype/plugins/publish/integrate_new.py
+++ b/openpype/plugins/publish/integrate_new.py
@@ -976,6 +976,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
local_site = local_site_id
remote_site = sync_server_presets["config"].get("remote_site")
+ if remote_site == local_site:
+ remote_site = None
+
if remote_site == 'local':
remote_site = local_site_id
diff --git a/openpype/settings/defaults/project_settings/ftrack.json b/openpype/settings/defaults/project_settings/ftrack.json
index 03ac8f309f..8970aa8ac8 100644
--- a/openpype/settings/defaults/project_settings/ftrack.json
+++ b/openpype/settings/defaults/project_settings/ftrack.json
@@ -7,6 +7,14 @@
"not ready"
]
},
+ "prepare_project": {
+ "enabled": true,
+ "role_list": [
+ "Pypeclub",
+ "Administrator",
+ "Project manager"
+ ]
+ },
"sync_hier_entity_attributes": {
"enabled": true,
"interest_entity_types": [
@@ -195,7 +203,7 @@
"publish": {
"IntegrateFtrackNote": {
"enabled": true,
- "note_with_intent_template": "",
+ "note_with_intent_template": "{intent}: {comment}",
"note_labels": []
},
"ValidateFtrackAttributes": {
diff --git a/openpype/settings/defaults/project_settings/global.json b/openpype/settings/defaults/project_settings/global.json
index 8081f92ef7..ca1b258e72 100644
--- a/openpype/settings/defaults/project_settings/global.json
+++ b/openpype/settings/defaults/project_settings/global.json
@@ -6,7 +6,9 @@
"ExtractJpegEXR": {
"enabled": true,
"ffmpeg_args": {
- "input": [],
+ "input": [
+ "-gamma 2.2"
+ ],
"output": []
}
},
diff --git a/openpype/settings/defaults/project_settings/maya.json b/openpype/settings/defaults/project_settings/maya.json
index feddd2860a..a524ec45ae 100644
--- a/openpype/settings/defaults/project_settings/maya.json
+++ b/openpype/settings/defaults/project_settings/maya.json
@@ -313,8 +313,8 @@
"rendererName": "vp2Renderer"
},
"Resolution": {
- "width": 1080,
- "height": 1920,
+ "width": 1920,
+ "height": 1080,
"percent": 1.0,
"mode": "Custom"
},
diff --git a/openpype/settings/defaults/project_settings/standalonepublisher.json b/openpype/settings/defaults/project_settings/standalonepublisher.json
index 08895bcba9..9d40d2ded6 100644
--- a/openpype/settings/defaults/project_settings/standalonepublisher.json
+++ b/openpype/settings/defaults/project_settings/standalonepublisher.json
@@ -116,7 +116,7 @@
"ExtractThumbnailSP": {
"ffmpeg_args": {
"input": [
- "gamma 2.2"
+ "-gamma 2.2"
],
"output": []
}
diff --git a/openpype/settings/defaults/project_settings/tvpaint.json b/openpype/settings/defaults/project_settings/tvpaint.json
new file mode 100644
index 0000000000..d4130c88be
--- /dev/null
+++ b/openpype/settings/defaults/project_settings/tvpaint.json
@@ -0,0 +1,10 @@
+{
+ "publish": {
+ "ValidateMissingLayers": {
+ "enabled": true,
+ "optional": true,
+ "active": true
+ }
+ },
+ "filters": {}
+}
\ No newline at end of file
diff --git a/openpype/settings/defaults/system_settings/applications.json b/openpype/settings/defaults/system_settings/applications.json
index 8034bc6368..bd2987c153 100644
--- a/openpype/settings/defaults/system_settings/applications.json
+++ b/openpype/settings/defaults/system_settings/applications.json
@@ -6,7 +6,7 @@
"host_name": "maya",
"environment": {
"PYTHONPATH": [
- "{OPENPYPE_ROOT}/pype/hosts/maya/startup",
+ "{OPENPYPE_ROOT}/openpype/hosts/maya/startup",
"{OPENPYPE_ROOT}/repos/avalon-core/setup/maya",
"{OPENPYPE_ROOT}/repos/maya-look-assigner",
"{PYTHONPATH}"
@@ -715,7 +715,7 @@
"{OPENPYPE_ROOT}/repos/avalon-core/setup/blender",
"{PYTHONPATH}"
],
- "CREATE_NEW_CONSOLE": "yes"
+ "QT_PREFERRED_BINDING": "PySide2"
},
"variants": {
"2-83": {
diff --git a/openpype/settings/entities/schemas/projects_schema/schema_main.json b/openpype/settings/entities/schemas/projects_schema/schema_main.json
index 565500edd2..6bc158aa60 100644
--- a/openpype/settings/entities/schemas/projects_schema/schema_main.json
+++ b/openpype/settings/entities/schemas/projects_schema/schema_main.json
@@ -82,6 +82,10 @@
"type": "schema",
"name": "schema_project_harmony"
},
+ {
+ "type": "schema",
+ "name": "schema_project_tvpaint"
+ },
{
"type": "schema",
"name": "schema_project_celaction"
diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_ftrack.json b/openpype/settings/entities/schemas/projects_schema/schema_project_ftrack.json
index eefc0e12b7..a801175031 100644
--- a/openpype/settings/entities/schemas/projects_schema/schema_project_ftrack.json
+++ b/openpype/settings/entities/schemas/projects_schema/schema_project_ftrack.json
@@ -36,6 +36,25 @@
}
]
},
+ {
+ "type": "dict",
+ "key": "prepare_project",
+ "label": "Prepare Project",
+ "checkbox_key": "enabled",
+ "children": [
+ {
+ "type": "boolean",
+ "key": "enabled",
+ "label": "Enabled"
+ },
+ {
+ "type": "list",
+ "key": "role_list",
+ "label": "Roles",
+ "object_type": "text"
+ }
+ ]
+ },
{
"type": "dict",
"key": "sync_hier_entity_attributes",
diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_tvpaint.json b/openpype/settings/entities/schemas/projects_schema/schema_project_tvpaint.json
new file mode 100644
index 0000000000..b9fe26a57c
--- /dev/null
+++ b/openpype/settings/entities/schemas/projects_schema/schema_project_tvpaint.json
@@ -0,0 +1,32 @@
+{
+ "type": "dict",
+ "collapsible": true,
+ "key": "tvpaint",
+ "label": "TVPaint",
+ "is_file": true,
+ "children": [
+ {
+ "type": "dict",
+ "collapsible": true,
+ "key": "publish",
+ "label": "Publish plugins",
+ "is_file": true,
+ "children": [
+ {
+ "type": "schema_template",
+ "name": "template_publish_plugin",
+ "template_data": [
+ {
+ "key": "ValidateMissingLayers",
+ "label": "ValidateMissingLayers"
+ }
+ ]
+ }
+ ]
+ },
+ {
+ "type": "schema",
+ "name": "schema_publish_gui_filter"
+ }
+ ]
+}
diff --git a/openpype/tools/settings/settings/style/__init__.py b/openpype/tools/settings/settings/style/__init__.py
index 9bb5e851b4..5a57642ee1 100644
--- a/openpype/tools/settings/settings/style/__init__.py
+++ b/openpype/tools/settings/settings/style/__init__.py
@@ -1,4 +1,5 @@
import os
+from openpype import resources
def load_stylesheet():
@@ -9,4 +10,4 @@ def load_stylesheet():
def app_icon_path():
- return os.path.join(os.path.dirname(__file__), "openpype_icon.png")
+ return resources.pype_icon_filepath()
diff --git a/openpype/tools/settings/settings/style/pype_icon.png b/openpype/tools/settings/settings/style/pype_icon.png
deleted file mode 100644
index bfacf6eeed..0000000000
Binary files a/openpype/tools/settings/settings/style/pype_icon.png and /dev/null differ
diff --git a/openpype/version.py b/openpype/version.py
index f85ea13ac8..dedf799055 100644
--- a/openpype/version.py
+++ b/openpype/version.py
@@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
-__version__ = "3.0.0-beta"
+__version__ = "3.0.0-beta2"
diff --git a/pyproject.toml b/pyproject.toml
index 6df6db5a18..c8c0d5b4ff 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "OpenPype"
-version = "3.0.0-alpha1"
+version = "3.0.0-beta2"
description = "Multi-platform open-source pipeline built around the Avalon platform, expanding it with extra features and integrations."
authors = ["OpenPype Team "]
license = "MIT License"
diff --git a/repos/avalon-core b/repos/avalon-core
index bbba8765c4..911bd8999a 160000
--- a/repos/avalon-core
+++ b/repos/avalon-core
@@ -1 +1 @@
-Subproject commit bbba8765c431ee124590e4f12d2e56db4d62eacd
+Subproject commit 911bd8999ab0030d0f7412dde6fd545c1a73b62d
diff --git a/website/docs/artist_hosts_blender.md b/website/docs/artist_hosts_blender.md
new file mode 100644
index 0000000000..877e99bff4
--- /dev/null
+++ b/website/docs/artist_hosts_blender.md
@@ -0,0 +1,226 @@
+---
+id: artist_hosts_blender
+title: Blender
+sidebar_label: Blender
+---
+
+## OpenPype global tools
+
+- [Set Context](artist_tools.md#set-context)
+- [Work Files](artist_tools.md#workfiles)
+- [Create](artist_tools.md#creator)
+- [Load](artist_tools.md#loader)
+- [Manage (Inventory)](artist_tools.md#inventory)
+- [Publish](artist_tools.md#publisher)
+- [Library Loader](artist_tools.md#library-loader)
+
+## Working with OpenPype in Blender
+
+OpenPype is here to ease the burden of working on a project with lots of
+collaborators: worrying about naming, settings, browsing through endless
+directories, loading and exporting, and so on. To achieve that, OpenPype uses
+the concept of being _"data driven"_. This means that what happens when
+publishing is influenced by data in the scene. This can be slightly confusing,
+so let's get to it with a few examples.
+
+
+## Setting scene data
+
+Blender settings concerning framerate, resolution and frame range are handled
+by OpenPype. If set correctly in Ftrack, Blender will automatically set the
+values for you.
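Conceptually, this amounts to copying the asset's context data onto the scene. The sketch below illustrates the idea with a plain object; the `asset_data` keys and the attribute names are assumptions for illustration, not OpenPype's actual API:

```python
from types import SimpleNamespace


def apply_scene_data(scene, asset_data):
    """Copy context values (illustrative keys) onto a scene-like object."""
    scene.render.fps = asset_data["fps"]
    scene.render.resolution_x = asset_data["resolutionWidth"]
    scene.render.resolution_y = asset_data["resolutionHeight"]
    scene.frame_start = asset_data["frameStart"]
    scene.frame_end = asset_data["frameEnd"]


# Stand-in for bpy.context.scene, just to show the shape of the data.
scene = SimpleNamespace(render=SimpleNamespace(), frame_start=0, frame_end=0)
apply_scene_data(scene, {"fps": 25, "resolutionWidth": 1920,
                         "resolutionHeight": 1080,
                         "frameStart": 1001, "frameEnd": 1100})
print(scene.render.resolution_x, scene.frame_end)  # 1920 1100
```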
+
+
+## Publishing models
+
+### Intro
+
+Publishing models in Blender is pretty straightforward. Create your model as
+you need. You might have to adhere to your studio's specifications, which can
+differ between studios and projects, but by default your geometry does not
+need to follow any other convention.
+
+
+
+### Creating instance
+
+Now create a **Model instance** from it to let OpenPype know what in the scene
+you want to publish. Go **OpenPype → Create... → Model**.
+
+
+
+The `Asset` field is the name of the asset you are working on - it should
+already be filled with the correct name, since you started Blender in (or
+switched context to) a specific asset. You can edit that field to change it to
+a different asset (but that asset must already exist).
+
+The `Subset` field is a name you can decide on. It should describe what kind of
+data the model holds. For example, you can name it `Proxy` to indicate that
+this is low-resolution geometry. See [Subset](artist_concepts#subset).
+
+
+
+The read-only field just under it shows the final subset name, which appends
+the subset field to the name of the group you have selected.
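Judging by the names used in this guide (`modelDefault`, `Proxy`), the final subset name appears to be the family plus the capitalized subset field. A hedged sketch of that assumed convention:

```python
def subset_name(family, variant):
    """Compose a subset name like 'modelDefault' (assumed convention)."""
    return family + variant[:1].upper() + variant[1:]


print(subset_name("model", "default"))  # modelDefault
print(subset_name("model", "proxy"))   # modelProxy
```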
+
+The `Use selection` checkbox wraps whatever you have selected in the Outliner
+into the Model instance. This is usually what you want. Click the **Create**
+button.
+
+You'll notice that after you've created the new Model instance, there is a new
+collection in the Outliner named after your asset and subset, in our case
+`character1_modelDefault`. The objects selected when creating the Model
+instance are linked into the new collection.
+
+And that's it, you have your first model ready to publish.
+
+Now save your scene (if you haven't done it already). You will notice that the
+path in the Save dialog is already set to the place where scenes related to the
+modeling task on your asset should reside. As in our case we are working on the
+asset **character1** and the task **modeling**, the path relative to your
+project directory will be `project_XY/assets/character1/work/modeling`. The
+default name for the file will be `project_XY_asset_task_version`, so in our
+case `simonetest_character1_modeling_v001.blend`. Let's save it.
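The default file name can be thought of as a simple template fill. The template string below is an assumption for illustration (the real templates come from the project's Anatomy settings):

```python
def workfile_name(project, asset, task, version, ext="blend"):
    """Fill the project_asset_task_version pattern (illustrative template)."""
    return "{}_{}_{}_v{:03d}.{}".format(project, asset, task, version, ext)


print(workfile_name("simonetest", "character1", "modeling", 1))
# simonetest_character1_modeling_v001.blend
```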
+
+
+
+### Publishing models
+
+Now let's publish it. Go **OpenPype → Publish...**. You will be presented with the following window:
+
+
+
+Note that the content of this window can differ depending on your pipeline
+configuration. For more detail see [Publisher](artist_tools#publisher).
+
+Items in the left column are the instances you will be publishing. You can
+disable them by clicking the square next to them. A white filled square
+indicates an instance is ready for publishing; red means something went wrong
+during the collection or publishing phase. An empty square with gray text
+means the instance is disabled.
+
+See that in this case we are publishing from the scene file
+`simonetest_character1_modeling_v001.blend` the Blender model named
+`character1_modelDefault`.
+
+The right column lists all tasks that are run during the collection,
+validation, extraction and integration phases. White items are optional and
+you can disable them by clicking on them.
+
+Let's do a dry run of publishing to see if we pass all validators. Click the
+flask icon at the bottom to run the validators. Ideally you will end up with
+everything green in the validator section.
+
+### Fixing problems
+
+For the sake of demonstration, I intentionally kept the model in Edit Mode, to
+trigger the validator designed to check just this.
+
+
+
+You can see our model is now marked red in the left column, and on the right
+we have a red box next to the `Mesh is in Object Mode` validator.
+
+You can click the arrow next to it to see more details:
+
+
+
+In the **Records** entry you can see that there is a problem with the object
+`Suzanne`. Some validators have an option to fix the problem for you, or to
+just select the objects that cause trouble. This is the case with our failed
+validator.
+
+In the main overview you can notice a little A in a circle next to the
+validator name. Right-click it and you will see the menu item `select invalid`,
+which selects the offending object in Blender.
+
+The fix is easy. Without closing the Publisher window, we just switch back to
+Object Mode. Then we need to reset the Publisher so it notices the changes
+we've made: click the circular arrow button at the bottom to reset it to its
+initial state. Run the validators again (flask icon) to see if everything is OK.
+
+It should be OK now. Write a comment if you want and click the play icon button
+when you are ready.
+
+The publish process will now take its course. Depending on the data you are
+publishing, it can take a while. You should end up with everything green and the
+message **Finished successfully ...**. You can now close the Publisher window.
+
+To check for yourself that the model is published, open the
+[Asset Loader](artist_tools#loader) via **OpenPype → Load...**.
+There you should see your model, named `modelDefault`.
+
+### Loading models
+
+You can load a model with the [Loader](artist_tools.md#loader). Go **OpenPype → Load...**,
+select your model, right-click on it and click **Link model (blend)**.
+
+## Creating Rigs
+
+Creating and publishing rigs with OpenPype follows a workflow similar to the one
+for other data types. Create your rig and mark parts of your hierarchy in sets to
+help the OpenPype validators and extractors check and publish it.
+
+### Preparing rig for publish
+
+When creating rigs in Blender, it is important to keep a specific structure for
+the bones and the geometry. Let's first create a model and its rig. For
+demonstration, I'll create a simple robotic arm model made of boxes.
+
+
+
+I have now created the armature `RIG_RobotArm`. While the naming is not important
+(you can simply follow your own naming conventions), the hierarchy is. Once the
+models are skinned to the armature, the geometry must be organized into a separate
+Collection. In this case, I have the armature in the main Collection, and the
+geometry in the `Geometry` Collection.
+
+
+
+When you've prepared your hierarchy, it's time to create a *Rig instance* in OpenPype.
+Select your whole rig hierarchy and go **OpenPype → Create...**. Select **Rig**.
+
+
+
+A new collection named after the selected Asset and Subset should have been created.
+In our case, it is `character1_rigDefault`. All the selected armatures and models
+have been linked into this new collection. You should end up with something like
+this:
+
+
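+The instance collection name seems to combine the Asset and Subset names. As a
+small sketch (an assumption based on the `character1_rigDefault` example above,
+not OpenPype's actual implementation):
+
```python
# Hypothetical helper, not OpenPype API: derives the instance collection
# name from the asset and subset, as seen with `character1_rigDefault`.
def instance_collection_name(asset: str, subset: str) -> str:
    return f"{asset}_{subset}"

print(instance_collection_name("character1", "rigDefault"))
# character1_rigDefault
```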
+
+### Publishing rigs
+
+Publishing a rig is done in the same way as publishing everything else. Save your
+scene and go **OpenPype → Publish**. For more details, see [Publisher](artist_tools#publisher).
+
+### Loading rigs
+
+You can load a rig with the [Loader](artist_tools.md#loader). Go **OpenPype → Load...**,
+select your rig, right-click on it and click **Link rig (blend)**.
+
+## Layouts in Blender
+
+A layout is a set of elements that populate a scene. OpenPype allows you to version
+and manage those sets.
+
+### Publishing a layout
+
+Working with layouts is easy. Just load your assets into the scene with the
+[Loader](artist_tools.md#loader) (**OpenPype → Load...**). Populate your scene as
+you wish, translating each piece to fit your needs. When ready, select all the
+imported assets, go **OpenPype → Create...** and select **Layout**. When selecting
+rigs, you only need to select the armature; the geometry will be included
+automatically. This will create a set containing your selection and mark it for
+publishing.
+
+Now you can publish it with **OpenPype → Publish**.
+
+### Loading layouts
+
+You can load a layout using the [Loader](artist_tools.md#loader)
+(**OpenPype → Load...**). Select your layout, right-click on it and
+select **Link Layout (blend)**. This will populate your scene with all the
+models you put into the layout.
\ No newline at end of file
diff --git a/website/docs/assets/blender-model_create_instance.jpg b/website/docs/assets/blender-model_create_instance.jpg
new file mode 100644
index 0000000000..d0891c5d05
Binary files /dev/null and b/website/docs/assets/blender-model_create_instance.jpg differ
diff --git a/website/docs/assets/blender-model_error_details.jpg b/website/docs/assets/blender-model_error_details.jpg
new file mode 100644
index 0000000000..1756254e5f
Binary files /dev/null and b/website/docs/assets/blender-model_error_details.jpg differ
diff --git a/website/docs/assets/blender-model_example.jpg b/website/docs/assets/blender-model_example.jpg
new file mode 100644
index 0000000000..98d98e903f
Binary files /dev/null and b/website/docs/assets/blender-model_example.jpg differ
diff --git a/website/docs/assets/blender-model_pre_publish.jpg b/website/docs/assets/blender-model_pre_publish.jpg
new file mode 100644
index 0000000000..11233229c5
Binary files /dev/null and b/website/docs/assets/blender-model_pre_publish.jpg differ
diff --git a/website/docs/assets/blender-model_publish_error.jpg b/website/docs/assets/blender-model_publish_error.jpg
new file mode 100644
index 0000000000..260d9b9996
Binary files /dev/null and b/website/docs/assets/blender-model_publish_error.jpg differ
diff --git a/website/docs/assets/blender-rig_create.jpg b/website/docs/assets/blender-rig_create.jpg
new file mode 100644
index 0000000000..169ddae84f
Binary files /dev/null and b/website/docs/assets/blender-rig_create.jpg differ
diff --git a/website/docs/assets/blender-rig_hierarchy_before_publish.jpg b/website/docs/assets/blender-rig_hierarchy_before_publish.jpg
new file mode 100644
index 0000000000..81f3916c9e
Binary files /dev/null and b/website/docs/assets/blender-rig_hierarchy_before_publish.jpg differ
diff --git a/website/docs/assets/blender-rig_hierarchy_example.jpg b/website/docs/assets/blender-rig_hierarchy_example.jpg
new file mode 100644
index 0000000000..6ab6897650
Binary files /dev/null and b/website/docs/assets/blender-rig_hierarchy_example.jpg differ
diff --git a/website/docs/assets/blender-rig_model_setup.jpg b/website/docs/assets/blender-rig_model_setup.jpg
new file mode 100644
index 0000000000..6f967cdab4
Binary files /dev/null and b/website/docs/assets/blender-rig_model_setup.jpg differ
diff --git a/website/docs/assets/blender-save_modelling_file.jpg b/website/docs/assets/blender-save_modelling_file.jpg
new file mode 100644
index 0000000000..d7f2401c51
Binary files /dev/null and b/website/docs/assets/blender-save_modelling_file.jpg differ
diff --git a/website/sidebars.js b/website/sidebars.js
index ec608f0a13..82f063e252 100644
--- a/website/sidebars.js
+++ b/website/sidebars.js
@@ -19,6 +19,7 @@ module.exports = {
"artist_hosts_nukestudio",
"artist_hosts_nuke",
"artist_hosts_maya",
+ "artist_hosts_blender",
"artist_hosts_harmony",
"artist_hosts_aftereffects",
"artist_hosts_photoshop",