Mirror of https://github.com/ynput/ayon-core.git (synced 2026-01-02 00:44:52 +01:00)

Merge branch 'develop' into feature/new_publisher_core

Commit 952e7a7599: 185 changed files with 9,323 additions and 1,326 deletions
CHANGELOG.md (166 lines changed)

@@ -1,128 +1,122 @@
# Changelog

-## [3.2.0-nightly.5](https://github.com/pypeclub/OpenPype/tree/HEAD)
+## [3.3.0-nightly.9](https://github.com/pypeclub/OpenPype/tree/HEAD)

-[Full Changelog](https://github.com/pypeclub/OpenPype/compare/2.18.4...HEAD)
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.2.0...HEAD)

**🚀 Enhancements**

- submodules: avalon-core update [\#1911](https://github.com/pypeclub/OpenPype/pull/1911)
- Feature AE local render [\#1901](https://github.com/pypeclub/OpenPype/pull/1901)
- Ftrack: Where I run action enhancement [\#1900](https://github.com/pypeclub/OpenPype/pull/1900)
- Ftrack: Private project server actions [\#1899](https://github.com/pypeclub/OpenPype/pull/1899)
- Support nested studio plugins paths. [\#1898](https://github.com/pypeclub/OpenPype/pull/1898)
- Settings: global validators with options [\#1892](https://github.com/pypeclub/OpenPype/pull/1892)
- Settings: Conditional dict enum positioning [\#1891](https://github.com/pypeclub/OpenPype/pull/1891)
- Expose stop timer through rest api. [\#1886](https://github.com/pypeclub/OpenPype/pull/1886)
- TVPaint: Increment workfile [\#1885](https://github.com/pypeclub/OpenPype/pull/1885)
- Allow Multiple Notes to run on tasks. [\#1882](https://github.com/pypeclub/OpenPype/pull/1882)
- Prepare for pyside2 [\#1869](https://github.com/pypeclub/OpenPype/pull/1869)
- Filter hosts in settings host-enum [\#1868](https://github.com/pypeclub/OpenPype/pull/1868)
- Local actions with process identifier [\#1867](https://github.com/pypeclub/OpenPype/pull/1867)
- Workfile tool start at host launch support [\#1865](https://github.com/pypeclub/OpenPype/pull/1865)
- Anatomy schema validation [\#1864](https://github.com/pypeclub/OpenPype/pull/1864)
- Ftrack prepare project structure [\#1861](https://github.com/pypeclub/OpenPype/pull/1861)
- Independent general environments [\#1853](https://github.com/pypeclub/OpenPype/pull/1853)
- TVPaint Start Frame [\#1844](https://github.com/pypeclub/OpenPype/pull/1844)
- Ftrack push attributes action adds traceback to job [\#1843](https://github.com/pypeclub/OpenPype/pull/1843)
- Prepare project action enhance [\#1838](https://github.com/pypeclub/OpenPype/pull/1838)
- Standalone Publish of textures family [\#1834](https://github.com/pypeclub/OpenPype/pull/1834)
- nuke: settings create missing default subsets [\#1829](https://github.com/pypeclub/OpenPype/pull/1829)
- Update poetry lock [\#1823](https://github.com/pypeclub/OpenPype/pull/1823)
- Settings: settings for plugins [\#1819](https://github.com/pypeclub/OpenPype/pull/1819)
- Settings list can use template or schema as object type [\#1815](https://github.com/pypeclub/OpenPype/pull/1815)
- Maya: Deadline custom settings [\#1797](https://github.com/pypeclub/OpenPype/pull/1797)
- Maya: Shader name validation [\#1762](https://github.com/pypeclub/OpenPype/pull/1762)

**🐛 Bug fixes**

- Nuke: update video file crashing [\#1916](https://github.com/pypeclub/OpenPype/pull/1916)
- Fix - texture validators for workfiles trigger only for textures workfiles [\#1914](https://github.com/pypeclub/OpenPype/pull/1914)
- Settings UI: List order works as expected [\#1906](https://github.com/pypeclub/OpenPype/pull/1906)
- Hiero: loaded clip was not set colorspace from version data [\#1904](https://github.com/pypeclub/OpenPype/pull/1904)
- Pyblish UI: Fix collecting stage processing [\#1903](https://github.com/pypeclub/OpenPype/pull/1903)
- Burnins: Use input's bitrate in h264 [\#1902](https://github.com/pypeclub/OpenPype/pull/1902)
- Bug: fixed python detection [\#1893](https://github.com/pypeclub/OpenPype/pull/1893)
- global: integrate name missing default template [\#1890](https://github.com/pypeclub/OpenPype/pull/1890)
- publisher: editorial plugins fixes [\#1889](https://github.com/pypeclub/OpenPype/pull/1889)
- Normalize path returned from Workfiles. [\#1880](https://github.com/pypeclub/OpenPype/pull/1880)
- Workfiles tool event arguments fix [\#1862](https://github.com/pypeclub/OpenPype/pull/1862)
- imageio: fix grouping [\#1856](https://github.com/pypeclub/OpenPype/pull/1856)
- publisher: missing version in subset prop [\#1849](https://github.com/pypeclub/OpenPype/pull/1849)
- Ftrack type error fix in sync to avalon event handler [\#1845](https://github.com/pypeclub/OpenPype/pull/1845)
- Nuke: updating effects subset fail [\#1841](https://github.com/pypeclub/OpenPype/pull/1841)
- nuke: write render node skipped with crop [\#1836](https://github.com/pypeclub/OpenPype/pull/1836)
- Project folder structure overrides [\#1813](https://github.com/pypeclub/OpenPype/pull/1813)
- Maya: fix yeti settings path in extractor [\#1809](https://github.com/pypeclub/OpenPype/pull/1809)
- Failsafe for cross project containers. [\#1806](https://github.com/pypeclub/OpenPype/pull/1806)
- Settings error dialog on show [\#1798](https://github.com/pypeclub/OpenPype/pull/1798)

**Merged pull requests:**

- Maya: add support for `RedshiftNormalMap` node, fix `tx` linear space 🚀 [\#1863](https://github.com/pypeclub/OpenPype/pull/1863)
- Add support for pyenv-win on windows [\#1822](https://github.com/pypeclub/OpenPype/pull/1822)
- PS, AE - send actual context when another webserver is running [\#1811](https://github.com/pypeclub/OpenPype/pull/1811)

## [3.2.0](https://github.com/pypeclub/OpenPype/tree/3.2.0) (2021-07-13)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.2.0-nightly.7...3.2.0)

**🚀 Enhancements**

- Nuke: ftrack family plugin settings preset [\#1805](https://github.com/pypeclub/OpenPype/pull/1805)
- Standalone publisher last project [\#1799](https://github.com/pypeclub/OpenPype/pull/1799)
- Ftrack Multiple notes as server action [\#1795](https://github.com/pypeclub/OpenPype/pull/1795)
- Settings conditional dict [\#1777](https://github.com/pypeclub/OpenPype/pull/1777)
- Settings application use python 2 only where needed [\#1776](https://github.com/pypeclub/OpenPype/pull/1776)
- Settings UI copy/paste [\#1769](https://github.com/pypeclub/OpenPype/pull/1769)
- Workfile tool widths [\#1766](https://github.com/pypeclub/OpenPype/pull/1766)
- Push hierarchical attributes care about task parent changes [\#1763](https://github.com/pypeclub/OpenPype/pull/1763)
- Application executables with environment variables [\#1757](https://github.com/pypeclub/OpenPype/pull/1757)
- Settings Hosts enum [\#1739](https://github.com/pypeclub/OpenPype/pull/1739)
- Validate containers settings [\#1736](https://github.com/pypeclub/OpenPype/pull/1736)
- PS - added loader from sequence [\#1726](https://github.com/pypeclub/OpenPype/pull/1726)
- Autoupdate launcher [\#1725](https://github.com/pypeclub/OpenPype/pull/1725)
- Subset template and TVPaint subset template docs [\#1717](https://github.com/pypeclub/OpenPype/pull/1717)
- Toggle Ftrack upload in StandalonePublisher [\#1708](https://github.com/pypeclub/OpenPype/pull/1708)
- Overscan color extract review [\#1701](https://github.com/pypeclub/OpenPype/pull/1701)
- Nuke: Prerender Frame Range by default [\#1699](https://github.com/pypeclub/OpenPype/pull/1699)
- Smoother edges of color triangle [\#1695](https://github.com/pypeclub/OpenPype/pull/1695)
- Deadline: Nuke submission additional attributes [\#1756](https://github.com/pypeclub/OpenPype/pull/1756)

**🐛 Bug fixes**

- nuke: fixing wrong name of family folder when `used existing frames` [\#1803](https://github.com/pypeclub/OpenPype/pull/1803)
- Collect ftrack family bugs [\#1801](https://github.com/pypeclub/OpenPype/pull/1801)
- Invitee email can be None which breaks the Ftrack commit. [\#1788](https://github.com/pypeclub/OpenPype/pull/1788)
- Fix: staging and `--use-version` option [\#1786](https://github.com/pypeclub/OpenPype/pull/1786)
- Otio unrelated error on import [\#1782](https://github.com/pypeclub/OpenPype/pull/1782)
- FFprobe streams order [\#1775](https://github.com/pypeclub/OpenPype/pull/1775)
- Fix - single file files are str only, cast it to list to count properly [\#1772](https://github.com/pypeclub/OpenPype/pull/1772)
- Environments in app executable for MacOS [\#1768](https://github.com/pypeclub/OpenPype/pull/1768)
- Project specific environments [\#1767](https://github.com/pypeclub/OpenPype/pull/1767)
- Settings UI with refresh button [\#1764](https://github.com/pypeclub/OpenPype/pull/1764)
- Standalone publisher thumbnail extractor fix [\#1761](https://github.com/pypeclub/OpenPype/pull/1761)
- Anatomy others templates don't cause crash [\#1758](https://github.com/pypeclub/OpenPype/pull/1758)
- Backend acre module commit update [\#1745](https://github.com/pypeclub/OpenPype/pull/1745)
- hiero: precollect instances failing when audio selected [\#1743](https://github.com/pypeclub/OpenPype/pull/1743)
- Hiero: creator instance error [\#1742](https://github.com/pypeclub/OpenPype/pull/1742)
- Nuke: fixing render creator for no selection format failing [\#1741](https://github.com/pypeclub/OpenPype/pull/1741)
- Local settings UI crash on missing defaults [\#1737](https://github.com/pypeclub/OpenPype/pull/1737)
- TVPaint white background on thumbnail [\#1735](https://github.com/pypeclub/OpenPype/pull/1735)
- Ftrack missing custom attribute message [\#1734](https://github.com/pypeclub/OpenPype/pull/1734)
- Launcher project changes [\#1733](https://github.com/pypeclub/OpenPype/pull/1733)
- Ftrack sync status [\#1732](https://github.com/pypeclub/OpenPype/pull/1732)
- TVPaint use layer name for default variant [\#1724](https://github.com/pypeclub/OpenPype/pull/1724)
- Default subset template for TVPaint review and workfile families [\#1716](https://github.com/pypeclub/OpenPype/pull/1716)
- Maya: Extract review hotfix [\#1714](https://github.com/pypeclub/OpenPype/pull/1714)
- Settings: Imageio improving granularity [\#1711](https://github.com/pypeclub/OpenPype/pull/1711)
- Application without executables [\#1679](https://github.com/pypeclub/OpenPype/pull/1679)

**Merged pull requests:**

- Build: don't add Poetry to `PATH` [\#1808](https://github.com/pypeclub/OpenPype/pull/1808)
- Bump prismjs from 1.23.0 to 1.24.0 in /website [\#1773](https://github.com/pypeclub/OpenPype/pull/1773)
- TVPaint ftrack family [\#1755](https://github.com/pypeclub/OpenPype/pull/1755)
- global: removing obsolete ftrack validator plugin [\#1710](https://github.com/pypeclub/OpenPype/pull/1710)
- Bc/fix/docs [\#1771](https://github.com/pypeclub/OpenPype/pull/1771)

## [2.18.4](https://github.com/pypeclub/OpenPype/tree/2.18.4) (2021-06-24)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/2.18.3...2.18.4)

**Merged pull requests:**

- celaction fixes [\#1754](https://github.com/pypeclub/OpenPype/pull/1754)
- celaction: audio subset changed data structure [\#1750](https://github.com/pypeclub/OpenPype/pull/1750)

## [2.18.3](https://github.com/pypeclub/OpenPype/tree/2.18.3) (2021-06-23)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.2.0-nightly.2...2.18.3)

**🐛 Bug fixes**

- Tools names forwards compatibility [\#1727](https://github.com/pypeclub/OpenPype/pull/1727)

## [2.18.2](https://github.com/pypeclub/OpenPype/tree/2.18.2) (2021-06-16)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.1.0...2.18.2)

**🚀 Enhancements**

- StandalonePublisher: adding exception for adding `delete` tag to repre [\#1650](https://github.com/pypeclub/OpenPype/pull/1650)

**🐛 Bug fixes**

- Maya: Extract review hotfix - 2.x backport [\#1713](https://github.com/pypeclub/OpenPype/pull/1713)
- StandalonePublisher: instance data attribute `keepSequence` [\#1668](https://github.com/pypeclub/OpenPype/pull/1668)

**Merged pull requests:**

- 1698 Nuke: Prerender Frame Range by default [\#1709](https://github.com/pypeclub/OpenPype/pull/1709)

## [3.1.0](https://github.com/pypeclub/OpenPype/tree/3.1.0) (2021-06-15)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.1.0-nightly.4...3.1.0)

**🚀 Enhancements**

- Log Viewer with OpenPype style [\#1703](https://github.com/pypeclub/OpenPype/pull/1703)
- Scrolling in OpenPype info widget [\#1702](https://github.com/pypeclub/OpenPype/pull/1702)
- OpenPype style in modules [\#1694](https://github.com/pypeclub/OpenPype/pull/1694)
- Sort applications and tools alphabetically in Settings UI [\#1689](https://github.com/pypeclub/OpenPype/pull/1689)
- \#683 - Validate Frame Range in Standalone Publisher [\#1683](https://github.com/pypeclub/OpenPype/pull/1683)
- Hiero: old container versions identify with red color [\#1682](https://github.com/pypeclub/OpenPype/pull/1682)
- Project Manager: Default name column width [\#1669](https://github.com/pypeclub/OpenPype/pull/1669)
- Remove outline in stylesheet [\#1667](https://github.com/pypeclub/OpenPype/pull/1667)
- TVPaint: Creator take layer name as default value for subset variant [\#1663](https://github.com/pypeclub/OpenPype/pull/1663)
- TVPaint custom subset template [\#1662](https://github.com/pypeclub/OpenPype/pull/1662)
- Editorial: conform assets validator [\#1659](https://github.com/pypeclub/OpenPype/pull/1659)
- Feature Slack integration [\#1657](https://github.com/pypeclub/OpenPype/pull/1657)
- Nuke - Publish simplification [\#1653](https://github.com/pypeclub/OpenPype/pull/1653)
- \#1333 - added tooltip hints to Pyblish buttons [\#1649](https://github.com/pypeclub/OpenPype/pull/1649)

**🐛 Bug fixes**

- Nuke: broken publishing rendered frames [\#1707](https://github.com/pypeclub/OpenPype/pull/1707)
- Standalone publisher Thumbnail export args [\#1705](https://github.com/pypeclub/OpenPype/pull/1705)
- Bad zip can break OpenPype start [\#1691](https://github.com/pypeclub/OpenPype/pull/1691)
- Hiero: published whole edit mov [\#1687](https://github.com/pypeclub/OpenPype/pull/1687)
- Ftrack subprocess handle of stdout/stderr [\#1675](https://github.com/pypeclub/OpenPype/pull/1675)
- Settings list race condition and mutable dict list conversion [\#1671](https://github.com/pypeclub/OpenPype/pull/1671)
- Mac launch arguments fix [\#1660](https://github.com/pypeclub/OpenPype/pull/1660)
- Fix missing dbm python module [\#1652](https://github.com/pypeclub/OpenPype/pull/1652)
- Transparent branches in view on Mac [\#1648](https://github.com/pypeclub/OpenPype/pull/1648)
- Add asset on task item [\#1646](https://github.com/pypeclub/OpenPype/pull/1646)
- Project manager save and queue [\#1645](https://github.com/pypeclub/OpenPype/pull/1645)
- New project anatomy values [\#1644](https://github.com/pypeclub/OpenPype/pull/1644)

**Merged pull requests:**

- update dependencies [\#1697](https://github.com/pypeclub/OpenPype/pull/1697)
- Bump normalize-url from 4.5.0 to 4.5.1 in /website [\#1686](https://github.com/pypeclub/OpenPype/pull/1686)
README.md (13 lines changed)

@@ -29,7 +29,7 @@ The main things you will need to run and build OpenPype are:
- PowerShell 5.0+ (Windows)
- Bash (Linux)
- [**Python 3.7.8**](#python) or higher
-- [**MongoDB**](#database)
+- [**MongoDB**](#database) (needed only for local development)

It can be built and run on all common platforms. We develop and test on the following:

@@ -126,6 +126,16 @@ pyenv local 3.7.9

### Linux

+#### Docker
+The easiest way to build OpenPype on Linux is to use [Docker](https://www.docker.com/). Just run:
+
+```sh
+sudo ./tools/docker_build.sh
+```
+
+If all is successful, you'll find the built OpenPype in the `./build/` folder.
+
#### Manual build
You will need [Python 3.7](https://www.python.org/downloads/) and [git](https://git-scm.com/downloads). You'll also need [curl](https://curl.se) on systems that don't have it preinstalled.

To build Python related stuff, you need Python header files installed (`python3-dev` on Ubuntu for example).

@@ -133,7 +143,6 @@ To build Python related stuff, you need Python header files installed (`python3-dev` on Ubuntu for example).
You'll also need other tools to build some OpenPype dependencies, like [CMake](https://cmake.org/). Python 3 should be part of all modern distributions. You can use your package manager to install **git** and **cmake**.

<details>
<summary>Details for Ubuntu</summary>
Install git, cmake and curl
@@ -657,7 +657,7 @@ class BootstrapRepos:
        ]

        # remove duplicates
-       openpype_versions = list(set(openpype_versions))
+       openpype_versions = sorted(list(set(openpype_versions)))

        return openpype_versions
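A set has no guaranteed iteration order, so the old `list(set(...))` could return the detected versions in a different order on every run; wrapping it in `sorted(...)` makes the result deterministic, assuming the items are comparable. A minimal sketch with tuples standing in for the version objects:

```python
# Deduplicate while keeping a stable, ascending order.
# Tuples stand in for OpenPype version objects, which support
# the same rich-comparison behavior.
versions = [(3, 2, 0), (3, 1, 0), (3, 2, 0), (2, 18, 4)]

deduplicated = sorted(set(versions))
print(deduplicated)  # [(2, 18, 4), (3, 1, 0), (3, 2, 0)]
```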
@@ -98,6 +98,11 @@ def install():
            .get(platform_name)
        ) or []
        for path in project_plugins:
+           try:
+               path = str(path.format(**os.environ))
+           except KeyError:
+               pass
+
            if not path or not os.path.exists(path):
                continue
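The added `format(**os.environ)` call lets plugin paths from settings embed environment-variable placeholders; when a placeholder is not defined, the `KeyError` is swallowed and the raw value falls through to the `os.path.exists` check. A small sketch of the behavior (the variable name is made up):

```python
import os

os.environ["STUDIO_PLUGINS"] = "/studio/plugins"  # hypothetical variable

for raw_path in ["{STUDIO_PLUGINS}/maya", "{MISSING_VAR}/nuke"]:
    try:
        path = raw_path.format(**os.environ)
    except KeyError:
        # placeholder not defined in the environment; keep the raw value,
        # which the existence check will then skip
        path = raw_path
    print(path, os.path.exists(path))
```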
@@ -15,6 +15,9 @@ from .pype_commands import PypeCommands
              expose_value=False, help="use specified version")
@click.option("--use-staging", is_flag=True,
              expose_value=False, help="use staging variants")
+@click.option("--list-versions", is_flag=True, expose_value=False,
+              help=("list all detected versions. Use with `--use-staging` "
+                    "to list staging versions."))
def main(ctx):
    """Pype is the main command serving as entry point to the pipeline system.
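`expose_value=False` means click parses and validates the flag but never passes it into `main()`; the actual handling presumably happens earlier, during bootstrap, by inspecting `sys.argv` directly. A minimal sketch of that pattern:

```python
import click


@click.command()
@click.option("--list-versions", is_flag=True, expose_value=False,
              help="parsed by click, but not passed to the function")
def main():
    # --list-versions never arrives here; a pre-parse step (for example,
    # scanning sys.argv during bootstrap) is expected to act on it
    click.echo("running main")


if __name__ == "__main__":
    main()
```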
@@ -49,7 +49,7 @@ class CopyTemplateWorkfile(PreLaunchHook):
            ))
            return

-       self.log.info("Last workfile does not exits.")
+       self.log.info("Last workfile does not exist.")

        project_name = self.data["project_name"]
        asset_name = self.data["asset_name"]
openpype/hooks/pre_foundry_apps.py (new file, 28 lines)

@@ -0,0 +1,28 @@
import subprocess
from openpype.lib import PreLaunchHook


class LaunchFoundryAppsWindows(PreLaunchHook):
    """Foundry applications have a specific way of launching.

    Nuke is executed "like" a python process, so it is required to pass
    the `CREATE_NEW_CONSOLE` flag on Windows to trigger creation of a new
    console. At the same time the newly created console won't create its
    own stdout and stderr handlers, so they should not be redirected to
    DEVNULL.
    """

    # Should be the last hook because it must change launch arguments
    # to string
    order = 1000
    app_groups = ["nuke", "nukex", "hiero", "nukestudio"]
    platforms = ["windows"]

    def execute(self):
        # Change `creationflags` to CREATE_NEW_CONSOLE
        # - on Windows Nuke will create a new window using its console
        # Set `stdout` and `stderr` to None so the newly created console
        # does not have its output redirected to DEVNULL in builds
        self.launch_context.kwargs.update({
            "creationflags": subprocess.CREATE_NEW_CONSOLE,
            "stdout": None,
            "stderr": None
        })
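These kwargs presumably end up in the `subprocess.Popen` call that launches the application. A minimal Windows-only sketch of the resulting call (the executable path is made up):

```python
import subprocess

# Hypothetical launch: CREATE_NEW_CONSOLE gives the process its own
# console window, and stdout/stderr stay attached to that console
# instead of being redirected to DEVNULL. The flag exists on Windows only.
proc = subprocess.Popen(
    ["C:/Program Files/Nuke13.0v1/Nuke13.0.exe"],  # made-up path
    creationflags=subprocess.CREATE_NEW_CONSOLE,
    stdout=None,
    stderr=None,
)
```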
@@ -49,5 +49,7 @@ class NonPythonHostHook(PreLaunchHook):
        if remainders:
            self.launch_context.launch_args.extend(remainders)

+       # This must be set, otherwise it wouldn't be possible to catch
+       # output when a build of OpenPype is used.
        self.launch_context.kwargs["stdout"] = subprocess.DEVNULL
-       self.launch_context.kwargs["stderr"] = subprocess.STDOUT
+       self.launch_context.kwargs["stderr"] = subprocess.DEVNULL
Deleted file (44 lines), a pre-launch hook with the same app groups and order as the new `LaunchFoundryAppsWindows` hook above:

@@ -1,44 +0,0 @@
import os
import subprocess
from openpype.lib import PreLaunchHook


class LaunchWithWindowsShell(PreLaunchHook):
    """Add shell command before executable.

    Some hosts have issues when they are launched directly from python;
    in that case it is possible to prepend a shell executable which will
    trigger the process instead.
    """

    # Should be the last hook because it must change launch arguments
    # to string
    order = 1000
    app_groups = ["nuke", "nukex", "hiero", "nukestudio"]
    platforms = ["windows"]

    def execute(self):
        launch_args = self.launch_context.clear_launch_args(
            self.launch_context.launch_args)
        new_args = [
            # Get comspec which is cmd.exe in most cases.
            os.environ.get("COMSPEC", "cmd.exe"),
            # NOTE change to "/k" if you want to keep the console open
            "/c",
            # Convert arguments to command line arguments (as string)
            "\"{}\"".format(
                subprocess.list2cmdline(launch_args)
            )
        ]
        # Convert list to string
        # WARNING this only works if it is used as a string
        args_string = " ".join(new_args)
        self.log.info((
            "Modified launch arguments to be launched with shell \"{}\"."
        ).format(args_string))

        # Replace launch args with new one
        self.launch_context.launch_args = args_string
        # Change `creationflags` to CREATE_NEW_CONSOLE
        self.launch_context.kwargs["creationflags"] = (
            subprocess.CREATE_NEW_CONSOLE
        )
New file (17 lines), an After Effects creator plugin:

@@ -0,0 +1,17 @@
from openpype.hosts.aftereffects.plugins.create import create_render

import logging

log = logging.getLogger(__name__)


class CreateLocalRender(create_render.CreateRender):
    """Creator to render locally.

    Created only after the default render-on-farm creator, so the family
    'renderLocal' is used for backward compatibility.
    """

    name = "renderDefault"
    label = "Render Locally"
    family = "renderLocal"
@@ -1,10 +1,14 @@
-from openpype.lib import abstract_collect_render
-from openpype.lib.abstract_collect_render import RenderInstance
-import pyblish.api
-import attr
import os
import re
+import attr
+import tempfile

from avalon import aftereffects
+import pyblish.api

+from openpype.settings import get_project_settings
+from openpype.lib import abstract_collect_render
+from openpype.lib.abstract_collect_render import RenderInstance


@attr.s

@@ -13,6 +17,8 @@ class AERenderInstance(RenderInstance):
    comp_name = attr.ib(default=None)
    comp_id = attr.ib(default=None)
    fps = attr.ib(default=None)
+   projectEntity = attr.ib(default=None)
+   stagingDir = attr.ib(default=None)


class CollectAERender(abstract_collect_render.AbstractCollectRender):

@@ -21,6 +27,11 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
    label = "Collect After Effects Render Layers"
    hosts = ["aftereffects"]

+   # internal
+   family_remapping = {
+       "render": ("render.farm", "farm"),  # (family, label)
+       "renderLocal": ("render", "local")
+   }
    padding_width = 6
    rendered_extension = 'png'

@@ -62,14 +73,16 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
            fps = work_area_info.frameRate
            # TODO add resolution when supported by extension

-           if inst["family"] == "render" and inst["active"]:
+           if inst["family"] in self.family_remapping.keys() \
+                   and inst["active"]:
+               remapped_family = self.family_remapping[inst["family"]]
                instance = AERenderInstance(
-                   family="render.farm",  # other way integrate would catch it
-                   families=["render.farm"],
+                   family=remapped_family[0],
+                   families=[remapped_family[0]],
                    version=version,
                    time="",
                    source=current_file,
-                   label="{} - farm".format(inst["subset"]),
+                   label="{} - {}".format(inst["subset"], remapped_family[1]),
                    subset=inst["subset"],
                    asset=context.data["assetEntity"]["name"],
                    attachTo=False,

@@ -105,6 +118,30 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):

                instance.outputDir = self._get_output_dir(instance)

+               settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+               reviewable_subset_filter = \
+                   (settings["deadline"]
+                       ["publish"]
+                       ["ProcessSubmittedJobOnFarm"]
+                       ["aov_filter"])
+
+               if inst["family"] == "renderLocal":
+                   # for local renders
+                   instance.anatomyData["version"] = instance.version
+                   instance.anatomyData["subset"] = instance.subset
+                   instance.stagingDir = tempfile.mkdtemp()
+                   instance.projectEntity = project_entity
+
+                   if self.hosts[0] in reviewable_subset_filter.keys():
+                       for aov_pattern in \
+                               reviewable_subset_filter[self.hosts[0]]:
+                           if re.match(aov_pattern, instance.subset):
+                               instance.families.append("review")
+                               instance.review = True
+                               break
+
                self.log.info("New instance:: {}".format(instance))

                instances.append(instance)

        return instances
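The `family_remapping` table now drives both the published family and the instance label, so local renders integrate as plain `render` while farm submissions stay `render.farm`. A minimal sketch of the lookup:

```python
# (published family, label suffix) keyed by the family set at creation
family_remapping = {
    "render": ("render.farm", "farm"),
    "renderLocal": ("render", "local"),
}

inst = {"family": "renderLocal", "subset": "renderCompMain", "active": True}

if inst["family"] in family_remapping and inst["active"]:
    family, label_suffix = family_remapping[inst["family"]]
    label = "{} - {}".format(inst["subset"], label_suffix)
    print(family, label)  # render renderCompMain - local
```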
New file (82 lines), an After Effects extractor plugin:

@@ -0,0 +1,82 @@
import os
import sys

import six

import openpype.api
import openpype.lib  # used below for the ffmpeg helpers
from avalon import aftereffects


class ExtractLocalRender(openpype.api.Extractor):
    """Render RenderQueue locally."""

    order = openpype.api.Extractor.order - 0.47
    label = "Extract Local Render"
    hosts = ["aftereffects"]
    families = ["render"]

    def process(self, instance):
        stub = aftereffects.stub()
        staging_dir = instance.data["stagingDir"]
        self.log.info("staging_dir::{}".format(staging_dir))

        stub.render(staging_dir)

        # pull file name from Render Queue Output module
        render_q = stub.get_render_info()
        if not render_q:
            raise ValueError("No file extension set in Render Queue")
        _, ext = os.path.splitext(os.path.basename(render_q.file_name))
        ext = ext[1:]

        first_file_path = None
        files = []
        self.log.info("files::{}".format(os.listdir(staging_dir)))
        for file_name in os.listdir(staging_dir):
            files.append(file_name)
            if first_file_path is None:
                first_file_path = os.path.join(staging_dir,
                                               file_name)

        resulting_files = files
        if len(files) == 1:
            resulting_files = files[0]

        repre_data = {
            "frameStart": instance.data["frameStart"],
            "frameEnd": instance.data["frameEnd"],
            "name": ext,
            "ext": ext,
            "files": resulting_files,
            "stagingDir": staging_dir
        }
        if instance.data["review"]:
            repre_data["tags"] = ["review"]

        instance.data["representations"] = [repre_data]

        ffmpeg_path = openpype.lib.get_ffmpeg_tool_path("ffmpeg")
        # Generate thumbnail.
        thumbnail_path = os.path.join(staging_dir,
                                      "thumbnail.jpg")

        args = [
            ffmpeg_path, "-y",
            "-i", first_file_path,
            "-vf", "scale=300:-1",
            "-vframes", "1",
            thumbnail_path
        ]
        self.log.debug("Thumbnail args:: {}".format(args))
        try:
            output = openpype.lib.run_subprocess(args)
        except TypeError:
            self.log.warning("Error in creating thumbnail")
            six.reraise(*sys.exc_info())

        instance.data["representations"].append({
            "name": "thumbnail",
            "ext": "jpg",
            "files": os.path.basename(thumbnail_path),
            "stagingDir": staging_dir,
            "tags": ["thumbnail"]
        })
New file (61 lines), an After Effects validator plugin:

@@ -0,0 +1,61 @@
from avalon import api
import pyblish.api
import openpype.api
from avalon import aftereffects


class ValidateInstanceAssetRepair(pyblish.api.Action):
    """Repair the instance asset with value from Context."""

    label = "Repair"
    icon = "wrench"
    on = "failed"

    def process(self, context, plugin):

        # Get the errored instances
        failed = []
        for result in context.data["results"]:
            if (result["error"] is not None and result["instance"] is not None
                    and result["instance"] not in failed):
                failed.append(result["instance"])

        # Apply pyblish.logic to get the instances for the plug-in
        instances = pyblish.api.instances_by_plugin(failed, plugin)
        stub = aftereffects.stub()
        for instance in instances:
            data = stub.read(instance[0])

            data["asset"] = api.Session["AVALON_ASSET"]
            stub.imprint(instance[0], data)


class ValidateInstanceAsset(pyblish.api.InstancePlugin):
    """Validate that the instance asset is the current context asset.

    It might happen that multiple workfiles are opened at the same time;
    switching between them would mess with the selected context (from
    Launcher or Ftrack).

    In that case outputs might be published under the wrong asset!

    The Repair action will use the Context asset value (from Workfiles or
    Launcher). Closing and reopening via Workfiles will refresh the
    Context value.
    """

    label = "Validate Instance Asset"
    hosts = ["aftereffects"]
    actions = [ValidateInstanceAssetRepair]
    order = openpype.api.ValidateContentsOrder

    def process(self, instance):
        instance_asset = instance.data["asset"]
        current_asset = api.Session["AVALON_ASSET"]
        msg = (
            f"Instance asset {instance_asset} is not the same "
            f"as current context {current_asset}. PLEASE DO:\n"
            f"Repair with 'A' action to use '{current_asset}'.\n"
            f"If that's not the correct value, close the workfile and "
            f"reopen it via Workfiles!"
        )
        assert instance_asset == current_asset, msg
@@ -53,7 +53,7 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin):

    order = pyblish.api.ValidatorOrder
    label = "Validate Scene Settings"
-   families = ["render.farm"]
+   families = ["render.farm", "render"]
    hosts = ["aftereffects"]
    optional = True
@@ -54,6 +54,10 @@ class LoadClip(phiero.SequenceLoader):
        object_name = self.clip_name_template.format(
            **context["representation"]["context"])

+       # set colorspace
+       if colorspace:
+           track_item.source().setSourceMediaColourTransform(colorspace)
+
        # add additional metadata from the version to imprint Avalon knob
        add_keys = [
            "frameStart", "frameEnd", "source", "author",

@@ -109,9 +113,14 @@ class LoadClip(phiero.SequenceLoader):
        colorspace = version_data.get("colorspace", None)
        object_name = "{}_{}".format(name, namespace)
        file = api.get_representation_path(representation).replace("\\", "/")
+       clip = track_item.source()

        # reconnect media to new path
-       track_item.source().reconnectMedia(file)
+       clip.reconnectMedia(file)

+       # set colorspace
+       if colorspace:
+           clip.setSourceMediaColourTransform(colorspace)
+
        # add additional metadata from the version to imprint Avalon knob
        add_keys = [

@@ -160,6 +169,7 @@ class LoadClip(phiero.SequenceLoader):
    @classmethod
    def set_item_color(cls, track_item, version):

+       clip = track_item.source()
        # define version name
        version_name = version.get("name", None)
        # get all versions in list

@@ -172,6 +182,6 @@ class LoadClip(phiero.SequenceLoader):

        # set clip colour
        if version_name == max_version:
-           track_item.source().binItem().setColor(cls.clip_color_last)
+           clip.binItem().setColor(cls.clip_color_last)
        else:
-           track_item.source().binItem().setColor(cls.clip_color)
+           clip.binItem().setColor(cls.clip_color)
@@ -120,6 +120,13 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
        # create instance
        instance = context.create_instance(**data)

+       # add colorspace data
+       instance.data.update({
+           "versionData": {
+               "colorspace": track_item.sourceMediaColourTransform(),
+           }
+       })
+
        # create shot instance for shot attributes create/update
        self.create_shot_instance(context, **data)

@@ -133,13 +140,6 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
        # create audio subset instance
        self.create_audio_instance(context, **data)

-       # add colorspace data
-       instance.data.update({
-           "versionData": {
-               "colorspace": track_item.sourceMediaColourTransform(),
-           }
-       })
-
        # add audioReview attribute to plate instance data
        # if reviewTrack is on
        if tag_data.get("reviewTrack") is not None:
@@ -56,7 +56,7 @@ class CollectInstances(pyblish.api.ContextPlugin):
        # Create nice name if the instance has a frame range.
        label = data.get("name", node.name())
        if "frameStart" in data and "frameEnd" in data:
-           frames = "[{startFrame} - {endFrame}]".format(**data)
+           frames = "[{frameStart} - {frameEnd}]".format(**data)
            label = "{} {}".format(label, frames)

        instance = context.create_instance(label)
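The one-line fix matters because `str.format(**data)` raises `KeyError` for any placeholder missing from the dict: the old template referenced `startFrame`/`endFrame` while the collected data uses `frameStart`/`frameEnd`. A quick illustration:

```python
data = {"frameStart": 1001, "frameEnd": 1050}

print("[{frameStart} - {frameEnd}]".format(**data))  # [1001 - 1050]

try:
    print("[{startFrame} - {endFrame}]".format(**data))
except KeyError as exc:
    print("KeyError:", exc)  # KeyError: 'startFrame'
```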
@@ -26,6 +26,12 @@ INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")


def install():
+   from openpype.settings import get_project_settings
+
+   project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+   # process path mapping
+   process_dirmap(project_settings)
+
    pyblish.register_plugin_path(PUBLISH_PATH)
    avalon.register_plugin_path(avalon.Loader, LOAD_PATH)
    avalon.register_plugin_path(avalon.Creator, CREATE_PATH)

@@ -53,6 +59,40 @@ def install():
    avalon.data["familiesStateToggled"] = ["imagesequence"]


+def process_dirmap(project_settings):
+    # type: (dict) -> None
+    """Go through all paths in Settings and set them using `dirmap`.
+
+    Args:
+        project_settings (dict): Settings for current project.
+
+    """
+    if not project_settings["maya"].get("maya-dirmap"):
+        return
+    mapping = project_settings["maya"]["maya-dirmap"]["paths"] or {}
+    mapping_enabled = project_settings["maya"]["maya-dirmap"]["enabled"]
+    if not mapping or not mapping_enabled:
+        return
+    if mapping.get("source-path") and mapping_enabled is True:
+        log.info("Processing directory mapping ...")
+        cmds.dirmap(en=True)
+    for k, sp in enumerate(mapping["source-path"]):
+        try:
+            print("{} -> {}".format(sp, mapping["destination-path"][k]))
+            cmds.dirmap(m=(sp, mapping["destination-path"][k]))
+            cmds.dirmap(m=(mapping["destination-path"][k], sp))
+        except IndexError:
+            # missing corresponding destination path
+            log.error(("invalid dirmap mapping, missing corresponding"
+                       " destination directory."))
+            break
+        except RuntimeError:
+            log.error("invalid path {} -> {}, mapping not registered".format(
+                sp, mapping["destination-path"][k]
+            ))
+            continue
+
+
def uninstall():
    pyblish.deregister_plugin_path(PUBLISH_PATH)
    avalon.deregister_plugin_path(avalon.Loader, LOAD_PATH)
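`process_dirmap` expects the `maya-dirmap` settings entry to pair source and destination lists by index. A sketch of the expected shape and the pairing, with made-up paths and the Maya call replaced by a print:

```python
# Hypothetical settings fragment, shaped like
# project_settings["maya"]["maya-dirmap"]
maya_dirmap = {
    "enabled": True,
    "paths": {
        "source-path": ["P:/projects", "//fileserver/textures"],
        "destination-path": ["/mnt/projects", "/mnt/textures"],
    },
}

if maya_dirmap["enabled"]:
    paths = maya_dirmap["paths"]
    for index, source in enumerate(paths["source-path"]):
        try:
            destination = paths["destination-path"][index]
        except IndexError:
            # missing corresponding destination path
            break
        # inside Maya this would be: cmds.dirmap(m=(source, destination))
        print("{} -> {}".format(source, destination))
```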
openpype/hosts/maya/api/commands.py (new file, 53 lines)

@@ -0,0 +1,53 @@
# -*- coding: utf-8 -*-
"""OpenPype script commands to be used directly in Maya."""


class ToolWindows:

    _windows = {}

    @classmethod
    def get_window(cls, tool):
        """Get widget for specific tool.

        Args:
            tool (str): Name of the tool.

        Returns:
            Stored widget.

        """
        try:
            return cls._windows[tool]
        except KeyError:
            return None

    @classmethod
    def set_window(cls, tool, window):
        """Set widget for the tool.

        Args:
            tool (str): Name of the tool.
            window (QtWidgets.QWidget): Widget

        """
        cls._windows[tool] = window


def edit_shader_definitions():
    from avalon.tools import lib
    from Qt import QtWidgets
    from openpype.hosts.maya.api.shader_definition_editor import (
        ShaderDefinitionsEditor
    )

    top_level_widgets = QtWidgets.QApplication.topLevelWidgets()
    main_window = next(widget for widget in top_level_widgets
                       if widget.objectName() == "MayaWindow")

    with lib.application():
        window = ToolWindows.get_window("shader_definition_editor")
        if not window:
            window = ShaderDefinitionsEditor(parent=main_window)
            ToolWindows.set_window("shader_definition_editor", window)
        window.show()
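`ToolWindows` is a tiny class-level registry that keeps one widget per tool name, so reopening the editor reuses the existing window instead of creating a new one. A self-contained sketch of the caching pattern, with a plain object standing in for the Qt widget:

```python
class ToolWindows:
    _windows = {}

    @classmethod
    def get_window(cls, tool):
        return cls._windows.get(tool)

    @classmethod
    def set_window(cls, tool, window):
        cls._windows[tool] = window


window = ToolWindows.get_window("shader_definition_editor")
if window is None:
    window = object()  # stand-in for the real QtWidgets.QWidget
    ToolWindows.set_window("shader_definition_editor", window)

# A second lookup returns the same cached instance.
assert ToolWindows.get_window("shader_definition_editor") is window
```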
@@ -6,9 +6,9 @@ from avalon.vendor.Qt import QtWidgets, QtGui
from avalon.maya import pipeline
from openpype.api import BuildWorkfile
import maya.cmds as cmds
+from openpype.settings import get_project_settings

self = sys.modules[__name__]
-self._menu = os.environ.get("AVALON_LABEL")

log = logging.getLogger(__name__)

@@ -17,8 +17,11 @@ log = logging.getLogger(__name__)
def _get_menu(menu_name=None):
    """Return the menu instance if it currently exists in Maya"""

+   project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+   _menu = project_settings["maya"]["scriptsmenu"]["name"]
+
    if menu_name is None:
-       menu_name = self._menu
+       menu_name = _menu
    widgets = dict((
        w.objectName(), w) for w in QtWidgets.QApplication.allWidgets())
    menu = widgets.get(menu_name)

@@ -55,35 +58,7 @@ def deferred():
        parent=pipeline._parent
    )

-   # Find the pipeline menu
-   top_menu = _get_menu(pipeline._menu)
-
-   # Try to find workfile tool action in the menu
-   workfile_action = None
-   for action in top_menu.actions():
-       if action.text() == "Work Files":
-           workfile_action = action
-           break
-
-   # Add at the top of menu if "Work Files" action was not found
-   after_action = ""
-   if workfile_action:
-       # Use action's object name for `insertAfter` argument
-       after_action = workfile_action.objectName()
-
-   # Insert action to menu
-   cmds.menuItem(
-       "Work Files",
-       parent=pipeline._menu,
-       command=launch_workfiles_app,
-       insertAfter=after_action
-   )
-
-   # Remove replaced action
-   if workfile_action:
-       top_menu.removeAction(workfile_action)
-
-   log.info("Attempting to install scripts menu..")
+   log.info("Attempting to install scripts menu ...")

    add_build_workfiles_item()
    add_look_assigner_item()

@@ -100,13 +75,18 @@ def deferred():
        return

    # load configuration of custom menu
-   config_path = os.path.join(os.path.dirname(__file__), "menu.json")
-   config = scriptsmenu.load_configuration(config_path)
+   project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+   config = project_settings["maya"]["scriptsmenu"]["definition"]
+   _menu = project_settings["maya"]["scriptsmenu"]["name"]
+
+   if not config:
+       log.warning("Skipping studio menu, no definition found.")
+       return

    # run the launcher for Maya menu
    studio_menu = launchformaya.main(
-       title=self._menu.title(),
-       objectName=self._menu
+       title=_menu.title(),
+       objectName=_menu.title().lower().replace(" ", "_")
    )

    # apply configuration

@@ -116,7 +96,7 @@ def deferred():
def uninstall():
    menu = _get_menu()
    if menu:
-       log.info("Attempting to uninstall..")
+       log.info("Attempting to uninstall ...")

        try:
            menu.deleteLater()

@@ -136,9 +116,8 @@ def install():


def popup():
-   """Pop-up the existing menu near the mouse cursor"""
+   """Pop-up the existing menu near the mouse cursor."""
    menu = _get_menu()

    cursor = QtGui.QCursor()
    point = cursor.pos()
    menu.exec_(point)
openpype/hosts/maya/api/shader_definition_editor.py (new file, 176 lines)

@@ -0,0 +1,176 @@
# -*- coding: utf-8 -*-
"""Editor for shader definitions.

Shader names are stored as a simple text file over GridFS in MongoDB.

"""
import os
from Qt import QtWidgets, QtCore, QtGui
from openpype.lib.mongo import OpenPypeMongoConnection
from openpype import resources
import gridfs


# name of the file used to store definitions
DEFINITION_FILENAME = "{}/maya/shader_definition.txt".format(
    os.getenv("AVALON_PROJECT"))


class ShaderDefinitionsEditor(QtWidgets.QWidget):
    """Widget serving as simple editor for shader name definitions."""

    def __init__(self, parent=None):
        super(ShaderDefinitionsEditor, self).__init__(parent)
        self._mongo = OpenPypeMongoConnection.get_mongo_client()
        self._gridfs = gridfs.GridFS(
            self._mongo[os.getenv("OPENPYPE_DATABASE_NAME")])
        self._editor = None

        self._original_content = self._read_definition_file()

        self.setObjectName("shaderDefinitionEditor")
        self.setWindowTitle("OpenPype shader name definition editor")
        icon = QtGui.QIcon(resources.pype_icon_filepath())
        self.setWindowIcon(icon)
        self.setWindowFlags(QtCore.Qt.Window)
        self.setParent(parent)
        self.setAttribute(QtCore.Qt.WA_DeleteOnClose)
        self.resize(750, 500)

        self._setup_ui()
        self._reload()

    def _setup_ui(self):
        """Setup UI of Widget."""
        layout = QtWidgets.QVBoxLayout(self)
        label = QtWidgets.QLabel()
        label.setText("Put shader names here - one name per line:")
        layout.addWidget(label)
        self._editor = QtWidgets.QPlainTextEdit()
        self._editor.setStyleSheet("border: none;")
        layout.addWidget(self._editor)

        btn_layout = QtWidgets.QHBoxLayout()
        save_btn = QtWidgets.QPushButton("Save")
        save_btn.clicked.connect(self._save)

        reload_btn = QtWidgets.QPushButton("Reload")
        reload_btn.clicked.connect(self._reload)

        exit_btn = QtWidgets.QPushButton("Exit")
        exit_btn.clicked.connect(self._close)

        btn_layout.addWidget(reload_btn)
        btn_layout.addWidget(save_btn)
        btn_layout.addWidget(exit_btn)

        layout.addLayout(btn_layout)

    def _read_definition_file(self, file=None):
        """Read definition file from database.

        Args:
            file (gridfs.grid_file.GridOut, Optional): File to read. If not
                set, new query will be issued to find it.

        Returns:
            str: Content of the file or empty string if file doesn't exist.

        """
        content = ""
        if not file:
            file = self._gridfs.find_one(
                {"filename": DEFINITION_FILENAME})
            if not file:
                print(">>> [SNDE]: nothing in database yet")
                return content
        content = file.read()
        file.close()
        return content

    def _write_definition_file(self, content, force=False):
        """Write content as definition to file in database.

        Before the file is written, a check is made whether its content
        has changed in the meantime. If it has, a warning is issued so
        the user can decide whether to overwrite it. Note: GridFS doesn't
        allow changing file content; you need to delete the existing file
        and create a new one.

        Args:
            content (str): Content to write.

        Raises:
            ContentException: If file is changed in database while
                editor is running.

        """
        file = self._gridfs.find_one(
            {"filename": DEFINITION_FILENAME})
        if file:
            content_check = self._read_definition_file(file)
            if content == content_check:
                print(">>> [SNDE]: content not changed")
                return
            if self._original_content != content_check:
                if not force:
                    raise ContentException("Content changed")
            print(">>> [SNDE]: overwriting data")
            file.close()
            self._gridfs.delete(file._id)

        file = self._gridfs.new_file(
            filename=DEFINITION_FILENAME,
            content_type='text/plain',
            encoding='utf-8')
        file.write(content)
        file.close()
        QtCore.QTimer.singleShot(200, self._reset_style)
        self._editor.setStyleSheet("border: 1px solid #33AF65;")
        self._original_content = content

    def _reset_style(self):
        """Reset editor style back.

        Used to visually indicate save.

        """
        self._editor.setStyleSheet("border: none;")

    def _close(self):
        self.hide()

    def closeEvent(self, event):
        event.ignore()
        self.hide()

    def _reload(self):
        print(">>> [SNDE]: reloading")
        self._set_content(self._read_definition_file())

    def _save(self):
        try:
            self._write_definition_file(content=self._editor.toPlainText())
        except ContentException:
            # content has changed meanwhile
            print(">>> [SNDE]: content has changed")
            self._show_overwrite_warning()

    def _set_content(self, content):
        self._editor.setPlainText(content)

    def _show_overwrite_warning(self):
        reply = QtWidgets.QMessageBox.question(
            self,
            "Warning",
            ("Content you are editing was changed meanwhile in database.\n"
             "Please, reload and solve the conflict."),
            QtWidgets.QMessageBox.OK)

        if reply == QtWidgets.QMessageBox.OK:
            # do nothing
            pass


class ContentException(Exception):
    """Raised during save if the file was changed in the database."""
    pass
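GridFS files are immutable, so the editor's save path is find, delete, recreate. A minimal standalone sketch of that pattern (the connection string and database name are made up):

```python
import gridfs
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # made-up connection
fs = gridfs.GridFS(client["openpype"])  # made-up database name
filename = "demo/maya/shader_definition.txt"

existing = fs.find_one({"filename": filename})
if existing:
    # GridFS cannot update a file in place; drop the old one first
    fs.delete(existing._id)

new_file = fs.new_file(filename=filename,
                       content_type="text/plain",
                       encoding="utf-8")
new_file.write("shader_a\nshader_b\n")
new_file.close()

print(fs.find_one({"filename": filename}).read())
```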
@@ -167,6 +167,8 @@ def get_file_node_path(node):

    if cmds.nodeType(node) == 'aiImage':
        return cmds.getAttr('{0}.filename'.format(node))
+   if cmds.nodeType(node) == 'RedshiftNormalMap':
+       return cmds.getAttr('{}.tex0'.format(node))

    # otherwise use fileTextureName
    return cmds.getAttr('{0}.fileTextureName'.format(node))

@@ -357,6 +359,7 @@ class CollectLook(pyblish.api.InstancePlugin):

        files = cmds.ls(history, type="file", long=True)
        files.extend(cmds.ls(history, type="aiImage", long=True))
+       files.extend(cmds.ls(history, type="RedshiftNormalMap", long=True))

        self.log.info("Collected file nodes:\n{}".format(files))
        # Collect textures if any file nodes are found

@@ -487,7 +490,7 @@ class CollectLook(pyblish.api.InstancePlugin):
        """

        self.log.debug("processing: {}".format(node))
-       if cmds.nodeType(node) not in ["file", "aiImage"]:
+       if cmds.nodeType(node) not in ["file", "aiImage", "RedshiftNormalMap"]:
            self.log.error(
                "Unsupported file node: {}".format(cmds.nodeType(node)))
            raise AssertionError("Unsupported file node")

@@ -500,11 +503,19 @@ class CollectLook(pyblish.api.InstancePlugin):
            self.log.debug("aiImage node")
            attribute = "{}.filename".format(node)
            computed_attribute = attribute
+       elif cmds.nodeType(node) == 'RedshiftNormalMap':
+           self.log.debug("RedshiftNormalMap node")
+           attribute = "{}.tex0".format(node)
+           computed_attribute = attribute

        source = cmds.getAttr(attribute)
        self.log.info(" - file source: {}".format(source))
        color_space_attr = "{}.colorSpace".format(node)
-       color_space = cmds.getAttr(color_space_attr)
+       try:
+           color_space = cmds.getAttr(color_space_attr)
+       except ValueError:
+           # node doesn't have colorspace attribute
+           color_space = "raw"
        # Compare with the computed file path, e.g. the one with the <UDIM>
        # pattern in it, to generate some logging information about this
        # difference
@@ -233,11 +233,14 @@ class ExtractLook(openpype.api.Extractor):
        for filepath in files_metadata:

            linearize = False
-           if do_maketx and files_metadata[filepath]["color_space"] == "sRGB":  # noqa: E501
+           if do_maketx and files_metadata[filepath]["color_space"].lower() == "srgb":  # noqa: E501
                linearize = True
                # set its file node to 'raw' as tx will be linearized
                files_metadata[filepath]["color_space"] = "raw"

            if do_maketx:
                color_space = "raw"

            source, mode, texture_hash = self._process_texture(
                filepath,
                do_maketx,

@@ -280,14 +283,19 @@ class ExtractLook(openpype.api.Extractor):
            # This will also trigger in the same order at end of context to
            # ensure after context it's still the original value.
            color_space_attr = resource["node"] + ".colorSpace"
-           color_space = cmds.getAttr(color_space_attr)
-           if files_metadata[source]["color_space"] == "raw":
-               # set color space to raw if we linearized it
-               color_space = "Raw"
+           try:
+               color_space = cmds.getAttr(color_space_attr)
+           except ValueError:
+               # node doesn't have color space attribute
+               color_space = "raw"
+           else:
+               if files_metadata[source]["color_space"] == "raw":
+                   # set color space to raw if we linearized it
+                   color_space = "raw"
            # Remap file node filename to destination
+           remap[color_space_attr] = color_space
            attr = resource["attribute"]
            remap[attr] = destinations[source]
-           remap[color_space_attr] = color_space

        self.log.info("Finished remapping destinations ...")
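The case-insensitive comparison makes the linearize check robust to textures whose color space is stored as "sRGB", "srgb", or "SRGB". A minimal sketch of the decision, with a made-up metadata dict shaped like `files_metadata`:

```python
# Hypothetical per-file metadata, as collected earlier in the plugin.
files_metadata = {
    "/textures/diffuse.png": {"color_space": "sRGB"},
    "/textures/normal.png": {"color_space": "Raw"},
}

do_maketx = True
for filepath, meta in files_metadata.items():
    # lower() makes the check robust to "sRGB" / "srgb" / "SRGB"
    linearize = do_maketx and meta["color_space"].lower() == "srgb"
    if linearize:
        # the .tx conversion linearizes, so the file node should read "raw"
        meta["color_space"] = "raw"
    print(filepath, linearize)
```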
@@ -133,10 +133,10 @@ class ExtractYetiRig(openpype.api.Extractor):
        image_search_path = resources_dir = instance.data["resourcesDir"]

        settings = instance.data.get("rigsettings", None)
-       if settings:
-           settings["imageSearchPath"] = image_search_path
-           with open(settings_path, "w") as fp:
-               json.dump(settings, fp, ensure_ascii=False)
+       assert settings, "Yeti rig settings were not collected."
+       settings["imageSearchPath"] = image_search_path
+       with open(settings_path, "w") as fp:
+           json.dump(settings, fp, ensure_ascii=False)

        # add textures to transfers
        if 'transfers' not in instance.data:

@@ -192,12 +192,12 @@ class ExtractYetiRig(openpype.api.Extractor):
                'stagingDir': dirname
            }
        )
-       self.log.info("settings file: {}".format(settings))
+       self.log.info("settings file: {}".format(settings_path))
        instance.data["representations"].append(
            {
                'name': 'rigsettings',
                'ext': 'rigsettings',
-               'files': os.path.basename(settings),
+               'files': os.path.basename(settings_path),
                'stagingDir': dirname
            }
        )
@ -1,8 +1,16 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Validate model nodes names."""
|
||||
from maya import cmds
|
||||
import pyblish.api
|
||||
import openpype.api
|
||||
import avalon.api
|
||||
import openpype.hosts.maya.api.action
|
||||
from openpype.hosts.maya.api.shader_definition_editor import (
|
||||
DEFINITION_FILENAME)
|
||||
from openpype.lib.mongo import OpenPypeMongoConnection
|
||||
import gridfs
|
||||
import re
|
||||
import os
|
||||
|
||||
|
||||
class ValidateModelName(pyblish.api.InstancePlugin):
|
||||
|
|
@ -19,18 +27,18 @@ class ValidateModelName(pyblish.api.InstancePlugin):
|
|||
families = ["model"]
|
||||
label = "Model Name"
|
||||
actions = [openpype.hosts.maya.api.action.SelectInvalidAction]
|
||||
# path to shader names definitions
|
||||
# TODO: move it to preset file
|
||||
material_file = None
|
||||
regex = '(.*)_(\\d)*_(.*)_(GEO)'
|
||||
database_file = DEFINITION_FILENAME
|
||||
|
||||
@classmethod
|
||||
def get_invalid(cls, instance):
|
||||
"""Get invalid nodes."""
|
||||
use_db = cls.database
|
||||
|
||||
# find out if supplied transform is group or not
|
||||
def is_group(groupName):
|
||||
def is_group(group_name):
|
||||
"""Find out if supplied transform is group or not."""
|
||||
try:
|
||||
children = cmds.listRelatives(groupName, children=True)
|
||||
children = cmds.listRelatives(group_name, children=True)
|
||||
for child in children:
|
||||
if not cmds.ls(child, transforms=True):
|
||||
return False
|
||||
|
|
@ -44,29 +52,74 @@ class ValidateModelName(pyblish.api.InstancePlugin):
|
|||
cls.log.error("Instance has no nodes!")
|
||||
return True
|
||||
pass
|
||||
|
||||
# validate top level group name
|
||||
assemblies = cmds.ls(content_instance, assemblies=True, long=True)
|
||||
if len(assemblies) != 1:
|
||||
cls.log.error("Must have exactly one top group")
|
||||
return assemblies or True
|
||||
top_group = assemblies[0]
|
||||
regex = cls.top_level_regex
|
||||
r = re.compile(regex)
|
||||
m = r.match(top_group)
|
||||
if m is None:
|
||||
cls.log.error("invalid name on: {}".format(top_group))
|
||||
cls.log.error("name doesn't match regex {}".format(regex))
|
||||
invalid.append(top_group)
|
||||
else:
|
||||
if "asset" in r.groupindex:
|
||||
if m.group("asset") != avalon.api.Session["AVALON_ASSET"]:
|
||||
cls.log.error("Invalid asset name in top level group.")
|
||||
return top_group
|
||||
if "subset" in r.groupindex:
|
||||
if m.group("subset") != instance.data.get("subset"):
|
||||
cls.log.error("Invalid subset name in top level group.")
|
||||
return top_group
|
||||
if "project" in r.groupindex:
|
||||
if m.group("project") != avalon.api.Session["AVALON_PROJECT"]:
|
||||
cls.log.error("Invalid project name in top level group.")
|
||||
return top_group
|
||||
|
||||
descendants = cmds.listRelatives(content_instance,
|
||||
allDescendents=True,
|
||||
fullPath=True) or []
|
||||
|
||||
descendants = cmds.ls(descendants, noIntermediate=True, long=True)
|
||||
trns = cmds.ls(descendants, long=False, type=('transform'))
|
||||
trns = cmds.ls(descendants, long=False, type='transform')
|
||||
|
||||
# filter out groups
|
||||
filter = [node for node in trns if not is_group(node)]
|
||||
filtered = [node for node in trns if not is_group(node)]
|
||||
|
||||
# load shader list file as utf-8
|
||||
if cls.material_file:
|
||||
shader_file = open(cls.material_file, "r")
|
||||
shaders = shader_file.readlines()
|
||||
shaders = []
|
||||
if not use_db:
|
||||
if cls.material_file:
|
||||
if os.path.isfile(cls.material_file):
|
||||
shader_file = open(cls.material_file, "r")
|
||||
shaders = shader_file.readlines()
|
||||
shader_file.close()
|
||||
else:
|
||||
cls.log.error("Missing shader name definition file.")
|
||||
return True
|
||||
else:
|
||||
client = OpenPypeMongoConnection.get_mongo_client()
|
||||
fs = gridfs.GridFS(client[os.getenv("OPENPYPE_DATABASE_NAME")])
|
||||
shader_file = fs.find_one({"filename": cls.database_file})
|
||||
if not shader_file:
|
||||
cls.log.error("Missing shader name definition in database.")
|
||||
return True
|
||||
shaders = shader_file.read().splitlines()
|
||||
shader_file.close()
|
||||
|
||||
# strip line endings from list
|
||||
shaders = map(lambda s: s.rstrip(), shaders)
|
||||
|
||||
# compile regex for testing names
|
||||
r = re.compile(cls.regex)
|
||||
regex = cls.regex
|
||||
r = re.compile(regex)
|
||||
|
||||
for obj in filter:
|
||||
for obj in filtered:
|
||||
cls.log.info("testing: {}".format(obj))
|
||||
m = r.match(obj)
|
||||
if m is None:
|
||||
cls.log.error("invalid name on: {}".format(obj))
|
||||
|
|
@ -74,7 +127,7 @@ class ValidateModelName(pyblish.api.InstancePlugin):
|
|||
else:
|
||||
# if we have shader files and shader named group is in
|
||||
# regex, test this group against names in shader file
|
||||
if 'shader' in r.groupindex and shaders:
|
||||
if "shader" in r.groupindex and shaders:
|
||||
try:
|
||||
if not m.group('shader') in shaders:
|
||||
cls.log.error(
|
||||
|
|
@ -90,8 +143,8 @@ class ValidateModelName(pyblish.api.InstancePlugin):
|
|||
return invalid
|
||||
|
||||
def process(self, instance):
|
||||
|
||||
"""Plugin entry point."""
|
||||
invalid = self.get_invalid(instance)
|
||||
|
||||
if invalid:
|
||||
raise RuntimeError("Model naming is invalid. See log.")
|
||||
raise RuntimeError("Model naming is invalid. See the log.")
|
||||
|
|
|
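For orientation, here is a minimal, runnable sketch of the check this validator performs, using the default `regex` from the diff above; the node name is a made-up example:

```python
import re

# default pattern from the plugin above
regex = '(.*)_(\\d)*_(.*)_(GEO)'

# hypothetical node name, for illustration only
node_name = "chair_1_wood_GEO"

if re.compile(regex).match(node_name) is None:
    print("invalid name on: {}".format(node_name))
else:
    print("name matches")
```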
|||
|
|
@ -113,6 +113,14 @@ def check_inventory_versions():
|
|||
"_id": io.ObjectId(avalon_knob_data["representation"])
|
||||
})
|
||||
|
||||
# Failsafe for not finding the representation.
|
||||
if not representation:
|
||||
log.warning(
|
||||
"Could not find the representation on "
|
||||
"node \"{}\"".format(node.name())
|
||||
)
|
||||
continue
|
||||
|
||||
# Get start frame from version data
|
||||
version = io.find_one({
|
||||
"type": "version",
|
||||
|
|
@ -391,13 +399,14 @@ def create_write_node(name, data, input=None, prenodes=None,
|
|||
if prenodes:
|
||||
for node in prenodes:
|
||||
# get attributes
|
||||
name = node["name"]
|
||||
pre_node_name = node["name"]
|
||||
klass = node["class"]
|
||||
knobs = node["knobs"]
|
||||
dependent = node["dependent"]
|
||||
|
||||
# create node
|
||||
now_node = nuke.createNode(klass, "name {}".format(name))
|
||||
now_node = nuke.createNode(
|
||||
klass, "name {}".format(pre_node_name))
|
||||
now_node.hideControlPanel()
|
||||
|
||||
# add data to knob
|
||||
|
|
@ -476,27 +485,27 @@ def create_write_node(name, data, input=None, prenodes=None,
|
|||
|
||||
linked_knob_names.append("Render")
|
||||
|
||||
for name in linked_knob_names:
|
||||
if "_grp-start_" in name:
|
||||
for _k_name in linked_knob_names:
|
||||
if "_grp-start_" in _k_name:
|
||||
knob = nuke.Tab_Knob(
|
||||
"rnd_attr", "Rendering attributes", nuke.TABBEGINCLOSEDGROUP)
|
||||
GN.addKnob(knob)
|
||||
elif "_grp-end_" in name:
|
||||
elif "_grp-end_" in _k_name:
|
||||
knob = nuke.Tab_Knob(
|
||||
"rnd_attr_end", "Rendering attributes", nuke.TABENDGROUP)
|
||||
GN.addKnob(knob)
|
||||
else:
|
||||
if "___" in name:
|
||||
if "___" in _k_name:
|
||||
# add divider
|
||||
GN.addKnob(nuke.Text_Knob(""))
|
||||
else:
|
||||
# add linked knob by name
|
||||
# add linked knob by _k_name
|
||||
link = nuke.Link_Knob("")
|
||||
link.makeLink(write_node.name(), name)
|
||||
link.setName(name)
|
||||
link.makeLink(write_node.name(), _k_name)
|
||||
link.setName(_k_name)
|
||||
|
||||
# make render
|
||||
if "Render" in name:
|
||||
if "Render" in _k_name:
|
||||
link.setLabel("Render Local")
|
||||
link.setFlag(0x1000)
|
||||
GN.addKnob(link)
|
||||
|
|
@ -1651,9 +1660,13 @@ def find_free_space_to_paste_nodes(
|
|||
def launch_workfiles_app():
|
||||
'''Open the Workfiles tool right after host launch, if enabled.'''
|
||||
# get state from settings
|
||||
open_at_start = get_current_project_settings()["nuke"].get(
|
||||
"general", {}).get("open_workfile_at_start")
|
||||
from openpype.lib import (
|
||||
env_value_to_bool
|
||||
)
|
||||
# get all important settings
|
||||
open_at_start = env_value_to_bool(
|
||||
env_key="OPENPYPE_WORKFILE_TOOL_ON_START",
|
||||
default=None)
|
||||
|
||||
# return if none is defined
|
||||
if not open_at_start:
|
||||
|
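As a hedged illustration of the override above: `env_value_to_bool` lives in `openpype.lib`, and the assumption here is that "1"-like strings evaluate truthy while an unset variable falls back to the default.

```python
import os
from openpype.lib import env_value_to_bool

# assumption: "1" is read as True; unset variable -> default (None)
os.environ["OPENPYPE_WORKFILE_TOOL_ON_START"] = "1"
open_at_start = env_value_to_bool(
    env_key="OPENPYPE_WORKFILE_TOOL_ON_START",
    default=None)

if open_at_start:
    print("Workfiles tool opens right after host launch")
```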
|
@ -1730,3 +1743,68 @@ def process_workfile_builder():
|
|||
log.info("Opening last workfile...")
|
||||
# open workfile
|
||||
open_file(last_workfile_path)
|
||||
|
||||
|
||||
def recreate_instance(origin_node, avalon_data=None):
|
||||
"""Recreate input instance to different data
|
||||
|
||||
Args:
|
||||
origin_node (nuke.Node): Nuke node to be recreating from
|
||||
avalon_data (dict, optional): data to be used in new node avalon_data
|
||||
|
||||
Returns:
|
||||
nuke.Node: newly created node
|
||||
"""
|
||||
knobs_wl = ["render", "publish", "review", "ypos",
|
||||
"use_limit", "first", "last"]
|
||||
# get data from avalon knobs
|
||||
data = anlib.get_avalon_knob_data(
|
||||
origin_node)
|
||||
|
||||
# add input data to avalon data
|
||||
if avalon_data:
|
||||
data.update(avalon_data)
|
||||
|
||||
# capture all node knobs whitelisted in knobs_wl
|
||||
knobs_data = {k: origin_node[k].value()
|
||||
for k in origin_node.knobs()
|
||||
for key in knobs_wl
|
||||
if key in k}
|
||||
|
||||
# get node dependencies
|
||||
inputs = origin_node.dependencies()
|
||||
outputs = origin_node.dependent()
|
||||
|
||||
# remove the node
|
||||
nuke.delete(origin_node)
|
||||
|
||||
# create new node
|
||||
# get appropriate plugin class
|
||||
creator_plugin = None
|
||||
for Creator in api.discover(api.Creator):
|
||||
if Creator.__name__ == data["creator"]:
|
||||
creator_plugin = Creator
|
||||
break
|
||||
|
||||
# create write node with creator
|
||||
new_node_name = data["subset"]
|
||||
new_node = creator_plugin(new_node_name, data["asset"]).process()
|
||||
|
||||
# apply whitelisted knob values to the new node
|
||||
for _k, _v in knobs_data.items():
|
||||
try:
|
||||
print(_k, _v)
|
||||
new_node[_k].setValue(_v)
|
||||
except Exception as e:
|
||||
print(e)
|
||||
|
||||
# connect to original inputs
|
||||
for i, n in enumerate(inputs):
|
||||
new_node.setInput(i, n)
|
||||
|
||||
# connect to outputs
|
||||
if len(outputs) > 0:
|
||||
for dn in outputs:
|
||||
dn.setInput(0, new_node)
|
||||
|
||||
return new_node
|
||||
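A usage sketch for `recreate_instance`, runnable only inside Nuke with this module imported; the node and subset names are hypothetical:

```python
import nuke

# rebuild an existing write instance under a different subset name
origin = nuke.toNode("renderMain")  # hypothetical node
new_node = recreate_instance(
    origin, avalon_data={"subset": "renderCompMain"})
print(new_node.name())
```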
|
|
|
|||
|
|
@ -214,7 +214,7 @@ class LoadEffects(api.Loader):
|
|||
self.log.warning(e)
|
||||
continue
|
||||
|
||||
if isinstance(v, list) and len(v) > 3:
|
||||
if isinstance(v, list) and len(v) > 4:
|
||||
node[k].setAnimated()
|
||||
for i, value in enumerate(v):
|
||||
if isinstance(value, list):
|
||||
|
|
|
|||
|
|
@ -217,7 +217,7 @@ class LoadEffectsInputProcess(api.Loader):
|
|||
self.log.warning(e)
|
||||
continue
|
||||
|
||||
if isinstance(v, list) and len(v) > 3:
|
||||
if isinstance(v, list) and len(v) > 4:
|
||||
node[k].setAnimated()
|
||||
for i, value in enumerate(v):
|
||||
if isinstance(value, list):
|
||||
|
|
@ -239,10 +239,10 @@ class LoadEffectsInputProcess(api.Loader):
|
|||
output = nuke.createNode("Output")
|
||||
output.setInput(0, pre_node)
|
||||
|
||||
# try to place it under Viewer1
|
||||
if not self.connect_active_viewer(GN):
|
||||
nuke.delete(GN)
|
||||
return
|
||||
# # try to place it under Viewer1
|
||||
# if not self.connect_active_viewer(GN):
|
||||
# nuke.delete(GN)
|
||||
# return
|
||||
|
||||
# get all versions in list
|
||||
versions = io.find({
|
||||
|
|
@ -298,7 +298,11 @@ class LoadEffectsInputProcess(api.Loader):
|
|||
viewer["input_process_node"].setValue(group_node_name)
|
||||
|
||||
# put backdrop under
|
||||
lib.create_backdrop(label="Input Process", layer=2, nodes=[viewer, group_node], color="0x7c7faaff")
|
||||
lib.create_backdrop(
|
||||
label="Input Process",
|
||||
layer=2,
|
||||
nodes=[viewer, group_node],
|
||||
color="0x7c7faaff")
|
||||
|
||||
return True
|
||||
|
||||
|
|
|
|||
|
|
@ -259,7 +259,8 @@ class LoadMov(api.Loader):
|
|||
read_node["last"].setValue(last)
|
||||
read_node['frame_mode'].setValue("start at")
|
||||
|
||||
if int(self.first_frame) == int(read_node['frame'].value()):
|
||||
if int(float(self.first_frame)) == int(
|
||||
float(read_node['frame'].value())):
|
||||
# start at workfile start
|
||||
read_node['frame'].setValue(str(self.first_frame))
|
||||
else:
|
||||
|
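The double conversion above guards against frame values that come back as floats or float-like strings; `int()` alone raises on those:

```python
# int("1001.0") raises ValueError, so go through float first
frame = int(float("1001.0"))
assert frame == 1001
```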
|
|
|||
|
|
@ -51,7 +51,6 @@ class ExtractReviewDataLut(openpype.api.Extractor):
|
|||
|
||||
if "render.farm" in families:
|
||||
instance.data["families"].remove("review")
|
||||
instance.data["families"].remove("ftrack")
|
||||
|
||||
self.log.debug(
|
||||
"_ lutPath: {}".format(instance.data["lutPath"]))
|
||||
|
|
|
|||
|
|
@ -45,7 +45,6 @@ class ExtractReviewDataMov(openpype.api.Extractor):
|
|||
|
||||
if "render.farm" in families:
|
||||
instance.data["families"].remove("review")
|
||||
instance.data["families"].remove("ftrack")
|
||||
data = exporter.generate_mov(farm=True)
|
||||
|
||||
self.log.debug(
|
||||
|
|
|
|||
|
|
@ -70,8 +70,9 @@ class PreCollectNukeInstances(pyblish.api.ContextPlugin):
|
|||
review = False
|
||||
if "review" in node.knobs():
|
||||
review = node["review"].value()
|
||||
|
||||
if review:
|
||||
families.append("review")
|
||||
families.append("ftrack")
|
||||
|
||||
# Add all nodes in group instances.
|
||||
if node.Class() == "Group":
|
||||
|
|
@ -81,18 +82,18 @@ class PreCollectNukeInstances(pyblish.api.ContextPlugin):
|
|||
if target == "Use existing frames":
|
||||
# Local rendering
|
||||
self.log.info("flagged for no render")
|
||||
families.append(family)
|
||||
families.append(families_ak.lower())
|
||||
elif target == "Local":
|
||||
# Local rendering
|
||||
self.log.info("flagged for local render")
|
||||
families.append("{}.local".format(family))
|
||||
family = families_ak.lower()
|
||||
elif target == "On farm":
|
||||
# Farm rendering
|
||||
self.log.info("flagged for farm render")
|
||||
instance.data["transfer"] = False
|
||||
families.append("{}.farm".format(family))
|
||||
|
||||
family = families_ak.lower()
|
||||
family = families_ak.lower()
|
||||
|
||||
node.begin()
|
||||
for i in nuke.allNodes():
|
||||
|
|
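To make the branching above easier to follow, this is a simplified, illustrative reduction of the target-to-family mapping; the helper name is an assumption, not the plugin's actual code:

```python
def resolve_families(families_ak, target):
    """Illustrative only: map a render target knob to publish families."""
    family = families_ak.lower()
    families = []
    if target == "Use existing frames":
        families.append(family)            # no render, reuse frames
    elif target == "Local":
        families.append("{}.local".format(family))
    elif target == "On farm":
        families.append("{}.farm".format(family))
    return families

print(resolve_families("Render", "Local"))  # ['render.local']
```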
|
|||
|
|
@ -1,5 +1,4 @@
|
|||
import os
|
||||
|
||||
from avalon import api
|
||||
import pyblish.api
|
||||
import openpype.api
|
||||
from avalon import photoshop
|
||||
|
|
@ -27,12 +26,20 @@ class ValidateInstanceAssetRepair(pyblish.api.Action):
|
|||
for instance in instances:
|
||||
data = stub.read(instance[0])
|
||||
|
||||
data["asset"] = os.environ["AVALON_ASSET"]
|
||||
data["asset"] = api.Session["AVALON_ASSET"]
|
||||
stub.imprint(instance[0], data)
|
||||
|
||||
|
||||
class ValidateInstanceAsset(pyblish.api.InstancePlugin):
|
||||
"""Validate the instance asset is the current asset."""
|
||||
"""Validate the instance asset is the current selected context asset.
|
||||
|
||||
As it might happen that multiple worfiles are opened, switching
|
||||
between them would mess with selected context.
|
||||
In that case outputs might be output under wrong asset!
|
||||
|
||||
Repair action will use Context asset value (from Workfiles or Launcher)
|
||||
Closing and reopening with Workfiles will refresh Context value.
|
||||
"""
|
||||
|
||||
label = "Validate Instance Asset"
|
||||
hosts = ["photoshop"]
|
||||
|
|
@ -41,9 +48,12 @@ class ValidateInstanceAsset(pyblish.api.InstancePlugin):
|
|||
|
||||
def process(self, instance):
|
||||
instance_asset = instance.data["asset"]
|
||||
current_asset = os.environ["AVALON_ASSET"]
|
||||
current_asset = api.Session["AVALON_ASSET"]
|
||||
msg = (
|
||||
"Instance asset is not the same as current asset:"
|
||||
f"\nInstance: {instance_asset}\nCurrent: {current_asset}"
|
||||
f"Instance asset {instance_asset} is not the same "
|
||||
f"as current context {current_asset}. PLEASE DO:\n"
|
||||
f"Repair with 'A' action to use '{current_asset}'.\n"
|
||||
f"If that's not correct value, close workfile and "
|
||||
f"reopen via Workfiles!"
|
||||
)
|
||||
assert instance_asset == current_asset, msg
|
||||
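The validator/repair pair above follows the usual pyblish pattern; a schematic sketch (names are placeholders, not the plugin above):

```python
import pyblish.api


class RepairSomething(pyblish.api.Action):
    label = "Repair"
    on = "failed"  # action is offered only when the plugin failed

    def process(self, context, plugin):
        for result in context.data["results"]:
            if result["plugin"] is plugin and not result["success"]:
                print("repairing", result["instance"])


class ValidateSomething(pyblish.api.InstancePlugin):
    order = pyblish.api.ValidatorOrder
    actions = [RepairSomething]

    def process(self, instance):
        assert instance.data.get("asset"), "Missing asset"
```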
|
|
|
|||
|
|
@ -2,7 +2,7 @@
|
|||
Optional:
|
||||
presets -> extensions (
|
||||
example of use:
|
||||
[".mov", ".mp4"]
|
||||
["mov", "mp4"]
|
||||
)
|
||||
presets -> source_dir (
|
||||
example of use:
|
||||
|
|
@ -11,6 +11,7 @@ Optional:
|
|||
"{root[work]}/{project[name]}/inputs"
|
||||
"./input"
|
||||
"../input"
|
||||
""
|
||||
)
|
||||
"""
|
||||
|
||||
|
|
@ -48,7 +49,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
|
|||
actions = []
|
||||
|
||||
# presets
|
||||
extensions = [".mov", ".mp4"]
|
||||
extensions = ["mov", "mp4"]
|
||||
source_dir = None
|
||||
|
||||
def process(self, instance):
|
||||
|
|
@ -72,7 +73,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
|
|||
video_path = None
|
||||
basename = os.path.splitext(os.path.basename(file_path))[0]
|
||||
|
||||
if self.source_dir:
|
||||
if self.source_dir != "":
|
||||
source_dir = self.source_dir.replace("\\", "/")
|
||||
if ("./" in source_dir) or ("../" in source_dir):
|
||||
# get current working dir
|
||||
|
|
@ -98,7 +99,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
|
|||
if os.path.splitext(f)[0] not in basename:
|
||||
continue
|
||||
# filter out by respected extensions
|
||||
if os.path.splitext(f)[1] not in self.extensions:
|
||||
if os.path.splitext(f)[1][1:] not in self.extensions:
|
||||
continue
|
||||
video_path = os.path.join(
|
||||
staging_dir, f
|
||||
|
|
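The `[1:]` slice above is what reconciles the new dot-less preset style (`["mov", "mp4"]`) with `os.path.splitext`, which keeps the dot:

```python
import os

extensions = ["mov", "mp4"]  # new preset style, no leading dots
ext = os.path.splitext("review.mov")[1]  # ".mov"
assert ext[1:] in extensions             # "mov"
```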
|
|||
|
|
@ -8,7 +8,7 @@ class CollectInstances(pyblish.api.InstancePlugin):
|
|||
"""Collect instances from editorial's OTIO sequence"""
|
||||
|
||||
order = pyblish.api.CollectorOrder + 0.01
|
||||
label = "Collect Instances"
|
||||
label = "Collect Editorial Instances"
|
||||
hosts = ["standalonepublisher"]
|
||||
families = ["editorial"]
|
||||
|
||||
|
|
@ -17,16 +17,12 @@ class CollectInstances(pyblish.api.InstancePlugin):
|
|||
"referenceMain": {
|
||||
"family": "review",
|
||||
"families": ["clip"],
|
||||
"extensions": [".mp4"]
|
||||
"extensions": ["mp4"]
|
||||
},
|
||||
"audioMain": {
|
||||
"family": "audio",
|
||||
"families": ["clip"],
|
||||
"extensions": [".wav"],
|
||||
},
|
||||
"shotMain": {
|
||||
"family": "shot",
|
||||
"families": []
|
||||
"extensions": ["wav"],
|
||||
}
|
||||
}
|
||||
timeline_frame_start = 900000  # standard EDL default (10:00:00:00)
|
||||
|
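The 900000 default corresponds to timecode 10:00:00:00 at 25 fps (the fps value is the assumption behind the comment):

```python
frames = 10 * 60 * 60 * 25  # 10 hours at 25 fps
assert frames == 900000
```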
|
@ -55,7 +51,7 @@ class CollectInstances(pyblish.api.InstancePlugin):
|
|||
fps = plib.get_asset()["data"]["fps"]
|
||||
|
||||
tracks = timeline.each_child(
|
||||
descended_from_type=otio.schema.track.Track
|
||||
descended_from_type=otio.schema.Track
|
||||
)
|
||||
|
||||
# get data from avalon
|
||||
|
|
@ -84,6 +80,9 @@ class CollectInstances(pyblish.api.InstancePlugin):
|
|||
if clip.name is None:
|
||||
continue
|
||||
|
||||
if isinstance(clip, otio.schema.Gap):
|
||||
continue
|
||||
|
||||
# skip all generators, like black empty
|
||||
if isinstance(
|
||||
clip.media_reference,
|
||||
|
|
@ -92,7 +91,7 @@ class CollectInstances(pyblish.api.InstancePlugin):
|
|||
|
||||
# Transitions are ignored, because Clips have the full frame
|
||||
# range.
|
||||
if isinstance(clip, otio.schema.transition.Transition):
|
||||
if isinstance(clip, otio.schema.Transition):
|
||||
continue
|
||||
|
||||
# basic unique asset name
|
||||
|
|
@ -175,7 +174,17 @@ class CollectInstances(pyblish.api.InstancePlugin):
|
|||
data_key: instance.data.get(data_key)})
|
||||
|
||||
# adding subsets to context as instances
|
||||
self.subsets.update({
|
||||
"shotMain": {
|
||||
"family": "shot",
|
||||
"families": []
|
||||
}
|
||||
})
|
||||
for subset, properities in self.subsets.items():
|
||||
version = properities.get("version")
|
||||
if version == 0:
|
||||
properities.pop("version")
|
||||
|
||||
# adding Review-able instance
|
||||
subset_instance_data = instance_data.copy()
|
||||
subset_instance_data.update(properities)
|
||||
|
|
@ -11,7 +11,7 @@ class CollectInstanceResources(pyblish.api.InstancePlugin):
|
|||
|
||||
# must be after `CollectInstances`
|
||||
order = pyblish.api.CollectorOrder + 0.011
|
||||
label = "Collect Instance Resources"
|
||||
label = "Collect Editorial Resources"
|
||||
hosts = ["standalonepublisher"]
|
||||
families = ["clip"]
|
||||
|
||||
|
|
@ -177,19 +177,23 @@ class CollectInstanceResources(pyblish.api.InstancePlugin):
|
|||
collection_head_name = None
|
||||
# loop through collections and create representations
|
||||
for _collection in collections:
|
||||
ext = _collection.tail
|
||||
ext = _collection.tail[1:]
|
||||
collection_head_name = _collection.head
|
||||
frame_start = list(_collection.indexes)[0]
|
||||
frame_end = list(_collection.indexes)[-1]
|
||||
repre_data = {
|
||||
"frameStart": frame_start,
|
||||
"frameEnd": frame_end,
|
||||
"name": ext[1:],
|
||||
"ext": ext[1:],
|
||||
"name": ext,
|
||||
"ext": ext,
|
||||
"files": [item for item in _collection],
|
||||
"stagingDir": staging_dir
|
||||
}
|
||||
|
||||
if instance_data.get("keepSequence"):
|
||||
repre_data_keep = deepcopy(repre_data)
|
||||
instance_data["representations"].append(repre_data_keep)
|
||||
|
||||
if "review" in instance_data["families"]:
|
||||
repre_data.update({
|
||||
"thumbnail": True,
|
||||
|
|
@ -208,20 +212,20 @@ class CollectInstanceResources(pyblish.api.InstancePlugin):
|
|||
|
||||
# loop through remainders and create representations
|
||||
for _reminding_file in remainder:
|
||||
ext = os.path.splitext(_reminding_file)[-1]
|
||||
ext = os.path.splitext(_reminding_file)[-1][1:]
|
||||
if ext not in instance_data["extensions"]:
|
||||
continue
|
||||
if collection_head_name and (
|
||||
(collection_head_name + ext[1:]) not in _reminding_file
|
||||
) and (ext in [".mp4", ".mov"]):
|
||||
(collection_head_name + ext) not in _reminding_file
|
||||
) and (ext in ["mp4", "mov"]):
|
||||
self.log.info(f"Skipping file: {_reminding_file}")
|
||||
continue
|
||||
frame_start = 1
|
||||
frame_end = 1
|
||||
|
||||
repre_data = {
|
||||
"name": ext[1:],
|
||||
"ext": ext[1:],
|
||||
"name": ext,
|
||||
"ext": ext,
|
||||
"files": _reminding_file,
|
||||
"stagingDir": staging_dir
|
||||
}
|
||||
|
|
@ -37,7 +37,7 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):
|
|||
|
||||
# return if any
|
||||
if entity_type:
|
||||
return {"entityType": entity_type, "entityName": value}
|
||||
return {"entity_type": entity_type, "entity_name": value}
|
||||
|
||||
def rename_with_hierarchy(self, instance):
|
||||
search_text = ""
|
||||
|
|
@ -76,8 +76,8 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):
|
|||
# add current selection context hierarchy from standalonepublisher
|
||||
for entity in reversed(visual_hierarchy):
|
||||
parents.append({
|
||||
"entityType": entity["data"]["entityType"],
|
||||
"entityName": entity["name"]
|
||||
"entity_type": entity["data"]["entityType"],
|
||||
"entity_name": entity["name"]
|
||||
})
|
||||
|
||||
if self.shot_add_hierarchy:
|
||||
|
|
@ -98,7 +98,7 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):
|
|||
|
||||
# in case SP context is set to the same folder
|
||||
if (_index == 0) and ("folder" in parent_key) \
|
||||
and (parents[-1]["entityName"] == parent_filled):
|
||||
and (parents[-1]["entity_name"] == parent_filled):
|
||||
self.log.debug(f" skiping : {parent_filled}")
|
||||
continue
|
||||
|
||||
|
|
@ -131,20 +131,21 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):
|
|||
tasks_to_add = dict()
|
||||
project_tasks = io.find_one({"type": "project"})["config"]["tasks"]
|
||||
for task_name, task_data in self.shot_add_tasks.items():
|
||||
try:
|
||||
if task_data["type"] in project_tasks.keys():
|
||||
tasks_to_add.update({task_name: task_data})
|
||||
else:
|
||||
raise KeyError(
|
||||
"Wrong FtrackTaskType `{}` for `{}` is not"
|
||||
" existing in `{}``".format(
|
||||
task_data["type"],
|
||||
task_name,
|
||||
list(project_tasks.keys())))
|
||||
except KeyError as error:
|
||||
_task_data = deepcopy(task_data)
|
||||
|
||||
# fixing enumerator from settings
|
||||
_task_data["type"] = task_data["type"][0]
|
||||
|
||||
# check if task type in project task types
|
||||
if _task_data["type"] in project_tasks.keys():
|
||||
tasks_to_add.update({task_name: _task_data})
|
||||
else:
|
||||
raise KeyError(
|
||||
"Wrong presets: `{0}`".format(error)
|
||||
)
|
||||
"Wrong FtrackTaskType `{}` for `{}` is not"
|
||||
" existing in `{}``".format(
|
||||
_task_data["type"],
|
||||
task_name,
|
||||
list(project_tasks.keys())))
|
||||
|
||||
instance.data["tasks"] = tasks_to_add
|
||||
else:
|
||||
|
|
@ -279,9 +280,9 @@ class CollectHierarchyContext(pyblish.api.ContextPlugin):
|
|||
|
||||
for parent in reversed(parents):
|
||||
next_dict = {}
|
||||
parent_name = parent["entityName"]
|
||||
parent_name = parent["entity_name"]
|
||||
next_dict[parent_name] = {}
|
||||
next_dict[parent_name]["entity_type"] = parent["entityType"]
|
||||
next_dict[parent_name]["entity_type"] = parent["entity_type"]
|
||||
next_dict[parent_name]["childs"] = actual
|
||||
actual = next_dict
|
||||
|
||||
|
|
|
|||
|
|
@ -0,0 +1,457 @@
|
|||
import os
|
||||
import re
|
||||
import pyblish.api
|
||||
import json
|
||||
|
||||
from avalon.api import format_template_with_optional_keys
|
||||
|
||||
from openpype.lib import prepare_template_data
|
||||
|
||||
|
||||
class CollectTextures(pyblish.api.ContextPlugin):
|
||||
"""Collect workfile (and its resource_files) and textures.
|
||||
|
||||
Currently implements use case with Mari and Substance Painter, where
|
||||
one workfile is main (.mra - Mari) with possible additional workfiles
|
||||
(.spp - Substance)
|
||||
|
||||
|
||||
Provides:
|
||||
1 instance per workfile (with 'resources' filled if needed)
|
||||
(workfile family)
|
||||
1 instance per group of textures
|
||||
(textures family)
|
||||
"""
|
||||
|
||||
order = pyblish.api.CollectorOrder
|
||||
label = "Collect Textures"
|
||||
hosts = ["standalonepublisher"]
|
||||
families = ["texture_batch"]
|
||||
actions = []
|
||||
|
||||
# from presets
|
||||
main_workfile_extensions = ['mra']
|
||||
other_workfile_extensions = ['spp', 'psd']
|
||||
texture_extensions = ["exr", "dpx", "jpg", "jpeg", "png", "tiff", "tga",
|
||||
"gif", "svg"]
|
||||
|
||||
# additional families (ftrack etc.)
|
||||
workfile_families = []
|
||||
textures_families = []
|
||||
|
||||
color_space = ["linsRGB", "raw", "acesg"]
|
||||
|
||||
# currently implemented placeholders ["color_space"]
|
||||
# describing patterns in file names split into regex groups
|
||||
input_naming_patterns = {
|
||||
# workfile: corridorMain_v001.mra >
|
||||
# texture: corridorMain_aluminiumID_v001_baseColor_linsRGB_1001.exr
|
||||
"workfile": r'^([^.]+)(_[^_.]*)?_v([0-9]{3,}).+',
|
||||
"textures": r'^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_({color_space})_(1[0-9]{3}).+', # noqa
|
||||
}
|
||||
# matching regex group position to 'input_naming_patterns'
|
||||
input_naming_groups = {
|
||||
"workfile": ('asset', 'filler', 'version'),
|
||||
"textures": ('asset', 'shader', 'version', 'channel', 'color_space',
|
||||
'udim')
|
||||
}
|
||||
|
||||
workfile_subset_template = "textures{Subset}Workfile"
|
||||
# implemented keys: ["color_space", "channel", "subset", "shader"]
|
||||
texture_subset_template = "textures{Subset}_{Shader}_{Channel}"
|
||||
|
||||
def process(self, context):
|
||||
self.context = context
|
||||
|
||||
resource_files = {}
|
||||
workfile_files = {}
|
||||
representations = {}
|
||||
version_data = {}
|
||||
asset_builds = set()
|
||||
asset = None
|
||||
for instance in context:
|
||||
if not self.input_naming_patterns:
|
||||
raise ValueError("Naming patterns are not configured. \n"
|
||||
"Ask admin to provide naming conventions "
|
||||
"for workfiles and textures.")
|
||||
|
||||
if not asset:
|
||||
asset = instance.data["asset"] # selected from SP
|
||||
|
||||
parsed_subset = instance.data["subset"].replace(
|
||||
instance.data["family"], '')
|
||||
|
||||
fill_pairs = {
|
||||
"subset": parsed_subset
|
||||
}
|
||||
|
||||
fill_pairs = prepare_template_data(fill_pairs)
|
||||
workfile_subset = format_template_with_optional_keys(
|
||||
fill_pairs, self.workfile_subset_template)
|
||||
|
||||
processed_instance = False
|
||||
for repre in instance.data["representations"]:
|
||||
ext = repre["ext"].replace('.', '')
|
||||
asset_build = version = None
|
||||
|
||||
if isinstance(repre["files"], list):
|
||||
repre_file = repre["files"][0]
|
||||
else:
|
||||
repre_file = repre["files"]
|
||||
|
||||
if ext in self.main_workfile_extensions or \
|
||||
ext in self.other_workfile_extensions:
|
||||
|
||||
asset_build = self._get_asset_build(
|
||||
repre_file,
|
||||
self.input_naming_patterns["workfile"],
|
||||
self.input_naming_groups["workfile"],
|
||||
self.color_space
|
||||
)
|
||||
version = self._get_version(
|
||||
repre_file,
|
||||
self.input_naming_patterns["workfile"],
|
||||
self.input_naming_groups["workfile"],
|
||||
self.color_space
|
||||
)
|
||||
asset_builds.add((asset_build, version,
|
||||
workfile_subset, 'workfile'))
|
||||
processed_instance = True
|
||||
|
||||
if not representations.get(workfile_subset):
|
||||
representations[workfile_subset] = []
|
||||
|
||||
if ext in self.main_workfile_extensions:
|
||||
# workfiles can have only single representation
|
||||
# currently OP is not supporting different extensions in
|
||||
# representation files
|
||||
representations[workfile_subset] = [repre]
|
||||
|
||||
workfile_files[asset_build] = repre_file
|
||||
|
||||
if ext in self.other_workfile_extensions:
|
||||
# add only if not added already from main
|
||||
if not representations.get(workfile_subset):
|
||||
representations[workfile_subset] = [repre]
|
||||
|
||||
# only overwrite if not present
|
||||
if not workfile_files.get(asset_build):
|
||||
workfile_files[asset_build] = repre_file
|
||||
|
||||
if not resource_files.get(workfile_subset):
|
||||
resource_files[workfile_subset] = []
|
||||
item = {
|
||||
"files": [os.path.join(repre["stagingDir"],
|
||||
repre["files"])],
|
||||
"source": "standalone publisher"
|
||||
}
|
||||
resource_files[workfile_subset].append(item)
|
||||
|
||||
if ext in self.texture_extensions:
|
||||
c_space = self._get_color_space(
|
||||
repre_file,
|
||||
self.color_space
|
||||
)
|
||||
|
||||
channel = self._get_channel_name(
|
||||
repre_file,
|
||||
self.input_naming_patterns["textures"],
|
||||
self.input_naming_groups["textures"],
|
||||
self.color_space
|
||||
)
|
||||
|
||||
shader = self._get_shader_name(
|
||||
repre_file,
|
||||
self.input_naming_patterns["textures"],
|
||||
self.input_naming_groups["textures"],
|
||||
self.color_space
|
||||
)
|
||||
|
||||
formatting_data = {
|
||||
"color_space": c_space or '', # None throws exception
|
||||
"channel": channel or '',
|
||||
"shader": shader or '',
|
||||
"subset": parsed_subset or ''
|
||||
}
|
||||
|
||||
fill_pairs = prepare_template_data(formatting_data)
|
||||
subset = format_template_with_optional_keys(
|
||||
fill_pairs, self.texture_subset_template)
|
||||
|
||||
asset_build = self._get_asset_build(
|
||||
repre_file,
|
||||
self.input_naming_patterns["textures"],
|
||||
self.input_naming_groups["textures"],
|
||||
self.color_space
|
||||
)
|
||||
version = self._get_version(
|
||||
repre_file,
|
||||
self.input_naming_patterns["textures"],
|
||||
self.input_naming_groups["textures"],
|
||||
self.color_space
|
||||
)
|
||||
if not representations.get(subset):
|
||||
representations[subset] = []
|
||||
representations[subset].append(repre)
|
||||
|
||||
ver_data = {
|
||||
"color_space": c_space or '',
|
||||
"channel_name": channel or '',
|
||||
"shader_name": shader or ''
|
||||
}
|
||||
version_data[subset] = ver_data
|
||||
|
||||
asset_builds.add(
|
||||
(asset_build, version, subset, "textures"))
|
||||
processed_instance = True
|
||||
|
||||
if processed_instance:
|
||||
self.context.remove(instance)
|
||||
|
||||
self._create_new_instances(context,
|
||||
asset,
|
||||
asset_builds,
|
||||
resource_files,
|
||||
representations,
|
||||
version_data,
|
||||
workfile_files)
|
||||
|
||||
def _create_new_instances(self, context, asset, asset_builds,
|
||||
resource_files, representations,
|
||||
version_data, workfile_files):
|
||||
"""Prepare new instances from collected data.
|
||||
|
||||
Args:
|
||||
context (ContextPlugin)
|
||||
asset (string): selected asset from SP
|
||||
asset_builds (set) of tuples
|
||||
(asset_build, version, subset, family)
|
||||
resource_files (list) of resource dicts - to store additional
|
||||
files to main workfile
|
||||
representations (list) of dicts - to store workfile info OR
|
||||
all collected texture files, key is asset_build
|
||||
version_data (dict) - prepared to store into version doc in DB
|
||||
workfile_files (dict) - to store workfile to add to textures
|
||||
key is asset_build
|
||||
"""
|
||||
# sort workfile first
|
||||
asset_builds = sorted(asset_builds,
|
||||
key=lambda tup: tup[3], reverse=True)
|
||||
|
||||
# workfile must have version, textures might
|
||||
main_version = None
|
||||
for asset_build, version, subset, family in asset_builds:
|
||||
if not main_version:
|
||||
main_version = version
|
||||
new_instance = context.create_instance(subset)
|
||||
new_instance.data.update(
|
||||
{
|
||||
"subset": subset,
|
||||
"asset": asset,
|
||||
"label": subset,
|
||||
"name": subset,
|
||||
"family": family,
|
||||
"version": int(version or main_version or 1),
|
||||
"asset_build": asset_build # remove in validator
|
||||
}
|
||||
)
|
||||
|
||||
workfile = workfile_files.get(asset_build)
|
||||
|
||||
if resource_files.get(subset):
|
||||
# add resources only when workfile is main style
|
||||
for ext in self.main_workfile_extensions:
|
||||
if ext in workfile:
|
||||
new_instance.data.update({
|
||||
"resources": resource_files.get(subset)
|
||||
})
|
||||
break
|
||||
|
||||
# store origin
|
||||
if family == 'workfile':
|
||||
families = self.workfile_families
|
||||
families.append("texture_batch_workfile")
|
||||
|
||||
new_instance.data["source"] = "standalone publisher"
|
||||
else:
|
||||
families = self.textures_families
|
||||
|
||||
repre = representations.get(subset)[0]
|
||||
new_instance.context.data["currentFile"] = os.path.join(
|
||||
repre["stagingDir"], workfile or 'dummy.txt')
|
||||
|
||||
new_instance.data["families"] = families
|
||||
|
||||
# add data for version document
|
||||
ver_data = version_data.get(subset)
|
||||
if ver_data:
|
||||
if workfile:
|
||||
ver_data['workfile'] = workfile
|
||||
|
||||
new_instance.data.update(
|
||||
{"versionData": ver_data}
|
||||
)
|
||||
|
||||
upd_representations = representations.get(subset)
|
||||
if upd_representations and family != 'workfile':
|
||||
upd_representations = self._update_representations(
|
||||
upd_representations)
|
||||
|
||||
new_instance.data["representations"] = upd_representations
|
||||
|
||||
self.log.debug("new instance - {}:: {}".format(
|
||||
family,
|
||||
json.dumps(new_instance.data, indent=4)))
|
||||
|
||||
def _get_asset_build(self, name,
|
||||
input_naming_patterns, input_naming_groups,
|
||||
color_spaces):
|
||||
"""Loops through configured workfile patterns to find asset name.
|
||||
|
||||
Asset name used to bind workfile and its textures.
|
||||
|
||||
Args:
|
||||
name (str): workfile name
|
||||
input_naming_patterns (list):
|
||||
[workfile_pattern] or [texture_pattern]
|
||||
input_naming_groups (list)
|
||||
ordinal position of regex groups matching to input_naming..
|
||||
color_spaces (list) - predefined color spaces
|
||||
"""
|
||||
asset_name = "NOT_AVAIL"
|
||||
|
||||
return self._parse(name, input_naming_patterns, input_naming_groups,
|
||||
color_spaces, 'asset') or asset_name
|
||||
|
||||
def _get_version(self, name, input_naming_patterns, input_naming_groups,
|
||||
color_spaces):
|
||||
found = self._parse(name, input_naming_patterns, input_naming_groups,
|
||||
color_spaces, 'version')
|
||||
|
||||
if found:
|
||||
return found.replace('v', '')
|
||||
|
||||
self.log.info("No version found in the name {}".format(name))
|
||||
|
||||
def _get_udim(self, name, input_naming_patterns, input_naming_groups,
|
||||
color_spaces):
|
||||
"""Parses from 'name' udim value."""
|
||||
found = self._parse(name, input_naming_patterns, input_naming_groups,
|
||||
color_spaces, 'udim')
|
||||
if found:
|
||||
return found
|
||||
|
||||
self.log.warning("Didn't find UDIM in {}".format(name))
|
||||
|
||||
def _get_color_space(self, name, color_spaces):
|
||||
"""Looks for color_space from a list in a file name.
|
||||
|
||||
Color space seems not to be recognizable by regex pattern, set of
|
||||
known space spaces must be provided.
|
||||
"""
|
||||
color_space = None
|
||||
found = [cs for cs in color_spaces if
|
||||
re.search("_{}_".format(cs), name)]
|
||||
|
||||
if not found:
|
||||
self.log.warning("No color space found in {}".format(name))
|
||||
else:
|
||||
if len(found) > 1:
|
||||
msg = "Multiple color spaces found in {}->{}".format(name,
|
||||
found)
|
||||
self.log.warning(msg)
|
||||
|
||||
color_space = found[0]
|
||||
|
||||
return color_space
|
||||
|
||||
def _get_shader_name(self, name, input_naming_patterns,
|
||||
input_naming_groups, color_spaces):
|
||||
"""Return parsed shader name.
|
||||
|
||||
Shader name is needed for overlapping udims (eg. udims might be
|
||||
used for different materials, shader needed to not overwrite).
|
||||
|
||||
Unknown format of channel name and color spaces >> cs are known
|
||||
list - 'color_space' used as a placeholder
|
||||
"""
|
||||
found = self._parse(name, input_naming_patterns, input_naming_groups,
|
||||
color_spaces, 'shader')
|
||||
if found:
|
||||
return found
|
||||
|
||||
self.log.warning("Didn't find shader in {}".format(name))
|
||||
|
||||
def _get_channel_name(self, name, input_naming_patterns,
|
||||
input_naming_groups, color_spaces):
|
||||
"""Return parsed channel name.
|
||||
|
||||
Unknown format of channel name and color spaces >> cs are known
|
||||
list - 'color_space' used as a placeholder
|
||||
"""
|
||||
found = self._parse(name, input_naming_patterns, input_naming_groups,
|
||||
color_spaces, 'channel')
|
||||
if found:
|
||||
return found
|
||||
|
||||
self.log.warning("Didn't find channel in {}".format(name))
|
||||
|
||||
def _parse(self, name, input_naming_patterns, input_naming_groups,
|
||||
color_spaces, key):
|
||||
"""Universal way to parse 'name' with configurable regex groups.
|
||||
|
||||
Args:
    name (str): workfile name
    input_naming_patterns (list):
        [workfile_pattern] or [texture_pattern]
    input_naming_groups (list): ordinal positions of the regex
        groups matching 'input_naming_patterns'
    color_spaces (list): predefined color spaces
    key (str): which parsed group to return

Raises:
    ValueError: if 'input_naming_groups' is missing the key
"""
|
||||
for input_pattern in input_naming_patterns:
|
||||
for cs in color_spaces:
|
||||
pattern = input_pattern.replace('{color_space}', cs)
|
||||
regex_result = re.findall(pattern, name)
|
||||
if regex_result:
|
||||
idx = list(input_naming_groups).index(key)
|
||||
if idx < 0:
|
||||
msg = "input_naming_groups must " +\
|
||||
"have '{}' key".format(key)
|
||||
raise ValueError(msg)
|
||||
|
||||
try:
|
||||
parsed_value = regex_result[0][idx]
|
||||
return parsed_value
|
||||
except IndexError:
|
||||
self.log.warning("Wrong index, probably "
|
||||
"wrong name {}".format(name))
|
||||
|
||||
def _update_representations(self, upd_representations):
|
||||
"""Frames dont have sense for textures, add collected udims instead."""
|
||||
udims = []
|
||||
for repre in upd_representations:
|
||||
repre.pop("frameStart", None)
|
||||
repre.pop("frameEnd", None)
|
||||
repre.pop("fps", None)
|
||||
|
||||
# ignore unique name from SP, use extension instead
|
||||
# SP enforces unique name, here different subsets >> unique repres
|
||||
repre["name"] = repre["ext"].replace('.', '')
|
||||
|
||||
files = repre.get("files", [])
|
||||
if not isinstance(files, list):
|
||||
files = [files]
|
||||
|
||||
for file_name in files:
|
||||
udim = self._get_udim(file_name,
|
||||
self.input_naming_patterns["textures"],
|
||||
self.input_naming_groups["textures"],
|
||||
self.color_space)
|
||||
udims.append(udim)
|
||||
|
||||
repre["udim"] = udims # must be this way, used for filling path
|
||||
|
||||
return upd_representations
|
||||
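As a sanity check, the documented pattern really does split the example file name from the docstring above into the configured groups; this is runnable as-is:

```python
import re

# pattern, group names and example taken from the collector above
pattern = (r'^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)'
           r'_({color_space})_(1[0-9]{3}).+')
groups = ('asset', 'shader', 'version', 'channel', 'color_space', 'udim')

name = "corridorMain_aluminiumID_v001_baseColor_linsRGB_1001.exr"
result = re.findall(pattern.replace("{color_space}", "linsRGB"), name)
print(dict(zip(groups, result[0])))
# {'asset': 'corridorMain', 'shader': 'aluminiumID', 'version': '001',
#  'channel': 'baseColor', 'color_space': 'linsRGB', 'udim': '1001'}
```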
|
|
@ -0,0 +1,42 @@
|
|||
import os
|
||||
import pyblish.api
|
||||
|
||||
|
||||
class ExtractResources(pyblish.api.InstancePlugin):
|
||||
"""
|
||||
Extracts files from instance.data["resources"].
|
||||
|
||||
These files are additional (textures etc.), currently not stored in
|
||||
representations!
|
||||
|
||||
Expects collected 'resourcesDir'. (list of dicts with 'files' key and
|
||||
list of source urls)
|
||||
|
||||
Provides filled 'transfers' (list of tuples (source_url, target_url))
|
||||
"""
|
||||
|
||||
label = "Extract Resources SP"
|
||||
hosts = ["standalonepublisher"]
|
||||
order = pyblish.api.ExtractorOrder
|
||||
|
||||
families = ["workfile"]
|
||||
|
||||
def process(self, instance):
|
||||
if not instance.data.get("resources"):
|
||||
self.log.info("No resources")
|
||||
return
|
||||
|
||||
if not instance.data.get("transfers"):
|
||||
instance.data["transfers"] = []
|
||||
|
||||
publish_dir = instance.data["resourcesDir"]
|
||||
|
||||
transfers = []
|
||||
for resource in instance.data["resources"]:
|
||||
for file_url in resource.get("files", []):
|
||||
file_name = os.path.basename(file_url)
|
||||
dest_url = os.path.join(publish_dir, file_name)
|
||||
transfers.append((file_url, dest_url))
|
||||
|
||||
self.log.info("transfers:: {}".format(transfers))
|
||||
instance.data["transfers"].extend(transfers)
|
||||
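'transfers' is a list of `(source_url, target_url)` pairs that a later integration step copies; schematically (paths are made up and must exist for the copy to succeed):

```python
import shutil

transfers = [
    ("/work/textures/wood_1001.exr",
     "/publish/resources/wood_1001.exr"),
]
for source_url, target_url in transfers:
    shutil.copy(source_url, target_url)  # what the integrator does later
```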
|
|
@ -60,7 +60,7 @@ class ExtractTrimVideoAudio(openpype.api.Extractor):
|
|||
]
|
||||
|
||||
args = [
|
||||
ffmpeg_path,
|
||||
f"\"{ffmpeg_path}\"",
|
||||
"-ss", str(start / fps),
|
||||
"-i", f"\"{video_file_path}\"",
|
||||
"-t", str(dur / fps)
|
||||
|
|
|
|||
|
|
@ -0,0 +1,43 @@
|
|||
import os
|
||||
import pyblish.api
|
||||
|
||||
|
||||
class ExtractWorkfileUrl(pyblish.api.ContextPlugin):
|
||||
"""
|
||||
Modifies 'workfile' field to contain link to published workfile.
|
||||
|
||||
Expects that batch contains only single workfile and matching
|
||||
(multiple) textures.
|
||||
"""
|
||||
|
||||
label = "Extract Workfile Url SP"
|
||||
hosts = ["standalonepublisher"]
|
||||
order = pyblish.api.ExtractorOrder
|
||||
|
||||
families = ["textures"]
|
||||
|
||||
def process(self, context):
|
||||
filepath = None
|
||||
|
||||
# first loop for workfile
|
||||
for instance in context:
|
||||
if instance.data["family"] == 'workfile':
|
||||
anatomy = context.data['anatomy']
|
||||
template_data = instance.data.get("anatomyData")
|
||||
rep_name = instance.data.get("representations")[0].get("name")
|
||||
template_data["representation"] = rep_name
|
||||
template_data["ext"] = rep_name
|
||||
anatomy_filled = anatomy.format(template_data)
|
||||
template_filled = anatomy_filled["publish"]["path"]
|
||||
filepath = os.path.normpath(template_filled)
|
||||
self.log.info("Using published scene for render {}".format(
|
||||
filepath))
|
||||
|
||||
if not filepath:
|
||||
self.log.info("Texture batch doesn't contain workfile.")
|
||||
return
|
||||
|
||||
# then apply to all textures
|
||||
for instance in context:
|
||||
if instance.data["family"] == 'textures':
|
||||
instance.data["versionData"]["workfile"] = filepath
|
||||
|
|
@ -0,0 +1,22 @@
|
|||
import pyblish.api
|
||||
import openpype.api
|
||||
|
||||
|
||||
class ValidateTextureBatch(pyblish.api.InstancePlugin):
|
||||
"""Validates that some texture files are present."""
|
||||
|
||||
label = "Validate Texture Presence"
|
||||
hosts = ["standalonepublisher"]
|
||||
order = openpype.api.ValidateContentsOrder
|
||||
families = ["texture_batch_workfile"]
|
||||
optional = False
|
||||
|
||||
def process(self, instance):
    present = False
    for context_instance in instance.context:
        if context_instance.data["family"] == "textures":
            self.log.info("Some textures present.")
            present = True
            break

    assert present, "No textures found in published batch!"
|
||||
|
|
@ -0,0 +1,20 @@
|
|||
import pyblish.api
|
||||
import openpype.api
|
||||
|
||||
|
||||
class ValidateTextureHasWorkfile(pyblish.api.InstancePlugin):
|
||||
"""Validates that textures have appropriate workfile attached.
|
||||
|
||||
Workfile is optional, disable this Validator after Refresh if you are
|
||||
sure it is not needed.
|
||||
"""
|
||||
label = "Validate Texture Has Workfile"
|
||||
hosts = ["standalonepublisher"]
|
||||
order = openpype.api.ValidateContentsOrder
|
||||
families = ["textures"]
|
||||
optional = True
|
||||
|
||||
def process(self, instance):
|
||||
wfile = instance.data["versionData"].get("workfile")
|
||||
|
||||
assert wfile, "Textures are missing attached workfile"
|
||||
|
|
@ -0,0 +1,50 @@
|
|||
import pyblish.api
|
||||
import openpype.api
|
||||
|
||||
|
||||
class ValidateTextureBatchNaming(pyblish.api.InstancePlugin):
|
||||
"""Validates that all instances had properly formatted name."""
|
||||
|
||||
label = "Validate Texture Batch Naming"
|
||||
hosts = ["standalonepublisher"]
|
||||
order = openpype.api.ValidateContentsOrder
|
||||
families = ["texture_batch_workfile", "textures"]
|
||||
optional = False
|
||||
|
||||
def process(self, instance):
|
||||
file_name = instance.data["representations"][0]["files"]
|
||||
if isinstance(file_name, list):
|
||||
file_name = file_name[0]
|
||||
|
||||
msg = "Couldnt find asset name in '{}'\n".format(file_name) + \
|
||||
"File name doesn't follow configured pattern.\n" + \
|
||||
"Please rename the file."
|
||||
assert "NOT_AVAIL" not in instance.data["asset_build"], msg
|
||||
|
||||
instance.data.pop("asset_build")
|
||||
|
||||
if instance.data["family"] == "textures":
|
||||
file_name = instance.data["representations"][0]["files"][0]
|
||||
self._check_proper_collected(instance.data["versionData"],
|
||||
file_name)
|
||||
|
||||
def _check_proper_collected(self, versionData, file_name):
|
||||
"""
|
||||
Loop through collected versionData to check if name parsing was OK.
|
||||
Args:
|
||||
versionData: (dict)
|
||||
|
||||
Returns:
|
||||
raises AssertionException
|
||||
"""
|
||||
missing_key_values = []
|
||||
for key, value in versionData.items():
|
||||
if not value:
|
||||
missing_key_values.append(key)
|
||||
|
||||
msg = "Collected data {} doesn't contain values for {}".format(
|
||||
versionData, missing_key_values) + "\n" + \
|
||||
"Name of the texture file doesn't match expected pattern.\n" + \
|
||||
"Please rename file(s) {}".format(file_name)
|
||||
|
||||
assert not missing_key_values, msg
|
||||
|
|
@ -0,0 +1,38 @@
|
|||
import pyblish.api
|
||||
import openpype.api
|
||||
|
||||
|
||||
class ValidateTextureBatchVersions(pyblish.api.InstancePlugin):
|
||||
"""Validates that versions match in workfile and textures.
|
||||
|
||||
Workfile is optional, so if you are sure, you can disable this
|
||||
validator after Refresh.
|
||||
|
||||
Validates that only single version is published at a time.
|
||||
"""
|
||||
label = "Validate Texture Batch Versions"
|
||||
hosts = ["standalonepublisher"]
|
||||
order = openpype.api.ValidateContentsOrder
|
||||
families = ["textures"]
|
||||
optional = False
|
||||
|
||||
def process(self, instance):
|
||||
wfile = instance.data["versionData"].get("workfile")
|
||||
|
||||
version_str = "v{:03d}".format(instance.data["version"])
|
||||
|
||||
if not wfile: # no matching workfile, do not check versions
|
||||
self.log.info("No workfile present for textures")
|
||||
return
|
||||
|
||||
msg = "Not matching version: texture v{:03d} - workfile {}"
|
||||
assert version_str in wfile, \
|
||||
msg.format(
|
||||
instance.data["version"], wfile
|
||||
)
|
||||
|
||||
present_versions = set()
for context_instance in instance.context:
    present_versions.add(context_instance.data["version"])
|
||||
|
||||
assert len(present_versions) == 1, "Too many versions in a batch!"
|
||||
|
|
@ -0,0 +1,29 @@
|
|||
import pyblish.api
|
||||
import openpype.api
|
||||
|
||||
|
||||
class ValidateTextureBatchWorkfiles(pyblish.api.InstancePlugin):
|
||||
"""Validates that textures workfile has collected resources (optional).
|
||||
|
||||
Collected resources are secondary workfiles (in most cases).
|
||||
"""
|
||||
|
||||
label = "Validate Texture Workfile Has Resources"
|
||||
hosts = ["standalonepublisher"]
|
||||
order = openpype.api.ValidateContentsOrder
|
||||
families = ["texture_batch_workfile"]
|
||||
optional = True
|
||||
|
||||
# from presets
|
||||
main_workfile_extensions = ['mra']
|
||||
|
||||
def process(self, instance):
|
||||
if instance.data["family"] == "workfile":
|
||||
ext = instance.data["representations"][0]["ext"]
|
||||
if ext not in self.main_workfile_extensions:
|
||||
self.log.warning("Only secondary workfile present!")
|
||||
return
|
||||
|
||||
msg = "No secondary workfiles present for workfile {}".\
|
||||
format(instance.data["name"])
|
||||
assert instance.data.get("resources"), msg
|
||||
|
|
@ -155,6 +155,7 @@ class CollectWorkfileData(pyblish.api.ContextPlugin):
|
|||
"sceneMarkInState": mark_in_state == "set",
|
||||
"sceneMarkOut": int(mark_out_frame),
|
||||
"sceneMarkOutState": mark_out_state == "set",
|
||||
"sceneStartFrame": int(lib.execute_george("tv_startframe")),
|
||||
"sceneBgColor": self._get_bg_color()
|
||||
}
|
||||
self.log.debug(
|
||||
|
|
|
|||
|
|
@ -49,6 +49,14 @@ class ExtractSequence(pyblish.api.Extractor):
|
|||
family_lowered = instance.data["family"].lower()
|
||||
mark_in = instance.context.data["sceneMarkIn"]
|
||||
mark_out = instance.context.data["sceneMarkOut"]
|
||||
|
||||
# Scene start frame offsets the output files, so we need to offset the
|
||||
# marks.
|
||||
scene_start_frame = instance.context.data["sceneStartFrame"]
|
||||
difference = scene_start_frame - mark_in
|
||||
mark_in += difference
|
||||
mark_out += difference
|
||||
|
||||
# Frame start/end may be stored as float
|
||||
frame_start = int(instance.data["frameStart"])
|
||||
frame_end = int(instance.data["frameEnd"])
|
||||
|
|
@ -98,7 +106,7 @@ class ExtractSequence(pyblish.api.Extractor):
|
|||
self.log.warning((
|
||||
"Lowering representation range to {} frames."
|
||||
" Changed frame end {} -> {}"
|
||||
).format(output_range + 1, mark_out, new_mark_out))
|
||||
).format(output_range + 1, mark_out, new_output_frame_end))
|
||||
output_frame_end = new_output_frame_end
|
||||
|
||||
# -------------------------------------------------------------------
|
||||
|
|
|
|||
|
|
@ -0,0 +1,22 @@
|
|||
import pyblish.api
|
||||
|
||||
from avalon.tvpaint import workio
|
||||
from openpype.api import version_up
|
||||
|
||||
|
||||
class IncrementWorkfileVersion(pyblish.api.ContextPlugin):
|
||||
"""Increment current workfile version."""
|
||||
|
||||
order = pyblish.api.IntegratorOrder + 1
|
||||
label = "Increment Workfile Version"
|
||||
optional = True
|
||||
hosts = ["tvpaint"]
|
||||
|
||||
def process(self, context):
|
||||
|
||||
assert all(result["success"] for result in context.data["results"]), (
|
||||
"Publishing not succesfull so version is not increased.")
|
||||
|
||||
path = context.data["currentFile"]
|
||||
workio.save_file(version_up(path))
|
||||
self.log.info('Incrementing workfile version')
|
||||
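`version_up` comes from `openpype.api`; the exact naming rules live there, but the intended effect is bumping the version token in the file name (example values are assumptions):

```python
from openpype.api import version_up

# assumed naming: "scene_v001.tvpp" -> "scene_v002.tvpp"
new_path = version_up("/work/scene_v001.tvpp")
print(new_path)
```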
|
|
@ -0,0 +1,27 @@
|
|||
import pyblish.api
|
||||
from avalon.tvpaint import lib
|
||||
|
||||
|
||||
class RepairStartFrame(pyblish.api.Action):
|
||||
"""Repair start frame."""
|
||||
|
||||
label = "Repair"
|
||||
icon = "wrench"
|
||||
on = "failed"
|
||||
|
||||
def process(self, context, plugin):
|
||||
lib.execute_george("tv_startframe 0")
|
||||
|
||||
|
||||
class ValidateStartFrame(pyblish.api.ContextPlugin):
|
||||
"""Validate start frame being at frame 0."""
|
||||
|
||||
label = "Validate Start Frame"
|
||||
order = pyblish.api.ValidatorOrder
|
||||
hosts = ["tvpaint"]
|
||||
actions = [RepairStartFrame]
|
||||
optional = True
|
||||
|
||||
def process(self, context):
|
||||
start_frame = lib.execute_george("tv_startframe")
|
||||
assert int(start_frame) == 0, "Start frame has to be frame 0."
|
||||
openpype/hosts/unreal/README.md (new file, 9 lines)
|
|
@ -0,0 +1,9 @@
|
|||
## Unreal Integration
|
||||
|
||||
The supported Unreal Engine version is 4.26+ (mainly because of the major Python changes made there).

### Project naming
Unreal doesn't support project names starting with a non-alphabetic character, so names like `123_myProject` are
invalid. If OpenPype detects such a name, it automatically prepends the letter **P** to make it valid, so `123_myProject`
becomes `P123_myProject`. There is also a soft limit keeping project names shorter than 20 characters;
longer names trigger a warning in the Unreal Editor about possible side effects.
|
||||
|
|
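A minimal sketch of the naming rule described above; the function name and warning text are illustrative, not OpenPype's actual implementation:

```python
def make_valid_project_name(name: str) -> str:
    # prepend "P" when the first character is not alphabetic
    if not name[:1].isalpha():
        name = "P" + name
    # soft limit: warn on names of 20 characters or more
    if len(name) >= 20:
        print("warning: long project name may cause side effects")
    return name

assert make_valid_project_name("123_myProject") == "P123_myProject"
```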
@ -1,38 +1,51 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Unreal launching and project tools."""
|
||||
import sys
|
||||
import os
|
||||
import platform
|
||||
import json
|
||||
from distutils import dir_util
|
||||
import subprocess
|
||||
import re
|
||||
from pathlib import Path
|
||||
from collections import OrderedDict
|
||||
from openpype.api import get_project_settings
|
||||
|
||||
|
||||
def get_engine_versions():
|
||||
"""
|
||||
def get_engine_versions(env=None):
|
||||
"""Detect Unreal Engine versions.
|
||||
|
||||
This will try to detect location and versions of installed Unreal Engine.
|
||||
Location can be overridden by `UNREAL_ENGINE_LOCATION` environment
|
||||
variable.
|
||||
|
||||
Returns:
|
||||
Args:
|
||||
env (dict, optional): Environment to use.
|
||||
|
||||
dict: dictionary with version as a key and dir as value.
|
||||
Returns:
|
||||
OrderedDict: dictionary with version as a key and dir as value,
    ordered by version.
|
||||
|
||||
Example:
|
||||
|
||||
>>> get_engine_version()
|
||||
>>> get_engine_versions()
|
||||
{
|
||||
"4.23": "C:/Epic Games/UE_4.23",
|
||||
"4.24": "C:/Epic Games/UE_4.24"
|
||||
}
|
||||
"""
|
||||
try:
|
||||
engine_locations = {}
|
||||
root, dirs, files = next(os.walk(os.environ["UNREAL_ENGINE_LOCATION"]))
|
||||
|
||||
for dir in dirs:
|
||||
if dir.startswith("UE_"):
|
||||
ver = dir.split("_")[1]
|
||||
engine_locations[ver] = os.path.join(root, dir)
|
||||
"""
|
||||
env = env or os.environ
|
||||
engine_locations = {}
|
||||
try:
|
||||
root, dirs, _ = next(os.walk(env["UNREAL_ENGINE_LOCATION"]))
|
||||
|
||||
for directory in dirs:
|
||||
if directory.startswith("UE"):
|
||||
try:
|
||||
ver = re.split(r"[-_]", directory)[1]
|
||||
except IndexError:
|
||||
continue
|
||||
engine_locations[ver] = os.path.join(root, directory)
|
||||
except KeyError:
|
||||
# environment variable not set
|
||||
pass
|
||||
|
|
@ -40,32 +53,52 @@ def get_engine_versions():
|
|||
# specified directory doesn't exists
|
||||
pass
|
||||
|
||||
# if we've got something, terminate autodetection process
|
||||
# if we've got something, terminate auto-detection process
|
||||
if engine_locations:
|
||||
return engine_locations
|
||||
return OrderedDict(sorted(engine_locations.items()))
|
||||
|
||||
# else kick in platform specific detection
|
||||
if platform.system().lower() == "windows":
|
||||
return _win_get_engine_versions()
|
||||
elif platform.system().lower() == "linux":
|
||||
return OrderedDict(sorted(_win_get_engine_versions().items()))
|
||||
if platform.system().lower() == "linux":
|
||||
# on linux, there is no installation and getting Unreal Engine involves
|
||||
# git clone. So we'll probably depend on `UNREAL_ENGINE_LOCATION`.
|
||||
pass
|
||||
elif platform.system().lower() == "darwin":
|
||||
return _darwin_get_engine_version()
|
||||
if platform.system().lower() == "darwin":
|
||||
return OrderedDict(sorted(_darwin_get_engine_version().items()))
|
||||
|
||||
return {}
|
||||
return OrderedDict()
|
||||
|
||||
|
||||
def get_editor_executable_path(engine_path: Path) -> Path:
|
||||
"""Get UE4 Editor executable path."""
|
||||
ue4_path = engine_path / "Engine/Binaries"
|
||||
if platform.system().lower() == "windows":
|
||||
ue4_path /= "Win64/UE4Editor.exe"
|
||||
|
||||
elif platform.system().lower() == "linux":
|
||||
ue4_path /= "Linux/UE4Editor"
|
||||
|
||||
elif platform.system().lower() == "darwin":
|
||||
ue4_path /= "Mac/UE4Editor"
|
||||
|
||||
return ue4_path
|
||||
|
||||
|
||||
def _win_get_engine_versions():
|
||||
"""
|
||||
"""Get Unreal Engine versions on Windows.
|
||||
|
||||
If engines are installed via Epic Games Launcher then there is:
|
||||
`%PROGRAMDATA%/Epic/UnrealEngineLauncher/LauncherInstalled.dat`
|
||||
This file is JSON file listing installed stuff, Unreal engines
|
||||
are marked with `"AppName" = "UE_X.XX"`` like `UE_4.24`
|
||||
|
||||
Returns:
|
||||
dict: version as a key and path as a value.
|
||||
|
||||
"""
|
||||
install_json_path = os.path.join(
|
||||
os.environ.get("PROGRAMDATA"),
|
||||
os.getenv("PROGRAMDATA"),
|
||||
"Epic",
|
||||
"UnrealEngineLauncher",
|
||||
"LauncherInstalled.dat",
|
||||
|
|
@ -75,11 +108,19 @@ def _win_get_engine_versions():
|
|||
|
||||
|
||||
def _darwin_get_engine_version() -> dict:
|
||||
"""
|
||||
"""Get Unreal Engine versions on MacOS.
|
||||
|
||||
It works the same as on Windows, just JSON file location is different.
|
||||
|
||||
Returns:
|
||||
dict: version as a key and path as a value.
|
||||
|
||||
See Also:
|
||||
:func:`_win_get_engine_versions`.
|
||||
|
||||
"""
|
||||
install_json_path = os.path.join(
|
||||
os.environ.get("HOME"),
|
||||
os.getenv("HOME"),
|
||||
"Library",
|
||||
"Application Support",
|
||||
"Epic",
|
||||
|
|
@ -91,25 +132,26 @@ def _darwin_get_engine_version() -> dict:
|
|||
|
||||
|
||||
def _parse_launcher_locations(install_json_path: str) -> dict:
|
||||
"""
|
||||
This will parse locations from json file.
|
||||
"""This will parse locations from json file.
|
||||
|
||||
Args:
|
||||
install_json_path (str): Path to `LauncherInstalled.dat`.
|
||||
|
||||
Returns:
|
||||
dict: with unreal engine versions as keys and
|
||||
paths to those engine installations as value.
|
||||
|
||||
:param install_json_path: path to `LauncherInstalled.dat`
|
||||
:type install_json_path: str
|
||||
:returns: returns dict with unreal engine versions as keys and
|
||||
paths to those engine installations as value.
|
||||
:rtype: dict
|
||||
"""
|
||||
engine_locations = {}
|
||||
if os.path.isfile(install_json_path):
|
||||
with open(install_json_path, "r") as ilf:
|
||||
try:
|
||||
install_data = json.load(ilf)
|
||||
except json.JSONDecodeError:
|
||||
except json.JSONDecodeError as e:
|
||||
raise Exception(
|
||||
"Invalid `LauncherInstalled.dat file. `"
|
||||
"Cannot determine Unreal Engine location."
|
||||
)
|
||||
) from e
|
||||
|
||||
for installation in install_data.get("InstallationList", []):
|
||||
if installation.get("AppName").startswith("UE_"):
|
||||
|
|
@ -121,55 +163,91 @@ def _parse_launcher_locations(install_json_path: str) -> dict:
|
|||
|
||||
def create_unreal_project(project_name: str,
|
||||
ue_version: str,
|
||||
pr_dir: str,
|
||||
engine_path: str,
|
||||
dev_mode: bool = False) -> None:
|
||||
"""
|
||||
This will create `.uproject` file at specified location. As there is no
|
||||
way I know to create project via command line, this is easiest option.
|
||||
Unreal project file is basically JSON file. If we find
|
||||
pr_dir: Path,
|
||||
engine_path: Path,
|
||||
dev_mode: bool = False,
|
||||
env: dict = None) -> None:
|
||||
"""This will create `.uproject` file at specified location.
|
||||
|
||||
As there is no way I know of to create a project via the command line,
this is the easiest option. An Unreal project file is basically a JSON
file. If we find the `AVALON_UNREAL_PLUGIN` environment variable, we
assume it points to the Avalon Integration Plugin; we copy its content
to the project folder and enable the plugin.
|
||||
|
||||
:param project_name: project name
|
||||
:type project_name: str
|
||||
:param ue_version: unreal engine version (like 4.23)
|
||||
:type ue_version: str
|
||||
:param pr_dir: path to directory where project will be created
|
||||
:type pr_dir: str
|
||||
:param engine_path: Path to Unreal Engine installation
|
||||
:type engine_path: str
|
||||
:param dev_mode: Flag to trigger C++ style Unreal project needing
|
||||
Visual Studio and other tools to compile plugins from
|
||||
sources. This will trigger automatically if `Binaries`
|
||||
directory is not found in plugin folders as this indicates
|
||||
this is only source distribution of the plugin. Dev mode
|
||||
is also set by preset file `unreal/project_setup.json` in
|
||||
**OPENPYPE_CONFIG**.
|
||||
:type dev_mode: bool
|
||||
:returns: None
|
||||
"""
|
||||
preset = get_project_settings(project_name)["unreal"]["project_setup"]
|
||||
Args:
|
||||
project_name (str): Name of the project.
|
||||
ue_version (str): Unreal engine version (like 4.23).
|
||||
pr_dir (Path): Path to directory where project will be created.
|
||||
engine_path (Path): Path to Unreal Engine installation.
|
||||
dev_mode (bool, optional): Flag to trigger C++ style Unreal project
|
||||
needing Visual Studio and other tools to compile plugins from
|
||||
sources. This will trigger automatically if `Binaries`
|
||||
directory is not found in plugin folders as this indicates
|
||||
this is only source distribution of the plugin. Dev mode
|
||||
is also set by preset file `unreal/project_setup.json` in
|
||||
**OPENPYPE_CONFIG**.
|
||||
env (dict, optional): Environment to use. If not set, `os.environ`.
|
||||
|
||||
if os.path.isdir(os.environ.get("AVALON_UNREAL_PLUGIN", "")):
|
||||
Throws:
|
||||
NotImplementedError: For unsupported platforms.
|
||||
|
||||
Returns:
|
||||
None
|
||||
|
||||
"""
|
||||
env = env or os.environ
|
||||
preset = get_project_settings(project_name)["unreal"]["project_setup"]
|
||||
ue_id = ".".join(ue_version.split(".")[:2])
|
||||
# get unreal engine identifier
|
||||
# -------------------------------------------------------------------------
|
||||
# FIXME (antirotor): As of 4.26 this is problem with UE4 built from
|
||||
# sources. In that case Engine ID is calculated per machine/user and not
|
||||
# from Engine files as this code then reads. This then prevents UE4
|
||||
# to directly open project as it will complain about project being
|
||||
# created in different UE4 version. When user convert such project
|
||||
# to his UE4 version, Engine ID is replaced in uproject file. If some
|
||||
# other user tries to open it, it will present him with similar error.
|
||||
ue4_modules = Path()
|
||||
if platform.system().lower() == "windows":
|
||||
ue4_modules = Path(os.path.join(engine_path, "Engine", "Binaries",
|
||||
"Win64", "UE4Editor.modules"))
|
||||
|
||||
if platform.system().lower() == "linux":
|
||||
ue4_modules = Path(os.path.join(engine_path, "Engine", "Binaries",
|
||||
"Linux", "UE4Editor.modules"))
|
||||
|
||||
if platform.system().lower() == "darwin":
|
||||
ue4_modules = Path(os.path.join(engine_path, "Engine", "Binaries",
|
||||
"Mac", "UE4Editor.modules"))
|
||||
|
||||
if ue4_modules.exists():
|
||||
print("--- Loading Engine ID from modules file ...")
|
||||
with open(ue4_modules, "r") as mp:
|
||||
loaded_modules = json.load(mp)
|
||||
|
||||
if loaded_modules.get("BuildId"):
|
||||
ue_id = "{" + loaded_modules.get("BuildId") + "}"
|
||||
|
||||
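For context, `UE4Editor.modules` is a small JSON file; a hedged sketch of pulling the `BuildId` out of it the way the code above does (the helper name is made up for illustration):

```python
import json
from pathlib import Path
from typing import Optional

def read_build_id(ue4_modules: Path) -> Optional[str]:
    """Return the brace-wrapped BuildId from UE4Editor.modules, if any."""
    if not ue4_modules.exists():
        return None
    with open(ue4_modules, "r") as mp:
        loaded_modules = json.load(mp)
    build_id = loaded_modules.get("BuildId")
    # The uproject EngineAssociation wraps source-build ids in braces.
    return "{" + build_id + "}" if build_id else None
```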
    plugins_path = None
    if os.path.isdir(env.get("AVALON_UNREAL_PLUGIN", "")):
        # copy plugin to correct path under project
        plugins_path = os.path.join(pr_dir, "Plugins")
        avalon_plugin_path = os.path.join(plugins_path, "Avalon")
        if not os.path.isdir(avalon_plugin_path):
            os.makedirs(avalon_plugin_path, exist_ok=True)
        plugins_path = pr_dir / "Plugins"
        avalon_plugin_path = plugins_path / "Avalon"
        if not avalon_plugin_path.is_dir():
            avalon_plugin_path.mkdir(parents=True, exist_ok=True)
            dir_util._path_created = {}
            dir_util.copy_tree(os.environ.get("AVALON_UNREAL_PLUGIN"),
                               avalon_plugin_path)
                               avalon_plugin_path.as_posix())

        if (not os.path.isdir(os.path.join(avalon_plugin_path, "Binaries"))
                or not os.path.join(avalon_plugin_path, "Intermediate")):
        if not (avalon_plugin_path / "Binaries").is_dir() \
                or not (avalon_plugin_path / "Intermediate").is_dir():
            dev_mode = True

    # data for project file
    data = {
        "FileVersion": 3,
        "EngineAssociation": ue_version,
        "EngineAssociation": ue_id,
        "Category": "",
        "Description": "",
        "Plugins": [

@@ -179,35 +257,6 @@ def create_unreal_project(project_name: str,
        ]
    }
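The resulting `.uproject` is plain JSON. A minimal sketch of what this function assembles; the plugin list below is an assumption based on the surrounding hunks, not a verbatim copy of the shipped file:

```python
import json

# Illustrative minimal .uproject content; the exact plugin entries depend
# on the preset and on whether the Avalon plugin was found.
data = {
    "FileVersion": 3,
    "EngineAssociation": "4.26",  # or a "{BuildId}" for source builds
    "Category": "",
    "Description": "",
    "Plugins": [
        {"Name": "Avalon", "Enabled": True},  # assumed entry
    ],
}
print(json.dumps(data, indent=4))
```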
    if preset["install_unreal_python_engine"]:
        # If `OPENPYPE_UNREAL_ENGINE_PYTHON_PLUGIN` is set, copy it from there
        # to support offline installation.
        # Otherwise clone UnrealEnginePython to Plugins directory
        # https://github.com/20tab/UnrealEnginePython.git
        uep_path = os.path.join(plugins_path, "UnrealEnginePython")
        if os.environ.get("OPENPYPE_UNREAL_ENGINE_PYTHON_PLUGIN"):

            os.makedirs(uep_path, exist_ok=True)
            dir_util._path_created = {}
            dir_util.copy_tree(
                os.environ.get("OPENPYPE_UNREAL_ENGINE_PYTHON_PLUGIN"),
                uep_path)
        else:
            # WARNING: this will trigger dev_mode, because we need to compile
            # this plugin.
            dev_mode = True
            import git
            git.Repo.clone_from(
                "https://github.com/20tab/UnrealEnginePython.git",
                uep_path)

        data["Plugins"].append(
            {"Name": "UnrealEnginePython", "Enabled": True})

        if (not os.path.isdir(os.path.join(uep_path, "Binaries"))
                or not os.path.join(uep_path, "Intermediate")):
            dev_mode = True

    if dev_mode or preset["dev_mode"]:
        # this will add project module and necessary source files to make it
        # a C++ project and to (hopefully) make Unreal Editor compile all

@@ -220,51 +269,39 @@ def create_unreal_project(project_name: str,
"AdditionalDependencies": ["Engine"],
|
||||
}]
|
||||
|
||||
if preset["install_unreal_python_engine"]:
|
||||
# now we need to fix python path in:
|
||||
# `UnrealEnginePython.Build.cs`
|
||||
# to point to our python
|
||||
with open(os.path.join(
|
||||
uep_path, "Source",
|
||||
"UnrealEnginePython",
|
||||
"UnrealEnginePython.Build.cs"), mode="r") as f:
|
||||
build_file = f.read()
|
||||
|
||||
fix = build_file.replace(
|
||||
'private string pythonHome = "";',
|
||||
'private string pythonHome = "{}";'.format(
|
||||
sys.base_prefix.replace("\\", "/")))
|
||||
|
||||
with open(os.path.join(
|
||||
uep_path, "Source",
|
||||
"UnrealEnginePython",
|
||||
"UnrealEnginePython.Build.cs"), mode="w") as f:
|
||||
f.write(fix)
|
||||
|
||||
# write project file
|
||||
project_file = os.path.join(pr_dir, "{}.uproject".format(project_name))
|
||||
project_file = pr_dir / f"{project_name}.uproject"
|
||||
with open(project_file, mode="w") as pf:
|
||||
json.dump(data, pf, indent=4)
|
||||
|
||||
# UE < 4.26 have Python2 by default, so we need PySide
|
||||
# but we will not need it in 4.26 and up
|
||||
if int(ue_version.split(".")[1]) < 26:
|
||||
# ensure we have PySide installed in engine
|
||||
# TODO: make it work for other platforms 🍎 🐧
|
||||
if platform.system().lower() == "windows":
|
||||
python_path = os.path.join(engine_path, "Engine", "Binaries",
|
||||
"ThirdParty", "Python", "Win64",
|
||||
"python.exe")
|
||||
# ensure we have PySide2 installed in engine
|
||||
python_path = None
|
||||
if platform.system().lower() == "windows":
|
||||
python_path = engine_path / ("Engine/Binaries/ThirdParty/"
|
||||
"Python3/Win64/pythonw.exe")
|
||||
|
||||
subprocess.run([python_path, "-m",
|
||||
"pip", "install", "pyside"])
|
||||
if platform.system().lower() == "linux":
|
||||
python_path = engine_path / ("Engine/Binaries/ThirdParty/"
|
||||
"Python3/Linux/bin/python3")
|
||||
|
||||
if platform.system().lower() == "darwin":
|
||||
python_path = engine_path / ("Engine/Binaries/ThirdParty/"
|
||||
"Python3/Mac/bin/python3")
|
||||
|
||||
if not python_path:
|
||||
raise NotImplementedError("Unsupported platform")
|
||||
if not python_path.exists():
|
||||
raise RuntimeError(f"Unreal Python not found at {python_path}")
|
||||
subprocess.run(
|
||||
[python_path.as_posix(), "-m", "pip", "install", "pyside2"])
|
||||
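The per-platform ladder above can be read as a single lookup. A compact sketch of the same idea; the relative paths mirror this hunk and should be treated as assumptions for other engine versions:

```python
from pathlib import Path
import platform
import subprocess

# Relative locations of the engine-bundled Python 3, as used above.
_PYTHON_BY_PLATFORM = {
    "windows": "Engine/Binaries/ThirdParty/Python3/Win64/pythonw.exe",
    "linux": "Engine/Binaries/ThirdParty/Python3/Linux/bin/python3",
    "darwin": "Engine/Binaries/ThirdParty/Python3/Mac/bin/python3",
}

def install_pyside2(engine_path: Path) -> None:
    rel = _PYTHON_BY_PLATFORM.get(platform.system().lower())
    if rel is None:
        raise NotImplementedError("Unsupported platform")
    python_path = engine_path / rel
    if not python_path.exists():
        raise RuntimeError(f"Unreal Python not found at {python_path}")
    subprocess.run(
        [python_path.as_posix(), "-m", "pip", "install", "pyside2"],
        check=True)
```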
    if dev_mode or preset["dev_mode"]:
        _prepare_cpp_project(project_file, engine_path)


def _prepare_cpp_project(project_file: str, engine_path: str) -> None:
    """
def _prepare_cpp_project(project_file: Path, engine_path: Path) -> None:
    """Prepare CPP Unreal Project.

    This function will add source files needed for the project to be
    rebuilt along with the avalon integration plugin.

@@ -273,19 +310,19 @@ def _prepare_cpp_project(project_file: str, engine_path: str) -> None:
    by some generator. This needs more research as manually writing
    those files is rather hackish. :skull_and_crossbones:

    :param project_file: path to .uproject file
    :type project_file: str
    :param engine_path: path to unreal engine associated with project
    :type engine_path: str

    Args:
        project_file (str): Path to .uproject file.
        engine_path (str): Path to unreal engine associated with project.

    """
    project_name = project_file.stem
    project_dir = project_file.parent
    targets_dir = project_dir / "Source"
    sources_dir = targets_dir / project_name

    project_name = os.path.splitext(os.path.basename(project_file))[0]
    project_dir = os.path.dirname(project_file)
    targets_dir = os.path.join(project_dir, "Source")
    sources_dir = os.path.join(targets_dir, project_name)

    os.makedirs(sources_dir, exist_ok=True)
    os.makedirs(os.path.join(project_dir, "Content"), exist_ok=True)
    sources_dir.mkdir(parents=True, exist_ok=True)
    (project_dir / "Content").mkdir(parents=True, exist_ok=True)

    module_target = '''
using UnrealBuildTool;

@@ -360,59 +397,59 @@ class {1}_API A{0}GameModeBase : public AGameModeBase
}};
'''.format(project_name, project_name.upper())

    with open(os.path.join(
            targets_dir, f"{project_name}.Target.cs"), mode="w") as f:
    with open(targets_dir / f"{project_name}.Target.cs", mode="w") as f:
        f.write(module_target)

    with open(os.path.join(
            targets_dir, f"{project_name}Editor.Target.cs"), mode="w") as f:
    with open(targets_dir / f"{project_name}Editor.Target.cs", mode="w") as f:
        f.write(editor_module_target)

    with open(os.path.join(
            sources_dir, f"{project_name}.Build.cs"), mode="w") as f:
    with open(sources_dir / f"{project_name}.Build.cs", mode="w") as f:
        f.write(module_build)

    with open(os.path.join(
            sources_dir, f"{project_name}.cpp"), mode="w") as f:
    with open(sources_dir / f"{project_name}.cpp", mode="w") as f:
        f.write(module_cpp)

    with open(os.path.join(
            sources_dir, f"{project_name}.h"), mode="w") as f:
    with open(sources_dir / f"{project_name}.h", mode="w") as f:
        f.write(module_header)

    with open(os.path.join(
            sources_dir, f"{project_name}GameModeBase.cpp"), mode="w") as f:
    with open(sources_dir / f"{project_name}GameModeBase.cpp", mode="w") as f:
        f.write(game_mode_cpp)

    with open(os.path.join(
            sources_dir, f"{project_name}GameModeBase.h"), mode="w") as f:
    with open(sources_dir / f"{project_name}GameModeBase.h", mode="w") as f:
        f.write(game_mode_h)

    u_build_tool = Path(
        engine_path / "Engine/Binaries/DotNET/UnrealBuildTool.exe")
    u_header_tool = None

    arch = "Win64"
    if platform.system().lower() == "windows":
        u_build_tool = (f"{engine_path}/Engine/Binaries/DotNET/"
                        "UnrealBuildTool.exe")
        u_header_tool = (f"{engine_path}/Engine/Binaries/Win64/"
                         f"UnrealHeaderTool.exe")
        arch = "Win64"
        u_header_tool = Path(
            engine_path / "Engine/Binaries/Win64/UnrealHeaderTool.exe")
    elif platform.system().lower() == "linux":
        # WARNING: there is no UnrealBuildTool on linux?
        u_build_tool = ""
        u_header_tool = ""
        arch = "Linux"
        u_header_tool = Path(
            engine_path / "Engine/Binaries/Linux/UnrealHeaderTool")
    elif platform.system().lower() == "darwin":
        # WARNING: there is no UnrealBuildTool on Mac?
        u_build_tool = ""
        u_header_tool = ""
        # we need to test this out
        arch = "Mac"
        u_header_tool = Path(
            engine_path / "Engine/Binaries/Mac/UnrealHeaderTool")

    u_build_tool = u_build_tool.replace("\\", "/")
    u_header_tool = u_header_tool.replace("\\", "/")
    if not u_header_tool:
        raise NotImplementedError("Unsupported platform")

    command1 = [u_build_tool, "-projectfiles", f"-project={project_file}",
                "-progress"]
    command1 = [u_build_tool.as_posix(), "-projectfiles",
                f"-project={project_file}", "-progress"]

    subprocess.run(command1)

    command2 = [u_build_tool, f"-ModuleWithSuffix={project_name},3555"
                "Win64", "Development", "-TargetType=Editor"
                f'-Project="{project_file}"', f'"{project_file}"'
    command2 = [u_build_tool.as_posix(),
                f"-ModuleWithSuffix={project_name},3555", arch,
                "Development", "-TargetType=Editor",
                f'-Project={project_file}',
                f'{project_file}',
                "-IgnoreJunk"]

    subprocess.run(command2)
@@ -1,31 +1,49 @@
# -*- coding: utf-8 -*-
"""Hook to launch Unreal and prepare projects."""
import os
from pathlib import Path

from openpype.lib import (
    PreLaunchHook,
    ApplicationLaunchFailed
    ApplicationLaunchFailed,
    ApplicationNotFound
)
from openpype.hosts.unreal.api import lib as unreal_lib


class UnrealPrelaunchHook(PreLaunchHook):
    """
    """Hook to handle launching Unreal.

    This hook will check if the current workfile path has an Unreal
    project inside. If not, it initializes it and finally passes the
    path to the project by environment variable to the Unreal launcher
    shell script.
    """

    """
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

        self.signature = "( {} )".format(self.__class__.__name__)

    def execute(self):
        """Hook entry method."""
        asset_name = self.data["asset_name"]
        task_name = self.data["task_name"]
        workdir = self.launch_context.env["AVALON_WORKDIR"]
        engine_version = self.app_name.split("/")[-1].replace("-", ".")
        unreal_project_name = f"{asset_name}_{task_name}"
        try:
            if int(engine_version.split(".")[0]) < 4 and \
                    int(engine_version.split(".")[1]) < 26:
                raise ApplicationLaunchFailed((
                    f"{self.signature} Old unsupported version of UE4 "
                    f"detected - {engine_version}"))
        except ValueError:
            # there can be a string in the minor version and in that case
            # the int cast fails. This probably happens only with
            # early access versions and is of no concern for this check
            # so let's keep it quiet.
            ...
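The gate above appears intended to reject anything older than 4.26; a hedged sketch of one way to express that intent as a tuple comparison. This is a reading of the hunk, not the shipped logic:

```python
# Illustrative version gate; early-access builds with non-numeric parts
# are allowed through, matching the except-ValueError behavior above.
def is_supported(engine_version: str) -> bool:
    try:
        major, minor = (int(p) for p in engine_version.split(".")[:2])
    except ValueError:
        return True
    return (major, minor) >= (4, 26)

assert is_supported("4.26")
assert not is_supported("4.25")
```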
        # Unreal is sensitive about project names longer than 20 chars
        if len(unreal_project_name) > 20:
@@ -45,19 +63,21 @@ class UnrealPrelaunchHook(PreLaunchHook):
            ))
            unreal_project_name = f"P{unreal_project_name}"

        project_path = os.path.join(workdir, unreal_project_name)
        project_path = Path(os.path.join(workdir, unreal_project_name))

        self.log.info((
            f"{self.signature} requested UE4 version: "
            f"[ {engine_version} ]"
        ))

        detected = unreal_lib.get_engine_versions()
        detected = unreal_lib.get_engine_versions(self.launch_context.env)
        detected_str = ', '.join(detected.keys()) or 'none'
        self.log.info((
            f"{self.signature} detected UE4 versions: "
            f"[ {detected_str} ]"
        ))
        if not detected:
            raise ApplicationNotFound("No Unreal Engines are found.")

        engine_version = ".".join(engine_version.split(".")[:2])
        if engine_version not in detected.keys():

@@ -66,13 +86,14 @@ class UnrealPrelaunchHook(PreLaunchHook):
                f"detected [ {engine_version} ]"
            ))

        os.makedirs(project_path, exist_ok=True)
        ue4_path = unreal_lib.get_editor_executable_path(
            Path(detected[engine_version]))

        project_file = os.path.join(
            project_path,
            f"{unreal_project_name}.uproject"
        )
        if not os.path.isfile(project_file):
        self.launch_context.launch_args.append(ue4_path.as_posix())
        project_path.mkdir(parents=True, exist_ok=True)

        project_file = project_path / f"{unreal_project_name}.uproject"
        if not project_file.is_file():
            engine_path = detected[engine_version]
            self.log.info((
                f"{self.signature} creating unreal "

@@ -88,8 +109,9 @@ class UnrealPrelaunchHook(PreLaunchHook):
                unreal_project_name,
                engine_version,
                project_path,
                engine_path=engine_path
                engine_path=Path(engine_path)
            )

        # Append project file to launch arguments
        self.launch_context.launch_args.append(f"\"{project_file}\"")
        self.launch_context.launch_args.append(
            f"\"{project_file.as_posix()}\"")
@@ -1,4 +1,5 @@
import os
import sys
import re
import copy
import json

@@ -449,6 +450,12 @@ class ApplicationExecutable:
    """Representation of executable loaded from settings."""

    def __init__(self, executable):
        # Try to format executable with environments
        try:
            executable = executable.format(**os.environ)
        except Exception:
            pass

        # On MacOS check if exists path to executable when ends with `.app`
        # - it is common that path will lead to "/Applications/Blender" but
        #   real path is "/Applications/Blender.app"

@@ -460,12 +467,6 @@ class ApplicationExecutable:
            if os.path.exists(_executable):
                executable = _executable

        # Try to format executable with environments
        try:
            executable = executable.format(**os.environ)
        except Exception:
            pass

        self.executable_path = executable

    def __str__(self):
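The change moves the environment formatting before the `.app` check, so placeholders are expanded first. A small standalone sketch of the `str.format(**os.environ)` idiom used here; `BLENDER_ROOT` is a made-up variable for illustration:

```python
import os

# "BLENDER_ROOT" is hypothetical, used only to demonstrate the idiom.
os.environ["BLENDER_ROOT"] = "/opt/blender"

executable = "{BLENDER_ROOT}/blender"
try:
    # Expands {KEY} placeholders from the environment.
    executable = executable.format(**os.environ)
except Exception:
    # Unknown keys (or stray braces) raise; keep the template untouched then.
    pass

print(executable)  # /opt/blender/blender
```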
@@ -708,6 +709,10 @@ class ApplicationLaunchContext:
        )
        self.kwargs["creationflags"] = flags

        if not sys.stdout:
            self.kwargs["stdout"] = subprocess.DEVNULL
            self.kwargs["stderr"] = subprocess.DEVNULL

        self.prelaunch_hooks = None
        self.postlaunch_hooks = None

@@ -1133,7 +1138,8 @@ def prepare_host_environments(data, implementation_envs=True):
        # Merge dictionaries
        env_values = _merge_env(tool_env, env_values)

    loaded_env = _merge_env(acre.compute(env_values), data["env"])
    merged_env = _merge_env(env_values, data["env"])
    loaded_env = acre.compute(merged_env, cleanup=False)

    final_env = None
    # Add host specific environments
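The reordering matters: merging first and computing once means `{KEY}` references can resolve across both dictionaries. A toy sketch of the parse-then-compute flow, assuming `acre.parse` and `acre.compute(..., cleanup=False)` behave as this module relies on; the `merge` stand-in is a simplification of `_merge_env`:

```python
import acre

# Simplified stand-in for _merge_env: left-hand side wins on conflicts.
def merge(src, dst):
    out = dict(dst)
    out.update(src)
    return out

tool_env = acre.parse({"STUDIO_TOOLS_BIN": "{STUDIO_ROOT}/bin"})
base_env = {"STUDIO_ROOT": "/opt/studio"}

merged = merge(tool_env, base_env)
# cleanup=False keeps keys whose references could not be resolved.
computed = acre.compute(merged, cleanup=False)
print(computed["STUDIO_TOOLS_BIN"])  # /opt/studio/bin
```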
@@ -1184,7 +1190,10 @@ def apply_project_environments_value(project_name, env, project_settings=None):

    env_value = project_settings["global"]["project_environments"]
    if env_value:
        env.update(_merge_env(acre.parse(env_value), env))
        env.update(acre.compute(
            _merge_env(acre.parse(env_value), env),
            cleanup=False
        ))
    return env
@@ -1297,10 +1306,18 @@ def _prepare_last_workfile(data, workdir):
    )
    data["start_last_workfile"] = start_last_workfile

    workfile_startup = should_workfile_tool_start(
        project_name, app.host_name, task_name
    )
    data["workfile_startup"] = workfile_startup

    # Store boolean as "0"(False) or "1"(True)
    data["env"]["AVALON_OPEN_LAST_WORKFILE"] = (
        str(int(bool(start_last_workfile)))
    )
    data["env"]["OPENPYPE_WORKFILE_TOOL_ON_START"] = (
        str(int(bool(workfile_startup)))
    )
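Environment variables are strings, hence the `str(int(bool(...)))` dance. A tiny sketch of the round trip; the reading side is an assumption about how consumers interpret the flag:

```python
# Writing: any truthy value becomes "1", falsy becomes "0".
def bool_to_env(value) -> str:
    return str(int(bool(value)))

# Reading back (illustrative; consumers may accept more spellings).
def env_to_bool(raw: str) -> bool:
    return raw.strip().lower() in {"1", "true", "yes"}

assert bool_to_env(True) == "1"
assert env_to_bool(bool_to_env(False)) is False
```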

    _sub_msg = "" if start_last_workfile else " not"
    log.debug(
@@ -1339,40 +1356,9 @@ def _prepare_last_workfile(data, workdir):
    data["last_workfile_path"] = last_workfile_path


def should_start_last_workfile(
    project_name, host_name, task_name, default_output=False
def get_option_from_settings(
    startup_presets, host_name, task_name, default_output
):
    """Define if host should start last version workfile if possible.

    Default output is `False`. Can be overridden with environment variable
    `AVALON_OPEN_LAST_WORKFILE`, valid values without case sensitivity are
    `"0", "1", "true", "false", "yes", "no"`.

    Args:
        project_name (str): Name of project.
        host_name (str): Name of host which is launched. In avalon's
            application context it's the value stored in app definition under
            key `"application_dir"`. Is not case sensitive.
        task_name (str): Name of task which is used for launching the host.
            Task name is not case sensitive.

    Returns:
        bool: True if host should start workfile.

    """

    project_settings = get_project_settings(project_name)
    startup_presets = (
        project_settings
        ["global"]
        ["tools"]
        ["Workfiles"]
        ["last_workfile_on_startup"]
    )

    if not startup_presets:
        return default_output

    host_name_lowered = host_name.lower()
    task_name_lowered = task_name.lower()
@@ -1416,6 +1402,82 @@ def should_start_last_workfile(
    return default_output


def should_start_last_workfile(
    project_name, host_name, task_name, default_output=False
):
    """Define if host should start last version workfile if possible.

    Default output is `False`. Can be overridden with environment variable
    `AVALON_OPEN_LAST_WORKFILE`, valid values without case sensitivity are
    `"0", "1", "true", "false", "yes", "no"`.

    Args:
        project_name (str): Name of project.
        host_name (str): Name of host which is launched. In avalon's
            application context it's the value stored in app definition under
            key `"application_dir"`. Is not case sensitive.
        task_name (str): Name of task which is used for launching the host.
            Task name is not case sensitive.

    Returns:
        bool: True if host should start workfile.

    """

    project_settings = get_project_settings(project_name)
    startup_presets = (
        project_settings
        ["global"]
        ["tools"]
        ["Workfiles"]
        ["last_workfile_on_startup"]
    )

    if not startup_presets:
        return default_output

    return get_option_from_settings(
        startup_presets, host_name, task_name, default_output)

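The docstring's accepted spellings (`0/1/true/false/yes/no`, case-insensitive) suggest a small normalizer. A hedged sketch of how such an override could be parsed; the function name is made up and the shipped parser may differ:

```python
import os

_TRUE = {"1", "true", "yes"}
_FALSE = {"0", "false", "no"}

# Hypothetical helper illustrating the documented value set.
def env_override(name: str, default: bool) -> bool:
    raw = os.environ.get(name)
    if raw is None:
        return default
    value = raw.strip().lower()
    if value in _TRUE:
        return True
    if value in _FALSE:
        return False
    return default

print(env_override("AVALON_OPEN_LAST_WORKFILE", False))
```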
def should_workfile_tool_start(
    project_name, host_name, task_name, default_output=False
):
    """Define if host should start workfile tool at host launch.

    Default output is `False`. Can be overridden with environment variable
    `OPENPYPE_WORKFILE_TOOL_ON_START`, valid values without case sensitivity
    are `"0", "1", "true", "false", "yes", "no"`.

    Args:
        project_name (str): Name of project.
        host_name (str): Name of host which is launched. In avalon's
            application context it's the value stored in app definition under
            key `"application_dir"`. Is not case sensitive.
        task_name (str): Name of task which is used for launching the host.
            Task name is not case sensitive.

    Returns:
        bool: True if host should start workfile.

    """

    project_settings = get_project_settings(project_name)
    startup_presets = (
        project_settings
        ["global"]
        ["tools"]
        ["Workfiles"]
        ["open_workfile_tool_on_startup"]
    )

    if not startup_presets:
        return default_output

    return get_option_from_settings(
        startup_presets, host_name, task_name, default_output)


def compile_list_of_regexes(in_list):
    """Convert strings in entered list to compiled regex objects."""
    regexes = list()
@@ -7,6 +7,8 @@ try:
    import opentimelineio as otio
    from opentimelineio import opentime as _ot
except ImportError:
    if not os.environ.get("AVALON_APP"):
        raise
    otio = discover_host_vendor_module("opentimelineio")
    _ot = discover_host_vendor_module("opentimelineio.opentime")


@@ -199,7 +199,7 @@ def get_renderer_variables(renderlayer, root):
    if extension is None:
        extension = "png"

    if extension == "exr (multichannel)" or extension == "exr (deep)":
    if extension in ["exr (multichannel)", "exr (deep)"]:
        extension = "exr"

    prefix_attr = "vraySettings.fileNamePrefix"
@@ -271,6 +271,22 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
            ["DEADLINE_REST_URL"]
        )

        self._job_info = (
            context.data["project_settings"].get(
                "deadline", {}).get(
                    "publish", {}).get(
                        "MayaSubmitDeadline", {}).get(
                            "jobInfo", {})
        )

        self._plugin_info = (
            context.data["project_settings"].get(
                "deadline", {}).get(
                    "publish", {}).get(
                        "MayaSubmitDeadline", {}).get(
                            "pluginInfo", {})
        )
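Chained `.get(..., {})` calls like the two blocks above can be folded into a small helper. A sketch of that idea; `deep_get` is a hypothetical utility, not part of this codebase:

```python
from functools import reduce

# Hypothetical helper: walk nested dicts, returning `default` on any miss.
def deep_get(data, keys, default=None):
    return reduce(
        lambda acc, key: acc.get(key, {}) if isinstance(acc, dict) else {},
        keys,
        data,
    ) or default

settings = {
    "deadline": {"publish": {"MayaSubmitDeadline": {"jobInfo": {"Pool": "maya"}}}}
}
job_info = deep_get(
    settings, ["deadline", "publish", "MayaSubmitDeadline", "jobInfo"], {})
print(job_info)  # {'Pool': 'maya'}
```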
        assert self._deadline_url, "Requires DEADLINE_REST_URL"

        context = instance.context
@@ -279,57 +295,70 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
        instance.data["toBeRenderedOn"] = "deadline"

        filepath = None
        patches = (
            context.data["project_settings"].get(
                "deadline", {}).get(
                    "publish", {}).get(
                        "MayaSubmitDeadline", {}).get(
                            "scene_patches", {})
        )

        # Handle render/export from published scene or not ------------------
        if self.use_published:
            patched_files = []
            for i in context:
                if "workfile" in i.data["families"]:
                    assert i.data["publish"] is True, (
                        "Workfile (scene) must be published along")
                    template_data = i.data.get("anatomyData")
                    rep = i.data.get("representations")[0].get("name")
                    template_data["representation"] = rep
                    template_data["ext"] = rep
                    template_data["comment"] = None
                    anatomy_filled = anatomy.format(template_data)
                    template_filled = anatomy_filled["publish"]["path"]
                    filepath = os.path.normpath(template_filled)
                    self.log.info("Using published scene for render {}".format(
                        filepath))
                if "workfile" not in i.data["families"]:
                    continue
                assert i.data["publish"] is True, (
                    "Workfile (scene) must be published along")
                template_data = i.data.get("anatomyData")
                rep = i.data.get("representations")[0].get("name")
                template_data["representation"] = rep
                template_data["ext"] = rep
                template_data["comment"] = None
                anatomy_filled = anatomy.format(template_data)
                template_filled = anatomy_filled["publish"]["path"]
                filepath = os.path.normpath(template_filled)
                self.log.info("Using published scene for render {}".format(
                    filepath))

                    if not os.path.exists(filepath):
                        self.log.error("published scene does not exist!")
                        raise
                    # now we need to switch the scene in expected files
                    # because the <scene> token will now point to the
                    # published scene file and that might differ from the
                    # current one
                    new_scene = os.path.splitext(
                        os.path.basename(filepath))[0]
                    orig_scene = os.path.splitext(
                        os.path.basename(context.data["currentFile"]))[0]
                    exp = instance.data.get("expectedFiles")
                if not os.path.exists(filepath):
                    self.log.error("published scene does not exist!")
                    raise
                # now we need to switch the scene in expected files
                # because the <scene> token will now point to the published
                # scene file and that might differ from the current one
                new_scene = os.path.splitext(
                    os.path.basename(filepath))[0]
                orig_scene = os.path.splitext(
                    os.path.basename(context.data["currentFile"]))[0]
                exp = instance.data.get("expectedFiles")

                    if isinstance(exp[0], dict):
                        # we have aovs and we need to iterate over them
                        new_exp = {}
                        for aov, files in exp[0].items():
                            replaced_files = []
                            for f in files:
                                replaced_files.append(
                                    f.replace(orig_scene, new_scene)
                                )
                            new_exp[aov] = replaced_files
                        instance.data["expectedFiles"] = [new_exp]
                    else:
                        new_exp = []
                        for f in exp:
                            new_exp.append(
                if isinstance(exp[0], dict):
                    # we have aovs and we need to iterate over them
                    new_exp = {}
                    for aov, files in exp[0].items():
                        replaced_files = []
                        for f in files:
                            replaced_files.append(
                                f.replace(orig_scene, new_scene)
                            )
                        instance.data["expectedFiles"] = [new_exp]
                        self.log.info("Scene name was switched {} -> {}".format(
                            orig_scene, new_scene
                        ))
                        new_exp[aov] = replaced_files
                    instance.data["expectedFiles"] = [new_exp]
                else:
                    new_exp = []
                    for f in exp:
                        new_exp.append(
                            f.replace(orig_scene, new_scene)
                        )
                    instance.data["expectedFiles"] = [new_exp]
                self.log.info("Scene name was switched {} -> {}".format(
                    orig_scene, new_scene
                ))
                # patch workfile if needed
                if filepath not in patched_files:
                    patched_file = self._patch_workfile(filepath, patches)
                    patched_files.append(patched_file)
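The swap above only rewrites file names. A self-contained sketch of the same substitution on a toy `expectedFiles` payload; the shapes follow the hunk (either a list of paths or a single dict of AOV name to path list):

```python
orig_scene = "shot010_v001"
new_scene = "shot010_v002"

# AOV case: one dict mapping AOV name to its file sequence.
exp = [{
    "beauty": ["render/shot010_v001.beauty.0001.exr"],
    "depth": ["render/shot010_v001.depth.0001.exr"],
}]

if isinstance(exp[0], dict):
    new_exp = {
        aov: [f.replace(orig_scene, new_scene) for f in files]
        for aov, files in exp[0].items()
    }
    expected_files = [new_exp]
else:
    expected_files = [f.replace(orig_scene, new_scene) for f in exp]

print(expected_files[0]["beauty"])  # ['render/shot010_v002.beauty.0001.exr']
```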
        all_instances = []
        for result in context.data["results"]:
@@ -407,7 +436,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
        self.payload_skeleton["JobInfo"]["Priority"] = \
            self._instance.data.get("priority", 50)

        if self.group != "none":
        if self.group != "none" and self.group:
            self.payload_skeleton["JobInfo"]["Group"] = self.group

        if self.limit_groups:

@@ -536,6 +565,10 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):

        self.preflight_check(instance)

        # add jobInfo and pluginInfo variables from Settings
        payload["JobInfo"].update(self._job_info)
        payload["PluginInfo"].update(self._plugin_info)

        # Prepare tiles data ------------------------------------------------
        if instance.data.get("tileRendering"):
            # if we have a sequence of files, we need to create a tile job for
@@ -848,10 +881,11 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
        payload["JobInfo"].update(job_info_ext)
        payload["PluginInfo"].update(plugin_info_ext)

        envs = []
        for k, v in payload["JobInfo"].items():
            if k.startswith("EnvironmentKeyValue"):
                envs.append(v)
        envs = [
            v
            for k, v in payload["JobInfo"].items()
            if k.startswith("EnvironmentKeyValue")
        ]

        # add app name to environment
        envs.append(

@@ -872,11 +906,8 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
            envs.append(
                "OPENPYPE_ASS_EXPORT_STEP={}".format(1))

        i = 0
        for e in envs:
        for i, e in enumerate(envs):
            payload["JobInfo"]["EnvironmentKeyValue{}".format(i)] = e
            i += 1

        return payload

    def _get_vray_render_payload(self, data):
@@ -983,7 +1014,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):

        """
        if 'verify' not in kwargs:
            kwargs['verify'] = False if os.getenv("OPENPYPE_DONT_VERIFY_SSL", True) else True  # noqa
            kwargs['verify'] = not os.getenv("OPENPYPE_DONT_VERIFY_SSL", True)
        # add 10sec timeout before bailing out
        kwargs['timeout'] = 10
        return requests.post(*args, **kwargs)

@@ -1002,7 +1033,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):

        """
        if 'verify' not in kwargs:
            kwargs['verify'] = False if os.getenv("OPENPYPE_DONT_VERIFY_SSL", True) else True  # noqa
            kwargs['verify'] = not os.getenv("OPENPYPE_DONT_VERIFY_SSL", True)
        # add 10sec timeout before bailing out
        kwargs['timeout'] = 10
        return requests.get(*args, **kwargs)
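Worth noting: `os.getenv` returns strings, so both the old and the new form treat any non-empty value of `OPENPYPE_DONT_VERIFY_SSL` (including `"0"`) as truthy, and the `True` default means verification also stays off when the variable is unset. A quick illustration of that truthiness behavior:

```python
import os

def verify_flag() -> bool:
    # Mirrors the expression in the hunk above.
    return not os.getenv("OPENPYPE_DONT_VERIFY_SSL", True)

os.environ.pop("OPENPYPE_DONT_VERIFY_SSL", None)
print(verify_flag())  # False - unset still disables verification

os.environ["OPENPYPE_DONT_VERIFY_SSL"] = "0"
print(verify_flag())  # False - "0" is a non-empty string, hence truthy

os.environ["OPENPYPE_DONT_VERIFY_SSL"] = ""
print(verify_flag())  # True - only an empty string enables verification
```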
@@ -1049,3 +1080,43 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
        result = filename_zero.replace("\\", "/")

        return result

    def _patch_workfile(self, file, patches):
        # type: (str, list) -> [str, None]
        """Patch Maya scene.

        This will take a list of patches (lines to add) and apply them to the
        *published* Maya scene file (that is used later for rendering).

        Patches are dicts with the following structure::

            {
                "name": "Name of patch",
                "regex": "regex of line before patch",
                "line": "line to insert"
            }

        Args:
            file (str): File to patch.
            patches (list): List of dicts defining patches.

        Returns:
            str: Patched file path or None

        """
        if os.path.splitext(file)[1].lower() != ".ma" or not patches:
            return None

        compiled_regex = [re.compile(p["regex"]) for p in patches]
        with open(file, "r+") as pf:
            scene_data = pf.readlines()
            for ln, line in enumerate(scene_data):
                for i, r in enumerate(compiled_regex):
                    if re.match(r, line):
                        scene_data.insert(ln + 1, patches[i]["line"])
                        pf.seek(0)
                        pf.writelines(scene_data)
                        pf.truncate()
                        self.log.info(
                            "Applied {} patch to scene.".format(
                                patches[i]["name"]))
        return file
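A hedged usage sketch of the patch structure documented above; the regex and inserted line are invented for illustration, and in production the patches come from project settings under `scene_patches`:

```python
# Invented example patches; real ones are defined in project settings.
patches = [
    {
        "name": "force color management",
        "regex": r'^requires maya "\d+";',
        "line": 'setAttr "defaultColorMgtGlobals.cmEnabled" 1;\n',
    }
]

# Conceptually: for each scene line matching "regex", insert "line" after it.
# self._patch_workfile("/path/to/published_scene.ma", patches)
```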
@@ -32,6 +32,8 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
    department = ""
    limit_groups = {}
    use_gpu = False
    env_allowed_keys = []
    env_search_replace_values = {}

    def process(self, instance):
        instance.data["toBeRenderedOn"] = "deadline"

@@ -242,19 +244,19 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
            "PYBLISHPLUGINPATH",
            "NUKE_PATH",
            "TOOL_ENV",
            "OPENPYPE_DEV",
            "FOUNDRY_LICENSE"
        ]
        # add allowed keys from preset if any
        if self.env_allowed_keys:
            keys += self.env_allowed_keys

        environment = dict({key: os.environ[key] for key in keys
                            if key in os.environ}, **api.Session)
        # self.log.debug("enviro: {}".format(pprint(environment)))
        for path in os.environ:
            if path.lower().startswith('pype_'):
                environment[path] = os.environ[path]
            if path.lower().startswith('nuke_'):
                environment[path] = os.environ[path]
            if 'license' in path.lower():
                environment[path] = os.environ[path]

        for _path in os.environ:
            if _path.lower().startswith('openpype_'):
                environment[_path] = os.environ[_path]

        clean_environment = {}
        for key, value in environment.items():

@@ -285,6 +287,13 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
        environment = clean_environment
        # to recognize job from PYPE for turning Event On/Off
        environment["OPENPYPE_RENDER_JOB"] = "1"

        # finally search replace in values of any key
        if self.env_search_replace_values:
            for key, value in environment.items():
                for _k, _v in self.env_search_replace_values.items():
                    environment[key] = value.replace(_k, _v)

        payload["JobInfo"].update({
            "EnvironmentKeyValue%d" % index: "{key}={value}".format(
                key=key,
@@ -181,6 +181,10 @@ class ValidateExpectedFiles(pyblish.api.InstancePlugin):
        """Returns set of file names from metadata.json"""
        expected_files = set()

        for file_name in repre["files"]:
        files = repre["files"]
        if not isinstance(files, list):
            files = [files]

        for file_name in files:
            expected_files.add(file_name)
        return expected_files
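Normalizing scalar-or-list values like this is a common pattern for representation data; a tiny standalone sketch:

```python
def ensure_list(value):
    # A single file is stored as a plain string, sequences as a list.
    return value if isinstance(value, list) else [value]

print(ensure_list("frame.0001.exr"))    # ['frame.0001.exr']
print(ensure_list(["a.exr", "b.exr"]))  # ['a.exr', 'b.exr']
```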
@@ -16,11 +16,13 @@ def clone_review_session(session, entity):

    # Add all invitees.
    for invitee in entity["review_session_invitees"]:
        # Make sure email is not None but string
        email = invitee["email"] or ""
        session.create(
            "ReviewSessionInvitee",
            {
                "name": invitee["name"],
                "email": invitee["email"],
                "email": email,
                "review_session": review_session
            }
        )
@@ -0,0 +1,167 @@
from openpype.modules.ftrack.lib import ServerAction


class MultipleNotesServer(ServerAction):
    """Action adds the same note to multiple AssetVersions.

    The note is added to the selection of AssetVersions. The note is created
    with the user who triggered the action. It is possible to define a note
    category for the note.
    """

    identifier = "multiple.notes.server"
    label = "Multiple Notes (Server)"
    description = "Add same note to multiple Asset Versions"

    _none_category = "__NONE__"

    def discover(self, session, entities, event):
        """Show action only on AssetVersions."""
        if not entities:
            return False

        for entity in entities:
            if entity.entity_type.lower() != "assetversion":
                return False
        return True

    def interface(self, session, entities, event):
        event_source = event["source"]
        user_info = event_source.get("user") or {}
        user_id = user_info.get("id")
        if not user_id:
            return None

        values = event["data"].get("values")
        if values:
            return None

        note_label = {
            "type": "label",
            "value": "# Enter note: #"
        }

        note_value = {
            "name": "note",
            "type": "textarea"
        }

        category_label = {
            "type": "label",
            "value": "## Category: ##"
        }

        category_data = []
        category_data.append({
            "label": "- None -",
            "value": self._none_category
        })
        all_categories = session.query(
            "select id, name from NoteCategory"
        ).all()
        for cat in all_categories:
            category_data.append({
                "label": cat["name"],
                "value": cat["id"]
            })
        category_value = {
            "type": "enumerator",
            "name": "category",
            "data": category_data,
            "value": self._none_category
        }

        splitter = {
            "type": "label",
            "value": "---"
        }

        return [
            note_label,
            note_value,
            splitter,
            category_label,
            category_value
        ]

    def launch(self, session, entities, event):
        if "values" not in event["data"]:
            return None

        values = event["data"]["values"]
        if len(values) <= 0 or "note" not in values:
            return False

        # Get Note text
        note_value = values["note"]
        if note_value.lower().strip() == "":
            return {
                "success": True,
                "message": "Note was not entered. Skipping"
            }

        # Get User
        event_source = event["source"]
        user_info = event_source.get("user") or {}
        user_id = user_info.get("id")
        user = None
        if user_id:
            user = session.query(
                'User where id is "{}"'.format(user_id)
            ).first()

        if not user:
            return {
                "success": False,
                "message": "Couldn't get user information."
            }

        # Logging message preparation
        # - username
        username = user.get("username") or "N/A"

        # - AssetVersion ids
        asset_version_ids_str = ",".join([entity["id"] for entity in entities])

        # Base note data
        note_data = {
            "content": note_value,
            "author": user
        }

        # Get category
        category_id = values["category"]
        if category_id == self._none_category:
            category_id = None

        category_name = None
        if category_id is not None:
            category = session.query(
                "select id, name from NoteCategory where id is \"{}\"".format(
                    category_id
                )
            ).first()
            if category:
                note_data["category"] = category
                category_name = category["name"]

        category_msg = ""
        if category_name:
            category_msg = " with category: \"{}\"".format(category_name)

        self.log.warning((
            "Creating note{} as User \"{}\" on "
            "AssetVersions: {} with value \"{}\""
        ).format(category_msg, username, asset_version_ids_str, note_value))

        # Create notes for entities
        for entity in entities:
            new_note = session.create("Note", note_data)
            entity["notes"].append(new_note)
        session.commit()
        return True


def register(session):
    '''Register plugin. Called when used as a plugin.'''

    MultipleNotesServer(session).register()
@@ -1,6 +1,8 @@
import json

from avalon.api import AvalonMongoDB
from openpype.api import ProjectSettings
from openpype.lib import create_project

from openpype.modules.ftrack.lib import (
    ServerAction,

@@ -21,8 +23,24 @@ class PrepareProjectServer(ServerAction):

    role_list = ["Pypeclub", "Administrator", "Project Manager"]

    # Key to store info about triggering create folder structure
    settings_key = "prepare_project"

    item_splitter = {"type": "label", "value": "---"}
    _keys_order = (
        "fps",
        "frameStart",
        "frameEnd",
        "handleStart",
        "handleEnd",
        "clipIn",
        "clipOut",
        "resolutionHeight",
        "resolutionWidth",
        "pixelAspect",
        "applications",
        "tools_env",
        "library_project",
    )

    def discover(self, session, entities, event):
        """Show only on project."""

@@ -47,13 +65,7 @@ class PrepareProjectServer(ServerAction):
        project_entity = entities[0]
        project_name = project_entity["full_name"]

        try:
            project_settings = ProjectSettings(project_name)
        except ValueError:
            return {
                "message": "Project is not synchronized yet",
                "success": False
            }
        project_settings = ProjectSettings(project_name)

        project_anatom_settings = project_settings["project_anatomy"]
        root_items = self.prepare_root_items(project_anatom_settings)

@@ -78,14 +90,13 @@ class PrepareProjectServer(ServerAction):

        items.extend(ca_items)

        # This item will be last (before enumerators)
        # - sets value of auto synchronization
        auto_sync_name = "avalon_auto_sync"
        # This item will be last before enumerators
        # Set value of auto synchronization
        auto_sync_value = project_entity["custom_attributes"].get(
            CUST_ATTR_AUTO_SYNC, False
        )
        auto_sync_item = {
            "name": auto_sync_name,
            "name": CUST_ATTR_AUTO_SYNC,
            "type": "boolean",
            "value": auto_sync_value,
            "label": "AutoSync to Avalon"

@@ -199,7 +210,18 @@ class PrepareProjectServer(ServerAction):
            str([key for key in attributes_to_set])
        ))

        for key, in_data in attributes_to_set.items():
        attribute_keys = set(attributes_to_set.keys())
        keys_order = []
        for key in self._keys_order:
            if key in attribute_keys:
                keys_order.append(key)

        attribute_keys = attribute_keys - set(keys_order)
        for key in sorted(attribute_keys):
            keys_order.append(key)
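The ordering logic above, preferred keys first and the remainder alphabetically, is easy to exercise on its own; a compact sketch:

```python
_KEYS_ORDER = ("fps", "frameStart", "frameEnd")

def ordered_keys(keys):
    keys = set(keys)
    preferred = [key for key in _KEYS_ORDER if key in keys]
    remainder = sorted(keys - set(preferred))
    return preferred + remainder

print(ordered_keys({"pixelAspect", "fps", "applications", "frameEnd"}))
# ['fps', 'frameEnd', 'applications', 'pixelAspect']
```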
        for key in keys_order:
            in_data = attributes_to_set[key]
            attr = in_data["object"]

            # initial item definition

@@ -225,7 +247,7 @@ class PrepareProjectServer(ServerAction):
            multiselect_enumerators.append(self.item_splitter)
            multiselect_enumerators.append({
                "type": "label",
                "value": in_data["label"]
                "value": "<h3>{}</h3>".format(in_data["label"])
            })

            default = in_data["default"]

@@ -286,10 +308,10 @@ class PrepareProjectServer(ServerAction):
        return items, multiselect_enumerators

    def launch(self, session, entities, event):
        if not event['data'].get('values', {}):
        in_data = event["data"].get("values")
        if not in_data:
            return

        in_data = event['data']['values']

        root_values = {}
        root_key = "__root__"

@@ -337,7 +359,27 @@ class PrepareProjectServer(ServerAction):

        self.log.debug("Setting Custom Attribute values")

        project_name = entities[0]["full_name"]
        project_entity = entities[0]
        project_name = project_entity["full_name"]

        # Try to find project document
        dbcon = AvalonMongoDB()
        dbcon.install()
        dbcon.Session["AVALON_PROJECT"] = project_name
        project_doc = dbcon.find_one({
            "type": "project"
        })
        # Create project if it is not available
        # - creation is required to be able to set project anatomy and
        #   attributes
        if not project_doc:
            project_code = project_entity["name"]
            self.log.info("Creating project \"{} [{}]\"".format(
                project_name, project_code
            ))
            create_project(project_name, project_code, dbcon=dbcon)

        dbcon.uninstall()

        project_settings = ProjectSettings(project_name)
        project_anatomy_settings = project_settings["project_anatomy"]
        project_anatomy_settings["roots"] = root_data

@@ -352,10 +394,12 @@ class PrepareProjectServer(ServerAction):

        project_settings.save()

        entity = entities[0]
        for key, value in custom_attribute_values.items():
            entity["custom_attributes"][key] = value
            self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
        # Change custom attributes on project
        if custom_attribute_values:
            for key, value in custom_attribute_values.items():
                project_entity["custom_attributes"][key] = value
                self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
            session.commit()

        return True
@@ -0,0 +1,61 @@
from openpype.modules.ftrack.lib import ServerAction


class PrivateProjectDetectionAction(ServerAction):
    """Action helps to identify that the server has no access to project."""

    identifier = "server.missing.perm.private.project"
    label = "Missing permissions"
    description = (
        "Main ftrack event server does not have access to this project."
    )

    def _discover(self, event):
        """Show action only if there is a selection in event data."""
        entities = self._translate_event(event)
        if entities:
            return None

        selection = event["data"].get("selection")
        if not selection:
            return None

        return {
            "items": [{
                "label": self.label,
                "variant": self.variant,
                "description": self.description,
                "actionIdentifier": self.discover_identifier,
                "icon": self.icon,
            }]
        }

    def _launch(self, event):
        # Ignore if there are values in event data
        # - somebody clicked on submit button
        values = event["data"].get("values")
        if values:
            return None

        title = "# Private project (missing permissions) #"
        msg = (
            "User ({}) or API Key used on Ftrack event server"
            " does not have permissions to access this private project."
        ).format(self.session.api_user)
        return {
            "type": "form",
            "title": "Missing permissions",
            "items": [
                {"type": "label", "value": title},
                {"type": "label", "value": msg},
                # Add hidden input to be able to detect a click on submit
                {"type": "hidden", "value": "1", "name": "hidden"}
            ],
            "submit_button_label": "Got it"
        }


def register(session):
    '''Register plugin. Called when used as a plugin.'''

    PrivateProjectDetectionAction(session).register()
@@ -1,3 +1,4 @@
import sys
import json
import collections
import ftrack_api

@@ -90,27 +91,28 @@ class PushHierValuesToNonHier(ServerAction):

        try:
            result = self.propagate_values(session, event, entities)
            job["status"] = "done"
            session.commit()

            return result

        except Exception:
            session.rollback()
            job["status"] = "failed"
            session.commit()

        except Exception as exc:
            msg = "Pushing Custom attribute values to task Failed"

            self.log.warning(msg, exc_info=True)

            session.rollback()

            description = "{} (Download traceback)".format(msg)
            self.add_traceback_to_job(
                job, session, sys.exc_info(), description
            )

            return {
                "success": False,
                "message": msg
                "message": "Error: {}".format(str(exc))
            }

        finally:
            if job["status"] == "running":
                job["status"] = "failed"
            session.commit()
            job["status"] = "done"
            session.commit()

        return result

    def attrs_configurations(self, session, object_ids, interest_attributes):
        attrs = session.query(self.cust_attrs_query.format(
@@ -1259,7 +1259,7 @@ class SyncToAvalonEvent(BaseEvent):
            self.process_session,
            entity,
            hier_attrs,
            self.cust_attr_types_by_id
            self.cust_attr_types_by_id.values()
        )
        for key, val in hier_values.items():
            output[key] = val
@@ -11,29 +11,44 @@ from avalon.api import AvalonMongoDB


class AppplicationsAction(BaseAction):
    """Application Action class.

    Args:
        session (ftrack_api.Session): Session where action will be registered.
        label (str): A descriptive string identifying your action.
        variant (str, optional): To group actions together, give them the same
            label and specify a unique variant per action.
        identifier (str): A unique identifier for the app.
        description (str): A verbose descriptive text for your action.
        icon (str): Url path to icon which will be shown in Ftrack web.
    """
    """Applications Action class."""

    type = "Application"
    label = "Application action"
    identifier = "pype_app.{}.".format(str(uuid4()))

    identifier = "openpype_app"
    _launch_identifier_with_id = None

    icon_url = os.environ.get("OPENPYPE_STATICS_SERVER")

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        super(AppplicationsAction, self).__init__(*args, **kwargs)

        self.application_manager = ApplicationManager()
        self.dbcon = AvalonMongoDB()

    @property
    def discover_identifier(self):
        if self._discover_identifier is None:
            self._discover_identifier = "{}.{}".format(
                self.identifier, self.process_identifier()
            )
        return self._discover_identifier

    @property
    def launch_identifier(self):
        if self._launch_identifier is None:
            self._launch_identifier = "{}.*".format(self.identifier)
        return self._launch_identifier

    @property
    def launch_identifier_with_id(self):
        if self._launch_identifier_with_id is None:
            self._launch_identifier_with_id = "{}.{}".format(
                self.identifier, self.process_identifier()
            )
        return self._launch_identifier_with_id

    def construct_requirements_validations(self):
        # Override validation as this action does not need them
        return

@@ -56,7 +71,7 @@ class AppplicationsAction(BaseAction):
            " and data.actionIdentifier={0}"
            " and source.user.username={1}"
        ).format(
            self.identifier + "*",
            self.launch_identifier,
            self.session.api_user
        )
        self.session.event_hub.subscribe(

@@ -136,12 +151,29 @@ class AppplicationsAction(BaseAction):
                "label": app.group.label,
                "variant": app.label,
                "description": None,
                "actionIdentifier": self.identifier + app_name,
                "actionIdentifier": "{}.{}".format(
                    self.launch_identifier_with_id, app_name
                ),
                "icon": app_icon
            })

        return items

    def _launch(self, event):
        event_identifier = event["data"]["actionIdentifier"]
        # Check if the identifier is the same
        # - show message that the action may not be triggered on this machine
        if event_identifier.startswith(self.launch_identifier_with_id):
            return BaseAction._launch(self, event)

        return {
            "success": False,
            "message": (
                "There are more OpenPype processes running"
                " where the application can be launched."
            )
        }

    def launch(self, session, entities, event):
        """Callback method for the custom action.

@@ -162,7 +194,8 @@ class AppplicationsAction(BaseAction):
        *event* the unmodified original event
        """
        identifier = event["data"]["actionIdentifier"]
        app_name = identifier[len(self.identifier):]
        id_identifier_len = len(self.launch_identifier_with_id) + 1
        app_name = identifier[id_identifier_len:]

        entity = entities[0]
@@ -1,5 +1,6 @@
import os
import re
import json

from openpype.modules.ftrack.lib import BaseAction, statics_icon
from openpype.api import Anatomy, get_project_settings

@@ -84,6 +85,9 @@ class CreateProjectFolders(BaseAction):
            }

        try:
            if isinstance(project_folder_structure, str):
                project_folder_structure = json.loads(project_folder_structure)

            # Get paths based on presets
            basic_paths = self.get_path_items(project_folder_structure)
            self.create_folders(basic_paths, project_entity)
@@ -9,16 +9,24 @@ class MultipleNotes(BaseAction):
    #: Action label.
    label = 'Multiple Notes'
    #: Action description.
    description = 'Add same note to multiple Asset Versions'
    description = 'Add same note to multiple entities'
    icon = statics_icon("ftrack", "action_icons", "MultipleNotes.svg")

    def discover(self, session, entities, event):
        ''' Validation '''
        valid = True

        # Check for multiple selection.
        if len(entities) < 2:
            valid = False

        # Check for valid entities.
        valid_entity_types = ['assetversion', 'task']
        for entity in entities:
            if entity.entity_type.lower() != 'assetversion':
            if entity.entity_type.lower() not in valid_entity_types:
                valid = False
                break

        return valid

    def interface(self, session, entities, event):

@@ -58,7 +66,7 @@ class MultipleNotes(BaseAction):

        splitter = {
            'type': 'label',
            'value': '{}'.format(200*"-")
            'value': '{}'.format(200 * "-")
        }

        items = []
@@ -1,6 +1,8 @@
import json

from avalon.api import AvalonMongoDB
from openpype.api import ProjectSettings
from openpype.lib import create_project

from openpype.modules.ftrack.lib import (
    BaseAction,

@@ -23,7 +25,24 @@ class PrepareProjectLocal(BaseAction):
    settings_key = "prepare_project"

    # Key to store info about triggering create folder structure
    create_project_structure_key = "create_folder_structure"
    create_project_structure_identifier = "create.project.structure"
    item_splitter = {"type": "label", "value": "---"}
    _keys_order = (
        "fps",
        "frameStart",
        "frameEnd",
        "handleStart",
        "handleEnd",
        "clipIn",
        "clipOut",
        "resolutionHeight",
        "resolutionWidth",
        "pixelAspect",
        "applications",
        "tools_env",
        "library_project",
    )

    def discover(self, session, entities, event):
        """Show only on project."""

@@ -48,13 +67,7 @@ class PrepareProjectLocal(BaseAction):
        project_entity = entities[0]
        project_name = project_entity["full_name"]

        try:
            project_settings = ProjectSettings(project_name)
        except ValueError:
            return {
                "message": "Project is not synchronized yet",
                "success": False
            }
        project_settings = ProjectSettings(project_name)

        project_anatom_settings = project_settings["project_anatomy"]
        root_items = self.prepare_root_items(project_anatom_settings)

@@ -79,14 +92,12 @@ class PrepareProjectLocal(BaseAction):

        items.extend(ca_items)

        # This item will be last (before enumerators)
        # - sets value of auto synchronization
        auto_sync_name = "avalon_auto_sync"
        # Set value of auto synchronization
        auto_sync_value = project_entity["custom_attributes"].get(
            CUST_ATTR_AUTO_SYNC, False
        )
        auto_sync_item = {
            "name": auto_sync_name,
            "name": CUST_ATTR_AUTO_SYNC,
            "type": "boolean",
            "value": auto_sync_value,
            "label": "AutoSync to Avalon"

@@ -94,6 +105,27 @@ class PrepareProjectLocal(BaseAction):
        # Add autosync attribute
        items.append(auto_sync_item)

        # This item will be last before enumerators
        # Ask whether to trigger the Create Folder Structure action
        create_project_structure_checked = (
            project_settings
            ["project_settings"]
            ["ftrack"]
            ["user_handlers"]
            ["prepare_project"]
            ["create_project_structure_checked"]
        ).value
        items.append({
            "type": "label",
            "value": "<h3>Want to create basic Folder Structure?</h3>"
        })
        items.append({
            "name": self.create_project_structure_key,
            "type": "boolean",
            "value": create_project_structure_checked,
            "label": "Check if Yes"
        })

        # Add enumerator items at the end
        for item in multiselect_enumerators:
            items.append(item)
@ -200,7 +232,18 @@ class PrepareProjectLocal(BaseAction):
|
|||
str([key for key in attributes_to_set])
|
||||
))
|
||||
|
||||
for key, in_data in attributes_to_set.items():
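        # Build a deterministic key order: keys from _keys_order first,
        # then any remaining attribute keys alphabetically.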
        attribute_keys = set(attributes_to_set.keys())
        keys_order = []
        for key in self._keys_order:
            if key in attribute_keys:
                keys_order.append(key)

        attribute_keys = attribute_keys - set(keys_order)
        for key in sorted(attribute_keys):
            keys_order.append(key)

        for key in keys_order:
            in_data = attributes_to_set[key]
            attr = in_data["object"]

            # initial item definition

@@ -226,7 +269,7 @@ class PrepareProjectLocal(BaseAction):
            multiselect_enumerators.append(self.item_splitter)
            multiselect_enumerators.append({
                "type": "label",
                "value": in_data["label"]
                "value": "<h3>{}</h3>".format(in_data["label"])
            })

            default = in_data["default"]

@@ -287,10 +330,13 @@ class PrepareProjectLocal(BaseAction):
        return items, multiselect_enumerators

    def launch(self, session, entities, event):
        if not event['data'].get('values', {}):
        in_data = event["data"].get("values")
        if not in_data:
            return

        in_data = event['data']['values']
        create_project_structure_checked = in_data.pop(
            self.create_project_structure_key
        )

        root_values = {}
        root_key = "__root__"

@@ -338,7 +384,27 @@ class PrepareProjectLocal(BaseAction):

        self.log.debug("Setting Custom Attribute values")

        project_name = entities[0]["full_name"]
        project_entity = entities[0]
        project_name = project_entity["full_name"]

        # Try to find project document
        dbcon = AvalonMongoDB()
        dbcon.install()
        dbcon.Session["AVALON_PROJECT"] = project_name
        project_doc = dbcon.find_one({
            "type": "project"
        })
        # Create project if it is not available
        # - creation is required to be able to set project anatomy and
        #   attributes
        if not project_doc:
            project_code = project_entity["name"]
            self.log.info("Creating project \"{} [{}]\"".format(
                project_name, project_code
            ))
            create_project(project_name, project_code, dbcon=dbcon)

        dbcon.uninstall()

        project_settings = ProjectSettings(project_name)
        project_anatomy_settings = project_settings["project_anatomy"]
        project_anatomy_settings["roots"] = root_data

@@ -353,11 +419,20 @@ class PrepareProjectLocal(BaseAction):

        project_settings.save()

        entity = entities[0]
        for key, value in custom_attribute_values.items():
            entity["custom_attributes"][key] = value
            self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
        # Change custom attributes on project
        if custom_attribute_values:
            for key, value in custom_attribute_values.items():
                project_entity["custom_attributes"][key] = value
                self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
            session.commit()

        # Trigger create project structure action
        if create_project_structure_checked:
            trigger_identifier = "{}.{}".format(
                self.create_project_structure_identifier,
                self.process_identifier()
            )
            self.trigger_action(trigger_identifier, event)
        return True
@@ -1,33 +1,98 @@
import platform
import socket
import getpass

from openpype.modules.ftrack.lib import BaseAction, statics_icon


class ActionAskWhereIRun(BaseAction):
    """ Sometimes a user forgets where a pipeline with their credentials runs.
    - this action triggers `ActionShowWhereIRun`
    """
    ignore_me = True
    identifier = 'ask.where.i.run'
    label = 'Ask where I run'
    description = 'Triggers PC info where the user has OpenPype running'
    icon = statics_icon("ftrack", "action_icons", "ActionAskWhereIRun.svg")
class ActionWhereIRun(BaseAction):
    """Show where the same user has running OpenPype instances."""

    def discover(self, session, entities, event):
        """ Hide by default - Should be enabled only if you want to run.
        - best practice is to create another action that triggers this one
        """
    identifier = "ask.where.i.run"
    show_identifier = "show.where.i.run"
    label = "OpenPype Admin"
    variant = "- Where I run"
    description = "Show PC info where the user has OpenPype running"

        return True
    def _discover(self, _event):
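        # Advertise the action without selection-based filtering.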
        return {
            "items": [{
                "label": self.label,
                "variant": self.variant,
                "description": self.description,
                "actionIdentifier": self.discover_identifier,
                "icon": self.icon,
            }]
        }

    def launch(self, session, entities, event):
        more_data = {"event_hub_id": session.event_hub.id}
        self.trigger_action(
            "show.where.i.run", event, additional_event_data=more_data
    def _launch(self, event):
        self.trigger_action(self.show_identifier, event)

    def register(self):
        # Register default action callbacks
        super(ActionWhereIRun, self).register()

        # Add show identifier
        show_subscription = (
            "topic=ftrack.action.launch"
            " and data.actionIdentifier={}"
            " and source.user.username={}"
        ).format(
            self.show_identifier,
            self.session.api_user
        )
        self.session.event_hub.subscribe(
            show_subscription,
            self._show_info
        )

        return True
    def _show_info(self, event):
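        # Gather hostname, IP, username and platform details and present
        # them to the user through the ftrack interface.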
        title = "Where Do I Run?"
        msgs = {}
        all_keys = ["Hostname", "IP", "Username", "System name", "PC name"]
        try:
            host_name = socket.gethostname()
            msgs["Hostname"] = host_name
            host_ip = socket.gethostbyname(host_name)
            msgs["IP"] = host_ip
        except Exception:
            pass

        try:
            system_name, pc_name, *_ = platform.uname()
            msgs["System name"] = system_name
            msgs["PC name"] = pc_name
        except Exception:
            pass

        try:
            msgs["Username"] = getpass.getuser()
        except Exception:
            pass

        for key in all_keys:
            if not msgs.get(key):
                msgs[key] = "-Undefined-"

        items = []
        first = True
        separator = {"type": "label", "value": "---"}
        for key, value in msgs.items():
            if first:
                first = False
            else:
                items.append(separator)
            self.log.debug("{}: {}".format(key, value))

            subtitle = {"type": "label", "value": "<h3>{}</h3>".format(key)}
            items.append(subtitle)
            message = {"type": "label", "value": "<p>{}</p>".format(value)}
            items.append(message)

        self.show_interface(items, title, event=event)


def register(session):
    '''Register plugin. Called when used as a plugin.'''

    ActionAskWhereIRun(session).register()
    ActionWhereIRun(session).register()
@@ -1,82 +0,0 @@
import platform
import socket
import getpass
from openpype.modules.ftrack.lib import BaseAction


class ActionShowWhereIRun(BaseAction):
    """ Sometimes a user forgets where a pipeline with their credentials runs.
    - this action shows on which PC, username and IP it is running
    - requirement: the action MUST be registered where we want to locate
      the PC
    - - can't be used retrospectively...
    """
    #: Action identifier.
    identifier = 'show.where.i.run'
    #: Action label.
    label = 'Show where I run'
    #: Action description.
    description = 'Shows PC info where the user has OpenPype running'

    def discover(self, session, entities, event):
        """ Hide by default - Should be enabled only if you want to run.
        - best practice is to create another action that triggers this one
        """

        return False

    def launch(self, session, entities, event):
        # Don't show info when launched from this session
        if session.event_hub.id == event.get("data", {}).get("event_hub_id"):
            return True

        title = "Where Do I Run?"
        msgs = {}
        all_keys = ["Hostname", "IP", "Username", "System name", "PC name"]
        try:
            host_name = socket.gethostname()
            msgs["Hostname"] = host_name
            host_ip = socket.gethostbyname(host_name)
            msgs["IP"] = host_ip
        except Exception:
            pass

        try:
            system_name, pc_name, *_ = platform.uname()
            msgs["System name"] = system_name
            msgs["PC name"] = pc_name
        except Exception:
            pass

        try:
            msgs["Username"] = getpass.getuser()
        except Exception:
            pass

        for key in all_keys:
            if not msgs.get(key):
                msgs[key] = "-Undefined-"

        items = []
        first = True
        splitter = {'type': 'label', 'value': '---'}
        for key, value in msgs.items():
            if first:
                first = False
            else:
                items.append(splitter)
            self.log.debug("{}: {}".format(key, value))

            subtitle = {'type': 'label', 'value': '<h3>{}</h3>'.format(key)}
            items.append(subtitle)
            message = {'type': 'label', 'value': '<p>{}</p>'.format(value)}
            items.append(message)

        self.show_interface(items, title, event=event)

        return True


def register(session):
    '''Register plugin. Called when used as a plugin.'''

    ActionShowWhereIRun(session).register()
@@ -29,6 +29,9 @@ class BaseAction(BaseHandler):
    icon = None
    type = 'Action'

    _discover_identifier = None
    _launch_identifier = None

    settings_frack_subkey = "user_handlers"
    settings_enabled_key = "enabled"

@@ -42,6 +45,22 @@ class BaseAction(BaseHandler):

        super().__init__(session)

    @property
    def discover_identifier(self):
        if self._discover_identifier is None:
            self._discover_identifier = "{}.{}".format(
                self.identifier, self.process_identifier()
            )
        return self._discover_identifier

    @property
    def launch_identifier(self):
        if self._launch_identifier is None:
            self._launch_identifier = "{}.{}".format(
                self.identifier, self.process_identifier()
            )
        return self._launch_identifier

    def register(self):
        '''
        Registers the action, subscribing to the discover and launch topics.

@@ -60,7 +79,7 @@ class BaseAction(BaseHandler):
            ' and data.actionIdentifier={0}'
            ' and source.user.username={1}'
        ).format(
            self.identifier,
            self.launch_identifier,
            self.session.api_user
        )
        self.session.event_hub.subscribe(

@@ -86,7 +105,7 @@ class BaseAction(BaseHandler):
                'label': self.label,
                'variant': self.variant,
                'description': self.description,
                'actionIdentifier': self.identifier,
                'actionIdentifier': self.discover_identifier,
                'icon': self.icon,
            }]
        }

@@ -309,6 +328,78 @@ class BaseAction(BaseHandler):
        return True


class LocalAction(BaseAction):
    """Action that warns the user when more processes with the same action run.

    The action is launched every time, but if the event id does not match
    the id of the current instance, a message is shown to the user.

    Handy for actions where it matters on which machine they are executed.
    """
    _full_launch_identifier = None

    @property
    def discover_identifier(self):
        if self._discover_identifier is None:
            self._discover_identifier = "{}.{}".format(
                self.identifier, self.process_identifier()
            )
        return self._discover_identifier

    @property
    def launch_identifier(self):
        """Catch all topics with the same identifier."""
        if self._launch_identifier is None:
            self._launch_identifier = "{}.*".format(self.identifier)
        return self._launch_identifier

    @property
    def full_launch_identifier(self):
        """Launch identifier of this specific process."""
        if self._full_launch_identifier is None:
            self._full_launch_identifier = "{}.{}".format(
                self.identifier, self.process_identifier()
            )
        return self._full_launch_identifier

    def _discover(self, event):
        entities = self._translate_event(event)
        if not entities:
            return

        accepts = self.discover(self.session, entities, event)
        if not accepts:
            return

        self.log.debug("Discovering action with selection: {0}".format(
            event["data"].get("selection", [])
        ))

        return {
            "items": [{
                "label": self.label,
                "variant": self.variant,
                "description": self.description,
                "actionIdentifier": self.discover_identifier,
                "icon": self.icon,
            }]
        }

    def _launch(self, event):
        event_identifier = event["data"]["actionIdentifier"]
        # Check if identifier is the same
        # - show message that the action may not be triggered on this machine
        if event_identifier != self.full_launch_identifier:
            return {
                "success": False,
                "message": (
                    "There are more OpenPype processes running"
                    " where this action could be launched."
                )
            }
        return super(LocalAction, self)._launch(event)


class ServerAction(BaseAction):
    """Action class meant to be used on event server.


@@ -318,6 +409,14 @@ class ServerAction(BaseAction):

    settings_frack_subkey = "events"

    @property
    def discover_identifier(self):
        return self.identifier

    @property
    def launch_identifier(self):
        return self.identifier

    def register(self):
        """Register subscription to Ftrack event hub."""
        self.session.event_hub.subscribe(

@@ -328,5 +427,5 @@ class ServerAction(BaseAction):

        launch_subscription = (
            "topic=ftrack.action.launch and data.actionIdentifier={0}"
        ).format(self.identifier)
        ).format(self.launch_identifier)
        self.session.event_hub.subscribe(launch_subscription, self._launch)
@@ -1,4 +1,10 @@
import os
import tempfile
import json
import functools
import uuid
import datetime
import traceback
import time
from openpype.api import Logger
from openpype.settings import get_project_settings

@@ -31,6 +37,7 @@ class BaseHandler(object):
    <description> - a verbose descriptive text for your action
    <icon> - icon in ftrack
    '''
    _process_id = None
    # Default priority is 100
    priority = 100
    # Type is just for logging purposes (e.g.: Action, Event, Application,...)

@@ -65,6 +72,13 @@ class BaseHandler(object):
        self.register = self.register_decorator(self.register)
        self.launch = self.launch_log(self.launch)

    @staticmethod
    def process_identifier():
        """Return unique identifier of the running process (lazy UUID)."""
        if not BaseHandler._process_id:
            BaseHandler._process_id = str(uuid.uuid4())
        return BaseHandler._process_id

    # Decorator
    def register_decorator(self, func):
        @functools.wraps(func)

@@ -177,15 +191,22 @@ class BaseHandler(object):
        if session is None:
            session = self.session

        _entities = event['data'].get('entities_object', None)
        _entities = event["data"].get("entities_object", None)
        if _entities is not None and not _entities:
            return _entities

        if (
            _entities is None or
            _entities[0].get(
                'link', None
            _entities is None
            or _entities[0].get(
                "link", None
            ) == ftrack_api.symbol.NOT_SET
        ):
            _entities = self._get_entities(event)
            event['data']['entities_object'] = _entities
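            # Re-query entities and drop any that could not be resolved.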
            _entities = [
                item
                for item in self._get_entities(event)
                if item is not None
            ]
            event["data"]["entities_object"] = _entities

        return _entities

@@ -583,3 +604,105 @@ class BaseHandler(object):
        return "/".join(
            [ent["name"] for ent in entity["link"]]
        )

    @classmethod
    def add_traceback_to_job(
        cls, job, session, exc_info,
        description=None,
        component_name=None,
        job_status=None
    ):
        """Add traceback file to a job.

        Args:
            job (JobEntity): Entity of job where the file should be available
                to download (created or queried with passed session).
            session (Session): Ftrack session which was used to query/create
                entered job.
            exc_info (tuple): Exception info (e.g. from `sys.exc_info()`).
            description (str): Change job description to describe what
                happened. Job description won't change if not passed.
            component_name (str): Name of component and default name of
                downloaded file. Class name and current date time are used if
                not specified.
            job_status (str): Status of job which will be set. By default is
                set to 'failed'.
        """
        if description:
            job_data = {
                "description": description
            }
            job["data"] = json.dumps(job_data)

        if not job_status:
            job_status = "failed"

        job["status"] = job_status

        # Create temp file where traceback will be stored
        temp_obj = tempfile.NamedTemporaryFile(
            mode="w", prefix="openpype_ftrack_", suffix=".txt", delete=False
        )
        temp_obj.close()
        temp_filepath = temp_obj.name

        # Store traceback to file
        result = traceback.format_exception(*exc_info)
        with open(temp_filepath, "w") as temp_file:
            temp_file.write("".join(result))

        # Upload file with traceback to ftrack server and add it to job
        if not component_name:
            component_name = "{}_{}".format(
                cls.__name__,
                datetime.datetime.now().strftime("%y-%m-%d-%H%M")
            )
        cls.add_file_component_to_job(
            job, session, temp_filepath, component_name
        )
        # Delete temp file
        os.remove(temp_filepath)

    @staticmethod
    def add_file_component_to_job(job, session, filepath, basename=None):
        """Add filepath as downloadable component to job.

        Args:
            job (JobEntity): Entity of job where the file should be available
                to download (created or queried with passed session).
            session (Session): Ftrack session which was used to query/create
                entered job.
            filepath (str): Path to file which should be added to job.
            basename (str): Defines name of file which will be downloaded on
                user's side. Must be without extension otherwise extension
                will be duplicated in downloaded name. Basename from the
                entered path is used when not entered.
        """
        # Make sure session's locations are configured
        # - they can be deconfigured e.g. using `rollback` method
        session._configure_locations()

        # Query `ftrack.server` location where component will be stored
        location = session.query(
            "Location where name is \"ftrack.server\""
        ).one()

        # Use filename as basename if not entered (must be without extension)
        if basename is None:
            basename = os.path.splitext(
                os.path.basename(filepath)
            )[0]

        component = session.create_component(
            filepath,
            data={"name": basename},
            location=location
        )
        session.create(
            "JobComponent",
            {
                "component_id": component["id"],
                "job_id": job["id"]
            }
        )
        session.commit()
@@ -51,7 +51,7 @@ class CollectFtrackFamily(pyblish.api.InstancePlugin):
        families = instance.data.get("families")
        add_ftrack_family = profile["add_ftrack_family"]

        additional_filters = profile.get("additional_filters")
        additional_filters = profile.get("advanced_filtering")
        if additional_filters:
            add_ftrack_family = self._get_add_ftrack_f_from_addit_filters(
                additional_filters,
@@ -7,12 +7,13 @@ class LogsWindow(QtWidgets.QWidget):
    def __init__(self, parent=None):
        super(LogsWindow, self).__init__(parent)

        self.setStyleSheet(style.load_stylesheet())
        self.setWindowTitle("Logs viewer")

        self.resize(1400, 800)
        log_detail = OutputWidget(parent=self)
        logs_widget = LogsWidget(log_detail, parent=self)

        main_layout = QtWidgets.QHBoxLayout()
        main_layout = QtWidgets.QHBoxLayout(self)

        log_splitter = QtWidgets.QSplitter(self)
        log_splitter.setOrientation(QtCore.Qt.Horizontal)

@@ -24,5 +25,4 @@ class LogsWindow(QtWidgets.QWidget):
        self.logs_widget = logs_widget
        self.log_detail = log_detail

        self.setLayout(main_layout)
        self.setWindowTitle("Logs")
        self.setStyleSheet(style.load_stylesheet())
@@ -77,12 +77,10 @@ class CustomCombo(QtWidgets.QWidget):
        toolbutton.setMenu(toolmenu)
        toolbutton.setPopupMode(QtWidgets.QToolButton.MenuButtonPopup)

        layout = QtWidgets.QHBoxLayout()
        layout = QtWidgets.QHBoxLayout(self)
        layout.setContentsMargins(0, 0, 0, 0)
        layout.addWidget(toolbutton)

        self.setLayout(layout)

        toolmenu.selection_changed.connect(self.selection_changed)

        self.toolbutton = toolbutton

@@ -141,7 +139,6 @@ class LogsWidget(QtWidgets.QWidget):
        filter_layout.addWidget(refresh_btn)

        view = QtWidgets.QTreeView(self)
        view.setAllColumnsShowFocus(True)
        view.setEditTriggers(QtWidgets.QAbstractItemView.NoEditTriggers)

        layout = QtWidgets.QVBoxLayout(self)

@@ -229,9 +226,9 @@ class OutputWidget(QtWidgets.QWidget):
        super(OutputWidget, self).__init__(parent=parent)
        layout = QtWidgets.QVBoxLayout(self)

        show_timecode_checkbox = QtWidgets.QCheckBox("Show timestamp")
        show_timecode_checkbox = QtWidgets.QCheckBox("Show timestamp", self)

        output_text = QtWidgets.QTextEdit()
        output_text = QtWidgets.QTextEdit(self)
        output_text.setReadOnly(True)
        # output_text.setLineWrapMode(QtWidgets.QTextEdit.FixedPixelWidth)
@@ -3,6 +3,7 @@ from openpype.api import Logger

log = Logger().get_logger("Event processor")


class TimersManagerModuleRestApi:
    """
    REST API endpoint used by hosts when their context changes

@@ -22,6 +23,11 @@ class TimersManagerModuleRestApi:
            self.prefix + "/start_timer",
            self.start_timer
        )
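        # Route that lets hosts explicitly stop the currently running timer.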
        self.server_manager.add_route(
            "POST",
            self.prefix + "/stop_timer",
            self.stop_timer
        )

    async def start_timer(self, request):
        data = await request.json()

@@ -38,3 +44,7 @@ class TimersManagerModuleRestApi:
        self.module.stop_timers()
        self.module.start_timer(project_name, asset_name, task_name, hierarchy)
        return Response(status=200)

    async def stop_timer(self, request):
        self.module.stop_timers()
        return Response(status=200)
@@ -44,7 +44,8 @@ class ExtractBurnin(openpype.api.Extractor):
        "harmony",
        "fusion",
        "aftereffects",
        "tvpaint"
        "tvpaint",
        "aftereffects"
        # "resolve"
    ]
    optional = True
@@ -44,7 +44,8 @@ class ExtractReview(pyblish.api.InstancePlugin):
        "standalonepublisher",
        "fusion",
        "tvpaint",
        "resolve"
        "resolve",
        "aftereffects"
    ]

    # Supported extensions
@@ -303,6 +303,8 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
        key_values = {"families": family, "tasks": task_name}
        profile = filter_profiles(self.template_name_profiles, key_values,
                                  logger=self.log)

        template_name = "publish"
        if profile:
            template_name = profile["template_name"]

@@ -380,7 +382,12 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):

        test_dest_files = list()
        for i in [1, 2]:
            template_data["frame"] = src_padding_exp % i
            template_data["representation"] = repre['ext']
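            # UDIM representations are padded through the "udim" key;
            # all other representations keep using the "frame" key.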
            if not repre.get("udim"):
                template_data["frame"] = src_padding_exp % i
            else:
                template_data["udim"] = src_padding_exp % i

            anatomy_filled = anatomy.format(template_data)
            template_filled = anatomy_filled[template_name]["path"]
            if repre_context is None:

@@ -388,7 +395,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
            test_dest_files.append(
                os.path.normpath(template_filled)
            )
        template_data["frame"] = repre_context["frame"]
        if not repre.get("udim"):
            template_data["frame"] = repre_context["frame"]
        else:
            template_data["udim"] = repre_context["udim"]

        self.log.debug(
            "test_dest_files: {}".format(str(test_dest_files)))

@@ -453,7 +463,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
            dst_start_frame = dst_padding

        # Store used frame value to template data
        template_data["frame"] = dst_start_frame
        if repre.get("frame"):
            template_data["frame"] = dst_start_frame

        dst = "{0}{1}{2}".format(
            dst_head,
            dst_start_frame,

@@ -476,6 +488,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
                "Given file name is a full path"
            )

        template_data["representation"] = repre['ext']
        # Store used udim value to template data
        if repre.get("udim"):
            template_data["udim"] = repre["udim"][0]
        src = os.path.join(stagingdir, fname)
        anatomy_filled = anatomy.format(template_data)
        template_filled = anatomy_filled[template_name]["path"]

@@ -488,6 +504,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
        repre['published_path'] = dst
        self.log.debug("__ dst: {}".format(dst))

        if repre.get("udim"):
            repre_context["udim"] = repre.get("udim")  # store list

        repre["publishedFiles"] = published_files

        for key in self.db_representation_context_keys:

@@ -1045,6 +1064,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
                )
            )
            shutil.copy(file_url, new_name)
            os.remove(file_url)
        else:
            self.log.debug(
                "Renaming file {} to {}".format(
@@ -11,11 +11,12 @@ class ValidateEditorialAssetName(pyblish.api.ContextPlugin):
    """

    order = pyblish.api.ValidatorOrder
    label = "Validate Asset Name"
    label = "Validate Editorial Asset Name"

    def process(self, context):

        asset_and_parents = self.get_parents(context)
        self.log.debug("__ asset_and_parents: {}".format(asset_and_parents))

        if not io.Session:
            io.install()

@@ -25,7 +26,8 @@ class ValidateEditorialAssetName(pyblish.api.ContextPlugin):
        self.log.debug("__ db_assets: {}".format(db_assets))

        asset_db_docs = {
            str(e["name"]): e["data"]["parents"] for e in db_assets}
            str(e["name"]): e["data"]["parents"]
            for e in db_assets}

        self.log.debug("__ project_entities: {}".format(
            pformat(asset_db_docs)))

@@ -107,6 +109,7 @@ class ValidateEditorialAssetName(pyblish.api.ContextPlugin):
            parents = instance.data["parents"]

            return_dict.update({
                asset: [p["entity_name"] for p in parents]
                asset: [p["entity_name"] for p in parents
                        if p["entity_type"].lower() != "project"]
            })
        return return_dict
@@ -92,15 +92,16 @@ class RepairSelectInvalidInstances(pyblish.api.Action):

        context_asset = context.data["assetEntity"]["name"]
        for instance in instances:
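            # Nuke instances are recreated to update the asset knob;
            # other hosts set the attribute in place.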
            self.set_attribute(instance, context_asset)
            if "nuke" in pyblish.api.registered_hosts():
                import openpype.hosts.nuke.api as nuke_api
                origin_node = instance[0]
                nuke_api.lib.recreate_instance(
                    origin_node, avalon_data={"asset": context_asset}
                )
            else:
                self.set_attribute(instance, context_asset)

    def set_attribute(self, instance, context_asset):
        if "nuke" in pyblish.api.registered_hosts():
            import nuke
            nuke.toNode(
                instance.data.get("name")
            )["avalon:asset"].setValue(context_asset)

        if "maya" in pyblish.api.registered_hosts():
            from maya import cmds
            cmds.setAttr(
@@ -113,6 +113,10 @@ def _h264_codec_args(ffprobe_data):

    output.extend(["-codec:v", "h264"])
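    # Pass the probed bitrate and pixel format through to the output
    # arguments when ffprobe reported them.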
    bit_rate = ffprobe_data.get("bit_rate")
    if bit_rate:
        output.extend(["-b:v", bit_rate])

    pix_fmt = ffprobe_data.get("pix_fmt")
    if pix_fmt:
        output.extend(["-pix_fmt", pix_fmt])
@@ -17,7 +17,7 @@
    },
    "publish": {
        "folder": "{root[work]}/{project[name]}/{hierarchy}/{asset}/publish/{family}/{subset}/{@version}",
        "file": "{project[code]}_{asset}_{subset}_{@version}<_{output}><.{@frame}>.{ext}",
        "file": "{project[code]}_{asset}_{subset}_{@version}<_{output}><.{@frame}><_{udim}>.{ext}",
        "path": "{@folder}/{@file}",
        "thumbnail": "{thumbnail_root}/{project[name]}/{_id}_{thumbnail_type}.{ext}"
    },
@@ -3,9 +3,37 @@
    "ValidateExpectedFiles": {
        "enabled": true,
        "active": true,
        "families": ["render"],
        "targets": ["deadline"],
        "allow_user_override": true
        "allow_user_override": true,
        "families": [
            "render"
        ],
        "targets": [
            "deadline"
        ]
    },
    "ProcessSubmittedJobOnFarm": {
        "enabled": true,
        "deadline_department": "",
        "deadline_pool": "",
        "deadline_group": "",
        "deadline_chunk_size": 1,
        "deadline_priority": 50,
        "publishing_script": "",
        "skip_integration_repre_list": [],
        "aov_filter": {
            "maya": [
                ".+(?:\\.|_)([Bb]eauty)(?:\\.|_).*"
            ],
            "nuke": [
                ".*"
            ],
            "aftereffects": [
                ".*"
            ],
            "celaction": [
                ".*"
            ]
        }
    },
    "MayaSubmitDeadline": {
        "enabled": true,

@@ -15,7 +43,10 @@
        "use_published": true,
        "asset_dependencies": true,
        "group": "none",
        "limit": []
        "limit": [],
        "jobInfo": {},
        "pluginInfo": {},
        "scene_patches": []
    },
    "NukeSubmitDeadline": {
        "enabled": true,

@@ -29,6 +60,8 @@
        "group": "",
        "department": "",
        "use_gpu": true,
        "env_allowed_keys": [],
        "env_search_replace_values": {},
        "limit_groups": {}
    },
    "HarmonySubmitDeadline": {
@@ -136,7 +136,8 @@
                "Pypeclub",
                "Administrator",
                "Project manager"
            ]
            ],
            "create_project_structure_checked": false
        },
        "clean_hierarchical_attr": {
            "enabled": true,

@@ -229,7 +230,6 @@
                        "standalonepublisher"
                    ],
                    "families": [
                        "review",
                        "plate"
                    ],
                    "tasks": [],

@@ -279,6 +279,36 @@
                    "tasks": [],
                    "add_ftrack_family": true,
                    "advanced_filtering": []
                },
                {
                    "hosts": [
                        "nuke"
                    ],
                    "families": [
                        "write",
                        "render"
                    ],
                    "tasks": [],
                    "add_ftrack_family": false,
                    "advanced_filtering": [
                        {
                            "families": [
                                "review"
                            ],
                            "add_ftrack_family": true
                        }
                    ]
                },
                {
                    "hosts": [
                        "aftereffects"
                    ],
                    "families": [
                        "render"
                    ],
                    "tasks": [],
                    "add_ftrack_family": true,
                    "advanced_filtering": []
                }
            ]
        },
@@ -1,5 +1,13 @@
{
    "publish": {
        "ValidateEditorialAssetName": {
            "enabled": true,
            "optional": false
        },
        "ValidateVersion": {
            "enabled": true,
            "optional": false
        },
        "IntegrateHeroVersion": {
            "enabled": true,
            "optional": true,

@@ -165,25 +173,9 @@
                }
            ]
        },
        "ProcessSubmittedJobOnFarm": {
            "enabled": true,
            "deadline_department": "",
            "deadline_pool": "",
            "deadline_group": "",
            "deadline_chunk_size": 1,
            "deadline_priority": 50,
            "aov_filter": {
                "maya": [
                    ".+(?:\\.|_)([Bb]eauty)(?:\\.|_).*"
                ],
                "nuke": [],
                "aftereffects": [
                    ".*"
                ],
                "celaction": [
                    ".*"
                ]
            }
        "CleanUp": {
            "paterns": [],
            "remove_temp_renders": false
        }
    },
    "tools": {

@@ -243,6 +235,16 @@
                ],
                "tasks": [],
                "template": "{family}{Task}"
            },
            {
                "families": [
                    "renderLocal"
                ],
                "hosts": [
                    "aftereffects"
                ],
                "tasks": [],
                "template": "render{Task}{Variant}"
            }
        ]
    },

@@ -254,6 +256,13 @@
                "enabled": true
            }
        ],
        "open_workfile_tool_on_startup": [
            {
                "hosts": [],
                "tasks": [],
                "enabled": false
            }
        ],
        "sw_folders": {
            "compositing": [
                "nuke",

@@ -271,28 +280,7 @@
            }
        }
    },
    "project_folder_structure": {
        "__project_root__": {
            "prod": {},
            "resources": {
                "footage": {
                    "plates": {},
                    "offline": {}
                },
                "audio": {},
                "art_dept": {}
            },
            "editorial": {},
            "assets[ftrack.Library]": {
                "characters[ftrack]": {},
                "locations[ftrack]": {}
            },
            "shots[ftrack.Sequence]": {
                "scripts": {},
                "editorial[ftrack.Folder]": {}
            }
        }
    },
    "project_folder_structure": "{\"__project_root__\": {\"prod\": {}, \"resources\": {\"footage\": {\"plates\": {}, \"offline\": {}}, \"audio\": {}, \"art_dept\": {}}, \"editorial\": {}, \"assets[ftrack.Library]\": {\"characters[ftrack]\": {}, \"locations[ftrack]\": {}}, \"shots[ftrack.Sequence]\": {\"scripts\": {}, \"editorial[ftrack.Folder]\": {}}}}",
    "sync_server": {
        "enabled": true,
        "config": {
@@ -7,6 +7,35 @@
        "workfile": "ma",
        "yetiRig": "ma"
    },
    "maya-dirmap": {
        "enabled": true,
        "paths": {
            "source-path": [
                "foo1",
                "foo2"
            ],
            "destination-path": [
                "bar1",
                "bar2"
            ]
        }
    },
    "scriptsmenu": {
        "name": "OpenPype Tools",
        "definition": [
            {
                "type": "action",
                "command": "import openpype.hosts.maya.api.commands as op_cmds; op_cmds.edit_shader_definitions()",
                "sourcetype": "python",
                "title": "Edit shader name definitions",
                "tooltip": "Edit shader name definitions used in validation and renaming.",
                "tags": [
                    "pipeline",
                    "shader"
                ]
            }
        ]
    },
    "create": {
        "CreateLook": {
            "enabled": true,

@@ -148,12 +177,14 @@
        },
        "ValidateModelName": {
            "enabled": false,
            "database": true,
            "material_file": {
                "windows": "",
                "darwin": "",
                "linux": ""
            },
            "regex": "(.*)_(\\\\d)*_(.*)_(GEO)"
            "regex": "(.*)_(\\d)*_(?P<shader>.*)_(GEO)",
            "top_level_regex": ".*_GRP"
        },
        "ValidateTransformNamingSuffix": {
            "enabled": true,
@@ -10,11 +10,22 @@
    },
    "create": {
        "CreateWriteRender": {
            "fpath_template": "{work}/renders/nuke/{subset}/{subset}.{frame}.{ext}"
            "fpath_template": "{work}/renders/nuke/{subset}/{subset}.{frame}.{ext}",
            "defaults": [
                "Main",
                "Mask"
            ]
        },
        "CreateWritePrerender": {
            "fpath_template": "{work}/prerenders/nuke/{subset}/{subset}.{frame}.{ext}",
            "use_range_limit": true
            "use_range_limit": true,
            "defaults": [
                "Key01",
                "Bg01",
                "Fg01",
                "Branch01",
                "Part01"
            ]
        }
    },
    "publish": {
@@ -123,6 +123,16 @@
            ],
            "help": "Process multiple Mov files and publish them for layout and comp."
        },
        "create_texture_batch": {
            "name": "texture_batch",
            "label": "Texture Batch",
            "family": "texture_batch",
            "icon": "image",
            "defaults": [
                "Main"
            ],
            "help": "Texture files with UDIM together with workfile"
        },
        "__dynamic_keys_labels__": {
            "create_workfile": "Workfile",
            "create_model": "Model",

@@ -134,10 +144,65 @@
            "create_image": "Image",
            "create_matchmove": "Matchmove",
            "create_render": "Render",
            "create_mov_batch": "Batch Mov"
            "create_mov_batch": "Batch Mov",
            "create_texture_batch": "Batch Texture"
        }
    },
    "publish": {
        "CollectTextures": {
            "enabled": true,
            "active": true,
            "main_workfile_extensions": [
                "mra"
            ],
            "other_workfile_extensions": [
                "spp",
                "psd"
            ],
            "texture_extensions": [
                "exr",
                "dpx",
                "jpg",
                "jpeg",
                "png",
                "tiff",
                "tga",
                "gif",
                "svg"
            ],
            "workfile_families": [],
            "texture_families": [],
            "color_space": [
                "linsRGB",
                "raw",
                "acesg"
            ],
            "input_naming_patterns": {
                "workfile": [
                    "^([^.]+)(_[^_.]*)?_v([0-9]{3,}).+"
                ],
                "textures": [
                    "^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_({color_space})_(1[0-9]{3}).+"
                ]
            },
            "input_naming_groups": {
                "workfile": [
                    "asset",
                    "filler",
                    "version"
                ],
                "textures": [
                    "asset",
                    "shader",
                    "version",
                    "channel",
                    "color_space",
                    "udim"
                ]
            },
            "workfile_subset_template": "textures{Subset}Workfile",
            "texture_subset_template": "textures{Subset}_{Shader}_{Channel}"
        },
        "ValidateSceneSettings": {
            "enabled": true,
            "optional": true,

@@ -165,6 +230,58 @@
            ],
            "output": []
            }
        },
        "CollectEditorial": {
            "source_dir": "",
            "extensions": [
                "mov",
                "mp4"
            ]
        },
        "CollectHierarchyInstance": {
            "shot_rename_template": "{project[code]}_{_sequence_}_{_shot_}",
            "shot_rename_search_patterns": {
                "_sequence_": "(\\d{4})(?=_\\d{4})",
                "_shot_": "(\\d{4})(?!_\\d{4})"
            },
            "shot_add_hierarchy": {
                "parents_path": "{project}/{folder}/{sequence}",
                "parents": {
                    "project": "{project[name]}",
                    "sequence": "{_sequence_}",
                    "folder": "shots"
                }
            },
            "shot_add_tasks": {}
        },
        "CollectInstances": {
            "custom_start_frame": 0,
            "timeline_frame_start": 900000,
            "timeline_frame_offset": 0,
            "subsets": {
                "referenceMain": {
                    "family": "review",
                    "families": [
                        "clip"
                    ],
                    "extensions": [
                        "mp4"
                    ],
                    "version": 0,
                    "keepSequence": false
                },
                "audioMain": {
                    "family": "audio",
                    "families": [
                        "clip"
                    ],
                    "extensions": [
                        "wav"
                    ],
                    "version": 0,
                    "keepSequence": false
                }
            }
        }
    }
}
@@ -18,6 +18,11 @@
        "optional": true,
        "active": true
    },
    "ValidateStartFrame": {
        "enabled": false,
        "optional": true,
        "active": true
    },
    "ValidateAssetName": {
        "enabled": true,
        "optional": true,
@@ -1,6 +1,5 @@
{
    "project_setup": {
        "dev_mode": true,
        "install_unreal_python_engine": false
        "dev_mode": true
    }
}
@@ -1084,7 +1084,7 @@
    "unreal": {
        "enabled": true,
        "label": "Unreal Editor",
        "icon": "{}/app_icons/ue4.png'",
        "icon": "{}/app_icons/ue4.png",
        "host_name": "unreal",
        "environment": {},
        "variants": {
@@ -111,6 +111,7 @@ from .enum_entity import (
from .list_entity import ListEntity
from .dict_immutable_keys_entity import DictImmutableKeysEntity
from .dict_mutable_keys_entity import DictMutableKeysEntity
from .dict_conditional import DictConditionalEntity

from .anatomy_entities import AnatomyEntity

@@ -166,5 +167,7 @@ __all__ = (

    "DictMutableKeysEntity",

    "DictConditionalEntity",

    "AnatomyEntity"
)
@@ -1,5 +1,6 @@
from .dict_immutable_keys_entity import DictImmutableKeysEntity
from .lib import OverrideState
from .exceptions import EntitySchemaError


class AnatomyEntity(DictImmutableKeysEntity):

@@ -23,3 +24,25 @@ class AnatomyEntity(DictImmutableKeysEntity):
            if not child_obj.has_project_override:
                child_obj.add_to_project_override()
        return super(AnatomyEntity, self).on_child_change(child_obj)

    def schema_validations(self):
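        # Every direct child of the anatomy must be a group so overrides
        # are tracked per top-level key.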
        non_group_children = []
        for key, child_obj in self.non_gui_children.items():
            if not child_obj.is_group:
                non_group_children.append(key)

        if non_group_children:
            _non_group_children = [
                "project_anatomy/{}".format(key)
                for key in non_group_children
            ]
            reason = (
                "Anatomy must have all children as groups."
                " Set 'is_group' to `true` on > {}"
            ).format(", ".join([
                '"{}"'.format(item)
                for item in _non_group_children
            ]))
            raise EntitySchemaError(self, reason)

        return super(AnatomyEntity, self).schema_validations()
@@ -136,6 +136,7 @@ class BaseItemEntity(BaseEntity):
        # Override state defines which values are used, saved and how.
        # TODO convert to private attribute
        self._override_state = OverrideState.NOT_DEFINED
        self._ignore_missing_defaults = None

        # These attributes may change values during existence of an object
        # Default value, studio override values and project override values

@@ -285,7 +286,7 @@ class BaseItemEntity(BaseEntity):
        pass

    @abstractmethod
    def set_override_state(self, state):
    def set_override_state(self, state, ignore_missing_defaults):
        """Set override state and trigger it on children.

        Method discards all changes in hierarchy and uses values, metadata

@@ -295,8 +296,15 @@ class BaseItemEntity(BaseEntity):
        Should start on root entity and when triggered then must be called on
        all entities in hierarchy.

        Argument `ignore_missing_defaults` should be used when entity has
        children that are not saved or used all the time, but the override
        state must be changed and the children may not have a default value.

        Args:
            state (OverrideState): State to which should be data changed.
            ignore_missing_defaults (bool): Ignore missing default values.
                Entity won't raise `DefaultsNotDefined` and
                `StudioDefaultsNotDefined`.
        """
        pass
723
openpype/settings/entities/dict_conditional.py
Normal file
723
openpype/settings/entities/dict_conditional.py
Normal file
|
|
@ -0,0 +1,723 @@
|
|||
import copy
|
||||
|
||||
from .lib import (
|
||||
OverrideState,
|
||||
NOT_SET
|
||||
)
|
||||
from openpype.settings.constants import (
|
||||
METADATA_KEYS,
|
||||
M_OVERRIDEN_KEY,
|
||||
KEY_REGEX
|
||||
)
|
||||
from . import (
|
||||
BaseItemEntity,
|
||||
ItemEntity,
|
||||
GUIEntity
|
||||
)
|
||||
from .exceptions import (
|
||||
SchemaDuplicatedKeys,
|
||||
EntitySchemaError,
|
||||
InvalidKeySymbols
|
||||
)
|
||||
|
||||
|
||||
class DictConditionalEntity(ItemEntity):
|
||||
"""Entity represents dictionay with only one persistent key definition.
|
||||
|
||||
The persistent key is enumerator which define rest of children under
|
||||
dictionary. There is not possibility of shared children.
|
||||
|
||||
Entity's keys can't be removed or added. But they may change based on
|
||||
the persistent key. If you're change value manually (key by key) make sure
|
||||
you'll change value of the persistent key as first. It is recommended to
|
||||
use `set` method which handle this for you.
|
||||
|
||||
It is possible to use entity similar way as `dict` object. Returned values
|
||||
are not real settings values but entities representing the value.
|
||||
"""
|
||||
schema_types = ["dict-conditional"]
|
||||
_default_label_wrap = {
|
||||
"use_label_wrap": False,
|
||||
"collapsible": False,
|
||||
"collapsed": True
|
||||
}
|
||||
|
||||
def __getitem__(self, key):
|
||||
"""Return entity inder key."""
|
||||
if key == self.enum_key:
|
||||
return self.enum_entity
|
||||
return self.non_gui_children[self.current_enum][key]
|
||||
|
||||
def __setitem__(self, key, value):
|
||||
"""Set value of item under key."""
|
||||
if key == self.enum_key:
|
||||
child_obj = self.enum_entity
|
||||
else:
|
||||
child_obj = self.non_gui_children[self.current_enum][key]
|
||||
child_obj.set(value)
|
||||
|
||||
def __iter__(self):
|
||||
"""Iter through keys."""
|
||||
for key in self.keys():
|
||||
yield key
|
||||
|
||||
def __contains__(self, key):
|
||||
"""Check if key is available."""
|
||||
if key == self.enum_key:
|
||||
return True
|
||||
return key in self.non_gui_children[self.current_enum]
|
||||
|
||||
def get(self, key, default=None):
|
||||
"""Safe entity getter by key."""
|
||||
if key == self.enum_key:
|
||||
return self.enum_entity
|
||||
return self.non_gui_children[self.current_enum].get(key, default)
|
||||
|
||||
def keys(self):
|
||||
"""Entity's keys."""
|
||||
keys = list(self.non_gui_children[self.current_enum].keys())
|
||||
keys.insert(0, [self.enum_key])
|
||||
return keys
|
||||
|
||||
def values(self):
|
||||
"""Children entities."""
|
||||
values = [
|
||||
self.enum_entity
|
||||
]
|
||||
for child_entiy in self.non_gui_children[self.current_enum].values():
|
||||
values.append(child_entiy)
|
||||
return values
|
||||
|
||||
def items(self):
|
||||
"""Children entities paired with their key (key, value)."""
|
||||
items = [
|
||||
(self.enum_key, self.enum_entity)
|
||||
]
|
||||
for key, value in self.non_gui_children[self.current_enum].items():
|
||||
items.append((key, value))
|
||||
return items
|
||||
|
||||
def set(self, value):
|
||||
"""Set value."""
|
||||
new_value = self.convert_to_valid_type(value)
|
||||
# First change value of enum key if available
|
||||
if self.enum_key in new_value:
|
||||
self.enum_entity.set(new_value.pop(self.enum_key))
|
||||
|
||||
for _key, _value in new_value.items():
|
||||
self.non_gui_children[self.current_enum][_key].set(_value)
|
||||
|
||||
def _item_initalization(self):
|
||||
self._default_metadata = NOT_SET
|
||||
self._studio_override_metadata = NOT_SET
|
||||
self._project_override_metadata = NOT_SET
|
||||
|
||||
self._ignore_child_changes = False
|
||||
|
||||
# `current_metadata` are still when schema is loaded
|
||||
# - only metadata stored with dict item are gorup overrides in
|
||||
# M_OVERRIDEN_KEY
|
||||
self._current_metadata = {}
|
||||
self._metadata_are_modified = False
|
||||
|
||||
# Entity must be group or in group
|
||||
if (
|
||||
self.group_item is None
|
||||
and not self.is_dynamic_item
|
||||
and not self.is_in_dynamic_item
|
||||
):
|
||||
self.is_group = True
|
||||
|
||||
# Children are stored by key as keys are immutable and are defined by
|
||||
# schema
|
||||
self.valid_value_types = (dict, )
|
||||
self.children = {}
|
||||
self.non_gui_children = {}
|
||||
self.gui_layout = {}
|
||||
|
||||
if self.is_dynamic_item:
|
||||
self.require_key = False
|
||||
|
||||
self.enum_key = self.schema_data.get("enum_key")
|
||||
self.enum_label = self.schema_data.get("enum_label")
|
||||
self.enum_children = self.schema_data.get("enum_children")
|
||||
self.enum_default = self.schema_data.get("enum_default")
|
||||
|
||||
self.enum_entity = None
|
||||
|
||||
# GUI attributes
|
||||
self.enum_is_horizontal = self.schema_data.get(
|
||||
"enum_is_horizontal", False
|
||||
)
|
||||
# `enum_on_right` can be used only if
|
||||
self.enum_on_right = self.schema_data.get("enum_on_right", False)
|
||||
|
||||
self.highlight_content = self.schema_data.get(
|
||||
"highlight_content", False
|
||||
)
|
||||
self.show_borders = self.schema_data.get("show_borders", True)
|
||||
|
||||
self._add_children()
|
||||
|
||||
@property
|
||||
def current_enum(self):
|
||||
"""Current value of enum entity.
|
||||
|
||||
This value define what children are used.
|
||||
"""
|
||||
if self.enum_entity is None:
|
||||
return None
|
||||
return self.enum_entity.value
|
||||
|
||||
def schema_validations(self):
|
||||
"""Validation of schema data."""
|
||||
# Enum key must be defined
|
||||
if self.enum_key is None:
|
||||
raise EntitySchemaError(self, "Key 'enum_key' is not set.")
|
||||
|
||||
# Validate type of enum children
|
||||
if not isinstance(self.enum_children, list):
|
||||
raise EntitySchemaError(
|
||||
self, "Key 'enum_children' must be a list. Got: {}".format(
|
||||
str(type(self.enum_children))
|
||||
)
|
||||
)
|
||||
|
||||
# Without defined enum children entity has nothing to do
|
||||
if not self.enum_children:
|
||||
raise EntitySchemaError(self, (
|
||||
"Key 'enum_children' have empty value. Entity can't work"
|
||||
" without children definitions."
|
||||
))
|
||||
|
||||
children_def_keys = []
|
||||
for children_def in self.enum_children:
|
||||
if not isinstance(children_def, dict):
|
||||
raise EntitySchemaError(self, (
|
||||
"Children definition under key 'enum_children' must"
|
||||
" be a dictionary."
|
||||
))
|
||||
|
||||
if "key" not in children_def:
|
||||
raise EntitySchemaError(self, (
|
||||
"Children definition under key 'enum_children' miss"
|
||||
" 'key' definition."
|
||||
))
|
||||
# We don't validate regex of these keys because they will be stored
|
||||
# as value at the end.
|
||||
key = children_def["key"]
|
||||
if key in children_def_keys:
|
||||
# TODO this hould probably be different exception?
|
||||
raise SchemaDuplicatedKeys(self, key)
|
||||
children_def_keys.append(key)
|
||||
|
||||
# Validate key duplications per each enum item
|
||||
for children in self.children.values():
|
||||
children_keys = set()
|
||||
children_keys.add(self.enum_key)
|
||||
for child_entity in children:
|
||||
if not isinstance(child_entity, BaseItemEntity):
|
||||
continue
|
||||
elif child_entity.key not in children_keys:
|
||||
children_keys.add(child_entity.key)
|
||||
else:
|
||||
raise SchemaDuplicatedKeys(self, child_entity.key)
|
||||
|
||||
# Enum key must match key regex
|
||||
if not KEY_REGEX.match(self.enum_key):
|
||||
raise InvalidKeySymbols(self.path, self.enum_key)
|
||||
|
||||
# Validate all remaining keys with key regex
|
||||
for children_by_key in self.non_gui_children.values():
|
||||
for key in children_by_key.keys():
|
||||
if not KEY_REGEX.match(key):
|
||||
raise InvalidKeySymbols(self.path, key)
|
||||
|
||||
super(DictConditionalEntity, self).schema_validations()
|
||||
# Trigger schema validation on children entities
|
||||
for children in self.children.values():
|
||||
for child_obj in children:
|
||||
child_obj.schema_validations()
|
||||
|
||||
def on_change(self):
|
||||
"""Update metadata on change and pass change to parent."""
|
||||
self._update_current_metadata()
|
||||
|
||||
for callback in self.on_change_callbacks:
|
||||
callback()
|
||||
self.parent.on_child_change(self)
|
||||
|
||||
def on_child_change(self, child_obj):
|
||||
"""Trigger on change callback if child changes are not ignored."""
|
||||
if self._ignore_child_changes:
|
||||
return
|
||||
|
||||
if (
|
||||
child_obj is self.enum_entity
|
||||
or child_obj in self.children[self.current_enum]
|
||||
):
|
||||
self.on_change()
|
||||
|
||||
    def _add_children(self):
        """Add children from schema data and prepare enum items.

        Each enum item must have its children defined. None are shared across
        all enum items.

        Nice to have: Ability to share keys across all enum items.

        All children are stored by their enum item.
        """
        # Skip if these are not defined
        # - schema validations should raise an exception
        if not self.enum_children or not self.enum_key:
            return

        valid_enum_items = []
        for item in self.enum_children:
            if isinstance(item, dict) and "key" in item:
                valid_enum_items.append(item)

        enum_keys = []
        enum_items = []
        for item in valid_enum_items:
            item_key = item["key"]
            enum_keys.append(item_key)
            item_label = item.get("label") or item_key
            enum_items.append({item_key: item_label})

        if not enum_items:
            return

        if self.enum_default in enum_keys:
            default_key = self.enum_default
        else:
            default_key = enum_keys[0]

        # Create enum child first
        enum_key = self.enum_key or "invalid"
        enum_schema = {
            "type": "enum",
            "multiselection": False,
            "enum_items": enum_items,
            "key": enum_key,
            "label": self.enum_label,
            "default": default_key
        }

        enum_entity = self.create_schema_object(enum_schema, self)
        self.enum_entity = enum_entity

        # Create children for each enum item
        for item in valid_enum_items:
            item_key = item["key"]
            # Make sure all keys have a value set in these variables
            # - key 'children' is optional
            self.non_gui_children[item_key] = {}
            self.children[item_key] = []
            self.gui_layout[item_key] = []

            children = item.get("children") or []
            for children_schema in children:
                child_obj = self.create_schema_object(children_schema, self)
                self.children[item_key].append(child_obj)
                self.gui_layout[item_key].append(child_obj)
                if isinstance(child_obj, GUIEntity):
                    continue

                self.non_gui_children[item_key][child_obj.key] = child_obj

    def get_child_path(self, child_obj):
        """Get hierarchical path of child entity.

        Child must be one of entity's direct children. This must be possible
        to get for any child, even one not under the current enum value.
        """
        if child_obj is self.enum_entity:
            return "/".join([self.path, self.enum_key])

        result_key = None
        for children in self.non_gui_children.values():
            for key, _child_obj in children.items():
                if _child_obj is child_obj:
                    result_key = key
                    break

        if result_key is None:
            raise ValueError("Didn't find child {}".format(child_obj))

        return "/".join([self.path, result_key])

    def _update_current_metadata(self):
        current_metadata = {}
        for key, child_obj in self.non_gui_children[self.current_enum].items():
            if self._override_state is OverrideState.DEFAULTS:
                break

            if not child_obj.is_group:
                continue

            if (
                self._override_state is OverrideState.STUDIO
                and not child_obj.has_studio_override
            ):
                continue

            if (
                self._override_state is OverrideState.PROJECT
                and not child_obj.has_project_override
            ):
                continue

            if M_OVERRIDEN_KEY not in current_metadata:
                current_metadata[M_OVERRIDEN_KEY] = []
            current_metadata[M_OVERRIDEN_KEY].append(key)

        # Define if current metadata are available for current override state
        metadata = NOT_SET
        if self._override_state is OverrideState.STUDIO:
            metadata = self._studio_override_metadata

        elif self._override_state is OverrideState.PROJECT:
            metadata = self._project_override_metadata

        if metadata is NOT_SET:
            metadata = {}

        self._metadata_are_modified = current_metadata != metadata
        self._current_metadata = current_metadata

    def set_override_state(self, state, ignore_missing_defaults):
        # Trigger override state change of root if it is not the same
        if self.root_item.override_state is not state:
            self.root_item.set_override_state(state)
            return

        # Change has/had override states
        self._override_state = state
        self._ignore_missing_defaults = ignore_missing_defaults

        # Set override state on enum entity first
        self.enum_entity.set_override_state(state, ignore_missing_defaults)

        # Set override state on other enum children
        # - these must not raise an exception about missing defaults
        for children_by_key in self.non_gui_children.values():
            for child_obj in children_by_key.values():
                child_obj.set_override_state(state, True)

        self._update_current_metadata()

    @property
    def value(self):
        output = {
            self.enum_key: self.enum_entity.value
        }
        for key, child_obj in self.non_gui_children[self.current_enum].items():
            output[key] = child_obj.value
        return output

    @property
    def has_unsaved_changes(self):
        if self._metadata_are_modified:
            return True

        return self._child_has_unsaved_changes

    @property
    def _child_has_unsaved_changes(self):
        if self.enum_entity.has_unsaved_changes:
            return True

        for child_obj in self.non_gui_children[self.current_enum].values():
            if child_obj.has_unsaved_changes:
                return True
        return False

    @property
    def has_studio_override(self):
        return self._child_has_studio_override

    @property
    def _child_has_studio_override(self):
        if self._override_state >= OverrideState.STUDIO:
            if self.enum_entity.has_studio_override:
                return True

            for child_obj in self.non_gui_children[self.current_enum].values():
                if child_obj.has_studio_override:
                    return True
        return False

    @property
    def has_project_override(self):
        return self._child_has_project_override

    @property
    def _child_has_project_override(self):
        if self._override_state >= OverrideState.PROJECT:
            if self.enum_entity.has_project_override:
                return True

            for child_obj in self.non_gui_children[self.current_enum].values():
                if child_obj.has_project_override:
                    return True
        return False

    def settings_value(self):
        if self._override_state is OverrideState.NOT_DEFINED:
            return NOT_SET

        if self._override_state is OverrideState.DEFAULTS:
            children_items = [
                (self.enum_key, self.enum_entity)
            ]
            for item in self.non_gui_children[self.current_enum].items():
                children_items.append(item)

            output = {}
            for key, child_obj in children_items:
                child_value = child_obj.settings_value()
                if not child_obj.is_file and not child_obj.file_item:
                    for _key, _value in child_value.items():
                        new_key = "/".join([key, _key])
                        output[new_key] = _value
                else:
                    output[key] = child_value
            return output

        if self.is_group:
            if self._override_state is OverrideState.STUDIO:
                if not self.has_studio_override:
                    return NOT_SET
            elif self._override_state is OverrideState.PROJECT:
                if not self.has_project_override:
                    return NOT_SET

        output = {}
        children_items = [
            (self.enum_key, self.enum_entity)
        ]
        for item in self.non_gui_children[self.current_enum].items():
            children_items.append(item)

        for key, child_obj in children_items:
            value = child_obj.settings_value()
            if value is not NOT_SET:
                output[key] = value

        if not output:
            return NOT_SET

        output.update(self._current_metadata)
        return output

    def _prepare_value(self, value):
        if value is NOT_SET or self.enum_key not in value:
            return NOT_SET, NOT_SET

        enum_value = value.get(self.enum_key)
        if enum_value not in self.non_gui_children:
            return NOT_SET, NOT_SET

        # Create copy of value before popping values
        value = copy.deepcopy(value)
        metadata = {}
        for key in METADATA_KEYS:
            if key in value:
                metadata[key] = value.pop(key)

        enum_value = value.get(self.enum_key)

        old_metadata = metadata.get(M_OVERRIDEN_KEY)
        if old_metadata:
            old_metadata_set = set(old_metadata)
            new_metadata = []
            non_gui_children = self.non_gui_children[enum_value]
            for key in non_gui_children.keys():
                if key in old_metadata:
                    new_metadata.append(key)
                    old_metadata_set.remove(key)

            for key in old_metadata_set:
                new_metadata.append(key)
            metadata[M_OVERRIDEN_KEY] = new_metadata

        return value, metadata

    def update_default_value(self, value):
        """Update default values.

        Not an api method, should be called by parent.
        """
        value = self._check_update_value(value, "default")
        self.has_default_value = value is not NOT_SET
        # TODO add value validation
        value, metadata = self._prepare_value(value)
        self._default_metadata = metadata

        if value is NOT_SET:
            self.enum_entity.update_default_value(value)
            for children_by_key in self.non_gui_children.values():
                for child_obj in children_by_key.values():
                    child_obj.update_default_value(value)
            return

        value_keys = set(value.keys())
        enum_value = value[self.enum_key]
        expected_keys = set(self.non_gui_children[enum_value].keys())
        expected_keys.add(self.enum_key)
        unknown_keys = value_keys - expected_keys
        if unknown_keys:
            self.log.warning(
                "{} Unknown keys in default values: {}".format(
                    self.path,
                    ", ".join("\"{}\"".format(key) for key in unknown_keys)
                )
            )

        self.enum_entity.update_default_value(enum_value)
        for children_by_key in self.non_gui_children.values():
            for key, child_obj in children_by_key.items():
                child_value = value.get(key, NOT_SET)
                child_obj.update_default_value(child_value)

    def update_studio_value(self, value):
        """Update studio override values.

        Not an api method, should be called by parent.
        """
        value = self._check_update_value(value, "studio override")
        value, metadata = self._prepare_value(value)
        self._studio_override_metadata = metadata
        self.had_studio_override = metadata is not NOT_SET

        if value is NOT_SET:
            self.enum_entity.update_studio_value(value)
            for children_by_key in self.non_gui_children.values():
                for child_obj in children_by_key.values():
                    child_obj.update_studio_value(value)
            return

        value_keys = set(value.keys())
        enum_value = value[self.enum_key]
        expected_keys = set(self.non_gui_children[enum_value])
        expected_keys.add(self.enum_key)
        unknown_keys = value_keys - expected_keys
        if unknown_keys:
            self.log.warning(
                "{} Unknown keys in studio overrides: {}".format(
                    self.path,
                    ", ".join("\"{}\"".format(key) for key in unknown_keys)
                )
            )

        self.enum_entity.update_studio_value(enum_value)
        for children_by_key in self.non_gui_children.values():
            for key, child_obj in children_by_key.items():
                child_value = value.get(key, NOT_SET)
                child_obj.update_studio_value(child_value)

    def update_project_value(self, value):
        """Update project override values.

        Not an api method, should be called by parent.
        """
        value = self._check_update_value(value, "project override")
        value, metadata = self._prepare_value(value)
        self._project_override_metadata = metadata
        self.had_project_override = metadata is not NOT_SET

        if value is NOT_SET:
            self.enum_entity.update_project_value(value)
            for children_by_key in self.non_gui_children.values():
                for child_obj in children_by_key.values():
                    child_obj.update_project_value(value)
            return

        value_keys = set(value.keys())
        enum_value = value[self.enum_key]
        expected_keys = set(self.non_gui_children[enum_value])
        expected_keys.add(self.enum_key)
        unknown_keys = value_keys - expected_keys
        if unknown_keys:
            self.log.warning(
                "{} Unknown keys in project overrides: {}".format(
                    self.path,
                    ", ".join("\"{}\"".format(key) for key in unknown_keys)
                )
            )

        self.enum_entity.update_project_value(enum_value)
        for children_by_key in self.non_gui_children.values():
            for key, child_obj in children_by_key.items():
                child_value = value.get(key, NOT_SET)
                child_obj.update_project_value(child_value)

    def _discard_changes(self, on_change_trigger):
        self._ignore_child_changes = True

        self.enum_entity.discard_changes(on_change_trigger)
        for children_by_key in self.non_gui_children.values():
            for child_obj in children_by_key.values():
                child_obj.discard_changes(on_change_trigger)

        self._ignore_child_changes = False

    def _add_to_studio_default(self, on_change_trigger):
        self._ignore_child_changes = True

        self.enum_entity.add_to_studio_default(on_change_trigger)
        for children_by_key in self.non_gui_children.values():
            for child_obj in children_by_key.values():
                child_obj.add_to_studio_default(on_change_trigger)

        self._ignore_child_changes = False

        self._update_current_metadata()

        self.parent.on_child_change(self)

    def _remove_from_studio_default(self, on_change_trigger):
        self._ignore_child_changes = True

        self.enum_entity.remove_from_studio_default(on_change_trigger)
        for children_by_key in self.non_gui_children.values():
            for child_obj in children_by_key.values():
                child_obj.remove_from_studio_default(on_change_trigger)

        self._ignore_child_changes = False

    def _add_to_project_override(self, on_change_trigger):
        self._ignore_child_changes = True

        self.enum_entity.add_to_project_override(on_change_trigger)
        for children_by_key in self.non_gui_children.values():
            for child_obj in children_by_key.values():
                child_obj.add_to_project_override(on_change_trigger)

        self._ignore_child_changes = False

        self._update_current_metadata()

        self.parent.on_child_change(self)

    def _remove_from_project_override(self, on_change_trigger):
        if self._override_state is not OverrideState.PROJECT:
            return

        self._ignore_child_changes = True

        self.enum_entity.remove_from_project_override(on_change_trigger)
        for children_by_key in self.non_gui_children.values():
            for child_obj in children_by_key.values():
                child_obj.remove_from_project_override(on_change_trigger)

        self._ignore_child_changes = False

    def reset_callbacks(self):
        """Reset registered callbacks on entity and children."""
        super(DictConditionalEntity, self).reset_callbacks()
        for children in self.children.values():
            for child_entity in children:
                child_entity.reset_callbacks()

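Editor's note: for orientation, this is the shape of schema data the conditional dict code above consumes. The `type` name and the child item types are illustrative assumptions inferred from the attributes used above (`enum_key`, `enum_label`, `enum_children`), not a schema copied verbatim from the repository.

```python
# Hypothetical schema for a conditional dict; every enum item carries its
# own 'children' list, and 'key' is mandatory per item (see the
# schema_validations above).
conditional_schema = {
    "type": "dict-conditional",      # assumed type name
    "key": "renderer_settings",
    "enum_key": "renderer",
    "enum_label": "Renderer",
    "enum_children": [
        {
            "key": "arnold",                 # required
            "label": "Arnold",               # optional, falls back to key
            "children": [                    # optional per-item children
                {"type": "number", "key": "aa_samples", "label": "AA Samples"}
            ]
        },
        {
            # No 'label': the key itself is used as the label.
            # No 'children': this item simply has no extra settings.
            "key": "vray"
        }
    ]
}
```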
@@ -258,7 +258,7 @@ class DictImmutableKeysEntity(ItemEntity):
         self._metadata_are_modified = current_metadata != metadata
         self._current_metadata = current_metadata
 
-    def set_override_state(self, state):
+    def set_override_state(self, state, ignore_missing_defaults):
         # Trigger override state change of root if it is not the same
         if self.root_item.override_state is not state:
             self.root_item.set_override_state(state)
@@ -266,9 +266,10 @@ class DictImmutableKeysEntity(ItemEntity):
 
         # Change has/had override states
         self._override_state = state
+        self._ignore_missing_defaults = ignore_missing_defaults
 
         for child_obj in self.non_gui_children.values():
-            child_obj.set_override_state(state)
+            child_obj.set_override_state(state, ignore_missing_defaults)
 
         self._update_current_metadata()
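Editor's note: the thread running through the rest of this diff is the new `ignore_missing_defaults` argument on every `set_override_state`. A minimal standalone sketch of the guard it relaxes (simplified stand-ins, not the real entity classes):

```python
from enum import IntEnum


class OverrideState(IntEnum):
    # Simplified stand-in for the real OverrideState used above.
    NOT_DEFINED = 0
    DEFAULTS = 1
    STUDIO = 2
    PROJECT = 3


def check_defaults(has_default_value, state, ignore_missing_defaults):
    # Mirrors the recurring pattern: entities set above the DEFAULTS state
    # must have defaults, unless the caller explicitly opts out of the check.
    if state > OverrideState.DEFAULTS:
        if not has_default_value and not ignore_missing_defaults:
            raise ValueError("Defaults are not defined")


check_defaults(False, OverrideState.STUDIO, ignore_missing_defaults=True)  # passes
```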
@@ -154,7 +154,9 @@ class DictMutableKeysEntity(EndpointEntity):
 
     def add_key(self, key):
         new_child = self._add_key(key)
-        new_child.set_override_state(self._override_state)
+        new_child.set_override_state(
+            self._override_state, self._ignore_missing_defaults
+        )
         self.on_change()
         return new_child
 
@@ -320,7 +322,7 @@ class DictMutableKeysEntity(EndpointEntity):
     def _metadata_for_current_state(self):
         return self._get_metadata_for_state(self._override_state)
 
-    def set_override_state(self, state):
+    def set_override_state(self, state, ignore_missing_defaults):
         # Trigger override state change of root if it is not the same
         if self.root_item.override_state is not state:
             self.root_item.set_override_state(state)
@@ -328,14 +330,22 @@ class DictMutableKeysEntity(EndpointEntity):
 
         # TODO change metadata
         self._override_state = state
+        self._ignore_missing_defaults = ignore_missing_defaults
 
         # Ignore if this is a dynamic item and use default in that case
         if not self.is_dynamic_item and not self.is_in_dynamic_item:
             if state > OverrideState.DEFAULTS:
-                if not self.has_default_value:
+                if (
+                    not self.has_default_value
+                    and not ignore_missing_defaults
+                ):
                     raise DefaultsNotDefined(self)
 
             elif state > OverrideState.STUDIO:
-                if not self.had_studio_override:
+                if (
+                    not self.had_studio_override
+                    and not ignore_missing_defaults
+                ):
                     raise StudioDefaultsNotDefined(self)
 
         if state is OverrideState.STUDIO:
@@ -426,7 +436,7 @@ class DictMutableKeysEntity(EndpointEntity):
 
             if label:
                 children_label_by_id[child_entity.id] = label
-            child_entity.set_override_state(state)
+            child_entity.set_override_state(state, ignore_missing_defaults)
 
         self.children_label_by_id = children_label_by_id
 
@@ -610,7 +620,9 @@ class DictMutableKeysEntity(EndpointEntity):
         if not self._can_discard_changes:
             return
 
-        self.set_override_state(self._override_state)
+        self.set_override_state(
+            self._override_state, self._ignore_missing_defaults
+        )
         on_change_trigger.append(self.on_change)
 
     def _add_to_studio_default(self, _on_change_trigger):
@@ -645,7 +657,9 @@ class DictMutableKeysEntity(EndpointEntity):
             if label:
                 children_label_by_id[child_entity.id] = label
 
-            child_entity.set_override_state(self._override_state)
+            child_entity.set_override_state(
+                self._override_state, self._ignore_missing_defaults
+            )
 
         self.children_label_by_id = children_label_by_id
 
@@ -694,7 +708,9 @@ class DictMutableKeysEntity(EndpointEntity):
             if label:
                 children_label_by_id[child_entity.id] = label
 
-            child_entity.set_override_state(self._override_state)
+            child_entity.set_override_state(
+                self._override_state, self._ignore_missing_defaults
+            )
 
         self.children_label_by_id = children_label_by_id
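Editor's note: a toy illustration of why `add_key` must forward the stored flag: a child created long after `set_override_state` ran still needs the same state and the same leniency about missing defaults. This is not the real `DictMutableKeysEntity`, just the forwarding pattern in isolation.

```python
class ToyMutableDict:
    # Hypothetical stand-in; only demonstrates the state forwarding.
    def __init__(self, override_state, ignore_missing_defaults):
        self._override_state = override_state
        self._ignore_missing_defaults = ignore_missing_defaults
        self.children = {}

    def add_key(self, key):
        # A dynamically added child inherits both stored values instead of
        # re-running the missing-defaults check with a hardcoded value.
        child = {
            "state": self._override_state,
            "ignore_missing_defaults": self._ignore_missing_defaults,
        }
        self.children[key] = child
        return child


entity = ToyMutableDict(override_state=3, ignore_missing_defaults=True)
print(entity.add_key("profile_a"))
```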
@@ -1,3 +1,4 @@
+import copy
 from .input_entities import InputEntity
 from .exceptions import EntitySchemaError
 from .lib import (
@@ -72,21 +73,41 @@ class EnumEntity(BaseEnumEntity):
     def _item_initalization(self):
         self.multiselection = self.schema_data.get("multiselection", False)
         self.enum_items = self.schema_data.get("enum_items")
+        # Default is optional and non breaking attribute
+        enum_default = self.schema_data.get("default")
 
-        valid_keys = set()
+        all_keys = []
         for item in self.enum_items or []:
-            valid_keys.add(tuple(item.keys())[0])
+            key = tuple(item.keys())[0]
+            all_keys.append(key)
 
-        self.valid_keys = valid_keys
+        self.valid_keys = set(all_keys)
 
         if self.multiselection:
             self.valid_value_types = (list, )
-            self.value_on_not_set = []
+            value_on_not_set = []
+            if enum_default:
+                if not isinstance(enum_default, list):
+                    enum_default = [enum_default]
+
+                for item in enum_default:
+                    if item in all_keys:
+                        value_on_not_set.append(item)
+
+            self.value_on_not_set = value_on_not_set
 
         else:
-            for key in valid_keys:
-                if self.value_on_not_set is NOT_SET:
-                    self.value_on_not_set = key
-                    break
+            if isinstance(enum_default, list) and enum_default:
+                enum_default = enum_default[0]
+
+            if enum_default in self.valid_keys:
+                self.value_on_not_set = enum_default
+
+            else:
+                for key in all_keys:
+                    if self.value_on_not_set is NOT_SET:
+                        self.value_on_not_set = key
+                        break
 
             self.valid_value_types = (STRING_TYPE, )
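Editor's note: the new optional `default` key selects the initial enum value; anything not present among the defined keys falls back to the first key. A standalone mirror of that resolution logic (plain functions, not the entity class):

```python
def resolve_enum_default(enum_items, default, multiselection):
    # enum_items is a list of single-key dicts, e.g. [{"off": "Off"}].
    all_keys = [tuple(item.keys())[0] for item in enum_items]
    if multiselection:
        if default and not isinstance(default, list):
            default = [default]
        # Keep only defaults that are actually defined.
        return [key for key in (default or []) if key in all_keys]

    if isinstance(default, list) and default:
        default = default[0]
    if default in all_keys:
        return default
    # Fall back to the first defined key (or None when there are none).
    return all_keys[0] if all_keys else None


items = [{"off": "Off"}, {"on": "On"}]
print(resolve_enum_default(items, "on", False))     # -> "on"
print(resolve_enum_default(items, "bogus", False))  # -> "off"
print(resolve_enum_default(items, "on", True))      # -> ["on"]
```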
@@ -118,29 +139,43 @@ class HostsEnumEntity(BaseEnumEntity):
     implementation instead of application name.
     """
     schema_types = ["hosts-enum"]
+    all_host_names = [
+        "aftereffects",
+        "blender",
+        "celaction",
+        "fusion",
+        "harmony",
+        "hiero",
+        "houdini",
+        "maya",
+        "nuke",
+        "photoshop",
+        "resolve",
+        "tvpaint",
+        "unreal",
+        "standalonepublisher"
+    ]
 
     def _item_initalization(self):
         self.multiselection = self.schema_data.get("multiselection", True)
-        self.use_empty_value = self.schema_data.get(
-            "use_empty_value", not self.multiselection
-        )
+        use_empty_value = False
+        if not self.multiselection:
+            use_empty_value = self.schema_data.get(
+                "use_empty_value", use_empty_value
+            )
+        self.use_empty_value = use_empty_value
 
         hosts_filter = self.schema_data.get("hosts_filter") or []
         self.hosts_filter = hosts_filter
 
         custom_labels = self.schema_data.get("custom_labels") or {}
 
-        host_names = [
-            "aftereffects",
-            "blender",
-            "celaction",
-            "fusion",
-            "harmony",
-            "hiero",
-            "houdini",
-            "maya",
-            "nuke",
-            "photoshop",
-            "resolve",
-            "tvpaint",
-            "unreal"
-        ]
+        host_names = copy.deepcopy(self.all_host_names)
         if hosts_filter:
             for host_name in tuple(host_names):
                 if host_name not in hosts_filter:
                     host_names.remove(host_name)
 
         if self.use_empty_value:
             host_names.insert(0, "")
             # Add default label for empty value if not available
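Editor's note: a sketch of a `hosts-enum` schema using the keys read above. `hosts_filter` whitelists entries from the class-level `all_host_names`, and `use_empty_value` is now honored only when `multiselection` is off. The `label` key is an assumption; it is not read in this hunk.

```python
hosts_enum_schema = {
    "type": "hosts-enum",
    "key": "host",
    "label": "Host",             # assumed key, shown for completeness
    "multiselection": False,
    "use_empty_value": True,     # offers an empty ("") first choice
    "hosts_filter": ["maya", "nuke", "houdini"]
}
```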
@@ -172,6 +207,44 @@ class HostsEnumEntity(BaseEnumEntity):
         # GUI attribute
         self.placeholder = self.schema_data.get("placeholder")
 
+    def schema_validations(self):
+        if self.hosts_filter:
+            enum_len = len(self.enum_items)
+            if (
+                enum_len == 0
+                or (enum_len == 1 and self.use_empty_value)
+            ):
+                joined_filters = ", ".join([
+                    '"{}"'.format(item)
+                    for item in self.hosts_filter
+                ])
+                reason = (
+                    "All host names were removed after applying"
+                    " host filters. {}"
+                ).format(joined_filters)
+                raise EntitySchemaError(self, reason)
+
+            invalid_filters = set()
+            for item in self.hosts_filter:
+                if item not in self.all_host_names:
+                    invalid_filters.add(item)
+
+            if invalid_filters:
+                joined_filters = ", ".join([
+                    '"{}"'.format(item)
+                    for item in self.hosts_filter
+                ])
+                expected_hosts = ", ".join([
+                    '"{}"'.format(item)
+                    for item in self.all_host_names
+                ])
+                self.log.warning((
+                    "Host filters contain invalid host names:"
+                    " \"{}\" Expected values are {}"
+                ).format(joined_filters, expected_hosts))
+
+        super(HostsEnumEntity, self).schema_validations()
+
+
 class AppsEnumEntity(BaseEnumEntity):
     schema_types = ["apps-enum"]
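Editor's note: a standalone mirror of the two checks the new `schema_validations` performs, so the failure modes are easy to try out (trimmed host list, plain exceptions instead of `EntitySchemaError`):

```python
ALL_HOST_NAMES = {"maya", "nuke", "houdini"}  # trimmed for the example


def validate_hosts_filter(hosts_filter, enum_items, use_empty_value):
    if not hosts_filter:
        return
    enum_len = len(enum_items)
    # Hard error: the filter removed every selectable host.
    if enum_len == 0 or (enum_len == 1 and use_empty_value):
        raise ValueError(
            "All host names were removed after applying host filters: "
            + ", ".join(sorted(hosts_filter))
        )
    # Soft warning: filter entries that are not known host names.
    invalid = set(hosts_filter) - ALL_HOST_NAMES
    if invalid:
        print("Warning: host filters contain invalid host names: "
              + ", ".join(sorted(invalid)))


validate_hosts_filter(["maya", "bogus"], [{"maya": "Maya"}], False)
```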
@@ -1,5 +1,6 @@
 import re
+import copy
 import json
 from abc import abstractmethod
 
 from .base_entity import ItemEntity
@@ -217,21 +218,28 @@ class InputEntity(EndpointEntity):
             return True
         return False
 
-    def set_override_state(self, state):
+    def set_override_state(self, state, ignore_missing_defaults):
         # Trigger override state change of root if it is not the same
         if self.root_item.override_state is not state:
             self.root_item.set_override_state(state)
             return
 
         self._override_state = state
+        self._ignore_missing_defaults = ignore_missing_defaults
         # Ignore if this is a dynamic item and use default in that case
         if not self.is_dynamic_item and not self.is_in_dynamic_item:
             if state > OverrideState.DEFAULTS:
-                if not self.has_default_value:
+                if (
+                    not self.has_default_value
+                    and not ignore_missing_defaults
+                ):
                     raise DefaultsNotDefined(self)
 
             elif state > OverrideState.STUDIO:
-                if not self.had_studio_override:
+                if (
+                    not self.had_studio_override
+                    and not ignore_missing_defaults
+                ):
                     raise StudioDefaultsNotDefined(self)
 
         if state is OverrideState.STUDIO:
@@ -433,6 +441,7 @@ class RawJsonEntity(InputEntity):
 
     def _item_initalization(self):
         # Schema must define if valid value is dict or list
+        store_as_string = self.schema_data.get("store_as_string", False)
         is_list = self.schema_data.get("is_list", False)
         if is_list:
             valid_value_types = (list, )
@@ -441,6 +450,8 @@ class RawJsonEntity(InputEntity):
             valid_value_types = (dict, )
             value_on_not_set = {}
 
+        self.store_as_string = store_as_string
+
         self._is_list = is_list
         self.valid_value_types = valid_value_types
         self.value_on_not_set = value_on_not_set
@@ -484,6 +495,23 @@ class RawJsonEntity(InputEntity):
         result = self.metadata != self._metadata_for_current_state()
         return result
 
+    def schema_validations(self):
+        if self.store_as_string and self.is_env_group:
+            reason = (
+                "RawJson entity can't store environment group metadata"
+                " as string."
+            )
+            raise EntitySchemaError(self, reason)
+        super(RawJsonEntity, self).schema_validations()
+
+    def _convert_to_valid_type(self, value):
+        if isinstance(value, STRING_TYPE):
+            try:
+                return json.loads(value)
+            except Exception:
+                pass
+        return super(RawJsonEntity, self)._convert_to_valid_type(value)
+
     def _metadata_for_current_state(self):
         if (
             self._override_state is OverrideState.PROJECT
@@ -503,6 +531,9 @@ class RawJsonEntity(InputEntity):
         value = super(RawJsonEntity, self)._settings_value()
         if self.is_env_group and isinstance(value, dict):
             value.update(self.metadata)
+
+        if self.store_as_string:
+            return json.dumps(value)
         return value
 
     def _prepare_value(self, value):
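Editor's note: `store_as_string` makes the raw-JSON entity hand its value to settings as a JSON string, while `_convert_to_valid_type` parses strings back into dict/list on the way in. A standalone mirror of the round trip:

```python
import json


def settings_value(value, store_as_string):
    # Mirrors the new branch in RawJsonEntity._settings_value.
    if store_as_string:
        return json.dumps(value)
    return value


def convert_to_valid_type(value):
    # Mirrors RawJsonEntity._convert_to_valid_type: parse strings back
    # into native types when possible, otherwise leave them unchanged.
    if isinstance(value, str):
        try:
            return json.loads(value)
        except ValueError:
            pass
    return value


stored = settings_value({"PATH": "/opt/tool"}, store_as_string=True)
print(stored)                         # '{"PATH": "/opt/tool"}'
print(convert_to_valid_type(stored))  # {'PATH': '/opt/tool'}
```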
Some files were not shown because too many files have changed in this diff.