mirror of
https://github.com/ynput/ayon-core.git
synced 2025-12-25 05:14:40 +01:00
Merge branch 'develop' into bugfix/OP-2803_nuke-farm-publishing-with-multiple-bake-profiles
This commit is contained in:
commit
f97e2498cd
74 changed files with 2556 additions and 824 deletions
5
.gitmodules
vendored
5
.gitmodules
vendored
|
|
@ -1,6 +1,3 @@
|
|||
[submodule "repos/avalon-core"]
|
||||
path = repos/avalon-core
|
||||
url = https://github.com/pypeclub/avalon-core.git
|
||||
[submodule "repos/avalon-unreal-integration"]
|
||||
path = repos/avalon-unreal-integration
|
||||
url = https://github.com/pypeclub/avalon-unreal-integration.git
|
||||
url = https://github.com/pypeclub/avalon-core.git
|
||||
14
CHANGELOG.md
14
CHANGELOG.md
|
|
@ -1,6 +1,6 @@
|
|||
# Changelog
|
||||
|
||||
## [3.9.0-nightly.4](https://github.com/pypeclub/OpenPype/tree/HEAD)
|
||||
## [3.9.0-nightly.5](https://github.com/pypeclub/OpenPype/tree/HEAD)
|
||||
|
||||
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.8.2...HEAD)
|
||||
|
||||
|
|
@ -14,21 +14,23 @@
|
|||
- Documentation: broken link fix [\#2785](https://github.com/pypeclub/OpenPype/pull/2785)
|
||||
- Documentation: link fixes [\#2772](https://github.com/pypeclub/OpenPype/pull/2772)
|
||||
- Update docusaurus to latest version [\#2760](https://github.com/pypeclub/OpenPype/pull/2760)
|
||||
- Various testing updates [\#2726](https://github.com/pypeclub/OpenPype/pull/2726)
|
||||
|
||||
**🚀 Enhancements**
|
||||
|
||||
- General: Color dialog UI fixes [\#2817](https://github.com/pypeclub/OpenPype/pull/2817)
|
||||
- General: Set context environments for non host applications [\#2803](https://github.com/pypeclub/OpenPype/pull/2803)
|
||||
- Tray publisher: New Tray Publisher host \(beta\) [\#2778](https://github.com/pypeclub/OpenPype/pull/2778)
|
||||
- Houdini: Implement Reset Frame Range [\#2770](https://github.com/pypeclub/OpenPype/pull/2770)
|
||||
- Pyblish Pype: Remove redundant new line in installed fonts printing [\#2758](https://github.com/pypeclub/OpenPype/pull/2758)
|
||||
- Flame: use Shot Name on segment for asset name [\#2751](https://github.com/pypeclub/OpenPype/pull/2751)
|
||||
- Flame: adding validator source clip [\#2746](https://github.com/pypeclub/OpenPype/pull/2746)
|
||||
- Work Files: Preserve subversion comment of current filename by default [\#2734](https://github.com/pypeclub/OpenPype/pull/2734)
|
||||
- Ftrack: Disable ftrack module by default [\#2732](https://github.com/pypeclub/OpenPype/pull/2732)
|
||||
- RoyalRender: Minor enhancements [\#2700](https://github.com/pypeclub/OpenPype/pull/2700)
|
||||
|
||||
**🐛 Bug fixes**
|
||||
|
||||
- Settings: Missing document with OP versions may break start of OpenPype [\#2825](https://github.com/pypeclub/OpenPype/pull/2825)
|
||||
- Settings UI: Fix "Apply from" action [\#2820](https://github.com/pypeclub/OpenPype/pull/2820)
|
||||
- Settings UI: Search case sensitivity [\#2810](https://github.com/pypeclub/OpenPype/pull/2810)
|
||||
- Flame Babypublisher optimalization [\#2806](https://github.com/pypeclub/OpenPype/pull/2806)
|
||||
- resolve: fixing fusion module loading [\#2802](https://github.com/pypeclub/OpenPype/pull/2802)
|
||||
|
|
@ -38,13 +40,15 @@
|
|||
- Maya: Fix `unique\_namespace` when in an namespace that is empty [\#2759](https://github.com/pypeclub/OpenPype/pull/2759)
|
||||
- Loader UI: Fix right click in representation widget [\#2757](https://github.com/pypeclub/OpenPype/pull/2757)
|
||||
- Aftereffects 2022 and Deadline [\#2748](https://github.com/pypeclub/OpenPype/pull/2748)
|
||||
- Flame: bunch of bugs [\#2745](https://github.com/pypeclub/OpenPype/pull/2745)
|
||||
- Maya: Save current scene on workfile publish [\#2744](https://github.com/pypeclub/OpenPype/pull/2744)
|
||||
- Version Up: Preserve parts of filename after version number \(like subversion\) on version\_up [\#2741](https://github.com/pypeclub/OpenPype/pull/2741)
|
||||
- Loader UI: Multiple asset selection and underline colors fixed [\#2731](https://github.com/pypeclub/OpenPype/pull/2731)
|
||||
- Maya: Remove some unused code [\#2709](https://github.com/pypeclub/OpenPype/pull/2709)
|
||||
|
||||
**Merged pull requests:**
|
||||
|
||||
- Move Unreal Implementation to OpenPype [\#2823](https://github.com/pypeclub/OpenPype/pull/2823)
|
||||
- Ftrack: Job killer with missing user [\#2819](https://github.com/pypeclub/OpenPype/pull/2819)
|
||||
- Ftrack: Unset task ids from asset versions before tasks are removed [\#2800](https://github.com/pypeclub/OpenPype/pull/2800)
|
||||
- Slack: fail gracefully if slack exception [\#2798](https://github.com/pypeclub/OpenPype/pull/2798)
|
||||
- Ftrack: Moved module one hierarchy level higher [\#2792](https://github.com/pypeclub/OpenPype/pull/2792)
|
||||
|
|
@ -54,10 +58,10 @@
|
|||
- Houdini: Remove duplicate ValidateOutputNode plug-in [\#2780](https://github.com/pypeclub/OpenPype/pull/2780)
|
||||
- Slack: Added regex for filtering on subset names [\#2775](https://github.com/pypeclub/OpenPype/pull/2775)
|
||||
- Houdini: Fix open last workfile [\#2767](https://github.com/pypeclub/OpenPype/pull/2767)
|
||||
- General: Extract template formatting from anatomy [\#2766](https://github.com/pypeclub/OpenPype/pull/2766)
|
||||
- Harmony: Rendering in Deadline didn't work in other machines than submitter [\#2754](https://github.com/pypeclub/OpenPype/pull/2754)
|
||||
- Houdini: Move Houdini Save Current File to beginning of ExtractorOrder [\#2747](https://github.com/pypeclub/OpenPype/pull/2747)
|
||||
- Maya: set Deadline job/batch name to original source workfile name instead of published workfile [\#2733](https://github.com/pypeclub/OpenPype/pull/2733)
|
||||
- Fusion: Moved implementation into OpenPype [\#2713](https://github.com/pypeclub/OpenPype/pull/2713)
|
||||
|
||||
## [3.8.2](https://github.com/pypeclub/OpenPype/tree/3.8.2) (2022-02-07)
|
||||
|
||||
|
|
|
|||
|
|
@ -202,13 +202,10 @@ def reload_pipeline(*args):
|
|||
avalon.api.uninstall()
|
||||
|
||||
for module in (
|
||||
"avalon.io",
|
||||
"avalon.lib",
|
||||
"avalon.pipeline",
|
||||
"avalon.tools.creator.app",
|
||||
"avalon.tools.manager.app",
|
||||
"avalon.api",
|
||||
"avalon.tools",
|
||||
"avalon.io",
|
||||
"avalon.lib",
|
||||
"avalon.pipeline",
|
||||
"avalon.api",
|
||||
):
|
||||
module = importlib.import_module(module)
|
||||
importlib.reload(module)
|
||||
|
|
|
|||
|
|
@ -361,7 +361,7 @@ def zip_and_move(source, destination):
|
|||
log.debug(f"Saved '{source}' to '{destination}'")
|
||||
|
||||
|
||||
def show(module_name):
|
||||
def show(tool_name):
|
||||
"""Call show on "module_name".
|
||||
|
||||
This allows to make a QApplication ahead of time and always "exec_" to
|
||||
|
|
@ -375,13 +375,6 @@ def show(module_name):
|
|||
# requests to be received properly.
|
||||
time.sleep(1)
|
||||
|
||||
# Get tool name from module name
|
||||
# TODO this is for backwards compatibility not sure if `TB_sceneOpened.js`
|
||||
# is automatically updated.
|
||||
# Previous javascript sent 'module_name' which contained whole tool import
|
||||
# string e.g. "avalon.tools.workfiles" now it should be only "workfiles"
|
||||
tool_name = module_name.split(".")[-1]
|
||||
|
||||
kwargs = {}
|
||||
if tool_name == "loader":
|
||||
kwargs["use_context"] = True
|
||||
|
|
|
|||
|
|
@ -37,17 +37,17 @@ class ToolWindows:
|
|||
|
||||
|
||||
def edit_shader_definitions():
|
||||
from avalon.tools import lib
|
||||
from Qt import QtWidgets
|
||||
from openpype.hosts.maya.api.shader_definition_editor import (
|
||||
ShaderDefinitionsEditor
|
||||
)
|
||||
from openpype.tools.utils import qt_app_context
|
||||
|
||||
top_level_widgets = QtWidgets.QApplication.topLevelWidgets()
|
||||
main_window = next(widget for widget in top_level_widgets
|
||||
if widget.objectName() == "MayaWindow")
|
||||
|
||||
with lib.application():
|
||||
with qt_app_context():
|
||||
window = ToolWindows.get_window("shader_definition_editor")
|
||||
if not window:
|
||||
window = ShaderDefinitionsEditor(parent=main_window)
|
||||
|
|
|
|||
|
|
@ -36,7 +36,7 @@ def install():
|
|||
return
|
||||
|
||||
def deferred():
|
||||
from avalon.tools import publish
|
||||
pyblish_icon = host_tools.get_pyblish_icon()
|
||||
parent_widget = get_main_window()
|
||||
cmds.menu(
|
||||
MENU_NAME,
|
||||
|
|
@ -80,7 +80,7 @@ def install():
|
|||
command=lambda *args: host_tools.show_publish(
|
||||
parent=parent_widget
|
||||
),
|
||||
image=publish.ICON
|
||||
image=pyblish_icon
|
||||
)
|
||||
|
||||
cmds.menuItem(
|
||||
|
|
|
|||
|
|
@ -1,5 +1,6 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
import os
|
||||
import maya.cmds as cmds
|
||||
import maya.cmds as cmds # noqa
|
||||
from avalon import api
|
||||
from openpype.api import get_project_settings
|
||||
from openpype.hosts.maya.api.lib import (
|
||||
|
|
@ -42,20 +43,20 @@ class VRaySceneLoader(api.Loader):
|
|||
with maintained_selection():
|
||||
cmds.namespace(addNamespace=namespace)
|
||||
with namespaced(namespace, new=False):
|
||||
nodes, group_node = self.create_vray_scene(name,
|
||||
filename=self.fname)
|
||||
nodes, root_node = self.create_vray_scene(name,
|
||||
filename=self.fname)
|
||||
|
||||
self[:] = nodes
|
||||
if not nodes:
|
||||
return
|
||||
|
||||
# colour the group node
|
||||
presets = get_project_settings(os.environ['AVALON_PROJECT'])
|
||||
colors = presets['maya']['load']['colors']
|
||||
settings = get_project_settings(os.environ['AVALON_PROJECT'])
|
||||
colors = settings['maya']['load']['colors']
|
||||
c = colors.get(family)
|
||||
if c is not None:
|
||||
cmds.setAttr("{0}.useOutlinerColor".format(group_node), 1)
|
||||
cmds.setAttr("{0}.outlinerColor".format(group_node),
|
||||
cmds.setAttr("{0}.useOutlinerColor".format(root_node), 1)
|
||||
cmds.setAttr("{0}.outlinerColor".format(root_node),
|
||||
(float(c[0])/255),
|
||||
(float(c[1])/255),
|
||||
(float(c[2])/255)
|
||||
|
|
@ -123,17 +124,21 @@ class VRaySceneLoader(api.Loader):
|
|||
mesh_node_name = "VRayScene_{}".format(name)
|
||||
|
||||
trans = cmds.createNode(
|
||||
"transform", name="{}".format(mesh_node_name))
|
||||
mesh = cmds.createNode(
|
||||
"mesh", name="{}_Shape".format(mesh_node_name), parent=trans)
|
||||
"transform", name=mesh_node_name)
|
||||
vray_scene = cmds.createNode(
|
||||
"VRayScene", name="{}_VRSCN".format(mesh_node_name), parent=trans)
|
||||
mesh = cmds.createNode(
|
||||
"mesh", name="{}_Shape".format(mesh_node_name), parent=trans)
|
||||
|
||||
cmds.connectAttr(
|
||||
"{}.outMesh".format(vray_scene), "{}.inMesh".format(mesh))
|
||||
|
||||
cmds.setAttr("{}.FilePath".format(vray_scene), filename, type="string")
|
||||
|
||||
# Lock the shape nodes so the user cannot delete these
|
||||
cmds.lockNode(mesh, lock=True)
|
||||
cmds.lockNode(vray_scene, lock=True)
|
||||
|
||||
# Create important connections
|
||||
cmds.connectAttr("time1.outTime",
|
||||
"{0}.inputTime".format(trans))
|
||||
|
|
@ -141,11 +146,9 @@ class VRaySceneLoader(api.Loader):
|
|||
# Connect mesh to initialShadingGroup
|
||||
cmds.sets([mesh], forceElement="initialShadingGroup")
|
||||
|
||||
group_node = cmds.group(empty=True, name="{}_GRP".format(name))
|
||||
cmds.parent(trans, group_node)
|
||||
nodes = [trans, vray_scene, mesh, group_node]
|
||||
nodes = [trans, vray_scene, mesh]
|
||||
|
||||
# Fix: Force refresh so the mesh shows correctly after creation
|
||||
cmds.refresh()
|
||||
|
||||
return nodes, group_node
|
||||
return nodes, trans
|
||||
|
|
|
|||
|
|
@ -1,6 +1,5 @@
|
|||
import os
|
||||
import re
|
||||
import sys
|
||||
import six
|
||||
import platform
|
||||
import contextlib
|
||||
|
|
@ -679,10 +678,10 @@ def get_render_path(node):
|
|||
}
|
||||
|
||||
nuke_imageio_writes = get_created_node_imageio_setting(**data_preset)
|
||||
host_name = os.environ.get("AVALON_APP")
|
||||
|
||||
application = lib.get_application(os.environ["AVALON_APP_NAME"])
|
||||
data.update({
|
||||
"application": application,
|
||||
"app": host_name,
|
||||
"nuke_imageio_writes": nuke_imageio_writes
|
||||
})
|
||||
|
||||
|
|
@ -805,18 +804,14 @@ def create_write_node(name, data, input=None, prenodes=None,
|
|||
'''
|
||||
|
||||
imageio_writes = get_created_node_imageio_setting(**data)
|
||||
app_manager = ApplicationManager()
|
||||
app_name = os.environ.get("AVALON_APP_NAME")
|
||||
if app_name:
|
||||
app = app_manager.applications.get(app_name)
|
||||
|
||||
for knob in imageio_writes["knobs"]:
|
||||
if knob["name"] == "file_type":
|
||||
representation = knob["value"]
|
||||
|
||||
host_name = os.environ.get("AVALON_APP")
|
||||
try:
|
||||
data.update({
|
||||
"app": app.host_name,
|
||||
"app": host_name,
|
||||
"imageio_writes": imageio_writes,
|
||||
"representation": representation,
|
||||
})
|
||||
|
|
|
|||
|
|
@ -1,13 +1,15 @@
|
|||
import os
|
||||
import openpype.hosts
|
||||
|
||||
|
||||
def add_implementation_envs(env, _app):
|
||||
"""Modify environments to contain all required for implementation."""
|
||||
# Set AVALON_UNREAL_PLUGIN required for Unreal implementation
|
||||
# Set OPENPYPE_UNREAL_PLUGIN required for Unreal implementation
|
||||
unreal_plugin_path = os.path.join(
|
||||
os.environ["OPENPYPE_REPOS_ROOT"], "repos", "avalon-unreal-integration"
|
||||
os.path.dirname(os.path.abspath(openpype.hosts.__file__)),
|
||||
"unreal", "integration"
|
||||
)
|
||||
env["AVALON_UNREAL_PLUGIN"] = unreal_plugin_path
|
||||
env["OPENPYPE_UNREAL_PLUGIN"] = unreal_plugin_path
|
||||
|
||||
# Set default environments if are not set via settings
|
||||
defaults = {
|
||||
|
|
|
|||
|
|
@ -1,45 +1,40 @@
|
|||
import os
|
||||
import logging
|
||||
# -*- coding: utf-8 -*-
|
||||
"""Unreal Editor OpenPype host API."""
|
||||
|
||||
from avalon import api as avalon
|
||||
from pyblish import api as pyblish
|
||||
import openpype.hosts.unreal
|
||||
from .plugin import (
|
||||
Loader,
|
||||
Creator
|
||||
)
|
||||
from .pipeline import (
|
||||
install,
|
||||
uninstall,
|
||||
ls,
|
||||
publish,
|
||||
containerise,
|
||||
show_creator,
|
||||
show_loader,
|
||||
show_publisher,
|
||||
show_manager,
|
||||
show_experimental_tools,
|
||||
show_tools_dialog,
|
||||
show_tools_popup,
|
||||
instantiate,
|
||||
)
|
||||
|
||||
logger = logging.getLogger("openpype.hosts.unreal")
|
||||
|
||||
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.unreal.__file__))
|
||||
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
|
||||
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
|
||||
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
|
||||
|
||||
|
||||
def install():
|
||||
"""Install Unreal configuration for Avalon."""
|
||||
print("-=" * 40)
|
||||
logo = '''.
|
||||
.
|
||||
____________
|
||||
/ \\ __ \\
|
||||
\\ \\ \\/_\\ \\
|
||||
\\ \\ _____/ ______
|
||||
\\ \\ \\___// \\ \\
|
||||
\\ \\____\\ \\ \\_____\\
|
||||
\\/_____/ \\/______/ PYPE Club .
|
||||
.
|
||||
'''
|
||||
print(logo)
|
||||
print("installing OpenPype for Unreal ...")
|
||||
print("-=" * 40)
|
||||
logger.info("installing OpenPype for Unreal")
|
||||
pyblish.register_plugin_path(str(PUBLISH_PATH))
|
||||
avalon.register_plugin_path(avalon.Loader, str(LOAD_PATH))
|
||||
avalon.register_plugin_path(avalon.Creator, str(CREATE_PATH))
|
||||
|
||||
|
||||
def uninstall():
|
||||
"""Uninstall Unreal configuration for Avalon."""
|
||||
pyblish.deregister_plugin_path(str(PUBLISH_PATH))
|
||||
avalon.deregister_plugin_path(avalon.Loader, str(LOAD_PATH))
|
||||
avalon.deregister_plugin_path(avalon.Creator, str(CREATE_PATH))
|
||||
__all__ = [
|
||||
"install",
|
||||
"uninstall",
|
||||
"Creator",
|
||||
"Loader",
|
||||
"ls",
|
||||
"publish",
|
||||
"containerise",
|
||||
"show_creator",
|
||||
"show_loader",
|
||||
"show_publisher",
|
||||
"show_manager",
|
||||
"show_experimental_tools",
|
||||
"show_tools_dialog",
|
||||
"show_tools_popup",
|
||||
"instantiate"
|
||||
]
|
||||
|
|
|
|||
44
openpype/hosts/unreal/api/helpers.py
Normal file
44
openpype/hosts/unreal/api/helpers.py
Normal file
|
|
@ -0,0 +1,44 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
import unreal # noqa
|
||||
|
||||
|
||||
class OpenPypeUnrealException(Exception):
|
||||
pass
|
||||
|
||||
|
||||
@unreal.uclass()
|
||||
class OpenPypeHelpers(unreal.OpenPypeLib):
|
||||
"""Class wrapping some useful functions for OpenPype.
|
||||
|
||||
This class is extending native BP class in OpenPype Integration Plugin.
|
||||
|
||||
"""
|
||||
|
||||
@unreal.ufunction(params=[str, unreal.LinearColor, bool])
|
||||
def set_folder_color(self, path: str, color: unreal.LinearColor) -> None:
|
||||
"""Set color on folder in Content Browser.
|
||||
|
||||
This method sets color on folder in Content Browser. Unfortunately
|
||||
there is no way to refresh Content Browser so new color isn't applied
|
||||
immediately. They are saved to config file and appears correctly
|
||||
only after Editor is restarted.
|
||||
|
||||
Args:
|
||||
path (str): Path to folder
|
||||
color (:class:`unreal.LinearColor`): Color of the folder
|
||||
|
||||
Example:
|
||||
|
||||
OpenPypeHelpers().set_folder_color(
|
||||
"/Game/Path", unreal.LinearColor(a=1.0, r=1.0, g=0.5, b=0)
|
||||
)
|
||||
|
||||
Note:
|
||||
This will take effect only after Editor is restarted. I couldn't
|
||||
find a way to refresh it. Also this saves the color definition
|
||||
into the project config, binding this path with color. So if you
|
||||
delete this path and later re-create, it will set this color
|
||||
again.
|
||||
|
||||
"""
|
||||
self.c_set_folder_color(path, color, False)
|
||||
413
openpype/hosts/unreal/api/pipeline.py
Normal file
413
openpype/hosts/unreal/api/pipeline.py
Normal file
|
|
@ -0,0 +1,413 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
import os
|
||||
import logging
|
||||
from typing import List
|
||||
|
||||
import pyblish.api
|
||||
from avalon.pipeline import AVALON_CONTAINER_ID
|
||||
from avalon import api
|
||||
|
||||
from openpype.tools.utils import host_tools
|
||||
import openpype.hosts.unreal
|
||||
|
||||
import unreal # noqa
|
||||
|
||||
|
||||
logger = logging.getLogger("openpype.hosts.unreal")
|
||||
OPENPYPE_CONTAINERS = "OpenPypeContainers"
|
||||
|
||||
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.unreal.__file__))
|
||||
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
|
||||
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
|
||||
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
|
||||
|
||||
|
||||
def install():
|
||||
"""Install Unreal configuration for OpenPype."""
|
||||
print("-=" * 40)
|
||||
logo = '''.
|
||||
.
|
||||
____________
|
||||
/ \\ __ \\
|
||||
\\ \\ \\/_\\ \\
|
||||
\\ \\ _____/ ______
|
||||
\\ \\ \\___// \\ \\
|
||||
\\ \\____\\ \\ \\_____\\
|
||||
\\/_____/ \\/______/ PYPE Club .
|
||||
.
|
||||
'''
|
||||
print(logo)
|
||||
print("installing OpenPype for Unreal ...")
|
||||
print("-=" * 40)
|
||||
logger.info("installing OpenPype for Unreal")
|
||||
pyblish.api.register_plugin_path(str(PUBLISH_PATH))
|
||||
api.register_plugin_path(api.Loader, str(LOAD_PATH))
|
||||
api.register_plugin_path(api.Creator, str(CREATE_PATH))
|
||||
_register_callbacks()
|
||||
_register_events()
|
||||
|
||||
|
||||
def uninstall():
|
||||
"""Uninstall Unreal configuration for Avalon."""
|
||||
pyblish.api.deregister_plugin_path(str(PUBLISH_PATH))
|
||||
api.deregister_plugin_path(api.Loader, str(LOAD_PATH))
|
||||
api.deregister_plugin_path(api.Creator, str(CREATE_PATH))
|
||||
|
||||
|
||||
def _register_callbacks():
|
||||
"""
|
||||
TODO: Implement callbacks if supported by UE4
|
||||
"""
|
||||
pass
|
||||
|
||||
|
||||
def _register_events():
|
||||
"""
|
||||
TODO: Implement callbacks if supported by UE4
|
||||
"""
|
||||
pass
|
||||
|
||||
|
||||
class Creator(api.Creator):
|
||||
hosts = ["unreal"]
|
||||
asset_types = []
|
||||
|
||||
def process(self):
|
||||
nodes = list()
|
||||
|
||||
with unreal.ScopedEditorTransaction("OpenPype Creating Instance"):
|
||||
if (self.options or {}).get("useSelection"):
|
||||
self.log.info("setting ...")
|
||||
print("settings ...")
|
||||
nodes = unreal.EditorUtilityLibrary.get_selected_assets()
|
||||
|
||||
asset_paths = [a.get_path_name() for a in nodes]
|
||||
self.name = move_assets_to_path(
|
||||
"/Game", self.name, asset_paths
|
||||
)
|
||||
|
||||
instance = create_publish_instance("/Game", self.name)
|
||||
imprint(instance, self.data)
|
||||
|
||||
return instance
|
||||
|
||||
|
||||
def ls():
|
||||
"""List all containers.
|
||||
|
||||
List all found in *Content Manager* of Unreal and return
|
||||
metadata from them. Adding `objectName` to set.
|
||||
|
||||
"""
|
||||
ar = unreal.AssetRegistryHelpers.get_asset_registry()
|
||||
openpype_containers = ar.get_assets_by_class("AssetContainer", True)
|
||||
|
||||
# get_asset_by_class returns AssetData. To get all metadata we need to
|
||||
# load asset. get_tag_values() work only on metadata registered in
|
||||
# Asset Registry Project settings (and there is no way to set it with
|
||||
# python short of editing ini configuration file).
|
||||
for asset_data in openpype_containers:
|
||||
asset = asset_data.get_asset()
|
||||
data = unreal.EditorAssetLibrary.get_metadata_tag_values(asset)
|
||||
data["objectName"] = asset_data.asset_name
|
||||
data = cast_map_to_str_dict(data)
|
||||
|
||||
yield data
|
||||
|
||||
|
||||
def parse_container(container):
|
||||
"""To get data from container, AssetContainer must be loaded.
|
||||
|
||||
Args:
|
||||
container(str): path to container
|
||||
|
||||
Returns:
|
||||
dict: metadata stored on container
|
||||
"""
|
||||
asset = unreal.EditorAssetLibrary.load_asset(container)
|
||||
data = unreal.EditorAssetLibrary.get_metadata_tag_values(asset)
|
||||
data["objectName"] = asset.get_name()
|
||||
data = cast_map_to_str_dict(data)
|
||||
|
||||
return data
|
||||
|
||||
|
||||
def publish():
|
||||
"""Shorthand to publish from within host."""
|
||||
import pyblish.util
|
||||
|
||||
return pyblish.util.publish()
|
||||
|
||||
|
||||
def containerise(name, namespace, nodes, context, loader=None, suffix="_CON"):
|
||||
"""Bundles *nodes* (assets) into a *container* and add metadata to it.
|
||||
|
||||
Unreal doesn't support *groups* of assets that you can add metadata to.
|
||||
But it does support folders that helps to organize asset. Unfortunately
|
||||
those folders are just that - you cannot add any additional information
|
||||
to them. OpenPype Integration Plugin is providing way out - Implementing
|
||||
`AssetContainer` Blueprint class. This class when added to folder can
|
||||
handle metadata on it using standard
|
||||
:func:`unreal.EditorAssetLibrary.set_metadata_tag()` and
|
||||
:func:`unreal.EditorAssetLibrary.get_metadata_tag_values()`. It also
|
||||
stores and monitor all changes in assets in path where it resides. List of
|
||||
those assets is available as `assets` property.
|
||||
|
||||
This is list of strings starting with asset type and ending with its path:
|
||||
`Material /Game/OpenPype/Test/TestMaterial.TestMaterial`
|
||||
|
||||
"""
|
||||
# 1 - create directory for container
|
||||
root = "/Game"
|
||||
container_name = "{}{}".format(name, suffix)
|
||||
new_name = move_assets_to_path(root, container_name, nodes)
|
||||
|
||||
# 2 - create Asset Container there
|
||||
path = "{}/{}".format(root, new_name)
|
||||
create_container(container=container_name, path=path)
|
||||
|
||||
namespace = path
|
||||
|
||||
data = {
|
||||
"schema": "openpype:container-2.0",
|
||||
"id": AVALON_CONTAINER_ID,
|
||||
"name": new_name,
|
||||
"namespace": namespace,
|
||||
"loader": str(loader),
|
||||
"representation": context["representation"]["_id"],
|
||||
}
|
||||
# 3 - imprint data
|
||||
imprint("{}/{}".format(path, container_name), data)
|
||||
return path
|
||||
|
||||
|
||||
def instantiate(root, name, data, assets=None, suffix="_INS"):
|
||||
"""Bundles *nodes* into *container*.
|
||||
|
||||
Marking it with metadata as publishable instance. If assets are provided,
|
||||
they are moved to new path where `OpenPypePublishInstance` class asset is
|
||||
created and imprinted with metadata.
|
||||
|
||||
This can then be collected for publishing by Pyblish for example.
|
||||
|
||||
Args:
|
||||
root (str): root path where to create instance container
|
||||
name (str): name of the container
|
||||
data (dict): data to imprint on container
|
||||
assets (list of str): list of asset paths to include in publish
|
||||
instance
|
||||
suffix (str): suffix string to append to instance name
|
||||
|
||||
"""
|
||||
container_name = "{}{}".format(name, suffix)
|
||||
|
||||
# if we specify assets, create new folder and move them there. If not,
|
||||
# just create empty folder
|
||||
if assets:
|
||||
new_name = move_assets_to_path(root, container_name, assets)
|
||||
else:
|
||||
new_name = create_folder(root, name)
|
||||
|
||||
path = "{}/{}".format(root, new_name)
|
||||
create_publish_instance(instance=container_name, path=path)
|
||||
|
||||
imprint("{}/{}".format(path, container_name), data)
|
||||
|
||||
|
||||
def imprint(node, data):
|
||||
loaded_asset = unreal.EditorAssetLibrary.load_asset(node)
|
||||
for key, value in data.items():
|
||||
# Support values evaluated at imprint
|
||||
if callable(value):
|
||||
value = value()
|
||||
# Unreal doesn't support NoneType in metadata values
|
||||
if value is None:
|
||||
value = ""
|
||||
unreal.EditorAssetLibrary.set_metadata_tag(
|
||||
loaded_asset, key, str(value)
|
||||
)
|
||||
|
||||
with unreal.ScopedEditorTransaction("OpenPype containerising"):
|
||||
unreal.EditorAssetLibrary.save_asset(node)
|
||||
|
||||
|
||||
def show_tools_popup():
|
||||
"""Show popup with tools.
|
||||
|
||||
Popup will disappear on click or loosing focus.
|
||||
"""
|
||||
from openpype.hosts.unreal.api import tools_ui
|
||||
|
||||
tools_ui.show_tools_popup()
|
||||
|
||||
|
||||
def show_tools_dialog():
|
||||
"""Show dialog with tools.
|
||||
|
||||
Dialog will stay visible.
|
||||
"""
|
||||
from openpype.hosts.unreal.api import tools_ui
|
||||
|
||||
tools_ui.show_tools_dialog()
|
||||
|
||||
|
||||
def show_creator():
|
||||
host_tools.show_creator()
|
||||
|
||||
|
||||
def show_loader():
|
||||
host_tools.show_loader(use_context=True)
|
||||
|
||||
|
||||
def show_publisher():
|
||||
host_tools.show_publish()
|
||||
|
||||
|
||||
def show_manager():
|
||||
host_tools.show_scene_inventory()
|
||||
|
||||
|
||||
def show_experimental_tools():
|
||||
host_tools.show_experimental_tools_dialog()
|
||||
|
||||
|
||||
def create_folder(root: str, name: str) -> str:
|
||||
"""Create new folder.
|
||||
|
||||
If folder exists, append number at the end and try again, incrementing
|
||||
if needed.
|
||||
|
||||
Args:
|
||||
root (str): path root
|
||||
name (str): folder name
|
||||
|
||||
Returns:
|
||||
str: folder name
|
||||
|
||||
Example:
|
||||
>>> create_folder("/Game/Foo")
|
||||
/Game/Foo
|
||||
>>> create_folder("/Game/Foo")
|
||||
/Game/Foo1
|
||||
|
||||
"""
|
||||
eal = unreal.EditorAssetLibrary
|
||||
index = 1
|
||||
while True:
|
||||
if eal.does_directory_exist("{}/{}".format(root, name)):
|
||||
name = "{}{}".format(name, index)
|
||||
index += 1
|
||||
else:
|
||||
eal.make_directory("{}/{}".format(root, name))
|
||||
break
|
||||
|
||||
return name
|
||||
|
||||
|
||||
def move_assets_to_path(root: str, name: str, assets: List[str]) -> str:
|
||||
"""Moving (renaming) list of asset paths to new destination.
|
||||
|
||||
Args:
|
||||
root (str): root of the path (eg. `/Game`)
|
||||
name (str): name of destination directory (eg. `Foo` )
|
||||
assets (list of str): list of asset paths
|
||||
|
||||
Returns:
|
||||
str: folder name
|
||||
|
||||
Example:
|
||||
This will get paths of all assets under `/Game/Test` and move them
|
||||
to `/Game/NewTest`. If `/Game/NewTest` already exists, then resulting
|
||||
path will be `/Game/NewTest1`
|
||||
|
||||
>>> assets = unreal.EditorAssetLibrary.list_assets("/Game/Test")
|
||||
>>> move_assets_to_path("/Game", "NewTest", assets)
|
||||
NewTest
|
||||
|
||||
"""
|
||||
eal = unreal.EditorAssetLibrary
|
||||
name = create_folder(root, name)
|
||||
|
||||
unreal.log(assets)
|
||||
for asset in assets:
|
||||
loaded = eal.load_asset(asset)
|
||||
eal.rename_asset(
|
||||
asset, "{}/{}/{}".format(root, name, loaded.get_name())
|
||||
)
|
||||
|
||||
return name
|
||||
|
||||
|
||||
def create_container(container: str, path: str) -> unreal.Object:
|
||||
"""Helper function to create Asset Container class on given path.
|
||||
|
||||
This Asset Class helps to mark given path as Container
|
||||
and enable asset version control on it.
|
||||
|
||||
Args:
|
||||
container (str): Asset Container name
|
||||
path (str): Path where to create Asset Container. This path should
|
||||
point into container folder
|
||||
|
||||
Returns:
|
||||
:class:`unreal.Object`: instance of created asset
|
||||
|
||||
Example:
|
||||
|
||||
create_container(
|
||||
"/Game/modelingFooCharacter_CON",
|
||||
"modelingFooCharacter_CON"
|
||||
)
|
||||
|
||||
"""
|
||||
factory = unreal.AssetContainerFactory()
|
||||
tools = unreal.AssetToolsHelpers().get_asset_tools()
|
||||
|
||||
asset = tools.create_asset(container, path, None, factory)
|
||||
return asset
|
||||
|
||||
|
||||
def create_publish_instance(instance: str, path: str) -> unreal.Object:
|
||||
"""Helper function to create OpenPype Publish Instance on given path.
|
||||
|
||||
This behaves similarly as :func:`create_openpype_container`.
|
||||
|
||||
Args:
|
||||
path (str): Path where to create Publish Instance.
|
||||
This path should point into container folder
|
||||
instance (str): Publish Instance name
|
||||
|
||||
Returns:
|
||||
:class:`unreal.Object`: instance of created asset
|
||||
|
||||
Example:
|
||||
|
||||
create_publish_instance(
|
||||
"/Game/modelingFooCharacter_INST",
|
||||
"modelingFooCharacter_INST"
|
||||
)
|
||||
|
||||
"""
|
||||
factory = unreal.OpenPypePublishInstanceFactory()
|
||||
tools = unreal.AssetToolsHelpers().get_asset_tools()
|
||||
asset = tools.create_asset(instance, path, None, factory)
|
||||
return asset
|
||||
|
||||
|
||||
def cast_map_to_str_dict(umap) -> dict:
|
||||
"""Cast Unreal Map to dict.
|
||||
|
||||
Helper function to cast Unreal Map object to plain old python
|
||||
dict. This will also cast values and keys to str. Useful for
|
||||
metadata dicts.
|
||||
|
||||
Args:
|
||||
umap: Unreal Map object
|
||||
|
||||
Returns:
|
||||
dict
|
||||
|
||||
"""
|
||||
return {str(key): str(value) for (key, value) in umap.items()}
|
||||
|
|
@ -1,5 +1,8 @@
|
|||
from avalon import api
|
||||
# -*- coding: utf-8 -*-
|
||||
from abc import ABC
|
||||
|
||||
import openpype.api
|
||||
import avalon.api
|
||||
|
||||
|
||||
class Creator(openpype.api.Creator):
|
||||
|
|
@ -7,6 +10,6 @@ class Creator(openpype.api.Creator):
|
|||
defaults = ['Main']
|
||||
|
||||
|
||||
class Loader(api.Loader):
|
||||
class Loader(avalon.api.Loader, ABC):
|
||||
"""This serves as skeleton for future OpenPype specific functionality"""
|
||||
pass
|
||||
|
|
|
|||
|
|
@ -10,7 +10,7 @@ from openpype.lib import (
|
|||
get_workdir_data,
|
||||
get_workfile_template_key
|
||||
)
|
||||
from openpype.hosts.unreal.api import lib as unreal_lib
|
||||
import openpype.hosts.unreal.lib as unreal_lib
|
||||
|
||||
|
||||
class UnrealPrelaunchHook(PreLaunchHook):
|
||||
|
|
@ -136,9 +136,9 @@ class UnrealPrelaunchHook(PreLaunchHook):
|
|||
f"{self.signature} creating unreal "
|
||||
f"project [ {unreal_project_name} ]"
|
||||
))
|
||||
# Set "AVALON_UNREAL_PLUGIN" to current process environment for
|
||||
# Set "OPENPYPE_UNREAL_PLUGIN" to current process environment for
|
||||
# execution of `create_unreal_project`
|
||||
env_key = "AVALON_UNREAL_PLUGIN"
|
||||
env_key = "OPENPYPE_UNREAL_PLUGIN"
|
||||
if self.launch_context.env.get(env_key):
|
||||
os.environ[env_key] = self.launch_context.env[env_key]
|
||||
|
||||
|
|
|
|||
35
openpype/hosts/unreal/integration/.gitignore
vendored
Normal file
35
openpype/hosts/unreal/integration/.gitignore
vendored
Normal file
|
|
@ -0,0 +1,35 @@
|
|||
# Prerequisites
|
||||
*.d
|
||||
|
||||
# Compiled Object files
|
||||
*.slo
|
||||
*.lo
|
||||
*.o
|
||||
*.obj
|
||||
|
||||
# Precompiled Headers
|
||||
*.gch
|
||||
*.pch
|
||||
|
||||
# Compiled Dynamic libraries
|
||||
*.so
|
||||
*.dylib
|
||||
*.dll
|
||||
|
||||
# Fortran module files
|
||||
*.mod
|
||||
*.smod
|
||||
|
||||
# Compiled Static libraries
|
||||
*.lai
|
||||
*.la
|
||||
*.a
|
||||
*.lib
|
||||
|
||||
# Executables
|
||||
*.exe
|
||||
*.out
|
||||
*.app
|
||||
|
||||
/Binaries
|
||||
/Intermediate
|
||||
|
|
@ -0,0 +1,34 @@
|
|||
import unreal
|
||||
|
||||
openpype_detected = True
|
||||
try:
|
||||
from avalon import api
|
||||
except ImportError as exc:
|
||||
api = None
|
||||
openpype_detected = False
|
||||
unreal.log_error("Avalon: cannot load Avalon [ {} ]".format(exc))
|
||||
|
||||
try:
|
||||
from openpype.hosts.unreal import api as openpype_host
|
||||
except ImportError as exc:
|
||||
openpype_host = None
|
||||
openpype_detected = False
|
||||
unreal.log_error("OpenPype: cannot load OpenPype [ {} ]".format(exc))
|
||||
|
||||
if openpype_detected:
|
||||
api.install(openpype_host)
|
||||
|
||||
|
||||
@unreal.uclass()
|
||||
class OpenPypeIntegration(unreal.OpenPypePythonBridge):
|
||||
@unreal.ufunction(override=True)
|
||||
def RunInPython_Popup(self):
|
||||
unreal.log_warning("OpenPype: showing tools popup")
|
||||
if openpype_detected:
|
||||
openpype_host.show_tools_popup()
|
||||
|
||||
@unreal.ufunction(override=True)
|
||||
def RunInPython_Dialog(self):
|
||||
unreal.log_warning("OpenPype: showing tools dialog")
|
||||
if openpype_detected:
|
||||
openpype_host.show_tools_dialog()
|
||||
24
openpype/hosts/unreal/integration/OpenPype.uplugin
Normal file
24
openpype/hosts/unreal/integration/OpenPype.uplugin
Normal file
|
|
@ -0,0 +1,24 @@
|
|||
{
|
||||
"FileVersion": 3,
|
||||
"Version": 1,
|
||||
"VersionName": "1.0",
|
||||
"FriendlyName": "OpenPype",
|
||||
"Description": "OpenPype Integration",
|
||||
"Category": "OpenPype.Integration",
|
||||
"CreatedBy": "Ondrej Samohel",
|
||||
"CreatedByURL": "https://openpype.io",
|
||||
"DocsURL": "https://openpype.io/docs/artist_hosts_unreal",
|
||||
"MarketplaceURL": "",
|
||||
"SupportURL": "https://pype.club/",
|
||||
"CanContainContent": true,
|
||||
"IsBetaVersion": true,
|
||||
"IsExperimentalVersion": false,
|
||||
"Installed": false,
|
||||
"Modules": [
|
||||
{
|
||||
"Name": "OpenPype",
|
||||
"Type": "Editor",
|
||||
"LoadingPhase": "Default"
|
||||
}
|
||||
]
|
||||
}
|
||||
11
openpype/hosts/unreal/integration/README.md
Normal file
11
openpype/hosts/unreal/integration/README.md
Normal file
|
|
@ -0,0 +1,11 @@
|
|||
# OpenPype Unreal Integration plugin
|
||||
|
||||
This is plugin for Unreal Editor, creating menu for [OpenPype](https://github.com/getavalon) tools to run.
|
||||
|
||||
## How does this work
|
||||
|
||||
Plugin is creating basic menu items in **Window/OpenPype** section of Unreal Editor main menu and a button
|
||||
on the main toolbar with associated menu. Clicking on those menu items is calling callbacks that are
|
||||
declared in c++ but needs to be implemented during Unreal Editor
|
||||
startup in `Plugins/OpenPype/Content/Python/init_unreal.py` - this should be executed by Unreal Editor
|
||||
automatically.
|
||||
BIN
openpype/hosts/unreal/integration/Resources/openpype128.png
Normal file
BIN
openpype/hosts/unreal/integration/Resources/openpype128.png
Normal file
Binary file not shown.
|
After Width: | Height: | Size: 14 KiB |
BIN
openpype/hosts/unreal/integration/Resources/openpype40.png
Normal file
BIN
openpype/hosts/unreal/integration/Resources/openpype40.png
Normal file
Binary file not shown.
|
After Width: | Height: | Size: 4.8 KiB |
BIN
openpype/hosts/unreal/integration/Resources/openpype512.png
Normal file
BIN
openpype/hosts/unreal/integration/Resources/openpype512.png
Normal file
Binary file not shown.
|
After Width: | Height: | Size: 84 KiB |
|
|
@ -0,0 +1,57 @@
|
|||
// Copyright 1998-2019 Epic Games, Inc. All Rights Reserved.
|
||||
|
||||
using UnrealBuildTool;
|
||||
|
||||
public class OpenPype : ModuleRules
|
||||
{
|
||||
public OpenPype(ReadOnlyTargetRules Target) : base(Target)
|
||||
{
|
||||
PCHUsage = ModuleRules.PCHUsageMode.UseExplicitOrSharedPCHs;
|
||||
|
||||
PublicIncludePaths.AddRange(
|
||||
new string[] {
|
||||
// ... add public include paths required here ...
|
||||
}
|
||||
);
|
||||
|
||||
|
||||
PrivateIncludePaths.AddRange(
|
||||
new string[] {
|
||||
// ... add other private include paths required here ...
|
||||
}
|
||||
);
|
||||
|
||||
|
||||
PublicDependencyModuleNames.AddRange(
|
||||
new string[]
|
||||
{
|
||||
"Core",
|
||||
// ... add other public dependencies that you statically link with here ...
|
||||
}
|
||||
);
|
||||
|
||||
|
||||
PrivateDependencyModuleNames.AddRange(
|
||||
new string[]
|
||||
{
|
||||
"Projects",
|
||||
"InputCore",
|
||||
"UnrealEd",
|
||||
"LevelEditor",
|
||||
"CoreUObject",
|
||||
"Engine",
|
||||
"Slate",
|
||||
"SlateCore",
|
||||
// ... add private dependencies that you statically link with here ...
|
||||
}
|
||||
);
|
||||
|
||||
|
||||
DynamicallyLoadedModuleNames.AddRange(
|
||||
new string[]
|
||||
{
|
||||
// ... add any modules that your module loads dynamically here ...
|
||||
}
|
||||
);
|
||||
}
|
||||
}
|
||||
|
|
@ -0,0 +1,115 @@
|
|||
// Fill out your copyright notice in the Description page of Project Settings.
|
||||
|
||||
#include "AssetContainer.h"
|
||||
#include "AssetRegistryModule.h"
|
||||
#include "Misc/PackageName.h"
|
||||
#include "Engine.h"
|
||||
#include "Containers/UnrealString.h"
|
||||
|
||||
UAssetContainer::UAssetContainer(const FObjectInitializer& ObjectInitializer)
|
||||
: UAssetUserData(ObjectInitializer)
|
||||
{
|
||||
FAssetRegistryModule& AssetRegistryModule = FModuleManager::LoadModuleChecked<FAssetRegistryModule>("AssetRegistry");
|
||||
FString path = UAssetContainer::GetPathName();
|
||||
UE_LOG(LogTemp, Warning, TEXT("UAssetContainer %s"), *path);
|
||||
FARFilter Filter;
|
||||
Filter.PackagePaths.Add(FName(*path));
|
||||
|
||||
AssetRegistryModule.Get().OnAssetAdded().AddUObject(this, &UAssetContainer::OnAssetAdded);
|
||||
AssetRegistryModule.Get().OnAssetRemoved().AddUObject(this, &UAssetContainer::OnAssetRemoved);
|
||||
AssetRegistryModule.Get().OnAssetRenamed().AddUObject(this, &UAssetContainer::OnAssetRenamed);
|
||||
}
|
||||
|
||||
void UAssetContainer::OnAssetAdded(const FAssetData& AssetData)
|
||||
{
|
||||
TArray<FString> split;
|
||||
|
||||
// get directory of current container
|
||||
FString selfFullPath = UAssetContainer::GetPathName();
|
||||
FString selfDir = FPackageName::GetLongPackagePath(*selfFullPath);
|
||||
|
||||
// get asset path and class
|
||||
FString assetPath = AssetData.GetFullName();
|
||||
FString assetFName = AssetData.AssetClass.ToString();
|
||||
|
||||
// split path
|
||||
assetPath.ParseIntoArray(split, TEXT(" "), true);
|
||||
|
||||
FString assetDir = FPackageName::GetLongPackagePath(*split[1]);
|
||||
|
||||
// take interest only in paths starting with path of current container
|
||||
if (assetDir.StartsWith(*selfDir))
|
||||
{
|
||||
// exclude self
|
||||
if (assetFName != "AssetContainer")
|
||||
{
|
||||
assets.Add(assetPath);
|
||||
assetsData.Add(AssetData);
|
||||
UE_LOG(LogTemp, Log, TEXT("%s: asset added to %s"), *selfFullPath, *selfDir);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
void UAssetContainer::OnAssetRemoved(const FAssetData& AssetData)
|
||||
{
|
||||
TArray<FString> split;
|
||||
|
||||
// get directory of current container
|
||||
FString selfFullPath = UAssetContainer::GetPathName();
|
||||
FString selfDir = FPackageName::GetLongPackagePath(*selfFullPath);
|
||||
|
||||
// get asset path and class
|
||||
FString assetPath = AssetData.GetFullName();
|
||||
FString assetFName = AssetData.AssetClass.ToString();
|
||||
|
||||
// split path
|
||||
assetPath.ParseIntoArray(split, TEXT(" "), true);
|
||||
|
||||
FString assetDir = FPackageName::GetLongPackagePath(*split[1]);
|
||||
|
||||
// take interest only in paths starting with path of current container
|
||||
FString path = UAssetContainer::GetPathName();
|
||||
FString lpp = FPackageName::GetLongPackagePath(*path);
|
||||
|
||||
if (assetDir.StartsWith(*selfDir))
|
||||
{
|
||||
// exclude self
|
||||
if (assetFName != "AssetContainer")
|
||||
{
|
||||
// UE_LOG(LogTemp, Warning, TEXT("%s: asset removed"), *lpp);
|
||||
assets.Remove(assetPath);
|
||||
assetsData.Remove(AssetData);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
void UAssetContainer::OnAssetRenamed(const FAssetData& AssetData, const FString& str)
|
||||
{
|
||||
TArray<FString> split;
|
||||
|
||||
// get directory of current container
|
||||
FString selfFullPath = UAssetContainer::GetPathName();
|
||||
FString selfDir = FPackageName::GetLongPackagePath(*selfFullPath);
|
||||
|
||||
// get asset path and class
|
||||
FString assetPath = AssetData.GetFullName();
|
||||
FString assetFName = AssetData.AssetClass.ToString();
|
||||
|
||||
// split path
|
||||
assetPath.ParseIntoArray(split, TEXT(" "), true);
|
||||
|
||||
FString assetDir = FPackageName::GetLongPackagePath(*split[1]);
|
||||
if (assetDir.StartsWith(*selfDir))
|
||||
{
|
||||
// exclude self
|
||||
if (assetFName != "AssetContainer")
|
||||
{
|
||||
|
||||
assets.Remove(str);
|
||||
assets.Add(assetPath);
|
||||
assetsData.Remove(AssetData);
|
||||
// UE_LOG(LogTemp, Warning, TEXT("%s: asset renamed %s"), *lpp, *str);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
|
@ -0,0 +1,20 @@
|
|||
#include "AssetContainerFactory.h"
|
||||
#include "AssetContainer.h"
|
||||
|
||||
UAssetContainerFactory::UAssetContainerFactory(const FObjectInitializer& ObjectInitializer)
|
||||
: UFactory(ObjectInitializer)
|
||||
{
|
||||
SupportedClass = UAssetContainer::StaticClass();
|
||||
bCreateNew = false;
|
||||
bEditorImport = true;
|
||||
}
|
||||
|
||||
UObject* UAssetContainerFactory::FactoryCreateNew(UClass* Class, UObject* InParent, FName Name, EObjectFlags Flags, UObject* Context, FFeedbackContext* Warn)
|
||||
{
|
||||
UAssetContainer* AssetContainer = NewObject<UAssetContainer>(InParent, Class, Name, Flags);
|
||||
return AssetContainer;
|
||||
}
|
||||
|
||||
bool UAssetContainerFactory::ShouldShowInNewMenu() const {
|
||||
return false;
|
||||
}
|
||||
|
|
@ -0,0 +1,103 @@
|
|||
#include "OpenPype.h"
|
||||
#include "LevelEditor.h"
|
||||
#include "OpenPypePythonBridge.h"
|
||||
#include "OpenPypeStyle.h"
|
||||
|
||||
|
||||
static const FName OpenPypeTabName("OpenPype");
|
||||
|
||||
#define LOCTEXT_NAMESPACE "FOpenPypeModule"
|
||||
|
||||
// This function is triggered when the plugin is staring up
|
||||
void FOpenPypeModule::StartupModule()
|
||||
{
|
||||
|
||||
FOpenPypeStyle::Initialize();
|
||||
FOpenPypeStyle::SetIcon("Logo", "openpype40");
|
||||
|
||||
// Create the Extender that will add content to the menu
|
||||
FLevelEditorModule& LevelEditorModule = FModuleManager::LoadModuleChecked<FLevelEditorModule>("LevelEditor");
|
||||
|
||||
TSharedPtr<FExtender> MenuExtender = MakeShareable(new FExtender());
|
||||
TSharedPtr<FExtender> ToolbarExtender = MakeShareable(new FExtender());
|
||||
|
||||
MenuExtender->AddMenuExtension(
|
||||
"LevelEditor",
|
||||
EExtensionHook::After,
|
||||
NULL,
|
||||
FMenuExtensionDelegate::CreateRaw(this, &FOpenPypeModule::AddMenuEntry)
|
||||
);
|
||||
ToolbarExtender->AddToolBarExtension(
|
||||
"Settings",
|
||||
EExtensionHook::After,
|
||||
NULL,
|
||||
FToolBarExtensionDelegate::CreateRaw(this, &FOpenPypeModule::AddToobarEntry));
|
||||
|
||||
|
||||
LevelEditorModule.GetMenuExtensibilityManager()->AddExtender(MenuExtender);
|
||||
LevelEditorModule.GetToolBarExtensibilityManager()->AddExtender(ToolbarExtender);
|
||||
|
||||
}
|
||||
|
||||
void FOpenPypeModule::ShutdownModule()
|
||||
{
|
||||
FOpenPypeStyle::Shutdown();
|
||||
}
|
||||
|
||||
|
||||
void FOpenPypeModule::AddMenuEntry(FMenuBuilder& MenuBuilder)
|
||||
{
|
||||
// Create Section
|
||||
MenuBuilder.BeginSection("OpenPype", TAttribute<FText>(FText::FromString("OpenPype")));
|
||||
{
|
||||
// Create a Submenu inside of the Section
|
||||
MenuBuilder.AddMenuEntry(
|
||||
FText::FromString("Tools..."),
|
||||
FText::FromString("Pipeline tools"),
|
||||
FSlateIcon(FOpenPypeStyle::GetStyleSetName(), "OpenPype.Logo"),
|
||||
FUIAction(FExecuteAction::CreateRaw(this, &FOpenPypeModule::MenuPopup))
|
||||
);
|
||||
|
||||
MenuBuilder.AddMenuEntry(
|
||||
FText::FromString("Tools dialog..."),
|
||||
FText::FromString("Pipeline tools dialog"),
|
||||
FSlateIcon(FOpenPypeStyle::GetStyleSetName(), "OpenPype.Logo"),
|
||||
FUIAction(FExecuteAction::CreateRaw(this, &FOpenPypeModule::MenuDialog))
|
||||
);
|
||||
|
||||
}
|
||||
MenuBuilder.EndSection();
|
||||
}
|
||||
|
||||
void FOpenPypeModule::AddToobarEntry(FToolBarBuilder& ToolbarBuilder)
|
||||
{
|
||||
ToolbarBuilder.BeginSection(TEXT("OpenPype"));
|
||||
{
|
||||
ToolbarBuilder.AddToolBarButton(
|
||||
FUIAction(
|
||||
FExecuteAction::CreateRaw(this, &FOpenPypeModule::MenuPopup),
|
||||
NULL,
|
||||
FIsActionChecked()
|
||||
|
||||
),
|
||||
NAME_None,
|
||||
LOCTEXT("OpenPype_label", "OpenPype"),
|
||||
LOCTEXT("OpenPype_tooltip", "OpenPype Tools"),
|
||||
FSlateIcon(FOpenPypeStyle::GetStyleSetName(), "OpenPype.Logo")
|
||||
);
|
||||
}
|
||||
ToolbarBuilder.EndSection();
|
||||
}
|
||||
|
||||
|
||||
void FOpenPypeModule::MenuPopup() {
|
||||
UOpenPypePythonBridge* bridge = UOpenPypePythonBridge::Get();
|
||||
bridge->RunInPython_Popup();
|
||||
}
|
||||
|
||||
void FOpenPypeModule::MenuDialog() {
|
||||
UOpenPypePythonBridge* bridge = UOpenPypePythonBridge::Get();
|
||||
bridge->RunInPython_Dialog();
|
||||
}
|
||||
|
||||
IMPLEMENT_MODULE(FOpenPypeModule, OpenPype)
|
||||
|
|
@ -0,0 +1,48 @@
|
|||
#include "OpenPypeLib.h"
|
||||
#include "Misc/Paths.h"
|
||||
#include "Misc/ConfigCacheIni.h"
|
||||
#include "UObject/UnrealType.h"
|
||||
|
||||
/**
|
||||
* Sets color on folder icon on given path
|
||||
* @param InPath - path to folder
|
||||
* @param InFolderColor - color of the folder
|
||||
* @warning This color will appear only after Editor restart. Is there a better way?
|
||||
*/
|
||||
|
||||
void UOpenPypeLib::CSetFolderColor(FString FolderPath, FLinearColor FolderColor, bool bForceAdd)
|
||||
{
|
||||
auto SaveColorInternal = [](FString InPath, FLinearColor InFolderColor)
|
||||
{
|
||||
// Saves the color of the folder to the config
|
||||
if (FPaths::FileExists(GEditorPerProjectIni))
|
||||
{
|
||||
GConfig->SetString(TEXT("PathColor"), *InPath, *InFolderColor.ToString(), GEditorPerProjectIni);
|
||||
}
|
||||
|
||||
};
|
||||
|
||||
SaveColorInternal(FolderPath, FolderColor);
|
||||
|
||||
}
|
||||
/**
|
||||
* Returns all poperties on given object
|
||||
* @param cls - class
|
||||
* @return TArray of properties
|
||||
*/
|
||||
TArray<FString> UOpenPypeLib::GetAllProperties(UClass* cls)
|
||||
{
|
||||
TArray<FString> Ret;
|
||||
if (cls != nullptr)
|
||||
{
|
||||
for (TFieldIterator<FProperty> It(cls); It; ++It)
|
||||
{
|
||||
FProperty* Property = *It;
|
||||
if (Property->HasAnyPropertyFlags(EPropertyFlags::CPF_Edit))
|
||||
{
|
||||
Ret.Add(Property->GetName());
|
||||
}
|
||||
}
|
||||
}
|
||||
return Ret;
|
||||
}
|
||||
|
|
@ -0,0 +1,108 @@
|
|||
#pragma once
|
||||
|
||||
#include "OpenPypePublishInstance.h"
|
||||
#include "AssetRegistryModule.h"
|
||||
|
||||
|
||||
UOpenPypePublishInstance::UOpenPypePublishInstance(const FObjectInitializer& ObjectInitializer)
|
||||
: UObject(ObjectInitializer)
|
||||
{
|
||||
FAssetRegistryModule& AssetRegistryModule = FModuleManager::LoadModuleChecked<FAssetRegistryModule>("AssetRegistry");
|
||||
FString path = UOpenPypePublishInstance::GetPathName();
|
||||
FARFilter Filter;
|
||||
Filter.PackagePaths.Add(FName(*path));
|
||||
|
||||
AssetRegistryModule.Get().OnAssetAdded().AddUObject(this, &UOpenPypePublishInstance::OnAssetAdded);
|
||||
AssetRegistryModule.Get().OnAssetRemoved().AddUObject(this, &UOpenPypePublishInstance::OnAssetRemoved);
|
||||
AssetRegistryModule.Get().OnAssetRenamed().AddUObject(this, &UOpenPypePublishInstance::OnAssetRenamed);
|
||||
}
|
||||
|
||||
void UOpenPypePublishInstance::OnAssetAdded(const FAssetData& AssetData)
|
||||
{
|
||||
TArray<FString> split;
|
||||
|
||||
// get directory of current container
|
||||
FString selfFullPath = UOpenPypePublishInstance::GetPathName();
|
||||
FString selfDir = FPackageName::GetLongPackagePath(*selfFullPath);
|
||||
|
||||
// get asset path and class
|
||||
FString assetPath = AssetData.GetFullName();
|
||||
FString assetFName = AssetData.AssetClass.ToString();
|
||||
|
||||
// split path
|
||||
assetPath.ParseIntoArray(split, TEXT(" "), true);
|
||||
|
||||
FString assetDir = FPackageName::GetLongPackagePath(*split[1]);
|
||||
|
||||
// take interest only in paths starting with path of current container
|
||||
if (assetDir.StartsWith(*selfDir))
|
||||
{
|
||||
// exclude self
|
||||
if (assetFName != "OpenPypePublishInstance")
|
||||
{
|
||||
assets.Add(assetPath);
|
||||
UE_LOG(LogTemp, Log, TEXT("%s: asset added to %s"), *selfFullPath, *selfDir);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
void UOpenPypePublishInstance::OnAssetRemoved(const FAssetData& AssetData)
|
||||
{
|
||||
TArray<FString> split;
|
||||
|
||||
// get directory of current container
|
||||
FString selfFullPath = UOpenPypePublishInstance::GetPathName();
|
||||
FString selfDir = FPackageName::GetLongPackagePath(*selfFullPath);
|
||||
|
||||
// get asset path and class
|
||||
FString assetPath = AssetData.GetFullName();
|
||||
FString assetFName = AssetData.AssetClass.ToString();
|
||||
|
||||
// split path
|
||||
assetPath.ParseIntoArray(split, TEXT(" "), true);
|
||||
|
||||
FString assetDir = FPackageName::GetLongPackagePath(*split[1]);
|
||||
|
||||
// take interest only in paths starting with path of current container
|
||||
FString path = UOpenPypePublishInstance::GetPathName();
|
||||
FString lpp = FPackageName::GetLongPackagePath(*path);
|
||||
|
||||
if (assetDir.StartsWith(*selfDir))
|
||||
{
|
||||
// exclude self
|
||||
if (assetFName != "OpenPypePublishInstance")
|
||||
{
|
||||
// UE_LOG(LogTemp, Warning, TEXT("%s: asset removed"), *lpp);
|
||||
assets.Remove(assetPath);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
void UOpenPypePublishInstance::OnAssetRenamed(const FAssetData& AssetData, const FString& str)
|
||||
{
|
||||
TArray<FString> split;
|
||||
|
||||
// get directory of current container
|
||||
FString selfFullPath = UOpenPypePublishInstance::GetPathName();
|
||||
FString selfDir = FPackageName::GetLongPackagePath(*selfFullPath);
|
||||
|
||||
// get asset path and class
|
||||
FString assetPath = AssetData.GetFullName();
|
||||
FString assetFName = AssetData.AssetClass.ToString();
|
||||
|
||||
// split path
|
||||
assetPath.ParseIntoArray(split, TEXT(" "), true);
|
||||
|
||||
FString assetDir = FPackageName::GetLongPackagePath(*split[1]);
|
||||
if (assetDir.StartsWith(*selfDir))
|
||||
{
|
||||
// exclude self
|
||||
if (assetFName != "AssetContainer")
|
||||
{
|
||||
|
||||
assets.Remove(str);
|
||||
assets.Add(assetPath);
|
||||
// UE_LOG(LogTemp, Warning, TEXT("%s: asset renamed %s"), *lpp, *str);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
@ -0,0 +1,20 @@
|
|||
#include "OpenPypePublishInstanceFactory.h"
|
||||
#include "OpenPypePublishInstance.h"
|
||||
|
||||
UOpenPypePublishInstanceFactory::UOpenPypePublishInstanceFactory(const FObjectInitializer& ObjectInitializer)
|
||||
: UFactory(ObjectInitializer)
|
||||
{
|
||||
SupportedClass = UOpenPypePublishInstance::StaticClass();
|
||||
bCreateNew = false;
|
||||
bEditorImport = true;
|
||||
}
|
||||
|
||||
UObject* UOpenPypePublishInstanceFactory::FactoryCreateNew(UClass* Class, UObject* InParent, FName Name, EObjectFlags Flags, UObject* Context, FFeedbackContext* Warn)
|
||||
{
|
||||
UOpenPypePublishInstance* OpenPypePublishInstance = NewObject<UOpenPypePublishInstance>(InParent, Class, Name, Flags);
|
||||
return OpenPypePublishInstance;
|
||||
}
|
||||
|
||||
bool UOpenPypePublishInstanceFactory::ShouldShowInNewMenu() const {
|
||||
return false;
|
||||
}
|
||||
|
|
@ -0,0 +1,13 @@
|
|||
#include "OpenPypePythonBridge.h"
|
||||
|
||||
UOpenPypePythonBridge* UOpenPypePythonBridge::Get()
|
||||
{
|
||||
TArray<UClass*> OpenPypePythonBridgeClasses;
|
||||
GetDerivedClasses(UOpenPypePythonBridge::StaticClass(), OpenPypePythonBridgeClasses);
|
||||
int32 NumClasses = OpenPypePythonBridgeClasses.Num();
|
||||
if (NumClasses > 0)
|
||||
{
|
||||
return Cast<UOpenPypePythonBridge>(OpenPypePythonBridgeClasses[NumClasses - 1]->GetDefaultObject());
|
||||
}
|
||||
return nullptr;
|
||||
};
|
||||
|
|
@ -0,0 +1,70 @@
|
|||
#include "OpenPypeStyle.h"
|
||||
#include "Framework/Application/SlateApplication.h"
|
||||
#include "Styling/SlateStyle.h"
|
||||
#include "Styling/SlateStyleRegistry.h"
|
||||
|
||||
|
||||
TUniquePtr< FSlateStyleSet > FOpenPypeStyle::OpenPypeStyleInstance = nullptr;
|
||||
|
||||
void FOpenPypeStyle::Initialize()
|
||||
{
|
||||
if (!OpenPypeStyleInstance.IsValid())
|
||||
{
|
||||
OpenPypeStyleInstance = Create();
|
||||
FSlateStyleRegistry::RegisterSlateStyle(*OpenPypeStyleInstance);
|
||||
}
|
||||
}
|
||||
|
||||
void FOpenPypeStyle::Shutdown()
|
||||
{
|
||||
if (OpenPypeStyleInstance.IsValid())
|
||||
{
|
||||
FSlateStyleRegistry::UnRegisterSlateStyle(*OpenPypeStyleInstance);
|
||||
OpenPypeStyleInstance.Reset();
|
||||
}
|
||||
}
|
||||
|
||||
FName FOpenPypeStyle::GetStyleSetName()
|
||||
{
|
||||
static FName StyleSetName(TEXT("OpenPypeStyle"));
|
||||
return StyleSetName;
|
||||
}
|
||||
|
||||
FName FOpenPypeStyle::GetContextName()
|
||||
{
|
||||
static FName ContextName(TEXT("OpenPype"));
|
||||
return ContextName;
|
||||
}
|
||||
|
||||
#define IMAGE_BRUSH(RelativePath, ...) FSlateImageBrush( Style->RootToContentDir( RelativePath, TEXT(".png") ), __VA_ARGS__ )
|
||||
|
||||
const FVector2D Icon40x40(40.0f, 40.0f);
|
||||
|
||||
TUniquePtr< FSlateStyleSet > FOpenPypeStyle::Create()
|
||||
{
|
||||
TUniquePtr< FSlateStyleSet > Style = MakeUnique<FSlateStyleSet>(GetStyleSetName());
|
||||
Style->SetContentRoot(FPaths::ProjectPluginsDir() / TEXT("OpenPype/Resources"));
|
||||
|
||||
return Style;
|
||||
}
|
||||
|
||||
void FOpenPypeStyle::SetIcon(const FString& StyleName, const FString& ResourcePath)
|
||||
{
|
||||
FSlateStyleSet* Style = OpenPypeStyleInstance.Get();
|
||||
|
||||
FString Name(GetContextName().ToString());
|
||||
Name = Name + "." + StyleName;
|
||||
Style->Set(*Name, new FSlateImageBrush(Style->RootToContentDir(ResourcePath, TEXT(".png")), Icon40x40));
|
||||
|
||||
|
||||
FSlateApplication::Get().GetRenderer()->ReloadTextureResources();
|
||||
}
|
||||
|
||||
#undef IMAGE_BRUSH
|
||||
|
||||
const ISlateStyle& FOpenPypeStyle::Get()
|
||||
{
|
||||
check(OpenPypeStyleInstance);
|
||||
return *OpenPypeStyleInstance;
|
||||
return *OpenPypeStyleInstance;
|
||||
}
|
||||
|
|
@ -0,0 +1,39 @@
|
|||
// Fill out your copyright notice in the Description page of Project Settings.
|
||||
|
||||
#pragma once
|
||||
|
||||
#include "CoreMinimal.h"
|
||||
#include "UObject/NoExportTypes.h"
|
||||
#include "Engine/AssetUserData.h"
|
||||
#include "AssetData.h"
|
||||
#include "AssetContainer.generated.h"
|
||||
|
||||
/**
|
||||
*
|
||||
*/
|
||||
UCLASS(Blueprintable)
|
||||
class OPENPYPE_API UAssetContainer : public UAssetUserData
|
||||
{
|
||||
GENERATED_BODY()
|
||||
|
||||
public:
|
||||
|
||||
UAssetContainer(const FObjectInitializer& ObjectInitalizer);
|
||||
// ~UAssetContainer();
|
||||
|
||||
UPROPERTY(EditAnywhere, BlueprintReadOnly)
|
||||
TArray<FString> assets;
|
||||
|
||||
// There seems to be no reflection option to expose array of FAssetData
|
||||
/*
|
||||
UPROPERTY(Transient, BlueprintReadOnly, Category = "Python", meta=(DisplayName="Assets Data"))
|
||||
TArray<FAssetData> assetsData;
|
||||
*/
|
||||
private:
|
||||
TArray<FAssetData> assetsData;
|
||||
void OnAssetAdded(const FAssetData& AssetData);
|
||||
void OnAssetRemoved(const FAssetData& AssetData);
|
||||
void OnAssetRenamed(const FAssetData& AssetData, const FString& str);
|
||||
};
|
||||
|
||||
|
||||
|
|
@ -0,0 +1,21 @@
|
|||
// Fill out your copyright notice in the Description page of Project Settings.
|
||||
|
||||
#pragma once
|
||||
|
||||
#include "CoreMinimal.h"
|
||||
#include "Factories/Factory.h"
|
||||
#include "AssetContainerFactory.generated.h"
|
||||
|
||||
/**
|
||||
*
|
||||
*/
|
||||
UCLASS()
|
||||
class OPENPYPE_API UAssetContainerFactory : public UFactory
|
||||
{
|
||||
GENERATED_BODY()
|
||||
|
||||
public:
|
||||
UAssetContainerFactory(const FObjectInitializer& ObjectInitializer);
|
||||
virtual UObject* FactoryCreateNew(UClass* Class, UObject* InParent, FName Name, EObjectFlags Flags, UObject* Context, FFeedbackContext* Warn) override;
|
||||
virtual bool ShouldShowInNewMenu() const override;
|
||||
};
|
||||
|
|
@ -0,0 +1,21 @@
|
|||
// Copyright 1998-2019 Epic Games, Inc. All Rights Reserved.
|
||||
|
||||
#pragma once
|
||||
|
||||
#include "Engine.h"
|
||||
|
||||
|
||||
class FOpenPypeModule : public IModuleInterface
|
||||
{
|
||||
public:
|
||||
virtual void StartupModule() override;
|
||||
virtual void ShutdownModule() override;
|
||||
|
||||
private:
|
||||
|
||||
void AddMenuEntry(FMenuBuilder& MenuBuilder);
|
||||
void AddToobarEntry(FToolBarBuilder& ToolbarBuilder);
|
||||
void MenuPopup();
|
||||
void MenuDialog();
|
||||
|
||||
};
|
||||
|
|
@ -0,0 +1,19 @@
|
|||
#pragma once
|
||||
|
||||
#include "Engine.h"
|
||||
#include "OpenPypeLib.generated.h"
|
||||
|
||||
|
||||
UCLASS(Blueprintable)
|
||||
class OPENPYPE_API UOpenPypeLib : public UObject
|
||||
{
|
||||
|
||||
GENERATED_BODY()
|
||||
|
||||
public:
|
||||
UFUNCTION(BlueprintCallable, Category = Python)
|
||||
static void CSetFolderColor(FString FolderPath, FLinearColor FolderColor, bool bForceAdd);
|
||||
|
||||
UFUNCTION(BlueprintCallable, Category = Python)
|
||||
static TArray<FString> GetAllProperties(UClass* cls);
|
||||
};
|
||||
|
|
@ -0,0 +1,21 @@
|
|||
#pragma once
|
||||
|
||||
#include "Engine.h"
|
||||
#include "OpenPypePublishInstance.generated.h"
|
||||
|
||||
|
||||
UCLASS(Blueprintable)
|
||||
class OPENPYPE_API UOpenPypePublishInstance : public UObject
|
||||
{
|
||||
GENERATED_BODY()
|
||||
|
||||
public:
|
||||
UOpenPypePublishInstance(const FObjectInitializer& ObjectInitalizer);
|
||||
|
||||
UPROPERTY(EditAnywhere, BlueprintReadOnly)
|
||||
TArray<FString> assets;
|
||||
private:
|
||||
void OnAssetAdded(const FAssetData& AssetData);
|
||||
void OnAssetRemoved(const FAssetData& AssetData);
|
||||
void OnAssetRenamed(const FAssetData& AssetData, const FString& str);
|
||||
};
|
||||
|
|
@ -0,0 +1,19 @@
|
|||
#pragma once
|
||||
|
||||
#include "CoreMinimal.h"
|
||||
#include "Factories/Factory.h"
|
||||
#include "OpenPypePublishInstanceFactory.generated.h"
|
||||
|
||||
/**
|
||||
*
|
||||
*/
|
||||
UCLASS()
|
||||
class OPENPYPE_API UOpenPypePublishInstanceFactory : public UFactory
|
||||
{
|
||||
GENERATED_BODY()
|
||||
|
||||
public:
|
||||
UOpenPypePublishInstanceFactory(const FObjectInitializer& ObjectInitializer);
|
||||
virtual UObject* FactoryCreateNew(UClass* Class, UObject* InParent, FName Name, EObjectFlags Flags, UObject* Context, FFeedbackContext* Warn) override;
|
||||
virtual bool ShouldShowInNewMenu() const override;
|
||||
};
|
||||
|
|
@ -0,0 +1,20 @@
|
|||
#pragma once
|
||||
#include "Engine.h"
|
||||
#include "OpenPypePythonBridge.generated.h"
|
||||
|
||||
UCLASS(Blueprintable)
|
||||
class UOpenPypePythonBridge : public UObject
|
||||
{
|
||||
GENERATED_BODY()
|
||||
|
||||
public:
|
||||
UFUNCTION(BlueprintCallable, Category = Python)
|
||||
static UOpenPypePythonBridge* Get();
|
||||
|
||||
UFUNCTION(BlueprintImplementableEvent, Category = Python)
|
||||
void RunInPython_Popup() const;
|
||||
|
||||
UFUNCTION(BlueprintImplementableEvent, Category = Python)
|
||||
void RunInPython_Dialog() const;
|
||||
|
||||
};
|
||||
|
|
@ -0,0 +1,22 @@
|
|||
#pragma once
|
||||
#include "CoreMinimal.h"
|
||||
|
||||
class FSlateStyleSet;
|
||||
class ISlateStyle;
|
||||
|
||||
|
||||
class FOpenPypeStyle
|
||||
{
|
||||
public:
|
||||
static void Initialize();
|
||||
static void Shutdown();
|
||||
static const ISlateStyle& Get();
|
||||
static FName GetStyleSetName();
|
||||
static FName GetContextName();
|
||||
|
||||
static void SetIcon(const FString& StyleName, const FString& ResourcePath);
|
||||
|
||||
private:
|
||||
static TUniquePtr< FSlateStyleSet > Create();
|
||||
static TUniquePtr< FSlateStyleSet > OpenPypeStyleInstance;
|
||||
};
|
||||
|
|
@ -169,11 +169,11 @@ def create_unreal_project(project_name: str,
|
|||
env: dict = None) -> None:
|
||||
"""This will create `.uproject` file at specified location.
|
||||
|
||||
As there is no way I know to create project via command line, this is
|
||||
easiest option. Unreal project file is basically JSON file. If we find
|
||||
`AVALON_UNREAL_PLUGIN` environment variable we assume this is location
|
||||
of Avalon Integration Plugin and we copy its content to project folder
|
||||
and enable this plugin.
|
||||
As there is no way I know to create a project via command line, this is
|
||||
the easiest option. An Unreal project file is basically a JSON file. If we find
|
||||
the `OPENPYPE_UNREAL_PLUGIN` environment variable we assume this is the
|
||||
location of the Integration Plugin and we copy its content to the project
|
||||
folder and enable this plugin.
|
||||
|
||||
Args:
|
||||
project_name (str): Name of the project.
|
||||
|
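Since the docstring above notes that an Unreal `.uproject` file is plain JSON in which the integration plugin just needs to be enabled, a minimal sketch of the payload this function could write follows. The `write_minimal_uproject` helper and the exact key set are illustrative assumptions; the plugin list mirrors the one shown further down in this diff.

import json
from pathlib import Path

def write_minimal_uproject(project_file: Path) -> None:
    # Illustrative only: enable the scripting plugins plus the OpenPype plugin.
    data = {
        "Plugins": [
            {"Name": "PythonScriptPlugin", "Enabled": True},
            {"Name": "EditorScriptingUtilities", "Enabled": True},
            {"Name": "SequencerScripting", "Enabled": True},
            {"Name": "OpenPype", "Enabled": True},
        ]
    }
    with open(project_file, "w") as fp:
        json.dump(data, fp, indent=4)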
|
@ -230,18 +230,18 @@ def create_unreal_project(project_name: str,
|
|||
ue_id = "{" + loaded_modules.get("BuildId") + "}"
|
||||
|
||||
plugins_path = None
|
||||
if os.path.isdir(env.get("AVALON_UNREAL_PLUGIN", "")):
|
||||
if os.path.isdir(env.get("OPENPYPE_UNREAL_PLUGIN", "")):
|
||||
# copy plugin to correct path under project
|
||||
plugins_path = pr_dir / "Plugins"
|
||||
avalon_plugin_path = plugins_path / "Avalon"
|
||||
if not avalon_plugin_path.is_dir():
|
||||
avalon_plugin_path.mkdir(parents=True, exist_ok=True)
|
||||
openpype_plugin_path = plugins_path / "OpenPype"
|
||||
if not openpype_plugin_path.is_dir():
|
||||
openpype_plugin_path.mkdir(parents=True, exist_ok=True)
|
||||
dir_util._path_created = {}
|
||||
dir_util.copy_tree(os.environ.get("AVALON_UNREAL_PLUGIN"),
|
||||
avalon_plugin_path.as_posix())
|
||||
dir_util.copy_tree(os.environ.get("OPENPYPE_UNREAL_PLUGIN"),
|
||||
openpype_plugin_path.as_posix())
|
||||
|
||||
if not (avalon_plugin_path / "Binaries").is_dir() \
|
||||
or not (avalon_plugin_path / "Intermediate").is_dir():
|
||||
if not (openpype_plugin_path / "Binaries").is_dir() \
|
||||
or not (openpype_plugin_path / "Intermediate").is_dir():
|
||||
dev_mode = True
|
||||
|
||||
# data for project file
|
||||
|
|
@ -254,14 +254,14 @@ def create_unreal_project(project_name: str,
|
|||
{"Name": "PythonScriptPlugin", "Enabled": True},
|
||||
{"Name": "EditorScriptingUtilities", "Enabled": True},
|
||||
{"Name": "SequencerScripting", "Enabled": True},
|
||||
{"Name": "Avalon", "Enabled": True}
|
||||
{"Name": "OpenPype", "Enabled": True}
|
||||
]
|
||||
}
|
||||
|
||||
if dev_mode or preset["dev_mode"]:
|
||||
# this will add project module and necessary source file to make it
|
||||
# C++ project and to (hopefully) make Unreal Editor to compile all
|
||||
# sources at start
|
||||
# this will add the project module and necessary source file to
|
||||
# make it a C++ project and to (hopefully) make Unreal Editor
|
||||
# compile all sources at start
|
||||
|
||||
data["Modules"] = [{
|
||||
"Name": project_name,
|
||||
|
|
@ -304,7 +304,7 @@ def _prepare_cpp_project(project_file: Path, engine_path: Path) -> None:
|
|||
"""Prepare CPP Unreal Project.
|
||||
|
||||
This function will add source files needed for project to be
|
||||
rebuild along with the avalon integration plugin.
|
||||
rebuild along with the OpenPype integration plugin.
|
||||
|
||||
There seems not to be automated way to do it from command line.
|
||||
But there might be way to create at least those target and build files
|
||||
|
|
@ -16,7 +16,7 @@ class CreateCamera(Creator):
|
|||
family = "camera"
|
||||
icon = "cubes"
|
||||
|
||||
root = "/Game/Avalon/Instances"
|
||||
root = "/Game/OpenPype/Instances"
|
||||
suffix = "_INS"
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
|
|
|
|||
|
|
@ -1,3 +1,4 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
from unreal import EditorLevelLibrary as ell
|
||||
from openpype.hosts.unreal.api.plugin import Creator
|
||||
from avalon.unreal import (
|
||||
|
|
@ -6,7 +7,7 @@ from avalon.unreal import (
|
|||
|
||||
|
||||
class CreateLayout(Creator):
|
||||
"""Layout output for character rigs"""
|
||||
"""Layout output for character rigs."""
|
||||
|
||||
name = "layoutMain"
|
||||
label = "Layout"
|
||||
|
|
|
|||
|
|
@ -1,10 +1,12 @@
|
|||
import unreal
|
||||
# -*- coding: utf-8 -*-
|
||||
"""Create look in Unreal."""
|
||||
import unreal # noqa
|
||||
from openpype.hosts.unreal.api.plugin import Creator
|
||||
from avalon.unreal import pipeline
|
||||
from openpype.hosts.unreal.api import pipeline
|
||||
|
||||
|
||||
class CreateLook(Creator):
|
||||
"""Shader connections defining shape look"""
|
||||
"""Shader connections defining shape look."""
|
||||
|
||||
name = "unrealLook"
|
||||
label = "Unreal - Look"
|
||||
|
|
@ -49,14 +51,14 @@ class CreateLook(Creator):
|
|||
for material in materials:
|
||||
name = material.get_editor_property('material_slot_name')
|
||||
object_path = f"{full_path}/{name}.{name}"
|
||||
object = unreal.EditorAssetLibrary.duplicate_loaded_asset(
|
||||
unreal_object = unreal.EditorAssetLibrary.duplicate_loaded_asset(
|
||||
cube.get_asset(), object_path
|
||||
)
|
||||
|
||||
# Remove the default material of the cube object
|
||||
object.get_editor_property('static_materials').pop()
|
||||
unreal_object.get_editor_property('static_materials').pop()
|
||||
|
||||
object.add_material(
|
||||
unreal_object.add_material(
|
||||
material.get_editor_property('material_interface'))
|
||||
|
||||
self.data["members"].append(object_path)
|
||||
|
|
|
|||
|
|
@ -1,12 +1,14 @@
|
|||
import unreal
|
||||
# -*- coding: utf-8 -*-
|
||||
"""Create Static Meshes as FBX geometry."""
|
||||
import unreal # noqa
|
||||
from openpype.hosts.unreal.api.plugin import Creator
|
||||
from avalon.unreal import (
|
||||
from openpype.hosts.unreal.api.pipeline import (
|
||||
instantiate,
|
||||
)
|
||||
|
||||
|
||||
class CreateStaticMeshFBX(Creator):
|
||||
"""Static FBX geometry"""
|
||||
"""Static FBX geometry."""
|
||||
|
||||
name = "unrealStaticMeshMain"
|
||||
label = "Unreal - Static Mesh"
|
||||
|
|
|
|||
|
|
@ -1,12 +1,15 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Loader for published alembics."""
|
||||
import os
|
||||
|
||||
from avalon import api, pipeline
|
||||
from avalon.unreal import lib
|
||||
from avalon.unreal import pipeline as unreal_pipeline
|
||||
import unreal
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
|
||||
|
||||
import unreal # noqa
|
||||
|
||||
|
||||
class PointCacheAlembicLoader(api.Loader):
|
||||
class PointCacheAlembicLoader(plugin.Loader):
|
||||
"""Load Point Cache from Alembic"""
|
||||
|
||||
families = ["model", "pointcache"]
|
||||
|
|
@ -56,8 +59,7 @@ class PointCacheAlembicLoader(api.Loader):
|
|||
return task
|
||||
|
||||
def load(self, context, name, namespace, data):
|
||||
"""
|
||||
Load and containerise representation into Content Browser.
|
||||
"""Load and containerise representation into Content Browser.
|
||||
|
||||
This is a two-step process. First, import FBX to a temporary path and
|
||||
then call `containerise()` on it - this moves all content to new
|
||||
|
|
@ -76,10 +78,10 @@ class PointCacheAlembicLoader(api.Loader):
|
|||
|
||||
Returns:
|
||||
list(str): list of container content
|
||||
"""
|
||||
|
||||
# Create directory for asset and avalon container
|
||||
root = "/Game/Avalon/Assets"
|
||||
"""
|
||||
# Create directory for asset and OpenPype container
|
||||
root = "/Game/OpenPype/Assets"
|
||||
asset = context.get('asset').get('name')
|
||||
suffix = "_CON"
|
||||
if asset:
|
||||
|
|
@ -109,7 +111,7 @@ class PointCacheAlembicLoader(api.Loader):
|
|||
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501
|
||||
|
||||
# Create Asset Container
|
||||
lib.create_avalon_container(
|
||||
unreal_pipeline.create_container(
|
||||
container=container_name, path=asset_dir)
|
||||
|
||||
data = {
|
||||
|
|
|
|||
|
|
@ -1,12 +1,14 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Load Skeletal Mesh alembics."""
|
||||
import os
|
||||
|
||||
from avalon import api, pipeline
|
||||
from avalon.unreal import lib
|
||||
from avalon.unreal import pipeline as unreal_pipeline
|
||||
import unreal
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
|
||||
import unreal # noqa
|
||||
|
||||
|
||||
class SkeletalMeshAlembicLoader(api.Loader):
|
||||
class SkeletalMeshAlembicLoader(plugin.Loader):
|
||||
"""Load Unreal SkeletalMesh from Alembic"""
|
||||
|
||||
families = ["pointcache"]
|
||||
|
|
@ -16,8 +18,7 @@ class SkeletalMeshAlembicLoader(api.Loader):
|
|||
color = "orange"
|
||||
|
||||
def load(self, context, name, namespace, data):
|
||||
"""
|
||||
Load and containerise representation into Content Browser.
|
||||
"""Load and containerise representation into Content Browser.
|
||||
|
||||
This is a two-step process. First, import FBX to a temporary path and
|
||||
then call `containerise()` on it - this moves all content to new
|
||||
|
|
@ -38,8 +39,8 @@ class SkeletalMeshAlembicLoader(api.Loader):
|
|||
list(str): list of container content
|
||||
"""
|
||||
|
||||
# Create directory for asset and avalon container
|
||||
root = "/Game/Avalon/Assets"
|
||||
# Create directory for asset and openpype container
|
||||
root = "/Game/OpenPype/Assets"
|
||||
asset = context.get('asset').get('name')
|
||||
suffix = "_CON"
|
||||
if asset:
|
||||
|
|
@ -74,7 +75,7 @@ class SkeletalMeshAlembicLoader(api.Loader):
|
|||
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501
|
||||
|
||||
# Create Asset Container
|
||||
lib.create_avalon_container(
|
||||
unreal_pipeline.create_container(
|
||||
container=container_name, path=asset_dir)
|
||||
|
||||
data = {
|
||||
|
|
|
|||
|
|
@ -1,12 +1,14 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Loader for Static Mesh alembics."""
|
||||
import os
|
||||
|
||||
from avalon import api, pipeline
|
||||
from avalon.unreal import lib
|
||||
from avalon.unreal import pipeline as unreal_pipeline
|
||||
import unreal
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
|
||||
import unreal # noqa
|
||||
|
||||
|
||||
class StaticMeshAlembicLoader(api.Loader):
|
||||
class StaticMeshAlembicLoader(plugin.Loader):
|
||||
"""Load Unreal StaticMesh from Alembic"""
|
||||
|
||||
families = ["model"]
|
||||
|
|
@ -49,8 +51,7 @@ class StaticMeshAlembicLoader(api.Loader):
|
|||
return task
|
||||
|
||||
def load(self, context, name, namespace, data):
|
||||
"""
|
||||
Load and containerise representation into Content Browser.
|
||||
"""Load and containerise representation into Content Browser.
|
||||
|
||||
This is a two-step process. First, import FBX to a temporary path and
|
||||
then call `containerise()` on it - this moves all content to new
|
||||
|
|
@ -69,10 +70,10 @@ class StaticMeshAlembicLoader(api.Loader):
|
|||
|
||||
Returns:
|
||||
list(str): list of container content
|
||||
"""
|
||||
|
||||
# Create directory for asset and avalon container
|
||||
root = "/Game/Avalon/Assets"
|
||||
"""
|
||||
# Create directory for asset and OpenPype container
|
||||
root = "/Game/OpenPype/Assets"
|
||||
asset = context.get('asset').get('name')
|
||||
suffix = "_CON"
|
||||
if asset:
|
||||
|
|
@ -93,7 +94,7 @@ class StaticMeshAlembicLoader(api.Loader):
|
|||
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501
|
||||
|
||||
# Create Asset Container
|
||||
lib.create_avalon_container(
|
||||
unreal_pipeline.create_container(
|
||||
container=container_name, path=asset_dir)
|
||||
|
||||
data = {
|
||||
|
|
|
|||
|
|
@ -1,14 +1,16 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Load FBX with animations."""
|
||||
import os
|
||||
import json
|
||||
|
||||
from avalon import api, pipeline
|
||||
from avalon.unreal import lib
|
||||
from avalon.unreal import pipeline as unreal_pipeline
|
||||
import unreal
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
|
||||
import unreal # noqa
|
||||
|
||||
|
||||
class AnimationFBXLoader(api.Loader):
|
||||
"""Load Unreal SkeletalMesh from FBX"""
|
||||
class AnimationFBXLoader(plugin.Loader):
|
||||
"""Load Unreal SkeletalMesh from FBX."""
|
||||
|
||||
families = ["animation"]
|
||||
label = "Import FBX Animation"
|
||||
|
|
@ -37,10 +39,10 @@ class AnimationFBXLoader(api.Loader):
|
|||
|
||||
Returns:
|
||||
list(str): list of container content
|
||||
"""
|
||||
|
||||
# Create directory for asset and avalon container
|
||||
root = "/Game/Avalon/Assets"
|
||||
"""
|
||||
# Create directory for asset and OpenPype container
|
||||
root = "/Game/OpenPype/Assets"
|
||||
asset = context.get('asset').get('name')
|
||||
suffix = "_CON"
|
||||
if asset:
|
||||
|
|
@ -62,9 +64,9 @@ class AnimationFBXLoader(api.Loader):
|
|||
task = unreal.AssetImportTask()
|
||||
task.options = unreal.FbxImportUI()
|
||||
|
||||
libpath = self.fname.replace("fbx", "json")
|
||||
lib_path = self.fname.replace("fbx", "json")
|
||||
|
||||
with open(libpath, "r") as fp:
|
||||
with open(lib_path, "r") as fp:
|
||||
data = json.load(fp)
|
||||
|
||||
instance_name = data.get("instance_name")
|
||||
|
|
@ -127,7 +129,7 @@ class AnimationFBXLoader(api.Loader):
|
|||
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
|
||||
|
||||
# Create Asset Container
|
||||
lib.create_avalon_container(
|
||||
unreal_pipeline.create_container(
|
||||
container=container_name, path=asset_dir)
|
||||
|
||||
data = {
|
||||
|
|
|
|||
|
|
@ -1,12 +1,14 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Load camera from FBX."""
|
||||
import os
|
||||
|
||||
from avalon import api, io, pipeline
|
||||
from avalon.unreal import lib
|
||||
from avalon.unreal import pipeline as unreal_pipeline
|
||||
import unreal
|
||||
from avalon import io, pipeline
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
|
||||
import unreal # noqa
|
||||
|
||||
|
||||
class CameraLoader(api.Loader):
|
||||
class CameraLoader(plugin.Loader):
|
||||
"""Load Unreal StaticMesh from FBX"""
|
||||
|
||||
families = ["camera"]
|
||||
|
|
@ -38,8 +40,8 @@ class CameraLoader(api.Loader):
|
|||
list(str): list of container content
|
||||
"""
|
||||
|
||||
# Create directory for asset and avalon container
|
||||
root = "/Game/Avalon/Assets"
|
||||
# Create directory for asset and OpenPype container
|
||||
root = "/Game/OpenPype/Assets"
|
||||
asset = context.get('asset').get('name')
|
||||
suffix = "_CON"
|
||||
if asset:
|
||||
|
|
@ -109,7 +111,8 @@ class CameraLoader(api.Loader):
|
|||
)
|
||||
|
||||
# Create Asset Container
|
||||
lib.create_avalon_container(container=container_name, path=asset_dir)
|
||||
unreal_pipeline.create_container(
|
||||
container=container_name, path=asset_dir)
|
||||
|
||||
data = {
|
||||
"schema": "openpype:container-2.0",
|
||||
|
|
|
|||
|
|
@ -1,3 +1,5 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Loader for layouts."""
|
||||
import os
|
||||
import json
|
||||
from pathlib import Path
|
||||
|
|
@ -10,11 +12,11 @@ from unreal import FBXImportType
|
|||
from unreal import MathLibrary as umath
|
||||
|
||||
from avalon import api, pipeline
|
||||
from avalon.unreal import lib
|
||||
from avalon.unreal import pipeline as unreal_pipeline
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
|
||||
|
||||
|
||||
class LayoutLoader(api.Loader):
|
||||
class LayoutLoader(plugin.Loader):
|
||||
"""Load Layout from a JSON file"""
|
||||
|
||||
families = ["layout"]
|
||||
|
|
@ -23,6 +25,7 @@ class LayoutLoader(api.Loader):
|
|||
label = "Load Layout"
|
||||
icon = "code-fork"
|
||||
color = "orange"
|
||||
ASSET_ROOT = "/Game/OpenPype/Assets"
|
||||
|
||||
def _get_asset_containers(self, path):
|
||||
ar = unreal.AssetRegistryHelpers.get_asset_registry()
|
||||
|
|
@ -40,7 +43,8 @@ class LayoutLoader(api.Loader):
|
|||
|
||||
return asset_containers
|
||||
|
||||
def _get_fbx_loader(self, loaders, family):
|
||||
@staticmethod
|
||||
def _get_fbx_loader(loaders, family):
|
||||
name = ""
|
||||
if family == 'rig':
|
||||
name = "SkeletalMeshFBXLoader"
|
||||
|
|
@ -58,7 +62,8 @@ class LayoutLoader(api.Loader):
|
|||
|
||||
return None
|
||||
|
||||
def _get_abc_loader(self, loaders, family):
|
||||
@staticmethod
|
||||
def _get_abc_loader(loaders, family):
|
||||
name = ""
|
||||
if family == 'rig':
|
||||
name = "SkeletalMeshAlembicLoader"
|
||||
|
|
@ -74,14 +79,15 @@ class LayoutLoader(api.Loader):
|
|||
|
||||
return None
|
||||
|
||||
def _process_family(self, assets, classname, transform, inst_name=None):
|
||||
@staticmethod
|
||||
def _process_family(assets, class_name, transform, inst_name=None):
|
||||
ar = unreal.AssetRegistryHelpers.get_asset_registry()
|
||||
|
||||
actors = []
|
||||
|
||||
for asset in assets:
|
||||
obj = ar.get_asset_by_object_path(asset).get_asset()
|
||||
if obj.get_class().get_name() == classname:
|
||||
if obj.get_class().get_name() == class_name:
|
||||
actor = EditorLevelLibrary.spawn_actor_from_object(
|
||||
obj,
|
||||
transform.get('translation')
|
||||
|
|
@ -111,8 +117,9 @@ class LayoutLoader(api.Loader):
|
|||
|
||||
return actors
|
||||
|
||||
@staticmethod
|
||||
def _import_animation(
|
||||
self, asset_dir, path, instance_name, skeleton, actors_dict,
|
||||
asset_dir, path, instance_name, skeleton, actors_dict,
|
||||
animation_file):
|
||||
anim_file = Path(animation_file)
|
||||
anim_file_name = anim_file.with_suffix('')
|
||||
|
|
@ -192,10 +199,10 @@ class LayoutLoader(api.Loader):
|
|||
actor.skeletal_mesh_component.animation_data.set_editor_property(
|
||||
'anim_to_play', animation)
|
||||
|
||||
def _process(self, libpath, asset_dir, loaded=None):
|
||||
def _process(self, lib_path, asset_dir, loaded=None):
|
||||
ar = unreal.AssetRegistryHelpers.get_asset_registry()
|
||||
|
||||
with open(libpath, "r") as fp:
|
||||
with open(lib_path, "r") as fp:
|
||||
data = json.load(fp)
|
||||
|
||||
all_loaders = api.discover(api.Loader)
|
||||
|
|
@ -203,7 +210,7 @@ class LayoutLoader(api.Loader):
|
|||
if not loaded:
|
||||
loaded = []
|
||||
|
||||
path = Path(libpath)
|
||||
path = Path(lib_path)
|
||||
|
||||
skeleton_dict = {}
|
||||
actors_dict = {}
|
||||
|
|
@ -292,17 +299,18 @@ class LayoutLoader(api.Loader):
|
|||
asset_dir, path, instance_name, skeleton,
|
||||
actors_dict, animation_file)
|
||||
|
||||
def _remove_family(self, assets, components, classname, propname):
|
||||
@staticmethod
|
||||
def _remove_family(assets, components, class_name, prop_name):
|
||||
ar = unreal.AssetRegistryHelpers.get_asset_registry()
|
||||
|
||||
objects = []
|
||||
for a in assets:
|
||||
obj = ar.get_asset_by_object_path(a)
|
||||
if obj.get_asset().get_class().get_name() == classname:
|
||||
if obj.get_asset().get_class().get_name() == class_name:
|
||||
objects.append(obj)
|
||||
for obj in objects:
|
||||
for comp in components:
|
||||
if comp.get_editor_property(propname) == obj.get_asset():
|
||||
if comp.get_editor_property(prop_name) == obj.get_asset():
|
||||
comp.get_owner().destroy_actor()
|
||||
|
||||
def _remove_actors(self, path):
|
||||
|
|
@ -334,8 +342,7 @@ class LayoutLoader(api.Loader):
|
|||
assets, skel_meshes_comp, 'SkeletalMesh', 'skeletal_mesh')
|
||||
|
||||
def load(self, context, name, namespace, options):
|
||||
"""
|
||||
Load and containerise representation into Content Browser.
|
||||
"""Load and containerise representation into Content Browser.
|
||||
|
||||
This is a two-step process. First, import FBX to a temporary path and
|
||||
then call `containerise()` on it - this moves all content to new
|
||||
|
|
@ -349,14 +356,14 @@ class LayoutLoader(api.Loader):
|
|||
This is not passed here, so namespace is set
|
||||
by `containerise()` because only then we know
|
||||
real path.
|
||||
data (dict): Those would be data to be imprinted. This is not used
|
||||
now, data are imprinted by `containerise()`.
|
||||
options (dict): Those would be data to be imprinted. This is not
|
||||
used now, data are imprinted by `containerise()`.
|
||||
|
||||
Returns:
|
||||
list(str): list of container content
|
||||
"""
|
||||
# Create directory for asset and avalon container
|
||||
root = "/Game/Avalon/Assets"
|
||||
root = self.ASSET_ROOT
|
||||
asset = context.get('asset').get('name')
|
||||
suffix = "_CON"
|
||||
if asset:
|
||||
|
|
@ -375,7 +382,7 @@ class LayoutLoader(api.Loader):
|
|||
self._process(self.fname, asset_dir)
|
||||
|
||||
# Create Asset Container
|
||||
lib.create_avalon_container(
|
||||
unreal_pipeline.create_container(
|
||||
container=container_name, path=asset_dir)
|
||||
|
||||
data = {
|
||||
|
|
@ -406,7 +413,7 @@ class LayoutLoader(api.Loader):
|
|||
|
||||
source_path = api.get_representation_path(representation)
|
||||
destination_path = container["namespace"]
|
||||
libpath = Path(api.get_representation_path(representation))
|
||||
lib_path = Path(api.get_representation_path(representation))
|
||||
|
||||
self._remove_actors(destination_path)
|
||||
|
||||
|
|
@ -502,7 +509,7 @@ class LayoutLoader(api.Loader):
|
|||
|
||||
if animation_file and skeleton:
|
||||
self._import_animation(
|
||||
destination_path, libpath,
|
||||
destination_path, lib_path,
|
||||
instance_name, skeleton,
|
||||
actors_dict, animation_file)
|
||||
|
||||
|
|
|
|||
|
|
@ -1,13 +1,15 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Load Skeletal Meshes form FBX."""
|
||||
import os
|
||||
|
||||
from avalon import api, pipeline
|
||||
from avalon.unreal import lib
|
||||
from avalon.unreal import pipeline as unreal_pipeline
|
||||
import unreal
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
|
||||
import unreal # noqa
|
||||
|
||||
|
||||
class SkeletalMeshFBXLoader(api.Loader):
|
||||
"""Load Unreal SkeletalMesh from FBX"""
|
||||
class SkeletalMeshFBXLoader(plugin.Loader):
|
||||
"""Load Unreal SkeletalMesh from FBX."""
|
||||
|
||||
families = ["rig"]
|
||||
label = "Import FBX Skeletal Mesh"
|
||||
|
|
@ -16,8 +18,7 @@ class SkeletalMeshFBXLoader(api.Loader):
|
|||
color = "orange"
|
||||
|
||||
def load(self, context, name, namespace, options):
|
||||
"""
|
||||
Load and containerise representation into Content Browser.
|
||||
"""Load and containerise representation into Content Browser.
|
||||
|
||||
This is a two-step process. First, import FBX to a temporary path and
|
||||
then call `containerise()` on it - this moves all content to new
|
||||
|
|
@ -31,15 +32,15 @@ class SkeletalMeshFBXLoader(api.Loader):
|
|||
This is not passed here, so namespace is set
|
||||
by `containerise()` because only then we know
|
||||
real path.
|
||||
data (dict): Those would be data to be imprinted. This is not used
|
||||
now, data are imprinted by `containerise()`.
|
||||
options (dict): Those would be data to be imprinted. This is not
|
||||
used now, data are imprinted by `containerise()`.
|
||||
|
||||
Returns:
|
||||
list(str): list of container content
|
||||
"""
|
||||
|
||||
# Create directory for asset and avalon container
|
||||
root = "/Game/Avalon/Assets"
|
||||
"""
|
||||
# Create directory for asset and OpenPype container
|
||||
root = "/Game/OpenPype/Assets"
|
||||
if options and options.get("asset_dir"):
|
||||
root = options["asset_dir"]
|
||||
asset = context.get('asset').get('name')
|
||||
|
|
@ -94,7 +95,7 @@ class SkeletalMeshFBXLoader(api.Loader):
|
|||
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501
|
||||
|
||||
# Create Asset Container
|
||||
lib.create_avalon_container(
|
||||
unreal_pipeline.create_container(
|
||||
container=container_name, path=asset_dir)
|
||||
|
||||
data = {
|
||||
|
|
|
|||
|
|
@ -1,13 +1,15 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Load Static meshes form FBX."""
|
||||
import os
|
||||
|
||||
from avalon import api, pipeline
|
||||
from avalon.unreal import lib
|
||||
from avalon.unreal import pipeline as unreal_pipeline
|
||||
import unreal
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
|
||||
import unreal # noqa
|
||||
|
||||
|
||||
class StaticMeshFBXLoader(api.Loader):
|
||||
"""Load Unreal StaticMesh from FBX"""
|
||||
class StaticMeshFBXLoader(plugin.Loader):
|
||||
"""Load Unreal StaticMesh from FBX."""
|
||||
|
||||
families = ["model", "unrealStaticMesh"]
|
||||
label = "Import FBX Static Mesh"
|
||||
|
|
@ -15,7 +17,8 @@ class StaticMeshFBXLoader(api.Loader):
|
|||
icon = "cube"
|
||||
color = "orange"
|
||||
|
||||
def get_task(self, filename, asset_dir, asset_name, replace):
|
||||
@staticmethod
|
||||
def get_task(filename, asset_dir, asset_name, replace):
|
||||
task = unreal.AssetImportTask()
|
||||
options = unreal.FbxImportUI()
|
||||
import_data = unreal.FbxStaticMeshImportData()
|
||||
|
|
@ -41,8 +44,7 @@ class StaticMeshFBXLoader(api.Loader):
|
|||
return task
|
||||
|
||||
def load(self, context, name, namespace, options):
|
||||
"""
|
||||
Load and containerise representation into Content Browser.
|
||||
"""Load and containerise representation into Content Browser.
|
||||
|
||||
This is a two-step process. First, import FBX to a temporary path and
|
||||
then call `containerise()` on it - this moves all content to new
|
||||
|
|
@ -56,15 +58,15 @@ class StaticMeshFBXLoader(api.Loader):
|
|||
This is not passed here, so namespace is set
|
||||
by `containerise()` because only then we know
|
||||
real path.
|
||||
data (dict): Those would be data to be imprinted. This is not used
|
||||
now, data are imprinted by `containerise()`.
|
||||
options (dict): Those would be data to be imprinted. This is not
|
||||
used now, data are imprinted by `containerise()`.
|
||||
|
||||
Returns:
|
||||
list(str): list of container content
|
||||
"""
|
||||
|
||||
# Create directory for asset and avalon container
|
||||
root = "/Game/Avalon/Assets"
|
||||
# Create directory for asset and OpenPype container
|
||||
root = "/Game/OpenPype/Assets"
|
||||
if options and options.get("asset_dir"):
|
||||
root = options["asset_dir"]
|
||||
asset = context.get('asset').get('name')
|
||||
|
|
@ -87,7 +89,7 @@ class StaticMeshFBXLoader(api.Loader):
|
|||
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501
|
||||
|
||||
# Create Asset Container
|
||||
lib.create_avalon_container(
|
||||
unreal_pipeline.create_container(
|
||||
container=container_name, path=asset_dir)
|
||||
|
||||
data = {
|
||||
|
|
|
|||
|
|
@ -1,17 +1,18 @@
|
|||
import unreal
|
||||
|
||||
# -*- coding: utf-8 -*-
|
||||
"""Collect current project path."""
|
||||
import unreal # noqa
|
||||
import pyblish.api
|
||||
|
||||
|
||||
class CollectUnrealCurrentFile(pyblish.api.ContextPlugin):
|
||||
"""Inject the current working file into context"""
|
||||
"""Inject the current working file into context."""
|
||||
|
||||
order = pyblish.api.CollectorOrder - 0.5
|
||||
label = "Unreal Current File"
|
||||
hosts = ['unreal']
|
||||
|
||||
def process(self, context):
|
||||
"""Inject the current working file"""
|
||||
"""Inject the current working file."""
|
||||
current_file = unreal.Paths.get_project_file_path()
|
||||
context.data['currentFile'] = current_file
|
||||
|
||||
|
|
|
|||
|
|
@ -1,12 +1,14 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Collect publishable instances in Unreal."""
|
||||
import ast
|
||||
import unreal
|
||||
import unreal # noqa
|
||||
import pyblish.api
|
||||
|
||||
|
||||
class CollectInstances(pyblish.api.ContextPlugin):
|
||||
"""Gather instances by AvalonPublishInstance class
|
||||
"""Gather instances by OpenPypePublishInstance class
|
||||
|
||||
This collector finds all paths containing `AvalonPublishInstance` class
|
||||
This collector finds all paths containing `OpenPypePublishInstance` class
|
||||
asset
|
||||
|
||||
Identifier:
|
||||
|
|
@ -22,7 +24,7 @@ class CollectInstances(pyblish.api.ContextPlugin):
|
|||
|
||||
ar = unreal.AssetRegistryHelpers.get_asset_registry()
|
||||
instance_containers = ar.get_assets_by_class(
|
||||
"AvalonPublishInstance", True)
|
||||
"OpenPypePublishInstance", True)
|
||||
|
||||
for container_data in instance_containers:
|
||||
asset = container_data.get_asset()
|
||||
|
|
|
|||
|
|
@ -1,3 +1,5 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Extract camera from Unreal."""
|
||||
import os
|
||||
|
||||
import unreal
|
||||
|
|
@ -17,7 +19,7 @@ class ExtractCamera(openpype.api.Extractor):
|
|||
|
||||
def process(self, instance):
|
||||
# Define extract output file path
|
||||
stagingdir = self.staging_dir(instance)
|
||||
staging_dir = self.staging_dir(instance)
|
||||
fbx_filename = "{}.fbx".format(instance.name)
|
||||
|
||||
# Perform extraction
|
||||
|
|
@ -38,7 +40,7 @@ class ExtractCamera(openpype.api.Extractor):
|
|||
sequence,
|
||||
sequence.get_bindings(),
|
||||
unreal.FbxExportOption(),
|
||||
os.path.join(stagingdir, fbx_filename)
|
||||
os.path.join(staging_dir, fbx_filename)
|
||||
)
|
||||
break
|
||||
|
||||
|
|
@ -49,6 +51,6 @@ class ExtractCamera(openpype.api.Extractor):
|
|||
'name': 'fbx',
|
||||
'ext': 'fbx',
|
||||
'files': fbx_filename,
|
||||
"stagingDir": stagingdir,
|
||||
"stagingDir": staging_dir,
|
||||
}
|
||||
instance.data["representations"].append(fbx_representation)
|
||||
|
|
|
|||
|
|
@ -1,3 +1,4 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
import os
|
||||
import json
|
||||
import math
|
||||
|
|
@ -20,7 +21,7 @@ class ExtractLayout(openpype.api.Extractor):
|
|||
|
||||
def process(self, instance):
|
||||
# Define extract output file path
|
||||
stagingdir = self.staging_dir(instance)
|
||||
staging_dir = self.staging_dir(instance)
|
||||
|
||||
# Perform extraction
|
||||
self.log.info("Performing extraction..")
|
||||
|
|
@ -96,7 +97,7 @@ class ExtractLayout(openpype.api.Extractor):
|
|||
json_data.append(json_element)
|
||||
|
||||
json_filename = "{}.json".format(instance.name)
|
||||
json_path = os.path.join(stagingdir, json_filename)
|
||||
json_path = os.path.join(staging_dir, json_filename)
|
||||
|
||||
with open(json_path, "w+") as file:
|
||||
json.dump(json_data, fp=file, indent=2)
|
||||
|
|
@ -108,6 +109,6 @@ class ExtractLayout(openpype.api.Extractor):
|
|||
'name': 'json',
|
||||
'ext': 'json',
|
||||
'files': json_filename,
|
||||
"stagingDir": stagingdir,
|
||||
"stagingDir": staging_dir,
|
||||
}
|
||||
instance.data["representations"].append(json_representation)
|
||||
|
|
|
|||
|
|
@ -1,3 +1,4 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
import json
|
||||
import os
|
||||
|
||||
|
|
@ -17,7 +18,7 @@ class ExtractLook(openpype.api.Extractor):
|
|||
|
||||
def process(self, instance):
|
||||
# Define extract output file path
|
||||
stagingdir = self.staging_dir(instance)
|
||||
staging_dir = self.staging_dir(instance)
|
||||
resources_dir = instance.data["resourcesDir"]
|
||||
|
||||
ar = unreal.AssetRegistryHelpers.get_asset_registry()
|
||||
|
|
@ -57,7 +58,7 @@ class ExtractLook(openpype.api.Extractor):
|
|||
tga_export_task.set_editor_property('automated', True)
|
||||
tga_export_task.set_editor_property('object', texture)
|
||||
tga_export_task.set_editor_property(
|
||||
'filename', f"{stagingdir}/{tga_filename}")
|
||||
'filename', f"{staging_dir}/{tga_filename}")
|
||||
tga_export_task.set_editor_property('prompt', False)
|
||||
tga_export_task.set_editor_property('selected', False)
|
||||
|
||||
|
|
@ -66,7 +67,7 @@ class ExtractLook(openpype.api.Extractor):
|
|||
json_element['tga_filename'] = tga_filename
|
||||
|
||||
transfers.append((
|
||||
f"{stagingdir}/{tga_filename}",
|
||||
f"{staging_dir}/{tga_filename}",
|
||||
f"{resources_dir}/{tga_filename}"))
|
||||
|
||||
fbx_filename = f"{instance.name}_{name}.fbx"
|
||||
|
|
@ -84,7 +85,7 @@ class ExtractLook(openpype.api.Extractor):
|
|||
task.set_editor_property('automated', True)
|
||||
task.set_editor_property('object', object)
|
||||
task.set_editor_property(
|
||||
'filename', f"{stagingdir}/{fbx_filename}")
|
||||
'filename', f"{staging_dir}/{fbx_filename}")
|
||||
task.set_editor_property('prompt', False)
|
||||
task.set_editor_property('selected', False)
|
||||
|
||||
|
|
@ -93,13 +94,13 @@ class ExtractLook(openpype.api.Extractor):
|
|||
json_element['fbx_filename'] = fbx_filename
|
||||
|
||||
transfers.append((
|
||||
f"{stagingdir}/{fbx_filename}",
|
||||
f"{staging_dir}/{fbx_filename}",
|
||||
f"{resources_dir}/{fbx_filename}"))
|
||||
|
||||
json_data.append(json_element)
|
||||
|
||||
json_filename = f"{instance.name}.json"
|
||||
json_path = os.path.join(stagingdir, json_filename)
|
||||
json_path = os.path.join(staging_dir, json_filename)
|
||||
|
||||
with open(json_path, "w+") as file:
|
||||
json.dump(json_data, fp=file, indent=2)
|
||||
|
|
@ -113,7 +114,7 @@ class ExtractLook(openpype.api.Extractor):
|
|||
'name': 'json',
|
||||
'ext': 'json',
|
||||
'files': json_filename,
|
||||
"stagingDir": stagingdir,
|
||||
"stagingDir": staging_dir,
|
||||
}
|
||||
|
||||
instance.data["representations"].append(json_representation)
|
||||
|
|
|
|||
|
|
@ -36,13 +36,22 @@ from .execute import (
|
|||
CREATE_NO_WINDOW
|
||||
)
|
||||
from .log import PypeLogger, timeit
|
||||
|
||||
from .path_templates import (
|
||||
merge_dict,
|
||||
TemplateMissingKey,
|
||||
TemplateUnsolved,
|
||||
StringTemplate,
|
||||
TemplatesDict,
|
||||
FormatObject,
|
||||
)
|
||||
|
||||
from .mongo import (
|
||||
get_default_components,
|
||||
validate_mongo_connection,
|
||||
OpenPypeMongoConnection
|
||||
)
|
||||
from .anatomy import (
|
||||
merge_dict,
|
||||
Anatomy
|
||||
)
|
||||
|
||||
|
|
@ -285,9 +294,15 @@ __all__ = [
|
|||
"get_version_from_path",
|
||||
"get_last_version_from_path",
|
||||
|
||||
"merge_dict",
|
||||
"TemplateMissingKey",
|
||||
"TemplateUnsolved",
|
||||
"StringTemplate",
|
||||
"TemplatesDict",
|
||||
"FormatObject",
|
||||
|
||||
"terminal",
|
||||
|
||||
"merge_dict",
|
||||
"Anatomy",
|
||||
|
||||
"get_datetime_data",
|
||||
|
|
|
|||
|
|
@ -9,6 +9,12 @@ from openpype.settings.lib import (
|
|||
get_default_anatomy_settings,
|
||||
get_anatomy_settings
|
||||
)
|
||||
from .path_templates import (
|
||||
TemplateUnsolved,
|
||||
TemplateResult,
|
||||
TemplatesDict,
|
||||
FormatObject,
|
||||
)
|
||||
from .log import PypeLogger
|
||||
|
||||
log = PypeLogger().get_logger(__name__)
|
||||
|
|
@ -19,32 +25,6 @@ except NameError:
|
|||
StringType = str
|
||||
|
||||
|
||||
def merge_dict(main_dict, enhance_dict):
|
||||
"""Merges dictionaries by keys.
|
||||
|
||||
The function calls itself if the value of a key is again a dictionary.
|
||||
|
||||
Args:
|
||||
main_dict (dict): First dict to merge second one into.
|
||||
enhance_dict (dict): Second dict to be merged.
|
||||
|
||||
Returns:
|
||||
dict: Merged result.
|
||||
|
||||
.. note:: does not override the whole value on the first found key
|
||||
but only the value differences from enhance_dict
|
||||
|
||||
"""
|
||||
for key, value in enhance_dict.items():
|
||||
if key not in main_dict:
|
||||
main_dict[key] = value
|
||||
elif isinstance(value, dict) and isinstance(main_dict[key], dict):
|
||||
main_dict[key] = merge_dict(main_dict[key], value)
|
||||
else:
|
||||
main_dict[key] = value
|
||||
return main_dict
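To illustrate the recursive merge described in the docstring above, here is a small hypothetical call; the values are invented for the example:

main = {"root": {"work": "C:/projects"}, "version": 1}
extra = {"root": {"publish": "C:/publish"}, "version": 2}
merged = merge_dict(main, extra)
# merged == {"root": {"work": "C:/projects", "publish": "C:/publish"}, "version": 2}
# Nested dictionaries are merged key by key; other values from enhance_dict win.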
|
||||
|
||||
|
||||
class ProjectNotSet(Exception):
|
||||
"""Exception raised when is created Anatomy without project name."""
|
||||
|
||||
|
|
@ -59,7 +39,7 @@ class RootCombinationError(Exception):
|
|||
# TODO better error message
|
||||
msg = (
|
||||
"Combination of root with and"
|
||||
" without root name in Templates. {}"
|
||||
" without root name in AnatomyTemplates. {}"
|
||||
).format(joined_roots)
|
||||
|
||||
super(RootCombinationError, self).__init__(msg)
|
||||
|
|
@ -68,7 +48,7 @@ class RootCombinationError(Exception):
|
|||
class Anatomy:
|
||||
"""Anatomy module helps to keep project settings.
|
||||
|
||||
Wraps key project specifications, Templates and Roots.
|
||||
Wraps key project specifications, AnatomyTemplates and Roots.
|
||||
|
||||
Args:
|
||||
project_name (str): Project name to look on overrides.
|
||||
|
|
@ -93,7 +73,7 @@ class Anatomy:
|
|||
get_anatomy_settings(project_name, site_name)
|
||||
)
|
||||
self._site_name = site_name
|
||||
self._templates_obj = Templates(self)
|
||||
self._templates_obj = AnatomyTemplates(self)
|
||||
self._roots_obj = Roots(self)
|
||||
|
||||
# Anatomy used as dictionary
|
||||
|
|
@ -158,12 +138,12 @@ class Anatomy:
|
|||
|
||||
@property
|
||||
def templates(self):
|
||||
"""Wrap property `templates` of Anatomy's Templates instance."""
|
||||
"""Wrap property `templates` of Anatomy's AnatomyTemplates instance."""
|
||||
return self._templates_obj.templates
|
||||
|
||||
@property
|
||||
def templates_obj(self):
|
||||
"""Return `Templates` object of current Anatomy instance."""
|
||||
"""Return `AnatomyTemplates` object of current Anatomy instance."""
|
||||
return self._templates_obj
|
||||
|
||||
def format(self, *args, **kwargs):
|
||||
|
|
@ -375,203 +355,45 @@ class Anatomy:
|
|||
return rootless_path.format(**data)
|
||||
|
||||
|
||||
class TemplateMissingKey(Exception):
|
||||
"""Exception for cases when key does not exist in Anatomy."""
|
||||
|
||||
msg = "Anatomy key does not exist: `anatomy{0}`."
|
||||
|
||||
def __init__(self, parents):
|
||||
parent_join = "".join(["[\"{0}\"]".format(key) for key in parents])
|
||||
super(TemplateMissingKey, self).__init__(
|
||||
self.msg.format(parent_join)
|
||||
)
|
||||
|
||||
|
||||
class TemplateUnsolved(Exception):
|
||||
class AnatomyTemplateUnsolved(TemplateUnsolved):
|
||||
"""Exception for unsolved template when strict is set to True."""
|
||||
|
||||
msg = "Anatomy template \"{0}\" is unsolved.{1}{2}"
|
||||
invalid_types_msg = " Keys with invalid DataType: `{0}`."
|
||||
missing_keys_msg = " Missing keys: \"{0}\"."
|
||||
|
||||
def __init__(self, template, missing_keys, invalid_types):
|
||||
invalid_type_items = []
|
||||
for _key, _type in invalid_types.items():
|
||||
invalid_type_items.append(
|
||||
"\"{0}\" {1}".format(_key, str(_type))
|
||||
)
|
||||
|
||||
invalid_types_msg = ""
|
||||
if invalid_type_items:
|
||||
invalid_types_msg = self.invalid_types_msg.format(
|
||||
", ".join(invalid_type_items)
|
||||
)
|
||||
class AnatomyTemplateResult(TemplateResult):
|
||||
rootless = None
|
||||
|
||||
missing_keys_msg = ""
|
||||
if missing_keys:
|
||||
missing_keys_msg = self.missing_keys_msg.format(
|
||||
", ".join(missing_keys)
|
||||
)
|
||||
super(TemplateUnsolved, self).__init__(
|
||||
self.msg.format(template, missing_keys_msg, invalid_types_msg)
|
||||
def __new__(cls, result, rootless_path):
|
||||
new_obj = super(AnatomyTemplateResult, cls).__new__(
|
||||
cls,
|
||||
str(result),
|
||||
result.template,
|
||||
result.solved,
|
||||
result.used_values,
|
||||
result.missing_keys,
|
||||
result.invalid_types
|
||||
)
|
||||
|
||||
|
||||
class TemplateResult(str):
|
||||
"""Result (formatted template) of anatomy with most of information in.
|
||||
|
||||
Args:
|
||||
used_values (dict): Dictionary of template filling data with
|
||||
only used keys.
|
||||
solved (bool): For check if all required keys were filled.
|
||||
template (str): Original template.
|
||||
missing_keys (list): Missing keys that were not in the data. Include
|
||||
missing optional keys.
|
||||
invalid_types (dict): When key was found in data, but value had not
|
||||
allowed DataType. Allowed data types are `numbers`,
|
||||
`str`(`basestring`) and `dict`. Dictionary may cause invalid type
|
||||
when the value of a key in data is a dictionary but the template expects a string
|
||||
or a number.
|
||||
"""
|
||||
|
||||
def __new__(
|
||||
cls, filled_template, template, solved, rootless_path,
|
||||
used_values, missing_keys, invalid_types
|
||||
):
|
||||
new_obj = super(TemplateResult, cls).__new__(cls, filled_template)
|
||||
new_obj.used_values = used_values
|
||||
new_obj.solved = solved
|
||||
new_obj.template = template
|
||||
new_obj.rootless = rootless_path
|
||||
new_obj.missing_keys = list(set(missing_keys))
|
||||
_invalid_types = {}
|
||||
for invalid_type in invalid_types:
|
||||
for key, val in invalid_type.items():
|
||||
if key in _invalid_types:
|
||||
continue
|
||||
_invalid_types[key] = val
|
||||
new_obj.invalid_types = _invalid_types
|
||||
return new_obj
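A hedged illustration of how such a result object is typically inspected. The attribute names come from the class above; the `anatomy.format(data)` call and the "work"/"file" keys are assumptions about typical usage, not taken from this diff:

result = anatomy.format(data)["work"]["file"]
if result.solved:
    print(str(result))           # the filled path
    print(result.rootless)       # the same path with the root kept as a key
else:
    print(result.missing_keys)   # keys that were absent from `data`
    print(result.invalid_types)  # keys present but with a disallowed data type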
|
||||
|
||||
|
||||
class TemplatesDict(dict):
|
||||
"""Holds and wrap TemplateResults for easy bug report."""
|
||||
|
||||
def __init__(self, in_data, key=None, parent=None, strict=None):
|
||||
super(TemplatesDict, self).__init__()
|
||||
for _key, _value in in_data.items():
|
||||
if isinstance(_value, dict):
|
||||
_value = self.__class__(_value, _key, self)
|
||||
self[_key] = _value
|
||||
|
||||
self.key = key
|
||||
self.parent = parent
|
||||
self.strict = strict
|
||||
if self.parent is None and strict is None:
|
||||
self.strict = True
|
||||
|
||||
def __getitem__(self, key):
|
||||
# Raise error about missing key in anatomy.yaml
|
||||
if key not in self.keys():
|
||||
hier = self.hierarchy()
|
||||
hier.append(key)
|
||||
raise TemplateMissingKey(hier)
|
||||
|
||||
value = super(TemplatesDict, self).__getitem__(key)
|
||||
if isinstance(value, self.__class__):
|
||||
return value
|
||||
|
||||
# Raise exception when solved templates are expected and the value is not solved.
|
||||
if (
|
||||
self.raise_on_unsolved
|
||||
and (hasattr(value, "solved") and not value.solved)
|
||||
):
|
||||
raise TemplateUnsolved(
|
||||
value.template, value.missing_keys, value.invalid_types
|
||||
def validate(self):
|
||||
if not self.solved:
|
||||
raise AnatomyTemplateUnsolved(
|
||||
self.template,
|
||||
self.missing_keys,
|
||||
self.invalid_types
|
||||
)
|
||||
return value
|
||||
|
||||
@property
|
||||
def raise_on_unsolved(self):
|
||||
"""To affect this change `strict` attribute."""
|
||||
if self.strict is not None:
|
||||
return self.strict
|
||||
return self.parent.raise_on_unsolved
|
||||
|
||||
def hierarchy(self):
|
||||
"""Return dictionary keys one by one to root parent."""
|
||||
if self.parent is None:
|
||||
return []
|
||||
|
||||
hier_keys = []
|
||||
par_hier = self.parent.hierarchy()
|
||||
if par_hier:
|
||||
hier_keys.extend(par_hier)
|
||||
hier_keys.append(self.key)
|
||||
|
||||
return hier_keys
|
||||
|
||||
@property
|
||||
def missing_keys(self):
|
||||
"""Return missing keys of all children templates."""
|
||||
missing_keys = []
|
||||
for value in self.values():
|
||||
missing_keys.extend(value.missing_keys)
|
||||
return list(set(missing_keys))
|
||||
|
||||
@property
|
||||
def invalid_types(self):
|
||||
"""Return invalid types of all children templates."""
|
||||
invalid_types = {}
|
||||
for value in self.values():
|
||||
for invalid_type in value.invalid_types:
|
||||
_invalid_types = {}
|
||||
for key, val in invalid_type.items():
|
||||
if key in invalid_types:
|
||||
continue
|
||||
_invalid_types[key] = val
|
||||
invalid_types = merge_dict(invalid_types, _invalid_types)
|
||||
return invalid_types
|
||||
|
||||
@property
|
||||
def used_values(self):
|
||||
"""Return used values for all children templates."""
|
||||
used_values = {}
|
||||
for value in self.values():
|
||||
used_values = merge_dict(used_values, value.used_values)
|
||||
return used_values
|
||||
|
||||
def get_solved(self):
|
||||
"""Get only solved key from templates."""
|
||||
result = {}
|
||||
for key, value in self.items():
|
||||
if isinstance(value, self.__class__):
|
||||
value = value.get_solved()
|
||||
if not value:
|
||||
continue
|
||||
result[key] = value
|
||||
|
||||
elif (
|
||||
not hasattr(value, "solved") or
|
||||
value.solved
|
||||
):
|
||||
result[key] = value
|
||||
return self.__class__(result, key=self.key, parent=self.parent)
|
||||
|
||||
|
||||
class Templates:
|
||||
key_pattern = re.compile(r"(\{.*?[^{0]*\})")
|
||||
key_padding_pattern = re.compile(r"([^:]+)\S+[><]\S+")
|
||||
sub_dict_pattern = re.compile(r"([^\[\]]+)")
|
||||
optional_pattern = re.compile(r"(<.*?[^{0]*>)[^0-9]*?")
|
||||
|
||||
class AnatomyTemplates(TemplatesDict):
|
||||
inner_key_pattern = re.compile(r"(\{@.*?[^{}0]*\})")
|
||||
inner_key_name_pattern = re.compile(r"\{@(.*?[^{}0]*)\}")
|
||||
|
||||
def __init__(self, anatomy):
|
||||
super(AnatomyTemplates, self).__init__()
|
||||
self.anatomy = anatomy
|
||||
self.loaded_project = None
|
||||
self._templates = None
|
||||
|
||||
def __getitem__(self, key):
|
||||
return self.templates[key]
|
||||
|
|
@ -580,7 +402,9 @@ class Templates:
|
|||
return self.templates.get(key, default)
|
||||
|
||||
def reset(self):
|
||||
self._raw_templates = None
|
||||
self._templates = None
|
||||
self._objected_templates = None
|
||||
|
||||
@property
|
||||
def project_name(self):
|
||||
|
|
@ -592,17 +416,66 @@ class Templates:
|
|||
|
||||
@property
|
||||
def templates(self):
|
||||
self._validate_discovery()
|
||||
return self._templates
|
||||
|
||||
@property
|
||||
def objected_templates(self):
|
||||
self._validate_discovery()
|
||||
return self._objected_templates
|
||||
|
||||
def _validate_discovery(self):
|
||||
if self.project_name != self.loaded_project:
|
||||
self._templates = None
|
||||
self.reset()
|
||||
|
||||
if self._templates is None:
|
||||
self._templates = self._discover()
|
||||
self._discover()
|
||||
self.loaded_project = self.project_name
|
||||
return self._templates
|
||||
|
||||
def _format_value(self, value, data):
|
||||
if isinstance(value, RootItem):
|
||||
return self._solve_dict(value, data)
|
||||
|
||||
result = super(AnatomyTemplates, self)._format_value(value, data)
|
||||
if isinstance(result, TemplateResult):
|
||||
rootless_path = self._rootless_path(result, data)
|
||||
result = AnatomyTemplateResult(result, rootless_path)
|
||||
return result
|
||||
|
||||
def set_templates(self, templates):
|
||||
if not templates:
|
||||
self.reset()
|
||||
return
|
||||
|
||||
self._raw_templates = copy.deepcopy(templates)
|
||||
templates = copy.deepcopy(templates)
|
||||
v_queue = collections.deque()
|
||||
v_queue.append(templates)
|
||||
while v_queue:
|
||||
item = v_queue.popleft()
|
||||
if not isinstance(item, dict):
|
||||
continue
|
||||
|
||||
for key in tuple(item.keys()):
|
||||
value = item[key]
|
||||
if isinstance(value, dict):
|
||||
v_queue.append(value)
|
||||
|
||||
elif (
|
||||
isinstance(value, StringType)
|
||||
and "{task}" in value
|
||||
):
|
||||
item[key] = value.replace("{task}", "{task[name]}")
|
||||
|
||||
solved_templates = self.solve_template_inner_links(templates)
|
||||
self._templates = solved_templates
|
||||
self._objected_templates = self.create_ojected_templates(
|
||||
solved_templates
|
||||
)
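For illustration, the `{task}` compatibility pass in `set_templates` above rewrites legacy keys inside nested template dictionaries; the template value below is invented:

templates = {
    "work": {"file": "{project[code]}_{task}_v{version:0>3}.ma"}
}
# After the queue-based pass the string value reads:
#   "{project[code]}_{task[name]}_v{version:0>3}.ma"
# so templates written against the old flat "task" key keep working.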
|
||||
|
||||
def default_templates(self):
|
||||
"""Return default templates data with solved inner keys."""
|
||||
return Templates.solve_template_inner_links(
|
||||
return self.solve_template_inner_links(
|
||||
self.anatomy["templates"]
|
||||
)
|
||||
|
||||
|
|
@ -613,7 +486,7 @@ class Templates:
|
|||
TODO: create templates if not exist.
|
||||
|
||||
Returns:
|
||||
TemplatesDict: Contain templates data for current project of
|
||||
TemplatesResultDict: Contains templates data for the current project or
|
||||
default templates.
|
||||
"""
|
||||
|
||||
|
|
@ -624,7 +497,7 @@ class Templates:
|
|||
" Trying to use default."
|
||||
).format(self.project_name))
|
||||
|
||||
return Templates.solve_template_inner_links(self.anatomy["templates"])
|
||||
self.set_templates(self.anatomy["templates"])
|
||||
|
||||
@classmethod
|
||||
def replace_inner_keys(cls, matches, value, key_values, key):
|
||||
|
|
@ -791,149 +664,6 @@ class Templates:
|
|||
|
||||
return keys_by_subkey
|
||||
|
||||
def _filter_optional(self, template, data):
|
||||
"""Filter invalid optional keys.
|
||||
|
||||
Invalid keys may be missing keys or keys with an invalid value DataType.
|
||||
|
||||
Args:
|
||||
template (str): Anatomy template which will be formatted.
|
||||
data (dict): Containing keys to be filled into template.
|
||||
|
||||
Result:
|
||||
tuple: Contains the original template without missing optional keys and
|
||||
without the optional key identifiers ("<" and ">"), information
|
||||
about missing optional keys and invalid types of optional keys.
|
||||
|
||||
"""
|
||||
|
||||
# Remove optional missing keys
|
||||
missing_keys = []
|
||||
invalid_types = []
|
||||
for optional_group in self.optional_pattern.findall(template):
|
||||
_missing_keys = []
|
||||
_invalid_types = []
|
||||
for optional_key in self.key_pattern.findall(optional_group):
|
||||
key = str(optional_key[1:-1])
|
||||
key_padding = list(
|
||||
self.key_padding_pattern.findall(key)
|
||||
)
|
||||
if key_padding:
|
||||
key = key_padding[0]
|
||||
|
||||
validation_result = self._validate_data_key(
|
||||
key, data
|
||||
)
|
||||
missing_key = validation_result["missing_key"]
|
||||
invalid_type = validation_result["invalid_type"]
|
||||
|
||||
valid = True
|
||||
if missing_key is not None:
|
||||
_missing_keys.append(missing_key)
|
||||
valid = False
|
||||
|
||||
if invalid_type is not None:
|
||||
_invalid_types.append(invalid_type)
|
||||
valid = False
|
||||
|
||||
if valid:
|
||||
try:
|
||||
optional_key.format(**data)
|
||||
except KeyError:
|
||||
_missing_keys.append(key)
|
||||
valid = False
|
||||
|
||||
valid = len(_invalid_types) == 0 and len(_missing_keys) == 0
|
||||
missing_keys.extend(_missing_keys)
|
||||
invalid_types.extend(_invalid_types)
|
||||
replacement = ""
|
||||
if valid:
|
||||
replacement = optional_group[1:-1]
|
||||
|
||||
template = template.replace(optional_group, replacement)
|
||||
return (template, missing_keys, invalid_types)
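As a hedged illustration of the optional-key filtering described in the docstring above; the template and data are invented for the example:

template = "{asset}_v{version:0>3}<_{comment}>"
# With data lacking "comment", the whole optional group is dropped:
#   "{asset}_v{version:0>3}"
# With "comment" present, the group is kept and only "<" and ">" are removed:
#   "{asset}_v{version:0>3}_{comment}"
# Missing or invalid optional keys are also collected and returned to the caller.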
|
||||
|
||||
def _validate_data_key(self, key, data):
|
||||
"""Check and prepare missing keys and invalid types of template."""
|
||||
result = {
|
||||
"missing_key": None,
|
||||
"invalid_type": None
|
||||
}
|
||||
|
||||
# check if key expects subdictionary keys (e.g. project[name])
|
||||
key_subdict = list(self.sub_dict_pattern.findall(key))
|
||||
used_keys = []
|
||||
if len(key_subdict) <= 1:
|
||||
if key not in data:
|
||||
result["missing_key"] = key
|
||||
return result
|
||||
|
||||
used_keys.append(key)
|
||||
value = data[key]
|
||||
|
||||
else:
|
||||
value = data
|
||||
missing_key = False
|
||||
invalid_type = False
|
||||
for sub_key in key_subdict:
|
||||
if (
|
||||
value is None
|
||||
or (hasattr(value, "items") and sub_key not in value)
|
||||
):
|
||||
missing_key = True
|
||||
used_keys.append(sub_key)
|
||||
break
|
||||
|
||||
elif not hasattr(value, "items"):
|
||||
invalid_type = True
|
||||
break
|
||||
|
||||
used_keys.append(sub_key)
|
||||
value = value.get(sub_key)
|
||||
|
||||
if missing_key or invalid_type:
|
||||
if len(used_keys) == 0:
|
||||
invalid_key = key_subdict[0]
|
||||
else:
|
||||
invalid_key = used_keys[0]
|
||||
for idx, sub_key in enumerate(used_keys):
|
||||
if idx == 0:
|
||||
continue
|
||||
invalid_key += "[{0}]".format(sub_key)
|
||||
|
||||
if missing_key:
|
||||
result["missing_key"] = invalid_key
|
||||
|
||||
elif invalid_type:
|
||||
result["invalid_type"] = {invalid_key: type(value)}
|
||||
|
||||
return result
|
||||
|
||||
if isinstance(value, (numbers.Number, Roots, RootItem)):
|
||||
return result
|
||||
|
||||
for inh_class in type(value).mro():
|
||||
if inh_class == StringType:
|
||||
return result
|
||||
|
||||
result["missing_key"] = key
|
||||
result["invalid_type"] = {key: type(value)}
|
||||
return result
|
||||
|
||||
def _merge_used_values(self, current_used, keys, value):
|
||||
key = keys[0]
|
||||
_keys = keys[1:]
|
||||
if len(_keys) == 0:
|
||||
current_used[key] = value
|
||||
else:
|
||||
next_dict = {}
|
||||
if key in current_used:
|
||||
next_dict = current_used[key]
|
||||
current_used[key] = self._merge_used_values(
|
||||
next_dict, _keys, value
|
||||
)
|
||||
return current_used
|
||||
|
||||
def _dict_to_subkeys_list(self, subdict, pre_keys=None):
|
||||
if pre_keys is None:
|
||||
pre_keys = []
|
||||
|
|
@ -956,9 +686,11 @@ class Templates:
|
|||
return {key_list[0]: value}
|
||||
return {key_list[0]: self._keys_to_dicts(key_list[1:], value)}
|
||||
|
||||
def _rootless_path(
|
||||
self, template, used_values, final_data, missing_keys, invalid_types
|
||||
):
|
||||
def _rootless_path(self, result, final_data):
|
||||
used_values = result.used_values
|
||||
missing_keys = result.missing_keys
|
||||
template = result.template
|
||||
invalid_types = result.invalid_types
|
||||
if (
|
||||
"root" not in used_values
|
||||
or "root" in missing_keys
|
||||
|
|
@ -974,210 +706,49 @@ class Templates:
|
|||
if not root_keys:
|
||||
return
|
||||
|
||||
roots_dict = {}
|
||||
output = str(result)
|
||||
for used_root_keys in root_keys:
|
||||
if not used_root_keys:
|
||||
continue
|
||||
|
||||
used_value = used_values
|
||||
root_key = None
|
||||
for key in used_root_keys:
|
||||
used_value = used_value[key]
|
||||
if root_key is None:
|
||||
root_key = key
|
||||
else:
|
||||
root_key += "[{}]".format(key)
|
||||
|
||||
root_key = "{" + root_key + "}"
|
||||
|
||||
roots_dict = merge_dict(
|
||||
roots_dict,
|
||||
self._keys_to_dicts(used_root_keys, root_key)
|
||||
)
|
||||
|
||||
final_data["root"] = roots_dict["root"]
|
||||
return template.format(**final_data)
|
||||
|
||||
def _format(self, orig_template, data):
|
||||
""" Figure out with whole formatting.
|
||||
|
||||
Separate advanced keys (*Like '{project[name]}') from string which must
|
||||
be formatted separatelly in case of missing or incomplete keys in data.
|
||||
|
||||
Args:
|
||||
template (str): Anatomy template which will be formatted.
|
||||
data (dict): Containing keys to be filled into template.
|
||||
|
||||
Returns:
|
||||
TemplateResult: Filled or partially filled template containing all
|
||||
data needed or missing for filling template.
|
||||
"""
|
||||
task_data = data.get("task")
|
||||
if (
|
||||
isinstance(task_data, StringType)
|
||||
and "{task[name]}" in orig_template
|
||||
):
|
||||
# Change task to dictionary if template expect dictionary
|
||||
data["task"] = {"name": task_data}
|
||||
|
||||
template, missing_optional, invalid_optional = (
|
||||
self._filter_optional(orig_template, data)
|
||||
)
|
||||
# Remove optional missing keys
|
||||
used_values = {}
|
||||
invalid_required = []
|
||||
missing_required = []
|
||||
replace_keys = []
|
||||
|
||||
for group in self.key_pattern.findall(template):
|
||||
orig_key = group[1:-1]
|
||||
key = str(orig_key)
|
||||
key_padding = list(self.key_padding_pattern.findall(key))
|
||||
if key_padding:
|
||||
key = key_padding[0]
|
||||
|
||||
validation_result = self._validate_data_key(key, data)
|
||||
missing_key = validation_result["missing_key"]
|
||||
invalid_type = validation_result["invalid_type"]
|
||||
|
||||
if invalid_type is not None:
|
||||
invalid_required.append(invalid_type)
|
||||
replace_keys.append(key)
|
||||
continue
|
||||
|
||||
if missing_key is not None:
|
||||
missing_required.append(missing_key)
|
||||
replace_keys.append(key)
|
||||
continue
|
||||
|
||||
try:
|
||||
value = group.format(**data)
|
||||
key_subdict = list(self.sub_dict_pattern.findall(key))
|
||||
if len(key_subdict) <= 1:
|
||||
used_values[key] = value
|
||||
|
||||
else:
|
||||
used_values = self._merge_used_values(
|
||||
used_values, key_subdict, value
|
||||
)
|
||||
|
||||
except (TypeError, KeyError):
|
||||
missing_required.append(key)
|
||||
replace_keys.append(key)
|
||||
|
||||
final_data = copy.deepcopy(data)
|
||||
for key in replace_keys:
|
||||
key_subdict = list(self.sub_dict_pattern.findall(key))
|
||||
if len(key_subdict) <= 1:
|
||||
final_data[key] = "{" + key + "}"
|
||||
continue
|
||||
|
||||
replace_key_dst = "---".join(key_subdict)
|
||||
replace_key_dst_curly = "{" + replace_key_dst + "}"
|
||||
replace_key_src_curly = "{" + key + "}"
|
||||
template = template.replace(
|
||||
replace_key_src_curly, replace_key_dst_curly
|
||||
)
|
||||
final_data[replace_key_dst] = replace_key_src_curly
|
||||
|
||||
solved = len(missing_required) == 0 and len(invalid_required) == 0
|
||||
|
||||
missing_keys = missing_required + missing_optional
|
||||
invalid_types = invalid_required + invalid_optional
|
||||
|
||||
filled_template = template.format(**final_data)
|
||||
# WARNING `_rootless_path` changes values in `final_data`, please keep
# this in mind when changing the order
|
||||
rootless_path = self._rootless_path(
|
||||
template, used_values, final_data, missing_keys, invalid_types
|
||||
)
|
||||
if rootless_path is None:
|
||||
rootless_path = filled_template
|
||||
|
||||
result = TemplateResult(
|
||||
filled_template, orig_template, solved, rootless_path,
|
||||
used_values, missing_keys, invalid_types
|
||||
)
|
||||
return result
|
||||
|
||||
def solve_dict(self, templates, data):
|
||||
""" Solves templates with entered data.
|
||||
|
||||
Args:
|
||||
templates (dict): All Anatomy templates which will be formatted.
|
||||
data (dict): Containing keys to be filled into template.
|
||||
|
||||
Returns:
|
||||
dict: With `TemplateResult` in values containing filled or
|
||||
partially filled templates.
|
||||
"""
|
||||
output = collections.defaultdict(dict)
|
||||
for key, orig_value in templates.items():
|
||||
if isinstance(orig_value, StringType):
|
||||
# Replace {task} by '{task[name]}' for backward compatibility
|
||||
if '{task}' in orig_value:
|
||||
orig_value = orig_value.replace('{task}', '{task[name]}')
|
||||
|
||||
output[key] = self._format(orig_value, data)
|
||||
continue
|
||||
|
||||
# Check if orig_value has items attribute (any dict inheritance)
|
||||
if not hasattr(orig_value, "items"):
|
||||
# TODO we should handle this case
|
||||
output[key] = orig_value
|
||||
continue
|
||||
|
||||
for s_key, s_value in self.solve_dict(orig_value, data).items():
|
||||
output[key][s_key] = s_value
|
||||
output = output.replace(str(used_value), root_key)
|
||||
|
||||
return output
|
||||
|
||||
def format(self, data, strict=True):
|
||||
copy_data = copy.deepcopy(data)
|
||||
roots = self.roots
|
||||
if roots:
|
||||
copy_data["root"] = roots
|
||||
result = super(AnatomyTemplates, self).format(copy_data)
|
||||
result.strict = strict
|
||||
return result
|
||||
|
||||
def format_all(self, in_data, only_keys=True):
|
||||
""" Solves templates based on entered data.
|
||||
|
||||
Args:
|
||||
data (dict): Containing keys to be filled into template.
|
||||
only_keys (bool, optional): Decides if environ will be used to
|
||||
fill templates or only keys in data.
|
||||
|
||||
Returns:
|
||||
TemplatesDict: Output `TemplateResult` have `strict` attribute
|
||||
set to False so accessing unfilled keys in templates won't
|
||||
raise any exceptions.
|
||||
TemplatesResultDict: Output `TemplateResult` have `strict`
|
||||
attribute set to False so accessing unfilled keys in templates
|
||||
won't raise any exceptions.
|
||||
"""
|
||||
output = self.format(in_data, only_keys)
|
||||
output.strict = False
|
||||
return output
|
||||
|
||||
def format(self, in_data, only_keys=True):
|
||||
""" Solves templates based on entered data.
|
||||
|
||||
Args:
|
||||
data (dict): Containing keys to be filled into template.
|
||||
only_keys (bool, optional): Decides if environ will be used to
|
||||
fill templates or only keys in data.
|
||||
|
||||
Returns:
|
||||
TemplatesDict: Output `TemplateResult` objects have the `strict`
attribute set to True, so accessing unfilled keys in templates will
raise exceptions with an explained error.
|
||||
"""
|
||||
# Create a copy of inserted data
|
||||
data = copy.deepcopy(in_data)
|
||||
|
||||
# Add environment variable to data
|
||||
if only_keys is False:
|
||||
for key, val in os.environ.items():
|
||||
data["$" + key] = val
|
||||
|
||||
# override root value
|
||||
roots = self.roots
|
||||
if roots:
|
||||
data["root"] = roots
|
||||
solved = self.solve_dict(self.templates, data)
|
||||
|
||||
return TemplatesDict(solved)
|
||||
return self.format(in_data, strict=False)
|
||||
|
||||
|
||||
class RootItem:
|
||||
class RootItem(FormatObject):
|
||||
"""Represents one item or roots.
|
||||
|
||||
Holds raw data of root item specification. Raw data contain value
|
||||
|
|
|
|||
|
|
@ -28,7 +28,8 @@ from .profiles_filtering import filter_profiles
|
|||
from .local_settings import get_openpype_username
|
||||
from .avalon_context import (
|
||||
get_workdir_data,
|
||||
get_workdir_with_workdir_data
|
||||
get_workdir_with_workdir_data,
|
||||
get_workfile_template_key
|
||||
)
|
||||
|
||||
from .python_module_tools import (
|
||||
|
|
@ -1591,14 +1592,15 @@ def _prepare_last_workfile(data, workdir):
|
|||
last_workfile_path = data.get("last_workfile_path") or ""
|
||||
if not last_workfile_path:
|
||||
extensions = avalon.api.HOST_WORKFILE_EXTENSIONS.get(app.host_name)
|
||||
|
||||
if extensions:
|
||||
anatomy = data["anatomy"]
|
||||
project_settings = data["project_settings"]
|
||||
task_type = workdir_data["task"]["type"]
|
||||
template_key = get_workfile_template_key(
|
||||
task_type, app.host_name, project_settings=project_settings
|
||||
)
|
||||
# Find last workfile
|
||||
file_template = anatomy.templates["work"]["file"]
|
||||
# Replace {task} by '{task[name]}' for backward compatibility
|
||||
if '{task}' in file_template:
|
||||
file_template = file_template.replace('{task}', '{task[name]}')
|
||||
file_template = str(anatomy.templates[template_key]["file"])
|
||||
|
||||
workdir_data.update({
|
||||
"version": 1,
|
||||
|
|
|
|||
|
|
@ -952,7 +952,7 @@ class BuildWorkfile:
|
|||
Returns:
|
||||
(dict): preset per entered task name
|
||||
"""
|
||||
host_name = avalon.api.registered_host().__name__.rsplit(".", 1)[-1]
|
||||
host_name = os.environ["AVALON_APP"]
|
||||
project_settings = get_project_settings(
|
||||
avalon.io.Session["AVALON_PROJECT"]
|
||||
)
|
||||
|
|
|
|||
778
openpype/lib/path_templates.py
Normal file
|
|
@ -0,0 +1,778 @@
|
|||
import os
|
||||
import re
|
||||
import copy
|
||||
import numbers
|
||||
import collections
|
||||
|
||||
import six
|
||||
|
||||
from .log import PypeLogger
|
||||
|
||||
log = PypeLogger.get_logger(__name__)
|
||||
|
||||
|
||||
KEY_PATTERN = re.compile(r"(\{.*?[^{0]*\})")
|
||||
KEY_PADDING_PATTERN = re.compile(r"([^:]+)\S+[><]\S+")
|
||||
SUB_DICT_PATTERN = re.compile(r"([^\[\]]+)")
|
||||
OPTIONAL_PATTERN = re.compile(r"(<.*?[^{0]*>)[^0-9]*?")
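# Rough illustration of what the patterns above are meant to capture
# (example values are an assumption, not exhaustive):
#   KEY_PATTERN.findall("{project[name]}_v{version:0>3}")
#       -> ["{project[name]}", "{version:0>3}"]
#   KEY_PADDING_PATTERN.findall("version:0>3")   -> ["version"]
#   SUB_DICT_PATTERN.findall("project[name]")    -> ["project", "name"]
#   OPTIONAL_PATTERN.findall("<_{comment}>")     -> ["<_{comment}>"]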
|
||||
|
||||
|
||||
def merge_dict(main_dict, enhance_dict):
|
||||
"""Merges dictionaries by keys.
|
||||
|
||||
The function calls itself if the value of a key is again a dictionary.
|
||||
|
||||
Args:
|
||||
main_dict (dict): First dict to merge second one into.
|
||||
enhance_dict (dict): Second dict to be merged.
|
||||
|
||||
Returns:
|
||||
dict: Merged result.
|
||||
|
||||
.. note:: does not override the whole value of the first found key,
but only the differing values from enhance_dict
|
||||
|
||||
"""
|
||||
for key, value in enhance_dict.items():
|
||||
if key not in main_dict:
|
||||
main_dict[key] = value
|
||||
elif isinstance(value, dict) and isinstance(main_dict[key], dict):
|
||||
main_dict[key] = merge_dict(main_dict[key], value)
|
||||
else:
|
||||
main_dict[key] = value
|
||||
return main_dict
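# Example of the merge behaviour described above (illustrative only):
#   merge_dict({"a": {"x": 1}}, {"a": {"y": 2}, "b": 3})
#       -> {"a": {"x": 1, "y": 2}, "b": 3}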
|
||||
|
||||
|
||||
class TemplateMissingKey(Exception):
|
||||
"""Exception for cases when key does not exist in template."""
|
||||
|
||||
msg = "Template key does not exist: `{}`."
|
||||
|
||||
def __init__(self, parents):
|
||||
parent_join = "".join(["[\"{0}\"]".format(key) for key in parents])
|
||||
super(TemplateMissingKey, self).__init__(
|
||||
self.msg.format(parent_join)
|
||||
)
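# For illustration, TemplateMissingKey(["work", "file"]) produces the message:
#   Template key does not exist: `["work"]["file"]`.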
|
||||
|
||||
|
||||
class TemplateUnsolved(Exception):
|
||||
"""Exception for unsolved template when strict is set to True."""
|
||||
|
||||
msg = "Template \"{0}\" is unsolved.{1}{2}"
|
||||
invalid_types_msg = " Keys with invalid DataType: `{0}`."
|
||||
missing_keys_msg = " Missing keys: \"{0}\"."
|
||||
|
||||
def __init__(self, template, missing_keys, invalid_types):
|
||||
invalid_type_items = []
|
||||
for _key, _type in invalid_types.items():
|
||||
invalid_type_items.append(
|
||||
"\"{0}\" {1}".format(_key, str(_type))
|
||||
)
|
||||
|
||||
invalid_types_msg = ""
|
||||
if invalid_type_items:
|
||||
invalid_types_msg = self.invalid_types_msg.format(
|
||||
", ".join(invalid_type_items)
|
||||
)
|
||||
|
||||
missing_keys_msg = ""
|
||||
if missing_keys:
|
||||
missing_keys_msg = self.missing_keys_msg.format(
|
||||
", ".join(missing_keys)
|
||||
)
|
||||
super(TemplateUnsolved, self).__init__(
|
||||
self.msg.format(template, missing_keys_msg, invalid_types_msg)
|
||||
)
|
||||
|
||||
|
||||
class StringTemplate(object):
|
||||
"""String that can be formatted."""
|
||||
def __init__(self, template):
|
||||
if not isinstance(template, six.string_types):
|
||||
raise TypeError("<{}> argument must be a string, not {}.".format(
|
||||
self.__class__.__name__, str(type(template))
|
||||
))
|
||||
|
||||
self._template = template
|
||||
parts = []
|
||||
last_end_idx = 0
|
||||
for item in KEY_PATTERN.finditer(template):
|
||||
start, end = item.span()
|
||||
if start > last_end_idx:
|
||||
parts.append(template[last_end_idx:start])
|
||||
parts.append(FormattingPart(template[start:end]))
|
||||
last_end_idx = end
|
||||
|
||||
if last_end_idx < len(template):
|
||||
parts.append(template[last_end_idx:len(template)])
|
||||
|
||||
new_parts = []
|
||||
for part in parts:
|
||||
if not isinstance(part, six.string_types):
|
||||
new_parts.append(part)
|
||||
continue
|
||||
|
||||
substr = ""
|
||||
for char in part:
|
||||
if char not in ("<", ">"):
|
||||
substr += char
|
||||
else:
|
||||
if substr:
|
||||
new_parts.append(substr)
|
||||
new_parts.append(char)
|
||||
substr = ""
|
||||
if substr:
|
||||
new_parts.append(substr)
|
||||
|
||||
self._parts = self.find_optional_parts(new_parts)
|
||||
|
||||
def __str__(self):
|
||||
return self.template
|
||||
|
||||
def __repr__(self):
|
||||
return "<{}> {}".format(self.__class__.__name__, self.template)
|
||||
|
||||
def __contains__(self, other):
|
||||
return other in self.template
|
||||
|
||||
def replace(self, *args, **kwargs):
|
||||
self._template = self.template.replace(*args, **kwargs)
|
||||
return self
|
||||
|
||||
@property
|
||||
def template(self):
|
||||
return self._template
|
||||
|
||||
def format(self, data):
|
||||
""" Figure out with whole formatting.
|
||||
|
||||
Separate advanced keys (*Like '{project[name]}') from string which must
|
||||
be formatted separatelly in case of missing or incomplete keys in data.
|
||||
|
||||
Args:
|
||||
data (dict): Containing keys to be filled into template.
|
||||
|
||||
Returns:
|
||||
TemplateResult: Filled or partially filled template containing all
|
||||
data needed or missing for filling template.
|
||||
"""
|
||||
result = TemplatePartResult()
|
||||
for part in self._parts:
|
||||
if isinstance(part, six.string_types):
|
||||
result.add_output(part)
|
||||
else:
|
||||
part.format(data, result)
|
||||
|
||||
invalid_types = result.invalid_types
|
||||
invalid_types.update(result.invalid_optional_types)
|
||||
invalid_types = result.split_keys_to_subdicts(invalid_types)
|
||||
|
||||
missing_keys = result.missing_keys
|
||||
missing_keys |= result.missing_optional_keys
|
||||
|
||||
solved = result.solved
|
||||
used_values = result.get_clean_used_values()
|
||||
|
||||
return TemplateResult(
|
||||
result.output,
|
||||
self.template,
|
||||
solved,
|
||||
used_values,
|
||||
missing_keys,
|
||||
invalid_types
|
||||
)
|
||||
|
||||
def format_strict(self, *args, **kwargs):
|
||||
result = self.format(*args, **kwargs)
|
||||
result.validate()
|
||||
return result
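# Sketch of intended usage (the data values here are only an example):
#   template = StringTemplate("{project[name]}/{asset}_v{version:0>3}<_{comment}>")
#   result = template.format(
#       {"project": {"name": "prj"}, "asset": "chr", "version": 5}
#   )
#   str(result)     -> "prj/chr_v005"   (optional {comment} block is dropped)
#   result.solved   -> True
#   format_strict() additionally calls result.validate(), which raises
#   TemplateUnsolved when required keys are missing or have invalid types.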
|
||||
|
||||
@staticmethod
|
||||
def find_optional_parts(parts):
|
||||
new_parts = []
|
||||
tmp_parts = {}
|
||||
counted_symb = -1
|
||||
for part in parts:
|
||||
if part == "<":
|
||||
counted_symb += 1
|
||||
tmp_parts[counted_symb] = []
|
||||
|
||||
elif part == ">":
|
||||
if counted_symb > -1:
|
||||
parts = tmp_parts.pop(counted_symb)
|
||||
counted_symb -= 1
|
||||
if parts:
|
||||
# Remove optional start char
|
||||
parts.pop(0)
|
||||
if counted_symb < 0:
|
||||
out_parts = new_parts
|
||||
else:
|
||||
out_parts = tmp_parts[counted_symb]
|
||||
# Store temp parts
|
||||
out_parts.append(OptionalPart(parts))
|
||||
continue
|
||||
|
||||
if counted_symb < 0:
|
||||
new_parts.append(part)
|
||||
else:
|
||||
tmp_parts[counted_symb].append(part)
|
||||
|
||||
if tmp_parts:
|
||||
for idx in sorted(tmp_parts.keys()):
|
||||
new_parts.extend(tmp_parts[idx])
|
||||
return new_parts
|
||||
|
||||
|
||||
class TemplatesDict(object):
|
||||
def __init__(self, templates=None):
|
||||
self._raw_templates = None
|
||||
self._templates = None
|
||||
self._objected_templates = None
|
||||
self.set_templates(templates)
|
||||
|
||||
def set_templates(self, templates):
|
||||
if templates is None:
|
||||
self._raw_templates = None
|
||||
self._templates = None
|
||||
self._objected_templates = None
|
||||
elif isinstance(templates, dict):
|
||||
self._raw_templates = copy.deepcopy(templates)
|
||||
self._templates = templates
|
||||
self._objected_templates = self.create_ojected_templates(templates)
|
||||
else:
|
||||
raise TypeError("<{}> argument must be a dict, not {}.".format(
|
||||
self.__class__.__name__, str(type(templates))
|
||||
))
|
||||
|
||||
def __getitem__(self, key):
|
||||
return self.templates[key]
|
||||
|
||||
def get(self, key, *args, **kwargs):
|
||||
return self.templates.get(key, *args, **kwargs)
|
||||
|
||||
@property
|
||||
def raw_templates(self):
|
||||
return self._raw_templates
|
||||
|
||||
@property
|
||||
def templates(self):
|
||||
return self._templates
|
||||
|
||||
@property
|
||||
def objected_templates(self):
|
||||
return self._objected_templates
|
||||
|
||||
@classmethod
|
||||
def create_ojected_templates(cls, templates):
|
||||
if not isinstance(templates, dict):
|
||||
raise TypeError("Expected dict object, got {}".format(
|
||||
str(type(templates))
|
||||
))
|
||||
|
||||
objected_templates = copy.deepcopy(templates)
|
||||
inner_queue = collections.deque()
|
||||
inner_queue.append(objected_templates)
|
||||
while inner_queue:
|
||||
item = inner_queue.popleft()
|
||||
if not isinstance(item, dict):
|
||||
continue
|
||||
for key in tuple(item.keys()):
|
||||
value = item[key]
|
||||
if isinstance(value, six.string_types):
|
||||
item[key] = StringTemplate(value)
|
||||
elif isinstance(value, dict):
|
||||
inner_queue.append(value)
|
||||
return objected_templates
|
||||
|
||||
def _format_value(self, value, data):
|
||||
if isinstance(value, StringTemplate):
|
||||
return value.format(data)
|
||||
|
||||
if isinstance(value, dict):
|
||||
return self._solve_dict(value, data)
|
||||
return value
|
||||
|
||||
def _solve_dict(self, templates, data):
|
||||
""" Solves templates with entered data.
|
||||
|
||||
Args:
|
||||
templates (dict): All templates which will be formatted.
|
||||
data (dict): Containing keys to be filled into template.
|
||||
|
||||
Returns:
|
||||
dict: With `TemplateResult` in values containing filled or
|
||||
partially filled templates.
|
||||
"""
|
||||
output = collections.defaultdict(dict)
|
||||
for key, value in templates.items():
|
||||
output[key] = self._format_value(value, data)
|
||||
|
||||
return output
|
||||
|
||||
def format(self, in_data, only_keys=True, strict=True):
|
||||
""" Solves templates based on entered data.
|
||||
|
||||
Args:
|
||||
data (dict): Containing keys to be filled into template.
|
||||
only_keys (bool, optional): Decides if environ will be used to
|
||||
fill templates or only keys in data.
|
||||
|
||||
Returns:
|
||||
TemplatesResultDict: Output `TemplateResult` objects have the `strict`
attribute set to True, so accessing unfilled keys in templates
will raise exceptions with an explained error.
|
||||
"""
|
||||
# Create a copy of inserted data
|
||||
data = copy.deepcopy(in_data)
|
||||
|
||||
# Add environment variable to data
|
||||
if only_keys is False:
|
||||
for key, val in os.environ.items():
|
||||
env_key = "$" + key
|
||||
if env_key not in data:
|
||||
data[env_key] = val
|
||||
|
||||
solved = self._solve_dict(self.objected_templates, data)
|
||||
|
||||
output = TemplatesResultDict(solved)
|
||||
output.strict = strict
|
||||
return output
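# Hypothetical usage (the "work"/"file" template names are only an example):
#   templates = TemplatesDict({"work": {"file": "{asset}_v{version:0>3}.ma"}})
#   result = templates.format({"asset": "chr", "version": 7})
#   result["work"]["file"]   -> "chr_v007.ma"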
|
||||
|
||||
|
||||
class TemplateResult(str):
|
||||
"""Result of template format with most of information in.
|
||||
|
||||
Args:
|
||||
used_values (dict): Dictionary of template filling data with
|
||||
only used keys.
|
||||
solved (bool): For check if all required keys were filled.
|
||||
template (str): Original template.
|
||||
missing_keys (list): Missing keys that were not in the data. Includes
missing optional keys.
invalid_types (dict): When a key was found in data but its value did
not have an allowed data type. Allowed data types are `numbers`,
`str` (`basestring`) and `dict`. A dictionary may cause an invalid
type when the value of a key in data is a dictionary but the
template expects a string or number.
|
||||
"""
|
||||
used_values = None
|
||||
solved = None
|
||||
template = None
|
||||
missing_keys = None
|
||||
invalid_types = None
|
||||
|
||||
def __new__(
|
||||
cls, filled_template, template, solved,
|
||||
used_values, missing_keys, invalid_types
|
||||
):
|
||||
new_obj = super(TemplateResult, cls).__new__(cls, filled_template)
|
||||
new_obj.used_values = used_values
|
||||
new_obj.solved = solved
|
||||
new_obj.template = template
|
||||
new_obj.missing_keys = list(set(missing_keys))
|
||||
new_obj.invalid_types = invalid_types
|
||||
return new_obj
|
||||
|
||||
def validate(self):
|
||||
if not self.solved:
|
||||
raise TemplateUnsolved(
|
||||
self.template,
|
||||
self.missing_keys,
|
||||
self.invalid_types
|
||||
)
|
||||
|
||||
|
||||
class TemplatesResultDict(dict):
|
||||
"""Holds and wrap TemplateResults for easy bug report."""
|
||||
|
||||
def __init__(self, in_data, key=None, parent=None, strict=None):
|
||||
super(TemplatesResultDict, self).__init__()
|
||||
for _key, _value in in_data.items():
|
||||
if isinstance(_value, dict):
|
||||
_value = self.__class__(_value, _key, self)
|
||||
self[_key] = _value
|
||||
|
||||
self.key = key
|
||||
self.parent = parent
|
||||
self.strict = strict
|
||||
if self.parent is None and strict is None:
|
||||
self.strict = True
|
||||
|
||||
def __getitem__(self, key):
|
||||
if key not in self.keys():
|
||||
hier = self.hierarchy()
|
||||
hier.append(key)
|
||||
raise TemplateMissingKey(hier)
|
||||
|
||||
value = super(TemplatesResultDict, self).__getitem__(key)
|
||||
if isinstance(value, self.__class__):
|
||||
return value
|
||||
|
||||
# Raise exception when expected solved templates and it is not.
|
||||
if self.raise_on_unsolved and hasattr(value, "validate"):
|
||||
value.validate()
|
||||
return value
|
||||
|
||||
@property
|
||||
def raise_on_unsolved(self):
|
||||
"""To affect this change `strict` attribute."""
|
||||
if self.strict is not None:
|
||||
return self.strict
|
||||
return self.parent.raise_on_unsolved
|
||||
|
||||
def hierarchy(self):
|
||||
"""Return dictionary keys one by one to root parent."""
|
||||
if self.parent is None:
|
||||
return []
|
||||
|
||||
hier_keys = []
|
||||
par_hier = self.parent.hierarchy()
|
||||
if par_hier:
|
||||
hier_keys.extend(par_hier)
|
||||
hier_keys.append(self.key)
|
||||
|
||||
return hier_keys
|
||||
|
||||
@property
|
||||
def missing_keys(self):
|
||||
"""Return missing keys of all children templates."""
|
||||
missing_keys = set()
|
||||
for value in self.values():
|
||||
missing_keys |= value.missing_keys
|
||||
return missing_keys
|
||||
|
||||
@property
|
||||
def invalid_types(self):
|
||||
"""Return invalid types of all children templates."""
|
||||
invalid_types = {}
|
||||
for value in self.values():
|
||||
invalid_types = merge_dict(invalid_types, value.invalid_types)
|
||||
return invalid_types
|
||||
|
||||
@property
|
||||
def used_values(self):
|
||||
"""Return used values for all children templates."""
|
||||
used_values = {}
|
||||
for value in self.values():
|
||||
used_values = merge_dict(used_values, value.used_values)
|
||||
return used_values
|
||||
|
||||
def get_solved(self):
|
||||
"""Get only solved key from templates."""
|
||||
result = {}
|
||||
for key, value in self.items():
|
||||
if isinstance(value, self.__class__):
|
||||
value = value.get_solved()
|
||||
if not value:
|
||||
continue
|
||||
result[key] = value
|
||||
|
||||
elif (
|
||||
not hasattr(value, "solved") or
|
||||
value.solved
|
||||
):
|
||||
result[key] = value
|
||||
return self.__class__(result, key=self.key, parent=self.parent)
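# Illustration of the strict behaviour (key names are hypothetical):
#   result_dict = templates.format({"asset": "chr"})   # strict defaults to True
#   result_dict["no_such_key"]    # raises TemplateMissingKey
#   result_dict["work"]["file"]   # raises TemplateUnsolved if e.g. {version} is missing
#   templates.format(..., strict=False) returns the partially filled strings instead.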
|
||||
|
||||
|
||||
class TemplatePartResult:
|
||||
"""Result to store result of template parts."""
|
||||
def __init__(self, optional=False):
|
||||
# Missing keys or invalid value types of required keys
|
||||
self._missing_keys = set()
|
||||
self._invalid_types = {}
|
||||
# Missing keys or invalid value types of optional keys
|
||||
self._missing_optional_keys = set()
|
||||
self._invalid_optional_types = {}
|
||||
|
||||
# Used values stored by key with origin type
|
||||
# - key without any padding or key modifiers
|
||||
# - value from filling data
|
||||
# Example: {"version": 1}
|
||||
self._used_values = {}
|
||||
# Used values stored by key with all modifiers
|
||||
# - value is already formatted string
|
||||
# Example: {"version:0>3": "001"}
|
||||
self._realy_used_values = {}
|
||||
# Concatenated string output after formatting
|
||||
self._output = ""
|
||||
# Is this result from optional part
|
||||
self._optional = True
|
||||
|
||||
def add_output(self, other):
|
||||
if isinstance(other, six.string_types):
|
||||
self._output += other
|
||||
|
||||
elif isinstance(other, TemplatePartResult):
|
||||
self._output += other.output
|
||||
|
||||
self._missing_keys |= other.missing_keys
|
||||
self._missing_optional_keys |= other.missing_optional_keys
|
||||
|
||||
self._invalid_types.update(other.invalid_types)
|
||||
self._invalid_optional_types.update(other.invalid_optional_types)
|
||||
|
||||
if other.optional and not other.solved:
|
||||
return
|
||||
self._used_values.update(other.used_values)
|
||||
self._realy_used_values.update(other.realy_used_values)
|
||||
|
||||
else:
|
||||
raise TypeError("Cannot add data from \"{}\" to \"{}\"".format(
|
||||
str(type(other)), self.__class__.__name__)
|
||||
)
|
||||
|
||||
@property
|
||||
def solved(self):
|
||||
if self.optional:
|
||||
if (
|
||||
len(self.missing_optional_keys) > 0
|
||||
or len(self.invalid_optional_types) > 0
|
||||
):
|
||||
return False
|
||||
return (
|
||||
len(self.missing_keys) == 0
|
||||
and len(self.invalid_types) == 0
|
||||
)
|
||||
|
||||
@property
|
||||
def optional(self):
|
||||
return self._optional
|
||||
|
||||
@property
|
||||
def output(self):
|
||||
return self._output
|
||||
|
||||
@property
|
||||
def missing_keys(self):
|
||||
return self._missing_keys
|
||||
|
||||
@property
|
||||
def missing_optional_keys(self):
|
||||
return self._missing_optional_keys
|
||||
|
||||
@property
|
||||
def invalid_types(self):
|
||||
return self._invalid_types
|
||||
|
||||
@property
|
||||
def invalid_optional_types(self):
|
||||
return self._invalid_optional_types
|
||||
|
||||
@property
|
||||
def realy_used_values(self):
|
||||
return self._realy_used_values
|
||||
|
||||
@property
|
||||
def used_values(self):
|
||||
return self._used_values
|
||||
|
||||
@staticmethod
|
||||
def split_keys_to_subdicts(values):
|
||||
output = {}
|
||||
for key, value in values.items():
|
||||
key_padding = list(KEY_PADDING_PATTERN.findall(key))
|
||||
if key_padding:
|
||||
key = key_padding[0]
|
||||
key_subdict = list(SUB_DICT_PATTERN.findall(key))
|
||||
data = output
|
||||
last_key = key_subdict.pop(-1)
|
||||
for subkey in key_subdict:
|
||||
if subkey not in data:
|
||||
data[subkey] = {}
|
||||
data = data[subkey]
|
||||
data[last_key] = value
|
||||
return output
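# For example (illustrative values):
#   TemplatePartResult.split_keys_to_subdicts({"project[name]": "prj", "version": 3})
#       -> {"project": {"name": "prj"}, "version": 3}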
|
||||
|
||||
def get_clean_used_values(self):
|
||||
new_used_values = {}
|
||||
for key, value in self.used_values.items():
|
||||
if isinstance(value, FormatObject):
|
||||
value = str(value)
|
||||
new_used_values[key] = value
|
||||
|
||||
return self.split_keys_to_subdicts(new_used_values)
|
||||
|
||||
def add_realy_used_value(self, key, value):
|
||||
self._realy_used_values[key] = value
|
||||
|
||||
def add_used_value(self, key, value):
|
||||
self._used_values[key] = value
|
||||
|
||||
def add_missing_key(self, key):
|
||||
if self._optional:
|
||||
self._missing_optional_keys.add(key)
|
||||
else:
|
||||
self._missing_keys.add(key)
|
||||
|
||||
def add_invalid_type(self, key, value):
|
||||
if self._optional:
|
||||
self._invalid_optional_types[key] = type(value)
|
||||
else:
|
||||
self._invalid_types[key] = type(value)
|
||||
|
||||
|
||||
class FormatObject(object):
|
||||
"""Object that can be used for formatting.
|
||||
|
||||
This is the base class for objects that can be used as values in
'StringTemplate' formatting.
|
||||
"""
|
||||
def __init__(self):
|
||||
self.value = ""
|
||||
|
||||
def __format__(self, *args, **kwargs):
|
||||
return self.value.__format__(*args, **kwargs)
|
||||
|
||||
def __str__(self):
|
||||
return str(self.value)
|
||||
|
||||
def __repr__(self):
|
||||
return self.__str__()
|
||||
|
||||
|
||||
class FormattingPart:
|
||||
"""String with formatting template.
|
||||
|
||||
Contains only a single key to format, e.g. "{project[name]}".
|
||||
|
||||
Args:
|
||||
template(str): String containing the formatting key.
|
||||
"""
|
||||
def __init__(self, template):
|
||||
self._template = template
|
||||
|
||||
@property
|
||||
def template(self):
|
||||
return self._template
|
||||
|
||||
def __repr__(self):
|
||||
return "<Format:{}>".format(self._template)
|
||||
|
||||
def __str__(self):
|
||||
return self._template
|
||||
|
||||
@staticmethod
|
||||
def validate_value_type(value):
|
||||
"""Check if value can be used for formatting of single key."""
|
||||
if isinstance(value, (numbers.Number, FormatObject)):
|
||||
return True
|
||||
|
||||
for inh_class in type(value).mro():
|
||||
if inh_class in six.string_types:
|
||||
return True
|
||||
return False
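# e.g. numbers, strings and FormatObject instances pass, containers do not:
#   FormattingPart.validate_value_type(1)      -> True
#   FormattingPart.validate_value_type("chr")  -> True
#   FormattingPart.validate_value_type({})     -> False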
|
||||
|
||||
def format(self, data, result):
|
||||
"""Format the formattings string.
|
||||
|
||||
Args:
|
||||
data(dict): Data that should be used for formatting.
|
||||
result(TemplatePartResult): Object where result is stored.
|
||||
"""
|
||||
key = self.template[1:-1]
|
||||
if key in result.realy_used_values:
|
||||
result.add_output(result.realy_used_values[key])
|
||||
return result
|
||||
|
||||
# check if key expects subdictionary keys (e.g. project[name])
|
||||
existence_check = key
|
||||
key_padding = list(KEY_PADDING_PATTERN.findall(existence_check))
|
||||
if key_padding:
|
||||
existence_check = key_padding[0]
|
||||
key_subdict = list(SUB_DICT_PATTERN.findall(existence_check))
|
||||
|
||||
value = data
|
||||
missing_key = False
|
||||
invalid_type = False
|
||||
used_keys = []
|
||||
for sub_key in key_subdict:
|
||||
if (
|
||||
value is None
|
||||
or (hasattr(value, "items") and sub_key not in value)
|
||||
):
|
||||
missing_key = True
|
||||
used_keys.append(sub_key)
|
||||
break
|
||||
|
||||
if not hasattr(value, "items"):
|
||||
invalid_type = True
|
||||
break
|
||||
|
||||
used_keys.append(sub_key)
|
||||
value = value.get(sub_key)
|
||||
|
||||
if missing_key or invalid_type:
|
||||
if len(used_keys) == 0:
|
||||
invalid_key = key_subdict[0]
|
||||
else:
|
||||
invalid_key = used_keys[0]
|
||||
for idx, sub_key in enumerate(used_keys):
|
||||
if idx == 0:
|
||||
continue
|
||||
invalid_key += "[{0}]".format(sub_key)
|
||||
|
||||
if missing_key:
|
||||
result.add_missing_key(invalid_key)
|
||||
|
||||
elif invalid_type:
|
||||
result.add_invalid_type(invalid_key, value)
|
||||
|
||||
result.add_output(self.template)
|
||||
return result
|
||||
|
||||
if self.validate_value_type(value):
|
||||
fill_data = {}
|
||||
first_value = True
|
||||
for used_key in reversed(used_keys):
|
||||
if first_value:
|
||||
first_value = False
|
||||
fill_data[used_key] = value
|
||||
else:
|
||||
_fill_data = {used_key: fill_data}
|
||||
fill_data = _fill_data
|
||||
|
||||
formatted_value = self.template.format(**fill_data)
|
||||
result.add_realy_used_value(key, formatted_value)
|
||||
result.add_used_value(existence_check, formatted_value)
|
||||
result.add_output(formatted_value)
|
||||
return result
|
||||
|
||||
result.add_invalid_type(key, value)
|
||||
result.add_output(self.template)
|
||||
|
||||
return result
|
||||
|
||||
|
||||
class OptionalPart:
|
||||
"""Template part which contains optional formatting strings.
|
||||
|
||||
If this part can't be filled, the result is an empty string.
|
||||
|
||||
Args:
|
||||
parts(list): Parts of template. Can contain 'str', 'OptionalPart' or
|
||||
'FormattingPart'.
|
||||
"""
|
||||
def __init__(self, parts):
|
||||
self._parts = parts
|
||||
|
||||
@property
|
||||
def parts(self):
|
||||
return self._parts
|
||||
|
||||
def __str__(self):
|
||||
return "<{}>".format("".join([str(p) for p in self._parts]))
|
||||
|
||||
def __repr__(self):
|
||||
return "<Optional:{}>".format("".join([str(p) for p in self._parts]))
|
||||
|
||||
def format(self, data, result):
|
||||
new_result = TemplatePartResult(True)
|
||||
for part in self._parts:
|
||||
if isinstance(part, six.string_types):
|
||||
new_result.add_output(part)
|
||||
else:
|
||||
part.format(data, new_result)
|
||||
|
||||
if new_result.solved:
|
||||
result.add_output(new_result)
|
||||
return result
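# Illustration of how optional parts behave (example values are an assumption):
#   StringTemplate("v{version:0>3}<_{comment}>").format({"version": 2})
#       -> "v002"        (optional block is dropped, nothing is raised)
#   StringTemplate("v{version:0>3}<_{comment}>").format(
#       {"version": 2, "comment": "wip"})
#       -> "v002_wip"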
|
||||
|
|
@ -1,10 +1,11 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
import os
|
||||
import tempfile
|
||||
import time
|
||||
from datetime import datetime
|
||||
import subprocess
|
||||
import json
|
||||
import platform
|
||||
import uuid
|
||||
from Deadline.Scripting import RepositoryUtils, FileUtils
|
||||
|
||||
|
||||
|
|
@ -36,9 +37,11 @@ def inject_openpype_environment(deadlinePlugin):
|
|||
print("--- OpenPype executable: {}".format(openpype_app))
|
||||
|
||||
# tempfile.TemporaryFile cannot be used because of locking
|
||||
export_url = os.path.join(tempfile.gettempdir(),
|
||||
time.strftime('%Y%m%d%H%M%S'),
|
||||
'env.json') # add HHMMSS + delete later
|
||||
temp_file_name = "{}_{}.json".format(
|
||||
datetime.utcnow().strftime('%Y%m%d%H%M%S%f'),
|
||||
str(uuid.uuid1())
|
||||
)
|
||||
export_url = os.path.join(tempfile.gettempdir(), temp_file_name)
|
||||
print(">>> Temporary path: {}".format(export_url))
|
||||
|
||||
args = [
|
||||
|
|
|
|||
|
|
@ -97,7 +97,6 @@ class CreateFolders(BaseAction):
|
|||
all_entities = self.get_notask_children(entity)
|
||||
|
||||
anatomy = Anatomy(project_name)
|
||||
project_settings = get_project_settings(project_name)
|
||||
|
||||
work_keys = ["work", "folder"]
|
||||
work_template = anatomy.templates
|
||||
|
|
|
|||
|
|
@ -589,6 +589,12 @@
|
|||
12,
|
||||
255
|
||||
],
|
||||
"vrayscene_layer": [
|
||||
255,
|
||||
150,
|
||||
12,
|
||||
255
|
||||
],
|
||||
"yeticache": [
|
||||
99,
|
||||
206,
|
||||
|
|
|
|||
|
|
@ -75,6 +75,11 @@
|
|||
"label": "Vray Proxy:",
|
||||
"key": "vrayproxy"
|
||||
},
|
||||
{
|
||||
"type": "color",
|
||||
"label": "Vray Scene:",
|
||||
"key": "vrayscene_layer"
|
||||
},
|
||||
{
|
||||
"type": "color",
|
||||
"label": "Yeti Cache:",
|
||||
|
|
|
|||
|
|
@ -3,9 +3,11 @@ from collections import defaultdict
|
|||
|
||||
from Qt import QtWidgets, QtCore
|
||||
|
||||
# TODO: expose this better in avalon core
|
||||
from avalon.tools import lib
|
||||
from avalon.tools.models import TreeModel
|
||||
from openpype.tools.utils.models import TreeModel
|
||||
from openpype.tools.utils.lib import (
|
||||
preserve_expanded_rows,
|
||||
preserve_selection,
|
||||
)
|
||||
|
||||
from .models import (
|
||||
AssetModel,
|
||||
|
|
@ -88,8 +90,8 @@ class AssetOutliner(QtWidgets.QWidget):
|
|||
"""Add all items from the current scene"""
|
||||
|
||||
items = []
|
||||
with lib.preserve_expanded_rows(self.view):
|
||||
with lib.preserve_selection(self.view):
|
||||
with preserve_expanded_rows(self.view):
|
||||
with preserve_selection(self.view):
|
||||
self.clear()
|
||||
nodes = commands.get_all_asset_nodes()
|
||||
items = commands.create_items_from_nodes(nodes)
|
||||
|
|
@ -100,8 +102,8 @@ class AssetOutliner(QtWidgets.QWidget):
|
|||
def get_selected_assets(self):
|
||||
"""Add all selected items from the current scene"""
|
||||
|
||||
with lib.preserve_expanded_rows(self.view):
|
||||
with lib.preserve_selection(self.view):
|
||||
with preserve_expanded_rows(self.view):
|
||||
with preserve_selection(self.view):
|
||||
self.clear()
|
||||
nodes = commands.get_selected_nodes()
|
||||
items = commands.create_items_from_nodes(nodes)
|
||||
|
|
|
|||
|
|
@ -8,7 +8,7 @@ from avalon import api, io, style, schema
|
|||
from avalon.vendor import qtawesome
|
||||
|
||||
from avalon.lib import HeroVersionType
|
||||
from avalon.tools.models import TreeModel, Item
|
||||
from openpype.tools.utils.models import TreeModel, Item
|
||||
|
||||
from .lib import (
|
||||
get_site_icons,
|
||||
|
|
|
|||
|
|
@ -7,9 +7,13 @@ from Qt import QtWidgets, QtCore
|
|||
from avalon import io, api, style
|
||||
from avalon.vendor import qtawesome
|
||||
from avalon.lib import HeroVersionType
|
||||
from avalon.tools import lib as tools_lib
|
||||
|
||||
from openpype.modules import ModulesManager
|
||||
from openpype.tools.utils.lib import (
|
||||
get_progress_for_repre,
|
||||
iter_model_rows,
|
||||
format_version
|
||||
)
|
||||
|
||||
from .switch_dialog import SwitchAssetDialog
|
||||
from .model import InventoryModel
|
||||
|
|
@ -373,7 +377,7 @@ class SceneInvetoryView(QtWidgets.QTreeView):
|
|||
if not repre_doc:
|
||||
continue
|
||||
|
||||
progress = tools_lib.get_progress_for_repre(
|
||||
progress = get_progress_for_repre(
|
||||
repre_doc,
|
||||
active_site,
|
||||
remote_site
|
||||
|
|
@ -544,7 +548,7 @@ class SceneInvetoryView(QtWidgets.QTreeView):
|
|||
"toggle": selection_model.Toggle,
|
||||
}[options.get("mode", "select")]
|
||||
|
||||
for item in tools_lib.iter_model_rows(model, 0):
|
||||
for item in iter_model_rows(model, 0):
|
||||
item = item.data(InventoryModel.ItemRole)
|
||||
if item.get("isGroupNode"):
|
||||
continue
|
||||
|
|
@ -704,7 +708,7 @@ class SceneInvetoryView(QtWidgets.QTreeView):
|
|||
labels = []
|
||||
for version in all_versions:
|
||||
is_hero = version["type"] == "hero_version"
|
||||
label = tools_lib.format_version(version["name"], is_hero)
|
||||
label = format_version(version["name"], is_hero)
|
||||
labels.append(label)
|
||||
versions_by_label[label] = version["name"]
|
||||
|
||||
|
|
|
|||
|
|
@ -3,10 +3,10 @@ import sys
|
|||
|
||||
import openpype
|
||||
import pyblish.api
|
||||
from openpype.tools.utils.host_tools import show_publish
|
||||
|
||||
|
||||
def main(env):
|
||||
from avalon.tools import publish
|
||||
# Registers pype's Global pyblish plugins
|
||||
openpype.install()
|
||||
|
||||
|
|
@ -19,7 +19,7 @@ def main(env):
|
|||
continue
|
||||
pyblish.api.register_plugin_path(path)
|
||||
|
||||
return publish.show()
|
||||
return show_publish()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
|
|
|
|||
|
|
@ -15,6 +15,7 @@ from .lib import (
|
|||
get_warning_pixmap,
|
||||
set_style_property,
|
||||
DynamicQThread,
|
||||
qt_app_context,
|
||||
)
|
||||
|
||||
from .models import (
|
||||
|
|
@ -39,6 +40,7 @@ __all__ = (
|
|||
"get_warning_pixmap",
|
||||
"set_style_property",
|
||||
"DynamicQThread",
|
||||
"qt_app_context",
|
||||
|
||||
"RecursiveSortFilterProxyModel",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -3,8 +3,9 @@
|
|||
It is possible to create `HostToolsHelper` in host implementation or
|
||||
use singleton approach with global functions (using helper anyway).
|
||||
"""
|
||||
|
||||
import os
|
||||
import avalon.api
|
||||
import pyblish.api
|
||||
from .lib import qt_app_context
|
||||
|
||||
|
||||
|
|
@ -196,10 +197,29 @@ class HostToolsHelper:
|
|||
library_loader_tool.refresh()
|
||||
|
||||
def show_publish(self, parent=None):
|
||||
"""Publish UI."""
|
||||
from avalon.tools import publish
|
||||
"""Try showing the most desirable publish GUI
|
||||
|
||||
publish.show(parent)
|
||||
This function cycles through the currently registered
|
||||
graphical user interfaces, if any, and presents it to
|
||||
the user.
|
||||
"""
|
||||
|
||||
pyblish_show = self._discover_pyblish_gui()
|
||||
return pyblish_show(parent)
|
||||
|
||||
def _discover_pyblish_gui(self):
|
||||
"""Return the most desirable of the currently registered GUIs"""
|
||||
# Prefer last registered
|
||||
guis = list(reversed(pyblish.api.registered_guis()))
|
||||
for gui in guis:
|
||||
try:
|
||||
gui = __import__(gui).show
|
||||
except (ImportError, AttributeError):
|
||||
continue
|
||||
else:
|
||||
return gui
|
||||
|
||||
raise ImportError("No Pyblish GUI found")
|
||||
|
||||
def get_look_assigner_tool(self, parent):
|
||||
"""Create, cache and return look assigner tool window."""
|
||||
|
|
@ -394,3 +414,11 @@ def show_publish(parent=None):
|
|||
|
||||
def show_experimental_tools_dialog(parent=None):
|
||||
_SingletonPoint.show_tool_by_name("experimental_tools", parent)
|
||||
|
||||
|
||||
def get_pyblish_icon():
|
||||
pyblish_dir = os.path.abspath(os.path.dirname(pyblish.api.__file__))
|
||||
icon_path = os.path.join(pyblish_dir, "icons", "logo-32x32.svg")
|
||||
if os.path.exists(icon_path):
|
||||
return icon_path
|
||||
return None
|
||||
|
|
|
|||
|
|
@ -1,11 +1,11 @@
|
|||
import os
|
||||
import logging
|
||||
|
||||
from Qt import QtCore, QtGui
|
||||
from Qt import QtCore
|
||||
|
||||
from avalon import style
|
||||
from avalon.vendor import qtawesome
|
||||
from avalon.tools.models import TreeModel, Item
|
||||
from openpype.tools.utils.models import TreeModel, Item
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
|
|
|
|||
|
|
@ -1,3 +1,3 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Package declaring Pype version."""
|
||||
__version__ = "3.9.0-nightly.4"
|
||||
__version__ = "3.9.0-nightly.5"
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
[tool.poetry]
|
||||
name = "OpenPype"
|
||||
version = "3.9.0-nightly.4" # OpenPype
|
||||
version = "3.9.0-nightly.5" # OpenPype
|
||||
description = "Open VFX and Animation pipeline with support."
|
||||
authors = ["OpenPype Team <info@openpype.io>"]
|
||||
license = "MIT License"
|
||||
|
|
|
|||