Mirror of https://github.com/ynput/ayon-core.git
Synced 2025-12-25 05:14:40 +01:00

OP-2766 - merge develop

Commit: c86643dcce
166 changed files with 2611 additions and 1368 deletions

CHANGELOG.md | 43
@@ -1,11 +1,43 @@
 # Changelog
 
-## [3.9.1-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)
+## [3.9.2-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)
 
-[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.0...HEAD)
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.1...HEAD)
+
+**🚀 Enhancements**
+
+- CI: change the version bump logic [\#2919](https://github.com/pypeclub/OpenPype/pull/2919)
+- Deadline: Add headless argument [\#2916](https://github.com/pypeclub/OpenPype/pull/2916)
+- Ftrack: Fill workfile in custom attribute [\#2906](https://github.com/pypeclub/OpenPype/pull/2906)
+- Settings UI: Add simple tooltips for settings entities [\#2901](https://github.com/pypeclub/OpenPype/pull/2901)
+
+**🐛 Bug fixes**
+
+- Ftrack: Missing Ftrack id after editorial publish [\#2905](https://github.com/pypeclub/OpenPype/pull/2905)
+- AfterEffects: Fix rendering for single frame in DL [\#2875](https://github.com/pypeclub/OpenPype/pull/2875)
+
+**🔀 Refactored code**
+
+- General: Move formatting and workfile functions [\#2914](https://github.com/pypeclub/OpenPype/pull/2914)
+
+## [3.9.1](https://github.com/pypeclub/OpenPype/tree/3.9.1) (2022-03-18)
+
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.1-nightly.3...3.9.1)
+
+**🚀 Enhancements**
+
+- General: Change how OPENPYPE\_DEBUG value is handled [\#2907](https://github.com/pypeclub/OpenPype/pull/2907)
+- nuke: imageio adding ocio config version 1.2 [\#2897](https://github.com/pypeclub/OpenPype/pull/2897)
+- Flame: support for comment with xml attribute overrides [\#2892](https://github.com/pypeclub/OpenPype/pull/2892)
+- Nuke: ExtractReviewSlate can handle more codes and profiles [\#2879](https://github.com/pypeclub/OpenPype/pull/2879)
+- Flame: sequence used for reference video [\#2869](https://github.com/pypeclub/OpenPype/pull/2869)
+
+**🐛 Bug fixes**
+
+- General: Fix use of Anatomy roots [\#2904](https://github.com/pypeclub/OpenPype/pull/2904)
+- Fixing gap detection in extract review [\#2902](https://github.com/pypeclub/OpenPype/pull/2902)
+- Pyblish Pype - ensure current state is correct when entering new group order [\#2899](https://github.com/pypeclub/OpenPype/pull/2899)
+- SceneInventory: Fix import of load function [\#2894](https://github.com/pypeclub/OpenPype/pull/2894)
+- Harmony - fixed creator issue [\#2891](https://github.com/pypeclub/OpenPype/pull/2891)
+- General: Remove forgotten use of avalon Creator [\#2885](https://github.com/pypeclub/OpenPype/pull/2885)
+- General: Avoid circular import [\#2884](https://github.com/pypeclub/OpenPype/pull/2884)

@@ -14,6 +46,7 @@
 **🔀 Refactored code**
 
+- General: Reduce style usage to OpenPype repository [\#2889](https://github.com/pypeclub/OpenPype/pull/2889)
 - General: Move loader logic from avalon to openpype [\#2886](https://github.com/pypeclub/OpenPype/pull/2886)
 
 ## [3.9.0](https://github.com/pypeclub/OpenPype/tree/3.9.0) (2022-03-14)

@@ -23,6 +56,10 @@
 - AssetCreator: Remove the tool [\#2845](https://github.com/pypeclub/OpenPype/pull/2845)
 
+### 📖 Documentation
+
+- Documentation: Change Photoshop & AfterEffects plugin path [\#2878](https://github.com/pypeclub/OpenPype/pull/2878)
+
 **🚀 Enhancements**
 
 - General: Subset name filtering in ExtractReview outpus [\#2872](https://github.com/pypeclub/OpenPype/pull/2872)

@@ -57,6 +94,7 @@
 - Maya: Stop creation of reviews for Cryptomattes [\#2832](https://github.com/pypeclub/OpenPype/pull/2832)
 - Deadline: Remove recreated event [\#2828](https://github.com/pypeclub/OpenPype/pull/2828)
 - Deadline: Added missing events folder [\#2827](https://github.com/pypeclub/OpenPype/pull/2827)
 - Maya: Deformer node ids validation plugin [\#2826](https://github.com/pypeclub/OpenPype/pull/2826)
 - Settings: Missing document with OP versions may break start of OpenPype [\#2825](https://github.com/pypeclub/OpenPype/pull/2825)
 - Deadline: more detailed temp file name for environment json [\#2824](https://github.com/pypeclub/OpenPype/pull/2824)
 - General: Host name was formed from obsolete code [\#2821](https://github.com/pypeclub/OpenPype/pull/2821)

@@ -74,7 +112,6 @@
 - General: Move change context functions [\#2839](https://github.com/pypeclub/OpenPype/pull/2839)
 - Tools: Don't use avalon tools code [\#2829](https://github.com/pypeclub/OpenPype/pull/2829)
 - Move Unreal Implementation to OpenPype [\#2823](https://github.com/pypeclub/OpenPype/pull/2823)
 - General: Extract template formatting from anatomy [\#2766](https://github.com/pypeclub/OpenPype/pull/2766)
 
 ## [3.8.2](https://github.com/pypeclub/OpenPype/tree/3.8.2) (2022-02-07)
@@ -78,6 +78,7 @@ def install():
     from openpype.pipeline import (
         LegacyCreator,
         register_loader_plugin_path,
+        register_inventory_action,
     )
     from avalon import pipeline

@@ -124,7 +125,7 @@ def install():
     pyblish.register_plugin_path(path)
     register_loader_plugin_path(path)
     avalon.register_plugin_path(LegacyCreator, path)
-    avalon.register_plugin_path(avalon.InventoryAction, path)
+    register_inventory_action(path)
 
     # apply monkey patched discover to original one
     log.info("Patching discovery")
@@ -101,7 +101,7 @@ def eventserver(debug,
     on linux and window service).
     """
     if debug:
-        os.environ['OPENPYPE_DEBUG'] = "3"
+        os.environ["OPENPYPE_DEBUG"] = "1"
 
     PypeCommands().launch_eventservercli(
         ftrack_url,

@@ -128,7 +128,7 @@ def webpublisherwebserver(debug, executable, upload_dir, host=None, port=None):
     Expect "pype.club" user created on Ftrack.
     """
     if debug:
-        os.environ['OPENPYPE_DEBUG'] = "3"
+        os.environ["OPENPYPE_DEBUG"] = "1"
 
     PypeCommands().launch_webpublisher_webservercli(
         upload_dir=upload_dir,

@@ -176,7 +176,7 @@ def publish(debug, paths, targets, gui):
     More than one path is allowed.
     """
     if debug:
-        os.environ['OPENPYPE_DEBUG'] = '3'
+        os.environ["OPENPYPE_DEBUG"] = "1"
     PypeCommands.publish(list(paths), targets, gui)

@@ -195,7 +195,7 @@ def remotepublishfromapp(debug, project, path, host, user=None, targets=None):
     More than one path is allowed.
     """
     if debug:
-        os.environ['OPENPYPE_DEBUG'] = '3'
+        os.environ["OPENPYPE_DEBUG"] = "1"
     PypeCommands.remotepublishfromapp(
         project, path, host, user, targets=targets
     )

@@ -215,7 +215,7 @@ def remotepublish(debug, project, path, user=None, targets=None):
     More than one path is allowed.
     """
     if debug:
-        os.environ['OPENPYPE_DEBUG'] = '3'
+        os.environ["OPENPYPE_DEBUG"] = "1"
     PypeCommands.remotepublish(project, path, user, targets=targets)

@@ -240,7 +240,7 @@ def texturecopy(debug, project, asset, path):
     Nothing is written to database.
     """
     if debug:
-        os.environ['OPENPYPE_DEBUG'] = '3'
+        os.environ["OPENPYPE_DEBUG"] = "1"
     PypeCommands().texture_copy(project, asset, path)

@@ -409,7 +409,7 @@ def syncserver(debug, active_site):
     var OPENPYPE_LOCAL_ID set to 'active_site'.
     """
     if debug:
-        os.environ['OPENPYPE_DEBUG'] = '3'
+        os.environ["OPENPYPE_DEBUG"] = "1"
     PypeCommands().syncserver(active_site)
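The hunks above repeat one change seven times: when the CLI's `--debug` flag is passed, `OPENPYPE_DEBUG` is now set to `"1"` instead of the old `"3"` (matching PR \#2907's change to how the value is handled). A minimal sketch of that repeated pattern, using a hypothetical `apply_debug_flag` helper (not part of the codebase) with an injectable environment dict so it can be exercised without touching `os.environ`:

```python
def apply_debug_flag(debug, env):
    """Set OPENPYPE_DEBUG to "1" when the debug flag is on.

    `env` is any mutable mapping (e.g. os.environ or a plain dict);
    the commit standardizes the value "1" where "3" was used before.
    """
    if debug:
        env["OPENPYPE_DEBUG"] = "1"
    return env
```

In the real CLI each click command performs this assignment inline before delegating to `PypeCommands`.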

@@ -5,7 +5,7 @@ from Qt import QtWidgets
 
 import pyblish.api
 import avalon.api
-from avalon import io, pipeline
+from avalon import io
 
 from openpype import lib
 from openpype.api import Logger

@@ -13,6 +13,7 @@ from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
     deregister_loader_plugin_path,
+    AVALON_CONTAINER_ID,
 )
 import openpype.hosts.aftereffects
 from openpype.pipeline import BaseCreator

@@ -30,7 +31,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
 PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
 LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
 CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
-INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
 
 
 def install():

@@ -96,7 +96,6 @@ def get_asset_settings():
     }
 
-
 # loaded containers section
 def ls():
     """Yields containers from active AfterEffects document.

@@ -194,7 +193,7 @@ def containerise(name,
     """
     data = {
         "schema": "openpype:container-2.0",
-        "id": pipeline.AVALON_CONTAINER_ID,
+        "id": AVALON_CONTAINER_ID,
         "name": name,
         "namespace": namespace,
         "loader": str(loader),
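The `containerise` hunk only swaps where the id constant comes from (`openpype.pipeline` instead of `avalon.pipeline`); the metadata dict itself is unchanged. A sketch of the dict's shape as far as the diff shows it (the diff is truncated after `"loader"`, so remaining keys are omitted here; the constant's value is an assumption, since the real one is imported):

```python
# assumed value; the real constant is imported from openpype.pipeline
AVALON_CONTAINER_ID = "pyblish.avalon.container"


def containerise_data(name, namespace, loader):
    """Build the container metadata dict the way containerise() does,
    with the id constant now sourced from openpype.pipeline."""
    return {
        "schema": "openpype:container-2.0",
        "id": AVALON_CONTAINER_ID,
        "name": name,
        "namespace": namespace,
        "loader": str(loader),
    }
```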

@@ -1,8 +1,8 @@
 """Host API required Work Files tool"""
 import os
 
+from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
 from .launch_logic import get_stub
-from avalon import api
 
 
 def _active_document():

@@ -14,7 +14,7 @@ def _active_document():
 
 
 def file_extensions():
-    return api.HOST_WORKFILE_EXTENSIONS["aftereffects"]
+    return HOST_WORKFILE_EXTENSIONS["aftereffects"]
 
 
 def has_unsaved_changes():
@@ -328,7 +328,6 @@ class LaunchWorkFiles(LaunchQtApp):
         result = super().execute(context)
         self._window.set_context({
             "asset": avalon.api.Session["AVALON_ASSET"],
-            "silo": avalon.api.Session["AVALON_SILO"],
             "task": avalon.api.Session["AVALON_TASK"]
         })
         return result
@@ -12,12 +12,12 @@ from . import ops
 import pyblish.api
 import avalon.api
 from avalon import io, schema
-from avalon.pipeline import AVALON_CONTAINER_ID
 
 from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
     deregister_loader_plugin_path,
+    AVALON_CONTAINER_ID,
 )
 from openpype.api import Logger
 from openpype.lib import (

@@ -31,7 +31,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
 PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
 LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
 CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
-INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
 
 ORIGINAL_EXCEPTHOOK = sys.excepthook
@@ -4,7 +4,8 @@ from pathlib import Path
 from typing import List, Optional
 
 import bpy
 from avalon import api
+
+from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
 
 
 class OpenFileCacher:

@@ -77,7 +78,7 @@ def has_unsaved_changes() -> bool:
 def file_extensions() -> List[str]:
     """Return the supported file extensions for Blender scene files."""
-    return api.HOST_WORKFILE_EXTENSIONS["blender"]
+    return HOST_WORKFILE_EXTENSIONS["blender"]
 
 
 def work_root(session: dict) -> str:
@@ -6,11 +6,14 @@ from typing import Dict, List, Optional
 
 import bpy
 
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID,
+)
 
 from openpype.hosts.blender.api.pipeline import (
     AVALON_CONTAINERS,
     AVALON_PROPERTY,
-    AVALON_CONTAINER_ID
 )
 from openpype.hosts.blender.api import plugin, lib
@@ -6,12 +6,14 @@ from typing import Dict, List, Optional
 
 import bpy
 
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID,
+)
 from openpype.hosts.blender.api import plugin
 from openpype.hosts.blender.api.pipeline import (
     AVALON_CONTAINERS,
     AVALON_PROPERTY,
-    AVALON_CONTAINER_ID
 )
@@ -7,12 +7,14 @@ from typing import Dict, List, Optional
 
 import bpy
 
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID,
+)
 from openpype.hosts.blender.api import plugin
 from openpype.hosts.blender.api.pipeline import (
     AVALON_CONTAINERS,
     AVALON_PROPERTY,
-    AVALON_CONTAINER_ID
 )
 
 logger = logging.getLogger("openpype").getChild(
@@ -6,12 +6,14 @@ from typing import Dict, List, Optional
 
 import bpy
 
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID,
+)
 from openpype.hosts.blender.api import plugin, lib
 from openpype.hosts.blender.api.pipeline import (
     AVALON_CONTAINERS,
     AVALON_PROPERTY,
-    AVALON_CONTAINER_ID
 )
@@ -6,12 +6,14 @@ from typing import Dict, List, Optional
 
 import bpy
 
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID,
+)
 from openpype.hosts.blender.api import plugin, lib
 from openpype.hosts.blender.api.pipeline import (
     AVALON_CONTAINERS,
     AVALON_PROPERTY,
-    AVALON_CONTAINER_ID
 )
@@ -10,12 +10,12 @@ from openpype import lib
 from openpype.pipeline import (
     legacy_create,
     get_representation_path,
+    AVALON_CONTAINER_ID,
 )
 from openpype.hosts.blender.api import plugin
 from openpype.hosts.blender.api.pipeline import (
     AVALON_CONTAINERS,
     AVALON_PROPERTY,
-    AVALON_CONTAINER_ID
 )
@@ -13,12 +13,12 @@ from openpype.pipeline import (
     load_container,
     get_representation_path,
     loaders_from_representation,
+    AVALON_CONTAINER_ID,
 )
 from openpype.hosts.blender.api.pipeline import (
     AVALON_INSTANCES,
     AVALON_CONTAINERS,
     AVALON_PROPERTY,
-    AVALON_CONTAINER_ID
 )
 from openpype.hosts.blender.api import plugin
@@ -6,12 +6,14 @@ from typing import Dict, List, Optional
 
 import bpy
 
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID,
+)
 from openpype.hosts.blender.api import plugin
 from openpype.hosts.blender.api.pipeline import (
     AVALON_CONTAINERS,
     AVALON_PROPERTY,
-    AVALON_CONTAINER_ID
 )
@@ -10,6 +10,7 @@ from openpype import lib
 from openpype.pipeline import (
     legacy_create,
     get_representation_path,
+    AVALON_CONTAINER_ID,
 )
 from openpype.hosts.blender.api import (
     plugin,

@@ -18,7 +19,6 @@ from openpype.hosts.blender.api import (
 from openpype.hosts.blender.api.pipeline import (
     AVALON_CONTAINERS,
     AVALON_PROPERTY,
-    AVALON_CONTAINER_ID
 )
@@ -1,6 +1,8 @@
 import os
 import json
+
+from bson.objectid import ObjectId
 
 import bpy
 import bpy_extras
 import bpy_extras.anim_utils

@@ -140,7 +142,7 @@ class ExtractLayout(openpype.api.Extractor):
             blend = io.find_one(
                 {
                     "type": "representation",
-                    "parent": io.ObjectId(parent),
+                    "parent": ObjectId(parent),
                     "name": "blend"
                 },
                 projection={"_id": True})

@@ -151,7 +153,7 @@ class ExtractLayout(openpype.api.Extractor):
             fbx = io.find_one(
                 {
                     "type": "representation",
-                    "parent": io.ObjectId(parent),
+                    "parent": ObjectId(parent),
                     "name": "fbx"
                 },
                 projection={"_id": True})

@@ -162,7 +164,7 @@ class ExtractLayout(openpype.api.Extractor):
             abc = io.find_one(
                 {
                     "type": "representation",
-                    "parent": io.ObjectId(parent),
+                    "parent": ObjectId(parent),
                     "name": "abc"
                 },
                 projection={"_id": True})
@@ -68,7 +68,8 @@ from .workio import (
 )
 from .render_utils import (
     export_clip,
-    get_preset_path_by_xml_name
+    get_preset_path_by_xml_name,
+    modify_preset_file
 )
 
 __all__ = [

@@ -140,5 +141,6 @@ __all__ = [
 
     # render utils
     "export_clip",
-    "get_preset_path_by_xml_name"
+    "get_preset_path_by_xml_name",
+    "modify_preset_file"
 ]
@@ -4,13 +4,14 @@ Basic avalon integration
 import os
 import contextlib
 from avalon import api as avalon
-from avalon.pipeline import AVALON_CONTAINER_ID
 from pyblish import api as pyblish
 
 from openpype.api import Logger
 from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
     deregister_loader_plugin_path,
+    AVALON_CONTAINER_ID,
 )
 from .lib import (
     set_segment_data_marker,

@@ -26,7 +27,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
 PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
 LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
 CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
-INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
 
 AVALON_CONTAINERS = "AVALON_CONTAINERS"

@@ -34,12 +34,10 @@ log = Logger.get_logger(__name__)
 
 
 def install():
-
     pyblish.register_host("flame")
     pyblish.register_plugin_path(PUBLISH_PATH)
     register_loader_plugin_path(LOAD_PATH)
     avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
-    avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
     log.info("OpenPype Flame plug-ins registered ...")
 
     # register callback for switching publishable

@@ -47,6 +45,7 @@ def install():
 
     log.info("OpenPype Flame host installed ...")
 
+
 def uninstall():
     pyblish.deregister_host("flame")

@@ -54,7 +53,6 @@ def uninstall():
     pyblish.deregister_plugin_path(PUBLISH_PATH)
     deregister_loader_plugin_path(LOAD_PATH)
     avalon.deregister_plugin_path(LegacyCreator, CREATE_PATH)
-    avalon.deregister_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
 
     # register callback for switching publishable
     pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled)
@@ -1,4 +1,5 @@
 import os
+from xml.etree import ElementTree as ET
 
 
 def export_clip(export_path, clip, preset_path, **kwargs):

@@ -123,3 +124,29 @@ def get_preset_path_by_xml_name(xml_preset_name):
 
     # if nothing found then return False
     return False
+
+
+def modify_preset_file(xml_path, staging_dir, data):
+    """Modify xml preset with input data
+
+    Args:
+        xml_path (str): path for input xml preset
+        staging_dir (str): staging dir path
+        data (dict): data where key is xmlTag and value as string
+
+    Returns:
+        str: path to the modified xml preset copy
+    """
+    # create temp path
+    dirname, basename = os.path.split(xml_path)
+    temp_path = os.path.join(staging_dir, basename)
+
+    # change xml following data keys
+    with open(xml_path, "r") as datafile:
+        tree = ET.parse(datafile)
+        for key, value in data.items():
+            for element in tree.findall(".//{}".format(key)):
+                element.text = str(value)
+        tree.write(temp_path)
+
+    return temp_path
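The new `modify_preset_file` above parses the XML preset, overwrites the text of every element whose tag matches a key in `data`, and writes the result into the staging directory. The core transformation can be sketched without the filesystem round-trip — `modify_preset_text` below is a hypothetical string-based variant for illustration, not a function from the repository:

```python
from xml.etree import ElementTree as ET


def modify_preset_text(xml_text, data):
    """Replace the text of every element whose tag matches a key in data,
    mirroring the tree-walk done by modify_preset_file."""
    root = ET.fromstring(xml_text)
    for key, value in data.items():
        # ".//tag" matches the tag anywhere below the root
        for element in root.findall(".//{}".format(key)):
            element.text = str(value)
    return ET.tostring(root, encoding="unicode")
```

Called with `{"nbHandles": 10, "startFrame": 1001}` on a preset containing those tags, it rewrites both values in place, which is exactly how the extractor injects handle and start-frame overrides into a stock preset.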

@@ -420,13 +420,20 @@ class WireTapCom(object):
             RuntimeError: Not able to set colorspace policy
         """
+        color_policy = color_policy or "Legacy"
+
+        # check if the color policy is in a custom dir
+        if not os.path.exists(color_policy):
+            color_policy = "/syncolor/policies/Autodesk/{}".format(
+                color_policy)
+
         # create arguments
         project_colorspace_cmd = [
             os.path.join(
                 self.wiretap_tools_dir,
                 "wiretap_duplicate_node"
             ),
             "-s",
-            "/syncolor/policies/Autodesk/{}".format(color_policy),
+            color_policy,
             "-n",
             "/projects/{}/syncolor".format(project_name)
         ]
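The WireTap change lets `color_policy` be either a full filesystem path to a custom policy or a bare name: it defaults to `"Legacy"`, and only prefixes the stock Autodesk directory when the value is not an existing path. A small sketch of that resolution logic (`resolve_color_policy` is a hypothetical helper; the `exists` parameter stands in for `os.path.exists` so the behaviour can be shown without a real filesystem):

```python
def resolve_color_policy(color_policy, exists):
    """Resolve a colorspace policy argument the way the hunk above does:
    default to "Legacy", keep existing custom paths, otherwise prefix the
    stock Autodesk policies directory."""
    color_policy = color_policy or "Legacy"
    if not exists(color_policy):
        color_policy = "/syncolor/policies/Autodesk/{}".format(color_policy)
    return color_policy
```

The resolved value is then passed directly as the `-s` source argument to `wiretap_duplicate_node`.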

@@ -73,7 +73,7 @@ class FlamePrelaunch(PreLaunchHook):
             "FrameWidth": int(width),
             "FrameHeight": int(height),
             "AspectRatio": float((width / height) * _db_p_data["pixelAspect"]),
-            "FrameRate": "{} fps".format(fps),
+            "FrameRate": self._get_flame_fps(fps),
             "FrameDepth": str(imageio_flame["project"]["frameDepth"]),
             "FieldDominance": str(imageio_flame["project"]["fieldDominance"])
         }

@@ -101,6 +101,28 @@ class FlamePrelaunch(PreLaunchHook):
 
         self.launch_context.launch_args.extend(app_arguments)
 
+    def _get_flame_fps(self, fps_num):
+        fps_table = {
+            float(23.976): "23.976 fps",
+            int(25): "25 fps",
+            int(24): "24 fps",
+            float(29.97): "29.97 fps DF",
+            int(30): "30 fps",
+            int(50): "50 fps",
+            float(59.94): "59.94 fps DF",
+            int(60): "60 fps"
+        }
+
+        match_key = min(fps_table.keys(), key=lambda x: abs(x - fps_num))
+
+        try:
+            return fps_table[match_key]
+        except KeyError as msg:
+            raise KeyError((
+                "Missing FPS key in conversion table. "
+                "Following keys are available: {}".format(fps_table.keys())
+            )) from msg
+
     def _add_pythonpath(self):
         pythonpath = self.launch_context.env.get("PYTHONPATH")
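Instead of blindly formatting the numeric fps into a string, the project setup now snaps the value to the nearest entry in a table of fps strings Flame accepts (note the drop-frame variants for 29.97 and 59.94). A standalone sketch of the same nearest-key lookup, outside the hook class:

```python
def get_flame_fps(fps_num):
    """Map a numeric fps to the closest Flame fps string, as
    _get_flame_fps above does via a nearest-key lookup."""
    fps_table = {
        23.976: "23.976 fps",
        24: "24 fps",
        25: "25 fps",
        29.97: "29.97 fps DF",
        30: "30 fps",
        50: "50 fps",
        59.94: "59.94 fps DF",
        60: "60 fps",
    }
    # pick the table key with the smallest absolute distance to fps_num
    match_key = min(fps_table, key=lambda key: abs(key - fps_num))
    return fps_table[match_key]
```

This makes the hook tolerant of slightly off database values (e.g. `23.98` still resolves to `"23.976 fps"`).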

@@ -1,3 +1,4 @@
+import re
 import pyblish
 import openpype
 import openpype.hosts.flame.api as opfapi

@@ -6,6 +7,10 @@ from openpype.hosts.flame.otio import flame_export
 # # developer reload modules
 from pprint import pformat
 
+# constants
+NUM_PATERN = re.compile(r"([0-9\.]+)")
+TXT_PATERN = re.compile(r"([a-zA-Z]+)")
+
 
 class CollectTimelineInstances(pyblish.api.ContextPlugin):
     """Collect all Timeline segment selection."""

@@ -16,6 +21,16 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
 
     audio_track_items = []
 
+    # TODO: add to settings
+    xml_preset_attrs_from_comments = {
+        "width": "number",
+        "height": "number",
+        "pixelRatio": "float",
+        "resizeType": "string",
+        "resizeFilter": "string"
+    }
+
     def process(self, context):
         project = context.data["flameProject"]
         sequence = context.data["flameSequence"]

@@ -26,6 +41,10 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
         # process all selected
         with opfapi.maintained_segment_selection(sequence) as segments:
             for segment in segments:
+                comment_attributes = self._get_comment_attributes(segment)
+                self.log.debug("_ comment_attributes: {}".format(
+                    pformat(comment_attributes)))
+
                 clip_data = opfapi.get_segment_attributes(segment)
                 clip_name = clip_data["segment_name"]
                 self.log.debug("clip_name: {}".format(clip_name))

@@ -101,6 +120,9 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
         # add resolution
         self._get_resolution_to_data(inst_data, context)
 
+        # add comment attributes if any
+        inst_data.update(comment_attributes)
+
         # create instance
         instance = context.create_instance(**inst_data)

@@ -126,6 +148,94 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
         if marker_data.get("reviewTrack") is not None:
             instance.data["reviewAudio"] = True
 
+    def _get_comment_attributes(self, segment):
+        comment = segment.comment.get_value()
+
+        # try to find attributes
+        attributes = {
+            "xml_overrides": {
+                "pixelRatio": 1.00}
+        }
+        # search for `:`
+        for split in self._split_comments(comment):
+            # make sure we ignore if not `:` in key
+            if ":" not in split:
+                continue
+
+            self._get_xml_preset_attrs(
+                attributes, split)
+
+        # add xml overrides resolution to instance data
+        xml_overrides = attributes["xml_overrides"]
+        if xml_overrides.get("width"):
+            attributes.update({
+                "resolutionWidth": xml_overrides["width"],
+                "resolutionHeight": xml_overrides["height"],
+                "pixelAspect": xml_overrides["pixelRatio"]
+            })
+
+        return attributes
+
+    def _get_xml_preset_attrs(self, attributes, split):
+
+        # split to key and value
+        key, value = split.split(":")
+
+        for a_name, a_type in self.xml_preset_attrs_from_comments.items():
+            # exclude all not related attributes
+            if a_name.lower() not in key.lower():
+                continue
+
+            # get pattern defined by type
+            pattern = TXT_PATERN
+            if a_type in ("number", "float"):
+                pattern = NUM_PATERN
+
+            res_goup = pattern.findall(value)
+
+            # raise if nothing is found as it is not correctly defined
+            if not res_goup:
+                raise ValueError((
+                    "Value for `{}` attribute is not "
+                    "set correctly: `{}`").format(a_name, split))
+
+            if "string" in a_type:
+                _value = res_goup[0]
+            if "float" in a_type:
+                _value = float(res_goup[0])
+            if "number" in a_type:
+                _value = int(res_goup[0])
+
+            attributes["xml_overrides"][a_name] = _value
+
+        # condition for resolution in key
+        if "resolution" in key.lower():
+            res_goup = NUM_PATERN.findall(value)
+            # check if aspect was also defined
+            # 1920x1080x1.5
+            aspect = res_goup[2] if len(res_goup) > 2 else 1
+
+            width = int(res_goup[0])
+            height = int(res_goup[1])
+            pixel_ratio = float(aspect)
+            attributes["xml_overrides"].update({
+                "width": width,
+                "height": height,
+                "pixelRatio": pixel_ratio
+            })
+
+    def _split_comments(self, comment_string):
+        # first split comment by comma
+        split_comments = []
+        if "," in comment_string:
+            split_comments.extend(comment_string.split(","))
+        elif ";" in comment_string:
+            split_comments.extend(comment_string.split(";"))
+        else:
+            split_comments.append(comment_string)
+
+        return split_comments
+
     def _get_head_tail(self, clip_data, first_frame):
         # calculate head and tail with forward compatibility
         head = clip_data.get("segment_head")
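The collector above (PR \#2892) parses `key: value` pairs out of a segment's comment string; a `resolution` key may carry an optional pixel-aspect third component, e.g. `resolution: 1920x1080x1.5`. A sketch of just that resolution-value parsing, using the same numeric regex (`parse_resolution_comment` is a hypothetical standalone helper, not a method from the plugin):

```python
import re

# same numeric pattern as the collector's NUM_PATERN constant
NUM_PATTERN = re.compile(r"([0-9\.]+)")


def parse_resolution_comment(value):
    """Parse "1920x1080" or "1920x1080x1.5" into width/height/pixel ratio,
    defaulting the ratio to 1 when no third component is given."""
    groups = NUM_PATTERN.findall(value)
    aspect = groups[2] if len(groups) > 2 else 1
    return {
        "width": int(groups[0]),
        "height": int(groups[1]),
        "pixelRatio": float(aspect),
    }
```

The plugin stores the result under `xml_overrides` and, when a width is present, mirrors it into `resolutionWidth`/`resolutionHeight`/`pixelAspect` on the instance.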
|
|
|||
|
|
@ -1,6 +1,7 @@
|
|||
import os
|
||||
from pprint import pformat
|
||||
from copy import deepcopy
|
||||
|
||||
import pyblish.api
|
||||
import openpype.api
|
||||
from openpype.hosts.flame import api as opfapi
|
||||
|
|
@ -22,6 +23,8 @@ class ExtractSubsetResources(openpype.api.Extractor):
|
|||
"ext": "jpg",
|
||||
"xml_preset_file": "Jpeg (8-bit).xml",
|
||||
"xml_preset_dir": "",
|
||||
"export_type": "File Sequence",
|
||||
"ignore_comment_attrs": True,
|
||||
"colorspace_out": "Output - sRGB",
|
||||
"representation_add_range": False,
|
||||
"representation_tags": ["thumbnail"]
|
||||
|
|
@ -30,6 +33,8 @@ class ExtractSubsetResources(openpype.api.Extractor):
|
|||
"ext": "mov",
|
||||
"xml_preset_file": "Apple iPad (1920x1080).xml",
|
||||
"xml_preset_dir": "",
|
||||
"export_type": "Movie",
|
||||
"ignore_comment_attrs": True,
|
||||
"colorspace_out": "Output - Rec.709",
|
||||
"representation_add_range": True,
|
||||
"representation_tags": [
|
||||
|
|
@ -54,21 +59,35 @@ class ExtractSubsetResources(openpype.api.Extractor):
|
|||
):
|
||||
instance.data["representations"] = []
|
||||
|
||||
frame_start = instance.data["frameStart"]
|
||||
handle_start = instance.data["handleStart"]
|
||||
frame_start_handle = frame_start - handle_start
|
||||
source_first_frame = instance.data["sourceFirstFrame"]
|
||||
source_start_handles = instance.data["sourceStartH"]
|
||||
source_end_handles = instance.data["sourceEndH"]
|
||||
source_duration_handles = (
|
||||
source_end_handles - source_start_handles) + 1
|
||||
|
||||
# flame objects
|
||||
segment = instance.data["item"]
|
||||
sequence_clip = instance.context.data["flameSequence"]
|
||||
clip_data = instance.data["flameSourceClip"]
|
||||
clip = clip_data["PyClip"]
|
||||
|
||||
in_mark = (source_start_handles - source_first_frame) + 1
|
||||
out_mark = in_mark + source_duration_handles
|
||||
# segment's parent track name
|
||||
s_track_name = segment.parent.name.get_value()
|
||||
|
||||
# get configured workfile frame start/end (handles excluded)
|
||||
frame_start = instance.data["frameStart"]
|
||||
# get media source first frame
|
||||
source_first_frame = instance.data["sourceFirstFrame"]
|
||||
|
||||
# get timeline in/out of segment
|
||||
clip_in = instance.data["clipIn"]
|
||||
clip_out = instance.data["clipOut"]
|
||||
|
||||
# get handles value - take only the max from both
|
||||
handle_start = instance.data["handleStart"]
|
||||
handle_end = instance.data["handleStart"]
|
||||
handles = max(handle_start, handle_end)
|
||||
|
||||
# get media source range with handles
|
||||
source_end_handles = instance.data["sourceEndH"]
|
||||
source_start_handles = instance.data["sourceStartH"]
|
||||
source_end_handles = instance.data["sourceEndH"]
|
||||
|
||||
# create staging dir path
|
||||
staging_dir = self.staging_dir(instance)
|
||||
|
||||
# add default preset type for thumbnail and reviewable video
|
||||
|
|
@@ -77,15 +96,61 @@ class ExtractSubsetResources(openpype.api.Extractor):
         export_presets = deepcopy(self.default_presets)
         export_presets.update(self.export_presets_mapping)

-        # with maintained duplication loop all presets
-        with opfapi.maintained_object_duplication(clip) as duplclip:
-            # loop all preset names and
-            for unique_name, preset_config in export_presets.items():
+        # loop all preset names and
+        for unique_name, preset_config in export_presets.items():
+            modify_xml_data = {}
+
+            # get all presets attributes
+            preset_file = preset_config["xml_preset_file"]
+            preset_dir = preset_config["xml_preset_dir"]
+            export_type = preset_config["export_type"]
+            repre_tags = preset_config["representation_tags"]
+            ignore_comment_attrs = preset_config["ignore_comment_attrs"]
+            color_out = preset_config["colorspace_out"]
+
+            # get frame range with handles for representation range
+            frame_start_handle = frame_start - handle_start
+            source_duration_handles = (
+                source_end_handles - source_start_handles) + 1
+
+            # define in/out marks
+            in_mark = (source_start_handles - source_first_frame) + 1
+            out_mark = in_mark + source_duration_handles
+
+            # by default export source clips
+            exporting_clip = clip
+
+            if export_type == "Sequence Publish":
+                # change export clip to sequence
+                exporting_clip = sequence_clip
+
+                # change in/out marks to timeline in/out
+                in_mark = clip_in
+                out_mark = clip_out
+
+                # add xml tags modifications
+                modify_xml_data.update({
+                    "exportHandles": True,
+                    "nbHandles": handles,
+                    "startFrame": frame_start
+                })
+
+            if not ignore_comment_attrs:
+                # add any xml overrides collected from segment.comment
+                modify_xml_data.update(instance.data["xml_overrides"])
+
+            self.log.debug("__ modify_xml_data: {}".format(pformat(
+                modify_xml_data
+            )))
+
+            # with maintained duplication loop all presets
+            with opfapi.maintained_object_duplication(
+                    exporting_clip) as duplclip:
                 kwargs = {}
-                preset_file = preset_config["xml_preset_file"]
-                preset_dir = preset_config["xml_preset_dir"]
-                repre_tags = preset_config["representation_tags"]
-                color_out = preset_config["colorspace_out"]
+
+                if export_type == "Sequence Publish":
+                    # only keep visible layer where instance segment is child
+                    self.hide_other_tracks(duplclip, s_track_name)

                 # validate xml preset file is filled
                 if preset_file == "":
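The mark arithmetic in the hunk above is easy to sanity-check in isolation. A minimal sketch with made-up frame numbers standing in for the instance data (variable names mirror the plugin's):

```python
# Hypothetical values standing in for instance.data entries.
source_first_frame = 1001    # first frame of the source media
source_start_handles = 1009  # cut-in minus the start handle
source_end_handles = 1052    # cut-out plus the end handle

# Duration counts both endpoints, hence the trailing +1.
source_duration_handles = (source_end_handles - source_start_handles) + 1

# Marks are relative to the source media and start at 1.
in_mark = (source_start_handles - source_first_frame) + 1
out_mark = in_mark + source_duration_handles
```

With these numbers the computation yields `in_mark` 9, a duration of 44 frames and `out_mark` 53.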
@@ -108,10 +173,13 @@ class ExtractSubsetResources(openpype.api.Extractor):
                     )

                 # create preset path
-                preset_path = str(os.path.join(
+                preset_orig_xml_path = str(os.path.join(
                     preset_dir, preset_file
                 ))

+                preset_path = opfapi.modify_preset_file(
+                    preset_orig_xml_path, staging_dir, modify_xml_data)
+
                 # define kwargs based on preset type
                 if "thumbnail" in unique_name:
                     kwargs["thumb_frame_number"] = in_mark + (
@@ -122,6 +190,7 @@ class ExtractSubsetResources(openpype.api.Extractor):
                     "out_mark": out_mark
                 })

                 # get and make export dir paths
                 export_dir_path = str(os.path.join(
                     staging_dir, unique_name
                 ))
@@ -132,6 +201,7 @@ class ExtractSubsetResources(openpype.api.Extractor):
                     export_dir_path, duplclip, preset_path, **kwargs)

                 extension = preset_config["ext"]

                 # create representation data
                 representation_data = {
                     "name": unique_name,
@@ -159,7 +229,12 @@ class ExtractSubsetResources(openpype.api.Extractor):
                 # add files to representation but add
                 # imagesequence as list
                 if (
                     "movie_file" in preset_path
                     # first check if path in files is not mov extension
                     and [
                         f for f in files
                         if os.path.splitext(f)[-1] == ".mov"
                     ]
                     # then try if thumbnail is not in unique name
                     or unique_name == "thumbnail"
                 ):
                     representation_data["files"] = files.pop()
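The branching above (single movie or thumbnail file vs. image-sequence list) can be sketched with plain lists; `to_repre_files` is a hypothetical helper, not part of the plugin:

```python
import os

def to_repre_files(files):
    # A lone .mov (or a thumbnail) is published as a single file name,
    # while an image sequence stays a list of frames.
    has_mov = [f for f in files if os.path.splitext(f)[-1] == ".mov"]
    if has_mov:
        return files.pop()
    return files
```

Calling it with `["shot010.mov"]` returns the bare string, while a list of `.exr` frames comes back unchanged.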
@@ -246,3 +321,19 @@ class ExtractSubsetResources(openpype.api.Extractor):
         )

         return new_stage_dir, new_files_list
+
+    def hide_other_tracks(self, sequence_clip, track_name):
+        """Helper method used only if a sequence clip is used.
+
+        Args:
+            sequence_clip (flame.Clip): sequence clip
+            track_name (str): track name
+        """
+        # hide all tracks which are not the source track
+        for ver in sequence_clip.versions:
+            for track in ver.tracks:
+                if len(track.segments) == 0 and track.hidden:
+                    continue
+
+                if track.name.get_value() != track_name:
+                    track.hidden = True
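The hiding rule in `hide_other_tracks` can be exercised without Flame by using a stub track object (the real API wraps the name in an object with `get_value()`; the stub below uses a plain string, and `FakeTrack` is purely illustrative):

```python
class FakeTrack:
    # Stand-in for a flame track; attribute names mirror the real ones.
    def __init__(self, name, segments=1, hidden=False):
        self.name = name
        self.segments = segments
        self.hidden = hidden

def hide_other_tracks(tracks, track_name):
    for track in tracks:
        # already-hidden empty tracks are left untouched
        if track.segments == 0 and track.hidden:
            continue
        if track.name != track_name:
            track.hidden = True

tracks = [FakeTrack("video1"), FakeTrack("video2"), FakeTrack("audio1")]
hide_other_tracks(tracks, "video2")
```

After the call only "video2" remains visible.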
@@ -3,6 +3,7 @@ import sys
 import re
 import contextlib

+from bson.objectid import ObjectId
 from Qt import QtGui

 from avalon import io
@@ -92,7 +93,7 @@ def switch_item(container,
     # Collect any of current asset, subset and representation if not provided
     # so we can use the original name from those.
     if any(not x for x in [asset_name, subset_name, representation_name]):
-        _id = io.ObjectId(container["representation"])
+        _id = ObjectId(container["representation"])
         representation = io.find_one({"type": "representation", "_id": _id})
         version, subset, asset, project = io.parenthood(representation)
@@ -8,13 +8,15 @@ import contextlib

 import pyblish.api
 import avalon.api
-from avalon.pipeline import AVALON_CONTAINER_ID

 from openpype.api import Logger
 from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
     deregister_loader_plugin_path,
+    register_inventory_action_path,
+    deregister_inventory_action_path,
+    AVALON_CONTAINER_ID,
 )
 import openpype.hosts.fusion

@@ -69,7 +71,7 @@ def install():

     register_loader_plugin_path(LOAD_PATH)
     avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
-    avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH)
+    register_inventory_action_path(INVENTORY_PATH)

     pyblish.api.register_callback(
         "instanceToggled", on_pyblish_instance_toggled
@@ -93,9 +95,7 @@ def uninstall():

     deregister_loader_plugin_path(LOAD_PATH)
     avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
-    avalon.api.deregister_plugin_path(
-        avalon.api.InventoryAction, INVENTORY_PATH
-    )
+    deregister_inventory_action_path(INVENTORY_PATH)

     pyblish.api.deregister_callback(
         "instanceToggled", on_pyblish_instance_toggled
@@ -1,12 +1,14 @@
 """Host API required Work Files tool"""
 import sys
 import os
-from avalon import api
+
+from openpype.pipeline import HOST_WORKFILE_EXTENSIONS

 from .pipeline import get_current_comp


 def file_extensions():
-    return api.HOST_WORKFILE_EXTENSIONS["fusion"]
+    return HOST_WORKFILE_EXTENSIONS["fusion"]


 def has_unsaved_changes():
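The same `HOST_WORKFILE_EXTENSIONS` refactor repeats for every host below: the constant is essentially a dict keyed by host name. A sketch of the lookup pattern (the sample values here are illustrative stand-ins, not the real mapping):

```python
# Illustrative stand-in for openpype.pipeline.HOST_WORKFILE_EXTENSIONS.
HOST_WORKFILE_EXTENSIONS = {
    "fusion": [".comp"],
    "maya": [".ma", ".mb"],
    "nuke": [".nk"],
}

def file_extensions(host):
    # Each host's workio module returns its own entry from the mapping.
    return HOST_WORKFILE_EXTENSIONS[host]
```

Moving the mapping into `openpype.pipeline` lets every host drop its `avalon.api` import for this lookup.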
@@ -1,7 +1,7 @@
-from avalon import api
+from openpype.pipeline import InventoryAction


-class FusionSelectContainers(api.InventoryAction):
+class FusionSelectContainers(InventoryAction):

     label = "Select Containers"
     icon = "mouse-pointer"
@@ -1,6 +1,6 @@
-from avalon import api
 from Qt import QtGui, QtWidgets

+from openpype.pipeline import InventoryAction
 from openpype import style
 from openpype.hosts.fusion.api import (
     get_current_comp,
@@ -8,7 +8,7 @@ from openpype.hosts.fusion.api import (
 )


-class FusionSetToolColor(api.InventoryAction):
+class FusionSetToolColor(InventoryAction):
     """Update the color of the selected tools"""

     label = "Set Tool Color"
@@ -2,11 +2,11 @@ import os
 from pathlib import Path
 import logging

+from bson.objectid import ObjectId
 import pyblish.api

 from avalon import io
 import avalon.api
-from avalon.pipeline import AVALON_CONTAINER_ID

 from openpype import lib
 from openpype.lib import register_event_callback
@@ -14,6 +14,7 @@ from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
     deregister_loader_plugin_path,
+    AVALON_CONTAINER_ID,
 )
 import openpype.hosts.harmony
 import openpype.hosts.harmony.api as harmony
@@ -113,7 +114,7 @@ def check_inventory():
         representation = container['representation']
         representation_doc = io.find_one(
             {
-                "_id": io.ObjectId(representation),
+                "_id": ObjectId(representation),
                 "type": "representation"
             },
             projection={"parent": True}
@@ -2,20 +2,21 @@
 import os
 import shutil

+from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
+
 from .lib import (
     ProcessContext,
     get_local_harmony_path,
     zip_and_move,
     launch_zip_file
 )
-from avalon import api

 # used to lock saving until previous save is done.
 save_disabled = False


 def file_extensions():
-    return api.HOST_WORKFILE_EXTENSIONS["harmony"]
+    return HOST_WORKFILE_EXTENSIONS["harmony"]


 def has_unsaved_changes():
@@ -41,6 +41,7 @@ class ExtractRender(pyblish.api.InstancePlugin):
         func = """function %s(args)
         {
             node.setTextAttr(args[0], "DRAWING_NAME", 1, args[1]);
+            node.setTextAttr(args[0], 'MOVIE_PATH', 1, args[1]);
         }
         %s
         """ % (sig, sig)
@@ -1,12 +1,12 @@
 import os
 import hiero.core.events
 from openpype.api import Logger
+from openpype.lib import register_event_callback
 from .lib import (
     sync_avalon_data_to_workfile,
     launch_workfiles_app,
     selection_changed_timeline,
     before_project_save,
-    register_event_callback
 )
 from .tags import add_tags_to_workfile
 from .menu import update_menu_task_label
@@ -8,7 +8,10 @@ import platform
 import ast
 import shutil
 import hiero
+
+from Qt import QtWidgets
+from bson.objectid import ObjectId

 import avalon.api as avalon
 import avalon.io
 from openpype.api import (Logger, Anatomy, get_anatomy_settings)
@@ -1006,7 +1009,7 @@ def check_inventory_versions():
         # get representation from io
         representation = io.find_one({
             "type": "representation",
-            "_id": io.ObjectId(container["representation"])
+            "_id": ObjectId(container["representation"])
         })

         # Get start frame from version data
@@ -4,7 +4,7 @@ Basic avalon integration
 import os
 import contextlib
 from collections import OrderedDict
-from avalon.pipeline import AVALON_CONTAINER_ID
+
 from avalon import api as avalon
 from avalon import schema
 from pyblish import api as pyblish
@@ -13,6 +13,7 @@ from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
     deregister_loader_plugin_path,
+    AVALON_CONTAINER_ID,
 )
 from openpype.tools.utils import host_tools
 from . import lib, menu, events
@@ -28,7 +29,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
 PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish").replace("\\", "/")
 LOAD_PATH = os.path.join(PLUGINS_DIR, "load").replace("\\", "/")
 CREATE_PATH = os.path.join(PLUGINS_DIR, "create").replace("\\", "/")
-INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory").replace("\\", "/")

 AVALON_CONTAINERS = ":AVALON_CONTAINERS"

@@ -51,7 +51,6 @@ def install():
     pyblish.register_plugin_path(PUBLISH_PATH)
     register_loader_plugin_path(LOAD_PATH)
     avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
-    avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)

     # register callback for switching publishable
     pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled)
@@ -1,14 +1,14 @@
 import os
 import hiero
-from avalon import api

 from openpype.api import Logger
+from openpype.pipeline import HOST_WORKFILE_EXTENSIONS


-log = Logger().get_logger(__name__)
+log = Logger.get_logger(__name__)


 def file_extensions():
-    return api.HOST_WORKFILE_EXTENSIONS["hiero"]
+    return HOST_WORKFILE_EXTENSIONS["hiero"]


 def has_unsaved_changes():
@@ -8,12 +8,12 @@ import hdefereval

 import pyblish.api
 import avalon.api
-from avalon.pipeline import AVALON_CONTAINER_ID
 from avalon.lib import find_submodule

 from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
+    AVALON_CONTAINER_ID,
 )
 import openpype.hosts.houdini
 from openpype.hosts.houdini.api import lib
@@ -2,11 +2,11 @@
 import os

 import hou
-from avalon import api
+
+from openpype.pipeline import HOST_WORKFILE_EXTENSIONS


 def file_extensions():
-    return api.HOST_WORKFILE_EXTENSIONS["houdini"]
+    return HOST_WORKFILE_EXTENSIONS["houdini"]


 def has_unsaved_changes():
@@ -3,6 +3,7 @@ import os
 from openpype.pipeline import (
     load,
     get_representation_path,
+    AVALON_CONTAINER_ID,
 )
 from openpype.hosts.houdini.api import lib, pipeline

@@ -73,7 +74,7 @@ class ImageLoader(load.LoaderPlugin):
         # Imprint it manually
         data = {
             "schema": "avalon-core:container-2.0",
-            "id": pipeline.AVALON_CONTAINER_ID,
+            "id": AVALON_CONTAINER_ID,
             "name": node_name,
             "namespace": namespace,
             "loader": str(self.__class__.__name__),
@@ -1,8 +1,9 @@
 from openpype.pipeline import (
     load,
     get_representation_path,
+    AVALON_CONTAINER_ID,
 )
-from openpype.hosts.houdini.api import lib, pipeline
+from openpype.hosts.houdini.api import lib


 class USDSublayerLoader(load.LoaderPlugin):
@@ -43,7 +44,7 @@ class USDSublayerLoader(load.LoaderPlugin):
         # Imprint it manually
         data = {
             "schema": "avalon-core:container-2.0",
-            "id": pipeline.AVALON_CONTAINER_ID,
+            "id": AVALON_CONTAINER_ID,
             "name": node_name,
             "namespace": namespace,
             "loader": str(self.__class__.__name__),
@@ -1,8 +1,9 @@
 from openpype.pipeline import (
     load,
     get_representation_path,
+    AVALON_CONTAINER_ID,
 )
-from openpype.hosts.houdini.api import lib, pipeline
+from openpype.hosts.houdini.api import lib


 class USDReferenceLoader(load.LoaderPlugin):
@@ -43,7 +44,7 @@ class USDReferenceLoader(load.LoaderPlugin):
         # Imprint it manually
         data = {
             "schema": "avalon-core:container-2.0",
-            "id": pipeline.AVALON_CONTAINER_ID,
+            "id": AVALON_CONTAINER_ID,
             "name": node_name,
             "namespace": namespace,
             "loader": str(self.__class__.__name__),
@@ -145,7 +145,6 @@ class AvalonURIOutputProcessor(base.OutputProcessorBase):
         path = self._template.format(**{
             "root": root,
             "project": PROJECT,
-            "silo": asset_doc["silo"],
             "asset": asset_doc["name"],
             "subset": subset,
             "representation": ext,
@@ -165,4 +164,3 @@ output_processor = AvalonURIOutputProcessor()

 def usdOutputProcessor():
     return output_processor
-
@@ -1937,18 +1937,26 @@ def remove_other_uv_sets(mesh):
         cmds.removeMultiInstance(attr, b=True)


-def get_id_from_history(node):
+def get_id_from_sibling(node, history_only=True):
     """Return first node id in the history chain that matches this node.

     The nodes in history must be of the exact same node type and must be
     parented under the same parent.

+    Optionally, if no matching node is found from the history, all the
+    siblings of the node that are of the same type are checked.
+    Additionally to having the same parent, the sibling must be marked as
+    'intermediate object'.
+
     Args:
-        node (str): node to retrieve the
+        node (str): node to retrieve the history from
+        history_only (bool): if False and nothing is found in the history,
+            look for an 'intermediate object' among the node's siblings
+            of the same type

     Returns:
-        str or None: The id from the node in history or None when no id found
-            on any valid nodes in the history.
+        str or None: The id from the sibling node or None when no id found
+            on any valid nodes in the history or siblings.

     """

@@ -1977,6 +1985,45 @@
         if _id:
             return _id

+    if not history_only:
+        # Get siblings of same type
+        similar_nodes = cmds.listRelatives(parent,
+                                           type=node_type,
+                                           fullPath=True)
+        similar_nodes = cmds.ls(similar_nodes, exactType=node_type, long=True)
+
+        # Exclude itself
+        similar_nodes = [x for x in similar_nodes if x != node]
+
+        # Get all unique ids from siblings in order since
+        # we consistently take the first one found
+        sibling_ids = OrderedDict()
+        for similar_node in similar_nodes:
+            # Check if "intermediate object"
+            if not cmds.getAttr(similar_node + ".intermediateObject"):
+                continue
+
+            _id = get_id(similar_node)
+            if not _id:
+                continue
+
+            if _id in sibling_ids:
+                sibling_ids[_id].append(similar_node)
+            else:
+                sibling_ids[_id] = [similar_node]
+
+        if sibling_ids:
+            first_id, found_nodes = next(iter(sibling_ids.items()))
+
+            # Log a warning if we've found multiple unique ids
+            if len(sibling_ids) > 1:
+                log.warning(("Found more than 1 intermediate shape with"
+                             " unique id for '{}'. Using id of first"
+                             " found: '{}'".format(node, found_nodes[0])))
+
+            return first_id
+
+
 # Project settings
 def set_scene_fps(fps, update=True):
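The sibling branch added to `get_id_from_sibling` keeps the first unique id it encounters, in encounter order, and only warns when several ids compete. A sketch of that selection logic with plain data (no Maya required; `get_id` is stubbed by a dict lookup and the node names are made up):

```python
from collections import OrderedDict

def first_sibling_id(nodes, get_id):
    # Collect unique ids in encounter order so the first one found wins.
    sibling_ids = OrderedDict()
    for node in nodes:
        _id = get_id(node)
        if not _id:
            continue
        sibling_ids.setdefault(_id, []).append(node)

    if sibling_ids:
        first_id, _found = next(iter(sibling_ids.items()))
        return first_id
    return None

ids = {"shapeA": None, "shapeB": "id-123", "shapeC": "id-456"}
result = first_sibling_id(["shapeA", "shapeB", "shapeC"], ids.get)
```

Nodes without an id are skipped, so "id-123" from the first id-carrying sibling wins here.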
@@ -10,7 +10,6 @@ import pyblish.api
 import avalon.api

 from avalon.lib import find_submodule
-from avalon.pipeline import AVALON_CONTAINER_ID

 import openpype.hosts.maya
 from openpype.tools.utils import host_tools
@@ -23,7 +22,10 @@ from openpype.lib.path_tools import HostDirmap
 from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
+    register_inventory_action_path,
     deregister_loader_plugin_path,
+    deregister_inventory_action_path,
+    AVALON_CONTAINER_ID,
 )
 from openpype.hosts.maya.lib import copy_workspace_mel
 from . import menu, lib
@@ -59,7 +61,7 @@ def install():

     register_loader_plugin_path(LOAD_PATH)
     avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
-    avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH)
+    register_inventory_action_path(INVENTORY_PATH)
     log.info(PUBLISH_PATH)

     log.info("Installing callbacks ... ")
@@ -188,9 +190,7 @@ def uninstall():

     deregister_loader_plugin_path(LOAD_PATH)
     avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
-    avalon.api.deregister_plugin_path(
-        avalon.api.InventoryAction, INVENTORY_PATH
-    )
+    deregister_inventory_action_path(INVENTORY_PATH)

     menu.uninstall()
@@ -4,11 +4,11 @@ from maya import cmds

 import qargparse

-from avalon.pipeline import AVALON_CONTAINER_ID
 from openpype.pipeline import (
     LegacyCreator,
     LoaderPlugin,
     get_representation_path,
+    AVALON_CONTAINER_ID,
 )

 from .pipeline import containerise
@@ -6,6 +6,8 @@ import contextlib
 import copy

+import six
+from bson.objectid import ObjectId

 from maya import cmds

 from avalon import io
@@ -282,7 +284,7 @@ def update_package_version(container, version):

     # Versioning (from `core.maya.pipeline`)
     current_representation = io.find_one({
-        "_id": io.ObjectId(container["representation"])
+        "_id": ObjectId(container["representation"])
     })

     assert current_representation is not None, "This is a bug"
@@ -327,7 +329,7 @@ def update_package(set_container, representation):

     # Load the original package data
     current_representation = io.find_one({
-        "_id": io.ObjectId(set_container['representation']),
+        "_id": ObjectId(set_container['representation']),
         "type": "representation"
     })

@@ -478,10 +480,10 @@ def update_scene(set_container, containers, current_data, new_data, new_file):
     # They *must* use the same asset, subset and Loader for
     # `update_container` to make sense.
     old = io.find_one({
-        "_id": io.ObjectId(representation_current)
+        "_id": ObjectId(representation_current)
     })
     new = io.find_one({
-        "_id": io.ObjectId(representation_new)
+        "_id": ObjectId(representation_new)
     })
     is_valid = compare_representations(old=old, new=new)
     if not is_valid:
@@ -1,11 +1,12 @@
 """Host API required Work Files tool"""
 import os
 from maya import cmds
-from avalon import api
+
+from openpype.pipeline import HOST_WORKFILE_EXTENSIONS


 def file_extensions():
-    return api.HOST_WORKFILE_EXTENSIONS["maya"]
+    return HOST_WORKFILE_EXTENSIONS["maya"]


 def has_unsaved_changes():
@@ -1,6 +1,8 @@
 import json
-from avalon import api, io
+from avalon import io
+from bson.objectid import ObjectId
 from openpype.pipeline import (
+    InventoryAction,
     get_representation_context,
     get_representation_path_from_context,
 )
@@ -10,7 +12,7 @@ from openpype.hosts.maya.api.lib import (
 )


-class ImportModelRender(api.InventoryAction):
+class ImportModelRender(InventoryAction):

     label = "Import Model Render Sets"
     icon = "industry"
@@ -39,7 +41,7 @@ class ImportModelRender(InventoryAction):
             nodes.append(n)

         repr_doc = io.find_one({
-            "_id": io.ObjectId(container["representation"]),
+            "_id": ObjectId(container["representation"]),
         })
         version_id = repr_doc["parent"]
@@ -1,11 +1,10 @@
 from maya import cmds

-from avalon import api
-
+from openpype.pipeline import InventoryAction
 from openpype.hosts.maya.api.plugin import get_reference_node


-class ImportReference(api.InventoryAction):
+class ImportReference(InventoryAction):
     """Imports selected reference to inside of the file."""

     label = "Import Reference"
@@ -7,6 +7,8 @@ loader will use them instead of native vray vrmesh format.
 """
 import os

+from bson.objectid import ObjectId
+
 import maya.cmds as cmds

 from avalon import io
@@ -186,7 +188,7 @@ class VRayProxyLoader(load.LoaderPlugin):
         abc_rep = io.find_one(
             {
                 "type": "representation",
-                "parent": io.ObjectId(version_id),
+                "parent": ObjectId(version_id),
                 "name": "abc"
             })
@@ -6,7 +6,7 @@ from maya import cmds

 import openpype.api
 from openpype.hosts.maya.api.lib import maintained_selection
-from avalon.pipeline import AVALON_CONTAINER_ID
+from openpype.pipeline import AVALON_CONTAINER_ID


 class ExtractMayaSceneRaw(openpype.api.Extractor):
@@ -32,8 +32,8 @@ class ValidateOutRelatedNodeIds(pyblish.api.InstancePlugin):
         # if a deformer has been created on the shape
         invalid = self.get_invalid(instance)
         if invalid:
-            raise RuntimeError("Nodes found with non-related "
-                               "asset IDs: {0}".format(invalid))
+            raise RuntimeError("Nodes found with mismatching "
+                               "IDs: {0}".format(invalid))

     @classmethod
     def get_invalid(cls, instance):
@@ -65,7 +65,7 @@ class ValidateOutRelatedNodeIds(pyblish.api.InstancePlugin):
                 invalid.append(node)
                 continue

-            history_id = lib.get_id_from_history(node)
+            history_id = lib.get_id_from_sibling(node)
             if history_id is not None and node_id != history_id:
                 invalid.append(node)

@@ -76,7 +76,7 @@ class ValidateOutRelatedNodeIds(pyblish.api.InstancePlugin):

         for node in cls.get_invalid(instance):
             # Get the original id from history
-            history_id = lib.get_id_from_history(node)
+            history_id = lib.get_id_from_sibling(node)
             if not history_id:
                 cls.log.error("Could not find ID in history for '%s'", node)
                 continue
@@ -48,7 +48,7 @@ class ValidateNodeIdsDeformedShape(pyblish.api.InstancePlugin):

         invalid = []
         for shape in shapes:
-            history_id = lib.get_id_from_history(shape)
+            history_id = lib.get_id_from_sibling(shape)
             if history_id:
                 current_id = lib.get_id(shape)
                 if current_id != history_id:
@@ -61,7 +61,7 @@ class ValidateNodeIdsDeformedShape(pyblish.api.InstancePlugin):

         for node in cls.get_invalid(instance):
             # Get the original id from history
-            history_id = lib.get_id_from_history(node)
+            history_id = lib.get_id_from_sibling(node)
             if not history_id:
                 cls.log.error("Could not find ID in history for '%s'", node)
                 continue
|
|
@ -24,6 +24,7 @@ class ValidateRigOutSetNodeIds(pyblish.api.InstancePlugin):
|
|||
openpype.hosts.maya.api.action.SelectInvalidAction,
|
||||
openpype.api.RepairAction
|
||||
]
|
||||
allow_history_only = False
|
||||
|
||||
def process(self, instance):
|
||||
"""Process all meshes"""
|
||||
|
|
@ -32,8 +33,8 @@ class ValidateRigOutSetNodeIds(pyblish.api.InstancePlugin):
|
|||
# if a deformer has been created on the shape
|
||||
invalid = self.get_invalid(instance)
|
||||
if invalid:
|
||||
raise RuntimeError("Nodes found with non-related "
|
||||
"asset IDs: {0}".format(invalid))
|
||||
raise RuntimeError("Nodes found with mismatching "
|
||||
"IDs: {0}".format(invalid))
|
||||
|
||||
@classmethod
|
||||
def get_invalid(cls, instance):
|
||||
|
|
@ -51,10 +52,13 @@ class ValidateRigOutSetNodeIds(pyblish.api.InstancePlugin):
|
|||
noIntermediate=True)
|
||||
|
||||
for shape in shapes:
|
||||
history_id = lib.get_id_from_history(shape)
|
||||
if history_id:
|
||||
sibling_id = lib.get_id_from_sibling(
|
||||
shape,
|
||||
history_only=cls.allow_history_only
|
||||
)
|
||||
if sibling_id:
|
||||
current_id = lib.get_id(shape)
|
||||
if current_id != history_id:
|
||||
if current_id != sibling_id:
|
||||
invalid.append(shape)
|
||||
|
||||
return invalid
|
||||
|
|
@ -63,10 +67,13 @@ class ValidateRigOutSetNodeIds(pyblish.api.InstancePlugin):
|
|||
def repair(cls, instance):
|
||||
|
||||
for node in cls.get_invalid(instance):
|
||||
# Get the original id from history
|
||||
history_id = lib.get_id_from_history(node)
|
||||
if not history_id:
|
||||
cls.log.error("Could not find ID in history for '%s'", node)
|
||||
# Get the original id from sibling
|
||||
sibling_id = lib.get_id_from_sibling(
|
||||
node,
|
||||
history_only=cls.allow_history_only
|
||||
)
|
||||
if not sibling_id:
|
||||
cls.log.error("Could not find ID in siblings for '%s'", node)
|
||||
continue
|
||||
|
||||
lib.set_id(node, history_id, overwrite=True)
|
||||
lib.set_id(node, sibling_id, overwrite=True)
|
||||
|
|
|
|||
|
|
@@ -1,6 +1,7 @@ import logging
 import contextlib
 import nuke
+from bson.objectid import ObjectId

 from avalon import api, io

@@ -70,10 +71,10 @@ def get_handles(asset):
     if "visualParent" in data:
         vp = data["visualParent"]
         if vp is not None:
-            parent_asset = io.find_one({"_id": io.ObjectId(vp)})
+            parent_asset = io.find_one({"_id": ObjectId(vp)})

     if parent_asset is None:
-        parent_asset = io.find_one({"_id": io.ObjectId(asset["parent"])})
+        parent_asset = io.find_one({"_id": ObjectId(asset["parent"])})

     if parent_asset is not None:
         return get_handles(parent_asset)
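`get_handles` recurses up the asset hierarchy through `visualParent`/`parent` links until an ancestor defines handles. The traversal can be modelled with a dict standing in for the Mongo collection (the keys, document structure and asset names below are simplified assumptions, not the real schema):

```python
def get_handles(asset, assets_by_id):
    # Return handles from the first asset (or ancestor) that defines
    # them; prefer the visualParent link, fall back to parent.
    data = asset.get("data", {})
    if "handles" in data:
        return data["handles"]
    parent_id = data.get("visualParent") or asset.get("parent")
    parent_asset = assets_by_id.get(parent_id)
    if parent_asset is not None:
        return get_handles(parent_asset, assets_by_id)
    return None

assets = {
    "project": {"data": {"handles": 10}},
    "seq010": {"parent": "project", "data": {}},
    "sh010": {"parent": "seq010", "data": {"visualParent": "seq010"}},
}
handles = get_handles(assets["sh010"], assets)
```

The shot inherits its handles from the project two levels up.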
@@ -6,10 +6,11 @@ import contextlib
 from collections import OrderedDict

 import clique
+from bson.objectid import ObjectId

 import nuke

-from avalon import api, io, lib
+from avalon import api, io

 from openpype.api import (
     Logger,
@@ -20,7 +21,6 @@ from openpype.api import (
     get_workdir_data,
     get_asset,
     get_current_project_settings,
-    ApplicationManager
 )
 from openpype.tools.utils import host_tools
 from openpype.lib.path_tools import HostDirmap
@@ -570,7 +570,7 @@ def check_inventory_versions():
         # get representation from io
         representation = io.find_one({
             "type": "representation",
-            "_id": io.ObjectId(avalon_knob_data["representation"])
+            "_id": ObjectId(avalon_knob_data["representation"])
         })

         # Failsafe for not finding the representation.
@@ -6,7 +6,6 @@ import nuke

 import pyblish.api
 import avalon.api
-from avalon import pipeline

 import openpype
 from openpype.api import (
@@ -18,7 +17,10 @@ from openpype.lib import register_event_callback
 from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
+    register_inventory_action_path,
     deregister_loader_plugin_path,
+    deregister_inventory_action_path,
+    AVALON_CONTAINER_ID,
 )
 from openpype.tools.utils import host_tools

@@ -105,7 +107,7 @@ def install():
     pyblish.api.register_plugin_path(PUBLISH_PATH)
     register_loader_plugin_path(LOAD_PATH)
     avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
-    avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH)
+    register_inventory_action_path(INVENTORY_PATH)

     # Register Avalon event for workfiles loading.
     register_event_callback("workio.open_file", check_inventory_versions)
@@ -131,6 +133,7 @@ def uninstall():
     pyblish.api.deregister_plugin_path(PUBLISH_PATH)
     deregister_loader_plugin_path(LOAD_PATH)
     avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
+    deregister_inventory_action_path(INVENTORY_PATH)

     pyblish.api.deregister_callback(
         "instanceToggled", on_pyblish_instance_toggled)
@@ -330,7 +333,7 @@ def containerise(node,
     data = OrderedDict(
         [
             ("schema", "openpype:container-2.0"),
-            ("id", pipeline.AVALON_CONTAINER_ID),
+            ("id", AVALON_CONTAINER_ID),
             ("name", name),
             ("namespace", namespace),
             ("loader", str(loader)),
@@ -1,11 +1,12 @@
 """Host API required Work Files tool"""
 import os
 import nuke
-import avalon.api
+
+from openpype.pipeline import HOST_WORKFILE_EXTENSIONS


 def file_extensions():
-    return avalon.api.HOST_WORKFILE_EXTENSIONS["nuke"]
+    return HOST_WORKFILE_EXTENSIONS["nuke"]


 def has_unsaved_changes():
@@ -1,9 +1,9 @@
-from avalon import api
 from openpype.api import Logger
+from openpype.pipeline import InventoryAction
 from openpype.hosts.nuke.api.lib import set_avalon_knob_data


-class RepairOldLoaders(api.InventoryAction):
+class RepairOldLoaders(InventoryAction):

     label = "Repair Old Loaders"
     icon = "gears"
```diff
@@ -1,8 +1,8 @@
-from avalon import api
+from openpype.pipeline import InventoryAction
 from openpype.hosts.nuke.api.commands import viewer_update_and_undo_stop


-class SelectContainers(api.InventoryAction):
+class SelectContainers(InventoryAction):

     label = "Select Containers"
     icon = "mouse-pointer"
```
```diff
@@ -101,7 +101,7 @@ class LoadClip(plugin.NukeLoader):
             last += self.handle_end

         if not is_sequence:
-            duration = last - first + 1
+            duration = last - first
             first = 1
             last = first + duration
         elif "#" not in file:

@@ -216,7 +216,7 @@ class LoadClip(plugin.NukeLoader):
             last += self.handle_end

         if not is_sequence:
-            duration = last - first + 1
+            duration = last - first
             first = 1
             last = first + duration
         elif "#" not in file:
```
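The two LoadClip hunks above drop the `+ 1` before remapping a non-sequence clip to start at frame 1. Since the clip is then remapped with `first = 1; last = first + duration`, the inclusive frame count is only preserved when `duration` is the span (`last - first`), not the frame count. A minimal sketch of the arithmetic (hypothetical helper, not part of the OpenPype API):

```python
def remap_clip_range(first, last):
    """Remap an inclusive frame range [first, last] to start at frame 1.

    An inclusive range holds (last - first + 1) frames. After forcing
    first to 1, the new last must be first + (last - first); using
    (last - first + 1) here would add one extra frame to the clip.
    """
    duration = last - first  # span between endpoints, not frame count
    new_first = 1
    new_last = new_first + duration
    return new_first, new_last
```

For example, a clip spanning frames 1001-1010 (ten frames) remaps to 1-10 with the corrected formula.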
```diff
@@ -1,9 +1,10 @@
 import os
 from Qt import QtWidgets
+from bson.objectid import ObjectId

 import pyblish.api
 import avalon.api
-from avalon import pipeline, io
+from avalon import io

 from openpype.api import Logger
 from openpype.lib import register_event_callback

@@ -11,6 +12,7 @@ from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
     deregister_loader_plugin_path,
+    AVALON_CONTAINER_ID,
 )
 import openpype.hosts.photoshop

@@ -36,7 +38,7 @@ def check_inventory():
         representation = container['representation']
         representation_doc = io.find_one(
             {
-                "_id": io.ObjectId(representation),
+                "_id": ObjectId(representation),
                 "type": "representation"
             },
             projection={"parent": True}

@@ -221,7 +223,7 @@ def containerise(
     data = {
         "schema": "openpype:container-2.0",
-        "id": pipeline.AVALON_CONTAINER_ID,
+        "id": AVALON_CONTAINER_ID,
         "name": name,
         "namespace": namespace,
         "loader": str(loader),
```
```diff
@@ -1,8 +1,7 @@
 """Host API required Work Files tool"""
 import os

-import avalon.api
-
+from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
 from . import lib


@@ -15,7 +14,7 @@ def _active_document():


 def file_extensions():
-    return avalon.api.HOST_WORKFILE_EXTENSIONS["photoshop"]
+    return HOST_WORKFILE_EXTENSIONS["photoshop"]


 def has_unsaved_changes():
```
```diff
@@ -6,13 +6,13 @@ import contextlib
 from collections import OrderedDict
 from avalon import api as avalon
 from avalon import schema
-from avalon.pipeline import AVALON_CONTAINER_ID
 from pyblish import api as pyblish
 from openpype.api import Logger
 from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
     deregister_loader_plugin_path,
+    AVALON_CONTAINER_ID,
 )
 from . import lib
 from . import PLUGINS_DIR

@@ -22,7 +22,6 @@ log = Logger().get_logger(__name__)
 PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
 LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
 CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
-INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")

 AVALON_CONTAINERS = ":AVALON_CONTAINERS"

@@ -48,7 +47,6 @@ def install():

     register_loader_plugin_path(LOAD_PATH)
     avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
-    avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)

     # register callback for switching publishable
     pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled)

@@ -73,7 +71,6 @@ def uninstall():

     deregister_loader_plugin_path(LOAD_PATH)
     avalon.deregister_plugin_path(LegacyCreator, CREATE_PATH)
-    avalon.deregister_plugin_path(avalon.InventoryAction, INVENTORY_PATH)

     # register callback for switching publishable
     pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled)
```
```diff
@@ -3,9 +3,10 @@ import re
 import pyblish.api
 import json

-from avalon.api import format_template_with_optional_keys
-
-from openpype.lib import prepare_template_data
+from openpype.lib import (
+    prepare_template_data,
+    StringTemplate,
+)


 class CollectTextures(pyblish.api.ContextPlugin):

@@ -110,8 +111,9 @@ class CollectTextures(pyblish.api.ContextPlugin):

             formatting_data.update(explicit_data)
             fill_pairs = prepare_template_data(formatting_data)
-            workfile_subset = format_template_with_optional_keys(
-                fill_pairs, self.workfile_subset_template)
+            workfile_subset = StringTemplate.format_strict_template(
+                self.workfile_subset_template, fill_pairs
+            )

             asset_build = self._get_asset_build(
                 repre_file,

@@ -201,8 +203,9 @@ class CollectTextures(pyblish.api.ContextPlugin):
             formatting_data.update(explicit_data)

             fill_pairs = prepare_template_data(formatting_data)
-            subset = format_template_with_optional_keys(
-                fill_pairs, self.texture_subset_template)
+            subset = StringTemplate.format_strict_template(
+                self.texture_subset_template, fill_pairs
+            )

             asset_build = self._get_asset_build(
                 repre_file,
```
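The hunks above swap avalon's `format_template_with_optional_keys` for OpenPype's `StringTemplate.format_strict_template` (note the argument order also flips: template first, then data). Both fill `{key}` placeholders and drop optional `<...>` sections whose keys are missing; the strict variant raises when a required key cannot be solved. A rough stdlib-only stand-in showing the behavior (the real implementation lives in `openpype.lib.path_templates` and is more featureful; this is only an illustration):

```python
import re

class TemplateUnsolved(KeyError):
    """Raised when a required template key has no value."""

def format_strict(template, data):
    """Fill {key} placeholders; silently drop <optional> parts."""
    def fill_optional(match):
        inner = match.group(1)
        try:
            return inner.format(**data)
        except (KeyError, IndexError):
            return ""  # optional section with a missing key is dropped

    template = re.sub(r"<([^<>]*)>", fill_optional, template)
    try:
        return template.format(**data)
    except KeyError as exc:
        # required key missing -> strict formatting fails loudly
        raise TemplateUnsolved(str(exc))
```

For a subset template such as `"{subset}Main<_{variant}>"`, the `<_{variant}>` part is kept only when `variant` is present in the fill data.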
```diff
@@ -2,6 +2,9 @@ import os
 import pyblish.api
 import openpype.api

+from openpype.lib import (
+    get_ffmpeg_tool_path,
+)
 from pprint import pformat


@@ -27,7 +30,7 @@ class ExtractTrimVideoAudio(openpype.api.Extractor):
         instance.data["representations"] = list()

         # get ffmpet path
-        ffmpeg_path = openpype.lib.get_ffmpeg_tool_path("ffmpeg")
+        ffmpeg_path = get_ffmpeg_tool_path("ffmpeg")

         # get staging dir
         staging_dir = self.staging_dir(instance)

@@ -44,7 +47,7 @@ class ExtractTrimVideoAudio(openpype.api.Extractor):
         clip_trimed_path = os.path.join(
             staging_dir, instance.data["name"] + ext)
         # # check video file metadata
-        # input_data = plib.ffprobe_streams(video_file_path)[0]
+        # input_data = plib.get_ffprobe_streams(video_file_path)[0]
         # self.log.debug(f"__ input_data: `{input_data}`")

         start = float(instance.data["clipInH"])
```
```diff
@@ -1,10 +1,10 @@
+from avalon import io
+from openpype.lib import NumberDef
 from openpype.hosts.testhost.api import pipeline
 from openpype.pipeline import (
     AutoCreator,
     CreatedInstance,
-    lib
 )
-from avalon import io


 class MyAutoCreator(AutoCreator):

@@ -13,7 +13,7 @@ class MyAutoCreator(AutoCreator):

     def get_instance_attr_defs(self):
         output = [
-            lib.NumberDef("number_key", label="Number")
+            NumberDef("number_key", label="Number")
         ]
         return output

```
```diff
@@ -1,10 +1,16 @@
 import json
 from openpype import resources
 from openpype.hosts.testhost.api import pipeline
+from openpype.lib import (
+    UISeparatorDef,
+    UILabelDef,
+    BoolDef,
+    NumberDef,
+    FileDef,
+)
 from openpype.pipeline import (
     Creator,
     CreatedInstance,
-    lib
 )


@@ -54,17 +60,17 @@ class TestCreatorOne(Creator):

     def get_instance_attr_defs(self):
         output = [
-            lib.NumberDef("number_key", label="Number"),
+            NumberDef("number_key", label="Number"),
         ]
         return output

     def get_pre_create_attr_defs(self):
         output = [
-            lib.BoolDef("use_selection", label="Use selection"),
-            lib.UISeparatorDef(),
-            lib.UILabelDef("Testing label"),
-            lib.FileDef("filepath", folders=True, label="Filepath"),
-            lib.FileDef(
+            BoolDef("use_selection", label="Use selection"),
+            UISeparatorDef(),
+            UILabelDef("Testing label"),
+            FileDef("filepath", folders=True, label="Filepath"),
+            FileDef(
                 "filepath_2", multipath=True, folders=True, label="Filepath 2"
             )
         ]
```
```diff
@@ -1,8 +1,8 @@
+from openpype.lib import NumberDef, TextDef
 from openpype.hosts.testhost.api import pipeline
 from openpype.pipeline import (
     Creator,
     CreatedInstance,
-    lib
 )


@@ -40,8 +40,8 @@ class TestCreatorTwo(Creator):

     def get_instance_attr_defs(self):
         output = [
-            lib.NumberDef("number_key"),
-            lib.TextDef("text_key")
+            NumberDef("number_key"),
+            TextDef("text_key")
         ]
         return output

```
```diff
@@ -1,10 +1,8 @@
 import json
 import pyblish.api

-from openpype.pipeline import (
-    OpenPypePyblishPluginMixin,
-    attribute_definitions
-)
+from openpype.lib import attribute_definitions
+from openpype.pipeline import OpenPypePyblishPluginMixin


 class CollectInstanceOneTestHost(
```
```diff
@@ -10,7 +10,6 @@ import pyblish.api
 import avalon.api

 from avalon import io
-from avalon.pipeline import AVALON_CONTAINER_ID

 from openpype.hosts import tvpaint
 from openpype.api import get_current_project_settings

@@ -19,6 +18,7 @@ from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
     deregister_loader_plugin_path,
+    AVALON_CONTAINER_ID,
 )

 from .lib import (
```
```diff
@@ -4,6 +4,7 @@
 """

 from avalon import api
+from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
 from .lib import (
     execute_george,
     execute_george_through_file

@@ -47,7 +48,7 @@ def has_unsaved_changes():

 def file_extensions():
     """Return the supported file extensions for Blender scene files."""
-    return api.HOST_WORKFILE_EXTENSIONS["tvpaint"]
+    return HOST_WORKFILE_EXTENSIONS["tvpaint"]


 def work_root(session):
```
```diff
@@ -1,10 +1,11 @@
 import getpass
 import os

 from avalon import api, io
 from openpype.lib import (
+    StringTemplate,
     get_workfile_template_key_from_context,
-    get_workdir_data
+    get_workdir_data,
+    get_last_workfile_with_version,
 )
 from openpype.api import Anatomy
 from openpype.hosts.tvpaint.api import lib, pipeline, plugin

@@ -67,9 +68,8 @@ class LoadWorkfile(plugin.Loader):

         data = get_workdir_data(project_doc, asset_doc, task_name, host_name)
         data["root"] = anatomy.roots
         data["user"] = getpass.getuser()

-        template = anatomy.templates[template_key]["file"]
+        file_template = anatomy.templates[template_key]["file"]

         # Define saving file extension
         if current_file:

@@ -81,11 +81,12 @@ class LoadWorkfile(plugin.Loader):

         data["ext"] = extension

-        work_root = api.format_template_with_optional_keys(
-            data, anatomy.templates[template_key]["folder"]
+        folder_template = anatomy.templates[template_key]["folder"]
+        work_root = StringTemplate.format_strict_template(
+            folder_template, data
         )
-        version = api.last_workfile_with_version(
-            work_root, template, data, host.file_extensions()
+        version = get_last_workfile_with_version(
+            work_root, file_template, data, host.file_extensions()
         )[1]

         if version is None:

@@ -95,8 +96,8 @@ class LoadWorkfile(plugin.Loader):

         data["version"] = version

-        path = os.path.join(
-            work_root,
-            api.format_template_with_optional_keys(data, template)
+        filename = StringTemplate.format_strict_template(
+            file_template, data
         )
+        path = os.path.join(work_root, filename)
         host.save_file(path)
```
```diff
@@ -4,13 +4,13 @@ import logging
 from typing import List

 import pyblish.api
-from avalon.pipeline import AVALON_CONTAINER_ID
 from avalon import api

 from openpype.pipeline import (
     LegacyCreator,
     register_loader_plugin_path,
     deregister_loader_plugin_path,
+    AVALON_CONTAINER_ID,
 )
 from openpype.tools.utils import host_tools
 import openpype.hosts.unreal
```
```diff
@@ -2,8 +2,10 @@
 """Loader for published alembics."""
 import os

-from avalon import pipeline
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID
+)
 from openpype.hosts.unreal.api import plugin
 from openpype.hosts.unreal.api import pipeline as unreal_pipeline


@@ -117,7 +119,7 @@ class PointCacheAlembicLoader(plugin.Loader):

         data = {
             "schema": "openpype:container-2.0",
-            "id": pipeline.AVALON_CONTAINER_ID,
+            "id": AVALON_CONTAINER_ID,
             "asset": asset,
             "namespace": asset_dir,
             "container_name": container_name,
```
```diff
@@ -2,8 +2,10 @@
 """Load Skeletal Mesh alembics."""
 import os

-from avalon import pipeline
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID
+)
 from openpype.hosts.unreal.api import plugin
 from openpype.hosts.unreal.api import pipeline as unreal_pipeline
 import unreal  # noqa

@@ -81,7 +83,7 @@ class SkeletalMeshAlembicLoader(plugin.Loader):

         data = {
             "schema": "openpype:container-2.0",
-            "id": pipeline.AVALON_CONTAINER_ID,
+            "id": AVALON_CONTAINER_ID,
             "asset": asset,
             "namespace": asset_dir,
             "container_name": container_name,
```
```diff
@@ -2,8 +2,10 @@
 """Loader for Static Mesh alembics."""
 import os

-from avalon import pipeline
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID
+)
 from openpype.hosts.unreal.api import plugin
 from openpype.hosts.unreal.api import pipeline as unreal_pipeline
 import unreal  # noqa

@@ -100,7 +102,7 @@ class StaticMeshAlembicLoader(plugin.Loader):

         data = {
             "schema": "openpype:container-2.0",
-            "id": pipeline.AVALON_CONTAINER_ID,
+            "id": AVALON_CONTAINER_ID,
             "asset": asset,
             "namespace": asset_dir,
             "container_name": container_name,
```
```diff
@@ -3,8 +3,10 @@
 import os
 import json

-from avalon import pipeline
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID
+)
 from openpype.hosts.unreal.api import plugin
 from openpype.hosts.unreal.api import pipeline as unreal_pipeline
 import unreal  # noqa

@@ -135,7 +137,7 @@ class AnimationFBXLoader(plugin.Loader):

         data = {
             "schema": "openpype:container-2.0",
-            "id": pipeline.AVALON_CONTAINER_ID,
+            "id": AVALON_CONTAINER_ID,
             "asset": asset,
             "namespace": asset_dir,
             "container_name": container_name,
```
```diff
@@ -2,7 +2,8 @@
 """Load camera from FBX."""
 import os

-from avalon import io, pipeline
+from avalon import io
+from openpype.pipeline import AVALON_CONTAINER_ID
 from openpype.hosts.unreal.api import plugin
 from openpype.hosts.unreal.api import pipeline as unreal_pipeline
 import unreal  # noqa

@@ -116,7 +117,7 @@ class CameraLoader(plugin.Loader):

         data = {
             "schema": "openpype:container-2.0",
-            "id": pipeline.AVALON_CONTAINER_ID,
+            "id": AVALON_CONTAINER_ID,
             "asset": asset,
             "namespace": asset_dir,
             "container_name": container_name,
```
```diff
@@ -11,12 +11,12 @@ from unreal import AssetToolsHelpers
 from unreal import FBXImportType
 from unreal import MathLibrary as umath

-from avalon.pipeline import AVALON_CONTAINER_ID
 from openpype.pipeline import (
     discover_loader_plugins,
     loaders_from_representation,
     load_container,
     get_representation_path,
+    AVALON_CONTAINER_ID,
 )
 from openpype.hosts.unreal.api import plugin
 from openpype.hosts.unreal.api import pipeline as unreal_pipeline
```
```diff
@@ -2,8 +2,10 @@
 """Load Skeletal Meshes form FBX."""
 import os

-from avalon import pipeline
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID
+)
 from openpype.hosts.unreal.api import plugin
 from openpype.hosts.unreal.api import pipeline as unreal_pipeline
 import unreal  # noqa

@@ -101,7 +103,7 @@ class SkeletalMeshFBXLoader(plugin.Loader):

         data = {
             "schema": "openpype:container-2.0",
-            "id": pipeline.AVALON_CONTAINER_ID,
+            "id": AVALON_CONTAINER_ID,
             "asset": asset,
             "namespace": asset_dir,
             "container_name": container_name,
```
```diff
@@ -2,8 +2,10 @@
 """Load Static meshes form FBX."""
 import os

-from avalon import pipeline
-from openpype.pipeline import get_representation_path
+from openpype.pipeline import (
+    get_representation_path,
+    AVALON_CONTAINER_ID
+)
 from openpype.hosts.unreal.api import plugin
 from openpype.hosts.unreal.api import pipeline as unreal_pipeline
 import unreal  # noqa

@@ -95,7 +97,7 @@ class StaticMeshFBXLoader(plugin.Loader):

         data = {
             "schema": "openpype:container-2.0",
-            "id": pipeline.AVALON_CONTAINER_ID,
+            "id": AVALON_CONTAINER_ID,
             "asset": asset,
             "namespace": asset_dir,
             "container_name": container_name,
```
```diff
@@ -3,6 +3,8 @@ import os
 import json
 import math

+from bson.objectid import ObjectId
+
 import unreal
 from unreal import EditorLevelLibrary as ell
 from unreal import EditorAssetLibrary as eal

@@ -62,7 +64,7 @@ class ExtractLayout(openpype.api.Extractor):
             blend = io.find_one(
                 {
                     "type": "representation",
-                    "parent": io.ObjectId(parent),
+                    "parent": ObjectId(parent),
                     "name": "blend"
                 },
                 projection={"_id": True})
```
```diff
@@ -14,8 +14,12 @@ import math

 from avalon import io
 import pyblish.api
-from openpype.lib import prepare_template_data, get_asset, ffprobe_streams
-from openpype.lib.vendor_bin_utils import get_fps
+from openpype.lib import (
+    prepare_template_data,
+    get_asset,
+    get_ffprobe_streams,
+    convert_ffprobe_fps_value,
+)
 from openpype.lib.plugin_tools import (
     parse_json,
     get_subset_name_with_asset_doc

@@ -265,7 +269,7 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
     def _get_number_of_frames(self, file_url):
         """Return duration in frames"""
         try:
-            streams = ffprobe_streams(file_url, self.log)
+            streams = get_ffprobe_streams(file_url, self.log)
         except Exception as exc:
             raise AssertionError((
                 "FFprobe couldn't read information about input file: \"{}\"."

@@ -288,7 +292,9 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
                 "nb_frames {} not convertible".format(nb_frames))

             duration = stream.get("duration")
-            frame_rate = get_fps(stream.get("r_frame_rate", '0/0'))
+            frame_rate = convert_ffprobe_fps_value(
+                stream.get("r_frame_rate", '0/0')
+            )
             self.log.debug("duration:: {} frame_rate:: {}".format(
                 duration, frame_rate))
             try:
```
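`convert_ffprobe_fps_value` replaces the old `get_fps` helper above; both turn ffprobe's rational `r_frame_rate` string (for example `"24000/1001"` for NTSC film rate) into a float. A minimal sketch of what such a conversion does (assumed behavior for illustration, not the exact OpenPype implementation):

```python
def fps_from_ffprobe_value(value):
    """Convert an ffprobe rational fps string like '24000/1001' to float.

    ffprobe reports '0/0' when the rate is unknown; treat a zero
    denominator as an invalid value instead of dividing by zero.
    """
    numerator, _, denominator = value.partition("/")
    num = float(numerator)
    den = float(denominator) if denominator else 1.0
    if den == 0.0:
        raise ValueError("Unknown frame rate: {}".format(value))
    return num / den
```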
```diff
@@ -26,10 +26,24 @@ from .vendor_bin_utils import (
     get_vendor_bin_path,
     get_oiio_tools_path,
     get_ffmpeg_tool_path,
-    ffprobe_streams,
     is_oiio_supported
 )

+from .attribute_definitions import (
+    AbtractAttrDef,
+
+    UIDef,
+    UISeparatorDef,
+    UILabelDef,
+
+    UnknownDef,
+    NumberDef,
+    TextDef,
+    EnumDef,
+    BoolDef,
+    FileDef,
+)
+
 from .env_tools import (
     env_value_to_bool,
     get_paths_from_environ,

@@ -90,7 +104,12 @@ from .profiles_filtering import (
 from .transcoding import (
     get_transcode_temp_directory,
     should_convert_for_ffmpeg,
-    convert_for_ffmpeg
+    convert_for_ffmpeg,
+    get_ffprobe_data,
+    get_ffprobe_streams,
+    get_ffmpeg_codec_args,
+    get_ffmpeg_format_args,
+    convert_ffprobe_fps_value,
 )
 from .avalon_context import (
     CURRENT_DOC_SCHEMAS,

@@ -110,6 +129,8 @@ from .avalon_context import (
     get_workdir_data,
     get_workdir,
     get_workdir_with_workdir_data,
+    get_last_workfile_with_version,
+    get_last_workfile,

     create_workfile_doc,
     save_workfile_data_to_doc,

@@ -225,9 +246,21 @@ __all__ = [
     "get_vendor_bin_path",
     "get_oiio_tools_path",
     "get_ffmpeg_tool_path",
-    "ffprobe_streams",
     "is_oiio_supported",

+    "AbtractAttrDef",
+
+    "UIDef",
+    "UISeparatorDef",
+    "UILabelDef",
+
+    "UnknownDef",
+    "NumberDef",
+    "TextDef",
+    "EnumDef",
+    "BoolDef",
+    "FileDef",
+
     "import_filepath",
     "modules_from_path",
     "recursive_bases_from_class",

@@ -237,6 +270,11 @@ __all__ = [
     "get_transcode_temp_directory",
     "should_convert_for_ffmpeg",
     "convert_for_ffmpeg",
+    "get_ffprobe_data",
+    "get_ffprobe_streams",
+    "get_ffmpeg_codec_args",
+    "get_ffmpeg_format_args",
+    "convert_ffprobe_fps_value",

     "CURRENT_DOC_SCHEMAS",
     "PROJECT_NAME_ALLOWED_SYMBOLS",

@@ -255,6 +293,8 @@ __all__ = [
     "get_workdir_data",
     "get_workdir",
     "get_workdir_with_workdir_data",
+    "get_last_workfile_with_version",
+    "get_last_workfile",

     "create_workfile_doc",
     "save_workfile_data_to_doc",
```
```diff
@@ -28,7 +28,8 @@ from .local_settings import get_openpype_username
 from .avalon_context import (
     get_workdir_data,
     get_workdir_with_workdir_data,
-    get_workfile_template_key
+    get_workfile_template_key,
+    get_last_workfile
 )

 from .python_module_tools import (

@@ -1544,6 +1545,7 @@ def _prepare_last_workfile(data, workdir):
         workdir (str): Path to folder where workfiles should be stored.
     """
     import avalon.api
+    from openpype.pipeline import HOST_WORKFILE_EXTENSIONS

     log = data["log"]


@@ -1592,7 +1594,7 @@ def _prepare_last_workfile(data, workdir):
     # Last workfile path
     last_workfile_path = data.get("last_workfile_path") or ""
     if not last_workfile_path:
-        extensions = avalon.api.HOST_WORKFILE_EXTENSIONS.get(app.host_name)
+        extensions = HOST_WORKFILE_EXTENSIONS.get(app.host_name)
         if extensions:
             anatomy = data["anatomy"]
             project_settings = data["project_settings"]

@@ -1609,7 +1611,7 @@ def _prepare_last_workfile(data, workdir):
                 "ext": extensions[0]
             })

-            last_workfile_path = avalon.api.last_workfile(
+            last_workfile_path = get_last_workfile(
                 workdir, file_template, workdir_data, extensions, True
             )

```
```diff
@@ -9,6 +9,8 @@ import collections
 import functools
 import getpass

+from bson.objectid import ObjectId
+
 from openpype.settings import (
     get_project_settings,
     get_system_settings

@@ -16,6 +18,7 @@ from openpype.settings import (
 from .anatomy import Anatomy
 from .profiles_filtering import filter_profiles
 from .events import emit_event
+from .path_templates import StringTemplate

 # avalon module is not imported at the top
 # - may not be in path at the time of pype.lib initialization

@@ -168,7 +171,7 @@ def any_outdated():

     representation_doc = avalon.io.find_one(
         {
-            "_id": avalon.io.ObjectId(representation),
+            "_id": ObjectId(representation),
             "type": "representation"
         },
         projection={"parent": True}

@@ -1735,8 +1738,6 @@ def get_custom_workfile_template_by_context(
         context. (Existence of formatted path is not validated.)
     """
-
-    from openpype.lib import filter_profiles

     if anatomy is None:
         anatomy = Anatomy(project_doc["name"])

@@ -1759,7 +1760,9 @@ def get_custom_workfile_template_by_context(
     # there are some anatomy template strings
     if matching_item:
         template = matching_item["path"][platform.system().lower()]
-        return template.format(**anatomy_context_data)
+        return StringTemplate.format_strict_template(
+            template, anatomy_context_data
+        )

     return None


@@ -1847,3 +1850,124 @@ def get_custom_workfile_template(template_profiles):
         io.Session["AVALON_TASK"],
         io
     )
+
+
+def get_last_workfile_with_version(
+    workdir, file_template, fill_data, extensions
+):
+    """Return last workfile version.
+
+    Args:
+        workdir(str): Path to dir where workfiles are stored.
+        file_template(str): Template of file name.
+        fill_data(dict): Data for filling template.
+        extensions(list, tuple): All allowed file extensions of workfile.
+
+    Returns:
+        tuple: Last workfile<str> with version<int> if there is any otherwise
+            returns (None, None).
+    """
+    if not os.path.exists(workdir):
+        return None, None
+
+    # Fast match on extension
+    filenames = [
+        filename
+        for filename in os.listdir(workdir)
+        if os.path.splitext(filename)[1] in extensions
+    ]
+
+    # Build template without optionals, version to digits only regex
+    # and comment to any definable value.
+    _ext = []
+    for ext in extensions:
+        if not ext.startswith("."):
+            ext = "." + ext
+        # Escape dot for regex
+        ext = "\\" + ext
+        _ext.append(ext)
+    ext_expression = "(?:" + "|".join(_ext) + ")"
+
+    # Replace `.{ext}` with `{ext}` so we are sure there is not dot at the end
+    file_template = re.sub(r"\.?{ext}", ext_expression, file_template)
+    # Replace optional keys with optional content regex
+    file_template = re.sub(r"<.*?>", r".*?", file_template)
+    # Replace `{version}` with group regex
+    file_template = re.sub(r"{version.*?}", r"([0-9]+)", file_template)
+    file_template = re.sub(r"{comment.*?}", r".+?", file_template)
+    file_template = StringTemplate.format_strict_template(
+        file_template, fill_data
+    )
+
+    # Match with ignore case on Windows due to the Windows
+    # OS not being case-sensitive. This avoids later running
+    # into the error that the file did exist if it existed
+    # with a different upper/lower-case.
+    kwargs = {}
+    if platform.system().lower() == "windows":
+        kwargs["flags"] = re.IGNORECASE
+
+    # Get highest version among existing matching files
+    version = None
+    output_filenames = []
+    for filename in sorted(filenames):
+        match = re.match(file_template, filename, **kwargs)
+        if not match:
+            continue
+
+        file_version = int(match.group(1))
+        if version is None or file_version > version:
+            output_filenames[:] = []
+            version = file_version
+
+        if file_version == version:
+            output_filenames.append(filename)
+
+    output_filename = None
+    if output_filenames:
+        if len(output_filenames) == 1:
+            output_filename = output_filenames[0]
+        else:
+            last_time = None
+            for _output_filename in output_filenames:
+                full_path = os.path.join(workdir, _output_filename)
+                mod_time = os.path.getmtime(full_path)
+                if last_time is None or last_time < mod_time:
+                    output_filename = _output_filename
+                    last_time = mod_time
+
+    return output_filename, version
+
+
+def get_last_workfile(
+    workdir, file_template, fill_data, extensions, full_path=False
+):
+    """Return last workfile filename.
+
+    Returns file with version 1 if there is not workfile yet.
+
+    Args:
+        workdir(str): Path to dir where workfiles are stored.
+        file_template(str): Template of file name.
+        fill_data(dict): Data for filling template.
+        extensions(list, tuple): All allowed file extensions of workfile.
+        full_path(bool): Full path to file is returned if set to True.
+
+    Returns:
+        str: Last or first workfile as filename of full path to filename.
+    """
+    filename, version = get_last_workfile_with_version(
+        workdir, file_template, fill_data, extensions
+    )
+    if filename is None:
+        data = copy.deepcopy(fill_data)
+        data["version"] = 1
+        data.pop("comment", None)
+        if not data.get("ext"):
+            data["ext"] = extensions[0]
+        filename = StringTemplate.format_strict_template(file_template, data)
+
+    if full_path:
+        return os.path.normpath(os.path.join(workdir, filename))
+
+    return filename
```
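The newly added `get_last_workfile_with_version` above turns the workfile name template into a regex (the `{version}` key becomes a capture group, optional `<...>` parts become non-greedy wildcards) and scans the work directory for the highest captured version. A trimmed, self-contained sketch of the same idea, using a fixed `<head>_v###<ext>` naming scheme instead of the full template syntax (illustration only, not the OpenPype implementation):

```python
import re

def last_version(filenames, head, ext):
    """Find the highest version among names like '<head>_v###<ext>'."""
    pattern = re.compile(
        re.escape(head) + r"_v([0-9]+)" + re.escape(ext) + r"$"
    )
    version = None
    last_name = None
    for name in sorted(filenames):
        match = pattern.match(name)
        if not match:
            continue  # not a workfile of this task
        file_version = int(match.group(1))
        if version is None or file_version > version:
            version = file_version
            last_name = name
    return last_name, version
```

As in the real function, `(None, None)` signals that no matching workfile exists yet, which lets a caller fall back to creating version 1.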
@ -5,23 +5,30 @@ import glob
|
|||
import clique
|
||||
import collections
|
||||
|
||||
from .path_templates import (
|
||||
StringTemplate,
|
||||
TemplateUnsolved,
|
||||
)
|
||||
|
||||
|
||||
def collect_frames(files):
|
||||
"""
|
||||
Returns dict of source path and its frame, if from sequence
|
||||
|
||||
Uses clique as most precise solution
|
||||
Uses clique as most precise solution, used when anatomy template that
|
||||
created files is not known.
|
||||
|
||||
Assumption is that frames are separated by '.', negative frames are not
|
||||
allowed.
|
||||
|
||||
Args:
|
||||
files(list) or (set with single value): list of source paths
|
||||
Returns:
|
||||
(dict): {'/asset/subset_v001.0001.png': '0001', ....}
|
||||
"""
|
||||
collections, remainder = clique.assemble(files, minimum_items=1)
|
||||
|
||||
real_file_name = None
|
||||
if len(files) == 1:
|
||||
real_file_name = list(files)[0]
|
||||
patterns = [clique.PATTERNS["frames"]]
|
||||
collections, remainder = clique.assemble(files, minimum_items=1,
|
||||
patterns=patterns)
|
||||
|
||||
sources_and_frames = {}
|
||||
if collections:
|
||||
|
|
@@ -29,14 +36,6 @@ def collect_frames(files):
            src_head = collection.head
            src_tail = collection.tail

            if src_head.endswith("_v"):
                # print("Collection gathered incorrectly, not a sequence "
                #       "just a version found in {}".format(files))
                if len(collections) > 1:
                    continue
                else:
                    return {real_file_name: None}

            for index in collection.indexes:
                src_frame = collection.format("{padding}") % index
                src_file_name = "{}{}{}".format(src_head, src_frame,
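The frame-collection idea above can be illustrated without clique. The sketch below is a hypothetical regex-based stand-in for `collect_frames` (not the code from this diff) that follows the same documented assumptions: frames are separated by `.` and a path without a frame maps to `None`.

```python
import re


def collect_frames_sketch(files):
    """Map each source path to its frame token, or None when no frame exists.

    Illustrative stand-in for the clique-based collect_frames above;
    assumes frames are separated by "." and are non-negative.
    """
    frame_re = re.compile(r"\.(\d+)\.\w+$")
    output = {}
    for path in files:
        match = frame_re.search(path)
        output[path] = match.group(1) if match else None
    return output


print(collect_frames_sketch(["/asset/subset_v001.0001.png"]))
# → {'/asset/subset_v001.0001.png': '0001'}
```

Note how the `_v001` version suffix is not mistaken for a frame, which is exactly the edge case the `src_head.endswith("_v")` guard in the hunk above handles for clique.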
@@ -58,8 +57,6 @@ def sizeof_fmt(num, suffix='B'):


 def path_from_representation(representation, anatomy):
-    from avalon import pipeline  # safer importing
-
     try:
         template = representation["data"]["template"]
@@ -69,12 +66,10 @@ def path_from_representation(representation, anatomy):
     try:
         context = representation["context"]
         context["root"] = anatomy.roots
-        path = pipeline.format_template_with_optional_keys(
-            context, template
-        )
-        path = os.path.normpath(path.replace("/", "\\"))
+        path = StringTemplate.format_strict_template(template, context)
+        return os.path.normpath(path)

-    except KeyError:
+    except TemplateUnsolved:
         # Template references unavailable data
         return None
@@ -83,15 +78,14 @@ def path_from_representation(representation, anatomy):

 def copy_file(src_path, dst_path):
     """Hardlink file if possible(to save space), copy if not"""
-    from avalon.vendor import filelink  # safer importing
+    from openpype.lib import create_hard_link  # safer importing

     if os.path.exists(dst_path):
         return
     try:
-        filelink.create(
+        create_hard_link(
             src_path,
-            dst_path,
-            filelink.HARDLINK
+            dst_path
         )
     except OSError:
         shutil.copyfile(src_path, dst_path)

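The hardlink-or-copy pattern in `copy_file` above can be sketched with the standard library alone; on POSIX systems a hard-link helper like `create_hard_link` boils down to `os.link`. This is a minimal illustration, not the OpenPype implementation:

```python
import os
import shutil


def copy_file_sketch(src_path, dst_path):
    """Hardlink if possible (saves space), copy otherwise.

    Mirrors the structure of copy_file in the hunk above: an existing
    destination is left untouched, and any OSError from linking
    (cross-device link, unsupported filesystem) falls back to a copy.
    """
    if os.path.exists(dst_path):
        return
    try:
        os.link(src_path, dst_path)  # hard link shares the same inode
    except OSError:
        shutil.copyfile(src_path, dst_path)
```

Either branch leaves `dst_path` with identical content; only the storage cost differs.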
@@ -99,6 +99,10 @@ class PypeStreamHandler(logging.StreamHandler):
             self.flush()
         except (KeyboardInterrupt, SystemExit):
             raise
+
+        except OSError:
+            self.handleError(record)
+
         except Exception:
             print(repr(record))
             self.handleError(record)
@@ -228,7 +232,7 @@ class PypeLogger:

         logger = logging.getLogger(name or "__main__")

-        if cls.pype_debug > 1:
+        if cls.pype_debug > 0:
             logger.setLevel(logging.DEBUG)
         else:
             logger.setLevel(logging.INFO)

@@ -1,413 +0,0 @@
[deleted file: 413 lines of ASCII-art frames for the animated "PYPE Club" splash logo; the frame-by-frame art carries no further information and is omitted here]
@@ -1,43 +0,0 @@
-# -*- coding: utf-8 -*-
-"""Pype terminal animation."""
-import blessed
-from pathlib import Path
-from time import sleep
-
-NO_TERMINAL = False
-
-try:
-    term = blessed.Terminal()
-except AttributeError:
-    # this happens when blessed cannot find proper terminal.
-    # If so, skip printing ascii art animation.
-    NO_TERMINAL = True
-
-
-def play_animation():
-    """Play ASCII art Pype animation."""
-    if NO_TERMINAL:
-        return
-    print(term.home + term.clear)
-    frame_size = 7
-    splash_file = Path(__file__).parent / "splash.txt"
-    with splash_file.open("r") as sf:
-        animation = sf.readlines()
-
-    animation_length = int(len(animation) / frame_size)
-    current_frame = 0
-    for _ in range(animation_length):
-        frame = "".join(
-            scanline
-            for y, scanline in enumerate(
-                animation[current_frame: current_frame + frame_size]
-            )
-        )
-
-        with term.location(0, 0):
-            # term.aquamarine3_bold(frame)
-            print(f"{term.bold}{term.aquamarine3}{frame}{term.normal}")
-
-        sleep(0.02)
-        current_frame += frame_size
-    print(term.move_y(7))
@@ -1,15 +1,18 @@
 import os
 import re
 import logging
 import json
+import collections
 import tempfile
 import subprocess

+import xml.etree.ElementTree
+
 from .execute import run_subprocess
 from .vendor_bin_utils import (
     get_ffmpeg_tool_path,
     get_oiio_tools_path,
-    is_oiio_supported
+    is_oiio_supported,
 )

 # Max length of string that is supported by ffmpeg
@@ -483,3 +486,290 @@ def convert_for_ffmpeg(

     logger.debug("Conversion command: {}".format(" ".join(oiio_cmd)))
     run_subprocess(oiio_cmd, logger=logger)
+
+
+# FFMPEG functions
+def get_ffprobe_data(path_to_file, logger=None):
+    """Load data about entered filepath via ffprobe.
+
+    Args:
+        path_to_file (str): absolute path
+        logger (logging.Logger): injected logger, if empty new is created
+    """
+    if not logger:
+        logger = logging.getLogger(__name__)
+    logger.info(
+        "Getting information about input \"{}\".".format(path_to_file)
+    )
+    args = [
+        get_ffmpeg_tool_path("ffprobe"),
+        "-hide_banner",
+        "-loglevel", "fatal",
+        "-show_error",
+        "-show_format",
+        "-show_streams",
+        "-show_programs",
+        "-show_chapters",
+        "-show_private_data",
+        "-print_format", "json",
+        path_to_file
+    ]
+
+    logger.debug("FFprobe command: {}".format(
+        subprocess.list2cmdline(args)
+    ))
+    popen = subprocess.Popen(
+        args,
+        stdout=subprocess.PIPE,
+        stderr=subprocess.PIPE
+    )
+
+    popen_stdout, popen_stderr = popen.communicate()
+    if popen_stdout:
+        logger.debug("FFprobe stdout:\n{}".format(
+            popen_stdout.decode("utf-8")
+        ))
+
+    if popen_stderr:
+        logger.warning("FFprobe stderr:\n{}".format(
+            popen_stderr.decode("utf-8")
+        ))
+
+    return json.loads(popen_stdout)
+
+
+def get_ffprobe_streams(path_to_file, logger=None):
+    """Load streams from entered filepath via ffprobe.
+
+    Args:
+        path_to_file (str): absolute path
+        logger (logging.Logger): injected logger, if empty new is created
+    """
+    return get_ffprobe_data(path_to_file, logger)["streams"]
+
+
+def get_ffmpeg_format_args(ffprobe_data, source_ffmpeg_cmd=None):
+    """Copy format from input metadata for output.
+
+    Args:
+        ffprobe_data(dict): Data received from ffprobe.
+        source_ffmpeg_cmd(str): Command that created input if available.
+    """
+    input_format = ffprobe_data.get("format") or {}
+    if input_format.get("format_name") == "mxf":
+        return _ffmpeg_mxf_format_args(ffprobe_data, source_ffmpeg_cmd)
+    return []
+
+
+def _ffmpeg_mxf_format_args(ffprobe_data, source_ffmpeg_cmd):
+    input_format = ffprobe_data["format"]
+    format_tags = input_format.get("tags") or {}
+    product_name = format_tags.get("product_name") or ""
+    output = []
+    if "opatom" in product_name.lower():
+        output.extend(["-f", "mxf_opatom"])
+    return output
+
+
+def get_ffmpeg_codec_args(ffprobe_data, source_ffmpeg_cmd=None, logger=None):
+    """Copy codec from input metadata for output.
+
+    Args:
+        ffprobe_data(dict): Data received from ffprobe.
+        source_ffmpeg_cmd(str): Command that created input if available.
+    """
+    if logger is None:
+        logger = logging.getLogger(__name__)
+
+    video_stream = None
+    no_audio_stream = None
+    for stream in ffprobe_data["streams"]:
+        codec_type = stream["codec_type"]
+        if codec_type == "video":
+            video_stream = stream
+            break
+        elif no_audio_stream is None and codec_type != "audio":
+            no_audio_stream = stream
+
+    if video_stream is None:
+        if no_audio_stream is None:
+            logger.warning(
+                "Couldn't find stream that is not an audio file."
+            )
+            return []
+        logger.info(
+            "Didn't find video stream. Using first non audio stream."
+        )
+        video_stream = no_audio_stream

+    codec_name = video_stream.get("codec_name")
+    # Codec "prores"
+    if codec_name == "prores":
+        return _ffmpeg_prores_codec_args(video_stream, source_ffmpeg_cmd)
+
+    # Codec "h264"
+    if codec_name == "h264":
+        return _ffmpeg_h264_codec_args(video_stream, source_ffmpeg_cmd)
+
+    # Coded DNxHD
+    if codec_name == "dnxhd":
+        return _ffmpeg_dnxhd_codec_args(video_stream, source_ffmpeg_cmd)
+
+    output = []
+    if codec_name:
+        output.extend(["-codec:v", codec_name])
+
+    bit_rate = video_stream.get("bit_rate")
+    if bit_rate:
+        output.extend(["-b:v", bit_rate])
+
+    pix_fmt = video_stream.get("pix_fmt")
+    if pix_fmt:
+        output.extend(["-pix_fmt", pix_fmt])
+
+    output.extend(["-g", "1"])
+
+    return output
+
+
+def _ffmpeg_prores_codec_args(stream_data, source_ffmpeg_cmd):
+    output = []
+
+    tags = stream_data.get("tags") or {}
+    encoder = tags.get("encoder") or ""
+    if encoder.endswith("prores_ks"):
+        codec_name = "prores_ks"
+
+    elif encoder.endswith("prores_aw"):
+        codec_name = "prores_aw"
+
+    else:
+        codec_name = "prores"
+
+    output.extend(["-codec:v", codec_name])
+
+    pix_fmt = stream_data.get("pix_fmt")
+    if pix_fmt:
+        output.extend(["-pix_fmt", pix_fmt])
+
+    # Rest of arguments is prores_kw specific
+    if codec_name == "prores_ks":
+        codec_tag_to_profile_map = {
+            "apco": "proxy",
+            "apcs": "lt",
+            "apcn": "standard",
+            "apch": "hq",
+            "ap4h": "4444",
+            "ap4x": "4444xq"
+        }
+        codec_tag_str = stream_data.get("codec_tag_string")
+        if codec_tag_str:
+            profile = codec_tag_to_profile_map.get(codec_tag_str)
+            if profile:
+                output.extend(["-profile:v", profile])
+
+    return output
+
+
+def _ffmpeg_h264_codec_args(stream_data, source_ffmpeg_cmd):
+    output = ["-codec:v", "h264"]
+
+    # Use arguments from source if are available source arguments
+    if source_ffmpeg_cmd:
+        copy_args = (
+            "-crf",
+            "-b:v", "-vb",
+            "-minrate", "-minrate:",
+            "-maxrate", "-maxrate:",
+            "-bufsize", "-bufsize:"
+        )
+        args = source_ffmpeg_cmd.split(" ")
+        for idx, arg in enumerate(args):
+            if arg in copy_args:
+                output.extend([arg, args[idx + 1]])
+
+    pix_fmt = stream_data.get("pix_fmt")
+    if pix_fmt:
+        output.extend(["-pix_fmt", pix_fmt])
+
+    output.extend(["-intra"])
+    output.extend(["-g", "1"])
+
+    return output
+
+
+def _ffmpeg_dnxhd_codec_args(stream_data, source_ffmpeg_cmd):
+    output = ["-codec:v", "dnxhd"]
+
+    # Use source profile (profiles in metadata are not usable in args directly)
+    profile = stream_data.get("profile") or ""
+    # Lower profile and replace space with underscore
+    cleaned_profile = profile.lower().replace(" ", "_")
+
+    # TODO validate this statement
+    # Looks like using 'dnxhd' profile must have set bit rate and in that case
+    # should be used bitrate from source.
+    # - related attributes 'bit_rate_defined', 'bit_rate_must_be_defined'
+    bit_rate_must_be_defined = True
+    dnx_profiles = {
+        "dnxhd",
+        "dnxhr_lb",
+        "dnxhr_sq",
+        "dnxhr_hq",
+        "dnxhr_hqx",
+        "dnxhr_444"
+    }
+    if cleaned_profile in dnx_profiles:
+        if cleaned_profile != "dnxhd":
+            bit_rate_must_be_defined = False
+        output.extend(["-profile:v", cleaned_profile])
+
+    pix_fmt = stream_data.get("pix_fmt")
+    if pix_fmt:
+        output.extend(["-pix_fmt", pix_fmt])
+
+    # Use arguments from source if are available source arguments
+    bit_rate_defined = False
+    if source_ffmpeg_cmd:
+        # Define bitrate arguments
+        bit_rate_args = ("-b:v", "-vb",)
+        # Seprate the two variables in case something else should be copied
+        # from source command
+        copy_args = []
+        copy_args.extend(bit_rate_args)
+
+        args = source_ffmpeg_cmd.split(" ")
+        for idx, arg in enumerate(args):
+            if arg in copy_args:
+                if arg in bit_rate_args:
+                    bit_rate_defined = True
+                output.extend([arg, args[idx + 1]])
+
+    # Add bitrate if needed
+    if bit_rate_must_be_defined and not bit_rate_defined:
+        src_bit_rate = stream_data.get("bit_rate")
+        if src_bit_rate:
+            output.extend(["-b:v", src_bit_rate])
+
+    output.extend(["-g", "1"])
+    return output
+
+
+def convert_ffprobe_fps_value(str_value):
+    """Returns (str) value of fps from ffprobe frame format (120/1)"""
+    if str_value == "0/0":
+        print("WARNING: Source has \"r_frame_rate\" value set to \"0/0\".")
+        return "Unknown"
+
+    items = str_value.split("/")
+    if len(items) == 1:
+        fps = float(items[0])
+
+    elif len(items) == 2:
+        fps = float(items[0]) / float(items[1])
+
+    # Check if fps is integer or float number
+    if int(fps) == fps:
+        fps = int(fps)
+
+    return str(fps)

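The new `convert_ffprobe_fps_value` helper turns ffprobe's rational `r_frame_rate` strings into readable values. A standalone restatement of its logic (the diff's function minus the warning print) shows the behaviour on common inputs:

```python
def convert_ffprobe_fps_value_sketch(str_value):
    """Convert ffprobe's rational frame rate (e.g. "30000/1001") to a string.

    Mirrors convert_ffprobe_fps_value from the hunk above.
    """
    if str_value == "0/0":
        return "Unknown"

    items = str_value.split("/")
    if len(items) == 1:
        fps = float(items[0])
    elif len(items) == 2:
        fps = float(items[0]) / float(items[1])

    # Collapse whole numbers like 24.0 to "24"
    if int(fps) == fps:
        fps = int(fps)
    return str(fps)


print(convert_ffprobe_fps_value_sketch("24/1"))        # → 24
print(convert_ffprobe_fps_value_sketch("30000/1001"))  # NTSC rate, ~29.97
```

Note that a value with more than two `/`-separated parts would leave `fps` unbound; ffprobe only ever emits `num/den` rationals, so the original accepts that limitation.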
@@ -315,7 +315,7 @@ def get_usd_master_path(asset, subset, representation):
     )
     template = project["config"]["template"]["publish"]

-    if isinstance(asset, dict) and "silo" in asset and "name" in asset:
+    if isinstance(asset, dict) and "name" in asset:
         # Allow explicitly passing asset document
         asset_doc = asset
     else:
@@ -325,7 +325,6 @@ def get_usd_master_path(asset, subset, representation):
         **{
             "root": api.registered_root(),
             "project": api.Session["AVALON_PROJECT"],
-            "silo": asset_doc["silo"],
             "asset": asset_doc["name"],
             "subset": subset,
             "representation": representation,

@@ -1,8 +1,6 @@
 import os
-import logging
 import json
 import platform
 import subprocess

-log = logging.getLogger("Vendor utils")

@@ -138,56 +136,6 @@ def get_ffmpeg_tool_path(tool="ffmpeg"):
     return find_executable(os.path.join(ffmpeg_dir, tool))


-def ffprobe_streams(path_to_file, logger=None):
-    """Load streams from entered filepath via ffprobe.
-
-    Args:
-        path_to_file (str): absolute path
-        logger (logging.getLogger): injected logger, if empty new is created
-
-    """
-    if not logger:
-        logger = log
-    logger.info(
-        "Getting information about input \"{}\".".format(path_to_file)
-    )
-    args = [
-        get_ffmpeg_tool_path("ffprobe"),
-        "-hide_banner",
-        "-loglevel", "fatal",
-        "-show_error",
-        "-show_format",
-        "-show_streams",
-        "-show_programs",
-        "-show_chapters",
-        "-show_private_data",
-        "-print_format", "json",
-        path_to_file
-    ]
-
-    logger.debug("FFprobe command: {}".format(
-        subprocess.list2cmdline(args)
-    ))
-    popen = subprocess.Popen(
-        args,
-        stdout=subprocess.PIPE,
-        stderr=subprocess.PIPE
-    )
-
-    popen_stdout, popen_stderr = popen.communicate()
-    if popen_stdout:
-        logger.debug("FFprobe stdout:\n{}".format(
-            popen_stdout.decode("utf-8")
-        ))
-
-    if popen_stderr:
-        logger.warning("FFprobe stderr:\n{}".format(
-            popen_stderr.decode("utf-8")
-        ))
-
-    return json.loads(popen_stdout)["streams"]
-
-
 def is_oiio_supported():
     """Checks if oiiotool is configured for this platform.
@@ -204,23 +152,3 @@ def is_oiio_supported():
         ))
         return False
     return True
-
-
-def get_fps(str_value):
-    """Returns (str) value of fps from ffprobe frame format (120/1)"""
-    if str_value == "0/0":
-        print("WARNING: Source has \"r_frame_rate\" value set to \"0/0\".")
-        return "Unknown"
-
-    items = str_value.split("/")
-    if len(items) == 1:
-        fps = float(items[0])
-
-    elif len(items) == 2:
-        fps = float(items[0]) / float(items[1])
-
-    # Check if fps is integer or float number
-    if int(fps) == fps:
-        fps = int(fps)
-
-    return str(fps)

@@ -236,6 +236,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
         environment["OPENPYPE_MONGO"] = mongo_url

         args = [
+            "--headless",
             'publish',
             roothless_metadata_path,
             "--targets", "deadline",
@@ -601,13 +602,23 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
                 "files": os.path.basename(remainder),
                 "stagingDir": os.path.dirname(remainder),
             }
-            representations.append(rep)
             if "render" in instance.get("families"):
                 rep.update({
                     "fps": instance.get("fps"),
                     "tags": ["review"]
                 })
-            self._solve_families(instance, True)
+                self._solve_families(instance, True)
+
+            already_there = False
+            for repre in instance.get("representations", []):
+                # might be added explicitly before by publish_on_farm
+                already_there = repre.get("files") == rep["files"]
+                if already_there:
+                    self.log.debug("repre {} already_there".format(repre))
+                    break
+
+            if not already_there:
+                representations.append(rep)

         return representations

@@ -46,6 +46,7 @@ def inject_openpype_environment(deadlinePlugin):

         args = [
             openpype_app,
+            "--headless",
             'extractenvironments',
             export_url
         ]

@@ -199,8 +199,10 @@ class SyncToAvalonEvent(BaseEvent):
             if proj:
                 ftrack_id = proj["data"].get("ftrackId")
                 if ftrack_id is None:
-                    ftrack_id = self._update_project_ftrack_id()
-                    proj["data"]["ftrackId"] = ftrack_id
-                self._avalon_ents_by_ftrack_id[ftrack_id] = proj
+                    self.handle_missing_ftrack_id(proj)
+                    ftrack_id = proj["data"]["ftrackId"]
+
+                self._avalon_ents_by_ftrack_id[ftrack_id] = proj
+
             for ent in ents:
                 ftrack_id = ent["data"].get("ftrackId")
@@ -209,15 +211,78 @@ class SyncToAvalonEvent(BaseEvent):
             self._avalon_ents_by_ftrack_id[ftrack_id] = ent
         return self._avalon_ents_by_ftrack_id

-    def _update_project_ftrack_id(self):
-        ftrack_id = self.cur_project["id"]
+    def handle_missing_ftrack_id(self, doc):
+        # TODO handling of missing ftrack id is primarily issue of editorial
+        #   publishing it would be better to find out what causes that
+        #   ftrack id is removed during the publishing
+        ftrack_id = doc["data"].get("ftrackId")
+        if ftrack_id is not None:
+            return
+
+        if doc["type"] == "project":
+            ftrack_id = self.cur_project["id"]
+
+            self.dbcon.update_one(
+                {"type": "project"},
+                {"$set": {
+                    "data.ftrackId": ftrack_id,
+                    "data.entityType": self.cur_project.entity_type
+                }}
+            )
+
+            doc["data"]["ftrackId"] = ftrack_id
+            doc["data"]["entityType"] = self.cur_project.entity_type
+            self.log.info("Updated ftrack id of project \"{}\"".format(
+                self.cur_project["full_name"]
+            ))
+            return
+
+        if doc["type"] != "asset":
+            return
+
+        doc_parents = doc.get("data", {}).get("parents")
+        if doc_parents is None:
+            return
+
+        entities = self.process_session.query((
+            "select id, link from TypedContext"
+            " where project_id is \"{}\" and name is \"{}\""
+        ).format(self.cur_project["id"], doc["name"])).all()
+        self.log.info("Entities: {}".format(str(entities)))
+        matching_entity = None
+        for entity in entities:
+            parents = []
+            for item in entity["link"]:
+                if item["id"] == entity["id"]:
+                    break
+                low_type = item["type"].lower()
+                if low_type == "typedcontext":
+                    parents.append(item["name"])
+            if doc_parents == parents:
+                matching_entity = entity
+                break
+
+        if matching_entity is None:
+            return
+
+        ftrack_id = matching_entity["id"]
         self.dbcon.update_one(
-            {"type": "project"},
-            {"$set": {"data.ftrackId": ftrack_id}}
+            {"_id": doc["_id"]},
+            {"$set": {
+                "data.ftrackId": ftrack_id,
+                "data.entityType": matching_entity.entity_type
+            }}
         )
+        doc["data"]["ftrackId"] = ftrack_id
+        doc["data"]["entityType"] = matching_entity.entity_type

-        return ftrack_id
+        entity_path_items = []
+        for item in entity["link"]:
+            entity_path_items.append(item["name"])
+        self.log.info("Updated ftrack id of entity \"{}\"".format(
+            "/".join(entity_path_items)
+        ))
+        self._avalon_ents_by_ftrack_id[ftrack_id] = doc

     @property
     def avalon_subsets_by_parents(self):
@@ -857,7 +922,14 @@ class SyncToAvalonEvent(BaseEvent):
             if vis_par is None:
                 vis_par = proj["_id"]
             parent_ent = self.avalon_ents_by_id[vis_par]
-            parent_ftrack_id = parent_ent["data"]["ftrackId"]
+
+            parent_ftrack_id = parent_ent["data"].get("ftrackId")
+            if parent_ftrack_id is None:
+                self.handle_missing_ftrack_id(parent_ent)
+                parent_ftrack_id = parent_ent["data"].get("ftrackId")
+                if parent_ftrack_id is None:
+                    continue
+
             parent_ftrack_ent = self.ftrack_ents_by_id.get(
                 parent_ftrack_id
             )
@@ -2128,7 +2200,13 @@ class SyncToAvalonEvent(BaseEvent):
             vis_par = avalon_ent["parent"]

         parent_ent = self.avalon_ents_by_id[vis_par]
-        parent_ftrack_id = parent_ent["data"]["ftrackId"]
+        parent_ftrack_id = parent_ent["data"].get("ftrackId")
+        if parent_ftrack_id is None:
+            self.handle_missing_ftrack_id(parent_ent)
+            parent_ftrack_id = parent_ent["data"].get("ftrackId")
+            if parent_ftrack_id is None:
+                continue
+
         if parent_ftrack_id not in entities_dict:
             entities_dict[parent_ftrack_id] = {
                 "children": [],

@@ -87,8 +87,8 @@ class UserAssigmentEvent(BaseEvent):
         if not user_id:
             return None, None

-        task = session.query('Task where id is "{}"'.format(task_id)).one()
-        user = session.query('User where id is "{}"'.format(user_id)).one()
+        task = session.query('Task where id is "{}"'.format(task_id)).first()
+        user = session.query('User where id is "{}"'.format(user_id)).first()

         return task, user
Some files were not shown because too many files have changed in this diff.