Local/Global Publish Scripts

1. Global publish scripts: scripts you write on bs/applications (columns: pre_publish, pre_checkin and post_publish). These run first and are global to all local and remote users.

2. Local publish scripts: local scripts found in <app>/publish_scripts. Any number of scripts can be provided for any phase by using the pattern: pre_publish<_anything>.py, pre_checkin<_anything>.py, post_publish<_anything>.py

These run after the global publish scripts and are useful for doing custom work on a specific user/vendor side, like injecting our pipeline into another pipeline a vendor is using, or doing custom file transfer operations before/after publishing, etc.
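
As a sketch of how the phase naming pattern above could map onto script discovery (the helper name and exact matching rules are assumptions; the real publisher may differ), it is essentially a glob per phase:

```python
import glob
import os

def find_local_publish_scripts(app_dir, phase):
    """Return the local publish scripts for one phase, following the
    pre_publish<_anything>.py naming pattern described above.
    Hypothetical helper illustrating the pattern; not pipeline code."""
    pattern = os.path.join(app_dir, 'publish_scripts', '%s*.py' % phase)
    return sorted(glob.glob(pattern))
```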

* The pre_publish script is executed immediately after the user clicks the publish button and before the publisher generates any task files. If the pre_publish script provides some file types in self.files (as explained later), these files will not be regenerated by the publisher and the fed files will be published instead.

* The pre_checkin script is hooked up and executed immediately before the database checkin function runs.

This is useful for situations when file operations and syncing need to take place before evaluating the final checkin function.

* The post_publish script is executed after the checkin process, if the checkin process succeeded.

Some internal modules and variables are available to you at execution time of the pre/post publishing scripts, and to the pre_checkin script too. These are:

modules: os, sys, json, socket, bs_abc (maya), bs_utils, bs_styles, shaderInfo as shd (maya), shaderlib as shdUtil (maya), assets_io as io (maya), assets_xml as xml, OrderedDict from collections

Predefined variables and objects:

self.files and self.files_types: list variables. Read at the pre_publish phase, they hold all the files that will be published; read at the post_publish phase, they hold all the files that were published.

For example, you can write a pre_publish.py trigger that creates some more files, like a playblast, and appends the generated file to self.files and its file type to self.files_types; accordingly, the Task Control will publish your files along with the other internally generated files to the database.

self.bypass_internal_publish: You can set this variable to True to bypass all the internal publishing code, and use your own method for publishing everything.

Note: This will also bypass the post_publish trigger because it is not needed in such a case, since you have already branched the internal code to use your own at the pre_publish phase.

self.bypass_checkin: if True, the checkin function will not be executed (assuming you are implementing your own).

self.append_to_snapshot: False by default. If a snapshot_code is given here, the publisher will append the given/internally generated files to that snapshot instead of doing a new version publish.

use_queue: READ_ONLY. If True, all publishing operations will go through the configured Queue (Pixar Tractor, Sony OpenCue, or Deadline). Don't alter the value of this variable, but use it as it is, otherwise the internal publishing operations might all fail. If you want to override the publishing behavior so it doesn't go through Tractor, the way to do that is to run the task manager in Local mode instead of Remote mode.

rename_dir_files_on_publish: False by default. If True, the files inside the directory being published will be renamed as: <ast>_<context>…{ext}; example: anything.1001.exr renames to ast_context_subcontext.1001.exr
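
A minimal sketch of that rename rule (assuming everything before the first '.' is replaced by the <ast>_<context> prefix and the frame/extension suffix is kept; publish_rename is a hypothetical helper, not the real implementation):

```python
def publish_rename(filename, ast, context, subcontext=None):
    """Compute the published name for a file inside a published directory,
    following the rename pattern described above. Hypothetical helper."""
    prefix_parts = [ast, context]
    if subcontext:
        prefix_parts.append(subcontext)
    # keep everything after the first '.', e.g. '1001.exr'
    suffix = filename.split('.', 1)[1]
    return '%s.%s' % ('_'.join(prefix_parts), suffix)

# publish_rename('anything.1001.exr', 'ast', 'context', 'subcontext')
# -> 'ast_context_subcontext.1001.exr'
```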

self.additional_deps: None by default. Can be used to pass additional asset dependencies that need to be appended to the internally generated .dep file. The given additional_deps should be a list of dictionaries where each dict contains at least the keys: ['ast', 'context', 'version', 'category', 'type', 'ast_sk']. Generally this isn't needed, except when you publish files in the pre_publish script that the system knows nothing about; then you have to append them if they should be mentioned in the .dep file for loadDependencies and remote sync operations.
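
For illustration, an additional_deps value could be built like this (all field values below are made up; only the required key names come from the description above):

```python
REQUIRED_DEP_KEYS = {'ast', 'context', 'version', 'category', 'type', 'ast_sk'}

additional_deps = [
    {
        'ast': 'charA',                      # hypothetical asset name
        'context': 'groom',                  # hypothetical context
        'version': 4,
        'category': 'character',
        'type': 'yeti',
        'ast_sk': 'some/asset?code=charA',   # hypothetical search key
    },
]

# every dep dict must carry at least the required keys
assert all(REQUIRED_DEP_KEYS <= set(dep) for dep in additional_deps)
# in a pre_publish script you would then assign:
# self.additional_deps = additional_deps
```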

naming: READ_ONLY. This is a dictionary holding information about the virtual_snapshot (which is how the snapshot will look in the future, after it gets created and the files/dirs are published and appended to it). The dictionary looks like this:

{'asset_dir_name': asset_dir_name, 'context_dir_name': context_dir_name, 'full_path': full_path, 'versionless_full_path': versionless_basename, 'versionless_basedir': versionless_basedir, 'future_version': future_version}

self.future_version: READ_ONLY. The version the asset will take when published. Generally it's safe to use this for setting internal application file refs, since the task will never be published by two users at the same time.

self.my: an instance of the nreal_brain API

(you can also access the Tactic API from self.my.server.<TacticFactoryFunctions>)

self.project: The project code

self.client_root: the client_app folder of your pipeline installation.

ast_sk: the asset search_key of the asset owning the task currently being published.

self.proc_ctrl: the process control dict sobject of the 'process' currently being published.

self.configs: is the json.loads of the pipeline.conf file.

self.workflow: is the json.loads of the workflow.conf file.

silent_publish: when True, the publisher will avoid raising any internal dialogs and will keep going with the defaults; information messages will only be logged.

self.ui_mode: if set to False, the Task Control UI will not be initialized and Task Control will work in batch mode. This mode is useful when doing batch publishing on the Farm. Note: disabling ui_mode will automatically set silent_publish to True.

Examples:

1. The following pre_publish script will pop-up a test dialog before publishing:

if self.ui_mode:
    nr_utils.confirmDialog(self, 'Testing Pre Publish...', 'This is a pre_publish dialog')

2. The following will add the given files to the list of files to be published, in conjunction with the task's predefined publish types that the publisher will generate based on the processes_ctrl config table of the process being published:

my_files = ['/path/to/playblast.mov', '/path/to/any_dir_to_publish']
my_types = self.my.get_files_types(my_files)

self.files = self.files + my_files
self.files_types = self.files_types + my_types

3. The following will bypass the simulation (alembic cache) and use your given cache instead (assuming you already cached a cloth sim, for example, and you don't want to re-cache but just pass your cache file):

my_files = ['/path/to/clothSim_cache.abc']
my_types = self.my.get_files_types(my_files)
self.files = self.files + my_files
self.files_types = self.files_types + my_types

.

Publishing Actions

The Publishing Actions are hook-ups of code executed during the publishing process. They are executed after the pre_publish scripts and the publishing validation routines, immediately before generating the files to be published, and hence before the post_publish scripts.

The Publishing Actions are used to configure custom operations during the publish for specific processes, where the user should be free to decide and take an action. Examples of such actions: asking the user whether to proceed with GUI publishing, go for batch mode, do a batch publish on the farm for simulation tasks, or submit to render for lighting and comp tasks, etc.

In such cases, an action should be configured in the processes_ctrl for the desired process, and the action translations should be configured in applications.conf (see config/applications.conf as an example). According to applications.conf, the system will load the proper action module based on the action the user takes in response to the question the system asked (the question itself is configured in processes_ctrl as mentioned above).

So, during a task publish, the system will look for a pre-configured action for this process in the processes_ctrl table. If one is found, the system will show the configured question to the user so he can take the desired action; the user's decision then gets executed based on the pre-configured decision translation in applications.conf, where the key "actions" in the config file tells the system which module to run for each pre-defined user action.

This way, you can define as many actions as you want, without any limits, and the user can decide among them whenever he is asked to.

Note:

Each predefined action in the applications.conf represents a button the user clicks on when
he wants to take this action. The configured action question will show any configured buttons,
and the button the user clicks will run the corresponding module defined for it in
applications.conf

The publish actions question in the processes_ctrl for a process should have the following syntax:

<q?>/act-status=typ1&typ2&typ3|act2-status=typ1&typ2&typ3|actN...

where:

<Task Control Question that should be asked to the user?>/<the button
text for an answer>-<the status the task should be set to if the user took
this action>=<the asset types this action applies to, separated by '&'s>
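
To make the syntax concrete, here is a sketch of how such an action string could be parsed (parse_action is a hypothetical helper, not part of the pipeline; it assumes '/' does not appear in the question and '-' does not appear in button labels):

```python
def parse_action(action_str):
    """Split a processes_ctrl action string into the question and a list of
    {button, status, types} dicts, per the syntax described above."""
    question, _, actions_part = action_str.partition('/')
    actions = []
    for raw in actions_part.split('|'):
        raw, _, types = raw.partition('=')       # optional '=typ1&typ2...'
        button, _, status = raw.partition('-')   # optional '-status'
        actions.append({
            'button': button,
            'status': status or None,
            'types': types.split('&') if types else [],
        })
    return question, actions
```

For example, the clothSim action shown later in this chapter parses into two buttons, the second carrying status 'Processing' and types ['alembic'].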

ATTENTION::

When any publish type is provided in the action as to be processed later by the action module, the whole files checkin operation will be bypassed and all the files generated during the publish will be moved to the <sandbox>/__in_progress__/ folder. The system will only create a version snapshot reserved for this publish operation; the new file paths are passed as args to the action module. The action module is then expected to generate the types it bypassed during the publish and publish everything to the previously reserved snapshot (the snapshot_code is one of the args already sent to the action module).

NOTE:

You should not use '.' periods in the action phrase except when it's meant to
set up per-application actions, although you can use one action for all apps
without providing a period separation like the rest of the other processes
ctrl elements.

If the given action declares publish types for a button action, then these types will be ignored by the internal publisher, assuming that the action script will generate them (or process them on the farm), since you already configured the action to equal these publish types (multiple types can be given separated by '&'s as above).

Note:

The publisher will also respect separating the actions by '.' period
so you can configure a different process action for each different
application.

Optionally, you can configure applications.conf to link a script to be called when this action is taken (the user clicks on the action button). The system will then look for a translation for this button in applications.conf; if one is found, the system will call it. This can actually be your custom script to generate some publish file types that you ignored in the action syntax.
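
A sketch of that lookup-and-call step (run_action and the test config are illustrations; the real Task Control dispatch code may differ):

```python
import importlib

def run_action(actions_cfg, button, **kwargs):
    """Find the 'module.method' mapped to the clicked button in the
    applications.conf 'actions' list and call it; return None when no
    translation is configured (only the status change happens then)."""
    for entry in actions_cfg:
        if button in entry:
            mod_name, func_name = entry[button].rsplit('.', 1)
            module = importlib.import_module(mod_name)
            return getattr(module, func_name)(**kwargs)
    return None
```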

Example Process Control action on clothSim process:

This process is supported on the farm, what would you like to do?/Just
Publish|Simulate on the farm-Processing=alembic

The action above will ask the user the question above during the publish (immediately before generating the publish files). The user will then see two buttons, "Just Publish" and "Simulate on the farm". If the user clicks the "Simulate on the farm" button, the task status will be set to "Processing" and the publisher will skip the "alembic" file generation configured in the 'publish_types' column. You then have to make sure to configure applications.conf to support your action. Continuing our example, applications.conf can have an entry like this (for the maya application, for instance):

actions: [{"Simulate on the farm": "sim_module.write_cache"}]

Now, since there is an action translation for the user's button, the system will run the given Python method as:

import sim_module
sim_module.write_cache(......)

The system will pass your module all the information you need to do the job as args, so your write_cache() method should be defined with **kwargs, or should at least define all the following args that will be passed to any action module.method():

- ast_sk=self.ast_sk (the asset search key)
- context=self.context (the task context)
- asset_name=self.currentAstName (the asset name)
- category=self.category (the asset category)
- snapshot_code=<code> (the code of snapshot where the files will be published)
- task_code=<code> (the code of the task being published)
- files=<list_of_files> (the files that were generated internally by the publisher)
- files_types=<list_of_types> (the types equivalent to each given file in files)
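
A skeleton of such an action entry point could look like this (the body is a placeholder; only the arg names come from the list above):

```python
def write_cache(**kwargs):
    """Action module entry point receiving the standard Task Control args.
    Hypothetical sketch; a real module would generate the bypassed types
    and publish them to the reserved snapshot."""
    files = kwargs.get('files', [])
    files_types = kwargs.get('files_types', [])
    # one type per file, per the arg list above
    assert len(files) == len(files_types)
    # ... generate the bypassed publish types here, then publish
    # everything to kwargs['snapshot_code'] ...
    return {'snapshot_code': kwargs.get('snapshot_code'),
            'task_code': kwargs.get('task_code'),
            'files': files}
```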

In addition to all the above, remember that the action itself is executed
in the asset_publish scope as a hookup, so it has access to all the
Task Control variables like self.my (the nreal_brain API instance), and
all the other variables mentioned above in the pre publish/checkin scripts.

.

Note:

You can also use the Processes Control actions without translation
scripts in applications.conf, in cases where you just want to manage
the status the task will be set to based on the action button the user
clicked; you may then have a pre_py_trigger or post_py_trigger
that does something based on these task statuses.

.

Another example on lighting process actions:

This process supports some additional actions, what would you like
to do?/Just Publish|Submit Preview Render-Gen. Preview|Submit to
Render-Processing

Based on the action above, when the user publishes a lighting task, the Task Control will ask the user "what would you like to do?" and give the user the given three answers to choose from, where each of them is a button the user uses to answer. Depending on the user's answer, the status is grabbed and the Task Control takes action based on that answer. For instance, if the user answered the question above with "Submit to Render", the Task Control will set the published task status to 'Processing'. At this point, nothing happens beyond the status change; there is then a pre_py_trigger that submits the published recipe to the farm whenever it sees the task status become "Processing".

See the following Task Control & applications.conf chapter for more information on how the action works.

CAUTION:

Any publish script that will raise pop-ups and dialogs should test the
self.ui_mode first to avoid crashes.

For example:
    if self.ui_mode:
        # show dialog
        nr_utils.confirmDialog(self, "title...", "testing window")

.

Running Task Control in Batch Mode

You can run the application Task Control in batch mode, and do everything you would do in UI mode except the features that require showing dialogs.

As an example script for running in batch mode to publish an Asset, there is a script in maya/publish_scripts/publish_asset.py, which looks like this:

from pymel.core import *
from maya import cmds  # cmds is used below and is not star-exported by pymel

import sys
from sys import argv
import os
import json

client_root = os.getenv('NREAL_SENEFERU')
sys.path.append('%s/modules' % client_root)
sys.path.append('%s/ui_modules' % client_root)
sys.path.append('%s/maya' % client_root)
sys.path.append('%s/maya/modules' % client_root)
sys.path.append('%s/maya/modules/alembic' % client_root)
sys.path.append('%s/maya/modules/shading' % client_root)
sys.path.append('%s/maya/modules/external_scripts' % client_root)

import nr_utils
import maya_pipeline


def batch_publish(ast_file):
    kwargs = json.loads(open(ast_file).read())

    ast_code = kwargs.get('ast_sk').split('=')[-1]

    maya_scene = None
    for i, ftype in enumerate(kwargs.get('files_types')):
        if ftype == 'maya':
            maya_scene = kwargs.get('files')[i]
            break

    if maya_scene:
        cmds.file(maya_scene, force=True, open=True)

        asset_root_pattern = '%s_%s_ast*' % (ast_code, kwargs.get('context').replace('/', '_'))
        asset_roots = cmds.ls(asset_root_pattern)
        asset_root = str()
        if asset_roots:
            asset_root = asset_roots[0]

        if asset_root:
            # Initialize the Task Control
            task_control = maya_pipeline.Task_Control(asset_root=asset_root, asset_name=kwargs.get('asset_name'),
                                                      context=kwargs.get('context'), category=kwargs.get('category'),
                                                      frame_range=kwargs.get('frame_range'),
                                                      ast_sk=kwargs.get('ast_sk'),
                                                      task_code=kwargs.get('task_code'), ip=kwargs.get('ip'),
                                                      net_mode=kwargs.get('net_mode'), project=kwargs.get('project'),
                                                      manager_port=kwargs.get('manager_port'))
            # Publish the asset
            success = task_control.publish_asset()
            if success:
                nr_utils.log("[INFO]: Asset published successfully.")
        else:
            nr_utils.log("[ERROR]: We couldn't find an asset locator in the given scene", type='error')
    else:
        nr_utils.log("[ERROR]: Failed to get a maya scene to publish from, make sure your publisher is configured "
                     "to write one and pass it to self.files and self.files_types globals", type='error')


if __name__ == "__main__":
    batch_publish(argv[1])

.

The script above takes an asset information file as an arg and runs a Maya batch Task Control publishing process for the given asset information.

Practical Example on Batch Publishing

Let’s say that we want to configure the system to ask the user to take an action whenever he is publishing a clothSim task.

  1. Set up an action that asks the user if he wants to process on the farm, in the processes_ctrl actions column::

    This process supports some additional actions, what would you like to do?/Just Publish|Simulate on the farm-Processing=usd&alembic&yeti&vdb

    The action statement above asks the question during the publish; if the user chooses "Simulate on the farm", the task status is set to "Processing" and the listed types are left for the action module to generate.

  2. In applications.conf, declare an action response under the desired application settings:

    actions: [{"Simulate on the farm": "actions.batch_publish"}]

.

This tells the system that whenever the user takes the action "Simulate on the farm", it should call the method "actions.batch_publish()" with all the args about the asset (refer to the actions chapter for the complete list of passed args). So, when the user clicks "Simulate on the farm", the Task Control will execute this:

import actions;reload(actions)
actions.batch_publish(args....)

.

  3. As in the batch mode example above, you can write your action method to submit a publish in batch mode (you can modify it on your own to do a farm job). The publish action module, according to the settings above, should be a Python module called actions.py and should be in PYTHONPATH; it can look like this:

    from pymel.core import *
    
    import sys
    import os
    import json
    import nr_utils
    import nr_cue
    reload(nr_cue)
    from nr_cue import nr_cue
    from socket import gethostname
    
    
    def batch_publish(**kwargs):
        """
        creates a json act file from the incoming Task Control args, and submit a tractor batch publishing
        job that reads this file and do batch publish on the blade
        """
        data_dir = kwargs.get('in_progress_dir')
        ast_file = '%s/%s.ast' % (data_dir, kwargs.get('snapshot_code'))
    
        # Now, if you have any additional files you generated manually and would like them to be published in
        # same batch mode, inject them here to the kwargs['files'] and kwargs['files_types'] before proceeding
        # in generating the .ast file below
    
        # Adding more files:
        # kwargs['files'] = kwargs['files'] + ['/path/to/playblast.mov', '/path/to/render/previews.exr']
        # kwargs['files_types'] = kwargs['files_types'] + ['media', 'media']  # refer to mime.conf for more info about types
    
    
        # Generate .ast file (needed by the publisher script on the farm blades)
        # Generate .ast file (needed by the publisher script on the farm blades);
        # the with block closes the file automatically
        with open(ast_file, 'w') as ast_file_io:
            act_data = json.dumps(kwargs, separators=(',', ': '), sort_keys=False, indent=4)
            ast_file_io.writelines(act_data)
        print '[INFO]: Action Generated %s' % ast_file
    
    
        # Now, submit a tractor job should look like this:
        # mayapy $NREAL_SENEFERU/maya/publish_scripts/batch_publish.py /path/to/ast_file.ast
        #
        # Note: the batch_publish.py script in this command is a complete Task Control publish, it will pick up
        #       any new files injected above and it will automatically bypass the generation of any files provided
        #       by the parent caller publish process.
        #       It will also take care of how to inject the files at the end of the process considering if this
        #       is a Local or Remote publish operation.
        client_root = os.getenv('NREAL_SENEFERU')
        configs = nr_utils.get_project_config('pipeline', kwargs.get('project'))
        tractor_engine = configs['TRACTOR']['ENGINE']
        service = configs['TRACTOR']['SERVICE']  # this gets the default group of blades to be used for the process
        publish_script = '%s/maya/publish_scripts/batch_publish.py' % client_root
    
        job_file = '%s/%s.alf' % (data_dir, kwargs.get('snapshot_code'))
        job_title = '%s_%s_publishing' % (kwargs.get('asset_name'), kwargs.get('context').replace('/', '_'))
        task_title = "Publishing"
    
        # batch cmd syntax:
        # mayapy /path/to/actions.py /path/to/the_args_file.ast # the 'ast_file' file generated above
        tractor = nr_cue(engine=tractor_engine, spoolhost=str(gethostname()), spoolfile=job_file, project=kwargs.get('project'))
        jid = tractor.batch_process(job_title=job_title, tasks=[(task_title, ["mayapy", publish_script, ast_file])], tasks_service=service)
    

.

The action above will be taken based on the user's response to the Action question during the publish, as:

import actions;reload(actions)
actions.batch_publish(args....)

Then, according to your action script above, the script batches a command to Tractor, which uses the previous batch mode publisher script (batch_publish.py) to do the batch publishing.