# 1 20-05-2015 , 07:29 PM
EduSciVis-er
Join Date: Dec 2005
Location: Toronto
Posts: 3,374

Backburner Concurrent Jobs

Any Backburner users here? I've been setting up a render farm for my department and have cleared most of the obstacles so far. Students are submitting renders and getting frames back (sometimes).

One thing I haven't been able to determine yet is how to get Backburner to distribute tasks from multiple jobs concurrently. There is a manager setting for "Max concurrent jobs", so it seems to me that, given three jobs submitted at about the same time with the same priority, it should send out some tasks from the first, then some from the second, then some from the third, and carry on like that. Currently it seems (albeit with a small sample size) that it entirely completes one job before moving on to the next.

PS I know Genny has done some work extending backburner; I may try to install the script/mod on the render machines, as it seems like it could be very useful.

# 2 21-05-2015 , 03:36 PM
NextDesign's Avatar
Technical Director
Join Date: Feb 2004
Posts: 2,988
How are you submitting the jobs? If you're submitting through a shell script, you can background the process, which lets several submissions run at the same time. You just have to be careful about overloading the machine. On Windows I believe you can do this: START /B programName.
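Just to sketch the idea (the cmdjob call, its flags, and the scene names below are placeholders I haven't verified against your Backburner version, so treat this as a rough outline rather than working code), a Python script could launch several submissions without waiting on each one:

Code:
import subprocess

# Hypothetical scene list; swap in whatever you are actually submitting.
scenes = ["shotA.mb", "shotB.mb", "shotC.mb"]

procs = []
for scene in scenes:
    # Popen returns immediately, so every submission runs at the same time,
    # the same idea as appending "&" in a shell script or START /B on Windows.
    procs.append(subprocess.Popen(["cmdjob", "-jobName", scene, "Render", scene]))

# Wait for all of the submission processes to exit before the script ends.
for p in procs:
    p.wait()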

I think you also misread the name of that setting. It's "Max Concurrent Assignments", not jobs. This is the number of assignments the Backburner manager can output at once, not the number of assignments it will give for each machine.


Imagination is more important than knowledge.

Last edited by NextDesign; 21-05-2015 at 03:51 PM.
# 3 21-05-2015 , 03:41 PM
EduSciVis-er
Join Date: Dec 2005
Location: Toronto
Posts: 3,374
The jobs are being submitted through the Create Backburner Job... dialog in Maya. We're using Macs, though not the command line (although Maya generates a CmdJob command when it submits). I know what you mean by putting the process in the background, but I'm not sure that applies in this situation: the Backburner manager sends out tasks to available render machines through its own internal logic. I don't want a render machine working on multiple tasks at once; however, it would be nice for the manager to send out tasks from multiple jobs, if that makes sense.

# 4 21-05-2015 , 03:49 PM
NextDesign's Avatar
Technical Director
Join Date: Feb 2004
Posts: 2,988
So you want it to assign jobs to a machine before it's ready to run them? I'm not sure that's possible, as certain frames may take longer than others to render. If one machine finished before another but didn't have an assignment, because it had already been given to another machine that was still working, you would have an idle node. Assigning on demand gets around this issue.


Imagination is more important than knowledge.

Last edited by NextDesign; 21-05-2015 at 03:52 PM.
# 5 21-05-2015 , 03:55 PM
EduSciVis-er
Join Date: Dec 2005
Location: Toronto
Posts: 3,374
So take an example of three jobs (submitted one after the other with equal priority), each with four tasks (say, frames), and two available render machines:
Job 1: 01, 02, 03, 04
Job 2: 01, 02, 03, 04
Job 3: 01, 02, 03, 04

Renderer A takes Job1_task01
Renderer B takes Job1_task02

They both finish, and are available to receive another task each. We now have:
Job 1: 03, 04
Job 2: 01, 02, 03, 04
Job 3: 01, 02, 03, 04

Currently, the manager will send out the remaining two tasks from Job 1. I would rather it send out the first task from each of Job 2 and Job 3, so that the jobs finish at about the same time, rather than Job 1 finishing before Jobs 2 and 3 even begin.
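To make the comparison concrete, here's a toy Python sketch of the two dispatch orders. It's purely illustrative of what I mean, not anything Backburner actually runs:

Code:
from collections import deque

# Three jobs of four frames each, as in the example above.
jobs = {f"Job{j}": deque(range(1, 5)) for j in (1, 2, 3)}

def depth_first(jobs):
    """What the manager appears to do: drain Job 1, then Job 2, then Job 3."""
    order = []
    for name, tasks in jobs.items():
        while tasks:
            order.append(f"{name}_task{tasks.popleft():02d}")
    return order

def round_robin(jobs):
    """What I'd prefer: take one task at a time from every job that still has work."""
    order = []
    while any(jobs.values()):
        for name, tasks in jobs.items():
            if tasks:
                order.append(f"{name}_task{tasks.popleft():02d}")
    return order

print(depth_first({k: deque(v) for k, v in jobs.items()}))
print(round_robin({k: deque(v) for k, v in jobs.items()}))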

# 6 21-05-2015 , 04:08 PM
NextDesign's Avatar
Technical Director
Join Date: Feb 2004
Posts: 2,988
Ah, I see. I'm not sure if that's possible. You could, however, possibly abuse the system:

1) Submit each frame as a separate paused job.
2) Assign each job a decreasing priority number.
3) Un-pause the jobs.

So you would have priorities assigned in the following way:

Code:
Job1_task01: 100
Job1_task02: 75
Job1_task03: 50
Job1_task04: 25

Job2_task01: 100
Job2_task02: 75
Job2_task03: 50
Job2_task04: 25

...
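If you wanted to script that rather than set it up by hand, something like the Python sketch below could generate the per-frame submissions. The cmdjob flags (-jobName, -priority, -suspended) are assumptions on my part, so check them against cmdjob -help on your install before relying on this:

Code:
FRAME_COUNT = 4
PRIORITIES = [100, 75, 50, 25]  # one priority per frame, matching the table above

def build_submissions(job_name, scene_file):
    """Return one cmdjob command line per frame, each submitted paused with its own priority."""
    commands = []
    for frame, priority in zip(range(1, FRAME_COUNT + 1), PRIORITIES):
        commands.append([
            "cmdjob",
            "-jobName", f"{job_name}_task{frame:02d}",
            "-priority", str(priority),
            "-suspended",  # assumed flag: submit the job paused so it can be un-paused later
            "Render", "-s", str(frame), "-e", str(frame), scene_file,
        ])
    return commands

# "shot01.mb" is just a stand-in scene name.
for cmd in build_submissions("Job1", "shot01.mb"):
    print(" ".join(cmd))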


Imagination is more important than knowledge.
# 7 21-05-2015 , 04:14 PM
EduSciVis-er
Join Date: Dec 2005
Location: Toronto
Posts: 3,374
Hmmm, yeah, that would work I guess, but it doesn't sound very efficient. I'm also confused about why there is a parameter for "Max concurrent jobs" if Backburner doesn't send out tasks from multiple jobs.

This brief section from the sparse documentation also seems to indicate that this happens:
Suspending and reactivating jobs is commonly used to quickly improve job throughput and network efficiency. For example, you might suspend one job to temporarily assign its render nodes to another that is more urgent. Or, if a particular job is taking too long, you can suspend it until off-peak hours, allowing shorter jobs to complete in the meantime. Sometimes, a low-priority job can 'grab' a processing node during the brief moment when it is between tasks—in such a case, suspending the low-priority job will return system resources to jobs with higher priorities.

I just don't understand the logic behind the task distribution.

# 8 21-05-2015 , 04:24 PM
NextDesign's Avatar
Technical Director
Join Date: Feb 2004
Posts: 2,988
I think that just means there's a race condition when assigning jobs. I don't see any indication of the behaviour you're describing in that passage.


Imagination is more important than knowledge.

Last edited by NextDesign; 21-05-2015 at 04:27 PM.
# 9 21-05-2015 , 06:31 PM
Gen's Avatar
Super Moderator
Join Date: Dec 2006
Location: South FL
Posts: 3,522
I wonder: if you use the max servers per job limit, would Backburner send out the next job in line to the remaining servers?


- Genny
__________________
::|| My CG Blog ||::
::|| My Maya FAQ ||::
# 10 21-05-2015 , 06:40 PM
EduSciVis-er
Join Date: Dec 2005
Location: Toronto
Posts: 3,374
Yes, that might be the only way to get multiple jobs going. However, I don't really like the idea of setting a global max-server limit, because in situations where there is only one job in the queue it will leave servers idle.

It seems like it requires case-by-case monitoring and management: reducing the "max servers per job" setting on individual jobs when the queue is long, for example.
