Tree Proxies

In this 16-minute video tutorial, Anselm von Seherr shows how to create ParticleFlow tree proxies in 3ds Max.

GOW 3 – IF

Imaginary Forces lend their hand to telling the story of one of this year's most highly anticipated titles, God of War III. 01. Main title / 02. Flashback cinematics

Node for tracking bright spots

Hey guys,

I saw the Gnomon Workshop DVD 1 on Nuke Compositing a few weeks back. Great stuff.

In the lectures he mentions a node that can find the brightest spot in an assigned zone of the footage, which can in turn drive tracking data for lens flares and the like. I thought that was pretty neat.

However, I've forgotten the name of the node, and I've sat through several of the lectures again but can't seem to find where he mentioned it (you can't skip around in the Gnomon DVD).

Anyone remember the name?
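
In the meantime, here is a rough Python sketch of the same idea (this is not the node from the DVD): scan a rectangular region of a node's output with nuke.sample() and return the position of the brightest pixel, which could then drive a flare's position. The node name 'Read1', the region bounds, and the sample step are placeholders.

import nuke

def brightest_point(node, x0, y0, x1, y1, step=4):
    """Return (x, y) of the highest-luminance sample inside the region."""
    best = (x0, y0)
    best_luma = -1.0
    for y in range(y0, y1, step):
        for x in range(x0, x1, step):
            r = nuke.sample(node, 'rgba.red', x, y)
            g = nuke.sample(node, 'rgba.green', x, y)
            b = nuke.sample(node, 'rgba.blue', x, y)
            luma = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 weights
            if luma > best_luma:
                best_luma, best = luma, (x, y)
    return best

# Example: find the hot spot in a 200x200 region of a Read node
src = nuke.toNode('Read1')
print(brightest_point(src, 800, 400, 1000, 600))

This samples at the current frame only; for animated tracking you would wrap it in a frame loop or bake the result into keyframes.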

“Audacity” (Short Film in NYC)

http://www.youtube.com/watch?v=5otVdc2733s

A New York City resident reading a newspaper about the Yankees in a Spring Street park becomes increasingly enraged as he watches a careless outsider repeatedly disregard the environment with his trash, so he decides to take matters into his own hands and frighten him into being more considerate of his favorite park.

Starring: Dan Zambrano and Jad Magaziner
Written & Directed by Jon Edwards
Filmed & Edited by Jon Edwards
Soundtrack by Quincy Jones
Shot with the Nikon D90 (ADR Sound)

2D Fluid Texture as Displacement

After checking out the examples on Duncan's blog dealing with the new features in Maya 2011, I thought I'd try to replicate a few. Right now I'm working on the bomber example, where particles emit into a 2D fluid texture that is used for displacement and color on a ground plane. There's no outAlpha on the fluid texture, and when I set the Color input to density, none of the outColors produce any result on the heightField I'm feeding them into. Has anyone tried this?

Thanks,
Paul
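
One possible workaround, offered only as an untested sketch: since fluidTexture2D exposes no outAlpha, collapse outColor down to a single float with a luminance utility node and feed that into the heightField's displacement input. The node names below (fluidTexture2D1, heightField1) are assumptions about the scene.

import maya.cmds as cmds

fluid = 'fluidTexture2D1'   # existing 2D fluid texture (assumed name)
hfield = 'heightField1'     # existing heightField node (assumed name)

# Utility node that collapses the fluid's RGB output to a single float
lum = cmds.shadingNode('luminance', asUtility=True, name='fluidLuminance')

cmds.connectAttr(fluid + '.outColor', lum + '.value', force=True)
cmds.connectAttr(lum + '.outValue', hfield + '.displacement', force=True)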

nuke.ViewerWindow.activeInput()

Hi, I am trying to get the number assigned to the active input in the Viewer. Here is the code I am running:

nuke.ViewerWindow.activeInput()

As far as I understand it, according to the documentation, this "Returns the currently active input of the viewer, i.e. the one with its image in the output window."

Am I missing a bracket or something somewhere?
It seems to be expecting a <ViewerWindow> type as an argument.
The error I usually get is this:
"descriptor ‘activeInput’ of ‘ViewerWindow’ object needs an argument"
I am on Windows using Nuke5.2v1.

Any help would be greatly appreciated.
Pete
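
For what it's worth, activeInput() is an instance method, so it needs an actual ViewerWindow object to call it on; invoking it on the class itself produces exactly that "needs an argument" error. One way to get an instance, assuming nuke.activeViewer() is available in this build:

import nuke

viewer = nuke.activeViewer()       # the currently focused ViewerWindow, or None
if viewer is not None:
    print(viewer.activeInput())    # index of the active viewer input

The guard is there because activeViewer() can return None when no Viewer panel has focus.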

Where are all the jobs?

I know times are tough, but there just seem to be no jobs out there... at all. I've been trying to get a roto job for the last 8 months (and I consider myself fairly proficient; people tell me I should have no problem getting work with my reel). In that time I've seen DNeg advertise (but are they really, 'cos they always seem to be advertising), and that is it. Not a single other roto job.

Am I missing something?

I seem to be in an unfortunate place between runner and roto artist. I can't get a sniff of roto jobs, but I'm told I have too much experience to be a runner. Just what is in between, unless I want to go down a data ops route (which I don't, because that is more of a 3D route)?

So glad I wasted 25 grand on a degree that can't even get me a job making tea.

________________________________________________________________
Jon Reid
reid.vfx@googlemail.com

RotoReel: http://vimeo.com/9939625
________________________________________________________________

digital camera noise (like from RED one)

Does anybody have a plate of digital noise from a RED camera, or any solution for replicating it?

I want to integrate some CG elements into RED footage and need to match the digital noise from the compression.
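
One approach that sometimes gets close, offered as a sketch rather than a canned fix: pull the noise off a flat, locked-off patch of the RED plate itself by subtracting a blurred copy, then add that difference over the CG. The node names 'RedPlate' and 'CGRender' are placeholders.

import nuke

plate = nuke.toNode('RedPlate')   # debayered RED footage
cg = nuke.toNode('CGRender')      # the CG element to be matched

soft = nuke.nodes.Blur(inputs=[plate], size=5)      # roughly noise-free copy
noise = nuke.nodes.Merge2(inputs=[plate, soft],     # 'from' = B - A, i.e. plate - blurred
                          operation='from')
regrain = nuke.nodes.Merge2(inputs=[cg, noise],     # add the extracted noise onto the CG
                            operation='plus')

This only captures the grain pattern, not the macro-blocking from heavy compression; for that, a real noise plate or a dedicated regrain tool is still the safer bet.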

Onesize: Quants

Onesize's latest project: a leader (title sequence) for a series of five documentaries for the show Tegenlicht. A making-of animatic is on the same page.

Retiming Z channel

Hi,
For the sake of render times, we rendered a slow-motion sequence out of V-Ray at normal speed, and I am retiming it using Kronos.

Of course, retiming the RGB channels went smoothly and the sequence looks fine.
But now I am trying to apply ZBlur to the slow segment, and I am having trouble slowing down the Z depth.

I have tried applying the ZBlur before retiming, but the calculation times go up, I suppose triggered by the semi-transparent alpha. Also, more artifacts are apparent with the pre-blurred version.

I have tried to retime the Z channel using a node cloned from the RGB analysis, but the results are strange and unusable.
At the moment I am testing with the VectorGenerator and trying to use the exact same vector data on all channels, but I can't get a nice result yet.

I have the VRayVelocity pass, if I can feed it to the motion estimator to help with the calculations, but so far I wasn't able to translate that info into Furnace's vector format.

I am asking if anyone can help with some info or thoughts on this workflow. At this point, rendering the whole slow-motion sequence is not a possibility, I'm afraid.

Thanks in advance
T
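
One variant of the cloned-retime route, as a hedged and untested sketch only (the node names 'Read1' and 'Kronos1' are placeholders): shuffle depth into the colour channels so the retimer treats it like an ordinary image, run it through a clone of the beauty's Kronos so both branches share the same timing and vector settings, then shuffle the result back into the depth channel for ZBlur.

import nuke

src = nuke.toNode('Read1')        # the EXR with rgba + depth
kronos = nuke.toNode('Kronos1')   # the Kronos already retiming the beauty

# Route depth.Z into the colour channels so the retimer has something to filter
# (check the shuffle matrix in the panel so Z actually lands in rgb)
to_rgb = nuke.nodes.Shuffle(inputs=[src])
to_rgb['in'].setValue('depth')

# A clone keeps the timing/vector settings locked to the beauty branch
kronos_z = nuke.clone(kronos)
kronos_z.setInput(0, to_rgb)

# Put the retimed result back into the depth channel for ZBlur downstream
back = nuke.nodes.Shuffle(inputs=[kronos_z])
back['out'].setValue('depth')

Interpolated depth will still mix values across moving edges, so some edge cleanup may be needed after the retime regardless.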