How blockchain gaming is slowly transforming the gaming landscape

Over the years, the gaming industry has evolved extensively with the advancement of technology. The combination of blockchain technology and gaming has become quite popular among gamers over the past few years.

Many gamers, however, are yet to explore blockchain gaming, so here is a simple explanation. A blockchain game is a video game that relies on blockchain technology for a decentralised experience. But what exactly is blockchain technology? A blockchain is a growing list of records, popularly known as blocks, which are linked using cryptography. The technology was first popularised by the Bitcoin cryptocurrency: records are replicated across multiple nodes, encrypted information is added block by block, and data can be shared securely among authorised participants only.
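
To make that description concrete, here is a minimal, illustrative sketch in Python of what "blocks connected using cryptography" means: each block stores the hash of the one before it, so tampering with an earlier record breaks every link that follows. It is a toy model for explanation only, not the implementation behind any particular game or cryptocurrency.

```python
# A toy illustration (not any real network) of "blocks connected using
# cryptography": each block records the hash of the previous block, so
# altering an earlier record invalidates every block that follows it.
import hashlib
import json

def block_hash(records, prev_hash):
    body = json.dumps({"records": records, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def make_block(records, prev_hash):
    return {"records": records, "prev_hash": prev_hash,
            "hash": block_hash(records, prev_hash)}

def chain_is_valid(chain):
    for prev, block in zip(chain, chain[1:]):
        if block["prev_hash"] != prev["hash"]:
            return False                  # broken link between blocks
        if prev["hash"] != block_hash(prev["records"], prev["prev_hash"]):
            return False                  # an earlier block was tampered with
    return True

genesis = make_block(["player A earned 5 tokens"], prev_hash="0" * 64)
second = make_block(["player A traded an item to player B"], prev_hash=genesis["hash"])
print(chain_is_valid([genesis, second]))  # True
genesis["records"].append("forged win")   # tamper with history...
print(chain_is_valid([genesis, second]))  # ...and the chain no longer verifies: False
```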

To break a common myth: many assume that blockchain gaming means online rummy, poker or some kind of casino gambling, but a blockchain game can be of any sort. It could indeed be poker, but it could just as easily resemble FarmVille or a hugely popular title like Candy Crush, where you play and may earn cryptocurrency along the way.

Today, blockchain is one of the most hyped technologies, and Internet of Things (IoT) vendors are using it not only to secure data but also to speed up data processing in pursuit of the ‘internet of everything’. This is the climate in which blockchain technology entered the gaming industry.

Many blockchain enthusiasts and game developers are now collaborating to improve their business processes, and businesses see blockchain advancement in gaming as a way into the bigger picture of blockchain and cryptocurrencies. As a result, the online gaming industry has been showing a steady interest in utilising the technology.

Benefits of combining gaming and blockchain

Given the popularity of blockchain technology, both government-run and independent gaming companies are attempting to earn customers’ trust in their own ways. Blockchain has become a medium that improves transparency between players and gaming companies, supported by strong encryption of data. It also enables direct connectivity between the player and the gaming brand, which helps build trust, though maintaining that trust remains a challenge. This trust can then be extended across multiple systems to reap the biggest benefits, including the earning of cryptocurrency such as bitcoin.

With improved processes, security and transparency, more gamers are being drawn to blockchain games, and more developers are therefore stepping forward to build them.

Recently, at the Blockchain Game Summit in Lyon, France, leading brands in the gaming and blockchain space came together to form the Blockchain Game Alliance, a coalition that will promote blockchain development within the gaming industry. The alliance is made up of influential companies such as Alto, Gimli, Fig, Ubisoft, Ultra, B2Expand, ConsenSys, EverdreamSoft and Enjin.

Public attention to blockchain gaming nearly crashed the Ethereum network in December 2017 with the launch of the blockchain game CryptoKitties, a platform for breeding your own unique virtual cats. It was the blockchain equivalent of the old-school Tamagotchi or Beanie Baby crazes, and it drew the ire of the Ethereum community. Given that level of demand, here are some of the new blockchain games of 2018 that are transforming traditional gaming and pushing blockchain gaming forward.

Glitch Goons: A new mobile fighting game from the well-known blockchain game developer Ether Dale. Its main feature is fights between cybernetic animal-humanoid characters in a futuristic post-apocalyptic world. Glitch Goons is one of the very few games on the market that supports a whole range of cryptocurrencies, which makes it multi-blockchain. Its pre-sale started on 22 October and will run for three weeks. It is a free-to-play game with premium tournaments and a substantial prize pool. The core mechanics are an automatic player-vs-player combat system with no manual control over the fight, and an advanced character management system for modifying a fighter’s stats and abilities.
There are five unique sets (out of 30) distributed between the chests – cosmic, fusion, crimson, shining and ancient. Having just two items of a set already gives the player a generous bonus to their fighting skills, so one can only imagine what a full set provides. Item chests cost from 0.5 ETH to 10 ETH, and these items will never be duplicated or sold in the in-game shop again.

Eth Town: Eth Town is the first ever crypto game where you can get your very own custom-designed character. You can amend the appearance of your characters to look like yourself, your family, or your friends. Since a lot of Eth.Town is based on your heroes, it’s your minions who trade for you, fight for you, and merge for you! The heroes are ERC-721 ethereum tokens (this standard is for non-fungible tokens, meaning each hero is unique, your ownership and the hero’s properties are recorded on the blockchain, and you can transfer and trade it on platforms that support the standard). Just like in real life, heroes merge and create offspring, which are better than the originals. It takes a boy hero and a girl hero to make a new baby hero, and the new one will have a mix of the parents’ genes, stats, and looks. The parents retire right after merging and leave the town. You start with level-0 heroes and can reach up to level 15 (a Super Hero).
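
The ERC-721 idea mentioned above can be pictured with a small sketch. Real ERC-721 contracts are Solidity code running on Ethereum; the Python stand-in below, with invented hero attributes, only mirrors the core concept that every token ID is unique, has exactly one owner, and can be transferred or traded.

```python
# A rough sketch of what an ERC-721-style non-fungible token boils down to:
# every token ID is unique, has exactly one owner, and carries its own
# properties. This is a conceptual stand-in, not a real on-chain contract.
class HeroRegistry:
    def __init__(self):
        self.owner_of = {}      # token_id -> owner address
        self.properties = {}    # token_id -> hero stats and looks

    def mint(self, token_id, owner, properties):
        assert token_id not in self.owner_of, "token IDs must be unique"
        self.owner_of[token_id] = owner
        self.properties[token_id] = properties

    def transfer(self, token_id, sender, receiver):
        assert self.owner_of[token_id] == sender, "only the owner can transfer"
        self.owner_of[token_id] = receiver

registry = HeroRegistry()
registry.mint(1, "0xAlice", {"level": 0, "looks": "like Alice"})   # invented attributes
registry.transfer(1, "0xAlice", "0xBob")   # trading a hero just changes its recorded owner
```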

Spells of Genesis: Spells of Genesis is a mobile game that mixes trading card game (TCG) mechanics, bringing in deck collection and strategy, with arcade-style gaming aspects. The game is based on blockchain technology, which is also the main source of inspiration for the storyline. It revolves around collecting, trading and combining orbs to build the strongest decks, then putting them to the test against various opponents while exploring the fantasy realm of Askian. It is a multiplayer game where you can collect cards, trade or exchange them with other players and collectors, and combine them to build your team before testing their abilities and your strategic skills in battle.

0xUniverse and 0xBattleships:
0xUniverse is a game powered by the ethereum blockchain. In the first part, released at the end of June 2018, players are encouraged to become space explorers, colonising planets in a 3D-rendered galaxy. They can explore planets or buy them using blockchain technology. 0xGames later launched the second part, 0xBattleships, which comes with a new challenge: this time, players have to protect the inhabitants of the virtual galaxy from a new threat, a mysterious cult of the Void God, famous for its villainy. Resisting the enemy requires players to construct their own powerful battleships from various modules that are stored on the blockchain. The modules, which are non-fungible tokens, are the most important part of the game. Players can own lasers, force shields, physical armour, energy generators, engines and so on, selling them to or buying them from other players at auction. The discoverers will extract resources and carry out research that allows them to conquer the remotest corners of the galaxy. In addition, players can jointly contribute to the story and unravel the mystery of the universe. Each planet is a digital collectable with a unique design and resources. Discover the four kinds: common, rare, epic and legendary planets. The latter will amaze you with their fantastic presentation and special stories.

World of Blockchain: World of Blockchain is a web-based business simulation game that mirrors the blockchain industry in the real world and helps you learn more about blockchain startups. Unlike other crypto games, it adds new features and is expected to take the lead. It was previously released as a closed beta test; apart from the market, the gameplay in the official release is similar to the beta. Through the game, outsiders see the truth behind the stories that blockchain startups try to sell, while insiders take inspiration from playing. The game’s philosophy is to let people make their own decisions and have fun carving out business opportunities.

Considering gamers’ growing engagement with blockchain gaming, the variety of blockchain games released so far and the formation of the Blockchain Game Alliance, it is expected that more developers will come up with unique blockchain games in the near future.

Academy receives 25 animated feature submissions for ninety-first awards held next year

We aren’t even into 2019 yet, but the race for next year’s Oscars is already heating up: the Academy of Motion Picture Arts and Sciences has revealed that it received as many as 25 submissions for the Animated Feature Film category, ranging from Hollywood productions to overseas titles.

Some big-budget releases of the year, such as Incredibles 2, Hotel Transylvania 3: Summer Vacation, Spider-Man: Into the Spider-Verse, Smallfoot and Teen Titans Go! To the Movies, have unsurprisingly all entered the race. However, the list also features certain animated productions that haven’t yet met the theatrical release requirements and won’t make it to the voting process until that condition is satisfied. Films submitted in this category can also qualify for Academy Awards in other categories, such as Best Picture.

The full list of submissions is as follows:

  • Ana y Bruno
  • Dr. Seuss’ The Grinch
  • Early Man
  • Fireworks
  • Have a Nice Day
  • Hotel Transylvania 3: Summer Vacation
  • Incredibles 2
  • Isle of Dogs
  • The Laws of the Universe – Part I
  • Liz and the Blue Bird
  • Lu over the Wall
  • MFKZ
  • Maquia: When the Promised Flower Blooms
  • Mirai
  • The Night Is Short, Walk on Girl
  • On Happiness Road
  • Ralph Breaks the Internet
  • Ruben Brandt, Collector
  • Sgt. Stubby: An American Hero
  • Sherlock Gnomes
  • Smallfoot
  • Spider-Man: Into the Spider-Verse
  • Tall Tales
  • Teen Titans Go! To the Movies
  • Tito and the Birds

The final nominations for the 2019 Academy Awards will be announced on 22 January 2019. The main event will take place at the Dolby Theatre at the Hollywood & Highland Center in Hollywood on Sunday, 24 February 2019.

SEPC pavilion makes waves at MIPCOM 2018

It was indeed a historic moment for India’s content creation, distribution and broadcast industry at this year’s MIPCOM, which concluded at Cannes, France on 18 October 2018. More than 300 Indian delegates swarmed the French Riviera town – the largest delegation in the Reed Midem market-cum-festival’s history.

The highlights of the market were the first-ever Indian pavilion, which had around 38 companies registered at it, the screening of Eros Now’s digital series Smoke, and the presence of Shemaroo Entertainment CEO Hiren Gada and Zee Entertainment’s Sunita Uchchil on a panel discussing how Asia is developing.

What generated the most oohs and aahs was the Indian pavilion, located in the basement of the Palais des Festivals. Led by Services Export Promotion Council director general Sangeeta Godbole and her team, it brought together a mixture of service providers, such as dubbing and subtitling company BOL and Native Ninjas, and animation studios such as Digitoonz, Eplus Technologies, BFX CGI, Big Animation, Ayanaa Cinematics, Animantz, Aadarsh Motion Philm CGI, Nilesh Patel Studios, Locomotive Films, Phoebus Creations, Rockline Group, Rotomaker India, Sacom MediaWorks, Sony Music, Technicolor India, Thought Cloud Studio, Vedatma Studios and Broadvision Group.

The big draw at the market was the Indian cocktails and snacks session that the SEPC put together at the Indian pavilion on the evening of 16 October. The menu was very appealing, with many savouries, and Indian Sula wine and French wines, along with beer, juices and sodas, flowed freely, even as Bollywood music played in the background.

Indian participants and foreign delegates could not get enough of the bonhomie and the delicacies on offer, and said it reminded them of the Zee TV beach party during the MIPCOM India country of honour celebration in 2007.

Said Godbole: “It is the Indian government’s intent to help India’s very small and small content creators to take their work globally. And also help highlight the fabulous creative talent we have in India. We have subsidised the rates that they have to pay to enter what can otherwise be an expensive market for the small studios. This was our first entry into MIPCOM and we are very enthused by the response and can only look forward to scaling up our presence by taking a larger pavilion with many more smaller entertainment companies registered in MIPCOM 2019 apart from promoting the Indian presence in a bigger way.”

Present at the celebration was the Indian embassy’s Paris-based chief of the economic and commercial wing, Sarvjeet Soodan, who said that he was delighted by the presence of the pavilion and the plethora of Indians at MIPCOM. He promised that the Indian embassy and its officials would be available to support the initiatives of the SEPC at the coming MIPCOMs in any way that was needed.

Filmmaker Ketan Mehta, who now heads animation powerhouse Cosmos-Maya, was also delighted about the pavilion. Said he: “It took me as a creator a long time to get to know how to take my content global and sell it internationally. We struggled in our time and I am very happy that the smaller studios have this opportunity to leapfrog and achieve in a couple of years what took us many more in our time. Congrats to the SEPC.”

Reed Midem’s representative for India, Pakistan, Sri Lanka and Bangladesh, Anil Wanvari, drove the initiative from the Paris-based organisation’s side along with international sales manager Paul Barbaro. “In earlier years too, efforts were made to set up an India pavilion but the response was tepid from both industry and government. We started the conversation with the SEPC a year or more ago and it took the grit of Sangeeta to see this to its logical conclusion,” said Wanvari. “Now smaller studios who come in as participants can at last have the address of the Indian pavilion from which they can operate and fix meetings to showcase their content or services to potential international customers and buyers.”

OPPO unveils eSports championship at its PUBG-themed store in Bangalore

OPPO, the selfie expert smartphone brand, has recently partnered with Tencent Games, which organises the ‘PlayerUnknown’s Battlegrounds Mobile Campus Championship 2018’ in India.

After receiving entries from a total of 10,000 teams across more than 2,000 colleges, the month-long, action-packed tournament finally culminated on 21 October at the first-ever PUBG-themed OPPO store in Bangalore.

Keeping in mind the growing PUBG trend, the OPPO brand showroom has been designed to feel like a real battleground. Along with the combat feel, consumers can now step away from the virtual world and into the real world of PUBG by experiencing the adrenaline of the game in the store with the new OPPO F9 Pro.

Speaking on the occasion, OPPO brand director Will Yang said, “With PUBG’s massive success amongst the youngster, we are delighted to announce the revamp of our brand showroom based on PUBG. Our unique showroom has been designed to cater to the youth of India who are continuously following new and interesting trends in the market. By leveraging this partnership with Tencent Games, OPPO wants to strengthen its relationship with the youngsters as the F9 Pro is the perfect combination for the unified and unique experience.”

Packed with extraordinary features, the OPPO F9 Pro acted as the perfect gear for all the gamers in this championship. The 6.3-inch bezel-less waterdrop screen, with 1080 x 2340 resolution and a super-high screen-to-body ratio of 90.8 per cent, helped participants achieve maximum kills on the way to the finale. After intense rounds and some amazing performances leading to winner winner chicken dinner, a total of 20 teams competed at the grand finale and the ultimate winner was awarded prize money of 0.50 million.

Along with the mega prize pool, participants were recognised with the following awards at the revamped PUBG-themed store:

  • MVP – Overall best player with the maximum number of MVP awards
  • The Executioner – Awarded for maximum kills overall
  • The Medic – Awarded for the highest number of revives
  • The Redeemer – Awarded to the player with the highest amount of health restored
  • The Rampage Freak – Awarded for maximum kills in one lobby
  • The Lone Ranger – Awarded for maximum time survived in a game

‘Star Wars: Darth Vader’ comic reveals ‘Rogue One’ castle secrets

Marvel Comics writer Charles Soule let slip Darth Vader’s castle secrets in the latest issue of the Star Wars comic. The castle in question first appeared in the film Rogue One: A Star Wars Story. It stands on the same planet where Darth Vader duelled Obi-Wan Kenobi in Star Wars: Episode III – Revenge of the Sith. In Rogue One, the film’s main antagonist, Orson Krennic, went to Darth Vader to get his help against Grand Moff Tarkin.

Charles Soule’s run on the book has primarily focused on bridging the gap between the infamous ‘I don’t like sand’ Anakin Skywalker of the prequel trilogy and the badass Dark Lord of the Sith. This has included an explanation of how Darth Vader obtained his first red lightsaber, the hunting down of the Jedi who survived Order 66, and a deeper look at just why he joined the dark side in the first place. It also gives us answers as to why he decided to build his home on Mustafar.

This latest issue of the book reveals his true reason for putting his base in such a morbid place. In the first image shown, it is revealed that in the depths of the castle lies what is described as a door to the dark side. Apparently, by going through it, Darth Vader will be able to meet the soul of Padmé.

Excited to discover more? The feeling is mutual.

3D World Magazine Gives Light Kit Pro 3 a Perfect 5-Star Review

See why 3D World Magazine calls Light Kit Pro 3 an “essential” plugin for Cinema 4D, and gives the new tool a Best in Class 5-star review.

The latest edition of 3D World Magazine is on newsstands now. (You can purchase a physical or digital copy here.)

Featured among the greatest new things and trends in 3D are both a Light Kit Pro 3 review and tutorial.

Freelance journalist and CG artist Steve Jarratt tested the new Light Kit Pro 3, and came back with a stellar review and the magazine’s highest honors, a Best in Class 5/5 review.

“With a huge library of preset studios on offer, some really clever functionality and general ease of use, Light Kit Pro 3.0 comes highly recommended — hell, if your work is primarily product shots, it’s pretty much essential.”

Jarratt goes on to say that, “a genuinely useful feature is the ability to decouple the cast light from the reflected light, enabling you to tweak overall brightness levels separately from the specular highlights, to get precisely the look you want.”

As for LKP3’s most talked-about Render Switch feature: “There was already a lot to like about this update, but the icing on the cake is its built-in support for Octane, Arnold, and Redshift.”

To those readers with copies of the magazine, be sure to flip to the tutorial section, where 3D and visual effects artist Mike Griggs will give you a 3D Bootcamp introduction to Light Kit Pro 3. For those with the digital edition, you can tap through to a 12-minute walkthrough with Griggs. You can also find additional Light Kit Pro 3 training here.

Thanks to 3D World for the kind words. If you want to learn more about Light Kit Pro 3, check out the links below.

Going back to ‘Pleasantville’: when doing a DI wasn’t so easy

In 1998, Gary Ross’ Pleasantville became the first major Hollywood feature to go through what’s now known as the digital intermediate, or DI, process. The film needed that process because its characters, stuck in a 1950s black and white existence, would slowly start to escape their repressive world as they begin experiencing colour.

Getting there involved shooting on colour negative, scanning the film, carrying out significant roto, and then doing colour correction and selective desaturation of the imagery – then getting that all back onto film again. These were understood principles in the burgeoning era of digital filmmaking, but they hadn’t really been contemplated for so many VFX shots (around 1700 in total).

To get a handle on just what was involved in making the film two decades ago, vfxblog asked Pleasantville visual effects supervisor Chris Watts to break down the process. In this Q&A, you’ll read about the early Pleasantville colour manipulation tests, the need to convince the studio that the immense amount of scanning/colour correction could be done, the late 1990s tools of the trade – including an early version of Shake – and why painting out unwanted arm bands might have been the toughest work on the film.

vfxblog: What were the new things that had to be solved to make Pleasantville?

Chris Watts: Everything that we did in there, I knew how to do already. I had just never done that big of a job of doing it. Really, the harder part was the sort of disbelief encountered among vendors that you were actually going to try some new things. It was just crazy-talk to most people. Even people who you would think would know better. The people at certain facilities now, known exclusively for their DI abilities, they were in disbelief that certain things were possible, or that certain things, that we discovered were necessary, were in fact necessary. Like doing entire reels at once, rather than doing things as shots individually.

I’m not sure people are aware of how the job travelled from facility to facility in search of somebody who would do it our way, which was to do it a reel at a time. That was one of the big deals on that film, was getting somebody who would actually say, ‘Okay, we trust that you’ve done the research necessary to make us do this, because otherwise we’re just gonna do it our own way.’

[We] made a deal with Bob Fernley to get the movie done for the amount of money we had, and the amount of time we had left to do it. Fernley looked at me, thought about it for a second, said, ‘Okay, I’m gonna trust this guy. It sounds crazy, but we’ll do the movie a reel at a time and see what happens.’ Back then, nobody did that. Now it’s totally commonplace.

So that was probably the biggest hurdle, just getting people to say, ‘Okay, these guys, even though we’ve never heard of them, these guys know what they’re doing, and we’re gonna trust them.’

We made the movie essentially twice. We did it once in video-res, or low-res. We animated all the roto in video-res. And then we filmed it out at EFILM, to see how the movie played. Because, hell if we really knew how this was gonna work in a whole movie sense! The fact that it went through a computer and came out again didn’t really faze me, but people were worried. They’d say, ‘How’s it gonna play if it goes through a computer and back?’ People just didn’t get that it was still gonna look like a movie, it was not gonna be necessarily something different. In fact it was going to be more evenly and better controlled, and it was gonna look better.

But it was one of those things where there was a lot of doubt whether the processes were going to work. So the director wanted to have something at all times that was screenable, but not too good, because then they might make us release it! It was screenable for people who were not really on board with the process. Because, well, it was a brand new thing for people.

Now, when I came on, the plan for the movie was to do it a certain way. And they had this chapter in the process built in. We ended up going a totally different way, but we still kept the major milestones on the studio side, just because it was fairly late in the game, and we didn’t want to be changing things up with this giant expense that they were not really accustomed to paying.

So, we did it twice. We did it at the video-res, essentially, and then we did it again at film-res. And obviously there was a lot of transference of assets in terms of what worked. We did the roto, to the degree that it was actually able to be done, we were able to transmit the roto to the high-res medium.

But even then, we did a lot of hacks on roto. I think we were probably the first people ever to do roto on jpgs instead of actual files. You know, we did roto on jpgs because who needs to do the roto on the Cineon files? Everybody was doing that, and there was this fear of doing it on the jpgs, that for some reason, even though that image was gonna be thrown away, and we’re gonna keep just the shapes, there was some fear that it was gonna be somehow different, and it wasn’t gonna line up as well or something if we, if people did roto on jpgs. But now, of course, everybody does it that way. Another crazy thing that came out of that.

vfxblog: Let’s go back to the beginning a little bit – was there any kind of proof of concept or test done to prove that a film like this could be done?

Chris Watts: When I came on, they didn’t have an effects supervisor, and they didn’t have anybody to look after the whole process. They also had nobody as a DI wrangler. But they had done this test, they’d shot this little sequence, and they’d shot it on colour film, turned it all completely black and white, and then re-colourized the whole thing from scratch. I thought, ‘That was kind of a weird way to do it, but okay.’ They had this ‘mustard girl fill,’ which is what I always called it. Because she looked like she was kind of mustard coloured.

I watched it, and everybody was like, ‘Oh, isn’t this great? Isn’t this awesome?’ And I was, like, it was cool and everything, and the work was done to a better standard than any colourized movie I’d ever seen, but it was still basically colourized from scratch. And I thought, ‘Well, why not take the movie and shoot it on colour stock, and then do some selective desaturation to keep all the nice colours of colour film, and all that technology of the last 80 years?’ So, we tried that a little bit, after watching the ‘mustard test’, and we didn’t keep testing the selective desaturation. Everybody was, like, ‘Oh duh, this is much better.’ So that’s what got me hired on the job, was the fact that I’d come up with that little idea.

And the guy who was in charge of the colour, Michael Southard, who I’m still great friends with today, he immediately saw – luckily – he immediately saw the benefits of doing it this way. And he had the technical experience with the software, which they had kind of borrowed from this company that went out of business. He had the technical experience with that software to actually do it and then manage a little team that was able to duplicate the effect I was after. He was on board, and he was a great ally through the whole movie.

Michael just did a fantastic job. He was basically the colorist of the film, even though we didn’t have anything quite so real-time as a telecine console for the final colour, it was still done essentially in the same manner. We essentially filmed out the whole movie, piece by piece, in our office. We had a couple of Solitaires clicking away 24/7 for months. And then once we got that done, we output the whole film again at Cinesite a reel at a time. And that was the deal that Bob Fernley and his crew were able to hold up their end of, which was to basically record the whole movie reasonably quickly, and then supe it all a reel at a time at once.

Even with our ambitious thinking, we didn’t think, ‘Well, we should just do a whole movie at once.’ Because that was too much even for us at the time. So we did a reel at a time, and then we did this elaborate print matching thing, where we matched up prints that came out of Deluxe, so that we didn’t have a big bump between reels when one went from one reel to the other. And that worked okay. The bumps that we got between reels were not great, I don’t think, but that was a time when the audience came to expect bumps, because you can always tell when a reel’s gonna change, the movie, it gets a little dirtier at the end. People were okay with that. Nobody ran screaming from the theatre when we encountered these little discontinuities in the colour that we got between reels.

vfxblog: How did the eventual approach to what would be done in post-production influence anything in terms of on-set filming?

Chris Watts: I pretty much planned things to just let production shoot whatever they wanted to, and then we would deal with it later. It wasn’t so much of a cop out, but it was kind of a daring, white knuckle experiment in what would become the style of visual effects supervising for the next 20 years. Let them shoot whatever they want, we’ll fix it later. For the most part, we didn’t key anything, we didn’t shoot much with multiple elements or multiple bluescreens. We knew we had a crew that was going to roto 160,000 frames of film. We knew they were really good and really fast. So basically once we sort of swallowed that somewhat bitter pill, which turned out to be not bitter at all, we were able to sort of free up some production to do all the things we wanted to do.

There were a couple things, like the drive through the cherry tree leaves that were falling down, the little pink leaves. They were gonna shoot that with the leaves that were the right colour, and then have this slowly come on. So I changed the colour of those to be a little bit different, basically I made them magenta instead of pink, just so they could be the opposite of green, and then we could key those pretty easily as they came down, because it would have been a real mess to have to roto that stuff as they were driving through that black and white forest. And that worked great, it worked fine. There wasn’t any problem. We did a test of that, one little shot, and it worked great.

vfxblog: There’s a very famous scene where Tobey Maguire is with Joan Allen, and he’s applying the makeup. How was that actually filmed, and what then happened in the back-end?

Chris Watts: For that scene we actually did shoot a different way than what they were gonna do. We did use keys in that. We got some green bespoke make-up. We wanted a colour green that was essentially the same colour as a flesh tone would be in black and white. Which wasn’t that hard to do, because green is essentially the major component of luminance. So it was pretty easy to come up with. I can’t remember the exact formula, but I remember doing a few tests, and picking one, and that was the way we went.
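
(Editor’s note: the point that green is the major component of luminance can be seen in the standard Rec. 601 luma weights, where green contributes almost 60 per cent of perceived brightness. The snippet below is purely illustrative; the RGB values are invented and the actual make-up formula used on the show is not public.)

```python
# Rec. 601 luma weights: green carries the lion's share of perceived
# brightness, which is why a green chosen to match a flesh tone's luminance
# reads as the same grey once the image is desaturated. Values are invented.
def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

flesh_tone = (0.80, 0.60, 0.50)      # rough flesh-tone assumption in 0-1 RGB
print(luma(*flesh_tone))             # about 0.65
green_makeup = (0.20, 0.95, 0.20)    # hypothetical green picked for similar luma
print(luma(*green_makeup))           # about 0.64, so a similar grey in black and white
```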

We really had to do the test and then get the output quickly, because people were hugely concerned about putting green makeup on Joan Allen. That was gonna be our master footage – ‘What are we gonna do if it didn’t work?’ they would say – so we did a test, and it worked, and it was all fine.

These days, again, it doesn’t seem like something was just that terribly risky, putting some green makeup on an actor for some effect later. But at the time, people were nervous about things like that. And people were even nervous about the way that Joan would be able to act with green makeup on. She might be self conscious or something. But she was totally fine. If she was self conscious, I didn’t know about it. She just did it, and the movie looked great.

Dealing with little challenges like that, little sort of petty fears of the studio, that was a big part of my job. Just convincing people things were gonna go okay. This is back in the day when there were – a lot of companies which now do great work were just coming up, and were maybe not doing work that was quite so great, and people who were maybe not so experienced in the filming side, more experienced with the digital side, were showing up on sets and making recommendations about things that turned out to not be the right thing. So I was real careful not to do that.

Anyway, with the green makeup – that made the edges of the make-up that much easier to deal with. They were gonna do it in roto before I showed up, and then I came up with the idea of the green makeup. So we just made the switch. Other than that, it wasn’t really that big of a deal.

vfxblog: These days, the big films have 12 or 13 or more VFX vendors. How did you approach it on Pleasantville? I remember there was essentially an in-house unit called Pleasantville Effects.

Chris Watts: Well, it’s kind of a weird story. Essentially the movie was gonna be done by this company, Cerulean. They were essentially a colourizing company in LA. The day after New Line gave them a million dollar deposit or something like that, they basically packed up their offices and disappeared. And then these other people – Dynacs – appeared who were, who’d been on the board of directors of Cerulean, who basically said, ‘Well, we’ll do the movie.’ And it was all a little bit shady, it seemed to me.

This was all at the beginning of the trend of sending work to India, and they had this great idea that they were gonna get work, and they were gonna have this whole schedule of when they were gonna get frames sent out, and get sent back. I just looked at it and laughed, because I knew instantly that it wasn’t gonna happen, just based on the schedule, and the company I was dealing with. It ended up in a lawsuit, and the company that was gonna do the work, they had no movie experience, and they had none of the sort of traditional pertinences of companies that are accustomed to dealing with working on a feature film.

I pointed out that these guys, even if you ignored the fact that one company had disappeared and sprung up from the ashes into another company, minus the $900,000 we gave them, they’d never had any experience doing any movies. Well, they had experience doing movies, because they had some of the same people, they still never had any experience doing movies where there was a living director, or a living editor. Pretty much any movie that’s been colourized is a movie that’s fallen into the public domain. Which is generally a movie for which the director and the editor are no longer around, if they are, they’re not concerned with the movie anymore. So that was a big deal.

I floated the idea that they were gonna get a movie that was gonna have changes, and they were gonna have to go back into shots, and redo things, and they pretty much freaked out, and said, ‘Well that just can’t happen.’ They ‘told on’ me to the higher ups at the production, thinking they were gonna get rid of me or something. They basically got rid of themselves, they ended up getting the job taken away from them. And we ended up setting something up ourselves in Gary Ross’ office in Toluca Lake.

We’d been fooling around with the idea of doing it ourselves, because I saw the writing on the wall. I was sort of hoping for the best but preparing for the worst while we were filming the film. And everybody was pretty concerned with just filming the movie at that point. So I was told to basically keep my mouth shut, not say anything to anybody. But I slowly prepared, quietly, for any of the various eventualities. One of which would be we’d end up doing the movie ourselves.

Eventually we started gathering up a little crew, and making plans for equipment, and budgeting things. There were some good people there at Dynacs, for sure, but they didn’t get some things that we take for granted on features. Like the ability to turn around work quickly, and the ability to iterate on shots, and things like that. That was not part of their game plan when they signed up for the movie, and they had no idea they were gonna have to do any of that stuff. So basically we took it away from them, and there was a big lawsuit.

vfxblog: Let’s talk a little about how you managed the post-production. How were you handling the data for this show back then?

Chris Watts: Here’s a fun fact for you: for the entire film, the disc capacity of the entire facility where we did it, which was essentially our office in Toluca Lake, we had two, count them, two terabytes of disc space. That’s what the whole film was done on. Which now fits in my laptop. But back then it was a huge rack and stuff. And we were so proud of it. But you know, they came out with these drives – the four gigabyte drive was the biggest one you could get. That was big then. Then they came out with the nine gig drives, and those were really expensive, but we still got them.

It was silly how efficient we got to be, or we had to be, with disc space. Now you look at what people do, and that’s like, that’s 10 minutes’ output of MPC or something. It’s astounding how much more data we deal with these days, and how many more elements we generate. I mean, luckily, these shots had, they had a background element, maybe one other element, and then a bunch of roto elements, which were essentially of insignificant size. So it was really only just multiple copies of outputs and some intermediate elements.

Every time we rendered it, we really went back almost to the original footage. Because we didn’t render things over and over again, we just went back like anybody else would. But it was still, it was not a huge data show by today’s standards, but back then, it was like, ‘Whoa, that’s the most data we’ve ever seen.’ It still cracks me up, that we were so proud of two terabytes.

vfxblog: You mentioned the crazy amount of roto in the film. What were the main tools you were using for roto, and image manipulation?

Chris Watts: There was nothing there really to play with. We pretty much had to build a lot of stuff ourselves. Luckily, there were some people around who also saw the writing on the wall, and we were able to use the ‘baby’ versions of certain bits of software that really helped us out a lot. Nobody knew, nobody understood, with very few exceptions, except for a few people at Cinesite, how logarithmic colourspace worked. And nobody was using scene linear colour space or anything else yet. But log was the way to go if you wanted it to come out of the computer looking the same way that it looked going in.

I had Raymond Yeung write us this bit of software that was able to manipulate the monitor access of CRT screens on SGI computers. And that was a huge help to us, because then we were able to go back and forth between LUTs. Which again, remember, there was nothing there. You turn on your computer, and you got like, you got a terminal prompt. There was nothing else. There was not much of any kind of desktop, there was no, there was just nothing. It was SGI O2s.
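
(Editor’s note: for readers unfamiliar with logarithmic colourspace, the sketch below shows the commonly published Kodak Cineon 10-bit log-to-linear conversion, with reference white at code 685 and reference black at code 95. These are the textbook constants, not necessarily the exact curves or LUTs built for the film.)

```python
# The widely published Cineon conversion: 0.002 density per code value and a
# 0.6 negative gamma, with reference white at code 685 and black at code 95.
# Illustrative of what "log colourspace" means for 10-bit film scans.
def cineon_to_linear(code, ref_white=685, ref_black=95,
                     density_per_code=0.002, neg_gamma=0.6):
    gain = density_per_code / neg_gamma
    black = 10.0 ** ((ref_black - ref_white) * gain)
    lin = 10.0 ** ((code - ref_white) * gain)
    return (lin - black) / (1.0 - black)   # normalise so black -> 0.0, white -> 1.0

for code in (95, 470, 685, 1023):          # black, mid-grey-ish, white, peak
    print(code, round(cineon_to_linear(code), 3))
```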

For roto, we did have Matador, that was out. But Matador at that point was being made and sold by Avid, and they wanted like, 20 grand for a barely functioning, ass-backwards, really difficult mistress of a paint programme. Yeah, there were a few people who knew how to use it really well, but man, those people were expensive too. We also used Commotion for dust busting, and some paint work, too.

One thing that was really crucial was, Shake had just come out. Arnaud Hervas and the guys over in Venice had been working on it, we’d heard about this thing, Shake, and I went over to talk to him. I was like, ‘Oh my God, this is exactly what we need.’ And it’s got lots of handles to be controlled by external stuff. It’s all command line accessible.

But the version of Shake that had the interface was years away. It hadn’t come out yet. The interface for Shake basically just generated a text file, which was then rendered by the Shake engine. It was very simple as a software matter to associate those two. And just have the engine. And then later on, when they tacked the interface on there, that was the thing that generated the text file that the engine was able to read, and use as the basis for doing what I wanted to do.

So we were able to, with a little bit of work, and getting our head around kind of object oriented filmmaking, we were able to write a lot of software that essentially coughed out render scripts, or at least the beginnings of a render script, for every shot in the movie. We had this really awesome sort of mass production. Essentially, it was like a kind of a slightly slow motion DI thing we do now. You don’t have the ability to whiz back and forth in the film, and see one reel of cut negative, and see what you’re doing to it like you did before, but we did have the ability to go through shots, and time them. We had also written this colour correction tool called Coco that dealt with essentially still images. And it was able to, quite quickly afterwards, assemble very small but colour accurate motion film, and you could cut it into little tiny thumbnails.
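
(Editor’s note: the idea of software that “coughed out render scripts” for every shot can be imagined roughly as below. Everything in this sketch, from the shot-list columns to the render_shot command and the directory layout, is a hypothetical stand-in; the real pipeline drove the Shake engine with text files produced by in-house tools.)

```python
# A loose sketch of generating one render script per shot from a shot list.
# The CSV columns, the "render_shot" command and the paths are all invented
# placeholders used only to show the mass-production idea.
import csv
from pathlib import Path

def write_shot_scripts(shot_list_csv, out_dir):
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with open(shot_list_csv, newline="") as fh:
        for shot in csv.DictReader(fh):   # e.g. columns: shot, first, last, saturation
            script = out / f"{shot['shot']}.sh"
            script.write_text(
                "#!/bin/sh\n"
                f"# auto-generated render commands for shot {shot['shot']}\n"
                f"render_shot --in scans/{shot['shot']} "
                f"--frames {shot['first']}-{shot['last']} "
                f"--desaturate {shot['saturation']} "
                f"--out renders/{shot['shot']}\n"
            )

# write_shot_scripts("reel1_shots.csv", "scripts/reel1")
```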

The guys at Shake were hugely helpful in just being where they were at that point in development. It was exactly the right thing we needed at exactly the right time. We had to twist their arms a little bit to get them to sell us a few of them, because they were really still developing it. But we showed them we weren’t gonna be complaining about the lack of an interface. I was used to no interface software, because working at CFC, that was how they did it. The Shake guys, they were awesome. Every frame of that movie was rendered in Shake.

vfxblog: Can you talk a little about your actual workflow for completing the shots?

Chris Watts: To do the meat and potatoes of the work, we’d get the film in, went through all the editorial stuff, we’d telecined the film in a very structured way. In fact, we’d sort of do things like pull-downs, remember that? Before machines could detect pull-downs automatically. So I wrote some software that essentially lined up all the dailies in reels, to be worked with, the selected dailies, so the cadence of the pull-down would be unchanged for the entire reel. So when we wanted to take a pull-down out, it could all be put in or out at once, without messing with sound, or causing jitter frames, or anything like that.
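
(Editor’s note: the cadence Watts refers to is the 2:3 pulldown used when 24 fps film is telecined to 30 fps interlaced video. The label-only model below is a sketch rather than real image code; it shows why a consistent cadence across a reel lets the duplicated fields be stripped out in one pass.)

```python
# An illustrative, labels-only model of the 2:3 pulldown cadence: film frames
# A B C D become five video frames of interlaced fields, and the pulldown can
# be removed cleanly only if the cadence stays consistent across the reel.
def add_pulldown(film_frames):
    """24 fps film -> 30 fps video field pairs using the 2:3:2:3 cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))   # 2 fields, then 3, then 2...
    return list(zip(fields[0::2], fields[1::2]))             # pack into (top, bottom) frames

def remove_pulldown(video_frames):
    """Drop the duplicated fields and recover the original film frames."""
    fields = [f for pair in video_frames for f in pair]
    film, previous = [], None
    for f in fields:
        if f != previous:
            film.append(f)
            previous = f
    return film

video = add_pulldown(list("ABCDEFGH"))          # 8 film frames -> 10 video frames
assert remove_pulldown(video) == list("ABCDEFGH")
```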

And that was kind of a pain in the butt, because nobody was doing that yet. Avid didn’t know how to do any of that stuff. And we were actually editing in Lightworks, which was probably the superior platform then if you were an editor. But it wasn’t really as easy to get the data in and out of it. We figured it out.

So, we’d get the cut, we’d get the takes from which the cut was constructed, we’d scan whole takes at 1920×1440 resolution. We’d telecine the whole takes, obviously, but then we got to scanning, and we scanned less than the whole takes. I wrote a new bit of software to put just the takes together in the proper cadence. But either way, we had these big reels of footage where the pull-down could be taken out in one fell swoop on a Henry. And we’d spend lots of time in a Henry room. My son is named Henry; Thomas Henry Christopher Watts, because I used the Henry so much that I grew to love the Henry, and it’s now my kid’s middle name! Wayne Shepherd was our Henry guy. And there were other people too, like Mark Robben at Editel, back when Editel was still there.

Then once we had the files in our various file sizes at our little production facility, after we’d done the dailies, they did the roto. We did this crazy process to save time which was roto at 1K. Which probably sounds like anathema, but you can’t really see the difference if you look at it on film. Compression was one of those things where, oh my God, we can’t compress anything. The Academy was not willing to consider any cameras that did any kind of compression till the RED came out, basically, and then they finally had to say, ‘Okay, fine. We’ll let you compress things.’

So we did what essentially amounted to jpg compression. We would split the movie into luminance and then the colour part of it. Because we knew we’d be throwing most of the colour part away. Or at least not having to deal with it for a while. So we split it into these weird little daughter files. There were the luminance files, which look fine if you look at them, and then the colour files, which were essentially this weird – if you strip the colour out of something, and you’re just looking at the colour information, it’s this kind of weird, out of focus, kind of blobby looking stuff. It’s not even like, colour is on its own, and when you separate it out, it’s not really very sharp or anything else.

I think Paddy Eason and I came up with this idea when I was over in London at one point. Paddy’s been a long time friend from back in the CFC days. And we thought, let’s explore this idea of doing the movie at 1K, because again this was, we’re using O2’s and things like that. And then up-res’ing it, up-res’ing the colour from 1K colour to 2K colour, and then using the luminance from the original 2K file. Or from whatever the output of the effects work was.

So we tried that. I did some tests at Editel where we took the files and we did full-res chroma, half-res chroma, and quarter-res chroma. And the quarter-res chroma looked totally fine, but I was like, ‘Well, let’s not push it.’ So we went to half-res. Half-res actually made these real, nice small files, that were sort of like a quarter the size. And then we had these finished 1K files, and we took the colour from the finished 1K files, applied it to the 2K luminance, and we had these beautiful, pristine looking 2K output files. It worked really well.
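
(Editor’s note: the split-and-recombine trick described above can be sketched as follows. The numpy helpers, the crude box-filter downsample and the nearest-neighbour upsample are illustrative assumptions rather than the show’s actual tools, and real Cineon scans would be handled in log space; the point is simply that colour worked at half resolution can be married back to full-resolution luminance.)

```python
# Work the colour at half resolution, keep luminance at full resolution,
# then recombine. A rough sketch only; array shapes and filters are invented.
import numpy as np

def rgb_to_yuv(rgb):
    # Rec. 601 luma plus simple colour-difference channels
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return np.stack([y, b - y, r - y], axis=-1)

def yuv_to_rgb(yuv):
    y, cb, cr = yuv[..., 0], yuv[..., 1], yuv[..., 2]
    r = cr + y
    b = cb + y
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return np.stack([r, g, b], axis=-1)

def half_res(img):
    # crude 2x2 box-filter downsample
    h, w = img.shape[0] // 2, img.shape[1] // 2
    return img.reshape(h, 2, w, 2, -1).mean(axis=(1, 3))

def up_res(img, shape):
    # nearest-neighbour upsample back to full resolution
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)[: shape[0], : shape[1]]

def recombine(full_res_luma, half_res_colour):
    """Marry full-res luminance with colour that was worked at half resolution."""
    colour_full = up_res(half_res_colour, full_res_luma.shape)
    yuv = np.concatenate([full_res_luma[..., None], colour_full[..., 1:]], axis=-1)
    return yuv_to_rgb(yuv)

frame_2k = np.random.rand(1440, 1920, 3)     # stand-in for a scanned frame
luma_2k = rgb_to_yuv(frame_2k)[..., 0]       # keep luminance at full res
colour_1k = half_res(rgb_to_yuv(frame_2k))   # colour work happens at half res
out_2k = recombine(luma_2k, colour_1k)       # pristine-looking full-res output
```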

That was the kernel of the image processing pipeline. And that was all managed by Lauralee Wiseman and her crew. She was great. She was able to manage that whole process, just keeping all that stuff straight. And then it ended up going to Cinesite to get filmed out with Jackson Yu over there, who did amazingly Herculean amounts of work to get everything in order, and dust busted, and looking good. Half the dust busting we did, and half the dust busting Cinesite did.

vfxblog: Was there anything else that you remember specifically about Pleasantville that you wanted to share?

Chris Watts: A couple of universal truths still hold. The one that I always end up coming back to is that after Pleasantville happened, well, on Pleasantville we had this joke that we said, ‘Don’t underestimate the difficulty of scanning an entire film into a computer and then getting all the frames back out in the right order.’ And that kind of still applies today.

DIs are generally done the same way we did them, mostly because Raymond Yeung, who was our programmer guy, was behind that. He wrote so much software, he ended up getting hired by all the labs to go make their DI pipelines for them. So it was kind of nice – a lot of the stuff that we had sort of fought and struggled through, and come to conclusions on the best way to do it, is still the way that a lot of things get done because it’s the way that Raymond ported it over to whatever facility he was working for at the time. That was kind of an interesting side effect of Pleasantville, was that that stuff persists, and some of the difficulties that we had are still the same difficulties that people have today.

Then also, the other crazy thing about the movie that people might not believe is that we had to find somebody to scan all this stuff really quickly. We tested all different kinds of ways of scanning it, and we decided we wanted something that was kinda quick, and we’d be able to sort of evaluate it as it rolled by, and that was not the way the film scanners worked at that point. So we heard about this new machine called the Spirit DataCine, which sounded very complicated and new and exciting. And we heard that Kodak had one. I went to London, and went to do some tests at VTR. Because we were thinking about using theirs. And they were awesome. And we got back, and then we found out the Kodak had one, but they couldn’t find it. It was like, in some other building or something somewhere. They had this other office in Culver City, that I remember had an elevator that squeaked like the Titanic was sinking. It was basically this box in a basement. This big crate and it said Spirit DataCine on the side. So we cracked the thing open with a crowbar, plugged it in, and basically started playing with it. And it was, we were definitely the first people in LA to mess with one of those things.

We got it open, and we realised quite quickly that they’d made some errors in it – it was still in development, really, but errors in things like the log curves, that we all take for granted. The people at Philips, who’d built the thing, and the people at Kodak who had purchased the thing, they had different ideas about what log colour space meant. So there were some issues that we had to deal with, and some of those things came up in the middle of production. They were kind of hard to swallow. But essentially, in a nutshell, that machine was this brand new piece of kit, that later on became ubiquitous, before they were all pushed out into alleys, right behind the Ranks that they replaced.

But it was these machines that basically enabled us to do the movie, that came along just in time for us to use in the movie. If it had been a week later, we would have had to do it some other way. But luckily, this machine was there, it was sitting at Kodak, nobody even knew what it was, sitting there gathering dust, it had been there for a week or two, just sitting there.

Here’s another thing: a lot of the effects work we did ourselves, and then it was a couple shots we farmed out to CFC, and various other places. One thing that came up was that we had a big clump of work that was completely not something we were expecting to do, or budgeted to do. It had to do with the black and white Gestapo guys. And they had these armbands on. And if you look carefully, you can still see them in a couple frames of the movie, but Gary decided this was too much, so we needed to get rid of those. And what a pain in the ass that was! We probably had 60 shots or something where we had to paint those things out. So that was all done in Commotion, and a little bit was done in Avid Media Illusion, and some of it was done in Matador, too. We had various difficulty levels of shot, based on how big the armband was in the frame. That was one of those things where it was like an armband over this puffy, billowy white shirt, and these people are running around doing stuff. It was pretty hard to do. But you know, we got it done. Marc Nanjo really cut his teeth as an artist on that.

I actually also worked on the last shot delivered of the movie, where William H. Macy says, ‘Honey, I’m home.’ And then it’s a pan around to various things in the house, and there’s a shot of his hat on the hat rack. And I guess they forgot to shoot it or something, or they decided they wanted it later. And so I had to come up with a shot of a hat on a hat rack, and I ended up having to assemble it from a couple other pieces, just the very tail ends of dailies, and frames that we had laying around from other shots on that set. And so I was madly painting that thing in.

It was literally the last thing. That was the one thing that was holding up the movie. And as soon as I was done with that, handed it to Lauralee, and said, ‘Okay, I’m done.’ I turned out the lights and went home. It’s the only movie where I’ve ever felt, ‘Okay, I’m done. There’s nothing more I could do on this movie.’ Usually you get dragged away kicking and screaming. Bob Degus the producer was there. He was like, ‘Oh, thank God.’ We shut off the lights and walked out together, because it was such a moment. And I just imagine that that hat’s probably still sitting there somewhere on that post. Waiting for me to come home.