Wednesday, February 20, 2013
Why retarget missing bitmaps in 3ds Max?
I was just reading a blog post by Nick Covington about retargeting missing bitmaps in 3ds Max scenes. He has a perfectly clever idea for doing so, but I wondered why I've never needed to write something similar for my users. Or rather, why do so many people feel that bitmap retargeting is necessary?
3ds Max has an "External File Paths" feature found under Customize/Configure User Paths. It allows you to list folders in which bitmaps can be found on your system, regardless of where they were when first assigned. For instance, if you load a Max scene containing a reference to "C:\my_killer_art\foo.tga", but that file or folder doesn't exist on your PC, it will automatically look for "foo.tga" in each of the External File folders until it's found.
It also works for shader files. As long as that list contains paths to all the bitmaps your scenes could possibly use, you'll never see another "missing files" error again.
So am I missing something? Do people just prefer to retarget in the scenes themselves, rather than maintain that list of external file folders? I don't, but mileage might vary.
Admittedly, one limitation of the External File Paths list is that it doesn't automatically recurse into subdirectories. If a folder sits below one listed there, Max will not look in it; it needs to be listed explicitly. Rather dumb.
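Since the list doesn't recurse, one workaround is to generate the explicit subfolder list yourself and paste each entry into the dialog. Here's a minimal Python sketch of that idea; the folder layout below is just a hypothetical example built in a temp directory:

```python
import os
import tempfile

def list_subfolders( texture_root ):
    # Collect the root and every folder below it, since 3ds Max's
    # External File Paths list won't recurse on its own.
    return [ dirpath for dirpath, dirnames, filenames in os.walk( texture_root ) ]

# Demo on a throwaway folder tree (hypothetical layout)
root = tempfile.mkdtemp( )
os.makedirs( os.path.join( root, 'characters', 'heads' ) )
for folder in list_subfolders( root ):
    print( folder )
```

Each printed path can then be added as its own entry under Customize/Configure User Paths.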
Thursday, March 1, 2012
Perforce Python API Basics
Many Python users working with Perforce believe that calling out to "p4.exe" with subprocess is the only method available. Perforce actually maintains free, native API packages for several languages, including Python. The Perforce Python API is fast, fully-featured and easy to work with. It lets you interact with Perforce in a familiar Python manner, without having to capture and parse command-line output. Parsing output is one of my least favorite things to do, and I doubt I'm alone there.
Here is a dead-simple example, showing how to use the Perforce Python API to sync all files in a certain depot folder.
    # Sync contents of a folder
    import P4

    p4_api = P4.P4( )
    p4_api.connect( )
    results = p4_api.run_sync( '//project_x/...' )
    p4_api.disconnect( )

Some notes on connections... You'll notice above I first "connected" before issuing any commands with the API. Typically you do this once in your tool or script, run any Perforce commands you need, then disconnect when you're finished or the tool closes. The API object will also disconnect when it falls out of scope and gets destroyed, so there's no need to open and close the connection all the time.
You can also use the "with" statement to easily manage the connection, automatically disconnecting when that block of code is completed:
    with p4_api.connect( ):
        # connected here
        results = p4_api.run_sync( '//project_x/...' )
    # disconnected here

Going back to the top example... as written, it will simply use the default Perforce port, client and user. If you want to set those explicitly rather than use the defaults, call the "set_env" function prior to your connect call (new in version 2011.1):
    p4_api.set_env( 'P4CLIENT', 'my_workspace' )

Next, take a look at the "run_sync" command we issued. One cool thing about the Perforce Python API is that the general syntax for everything is "<api_object>.run_<command>( args )", where "command" is literally the command string you would pass to p4.exe when using the command-line interface. Examples: "run_sync", "run_edit", "run_add", "run_fstat", etc. If you know how to use Perforce from the command line, you already know how to use the Python API.
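That naming pattern is easy to picture as simple attribute dispatch. The toy class below is purely hypothetical, not the real P4 class, but it illustrates how a run_<command> call can map onto one generic run function:

```python
class FakeP4( object ):
    # Hypothetical stand-in for the P4 class, just to illustrate
    # the run_<command> naming pattern.
    def run( self, cmd, *args ):
        return 'p4 %s %s' % ( cmd, ' '.join( args ) )

    def __getattr__( self, name ):
        # Turn run_sync( args ) into run( 'sync', args ), etc.
        if name.startswith( 'run_' ):
            return lambda *args: self.run( name[ 4: ], *args )
        raise AttributeError( name )

p4 = FakeP4( )
print( p4.run_sync( '//project_x/...' ) )   # prints "p4 sync //project_x/..."
```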
Above you'll see I captured the return value of our sync as "results". Calls like this all return a single list of dictionaries, one dict for each file the operation was run on. In the sync example above, it only has to update two files in my workspace, so the results object returned looks like this:
    [
        {
            'totalFileSize': '5299712',
            'rev': '319',
            'totalFileCount': '2',
            'clientFile': 'D:\\projects\\project_x\\stuff.dll',
            'fileSize': '4865024',
            'action': 'updated',
            'depotFile': '//project_x/stuff.dll',
            'change': '969310'
        },
        {
            'action': 'updated',
            'clientFile': 'D:\\projects\\project_x\\foo.txt',
            'rev': '134',
            'depotFile': '//project_x/foo.txt',
            'fileSize': '434688'
        }
    ]

Looking at the second dictionary at the bottom, you'll notice several keys holding data from the sync operation on that file, including the action ("updated", "added", etc.), both the client and depot paths to the file, and its new revision number.
For some operations the first dictionary returned contains some extra keys related to the overall operation, such as the total number of files acted on, and their total sizes on disk.
These returned results are full of any data you need to present friendly messages to your users. Being in simple dictionary form means they're flexible and easy to work with.
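For example, a tool could turn those dictionaries into a short status report. The sample data below is hardcoded to mirror the sync results above, and the message format is just something I made up:

```python
# Sample results shaped like the run_sync output above
sample_results = [
    { 'action': 'updated', 'depotFile': '//project_x/stuff.dll', 'rev': '319', 'fileSize': '4865024' },
    { 'action': 'updated', 'depotFile': '//project_x/foo.txt', 'rev': '134', 'fileSize': '434688' },
]

def friendly_messages( results ):
    msgs = [ ]
    for item in results:
        # Not every dict is guaranteed to carry every key, so use .get( )
        msgs.append( '%s#%s %s (%s bytes)' % ( item.get( 'depotFile', '?' ),
                                               item.get( 'rev', '?' ),
                                               item.get( 'action', '?' ),
                                               item.get( 'fileSize', '?' ) ) )
    return msgs

for msg in friendly_messages( sample_results ):
    print( msg )
```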
Sunday, January 29, 2012
Old is the new new
If you're seeing old posts of mine in the RSS feed, I apologize. The code syntax highlighter I use stopped working, and I had to modify several posts to get it working again. For unknown reasons, that caused the feed to treat them as new.
If anyone else uses Blogger and knows how to avoid that, please enlighten me.
Monday, January 9, 2012
GDC 2012 Tech Artist Boot Camp Announcement
I'm organizing and MCing the Tech Artist Boot Camp at GDC 2012 in March. The TABC is an all-day Tutorial-format session on Tuesday, March 6, from 10 AM to 6 PM.
Below is the session description and list of speakers & topics. We also plan to do a group panel-style Q&A session at the end of the day.
I spoke at the TABC last year, and it was an excellent way to reach out to and share with other industry TAs. I hope to see you there!
Description
Technical Art is evolving rapidly. In many studios TAs play key roles in developing efficient tool pipelines and ensuring art content is visually striking and optimized for performance. TAs bridge content and engineering, helping make both more successful. However, many studios have still not fully embraced the TA role. Their TAs are smart and eager to make an impact, but are not sure how best to prove their value and be given key roles in development.
A group of experienced, respected technical artists from across the industry would like to invite you to sit with them for a day and learn how to be a more effective TA. Focus on the tools and skills TAs can use to demonstrate their value, and further integrate technical art into their studios' pipelines and cultures. Find the worst development problems at your studio and show them what a TA can do!
Intended Audience
This all-day tutorial is for technical artists and other developers of any experience level. A light focus will be placed on techniques and skills useful to TAs at studios with little-to-no tech art integration and culture.
Takeaway
At the end of this all-day event, attendees will understand key techniques to help them take technical art to the next level at their studios. Learn how to effectively work within constraints, integrate into your teams, communicate with other disciplines, design better code and pipelines, and master new shader techniques.
Speakers & Topics
---
Welcome, Introduction
Adam Pletcher, Technical Art Director, Volition, Inc.

You Have to Start Somewhere... Defining the Tech Art Role and Building Their Team
Arthur Shek, Technical Art Director, Microsoft Studios (Turn 10)
This session will go over the trials of moving from a job in film/animation to a studio with a minimal Tech Art presence, and the ensuing panic of change. The Tech Art role has a soft definition and differs at every studio; our common quality is that we are problem solvers, and to problem solve you must have experience, wide knowledge and the ability to scramble on your feet. At times, what we may feel pressure to know can be overwhelming. Relax: you have to start somewhere.

Better, Faster, Stronger: Teaching Tech Artists to Build Technology
Rob Galanakis, Lead Technical Artist, CCP Games
The success of Tech Art has caused a complexity of projects and tools for which our traditional skill set is under-equipped. Tech Artists are now building technology, not just scripts, and our essential growth must be as a cohesive team, not just trained individuals. In this session, attendees will learn how to apply a few key practices of professional software development, such as code review, support processes, and collaborative coding, to the unique environment of Tech Art.

Build It on Stone: Best Practices for Developing a Tech Art Infrastructure
Seth Gibson, Senior Technical Artist, Crystal Dynamics
In this session we present a set of best practices for building Tech Art tools and pipelines in a stable, maintainable, and scalable fashion, through the establishment of a solid tools-development infrastructure geared toward the specific needs of Technical Artists.

Joining the Dark Side: How Embedded Tech Artists Can Unite Artists and Programmers
Ben Cloward, Senior Technical Artist, BioWare Austin
Technical Artists can be a powerful force to unify teams and ensure that productions run smoothly. In this case study, I'll show how the simple act of moving two technical artists into the programmers' working area helped improve the relationship between art and programming, and resulted in a better-looking, more efficient game.

Lessons in Tool Development
Jason Hayes, Technical Art Director, Volition, Inc.
All too often, the importance of planning the architecture of tools and pipelines in game development is overlooked. In most cases, project pressures give us the false impression that we don't have time to plan, or worse, that we actually save time by "just getting it done". Nothing could be further from the truth. This session explains why up-front planning is important and when to recognize over-engineering, and offers architectural design principles for effective tools development, such as program organization, data design, scalability and user-interface design. Internal tools developed at Volition will be used to demonstrate these topics.

Shady Situations: Real-time Rendering Tips & Techniques
Wes Grandmont III, Senior Technical Art Director, Microsoft Studios (343 Industries)
This tutorial session will cover a variety of techniques that can be used individually or combined to solve a variety of game-related real-time shading problems. It will begin with a brief overview of the current-generation GPU pipeline, followed by some HLSL basics. The rest of the talk will dive into a range of techniques, with a complete overview of how each one is implemented.

Unusual UVs: Illuminating Night Windows in Saints Row: The Third
Will Smith, Technical Artist, Volition, Inc.
This session presents a holistic case study involving HLSL shader development. Included is not only the problem and its resolution but, perhaps more importantly, insight into the Technical Artist's problem-solving mindset throughout.

Group Q&A, Conclusion
Monday, December 5, 2011
py2exe, Windows 7 & Vista
I don't use py2exe very often, but it can be a useful tool for environments that may not have an existing Python installation.
I recently used py2exe on my Windows 7 PC to build a small tasktray tool. The resulting executable ran fine on my PC (doesn't it always?), but threw an exception on any Vista PC it was run on.
When run on Vista, it raised an import error from deep inside the win32 extensions:

    File "win32com\__init__.pyo", line 5, in
    File "win32api.pyo", line 12, in
    File "win32api.pyo", line 10, in __load
    ImportError: DLL load failed: The specified module could not be found.

After more online searching than I'd like to admit, I found a post saying py2exe may be including Windows 7-specific DLLs, when instead it should be leaving those out, forcing Vista to go find its own native builds of those DLLs.
I was able to fix the problem by adding two DLLs to the "dll_excludes" list in my py2exe setup script:
    options = {
        "bundle_files": 3,
        "compressed": 1,
        "optimize": 1,
        "excludes": excludes,
        "packages": packages,
        "dll_excludes": [ "mswsock.dll", "powrprof.dll" ]
    }

The tool now runs on both Vista and Windows 7.
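For context, here's a minimal sketch of how an options dict like that plugs into a py2exe setup script. Everything here is hypothetical boilerplate (the app script name "my_tool.py" especially); note that py2exe expects its options nested under a "py2exe" key when passed to setup:

```python
# Hypothetical minimal py2exe options sketch
options = {
    "bundle_files": 3,
    "compressed": 1,
    "optimize": 1,
    # Exclude OS-specific system DLLs from the bundle so each
    # Windows version loads its own native builds at runtime.
    "dll_excludes": [ "mswsock.dll", "powrprof.dll" ],
}

# In the real setup.py you would then run:
#   from distutils.core import setup
#   import py2exe
#   setup( windows = [ "my_tool.py" ], options = { "py2exe": options } )
```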