Kelly Richard Fennig is a technical producer who’s worked at Slant Six Games, was the project director for Circa 1948 at the National Film Board of Canada, and is a founding member of Ton Up Interactive. An actor, hardware and software engineer, UX designer, project manager, and musician, his experience across these varied industries gives him a unique perspective and a well-rounded appreciation of what it takes to make games. We were recently able to talk to Fennig about the creation of Circa 1948, difficulties encountered during its production, and long-term goals for the project.
GS: What was the inspiration for this project? Specifically, what is so special about 1948?
Kelly Richard Fennig: The world-renowned visual artist, Stan Douglas, was the key inspiration for the project. (He proposed the project several years ago.) For those unfamiliar with his work, he’s primarily a visual storyteller and photographer, and is known for creating photograph composites that capture a moment in time.
One Stan Douglas photo can be composed of 100 or more separate elements – each specifically chosen and placed, then seamlessly assembled to make a “perfect” photograph. But the true art comes from the curiosity of the audience themselves: viewers discover subtle and nuanced details in his work and, often depending on the order in which they discover them, ask themselves about the significance of those details. Eventually, viewers create their own narrative to explain what happened leading up to the moment, so the audience experience is an integral part of the art, and every experience is unique to each individual.
Douglas has a fascination with history and his style is what I personally call a “dirty reality,” since many of the works I’ve seen of his look very “lived-in,” almost to the point of being run down. This makes sense to me as a storyteller: the more worn out something is, the more it has experienced to get to that state, and the more potential for stories it has to tell. As mentioned before, the devil is indeed in the details, so Douglas makes it a point to be as historically accurate and photorealistic as possible.
Being born and raised in Vancouver, he loves this city and its history, and 1948 was a time when the city was on the cusp of change. For most, there was a deep postwar depression and jobs and money were hard to come by. Soldiers back from the war were without jobs or, for some, even homes. The city was beginning its “urban renewal” and claiming its casualties. The technological innovations of the latter half of the 20th century, marking our modern age, were just around the corner. Looking back, the themes in the story Stan Douglas tells in Circa 1948 would be echoed in any city in North America at that time, and have numerous parallels with the present.
GS: On this note, why did you select the two locations?
Fennig: There are the simple answers – money, budget, and technical limitations. For an ambitious iOS app to have the visual fidelity to honor the works of Douglas, we would need to limit how much content the app could have so it could reliably run at an acceptable frame rate and not be multiple gigabytes in size.
However, there is also an artistic rationale for this decision – the duality of having two locations sets up a ‘compare and contrast’ dynamic with the themes of the story. Geographically, the city of Vancouver is divided along Main Street.
On the traditionally more affluent west side is the site of the Hotel Vancouver. In 1948, it was weeks away from being torn down and relocated; soldiers back from the war who had yet to receive the support promised by the government had taken over the building and were squatting in this “tarnished, dilapidated gem” of the city.
On the working-class east side is Hogan’s Alley. This area was a culturally diverse home to immigrant and migrant workers who turned into backyard entrepreneurs, using whatever skills they had to make a buck, with some of their enterprises being less legal than others. In the middle were those who liked to straddle and profit from both sides. So the divide of race, income, and “urban renewal” got blurred at this moment in time.
And to this day, this divide still holds true.
GS: While recreating the two locations, what archives did you use? Were you able to interview anyone who was alive in 1948?
Fennig: Neither site exists anymore. The old Hotel Vancouver at the intersection of Georgia & Granville Street was torn down, and Hogan’s Alley was razed in 1968 to build the Georgia Viaduct. As a result, we had to rely entirely on archival documentation. Our artists combed the City of Vancouver Archives, as well as the newspaper archives of the Province and the Vancouver Sun. Our producers also got access to the CBC radio archives to gather radio interviews that added audio colour to the world. We even discovered magazine articles published at the time showcasing the architecture of the city.
We were able to confirm the geography as about 95 percent accurate, and where we weren’t certain, we made our best assumption of what would have been there based on our findings. In many cases where there were gaps, some miracle photograph would show up in the strangest of places and times. For example, two months before completion, a photo surfaced revealing that a building we had modeled was completely wrong, so we went back and rebuilt it. It was uncanny – in January 2014, Canada Post celebrated Black History Month by releasing a stamp recognizing Hogan’s Alley. On it, we discovered yet another building. Our art lead Jonny Ostrem, who worked closely with Douglas for the duration of the project, would insist we respect the historic authenticity Douglas revered.
As for the characters in the app, nearly all of them are fictitious. That being said, we were able to interview many people who lived in these communities, and they shared stories about some of the more “colorful and notable” people and events of the time. Drawing on these stories, Douglas worked with screenwriter Chris Haddock and playwright Kevin Kerr to create original characters and situations that were amalgams of them.
GS: During the making of this interactive experience, what were some difficulties encountered?
Fennig: LOL! Where to begin? This project was well underway by the time I came on board – two to four years, depending on who you talk to. By the time I arrived:
- The app was originally planned around the time of the iPad 2, but Douglas’ vision was too technically ambitious for even the iPad Air (four years later).
- The project was created by a series of contractors and students, who rolled on and off at various times based on funding and availability. The only constants were the producers at the National Film Board of Canada’s Digital Studio (the NFB), known and celebrated for their development of HTML5 and Flash experiences. This app would be their first real-time experience.
- The Kraken engine we used was open-source and in development throughout production, right up to shipping.
- Douglas wanted people to see this not as a game, but as art. He wanted every asset, prop, and texture to be unique, so every asset was individually modeled and textured without reuse or instantiation. This put extra strain on the engine, memory, and computing resources.
- The project never went through much of a pre-production stage; they just started producing assets.
- Many of these assets were created by art students whose only experience had been school projects for animation, film & TV visual FX, and demo reels. They had minimal to no knowledge of techniques for optimizing assets for real-time engines or mobile platforms. Many of the lead artists, and the art lead himself, were learning as they went. In the art world, this in itself is part of the “artistic experience” – to learn and grow while creating the “art” – and this “ground up” approach is integral to Douglas’ artistic methodology/“process.” I have a great deal of respect for them because their lessons from the school of “hard knocks” will stick with them forever.
- As new evidence and archival photos arose, assets needed to be rebuilt in order to remain “historically accurate.”
- It was decided that the app wouldn’t use real-time lighting – all lighting and shadows were rendered in Maya onto light and specular maps. With numerous maps held in memory instead of work offloaded to the GPU and rendering pipeline, memory and asset streaming would be the critical path for performance.
- Whenever an asset had an error (texture, model, or otherwise), it quite often required re-rendering the lightmap. Over the course of the project, many, many, MANY of the assets had to be redone.
- At the time, the engine didn’t have much of an asset import tool chain: all assets were created in Maya and Photoshop, then converted and imported into the engine manually. Any spelling mistake in any of the assets would cause errors.
- The user experience and interface went through many iterations and was still too complex for users who were not gamers.
- There were all these assets, but no cohesive end-to-end experience for the user.
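To illustrate the baked-lighting decision above: with all illumination pre-rendered in Maya, the renderer never computes lighting per frame – it only combines a texel’s base colour with the baked light and specular maps. A minimal sketch of that per-texel math (hypothetical names, not Circa 1948’s actual shader code):

```python
def shade_texel(albedo, baked_light, baked_specular):
    """Combine a texel's base colour with lighting pre-rendered offline:
    the diffuse term is modulated by the baked light map, the baked
    specular term is added on top, and the result is clamped to [0, 1]."""
    return tuple(min(1.0, a * l + s)
                 for a, l, s in zip(albedo, baked_light, baked_specular))

# A mid-grey texel under warm baked light with a faint specular highlight:
shade_texel((0.5, 0.5, 0.5), (1.0, 0.6, 0.3), (0.2, 0.2, 0.2))
```

The trade-off Fennig describes falls out of this directly: the GPU does almost no lighting work, but every surface needs its own maps resident in memory, so streaming those maps becomes the bottleneck.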
When I came on board they were “a couple weeks away from shipping,” but only because they had used nearly the entire budget. They realized that, although this wasn’t a “game,” experience from the games industry would provide the perspective they needed to complete and ship the app. This is where I came in.
So for the next six months, we had to simplify the design and create a cohesive experience for an audience not accustomed to any form of first-person, real-time digital experience, on an extremely limited budget. (I am eternally thankful that Loc Dao and Janine Steele at the NFB were able to procure the additional money required for completion.) Even though most of the production wasn’t efficient by conventions already proven in the video game industry, the ship had set sail – my job was to steer it safely into port by any means necessary.
The first step was to actually take a step back and create a design document. From there, we used lean-style design iterations to quickly test out new concepts and simplify the experience for users who are not traditional gamers. Some gaming conventions were brought in, mainly to bring in a simple cause-effect teaching loop. As well, we had to develop a way to optimize the engine and assets but still maintain a high level of fidelity.
It was an exciting six months, to say the least. We were committed to a release at the Tribeca Film Festival, so with all the changes required, we had an asset lock within days of submitting to Apple. This left next to no time to optimize performance, and we came in far too hot for my comfort. Needless to say, I expected the first couple of weeks after release would be crashy, and that we would need real-world user experience feedback to address stability.
GS: How do you see this as an “Augmented Reality” experience?
Fennig: It goes beyond the obvious. Based on actual historic locations, historically and geographically accurate, and incredibly detailed, the app surpasses the standard fare expected from a “game.” These places actually existed, and they were respected and reproduced in a way that lets the user see how life actually was, warts and all.
In the initial release, we have an alternative input scheme we call “viewport” mode: it takes the gyroscopic positioning data from the iOS devices and uses it to control the “in-game” camera of the user. Your phone/tablet becomes a “window to the past”: point the device up, and you look up; turn around, and so does the in-world camera.
This isn’t the standard or ideal mode because, as Jesse Schell pointed out last year at Casual Connect 2013, users’ arms eventually get tired. However, it does allow for a very natural way to look at how the world once was. The Kraken engine supports head-related transfer function (HRTF) audio, so with a set of headphones, the user is fully immersed in the environment.
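As a rough illustration of viewport mode (a sketch under assumptions, not the app’s actual code): the device’s attitude angles, as a gyroscope or motion API might report them, map almost directly onto the in-world camera, with pitch clamped so the view cannot flip past vertical and the heading wrapped around a full circle.

```python
import math

def attitude_to_camera(pitch_rad, yaw_rad):
    """Map device attitude (radians) onto in-world camera angles.
    Pitch is clamped just short of straight up/down to avoid a gimbal
    flip; yaw (heading) is wrapped into [0, 2*pi)."""
    max_pitch = math.radians(89.0)
    cam_pitch = max(-max_pitch, min(max_pitch, pitch_rad))
    cam_yaw = yaw_rad % (2.0 * math.pi)
    return cam_pitch, cam_yaw

# Tilting the device 95 degrees up clamps to 89; turning 30 degrees left
# wraps the heading around to 330.
attitude_to_camera(math.radians(95.0), math.radians(-30.0))
```

In practice the same mapping runs every frame against fresh sensor data, which is what makes the device feel like a window rather than a controller.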
In future releases, there are plans to incorporate GPS and compass data so that those in Vancouver at these historic locations can hold up their devices and see what the world was once like where they stand – the modern world through their own eyes, and the historic world through the app. For example, where a Starbucks now stands, a speakeasy once stood. It’s a gimmick, but it gives the user a more immersive experience of the world.
At the Tribeca Film Festival in New York, we collaborated with R&D Arts and Memo Akten’s team at Marshmallow Laser Feast to go one step further – we took the app, as seen from one frame of view, and projection-mapped the environment onto four walls, almost like a first generation of the Holodeck from Star Trek. This produced a 360º view of the world – where the app allows the user to explore a Stan Douglas photograph, the Tribeca interactive experience literally placed the user inside a Stan Douglas photograph. Using multiple Xbox Kinects and the very latest Mac Pro, we tracked the movement of the user and rendered this “reality.” Off-axis positioning allowed the user to look up, under, and around objects, and we used their body itself as a virtual joystick to move through the world we created in the app.
Both experiences – viewport and the installation – are pretty trippy and very, very cool. Honestly, I really wish more people could experience the installation, but it does cost a bit to transport and set up.
GS: Several of the conversations are influenced by noir films. Which noir films did you and your team turn to? How do you feel this adds to the historical authenticity?
Fennig: The primary point of visual inspiration from Douglas to the art team was the film Hammett. It’s Francis Ford Coppola, so it gets a bye for legitimacy, as he is a stickler for authenticity. Essentially, this movie is an homage to “film noir” – heightened shadows, a femme fatale, corrupt police, etc. Other films were considered, but with respect to mood and detail, why deviate from a master?
The app was originally designed to be set during midday. It wasn’t until extremely late in the project (read: two months prior to release) that we decided to switch to an evening setting. This worked on many levels: the story itself inherently has a “film noir” undertone, so why not make the setting “noir” – verging on the edge of twilight/early evening, moody, with heightened shadows? There is a magic that happens at twilight: light levels are low, fog starts to roll in, and details are obscured.
With the technical limitations of the mobile platform, we could have our cake and eat it too: it gave us a logical and natural way to obscure the details of some of the unessential environments while still supporting the photorealism Douglas is known for. Given the film noir inspirations, it was a natural choice, and it was surprising we didn’t think of it sooner. This choice did mean a lot of late nights and hiring additional artists to re-render nearly all the light maps, but it was definitely worth it. The change was chalked up to the “artistic experience” for the artists: we had to get as far as we did to realize that a time-of-day change would most honor the project.
GS: How did you convince the Canadian government to fund this project?
Fennig: The origins of the project came from a screenplay called Helen Lawrence, a collaboration between artist Stan Douglas and the acclaimed screenwriter Chris Haddock. They originally approached the National Film Board of Canada to make the screenplay into a film, but the NFB was known for producing short animations and documentary films, not fictionalized feature films.
However, the NFB Digital Studio in Vancouver still wanted the opportunity to collaborate with Douglas, so they proposed developing an app inspired by some of the characters and plot elements from the screenplay, presented in a way that focuses on one of Douglas’s artistic staples – non-linear (or recombinant) storytelling. Part of the NFB’s mandate is to push technological boundaries and innovate new ways to tell character-rich, Canadian stories, and this was a chance to try something radically new in the world of art: what the NFB calls the Circa 1948 Storyworld.
In addition to the app, they wanted to create a multi-contextual experience around it, so the Circa 1948 Storyworld is not just the app, but also a historically informative webpage, a Stan Douglas photo series, the immersive projection-mapped installation (as featured at Tribeca and touring major cities), and the stage play of Helen Lawrence itself. (Although not a film as originally intended, Helen Lawrence became a ground-breaking play in which stage actors were filmed against blue screen, composited in real time into the digital environments we developed for the app, and shown to the audience.)
There’s so much more that could be said about the Storyworld project as a whole than could fit into an interview. I highly recommend people read the official press synopsis.
GS: Are there any plans to incorporate architecture that currently exists into this virtual experience?
Fennig: Since these locations don’t exist anymore, there are plans to incorporate GPS telemetry and compass information into the app. So when a person is at the physical location where a structure once stood, they can bring up the app and the tablet truly becomes a “window to the past.” The user could walk down what was once Hogan’s Alley, hold up their phone or tablet, and see what used to stand there.
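The positioning math behind that “window to the past” is straightforward. A minimal sketch under assumptions (hypothetical names, not the NFB’s actual code): over a few city blocks, GPS fixes can be projected into flat east/north metres around a surveyed origin point, and those coordinates then place the in-world camera where the user is standing.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def gps_to_local_metres(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Project a GPS fix into flat (east, north) metres relative to a
    nearby origin -- say, a surveyed corner of Hogan's Alley -- using an
    equirectangular approximation, which is plenty accurate over a few
    city blocks."""
    lat0 = math.radians(origin_lat_deg)
    east = math.radians(lon_deg - origin_lon_deg) * math.cos(lat0) * EARTH_RADIUS_M
    north = math.radians(lat_deg - origin_lat_deg) * EARTH_RADIUS_M
    return east, north
```

Combined with the compass heading for camera yaw, this is enough to anchor the historic scene to the physical street.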
As for incorporating currently existing architecture, I can’t speak to plans just yet. Some proposals are being discussed, potentially for separate but related projects, but it is too soon to disclose them.
GS: How has this technology been received by educators?
Fennig: It hasn’t really been used by educators… yet. However, historians have compared the app content to known historical evidence and records and have applauded its accuracy, detail, authenticity, and respect for the locations and the era.
Again, there are a couple of potential projects in the future that I’m not at liberty to talk about just yet, but with “urban renewal” a constant force for change in our city, the project has archival potential that could be quite cool, so we don’t forget the rich history of bygone generations.
GS: Overall, what are some long-term goals for this project? Is an Android version going to be created?
Fennig: Long term, there are many plans for what was accomplished. Not speaking on behalf of the NFB, I do know that the following may (or may not) happen:
- An Android version is tentatively in the plans, but it will require rewriting much of the Objective-C used in the front end to work in OpenGL ES and Kraken, and porting the Kraken engine to Android. This work will benefit the iOS version as well, since iOS viewports are costly when implementing UI.
- The NFB would like to take the immersive installation on the road. It had an amazing response at Tribeca. The challenge is finding sponsors to cover the transport and setup of the installation.
- As mentioned earlier, the kinesthetic mode linking real-world geo-location telemetry to the app is planned for “on-location” presentations and exploration.
- The NFB is investigating opportunities to use the Kraken engine, the workflow improvements, and the lessons learned to tell similar stories set in different historic locations. Since the project was developed with public tax funding, there is great potential to open this platform up to the public to use and innovate with (again, I’m only speculating, as the Canadian Government owns some of the technology).
Part of the NFB’s mandate is to push technological boundaries and innovate new ways to tell character-rich Canadian stories. Based on the initial feedback, the NFB feels they have accomplished this and are proud to join the small but growing movement of “interactive storytelling” – using gaming techniques and technologies to tell stories. Personally, it feels pretty cool to have Apple feature and endorse an app I’ve worked on, and to open the “gaming” world I love to an audience that wouldn’t normally have discovered it. It shrinks the gap in the “Game” versus “Art” debate.