HOW THE GAME INDUSTRY SHAPED THE CREATIVE INDUSTRY IN 20S
ALESSANDRO STRACCIA
RÉSUMÉ
Alessandro Straccia
25 years of experience in digital arts, focused on creative
technology since 2015
Some companies I've worked for:
Blippar UK, Lockwood Publishing, FabraQuinteiro, Grupo
Eugênio, AgênciaClick
Currently Creative Technologist at Blippar UK, creating AR
and VR experiences for clients like General Mills, Baileys,
MAC Cosmetics, Airbnb, among others.
• Toy Story - 1995 - First feature film made entirely in CGI
• 77 minutes long - 114,240 frames in total
• Each frame taking 45 minutes to 30 hours to render
• Pixar's render farm: a network of 117 computers, running full-throttle 24/7
Voice-over:
Some of you probably weren't born yet in 1995, but that was the year one of the biggest revolutions in the film
industry happened: the release of the first feature film made entirely in CGI, Toy Story. Pixar, at that time, was just a
small studio doing experimental animation and advertising work, basically a spin-off of LucasFilm, the well-known
Star Wars studio.
Producing a 77-minute feature film was a big challenge at that time: even the most advanced computers were too slow
to crunch heavy 3D math, especially the very complex rendering algorithms. 3D and animation software was very
simple and rudimentary in 1995, which made 3D artists and animators work very hard. Each frame took between 45
minutes and 30 hours to render, even with a render farm of 117 computers running full-throttle 24/7.
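Those numbers are worth a quick sanity check. Here is a back-of-the-envelope sketch (my own arithmetic, treating the farm as perfectly parallel, which real render queues never are):

```python
# Rough scale of the Toy Story render job, using the figures above:
# 114,240 frames, 45 minutes to 30 hours per frame, 117 machines.
frames = 114_240
machines = 117

for label, hours_per_frame in [("best case", 45 / 60), ("worst case", 30.0)]:
    machine_hours = frames * hours_per_frame        # total compute needed
    farm_days = machine_hours / machines / 24       # idealized parallel farm
    print(f"{label}: {machine_hours:,.0f} machine-hours "
          f"(~{farm_days:,.0f} days on the 117-machine farm)")
```

Even the best case works out to roughly a month of nonstop rendering on the whole farm; the worst case would be measured in years, which is why the figures are quoted per frame rather than as a single schedule.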
Voice-over:
Here you can see John Lasseter (one of Pixar's founders and Toy Story's director) explaining the workflow of
producing a CGI animation. The funny thing is that this workflow hasn't changed much since then, but you can see
how slow and rudimentary hardware and software were in 1995.
• The Mandalorian - 2020
• Season 1 - 8 episodes
• A big part of the scenes combining CGI and talent used the Virtual Production technique
• Real-time rendering, syncing cameras and lighting across virtual and real sets
• Revolutionary technology that sped up production and reduced costs
Voice-over:
Jump to 2020: the COVID pandemic struck the world, and the cinema industry was hit (again) by a huge technological
innovation: Virtual Production. This tech wasn't entirely new: TV studios had been using it to build
virtual sets for live shows and news since the 2010s, but The Mandalorian was the first feature production (in this
case, a streaming show) to use it for a huge part of its CGI scenes. Jon Favreau, the show's producer, decided to
borrow equipment and software from TV studios to try it on a cinematic production, and it was a BIG win. LucasFilm
(yeah, those guys again!) used a 13-foot-high semi-cylindrical LED screen and shot the actors and foreground props
at the same time the LED screen was rendering (in real time) the animated, interactive set. The real camera and
lighting were synced with the virtual camera and lighting, so if something moved in the real world, it moved on the
virtual set too. More: the reflections and light bouncing off the real actors' costumes and props were real, not added
in post-production the old-school way, so it really gives you the impression the actors are fully immersed in the
virtual set. Even more: any last-minute change or fine-tuning could be done (and reviewed) in real time by a team of
artists backstage.
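At its core, that camera sync means streaming the tracked pose of the physical camera into the engine's virtual camera every frame. A minimal sketch of the idea (all names here are hypothetical; real stages use tracking hardware and engine plugins, not a hand-rolled loop):

```python
from dataclasses import dataclass

@dataclass
class Transform:
    position: tuple        # (x, y, z) in stage coordinates
    rotation: tuple        # (pitch, yaw, roll) in degrees

class VirtualCamera:
    """Stand-in for the engine camera that renders the LED-wall content."""
    def __init__(self):
        self.transform = Transform((0.0, 0.0, 0.0), (0.0, 0.0, 0.0))

    def apply(self, tracked: Transform):
        # Mirror the physical camera pose so the parallax rendered on the
        # LED wall matches what the real lens sees.
        self.transform = tracked

def sync_frame(tracked_pose: Transform, virtual_cam: VirtualCamera):
    """Called once per frame with the latest sample from camera tracking."""
    virtual_cam.apply(tracked_pose)
    return virtual_cam.transform

# One tracked sample from the (hypothetical) stage tracking system:
pose = Transform((1.2, 1.7, -3.0), (0.0, 15.0, 0.0))
cam = VirtualCamera()
sync_frame(pose, cam)   # the virtual set now matches the real camera
```

The whole trick of the LED volume is that this update happens fast enough, every frame, that the background parallax is correct in-camera.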
• Old-school method: chroma key
• Still used by some 90% of the film industry, but much more expensive and complex
• This method uses regular 3D software and regular rendering, taking minutes or hours to
render each frame
• Needs a lot of post-production and editing
Voice-over:
When I say "old school" it's just for fun: chroma key is still the main CGI technique the film industry uses nowadays.
Basically, you shoot all the scenes in front of a green (or blue) screen, which is removed by software and
replaced by CGI elements rendered exactly the way Toy Story was (without the green-screen step, of course),
taking minutes or hours per frame. To make things more complex, the production team needs to add
tracking points on the green screen, which the 3D software uses to track the camera movement and
reproduce it on the 3D camera. Quite complex and expensive.
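The keying step itself is conceptually simple: wherever a pixel is "green enough", substitute the CGI background pixel. A toy illustration in plain Python over nested lists (real compositing tools also handle spill, soft edges and motion blur):

```python
def chroma_key(frame, background, threshold=100):
    """Replace green-dominant pixels in `frame` with pixels from `background`.

    Both images are lists of rows of (r, g, b) tuples; a pixel is keyed out
    when its green channel exceeds the larger of red and blue by `threshold`.
    """
    out = []
    for fg_row, bg_row in zip(frame, background):
        row = []
        for (r, g, b), bg_px in zip(fg_row, bg_row):
            is_green = g - max(r, b) > threshold
            row.append(bg_px if is_green else (r, g, b))
        out.append(row)
    return out

# A 1x2 frame: one green-screen pixel, one "actor" pixel.
frame = [[(10, 220, 15), (180, 120, 90)]]
cgi_background = [[(0, 0, 255), (0, 0, 255)]]
composite = chroma_key(frame, cgi_background)
# The green pixel is swapped for the CGI background; the actor pixel survives.
```

The expensive part is not this substitution, but producing the replacement background: with chroma key, that still means rendering every frame offline.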
With Virtual Production you can cut lots of steps from this workflow, keeping only the (most important)
step: the shooting. The CGI work is done BEFORE the shoot, and actors and crew can see and interact
with the CGI in real time. Much better than acting against a green background, huh?
But, of course, not all CGI shots can be done with Virtual Production. That is why the old-school method is still
used, and will remain one of the main CGI techniques for a long time. Even so, we've been seeing more and more
studios using Virtual Production software to produce regular CGI scenes.
How the Game Industry shaped the Creative Industry
Voice-over:
In this video, The Mandalorian director Jon Favreau talks a bit about the Virtual Production process.
So, what happened in 25 years?
1 to 30 hours per frame vs. real time
- new production techniques
- new software
- new hardware (especially the GPU, the graphics processor)
- real-time rendering software
Voice-over:
So, what happened in 25 years to cut render times from minutes or hours per frame to real time? Of course, hardware
has evolved a lot: our computers are much more powerful than the computers of 25 years ago, especially the graphics
processor, the GPU. But the innovation we've seen on The Mandalorian could only be achieved because we are
standing on the shoulders of another industry…
How the Game Industry shaped the Creative Industry
Voice-over:
…the video game industry, which took off for real in consumer products in the second half of the '70s, with very
simple 2D games. It evolved enormously over 20 years and brought us a technical revolution that happened in
the early '90s…
How the Game Industry shaped the Creative Industry
Voice-over:
…the 3D technology to make games. This groundbreaking evolution transformed very simple 2D games into highly
interactive 3D experiences. Of course, in the '90s console hardware wasn't powerful enough to deliver very
detailed, high-resolution graphics, but even so it was amazing to see 3D objects and camera perspectives running in
real time on your TV screen. 3D graphics pushed console (and home PC) hardware to the limit, and
nowadays, in some games, you can barely tell the difference between a game scene and real-world footage.
To make the process of creating games easier, the game industry created the first game engines. Before that,
game studios and developers had to do hardcore programming to draw 3D graphics, lighting, shadows, and so on, on
the TV screen. For each game. Again and again. Sounds nuts, huh?
GAME ENGINES
Voice-over:
Not only did they have to develop the 3D engine, but also build the interactions, the interface, and so on, for each
new game. So game engines gathered all these tools into one powerful piece of software, using a single development
language. All ready to use: 3D engine, animation tools, physics engine, interaction, interface, in one package. Saving
the time and effort of our poor game developers and artists.
The first game engines were built in the late '80s, but they were very simple and rudimentary. Only a decade later
did they start to be used for real in game studios, especially after Unreal Engine (1998) and Unity were launched.
By the way, these two are the main game engines in the market right now.
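Everything an engine bundles hangs off its main loop, the part every studio used to rewrite from scratch. A toy sketch of a fixed-timestep loop (illustrative only; Unreal and Unity schedule frames far more carefully, with vsync, variable timesteps and threading):

```python
import time

def run_loop(update, render, fps=60, max_frames=6):
    """Toy engine main loop: advance the simulation, then draw the frame."""
    step = 1.0 / fps
    for _ in range(max_frames):
        update(step)      # physics, animation, input handling...
        render()          # ...then the scene is drawn in real time
        time.sleep(step)  # crude frame pacing
    return max_frames

# A one-variable "game": the engine ticks it forward each frame.
state = {"x": 0.0}
frames_run = run_loop(lambda dt: state.__setitem__("x", state["x"] + dt),
                      lambda: None)
# After 6 frames at 60 fps, the simulation has advanced ~0.1 simulated seconds.
```

The point is the division of labor: the engine owns this loop (plus rendering, physics, and input), and the studio only writes the `update` and content that plug into it.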
How the Game Industry shaped the Creative Industry
Voice-over:
But, in my opinion, the main point of using a game engine is the ability to run high-quality 3D models and special
effects in real time on (almost) any personal computer. The laptop I'm writing this lecture on has a good CPU, a
good GPU and a good amount of memory, but it's far, far away from the powerful computers used in big studios and
render farms. And even so, I'm able to run complex 3D scenes on it in real time.
And that is exactly why other creative industries started to get interested in game engines.
How the Game Industry shaped the Creative Industry
Voice-over:
Virtual Reality
Augmented Reality
Simulation & Manufacturing
Film Industry
Broadcast
Architecture
All these industries have been using game engines to deliver content to different platforms: PCs, mobile phones,
tablets, VR devices, interactive displays, etc.
TECHNOLOGY + CREATIVITY
Voice-over:
And it helped shape the creative technology industry as we know it today: using technology (hardware and software)
to deliver creative content, from simple shopping experiences on a mobile app to highly immersive experiences on a
VR headset.
Voice-over:
Creative technology as a professional role is quite a new thing. To be honest, it has always been around (I've been
working in it for 20 years!), but it was mistaken for a developer or designer role, which it is, actually, but not only
that: it goes far beyond. Usually, the creative technology professional has a main skill (development, for example),
but also knows a bit of 3D (enough to tweak a 3D model and export it), animation (to add some special effects or 2D
animation), editing, and even sound effects.
The creative technology professional is multidisciplinary and very curious, which is quite a rare profile, especially
because traditional jobs are looking for more and more specialized professional profiles.
Voice-over:
And this is why creative tech people are in such high demand nowadays: it's very hard to find people with so many
different skills, especially skills as seemingly opposed as tech and creative skills.
So, the time IS NOW! Especially now that remote work has been accepted (and encouraged) by companies all over
the world. Good luck!
Drop me a message at astraccia@gmail.com or join my Discord channel (https://discord.gg/mZxT6nET) to talk about
creative technology, get learning resources, and much more!
Archviz project - Airbnb
Voice-over:
Some projects I've been working on in recent years.
Archviz project - Unreal Engine
AR Experience - Halo Petfoods
AR Experience - C&A Big Brother Brazil
AR Experience - Club Bruges
Alessandro Straccia
astraccia@gmail.com
https://discord.gg/mZxT6nET
