I was interested in learning two things before starting this project: first, how buffers work in Shadertoy; second, force-based physics simulation using Verlet integration.
In the bottom left corner a small square can be seen; that is the buffer I use to store information for each point. The top part of the square represents constant data in the 0–1 range, initialized when the particle spawns – these attributes are radius, mass, bounciness and color, stored together in a vec4. The bottom part of the square holds the particle state, which changes over the course of the simulation. At the beginning of each frame the buffer is read, then the simulation step runs, and the result is written back to the buffer for use in the next frame. Since this is a 2D simulation, the state fits into a single vec4 – position.xy and velocity.xy.
What I think is great about a force-based physics simulation is how easy it is to add affectors (or fields) that modify the acceleration of each particle. In this example, three different forces act on the balls: Wind, Gravity and Attraction – just like in the real world!
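To make the affector idea concrete, here is a minimal, engine-agnostic sketch in C# of a force-based update with stacked affectors. The affector functions and constants are illustrative assumptions, and the integration step is shown as semi-implicit Euler for brevity – the actual shader stores position.xy and velocity.xy in the buffer described above.

```csharp
using UnityEngine;

// Minimal sketch: accumulate affector forces, then integrate.
// The Shadertoy version does this per-pixel in GLSL, reading and
// writing position.xy / velocity.xy from the state buffer.
public struct Particle
{
    public Vector2 Position;
    public Vector2 Velocity;
    public float Mass;
}

public static class ForceSim
{
    // Hypothetical affectors; each one contributes a force.
    static Vector2 Gravity(Particle p) => new Vector2(0f, -9.81f) * p.Mass;

    static Vector2 Wind(float time) => new Vector2(Mathf.Sin(time) * 0.5f, 0f);

    static Vector2 Attract(Particle p, Vector2 target, float strength) =>
        (target - p.Position).normalized * strength;

    public static void Step(ref Particle p, Vector2 target, float time, float dt)
    {
        // Sum all affector forces; F = ma gives the acceleration.
        Vector2 force = Gravity(p) + Wind(time) + Attract(p, target, 1f);
        Vector2 accel = force / p.Mass;

        // Semi-implicit Euler shown for brevity; the post uses a
        // Verlet-style scheme, but the affector idea is identical.
        p.Velocity += accel * dt;
        p.Position += p.Velocity * dt;
    }
}
```

Adding a new affector is then just one more term in the force sum, which is exactly what makes this approach so easy to extend.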
Shadertoy is a website where you can code shaders and share them with members of the community. I use it not only as a tool to create art, but also to prototype rendering functions for use in realtime VFX for games.
Building upon the idea of “spherifying” UVs, I thought that creating a Solar System would be a great application of that technique. The spherified UVs can be seen in this Shadertoy: https://www.shadertoy.com/view/3lGfWy
The planets are basically fake spheres; their orbit speeds are fairly accurate, but the rest of the parameters are not.
While making this, I learned how to create multiple render passes and composite them to create the final image.
In 2015 I moved to Germany for work, and in the beginning I got around fine speaking only English. I put off learning German for a very long time, since I knew it would be hard work to learn a new language, and I regret not starting sooner. At the end of 2018 I finally started taking German classes at VHS Offenbach.
I took classes three days a week, with each class three hours long – it was hard work, but I think it really paid off. The German language integration system does work, at least it did for me. I finished the classes A1.2, A2.1, A2.2, B1.1, B1.2, B2.1 and B2.2 in two years and took the telc exams for B1 and B2, which I passed with the grade “Sehr gut” (very good).
At the beginning of my studies, I always felt there was no good way for me to learn the articles of German nouns. Since the articles are a big part of the foundation of the language, they are really important to learn, so I figured: why not gamify the article learning process? I looked around for existing apps, but all of them lacked some or many of the features I thought were important:
Color coded articles
Sound cues when answering correctly
Highlighting of rules
Translation of words
Words used in context with example phrases
Explanation of the word
Motivational support
Sense of reward and achievement
Sense of progress
Repetition
A learning path
Themed chapters
Ability to share word lists with others
I wanted to create an app that combines many different learning techniques to make sure you learn not only the articles, but also new words and their meanings. For me, the most important part of a learning experience is having fun!
As a challenge, I decided to create my own app, not only as a way of learning the articles myself, but also to help others learn them. I was also looking for a way to learn new tools and technology. Creating my own app had always appealed to me, but I never really had a reason or motivation to do it – until now. This was the perfect project for learning how to build an app of my own!
I decided to use Unity for this project, since I already knew the Engine from a content perspective and I also wanted to learn how to make a full game with it.
I had always had an interest in databases and thought this project would be perfect for learning them. I decided to use SQLite with Unity, and it turned out to be a great match. I extracted data from Wiktionary to populate the database, and then used Microsoft Neural Machine Translation to translate many of the words into different languages.
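As a hedged illustration of the Unity + SQLite pairing, here is a minimal query using Mono.Data.Sqlite. The table and column names (words, article, noun) are assumptions invented for this sketch – the post doesn't show the real schema.

```csharp
using Mono.Data.Sqlite; // assumes a SQLite plugin is set up in the Unity project
using UnityEngine;

// Sketch: fetch one random word from the database.
public static class WordDb
{
    public static void PrintRandomWord(string dbPath)
    {
        using (var conn = new SqliteConnection("URI=file:" + dbPath))
        {
            conn.Open();
            using (var cmd = conn.CreateCommand())
            {
                // Hypothetical schema, for illustration only.
                cmd.CommandText =
                    "SELECT article, noun FROM words ORDER BY RANDOM() LIMIT 1;";
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Debug.Log(reader.GetString(0) + " " + reader.GetString(1));
                }
            }
        }
    }
}
```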
The app database contains a total of 72,319 words.
I built a prefab emitter that spits out planks in Unity using C#.
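A minimal sketch of such an emitter might look like the component below; the field names, interval and forces are illustrative assumptions rather than the project's actual code.

```csharp
using UnityEngine;

// Sketch of a plank emitter: spawns a random plank prefab on a timer
// and launches it forward. Assumes each prefab has a Rigidbody.
public class PlankEmitter : MonoBehaviour
{
    public GameObject[] plankPrefabs; // e.g. the six plank variations
    public float interval = 1.5f;     // seconds between planks
    public float launchForce = 12f;

    float timer;

    void Update()
    {
        timer += Time.deltaTime;
        if (timer < interval) return;
        timer = 0f;

        // Pick one of the plank meshes at random (see Plank Projectile below).
        var prefab = plankPrefabs[Random.Range(0, plankPrefabs.Length)];
        var plank = Instantiate(prefab, transform.position, Random.rotation);

        // Launch it forward with a little random spin.
        var body = plank.GetComponent<Rigidbody>();
        body.AddForce(transform.forward * launchForce, ForceMode.VelocityChange);
        body.AddTorque(Random.insideUnitSphere, ForceMode.VelocityChange);
    }
}
```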
While creating the effect, I built the scene at the same time. I made this scene (maybe a little too complex for a real-world workflow) with some control parameters, because I like working with interactive objects.
Providing a simple context like this saves a lot of time for me when iterating on the effect. I think the result turns out better too. Setting up a scene like this also allows me to think about potential implementation features the effect could benefit from.
Plank Projectile
The plank projectile is simple geometry with a convex collider. One of six different plank meshes is randomly selected on creation. The plank variations are roughly 0.5–1 meter long.
Flame sprites use the plank mesh as a geometry emitter. The particles are simulated in local space so that the plank itself looks like it is on fire.
Smoke emission is distance-based combined with a standard rate-over-time emission, so the plank still emits smoke when resting on land (a configuration sketch follows below).
Additive Smoke, lit up by the flame, uses an additive shader with the same texture as the regular smoke but a shorter lifespan.
Embers are sprites with noise applied to their motion.
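As referenced above, here is how the combined smoke emission could be configured on a Unity ParticleSystem; the rate values are illustrative assumptions.

```csharp
using UnityEngine;

// Sketch: smoke that emits both over distance (while the plank flies)
// and over time (so it still smokes while resting on land).
[RequireComponent(typeof(ParticleSystem))]
public class SmokeEmission : MonoBehaviour
{
    void Start()
    {
        var emission = GetComponent<ParticleSystem>().emission;
        emission.rateOverTime = 4f;      // baseline smoke while static
        emission.rateOverDistance = 2f;  // extra smoke while moving
    }
}
```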
Ground impact
When the plank hits the ground, it spawns an impact VFX oriented along the impact normal. The impact shares a few elements with the plank fire – Smoke, Additive Smoke and Embers.
After the plank has come to a stop, it stops burning after a few seconds. The extinguish sequence is not really animated or designed; it just disables the emission of particles. If I were to take this effect further, I think the way it fades out would be important as well.
Water impact
Directional elements are driven by the impact direction of the projectile; I named this property tilt in this project. With a steep impact angle the splash should appear less directional, while flatter impact angles should result in a more directional splash. The impact “directionality” is handled in code, tilting only a few select elements of the water impact effect.
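A sketch of how such a tilt could be computed and applied is shown below; the function names and the lean factor are illustrative assumptions, not the project's code.

```csharp
using UnityEngine;

// Sketch of the "tilt" idea: derive how directional the splash should
// be from the impact angle and lean only the selected elements.
public static class SplashTilt
{
    // Returns 0 for a steep (near-vertical) impact, 1 for a flat one.
    public static float ComputeTilt(Vector3 impactVelocity, Vector3 surfaceNormal)
    {
        float steepness =
            Mathf.Abs(Vector3.Dot(impactVelocity.normalized, surfaceNormal));
        return 1f - steepness;
    }

    public static void Apply(Transform element, Vector3 impactVelocity,
                             Vector3 surfaceNormal)
    {
        float tilt = ComputeTilt(impactVelocity, surfaceNormal);
        Vector3 travelDir =
            Vector3.ProjectOnPlane(impactVelocity, surfaceNormal).normalized;

        // Lean the element's up axis from the surface normal toward the
        // travel direction; a flat impact leans more than a steep one.
        // The 0.5 factor caps the maximum lean and is purely illustrative.
        element.up = Vector3.Slerp(surfaceNormal, travelDir, tilt * 0.5f);
    }
}
```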
Ripples & Foam consist of expanding rings on the water. Secondary ripples appear in the splash direction as the water from the splash falls back down onto the surface.
The fire is extinguished, and the result is a small Smoke puff and Embers.
For the water splash I used two different textures: one is mistier and well suited for the bigger elements, while the second is used for smaller elements such as droplets.
After a short delay there is a secondary, smaller splash of water, simulating how the plank impact creates a small air pocket that then collapses into the secondary splash. This effect is super obvious when looking at reference videos of objects dropped into water.
Basic water droplets add a little more detail.
What could take this effect further?
Scalable effects based on per-plank variables (attributes or parameters), for example:
Size of plank affects fire amount.
Fire-amount variation to give each thrown plank a more unique look.
Impact scaled by plank size and impact speed.
Better water surface shader
Transparent, underwater fog & vertex animated.
A vertex-animated water surface would currently break my effect, as I use horizontal billboards for a flat, static water surface.
Water surface transparency would also break the effect, as the water splash particles are currently rendered underwater.
Bubbles underwater from the plank when the fire fizzles out.
Refraction on the water splash.
Heat haze distortion from fire.
Improved particle lighting & shading
Support time of day.
Smoke should react better to dynamic lights.
Improved backlighting (when looking through particles towards the sun).
Burning wood-shader on plank.
Adding wind to particle motion.
Ripples from plank when it’s floating.
Smaller textures by using less frames in flipbooks.
I like the water-splash a little better when I set timescale to 1.3, so I think this element could have benefited from a little extra tweaking to its timing.
Screen effect that adds a water splash to camera lens if close enough.
While at Jagex, I worked on the game Transformers Universe. This game has a lot of history.
I joined in late 2012, when Jagex was still developing its own engine for the game; this of course came with its own particle tool, which I helped develop.
Jagex’s own engine later got scrapped and we moved to Unity. I had no previous experience with Unity, but it ended up becoming one of my favourite game engines to date!
Transformers Universe was an online PvP tactical MOBA-style web-browser game where the visual VFX language was key for the readability of player abilities. It was very important that you could tell which ability the other player was using, so you could either stay away or time your attack! It was really fun to develop this type of VFX language for abilities, and it was super challenging to try to make the VFX stand out and be unique. Each Transformer had a handful of abilities that had to be unique and adhere to one of three different damage types, their faction and their personality – not to forget their vehicle-form abilities as well!
Each ability in the video has a short description of what it does, and we had almost free hands when it came to creating the VFX. Usually the bigger and more impactful the better – but it still had to be readable!
I love Unity’s ease of use, while it remains powerful and really flexible. If I find there’s something I can’t do out of the box in Unity, I can usually write some C# and fix whatever I want to do. For a VFX artist in Unity, knowing how to program is key, in my opinion. I wouldn’t be able to do much without it! I find it limiting to either rely on someone else to do the programming for you or stay forever bound by the off-the-shelf tools.
While working at Crytek I got the opportunity to create Realtime VFX for a few bigger VR titles. VR is Cool.
Working in VR as a VFX artist is really challenging, as there are far fewer resources available. I usually had 1–2 milliseconds of total render-time budget at my disposal. This depended on scene complexity, of course, and on whether we were CPU or GPU bound. There are a lot of things to consider when working with VR that you can normally ignore. Hitting a 60 fps target on a PlayStation 4 is easier said than done! We spent a lot of time optimizing performance on these VR experiences.
The first project in this video, labeled Sky Harbor (0:00–0:16), is basically a long cutscene developed to benchmark graphics cards. We used a new particle system that I had been involved in developing. There were a lot of cool new features available, and it was amazing how fast it was. The new system also introduced GPU particles, which work really well in VR!
One sequence I’m particularly proud of, which also required a lot of hard work and effort, is where the big ship comes crashing down and gets shot by the big cannon. This sequence would have been impossible for me to finish without an Alembic cache. Alembic caches are super fast to process; the downside is that they cost a bit of memory, but that is easier to come by when working in VR in such an empty scene. In the end the whole Alembic sequence allocated 350 MB of RAM, which is not that much considering its complexity and length – 1700 frames and roughly 100k vertices. The sequence was animated in Maya; some parts were simulated while others were hand-keyed.
The second project in the video is Robinson: The Journey, where you play a little boy stranded alone on a planet full of dinosaurs. I really like the whole world of Robinson, and it had amazing environments. I mostly worked on scripted events, triggers and environmental VFX, as there wasn’t much gameplay-wise – the player had access to a scanner tool, but that was about it.
I think VFX in VR is all about adding to the immersion, whether it’s the tiny dust particles present around you or a big explosion. A VFX artist can add really small things that make you feel a lot more physically present.
In pursuit of taking my Unreal Engine skills to the next level, I wanted to learn more about Blueprint and how to create object-oriented archetypes.
I built this game so that all towers inherit from the same parent Tower class, while all enemies inherit from an Enemy base class.
This game uses data-driven gameplay to control most of the game's design variables. All Tower, Enemy and Wave parameters can easily be balanced simply by updating and re-importing an Excel spreadsheet.
I really like the flexibility of this data driven approach. Saves me a lot of time!
Levels
Each wave, or level, is defined with a few parameters. This makes the levels really easy to tweak and extend.
Towers
Tower data is exposed in the spreadsheet where all tower parameters are tweakable.
Gameplay
To make the game more tactical and challenging, it supports five different Armor and Damage types with different strengths and weaknesses according to a spreadsheet-defined matrix.
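Such a matrix typically boils down to a multiplier lookup. The sketch below shows the idea in engine-agnostic C#; the type names and multiplier values are invented for illustration, since the actual matrix isn't reproduced here.

```csharp
// Engine-agnostic sketch of a damage/armor type matrix lookup.
// Type names and multipliers are illustrative assumptions.
public enum DamageType { Kinetic, Fire, Frost, Arcane, Poison }
public enum ArmorType  { Light, Medium, Heavy, Shielded, Unarmored }

public static class DamageMatrix
{
    // Rows: DamageType, Columns: ArmorType. In the project these values
    // come from the re-imported spreadsheet rather than being hardcoded.
    static readonly float[,] Multiplier =
    {
        { 1.00f, 1.00f, 0.50f, 0.75f, 1.25f }, // Kinetic
        { 1.25f, 1.00f, 1.00f, 0.50f, 1.00f }, // Fire
        { 1.00f, 1.25f, 0.75f, 1.00f, 1.00f }, // Frost
        { 0.75f, 1.00f, 1.25f, 1.50f, 0.50f }, // Arcane
        { 1.00f, 0.50f, 1.00f, 1.00f, 1.50f }, // Poison
    };

    public static float Resolve(float baseDamage, DamageType dmg, ArmorType armor)
        => baseDamage * Multiplier[(int)dmg, (int)armor];
}
```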
I enjoyed playing Destiny 1 back in the day, when it featured Lootcave gameplay. I thought re-creating the Lootcave phenomenon would be a great way to explore data-driven gameplay.
I wanted to learn how to use an external Excel spreadsheet as input to Unreal Engine. I wanted to control as much as possible using this spreadsheet, such as items, balancing and loot tables.
The purpose of this project was to learn how to use:
Multiple maps and switching between them.
Data driven game design and balancing.
Item database.
Loot tables.
Enemy attributes.
The project was built around the principle of being able to extend, maintain and tweak it with more levels, items and monster types. All of these things can be done really quickly from an Excel spreadsheet, without the involvement of programmers – a simple re-import of the Excel sheet is enough.
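As a sketch of what one of these spreadsheet-driven systems boils down to, here is a weighted loot table in C#; the item names and weights are invented for illustration.

```csharp
using System;
using System.Collections.Generic;

// Sketch of a weighted loot table – the kind of data that would live
// in the spreadsheet. Item names and weights are illustrative.
public sealed class LootTable
{
    readonly List<(string item, float weight)> entries =
        new List<(string, float)>();
    readonly Random rng = new Random();
    float totalWeight;

    public void Add(string item, float weight)
    {
        entries.Add((item, weight));
        totalWeight += weight;
    }

    public string Roll()
    {
        // Pick a point in [0, totalWeight) and walk the entries.
        double roll = rng.NextDouble() * totalWeight;
        foreach (var (item, weight) in entries)
        {
            if (roll < weight) return item;
            roll -= weight;
        }
        return entries[entries.Count - 1].item; // numeric safety net
    }
}

// Usage, as if built from re-imported spreadsheet rows:
// var table = new LootTable();
// table.Add("Rare Engram", 1f);
// table.Add("Common Engram", 20f);
// string drop = table.Roll();
```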
Realtime VFX artists can use motion-vector frame blending together with a 2D lookup texture to achieve a higher amount of detail using fewer resources. These shader techniques offer an efficient way to save texture memory and shader instructions.
A looping texture offers effectively infinite particle lifespans and a slower texture playback rate. My example texture has 32 frames, and its playback speed can easily be slowed down to 5% without sacrificing visuals.
Required Textures:
Main texture, BC5 two channel greyscale
Motion Vectors, uncompressed
Linear2D gradient lookup map, RGBA 256×256
The main texture source is a 32-frame animation created in FumeFX. The motion vectors came rendered straight out of FumeFX, with no optical-flow post-process. The sequence was made looping using a simple crossfade in After Effects.
This shader was inspired by Klemen Lozar’s blog post about motion vector frame blending. LINK
Another source of inspiration was Simon Trümpler’s blog post about Fallout 4’s clever use of 2D gradients: LINK
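To make the blending step concrete, here is the core math written as plain C# with CPU-side texture sampling for readability – in the real asset this lives in a pixel shader. The flipbook layout parameters and the sign convention of the decoded vectors are assumptions; depending on how the motion vectors were exported, the offsets may need flipping.

```csharp
using UnityEngine;

// Core of motion-vector frame blending, CPU-side for readability.
public static class FlipbookBlend
{
    // Map sub-frame UVs to the frame's cell in the flipbook atlas.
    static Vector2 FrameUV(Vector2 uv, int frame, int cols, int rows)
    {
        int x = frame % cols;
        int y = frame / cols;
        // Flipbooks usually count frames from the top-left, while Unity
        // UVs start at the bottom-left, hence the row flip.
        return new Vector2((uv.x + x) / cols, (uv.y + (rows - 1 - y)) / rows);
    }

    // Remap a 0..1 motion-vector sample to a signed UV offset.
    static Vector2 DecodeMotion(Color sample, float strength) =>
        new Vector2(sample.r - 0.5f, sample.g - 0.5f) * strength;

    public static Color Sample(Texture2D mainTex, Texture2D motionTex,
                               Vector2 uv, float time01,
                               int cols, int rows, float mvStrength)
    {
        int frameCount = cols * rows;
        float frame = Mathf.Repeat(time01, 1f) * frameCount;
        int frameA = Mathf.FloorToInt(frame) % frameCount;
        int frameB = (frameA + 1) % frameCount; // wraps for a looping sequence
        float blend = frame - Mathf.Floor(frame);

        Vector2 fa = FrameUV(uv, frameA, cols, rows);
        Vector2 fb = FrameUV(uv, frameB, cols, rows);
        Vector2 mvA = DecodeMotion(motionTex.GetPixelBilinear(fa.x, fa.y), mvStrength);
        Vector2 mvB = DecodeMotion(motionTex.GetPixelBilinear(fb.x, fb.y), mvStrength);

        // Warp frame A forward along its motion and frame B backward
        // along its own, then crossfade – this hides the frame stepping
        // and is what allows playback at a fraction of the authored rate.
        Vector2 uvA = FrameUV(uv - mvA * blend, frameA, cols, rows);
        Vector2 uvB = FrameUV(uv + mvB * (1f - blend), frameB, cols, rows);
        return Color.Lerp(mainTex.GetPixelBilinear(uvA.x, uvA.y),
                          mainTex.GetPixelBilinear(uvB.x, uvB.y), blend);
    }
}
```

The 2D gradient lookup from the Fallout 4 technique would then map the blended greyscale value, together with particle age, to a final color – which is what lets the BC5 two-channel greyscale main texture stay so cheap.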
Following the theme “Nuovo”, and made with Unreal Engine in February 2016. Mikkel and I created this game over three days, from Friday through Sunday. The most challenging part of this game jam was, in my opinion, the theme.
“Nuovo” games are defined as innovative, abstract and unconventional games that are short in duration.
It’s the end of the world! Make sure the vegetables, herbivores, carnivores and people are safely stored on the ark and be sure not to let any unwanted passengers on.
I did Game design, UI, Physics, Programming using Blueprints & VFX.
Mikkel did Game design, Visual design, Modelling, Sound, Rigging & Animation.
A few colleagues from Jagex and I wanted to get some experience with Unreal Engine, so we decided to create a multiplayer snake game. I was the only person on the team with prior Unreal Engine experience, which meant I got to do training and mentoring as well – which was fun!
From the conceptual stage, this took us about three weeks to make, and I did all the graphics.
I decided that I wanted to learn Unreal Engine 4 and Blueprint scripting. I personally learn best with a learning-by-doing approach, which I highly recommend.
Since I wanted to learn something new, I thought it best to start small. I ended up aiming for a simple implementation of a Tic-Tac-Toe game where I could play against an AI.
The AI is limited – it doesn’t know the concept of forking, or how to defend against a fork move – but it ends up doing quite well anyway.
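For context, a rule-based Tic-Tac-Toe AI of this kind typically follows a fixed priority list. The sketch below shows the idea in C# – the project itself was built in Blueprint, so this is an illustration rather than the actual implementation – and notably it has no fork rule, which is exactly the limitation mentioned above.

```csharp
using System.Linq;

// Rule-based move selection: win if possible, otherwise block,
// otherwise take center, a corner, then anything left.
// Board cells: 0 = empty, 1 = AI, 2 = human. Returns -1 on a full board.
public static class TicTacToeAi
{
    static readonly int[][] Lines =
    {
        new[]{0,1,2}, new[]{3,4,5}, new[]{6,7,8}, // rows
        new[]{0,3,6}, new[]{1,4,7}, new[]{2,5,8}, // columns
        new[]{0,4,8}, new[]{2,4,6},               // diagonals
    };

    public static int ChooseMove(int[] board)
    {
        int move = FindWinningMove(board, 1);            // 1. complete own line
        if (move < 0) move = FindWinningMove(board, 2);  // 2. block the human
        if (move < 0 && board[4] == 0) move = 4;         // 3. take the center
        if (move < 0)
            foreach (int i in new[] { 0, 2, 6, 8 })      // 4. take a corner
                if (board[i] == 0) { move = i; break; }
        if (move < 0)                                    // 5. anything left
            move = System.Array.FindIndex(board, c => c == 0);
        return move;
    }

    static int FindWinningMove(int[] board, int player)
    {
        foreach (var line in Lines)
        {
            int owned = line.Count(i => board[i] == player);
            int empty = line.Count(i => board[i] == 0);
            if (owned == 2 && empty == 1)
                return line.First(i => board[i] == 0);
        }
        return -1;
    }
}
```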
Disclaimer: the graphics were not part of the exercise!
This is the work I produced in my first Realtime VFX job, at Eurocom Developments Ltd. I used their proprietary engines EuroLand 2 and EuroLand 4.
During this time I got the chance to work on the following games:
Goldeneye 007
I worked on environment destruction, scripted events and cutscenes. This game was released on the Wii and was my first video game credit!
Disney Universe
I defined the cartoony look of the core effects in the game, such as fire, explosions and smoke. I also worked on destruction, scripted events and cutscenes. There were a lot of destructible objects in this game, which I manually shattered before baking the destruction simulations for use in the game.
Rio: The Game
As this was a party game, I worked on a wide variety of gameplay effects, such as interactive objects and pickups, to name a few. I also did a lot of work on the victory fanfare screens.
Goldeneye 007: Reloaded
This is the Xbox 360 and PS3 version of the Wii release, which got a facelift by swapping engines. We didn’t have a good converter for the Wii effects, so we ended up re-making all of them.