My name is Simon, and I am a first-year student in the two-year Industrial Technical Artist program at Yrgo in Gothenburg. As Technical Artists, we learn to solve problems at the intersection of visualization and programming, building broad knowledge of both industry-standard applications and newer platforms where the demand for expertise is growing.
Within this field, I am particularly interested in Virtual and Mixed Reality applications and creating different worlds for users to experience. Working with procedural tools to generate content and events within these worlds is also exciting to me.
My goal is to participate in the development of such applications alongside a talented team from whom I can learn and grow. I also hope to utilize my aesthetic interests and knowledge across various fields to add extra dimensions to the projects we develop.
Vision Impairment Simulator is a VR simulation of visual impairments created as a group project.
The project was carried out in collaboration with the Swedish Association of the Visually Impaired (SRF), based on the need for new tools for SRF's awareness training.
The purpose of these training sessions is to provide insight into what it is like to live with a visual impairment.
In the application, the user can choose the type and degree of visual impairment and view information about each condition.
The user can navigate in two different environments: a square and a pedestrian street with a crosswalk.
There is also the option to pick up an aid and to choose the time of day and lighting through a day-cycle function.
My work on the project consisted of environment design, implementation of crowds, and programming the day cycle.
I designed the square environment based on Gustav Adolf's Square in Göteborg using the City Sample package (with City Sample Buildings) and assets from Quixel. Furthermore, I implemented the NPC crowds we used in the environments, which were created using the City Sample Crowd package and the Mass Entity plug-in.
Finally, I did the Blueprint scripting and created the materials for the day-cycle function, which is driven by the Sun Position Calculator plug-in; the plug-in links the sun's position to the chosen time of day (a simplified sketch of this idea follows at the end of this section). The end result was very successful, and we received positive feedback from SRF along with suggestions for further development. They are eager to stay in touch with us about the planned continuation in the fall, when the project will be developed further. We have also been in contact with the city planning department of Göteborg, which, together with Virtual Gothenburg Lab, is developing a digital twin model of the city.
We plan on incorporating parts of this in a future version of the simulator.
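At its core, a day cycle reduces to mapping a clock time to a sun rotation. The sketch below is a simplified, plain C# illustration of that mapping only; the actual project used Unreal Blueprints and the Sun Position Calculator plug-in, which computes the real solar position from date, time, and location, and all names here are my own.

```csharp
// Simplified day-cycle sketch (illustrative only; the project itself was built with
// Unreal Blueprints and the Sun Position Calculator plug-in).
using System;

public static class DayCycleSketch
{
    // Maps a time of day in hours (0-24) to a rough solar elevation in degrees:
    // -90 at midnight, 0 at 06:00 and 18:00, +90 at noon.
    public static double SunElevationDegrees(double hours) =>
        -Math.Cos(2.0 * Math.PI * hours / 24.0) * 90.0;

    public static void Main()
    {
        foreach (var h in new[] { 0.0, 6.0, 12.0, 18.0 })
            Console.WriteLine($"{h,4:0.0} h -> elevation {SunElevationDegrees(h),6:0.0} deg");
    }
}
```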
The project was based on the school assignment "Unreal Engine Configurator," where the goal was to create a configurator in Unreal Engine using Blueprint programming. In the configurator, the user should be able to interact with a product, with a focus on modifying and replacing its parts.
As someone interested in sound and music, I chose to create a configurator where the user can change the basic shape and color of an Audio Visualizer, switch to another fixed camera angle, and enable a post-process volume effect with Chromatic Aberration. The icons for the user interface were designed in Illustrator.
The core of the programming is the ability to manipulate objects in arrays whose contents are created at runtime.
From three static meshes, three arrays of 48 objects each are created, and the user can switch between them in the UI. These arrays constitute the visual equalizer through which the sound is represented. With the help of the Synaesthesia plugin, a wave file containing the music to be visualized is analyzed, and the resulting data can then be applied to anything; in this case it is used to scale the objects in the arrays in time with the music. The reason for using exactly 48 objects is that Synaesthesia is set up to split the music into 48 frequency bands, from bass to treble.
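To make the structure concrete, here is a minimal plain C# sketch of the same idea; the project itself was built with Unreal Blueprints, so the class, the analysis input, and the scale mapping below are hypothetical stand-ins rather than the actual implementation.

```csharp
// Illustrative sketch of the visualizer structure: three variants of 48 bars,
// scaled each frame from 48 analyzed frequency-band energies (names are my own;
// the real project does this in Unreal Blueprints with the Synaesthesia plugin).
public class VisualizerSketch
{
    const int MeshVariants = 3;   // three static meshes to switch between in the UI
    const int BandCount = 48;     // the analysis is configured for 48 frequency bands

    // One scale value per bar; in the project each entry is a spawned mesh object.
    readonly float[][] barScales = new float[MeshVariants][];
    int activeVariant;

    public VisualizerSketch()
    {
        for (int v = 0; v < MeshVariants; v++)
            barScales[v] = new float[BandCount];
    }

    public void SelectVariant(int index) => activeVariant = index;   // called from the UI

    // Called every frame with the band energies (0..1) at the current playback time.
    public void Update(float[] bandEnergies)
    {
        for (int i = 0; i < BandCount; i++)
            barScales[activeVariant][i] = 0.1f + 2.0f * bandEnergies[i];  // hypothetical mapping
    }
}
```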
The project was based on the school assignment "Custom Shader Project," where the task was to create a sample scene in Unreal Engine with an implementation of custom-built shaders/materials and some type of custom-built Post Process Effect. The scene was then to be presented in a playable state, with the control method and character type left to one's own choice.
Since I am interested in minerals and rocks, I chose to create a shader that could reflect that interest.
When certain rocks are illuminated with UV light, they can exhibit visual properties different from those they show in normal light.
I thought it would be interesting to try to implement something like this in a shader and chose to present it in a first-person format. I obtained the 3D models of the crystals from TurboSquid and CGTrader.
My method was to blend two different materials, with conditions determining which one was visible based on where an invisible cone mesh intersected the object the shader was applied to (in this case, large purple crystals).
The goal was that this, together with the post process delimiting the image with a circular mask, would resemble a flashlight being turned on and illuminating the crystal to reveal a new material.
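The blend condition itself boils down to a point-in-cone test. The following plain C# sketch shows that test in isolation; the actual effect is built as an Unreal material, and the function and parameter names here are illustrative assumptions.

```csharp
// Sketch of the blend mask: 1 inside the flashlight cone (show the UV-reactive
// material), 0 outside it (show the ordinary material). A real material would
// return a soft 0..1 mask and lerp between the two materials.
using System;
using System.Numerics;

public static class UvFlashlightSketch
{
    public static float BlendMask(Vector3 surfacePoint, Vector3 coneApex,
                                  Vector3 coneDirection, float coneHalfAngleRadians)
    {
        Vector3 toPoint = Vector3.Normalize(surfacePoint - coneApex);
        float cosAngle = Vector3.Dot(toPoint, Vector3.Normalize(coneDirection));
        return cosAngle >= MathF.Cos(coneHalfAngleRadians) ? 1f : 0f;
    }
}
```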
This project combines the two school assignments Interactive Environment and Animation Project.
Since I have some experience with beekeeping, I wanted to create a beekeeping simulator that briefly covers some of the tasks in honey production. In the Interactive Environment assignment, the task was to create an environment in Unity that could be navigated and interacted with by controlling a character and using a UI.
In the Animation Project, I chose to focus on building one or more animations in Blender to use in projects in other subjects, which in my case was Interactive Environment.

The programming in Unity (C#) is based on a chain of bools triggered by entering and exiting collision boxes. Picking up objects is handled by attaching the object to a "grabbing point" that is a child of the player; the object is then made invisible and replaced by an animation until it is released and becomes visible again. The swarm of bees is created with a particle system whose size grows or shrinks depending on where you are in the game's task chain. The smoke from the bee smoker is also a particle system, and puffing the smoker activates a WindZone that affects the smoke system.
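A minimal Unity C# sketch of the pick-up and smoker behaviour is shown below; the component, field, and tag names are assumptions for illustration rather than the exact scripts from the project.

```csharp
// Simplified sketch of the pick-up and smoker logic (component, field and tag
// names are illustrative, not the project's actual scripts).
using UnityEngine;

public class GrabbableSketch : MonoBehaviour
{
    [SerializeField] Transform grabPoint;      // the "grabbing point", a child of the player
    [SerializeField] Renderer meshRenderer;    // the object's visible mesh
    [SerializeField] Animator handsAnimator;   // plays the first-person animation instead

    bool isHeld;                               // one link in the chain of bools

    void OnTriggerEnter(Collider other)
    {
        // Entering the pick-up collision box flips the bool and attaches the object.
        if (!isHeld && other.CompareTag("Player"))
        {
            isHeld = true;
            transform.SetParent(grabPoint, worldPositionStays: false);
            meshRenderer.enabled = false;               // hide the object...
            handsAnimator.SetBool("Holding", true);     // ...and show the animation
        }
    }

    public void Release()
    {
        isHeld = false;
        transform.SetParent(null);
        meshRenderer.enabled = true;                    // the object becomes visible again
        handsAnimator.SetBool("Holding", false);
    }
}

public class SmokerSketch : MonoBehaviour
{
    [SerializeField] WindZone windZone;        // lets the wind affect the smoke particle system

    public void SetPuffing(bool puffing) => windZone.gameObject.SetActive(puffing);
}
```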
Animation in Blender: I rigged and animated arms and various objects that the first-person character can interact with. The mesh for the arms comes from the MB-Lab software, which we also worked with in previous Blender courses.