Year 3_T2 - Major Project Dev-Log

\\ My Submission Content

The development posts for Spl/t, my major project.

\\ Post 1 Dev-Log - Differences From Proposal

Target hardware:

For this project I wanted to be able to compile and run my game for console, not only to get used to developing for console but to understand how to optimize for console and less powerful hardware. However, to gain the ability to compile Unreal Engine 5 projects for Xbox consoles, I needed to join the ID@Xbox program. To do this I had to provide a game development document that outlines all aspects of the game, as well as a compiled build. I cannot go into too much detail, as I signed an NDA during this process. I still plan to optimize the game for this minimum, following standard industry practices. The minimum hardware level I am aiming for:

PC - Minimum

Operating system: Windows 10 64-bit or Linux
Processor: Intel/AMD quad-core, 3.2 GHz
Memory: 8 GB RAM
Graphics card: GTX 1060 6 GB

Console - Minimum

Xbox Series S/X

PlayStation 5 (depending on whether development access for the platform becomes available)

The reasoning for leaving out the Xbox One and PS4 is that, by the time the game is developed further, these consoles will have a smaller player base, and supporting them would affect design decisions because their hardware does not handle the Unreal Engine 5 rendering pipeline very well. The same reasoning applies to why I am not developing for the Nintendo Switch.

HUD:

I am following a similar style for the HUD but not including the gauge UI element, as I will be using the environment to show the player's positioning. This is done by the in-world cameras displaying their point of view on a monitor, which helps players understand the split mechanic because they can see the split happen. Narratively the player will be expected to crouch, and feedback suggested this can be daunting, so I'm trying to give a different twist on the crouch mechanic.

Proposal: HUD concept
Final HUD

Software Changes:

To get used to working as if in a team, even though this is a solo effort, I am going to use GitHub. In Unreal Engine, source control adds icons in the Content Browser that tell me whether something has been uploaded: a red marker means it still needs to be committed and pushed, while no icon means it has already been uploaded.

During development, I changed digital audio workstation software from Audacity to Adobe Audition, figure 1 (Adobe Inc., 2023). This was due to a bug where Audacity would crash because of my peripheral software, Logitech G Hub, which was controlling my microphone. The issue did not occur in Adobe Audition.

Unreal Engine has also been updated and upgraded during development, from 5.0.3 to 5.1.0 and now to the current build, 5.1.1, figure 2 (Epic Games, 2023). This has mostly affected development positively, as the improvements were bug fixes and performance gains. The downside was having to check whether each update had broken anything I had done so far. The only issue that will occur in a later version of Unreal Engine is the change to the input system: a rewrite of the character's movement system will be needed if I want to move to a newer version when it releases.
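If I do make that move later, the rebind would look roughly like the sketch below. This is a hedged C++ equivalent of the Blueprint rework involved, not my current setup; the class and asset names (ASplitCharacter, DefaultMappingContext, MoveAction) are placeholders.

```cpp
// Sketch: binding movement with Enhanced Input (UE 5.1+). DefaultMappingContext
// and MoveAction are assumed UPROPERTY members pointing to an Input Mapping
// Context and an Input Action asset.
#include "EnhancedInputComponent.h"
#include "EnhancedInputSubsystems.h"
#include "InputActionValue.h"

void ASplitCharacter::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Register the mapping context with the local player subsystem.
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        if (UEnhancedInputLocalPlayerSubsystem* Subsystem =
                ULocalPlayer::GetSubsystem<UEnhancedInputLocalPlayerSubsystem>(PC->GetLocalPlayer()))
        {
            Subsystem->AddMappingContext(DefaultMappingContext, /*Priority=*/0);
        }
    }

    // Bind the move action instead of the legacy axis mappings.
    if (UEnhancedInputComponent* Input = Cast<UEnhancedInputComponent>(PlayerInputComponent))
    {
        Input->BindAction(MoveAction, ETriggerEvent::Triggered, this, &ASplitCharacter::Move);
    }
}

void ASplitCharacter::Move(const FInputActionValue& Value)
{
    const FVector2D Axis = Value.Get<FVector2D>();
    AddMovementInput(GetActorForwardVector(), Axis.Y);
    AddMovementInput(GetActorRightVector(), Axis.X);
}
```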

\\ Post 2 Dev-Log - Main Mechanic

Prototype 1:

To see whether the mechanic would work, I developed the first iteration of it. This version switched between two separate characters, teleporting the spawned-in blueprint in front of the player when you split again. After getting the player to split, I knew the mechanic would work. After this, I decided to rewrite it, as handling two player blueprints got confusing: I wasn't sure how to handle sharing variables between two different characters, or how to handle the teleport states between them both. To simplify the mechanic, I wanted to work off a single character blueprint.

Prototype 2:

For the next iteration of the prototype, the character is still two separate player blueprints, Bp_fullCharacter and Bp_HalfCharacter, which swap via a flip-flop when the custom input key is pressed. To stop the character from getting stuck in props or floating, I customized the collision so that you pass through. When you split, the character hides its bones above the pelvis; joining the character back together does not remove the player. To make it look more like my character proposal, the model needs to be changed from the default Unreal Engine model. I plan on rewriting this code to use one character and one controller instead of two to build the split mechanic.
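Hiding everything above the pelvis maps onto the skeletal mesh's Hide Bone By Name node. As a minimal C++ sketch of the same idea (the bone name spine_01 is an assumption based on the default mannequin skeleton):

```cpp
// Sketch: hiding the upper body on split and restoring it on rejoin.
// "spine_01" is assumed from the default mannequin; PBO_None leaves the
// physics bodies untouched.
#include "Components/SkeletalMeshComponent.h"

void ASplitCharacter::HideUpperBody(bool bHidden)
{
    USkeletalMeshComponent* MeshComp = GetMesh();
    if (bHidden)
    {
        // Hides this bone and everything parented under it (the upper body).
        MeshComp->HideBoneByName(TEXT("spine_01"), EPhysBodyOp::PBO_None);
    }
    else
    {
        MeshComp->UnHideBoneByName(TEXT("spine_01"));
    }
}
```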

Character Split Test Types 1+2

Split Test Type 1

For the first experiment with the character splitting, I took away the player's ability to move on the x and y axes, so the only movement control they had was on the z axis. In this prototype, I had rewritten the character to work off a single character blueprint, locking the camera's location on split and hiding everything above the character's pelvis, giving the illusion of a changing character. I had people test the mechanic and got feedback that it was confusing, as you couldn't tell which way it would go. I will iterate on this mechanic and test another method.

Split Test Type 2

Taking in the feedback I was given on the last split mechanic, I decided to give the player back more control, and I implemented the character model this time. I also made two control methods for this mechanic. The moment you split, you have control over the camera movement, which also drives the lower character's rotation. Then, when you left-click or press down the left thumbstick on the controller, the camera locks to allow you to move the lower body. The feedback this time was to make the split more visual to the player, so they understand that the split is happening.

Current Split Type:

To improve the split mechanic, I made the split apply force to the character, pushing the model forward and making it more apparent that the split has happened. I also made the camera lower itself during this phase. This split mechanic is based on one character and one character controller: it works by locking the camera's location when the split key is pressed, and the camera's rotation is then driven by converting your mouse's screen-space position. I will go into more depth on how this works in the programming development log. The feedback I got on this mechanic was to let the player experiment and try to understand it.
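As a rough C++ sketch of the "make the split visible" step described above: push the body forward and drop the camera. The impulse strength, camera offset and component names are illustrative assumptions, and in the game the camera lowers over time rather than snapping.

```cpp
// Sketch: forward burst plus camera drop on split. FollowCamera is an assumed
// UCameraComponent member; 600 and 60 are placeholder values.
void ASplitCharacter::OnSplit()
{
    // Apply a forward burst of velocity so the split reads clearly.
    const FVector Impulse = GetActorForwardVector() * 600.f;
    LaunchCharacter(Impulse, /*bXYOverride=*/true, /*bZOverride=*/false);

    // Lower the camera toward the detached upper body's height.
    const FVector CamOffset = FollowCamera->GetRelativeLocation();
    FollowCamera->SetRelativeLocation(CamOffset - FVector(0.f, 0.f, 60.f));
}
```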

\\ Post 3 Dev-Log - Programming

During development, I was finding it difficult to diagnose issues or to test whether I was doing the right thing; I was often using the wrong blueprint event or variable. Remembering the times I had used Unity C#, where you can print a string with the debug log, I started using Print String in Unreal Engine before and after calling a specific event whenever I was struggling to get the intended event to fire. It was especially useful when trying to understand the conversion of mouse location to world space.
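The Print String node has a direct C++ counterpart; here is a minimal sketch of the kind of before/after logging I was doing (ASplitCharacter, TrySplit and DoSplit are hypothetical names used only for illustration).

```cpp
// Sketch: on-screen and log output around an event call, mirroring the
// Print String nodes placed before and after a suspect Blueprint event.
#include "Kismet/KismetSystemLibrary.h"

void ASplitCharacter::TrySplit()
{
    UKismetSystemLibrary::PrintString(this, TEXT("TrySplit: before DoSplit"),
                                      /*bPrintToScreen=*/true, /*bPrintToLog=*/true);

    DoSplit();   // the event I want to confirm actually fires

    UKismetSystemLibrary::PrintString(this, TEXT("TrySplit: after DoSplit"),
                                      true, true);
}
```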

Construction Scripts:

During development, when creating scripts, I wondered what construction scripts were. To find out, I had a look at figure 1 (Epic Games), as the documentation covers what construction scripts are and what they do. They allow you to set specific variables on objects, like in the following example.

In this example, I have the sphere collision size set up in the construction script. I have a float with a value, made public and instance editable. This makes each placed instance of the blueprint editable, so I can customize the sphere collision depending on the world space it's in. I have implemented this technique on many of the assets that need certain elements exposed or their materials changed. It has also helped with optimization, as I do not need multiple different blueprints for different sounds or textures.
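In C++ the equivalent of the construction script is the OnConstruction override. A hedged sketch of the same pattern, with an instance-editable float driving the sphere radius (the class name AAnimCameraBase and property names are my own stand-ins, not the exact Blueprint names):

```cpp
// Sketch: C++ counterpart of the construction-script setup, exposing a
// Sphere Scale float per instance and applying it to the sphere collision.
#include "GameFramework/Actor.h"
#include "Components/SphereComponent.h"
#include "AnimCameraBase.generated.h"   // hypothetical generated header

UCLASS()
class AAnimCameraBase : public AActor
{
    GENERATED_BODY()

public:
    AAnimCameraBase()
    {
        TriggerSphere = CreateDefaultSubobject<USphereComponent>(TEXT("TriggerSphere"));
        SetRootComponent(TriggerSphere);
    }

    // Instance editable, like making the float public in Blueprint;
    // clamped 1-100 to mirror the Details-panel clamp described below.
    UPROPERTY(EditAnywhere, Category = "Trigger", meta = (ClampMin = "1.0", ClampMax = "100.0"))
    float SphereScale = 10.f;

    UPROPERTY(VisibleAnywhere)
    USphereComponent* TriggerSphere;

    // Runs whenever the instance is placed or edited, like the construction script.
    virtual void OnConstruction(const FTransform& Transform) override
    {
        Super::OnConstruction(Transform);
        TriggerSphere->SetSphereRadius(SphereScale);
    }
};
```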

BP_Anim_Camera:

The blueprint camera follows the player when they enter the sphere collision. The collision size is set by the Sphere Scale float, which is controlled from the Details panel in the World Outliner for each placed object. The value is clamped between 1 and 100; if it needs to be bigger, it has to be typed in manually.

The event graph for the blueprint grabs the camera mesh and gets its world location and rotation. It then gets the player controller and converts the mouse location to world space. The look-at rotation from the camera mesh is plugged into an RInterp node, which sets the world rotation of the scene component holding the camera mesh. This is delayed until the next tick, and a custom event called "Event Loop" triggers the recalculation of the set world rotation. On Event BeginPlay, the player is cast to and stored in an actor variable for the custom event loop. Whether the player is within the sphere collision drives the InRange? boolean: while the player is in range, the event loop runs every frame, and when they exit, the camera keeps its last state until they enter again.
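A hedged C++ sketch of that event loop, doing the deprojection, look-at and interpolation on tick (bInRange and CameraMesh are assumed members mirroring the Blueprint variables, and ticking would need to be enabled in the constructor):

```cpp
// Sketch of the BP_Anim_Camera "Event Loop": deproject the mouse to world space,
// find the look-at rotation from the camera mesh, and RInterp toward it each
// frame while the player is inside the trigger sphere.
#include "Kismet/KismetMathLibrary.h"

void AAnimCameraBase::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);
    if (!bInRange)
    {
        return; // outside the sphere: keep the last rotation until the player re-enters
    }

    APlayerController* PC = GetWorld()->GetFirstPlayerController();
    FVector MouseWorldLocation, MouseWorldDirection;
    if (PC && PC->DeprojectMousePositionToWorld(MouseWorldLocation, MouseWorldDirection))
    {
        // Target rotation: from the camera mesh toward the deprojected mouse position.
        const FRotator Target = UKismetMathLibrary::FindLookAtRotation(
            CameraMesh->GetComponentLocation(), MouseWorldLocation);

        // Ease toward it instead of snapping, like the RInterp To node.
        const FRotator Smoothed = FMath::RInterpTo(
            CameraMesh->GetComponentRotation(), Target, DeltaSeconds, /*InterpSpeed=*/4.f);

        CameraMesh->SetWorldRotation(Smoothed);
    }
}
```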

BP_Anim_Speaker:

This blueprint also uses the construction script to set a sound asset as well as the sphere scale. The sound activates and deactivates depending on whether the player is within the sphere. The attenuation settings for the sound can be customized too: the sound may need to cover a larger area, so I added floats for the inner radius and falloff distance of the audio. This lets me speed up development and customize the effect of the audio depending on the world space it's in.

To animate the speaker, it follows a similar system to the camera, with the same activation and event loop setup.
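A minimal C++ sketch of that activation pattern, toggling the audio component with the trigger sphere overlap. The class and member names (AAnimSpeaker, TriggerSphere, SpeakerAudio) are assumptions, the overlap handlers would need to be UFUNCTIONs, and the per-instance attenuation floats would be applied in the construction script as described above.

```cpp
// Sketch: BP_Anim_Speaker-style activation driven by the trigger sphere.
#include "Components/AudioComponent.h"
#include "Components/SphereComponent.h"

void AAnimSpeaker::BeginPlay()
{
    Super::BeginPlay();
    TriggerSphere->OnComponentBeginOverlap.AddDynamic(this, &AAnimSpeaker::OnEnter);
    TriggerSphere->OnComponentEndOverlap.AddDynamic(this, &AAnimSpeaker::OnExit);
}

void AAnimSpeaker::OnEnter(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                           UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                           bool bFromSweep, const FHitResult& SweepResult)
{
    SpeakerAudio->Activate(/*bReset=*/true);   // start the assigned sound
}

void AAnimSpeaker::OnExit(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                          UPrimitiveComponent* OtherComp, int32 OtherBodyIndex)
{
    SpeakerAudio->Deactivate();                // stop when the player leaves the sphere
}
```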

Bp_Full_Character:

The character is the most in-depth collection of blueprints I have, and it took up a large majority of this project, as I had to learn and research Unreal Engine's blueprint system by following the documentation in figure 1 (Epic Games). This helped me understand most features, as it provided context and examples of how each may be used. To further my understanding of the blueprint system, I watched a series of videos by Unreal Engine, figures 2 to 15 (Unreal Engine, 2015), covering many aspects of the system.

For the base of most of the mechanics, like picking up objects, I started with a blueprint interface. Its functions hold variables for picking up objects, interacting with an object, and dropping an item. For instance, the pick-up function has two input variables: the pawn, Bp_Full_Character, and the empty hand component on the character. To detect items, I use a ray cast to check whether the item you are aiming at can be picked up or interacted with. If it is interactable, the interface call fires it off like a BeginPlay node; if it is pickup-able, it follows a similar process but is moved linearly to the hand.
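As a hedged C++ sketch of that flow: a simple interface plus a forward line trace from the camera. The names here (IInteractable, AttachItemToHand, the 300-unit reach) are illustrative stand-ins for the Blueprint interface functions described above, not the exact implementation.

```cpp
// Sketch: interaction interface + camera-forward ray cast for pickup/interact.
#include "UObject/Interface.h"
#include "Engine/World.h"
#include "Interactable.generated.h"   // hypothetical generated header

UINTERFACE(MinimalAPI, Blueprintable)
class UInteractable : public UInterface { GENERATED_BODY() };

class IInteractable
{
    GENERATED_BODY()
public:
    virtual void Interact(APawn* InstigatorPawn) = 0;
    virtual bool CanBePickedUp() const { return false; }
};

void ASplitCharacter::TraceForItem()
{
    const FVector Start = FollowCamera->GetComponentLocation();
    const FVector End   = Start + FollowCamera->GetForwardVector() * 300.f; // assumed reach

    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(this);

    if (GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
    {
        if (IInteractable* Item = Cast<IInteractable>(Hit.GetActor()))
        {
            if (Item->CanBePickedUp())
            {
                AttachItemToHand(Hit.GetActor());   // hypothetical helper: move/lerp the item to the hand
            }
            else
            {
                Item->Interact(this);               // fire the interact path
            }
        }
    }
}
```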

\\ Post 4 Dev-Log - Visual Style + Sound

Textures:

Drawing:

To draw the textures, I used Photoshop, working at a base resolution of 4K and downscaling to 1080 × 1080 when exporting. I decided to hand-draw the textures to follow the style I set out in the project proposal; hand-drawn textures add to the overall aesthetic and make the game more unique. To draw a texture, I would use multiple layers at different opacities to blend specific layers together. In Photoshop, I had the grid on along with Pattern Preview, which turns the outer canvas into the repeated texture; it helps you get a feel for the texture and shows what it will look like once repeated.

Textures Created:

Taking what I had learned from creating textures for the block out, I wanted to create a series of textures in a style matching my project proposal: a hand-drawn, cel-shaded look. When creating the textures, I took into consideration how the post process would affect the normal map and how the colours might change slightly because of it. I looked around my house at how the plaster looks, and at university at how the ceiling tiles look. This made the process of creating textures more enjoyable and more satisfying, using my surrounding environment as a reference to bring into a 3D space.

The textures created were for my planned environments outlined in the project proposal: office, warehouse, and test-environment styled textures. I started by creating the ceiling texture, using my floor block out as a reference. I gave each of these textures a unique quirk; for example, the plastered walls have small rough speckles like they do in real life, giving the illusion of depth. During this process, I found out that Unreal Engine 5 no longer supports displacement maps. As I was nearly finished creating all these textures, I did not want to redo them for a mesh, because Unreal Engine has replaced this workflow with Nanite: they want displacement to be done on a 3D model rather than a flat BSP brush now.

Giving Doors Style:

To give the game a better visual identity, I added a delay to the doors. I ran a test and got feedback on the door animation at three different delay times: 0.24, 0.18 and 0.12 seconds. 0.24 and 0.18 both had positive feedback, in that the delay added to the game; the 0.12 delay, however, was being mistaken for lag. I decided to go with 0.24 on the doors, as I wanted the delay to read as a design choice rather than be mistaken for lag.
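In C++ that intentional delay could be a simple timer before the door animation plays. A minimal sketch, assuming a hypothetical ASlidingDoor class where OpenDoor stands in for the actual door timeline:

```cpp
// Sketch: the 0.24 s intentional delay before the door opens, driven by a timer.
void ASlidingDoor::RequestOpen()
{
    // Delay chosen from playtest feedback: 0.24 s reads as a design choice,
    // while 0.12 s was mistaken for lag.
    GetWorldTimerManager().SetTimer(OpenTimerHandle, this, &ASlidingDoor::OpenDoor,
                                    0.24f, /*bLoop=*/false);
}
```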

In-game look:

This is what the textures looked like when applied in game, with the correct scales and on the appropriate meshes.

To get these textures to work in Unreal, I also created normal maps and ambient occlusion maps for them. Some textures are the same pattern in different colours, so those maps did not need to be remade. Textures like the tiles I could scale down, duplicate, and blend four together to make a smaller version.

Some textures in Unreal needed blending together, so I used a TexCoord and a Lerp node to layer and blend pieces together. Once I was confident, I started making new textures from the ones I had drawn, using the Lerp to overlay extra detail onto other texture bases. This approach has helped me make more textures.

Cel Shading:

To help me understand the post-processing system and how to create a custom cel shader, I watched four different videos. I started with figure 5 (Visual Tech Art, 2022) and figure 3 (Matt Aspland, 2022) to get a basis for how I could visually change the experience. From these videos, I created the first cel shader:

To improve upon the first cel shader, I combined aspects of it with adding outlines to objects, making it more like my project proposal. I did this by implementing techniques shown in figure 2 (Evans Bohl, 2022) and figure 4 (Visual Tech Art, 2022), which went through how to manipulate the directional light and normalize the shadows into three flat shadow bands, depending on the texture they fall on.

To apply these effects, I added them both as materials on the post-processing volume, making sure the outline effect was first in the list so it processed first; if it wasn't at the top, it wouldn't outline objects correctly. This is how it affects the world space:

With PostProcessing/Cel Shader:

Without PostProcessing/Cel Shader:

Sound:

What I used:

As my game is to do with AI, I wanted the voice of the narrator who guides you to be an AI-generated voice. I used figure 1 (ElevenLabs, 2023), an AI voice generator service. I was going to use my own voice, but it didn't have the intended effect or feel that the AI one had. To get the AI to work, I just had to give it the text I wanted it to read. This is the script I have for the narrator character.

Script for narrator:

Room 1:

Test Number 314, 2.8. Welcome to this world, you will come across a variety of objects that you can interact with. I left a message for you on the monitor. If your circuitry is working. When you’re ready, go on into the next room. 

Room 2:

You are faced with two doors. One of these doors will open and lead you to the first test, while the other will not. You have two choices, and only one of them will keep you alive. Fail and you shall be rebooted! 

Room 3:

Congrats, you picked the right door. This isn’t the first time you have got this far. We installed a ramp just in case you forgot how to walk upstairs. 

Test room 0.1

Look, this may be a little hard for you to figure out, but your aim is to get into the room in front of you. 

Test room 0.2

Built with scrap, but can you handle an extra variable, conflicting with your resistors.  

Test room 0.3 p1

Time to test your manoeuvrability. You have not managed to get past this much. Crouch under the barrier and walk past the sensor to open the doors.

Test room 0.3 p2

Well, that is a first! That's not how you crouch, but hey, you made it this far! I'll make some adjustments to other Test Rooms.

 

AI Voice Reading Out Script:

⚠️ Sadly, I was not able to recover all the original voice clips.
[Audio clip: 4 Visual Style Sound, 0:06]

\\ Post 5 Dev-Log - Level Design

Blockout:

Custom Textures:

To help me understand how to create custom textures and Unreal Engine's sense of scale, I worked out the scale from the default player character, which is around 1.8 m tall. From this, I could use the character to scale the textures correctly; I knew they were scaled right when they lined up with the default project textures. I did not want textures that were stretched out and distracted the player from the experience. This will also help me scale the assets I create for the world.

I would need these textures during the block out stage. To create them, I used Photoshop at a 1080 × 1080 texture resolution, with an orange colour for the background so that, if I missed any walls when exporting and modelling after the block out process, I would easily notice. The block out textures I created were six wall and three floor textures. For the walls, I created 1 m, 2 m, 3 m, 3 m with character height reference, 3 m with door height reference, and 4 m versions. For the floors, I created 1 m, 2 m and 3 m textures with a grid layout.

Puzzle designs:

As the main mechanic is splitting, I came up with some concept test rooms, which I designed in Photoshop. These tests are like the one in my project proposal, taking inspiration from the pacing of other puzzle games. I wanted to design puzzles for this concept that allow the player to get used to and understand the mechanic. It was easier for me to block out the test environments first than to do concept drawings. I did write down ideas for test rooms on my phone, and these are:

The player controls both halves of the character separately, with the top half able to jump higher and the bottom half able to move faster. The player must navigate through platforming sections by utilizing the unique abilities of each half, such as having the top half jump over obstacles while the bottom half crawls under them.

The player controls both halves of the character separately, with the top half moving in one direction and the bottom half moving in the opposite direction.

The player controls the bottom half of the character, which is able to move around, while the top half is stationary.

The player controls the bottom half of the character, and must use its movement to balance the top half on certain objects or platforms in the environment.

Blockout 1:

For this block out, I wanted the player to start constrained, without the ability to move, until the surrounding walls lower and they gain the ability to move. The starting room was going to be a freezer with two nitrogen tanks, constantly filled with smoke. The player would then go from a small space into a bigger one, then a slightly larger one again. These rooms were going to introduce the narrative using the world space.

After the player gets into the main test warehouse room, they are introduced to two doors: a tutorial room, or going straight into test room 1. I changed this design due to feedback: because my mechanic is different, it would be better to make all players learn it without it feeling like they are doing a tutorial. At first, I was not too sure about taking power away from the player, but getting feedback from multiple people at this point was crucial in improving how the game teaches the player naturally.

Blockout 2:

In this version of the block out, I wanted much more open spaces, starting the player within the warehouse environment and letting them see where they need to go next through windows. I designed these spaces to have the tutorial elements embedded within the world. For instance, in the first room you are introduced to the interact mechanic and the narrator who will lead you through the tests; the narrator mocks the player and tries to entice them to do the puzzles correctly. In the first room, the player can interact with any object within the room and pick it up, and while the narrator is speaking, the player cannot leave the room.

The second room teaches the player about doors: doors with handles open, and those without do not. It also reinforces the narrator's role in the story, guiding the player by giving them hints. I tried to design these spaces in a grounded way, like the pipes following the industry-standard colours for what they contain; why change standards when they have meaning? Red is fire-quenching fluids, blue is compressed air, and green is boiler feed.

The first test room gets the player used to the test rooms and what is being asked of them; this is followed by another test, but with another box. Then, to introduce them to the mechanic, they are asked to split. However, when they enter the room, the door closes on them.

Teaching the Split Mechanic:

The design for this room was very difficult, as it has to teach a player who may be used to normal crouching that the crouch they know is now completely different: the player's lower body detaches and the upper half is lowered onto the floor. The feedback I kept getting on this room was to do with onboarding the player. I improved it by adding the crouch button prompt the player needs to press, increasing the overall visibility of the room by removing the chain-link fences, extending the room to allow for more movement, and showing the player split in-world. This made a positive difference to the feedback.

\\ Post 6 Dev-Log - Character + Models

Character:

Design:

The design of the character was influenced by my character concept from my proposal. The character is an AI, built from spare components chosen for their ease of availability in the world: the head is a CRT, the torso a PC, the arms are power extenders, and the legs are built out of old office chairs. The design matches the main mechanic, the split ability, as the top and bottom halves of the character can split.

To create the character, I used Blender, using my character drawing as a reference plane and modelling the character in a T-pose. I exported the Unreal Engine 5 model as a size reference, which helped me maintain the character's scale within Unreal and keep the proportions right, as I wanted the legs, arms and head to be a similar size.

Blender Model:

To rig the character quickly, I used figure 1 (Adobe Inc., 2023), Mixamo. I did this to save time, and the rig could still be retargeted to the Unreal Engine 5 animations. Even though this process sped up creating the bone layout, the weight painting was incorrect, so I modified and improved it in Blender.

Models:

As an exercise to improve my modelling and texturing skills, I wanted to create something highly detailed. However, I didn't include it in the game, as it wouldn't match the other models in style.

\\ Post 7 Dev-Log - UI Design

UI Concept 1:

For this main menu, I had a working options menu and a working single-player button. I wanted the main menu to be accessible and easy to read. However, to implement controller support, I would need to recreate the menu, because the menu's hierarchy was not set up correctly.

Current Main Menu

For this main menu, I made it stylized, taking inspiration from the Windows desktop environment. To make the main menu more interesting for the player, I implemented an interactive feature: the name of the game, Spl/t, follows the mouse cursor. At the bottom of the screen are the options you can press. I made this menu more controller friendly, as you can navigate the four options using a controller, keyboard, or mouse.

The main menu is also a developer room. To work around one of Unreal Engine's biggest problems, shader compilation stutter, I put all the game's assets inside the main menu level. This should help reduce the game's stutter.

Interactive Loading Screen:

I decided to make the loading screen interactive within the game. As there aren't any large textures or many assets, you barely get to see the loading screen, but I wanted to give the player something to do during this downtime.

Embodiment:

To make the player more aware of the fact that they are not human and are an AI robot, I decided to add a UI element that is a child of BP_FullCharacter's camera. It is a 3D model that interacts with the lighting; it doesn't show up in reflections and is animated by the motion of the camera. Furthermore, it is not attached using the widget interface. However, an issue will occur if the resolution is not 16:9: if it's 4:3, for example, the faceplate would need to change its scale on the x and y axes.
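A hedged sketch of how that rescaling could be handled in C++ (FaceplateMesh is an assumed component member, and the single-axis correction is a simplification of the x/y adjustment mentioned above):

```cpp
// Sketch: rescale the faceplate mesh when the viewport is not 16:9 so the
// 3D UI element still frames the screen correctly.
#include "Engine/Engine.h"

void ASplitCharacter::UpdateFaceplateScale()
{
    if (!GEngine || !GEngine->GameViewport) return;

    FVector2D ViewportSize;
    GEngine->GameViewport->GetViewportSize(ViewportSize);
    if (ViewportSize.Y <= 0.f) return;

    const float CurrentAspect = ViewportSize.X / ViewportSize.Y;
    const float BaseAspect    = 16.f / 9.f;                 // ratio the faceplate was authored for
    const float Ratio         = CurrentAspect / BaseAspect; // e.g. 4:3 -> ~0.75

    // Squash/stretch only along the screen-facing axis; depth stays as authored.
    FaceplateMesh->SetRelativeScale3D(FVector(1.f, Ratio, 1.f));
}
```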
