You can also find this text in the development documentation
2009463 Xiong Xin
March 8
As I write this document, it has been about a year since I decided to move to this major. Back in BA Transport Design, I had a mood board in my portfolio for a VR concept: you drive in a simulator device that takes you to a virtual world where you can drive with your friends anywhere, to any place of interest in the world.
https://xin718342876.medium.com/week-3-real-and-virtual-worlds-cb88d7de5163
At the very start of this project, I knew literally nothing about what I was doing and had no solid content. I thought that what I was trying to build was a 100% digital copy of what the real Japan looks like, and I soon realised that was unrealistic: if I really went down that route, there were far too many assets, sounds and 3D models I would need. I started modelling at a very early stage and did not realise I had gone wrong.
However, in general I still had a goal. When Engelwood created a fantastic music video for Crystal Dolphin, I was so drawn to it that I decided the mood of my game should be at least as casual as that Feb 21 music video, with a similar art style.
March 15
204 Ford Zodiac MkIV + Ginetta Camper Top
https://xin718342876.medium.com/real-and-virtual-worlds-week-4-432857146a91
I threw myself into modelling work from then on, trusting that I had not gone too far wrong, even though I did not really know why I was modelling, and I was not thinking it through. I was modelling because I was designing a game about riding a vehicle, but I did not consider whether the player really needs to see the exterior for that long, and that for most of the journey what the player actually sees is the interior.
March 21
I was still stuck in the asset-modelling stage, and it was not until the end of this month that I realised I should not put so much effort into certain things the player may never actually see.
This week was a week of struggling with Unity work and asset creation, and to tell the truth, the process was a little painful…
I tried very hard to create a shader in Blender for my Mount Fuji, but it ended up looking really awful in Unity by default. I knew that if I was aiming at a TA I should do better than that, so it became a week of YouTube tutorials and endless trial and failure… It was a hard week.
https://xin718342876.medium.com/real-and-virtual-worlds-week-5-ecf0ae9f3aa5
It was after March that I actually started making this game and turned my attention to what really matters. I spent about a week developing the first scene and used none of the assets I had prepared before, which means I essentially wasted the first month.
April 10
Counting the time left, at this point I only had a rough scene from Blender, modelled with support from the GIS add-on, and here I realised my problem. Making the road at the same scale as the real world might be correct for a game with that capacity, but my time was limited and I did not have enough of it to consider every frame seen from the window. Besides, it really does not matter if the player does not notice what lies outside the view.
So I just kept the scene for now and switched to the interaction and driving parts, as Real and Virtual Worlds was moving on to immersion and interaction.
April 21
I started working on the driving system in April. This was my first attempt, and I had not realised how hard it could be to develop a driving system in VR.
I went through five versions of the PlayMaker blueprint while solving this problem.
I might be pretty good at making easy things difficult. The most direct way is to drive with the thumbstick: very traditional, but useful at any time. However, it does break the immersion, and it would be ideal if a player could drive as in real life.
My first thought for driving was to search the Unity Asset Store, but it turned out there was no specific asset or tutorial aimed at driving in VR. I did not have the ability to modify existing code to fit my purpose, though I did try writing some code following a YouTube tutorial.
Part of the code I wrote
A single script took me almost a week to understand, and in the end I found out that it did not apply to my project, because it only supports a steering wheel that rotates in world space.
I gave up on "driving" temporarily at the end of April and focused on finishing the rest of the experience first.
May 1
Chris said something critical to me: my project is about being in a car, and what is left of an in-car experience if we cannot even drive it?
https://xin718342876.medium.com/real-and-virtual-worlds-week-7-49e1d0fcf047
Blog link for the interaction and immersion plan
I had just finished the immersion plan and the interaction plan, and went back to driving again, as I did not want to end up making a game you cannot actually "play".
I summed up the reasons I failed the first time and decided to make everything work with PlayMaker alone. That was something I already knew and found easy to work with, whereas before I had been trying to code from baby steps.
This time I tried comparing the height of both controllers every frame to decide whether the player is turning left or right. This approach is very immersive, as it feels like steering a real wheel.
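The actual version was built out of PlayMaker states, so the C# below is only my own sketch of the height-comparison idea; the class name, the 0.3 m full-lock threshold and the sign convention are all assumptions, not the project's real values:

```csharp
using System;
using System.Diagnostics;

static class WheelSteering
{
    // Assumed threshold: a 0.3 m height difference counts as full steering lock.
    const float MaxTilt = 0.3f;

    // Compare the two controller heights sampled this frame and return a
    // steering value in [-1, 1]. Positive means a right turn, on the idea
    // that the left hand rises above the right when turning a real wheel right.
    public static float FromControllerHeights(float leftHandY, float rightHandY)
    {
        float tilt = (leftHandY - rightHandY) / MaxTilt;
        return Math.Clamp(tilt, -1f, 1f);
    }
}
```

Running this every frame and feeding the result into the vehicle's turning value reproduces the behaviour described above: equal hand heights mean driving straight, and raising one hand steers towards the other side.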
It worked, though one tester said: it feels weird.
Yes, it did feel weird, but not because of this part. It was because I did not use a Rigidbody in Unity; instead I set the transform position directly. The movement through the scene was not natural when setting positions that way, and there were also world/local space transform problems.
So I investigated the Rigidbody system, and I did develop a set of PlayMaker blueprints with a Rigidbody vehicle, but there were a few problems.
May 14
When we pick something up, we may need to use the Oculus primary button, and this interaction has a problem with a Rigidbody vehicle. Both the vehicle and the steering wheel need a Rigidbody component, and the steering wheel is inside the vehicle.
When we try to interact with the steering wheel, the vehicle may move like crazy, because you are effectively trying to drag the entire vehicle as well. I also tried changing the collider size, but that made the vehicle's collision behave strangely: you could slide through a gap while driving across the road.
I started to wonder whether the classic thumbstick works better than all the solutions I had found. It is easier to build and to evaluate.
I used PlayMaker to develop the system I now drive with. It has functions to read the Oculus sensors, including thumbstick angles, and with a multiply action I can transform and store the values as needed. I used the left thumbstick axis for turning and the right one for the throttle. It was as easy as expected. After reusing each state I could quickly build a useful blueprint, and with some adjustment of the gravity value the vehicle was good to go.
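The mapping itself is just the multiply-and-store step described above, applied once per frame. Since the real version lives in PlayMaker states, the C# below is only a sketch of the same arithmetic, and the two rate constants are made-up tuning values rather than the project's actual numbers:

```csharp
using System.Diagnostics;

static class ThumbstickDrive
{
    // Assumed tuning constants, not the project's actual values.
    const float TurnDegPerSec = 60f; // yaw rate at full left-stick deflection
    const float MaxAccel = 5f;       // m/s^2 at full right-stick deflection

    // Left thumbstick X axis (-1..1) -> change in heading this frame, in degrees.
    public static float TurnDelta(float leftStickX, float deltaTime)
        => leftStickX * TurnDegPerSec * deltaTime;

    // Right thumbstick Y axis (-1..1) -> change in speed this frame, in m/s.
    public static float SpeedDelta(float rightStickY, float deltaTime)
        => rightStickY * MaxAccel * deltaTime;
}
```

Accumulating these deltas into the vehicle's heading and speed each frame is all the thumbstick scheme needs, which is why it was so much quicker to build than the steering-wheel approaches.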
I invited my roommate to test the project to compare the thumbstick solution with the height solution, and he said the thumbstick was just easy and comfortable, though not immersive enough: it is much easier, and the player can pay more attention to the scenery outside.
That is basically how I finished designing the driving part, and by the end of May the scene was good to drive around.
May 21
The most difficult part was solved. I then reworked the scene to make it nice and clean, and tried to adjust the art style to look more like a town.
At this stage I really needed to build the level soon, as I had not finished building it in May.
Current stage: you can summon your guide by pressing the index button down, and it also creates the dialogue menu, the narrative flow, and most of the interaction solutions.
https://xin718342876.medium.com/real-and-virtual-worlds-week-8-c34f6d5f7154
Blog link for the narrative and the NPC robot.
May 29
I started finishing up and putting everything together. At the same time I almost gave up on the idea of using 100 per cent my own assets in this game, as it seemed a little naive, or simply too difficult.
The game was planned to have four scenes, each linear with no branches. It was easy to bring everything together with PlayMaker functions.
At the same time, I started to look into sound solutions. I am frankly new to sound design; although there are many videos you can look into for sound solutions, I just did not have much time for it. I grabbed my sound assets mainly from Mixkit, which was really helpful, including the sound of wind, birdsong, and some Japanese classical music clips.
I intended not to use too much sound, because I simply cannot handle it all: too much sound could confuse the listener, not to mention that you still have to listen to your robot guide to push the level forward.
June 10
I should put an end to the development process now, as I need to catch every possible mistake and put it right.
The very last problem I met was whether to rework my dialogue system. I had not added any sound before June 10, so all the dialogue was silent: the little robot only pops up text to accompany you. I sometimes felt that was not right, and it could be a big problem. But fixing it would also take some time, and it was already June 10, and I needed to work on Maddalena's project as well.
I decided that after June 14, the deadline for Maddalena's project, I would look into this problem and decide whether to rework all the dialogue I have.
June 16
Everything is coming to an end now, and I have a critical reflection to write as well, so I squeezed my schedule really tight to rework the dialogue. Perhaps this counts as another part of the sound design: I used text-to-speech from wideo.co, which from my own perspective was a really good idea.
I did not need to find someone to voice my robot, which could have been the most unreliable part, and the process was really fast: I just had to upload every single line of the script and download the voiced version. It turns out that having someone who can actually speak gives so much more sense of presence, far more immersive than the rough solution I had before.
That is how Mr. Guidance finally got his voice, as a way of making up for the fact that I did 0% of the multiplayer part, which I never even had a plan for.