Week 8/9 - Rapid ideation session 2 - part 2
- Anouk Dutrée
- 31 Jul 2021
- 9 min read
Updated: 4 Aug 2021
And that’s another two-week rapid ideation sprint finished, phew! A lot has happened in this sprint and it is time to wrap it up and identify the learnings. All in all, I am very happy with how this sprint went. Even though the final prototype is not exactly a finished game, nor is it very sturdy, I have learned a ton in the process of creating it. I made multiple mistakes, but that is actually great, as I can learn from them. When everything goes perfectly, you didn’t set the bar high enough, right? With that said, let’s dive in!
The resulting prototype
I’ll start by showing you the fun part: the resulting prototype and some screenshots. You can find a playable web version of the prototype here. A word of warning though: the game is not optimised for being played in the browser, so you will notice it is quite laggy and the camera movement might be a bit jumpy. With some patience and some tender love and care it is okay though. I will get to why this is the case later. For now you can either enjoy the laggy web version or watch the screen capture below, showing what the game would actually look like if you ran it locally.
Video 1: A screen recording of the resulting prototype, recorded in the Unity editor.
Figure 1: Two screenshots from the game where you can see birds interacting with the player (white).
Scope
In the earlier blog post I had set out the scope of the prototype plus the general goal of working with sound to enhance my Unity skill set. The scope I had set out was as follows:
· A single level with a mountain/grassland type terrain in which the player can freely roam about.
· Interactable objects in the level for the player to interact with.
· A GUI overlay that displays your "knowledge" points or interaction points of some sort.
· Sound needs to be incorporated in the final prototype
The final prototype pretty much meets the scope I had initially set out, so scoping appears to have gone better than last time now that I am getting a better feel for Unity. In the end the GUI does not show knowledge points; I opted for a simple metric of “number of birds interacted with” instead. The reason for this was that not much knowledge was explicitly shared with the player, so this made more sense. If I had had more time, I would have added explanations about the bird species when the player manages to attract one. This could add to the user experience, as right now there isn’t much feedback from the game when you actually attract a bird, other than it landing on your head.
I absolutely loved the addition of sound to the prototype. From the moment I had the birds flying around and singing occasionally, my personal experience of play testing and debugging became quite enjoyable. My boyfriend already mentioned that he will miss hearing the continuous bird sounds now that I am wrapping up the prototype. I think it was a good decision to add the bird sounds to the game, for more reasons than just personal pleasure. Nature sounds, including bird sounds, have several health benefits. Listening to bird sounds can relieve stress and reduce annoyance (Andringa and Lanser, 2013; Buxton et al., 2021). They not only relieve negatives like stress and annoyance, but also contribute to general health and well-being (Erfanian et al., 2019). Of course listening to actual living birds and walking around in a forest yourself is different from the digital experience, but research suggests that the benefits remain in a virtual setting as well (Depledge, Stone and Bird, 2011)! Sound has incredible power, and I will not forget that in future projects.
Reflection and learnings
The process
This sprint was a bit of a rollercoaster ride. I got off to a really good start. Setting up the general terrain went super fast after I had found some good open source assets to use, like this conifer package and this grass package. I barely had to tweak these assets to make them work, which sped up the process a lot. I already went into detail about the terrain sculpting in the previous post, so I will not go into that again here.
After the terrain I moved on to setting up a player controller and creating a mechanism for interaction. The player controller I largely borrowed from the previous rapid ideation prototype. In hindsight I am not so sure that was the right way to go. It helped me move on to more interesting parts for my personal development, so it was good not to spend too much time on it. But in the end I am not happy with how the movement works in a web build, and it might have been better to use a first-person view. Next time I should spend a bit more time, before digging into development, mapping out what the user controls should look like. The mechanism for interaction took quite some time to set up, but I expected that, so it aligned with the time I had allotted for it. The nice thing is that the mechanism I have built for this prototype is easily extendable and adaptable, so the time investment of setting it up properly should pay off in the future.
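To give an idea of what I mean by an extendable interaction mechanism, here is a rough sketch in Unity C#. The names `IInteractable` and `InteractionDetector` are illustrative placeholders, not my exact scripts:

```csharp
using UnityEngine;

// Sketch: any object the player can interact with implements this interface.
public interface IInteractable
{
    void Interact(GameObject player);
}

// Attached to the player: finds nearby interactables and
// triggers them when the interact key is pressed.
public class InteractionDetector : MonoBehaviour
{
    [SerializeField] private float interactRange = 2f;

    private void Update()
    {
        if (!Input.GetKeyDown(KeyCode.E)) return;

        // Look for an interactable component on any collider within range.
        foreach (Collider hit in Physics.OverlapSphere(transform.position, interactRange))
        {
            IInteractable interactable = hit.GetComponent<IInteractable>();
            if (interactable != null)
            {
                interactable.Interact(gameObject);
                break; // interact with the first object found only
            }
        }
    }
}
```

The nice part of a setup like this is that new behaviours (birds, food, signs) plug in by implementing the interface, without ever touching the detector script.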
So far so good, but here comes the tricky part! It was time to add the birds and allow for interaction with them. I had found this amazing free asset pack by Dinopunch games in the asset store. It was a bit old, so it was not fully functional for my Unity version, but it was straightforward to patch up. The bird models were perfect and they came fully animated and with sound: exactly what I needed. I am so happy I decided to look around in the asset store for animal models before jumping into Blender to make my own. But (there is always a but, isn’t there?) I was too quick in assuming the scripts worked after patching them up. I had birds flying around freely in the scene, so everything looked good. When I started implementing the actual interaction with the birds (using food to attract specific birds), I found out certain core bird flight functions weren’t working at all. I made the mistake of building my core interaction logic on the assumption that these functions worked. I only found out quite late that they didn’t, and basically had to rewrite the bird mechanics. It hindered the development, but it was a good lesson to learn: always thoroughly check code from open source assets. In addition, it emphasized something I had noticed earlier as well: I need to work on developing in small chunks that can be tested intermittently. I often get carried away and want everything at the same time, graphics and all. I should develop in a way that allows me to easily test bits and pieces of the logic before implementing the rest. This way it is much easier to develop stable and properly functioning features.
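To illustrate what developing in small testable chunks could look like: with the Unity Test Framework, a piece of flight logic factored out as a plain method can be verified without any scene at all. The names here are hypothetical, just to show the shape of the idea:

```csharp
using NUnit.Framework;
using UnityEngine;

// Hypothetical pure helper: picks a landing point at a given
// height above the player, independent of any scene objects.
public static class BirdFlightLogic
{
    public static Vector3 LandingPoint(Vector3 playerPosition, float hoverHeight)
    {
        return playerPosition + Vector3.up * hoverHeight;
    }
}

// Edit-mode test: runs without entering play mode or loading a scene,
// so the logic can be checked before any graphics are hooked up.
public class BirdFlightLogicTests
{
    [Test]
    public void LandingPoint_IsDirectlyAbovePlayer()
    {
        Vector3 player = new Vector3(3f, 0f, 5f);
        Vector3 point = BirdFlightLogic.LandingPoint(player, 1.5f);
        Assert.AreEqual(new Vector3(3f, 1.5f, 5f), point);
    }
}
```

Had the borrowed flight functions been covered by even a tiny test like this, the broken core functions would have surfaced on day one instead of mid-sprint.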
Along the way I also implemented a basic overlay GUI to help the player get started and to provide some visual cues that something is actually happening. I was a bit afraid of the UI parts of Unity at first, but now that I have actually sunk my teeth into it, it’s not that bad at all. I had a tough time with the anchor points and scaling settings. I still don’t completely understand how to work with the canvas so that the overlay properly scales with screen size. There is a setting for it, but enabling it made a huge mess, so I will leave that for another day to learn about. I had the chance to work with buttons, dialogues, icons and UI scripting, so I covered a lot of ground and built a solid base to improve my development practice on.
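For reference, the setting in question lives on the Canvas Scaler component. It is normally configured in the inspector, but it can also be set from a script, roughly like this (the reference resolution is just an example value):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Configures the Canvas Scaler so the UI scales with screen size
// instead of staying at a constant pixel size.
[RequireComponent(typeof(CanvasScaler))]
public class CanvasSetup : MonoBehaviour
{
    private void Awake()
    {
        CanvasScaler scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1920f, 1080f); // design resolution
        // 0 = match width, 1 = match height, 0.5 = blend of both
        scaler.matchWidthOrHeight = 0.5f;
    }
}
```

The catch, which bit me here, is that anchors and offsets are interpreted relative to this mode, so switching it after the layout is built shuffles everything around.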
In the last part of this sprint, when I had to make the build, I struggled the most. I expected this to be as easy as in the last session, but boy was I wrong! For some reason the resolution I had set for building the game (Full HD) did not result in a build that looked the same. Instead, with that setting, the build came out far too large, with a UI bar that was barely visible and in the wrong place. The player controls also didn’t work at all in the browser. After hours of fiddling with Unity, and with some suggestions from one of my indie game dev peers, I managed to find out that the browser just couldn’t handle the game (thanks Rob for the help!). I reduced the quality and tweaked some of the camera settings, which improved the build, but it was still nowhere near how it looked in my editor. In the future I should make intermediate builds throughout the development process, to check whether what I’m seeing in the editor translates well to the build. This way I could have spotted some of the UI canvas scaling issues earlier as well, and fixed them without problems. Luckily this RI session is more about learning along the way than about the actual outcome, so for this prototype it’s fine. But if this had been a real-life project, finding all these issues only shortly before the deadline could have been really detrimental!
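For future reference, a rough sketch of the kind of tweaks I mean. The exact values are hypothetical, not the ones I ended up shipping:

```csharp
using UnityEngine;

// Sketch: lower the rendering load when running as a web build.
public class WebBuildTuning : MonoBehaviour
{
    private void Awake()
    {
#if UNITY_WEBGL
        // Switch to a lower predefined quality level (index into
        // the project's Quality Settings list).
        QualitySettings.SetQualityLevel(1, applyExpensiveChanges: true);

        // Draw less of the scene: a shorter far clip plane means
        // fewer objects rendered per frame.
        Camera.main.farClipPlane = 300f;
#endif
    }
}
```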
Lastly, I also learned that user testing is really important and good to include in the general development process. When I was done and let my boyfriend play, I noticed he played the game very differently from what I expected. Even though my game could handle it, the logic was not ideal for this type of use. If I continue this project in the future, I can take my observations of his play into account to create a more pleasant user experience. It also gave me ideas for missing functionality to add. In future projects I will make sure to include user testing once a playable MVP has been reached.
Learning approach
The approach I took to learning throughout this RI session proved to fit me well. I watched tons of tutorials to get a general feel for how things can be done and took snippets to form a solution that would fit my project. This way I learned new things and saw different ways of achieving the same goal before deciding on an approach. I am also happy that I decided to explore open source assets instead of making my own. As I am at the beginning of my master's, it is good to cover all the bases. I know my way around modeling, and for this module I wanted to focus on Unity. I will need Unity to bring my models to life, and the more I work with it, the more I get a feel for how to do that.
Time management
I did not use a Kanban board this time and just went with the flow a bit more. I still had a defined scope, but I didn't set out small steps to get there, other than writing down specific features I would need in my notebook. It worked for now, because I am generally good with time management, but I did notice it was harder to keep an overview. I was tempted to go off on tangents to develop small bits that would be interesting, rather than working towards the big picture. Because I noticed this, I could correct my path, but it did cost extra energy. For an individual project I think it is fine to lose the Kanban board, but next time I will structure my work a bit more with clear to-dos.
Key take-aways
I have already mentioned the realisations I had throughout development, but it's always good to sum them up for clarity. The key take-aways from this RI session for me are:
· Make builds early on in the process
· Decide what type of GUI and movement controls fit the game idea best before implementation
· Include user testing where possible
· When starting on the GUI, set the canvas to scale with screen size right away; doing it later causes trouble
· Sound is a great addition and should be considered in its own right
· Develop in small testable chunks, as opposed to an entire feature in one go
· Thoroughly check open source code before using it in your core logic!
All in all I learned a lot throughout this RI session and I feel much more comfortable in Unity. I can't wait to put my newly acquired knowledge to the test in another project!
List of References
Andringa, T. C. and Lanser, J. J. L. (2013) “How Pleasant Sounds Promote and Annoying Sounds Impede Health: A Cognitive Approach,” International Journal of Environmental Research and Public Health, 10(4), p. 1439. doi: 10.3390/ijerph10041439.
Buxton, R. T. et al. (2021) “A synthesis of health benefits of natural sounds and their distribution in national parks,” Proceedings of the National Academy of Sciences of the United States of America, 118(14). doi: 10.1073/pnas.2013097118.
Depledge, M. H., Stone, R. J. and Bird, W. J. (2011) “Can Natural and Virtual Environments Be Used To Promote Improved Human Health and Wellbeing?,” Environmental Science and Technology, 45(11), pp. 4660–4665. doi: 10.1021/es103907m.
Erfanian, M. et al. (2019) “The Psychophysiological Implications of Soundscape: A Systematic Review of Empirical Literature and a Research Agenda,” International Journal of Environmental Research and Public Health, 16(19), p. 3533. doi: 10.3390/ijerph16193533.