# Assignment 3 - Starting on your app (graded)
**Date**: 17/02/2023
**Group**: 17
**Group members participating**: Anna Górska, Răzvan Mazilu, Jędrzej Głowaczewski
**Activity duration**: ~10h/person
## Goal
Creating a mobile application that:
- lets the user know if placing an object is possible in a given place
- places the object in a possible place
- allows for selecting objects
- allows for removing objects via button
We want the application to be consistent with the Kawaii Slimes models we chose in the previous assignment.
The purpose of the application is to learn more about the possibilities of AR on mobile devices.
## Plan
We plan on dividing the tasks from the assignment equally per each group member. We will work in parallel, each person independently. When one person finishes, they help the others.
Exercises 1 and 2.5 were mostly done in the previous assignment and shouldn't require much work to finish.
We divided the tasks into:
- Task 1 and 2.3 - Jędrzej Głowaczewski
- Tasks 2.4 and 2.5 - Răzvan Mazilu
- Tasks 3.1 and 3.2 - Anna Górska
## Results
### Exercise 1
We copied the scene from the previous assignment and edited it slightly so that it fulfills the requirements. We made the planes invisible by removing their Mesh Renderer components. Hence,
- the plane tracking is active but hidden,
- the planes' colliders are still present in the scene, which allows us to use gravity for other objects,
- we can tap on the screen to instantiate a new slime,
- and when a specific QR code is detected, a tree is created and moved accordingly when the tracked image moves.

### Exercise 2.1
For the indication marker we used a `.png` image of flames in a circle, rendered on a quad in the scene. The marker was scaled and rotated using the "parent trick" known from the previous assignments. Its position is determined by a raycast originating from the center of the screen. We used Unity's default raycast function with a layer mask so that it hits only planes, which we placed on a dedicated layer; this way the raycast always lands on the floor. If the raycast hits a plane, the marker is instantiated at (or moved to) the hit position. If it hits nothing, the marker is removed from the scene.
The marker was implemented in a new script, `TapIndicator`. This script stores the marker object and defines a public getter so that other scripts can access the marker's position. `RealObjectAdder` uses `TapIndicator` to read the marker's position and instantiate a new object there when a tap is detected.
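The marker logic described above can be sketched roughly as follows. This is a minimal illustration, not the project's exact code: field names such as `markerPrefab` and `planeLayerMask` are our assumptions.

```csharp
using UnityEngine;

// Sketch of the TapIndicator idea, assuming a layer mask that contains
// only the (hidden) AR planes. Field names are illustrative.
public class TapIndicator : MonoBehaviour
{
    [SerializeField] private GameObject markerPrefab;
    [SerializeField] private LayerMask planeLayerMask;

    private GameObject marker;

    // Public getter so other scripts (e.g. RealObjectAdder) can read
    // the marker and its position.
    public GameObject Marker => marker;

    void Update()
    {
        // Raycast from the center of the screen, hitting only planes.
        Ray ray = Camera.main.ScreenPointToRay(
            new Vector3(Screen.width / 2f, Screen.height / 2f));

        if (Physics.Raycast(ray, out RaycastHit hit, 100f, planeLayerMask))
        {
            // Instantiate the marker on the first hit, then keep moving it.
            if (marker == null)
                marker = Instantiate(markerPrefab, hit.point, Quaternion.identity);
            else
                marker.transform.position = hit.point;
        }
        else if (marker != null)
        {
            // No plane under the crosshair: remove the marker.
            Destroy(marker);
            marker = null;
        }
    }
}
```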

To make the application a bit cooler, we decided to animate the marker: it now rotates while idle and shrinks when a new slime is added.

### Exercise 2.2
To create a UI button we used a button resource with TextMeshPro in the canvas. We created two buttons: one for adding a slime and one for removing a selected slime. For this exercise, however, we implemented only the add button's functionality, leaving the delete button for Exercise 3 of this assignment. We made the buttons round, used a color similar to the slimes', and added the corresponding symbols.
To make the buttons look this way, we used:
1. Round shape -> the Image component, with Knob selected as the Source Image.
2. Green color -> the Button component, with the Normal and Selected colors set to #656F5E.
3. '+' and 'x' symbols -> the button's TextMeshPro component, into which we typed the fitting characters.

For the implementation, we added a listener on the button that detects when the button is pressed; the slime is then instantiated at the place specified by the marker. We used the following code:
```csharp
void Start()
{
    // Hook up the click handler when the scene starts.
    Button btn = addButton.GetComponent<Button>();
    btn.onClick.AddListener(OnClick);
}

void OnClick()
{
    // Only spawn if the marker is currently placed on a plane.
    if (tapIndicator.Marker)
    {
        tapIndicator.Marker.GetComponent<Animator>().Play("Click");
        Instantiate(raycast, tapIndicator.Marker.transform.position, Quaternion.identity);
    }
}
```
### Exercise 2.3
1. We used prefabs from the Kawaii Slimes package for adding new objects when the button is clicked. For image tracking, we used a tree prefab from the Low Poly Tree Pack. The slime package includes basic animations, so we decided to use an idle animation for the slimes.
2. We had some issues with the AR Foundation raycast function: it sometimes seemed imprecise, as if it were detecting more surfaces than just the planes visible on the screen. Hence, we experimented with the regular Unity raycast, using a layer mask to detect only planes, which we moved to a dedicated layer. After these changes the precision problems were fixed. On reflection, we think the AR raycast function may always detect both horizontal and vertical planes, regardless of what we set in the AR Plane Manager component.
3. The objects we used are visible in the gifs included above. To use the prefabs from the Unity Asset Store we had to modify them slightly, e.g. by resizing them and adding colliders and rigidbodies.
### Exercise 3
#### Exercise 3.1
1. We implemented a "change of state" for game objects on touch: touching an object once flips it to the opposite state, for example by changing its color.

2. We combined the previously described functionality with the mechanics of adding Kawaii Slimes objects: clicking an existing object now renders a semi-transparent sphere around it.
The sphere's properties are computed from the bounds of the game object. The following code snippet shows how the scale of the sphere is calculated, based on the size of the selected object's renderer:
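A minimal sketch of this calculation, assuming the sphere is scaled to the largest extent of the renderer's bounds (names such as `selectedObject` and `selectionSphere` are illustrative, not the project's exact fields):

```csharp
// Scale a selection sphere to wrap the selected object, based on its
// renderer's world-space bounds. Variable names are illustrative.
Renderer rend = selectedObject.GetComponent<Renderer>();
Bounds bounds = rend.bounds;

// Use the largest dimension so the sphere encloses the whole object.
float diameter = Mathf.Max(bounds.size.x, bounds.size.y, bounds.size.z);
selectionSphere.transform.localScale = Vector3.one * diameter;

// Center the sphere on the renderer's bounds rather than the pivot.
selectionSphere.transform.position = bounds.center;
```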

The result of the task:

#### Exercise 3.2
A delete button was added to the UI. The button is visible only if at least one of the selectable game objects is selected; once clicked, it deletes all selected objects.
A script attached to the button checks whether any object with the tag "Selected" exists. If none does, the button is inactive.

Otherwise, if an object is selected, the button is set to active, so it becomes visible and clickable.

An `OnClick()` method specifies the button's behavior once it is clicked: it iterates over the objects with the "Selected" tag and deletes them.
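Put together, the delete-button logic described above might look like this sketch, assuming selected slimes carry the "Selected" tag (the class and field names are our illustrative assumptions):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the delete button: shown only while something tagged
// "Selected" exists; clicking it destroys all such objects.
// Names are illustrative, not the project's exact code.
public class DeleteButton : MonoBehaviour
{
    [SerializeField] private Button deleteButton;

    void Start()
    {
        deleteButton.onClick.AddListener(OnClick);
    }

    void Update()
    {
        // Show the button only when at least one object is selected.
        bool anySelected = GameObject.FindGameObjectsWithTag("Selected").Length > 0;
        deleteButton.gameObject.SetActive(anySelected);
    }

    void OnClick()
    {
        // Delete every currently selected object.
        foreach (GameObject obj in GameObject.FindGameObjectsWithTag("Selected"))
            Destroy(obj);
    }
}
```

Note that this script must live on an always-active object (not the button itself), so that `Update()` keeps running while the button is hidden.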

### Final result

## Conclusion
In this assignment, we successfully extended our previously developed mobile application with custom game objects and user interactions. We divided the tasks roughly equally among the group members and completed them in approximately 10 hours each; those who finished their parts faster helped the others. We encountered some challenges with the AR Foundation raycast function, but were able to fix the issue by switching to the regular Unity raycast. Overall, we achieved the project goal and produced a final result that fulfills all the requirements. Additionally, we implemented animations for the indication marker and used the package's animations for the slimes as well.
## References
- https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.1/manual/index.html
- https://assetstore.unity.com/packages/3d/characters/creatures/kawaii-slimes-221172
- https://assetstore.unity.com/packages/3d/vegetation/trees/low-poly-tree-pack-57866