
This document is downloaded from DR-NTU (https://dr.ntu.edu.sg)
Nanyang Technological University, Singapore.

Augmented food appreciation

Yang, Xunsheng

2023

Yang, X. (2023). Augmented food appreciation. Final Year Project (FYP), Nanyang
Technological University, Singapore. https://hdl.handle.net/10356/167132

Downloaded on 11 Mar 2024 09:24:59 SGT


ACAD. YEAR 2022/2023
PROJECT NO. P-C022

AUGMENTED FOOD APPRECIATION

YANG XUNSHENG

SCHOOL OF MECHANICAL AND AEROSPACE ENGINEERING


NANYANG TECHNOLOGICAL UNIVERSITY

YEAR 2022/2023
AUGMENTED FOOD APPRECIATION

SUBMITTED BY

YANG XUNSHENG

SCHOOL OF MECHANICAL AND AEROSPACE ENGINEERING

A final year project report
presented to
Nanyang Technological University, Singapore
in partial fulfilment of the
requirements for the
Degree of Bachelor of Engineering (Mechanical Engineering)
Nanyang Technological University

YEAR 2022 / 2023

Table of Contents

Abstract
Acknowledgement
List of Figures
List of Tables
Chapter One
1 Introduction
1.1 Background
1.2 Objectives
1.3 Scope
Chapter Two
2 Literature Review
2.1 Mixed Reality
2.2 Virtual Reality
2.3 Augmented Reality
2.4 User Experience & User Interface
Chapter Three
3 Methodology
Chapter Four
4 Hardware, Software & Programming Language
4.1 Hardware
4.1.1 Photography Devices
4.1.2 Portable Lighting
4.2 Software & Techniques
4.2.1 AliceVision
4.2.2 Autodesk Meshmixer
4.2.3 Unity3D
4.2.4 Vuforia
4.2.5 ARCore
4.2.6 RunwayML
4.2.7 Figma
4.2.8 MindAR
4.3 Programming Language
4.3.1 C Sharp (C#)
4.3.2 JavaScript (JS)
4.3.3 HTML
4.3.4 CSS
Chapter Five
5 Design and Development
5.1 Prototype One
5.2 Prototype Two
Chapter Six
6 Testing and Implementation
Chapter Seven
7 Conclusion, Limitation and Recommendation
References
Appendix

Abstract

This paper explores the use of Augmented Reality (AR) technology to create a digital
food menu for eateries on handheld devices. The aim is to provide an immersive,
playful, and interactive dining experience that helps individuals make more informed
and confident food choices. The literature review covers mixed reality, virtual reality,
augmented reality, user experience and user interface design. The methodology
involves several stages, including planning, research, design, development, testing,
and implementation, and the necessary hardware, software, and programming
languages are discussed. Two prototypes are presented, one using Unity and the other
using JavaScript. The testing phase showed that both prototypes had functional and
interactive interfaces with potential to enhance the dining experience.

Acknowledgement

The author would like to express his sincere gratitude to Dr. Cai Yiyu for providing
the opportunity to work on this project and for his invaluable guidance and support
throughout the entire process. The author would also like to extend his gratitude to his
team members, Cai JingHong, Ding YiJie, and Clarissa Bella Jew, for their valuable
insights, assistance, and knowledge sharing, which were instrumental in the on-going
development of this project.

The author would also like to thank the staff and management of the various eateries
such as Un Yang Kor Dai who participated in this project, for their cooperation and
support, without which this project would not have been possible.

Lastly, the author would like to express his appreciation to all the individuals who
have contributed to this project, directly or indirectly, and who have provided
inspiration and motivation for the author to continue pursuing excellence in the field
of AR technology.

List of Figures

Figure 1 Lao Beijing’s Traditional Menu
Figure 2 Sushiro’s Personalised Menu from Hand-held Device
Figure 3 Beauty in The Pot’s Personalised Menu from Hand-held Device
Figure 4 Tsukimi Hamburg’s Customised Menu from scanning QR code
Figure 5 Virtuality Continuum Spectrum
Figure 6 Bloodstain Pattern Analysis (BPA) training program
Figure 7 VR scene of Ocean
Figure 8 Diners experiencing VR Dining
Figure 9 Diners viewing AR Menu
Figure 10 3D Food Model of Banoffee
Figure 11 Realme 5 Quad-Camera
Figure 12 AliceVision Meshroom Logo
Figure 13 Photogrammetry of Ramen produced by AliceVision Meshroom
Figure 14 Autodesk Meshmixer
Figure 15 Unity Logo
Figure 16 Vuforia Logo
Figure 17 ARCore Logo
Figure 18 RunwayML Logo
Figure 19 Example of before and after green screen remover
Figure 20 Figma Logo
Figure 21 An overview of framework used to design prototype two
Figure 22 MindAR Logo
Figure 23 Development Flowchart for Prototype One
Figure 24 Example of food generating in AliceVision Meshroom
Figure 25 Unedited 3D model of Pork Rib Soup
Figure 26 Editing 3D model of Pork Rib Soup with Autodesk Meshmixer
Figure 27 Final edited 3D model of Pork Rib Soup
Figure 28 App Building in Unity
Figure 29 Development Flowchart for Prototype Two
Figure 30 Targeted Image of Leng Zaab
Figure 31 Single Frame from Augmented Video
Figure 32 Sample Text-Based Menu
Figure 33 3D Fried Chicken Model
Figure 34 3D Ramen Model
Figure 35 How the app functions on Image-Based Menu

List of Tables

Table 1 Timeline of the Five Industrial Revolutions (5IR)

Chapter One

1 Introduction
1.1 Background
Technological advancements have greatly facilitated human life for generations, and
without realising it, we have now entered the fifth industrial revolution –
Personalisation [1]. Personalisation is where creativity and innovation meet the fourth
industrial revolution – Digitalisation. Digitalisation involves robotics, artificial
intelligence (AI), and the Internet of Things (IoT), with this revolution reliant on the
first, second, and third industrial revolutions [1]. This can be seen in Table 1 below.

Table 1 Timeline of the Five Industrial Revolutions (5IR)


Note. From https://www.regenesys.net/reginsights/the-fifth-industrial-revolution-5ir/

IoT (Internet of Things) links the real and virtual worlds. Information is derived from
data gathered by IoT-enabled devices such as smartphones. Using technologies such
as augmented reality (AR) and virtual reality (VR), this information can be made
visible in real time.

Imagine hungry individuals at an eatery, reading the text-based menu shown in
Figure 1 below, with or without a few signature dish images on it, while trying to
decipher the options and decide what to eat. Oftentimes, individuals become
indecisive because of choice overload, a phenomenon that occurs when the brain is
presented with an overwhelming range of options and hence struggles to decide [2].

Fig. 1 Lao Beijing’s Traditional Menu


Note. From https://www.laobeijing.com.sg/

To reduce choice overload, pairing the text with visuals will assist individuals in
making wiser ordering decisions and give a clearer view of what they will be
ordering. As the saying goes, "A picture is worth a thousand words," so by looking at
appealing dish images, individuals will be better able to choose the food they prefer.
This is because the brain processes words as tiny images before combining them to
understand the information [3].

Despite technological advances, many eateries still use text-based menus. Only a
small fraction of eateries have incorporated mankind's favourite technology, the
handheld device, even though individuals are most familiar with it. These eateries
either provide their own handheld-device menu or ask individuals to use their
smartphone to scan a Quick Response (QR) code, which then leads to the eatery's
customised digital food menu. These digital menus usually include food images, as
shown in the examples below.

Fig. 2 Sushiro’s Personalised Menu from Hand-held Device


Note. From https://sethlui.com/sushiro-conveyor-belt-singapore/

Fig. 3 Beauty in The Pot’s Personalised Menu from Hand-held Device


Note. From http://slowchomp.com/

Fig. 4 Tsukimi Hamburg’s Customised Menu from scanning QR code
Note. From https://njoy.com.sg/brands/tsukimihamburg/

Visual aids may indeed make information easier to grasp than text alone, but they are
not always accurate illustrations and can therefore mislead individuals. Eateries will
always showcase their most attractive food item on the menu to entice individuals to
order it. When the food that individuals order does not match how it was depicted in
the menu, it can be disappointing or irritating for them.

To avoid individuals feeling misled, the visual and written representation of a dish on
the menu should portray exactly how the dish looks and is described. This is where
visual aids and text-based menus need to be upgraded. By introducing an Augmented
Reality (AR) component, 2-Dimensional (2D) food images and text-based menus can
be transformed into 3-Dimensional (3D) food models. This allows individuals to see
the dish they will be having at an exact, detailed 1:1 scale and strengthens their
decision-making process.

1.2 Objectives
This paper’s sole focus will be on designing and developing a digital food menu on
handheld devices for eateries through Augmented Reality (AR) technology, with the
goal of creating an immersive, playful, and interactive overall dining experience.

1.3 Scope
The scope of this paper will explore the technologies and applications of Mixed
Reality (MR), Virtual Reality (VR), and Augmented Reality (AR). In addition, short
research into User Experience (UX) and User Interface (UI) in the context of
designing a specific app will also be conducted.

Chapter Two

2 Literature Review
A literature review is a critical evaluation of the body of work that has already been
published on a certain subject or question. The literature on Mixed Reality (MR),
Virtual Reality (VR), Augmented Reality (AR), User Experience (UX), and User
Interface (UI) will be discussed in this section to better understand the technology.

2.1 Mixed Reality (MR)


Mixed Reality (MR) is a type of interactive experience that combines features of
Virtual Reality (VR) and Augmented Reality (AR). MR is a "continuum that spans the
real environment and a virtual environment," according to Milgram and Kishino
(1994), as shown in Figure 5 below [4]. Through real-time interaction with both
virtual and physical items, MR gives users a completely immersive experience.

Fig. 5 Virtuality Continuum Spectrum [4]


Note. From P. Milgram and F. Kishino, "A taxonomy of mixed reality visual displays"

MR has found practical applications across various industries, such as entertainment,
healthcare, military, education, and even food [5][6][7]. A recent example is the
development of a Bloodstain Pattern Analysis (BPA) training program by the Home
Team Science and Technology Agency (HTX) in 2022 [8]. This program utilises MR
technology by projecting holographic images onto the user's field of view through a
headset, creating a realistic virtual crime scene with various bloodstains for trainees to
interact with and analyse [8]. The BPA program is depicted in Figure 6, which uses
augmented reality features in a virtual environment setting. The use of MR allows for
a more interactive and realistic training setting for trainees, thus improving learning
outcomes.

Fig. 6 Bloodstain Pattern Analysis (BPA) training program [8]


Note. From https://www.htx.gov.sg/news/

The augmented reality feature of MR can be introduced to digital food menus,
providing individuals with a distinctive and immersive ordering experience. The
application of MR in the food industry has enormous potential because it allows
individuals to preview their meals in a realistic and interactive manner, resulting in
more informed decisions.

2.2 Virtual Reality (VR)
The term "Virtual Reality" (VR) refers to a computer-generated simulation that
mimics the actual world or an imaginary environment [9][10]. The user can interact
with this environment in a fully immersive manner by using VR headsets and
controllers. VR technology has been explored in many fields, such as gaming,
medicine, engineering, and education. The application of VR technology in the food
industry has also gained popularity over the years.

Through the creation of engaging and immersive settings, VR technology can
improve the overall dining experience. For instance, VR headsets can be used to
replicate a particular dining place or enhance the atmosphere virtually [11]. The
creation and preparation of cuisine can also be explored by individuals through
immersive experiences made possible by VR technology [12]. Participants in a study
by Lee et al. (2019) who took a virtual reality food tour expressed greater levels of
satisfaction with their dining experience [13].

Sublimotion, a two-Michelin-starred restaurant in Ibiza, Spain, is renowned for its
use of VR technology to provide diners with an immersive, luxurious, and
multi-sensory eating experience. While diners enjoy their courses throughout the
exquisite meal, the restaurant transports them to various places using VR. For
instance, while tasting seafood dishes, diners experience a virtual trip through the
ocean's depths, as shown in Figures 7 and 8. Furthermore, Sublimotion enhances the
eating experience with music, illumination, and other sensory elements.

Fig. 7 VR scene of Ocean Fig. 8 Diners experiencing VR Dining


Fig. 7 & 8 Note. From https://www.youtube.com/@OurTaste

Sublimotion's use of VR technology is just one illustration of how the food industry's
adoption of technology is improving the restaurant experience. VR technology holds
a lot of promise for enhancing the dining experience in the food industry. Diners can
experience the food production and preparation processes thanks to VR technology's
ability to create immersive and interactive settings. Diners can also examine 3D
models of cuisine and customise their orders by using VR menus, which can give
them a more immersive and participatory experience.

2.3 Augmented Reality (AR)


Augmented Reality (AR) is a technology that overlays digital content onto the
real-world environment, and it is garnering popularity in a wide range of industries,
including the food industry. AR technology has been used to produce digital food
menus that are dynamic and interesting. Individuals can personalise their orders,
examine 3D models of the food items, and even see how their food is made using
these menus.

The use of AR technology in food menus has been demonstrated to increase consumer
involvement and satisfaction levels, according to research by Kim et al. [14]. The
research discovered that consumers of AR menus expressed greater satisfaction with
the ordering process and were more likely to visit the restaurant again. The research
also found that using AR menus enhanced how favourably people viewed the food
items, indicating that the technology may improve dining experiences in general.

Kabaq is a firm that has developed an AR solution for the food industry, enabling
consumers to examine virtual 3D models of menu items on their handheld devices as
shown in Figure 9 and 10 [15]. The use of technology can completely change how
diners engage with menus and make food ordering more enjoyable and individualised.

Fig. 9 Diners viewing AR Menu [15] Fig. 10 3D Food Model of Banoffee [15]
Fig. 9 & 10 Note. From https://kabaq.io/

Individuals were more likely to purchase menu items when they could see a 3D
representation of the meal, according to a Kabaq study [15]. The study also found
that individuals who used the AR menu spent more time browsing the menu and were
more apt to post about their experience on social media.

The use of AR technology in the food industry has a variety of possible advantages,
such as enhancing individuals' satisfaction and involvement and boosting revenue for
the restaurant [16]. With the ability to view accurate 3D models of menu items and
customise their orders, AR menus can give individuals a more interactive and
immersive experience.

The application of AR technology in the food industry does, however, come with
some difficulties. Since the technology is rather new, more resources may be needed
for support [17]. Additionally, not all restaurants or menus may be suited to AR
menus, and some patrons may not feel comfortable using the technology.

Despite these difficulties, using AR technology holds a lot of promise for enhancing
the individual experience and increasing revenue in the food industry [18]. As the
technology advances and becomes more widely available, more restaurants may start
using AR menus to improve their offerings and maintain competitiveness in a rapidly
changing market.

Overall, MR, VR, and AR technologies have the potential to change the food industry
by refining the diner experience and allowing consumers to connect with food in more
interactive and immersive ways. More research is needed to fully explore and exploit
the potential of these technologies in the food industry, including tackling issues like
cost, accessibility, and privacy. As these technologies advance and become more
widely available, it will be exciting to see how restaurants and food businesses
integrate them into their operations to improve the dining experience and customer
satisfaction. It is clear that MR, VR, and AR have the potential to transform the food
industry while providing intriguing future possibilities.

2.4 User Experience (UX) & User Interface (UI)


User Experience (UX) and User Interface (UI) design often go hand in hand in this
digital age to provide a positive overall experience.

UX is the term used to describe how an individual experiences and interacts with a
system, product, or service [19]. In the context of MR, VR, and AR technologies, UX
refers to how consumers interact with the virtual or augmented environment. The
design of the UX is a crucial factor in the overall performance of the technology.

The UI of a technology refers to its visible and functional design, which includes
menus, buttons, and other components that enable individuals to interact with the
system [20]. The user interface design for MR, VR, and AR technologies must ensure
that the user can readily explore and interact with the virtual or augmented
environment.

The UX and UI of MR, VR, and AR technologies have been extensively researched in
academic literature and commercial studies [21][22][23]. Design principles for
successful UX and UI have been defined as simplicity, consistency, usability, and
feedback [24].

According to research, the UX and UI of MR, VR, and AR technologies have a
substantial influence on user satisfaction, adoption, and learning results [25][26]. A
well-designed UX and UI can strengthen the user's ability to comprehend and
navigate the virtual or augmented environment, resulting in a more pleasurable and
engaging experience. On the other hand, if the UX and UI are poorly designed, users
may become frustrated, lose interest, and ultimately be less likely to use the
technology.

To conclude, effective UX and UI design is vital for the adoption of MR, VR, and AR
technologies in the food industry. To ensure that users can effectively navigate and
interact with the virtual or augmented environment, the UX and UI must be intuitive,
user-friendly, and engaging.

Chapter Three

3 Methodology
Using AR technology, the procedure for designing and developing a digital food
menu for handheld devices can be broken down into several phases. Planning,
research, design, development, testing, and implementation are the phases that
comprise this process.

Planning:
At this stage, the project's goals, objectives, and scope are outlined, and the resources
needed are determined, along with the timelines and budget. Everyone related to the
project should be involved in the planning stage to ensure that all parties are on the
same page about the project's objectives.

Research:
During this phase, information on the technologies and applications of MR, VR, and
AR is collected. In addition, UX and UI research in the context of designing a specific
app is performed. It is also important to study existing digital food menus on handheld
devices, including their features, customer evaluations, and feedback. This phase will
likely influence the design, development, and testing processes.

Design:
During the design phase, wireframes and prototypes of the digital food menu's UX
and UI are constructed. The research done in the earlier phase will likely influence
the design. The style should be aesthetically pleasing, simple to use, and
straightforward. The designer should consider the restaurant's brand identity and
ensure that the design is coherent with it.

Development:
The backend and frontend of the digital food menu should be created during the
development stage. This phase involves coding and integrating AR technology into
the app. The development team should ensure that the app is responsive, works on
various operating systems, and provides a consistent user experience.

Testing:
During the testing phase, any bugs or errors in the app are identified and rectified.
The testing process should be comprehensive and include testing on a variety of
devices, platforms, and environments. To ensure that the app is bug-free and delivers
a smooth user experience, both the developers and external testers should test it.

Implementation:
The digital food menu should be published and made accessible to individuals during
the implementation stage. To guarantee that the app reaches its target audience, it
should be advertised and promoted. The restaurant employees should be educated on
how to use the app and be able to assist diners who have difficulties using it. To
ensure that the app stays relevant, it should be constantly monitored and updated.

Overall, the design and development of a digital food menu on handheld devices
using AR technology requires meticulous planning, research, design, development,
testing, and implementation. The final product should be an app which provides an
immersive, playful, and interactive dining experience.

Chapter Four

4 Hardware, Software and Programming Language


Before developing the AR app for the digital food menu, it is necessary to consider
the hardware, software and techniques, and programming language that will be used.
This section will cover the tools used in prototypes one and two.

4.1 Hardware
A photography device and portable lighting are among the hardware requirements for
creating the AR app. Food images can be captured using photography devices such as
DSLRs, mirrorless cameras, or even smartphones with high-quality cameras. Portable
lighting can be used to ensure that the images captured are well-lit.

4.1.1 Photography Devices


A photography device refers to any equipment that can capture still images or video
footage. Digital cameras, smartphones, tablets, and other handheld devices with
camera functions are such examples. The quality of the photography device used can
affect the image and video quality captured, so it is critical to select a device that
fulfils the project's requirements. In the context of creating a digital food menu using
AR technology, photography devices will be used to capture images of food items
and other elements that will be integrated into the AR app.

Due to budget constraints, the author was limited to using his own smartphone for
food photography. The Realme 5 smartphone was used to shoot the food, and the
photos were then transferred to a computer for processing. The image quality is
sufficient to produce decent results.

Fig. 11 Realme 5 Quad-Camera
Note. From https://c.realme.com/in/post-details/1164427197176348672

The Realme 5 smartphone has a quad-camera system on the back, which includes
[27]:
- A 12-megapixel main camera with an f/1.8 aperture
- An 8-megapixel ultrawide camera with a 119-degree field of view and an f/2.25
aperture
- A 2-megapixel macro camera with an f/2.4 aperture and a 4 cm minimum
focusing distance
- A 2-megapixel depth camera with an f/2.4 aperture

4.1.2 Portable Lighting


Portable lighting refers to any lighting apparatus that is readily transportable and
usable in various locations [28]. This form of lighting is important for capturing
high-quality images and video clips in low-light circumstances or when natural
lighting is inadequate. LED lights, softbox lights, ring lights, and other readily
mounted and adjustable forms of lighting are all examples of portable lighting tools.
Portable lighting may be used, if necessary, to ensure that food items are properly lit
and that the images taken are of high quality.

4.2 Software & Techniques
The following software tools and techniques will be used in the development of the
AR app for both prototypes:

4.2.1 AliceVision

Fig. 12 AliceVision Meshroom Logo [29]


Note. From https://alicevision.org/#

AliceVision is a photogrammetric computer vision framework that uses 2D images to
reconstruct 3D models [29][30]. It is a free and open-source software library that
includes methods for photo modelling, camera tracking, and 3D reconstruction.
AliceVision has a wide range of uses, including the generation of 3D models of items,
structures, and landscapes, as well as VR and AR. The software accepts a variety of
input file formats, including JPEG, PNG, and RAW, and can run on Windows, Linux,
and macOS. An NVIDIA CUDA-enabled GPU is recommended to fully utilise the
software.

Fig. 13 Photogrammetry of Ramen produced by AliceVision Meshroom

Photogrammetry, as shown in Figure 13 above, is the technique used in the software
for measuring and transforming two-dimensional (2D) images into three-dimensional
(3D) models. It involves taking multiple photos of an object from various angles and
using specialised software to create a 3D model. It has been used in a variety of
sectors, including design, technology, and mapping. It can also be used to create 3D
models of food items that can be integrated into the AR food app in prototype one.

4.2.2 Autodesk Meshmixer

Fig. 14 Autodesk Meshmixer [31]


Note. From https://meshmixer.com/

Autodesk Meshmixer is a free 3D modelling software that lets users construct,
modify, and manipulate 3D models [31]. It includes modelling tools, mesh editing
tools, and 3D printing tools to help users create complex models. Meshmixer is a
popular choice among 3D modelling and printing hobbyists due to its user-friendly
interface and extensive file format support. Since the 3D model has already been
generated by AliceVision Meshroom, the purpose of using Autodesk Meshmixer is to
remove unwanted surfaces and refine the 3D model. This software will be used in
prototype one.

4.2.3 Unity3D

Fig. 15 Unity Logo [32]


Note. From https://unity.com/

Unity3D, also known as Unity, is a cross-platform game engine and integrated
development environment (IDE) that is used to create video games, simulations, and
other interactive 2D, 3D, VR and AR apps [32]. It features a visual editor, scripting
tools, a physics engine, animation tools, and support for multiple platforms, including
mobile, PC, and the web. Unity is a well-known game engine that has been used to
create several successful games and apps, including Pokémon Go and Super Mario
Run [33][34]. Unity will be the main software tool used to build the AR app for
prototype one. It will combine the 3D models and apply interactive features for the
digital food menu.
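As context for the scripting discussed in later sections, interactivity in Unity comes
from C# components attached to GameObjects. The sketch below is a minimal,
illustrative component of this kind; the class name and speed value are assumptions
for illustration, not code from this project. It slowly spins a 3D food model so it can
be previewed from all sides.

using UnityEngine;

// Minimal illustrative Unity component (assumed names and values):
// slowly spins the attached 3D food model so it can be previewed from all sides.
public class FoodPreviewSpin : MonoBehaviour
{
    public float degreesPerSecond = 30f;   // preview rotation speed (assumed)

    // Update is called once per rendered frame by the Unity engine.
    void Update()
    {
        transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime, Space.World);
    }
}

Attaching such a script to a model in the Unity editor is all that is needed; the engine
calls Update automatically each frame.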

4.2.4 Vuforia

Fig. 16 Vuforia Logo [35]


Note. From https://library.vuforia.com/

Vuforia Engine is an AR Software Development Kit (SDK) developed by PTC [35]. It
enables developers to build AR apps that use computer vision technology to identify
and track images and objects in the real world. Vuforia includes image recognition,
object tracking, and virtual buttons, which can be used to build interactive AR
experiences. The SDK supports an array of platforms, including Android, iOS, and
Unity3D. Vuforia has been used in a wide range of applications, such as games,
marketing initiatives, and education. Vuforia will be incorporated into Unity for
prototype one of this project. The functions used will be the interactive buttons and
object tracking for the 3D food model.
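To illustrate how such object tracking typically drives app logic, the sketch below
follows the DefaultTrackableEventHandler pattern from the legacy (pre-10) Vuforia
Unity SDK: when the target is detected, the attached 3D food model is shown, and
when tracking is lost, it is hidden. The class and field names are illustrative
assumptions, not this project's actual script, and newer Vuforia SDK versions expose
a different observer API.

using UnityEngine;
using Vuforia;

// Illustrative sketch (legacy Vuforia API, assumed names), not the project's script:
// toggles the 3D food model when the tracked target is found or lost.
public class FoodTargetHandler : MonoBehaviour, ITrackableEventHandler
{
    public GameObject foodModel;            // 3D food model to show or hide

    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        // DETECTED, TRACKED, and EXTENDED_TRACKED all mean the target is in view.
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        foodModel.SetActive(found);
    }
}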

4.2.5 ARCore

Fig. 17 ARCore Logo [36]


Note. From https://developers.google.com/ar

ARCore, developed by Google, is an AR SDK for developing AR experiences on
Android devices [36]. It enables developers to create AR apps that use the device's
camera and sensors to blend digital objects and information with the real world. To
build realistic AR interactions, ARCore employs motion tracking, environmental
understanding, and light estimation. ARCore works with a variety of Android 7.0
(Nougat) or later devices and needs specific hardware support, such as for ARCore's
depth API. ARCore is used to build and run the digital food menu AR app created in
Unity for prototype one of this project.

4.2.6 RunwayML

Fig. 18 RunwayML Logo [37]


Note. From https://runwayml.com/

RunwayML is an AI-based software that provides developers with various machine
learning models and tools for building AI-powered apps [37]. The green screen video
background remover, which uses a deep learning algorithm to remove the background
from images or videos, is one of its functions. This tool can be helpful when creating
a digital food menu with AR technology because it can help remove the background
from images of food items, making them simpler to incorporate into the AR world.

Because many kitchens are cramped, it is often not practical to install a physical
green screen due to space constraints. Another reason is that a green screen can only
be laid out in one orientation, so the photographer can shoot the food images or
videos from only a few angles within it. As a result, background remover software
proves helpful in these circumstances. Additionally, the expense of a green screen
can be minimised.

Fig. 19 Example of before and after green screen remover


Note. Original Video From https://www.youtube.com/watch?v=g-10aU5YsDc

Figure 19 shows the before and after effect of the background being overlaid with a
green screen, which helps to focus on the subject and remove unwanted content. The
subject can then be implemented into the AR food app for both prototypes. The only
downside is the hassle of scanning through the whole video to check whether the
background was removed properly. If it was not, frames will need to be cleaned up
individually, which is quite time consuming.

4.2.7 Figma

Fig. 20 Figma Logo [38]


Note. From http://figma.com/

Figma is a web-based UI design and prototyping software that allows designers and
developers to collaborate on projects in real time. It includes vector editing tools,
shape and text tools, prototyping tools, and design frameworks for creating user
interfaces. Figma is also notable for its collaboration capabilities, as multiple users
can work on the same project at the same time, with modifications synchronised in
real time.

Fig. 21 An overview of framework used to design prototype two


Note. From http://figma.com/

The author and his team members collaborated in Figma, as shown in Figure 21
above, to design the user interface for the prototype two AR food app. Different
wireframes were created, and design ideas were iterated on after each meeting.

4.2.8 MindAR

Fig. 22 MindAR Logo [39]


Note. From https://hiukim.github.io/mind-ar-js-doc/

MindAR, created by Hiukim, is a lightweight and simple-to-use open-source
JavaScript library for developing web-based AR experiences [39]. It enables
developers to quickly integrate AR into their websites or apps by providing a
collection of tools for detecting and tracking markers, object recognition, plane
detection, and real-time rendering of 3D models and other visual representations
that can be positioned and manipulated by the user within the AR scene.

This makes it an excellent choice for both novice and seasoned coders. It also
supports a variety of devices, including smartphones and tablets, making it a flexible
option for creating AR experiences that can be accessed by many users.

MindAR is used in this project for prototype two. First, it compiles images to extract
feature points so that the AR app can later track and detect the images [39]. This step
is crucial because compiling requires time; therefore, it is best to do it ahead of time to
shorten the loading time when individuals use the AR app later.

Overall, MindAR is a strong and flexible library that gives developers a fantastic
collection of tools for building entertaining and immersive AR experiences on the
web.

4.3 Programming Language


Programming languages are also an important factor in creating an AR app. They
make certain parts of the app interactive and connect the overall flow for a smooth
AR experience. A language should be capable of handling the functionality and
performance of the app. The following programming languages were used to code
prototypes one and two for their interactive design.

4.3.1 C Sharp (C#)


C Sharp (C#) is a ubiquitous object-oriented programming language created by
Microsoft [40]. It is frequently utilised in the development of AR apps on platforms
like Unity3D. C# is a robust and versatile programming language that can manage
complex AR functionality. Its syntax is comparable to other C-style languages,
making it simple to learn for programmers familiar with C++.

C# scripts for various interactive roles, such as triggering buttons to fire certain
events and rotating the 3D models, will be used in prototype one. The scripts will be
implemented in Unity.

4.3.2 JavaScript (JS)


JavaScript (JS), invented by Brendan Eich, is a common programming language used
for web development and can be used to construct AR apps [41]. It was first
developed for Netscape 2, after which Mozilla took over and continued its
development. JS, as a client-side scripting language, can create dynamic and
interactive elements that enhance user engagement and facilitate an intuitive user
experience. Prototype two uses JS as its main language to code the functions of the
AR food app. The author's team members were the main coders for the prototype
two app.

4.3.3 HTML
HTML, or HyperText Markup Language, was first developed by Tim Berners-Lee in
the late 1980s to share information between researchers [42]. Since then, it has
become the standard language for creating web pages and defining their structure.
HTML was used in prototype two to give structure to the JS code and to its web-AR
page. The author's team members incorporated this language together with JS.

4.3.4 CSS
CSS, also known as Cascading Style Sheets, was developed in the late 1990s to
separate the presentation and content of a web page [43]. As a result, designers could
produce more aesthetically pleasing web pages without having to cram presentational
components into the HTML code. In short, CSS beautifies the content created by
HTML. The author's team members used this language to make the web page more
appealing.

Chapter Five

5 Design and Development


This section will cover the process of designing the AR food app for both prototypes.
Prototype one, which was created by the author, uses Unity and Vuforia to map out
the AR app. For prototype two, the author assisted in designing the user interface
layout, while the coding was done by his team members.

5.1 Prototype One

Fig. 23 Development Flowchart for Prototype One

Figure 23 provides an overview of the four phases involved in developing prototype
one, namely 3D Food Modelling, App Building, Testing, and Implementation.
However, this section will only discuss the first two phases: 3D Food Modelling and
App Building. The Testing and Implementation phases will be covered in the next
chapter.

For 3D Food Modelling, the photogrammetry technique was applied via the
AliceVision Meshroom software to generate the 3D food model, and further
processing was done in Autodesk Meshmixer to refine the model. To achieve this,
multiple pictures of the food were taken from various perspectives, preferably more
than 20 pictures, to accurately capture the whole food item in the real world. The
more pictures taken, the better the result. The photos were then transferred to a
computer workstation, and since they were in 2D format, software was needed to
recreate the food in 3D in the virtual environment.

Fig. 24 Example of food generating in AliceVision Meshroom

Therefore, AliceVision Meshroom, as shown in Figure 24, was a valuable asset in
addressing this particular issue. With this software, all the user needs to do is run it,
import the photos, and generate the model. It typically takes a minimum of 30
minutes to generate a model from a small number of photos. As the number of photos
increases, the time required to generate the model also increases, but the output
becomes more accurate. If another 3D food model is required, the user can simply
repeat the same steps.

Fig. 25 Unedited 3D model of Pork Rib Soup

Fig. 26 Editing 3D model of Pork Rib Soup with Autodesk Meshmixer

Fig. 27 Final edited 3D model of Pork Rib Soup

In Figure 25, the unedited 3D food model can be seen after being generated by the
Meshroom software. The next step involves importing the food model into Autodesk
Meshmixer, shown in Figure 26, for refinement, where unwanted surfaces such as the
brown wooden plane can be removed. The purpose of this refinement process is to
obtain a 3D food model that is free of extraneous elements and suitable for use in the
AR app. The final version of the 3D food model, which will be imported into the AR
app, can be seen in Figure 27.

Fig. 28 App Building in Unity

In the App Building phase, relevant resources, including the 3D food model and the
Vuforia Engine, were imported into Unity for the design and development of the AR
app. The user interface for the app was created entirely inside Unity, as shown in
Figure 28. To make the interface more interactive and dynamic, various C# scripts
written in Visual Studio were integrated into Unity. The code is available in the
appendix section.

After the design and development phase was complete, ARCore for Android was used
to build the app's APK file. This app file was then used for testing purposes to verify
the app's functionality. Any necessary adjustments or bug fixes were made during the
testing phase to ensure a smooth user experience.

5.2 Prototype Two

Fig. 29 Development Flowchart for Prototype Two

Figure 29 depicts the four phases of prototype two's development: Augmented Food
Video, App Building, Testing, and Implementation. This section will focus on the
Augmented Food Video and App Building phases, while Testing and Implementation
will be covered in the next chapter.

The Augmented Food Video phase involved shooting food videos with and without
green/blue screens. The videos were then edited on a computer workstation. If
needed, the RunwayML background remover software mentioned in the previous
chapter was used to replace unwanted backgrounds with a green screen. The videos
were then added to the libraries in the web app built during the App Building phase.

The web app was primarily built using JavaScript and was supported by HTML and
CSS to create an organised, aesthetically pleasing, and user-friendly interface. The
food images were fed to the MindAR image compiler, which integrated them into the
web app. The augmented video was then attached to the compiled target image. An
example of how the augmented video is projected up from the targeted image is
shown in Figures 30 and 31. Additionally, the Figma design was used as the basis for
well-organised CSS code.

Fig. 30 Targeted Image of Leng Zaab Fig. 31 Single Frame from Augmented Video
Note. From Un-Yang-Kor-Dai Singapore Menu

The Testing and Implementation phases will provide further insight into how the web
app functions.

Chapter Six

6 Testing and Implementation


During the Testing and Implementation phase, Prototype One underwent internal
testing by the author and a few volunteers. It was found to be compatible with
Android 8.0 and newer versions but was not tested on Apple products. Users had the
option of using the eateries' tablets or downloading the app onto their mobile phones.
To use the app, they needed to open it, select a menu item at the bottom left of the
screen by tapping it, and point the camera at the corresponding wording on the
text-based menu. A 3D model of the food, with an option to play a video, would then
pop up. Users could rotate the 3D food model and view the price of the menu item at
the bottom right of the screen. The app can be used in both vertical and horizontal
orientations, depending on how users hold their devices. An example of how the food
app functions is shown in Figures 32, 33 and 34 below.

Fig. 32 Sample Text-Based Menu

Fig. 33 3D Fried Chicken Model Fig. 34 3D Ramen Model

Prototype Two was tested in a closed beta with Un Yang Kor Dai. Users were able to
use their own mobile devices to access the digital food menu via a scanned QR code.
The digital food menu is currently hosted on the Heroku cloud platform.

Once users tapped the interface to start, they pointed the phone's camera at the
image-based menu, which triggered the augmented video of the food to pop up and
play automatically. Figure 35 below shows an example of this.

Fig. 35 How the app functions on Image-Based Menu

Bugs and feedback were collected during the closed beta testing to improve both
prototypes. Subsequently, they will be released to the public for further improvement
and updated accordingly. Continuously improving and updating the digital food menu
based on user feedback and behaviour can help strengthen the eateries' branding and
maintain stronger individual engagement. This can lead to a better user experience
and individual satisfaction, ultimately resulting in increased revenue for the
restaurant.

Chapter Seven

7 Conclusion, Limitation and Recommendation


This paper has discussed the potential of using AR technology to create a digital food
menu on handheld devices for eateries. The objective was to develop an immersive,
interactive, and playful dining experience that would help individuals visualise the
food they were ordering and make better decisions. The paper also explored the
applications and technologies of MR, VR, and AR, as well as brief research on UX
and UI in the context of designing a specific app.

Two prototypes were developed: Prototype One using Unity and Vuforia, and
Prototype Two using JavaScript. Prototype One involved using photogrammetry to
create 3D food models with AliceVision Meshroom, refining them with Autodesk
Meshmixer, and integrating them into Unity for app development. In contrast,
Prototype Two consisted of filming food videos with and without green/blue screens
and creating augmented videos attached to a target image compiled in the MindAR
image compiler. Both prototypes had user-friendly and visually appealing user
interfaces, with the layout design of Prototype Two done by the author and the coding
done by his team members. The testing and implementation phases of both prototypes
were discussed in Chapter 6.

Using AR technology in digital food menus has several benefits, such as reducing
choice overload by providing visual aids that help individuals make informed choices
and giving an accurate representation of the dish to avoid any misunderstandings.
Moreover, AR technology can create an immersive and interactive experience for
diners, making the overall dining experience more engaging and enjoyable.

However, the current design and development of the digital food menu have some
limitations. For example, it may not be accessible to all individuals, particularly those
who are not familiar with technology or do not have handheld devices. Additionally,
the cost of implementing AR technology in eateries may be high, which may prevent
smaller establishments from adopting it. Another limitation is the accuracy of the 3D
food models for Prototype One, which relies heavily on the number and angles of
photographs taken. Future work could explore alternative techniques such as 3D
scanning or LiDAR to improve accuracy.

Future research could be conducted to address these limitations and improve the
design and development of digital food menus. For instance, exploring alternative
methods of implementing AR technology to make it more accessible and affordable,
and evaluating the effectiveness of AR technology in improving the overall dining
experience and the success of an eatery.

In conclusion, the development of AR technology in the food industry is still in its
early stages, with much potential for growth and innovation. This paper provides a
foundation for future research and development in this area, and hopefully, more
eateries will adopt AR technology to enhance the dining experience for their
customers.

References

[1] Regenesys Business School, "The Fifth Industrial Revolution (5IR) and how it will change the business landscape," 2020. [Online]. Available: https://www.regenesys.net/reginsights/the-fifth-industrial-revolution-5ir/.

[2] E. Velasco, "Scientists Uncover Why You Can't Decide What to Order for Lunch," 2018. [Online]. Available: https://www.caltech.edu/about/news/scientists-uncover-why-you-cant-decide-what-order-lunch-83881.

[3] Enterprise Press, "The psychology of visuals, how images impact decision making," 2020. [Online]. Available: https://enterprise.press/stories/2020/02/05/the-psychology-of-visuals-how-images-impact-decision-making-11273/.

[4] P. Milgram and F. Kishino, "A taxonomy of mixed reality visual displays," IEICE Transactions on Information and Systems, vol. 77, no. 12, pp. 1321-1329, 1994.

[5] C. E. Hughes, C. B. Stapleton, D. E. Hughes, and E. M. Smith, "Mixed reality in education, entertainment, and training," IEEE Computer Graphics and Applications, vol. 25, no. 6, pp. 24-30, 2005, doi: 10.1109/mcg.2005.139.

[6] J. Gerup, C. B. Soerensen, and P. Dieckmann, "Augmented reality and mixed reality for healthcare education beyond surgery: an integrative review," International Journal of Medical Education, vol. 11, pp. 1-18, 2020, doi: 10.5116/ijme.5e01.eb1a.

[7] J. J. K. Chai, C. O'Sullivan, A. A. Gowen, B. Rooney, and J.-L. Xu, "Augmented/mixed reality technologies for food: A review," Trends in Food Science & Technology, vol. 124, pp. 182-194, 2022, doi: 10.1016/j.tifs.2022.04.021.

[8] HTX Singapore, "Mixed reality and holograms in courts of the future," 2022. [Online]. Available: https://www.htx.gov.sg/news/featured-news-mixed-reality-and-holograms-in-courts-of-the-future.

[9] W. R. Sherman and A. B. Craig, Understanding Virtual Reality. San Francisco, CA: Morgan Kaufmann, 2003.

[10] Iberdrola, "Virtual Reality: another world within sight," 2022. [Online]. Available: https://www.iberdrola.com/innovation/virtual-reality.

[11] J. D. Calvert and H. T. Lawless, "Virtual Reality and Its Potential for Food and Beverage Industries," Journal of Food Science, vol. 83, no. 4, pp. 1067-1072, Apr. 2018.

[12] H. J. Kim, M. H. Kim, and E. Park, "Virtual Reality in Food Science and Hospitality," Journal of Food Science Education, vol. 16, no. 2, pp. 58-66, Jun. 2017.

[13] K. Lee, M. H. Lee, and K. J. Kim, "Does Virtual Reality Food Tour Enhance Satisfaction with Food Tourism?," Sustainability, vol. 11, no. 19, p. 5181, Sep. 2019.

[14] J. Kim, S. Kim, and Y. Kim, "The effect of augmented reality menus on customers' sensory experience and behavioural intentions in restaurants," Journal of Travel Research, vol. 59, no. 4, pp. 656-670, 2020.

[15] Kabaq, "Augmented Reality Solution for the Food Industry." [Online]. Available: https://kabaq.io/.

[16] L. Lin and M. Cooper, "Augmented Reality: Enhancing the Customer Experience in the Food Service Industry," Journal of Foodservice Business Research, vol. 21, no. 2, pp. 119-129, Mar. 2018.

[17] Y. Kim and J. Kim, "Effects of Augmented Reality on Customer's Impulsive Buying Behaviour in Restaurants," Journal of Hospitality Marketing & Management, vol. 29, no. 5, pp. 551-564, Jun. 2020.

[18] Y. Liu, L. Li, and Y. Liang, "Augmented Reality Menu Design and Customer Experience: An Experimental Study," International Journal of Hospitality Management, vol. 86, p. 102445, Mar. 2020.

[19] ISO 9241-210:2019, Ergonomics of human-system interaction -- Part 210: Human-centred design for interactive systems.

[20] B. Shneiderman and C. Plaisant, Designing the User Interface: Strategies for Effective Human-Computer Interaction, 5th ed. Addison-Wesley, 2010.

[21] P. Lopes and F. Branco, "A review of the literature on the usability of virtual environments," Virtual Reality, vol. 22, no. 4, pp. 301-326, 2018.

[22] A. Leue and M. Beaudouin-Lafon, "Beyond the lab: Evaluating HCI in the wild," Interactions, vol. 26, no. 6, pp. 26-33, 2019.

[23] A. Dey, M. Billinghurst, and R. W. Lindeman, "A systematic review of 10 years of augmented reality usability studies: 2005 to 2014," Frontiers in Robotics and AI, vol. 3, p. 22, 2016.

[24] J. Nielsen, Usability Engineering. Academic Press, 1993.

[25] E. Krokos, C. Plaisant, and A. Varshavsky, "A human-centered evaluation of a mobile augmented reality system for consumer health education," Journal of Biomedical Informatics, vol. 63, pp. 21-31, 2016.

[26] M. Billinghurst and A. Duenser, "Augmented reality in the classroom," in Proceedings of the 2012 ACM International Conference on Interactive Tabletops and Surfaces, 2012, pp. 1-8.

[27] Realme SG, "Realme 5 Specification," 2019. [Online]. Available: https://www.realme.com/sg/realme-5/specs/.

[28] A. Mohan, J. Tumblin, B. Bodenheimer, C. Grimm, and R. Bailey, "Table-top computed lighting for practical digital photography," presented at ACM SIGGRAPH 2006 Courses, Boston, Massachusetts, 2006. [Online]. Available: https://doi.org.remotexs.ntu.edu.sg/10.1145/1185657.1185742.

[29] AliceVision, "AliceVision," 2023. [Online]. Available: https://alicevision.org/#.

[30] C. Griwodz et al., "AliceVision Meshroom: An open-source 3D reconstruction pipeline," presented at the 12th ACM Multimedia Systems Conference, Istanbul, Turkey, 2021. [Online]. Available: https://doi.org/10.1145/3458305.3478443.

[31] Autodesk, "Meshmixer," 2021. [Online]. Available: https://meshmixer.com/.

[32] Unity, "Unity3D," 2023. [Online]. Available: https://unity.com/.

[33] N. Wingfield, "Unity Technologies, Maker of Pokémon Go Engine, Swells in Value," 2016. [Online]. Available: https://www.nytimes.com/2016/07/14/technology/unity-technologies-maker-of-pokemon-go-engine-swells-in-value.html.

[34] A. Osborn, "Super Mario Run Created With Unity," 2016. [Online]. Available: https://www.ign.com/articles/2016/11/01/super-mario-run-created-with-unity.

[35] Vuforia, "Vuforia Engine," 2023. [Online]. Available: https://library.vuforia.com/.

[36] Google, "ARCore," 2023. [Online]. Available: https://developers.google.com/ar.

[37] RunwayML, "RunwayML," 2023. [Online]. Available: https://runwayml.com/.

[38] Figma, "Figma," 2023. [Online]. Available: https://figma.com/.

[39] MindAR, "MindAR - Documentation," 2023. [Online]. Available: https://hiukim.github.io/mind-ar-js-doc/.

[40] Microsoft, "A tour of the C# language," 2023. [Online]. Available: https://learn.microsoft.com/en-us/dotnet/csharp/tour-of-csharp/.

[41] W3Schools, "JavaScript History," 2023. [Online]. Available: https://www.w3schools.com/js/js_history.asp.

[42] University of Washington, "A Brief History of HTML," 2005. [Online]. Available: https://www.washington.edu/accesscomputing/webd2/student/unit1/module3/html_history.html.

[43] Boston University, "History of CSS," 2005. [Online]. Available: https://www.bu.edu/lernet/artemis/years/2020/projects/FinalPresentations/HTML/historyofcss.html.

Appendix

Script Codes for Prototype One


Menu trigger script: Show Price, Interactive Menu
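The original script is included in the report as an image and is not reproduced here.
The following is a minimal illustrative sketch of how such a menu trigger could be
written with Unity's UI system; the class, field names, and wiring (a button per menu
item, a price label at the bottom right) are assumptions for illustration, not the
project's actual code.

using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch only (assumed names), not the original script:
// tapping a menu button activates the matching 3D food model and shows its price.
public class MenuTrigger : MonoBehaviour
{
    public GameObject[] foodModels;   // one 3D model per menu item
    public string[] prices;           // price label per menu item
    public Text priceText;            // UI text shown at the bottom right

    // Hooked up to each menu button's OnClick event in the Unity Inspector,
    // passing that button's menu index.
    public void SelectMenuItem(int index)
    {
        for (int i = 0; i < foodModels.Length; i++)
            foodModels[i].SetActive(i == index);

        priceText.text = prices[index];
    }
}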

Object Rotation Script:
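As above, the original script appears as an image. Below is a minimal illustrative
sketch of a one-finger touch-drag rotation of the kind described in Section 4.3.1; the
rotation speed and names are assumptions, not the project's actual code.

using UnityEngine;

// Illustrative sketch only (assumed names and values), not the original script:
// dragging one finger horizontally spins the 3D food model about its vertical axis.
public class ObjectRotation : MonoBehaviour
{
    public float rotationSpeed = 0.2f;   // degrees per pixel of drag (assumed)

    void Update()
    {
        if (Input.touchCount == 1)
        {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Moved)
            {
                // Rotate opposite to the horizontal drag so the model follows the finger.
                transform.Rotate(Vector3.up,
                                 -touch.deltaPosition.x * rotationSpeed,
                                 Space.World);
            }
        }
    }
}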

3D Food Models for Prototype One:

