AUGMENTED FOOD APPRECIATION
Final Year Project (FYP) Report
SUBMITTED BY YANG XUNSHENG
YEAR 2022/2023
Nanyang Technological University, Singapore

Yang, X. (2023). Augmented food appreciation. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/167132
Table of Contents
Abstract
Acknowledgement
List of Figures
List of Tables
Chapter One
1 Introduction
1.1 Background
1.2 Objectives
1.3 Scope
Chapter Two
2 Literature Review
2.1 Mixed Reality
2.2 Virtual Reality
2.3 Augmented Reality
2.4 User Experience & User Interface
Chapter Three
3 Methodology
Chapter Four
4 Hardware, Software & Programming Language
4.1 Hardware
4.1.1 Photography Devices
4.1.2 Portable Lighting
4.2 Software & Techniques
4.2.1 AliceVision
4.2.2 Autodesk Meshmixer
4.2.3 Unity3D
4.2.4 Vuforia
4.2.5 ARCore
4.2.6 RunwayML
4.2.7 Figma
4.2.8 MindAR
4.3 Programming Language
4.3.1 C Sharp (C#)
4.3.2 JavaScript (JS)
4.3.3 HTML
4.3.4 CSS
Chapter Five
5 Design and Development
5.1 Prototype One
5.2 Prototype Two
Chapter Six
6 Testing and Implementation
Chapter Seven
7 Conclusion, Limitation and Recommendation
References
Appendix
Abstract
This paper explores the use of Augmented Reality (AR) technology to create a digital
food menu for eateries on handheld devices. The aim is to provide an immersive,
playful, and interactive dining experience that helps individuals make more informed
and confident food choices. The literature review covers mixed reality, virtual reality,
augmented reality, user experience and user interface design. The methodology
involves several stages, including planning, research, design, development, testing,
and implementation, and the necessary hardware, software, and programming
languages are discussed. Two prototypes are presented, one using Unity and the other
using JavaScript. The testing phase showed that both prototypes had functional and
interactive interfaces with potential to enhance the dining experience.
Acknowledgement
The author would like to express his sincere gratitude to Dr. Cai Yiyu for providing
the opportunity to work on this project and for his invaluable guidance and support
throughout the entire process. The author would also like to extend his gratitude to his
team members, Cai JingHong, Ding YiJie, and Clarissa Bella Jew, for their valuable
insights, assistance, and knowledge sharing, which were instrumental in the on-going
development of this project.
The author would also like to thank the staff and management of the various eateries
such as Un Yang Kor Dai who participated in this project, for their cooperation and
support, without which this project would not have been possible.
Lastly, the author would like to express his appreciation to all the individuals who
have contributed to this project, directly or indirectly, and who have provided
inspiration and motivation for the author to continue pursuing excellence in the field
of AR technology.
List of Figures
Figure 1 Lao Beijing’s Traditional Menu
Figure 2 Sushiro’s Personalised Menu from Hand-held Device
Figure 3 Beauty in The Pot’s Personalised Menu from Hand-held Device
Figure 4 Tsukimi Hamburg’s Customised Menu from scanning QR code
Figure 5 Virtuality Continuum Spectrum
Figure 6 Bloodstain Pattern Analysis (BPA) training program
Figure 7 VR scene of Ocean
Figure 8 Diners experiencing VR Dining
Figure 9 Diners viewing AR Menu
Figure 10 3D Food Model of Banoffee
Figure 11 Realme 5 Quad-Camera
Figure 12 AliceVision Meshroom Logo
Figure 13 Photogrammetry of Ramen produced by AliceVision Meshroom
Figure 14 Autodesk Meshmixer
Figure 15 Unity Logo
Figure 16 Vuforia Logo
Figure 17 ARCore Logo
Figure 18 RunwayML Logo
Figure 19 Example of before and after green screen remover
Figure 20 Figma Logo
Figure 21 An overview of framework used to design prototype two
Figure 22 MindAR Logo
Figure 23 Development Flowchart for Prototype One
Figure 24 Example of food generating in AliceVision Meshroom
Figure 25 Unedited 3D model of Pork Rib Soup
Figure 26 Editing 3D model of Pork Rib Soup with Autodesk Meshmixer
Figure 27 Final edited 3D model of Pork Rib Soup
Figure 28 App Building in Unity
Figure 29 Development Flowchart for Prototype Two
Figure 30 Targeted Image of Leng Zaab
Figure 31 Single Frame from Augmented Video
Figure 32 Sample Text-Based Menu
Figure 33 3D Fried Chicken Model
Figure 34 3D Ramen Model
Figure 35 How the app functions on an Image-Based Menu
List of Tables
Table 1 Timeline of the Five Industrial Revolutions (5IR)
Chapter One
1 Introduction
1.1 Background
Technological advancements have greatly facilitated human life for generations, and
without realising it, we have now entered the fifth industrial revolution –
Personalisation [1]. Personalisation is where creativity and innovation meet the fourth
industrial revolution – Digitalisation. Digitalisation involves robotics, artificial
intelligence (AI), and the Internet of Things (IoT), and this revolution builds on the
first, second, and third industrial revolutions [1]. This can be seen in Table 1 below.
IoT (Internet of Things) links the real and virtual worlds. Information is derived from
data gathered by IoT-enabled devices such as smartphones. Using technologies such
as augmented reality (AR) and virtual reality (VR), this information can be made
visible in real time.
Imagine hungry individuals at an eatery, reading the text-based menu shown in
Figure 1 below, with or without a few signature dish images on it, while trying to
decide what to eat. Oftentimes, individuals become indecisive because of choice
overload, a phenomenon that occurs when the brain is presented with an
overwhelming range of options and struggles to decide [2].
To reduce choice overload, pairing those words with visuals will help individuals
make wiser choices and give a clearer view of what they will be ordering. As the
saying goes, "A picture is worth a thousand words": by looking at appealing dish
images, individuals can better choose the food they prefer. This is because the brain
processes words as tiny images before combining them to understand the
information [3].
Despite technological advances, many eateries still use text-based menus. Only a
small fraction of eateries have incorporated the handheld device, a technology most
individuals are already familiar with. Eateries either provide their
personalised handheld device menu or ask individuals to use their smartphone and
scan a Quick Response (QR) code, which then leads to the eatery's customised digital
food menu. These digital menus usually include food images, as shown in the
examples below.
Fig. 4 Tsukimi Hamburg’s Customised Menu from scanning QR code
Note. From https://njoy.com.sg/brands/tsukimihamburg/
Visual aids may indeed make information easier to grasp than text alone, but they are
not always accurate illustrations and can therefore mislead individuals. Eateries will
always showcase their most attractive food photos on the menu to entice individuals
to order. When the food that arrives does not match how it was depicted in the menu,
it can be disappointing or irritating.
To avoid individuals feeling misled, the visual and written representation of a dish
on the menu should be an exact portrayal of how the dish looks. This is where visual
aids and text-based menus need to be upgraded. By introducing an Augmented
Reality (AR) component, 2-Dimensional (2D) food images and text-based menus can
be transformed into 3-Dimensional (3D) food models. This allows individuals to see
the dish they will be having at an exact 1:1 scale, strengthening their
decision-making process.
1.2 Objectives
This paper’s sole focus will be on designing and developing a digital food menu on
handheld devices for eateries through Augmented Reality (AR) technology, with the
goal of creating an immersive, playful, and interactive overall dining experience.
1.3 Scope
The scope of this paper will explore the technologies and applications of Mixed
Reality (MR), Virtual Reality (VR), and Augmented Reality (AR). In addition, short
research into User Experience (UX) and User Interface (UI) in the context of
designing a specific app will also be conducted.
Chapter Two
2 Literature Review
A literature review is a critical evaluation of the body of work that has already been
published on a certain subject or question. The literature on Mixed Reality (MR),
Virtual Reality (VR), Augmented Reality (AR), User Experience (UX), and User
Interface (UI) will be discussed in this section to better understand the technology.
2.1 Mixed Reality (MR)
MR has found practical applications across various industries, such as entertainment,
healthcare, military, education, and even food [5][6][7]. A recent example is the
development of a Bloodstain Pattern Analysis (BPA) training program by the Home
Team Science and Technology Agency (HTX) in 2022 [8]. This program utilises MR
technology by projecting holographic images onto the user's field of view through a
headset, creating a realistic virtual crime scene with various bloodstains for trainees to
interact with and analyse [8]. The BPA program is depicted in Figure 6, which uses
augmented reality features in a virtual environment setting. The use of MR allows for
a more interactive and realistic training setting for trainees, thus improving learning
outcomes.
2.2 Virtual Reality (VR)
The term "Virtual Reality" (VR) refers to a computer-generated simulation that
mimics the actual world or an imaginary environment [9][10]. The user can interact
with this environment in a fully immersive manner by using VR headsets and
controllers. VR technology has been explored in many fields, such as gaming,
medicine, engineering, and education. The application of VR technology in the food
industry has also gained popularity over the years.
Sublimotion, a two-Michelin-starred restaurant in Ibiza, Spain, is renowned for its
use of VR technology to provide diners with an immersive, luxurious, and
multi-sensory eating experience. While diners enjoy their courses, the restaurant
transports them to various places using VR. For instance, while tasting seafood
dishes, diners experience a virtual trip through the ocean's depths, as shown in
Figures 7 and 8. Furthermore, Sublimotion enhances the eating experience with
music, illumination, and other sensory elements.
Sublimotion's use of VR is just one illustration of how the food industry's adoption
of technology is improving the restaurant experience. VR technology holds a lot of
promise for enhancing the dining experience. Its ability to create immersive and
interactive settings lets diners experience food production and preparation processes,
and VR menus let diners examine 3D models of dishes and customise their orders,
giving them a more immersive and participatory experience.
2.3 Augmented Reality (AR)
The use of AR technology in food menus has been shown to increase consumer
involvement and satisfaction, according to research by Kim et al. [14]. The research
found that consumers of AR menus expressed greater satisfaction with the ordering
process and were more likely to revisit the restaurant. It also found that AR menus
improved how well people perceived the food items, indicating that the technology
may improve dining experiences in general.
Kabaq is a firm that has developed an AR solution for the food industry, enabling
consumers to examine virtual 3D models of menu items on their handheld devices,
as shown in Figures 9 and 10 [15]. This technology can completely change how
diners engage with menus and make food ordering more enjoyable and
individualised.
Fig. 9 Diners viewing AR Menu [15] Fig. 10 3D Food Model of Banoffee [15]
Fig. 9 & 10 Note. From https://kabaq.io/
According to Kabaq's research, individuals were more likely to purchase menu items
when they could see a 3D representation of the meal [15]. The research also found
that individuals who used the AR menu spent more time browsing it and were more
apt to post about their experience on social media.
The use of AR technology in the food industry has a variety of possible advantages,
such as enhancing individuals’ satisfaction and involvement and boosting revenue
for the restaurant [16]. With the ability to view accurate 3D models of menu items
and modify their orders, AR menus can give individuals a more interactive and
immersive experience.
The application of AR technology in the food industry does, however, come with
some difficulties. Since the technology is rather new, more resources may be needed
for assistance [17]. Additionally, not all restaurants or menus may be appropriate for
the use of AR menus, and some patrons may not feel safe using the technology.
Despite these difficulties, using AR technology in the food industry shows a lot of
promise for enhancing the individual experience and increasing revenue [18]. As the
technology advances and becomes more widely available, more restaurants are
likely to adopt AR menus to improve their offerings and stay competitive in a
rapidly changing market.
Overall, MR, VR, and AR technologies have the potential to change the food industry
by refining the diner experience and allowing consumers to connect with food in more
interactive and immersive ways. More study is needed to fully explore and exploit
the potential of these technologies in the food industry, including tackling issues like
cost, accessibility, and privacy. As these technologies advance and become more
widely available, it will be exciting to see how restaurants and food businesses
integrate them into their operations to improve the dining experience and customer
satisfaction. It is clear that MR, VR, and AR have the potential to transform the food
industry while providing intriguing future possibilities.
2.4 User Experience (UX) & User Interface (UI)
UX is the term used to describe how an individual experiences a system, product, or
service [19]. In the context of MR, VR, and AR technologies, UX refers to how
consumers interact with the virtual or augmented environment. The architecture of
the UX is a crucial factor in the overall performance of the technology.
The UI of a technology refers to its visible and functional design, which includes
menus, buttons, and other components that enable individuals to interact with the
system [20]. The user interface design for MR, VR, and AR technologies must ensure
that the user can readily explore and interact with the virtual or augmented
environment.
The UX and UI of MR, VR, and AR technologies have been extensively researched
in academic literature and commercial studies [21][22][23]. Design principles for
successful UX and UI include simplicity, consistency, usability, and feedback [24].
A well-designed UX and UI can give users a seamless and engaging experience. On
the other hand, if the UX and UI are poorly designed, users may become frustrated,
lose interest, and ultimately be less likely to use the technology.
To conclude, effective UX and UI design is vital for the adoption of MR, VR, and AR
technologies in the food industry. To ensure that users can effectively navigate and
interact with the virtual or augmented environment, the UX and UI must be intuitive,
user-friendly, and engaging.
Chapter Three
3 Methodology
Using AR technology, the procedure for designing and developing a digital food
menu for handheld devices can be broken down into several phases. Planning,
research, design, development, testing, and implementation are the phases that
comprise this process.
Planning:
At this point, the project's goals, objectives, and scope are outlined, and the
resources needed are determined, along with timelines and budget. All project
stakeholders should be involved in the planning stage to ensure that everyone is on
the same page about the project's objectives.
Research:
During this phase, information on the technologies and applications of MR, VR, and
AR is collected. In addition, UX and UI research in the context of designing a
specific app is performed. It is also important to study existing digital food menus on
handheld devices, along with their features, customer evaluations, and comments.
This phase will likely influence the design, development, and testing processes.
Design:
During the design phase, wireframes and prototypes of the digital food menu's UX
and UI are constructed. The research done in the earlier phase will likely influence
the design. The style should be aesthetically pleasing, simple to use, and
straightforward. The designer should consider the restaurant's brand identity and
ensure that the design is coherent with it.
Development:
The backend and frontend of the digital food menu should be created during the
development stage. This phase involves coding and combining AR technology into
the app. The development team should ensure that the app is responsive, works on
various operating systems, and provides a consistent user experience.
Testing:
During the testing phase, any bugs or errors in the app are identified and rectified.
The testing process should be comprehensive and include testing on a variety of
devices, platforms, and environments. To ensure that the app is bug-free and delivers
a smooth user experience, both the developers and external testers should test it.
Implementation:
The digital food menu should be published and made accessible to individuals
during the implementation stage. To guarantee that the software reaches the target
audience, it should be advertised and promoted. Restaurant employees should be
trained to use the app and be able to assist diners who have difficulties using it. To
ensure that the app stays relevant, it should be continuously monitored and updated.
Overall, the design and development of a digital food menu on handheld devices
using AR technology requires meticulous planning, research, design, development,
testing, and implementation. The final product should be an app which provides an
immersive, playful, and interactive dining experience.
Chapter Four
4 Hardware, Software & Programming Language
4.1 Hardware
4.1.1 Photography Devices
A photography device and portable lighting are among the hardware requirements
for creating the AR app. Food images can be captured using photography devices
such as DSLRs, mirrorless cameras, or even smartphones with high-quality cameras.
Portable lighting can be used to ensure that the images captured are well lit.
Due to budget constraints, the author was limited to using his own smartphone for
food photography. The Realme 5 smartphone was used to shoot the food, and the
photos were then transferred to a computer for processing. The image quality is
sufficient to produce decent results.
Fig. 11 Realme 5 Quad-Camera
Note. From https://c.realme.com/in/post-details/1164427197176348672
The Realme 5 smartphone has a quad-camera system on the back, which includes
[27]:
- a 12-megapixel main camera with an f/1.8 aperture
- an 8-megapixel ultrawide camera with a 119-degree field of view and an f/2.25 aperture
- a 2-megapixel macro camera with an f/2.4 aperture and a 4 cm minimum focusing distance
- a 2-megapixel depth camera with an f/2.4 aperture
4.2 Software & Techniques
The following software tools and techniques were used in the development of the
AR app for both prototypes:
4.2.1 AliceVision
Fig. 13 Photogrammetry of Ramen produced by AliceVision Meshroom
4.2.2 Autodesk Meshmixer
Autodesk Meshmixer was used to edit unwanted surfaces and to refine the 3D
model. This software will be used in Prototype One.
4.2.3 Unity3D
4.2.4 Vuforia
Vuforia will be used in the project for Prototype One. The functions used will be
interactive buttons and object tracking for the 3D food model.
4.2.5 ARCore
4.2.6 RunwayML
Because many kitchens are cramped, installing a physical green screen is often
impractical due to space constraints. Another reason is that a green screen can only
be laid out in one orientation, so when the photographer shoots food images or
videos, only a few angles are workable within it. As a result, background remover
software proves helpful in these circumstances. Additionally, the expense of a green
screen can be minimised.
Figure 19 shows the before-and-after effect of the background being replaced by a
green screen, which helps to focus on the subject and remove unwanted content. The
subject can then be incorporated into the AR food app for both prototypes. The only
downside is the hassle of scanning through the whole video to check whether the
background was removed properly. If it was not, frame-by-frame removal will need
to be done, which can be quite time consuming.
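To make the idea concrete, the pixel-level version of green-screen keying can be sketched in a few lines of JavaScript. This is illustrative only: RunwayML's background remover is machine-learning based rather than a simple colour threshold, and the function name and threshold below are hypothetical.

```javascript
// Naive chroma-key sketch: make "green enough" pixels fully transparent.
// pixels is a flat RGBA array, e.g. canvas ImageData.data.
function chromaKey(pixels, threshold = 60) {
  const out = new Uint8ClampedArray(pixels);
  for (let i = 0; i < out.length; i += 4) {
    const r = out[i], g = out[i + 1], b = out[i + 2];
    // Treat a pixel as green screen when green clearly dominates red and blue.
    if (g - Math.max(r, b) > threshold) {
      out[i + 3] = 0; // zero alpha: the pixel drops out of the composite
    }
  }
  return out;
}
```

Production keyers also soften matte edges and suppress green spill; an ML-generated matte, as used here, avoids needing a physical green screen at all.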
4.2.7 Figma
Figma is a web-based UI design and prototyping software that allows creators and
developers to collaborate on projects together in real time. It includes vector editing
tools, shape and text tools, prototyping tools, and design frameworks for creating user
interfaces. Figma is also notable for its collaboration capabilities, as multiple users
can work on the same project at the same time, with modifications synchronised in
real-time.
The author and his team members collaborated in Figma, as shown in Figure 21
above, to design the user interface for the prototype two AR food app. Different
wireframes were created, and design ideas were iterated after each meeting.
4.2.8 MindAR
MindAR is an open-source web AR library that supports image tracking and face
detection, as well as real-time rendering of 3D models and other visual representations
that can be positioned and manipulated by the user within the AR scene.
This makes it an excellent choice for both novice and seasoned coders. It also
supports a variety of devices, including smartphones and tablets, making it a flexible
option for creating AR experiences that can be accessed by many users.
MindAR is used in this project for prototype two. First, it compiles images to extract
feature points so that the AR app can later track and detect the images [39]. This step
is crucial because compiling requires time; therefore, it is best to do it ahead of time to
shorten the loading time when individuals use the AR app later.
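As a sketch of how a compiled image target ties together with augmented content on the web, the A-Frame flavour of MindAR can be wired up roughly as follows. The file names (targets.mind, the video asset) and version numbers are placeholders, not the project's actual assets.

```html
<!-- Minimal MindAR (A-Frame) page; asset names and versions are hypothetical. -->
<script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/mind-ar/dist/mindar-image-aframe.prod.js"></script>
<a-scene mindar-image="imageTargetSrc: ./targets.mind">
  <a-assets>
    <video id="dish-video" src="./dish.mp4" loop muted playsinline></video>
  </a-assets>
  <a-camera position="0 0 0" look-controls="enabled: false"></a-camera>
  <!-- targetIndex 0 = the first image compiled into targets.mind -->
  <a-entity mindar-image-target="targetIndex: 0">
    <a-video src="#dish-video" width="1" height="0.55" position="0 0 0"></a-video>
  </a-entity>
</a-scene>
```

When the camera sees the compiled menu image, the video entity anchored to targetIndex 0 becomes visible and tracks it, matching the behaviour described for prototype two.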
Overall, MindAR is a strong and flexible library that gives developers a fantastic
collection of tools for building entertaining and immersive AR experiences on the
web.
4.3 Programming Language
4.3.1 C Sharp (C#)
C# scripting will be used in Prototype One for various interactive roles, such as
triggering button events and rotating the 3D models. The scripts will be implemented
in Unity.
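The core of such a rotation script is a small amount of arithmetic: map horizontal drag distance to degrees of yaw. The project's actual script is C# inside Unity (see the appendix); the sketch below restates the idea in framework-agnostic JavaScript, with the speed constant and property names chosen for illustration.

```javascript
const ROTATE_SPEED = 0.5; // degrees of yaw per pixel of horizontal drag (illustrative)

// Apply a drag gesture to a model's rotation about the vertical axis,
// keeping the angle normalised to [0, 360).
function rotateModel(model, dragDeltaX) {
  let yaw = (model.yaw + dragDeltaX * ROTATE_SPEED) % 360;
  if (yaw < 0) yaw += 360;
  return { ...model, yaw };
}
```

In Unity, the equivalent inside a drag or update handler is a call such as transform.Rotate(0f, deltaX * speed, 0f).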
4.3.3 HTML
HTML (HyperText Markup Language) was first developed by Tim Berners-Lee in
the late 1980s to share information between researchers [42]. Since then, it has
become the standard language for creating web pages and defining their structure.
HTML was used in prototype two to give structure to the JS code and to structure its
own web-AR page. The author’s team members used this language together with JS.
4.3.4 CSS
CSS (Cascading Style Sheets) was developed in the late 1990s to separate the
presentation and content of a web page [43]. As a result, designers could produce
more aesthetically pleasing websites without having to cram presentational
components into the HTML code. In short, CSS styles the content structured by
HTML. The author’s team members used this language to make the web page more
appealing.
Chapter Five
5 Design and Development
5.1 Prototype One
Figure 23 depicts the phases of prototype one development, including 3D Food
Modelling and App Building. The Testing and Implementation phases will be
covered in the next chapter.
For 3D Food Modelling, the photogrammetry technique was utilised via the
AliceVision Meshroom software to generate the 3D food model, and further
processing was done in Autodesk Meshmixer to refine it. To achieve this, multiple
pictures of the food were taken from various perspectives, preferably more than 20,
to accurately capture the whole dish as it appears in the real world. The more pictures
taken, the better the result. The photos were then transferred to a computer
workstation, and since they were in 2D format, software was needed to recreate the
food in 3D in the virtual environment.
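For planning a shoot, the "more than 20 pictures from various perspectives" guideline can be turned into a simple checklist of evenly spaced angles around the dish. A small sketch (the function name is illustrative):

```javascript
// Evenly spaced horizontal capture angles (in degrees) around the dish.
// n = number of photos per orbit; photogrammetry also benefits from
// repeating the orbit at two or three different camera heights.
function captureAngles(n) {
  return Array.from({ length: n }, (_, i) => (i * 360) / n);
}
```

For example, captureAngles(24) yields one shot every 15 degrees; combined with a second orbit at a higher angle, this comfortably exceeds the 20-photo guideline.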
Fig. 25 Unedited 3D model of Pork Rib Soup
In Figure 25, the unedited 3D food model can be seen after being generated by the
Meshroom software. The next step involves importing the food model into Autodesk
Meshmixer, as shown in Figure 26, for refinement, where unwanted surfaces such as
the brown wooden plane can be removed. The purpose of this refinement process is
to obtain a 3D food model that is free of extraneous elements and suitable for use in
the AR app. The final version of the 3D food model, which will be imported into the
AR app, can be seen in Figure 27.
In the App Building phase, the relevant resources, including the 3D food model and
the Vuforia Engine, were imported into Unity for the design and development of the
AR app. The user interface for the app was created entirely inside Unity, as shown in
Figure 28. To make the interface more interactive and dynamic, various C# scripts
written in Visual Studio were integrated into Unity. The code is available in the
appendix section.
After the design and development phase was complete, ARCore for Android was
used to build the app's APK file. This file was then used for testing to ensure the app
functioned properly. Any necessary adjustments or bug fixes were made during the
testing phase to ensure a smooth user experience.
5.2 Prototype Two
Figure 29 depicts the four phases of prototype two development, which include
Augmented Food Video, App Building, Testing, and Implementation. This section
will focus on Augmented Food Video and App Building phases while Testing and
Implementation will be covered in the next chapter.
The Augmented Food Video phase involved shooting food videos with and without
green/blue screens. The videos were then edited on a computer workstation. Where
needed, the RunwayML background remover mentioned in the previous chapter was
used to replace unwanted backgrounds with green screens. The videos were then
added to the libraries in the web app built during the App Building phase.
The web app was primarily built using JavaScript, supported by HTML and CSS, to
create an organised, aesthetically pleasing, and user-friendly interface. The food
images were passed through the MindAR image compiler and integrated into the
web app. The augmented video was then attached to the compiled target image. An
example of how the augmented video is projected from the targeted image is shown
in Figures 30 and 31. Additionally, the Figma design was used to produce
well-organised CSS code.
Fig. 30 Targeted Image of Leng Zaab Fig. 31 Single Frame from Augmented Video
Note. From Un-Yang-Kor-Dai Singapore Menu
The Testing and Implementation phases will provide further insight into how the web
app functions.
Chapter Six
6 Testing and Implementation
bottom right of the screen. The app can be used in both vertical and horizontal
orientations, depending on how users hold their devices. An example of how the
food app functions is shown in Figures 32, 33 and 34 below.
Prototype Two was tested in a closed beta with Un Yang Kor Dai. Users were able to
use their own mobile devices to access the digital food menu via a QR code they
scanned. The digital food menu is currently powered by the Heroku cloud server.
Once users tapped the interface to start, they pointed their phone camera towards the
image-based menu, which triggered the augmented video of the food to pop up and
play automatically. Figure 35 below shows an example.
Bugs and feedback were collected during the closed beta testing to improve both
prototypes. Subsequently, the app will be released to the public for further
improvement and updated accordingly. Continuously improving and updating the
digital food menu based on user feedback and behaviour can help strengthen the
eateries’ branding and maintain stronger individual engagement. This can lead to a
better user experience and individual satisfaction, ultimately resulting in increased
revenue for the restaurant.
Chapter Seven
7 Conclusion, Limitation and Recommendation
Two prototypes were developed: Prototype One using Unity and Vuforia, and
Prototype Two using JavaScript. Prototype One involved using photogrammetry to
create 3D food models with AliceVision Meshroom, refining them with Autodesk
Meshmixer, and integrating them into Unity for app development. In contrast,
Prototype Two consisted of filming food videos with and without green/blue screens
and creating augmented videos attached to a target image compiled in the MindAR
image compiler. Both prototypes had user-friendly and visually appealing user
interfaces, with the layout design of Prototype Two done by the author and the
coding done by his team members. The testing and implementation phases of both
prototypes were discussed in Chapter 6.
Using AR technology in digital food menus has several benefits, such as reducing
choice overload by providing visual aids that help individuals make informed choices
and giving an accurate representation of the dish to avoid any misunderstandings.
Moreover, AR technology can create an immersive and interactive experience for
diners, making the overall dining experience more engaging and enjoyable.
However, the current design and development of the digital food menu have some
limitations. For example, it may not be accessible to all individuals, particularly those
who are not familiar with technology or do not have handheld devices. Additionally,
the cost of implementing AR technology in eateries may be high, which may prevent
smaller establishments from adopting it. Another limitation is the accuracy of the 3D
food models in Prototype One, which relies heavily on the number and angles of the
photographs taken. Future work could explore alternative techniques such as 3D
scanning or LiDAR to improve accuracy.
Future research could be conducted to address these limitations and improve the
design and development of digital food menus. For instance, exploring alternative
methods of implementing AR technology to make it more accessible and affordable,
and evaluating the effectiveness of AR technology in improving the overall dining
experience and the success of an eatery.
In conclusion, the development of AR technology in the food industry is still in its
early stages, with much potential for growth and innovation. This paper provides a
foundation for future research and development in this area, and hopefully, more
eateries will adopt AR technology to enhance the dining experience for their
customers.
References
[1] Regenesys Business School, "The Fifth Industrial Revolution (5IR) and how it
will change the business landscape," 2020. [Online]. Available:
https://www.regenesys.net/reginsights/the-fifth-industrial-revolution-5ir/.
[2] E. Velasco, "Scientists Uncover Why You Can't Decide What to Order for
Lunch," 2018. [Online]. Available: https://www.caltech.edu/about/news/scientists-
uncover-why-you-cant-decide-what-order-lunch-83881.
[3] Enterprise Press, "The psychology of visuals, how images impact decision
making," 2020. [Online]. Available: https://enterprise.press/stories/2020/02/05/the-
psychology-of-visuals-how-images-impact-decision-making-11273/.
[4] P. Milgram and F. Kishino, "A taxonomy of mixed reality visual displays,"
IEICE TRANSACTIONS on Information and Systems, vol. 77, no. 12, pp. 1321-
1329, 1994.
[10] Iberdrola, "Virtual Reality: another world within sight," 2022. [Online].
Available: https://www.iberdrola.com/innovation/virtual-reality.
[11] J. D. Calvert and H. T. Lawless, "Virtual Reality and Its Potential for Food
and Beverage Industries," Journal of Food Science, vol. 83, no. 4, pp. 1067-1072,
Apr. 2018.
[12] H. J. Kim, M. H. Kim, and E. Park, "Virtual Reality in Food Science and
Hospitality," Journal of Food Science Education, vol. 16, no. 2, pp. 58-66, Jun. 2017.
Pg. 40
[13] K. Lee, M. H. Lee, and K. J. Kim, "Does Virtual Reality Food Tour Enhance
Satisfaction with Food Tourism?," Sustainability, vol. 11, no. 19, p. 5181, Sep.
2019.
[14] J. Kim, S. Kim, and Y. Kim, "The effect of augmented reality menus on
customers' sensory experience and behavioural intentions in restaurants," Journal of
Travel Research, vol. 59, no. 4, pp. 656-670, 2020.
[15] Kabaq, "Augmented Reality Solution for the Food Industry," [Online].
Available: https://kabaq.io/.
[18] Y. Liu, L. Li, and Y. Liang, "Augmented Reality Menu Design and Customer
Experience: An Experimental Study," International Journal of Hospitality
Management, vol. 86, p. 102445, Mar. 2020.
[20] B. Shneiderman and C. Plaisant, Designing the User Interface: Strategies for
Effective Human-Computer Interaction, 5th ed. Addison-Wesley, 2010.
[21] P. Lopes and F. Branco, "A review of the literature on the usability of virtual
environments," Virtual Reality, vol. 22, no. 4, pp. 301-326, 2018.
[22] A. Leue and M. Beaudouin-Lafon, "Beyond the lab: Evaluating HCI in the
wild," Interactions, vol. 26, no. 6, pp. 26-33, 2019.
[23] A. Dey, M. Billinghurst, and R. W. Lindeman, "A systematic review of 10
years of augmented reality usability studies: 2005 to 2014," Frontiers in Robotics
and AI, vol. 3, p. 22, 2016.
[26] M. Billinghurst and A. Duenser, "Augmented reality in the classroom," in
Proceedings of the 2012 ACM International Conference on Interactive Tabletops
and Surfaces, 2012, pp. 1-8.
[27] Realme SG, "Realme 5 Specification," 2019. [Online]. Available:
https://www.realme.com/sg/realme-5/specs/.
[34] A. OSBORN, "Super Mario Run Created With Unity," 2016. [Online].
Available: https://www.ign.com/articles/2016/11/01/super-mario-run-created-with-
unity.
[42] University of Washington, "A Brief History of HTML," 2005. [Online].
Available:
https://www.washington.edu/accesscomputing/webd2/student/unit1/module3/html_history.html.
[43] Boston University, "History of CSS," 2005. [Online]. Available:
https://www.bu.edu/lernet/artemis/years/2020/projects/FinalPresentations/HTML/historyofcss.html.
Appendix
Object Rotation Script:
3D Food Models for Prototype One: