An Interview With Fotis Georgiadis

Learn the fundamentals — If you jump into XR development without a solid grounding in the software you are building your experience in, you may miss important optimizations and processes. Some people learn best while trying to make something; others prefer to consult tutorial videos or follow guides. The important thing is to find the way that works best for you to learn good development practices, and not to get too far ahead of yourself, as you are bound to keep learning from others and from your own experiences throughout your career. I learn something new almost every day working in XR, and when I am not working on XR apps, I am discovering new tips and tricks to implement the next time I do.

The Virtual Reality, Augmented Reality & Mixed Reality Industries are so exciting. What is coming around the corner? How will these improve our lives? What are the concerns we should keep an eye out for? Aside from entertainment, how can VR or AR help work or other parts of life? To address this, we had the pleasure of interviewing Robert Farthing.

Robert is a Unity-certified Artist and digital programmer for XR applications with experience in pairing Machine Learning with Augmented Reality. Since graduating from the University of Portsmouth (BSc Hons 1st class) in 2016, Robert has developed a range of immersive experiences for clients such as Samsung, GSK, and Harman. He is currently a developer at EPM.

Thank you so much for doing this with us! Before we dig in, our readers would like to get to know you a bit. Can you tell us a bit about your backstory and how you grew up?

I grew up in Hampshire, United Kingdom, and was an avid gamer as a teenager. My love of art and video games drove me to chase a career in the gaming industry as a 3D artist. I studied Graphic Design at college and then went to the University of Portsmouth to study Computer Games Technology, where I learned how to develop games and make 3D art. I didn’t land in the gaming industry when I graduated, but plenty of other businesses, such as large tech companies and digital creative agencies, were looking to hire game development graduates to work on XR apps, as people had just started talking about AR, MR and VR (Augmented/Mixed/Virtual Reality) at the time.

Unfortunately I wasn’t able to grasp programming while in education. Once I started working on apps at digital agencies, however, I was able to shadow and learn from the skilled contractors working alongside me, and quickly learned how to build XR apps from scratch. My initial skills in 3D art and animation took me far in my early career, as I could pick up the things I didn’t know while working on the visual aspects of apps at the same time.

Is there a particular book, film, or podcast that made a significant impact on you? Can you share a story or explain why it resonated with you so much?

Edge magazine’s developer career-retrospective interviews always inspired me. Seeing experienced game developers and designers talk about the issues they solved and the problems they overcame during the early days of game consoles drew a lot of parallels with my experiences working on XR apps while the tech was rapidly evolving. Hearing how development processes changed midway through a project, or how new console dev kits arrived close to launch, is exactly how things are going in the XR space today amid rapid advances in the technology. It feels like a full-circle moment now, as I am detailing my own experiences as an XR developer in this interview, which I hope will in turn inspire and help others on the path to their career in XR.

Is there a particular story that inspired you to pursue a career in the X Reality industry? We’d love to hear it.

SteamVR was released in 2016, during my final year of university, and the first major VR titles were arriving on Steam. At the same time, Sony’s PSVR (PlayStation VR) was launching, and there was a surge of new games played in a very different way from traditional flatscreen titles. The tech felt quite bulky back then, as you needed SteamVR ‘base station’ tracking devices mounted to the walls or sensors stuck to your desk; it felt like a lot just to track a headset and controllers in a room. We worked with a variety of early MR and VR headsets at university, such as the original ‘Meta One’ MR headset and the first Oculus and Vive development kits, but everything felt very fiddly to set up, and merely tracking the headset and/or controllers felt limited. It wasn’t until I used an early Oculus DK1 headset, rigged with a custom passthrough camera for MR and a Leap Motion controller strapped to the front for experimental hand tracking to cast spells in a wizard simulator, that I could see the possibilities for the future of the technology during those early days of VR.

At the time we were encouraged by the university to experiment with new gaming technologies, so we attempted to create early full-body tracking with a smartphone, using Google Cardboard for head movement, an Xbox Kinect sensor to track the user’s body position, and a Leap Motion sensor stuck to the front of the headset for hand tracking. This combination of body, hand and head tracking is something we didn’t see built into VR headsets until December 2019, when Oculus announced hand tracking support for the Oculus Quest. I wanted to get in early and learn how to develop for this technology so that I would be ready for when the full-body tracking experience from that 2016 university project became a standard part of VR. You can see the original body tracking demo I worked on here (https://www.youtube.com/watch?v=N5xl3HBNsqg).

Can you share the most interesting story that happened to you since you began this fascinating career?

When working on a touchscreen store experience for Harman Kardon, using 3D environments to show their products in situ, we arranged to set up screens at their USA showrooms. At the time, Samsung was acquiring Harman Kardon, and upper management were not keen on the idea of interactive interfaces in showrooms, as it was a big change involving new technology and sales processes.
It got to the point where we heard that the interactive screens would be taken down soon… that was until the bosses at Samsung visited the Harman Kardon Experience Center in Los Angeles to see the app. They were really impressed with the 3D visuals and interactivity, and from that day onwards we didn’t hear back from the management team at Harman Kardon. We went on to release the touchscreen experience app on the Windows Store in selected regions, using mouse input to simulate touch input.

Can you share a story about the funniest mistake you made when you were first starting? Can you tell us what lesson you learned from that?

During the early days of Apple’s ARKit, a creative digital agency I worked at had an old iPad Air as its development device, which at the time could run ARKit. We made a lot of demonstration apps to take to prospective clients, showing how they could use AR internally or give their customers new ways to access their services or view their products. I had a few test apps installed on the iPad Air, which the boss took off to show some clients over the course of a week.
What I didn’t know was that Apple would be cutting off ARKit support for some older devices that very week through a software update. Unfortunately, the boss downloaded a few firmware updates while on the road, so when he went to show the apps off, he was met with a black-screen crash on the iPad!
In a way this was good news for me, as I was already unimpressed with the low power and performance of that iPad Air, so we bought the latest-generation iPad models for the office, allowing us to make better-looking AR apps.

None of us are able to achieve success without some help along the way. Is there a particular person who you are grateful towards who helped get you to where you are? Can you share a story about that?

I would like to thank my first boss in the industry, Steve Carter, who hired me as a graduate fresh out of University and has stayed in touch with me ever since my first job with him. He never questioned when I had a ridiculous idea I wanted to try and always had suggestions to try to push the technology as far as possible when AR was still developing. He hired a computer vision expert to help us try to do things before Apple or Google’s AR SDKs (Software Development Kits) would allow us to do so, such as detecting and measuring walls, floors and unique fixtures such as windows and doors in a home.
I worked with Steve full-time on an in-home sales app for a company specializing in window-fitted products, and during that time he allowed me to step up from 3D artist to project manager and principal developer. It is still an app I am really proud of, as we were competing with the AR SDK providers themselves to be the first to detect parts of people’s rooms in AR.

Are you working on any exciting new projects now? How do you think that will help people?

At EPM (Electropages Media) (https://www.epm.digital/), we are working on various applications that pair ML (Machine Learning) with AR to power apps requiring body tracking and environmental understanding. In the wake of Covid, there has been an uptick in clients asking if we can remove anything that requires touching surfaces, so face, body and hand tracking with ML and AR has been key to achieving our clients’ visions while keeping end users safe. With exhibitions returning around the world this year, we are also working on various web- and exhibition-focused experiences using touch-based input and body tracking to deliver product visualization and interactive games for different industries at these shows. I am still fairly new to the team at EPM, so I am excited to show more of what I can do to the team and our customers. As VR content for exhibitions is not in as much demand, I am hoping to work on VR experiences for users at home instead in the near future, as VR adoption grows in the consumer market.
On the hobbyist side, I am working on custom home environments for the Meta Quest 2 (Oculus/Facebook) by porting and recreating popular environments from classic video games into VR to allow people to hang out in their favorite places. The most popular of these have been Silent Hill and Resident Evil themed spaces such as the hallway from P.T. (https://sidequestvr.com/app/6100/custom-home-pt-hallway-silent-hills ) and the Room 302 Apartment from Silent Hill 4: The Room (https://sidequestvr.com/app/4265/custom-home-silent-hill-apartment-room-302-custom-home ). I am working on more of these spaces in my free time and hope to put out more spaces that people have requested. (https://sidequestvr.com/user/95537 )
In 2019 I began coaching other VR developers, such as Immersive Matthew, who is putting out an educational VR ‘dark ride’ called Into the Metaverse (https://www.metaverse-adventures.com/), the first part of which is now out on SideQuest and Oculus App Lab. Through that collaboration, he helped me develop a VR animation tool I am hoping to release soon, which uses hardware trackers attached to the user’s body to animate characters within 3D scenes while in VR, fully motion-capturing their acting in situ. He couldn’t have animated the more than 60 characters in his ride so quickly without the tool, and I am looking to future-proof it with OpenXR support before release.

Ok super. Thank you for all that. Let’s now shift to the main focus of our interview. The VR, AR and MR industries seem so exciting right now. What are the 3 things in particular that most excite you about the industry? Can you explain or give an example?

  1. Mixed Reality headsets haven’t breached the consumer market the way VR recently has — We are seeing rumors of Apple bringing out an MR headset, and of Google returning to the MR headset space after the failure of Google Glass. It feels like we are still waiting for someone to release an affordable, essential MR headset or glasses and become a leader in the consumer market. This is exciting because we developers can think about how people will use MR at home, and it will create new opportunities to make smartphone AR apps that share content with MR and VR apps, closing the gaps between the ecosystems we have today.
  2. The use of ML with AR — With technology such as Google’s MediaPipe and newer releases of ARKit, we are seeing ML and AR merge to provide a smart understanding of what is in the camera feed. This is opening up new avenues, such as detecting and tracking objects through object tracking rather than traditional image markers or QR codes. We can now also track the human body in the camera view, and in some cases newer techniques such as ‘instant motion tracking’ are removing specific hardware requirements for AR altogether.
  3. Presence — Face and hand tracking aren’t implemented in many headsets or applications yet. I think newer devices will give us more tangible representations of our bodies in VR, and once we have properly tracked bodies, a lot of current VR games and apps will feel outdated in terms of presence.

What are the 3 things that concern you about the VR, AR and MR industries? Can you explain? What can be done to address those concerns?

  1. The ‘Metaverse’ gold rush — Since Zuckerberg put Meta on the map and spearheaded the Metaverse push we are seeing now, it’s evident we are still quite a way off achieving the goals he set out for sharing digital content between apps and platforms. On the content side, there is a range of file-format issues to overcome, such as differences in the texture formats required by different devices, and the game engines powering the content using different file formats. How will we author content for the metaverse so that it runs well on all platforms? There is also the issue of many MMO (Massively Multiplayer Online) experiences claiming to be ‘The Metaverse’, which raises the questions of what the metaverse will actually be, and how we can get the platform holders and game engine providers to work together to allow content to transfer between apps and devices. I just hope nobody wastes money being sold the ‘Metaverse’ well before we are anywhere near formulating one.
  2. As much as it excites me, it is also a concern that MR headsets haven’t breached the consumer market anywhere near as well as VR has. There are plenty of justified concerns with wearable MR technology around privacy and safety while walking around, but I am still unsure why Magic Leap and Microsoft have steered clear of the consumer market for so long. Is there a reason these devices are reserved for enterprise customers, and will Apple or Google bridge the gap?
  3. Standalone headset operating systems — We have seen with the Meta Quest that Android-powered standalone VR headsets are taking the world by storm, but this is creating cross-compatibility issues with traditional PC-based VR apps, requiring developers to port their apps. With the recent release of the Steam Deck, we are seeing that standalone devices powered by PC processors can deliver solid performance, and I expect Valve to move into this space; however, I am concerned this will create yet more platforms to port to. To be safe, I would suggest targeting OpenXR as the runtime of your VR apps, so it will be easier to port your VR content to newer devices.

I think the entertainment aspects of VR, AR and MR are apparent. Can you share with our readers how these industries can help us at work?

I can think of two main ways XR can help with work. One is the removal of the screens spread across our desks: instead, you can have a small wearable headset that you take anywhere with your laptop or PC and have screens as big as the wall. For productivity this makes the most sense, as most of us working from home have had the experience of lugging screens around to set up an office in whatever space is available, a hassle an XR-based desk would remove. I would like a wall-sized screen in front of me for some work tasks, but unfortunately none of the current consumer VR wearables are comfortable to wear for long working sessions.

The other main use case is attending meetings in VR. This is something I would like to see: the main video conferencing providers, such as Skype and Microsoft Teams, offering some way to join meetings in VR so we can present, talk and showcase our work in an immersive manner. Currently this space is reserved for separate, dedicated VR meeting products, but more could be done to drive the adoption of VR in the workplace through the services we already use daily at work.

Outside of VR conferencing, XR can help artists and designers understand the scale and dimensions of their work by overlaying it virtually on the real world. Adobe is developing a suite of AR and VR tools to allow this, and Blender also supports OpenXR for exploring 3D models and animations in VR. I use this regularly when working in 3D to preview models and spaces, and it greatly improves my understanding of how objects will look in VR or AR in the finished product.

Are there other ways that VR, AR and MR can improve our lives? Can you explain?

I am very excited about the future of exercise tracking and guidance. With current VR workout apps such as FitXR and Supernatural, we are getting glimpses of how our performance can be reported through body tracking, and how a full-scale trainer avatar can show us what to do in different exercises. This has already helped me learn some dance moves and how to do HIIT workouts correctly, and I think it is something that can already improve our fitness and wellbeing, and will only get better.

AR and MR for online shopping let us preview how items would look or fit in our homes, and this is something we are seeing improve constantly. A great example is the IKEA app on smartphones. AR measuring apps can also help in cases where you only have a product’s dimensions. Being able to quickly get my phone out and measure or place a product has influenced my purchase decisions in the past, and online shops such as Amazon and eBay are already demonstrating the use of AR to preview products. As AR becomes more widely adopted, you can expect more online stores to make use of it. Apple have nearly finished their support for the open web AR standard, WebXR, in iOS Safari, which will hopefully make online AR shopping easily accessible across Android and iOS smartphones this year. As MR headsets become more accessible, we can expect these AR shopping experiences to move from the smartphone screen to glasses and headsets.

Recently HTC Vive announced support for in-car VR passenger experiences, such as riding a rollercoaster or sitting in a spaceship while you are a passenger on a long car journey. This is an exciting, novel use case for VR in the automotive industry and could be a great way to watch movies or play small VR games on the road. As car entertainment systems advance, with electric vehicles allowing games to be played on a passenger screen, these new uses of immersive technology can enhance our in-car experiences.

What are the “myths” that you would like to dispel about working in your industry? Can you explain what you mean?

  1. The ‘Metaverse’ pitch — We are seeing a lot of large companies, such as HTC Vive and Meta/Facebook, reveal their vision of the ‘Metaverse’. Quite often these pitch videos show hardware and concepts that don’t even exist yet; see the VIVERSE trailer for example (https://www.youtube.com/watch?v=rTislcoD4eA). The risk is that consumers’ expectations are being set at a higher standard than what might be possible, and the teased ‘Metaverse’ may end up quite far off what we eventually get.
    These trailers have been pulled apart by industry experts such as John Carmack of Oculus, who remarked, “I have pretty good reasons to believe that setting out to build the metaverse is not actually the best way to wind up with the metaverse.” Gabe Newell of Valve also drew parallels with MMO gaming: “Most of the people who are talking about the metaverse have absolutely no idea what they’re talking about. And they’ve apparently never played an MMO. They’re like, ‘Oh, you’ll have this customizable avatar.’ And it’s like, well… go into La Noscea in Final Fantasy 14 and tell me that this isn’t a solved problem from a decade ago, not some fabulous thing that you’re, you know, inventing.”
  2. 5G will revolutionize VR — For three years now we have been hearing that 5G will ‘revolutionize’ VR, and in that time nothing has changed about how we make VR apps. That said, the speeds granted by 5G do allow faster streaming of content, which could be great for immersive VR livestream events. For gaming, we have already seen that cloud-streamed VR and AR is a way off. The 5G streams of Superhot and Batman VR at MWC 2019 (https://www.roadtovr.com/batman-vr-cloud-gaming-5g-mwc-2019/) quieted some of the talk about cloud-streaming interactive content, as corners were cut for the demos, such as streaming only 3DOF (3 degrees of freedom) content rather than fully tracked 6DOF (6 degrees of freedom) content. The other issue with 5G VR is that no current mainstream consumer VR headset has access to 5G data outside of phone hotspots, which raises the question of whether VR headsets will need their own mobile data access or whether 5G will remain the domain of smartphone-tethered headsets.

What are your “5 Things You Need To Create A Highly Successful Career In The VR, AR or MR Industries?”

  1. Be prepared to change things and experiment — XR SDKs change frequently, and you can sometimes only prepare so much for the next headset or target device you will need to support. Before the Khronos Group created OpenXR (https://www.khronos.org/openxr/), an open standard for cross-platform VR, supporting multiple VR platforms was always a learning challenge, as tools and VR hardware were at times changing faster than the software could keep up. For example, OpenXR was announced around February 2017 and the OpenXR 1.0 release arrived in July 2019, but Unity only released support for OpenXR in December 2020. You might not always have all the tools you need, but you can at least prepare for when those tools arrive if you are aware of what is coming.
  2. Learn the fundamentals — If you jump into XR development without a solid grounding in the software you are building your experience in, you may miss important optimizations and processes. Some people learn best while trying to make something; others prefer to consult tutorial videos or follow guides. The important thing is to find the way that works best for you to learn good development practices, and not to get too far ahead of yourself, as you are bound to keep learning from others and from your own experiences throughout your career. I learn something new almost every day working in XR, and when I am not working on XR apps, I am discovering new tips and tricks to implement the next time I do.
  3. Don’t be afraid of testing — With XR being relatively new in terms of design standards and expectations, design and testing can often be very experimental. Google has guidance on AR design standards, which is now my go-to for explaining requirements to designers (https://developers.google.com/ar/design), and Oculus has best-practices guidance in its developer docs which can help in structuring the UX (user experience) of your apps (https://developer.oculus.com/resources/bp-generalux/). Try to get other people to test your app on the target hardware, and don’t tell them what to do beforehand, so they go into it as a new user would. I once worked on a VR mindfulness app for healthcare, which required rigorous user testing because it was being given to vulnerable users with health conditions. It really drove home how important frequent, good testing is in making sure your users are comfortable and not confused.
  4. Network and talk to other developers — Go to expos and events, chat to other developers, and join developer groups on Discord and Slack. The Unity and Unreal Engine forums are full of other developers to talk to, learn from and share tips with. Posting solutions to common development problems on forums has gotten me very far in networking. In 2019 I reported more than 180 Oculus Quest bugs, so many that Unity invited me to their Unite conference and their head office in Copenhagen as part of a beta user group to talk directly to the developers. The contacts I made at that event have helped me a lot with development since, especially the wonderful team at Needle Tools (https://needle.tools/), who create productivity tools for Unity development that have saved me days of development time. If you are a new VR developer, I would also suggest applying to the Oculus Start program (https://developer.oculus.com/oculus-start/) to receive direct developer support and other benefits to help you build VR applications.
  5. Use development samples to speed up development time early in projects — You often don’t need to start from scratch (as fun as that can sometimes be). Before starting development, look around at what samples are available. One starter pack I always use is the XR Interaction Toolkit from Unity (https://docs.unity3d.com/Packages/[email protected]/manual/index.html), which has an interaction system set up for all the common things you need to do in VR and AR, such as UI (User Interface) interaction, grabbing, moving and rotating objects, and teleporting in VR. These starter kits often also demonstrate best development practices in the samples they provide. Download them and have a play!

You are a person of great influence. If you could inspire a movement that would bring the most amount of good to the most amount of people, what would that be? You never know what your idea can trigger. 🙂

In the interest of media preservation, I would like to see more projects to update and maintain older media, such as games and apps, to keep them compatible with modern technologies. I’m always scouring the internet for open-source projects based on my interests that I can contribute to. This reminds me of the people who still run Windows XP to use very old legacy software, at the risk of their own device security. We all need to contribute to ways of ensuring that what we create now isn’t left behind in a few years, and that what we build can last as long as its users. We are seeing the opposite with the closing of Nintendo’s Wii and DS digital stores, where digital content people once bought is no longer available.

It’s part of the reason I am working on VR adaptations of retro game levels in my spare time: I would like to put together a group of hobbyists like myself to port older 3D games into VR, allowing people to experience these older, less accessible games in a new way. Bringing the past into the future could show modern gamers what they are missing, and perhaps inspire a new generation of game developers too.

We are very blessed that very prominent leaders read this column. Is there a person in the world, or in the US with whom you would like to have a private breakfast or lunch, and why? He or she might just see this if we tag them 🙂

John Carmack would be my first choice as his very forward-thinking and open outlook on the progress of VR technology has been an inspiration for my development approaches. Knowing that he gets to say almost anything he wants about the state of VR has given me a realistic outlook on the medium and where it is going. I would love to see what he is working on next with the Oculus team and to understand the current development challenges he is facing with the hardware and software.

Thank you so much for these excellent stories and insights. We wish you continued success on your great work!


Makers of The Metaverse: Robert Farthing On The Future Of The VR, AR & Mixed Reality Industries was originally published in Authority Magazine on Medium, where people are continuing the conversation by highlighting and responding to this story.
